Science.gov

Sample records for probabilistic damage tolerance

  1. Multidisciplinary design optimization of a fighter aircraft with damage tolerance constraints and a probabilistic model of the fatigue environment

    NASA Astrophysics Data System (ADS)

    Arrieta, Albert Joseph

    2001-07-01

    Damage tolerance analysis (DTA) was considered in the global design optimization of an aircraft wing structure. Residual strength and fatigue life requirements, based on the damage tolerance philosophy, were investigated as new design constraints. In general, accurate fatigue prediction is difficult if the load environment is not known with a high degree of certainty. To address this issue, a probabilistic approach was used to describe the uncertain load environment. Probabilistic load spectra models were developed from flight recorder data. The global/local finite element approach allowed local fatigue requirements to be considered in the global design optimization. AFGROW fatigue crack growth analysis provided a new strength criterion for satisfying damage tolerance requirements within a global optimization environment. Initial research with the ASTROS program used the probabilistic load model and this damage tolerance constraint to optimize cracked skin panels on the lower wing of a fighter/attack aircraft. For an aerodynamic and structural model similar to an F-16, ASTROS simulated symmetric and asymmetric maneuvers during the optimization. Symmetric maneuvers, without underwing stores, produced the highest stresses and drove the optimization of the inboard lower wing skin. Asymmetric maneuvers, with underwing stores, affected the optimum thickness of the outboard hard points. Subsequent design optimizations included von Mises stress, aileron effectiveness, and lift effectiveness constraints simultaneously. This optimization was driven by the DTA and von Mises stress constraints, demonstrating that DTA requirements can play an active role in preliminary aircraft design.

  2. Certification of damage tolerant composite structure

    NASA Technical Reports Server (NTRS)

    Rapoff, Andrew J.; Dill, Harold D.; Sanger, Kenneth B.; Kautz, Edward F.

    1990-01-01

    A reliability based certification testing methodology for impact damage tolerant composite structure was developed. Cocured, adhesively bonded, and impact damaged composite static strength and fatigue life data were statistically analyzed to determine the influence of test parameters on the data scatter. The impact damage resistance and damage tolerance of various structural configurations were characterized through the analysis of an industry wide database of impact test results. Realistic impact damage certification requirements were proposed based on actual fleet aircraft data. The capabilities of available impact damage analysis methods were determined through correlation with experimental data. Probabilistic methods were developed to estimate the reliability of impact damaged composite structures.

  3. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
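
    The two computations described above reduce to a short calculation. The sketch below assumes a Basquin-form S-N curve N(S) = A*S^(-m) and Rayleigh-distributed stress amplitudes with rms value sigma; the parameter values in the example are illustrative placeholders, not FATIG's inputs.

```python
import math

def damage_closed_form(sigma, n_cycles, A, m):
    """Narrow-band (Rayleigh) fatigue damage from the integral form of Miner's rule.

    Assumes a Basquin S-N curve N(S) = A * S**(-m) and Rayleigh-distributed
    stress amplitudes with rms value sigma:
        E[D] = n_cycles * (sqrt(2)*sigma)**m * Gamma(1 + m/2) / A
    """
    return n_cycles * (math.sqrt(2.0) * sigma) ** m * math.gamma(1.0 + m / 2.0) / A

def damage_binned(sigma, n_cycles, A, m, n_sigma=3.0, bins=300):
    """Discretized Miner's-rule sum with amplitudes binned up to n_sigma * sigma."""
    damage = 0.0
    ds = n_sigma * sigma / bins
    for i in range(bins):
        s = (i + 0.5) * ds                                # bin-centre stress amplitude
        lo, hi = i * ds, (i + 1) * ds
        # Rayleigh probability mass falling in this amplitude bin
        p = math.exp(-lo**2 / (2 * sigma**2)) - math.exp(-hi**2 / (2 * sigma**2))
        damage += n_cycles * p / (A * s ** (-m))          # n_i / N(S_i)
    return damage

# Example: 1e6 cycles, stress rms 40 MPa, hypothetical S-N parameters A, m
print(damage_closed_form(40.0, 1e6, 1e12, 3.0))
print(damage_binned(40.0, 1e6, 1e12, 3.0, n_sigma=3.0))
```

    The closed-form value and the binned Miner sum converge as the amplitude cutoff (N*sigma) and the number of bins grow, which is the essential difference between the two methods the abstract lists.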

  4. Damage Tolerance of Composites

    NASA Technical Reports Server (NTRS)

    Hodge, Andy

    2007-01-01

    Fracture control requirements have been developed to address damage tolerance of composites for manned space flight hardware. The requirements provide the framework for critical and noncritical hardware assessment and testing. The need for damage threat assessments, impact damage protection plans, and nondestructive evaluation is also addressed. Hardware intended to be damage tolerant has extensive coupon, sub-element, and full-scale testing requirements in line with the Building Block Approach concept from MIL-HDBK-17, the Department of Defense Composite Materials Handbook.

  5. Composites Damage Tolerance Workshop

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne

    2006-01-01

    The Composite Damage Tolerance Workshop included participants from NASA, academia, and private industry. The objectives of the workshop were to begin dialogue in order to establish a working group within the Agency, create awareness of damage tolerance requirements for Constellation, and discuss potential composite hardware for the Crew Launch Vehicle (CLV) Upper Stage (US) and Crew Module. It was proposed that a composites damage tolerance working group be created that acts within the framework of the existing NASA Fracture Control Methodology Panel. The working group charter would be to identify damage tolerance gaps and obstacles for implementation of composite structures into manned space flight systems and to develop strategies and recommendations to overcome these obstacles.

  6. DNA damage tolerance.

    PubMed

    Branzei, Dana; Psakhye, Ivan

    2016-06-01

    Accurate chromosomal DNA replication is fundamental for optimal cellular function and genome integrity. Replication perturbations activate DNA damage tolerance pathways, which are crucial to complete genome duplication as well as to prevent formation of deleterious double strand breaks. Cells use two general strategies to tolerate lesions: recombination to a homologous template, and trans-lesion synthesis with specialized polymerases. While key players of these processes have been outlined, much less is known about their choreography and regulation. Recent advances have uncovered principles by which DNA damage tolerance is regulated locally and temporally, in relation to replication timing and cell cycle stage, and are beginning to elucidate the DNA dynamics that mediate lesion tolerance and influence chromosome structure during replication. PMID:27060551

  7. Damage Tolerance Assessment Branch

    NASA Technical Reports Server (NTRS)

    Walker, James L.

    2013-01-01

    The Damage Tolerance Assessment Branch evaluates the ability of a structure to perform reliably throughout its service life in the presence of a defect, crack, or other form of damage. Such assessment is fundamental to the use of structural materials and requires an integral blend of materials engineering, fracture testing and analysis, and nondestructive evaluation. The vision of the Branch is to increase the safety of manned space flight by improving fracture control and the associated nondestructive evaluation processes through the development and application of standards, guidelines, and advanced test and analytical methods. The Branch also strives to assist with and solve non-aerospace-related NDE and damage tolerance problems, providing consultation, prototyping, and inspection services.

  8. Damage tolerance for commuter aircraft

    NASA Technical Reports Server (NTRS)

    Lincoln, John W.

    1992-01-01

    The damage tolerance experience in the United States Air Force with military aircraft and in the commercial world with large transport category aircraft indicates that a similar success could be achieved in commuter aircraft. The damage tolerance process is described for the purpose of defining the approach that could be used for these aircraft to ensure structural integrity. Results of some of the damage tolerance assessments for this class of aircraft are examined to illustrate the benefits derived from this approach. Recommendations are given for future damage tolerance assessment of existing commuter aircraft and on the incorporation of damage tolerance capability in new designs.

  9. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
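
    As a rough illustration of the propagation step described above, the sketch below pushes assumed scatter in material-level primitive variables through a placeholder structural response function and estimates reliability by Monte Carlo sampling. The response function, distributions, and numbers are hypothetical, not the report's computational simulation.

```python
import random

def structural_margin(fiber_strength, matrix_modulus, flaw_size):
    """Hypothetical placeholder for the simulation that maps material-level
    primitive variables to a structural damage-tolerance margin
    (residual strength minus demand, arbitrary units)."""
    return fiber_strength - 60.0 - 80.0 * flaw_size - 0.5 * (70.0 - matrix_modulus)

def reliability(n_samples=100_000, seed=1):
    """Monte Carlo propagation of primitive-variable scatter to structural reliability."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        fs = rng.gauss(100.0, 8.0)    # fiber strength scatter (assumed)
        em = rng.gauss(70.0, 3.0)     # matrix modulus scatter (assumed)
        a = rng.uniform(0.1, 0.5)     # assumed flaw-size range
        if structural_margin(fs, em, a) < 0.0:
            failures += 1
    return 1.0 - failures / n_samples

print(f"estimated reliability: {reliability():.4f}")
```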

  10. Design of Composite Structures for Reliability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1999-01-01

    A summary of research conducted during the first year is presented. The research objectives were sought by conducting two tasks: (1) investigation of probabilistic design techniques for reliability-based design of composite sandwich panels, and (2) examination of strain energy density failure criterion in conjunction with response surface methodology for global-local design of damage tolerant helicopter fuselage structures. This report primarily discusses the efforts surrounding the first task and provides a discussion of some preliminary work involving the second task.

  11. Damage Tolerance of Large Shell Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Chamis, C. C.

    1999-01-01

    Progressive damage and fracture of large shell structures is investigated. A computer model is used for the assessment of structural response, progressive fracture resistance, and defect/damage tolerance characteristics. Critical locations of a stiffened conical shell segment are identified. Defective and defect-free computer models are simulated to evaluate structural damage/defect tolerance. Safe pressurization levels are assessed for the retention of structural integrity in the presence of damage/defects. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Damage propagation and burst pressures for defective and defect-free shells are compared to evaluate damage tolerance. Design implications with regard to defect and damage tolerance of a large steel pressure vessel are examined.

  12. Probabilistic constitutive law for damage in ligaments.

    PubMed

    Guo, Zheying; De Vita, Raffaella

    2009-11-01

    A new constitutive equation is presented to describe the damage evolution process in parallel-fibered collagenous tissues such as ligaments. The model is formulated by accounting for the fibrous structure of the tissues. The tissue's stress is defined as the average of the collagen fibers' stresses. The fibers are assumed to be undulated and straightened out at different stretches that are randomly defined according to a Weibull distribution. After becoming straight, each collagen fiber is assumed to be linear elastic. Damage is defined as a reduction in collagen fiber stiffness and occurs at different stretches that are also randomly defined by a Weibull distribution. Due to the lack of experimental data, the predictions of the constitutive equation are analyzed by varying the values of its structural parameters. Moreover, the results are compared with the available stress-strain data in the biomechanics literature that evaluate damage produced by subfailure stretches in rat medial collateral ligaments. PMID:19665914
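
    A minimal numerical sketch of this averaged-fiber constitutive law is given below. Fiber recruitment and damage stretches are drawn at random (here as unit stretch plus Weibull-distributed offsets, a simplification of the paper's Weibull-distributed stretches), each recruited fiber responds linearly until its damage stretch, and the tissue stress is the mean fiber stress; all parameter values are illustrative, not fitted to ligament data.

```python
import random

def tissue_stress(stretch, n_fibers=20_000, k_fiber=100.0, seed=0):
    """Monte Carlo sketch of the averaged-fiber-stress constitutive law.

    Each fiber becomes load-bearing at a random straightening stretch and
    responds linearly (stiffness k_fiber) until a random damage stretch,
    after which its stiffness is set to zero. The random stretches are taken
    as 1 + Weibull offsets with illustrative shape/scale values.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_fibers):
        lam_s = 1.0 + rng.weibullvariate(0.05, 2.0)   # straightening stretch
        lam_d = 1.0 + rng.weibullvariate(0.25, 3.0)   # damage stretch
        if lam_s < stretch < lam_d:                   # recruited and still intact
            total += k_fiber * (stretch - lam_s)
    return total / n_fibers                           # tissue stress = mean fiber stress

for lam in (1.00, 1.05, 1.10, 1.20, 1.30):
    print(lam, round(tissue_stress(lam), 3))
```

    At small stretches the response stiffens as more fibers are recruited; at large stretches it softens as fibers exceed their damage stretch, reproducing the qualitative subfailure-damage behavior the abstract describes.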

  13. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties in damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
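
    The bagging-decision-tree idea behind such models can be sketched with standard tools: each tree in the ensemble is trained on a bootstrap sample, and the spread of the per-tree predictions gives a distribution of estimated damage rather than a single value. The example below uses scikit-learn's BaggingRegressor (whose default base estimator is a decision tree) with hypothetical feature names and toy data; it is not the authors' BT-FLEMO implementation.

```python
# Minimal sketch of bagging decision trees for probabilistic loss estimation.
import numpy as np
from sklearn.ensemble import BaggingRegressor

# columns: water depth [m], inundation duration [h], building area [m2] (toy data)
X = np.array([[0.5, 12, 120], [1.2, 36, 200], [2.0, 72, 150],
              [0.3,  6,  90], [1.8, 48, 300], [0.9, 24, 180]])
y = np.array([0.05, 0.20, 0.45, 0.02, 0.38, 0.12])   # relative loss (0..1)

model = BaggingRegressor(n_estimators=200, random_state=0).fit(X, y)

x_new = np.array([[1.5, 40, 170]])
per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
print("mean loss:", per_tree.mean())
print("5-95% range:", np.percentile(per_tree, [5, 95]))   # uncertainty of the estimate
```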

  14. Rotorcraft Damage Tolerance Evaluated by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Abdi, Frank

    2000-01-01

    An integrally stiffened graphite/epoxy composite rotorcraft structure is evaluated via computational simulation. A computer code that scales up constituent micromechanics level material properties to the structure level and accounts for all possible failure modes is used for the simulation of composite degradation under loading. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulation. Design implications with regard to defect and damage tolerance of integrally stiffened composite structures are examined. A procedure is outlined regarding the use of this type of information for setting quality acceptance criteria, design allowables, damage tolerance, and retirement-for-cause criteria.

  15. A Novel Approach to Rotorcraft Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Everett, Richard A.; Newman, John A.

    2002-01-01

    Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage-tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in an HCF component means that a design based on a traditional DT method is either impractical because of frequent inspections or too heavy to operate efficiently. Furthermore, once an HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing an HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.

  16. 77 FR 4890 - Damage Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures, and Damage Tolerance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ... Fatigue Evaluation for Metallic Structures'' (76 FR 75435), published December 2, 2011. In the ``Composite... Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures'' (76 FR 74655). On December 2, 2011... Fatigue Evaluation for Composite Rotorcraft Structures, and Damage Tolerance and Fatigue Evaluation...

  17. Demonstrating damage tolerance of composite airframes

    NASA Technical Reports Server (NTRS)

    Poe, Clarence C., Jr.

    1993-01-01

    Commercial transport aircraft operating in the United States are certified by the Federal Aviation Administration to be damage tolerant. On 28 April 1988, Aloha Airlines Flight 243, a Boeing 737-200 airplane, suffered an explosive decompression of the fuselage but landed safely. This event provides very strong justification for the damage tolerant design criteria. The likely cause of the explosive decompression was the linkup of numerous small fatigue cracks that initiated at adjacent fastener holes in the lap splice joint at the side of the body. Actually, the design should have limited the damage size to less than two frame spacings (about 40 inches), but this type of 'multi-site damage' was not originally taken into account. This cracking pattern developed only in the high-time airplanes (many flights). After discovery in the fleet, a stringent inspection program using eddy current techniques was inaugurated to discover these cracks before they linked up. Because of concerns about safety and the maintenance burden, the lap-splice joints of these high-time airplanes are being modified to remove cracks and prevent new cracking; newer designs account for 'multi-site damage'.

  18. A probabilistic fatigue analysis of multiple site damage

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, S. M.; Ruff, D.; Hillberry, B. M.; Mccabe, G.; Grandt, A. F., Jr.

    1994-01-01

    The variability in initial crack size and fatigue crack growth is incorporated in a probabilistic model that is used to predict the fatigue lives for unstiffened aluminum alloy panels containing multiple site damage (MSD). The uncertainty of the damage in the MSD panel is represented by a distribution of fatigue crack lengths that are analytically derived from equivalent initial flaw sizes. The variability in fatigue crack growth rate is characterized by stochastic descriptions of crack growth parameters for a modified Paris crack growth law. A Monte-Carlo simulation explicitly describes the MSD panel by randomly selecting values from the stochastic variables and then grows the MSD cracks with a deterministic fatigue model until the panel fails. Different simulations investigate the influences of the fatigue variability on the distributions of remaining fatigue lives. Six cases that consider fixed and variable conditions of initial crack size and fatigue crack growth rate are examined. The crack size distribution exhibited a dominant effect on the remaining fatigue life distribution, and the variable crack growth rate exhibited a lesser effect on the distribution. In addition, the probabilistic model predicted that only a small percentage of the life remains after a lead crack develops in the MSD panel.
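
    The simulation loop described above can be sketched in a few lines: sample an initial crack size and Paris-law coefficient for each panel, grow the crack deterministically, and collect the resulting life distribution. All distribution parameters, the geometry factor, and the failure criterion below are illustrative assumptions rather than the values used in the study.

```python
import math, random

def msd_life_samples(n_panels=1000, seed=0):
    """Monte Carlo sketch of a probabilistic MSD fatigue model: random initial
    crack sizes and a random Paris-law coefficient, deterministic growth until
    a critical crack size."""
    rng = random.Random(seed)
    d_sigma = 100.0        # stress range [MPa] (assumed)
    a_crit = 0.01          # critical crack length [m] (assumed failure criterion)
    dN = 500               # cycle block per integration step
    lives = []
    for _ in range(n_panels):
        a = rng.lognormvariate(math.log(2.0e-4), 0.3)    # initial crack size [m]
        C = rng.lognormvariate(math.log(5.0e-11), 0.2)   # Paris C [m/cycle per (MPa*sqrt(m))^3]
        m = 3.0
        cycles = 0
        while a < a_crit and cycles < 5_000_000:
            dK = 1.12 * d_sigma * math.sqrt(math.pi * a)  # edge-crack stress intensity range
            a += C * dK ** m * dN                         # Paris law integrated over the block
            cycles += dN
        lives.append(cycles)
    return sorted(lives)

lives = msd_life_samples()
print("median life [cycles]:", lives[len(lives) // 2])
print("1st-percentile life [cycles]:", lives[len(lives) // 100])
```

    Comparing runs with the crack-size scatter or the growth-rate scatter switched off reproduces the kind of sensitivity study the abstract describes, with the crack-size distribution dominating the spread of remaining life.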

  19. Damage Tolerance of Composite Laminates from an Empirical Perspective

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2009-01-01

    Damage tolerance consists of analysis and experimentation working together. Impact damage is usually of most concern for laminated composites. Once impacted, the residual compression strength is usually of most interest. Other properties may be of more interest than compression (application dependent). A damage tolerance program is application specific (not everyone is building aircraft). The "Building Block Approach" is suggested for damage tolerance. Advantage can be taken of the excellent fatigue resistance of damaged laminates to save time and costs.

  20. Damage Tolerance of Integral Structure in Rotorcraft

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Urban, Michael R.

    2003-01-01

    The rotorcraft industry has rapidly implemented integral structures into aircraft to benefit from the weight and cost advantages over traditionally riveted structure. The cost to manufacture an integral structure, where the entire component is machined from a single plate of material, is about one-fifth that of a riveted structure. Furthermore, the integral structure can weigh only one-half that of a riveted structure through optimal design of stiffening structure and part reduction. Finally, inspection and repair of damage in the field can be less costly than riveted structure. There are no rivet heads to inspect under, reducing inspection time, and damage can be removed or patched readily without altering the primary structure, reducing replacement or repair costs. In this paper, the authors will investigate the damage tolerance implications of fielding an integral structure manufactured from thick plate aluminum.

  1. Low cost damage tolerant composite fabrication

    NASA Technical Reports Server (NTRS)

    Palmer, R. J.; Freeman, W. T.

    1988-01-01

    The resin transfer molding (RTM) process applied to composite aircraft parts offers the potential for using low cost resin systems with dry graphite fabrics that can be significantly less expensive than prepreg tape fabricated components. Stitched graphite fabric composites have demonstrated compression after impact failure performance that equals or exceeds that of thermoplastic or tough thermoset matrix composites. This paper reviews methods developed to fabricate complex shape composite parts using stitched graphite fabrics to increase damage tolerance with RTM processes to reduce fabrication cost.

  2. Damage-tolerance strategies for nacre tablets.

    PubMed

    Wang, Shengnan; Zhu, Xinqiao; Li, Qiyang; Wang, Rizhi; Wang, Xiaoxiang

    2016-05-01

    Nacre, a natural armor, exhibits prominent penetration resistance against predatory attacks. Unraveling its hierarchical toughening mechanisms and damage-tolerance design strategies may provide significant inspiration for the pursuit of high-performance artificial armors. In this work, relationships between the structure and mechanical performance of nacre were investigated. The results show that other than their brick-and-mortar structure, individual nacre tablets significantly contribute to the damage localization of nacre. Affected by intracrystalline organics, the tablets exhibit a unique fracture behavior. The synergistic action of the nanoscale deformation mechanisms increases the energy dissipation efficiency of the tablets and contributes to the preservation of the structural and functional integrity of the shell. PMID:26892674

  3. High damage tolerance of electrochemically lithiated silicon

    NASA Astrophysics Data System (ADS)

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-09-01

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro-chemo-mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  4. High damage tolerance of electrochemically lithiated silicon

    PubMed Central

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-01-01

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries. PMID:26400671

  5. High damage tolerance of electrochemically lithiated silicon.

    PubMed

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X; Zhu, Ting; Xia, Shuman

    2015-01-01

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro-chemo-mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries. PMID:26400671

  6. High damage tolerance of electrochemically lithiated silicon

    SciTech Connect

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. In this paper, we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Finally, our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  7. High damage tolerance of electrochemically lithiated silicon

    DOE PAGES Beta

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; et al

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. In this paper, we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Finally, our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  8. Structural damage measure index based on non-probabilistic reliability model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojun; Xia, Yong; Zhou, Xiaoqing; Yang, Chen

    2014-02-01

    Uncertainties in the structural model and measurement data affect structural condition assessment in practice. Because probabilistic information on these uncertainties is lacking, a non-probabilistic interval analysis framework is developed to quantify the intervals of the structural element stiffness parameters. According to the interval intersection of the element stiffness parameters in the undamaged and damaged states, the possibility of damage existence is defined based on reliability theory. A damage measure index is then proposed as the product of the nominal stiffness reduction and the defined possibility of damage existence. This new index simultaneously reflects the damage severity and the possibility of damage at each structural component. Numerical and experimental examples are presented to illustrate the validity and applicability of the method. The results show that the proposed method can improve the accuracy of damage diagnosis compared with the deterministic damage identification method.
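
    One simple reading of the proposed index is sketched below: the possibility of damage existence is taken from the overlap of the undamaged and damaged stiffness intervals, and the damage measure index is the product of that possibility and the nominal (midpoint) stiffness reduction. The exact interval-intersection definition used in the paper may differ.

```python
def damage_possibility(u, d, n=400):
    """Possibility that the damaged stiffness lies below the undamaged stiffness,
    estimated as the fraction of the interval rectangle [d] x [u] where d < u
    (one simple interval-intersection measure)."""
    (u_lo, u_hi), (d_lo, d_hi) = u, d
    if d_hi <= u_lo:                      # damaged interval entirely below: damage certain
        return 1.0
    if d_lo >= u_hi:                      # damaged interval entirely above: no damage
        return 0.0
    acc = 0.0
    for i in range(n):                    # sweep the damaged interval on a fine grid
        dv = d_lo + (i + 0.5) * (d_hi - d_lo) / n
        acc += min(max((u_hi - dv) / (u_hi - u_lo), 0.0), 1.0)   # share of u above dv
    return acc / n

def damage_measure_index(u, d):
    """Damage measure index = nominal stiffness reduction x possibility of damage."""
    u_mid, d_mid = 0.5 * (u[0] + u[1]), 0.5 * (d[0] + d[1])
    reduction = max((u_mid - d_mid) / u_mid, 0.0)
    return reduction * damage_possibility(u, d)

# element stiffness intervals identified before and after damage (normalized, toy values)
print(damage_measure_index(u=(0.95, 1.05), d=(0.70, 0.90)))   # clear damage
print(damage_measure_index(u=(0.95, 1.05), d=(0.92, 1.02)))   # ambiguous case
```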

  9. Damage Tolerance Analysis of a Pressurized Liquid Oxygen Tank

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Harvin, Stephen F.; Gregory, Peyton B.; Mason, Brian H.; Thompson, Joe E.; Hoffman, Eric K.

    2006-01-01

    A damage tolerance assessment was conducted of an 8,000 gallon pressurized Liquid Oxygen (LOX) tank. The LOX tank is constructed of a stainless steel pressure vessel enclosed by a thermal-insulating vacuum jacket. The vessel is pressurized to 2,250 psi with gaseous nitrogen, resulting in both thermal and pressure stresses on the tank wall. Finite element analyses were performed on the tank to characterize the stresses from operation. Engineering material data were obtained both from the construction of the tank and from the technical literature. An initial damage state was assumed based on records of a nondestructive inspection performed on the tank. The damage tolerance analyses were conducted using the NASGRO computer code. This paper documents the assumptions and justifications made for the inputs to the damage tolerance analyses and presents the results, with a discussion of the operational safety of the LOX tank.

  10. Damage-Tolerant Composites Made By Stitching Carbon Fabrics

    NASA Technical Reports Server (NTRS)

    Dow, Marvin B.; Smith, Donald L.

    1992-01-01

    Work conducted at NASA Langley Research Center to investigate stitching combined with resin transfer molding to make composites more tolerant of damage and potentially cost competitive with metals. Composite materials tailored for damage tolerance by stitching layers of dry carbon fabric with closely spaced threads to provide reinforcement through thickness. Epoxy resin then infused into stitched preforms, and epoxy was cured. Various stitching patterns and thread materials evaluated by use of flat plate specimens. Also, blade-stiffened structural elements fabricated and tested. Stitched flat laminates showed outstanding damage tolerance, excellent compression strength in notched specimens, and acceptable fatigue behavior. Development of particular interest to aircraft and automotive industries.

  11. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  12. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
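
    The surrogate idea can be illustrated as follows: a cheap fitted model stands in for the expensive 3D finite element mapping from crack size to stress-intensity factor, and diagnosed crack-size uncertainty is then propagated through a crack-growth law to a remaining-useful-life distribution. The training table, loading, and Paris constants below are hypothetical stand-ins, not the paper's model.

```python
import numpy as np

# pretend these pairs came from a handful of expensive 3D FE analyses
a_train = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0]) * 1e-3            # crack size [m]
K_train = 1.12 * 100.0 * np.sqrt(np.pi * a_train) * (1 + 15 * a_train) # SIF [MPa sqrt(m)]

surrogate = np.poly1d(np.polyfit(a_train, K_train, deg=3))             # cheap stand-in for FE

def remaining_life(a0, C=5e-11, m=3.0, a_crit=0.012, dN=200):
    """Cycles to grow from diagnosed size a0 to a_crit using the surrogate SIF."""
    a, n = a0, 0
    while a < a_crit and n < 2_000_000:
        a += C * surrogate(a) ** m * dN
        n += dN
    return n

# damage diagnosis gives an uncertain current crack size; propagate it to a RUL distribution
rng = np.random.default_rng(0)
a_diag = rng.lognormal(mean=np.log(3e-3), sigma=0.15, size=2000)
rul = np.array([remaining_life(a) for a in a_diag])
print("median RUL:", np.median(rul), "cycles;  5th percentile:", np.percentile(rul, 5))
```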

  13. Progressive Fracture and Damage Tolerance of Composite Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Gotsis, Pascal K.; Minnetyan, Levon

    1997-01-01

    Structural performance (integrity, durability and damage tolerance) of fiber reinforced composite pressure vessels, designed as pressurized shelters for planetary exploration, is investigated via computational simulation. An integrated computer code is utilized for the simulation of damage initiation, growth, and propagation under pressure. Aramid fibers are considered in a rubbery polymer matrix for the composite system. Effects of fiber orientation and fabrication defects/accidental damage are investigated with regard to the safety and durability of the shelter. Results show the viability of fiber reinforced pressure vessels as damage tolerant shelters for planetary colonization.

  14. Fame-Based Probabilistic Routing for Delay-Tolerant Networks

    NASA Astrophysics Data System (ADS)

    Shin, Kwangcheol; Lee, Dongman

    One of the important technologies for the Future Internet is the delay-tolerant network, which enables data transfers even when mobile nodes are connected intermittently. Routing algorithms for a delay-tolerant network generally aim to increase the message delivery rate and decrease the number of forwarded messages in the situation of an intermittent connection. A fame-based strategy for delay-tolerant network routing is suggested in this work. The number of contacts of a node with other nodes, known as the fame degree in this work, is counted to rank the fame degree of the node. By utilizing the fame degree, the proposed routing algorithm determines the probability of forwarding the messages of a node to the contact node. Due to the characteristics of the proposed algorithm, it can be combined harmoniously with the PROPHET routing algorithm. Through experiments on well-known benchmark datasets, the proposed algorithm shows better delivery rates with a much lower number of forwarded messages and lower average hop counts of delivered messages compared to Epidemic, PROPHET, and SimBet.
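
    One plausible reading of the fame-based forwarding rule is sketched below: each node counts its contacts (its fame degree), and a carrier forwards a message to an encountered node with a probability that grows with that node's relative fame. The specific probability formula is an assumption for illustration, not the published algorithm.

```python
import random

class Node:
    def __init__(self, name):
        self.name = name
        self.fame = 0          # number of contacts observed so far (fame degree)
        self.buffer = set()    # message ids carried

    def contact(self, other, rng=random):
        """Handle an encounter: update fame degrees, then probabilistically forward."""
        self.fame += 1
        other.fame += 1
        for msg in list(self.buffer):
            # forwarding probability from relative fame (assumed form)
            p = other.fame / (self.fame + other.fame)
            if rng.random() < p:
                other.buffer.add(msg)

a, b, hub = Node("a"), Node("b"), Node("hub")
a.buffer.add("m1")
hub.fame = 40                  # a well-connected ("famous") node
a.contact(b)                   # even odds of copying m1 to b
a.contact(hub)                 # very likely to copy m1 to the hub
print(sorted(b.buffer), sorted(hub.buffer))
```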

  15. Some Examples of the Relations Between Processing and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2012-01-01

    Most structures made of laminated polymer matrix composites (PMCs) must be designed to some damage tolerance requirement that includes foreign object impact damage. Thus, from the beginning of a part's life, impact damage is assumed to exist in the material and the part is designed to carry the required load with the prescribed impact damage present. By doing this, some processing defects may automatically be accounted for in the reduced design allowable due to these impacts. This paper will present examples of how a given level of impact damage and certain processing defects affect the compression strength of a laminate that contains both. Knowledge of the impact damage tolerance requirements, before processing begins, can broaden material options and processing techniques since the structure is not being designed to pristine properties.

  16. Damage tolerance and structural monitoring for wind turbine blades.

    PubMed

    McGugan, M; Pereira, G; Sørensen, B F; Toftegaard, H; Branner, K

    2015-02-28

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind. PMID:25583858

  17. Damage tolerance and structural monitoring for wind turbine blades

    PubMed Central

    McGugan, M.; Pereira, G.; Sørensen, B. F.; Toftegaard, H.; Branner, K.

    2015-01-01

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind. PMID:25583858

  18. Damage Tolerant Microstructures for Shock Environments

    NASA Astrophysics Data System (ADS)

    Cerreta, Ellen; Dennis-Koller, Darcie; Escobedo, Juan Pablo; Fensin, Saryu; Valone, Steve; Trujillo, Carl; Bronkhorst, Curt; Lebensohn, Ricardo

    While dynamic failure due to shock loading has been studied for many years, our current ability to predict and simulate evolving damage during dynamic loading remains limited. One reason for this is the lack of understanding of the linkages between process-induced as well as evolved microstructure and damage. To this end, the role of microstructure in the early stages of dynamic damage has been studied in high purity Ta and Cu. This work, which utilizes plate-impact experiments to interrogate these effects, has recently been extended to a subset of Cu-alloys (Cu-Pb, Cu-Nb, and Cu-Ag). These multi-length-scale studies have identified a number of linkages between damage nucleation and growth and microstructural features such as grain boundary types, grain boundary orientation with respect to loading direction, grain orientation, and bi-metal interfaces. A combination of modeling and simulation techniques along with experimental observation has been utilized to examine the mechanisms for ductile damage processes such as nucleation, growth, and coalescence. This work has identified differing features of importance for damage nucleation in high purity and alloyed materials, lending insight into features of concern for mitigating shock-induced damage in more complicated alloy systems.

  19. Damage Tolerance Issues as Related to Metallic Rotorcraft Dynamic Components

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Elber, W.

    1999-01-01

    In this paper, issues related to the use of damage tolerance in the life management of rotorcraft dynamic components are reviewed. In the past, rotorcraft fatigue design has combined constant amplitude tests of full-scale parts with flight loads and usage data in a conservative manner to provide "safe life" component replacement times. In contrast to the safe life approach, over the past twenty years the United States Air Force and several other NATO nations have used damage tolerance design philosophies for fixed wing aircraft to improve safety and reliability. The reliability of the safe life approach being used in rotorcraft started to be questioned shortly after presentations at an American Helicopter Society specialists' meeting in 1980 showed predicted fatigue lives for a hypothetical pitch-link problem to vary from a low of 9 hours to a high in excess of 2594 hours. This presented serious cost, weight, and reliability implications. Somewhat after the U.S. Army introduced its six nines reliability on fatigue life, attention shifted towards using a possible damage tolerance approach to the life management of rotorcraft dynamic components. The use of damage tolerance in life management of dynamic rotorcraft parts will be the subject of this paper. This review will start with past studies on using damage tolerance life management with existing helicopter parts that were safe life designed. Also covered will be a successful attempt at certifying a tail rotor pitch rod using damage tolerance, which was designed using the safe life approach. The FAA review of rotorcraft fatigue design and their recommendations along with some on-going U.S. industry research in damage tolerance on rotorcraft will be reviewed.

  20. Damage Tolerance Issues as Related to Metallic Rotorcraft Dynamic Components

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Elber, W.

    2005-01-01

    In this paper, issues related to the use of damage tolerance in the life management of rotorcraft dynamic components are reviewed. In the past, rotorcraft fatigue design has combined constant amplitude tests of full-scale parts with flight loads and usage data in a conservative manner to provide "safe life" component replacement times. In contrast to the safe life approach, over the past twenty years the United States Air Force and several other NATO nations have used damage tolerance design philosophies for fixed wing aircraft to improve safety and reliability. The reliability of the safe life approach being used in rotorcraft started to be questioned shortly after presentations at an American Helicopter Society specialists' meeting in 1980 showed predicted fatigue lives for a hypothetical pitch-link problem to vary from a low of 9 hours to a high in excess of 2594 hours. This presented serious cost, weight, and reliability implications. Somewhat after the U.S. Army introduced its six nines reliability on fatigue life, attention shifted towards using a possible damage tolerance approach to the life management of rotorcraft dynamic components. The use of damage tolerance in life management of dynamic rotorcraft parts will be the subject of this paper. This review will start with past studies on using damage tolerance life management with existing helicopter parts that were safe life designed. Also covered will be a successful attempt at certifying a tail rotor pitch rod using damage tolerance, which was designed using the safe life approach. The FAA review of rotorcraft fatigue design and their recommendations along with some on-going U.S. industry research in damage tolerance on rotorcraft will be reviewed. Finally, possible problems and future needs for research will be highlighted.

  1. Residual ultimate strength of a very large crude carrier considering probabilistic damage extents

    NASA Astrophysics Data System (ADS)

    Choung, Joonmo; Nam, Ji-Myung; Tayyar, Tansel

    2014-03-01

    This paper provides the prediction of ultimate longitudinal strengths of the hull girders of a very large crude carrier considering probabilistic damage extents due to collision and grounding accidents based on the IMO Guidelines (2003). The probability density functions of damage extent are expressed as a function of non-dimensional damage variables. The accumulated probability levels of 10%, 30%, 50%, and 70% are taken into account for the estimation of damage extent. The ultimate strengths have been calculated using the in-house software called Ultimate Moment Analysis of Damaged Ships, which is based on the progressive collapse method with a new convergence criterion of force vector equilibrium. Damage indices are provided for several probable heeling angles from 0° (sagging) to 180° (hogging) due to collision- and grounding-induced structural failures and consequent flooding of compartments. This paper proves from the residual strength analyses that the second moment of area of a damaged section can be a reliable index for the estimation of the residual ultimate strength. A simple polynomial formula is also proposed based on minimum residual ultimate strengths.

  2. Delamination, durability, and damage tolerance of laminated composite materials

    NASA Technical Reports Server (NTRS)

    Obrien, T. Kevin

    1993-01-01

    Durability and damage tolerance may have different connotations to people from different industries and with different backgrounds. Damage tolerance always refers to a safety of flight issue where the structure must be able to sustain design limit loads in the presence of damage and return to base safely. Durability, on the other hand, is an economic issue where the structure must be able to survive a certain life under load before the initiation of observable damage. Delamination is typically the observable damage mechanism that is of concern for durability, and the growth and accumulation of delaminations through the laminate thickness is often the sequence of events that leads to failure and the loss of structural integrity.

  3. On the enhancement of impact damage tolerance of composite laminates

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Lance, D. G.

    1993-01-01

    This paper examines the use of a thin layer of Ultra High Molecular Weight Polyethylene (UHMWPE) on the outer surface of carbon/epoxy composite materials as a method of improving impact resistance and damage tolerance through hybridization. Flat 16-ply laminates as well as honeycomb sandwich structures with eight-ply facesheets were tested in this study. Instrumented drop-weight impact testing was used to inflict damage upon the specimens. Evaluation of damage resistance included instrumented impact data, visual examination, C-scanning, and compression after impact (CAI) testing. The results show that a single lamina of UHMWPE did not improve the damage tolerance (strength retention) of the 16-ply flat laminate specimens or the honeycomb sandwich beams; however, a modest gain in impact resistance (detectable damage) was found for the honeycomb sandwich specimens that contained an outer layer of UHMWPE.

  4. Ontogenetic contingency of tolerance mechanisms in response to apical damage

    PubMed Central

    Gruntman, Michal; Novoplansky, Ariel

    2011-01-01

    Background and Aims Plants are able to tolerate tissue loss through vigorous branching which is often triggered by release from apical dominance and activation of lateral meristems. However, damage-induced branching might not be a mere physiological outcome of released apical dominance, but an adaptive response to environmental signals, such as damage timing and intensity. Here, branching responses to both factors were examined in the annual plant Medicago truncatula. Methods Branching patterns and allocation to reproductive traits were examined in response to variable clipping intensities and timings in M. truncatula plants from two populations that vary in the onset of reproduction. Phenotypic selection analysis was used to evaluate the strength and direction of selection on branching under the damage treatments. Key Results Plants of both populations exhibited an ontogenetic shift in tolerance mechanisms: while early damage induced greater meristem activation, late damage elicited investment in late-determined traits, including mean pod and seed biomass, and supported greater germination rates. Severe damage mostly elicited simultaneous development of multiple-order lateral branches, but this response was limited to early damage. Selection analyses revealed positive directional selection on branching in plants under early- compared with late- or no-damage treatments. Conclusions The results demonstrate that damage-induced meristem activation is an adaptive response that could be modified according to the plant's developmental stage, severity of tissue loss and their interaction, stressing the importance of considering these effects when studying plastic responses to apical damage. PMID:21873259

  5. Mechanical Data for Use in Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; James, Mark A.; Newman, John A.; Everett, Richard A., Jr.; Johnston, William M., Jr.

    2004-01-01

    This report describes the results of a research program to determine the damage tolerance properties of metallic propeller materials. Three alloys were selected for investigation: 2025-T6 Aluminum, D6AC Steel and 4340 Steel. Mechanical response, fatigue (S-N) and fatigue crack growth rate data are presented for all of the alloys. The main conclusions that can be drawn from this study are as follows. The damage tolerant design of a propeller system will require a complete understanding of the fatigue crack growth threshold. There exists no experimental procedure to reliably develop the fatigue crack growth threshold data that is needed for damage tolerant design methods. Significant research will be required to fully understand the fatigue crack growth threshold. The development of alternative precracking methods, evaluating the effect of specimen configuration and attempting to identify micromechanical issues are simply the first steps to understanding the mechanics of the threshold.

  6. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
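
    The core sampling step can be illustrated with a plain Metropolis sampler (rather than the paper's DRAM algorithm and sparse-grid surrogate): a hypothetical closed-form strain model stands in for the finite element surrogate, and the sampler draws damage location and size from the posterior implied by noisy strain data.

```python
import math, random

SENSORS = [0.2, 0.4, 0.6, 0.8]                        # sensor positions along the part (assumed)

def strain_model(loc, size):
    """Hypothetical surrogate: damage of given size at position loc perturbs the strain field."""
    return [1.0 + size * math.exp(-((x - loc) / 0.1) ** 2) for x in SENSORS]

def log_posterior(theta, data, noise=0.02):
    loc, size = theta
    if not (0.0 < loc < 1.0 and 0.0 < size < 1.0):    # uniform prior bounds
        return -math.inf
    pred = strain_model(loc, size)
    return -0.5 * sum(((d - p) / noise) ** 2 for d, p in zip(data, pred))

def metropolis(data, n_steps=20_000, seed=1):
    rng = random.Random(seed)
    theta = [0.5, 0.5]
    lp = log_posterior(theta, data)
    samples = []
    for _ in range(n_steps):
        prop = [theta[0] + rng.gauss(0, 0.05), theta[1] + rng.gauss(0, 0.05)]
        lp_prop = log_posterior(prop, data)
        if math.log(rng.random()) < lp_prop - lp:     # accept/reject
            theta, lp = prop, lp_prop
        samples.append(tuple(theta))
    return samples[n_steps // 2:]                     # discard burn-in

truth = (0.63, 0.30)
noise_rng = random.Random(0)
data = [s + noise_rng.gauss(0, 0.02) for s in strain_model(*truth)]
post = metropolis(data)
loc_mean = sum(s[0] for s in post) / len(post)
size_mean = sum(s[1] for s in post) / len(post)
print(f"posterior mean location {loc_mean:.2f}, size {size_mean:.2f}")
```

    The posterior samples carry the uncertainty quantification directly; the paper's surrogate, weighted likelihood, and DRAM refinements address the cost and convergence of exactly this sampling loop.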

  7. An Experimental Investigation of Damage Resistances and Damage Tolerance of Composite Materials

    NASA Technical Reports Server (NTRS)

    Prabhakaran, R.

    2003-01-01

    The project included three lines of investigation, aimed at a better understanding of the damage resistance and damage tolerance of pultruded composites. The three lines of investigation were: (i) measurement of permanent dent depth after transverse indentation at different load levels, and correlation with other damage parameters such as damage area (from x-radiography) and back surface crack length, (ii) estimation of point stress and average stress characteristic dimensions corresponding to measured damage parameters, and (iii) an attempt to measure the damage area by a reflection photoelastic technique. All three lines of investigation were pursued.

  8. Damage tolerant composite wing panels for transport aircraft

    NASA Technical Reports Server (NTRS)

    Smith, Peter J.; Wilson, Robert D.; Gibbins, M. N.

    1985-01-01

    Commercial aircraft advanced composite wing surface panels were tested for durability and damage tolerance. The wing of a fuel-efficient, 200-passenger airplane for 1990 delivery was sized using graphite-epoxy materials. The damage tolerance program was structured to allow a systematic progression from material evaluations to the optimized large panel verification tests. The program included coupon testing to evaluate toughened material systems, static and fatigue tests of compression coupons with varying amounts of impact damage, element tests of three-stiffener panels to evaluate upper wing panel design concepts, and a study of the wing structure damage environment. A series of technology demonstration tests of large compression panels is performed. A repair investigation is included in the final large panel test.

  9. DNA replication: damage tolerance at the assembly line.

    PubMed

    Blastyák, András

    2014-07-01

    Damage tolerance mechanisms ensure resumption of DNA synthesis at damage-replisome encounters. Replication fork reversal (RFR) is one such widely recognized mechanism that acts on replisomes where lagging strand synthesis continues upon leading strand synthesis block. The possibility of forming such a structure runs counter to our current understanding of the dynamics of single replisomes. Here, I suggest a model that takes coupled bidirectional replisome organization into account to resolve this apparent contradiction. PMID:24957737

  10. DNA damage tolerance branches out toward sister chromatid cohesion

    PubMed Central

    Branzei, Dana

    2016-01-01

    ABSTRACT Genome duplication is temporarily coordinated with sister chromatid cohesion and DNA damage tolerance. Recently, we found that replication fork-coupled repriming is important for both optimal cohesion and error-free replication by recombination. The mechanism involved has implications for the etiology of replication-based genetic diseases and cancer. PMID:27308553

  11. 75 FR 11734 - Damage Tolerance Data for Repairs and Alterations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ...The Federal Aviation Administration (FAA) is making minor technical changes to a final rule published in the Federal Register on December 12, 2007. That final rule required holders of design approvals to make damage tolerance data for repairs and alterations to fatigue critical airplane structure available to operators. After issuing the final rule, the FAA determined that further changes were......

  12. Heat tolerance of higher plants cenosis to damaging air temperatures

    NASA Astrophysics Data System (ADS)

    Ushakova, Sofya; Shklavtsova, Ekaterina

    When designing sustainable biological-technical life support systems (BTLSS) that include higher plants as part of a photosynthesizing unit, it is important to foresee the reaction of the multi-species cenosis to various stress factors. One such factor is a change of air temperature in the BTLSS (because of a thermoregulation system failure) to values that lead to irreversible damage to photosynthetic processes. However, it is possible to increase, within certain limits, the tolerance of the plant cenosis to unfavorable temperatures by choosing higher plants that are resistant both to elevated and to lowered air temperatures. In addition, the heat tolerance of plants can be increased by subjecting them to hardening temperatures during growth. Thus, we have come to the conclusion that it is possible to increase the heat tolerance of a multi-species cenosis under the damaging effect of an air temperature of 45 °C.

  13. Homologous recombination maintenance of genome integrity during DNA damage tolerance

    PubMed Central

    Prado, Félix

    2014-01-01

    The DNA strand exchange protein Rad51 provides a safe mechanism for the repair of DNA breaks using the information of a homologous DNA template. Homologous recombination (HR) also plays a key role in the response to DNA damage that impairs the advance of the replication forks by providing mechanisms to circumvent the lesion and fill in the tracks of single-stranded DNA that are generated during the process of lesion bypass. These activities postpone repair of the blocking lesion to ensure that DNA replication is completed in a timely manner. Experimental evidence generated over the last few years indicates that HR participates in this DNA damage tolerance response together with additional error-free (template switch) and error-prone (translesion synthesis) mechanisms through intricate connections, which are presented here. The choice between repair and tolerance, and the mechanism of tolerance, is critical to avoid increased mutagenesis and/or genome rearrangements, which are both hallmarks of cancer. PMID:27308329

  14. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most of the damage models applied in standard practice describe complex damaging processes by simple, deterministic approaches such as stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K., Kreibich, H., Vogel, K., Riggelsen, C., Scherbaum, F., Merz, B. (2014): How useful are complex flood damage models? - Water Resources Research, 50, 4, p. 3378-3395.
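    As a minimal sketch of what a probabilistic, multi-variate damage model can look like (synthetic inputs are assumed here, not the authors' data), a random forest maps several damage-influencing variables to relative building loss, and the spread of per-tree predictions provides a simple uncertainty band instead of a single deterministic stage-damage value.

    ```python
    # Sketch of a probabilistic, multi-variate damage model with synthetic data.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    n = 500
    X = np.column_stack([
        rng.uniform(0.0, 3.0, n),     # water depth [m]
        rng.uniform(1.0, 120.0, n),   # inundation duration [h]
        rng.integers(0, 2, n),        # precaution measures taken (0/1)
    ])
    # Synthetic relative loss in [0, 1], for illustration only
    y = np.clip(0.25 * X[:, 0] + 0.001 * X[:, 1] - 0.1 * X[:, 2]
                + rng.normal(0.0, 0.05, n), 0.0, 1.0)

    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    query = np.array([[1.5, 48.0, 0]])                       # one building scenario
    per_tree = np.array([t.predict(query)[0] for t in rf.estimators_])
    print(f"relative loss: median={np.median(per_tree):.2f}, "
          f"90% band=({np.percentile(per_tree, 5):.2f}, {np.percentile(per_tree, 95):.2f})")
    ```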

  15. Rapid Damage eXplorer (RDX): A Probabilistic Framework for Learning Changes From Bitemporal Images

    SciTech Connect

    Vatsavai, Raju

    2012-01-01

    The recent decade has witnessed major changes on the Earth, for example, deforestation, varying cropping and human settlement patterns, and crippling damage due to disasters. Accurate assessment of damage caused by major natural and anthropogenic disasters is becoming critical due to increases in human and economic losses. This increase in loss of life and severe damage can be attributed to the growing population, as well as human migration to the disaster-prone regions of the world. Rapid assessment of these changes and dissemination of accurate information are critical for creating an effective emergency response. Change detection using high-resolution satellite images is a primary tool in assessing damage, monitoring biomass and critical infrastructure, and identifying new settlements. In this demo, we present a novel supervised probabilistic framework for identifying changes using very high-resolution, multispectral, bitemporal remote sensing images. Our demo shows that the Rapid Damage eXplorer (RDX) system is resilient to registration errors and differing sensor characteristics.
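    The RDX code itself is not reproduced in this record, so the following is only a conceptual Python sketch of supervised, probabilistic change labeling from bitemporal imagery: per-pixel band differences feed a simple Gaussian Naive Bayes classifier trained on a handful of labeled pixels, yielding a map of change probabilities.

    ```python
    # Conceptual sketch (not the RDX system): per-pixel change probabilities from a
    # supervised Gaussian Naive Bayes classifier on bitemporal band differences.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(2)
    H, W, B = 64, 64, 4                        # image height, width, spectral bands
    t1 = rng.normal(0.3, 0.05, (H, W, B))      # synthetic "before" image
    t2 = t1 + rng.normal(0.0, 0.02, (H, W, B)) # "after" image: mostly unchanged...
    t2[20:40, 20:40, :] += 0.2                 # ...except a simulated damage patch

    diff = (t2 - t1).reshape(-1, B)            # per-pixel difference features

    labels = np.zeros((H, W), dtype=int)       # "analyst" labels: 1 = changed
    labels[20:40, 20:40] = 1
    labels = labels.ravel()
    train_idx = rng.choice(H * W, size=300, replace=False)   # a few labeled pixels

    clf = GaussianNB().fit(diff[train_idx], labels[train_idx])
    p_change = clf.predict_proba(diff)[:, 1].reshape(H, W)   # per-pixel P(change)
    print("mean P(change) inside damage patch:", round(p_change[20:40, 20:40].mean(), 2))
    print("mean P(change) elsewhere:          ", round(p_change[:20, :].mean(), 2))
    ```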

  16. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.
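    As a hedged illustration of the kind of quantification such a framework enables (the error model and all numbers below are assumptions, not taken from the paper), the achieved spacing can be treated as the desired interval plus independent surveillance, wind-forecast, and speed-control error terms, and Monte Carlo sampling then estimates the probability that the deviation stays within the interval management tolerance.

    ```python
    # Assumed additive error model for the spacing deviation; values are illustrative.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    tolerance_s = 10.0                             # allowable deviation from desired spacing [s]

    surveillance_err = rng.normal(0.0, 2.0, n)     # target-state estimation error [s]
    wind_err = rng.normal(0.0, 4.0, n)             # wind-forecast induced error [s]
    control_err = rng.uniform(-3.0, 3.0, n)        # speed-command quantization/latency [s]

    deviation = surveillance_err + wind_err + control_err
    p_within = np.mean(np.abs(deviation) <= tolerance_s)
    print(f"P(|spacing error| <= {tolerance_s} s) ~= {p_within:.3f}")
    ```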

  17. Uncertainty handling in structural damage detection using fuzzy logic and probabilistic simulation

    NASA Astrophysics Data System (ADS)

    Chandrashekhar, M.; Ganguli, Ranjan

    2009-02-01

    A fuzzy logic system (FLS) with a new sliding window defuzzifier is developed for damage detection. The effect of changes in the damage evaluation parameter (frequency) due to uncertainty in material properties is explored, and the results of the probabilistic analysis are used to develop a robust FLS for damage detection. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a beam finite element (FE) model to calculate statistical properties of the variation in natural frequencies of the beam due to structural damage and material uncertainty. Variations in these frequency measures, further contaminated with measurement noise, are used for testing the FLS. The FLS developed for damage detection in the steel beam, having material uncertainty (elastic modulus) with a coefficient of variation (COV) of 3 percent and a noise level of 0.15 in the measurement data, correctly identifies the fault with an accuracy of about 94 percent. The FLS also accurately classifies the undamaged condition in the presence of these uncertainties, reducing the possibility of false alarms. From an algorithmic standpoint, this paper connects the disparate areas of probability and fuzzy logic to alleviate uncertainty issues in damage detection.
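    A compressed Python sketch of the two ingredients described above follows; the beam frequency relation, uncertainty levels, and triangular membership functions are placeholders, not the paper's FE model or tuned FLS. Monte Carlo simulation produces the scatter of a frequency-deviation measure under material uncertainty and noise, and a crude fuzzy rule then flags cases as damaged.

    ```python
    # Placeholder frequency model and fuzzy memberships; illustrative only.
    import numpy as np

    rng = np.random.default_rng(4)

    def first_frequency(E):
        # cantilever-like scaling f ~ sqrt(E); the constant is a placeholder
        return 50.0 * np.sqrt(E / 70e9)

    E_nom, cov, noise = 70e9, 0.03, 0.0015
    E_samples = rng.normal(E_nom, cov * E_nom, 10_000)          # uncertain elastic modulus

    f_healthy = first_frequency(E_samples) * (1 + rng.normal(0, noise, 10_000))
    f_damaged = first_frequency(0.9 * E_samples) * (1 + rng.normal(0, noise, 10_000))  # 10% stiffness loss

    def tri(x, a, b, c):
        """Triangular membership function on [a, c] with peak at b."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

    def fraction_flagged(freqs):
        dev = 100.0 * (1.0 - freqs / first_frequency(E_nom))    # percent frequency deviation
        mu_undamaged = tri(dev, -2.0, 0.0, 2.0)
        mu_damaged = tri(dev, 2.0, 6.0, 10.0)
        return np.mean(mu_damaged > mu_undamaged)

    print("damaged cases flagged as damaged:", round(fraction_flagged(f_damaged), 3))
    print("healthy cases flagged as damaged:", round(fraction_flagged(f_healthy), 3))
    ```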

  18. Optimization of Aerospace Structure Subject to Damage Tolerance Criteria

    NASA Technical Reports Server (NTRS)

    Akgun, Mehmet A.

    1999-01-01

    The objective of this cooperative agreement was to seek computationally efficient ways to optimize aerospace structures subject to damage tolerance criteria. Optimization was to involve sizing as well as topology optimization. The work was done in collaboration with Steve Scotti, Chauncey Wu and Joanne Walsh at the NASA Langley Research Center. Computation of constraint sensitivity is normally the most time-consuming step of an optimization procedure. The cooperative work first focused on this issue and implemented the adjoint method of sensitivity computation in an optimization code (runstream) written in Engineering Analysis Language (EAL). The method was implemented both for bar and plate elements including buckling sensitivity for the latter. Lumping of constraints was investigated as a means to reduce the computational cost. Adjoint sensitivity computation was developed and implemented for lumped stress and buckling constraints. Cost of the direct method and the adjoint method was compared for various structures with and without lumping. The results were reported in two papers. It is desirable to optimize topology of an aerospace structure subject to a large number of damage scenarios so that a damage tolerant structure is obtained. Including damage scenarios in the design procedure is critical in order to avoid large mass penalties at later stages. A common method for topology optimization is that of compliance minimization which has not been used for damage tolerant design. In the present work, topology optimization is treated as a conventional problem aiming to minimize the weight subject to stress constraints. Multiple damage configurations (scenarios) are considered. Each configuration has its own structural stiffness matrix and, normally, requires factoring of the matrix and solution of the system of equations. Damage that is expected to be tolerated is local and represents a small change in the stiffness matrix compared to the baseline (undamaged

  19. Damage-tolerant nanotwinned metals with nanovoids under radiation environments.

    PubMed

    Chen, Y; Yu, K Y; Liu, Y; Shao, S; Wang, H; Kirk, M A; Wang, J; Zhang, X

    2015-01-01

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials. PMID:25906997

  20. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1998-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the materials behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from these methods demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
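    As a back-of-envelope illustration of propagating primitive-variable scatter to a structural reliability (the distributions and factors below are assumptions, not the NASA method or its data), Monte Carlo sampling of strength, knockdown factors, and applied stress yields a margin whose exceedance frequency is read off as reliability.

    ```python
    # Assumed primitive-variable distributions propagated to a reliability estimate.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 1_000_000

    ultimate = rng.normal(900.0, 45.0, n)            # material strength [MPa], ~5% scatter
    temp_knockdown = rng.uniform(0.85, 0.95, n)      # use-temperature degradation factor
    flaw_knockdown = rng.normal(0.80, 0.04, n)       # residual-strength factor for assumed damage
    applied = rng.lognormal(np.log(450.0), 0.08, n)  # applied stress at the critical location [MPa]

    margin = ultimate * temp_knockdown * flaw_knockdown - applied
    reliability = np.mean(margin > 0.0)
    print(f"estimated reliability = {reliability:.5f} "
          f"(probability of failure ~ {1 - reliability:.1e})")
    ```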

  1. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the materials behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from these methods demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.

  2. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    PubMed Central

    Chen, Y.; Yu, K Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-01-01

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials. PMID:25906997

  3. The research and development of damage tolerant carbon fiber composites

    NASA Astrophysics Data System (ADS)

    Miranda, John Armando

    This record of study takes a first-hand look at corporate research and development efforts to improve the damage tolerance of two unique composite materials used in high-performance aerospace applications. The professional internship with The Dow Chemical Company---Dow/United Technologies joint venture describes the intern's involvement in developing patentable process technologies for interleave toughening of high-temperature resins and their composites. The subsequent internship with Hexcel Corporation describes the intern's involvement in developing the damage tolerance of novel and existing honeycomb sandwich structure technologies. Through the Doctor of Engineering professional internship experience, this student exercised fundamental academic understanding and methods toward accomplishing the corporate objectives of the internship sponsors in a resource-efficient and cost-effective manner. The student also gained tremendous autonomy through exceptional training in working in focused team environments with highly trained engineers and scientists in achieving important corporate objectives.

  4. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    DOE PAGESBeta

    Chen, Y.; Yu, K. Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  5. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Yu, K. Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-04-01

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  6. The civil Damage Tolerance Requirements in theory and practice

    NASA Astrophysics Data System (ADS)

    Broek, David

    This paper presents a brief review of the Damage Tolerance Requirements (DTR) for civil aircraft, and of their impact and significance for both newly developed and aging aircraft. Some improvements to the DTR are suggested. In principle, fracture control of aircraft is ensured by non-destructive inspection, although destructive inspection has been suggested for certain aging aircraft, as will be discussed briefly as well.

  7. Fatigue Crack Growth Database for Damage Tolerance Analysis

    NASA Technical Reports Server (NTRS)

    Forman, R. G.; Shivakumar, V.; Cardinal, J. W.; Williams, L. C.; McKeighan, P. C.

    2005-01-01

    The objective of this project was to begin the process of developing a fatigue crack growth database (FCGD) of metallic materials for use in damage tolerance analysis of aircraft structure. For this initial effort, crack growth rate data in the NASGRO (Registered trademark) database, the United States Air Force Damage Tolerant Design Handbook, and other publicly available sources were examined and used to develop a database that characterizes crack growth behavior for specific applications (materials). The focus of this effort was on materials for general commercial aircraft applications, including large transport airplanes, small transport commuter airplanes, general aviation airplanes, and rotorcraft. The end products of this project are the FCGD software and this report. The specific goal of this effort was to present fatigue crack growth data in three usable formats: (1) NASGRO equation parameters, (2) Walker equation parameters, and (3) tabular data points. The development of this FCGD will begin the process of developing a consistent set of standard fatigue crack growth material properties. It is envisioned that the end product of the process will be a general repository for credible and well-documented fracture properties that may be used as a default standard in damage tolerance analyses.
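    To illustrate how tabulated Walker-equation parameters of the kind catalogued in the FCGD are typically used (the parameter values below are placeholders, not database entries), the crack growth rate da/dN = C * [ (1-R)**(m-1) * DeltaK ]**n can be integrated cycle by cycle until the crack reaches a chosen critical size.

    ```python
    # Walker-equation crack growth integration with placeholder parameters.
    import math

    C, n_exp, m_walker = 2.0e-11, 3.2, 0.6      # hypothetical Walker parameters (m/cycle, MPa*sqrt(m))
    stress_max, R = 120.0, 0.1                  # max stress [MPa] and stress ratio
    a, a_crit, geom_factor = 0.002, 0.025, 1.0  # crack size [m], critical size, geometry factor

    cycles = 0
    while a < a_crit and cycles < 10_000_000:
        dK = geom_factor * (stress_max * (1 - R)) * math.sqrt(math.pi * a)   # Delta-K [MPa*sqrt(m)]
        dK_eff = (1 - R) ** (m_walker - 1) * dK                              # Walker mean-stress correction
        a += C * dK_eff ** n_exp                                             # crack increment per cycle
        cycles += 1

    print(f"cycles to grow from 2 mm to {a_crit * 1000:.0f} mm: ~{cycles}")
    ```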

  8. Elastic properties, strength and damage tolerance of pultruded composites

    NASA Astrophysics Data System (ADS)

    Saha, Mrinal Chandra

    Pultruded composites are candidate materials for civil engineering infrastructural applications due to their higher corrosion resistance and lower life-cycle cost. Efficient use of such materials as structural members requires a thorough understanding of the mechanisms that affect their response. The present investigation addresses the modeling and characterization of E-glass fiber/polyester resin matrix pultruded composites in the form of sheets of various thicknesses. The elastic constants were measured using static, vibration, and ultrasonic methods. Two types of piezoelectric crystals were used in the ultrasonic measurements. Finally, the feasibility of using a single specimen, in the form of a circular disk, was shown for measuring all the elastic constants using the ultrasonic technique. The effects of stress gradient on tensile strength were investigated. A large number of specimens, parallel and transverse to the pultrusion direction, were tested in tension, 3-point flexure, and 4-point flexure. A 2-parameter Weibull model was applied to predict the tensile strength from the flexure tests. The measured and Weibull-predicted ratios did not show consistent agreement. Microstructural observations suggested that the flaw distribution in the material was not uniform, which appears to be a basic requirement for the Weibull distribution. Compressive properties were measured using a short-block compression test specimen 44.4-mm long and 25.4-mm wide. Specimens were tested at 0°, 30°, 45°, 60° and 90° orientations. The compression test specimen was modeled using 4-noded isoparametric layered plate and shell elements. The predicted elastic properties for the roving layer and the continuous strand mat layer were used for the finite element study. The damage resistance and damage tolerance were investigated experimentally. Using a quasi-static indentation loading, damage was induced at various incrementally increased force levels to investigate the damage growth process. Damage
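    A hedged sketch of the 2-parameter Weibull treatment mentioned above, using synthetic strengths rather than the pultruded-composite data: the shape (Weibull modulus) and scale parameters are fitted to tensile strengths, and the weakest-link volume scaling sigma_1 / sigma_2 = (V_2 / V_1)**(1/m) then predicts the strength ratio between two effectively stressed volumes.

    ```python
    # 2-parameter Weibull fit and weakest-link size scaling with synthetic strengths.
    import numpy as np
    from scipy.stats import weibull_min

    tensile = weibull_min.rvs(c=12.0, scale=300.0, size=40, random_state=7)  # synthetic strengths [MPa]

    m, loc, sigma0 = weibull_min.fit(tensile, floc=0.0)   # 2-parameter fit (location fixed at 0)
    print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")

    V_tension, V_flexure = 1.0, 0.25        # relative effectively stressed volumes (illustrative)
    ratio = (V_tension / V_flexure) ** (1.0 / m)
    print(f"predicted flexure/tension strength ratio = {ratio:.2f}")
    ```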

  9. Sexes show differential tolerance to spittlebug damage and consequences of damage for multi-species interactions.

    PubMed

    Cole, Denise H; Ashman, Tia-Lynn

    2005-10-01

    Antagonists can play a role in sexual system evolution if tolerance or resistance is sex-dependent. Our understanding of this role will be enhanced by consideration of the effects of antagonists on other plant-animal interactions. This study determined whether the sex morphs of a gynodioecious Fragaria virginiana differ in their susceptibility and response to damage by spittlebugs and whether damage altered pollinator attraction traits or interactions with other antagonists. Tolerance, but not resistance, to spittlebugs differed between the sexes. Generally, spittlebugs were more damaging to hermaphrodites than females, a finding in accord with the hypothesis that the pollen-bearing morph is less tolerant of source-damage than the pollen-sterile morph when damage is incurred during flowering. In both sex morphs, spittlebugs reduced inflorescence height, increased petal size, but did not affect the number of open flowers per day, suggesting that the net effect of damage may be to increase pollinator attraction. Spittlebug infestation modified interactions with other antagonists in a sex-dependent manner: spittlebugs reduced attack by bud-clipping weevils in hermaphrodites but increased infection by leaf fungi in females. The complex interactions between plant sex, antagonists, and pollinator attraction documented here emphasize the importance of considering sex-differential multi-species interactions in plant sexual evolution. PMID:21646088

  10. Damage tolerance of a composite sandwich with interleaved foam core

    NASA Technical Reports Server (NTRS)

    Ishai, Ori; Hiel, Clement

    1992-01-01

    A composite sandwich panel consisting of carbon fiber-reinforced plastic (CFRP) skins and a syntactic foam core was selected as an appropriate structural concept for the design of wind tunnel compressor blades. Interleaving of the core with tough interlayers was done to prevent core cracking and to improve damage tolerance of the sandwich. Simply supported sandwich beam specimens were subjected to low-velocity drop-weight impacts as well as high-velocity ballistic impacts. The performance of the interleaved core sandwich panels was characterized by localized skin damage and minor cracking of the core. Residual compressive strength (RCS) of the skin, which was derived from flexural tests, shows the expected trend of decreasing with increasing damage size, impact energy, and velocity. In the case of skin damage, RCS values of around 50 percent of the virgin interleaved reference were obtained at the upper impact energy range. Based on the similarity between low-velocity and ballistic-impact effects, it was concluded that impact energy is the main variable controlling damage and residual strength, whereas velocity plays a minor role.

  11. A preliminary damage tolerance methodology for composite structures

    NASA Technical Reports Server (NTRS)

    Wilkins, D. J.

    1983-01-01

    The certification experience for the primary, safety-of-flight composite structure applications on the F-16 is discussed. The rationale for the selection of delamination as the major issue for damage tolerance is discussed, as well as the modeling approach selected. The development of the necessary coupon-level data base is briefly summarized. The major emphasis is on the description of a full-scale fatigue test where delamination growth was obtained to demonstrate the validity of the selected approach. A summary is used to review the generic features of the methodology.

  12. DNA damage tolerance by recombination: Molecular pathways and DNA structures.

    PubMed

    Branzei, Dana; Szakal, Barnabas

    2016-08-01

    Replication perturbations activate DNA damage tolerance (DDT) pathways, which are crucial to promote replication completion and to prevent fork breakage, a leading cause of genome instability. One mode of DDT uses translesion synthesis polymerases, which however can also introduce mutations. The other DDT mode involves recombination-mediated mechanisms, which are generally accurate. DDT occurs prevalently postreplicatively, but in certain situations homologous recombination is needed to restart forks. Fork reversal can function to stabilize stalled forks, but may also promote error-prone outcome when used for fork restart. Recent years have witnessed important advances in our understanding of the mechanisms and DNA structures that mediate recombination-mediated damage-bypass and highlighted principles that regulate DDT pathway choice locally and temporally. In this review we summarize the current knowledge and paradoxes on recombination-mediated DDT pathways and their workings, discuss how the intermediate DNA structures may influence genome integrity, and outline key open questions for future research. PMID:27236213

  13. Damage-Tolerant Fan Casings for Jet Engines

    NASA Technical Reports Server (NTRS)

    2006-01-01

    All turbofan engines work on the same principle. A large fan at the front of the engine draws air in. A portion of the air enters the compressor, but a greater portion passes on the outside of the engine; this is called bypass air. The air that enters the compressor then passes through several stages of rotating fan blades that compress the air further, and then it passes into the combustor. In the combustor, fuel is injected into the airstream, and the fuel-air mixture is ignited. The hot gases produced expand rapidly to the rear, and the engine reacts by moving forward. If there is a flaw in the system, such as an unexpected obstruction, a fan blade can break, spin off, and harm other engine components. Fan casings, therefore, need to be strong enough to contain errant blades and damage-tolerant enough to withstand the punishment of a loose blade-turned-projectile. NASA has spearheaded research into improving jet engine fan casings, ultimately discovering a cost-effective approach to manufacturing damage-tolerant fan cases that also boast significant weight reduction. In an aircraft, weight reduction translates directly into fuel burn savings, increased payload, and greater aircraft range. This technology increases safety and structural integrity; it is an attractive, viable option for engine manufacturers because of the low-cost manufacturing; and it is a practical alternative for customers, as it has the added cost-saving benefits of the weight reduction.

  14. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    SciTech Connect

    Chen, Y.; Yu, K. Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  15. Estimation of probability of failure for damage-tolerant aerospace structures

    NASA Astrophysics Data System (ADS)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This
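    A conceptual Monte Carlo sketch of the PDTA idea described above follows; all distributions, the one-cycle-per-flight load model, and the detection curve are assumptions made for illustration, not the methodology developed in the work. Cracks grow under scattered Paris-law parameters, a mid-life inspection repairs cracks it detects according to a probability-of-detection curve, and failures are tallied over a usage interval.

    ```python
    # Toy PDTA-style Monte Carlo: scattered crack growth, one inspection, failure tally.
    import numpy as np

    rng = np.random.default_rng(8)
    n_sims, flights = 10_000, 8_000
    inspect_at, a_crit = 4_000, 0.020           # inspection flight index, critical crack size [m]

    def pod(a):
        """Assumed probability-of-detection curve versus crack size."""
        return 1.0 / (1.0 + (0.002 / np.maximum(a, 1e-6)) ** 3)

    a = rng.lognormal(np.log(0.001), 0.4, n_sims)    # initial (equivalent) flaw sizes [m]
    C = rng.lognormal(np.log(1e-9), 0.3, n_sims)     # scattered Paris coefficients
    failed = np.zeros(n_sims, dtype=bool)

    for flight in range(flights):
        grow = ~failed
        dK = 90.0 * np.sqrt(np.pi * a[grow])         # one representative cycle per flight [MPa*sqrt(m)]
        a[grow] += C[grow] * dK ** 3.0
        failed |= a >= a_crit
        if flight == inspect_at:                      # detected cracks are repaired to a small size
            found = (~failed) & (rng.uniform(size=n_sims) < pod(a))
            a[found] = 0.0005

    print(f"estimated failure probability over {flights} flights: {failed.mean():.4f}")
    ```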

  16. Durability and Damage Tolerance of High Temperature Polymeric Composites

    NASA Technical Reports Server (NTRS)

    Case, Scott W.; Reifsnider, Kenneth L.

    1996-01-01

    Modern durability and damage tolerance predictions for composite material systems rely on accurate estimates of the local stress and material states for each of the constituents, as well as the manner in which the constituents interact. In this work, a number of approaches to estimating the stress states and interactions are developed. First, an elasticity solution is presented for the problem of a penny-shaped crack in an N-phase composite material system opened by a prescribed normal pressure. The stress state around such a crack is then used to estimate the stress concentrations due to adjacent fiber fractures in composite materials. The resulting stress concentrations are then used to estimate the tensile strength of the composite. The predicted results are compared with experimental values. In addition, a cumulative damage model for fatigue is presented. Modifications to the model are made to include the effects of variable amplitude loading. These modifications are based upon the use of remaining strength as a damage metric and the definition of an equivalent generalized time. The model is initially validated using results from the literature. Also, experimental data from APC-2 laminates and IM7/K3B laminates are used in the model. The use of such data for notched laminates requires the use of an effective hole size, which is calculated based upon strain distribution measurements. Measured remaining strengths after fatigue loading are compared with the predicted values for specimens fatigued at room temperature and 350 F (177 C).

  17. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    O'Brien, T. Kevin

    1990-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation, as well as adequately fail-safe or damage tolerant for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: either analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low velocity impact.
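    As a worked example of the strain-energy-release-rate characterization for delamination onset, the widely cited closed-form edge-delamination relation G = eps**2 * t * (E_lam - E_star) / 2 can be evaluated and inverted for an onset strain; the material values below are placeholders, not data from the paper.

    ```python
    # Worked example of a strain-energy-release-rate delamination onset estimate.
    import math

    E_lam = 52.0e9      # modulus of the intact laminate [Pa]
    E_star = 43.0e9     # rule-of-mixtures modulus of the delaminated sublaminates [Pa]
    t = 1.5e-3          # laminate thickness [m]
    G_c = 200.0         # interlaminar fracture toughness [J/m^2]

    # Strain energy release rate at a given applied strain
    eps = 0.004
    G = eps**2 * t * (E_lam - E_star) / 2.0
    print(f"G at eps = {eps:.3f}: {G:.0f} J/m^2")

    # Invert the relation for the predicted delamination onset strain
    eps_onset = math.sqrt(2.0 * G_c / (t * (E_lam - E_star)))
    print(f"predicted onset strain: {eps_onset:.4f}")
    ```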

  18. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    Obrien, T. Kevin

    1988-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation, as well as adequately fail-safe or damage tolerant for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: either analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low velocity impact.

  19. Advanced information processing system - Status report. [for fault tolerant and damage tolerant data processing for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Lala, J.

    1986-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles. The AIPS architecture also has attributes to enhance system effectiveness such as graceful degradation, growth and change tolerance, integrability, etc. Two key building blocks being developed by the AIPS program are a fault and damage tolerant processor and communication network. A proof-of-concept system is now being built and will be tested to demonstrate the validity and performance of the AIPS concepts.

  20. Fuel containment and damage tolerance for large composite primary aircraft structures. Phase 1: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.

    1983-01-01

    Technical problems associated with fuel containment and damage tolerance of composite material wings for transport aircraft were identified. The major tasks are the following: (1) the preliminary design of damage tolerant wing surface using composite materials; (2) the evaluation of fuel sealing and lightning protection methods for a composite material wing; and (3) an experimental investigation of the damage tolerant characteristics of toughened resin graphite/epoxy materials. The test results, the test techniques, and the test data are presented.

  1. Intraspecific competition facilitates the evolution of tolerance to insect damage in the perennial plant Solanum carolinense.

    PubMed

    McNutt, David W; Halpern, Stacey L; Barrows, Kahaili; Underwood, Nora

    2012-12-01

    Tolerance to herbivory (the degree to which plants maintain fitness after damage) is a key component of plant defense, so understanding how natural selection and evolutionary constraints act on tolerance traits is important to general theories of plant-herbivore interactions. These factors may be affected by plant competition, which often interacts with damage to influence trait expression and fitness. However, few studies have manipulated competitor density to examine the evolutionary effects of competition on tolerance. In this study, we tested whether intraspecific competition affects four aspects of the evolution of tolerance to herbivory in the perennial plant Solanum carolinense: phenotypic expression, expression of genetic variation, the adaptive value of tolerance, and costs of tolerance. We manipulated insect damage and intraspecific competition for clonal lines of S. carolinense in a greenhouse experiment, and measured tolerance in terms of sexual and asexual fitness components. Compared to plants growing at low density, plants growing at high density had greater expression of and genetic variation in tolerance, and experienced greater fitness benefits from tolerance when damaged. Tolerance was not costly for plants growing at either density, and only plants growing at low density benefited from tolerance when undamaged, perhaps due to greater intrinsic growth rates of more tolerant genotypes. These results suggest that competition is likely to facilitate the evolution of tolerance in S. carolinense, and perhaps in other plants that regularly experience competition, while spatio-temporal variation in density may maintain genetic variation in tolerance. PMID:22684886

  2. Review of the Oconee-3 probabilistic risk assessment: external events, core damage frequency. Volume 2

    SciTech Connect

    Hanan, N.A.; Ilberg, D.; Xue, D.; Youngblood, R.; Reed, J.W.; McCann, M.; Talwani, T.; Wreathall, J.; Kurth, P.D.; Bandyopadhyay, K.

    1986-03-01

    A review of the Oconee-3 Probabilistic Risk Assessment (OPRA) was conducted with the broad objective of evaluating qualitatively and quantitatively (as much as possible) the OPRA assessment of the important sequences that are "externally" generated and lead to core damage. The review included a technical assessment of the assumptions and methods used in the OPRA within its stated objective and with the limited information available. Within this scope, BNL performed a detailed reevaluation of the accident sequences generated by internal floods and earthquakes and a less detailed review (in some cases a scoping review) for the accident sequences generated by fires, tornadoes, external floods, and aircraft impact. 12 refs., 24 figs., 31 tabs.

  3. Design, testing, and damage tolerance study of bonded stiffened composite wing cover panels

    NASA Technical Reports Server (NTRS)

    Madan, Ram C.; Sutton, Jason O.

    1988-01-01

    Results are presented from the application of damage tolerance criteria for composite panels to multistringer composite wing cover panels developed under NASA's Composite Transport Wing Technology Development contract. This conceptual wing design integrated aeroelastic stiffness constraints with an enhanced damage tolerance material system, in order to yield optimized producibility and structural performance. Damage tolerance was demonstrated in a test program using full-sized cover panel subcomponents; panel skins were impacted at midbay between stiffeners, directly over a stiffener, and over the stiffener flange edge. None of the impacts produced visible damage. NASTRAN analyses were performed to simulate NDI-detected invisible damage.

  4. The combined effect of glass buffer strips and stitching on the damage tolerance of composites

    NASA Technical Reports Server (NTRS)

    Kullerd, Susan M.

    1993-01-01

    Recent research has demonstrated that through-the-thickness stitching provides major improvements in the damage tolerance of composite laminates loaded in compression. However, the brittle nature of polymer matrix composites makes them susceptible to damage propagation, requiring special material applications and designs to limit damage growth. Glass buffer strips, embedded within laminates, have shown the potential for improving the damage tolerance of unstitched composite laminates loaded in tension. The glass buffer strips, less stiff than the surrounding carbon fibers, arrest crack growth in composites under tensile loads. The present study investigates the damage tolerance characteristics of laminates that contain both stitching and glass buffer strips.

  5. The effect of resin on the impact damage tolerance of graphite-epoxy laminates

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Rhodes, M. D.

    1981-01-01

    The effect of the matrix resin on the impact damage tolerance of graphite-epoxy composite laminates was investigated. The materials were evaluated on the basis of the damage incurred due to local impact and on their ability to retain compression strength in the presence of impact damage. Twenty-four different resin systems were evaluated. Five of the systems demonstrated substantial improvements compared to the baseline system, including retention of compression strength in the presence of impact damage. Examination of the neat resin mechanical properties indicates that the resin tensile properties significantly influence laminate damage tolerance and that improvements in laminate damage tolerance are not necessarily made at the expense of room-temperature mechanical properties. Preliminary results indicate that a resin volume fraction on the order of 40 percent or greater may be required to permit the plastic flow between fibers necessary for improved damage tolerance.

  6. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Damage Tolerance and Fatigue Evaluation of... Requirements Fatigue Evaluation § 27.573 Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft... practice, the applicant must do a fatigue evaluation in accordance with paragraph (e) of this section....

  7. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  8. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  9. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Damage Tolerance and Fatigue Evaluation of... Requirements Fatigue Evaluation § 29.573 Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft... practice, the applicant must do a fatigue evaluation in accordance with paragraph (e) of this section....

  10. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  11. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  12. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance....

  13. 75 FR 793 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-06

    ...This proposal would revise airworthiness standards for type certification requirements of normal and transport category rotorcraft. The amendment would require evaluation of fatigue and residual static strength of composite rotorcraft structures using a damage tolerance evaluation, or a fatigue evaluation, if the applicant establishes that a damage tolerance evaluation is impractical. The......

  14. How oxygen damages microbes: oxygen tolerance and obligate anaerobiosis.

    PubMed

    Imlay, James A

    2002-01-01

    The orbital structure of molecular oxygen constrains it to accept electrons one at a time, and its unfavourable univalent reduction potential ensures that it can do so only with low-potential redox partners. In E. coli, this restriction prevents oxygen from oxidizing structural molecules. Instead, it primarily oxidizes reduced flavins, a reaction that is harmful only in that it generates superoxide and hydrogen peroxide as products. These species are stronger oxidants than is oxygen itself. They can oxidize dehydratase iron-sulphur clusters and sulphydryls, respectively, and thereby inactivate enzymes that are dependent upon these functional groups. Hydrogen peroxide also oxidizes free iron, generating hydroxyl radicals. Because hydroxyl radicals react with virtually any biomolecules they encounter, their reactivity is broadly dissipated, and only their reactions with DNA are known to have an important physiological impact. E. coli elaborates scavenging and repair systems to minimize the impact of this adventitious chemistry; mutants that lack these defences grow poorly in aerobic habitats. Some of the growth deficits of these mutants cannot be easily ascribed to sulphydryl, cluster, or DNA damage, indicating that important aspects of oxidative stress still lack a biochemical explanation. Obligate anaerobes cannot tolerate oxygen because they utilize metabolic schemes built around enzymes that react with oxidants. The reliance upon low-potential flavoproteins for anaerobic respiration probably causes substantial superoxide and hydrogen peroxide to be produced when anaerobes are exposed to air. These species then generate damage of the same type that they produce in aerotolerant bacteria. However, obligate anaerobes also utilize several classes of dioxygen-sensitive enzymes that are not needed by aerobes. These enzymes are used for processes that help maintain the redox balance during anaerobic fermentations. They catalyse reactions that are chemically difficult

  15. Structurally Integrated, Damage-Tolerant, Thermal Spray Coatings

    NASA Astrophysics Data System (ADS)

    Vackel, Andrew; Dwivedi, Gopal; Sampath, Sanjay

    2015-07-01

    Thermal spray coatings are used extensively for the protection and life extension of engineering components exposed to harsh wear and/or corrosion during service in the aerospace, energy, and heavy machinery sectors. Cermet coatings applied via high-velocity thermal spray are used in aggressive wear situations, almost always coupled with corrosive environments. In several instances (e.g., landing gear), coatings are considered as part of the structure, requiring system-level considerations. Despite their widespread use, the technology has lacked generalized scientific principles for robust coating design, manufacturing, and performance analysis. Advances in process and in situ diagnostics have provided significant insights into the process-structure-property-performance correlations, providing a framework for enhanced design. In this overview, critical aspects of materials, process, parametrics, and performance are discussed through exemplary studies on relevant compositions. The underlying connective theme is understanding and controlling residual stress generation, which not only addresses process dynamics but also provides linkage for the process-property relationship for both the system (e.g., fatigue) and the surface (wear and corrosion). The anisotropic microstructure also invokes the need for damage-tolerant material design to meet future goals.

  16. Damage Tolerance Assessment of Friction Pull Plug Welds

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process developed and patented by The Welding Institute in Cambridge, England. Friction stir welding has been implemented in the aerospace industry in the fabrication of longitudinal welds in pressurized cryogenic propellant tanks. As the industry looks to implement friction stir welding in circumferential welds in pressurized cryogenic propellant tanks, techniques to close out the termination hole associated with retracting the pin tool are being evaluated. Friction pull plug welding is under development as one means of closing out the termination hole. A friction pull plug weld placed in a friction stir weld results in a non-homogeneous weld joint where the initial weld, the plug weld, their respective heat affected zones, and the base metal all interact. The welded joint is a composite, plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size in the test or service environments. Test data relating residual strength capability to flaw size in two aluminum alloy friction plug weld configurations are presented.

  17. Damage Tolerance Behavior of Friction Stir Welds in Aluminum Alloys

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of various aerospace structures. Self-reacting and conventional friction stir welding are variations of the friction stir weld process employed in the fabrication of cryogenic propellant tanks, which are classified as pressurized structure in many spaceflight vehicle architectures. In order to address damage tolerance behavior associated with friction stir welds in these safety critical structures, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data describing fracture behavior, residual strength capability, and cyclic mission life capability of friction stir welds at ambient and cryogenic temperatures have been generated and will be presented in this paper. Fracture behavior will include fracture toughness and tearing (R-curve) response of the friction stir welds. Residual strength behavior will include an evaluation of the effects of lack of penetration on conventional friction stir welds, the effects of internal defects (wormholes) on self-reacting friction stir welds, and an evaluation of the effects of fatigue cycled surface cracks on both conventional and self-reacting welds. Cyclic mission life capability will demonstrate the effects of surface crack defects on service load cycle capability. The fracture data will be used to evaluate nondestructive inspection and proof test requirements for the welds.

  18. Water availability limits tolerance of apical damage in the Chilean tarweed Madia sativa

    NASA Astrophysics Data System (ADS)

    Gonzáles, Wilfredo L.; Suárez, Lorena H.; Molina-Montenegro, Marco A.; Gianoli, Ernesto

    2008-07-01

    Plant tolerance is the ability to reduce the negative impact of herbivory on plant fitness. Numerous studies have shown that plant tolerance is affected by nutrient availability, but the effect of soil moisture has received less attention. We evaluated tolerance of apical damage (clipping that mimicked insect damage) under two watering regimes (control watering and drought) in the tarweed Madia sativa (Asteraceae). We recorded number of heads with seeds and total number of heads as traits related to fitness. Net photosynthetic rate, water use efficiency, number of branches, shoot biomass, and the root:shoot biomass ratio were measured as traits potentially related to tolerance via compensatory responses to damage. In the drought treatment, damaged plants showed ≈43% reduction in reproductive fitness components in comparison with undamaged plants. In contrast, there was no significant difference in reproductive fitness between undamaged and damaged plants in the control watering treatment. Shoot biomass was not affected by apical damage. The number of branches increased after damage in both water treatments but this increase was limited by drought stress. Net photosynthetic rate increased in damaged plants only in the control watering treatment. Water use efficiency increased with drought stress and, in plants regularly watered, also increased after damage. Root:shoot ratio was higher in the low water treatment and damaged plants tended to reduce root:shoot ratio only in this water treatment. It is concluded that water availability limits tolerance to apical damage in M. sativa, and that putative compensatory mechanisms are differentially affected by water availability.

  19. Shuttle/Centaur G-prime composite adapters damage tolerance/repair test program

    NASA Technical Reports Server (NTRS)

    Sollars, Teresa A.

    1987-01-01

    The Space Shuttle/Centaur Composite Adapters Damage Tolerance/Repair Test program had as its goals determining the effects of probable and potentially critical defects or damage on the adapters' strength and stability, assessing the adequacy of repairs to significantly damaged areas, and generating NDT data for upgrading acceptance criteria. Such rational accept/reject criteria and repair methods reduce both engineering liaison costs and unnecessary parts scrapping. Successful 'damage tolerant' design ensures that degradations of strength and stability due to undetected defects or damage will not be catastrophic.

  20. Recent Advances in Durability and Damage Tolerance Methodology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Glaessgen, E. H.; Raju, I. S.; Harris, C. E.

    2007-01-01

    Durability and damage tolerance (D&DT) issues are critical to the development of lighter, safer and more efficient aerospace vehicles. Durability is largely an economic life-cycle design consideration whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. Both D&DT methodologies must address the deleterious effects of changes in material properties and the initiation and growth of damage that may occur during the vehicle's service lifetime. The result of unanticipated D&DT response is often manifested in the form of catastrophic and potentially fatal accidents. As such, durability and damage tolerance requirements must be rigorously addressed for commercial transport aircraft and NASA spacecraft systems. This paper presents an overview of the recent and planned future research in durability and damage tolerance analytical and experimental methods for both metallic and composite aerospace structures at NASA Langley Research Center (LaRC).

  1. Ethical Implications of Probabilistic Event Attribution for Policy Discussions about Loss and Damage

    NASA Astrophysics Data System (ADS)

    Otto, F. E. L.; Thompson, A.

    2014-12-01

    Warming of the global climate system is unequivocal and predominantly due to rising greenhouse gas concentrations; rising mean global temperatures have direct implications for some slow-onset events, such as sea level rise, which can therefore be linked directly to past emissions. In many regions, however, extreme weather events, like heatwaves, floods, and droughts, are associated with greater loss and damage. An increase in average temperatures will lead to an increase in the frequency or magnitude of some extreme weather events, including heat waves and droughts. For example, the deaths of at least thirty-five thousand people in Europe are attributable to the record-breaking heat wave of 2003. Extreme heat events and subsequent droughts can be directly linked to the loss of human life as well as damage to, or significant diminishment of, economic productivity. Two points are crucial here. First, the science of attributing slow-onset phenomena, such as higher mean temperatures or rising sea levels, to greenhouse gas emissions and other anthropogenic climatic forcings differs from the science of attributing particular extreme weather events, such as heat waves and extreme precipitation, to anthropogenic global climate change. The latter requires a different statistical approach. Second, extreme weather events, at least in the short term, will cause more damage and thus adversely affect society more than slow-onset phenomena. But while there is widespread agreement that slow-onset climate effects can be reliably attributed to anthropogenic greenhouse gas emissions, our ability to attribute any particular extreme weather event to anthropogenic climate change is less accepted. However, with the emerging science of probabilistic event attribution it is possible to attribute the fraction of risk caused by anthropogenic climate change to particular weather events and their associated losses. Even with high uncertainty the robust link of only a small fraction of excessive

  2. Non-probabilistic information fusion technique for structural damage identification based on measured dynamic data with uncertainty

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-Jun; Yang, Chen; Qiu, Zhi-Ping

    2013-04-01

    Based on measured natural frequencies and acceleration responses, a non-probabilistic information fusion technique is proposed for structural damage detection by adopting set-membership identification (SMI) and a two-step model updating procedure. Because the information obtained from measurements is insufficient and uncertain, the damage identification problem is addressed with interval variables in this paper. Based on a first-order Taylor series expansion, the interval bounds of the elemental stiffness parameters in the undamaged and damaged models are estimated, respectively. The possibility of damage existence (PoDE) in elements is proposed as a quantitative measure of structural damage probability, which is more reasonable when measurement data are insufficient. In comparison with identification methods based on a single kind of information, the SMI method improves the accuracy of damage identification, reflecting the information fusion concept based on non-probabilistic sets. A numerical example is performed to demonstrate the feasibility and effectiveness of the proposed technique.
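
    As a rough illustration of how a possibility of damage existence could be computed from interval estimates, the sketch below treats the PoDE of one element as the fraction of the joint interval region in which the damaged stiffness falls below the undamaged stiffness. The definition, the interval values, and the function name are illustrative assumptions made for this overview, not the exact formulation of Wang, Yang, and Qiu.

```python
# Hedged sketch: quantifying a "possibility of damage existence" (PoDE) from
# interval estimates of an elemental stiffness parameter. The definition used
# here (fraction of the joint interval region where the damaged value falls
# below the undamaged value) is an illustrative assumption.
import random

def pode(undamaged, damaged, n_samples=100_000, seed=0):
    """Estimate PoDE for one element from two stiffness intervals.

    undamaged, damaged : (low, high) interval bounds, e.g. from a first-order
    Taylor expansion of the model updating equations.
    """
    u_lo, u_hi = undamaged
    d_lo, d_hi = damaged
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        u = rng.uniform(u_lo, u_hi)   # any undamaged value in its interval
        d = rng.uniform(d_lo, d_hi)   # any damaged value in its interval
        if d < u:                     # damaged stiffness lower -> damage plausible
            hits += 1
    return hits / n_samples

if __name__ == "__main__":
    # Element whose identified stiffness interval clearly dropped: PoDE near 1.
    print(pode(undamaged=(0.95, 1.05), damaged=(0.70, 0.85)))
    # Element whose intervals overlap heavily: PoDE near 0.5 (inconclusive).
    print(pode(undamaged=(0.95, 1.05), damaged=(0.92, 1.04)))
```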

  3. An Evaluation of the Applicability of Damage Tolerance to Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Le, Dy; Turnberg, Jay

    2005-01-01

    The Federal Aviation Administration, the National Aeronautics and Space Administration and the aircraft industry have teamed together to develop methods and guidance for the safe life-cycle management of dynamic systems. Based on the success of the United States Air Force damage tolerance initiative for airframe structure, a crack growth based damage tolerance approach is being examined for implementation into the design and management of dynamic systems. However, dynamic systems accumulate millions of vibratory cycles per flight hour, more than 12,000 times faster than an airframe system. If a detectable crack develops in a dynamic system, the time to failure is extremely short, less than 100 flight hours in most cases, leaving little room for error in the material characterization, life cycle analysis, nondestructive inspection and maintenance processes. In this paper, the authors review the damage tolerant design process focusing on uncertainties that affect dynamic systems and evaluate the applicability of damage tolerance on dynamic systems.
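
    The cycle-accumulation figures quoted above can be checked with simple arithmetic. The vibratory frequency, airframe cycle rate, and crack-growth life assumed below are illustrative values chosen to be consistent with the ratios in the abstract; they are not taken from the paper.

```python
# Hedged back-of-envelope check of the cycle-accumulation rates quoted above.
# The 500 Hz vibratory frequency, the ~150 significant airframe load cycles per
# flight hour, and the crack-growth life are illustrative assumptions.
vibratory_freq_hz = 500.0                               # e.g. a rotor harmonic
dynamic_cycles_per_hour = vibratory_freq_hz * 3600.0    # 1.8 million cycles/hour

airframe_cycles_per_hour = 150.0                        # assumed gust/maneuver cycles
ratio = dynamic_cycles_per_hour / airframe_cycles_per_hour

crack_growth_life_cycles = 1.5e8                        # assumed detectable-crack-to-failure life
flight_hours_to_failure = crack_growth_life_cycles / dynamic_cycles_per_hour

print(f"dynamic cycles per flight hour: {dynamic_cycles_per_hour:,.0f}")
print(f"ratio to airframe cycle rate:   {ratio:,.0f}x")                 # ~12,000x
print(f"flight hours to failure:        {flight_hours_to_failure:.0f}") # < 100
```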

  4. Concepts for improving the damage tolerance of composite compression panels. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Rhodes, M. D.; Williams, J. G.

    1984-01-01

    The residual strength of specimens with damage and the sensitivity to damage while subjected to an applied in-plane compression load were determined for flat-plate specimens and blade-stiffened panels. The results suggest that matrix materials that fail by delamination have the lowest damage tolerance capability. Alternate matrix materials, or laminates which are transversely reinforced, suppress the delamination mode of failure and change the failure mode to transverse shear crippling, which occurs at a higher strain value. Several damage-tolerant blade-stiffened panel design concepts are evaluated. Structural efficiency studies show that only small mass penalties may result from incorporating these damage-tolerant features in panel design. The implication of the test results for the design of aircraft structures was examined with respect to FAR requirements.

  5. Strong, damage tolerant oxide-fiber/oxide matrix composites

    NASA Astrophysics Data System (ADS)

    Bao, Yahua

    cationic polyelectrolytes to have a positive surface charge and then dipped into a dilute, negatively-charged AlPO4 colloidal suspension (0.05 M) at pH 7.5. Amorphous AlPO4 (which crystallizes to tridymite and cristobalite forms at 1080°C) nanoparticles were coated on the fibers layer-by-layer using an electrostatic attraction protocol. A uniform and smooth coating was formed which allowed fiber pullout from the matrix of a Nextel 720/alumina mini-composite hot-pressed at 1250°C/20 MPa. Reaction-bonded mullite (RBM), with a low formation temperature and low sintering shrinkage, was synthesized by incorporation of mixed-rare-earth-oxide (MREO) and mullite seeds. Pure mullite formed with 7.5 wt% MREO at 1300°C. Introduction of 5 wt% mullite seeds gave RBM with less than 3% shrinkage and 20% porosity. AlPO4-coated Nextel 720/RBM composites were successfully fabricated by EPID and pressureless sintering at 1300°C. Significant fiber pullout occurred, and the 4-point bend strength was around 170 MPa (with 25-30 vol% fibers) at room temperature and 1100°C, with a work-of-fracture of 7 kJ/m2. At 1200°C, the composite failed in shear due to the MREO-based glassy phase in the matrix. AlPO4-coated Nextel 720 fiber/aluminosilicate (no MREO) showed damage tolerance at 1200°C with a bend strength of 170 MPa.

  6. ADVANCED COMPOSITE WIND TURBINE BLADE DESIGN BASED ON DURABILITY AND DAMAGE TOLERANCE

    SciTech Connect

    Galib Abumeri; Frank Abdi

    2012-02-16

    damage and fracture modes that resemble those reported in the tests. The results show that computational simulation can be relied on to enhance the design of tapered composite structures such as the ones used in wind turbine blades. A computational simulation for durability, damage tolerance (D&DT), and reliability of composite wind turbine blade structures in the presence of uncertainties in material properties was performed. A composite turbine blade was first assessed with finite-element-based multi-scale progressive failure analysis to determine failure modes and locations as well as the fracture load. D&DT analyses were then validated with static tests performed at Sandia National Laboratories (SNL). The work was followed by a detailed weight analysis to identify the contribution of various materials to the overall weight of the blade. The methodology ensured that certain types of failure modes, such as delamination progression, are contained to reduce risk to the structure. Probabilistic analysis indicated that composite shear strength has a great influence on the blade ultimate load under static loading. Weight was reduced by 12% with robust design without loss in reliability or D&DT. Structural benefits obtained with the use of enhanced matrix properties through nanoparticle infusion were also assessed. Thin unidirectional fiberglass layers enriched with silica nanoparticles were applied to the outer surfaces of a wind blade to improve its overall structural performance and durability. The wind blade was a 9-meter prototype structure manufactured and tested under three-saddle static loading at SNL. The blade manufacturing did not include the use of any nano-material. With silica nanoparticles in glass composite applied to the exterior surfaces of the blade, the durability and damage tolerance (D&DT) results from multi-scale PFA showed an increase in the ultimate load of the blade by 9.2% as compared to baseline structural performance (without nano

  7. Use of a New Portable Instrumented Impactor on the NASA Composite Crew Module Damage Tolerance Program

    NASA Technical Reports Server (NTRS)

    Jackson, Wade C.; Polis, Daniel L.

    2014-01-01

    Damage tolerance performance is critical to composite structures because surface impacts at relatively low energies may result in a significant strength loss. For certification, damage tolerance criteria require aerospace vehicles to meet design loads while containing damage at critical locations. Data from standard small coupon testing are difficult to apply to larger more complex structures. Due to the complexity of predicting both the impact damage and the residual properties, damage tolerance is demonstrated primarily by testing. A portable, spring-propelled, impact device was developed which allows the impact damage response to be investigated on large specimens, full-scale components, or entire vehicles. During impact, both the force history and projectile velocity are captured. The device was successfully used to demonstrate the damage tolerance performance of the NASA Composite Crew Module. The impactor was used to impact 18 different design features at impact energies up to 35 J. Detailed examples of these results are presented, showing impact force histories, damage inspection results, and response to loading.

  8. 14 CFR 25.571 - Damage-tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Damage-tolerance and fatigue evaluation of structure. 25.571 Section 25.571 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF..., manufacturing defects, or accidental damage, will be avoided throughout the operational life of the...

  9. 14 CFR 25.571 - Damage-tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Damage-tolerance and fatigue evaluation of structure. 25.571 Section 25.571 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... damage, will be avoided throughout the operational life of the airplane. This evaluation must...

  10. 14 CFR 25.571 - Damage-tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Damage-tolerance and fatigue evaluation of structure. 25.571 Section 25.571 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF..., manufacturing defects, or accidental damage, will be avoided throughout the operational life of the...

  11. 14 CFR 25.571 - Damage-tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Damage-tolerance and fatigue evaluation of structure. 25.571 Section 25.571 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF..., manufacturing defects, or accidental damage, will be avoided throughout the operational life of the...

  12. Phosphorylation of human INO80 is involved in DNA damage tolerance

    SciTech Connect

    Kato, Dai; Waki, Mayumi; Umezawa, Masaki; Aoki, Yuka; Utsugi, Takahiko; Ohtsu, Masaya; Murakami, Yasufumi

    2012-01-06

    Highlights: ► Depletion of hINO80 significantly reduced PCNA ubiquitination. ► Depletion of hINO80 significantly reduced the nuclear dot intensity of RAD18 after UV irradiation. ► Western blot analyses showed a phosphorylated hINO80 C-terminus. ► Overexpression of a phosphorylation mutant of hINO80 reduced PCNA ubiquitination. -- Abstract: Double strand breaks (DSBs) are the most serious type of DNA damage. DSBs can be generated directly by exposure to ionizing radiation or indirectly by replication fork collapse. The DNA damage tolerance pathway, which is conserved from bacteria to humans, prevents this collapse by overcoming replication blockages. The INO80 chromatin remodeling complex plays an important role in the DNA damage response. The yeast INO80 complex participates in the DNA damage tolerance pathway. The mechanisms regulating the yINO80 complex are not fully understood, but the yeast INO80 complex is necessary for efficient proliferating cell nuclear antigen (PCNA) ubiquitination and for recruitment of Rad18 to replication forks. In contrast, the function of the mammalian INO80 complex in DNA damage tolerance is less clear. Here, we show that human INO80 was necessary for PCNA ubiquitination and recruitment of Rad18 to DNA damage sites. Moreover, the C-terminal region of human INO80 was phosphorylated, and overexpression of a phosphorylation-deficient mutant of human INO80 resulted in decreased ubiquitination of PCNA during DNA replication. These results suggest that the human INO80 complex, like the yeast complex, is involved in the DNA damage tolerance pathway and that phosphorylation of human INO80 is involved in this pathway. These findings provide new insights into the DNA damage tolerance pathway in mammalian cells.

  13. Effect of braiding process on the damage tolerance of 3-D braided graphite/epoxy composites

    NASA Technical Reports Server (NTRS)

    El-Shiekh, Aly; Li, Wei; Hammad, Mohamed

    1989-01-01

    One of the key advantages of three-dimensional braided composite materials is their high impact damage tolerance compared with laminated composites, due to their fully integrated fibrous substrates. In this paper, the effect of different processing methods on the impact damage tolerance of braided graphite/epoxy composites is experimentally assessed. The test specimens are prepared using both of the existing three-dimensional braiding techniques (the 4-step and the 2-step processes). After the specimens are impacted under controlled impact energy, the damage introduced is studied. A compression test is then conducted to evaluate the compression strength of the specimens after impact.

  14. Hypervelocity impact damage tolerance of fused silica glass

    NASA Technical Reports Server (NTRS)

    Edelstein, K. S.

    1992-01-01

    A test program was conducted at the NASA/Johnson Space Center (JSC) concerning hypervelocity impact damage in fused silica glass. The objectives of this test program were: to expand the penetration equation data base in the velocity range between 2 and 8 km/s; to determine how much strength remains in a glass pane that has sustained known impact damage; and to develop a relationship between crater measurements and residual strength predictions that can be utilized in the Space Shuttle and Space Station programs. The results and conclusions of the residual strength testing are discussed below. Detailed discussion of the penetration equation studies will follow in future presentations.

  15. DNA bending facilitates the error-free DNA damage tolerance pathway and upholds genome integrity

    PubMed Central

    Gonzalez-Huici, Victor; Szakal, Barnabas; Urulangodi, Madhusoodanan; Psakhye, Ivan; Castellucci, Federica; Menolfi, Demis; Rajakumara, Eerappa; Fumasoni, Marco; Bermejo, Rodrigo; Jentsch, Stefan; Branzei, Dana

    2014-01-01

    DNA replication is sensitive to damage in the template. To bypass lesions and complete replication, cells activate recombination-mediated (error-free) and translesion synthesis-mediated (error-prone) DNA damage tolerance pathways. Crucial for error-free DNA damage tolerance is template switching, which depends on the formation and resolution of damage-bypass intermediates consisting of sister chromatid junctions. Here we show that a chromatin architectural pathway involving the high mobility group box protein Hmo1 channels replication-associated lesions into the error-free DNA damage tolerance pathway mediated by Rad5 and PCNA polyubiquitylation, while preventing mutagenic bypass and toxic recombination. In the process of template switching, Hmo1 also promotes sister chromatid junction formation predominantly during replication. Its C-terminal tail, implicated in chromatin bending, facilitates the formation of catenations/hemicatenations and mediates the roles of Hmo1 in DNA damage tolerance pathway choice and sister chromatid junction formation. Together, the results suggest that replication-associated topological changes involving the molecular DNA bender, Hmo1, set the stage for dedicated repair reactions that limit errors during replication and impact on genome stability. PMID:24473148

  16. Collection, processing, and reporting of damage tolerant design data for non-aerospace structural materials

    NASA Technical Reports Server (NTRS)

    Huber, P. D.; Gallagher, J. P.

    1994-01-01

    This report describes the organization, format, and content of the NASA Johnson damage tolerant database, which was created to store damage tolerant property data for non-aerospace structural materials. The database is designed to store fracture toughness data (K_Ic, K_c, J_Ic, and CTOD_Ic), resistance curve data (K_R vs. Δa_eff and J_R vs. Δa_eff), as well as subcritical crack growth data (a vs. N and da/dN vs. ΔK). The database contains complementary material property data for both stainless and alloy steels, as well as for aluminum, nickel, and titanium alloys, which were not incorporated into the Damage Tolerant Design Handbook database.
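
    One common use of stored da/dN vs. ΔK records of this kind is a Paris-law reduction, da/dN = C(ΔK)^m, fitted by least squares in log-log space. The sketch below shows such a fit on made-up data points; the Paris-law form is a standard reduction and is not necessarily the handbook's own procedure.

```python
# Hedged sketch: reducing da/dN vs. delta-K records of the kind stored in the
# database to Paris-law constants, da/dN = C * (dK)**m, via a log-log least
# squares fit. The sample data points are made up for illustration.
import math

# (delta K [MPa*sqrt(m)], da/dN [m/cycle]) -- illustrative points only
data = [(10.0, 1.0e-8), (15.0, 3.8e-8), (20.0, 1.0e-7), (30.0, 4.0e-7)]

xs = [math.log(dk) for dk, _ in data]
ys = [math.log(dadn) for _, dadn in data]
n = len(data)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
m = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
C = math.exp(y_bar - m * x_bar)

print(f"Paris exponent m = {m:.2f}")
print(f"Paris coefficient C = {C:.3e} (m/cycle)/(MPa*sqrt(m))^m")
```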

  17. Damage tolerance of candidate thermoset composites for use on single stage to orbit vehicles

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Lance, D.; Hodge, A.

    1994-01-01

    Four fiber/resin systems were compared for resistance to damage and damage tolerance. One toughened epoxy and three toughened bismaleimide (BMI) resins were used, all with IM7 carbon fiber reinforcement. A statistical design of experiments technique was used to evaluate the effects of impact energy, specimen thickness, and impactor diameter on the damage area, as computed by C-scans, and residual compression-after-impact (CAI) strength. Results showed that two of the BMI systems sustained relatively large damage zones yet had an excellent retention of CAI strength.

  18. Damage tolerance design procedures for an automotive composite

    SciTech Connect

    Corum, J.M.; Battiste, R.L.

    1998-11-01

    Among the durability issues of concern in the use of composites in automobile structures is the damaging effect that low-energy impacts (e.g., tool drops and roadway kickups) might have on strength and stiffness. This issue was experimentally investigated, and recommended design evaluation procedures were developed for a candidate automotive structural composite--a structural reaction injection-molded polyurethane reinforced with continuous-strand, swirl-mat E-glass fibers. Two test facilities were built to cover the range of impacts of interest--a pendulum device to characterize the effects of relatively heavy objects at low velocities and an air gun to characterize the effects of relatively light objects at higher velocities. In all cases, the test specimen was a 9 x 9 x 1/8-in.-thick plate clamped on an 8-in.-diam circle. Sixty-five impact tests were performed. Included were tests using various impactor sizes and weights, tests at -40 F, and tests on specimens that had been presoaked in water or exposed to battery acid. Damage areas were determined using ultrasonic C-scans, and the resulting areas were found to correlate with the quantity impactor mass to a power times velocity. A design curve was derived from the correlation and validated using dropped brick tests. To evaluate strength and stiffness reductions, the impacted plate specimens were cut into tensile, compressive, and fatigue test specimens that were used to determine reductions as a function of damage area. It was found that, for design purposes, the strength reduction could be determined by representing the damage area by a circular hole of equivalent area.
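
    A minimal sketch of the two-step design evaluation described above: estimate a damage area from an assumed impactor-mass-to-a-power-times-velocity correlation, then represent that area as a circular hole of equal area for the strength check. The constants k and p below are placeholders, not the fitted values from the report.

```python
# Hedged sketch of the two-step design evaluation: (1) estimate a damage area
# from an impactor-mass/velocity correlation of the assumed form A = k*m**p*v,
# and (2) represent that area as a circular hole of equal area for the
# strength-reduction check. k and p are placeholders, not the report's values.
import math

def damage_area(mass_kg, velocity_mps, k=2.0e-4, p=0.6):
    """Damage area [m^2] from an assumed correlation A = k * m**p * v."""
    return k * mass_kg**p * velocity_mps

def equivalent_hole_diameter(area_m2):
    """Diameter [m] of a circular hole with the same area as the damage zone."""
    return 2.0 * math.sqrt(area_m2 / math.pi)

if __name__ == "__main__":
    A = damage_area(mass_kg=0.5, velocity_mps=6.0)       # e.g. a dropped tool
    d = equivalent_hole_diameter(A)
    print(f"damage area ~ {A*1e4:.1f} cm^2, equivalent hole ~ {d*100:.1f} cm dia.")
```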

  19. Effect of resin on impact damage tolerance of graphite/epoxy laminates

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Rhodes, M. D.

    1982-01-01

    Twenty-four different epoxy resin systems were evaluated by a variety of test techniques to identify materials that exhibited improved impact damage tolerance in graphite/epoxy composite laminates. Forty-eight-ply composite panels of five of the material systems were able to sustain 100 m/s impact by a 1.27-cm-diameter aluminum projectile while statically loaded to strains of 0.005. Of the five materials with the highest tolerance to impact, two had elastomeric additives, two had thermoplastic additives, and one had a vinyl modifier; all five systems used bisphenol A as the base resin. An evaluation of the test results shows that laminate damage tolerance is largely determined by the resin tensile properties, and that improvements in laminate damage tolerance are not necessarily made at the expense of room-temperature mechanical properties. The results also suggest that a resin volume fraction of 40 percent or greater may be required to permit the plastic flow between fibers necessary for improved damage tolerance.
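
    For context, the impact condition above can be expressed in energy terms. The sketch below assumes a solid aluminum sphere of nominal density; the paper does not state the projectile mass, so this is only an order-of-magnitude estimate.

```python
# Hedged arithmetic: kinetic energy of the 1.27-cm-diameter aluminum projectile
# at 100 m/s, assuming a solid sphere of density 2700 kg/m^3 (the projectile
# shape and mass are assumptions, so treat this as an order-of-magnitude check).
import math

diameter = 0.0127            # m
density = 2700.0             # kg/m^3, aluminum
velocity = 100.0             # m/s

volume = math.pi * diameter**3 / 6.0          # sphere volume
mass = density * volume                        # ~2.9 g
energy = 0.5 * mass * velocity**2              # ~14.5 J

print(f"mass ~ {mass*1000:.1f} g, impact energy ~ {energy:.1f} J")
```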

  20. Safe-life and damage-tolerant design approaches for helicopter structures

    NASA Technical Reports Server (NTRS)

    Reddick, H. K., Jr.

    1983-01-01

    The safe-life and damage-tolerant design approaches discussed apply to both metallic and fibrous composite helicopter structures. The application of these design approaches to fibrous composite structures is emphasized. Safe-life and damage-tolerant criteria are applied to all helicopter flight-critical components, which are generally categorized as: dynamic components, comprising the main and tail rotor systems (blades, hubs, and rotating controls) and the drive train (transmission and the main and interconnecting rotor shafts); and the airframe, composed of the fuselage, aerodynamic surfaces, and landing gear.

  1. Nrf2 as a master regulator of tissue damage control and disease tolerance to infection

    PubMed Central

    Soares, Miguel P.; Ribeiro, Ana M.

    2015-01-01

    Damage control refers to actions taken to minimize damage or loss. Depending on the context, these can range from emergency procedures dealing with the sinking of a ship, to surgery dealing with severe trauma, or even to an imaginary company in Marvel comics that repairs property damaged in conflicts between superheroes and villains. In the context of host-microbe interactions, tissue damage control refers to an adaptive response that limits the extent of tissue damage associated with infection. Tissue damage control can limit the severity of infectious diseases without interfering with pathogen burden, conferring disease tolerance to infection. This contrasts with immune-driven resistance mechanisms, which, although essential to protect the host from infection, can inflict tissue damage on host parenchymal tissues. This damaging effect is countered by stress responses that confer tissue damage control and disease tolerance to infection. Here we discuss how the stress response regulated by the transcription factor nuclear factor-erythroid 2-related factor 2 (Nrf2) acts in such a manner. PMID:26551709

  2. Nrf2 as a master regulator of tissue damage control and disease tolerance to infection.

    PubMed

    Soares, Miguel P; Ribeiro, Ana M

    2015-08-01

    Damage control refers to actions taken to minimize damage or loss. Depending on the context, these can range from emergency procedures dealing with the sinking of a ship, to surgery dealing with severe trauma, or even to an imaginary company in Marvel comics that repairs property damaged in conflicts between superheroes and villains. In the context of host-microbe interactions, tissue damage control refers to an adaptive response that limits the extent of tissue damage associated with infection. Tissue damage control can limit the severity of infectious diseases without interfering with pathogen burden, conferring disease tolerance to infection. This contrasts with immune-driven resistance mechanisms, which, although essential to protect the host from infection, can inflict tissue damage on host parenchymal tissues. This damaging effect is countered by stress responses that confer tissue damage control and disease tolerance to infection. Here we discuss how the stress response regulated by the transcription factor nuclear factor-erythroid 2-related factor 2 (Nrf2) acts in such a manner. PMID:26551709

  3. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  4. Fuel containment, lightning protection and damage tolerance in large composite primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Griffin, Charles F.; James, Arthur M.

    1985-01-01

    The damage-tolerance characteristics of high strain-to-failure graphite fibers and toughened resins were evaluated. Test results show that conventional fuel tank sealing techniques are applicable to composite structures. Techniques were developed to prevent fuel leaks due to low-energy impact damage. For wing panels subjected to swept stroke lightning strikes, a surface protection of graphite/aluminum wire fabric and a fastener treatment proved effective in eliminating internal sparking and reducing structural damage. The technology features developed were incorporated and demonstrated in a test panel designed to meet the strength, stiffness, and damage tolerance requirements of a large commercial transport aircraft. The panel test results exceeded design requirements for all test conditions. Wing surfaces constructed with composites offer large weight savings if design allowable strains for compression can be increased from current levels.

  5. Damage tolerance of woven graphite-epoxy buffer strip panels

    NASA Technical Reports Server (NTRS)

    Kennedy, John M.

    1990-01-01

    Graphite-epoxy panels with S-glass buffer strips were tested in tension and shear to measure their residual strengths with crack-like damage. The buffer strips were regularly spaced narrow strips of continuous S-glass. Panels were made with a uniweave graphite cloth in which the S-glass buffer material was woven directly into the cloth. Panels were made with buffer strips of different widths and thicknesses. The panels were loaded to failure while remote strain, strain at the end of the slit, and crack opening displacement were monitored. The notched region and nearby buffer strips were radiographed periodically to reveal crack growth and damage. Except for panels with short slits, the buffer strips arrested the propagating crack. The strength (or failing strain) of the panels was significantly higher than the strength of all-graphite panels with the same length slit. Panels with wide, thick buffer strips were stronger than panels with thin, narrow buffer strips. A shear-lag model accurately predicted the failing strength of tension panels with wide buffer strips, but over-estimated the strength of the shear panels and of the tension panels with narrow buffer strips.

  6. Increasing the FOD tolerance of composites. [gas turbine engine blade foreign object damage

    NASA Technical Reports Server (NTRS)

    Novak, R. C.

    1978-01-01

    An experimental program was conducted for the purpose of increasing the foreign object damage tolerance of resin matrix composites in gas turbine engine fan blade applications. The superhybrid concept consisting of a resin matrix composite core surrounded by a sheath of boron/aluminum and titanium was found to be the most promising approach.

  7. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Damage tolerance and fatigue evaluation of structure. 23.573 Section 23.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... operational life of the airplane must be consistent with the initial detectability and subsequent growth...

  8. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Damage tolerance and fatigue evaluation of structure. 23.573 Section 23.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... operational life of the airplane must be consistent with the initial detectability and subsequent growth...

  9. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Damage tolerance and fatigue evaluation of structure. 23.573 Section 23.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... operational life of the airplane must be consistent with the initial detectability and subsequent growth...

  10. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Damage tolerance and fatigue evaluation of structure. 23.573 Section 23.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... operational life of the airplane must be consistent with the initial detectability and subsequent growth...

  11. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Damage tolerance and fatigue evaluation of structure. 23.573 Section 23.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... operational life of the airplane must be consistent with the initial detectability and subsequent growth...

  12. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures. 27.573 Section 27.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Strength Requirements Fatigue Evaluation §...

  13. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures. 29.573 Section 29.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Strength Requirements Fatigue Evaluation...

  14. Assessment of the Damage Tolerance of Postbuckled Hat-Stiffened Panels Using Single-Stringer Specimens

    NASA Technical Reports Server (NTRS)

    Bisagni, Chiara; Vescovini, Riccardo; Davila, Carlos G.

    2010-01-01

    A procedure is proposed for the assessment of the damage tolerance and collapse of stiffened composite panels using a single-stringer compression specimen. The dimensions of the specimen are determined such that the specimen's nonlinear response and collapse are representative of an equivalent multi-stringer panel in compression. Experimental tests are conducted on specimens with and without an embedded delamination. A shell-based finite element model with intralaminar and interlaminar damage capabilities is developed to predict the postbuckling response as well as the damage evolution from initiation to collapse.

  15. An assessment of buffer strips for improving damage tolerance

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.; Kennedy, J. M.

    1981-01-01

    Graphite/epoxy panels with buffer strips were tested in tension to measure their residual strength with crack-like damage. Panels were made with 45/0/-45/90(2S) and 45/0/450(2S) layups. The buffer strips were parallel to the loading direction. They were made by replacing narrow strips of the 0 deg graphite plies with strips of either 0 deg S-glass/epoxy or Kevlar-49/epoxy on either a one-for-one or a two-for-one basis. In a third case, 0 deg graphite/epoxy was used as the buffer material, and thin, perforated Mylar strips were placed between the 0 deg plies and the cross-plies to weaken the interfaces and thus isolate the 0 deg plies. Some panels were made with buffer strips of different widths and spacings. The buffer strips arrested the cracks and increased the residual strengths significantly over those of plain laminates without buffer strips. A shear-lag type stress analysis correctly predicted the effects of layup, buffer material, buffer strip width and spacing, and the number of plies of buffer material.

  16. DNA damage tolerance: a double-edged sword guarding the genome

    PubMed Central

    Ghosal, Gargi; Chen, Junjie

    2013-01-01

    Preservation of genome integrity is an essential process for cell homeostasis. During the course of the life of a single cell, the genome is constantly damaged by endogenous and exogenous agents. To ensure genome stability, cells use a global signaling network, namely the DNA damage response (DDR), to sense and repair DNA damage. DDR senses different types of DNA damage and coordinates a response that includes activation of transcription, cell cycle control, DNA repair pathways, apoptosis, senescence, and cell death. Despite several repair mechanisms that repair different types of DNA lesions, it is likely that the replication machinery will still encounter lesions that are mis-repaired or not repaired. Replication of a damaged genome would result in a high frequency of fork collapse and genome instability. In this scenario, the cells employ the DNA damage tolerance (DDT) pathway, which recruits a specialized low-fidelity translesion synthesis (TLS) polymerase to bypass the lesions for repair at a later time point. Thus, DDT is not a repair pathway per se, but provides a mechanism to tolerate DNA lesions during replication, thereby increasing survival and preventing genome instability. Paradoxically, the DDT process is also associated with increased mutagenesis, which can in turn drive the cell to cancer development. Thus, the DDT process functions as a double-edged sword guarding the genome. In this review, we will discuss the replication stress-induced DNA damage signaling cascade, the stabilization and rescue of stalled replication forks by the DDT pathway, and the effect of the DDT pathway on cancer. PMID:24058901

  17. Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti

    NASA Technical Reports Server (NTRS)

    Johnson, Jerry

    1992-01-01

    The Piaggio P-180 Avanti, a twin pusher-prop engine nine-passenger business aircraft was certified in 1990, to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerant methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris, was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.

  18. Durability and damage tolerance of Large Composite Primary Aircraft Structure (LCPAS)

    NASA Technical Reports Server (NTRS)

    Mccarty, John E.; Roeseler, William G.

    1984-01-01

    Analysis and testing addressing the key technology areas of durability and damage tolerance were completed for wing surface panels. The wing of a fuel-efficient, 200-passenger commercial transport airplane for 1990 delivery was sized using graphite-epoxy materials. Coupons of various layups used in the wing sizing were tested in tension, compression, and spectrum fatigue with typical fastener penetrations. The compression strength after barely visible impact damage was determined from coupon and structural element tests. One current material system and one toughened system were evaluated by coupon testing. The results of the coupon and element tests were used to design three distinctly different compression panels meeting the strength, stiffness, and damage-tolerance requirements of the upper wing panels. These three concepts were tested with various amounts of damage ranging from barely visible impact to through-penetration. The results of this program provide the key technology data required to assess the durability and damage-tolerance capability of advanced composites for use in commercial aircraft wing panel structure.

  19. Materials and processes laboratory composite materials characterization task, part 1. Damage tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Tucker, D. S.; Patterson, W. J.; Franklin, S. W.; Gordon, G. H.; Hart, L.; Hodge, A. J.; Lance, D. G.; Russel, S. S.

    1991-01-01

    A test run was performed on IM6/3501-6 carbon-epoxy in which the material was processed, machined into specimens, and tested for damage tolerance capabilities. Nondestructive test data played a major role in this element of composite characterization. A time chart was produced showing the time the composite material spent within each Branch or Division in order to identify those areas which produce a long turnaround time. Instrumented drop weight testing was performed on the specimens, with nondestructive evaluation performed before and after the impacts. Destructive testing in the form of cross-sectional photomicrography and compression-after-impact testing was used. Results show that the processing and machining steps need to be performed more rapidly if data on a composite material are to be collected within a reasonable timeframe. The results of the damage tolerance testing showed that IM6/3501-6 is a brittle material that is very susceptible to impact damage.

  20. The Regulation of DNA Damage Tolerance by Ubiquitin and Ubiquitin-Like Modifiers

    PubMed Central

    Cipolla, Lina; Maffia, Antonio; Bertoletti, Federica; Sabbioneda, Simone

    2016-01-01

    DNA replication is an extremely complex process that needs to be executed in a highly accurate manner in order to propagate the genome. This task requires the coordination of a number of enzymatic activities, and it is fragile and prone to arrest after DNA damage. DNA damage tolerance provides a last line of defense that allows completion of DNA replication in the presence of an unrepaired template. One such mechanism is called post-replication repair (PRR), and it is used by cells to bypass highly distorted templates caused by damaged bases. PRR is extremely important for cellular life and performs the bypass of the damage both in an error-free and in an error-prone manner. In light of these two possible outcomes, PRR needs to be tightly controlled in order to prevent the accumulation of mutations leading ultimately to genome instability. Post-translational modifications of PRR proteins provide the framework for this regulation, with ubiquitylation and SUMOylation playing a pivotal role in choosing which pathway to activate, thus controlling the different outcomes of damage bypass. The proliferating cell nuclear antigen (PCNA), the DNA clamp for replicative polymerases, plays a central role in the regulation of damage tolerance, and its modification by ubiquitin and SUMO controls both the error-free and error-prone branches of PRR. Furthermore, a significant number of polymerases involved in the bypass of DNA damage possess domains that can bind post-translational modifications, and they are themselves targets for ubiquitylation. In this review, we will focus on how ubiquitin and ubiquitin-like modifications can regulate the DNA damage tolerance systems and how they control the recruitment of different proteins to the replication fork. PMID:27379156

  1. Reduced calcium-dependent mitochondrial damage underlies the reduced vulnerability of excitotoxicity-tolerant hippocampal neurons.

    PubMed

    Pivovarova, Natalia B; Stanika, Ruslan I; Watts, Charlotte A; Brantner, Christine A; Smith, Carolyn L; Andrews, S Brian

    2008-03-01

    In central neurons, over-stimulation of NMDA receptors leads to excessive mitochondrial calcium accumulation and damage, which is a critical step in excitotoxic death. This raises the possibility that low susceptibility to calcium overload-induced mitochondrial damage might characterize excitotoxicity-resistant neurons. In this study, we have exploited two complementary models of preconditioning-induced excitotoxicity resistance to demonstrate reduced calcium-dependent mitochondrial damage in NMDA-tolerant hippocampal neurons. We have further identified adaptations in mitochondrial calcium handling that account for enhanced mitochondrial integrity. In both models, enhanced tolerance was associated with improved preservation of mitochondrial membrane potential and structure. In the first model, which exhibited modest neuroprotection, mitochondria-dependent calcium deregulation was delayed, even though cytosolic and mitochondrial calcium loads were quantitatively unchanged, indicating that enhanced mitochondrial calcium capacity accounts for reduced injury. In contrast, the second model, which exhibited strong neuroprotection, displayed further delayed calcium deregulation and reduced mitochondrial damage because downregulation of NMDA receptor surface expression depressed calcium loading. Reducing calcium entry also modified the chemical composition of the calcium-buffering precipitates that form in calcium-loaded mitochondria. It thus appears that reduced mitochondrial calcium loading is a major factor underlying the robust neuroprotection seen in highly tolerant cells. PMID:18036152

  2. Damage Tolerance Testing of a NASA TransHab Derivative Woven Inflatable Module

    NASA Technical Reports Server (NTRS)

    Edgecombe, John; delaFuente, Horacio; Valle, Gerard

    2009-01-01

    Current options for Lunar habitat architecture include inflatable habitats and airlocks. Inflatable structures can have mass and volume advantages over conventional structures. However, inflatable structures carry different inherent risks and are at a lower Technical Readiness Level (TRL) than more conventional metallic structures. One of the risks associated with inflatable structures is in understanding the tolerance to induced damage. The Damage Tolerance Test (DTT) is designed to study the structural integrity of an expandable structure. TransHab (Figure 1) was an experimental inflatable module developed at the NASA/Johnson Space Center in the 1990s. The TransHab design was originally envisioned for use in Mars transits but was also studied as a potential habitat for the International Space Station (ISS). The design of the TransHab module was based on a woven design using an Aramid fabric. Testing of this design demonstrated a high level of predictability and repeatability with analytical predictions of stresses and deflections. Based on JSC's experience with the design and analysis of woven inflatable structures, the Damage Tolerance Test article was designed and fabricated using a woven design. The DTT article was inflated to 45 psig, representing 25% of the ultimate burst pressure, and one of the one-inch-wide longitudinal structural members was severed by initiating a Linear Shaped Charge (LSC). Strain gage measurements, at the interface between the expandable elements (straps) and the nonexpandable metallic elements for pre-selected longitudinal straps, were taken throughout pressurization of the module and strap separation. Strain gage measurements show no change in longitudinal strap loading at the bulkhead interface after strap separation, indicating loads in the restraint layer were re-distributed local to the damaged area due to the effects of friction under high internal pressure loading. The test completed all primary objectives with better than

  3. Damage tolerance of wrought alloy 718 Ni-Fe-base superalloy

    SciTech Connect

    Chang, M. . Dept. of Mechanical and Aerospace Engineering); Koul, A.K.; Au, P.; Terada, T. . Structures and Materials Lab.)

    1994-06-01

    The influence of a modified heat treatment (MHT) and the standard heat treatment (SHT) on the damage tolerance of alloy 718 turbine disk material has been studied over a range of temperatures -- from room temperature to 650 C. The influence of these heat treatments on creep, low-cycle fatigue (LCF), notch sensitivity, cyclic stability, and fatigue crack growth rate (FCGR) properties has been studied. The microstructure developed through the MHT sequence is shown to be damage tolerant over the temperature range studied. Shot peening leads to a marked improvement in the LCF crack initiation life of the MHT material relative to the SHT material at 650 C. Serrated grain boundaries formed through controlled precipitation of grain-boundary [delta] phase are beneficial to elevated-temperature FCGRs. The [delta]-phase precipitates formed at an angle to the grain boundaries do not make the material notch sensitive.

  4. Damage tolerance of wrought alloy 718 Ni- Fe-base superalloy

    NASA Astrophysics Data System (ADS)

    Chang, M.; Koul, A. K.; Au, P.; Terada, T.

    1994-06-01

    The influence of a modified heat treatment (MHT) and the standard heat treatment (SHT) on the damage tolerance of alloy 718 turbine disk material has been studied over a range of temperatures, from room temperature to 650 °C. The influence of these heat treatments on creep, low-cycle fatigue (LCF), notch sensitivity, cyclic stability, and fatigue crack growth rate (FCGR) properties has been studied. The microstructure developed through the MHT sequence is shown to be damage tolerant over the temperature range studied. Shot peening leads to a marked improvement in the LCF crack initiation life of the MHT material relative to the SHT material at 650 °C. Serrated grain boundaries formed through controlled precipitation of grain-boundary δ phase are beneficial to elevated-temperature FCGRs. The δ-phase precipitates formed at an angle to the grain boundaries do not make the material notch sensitive.

  5. Advanced Damage Tolerance Analysis of International Space Station Pressure Wall Welds

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.

    2006-01-01

    EM20/MSFC has sponsored technology in the area of advanced damage tolerance analysis tools used to analyze the International Space Station (ISS) pressure wall welds. The ISS European modules did not receive non-destructive evaluation (NDE) inspection after proof test. In the final assembly configuration, most welds could only be inspected from one side, and some welds were uninspectable. Therefore, advanced damage tolerance analysis was required to determine the critical initial flaw sizes and predicted safe life for the pressure wall welds. EM20 sponsored the development of new finite element tools using FEA-Crack and WARP3D to solve the problem. This presentation gives a brief overview of the new analytical tools and the analysis results.

  6. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue.

  7. Probabilistic characteristics of random damage events and their quantification in acrylic bone cement.

    PubMed

    Qi, Gang; Wayne, Steven F; Penrose, Oliver; Lewis, Gladius; Hochstein, John I; Mann, Kenneth A

    2010-11-01

    The failure of brittle and quasi-brittle polymers can be attributed to a multitude of random microscopic damage modes, such as fibril breakage, crazing, and microfracture. As the load increases, new damage modes appear, and existing ones can transition into others. In the example polymer used in this study--a commercially available acrylic bone cement--these modes, as revealed by scanning electron microscopy of fracture surfaces, include nucleation of voids, cracking, and local detachment of the beads from the matrix. Here, we made acoustic measurements of the randomly generated microscopic events (RGME) that occurred in the material under pure tension and under three-point bending, and characterized the severity of the damage by the entropy (s) of the probability distribution of the observed acoustic signal amplitudes. We correlated s with the applied stress (σ) by establishing an empirical s-σ relationship, which quantifies the activities of RGME under Mode I stress. It reveals the state of random damage modes: when ds/dσ > 0, the number of damage modes present increases with increasing stress, whereas it decreases when ds/dσ < 0. When ds/dσ ≈ 0, no new random damage modes occur. In the s-σ curve, there exists a transition zone, with the stress at the "knee point" in this zone (center of the zone) corresponding to ~30 and ~35% of the cement's tensile and bending strengths, respectively. This finding explains the effects of RGME on material fatigue performance and may be used to approximate fatigue limit. PMID:20857320
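
    A minimal sketch of the entropy measure s described above: the Shannon entropy of the histogram of acoustic-emission amplitudes recorded at a given stress level, together with a finite-difference estimate of ds/dσ. The bin count, the use of natural logarithms, and the synthetic amplitude sets are assumptions, not the authors' exact procedure.

```python
# Hedged sketch of the entropy measure s: Shannon entropy of the histogram of
# acoustic-emission amplitudes at a given stress level, plus a finite-difference
# estimate of ds/dsigma. Binning and the amplitude data are illustrative only.
import math
import random

def amplitude_entropy(amplitudes, n_bins=20):
    """Shannon entropy of the empirical amplitude distribution."""
    lo, hi = min(amplitudes), max(amplitudes)
    width = (hi - lo) / n_bins or 1.0
    counts = [0] * n_bins
    for a in amplitudes:
        idx = min(int((a - lo) / width), n_bins - 1)
        counts[idx] += 1
    total = len(amplitudes)
    return -sum((c / total) * math.log(c / total) for c in counts if c)

if __name__ == "__main__":
    rng = random.Random(1)
    # Synthetic AE amplitude sets at two stress levels: broader spread at higher stress.
    s_low = amplitude_entropy([rng.gauss(40, 3) for _ in range(2000)])
    s_high = amplitude_entropy([rng.gauss(45, 8) for _ in range(2000)])
    ds_dsigma = (s_high - s_low) / (12.0 - 8.0)   # assumed stresses of 8 and 12 MPa
    print(f"s(low) = {s_low:.2f}, s(high) = {s_high:.2f}, ds/dsigma = {ds_dsigma:+.3f}")
    # ds/dsigma > 0 -> new random damage modes still appearing with rising stress.
```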

  8. Advanced Durability and Damage Tolerance Design and Analysis Methods for Composite Structures: Lessons Learned from NASA Technology Development Programs

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Shuart, Mark J.

    2003-01-01

    Aerospace vehicles are designed to be durable and damage tolerant. Durability is largely an economic life-cycle design consideration whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. However, both durability and damage tolerance design methodologies must address the deleterious effects of changes in material properties and the initiation and growth of microstructural damage that may occur during the service lifetime of the vehicle. Durability and damage tolerance design and certification requirements are addressed for commercial transport aircraft and NASA manned spacecraft systems. The state-of-the-art in advanced design and analysis methods is illustrated by discussing the results of several recently completed NASA technology development programs. These programs include the NASA Advanced Subsonic Technology Program demonstrating technologies for large transport aircraft and the X-33 hypersonic test vehicle demonstrating technologies for a single-stage-to-orbit space launch vehicle.

  9. Variation and fitness costs for tolerance to different types of herbivore damage in Boechera stricta genotypes with contrasting glucosinolate structures

    PubMed Central

    Manzaneda, Antonio J.; Prasad, Kasavajhala V. S. K.; Mitchell-Olds, Thomas

    2010-01-01

    Analyses of plant tolerance in response to different modes of herbivory are essential to understand plant defense evolution, yet are still scarce. Allocation costs and trade-offs between tolerance and plant chemical defenses may influence genetic variation for tolerance. However, variation in defenses also occurs for the presence or absence of discrete chemical structures, yet the effects of intra-specific polymorphisms on tolerance to multiple herbivores have not been evaluated. Here, in a glasshouse experiment, we investigated variation for tolerance to different types of herbivory damage, and direct allocation costs, in 10 genotypes of Boechera stricta (Brassicaceae), a wild relative of Arabidopsis, with contrasting foliar glucosinolate chemical structures (methionine-derived glucosinolates vs glucosinolates derived from branched-chain amino acids). We found significant genetic variation for tolerance to different types of herbivory. Structural variations in the glucosinolate profile did not influence tolerance to damage, but predicted plant fitness. Levels of constitutive and induced glucosinolates varied between genotypes with different structural profiles, but we did not detect any cost of tolerance explaining genetic variation in tolerance among genotypes. Trade-offs among plant tolerance to multiple herbivores may not explain the existence of intermediate levels of tolerance to damage in plants with contrasting chemical defensive profiles. PMID:20663059

  10. Damage Tolerance of Pre-Stressed Composite Panels Under Impact Loads

    NASA Astrophysics Data System (ADS)

    Johnson, Alastair F.; Toso-Pentecôte, Nathalie; Schueler, Dominik

    2014-02-01

    An experimental test campaign studied the structural integrity of carbon fibre/epoxy panels preloaded in tension or compression and then subjected to gas gun impact tests causing significant damage. The test programme used representative composite aircraft fuselage panels composed of aerospace carbon fibre toughened epoxy prepreg laminates. Preload levels in tension were representative of design limit loads for fuselage panels of this size, and maximum compression preloads were in the post-buckle region. Two main impact scenarios were considered: notch damage from a 12 mm steel cube projectile, at velocities in the range 93-136 m/s; and blunt impact damage from 25 mm diameter glass balls, at velocities of 64-86 m/s. The combined influence of preload and impact damage on panel residual strengths was measured and the results analysed in the context of damage tolerance requirements for composite aircraft panels. The tests showed structural integrity well above design limit loads for composite panels preloaded in tension and compression with visible notch impact damage from hard body impact tests. However, blunt impact tests on buckled compression loaded panels caused large delamination damage regions which lowered plate bending stiffness and significantly reduced compression strengths in buckling.

  11. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
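    A minimal sketch of the kind of uncertainty propagation the report describes is given below: primitive variables with assumed scatter are sampled, pushed through a (here deliberately trivial) structural response function, and the resulting stress distribution is compared against a scattered allowable to estimate reliability. The distributions, response function, and allowable are illustrative placeholders rather than the report's turbine engine component models.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # primitive variables with assumed scatter (illustrative means and COVs)
        thickness = rng.normal(2.0e-3, 0.03 * 2.0e-3, n)                 # m
        load = rng.lognormal(mean=np.log(9.0e3), sigma=0.10, size=n)     # N

        # toy structural response: axial stress in a 25 mm wide strip
        stress = load / (0.025 * thickness)                              # Pa
        allowable = rng.normal(450e6, 0.07 * 450e6, n)                   # strength scatter, Pa

        reliability = np.mean(stress < allowable)
        print(f"estimated reliability: {reliability:.4f}")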

  12. An assessment of buffer strips for improving damage tolerance of composite laminates at elevated temperature

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1981-01-01

    Buffer strips greatly improve the damage tolerance of graphite/epoxy laminates loaded in tension. Graphite/polyimide buffer strip panels were made and tested to determine their residual strength at ambient and elevated (177 °C) temperature. Each panel was cut in the center to represent damage. Panels were radiographed and crack-opening displacements were recorded to indicate fracture, fracture arrest, and the extent of damage in the buffer strip after arrest. All panels had the same buffer strip spacing and width. The buffer strip material was 0 deg S-glass/PMR-15. The buffer strips were made by replacing narrow strips of the 0 deg graphite plies with strips of the 0 deg S-glass on either a one-for-one or a two-for-one basis. Half of the panels were heated to 177 ± 3 °C before and during the testing. Elevated temperature did not alter the fracture behavior of the buffer configuration.

  13. Nitroglycerin induces DNA damage and vascular cell death in the setting of nitrate tolerance.

    PubMed

    Mikhed, Yuliya; Fahrer, Jörg; Oelze, Matthias; Kröller-Schön, Swenja; Steven, Sebastian; Welschof, Philipp; Zinßius, Elena; Stamm, Paul; Kashani, Fatemeh; Roohani, Siyer; Kress, Joana Melanie; Ullmann, Elisabeth; Tran, Lan P; Schulz, Eberhard; Epe, Bernd; Kaina, Bernd; Münzel, Thomas; Daiber, Andreas

    2016-07-01

    Nitroglycerin (GTN) and other organic nitrates are widely used vasodilators. Their side effects are the development of nitrate tolerance and endothelial dysfunction. Given the potential of GTN to induce nitro-oxidative stress, we investigated the interaction between nitro-oxidative DNA damage and vascular dysfunction in experimental nitrate tolerance. Cultured endothelial hybridoma cells (EA.hy 926) and Wistar rats were treated with GTN (ex vivo: 10-1000 µM; in vivo: 10, 20 and 50 mg/kg/day for 3 days, s.c.). The level of DNA strand breaks, 8-oxoguanine and O6-methylguanine DNA adducts was determined by Comet assay, dot blot and immunohistochemistry. Vascular function was determined by isometric tension recording. DNA adducts and strand breaks were induced by GTN in cells in vitro in a concentration-dependent manner. GTN administration in vivo leads to endothelial dysfunction, nitrate tolerance, aortic and cardiac oxidative stress, formation of DNA adducts, stabilization of p53 and apoptotic death of vascular cells in a dose-dependent fashion. Mice lacking O6-methylguanine-DNA methyltransferase displayed more vascular O6-methylguanine adducts and oxidative stress under GTN therapy than wild-type mice. Although we were not able to prove a causal role of DNA damage in the etiology of nitrate tolerance, the finding of GTN-induced DNA damage such as the mutagenic and toxic adduct O6-methylguanine, and of cell death, supports the notion that GTN-based therapy may provoke adverse side effects, including endothelial dysfunction. Further studies are warranted to clarify whether GTN pro-apoptotic effects are related to an impaired recovery of patients upon myocardial infarction. PMID:27357950

  14. Modeling continuous-fiber reinforced polymer composites for exploration of damage tolerant concepts

    NASA Astrophysics Data System (ADS)

    Matthews, Peter J.

    This work aims to improve the predictive capability for fiber-reinforced polymer matrix composite laminates using the finite element method. A new tool for modeling composite damage was developed which considers the important modes of failure. Well-known micromechanical models were implemented to predict material values for material systems of interest to aerospace applications. These generated material values served as input to intralaminar and interlaminar damage models. A three-dimensional in-plane damage material model was implemented and its behavior verified. Deficiencies in current state-of-the-art interlaminar capabilities were explored using the virtual crack closure technique and the cohesive zone model. A user-defined cohesive element was implemented to discover the importance of traction-separation material constitutive behavior. A novel method for correlation of traction-separation parameters was created. This new damage modeling tool was used for evaluation of novel material systems to improve damage tolerance. Classical laminate plate theory was used in a full-factorial study of layerwise-hybrid laminates. Filament-wound laminated composite cylindrical shells were subjected to quasi-static loading to validate the finite element computational composite damage model. The new tool for modeling provides sufficient accuracy and generality for use on a wide range of problems.

  15. Shared Genetic Pathways Contribute to the Tolerance of Endogenous and Low-Dose Exogenous DNA Damage in Yeast

    PubMed Central

    Lehner, Kevin; Jinks-Robertson, Sue

    2014-01-01

    DNA damage that escapes repair and blocks replicative DNA polymerases is tolerated by bypass mechanisms that fall into two general categories: error-free template switching and error-prone translesion synthesis. Prior studies of DNA damage responses in Saccharomyces cerevisiae have demonstrated that repair mechanisms are critical for survival when a single, high dose of DNA damage is delivered, while bypass/tolerance mechanisms are more important for survival when the damage level is low and continuous (acute and chronic damage, respectively). In the current study, epistatic interactions between DNA-damage tolerance genes were examined and compared when haploid yeast cells were exposed to either chronic ultraviolet light or chronic methyl methanesulfonate. Results demonstrate that genes assigned to error-free and error-prone bypass pathways similarly promote survival in the presence of each type of chronic damage. In addition to using defined sources of chronic damage, rates of spontaneous mutations generated by the Pol ζ translesion synthesis DNA polymerase (complex insertions in a frameshift-reversion assay) were used to infer epistatic interactions between the same genes. Similar epistatic interactions were observed in analyses of spontaneous mutation rates, suggesting that chronic DNA-damage responses accurately reflect those used to tolerate spontaneous lesions. These results have important implications when considering what constitutes a safe and acceptable level of exogenous DNA damage. PMID:25060101

  16. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    SciTech Connect

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal being to maximize the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.
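    As a simple illustration of the Monte Carlo treatment of material property variation mentioned above, a brittle component's failure probability can be estimated by sampling a Weibull strength distribution and comparing it with the computed peak stress. The Weibull parameters and stress value below are invented for illustration; the actual CDM model applies combined stress- and fracture-mechanics-based criteria within a finite element analysis.

        import numpy as np

        rng = np.random.default_rng(1)

        # assumed Weibull strength scatter for a brittle graphite-like material (placeholders)
        m_weibull, sigma_0 = 8.0, 30.0          # Weibull modulus and characteristic strength, MPa
        peak_stress = 22.0                      # assumed peak component stress, MPa

        strengths = sigma_0 * rng.weibull(m_weibull, size=50_000)
        p_failure = np.mean(strengths < peak_stress)
        print(f"Monte Carlo failure probability: {p_failure:.3%}")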

  17. Mechanical behavior, damage tolerance and durability of fiber metal laminates for aircraft structures

    NASA Astrophysics Data System (ADS)

    Wu, Guocai

    This study systematically explores the mechanical behavior, damage tolerance and durability of fiber metal laminates, a promising candidate materials system for next generation aerospace structures. The experimental results indicated that GLARE laminates exhibited a bilinear deformation behavior under static in-plane loading. Both an analytical constitutive model based on a modified classical lamination theory which incorporates the elasto-plastic behavior of aluminum alloy and a numerical simulation based on finite element modeling are used to predict the nonlinear stress-strain response and deformation behavior of GLARE laminates. The blunt notched strength of GLARE laminates increased with decreasing specimen width and decreasing hole diameter. The notched strength of GLARE laminates was evaluated based on a modified point stress criterion. A computer simulation based on finite element method was performed to study stress concentration and distribution around the notch and verify the analytical and experimental results of notched strength. Good agreement is obtained between the model predictions and experimental results. Experimental results also indicate that GLARE laminates exhibited superior impact properties to those of monolithic 2024-T3 aluminum alloy at low velocity impact loading. The GLARE 5-2/1 laminate with 0°/90°/90°/0° fiber configuration exhibits a better impact resistance than the GLARE 4-3/2 laminate with 0°/90°/0° fiber orientation. The characteristic impact energies, the damage area, and the permanent deflection of laminates are used to evaluate the impact damage resistance. The post-impact residual tensile strength under various damage states ranging from the plastic dent, barely visible impact damage (BVID), clearly visible impact damage (CVID) up to the complete perforation was also measured and compared. The post-impact fatigue behavior under various stress levels and impact damage states was extensively explored. The damage

  18. Reduction of female copulatory damage by resilin represents evidence for tolerance in sexual conflict

    PubMed Central

    Michels, Jan; Gorb, Stanislav N.; Reinhardt, Klaus

    2015-01-01

    Intergenomic evolutionary conflicts increase biological diversity. In sexual conflict, female defence against males is generally assumed to be resistance, which, however, often leads to trait exaggeration but not diversification. Here, we address whether tolerance, a female defence mechanism known from interspecific conflicts, exists in sexual conflict. We examined the traumatic insemination of female bed bugs via cuticle penetration by males, a textbook example of sexual conflict. Confocal laser scanning microscopy revealed large proportions of the soft and elastic protein resilin in the cuticle of the spermalege, the female defence organ. Reduced tissue damage and haemolymph loss were identified as adaptive female benefits from resilin. These did not arise from resistance because microindentation showed that the penetration force necessary to breach the cuticle was significantly lower at the resilin-rich spermalege than at other cuticle sites. Furthermore, a male survival analysis indicated that the spermalege did not impose antagonistic selection on males. Our findings suggest that the specific spermalege material composition evolved to tolerate the traumatic cuticle penetration. They demonstrate the importance of tolerance in sexual conflict and genitalia evolution, extend fundamental coevolution and speciation models and contribute to explaining the evolution of complexity. We propose that tolerance can drive trait diversity. PMID:25673297

  19. Reduction of female copulatory damage by resilin represents evidence for tolerance in sexual conflict.

    PubMed

    Michels, Jan; Gorb, Stanislav N; Reinhardt, Klaus

    2015-03-01

    Intergenomic evolutionary conflicts increase biological diversity. In sexual conflict, female defence against males is generally assumed to be resistance, which, however, often leads to trait exaggeration but not diversification. Here, we address whether tolerance, a female defence mechanism known from interspecific conflicts, exists in sexual conflict. We examined the traumatic insemination of female bed bugs via cuticle penetration by males, a textbook example of sexual conflict. Confocal laser scanning microscopy revealed large proportions of the soft and elastic protein resilin in the cuticle of the spermalege, the female defence organ. Reduced tissue damage and haemolymph loss were identified as adaptive female benefits from resilin. These did not arise from resistance because microindentation showed that the penetration force necessary to breach the cuticle was significantly lower at the resilin-rich spermalege than at other cuticle sites. Furthermore, a male survival analysis indicated that the spermalege did not impose antagonistic selection on males. Our findings suggest that the specific spermalege material composition evolved to tolerate the traumatic cuticle penetration. They demonstrate the importance of tolerance in sexual conflict and genitalia evolution, extend fundamental coevolution and speciation models and contribute to explaining the evolution of complexity. We propose that tolerance can drive trait diversity. PMID:25673297

  20. Damage Tolerance Testing of a NASA TransHab Derivative Woven Inflatable Module

    NASA Technical Reports Server (NTRS)

    Edgecombe, John; delaFuente, Horacio; Valle, Gerald D.

    2008-01-01

    Current options for Lunar habitat architecture include inflatable habitats and airlocks. Inflatable structures can have mass and volume advantages over conventional structures. Inflatable structures are perceived to carry additional risk because they are at a lower Technical Readiness Level (TRL) than conventional metallic structures. One of the risks associated with inflatable structures is understanding the tolerance to component damage and the resulting behavior of the system after the damage is introduced. The Damage Tolerance Test (DTT) is designed to study the structural integrity of an expandable structure during and subsequent to induced damage. TransHab was an experimental inflatable module developed at Johnson Space Center in the 1990s. The TransHab design was originally envisioned for use in Mars transits but was also studied as a potential habitat for the International Space Station (ISS). The design of the TransHab module was based on a woven design using an Aramid fabric. Testing of this design demonstrated a high level of predictability and repeatability and good correlation with analytical predictions of stresses and deflections. Based on JSC's experience with the design and analysis of woven inflatable structures, the Damage Tolerance Test article was designed and fabricated using a woven design. The Damage Tolerance Test Article consists of a load bearing restraint layer, a bladder or gas barrier, and a structural metallic core. The test article restraint layer is fabricated from one inch wide Kevlar webbing that is woven in a basket weave pattern. Underneath the structural restraint layer is the bladder or gas barrier. For this test the bladder was required to maintain pressure for testing only and was not representative of a flight design. The bladder and structural restraint layer attach to the structural core of the module at steel bulkheads at each end. The two bulkheads are separated by a 10 foot center tube which provides

  1. Damage-Tolerance Characteristics of Composite Fuselage Sandwich Structures with Thick Facesheets

    NASA Technical Reports Server (NTRS)

    McGowan, David M.; Ambur, Damodar R.

    1997-01-01

    Damage tolerance characteristics and results from experimental and analytical studies of a composite fuselage keel sandwich structure subjected to low-speed impact damage and discrete-source damage are presented. The test specimens are constructed from graphite-epoxy skins bonded to a honeycomb core, and they are representative of a highly loaded fuselage keel structure. Results of compression-after-impact (CAI) and notch-length sensitivity studies of 5-in.-wide by 10-in.-long specimens are presented. A correlation between low-speed-impact dent depth, the associated damage area, and residual strength for different impact-energy levels is described; and a comparison of the strength for undamaged and damaged specimens with different notch-length-to-specimen-width ratios is presented. Surface strains in the facesheets of the undamaged specimens as well as surface strains that illustrate the load redistribution around the notch sites in the notched specimens are presented and compared with results from finite element analyses. Reductions in strength of as much as 53.1 percent for the impacted specimens and 64.7 percent for the notched specimens are observed.

  2. Damage tolerance of pressurized graphite/epoxy cylinders under uniaxial and biaxial loading

    SciTech Connect

    Lagace, P.A.; Priest, S.M.

    1997-12-31

    The damage tolerance behavior of internally pressurized, longitudinally slit, graphite/epoxy tape cylinders was investigated. Specifically, the effects of longitudinal stress, subcritical damage, and structural anisotropy were considered including their limitations on a methodology, developed for quasi-isotropic configurations, which uses coupon fracture data to predict cylinder failure. Failure pressure was recorded and fracture paths and failure modes evaluated via post-test reconstruction of the cylinders. These results were compared to results from previous tests conducted in biaxial loading. Structural anisotropic effects were further investigated by testing cylinders with the quasi-isotropic layup and comparing these with the results from the other quasi-isotropic layup. In all cases, the failure pressures for the uniaxially loaded cylinders fell below those for the biaxially loaded cases and the methodology was not able to predict these failure pressures. These differences were most marked in the case of the structurally anisotropic cylinders. Differences in fracture paths and overall failure mode were found to be greatest in the cases where there was the largest difference in the failure pressures. Strain gages placed near the slit tips showed that subcritical damage occurred in all cases. These results, coupled with previous work, show that failure is controlled by local damage mechanisms and the subsequent stress redistribution and damage accumulation scenario.

  3. Development of pressure containment and damage tolerance technology for composite fuselage structures in large transport aircraft

    NASA Technical Reports Server (NTRS)

    Smith, P. J.; Thomson, L. W.; Wilson, R. D.

    1986-01-01

    NASA sponsored composites research and development programs were set in place to develop the critical engineering technologies in large transport aircraft structures. This NASA-Boeing program focused on the critical issues of damage tolerance and pressure containment generic to the fuselage structure of large pressurized aircraft. Skin-stringer and honeycomb sandwich composite fuselage shell designs were evaluated to resolve these issues. Analyses were developed to model the structural response of the fuselage shell designs, and a development test program evaluated the selected design configurations to appropriate load conditions.

  4. Probabilistic Assessment of Fracture Progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank

    1999-01-01

    This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.
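    A simplified illustration of the sensitivity computation mentioned above is sketched below: the influence of each design variable on a response (here a toy damage-initiation load) is gauged by perturbing one variable at a time by its assumed scatter. The response function and scatter values are invented for illustration; the actual assessment uses the fast probability integrator on full composite structural models rather than this one-at-a-time perturbation.

        def initiation_load(fiber_vol, ply_strength, thickness):
            """Toy damage-initiation load (kN); for illustration only."""
            return 0.9 * fiber_vol * ply_strength * thickness

        nominal = {"fiber_vol": 0.60, "ply_strength": 1.5, "thickness": 4.0}
        cov = {"fiber_vol": 0.03, "ply_strength": 0.08, "thickness": 0.02}

        base = initiation_load(**nominal)
        for name, value in nominal.items():
            bumped = dict(nominal, **{name: value * (1 + cov[name])})
            sensitivity = (initiation_load(**bumped) - base) / base
            print(f"{name:>12}: {sensitivity:+.3%} response change for a one-COV perturbation")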

  5. Effect of Buckling Modes on the Fatigue Life and Damage Tolerance of Stiffened Structures

    NASA Technical Reports Server (NTRS)

    Davila, Carlos G.; Bisagni, Chiara; Rose, Cheryl A.

    2015-01-01

    The postbuckling response and the collapse of composite specimens with a co-cured hat stringer are investigated experimentally and numerically. These specimens are designed to evaluate the postbuckling response and the effect of an embedded defect on the collapse load and the mode of failure. Tests performed using controlled conditions and detailed instrumentation demonstrate that the damage tolerance, fatigue life, and collapse loads are closely tied with the mode of the postbuckling deformation, which can be different between two nominally identical specimens. Modes that tend to open skin/stringer defects are the most damaging to the structure. However, skin/stringer bond defects can also propagate under shearing modes. In the proposed paper, the effects of initial shape imperfections on the postbuckling modes and the interaction between different postbuckling deformations and the propagation of skin/stringer bond defects under quasi-static or fatigue loads will be examined.

  6. Long-term hygrothermal effects on damage tolerance of hybrid composite sandwich panels

    NASA Technical Reports Server (NTRS)

    Ishai, Ori; Hiel, Clement; Luft, Michael

    1995-01-01

    A sandwich construction, composed of hybrid carbon-glass fiber-reinforced plastic skins and a syntactic foam core, was selected as the design concept for a wind tunnel compressor blade application, where high damage tolerance and durability are of major importance. Beam specimens were prepared from open-edge and encapsulated sandwich panels which had previously been immersed in water at different temperatures for periods of up to about two years in the extreme case. Moisture absorption and strength characteristics, as related to time of exposure to hygrothermal conditions, were evaluated for the sandwich specimens and their constituents (skins and foam). After different exposure periods, low-velocity impact damage was inflicted on most sandwich specimens and damage characteristics were related to impact energy. Eventually, the residual compressive strengths of the damaged (and undamaged) beams were determined flexurally. Test results show that exposure to hygrothermal conditions leads to significant strength reductions for foam specimens and open-edge sandwich panels, compared with reference specimens stored at room temperature. In the case of skin specimens and for beams prepared from encapsulated sandwich panels that had previously been exposed to hygrothermal conditions, moisture absorption was found to improve strength as related to the reference case. The beneficial effect of moisture on skin performance was, however, limited to moisture contents below 1% (at 50 C and lower temperatures). Above this moisture level and at higher temperatures, strength degradation of the skin seems to prevail.

  7. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials

    PubMed Central

    Montemayor, L. C.; Wong, W. H.; Zhang, Y.-W.; Greer, J. R.

    2016-01-01

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws even when made entirely of intrinsically brittle materials. PMID:26837581

  8. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials.

    PubMed

    Montemayor, L C; Wong, W H; Zhang, Y-W; Greer, J R

    2016-01-01

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws even when made entirely of intrinsically brittle materials. PMID:26837581

  9. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials

    NASA Astrophysics Data System (ADS)

    Montemayor, L. C.; Wong, W. H.; Zhang, Y.-W.; Greer, J. R.

    2016-02-01

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws even when made entirely of intrinsically brittle materials.

  10. Damage tolerance assessment of bonded composite doubler repairs for commercial aircraft applications

    SciTech Connect

    Roach, D.

    1998-08-01

    The Federal Aviation Administration has sponsored a project at its Airworthiness Assurance NDI Validation Center (AANC) to validate the use of bonded composite doublers on commercial aircraft. A specific application was chosen in order to provide a proof-of-concept driving force behind this test and analysis project. However, the data stemming from this study serves as a comprehensive evaluation of bonded composite doublers for general use. The associated documentation package provides guidance regarding the design, analysis, installation, damage tolerance, and nondestructive inspection of these doublers. This report describes a series of fatigue and strength tests which were conducted to study the damage tolerance of Boron-Epoxy composite doublers. Tension-tension fatigue and ultimate strength tests attempted to grow engineered flaws in coupons with composite doublers bonded to aluminum skin. An array of design parameters, including various flaw scenarios, the effects of surface impact, and other off-design conditions, were studied. The structural tests were used to: (1) assess the potential for interply delaminations and disbonds between the aluminum and the laminate, and (2) determine the load transfer and crack mitigation capabilities of composite doublers in the presence of severe defects. A series of specimens were subjected to ultimate tension tests in order to determine strength values and failure modes. It was demonstrated that even in the presence of extensive damage in the original structure (cracks, material loss) and in spite of non-optimum installations (adhesive disbonds), the composite doubler allowed the structure to survive more than 144,000 cycles of fatigue loading. Installation flaws in the composite laminate did not propagate over 216,000 fatigue cycles. Furthermore, the added impediments of impact--severe enough to deform the parent aluminum skin--and hot-wet exposure did not affect the doubler's performance. Since the tests were conducting

  11. Pro-oxidant Induced DNA Damage in Human Lymphoblastoid Cells: Homeostatic Mechanisms of Genotoxic Tolerance

    PubMed Central

    Seager, Anna L.

    2012-01-01

    Oxidative stress contributes to many disease etiologies including ageing, neurodegeneration, and cancer, partly through DNA damage induction (genotoxicity). Understanding the interactions of free radicals with DNA is fundamental to discern mutation risks. In genetic toxicology, regulatory authorities consider that most genotoxins exhibit a linear relationship between dose and mutagenic response. Yet, homeostatic mechanisms, including DNA repair, that allow cells to tolerate low levels of genotoxic exposure exist. Acceptance of thresholds for genotoxicity has widespread consequences in terms of understanding cancer risk and regulating human exposure to chemicals/drugs. Three pro-oxidant chemicals, hydrogen peroxide (H2O2), potassium bromate (KBrO3), and menadione, were examined for low dose-response curves in human lymphoblastoid cells. DNA repair and antioxidant capacity were assessed as possible threshold mechanisms. H2O2 and KBrO3, but not menadione, exhibited thresholded responses, containing a range of nongenotoxic low doses. Levels of the DNA glycosylase 8-oxoguanine glycosylase were unchanged in response to pro-oxidant stress. DNA repair-focused gene expression arrays reported changes in ATM and BRCA1, involved in double-strand break repair, in response to low-dose pro-oxidant exposure; however, these alterations were not substantiated at the protein level. Determination of oxidatively induced DNA damage in H2O2-treated AHH-1 cells reported accumulation of thymine glycol above the genotoxic threshold. Further, the H2O2 dose-response curve was shifted by modulating the antioxidant glutathione. Hence, observed pro-oxidant thresholds were due to protective capacities of base excision repair enzymes and antioxidants against DNA damage, highlighting the importance of homeostatic mechanisms in "genotoxic tolerance." PMID:22539617

  12. Seismic damages comparison of low-rise moderate reinforced concrete moment frames in the near- and far-field earthquakes by a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Dadashi, Reza; Nasserasadi, Kiarash

    2015-06-01

    Buildings and other structures experience more damage in near-field earthquakes due to the presence of high-period pulses in near-field ground motion records. These pulses, however, are not present in all near-field records. Therefore, to realistically evaluate the effect of near-field earthquakes on structures, a probabilistic approach is used to evaluate the probability of reaching different damage states in near- and far-field earthquakes. In this method, the damage to the structure is evaluated by estimating the fragility function of the structure through numerous non-linear dynamic analyses subjected to different ground motion records. To compare the effect of near-field and far-field earthquakes on low-rise moderate reinforced concrete moment frames, two- and three-story concrete frames were selected and designed according to the Iranian code. The fragility functions of the frames were estimated for near- and far-field earthquakes. For the near-field earthquakes, a mixture of pulse-like and non-pulse-like records was considered. The results show no meaningful difference between the probabilities of failure under near- and far-field records. Therefore, it can be concluded that although near-field earthquakes may cause severe damage to structures due to the pulses present in some records, from a probabilistic point of view, and considering all near-field records, this effect is not significant.
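    As a rough sketch of the fragility estimation described above, a lognormal fragility curve can be fitted by maximum likelihood to binary damage-state outcomes obtained from non-linear dynamic analyses at different intensity levels. The intensity measures and outcomes below are invented for illustration and do not reproduce the frames, records, or damage-state definitions of the study.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        # hypothetical results: spectral acceleration (g) and damage-state exceedance flag
        im = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0])
        exceeded = np.array([0, 0, 1, 0, 1, 1, 1, 1])

        def neg_log_like(params):
            median, beta = params
            p = norm.cdf(np.log(im / median) / beta)      # lognormal fragility curve
            p = np.clip(p, 1e-9, 1 - 1e-9)
            return -np.sum(exceeded * np.log(p) + (1 - exceeded) * np.log(1 - p))

        fit = minimize(neg_log_like, x0=[0.4, 0.5], bounds=[(0.01, 5.0), (0.05, 2.0)])
        print("median IM and dispersion:", fit.x)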

  13. Coordination of DNA damage tolerance mechanisms with cell cycle progression in fission yeast

    PubMed Central

    Callegari, A. John; Kelly, Thomas J.

    2016-01-01

    DNA damage tolerance (DDT) mechanisms allow cells to synthesize a new DNA strand when the template is damaged. Many mutations resulting from DNA damage in eukaryotes are generated during DDT when cells use the mutagenic translesion polymerases, Rev1 and Polζ, rather than mechanisms with higher fidelity. The coordination among DDT mechanisms is not well understood. We used live-cell imaging to study the function of DDT mechanisms throughout the cell cycle of the fission yeast Schizosaccharomyces pombe. We report that checkpoint-dependent mitotic delay provides a cellular mechanism to ensure the completion of high fidelity DDT, largely by homology-directed repair (HDR). DDT by mutagenic polymerases is suppressed during the checkpoint delay by a mechanism dependent on Rad51 recombinase. When cells pass the G2/M checkpoint and can no longer delay mitosis, they completely lose the capacity for HDR and simultaneously exhibit a requirement for Rev1 and Polζ. Thus, DDT is coordinated with the checkpoint response so that the activity of mutagenic polymerases is confined to a vulnerable period of the cell cycle when checkpoint delay and HDR are not possible. PMID:26652183

  14. Role of interfaces in the design of ultra-high strength, radiation damage tolerant nanocomposites

    SciTech Connect

    Misra, Amit; Wang, Yongqiang; Nastasi, Michael A; Baldwin, Jon K; Wei, Qiangmin; Li, Nan; Mara, Nathan; Zhang, Xinghang; Fu, Engang; Anderoglu, Osman; Li, Hongqi; Bhattacharyya, Dhriti

    2010-12-09

    The combination of high strength and high radiation damage tolerance in nanolaminate composites can be achieved when the individual layers in these composites are only a few nanometers thick and contain special interfaces that act both as obstacles to slip and as sinks for radiation-induced defects. The morphological and phase stabilities and strength and ductility of these nano-composites under ion irradiation are explored as a function of layer thickness, temperature and interface structure. Magnetron sputtered metallic multilayers such as Cu-Nb and V-Ag with a range of individual layer thickness from approximately 2 nm to 50 nm and the corresponding 1000 nm thick single layer films were implanted with helium ions at room temperature. Cross-sectional Transmission Electron Microscopy (TEM) was used to measure the distribution of helium bubbles and correlated with the helium concentration profile measured via ion beam analysis techniques to obtain the helium concentration at which bubbles are detected in TEM. It was found that in multilayers the minimum helium concentration to form bubbles (approximately 1 nm in size) that are easily resolved in through-focus TEM imaging was several atomic %, orders of magnitude higher than that in single layer metal films. This observation is consistent with an increased solubility of helium at interfaces that is predicted by atomistic modeling of the atomic structures of fcc-bcc interfaces. At helium concentrations as high as 7 at.%, a uniform distribution of 1 nm diameter bubbles results in negligible irradiation hardening and loss of deformability in multilayers with layer thicknesses of a few nanometers. The control of atomic structures of interfaces to produce high helium solubility at interfaces is crucial in the design of nano-composite materials that are radiation damage tolerant. Reduced radiation damage also leads to a reduction in the irradiation hardening, particularly at layer thicknesses of approximately 5 nm

  15. Genomic assay reveals tolerance of DNA damage by both translesion DNA synthesis and homology-dependent repair in mammalian cells.

    PubMed

    Izhar, Lior; Ziv, Omer; Cohen, Isadora S; Geacintov, Nicholas E; Livneh, Zvi

    2013-04-16

    DNA lesions can block replication forks and lead to the formation of single-stranded gaps. These replication complications are mitigated by DNA damage tolerance mechanisms, which prevent deleterious outcomes such as cell death, genomic instability, and carcinogenesis. The two main tolerance strategies are translesion DNA synthesis (TLS), in which low-fidelity DNA polymerases bypass the blocking lesion, and homology-dependent repair (HDR; postreplication repair), which is based on the homologous sister chromatid. Here we describe a unique high-resolution method for the simultaneous analysis of TLS and HDR across defined DNA lesions in mammalian genomes. The method is based on insertion of plasmids carrying defined site-specific DNA lesions into mammalian chromosomes, using phage integrase-mediated integration. Using this method we show that mammalian cells use HDR to tolerate DNA damage in their genome. Moreover, analysis of the tolerance of the UV light-induced 6-4 photoproduct, the tobacco smoke-induced benzo[a]pyrene-guanine adduct, and an artificial trimethylene insert shows that each of these three lesions is tolerated by both TLS and HDR. We also determined the specificity of nucleotide insertion opposite these lesions during TLS in human genomes. This unique method will be useful in elucidating the mechanism of DNA damage tolerance in mammalian chromosomes and their connection to pathological processes such as carcinogenesis. PMID:23530190

  16. Damage tolerance of well-completion and stimulation techniques in coalbed methane reservoirs

    SciTech Connect

    Jahediesfanjani, H.; Civan, F.

    2005-09-01

    Coalbed methane (CBM) reservoirs are characterized as naturally fractured, dual porosity, low permeability, water saturated gas reservoirs. Initially, the gas, water and coal are at thermodynamic equilibrium under prevailing reservoir conditions. Dewatering is essential to promote gas production. This can be accomplished by suitable completion and stimulation techniques. This paper investigates the efficiency and performance of openhole cavities, hydraulic fractures, frac-and-packs, and horizontal wells as potential completion methods which may reduce formation damage and increase productivity in coalbed methane reservoirs. Considering the dual porosity nature of CBM reservoirs, numerical simulations have been carried out to determine the formation damage tolerance of each completion and stimulation approach. A new comparison parameter, named the normalized productivity index, is defined as the ratio of the productivity index of a stimulated well to that of a nondamaged vertical well as a function of time. Typical scenarios have been considered to evaluate CBM properties, including reservoir heterogeneity, anisotropy, and formation damage, for their effects on this index over the production time. The results for each stimulation technique show that the value of the index declines over the time of production at a rate which depends upon the applied technique and the prevailing reservoir conditions. The results also show that horizontal wells have the best performance if drilled orthogonal to the butt cleats. Open-hole cavity completions outperform vertical fractures if the fracture conductivity is reduced by any damage process. When vertical permeability is much lower than horizontal permeability, the production of vertical wells will improve while the productivity of horizontal wells will decrease.
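    In notation assumed here for illustration (not necessarily the authors'), the comparison parameter can be written as

        J_N(t) = \frac{J_{\text{stimulated}}(t)}{J_{\text{vertical, undamaged}}(t)},
        \qquad
        J(t) = \frac{q(t)}{\bar{p}(t) - p_{wf}(t)},

    where q is the gas production rate, \bar{p} the average reservoir pressure, and p_{wf} the flowing bottomhole pressure; a value near 1 indicates no net benefit over an undamaged vertical well, and values above 1 indicate effective stimulation.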

  17. Probabilistic assessment of failure in adhesively bonded composite laminates

    SciTech Connect

    Minnetyan, L.; Chamis, C.C.

    1997-07-01

    Damage initiation and progressive fracture of adhesively bonded graphite/epoxy composites are investigated under tensile loading. A computer code is utilized for the simulation of composite structural damage and fracture. Structural response is assessed probabilistically during degradation. The effects of design variable uncertainties on structural damage progression are quantified. The Fast Probability Integrator is used to assess the response scatter in the composite structure at damage initiation. Sensitivity of the damage response to design variables is computed. Methods are general purpose in nature and are applicable to all types of laminated composite structures and joints, starting from damage initiation to unstable damage propagation and collapse. Results indicate that composite constituent and adhesive properties have a significant effect on structural durability. Damage initiation/progression does not necessarily begin in the adhesive bond. Design implications with regard to damage tolerance of adhesively bonded joints are examined.

  18. Damage Tolerance Analysis of Space Shuttle External Tank Lug Fillet Welds Using NASGRO

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.

    2006-01-01

    The damage tolerance of the External Tank (ET) lug welds was reassessed because of an increase in the loads due to the removal of the protuberance air load (PAL) ramp. The analysis methods included detailed finite element analysis (FEA) of the ET welded lugs and FEA of the lug weld test configuration. The FEA results were used as input to the crack growth analysis code NASGRO to calculate the mission life capability of the ET lug welds and to predict the number of cycles to failure in the lug weld testing. The presentation describes the method of transferring the FEM results to the NASGRO model and gives correlations between FEM and NASGRO stress intensity calculations.
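    For context, a transfer of this kind typically amounts to extracting the stress state at the crack location from the finite element model and feeding it to a stress-intensity solution, which can then be correlated against K values extracted directly from a cracked FE model. The sketch below compares a textbook through-crack form, K = beta * sigma * sqrt(pi * a), against hypothetical FEM-extracted values; it is illustrative only and is not the NASGRO lug-weld model.

        import math

        def k_closed_form(stress_mpa, a_m, beta=1.0):
            """Stress intensity (MPa*sqrt(m)) from a generic closed-form solution."""
            return beta * stress_mpa * math.sqrt(math.pi * a_m)

        # hypothetical FEM-extracted stress intensities used for correlation (MPa*sqrt(m))
        fem_k = {0.001: 5.8, 0.002: 8.4, 0.004: 12.1}   # crack size (m) -> K from FEA
        stress = 100.0                                  # assumed far-field stress, MPa

        for a, k_fem in fem_k.items():
            k_cf = k_closed_form(stress, a, beta=1.05)  # beta assumed for illustration
            print(f"a = {a * 1e3:.1f} mm: FEM K = {k_fem:.1f}, closed-form K = {k_cf:.1f}, "
                  f"ratio = {k_fem / k_cf:.2f}")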

  19. Plasticity and ductility in graphene oxide through a mechanochemically induced damage tolerance mechanism.

    PubMed

    Wei, Xiaoding; Mao, Lily; Soler-Crespo, Rafael A; Paci, Jeffrey T; Huang, Jiaxing; Nguyen, SonBinh T; Espinosa, Horacio D

    2015-01-01

    The ability to bias chemical reaction pathways is a fundamental goal for chemists and material scientists to produce innovative materials. Recently, two-dimensional materials have emerged as potential platforms for exploring novel mechanically activated chemical reactions. Here we report a mechanochemical phenomenon in graphene oxide membranes, covalent epoxide-to-ether functional group transformations that deviate from epoxide ring-opening reactions, discovered through nanomechanical experiments and density functional-based tight binding calculations. These mechanochemical transformations in a two-dimensional system are directionally dependent, and confer pronounced plasticity and damage tolerance to graphene oxide monolayers. Additional experiments on chemically modified graphene oxide membranes, with ring-opened epoxide groups, verify this unique deformation mechanism. These studies establish graphene oxide as a two-dimensional building block with highly tuneable mechanical properties for the design of high-performance nanocomposites, and stimulate the discovery of new bond-selective chemical transformations in two-dimensional materials. PMID:26289729

  20. Plasticity and ductility in graphene oxide through a mechanochemically induced damage tolerance mechanism

    PubMed Central

    Wei, Xiaoding; Mao, Lily; Soler-Crespo, Rafael A.; Paci, Jeffrey T.; Espinosa, Horacio D.

    2015-01-01

    The ability to bias chemical reaction pathways is a fundamental goal for chemists and material scientists to produce innovative materials. Recently, two-dimensional materials have emerged as potential platforms for exploring novel mechanically activated chemical reactions. Here we report a mechanochemical phenomenon in graphene oxide membranes, covalent epoxide-to-ether functional group transformations that deviate from epoxide ring-opening reactions, discovered through nanomechanical experiments and density functional-based tight binding calculations. These mechanochemical transformations in a two-dimensional system are directionally dependent, and confer pronounced plasticity and damage tolerance to graphene oxide monolayers. Additional experiments on chemically modified graphene oxide membranes, with ring-opened epoxide groups, verify this unique deformation mechanism. These studies establish graphene oxide as a two-dimensional building block with highly tuneable mechanical properties for the design of high-performance nanocomposites, and stimulate the discovery of new bond-selective chemical transformations in two-dimensional materials. PMID:26289729

  1. The Stomatopod Dactyl Club: A Formidable Damage-Tolerant Biological Hammer

    SciTech Connect

    Weaver J. C.; DiMasi E.; Milliron, G.W.; Miserez, A.; Evans-Lutterodt, K.; Herrera, S.; Gallana, I.; Mershon, W.J.; Swanson, B.; Zavattieri, P.; Kisailus, D.

    2012-06-08

    Nature has evolved efficient strategies to synthesize complex mineralized structures that exhibit exceptional damage tolerance. One such example is found in the hypermineralized hammer-like dactyl clubs of the stomatopods, a group of highly aggressive marine crustaceans. The dactyl clubs from one species, Odontodactylus scyllarus, exhibit an impressive set of characteristics adapted for surviving high-velocity impacts on the heavily mineralized prey on which they feed. Consisting of a multiphase composite of oriented crystalline hydroxyapatite and amorphous calcium phosphate and carbonate, in conjunction with a highly expanded helicoidal organization of the fibrillar chitinous organic matrix, these structures display several effective lines of defense against catastrophic failure during repetitive high-energy loading events.

  2. Plasticity and ductility in graphene oxide through a mechanochemically induced damage tolerance mechanism

    NASA Astrophysics Data System (ADS)

    Wei, Xiaoding; Mao, Lily; Soler-Crespo, Rafael A.; Paci, Jeffrey T.; Huang, Jiaxing; Nguyen, Sonbinh T.; Espinosa, Horacio D.

    2015-08-01

    The ability to bias chemical reaction pathways is a fundamental goal for chemists and material scientists to produce innovative materials. Recently, two-dimensional materials have emerged as potential platforms for exploring novel mechanically activated chemical reactions. Here we report a mechanochemical phenomenon in graphene oxide membranes, covalent epoxide-to-ether functional group transformations that deviate from epoxide ring-opening reactions, discovered through nanomechanical experiments and density functional-based tight binding calculations. These mechanochemical transformations in a two-dimensional system are directionally dependent, and confer pronounced plasticity and damage tolerance to graphene oxide monolayers. Additional experiments on chemically modified graphene oxide membranes, with ring-opened epoxide groups, verify this unique deformation mechanism. These studies establish graphene oxide as a two-dimensional building block with highly tuneable mechanical properties for the design of high-performance nanocomposites, and stimulate the discovery of new bond-selective chemical transformations in two-dimensional materials.

  3. Optimal Design and Damage Tolerance Verification of an Isogrid Structure for Helicopter Application

    NASA Technical Reports Server (NTRS)

    Baker, Donald J.; Fudge, Jack; Ambur, Damodar R.; Kassapoglou, Christos

    2003-01-01

    A composite isogrid panel design for application to a rotorcraft fuselage is presented. An optimum panel design for the lower fuselage of the rotorcraft that is subjected to combined in-plane compression and shear loads was generated using a design tool that utilizes a smeared-stiffener theory in conjunction with a genetic algorithm. A design feature was introduced along the edges of the panel that facilitates introduction of loads into the isogrid panel without producing undesirable local bending gradients. A low-cost manufacturing method for the isogrid panel that incorporates these design details is also presented. Axial compression tests were conducted on the undamaged and low-speed impact damaged panels to demonstrate the damage tolerance of this isogrid panel. A combined loading test fixture was designed and utilized that allowed simultaneous application of compression and shear loads to the test specimen. Results from finite element analyses are presented for the isogrid panel designs and these results are compared with experimental results. This study illustrates the isogrid concept to be a viable candidate for application to the helicopter lower fuselage structure.

  4. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    DOE PAGES

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; Wong, Tak-Sing; Aizenberg, Joanna

    2013-07-31

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.

  5. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    SciTech Connect

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; Wong, Tak-Sing; Aizenberg, Joanna

    2013-07-31

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.

  6. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers.

    PubMed

    Vogel, Nicolas; Belisle, Rebecca A; Hatton, Benjamin; Wong, Tak-Sing; Aizenberg, Joanna

    2013-01-01

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model. PMID:23900310

  7. Structurally Integrated, Damage Tolerant Thermal Spray Coatings: Processing Effects on Surface and System Functionalities

    NASA Astrophysics Data System (ADS)

    Vackel, Andrew

    Thermal Spray (TS) coatings have seen extensive application as protective surfaces to enhance the service life of substrates prone to damage in their operating environment (wear, corrosion, heat, etc.). With the advent of high velocity TS processes, the ability to deposit highly dense (>99%) metallic and cermet coatings has further enhanced the protective ability of these coatings. In addition to surface functionality, the influence of the coating application on the mechanical performance of a coated component is of great concern when such a component will experience either static or cyclic loading during service. Using a process mapping methodology, the processing-property interplay of coating materials meant to provide damage-tolerant surfaces or structural restoration is explored in terms of relevant mechanical properties. Most importantly, the residual stresses inherent in TS deposited coatings are shown to play a significant role in the integrated mechanical performance of these coatings. Unique to high velocity TS processes is the ability to produce compressive stresses within the deposit from the cold working induced by the high kinetic energy particles upon impact. The extent of these formation stresses is explored for different coating materials, as well as the influence of processing. The ability of dense TS coatings to carry significant structural load and synergistically strengthen coated tensile specimens is demonstrated as a function of coating material, processing, and thickness. The sharing of load between the substrate and otherwise brittle coating enables higher loads before yield for the bi-material specimens, offering a methodology to improve the tensile performance of coated components for structural repair or multi-functionality (surface and structure). The concern of cyclic fatigue damage in coated components is explored, since the majority of service applications are designed for loading to be well below the yield point. The role of

  8. Damage tolerance and arrest characteristics of pressurized graphite/epoxy tape cylinders

    NASA Technical Reports Server (NTRS)

    Ranniger, Claudia U.; Lagace, Paul A.; Graves, Michael J.

    1993-01-01

    An investigation of the damage tolerance and damage arrest characteristics of internally-pressurized graphite/epoxy tape cylinders with axial notches was conducted. An existing failure prediction methodology, developed and verified for quasi-isotropic graphite/epoxy fabric cylinders, was investigated for applicability to general tape layups. In addition, the effect of external circumferential stiffening bands on the direction of fracture path propagation and possible damage arrest was examined. Quasi-isotropic (90/0/±45)s and structurally anisotropic (±45/0)s and (±45/90)s coupons and cylinders were constructed from AS4/3501-6 graphite/epoxy tape. Notched and unnotched coupons were tested in tension and the data correlated using the equation of Mar and Lin. Cylinders with through-thickness axial slits were pressurized to failure, achieving a far-field two-to-one biaxial stress state. Experimental failure pressures of the (90/0/±45)s cylinders agreed with predicted values for all cases but the specimen with the smallest slit. However, the failure pressures of the structurally anisotropic cylinders, (±45/0)s and (±45/90)s, were above the values predicted utilizing the predictive methodology in all cases. Possible factors neglected by the predictive methodology include structural coupling in the laminates and axial loading of the cylindrical specimens. Furthermore, applicability of the predictive methodology depends on the similarity of initial fracture modes in the coupon specimens and the cylinder specimens of the same laminate type. The existence of splitting, which may be exacerbated by the axial loading in the cylinders, shows that this condition is not always met. The circumferential stiffeners were generally able to redirect fracture propagation from longitudinal to circumferential. A quantitative assessment for stiffener effectiveness in containing the fracture, based on cylinder
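
    The notched-strength correlation attributed above to Mar and Lin is commonly quoted as a power law in notch size; a hedged statement of that form, in generic notation that may differ from the paper's, is given below. Fitting the two parameters to the notched-coupon tension data then allows cylinder failure pressures to be estimated from the corresponding far-field hoop stress.

    ```latex
    % Hedged sketch of the Mar-Lin notched-strength correlation (generic notation).
    \[
      \sigma_{\infty}^{f} \;=\; H_{c}\,(2a)^{-m}
    \]
    % \sigma_{\infty}^{f}: far-field failure stress of the notched laminate
    % 2a: slit (notch) length
    % H_{c}: composite fracture parameter, fit from notched-coupon data
    % m: singularity exponent set by the fiber/matrix combination
    ```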

  9. Tolerance of lesions in E. coli: Chronological competition between Translesion Synthesis and Damage Avoidance.

    PubMed

    Fuchs, Robert P

    2016-08-01

    Lesion tolerance pathways allow cells to proceed with replication despite the presence of replication-blocking lesions in their genome. Following transient fork stalling, replication resumes downstream, leaving daughter strand gaps opposite replication-blocking lesions. The existence and repair of these gaps have been known for decades and are commonly referred to as postreplicative repair [39,38] (Rupp, 2013; Rupp and Howard-Flanders, 1968). This paper analyzes the interaction of the pathways involved in the repair of these gaps. A key repair intermediate is formed when RecA protein binds to these gaps, forming ssDNA.RecA filaments that establish the so-called SOS signal. The gaps are either "repaired" by Translesion Synthesis (TLS), a process that involves the transient recruitment of a specialized DNA polymerase that copies the lesion with an intrinsic risk of fixing a mutation opposite the lesion site, or by Damage Avoidance, an error-free pathway that involves homologous recombination with the sister chromatid (Homology Directed Gap Repair: HDGR). We have developed an assay that allows one to study the partition between TLS and HDGR in the context of a single replication-blocking lesion present in the E. coli chromosome. The level of expression of the TLS polymerases controls the extent of TLS. Our data show that TLS is implemented first with great parsimony, followed by abundant recombination-based tolerance events. Indeed, the substrate for TLS, i.e., the ssDNA.RecA filament, persists for only a limited amount of time before it engages in an early recombination intermediate (D-loop) with the sister chromatid. Time-based competition between TLS and HDGR is set by mere sequestration of the TLS substrates into early recombination intermediates. Most gaps are subsequently repaired by Homology Directed Gap Repair (HDGR), a pathway that involves RecA. Surprisingly, however, in the absence of RecA, some cells manage to divide and form colonies at the expense of losing

  10. Matrix toughness, long-term behavior, and damage tolerance of notched graphite fiber-reinforced composite materials

    NASA Technical Reports Server (NTRS)

    Bakis, C. E.; Simonds, R. A.; Stinchcomb, W. W.; Vick, L. W.

    1990-01-01

    The long-term behavior of notched graphite-fiber-reinforced composite laminates with brittle or tough matrix materials and different fiber architectures was investigated using damage measurements and stiffness change, residual strength, and life data. The fiber/matrix materials included T300/5208, AS4/3501-6, AS4/1808, AS4/PEEK, and C3000/PMR-15 matrices and unidirectional tape and woven cloth fiber architectures. Results of damage evaluation and of residual strength measurements during the fatigue damage development showed that the long-term behavior and damage tolerance are controlled by a number of interacting factors such as the matrix toughness, fiber architecture, loading levels, and damage types and distributions.

  11. Hierarchical flexural strength of enamel: transition from brittle to damage-tolerant behaviour

    PubMed Central

    Bechtle, Sabine; Özcoban, Hüseyin; Lilleodden, Erica T.; Huber, Norbert; Schreyer, Andreas; Swain, Michael V.; Schneider, Gerold A.

    2012-01-01

    Hard, biological materials are generally hierarchically structured from the nano- to the macro-scale in a somewhat self-similar manner consisting of mineral units surrounded by a soft protein shell. Considerable efforts are underway to mimic such materials because of their structurally optimized mechanical functionality of being hard and stiff as well as damage-tolerant. However, it is unclear how different hierarchical levels interact to achieve this performance. In this study, we consider dental enamel as a representative, biological hierarchical structure and determine its flexural strength and elastic modulus at three levels of hierarchy using focused ion beam (FIB) prepared cantilevers of micrometre size. The results are compared and analysed using a theoretical model proposed by Jäger and Fratzl and developed by Gao and co-workers. Both properties decrease with increasing hierarchical dimension along with a switch in mechanical behaviour from linear-elastic to elastic-inelastic. We found Gao's model matched the results very well. PMID:22031729
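
    The theoretical model credited above to Jäger and Fratzl and developed by Gao and co-workers is usually summarized by a tension-shear chain estimate of the modulus of a staggered mineral/protein composite. One commonly quoted form is reproduced below as a reference point; the notation is generic and may not match the paper's.

    ```latex
    % Hedged statement of the Jaeger-Fratzl/Gao tension-shear chain modulus estimate.
    \[
      \frac{1}{E} \;=\; \frac{4\,(1-\Phi)}{G_{p}\,\Phi^{2}\,\rho^{2}} \;+\; \frac{1}{\Phi\,E_{m}}
    \]
    % E: effective modulus at a given hierarchical level
    % \Phi: mineral volume fraction,  \rho: mineral platelet aspect ratio
    % G_{p}: shear modulus of the soft (protein) layer,  E_{m}: modulus of the mineral
    ```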

  12. Fuel containment and damage tolerance in large composite primary aircraft structures. Phase 2: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.; Denny, A.; Wood, M. A.

    1985-01-01

    Technical issues associated with fuel containment and damage tolerance of composite wing structures for transport aircraft were investigated. Material evaluation tests were conducted on two toughened resin composites: Celion/HX1504 and Celion/5245. These consisted of impact, tension, compression, edge delamination, and double cantilever beam tests. Another test series was conducted on graphite/epoxy box beams simulating a wing cover to spar cap joint configuration of a pressurized fuel tank. These tests evaluated the effectiveness of sealing methods with various fastener types and spacings under fatigue loading and with pressurized fuel. Another test series evaluated the ability of the selected coatings, film, and materials to prevent fuel leakage through 32-ply AS4/2220-1 laminates at various impact energy levels. To verify the structural integrity of the technology demonstration article structural details, tests were conducted on blade stiffened panels and sections. Compression tests were performed on undamaged and impacted stiffened AS4/2220-1 panels and smaller element tests to evaluate stiffener pull-off, side load and failsafe properties. Compression tests were also performed on panels subjected to Zone 2 lightning strikes. All of these data were integrated into a demonstration article representing a moderately loaded area of a transport wing. This test combined lightning strike, pressurized fuel, impact, impact repair, fatigue and residual strength.

  13. Structural basis for cisplatin DNA damage tolerance by human polymerase η during cancer chemotherapy

    PubMed Central

    Ummat, Ajay; Rechkoblit, Olga; Jain, Rinku; Choudhary, Jayati R.; Johnson, Robert E.; Silverstein, Timothy D.; Buku, Angeliki; Lone, Samer; Prakash, Louise; Prakash, Satya; Aggarwal, Aneel K.

    2012-01-01

    A major clinical problem in the use of cisplatin to treat cancers is tumor resistance. DNA polymerase η (Polη) is a key polymerase that allows cancer cells to cope with cisplatin–DNA adducts formed during chemotherapy. We present here a structure of human Polη inserting dCTP opposite a cisplatin intrastrand cross-link (PtGpG). We show that specificity of human Polη for PtGpG derives from an active site that is open to permit Watson-Crick geometry of the nascent PtGpG•dCTP base pair and to accommodate the lesion without steric hindrance. The specificity is augmented by residues Gln38 and Ser62 that interact with PtGpG, and Arg61 that interacts with incoming dCTP. Collectively, the structure provides a basis for understanding how Polη in human cells can tolerate DNA damage caused by cisplatin chemotherapy and offers a framework for the design of inhibitors in cancer therapy. PMID:22562137

  14. The Functions of Serine 687 Phosphorylation of Human DNA Polymerase η in UV Damage Tolerance.

    PubMed

    Dai, Xiaoxia; You, Changjun; Wang, Yinsheng

    2016-06-01

    DNA polymerase η (polη) is a Y-family translesion synthesis polymerase that plays a key role in the cellular tolerance toward UV irradiation-induced DNA damage. Here, we identified, for the first time, the phosphorylation of serine 687 (Ser(687)), which is located in the highly conserved nuclear localization signal (NLS) region of human polη and is mediated by cyclin-dependent kinase 2 (CDK2). We also showed that this phosphorylation is stimulated in human cells upon UV light exposure and results in diminished interaction of polη with proliferating cell nuclear antigen (PCNA). Furthermore, we demonstrated that the phosphorylation of Ser(687) in polη confers cellular protection from UV irradiation and increases the efficiency in replication across a site-specifically incorporated cyclobutane pyrimidine dimer in human cells. Based on these results, we proposed a mechanistic model where Ser(687) phosphorylation functions in the reverse polymerase switching step of translesion synthesis: The phosphorylation brings negative charges to the NLS of polη, which facilitates its departure from PCNA, thereby resetting the replication fork for highly accurate and processive DNA replication. Thus, our study, together with previous findings, supported that the posttranslational modifications of NLS of polη played a dual role in polymerase switching, where Lys(682) deubiquitination promotes the recruitment of polη to PCNA immediately prior to lesion bypass and Ser(687) phosphorylation stimulates its departure from the replication fork immediately after lesion bypass. PMID:26988343

  15. Evaluation of Damage Tolerance of Advanced SiC/SiC Composites after Neutron Irradiation

    NASA Astrophysics Data System (ADS)

    Ozawa, Kazumi; Katoh, Yutai; Nozawa, Takashi; Hinoki, Tatsuya; Snead, Lance L.

    2011-10-01

    Silicon carbide composites (SiC/SiC) are attractive candidate materials for structural and functional components in fusion energy systems. The effect of neutron irradiation on damage tolerance of the nuclear grade SiC/SiC composites (plain-woven Hi-Nicalon™ Type-S reinforced CVI-matrix composites with a multilayer interphase, and unidirectional Tyranno™-SA3 reinforced NITE-matrix composites with a carbon mono-layer interphase) was evaluated by means of miniaturized single-edge notched beam tests. No significant changes in crack extension behavior or in the load versus load-point displacement characteristics, such as the peak load and hysteresis loop width, were observed after irradiation to 5.9 × 10²⁵ n/m² (E > 0.1 MeV) at 800°C and to 5.8 × 10²⁵ n/m² at 1300°C. By applying a global energy balance analysis based on non-linear fracture mechanics, the energy release rate for these composite materials was found to be unchanged by irradiation, with a value of 3 ± 2 kJ/m². This has led to the conclusion that, for these fairly aggressive irradiation conditions, the effect of neutron irradiation on the fracture resistance of these composites appears insignificant.

  16. Damage Tolerance Assessment of Friction Pull Plug Welds in an Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of cryogenic propellant tanks. Self-reacting friction stir welding is one variation of the friction stir weld process being developed for manufacturing tanks. Friction pull plug welding is used to seal the exit hole that remains in a circumferential self-reacting friction stir weld. A friction plug weld placed in a self-reacting friction stir weld results in a non-homogenous weld joint where the initial weld, plug weld, their respective heat affected zones and the base metal all interact. The welded joint is a composite plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data relating residual strength capability to flaw size in an aluminum alloy friction plug weld will be presented.

  17. Damage tolerant functionally graded materials for advanced wear and friction applications

    NASA Astrophysics Data System (ADS)

    Prchlik, Lubos

    The research work presented in this dissertation focused on processing effects, microstructure development, characterization, and performance evaluation of composite and graded coatings used for friction and wear control. The following issues were addressed. (1) Definition of prerequisites for successful composite and graded coating formation by means of thermal spraying. (2) Improvement of characterization methods available for homogeneous thermally sprayed coatings and their extension to composite and graded materials. (3) Development of novel characterization methods specifically for FGMs, with a focus on through-thickness property measurement by indentation and in-situ curvature techniques. (4) Design of composite materials with improved properties compared to homogeneous coatings. (5) Fabrication and performance assessment of FGMs with improved wear and impact damage properties. Materials. The materials studied included several material systems relevant to low-friction and contact-damage-tolerant applications: Mo-Mo2C and WC-Co cermets, materials commonly used in sliding components of industrial machinery, and NiCrAlY/8%-Yttria Partially Stabilized Zirconia composites as a potential solution for abradable sections of gas turbines and aircraft engines. In addition, uniform coatings such as molybdenum and Ni5%Al alloy were evaluated as model systems to assess the influence of microstructure variation on the mechanical properties and wear response. Methods. The contact response of the materials was investigated through several techniques. These included methods evaluating the relevant intrinsic coating properties such as elastic modulus, residual stress, fracture toughness, and scratch resistance, and tests measuring the abrasion and friction-sliding behavior. Dry-sand and wet two-body abrasion testing was performed in addition to traditional ball-on-disc sliding tests. Among all characterization techniques the spherical indentation deserved most attention and enabled to

  18. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  19. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate the need for aircraft structures to meet damage tolerance requirements throughout the service life. These requirements imply that the damaged aircraft structure must maintain adequate residual strength in order to sustain its integrity, which is accomplished by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy, composed of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed-form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system of three elements to a parallel system of up to six elements. These newly developed expressions are used to check the accuracy of the implementation of a Monte Carlo simulation algorithm that determines the probability of failure of a parallel system composed of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme, by utilizing the residual strength of the fasteners subjected to various initial load distributions and then subjected to a new unequal load distribution resulting from subsequent sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the
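
    A minimal sketch of the first step described above — direct Monte Carlo estimation of the failure probability of a parallel system of statistically independent elements under equal load sharing, with redistribution after each element failure — is given below. The strength distribution, its parameters, and the applied load level are illustrative assumptions, not values taken from the thesis.

    ```python
    # Hedged Monte Carlo sketch: failure probability of a parallel system with
    # equal load sharing and load redistribution after sequential element failures.
    # Distribution and load values are illustrative assumptions.
    import random

    def system_fails(total_load, strengths):
        """Return True if equal redistribution of total_load eventually fails all elements."""
        survivors = list(strengths)
        while survivors:
            per_element = total_load / len(survivors)
            still_ok = [s for s in survivors if s > per_element]
            if len(still_ok) == len(survivors):
                return False          # every survivor carries its share: system holds
            survivors = still_ok      # failed elements shed load onto the rest
        return True                   # all elements failed

    def probability_of_failure(n_elements=6, total_load=4.5, trials=100_000, seed=1):
        rng = random.Random(seed)
        failures = 0
        for _ in range(trials):
            # assumed lognormal element strengths (median ~1.0, moderate scatter)
            strengths = [rng.lognormvariate(0.0, 0.15) for _ in range(n_elements)]
            failures += system_fails(total_load, strengths)
        return failures / trials

    if __name__ == "__main__":
        print("Estimated probability of failure:", probability_of_failure())
    ```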

  20. Rad18 confers hematopoietic progenitor cell DNA damage tolerance independently of the Fanconi Anemia pathway in vivo

    PubMed Central

    Yang, Yang; Poe, Jonathan C.; Yang, Lisong; Fedoriw, Andrew; Desai, Siddhi; Magnuson, Terry; Li, Zhiguo; Fedoriw, Yuri; Araki, Kimi; Gao, Yanzhe; Tateishi, Satoshi; Sarantopoulos, Stefanie; Vaziri, Cyrus

    2016-01-01

    In cultured cancer cells the E3 ubiquitin ligase Rad18 activates Trans-Lesion Synthesis (TLS) and the Fanconi Anemia (FA) pathway. However, physiological roles of Rad18 in DNA damage tolerance and carcinogenesis are unknown and were investigated here. Primary hematopoietic stem and progenitor cells (HSPC) co-expressed RAD18 and FANCD2 proteins, potentially consistent with a role for Rad18 in FA pathway function during hematopoiesis. However, hematopoietic defects typically associated with fanc-deficiency (decreased HSPC numbers, reduced engraftment potential of HSPC, and Mitomycin C (MMC) -sensitive hematopoiesis), were absent in Rad18−/− mice. Moreover, primary Rad18−/− mouse embryonic fibroblasts (MEF) retained robust Fancd2 mono-ubiquitination following MMC treatment. Therefore, Rad18 is dispensable for FA pathway activation in untransformed cells and the Rad18 and FA pathways are separable in hematopoietic cells. In contrast with responses to crosslinking agents, Rad18−/− HSPC were sensitive to in vivo treatment with the myelosuppressive agent 7,12 Dimethylbenz[a]anthracene (DMBA). Rad18-deficient fibroblasts aberrantly accumulated DNA damage markers after DMBA treatment. Moreover, in vivo DMBA treatment led to increased incidence of B cell malignancy in Rad18−/− mice. These results identify novel hematopoietic functions for Rad18 and provide the first demonstration that Rad18 confers DNA damage tolerance and tumor-suppression in a physiological setting. PMID:26883629

  1. Design, analysis, and fabrication of a pressure box test fixture for tension damage tolerance testing of curved fuselage panels

    NASA Technical Reports Server (NTRS)

    Smith, P. J.; Bodine, J. B.; Preuss, C. H.; Koch, W. J.

    1993-01-01

    A pressure box test fixture was designed and fabricated to evaluate the effects of internal pressure, biaxial tension loads, curvature, and damage on the fracture response of composite fuselage structure. Previous work in composite fuselage tension damage tolerance, performed during NASA contract NAS1-17740, evaluated the above effects on unstiffened panels only. This work extends the tension damage tolerance testing to curved stiffened fuselage crown structure that contains longitudinal stringers and circumferential frame elements. The pressure box fixture was designed to apply internal pressure up to 20 psi, and axial tension loads up to 5000 lb/in, either separately or simultaneously. A NASTRAN finite element model of the pressure box fixture and composite stiffened panel was used to help design the test fixture, and was compared to a finite element model of a full composite stiffened fuselage shell. This was done to ensure that the test panel was loaded in a similar way to a panel in the full fuselage shell, and that the fixture and its attachment plates did not adversely affect the panel.

  2. Rad18 confers hematopoietic progenitor cell DNA damage tolerance independently of the Fanconi Anemia pathway in vivo.

    PubMed

    Yang, Yang; Poe, Jonathan C; Yang, Lisong; Fedoriw, Andrew; Desai, Siddhi; Magnuson, Terry; Li, Zhiguo; Fedoriw, Yuri; Araki, Kimi; Gao, Yanzhe; Tateishi, Satoshi; Sarantopoulos, Stefanie; Vaziri, Cyrus

    2016-05-19

    In cultured cancer cells the E3 ubiquitin ligase Rad18 activates Trans-Lesion Synthesis (TLS) and the Fanconi Anemia (FA) pathway. However, physiological roles of Rad18 in DNA damage tolerance and carcinogenesis are unknown and were investigated here. Primary hematopoietic stem and progenitor cells (HSPC) co-expressed RAD18 and FANCD2 proteins, potentially consistent with a role for Rad18 in FA pathway function during hematopoiesis. However, hematopoietic defects typically associated with fanc-deficiency (decreased HSPC numbers, reduced engraftment potential of HSPC, and Mitomycin C (MMC) -sensitive hematopoiesis), were absent in Rad18(-/-) mice. Moreover, primary Rad18(-/-) mouse embryonic fibroblasts (MEF) retained robust Fancd2 mono-ubiquitination following MMC treatment. Therefore, Rad18 is dispensable for FA pathway activation in untransformed cells and the Rad18 and FA pathways are separable in hematopoietic cells. In contrast with responses to crosslinking agents, Rad18(-/-) HSPC were sensitive to in vivo treatment with the myelosuppressive agent 7,12 Dimethylbenz[a]anthracene (DMBA). Rad18-deficient fibroblasts aberrantly accumulated DNA damage markers after DMBA treatment. Moreover, in vivo DMBA treatment led to increased incidence of B cell malignancy in Rad18(-/-) mice. These results identify novel hematopoietic functions for Rad18 and provide the first demonstration that Rad18 confers DNA damage tolerance and tumor-suppression in a physiological setting. PMID:26883629

  3. Micro-Energy Rates for Damage Tolerance and Durability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    In this paper, the adhesive bond strength of lap-jointed graphite/aluminum composites is examined by computational simulation. Computed micro-stress level energy release rates are used to identify the damage mechanisms associated with the corresponding acoustic emission (AE) signals. Computed damage regions are similarly correlated with ultrasonically scanned damage regions. Results show that computational simulation can be used with suitable NDE methods for credible in-service monitoring of composites.

  4. Role of Schizosaccharomyces pombe RecQ homolog, recombination, and checkpoint genes in UV damage tolerance.

    PubMed Central

    Murray, J M; Lindsay, H D; Munday, C A; Carr, A M

    1997-01-01

    The cellular responses to DNA damage are complex and include direct DNA repair pathways that remove the damage and indirect damage responses which allow cells to survive DNA damage that has not been, or cannot be, removed. We have identified the gene mutated in the rad12.502 strain as a Schizosaccharomyces pombe recQ homolog. The same gene (designated rqh1) is also mutated in the hus2.22 mutant. We show that Rqh1 is involved in a DNA damage survival mechanism which prevents cell death when UV-induced DNA damage cannot be removed. This pathway also requires the correct functioning of the recombination machinery and the six checkpoint rad gene products plus the Cds1 kinase. Our data suggest that Rqh1 operates during S phase as part of a mechanism which prevents DNA damage causing cell lethality. This process may involve the bypass of DNA damage sites by the replication fork. Finally, in contrast with the reported literature, we do not find that rqh1 (rad12) mutant cells are defective in UV dimer endonuclease activity. PMID:9372918

  5. GENETIC AND MOLECULAR ANALYSIS OF DNA DAMAGE REPAIR AND TOLERANCE PATHWAYS.

    SciTech Connect

    SUTHERLAND, B.M.

    2001-07-26

    Radiation can damage cellular components, including DNA. Organisms have developed a panoply of means of dealing with DNA damage. Some repair paths have rather narrow substrate specificity (e.g. photolyases), which act on specific pyrimidine photoproducts in a specific type (e.g., DNA) and conformation (double-stranded B conformation) of nucleic acid. Others, for example, nucleotide excision repair, deal with larger classes of damages, in this case bulky adducts in DNA. A detailed discussion of DNA repair mechanisms is beyond the scope of this article, but one can be found in the excellent book of Friedberg et al. [1] for further detail. However, some DNA damages and paths for repair of those damages important for photobiology will be outlined below as a basis for the specific examples of genetic and molecular analysis that will be presented below.

  6. Desiccation sensitivity and tolerance in the moss Physcomitrella patens: assessing limits and damage.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The moss Physcomitrella patens is becoming the model of choice for functional genomic studies at the cellular level. Studies report that P. patens survives moderate osmotic and salt stress, and that desiccation tolerance can be induced by exogenous ABA. Our goal was to quantify the extent of dehydr...

  7. DNA damage tolerance pathway involving DNA polymerase ι and the tumor suppressor p53 regulates DNA replication fork progression

    PubMed Central

    Hampp, Stephanie; Kiessling, Tina; Buechle, Kerstin; Mansilla, Sabrina F.; Thomale, Jürgen; Rall, Melanie; Ahn, Jinwoo; Pospiech, Helmut; Gottifredi, Vanesa; Wiesmüller, Lisa

    2016-01-01

    DNA damage tolerance facilitates the progression of replication forks that have encountered obstacles on the template strands. It involves either translesion DNA synthesis initiated by proliferating cell nuclear antigen monoubiquitination or less well-characterized fork reversal and template switch mechanisms. Herein, we characterize a novel tolerance pathway requiring the tumor suppressor p53, the translesion polymerase ι (POLι), the ubiquitin ligase Rad5-related helicase-like transcription factor (HLTF), and the SWI/SNF catalytic subunit (SNF2) translocase zinc finger ran-binding domain containing 3 (ZRANB3). This novel p53 activity is lost in the exonuclease-deficient but transcriptionally active p53(H115N) mutant. Wild-type p53, but not p53(H115N), associates with POLι in vivo. Strikingly, the concerted action of p53 and POLι decelerates nascent DNA elongation and promotes HLTF/ZRANB3-dependent recombination during unperturbed DNA replication. Particularly after cross-linker–induced replication stress, p53 and POLι also act together to promote meiotic recombination enzyme 11 (MRE11)-dependent accumulation of (phospho-)replication protein A (RPA)-coated ssDNA. These results implicate a direct role of p53 in the processing of replication forks encountering obstacles on the template strand. Our findings define an unprecedented function of p53 and POLι in the DNA damage response to endogenous or exogenous replication stress. PMID:27407148

  8. DNA damage tolerance pathway involving DNA polymerase ι and the tumor suppressor p53 regulates DNA replication fork progression.

    PubMed

    Hampp, Stephanie; Kiessling, Tina; Buechle, Kerstin; Mansilla, Sabrina F; Thomale, Jürgen; Rall, Melanie; Ahn, Jinwoo; Pospiech, Helmut; Gottifredi, Vanesa; Wiesmüller, Lisa

    2016-07-26

    DNA damage tolerance facilitates the progression of replication forks that have encountered obstacles on the template strands. It involves either translesion DNA synthesis initiated by proliferating cell nuclear antigen monoubiquitination or less well-characterized fork reversal and template switch mechanisms. Herein, we characterize a novel tolerance pathway requiring the tumor suppressor p53, the translesion polymerase ι (POLι), the ubiquitin ligase Rad5-related helicase-like transcription factor (HLTF), and the SWI/SNF catalytic subunit (SNF2) translocase zinc finger ran-binding domain containing 3 (ZRANB3). This novel p53 activity is lost in the exonuclease-deficient but transcriptionally active p53(H115N) mutant. Wild-type p53, but not p53(H115N), associates with POLι in vivo. Strikingly, the concerted action of p53 and POLι decelerates nascent DNA elongation and promotes HLTF/ZRANB3-dependent recombination during unperturbed DNA replication. Particularly after cross-linker-induced replication stress, p53 and POLι also act together to promote meiotic recombination enzyme 11 (MRE11)-dependent accumulation of (phospho-)replication protein A (RPA)-coated ssDNA. These results implicate a direct role of p53 in the processing of replication forks encountering obstacles on the template strand. Our findings define an unprecedented function of p53 and POLι in the DNA damage response to endogenous or exogenous replication stress. PMID:27407148

  9. Genetic analysis of repair and damage tolerance mechanisms for DNA-protein cross-links in Escherichia coli.

    PubMed

    Salem, Amir M H; Nakano, Toshiaki; Takuwa, Minako; Matoba, Nagisa; Tsuboi, Tomohiro; Terato, Hiroaki; Yamamoto, Kazuo; Yamada, Masami; Nohmi, Takehiko; Ide, Hiroshi

    2009-09-01

    DNA-protein cross-links (DPCs) are unique among DNA lesions in their unusually bulky nature. We have recently shown that nucleotide excision repair (NER) and RecBCD-dependent homologous recombination (HR) collaboratively alleviate the lethal effect of DPCs in Escherichia coli. In this study, to gain further insight into the damage-processing mechanism for DPCs, we assessed the sensitivities of a panel of repair-deficient E. coli mutants to DPC-inducing agents, including formaldehyde (FA) and 5-azacytidine (azaC). We show here that the damage tolerance mechanism involving HR and subsequent replication restart (RR) provides the most effective means of cell survival against DPCs. Translesion synthesis does not serve as an alternative damage tolerance mechanism for DPCs in cell survival. Elimination of DPCs from the genome relies primarily on NER, which provides a second and moderately effective means of cell survival against DPCs. Interestingly, Cho rather than UvrC seems to be an effective nuclease for the NER of DPCs. Together with the genes responsible for HR, RR, and NER, the mutation of genes involved in several aspects of DNA repair and transactions, such as recQ, xth nfo, dksA, and topA, rendered cells slightly but significantly sensitive to FA but not azaC, possibly reflecting the complexity of DPCs or cryptic lesions induced by FA. UvrD may have an additional role outside NER, since the uvrD mutation conferred a slight azaC sensitivity on cells. Finally, DNA glycosylases mitigate azaC toxicity, independently of the repair of DPCs, presumably by removing 5-azacytosine or its degradation product from the chromosome. PMID:19617358

  10. Ultraviolet-B-induced DNA damage and ultraviolet-B tolerance mechanisms in species with different functional groups coexisting in subalpine moorlands.

    PubMed

    Wang, Qing-Wei; Kamiyama, Chiho; Hidema, Jun; Hikosaka, Kouki

    2016-08-01

    High doses of ultraviolet-B (UV-B; 280-315 nm) radiation can have detrimental effects on plants, and especially damage their DNA. Plants have DNA repair and protection mechanisms to prevent UV-B damage. However, it remains unclear how DNA damage and tolerance mechanisms vary among field species. We studied DNA damage and tolerance mechanisms in 26 species with different functional groups coexisting in two moorlands at two elevations. We collected current-year leaves in July and August, and determined accumulation of cyclobutane pyrimidine dimer (CPD) as UV-B damage and photorepair activity (PRA) and concentrations of UV-absorbing compounds (UACs) and carotenoids (CARs) as UV-B tolerance mechanisms. DNA damage was greater in dicot than in monocot species, and higher in herbaceous than in woody species. Evergreen species accumulated more CPDs than deciduous species. PRA was higher in Poaceae than in species of other families. UACs were significantly higher in woody than in herbaceous species. The CPD level was not explained by the mechanisms across species, but was significantly related to PRA and UACs when we ignored species with low CPD, PRA and UACs, implying the presence of another effective tolerance mechanism. UACs were correlated negatively with PRA and positively with CARs. Our results revealed that UV-induced DNA damage significantly varies among native species, and this variation is related to functional groups. DNA repair, rather than UV-B protection, dominates in UV-B tolerance in the field. Our findings also suggest that UV-B tolerance mechanisms vary among species under evolutionary trade-off and synergism. PMID:27139425

  11. Micro(mi) RNA-34a targets protein phosphatase (PP)1γ to regulate DNA damage tolerance

    PubMed Central

    Takeda, Yuko; Venkitaraman, Ashok R

    2015-01-01

    The DNA damage response (DDR) triggers widespread changes in gene expression, mediated partly by alterations in micro(mi) RNA levels, whose nature and significance remain uncertain. Here, we report that miR-34a, which is upregulated during the DDR, modulates the expression of protein phosphatase 1γ (PP1γ) to regulate cellular tolerance to DNA damage. Multiple bio-informatic algorithms predict that miR-34a targets the PP1CCC gene encoding PP1γ protein. Ionising radiation (IR) decreases cellular expression of PP1γ in a dose-dependent manner. An miR-34a-mimic reduces cellular PP1γ protein. Conversely, an miR-34a inhibitor antagonizes IR-induced decreases in PP1γ protein expression. A wild-type (but not mutant) miR-34a seed match sequence from the 3′ untranslated region (UTR) of PP1CCC when transplanted to a luciferase reporter gene makes it responsive to an miR-34a-mimic. Thus, miR-34a upregulation during the DDR targets the 3′ UTR of PP1CCC to decrease PP1γ protein expression. PP1γ is known to antagonize DDR signaling via the ataxia-telangiectasia-mutated (ATM) kinase. Interestingly, we find that cells exposed to DNA damage become more sensitive – in an miR-34a-dependent manner – to a second challenge with damage. Increased sensitivity to the second challenge is marked by enhanced phosphorylation of ATM and p53, increased γH2AX formation, and increased cell death. Increased sensitivity can be partly recapitulated by a miR-34a-mimic, or antagonized by an miR-34a-inhibitor. Thus, our findings suggest a model in which damage-induced miR-34a induction reduces PP1γ expression and enhances ATM signaling to decrease tolerance to repeated genotoxic challenges. This mechanism has implications for tumor suppression and the response of cancers to therapeutic radiation. PMID:26111201

  12. A damage tolerance comparison of 7075-T6 aluminum alloy and IM7/977-2 carbon/epoxy

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.; Lance, David G.; Hodge, Andrew J.

    1991-01-01

    A comparison of low-velocity impact damage between one of the strongest aluminum alloys and a new, damage-tolerant resin system used as a matrix for high-strength carbon fibers was made in this study. The aluminum and composite materials were used as face sheets on a 0.13 g/cu cm aluminum honeycomb. Four levels of impact energy were used: 2.6 J, 5.3 J, 7.8 J, and 9.9 J. The beams were compared for static strength and fatigue life by use of the four-point bend flexure test. It was found that in the undamaged state the specific strength of the composite face sheets was about twice that of the aluminum face sheets. A sharp drop in strength was observed for the composite specimens impacted at the lowest (2.6 J) energy level, but the overall specific strength was still higher than for the aluminum specimens. At all impact energy levels tested, the static specific strength of the composite face sheets was significantly higher than that of the aluminum face sheets. The fatigue life of the most severely damaged composite specimen was about 17 times greater than that of the undamaged aluminum specimens when cycled at 1 Hz between 20 percent and 85 percent of ultimate breaking load.

  13. Beauveria bassiana, Metarhizium anisopliae, and Metarhizium anisopliae var. acridum conidia: tolerance to imbibitional damage

    Technology Transfer Automated Retrieval System (TEKTRAN)

    When dry fungal cells are immersed in water, rapid imbibition (water uptake) may compromise the plasma membrane, killing the cell. This study investigated the impact of imbibitional damage (measured in terms of reduced viability) on Beauveria bassiana (Bb), Metarhizium anisopliae (Ma) and M. anisop...

  14. Damage tolerance in filament-wound graphite/epoxy pressure vessels

    NASA Astrophysics Data System (ADS)

    Simon, William E.; Ngueyen, Vinh D.; Chenna, Ravi K.

    1995-07-01

    Graphite/epoxy composites are extensively used in the aerospace and sporting goods industries due to their superior engineering properties compared to those of metals. However, graphite/epoxy is extremely susceptible to impact damage which can cause considerable and sometimes undetected reduction in strength. An inelastic impact model was developed to predict damage due to low-velocity impact. A transient dynamic finite element formulation was used in conjunction with the 3D Tsai-Wu failure criterion to determine and incorporate failure in the materials during impact. Material degradation can be adjusted from no degradation to partial degradation to full degradation. The developed software is based on an object-oriented implementation framework called Extensible Implementation Framework for Finite Elements (EIFFE).
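
    As a point of reference for the failure check mentioned above, a hedged sketch of a Tsai-Wu ply-failure index is shown below. For brevity it uses the plane-stress (2D) form rather than the 3D criterion cited in the abstract, and the strength values in the example are illustrative rather than the paper's material data.

    ```python
    # Hedged sketch of a plane-stress Tsai-Wu failure-index check, the kind of
    # criterion used to flag ply failure during impact simulation. Strength and
    # stress values below are illustrative assumptions.
    import math

    def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
        """Plane-stress Tsai-Wu index; values >= 1 indicate predicted ply failure.

        s1, s2, t12: ply stresses (fiber, transverse, in-plane shear)
        Xt, Xc: fiber-direction tensile/compressive strengths (magnitudes)
        Yt, Yc: transverse tensile/compressive strengths (magnitudes)
        S: in-plane shear strength
        """
        F1, F2 = 1 / Xt - 1 / Xc, 1 / Yt - 1 / Yc
        F11, F22, F66 = 1 / (Xt * Xc), 1 / (Yt * Yc), 1 / S**2
        F12 = -0.5 * math.sqrt(F11 * F22)   # common estimate of the interaction term
        return (F1 * s1 + F2 * s2
                + F11 * s1**2 + F22 * s2**2 + F66 * t12**2
                + 2 * F12 * s1 * s2)

    if __name__ == "__main__":
        # Illustrative graphite/epoxy ply strengths (MPa) and a trial stress state
        idx = tsai_wu_index(s1=900.0, s2=30.0, t12=40.0,
                            Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0)
        print("Tsai-Wu index:", round(idx, 2), "-> failure" if idx >= 1 else "-> no failure")
    ```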

  15. Simplification of Fatigue Test Requirements for Damage Tolerance of Composite Interstage Launch Vehicle Hardware

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Hodge, A. J.; Jackson, J. R.

    2010-01-01

    The issue of fatigue loading of structures composed of composite materials is considered in a requirements document that is currently in place for manned launch vehicles. By taking into account the short life of these parts, coupled with design considerations, it is demonstrated that the necessary coupon level fatigue data collapse to a static case. Data from a literature review of past studies that examined compressive fatigue loading after impact and data generated from this experimental study are presented to support this finding. Damage growth, in the form of infrared thermography, was difficult to detect due to rapid degradation of compressive properties once damage growth initiated. Unrealistically high fatigue amplitudes were needed to fail 5 of 15 specimens before 10,000 cycles were reached. Since a typical vehicle structure, such as the Ares I interstage, only experiences a few cycles near limit load, it is concluded that static compression after impact (CAI) strength data will suffice for most launch vehicle structures.

  16. Damage tolerance of a geodesically stiffened advanced composite structural concept for aircraft structural applications

    NASA Technical Reports Server (NTRS)

    Rouse, Marshall; Ambur, Damodar R.

    1992-01-01

    This paper describes the features of a geodesically stiffened panel concept that was designed for a fuselage application with a combined axial compression loading of 3,000 lb/in. and a shear loading of 600 lb/in. Specimens representative of this panel concept were tested in uniaxial compression both with and without low-speed impact damage to study the buckling and postbuckling response of the structure. Experimental results that describe the stiffness and failure characteristics of undamaged and impact-damaged specimens are presented. A finite element analysis model that captures the principal details of the specimens was developed and used to predict the panel response. Analytical results on panel end-shortening are compared with the experimental results. Analytical results that describe panel end-shortening, out-of-plane displacement and stress resultants are presented.

  17. Damage tolerance in filament-wound graphite/epoxy pressure vessels

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Ngueyen, Vinh D.; Chenna, Ravi K.

    1995-01-01

    Graphite/epoxy composites are extensively used in the aerospace and sporting goods industries due to their superior engineering properties compared to those of metals. However, graphite/epoxy is extremely susceptible to impact damage which can cause considerable and sometimes undetected reduction in strength. An inelastic impact model was developed to predict damage due to low-velocity impact. A transient dynamic finite element formulation was used in conjunction with the 3D Tsai-Wu failure criterion to determine and incorporate failure in the materials during impact. Material degradation can be adjusted from no degradation to partial degradation to full degradation. The developed software is based on an object-oriented implementation framework called Extensible Implementation Framework for Finite Elements (EIFFE).

  18. Recent development in the design, testing and impact-damage tolerance of stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Anderson, M. S.; Rhodes, M. D.; Starnes, J. H., Jr.; Stroud, W. J.

    1979-01-01

    Structural technology of laminated filamentary-composite stiffened-panel structures under combined inplane and lateral loadings is discussed. Attention is focused on: (1) methods for analyzing the behavior of these structures under load and for determining appropriate structural proportions for weight-efficient configurations; and (2) effects of impact damage and geometric imperfections on structural performance. Recent improvements in buckling analysis involving combined inplane compression and shear loadings and transverse shear deformations are presented. A computer code is described for proportioning or sizing laminate layers and cross-sectional dimensions, and the code is used to develop structural efficiency data for a variety of configurations, loading conditions, and constraint conditions. Experimental data on buckling of panels under inplane compression is presented. Mechanisms of impact damage initiation and propagation are described.

  19. Honey bee (Apis mellifera) drones survive oxidative stress due to increased tolerance instead of avoidance or repair of oxidative damage.

    PubMed

    Li-Byarlay, Hongmei; Huang, Ming Hua; Simone-Finstrom, Michael; Strand, Micheline K; Tarpy, David R; Rueppell, Olav

    2016-10-01

    Oxidative stress can lead to premature aging symptoms and cause acute mortality at higher doses in a range of organisms. Oxidative stress resistance and longevity are mechanistically and phenotypically linked; considerable variation in oxidative stress resistance exists among and within species and typically covaries with life expectancy. However, it is unclear whether stress-resistant, long-lived individuals avoid, repair, or tolerate molecular damage to survive longer than others. The honey bee (Apis mellifera L.) is an emerging model system that is well-suited to address this question. Furthermore, this species is the most economically important pollinator, whose health may be compromised by pesticide exposure, including oxidative stressors. Here, we develop a protocol for inducing oxidative stress in honey bee males (drones) via Paraquat injection. After injection, individuals from different colony sources were kept in common social conditions to monitor their survival compared to saline-injected controls. Oxidative stress was measured in susceptible and resistant individuals. Paraquat drastically reduced survival but individuals varied in their resistance to treatment within and among colony sources. Longer-lived individuals exhibited higher levels of lipid peroxidation than individuals dying early. In contrast, the level of protein carbonylation was not significantly different between the two groups. This first study of oxidative stress in male honey bees suggests that survival of an acute oxidative stressor is due to tolerance, not prevention or repair, of oxidative damage to lipids. It also demonstrates colony differences in oxidative stress resistance that might be useful for breeding stress-resistant honey bees. PMID:27422326

  20. Telomerase reverse transcriptase expression protects transformed human cells against DNA-damaging agents, and increases tolerance to chromosomal instability.

    PubMed

    Fleisig, H B; Hukezalie, K R; Thompson, C A H; Au-Yeung, T T T; Ludlow, A T; Zhao, C R; Wong, J M Y

    2016-01-14

    Reactivation of telomerase reverse transcriptase (TERT) expression is found in more than 85% of human cancers. The remaining cancers rely on the alternative lengthening of telomeres (ALT), a recombination-based mechanism for telomere-length maintenance. Prevalence of TERT reactivation over the ALT mechanism was linked to secondary TERT function unrelated to telomere length maintenance. To characterize this non-canonical function, we created a panel of ALT cells with recombinant expression of TERT and TERT variants: TERT-positive ALT cells showed higher tolerance to genotoxic insults compared with their TERT-negative counterparts. We identified telomere synthesis-defective TERT variants that bestowed similar genotoxic stress tolerance, indicating that telomere synthesis activity is dispensable for this survival phenotype. TERT expression improved the kinetics of double-strand chromosome break repair and reduced DNA damage-related nuclear division abnormalities, a phenotype associated with ALT tumors. Despite this reduction in cytological abnormalities, surviving TERT-positive ALT cells were found to have gross chromosomal instabilities. We sorted TERT-positive cells with cytogenetic changes and followed their growth. We found that the chromosome-number changes persisted, and TERT-positive ALT cells surviving genotoxic events propagated through subsequent generations with new chromosome numbers. Our data confirm that telomerase expression protects against double-strand DNA (dsDNA)-damaging events, and show that this protective function is uncoupled from its role in telomere synthesis. TERT expression promotes oncogene-transformed cell growth by reducing the inhibitory effects of cell-intrinsic (telomere attrition) and cell-extrinsic (chemical- or metabolism-induced genotoxic stress) challenges. These data provide the impetus to develop new therapeutic interventions for telomerase-positive cancers through simultaneous targeting of multiple telomerase activities. PMID

  1. A study of the damage tolerance enhancement of carbon/epoxy laminates by utilizing an outer lamina of ultra high molecular weight polyethylene

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.; Lance, David G.

    1991-01-01

    The damage tolerance of carbon/epoxy was examined when an outer layer of ultra high molecular weight polyethylene (Spectra) material was utilized on the specimen. Four types of 16 ply quasi-isotropic panels, (0,+45,90,-45)s2 were tested. The first contained no Spectra, while the others had one lamina of Spectra placed on either the top (impacted side), bottom or both surfaces of the composite plate. A range of impact energies up to approximately 8.5 Joules (6.3 ft-lbs) was used to inflict damage upon these specimens. Glass/Phenolic honeycomb beams with a core density of 314 N/m3 (2.0 lb/ft3) and 8 ply quasi-isotropic facesheets were also tested for compression-after-impact strength with and without Spectra at impact energies of 1,2,3 and 4 Joules (.74, 1.47, 2.21 and 2.95 ft-lbs). It was observed that the composite plates had little change in damage tolerance due to the Spectra, while the honeycomb panels demonstrated a slight increase in damage tolerance when Spectra was added, the damage tolerance level being more improved at higher impact energies.

  2. Comparison of tissue damage caused by various laser systems with tissue tolerable plasma by light and laser scan microscopy

    NASA Astrophysics Data System (ADS)

    Vandersee, Staffan; Lademann, Jürgen; Richter, Heike; Patzelt, Alexa; Lange-Asschenfeldt, Bernhard

    2013-10-01

    Tissue tolerable plasma (TTP) represents a novel therapeutic method with promising capabilities in the field of dermatological interventions, in particular disinfection but also wound antisepsis and regeneration. The energy transfer by plasma into living tissue is not easily educible, as a variety of features such as the medium’s actual molecule-stream, the ions, electrons and free radicals involved, as well as the emission of ultraviolet, visible and infrared light contribute to its increasingly well characterized effects. Thus, attributing possible adverse effects, especially those of prolonged exposure, to a single component of the plasma's mode of action is difficult. Until now, severe adverse events connected to plasma exposure have not been reported when treatment is conducted according to existing therapeutic protocols. In this study, we have compared the tissue damage potential of CO2 and dye lasers with TTP in a porcine model. After exposure of pig ear skin to the three treatment modalities, all specimens were examined histologically and by means of laser scan microscopy (LSM). Light microscopical tissue damage could only be shown in the case of the CO2 laser, whereas dye laser and plasma treatment resulted in no detectable impairment of the specimens. In the case of TTP, LSM examination revealed only an impairment of the uppermost corneal layers of the skin, thus stressing its safety when used in vivo.

  3. Elimination of damaged mitochondria through mitophagy reduces mitochondrial oxidative stress and increases tolerance to trichothecenes.

    PubMed

    Bin-Umer, Mohamed Anwar; McLaughlin, John E; Butterly, Matthew S; McCormick, Susan; Tumer, Nilgun E

    2014-08-12

    Trichothecene mycotoxins are natural contaminants of small grain cereals and are encountered in the environment, posing a worldwide threat to human and animal health. Their mechanism of toxicity is poorly understood, and little is known about cellular protection mechanisms against trichothecenes. We previously identified inhibition of mitochondrial protein synthesis as a novel mechanism for trichothecene-induced cell death. To identify cellular functions involved in trichothecene resistance, we screened the Saccharomyces cerevisiae deletion library for increased sensitivity to nonlethal concentrations of trichothecin (Tcin) and identified 121 strains exhibiting higher sensitivity than the parental strain. The largest group of sensitive strains had significantly higher reactive oxygen species (ROS) levels relative to the parental strain. A dose-dependent increase in ROS levels was observed in the parental strain treated with different trichothecenes, but not in a petite version of the parental strain or in the presence of a mitochondrial membrane uncoupler, indicating that mitochondria are the main site of ROS production due to toxin exposure. Cytotoxicity of trichothecenes was alleviated after treatment of the parental strain and highly sensitive mutants with antioxidants, suggesting that oxidative stress contributes to trichothecene sensitivity. Cotreatment with rapamycin and trichothecenes reduced ROS levels and cytotoxicity in the parental strain relative to the trichothecene treatment alone, but not in mitophagy deficient mutants, suggesting that elimination of trichothecene-damaged mitochondria by mitophagy improves cell survival. These results reveal that increased mitophagy is a cellular protection mechanism against trichothecene-induced mitochondrial oxidative stress and a potential target for trichothecene resistance. PMID:25071194

  4. Test validation of environmental barrier coating (EBC) durability and damage tolerance modeling approach

    NASA Astrophysics Data System (ADS)

    Abdul-Aziz, Ali; Najafi, Ali; Abdi, Frank; Bhatt, Ramakrishna T.; Grady, Joseph E.

    2014-03-01

    Protection of Ceramic Matrix Composites (CMCs) is an important consideration for engine manufacturers and aerospace companies seeking to improve the durability of their hot engine components. CMCs are typically porous materials, which permits some desirable infiltration that leads to strength enhancements. However, they experience various durability issues such as degradation due to coating oxidation. These concerns are being addressed by introducing a high-temperature protective system, the Environmental Barrier Coating (EBC), that can operate in high-temperature applications [1, 3]. In this paper, linear elastic progressive failure analyses are performed to evaluate conditions that would cause crack initiation in the EBC. The analysis is to determine the overall failure sequence under tensile loading conditions on different layers of material, including the EBC and CMC, in an attempt to develop a life/failure model. A 3D finite element model of a dogbone specimen is constructed for the analyses. Damage initiation, propagation and final failure are captured using a progressive failure model considering tensile loading conditions at room temperature. It is expected that this study will establish a process for using a computational approach, validated at the specimen level, to reliably predict component-level performance in the future without resorting to extensive testing.

  5. Elimination of damaged mitochondria through mitophagy reduces mitochondrial oxidative stress and increases tolerance to trichothecenes

    PubMed Central

    Bin-Umer, Mohamed Anwar; McLaughlin, John E.; Butterly, Matthew S.; McCormick, Susan; Tumer, Nilgun E.

    2014-01-01

    Trichothecene mycotoxins are natural contaminants of small grain cereals and are encountered in the environment, posing a worldwide threat to human and animal health. Their mechanism of toxicity is poorly understood, and little is known about cellular protection mechanisms against trichothecenes. We previously identified inhibition of mitochondrial protein synthesis as a novel mechanism for trichothecene-induced cell death. To identify cellular functions involved in trichothecene resistance, we screened the Saccharomyces cerevisiae deletion library for increased sensitivity to nonlethal concentrations of trichothecin (Tcin) and identified 121 strains exhibiting higher sensitivity than the parental strain. The largest group of sensitive strains had significantly higher reactive oxygen species (ROS) levels relative to the parental strain. A dose-dependent increase in ROS levels was observed in the parental strain treated with different trichothecenes, but not in a petite version of the parental strain or in the presence of a mitochondrial membrane uncoupler, indicating that mitochondria are the main site of ROS production due to toxin exposure. Cytotoxicity of trichothecenes was alleviated after treatment of the parental strain and highly sensitive mutants with antioxidants, suggesting that oxidative stress contributes to trichothecene sensitivity. Cotreatment with rapamycin and trichothecenes reduced ROS levels and cytotoxicity in the parental strain relative to the trichothecene treatment alone, but not in mitophagy deficient mutants, suggesting that elimination of trichothecene-damaged mitochondria by mitophagy improves cell survival. These results reveal that increased mitophagy is a cellular protection mechanism against trichothecene-induced mitochondrial oxidative stress and a potential target for trichothecene resistance. PMID:25071194

  6. Recent Developments and Challenges Implementing New and Improved Stress Intensity Factor (K) Solutions in NASGRO for Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Cardinal, Joseph W.; McClung, R. Craig; Lee, Yi-Der; Guo, Yajun; Beek, Joachim M.

    2014-01-01

    Fatigue crack growth analysis software has been available to damage tolerance analysts for many years in either commercial products or via proprietary in-house codes. The NASGRO software has been publicly available since the mid-80s (known as NASA/FLAGRO up to 1999) and since 2000 has been sustained and further developed by a collaborative effort between Southwest Research Institute® (SwRI®), the NASA Johnson Space Center (JSC), and the members of the NASGRO Industrial Consortium. Since the stress intensity factor (K) is the foundation of fracture mechanics and damage tolerance analysis of aircraft structures, a significant focus of development efforts in the past fifteen years has been geared towards enhancing legacy K solutions and developing new and efficient numerical K solutions that can handle the complicated stress gradients computed by today’s analysts using detailed finite element models of fatigue critical locations. This paper provides an overview of K solutions that have been recently implemented or improved for the analysis of geometries such as two unequal through cracks at a hole and two unequal corner cracks at a hole, as well as state-of-the-art weight function models capable of computing K in the presence of univariant and/or bivariant stress gradients and complicated residual stress distributions. Some historical background is provided to review how common K solutions have evolved over the years, including selective examples from the literature and from new research. Challenges and progress in rectifying discrepancies between older legacy solutions and newer models are reviewed as well as approaches and challenges for verification and validation of K solutions. Finally, a summary of current challenges and future research and development needs is presented. A key theme throughout the presentation of this paper will be how members of the aerospace industry have collaborated with software developers to develop a practical analysis tool that is
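
    To make the weight-function idea named in this abstract concrete, the following minimal sketch computes a mode-I stress intensity factor from an arbitrary crack-face stress distribution using the classical closed-form Green's function for a center crack in an infinite plate. This is only an illustration of the concept: it is not the NASGRO implementation, and the hole and corner-crack geometries discussed above require the far more elaborate univariant/bivariant weight functions described in the paper. All numbers (crack size, stress levels, the stress-gradient shape) are invented for the example.

        import numpy as np

        def k_from_weight_function(stress_fn, a, n=2000):
            """Mode-I K for a center crack of half-length a in an infinite plate, loaded by a
            symmetric crack-face pressure sigma(x):
                K = 2*sqrt(a/pi) * integral_0^a sigma(x) / sqrt(a**2 - x**2) dx
            The substitution x = a*sin(theta) removes the integrable singularity at x = a."""
            theta = np.linspace(0.0, np.pi / 2.0, n)
            x = a * np.sin(theta)
            g = stress_fn(x)                       # integrand after substitution: sigma(a*sin(theta))
            dtheta = theta[1] - theta[0]
            integral = np.sum(0.5 * (g[:-1] + g[1:])) * dtheta
            return 2.0 * np.sqrt(a / np.pi) * integral

        a = 0.005                                        # half crack length, m (illustrative)
        uniform = lambda x: 100.0e6 * np.ones_like(x)    # uniform 100 MPa crack-face pressure
        gradient = lambda x: 100.0e6 * (1.0 - x / a)     # linearly decaying stress gradient
        print("K (uniform)  =", k_from_weight_function(uniform, a) / 1e6, "MPa*sqrt(m)")
        print("K (gradient) =", k_from_weight_function(gradient, a) / 1e6, "MPa*sqrt(m)")

    For the uniform case the sketch recovers the textbook result K = sigma*sqrt(pi*a), which is a quick sanity check on the integration.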

  7. Fatigue and Damage Tolerance Analysis of a Hybrid Composite Tapered Flexbeam

    NASA Technical Reports Server (NTRS)

    Murri, Gretchen B.; Schaff, Jeffrey R.; Dobyns, Al

    2001-01-01

    The behavior of nonlinear tapered composite flexbeams under combined axial tension and cyclic bending loading was studied using coupon test specimens and finite element (FE) analyses. The flexbeams used a hybrid material system of graphite/epoxy and glass/epoxy and had internal dropped plies, dropped in an overlapping stepwise pattern. Two material configurations, differing only in the use of glass or graphite plies in the continuous plies near the midplane, were studied. Test specimens were cut from a full-size helicopter tail-rotor flexbeam and were tested in a hydraulic load frame under combined constant axial-tension load and transverse cyclic bending loads. The first delamination damage observed in the specimens occurred at the area around the tip of the outermost ply-drop group in the tapered region of the flexbeam, near the thick end. Delaminations grew slowly and stably, toward the thick end of the flexbeam, at the interfaces above and below the dropped-ply region. A 2D finite element model of the flexbeam was developed. The model was analyzed using a geometrically non-linear analysis with both the ANSYS and ABAQUS FE codes. The global responses of each analysis agreed well with the test results. The ANSYS model was used to calculate strain energy release rates (G) for delaminations initiating at two different ply-ending locations. The results showed that delaminations were more inclined to grow at the locations where they were observed in the test specimens. Both ANSYS and ABAQUS were used to calculate G values associated with delamination initiating at the observed location but growing in different interfaces, either above or below the ply-ending group toward the thick end, or toward the thin end from the tip of the resin pocket. The different analysis codes generated the same trends and comparable peak values, within 5-11% for each delamination path. Both codes showed that delamination toward the thick region was largely mode II, and toward the thin
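
    The abstract reports strain energy release rates extracted from 2D finite element models but does not state the extraction method. A common way such G components are obtained is the virtual crack closure technique (VCCT); the sketch below shows that calculation only as a representative assumption, with purely hypothetical crack-tip forces and displacements, and is not taken from the cited analyses.

        def vcct_energy_release_rates(Fx, Fy, du, dv, delta_a, width):
            """One-step virtual crack closure estimate of the mode-II and mode-I energy release
            rates from crack-tip nodal forces (Fx, Fy) and the relative sliding/opening
            displacements (du, dv) of the node pair just behind the tip:
                G_I  = Fy * dv / (2 * delta_a * width)
                G_II = Fx * du / (2 * delta_a * width)"""
            area = 2.0 * delta_a * width
            G_I = Fy * dv / area
            G_II = Fx * du / area
            return G_I, G_II, G_I + G_II

        # Hypothetical crack-tip quantities (N, mm): element length 0.1 mm, unit width
        print(vcct_energy_release_rates(Fx=12.0, Fy=30.0, du=1.5e-4, dv=6.0e-4, delta_a=0.1, width=1.0))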

  8. Rad5 Template Switch Pathway of DNA Damage Tolerance Determines Synergism between Cisplatin and NSC109268 in Saccharomyces cerevisiae

    PubMed Central

    Jain, Dilip; Siede, Wolfram

    2013-01-01

    The success of cisplatin (CP) based therapy is often hindered by acquisition of CP resistance. We isolated NSC109268 as a compound altering cellular sensitivity to DNA damaging agents. Previous investigation revealed an enhancement of CP sensitivity by NSC109268 in wild-type Saccharomyces cerevisiae and CP-sensitive and -resistant cancer cell lines that correlated with a slower S phase traversal. Here, we extended these studies to determine the target pathway(s) of NSC109268 in mediating CP sensitization, using yeast as a model. We reasoned that mutants defective in the relevant target of NSC109268 should be hypersensitive to CP and the sensitization effect by NSC109268 should be absent or strongly reduced. A survey of various yeast deletion mutants converged on the Rad5 pathway of DNA damage tolerance by template switching as the likely target pathway of NSC109268 in mediating cellular sensitization to CP. Additionally, cell cycle delays following CP treatment were not synergistically influenced by NSC109268 in the CP hypersensitive rad5Δ mutant. The involvement of the known inhibitory activities of NSC109268 on 20S proteasome and phosphatases 2Cα and 2A was tested. In the CP hypersensitive ptc2Δptc3Δpph3Δ yeast strain, deficient for 2C and 2A-type phosphatases, cellular sensitization to CP by NSC109268 was greatly reduced. It is therefore suggested that NSC109268 affects CP sensitivity by inhibiting the activity of unknown protein(s) whose dephosphorylation is required for the template switch pathway. PMID:24130896

  9. Damage tolerance of pressurized graphite/epoxy tape cylinders under uniaxial and biaxial loading. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Priest, Stacy Marie

    1993-01-01

    The damage tolerance behavior of internally pressurized, axially slit, graphite/epoxy tape cylinders was investigated. Specifically, the effects of axial stress, structural anisotropy, and subcritical damage were considered. In addition, the limitations of a methodology which uses coupon fracture data to predict cylinder failure were explored. This predictive methodology was previously shown to be valid for quasi-isotropic fabric and tape cylinders but invalid for structurally anisotropic (+/-45/90)(sub s) and (+/-45/0)(sub s) cylinders. The effects of axial stress and structural anisotropy were assessed by testing tape cylinders with (90/0/+/-45)(sub s), (+/-45/90)(sub s), and (+/-45/0)(sub s) layups in a uniaxial test apparatus, specially designed and built for this work, and comparing the results to previous tests conducted in biaxial loading. Structural anisotropy effects were also investigated by testing cylinders with the quasi-isotropic (0/+/-45/90)(sub s) layup which is a stacking sequence variation of the previously tested (90/0/+/-45)(sub s) layup with higher D(sub 16) and D(sub 26) terms but comparable D(sub 16) and D(sub 26) to D(sub 11) ratios. All cylinders tested and used for comparison are made from AS4/3501-6 graphite/epoxy tape and have a diameter of 305 mm. Cylinder slit lengths range from 12.7 to 50.8 mm. Failure pressures are lower for the uniaxially loaded cylinders in all cases. The smallest percent failure pressure decreases are observed for the (+/-45/90)(sub s) cylinders, while the greatest such decreases are observed for the (+/-45/0)(sub s) cylinders. The relative effects of the axial stress on the cylinder failure pressures do not correlate with the degree of structural coupling. The predictive methodology is not applicable for uniaxially loaded (+/-45/90)(sub s) and (+/-45/0)(sub s) cylinders, may be applicable for uniaxially loaded (90/0/+/-45)(sub s) cylinders, and is applicable for the biaxially loaded (90/0/+/-45)(sub s) and (0

  10. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    PubMed Central

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-01-01

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ∼1 GPa, excellent ductility (∼60–70%) and exceptional fracture toughness (KJIc>200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip. PMID:26647978

  11. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    SciTech Connect

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-12-09

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ~1 GPa, excellent ductility (~60–70%) and exceptional fracture toughness (KJIc > 200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. In conclusion, we further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  12. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    DOE PAGES Beta

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-12-09

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ~1 GPa, excellent ductility (~60–70%) and exceptional fracture toughness (KJIc > 200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. In conclusion, we further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  13. Real-time immune cell interactions in target tissue during autoimmune-induced damage and graft tolerance

    PubMed Central

    Miska, Jason; Abdulreda, Midhat H.; Devarajan, Priyadharshini; Lui, Jen Bon; Suzuki, Jun; Pileggi, Antonello; Berggren, Per-Olof

    2014-01-01

    Real-time imaging studies are reshaping immunological paradigms, but a visual framework is lacking for self-antigen-specific T cells at the effector phase in target tissues. To address this issue, we conducted intravital, longitudinal imaging analyses of cellular behavior in nonlymphoid target tissues to illustrate some key aspects of T cell biology. We used mouse models of T cell–mediated damage and protection of pancreatic islet grafts. Both CD4+ and CD8+ effector T (Teff) lymphocytes directly engaged target cells. Strikingly, juxtaposed β cells lacking specific antigens were not subject to bystander destruction but grew substantially in days, likely by replication. In target tissue, Foxp3+ regulatory T (Treg) cells persistently contacted Teff cells with or without involvement of CD11c+ dendritic cells, an observation conciliating with the in vitro “trademark” of Treg function, contact-dependent suppression. This study illustrates tolerance induction by contact-based immune cell interaction in target tissues and highlights potentials of tissue regeneration under antigenic incognito in inflammatory settings. PMID:24567447

  14. Essential Roles of the Smc5/6 Complex in Replication through Natural Pausing Sites and Endogenous DNA Damage Tolerance.

    PubMed

    Menolfi, Demis; Delamarre, Axel; Lengronne, Armelle; Pasero, Philippe; Branzei, Dana

    2015-12-17

    The essential functions of the conserved Smc5/6 complex remain elusive. To uncover its roles in genome maintenance, we established Saccharomyces cerevisiae cell-cycle-regulated alleles that enable restriction of Smc5/6 components to S or G2/M. Unexpectedly, the essential functions of Smc5/6 segregated fully and selectively to G2/M. Genetic screens that became possible with generated alleles identified processes that crucially rely on Smc5/6 specifically in G2/M: metabolism of DNA recombination structures triggered by endogenous replication stress, and replication through natural pausing sites located in late-replicating regions. In the first process, Smc5/6 modulates remodeling of recombination intermediates, cooperating with dissolution activities. In the second, Smc5/6 prevents chromosome fragility and toxic recombination instigated by prolonged pausing and the fork protection complex, Tof1-Csm3. Our results thus dissect Smc5/6 essential roles and reveal that combined defects in DNA damage tolerance and pausing site-replication cause recombination-mediated DNA lesions, which we propose to drive developmental and cancer-prone disorders. PMID:26698660

  15. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi.

    PubMed

    Zhang, ZiJiao; Mao, M M; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X; George, Easo P; Yu, Qian; Ritchie, Robert O

    2015-01-01

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ∼ 1 GPa, excellent ductility (∼ 60-70%) and exceptional fracture toughness (KJIc>200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip. PMID:26647978

  16. Analysis of the Static and Fatigue Strength of a Damage Tolerant 3D-Reinforced Joining Technology on Composite Single Lap Joints

    NASA Astrophysics Data System (ADS)

    Nogueira, A. C.; Drechsler, K.; Hombergsmeier, E.

    2012-07-01

    The increasing use of carbon fiber reinforced plastics (CFRP) in aerospace, together with the constant drive for fuel efficiency and lightweight design, has imposed new challenges in next-generation structural assemblies and load-transfer-efficient joining methods. To address this issue, an innovative technology, termed Redundant High Efficiency Assembly (RHEA) joints, is introduced as a high-performance lightweight joint that combines efficient load transfer with good damage tolerance. A review of the ongoing research involving the RHEA joint technology, its through-thickness reinforcement concept and the results of quasi-static and fatigue tensile investigations of single lap shear specimens is presented and discussed. Improvements in ultimate static load, maximum joint deformation, damage tolerance and fatigue life are observed when comparing the performance of the RHEA lap shear joints to co-bonded reference specimens.

  17. Fracture assessment for electron beam welded damage tolerant Ti-6Al-4V alloy by the FITNET procedure

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Shi, Yaowu; Li, Xiaoyan; Lei, Yongping

    2013-09-01

    Fracture assessment of the cracked structures is essential to avoiding fracture failure. A number of fracture assessment procedures have been proposed for various steel structures. However, the studies about the application of available procedures for titanium alloy structures are scarcely reported. Fracture assessment for the electron beam(EB) welded thick-walled damage tolerant Ti-6Al-4V(TC4-DT) alloy is performed by the fitness-for-service(FFS) FITNET procedure. Uniaxial tensile tests and fracture assessment tests of the base metal and weld metal are carried out to obtain the input information of assessment. The standard options and advanced options of FITNET FFS procedure are used to the fracture assessment of the present material. Moreover, the predicted maximum loads of compact tensile specimen using FITNET FFS procedure are verified with the experimental data of fracture assessment tests. As a result, it is shown that the mechanical properties of weld metal are inhomogeneous along the weld depth. The mismatch ratio M is less than 10% at the weld top and middle, whereas more than 10% at the weld bottom. Failure assessment lines of standard options are close to that of advanced option, which means that the standard options are suitable for fracture assessment of the present welds. The accurate estimation of the maximum loads has been obtained by fracture assessment of standard options with error less than 6%. Furthermore, there are no potential advantages of applying higher options or mismatch options. Thus, the present welded joints can be treated as homogeneous material during the fracture assessment, and standard option 1 can be used to achieve accurate enough results. This research provides the engineering treatment methods for the fracture assessment of titanium alloy and its EB welds.
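
    As context for the "standard option" assessments mentioned above, the sketch below evaluates an Option 1-type failure assessment line of the kind published in R6/BS 7910/FITNET-style procedures. The coefficients are reproduced from commonly cited forms and should be treated as indicative rather than as the exact FITNET expressions used in the paper; the elastic modulus and strength values are generic handbook-level numbers for a Ti-6Al-4V-class alloy, not the measured weld properties.

        import numpy as np

        def fad_option1(Lr, E=110e3, sigma_y=900.0, sigma_u=960.0):
            """Approximate Option 1 failure assessment line f(Lr); a point (Lr, Kr) is
            acceptable if Kr <= f(Lr) and Lr <= Lr_max. Coefficients are indicative only."""
            mu = min(0.001 * E / sigma_y, 0.6)
            N = 0.3 * (1.0 - sigma_y / sigma_u)
            Lr_max = (sigma_y + sigma_u) / (2.0 * sigma_y)
            f_at_1 = (1.0 + 0.5) ** -0.5 * (0.3 + 0.7 * np.exp(-mu))
            f = np.where(
                Lr <= 1.0,
                (1.0 + 0.5 * Lr ** 2) ** -0.5 * (0.3 + 0.7 * np.exp(-mu * Lr ** 6)),
                f_at_1 * np.maximum(Lr, 1.0) ** ((N - 1.0) / (2.0 * N)),
            )
            return np.where(Lr <= Lr_max, f, 0.0)

        Lr = np.array([0.0, 0.5, 1.0, 1.05])
        print(dict(zip(Lr.tolist(), np.round(fad_option1(Lr), 3).tolist())))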

  18. Composites Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne

    2008-01-01

    This slide presentation reviews the development of MSFC-RQMT-3479 for requirements for fracture control of composites to be used in the Constellation program. This effort is part of the development of a revision of NASA-STD-5019(A), which will include MSFC-RQMT-3479. Examples of the requirement criteria and implementation are given.

  19. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories excluding maintenance related parameters from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection scheduling, POD and repair/replacement strategies. Since MC simulation is time-consuming, the simulations were run in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
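
    The brute-force baseline that RPI-type methods aim to avoid re-running looks, in outline, like the minimal sketch below: sample an initial flaw, grow it block by block, and apply a probability-of-detection curve at each scheduled inspection. Everything here (Paris-law constants, flaw-size distribution, stress range, POD parameters) is hypothetical and chosen only to make the simulation run; it is not the paper's model or data.

        import numpy as np

        rng = np.random.default_rng(0)

        def pod(a, a50=1.0, beta=4.0):
            # Hypothetical log-logistic probability-of-detection curve (crack size a in mm)
            return 1.0 / (1.0 + (a50 / np.maximum(a, 1e-9)) ** beta)

        def simulate_missed_detection(n_samples=100_000, n_blocks=20, inspect_every=5):
            """Fraction of growing cracks never detected before the final block.
            All constants are illustrative, with loose unit bookkeeping."""
            C, m = 1e-11, 3.0                # Paris-law constants (m/cycle, MPa*sqrt(m))
            dN = 50_000                      # cycles per usage block
            d_sigma = 120.0                  # stress range, MPa
            a = rng.lognormal(mean=np.log(0.2), sigma=0.5, size=n_samples)  # initial flaw, mm
            missed = np.ones(n_samples, dtype=bool)
            for block in range(1, n_blocks + 1):
                dK = d_sigma * np.sqrt(np.pi * a * 1e-3)   # crude delta-K (mm -> m)
                a = a + C * dK ** m * dN * 1e3             # crack growth per block, back to mm
                if block % inspect_every == 0:
                    detected = rng.random(n_samples) < pod(a)
                    missed &= ~detected      # stays "missed" only if every inspection misses it
            return missed.mean()

        print("P(all inspections miss the crack) ~", simulate_missed_detection())

    Changing the inspection schedule forces this whole simulation to be rerun, which is exactly the cost the recursive probability integration approach avoids by separating the crack-growth histories from the maintenance parameters.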

  20. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long term durability and reliability. There are several types of fatigue that must be considered in the design. These include low cycle, high cycle, combined for different cyclic loading conditions - for example, mechanical, thermal, erosion, etc. The traditional approach to evaluate fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time consuming, costly and needs to be repeated for designs in different operating conditions in general. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. Main features in this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. These generic features of this approach are to probabilistically telescope scale local material point damage all the way up to the structural component and to probabilistically scale decompose structural loads and boundary conditions all the way down to material point. Additional features include a multifactor interaction model that probabilistically describes material properties evolution, any changes due to various cyclic load and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the
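
    The multifactor interaction model mentioned above is usually described as a product of normalized factor terms, each raised to its own exponent. The sketch below shows that general product form as an illustration only; the specific factors, reference values and exponents are invented, and the actual model in the cited work couples many more effects (thermal, mechanical, chemical, cyclic).

        def multifactor_property(p_ref, factors):
            """Hedged sketch of a multifactor-interaction style property degradation model.
            Each factor is (current, reference, ultimate, exponent); values are illustrative."""
            p = p_ref
            for current, reference, ultimate, exponent in factors:
                p *= ((ultimate - current) / (ultimate - reference)) ** exponent
            return p

        # Example: fatigue strength degraded by temperature and accumulated cycles (made-up numbers)
        strength = multifactor_property(
            p_ref=450.0,                        # reference strength, MPa
            factors=[
                (400.0, 20.0, 1200.0, 0.5),     # temperature: current, reference, melting, exponent
                (1.0e6, 0.0, 1.0e7, 0.25),      # cycles: current, reference, cycles-to-failure, exponent
            ],
        )
        print(f"Degraded strength estimate: {strength:.1f} MPa")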

  1. Probabilistic fatigue methodology for six nines reliability

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf

    1990-01-01

    Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin which was established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are from the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods in defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
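
    A minimal sketch of the linear cumulative damage approach with probabilistic fatigue strength and load scatter is shown below. The spectrum, S-N parameters and scatter values are invented for illustration; the sketch also makes the practical point that direct sampling cannot resolve a one-in-a-million failure probability, which is why six-nines assessments rely on analytical or semi-analytical reliability methods rather than this kind of brute-force estimate.

        import numpy as np

        rng = np.random.default_rng(1)

        def probability_of_failure(n_trials=200_000, m=4.0):
            """Monte Carlo estimate of P(Miner damage >= 1) over one service life, with
            illustrative lognormal scatter on the S-N intercept and on load severity."""
            spectrum_S = np.array([300.0, 200.0, 120.0])   # stress amplitudes, MPa
            spectrum_n = np.array([1.0e3, 1.0e4, 1.0e5])   # applied cycles per service life
            A = rng.lognormal(np.log(1.0e15), 0.4, n_trials)   # S-N curve: N = A * S**(-m)
            load = rng.lognormal(0.0, 0.1, n_trials)           # scatter on load severity
            damage = np.zeros(n_trials)
            for S, n_cycles in zip(spectrum_S, spectrum_n):
                damage += n_cycles * (S * load) ** m / A       # Miner increment n_i / N_i
            return np.mean(damage >= 1.0)

        print("Estimated probability of failure per service life:", probability_of_failure())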

  2. Deterministic and Probabilistic Creep and Creep Rupture Enhancement to CARES/Creep: Multiaxial Creep Life Prediction of Ceramic Structures Using Continuum Damage Mechanics and the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1998-01-01

    High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and
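
    The basic uniaxial form of a Kachanov-Rabotnov type model couples a creep-strain rate to the evolution of a scalar damage variable, with rupture declared as the damage variable approaches unity. The sketch below time-marches that pair of rate equations; the constants are invented, the stress and temperature dependence is far simpler than in the CARES/Creep formulation, and no multiaxial or probabilistic treatment is included.

        def creep_rupture(stress, A=1e-16, n=5.0, B=1e-14, chi=5.0, phi=5.0, dt=1.0, w_crit=0.99):
            """Explicit time-march of a Kachanov-Rabotnov type damage model (illustrative constants):
                strain rate  de/dt = A * (sigma / (1 - w))**n
                damage rate  dw/dt = B * sigma**chi / (1 - w)**phi
            Rupture is declared once the scalar damage variable w reaches w_crit."""
            w, strain, t = 0.0, 0.0, 0.0
            while w < w_crit:
                strain += A * (stress / (1.0 - w)) ** n * dt
                w = min(w + B * stress ** chi / (1.0 - w) ** phi * dt, 1.0)
                t += dt
            return t, strain

        t_rupture, strain_rupture = creep_rupture(stress=100.0)
        print(f"rupture after ~{t_rupture:.0f} time units, accumulated creep strain ~{strain_rupture:.4f}")

    The tertiary (accelerating) creep regime missing from the earlier model emerges naturally here, because the effective stress sigma/(1 - w) grows as damage accumulates.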

  3. Mdt1, a Novel Rad53 FHA1 Domain-Interacting Protein, Modulates DNA Damage Tolerance and G2/M Cell Cycle Progression in Saccharomyces cerevisiae

    PubMed Central

    Pike, Brietta L.; Yongkiettrakul, Suganya; Tsai, Ming-Daw; Heierhorst, Jörg

    2004-01-01

    The Rad53 kinase plays a central role in yeast DNA damage checkpoints. Rad53 contains two FHA phosphothreonine-binding domains that are required for Rad53 activation and possibly downstream signaling. Here we show that the N-terminal Rad53 FHA1 domain interacts with the RNA recognition motif, coiled-coil, and SQ/TQ cluster domain-containing protein Mdt1 (YBl051C). The interaction of Rad53 and Mdt1 depends on the structural integrity of the FHA1 phosphothreonine-binding site as well as threonine-305 of Mdt1. Mdt1 is constitutively threonine phosphorylated and hyperphosphorylated in response to DNA damage in vivo. DNA damage-dependent Mdt1 hyperphosphorylation depends on the Mec1 and Tel1 checkpoint kinases, and Mec1 can directly phosphorylate a recombinant Mdt1 SQ/TQ domain fragment. MDT1 overexpression is synthetically lethal with a rad53 deletion, whereas mdt1 deletion partially suppresses the DNA damage hypersensitivity of checkpoint-compromised strains and generally improves DNA damage tolerance. In the absence of DNA damage, mdt1 deletion leads to delayed anaphase completion, with an elongated cell morphology reminiscent of that of G2/M cell cycle mutants. mdt1-dependent and DNA damage-dependent cell cycle delays are not additive, suggesting that they act in the same pathway. The data indicate that Mdt1 is involved in normal G2/M cell cycle progression and is a novel target of checkpoint-dependent cell cycle arrest pathways. PMID:15024067

  4. A Damage Tolerance Comparison of Composite Hat-Stiffened and Honeycomb Sandwich Structure for Launch Vehicle Interstage Applications

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.

    2011-01-01

    In this study, a direct comparison of the compression-after-impact (CAI) strength of impact-damaged, hat-stiffened and honeycomb sandwich structure for launch vehicle use was made. The specimens used consisted of small substructure designed to carry a line load of approximately 3,000 lb/in. Damage was inflicted upon the specimens via drop weight impact. Infrared thermography was used to examine the extent of planar damage in the specimens. The specimens were prepared for compression testing to obtain residual compression strength versus damage severity curves. Results show that when the weight of the structure is factored in, both types of structure had about the same CAI strength for a given damage level. The main difference was that the hat-stiffened specimens exhibited a multiphase failure whereas the honeycomb sandwich structure failed catastrophically.

  5. The Effects of Foam Thermal Protection System on the Damage Tolerance Characteristics of Composite Sandwich Structures for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Hodge, A. J.; Jackson, J. R.

    2011-01-01

    For any structure composed of laminated composite materials, impact damage is one of the greatest risks and therefore most widely tested responses. Typically, impact damage testing and analysis assumes that a solid object comes into contact with the bare surface of the laminate (the outer ply). However, most launch vehicle structures will have a thermal protection system (TPS) covering the structure for the majority of its life. Thus, the impact response of the material with the TPS covering is the impact scenario of interest. In this study, laminates representative of the composite interstage structure for the Ares I launch vehicle were impact tested with and without the planned TPS covering, which consists of polyurethane foam. Response variables examined include maximum load of impact, damage size as detected by nondestructive evaluation techniques, and damage morphology and compression after impact strength. Results show that there is little difference between TPS covered and bare specimens, except the residual strength data is higher for TPS covered specimens.

  6. Overexpression of AT14A confers tolerance to drought stress-induced oxidative damage in suspension cultured cells of Arabidopsis thaliana.

    PubMed

    Wang, Lin; He, Jie; Ding, Haidong; Liu, Hui; Lü, Bing; Liang, Jiansheng

    2015-07-01

    Drought stress can affect interaction between plant cell plasma membrane and cell wall. Arabidopsis AT14A, an integrin-like protein, mediates the cell wall-plasma membrane-cytoskeleton continuum (WMC continuum). To gain further insight into the function of AT14A, the role of AT14A in response to drought stress simulated by polyethylene glycol (PEG-6000) in Arabidopsis suspension cultures was investigated. The expression of this gene was induced by PEG-6000 resulting from reverse transcription-PCR, which was further confirmed by the expression data from publically available microarray datasets. Compared to the wild-type cells, overexpression of AT14A (AT14A-OE) in Arabidopsis cultures exhibited a greater ability to adapt to water deficit, as evidenced by higher biomass accumulation and cell survival rate. Furthermore, AT14A-OE cells showed a higher tolerance to PEG-induced oxidative damage, as reflected by less H2O2 content, lipid peroxidation (malondialdehyde (MDA) content), and ion leakage, which was further verified by maintaining high levels of activities of antioxidant defense enzymes such as ascorbate peroxidase and guaiacol peroxidase and soluble protein. Taken together, our results suggest that overexpression of AT14A improves drought stress tolerance and that AT14A is involved in suppressing oxidative damage under drought stress in part via regulation of antioxidant enzyme activities. PMID:25500719

  7. The role of quasi-plasticity in the extreme contact damage tolerance of the stomatopod dactyl club

    NASA Astrophysics Data System (ADS)

    Amini, Shahrouz; Tadayon, Maryam; Idapalapati, Sridhar; Miserez, Ali

    2015-09-01

    The structure of the stomatopod dactyl club--an ultrafast, hammer-like device used by the animal to shatter hard seashells--offers inspiration for impact-tolerant ceramics. Here, we present the micromechanical principles and related micromechanisms of deformation that impart the club with high impact tolerance. By using depth-sensing nanoindentation with spherical and sharp contact tips in combination with post-indentation residual stress mapping by Raman microspectroscopy, we show that the impact surface region of the dactyl club exhibits a quasi-plastic contact response associated with the interfacial sliding and rotation of fluorapatite nanorods, endowing the club with localized yielding. We also show that the subsurface layers exhibit strain hardening by microchannel densification, which provides additional dissipation of impact energy. Our findings suggest that the club’s macroscopic size is below the critical size above which Hertzian brittle cracks are nucleated.

  8. Probabilistic structural risk assessment for fatigue management using structural health monitoring

    NASA Astrophysics Data System (ADS)

    Shiao, Michael; Wu, Y.-T. J.; Ghoshal, Anindya; Ayers, James; Le, Dy

    2012-04-01

    The primary goal of Army Prognostics & Diagnostics is to develop real-time state awareness technologies for primary structural components. In fatigue-critical structural maintenance, the probabilistic structural risk assessment (PSRA) methodology for fatigue life management using conventional nondestructive inspection (NDI) has been developed based on the assumption of independent inspection outcomes. When using the emerging structural health monitoring (SHM) systems with in situ sensors, however, the independence assumption no longer holds, and the existing PSRA methodology must be modified. The major issues currently under investigation are how to properly address the correlated inspection outcomes from the same sensors on the same components and how to quantify their effect in the SHM-based PSRA framework. This paper describes a new SHM-based PSRA framework with a proper modeling of correlations among multiple inspection outcomes of the same structural component. The framework and the associated probabilistic algorithms are based on the principles of fatigue damage progression, NDI reliability assessment and structural reliability methods. The core of this framework is an innovative, computationally efficient, probabilistic method, RPI (Recursive Probability Integration), for damage tolerance and risk-based maintenance planning. RPI can incorporate a wide range of uncertainties including material properties, repair quality, crack growth related parameters, loads, and probability of detection. The RPI algorithm for SHM application is derived in detail. The effects of correlation strength and inspection frequency on the overall probability of missing all detections are also studied and discussed.
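
    To illustrate why correlated inspection outcomes matter for the probability of missing all detections, the sketch below compares independent and correlated inspections using a shared latent "same sensor, same flaw" factor (a simple Gaussian-copula construction). This is my own illustration of the effect, not the correlation model derived in the paper, and all numbers are hypothetical.

        import numpy as np
        from statistics import NormalDist

        rng = np.random.default_rng(2)

        def p_miss_all(n_inspections=6, pod=0.8, rho=0.5, n_samples=200_000):
            """Probability that every one of n_inspections misses the same flaw when each
            inspection has marginal probability of detection `pod` and the outcomes share a
            common latent factor with pairwise correlation rho. With rho = 0 this reduces to
            (1 - pod)**n_inspections."""
            z_pod = NormalDist().inv_cdf(pod)             # latent threshold giving the marginal POD
            shared = rng.standard_normal(n_samples)       # common factor (same sensor, same flaw)
            missed = np.ones(n_samples, dtype=bool)
            for _ in range(n_inspections):
                latent = np.sqrt(rho) * shared + np.sqrt(1.0 - rho) * rng.standard_normal(n_samples)
                missed &= latent >= z_pod                 # detection occurs when latent < threshold
            return missed.mean()

        for rho in (0.0, 0.5, 0.9):
            print(f"rho = {rho:.1f}  ->  P(miss all) ~ {p_miss_all(rho=rho):.5f}")

    Strong correlation inflates the chance that a flaw is missed by every inspection, which is why treating repeated SHM readings from one sensor as independent can badly overstate the benefit of frequent monitoring.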

  9. Probabilistic structural analysis methods for critical SSME propulsion components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The progress in the development of generic probabilistic models for various individual loads, which consist of a steady state load, a periodic load, a random load, and a spike, is discussed. The capabilities of the Numerical Evaluation of Stochastic Structures Under Stress finite element code designed for probabilistic structural analysis of the SSME are examined. Variational principles for formulating probabilistic finite elements and a structural analysis for evaluating the effects of geometric and material property tolerances on the structural response of turbopump blades are being developed.
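
    The four individual load types named above (steady state, periodic, random, spike) can be superimposed to form a composite load history. The sketch below does exactly that as a toy illustration; the amplitudes, frequency and spike statistics are invented and have no connection to actual SSME load spectra.

        import numpy as np

        rng = np.random.default_rng(3)

        def composite_load_spectrum(t, steady=100.0, periodic_amp=20.0, periodic_freq=50.0,
                                    random_sigma=8.0, spike_prob=1e-3, spike_amp=60.0):
            """Illustrative composite load: steady + periodic + random + rare spikes.
            The decomposition mirrors the four load types named in the abstract; all
            magnitudes are invented."""
            load = steady + periodic_amp * np.sin(2.0 * np.pi * periodic_freq * t)
            load += random_sigma * rng.standard_normal(t.size)
            spikes = rng.random(t.size) < spike_prob
            load[spikes] += spike_amp
            return load

        t = np.linspace(0.0, 1.0, 10_000)
        history = composite_load_spectrum(t)
        print("mean =", np.round(history.mean(), 2), " max =", np.round(history.max(), 2))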

  10. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
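
    For orientation, the failure-probability model at the heart of such ceramic reliability codes is the weakest-link Weibull form. The sketch below evaluates only the textbook two-parameter version for a uniformly stressed volume; CARES/Life itself integrates the stress field element by element and adds slow-crack-growth (time-dependent) and transient effects, none of which are represented here. Parameter values are illustrative.

        import numpy as np

        def weibull_failure_probability(stress, m=10.0, sigma_0=400.0, volume_ratio=1.0):
            """Two-parameter Weibull (weakest-link) failure probability for a uniformly
            stressed ceramic volume: Pf = 1 - exp(-(V/V0) * (sigma/sigma_0)**m)."""
            return 1.0 - np.exp(-volume_ratio * (stress / sigma_0) ** m)

        for s in (200.0, 300.0, 400.0):
            print(f"sigma = {s:.0f} MPa  ->  Pf ~ {weibull_failure_probability(s):.4f}")

    Making sigma_0, m, the loads and the geometry stochastic, as ANSYS/CARES/PDS does, turns even this failure probability into a distribution rather than a single number.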

  11. MELK-T1, a small-molecule inhibitor of protein kinase MELK, decreases DNA-damage tolerance in proliferating cancer cells.

    PubMed

    Beke, Lijs; Kig, Cenk; Linders, Joannes T M; Boens, Shannah; Boeckx, An; van Heerde, Erika; Parade, Marc; De Bondt, An; Van den Wyngaert, Ilse; Bashir, Tarig; Ogata, Souichi; Meerpoel, Lieven; Van Eynde, Aleyde; Johnson, Christopher N; Beullens, Monique; Brehmer, Dirk; Bollen, Mathieu

    2015-01-01

    Maternal embryonic leucine zipper kinase (MELK), a serine/threonine protein kinase, has oncogenic properties and is overexpressed in many cancer cells. The oncogenic function of MELK is attributed to its capacity to disable critical cell-cycle checkpoints and reduce replication stress. Most functional studies have relied on the use of siRNA/shRNA-mediated gene silencing. In the present study, we have explored the biological function of MELK using MELK-T1, a novel and selective small-molecule inhibitor. Strikingly, MELK-T1 triggered a rapid and proteasome-dependent degradation of the MELK protein. Treatment of MCF-7 (Michigan Cancer Foundation-7) breast adenocarcinoma cells with MELK-T1 induced the accumulation of stalled replication forks and double-strand breaks that culminated in a replicative senescence phenotype. This phenotype correlated with a rapid and long-lasting ataxia telangiectasia-mutated (ATM) activation and phosphorylation of checkpoint kinase 2 (CHK2). Furthermore, MELK-T1 induced a strong phosphorylation of p53 (cellular tumour antigen p53), a prolonged up-regulation of p21 (cyclin-dependent kinase inhibitor 1) and a down-regulation of FOXM1 (Forkhead Box M1) target genes. Our data indicate that MELK is a key stimulator of proliferation by its ability to increase the threshold for DNA-damage tolerance (DDT). Thus, targeting MELK by the inhibition of both its catalytic activity and its protein stability might sensitize tumours to DNA-damaging agents or radiation therapy by lowering the DNA-damage threshold. PMID:26431963

  12. MELK-T1, a small-molecule inhibitor of protein kinase MELK, decreases DNA-damage tolerance in proliferating cancer cells

    PubMed Central

    Beke, Lijs; Kig, Cenk; Linders, Joannes T. M.; Boens, Shannah; Boeckx, An; van Heerde, Erika; Parade, Marc; De Bondt, An; Van den Wyngaert, Ilse; Bashir, Tarig; Ogata, Souichi; Meerpoel, Lieven; Van Eynde, Aleyde; Johnson, Christopher N.; Beullens, Monique; Brehmer, Dirk; Bollen, Mathieu

    2015-01-01

    Maternal embryonic leucine zipper kinase (MELK), a serine/threonine protein kinase, has oncogenic properties and is overexpressed in many cancer cells. The oncogenic function of MELK is attributed to its capacity to disable critical cell-cycle checkpoints and reduce replication stress. Most functional studies have relied on the use of siRNA/shRNA-mediated gene silencing. In the present study, we have explored the biological function of MELK using MELK-T1, a novel and selective small-molecule inhibitor. Strikingly, MELK-T1 triggered a rapid and proteasome-dependent degradation of the MELK protein. Treatment of MCF-7 (Michigan Cancer Foundation-7) breast adenocarcinoma cells with MELK-T1 induced the accumulation of stalled replication forks and double-strand breaks that culminated in a replicative senescence phenotype. This phenotype correlated with a rapid and long-lasting ataxia telangiectasia-mutated (ATM) activation and phosphorylation of checkpoint kinase 2 (CHK2). Furthermore, MELK-T1 induced a strong phosphorylation of p53 (cellular tumour antigen p53), a prolonged up-regulation of p21 (cyclin-dependent kinase inhibitor 1) and a down-regulation of FOXM1 (Forkhead Box M1) target genes. Our data indicate that MELK is a key stimulator of proliferation by its ability to increase the threshold for DNA-damage tolerance (DDT). Thus, targeting MELK by the inhibition of both its catalytic activity and its protein stability might sensitize tumours to DNA-damaging agents or radiation therapy by lowering the DNA-damage threshold. PMID:26431963

  13. Damage tolerance modeling and validation of a wireless sensory composite panel for a structural health monitoring system

    NASA Astrophysics Data System (ADS)

    Talagani, Mohamad R.; Abdi, Frank; Saravanos, Dimitris; Chrysohoidis, Nikos; Nikbin, Kamran; Ragalini, Rose; Rodov, Irena

    2013-05-01

    The paper proposes the diagnostic and prognostic modeling and test validation of a Wireless Integrated Strain Monitoring and Simulation System (WISMOS). The effort verifies a hardware and web based software tool that is able to evaluate and optimize sensorized aerospace composite structures for the purpose of Structural Health Monitoring (SHM). The tool is an extension of an existing suite of an SHM system, based on a diagnostic-prognostic system (DPS) methodology. The goal of the extended SHM-DPS is to apply multi-scale nonlinear physics-based Progressive Failure analyses to the "as-is" structural configuration to determine residual strength, remaining service life, and future inspection intervals and maintenance procedures. The DPS solution meets the JTI Green Regional Aircraft (GRA) goals towards low weight, durable and reliable commercial aircraft. It will take advantage of the currently developed methodologies within the European Clean sky JTI project WISMOS, with the capability to transmit, store and process strain data from a network of wireless sensors (e.g. strain gages, FBGA) and utilize a DPS-based methodology, based on multi scale progressive failure analysis (MS-PFA), to determine structural health and to advise with respect to condition based inspection and maintenance. As part of the validation of the Diagnostic and prognostic system, Carbon/Epoxy ASTM coupons were fabricated and tested to extract the mechanical properties. Subsequently two composite stiffened panels were manufactured, instrumented and tested under compressive loading: 1) an undamaged stiffened buckling panel; and 2) a damaged stiffened buckling panel including an initial diamond cut. Next numerical Finite element models of the two panels were developed and analyzed under test conditions using Multi-Scale Progressive Failure Analysis (an extension of FEM) to evaluate the damage/fracture evolution process, as well as the identification of contributing failure modes. The comparisons

  14. Umbilical cord blood-derived stem cells improve heat tolerance and hypothalamic damage in heat stressed mice.

    PubMed

    Tseng, Ling-Shu; Chen, Sheng-Hsien; Lin, Mao-Tsun; Lin, Ying-Chu

    2014-01-01

    Heatstroke is characterized by excessive hyperthermia associated with systemic inflammatory responses, which leads to multiple organ failure, in which brain disorders predominate. This definition can be almost fulfilled by a mouse model of heatstroke used in the present study. Unanesthetized mice were exposed to whole body heating (41.2°C for 1 hour) and then returned to room temperature (26°C) for recovery. Immediately after termination of whole body heating, heated mice displayed excessive hyperthermia (body core temperature ~42.5°C). Four hours after termination of heat stress, heated mice displayed (i) systemic inflammation; (ii) ischemic, hypoxic, and oxidative damage to the hypothalamus; (iii) hypothalamo-pituitary-adrenocortical axis impairment (reflected by plasma levels of both adrenocorticotrophic-hormone and corticosterone); (iv) decreased fractional survival; and (v) thermoregulatory deficits (e.g., they became hypothermic when they were exposed to room temperature). These heatstroke reactions can be significantly attenuated by human umbilical cord blood-derived CD34(+) cell therapy. Our data suggest that human umbilical cord blood-derived stem cell therapy may improve outcomes of heatstroke in mice by reducing systemic inflammation as well as hypothalamo-pituitary-adrenocortical axis impairment. PMID:24804231

  15. Coordinated Changes in Antioxidative Enzymes Protect the Photosynthetic Machinery from Salinity Induced Oxidative Damage and Confer Salt Tolerance in an Extreme Halophyte Salvadora persica L.

    PubMed

    Rangani, Jaykumar; Parida, Asish K; Panda, Ashok; Kumari, Asha

    2016-01-01

    Salinity-induced modulations in growth, photosynthetic pigments, relative water content (RWC), lipid peroxidation, photosynthesis, photosystem II efficiency, and changes in activity of various antioxidative enzymes were studied in the halophyte Salvadora persica treated with various levels of salinity (0, 250, 500, 750, and 1000 mM NaCl) to obtain an insight into the salt tolerance ability of this halophyte. Both fresh and dry biomass as well as leaf area (LA) declined at all levels of salinity whereas salinity caused an increase in leaf succulence. A gradual increase was observed in the Na(+) content of leaf with increasing salt concentration up to 750 mM NaCl, but at higher salt concentration (1000 mM NaCl), the Na(+) content surprisingly dropped down to the level of 250 mM NaCl. The chlorophyll and carotenoid contents of the leaf remained unaffected by salinity. The photosynthetic rate (PN), stomatal conductance (gs), the transpiration rate (E), quantum yield of PSII (ΦPSII), photochemical quenching (qP), and electron transport rate remained unchanged at low salinity (250 to 500 mM NaCl) whereas, significant reduction in these parameters were observed at high salinity (750 to 1000 mM NaCl). The RWC% and water use efficiency (WUE) of leaf remained unaffected by salinity. The salinity had no effect on maximum quantum efficiency of PS II (Fv/Fm) which indicates that PS II is not perturbed by salinity-induced oxidative damage. Analysis of the isoforms of antioxidative enzymes revealed that the leaves of S. persica have two isoforms each of Mn-SOD and Fe-SOD and one isoform of Cu-Zn SOD, three isoforms of POX, two isoforms of APX and one isoform of CAT. There was differential responses in activity and expression of different isoforms of various antioxidative enzymes. The malondialdehyde (MDA) content (a product of lipid peroxidation) of leaf remained unchanged in S. persica treated with various levels of salinity. Our results suggest that the absence of pigment

  16. Coordinated Changes in Antioxidative Enzymes Protect the Photosynthetic Machinery from Salinity Induced Oxidative Damage and Confer Salt Tolerance in an Extreme Halophyte Salvadora persica L.

    PubMed Central

    Rangani, Jaykumar; Parida, Asish K.; Panda, Ashok; Kumari, Asha

    2016-01-01

    Salinity-induced modulations in growth, photosynthetic pigments, relative water content (RWC), lipid peroxidation, photosynthesis, photosystem II efficiency, and changes in activity of various antioxidative enzymes were studied in the halophyte Salvadora persica treated with various levels of salinity (0, 250, 500, 750, and 1000 mM NaCl) to obtain an insight into the salt tolerance ability of this halophyte. Both fresh and dry biomass as well as leaf area (LA) declined at all levels of salinity, whereas salinity caused an increase in leaf succulence. A gradual increase was observed in the Na+ content of the leaf with increasing salt concentration up to 750 mM NaCl, but at a higher salt concentration (1000 mM NaCl), the Na+ content surprisingly dropped to the level observed at 250 mM NaCl. The chlorophyll and carotenoid contents of the leaf remained unaffected by salinity. The photosynthetic rate (PN), stomatal conductance (gs), the transpiration rate (E), quantum yield of PSII (ΦPSII), photochemical quenching (qP), and electron transport rate remained unchanged at low salinity (250 to 500 mM NaCl), whereas significant reductions in these parameters were observed at high salinity (750 to 1000 mM NaCl). The RWC% and water use efficiency (WUE) of the leaf remained unaffected by salinity. Salinity had no effect on the maximum quantum efficiency of PS II (Fv/Fm), which indicates that PS II is not perturbed by salinity-induced oxidative damage. Analysis of the isoforms of antioxidative enzymes revealed that the leaves of S. persica have two isoforms each of Mn-SOD and Fe-SOD, one isoform of Cu-Zn SOD, three isoforms of POX, two isoforms of APX, and one isoform of CAT. There were differential responses in the activity and expression of different isoforms of the various antioxidative enzymes. The malondialdehyde (MDA) content (a product of lipid peroxidation) of the leaf remained unchanged in S. persica treated with various levels of salinity. Our results suggest that the absence of pigment

  17. DNA polymerase X from Deinococcus radiodurans implicated in bacterial tolerance to DNA damage is characterized as a short patch base excision repair polymerase.

    PubMed

    Khairnar, Nivedita P; Misra, Hari S

    2009-09-01

    The Deinococcus radiodurans R1 genome encodes an X-family DNA repair polymerase homologous to eukaryotic DNA polymerase beta. The recombinant deinococcal polymerase X (PolX) purified from transgenic Escherichia coli showed deoxynucleotidyltransferase activity. Unlike the Klenow fragment of E. coli, this enzyme showed short patch DNA synthesis activity on heteropolymeric DNA substrate. The recombinant enzyme showed 5'-deoxyribose phosphate (5'-dRP) lyase activity and base excision repair function in vitro, with the help of externally supplied glycosylase and AP endonuclease functions. A polX disruption mutant of D. radiodurans expressing 5'-dRP lyase and a truncated polymerase domain was comparatively less sensitive to gamma-radiation than a polX deletion mutant. Both mutants showed higher sensitivity to hydrogen peroxide. Excision repair mutants of E. coli expressing this polymerase showed functional complementation of UV sensitivity. These results suggest the involvement of deinococcal polymerase X in DNA-damage tolerance of D. radiodurans, possibly by contributing to DNA double-strand break repair and base excision repair. PMID:19542005

  18. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.
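
    The abstract above describes propagating random geometry and material variations through a blade model to set tolerance limits on the response. The following is a minimal Monte Carlo sketch of that idea in Python, assuming a simplified cantilever-beam surrogate for the blade and illustrative means and coefficients of variation; neither the numbers nor the frequency formula come from the NASA study.

      import numpy as np

      rng = np.random.default_rng(0)
      n_samples = 20_000

      # Illustrative random inputs (means and scatter are assumptions, not NASA data):
      # blade thickness t, length L, Young's modulus E, density rho.
      t   = rng.normal(3.0e-3, 0.05 * 3.0e-3, n_samples)   # m
      L   = rng.normal(5.0e-2, 0.02 * 5.0e-2, n_samples)   # m
      E   = rng.normal(200e9,  0.04 * 200e9,  n_samples)   # Pa
      rho = rng.normal(8200.0, 0.03 * 8200.0, n_samples)   # kg/m^3

      # First bending frequency of a rectangular-section cantilever, used here
      # as a crude surrogate for the blade's structural response.
      freq = (1.875**2 / (2 * np.pi)) * np.sqrt(E * t**2 / (12 * rho)) / L**2

      print(f"mean frequency: {freq.mean():8.1f} Hz")
      print(f"std  frequency: {freq.std():8.1f} Hz")
      # Tolerance limits taken as the central 99% of the response distribution.
      lo, hi = np.percentile(freq, [0.5, 99.5])
      print(f"99% tolerance band: [{lo:.1f}, {hi:.1f}] Hz")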

  19. Amelioration of ER stress by 4-phenylbutyric acid reduces chronic hypoxia induced cardiac damage and improves hypoxic tolerance through upregulation of HIF-1α.

    PubMed

    Jain, Kanika; Suryakumar, Geetha; Ganju, Lilly; Singh, Shashi Bala

    2016-08-01

    While endoplasmic reticulum (ER) stress has been observed in several human diseases, few studies have reported the involvement of ER stress in chronic hypoxia (CH)-induced cardiac damage. Hypoxia, such as that prevalent at high altitude (HA), forms the underlying cause of several maladies including cardiovascular diseases. While the role of hypoxia inducible factor-1 (HIF-1α) in the adaptive responses to hypoxia is known, the role of the unfolded protein response (UPR) is only recently being explored in HA pathophysiologies. The present study investigates the effect of ER stress modulation on CH-mediated injury and the cardioprotective action of 4-phenylbutyric acid (PBA) in enhancing the survival response under hypoxia. Here, we observed that exposure of rats to a simulated altitude of 7620 m for 1, 7, and 14 days of CH led to cardiac hypertrophy and significant protein oxidation. This induced the activation of UPR signaling mechanisms, mediated by PERK, IRE1α and ATF6. By 14 days, there was a marked upregulation of apoptosis, evident in increased CHOP and caspase-3/9 activity. PBA reduced CH-induced right ventricular enlargement and apoptosis. Further, in contrast to tunicamycin, PBA considerably enhanced hypoxic tolerance. An elevation in the level of antioxidant enzymes, HIF-1α and its regulated proteins (HO-1, GLUT-1) was observed in the PBA-administered animals, along with a concomitant suppression of UPR markers. Our study thus emphasizes the attenuation of ER stress by PBA as a mechanism to diminish CH-induced cardiac injury and boost hypoxic survival, providing an insight into the novel relationship between HIF-1α and the UPR under hypoxia. PMID:27058435

  20. Damage tolerant light absorbing material

    DOEpatents

    Lauf, R.J.; Hamby, C. Jr.; Akerman, M.A.; Seals, R.D.

    1993-09-07

    A light absorbing article comprised of a composite of carbon-bonded carbon fibers is prepared by: blending carbon fibers with a carbonizable organic powder to form a mixture; dispersing the mixture into an aqueous slurry; vacuum molding the aqueous slurry to form a green article; drying and curing the green article to form a cured article; and carbonizing the cured article at a temperature of at least about 1000°C to form a carbon-bonded carbon fiber light absorbing composite article having a bulk density less than 1 g/cm³. 9 figures.

  1. Damage tolerant light absorbing material

    DOEpatents

    Lauf, Robert J.; Hamby, Jr., Clyde; Akerman, M. Alfred; Seals, Roland D.

    1993-01-01

    A light absorbing article comprised of a composite of carbon-bonded carbon fibers, prepared by: blending carbon fibers with a carbonizable organic powder to form a mixture; dispersing the mixture into an aqueous slurry; vacuum molding the aqueous slurry to form a green article; drying and curing the green article to form a cured article; and, carbonizing the cured article at a temperature of at least about 1000°C to form a carbon-bonded carbon fiber light absorbing composite article having a bulk density less than 1 g/cm³.

  2. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  3. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions, based solely on their own state, that ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
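
    As a rough illustration of the decentralized idea described above, the sketch below (an assumption-laden Python toy, not the authors' guidance law) lets each agent run an independent Metropolis-Hastings chain over a small set of bins whose stationary distribution is the prescribed swarm density; the empirical bin occupancy converges to the target without any inter-agent communication.

      import numpy as np

      rng = np.random.default_rng(1)

      # Desired swarm density over a 1-D ring of bins (illustrative target).
      target = np.array([0.05, 0.10, 0.20, 0.30, 0.20, 0.10, 0.05])
      n_bins = len(target)
      n_agents, n_steps = 5000, 200

      # Each agent starts in bin 0 and, independently of all others, runs a
      # Metropolis-Hastings chain whose stationary distribution is `target`.
      bins = np.zeros(n_agents, dtype=int)
      for _ in range(n_steps):
          step = rng.choice([-1, 1], size=n_agents)        # propose a neighboring bin
          proposal = (bins + step) % n_bins
          accept = rng.random(n_agents) < np.minimum(1.0, target[proposal] / target[bins])
          bins = np.where(accept, proposal, bins)

      empirical = np.bincount(bins, minlength=n_bins) / n_agents
      print("target   :", np.round(target, 3))
      print("empirical:", np.round(empirical, 3))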

  4. The desert moss Pterygoneurum lamellatum (Pottiaceae) exhibits an inducible ecological strategy of desiccation tolerance: effects of rate of drying on shoot damage and regeneration

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Premise of the study: Bryophytes are regarded as a clade incorporating constitutive desiccation tolerance, especially terrestrial species. Here we test the hypothesis that the opposing ecological strategy of desiccation tolerance, inducibility, is present in a desert moss, and addressed by varying r...

  5. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all the three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into the monetary loss.

  6. Probabilistic protocols in quantum information science: Use and abuse

    NASA Astrophysics Data System (ADS)

    Caves, Carlton

    2014-03-01

    Protocols in quantum information science often succeed with less than unit probability, but nonetheless perform useful tasks because success occurs often enough to make tolerable the overhead from having to perform the protocol several times. Any probabilistic protocol must be analyzed from the perspective of the resources required to make the protocol succeed. I present results from analyses of two probabilistic protocols: (i) nondeterministic (or immaculate) linear amplification, in which an input coherent state is amplified some of the time to a larger-amplitude coherent state, and (ii) probabilistic quantum metrology, in which one attempts to improve estimation of a parameter (or parameters) by post-selecting on a particular outcome. The analysis indicates that there is little to be gained from probabilistic protocols in these two situations.

  7. Tolerating Zero Tolerance?

    ERIC Educational Resources Information Center

    Moore, Brian N.

    2010-01-01

    The concept of zero tolerance dates back to the mid-1990s when New Jersey was creating laws to address nuisance crimes in communities. The main goal of these neighborhood crime policies was to have zero tolerance for petty crime such as graffiti or littering so as to keep more serious crimes from occurring. Next came the war on drugs. In federal…

  8. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage, yet reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions. For model evaluation we use empirical damage data available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006, and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and sharpness of the predictions as well as reliability, which is represented by the proportion of observations that fall within the 95-quantile and 5-quantile predictive interval. The comparison of the uni-variable stage-damage function and the multi-variable model approach emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty. However, the predictive performance in terms of precision (mbe), accuracy (mae) and reliability (HR) is clearly improved
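
    The following Python sketch illustrates the kind of evaluation the abstract describes: a bagging-decision-tree loss model whose ensemble spread yields predictive intervals, scored with mean bias, mean absolute error, and the hit rate of the 5-95% interval. The data here are synthetic stand-ins (water depth, inundation duration, building class), not the German survey records, and all model settings are assumptions.

      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(2)

      # Synthetic stand-in for building-level damage records: water depth (m),
      # inundation duration (h), building class -> relative loss in [0, 1].
      n = 2000
      X = np.column_stack([rng.uniform(0, 3, n), rng.uniform(1, 120, n), rng.integers(1, 4, n)])
      rloss = np.clip(0.15 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(0, 0.05, n), 0, 1)

      # Split-sample test: one half derives the model, the other evaluates it.
      X_tr, X_te, y_tr, y_te = train_test_split(X, rloss, test_size=0.5, random_state=0)
      model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100, random_state=0).fit(X_tr, y_tr)

      # Per-building predictive distribution from the individual trees.
      member = np.stack([tree.predict(X_te) for tree in model.estimators_])   # (trees, buildings)
      y_hat = member.mean(axis=0)
      q05, q95 = np.percentile(member, [5, 95], axis=0)

      mbe = np.mean(y_hat - y_te)                      # mean bias error
      mae = np.mean(np.abs(y_hat - y_te))              # mean absolute error
      hr  = np.mean((y_te >= q05) & (y_te <= q95))     # hit rate of the 5-95% interval
      print(f"MBE={mbe:.3f}  MAE={mae:.3f}  HR={hr:.2f}")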

  9. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
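
    A minimal sketch of the weight-to-probability step described above, assuming illustrative m- and u-probabilities for three comparison fields and an assumed prior match rate (none of these values come from the article): field agreement and disagreement weights are summed and converted to a posterior match probability with Bayes theorem.

      import numpy as np

      # Fellegi-Sunter style match weights for three comparison fields
      # (m = P(field agrees | true match), u = P(field agrees | non-match)).
      # The m/u values and the prior are illustrative assumptions.
      m = np.array([0.95, 0.90, 0.85])   # e.g., surname, date of birth, postcode
      u = np.array([0.01, 0.05, 0.10])
      prior_match = 1e-4                 # prior probability that a random pair is a match

      def posterior_match(agree):
          """Convert a field-agreement pattern into a posterior match probability."""
          agree = np.asarray(agree, dtype=bool)
          # log2 agreement/disagreement weights, summed over fields
          weights = np.where(agree, np.log2(m / u), np.log2((1 - m) / (1 - u)))
          total_weight = weights.sum()
          # Bayes theorem applied to the likelihood ratio 2**total_weight
          lr = 2.0 ** total_weight
          odds = lr * prior_match / (1 - prior_match)
          return odds / (1 + odds), total_weight

      for pattern in [(1, 1, 1), (1, 1, 0), (0, 1, 0)]:
          p, w = posterior_match(pattern)
          print(f"agreement {pattern}: weight={w:+6.2f}  P(match)={p:.4f}")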

  10. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

    A microcell is a cell with a radius of 1 km or less, suitable for heavily urbanized areas such as a metropolitan city. This paper deals with a microcell prediction model of propagation loss that uses probabilistic techniques. The RSL (Receive Signal Level) is the factor used to evaluate the performance of a microcell, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. We combine probabilistic methods to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SPC (Statistical Process Control) to estimate the parameters of the distribution. This probabilistic solution gives a better measure of the performance factors. In addition, it allows probabilistic optimization of strategies such as the number of cells, cell location, cell capacity, cell range, and so on. In particular, the probabilistic optimization techniques themselves can be applied to real-world problems such as computer networking, human resources, and manufacturing processes.

  11. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background: The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results: We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions: The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  12. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.

  13. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions cast in a probabilistic framework. For model evaluation we use empirical damage data available from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006, and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of reliability, which is represented by the proportion of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.

  14. Probabilistic Approaches: Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials, fabrication process through composite mechanics, and structural component. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90 percent of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide spread scatter at 90 percent cyclic-stress to static-strength ratios.

  15. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogously to other structural analysis codes such as finite elements in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  16. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  17. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials, fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  18. Probabilistic composite analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.

    1991-01-01

    Formal procedures are described which are used to computationally simulate the probabilistic behavior of composite structures. The computational simulation starts with the uncertainties associated with all aspects of a composite structure (constituents, fabrication, assembling, etc.) and encompasses all aspects of composite behavior (micromechanics, macromechanics, combined stress failure, laminate theory, structural response, and tailoring) optimization. Typical cases are included to illustrate the formal procedure for computational simulation. The collective results of the sample cases demonstrate that uncertainties in composite behavior and structural response can be probabilistically quantified.

  19. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  20. Probabilistic, Multidimensional Unfolding Analysis

    ERIC Educational Resources Information Center

    Zinnes, Joseph L.; Griggs, Richard A.

    1974-01-01

    Probabilistic assumptions are added to single and multidimensional versions of the Coombs unfolding model for preferential choice (Coombs, 1950) and practical ways of obtaining maximum likelihood estimates of the scale parameters and goodness-of-fit tests of the model are presented. A Monte Carlo experiment is discussed. (Author/RC)

  1. Probabilistic Safety Assessment of Tehran Research Reactor

    SciTech Connect

    Hosseini, Seyed Mohammad Hadi; Nematollahi, Mohammad Reza; Sepanloo, Kamran

    2004-07-01

    Probabilistic Safety Assessment (PSA) application is found to be a practical tool for research reactor safety due to the intense involvement of human interactions in an experimental facility. In this paper the application of Probabilistic Safety Assessment to the Tehran Research Reactor (TRR) is presented. The level 1 PSA application involved: familiarization with the plant, selection of accident initiators, mitigating functions and system definitions, event tree construction and quantification, fault tree construction and quantification, human reliability, component failure database development, and dependent failure analysis. Each of the steps of the analysis given above is discussed with highlights from selected results. Quantification of the constructed models is done using the SAPHIRE software. This study shows that the obtained core damage frequency for the Tehran Research Reactor (8.368 E-6 per year) is well within the IAEA criterion for existing nuclear power plants (1E-4). However, safety improvement suggestions are offered to reduce the frequency of the most probable accidents. (authors)
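
    As a toy illustration of how a level 1 PSA combines initiating-event frequencies with mitigation failure probabilities into a core damage frequency, consider the following sketch; all numbers are invented for illustration and the logic is a single simplified event tree, not the TRR model quantified with SAPHIRE.

      # Minimal event-tree quantification sketch (all numbers are illustrative,
      # not taken from the TRR PSA).
      initiating_event_freq = 0.1        # loss-of-flow events per reactor-year
      p_scram_fails   = 1e-3             # probability the reactor trip fails
      p_cooling_fails = 5e-3             # probability emergency cooling fails

      # Core damage occurs if, given the initiator, either mitigation fails
      # (simple series success logic; a real PSA would quantify fault trees per branch).
      p_mitigation_fails = 1 - (1 - p_scram_fails) * (1 - p_cooling_fails)
      core_damage_frequency = initiating_event_freq * p_mitigation_fails
      print(f"core damage frequency = {core_damage_frequency:.2e} per year")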

  2. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  3. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  4. A probabilistic multi-class classifier for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Mechbal, Nazih; Uribe, Juan Sebastian; Rébillat, Marc

    2015-08-01

    In this paper, a probabilistic multi-class pattern recognition algorithm is developed for damage monitoring of smart structures. As these structures can face damages of different severities located in various positions, multi-class classifiers are needed. We propose an original support vector machine (SVM) multi-class clustering algorithm that is based on a probabilistic decision tree (PDT) that produces a posteriori probabilities associated with damage existence, location and severity. The PDT is built by iteratively subdividing the surface and thus takes into account the structure geometry. The proposed algorithm is very appealing as it combines both the computational efficiency of tree architectures and the SVMs classification accuracy. Damage sensitive features are computed using an active approach based on the permanent emission of non-resonant Lamb waves into the structure and on the recognition of amplitude disturbed diffraction patterns. The effectiveness of this algorithm is illustrated experimentally on a composite plate instrumented with piezoelectric elements.

  5. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication. However, in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue and each dialogue requires multiple quantum as well as classical communicational steps; (4) nevertheless, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.

  6. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  7. Geothermal probabilistic cost study

    SciTech Connect

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  8. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  9. Probabilistic simple splicing systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2014-06-01

    A splicing system, one of the early theoretical models for DNA computing was introduced by Head in 1987. Splicing systems are based on the splicing operation which, informally, cuts two strings of DNA molecules at the specific recognition sites and attaches the prefix of the first string to the suffix of the second string, and the prefix of the second string to the suffix of the first string, thus yielding the new strings. For a specific type of splicing systems, namely the simple splicing systems, the recognition sites are the same for both strings of DNA molecules. It is known that splicing systems with finite sets of axioms and splicing rules only generate regular languages. Hence, different types of restrictions have been considered for splicing systems in order to increase their computational power. Recently, probabilistic splicing systems have been introduced where the probabilities are initially associated with the axioms, and the probabilities of the generated strings are computed from the probabilities of the initial strings. In this paper, some properties of probabilistic simple splicing systems are investigated. We prove that probabilistic simple splicing systems can also increase the computational power of the splicing languages generated.

  10. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismically resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages of implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
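
    The Green's function summation idea described above can be sketched in a few lines: pre-computed unit-slip waveforms for each subfault are stored and then linearly combined, weighted by slip, for any rupture scenario. The example below uses synthetic Gaussian pulses as stand-in Green's functions; a real implementation would store full tsunami waveform simulations at each coastal point.

      import numpy as np

      # Pre-computed unit-slip tsunami waveforms at one coastal site for each
      # subfault: shape (n_subfaults, n_time). Here they are synthetic pulses;
      # in practice they come from full waveform simulations.
      rng = np.random.default_rng(3)
      n_sub, n_t = 12, 600
      t = np.linspace(0, 6000, n_t)                                         # seconds
      arrivals = rng.uniform(1200, 3000, n_sub)
      green = np.exp(-0.5 * ((t[None, :] - arrivals[:, None]) / 120.0) ** 2)  # m per m of slip

      def site_waveform(slip):
          """Tsunami waveform for an arbitrary slip distribution (m) by weighted
          summation of the stored subfault Green's functions (linearity assumed)."""
          return slip @ green

      slip = rng.uniform(0.0, 8.0, n_sub)          # one scenario's slip distribution
      eta = site_waveform(slip)
      print(f"peak coastal amplitude for this scenario: {eta.max():.2f} m")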

  11. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability-but not the consequences-of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. PMID:26215051
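
    A highly simplified Monte Carlo sketch of the kind of probabilistic risk assessment described above is given below. The impact rate, diameter distribution, and casualty model are placeholders chosen only to show how probabilistic inputs propagate into a casualty distribution and exceedance probabilities; they are not the article's calibrated inputs.

      import numpy as np

      rng = np.random.default_rng(4)
      n_trials, horizon_years = 50_000, 100

      # Illustrative (uncalibrated) inputs: annual impact rate of NEOs above
      # ~30 m, a heavy-tailed diameter distribution, and a casualty model that
      # scales with impact energy and whether the impact hits a populated area.
      annual_rate = 0.02
      casualties = np.zeros(n_trials)
      for i in range(n_trials):
          n_impacts = rng.poisson(annual_rate * horizon_years)
          for _ in range(n_impacts):
              diameter = 30.0 * (1.0 - rng.random()) ** (-1.0 / 2.5)   # m, Pareto tail
              energy = (diameter / 30.0) ** 3                          # relative impact energy
              if rng.random() < 0.03:                                  # impact hits a populated area
                  casualties[i] += rng.lognormal(np.log(1e3 * energy), 1.0)

      print(f"P(any casualties in {horizon_years} yr) = {np.mean(casualties > 0):.3f}")
      print(f"mean casualties = {casualties.mean():.0f}, "
            f"99th percentile = {np.percentile(casualties, 99):.0f}")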

  12. Global/local methods for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.

    1993-01-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a local more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  13. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially when more formal workflows are used. In this paper we focus on probabilistic workflows and present algorithms for time computations in them. With activity times modeled more precisely, we can achieve improvements in work cooperation and in analyses of cooperation, including simulation and visualization.

  14. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  15. Probabilistic analysis of mechanical systems

    SciTech Connect

    Priddy, T.G.; Paez, T.L.; Veers, P.S.

    1993-09-01

    This paper proposes a framework for the comprehensive analysis of complex problems in probabilistic structural mechanics. Tools that can be used to accurately estimate the probabilistic behavior of mechanical systems are discussed, and some of the techniques proposed in the paper are developed and used in the solution of a problem in nonlinear structural dynamics.

  16. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case-connecting the probability of a configuration in the stationary distribution to its number of zero-one borders-the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
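
    To make the synchronous case concrete, the sketch below simulates a one-dimensional probabilistic cellular automaton with a noisy majority rule (a toy rule chosen for illustration, not the specific automaton analyzed in the article) and estimates the long-run density of ones, a crude proxy for the bias of the stationary distribution toward the all-ones configuration.

      import numpy as np

      rng = np.random.default_rng(5)

      def step(config, p_flip=0.05):
          """One synchronous update of a 1-D probabilistic majority-rule automaton:
          each cell adopts the majority of its 3-cell neighborhood, but with
          probability p_flip the update is inverted (the stochastic element)."""
          left, right = np.roll(config, 1), np.roll(config, -1)
          majority = (left + config + right) >= 2
          noise = rng.random(config.size) < p_flip
          return (majority ^ noise).astype(int)

      def run(n_cells=101, n_steps=400, density_ones=0.55):
          config = (rng.random(n_cells) < density_ones).astype(int)
          for _ in range(n_steps):
              config = step(config)
          return config.mean()

      # Estimate the long-run fraction of ones over many independent runs.
      final_densities = [run() for _ in range(200)]
      print(f"mean final density of ones: {np.mean(final_densities):.2f}")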

  17. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationship between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to consistent definition of probabilities for well formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations facilitating the model generations and verifications via quantum mechanical superpositions and entanglements. We cast the well formed formulae of the language as quantum mechanical observables thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  18. Probabilistic Finite Element: Variational Theory

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.

    1985-01-01

    The goal of this research is to provide techniques which are cost-effective and enable the engineer to evaluate the effect of uncertainties in complex finite element models. Embedding the probabilistic aspects in a variational formulation is a natural approach. In addition, a variational approach to probabilistic finite elements enables it to be incorporated within standard finite element methodologies. Therefore, once the procedures are developed, they can easily be adapted to existing general purpose programs. Furthermore, the variational basis for these methods enables them to be adapted to a wide variety of structural elements and to provide a consistent basis for incorporating probabilistic features in many aspects of the structural problem. Completed tasks include the theoretical development of probabilistic variational equations for structural dynamics, the development of efficient numerical algorithms for probabilistic sensitivity displacement and stress analysis, and integration of methodologies into a pilot computer code.

  19. The IEEE eighteenth international symposium on fault-tolerant computing (Digest of Papers)

    SciTech Connect

    Not Available

    1988-01-01

    These proceedings collect papers on fault detection and fault-tolerant computing. Topics include: software failure behavior, fault-tolerant distributed programs, parallel simulation of faults, concurrent built-in self-test techniques, fault-tolerant parallel processor architectures, probabilistic fault diagnosis, fault tolerance in hypercube processors, and cellular automata modeling.

  20. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
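
    The following Monte Carlo sketch mimics the structure of such a model: random nucleation life, nucleated crack size, small-crack life, and Paris-law constants are sampled per realization and combined into a total-life distribution. All distributions and parameter values are illustrative assumptions, and a closed-form Paris-law integration replaces the slip-band and crack-tip-opening-displacement models used in the paper.

      import numpy as np

      rng = np.random.default_rng(6)
      n_sim = 50_000

      # Illustrative stand-ins for the three life regimes described in the abstract.
      # Microstructural scatter enters through random nucleation life, nucleated
      # crack size and the Paris-law coefficient (all values are assumptions).
      stress = 400.0                                    # MPa, applied stress amplitude
      a_crit = 5e-3                                     # m, critical crack length
      m_paris = 3.0

      N_nucleation = rng.lognormal(mean=np.log(2e4), sigma=0.5, size=n_sim)   # cycles
      a_0 = rng.lognormal(mean=np.log(20e-6), sigma=0.3, size=n_sim)          # m, nucleated size
      N_small = rng.lognormal(mean=np.log(1e4), sigma=0.4, size=n_sim)        # cycles
      C = rng.lognormal(mean=np.log(5e-12), sigma=0.3, size=n_sim)            # Paris coefficient

      # Long-crack growth: integrate da/dN = C (dK)^m with dK = Y*stress*sqrt(pi*a)
      Y = 1.12
      geom = C * (Y * stress * np.sqrt(np.pi)) ** m_paris
      N_long = (a_0 ** (1 - m_paris / 2) - a_crit ** (1 - m_paris / 2)) / (geom * (m_paris / 2 - 1))

      N_total = N_nucleation + N_small + N_long
      print(f"median life = {np.median(N_total):.3e} cycles")
      print(f"0.1% quantile life = {np.percentile(N_total, 0.1):.3e} cycles")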

  1. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.

  2. Probabilistic fracture finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-01-01

    Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles for mechanical and structural components. The Probabilistic Finite Element Method (PFEM), which is based on second moment analysis, has proved to be a promising, practical approach to handle problems with uncertainties. As the PFEM provides a powerful computational tool to determine the first and second moments of random parameters, the second moment reliability method can be easily combined with PFEM to obtain measures of the reliability of the structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed from experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.

  3. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
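
    A minimal Python sketch of this kind of Monte Carlo micromechanics is shown below, assuming a simple rule-of-mixtures longitudinal modulus with illustrative scatter in constituent moduli, fiber and void volume ratios, and fiber misalignment (a cos^4 knock-down is used as a crude misalignment correction); the values are placeholders, not the graphite/epoxy data from the study.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000

      # Illustrative constituent-level scatter for a unidirectional ply (the
      # distributions and nominal values are assumptions, not measured data).
      E_f = rng.normal(230e9, 10e9, n)                        # fiber modulus, Pa
      E_m = rng.normal(3.5e9, 0.3e9, n)                       # matrix modulus, Pa
      k_f = np.clip(rng.normal(0.60, 0.03, n), 0.4, 0.7)      # fiber volume ratio
      k_v = np.clip(rng.normal(0.02, 0.01, n), 0.0, 0.08)     # void volume ratio
      theta = rng.normal(0.0, np.radians(2.0), n)             # fiber misalignment, rad

      # Longitudinal ply modulus from the rule of mixtures, degraded by voids and
      # by misalignment (the cos^4 factor is a simple approximate knock-down).
      E11 = (k_f * E_f + (1 - k_f - k_v) * E_m) * np.cos(theta) ** 4

      print(f"mean E11 = {E11.mean()/1e9:.1f} GPa, CoV = {E11.std()/E11.mean():.3f}")
      print(f"1st-percentile E11 = {np.percentile(E11, 1)/1e9:.1f} GPa")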

  4. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  5. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  6. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  7. Probabilistic methods for structural response analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.

  8. Trehalose pretreatment induces salt tolerance in rice (Oryza sativa L.) seedlings: oxidative damage and co-induction of antioxidant defense and glyoxalase systems.

    PubMed

    Mostofa, Mohammad Golam; Hossain, Mohammad Anwar; Fujita, Masayuki

    2015-03-01

    Salinity in the form of abiotic stress adversely affects plant growth, development, and productivity. Various osmoprotectants are involved in regulating plant responses to salinity; however, the precise role of trehalose (Tre) in this process remains to be further elucidated. The present study investigated the regulatory role of Tre in alleviating salt-induced oxidative stress in hydroponically grown rice seedlings. Salt stress (150 and 250 mM NaCl) for 72 h resulted in toxicity symptoms such as stunted growth, severe yellowing, and leaf rolling, particularly at 250 mM NaCl. Histochemical observation of reactive oxygen species (ROS; O2 (∙-) and H2O2) indicated evident oxidative stress in salt-stressed seedlings. In these seedlings, the levels of lipoxygenase (LOX) activity, malondialdehyde (MDA), H2O2, and proline (Pro) increased significantly whereas total chlorophyll (Chl) and relative water content (RWC) decreased. Salt stress caused an imbalance in non-enzymatic antioxidants: ascorbic acid (AsA) content, the AsA/DHA ratio, and the GSH/GSSG ratio decreased, but glutathione (GSH) content increased significantly. In contrast, Tre pretreatment (10 mM, 48 h) significantly alleviated salt-induced toxicity symptoms and dramatically depressed LOX activity, ROS, MDA, and Pro accumulation whereas AsA, GSH, RWC, Chl contents, and redox status improved considerably. Salt stress stimulated the activities of SOD, GPX, APX, MDHAR, DHAR, and GR but decreased the activities of CAT and GST. However, Tre-pretreated salt-stressed seedlings counteracted SOD and MDHAR activities, elevated CAT and GST activities, further enhanced APX and DHAR activities, and maintained GPX and GR activities similar to the seedlings stressed with salt alone. In addition, Tre pretreatment enhanced the activities of methylglyoxal detoxifying enzymes (Gly I and Gly II) more efficiently in salt-stressed seedlings. Our results suggest a role for Tre in protecting against salt-induced oxidative damage

  9. Quantifying the risks of winter damage on overwintering crops under future climates: Will low-temperature damage be more likely in warmer climates?

    NASA Astrophysics Data System (ADS)

    Vico, G.; Weih, M.

    2014-12-01

    Autumn-sown crops act as winter cover crop, reducing soil erosion and nutrient leaching, while potentially providing higher yields than spring varieties in many environments. Nevertheless, overwintering crops are exposed for longer periods to the vagaries of weather conditions. Adverse winter conditions, in particular, may negatively affect the final yield, by reducing crop survival or its vigor. The net effect of the projected shifts in climate is unclear. On the one hand, warmer temperatures may reduce the frequency of low temperatures, thereby reducing damage risk. On the other hand, warmer temperatures, by reducing plant acclimation level and the amount and duration of snow cover, may increase the likelihood of damage. Thus, warmer climates may paradoxically result in more extensive low temperature damage and reduced viability for overwintering plants. The net effect of a shift in climate is explored by means of a parsimonious probabilistic model, based on a coupled description of air temperature, snow cover, and crop tolerable temperature. Exploiting an extensive dataset of winter wheat responses to low temperature exposure, the risk of winter damage occurrence is quantified under conditions typical of northern temperate latitudes. The full spectrum of variations expected with climate change is explored, quantifying the joint effects of alterations in temperature averages and their variability as well as shifts in precipitation. The key features affecting winter wheat vulnerability to low temperature damage under future climates are singled out.
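
    A toy sketch of a coupled air-temperature / snow-cover / tolerable-temperature damage model of the kind described above; the temperature process, the snow insulation rule, the acclimation statistics, and all numerical values are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)
        n_winters, n_days = 5000, 120

        # Assumed daily minimum air temperature: seasonal mean plus random variability [deg C].
        day = np.arange(n_days)
        mean_T = -2.0 - 8.0 * np.sin(np.pi * day / n_days)
        T_air = mean_T + rng.normal(0.0, 5.0, (n_winters, n_days))

        # Assumed snow-cover rule: snow is present on sub-freezing days and insulates,
        # so the crop (crown) temperature is damped toward milder values.
        snow = T_air < 0.0
        T_crop = np.where(snow, 0.4 * T_air - 1.8, T_air)

        # Assumed crop tolerable temperature (acclimation level), varying between winters.
        T_tolerable = rng.normal(-18.0, 3.0, (n_winters, 1))

        # A winter causes damage if the crop temperature ever falls below the tolerable level.
        damaged = (T_crop < T_tolerable).any(axis=1)
        print(f"estimated probability of winter damage: {damaged.mean():.2f}")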

  10. Probabilistic Open Set Recognition

    NASA Astrophysics Data System (ADS)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds that existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize that the cause is weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary

  11. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts probabilistic risk assessment, risk, and probability theory in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  12. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... not more than 6 pct which are damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No tolerance to reduce the required 70 pct of “light...

  13. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... not more than 6 pct which are damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No tolerance to reduce the required 70 pct of “light...

  14. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  15. DNA Damage Response

    PubMed Central

    Giglia-Mari, Giuseppina; Zotter, Angelika; Vermeulen, Wim

    2011-01-01

    Structural changes to DNA severely affect its functions, such as replication and transcription, and play a major role in age-related diseases and cancer. A complicated and entangled network of DNA damage response (DDR) mechanisms, including multiple DNA repair pathways, damage tolerance processes, and cell-cycle checkpoints safeguard genomic integrity. Like transcription and replication, DDR is a chromatin-associated process that is generally tightly controlled in time and space. As DNA damage can occur at any time on any genomic location, a specialized spatio-temporal orchestration of this defense apparatus is required. PMID:20980439

  16. Probabilistic exposure fusion.

    PubMed

    Song, Mingli; Tao, Dacheng; Chen, Chun; Bu, Jiajun; Luo, Jiebo; Zhang, Chengqi

    2012-01-01

    The luminance of a natural scene is often of high dynamic range (HDR). In this paper, we propose a new scheme to handle HDR scenes by integrating locally adaptive scene detail capture and suppressing gradient reversals introduced by the local adaptation. The proposed scheme is novel for capturing an HDR scene by using a standard dynamic range (SDR) device and synthesizing an image suitable for SDR displays. In particular, we use an SDR capture device to record scene details (i.e., the visible contrasts and the scene gradients) in a series of SDR images with different exposure levels. Each SDR image responds to a fraction of the HDR and partially records scene details. With the captured SDR image series, we first calculate the image luminance levels, which maximize the visible contrasts, and then the scene gradients embedded in these images. Next, we synthesize an SDR image by using a probabilistic model that preserves the calculated image luminance levels and suppresses reversals in the image luminance gradients. The synthesized SDR image contains much more scene details than any of the captured SDR image. Moreover, the proposed scheme also functions as the tone mapping of an HDR image to the SDR image, and it is superior to both global and local tone mapping operators. This is because global operators fail to preserve visual details when the contrast ratio of a scene is large, whereas local operators often produce halos in the synthesized SDR image. The proposed scheme does not require any human interaction or parameter tuning for different scenes. Subjective evaluations have shown that it is preferred over a number of existing approaches. PMID:21609883

  17. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R and D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.

  18. Impact damage in composite laminates

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.

    1988-01-01

    Damage tolerance requirements have become an important consideration in the design and fabrication of composite structural components for modern aircraft. The ability of a component to contain a flaw of a given size without serious loss of its structural integrity is of prime concern. Composite laminates are particularly susceptible to damage caused by transverse impact loading. The ongoing program described is aimed at developing experimental and analytical methods that can be used to assess damage tolerance capabilities in composite structures subjected to impulsive loading. Some significant results of this work and the methodology used to obtain them are outlined.

  19. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1991-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables (primitive variables) that describe the truss. Initially, the truss is deterministically analyzed for member forces, and member(s) in which the axial force exceeds the Euler buckling load are identified. These member(s) are then discretized with several intermediate nodes and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing the buckled member(s) until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.
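
    A minimal sketch of the probabilistic buckling check applied to a single pin-ended member (not the full progressive-collapse procedure of the cited work): the Euler buckling load is computed for sampled member properties and compared with a sampled axial force; all distributions and values are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 100000

        # Assumed member properties and loading (illustrative values).
        E = rng.normal(200e9, 10e9, n)             # Young's modulus [Pa]
        I = rng.normal(8.0e-7, 4.0e-8, n)          # second moment of area [m^4]
        L = rng.normal(2.0, 0.01, n)               # member length [m]
        P = rng.lognormal(np.log(3.0e5), 0.15, n)  # applied axial force [N]

        # Euler buckling load for a pin-ended member.
        P_cr = np.pi**2 * E * I / L**2

        # Probability that the axial force exceeds the buckling load,
        # i.e. the probabilistic counterpart of the deterministic buckling check.
        p_buckle = np.mean(P >= P_cr)
        print(f"estimated buckling probability for this member: {p_buckle:.3e}")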

  20. Rule Learning with Probabilistic Smoothing

    NASA Astrophysics Data System (ADS)

    Costa, Gianni; Guarascio, Massimo; Manco, Giuseppe; Ortale, Riccardo; Ritacco, Ettore

    A hierarchical classification framework is proposed for discriminating rare classes in imprecise domains, characterized by rarity (of both classes and cases), noise and low class separability. The devised framework couples the rules of a rule-based classifier with as many local probabilistic generative models. These are trained over the coverage of the corresponding rules to better catch those globally rare cases/classes that become less rare in the coverage. Two novel schemes for tightly integrating rule-based and probabilistic classification are introduced, that classify unlabeled cases by considering multiple classifier rules as well as their local probabilistic counterparts. An intensive evaluation shows that the proposed framework is competitive and often superior in accuracy w.r.t. established competitors, while overcoming them in dealing with rare classes.

  1. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  2. Composite Structures Damage Tolerance Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Chang, James B.; Goyal, Vinay K.; Klug, John C.; Rome, Jacob I.

    2012-01-01

    This report presents the results of a literature review as part of the development of composite hardware fracture control guidelines funded by NASA Engineering and Safety Center (NESC) under contract NNL04AA09B. The objectives of the overall development tasks are to provide a broad information and database to the designers, analysts, and testing personnel who are engaged in space flight hardware production.

  3. Probabilistic framework for network partition

    NASA Astrophysics Data System (ADS)

    Li, Tiejun; Liu, Jian; E, Weinan

    2009-08-01

    Given a large and complex network, we would like to find the partition of this network into a small number of clusters. This question has been addressed in many different ways. In a previous paper, we proposed a deterministic framework for an optimal partition of a network as well as the associated algorithms. In this paper, we extend this framework to a probabilistic setting, in which each node has a certain probability of belonging to a certain cluster. Two classes of numerical algorithms for such a probabilistic network partition are presented and tested. Application to three representative examples is discussed.

  4. Probabilistic coding of quantum states

    SciTech Connect

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-07-15

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding.

  5. 7 CFR 51.306 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Tolerances § 51.306 Tolerances. In...: (1) U.S. Extra Fancy, U.S. Fancy, U.S. No. 1, and U.S. No. 1 Hail grades: 10 percent of the apples in... 5 percent, shall be allowed for apples which are seriously damaged, including therein not more...

  6. 7 CFR 51.306 - Tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Tolerances § 51.306 Tolerances. In...: (1) U.S. Extra Fancy, U.S. Fancy, U.S. No. 1, and U.S. No. 1 Hail grades: 10 percent of the apples in... 5 percent, shall be allowed for apples which are seriously damaged, including therein not more...

  7. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  8. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

    A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be any of the four modes.

  9. Probabilistic Techniques for Phrase Extraction.

    ERIC Educational Resources Information Center

    Feng, Fangfang; Croft, W. Bruce

    2001-01-01

    This study proposes a probabilistic model for automatically extracting English noun phrases for indexing or information retrieval. The technique is based on a Markov model, whose initial parameters are estimated by a phrase lookup program with a phrase dictionary, then optimized by a set of maximum entropy parameters. (Author/LRW)

  10. Designing Probabilistic Tasks for Kindergartners

    ERIC Educational Resources Information Center

    Skoumpourdi, Chrysanthi; Kafoussi, Sonia; Tatsis, Konstantinos

    2009-01-01

    Recent research suggests that children could be engaged in probability tasks at an early age and task characteristics seem to play an important role in the way children perceive an activity. In this direction, in the present article we investigate the role of some basic characteristics of probabilistic tasks in their design and implementation. In…

  11. Making Probabilistic Relational Categories Learnable

    ERIC Educational Resources Information Center

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  12. Sugarcane Genotype Tolerance to Wireworms

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Sugarcane (interspecific hybrids of Saccharum spp.) growers in Florida normally apply a soil insecticide at planting to limit wireworm (Melanotus communis Gyllenhall) damage to seed cane (vegetative plantings of stalks). The objective of this study was to measure the tolerance of eight commercial su...

  13. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  14. Transplantation tolerance.

    PubMed

    Salisbury, Emma M; Game, David S; Lechler, Robert I

    2014-12-01

    Although transplantation has been a standard medical practice for decades, marked morbidity from the use of immunosuppressive drugs and poor long-term graft survival remain important limitations in the field. Since the first solid organ transplant between the Herrick twins in 1954, transplantation immunology has sought to move away from harmful, broad-spectrum immunosuppressive regimens that carry with them the long-term risk of potentially life-threatening opportunistic infections, cardiovascular disease, and malignancy, as well as graft toxicity and loss, towards tolerogenic strategies that promote long-term graft survival. Reports of "transplant tolerance" in kidney and liver allograft recipients whose immunosuppressive drugs were discontinued for medical or non-compliant reasons, together with results from experimental models of transplantation, provide the proof-of-principle that achieving tolerance in organ transplantation is fundamentally possible. However, translating the reconstitution of immune tolerance into the clinical setting is a daunting challenge fraught with the complexities of multiple interacting mechanisms overlaid on a background of variation in disease. In this article, we explore the basic science underlying mechanisms of tolerance and review the latest clinical advances in the quest for transplantation tolerance. PMID:24213880

  15. Lower-bound magnitude for probabilistic seismic hazard assessment

    SciTech Connect

    McCann, M.W. Jr.; Reed, J.W. and Associates, Inc., Mountain View, CA

    1989-10-01

    This report provides technical information to determine the lower-bound earthquake magnitude (LBM) for use in probabilistic seismic hazard (PSH) computations that are applied to nuclear plant applications. The evaluations consider the seismologic characteristics of earthquake experience at similar facilities and insights from probabilistic risk analysis. The recommendations for LBM satisfy the two basic precepts: (1) there is a reasonable engineering assurance that the likelihood of damage due to earthquakes smaller than the LBM is negligible, and (2) any small risk due to earthquakes smaller than the LBM is compensated by conservatisms in PSH results for larger earthquakes. Theoretical and empirical ground motion studies demonstrate that ground shaking duration and spectral shape are a strong function of earthquake magnitude. Small earthquakes have short duration and spectral shapes centered at high frequencies as compared to nuclear power plant design spectra which are typical of moderate and large earthquakes. Analysis of earthquake experience data shows damage to heavy industrial facilities, taken as analogs to nuclear plant structures and components, occurs for earthquakes having moment magnitude M larger than 5.1. Probabilistic seismic risk and margins studies show nuclear plant structures and adequately anchored ductile components to be rugged for moderate-size earthquakes with broad design-type spectral shapes. They may, therefore, be considered rugged for small earthquakes. Finally, nonlinear analysis of the damage effectiveness of strong-motion recordings shows that potential damage does not occur for earthquakes smaller than about M5.6. These results support a conservative LBM of M5.0 for application to nuclear power plant PSH assessments. 144 refs., 78 figs., 34 tabs.

  16. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  17. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  18. Probabilistic Fatigue And Flaw-Propagation Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Nicholas; Newlin, Laura; Ebbeler, Donald; Sutharshana, Sravan; Creager, Matthew

    1995-01-01

    The Probabilistic Failure Assessment for Fatigue and Flaw Propagation (PFAFAT II) package of software uses probabilistic failure-assessment (PFA) methodology to model flaw-propagation and low-cycle-fatigue modes of failure of structural components. It comprises one program for performing probabilistic crack-growth analysis and two programs for performing probabilistic low-cycle-fatigue analysis. These programs perform probabilistic fatigue and crack-propagation analysis by means of Monte Carlo simulation. PFAFAT II is an extension of, rather than a replacement for, the PFAFAT software (NPO-18965). It is written in FORTRAN 77.
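
    A minimal Monte Carlo crack-growth sketch in the spirit of the probabilistic failure-assessment methodology described above (not the PFAFAT II code): a Paris-law crack-growth relation is integrated in closed form from a sampled initial crack size to an assumed critical size, giving a distribution of cycles to failure; all parameter values are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 20000

        # Assumed Paris-law parameters and loading (illustrative units: m, MPa*sqrt(m)).
        C  = rng.lognormal(np.log(5.0e-12), 0.3, n)   # Paris coefficient
        m  = 3.0                                      # Paris exponent (fixed here)
        a0 = rng.lognormal(np.log(0.5e-3), 0.2, n)    # initial crack size [m]
        ds = rng.normal(150.0, 15.0, n)               # stress range [MPa]
        ac = 0.02                                     # critical crack size [m]
        Y  = 1.12                                     # geometry factor

        # Closed-form integration of da/dN = C*(Y*ds*sqrt(pi*a))**m between a0 and ac
        # (valid for m != 2), giving cycles to failure for each sample.
        expo = 1.0 - m / 2.0
        N_f = (ac**expo - a0**expo) / (expo * C * (Y * ds * np.sqrt(np.pi))**m)

        # Probability of failure before an assumed design life, plus a life percentile.
        N_design = 3.0e5
        print(f"P(N_f < {N_design:.0e} cycles) = {np.mean(N_f < N_design):.3f}")
        print(f"1st-percentile life = {np.percentile(N_f, 1):.2e} cycles")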

  19. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  20. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or...

  1. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or...

  2. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or...

  3. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C.; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  4. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  5. A Robust Damage Assessment Model for Corrupted Database Systems

    NASA Astrophysics Data System (ADS)

    Fu, Ge; Zhu, Hong; Li, Yingjiu

    An intrusion tolerant database uses damage assessment techniques to detect damage propagation scales in a corrupted database system. Traditional damage assessment approaches in an intrusion tolerant database system can only locate damage caused by reading corrupted data. In fact, there are many other damage spreading patterns that have not been considered in the traditional damage assessment model. In this paper, we systematically analyze inter-transaction dependency relationships that have been neglected in previous research and propose four different dependency relationships between transactions which may cause damage propagation. We extend the existing damage assessment model based on the four novel dependency relationships. The essential properties of our model are also discussed.

  6. Probabilistic load simulation: Code development status

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.

    1991-01-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  7. Confronting uncertainty in flood damage predictions

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2015-04-01

    Reliable flood damage models are a prerequisite for the practical usefulness of model results. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the 5- to 95-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
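
    A sketch of the split-sample evaluation described above, applied to a bagged decision-tree model on synthetic stand-in data (assuming scikit-learn is available); the predictors, the damage relation, and the interval construction from per-tree predictions are assumptions for illustration, not the authors' data or model.

        import numpy as np
        from sklearn.ensemble import BaggingRegressor

        rng = np.random.default_rng(4)

        # Synthetic stand-in for building-level flood damage data (illustrative only):
        # predictors = water depth [m] and a building quality score; response = relative damage.
        n = 2000
        depth = rng.uniform(0.0, 3.0, n)
        quality = rng.uniform(0.0, 1.0, n)
        rel_damage = np.clip(0.25 * depth * (1.2 - quality) + rng.normal(0, 0.05, n), 0, 1)
        X = np.column_stack([depth, quality])

        # Split-sample test: one subset to derive the model, the rest for validation.
        idx = rng.permutation(n)
        train, test = idx[: n // 2], idx[n // 2:]

        model = BaggingRegressor(n_estimators=200, random_state=0)
        model.fit(X[train], rel_damage[train])

        # Per-tree predictions give an approximate predictive distribution per building.
        per_tree = np.stack([est.predict(X[test]) for est in model.estimators_])
        mean_pred = per_tree.mean(axis=0)
        lo, hi = np.percentile(per_tree, [5, 95], axis=0)

        bias = np.mean(mean_pred - rel_damage[test])             # systematic deviation
        mae = np.mean(np.abs(mean_pred - rel_damage[test]))      # precision
        coverage = np.mean((rel_damage[test] >= lo) & (rel_damage[test] <= hi))  # reliability
        print(f"mean bias = {bias:+.3f}, MAE = {mae:.3f}, 5-95% coverage = {coverage:.2f}")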

  8. Probabilistic fatigue methodology and wind turbine reliability

    SciTech Connect

    Lange, C.H.

    1996-05-01

    Wind turbines subjected to highly irregular loadings due to wind, gravity, and gyroscopic effects are especially vulnerable to fatigue damage. The objective of this study is to develop and illustrate methods for the probabilistic analysis and design of fatigue-sensitive wind turbine components. A computer program (CYCLES) that estimates fatigue reliability of structural and mechanical components has been developed. A FORM/SORM analysis is used to compute failure probabilities and importance factors of the random variables. The limit state equation includes uncertainty in environmental loading, gross structural response, and local fatigue properties. Several techniques are shown to better study fatigue loads data. Common one-parameter models, such as the Rayleigh and exponential models, are shown to produce dramatically different estimates of load distributions and fatigue damage. Improved fits may be achieved with the two-parameter Weibull model. High b values require better modeling of relatively large stress ranges; this is effectively done by matching at least two moments (Weibull) and better by matching still higher moments. For this purpose, a new, four-moment "generalized Weibull" model is introduced. Load and resistance factor design (LRFD) methodology for design against fatigue is proposed and demonstrated using data from two horizontal-axis wind turbines. To estimate fatigue damage, wind turbine blade loads have been represented by their first three statistical moments across a range of wind conditions. Based on the moments μ1…μ3, new "quadratic Weibull" load distribution models are introduced. The fatigue reliability is found to be notably affected by the choice of load distribution model.
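
    A small sketch of fitting a two-parameter Weibull load-range model by matching the first two moments, as contrasted in the abstract with one-parameter models; the synthetic data, the bisection solver, and the damage-relevant moment comparison are assumptions for illustration, not the CYCLES program.

        import math
        import numpy as np

        def weibull_from_moments(mean, std):
            """Two-parameter Weibull (shape k, scale lam) matching a mean and std."""
            target_cv2 = (std / mean) ** 2
            # The squared coefficient of variation of a Weibull depends on the shape only:
            #   CV^2(k) = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1   (decreasing in k)
            cv2 = lambda k: math.gamma(1 + 2 / k) / math.gamma(1 + 1 / k) ** 2 - 1
            lo_k, hi_k = 0.2, 50.0
            for _ in range(100):                       # simple bisection for the shape
                mid = 0.5 * (lo_k + hi_k)
                lo_k, hi_k = (mid, hi_k) if cv2(mid) > target_cv2 else (lo_k, mid)
            k = 0.5 * (lo_k + hi_k)
            lam = mean / math.gamma(1 + 1 / k)         # scale from the mean
            return k, lam

        # Illustrative "measured" stress ranges (lognormal stand-in for turbine load data).
        rng = np.random.default_rng(5)
        s = rng.lognormal(mean=3.0, sigma=0.4, size=5000)

        k, lam = weibull_from_moments(s.mean(), s.std())
        print(f"moment-matched Weibull: shape k = {k:.2f}, scale = {lam:.2f}")

        # Damage-relevant comparison: E[S^b] for an S-N exponent b, data vs fitted Weibull.
        # A mismatch here is what motivates matching still higher moments.
        b = 8.0
        data_moment = np.mean(s ** b)
        weib_moment = lam ** b * math.gamma(1 + b / k)
        print(f"E[S^{b:.0f}]  data: {data_moment:.3e}   Weibull fit: {weib_moment:.3e}")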

  9. Intolerant tolerance.

    PubMed

    Khushf, G

    1994-04-01

    The Hyde Amendment and Roman Catholic attempts to put restrictions on Title X funding have been criticized for being intolerant. However, such criticism fails to appreciate that there are two competing notions of tolerance, one focusing on the limits of state force and accepting pluralism as unavoidable, and the other focusing on the limits of knowledge and advancing pluralism as a good. These two types of tolerance, illustrated in the writings of John Locke and J.S. Mill, each involve an intolerance. In a pluralistic context where the free exercise of religion is respected, John Locke's account of tolerance is preferable. However, it (in a reconstructed form) leads to a minimal state. Positive entitlements to benefits like artificial contraception or nontherapeutic abortions can legitimately be resisted, because an intolerance has already been shown with respect to those that consider the benefit immoral, since their resources have been coopted by taxation to advance an end that is contrary to their own. There is a sliding scale from tolerance (viewed as forbearance) to the affirmation of communal integrity, and this scale maps on to the continuum from negative to positive rights. PMID:8051515

  10. Religious Tolerance.

    ERIC Educational Resources Information Center

    Martz, Carlton

    2000-01-01

    This theme issue looks at three issues of religious tolerance. The first article examines a case recently decided by the United States Supreme Court on student-led prayers at school events. The second article explores the persecution suffered by members of the Mormon religion during the 19th century. The final article looks at Martin Luther and…

  11. 7 CFR 51.2544 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 6 6 (b) Serious Damage (Minor Insect or Vertebrate Injury, Mold, Rancid, Decay) 3 4 4 4 4 4 (1) Insect Damage, included in (b) 1 2 2 2 2 2 (c) Total Internal Defects 4 8 9 9 9 9 Table III—Tolerances... 1 1 1 (b) Foreign material (No glass, metal or live insects shall be permitted) .25 .25 .25 .25...

  12. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  13. Probabilistic Simulation for Nanocomposite Characterization

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Coroneos, Rula M.

    2007-01-01

    A unique probabilistic theory is described to predict the properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions.

  14. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
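
    A plain Monte Carlo sketch of the probability-of-instability computation for a small second-order system (the cited work uses fast probability integration and adaptive importance sampling instead); the system matrices, the cross-coupled stiffness term, and all distributions are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 10000

        M = np.diag([1.0, 1.0])                      # mass matrix (fixed, illustrative)
        unstable = 0
        for _ in range(n):
            # Assumed random stiffness and damping, including a destabilizing
            # cross-coupled stiffness term of uncertain magnitude (illustrative values).
            k1, k2 = rng.normal(100.0, 5.0), rng.normal(120.0, 6.0)
            c = rng.normal(0.8, 0.2)
            q = rng.normal(9.0, 3.0)                 # cross-coupled stiffness
            K = np.array([[k1, q], [-q, k2]])
            C = np.diag([c, c])

            # First-order state matrix for M x'' + C x' + K x = 0.
            A = np.block([[np.zeros((2, 2)), np.eye(2)],
                          [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])

            # Instability criterion: any eigenvalue with a positive real part.
            if np.any(np.linalg.eigvals(A).real > 0.0):
                unstable += 1

        print(f"estimated probability of instability: {unstable / n:.3f}")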

  15. Probabilistic Cloning and Quantum Computation

    NASA Astrophysics Data System (ADS)

    Gao, Ting; Yan, Feng-Li; Wang, Zhi-Xi

    2004-06-01

    We discuss the usefulness of quantum cloning and present examples of quantum computation tasks for which the cloning offers an advantage which cannot be matched by any approach that does not resort to quantum cloning. In these quantum computations, we need to distribute quantum information contained in the states about which we have some partial information. To perform quantum computations, we use a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation.

  16. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable and the distribution function of M is defined as usual: F(x) := Prob_M { w: X(w) < x }. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random composition of these two systems.
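
    A small simulation sketch of the construction described above: a toy two-state probabilistic automaton over {0, 1} generates words, each word is mapped to [0, 1] by its radix-2 expansion, and the distribution function is estimated empirically; the automaton's tables are assumed examples, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(7)

        # A toy two-state probabilistic automaton over the alphabet {0, 1}:
        # in state s, emit 1 with probability p_emit[s], then move to state 1
        # with probability trans[s, 1] (both tables are assumed examples).
        p_emit = np.array([0.2, 0.8])
        trans  = np.array([[0.7, 0.3],
                           [0.4, 0.6]])

        def sample_word_value(length=30):
            """Generate one word and return X(w) = sum_i b_i * 2**-i, its radix-2 value."""
            state, x = 0, 0.0
            for i in range(1, length + 1):
                bit = rng.random() < p_emit[state]
                x += bit * 2.0 ** -i
                state = int(rng.random() < trans[state, 1])
            return x

        # Empirical distribution function F(x) = Prob{ X(w) < x } on a coarse grid.
        samples = np.array([sample_word_value() for _ in range(5000)])
        for x in np.linspace(0.1, 0.9, 9):
            print(f"F({x:.1f}) ~ {np.mean(samples < x):.3f}")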

  17. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDF) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.

  18. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainties. Typical sources of uncertainty are the properties of the material and production and/or assembly inaccuracies in the geometry or the environment where the structure is located. The paper is focused on methods for the calculation of failure probabilities in structural failure and reliability analysis, with special attention to the newly developed Direct Optimized Probabilistic Calculation (DOProC) method, which is highly efficient in terms of calculation time and the accuracy of the solution. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software applications and has been used several times in probabilistic tasks and probabilistic reliability assessments.
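
    A minimal sketch of computing a failure probability by direct numerical integration over discretized input distributions, in the simulation-free spirit of DOProC described above (this is not the DOProC implementation); the limit state, the distributions, and the grids are assumptions for illustration.

        from math import erfc, sqrt

        import numpy as np

        # Discretize a resistance R and a load effect S into discrete pmfs, then
        # integrate P_f = P(S > R) directly over the grid -- no sampling involved.
        def normal_pmf(grid, mu, sigma):
            pdf = np.exp(-0.5 * ((grid - mu) / sigma) ** 2)
            return pdf / pdf.sum()                      # normalize to a discrete pmf

        r = np.linspace(200.0, 600.0, 401)              # resistance grid (assumed)
        s = np.linspace(50.0, 550.0, 501)               # load-effect grid (assumed)
        p_r = normal_pmf(r, 400.0, 40.0)
        p_s = normal_pmf(s, 280.0, 50.0)

        # Direct double summation over the discretized joint pmf (independence assumed):
        # failure whenever the load effect exceeds the resistance.
        fail = s[None, :] > r[:, None]                  # failure indicator on the grid
        p_f = np.sum(p_r[:, None] * p_s[None, :] * fail)
        print(f"failure probability by direct integration: {p_f:.3e}")

        # Cross-check against the closed-form result for normal R and S.
        beta = (400.0 - 280.0) / sqrt(40.0**2 + 50.0**2)
        print(f"closed-form normal-theory value:           {0.5 * erfc(beta / sqrt(2)):.3e}")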

  19. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at NASA-Lewis, and consists of five program elements: (1) probabilistic loads, (2) probabilistic finite element analysis, (3) probabilistic material behavior, (4) assessment of reliability and risk, and (5) probabilistic structural performance evaluation. Attention is given to quantification of the effects of uncertainties for several variables on High Pressure Fuel Turbopump blade temperature, pressure, and torque of the Space Shuttle Main Engine; the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; evaluation of the failure probability; reliability and risk-cost assessment; and an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  20. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  1. Neural networks for damage identification

    SciTech Connect

    Paez, T.L.; Klenke, S.E.

    1997-11-01

    Efforts to optimize the design of mechanical systems for preestablished use environments and to extend the durations of use cycles establish a need for in-service health monitoring. Numerous studies have proposed measures of structural response for the identification of structural damage, but few have suggested systematic techniques to guide the decision as to whether or not damage has occurred based on real data. Such techniques are necessary because in field applications the environments in which systems operate and the measurements that characterize system behavior are random. This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions rating the effectiveness of the approaches.
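
    The two PNN ideas above can be roughed out with a kernel density estimator on a one-dimensional response feature, as in the sketch below; the feature values, class statistics, and rejection threshold are synthetic assumptions, not the experimental measurements used in the paper.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(2)

        # Illustrative "response features" (e.g. RMS levels) for the two classes;
        # real exemplars would come from measurements on the test article.
        undamaged = rng.normal(1.00, 0.05, size=200)
        damaged   = rng.normal(1.15, 0.08, size=200)

        # Second-PNN style: model only the undamaged population with a kernel
        # density estimator and flag observations in its low-density tail.
        kde_undamaged = gaussian_kde(undamaged)
        threshold = np.quantile(kde_undamaged(undamaged), 0.05)  # 5% rejection level

        def probably_damaged(feature):
            """Return True if the feature is unlikely under the undamaged model."""
            return kde_undamaged(np.atleast_1d(feature))[0] < threshold

        # First-PNN style: two-class Bayesian decision with equal priors.
        kde_damaged = gaussian_kde(damaged)
        def bayes_class(feature):
            x = np.atleast_1d(feature)
            return "damaged" if kde_damaged(x)[0] > kde_undamaged(x)[0] else "undamaged"

        print(probably_damaged(1.30), bayes_class(1.30))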

  2. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it bases the decision only on economic values and is relatively static (a yes/no decision with no further reasoning). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with such situations and responses were analysed, and potentially applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with larger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
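
    The basic cost-loss rule described above reduces to a one-line comparison; the sketch below uses hypothetical cost and loss figures purely for illustration.

        def issue_warning(cost, loss_avoided, event_probability):
            """Cost-loss rule: act when C/L <= p, i.e. when the expected benefit
            of acting outweighs the cost of the response measure."""
            if loss_avoided <= 0:
                return False
            return (cost / loss_avoided) <= event_probability

        # Example: mitigation costs 50 k, avoids 400 k of damage, forecast p = 0.2.
        print(issue_warning(cost=50_000, loss_avoided=400_000, event_probability=0.2))  # True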

  3. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  4. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Astrophysics Data System (ADS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-03-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  5. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque; (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  6. Probabilistic cloning of three nonorthogonal states

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Rui, Pinshu; Yang, Qun; Zhao, Yan; Zhang, Ziyun

    2015-04-01

    We study the probabilistic cloning of three nonorthogonal states with equal success probabilities. For simplicity, we assume that the three states belong to a special set. The analytical form of the maximal success probability for probabilistic cloning is calculated. With the maximal success probability, we deduce the explicit form of the probabilistic quantum cloning machine. In the case of cloning, we obtain the unambiguous form of the unitary operation. It is demonstrated that the upper bound for the probabilistic quantum cloning machine in (Qiu in J Phys A 35:6931, 2002) can be reached only if the three states are equidistant.

  7. Probabilistic system identification in the time domain

    NASA Technical Reports Server (NTRS)

    Beck, James L.

    1988-01-01

    The objective of system identification is to determine reliable dynamical models of a structure by systematically using its measured excitation and response. It brings together in an integrated fashion, experimental, analytical, and computational techniques in structural dynamics. Areas of application for system identification include the following: (1) Model Evaluation--assessing assumptions (linearity and equivalent viscous damping) and techniques (finite-element modeling) used to construct theoretical models of a structure; (2) Model Improvement--updating of a theoretical model to enable more accurate response predictions for possible future loads on the structure, or for control of the structure; (3) Empirical Modelling--developing empirical relationships (nonlinear models) or empirical parameter values (modal damping) because the present state of the art does not provide theoretical results; and (4) Damage Detection and Assessment--continual or episodic updating of a structural model through vibration monitoring to detect and locate any structural damage. It can be argued that since the construction or modification of models using test data is subject to inherent uncertainties, the above problems should be properly treated within a Bayesian probabilistic framework. Such a methodology is presented which allows the precision of the estimates of the model parameters to be computed. It also leads to a guiding principle in applications. Namely, when selecting a single model from a given class of models, one should take the most probable model in the class based on the experimental data. Practical applications of this principle are given which are based on the utilization of measured seismic motions in large civil structures. Examples include the application of a computer program MODE-ID to identify modal properties directly from seismic excitation and response time histories from a nine-story steel-frame building at JPL and from a freeway overpass bridge.

  8. Replicating Damaged DNA in Eukaryotes

    PubMed Central

    Chatterjee, Nimrat; Siede, Wolfram

    2013-01-01

    DNA damage is one of many possible perturbations that challenge the mechanisms that preserve genetic stability during the copying of the eukaryotic genome in S phase. This short review provides, in the first part, a general introduction to the topic and an overview of checkpoint responses. In the second part, the mechanisms of error-free tolerance in response to fork-arresting DNA damage will be discussed in some detail. PMID:24296172

  9. Against all odds -- Probabilistic forecasts and decision making

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Zappa, Massimiliano

    2015-04-01

    In the city of Zurich (Switzerland), the damage potential due to flooding of the river Sihl is estimated at about 5 billion US dollars. The flood forecasting system used by the administration for decision making has run continuously since 2007. It has a time horizon of at most five days and operates at hourly time steps. The flood forecasting system includes three different model chains. Two of those are run by the deterministic NWP models COSMO-2 and COSMO-7, and one is driven by the probabilistic NWP COSMO-LEPS. The model chains have been consistent since February 2010, so five full years are available for the evaluation of the system. The system was evaluated continuously and is a very nice example to present the added value that lies in probabilistic forecasts. The forecasts are available to the decision makers on an online platform. Several graphical representations of the forecasts and forecast history are available to support decision making and to rate the current situation. The communication between forecasters and decision-makers is quite close. In short, an ideal situation. However, an event, or rather a non-event, in summer 2014 showed that knowledge about the general superiority of probabilistic forecasts doesn't necessarily mean that the decisions taken in a specific situation will be based on that probabilistic forecast. Some years of experience allow gaining confidence in the system, both for the forecasters and for the decision-makers. Even if, from the theoretical point of view, the handling during crisis situations is well designed, a first event demonstrated that the dialog with the decision-makers still lacks exercise during such situations. We argue that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also fit, but, in our case, we are very happy not to report about this option.

  10. Overexpression of rice OsREX1-S, encoding a putative component of the core general transcription and DNA repair factor IIH, renders plant cells tolerant to cadmium- and UV-induced damage by enhancing DNA excision repair.

    PubMed

    Kunihiro, Shuta; Kowata, Hikaru; Kondou, Youichi; Takahashi, Shinya; Matsui, Minami; Berberich, Thomas; Youssefian, Shohab; Hidema, Jun; Kusano, Tomonobu

    2014-05-01

    Screening of 40,000 Arabidopsis FOX (Full-length cDNA Over-eXpressor gene hunting system) lines expressing rice full-length cDNAs led us to identify four cadmium (Cd)-tolerant lines, one of which carried OsREX1-S as a transgene. OsREX1-S shows the highest levels of identity to Chlamydomonas reinhardtii REX1-S (referred to as CrREX1-S, in which REX denotes Required for Excision) and to yeast and human TFB5s (RNA polymerase II transcription factor B5), both of which are components of the general transcription and DNA repair factor TFIIH. Transient expression of OsREX1-S consistently localized the protein to the nucleus of onion cells. Newly generated transgenic Arabidopsis plants expressing OsREX1-S reproducibly displayed enhanced Cd tolerance, confirming that the Cd tolerance of the initially identified line was conferred solely by OsREX1-S expression. Furthermore, transgenic Arabidopsis plants expressing OsREX1-S exhibited ultraviolet-B (UVB) tolerance by reducing the amounts of cyclobutane pyrimidine dimers produced by UVB radiation. Moreover, those transgenic OsREX1-S Arabidopsis plants became resistant to bleomycin (an inducer of DNA strand breaks) and mitomycin C (a DNA intercalating agent), compared to wild type. Our results indicate that OsREX1-S renders host plants tolerant to Cd, UVB radiation, bleomycin, and mitomycin C through enhanced DNA excision repair. PMID:24563249

  11. The probabilistic seismic loss model as a tool for portfolio management: the case of Maghreb.

    NASA Astrophysics Data System (ADS)

    Pousse, Guillaume; Lorenzo, Francisco; Stejskal, Vladimir

    2010-05-01

    Although the property insurance market in Maghreb countries does not systematically purchase earthquake cover, Impact Forecasting is developing a new loss model for the calculation of probabilistic seismic risk. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A set of damage functions is then used to convert the modelled ground motion severity into monetary losses. We aim to highlight risk assessment challenges, especially in countries where reliable data are difficult to obtain. The loss model estimates the risk and allows further risk transfer strategies to be discussed.
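
    A toy version of the hazard-to-loss chain described above might look like the sketch below; the event set, vulnerability curve, and exposed value are invented assumptions and bear no relation to the Impact Forecasting model.

        import numpy as np

        rng = np.random.default_rng(3)
        n_events = 50_000

        # Stochastic event set: illustrative ground-motion severity per event.
        pga = rng.lognormal(mean=np.log(0.15), sigma=0.6, size=n_events)  # [g]

        # Hypothetical vulnerability (damage) function: mean damage ratio vs PGA.
        def mean_damage_ratio(pga):
            return np.clip((pga / 0.6) ** 1.5, 0.0, 1.0)

        exposed_value = 2.0e9  # insured value in the portfolio (illustrative)
        losses = mean_damage_ratio(pga) * exposed_value

        # Quantile-style summary of the modelled event losses.
        for q in (0.90, 0.99, 0.999):
            print(f"loss not exceeded with prob {q}: {np.quantile(losses, q):.3e}")
        print("mean event loss:", round(losses.mean(), 0))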

  12. Probabilistic Analysis of Rechargeable Batteries in a Photovoltaic Power Supply System

    SciTech Connect

    Barney, P.; Ingersoll, D.; Jungst, R.; O'Gorman, C.; Paez, T.L.; Urbina, A.

    1998-11-24

    We developed a model for the probabilistic behavior of a rechargeable battery acting as the energy storage component in a photovoltaic power supply system. Stochastic and deterministic models are created to simulate the behavior of the system components. The components are the solar resource, the photovoltaic power supply system, the rechargeable battery, and a load. Artificial neural networks are incorporated into the model of the rechargeable battery to simulate damage that occurs during deep discharge cycles. The equations governing system behavior are combined into one set and solved simultaneously in a Monte Carlo framework to evaluate the probabilistic character of measures of battery behavior.

  13. Probabilistic Simulation for Nanocomposite Fracture

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.

  14. Probabilistic risk assessment: Number 219

    SciTech Connect

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)

  15. Probabilistic approach to EMP assessment

    SciTech Connect

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program.

  16. Modeling neural activity with cumulative damage distributions.

    PubMed

    Leiva, Víctor; Tejo, Mauricio; Guiraud, Pierre; Schmachtenberg, Oliver; Orio, Patricio; Marmolejo-Ramos, Fernando

    2015-10-01

    Neurons transmit information as action potentials or spikes. Due to the inherent randomness of the inter-spike intervals (ISIs), probabilistic models are often used for their description. Cumulative damage (CD) distributions are a family of probabilistic models that has been widely considered for describing time-related cumulative processes. This family allows us to consider certain deterministic principles for modeling ISIs from a probabilistic viewpoint and to link its parameters to values with biological interpretation. The CD family includes the Birnbaum-Saunders and inverse Gaussian distributions, which possess distinctive properties and theoretical arguments useful for ISI description. We expand the use of CD distributions to the modeling of neural spiking behavior, mainly by testing the suitability of the Birnbaum-Saunders distribution, which has not been studied in the setting of neural activity. We validate this expansion with original experimental and simulated electrophysiological data. PMID:25998210
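
    Since SciPy's fatiguelife distribution is the Birnbaum-Saunders distribution, a rough version of the fitting exercise described above can be sketched as follows; the inter-spike intervals here are simulated stand-ins for the electrophysiological data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)

        # Simulated inter-spike intervals (seconds); real data would replace this.
        isi = stats.fatiguelife(c=0.8, scale=0.05).rvs(size=1000, random_state=rng)

        # Fit Birnbaum-Saunders (scipy's "fatiguelife") and inverse Gaussian,
        # both members of the cumulative-damage family, with location fixed at 0.
        bs_params = stats.fatiguelife.fit(isi, floc=0)
        ig_params = stats.invgauss.fit(isi, floc=0)

        # Compare the fits by log-likelihood (higher is better).
        ll_bs = np.sum(stats.fatiguelife.logpdf(isi, *bs_params))
        ll_ig = np.sum(stats.invgauss.logpdf(isi, *ig_params))
        print("Birnbaum-Saunders log-likelihood:", round(ll_bs, 1))
        print("inverse Gaussian  log-likelihood:", round(ll_ig, 1))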

  17. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country started in 1968 with the setting up of the first seismographic station in Tirana, and more effectively after the Albanian Seismological Network began operating in 1976. There is rich evidence that over two thousand years Albania has been hit by many disastrous earthquakes. The highest magnitude estimated is 7.2. After the end of the Communist era and the opening of the country, a construction boom started in Albania and continues even now. This makes the production of accurate seismic hazard maps indispensable for mitigating the damage of probable future earthquakes. Some efforts have already been made in seismic hazard assessment (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique has been used in a joint work between the Seismological Institute of Tirana, Albania, and the Department of Geophysics of the Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically compiled for this seismic hazard analysis and contains 530 events with magnitude M>4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, assigning to each the most representative fault characteristics. The computer code used for the hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were Ambraseys et al., 1996; Sabetta and Pugliese, 1996; and Margaris et al., 2001. The hazard maps are obtained for return periods of 100, 475, 2375 and 4746 years, for rock soil conditions. Analyzing the map of PGA values for a return period of 475 years, five zones with different ranges of PGA values can be distinguished: 1) the zone with PGA (0.20 - 0.24 g), 1.8 percent of Albanian territory; 2) the zone with PGA (0.16 - 0.20 g), 22.6 percent of Albanian territory; 3) the

  18. Probabilistic modeling of financial exposure to flood in France

    NASA Astrophysics Data System (ADS)

    Moncoulon, David; Quantin, Antoine; Leblois, Etienne

    2014-05-01

    CCR is a French reinsurance company which offers natural catastrophe covers with the State guarantee. Within this framework, CCR develops its own models to assess its financial exposure to floods, droughts, earthquakes and other perils, and thus the exposure of insurers and the French State. A probabilistic flood model has been developed in order to estimate the financial exposure of the Nat Cat insurance market to flood events, depending on their annual occurrence probability. This presentation is organized in two parts. The first part is dedicated to the development of a flood hazard and damage model (ARTEMIS). The model calibration and validation on historical events are then described. In the second part, the coupling of ARTEMIS with two generators of probabilistic events is achieved: a stochastic flow generator and a stochastic spatialized precipitation generator, adapted from the SAMPO model developed by IRSTEA. The complementary nature of these two generators is analysed: the first allows floods to be generated on the French hydrological station network; the second allows surface water runoff and small river floods to be simulated, even on ungauged rivers. Thus, the simulation of thousands of non-occurred but possible events allows us to provide, for the first time, an estimate of the financial exposure to flooding in France at different scales (commune, department, country) and from different points of view (hazard, vulnerability and damages).

  19. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  20. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  1. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  2. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  3. Compression of Probabilistic XML Documents

    NASA Astrophysics Data System (ADS)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism, that produce equivalent but more compact PXML documents. They can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique and a rather simple generic DAG-compression technique.

  4. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    SciTech Connect

    Dickson, T.L.; Simonen, F.A.

    1992-05-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  5. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    SciTech Connect

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  6. Orchid flowers tolerance to gamma-radiation

    NASA Astrophysics Data System (ADS)

    Kikuchi, Olivia Kimiko

    2000-03-01

    Cut flowers are fresh goods that may be treated with fumigants such as methyl bromide to meet the quarantine requirements of importing countries. Irradiation is a non-chemical alternative to the methyl bromide treatment of fresh products. In this research, different cut orchids were irradiated to examine their tolerance to gamma rays. A 200 Gy dose inhibited Dendrobium phalaenopsis buds from opening but did not cause visible damage to opened flowers. Doses of 800 and 1000 Gy were damaging because they caused the flowers to drop from the stem. Cattleya irradiated with 750 Gy did not show any damage and were therefore eligible for the radiation treatment. Cymbidium tolerated up to 300 Gy and above this dose dropped prematurely. On the other hand, Oncidium did not tolerate doses above 150 Gy.

  7. Fault Tolerant State Machines

    NASA Technical Reports Server (NTRS)

    Burke, Gary R.; Taft, Stephanie

    2004-01-01

    State machines are commonly used to control sequential logic in FPGAs and ASICs. An errant state machine can cause considerable damage to the device it is controlling. For example, in space applications, the FPGA might be controlling pyros, which when fired at the wrong time will cause a mission failure. Even a well designed state machine can be subject to random errors as a result of SEUs (single-event upsets) from the radiation environment in space. There are various ways to encode the states of a state machine, and the type of encoding makes a large difference in the susceptibility of the state machine to radiation. In this paper we compare four methods of state machine encoding and determine which method gives the best fault tolerance, as well as the resources needed for each method.
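
    One way to see why encoding matters is to compare the minimum Hamming distance between valid state codes: if it is 1, a single SEU can silently move the machine to another valid state. The sketch below compares a few classic encodings for a four-state machine; the codes are illustrative and not the encodings evaluated in the paper.

        from itertools import combinations

        # Candidate encodings for a four-state machine (illustrative only).
        encodings = {
            "binary":  ["00", "01", "10", "11"],
            "gray":    ["00", "01", "11", "10"],
            "one-hot": ["0001", "0010", "0100", "1000"],
            "hamming": ["00000", "01011", "10101", "11110"],  # d_min = 3 example
        }

        def min_hamming(codes):
            """Minimum Hamming distance between any two valid state codes; a single
            bit flip can reach another valid state only when this equals 1."""
            return min(sum(a != b for a, b in zip(x, y))
                       for x, y in combinations(codes, 2))

        for name, codes in encodings.items():
            print(f"{name:8s} bits={len(codes[0])}  d_min={min_hamming(codes)}")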

  8. 7 CFR 51.628 - Tolerances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    7 Agriculture 2 2010-01-01 2010-01-01 false Tolerances. 51.628 Section 51.628 Agriculture... § 51.628 Tolerances. In order to allow for variations incident to proper grading and handling in... (the accompanying table of tolerance percentages for U.S. Fancy and U.S. No. 1 grades, including very serious damage limits, is not recoverable from the source text)

  9. Imprecise probabilistic estimation of design floods with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Qi, Wei; Zhang, Chi; Fu, Guangtao; Zhou, Huicheng

    2016-06-01

    An imprecise probabilistic framework for design flood estimation is proposed on the basis of the Dempster-Shafer theory to handle different epistemic uncertainties from data, probability distribution functions, and probability distribution parameters. These uncertainties are incorporated in cost-benefit analysis to generate the lower and upper bounds of the total cost for flood control, thus presenting improved information for decision making on design floods. Within the total cost bounds, a new robustness criterion is proposed to select a design flood that can tolerate higher levels of uncertainty. A variance decomposition approach is used to quantify individual and interactive impacts of the uncertainty sources on total cost. Results from three case studies, with 127, 104, and 54 year flood data sets, respectively, show that the imprecise probabilistic approach effectively combines aleatory and epistemic uncertainties from the various sources and provides upper and lower bounds of the total cost. Between the total cost and the robustness of design floods, a clear trade-off which is beyond the information that can be provided by the conventional minimum cost criterion is identified. The interactions among data, distributions, and parameters have a much higher contribution than parameters to the estimate of the total cost. It is found that the contributions of the various uncertainty sources and their interactions vary with different flood magnitude, but remain roughly the same with different return periods. This study demonstrates that the proposed methodology can effectively incorporate epistemic uncertainties in cost-benefit analysis of design floods.

  10. Probabilistic cloning of equidistant states

    SciTech Connect

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-08-15

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  11. Probabilistic Reasoning for Plan Robustness

    NASA Technical Reports Server (NTRS)

    Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.

    2005-01-01

    A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing projected system state can be simplified in some cases. Common approximation and novel methods are compared for over-constrained and lightly constrained domains. The system compares a few common approximation methods for an iterative repair planner. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed by execution. The improvement is more significant for larger problems and problems with higher resource subscription levels but diminishes as the system is allowed to accept higher risk levels.
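
    A small example of the parametric-distribution idea: when serial activity durations are represented as normal distributions, the projected plan end time is itself normal, so the probability of violating a deadline follows in closed form. The activities and numbers below are illustrative assumptions, not taken from the planner described above.

        import numpy as np
        from scipy import stats

        # Activity durations represented as parametric (normal) distributions.
        durations = [stats.norm(5.0, 1.0), stats.norm(8.0, 2.0), stats.norm(3.0, 0.5)]
        deadline = 19.0

        # Serial plan: the end time is normal with summed means and variances.
        mean = sum(d.mean() for d in durations)
        std = np.sqrt(sum(d.var() for d in durations))
        p_violate = 1.0 - stats.norm(mean, std).cdf(deadline)
        print(f"analytic P(miss deadline) = {p_violate:.3f}")

        # Monte Carlo cross-check of the same projection.
        rng = np.random.default_rng(10)
        total = sum(d.rvs(100_000, random_state=rng) for d in durations)
        print(f"sampled  P(miss deadline) = {np.mean(total > deadline):.3f}")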

  12. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
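
    Steps (3) through (5) above revolve around sampling parameter distributions for repeated code runs; a generic sketch using Latin hypercube sampling is shown below. The parameter names and distributions are hypothetical and are not the RESRAD or RESRAD-BUILD defaults.

        import numpy as np
        from scipy.stats import qmc, lognorm, uniform

        # Illustrative parameter distributions for a dose-assessment input deck
        # (names and values are hypothetical assumptions).
        distributions = {
            "soil_density_g_cm3": uniform(loc=1.4, scale=0.4),    # 1.4 to 1.8
            "distribution_coeff": lognorm(s=0.8, scale=50.0),
            "ingestion_rate_g_d": uniform(loc=50.0, scale=100.0), # 50 to 150
        }

        # Latin hypercube design over the unit cube, then transform through each
        # parameter's inverse CDF so every run gets one stratified sample per input.
        n_runs = 64
        sampler = qmc.LatinHypercube(d=len(distributions), seed=9)
        u = sampler.random(n_runs)

        samples = {name: dist.ppf(u[:, k])
                   for k, (name, dist) in enumerate(distributions.items())}

        for name, vals in samples.items():
            print(f"{name:20s} min={vals.min():8.2f} median={np.median(vals):8.2f} max={vals.max():8.2f}")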

  13. The EBR-II Probabilistic Risk Assessment: Results and insights

    SciTech Connect

    Hill, D.J.; Ragland, W.A.; Roglans, J.

    1993-12-31

    This paper summarizes the results from the recently completed EBR-II Probabilistic Risk Assessment (PRA) and provides an analysis of the sources of risk of the operation of EBR-II from both internal and external initiating events. The EBR-II PRA explicitly accounts for the role of reactivity feedbacks in reducing fuel damage. The results show that the expected core damage frequency from internal initiating events at EBR-II is very low, 1.6 x 10^-6 yr^-1, even with a wide definition of core damage (essentially that of exceeding Technical Specification limits). The probability of damage, primarily due to liquid metal fires, from externally initiated events (excluding earthquake) is 3.6 x 10^-6 yr^-1. Overall, these results are considerably better than results for other research reactors and the nuclear industry in general and stem from three main sources: low likelihood of loss of coolant due to low system pressure and top-entry double vessels; low likelihood of loss of decay heat removal due to reliance on passive means; and low likelihood of power/flow mismatch due to both passive feedbacks and the reliability of the rod scram capability.

  14. The EBR-II Probabilistic Risk Assessment: Results and insights

    SciTech Connect

    Hill, D.J.; Ragland, W.A.; Roglans, J.

    1993-01-01

    This paper summarizes the results from the recently completed EBR-II Probabilistic Risk Assessment (PRA) and provides an analysis of the sources of risk of the operation of EBR-II from both internal and external initiating events. The EBR-II PRA explicitly accounts for the role of reactivity feedbacks in reducing fuel damage. The results show that the expected core damage frequency from internal initiating events at EBR-II is very low, 1.6 x 10^-6 yr^-1, even with a wide definition of core damage (essentially that of exceeding Technical Specification limits). The probability of damage, primarily due to liquid metal fires, from externally initiated events (excluding earthquake) is 3.6 x 10^-6 yr^-1. Overall, these results are considerably better than results for other research reactors and the nuclear industry in general and stem from three main sources: low likelihood of loss of coolant due to low system pressure and top-entry double vessels; low likelihood of loss of decay heat removal due to reliance on passive means; and low likelihood of power/flow mismatch due to both passive feedbacks and the reliability of the rod scram capability.

  15. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery. PMID:26017444

  16. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  17. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571

  18. Probabilistic population projections with migration uncertainty.

    PubMed

    Azose, Jonathan J; Ševčíková, Hana; Raftery, Adrian E

    2016-06-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571

  19. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
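
    The Poisson-Binomial distribution referred to above can be computed exactly by convolving the individual Bernoulli distributions; the sketch below builds its PMF and a one-sided reliability p-value for an invented set of forecast probabilities.

        import numpy as np

        def poisson_binomial_pmf(probs):
            """Exact PMF of the number of successes in independent Bernoulli trials
            with unequal probabilities, built by repeated convolution."""
            pmf = np.array([1.0])
            for p in probs:
                pmf = np.convolve(pmf, [1.0 - p, p])
            return pmf

        # Forecast probabilities issued for n separate events (illustrative values).
        forecast_p = [0.1, 0.3, 0.05, 0.6, 0.2, 0.4, 0.15, 0.7]
        observed_events = 5   # how many of those events actually occurred

        pmf = poisson_binomial_pmf(forecast_p)
        # One-sided p-value: probability of at least this many events occurring
        # if the issued forecast probabilities were perfectly reliable.
        p_value = pmf[observed_events:].sum()
        print(f"P(K >= {observed_events} | reliable forecasts) = {p_value:.4f}")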

  20. Probabilistic model better defines development well risks

    SciTech Connect

    Connolly, M.R.

    1996-10-14

    Probabilistic techniques to compare and rank projects, such as the drilling of development wells, often are more representative than decision tree or deterministic approaches. As opposed to traditional deterministic methods, probabilistic analysis gives decision-makers ranges of outcomes with associated probabilities of occurrence. This article analyzes the drilling of a hypothetical development well with actual field data (such as stabilized initial rates, production declines, and gas/oil ratios) to calculate probabilistic reserves and production flow streams. Analog operating data were included to build distributions for capital and operating costs. Economics from the Monte Carlo simulation include probabilistic production flow streams and cost distributions. Results include single-parameter distributions (reserves, net present value, and profitability index) and time-function distributions (annual production and net cash flow).
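
    A stripped-down version of this kind of Monte Carlo well economics is sketched below; the rate, decline, cost, and price figures are invented for illustration and do not reproduce the article's field data.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 20_000

        # Illustrative input distributions (a real study would fit these to field data).
        initial_rate = rng.lognormal(np.log(300.0), 0.35, n)   # bbl/d, stabilized IP
        decline      = rng.uniform(0.15, 0.35, n)              # nominal annual decline
        capex        = rng.normal(3.0e6, 0.4e6, n)             # drilling/completion cost
        opex_per_bbl = rng.normal(12.0, 2.0, n)
        price        = 60.0                                     # $/bbl, held fixed here
        discount     = 0.10
        years        = np.arange(1, 11)

        # Exponential-decline production and discounted cash flow per trial.
        annual_prod = initial_rate[:, None] * 365.0 * np.exp(-decline[:, None] * years)
        cash_flow = annual_prod * (price - opex_per_bbl[:, None])
        npv = (cash_flow / (1.0 + discount) ** years).sum(axis=1) - capex

        print("P(NPV > 0)      :", round(np.mean(npv > 0), 3))
        print("P10 / P50 / P90 :", np.quantile(npv, [0.9, 0.5, 0.1]).round(-3))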

  1. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  2. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss whether we indeed make better decisions on the basis of probabilistic forecasts.

  3. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  4. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  5. Probabilistic micromechanics for high-temperature composites

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1993-01-01

    The three-year program of research had the following technical objectives: the development of probabilistic methods for micromechanics-based constitutive and failure models, application of the probabilistic methodology in the evaluation of various composite materials and simulation of expected uncertainties in unidirectional fiber composite properties, and influence of the uncertainties in composite properties on the structural response. The first year of research was devoted to the development of probabilistic methodology for micromechanics models. The second year of research focused on the evaluation of the Chamis-Hopkins constitutive model and Aboudi constitutive model using the methodology developed in the first year of research. The third year of research was devoted to the development of probabilistic finite element analysis procedures for laminated composite plate and shell structures.

  6. A Probabilistic Formulation for Hausdorff Matching

    NASA Technical Reports Server (NTRS)

    Olson, Clark F.

    1998-01-01

    Matching images based on a Hausdorff measure has become popular for computer vision applications. In this paper, we develop a probabilistic formulation for Hausdorff matching in terms of maximum likelihood estimation.

  7. A novel Bayesian imaging method for probabilistic delamination detection of composite materials

    NASA Astrophysics Data System (ADS)

    Peng, Tishun; Saxena, Abhinav; Goebel, Kai; Xiang, Yibing; Sankararaman, Shankar; Liu, Yongming

    2013-12-01

    A probabilistic framework for location and size determination of delamination in carbon-carbon composites is proposed in this paper. A probability image of the delaminated area is constructed from Lamb wave-based damage detection features using the Bayesian updating technique. First, the algorithm for the probabilistic delamination detection framework using the proposed Bayesian imaging method (BIM) is presented. Next, a fatigue testing setup for carbon-carbon composite coupons is described, and the Lamb wave-based diagnostic signal is interpreted and processed. The obtained signal features are then incorporated in the Bayesian imaging method for delamination size and location detection, as well as prediction of the corresponding uncertainty bounds. The damage detection results using the proposed methodology are compared with x-ray images for verification and validation. Finally, some conclusions are drawn and suggestions made for future work based on the study presented in this paper.
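
    A very rough sketch of the per-cell Bayesian update behind such an imaging method is shown below; the grid, the synthetic damage-index map, and the Gaussian likelihood parameters are all assumptions for illustration, not the features or models used in the paper.

        import numpy as np

        rng = np.random.default_rng(6)
        nx, ny = 60, 40                     # coarse grid over the coupon
        prior = np.full((nx, ny), 0.05)     # prior delamination probability per cell

        # Hypothetical damage-index map from several sensor paths; here a synthetic
        # field peaked near the true (unknown) delamination location.
        truth = (35, 20)
        X, Y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
        damage_index = np.exp(-((X - truth[0]) ** 2 + (Y - truth[1]) ** 2) / 50.0)
        damage_index += 0.05 * rng.standard_normal((nx, ny))

        # Simple Gaussian likelihoods of the feature for damaged / undamaged cells.
        def likelihood(d, damaged):
            mu, sigma = (0.8, 0.3) if damaged else (0.1, 0.3)
            return np.exp(-0.5 * ((d - mu) / sigma) ** 2)

        # Bayesian update of the per-cell delamination probability.
        num = likelihood(damage_index, True) * prior
        den = num + likelihood(damage_index, False) * (1.0 - prior)
        posterior = num / den

        idx = np.unravel_index(np.argmax(posterior), posterior.shape)
        print("most probable delamination cell:", idx, "posterior:", round(posterior[idx], 3))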

  8. Application of a Probabilistic Algorithm for Ultrasonic Guided Wave Imaging of Carbon Composites

    NASA Astrophysics Data System (ADS)

    Hettler, Jan; Tabatabateipour, Morteza; Delrue, Steven; Van Den Abeele, Koen

    The Reconstruction Algorithm for Probabilistic Inspection of Damage (RAPID) is a baseline-dependent imaging method. It utilizes a permanent array of ultrasonic transducers that covers the region of interest to interrogate the structure and estimate the presence and location of damage. The method has already proven its capability to detect different types of damage in aluminum plate structures, e.g. cracking or corrosion damage. In the present study, we apply RAPID to inspect carbon-fiber reinforced polymer (CFRP) components for the presence of impact damage and delaminations. In addition, numerical and experimental results of a baseline-free RAPID approach for the detection of nonlinear defects in CFRP will be presented. This modified RAPID draws on the Scaling Subtraction Method (SSM) which is well known from the field of nonlinear ultrasound.
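
    A simplified sketch of a RAPID-style probability image, using the commonly cited elliptical weighting P(x,y) = sum over pairs of DI_ij * (beta - R_ij(x,y)) / (beta - 1), is given below; the sensor layout, damage indices, and beta value are synthetic assumptions rather than values from this study.

        import numpy as np

        # Transducer positions on the plate (metres), illustrative ring array.
        angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
        sensors = np.c_[0.25 + 0.2 * np.cos(angles), 0.25 + 0.2 * np.sin(angles)]

        # Damage index per transmit/receive pair, normally derived from the change
        # between baseline and current Lamb-wave signals; synthetic values here.
        rng = np.random.default_rng(7)
        n = len(sensors)
        DI = rng.uniform(0.0, 0.1, size=(n, n))
        DI[2, 5] = DI[5, 2] = 0.9          # pretend the path 2-5 crosses the damage

        beta = 1.05                        # ellipse size parameter
        xs, ys = np.meshgrid(np.linspace(0, 0.5, 200), np.linspace(0, 0.5, 200))
        image = np.zeros_like(xs)

        for i in range(n):
            for j in range(i + 1, n):
                d_ij = np.hypot(*(sensors[i] - sensors[j]))
                d_i = np.hypot(xs - sensors[i, 0], ys - sensors[i, 1])
                d_j = np.hypot(xs - sensors[j, 0], ys - sensors[j, 1])
                R = (d_i + d_j) / d_ij                      # elliptical distance ratio
                weight = np.clip((beta - R) / (beta - 1.0), 0.0, 1.0)
                image += DI[i, j] * weight                  # accumulate probability

        peak = np.unravel_index(np.argmax(image), image.shape)
        print("estimated damage location (m):", round(xs[peak], 3), round(ys[peak], 3))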

  9. Emulation for probabilistic weather forecasting

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Barillec, Remi

    2010-05-01

    Numerical weather prediction models are typically very expensive to run due to their complexity and resolution. Characterising the sensitivity of the model to its initial condition and/or to its parameters requires numerous runs of the model, which is impractical for all but the simplest models. To produce probabilistic forecasts requires knowledge of the distribution of the model outputs, given the distribution over the inputs, where the inputs include the initial conditions, boundary conditions and model parameters. Such uncertainty analysis for complex weather prediction models seems a long way off, given current computing power, with ensembles providing only a partial answer. One possible way forward that we develop in this work is the use of statistical emulators. Emulators provide an efficient statistical approximation to the model (or simulator) while quantifying the uncertainty introduced. In the emulator framework, a Gaussian process is fitted to the simulator response as a function of the simulator inputs using some training data. The emulator is essentially an interpolator of the simulator output and the response in unobserved areas is dictated by the choice of covariance structure and parameters in the Gaussian process. Suitable parameters are inferred from the data in a maximum likelihood, or Bayesian framework. Once trained, the emulator allows operations such as sensitivity analysis or uncertainty analysis to be performed at a much lower computational cost. The efficiency of emulators can be further improved by exploiting the redundancy in the simulator output through appropriate dimension reduction techniques. We demonstrate this using both Principal Component Analysis on the model output and a new reduced-rank emulator in which an optimal linear projection operator is estimated jointly with other parameters, in the context of simple low order models, such as the Lorenz 40D system. We present the application of emulators to probabilistic weather
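
    The fit-then-predict workflow described above can be sketched with an off-the-shelf Gaussian process regressor. The toy simulator, the kernel choice, and the design sizes below are placeholders; a real weather-model emulator would also include the dimension-reduction step discussed in the abstract.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        def toy_simulator(x):
            # stand-in for an expensive weather model: one scalar output per input vector
            return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

        rng = np.random.default_rng(0)
        X_train = rng.uniform(-1, 1, size=(40, 2))      # training designs (simulator inputs)
        y_train = toy_simulator(X_train)                # expensive runs, done once

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=[0.3, 0.3]) + WhiteKernel(1e-6),
                                      normalize_y=True)
        gp.fit(X_train, y_train)                        # kernel parameters by maximum likelihood

        # cheap probabilistic predictions: emulator mean and uncertainty at new inputs
        X_new = rng.uniform(-1, 1, size=(5, 2))
        mean, std = gp.predict(X_new, return_std=True)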

  10. Defect tolerance of pressurized fiber composite shell structures

    SciTech Connect

    Gotsis, P.K.; Chamis, C.C.; Minnetyan, L.

    1996-12-31

    Progressive damage and fracture of pressurized graphite/epoxy thin composite shells are evaluated via computational simulation. An integrated computer code that scales up constituent micromechanics level material properties to the structure level and accounts for all possible failure modes is used for the simulation of composite degradation under loading. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulation. Design implications with regard to defect and damage tolerance of thin walled composite cylindrical shells are examined. A procedure is outlined regarding the use of this type of information for setting quality acceptance criteria, design allowables, damage tolerance, and retirement-for-cause criteria.

  11. Probabilistic cloning of three symmetric states

    SciTech Connect

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-12-15

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  12. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  13. Probabilistic analysis of deposit liquefaction

    SciTech Connect

    Loh, C.H.; Cheng, C.R.; Wen, Y.K.

    1995-12-31

    This paper presents a procedure to perform the risk analysis for ground failure by liquefaction. The liquefaction is defined as the result of cumulative damage caused by seismic loading. The fatigue life of soil can be determined on the basis of the N-S relationship and Miner's cumulative damage law. The rain-flow method is used to count the number of cycles of stress response of the soil deposit. Finally, the probability of liquefaction is obtained by integrating over all the possible ground motion and the fragility curves of liquefaction potential.
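
    The damage-accumulation bookkeeping can be sketched as follows: given cycle ranges and counts extracted by a rainflow count of the shear-stress history, sum the per-cycle damage against an assumed power-law fatigue curve using Miner's rule. The curve constants below are placeholders, not soil parameters from the paper.

        import numpy as np

        def miner_damage(cycle_ranges, cycle_counts, A=1.0e12, m=5.0):
            """Miner's-rule damage sum for rainflow-counted cycles.
            N(S) = A * S**(-m) is an assumed power-law fatigue (N-S) curve;
            A and m are illustrative placeholders."""
            S = np.asarray(cycle_ranges, dtype=float) / 2.0     # amplitudes from ranges
            n = np.asarray(cycle_counts, dtype=float)
            N_allow = A * S ** (-m)                             # cycles to failure at each amplitude
            return float(np.sum(n / N_allow))                   # damage >= 1 taken as liquefaction here

        # example: cycles already extracted by a rainflow count of the stress history
        damage = miner_damage(cycle_ranges=[40.0, 55.0, 70.0], cycle_counts=[120, 35, 4])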

  14. Detailed probabilistic modelling of cell inactivation by ionizing radiations of different qualities: the model and its applications.

    PubMed

    Kundrát, Pavel

    2009-03-01

    The probabilistic two-stage model of cell killing by ionizing radiation makes it possible to represent both damage induction by radiation and its repair by the cell. The model properties and applications as well as possible interpretation of the underlying damage classification are discussed. Analyses of published survival data for V79 hamster cells irradiated by protons and He, C, O, and Ne ions are reported, quantifying the variations in radiation quality with increasing charge and linear energy transfer of the ions. PMID:18684633

  15. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (INSPECTION, CERTIFICATION, AND STANDARDS) United States Standards for Grades of Pecans in the Shell 1... tolerance of less than 5 percent, except that at least one pecan which is seriously damaged by live...

  16. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Application of Tolerances § 51.1405... that at least one pecan which is seriously damaged by live insects inside the shell is...

  17. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Application of Tolerances § 51.1405... that at least one pecan which is seriously damaged by live insects inside the shell is...

  18. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (INSPECTION, CERTIFICATION, AND STANDARDS) United States Standards for Grades of Pecans in the Shell 1... tolerance of less than 5 percent, except that at least one pecan which is seriously damaged by live...

  19. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Application of Tolerances § 51.1405... that at least one pecan which is seriously damaged by live insects inside the shell is...

  20. 7 CFR 51.2929 - Application of tolerances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., except that 1 decayed or 1 seriously damaged specimen may be permitted in any sample. (b) For a tolerance... specified, except that 1 decayed specimen may be permitted in any sample. Definitions...

  1. 7 CFR 51.2929 - Application of tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., except that 1 decayed or 1 seriously damaged specimen may be permitted in any sample. (b) For a tolerance... specified, except that 1 decayed specimen may be permitted in any sample. Definitions...

  2. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  3. Recent Advances in Composite Damage Mechanics

    NASA Technical Reports Server (NTRS)

    Reifsnider, Ken; Case, Scott; Iyengar, Nirmal

    1996-01-01

    The state of the art and recent developments in the field of composite material damage mechanics are reviewed, with emphasis on damage accumulation. The kinetics of damage accumulation are considered with emphasis on the general accumulation of discrete local damage events such as single or multiple fiber fractures or microcrack formation. The issues addressed include: how to define strength in the presence of widely distributed damage, and how to combine mechanical representations in order to predict the damage tolerance and life of engineering components. It is shown that a damage mechanics approach can be related to the thermodynamics of the damage accumulation processes in composite laminates subjected to mechanical loading and environmental conditions over long periods of time.

  4. MOND using a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raut, Usha

    2009-05-01

    MOND has been proposed as a viable alternative to the dark matter hypothesis. In the original MOND formulation [1], a modification of Newtonian Dynamics was brought about by postulating new equations of particle motion at extremely low accelerations, as a possible explanation for the flat rotation curves of spiral galaxies. In this paper, we attempt a different approach to modify the usual force laws by trying to link gravity with the probabilistic aspects of quantum mechanics [2]. In order to achieve this, one starts by replacing the classical notion of a continuous distance between two elementary particles with a statistical probability function, π. The gravitational force between two elementary particles then can be interpreted in terms of the probability of interaction between them. We attempt to show that such a modified gravitational force would fall off considerably more slowly than the usual inverse square law predicts, leading to revised MOND equations. In the limit that the statistical aggregate of the probabilities becomes equal to the usual inverse square law force, we recover Newtonian/Einstein gravity. [1] Milgrom, M. 1983, ApJ, 270, 365 [2] Goradia, S. 2002, .org/pdf/physics/0210040

  5. Advanced probabilistic method of development

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1987-01-01

    Advanced structural reliability methods are utilized on the Probabilistic Structural Analysis Methods (PSAM) project to provide a tool for analysis and design of space propulsion system hardware. The role of the effort at the University of Arizona is to provide reliability technology support to this project. PSAM computer programs will provide a design tool for analyzing uncertainty associated with thermal and mechanical loading, material behavior, geometry, and the analysis methods used. Specifically, reliability methods are employed to perform sensitivity analyses, to establish the distribution of a critical response variable (e.g., stress, deflection), to perform reliability assessment, and ultimately to produce a design which will minimize cost and/or weight. Uncertainties in the design factors of space propulsion hardware are described by probability models constructed using statistical analysis of data. Statistical methods are employed to produce a probability model, i.e., a statistical synthesis or summary of each design variable in a format suitable for reliability analysis and ultimately, design decisions.

  6. Probabilistic risk assessment familiarization training

    SciTech Connect

    Phillabaum, J.L.

    1989-01-01

    Philadelphia Electric Company (PECo) created a Nuclear Group Risk and Reliability Assessment Program Plan in order to focus the utilization of probabilistic risk assessment (PRA) in support of Limerick Generating Station and Peach Bottom Atomic Power Station. The continuation of a PRA program was committed by PECo to the U.S. Nuclear Regulatory Commission (NRC) prior to the issuance of an operating license for Limerick Unit 1. It is believed that increased use of PRA techniques to support activities at Limerick and Peach Bottom will enhance PECo's overall nuclear excellence. Training for familiarization with PRA is designed for attendance once by all nuclear group personnel to understand PRA and its potential effect on their jobs. The training content describes the history of PRA and how it applies to PECo's nuclear activities. Key PRA concepts serve as the foundation for the familiarization training. These key concepts are covered in all classes to facilitate an appreciation of the remaining material, which is tailored to the audience. Some of the concepts covered are comparison of regulatory philosophy to PRA techniques, fundamentals of risk/success, risk equation/risk summation, and fault trees and event trees. Building on the concepts, PRA insights and applications are then described that are tailored to the audience.

  7. Dynamical systems probabilistic risk assessment.

    SciTech Connect

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  8. Representation of probabilistic scientific knowledge

    PubMed Central

    2013-01-01

    The theory of probability is widely used in biomedical research for data analysis and modelling. In previous work the probabilities of the research hypotheses have been recorded as experimental metadata. The ontology HELO is designed to support probabilistic reasoning, and provides semantic descriptors for reporting on research that involves operations with probabilities. HELO explicitly links research statements such as hypotheses, models, laws, conclusions, etc. to the associated probabilities of these statements being true. HELO enables the explicit semantic representation and accurate recording of probabilities in hypotheses, as well as the inference methods used to generate and update those hypotheses. We demonstrate the utility of HELO on three worked examples: changes in the probability of the hypothesis that sirtuins regulate human life span; changes in the probability of hypotheses about gene functions in the S. cerevisiae aromatic amino acid pathway; and the use of active learning in drug design (quantitative structure activity relation learning), where a strategy for the selection of compounds with the highest probability of improving on the best known compound was used. HELO is open source and available at https://github.com/larisa-soldatova/HELO PMID:23734675

  9. Probabilistic Description of Stellar Ensembles

    NASA Astrophysics Data System (ADS)

    Cerviño, Miguel

    I describe the modeling of stellar ensembles in terms of probability distributions. This modeling is primarily characterized by the number of stars included in the considered resolution element, whatever its physical (stellar cluster) or artificial (pixel/IFU) nature. It provides a solution of the direct problem of characterizing probabilistically the observables of stellar ensembles as a function of their physical properties. In addition, this characterization implies that intensive properties (like color indices) are intrinsically biased observables, although the bias decreases when the number of stars in the resolution element increases. In the case of a low number of stars in the resolution element (N < 10^5), the distributions of intensive and extensive observables follow nontrivial probability distributions. Such a situation can be computed by means of Monte Carlo simulations where data mining techniques would be applied. Regarding the inverse problem of obtaining physical parameters from observational data, I show how some of the scatter in the data provides valuable physical information since it is related to the system size (and the number of stars in the resolution element). However, making use of such information requires following iterative procedures in the data analysis.

  10. Probabilistic Methodology for Estimation of Number and Economic Loss (Cost) of Future Landslides in the San Francisco Bay Region, California

    USGS Publications Warehouse

    Crovelli, Robert A.; Coe, Jeffrey A.

    2008-01-01

    The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5-6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.

  11. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
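
    A minimal numerical version of the risk integral couples a discretized hazard curve with a lognormal fragility; shifting the fragility parameters is one way to represent mainshock-induced damage. The hazard curve, fragility parameters, and the lognormal form below are illustrative assumptions, not the paper's calibrated inputs.

        import numpy as np
        from scipy.stats import lognorm

        def collapse_risk(im_grid, hazard_rate, median_capacity, beta):
            """Discretized risk integral: sum over intensity-measure (IM) bins of
            P(collapse | IM) times the rate of ground motions falling in that bin."""
            rates = np.array([hazard_rate(im) for im in im_grid])    # exceedance rates
            bin_rates = -np.diff(rates, append=0.0)                  # occurrence rate per bin
            fragility = lognorm(s=beta, scale=median_capacity).cdf(im_grid)
            return float(np.sum(fragility * bin_rates))

        im = np.linspace(0.05, 2.5, 200)                 # peak ground acceleration bins (g)
        hazard = lambda a: 1e-3 * a ** -2.0              # placeholder aftershock hazard curve
        risk_intact = collapse_risk(im, hazard, median_capacity=1.2, beta=0.5)
        risk_damaged = collapse_risk(im, hazard, median_capacity=0.8, beta=0.6)   # mainshock-degraded fragility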

  12. 7 CFR 51.2280 - Tolerances for grade defects.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Tolerances for grade defects. 51.2280 Section 51.2280... § 51.2280 Tolerances for grade defects. (a) All percentages shall be calculated on the basis of weight... (included in 1 percent very serious damage). U.S. Commercial 8 4 (included in 8 percent total defects)...

  13. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... than 3 times the tolerance specified, except that at least three defective apples may be permitted...

  14. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... than 3 times the tolerance specified, except that at least three defective apples may be permitted...

  15. 7 CFR 51.2648 - Tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades for Sweet Cherries 1 Tolerances § 51.2648... 2 —(1) U.S. No. 1. 8 percent for cherries which fail to meet the requirements for this grade... damage, including in this latter amount not more than one-half of 1 percent for cherries which...

  16. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
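
    The underlying computation can be sketched by Monte Carlo integration of the bivariate Gaussian implied by the location error ellipse over a circle of interest around the facility. The ellipse parameters, radius, and sampling approach below are illustrative; the operational tool described above integrates the same density, not necessarily by sampling.

        import numpy as np

        def strike_probability(ellipse_center, sigma_major, sigma_minor, angle_deg,
                               facility, radius, n=200_000, seed=1):
            """Probability that a lightning stroke lies within `radius` of `facility`,
            given the bivariate Gaussian implied by its location error ellipse."""
            rng = np.random.default_rng(seed)
            theta = np.radians(angle_deg)
            R = np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
            cov = R @ np.diag([sigma_major ** 2, sigma_minor ** 2]) @ R.T
            samples = rng.multivariate_normal(ellipse_center, cov, size=n)
            dist = np.hypot(samples[:, 0] - facility[0], samples[:, 1] - facility[1])
            return float(np.mean(dist <= radius))

        # e.g. stroke ellipse centered 600 m away, protection radius of 500 m around the facility
        p = strike_probability((600.0, 0.0), 250.0, 120.0, 30.0,
                               facility=(0.0, 0.0), radius=500.0)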

  17. Demonstrate Ames Laboratory capability in Probabilistic Risk Assessment (PRA)

    SciTech Connect

    Bluhm, D.; Greimann, L.; Fanous, F.; Challa, R.; Gupta, S.

    1993-07-01

    In response to the damage which occurred during the Three Mile Island nuclear accident, the Nuclear Regulatory Commission (NRC) has implemented a Probabilistic Risk Assessment (PRA) program to evaluate the safety of nuclear power facilities during events with a low probability of occurrence. The PRA can be defined as a mathematical technique to identify and rank the importance of event sequences that can lead to a severe nuclear accident. Another PRA application is the evaluation of nuclear containment buildings due to earthquakes. In order to perform a seismic PRA, the two conditional probabilities of ground motion and of structural failure of the different components given a specific earthquake are first studied. The first of these is termed the probability of exceedance and the second the seismic fragility analysis. The seismic fragility analysis is then related to the ground motion measured in terms of "g" to obtain a plant level fragility curve.

  18. Advanced Seismic Probabilistic Risk Assessment Demonstration Project Plan

    SciTech Connect

    Justin Coleman

    2014-09-01

    Idaho National Laboratories (INL) has an ongoing research and development (R&D) project to remove excess conservatism from seismic probabilistic risk assessments (SPRA) calculations. These risk calculations should focus on providing best estimate results, and associated insights, for evaluation and decision-making. This report presents a plan for improving our current traditional SPRA process using a seismic event recorded at a nuclear power plant site, with known outcomes, to improve the decision making process. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismic induced core damage event. However, in general this approach has been conservative, and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility).

  19. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    SciTech Connect

    Rabiti, Cristian; Alfonsi, Andrea; Mandelli, Diego; Cogliati, Joshua; Kinoshita, Robert

    2014-06-01

    RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) path-lead [2] under the Department Of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular this task is focused on the synergetic development with the RELAP-7 [4] code to advance the state of the art on the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU/hour). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide an adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately the Monte Carlo approach is ineffective with a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in performing the assessment of the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling

  20. Probabilistic modeling of condition-based maintenance strategies and quantification of its benefits for airliners

    NASA Astrophysics Data System (ADS)

    Pattabhiraman, Sriram

    Airplane fuselage structures are designed with the concept of damage tolerance, wherein small damage is allowed to remain on the airplane, and damage that would otherwise affect the safety of the structure is repaired. Damage critical to the safety of the fuselage is repaired by scheduling maintenance at pre-determined intervals. Scheduling maintenance is an interesting trade-off between damage tolerance and cost. Tolerance of larger damage would require less frequent maintenance and hence a lower cost to maintain a certain level of reliability. Alternatively, condition-based maintenance techniques have been developed using on-board sensors, which track damage continuously and request maintenance only when the damage size crosses a particular threshold. This permits tolerance of larger damage than scheduled maintenance, leading to savings in cost. This work quantifies the savings of condition-based maintenance over scheduled maintenance. The work also quantifies converting the cost savings into weight savings. Structural health monitoring will need time to establish itself as a stand-alone system for maintenance, due to concerns about its diagnostic accuracy and reliability. This work therefore also investigates the effect of synchronizing the structural health monitoring system with scheduled maintenance. On-board SHM equipment is used to skip structural airframe maintenance (a subset of scheduled maintenance) whenever it is deemed unnecessary, while maintaining a desired level of structural safety. The work also predicts the necessary maintenance for a fleet of airplanes, based on the current damage status of the airplanes. The work also analyses the possibility of false alarms, wherein maintenance is requested with no critical damage on the airplane, and uses SHM as a tool to identify lemons in a fleet of airplanes. Lemons are those airplanes that would warrant more maintenance trips than the average behavior of the fleet.

  1. Validation of seismic probabilistic risk assessments of nuclear power plants

    SciTech Connect

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.

  2. A probabilistic solution of robust H∞ control problem with scaled matrices

    NASA Astrophysics Data System (ADS)

    Xie, R.; Gong, J. Y.

    2016-07-01

    This paper addresses the robust H∞ control problem with scaled matrices. It is difficult to find a global optimal solution for this non-convex optimisation problem. A probabilistic solution, which can achieve globally optimal robust performance within any pre-specified tolerance, is obtained by using the proposed method based on a randomised algorithm. In the proposed method, the scaled H∞ control problem is divided into two parts: (1) assuming the scaled matrices are random variables, the scaled H∞ control problem is converted to a convex optimisation problem for each fixed sample of the scaled matrix, and an optimal solution corresponding to that fixed sample is obtained; (2) a probabilistic optimal solution is obtained by using the randomised algorithm based on a finite number N of optimal solutions obtained in part (1). The analysis shows that the worst-case complexity of the proposed method is polynomial.
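
    The two-part scheme can be expressed as a short randomized-algorithm skeleton: sample the scaled matrices, solve the resulting convex sub-problem for each sample, and return the candidate with the best empirical worst-case performance. The callables solve_scaled_hinf, sample_scaling, and performance are hypothetical placeholders for the problem-specific pieces; this is a sketch of the structure described in the abstract, not the paper's algorithm.

        import numpy as np

        def probabilistic_design(solve_scaled_hinf, sample_scaling, performance,
                                 n_samples=100, seed=0):
            """Part (1): for each random sample of the scaling matrices, solve the
            (now convex) scaled H-infinity sub-problem and keep the controller.
            Part (2): among the N candidates, return the one with the best empirical
            worst-case performance over the same sample set."""
            rng = np.random.default_rng(seed)
            scalings = [sample_scaling(rng) for _ in range(n_samples)]
            candidates = [solve_scaled_hinf(s) for s in scalings]            # part (1)
            worst = [max(performance(c, s) for s in scalings) for c in candidates]
            best = int(np.argmin(worst))                                     # part (2)
            return candidates[best], worst[best]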

  3. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment.
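
    The per-pixel calculation described above amounts to marginalising a displacement model over the bins of the probabilistic seismic hazard curve. The sketch below assumes a lognormal displacement distribution conditioned on peak ground acceleration; the hazard mass function and the displacement regression are placeholders, not the calibrated western Oregon inputs.

        import numpy as np
        from math import erf, log, sqrt

        def exceedance_probability(pga_bins, pga_probs, disp_given_pga, threshold):
            """P(D > d) = sum_k P(D > d | PGA_k) * P(PGA_k), with a lognormal
            displacement distribution conditioned on each PGA bin."""
            p_exceed = 0.0
            for pga, p_bin in zip(pga_bins, pga_probs):
                median, beta = disp_given_pga(pga)       # median displacement (m), log-std
                if median <= 0.0:
                    continue
                z = (log(threshold) - log(median)) / beta
                p_exceed += p_bin * 0.5 * (1.0 - erf(z / sqrt(2.0)))
            return p_exceed

        pga = np.linspace(0.05, 1.5, 30)
        probs = np.exp(-3.0 * pga)
        probs /= probs.sum()                             # placeholder hazard mass function
        regress = lambda a: (0.8 * a ** 2.0, 0.7)        # placeholder displacement regression (m)
        p_03m = exceedance_probability(pga, probs, regress, threshold=0.3)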

  4. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability vs time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability vs time graphs that were in line with aerospace industry norms.

  5. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBC and human lymphocytes has been used to differentiate T cells from B cells. Rosetting assay is commonly used to determine the interaction of Fc γ-receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments of rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10^−5 μm^4 for FcγRIII-IgG interaction, 4.66 × 10^−3 μm^4 for P-selectin-PSGL-1 interaction, and 0.94 × 10^−3 μm^4 for L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
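
    The stated proportionality (mean rosette size proportional to the apparent two-dimensional affinity and the site densities) suggests a simple back-of-the-envelope estimator: divide the observed mean size by the product of the site densities and a proportionality constant. The constant and the sample values below are illustrative assumptions, not the published fit.

        import numpy as np

        def apparent_affinity(rosette_sizes, receptor_density, ligand_density, prop_const=1.0):
            """Back out an apparent 2-D affinity from the mean rosette size, assuming
            <n> = prop_const * m_r * m_l * Ka_2d (the stated proportionality);
            prop_const absorbs contact area and other geometric factors."""
            mean_size = float(np.mean(rosette_sizes))
            ka_2d = mean_size / (prop_const * receptor_density * ligand_density)
            return mean_size, ka_2d

        sizes = np.array([0, 1, 2, 1, 3, 0, 2, 4, 1, 2])   # rosetting RBCs per target cell
        mean_n, ka = apparent_affinity(sizes, receptor_density=50.0, ligand_density=30.0)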

  6. Neural representation of probabilistic information.

    PubMed

    Barber, M J; Clark, J W; Anderson, C H

    2003-08-01

    It has been proposed that populations of neurons process information in terms of probability density functions (PDFs) of analog variables. Such analog variables range, for example, from target luminance and depth on the sensory interface to eye position and joint angles on the motor output side. The requirement that analog variables must be processed leads inevitably to a probabilistic description, while the limited precision and lifetime of the neuronal processing units lead naturally to a population representation of information. We show how a time-dependent probability density ρ(x; t) over variable x, residing in a specified function space of dimension D, may be decoded from the neuronal activities in a population as a linear combination of certain decoding functions φ_i(x), with coefficients given by the N firing rates a_i(t) (generally with D < N). We show how the neuronal encoding process may be described by projecting a set of complementary encoding functions φ̂_i(x) on the probability density ρ(x; t), and passing the result through a rectifying nonlinear activation function. We show how both encoders φ̂_i(x) and decoders φ_i(x) may be determined by minimizing cost functions that quantify the inaccuracy of the representation. Expressing a given computation in terms of manipulation and transformation of probabilities, we show how this representation leads to a neural circuit that can carry out the required computation within a consistent Bayesian framework, with the synaptic weights being explicitly generated in terms of encoders, decoders, conditional probabilities, and priors. PMID:14511515
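
    The encode/decode cycle can be sketched on a one-dimensional variable: project a target density onto encoding functions and rectify to obtain firing rates, then reconstruct the density as a linear combination of decoding functions weighted by those rates. Using identical Gaussian bumps for encoders and decoders is a simplifying assumption of this sketch; the paper derives them by minimizing representation-error cost functions.

        import numpy as np

        def gaussian_bumps(x, centers, width):
            # identical Gaussian bumps used here for both encoders and decoders (an
            # assumption; optimal encoders and decoders generally differ)
            return np.exp(-0.5 * ((x[None, :] - centers[:, None]) / width) ** 2)

        x = np.linspace(-1.0, 1.0, 400)
        dx = x[1] - x[0]
        centers = np.linspace(-1.0, 1.0, 12)            # N = 12 neurons
        phi = gaussian_bumps(x, centers, 0.15)          # decoding functions phi_i(x)
        phi_hat = gaussian_bumps(x, centers, 0.15)      # encoding functions (assumed equal)

        rho = np.exp(-0.5 * ((x - 0.3) / 0.1) ** 2)     # target density rho(x) at one time
        rho /= rho.sum() * dx

        # encode: project rho onto the encoders, then rectify to get firing rates a_i
        a = np.maximum((phi_hat * rho[None, :]).sum(axis=1) * dx, 0.0)

        # decode: reconstruct the density as a linear combination of the decoders
        rho_hat = np.maximum(a @ phi, 0.0)
        rho_hat /= rho_hat.sum() * dx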

  7. Historical Overview of Immunological Tolerance

    PubMed Central

    Schwartz, Ronald H.

    2012-01-01

    A fundamental property of the immune system is its ability to mediate self-defense with a minimal amount of collateral damage to the host. The system uses several different mechanisms to achieve this goal, which is collectively referred to as the “process of immunological tolerance.” This article provides an introductory historical overview to these various mechanisms, which are discussed in greater detail throughout this collection, and then briefly describes what happens when this process fails, a state referred to as “autoimmunity.” PMID:22395097

  8. Can crops tolerate acid rain

    SciTech Connect

    Kaplan, J.K.

    1989-11-01

    This brief article describes work by scientists at the ARS Air Quality-Plant Growth and Development Laboratory in Raleigh, North Carolina, that indicates little damage to crops as a result of acid rain. In studies with simulated acid rain and 216 exposed varieties of 18 crops, there were no significant injuries nor was there reduced growth in most species. Results of chronic and acute exposures were correlated in sensitive tomato and soybean plants and in tolerant winter wheat and lettuce plants. These results suggest that 1-hour exposures could be used in the future to screen varieties for sensitivity to acid rain.

  9. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 bn in 2010 and US$ 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1km x 1km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to

  10. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321

  11. Learning probabilistic document template models via interaction

    NASA Astrophysics Data System (ADS)

    Ahmadullin, Ildus; Damera-Venkata, Niranjan

    2013-03-01

    Document aesthetics measures are key to automated document composition. Recently we presented a probabilistic document model (PDM) which is a micro-model for document aesthetics based on a probabilistic modeling of designer choice in document design. The PDM model comes with efficient layout synthesis algorithms once the aesthetic model is defined. A key element of this approach is an aesthetic prior on the parameters of a template encoding aesthetic preferences for template parameters. Parameters of the prior were required to be chosen empirically by designers. In this work we show how probabilistic template models (and hence the PDM cost function) can be learnt directly by observing a designer making design choices in composing sample documents. From such training data our learning approach can learn a quality measure that can mimic some of the design tradeoffs a designer makes in practice.

  12. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.

  13. Expert system development for probabilistic load simulation

    NASA Technical Reports Server (NTRS)

    Ho, H.; Newell, J. F.

    1991-01-01

    A knowledge-based system, LDEXPT, using the intelligent data base paradigm was developed for the Composite Load Spectra (CLS) project to simulate the probabilistic loads of a space propulsion system. The knowledge base approach provides a systematic framework for organizing the load information and facilitates the coupling of numerical processing and symbolic (information) processing. It provides an incremental development environment for building generic probabilistic load models and bookkeeping the associated load information. A large volume of load data is stored in the data base and can be retrieved and updated by a built-in data base management system. The data base system standardizes the data storage and retrieval procedures. It helps maintain data integrity and avoid data redundancy. The intelligent data base paradigm provides ways to build expert system rules for shallow and deep reasoning and thus provides expert knowledge to help users obtain the required probabilistic load spectra.

  14. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  15. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup, to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime and to optimize the scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnostics of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques to modify filtering algorithms by utilizing additional data sources to improve the filter's response to this problem. We utilized a reliability degradation database that can be constructed from plant specific operational experience and test and maintenance reports to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities with the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis-Hastings algorithm, which is a well-known Markov chain Monte Carlo method (MCMC). This multiple hypothesis testing
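
    A minimal version of the filtering idea is sketched below: a bootstrap particle filter tracks a parameter that may jump abruptly from its nominal value to a degraded one, with a small probability of proposing a jump (drawn from a reliability-informed prior) at each step to counter sample impoverishment. The toy measurement model and all numerical values are placeholders, not the plant model or the multiple-hypothesis machinery described above.

        import numpy as np

        rng = np.random.default_rng(0)

        # toy system: observable y = theta + noise, theta jumps from nominal to degraded at t = 60
        T, jump_at = 120, 60
        theta_true = np.where(np.arange(T) < jump_at, 1.0, 0.6)
        y = theta_true + rng.normal(0.0, 0.05, size=T)

        # bootstrap particle filter with an abrupt-change proposal
        n_p = 2000
        particles = np.full(n_p, 1.0) + rng.normal(0, 0.02, n_p)
        estimate = np.empty(T)
        for t in range(T):
            # proposal: small random walk, plus a 2% chance of jumping to a
            # degraded-mode value drawn from an assumed reliability-informed prior
            jump = rng.random(n_p) < 0.02
            particles = np.where(jump, rng.uniform(0.4, 0.9, n_p),
                                 particles + rng.normal(0, 0.01, n_p))
            w = np.exp(-0.5 * ((y[t] - particles) / 0.05) ** 2)
            w /= w.sum()
            estimate[t] = np.sum(w * particles)
            idx = rng.choice(n_p, size=n_p, p=w)         # multinomial resampling
            particles = particles[idx]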

  16. A Probabilistic Cell Tracking Algorithm

    NASA Astrophysics Data System (ADS)

    Steinacker, Reinhold; Mayer, Dieter; Leiding, Tina; Lexer, Annemarie; Umdasch, Sarah

    2013-04-01

    The research described below was carried out during the EU-Project Lolight - development of a low cost, novel and accurate lightning mapping and thunderstorm (supercell) tracking system. The Project aims to develop a small-scale tracking method to determine and nowcast characteristic trajectories and velocities of convective cells and cell complexes. The results of the algorithm will provide a higher accuracy than current locating systems distributed on a coarse scale. Input data for the developed algorithm are two temporally separated lightning density fields. Additionally, a Monte Carlo method minimizing a cost function is utilized, which leads to a probabilistic forecast for the movement of thunderstorm cells. In the first step the correlation coefficients between the first and the second density field are computed. To this end, the first field is shifted by all shifting vectors which are physically allowed. The maximum length of each vector is determined by the maximum possible speed of thunderstorm cells and the difference in time for both density fields. To eliminate ambiguities in determination of directions and velocities, the so-called Random Walker of the Monte Carlo process is used. Using this method a grid point is selected at random. Moreover, one vector out of all predefined shifting vectors is suggested - also at random but with a probability that is related to the correlation coefficient. If this exchange of shifting vectors reduces the cost function, the new direction and velocity are accepted. Otherwise it is discarded. This process is repeated until the change of cost functions falls below a defined threshold. The Monte Carlo run gives information about the percentage of accepted shifting vectors for all grid points. In the course of the forecast, amplifications of cell density are permitted. For this purpose, intensity changes between the investigated areas of both density fields are taken into account. Knowing the direction and speed of thunderstorm
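
    The accept/reject step of the Random Walker can be sketched with a simple local mismatch cost between the shifted first lightning-density field and the second field: pick a random grid point and a random candidate shift, and accept the new shift only if it lowers the cost. The squared-difference cost and the uniform proposal below are illustrative stand-ins for the correlation-based cost function and correlation-weighted proposal described in the abstract.

        import numpy as np

        def random_walker_tracking(field1, field2, max_shift=3, n_iter=20000, seed=0):
            """Assign one shift vector per grid point by Monte Carlo minimisation of a
            local mismatch cost between the shifted first density field and the second."""
            rng = np.random.default_rng(seed)
            ny, nx = field1.shape
            shifts = [(di, dj) for di in range(-max_shift, max_shift + 1)
                                for dj in range(-max_shift, max_shift + 1)]
            assign = np.zeros((ny, nx), dtype=int)          # index into `shifts` per pixel

            def local_cost(i, j, k):
                di, dj = shifts[k]
                i2, j2 = i + di, j + dj
                if not (0 <= i2 < ny and 0 <= j2 < nx):
                    return np.inf
                return (field1[i, j] - field2[i2, j2]) ** 2

            for _ in range(n_iter):
                i, j = rng.integers(ny), rng.integers(nx)    # random grid point
                k_new = rng.integers(len(shifts))            # random candidate shift
                if local_cost(i, j, k_new) < local_cost(i, j, assign[i, j]):
                    assign[i, j] = k_new                     # accept only if the cost decreases
            return np.array([shifts[k] for k in assign.ravel()]).reshape(ny, nx, 2)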

  17. Increased size of cotton root system does not impart tolerance to Meloidogyne incognita

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Plant tolerance or intolerance to parasitic nematodes represent a spectrum describing the degree of damage inflicted by the nematode on the host plant. Tolerance is typically measured in terms of yield suppression. Instances of plant tolerance to nematodes have been documented in some crops, inclu...

  18. 7 CFR 51.1215 - Application of tolerances to individual packages.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Grades of Peaches Application of Tolerances § 51.1215 Application of tolerances to individual packages... any lot shall have not more than double the tolerance specified, except that at least one peach which... percentage of defects: Provided, That not more than one peach which is seriously damaged by insects...

  19. 76 FR 75435 - Fatigue Tolerance Evaluation of Metallic Structures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-02

    .... It was generally agreed, based on in-service experience, that not accounting for damage could be a... intervals based on damage growth, the residual strength evaluation must show that the remaining structure... Federal Aviation Administration 14 CFR Part 29 RIN 2120-AJ51 Fatigue Tolerance Evaluation of...

  20. Normothermic central hypovolemia tolerance reflects hyperthermic tolerance

    PubMed Central

    Schlader, Zachary J.

    2016-01-01

    Purpose To test the hypothesis that those who are highly tolerant to lower body negative pressure (LBNP) while normothermic are also highly tolerant to this challenge while hyperthermic. Methods Sixty pairs of normothermic and hyperthermic LBNP tests to pre-syncope were evaluated. LBNP tolerance was quantified via the cumulative stress index (CSI), which is calculated as the sum of the product of the LBNP level and the duration of each level until test termination (i.e., 20 mmHg × 3 min + 30 mmHg × 3 min, etc.). CSI was compared between normothermic and hyperthermic trials. Internal and skin temperatures, heart rate, and arterial pressure were measured throughout. Results Hyperthermia reduced (P<0.001) CSI from 997 ± 437 to 303 ± 213 mmHg min. There was a positive correlation between normothermic and hyperthermic LBNP tolerance (R2 = 0.38; P<0.001). As a secondary analysis, the 20 trials with the highest LBNP tolerance while normothermic were identified (indicated as the HIGH group; CSI 1,467 ± 356 mmHg min), as were the 20 trials with the lowest normothermic tolerance (indicated as the LOW group; CSI 565 ± 166 mmHg min; P<0.001 between groups). While hyperthermia unanimously reduced CSI in both HIGH and LOW groups, in this hyperthermic condition CSI was ~threefold higher in the HIGH group (474 ± 226 mmHg min) relative to the LOW group (160 ± 115 mmHg min; P<0.001). Conclusions LBNP tolerance while hyperthermic is related to normothermic tolerance and, associated with this finding, those who have a high LBNP tolerance while normothermic remain relatively tolerant when hyperthermic. PMID:24700256
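    As a small worked example of the cumulative stress index defined above (the sum of LBNP level times the time spent at each level until termination), the snippet below computes a CSI for an assumed staged protocol and an assumed pre-syncope time; the stage levels, durations, and termination point are illustrative, not values from the study.

# Sketch: cumulative stress index (CSI) for an assumed staged LBNP protocol.
levels = [20, 30, 40, 50, 60, 70]         # mmHg, 3-minute stages (assumed protocol)
stage_minutes = 3.0
termination_min = 13.5                    # assumed pre-syncope partway through the 5th stage

csi, elapsed = 0.0, 0.0
for level in levels:
    dt = min(stage_minutes, max(0.0, termination_min - elapsed))
    csi += level * dt                     # LBNP level x time at that level
    elapsed += dt

print(f"CSI = {csi:.0f} mmHg*min")        # 20*3 + 30*3 + 40*3 + 50*3 + 60*1.5 = 510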

  1. Lactose tolerance tests

    MedlinePlus

    Hydrogen breath test for lactose tolerance ... Two common methods include the lactose tolerance blood test and the hydrogen breath test. The hydrogen breath test is the preferred method. It measures the amount of hydrogen in the air you breathe out. ...

  2. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2005-12-13

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  3. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2009-04-14

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  4. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Rigali, Mark J.; Sutaria, Manish P.; Mulligan, Anthony C.; Popovich, Dragan

    2004-03-23

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  5. Probabilistic evaluation of SSME structural components

    NASA Astrophysics Data System (ADS)

    Rajagopal, K. R.; Newell, J. F.; Ho, H.

    1991-05-01

    The application of the Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components is described. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice while simultaneously covering a wide area of structural mechanics, consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have typically been used in past deterministic analysis studies.

  6. A Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    1999-01-01

    A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.

  7. Finite element methods in probabilistic mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing Kam; Mani, A.; Belytschko, Ted

    1987-01-01

    Probabilistic methods, synthesizing the power of finite element methods with second-order perturbation techniques, are formulated for linear and nonlinear problems. Random material properties, geometric properties, and loads can be incorporated in these methods in terms of their fundamental statistics. By construction, these methods are applicable when the scale of randomness is not too large and when the probability density functions have decaying tails. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainty. Applications showing the effects of combined random fields and cyclic loading/stress reversal are studied and compared with Monte Carlo simulation results.
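    As a scalar illustration of the second-order perturbation idea mentioned above, and of the comparison against Monte Carlo simulation, the sketch below propagates one random input through an assumed response function; the function and the input statistics are hypothetical stand-ins for a finite element response.

# Sketch: second-order perturbation estimates of response statistics vs. Monte Carlo.
import numpy as np

mu, sigma = 2.0, 0.2                        # random input statistics (assumed)
g   = lambda x: 1.0 / x**2                  # assumed "response" function
dg  = lambda x: -2.0 / x**3                 # first derivative
d2g = lambda x: 6.0 / x**4                  # second derivative

mean_pert = g(mu) + 0.5 * d2g(mu) * sigma**2        # second-order estimate of the mean
std_pert  = abs(dg(mu)) * sigma                     # first-order (second-moment) std

rng = np.random.default_rng(2)
x = rng.normal(mu, sigma, 200_000)
print("perturbation mean/std:", round(mean_pert, 4), round(std_pert, 4))
print("Monte Carlo  mean/std:", round(float(g(x).mean()), 4), round(float(g(x).std()), 4))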

  8. HFIR vessel probabilistic fracture mechanics analysis

    SciTech Connect

    Cheverton, R.D.; Dickson, T.L.

    1997-01-01

    The life of the High Flux Isotope Reactor (HFIR) pressure vessel is limited by a radiation induced reduction in the material's fracture toughness. Hydrostatic proof testing and probabilistic fracture mechanics analyses are being used to meet the intent of the ASME Code, while extending the life of the vessel well beyond its original design value. The most recent probabilistic evaluation is more precise and accounts for the effects of gamma as well as neutron radiation embrittlement. This analysis confirms the earlier estimates of a permissible vessel lifetime of at least 50 EFPY (100 MW).

  9. Probabilistic assessment of uncertain adaptive hybrid composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.

  10. Impact damage of a graphite/PEEK

    SciTech Connect

    Demuts, E.

    1994-12-31

    Low-velocity non-penetrating impact has been applied to graphite/polyetheretherketone (AS4/APC-2) laminates in accordance with the USAF guidelines for designing damage tolerant primary structures. The extent of delaminations and dent depths for two lay-ups and five thicknesses at room temperature and ambient moisture conditions have been determined. Based on these findings, as well as those presented elsewhere, it may be concluded that the "softer" lay-up (40/50/10), up to about 75-ply thickness, is more damage tolerant than the "harder" lay-up (60/30/10), because within this thickness range the "softer" lay-up displays smaller dent depths, smaller delaminated areas, and higher post-impact compressive strength (PICS). For laminates thicker than 75 plies, the relative situation in delamination extent and PICS is reversed, i.e., the "harder" lay-up is more damage tolerant than the "softer" one. The test data obtained in this experimental investigation provide the amount of initial damage to be assumed for a damage tolerant design of USAF primary structures made of AS4/APC-2 graphite/PEEK. In addition, these data may serve to validate the predictive capability of appropriate analytic models.

  11. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  12. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1987-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  13. Zero Tolerance in Schools.

    ERIC Educational Resources Information Center

    Henault, Cherry

    2001-01-01

    Questions the effectiveness of the widespread use of zero-tolerance policies enacted by school boards to punish students who violate weapon and drug rules. Suggests that enforcement of zero-tolerance policies has not been equitable. Reviews proposal for alternative to zero tolerance. (PKP)

  14. Radiation Damage Workshop

    NASA Technical Reports Server (NTRS)

    Stella, P. M.

    1984-01-01

    The availability of data regarding the radiation behavior of GaAs and silicon solar cells is discussed, as well as efforts to provide sufficient information. Other materials are considered too immature for reasonable radiation evaluation. The lack of concern over possible catastrophic radiation degradation in cascade cells is a potentially serious problem. Lithium counterdoping shows potential for removing damage in irradiated P-type material, although initial efficiencies are not comparable to the current state of the art. The possibility of refining the lithium doping method to maintain high initial efficiencies, and combining it with radiation tolerant structures such as thin BSF cells or vertical junction cells, could provide a substantial improvement in EOL efficiencies. Laser annealing of junctions, either those formed by ion implantation or by diffusion, may not only improve initial cell performance but might also reduce the radiation degradation rate.

  15. NASA workshop on impact damage to composites

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.

    1991-01-01

    A compilation of slides presented at the NASA Workshop on Impact Damage to Composites held on March 19 and 20, 1991, at the Langley Research Center, Hampton, Virginia is given. The objective of the workshop was to review technology for evaluating impact damage tolerance of composite structures and identify deficiencies. Research, development, design methods, and design criteria were addressed. Actions to eliminate technology deficiencies were developed. A list of those actions and a list of attendees are also included.

  16. "Infectious" Transplantation Tolerance

    NASA Astrophysics Data System (ADS)

    Qin, Shixin; Cobbold, Stephen P.; Pope, Heather; Elliott, James; Kioussis, Dimitris; Davies, Joanna; Waldmann, Herman

    1993-02-01

    The maintenance of transplantation tolerance induced in adult mice after short-term treatment with nonlytic monoclonal antibodies to CD4 and CD8 was investigated. CD4^+ T cells from tolerant mice disabled naive lymphocytes so that they too could not reject the graft. The naive lymphocytes that had been so disabled also became tolerant and, in turn, developed the capacity to specifically disable other naive lymphocytes. This process of "infectious" tolerance explains why no further immunosuppression was needed to maintain long-term transplantation tolerance.

  17. Application of probabilistic ordinal optimization concepts to a continuous-variable probabilistic optimization problem.

    SciTech Connect

    Romero, Vicente Jose; Ayon, Douglas V.; Chen, Chun-Hung

    2003-09-01

    A very general and robust approach to solving optimization problems involving probabilistic uncertainty is through the use of Probabilistic Ordinal Optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the probabilistic merits of local design alternatives, rather than on crisp quantification of the alternatives. Thus, we simply ask the question 'Is that alternative better or worse than this one?' to some level of statistical confidence we require, not 'How much better or worse is that alternative than this one?'. In this paper we illustrate an elementary application of probabilistic ordinal concepts in a 2-D optimization problem. Two uncertain variables contribute to uncertainty in the response function. We use a simple Coordinate Pattern Search non-gradient-based optimizer to step toward the statistical optimum in the design space. We also discuss more sophisticated implementations, and some of the advantages and disadvantages versus non-ordinal approaches for optimization under uncertainty.
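    The sketch below illustrates the ordinal idea under stated assumptions: a candidate design is only judged better or worse than the incumbent from Monte Carlo samples, and a coordinate pattern search steps or contracts accordingly. The objective, the uncertainty model, and the use of a simple win-fraction threshold in place of a formal statistical confidence test are assumptions made for brevity.

# Sketch: ordinal (better/worse) comparison inside a coordinate pattern search.
import numpy as np

rng = np.random.default_rng(3)

def noisy_objective(x, n):
    """Assumed objective with two uncertain variables (additive input noise, std 0.1)."""
    u = rng.normal(0.0, 0.1, (n, 2))
    return (x[0] + u[:, 0]) ** 2 + (x[1] + u[:, 1]) ** 2

def is_better(cand, incumbent, n=400, win_threshold=0.95):
    """Ordinal test: the candidate must beat the incumbent in most paired samples."""
    wins = np.mean(noisy_objective(cand, n) < noisy_objective(incumbent, n))
    return wins >= win_threshold

x, step = np.array([2.0, -1.5]), 0.5
for _ in range(60):                                   # coordinate pattern search
    for d in (np.array([step, 0.0]), np.array([-step, 0.0]),
              np.array([0.0, step]), np.array([0.0, -step])):
        if is_better(x + d, x):
            x = x + d                                 # accept the ordinal improvement
            break
    else:
        step *= 0.5                                   # no ordinal improvement: contract
        if step < 1e-2:
            break

print("statistical optimum near:", np.round(x, 2))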

  18. Right Hemisphere Brain Damage

    MedlinePlus

    What is right hemisphere brain damage? Right hemisphere brain damage (RHD) is damage ...

  19. Advanced Test Reactor probabilistic risk assessment methodology and results summary

    SciTech Connect

    Eide, S.A.; Atkinson, S.A.; Thatcher, T.A.

    1992-01-01

    The Advanced Test Reactor (ATR) probabilistic risk assessment (PRA) Level 1 report documents a comprehensive and state-of-the-art study to establish and reduce the risk associated with operation of the ATR, expressed as a mean frequency of fuel damage. The ATR Level 1 PRA effort is unique and outstanding because of its consistent and state-of-the-art treatment of all facets of the risk study, its comprehensive and cost-effective risk reduction effort while the risk baseline was being established, and its thorough and comprehensive documentation. The PRA includes many improvements to the state-of-the-art, including the following: establishment of a comprehensive generic data base for component failures, treatment of initiating event frequencies given significant plant improvements in recent years, performance of efficient identification and screening of fire and flood events using code-assisted vital area analysis, identification and treatment of significant seismic-fire-flood-wind interactions, and modeling of large loss-of-coolant accidents (LOCAs) and experiment loop ruptures leading to direct damage of the ATR core. 18 refs.

  20. Probabilistic models for creep-fatigue in a steel alloy

    NASA Astrophysics Data System (ADS)

    Ibisoglu, Fatmagul

    In high temperature components subjected to long term cyclic operation, simultaneous creep and fatigue damage occur. A new methodology for creep-fatigue life assessment has been adopted without the need to separate creep and fatigue damage or expended life. Probabilistic models, described by hold times in tension and total strain range at temperature, have been derived based on the creep rupture behavior of a steel alloy. These models have been validated against the observed creep-fatigue life of the material, with a scatter band close to a factor of 2. Uncertainties of the creep-fatigue model parameters have been estimated with WinBUGS, an open-source Bayesian analysis software tool that uses the Markov chain Monte Carlo method to fit statistical models. Secondly, creep deformation in stress relaxation data has been analyzed. Well-performing creep equations have been validated against the observed data. The creep model with the highest goodness of fit among the validated models has been used to estimate the probability of exceedance at the 0.6% strain level for the steel alloy.

  1. Sensor Based Engine Life Calculation: A Probabilistic Perspective

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei; Chen, Philip

    2003-01-01

    It is generally known that an engine component will accumulate damage (life usage) during its lifetime of use in a harsh operating environment. The commonly used cycle count for engine component usage monitoring has an inherent range of uncertainty which can be overly costly or potentially less safe from an operational standpoint. With the advance of computer technology, engine operation modeling, and the understanding of damage accumulation physics, it is possible (and desirable) to use the available sensor information to make a more accurate assessment of engine component usage. This paper describes a probabilistic approach to quantify the effects of engine operating parameter uncertainties on the thermomechanical fatigue (TMF) life of a selected engine part. A closed-loop engine simulation with a TMF life model is used to calculate the life consumption of different mission cycles. A Monte Carlo simulation approach is used to generate the statistical life usage profile for different operating assumptions. The probabilities of failure of different operating conditions are compared to illustrate the importance of the engine component life calculation using sensor information. The results of this study clearly show that a sensor-based life cycle calculation can greatly reduce the risk of component failure as well as extend on-wing component life by avoiding unnecessary maintenance actions.

  2. Advanced neutron source reactor probabilistic flow blockage assessment

    SciTech Connect

    Ramsey, C.T.

    1995-08-01

    The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights to the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.

  3. Probabilistic Shock Initiation Thresholds and QMU Applications

    SciTech Connect

    Hrousis, C A; Gresshoff, M; Overturf, G E

    2009-04-10

    The Probabilistic Threshold Criterion (PTC) Project at LLNL develops phenomenological criteria for establishing margin of safety or performance margin on high explosive (HE) initiation in the high-speed impact regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PTC approaches start with the functional form of James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data. Recent work includes application of the PTC methodology to safety assessments involving a donor charge detonation and the need for assessment of a nearby acceptor charge's response, as well as flyer-acceptor configurations, with and without barriers. Results to date are in agreement with other less formal assessment protocols, and indicate a promising use for PTC-based assessments. In particular, there is interest in this approach because it supports the Quantified Margins and Uncertainties (QMU) framework for establishing confidence in the performance and/or safety of an HE system.

  4. Pigeons' Discounting of Probabilistic and Delayed Reinforcers

    ERIC Educational Resources Information Center

    Green, Leonard; Myerson, Joel; Calvert, Amanda L.

    2010-01-01

    Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…

  5. A probabilistic approach to composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
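    A minimal sketch in the same spirit is shown below: Monte Carlo simulation of ply-level stiffness from random constituent properties, here using the elementary rule-of-mixtures relations as a stand-in for the micromechanics equations; all distributions are illustrative assumptions rather than values from the record.

# Sketch: Monte Carlo micromechanics for a unidirectional graphite/epoxy ply.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
Ef = rng.normal(230e9, 10e9, n)      # fiber modulus [Pa] (assumed graphite-like)
Em = rng.normal(3.5e9, 0.3e9, n)     # matrix modulus [Pa] (assumed epoxy-like)
vf = rng.normal(0.60, 0.03, n)       # fiber volume fraction (assumed)

E1 = vf * Ef + (1.0 - vf) * Em                 # longitudinal modulus, rule of mixtures
E2 = 1.0 / (vf / Ef + (1.0 - vf) / Em)         # transverse modulus, inverse rule

print(f"E1: mean {E1.mean()/1e9:.1f} GPa, cov {E1.std()/E1.mean():.3f}")
print(f"E2: mean {E2.mean()/1e9:.1f} GPa, cov {E2.std()/E2.mean():.3f}")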

  6. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  7. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.

  8. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  9. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.

  10. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  11. A Probabilistic Model of Melody Perception

    ERIC Educational Resources Information Center

    Temperley, David

    2008-01-01

    This study presents a probabilistic model of melody perception, which infers the key of a melody and also judges the probability of the melody itself. The model uses Bayesian reasoning: For any "surface" pattern and underlying "structure," we can infer the structure maximizing P(structure [vertical bar] surface) based on knowledge of P(surface,…

  12. Probabilistic Grammars for Natural Languages. Psychology Series.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The purpose of this paper is to define the framework within which empirical investigations of probabilistic grammars can take place and to sketch how this attack can be made. The full presentation of empirical results will be left to other papers. In the detailed empirical work, the author has depended on the collaboration of E. Gammon and A.…

  13. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. The six parameters are space-to-chord ratio (SBYC), stagger angle

  14. Stochastic damage evolution in textile laminates

    NASA Technical Reports Server (NTRS)

    Dzenis, Yuris A.; Bogdanovich, Alexander E.; Pastore, Christopher M.

    1993-01-01

    A probabilistic model utilizing random material characteristics to predict damage evolution in textile laminates is presented. The model is based on a division of each ply into two sublaminas consisting of cells. The probability of cell failure is calculated using stochastic function theory and a maximal strain failure criterion. Three modes of failure, i.e., fiber breakage, matrix failure in the transverse direction, and matrix or interface shear cracking, are taken into account. Computed failure probabilities are utilized in reducing cell stiffness based on the mesovolume concept. A numerical algorithm is developed to predict the damage evolution and deformation history of textile laminates. The effect of scatter in fiber orientation on cell properties is discussed. The weave influence on damage accumulation is illustrated with the example of a Kevlar/epoxy laminate.
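    A highly simplified sketch of the cell-failure bookkeeping described above follows: each cell is assigned a random failure strain, a maximum-strain criterion flags failed cells as the applied strain grows, and failed cells have their stiffness knocked down. The distributions, the knockdown factor, and the single-strain-component loading are assumptions for illustration only.

# Sketch: random cell strengths, maximum-strain failure, stiffness knockdown.
import numpy as np

rng = np.random.default_rng(5)
n_cells = 1000
E0 = 70.0                                       # initial cell stiffness [GPa] (assumed)
eps_fail = rng.normal(0.012, 0.0015, n_cells)   # random failure strain per cell (assumed)

stiffness = np.full(n_cells, E0)
for eps in np.linspace(0.0, 0.016, 40):         # monotonically increasing applied strain
    failed = eps >= eps_fail                    # maximum-strain failure criterion
    stiffness[failed] = 0.05 * E0               # knock down the stiffness of failed cells
    p_fail = failed.mean()

print(f"final cell failure probability: {p_fail:.2f}")
print(f"mean residual cell stiffness:   {stiffness.mean():.1f} GPa")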

  15. Wing Damage Effects on Dragonfly's maneuverability

    NASA Astrophysics Data System (ADS)

    Ning, Zhe; Gai, Kuo; Zeyghami, Samane; Dong, Haibo; Flow Simulation Research Group (FSRG) Team

    2011-11-01

    In this work, how insect flight behavior contributes to adaptability to limited performance conditions is examined through a combined experimental and computational study. High-speed photogrammetry is used to collect data on dragonflies' takeoffs with intact wings and with wings damaged along the chord and along the span separately. The effects of spanwise and chordwise damage to the dragonfly wing are then investigated. Results show that the two types of damage have different effects on wing and body kinematics and on the maneuverability merit. Two theories are introduced to explain the wing-damage tolerance exhibited in dragonfly flight. This work is supported by NSF CBET-1055949.

  16. Acid tolerance in amphibians

    SciTech Connect

    Pierce, B.A.

    1985-04-01

    Studies of amphibian acid tolerance provide information about the potential effects of acid deposition on amphibian communities. Amphibians as a group appear to be relatively acid tolerant, with many species suffering increased mortality only below pH 4. However, amphibians exhibit much intraspecific variation in acid tolerance, and some species are sensitive to even low levels of acidity. Furthermore, nonlethal effects, including depression of growth rates and increases in developmental abnormalities, can occur at higher pH.

  17. Tolerance to deer herbivory and resistance to insect herbivores in the common evening primrose (Oenothera biennis).

    PubMed

    Puentes, A; Johnson, M T J

    2016-01-01

    The evolution of plant defence in response to herbivory will depend on the fitness effects of damage, availability of genetic variation and potential ecological and genetic constraints on defence. Here, we examine the potential for evolution of tolerance to deer herbivory in Oenothera biennis while simultaneously considering resistance to natural insect herbivores. We examined (i) the effects of deer damage on fitness, (ii) the presence of genetic variation in tolerance and resistance, (iii) selection on tolerance, (iv) genetic correlations with resistance that could constrain evolution of tolerance and (v) plant traits that might predict defence. In a field experiment, we simulated deer damage occurring early and late in the season, recorded arthropod abundances, flowering phenology and measured growth rate and lifetime reproduction. Our study showed that deer herbivory has a negative effect on fitness, with effects being more pronounced for late-season damage. Selection acted to increase tolerance to deer damage, yet there was low and nonsignificant genetic variation in this trait. In contrast, there was substantial genetic variation in resistance to insect herbivores. Resistance was genetically uncorrelated with tolerance, whereas positive genetic correlations in resistance to insect herbivores suggest there exists diffuse selection on resistance traits. In addition, growth rate and flowering time did not predict variation in tolerance, but flowering phenology was genetically correlated with resistance. Our results suggest that deer damage has the potential to exert selection because browsing reduces plant fitness, but limited standing genetic variation in tolerance is expected to constrain adaptive evolution in O. biennis. PMID:26395768

  18. Sulfur tolerant anode materials

    SciTech Connect

    Not Available

    1987-02-01

    The goal of this program is the development of a molten carbonate fuel cell (MCFC) anode which is more tolerant of sulfur contaminants in the fuel than the current state-of-the-art nickel-based anode structures. This program addresses two different but related aspects of the sulfur contamination problem. The primary aspect is concerned with the development of a sulfur tolerant electrocatalyst for the fuel oxidation reaction. A secondary issue is the development of a sulfur tolerant water-gas-shift reaction catalyst and an investigation of potential steam reforming catalysts which also have some sulfur tolerant capabilities. These two aspects are being addressed as two separate tasks.

  19. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes have prompted the need to assess the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage-state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty of capacity and demand in the fragility functions. In the second part, a robust analytical procedure for the development of seismic-hazard-compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic-hazard-compatible vulnerability functions in terms of a structure-independent intensity measure (e.g., spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for
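    The snippet below illustrates damage-state fragility functions of the kind described above, using the common lognormal form expressed in terms of spectral displacement; the damage-state medians and dispersions are purely illustrative and are not values from the study.

# Sketch: lognormal damage-state fragility curves vs. spectral displacement.
import math

def fragility(sd, median, beta):
    """P(damage >= state | spectral displacement sd), lognormal fragility form."""
    return 0.5 * (1.0 + math.erf(math.log(sd / median) / (beta * math.sqrt(2.0))))

damage_states = {"slight": (5.0, 0.6), "moderate": (12.0, 0.6),
                 "extensive": (25.0, 0.7), "complete": (45.0, 0.7)}   # median [mm], dispersion

sd = 20.0                                        # spectral displacement demand [mm] (assumed)
for state, (median, beta) in damage_states.items():
    print(f"P(>= {state:9s} | Sd = {sd:.0f} mm) = {fragility(sd, median, beta):.2f}")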

  20. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.