Sample records for probabilistic damage tolerance

  1. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are in terms of probability cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  2. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  3. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  4. Design of Composite Structures for Reliability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1999-01-01

    A summary of research conducted during the first year is presented. The research objectives were sought by conducting two tasks: (1) investigation of probabilistic design techniques for reliability-based design of composite sandwich panels, and (2) examination of strain energy density failure criterion in conjunction with response surface methodology for global-local design of damage tolerant helicopter fuselage structures. This report primarily discusses the efforts surrounding the first task and provides a discussion of some preliminary work involving the second task.

  5. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.

  6. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken, on the one hand, via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
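
    The record above describes replacing a deterministic stage-damage function with an ensemble of bagged decision trees whose per-tree predictions form a damage distribution. A minimal sketch of that idea follows, using scikit-learn's BaggingRegressor on synthetic data; the feature names, values, and model settings are illustrative assumptions, not the BT-FLEMO implementation.

```python
# Hypothetical sketch of a bagged-decision-tree flood loss model in the
# spirit of BT-FLEMO: each tree in the ensemble returns a loss estimate,
# and the spread across trees gives a distribution of estimated damage.
# Feature names and the synthetic data below are illustrative only.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic training data: water depth [m], flow velocity [m/s],
# building footprint [m^2], precaution indicator -> relative loss [0..1]
X = rng.uniform([0.0, 0.0, 50.0, 0.0], [3.0, 2.0, 400.0, 1.0], size=(500, 4))
y = np.clip(0.25 * X[:, 0] + 0.05 * X[:, 1] - 0.1 * X[:, 3]
            + rng.normal(0.0, 0.05, 500), 0.0, 1.0)

model = BaggingRegressor(DecisionTreeRegressor(max_depth=6),
                         n_estimators=100, random_state=0).fit(X, y)

# One land-use unit: collect the per-tree predictions as an empirical
# damage distribution instead of a single deterministic value.
unit = np.array([[1.2, 0.4, 180.0, 0.0]])
per_tree = np.array([tree.predict(unit[:, feats])[0]
                     for tree, feats in zip(model.estimators_,
                                            model.estimators_features_)])
print(f"mean loss ratio {per_tree.mean():.2f}, "
      f"5-95% range [{np.quantile(per_tree, 0.05):.2f}, "
      f"{np.quantile(per_tree, 0.95):.2f}]")
```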

  7. Damage Tolerance of Composites

    NASA Technical Reports Server (NTRS)

    Hodge, Andy

    2007-01-01

    Fracture control requirements have been developed to address damage tolerance of composites for manned space flight hardware. The requirements provide the framework for critical and noncritical hardware assessment and testing. The need for damage threat assessments, impact damage protection plans, and nondestructive evaluation is also addressed. Hardware intended to be damage tolerant has extensive coupon, sub-element, and full-scale testing requirements in line with the Building Block Approach concept from MIL-HDBK-17, the Department of Defense Composite Materials Handbook.

  8. Damage Tolerance of Large Shell Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Chamis, C. C.

    1999-01-01

    Progressive damage and fracture of large shell structures is investigated. A computer model is used for the assessment of structural response, progressive fracture resistance, and defect/damage tolerance characteristics. Critical locations of a stiffened conical shell segment are identified. Defective and defect-free computer models are simulated to evaluate structural damage/defect tolerance. Safe pressurization levels are assessed for the retention of structural integrity in the presence of damage/defects. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Damage propagation and burst pressures for defective and defect-free shells are compared to evaluate damage tolerance. Design implications with regard to defect and damage tolerance of a large steel pressure vessel are examined.

  9. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
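
    A minimal sketch of the two damage estimates described above, under standard narrow-band random-vibration assumptions: (a) Miner's rule applied to cycles binned from a Rayleigh amplitude distribution truncated at 3*sigma, and (b) the closed-form Gamma-function result obtained by integrating Miner's rule over all amplitudes. The S-N parameters and stress rms below are hypothetical, not values from FATIG.

```python
# Sketch of the two fatigue-damage estimates described above, assuming a
# narrow-band Gaussian stress with rms value sigma and an S-N curve of the
# form N_f = A / S**b (parameters here are hypothetical, not from FATIG).
import numpy as np
from math import gamma

sigma = 20.0e6      # stress rms [Pa]
n_cycles = 1.0e7    # total number of stress cycles
b, A = 4.0, 1.0e40  # S-N curve: N_f = A / S**b

# (a) Miner's rule with cycles binned from a Rayleigh amplitude
#     distribution, truncated at 3*sigma.
edges = np.linspace(0.0, 3.0 * sigma, 31)
centers = 0.5 * (edges[:-1] + edges[1:])
cdf = 1.0 - np.exp(-edges**2 / (2.0 * sigma**2))    # Rayleigh CDF
frac = np.diff(cdf)                                  # cycle fraction per bin
damage_binned = np.sum(n_cycles * frac / (A / centers**b))

# (b) Closed-form result from integrating Miner's rule over all
#     amplitudes: D = (N/A) * (sqrt(2)*sigma)**b * Gamma(1 + b/2).
damage_integral = (n_cycles / A) * (np.sqrt(2.0) * sigma) ** b * gamma(1.0 + b / 2.0)

print(f"binned (<=3*sigma): D = {damage_binned:.3e}")
print(f"closed form       : D = {damage_integral:.3e}")
```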

  10. 77 FR 4890 - Damage Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures, and Damage Tolerance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ...-AJ52, 2120-AJ51 Damage Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures, and Damage... Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures'' (76 FR 74655), published December 1... December 2, 2011. In the ``Composite Rotorcraft Structures'' rule, the FAA amended its regulations to...

  11. Damage Tolerance Assessment Branch

    NASA Technical Reports Server (NTRS)

    Walker, James L.

    2013-01-01

    The Damage Tolerance Assessment Branch evaluates the ability of a structure to perform reliably throughout its service life in the presence of a defect, crack, or other form of damage. Such assessment is fundamental to the use of structural materials and requires an integral blend of materials engineering, fracture testing and analysis, and nondestructive evaluation. The vision of the Branch is to increase the safety of manned space flight by improving fracture control and the associated nondestructive evaluation processes through the development and application of standards, guidelines, and advanced test and analytical methods. The Branch also strives to assist with and solve non-aerospace-related NDE and damage tolerance problems, providing consultation, prototyping, and inspection services.

  12. Damage tolerance assessment handbook. Volume 2 : airframe damage tolerance evaluation

    DOT National Transportation Integrated Search

    1999-02-01

    The handbook is presented in two volumes. Volume I introduces the damage tolerance concept with an historical perspective followed by the fundamentals of fracture mechanics and fatigue crack propagation. Various fracture criteria and crack growth rul...

  13. Damage-Survivable and Damage-Tolerant Laminated Composites with Optimally Placed Piezoelectric Layers

    DTIC Science & Technology

    1992-11-13

    AD-A269 879. Damage-Survivable and Damage-Tolerant Laminated Composites with Optimally Placed Piezoelectric Layers, Final Report No. 1. Personal authors: S. P. Joshi, W. S. Chan. The main objective of the research is to assure that the embedded sensors/actuators in a smart laminated composite structure are damage

  14. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  15. Damage tolerance certification of a fighter horizontal stabilizer

    NASA Astrophysics Data System (ADS)

    Huang, Jia-Yen; Tsai, Ming-Yang; Chen, Jong-Sheng; Ong, Ching-Long

    1995-05-01

    A review of the program for the damage tolerance certification test of a composite horizontal stabilizer (HS) of a fighter is presented. The object of this program is to certify that the fatigue life and damage tolerance strength of a damaged composite horizontal stabilizer meets the design requirements. According to the specification for damage tolerance certification, a test article should be subjected to two design lifetimes of flight-by-flight load spectra simulating the in-service fatigue loading condition for the aircraft. However, considering the effect of environmental change on the composite structure, one additional lifetime test was performed. In addition, to evaluate the possibilities for extending the service life of the structure, one more lifetime test was carried out with the spectrum increased by a factor of 1.4. To assess the feasibility and reliability of repair technology on a composite structure, two damaged areas were repaired after two lifetimes of damage tolerance test. On completion of four lifetimes of the damage tolerance test, the static residual strength was measured to check whether structural strength after repair met the requirements. Stiffness and static strength of the composite HS with and without damage were evaluated and compared.

  16. Damage Tolerance of Composite Laminates from an Empirical Perspective

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2009-01-01

    Damage tolerance consists of analysis and experimentation working together. Impact damage is usually of most concern for laminated composites. Once impacted, the residual compression strength is usually of most interest. Other properties may be of more interest than compression (application dependent). A damage tolerance program is application specific (not everyone is building aircraft). The "Building Block Approach" is suggested for damage tolerance. Advantage can be taken of the excellent fatigue resistance of damaged laminates to save time and costs.

  17. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.

  18. A Novel Approach to Rotorcraft Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Everett, Richard A.; Newman, John A.

    2002-01-01

    Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage-tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in a HCF component will result in a design based on a traditional DT method that is either impractical because of frequent inspections, or because the design will be too heavy to operate efficiently. Furthermore, once a HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing a HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.

  19. Probabilistic Assessment of Fracture Progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank

    1999-01-01

    This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically, during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.

  20. A Computationally-Efficient Inverse Approach to Probabilistic Strain-Based Damage Diagnosis

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.; Leser, William P.; Leser, Patrick E.; Newman, John A

    2016-01-01

    This work presents a computationally-efficient inverse approach to probabilistic damage diagnosis. Given strain data at a limited number of measurement locations, Bayesian inference and Markov Chain Monte Carlo (MCMC) sampling are used to estimate probability distributions of the unknown location, size, and orientation of damage. Substantial computational speedup is obtained by replacing a three-dimensional finite element (FE) model with an efficient surrogate model. The approach is experimentally validated on cracked test specimens where full field strains are determined using digital image correlation (DIC). Access to full field DIC data allows for testing of different hypothetical sensor arrangements, facilitating the study of strain-based diagnosis effectiveness as the distance between damage and measurement locations increases. The ability of the framework to effectively perform both probabilistic damage localization and characterization in cracked plates is demonstrated and the impact of measurement location on uncertainty in the predictions is shown. Furthermore, the analysis time to produce these predictions is orders of magnitude less than a baseline Bayesian approach with the FE method by utilizing surrogate modeling and effective numerical sampling approaches.
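
    A minimal sketch of the kind of surrogate-accelerated Bayesian update described above: a cheap surrogate maps damage parameters to strains at a few sensor locations, and random-walk Metropolis-Hastings sampling yields posterior distributions of those parameters. The toy surrogate, sensor layout, and noise level are hypothetical stand-ins, not the authors' trained models.

```python
# Hypothetical sketch of surrogate-accelerated Bayesian damage diagnosis:
# a cheap surrogate maps damage parameters (location, size) to strains at
# a few sensors, and Metropolis-Hastings sampling gives their posterior.
# The surrogate below is a toy closed-form stand-in for a trained model.
import numpy as np

rng = np.random.default_rng(1)
sensor_x = np.array([0.2, 0.4, 0.6, 0.8])   # sensor locations (normalized)
noise_std = 5.0e-6                           # assumed strain noise [m/m]

def surrogate_strain(loc, size):
    """Toy surrogate: strain concentration decaying away from the damage."""
    return 1.0e-4 * (1.0 + size * np.exp(-((sensor_x - loc) / 0.1) ** 2))

# Synthetic "measurement" from a true damage state plus noise.
truth = (0.55, 0.8)
measured = surrogate_strain(*truth) + rng.normal(0.0, noise_std, sensor_x.size)

def log_posterior(theta):
    loc, size = theta
    if not (0.0 < loc < 1.0 and 0.0 < size < 2.0):   # uniform prior bounds
        return -np.inf
    resid = measured - surrogate_strain(loc, size)
    return -0.5 * np.sum((resid / noise_std) ** 2)    # Gaussian likelihood

# Random-walk Metropolis-Hastings.
theta = np.array([0.5, 1.0])
logp = log_posterior(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.02, 0.05])
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        theta, logp = prop, logp_prop
    samples.append(theta.copy())

samples = np.array(samples[5000:])               # discard burn-in
print("posterior mean (loc, size):", samples.mean(axis=0))
print("posterior std  (loc, size):", samples.std(axis=0))
```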

  1. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.

  2. Damage Tolerance Analysis of a Pressurized Liquid Oxygen Tank

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Harvin, Stephen F.; Gregory, Peyton B.; Mason, Brian H.; Thompson, Joe E.; Hoffman, Eric K.

    2006-01-01

    A damage tolerance assessment was conducted of an 8,000 gallon pressurized Liquid Oxygen (LOX) tank. The LOX tank is constructed of a stainless steel pressure vessel enclosed by a thermal-insulating vacuum jacket. The vessel is pressurized to 2,250 psi with gaseous nitrogen, resulting in both thermal and pressure stresses on the tank wall. Finite element analyses were performed on the tank to characterize the stresses from operation. Engineering material data were obtained from both the construction of the tank and the technical literature. An initial damage state was assumed based on records of a nondestructive inspection performed on the tank. The damage tolerance analyses were conducted using the NASGRO computer code. This paper contains the assumptions and justifications made for the input parameters to the damage tolerance analyses, the results of those analyses, and a discussion of the operational safety of the LOX tank.

  3. A probabilistic fatigue analysis of multiple site damage

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, S. M.; Ruff, D.; Hillberry, B. M.; Mccabe, G.; Grandt, A. F., Jr.

    1994-01-01

    The variability in initial crack size and fatigue crack growth is incorporated in a probabilistic model that is used to predict the fatigue lives for unstiffened aluminum alloy panels containing multiple site damage (MSD). The uncertainty of the damage in the MSD panel is represented by a distribution of fatigue crack lengths that are analytically derived from equivalent initial flaw sizes. The variability in fatigue crack growth rate is characterized by stochastic descriptions of crack growth parameters for a modified Paris crack growth law. A Monte-Carlo simulation explicitly describes the MSD panel by randomly selecting values from the stochastic variables and then grows the MSD cracks with a deterministic fatigue model until the panel fails. Different simulations investigate the influences of the fatigue variability on the distributions of remaining fatigue lives. Six cases that consider fixed and variable conditions of initial crack size and fatigue crack growth rate are examined. The crack size distribution exhibited a dominant effect on the remaining fatigue life distribution, and the variable crack growth rate exhibited a lesser effect on the distribution. In addition, the probabilistic model predicted that only a small percentage of the life remains after a lead crack develops in the MSD panel.
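
    A minimal sketch of the Monte Carlo idea described above for a single crack site: the initial crack size and Paris-law constants are sampled from assumed distributions and each sample is grown deterministically to a critical size, giving a distribution of fatigue lives. Distribution parameters and the unit geometry factor are illustrative assumptions, not values from the cited study.

```python
# Sketch of a Monte Carlo fatigue-life simulation for one crack site:
# the initial crack size and the Paris-law constants are sampled from
# assumed distributions (values below are illustrative, not the paper's),
# and each sample is grown deterministically from a0 to a critical size
# using the closed-form integral of da/dN = C*(dK)^m, dK = dS*sqrt(pi*a).
import numpy as np

rng = np.random.default_rng(2)
n_trials = 10_000
delta_S = 100.0          # remote stress range [MPa]
a_crit = 0.025           # critical crack length [m]

a0 = rng.lognormal(mean=np.log(0.5e-3), sigma=0.4, size=n_trials)  # [m]
m = rng.normal(3.0, 0.1, n_trials)                # Paris exponent
C = 10.0 ** rng.normal(-11.0, 0.2, n_trials)      # m/cycle per (MPa*sqrt(m))^m

# Closed-form Paris-law integration (geometry factor taken as 1, m != 2):
#   N = (a_crit^(1-m/2) - a0^(1-m/2)) / ((1 - m/2) * C * (delta_S*sqrt(pi))^m)
exponent = 1.0 - m / 2.0
lives = (a_crit**exponent - a0**exponent) / (exponent * C * (delta_S * np.sqrt(np.pi))**m)

print(f"median life {np.median(lives):.3e} cycles, "
      f"5th percentile {np.quantile(lives, 0.05):.3e} cycles")
```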

  4. A damage-tolerant glass.

    PubMed

    Demetriou, Marios D; Launey, Maximilien E; Garrett, Glenn; Schramm, Joseph P; Hofmann, Douglas C; Johnson, William L; Ritchie, Robert O

    2011-02-01

    Owing to a lack of microstructure, glassy materials are inherently strong but brittle, and often demonstrate extreme sensitivity to flaws. Accordingly, their macroscopic failure is often not initiated by plastic yielding, and almost always terminated by brittle fracture. Unlike conventional brittle glasses, metallic glasses are generally capable of limited plastic yielding by shear-band sliding in the presence of a flaw, and thus exhibit toughness-strength relationships that lie between those of brittle ceramics and marginally tough metals. Here, a bulk glassy palladium alloy is introduced, demonstrating an unusual capacity for shielding an opening crack accommodated by an extensive shear-band sliding process, which promotes a fracture toughness comparable to those of the toughest materials known. This result demonstrates that the combination of toughness and strength (that is, damage tolerance) accessible to amorphous materials extends beyond the benchmark ranges established by the toughest and strongest materials known, thereby pushing the envelope of damage tolerance accessible to a structural metal.

  5. A Study of Damage Tolerance in Curved Composite Panels.

    DTIC Science & Technology

    1988-03-01

    AD-A198 617. A Study of Damage Tolerance in Curved Composite Panels. Thesis, AFIT/GA/AA/88-3, Air Force Institute of Technology, Wright-Patterson AFB, OH. Brendan L. Wilder, Captain, USAF.

  6. Improving glucose tolerance by muscle-damaging exercise.

    PubMed

    Ho, Chien-Te; Otaka, Machiko; Kuo, Chia-Hua

    2017-04-01

    Tissue damage is regarded as an unwanted medical condition to be avoided. However, introducing tolerable tissue damage has been used as a therapeutic intervention in traditional and complementary medicine to cure discomfort and illness. Eccentric exercise is known to cause significant necrosis and insulin resistance of skeletal muscle. The purpose of this study was to determine the magnitude of muscle damage and blood glucose responses during an oral glucose tolerance test (OGTT) after eccentric training in 21 young participants. They were challenged with 5 bouts of 100-meter downhill sprinting and 20 squats at a 30-pound load over 3 days of training, which resulted in a wide spectrum of muscle creatine kinase (CK) surges in plasma 48 h after the last bout of exercise. Participants were then divided into two groups according to the magnitude of the CK increases (low CK: +48% ± 0.3; high CK: +137% ± 0.5, P < 0.05). Both groups showed comparable decreases in blood glucose levels in the OGTT, suggesting that this muscle-damaging exercise does not decrease, but rather improves, glycemic control in men. The result of the study rejects the hypothesis that eccentric exercise decreases glucose tolerance. Improved glucose tolerance despite the CK increase implies a beneficial effect of replacing metabolically weaker muscle fibers through eccentric exercise, in a Darwinian natural-selection fashion.

  7. Demonstrating damage tolerance of composite airframes

    NASA Technical Reports Server (NTRS)

    Poe, Clarence C., Jr.

    1993-01-01

    Commercial transport aircraft operating in the United States are certified by the Federal Aviation Administration to be damage tolerant. On 28 April 1988, Aloha Airlines Flight 243, a Boeing 737-200 airplane, suffered an explosive decompression of the fuselage but landed safely. This event provides very strong justification for the damage tolerant design criteria. The likely cause of the explosive decompression was the linkup of numerous small fatigue cracks that initiated at adjacent fastener holes in the lap splice joint at the side of the body. Actually, the design should have limited the damage size to less than two frame spacings (about 40 inches), but this type of 'multi-site damage' was not originally taken into account. This cracking pattern developed only in the high-time airplanes (many flights). After discovery in the fleet, a stringent inspection program using eddy current techniques was inaugurated to discover these cracks before they linked up. Because of concerns about safety and the maintenance burden, the lap-splice joints of these high-time airplanes are being modified to remove cracks and prevent new cracking; newer designs account for 'multi-site damage'.

  8. Estimation of probability of failure for damage-tolerant aerospace structures

    NASA Astrophysics Data System (ADS)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This

  9. Multiaxial and thermomechanical fatigue considerations in damage tolerant design

    NASA Technical Reports Server (NTRS)

    Leese, G. E.; Bill, R. C.

    1985-01-01

    In considering damage tolerant design concepts for gas turbine hot section components, several challenging concerns arise: complex multiaxial loading situations are encountered; thermomechanical fatigue loading involving very wide temperature ranges is imposed on components; some hot section materials are extremely anisotropic; and coatings and environmental interactions play an important role in crack propagation. The effects of multiaxiality and thermomechanical fatigue are considered from the standpoint of their impact on damage tolerant design concepts. Recently obtained research results as well as results from the open literature are examined and their implications for damage tolerant design are discussed. Three important needs required to advance analytical capabilities in support of damage tolerant design become readily apparent: (1) a theoretical basis to account for the effect of nonproportional loading (mechanical and mechanical/thermal); (2) the development of practical crack growth parameters that are applicable to thermomechanical fatigue situations; and (3) the development of crack growth models that address multiple crack failures.

  10. Mitochondrial DNA repair and damage tolerance.

    PubMed

    Stein, Alexis; Sia, Elaine A

    2017-01-01

    The accurate maintenance of mitochondrial DNA (mtDNA) is required in order for eukaryotic cells to assemble a functional electron transport chain. This independently-maintained genome relies on nuclear-encoded proteins that are imported into the mitochondria to carry out replication and repair processes. Decades of research has made clear that mitochondria employ robust and varied mtDNA repair and damage tolerance mechanisms in order to ensure the proper maintenance of the mitochondrial genome. This review focuses on our current understanding of mtDNA repair and damage tolerance pathways including base excision repair, mismatch repair, homologous recombination, non-homologous end joining, translesion synthesis and mtDNA degradation in both yeast and mammalian systems.

  11. Progressive Fracture and Damage Tolerance of Composite Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Gotsis, Pascal K.; Minnetyan, Levon

    1997-01-01

    Structural performance (integrity, durability, and damage tolerance) of fiber reinforced composite pressure vessels, designed as pressurized shelters for planetary exploration, is investigated via computational simulation. An integrated computer code is utilized for the simulation of damage initiation, growth, and propagation under pressure. Aramid fibers are considered in a rubbery polymer matrix for the composite system. Effects of fiber orientation and fabrication defects/accidental damage are investigated with regard to the safety and durability of the shelter. Results show the viability of fiber reinforced pressure vessels as damage tolerant shelters for planetary colonization.

  12. Investigation of progressive failure robustness and alternate load paths for damage tolerant structures

    NASA Astrophysics Data System (ADS)

    Marhadi, Kun Saptohartyadi

    Structural optimization for damage tolerance under various unforeseen damage scenarios is computationally challenging. It couples non-linear progressive failure analysis with sampling-based stochastic analysis of random damage. The goal of this research was to understand the relationship between alternate load paths available in a structure and its damage tolerance, and to use this information to develop computationally efficient methods for designing damage tolerant structures. Progressive failure of a redundant truss structure subjected to small random variability was investigated to identify features that correlate with robustness and predictability of the structure's progressive failure. The identified features were used to develop numerical surrogate measures that permit computationally efficient deterministic optimization to achieve robustness and predictability of progressive failure. Analysis of damage tolerance on designs with robust progressive failure indicated that robustness and predictability of progressive failure do not guarantee damage tolerance. Damage tolerance requires a structure to redistribute its load to alternate load paths. In order to investigate the load distribution characteristics that lead to damage tolerance in structures, designs with varying degrees of damage tolerance were generated using brute force stochastic optimization. A method based on principal component analysis was used to describe load distributions (alternate load paths) in the structures. Results indicate that a structure that can develop alternate paths is not necessarily damage tolerant. The alternate load paths must have a required minimum load capability. Robustness analysis of damage tolerant optimum designs indicates that designs are tailored to the specified damage. A design optimized under one damage specification can be sensitive to other damage scenarios not considered. Effectiveness of existing load path definitions and characterizations were investigated for continuum

  13. Damage tolerance and structural monitoring for wind turbine blades

    PubMed Central

    McGugan, M.; Pereira, G.; Sørensen, B. F.; Toftegaard, H.; Branner, K.

    2015-01-01

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind. PMID:25583858

  14. Effect of resin on impact damage tolerance of graphite/epoxy laminates

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Rhodes, M. D.

    1982-01-01

    Twenty-four different epoxy resin systems were evaluated by a variety of test techniques to identify materials that exhibited improved impact damage tolerance in graphite/epoxy composite laminates. Forty-eight-ply composite panels of five of the material systems were able to sustain 100 m/s impact by a 1.27-cm-diameter aluminum projectile while statically loaded to strains of 0.005. Of the five materials with the highest tolerance to impact, two had elastomeric additives, two had thermoplastic additives, and one had a vinyl modifier; all five systems used bisphenol A as the base resin. An evaluation of test results shows that the laminate damage tolerance is largely determined by the resin tensile properties, and that improvements in laminate damage tolerance are not necessarily made at the expense of room-temperature mechanical properties. The results also suggest that a resin volume fraction of 40 percent or greater may be required to permit the plastic flow between fibers necessary for improved damage tolerance.

  15. Phosphorylation of human INO80 is involved in DNA damage tolerance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kato, Dai; Waki, Mayumi; Umezawa, Masaki

    Highlights: Depletion of hINO80 significantly reduced PCNA ubiquitination. Depletion of hINO80 significantly reduced the nuclear dot intensity of RAD18 after UV irradiation. Western blot analyses showed a phosphorylated hINO80 C-terminus. Overexpression of a phosphorylation-mutant hINO80 reduced PCNA ubiquitination. Abstract: Double strand breaks (DSBs) are the most serious type of DNA damage. DSBs can be generated directly by exposure to ionizing radiation or indirectly by replication fork collapse. The DNA damage tolerance pathway, which is conserved from bacteria to humans, prevents this collapse by overcoming replication blockages. The INO80 chromatin remodeling complex plays an important role in the DNA damage response. The yeast INO80 complex participates in the DNA damage tolerance pathway. The mechanisms regulating the yINO80 complex are not fully understood, but the yeast INO80 complex is necessary for efficient proliferating cell nuclear antigen (PCNA) ubiquitination and for recruitment of Rad18 to replication forks. In contrast, the function of the mammalian INO80 complex in DNA damage tolerance is less clear. Here, we show that human INO80 was necessary for PCNA ubiquitination and recruitment of Rad18 to DNA damage sites. Moreover, the C-terminal region of human INO80 was phosphorylated, and overexpression of a phosphorylation-deficient mutant of human INO80 resulted in decreased ubiquitination of PCNA during DNA replication. These results suggest that the human INO80 complex, like the yeast complex, was involved in the DNA damage tolerance pathway and that phosphorylation of human INO80 was involved in the DNA damage tolerance pathway. These findings provide new insights into the DNA damage tolerance pathway in mammalian cells.

  16. An Experimental Investigation of Damage Resistances and Damage Tolerance of Composite Materials

    NASA Technical Reports Server (NTRS)

    Prabhakaran, R.

    2003-01-01

    The project included three lines of investigation, aimed at a better understanding of the damage resistance and damage tolerance of pultruded composites. The three lines of investigation were: (i) measurement of permanent dent depth after transverse indentation at different load levels, and correlation with other damage parameters such as damage area (from x-radiography) and back surface crack length, (ii) estimation of point stress and average stress characteristic dimensions corresponding to measured damage parameters, and (iii) an attempt to measure the damage area by a reflection photoelastic technique. All the three lines of investigation were pursued.
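
    For context on the second line of investigation, a minimal sketch of estimating a point-stress characteristic dimension follows, using the Whitney-Nuismer expression for a circular hole in an infinite orthotropic plate as a stand-in for the measured damage. The hole radius, stress concentration factor, and strength ratio below are hypothetical inputs, not data from this project.

```python
# Sketch of estimating the point-stress characteristic dimension d0
# (Whitney-Nuismer criterion, circular hole in an infinite orthotropic
# plate) from a measured notched/unnotched strength ratio. Input values
# are hypothetical, not measurements from the study above.

def strength_ratio(xi, K_T):
    """Predicted sigma_N / sigma_0 for xi = R / (R + d0)."""
    return 2.0 / (2.0 + xi**2 + 3.0 * xi**4
                  - (K_T - 3.0) * (5.0 * xi**6 - 7.0 * xi**8))

R = 3.0e-3        # effective damage (hole) radius [m]
K_T = 3.0         # stress concentration factor (quasi-isotropic assumption)
ratio_meas = 0.6  # measured notched strength / unnotched strength

# Bisection on xi: the predicted ratio decreases as xi increases.
lo_x, hi_x = 1.0e-6, 1.0 - 1.0e-6
for _ in range(80):
    mid = 0.5 * (lo_x + hi_x)
    if strength_ratio(mid, K_T) > ratio_meas:
        lo_x = mid          # predicted ratio still too high -> need larger xi
    else:
        hi_x = mid
xi = 0.5 * (lo_x + hi_x)

d0 = R * (1.0 - xi) / xi    # recover d0 from xi = R / (R + d0)
print(f"characteristic dimension d0 = {d0 * 1000:.2f} mm")
```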

  17. Concepts for improving the damage tolerance of composite compression panels

    NASA Technical Reports Server (NTRS)

    Rhodes, M. D.; Williams, J. G.

    1981-01-01

    The results of an experimental evaluation of graphite-epoxy composite compression panel impact damage tolerance and damage propagation arrest concepts are reported. The tests were conducted on flat plate specimens and blade-stiffened structural panels such as those used in commercial aircraft wings, and the residual strength of damaged specimens and their sensitivity to damage while subjected to in-plane compression loading were determined. Results suggest that matrix materials that fail by delamination have the lowest damage tolerance, and it is concluded that alternative matrix materials with transverse reinforcement to suppress the delamination failure mode and yield the higher-strain value transverse shear crippling mode should be developed.

  18. Damage Tolerance Applied to Design of Mid-Size Aircraft

    NASA Astrophysics Data System (ADS)

    Chaves, Carlos Eduardo

    Most mid-size aircraft are certified according to FAA Part 25 requirements, and in order to comply with these requirements the majority of the aircraft structure must be damage tolerant. To assure damage tolerance, beyond the overall structural behavior, one should look at the details. A great number of analysis tasks and tests must be carried out in order to guarantee the aircraft structural integrity. This paper presents an overview of Embraer experience with design and analysis for damage tolerance during the last 30 years. Aspects like DT analysis for metallic and composite structures, selection of appropriate materials, loads, definition of limits of validity, and definition of inspection intervals will be addressed throughout this work. Selected structural tests that have been performed for validation of modeling predictions will be presented. Some aspects to be discussed are related to the design differences between commercial jets, which are usually subjected to high usage conditions, business jets, and military aircraft. Further, the application of future technologies, such as structural health monitoring, and also of new materials and manufacturing processes that have been evaluated in order to improve the damage tolerance capability of the aircraft structures will be discussed.

  19. Advanced information processing system - Status report. [for fault tolerant and damage tolerant data processing for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Lala, J.

    1986-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles. The AIPS architecture also has attributes to enhance system effectiveness such as graceful degradation, growth and change tolerance, integrability, etc. Two key building blocks being developed by the AIPS program are a fault and damage tolerant processor and communication network. A proof-of-concept system is now being built and will be tested to demonstrate the validity and performance of the AIPS concepts.

  20. Damage tolerance and structural monitoring for wind turbine blades.

    PubMed

    McGugan, M; Pereira, G; Sørensen, B F; Toftegaard, H; Branner, K

    2015-02-28

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind.

  1. An Evaluation of the Applicability of Damage Tolerance to Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Le, Dy; Turnberg, Jay

    2005-01-01

    The Federal Aviation Administration, the National Aeronautics and Space Administration and the aircraft industry have teamed together to develop methods and guidance for the safe life-cycle management of dynamic systems. Based on the success of the United States Air Force damage tolerance initiative for airframe structure, a crack growth based damage tolerance approach is being examined for implementation into the design and management of dynamic systems. However, dynamic systems accumulate millions of vibratory cycles per flight hour, more than 12,000 times faster than an airframe system. If a detectable crack develops in a dynamic system, the time to failure is extremely short, less than 100 flight hours in most cases, leaving little room for error in the material characterization, life cycle analysis, nondestructive inspection and maintenance processes. In this paper, the authors review the damage tolerant design process focusing on uncertainties that affect dynamic systems and evaluate the applicability of damage tolerance on dynamic systems.

  2. The effect of resin on the impact damage tolerance of graphite-epoxy laminates

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Rhodes, M. D.

    1981-01-01

    The effect of the matrix resin on the impact damage tolerance of graphite-epoxy composite laminates was investigated. The materials were evaluated on the basis of the damage incurred due to local impact and on their ability to retain compression strength in the presence of impact damage. Twenty-four different resin systems were evaluated. Five of the systems demonstrated substantial improvements compared to the baseline system, including retention of compression strength in the presence of impact damage. Examination of the neat resin mechanical properties indicates that the resin tensile properties significantly influence the laminate damage tolerance and that improvements in laminate damage tolerance are not necessarily made at the expense of room temperature mechanical properties. Preliminary results indicate a resin volume fraction on the order of 40 percent or greater may be required to permit the plastic flow between fibers necessary for improved damage tolerance.

  3. Some Examples of the Relations Between Processing and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2012-01-01

    Most structures made of laminated polymer matrix composites (PMCs) must be designed to some damage tolerance requirement that includes foreign object impact damage. Thus from the beginning of a part's life, impact damage is assumed to exist in the material and the part is designed to carry the required load with the prescribed impact damage present. By doing this, some processing defects may automatically be accounted for in the reduced design allowable due to these impacts. This paper will present examples of how a given level of impact damage and certain processing defects affect the compression strength of a laminate that contains both. Knowledge of the impact damage tolerance requirements, before processing begins, can broaden material options and processing techniques since the structure is not being designed to pristine properties.

  4. Fatigue Crack Growth Database for Damage Tolerance Analysis

    NASA Technical Reports Server (NTRS)

    Forman, R. G.; Shivakumar, V.; Cardinal, J. W.; Williams, L. C.; McKeighan, P. C.

    2005-01-01

    The objective of this project was to begin the process of developing a fatigue crack growth database (FCGD) of metallic materials for use in damage tolerance analysis of aircraft structure. For this initial effort, crack growth rate data in the NASGRO® database, the United States Air Force Damage Tolerant Design Handbook, and other publicly available sources were examined and used to develop a database that characterizes crack growth behavior for specific applications (materials). The focus of this effort was on materials for general commercial aircraft applications, including large transport airplanes, small transport commuter airplanes, general aviation airplanes, and rotorcraft. The end products of this project are the FCGD software and this report. The specific goal of this effort was to present fatigue crack growth data in three usable formats: (1) NASGRO equation parameters, (2) Walker equation parameters, and (3) tabular data points. The development of this FCGD will begin the process of developing a consistent set of standard fatigue crack growth material properties. It is envisioned that the end product of the process will be a general repository for credible and well-documented fracture properties that may be used as a default standard in damage tolerance analyses.
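
    As a brief illustration of one of the listed formats, the sketch below evaluates crack growth rates with a common form of the Walker equation, which collapses stress-ratio effects into an effective stress-intensity range. The constants are illustrative only, not parameters from the FCGD.

```python
# Sketch of evaluating crack growth rate with one common form of the
# Walker equation, which folds the stress ratio R into an effective
# stress-intensity range. Constants are illustrative, not FCGD values.
import numpy as np

def walker_rate(delta_K, R, C, n, gamma):
    """da/dN = C * (delta_K / (1 - R)**(1 - gamma))**n  [m/cycle, MPa*sqrt(m)]."""
    dK_eff = delta_K / (1.0 - R) ** (1.0 - gamma)
    return C * dK_eff ** n

# Hypothetical aluminum-like constants.
C, n, gamma = 1.0e-11, 3.2, 0.6
dK = np.linspace(5.0, 30.0, 6)                 # MPa*sqrt(m)
for R in (0.0, 0.1, 0.5):
    rates = walker_rate(dK, R, C, n, gamma)
    print(f"R = {R:3.1f}:", ", ".join(f"{r:.2e}" for r in rates), "m/cycle")
```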

  5. Design, testing, and damage tolerance study of bonded stiffened composite wing cover panels

    NASA Technical Reports Server (NTRS)

    Madan, Ram C.; Sutton, Jason O.

    1988-01-01

    Results are presented from the application of damage tolerance criteria for composite panels to multistringer composite wing cover panels developed under NASA's Composite Transport Wing Technology Development contract. This conceptual wing design integrated aeroelastic stiffness constraints with an enhanced damage tolerance material system, in order to yield optimized producibility and structural performance. Damage tolerance was demonstrated in a test program using full-sized cover panel subcomponents; panel skins were impacted at midbay between stiffeners, directly over a stiffener, and over the stiffener flange edge. None of the impacts produced visible damage. NASTRAN analyses were performed to simulate NDI-detected invisible damage.

  6. Concepts for improving the damage tolerance of composite compression panels. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Rhodes, M. D.; Williams, J. G.

    1984-01-01

    The residual strength of specimens with damage and their sensitivity to damage while subjected to an applied in-plane compression load were determined for flat-plate specimens and blade-stiffened panels. The results suggest that matrix materials that fail by delamination have the lowest damage tolerance capability. Alternate matrix materials or laminates which are transversely reinforced suppress the delamination mode of failure and change the failure mode to transverse shear crippling, which occurs at a higher strain value. Several damage-tolerant blade-stiffened panel design concepts are evaluated. Structural efficiency studies conducted show that only small mass penalties may result from incorporating these damage-tolerant features in panel design. The implication of test results on the design of aircraft structures was examined with respect to FAR requirements.

  7. Damage-Tolerance and Fatigue Evaluation of Structure

    DOT National Transportation Integrated Search

    1997-02-18

    This advisory circular (AC) sets forth an acceptable means of compliance : with the provisions of Part 25 of the Federal Aviation Regulations (FAR) dealing with the damage-tolerance and fatigue evaluation requirements of transport category aircraft s...

  8. Water availability limits tolerance of apical damage in the Chilean tarweed Madia sativa

    NASA Astrophysics Data System (ADS)

    Gonzáles, Wilfredo L.; Suárez, Lorena H.; Molina-Montenegro, Marco A.; Gianoli, Ernesto

    2008-07-01

    Plant tolerance is the ability to reduce the negative impact of herbivory on plant fitness. Numerous studies have shown that plant tolerance is affected by nutrient availability, but the effect of soil moisture has received less attention. We evaluated tolerance of apical damage (clipping that mimicked insect damage) under two watering regimes (control watering and drought) in the tarweed Madia sativa (Asteraceae). We recorded number of heads with seeds and total number of heads as traits related to fitness. Net photosynthetic rate, water use efficiency, number of branches, shoot biomass, and the root:shoot biomass ratio were measured as traits potentially related to tolerance via compensatory responses to damage. In the drought treatment, damaged plants showed ≈43% reduction in reproductive fitness components in comparison with undamaged plants. In contrast, there was no significant difference in reproductive fitness between undamaged and damaged plants in the control watering treatment. Shoot biomass was not affected by apical damage. The number of branches increased after damage in both water treatments but this increase was limited by drought stress. Net photosynthetic rate increased in damaged plants only in the control watering treatment. Water use efficiency increased with drought stress and, in plants regularly watered, also increased after damage. Root:shoot ratio was higher in the low water treatment and damaged plants tended to reduce root:shoot ratio only in this water treatment. It is concluded that water availability limits tolerance to apical damage in M. sativa, and that putative compensatory mechanisms are differentially affected by water availability.

  9. Optimization of Aerospace Structure Subject to Damage Tolerance Criteria

    NASA Technical Reports Server (NTRS)

    Akgun, Mehmet A.

    1999-01-01

    The objective of this cooperative agreement was to seek computationally efficient ways to optimize aerospace structures subject to damage tolerance criteria. Optimization was to involve sizing as well as topology optimization. The work was done in collaboration with Steve Scotti, Chauncey Wu and Joanne Walsh at the NASA Langley Research Center. Computation of constraint sensitivity is normally the most time-consuming step of an optimization procedure. The cooperative work first focused on this issue and implemented the adjoint method of sensitivity computation in an optimization code (runstream) written in Engineering Analysis Language (EAL). The method was implemented both for bar and plate elements including buckling sensitivity for the latter. Lumping of constraints was investigated as a means to reduce the computational cost. Adjoint sensitivity computation was developed and implemented for lumped stress and buckling constraints. Cost of the direct method and the adjoint method was compared for various structures with and without lumping. The results were reported in two papers. It is desirable to optimize topology of an aerospace structure subject to a large number of damage scenarios so that a damage tolerant structure is obtained. Including damage scenarios in the design procedure is critical in order to avoid large mass penalties at later stages. A common method for topology optimization is that of compliance minimization which has not been used for damage tolerant design. In the present work, topology optimization is treated as a conventional problem aiming to minimize the weight subject to stress constraints. Multiple damage configurations (scenarios) are considered. Each configuration has its own structural stiffness matrix and, normally, requires factoring of the matrix and solution of the system of equations. Damage that is expected to be tolerated is local and represents a small change in the stiffness matrix compared to the baseline (undamaged
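
    A minimal sketch of the adjoint sensitivity idea mentioned above for a lumped constraint g = c^T u with K(x) u = f: a single adjoint solve replaces one direct solve per design variable. The matrices below are arbitrary stand-ins, not an EAL model.

```python
# Minimal sketch of adjoint sensitivity analysis for a constraint
# g(x) = c^T u(x) subject to K(x) u = f: solve one adjoint system
# K^T lam = c, then dg/dx_i = -lam^T (dK/dx_i) u for every design
# variable. Matrices here are arbitrary stand-ins, not a real structure.
import numpy as np

rng = np.random.default_rng(3)
ndof, nvar = 8, 5

# Stand-in stiffness: K(x) = K0 + sum_i x_i * K_i (kept well-conditioned).
K0 = np.eye(ndof) * 10.0
Ki = [0.1 * (M + M.T) for M in rng.normal(size=(nvar, ndof, ndof))]
x = np.ones(nvar)
K = K0 + sum(xi * Kii for xi, Kii in zip(x, Ki))

f = rng.normal(size=ndof)          # load vector
c = rng.normal(size=ndof)          # picks out the lumped constraint g = c^T u

u = np.linalg.solve(K, f)          # state solution
lam = np.linalg.solve(K.T, c)      # single adjoint solve, reused for all x_i

# Adjoint sensitivities (here dK/dx_i = K_i exactly, by construction).
dg_dx = np.array([-lam @ (Kii @ u) for Kii in Ki])

# Finite-difference check of the first sensitivity.
h = 1.0e-6
dg_fd = (c @ np.linalg.solve(K + h * Ki[0], f) - c @ u) / h
print("adjoint:", dg_dx[0], " finite difference:", dg_fd)
```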

  10. Improved damage tolerance of titanium by adhesive lamination

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.

    1982-01-01

    Basic damage tolerance properties of Ti-6Al-4V titanium plate can be improved by laminating thin sheets of titanium with adhesives. Compact tension and center-cracked tension specimens made from thick plate, thin sheet, and laminated plate (six plies of thin sheet) were tested. The fracture toughness of the laminated plate was 39 percent higher than that of the monolithic plate. The laminated plate's through-the-thickness crack growth rate was about 20 percent less than that of the monolithic plate. The damage tolerance life of the surface-cracked laminate was 6 to over 15 times the life of a monolithic specimen. A simple method of predicting crack growth in a cracked ply of a laminate is presented.
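
    To make the reported improvements concrete, the short sketch below integrates a Paris-law crack-growth model for a monolithic plate and for a laminate whose growth rate is taken 20 percent lower and whose toughness is taken 39 percent higher, as the abstract reports; the Paris constants, stress range, and initial crack size are assumed values, not data from the report.

      import numpy as np

      C, m = 1e-11, 3.0                   # assumed Paris constants (da/dN in m/cycle, K in MPa*sqrt(m))
      dsig = 150.0                        # assumed constant-amplitude stress range, MPa
      a0 = 1.0e-3                         # assumed initial through-crack half-length, m
      K_mono, K_lam = 60.0, 60.0 * 1.39   # toughness of monolithic plate vs. laminate (39% higher)

      def crack_growth_life(C_eff, K_c):
          # cycle-by-cycle integration until the crack reaches its critical size
          a, N = a0, 0
          a_crit = (K_c / dsig) ** 2 / np.pi      # from K = dsig * sqrt(pi * a) at fracture
          while a < a_crit:
              dK = dsig * np.sqrt(np.pi * a)
              a += C_eff * dK ** m
              N += 1
          return N

      print("monolithic plate:", crack_growth_life(C, K_mono), "cycles")
      print("laminated plate: ", crack_growth_life(0.8 * C, K_lam), "cycles")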

  11. Electronic hybridisation implications for the damage-tolerance of thin film metallic glasses.

    PubMed

    Schnabel, Volker; Jaya, B Nagamani; Köhler, Mathias; Music, Denis; Kirchlechner, Christoph; Dehm, Gerhard; Raabe, Dierk; Schneider, Jochen M

    2016-11-07

    A paramount challenge in materials science is to design damage-tolerant glasses. Poisson's ratio is commonly used as a criterion to gauge the brittle-ductile transition in glasses. However, our data, as well as results in the literature, are in conflict with the concept of Poisson's ratio serving as a universal parameter for fracture energy. Here, we identify the electronic structure fingerprint associated with damage tolerance in thin film metallic glasses. Our correlative theoretical and experimental data reveal that the fraction of bonds stemming from hybridised states compared to the overall bonding can be associated with damage tolerance in thin film metallic glasses.

  12. Recent Advances in Durability and Damage Tolerance Methodology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Glaessgen, E. H.; Raju, I. S.; Harris, C. E.

    2007-01-01

    Durability and damage tolerance (D&DT) issues are critical to the development of lighter, safer, and more efficient aerospace vehicles. Durability is largely an economic life-cycle design consideration, whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. Both D&DT methodologies must address the deleterious effects of changes in material properties and the initiation and growth of damage that may occur during the vehicle's service lifetime. The result of unanticipated D&DT response is often manifested in the form of catastrophic and potentially fatal accidents. As such, durability and damage tolerance requirements must be rigorously addressed for commercial transport aircraft and NASA spacecraft systems. This paper presents an overview of the recent and planned future research in durability and damage tolerance analytical and experimental methods for both metallic and composite aerospace structures at NASA Langley Research Center (LaRC).

  13. Pre-damage biomass allocation and not invasiveness predicts tolerance to damage in seedlings of woody species in Hawaii.

    PubMed

    Lurie, Matthew H; Barton, Kasey E; Daehler, Curtis C

    2017-12-01

    Plant-herbivore interactions have been predicted to play a fundamental role in plant invasions, although support for this assertion from previous research is mixed. While plants may escape from specialist herbivores in their introduced ranges, herbivory from generalists is common. Tolerance traits may allow non-native plants to mitigate the negative consequences of generalist herbivory that they cannot avoid in their introduced range. Here we address whether tolerance to herbivory, quantified as survival and compensatory growth, is associated with plant invasion success in Hawaii and investigate traits that may enhance tolerance in seedlings, the life stage most susceptible to herbivory. In a greenhouse experiment, we measured seedling tolerance to simulated herbivory through mechanical damage (50% leaf removal) of 16 non-native woody plant species differing in invasion status (invasive vs. non-invasive). Seedlings were grown for 2 weeks following damage and analyzed for biomass to determine whether damaged plants could fully compensate for the lost leaf tissue. Over 99% of all seedlings survived defoliation. Although species varied significantly in their levels of compensation, there was no consistent difference between invasive and non-invasive species. Seedlings of 11 species undercompensated and remained substantially smaller than control seedlings 2 weeks after damage; four species were close to compensating, while one species overcompensated. Across species, compensation was positively associated with an increased investment in potential storage reserves, specifically cotyledons and roots, suggesting that these organs provide resources that help seedlings re-grow following damage. Our results add to a growing consensus that pre-damage growth patterns determine tolerance to damage, even in young seedlings which have relatively low biomass. The lack of higher tolerance in highly invasive species may suggest that invaders overcome herbivory barriers to invasion

  14. Electronic hybridisation implications for the damage-tolerance of thin film metallic glasses

    PubMed Central

    Schnabel, Volker; Jaya, B. Nagamani; Köhler, Mathias; Music, Denis; Kirchlechner, Christoph; Dehm, Gerhard; Raabe, Dierk; Schneider, Jochen M.

    2016-01-01

    A paramount challenge in materials science is to design damage-tolerant glasses. Poisson’s ratio is commonly used as a criterion to gauge the brittle-ductile transition in glasses. However, our data, as well as results in the literature, are in conflict with the concept of Poisson’s ratio serving as a universal parameter for fracture energy. Here, we identify the electronic structure fingerprint associated with damage tolerance in thin film metallic glasses. Our correlative theoretical and experimental data reveal that the fraction of bonds stemming from hybridised states compared to the overall bonding can be associated with damage tolerance in thin film metallic glasses. PMID:27819318

  15. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer-mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture-resistant concepts in the design, and to utilize damage-tolerance-based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM, was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact-damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression-after-impact coupon and element tests. An integrated analysis package was developed to predict the damage-tolerance-based margin of safety (MS) using NASTRAN-generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.
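
    As a minimal illustration of the damage-tolerance margin of safety referred to above (the SUBLAM code itself is not reproduced here, and the strain values are assumed for the example):

      # Both strain values below are assumed numbers, standing in for a
      # NASTRAN-derived applied strain and a predicted failure-threshold strain.
      eps_applied   = 3200e-6     # applied strain at the damaged element, from internal loads
      eps_threshold = 4500e-6     # predicted strain for sublaminate-buckling delamination extension
      MS = eps_threshold / abs(eps_applied) - 1.0
      print(f"damage-tolerance margin of safety = {MS:+.2f}")   # positive => acceptable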

  16. A modal H∞-norm-based performance requirement for damage-tolerant active controller design

    NASA Astrophysics Data System (ADS)

    Genari, Helói F. G.; Mechbal, Nazih; Coffignal, Gérard; Nóbrega, Eurípedes G. O.

    2017-04-01

    Damage-tolerant active control (DTAC) is a recent research area that encompasses control design methodologies resulting from the application of fault-tolerant control methods to the vibration control of structures subject to damage. The possibility of damage occurrence is not usually considered in active vibration control design requirements. Damage changes the structure dynamics, which may produce unexpected modal behavior of the closed-loop system, usually not anticipated by the controller design approaches. A modal H∞ norm and a respective robust controller design framework were recently introduced, and this method is here extended to a new DTAC strategy implementation. Considering that damage affects each vibration mode differently, this paper adopts the modal H∞ norm to include damage as a design requirement. The basic idea is to create an appropriate energy distribution over the frequency range of interest and the respective vibration modes, guaranteeing robustness, damage tolerance, and adequate overall performance, taking into account that it is common to have prior knowledge of the regions of the structure where damage may occur during its operational life. For this purpose, a structural health monitoring technique is applied to evaluate modal modifications caused by damage. This information is used to create modal weighting matrices, leading to the modal H∞ controller design. Finite element models are adopted for a case study structure, including different damage severities, in order to validate the proposed control strategy. Results show the effectiveness of the proposed methodology with respect to damage tolerance.
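
    The sketch below is only a schematic of the modal weighting idea described above, not the authors' synthesis framework: per-mode peak gains of a lightly damped structure are computed and then scaled by weights that emphasise modes associated with damage-prone regions. All modal data and weights are assumed.

      import numpy as np

      wn   = 2 * np.pi * np.array([10.0, 25.0, 60.0])    # natural frequencies, rad/s (assumed)
      zeta = np.array([0.010, 0.020, 0.015])             # modal damping ratios (assumed)
      phi  = np.array([0.8, 1.2, 0.5])                   # actuator/sensor modal gains (assumed)
      W    = np.array([1.0, 3.0, 1.0])                   # heavier weight on the damage-prone mode

      w = np.linspace(1.0, 2 * np.pi * 100.0, 20000)     # frequency grid, rad/s
      peaks = []
      for i in range(3):
          H = phi[i] ** 2 / (wn[i] ** 2 - w ** 2 + 2j * zeta[i] * wn[i] * w)  # modal FRF
          peaks.append(np.abs(H).max())                  # per-mode peak gain (modal H-infinity-like norm)

      weighted = W * np.array(peaks)
      print(np.round(peaks, 4), np.round(weighted, 4))   # weighted peaks form the modal requirement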

  17. High damage tolerance of electrochemically lithiated silicon

    PubMed Central

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-01-01

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries. PMID:26400671

  18. High damage tolerance of electrochemically lithiated silicon

    NASA Astrophysics Data System (ADS)

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-09-01

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro-chemo-mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  19. High damage tolerance of electrochemically lithiated silicon

    DOE PAGES

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; ...

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. In this paper, we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Finally, our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  20. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatters are used to describe the behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from these methods demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.

  1. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1998-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from these methods demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.

  2. Fuel containment and damage tolerance for large composite primary aircraft structures. Phase 1: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.

    1983-01-01

    Technical problems associated with fuel containment and damage tolerance of composite material wings for transport aircraft were identified. The major tasks are the following: (1) the preliminary design of damage tolerant wing surface using composite materials; (2) the evaluation of fuel sealing and lightning protection methods for a composite material wing; and (3) an experimental investigation of the damage tolerant characteristics of toughened resin graphite/epoxy materials. The test results, the test techniques, and the test data are presented.

  3. Safe-life and damage-tolerant design approaches for helicopter structures

    NASA Technical Reports Server (NTRS)

    Reddick, H. K., Jr.

    1983-01-01

    The safe-life and damage-tolerant design approaches discussed apply to both metallic and fibrous composite helicopter structures. The application of these design approaches to fibrous composite structures is emphasized. Safe-life and damage-tolerant criteria are applied to all helicopter flight-critical components, which generally fall into two categories: the dynamic components, comprising the main and tail rotor systems (blades, hub, and rotating controls) and the drive train (transmission and the main and interconnecting rotor shafts); and the airframe, composed of the fuselage, aerodynamic surfaces, and landing gear.

  4. Influence of Fibre Architecture on Impact Damage Tolerance in 3D Woven Composites

    NASA Astrophysics Data System (ADS)

    Potluri, P.; Hogg, P.; Arshad, M.; Jetavat, D.; Jamshidi, P.

    2012-10-01

    3D woven composites, due to the presence of through-thickness fibre-bridging, have the potential to improve damage tolerance and at the same time to reduce manufacturing costs. However, the ability to withstand damage depends on the weave topology as well as the geometry of individual tows. There is an extensive literature on the damage tolerance of 2D prepreg laminates, but limited work has been reported on the damage tolerance of 3D weaves. In view of the recent interest in 3D woven composites from aerospace as well as non-aerospace sectors, this paper aims to provide an understanding of the impact damage resistance as well as the damage tolerance of 3D woven composites. Four different 3D woven architectures, orthogonal, angle-interlocked, layer-to-layer, and modified layer-to-layer structures, have been produced under identical weaving conditions. Two additional structures, unidirectional (UD) cross-ply and 2D plain weave, have been developed for comparison with the 3D weaves. All four 3D woven laminates have damage areas and damage widths of a similar order of magnitude, significantly lower than those of the UD and 2D woven laminates. Damage resistance, calculated as impact energy per unit damage area, has been shown to be significantly higher for the 3D woven laminates. The rate of change of CAI strength with impact energy appears to be similar for all four 3D woven laminates as well as the UD laminate; the 2D woven laminate has a higher rate of degradation with respect to impact energy. Undamaged compression strength has been shown to be a function of the average tow waviness angle. Additionally, 3D weaves exhibit a critical damage size; below this size there is no appreciable reduction in compression strength. The 3D woven laminates have also exhibited a degree of plasticity during compression, whereas UD laminates fail instantly. The experimental work reported in this paper forms a foundation for the systematic development of computational models of 3D woven architectures for damage tolerance.

  5. Damage-tolerance strategies for nacre tablets.

    PubMed

    Wang, Shengnan; Zhu, Xinqiao; Li, Qiyang; Wang, Rizhi; Wang, Xiaoxiang

    2016-05-01

    Nacre, a natural armor, exhibits prominent penetration resistance against predatory attacks. Unraveling its hierarchical toughening mechanisms and damage-tolerance design strategies may provide significant inspiration for the pursuit of high-performance artificial armors. In this work, relationships between the structure and mechanical performance of nacre were investigated. The results show that, beyond the brick-and-mortar structure, individual nacre tablets contribute significantly to the damage localization of nacre. Affected by intracrystalline organics, the tablets exhibit a unique fracture behavior. The synergistic action of the nanoscale deformation mechanisms increases the energy dissipation efficiency of the tablets and contributes to the preservation of the structural and functional integrity of the shell.

  6. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.
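
    As a hedged numerical companion to the framework described above (not the paper's formal model), the following Monte Carlo sketch estimates the probability that the realised spacing deviates from the assigned interval by more than an assumed IM tolerance, with the individual error sources modelled as independent Gaussian contributions; every magnitude below is an assumption.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000
      tol_s = 10.0                                  # assumed IM tolerance, seconds

      speed_err = rng.normal(0.0, 3.0, n)           # flight-deck speed-tracking error (s)
      wind_err  = rng.normal(0.0, 4.0, n)           # unmodelled wind effect (s)
      surv_err  = rng.normal(0.0, 2.0, n)           # surveillance/latency error (s)

      deviation = speed_err + wind_err + surv_err   # net deviation from the desired spacing
      p_exceed = np.mean(np.abs(deviation) > tol_s)
      print(f"P(|deviation| > {tol_s:.0f} s) ~ {p_exceed:.4f}")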

  7. The combined effect of glass buffer strips and stitching on the damage tolerance of composites

    NASA Technical Reports Server (NTRS)

    Kullerd, Susan M.

    1993-01-01

    Recent research has demonstrated that through-the-thickness stitching provides major improvements in the damage tolerance of composite laminates loaded in compression. However, the brittle nature of polymer matrix composites makes them susceptible to damage propagation, requiring special material applications and designs to limit damage growth. Glass buffer strips, embedded within laminates, have shown the potential for improving the damage tolerance of unstitched composite laminates loaded in tension. The glass buffer strips, less stiff than the surrounding carbon fibers, arrest crack growth in composites under tensile loads. The present study investigates the damage tolerance characteristics of laminates that contain both stitching and glass buffer strips.

  8. Use of a New Portable Instrumented Impactor on the NASA Composite Crew Module Damage Tolerance Program

    NASA Technical Reports Server (NTRS)

    Jackson, Wade C.; Polis, Daniel L.

    2014-01-01

    Damage tolerance performance is critical to composite structures because surface impacts at relatively low energies may result in a significant strength loss. For certification, damage tolerance criteria require aerospace vehicles to meet design loads while containing damage at critical locations. Data from standard small coupon testing are difficult to apply to larger more complex structures. Due to the complexity of predicting both the impact damage and the residual properties, damage tolerance is demonstrated primarily by testing. A portable, spring-propelled, impact device was developed which allows the impact damage response to be investigated on large specimens, full-scale components, or entire vehicles. During impact, both the force history and projectile velocity are captured. The device was successfully used to demonstrate the damage tolerance performance of the NASA Composite Crew Module. The impactor was used to impact 18 different design features at impact energies up to 35 J. Detailed examples of these results are presented, showing impact force histories, damage inspection results, and response to loading.

  9. Damage tolerant design using collapse techniques

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1982-01-01

    A new approach to the design of structures for improved global damage tolerance is presented. In its undamaged condition, the structure is designed subject to strength, displacement, and buckling constraints. In the damaged condition, the only constraint is that the structure will not collapse. The collapse load calculation is formulated as a maximization problem and solved with an interior extended penalty function. The design for minimum weight, subject to constraints on the undamaged structure and a specified level of the collapse load, is a minimization problem that is also solved by a penalty function formulation. The overall problem is thus a nested, or multilevel, optimization. Examples are presented to demonstrate the difference between the present approach and more traditional approaches.
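
    The following sketch shows the flavour of the nested formulation on a deliberately simple example (three parallel, elastic-perfectly-plastic bars; all numbers assumed): the outer problem minimises weight subject to an undamaged stress constraint and a required collapse load for each single-bar-out damage scenario, with a quadratic exterior penalty standing in for the paper's interior extended penalty function.

      import numpy as np
      from scipy.optimize import minimize

      L, rho, sig_y = 1.0, 1.0, 250.0      # assumed bar length, density, yield stress
      P, P_collapse_req = 300.0, 200.0     # service load and required damaged collapse load

      def weight(A):
          return rho * L * np.sum(A)

      def penalised_weight(A, r=1e3):
          A = np.maximum(A, 1e-6)
          g = [sig_y - P / A.sum()]                          # undamaged stress constraint, g >= 0
          for k in range(3):                                 # one damage scenario per severed bar
              A_dmg = np.delete(A, k)
              g.append(sig_y * A_dmg.sum() - P_collapse_req) # plastic collapse load of damaged system
          g = np.array(g)
          return weight(A) + r * np.sum(np.maximum(-g, 0.0) ** 2)

      res = minimize(penalised_weight, x0=np.ones(3), method="Nelder-Mead")
      print(np.round(res.x, 3), round(weight(res.x), 3))      # sized areas and structural weight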

  10. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance. An...

  11. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance. An...

  12. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance. An...

  13. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance. An...

  14. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance. An...

  15. Effect of water availability on tolerance of leaf damage in tall morning glory, Ipomoea purpurea

    NASA Astrophysics Data System (ADS)

    Atala, Cristian; Gianoli, Ernesto

    2009-03-01

    Resource availability may limit plant tolerance of herbivory. To predict the effect of differential resource availability on plant tolerance, the limiting resource model (LRM) considers which resource limits plant fitness and which resource is mostly affected by herbivore damage. We tested the effect of experimental drought on tolerance of leaf damage in Ipomoea purpurea, which is naturally exposed to both leaf damage and summer drought. To seek mechanistic explanations, we also measured several morphological, allocation and gas exchange traits. In this case, LRM predicts that tolerance would be the same in both water treatments. Plants were assigned to a combination of two water treatments (control and low water) and two damage treatments (50% defoliation and undamaged). Plants showed tolerance of leaf damage, i.e., a similar number of fruits were produced by damaged and undamaged plants, only in control water. Whereas experimental drought affected all plant traits, leaf damage caused plants to show a greater leaf trichome density and reduced shoot biomass, but only in low water. It is suggested that the reduced fitness (number of fruits) of damaged plants in low water was mediated by the differential reduction of shoot biomass, because the number of fruits per shoot biomass was similar in damaged and undamaged plants. Alternative but less likely explanations include the opposing direction of functional responses to drought and defoliation, and resource costs of the damage-induced leaf trichome density. Our results somewhat challenge the LRM predictions, but further research including field experiments is needed to validate some of the preliminary conclusions drawn.

  16. Intraspecific competition facilitates the evolution of tolerance to insect damage in the perennial plant Solanum carolinense.

    PubMed

    McNutt, David W; Halpern, Stacey L; Barrows, Kahaili; Underwood, Nora

    2012-12-01

    Tolerance to herbivory (the degree to which plants maintain fitness after damage) is a key component of plant defense, so understanding how natural selection and evolutionary constraints act on tolerance traits is important to general theories of plant-herbivore interactions. These factors may be affected by plant competition, which often interacts with damage to influence trait expression and fitness. However, few studies have manipulated competitor density to examine the evolutionary effects of competition on tolerance. In this study, we tested whether intraspecific competition affects four aspects of the evolution of tolerance to herbivory in the perennial plant Solanum carolinense: phenotypic expression, expression of genetic variation, the adaptive value of tolerance, and costs of tolerance. We manipulated insect damage and intraspecific competition for clonal lines of S. carolinense in a greenhouse experiment, and measured tolerance in terms of sexual and asexual fitness components. Compared to plants growing at low density, plants growing at high density had greater expression of and genetic variation in tolerance, and experienced greater fitness benefits from tolerance when damaged. Tolerance was not costly for plants growing at either density, and only plants growing at low density benefited from tolerance when undamaged, perhaps due to greater intrinsic growth rates of more tolerant genotypes. These results suggest that competition is likely to facilitate the evolution of tolerance in S. carolinense, and perhaps in other plants that regularly experience competition, while spatio-temporal variation in density may maintain genetic variation in tolerance.

  17. Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti

    NASA Technical Reports Server (NTRS)

    Johnson, Jerry

    1992-01-01

    The Piaggio P-180 Avanti, a twin pusher-prop, nine-passenger business aircraft, was certified in 1990 to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage-tolerance methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implications of damage tolerance for static and fatigue testing are discussed. Good correlation between finite element solutions and experimental test data was observed.

  18. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage-tolerance- and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties (including crack growth rate), initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by the probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun whenever the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully assess the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection schedules, POD curves, and repair/replacement strategies. Since MC simulation is time consuming, the simulations were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
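
    For readers who want to see the brute-force side of such a verification, the sketch below runs a heavily simplified Monte Carlo of crack growth with periodic inspection: Paris-law growth (m = 3, integrated in closed form over each inspection interval), a probability-of-detection curve, and repair modelled as a reset of the flaw size. Every parameter is an assumption for illustration; this is not the study's model or data.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000
      C, dsig = 1e-11, 120.0                      # assumed Paris coefficient and stress range
      a0_med, a0_sig = 2e-4, 1.0                  # assumed initial flaw-size distribution (lognormal, m)
      a_crit = 0.02                               # assumed critical crack size, m
      n_insp, cycles = 20, 20_000                 # inspections and cycles between them
      B = C * dsig ** 3 * np.pi ** 1.5            # closed-form growth factor for m = 3

      def pod(a):                                 # assumed log-logistic probability of detection
          return 1.0 / (1.0 + (2e-3 / np.maximum(a, 1e-9)) ** 4)

      a = rng.lognormal(np.log(a0_med), a0_sig, n)
      failed = np.zeros(n, dtype=bool)
      for _ in range(n_insp):
          idx = np.flatnonzero(~failed)
          inv = 1.0 / np.sqrt(a[idx]) - 0.5 * B * cycles       # grow surviving cracks one interval
          new_fail = inv <= 1.0 / np.sqrt(a_crit)
          failed[idx[new_fail]] = True
          a[idx[~new_fail]] = 1.0 / inv[~new_fail] ** 2
          detected = (~failed) & (rng.random(n) < pod(a))       # inspection with POD
          a[detected] = rng.lognormal(np.log(a0_med), a0_sig, detected.sum())  # repair/replace
      print("cumulative probability of failure ~", failed.mean())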

  19. Collection, processing, and reporting of damage tolerant design data for non-aerospace structural materials

    NASA Technical Reports Server (NTRS)

    Huber, P. D.; Gallagher, J. P.

    1994-01-01

    This report describes the organization, format, and content of the NASA Johnson damage-tolerant database, which was created to store damage-tolerant property data for non-aerospace structural materials. The database is designed to store fracture toughness data (K_IC, K_c, J_IC, and CTOD_IC), resistance-curve data (K_R vs. Δa_eff and J_R vs. Δa_eff), as well as subcritical crack growth data (a vs. N and da/dN vs. ΔK). The database contains complementary material property data for both stainless and alloy steels, as well as for aluminum, nickel, and titanium alloys, which were not incorporated into the Damage Tolerant Design Handbook database.
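
    A hypothetical record layout in the spirit of the description above (the actual database schema is not given in the abstract, so every field name here is an assumption):

      from dataclasses import dataclass, field
      from typing import List, Optional, Tuple

      @dataclass
      class DamageTolerantRecord:
          material: str                        # e.g. "Ti-6Al-4V plate"
          material_class: str                  # "stainless steel", "aluminum", "nickel", "titanium", ...
          K_IC: Optional[float] = None         # plane-strain fracture toughness, MPa*sqrt(m)
          K_c: Optional[float] = None          # thickness-dependent toughness
          J_IC: Optional[float] = None
          CTOD_IC: Optional[float] = None
          r_curve: List[Tuple[float, float]] = field(default_factory=list)      # (delta_a_eff, K_R)
          dadn_curve: List[Tuple[float, float]] = field(default_factory=list)   # (delta_K, da/dN)
          a_vs_N: List[Tuple[int, float]] = field(default_factory=list)         # (N, a)

      rec = DamageTolerantRecord("Ti-6Al-4V plate", "titanium", K_IC=75.0,
                                 dadn_curve=[(10.0, 1e-8), (20.0, 8e-8), (40.0, 6e-7)])
      print(rec.material, rec.K_IC)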

  20. Low cost damage tolerant composite fabrication

    NASA Technical Reports Server (NTRS)

    Palmer, R. J.; Freeman, W. T.

    1988-01-01

    The resin transfer molding (RTM) process applied to composite aircraft parts offers the potential for using low cost resin systems with dry graphite fabrics that can be significantly less expensive than prepreg tape fabricated components. Stitched graphite fabric composites have demonstrated compression after impact failure performance that equals or exceeds that of thermoplastic or tough thermoset matrix composites. This paper reviews methods developed to fabricate complex shape composite parts using stitched graphite fabrics to increase damage tolerance with RTM processes to reduce fabrication cost.

  1. Advanced Composite Wind Turbine Blade Design Based on Durability and Damage Tolerance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abumeri, Galib; Abdi, Frank

    2012-02-16

    composite damage and fracture modes that resemble those reported in the tests. The results show that computational simulation can be relied on to enhance the design of tapered composite structures such as the ones used in wind turbine blades. A computational simulation for durability, damage tolerance (D&DT) and reliability of composite wind turbine blade structures in the presence of uncertainties in material properties was performed. A composite turbine blade was first assessed with finite element based multi-scale progressive failure analysis to determine failure modes and locations as well as the fracture load. The D&DT analyses were then validated against a static test performed at Sandia National Laboratories. The work was followed by a detailed weight analysis to identify the contribution of various materials to the overall weight of the blade. The methodology ensured that certain types of failure modes, such as delamination progression, are contained to reduce risk to the structure. Probabilistic analysis indicated that the composite shear strength has a great influence on the blade ultimate load under static loading. Weight was reduced by 12% with robust design without loss in reliability or D&DT. Structural benefits obtained with the use of enhanced matrix properties through nanoparticle infusion were also assessed. Thin unidirectional fiberglass layers enriched with silica nanoparticles were applied to the outer surfaces of a wind blade to improve its overall structural performance and durability. The wind blade was a 9-meter prototype structure manufactured and tested under three-saddle static loading at Sandia National Laboratories (SNL). The blade manufacturing did not include the use of any nano-material. With silica nanoparticles in the glass composite applied to the exterior surfaces of the blade, the durability and damage tolerance (D&DT) results from multi-scale PFA showed an increase in the ultimate load of the blade by 9.2% as compared to the baseline structural performance (without

  2. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    NASA Technical Reports Server (NTRS)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified that are expected to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random perturbations of geometric and material properties, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat-plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.
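
    To show how variances of geometric and material perturbations translate into response scatter, the sketch below propagates assumed thickness and modulus tolerances through the classical thin-plate formula for the first natural frequency of a simply supported flat plate; it stands in for, and is far simpler than, the special-purpose finite element code used in the study, and all numerical values are assumptions.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      a, b = 0.10, 0.05                     # assumed plate planform, m
      rho, nu = 4430.0, 0.31                # titanium-like density (kg/m^3) and Poisson's ratio

      h = rng.normal(2.0e-3, 0.05e-3, n)    # thickness with manufacturing scatter, m
      E = rng.normal(110e9, 5e9, n)         # Young's modulus scatter, Pa

      D = E * h ** 3 / (12.0 * (1.0 - nu ** 2))                                   # flexural rigidity
      w11 = np.pi ** 2 * ((1 / a) ** 2 + (1 / b) ** 2) * np.sqrt(D / (rho * h))   # first mode, rad/s
      f11 = w11 / (2.0 * np.pi)
      print(f"mean {f11.mean():.0f} Hz, coefficient of variation {f11.std() / f11.mean():.3%}")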

  3. 75 FR 24502 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures; Reopening of Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    .... FAA-2009-0660; Notice No. 10-09] RIN 2120-AJ52 Damage Tolerance and Fatigue Evaluation of Composite... requirements of normal and transport category rotorcraft. The amendment would address advances in composite... 793) Notice No. 09-12, entitled ``Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft...

  4. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    PubMed Central

    Chen, Y.; Yu, K Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-01-01

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials. PMID:25906997

  5. Damage-tolerant nanotwinned metals with nanovoids under radiation environments.

    PubMed

    Chen, Y; Yu, K Y; Liu, Y; Shao, S; Wang, H; Kirk, M A; Wang, J; Zhang, X

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  6. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    DOE PAGES

    Chen, Y.; Yu, K. Y.; Liu, Y.; ...

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  7. Fuel containment, lightning protection and damage tolerance in large composite primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Griffin, Charles F.; James, Arthur M.

    1985-01-01

    The damage-tolerance characteristics of high strain-to-failure graphite fibers and toughened resins were evaluated. Test results show that conventional fuel tank sealing techniques are applicable to composite structures. Techniques were developed to prevent fuel leaks due to low-energy impact damage. For wing panels subjected to swept stroke lightning strikes, a surface protection of graphite/aluminum wire fabric and a fastener treatment proved effective in eliminating internal sparking and reducing structural damage. The technology features developed were incorporated and demonstrated in a test panel designed to meet the strength, stiffness, and damage tolerance requirements of a large commercial transport aircraft. The panel test results exceeded design requirements for all test conditions. Wing surfaces constructed with composites offer large weight savings if design allowable strains for compression can be increased from current levels.

  8. Effect of Translaminar Reinforcements and Hybridization on Damage Resistance and Tolerance of Composite Laminates

    DTIC Science & Technology

    2012-01-01

    It was shown that the damage resistance and tolerance of laminated composites can be enhanced by the employment of translaminar reinforcements (TLR) such as stitching, z-pinning and 3D weaving, and also by hybrid...

  9. Low velocity instrumented impact testing of four new damage tolerant carbon/epoxy composite systems

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1990-01-01

    Low velocity drop weight instrumented impact testing was utilized to examine the damage resistance of four recently developed carbon fiber/epoxy resin systems. A fifth material, T300/934, for which a large data base exists, was also tested for comparison purposes. A 16-ply quasi-isotropic lay-up configuration was used for all the specimens. Force/absorbed energy-time plots were generated for each impact test. The specimens were cross-sectionally analyzed to record the damage corresponding to each impact energy level. Maximum force of impact versus impact energy plots were constructed to compare the various systems for impact damage resistance. Results show that the four new damage tolerant fiber/resin systems far outclassed the T300/934 material. The most damage tolerant material tested was the IM7/1962 fiber/resin system.

  10. Alumina additions may improve the damage tolerance of soft machined zirconia-based ceramics.

    PubMed

    Oilo, Marit; Tvinnereim, Helene M; Gjerdet, Nils Roar

    2011-01-01

    The aim of this study was to evaluate the damage tolerance of different zirconia-based materials. Bars of one hard machined and one soft machined dental zirconia and an experimental 95% zirconia 5% alumina ceramic were subjected to 100,000 stress cycles (n = 10), indented to provoke cracks on the tensile stress side (n = 10), and left untreated as controls (n = 10). The experimental material demonstrated a higher relative damage tolerance, with a 40% reduction compared to 68% for the hard machined zirconia and 84% for the soft machined zirconia.

  11. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Composite Rotorcraft Structures. 29.573 Section 29.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Structures. (a) Each applicant must evaluate the composite rotorcraft structure under the damage tolerance..., types, and sizes of damage, considering fatigue, environmental effects, intrinsic and discrete flaws...

  12. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Composite Rotorcraft Structures. 27.573 Section 27.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Structures. (a) Each applicant must evaluate the composite rotorcraft structure under the damage tolerance..., types, and sizes of damage, considering fatigue, environmental effects, intrinsic and discrete flaws...

  13. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Composite Rotorcraft Structures. 27.573 Section 27.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Structures. (a) Each applicant must evaluate the composite rotorcraft structure under the damage tolerance..., types, and sizes of damage, considering fatigue, environmental effects, intrinsic and discrete flaws...

  14. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Composite Rotorcraft Structures. 29.573 Section 29.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Structures. (a) Each applicant must evaluate the composite rotorcraft structure under the damage tolerance..., types, and sizes of damage, considering fatigue, environmental effects, intrinsic and discrete flaws...

  15. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Composite Rotorcraft Structures. 27.573 Section 27.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Structures. (a) Each applicant must evaluate the composite rotorcraft structure under the damage tolerance..., types, and sizes of damage, considering fatigue, environmental effects, intrinsic and discrete flaws...

  16. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Composite Rotorcraft Structures. 29.573 Section 29.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Structures. (a) Each applicant must evaluate the composite rotorcraft structure under the damage tolerance..., types, and sizes of damage, considering fatigue, environmental effects, intrinsic and discrete flaws...

  17. Damage Tolerant Design for Cold-Section Turbine Engine Disks

    DTIC Science & Technology

    1981-06-01

    [List-of-figures excerpt from the report: Ti-6Al-4V Disks; F100 2nd-Stage Fan Disk Designs; Fan Disk Tangential Stress Profile; Life-Limiting Features of Damage-Tolerant Disk; Disk Life Limits; Life Test ...; Stress Ratio Model; Thick-Section Center-Notched Specimen; Bolthole Specimen.]

  18. The research and development of damage tolerant carbon fiber composites

    NASA Astrophysics Data System (ADS)

    Miranda, John Armando

    This record of study takes a first-hand look at corporate research and development efforts to improve the damage tolerance of two unique composite materials used in high-performance aerospace applications. The professional internship with The Dow Chemical Company (Dow/United Technologies joint venture) describes the intern's involvement in developing patentable process technologies for interleave toughening of high-temperature resins and their composites. The subsequent internship with Hexcel Corporation describes the intern's involvement in developing the damage tolerance of novel and existing honeycomb sandwich structure technologies. Through the Doctor of Engineering professional internship experience, this student exercised fundamental academic understanding and methods toward accomplishing the corporate objectives of the internship sponsors in a resource-efficient and cost-effective manner. The student also gained tremendous autonomy through exceptional training, working in focused team environments with highly trained engineers and scientists to achieve important corporate objectives.

  19. Damage Tolerance Testing of a NASA TransHab Derivative Woven Inflatable Module

    NASA Technical Reports Server (NTRS)

    Edgecombe, John; delaFuente, Horacio; Valle, Gerard

    2009-01-01

    Current options for Lunar habitat architecture include inflatable habitats and airlocks. Inflatable structures can have mass and volume advantages over conventional structures. However, inflatable structures carry different inherent risks and are at a lower Technical Readiness Level (TRL) than more conventional metallic structures. One of the risks associated with inflatable structures is in understanding the tolerance to induced damage. The Damage Tolerance Test (DTT) is designed to study the structural integrity of an expandable structure. TransHab was an experimental inflatable module developed at the NASA Johnson Space Center in the 1990s. The TransHab design was originally envisioned for use in Mars transits but was also studied as a potential habitat for the International Space Station (ISS). The design of the TransHab module was based on a woven design using an Aramid fabric. Testing of this design demonstrated a high level of predictability and repeatability with analytical predictions of stresses and deflections. Based on JSC's experience with the design and analysis of woven inflatable structures, the Damage Tolerance Test article was designed and fabricated using a woven design. The DTT article was inflated to 45 psig, representing 25% of the ultimate burst pressure, and one of the one-inch-wide longitudinal structural members was severed by initiating a Linear Shaped Charge (LSC). Strain gage measurements, at the interface between the expandable elements (straps) and the non-expandable metallic elements for pre-selected longitudinal straps, were taken throughout pressurization of the module and strap separation. Strain gage measurements show no change in longitudinal strap loading at the bulkhead interface after strap separation, indicating that loads in the restraint layer were re-distributed local to the damaged area due to the effects of friction under high internal pressure loading. The test completed all primary objectives with better than

  20. Damage-Tolerant Fan Casings for Jet Engines

    NASA Technical Reports Server (NTRS)

    2006-01-01

    All turbofan engines work on the same principle. A large fan at the front of the engine draws air in. A portion of the air enters the compressor, but a greater portion passes on the outside of the engine; this is called bypass air. The air that enters the compressor then passes through several stages of rotating fan blades that compress the air further, and then it passes into the combustor. In the combustor, fuel is injected into the airstream, and the fuel-air mixture is ignited. The hot gases produced expand rapidly to the rear, and the engine reacts by moving forward. If there is a flaw in the system, such as an unexpected obstruction, a fan blade can break, spin off, and harm other engine components. Fan casings, therefore, need to be strong enough to contain errant blades and damage-tolerant enough to withstand the punishment of a loose blade turned projectile. NASA has spearheaded research into improving jet engine fan casings, ultimately discovering a cost-effective approach to manufacturing damage-tolerant fan cases that also boast significant weight reduction. In an aircraft, weight reduction translates directly into fuel-burn savings, increased payload, and greater aircraft range. This technology increases safety and structural integrity; it is an attractive, viable option for engine manufacturers because of the low-cost manufacturing; and it is a practical alternative for customers, as it has the added cost-saving benefits of the weight reduction.

  1. Rotational 3D printing of damage-tolerant composites with programmable mechanics

    PubMed Central

    Raney, Jordan R.; Compton, Brett G.; Ober, Thomas J.; Shea, Kristina; Lewis, Jennifer A.

    2018-01-01

    Natural composites exhibit exceptional mechanical performance that often arises from complex fiber arrangements within continuous matrices. Inspired by these natural systems, we developed a rotational 3D printing method that enables spatially controlled orientation of short fibers in polymer matrices solely by varying the nozzle rotation speed relative to the printing speed. Using this method, we fabricated carbon fiber–epoxy composites composed of volume elements (voxels) with programmably defined fiber arrangements, including adjacent regions with orthogonally and helically oriented fibers that lead to nonuniform strain and failure as well as those with purely helical fiber orientations akin to natural composites that exhibit enhanced damage tolerance. Our approach broadens the design, microstructural complexity, and performance space for fiber-reinforced composites through site-specific optimization of their fiber orientation, strain, failure, and damage tolerance. PMID:29348206
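
    A back-of-the-envelope illustration of the control variable in this record: under a simple geometric idealization (an assumption, not a relation stated in the abstract), the helix angle imparted to the fibers scales with the tangential speed at the nozzle wall relative to the print speed. The rotation rates, nozzle radius, and print speed in the sketch below are arbitrary example values.

    ```python
    import math

    def helix_angle_deg(rotation_rate_hz: float, nozzle_radius_mm: float,
                        print_speed_mm_s: float) -> float:
        """Idealized fiber helix angle: ratio of the tangential speed at the
        nozzle wall (2*pi*f*R) to the axial print speed. Purely illustrative."""
        tangential_mm_s = 2.0 * math.pi * rotation_rate_hz * nozzle_radius_mm
        return math.degrees(math.atan2(tangential_mm_s, print_speed_mm_s))

    # Sweep the rotation rate at a fixed print speed to see how the programmed
    # fiber orientation moves from axial (0 deg) toward circumferential (90 deg).
    for f_hz in (0.0, 0.5, 1.0, 2.0, 5.0):
        angle = helix_angle_deg(f_hz, nozzle_radius_mm=0.25, print_speed_mm_s=10.0)
        print(f"rotation {f_hz:4.1f} rev/s -> helix angle {angle:5.1f} deg")
    ```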

  2. Damage-tolerant metallic composites via melt infiltration of additively manufactured preforms

    DOE PAGES

    Pawlowski, Alexander E.; Cordero, Zachary C.; French, Matthew R.; ...

    2017-04-22

    A facile two-step approach for 3D printing metal-metal composites with precisely controlled microstructures is described. Composites made with this approach exhibit tailorable thermal and mechanical properties as well as exceptional damage tolerance.

  3. Damage-tolerant metallic composites via melt infiltration of additively manufactured preforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlowski, Alexander E.; Cordero, Zachary C.; French, Matthew R.

    A facile two-step approach for 3D printing metal-metal composites with precisely controlled microstructures is described. Composites made with this approach exhibit tailorable thermal and mechanical properties as well as exceptional damage tolerance.

  4. Design Manual for Impact Damage Tolerant Aircraft Structure

    DTIC Science & Technology

    1981-10-01

    controlled AAA guns having a very high rate of fire, is a significant threat. The final type of non-exploding military projectile, missile warhead fragments... AGARD-AG-238 NORTH ATLANTIC TREATY ORGANIZATION ADVISORY GROUP FOR AEROSPACE RESEARCH AND DEVELOPMENT (ORGANISATION DU TRAITE DE L'ATLANTIQUE NORD)... within the Working Group on Impact Damage Tolerance of Structures and also at the Specialists' Meeting held in Ankara in September 1975 (see AGARD

  5. Materials and processes laboratory composite materials characterization task, part 1. Damage tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Tucker, D. S.; Patterson, W. J.; Franklin, S. W.; Gordon, G. H.; Hart, L.; Hodge, A. J.; Lance, D. G.; Russel, S. S.

    1991-01-01

    A test run was performed on IM6/3501-6 carbon-epoxy in which the material was processed, machined into specimens, and tested for damage tolerance capabilities. Nondestructive test data played a major role in this element of composite characterization. A time chart was produced showing the time the composite material spent within each Branch or Division in order to identify those areas that produce a long turnaround time. Instrumented drop-weight testing was performed on the specimens, with nondestructive evaluation performed before and after the impacts. Destructive testing in the form of cross-sectional photomicrography and compression-after-impact testing was also used. Results show that the processing and machining steps need to be performed more rapidly if data on composite materials are to be collected within a reasonable timeframe. The damage tolerance testing showed that IM6/3501-6 is a brittle material that is very susceptible to impact damage.

  6. Rotational 3D printing of damage-tolerant composites with programmable mechanics.

    PubMed

    Raney, Jordan R; Compton, Brett G; Mueller, Jochen; Ober, Thomas J; Shea, Kristina; Lewis, Jennifer A

    2018-02-06

    Natural composites exhibit exceptional mechanical performance that often arises from complex fiber arrangements within continuous matrices. Inspired by these natural systems, we developed a rotational 3D printing method that enables spatially controlled orientation of short fibers in polymer matrices solely by varying the nozzle rotation speed relative to the printing speed. Using this method, we fabricated carbon fiber-epoxy composites composed of volume elements (voxels) with programmably defined fiber arrangements, including adjacent regions with orthogonally and helically oriented fibers that lead to nonuniform strain and failure as well as those with purely helical fiber orientations akin to natural composites that exhibit enhanced damage tolerance. Our approach broadens the design, microstructural complexity, and performance space for fiber-reinforced composites through site-specific optimization of their fiber orientation, strain, failure, and damage tolerance. Copyright © 2018 the Author(s). Published by PNAS.

  7. Modulation of inflammation and disease tolerance by DNA damage response pathways.

    PubMed

    Neves-Costa, Ana; Moita, Luis F

    2017-03-01

    The accurate replication and repair of DNA is central to organismal survival. This process is challenged by the many factors that can change genetic information, such as replication errors and direct damage to the DNA molecule by chemical and physical agents. DNA damage can also result from microorganism invasion as an integral step of their life cycle or as collateral damage from host defense mechanisms against pathogens. Here we review the complex crosstalk of DNA damage response and immune response pathways that might be evolutionarily connected and argue that DNA damage response pathways can be explored therapeutically to induce disease tolerance through the activation of tissue damage control processes. Such an approach may constitute the missing pillar in the treatment of critical illnesses caused by multiple organ failure, such as sepsis and septic shock. © 2016 Federation of European Biochemical Societies.

  8. Advanced Durability and Damage Tolerance Design and Analysis Methods for Composite Structures: Lessons Learned from NASA Technology Development Programs

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Shuart, Mark J.

    2003-01-01

    Aerospace vehicles are designed to be durable and damage tolerant. Durability is largely an economic life-cycle design consideration whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. However, both durability and damage tolerance design methodologies must address the deleterious effects of changes in material properties and the initiation and growth of microstructural damage that may occur during the service lifetime of the vehicle. Durability and damage tolerance design and certification requirements are addressed for commercial transport aircraft and NASA manned spacecraft systems. The state-of-the-art in advanced design and analysis methods is illustrated by discussing the results of several recently completed NASA technology development programs. These programs include the NASA Advanced Subsonic Technology Program demonstrating technologies for large transport aircraft and the X-33 hypersonic test vehicle demonstrating technologies for a single-stage-to-orbit space launch vehicle.

  9. A damage tolerance comparison of IM7/8551 and IM8G/8553 carbon/epoxy composites

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1991-01-01

    A damage tolerance study of two new toughened carbon fiber/epoxy resin systems was undertaken as a continuation of ongoing work into screening new composites for resistance to foreign object impact. This report is intended to be a supplement to NASA TP 3029, in which four new fiber/resin systems were tested for damage tolerance. Instrumented drop-weight impact testing was used to inflict damage to 16-ply quasi-isotropic specimens. Instrumented output data and cross-sectional examinations of the damage zone were utilized to quantify the damage. It was found that the two fiber/resin systems tested in this study were much more impact resistant than an untoughened composite such as T300/934, but were not as impact resistant as other materials previously studied.

  10. Limiting damage during infection: lessons from infection tolerance for novel therapeutics.

    PubMed

    Vale, Pedro F; Fenton, Andy; Brown, Sam P

    2014-01-01

    The distinction between pathogen elimination and damage limitation during infection is beginning to change perspectives on infectious disease control, and has recently led to the development of novel therapies that focus on reducing the illness caused by pathogens ("damage limitation") rather than reducing pathogen burdens directly ("pathogen elimination"). While beneficial at the individual host level, the population consequences of these interventions remain unclear. To address this issue, we present a simple conceptual framework for damage limitation during infection that distinguishes between therapies that are either host-centric (pro-tolerance) or pathogen-centric (anti-virulence). We then draw on recent developments from the evolutionary ecology of disease tolerance to highlight some potential epidemiological and evolutionary responses of pathogens to medical interventions that target the symptoms of infection. Just as pathogens are known to evolve in response to antimicrobial and vaccination therapies, we caution that claims of "evolution-proof" anti-virulence interventions may be premature, and further, that in infections where virulence and transmission are linked, reducing illness without reducing pathogen burden could have non-trivial epidemiological and evolutionary consequences that require careful examination.

  11. Nano-enhanced aerospace composites for increased damage tolerance and service life damage monitoring

    NASA Astrophysics Data System (ADS)

    Paipetis, A.; Matikas, T. E.; Barkoula, N. M.; Karapappas, P.; Vavouliotis, A.; Kostopoulos, V.

    2009-03-01

    This study deals with a new generation of composite systems which, apart from the primary reinforcement at the typical fiber scale (~10 μm), are also reinforced at the nanoscale. This is achieved by incorporating nano-scale additives into typical aerospace matrix systems, such as epoxies. Carbon Nanotubes (CNTs) are ideal candidates, as their extremely high aspect ratio and mechanical properties render them advantageous over other nanoscale materials. The result is a significant increase in the damage tolerance of the novel composite systems even at very low CNT loadings. By monitoring the resistance change of the CNT network, information is obtained both on the real-time deformation state of the composite, as a reversible change in the bulk resistance of the material, and on its damage state, as an irreversible change in the bulk resistance. The irreversible monotonic increase of the electrical resistance can be related to internal damage in the hybrid composite system and may be used as an index of the remaining lifetime of a structural component.
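
    A minimal data-reduction sketch of the monitoring scheme described above, assuming the network resistance is read both under load and after unloading each cycle: the recoverable part of the change tracks strain, while the residual change at zero load acts as the damage index. The resistance values and cycle history below are hypothetical, not data from the study.

    ```python
    def damage_index(r_unloaded_ohm: float, r_pristine_ohm: float) -> float:
        """Irreversible part of the resistance change: the residual change
        measured at zero load, normalized by the pristine resistance."""
        return (r_unloaded_ohm - r_pristine_ohm) / r_pristine_ohm

    def strain_indicator(r_loaded_ohm: float, r_unloaded_ohm: float) -> float:
        """Reversible part: the additional change under load relative to the
        current unloaded baseline of the CNT network."""
        return (r_loaded_ohm - r_unloaded_ohm) / r_unloaded_ohm

    r_pristine = 100.0  # ohms, undamaged laminate (illustrative value)
    # (resistance under load, resistance after unloading) for three fatigue cycles
    history = [(100.40, 100.05), (100.90, 100.15), (101.60, 100.35)]

    for cycle, (r_load, r_rest) in enumerate(history, start=1):
        print(f"cycle {cycle}: reversible (strain) part "
              f"{strain_indicator(r_load, r_rest):.3%}, "
              f"damage index {damage_index(r_rest, r_pristine):.3%}")
    ```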

  12. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
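
    One simple way to realize this idea, sketched below under assumptions that are not taken from the paper itself, is to let each agent hop between spatial bins according to a Metropolis-Hastings transition matrix whose stationary distribution equals the desired swarm density; every agent decides independently using only its own current bin. The bin count, ring topology, and target density are illustrative choices.

    ```python
    import numpy as np

    def metropolis_chain(target: np.ndarray, adjacency: np.ndarray) -> np.ndarray:
        """Row-stochastic transition matrix with stationary distribution `target`,
        built with a Metropolis-Hastings rule restricted to allowed moves."""
        n = len(target)
        deg = adjacency.sum(axis=1)            # number of neighbors per bin
        P = np.zeros((n, n))
        for i in range(n):
            for j in np.flatnonzero(adjacency[i]):
                accept = min(1.0, (target[j] * deg[i]) / (target[i] * deg[j]))
                P[i, j] = accept / deg[i]      # propose j uniformly, then accept
            P[i, i] = 1.0 - P[i].sum()         # otherwise stay put
        return P

    rng = np.random.default_rng(0)
    n_bins = 5
    target = np.array([0.4, 0.1, 0.1, 0.1, 0.3])        # desired swarm density
    adjacency = np.zeros((n_bins, n_bins), dtype=bool)   # ring: hop to neighbors only
    for i in range(n_bins):
        adjacency[i, (i - 1) % n_bins] = adjacency[i, (i + 1) % n_bins] = True

    P = metropolis_chain(target, adjacency)

    agents = rng.integers(0, n_bins, size=10_000)        # arbitrary initial positions
    for _ in range(300):                                 # independent decision per agent
        u = rng.random(agents.size)
        agents = (u[:, None] < np.cumsum(P[agents], axis=1)).argmax(axis=1)

    print("target  :", target)
    print("achieved:", np.bincount(agents, minlength=n_bins) / agents.size)
    ```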

  13. Damage Tolerance Enhancement of Carbon Fiber Reinforced Polymer Composites by Nanoreinforcement of Matrix

    NASA Astrophysics Data System (ADS)

    Fenner, Joel Stewart

    Nanocomposites are a relatively new class of materials which incorporate exotic, engineered nanoparticles to achieve superior material properties. Because of their extremely small size and well-ordered structure, many nanoparticles possess properties that exceed those offered by a wide range of other known materials, making them attractive candidates for novel materials engineering development. Their small size is also an impediment to their practical use, as they typically cannot be employed by themselves to realize those properties in large structures. Furthermore, nanoparticles typically possess strong self-affinity, rendering them difficult to disperse uniformly into a composite. However, contemporary research has shown that, if well-dispersed, nanoparticles have great capacity to improve the mechanical properties of composites, especially damage tolerance, in the form of fracture toughness, fatigue life, and impact damage mitigation. This research focuses on the development, manufacturing, and testing of hybrid micro/nanocomposites comprised of woven carbon fibers with a carbon nanotube reinforced epoxy matrix. Material processing consisted of dispersant-and-sonication based methods to disperse nanotubes into the matrix, and a vacuum-assisted wet lay-up process to prepare the hybrid composite laminates. Various damage tolerance properties of the hybrid composite were examined, including static strength, fracture toughness, fatigue life, fatigue crack growth rate, and impact damage behavior, and compared with similarly-processed reference material produced without nanoreinforcement. Significant improvements were obtained in interlaminar shear strength (15%), Mode-I fracture toughness (180%), shear fatigue life (order of magnitude), Mode-I fatigue crack growth rate (factor of 2), and effective impact damage toughness (40%). Observations by optical microscopy, scanning electron microscopy, and ultrasonic imaging showed significant differences in failure behavior

  14. Damage Tolerance of Pre-Stressed Composite Panels Under Impact Loads

    NASA Astrophysics Data System (ADS)

    Johnson, Alastair F.; Toso-Pentecôte, Nathalie; Schueler, Dominik

    2014-02-01

    An experimental test campaign studied the structural integrity of carbon fibre/epoxy panels preloaded in tension or compression and then subjected to gas gun impact tests causing significant damage. The test programme used representative composite aircraft fuselage panels composed of aerospace carbon fibre toughened epoxy prepreg laminates. Preload levels in tension were representative of design limit loads for fuselage panels of this size, and maximum compression preloads were in the post-buckle region. Two main impact scenarios were considered: notch damage from a 12 mm steel cube projectile, at velocities in the range 93-136 m/s; and blunt impact damage from 25 mm diameter glass balls, at velocities of 64-86 m/s. The combined influence of preload and impact damage on panel residual strengths was measured and the results analysed in the context of damage tolerance requirements for composite aircraft panels. The tests showed structural integrity well above design limit loads for composite panels preloaded in tension and compression with visible notch impact damage from hard-body impact tests. However, blunt impact tests on buckled compression-loaded panels caused large delamination damage regions, which lowered the plate bending stiffness and significantly reduced the compression strength of the buckled panels.

  15. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews compiled after each of the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
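
    The evaluation logic summarized in this record (a bagged tree ensemble, a split-sample test, and assessment by mean bias, mean absolute error, and coverage of the 5-95% predictive interval) can be sketched in a few lines. The example below uses synthetic stand-in data rather than the Elbe and Danube survey records, and the predictor set and coefficients are invented for illustration only.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(1)

    # Synthetic stand-in for building-level records: water depth [m], inundation
    # duration [h], and a precaution indicator, with relative loss in [0, 1].
    n = 2000
    X = np.column_stack([rng.uniform(0, 3, n),
                         rng.uniform(1, 200, n),
                         rng.uniform(0, 1, n)])
    rloss = np.clip(0.2 * X[:, 0] + 0.0005 * X[:, 1] - 0.1 * X[:, 2]
                    + rng.normal(0, 0.05, n), 0, 1)

    # Split-sample test: derive the model on one subset, evaluate on the rest.
    train, test = np.arange(n) < 1500, np.arange(n) >= 1500
    bag = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                           random_state=0).fit(X[train], rloss[train])

    # Per-tree predictions give an empirical predictive distribution per building,
    # from which a 5-95% predictive interval and its coverage can be computed.
    member = np.stack([tree.predict(X[test]) for tree in bag.estimators_])
    lo, hi = np.quantile(member, [0.05, 0.95], axis=0)
    mean = member.mean(axis=0)

    print("mean bias          :", float(np.mean(mean - rloss[test])))
    print("mean absolute error:", float(np.mean(np.abs(mean - rloss[test]))))
    print("5-95% coverage     :", float(np.mean((rloss[test] >= lo) & (rloss[test] <= hi))))
    ```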

  16. Damage tolerance assessment handbook. Volume 1 : introduction, fracture mechanics, fatigue crack propagation

    DOT National Transportation Integrated Search

    1999-02-01

    The handbook is presented in two volumes. This volume, Volume I, introduces the damage tolerance concept with an historical perspective followed by the fundamentals of fracture mechanics and fatigue crack propagation. Various fracture criteria and cr...

  17. Damage tolerance of a composite sandwich with interleaved foam core

    NASA Astrophysics Data System (ADS)

    Ishai, Ori; Hiel, Clement

    A composite sandwich panel consisting of carbon fiber-reinforced plastic (CFRP) skins and a syntactic foam core was selected as an appropriate structural concept for the design of wind tunnel compressor blades. Interleaving of the core with tough interlayers was done to prevent core cracking and to improve the damage tolerance of the sandwich. Simply supported sandwich beam specimens were subjected to low-velocity drop-weight impacts as well as high-velocity ballistic impacts. The performance of the interleaved-core sandwich panels was characterized by localized skin damage and minor cracking of the core. Residual compressive strength (RCS) of the skin, which was derived from flexural tests, shows the expected trend of decreasing with increasing damage size, impact energy, and velocity. In the case of skin damage, RCS values of around 50 percent of the virgin interleaved reference were obtained at the upper end of the impact energy range. Based on the similarity between low-velocity and ballistic-impact effects, it was concluded that impact energy is the main variable controlling damage and residual strength, whereas velocity plays a minor role.

  18. Damage tolerance of a composite sandwich with interleaved foam core

    NASA Technical Reports Server (NTRS)

    Ishai, Ori; Hiel, Clement

    1992-01-01

    A composite sandwich panel consisting of carbon fiber-reinforced plastic (CFRP) skins and a syntactic foam core was selected as an appropriate structural concept for the design of wind tunnel compressor blades. Interleaving of the core with tough interlayers was done to prevent core cracking and to improve the damage tolerance of the sandwich. Simply supported sandwich beam specimens were subjected to low-velocity drop-weight impacts as well as high-velocity ballistic impacts. The performance of the interleaved-core sandwich panels was characterized by localized skin damage and minor cracking of the core. Residual compressive strength (RCS) of the skin, which was derived from flexural tests, shows the expected trend of decreasing with increasing damage size, impact energy, and velocity. In the case of skin damage, RCS values of around 50 percent of the virgin interleaved reference were obtained at the upper end of the impact energy range. Based on the similarity between low-velocity and ballistic-impact effects, it was concluded that impact energy is the main variable controlling damage and residual strength, whereas velocity plays a minor role.

  19. New discoveries linking transcription to DNA repair and damage tolerance pathways.

    PubMed

    Cohen, Susan E; Walker, Graham C

    2011-01-01

    In Escherichia coli, the transcription elongation factor NusA is associated with all elongating RNA polymerases where it functions in transcription termination and antitermination. Here, we review our recent results implicating NusA in the recruitment of DNA repair and damage tolerance mechanisms to sites of stalled transcription complexes.

  20. A preliminary damage tolerance methodology for composite structures

    NASA Technical Reports Server (NTRS)

    Wilkins, D. J.

    1983-01-01

    The certification experience for the primary, safety-of-flight composite structure applications on the F-16 is discussed. The rationale for the selection of delamination as the major issue for damage tolerance is discussed, as well as the modeling approach selected. The development of the necessary coupon-level data base is briefly summarized. The major emphasis is on the description of a full-scale fatigue test where delamination growth was obtained to demonstrate the validity of the selected approach. A summary is used to review the generic features of the methodology.

  1. Ultraviolet-B-induced DNA damage and ultraviolet-B tolerance mechanisms in species with different functional groups coexisting in subalpine moorlands.

    PubMed

    Wang, Qing-Wei; Kamiyama, Chiho; Hidema, Jun; Hikosaka, Kouki

    2016-08-01

    High doses of ultraviolet-B (UV-B; 280-315 nm) radiation can have detrimental effects on plants, and especially damage their DNA. Plants have DNA repair and protection mechanisms to prevent UV-B damage. However, it remains unclear how DNA damage and tolerance mechanisms vary among field species. We studied DNA damage and tolerance mechanisms in 26 species of different functional groups coexisting in two moorlands at two elevations. We collected current-year leaves in July and August, and determined the accumulation of cyclobutane pyrimidine dimers (CPDs) as a measure of UV-B damage, and photorepair activity (PRA) and the concentrations of UV-absorbing compounds (UACs) and carotenoids (CARs) as UV-B tolerance mechanisms. DNA damage was greater in dicot than in monocot species, and higher in herbaceous than in woody species. Evergreen species accumulated more CPDs than deciduous species. PRA was higher in Poaceae than in species of other families. UACs were significantly higher in woody than in herbaceous species. The CPD level was not explained by these mechanisms across species, but was significantly related to PRA and UACs when we excluded species with low CPD, PRA and UACs, implying the presence of another effective tolerance mechanism. UACs were correlated negatively with PRA and positively with CARs. Our results reveal that UV-induced DNA damage varies significantly among native species, and that this variation is related to functional groups. DNA repair, rather than UV-B protection, dominates UV-B tolerance in the field. Our findings also suggest that UV-B tolerance mechanisms vary among species under evolutionary trade-off and synergism.

  2. Damage tolerance of candidate thermoset composites for use on single stage to orbit vehicles

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Lance, D.; Hodge, A.

    1994-01-01

    Four fiber/resin systems were compared for resistance to damage and damage tolerance. One toughened epoxy and three toughened bismaleimide (BMI) resins were used, all with IM7 carbon fiber reinforcement. A statistical design of experiments technique was used to evaluate the effects of impact energy, specimen thickness, and impactor diameter on the damage area, as computed by C-scans, and residual compression-after-impact (CAI) strength. Results showed that two of the BMI systems sustained relatively large damage zones yet had an excellent retention of CAI strength.

  3. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews compiled after each of the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias) and precision (mean absolute error), as well as the sharpness of the predictions and their reliability, which is represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The comparison of the uni-variate stage-damage function and the multi-variate model approach emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of precision (mbe), accuracy (mae) and reliability (HR) is clearly improved

  4. Damage-Tolerance Characteristics of Composite Fuselage Sandwich Structures with Thick Facesheets

    NASA Technical Reports Server (NTRS)

    McGowan, David M.; Ambur, Damodar R.

    1997-01-01

    Damage tolerance characteristics and results from experimental and analytical studies of a composite fuselage keel sandwich structure subjected to low-speed impact damage and discrete-source damage are presented. The test specimens are constructed from graphite-epoxy skins bonded to a honeycomb core, and they are representative of a highly loaded fuselage keel structure. Results of compression-after-impact (CAI) and notch-length sensitivity studies of 5-in.-wide by 10-in.-long specimens are presented. A correlation between low-speed-impact dent depth, the associated damage area, and residual strength for different impact-energy levels is described; and a comparison of the strength for undamaged and damaged specimens with different notch-length-to-specimen-width ratios is presented. Surface strains in the facesheets of the undamaged specimens, as well as surface strains that illustrate the load redistribution around the notch sites in the notched specimens, are presented and compared with results from finite element analyses. Reductions in strength of as much as 53.1 percent for the impacted specimens and 64.7 percent for the notched specimens are observed.

  5. Cell cycle stage-specific roles of Rad18 in tolerance and repair of oxidative DNA damage

    PubMed Central

    Yang, Yang; Durando, Michael; Smith-Roe, Stephanie L.; Sproul, Chris; Greenwalt, Alicia M.; Kaufmann, William; Oh, Sehyun; Hendrickson, Eric A.; Vaziri, Cyrus

    2013-01-01

    The E3 ubiquitin ligase Rad18 mediates tolerance of replication fork-stalling bulky DNA lesions, but whether Rad18 mediates tolerance of bulky DNA lesions acquired outside S-phase is unclear. Using synchronized cultures of primary human cells, we defined cell cycle stage-specific contributions of Rad18 to genome maintenance in response to ultraviolet C (UVC) and H2O2-induced DNA damage. UVC and H2O2 treatments both induced Rad18-mediated proliferating cell nuclear antigen mono-ubiquitination during G0, G1 and S-phase. Rad18 was important for repressing H2O2-induced (but not ultraviolet-induced) double strand break (DSB) accumulation and ATM S1981 phosphorylation only during G1, indicating a specific role for Rad18 in processing of oxidative DNA lesions outside S-phase. However, H2O2-induced DSB formation in Rad18-depleted G1 cells was not associated with increased genotoxin sensitivity, indicating that back-up DSB repair mechanisms compensate for Rad18 deficiency. Indeed, in DNA LigIV-deficient cells Rad18-depletion conferred H2O2-sensitivity, demonstrating functional redundancy between Rad18 and non-homologous end joining for tolerance of oxidative DNA damage acquired during G1. In contrast with G1-synchronized cultures, S-phase cells were H2O2-sensitive following Rad18-depletion. We conclude that although Rad18 pathway activation by oxidative lesions is not restricted to S-phase, Rad18-mediated trans-lesion synthesis by Polη is dispensable for damage-tolerance in G1 (because of back-up non-homologous end joining-mediated DSB repair), yet Rad18 is necessary for damage tolerance during S-phase. PMID:23295675

  6. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    O'Brien, T. Kevin

    1990-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation, as well as adequately fail-safe or damage tolerant for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: either analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low velocity impact.

  7. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    Obrien, T. Kevin

    1988-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation, as well as adequately fail-safe or damage tolerant for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: either analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low velocity impact.

  8. DNA lesion identity drives choice of damage tolerance pathway in murine cell chromosomes.

    PubMed

    Cohen, Isadora S; Bar, Carmit; Paz-Elizur, Tamar; Ainbinder, Elena; Leopold, Karoline; de Wind, Niels; Geacintov, Nicholas; Livneh, Zvi

    2015-02-18

    DNA-damage tolerance (DDT) via translesion DNA synthesis (TLS) or homology-dependent repair (HDR) functions to bypass DNA lesions encountered during replication, and is critical for maintaining genome stability. Here, we present piggyBlock, a new chromosomal assay that, using piggyBac transposition of DNA containing a known lesion, measures the division of labor between the two DDT pathways. We show that in the absence of DNA damage response, tolerance of the most common sunlight-induced DNA lesion, TT-CPD, is achieved by TLS in mouse embryo fibroblasts. Meanwhile, BP-G, a major smoke-induced DNA lesion, is bypassed primarily by HDR, providing the first evidence for this mechanism being the main tolerance pathway for a biologically important lesion in a mammalian genome. We also show that, far from being a last-resort strategy as it is sometimes portrayed, TLS operates alongside nucleotide excision repair, handling 40% of TT-CPDs in repair-proficient cells. Finally, DDT acts in mouse embryonic stem cells, exhibiting the same pattern—mutagenic TLS included—despite the risk of propagating mutations along all cell lineages. The new method highlights the importance of HDR, and provides an effective tool for studying DDT in mammalian cells.

  9. Damage tolerance of nuclear graphite at elevated temperatures

    DOE PAGES

    Liu, Dong; Gludovatz, Bernd; Barnard, Harold S.; ...

    2017-06-30

    Nuclear-grade graphite is a critically important high-temperature structural material for current and potentially next generation of fission reactors worldwide. It is imperative to understand its damage-tolerant behaviour and to discern the mechanisms of damage evolution under in-service conditions. Here we perform in situ mechanical testing with synchrotron X-ray computed micro-tomography at temperatures between ambient and 1,000 °C on a nuclear-grade Gilsocarbon graphite. We find that both the strength and fracture toughness of this graphite are improved at elevated temperature. Whereas this behaviour is consistent with observations of the closure of microcracks formed parallel to the covalent-sp2-bonded graphene layers at higher temperatures, which accommodate the more than tenfold larger thermal expansion perpendicular to these layers, we attribute the elevation in strength and toughness primarily to changes in the residual stress state at 800–1,000 °C, specifically to the reduction in significant levels of residual tensile stresses in the graphite that are ‘frozen-in’ following processing.

  10. Damage tolerance of nuclear graphite at elevated temperatures

    PubMed Central

    Liu, Dong; Gludovatz, Bernd; Barnard, Harold S.; Kuball, Martin; Ritchie, Robert O.

    2017-01-01

    Nuclear-grade graphite is a critically important high-temperature structural material for current and potentially next generation of fission reactors worldwide. It is imperative to understand its damage-tolerant behaviour and to discern the mechanisms of damage evolution under in-service conditions. Here we perform in situ mechanical testing with synchrotron X-ray computed micro-tomography at temperatures between ambient and 1,000 °C on a nuclear-grade Gilsocarbon graphite. We find that both the strength and fracture toughness of this graphite are improved at elevated temperature. Whereas this behaviour is consistent with observations of the closure of microcracks formed parallel to the covalent-sp2-bonded graphene layers at higher temperatures, which accommodate the more than tenfold larger thermal expansion perpendicular to these layers, we attribute the elevation in strength and toughness primarily to changes in the residual stress state at 800–1,000 °C, specifically to the reduction in significant levels of residual tensile stresses in the graphite that are ‘frozen-in’ following processing. PMID:28665405

  11. Damage tolerance of bonded composite aircraft repairs for metallic structures

    NASA Astrophysics Data System (ADS)

    Clark, Randal John

    This thesis describes the development and validation of methods for damage tolerance substantiation of bonded composite repairs applied to cracked plates. This technology is used to repair metal aircraft structures, offering improvements in fatigue life, cost, manufacturability, and inspectability when compared to riveted repairs. The work focuses on the effects of plate thickness and bending on repair life, and covers fundamental aspects of fracture and fatigue of cracked plates and bonded joints. This project falls under the UBC Bonded Composite Repair Program, which has the goal of certification and widespread use of bonded repairs in civilian air transportation. This thesis analyses the plate thickness and transverse stress effects on fracture of repaired plates and the related problem of induced geometrically nonlinear bending in unbalanced (single-sided) repairs. The author begins by developing a classification scheme for assigning repair damage tolerance substantiation requirements based upon stress-based adhesive fracture/fatigue criteria and the residual strength of the original structure. The governing equations for bending of cracked plates are then reformulated and line-spring models are developed for linear and nonlinear coupled bending and extension of reinforced cracks. The line-spring models were used to correct the Wang and Rose energy method for the determination of the long-crack limit stress intensity, and to develop a new interpolation model for repaired cracks of arbitrary length. The analysis was validated using finite element models and data from mechanical tests performed on hybrid bonded joints and repair specimens that are representative of an in-service repair. This work will allow designers to evaluate the damage tolerance of the repaired plate, the adhesive, and the composite patch, which is an airworthiness requirement under FAR (Federal Aviation Regulations) 25.571. The thesis concludes by assessing the remaining barriers to

  12. Assessment of the Damage Tolerance of Postbuckled Hat-Stiffened Panels Using Single-Stringer Specimens

    NASA Technical Reports Server (NTRS)

    Bisagni, Chiara; Vescovini, Riccardo; Davila, Carlos G.

    2010-01-01

    A procedure is proposed for the assessment of the damage tolerance and collapse of stiffened composite panels using a single-stringer compression specimen. The dimensions of the specimen are determined such that the specimen's nonlinear response and collapse are representative of an equivalent multi-stringer panel in compression. Experimental tests are conducted on specimens with and without an embedded delamination. A shell-based finite element model with intralaminar and interlaminar damage capabilities is developed to predict the postbuckling response as well as the damage evolution from initiation to collapse.

  13. Re-Tooling the Agency's Engineering Predictive Practices for Durability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Piascik, Robert S.; Knight, Norman F., Jr.

    2017-01-01

    Over the past decade, the Agency has placed less emphasis on testing and has increasingly relied on computational methods to assess durability and damage tolerance (D&DT) behavior when evaluating design margins for fracture-critical components. With increased emphasis on computational D&DT methods as the standard practice, it is paramount that capabilities of these methods are understood, the methods are used within their technical limits, and validation by well-designed tests confirms understanding. The D&DT performance of a component is highly dependent on parameters in the neighborhood of the damage. This report discusses D&DT method vulnerabilities.

  14. DNA lesion identity drives choice of damage tolerance pathway in murine cell chromosomes

    PubMed Central

    Cohen, Isadora S.; Bar, Carmit; Paz-Elizur, Tamar; Ainbinder, Elena; Leopold, Karoline; de Wind, Niels; Geacintov, Nicholas; Livneh, Zvi

    2015-01-01

    DNA-damage tolerance (DDT) via translesion DNA synthesis (TLS) or homology-dependent repair (HDR) functions to bypass DNA lesions encountered during replication, and is critical for maintaining genome stability. Here, we present piggyBlock, a new chromosomal assay that, using piggyBac transposition of DNA containing a known lesion, measures the division of labor between the two DDT pathways. We show that in the absence of DNA damage response, tolerance of the most common sunlight-induced DNA lesion, TT-CPD, is achieved by TLS in mouse embryo fibroblasts. Meanwhile, BP-G, a major smoke-induced DNA lesion, is bypassed primarily by HDR, providing the first evidence for this mechanism being the main tolerance pathway for a biologically important lesion in a mammalian genome. We also show that, far from being a last-resort strategy as it is sometimes portrayed, TLS operates alongside nucleotide excision repair, handling 40% of TT-CPDs in repair-proficient cells. Finally, DDT acts in mouse embryonic stem cells, exhibiting the same pattern—mutagenic TLS included—despite the risk of propagating mutations along all cell lineages. The new method highlights the importance of HDR, and provides an effective tool for studying DDT in mammalian cells. PMID:25589543

  15. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small-crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue.

  16. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
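
    The record above describes propagating probabilistic strength, loads, and tolerances to a component failure probability. The fragment below is only a generic Monte Carlo illustration of that idea, pairing a two-parameter Weibull strength model with an uncertain applied stress; it does not reproduce the CARES/Life or ANSYS PDS algorithms, and every parameter value is a placeholder.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Two-parameter Weibull strength for a ceramic element vs. an uncertain
    # applied stress; parameter values are arbitrary placeholders.
    m, sigma0 = 10.0, 350.0          # Weibull modulus, characteristic strength [MPa]
    n_samples = 200_000

    strength = sigma0 * rng.weibull(m, n_samples)
    stress = rng.normal(200.0, 25.0, n_samples)   # uncertain loading -> uncertain stress

    failed = stress > strength
    print(f"estimated probability of failure: {failed.mean():.4f}")

    # Because the inputs are random, the failure probability is itself a response
    # variable; bootstrapping the Monte Carlo sample gives a rough scatter band.
    boot = [rng.choice(failed, n_samples).mean() for _ in range(200)]
    print("approx. 90% band:", np.quantile(boot, [0.05, 0.95]))
    ```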

  17. Global/local methods for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.

    1993-01-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  18. Global/local methods for probabilistic structural analysis

    NASA Astrophysics Data System (ADS)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.
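
    A toy numerical sketch of the global/local idea in the two records above, under strong simplifying assumptions that are not part of the NESSUS implementation: the "global" limit state is taken as linear in two standard normal variables, so its MPP is available analytically, and a single "local" evaluation at that MPP shifts the reliability index and hence the cdf value.

    ```python
    import math
    import numpy as np

    # Cheap "global" limit state: linear in two standard normal variables.
    # Costly "local" model: the same response plus a small nonlinear correction
    # (stand-ins for a coarse and a refined structural model).
    a = np.array([0.8, 0.6])          # unit normal of the global limit state
    beta_global = 3.0                 # reliability index implied by the global model

    def g_global(u):
        return beta_global - a @ u

    def g_local(u):                   # pretend this costs a full refined analysis
        return g_global(u) - 0.05 * u[0] * u[1]

    def phi(z):                       # standard normal CDF
        return 0.5 * math.erfc(-z / math.sqrt(2.0))

    # Global step: for a linear limit state the MPP lies at u* = beta * a.
    u_mpp = beta_global * a
    pf_global = phi(-beta_global)

    # Local step: one refined evaluation at the MPP adjusts the reliability index
    # by g_local(u*) / |grad g| (here |a| = 1), updating the cdf value cheaply.
    beta_local = beta_global + g_local(u_mpp) / np.linalg.norm(a)
    pf_local = phi(-beta_local)

    print(f"global-model Pf                        : {pf_global:.3e}")
    print(f"MPP-corrected Pf (one local evaluation): {pf_local:.3e}")
    ```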

  19. Assessing inspection sensitivity as it relates to damage tolerance in composite rotor hubs

    NASA Astrophysics Data System (ADS)

    Roach, Dennis P.; Rackow, Kirk

    2001-08-01

    Increasing niche applications, growing international markets, and the emergence of advanced rotorcraft technology are expected to greatly increase the population of helicopters over the next decade. In terms of fuselage fatigue, helicopters show similar trends to fixed-wing aircraft. The highly unsteady loads experienced by rotating wings not only directly affect components in the dynamic systems but are also transferred to the fixed airframe structure. Expanded use of rotorcraft has focused attention on the use of new materials and the optimization of maintenance practices. The FAA's Airworthiness Assurance Center (AANC) at Sandia National Labs has joined with Bell Helicopter and other agencies in the rotorcraft industry to evaluate nondestructive inspection (NDI) capabilities in light of the damage tolerance of assorted rotorcraft structural components. Currently, the program's emphasis is on composite rotor hubs. The rotorcraft industry is constantly evaluating new types of lightweight composite materials that not only enhance the safety and reliability of rotor components but also improve performance and extend operating life. Composite rotor hubs have led to the use of bearingless rotor systems that are less complex and require less maintenance than their predecessors. The test facility described in this paper allows the structural stability and damage tolerance of composite hubs to be evaluated using realistic flight load spectrums of centrifugal force and bending loads. NDI was integrated into the life-cycle fatigue tests in order to evaluate flaw detection sensitivity simultaneously with residual strength and general rotor hub performance. This paper will describe the evolving use of damage tolerance analysis (DTA) to direct and improve rotorcraft maintenance, along with the related use of nondestructive inspections to manage helicopter safety. Overall, the data from this project will provide information to improve the producibility, inspectability...

  20. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.

  1. Aerothermal performance and damage tolerance of a Rene 41 metallic standoff thermal protection system at Mach 6.7

    NASA Technical Reports Server (NTRS)

    Avery, D. E.

    1984-01-01

    A flight-weight, metallic thermal protection system (TPS) model applicable to Earth-entry and hypersonic-cruise vehicles was subjected to multiple cycles of both radiant and aerothermal heating in order to evaluate its aerothermal performance, structural integrity, and damage tolerance. The TPS was designed for a maximum operating temperature of 2060 R and featured a shingled, corrugation-stiffened corrugated-skin heat shield of Rene 41, a nickel-base alloy. The model was subjected to 10 radiant-heating tests and 3 radiant-preheat/aerothermal tests. Under radiant-heating conditions with a maximum surface temperature of 2050 R, the TPS performed as designed and limited the primary structure away from the support ribs to temperatures below 780 R. During the first attempt at aerothermal exposure, a failure in the panel-holder test fixture severely damaged the model. However, two radiant-preheat/aerothermal tests were made with the damaged model to test its damage tolerance. During these tests, the damaged area did not enlarge; however, the rapidly increasing structural temperatures measured during these tests indicate that, had the damaged area been exposed to aerodynamic heating for the entire trajectory, an aluminum burn-through would have occurred.

  2. 77 FR 50576 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures; OMB Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Composite Rotorcraft Structures; OMB Approval of Information Collection AGENCY: Federal Aviation... requirement contained in the FAA's final rule, ``Damage Tolerance and Fatigue Evaluation of Composite... and Fatigue Evaluation of Composite Rotorcraft Structures,'' published in the Federal Register (76 FR...

  3. WIPCast: Probabilistic Forecasting for Aviation Decision Aid Applications

    DTIC Science & Technology

    2011-06-01

    traders, or families planning an outing – manage weather-related risk. By quantifying risk, probabilistic forecasting enables optimization of actions via...confidence interval to the user's risk tolerance helps drive highly effective and innovative decision support mechanisms for visually quantifying risk for

  4. A probabilistic damage model of stress-induced permeability anisotropy during cataclastic flow

    NASA Astrophysics Data System (ADS)

    Zhu, Wenlu; Montési, Laurent G. J.; Wong, Teng-Fong

    2007-10-01

    A fundamental understanding of the effect of stress on permeability evolution is important for many fault mechanics and reservoir engineering problems. Recent laboratory measurements demonstrate that in the cataclastic flow regime, the stress-induced anisotropic reduction of permeability in porous rocks can be separated into 3 different stages. In the elastic regime (stage I), permeability and porosity reduction are solely controlled by the effective mean stress, with negligible permeability anisotropy. Stage II starts at the onset of shear-enhanced compaction, when a critical yield stress is attained. In stage II, the deviatoric stress exerts primary control over permeability and porosity evolution. The increase in deviatoric stress results in drastic permeability and porosity reduction and considerable permeability anisotropy. The transition from stage II to stage III takes place progressively during the development of pervasive cataclastic flow. In stage III, permeability and porosity reduction becomes gradual again, and permeability anisotropy diminishes. Microstructural observations on deformed samples using laser confocal microscopy reveal that stress-induced microcracking and pore collapse are the primary forms of damage during cataclastic flow. A probabilistic damage model is formulated to characterize the effects of stress on permeability and its anisotropy. In our model, the effects of both effective mean stress and differential stress on permeability evolution are calculated. By introducing stress sensitivity coefficients, we propose a first-order description of the dependence of permeability evolution on different loading paths. Built upon the micromechanisms of deformation in porous rocks, this unified model provides new insight into the coupling of stress and permeability.
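
    The abstract above does not give the functional form of the model, so the fragment below is only a hypothetical first-order illustration of stress sensitivity coefficients: permeability decays exponentially with effective mean stress, with an additional deviatoric-stress term that switches on beyond an assumed yield stress (the onset of shear-enhanced compaction). Both the coefficients and the functional form are assumptions made for illustration.

    ```python
    import numpy as np

    def permeability(k0_m2, p_eff_mpa, q_mpa, p_yield_mpa, gamma_p, gamma_q):
        """Hypothetical first-order permeability evolution: exponential decay
        with effective mean stress, plus a deviatoric term active only beyond
        yield. gamma_p and gamma_q act as stress sensitivity coefficients."""
        deviatoric = gamma_q * q_mpa * (p_eff_mpa > p_yield_mpa)
        return k0_m2 * np.exp(-gamma_p * p_eff_mpa - deviatoric)

    # Example loading path: raise effective mean stress, then add deviatoric
    # stress once the assumed yield stress (120 MPa) is exceeded.
    p = np.linspace(0.0, 200.0, 5)
    q = np.where(p > 120.0, 0.5 * (p - 120.0), 0.0)
    k = permeability(1e-15, p, q, p_yield_mpa=120.0, gamma_p=0.004, gamma_q=0.02)

    for p_i, q_i, k_i in zip(p, q, k):
        print(f"p_eff = {p_i:6.1f} MPa, q = {q_i:5.1f} MPa -> k = {k_i:.3e} m^2")
    ```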

  5. Probabilistic evaluation of damage potential in earthquake-induced liquefaction in a 3-D soil deposit

    NASA Astrophysics Data System (ADS)

    Halder, A.; Miller, F. J.

    1982-03-01

    A probabilistic model to evaluate the risk of liquefaction at a site and to limit or eliminate damage during earthquake-induced liquefaction is proposed. The model is extended to consider three-dimensional nonhomogeneous soil properties. The parameters relevant to the liquefaction phenomenon are identified, including: (1) soil parameters; (2) parameters required to consider laboratory test and sampling effects; and (3) loading parameters. The fundamentals of risk-based design concepts pertinent to liquefaction are reviewed. A detailed statistical evaluation of the soil parameters in the proposed liquefaction model is provided, and the uncertainty associated with the estimation of in situ relative density is evaluated for both direct and indirect methods. It is found that, in evaluating liquefaction potential, the uncertainties in the load parameters could be higher than those in the resistance parameters.
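
    As a rough illustration of this kind of load-versus-resistance risk evaluation (not the authors' model), the following Python sketch compares an uncertain cyclic stress ratio (demand) with an uncertain cyclic resistance ratio (capacity) by Monte Carlo sampling; the lognormal parameters are invented for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000

        # Hypothetical lognormal demand (cyclic stress ratio, CSR) and capacity
        # (cyclic resistance ratio, CRR); medians and log-standard deviations are
        # illustrative only, not values from the paper.
        csr = rng.lognormal(mean=np.log(0.18), sigma=0.35, size=n)   # load
        crr = rng.lognormal(mean=np.log(0.25), sigma=0.45, size=n)   # resistance

        # Liquefaction is assumed to be triggered whenever demand exceeds capacity.
        p_liq = np.mean(csr > crr)
        print(f"Estimated probability of liquefaction: {p_liq:.3f}")

        # Crude sensitivity check: larger load uncertainty raises the risk estimate.
        csr_wide = rng.lognormal(mean=np.log(0.18), sigma=0.55, size=n)
        print(f"With larger load uncertainty:          {np.mean(csr_wide > crr):.3f}")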

  6. Damage tolerance protein Mus81 associates with the FHA1 domain of checkpoint kinase Cds1.

    PubMed

    Boddy, M N; Lopez-Girona, A; Shanahan, P; Interthal, H; Heyer, W D; Russell, P

    2000-12-01

    Cds1, a serine/threonine kinase, enforces the S-M checkpoint in the fission yeast Schizosaccharomyces pombe. Cds1 is required for survival of replicational stress caused by agents that stall replication forks, but how Cds1 performs these functions is largely unknown. Here we report that the forkhead-associated-1 (FHA1) protein-docking domain of Cds1 interacts with Mus81, an evolutionarily conserved damage tolerance protein. Mus81 has an endonuclease homology domain found in the XPF nucleotide excision repair protein. Inactivation of mus81 reveals a unique spectrum of phenotypes. Mus81 enables survival of deoxynucleotide triphosphate starvation, UV radiation, and DNA polymerase impairment. Mus81 is essential in the absence of Bloom's syndrome Rqh1 helicase and is required for productive meiosis. Genetic epistasis studies suggest that Mus81 works with recombination enzymes to properly replicate damaged DNA. Inactivation of Mus81 triggers a checkpoint-dependent delay of mitosis. We propose that Mus81 is involved in the recruitment of Cds1 to aberrant DNA structures where Cds1 modulates the activity of damage tolerance enzymes.

  7. Overexpression of the DNA mismatch repair factor, PMS2, confers hypermutability and DNA damage tolerance.

    PubMed

    Gibson, Shannon L; Narayanan, Latha; Hegan, Denise Campisi; Buermeyer, Andrew B; Liskay, R Michael; Glazer, Peter M

    2006-12-08

    Inherited defects in genes associated with DNA mismatch repair (MMR) have been linked to familial colorectal cancer. Cells deficient in MMR are genetically unstable and demonstrate a tolerance phenotype in response to certain classes of DNA damage. Some sporadic human cancers also show abnormalities in MMR gene function, typically due to diminished expression of one of the MutL homologs, MLH1. Here, we report that overexpression of the MutL homolog, human PMS2, can also cause a disruption of the MMR pathway in mammalian cells, resulting in hypermutability and DNA damage tolerance. A mouse fibroblast cell line carrying a recoverable lambda phage shuttle vector for mutation detection was transfected with either a vector designed to express hPMS2 or with an empty vector control. Cells overexpressing hPMS2 were found to have elevated spontaneous mutation frequencies at the cII reporter gene locus. They also showed an increase in the level of mutations induced by the alkylating agent, methylnitrosourea (MNU). Clonogenic survival assays demonstrated increased survival of the PMS2-overexpressing cells following exposure to MNU, consistent with the induction of a damage tolerance phenotype. Similar results were seen in cells expressing a mutant PMS2 gene, containing a premature stop codon at position 134 and representing a variant found in an individual with familial colon cancer. These results show that dysregulation of PMS2 gene expression can disrupt MMR function in mammalian cells and establish an additional carcinogenic mechanism by which cells can develop genetic instability and acquire resistance to cytotoxic cancer therapies.

  8. Confronting uncertainty in flood damage predictions

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2015-04-01

    Reliable flood damage models are a prerequisite for the practical usefulness of flood loss estimates. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the 5%-95% quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
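
    The three validation metrics named above (mean bias, mean absolute error, and the coverage of the 5%-95% predictive interval) can be illustrated with a minimal Python sketch on synthetic data; the toy ensemble below merely stands in for predictions from bagged trees or a Bayesian network and uses no values from the study.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic stand-in for relative building damage (0..1) and an ensemble of
        # probabilistic predictions (e.g., from bagged trees or a Bayesian network).
        n_obs, n_members = 500, 100
        observed = rng.beta(2, 8, size=n_obs)
        ensemble = observed[None, :] + rng.normal(0, 0.08, size=(n_members, n_obs))
        ensemble = np.clip(ensemble, 0.0, 1.0)

        point = ensemble.mean(axis=0)                 # point prediction
        lo, hi = np.quantile(ensemble, [0.05, 0.95], axis=0)

        mean_bias = np.mean(point - observed)         # systematic deviation
        mae = np.mean(np.abs(point - observed))       # precision
        coverage = np.mean((observed >= lo) & (observed <= hi))  # reliability

        print(f"mean bias: {mean_bias:+.4f}")
        print(f"mean absolute error: {mae:.4f}")
        print(f"coverage of 5%-95% interval: {coverage:.2%}")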

  9. Durability and Damage Tolerance of High Temperature Polymeric Composites

    NASA Technical Reports Server (NTRS)

    Case, Scott W.; Reifsnider, Kenneth L.

    1996-01-01

    Modern durability and damage tolerance predictions for composite material systems rely on accurate estimates of the local stress and material states for each of the constituents, as well as the manner in which the constituents interact. In this work, a number of approaches to estimating the stress states and interactions are developed. First, an elasticity solution is presented for the problem of a penny-shaped crack in an N-phase composite material system opened by a prescribed normal pressure. The stress state around such a crack is then used to estimate the stress concentrations due to adjacent fiber fractures in composite materials. The resulting stress concentrations are then used to estimate the tensile strength of the composite. The predicted results are compared with experimental values. In addition, a cumulative damage model for fatigue is presented. Modifications to the model are made to include the effects of variable amplitude loading. These modifications are based upon the use of remaining strength as a damage metric and the definition of an equivalent generalized time. The model is initially validated using results from the literature. Also, experimental data from APC-2 laminates and IM7/K3B laminates are used in the model. The use of such data for notched laminates requires the use of an effective hole size, which is calculated based upon strain distribution measurements. Measured remaining strengths after fatigue loading are compared with the predicted values for specimens fatigued at room temperature and 350 F (177 C).

  10. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  11. Fault-tolerant quantum computation with nondeterministic entangling gates

    NASA Astrophysics Data System (ADS)

    Auger, James M.; Anwar, Hussain; Gimeno-Segovia, Mercedes; Stace, Thomas M.; Browne, Dan E.

    2018-03-01

    Performing entangling gates between physical qubits is necessary for building a large-scale universal quantum computer, but in some physical implementations—for example, those that are based on linear optics or networks of ion traps—entangling gates can only be implemented probabilistically. In this work, we study the fault-tolerant performance of a topological cluster state scheme with local nondeterministic entanglement generation, where failed entangling gates (which correspond to bonds on the lattice representation of the cluster state) lead to a defective three-dimensional lattice with missing bonds. We present two approaches for dealing with missing bonds; the first is a nonadaptive scheme that requires no additional quantum processing, and the second is an adaptive scheme in which qubits can be measured in an alternative basis to effectively remove them from the lattice, hence eliminating their damaging effect and leading to better threshold performance. We find that a fault-tolerance threshold can still be observed with a bond-loss rate of 6.5% for the nonadaptive scheme, and a bond-loss rate as high as 14.5% for the adaptive scheme.

  12. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials

    NASA Astrophysics Data System (ADS)

    Montemayor, L. C.; Wong, W. H.; Zhang, Y.-W.; Greer, J. R.

    2016-02-01

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws even when made entirely of intrinsically brittle materials.

  13. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.
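
    A minimal sketch of how such scatter can be propagated, assuming a simple cantilever-beam scaling for the first bending frequency rather than the turbopump-blade models used in the study, is given below; the nominal values and coefficients of variation are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 100_000

        # Nominal values (illustrative) and coefficients of variation.
        t, L, E, rho = 3e-3, 0.08, 200e9, 8200.0    # thickness, length, modulus, density
        cov_geom, cov_mat = 0.03, 0.03

        def first_bending_freq(t, L, E, rho):
            # Cantilever-beam-like scaling for the first bending frequency.
            return 0.1615 * (t / L**2) * np.sqrt(E / rho)

        # Vary only geometry, then only material properties.
        f_geom = first_bending_freq(t * rng.normal(1, cov_geom, n),
                                    L * rng.normal(1, cov_geom, n), E, rho)
        f_mat = first_bending_freq(t, L,
                                   E * rng.normal(1, cov_mat, n),
                                   rho * rng.normal(1, cov_mat, n))

        print(f"CoV of frequency, geometry scatter only: {f_geom.std() / f_geom.mean():.3%}")
        print(f"CoV of frequency, material scatter only: {f_mat.std() / f_mat.mean():.3%}")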

  14. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into the monetary loss.

  15. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
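
    A minimal numerical version of the risk integral described above, assuming an illustrative power-law hazard curve and a lognormal collapse fragility (neither taken from the paper, and without the aftershock adjustment), can be written as follows.

        import numpy as np
        from scipy.stats import norm

        # Ground-motion intensity measure (IM) grid, e.g., spectral acceleration in g.
        im = np.linspace(0.01, 5.0, 2000)
        dim = im[1] - im[0]

        # Illustrative hazard curve: mean annual rate of exceeding each IM level.
        k0, k = 4e-4, 2.5
        haz = k0 * im**(-k)

        # Illustrative lognormal collapse fragility (median theta, log-std beta).
        theta, beta = 1.2, 0.45
        p_collapse_given_im = norm.cdf(np.log(im / theta) / beta)

        # Risk integral: lambda_collapse = integral of P(C | im) * |d(haz)/d(im)| d(im)
        dhaz_dim = np.gradient(haz, im)
        lam_collapse = -np.sum(p_collapse_given_im * dhaz_dim) * dim

        print(f"Mean annual rate of collapse: {lam_collapse:.2e}")
        print(f"50-year collapse probability: {1 - np.exp(-50 * lam_collapse):.2%}")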

  16. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic finite element analysis code.

  17. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  18. The stomatopod dactyl club: a formidable damage-tolerant biological hammer.

    PubMed

    Weaver, James C; Milliron, Garrett W; Miserez, Ali; Evans-Lutterodt, Kenneth; Herrera, Steven; Gallana, Isaias; Mershon, William J; Swanson, Brook; Zavattieri, Pablo; DiMasi, Elaine; Kisailus, David

    2012-06-08

    Nature has evolved efficient strategies to synthesize complex mineralized structures that exhibit exceptional damage tolerance. One such example is found in the hypermineralized hammer-like dactyl clubs of the stomatopods, a group of highly aggressive marine crustaceans. The dactyl clubs from one species, Odontodactylus scyllarus, exhibit an impressive set of characteristics adapted for surviving high-velocity impacts on the heavily mineralized prey on which they feed. Consisting of a multiphase composite of oriented crystalline hydroxyapatite and amorphous calcium phosphate and carbonate, in conjunction with a highly expanded helicoidal organization of the fibrillar chitinous organic matrix, these structures display several effective lines of defense against catastrophic failure during repetitive high-energy loading events.

  19. The Stomatopod Dactyl Club: A Formidable Damage-Tolerant Biological Hammer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver J. C.; DiMasi E.; Milliron, G.W.

    2012-06-08

    Nature has evolved efficient strategies to synthesize complex mineralized structures that exhibit exceptional damage tolerance. One such example is found in the hypermineralized hammer-like dactyl clubs of the stomatopods, a group of highly aggressive marine crustaceans. The dactyl clubs from one species, Odontodactylus scyllarus, exhibit an impressive set of characteristics adapted for surviving high-velocity impacts on the heavily mineralized prey on which they feed. Consisting of a multiphase composite of oriented crystalline hydroxyapatite and amorphous calcium phosphate and carbonate, in conjunction with a highly expanded helicoidal organization of the fibrillar chitinous organic matrix, these structures display several effective lines of defense against catastrophic failure during repetitive high-energy loading events.

  20. QTL analysis of frost damage in pea suggests different mechanisms involved in frost tolerance.

    PubMed

    Klein, Anthony; Houtin, Hervé; Rond, Céline; Marget, Pascal; Jacquin, Françoise; Boucherot, Karen; Huart, Myriam; Rivière, Nathalie; Boutet, Gilles; Lejeune-Hénaut, Isabelle; Burstin, Judith

    2014-06-01

    Avoidance mechanisms and intrinsic resistance are complementary strategies to improve winter frost tolerance and yield potential in field pea. The development of the winter pea crop represents a major challenge to expand plant protein production in temperate areas. Breeding winter cultivars requires the combination of freezing tolerance as well as high seed productivity and quality. In this context, we investigated the genetic determinism of winter frost tolerance and assessed its genetic relationship with yield and developmental traits. Using a newly identified source of frost resistance, we developed a population of recombinant inbred lines and evaluated it in six environments in Dijon and Clermont-Ferrand between 2005 and 2010. We developed a genetic map comprising 679 markers distributed over seven linkage groups and covering 947.1 cM. One hundred sixty-one quantitative trait loci (QTL) explaining 9-71 % of the phenotypic variation were detected across the six environments for all traits measured. Two QTL clusters mapped on linkage group III and one cluster on LGVI reveal the genetic links between phenology, morphology, yield-related traits and frost tolerance in winter pea. QTL clusters on LGIII highlighted major developmental gene loci (Hr and Le) and the QTL cluster on LGVI explained up to 71 % of the winter frost damage variation. This suggests that a specific architecture and flowering ideotype define frost tolerance in winter pea. However, two consistent frost tolerance QTL on LGV were independent of phenology and morphology traits, showing that different protective mechanisms are involved in frost tolerance. Finally, these results suggest that frost tolerance can be bred independently of seed productivity and quality.

  1. Structurally Integrated, Damage-Tolerant, Thermal Spray Coatings

    NASA Astrophysics Data System (ADS)

    Vackel, Andrew; Dwivedi, Gopal; Sampath, Sanjay

    2015-07-01

    Thermal spray coatings are used extensively for the protection and life extension of engineering components exposed to harsh wear and/or corrosion during service in the aerospace, energy, and heavy machinery sectors. Cermet coatings applied via high-velocity thermal spray are used in aggressive wear situations almost always coupled with corrosive environments. In several instances (e.g., landing gear), coatings are considered as part of the structure, requiring system-level considerations. Despite their widespread use, the technology has lacked generalized scientific principles for robust coating design, manufacturing, and performance analysis. Advances in process and in situ diagnostics have provided significant insights into the process-structure-property-performance correlations, providing a framework for enhanced design. In this overview, critical aspects of materials, process, parametrics, and performance are discussed through exemplary studies on relevant compositions. The underlying connective theme is understanding and controlling residual stress generation, which not only addresses process dynamics but also provides a linkage between process and properties for both the system (e.g., fatigue) and the surface (wear and corrosion). The anisotropic microstructure also invokes the need for damage-tolerant material design to meet future goals.

  2. Probabilistic fatigue methodology for six nines reliability

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf

    1990-01-01

    Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin which was established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are from the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods of defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
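
    In the spirit of the round robin, a toy reliability calculation can combine scatter in fatigue strength and in spectrum load severity with linear cumulative (Miner) damage and estimate the probability of failure by Monte Carlo; the S-N curve, load blocks, and scatter values below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 1_000_000

        # Illustrative S-N curve: N = A * S**(-m), with lognormal scatter on A.
        A_med, m = 2e14, 4.0
        A = A_med * rng.lognormal(0.0, 0.3, n)

        # Simple two-block load spectrum per flight hour (stress amplitude in MPa,
        # cycles per hour), with lognormal scatter on overall load severity.
        blocks = [(120.0, 20), (180.0, 2)]
        severity = rng.lognormal(0.0, 0.1, n)

        service_hours = 5_000.0
        damage = np.zeros(n)
        for s_amp, cycles_per_hr in blocks:
            s = s_amp * severity
            damage += service_hours * cycles_per_hr / (A * s**(-m))

        p_fail = np.mean(damage >= 1.0)          # Miner's rule failure criterion
        print(f"Probability of failure in {service_hours:.0f} h: {p_fail:.2e}")
        print("Target ('six nines'): 1e-06")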

  3. Reduction of female copulatory damage by resilin represents evidence for tolerance in sexual conflict

    PubMed Central

    Michels, Jan; Gorb, Stanislav N.; Reinhardt, Klaus

    2015-01-01

    Intergenomic evolutionary conflicts increase biological diversity. In sexual conflict, female defence against males is generally assumed to be resistance, which, however, often leads to trait exaggeration but not diversification. Here, we address whether tolerance, a female defence mechanism known from interspecific conflicts, exists in sexual conflict. We examined the traumatic insemination of female bed bugs via cuticle penetration by males, a textbook example of sexual conflict. Confocal laser scanning microscopy revealed large proportions of the soft and elastic protein resilin in the cuticle of the spermalege, the female defence organ. Reduced tissue damage and haemolymph loss were identified as adaptive female benefits from resilin. These did not arise from resistance because microindentation showed that the penetration force necessary to breach the cuticle was significantly lower at the resilin-rich spermalege than at other cuticle sites. Furthermore, a male survival analysis indicated that the spermalege did not impose antagonistic selection on males. Our findings suggest that the specific spermalege material composition evolved to tolerate the traumatic cuticle penetration. They demonstrate the importance of tolerance in sexual conflict and genitalia evolution, extend fundamental coevolution and speciation models and contribute to explaining the evolution of complexity. We propose that tolerance can drive trait diversity. PMID:25673297

  4. Reduction of female copulatory damage by resilin represents evidence for tolerance in sexual conflict.

    PubMed

    Michels, Jan; Gorb, Stanislav N; Reinhardt, Klaus

    2015-03-06

    Intergenomic evolutionary conflicts increase biological diversity. In sexual conflict, female defence against males is generally assumed to be resistance, which, however, often leads to trait exaggeration but not diversification. Here, we address whether tolerance, a female defence mechanism known from interspecific conflicts, exists in sexual conflict. We examined the traumatic insemination of female bed bugs via cuticle penetration by males, a textbook example of sexual conflict. Confocal laser scanning microscopy revealed large proportions of the soft and elastic protein resilin in the cuticle of the spermalege, the female defence organ. Reduced tissue damage and haemolymph loss were identified as adaptive female benefits from resilin. These did not arise from resistance because microindentation showed that the penetration force necessary to breach the cuticle was significantly lower at the resilin-rich spermalege than at other cuticle sites. Furthermore, a male survival analysis indicated that the spermalege did not impose antagonistic selection on males. Our findings suggest that the specific spermalege material composition evolved to tolerate the traumatic cuticle penetration. They demonstrate the importance of tolerance in sexual conflict and genitalia evolution, extend fundamental coevolution and speciation models and contribute to explaining the evolution of complexity. We propose that tolerance can drive trait diversity. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  5. Unprecedented simultaneous enhancement in damage tolerance and fatigue resistance of zirconia/Ta composites

    NASA Astrophysics Data System (ADS)

    Smirnov, A.; Beltrán, J. I.; Rodriguez-Suarez, T.; Pecharromán, C.; Muñoz, M. C.; Moya, J. S.; Bartolomé, J. F.

    2017-03-01

    Dense (>98% of theoretical density) and homogeneous ceramic/metal composites were obtained by spark plasma sintering (SPS) using ZrO2 and lamellar metallic powders of tantalum or niobium (20 vol.%) as starting materials. The present study has demonstrated the unique and unpredicted simultaneous enhancement in toughness and strength with very high flaw tolerance of zirconia/Ta composites. In addition to their excellent static mechanical properties, these composites also have exceptional resistance to fatigue loading. It has been shown that the major contributions to toughening are the resulting crack bridging and plastic deformation of the metallic particles, together with crack deflection and interfacial debonding, which is compatible with the coexistence in the composite of both strong and weak ceramic/metal interfaces, in agreement with predictions of ab-initio calculations. Therefore, these materials are promising candidates for designing damage-tolerant components for the aerospace industry, cutting and drilling tools, and biomedical implants, among many others.

  6. Aldehyde dehydrogenase 2 protects human umbilical vein endothelial cells against oxidative damage and increases endothelial nitric oxide production to reverse nitroglycerin tolerance.

    PubMed

    Hu, X Y; Fang, Q; Ma, D; Jiang, L; Yang, Y; Sun, J; Yang, C; Wang, J S

    2016-06-10

    Medical nitroglycerin (glyceryl trinitrate, GTN) use is limited principally by tolerance typified by a decrease in nitric oxide (NO) produced by biotransformation. Such tolerance may lead to endothelial dysfunction by inducing oxidative stress. In vivo studies have demonstrated that aldehyde dehydrogenase 2 (ALDH2) plays important roles in GTN biotransformation and tolerance. Thus, modification of ALDH2 expression represents a potentially effective strategy to prevent and reverse GTN tolerance and endothelial dysfunction. In this study, a eukaryotic expression vector containing the ALDH2 gene was introduced into human umbilical vein endothelial cells (HUVECs) by liposome-mediated transfection. An indirect immunofluorescence assay showed that ALDH2 expression increased 24 h after transfection. Moreover, real-time polymerase chain reaction and western blotting revealed significantly higher ALDH2 mRNA and protein expression in the gene-transfected group than in the two control groups. GTN tolerance was induced by treating HUVECs with 10 mM GTN for 16 h + 10 min, which significantly decreased NO levels in control cells, but not in those transfected with ALDH2. Overexpression of ALDH2 increased cell survival against GTN-induced cytotoxicity and conferred protection from oxidative damage resulting from nitrate tolerance, accompanied by decreased production of intracellular reactive oxygen species and reduced expression of heme oxygenase 1. Furthermore, ALDH2 overexpression promoted Akt phosphorylation under GTN tolerance conditions. ALDH2 gene transfection can reverse and prevent tolerance to GTN through its bioactivation and protect against oxidative damage, preventing the development of endothelial dysfunction.

  7. Pro-oxidant Induced DNA Damage in Human Lymphoblastoid Cells: Homeostatic Mechanisms of Genotoxic Tolerance

    PubMed Central

    Seager, Anna L.

    2012-01-01

    Oxidative stress contributes to many disease etiologies including ageing, neurodegeneration, and cancer, partly through DNA damage induction (genotoxicity). Understanding the interactions of free radicals with DNA is fundamental to discern mutation risks. In genetic toxicology, regulatory authorities consider that most genotoxins exhibit a linear relationship between dose and mutagenic response. Yet, homeostatic mechanisms, including DNA repair, that allow cells to tolerate low levels of genotoxic exposure exist. Acceptance of thresholds for genotoxicity has widespread consequences in terms of understanding cancer risk and regulating human exposure to chemicals/drugs. Three pro-oxidant chemicals, hydrogen peroxide (H2O2), potassium bromate (KBrO3), and menadione, were examined for low dose-response curves in human lymphoblastoid cells. DNA repair and antioxidant capacity were assessed as possible threshold mechanisms. H2O2 and KBrO3, but not menadione, exhibited thresholded responses, containing a range of nongenotoxic low doses. Levels of the DNA glycosylase 8-oxoguanine glycosylase were unchanged in response to pro-oxidant stress. DNA repair–focused gene expression arrays reported changes in ATM and BRCA1, involved in double-strand break repair, in response to low-dose pro-oxidant exposure; however, these alterations were not substantiated at the protein level. Determination of oxidatively induced DNA damage in H2O2-treated AHH-1 cells reported accumulation of thymine glycol above the genotoxic threshold. Further, the H2O2 dose-response curve was shifted by modulating the antioxidant glutathione. Hence, observed pro-oxidant thresholds were due to protective capacities of base excision repair enzymes and antioxidants against DNA damage, highlighting the importance of homeostatic mechanisms in “genotoxic tolerance.” PMID:22539617

  8. Tolerance to insect defoliation: biocenotic aspects

    Treesearch

    Andrey A. Pleshanov; Victor I. Voronin; Elena S. Khlimankova; Valentina I. Epova

    1991-01-01

    Woody plant resistance to insect damage is of great importance in forest protection, and tree tolerance is an important element of this resistance. The compensating mechanisms responsible for tolerance are nonspecific as a rule and develop after damage has been caused by phytophagous animals or other unfavorable effects. Beyond that, plant tolerance depends on duration...

  9. Probabilistic Prognosis of Non-Planar Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Newman, John A.; Warner, James E.; Leser, William P.; Hochhalter, Jacob D.; Yuan, Fuh-Gwo

    2016-01-01

    Quantifying the uncertainty in model parameters for the purpose of damage prognosis can be accomplished utilizing Bayesian inference and damage diagnosis data from sources such as non-destructive evaluation or structural health monitoring. The number of samples required to solve the Bayesian inverse problem through common sampling techniques (e.g., Markov chain Monte Carlo) renders high-fidelity finite element-based damage growth models unusable due to prohibitive computation times. However, these types of models are often the only option when attempting to model complex damage growth in real-world structures. Here, a recently developed high-fidelity crack growth model is used which, when compared to finite element-based modeling, has demonstrated reductions in computation times of three orders of magnitude through the use of surrogate models and machine learning. The model is flexible in that only the expensive computation of the crack driving forces is replaced by the surrogate models, leaving the remaining parameters accessible for uncertainty quantification. A probabilistic prognosis framework incorporating this model is developed and demonstrated for non-planar crack growth in a modified, edge-notched, aluminum tensile specimen. Predictions of remaining useful life are made over time for five updates of the damage diagnosis data, and prognostic metrics are utilized to evaluate the performance of the prognostic framework. Challenges specific to the probabilistic prognosis of non-planar fatigue crack growth are highlighted and discussed in the context of the experimental results.
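
    The general idea, though not the authors' surrogate-based crack growth model, can be sketched with a simple Paris-law forward model, Metropolis sampling of the growth coefficient from noisy crack-length measurements, and propagation of the posterior to a remaining-useful-life estimate; all values below are invented.

        import numpy as np

        rng = np.random.default_rng(4)

        # Paris-law crack growth, da/dN = C * (dK)**m with dK = dsig*sqrt(pi*a), m = 3.
        a0, dsig, m = 2e-3, 100e6, 3.0            # initial crack size (m), stress range (Pa)
        B = (dsig * np.sqrt(np.pi))**m

        def crack_length(N, C):
            # Closed-form integration of the growth law for m = 3.
            return (a0**-0.5 - 0.5 * C * B * N)**-2.0

        # Synthetic diagnosis data: noisy crack-length measurements at known cycle counts.
        C_true = 1e-29                            # illustrative Paris coefficient (SI units)
        N_obs = np.array([0.0, 2e4, 4e4, 6e4, 8e4])
        a_obs = crack_length(N_obs, C_true) * rng.lognormal(0.0, 0.05, N_obs.size)

        def log_post(logC):
            if not (-33.0 < logC < -25.0):        # flat prior on log10(C)
                return -np.inf
            base = a0**-0.5 - 0.5 * 10.0**logC * B * N_obs
            if np.any(base <= 0):                 # crack would already have gone critical
                return -np.inf
            a_pred = base**-2.0
            # Lognormal measurement-error likelihood (5 percent scatter assumed).
            return -0.5 * np.sum((np.log(a_obs / a_pred) / 0.05)**2)

        # Metropolis sampling of log10(C).
        samples, logC, lp = [], -29.5, log_post(-29.5)
        for _ in range(20_000):
            prop = logC + rng.normal(0.0, 0.1)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                logC, lp = prop, lp_prop
            samples.append(logC)
        C_post = 10.0**np.array(samples[5_000:])  # discard burn-in

        # Remaining useful life: cycles until the crack reaches a critical size.
        a_crit = 25e-3
        N_fail = (a0**-0.5 - a_crit**-0.5) / (0.5 * C_post * B)
        rul = N_fail - N_obs[-1]
        print(f"RUL median: {np.median(rul):.0f} cycles, 90% interval: "
              f"[{np.quantile(rul, 0.05):.0f}, {np.quantile(rul, 0.95):.0f}]")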

  10. Unprecedented simultaneous enhancement in damage tolerance and fatigue resistance of zirconia/Ta composites

    PubMed Central

    Smirnov, A.; Beltrán, J. I.; Rodriguez-Suarez, T.; Pecharromán, C.; Muñoz, M. C.; Moya, J. S.; Bartolomé, J. F.

    2017-01-01

    Dense (>98% of theoretical density) and homogeneous ceramic/metal composites were obtained by spark plasma sintering (SPS) using ZrO2 and lamellar metallic powders of tantalum or niobium (20 vol.%) as starting materials. The present study has demonstrated the unique and unpredicted simultaneous enhancement in toughness and strength with very high flaw tolerance of zirconia/Ta composites. In addition to their excellent static mechanical properties, these composites also have exceptional resistance to fatigue loading. It has been shown that the major contributions to toughening are the resulting crack bridging and plastic deformation of the metallic particles, together with crack deflection and interfacial debonding, which is compatible with the coexistence in the composite of both strong and weak ceramic/metal interfaces, in agreement with predictions of ab-initio calculations. Therefore, these materials are promising candidates for designing damage-tolerant components for the aerospace industry, cutting and drilling tools, and biomedical implants, among many others. PMID:28322343

  11. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
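
    A toy version of such an output, the cumulative probability of exceedance for a response variable with bootstrap confidence bounds, is sketched below; the simple stress model and input scatter are invented and are not part of PSAM.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 50_000

        # Illustrative random inputs: axial load, plate thickness (a stand-in for a
        # geometry tolerance), and a stress-concentration factor.
        P = rng.normal(50e3, 5e3, n)                 # load, N
        b = 0.05                                     # width, m (deterministic)
        t = rng.normal(4e-3, 0.1e-3, n)              # thickness, m
        kt = rng.lognormal(np.log(2.2), 0.05, n)     # stress concentration factor

        sigma_max = kt * P / (b * t)                 # response variable, Pa

        # Cumulative probability of exceedance at a stress level of interest,
        # with bootstrap 90% confidence bounds on that probability.
        threshold = 650e6
        p_exceed = (sigma_max > threshold).mean()
        boot = np.array([(rng.choice(sigma_max, size=n, replace=True) > threshold).mean()
                         for _ in range(500)])

        print(f"99th-percentile max stress:  {np.quantile(sigma_max, 0.99) / 1e6:.0f} MPa")
        print(f"P(sigma_max > 650 MPa)     = {p_exceed:.4f}")
        print(f"90% bootstrap bounds       = [{np.quantile(boot, 0.05):.4f}, "
              f"{np.quantile(boot, 0.95):.4f}]")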

  12. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  13. Acoustic emission based damage localization in composites structures using Bayesian identification

    NASA Astrophysics Data System (ADS)

    Kundu, A.; Eaton, M. J.; Al-Jumali, S.; Sikdar, S.; Pullin, R.

    2017-05-01

    Acoustic emission based damage detection in composite structures relies on the detection of ultra-high-frequency packets of acoustic waves emitted from damage sources (such as fibre breakage and fatigue fracture, amongst others) with a network of distributed sensors. This non-destructive monitoring scheme requires solving an inverse problem where the measured signals are linked back to the location of the source. This in turn enables rapid deployment of mitigative measures. The presence of a significant amount of uncertainty associated with the operating conditions and measurements makes the problem of damage identification quite challenging. The uncertainties stem from the fact that the measured signals are affected by irregular geometries, manufacturing imprecision, imperfect boundary conditions, and existing damage/structural degradation, amongst others. This work aims to tackle these uncertainties within a framework of automated probabilistic damage detection. The method trains a probabilistic model of the parametrized input and output model of the acoustic emission system with experimental data to give probabilistic descriptors of damage locations. A response surface modelling the acoustic emission as a function of parametrized damage signals collected from sensors would be calibrated with a training dataset using Bayesian inference. This is used to deduce damage locations in the online monitoring phase. During online monitoring, the spatially correlated time data is utilized in conjunction with the calibrated acoustic emissions model to infer the probabilistic description of the acoustic emission source within a hierarchical Bayesian inference framework. The methodology is tested on a composite structure consisting of a carbon fibre panel with stiffeners, and damage source behaviour has been experimentally simulated using standard H-N sources. The methodology presented in this study would be applicable in the current form to structural damage detection under varying
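
    A much-simplified sketch of the underlying source localization inverse problem is given below; it assumes a known constant wave speed and Gaussian arrival-time noise on a grid of candidate locations, whereas the study calibrates a response surface from experimental data within a hierarchical Bayesian framework.

        import numpy as np

        rng = np.random.default_rng(6)

        # Sensor positions on a 0.5 m x 0.5 m panel (m) and an assumed wave speed (m/s).
        sensors = np.array([[0.05, 0.05], [0.45, 0.05], [0.45, 0.45], [0.05, 0.45]])
        c = 5000.0
        sigma_t = 2e-6                      # assumed arrival-time noise, s

        # Synthetic AE event and noisy time differences of arrival (relative to sensor 0).
        source_true = np.array([0.32, 0.18])
        dist_true = np.linalg.norm(sensors - source_true, axis=1) / c
        tdoa_obs = dist_true[1:] - dist_true[0] + rng.normal(0.0, sigma_t, 3)

        # Grid posterior over candidate source locations (flat prior on the panel).
        xs = np.linspace(0.0, 0.5, 201)
        ys = np.linspace(0.0, 0.5, 201)
        X, Y = np.meshgrid(xs, ys, indexing="ij")
        grid = np.stack([X.ravel(), Y.ravel()], axis=1)

        travel = np.linalg.norm(grid[:, None, :] - sensors[None, :, :], axis=2) / c
        tdoa_pred = travel[:, 1:] - travel[:, [0]]
        loglik = -0.5 * np.sum(((tdoa_obs - tdoa_pred) / sigma_t)**2, axis=1)
        post = np.exp(loglik - loglik.max())
        post /= post.sum()

        mean_est = (grid * post[:, None]).sum(axis=0)
        print(f"true source:       {source_true}")
        print(f"posterior mean:    {mean_est.round(3)}")
        print(f"posterior maximum: {grid[post.argmax()].round(3)}")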

  14. Quantifying the risks of winter damage on overwintering crops under future climates: Will low-temperature damage be more likely in warmer climates?

    NASA Astrophysics Data System (ADS)

    Vico, G.; Weih, M.

    2014-12-01

    Autumn-sown crops act as winter cover crops, reducing soil erosion and nutrient leaching, while potentially providing higher yields than spring varieties in many environments. Nevertheless, overwintering crops are exposed for longer periods to the vagaries of weather conditions. Adverse winter conditions, in particular, may negatively affect the final yield by reducing crop survival or vigor. The net effect of the projected shifts in climate is unclear. On the one hand, warmer temperatures may reduce the frequency of low temperatures, thereby reducing damage risk. On the other hand, warmer temperatures, by reducing plant acclimation level and the amount and duration of snow cover, may increase the likelihood of damage. Thus, warmer climates may paradoxically result in more extensive low-temperature damage and reduced viability for overwintering plants. The net effect of a shift in climate is explored by means of a parsimonious probabilistic model, based on a coupled description of air temperature, snow cover, and crop tolerable temperature. Exploiting an extensive dataset of winter wheat responses to low-temperature exposure, the risk of winter damage occurrence is quantified under conditions typical of northern temperate latitudes. The full spectrum of variations expected with climate change is explored, quantifying the joint effects of alterations in temperature averages and their variability as well as shifts in precipitation. The key features affecting winter wheat vulnerability to low-temperature damage under future climates are singled out.
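
    A toy version of such a coupled probabilistic model is sketched below: daily minimum air temperature, a crude snow cover process, and a fixed tolerable (LT50) temperature are combined by Monte Carlo to estimate a winter damage probability; every parameterization in the sketch is invented.

        import numpy as np

        rng = np.random.default_rng(7)
        n_winters, n_days = 10_000, 120

        def winter_damage_probability(mean_tair, lt50=-18.0):
            # Daily minimum air temperature (degC): AR(1) process around a winter mean.
            tair = np.empty((n_winters, n_days))
            tair[:, 0] = mean_tair + rng.normal(0, 4, n_winters)
            for d in range(1, n_days):
                tair[:, d] = (mean_tair + 0.7 * (tair[:, d - 1] - mean_tair)
                              + rng.normal(0, 4, n_winters))

            # Crude snow cover: deeper and more frequent in colder winters.
            p_snow = np.clip(0.05 - 0.04 * mean_tair, 0.0, 0.9)
            snow_depth = (rng.binomial(1, p_snow, (n_winters, n_days))
                          * rng.exponential(0.10, (n_winters, n_days)))    # metres

            # Snow insulation: crown temperature relaxes toward 0 degC with depth.
            damping = np.exp(-snow_depth / 0.05)
            t_crown = damping * tair

            # Damage if crown temperature ever drops below the tolerable temperature.
            return np.mean(t_crown.min(axis=1) < lt50)

        for mean_tair in (-8.0, -5.0, -2.0):
            print(f"mean winter Tair {mean_tair:+.0f} degC -> "
                  f"P(damage) = {winter_damage_probability(mean_tair):.3f}")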

  15. An assessment of buffer strips for improving damage tolerance of composite laminates at elevated temperature

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1981-01-01

    Buffer strips greatly improve the damage tolerance of graphite/epoxy laminates loaded in tension. Graphite/polyimide buffer strip panels were made and tested to determine their residual strength at ambient and elevated (177 C) temperature. Each panel was cut in the center to represent damage. Panels were radiographed and crack-opening displacements were recorded to indicate fracture, fracture arrest, and the extent of damage in the buffer strip after arrest. All panels had the same buffer strip spacing and width. The buffer strip material was 0° S-glass/PMR-15. The buffer strips were made by replacing narrow strips of the 0° graphite plies with strips of the 0° S-glass on either a one-for-one or a two-for-one basis. Half of the panels were heated to 177 ± 3 C before and during the testing. Elevated temperature did not alter the fracture behavior of the buffer configuration.

  16. Mechanical behavior, damage tolerance and durability of fiber metal laminates for aircraft structures

    NASA Astrophysics Data System (ADS)

    Wu, Guocai

    This study systematically explores the mechanical behavior, damage tolerance and durability of fiber metal laminates, a promising candidate materials system for next-generation aerospace structures. The experimental results indicated that GLARE laminates exhibited a bilinear deformation behavior under static in-plane loading. Both an analytical constitutive model based on a modified classical lamination theory which incorporates the elasto-plastic behavior of aluminum alloy and a numerical simulation based on finite element modeling are used to predict the nonlinear stress-strain response and deformation behavior of GLARE laminates. The blunt notched strength of GLARE laminates increased with decreasing specimen width and decreasing hole diameter. The notched strength of GLARE laminates was evaluated based on a modified point stress criterion. A computer simulation based on the finite element method was performed to study stress concentration and distribution around the notch and verify the analytical and experimental results of notched strength. Good agreement is obtained between the model predictions and experimental results. Experimental results also indicate that GLARE laminates exhibited superior impact properties to those of monolithic 2024-T3 aluminum alloy under low-velocity impact loading. The GLARE 5-2/1 laminate with 0°/90°/90°/0° fiber configuration exhibits a better impact resistance than the GLARE 4-3/2 laminate with 0°/90°/0° fiber orientation. The characteristic impact energies, the damage area, and the permanent deflection of laminates are used to evaluate the impact damage resistance. The post-impact residual tensile strength under various damage states ranging from the plastic dent, barely visible impact damage (BVID), clearly visible impact damage (CVID) up to the complete perforation was also measured and compared. The post-impact fatigue behavior under various stress levels and impact damage states was extensively explored. The damage

  17. Dehydration rate determines the degree of membrane damage and desiccation tolerance in bryophytes.

    PubMed

    Cruz de Carvalho, Ricardo; Catalá, Myriam; Branquinho, Cristina; Marques da Silva, Jorge; Barreno, Eva

    2017-03-01

    Desiccation tolerant (DT) organisms are able to withstand an extended loss of body water and rapidly resume metabolism upon rehydration. This ability, however, is strongly dependent on a slow dehydration rate. Fast dehydration affects membrane integrity leading to intracellular solute leakage upon rehydration and thereby impairs metabolism recovery. We test the hypothesis that the increased cell membrane damage and membrane permeability observed under fast dehydration, compared with slow dehydration, is related to an increase in lipid peroxidation. Our results reject this hypothesis because following rehydration lipid peroxidation remains unaltered, a fact that could be due to the high increase of NO upon rehydration. However, in fast-dried samples we found a strong signal of red autofluorescence upon rehydration, which correlates with an increase in ROS production and with membrane leakage, particularly the case of phenolics. This could be used as a bioindicator of oxidative stress and membrane damage. © 2016 Scandinavian Plant Physiology Society.

  18. Rad18 confers hematopoietic progenitor cell DNA damage tolerance independently of the Fanconi Anemia pathway in vivo

    PubMed Central

    Yang, Yang; Poe, Jonathan C.; Yang, Lisong; Fedoriw, Andrew; Desai, Siddhi; Magnuson, Terry; Li, Zhiguo; Fedoriw, Yuri; Araki, Kimi; Gao, Yanzhe; Tateishi, Satoshi; Sarantopoulos, Stefanie; Vaziri, Cyrus

    2016-01-01

    In cultured cancer cells, the E3 ubiquitin ligase Rad18 activates Trans-Lesion Synthesis (TLS) and the Fanconi Anemia (FA) pathway. However, physiological roles of Rad18 in DNA damage tolerance and carcinogenesis are unknown and were investigated here. Primary hematopoietic stem and progenitor cells (HSPC) co-expressed RAD18 and FANCD2 proteins, potentially consistent with a role for Rad18 in FA pathway function during hematopoiesis. However, hematopoietic defects typically associated with fanc-deficiency (decreased HSPC numbers, reduced engraftment potential of HSPC, and Mitomycin C (MMC)-sensitive hematopoiesis) were absent in Rad18−/− mice. Moreover, primary Rad18−/− mouse embryonic fibroblasts (MEF) retained robust Fancd2 mono-ubiquitination following MMC treatment. Therefore, Rad18 is dispensable for FA pathway activation in untransformed cells and the Rad18 and FA pathways are separable in hematopoietic cells. In contrast with responses to crosslinking agents, Rad18−/− HSPC were sensitive to in vivo treatment with the myelosuppressive agent 7,12-dimethylbenz[a]anthracene (DMBA). Rad18-deficient fibroblasts aberrantly accumulated DNA damage markers after DMBA treatment. Moreover, in vivo DMBA treatment led to increased incidence of B cell malignancy in Rad18−/− mice. These results identify novel hematopoietic functions for Rad18 and provide the first demonstration that Rad18 confers DNA damage tolerance and tumor-suppression in a physiological setting. PMID:26883629

  19. The US Navy’s Helicopter Integrated Diagnostics System (HIDS) Program: Power Drive Train Crack Detection Diagnostics and Prognostics Life Usage Monitoring and Damage Tolerance; Techniques, Methodologies, and Experiences

    DTIC Science & Technology

    2000-02-01

    ...[HIDS] Program: Power Drive Train Crack Detection Diagnostics and Prognostics, Life Usage Monitoring, and Damage Tolerance; Techniques, Methodologies, and Experiences. Andrew Hess; Harrison Chin; William Hardman ... continuing program to evaluate helicopter diagnostic, prognostic, and ... deployed engine monitoring systems in fixed wing aircraft, notably on the A

  20. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces from the low-lying Mekong River delta region in the southwest to the Red River delta region in northern Vietnam are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with a maximum sustained wind speed of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon, causing significant damage to property. As tropical cyclone risk is expected to increase with increases in exposure and population growth along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.
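
    A toy stochastic event-loss sketch of the same general structure (Poisson event counts, sampled wind intensities, a vulnerability curve applied to an exposure, and an annual-loss exceedance summary) is given below; all frequencies, curves, and exposure values are invented and unrelated to the CoreLogic model.

        import numpy as np

        rng = np.random.default_rng(8)
        years = 10_000
        exposure = 5e9                          # total insured value (illustrative), USD

        # Event frequency and intensity: Poisson landfall counts, lognormal peak winds.
        lam = 2.5                               # mean landfalling/by-passing events per year
        counts = rng.poisson(lam, years)

        def vulnerability(wind_ms):
            # Illustrative mean damage ratio vs. sustained wind speed (m/s).
            return np.clip((wind_ms - 20.0) / 60.0, 0.0, 1.0)**2

        annual_loss = np.zeros(years)
        for y in range(years):
            winds = rng.lognormal(np.log(35.0), 0.35, counts[y])      # m/s
            # 0.02: assumed fraction of the portfolio inside each event footprint.
            annual_loss[y] = np.sum(vulnerability(winds)) * 0.02 * exposure

        aal = annual_loss.mean()                               # average annual loss
        ep_10, ep_100 = np.quantile(annual_loss, [0.9, 0.99])  # 10- and 100-year losses
        print(f"Average annual loss: {aal / 1e6:.1f} M USD")
        print(f"10-year / 100-year annual loss: {ep_10 / 1e6:.1f} / {ep_100 / 1e6:.1f} M USD")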

  1. Effect of Buckling Modes on the Fatigue Life and Damage Tolerance of Stiffened Structures

    NASA Technical Reports Server (NTRS)

    Davila, Carlos G.; Bisagni, Chiara; Rose, Cheryl A.

    2015-01-01

    The postbuckling response and the collapse of composite specimens with a co-cured hat stringer are investigated experimentally and numerically. These specimens are designed to evaluate the postbuckling response and the effect of an embedded defect on the collapse load and the mode of failure. Tests performed using controlled conditions and detailed instrumentation demonstrate that the damage tolerance, fatigue life, and collapse loads are closely tied with the mode of the postbuckling deformation, which can be different between two nominally identical specimens. Modes that tend to open skin/stringer defects are the most damaging to the structure. However, skin/stringer bond defects can also propagate under shearing modes. In the proposed paper, the effects of initial shape imperfections on the postbuckling modes and the interaction between different postbuckling deformations and the propagation of skin/stringer bond defects under quasi-static or fatigue loads will be examined.

  2. Realising damage-tolerant nacre-inspired CFRP

    NASA Astrophysics Data System (ADS)

    Narducci, F.; Lee, K.-Y.; Pinho, S. T.

    2018-07-01

    In this work, a nacre-inspired Carbon Fibre Reinforced Polymer (CFRP) composite is designed, synthesised and tested. Analytical and numerical models are used to design a tiled micro-structure, mimicking the staggered arrangement of ceramic platelets in nacre and exploiting geometrical interlocks for crack deflection and damage diffusion. The designed pattern of tiles is then laser-engraved in the laminate plies. In order to increase the damage-spreading capability of the material, a thin layer of poly(lactic acid) (PLA) is film-cast on the interlaminar region, both as a continuous film and as a pattern of fractal-shaped patches. Three-point bending tests show how the nacre-like micro-structure succeeds in deflecting cracks, with damage diffusion being significantly improved by the addition of PLA at the interface between tiles. It is observed that a texture of discontinuous fractal-shaped PLA patches can increase damage diffusion, by promoting the unlocking of tiles whilst preserving the interface strength.

  3. Optimal Design and Damage Tolerance Verification of an Isogrid Structure for Helicopter Application

    NASA Technical Reports Server (NTRS)

    Baker, Donald J.; Fudge, Jack; Ambur, Damodar R.; Kassapoglou, Christos

    2003-01-01

    A composite isogrid panel design for application to a rotorcraft fuselage is presented. An optimum panel design for the lower fuselage of the rotorcraft that is subjected to combined in-plane compression and shear loads was generated using a design tool that utilizes a smeared-stiffener theory in conjunction with a genetic algorithm. A design feature was introduced along the edges of the panel that facilitates introduction of loads into the isogrid panel without producing undesirable local bending gradients. A low-cost manufacturing method for the isogrid panel that incorporates these design details is also presented. Axial compression tests were conducted on the undamaged and low-speed impact damaged panels to demonstrate the damage tolerance of this isogrid panel. A combined loading test fixture was designed and utilized that allowed simultaneous application of compression and shear loads to the test specimen. Results from finite element analyses are presented for the isogrid panel designs and these results are compared with experimental results. This study illustrates the isogrid concept to be a viable candidate for application to the helicopter lower fuselage structure.

  4. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
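
    As an illustration of the Bayesian estimation described in this record, the sketch below infers a damage location and size from noisy strain readings with a plain random-walk Metropolis sampler. The closed-form surrogate() function, the sensor layout, and every numerical value are hypothetical stand-ins; the paper's sparse-grid surrogate, weighted likelihood, and DRAM sampler are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical closed-form "surrogate": strain at fixed sensor positions as a
    # function of damage location x and size s (stands in for a finite element model).
    sensors = np.linspace(0.0, 1.0, 8)
    def surrogate(x, s):
        return s * np.exp(-((sensors - x) ** 2) / 0.02)

    # Synthetic measurements polluted with Gaussian noise
    x_true, s_true, sigma = 0.35, 1.2, 0.05
    data = surrogate(x_true, s_true) + rng.normal(0.0, sigma, sensors.size)

    def log_posterior(theta):
        x, s = theta
        if not (0.0 <= x <= 1.0 and 0.0 < s <= 5.0):       # uniform prior bounds
            return -np.inf
        resid = data - surrogate(x, s)
        return -0.5 * np.sum(resid ** 2) / sigma ** 2       # Gaussian likelihood

    # Random-walk Metropolis sampling of the posterior
    theta = np.array([0.5, 1.0])
    logp = log_posterior(theta)
    chain = []
    for _ in range(20000):
        prop = theta + rng.normal(0.0, [0.02, 0.05])
        logp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        chain.append(theta.copy())

    chain = np.array(chain[5000:])                          # discard burn-in
    print("posterior mean (x, s):", chain.mean(axis=0))
    print("posterior std  (x, s):", chain.std(axis=0))
    ```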

  5. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    Abstract We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  6. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  7. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    NASA Astrophysics Data System (ADS)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed Generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. Moreover, because wind and solar irradiation are random, the DG output, and hence the harmonics it generates, are uncertain as well. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probability distribution function of the DPV output power in each typical weather condition was obtained via maximum likelihood parameter estimation. The Monte Carlo simulation method was adopted to calculate the probability distribution of harmonic voltage content at different harmonic orders as well as the total harmonic distortion (THD) in the typical weather conditions. The case study was based on the IEEE33 system, and the resulting probability distributions of harmonic voltage content and THD in the typical weather conditions were compared.
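
    The Monte Carlo procedure in this record can be caricatured in a few lines. The simplified single-bus sketch below samples per-unit DPV output from assumed Beta distributions for two weather types, maps it to harmonic voltages through assumed injection fractions and harmonic impedances, and summarizes the resulting THD distribution; none of the numbers, the distributions, or the network model come from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 10_000                                   # Monte Carlo trials

    # Assumed per-unit PV output distributions for two weather types (Beta parameters
    # would come from maximum-likelihood identification on field data).
    weather = {"sunny": (8.0, 2.0), "cloudy": (2.0, 5.0)}

    # Assumed inverter harmonic current injection (fraction of fundamental current)
    # and assumed harmonic network impedance seen at the bus, both illustrative.
    h_orders = np.array([5, 7, 11, 13])
    i_h_frac = np.array([0.04, 0.03, 0.015, 0.01])
    z_h = 0.05 * h_orders                        # per-unit, roughly linear in order
    v1 = 1.0                                     # fundamental bus voltage, per-unit

    for name, (a, b) in weather.items():
        p = rng.beta(a, b, N)                    # sampled PV output, per-unit
        v_h = p[:, None] * i_h_frac * z_h        # harmonic voltages for each trial
        thd = np.sqrt((v_h ** 2).sum(axis=1)) / v1 * 100.0
        print(f"{name:6s}  mean THD = {thd.mean():.3f}%   "
              f"95th percentile = {np.percentile(thd, 95):.3f}%")
    ```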

  8. Damage Tolerance Assessment of Friction Pull Plug Welds

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process developed and patented by The Welding Institute in Cambridge, England. Friction stir welding has been implemented in the aerospace industry in the fabrication of longitudinal welds in pressurized cryogenic propellant tanks. As the industry looks to implement friction stir welding in circumferential welds in pressurized cryogenic propellant tanks, techniques to close out the termination hole associated with retracting the pin tool are being evaluated. Friction pull plug welding is under development as one means of closing out the termination hole. A friction pull plug weld placed in a friction stir weld results in a non-homogenous weld joint where the initial weld, plug weld, their respective heat affected zones and the base metal all interact. The welded joint is a composite, plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size in the test or service environments. Test data relating residual strength capability to flaw size in two aluminum alloy friction plug weld configurations is presented.

  9. Precision cancer therapy: profiting from tumor specific defects in the DNA damage tolerance system.

    PubMed

    Buoninfante, Olimpia Alessandra; Pilzecker, Bas; Aslam, Muhammad Assad; Zavrakidis, Ioannis; van der Wiel, Rianne; van de Ven, Marieke; van den Berk, Paul C M; Jacobs, Heinz

    2018-04-10

    DNA damage tolerance (DDT) enables replication to continue in the presence of a damaged template and constitutes a key step in DNA interstrand crosslink repair. In this way DDT minimizes replication stress inflicted by a wide range of endogenous and exogenous agents, and provides a critical first line defense against alkylating and platinating chemotherapeutics. Effective DDT strongly depends on damage-induced, site-specific PCNA-ubiquitination at Lysine (K) 164 by the E2/E3 complex (RAD6/18). A survey of The Cancer Genome Atlas (TCGA) revealed that a high frequency of tumors presents RAD6/RAD18 bi-allelic inactivating deletions. For instance, 11% of renal cell carcinomas and 5% of pancreatic tumors have inactivating RAD18-deletions, and 7% of malignant peripheral nerve sheath tumors lack RAD6B. To determine the potential benefit of tumor-specific DDT defects, we followed a genetic approach by establishing unique sets of DDT-proficient PcnaK164 and -defective PcnaK164R lymphoma and breast cancer cell lines. In the absence of exogenous DNA damage, PcnaK164R tumors grew comparably to their PcnaK164 controls in vitro and in vivo. However, DDT-defective lymphomas and breast cancers were, compared to their DDT-proficient controls, hypersensitive to the chemotherapeutic drug cisplatin (CsPt), both in vitro and in vivo. CsPt strongly inhibited tumor growth, and the overall survival of tumor-bearing mice greatly improved in the DDT-defective condition. These insights open new therapeutic possibilities for precision cancer medicine with DNA-damaging chemotherapeutics and optimize Next-Generation-Sequencing (NGS)-based cancer diagnostics, therapeutics, and prognosis.

  10. Precision cancer therapy: profiting from tumor specific defects in the DNA damage tolerance system

    PubMed Central

    Buoninfante, Olimpia Alessandra; Pilzecker, Bas; Aslam, Muhammad Assad; Zavrakidis, Ioannis; van der Wiel, Rianne; van de Ven, Marieke; van den Berk, Paul C.M.; Jacobs, Heinz

    2018-01-01

    DNA damage tolerance (DDT) enables replication to continue in the presence of a damaged template and constitutes a key step in DNA interstrand crosslink repair. In this way DDT minimizes replication stress inflicted by a wide range of endogenous and exogenous agents, and provides a critical first line defense against alkylating and platinating chemotherapeutics. Effective DDT strongly depends on damage-induced, site-specific PCNA-ubiquitination at Lysine (K) 164 by the E2/E3 complex (RAD6/18). A survey of The Cancer Genome Atlas (TCGA) revealed that a high frequency of tumors presents RAD6/RAD18 bi-allelic inactivating deletions. For instance, 11% of renal cell carcinomas and 5% of pancreatic tumors have inactivating RAD18-deletions, and 7% of malignant peripheral nerve sheath tumors lack RAD6B. To determine the potential benefit of tumor-specific DDT defects, we followed a genetic approach by establishing unique sets of DDT-proficient PcnaK164 and -defective PcnaK164R lymphoma and breast cancer cell lines. In the absence of exogenous DNA damage, PcnaK164R tumors grew comparably to their PcnaK164 controls in vitro and in vivo. However, DDT-defective lymphomas and breast cancers were, compared to their DDT-proficient controls, hypersensitive to the chemotherapeutic drug cisplatin (CsPt), both in vitro and in vivo. CsPt strongly inhibited tumor growth, and the overall survival of tumor-bearing mice greatly improved in the DDT-defective condition. These insights open new therapeutic possibilities for precision cancer medicine with DNA-damaging chemotherapeutics and optimize Next-Generation-Sequencing (NGS)-based cancer diagnostics, therapeutics, and prognosis. PMID:29721165

  11. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    NASA Astrophysics Data System (ADS)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed risks by a large margin ($33 million). Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the
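
    The risk-benefit comparison in this record rests on integrating damage and benefit over the exceedance-probability range of the simulated floods. A minimal sketch of that expected-annual-value calculation follows; the 1.3- to 100-year range comes from the abstract, but the intermediate recurrence intervals and all per-event damage and benefit values are invented for illustration.

    ```python
    import numpy as np

    # Recurrence intervals (years) and the corresponding annual exceedance probabilities
    T = np.array([1.3, 2, 5, 10, 25, 50, 100])
    p_exc = 1.0 / T

    # Hypothetical per-event damage and benefit (e.g. rice yield plus fish capture),
    # in millions of dollars, increasing with flood magnitude; illustrative only.
    damage  = np.array([0.5, 1.0, 3.0, 6.0, 12.0, 20.0, 33.0])
    benefit = np.array([20.0, 30.0, 45.0, 60.0, 75.0, 85.0, 91.0])

    def expected_annual(value, p):
        """Trapezoidal integration of value over annual exceedance probability."""
        order = np.argsort(p)
        p_s, v_s = p[order], value[order]
        return float(np.sum(0.5 * (v_s[1:] + v_s[:-1]) * np.diff(p_s)))

    ead = expected_annual(damage, p_exc)
    eab = expected_annual(benefit, p_exc)
    print(f"expected annual damage   : {ead:.2f} M$")
    print(f"expected annual benefit  : {eab:.2f} M$")
    print(f"net probabilistic benefit: {eab - ead:.2f} M$")
    ```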

  12. Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels

    NASA Astrophysics Data System (ADS)

    Kurniasih, R.; Sujadi, I.

    2017-09-01

    This paper investigates students' levels of probabilistic thinking, that is, thinking about probabilistic or uncertain matters in probability material. The research subjects were 8th-grade junior high school students. The main instrument was the researcher, supported by a probabilistic thinking skills test and interview guidelines. Data were analyzed using the triangulation method. The results showed that, before instruction, students' probabilistic thinking was at the subjective and transitional levels. After learning, the students' levels of probabilistic thinking changed, and some 8th-grade students reached the numerical level, the highest of the probabilistic thinking levels. Students' probabilistic thinking levels can be used as a reference for designing learning materials and strategies.

  13. Radiation Tolerant Interfaces: Influence of Local Stoichiometry at the Misfit Dislocation on Radiation Damage Resistance of Metal/Oxide Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shutthanandan, Vaithiyalingam; Choudhury, Samrat; Manandhar, Sandeep

    To understand how variations in interface properties such as misfit-dislocation density and local chemistry affect radiation-induced defect absorption and recombination, we have explored a model system of CrxV1-x alloy epitaxial films deposited on MgO single crystals. By controlling film composition, the lattice mismatch with MgO was adjusted so that the misfit-dislocation density varies at the interface. These interfaces were exposed to irradiation, and in situ results show that the film with a semi-coherent interface (Cr) withstands irradiation, while the V film, which has a similar semi-coherent interface, showed the largest damage. Theoretical calculations indicate that, unlike at metal/metal interfaces, the misfit dislocation density does not dominate radiation damage tolerance at metal/oxide interfaces. Rather, the stoichiometry, and the precise location of the misfit dislocations relative to the interface, drives defect behavior. Together, these results demonstrate the sensitivity of defect recombination to interfacial chemistry and provide new avenues for engineering radiation-tolerant nanomaterials.

  14. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
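
    A toy version of the confidence-level logic is sketched below: a set of hypothetical event-integrated spectra (here a single power-law form, rather than the eight candidate spectral models surveyed in the record above) is reduced to a worst-case spectrum at a user-specified confidence level by taking percentiles across events. All distributions and numbers are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical event-integrated proton fluence spectra for 50 historical SPEs,
    # each modelled as a power law F(E) = A * E**(-gamma).
    energies = np.logspace(0, 2, 20)                     # MeV
    A = rng.lognormal(mean=np.log(1e7), sigma=1.0, size=50)
    gamma = rng.normal(2.0, 0.3, size=50)
    spectra = A[:, None] * energies[None, :] ** (-gamma[:, None])

    def worst_case(conf):
        """Fluence not exceeded in any single event, at the given confidence level."""
        return np.percentile(spectra, 100.0 * conf, axis=0)

    for conf in (0.90, 0.95, 0.99):
        wc = worst_case(conf)
        print(f"confidence {conf:.2f}: fluence at {energies[0]:.0f} MeV = {wc[0]:.3e}")
    ```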

  15. Long-term strength and damage accumulation in laminates

    NASA Astrophysics Data System (ADS)

    Dzenis, Yuris A.; Joshi, Shiv P.

    1993-04-01

    A modified version of the probabilistic model developed by the authors for damage evolution analysis of laminates subjected to random loading is utilized to predict the long-term strength of laminates. The model assumes that each ply in a laminate consists of a large number of mesovolumes. Probabilistic variation functions for mesovolume stiffnesses as well as strengths are used in the analysis. Stochastic strains are calculated using lamination theory and random function theory. Deterioration of ply stiffnesses is calculated on the basis of the probabilities of mesovolume failures using the theory of excursions of a random process beyond limits. Long-term strength and damage accumulation in a Kevlar/epoxy laminate under tension and complex in-plane loading are investigated. Effects of the mean level and stochastic deviation of loading on damage evolution and time-to-failure of the laminate are discussed. Cumulative damage at the time of final failure is greater at low loading levels than at high loading levels. The effect of the deviation in loading is more pronounced at lower mean loading levels.

  16. Development of pressure containment and damage tolerance technology for composite fuselage structures in large transport aircraft

    NASA Technical Reports Server (NTRS)

    Smith, P. J.; Thomson, L. W.; Wilson, R. D.

    1986-01-01

    NASA sponsored composites research and development programs were set in place to develop the critical engineering technologies in large transport aircraft structures. This NASA-Boeing program focused on the critical issues of damage tolerance and pressure containment generic to the fuselage structure of large pressurized aircraft. Skin-stringer and honeycomb sandwich composite fuselage shell designs were evaluated to resolve these issues. Analyses were developed to model the structural response of the fuselage shell designs, and a development test program evaluated the selected design configurations to appropriate load conditions.

  17. Orchid flowers tolerance to gamma-radiation

    NASA Astrophysics Data System (ADS)

    Kikuchi, Olivia Kimiko

    2000-03-01

    Cut flowers are fresh goods that may be treated with fumigants such as methyl bromide to meet the quarantine requirements of importing countries. Irradiation is a non-chemical alternative to the methyl bromide treatment of fresh products. In this research, different cut orchids were irradiated to examine their tolerance to gamma-rays. A 200 Gy dose inhibited the Dendrobium phalaenopsis buds from opening, but did not cause visible damage to opened flowers. Doses of 800 and 1000 Gy were damaging because they provoked the flowers to drop from the stem. Cattleya irradiated with 750 Gy did not show any damage and were therefore eligible for the radiation treatment. Cymbidium tolerated up to 300 Gy and above this dose dropped prematurely. On the other hand, Oncidium did not tolerate doses above 150 Gy.

  18. Damage tolerance and arrest characteristics of pressurized graphite/epoxy tape cylinders

    NASA Technical Reports Server (NTRS)

    Ranniger, Claudia U.; Lagace, Paul A.; Graves, Michael J.

    1993-01-01

    An investigation of the damage tolerance and damage arrest characteristics of internally-pressurized graphite/epoxy tape cylinders with axial notches was conducted. An existing failure prediction methodology, developed and verified for quasi-isotropic graphite/epoxy fabric cylinders, was investigated for applicability to general tape layups. In addition, the effect of external circumferential stiffening bands on the direction of fracture path propagation and possible damage arrest was examined. Quasi-isotropic (90/0/±45)s and structurally anisotropic (±45/0)s and (±45/90)s coupons and cylinders were constructed from AS4/3501-6 graphite/epoxy tape. Notched and unnotched coupons were tested in tension and the data correlated using the equation of Mar and Lin. Cylinders with through-thickness axial slits were pressurized to failure, achieving a far-field two-to-one biaxial stress state. Experimental failure pressures of the (90/0/±45)s cylinders agreed with predicted values for all cases but the specimen with the smallest slit. However, the failure pressures of the structurally anisotropic cylinders, (±45/0)s and (±45/90)s, were above the values predicted utilizing the predictive methodology in all cases. Possible factors neglected by the predictive methodology include structural coupling in the laminates and axial loading of the cylindrical specimens. Furthermore, applicability of the predictive methodology depends on the similarity of initial fracture modes in the coupon specimens and the cylinder specimens of the same laminate type. The existence of splitting, which may be exacerbated by the axial loading in the cylinders, shows that this condition is not always met. The circumferential stiffeners were generally able to redirect fracture propagation from longitudinal to circumferential. A quantitative assessment for stiffener effectiveness in containing the fracture, based on cylinder

  19. DNA damage tolerance in hematopoietic stem and progenitor cells in mice

    PubMed Central

    Pilzecker, Bas; Buoninfante, Olimpia Alessandra; van den Berk, Paul; Lancini, Cesare; Song, Ji-Ying; Citterio, Elisabetta

    2017-01-01

    DNA damage tolerance (DDT) enables bypassing of DNA lesions during replication, thereby preventing fork stalling, replication stress, and secondary DNA damage related to fork stalling. Three modes of DDT have been documented: translesion synthesis (TLS), template switching (TS), and repriming. TLS and TS depend on site-specific PCNA K164 monoubiquitination and polyubiquitination, respectively. To investigate the role of DDT in maintaining hematopoietic stem cells (HSCs) and progenitors, we used PcnaK164R/K164R mice as a unique DDT-defective mouse model. Analysis of the composition of HSCs and HSC-derived multipotent progenitors (MPPs) revealed a significantly reduced number of HSCs, likely owing to increased differentiation of HSCs toward myeloid/erythroid-associated MPP2s. This skewing came at the expense of the number of lymphoid-primed MPP4s, which appeared to be compensated for by increased MPP4 proliferation. Furthermore, defective DDT decreased the numbers of MPP-derived common lymphoid progenitor (CLP), common myeloid progenitor (CMP), megakaryocyte-erythroid progenitor (MEP), and granulocyte-macrophage progenitor (GMP) cells, accompanied by increased cell cycle arrest in CMPs. The HSC and MPP phenotypes are reminiscent of premature aging and stressed hematopoiesis, and indeed progressed with age and were exacerbated on cisplatin exposure. Bone marrow transplantations revealed a strong cell intrinsic defect of DDT-deficient HSCs in reconstituting lethally irradiated mice and a strong competitive disadvantage when cotransplanted with wild-type HSCs. These findings indicate a critical role of DDT in maintaining HSCs and progenitor cells, and in preventing premature aging. PMID:28761001

  20. Probabilistic Methodology for Estimation of Number and Economic Loss (Cost) of Future Landslides in the San Francisco Bay Region, California

    USGS Publications Warehouse

    Crovelli, Robert A.; Coe, Jeffrey A.

    2008-01-01

    The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5–6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.
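
    The expectation-based bookkeeping behind such an estimate can be illustrated with a short sketch: a historical landslide rate per county is scaled to the future period and multiplied by a mean cost per damaging landslide, then summed over counties. The counts and costs below are invented placeholders, not the PLACES inputs or the published San Francisco Bay results.

    ```python
    # Illustrative numbers only: damaging landslides recorded per county over a
    # 50-year historical period and an assumed mean cost per damaging landslide.
    record_years = 50
    counties = {
        # name: (historical count, mean cost per damaging landslide, M$)
        "County A": (900, 0.18),
        "County B": (60, 0.15),
    }

    horizon = 5  # future period, years
    total_n = total_cost = 0.0
    for name, (count, mean_cost) in counties.items():
        rate = count / record_years               # damaging landslides per year
        n_expected = rate * horizon               # E[N] for the future period
        cost_expected = n_expected * mean_cost    # E[cost] = E[N] * E[cost per slide]
        total_n += n_expected
        total_cost += cost_expected
        print(f"{name}: E[N] = {n_expected:.0f}, E[cost] = {cost_expected:.2f} M$")

    print(f"Region total: E[N] = {total_n:.0f}, E[cost] = {total_cost:.2f} M$")
    ```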

  1. Ontogenetic patterns in the mechanisms of tolerance to herbivory in Plantago

    PubMed Central

    Barton, Kasey E.

    2013-01-01

    Background and Aims Herbivory and plant defence differ markedly among seedlings and juvenile and mature plants in most species. While ontogenetic patterns of chemical resistance have been the focus of much research, comparatively little is known about how tolerance to damage changes across ontogeny. Due to dramatic shifts in plant size, resource acquisition, stored reserves and growth, it was predicted that tolerance and related underlying mechanisms would differ among ontogenetic stages. Methods Ontogenetic patterns in the mechanisms of tolerance were investigated in Plantago lanceolata and P. major (Plantaginaceae) using the genetic sib-ship approach. Pot-grown plants were subjected to 50 % defoliation at the seedling, juvenile and mature stages and either harvested in the short-term to look at plasticity in growth and photosynthesis in response to damage or allowed to grow through seed maturation to measure phenology, shoot compensation and reproductive fitness. Key Results Tolerance to defoliation was high in P. lanceolata, but low in P. major, and did not vary among ontogenetic stages in either species. Mechanisms underlying tolerance did vary across ontogeny. In P. lanceolata, tolerance was significantly related to flowering (juveniles) and pre-damage shoot biomass (mature plants). In P. major, tolerance was significantly related to pre-damage root biomass (seedlings) and induction of non-photochemical quenching, a photosynthetic parameter (juveniles). Conclusions Biomass partitioning was very plastic in response to damage and showed associations with tolerance in both species, indicating a strong role in plant defence. In contrast, photosynthesis and phenology showed weaker responses to damage and were related to tolerance only in certain ontogenetic stages. This study highlights the pivotal role of ontogeny in plant defence and herbivory. Additional studies in more species are needed to determine how seedlings tolerate herbivory in general and whether

  2. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
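
    A stripped-down Monte Carlo loop in the spirit of the PAIR model is sketched below. It samples impactor diameter, density, and velocity from assumed distributions, converts them to impact energy, and applies crude, purely illustrative scalings for damage radius and exposed population; the published model's entry, ablation, and consequence tools are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000

    # Sample uncertain impactor properties (illustrative distributions, not PAIR inputs)
    diameter = 10 ** rng.uniform(1.0, 2.5, N)          # m, roughly 10 m to 300 m
    density  = rng.uniform(1500.0, 3500.0, N)          # kg/m^3
    velocity = rng.normal(20e3, 4e3, N)                # m/s

    mass   = density * (np.pi / 6.0) * diameter ** 3   # kg (sphere)
    energy = 0.5 * mass * velocity ** 2 / 4.184e15     # megatons TNT equivalent

    # Crude ground-damage footprint and exposed population (illustrative scalings)
    damage_radius_km = 2.0 * energy ** (1.0 / 3.0)
    pop_density = rng.lognormal(np.log(50.0), 1.5, N)  # people per km^2
    casualties = np.pi * damage_radius_km ** 2 * pop_density

    print("mean casualties per impact scenario:", int(casualties.mean()))
    print("99th percentile:", int(np.percentile(casualties, 99)))
    ```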

  3. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-05-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  4. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  5. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    DOE PAGES

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; ...

    2013-07-31

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.

  6. A probabilistic estimate of maximum acceleration in rock in the contiguous United States

    USGS Publications Warehouse

    Algermissen, Sylvester Theodore; Perkins, David M.

    1976-01-01

    This paper presents a probabilistic estimate of the maximum ground acceleration to be expected from earthquakes occurring in the contiguous United States. It is based primarily upon the historic seismic record which ranges from very incomplete before 1930 to moderately complete after 1960. Geologic data, primarily distribution of faults, have been employed only to a minor extent, because most such data have not been interpreted yet with earthquake hazard evaluation in mind. The map provides a preliminary estimate of the relative hazard in various parts of the country. The report provides a method for evaluating the relative importance of the many parameters and assumptions in hazard analysis. The map and methods of evaluation described reflect the current state of understanding and are intended to be useful for engineering purposes in reducing the effects of earthquakes on buildings and other structures. Studies are underway on improved methods for evaluating the relative earthquake hazard of different regions. Comments on this paper are invited to help guide future research and revisions of the accompanying map. The earthquake hazard in the United States has been estimated in a variety of ways since the initial effort by Ulrich (see Roberts and Ulrich, 1950). In general, the earlier maps provided an estimate of the severity of ground shaking or damage but the frequency of occurrence of the shaking or damage was not given. Ulrich's map showed the distribution of expected damage in terms of no damage (zone 0), minor damage (zone 1), moderate damage (zone 2), and major damage (zone 3). The zones were not defined further and the frequency of occurrence of damage was not suggested. Richter (1959) and Algermissen (1969) estimated the ground motion in terms of maximum Modified Mercalli intensity. Richter used the terms "occasional" and "frequent" to characterize intensity IX shaking and Algermissen included recurrence curves for various parts of the country in the paper
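
    Although the 1976 map itself is largely empirical, the probabilistic statement behind such estimates can be illustrated with a short hazard-curve calculation: given an assumed annual rate of exceeding each acceleration level, a Poisson model yields the acceleration with a chosen probability of exceedance over a design life. The hazard-curve values below are invented for illustration and are not the map values.

    ```python
    import numpy as np

    # Illustrative site hazard curve: annual rate at which each peak ground
    # acceleration level (as a fraction of g) is exceeded.
    pga  = np.array([0.02, 0.05, 0.10, 0.20, 0.40])
    rate = np.array([0.20, 0.05, 0.012, 0.002, 0.0002])   # exceedances per year

    def pga_for(prob_exceed, years):
        """PGA with the given probability of being exceeded in `years` (Poisson model)."""
        target_rate = -np.log(1.0 - prob_exceed) / years
        # interpolate the hazard curve in log-rate space (xp must be increasing)
        return float(np.interp(np.log(target_rate), np.log(rate[::-1]), pga[::-1]))

    print(f"PGA with 10% chance of exceedance in 50 yr: {pga_for(0.10, 50):.3f} g")
    ```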

  7. Wind effects on long-span bridges: Probabilistic wind data format for buffeting and VIV load assessments

    NASA Astrophysics Data System (ADS)

    Hoffmann, K.; Srouji, R. G.; Hansen, S. O.

    2017-12-01

    The technology development within the structural design of long-span bridges in Norwegian fjords has created a need for reformulating the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.
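
    The way probabilistic wind data can feed a fatigue-damage estimate is sketched below: mean wind speed and turbulence intensity are sampled from assumed distributions, mapped to a stress range in a deck detail through an illustrative coefficient, and accumulated with Miner's rule over an S-N curve. Every parameter is a placeholder, not part of the format proposed in the paper; the sketch only shows how a probabilistic wind description enters a damage assessment.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    N = 200_000                                   # sampled 10-minute wind records

    # Probabilistic wind data: Weibull mean wind speed and lognormal turbulence
    # intensity (both distributions and their parameters are illustrative).
    U  = 9.0 * rng.weibull(2.0, N)                # 10-min mean wind speed, m/s
    Iu = rng.lognormal(np.log(0.12), 0.25, N)     # along-wind turbulence intensity

    # Crude buffeting stress range in a deck detail, proportional to U^2 * Iu
    # (the coefficient lumps aerodynamic and structural influence factors).
    stress_range = 1.5 * U ** 2 * Iu              # MPa, illustrative coefficient

    # S-N curve with slope m = 3 and Miner's rule summation per 10-minute record
    m_sn, C = 3.0, 1.0e13
    cycles_per_record = 120.0                     # ~0.2 Hz response for 10 minutes
    damage_per_record = cycles_per_record * stress_range ** m_sn / C

    records_per_year = 6 * 24 * 365
    annual_damage = damage_per_record.mean() * records_per_year
    print(f"expected annual fatigue damage (Miner sum): {annual_damage:.3e}")
    print(f"implied mean fatigue life: {1.0 / annual_damage:.0f} years")
    ```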

  8. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C.; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  9. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  10. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability, but not the consequences, of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.

  11. Hierarchical flexural strength of enamel: transition from brittle to damage-tolerant behaviour

    PubMed Central

    Bechtle, Sabine; Özcoban, Hüseyin; Lilleodden, Erica T.; Huber, Norbert; Schreyer, Andreas; Swain, Michael V.; Schneider, Gerold A.

    2012-01-01

    Hard, biological materials are generally hierarchically structured from the nano- to the macro-scale in a somewhat self-similar manner consisting of mineral units surrounded by a soft protein shell. Considerable efforts are underway to mimic such materials because of their structurally optimized mechanical functionality of being hard and stiff as well as damage-tolerant. However, it is unclear how different hierarchical levels interact to achieve this performance. In this study, we consider dental enamel as a representative, biological hierarchical structure and determine its flexural strength and elastic modulus at three levels of hierarchy using focused ion beam (FIB) prepared cantilevers of micrometre size. The results are compared and analysed using a theoretical model proposed by Jäger and Fratzl and developed by Gao and co-workers. Both properties decrease with increasing hierarchical dimension along with a switch in mechanical behaviour from linear-elastic to elastic-inelastic. We found Gao's model matched the results very well. PMID:22031729

  12. Probabilistic modeling of condition-based maintenance strategies and quantification of its benefits for airliners

    NASA Astrophysics Data System (ADS)

    Pattabhiraman, Sriram

    Airplane fuselage structures are designed with the concept of damage tolerance, wherein small damage is allowed to remain on the airplane, and damage that would otherwise affect the safety of the structure is repaired. Damage critical to the safety of the fuselage is repaired by scheduling maintenance at pre-determined intervals. Scheduling maintenance is an interesting trade-off between damage tolerance and cost. Tolerance of larger damage would require less frequent maintenance and hence a lower cost to maintain a certain level of reliability. Alternatively, condition-based maintenance techniques have been developed using on-board sensors, which track damage continuously and request maintenance only when the damage size crosses a particular threshold. This effects a tolerance of larger damage than scheduled maintenance, leading to savings in cost. This work quantifies the savings of condition-based maintenance over scheduled maintenance, and also quantifies the conversion of these cost savings into weight savings. Structural health monitoring will need time to establish itself as a stand-alone system for maintenance, owing to concerns about its diagnostic accuracy and reliability. This work therefore also investigates the effect of synchronizing a structural health monitoring system with scheduled maintenance, using on-board SHM equipment to skip structural airframe maintenance (a subset of scheduled maintenance) whenever it is deemed unnecessary, while maintaining a desired level of structural safety. The work also predicts the necessary maintenance for a fleet of airplanes, based on the current damage status of the airplanes, and analyses the possibility of false alarms, wherein maintenance is requested although there is no critical damage on the airplane. Finally, the work uses SHM as a tool to identify lemons in a fleet of airplanes, that is, airplanes that would warrant more maintenance trips than the average behavior of the fleet.
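
    The scheduled-versus-condition-based trade-off described above can be caricatured with a small fleet simulation: cracks grow at plane-specific random rates, scheduled maintenance repairs modest cracks at fixed intervals, while an SHM-based policy tolerates larger cracks but checks every flight. The thresholds, growth rates, and intervals are invented for illustration and are not taken from the dissertation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_planes, n_flights = 200, 40_000
    a_crit = 10.0                                  # crack size treated as critical, mm

    def simulate(threshold_mm, check_interval):
        """Count repairs/failures when cracks above threshold_mm are repaired at each check."""
        # An independent random fleet is drawn for each policy (illustration only).
        a = rng.lognormal(np.log(0.2), 0.3, n_planes)          # initial crack sizes, mm
        growth = rng.lognormal(np.log(1e-4), 0.4, n_planes)    # mm per flight, per plane
        repairs = failures = 0
        for t in range(1, n_flights + 1):
            a += growth
            if t % check_interval == 0:
                due = a > threshold_mm
                repairs += int(due.sum())
                a[due] = 0.2                                    # repaired detail restarts small
            exceeded = a > a_crit
            failures += int(exceeded.sum())
            a[exceeded] = 0.2
        return repairs, failures

    # Scheduled maintenance: conservative threshold, long fixed inspection interval.
    # Condition-based (SHM): larger tolerated crack, checked every flight.
    print("scheduled (repairs, failures):", simulate(threshold_mm=2.0, check_interval=4000))
    print("SHM-based (repairs, failures):", simulate(threshold_mm=6.0, check_interval=1))
    ```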

  13. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
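
    The link between probabilistic sensitivity factors and reliability improvement can be illustrated with a simple Monte Carlo limit-state example: halving the scatter of each random variable in turn shows which one drives the failure probability. The limit state and the distribution parameters below are hypothetical and unrelated to the smart composite wing model in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    N = 500_000

    # Hypothetical limit state g = strength - load with three normal random variables
    # (means and scatter are illustrative, standing in for constituent and load scatter).
    means = {"fiber_strength": 1.8, "matrix_factor": 1.0, "load": 1.0}
    stds  = {"fiber_strength": 0.20, "matrix_factor": 0.10, "load": 0.15}

    def failure_probability(scatter):
        x = {k: rng.normal(means[k], scatter[k], N) for k in means}
        g = x["fiber_strength"] * x["matrix_factor"] - x["load"]   # g < 0 means failure
        return float(np.mean(g < 0.0))

    print(f"baseline Pf = {failure_probability(stds):.4f}")

    # Probabilistic sensitivity check: halve the scatter of one variable at a time and
    # see which reduction lowers the failure probability the most.
    for name in stds:
        reduced = dict(stds, **{name: 0.5 * stds[name]})
        print(f"halving scatter of {name:<15s} -> Pf = {failure_probability(reduced):.4f}")
    ```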

  14. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  15. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  16. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Abstract Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
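
    The weight calculation and the Bayes-theorem step described in this article can be written out in a few lines. The sketch below uses assumed field-level m- and u-probabilities and an assumed prior match rate to turn an agreement pattern for one candidate record pair into a total match weight and a posterior probability of a match; all values are illustrative.

    ```python
    import math

    # Field-level m- and u-probabilities (illustrative values): the probability the
    # field agrees given the pair is a true match (m) or a non-match (u).
    fields = {
        #              m     u
        "surname":    (0.95, 0.01),
        "birth_year": (0.98, 0.05),
        "postcode":   (0.90, 0.02),
    }

    prior_match = 1e-4      # assumed prior probability that a random pair is a match

    def posterior_match_probability(agreement):
        """agreement: dict field -> True/False for one candidate record pair."""
        log_lr = 0.0
        for field, (m, u) in fields.items():
            if agreement[field]:
                log_lr += math.log2(m / u)                # agreement weight
            else:
                log_lr += math.log2((1 - m) / (1 - u))    # disagreement weight
        prior_odds = prior_match / (1 - prior_match)
        posterior_odds = prior_odds * 2 ** log_lr          # Bayes theorem on the odds scale
        return posterior_odds / (1 + posterior_odds), log_lr

    pair = {"surname": True, "birth_year": True, "postcode": False}
    p, w = posterior_match_probability(pair)
    print(f"total match weight = {w:.2f}, posterior P(match) = {p:.4f}")
    ```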

  17. Radiation Tolerant Interfaces: Influence of Local Stoichiometry at the Misfit Dislocation on Radiation Damage Resistance of Metal/Oxide Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shutthanandan, Vaithiyalingam; Choudhury, Samrat; Manandhar, Sandeep

    The interaction of radiation with materials controls the performance, reliability, and safety of many structures in nuclear power systems. Revolutionary improvements in radiation damage resistance may be attainable if methods can be found to manipulate interface properties to give optimal interface stability and point defect recombination capability. To understand how variations in interface properties such as misfit dislocation density and local chemistry affect radiation-induced defect absorption and recombination, a model system of metallic CrxV1-x (0 ≤ x ≤ 1) epitaxial films deposited on MgO(001) single crystal substrates has been explored in this paper. By controlling film composition, the lattice mismatch between the film and MgO is adjusted to vary the misfit dislocation density at the metal/oxide interface. The stability of these interfaces under various irradiation conditions is studied experimentally and theoretically. The results indicate that, unlike at metal/metal interfaces, the misfit dislocation density does not dominate radiation damage tolerance at metal/oxide interfaces. Rather, the stoichiometry and the location of the misfit dislocation extra half-plane (in the metal or the oxide) drive radiation-induced defect behavior. Finally, together, these results demonstrate the sensitivity of defect recombination to interfacial chemistry and provide new avenues for engineering radiation-tolerant nanomaterials for next-generation nuclear power plants.

  18. Radiation Tolerant Interfaces: Influence of Local Stoichiometry at the Misfit Dislocation on Radiation Damage Resistance of Metal/Oxide Interfaces

    DOE PAGES

    Shutthanandan, Vaithiyalingam; Choudhury, Samrat; Manandhar, Sandeep; ...

    2017-04-24

    The interaction of radiation with materials controls the performance, reliability, and safety of many structures in nuclear power systems. Revolutionary improvements in radiation damage resistance may be attainable if methods can be found to manipulate interface properties to give optimal interface stability and point defect recombination capability. To understand how variations in interface properties such as misfit dislocation density and local chemistry affect radiation-induced defect absorption and recombination, a model system of metallic CrxV1-x (0 ≤ x ≤ 1) epitaxial films deposited on MgO(001) single crystal substrates has been explored in this paper. By controlling film composition, the lattice mismatch between the film and MgO is adjusted to vary the misfit dislocation density at the metal/oxide interface. The stability of these interfaces under various irradiation conditions is studied experimentally and theoretically. The results indicate that, unlike at metal/metal interfaces, the misfit dislocation density does not dominate radiation damage tolerance at metal/oxide interfaces. Rather, the stoichiometry and the location of the misfit dislocation extra half-plane (in the metal or the oxide) drive radiation-induced defect behavior. Finally, together, these results demonstrate the sensitivity of defect recombination to interfacial chemistry and provide new avenues for engineering radiation-tolerant nanomaterials for next-generation nuclear power plants.

  19. DNA damage tolerance pathway involving DNA polymerase ι and the tumor suppressor p53 regulates DNA replication fork progression.

    PubMed

    Hampp, Stephanie; Kiessling, Tina; Buechle, Kerstin; Mansilla, Sabrina F; Thomale, Jürgen; Rall, Melanie; Ahn, Jinwoo; Pospiech, Helmut; Gottifredi, Vanesa; Wiesmüller, Lisa

    2016-07-26

    DNA damage tolerance facilitates the progression of replication forks that have encountered obstacles on the template strands. It involves either translesion DNA synthesis initiated by proliferating cell nuclear antigen monoubiquitination or less well-characterized fork reversal and template switch mechanisms. Herein, we characterize a novel tolerance pathway requiring the tumor suppressor p53, the translesion polymerase ι (POLι), the ubiquitin ligase Rad5-related helicase-like transcription factor (HLTF), and the SWI/SNF catalytic subunit (SNF2) translocase zinc finger ran-binding domain containing 3 (ZRANB3). This novel p53 activity is lost in the exonuclease-deficient but transcriptionally active p53(H115N) mutant. Wild-type p53, but not p53(H115N), associates with POLι in vivo. Strikingly, the concerted action of p53 and POLι decelerates nascent DNA elongation and promotes HLTF/ZRANB3-dependent recombination during unperturbed DNA replication. Particularly after cross-linker-induced replication stress, p53 and POLι also act together to promote meiotic recombination enzyme 11 (MRE11)-dependent accumulation of (phospho-)replication protein A (RPA)-coated ssDNA. These results implicate a direct role of p53 in the processing of replication forks encountering obstacles on the template strand. Our findings define an unprecedented function of p53 and POLι in the DNA damage response to endogenous or exogenous replication stress.

  20. Affective and cognitive factors influencing sensitivity to probabilistic information.

    PubMed

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  1. Damage Tolerance Assessment of Friction Pull Plug Welds in an Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of cryogenic propellant tanks. Self-reacting friction stir welding is one variation of the friction stir weld process being developed for manufacturing tanks. Friction pull plug welding is used to seal the exit hole that remains in a circumferential self-reacting friction stir weld. A friction plug weld placed in a self-reacting friction stir weld results in a non-homogenous weld joint where the initial weld, plug weld, their respective heat affected zones and the base metal all interact. The welded joint is a composite plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data relating residual strength capability to flaw size in an aluminum alloy friction plug weld will be presented.

  2. Meta-analysis of attitudes toward damage-causing mammalian wildlife.

    PubMed

    Kansky, Ruth; Kidd, Martin; Knight, Andrew T

    2014-08-01

    Many populations of threatened mammals persist outside formally protected areas, and their survival depends on the willingness of communities to coexist with them. An understanding of the attitudes, and specifically the tolerance, of individuals and communities and the factors that determine these is therefore fundamental to designing strategies to alleviate human-wildlife conflict. We conducted a meta-analysis to identify factors that affected attitudes toward 4 groups of terrestrial mammals. Elephants (65%) elicited the most positive attitudes, followed by primates (55%), ungulates (53%), and carnivores (44%). Urban residents presented the most positive attitudes (80%), followed by commercial farmers (51%) and communal farmers (26%). A tolerance to damage index showed that human tolerance of ungulates and primates was proportional to the probability of experiencing damage while elephants elicited tolerance levels higher than anticipated and carnivores elicited tolerance levels lower than anticipated. Contrary to conventional wisdom, experiencing damage was not always the dominant factor determining attitudes. Communal farmers had a lower probability of being positive toward carnivores irrespective of probability of experiencing damage, while commercial farmers and urban residents were more likely to be positive toward carnivores irrespective of damage. Urban residents were more likely to be positive toward ungulates, elephants, and primates when probability of damage was low, but not when it was high. Commercial and communal farmers had a higher probability of being positive toward ungulates, primates, and elephants irrespective of probability of experiencing damage. Taxonomic bias may therefore be important. Identifying the distinct factors explaining these attitudes and the specific contexts in which they operate, inclusive of the species causing damage, will be essential for prioritizing conservation investments. © 2014 The Authors. Conservation Biology

  3. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start-time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
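
    A minimal Python sketch of the scenario-generation core described above (Gaussian mixture fit of the forecasting errors, analytical mixture CDF, inverse-transform Monte Carlo sampling) is given below. The error data and the three-component mixture are hypothetical, and the ensemble forecasting step and the swinging door ramp extraction are omitted.

      import numpy as np
      from sklearn.mixture import GaussianMixture
      from scipy.stats import norm

      # Hypothetical historical forecast errors (MW); in practice these come from
      # comparing past point forecasts against measured wind power.
      rng = np.random.default_rng(0)
      errors = np.concatenate([rng.normal(-5, 2, 500), rng.normal(3, 6, 500)])

      # 1) Fit a continuous Gaussian mixture model to the error distribution.
      gmm = GaussianMixture(n_components=3, random_state=0).fit(errors.reshape(-1, 1))
      w = gmm.weights_
      mu = gmm.means_.ravel()
      sd = np.sqrt(gmm.covariances_).ravel()

      # 2) The mixture CDF follows analytically as a weighted sum of normal CDFs.
      def mixture_cdf(x):
          return np.sum(w * norm.cdf((x - mu) / sd))

      # 3) Inverse-transform sampling: draw uniforms and invert the CDF numerically
      #    (here by interpolation on a fine grid) to obtain error scenarios.
      grid = np.linspace(errors.min() - 20, errors.max() + 20, 4000)
      cdf_grid = np.array([mixture_cdf(x) for x in grid])
      u = rng.uniform(size=10000)
      error_scenarios = np.interp(u, cdf_grid, grid)

      # Adding each sampled error to the basic point forecast yields an ensemble of
      # wind power scenarios from which ramps can then be extracted.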

  4. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbopump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behavior of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.
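
    As a purely illustrative companion to the failure probabilities mentioned above (and not the NASA Lewis methodology itself, which relied on dedicated probabilistic structural analysis codes), the following sketch shows how a failure probability follows from quantified scatter once stress and strength are described by assumed lognormal distributions.

      import numpy as np

      # Illustrative Monte Carlo estimate of a failure probability, assuming the
      # quantified scatter in blade stress and strength is represented by two
      # hypothetical lognormal random variables (not the NASA Lewis models).
      rng = np.random.default_rng(1)
      n = 1_000_000
      stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)    # MPa
      strength = rng.lognormal(mean=np.log(420.0), sigma=0.08, size=n)  # MPa

      # Failure occurs when the effective stress reaches or exceeds the strength.
      p_failure = np.mean(stress >= strength)
      print(f"Estimated failure probability: {p_failure:.2e}")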

  5. Damage-Tolerant Polymer Composite Systems

    NASA Astrophysics Data System (ADS)

    Reifsnider, Kenneth L.

    1988-11-01

    One of the reasons for the rapid growth in the application of polymer composites is the opportunity they provide for the design and construction of composite structures that are especially resistant to losses of strength or reduced life resulting from damage during service. The usefulness of such materials is enhanced by the variety of reinforcement schemes that can be chosen to reflect specific service conditions. Under cyclic loading and demanding mechanical situations (e.g., helicopter parts, vehicle springs and high-speed rotors), polymer composites are considerably superior to competing materials.

  6. Constraints on the evolution of tolerance to herbicide in the common morning glory: resistance and tolerance are mutually exclusive.

    PubMed

    Baucom, Regina S; Mauricio, Rodney

    2008-11-01

    Evolutionary biologists explain the maintenance of intermediate levels of defense in plant populations as being due to trade-offs, or negative genetic covariances among ecologically important traits. Attempts at detecting trade-offs as constraints on the evolution of defense have not always been successful, leading some to conclude that such trade-offs rarely explain current levels of defense in the population. Using the agricultural pest Ipomoea purpurea, we measured correlations between traits involved in defense to glyphosate, the active ingredient in Roundup, a widely used herbicide. We found significant allocation costs of tolerance, as well as trade-offs between resistance and two measures of tolerance to glyphosate. Selection on resistance and tolerance exhibited differing patterns: tolerance to leaf damage was under negative directional selection, whereas resistance was under positive directional selection. The joint pattern of selection on resistance and tolerance to leaf damage indicated the presence of alternate peaks in the fitness landscape such that a combination of either high tolerance and low resistance, or high resistance and low tolerance was favored. The widespread use of this herbicide suggests that it is likely an important selective agent on weed populations. Understanding the evolutionary dynamics of herbicide defense traits is thus of increasing importance in the context of human-mediated evolution.

  7. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore, we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
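
    A schematic Python sketch of the performance-based combination step underlying the Kramer and Mayfield approach is shown below: the annual rate of liquefaction is obtained by weighting a conditional liquefaction-probability model by the joint annual rates of (PGA, magnitude) pairs taken from the PSHA disaggregation. The bin values, rates and the logistic conditional-probability function are placeholders, not the Cetin et al. or Boulanger and Idriss relationships used in the actual software.

      import numpy as np

      # Sketch of the performance-based combination step: the mean annual rate of
      # liquefaction is the conditional probability of liquefaction weighted by the
      # joint annual rates of the (PGA, magnitude) pairs from the PSHA disaggregation.
      pga_bins = np.array([0.05, 0.10, 0.20, 0.40])   # g, hypothetical bin centres
      mag_bins = np.array([5.0, 5.5, 6.0, 6.5])       # moment magnitude
      # Hypothetical incremental annual rates of each (PGA, M) pair.
      rate = np.array([[2e-2, 1e-2, 4e-3, 1e-3],
                       [6e-3, 4e-3, 2e-3, 8e-4],
                       [1e-3, 8e-4, 5e-4, 3e-4],
                       [1e-4, 9e-5, 6e-5, 4e-5]])

      def p_liq_given_pga_mag(pga, mag):
          """Placeholder logistic model; a real study would use the Cetin et al.
          (2004) or Boulanger and Idriss conditional-probability relationships."""
          return 1.0 / (1.0 + np.exp(-(8.0 * pga + 0.8 * (mag - 6.0) - 1.5)))

      rate_liq = sum(rate[i, j] * p_liq_given_pga_mag(pga_bins[i], mag_bins[j])
                     for i in range(len(pga_bins)) for j in range(len(mag_bins)))
      print(f"Return period of liquefaction ~ {1.0 / rate_liq:.0f} years")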

  8. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
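
    The sketch below illustrates only the basic idea behind importance sampling for small failure probabilities (sampling from a density shifted toward the failure region and re-weighting by the density ratio); it is not the adaptive importance sampling algorithm implemented in NASA/NESSUS, and the limit state is hypothetical.

      import numpy as np
      from scipy.stats import norm

      # Basic (non-adaptive) importance sampling for a small failure probability.
      # The limit state g(u) <= 0 in standard normal space is hypothetical.
      def g(u):
          return 4.0 - u  # failure when the standard normal variable exceeds 4

      rng = np.random.default_rng(2)
      n = 20_000
      shift = 4.0                                       # sample near the failure region
      u = rng.normal(loc=shift, scale=1.0, size=n)
      weights = norm.pdf(u) / norm.pdf(u, loc=shift)    # density ratio f(u)/h(u)
      p_f = np.mean((g(u) <= 0) * weights)

      print(f"Importance-sampling estimate: {p_f:.2e}  (exact: {norm.sf(4.0):.2e})")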

  9. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or insects...

  10. Damage Tolerance Behavior of Friction Stir Welds in Aluminum Alloys

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of various aerospace structures. Self-reacting and conventional friction stir welding are variations of the friction stir weld process employed in the fabrication of cryogenic propellant tanks which are classified as pressurized structure in many spaceflight vehicle architectures. In order to address damage tolerance behavior associated with friction stir welds in these safety critical structures, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data describing fracture behavior, residual strength capability, and cyclic mission life capability of friction stir welds at ambient and cryogenic temperatures have been generated and will be presented in this paper. Fracture behavior will include fracture toughness and tearing (R-curve) response of the friction stir welds. Residual strength behavior will include an evaluation of the effects of lack of penetration on conventional friction stir welds, the effects of internal defects (wormholes) on self-reacting friction stir welds, and an evaluation of the effects of fatigue cycled surface cracks on both conventional and self-reacting welds. Cyclic mission life capability will demonstrate the effects of surface crack defects on service load cycle capability. The fracture data will be used to evaluate nondestructive inspection and proof test requirements for the welds.

  11. On the monitoring and implications of growing damages caused by manufacturing defects in composite structures

    NASA Astrophysics Data System (ADS)

    Schagerl, M.; Viechtbauer, C.; Hörrmann, S.

    2015-07-01

    Damage tolerance is a classical safety concept for the design of aircraft structures. Basically, this approach considers possible damages in the structure, predicts the damage growth under applied loading conditions and predicts the following decrease of the structural strength. As a fundamental result, the damage tolerance approach yields the maximum inspection interval, which is the time a damage grows from a detectable to a critical level. The above formulation of the damage tolerance safety concept targets metallic structures where the damage is typically a simple fatigue crack. Fiber-reinforced polymers show a much more complex damage behavior, such as delaminations in laminated composites. Moreover, progressive damage in composites is often initiated by manufacturing defects. The complex manufacturing processes for composite structures almost certainly yield parts with defects, e.g. pores in the matrix or undulations of fibers. From such defects, growing damages may start after a certain time of operation. The demand to simplify or even avoid the inspection of composite structures has therefore led to a comeback of the traditional safe-life safety concept. The aim of the so-called safe-life flaw tolerance concept is a structure that is capable of carrying the static loads during operation, despite significant damages and after a representative fatigue load spectrum. A structure with this property does not need to be inspected, respectively monitored, at all during its service life. However, its load carrying capability is thereby not fully utilized. This article presents a possible refinement of the state-of-the-art safe-life flaw tolerance concept for composite structures towards a damage tolerance approach considering also the influence of manufacturing defects on damage initiation and growth. Based on fundamental physical relations and experimental observations, the challenges when developing damage growth and residual strength curves are discussed.

  12. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  13. Mechanisms of DNA damage, repair and mutagenesis

    PubMed Central

    Chatterjee, Nimrat; Walker, Graham C.

    2017-01-01

    Living organisms are continuously exposed to a myriad of DNA damaging agents that can impact health and modulate disease-states. However, robust DNA repair and damage-bypass mechanisms faithfully protect the DNA by either removing or tolerating the damage to ensure an overall survival. Deviations in this fine-tuning are known to destabilize cellular metabolic homeostasis, as exemplified in diverse cancers where disruption or deregulation of DNA repair pathways results in genome instability. Because routinely used biological, physical and chemical agents impact human health, testing their genotoxicity and regulating their use have become important. In this introductory review, we will delineate mechanisms of DNA damage and the counteracting repair/tolerance pathways to provide insights into the molecular basis of genotoxicity in cells that lays the foundation for subsequent articles in this issue. PMID:28485537

  14. A probabilistic atlas of the cerebellar white matter.

    PubMed

    van Baarsen, K M; Kleinnijenhuis, M; Jbabdi, S; Sotiropoulos, S N; Grotenhuis, J A; van Cappellen van Walsum, A M

    2016-01-01

    Imaging of the cerebellar cortex, deep cerebellar nuclei and their connectivity is gaining attention, due to the important role the cerebellum plays in cognition and motor control. Atlases of the cerebellar cortex and nuclei are used to locate regions of interest in clinical and neuroscience studies. However, the white matter that connects these relay stations is of at least similar functional importance. Damage to these cerebellar white matter tracts may lead to serious language, cognitive and emotional disturbances, although the pathophysiological mechanism behind it is still debated. Differences in white matter integrity between patients and controls might shed light on structure-function correlations. A probabilistic parcellation atlas of the cerebellar white matter would help these studies by facilitating automatic segmentation of the cerebellar peduncles, the localization of lesions and the comparison of white matter integrity between patients and controls. In this work a digital three-dimensional probabilistic atlas of the cerebellar white matter is presented, based on high quality 3T, 1.25 mm resolution diffusion MRI data from 90 subjects participating in the Human Connectome Project. The white matter tracts were estimated using probabilistic tractography. Results over 90 subjects were symmetrical and trajectories of superior, middle and inferior cerebellar peduncles resembled the anatomy as known from anatomical studies. This atlas will contribute to a better understanding of cerebellar white matter architecture. It may eventually aid in defining structure-function correlations in patients with cerebellar disorders. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... have more than 3 times the tolerance specified, except that at least three defective apples may be...

  16. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... have more than 3 times the tolerance specified, except that at least three defective apples may be...

  17. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... have more than 3 times the tolerance specified, except that at least three defective apples may be...

  18. Probabilistic Flexural Fatigue in Plain and Fiber-Reinforced Concrete

    PubMed Central

    Ríos, José D.

    2017-01-01

    The objective of this work is two-fold. First, we attempt to fit the experimental data on the flexural fatigue of plain and fiber-reinforced concrete with a probabilistic model (Saucedo, Yu, Medeiros, Zhang and Ruiz, Int. J. Fatigue, 2013, 48, 308–318). This model was validated for compressive fatigue at various loading frequencies, but not for flexural fatigue. Since the model is probabilistic, it is not necessarily related to the specific mechanism of fatigue damage, but rather generically explains the fatigue distribution in concrete (plain or reinforced with fibers) for damage under compression, tension or flexion. In this work, more than 100 series of flexural fatigue tests in the literature are fit with excellent results. Since the distribution of monotonic tests was not available in the majority of cases, a two-step procedure is established to estimate the model parameters based solely on fatigue tests. The coefficient of regression was more than 0.90 except for particular cases where not all tests were strictly performed under the same loading conditions, which confirms the applicability of the model to flexural fatigue data analysis. Moreover, the model parameters are closely related to fatigue performance, which demonstrates the predictive capacity of the model. For instance, the scale parameter is related to flexural strength, which improves with the addition of fibers. Similarly, fiber increases the scattering of fatigue life, which is reflected by the decreasing shape parameter. PMID:28773123

  19. Probabilistic Flexural Fatigue in Plain and Fiber-Reinforced Concrete.

    PubMed

    Ríos, José D; Cifuentes, Héctor; Yu, Rena C; Ruiz, Gonzalo

    2017-07-07

    The objective of this work is two-fold. First, we attempt to fit the experimental data on the flexural fatigue of plain and fiber-reinforced concrete with a probabilistic model (Saucedo, Yu, Medeiros, Zhang and Ruiz, Int. J. Fatigue, 2013, 48, 308-318). This model was validated for compressive fatigue at various loading frequencies, but not for flexural fatigue. Since the model is probabilistic, it is not necessarily related to the specific mechanism of fatigue damage, but rather generically explains the fatigue distribution in concrete (plain or reinforced with fibers) for damage under compression, tension or flexion. In this work, more than 100 series of flexural fatigue tests in the literature are fit with excellent results. Since the distribution of monotonic tests was not available in the majority of cases, a two-step procedure is established to estimate the model parameters based solely on fatigue tests. The coefficient of regression was more than 0.90 except for particular cases where not all tests were strictly performed under the same loading conditions, which confirms the applicability of the model to flexural fatigue data analysis. Moreover, the model parameters are closely related to fatigue performance, which demonstrates the predictive capacity of the model. For instance, the scale parameter is related to flexural strength, which improves with the addition of fibers. Similarly, fiber increases the scattering of fatigue life, which is reflected by the decreasing shape parameter.
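
    Since the model above characterizes fatigue life through scale and shape parameters, the following sketch fits a plain two-parameter Weibull distribution to hypothetical cycles-to-failure data as an illustration of the distributional idea; the Saucedo et al. model additionally links these parameters to the monotonic strength distribution and the loading frequency, which is not reproduced here.

      import numpy as np
      from scipy.stats import weibull_min

      # Hypothetical cycles-to-failure from one flexural fatigue series at a fixed
      # stress level and loading frequency.
      cycles = np.array([1.2e4, 3.5e4, 5.1e4, 8.7e4, 1.6e5, 2.4e5, 4.9e5, 1.1e6])

      # Two-parameter Weibull fit (location fixed at zero): the shape parameter
      # reflects the scatter of fatigue life, the scale parameter its central level.
      shape, loc, scale = weibull_min.fit(cycles, floc=0)
      print(f"shape k = {shape:.2f}, scale = {scale:.3g} cycles")

      # Probability of surviving N cycles under the fitted distribution.
      N = 1.0e5
      print(f"P(life > {N:.0e} cycles) = {weibull_min.sf(N, shape, loc=0, scale=scale):.2f}")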

  20. Specialists Meeting on Impact Damage Tolerance of Structures

    DTIC Science & Technology

    1976-01-01

    example, fatigue, non-detectable initial defects and in-flight damage such as that inflicted by military weapons or by debris from a disintegrating ... relative to many types of damaging mechanisms, including for example: 1. Fatigue 2. Non-detectable initial defects 3. In-flight damage, such as inflicted ... undetected flaw or defect. In both cases, the benefits of successful design are improved safety and economics. With respect to in-flight damage, ...

  1. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with the data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk becomes evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.
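
    One standard way of turning a deterministic utility theory into a probabilistic one (given here only as an illustration, not necessarily the exact specification of the three models compared above) is a logistic choice rule,

      P(A \succ B) = \frac{1}{1 + \exp\{-\theta\,[V(A) - V(B)]\}},

    where V(\cdot) is the deterministic valuation (for example a cumulative-prospect-theory value) and \theta is a sensitivity parameter: as \theta grows the deterministic prediction is recovered, while small \theta reproduces the choice inconsistencies described above.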

  2. Fault-tolerant clock synchronization in distributed systems

    NASA Technical Reports Server (NTRS)

    Ramanathan, Parameswaran; Shin, Kang G.; Butler, Ricky W.

    1990-01-01

    Existing fault-tolerant clock synchronization algorithms are compared and contrasted. These include the following: software synchronization algorithms, such as convergence-averaging, convergence-nonaveraging, and consistency algorithms, as well as probabilistic synchronization; hardware synchronization algorithms; and hybrid synchronization. The worst-case clock skews guaranteed by representative algorithms are compared, along with other important aspects such as time, message, and cost overhead imposed by the algorithms. More recent developments such as hardware-assisted software synchronization and algorithms for synchronizing large, partially connected distributed systems are especially emphasized.
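
    As an illustration of the convergence-averaging class, the sketch below implements a fault-tolerant midpoint convergence function in the style of Welch and Lynch (one representative of the algorithms surveyed, chosen here as an assumption): the f smallest and f largest clock readings are discarded and the midpoint of the remainder is used, which tolerates up to f Byzantine-faulty clocks when at least 3f + 1 readings are available.

      def fault_tolerant_midpoint(readings, f):
          """Convergence-averaging correction in the style of Welch-Lynch: drop the
          f smallest and f largest clock readings (which may come from faulty
          nodes) and take the midpoint of the remainder. Requires at least
          3*f + 1 readings to tolerate f Byzantine faults."""
          if len(readings) < 3 * f + 1:
              raise ValueError("need at least 3f + 1 readings to tolerate f faults")
          trimmed = sorted(readings)[f:len(readings) - f]
          return (trimmed[0] + trimmed[-1]) / 2.0

      # Example: offsets (ms) of peer clocks relative to the local clock, with one
      # faulty reading; the corrected offset ignores the outlier.
      offsets = [0.0, 1.2, -0.8, 0.5, 250.0, -1.1, 0.9]
      print(fault_tolerant_midpoint(offsets, f=1))   # -> 0.2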

  3. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  4. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation is a recently well-established methodology for evaluation of the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by the failure probability or by the reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty of the material properties, or their randomness obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete, corrosion of reinforcement, etc., can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety in time development. The results can serve as a rational basis for design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented
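
    For reference, the failure probability and reliability index mentioned above are related through the standard definitions

      P_f = P[g(\mathbf{X}) \le 0] = \int_{g(\mathbf{x}) \le 0} f_{\mathbf{X}}(\mathbf{x})\, d\mathbf{x}, \qquad \beta = -\Phi^{-1}(P_f),

    where g is the limit-state function evaluated by the randomized nonlinear finite element analysis (for instance resistance minus load effect), f_X is the joint density of the random inputs, and \Phi is the standard normal CDF.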

  5. Damage tolerant functionally graded materials for advanced wear and friction applications

    NASA Astrophysics Data System (ADS)

    Prchlik, Lubos

    The research work presented in this dissertation focused on processing effects, microstructure development, characterization and performance evaluation of composite and graded coatings used for friction and wear control. The following issues were addressed. (1) Definition of prerequisites for successful composite and graded coating formation by means of thermal spraying. (2) Improvement of characterization methods available for homogeneous thermally sprayed coatings and their extension to composite and graded materials. (3) Development of novel characterization methods specifically for FGMs, with a focus on through-thickness property measurement by indentation and in-situ curvature techniques. (4) Design of composite materials with improved properties compared to homogeneous coatings. (5) Fabrication and performance assessment of FGMs with improved wear and impact damage properties. Materials. The materials studied included several material systems relevant to low-friction and contact-damage-tolerant applications: Mo-Mo2C and WC-Co cermets, as materials commonly used in sliding components of industrial machinery, and NiCrAlY/8%-Yttria Partially Stabilized Zirconia composites, as a potential solution for abradable sections of gas turbines and aircraft engines. In addition, uniform coatings such as molybdenum and Ni5%Al alloy were evaluated as model systems to assess the influence of microstructure variation on the mechanical property and wear response. Methods. The contact response of the materials was investigated through several techniques. These included methods evaluating the relevant intrinsic coating properties, such as elastic modulus, residual stress, fracture toughness and scratch resistance, and tests measuring the abrasion and friction-sliding behavior. Dry-sand and wet two-body abrasion testing was performed in addition to traditional ball-on-disc sliding tests. Among all characterization techniques, the spherical indentation deserved most attention and enabled to

  6. Probabilistic inspection strategies for minimizing service failures

    NASA Technical Reports Server (NTRS)

    Brot, Abraham

    1994-01-01

    The INSIM computer program is described which simulates the 'limited fatigue life' environment in which aircraft structures generally operate. The use of INSIM to develop inspection strategies which aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds and customized inspections are simulated using the probability of failure as the driving parameter.

  7. Stochastic damage evolution in textile laminates

    NASA Technical Reports Server (NTRS)

    Dzenis, Yuris A.; Bogdanovich, Alexander E.; Pastore, Christopher M.

    1993-01-01

    A probabilistic model utilizing random material characteristics to predict damage evolution in textile laminates is presented. The model is based on a division of each ply into two sublaminas consisting of cells. The probability of cell failure is calculated using stochastic function theory and a maximal strain failure criterion. Three modes of failure, i.e. fiber breakage, matrix failure in the transverse direction, and matrix or interface shear cracking, are taken into account. Computed failure probabilities are utilized in reducing cell stiffness based on the mesovolume concept. A numerical algorithm is developed to predict the damage evolution and deformation history of textile laminates. The effect of scatter in fiber orientation on cell properties is discussed. The influence of the weave on damage accumulation is illustrated with an example of a Kevlar/epoxy laminate.
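
    Under the maximal strain criterion, the cell failure probability can be illustrated with a simplifying assumption (made only for this sketch, not the authors' stochastic-function formulation) that the acting strain and the ultimate strain of a cell are independent normal variables:

      P_f = P[\varepsilon \ge \varepsilon_u] = 1 - \Phi\!\left( \frac{\mu_{\varepsilon_u} - \mu_{\varepsilon}}{\sqrt{\sigma_{\varepsilon_u}^2 + \sigma_{\varepsilon}^2}} \right).

    The computed probabilities then feed the stiffness-reduction step based on the mesovolume concept.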

  8. Students’ difficulties in probabilistic problem-solving

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. This study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. These data were analyzed descriptively using the Miles and Huberman steps. The results show that students' difficulties in solving probabilistic problems fall into three categories. The first relates to difficulties in understanding the probabilistic problem. The second concerns difficulties in choosing and using appropriate strategies for solving the problem. The third involves difficulties with the computational process. The results suggest that students are not yet able to use their knowledge and abilities to respond to probabilistic problems. Therefore, it is important for mathematics teachers to plan probabilistic learning that could optimize students' probabilistic thinking ability.

  9. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
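
    In its usual deterministic form (one common sign convention, with the displacement boundary term omitted), the three-field Hu-Washizu functional reads

      \Pi_{HW}(u, \varepsilon, \sigma) = \int_{\Omega} \left[ \tfrac{1}{2}\, \varepsilon : C : \varepsilon + \sigma : (\nabla_s u - \varepsilon) - b \cdot u \right] d\Omega - \int_{\Gamma_t} \bar{t} \cdot u \, d\Gamma,

    whose stationarity conditions return the constitutive law, compatibility and equilibrium; in the probabilistic formulation each of C, b, \bar{t}, the domain and the boundary data may be treated as a random variable or field.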

  10. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and fabrication process, through composite mechanics, to structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  11. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDF) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte-Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.

  12. Probabilistic metrology or how some measurement outcomes render ultra-precise estimates

    NASA Astrophysics Data System (ADS)

    Calsamiglia, J.; Gendra, B.; Muñoz-Tapia, R.; Bagan, E.

    2016-10-01

    We show on theoretical grounds that, even in the presence of noise, probabilistic measurement strategies (which have a certain probability of failure or abstention) can provide, upon a heralded successful outcome, estimates with a precision that exceeds the deterministic bounds for the average precision. This establishes a new ultimate bound on the phase estimation precision of particular measurement outcomes (or sequence of outcomes). For probe systems subject to local dephasing, we quantify such precision limit as a function of the probability of failure that can be tolerated. Our results show that the possibility of abstaining can set back the detrimental effects of noise.

  13. Tolerance to deer herbivory and resistance to insect herbivores in the common evening primrose (Oenothera biennis).

    PubMed

    Puentes, A; Johnson, M T J

    2016-01-01

    The evolution of plant defence in response to herbivory will depend on the fitness effects of damage, availability of genetic variation and potential ecological and genetic constraints on defence. Here, we examine the potential for evolution of tolerance to deer herbivory in Oenothera biennis while simultaneously considering resistance to natural insect herbivores. We examined (i) the effects of deer damage on fitness, (ii) the presence of genetic variation in tolerance and resistance, (iii) selection on tolerance, (iv) genetic correlations with resistance that could constrain evolution of tolerance and (v) plant traits that might predict defence. In a field experiment, we simulated deer damage occurring early and late in the season, recorded arthropod abundances, flowering phenology and measured growth rate and lifetime reproduction. Our study showed that deer herbivory has a negative effect on fitness, with effects being more pronounced for late-season damage. Selection acted to increase tolerance to deer damage, yet there was low and nonsignificant genetic variation in this trait. In contrast, there was substantial genetic variation in resistance to insect herbivores. Resistance was genetically uncorrelated with tolerance, whereas positive genetic correlations in resistance to insect herbivores suggest there exists diffuse selection on resistance traits. In addition, growth rate and flowering time did not predict variation in tolerance, but flowering phenology was genetically correlated with resistance. Our results suggest that deer damage has the potential to exert selection because browsing reduces plant fitness, but limited standing genetic variation in tolerance is expected to constrain adaptive evolution in O. biennis. © 2015 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2015 European Society For Evolutionary Biology.

  14. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables, cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.

  15. Multidisciplinary Optimization and Damage Tolerance of Stiffened Structures

    NASA Astrophysics Data System (ADS)

    Jrad, Mohamed

    interest. Buckling analysis of a composite panel with attached longitudinal stiffeners under compressive loads is performed using Ritz method with trigonometric functions. Results are then compared to those from Abaqus FEA for different shell elements. The case of composite panel with one, two, and three stiffeners is investigated. The effect of the distance between the stiffeners on the buckling load is also studied. The variation of the buckling load and buckling modes with the stiffeners' height is investigated. It is shown that there is an optimum value of stiffeners' height beyond which the structural response of the stiffened panel is not improved and the buckling load does not increase. Furthermore, there exist different critical values of stiffener's height at which the buckling mode of the structure changes. Next, buckling analysis of a composite panel with two straight stiffeners and a crack at the center is performed. Finally, buckling analysis of a composite panel with curvilinear stiffeners and a crack at the center is also conducted. Results show that panels with a larger crack have a reduced buckling load and that the buckling load decreases slightly when using higher order 2D shell FEM elements. A damage tolerance framework, EBF3PanelOpt, has been developed to design and analyze curvilinearly stiffened panels. The framework is written with the scripting language Python and it interacts with the commercial software MSC. Patran (for geometry and mesh creation), MSC. Nastran (for finite element analysis), and MSC. Marc (for damage tolerance analysis). The crack location is set to the location of the maximum value of the major principal stress while its orientation is set normal to the major principal axis direction. The effective stress intensity factor is calculated using the Virtual Crack Closure Technique and compared to the fracture toughness of the material in order to decide whether the crack will expand or not. The ratio of these two quantities is used
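
    For the mode-I contribution, the virtual crack closure expressions take the generic textbook form (the exact implementation in EBF3PanelOpt may differ)

      G_I = \frac{F_y\,\Delta v}{2\,\Delta a\, b}, \qquad K_{\mathrm{eff}} = \sqrt{E'\,(G_I + G_{II})}, \qquad \text{growth predicted if } K_{\mathrm{eff}} / K_{IC} \ge 1,

    where F_y is the crack-tip nodal force, \Delta v the relative opening displacement of the node pair behind the tip, \Delta a the crack-tip element length, b the thickness, E' the effective elastic modulus, and K_{IC} the fracture toughness.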

  16. Conducting field trials for frost tolerance breeding in cereals.

    PubMed

    Cattivelli, Luigi

    2014-01-01

    Cereal species can be damaged by frost either during winter or at flowering stage. Frost tolerance per se is only a part of the mechanisms that allow the plants to survive during winter; winterhardiness also considers other biotic or physical stresses that challenge the plants during the winter season limiting their survival rate. While frost tolerance can also be tested in controlled environments, winterhardiness can be determined only with field evaluations. Post-heading frost damage occurs from radiation frost events in spring during the reproductive stages. A reliable evaluation of winterhardiness or of post-heading frost damage should be carried out with field trials replicated across years and locations to overcome the irregular occurrence of natural conditions which satisfactorily differentiate genotypes. The evaluation of post-heading frost damage requires a specific attention to plant phenology. The extent of frost damage is usually determined with a visual score at the end of the winter.

  17. Recent Advances in Composite Damage Mechanics

    NASA Technical Reports Server (NTRS)

    Reifsnider, Ken; Case, Scott; Iyengar, Nirmal

    1996-01-01

    The state of the art and recent developments in the field of composite material damage mechanics are reviewed, with emphasis on damage accumulation. The kinetics of damage accumulation are considered with emphasis on the general accumulation of discrete local damage events such as single or multiple fiber fractures or microcrack formation. The issues addressed include: how to define strength in the presence of widely distributed damage, and how to combine mechanical representations in order to predict the damage tolerance and life of engineering components. It is shown that a damage mechanics approach can be related to the thermodynamics of the damage accumulation processes in composite laminates subjected to mechanical loading and environmental conditions over long periods of time.

  18. Against all odds -- Probabilistic forecasts and decision making

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Zappa, Massimiliano

    2015-04-01

    In the city of Zurich (Switzerland), the setting is such that the damage potential due to flooding of the river Sihl is estimated at about 5 billion US dollars. The flood forecasting system used by the administration for decision making has run continuously since 2007. It has a time horizon of max. five days and operates at hourly time steps. The flood forecasting system includes three different model chains. Two of those are run by the deterministic NWP models COSMO-2 and COSMO-7 and one is driven by the probabilistic NWP COSMO-Leps. The model chains have been consistent since February 2010, so five full years are available for the evaluation of the system. The system was evaluated continuously and is a very nice example to present the added value that lies in probabilistic forecasts. The forecasts are available on an online platform to the decision makers. Several graphical representations of the forecasts and forecast history are available to support decision making and to rate the current situation. The communication between forecasters and decision makers is quite close. In short, an ideal situation. However, an event, or rather a non-event, in summer 2014 showed that knowledge about the general superiority of probabilistic forecasts doesn't necessarily mean that the decisions taken in a specific situation will be based on that probabilistic forecast. Some years of experience allow gaining confidence in the system, both for the forecasters and for the decision makers. Even if, from the theoretical point of view, the handling during crisis situations is well designed, a first event demonstrated that the dialog with the decision makers still lacks exercise during such situations. We argue that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also fit, but, in our case, we are very happy not to report on this option.

  19. Probabilistic classifiers with high-dimensional data

    PubMed Central

    Kim, Kyung In; Simon, Richard

    2011-01-01

    For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n, large p classification problems despite their importance in medical decision making. In this paper, we introduce 2 criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed 2 extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated or at least not “anticonservative” using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
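
    A minimal Python sketch of a calibration check of the kind described above (binning cross-validated class probabilities and comparing mean predictions with observed class frequencies) is given below; the predictions and labels are simulated, and the exact measures proposed in the paper are not reproduced.

      import numpy as np

      def calibration_table(pred_prob, y_true, n_bins=10):
          """Bin predicted class-1 probabilities and compare the mean prediction in
          each bin with the observed class-1 frequency; large gaps where the
          observed frequency falls below the prediction indicate an
          anticonservative, poorly calibrated probabilistic classifier."""
          bins = np.linspace(0.0, 1.0, n_bins + 1)
          idx = np.clip(np.digitize(pred_prob, bins) - 1, 0, n_bins - 1)
          rows = []
          for b in range(n_bins):
              mask = idx == b
              if mask.any():
                  rows.append((pred_prob[mask].mean(), y_true[mask].mean(), mask.sum()))
          return rows  # (mean predicted, observed frequency, count) per bin

      # Hypothetical cross-validated predictions and labels; here the simulated
      # "classifier" is perfectly calibrated by construction.
      rng = np.random.default_rng(3)
      p = rng.uniform(size=500)
      y = rng.binomial(1, p)
      for pred, obs, n in calibration_table(p, y):
          print(f"predicted {pred:.2f}  observed {obs:.2f}  (n={n})")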

  20. Honey Bee (Apis mellifera) Drones Survive Oxidative Stress due to Increased Tolerance instead of Avoidance or Repair of Oxidative Damage

    PubMed Central

    Li-Byarlay, Hongmei; Huang, Ming Hua; Simone-Finstrom, Michael; Strand, Micheline K.; Tarpy, David R.; Rueppell, Olav

    2016-01-01

    Oxidative stress can lead to premature aging symptoms and cause acute mortality at higher doses in a range of organisms. Oxidative stress resistance and longevity are mechanistically and phenotypically linked; considerable variation in oxidative stress resistance exists among and within species and typically covaries with life expectancy. However, it is unclear whether stress-resistant, long-lived individuals avoid, repair, or tolerate molecular damage to survive longer than others. The honey bee (Apis mellifera L.) is an emerging model system that is well-suited to address this question. Furthermore, this species is the most economically important pollinator, whose health may be compromised by pesticide exposure, including oxidative stressors. Here, we develop a protocol for inducing oxidative stress in honey bee males (drones) via Paraquat injection. After injection, individuals from different colony sources were kept in common social conditions to monitor their survival compared to saline-injected controls. Oxidative stress was measured in susceptible and resistant individuals. Paraquat drastically reduced survival but individuals varied in their resistance to treatment within and among colony sources. Longer-lived individuals exhibited higher levels of lipid peroxidation than individuals dying early. In contrast, the level of protein carbonylation was not significantly different between the two groups. This first study of oxidative stress in male honey bees suggests that survival of an acute oxidative stressor is due to tolerance, not prevention or repair, of oxidative damage to lipids. It also demonstrates colony differences in oxidative stress resistance that might be useful for breeding stress-resistant honey bees. PMID:27422326

  1. Honey bee (Apis mellifera) drones survive oxidative stress due to increased tolerance instead of avoidance or repair of oxidative damage.

    PubMed

    Li-Byarlay, Hongmei; Huang, Ming Hua; Simone-Finstrom, Michael; Strand, Micheline K; Tarpy, David R; Rueppell, Olav

    2016-10-01

    Oxidative stress can lead to premature aging symptoms and cause acute mortality at higher doses in a range of organisms. Oxidative stress resistance and longevity are mechanistically and phenotypically linked; considerable variation in oxidative stress resistance exists among and within species and typically covaries with life expectancy. However, it is unclear whether stress-resistant, long-lived individuals avoid, repair, or tolerate molecular damage to survive longer than others. The honey bee (Apis mellifera L.) is an emerging model system that is well-suited to address this question. Furthermore, this species is the most economically important pollinator, whose health may be compromised by pesticide exposure, including oxidative stressors. Here, we develop a protocol for inducing oxidative stress in honey bee males (drones) via Paraquat injection. After injection, individuals from different colony sources were kept in common social conditions to monitor their survival compared to saline-injected controls. Oxidative stress was measured in susceptible and resistant individuals. Paraquat drastically reduced survival but individuals varied in their resistance to treatment within and among colony sources. Longer-lived individuals exhibited higher levels of lipid peroxidation than individuals dying early. In contrast, the level of protein carbonylation was not significantly different between the two groups. This first study of oxidative stress in male honey bees suggests that survival of an acute oxidative stressor is due to tolerance, not prevention or repair, of oxidative damage to lipids. It also demonstrates colony differences in oxidative stress resistance that might be useful for breeding stress-resistant honey bees. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Exogenous nitric oxide improves salt tolerance during establishment of Jatropha curcas seedlings by ameliorating oxidative damage and toxic ion accumulation.

    PubMed

    Gadelha, Cibelle Gomes; Miranda, Rafael de Souza; Alencar, Nara Lídia M; Costa, José Hélio; Prisco, José Tarquinio; Gomes-Filho, Enéas

    2017-05-01

    Jatropha curcas is an oilseed species that is considered an excellent alternative energy source for fossil-based fuels for growing in arid and semiarid regions, where salinity is becoming a stringent problem to crop production. Our working hypothesis was that nitric oxide (NO) priming enhances salt tolerance of J. curcas during early seedling development. Under NaCl stress, seedlings arising from NO-treated seeds showed lower accumulation of Na+ and Cl- than those salinized seedlings only, which was consistent with a better growth for all analyzed time points. Also, although salinity promoted a significant increase in hydrogen peroxide (H2O2) content and membrane damage, the harmful effects were less aggressive in NO-primed seedlings. The lower oxidative damage in NO-primed stressed seedlings was attributed to operation of a powerful antioxidant system, including greater glutathione (GSH) and ascorbate (AsA) contents as well as catalase (CAT) and glutathione reductase (GR) enzyme activities in both endosperm and embryo axis. Priming with NO also was found to rapidly up-regulate the JcCAT1, JcCAT2, JcGR1 and JcGR2 gene expression in embryo axis, suggesting that NO-induced salt responses include functional and transcriptional regulations. Thus, NO almost completely abolished the deleterious salinity effects on reserve mobilization and seedling growth. In conclusion, NO priming improves salt tolerance of J. curcas during seedling establishment by inducing an effective antioxidant system and limiting toxic ion and reactive oxygen species (ROS) accumulation. Copyright © 2017 Elsevier GmbH. All rights reserved.

  3. Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang

    2016-12-01

    Centrifugal compressor often suffers various defects such as impeller cracking, resulting in forced outage of the total plant. Damage diagnostics and condition monitoring of such a turbomachinery system has become an increasingly important and powerful tool to prevent potential failure in components and reduce unplanned forced outage and further maintenance costs, while improving reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients. These coefficients represent multiple time-frequency resolutions of a signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficient to remove possible imperfections. The ratio of posterior odds Bayesian approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density estimated by the Welch method is utilized to evaluate the effectiveness of Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework is demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration
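
    A rough sketch of the pipeline shape is given below: wavelet-packet decomposition of each sensor signal, a simple universal threshold standing in for the paper's Bayesian posterior-odds test, a Welch PSD sanity check, and ordinary PCA standing in for probabilistic PCA. The synthetic signals and the pywt/scipy/scikit-learn choices are assumptions for illustration only, not the published method.

    ```python
    # Sketch of the pipeline shape only: wavelet-packet decomposition, soft
    # thresholding in place of the Bayesian posterior-odds test, a Welch PSD
    # check, and PCA across sensors. Signals are synthetic stand-ins.
    import numpy as np
    import pywt
    from scipy.signal import welch
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    fs, t = 1024, np.arange(0, 2, 1 / 1024)
    signals = [np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size) for _ in range(4)]

    def wp_denoise(x, wavelet="db4", level=3):
        """Decompose into wavelet-packet coefficients and soft-threshold each node."""
        wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric", maxlevel=level)
        for node in wp.get_level(level, order="natural"):
            thr = np.median(np.abs(node.data)) / 0.6745 * np.sqrt(2 * np.log(node.data.size))
            node.data = pywt.threshold(node.data, thr, mode="soft")
        return wp.reconstruct(update=False)[: x.size]

    cleaned = np.vstack([wp_denoise(x) for x in signals])

    # Welch PSD of one channel after cleansing, as a sanity check on the tone at 50 Hz.
    f, p_cln = welch(cleaned[0], fs=fs)
    print("dominant frequency (Hz):", f[np.argmax(p_cln)])

    # PCA across channels as a stand-in for probabilistic PCA dimensionality reduction.
    scores = PCA(n_components=2).fit_transform(cleaned.T)
    print("reduced feature shape:", scores.shape)
    ```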

  4. Probabilistic Model for Laser Damage to the Human Retina

    DTIC Science & Technology

    2012-03-01

    the beam. Power density may be measured in radiant exposure, J/cm2, or by irradiance, W/cm2. In the experimental database used in this study and...to quantify a binary response, either lethal or non-lethal, within a population such as insects or rats. In directed energy research, probit...value of the normalized Arrhenius damage integral. In a one-dimensional simulation, the source term is determined as a spatially averaged irradiance (W

  5. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle Main Engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  6. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  7. Demonstration of the Application of Composite Load Spectra (CLS) and Probabilistic Structural Analysis (PSAM) Codes to SSME Heat Exchanger Turnaround Vane

    NASA Technical Reports Server (NTRS)

    Rajagopal, Kadambi R.; DebChaudhury, Amitabha; Orient, George

    2000-01-01

    This report describes a probabilistic structural analysis performed to determine the probabilistic structural response under fluctuating random pressure loads for the Space Shuttle Main Engine (SSME) turnaround vane. It uses a newly developed frequency and distance dependent correlation model that has features to model the decay phenomena along the flow and across the flow with the capability to introduce a phase delay. The analytical results are compared using two computer codes SAFER (Spectral Analysis of Finite Element Responses) and NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) and with experimentally observed strain gage data. The computer code NESSUS with an interface to a subset of the Composite Load Spectra (CLS) code is used for the probabilistic analysis. A fatigue code was used to calculate fatigue damage due to the random pressure excitation. The random variables modeled include engine system primitive variables that influence the operating conditions, convection velocity coefficient, stress concentration factor, structural damping, and thickness of the inner and outer vanes. The need for an appropriate correlation model in addition to magnitude of the PSD is emphasized. The study demonstrates that correlation characteristics even under random pressure loads are capable of causing resonance-like effects for some modes. The study identifies the important variables that contribute to the structural alternating stress response and drive the fatigue damage for the new design. Since the alternating stress for the new redesign is less than the endurance limit for the material, the damage due to high-cycle fatigue is negligible.

  8. Tolerance of Cottonwood to Damage by Cottonwood Leaf Beetle

    Treesearch

    F. L. Oliveria; D. T. Cooper

    1977-01-01

    Wide variation in tolerance to the cottonwood leaf beetle was found in fourteen hundred eastern cottonwood clones, originating from 36 young natural stands along the Mississippi River from Memphis, Tennessee, to Baton Rouge, Louisiana. Expected genetic gains were large enough to justify further research.

  9. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
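
    The sketch below illustrates the basic probabilistic bias analysis loop for differential exposure misclassification: sensitivity and specificity are drawn from prior distributions, the observed 2x2 counts are back-corrected, and the distribution of bias-adjusted odds ratios is summarized. The counts and Beta priors are illustrative assumptions, not values from the paper.

    ```python
    # Sketch of probabilistic bias analysis for differential exposure
    # misclassification in a case-control study (illustrative counts and priors).
    import numpy as np

    rng = np.random.default_rng(2)
    a, b = 215, 1449   # observed exposed / unexposed cases (illustrative)
    c, d = 668, 4296   # observed exposed / unexposed controls (illustrative)

    draws = []
    for _ in range(20000):
        # Differential misclassification: different Se/Sp priors for cases vs controls.
        se_ca, sp_ca = rng.beta(80, 20), rng.beta(95, 5)
        se_co, sp_co = rng.beta(75, 25), rng.beta(95, 5)
        # Back-correct the observed counts to "true" counts.
        A = (a - (1 - sp_ca) * (a + b)) / (se_ca + sp_ca - 1)
        C = (c - (1 - sp_co) * (c + d)) / (se_co + sp_co - 1)
        B, D = (a + b) - A, (c + d) - C
        if min(A, B, C, D) > 0:
            draws.append((A * D) / (B * C))   # bias-adjusted odds ratio

    draws = np.array(draws)
    print("median OR:", np.median(draws).round(2),
          "95% simulation interval:", np.percentile(draws, [2.5, 97.5]).round(2))
    ```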

  10. Homologous Recombination and Translesion DNA Synthesis Play Critical Roles on Tolerating DNA Damage Caused by Trace Levels of Hexavalent Chromium

    PubMed Central

    Chen, Youjun; Zhou, Yi-Hui; Neo, Dayna; Clement, Jean; Takata, Minoru; Takeda, Shunichi; Sale, Julian; Wright, Fred A.; Swenberg, James A.; Nakamura, Jun

    2016-01-01

    Contamination of drinking water by potentially carcinogenic hexavalent chromium (Cr(VI)) is a major public health concern worldwide. However, little information is available regarding the biological effects of nanomolar amounts of Cr(VI). Here, we investigated the genotoxic effects of Cr(VI) at nanomolar levels and their repair pathways. We found that a DNA damage response, analyzed based on the differential toxicity of isogenic cells deficient in various DNA repair proteins, is observed after a three-day incubation with K2CrO4 in REV1-deficient DT40 cells at 19.2 μg/L or higher, as well as in TK6 cells deficient in polymerase delta subunit 3 (POLD3) at 9.8 μg/L or higher. The genotoxicity of Cr(VI) decreased ~3000 times when the incubation time was reduced from three days to ten minutes. The TK mutation rate also decreased significantly when the exposure to Cr(VI) was reduced from 6 days to 1 day. The DNA damage response analysis suggests that DNA repair pathways, including homologous recombination and the REV1- and POLD3-mediated error-prone translesion synthesis pathways, are critical for cells to tolerate DNA damage caused by trace amounts of Cr(VI). PMID:27907204

  11. Tensile strength of composite sheets with unidirectional stringers and crack-like damage

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.

    1984-01-01

    The damage tolerance characteristics of metal tension panels with riveted and bonded stringers are well known. The stringers arrest unstable cracks and retard propagation of fatigue cracks. Residual strengths and fatigue lives are considerably greater than those of unstiffened or integrally stiffened sheets. The damage tolerance of composite sheets with bonded composite stringers loaded in tension was determined. Cracks in composites do not readily propagate in fatigue, at least not through fibers. Moreover, the residual strength of notched composites is sometimes even increased by fatigue loading. Therefore, the residual strength aspect of damage tolerance, and not fatigue crack propagation, was investigated. About 50 graphite/epoxy composite panels were made with two sheet layups and several stringer configurations. Crack-like slots were cut in the middle of the panels to simulate damage. The panels were instrumented and monotonically loaded in tension to failure. The tests indicate that the composite panels have considerable damage tolerance, much like metal panels. The stringers arrested cracks that ran from the crack-like slots, and the residual strengths were considerably greater than those of unstiffened composite sheets. A stress intensity factor analysis was developed to predict the failing strains of the stiffened panels. Using the analysis, a single design curve was produced for composite sheets with bonded stringers of any configuration.

  12. Seed tolerance to predation: Evidence from the toxic seeds of the buckeye tree (Aesculus californica; Sapindaceae).

    PubMed

    Mendoza, Eduardo; Dirzo, Rodolfo

    2009-07-01

    Tolerance, the capacity of plants to withstand attack by animals, as opposed to resistance, has been poorly examined in the context of seed predation. We investigated the role that the seed mass of the large-seeded endemic tree Aesculus californica plays as a tolerance trait to rodent attack by comparing, under greenhouse conditions, patterns of germination, and subsequent seedling growth, of seeds with a wide range of natural damage. Germination percentage was reduced by 50% and time to germination by 64% in attacked compared to intact seeds, and germination probability was negatively correlated with damage. Seedlings that emerged from intact seeds were taller and bore more leaves than those from damaged seeds. This species' large seed mass favors tolerance to damage because heavily damaged seeds are able to germinate and produce seedlings. This finding is significant given that seeds of this species are known to contain chemical compounds toxic to vertebrates, a resistance trait. We posit that this combination of tolerance and resistance traits might be a particularly effective antipredation strategy when seeds are exposed to a variety of vertebrate predators.

  13. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening effects). Due to the inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models of metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology incorporates an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance the computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
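
    A minimal Monte Carlo sketch of the metallic-material part of such a model is shown below: a lognormal equivalent initial flaw size and scattered Paris-law constants are propagated through closed-form crack growth to a distribution of fatigue lives. All parameter values and distributions are assumptions for illustration, not taken from the thesis.

    ```python
    # Monte Carlo sketch of probabilistic fatigue life from Paris-law crack growth
    # with a lognormal equivalent initial flaw size (illustrative parameters).
    import numpy as np

    rng = np.random.default_rng(3)
    n_mc = 5000
    sigma = 120.0e6        # constant-amplitude stress range, Pa
    Y = 1.12               # geometry factor for an edge crack (assumed constant)
    a_crit = 0.02          # critical crack length, m

    a0 = rng.lognormal(mean=np.log(5e-5), sigma=0.4, size=n_mc)      # EIFS, m
    C = rng.lognormal(mean=np.log(1e-29), sigma=0.3, size=n_mc)      # Paris C (SI units)
    m = rng.normal(3.0, 0.1, size=n_mc)                              # Paris exponent

    # Closed-form integration of da/dN = C * (Y*sigma*sqrt(pi*a))**m for m != 2.
    exp_ = 1 - m / 2
    N_f = (a_crit**exp_ - a0**exp_) / (exp_ * C * (Y * sigma * np.sqrt(np.pi))**m)

    print("median life (cycles): %.3g" % np.median(N_f))
    print("P(N_f < 5e5 cycles):  %.3f" % np.mean(N_f < 5e5))
    ```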

  14. Low Velocity Impact Damage to Carbon/Epoxy Laminates

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2011-01-01

    Impact damage tends to be more detrimental to a laminate's compression strength than to its tensile strength. Proper use of Nondestructive Evaluation (NDE) techniques can remove conservatism (weight) from many structures. Test the largest components that are economically feasible as coupons. If damage tolerance is a driver, then consider different resin systems. Do not use a single knockdown factor to account for damage.

  15. Probabilistic analysis of the influence of the bonding degree of the stem-cement interface in the performance of cemented hip prostheses.

    PubMed

    Pérez, M A; Grasa, J; García-Aznar, J M; Bea, J A; Doblaré, M

    2006-01-01

    The long-term behavior of the stem-cement interface is one of the most frequent topics of discussion in the design of cemented total hip replacements, especially with regards to the process of damage accumulation in the cement layer. This effect is analyzed here comparing two different situations of the interface: completely bonded and debonded with friction. This comparative analysis is performed using a probabilistic computational approach that considers the variability and uncertainty of determinant factors that directly compromise the damage accumulation in the cement mantle. This stochastic technique is based on the combination of probabilistic finite elements (PFEM) and a cumulative damage approach known as B-model. Three random variables were considered: muscle and joint contact forces at the hip (both for walking and stair climbing), cement damage and fatigue properties of the cement. The results predicted that the regions with higher failure probability in the bulk cement are completely different depending on the stem-cement interface characteristics. In a bonded interface, critical sites appeared at the distal and medial parts of the cement, while for debonded interfaces, the critical regions were found distally and proximally. In bonded interfaces, the failure probability was higher than in debonded ones. The same conclusion may be established for stair climbing in comparison with walking activity.

  16. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
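
    As a small illustration of the reliability quantities these methods target, the sketch below evaluates a linear limit state g = R - S with normal strength and load, for which the reliability index has a closed form, and checks it against plain Monte Carlo. The means and standard deviations are illustrative, not values from the report.

    ```python
    # Illustration of the fast-probability-integration idea behind FORM-type methods
    # for a linear limit state g = R - S with normal R (strength) and S (load).
    import numpy as np
    from scipy.stats import norm

    mu_R, sd_R = 400.0, 30.0      # strength (e.g., MPa), illustrative
    mu_S, sd_S = 300.0, 40.0      # load effect (e.g., MPa), illustrative

    beta = (mu_R - mu_S) / np.sqrt(sd_R**2 + sd_S**2)   # reliability index
    pf_form = norm.cdf(-beta)

    rng = np.random.default_rng(8)
    n = 2_000_000
    pf_mc = np.mean(rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n) < 0.0)

    print("beta = %.2f, Pf(FORM) = %.2e, Pf(Monte Carlo) = %.2e" % (beta, pf_form, pf_mc))
    ```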

  17. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  18. Fatigue and damage tolerance scatter models

    NASA Astrophysics Data System (ADS)

    Raikher, Veniamin L.

    1994-09-01

    Effective Total Fatigue Life and Crack Growth Scatter Models are proposed. The first is based on the power form of the Wohler curve, the dependence of fatigue scatter on the mean life value, the influence of the cycle stress ratio on fatigue scatter, and a validated description of the mean stress influence on the mean fatigue life. The second additionally uses a fracture mechanics approach, the assumption of pre-existing initial damage, and the Paris equation. Simple formulas are derived for the configurations of the models. A preliminary identification of the model parameters is performed on the basis of experimental data. Some new and important results for fatigue and crack growth scatter characteristics are obtained.

  19. Improving the Fatigue Crack Propagation Resistance and Damage Tolerance of 2524-T3 Alloy with Amorphous Electroless Ni-P Coating

    NASA Astrophysics Data System (ADS)

    Chen, Lai; Zeng, Diping; Liu, Zhiyi; Bai, Song; Li, Junlin

    2018-02-01

    The surface microhardness, as well as the fatigue crack propagation (FCP) resistance, of 2524-T3 alloy is improved by producing a 20-μm-thick amorphous electroless Ni-12% P coating on its surface. Compared to the substrate, this deposited EN coating possesses higher strength properties and exhibits a greater ability to accommodate the plastic deformation at the fatigue crack tip, thereby remarkably improving the FCP resistance in the near-threshold and early Paris regimes. Despite similar FCP rates in the Paris regime (ΔK ≥ 16.2 MPa·m^0.5), the coated sample exhibits an extended Paris regime and enhanced damage tolerance.

  20. An examination of the damage tolerance enhancement of carbon/epoxy using an outer lamina of spectra (R)

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1991-01-01

    Low velocity instrumented impact testing was utilized to examine the effects of an outer lamina of ultra-high molecular weight polyethylene (Spectra) on the damage tolerance of carbon epoxy composites. Four types of 16-ply quasi-isotropic panels (0, +45, 90, -45) were tested. Some panels contained no Spectra, while others had a lamina of Spectra bonded to the top (impacted side), bottom, or both sides of the composite plates. The specimens were impacted with energies up to 8.5 J. Force time plots and maximum force versus impact energy graphs were generated for comparison purposes. Specimens were also subjected to cross-sectional analysis and compression after impact tests. The results show that while the Spectra improved the maximum load that the panels could withstand before fiber breakage, the Spectra seemingly reduced the residual strength of the composites.

  1. Stress-tolerance of baker's-yeast (Saccharomyces cerevisiae) cells: stress-protective molecules and genes involved in stress tolerance.

    PubMed

    Shima, Jun; Takagi, Hiroshi

    2009-05-29

    During the fermentation of dough and the production of baker's yeast (Saccharomyces cerevisiae), cells are exposed to numerous environmental stresses (baking-associated stresses) such as freeze-thaw, high sugar concentrations, air-drying and oxidative stresses. Cellular macromolecules, including proteins, nucleic acids and membranes, are seriously damaged under stress conditions, leading to the inhibition of cell growth, cell viability and fermentation. To avoid lethal damage, yeast cells need to acquire a variety of stress-tolerant mechanisms, for example the induction of stress proteins, the accumulation of stress protectants, changes in membrane composition and repression of translation, and by regulating the corresponding gene expression via stress-triggered signal-transduction pathways. Trehalose and proline are considered to be critical stress protectants, as is glycerol. It is known that these molecules are effective for providing protection against various types of environmental stresses. Modifications of the metabolic pathways of trehalose and proline by self-cloning methods have significantly increased tolerance to baking-associated stresses. To clarify which genes are required for stress tolerance, both a comprehensive phenomics analysis and a functional genomics analysis were carried out under stress conditions that simulated those occurring during the commercial baking process. These analyses indicated that many genes are involved in stress tolerance in yeast. In particular, it was suggested that vacuolar H+-ATPase plays important roles in yeast cells under stress conditions.

  2. Damage Tolerance and Mechanics of Interfaces in Nanostructured Metals

    NASA Astrophysics Data System (ADS)

    Foley, Daniel J.

    The concept of interface driven properties in crystalline metals has been one of the most intensely discussed topics in materials science for decades. Since the 1980s researchers have been exploring the concept of grain boundary engineering as route for tuning properties such as fracture toughness and irradiation resistance. This is especially true in ultra-fine grained and nanocrystalline materials where grain boundary mediated properties become dominant. More recently, materials composed of hierarchical nanostructures, such as amorphous-crystalline nanolaminates, have attracted considerable attention due to their favorable properties, ease of manufacture and highly tunable microstructure. While both grain boundary engineering and hierarchical nanostructures have shown promise there are still questions remaining regarding the role of specific attributes of the microstructure (such as grain boundaries, grain/layer size and inter/intralayer morphology) in determining material properties. This thesis attempts to address these questions by using atomistic simulations to perform deformation and damage loading studies on a series of nanolaminate and bicrystalline structures. During the course of this thesis the roles of layer thickness, interlayer structure and interlayer chemistry on the mechanical properties of Ni-NiX amorphous-crystalline nanolaminates were explored using atomistic simulations. This thesis found that layer thickness/thickness ratio and amorphous layer chemistry play a crucial role in yield strength and Young's modulus. Analysis of the deformation mechanisms at the atomic scale revealed that structures containing single crystalline, crystalline layers undergo plastic deformation when shear transformation zones form in the amorphous layer and impinge on the amorphous-crystalline interface, leading to dislocation emission. However, structures containing nanocrystalline, crystalline layers (both equiaxed and columnar nanocrystalline) undergo plastic

  3. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, while the probabilistic finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.

  4. Micro-Energy Rates for Damage Tolerance and Durability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    In this paper, the adhesive bond strength of lap-jointed graphite/aluminum composites is examined by computational simulation. Computed micro-stress level energy release rates are used to identify the damage mechanisms associated with the corresponding acoustic emission (AE) signals. Computed damage regions are similarly correlated with ultrasonically scanned damage regions. Results show that computational simulation can be used with suitable NDE methods for credible in-service monitoring of composites.

  5. Balancing repair and tolerance of DNA damage caused by alkylating agents.

    PubMed

    Fu, Dragony; Calvo, Jennifer A; Samson, Leona D

    2012-01-12

    Alkylating agents constitute a major class of frontline chemotherapeutic drugs that inflict cytotoxic DNA damage as their main mode of action, in addition to collateral mutagenic damage. Numerous cellular pathways, including direct DNA damage reversal, base excision repair (BER) and mismatch repair (MMR), respond to alkylation damage to defend against alkylation-induced cell death or mutation. However, maintaining a proper balance of activity both within and between these pathways is crucial for a favourable response of an organism to alkylating agents. Furthermore, the response of an individual to alkylating agents can vary considerably from tissue to tissue and from person to person, pointing to genetic and epigenetic mechanisms that modulate alkylating agent toxicity.

  6. Antioxidant enzymatic activity is linked to waterlogging stress tolerance in citrus.

    PubMed

    Arbona, Vicent; Hossain, Zahed; López-Climent, María F; Pérez-Clemente, Rosa M; Gómez-Cadenas, Aurelio

    2008-04-01

    Soil flooding constitutes a seasonal factor that negatively affects plant performance and crop yields. In this work, the relationship between oxidative damage and flooding sensitivity was addressed in three citrus genotypes with different abilities to tolerate waterlogging. We examined leaf visible damage, oxidative damage in terms of malondialdehyde (MDA) concentration, leaf proline concentration, leaf and root ascorbate and glutathione contents and the antioxidant enzyme activities superoxide dismutase (EC 1.15.1.1), ascorbate peroxidase (EC 1.11.1.11), catalase (EC 1.11.1.6) and glutathione reductase (EC 1.8.1.7). No differences in the extent of oxidative damage relative to controls were found among genotypes. However, the ability to delay the onset of oxidative damage was associated with a higher tolerance to waterlogging. This ability was linked to an enhanced activated-oxygen-species scavenging capacity, in terms of increased antioxidant enzyme activity and a higher content of polar antioxidant compounds. Therefore, the existence of a direct relationship between stress sensitivity and the early accumulation of MDA is proposed. In addition, data indicate that the protective role of proline has to be considered minimal as its accumulation was inversely correlated with tolerance to the stress. The positive antioxidant response in Carrizo citrange (Poncirus trifoliata L. Raf. x Citrus sinensis L. Osb.) and Citrumelo CPB 4475 (Poncirus trifoliata L. Raf. x Citrus paradisi L. Macf.) might be responsible for a higher tolerance to flooding stress, whereas in Cleopatra mandarin (Citrus reshni Hort. Ex Tan.), the early accumulation of MDA seems to be associated with an impaired ability for H2O2 scavenging.

  7. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Application of Tolerances § 51.1405... that at least one pecan which is seriously damaged by live insects inside the shell is permitted...

  8. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
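
    The toy sketch below illustrates only the probabilistic-module idea: uncertain inputs are sampled from assigned distributions and pushed through a drastically simplified dose expression. The distributions and the dose equation are placeholders, not the RESRAD or RESRAD-BUILD pathway models.

    ```python
    # Toy illustration of propagating parameter distributions through a simplified
    # soil-ingestion dose expression (placeholder distributions and equation).
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50_000

    soil_conc = rng.lognormal(np.log(50.0), 0.5, n)    # pCi/g, residual contamination
    ingestion_rate = rng.triangular(25, 50, 100, n)    # mg soil/day
    dose_factor = rng.normal(1.3e-4, 2e-5, n)          # mrem per pCi ingested (illustrative)
    occupancy = rng.uniform(0.3, 0.8, n)               # fraction of the year on site

    annual_dose = soil_conc * ingestion_rate * 1e-3 * 365 * occupancy * dose_factor

    print("mean annual dose      : %.3f mrem" % annual_dose.mean())
    print("95th percentile       : %.3f mrem" % np.percentile(annual_dose, 95))
    print("P(dose > 0.1 mrem toy): %.2f" % np.mean(annual_dose > 0.1))
    ```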

  9. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

    A reliability based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of constituent materials (fiber and matrix) to predict macroscopic behavior are simulated using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load and environment dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000, to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
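
    The sketch below mimics the overall flow rather than the IPACS formulation: random fiber and matrix properties pass through rule-of-mixtures micromechanics, a hypothetical multi-factor degradation term reduces ply strength, and a failure probability is estimated by Monte Carlo. Every number and the degradation form are assumptions.

    ```python
    # Illustrative Monte Carlo flow: constituent uncertainties -> micromechanics ->
    # degraded ply strength -> failure probability (all values assumed).
    import numpy as np

    rng = np.random.default_rng(4)
    n = 200_000

    Ef = rng.normal(230e9, 230e9 * 0.05, n)     # fiber modulus, Pa
    Em = rng.normal(3.5e9, 3.5e9 * 0.08, n)     # matrix modulus, Pa
    vf = rng.normal(0.60, 0.02, n)              # fiber volume ratio
    S0 = rng.normal(1.8e9, 1.8e9 * 0.07, n)     # pristine ply strength, Pa
    T = rng.uniform(20, 120, n)                 # service temperature, C

    E1 = vf * Ef + (1 - vf) * Em                # longitudinal ply modulus
    # Hypothetical multi-factor degradation: strength decays with temperature ratio.
    S = S0 * ((180.0 - T) / (180.0 - 20.0)) ** 0.5

    stress = rng.normal(0.9e9, 0.12e9, n)       # applied ply stress, Pa
    pf = np.mean(stress > S)
    print("mean E1 = %.1f GPa, failure probability = %.2e" % (E1.mean() / 1e9, pf))
    ```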

  10. Probabilistic finite elements for fracture mechanics

    NASA Technical Reports Server (NTRS)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic distributions, such as the expectation, covariance, and correlation of stress intensity factors, are calculated for random load, random material properties, and random crack length. The method is computationally quite efficient and can be used to determine the probability of fracture or the reliability.
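
    The sketch below is not the embedded-singularity PFEM itself, but a plain Monte Carlo estimate of the quantity it targets: the probability of fracture when load, crack length, and toughness are random, using K = Y*sigma*sqrt(pi*a). The distributions and the geometry factor are illustrative assumptions.

    ```python
    # Monte Carlo estimate of the probability of fracture for a randomly loaded
    # edge crack, K = Y * sigma * sqrt(pi * a) compared with toughness Kic.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 1_000_000
    sigma = rng.normal(180e6, 20e6, n)            # remote stress, Pa
    a = rng.lognormal(np.log(4e-3), 0.4, n)       # crack length, m
    Kic = rng.normal(35e6, 3e6, n)                # fracture toughness, Pa*sqrt(m)
    Y = 1.12                                      # edge-crack geometry factor (assumed)

    K = Y * sigma * np.sqrt(np.pi * a)
    print("probability of fracture:", np.mean(K > Kic))
    ```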

  11. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
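
    A minimal sketch of the exact check described above follows: the Poisson-Binomial pmf for the number of events is built by convolution from the issued forecast probabilities, and the observed event count is compared against it. The forecast probabilities, the observed count, and the two-sided p-value convention are illustrative choices, not the study's data.

    ```python
    # Reliability check for probabilistic event forecasts using the Poisson-Binomial
    # distribution (number of events from independent, non-identical Bernoulli trials).
    import numpy as np

    def poisson_binomial_pmf(probs):
        """PMF of the number of successes, built by sequential convolution."""
        pmf = np.array([1.0])
        for p in probs:
            pmf = np.convolve(pmf, [1.0 - p, p])
        return pmf

    forecast_p = np.array([0.05, 0.10, 0.20, 0.40, 0.70, 0.30, 0.15, 0.60])  # issued forecasts
    observed_k = 5                                                            # events that occurred

    pmf = poisson_binomial_pmf(forecast_p)
    # Two-sided p-value: how surprising is the observed count if the forecasts were reliable?
    p_value = pmf[np.abs(np.arange(pmf.size) - forecast_p.sum()) >=
                  abs(observed_k - forecast_p.sum())].sum()
    print("expected events: %.2f, observed: %d, p-value: %.3f"
          % (forecast_p.sum(), observed_k, p_value))
    ```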

  12. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  13. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment.
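
    The sketch below shows the shape of the per-pixel calculation under stated assumptions: a PGA hazard curve (annual rates per acceleration bin) is combined with a rigid-block displacement relation to obtain probabilities of exceeding displacement thresholds over 50 years. The displacement function and all numbers are hypothetical placeholders, not the study's calibrated inputs.

    ```python
    # Per-pixel sketch: hazard-curve bins + a placeholder Newmark-style displacement
    # relation -> probability of exceeding displacement thresholds (Poisson model).
    import numpy as np

    pga_bins = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])           # g, bin centers (illustrative)
    annual_rate = np.array([2e-2, 8e-3, 3e-3, 1e-3, 3e-4, 8e-5])  # events/yr per bin
    ac = 0.15                                                      # critical (yield) acceleration, g

    def newmark_displacement(amax, ac):
        """Placeholder rigid-block displacement (m); returns 0 when amax <= ac."""
        ratio = np.clip(ac / amax, None, 1.0)
        return np.where(ratio < 1.0, 0.5 * (1.0 - ratio) ** 2.3 * ratio ** -1.1, 0.0)

    disp = newmark_displacement(pga_bins, ac)
    for threshold in (0.1, 0.3, 1.0):
        rate = annual_rate[disp > threshold].sum()             # total rate of exceedance
        p50 = 1.0 - np.exp(-rate * 50.0)                       # Poisson, 50-year window
        print("P(displacement > %.1f m in 50 yr) = %.3f" % (threshold, p50))
    ```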

  14. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  15. 76 FR 74655 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... and discrete flaws, and impact or other accidental damage (including the discrete source of the... discrete manufacturing defects or accidental damage, is avoided throughout the operational life or... and discrete flaws, and impact or other accidental damage (including the discrete source of the...

  16. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.
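
    A minimal sketch of the time-dependent versus time-independent comparison is given below: the conditional 30-year rupture probability from a lognormal renewal model, for several values of intrinsic sigma, against the Poisson value. The recurrence interval and elapsed time are illustrative, not the CDMG fault parameters.

    ```python
    # Conditional 30-year rupture probability from a lognormal renewal model
    # (time-dependent) versus a Poisson model (time-independent); values illustrative.
    from scipy.stats import lognorm
    import numpy as np

    mean_recurrence = 220.0      # yr, mean recurrence interval (illustrative)
    elapsed = 160.0              # yr since the last rupture (illustrative)
    window = 30.0                # forecast window, yr

    for sigma_ln in (0.3, 0.5, 0.7):         # intrinsic (aleatory) sigma
        dist = lognorm(s=sigma_ln, scale=mean_recurrence * np.exp(-sigma_ln**2 / 2))
        p_cond = ((dist.cdf(elapsed + window) - dist.cdf(elapsed)) /
                  dist.sf(elapsed))          # conditional on having survived `elapsed` years
        p_poisson = 1.0 - np.exp(-window / mean_recurrence)
        print("sigma=%.1f  time-dependent=%.3f  Poisson=%.3f" % (sigma_ln, p_cond, p_poisson))
    ```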

  17. Probabilistic Simulation of Progressive Fracture in Bolted-Joint Composite Laminates

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Singhal, S. N.; Chamis, C. C.

    1996-01-01

    This report describes computational methods to probabilistically simulate fracture in bolted composite structures. An innovative approach that is independent of stress intensity factors and fracture toughness was used to simulate progressive fracture. The effect of design variable uncertainties on structural damage was also quantified. A fast probability integrator assessed the scatter in the composite structure response before and after damage. Then the sensitivity of the response to design variables was computed. General-purpose methods, which are applicable to bolted joints in all types of structures and in all fracture processes-from damage initiation to unstable propagation and global structure collapse-were used. These methods were demonstrated for a bolted joint of a polymer matrix composite panel under edge loads. The effects of the fabrication process were included in the simulation of damage in the bolted panel. Results showed that the most effective way to reduce end displacement at fracture is to control both the load and the ply thickness. The cumulative probability for longitudinal stress in all plies was most sensitive to the load; in the 0 deg. plies it was very sensitive to ply thickness. The cumulative probability for transverse stress was most sensitive to the matrix coefficient of thermal expansion. In addition, fiber volume ratio and fiber transverse modulus both contributed significantly to the cumulative probability for the transverse stresses in all the plies.

  18. Damage tolerance in filament-wound graphite/epoxy pressure vessels

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Ngueyen, Vinh D.; Chenna, Ravi K.

    1995-01-01

    Graphite/epoxy composites are extensively used in the aerospace and sporting goods industries due to their superior engineering properties compared to those of metals. However, graphite/epoxy is extremely susceptible to impact damage which can cause considerable and sometimes undetected reduction in strength. An inelastic impact model was developed to predict damage due to low-velocity impact. A transient dynamic finite element formulation was used in conjunction with the 3D Tsai-Wu failure criterion to determine and incorporate failure in the materials during impact. Material degradation can be adjusted from no degradation to partial degradation to full degradation. The developed software is based on an object-oriented implementation framework called Extensible Implementation Framework for Finite Elements (EIFFE).

  19. ALA Pretreatment Improves Waterlogging Tolerance of Fig Plants

    PubMed Central

    An, Yuyan; Qi, Lin; Wang, Liangju

    2016-01-01

    5-aminolevulinic acid (ALA), a natural and environmentally friendly plant growth regulator, can improve plant tolerance to various environmental stresses. However, whether ALA can improve plant waterlogging tolerance is unknown. Here, we investigated the effects of ALA pretreatment on the waterlogging-induced damage of fig (Ficus carica Linn.) plants, which often suffer from waterlogging stress. ALA pretreatment significantly alleviated stress-induced morphological damage, increased leaf relative water content (RWC), and reduced leaf superoxide anion (O2⋅¯) production rate and malonaldehyde (MDA) content in fig leaves, indicating ALA mitigates waterlogging stress of fig plants. We further demonstrated that ALA pretreatment largely promoted leaf chlorophyll content, photosynthetic electron transfer ability, and photosynthetic performance index, indicating ALA significantly improves plant photosynthetic efficiency under waterlogging stress. Moreover, ALA pretreatment significantly increased activities of leaf superoxide dismutase (SOD) and peroxidase (POD), root vigor, and activities of root alcohol dehydrogenase (ADH), and lactate dehydrogenase (LDH), indicating ALA also significantly improves antioxidant ability and root function of fig plants under waterlogging stress. Taken together, ALA pretreatment improves waterlogging tolerance of fig plants significantly, and the promoted root respiration, leaf photosynthesis, and antioxidant ability may contribute greatly to this improvement. Our data firstly shows that ALA can improve plant waterlogging tolerance. PMID:26789407

  20. A probabilistic asteroid impact risk model: assessment of sub-300 m impacts

    NASA Astrophysics Data System (ADS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2017-06-01

    A comprehensive asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain input parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions for objects up to 300 m in diameter. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data have little effect on the metrics of interest.
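
    The sketch below reproduces only the Monte Carlo skeleton: impactor diameter, density, and velocity are sampled, impact energy is computed, and a hypothetical cube-root blast scaling stands in for the PAIR model's entry, ablation, and population-exposure calculations. All distributions and constants are illustrative.

    ```python
    # Monte Carlo skeleton: sample impactor properties, compute energy, and apply a
    # hypothetical damage-area scaling (none of the PAIR consequence models).
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000

    # Diameter from a truncated power law (illustrative exponent), 20-300 m.
    u = rng.random(n)
    d_min, d_max, alpha = 20.0, 300.0, 2.7
    diameter = (d_min**(1 - alpha) + u * (d_max**(1 - alpha) - d_min**(1 - alpha)))**(1 / (1 - alpha))

    density = rng.normal(2600.0, 500.0, n).clip(1000, 7000)       # kg/m^3
    velocity = rng.lognormal(np.log(17e3), 0.3, n)                 # m/s

    mass = density * np.pi / 6.0 * diameter**3
    energy_mt = 0.5 * mass * velocity**2 / 4.184e15                # megatons TNT

    # Hypothetical damage-area proxy: radius grows with the cube root of energy.
    damage_radius_km = 2.0 * energy_mt ** (1.0 / 3.0)

    for d_thresh in (50, 100, 200):
        sel = diameter > d_thresh
        print("d > %3d m: fraction %.3f, median damage radius %.1f km"
              % (d_thresh, sel.mean(), np.median(damage_radius_km[sel])))
    ```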

  1. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  2. Probabilistic simple sticker systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementary feature of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, the probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
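
    A toy sketch of the probability bookkeeping is given below: axioms carry probabilities, every generated string receives the product of the probabilities of the initial pieces used in its derivation, and a cutoff selects the language. The Watson-Crick sticking operations are abstracted into plain concatenation, and all values are illustrative.

    ```python
    # Toy probability bookkeeping for a probabilistic sticker-style system:
    # string probability = product of the probabilities of the axioms used.
    from itertools import product

    axioms = {"ab": 0.5, "ba": 0.3, "aa": 0.2}   # illustrative pieces with probabilities
    cutoff = 0.05                                 # probabilistic selection threshold

    language = {}
    for pieces in product(axioms, repeat=3):      # derivations that glue three pieces
        word = "".join(pieces)
        prob = 1.0
        for p in pieces:
            prob *= axioms[p]
        # Keep the most probable derivation for each generated string.
        language[word] = max(language.get(word, 0.0), prob)

    selected = {w: pr for w, pr in language.items() if pr >= cutoff}
    for w, pr in sorted(selected.items(), key=lambda kv: -kv[1])[:5]:
        print(w, round(pr, 3))
    ```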

  3. Adjoint Techniques for Topology Optimization of Structures Under Damage Conditions

    NASA Technical Reports Server (NTRS)

    Akgun, Mehmet A.; Haftka, Raphael T.

    2000-01-01

    The objective of this cooperative agreement was to seek computationally efficient ways to optimize aerospace structures subject to damage tolerance criteria. Optimization was to involve sizing as well as topology optimization. The work was done in collaboration with Steve Scotti, Chauncey Wu and Joanne Walsh at the NASA Langley Research Center. Computation of constraint sensitivity is normally the most time-consuming step of an optimization procedure. The cooperative work first focused on this issue and implemented the adjoint method of sensitivity computation (Haftka and Gurdal, 1992) in an optimization code (runstream) written in Engineering Analysis Language (EAL). The method was implemented for both bar and plate elements, including buckling sensitivity for the latter. Lumping of constraints was investigated as a means to reduce the computational cost. Adjoint sensitivity computation was developed and implemented for lumped stress and buckling constraints. The costs of the direct and adjoint methods were compared for various structures with and without lumping. The results were reported in two papers (Akgun et al., 1998a and 1999). It is desirable to optimize the topology of an aerospace structure subject to a large number of damage scenarios so that a damage-tolerant structure is obtained. Including damage scenarios in the design procedure is critical in order to avoid large mass penalties at later stages (Haftka et al., 1983). A common method for topology optimization is that of compliance minimization (Bendsoe, 1995), which has not been used for damage-tolerant design. In the present work, topology optimization is treated as a conventional problem aiming to minimize the weight subject to stress constraints. Multiple damage configurations (scenarios) are considered. Each configuration has its own structural stiffness matrix and, normally, requires factoring of the matrix and solution of the system of equations. Damage that is expected to be tolerated is local

  4. Turtle anoxia tolerance: Biochemistry and gene regulation.

    PubMed

    Krivoruchko, Anastasia; Storey, Kenneth B

    2015-06-01

    While oxygen limitation can be extremely damaging for many animals, some vertebrates have perfected anaerobic survival. Freshwater turtles belonging to the Trachemys and Chrysemys genera, for example, can survive many weeks without oxygen, and as such are commonly used as model animals for vertebrate anoxia tolerance. In the present review we discuss the recent advances made in understanding the biochemical and molecular nature of natural anoxia tolerance of freshwater turtles. Research in recent years has shown that activation of several important pathways occurs in response to anoxia in turtles, including those that function in the stress response, cell cycle arrest, inhibition of gene expression and metabolism. These likely contribute to anoxia tolerance in turtle tissues by minimizing cell damage in response to anoxia, as well as facilitating metabolic rate depression. The research discussed in the present review contributes to the understanding of how freshwater turtles can survive without oxygen for prolonged periods of time. This could also improve understanding of the molecular nature of hypoxic/ischemic injuries in mammalian tissues and suggest potential ways to avoid these. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include chord length and height for the blade, and inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
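    The response-surface step described above (fit a surrogate to a limited number of deterministic FEA runs, then perform the probabilistic analysis on the cheap surrogate) can be sketched as follows. The quadratic surrogate, the two design variables, and the stand-in "FEA" function are assumptions for illustration, not PRODAF or NESSUS internals.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fea_stress(chord, height):
        """Stand-in for a deterministic FEA run returning a peak blade stress (hypothetical)."""
        return 120.0 + 35.0 * chord - 18.0 * height + 6.0 * chord * height

    # 1) Design of experiments: a small set of deterministic runs over the design space
    chord = rng.uniform(0.8, 1.2, size=25)
    height = rng.uniform(0.9, 1.1, size=25)
    stress = fea_stress(chord, height)

    # 2) Fit a quadratic response surface by least squares
    X = np.column_stack([np.ones_like(chord), chord, height,
                         chord**2, height**2, chord * height])
    coef, *_ = np.linalg.lstsq(X, stress, rcond=None)

    # 3) Monte Carlo on the inexpensive surrogate instead of repeated FEA solves
    c = rng.normal(1.0, 0.02, size=200_000)
    h = rng.normal(1.0, 0.01, size=200_000)
    Xs = np.column_stack([np.ones_like(c), c, h, c**2, h**2, c * h])
    s = Xs @ coef
    print(f"mean stress {s.mean():.1f}, std {s.std():.2f}, P(stress > 130) = {(s > 130).mean():.4f}")
    ```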

  6. Recent Developments and Challenges Implementing New and Improved Stress Intensity Factor (K) Solutions in NASGRO for Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Cardinal, Joseph W.; McClung, R. Craig; Lee, Yi-Der; Guo, Yajun; Beek, Joachim M.

    2014-01-01

    Fatigue crack growth analysis software has been available to damage tolerance analysts for many years in either commercial products or via proprietary in-house codes. The NASGRO software has been publicly available since the mid-80s (known as NASA/FLAGRO up to 1999) and since 2000 has been sustained and further developed by a collaborative effort between Southwest Research Institute® (SwRI®), the NASA Johnson Space Center (JSC), and the members of the NASGRO Industrial Consortium. Since the stress intensity factor (K) is the foundation of fracture mechanics and damage tolerance analysis of aircraft structures, a significant focus of development efforts in the past fifteen years has been geared towards enhancing legacy K solutions and developing new and efficient numerical K solutions that can handle the complicated stress gradients computed by today’s analysts using detailed finite element models of fatigue critical locations. This paper provides an overview of K solutions that have been recently implemented or improved for the analysis of geometries such as two unequal through cracks at a hole and two unequal corner cracks at a hole, as well as state-of-the-art weight function models capable of computing K in the presence of univariant and/or bivariant stress gradients and complicated residual stress distributions. Some historical background is provided to review how common K solutions have evolved over the years, including selective examples from the literature and from new research. Challenges and progress in rectifying discrepancies between older legacy solutions and newer models are reviewed as well as approaches and challenges for verification and validation of K solutions. Finally, a summary of current challenges and future research and development needs is presented. A key theme throughout the presentation of this paper will be how members of the aerospace industry have collaborated with software developers to develop a practical analysis tool that is

  7. Damage prognosis of adhesively-bonded joints in laminated composite structural components of unmanned aerial vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrar, Charles R; Gobbato, Maurizio; Conte, Joel

    2009-01-01

    The extensive use of lightweight advanced composite materials in unmanned aerial vehicles (UAVs) drastically increases the sensitivity to both fatigue- and impact-induced damage of their critical structural components (e.g., wings and tail stabilizers) during service life. The spar-to-skin adhesive joints are considered one of the most fatigue sensitive subcomponents of a lightweight UAV composite wing with damage progressively evolving from the wing root. This paper presents a comprehensive probabilistic methodology for predicting the remaining service life of adhesively-bonded joints in laminated composite structural components of UAVs. Non-destructive evaluation techniques and Bayesian inference are used to (i) assess the current state of damage of the system and (ii) update the probability distribution of the damage extent at various locations. A probabilistic model for future loads and a mechanics-based damage model are then used to stochastically propagate damage through the joint. Combined local (e.g., exceedance of a critical damage size) and global (e.g., flutter instability) failure criteria are finally used to compute the probability of component failure at future times. The applicability and the partial validation of the proposed methodology are then briefly discussed by analyzing the debonding propagation, along a pre-defined adhesive interface, in a simply supported laminated composite beam with solid rectangular cross section, subjected to a concentrated load applied at mid-span. A specially developed Euler-Bernoulli beam finite element with interlaminar slip along the damageable interface is used in combination with a cohesive zone model to study the fatigue-induced degradation in the adhesive material. The preliminary numerical results presented are promising for the future validation of the methodology.

  8. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  9. Frontal and Parietal Contributions to Probabilistic Association Learning

    PubMed Central

    Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke

    2011-01-01

    Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842

  10. Sugars and Desiccation Tolerance in Seeds 1

    PubMed Central

    Koster, Karen L.; Leopold, A. Carl

    1988-01-01

    Soluble sugars have been shown to protect liposomes and lobster microsomes from desiccation damage, and a protective role has been proposed for them in several anhydrous systems. We have studied the relationship between soluble sugar content and the loss of desiccation tolerance in the axes of germinating soybean (Glycine max L. Merr. cv Williams), pea (Pisum sativum L. cv Alaska), and corn (Zea mays L. cv Merit). The loss of desiccation tolerance during imbibition was monitored by following the ability of seeds to germinate after desiccation following various periods of preimbibition and by following the rates of electrolyte leakage from dried, then rehydrated axes. Finally, we analyzed the soluble sugar contents of the axes throughout the transition from desiccation tolerance to intolerance. These analyses show that sucrose and larger oligosaccharides were consistently present during the tolerant stage, and that desiccation tolerance disappeared as the oligosaccharides were lost. The results support the idea that sucrose may serve as the principal agent of desiccation tolerance in these seeds, with the larger oligosaccharides serving to keep the sucrose from crystallizing. PMID:16666392

  11. Probability of growth of small damage sites on the exit surface of fused silica optics.

    PubMed

    Negres, Raluca A; Abdulla, Ghaleb M; Cross, David A; Liao, Zhi M; Carr, Christopher W

    2012-06-04

    Growth of laser damage on fused silica optical components depends on several key parameters including laser fluence, wavelength, pulse duration, and site size. Here we investigate the growth behavior of small damage sites on the exit surface of SiO₂ optics under exposure to tightly controlled laser pulses. Results demonstrate that the onset of damage growth is not governed by a threshold, but is probabilistic in nature and depends both on the current size of a damage site and the laser fluence to which it is exposed. We also develop models for use in growth prediction. In addition, we show that laser exposure history also influences the behavior of individual sites.
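    The central point above, that the onset of growth is not a hard threshold but a probability that increases with site size and laser fluence, can be illustrated with a generic logistic form. The functional form and all coefficients below are purely hypothetical and are not the growth-prediction models developed by the authors.

    ```python
    import numpy as np

    def growth_probability(size_um, fluence_J_cm2, b0=-8.0, b_size=0.04, b_fluence=0.9):
        """Hypothetical logistic model: P(growth) rises with site size and laser fluence."""
        z = b0 + b_size * size_um + b_fluence * fluence_J_cm2
        return 1.0 / (1.0 + np.exp(-z))

    for size in (10, 50, 100):           # damage-site diameter, micrometres
        for fluence in (4.0, 6.0, 8.0):  # laser fluence, J/cm^2
            p = growth_probability(size, fluence)
            print(f"size {size:4d} um, fluence {fluence:.1f} J/cm2 -> P(growth) ~ {p:.2f}")
    ```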

  12. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    DOE PAGES

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; ...

    2015-12-09

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ~1 GPa, excellent ductility (~60–70%) and exceptional fracture toughness (KJIc > 200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. In conclusion, we further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  13. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    PubMed Central

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-01-01

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ∼1 GPa, excellent ductility (∼60–70%) and exceptional fracture toughness (KJIc>200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip. PMID:26647978

  14. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results with respect to a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite element codes, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  15. A probabilistic tornado wind hazard model for the continental United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hossain, Q; Kimball, J; Mensing, R

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrences of tornadoes within the continental United States (CONUS) are assumed to follow a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an aerial probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
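    A heavily simplified sketch of the aerial-probability idea (Poisson temporal occurrences combined with the ratio of damage area to region area) follows. The rates, damage-area dimensions, and region size are placeholder values, and the real model's treatment of facility geometry, path orientation, intensity classification error, and epistemic uncertainty is omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed inputs for illustration only (not the report's data or its full aerial model)
    region_area_km2 = 100_000.0  # area over which the historical touchdown rate applies
    annual_rate = 12.0           # mean tornado touchdowns per year in that region

    # Placeholder damage-area dimensions (km) by intensity class, with class probabilities
    length_km = np.array([1.0, 5.0, 15.0])
    width_km = np.array([0.05, 0.2, 0.5])
    class_p = np.array([0.70, 0.25, 0.05])

    # Probability that a random touchdown in the region strikes a point-like facility:
    # weighted damage area divided by region area (facility size and path direction ignored here)
    p_hit = np.sum(class_p * length_km * width_km) / region_area_km2

    # Poisson temporal occurrences, thinned by the strike probability
    years = 1_000_000
    n_tornadoes = rng.poisson(annual_rate, size=years)
    hits = rng.binomial(n_tornadoes, p_hit)

    print(f"Mean strike frequency ~ {hits.mean():.2e} per year")
    print(f"Annual probability of at least one strike ~ {(hits > 0).mean():.2e}")
    ```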

  16. Probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results are given for an elastic-plastic ten-bar structure and for an elastic-plastic plane continuum with a circular hole subjected to cyclic loading, with the yield stress modeled as a random field.
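    The mean-and-variance input/output flow described above can be illustrated with a first-order perturbation sketch (the PFEM in the abstract uses second-order perturbation; the simpler first-order version is shown only to make the idea concrete). The response function, geometry, and input statistics below are invented for the example.

    ```python
    import numpy as np

    def tip_displacement(E, P):
        """Toy structural response: cantilever tip deflection delta = P L^3 / (3 E I)."""
        L, I = 2.0, 8.0e-6  # fixed geometry (m, m^4), assumed values
        return P * L**3 / (3.0 * E * I)

    # Designer-supplied means and variances of the random inputs (assumed values)
    mu = np.array([200e9, 10e3])              # Young's modulus (Pa), load magnitude (N)
    var = np.array([(10e9)**2, (1.5e3)**2])

    # First-order perturbation: finite-difference sensitivities about the mean
    d0 = tip_displacement(*mu)
    grad = np.empty(2)
    for i in range(2):
        x = mu.copy()
        h = 1e-6 * mu[i]
        x[i] += h
        grad[i] = (tip_displacement(*x) - d0) / h

    mean_resp = d0                      # first-order estimate of the mean response
    var_resp = np.sum(grad**2 * var)    # independent inputs assumed
    print(f"mean deflection {mean_resp*1e3:.2f} mm, std {np.sqrt(var_resp)*1e3:.2f} mm")
    ```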

  17. A Probabilistic Analysis of Surface Water Flood Risk in London.

    PubMed

    Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris

    2018-06-01

    Flooding in urban areas during heavy rainfall, often characterized by short duration and high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million/year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively. © 2017 Society for Risk Analysis.
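    Expected annual damage figures like those quoted above are typically obtained by integrating damage against annual exceedance probability. The sketch below shows that calculation for invented damage/return-period pairs; the numbers are not the study's data.

    ```python
    import numpy as np

    # Hypothetical damage estimates (million GBP) for events of given return periods (years)
    return_periods = np.array([2, 5, 10, 30, 100, 1000], dtype=float)
    damages = np.array([5.0, 40.0, 120.0, 300.0, 700.0, 1500.0])

    # Annual exceedance probability for each event
    aep = 1.0 / return_periods

    # Expected annual damage: area under the damage vs exceedance-probability curve
    order = np.argsort(aep)
    ead = np.trapz(damages[order], aep[order])
    print(f"Expected annual damage ~ {ead:.0f} million GBP/year")
    ```

    Comparing the curve (and its integral) across climate scenarios is what yields risk statements such as the baseline/2030/2050 figures in the abstract.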

  18. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  19. Tolerance and resistance of invasive and native Eupatorium species to generalist herbivore insects

    NASA Astrophysics Data System (ADS)

    Wang, Rui-Fang; Feng, Yu-Long

    2016-11-01

    Invasive plants are exotic species that escape control by native specialist enemies. However, exotic plants may still be attacked by locally occurring generalist enemies, which can influence the dynamics of biological invasions. If invasive plants have greater defensive (resistance and tolerance) capabilities than indigenous plants, they may experience less damage from native herbivores. In the present study, we tested this prediction using the invasive plant Eupatorium adenophorum and two native congeners under simulated defoliation and generalist herbivore insect (Helicoverpa armigera and Spodoptera litura) treatments. E. adenophorum was less susceptible to both treatments and compensated more quickly for losses in biomass production than its two congeners, exhibiting greater herbivore tolerance. This strong tolerance to damage was associated with greater resource allocation to aboveground structures, leading to a higher leaf area ratio and a lower root:crown mass ratio than those of its native congeners. E. adenophorum also displayed a higher resistance index (which integrates acid detergent fiber, nitrogen content, carbon/nitrogen ratio, leaf mass per area, toughness, and trichome density) than its two congeners. Thus, H. armigera and S. litura performed poorly on E. adenophorum, with less leaf damage, a lengthened insect developmental duration, and decreased pupating:molting ratios compared to those of the native congeners. Strong tolerance and resistance traits may facilitate the successful invasion of E. adenophorum in China and may decrease the efficacy of leaf-feeding biocontrol agents. Our results highlight the need for further research on defensive traits and their role in the invasiveness and biological control of exotic plants, and suggest that biocontrol of E. adenophorum in China would require damage to the plant far in excess of current levels.

  20. Probabilistic models of cognition: conceptual foundations.

    PubMed

    Chater, Nick; Tenenbaum, Joshua B; Yuille, Alan

    2006-07-01

    Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, 'sophisticated' probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, explore how the approach relates to studies of explicit probabilistic reasoning, and give a brief overview of the field as it stands today.

  1. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  2. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  3. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321

  4. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  5. The roles of tolerance in the evolution, maintenance and breakdown of mutualism

    NASA Astrophysics Data System (ADS)

    Edwards, David P.

    2009-10-01

    Tolerance strategies are cost-reduction mechanisms that enable organisms to recover some of the fitness lost to damage, but impose limited or no cost on antagonists. They are frequently invoked in studies of plant-herbivore and of host-parasite interactions, but the possible roles of tolerance in mutualism (interspecific cooperation) have yet to be thoroughly examined. This review identifies candidate roles for tolerance in the evolution, maintenance and breakdown of mutualism. Firstly, by reducing the cost of damage, tolerance provides a key pathway by which pre-mutualistic hosts can reduce the cost of association with their parasites, promoting cooperation. This holds for the evolution of ‘evolved dependency’ type mutualism, where a host requires an antagonist that does not direct any reward to their partner for some resource, and of ‘outright mutualism’, where participants directly trade benefits. Secondly, in outright mutualism, tolerance might maintain cooperation by reducing the cost of a persisting negative trait in a symbiotic partner. Finally, the evolution of tolerance might also provide a pathway out of mutualism because the host could evolve a cheaper alternative to continued cooperation with its mutualistic partner, permitting autonomy. A key consequence of tolerance is that it contrasts with partner choice mechanisms that impose large costs on cheats, and I highlight understanding any trade-off between tolerance and partner choice as an important research topic in the evolution of cooperation. I conclude by identifying tolerance as part of a more general phenomenon of co-adaptation in mutualism and parasitism that drives the evolution of the cost/benefit ratio from the interaction.

  6. Ocular tolerance of preservatives on the murine cornea.

    PubMed

    Furrer, P; Mayer, J M; Plazonnet, B; Gurny, R

    1999-03-01

    We investigated the effects of instilling 13 commonly used preservatives on the murine cornea in vivo. Due to the instillation of preservatives, micro-lesions are formed on the cornea and can be selectively marked by fluorescein. The sum of the resulting fluorescent areas was measured using an episcopic microscope coupled to an image processing system. All the tested preservatives proved to be well tolerated by the eye at commonly used concentrations. However, in some cases, increased concentrations of preservatives or combinations of preservatives resulted in a significant increase in the amount of corneal damage. With increasing concentration, corneal lesions increased the most in the case of cetylpyridinium. While a combination of chlorobutanol 0.5% and phenylethylalcohol 0.5% did not result in an increase in corneal damage (when compared to the use of each separately), the associations of thiomersal 0.02% and phenylethylalcohol 0.4% on the one hand, and of edetate disodium (EDTA) 0.1% and benzalkonium 0.01% on the other, resulted in significant increases in the amount of corneal damage. However, in none of the tested combinations did the increase in observed damage exceed the limit of ocular intolerance we had defined beforehand; thus, they were all deemed relatively well tolerated. In the last part of the study, we investigated the effects of combining several preservatives, at usual concentrations, with an anesthetic solution of oxybuprocaine and found no notable increase in ocular damage.

  7. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    NASA Astrophysics Data System (ADS)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  8. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  9. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load at a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.

  10. A fault-tolerant intelligent robotic control system

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Tso, Kam Sing

    1993-01-01

    This paper describes the concept, design, and features of a fault-tolerant intelligent robotic control system being developed for space and commercial applications that require high dependability. The comprehensive strategy integrates system level hardware/software fault tolerance with task level handling of uncertainties and unexpected events for robotic control. The underlying architecture for system level fault tolerance is the distributed recovery block which protects against application software, system software, hardware, and network failures. Task level fault tolerance provisions are implemented in a knowledge-based system which utilizes advanced automation techniques such as rule-based and model-based reasoning to monitor, diagnose, and recover from unexpected events. The two level design provides tolerance of two or more faults occurring serially at any level of command, control, sensing, or actuation. The potential benefits of such a fault tolerant robotic control system include: (1) a minimized potential for damage to humans, the work site, and the robot itself; (2) continuous operation with a minimum of uncommanded motion in the presence of failures; and (3) more reliable autonomous operation providing increased efficiency in the execution of robotic tasks and decreased demand on human operators for controlling and monitoring the robotic servicing routines.

  11. Fuel containment and damage tolerance in large composite primary aircraft structures. Phase 2: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.; Denny, A.; Wood, M. A.

    1985-01-01

    Technical issues associated with fuel containment and damage tolerance of composite wing structures for transport aircraft were investigated. Material evaluation tests were conducted on two toughened resin composites: Celion/HX1504 and Celion/5245. These consisted of impact, tension, compression, edge delamination, and double cantilever beam tests. Another test series was conducted on graphite/epoxy box beams simulating a wing cover to spar cap joint configuration of a pressurized fuel tank. These tests evaluated the effectiveness of sealing methods with various fastener types and spacings under fatigue loading and with pressurized fuel. Another test series evaluated the ability of the selected coatings, film, and materials to prevent fuel leakage through 32-ply AS4/2220-1 laminates at various impact energy levels. To verify the structural integrity of the technology demonstration article structural details, tests were conducted on blade stiffened panels and sections. Compression tests were performed on undamaged and impacted stiffened AS4/2220-1 panels and smaller element tests to evaluate stiffener pull-off, side load and failsafe properties. Compression tests were also performed on panels subjected to Zone 2 lightning strikes. All of these data were integrated into a demonstration article representing a moderately loaded area of a transport wing. This test combined lightning strike, pressurized fuel, impact, impact repair, fatigue and residual strength.

  12. A novel two-step method for screening shade tolerant mutant plants via dwarfism

    USDA-ARS?s Scientific Manuscript database

    When subjected to shade, plants undergo rapid shoot elongation, which often makes them more prone to disease and mechanical damage. It has been reported that, in turfgrass, induced dwarfism can enhance shade tolerance. Here, we describe a two-step procedure for isolating shade tolerant mutants of ...

  13. Seasonal variation in hybrid poplar tolerance to glyphosate.

    Treesearch

    Daniel Netzer; Edward Hansen

    1992-01-01

    Reports that glyphosate applied during April or May in hybrid poplar plantations usually results in tree growth increases and that later summer applications often result in tree damage, growth loss, or mortality. Introduces the concept of "physiological" and "morphological" herbicide tolerance.

  14. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  15. Probabilistic models to estimate fire-induced cable damage at nuclear power plants

    NASA Astrophysics Data System (ADS)

    Valbuena, Genebelin R.

    Even though numerous PRAs have shown that fire can be a major contributor to nuclear power plant risk, there are some specific areas of knowledge related to this issue, such as the prediction of fire-induced damage to electrical cables and circuits and their potential effects on the safety of the nuclear power plant, that still constitute a practical enigma, particularly given the lack of approaches/models for performing consistent and objective assessments. This report contains a discussion of three different models to estimate fire-induced cable damage likelihood given a specified fire profile: the kinetic model, the heat transfer model, and the IR "K Factor" model. These models not only are based on statistical analysis of data available in the open literature, but to the greatest extent possible they use physics based principles to describe the underlying mechanism of failures that take place among the electrical cables upon heating due to external fires. The characterization of cable damage, and consequently the loss of functionality of electrical cables in fire, is a complex phenomenon that depends on a variety of intrinsic factors such as cable materials and dimensions, and extrinsic factors such as electrical and mechanical loads on the cables, heat flux severity, and exposure time. Some of these factors are difficult to estimate even in a well-characterized fire, not only because of the variability related to unknown material composition and physical arrangements, but also because of the lack of objective frameworks and theoretical models to study the behavior of polymeric wire cable insulation under dynamic external thermal insults. The results of this research will (1) help to develop a consistent framework to predict the likelihood of fire-induced cable failure modes, and (2) develop some guidance to evaluate and/or reduce the risk associated with these failure modes in existing and new power plant facilities. Among the models evaluated, the physics-based heat transfer model takes into
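    The heat-transfer idea (a cable heats toward a damage-threshold temperature under an external fire exposure) can be illustrated with a generic lumped-capacitance sketch. Every value and the model form below are assumptions for illustration and are not one of the report's three models.

    ```python
    import numpy as np

    # Assumed exposure, initial, and damage-threshold temperatures (deg C)
    T_fire, T0, T_damage = 450.0, 30.0, 205.0
    h = 25.0                          # assumed convective coefficient, W/(m^2 K)
    d, rho, c = 0.02, 1800.0, 1500.0  # cable diameter (m), density (kg/m^3), heat capacity (J/kg K)

    # Per unit length of cable: thermal mass = rho * pi * d^2 / 4 * c, exposed area = pi * d
    tau = (rho * np.pi * d**2 / 4.0 * c) / (h * np.pi * d)  # thermal time constant, s

    # Lumped model T(t) = T_fire + (T0 - T_fire) * exp(-t / tau), solved for the threshold time
    t_damage = tau * np.log((T_fire - T0) / (T_fire - T_damage))
    print(f"time constant ~ {tau/60:.1f} min, time to reach {T_damage:.0f} C ~ {t_damage/60:.1f} min")
    ```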

  16. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
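    An exceedance-rate curve of the kind that underlies such an assessment can be assembled directly from scenario annual rates, as in the sketch below. The scenario rates and run-up heights are invented for illustration and have no connection to the study's source model.

    ```python
    import numpy as np

    # Hypothetical scenarios: (annual occurrence rate, tsunami run-up height in m at a site)
    scenarios = [(1e-2, 0.5), (5e-3, 1.2), (2e-3, 2.5), (5e-4, 4.0), (1e-4, 8.0)]

    def exceedance_rate(h):
        """Sum the annual rates of all scenarios whose run-up exceeds height h."""
        return sum(rate for rate, runup in scenarios if runup > h)

    for h in (0.5, 1.0, 2.0, 5.0):
        lam = exceedance_rate(h)
        # With Poisson occurrences, probability of at least one exceedance in 50 years:
        p50 = 1.0 - np.exp(-lam * 50.0)
        print(f"run-up > {h:.1f} m: rate {lam:.1e}/yr, 50-yr exceedance probability {p50:.3f}")
    ```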

  17. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  18. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571

  19. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, a risk-based method that defines whether or not to issue a warning or implement mitigation measures. With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. The cost-loss method is not widely used, because it is motivated by economic values only and is relatively static (no reasoning, just a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with such situations and responses were analysed, and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
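    The cost-loss rule summarized above (act when the cost/loss ratio is at most the forecast probability) reduces to a one-line comparison. The cost, loss, and probability values below are invented for illustration.

    ```python
    def issue_warning(response_cost, avoidable_loss, flood_probability):
        """Cost-loss rule: act when C / L <= p, i.e. the expected avoided loss covers the cost."""
        return (response_cost / avoidable_loss) <= flood_probability

    # Hypothetical values: 20k cost of mitigation, 200k of damage it would prevent
    for p in (0.05, 0.10, 0.30):
        print(f"forecast probability {p:.2f}: warn = {issue_warning(20_000, 200_000, p)}")
    ```

    With these invented numbers the cost/loss ratio is 0.10, so the rule issues a warning at forecast probabilities of 0.10 and above.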

  20. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  1. Cumulative herbivory outpaces compensation for early floral damage on a monocarpic perennial thistle

    USDA-ARS?s Scientific Manuscript database

    Floral herbivory presents a threat to plant reproductive success. Monocarpic plants should tolerate early apical damage with compensatory reproductive effort by subsequent flower heads during their single flowering season. However, the actual contribution of this tolerance response to net fitness is...

  2. In-situ monitoring and assessment of post barge-bridge collision damage for minimizing traffic delay and detour : final report.

    DOT National Transportation Integrated Search

    2016-07-31

    This report presents a novel framework for promptly assessing the probability of barge-bridge collision damage to piers, based on probabilistic classification through machine learning. The main idea of the presented framework is to divide th...

  3. Sensor Based Engine Life Calculation: A Probabilistic Perspective

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei; Chen, Philip

    2003-01-01

    It is generally known that an engine component will accumulate damage (life usage) during its lifetime of use in a harsh operating environment. The commonly used cycle count for engine component usage monitoring has an inherent range of uncertainty which can be overly costly or potentially less safe from an operational standpoint. With the advance of computer technology, engine operation modeling, and the understanding of damage accumulation physics, it is possible (and desirable) to use the available sensor information to make a more accurate assessment of engine component usage. This paper describes a probabilistic approach to quantify the effects of engine operating parameter uncertainties on the thermomechanical fatigue (TMF) life of a selected engine part. A closed-loop engine simulation with a TMF life model is used to calculate the life consumption of different mission cycles. A Monte Carlo simulation approach is used to generate the statistical life usage profile for different operating assumptions. The probabilities of failure of different operating conditions are compared to illustrate the importance of the engine component life calculation using sensor information. The results of this study clearly show that a sensor-based life cycle calculation can greatly reduce the risk of component failure as well as extend on-wing component life by avoiding unnecessary maintenance actions.
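    A minimal sketch of the sensor-based life-usage idea described above: sample uncertain operating severity for each mission, convert it to a per-mission life fraction with a placeholder damage model, and estimate the probability that accumulated usage exceeds the allowable life. The damage model, severity distributions, and all numbers are invented; they are not the paper's closed-loop engine simulation or TMF life model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_engines, n_missions = 1_000, 4_500

    # Engine-to-engine severity bias plus mission-to-mission scatter
    # (placeholders for sensed operating-parameter uncertainty)
    engine_bias = rng.normal(1.0, 0.05, size=(n_engines, 1))
    severity = engine_bias + rng.normal(0.0, 0.08, size=(n_engines, n_missions))

    # Hypothetical damage model: per-mission life fraction grows steeply with severity
    base_life_missions = 5_000.0
    usage = (severity**4).sum(axis=1) / base_life_missions

    print(f"P(life exhausted within {n_missions} missions) ~ {(usage >= 1.0).mean():.3f}")
    print(f"Fixed cycle-count estimate at nominal severity: {n_missions / base_life_missions:.2f}")
    ```

    The contrast between the last two lines is the point of the abstract: a fixed cycle count assumes every mission consumes the nominal life fraction, while the sensor-informed distribution shows which engines are actually at risk.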

  4. Biosentinel: Improving Desiccation Tolerance of Yeast Biosensors for Deep-Space Missions

    NASA Technical Reports Server (NTRS)

    Dalal, Sawan; Santa Maria, Sergio R.; Liddell, Lauren; Bhattacharya, Sharmila

    2017-01-01

    BioSentinel is one of 13 secondary payloads to be deployed on Exploration Mission 1 (EM-1) in 2019. We will use the budding yeast Saccharomyces cerevisiae as a biosensor to determine how deep-space radiation affects living organisms and to potentially quantify radiation levels through radiation damage analysis. Radiation can damage DNA through double strand breaks (DSBs), which can normally be repaired by homologous recombination. Two yeast strains will be air-dried and stored in microfluidic cards within the payload: a wild-type control strain and a radiation sensitive rad51 mutant that is deficient in DSB repairs. Throughout the mission, the microfluidic cards will be rehydrated with growth medium and an indicator dye. Growth rates of each strain will be measured through LED detection of the reduction of the indicator dye, which correlates with DNA repair and the amount of radiation damage accumulated. Results from BioSentinel will be compared to analog experiments on the ISS and on Earth. It is well known that desiccation can damage yeast cells and decrease viability over time. We performed a screen for desiccation-tolerant rad51 strains. We selected 20 re-isolates of rad51 and ran a weekly screen for desiccation-tolerant mutants for five weeks. Our data shows that viability decreases over time, confirming previous research findings. Isolates L2, L5 and L14 indicate desiccation tolerance and are candidates for whole-genome sequencing. More time is needed to determine whether a specific strain is truly desiccation tolerant. Furthermore, we conducted an intracellular trehalose assay to test how intracellular trehalose concentrations affect or protect the mutant strains against desiccation stress. S. cerevisiae cell and reagent concentrations from a previously established intracellular trehalose protocol did not yield significant absorbance measurements, so we tested varying cell and reagent concentrations and determined proper concentrations for successful

  5. NASA workshop on impact damage to composites

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.

    1991-01-01

    A compilation of slides presented at the NASA Workshop on Impact Damage to Composites held on March 19 and 20, 1991, at the Langley Research Center, Hampton, Virginia is given. The objective of the workshop was to review technology for evaluating impact damage tolerance of composite structures and identify deficiencies. Research, development, design methods, and design criteria were addressed. Actions to eliminate technology deficiencies were developed. A list of those actions and a list of attendees are also included.

  6. Damage Tolerant Analysis of Cracked Al 2024-T3 Panels repaired with Single Boron/Epoxy Patch

    NASA Astrophysics Data System (ADS)

    Mahajan, Akshay D.; Murthy, A. Ramachandra; Nanda Kumar, M. R.; Gopinath, Smitha

    2018-06-01

    It is known that damage tolerant analysis has two objectives, namely, remaining life prediction and residual strength evaluation. To achieve these objectives, determination of accurate and reliable fracture parameters is very important. XFEM methodologies for fatigue and fracture analysis of cracked aluminium panels repaired with single boron/epoxy patches of different shapes have been developed. Heaviside and asymptotic crack tip enrichment functions are employed to model the crack. XFEM formulations such as the displacement field formulation and the element stiffness matrix formulation are presented. The domain form of the interaction integral is employed to determine the stress intensity factor (SIF) of repaired cracked panels. Computed SIFs are incorporated in the Paris crack growth model to predict the remaining fatigue life. The residual strength has been computed using the remaining life approach, which accounts for both the crack growth constants and the number of cycles to failure. From the various studies conducted, it is observed that the patch repair significantly reduces the SIF at the crack tip, and hence both the residual strength and the remaining life of the patched cracked panels are improved significantly. The predicted remaining life and residual strength will be useful for the design of structures/components under fatigue loading.
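
    A minimal sketch of the remaining-life step named above: given a stress intensity factor (SIF) relation for the cracked panel, the Paris law is integrated in cycle blocks until the SIF range reaches the fracture toughness. The geometry factor, Paris constants, stress range, and toughness below are illustrative assumptions, not values from the paper; in the paper the SIF would come from the XFEM interaction integral.

    ```python
    import numpy as np

    C, m = 2.0e-12, 3.2          # Paris constants (assumed), da/dN in m/cycle
    K_Ic = 35.0                  # fracture toughness, MPa*sqrt(m) (assumed)
    delta_sigma = 120.0          # applied stress range, MPa (assumed)
    a = 2.0e-3                   # initial half crack length, m

    def delta_K(a, beta=1.0):
        """SIF range; beta is a geometry/repair factor (unity here for brevity)."""
        return beta * delta_sigma * np.sqrt(np.pi * a)

    cycles, block = 0, 1000      # integrate the Paris law in 1000-cycle blocks
    while delta_K(a) < K_Ic:
        a += C * delta_K(a) ** m * block
        cycles += block

    print(f"predicted remaining life ~ {cycles:,} cycles "
          f"(critical crack length {a * 1e3:.1f} mm)")
    ```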

  7. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  8. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts, probabilistic risk assessment, risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  9. Radiation-induced amorphization resistance and radiation tolerance in structurally related oxides.

    PubMed

    Sickafus, Kurt E; Grimes, Robin W; Valdez, James A; Cleave, Antony; Tang, Ming; Ishimaru, Manabu; Corish, Siobhan M; Stanek, Christopher R; Uberuaga, Blas P

    2007-03-01

    Ceramics destined for use in hostile environments such as nuclear reactors or waste immobilization must be highly durable and especially resistant to radiation damage effects. In particular, they must not be prone to amorphization or swelling. Few ceramics meet these criteria and much work has been devoted in recent years to identifying radiation-tolerant ceramics and the characteristics that promote radiation tolerance. Here, we examine trends in radiation damage behaviour for families of compounds related by crystal structure. Specifically, we consider oxides with structures related to the fluorite crystal structure. We demonstrate that improved amorphization resistance characteristics are to be found in compounds that have a natural tendency to accommodate lattice disorder.

  10. Mus308 Processes Oxygen and Nitrogen Ethylation DNA Damage in Germ Cells of Drosophila

    PubMed Central

    Díaz-Valdés, Nancy; Comendador, Miguel A.; Sierra, L. María

    2010-01-01

    The D. melanogaster mus308 gene, highly conserved among higher eukaryotes, is implicated in the repair of cross-links and of O-ethylpyrimidine DNA damage, working in a DNA damage tolerance mechanism. However, despite its relevance, its possible role in the processing of different DNA ethylation damages is not clear. To obtain data on mutation frequency and on mutation spectra in mus308 deficient (mus308−) conditions, the ethylating agent diethyl sulfate (DES) was analysed in postmeiotic male germ cells. These data were compared with those corresponding to mus308 efficient conditions. Our results indicate that Mus308 is necessary for the processing of oxygen and N-ethylation damage, for the survival of fertilized eggs depending on the level of induced DNA damage, and for an influence of the DNA damage neighbouring sequence. These results support the role of mus308 in a tolerance mechanism linked to a translesion synthesis pathway and also to the alternative end-joining system. PMID:20936147

  11. Probabilistic Learning by Rodent Grid Cells

    PubMed Central

    Cheung, Allen

    2016-01-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, even though this was one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  12. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code developments is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates the small and medium events, the latter takes into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
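
    The sketch below illustrates the aggregation step described above: exceedance probabilities from Poissonian (time-independent) area sources and a time-dependent characteristic fault are combined per PGA level, with logic-tree weights over two attenuation alternatives. All rates, conditional probabilities, and weights are placeholders, not values from the Tehran study.

    ```python
    import numpy as np

    pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5])          # g
    t = 50.0                                           # exposure time, years

    # Annual exceedance rates at the site for each attenuation branch (assumed).
    area_rates = {"atten_A": np.array([2e-2, 6e-3, 2e-3, 8e-4, 3e-4]),
                  "atten_B": np.array([3e-2, 8e-3, 3e-3, 1e-3, 4e-4])}
    weights = {"atten_A": 0.6, "atten_B": 0.4}

    # Time-dependent characteristic fault: 50-year conditional probability of the
    # characteristic event, and P(exceed level | event) per branch (assumed).
    p_event_50yr = 0.25
    p_exceed_given_event = {"atten_A": np.array([0.95, 0.7, 0.4, 0.2, 0.08]),
                            "atten_B": np.array([0.98, 0.8, 0.5, 0.3, 0.12])}

    p_total = np.zeros_like(pga)
    for branch, w in weights.items():
        p_area = 1.0 - np.exp(-area_rates[branch] * t)        # Poissonian sources
        p_fault = p_event_50yr * p_exceed_given_event[branch] # characteristic source
        p_total += w * (1.0 - (1.0 - p_area) * (1.0 - p_fault))

    for g, p in zip(pga, p_total):
        print(f"P(PGA > {g:.1f} g in 50 yr) = {p:.3f}")
    ```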

  13. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results will show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  14. New iron-sulfur clusters help hydrogenases tolerate oxygen.

    PubMed

    Grubel, Katarzyna; Holland, Patrick L

    2012-04-02

    One S less: recent crystallographic studies have revealed a new, oxygen-tolerant kind of iron-sulfide cluster [4Fe-3S], which contains only three rather than four sulfur atoms in its cage (see picture; yellow=S, red=Fe, blue=N, green=cysteine). It is proposed that the cluster's ability to transfer multiple electrons increases the oxygen tolerance by enabling the enzyme to reduce O(2) rapidly, converting the dioxygen into harmless water before it can damage the protein. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. A native plant competitor mediates the impact of above- and belowground damage on an invasive tree.

    PubMed

    Carrillo, Juli; Siemann, Evan

    2016-10-01

    Plant competition may mediate the impacts of herbivory on invasive plant species through effects on plant growth and defense. This may predictably depend on whether herbivory occurs above or below ground and on relative plant competitive ability. We simulated the potential impact of above- or belowground damage by biocontrol agents on the growth of a woody invader (Chinese tallow tree, Triadica sebifera) through artificial herbivory, with or without competition with a native grass, little bluestem (Schizachyrium scoparium). We measured two defense responses of Triadica through quantifying constitutive and induced extrafloral nectar production and tolerance of above- and belowground damage (root and shoot biomass regrowth). We examined genetic variation in plant growth and defense across native (China) and invasive (United States) Triadica populations. Without competition, aboveground damage had a greater impact than belowground damage on Triadica performance, whereas with competition, above- and belowground damage impacted Triadica similarly. Whole plant tolerance to damage below ground was negatively associated with tolerance to grass competitors, indicating tradeoffs in the ability to tolerate herbivory vs. compete. Competition reduced investment in defensive extrafloral nectar (EFN) production. Aboveground damage inhibited rather than induced EFN production while belowground plant damage did not impact aboveground nectar production. We found some support for the evolution of increased competitive ability hypothesis for invasive plants as United States plants were larger than native China plants and were more plastic in their response to biotic stressors than China plants (they altered their root to shoot ratios dependent on herbivory and competition treatments). Our results indicate that habitat type and the presence of competitors may be a larger determinant of herbivory impact than feeding mode and suggest that integrated pest management strategies including

  16. Detecting damage in full-scale honeycomb sandwich composite curved fuselage panels through frequency response

    NASA Astrophysics Data System (ADS)

    Leone, Frank A., Jr.; Ozevin, Didem; Mosinyi, Bao; Bakuckas, John G., Jr.; Awerbuch, Jonathan; Lau, Alan; Tan, Tein-Min

    2008-03-01

    Preliminary tests were conducted using frequency response (FR) characteristics to determine damage initiation and growth in a honeycomb sandwich graphite/epoxy curved panel. This investigation was part of a more general study of the damage tolerance characteristics of several such panels subjected to quasi-static internal pressurization combined with hoop and axial loading. The panels were tested at the Full-Scale Aircraft Structural Test Evaluation and Research (FASTER) facility located at the Federal Aviation Administration William J. Hughes Technical Center in Atlantic City, NJ. The overall program objective was to investigate the damage tolerance characteristics of full-scale composite curved aircraft fuselage panels and the evolution of damage under quasi-static loading up to failure. This paper focuses on one aspect of this comprehensive investigation: the effect of state-of-damage on the characteristics of the frequency response of the subject material. The results presented herein show that recording the frequency response could be used for real-time monitoring of damage growth and for determining damage severity in full-scale composite aircraft fuselage structures.
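
    As a toy illustration of the frequency-response idea described above, the sketch below compares the spectrum of a current structural response against a healthy baseline and tracks the drop in spectral correlation as a damage index. The signals are synthetic decaying sinusoids, and a shifting resonance stands in for damage growth; none of this reflects the actual panel data.

    ```python
    import numpy as np

    fs, n = 10_000, 4096
    t = np.arange(n) / fs

    def response(res_hz, damping=0.02):
        """Synthetic decaying sinusoid standing in for a measured panel response."""
        return np.exp(-damping * 2 * np.pi * res_hz * t) * np.sin(2 * np.pi * res_hz * t)

    baseline = np.abs(np.fft.rfft(response(1200.0)))
    for shift in (0.0, 10.0, 40.0, 120.0):            # growing "damage" -> peak shift
        current = np.abs(np.fft.rfft(response(1200.0 - shift)))
        corr = np.corrcoef(baseline, current)[0, 1]
        print(f"resonance shift {shift:5.0f} Hz -> FR correlation {corr:.3f}, "
              f"damage index {1.0 - corr:.3f}")
    ```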

  17. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members with national/international standing, the mission of the G-11 Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable, and reliable product development.

  18. Damage and Loss Estimation for Natural Gas Networks: The Case of Istanbul

    NASA Astrophysics Data System (ADS)

    Çaktı, Eser; Hancılar, Ufuk; Şeşetyan, Karin; Bıyıkoǧlu, Hikmet; Şafak, Erdal

    2017-04-01

    Natural gas networks are one of the major lifeline systems to support human, urban and industrial activities. The continuity of gas supply is critical for almost all functions of modern life. Under natural phenomena such as earthquakes and landslides, damage to the system elements may lead to explosions and fires, compromising human life and damaging the physical environment. Furthermore, the disruption in the gas supply puts human activities at risk and also results in economic losses. This study is concerned with the performance of one of the largest natural gas distribution systems in the world. Physical damages to Istanbul's natural gas network are estimated under the most recent probabilistic earthquake hazard models available, as well as under simulated ground motions from physics-based models. Several vulnerability functions are used in modelling damages to system elements. A first-order assessment of monetary losses to Istanbul's natural gas distribution network is also attempted.
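
    A minimal sketch of the kind of vulnerability-function step mentioned above: an assumed empirical relation converts ground shaking (here PGV) into a repair rate per kilometer of pipe, from which expected repairs and a Poisson damage probability follow per segment. The coefficient, exponent, and intensities are placeholders, not the Istanbul model.

    ```python
    import numpy as np

    segments_km = np.array([4.2, 7.5, 3.1, 9.8])       # pipe segment lengths, km
    pgv_cm_s = np.array([18.0, 32.0, 55.0, 40.0])      # simulated PGV per segment

    repair_rate_per_km = 0.002 * pgv_cm_s ** 1.3       # assumed vulnerability function
    expected_repairs = repair_rate_per_km * segments_km
    p_at_least_one = 1.0 - np.exp(-expected_repairs)   # Poisson damage occurrence

    for i, (n, p) in enumerate(zip(expected_repairs, p_at_least_one)):
        print(f"segment {i}: expected repairs = {n:.2f}, P(>=1 repair) = {p:.2f}")
    print(f"network total expected repairs: {expected_repairs.sum():.2f}")
    ```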

  19. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating the probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.

  20. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  1. Study of Composite Plate Damages Using Embedded PZT Sensors with Various Center Frequency

    NASA Astrophysics Data System (ADS)

    Kang, Kyoung-Tak; Chun, Heoung-Jae; Son, Ju-Hyun; Byun, Joon-Hyung; Um, Moon-Kwang; Lee, Sang-Kwan

    This study presents part of an experimental and analytical survey of candidate methods for damage detection in composite structures. Embedded piezoceramic (PZT) sensors were excited with a high-power ultrasonic wave generator, generating stress waves that propagate along the composite plate. The same embedded PZT sensors were used as receivers for acquiring the stress wave signals. The effect of the center frequency of the embedded sensors on damage identification capability was evaluated with known localized defects. The study was carried out to assess damage in the composite plate by fusing information from multiple sensing paths of the embedded network, based on the Hilbert transform, signal correlation, and probabilistic searching. The obtained results show that satisfactory detection of defects could be achieved by the proposed method.
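
    The sketch below shows one link of the signal-processing chain named above: Hilbert-transform envelopes of a baseline and a current guided-wave signal on a single sensing path are compared via correlation, giving a per-path damage index that a probabilistic search could then fuse across paths. The tone-burst signals, sampling rate, and scattered-echo "damage" are synthetic assumptions.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 1.0e6                                   # 1 MHz sampling (assumed)
    t = np.arange(0, 2e-3, 1.0 / fs)

    def tone_burst(center_hz, arrival, amp=1.0):
        """5-cycle Hann-windowed tone burst arriving at time `arrival` (seconds)."""
        tau = t - arrival
        duration = 5.0 / center_hz
        env = np.where((tau >= 0) & (tau <= duration),
                       0.5 * (1 - np.cos(2 * np.pi * tau / duration)), 0.0)
        return amp * env * np.sin(2 * np.pi * center_hz * tau)

    baseline = tone_burst(250e3, 0.4e-3)                       # healthy path
    current = baseline + tone_burst(250e3, 0.7e-3, amp=0.3)    # added scattered echo

    env_b = np.abs(hilbert(baseline))                          # Hilbert envelopes
    env_c = np.abs(hilbert(current))
    rho = np.corrcoef(env_b, env_c)[0, 1]
    print(f"path damage index DI = 1 - rho = {1.0 - rho:.3f}")
    ```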

  2. Probabilistic Geoacoustic Inversion in Complex Environments

    DTIC Science & Technology

    2015-09-30

    Probabilistic Geoacoustic Inversion in Complex Environments Jan Dettmer School of Earth and Ocean Sciences, University of Victoria, Victoria BC...long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must...project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is

  3. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  4. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  5. Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach

    USGS Publications Warehouse

    van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.

    2015-01-01

    Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focusses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding, current, wave impacts, and erosion-related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October, 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0
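
    To illustrate the idea of coupling local hazard indicators (LHIs) to a damage state, the sketch below uses a small discrete conditional-probability model with a naive-Bayes structure (LHIs treated as conditionally independent given the damage state). This is a simplified stand-in for a full Bayesian Network, and every probability is an illustrative assumption, not a value from the Rockaway case study.

    ```python
    import numpy as np

    damage_states = ["minor", "affected", "major", "destroyed"]
    prior = np.array([0.55, 0.25, 0.15, 0.05])       # assumed prior over damage states

    # P(LHI bin | damage state); columns follow damage_states (assumed values).
    p_depth = {"low":    np.array([0.7, 0.4, 0.2, 0.1]),
               "high":   np.array([0.3, 0.6, 0.8, 0.9])}
    p_wave  = {"none":   np.array([0.8, 0.5, 0.3, 0.1]),
               "attack": np.array([0.2, 0.5, 0.7, 0.9])}

    def posterior(depth_bin, wave_bin):
        like = p_depth[depth_bin] * p_wave[wave_bin]
        post = prior * like
        return post / post.sum()

    for obs in [("low", "none"), ("high", "none"), ("high", "attack")]:
        post = posterior(*obs)
        pred = damage_states[int(np.argmax(post))]
        print(obs, "->", dict(zip(damage_states, post.round(3))), "=>", pred)
    ```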

  6. Damage evaluation by a guided wave-hidden Markov model based method

    NASA Astrophysics Data System (ADS)

    Mei, Hanfei; Yuan, Shenfang; Qiu, Lei; Zhang, Jinjin

    2016-02-01

    Guided wave based structural health monitoring has shown great potential in aerospace applications. However, one of the key challenges of practical engineering applications is the accurate interpretation of the guided wave signals under time-varying environmental and operational conditions. This paper presents a guided wave-hidden Markov model based method to improve the damage evaluation reliability of real aircraft structures under time-varying conditions. In the proposed approach, an HMM-based unweighted moving average trend estimation method, which can capture the trend of damage propagation from the posterior probability obtained by HMM modeling, is used to achieve a probabilistic evaluation of the structural damage. To validate the developed method, experiments are performed on a hole-edge crack specimen under fatigue loading conditions and a real aircraft wing spar under changing structural boundary conditions. Experimental results show the advantage of the proposed method.
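
    A minimal sketch of the idea above: a two-state hidden Markov model (healthy / damaged) is filtered over a sequence of scalar damage-index observations, and an unweighted moving average of the posterior P(damaged) gives a smoothed damage-propagation trend. The transition matrix, Gaussian emissions, and synthetic observations are illustrative assumptions, not the paper's trained model.

    ```python
    import numpy as np

    A = np.array([[0.98, 0.02],      # transition: healthy -> {healthy, damaged}
                  [0.00, 1.00]])     # damage assumed non-healing
    means, stds = np.array([0.1, 0.6]), np.array([0.15, 0.2])   # Gaussian emissions

    def gaussian(x, mu, sd):
        return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

    rng = np.random.default_rng(1)
    obs = np.concatenate([rng.normal(0.1, 0.15, 60),      # healthy phase
                          rng.normal(0.6, 0.20, 40)])     # after crack initiation

    alpha = np.array([0.99, 0.01]) * gaussian(obs[0], means, stds)
    alpha /= alpha.sum()
    posterior = [alpha[1]]
    for x in obs[1:]:
        alpha = (alpha @ A) * gaussian(x, means, stds)    # forward recursion
        alpha /= alpha.sum()                              # filtered posterior
        posterior.append(alpha[1])

    window = 5                                            # unweighted moving average
    trend = np.convolve(posterior, np.ones(window) / window, mode="valid")
    print("P(damaged) near the end of the record:", np.round(trend[-3:], 3))
    ```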

  7. bayesPop: Probabilistic Population Projections

    PubMed Central

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  8. Impact Damage and Strain Rate Effects for Toughened Epoxy Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    Structural integrity of composite systems under dynamic impact loading is investigated herein. The GENOA virtual testing software environment is used to implement the effects of dynamic loading on fracture progression and damage tolerance. Combinations of graphite and glass fibers with a toughened epoxy matrix are investigated. The effect of a ceramic coating for the absorption of impact energy is also included. Impact and post-impact simulations include verification and prediction of (1) Load and Impact Energy, (2) Impact Damage Size, (3) Maximum Impact Peak Load, (4) Residual Strength, (5) Maximum Displacement, (6) Contribution of Failure Modes to Failure Mechanisms, (7) Prediction of Impact Load Versus Time, and (8) Damage and Fracture Pattern. A computer model is utilized for the assessment of structural response, progressive fracture, and defect/damage tolerance characteristics. Results show the damage progression sequence and the changes in the structural response characteristics due to dynamic impact. The fundamental premise of computational simulation is that the complete evaluation of composite fracture requires an assessment of ply and subply level damage/fracture processes as the structure is subjected to loads. Simulation results for the graphite/epoxy composite were compared with the impact and tension failure test data; correlation and verification were obtained for: (1) impact energy, (2) damage size, (3) maximum impact peak load, (4) residual strength, (5) maximum displacement, and (6) failure mechanisms of the composite structure.

  9. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.

  10. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  11. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
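
    As a small companion to the Markovian-arrivals material described above, the sketch below simulates a Poisson process and checks that waiting times between events are exponential and that counts in a fixed window are Poisson-distributed, against the textbook formulas. The rate and window length are arbitrary choices for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    lam, window, n_trials = 3.0, 2.0, 100_000   # events per unit time, window length

    counts = rng.poisson(lam * window, n_trials)       # counts in one window
    waits = rng.exponential(1.0 / lam, n_trials)       # inter-arrival times

    print(f"mean count in window : {counts.mean():.3f} (theory {lam * window:.3f})")
    print(f"mean waiting time    : {waits.mean():.3f} (theory {1.0 / lam:.3f})")
    print(f"P(no event in window): {(counts == 0).mean():.4f} "
          f"(theory {np.exp(-lam * window):.4f})")
    ```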

  12. Temperature-stress resistance and tolerance along a latitudinal cline in North American Arabidopsis lyrata.

    PubMed

    Wos, Guillaume; Willi, Yvonne

    2015-01-01

    The study of latitudinal gradients can yield important insights into adaptation to temperature stress. Two strategies are available: resistance by limiting damage, or tolerance by reducing the fitness consequences of damage. Here we studied latitudinal variation in resistance and tolerance to frost and heat and tested the prediction of a trade-off between the two strategies and their costliness. We raised plants of replicate maternal seed families from eight populations of North American Arabidopsis lyrata collected along a latitudinal gradient in climate chambers and exposed them repeatedly to either frost or heat stress, while a set of control plants grew under standard conditions. When control plants reached maximum rosette size, leaf samples were exposed to frost and heat stress, and electrolyte leakage (PEL) was measured and treated as an estimate of resistance. Difference in maximum rosette size between stressed and control plants was used as an estimate of tolerance. Northern populations were more frost resistant, and less heat resistant and less heat tolerant, but, unexpectedly, they were also less frost tolerant. Negative genetic correlations between resistance and tolerance to the same and different thermal stress were generally not significant, indicating only weak trade-offs. However, tolerance to frost was consistently accompanied by small size under control conditions, which may explain the non-adaptive latitudinal pattern for frost tolerance. Our results suggest that adaptation to frost and heat is not constrained by trade-offs between them. But the cost of frost tolerance in terms of plant size reduction may be important for the limits of species distributions and climate niches.

  13. Damage tolerance of woven graphite-epoxy buffer strip panels

    NASA Technical Reports Server (NTRS)

    Kennedy, John M.

    1990-01-01

    Graphite-epoxy panels with S glass buffer strips were tested in tension and shear to measure their residual strengths with crack-like damage. The buffer strips were regularly spaced narrow strips of continuous S glass. Panels were made with a uniweave graphite cloth where the S glass buffer material was woven directly into the cloth. Panels were made with different width and thickness buffer strips. The panels were loaded to failure while remote strain, strain at the end of the slit, and crack opening displacement were monitored. The notched region and nearby buffer strips were radiographed periodically to reveal crack growth and damage. Except for panels with short slits, the buffer strips arrested the propagating crack. The strength (or failing strain) of the panels was significantly higher than the strength of all-graphite panels with the same length slit. Panels with wide, thick buffer strips were stronger than panels with thin, narrow buffer strips. A shear-lag model predicted the failing strength of tension panels with wide buffer strips accurately, but over-estimated the strength of the shear panels and the tension panels with narrow buffer strips.

  14. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. The six parameters are space-to-chord ratio (SBYC), stagger angle

  15. A Bayesian state-space approach for damage detection and classification

    NASA Astrophysics Data System (ADS)

    Dzunic, Zoran; Chen, Justin G.; Mobahi, Hossein; Büyüköztürk, Oral; Fisher, John W.

    2017-11-01

    The problem of automatic damage detection in civil structures is complex and requires a system that can interpret collected sensor data into meaningful information. We apply our recently developed switching Bayesian model for dependency analysis to the problems of damage detection and classification. The model relies on a state-space approach that accounts for noisy measurement processes and missing data, which also infers the statistical temporal dependency between measurement locations signifying the potential flow of information within the structure. A Gibbs sampling algorithm is used to simultaneously infer the latent states, parameters of the state dynamics, the dependence graph, and any changes in behavior. By employing a fully Bayesian approach, we are able to characterize uncertainty in these variables via their posterior distribution and provide probabilistic estimates of the occurrence of damage or a specific damage scenario. We also implement a single class classification method which is more realistic for most real world situations where training data for a damaged structure is not available. We demonstrate the methodology with experimental test data from a laboratory model structure and accelerometer data from a real world structure during different environmental and excitation conditions.

  16. Engineering microbes for tolerance to next-generation biofuels

    PubMed Central

    2011-01-01

    A major challenge when using microorganisms to produce bulk chemicals such as biofuels is that the production targets are often toxic to cells. Many biofuels are known to reduce cell viability through damage to the cell membrane and interference with essential physiological processes. Therefore, cells must trade off biofuel production and survival, reducing potential yields. Recently, there have been several efforts towards engineering strains for biofuel tolerance. Promising methods include engineering biofuel export systems, heat shock proteins, membrane modifications, more general stress responses, and approaches that integrate multiple tolerance strategies. In addition, in situ recovery methods and media supplements can help to ease the burden of end-product toxicity and may be used in combination with genetic approaches. Recent advances in systems and synthetic biology provide a framework for tolerance engineering. This review highlights recent targeted approaches towards improving microbial tolerance to next-generation biofuels with a particular emphasis on strategies that will improve production. PMID:21936941

  17. Error Discounting in Probabilistic Category Learning

    PubMed Central

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    Some current theories of probabilistic categorization assume that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report two probabilistic-categorization experiments that investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less responsive to errors, and learning was slowed for some time after the feedback shift. Both results are indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning. PMID:21355666
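
    To make the error-discounting idea concrete, the sketch below runs a delta-rule learner that tracks a feedback probability, with a learning rate that decays over trials; the discounted learner adapts more slowly when the feedback probability shifts mid-experiment. The decay form and parameters are illustrative assumptions, not the models fitted in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    p_feedback = np.concatenate([np.full(200, 0.8), np.full(200, 0.4)])  # shift at trial 200
    feedback = rng.random(400) < p_feedback

    def run(discount):
        v, track = 0.5, []
        for trial, f in enumerate(feedback):
            lr = 0.2 / (1.0 + discount * trial)        # error-discounted learning rate
            v += lr * (float(f) - v)                   # delta rule
            track.append(v)
        return np.array(track)

    for d in (0.0, 0.02):
        est = run(d)
        print(f"discount={d}: estimate just before shift {est[199]:.2f}, "
              f"50 trials after shift {est[249]:.2f}")
    ```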

  18. Loss of desiccation tolerance in Copaifera langsdorffii Desf. seeds during germination.

    PubMed

    Pereira, W V S; Faria, J M R; Tonetti, O A O; Silva, E A A

    2014-05-01

    This study evaluated the loss of desiccation tolerance in C. langsdorffii seeds during the germination process. Seeds were imbibed for 24, 48, 72, 96, 120 and 144 hours and dried to the initial moisture content, kept in this state for 3 days after which they were submitted to pre-humidification and rehydration. Ultraestructural evaluations were done aiming to observe the cell damage caused by the dry process. Desiccation tolerance was evaluated in terms of the percentage of normal seedlings. Seeds not submitted to the drying process presented 61% of normal seedlings, and after 24 hours of imbibition, followed by drying, the seeds presented the same percentage of survival. However, after 48 hours of imbibition, seeds started to lose the desiccation tolerance. There was twenty six percent of normal seedlings formed from seeds imbibed for 96 hours and later dried and rehydrated. Only 5% of seeds imbibed for 144 hours, dried and rehydrated formed normal seedlings. At 144 hours of imbibition followed the dry process, there was damage into the cell structure, indicating that the seeds were unable to keep the cell structure during the drying process. Copaifera langsdorffii seeds loses the desiccation tolerance at the start of Phase 2 of imbibition.

  19. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks.

    PubMed

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which
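
    A minimal sketch of Fuzzy C-means (FCM), the probabilistic analog of K-means used above: instead of hard assignments, each sample receives a membership in every cluster. The data here are random stand-ins for GFP-peak topographies; the number of microstates and the fuzzifier are conventional choices, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 64))        # 500 GFP peaks x 64 channels (synthetic)
    k, m, n_iter = 4, 2.0, 50             # 4 microstates, fuzzifier m = 2

    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))               # unnormalized memberships
        u /= u.sum(axis=1, keepdims=True)            # rows sum to 1
        w = u ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None] # membership-weighted centers

    # Soft labels: probability-like membership of each GFP peak in each microstate.
    print("first peak memberships:", np.round(u[0], 3), "sum =", u[0].sum().round(3))
    ```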

  20. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks

    PubMed Central

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which

  1. Error Mitigation of Point-to-Point Communication for Fault-Tolerant Computing

    NASA Technical Reports Server (NTRS)

    Akamine, Robert L.; Hodson, Robert F.; LaMeres, Brock J.; Ray, Robert E.

    2011-01-01

    Fault tolerant systems require the ability to detect and recover from physical damage caused by the hardware's environment, faulty connectors, and system degradation over time. This ability applies to military, space, and industrial computing applications. The integrity of Point-to-Point (P2P) communication, between two microcontrollers for example, is an essential part of fault tolerant computing systems. In this paper, different methods of fault detection and recovery are presented and analyzed.

  2. Some Observations on Damage Tolerance Analyses in Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Dawicke, David S.; Hampton, Roy W.

    2017-01-01

    AIAA standards S080 and S081 are applicable for certification of metallic pressure vessels (PV) and composite overwrap pressure vessels (COPV), respectively. These standards require damage tolerance analyses with a minimum reliably detectable flaw/crack and demonstration of a safe life of four times the service life with these cracks at the worst-case location in the PVs and oriented perpendicular to the maximum principal tensile stress. The standards require consideration of semi-elliptical surface cracks in the range of aspect ratios (crack depth a to half of the surface length c, i.e., (a/c) of 0.2 to 1). NASA-STD-5009 provides the minimum reliably detectable standard crack sizes (90/95 probability of detection, POD) for several non-destructive evaluation (NDE) methods (eddy current (ET), penetrant (PT), radiography (RT), and ultrasonic (UT)) for the two limits of the aspect ratio range required by the AIAA standards. This paper tries to answer the questions: can the safe life analysis consider only the life for the crack sizes at the two required limits, or endpoints, of the (a/c) range for the NDE method used, or does the analysis need to consider values within that range? What would be an appropriate method to interpolate 90/95 POD crack sizes at intermediate (a/c) values? Several procedures to develop combinations of a and c within the specified range are explored. A simple linear relationship between a and c is chosen to compare the effects of seven different approaches to determine intermediate combinations of a and c that lie between the (a/c) endpoints. Two of the seven are selected for evaluation: Approach I, the simple linear relationship, and a more conservative option, Approach III. For each of these two Approaches, the lives are computed for initial semi-elliptic crack configurations in a plate subjected to remote tensile fatigue loading with an R-ratio of 0.1, for an assumed material evaluated using NASGRO (registered trademark) version 8.1. These calculations demonstrate
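
    The sketch below illustrates one way of generating intermediate 90/95 POD crack sizes between the two required aspect-ratio endpoints, assuming (as in the simple option discussed above) a linear relationship between crack depth a and half-length c through the endpoints. The endpoint sizes are placeholders, not NASA-STD-5009 values.

    ```python
    import numpy as np

    # Endpoint 90/95 POD cracks for a hypothetical NDE method: (a, c) in inches.
    a1, c1 = 0.020, 0.100    # a/c = 0.2 endpoint (placeholder)
    a2, c2 = 0.050, 0.050    # a/c = 1.0 endpoint (placeholder)

    # Linear a-c relationship through the two endpoints: a = slope * c + intercept.
    slope = (a2 - a1) / (c2 - c1)
    intercept = a1 - slope * c1

    for c in np.linspace(c1, c2, 5):
        a = slope * c + intercept
        print(f"c = {c:.3f} in, a = {a:.3f} in, a/c = {a / c:.2f}")
    ```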

  3. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  4. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention to the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismic resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages of implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full-waveform tsunami computation in lieu of attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
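
    A minimal sketch of the Green's-function summation step described above: tsunami waveforms precomputed at a coastal point for unit slip on each subfault are scaled by a scenario's slip distribution and summed to obtain the scenario waveform. The stored waveforms here are synthetic placeholders rather than output of a tsunami propagation model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_subfaults, n_t = 12, 2048
    t = np.linspace(0, 4 * 3600, n_t)                    # 4 hours, in seconds

    # Precomputed unit-slip waveforms at one coastal site (placeholder wave packets,
    # meters of sea-surface height per meter of slip), staggered in arrival time.
    arrivals = 1800 + 150 * np.arange(n_subfaults)
    greens = np.array([np.exp(-((t - ta) / 600.0) ** 2)
                       * np.sin(2 * np.pi * (t - ta) / 900.0)
                       for ta in arrivals])              # shape (n_subfaults, n_t)

    slip = rng.uniform(0.5, 8.0, n_subfaults)            # scenario slip, meters
    eta = slip @ greens                                  # summed coastal waveform

    print(f"peak coastal amplitude for this scenario: {eta.max():.2f} m")
    ```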

  5. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2005-12-13

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  6. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2009-04-14

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  7. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Rigali, Mark J.; Sutaria, Manish P.; Mulligan, Anthony C.; Popovich, Dragan

    2004-03-23

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  8. Probabilistic dual heuristic programming-based adaptive critic

    NASA Astrophysics Data System (ADS)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic (DHP) AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  9. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
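    The record's analysis is elastoplastic and uses NESSUS with the AMV procedure; the short sketch below only illustrates the general probabilistic workflow (sampling a Weibull yield stress and a normal internal pressure, then forming an empirical CDF and a first-yield probability) using the closed-form elastic Lamé solution as a stand-in response. Geometry and distribution parameters are assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    # Random inputs: distribution *types* follow the abstract, parameters are assumed.
    pressure = rng.normal(loc=100.0, scale=10.0, size=n)        # internal pressure, MPa
    yield_stress = 400.0 * rng.weibull(a=15.0, size=n)          # material yield stress, MPa

    a_in, b_out, r = 50.0, 100.0, 60.0                          # assumed geometry, mm

    # Elastic Lame solution used as a stand-in response function.
    sigma_r = pressure * a_in**2 / (b_out**2 - a_in**2) * (1.0 - b_out**2 / r**2)

    # Empirical CDF of the radial stress at a chosen level.
    level = -45.0
    print("P(sigma_r <= %.1f MPa) = %.4f" % (level, np.mean(sigma_r <= level)))

    # Probability of first yield at the bore (Tresca: sigma_theta - sigma_r >= sigma_y).
    tresca = 2.0 * pressure * b_out**2 / (b_out**2 - a_in**2)
    print("P(first yield at inner wall) = %.4f" % np.mean(tresca >= yield_stress))
    ```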

  10. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  11. The case for probabilistic forecasting in hydrology

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, Roman

    2001-08-01

    That forecasts should be stated in probabilistic, rather than deterministic, terms has been argued from common sense and decision-theoretic perspectives for almost a century. Yet most operational hydrological forecasting systems produce deterministic forecasts and most research in operational hydrology has been devoted to finding the 'best' estimates rather than quantifying the predictive uncertainty. This essay presents a compendium of reasons for probabilistic forecasting of hydrological variates. Probabilistic forecasts are scientifically more honest, enable risk-based warnings of floods, enable rational decision making, and offer additional economic benefits. The growing demand for information about risk and the rising capability to quantify predictive uncertainties create an unparalleled opportunity for the hydrological profession to dramatically enhance the forecasting paradigm.

  12. Controlling the self-organizing dynamics in a sandpile model on complex networks by failure tolerance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, Junjian; Pfenninger, Stefan

    In this paper, we propose a strategy to control the self-organizing dynamics of the Bak-Tang-Wiesenfeld (BTW) sandpile model on complex networks by allowing some degree of failure tolerance for the nodes and introducing additional active dissipation while taking the risk of possible node damage. We show that the probability for large cascades significantly increases or decreases respectively when the risk for node damage outweighs the active dissipation and when the active dissipation outweighs the risk for node damage. By considering the potential additional risk from node damage, a non-trivial optimal active dissipation control strategy which minimizes the total cost in the system can be obtained. Under some conditions the introduced control strategy can decrease the total cost in the system compared to the uncontrolled model. Moreover, when the probability of damaging a node experiencing failure tolerance is greater than the critical value, then no matter how successful the active dissipation control is, the total cost of the system will have to increase. This critical damage probability can be used as an indicator of the robustness of a network or system. Copyright (C) EPLA, 2015
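    One possible toy reading of such a controlled sandpile is sketched below in Python (NumPy plus networkx): a node that would topple instead tolerates the excess with an assumed probability, at an assumed risk of being damaged and removed, and shed grains are dissipated in transit. The rules and all parameter values are assumptions for illustration and do not reproduce the paper's model.

    ```python
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)

    def drop_grain(G, load, tol_prob=0.1, damage_prob=0.02, dissipation=0.05):
        """Drop one grain and relax; return the avalanche size (number of topplings).

        Assumed rules: a node whose load reaches its degree topples and sheds one
        grain to each neighbour (each grain is lost in transit with probability
        `dissipation`).  With probability `tol_prob` the node instead tolerates
        the excess this step, but is then damaged and removed from the network
        with probability `damage_prob`.
        """
        start = int(rng.choice(list(G.nodes)))
        load[start] += 1
        unstable, topplings = [start], 0
        while unstable:
            v = unstable.pop()
            if v not in G or G.degree(v) == 0 or load[v] < G.degree(v):
                continue
            if rng.random() < tol_prob:                 # failure tolerance ...
                if rng.random() < damage_prob:          # ... at the risk of damage
                    G.remove_node(v)
                continue
            topplings += 1
            load[v] -= G.degree(v)
            for u in list(G.neighbors(v)):
                if rng.random() > dissipation:          # grain lost in transit otherwise
                    load[u] += 1
                    if load[u] >= G.degree(u):
                        unstable.append(u)
            if load[v] >= G.degree(v):                  # still unstable: relax again
                unstable.append(v)
        return topplings

    G = nx.random_regular_graph(4, 500, seed=2)
    load = {v: int(rng.integers(0, 4)) for v in G.nodes}
    sizes = []
    for _ in range(2000):
        if G.number_of_nodes() == 0:
            break
        sizes.append(drop_grain(G, load))
    print("mean avalanche size: %.2f, nodes damaged: %d"
          % (np.mean(sizes), 500 - G.number_of_nodes()))
    ```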

  13. Genome-scale analyses of butanol tolerance in Saccharomyces cerevisiae reveal an essential role of protein degradation

    PubMed Central

    2013-01-01

    Background: n-Butanol and isobutanol produced from biomass-derived sugars are promising renewable transport fuels and solvents. Saccharomyces cerevisiae has been engineered for butanol production, but its high butanol sensitivity poses an upper limit to product titers that can be reached by further pathway engineering. A better understanding of the molecular basis of butanol stress and tolerance of S. cerevisiae is important for achieving improved tolerance. Results: By combining a screening of the haploid S. cerevisiae knock-out library, gene overexpression, and genome analysis of evolutionary engineered n-butanol-tolerant strains, we established that protein degradation plays an essential role in tolerance. Strains deleted in genes involved in the ubiquitin-proteasome system and in vacuolar degradation of damaged proteins showed hypersensitivity to n-butanol. Overexpression of YLR224W, encoding the subunit responsible for the recognition of damaged proteins of a ubiquitin ligase complex, resulted in a strain with a higher n-butanol tolerance. Two independently evolved n-butanol-tolerant strains carried different mutations in both RPN4 and RTG1, which encode transcription factors involved in the expression of proteasome and peroxisomal genes, respectively. Introduction of these mutated alleles in the reference strain increased butanol tolerance, confirming their relevance in the higher tolerance phenotype. The evolved strains, in addition to n-butanol, were also more tolerant to 2-butanol, isobutanol and 1-propanol, indicating a common molecular basis for sensitivity and tolerance to C3 and C4 alcohols. Conclusions: This study shows that maintenance of protein integrity plays an essential role in butanol tolerance and demonstrates new promising targets to engineer S. cerevisiae for improved tolerance. PMID:23552365

  14. Processing of probabilistic information in weight perception and motor prediction.

    PubMed

    Trampenau, Leif; van Eimeren, Thilo; Kuhtz-Buschbeck, Johann

    2017-02-01

    We studied the effects of probabilistic cues, i.e., of information of limited certainty, in the context of an action task (GL: grip-lift) and of a perceptual task (WP: weight perception). Normal subjects (n = 22) saw four different probabilistic visual cues, each of which announced the likely weight of an object. In the GL task, the object was grasped and lifted with a pinch grip, and the peak force rates indicated that the grip and load forces were scaled predictively according to the probabilistic information. The WP task provided the expected heaviness related to each probabilistic cue; the participants gradually adjusted the object's weight until its heaviness matched the expected weight for a given cue. Subjects were randomly assigned to two groups: one started with the GL task and the other one with the WP task. The four different probabilistic cues influenced weight adjustments in the WP task and peak force rates in the GL task in a similar manner. The interpretation and utilization of the probabilistic information was critically influenced by the initial task. Participants who started with the WP task classified the four probabilistic cues into four distinct categories and applied these categories to the subsequent GL task. On the other hand, participants who started with the GL task applied three distinct categories to the four cues and retained this classification in the following WP task. The initial strategy, once established, determined how the probabilistic information was interpreted and implemented.

  15. Lipophilic components of the brown seaweed, Ascophyllum nodosum, enhance freezing tolerance in Arabidopsis thaliana.

    PubMed

    Rayirath, Prasanth; Benkel, Bernhard; Mark Hodges, D; Allan-Wojtas, Paula; Mackinnon, Shawna; Critchley, Alan T; Prithiviraj, Balakrishnan

    2009-06-01

    Extracts of the brown seaweed Ascophyllum nodosum enhance plant tolerance against environmental stresses such as drought, salinity, and frost. However, the molecular mechanisms underlying this improved stress tolerance and the nature of the bioactive compounds present in the seaweed extracts that elicit stress tolerance remain largely unknown. We investigated the effect of A. nodosum extracts and its organic sub-fractions on freezing tolerance of Arabidopsis thaliana. Ascophyllum nodosum extracts and its lipophilic fraction significantly increased tolerance to freezing temperatures in in vitro and in vivo assays. Untreated plants exhibited severe chlorosis and tissue damage and failed to recover from freezing treatments, while the extract-treated plants recovered from freezing temperatures of -7.5 degrees C in in vitro and -5.5 degrees C in in vivo assays. Electrolyte leakage measurements revealed that the LT50 value was lowered by 3 degrees C, while cell viability staining demonstrated a 30-40% reduction in the area of damaged tissue in extract-treated plants as compared to water controls. Moreover, histological observations of leaf sections revealed that the extracts have a significant effect on maintaining membrane integrity during freezing stress. Treated plants exhibited 70% less chlorophyll damage during freezing recovery as compared to the controls, and this correlated with reduced expression of the chlorophyllase genes AtCHL1 and AtCHL2. Further, the A. nodosum extract treatment modulated the expression of the cold response genes COR15A, RD29A, and CBF3, resulting in enhanced tolerance to freezing temperatures. A more than 2.6-fold increase in expression of RD29A, a 1.8-fold increase of CBF3 and a two-fold increase in the transcript level of COR15A were observed in plants treated with the lipophilic fraction of A. nodosum at -2 degrees C. Taken together, the results suggest that chemical components in A. nodosum extracts protect membrane integrity and affect the expression of

  16. Damage prognosis: the future of structural health monitoring.

    PubMed

    Farrar, Charles R; Lieven, Nick A J

    2007-02-15

    This paper concludes the theme issue on structural health monitoring (SHM) by discussing the concept of damage prognosis (DP). DP attempts to forecast system performance by assessing the current damage state of the system (i.e. SHM), estimating the future loading environments for that system, and predicting through simulation and past experience the remaining useful life of the system. The successful development of a DP capability will require the further development and integration of many technology areas including both measurement/processing/telemetry hardware and a variety of deterministic and probabilistic predictive modelling capabilities, as well as the ability to quantify the uncertainty in these predictions. The multidisciplinary and challenging nature of the DP problem, its current embryonic state of development, and its tremendous potential for life-safety and economic benefits qualify DP as a 'grand challenge' problem for engineers in the twenty-first century.

  17. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate the need for aircraft structures to meet damage tolerance requirements throughout the service life. These requirements imply that the damaged aircraft structure must maintain adequate residual strength in order to sustain its integrity, which is accomplished by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy comprised of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system comprised of three elements to a parallel system comprised of up to six elements. These newly developed expressions will be used to check the accuracy of the implementation of a Monte Carlo simulation algorithm to determine the probability of failure of a parallel system comprised of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme by utilizing the residual strength of the fasteners subjected to various initial load distributions and then subjected to a new unequal load distribution resulting from subsequent fastener sequential failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the
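    The equal-load-sharing parallel-system portion of this can be sketched with a short Monte Carlo: for sorted element strengths, the system capacity is the largest load the surviving elements can jointly carry, and the failure probability is estimated by counting samples whose capacity falls below the applied load. The strength distribution and parameter values below are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def bundle_capacity(strengths):
        """Capacity of an equal-load-sharing parallel system.

        With strengths sorted ascending, after the k weakest elements fail the
        remaining (n - k) elements share the load equally, so the system holds
        max_k (n - k) * s_(k+1).
        """
        s = np.sort(strengths, axis=-1)
        n = s.shape[-1]
        survivors = n - np.arange(n)          # n, n-1, ..., 1
        return (s * survivors).max(axis=-1)

    n_elements, n_samples, applied_load = 6, 200_000, 3.0
    # Element strengths assumed lognormal and statistically independent.
    strengths = rng.lognormal(mean=0.0, sigma=0.25, size=(n_samples, n_elements))
    caps = bundle_capacity(strengths)
    print("P(failure) = %.4f" % np.mean(caps < applied_load))
    ```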

  18. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
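    A minimal sketch of the underlying calculation, assuming a Monte Carlo integration of the bivariate Gaussian location-error density over a disk centred on the facility (all coordinates, standard errors, and the ellipse orientation below are made up for the example):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def prob_within_radius(mu, cov, facility, radius, n=1_000_000):
        """P(stroke within `radius` of `facility`) for a bivariate-Gaussian
        stroke-location error (mean `mu`, covariance `cov`), by Monte Carlo."""
        pts = rng.multivariate_normal(mu, cov, size=n)
        d = np.linalg.norm(pts - np.asarray(facility), axis=1)
        return np.mean(d <= radius)

    # Hypothetical numbers: error-ellipse centre 400 m east / 150 m north of the
    # facility, 250 m and 120 m standard errors, ellipse rotated by 30 degrees.
    theta = np.deg2rad(30.0)
    R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    cov = R @ np.diag([250.0**2, 120.0**2]) @ R.T
    p = prob_within_radius([400.0, 150.0], cov, [0.0, 0.0], 500.0)
    print("P(stroke within 500 m of the facility) = %.3f" % p)
    ```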

  19. Damage tolerance of pressurized graphite/epoxy tape cylinders under uniaxial and biaxial loading. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Priest, Stacy Marie

    1993-01-01

    The damage tolerance behavior of internally pressurized, axially slit, graphite/epoxy tape cylinders was investigated. Specifically, the effects of axial stress, structural anisotropy, and subcritical damage were considered. In addition, the limitations of a methodology which uses coupon fracture data to predict cylinder failure were explored. This predictive methodology was previously shown to be valid for quasi-isotropic fabric and tape cylinders but invalid for structurally anisotropic (±45/90)_s and (±45/0)_s cylinders. The effects of axial stress and structural anisotropy were assessed by testing tape cylinders with (90/0/±45)_s, (±45/90)_s, and (±45/0)_s layups in a uniaxial test apparatus, specially designed and built for this work, and comparing the results to previous tests conducted in biaxial loading. Structural anisotropy effects were also investigated by testing cylinders with the quasi-isotropic (0/±45/90)_s layup, which is a stacking sequence variation of the previously tested (90/0/±45)_s layup with higher D_16 and D_26 terms but comparable D_16 and D_26 to D_11 ratios. All cylinders tested and used for comparison are made from AS4/3501-6 graphite/epoxy tape and have a diameter of 305 mm. Cylinder slit lengths range from 12.7 to 50.8 mm. Failure pressures are lower for the uniaxially loaded cylinders in all cases. The smallest percent failure pressure decreases are observed for the (±45/90)_s cylinders, while the greatest such decreases are observed for the (±45/0)_s cylinders. The relative effects of the axial stress on the cylinder failure pressures do not correlate with the degree of structural coupling. The predictive methodology is not applicable for uniaxially loaded (±45/90)_s and (±45/0)_s cylinders, may be applicable for uniaxially loaded (90/0/±45)_s cylinders, and is applicable for the biaxially loaded (90/0/±45)_s and (0

  20. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  1. Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System

    DTIC Science & Technology

    2014-06-01

    in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. The Reference Architecture for... ontology, terrorism, inferential reasoning, architecture. I. INTRODUCTION. A. Background. Whether by nature or design, the personas of terrorists are

  2. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material, geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms and the addition of Monte Carlo simulation as an alternate. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.

  3. Probabilistic liver atlas construction.

    PubMed

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of being covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.

  4. Single-molecule live-cell imaging of bacterial DNA repair and damage tolerance.

    PubMed

    Ghodke, Harshad; Ho, Han; van Oijen, Antoine M

    2018-02-19

    Genomic DNA is constantly under threat from intracellular and environmental factors that damage its chemical structure. Uncorrected DNA damage may impede cellular propagation or even result in cell death, making it critical to restore genomic integrity. Decades of research have revealed a wide range of mechanisms through which repair factors recognize damage and co-ordinate repair processes. In recent years, single-molecule live-cell imaging methods have further enriched our understanding of how repair factors operate in the crowded intracellular environment. The ability to follow individual biochemical events, as they occur in live cells, makes single-molecule techniques tremendously powerful to uncover the spatial organization and temporal regulation of repair factors during DNA-repair reactions. In this review, we will cover practical aspects of single-molecule live-cell imaging and highlight recent advances accomplished by the application of these experimental approaches to the study of DNA-repair processes in prokaryotes. © 2018 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  5. Overexpression of CsCaM3 Improves High Temperature Tolerance in Cucumber

    PubMed Central

    Yu, Bingwei; Yan, Shuangshuang; Zhou, Huoyan; Dong, Riyue; Lei, Jianjun; Chen, Changming; Cao, Bihao

    2018-01-01

    High temperature (HT) stress affects the growth and production of cucumbers, but genetic resources with high heat tolerance are very scarce in this crop. Calmodulin (CaM) has been confirmed to be related to the regulation of HT stress resistance in plants. CsCaM3, a CaM gene, was isolated from cucumber inbred line “02-8.” Its expression was characterized in the present study. CsCaM3 transcripts differed among the organs and tissues of cucumber plants and could be induced by HTs or abscisic acid, but not by salicylic acid. CsCaM3 transcripts exhibited subcellular localization to the cytoplasm and nuclei of cells. Overexpression of CsCaM3 in cucumber plants has the potential to improve their heat tolerance and protect against oxidative damage and photosynthesis system damage by regulating the expression of HT-responsive genes in plants, including chlorophyll catabolism-related genes under HT stress. Taken together, our results provide useful insights into stress tolerance in cucumber. PMID:29946334

  6. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 bn in 2010 and US$ 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1km x 1km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to
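    The risk metric used here, annual expected damage, is the integral of damage over annual exceedance probability. A minimal sketch with assumed return-period/damage pairs:

    ```python
    import numpy as np

    # Hypothetical damage estimates (US$ bn) for a set of return periods (years).
    return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0, 250.0, 1000.0])
    damages = np.array([0.0, 0.3, 0.8, 1.6, 2.4, 3.1, 4.0, 5.2])   # assumed values

    # Expected annual damage = integral of damage over annual exceedance probability.
    p_exc = 1.0 / return_periods
    order = np.argsort(p_exc)                 # integrate from rare to frequent events
    p, d = p_exc[order], damages[order]
    ead = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))   # trapezoidal rule
    print("expected annual damage: %.2f bn US$ per year" % ead)
    ```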

  7. A framework for probabilistic pluvial flood nowcasting for urban areas

    NASA Astrophysics Data System (ADS)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick

    2016-04-01

    Pluvial flood nowcasting is gaining ground not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members with a lead time of up to 2 hours using a 4 C-band radar composite as input. Forecasts' verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires to people in the region that were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the

  8. Fatigue Damage Mechanisms in Advanced Hybrid Titanium Composite Laminates

    NASA Technical Reports Server (NTRS)

    Johnson, W. Steven; Rhymer, Donald W.; St.Clair, Terry L. (Technical Monitor)

    2000-01-01

    Hybrid Titanium Composite Laminates (HTCL) are a type of hybrid composite laminate with promise for high-speed aerospace applications, specifically designed for improved damage tolerance and strength at high temperature (350 F, 177 C). However, in previous testing, HTCL demonstrated a propensity to excessive delamination at the titanium/PMC interface following titanium cracking. An advanced HTCL has been constructed with an emphasis on strengthening this interface, combining a PETI-5/IM7 PMC with Ti-15-3 foils prepared with an alkaline-perborate surface treatment. This paper discusses how the fatigue capabilities of the "advanced" HTCL compare to the first generation HTCL which was not modified for interface optimization, in both tension-tension (R = 0.1) and tension-compression (R = -0.2). The advanced HTCL did not demonstrate a significant improvement in fatigue life, in either tension-tension or tension-compression loading. However, the advanced HTCL proved much more damage tolerant. The R = 0.1 tests revealed the advanced HTCL to increase the fatigue life following initial titanium ply damage up to 10X that of the initial HTCL at certain stress levels. The damage progression following the initial ply damage demonstrated the effect of the strengthened PMC/titanium interface. Acetate film replication of the advanced HTCL edges showed a propensity for some fibers in the adjacent PMC layers to fail at the point of titanium crack formation, suppressing delamination at the Ti/PMC interface. The inspection of failure surfaces validated these findings, revealing PMC fibers bonded to the majority of the titanium surfaces. Tension-compression fatigue (R = -0.2) demonstrated the same trends in cycles between initial damage and failure, damage progression, and failure surfaces. Moreover, in possessing a higher resistance to delamination, the advanced HTCL did not exhibit buckling following initial titanium ply cracking under compression, unlike the initial HTCL.

  9. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep large numbers of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.

  10. Multiple Damage Progression Paths in Model-Based Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Goebel, Kai Frank

    2011-01-01

    Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in its own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active.
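    A heavily simplified sketch of the joint state-parameter particle filter idea follows: a scalar damage state with an unknown wear-rate parameter, Gaussian noise, and plain multinomial resampling. The paper's pump model and variance-control mechanism are not reproduced; all values are assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_particles, n_steps, dt = 2000, 60, 1.0

    # True (hidden) system: damage grows at an unknown rate and is observed noisily.
    true_rate = 0.03
    true_damage = np.cumsum(np.full(n_steps, true_rate * dt))
    obs = true_damage + rng.normal(0.0, 0.05, n_steps)

    # Particles jointly carry the damage state and the wear-rate parameter.
    damage = np.zeros(n_particles)
    rate = rng.uniform(0.0, 0.1, n_particles)

    for y in obs:
        # Propagate: state follows the wear model; a small random walk on the
        # parameter keeps the joint estimate from collapsing (a crude stand-in
        # for the paper's variance control).
        rate += rng.normal(0.0, 0.002, n_particles)
        damage += rate * dt + rng.normal(0.0, 0.01, n_particles)
        # Weight by the measurement likelihood and resample.
        w = np.exp(-0.5 * ((y - damage) / 0.05) ** 2)
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)
        damage, rate = damage[idx], rate[idx]

    print("estimated wear rate: %.3f (true %.3f)" % (rate.mean(), true_rate))
    # Remaining useful life: distribution of time until damage reaches a threshold.
    threshold = 3.0
    rul = (threshold - damage) / np.maximum(rate, 1e-6)
    print("median RUL: %.1f steps" % np.median(rul))
    ```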

  11. Fully probabilistic control design in an adaptive critic framework.

    PubMed

    Herzallah, Randa; Kárný, Miroslav

    2011-12-01

    An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description to the desired one. Practical exploitation of the fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem; in particular, very hard multivariate integration and an approximate interpolation of the involved multivariate functions. This paper proposes a new fully probabilistic control algorithm that uses the adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is a main contribution of this paper. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Application of Probabilistic Analysis to Aircraft Impact Dynamics

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.

    2003-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.

  13. 40 CFR 180.1180 - Kaolin; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., fungus, and bacterial damage to plants. This temporary exemption from the requirement of a tolerance will... control of insects, fungi, and bacteria (food/feed use). [62 FR 19685, Apr. 23, 1997, as amended at 63 FR...

  14. 40 CFR 180.1180 - Kaolin; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., fungus, and bacterial damage to plants. This temporary exemption from the requirement of a tolerance will... control of insects, fungi, and bacteria (food/feed use). [62 FR 19685, Apr. 23, 1997, as amended at 63 FR...

  15. 40 CFR 180.1180 - Kaolin; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., fungus, and bacterial damage to plants. This temporary exemption from the requirement of a tolerance will... control of insects, fungi, and bacteria (food/feed use). [62 FR 19685, Apr. 23, 1997, as amended at 63 FR...

  16. 40 CFR 180.1180 - Kaolin; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., fungus, and bacterial damage to plants. This temporary exemption from the requirement of a tolerance will... control of insects, fungi, and bacteria (food/feed use). [62 FR 19685, Apr. 23, 1997, as amended at 63 FR...

  17. 40 CFR 180.1180 - Kaolin; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., fungus, and bacterial damage to plants. This temporary exemption from the requirement of a tolerance will... control of insects, fungi, and bacteria (food/feed use). [62 FR 19685, Apr. 23, 1997, as amended at 63 FR...

  18. Baseline-free damage detection in composite plates based on the reciprocity principle

    NASA Astrophysics Data System (ADS)

    Huang, Liping; Zeng, Liang; Lin, Jing

    2018-01-01

    Lamb wave based damage detection techniques have been widely used in composite structures. In particular, these techniques usually rely on reference signals, which are significantly influenced by the operational and environmental conditions. To solve this issue, this paper presents a baseline-free damage inspection method based on the reciprocity principle. If a localized nonlinear scatterer exists along the wave path, the reciprocity breaks down. Through estimating the loss of reciprocity, the delamination could be detected. A reciprocity index (RI), which compares the discrepancy between the signal received in transducer B when emitting from transducer A and the signal received in A when the same source is located in B, is established to quantitatively analyze the reciprocity. Experimental results show that the RI value of a damaged path is much higher than that of a healthy path. In addition, the effects of the parameters of excitation signal (i.e., central frequency and bandwidth) and the position of delamination on the RI value are discussed. Furthermore, a RI based probabilistic imaging algorithm is proposed for detecting delamination damage of composite plates without reference signals. Finally, the effectiveness of this baseline-free damage detection method is validated by an experimental example.
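    A toy sketch of the reciprocity check, assuming the reciprocity index is defined as the normalised energy of the difference between the A-to-B and B-to-A signals (the paper's exact definition may differ); the signals below are synthetic:

    ```python
    import numpy as np

    def reciprocity_index(s_ab, s_ba):
        """One plausible reciprocity index: normalised energy of the difference
        between the A->B and B->A signals (assumed form, not the paper's)."""
        s_ab, s_ba = np.asarray(s_ab, float), np.asarray(s_ba, float)
        return np.sum((s_ab - s_ba) ** 2) / np.sum(s_ab ** 2)

    # Synthetic illustration: a healthy path is reciprocal up to noise, while a
    # damaged path carries a nonlinear (here crudely clipped) component in one
    # direction only.
    rng = np.random.default_rng(3)
    t = np.linspace(0.0, 1e-4, 2000)
    tone = np.sin(2.0 * np.pi * 150e3 * t) * np.hanning(t.size)   # 150 kHz burst
    ri_healthy = reciprocity_index(tone + 0.01 * rng.normal(size=t.size),
                                   tone + 0.01 * rng.normal(size=t.size))
    ri_damaged = reciprocity_index(tone, np.clip(tone, -0.6, 0.6))
    print("RI healthy = %.4f, RI damaged = %.4f" % (ri_healthy, ri_damaged))
    ```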

  19. Attenuated microglial activation mediates tolerance to the neurotoxic effects of methamphetamine.

    PubMed

    Thomas, David M; Kuhn, Donald M

    2005-02-01

    Methamphetamine causes persistent damage to dopamine nerve endings of the striatum. Repeated, intermittent treatment of mice with low doses of methamphetamine leads to the development of tolerance to its neurotoxic effects. The mechanisms underlying tolerance are not understood but clearly involve more than alterations in drug bioavailability or reductions in the hyperthermia caused by methamphetamine. Microglia have been implicated recently as mediators of methamphetamine-induced neurotoxicity. The purpose of the present studies was to determine if a tolerance regimen of methamphetamine would attenuate the microglial response to a neurotoxic challenge. Mice treated with a low-dose methamphetamine tolerance regimen showed minor reductions in striatal dopamine content and low levels of microglial activation. When the tolerance regimen preceded a neurotoxic challenge of methamphetamine, the depletion of dopamine normally seen was significantly attenuated. The microglial activation that occurs after a toxic methamphetamine challenge was blunted likewise. Despite the induction of tolerance against drug-induced toxicity and microglial activation, a neurotoxic challenge with methamphetamine still caused hyperthermia. These results suggest that tolerance to methamphetamine neurotoxicity is associated with attenuated microglial activation and they further dissociate its neurotoxicity from drug-induced hyperthermia.

  20. Hypoxic pretreatment protects against neuronal damage of the rat hippocampus induced by severe hypoxia.

    PubMed

    Gorgias, N; Maidatsi, P; Tsolaki, M; Alvanou, A; Kiriazis, G; Kaidoglou, K; Giala, M

    1996-04-01

    The present study investigates whether, under conditions of successive hypoxic exposures, pretreatment with mild (15% O2) or moderate (10% O2) hypoxia protects hippocampal neurones against damage induced by severe (3% O2) hypoxia. The ultrastructural findings were also correlated with regional superoxide dismutase (SOD) activity changes. In unpretreated rats severe hypoxia induced ultrastructural changes consistent with the aspects of delayed neuronal death (DND). However, in preexposed animals hippocampal damage was attenuated in inverse proportion to the severity of the hypoxic pretreatment. The ultrastructural hypoxic tolerance findings were also closely related to increased regional SOD activity levels. Thus the activation of the endogenous antioxidant defense by hypoxic preconditioning protects against hippocampal damage induced by severe hypoxia. The eventual contribution of increased endogenous adenosine and/or reduced excitotoxicity to induced hypoxic tolerance is discussed.

  1. Damage and strength of composite materials: Trends, predictions, and challenges

    NASA Technical Reports Server (NTRS)

    Obrien, T. Kevin

    1994-01-01

    Research on damage mechanisms and ultimate strength of composite materials relevant to scaling issues will be addressed in this viewgraph presentation. The use of fracture mechanics and Weibull statistics to predict scaling effects for the onset of isolated damage mechanisms will be highlighted. The ability of simple fracture mechanics models to predict trends that are useful in parametric or preliminary design studies will be reviewed. The limitations of these simple models for complex loading conditions will also be noted. The difficulty in developing generic criteria for the growth of these mechanisms needed in progressive damage models to predict strength will be addressed. A specific example for a problem where failure is a direct consequence of progressive delamination will be explored. A damage threshold/fail-safety concept for addressing composite damage tolerance will be discussed.

  2. Probabilistic pathway construction.

    PubMed

    Yousofshahi, Mona; Lee, Kyongbum; Hassoun, Soha

    2011-07-01

    Expression of novel synthesis pathways in host organisms amenable to genetic manipulations has emerged as an attractive metabolic engineering strategy to overproduce natural products, biofuels, biopolymers and other commercially useful metabolites. We present a pathway construction algorithm for identifying viable synthesis pathways compatible with balanced cell growth. Rather than exhaustive exploration, we investigate probabilistic selection of reactions to construct the pathways. Three different selection schemes are investigated for the selection of reactions: high metabolite connectivity, low connectivity and uniformly random. For all case studies, which involved a diverse set of target metabolites, the uniformly random selection scheme resulted in the highest average maximum yield. When compared to an exhaustive search enumerating all possible reaction routes, our probabilistic algorithm returned nearly identical distributions of yields, while requiring far less computing time (minutes vs. years). The pathways identified by our algorithm have previously been confirmed in the literature as viable, high-yield synthesis routes. Prospectively, our algorithm could facilitate the design of novel, non-native synthesis routes by efficiently exploring the diversity of biochemical transformations in nature. Copyright © 2011 Elsevier Inc. All rights reserved.
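    A toy sketch of the probabilistic construction idea, using the uniformly random selection scheme on a made-up reaction network (metabolite and reaction names are invented; no stoichiometric or growth constraints are checked):

    ```python
    import random

    # Toy reaction network: each reaction consumes some metabolites and produces
    # others (names and connectivity are made up for illustration).
    reactions = {
        "r1": {"in": ["glucose"], "out": ["pyruvate"]},
        "r2": {"in": ["pyruvate"], "out": ["acetyl-CoA"]},
        "r3": {"in": ["acetyl-CoA"], "out": ["target"]},
        "r4": {"in": ["pyruvate"], "out": ["lactate"]},
        "r5": {"in": ["glucose"], "out": ["target"]},
    }
    native = {"glucose", "pyruvate"}        # metabolites the host already makes

    def sample_pathway(target, rng, max_depth=10):
        """Build one candidate pathway by walking backwards from the target,
        choosing uniformly at random among reactions that produce each
        unresolved metabolite (the 'uniformly random' selection scheme)."""
        need, path = [target], []
        for _ in range(max_depth):
            if not need:
                return path
            m = need.pop()
            if m in native:
                continue
            producers = [r for r, d in reactions.items() if m in d["out"]]
            if not producers:
                return None
            r = rng.choice(producers)
            path.append(r)
            need.extend(reactions[r]["in"])
        return path if not need else None

    rng = random.Random(0)
    candidates = {tuple(p) for _ in range(50) if (p := sample_pathway("target", rng))}
    print(candidates)   # e.g. {('r5',), ('r3', 'r2')} -- distinct viable routes
    ```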

  3. Tracking composite material damage evolution using Bayesian filtering and flash thermography data

    NASA Astrophysics Data System (ADS)

    Gregory, Elizabeth D.; Holland, Steve D.

    2016-05-01

    We propose a method for tracking the condition of a composite part using Bayesian filtering of flash thermography data over the lifetime of the part. In this demonstration, composite panels were fabricated; impacted to induce subsurface delaminations; and loaded in compression over multiple time steps, causing the delaminations to grow in size. Flash thermography data was collected between each damage event to serve as a time history of the part. The flash thermography indicated some areas of damage but provided little additional information as to the exact nature or depth of the damage. Computed tomography (CT) data was also collected after each damage event and provided a high resolution volume model of damage that acted as truth. After each cycle, the condition estimate, from the flash thermography data and the Bayesian filter, was compared to 'ground truth'. The Bayesian process builds on the lifetime history of flash thermography scans and can give better estimates of material condition as compared to the most recent scan alone, which is common practice in the aerospace industry. Bayesian inference provides probabilistic estimates of damage condition that are updated as each new set of data becomes available. The method was tested on simulated data and then on an experimental data set.
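    A minimal sketch of the filtering idea, assuming a discrete Bayes filter over delamination area with a Gaussian growth model and a Gaussian measurement model; all numbers are invented and none of the paper's models are reproduced:

    ```python
    import numpy as np

    # Discrete Bayes filter over delamination area (mm^2); values are illustrative.
    areas = np.arange(0.0, 200.0, 2.0)
    belief = np.ones_like(areas) / areas.size          # flat prior

    def predict(belief, growth_mean=15.0, growth_std=8.0):
        """Propagate the belief through an assumed damage-growth model."""
        new = np.zeros_like(belief)
        for i, a in enumerate(areas):
            # Probability of moving from area a to each (non-shrinking) area.
            g = np.exp(-0.5 * ((areas - (a + growth_mean)) / growth_std) ** 2)
            g[areas < a] = 0.0
            new += belief[i] * g / g.sum()
        return new

    def update(belief, measured_area, meas_std=20.0):
        """Fuse a noisy flash-thermography area indication."""
        like = np.exp(-0.5 * ((measured_area - areas) / meas_std) ** 2)
        post = belief * like
        return post / post.sum()

    for measured in [22.0, 31.0, 55.0, 80.0]:          # one scan per damage event
        belief = update(predict(belief), measured)
    print("posterior mean area: %.1f mm^2" % np.sum(areas * belief))
    ```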

  4. Lightweight Damage Tolerant, High-Temperature Radiators for Nuclear Power and Propulsion

    NASA Technical Reports Server (NTRS)

    Craven, Paul D.; SanSoucie, Michael P.

    2015-01-01

    is enabled. High thermal conductivity carbon fibers are lightweight, damage tolerant, and can be heated to high temperature. Areal densities in the NASA-set target range of 2 to 4 kg/m2 (for enabling NEP) are achieved, with specific powers (kW/kg) a factor of about 7 greater than conventional metal fins and about 1.5 greater than carbon composite fins. Figure 2 shows one fin under test. All tests were done under vacuum conditions.

  5. Probabilistic modeling of discourse-aware sentence processing.

    PubMed

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  6. Evolution of resistance and tolerance to herbivores: testing the trade-off hypothesis.

    PubMed

    Kariñho-Betancourt, Eunice; Núñez-Farfán, Juan

    2015-01-01

    Background. To cope with their natural enemies, plants rely on resistance and tolerance as defensive strategies. Evolution of these strategies among natural populations can be constrained by the absence of genetic variation or because of the antagonistic genetic correlation (trade-off) between them. Also, since plant defenses are integrated by several traits, it has been suggested that trade-offs might occur between specific defense traits. Methodology/Principal Findings. We experimentally assessed (1) the presence of genetic variance in tolerance, total resistance, and leaf trichome density as a specific defense trait, (2) the extent of natural selection acting on plant defenses, and (3) the relationship between total resistance and leaf trichome density with tolerance to herbivory in the annual herb Datura stramonium. Full-sib families of D. stramonium were either exposed to natural herbivores (control) or protected from them by a systemic insecticide. We detected genetic variance for leaf trichome density, and directional selection acting on this character. However, we did not detect a negative significant correlation between tolerance and total resistance, or between tolerance and leaf trichome density. We argue that low levels of leaf damage by herbivores precluded the detection of a negative genetic correlation between plant defense strategies. Conclusions/Significance. This study provides empirical evidence of the independent evolution of plant defense strategies, and a defensive role of leaf trichomes. The pattern of selection should favor individuals with high trichome density. Also, because leaf trichome density reduces damage by herbivores and possesses genetic variance in the studied population, its evolution is not constrained.

  7. Evolution of resistance and tolerance to herbivores: testing the trade-off hypothesis

    PubMed Central

    Kariñho-Betancourt, Eunice

    2015-01-01

    Background. To cope with their natural enemies, plants rely on resistance and tolerance as defensive strategies. Evolution of these strategies among natural populations can be constrained by the absence of genetic variation or because of the antagonistic genetic correlation (trade-off) between them. Also, since plant defenses are integrated by several traits, it has been suggested that trade-offs might occur between specific defense traits. Methodology/Principal Findings. We experimentally assessed (1) the presence of genetic variance in tolerance, total resistance, and leaf trichome density as a specific defense trait, (2) the extent of natural selection acting on plant defenses, and (3) the relationship between total resistance and leaf trichome density with tolerance to herbivory in the annual herb Datura stramonium. Full-sib families of D. stramonium were either exposed to natural herbivores (control) or protected from them by a systemic insecticide. We detected genetic variance for leaf trichome density, and directional selection acting on this character. However, we did not detect a negative significant correlation between tolerance and total resistance, or between tolerance and leaf trichome density. We argue that low levels of leaf damage by herbivores precluded the detection of a negative genetic correlation between plant defense strategies. Conclusions/Significance. This study provides empirical evidence of the independent evolution of plant defense strategies, and a defensive role of leaf trichomes. The pattern of selection should favor individuals with high trichome density. Also, because leaf trichome density reduces damage by herbivores and possesses genetic variance in the studied population, its evolution is not constrained. PMID:25780756

  8. Time dependence of breakdown in a global fiber-bundle model with continuous damage.

    PubMed

    Moral, L; Moreno, Y; Gómez, J B; Pacheco, A F

    2001-06-01

    A time-dependent global fiber-bundle model of fracture with continuous damage is formulated in terms of a set of coupled nonlinear differential equations. A first integral of this set is analytically obtained. The time evolution of the system is studied by applying a discrete probabilistic method. Several results are discussed emphasizing their differences with the standard time-dependent model. The results obtained show that with this simple model a variety of experimental observations can be qualitatively reproduced.
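    The paper's model is a set of coupled ODEs with an analytic first integral; the sketch below is only a crude discrete-probabilistic analogue of a global-load-sharing bundle with continuous damage (each failure event reduces a fiber's load-carrying capacity by an assumed factor until the fiber is removed). All parameter values are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    N, total_load, dt = 10_000, 5_000.0, 1e-3
    rho, a, kmax = 2.0, 0.7, 3        # breakdown exponent, damage factor, max damage events

    stiff = np.ones(N)                # relative load-carrying capacity of each fiber
    hits = np.zeros(N, dtype=int)     # damage events suffered so far
    t = 0.0
    while stiff.sum() > 0:
        # Global load sharing in proportion to remaining capacity.
        sigma = total_load * stiff / stiff.sum()
        # Probabilistic breakdown rule: hazard ~ sigma**rho per unit time.
        fail = rng.random(N) < np.minimum(1.0, (sigma ** rho) * dt)
        hits[fail & (stiff > 0)] += 1
        stiff[fail] *= a              # continuous damage: capacity reduced, not removed ...
        stiff[hits > kmax] = 0.0      # ... until a fiber has suffered kmax+1 damage events
        t += dt
        if t > 50.0:                  # safety cap for the sketch
            break
    print("final time %.3f (model units), destroyed fraction %.2f" % (t, np.mean(stiff == 0)))
    ```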

  9. Impact damage resistance and damage suppression properties of shape memory alloys in hybrid composites—a review

    NASA Astrophysics Data System (ADS)

    Angioni, S. L.; Meo, M.; Foreman, A.

    2011-01-01

    Composite materials are known to have a poor resistance to through-the-thickness impact loading. There are various methods for improving their impact damage tolerance, such as fiber toughening, matrix toughening, interface toughening, through-the-thickness reinforcements, and selective interlayers and hybrids. Hybrid composites with improved impact resistance are particularly useful in military and commercial civil applications. Hybridizing composites using shape memory alloys (SMA) is one solution since SMA materials can absorb the energy of the impact through superelastic deformation or recovery stress, reducing the effects of the impact on the composite structure. The SMA material may be embedded in the hybrid composites (SMAHC) in many different forms and also the characteristics of the fiber reinforcements may vary, such as SMA wires in woven laminates or SMA foils in unidirectional laminates, only to cite two examples. We will review the state of the art of SMAHC for the purpose of damage suppression. Both the active and passive damage suppression mechanisms will be considered.

  10. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  11. Freezing tolerance of winter wheat as influenced by extended growth at low temperature and exposure to freeze-thaw cycles

    USDA-ARS?s Scientific Manuscript database

    As the seasons progress, autumn-planted winter wheat plants (Triticum aestivum L.) first gain, then progressively lose freezing tolerance. Exposing the plants to freeze-thaw cycles of -3/3°C results in increased ability to tolerate subsequent freezing to potentially damaging temperatures. This stu...

  12. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.

  13. Modelling biofilm-induced formation damage and biocide treatment in subsurface geosystems

    PubMed Central

    Ezeuko, C C; Sen, A; Gates, I D

    2013-01-01

    Biofilm growth in subsurface porous media, and its treatment with biocides (antimicrobial agents), involves a complex interaction of biogeochemical processes which provides non-trivial mathematical modelling challenges. Although there are literature reports of mathematical models to evaluate biofilm tolerance to biocides, none of these models have investigated biocide treatment of biofilms growing in interconnected porous media with flow. In this paper, we present a numerical investigation using a pore network model of biofilm growth, formation damage and biocide treatment. The model includes three phases (aqueous, adsorbed biofilm, and solid matrix), a single growth-limiting nutrient and a single biocide dissolved in the water. Biofilm is assumed to contain a single species of microbe, in which each cell can be a viable persister, a viable non-persister, or non-viable (dead). Persisters describe a small subpopulation of cells which are tolerant to biocide treatment. Biofilm tolerance to biocide treatment is regulated by persister cells and includes ‘innate’ and ‘biocide-induced’ factors. Simulations demonstrate that biofilm tolerance to biocides can increase with biofilm maturity, and that biocide treatment alone does not reverse biofilm-induced formation damage. Also, a successful application of biological permeability conformance treatment involving geologic layers with flow communication is more complicated than simply engineering the attachment of biofilm-forming cells at desired sites. PMID:23164434

  14. Probabilistic Polling And Voting In The 2008 Presidential Election

    PubMed Central

    Delavande, Adeline; Manski, Charles F.

    2010-01-01

    This article reports new empirical evidence on probabilistic polling, which asks persons to state in percent-chance terms the likelihood that they will vote and for whom. Before the 2008 presidential election, seven waves of probabilistic questions were administered biweekly to participants in the American Life Panel (ALP). Actual voting behavior was reported after the election. We find that responses to the verbal and probabilistic questions are well-aligned ordinally. Moreover, the probabilistic responses predict voting behavior beyond what is possible using verbal responses alone. The probabilistic responses have more predictive power in early August, and the verbal responses have more power in late October. However, throughout the sample period, one can predict voting behavior better using both types of responses than either one alone. Studying the longitudinal pattern of responses, we segment respondents into those who are consistently pro-Obama, consistently anti-Obama, and undecided/vacillators. Membership in the consistently pro- or anti-Obama group is an almost perfect predictor of actual voting behavior, while the undecided/vacillators group has more nuanced voting behavior. We find that treating the ALP as a panel improves predictive power: current and previous polling responses together provide more predictive power than do current responses alone. PMID:24683275

  15. Arguments for zero tolerance of sexual contact between doctors and patients.

    PubMed Central

    Cullen, R M

    1999-01-01

    Some doctors do enter into sexual relationships with patients. These relationships can be damaging to the patient involved. One response available to both individual doctors and to disciplinary bodies is to prohibit sexual contact between doctors and patients ("zero tolerance"). This paper considers five ways of arguing for a zero tolerance policy. The first rests on an empirical claim that such contact is almost always harmful to the patient involved. The second is based on a "principles" approach while the third originates in "virtues" ethics. The fourth argues that zero tolerance is an "a priori" truth. These four attempt to establish that the behaviour is always wrong and ought, therefore, to be prohibited. The fifth argument is counterfactual. It claims a policy that allowed sexual contact would have unacceptable consequences. Given the responsibility of regulatory bodies to protect the public, zero tolerance is a natural policy to develop. PMID:10635503

  16. Probabilistic composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
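    A minimal sketch of the Monte Carlo idea, assuming a simple rule-of-mixtures relation for the longitudinal ply modulus and illustrative fiber/matrix scatter (not the constituent data or micromechanics equations used in the study):

      import numpy as np

      rng = np.random.default_rng(1)
      n = 50_000

      Ef = rng.normal(230e9, 12e9, n)    # graphite fiber modulus [Pa], assumed scatter
      Em = rng.normal(3.5e9, 0.3e9, n)   # epoxy matrix modulus [Pa], assumed scatter
      Vf = rng.normal(0.60, 0.03, n)     # fiber volume fraction, assumed scatter

      E11 = Vf * Ef + (1.0 - Vf) * Em    # rule-of-mixtures longitudinal ply modulus

      print("mean E11 [GPa]:", E11.mean() / 1e9)
      print("c.o.v. of E11 :", E11.std() / E11.mean())
      # Crude sensitivity indication: correlation of each input with the response
      for name, x in [("Ef", Ef), ("Em", Em), ("Vf", Vf)]:
          print(f"corr({name}, E11) = {np.corrcoef(x, E11)[0, 1]:.3f}")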

  17. Learning Sparse Feature Representations using Probabilistic Quadtrees and Deep Belief Nets

    DTIC Science & Technology

    2015-04-24

    Learning sparse feature representations is a useful instrument for solving an ... novel framework for the classification of handwritten digits that learns sparse representations using probabilistic quadtrees and Deep Belief Nets ...

  18. Dominating Scale-Free Networks Using Generalized Probabilistic Methods

    PubMed Central

    Molnár, F.; Derzsy, N.; Czabarka, É.; Székely, L.; Szymanski, B. K.; Korniss, G.

    2014-01-01

    We study ensemble-based graph-theoretical methods aiming to approximate the size of the minimum dominating set (MDS) in scale-free networks. We analyze both analytical upper bounds of dominating sets and numerical realizations for applications. We propose two novel probabilistic dominating set selection strategies that are applicable to heterogeneous networks. One of them obtains the smallest probabilistic dominating set and also outperforms the deterministic degree-ranked method. We show that a degree-dependent probabilistic selection method becomes optimal in its deterministic limit. In addition, we also find the precise limit where selecting high-degree nodes exclusively becomes inefficient for network domination. We validate our results on several real-world networks, and provide highly accurate analytical estimates for our methods. PMID:25200937
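    The sketch below shows one plausible degree-dependent probabilistic selection of a dominating set followed by a greedy patch step; it assumes the networkx library, a synthetic Barabási-Albert network, and an arbitrary tuning parameter alpha, and it is not the specific selection strategy derived in the paper.

      import random
      import networkx as nx

      random.seed(42)
      G = nx.barabasi_albert_graph(n=2000, m=2, seed=42)   # synthetic scale-free network

      alpha = 0.4                                          # assumed tuning parameter
      dmax = max(d for _, d in G.degree())

      # Pre-select each node with a probability that grows with its degree
      dom = {v for v in G if random.random() < alpha * G.degree(v) / dmax}

      # Patch step: every node must be in the set or adjacent to a member
      for v in G:
          if v not in dom and not any(u in dom for u in G.neighbors(v)):
              dom.add(max(G.neighbors(v), key=G.degree, default=v))

      assert all(v in dom or any(u in dom for u in G.neighbors(v)) for v in G)
      print("dominating set size:", len(dom), "of", G.number_of_nodes())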

  19. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  20. Damage assessment in multilayered MEMS structures under thermal fatigue

    NASA Astrophysics Data System (ADS)

    Maligno, A. R.; Whalley, D. C.; Silberschmidt, V. V.

    2011-07-01

    This paper reports on the application of a Physics of Failure (PoF) methodology to assessing the reliability of a micro electro mechanical system (MEMS). Numerical simulations, based on the finite element method (FEM) with a sub-domain approach, were used to examine damage onset due to temperature variations (e.g., yielding of metals, which may lead to thermal fatigue). In this work, remeshing techniques were employed to develop a damage tolerance approach based on the assumption that initial flaws exist in the multi-layered structure.

  1. Effects of Oxygen Availability on Acetic Acid Tolerance and Intracellular pH in Dekkera bruxellensis.

    PubMed

    Capusoni, Claudia; Arioli, Stefania; Zambelli, Paolo; Moktaduzzaman, M; Mora, Diego; Compagno, Concetta

    2016-08-01

    The yeast Dekkera bruxellensis, associated with wine and beer production, has recently received attention, because its high ethanol and acid tolerance enables it to compete with Saccharomyces cerevisiae in distilleries that produce fuel ethanol. We investigated how different cultivation conditions affect the acetic acid tolerance of D. bruxellensis. We analyzed the ability of two strains (CBS 98 and CBS 4482) exhibiting different degrees of tolerance to grow in the presence of acetic acid under aerobic and oxygen-limited conditions. We found that the concomitant presence of acetic acid and oxygen had a negative effect on D. bruxellensis growth. In contrast, incubation under oxygen-limited conditions resulted in reproducible growth kinetics that exhibited a shorter adaptive phase and higher growth rates than those with cultivation under aerobic conditions. This positive effect was more pronounced in CBS 98, the more-sensitive strain. Cultivation of CBS 98 cells under oxygen-limited conditions improved their ability to restore their intracellular pH upon acetic acid exposure and to reduce the oxidative damage to intracellular macromolecules caused by the presence of acetic acid. This study reveals an important role of oxidative stress in acetic acid tolerance in D. bruxellensis, indicating that reduced oxygen availability can protect against the damage caused by the presence of acetic acid. This aspect is important for optimizing industrial processes performed in the presence of acetic acid. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  2. Effects of Oxygen Availability on Acetic Acid Tolerance and Intracellular pH in Dekkera bruxellensis

    PubMed Central

    Capusoni, Claudia; Arioli, Stefania; Zambelli, Paolo; Moktaduzzaman, M.; Mora, Diego

    2016-01-01

    ABSTRACT The yeast Dekkera bruxellensis, associated with wine and beer production, has recently received attention, because its high ethanol and acid tolerance enables it to compete with Saccharomyces cerevisiae in distilleries that produce fuel ethanol. We investigated how different cultivation conditions affect the acetic acid tolerance of D. bruxellensis. We analyzed the ability of two strains (CBS 98 and CBS 4482) exhibiting different degrees of tolerance to grow in the presence of acetic acid under aerobic and oxygen-limited conditions. We found that the concomitant presence of acetic acid and oxygen had a negative effect on D. bruxellensis growth. In contrast, incubation under oxygen-limited conditions resulted in reproducible growth kinetics that exhibited a shorter adaptive phase and higher growth rates than those with cultivation under aerobic conditions. This positive effect was more pronounced in CBS 98, the more-sensitive strain. Cultivation of CBS 98 cells under oxygen-limited conditions improved their ability to restore their intracellular pH upon acetic acid exposure and to reduce the oxidative damage to intracellular macromolecules caused by the presence of acetic acid. This study reveals an important role of oxidative stress in acetic acid tolerance in D. bruxellensis, indicating that reduced oxygen availability can protect against the damage caused by the presence of acetic acid. This aspect is important for optimizing industrial processes performed in the presence of acetic acid. IMPORTANCE This study reveals an important role of oxidative stress in acetic acid tolerance in D. bruxellensis, indicating that reduced oxygen availability can have a protective role against the damage caused by the presence of acetic acid. This aspect is important for the optimization of industrial processes performed in the presence of acetic acid. PMID:27235432

  3. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.

  4. Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats

    PubMed Central

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2015-01-01

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448

  5. Ubc9 is required for damage-tolerance and damage-induced interchromosomal homologous recombination in S. cerevisiae.

    PubMed

    Maeda, Daisuke; Seki, Masayuki; Onoda, Fumitoshi; Branzei, Dana; Kawabe, Yoh-Ichi; Enomoto, Takemi

    2004-03-04

    Ubc9 is an enzyme involved in the conjugation of small ubiquitin related modifier (SUMO) to target proteins. A Saccharomyces cerevisiae ubc9 temperature sensitive (ts) mutant showed higher sensitivity to various DNA damaging agents such as methylmethanesulfonate (MMS) and UV at a semi-permissive temperature than wild-type cells. The sensitivity of ubc9ts cells was not suppressed by the introduction of a mutated UBC9 gene, UBC9-C93S, whose product is unable to covalently bind to SUMO and consequently fails to conjugate SUMO to target proteins. Diploid ubc9ts cells were more sensitive to various DNA damaging agents than haploid ubc9ts cells suggesting the involvement of homologous recombination in the sensitivity of ubc9ts cells. The frequency of interchromosomal recombination between heteroalleles, his1-1/his1-7 loci, in wild-type cells was remarkably increased upon exposure to MMS or UV. Although the frequency of spontaneous interchromosomal recombination between the heteroalleles in ubc9ts cells was almost the same as that of wild-type cells, no induction of interchromosomal recombination was observed in ubc9ts cells upon exposure to MMS or UV. Copyright 2003 Elsevier B.V.

  6. Probabilistic approach to damage of tunnel lining due to fire

    NASA Astrophysics Data System (ADS)

    Šejnoha, Jiří; Sýkora, Jan; Novotná, Eva; Šejnoha, Michal

    2017-09-01

    In this paper, risk is perceived as the probable damage caused by a fire in the tunnel lining. In the first part, the traffic flow is described as a Markov chain of joint states consisting of a combination of trucks/buses (TB) and personal cars (PC) from adjoining lanes. The heat release rate is then taken as a measure of the fire power. The intensity λf, reflecting the frequency of fires, was assessed based on extensive studies carried out in Austria [1] and Italy [2, 3]. The traffic density AADT, the length of the tunnel L, the percentage of TBs, and the number of lanes are the remaining parameters characterizing the traffic flow. In the second part, a special combination of models originally proposed by Bažant and Thonguthai [4] and Künzel & Kiessl [5] for the description of transport processes in concrete at very high temperatures creates a basis for predicting the thickness of the spalling zone and the volume of concrete degraded by temperatures exceeding a given level. The model was validated against a macroscopic test on concrete samples placed in a furnace.
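    A toy version of the first modelling step is sketched below: a Markov chain over joint lane states with an assumed transition matrix, and an assumed per-passage fire probability and heat release rate for each state; all numbers are illustrative placeholders rather than the calibrated values from [1-3].

      import numpy as np

      states = ["PC/PC", "PC/TB", "TB/TB"]        # joint states of the adjoining lanes
      P = np.array([[0.85, 0.13, 0.02],           # assumed transition matrix
                    [0.60, 0.35, 0.05],
                    [0.30, 0.50, 0.20]])
      p_fire = np.array([1e-8, 4e-8, 8e-8])       # assumed fire probability per passage
      hrr    = np.array([8.0, 50.0, 100.0])       # assumed heat release rates [MW]

      # Stationary distribution of the joint-state chain (eigenvector for eigenvalue 1)
      w, v = np.linalg.eig(P.T)
      pi = np.real(v[:, np.argmax(np.isclose(w, 1.0))])
      pi /= pi.sum()

      lam_per_passage = float(pi @ p_fire)        # fire intensity per vehicle passage
      AADT, days = 30_000, 365                    # assumed traffic density [vehicles/day]
      print("stationary state probabilities:", dict(zip(states, pi.round(3))))
      print("expected fires per year       :", lam_per_passage * AADT * days)
      print("mean HRR given a fire [MW]    :", float(pi @ (p_fire * hrr)) / lam_per_passage)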

  7. Reversing Optical Damage In LiNbO3 Switches

    NASA Technical Reports Server (NTRS)

    Gee, C. M.; Thurmond, G. D.

    1985-01-01

    One symptom of optical damage in a Ti-diffused LiNbO3 directional-coupler switch is reversed by temporarily raising the input illumination to a higher-than-normal power level. The healing phenomenon is used to restore normal operation, increase the operating-power rating, and stabilize operating characteristics at lower powers. Higher operating power is tolerated after treatment.

  8. Detector Damage at X-Ray Free-Electron Laser Sources

    NASA Astrophysics Data System (ADS)

    Blaj, G.; Carini, G.; Carron, S.; Haller, G.; Hart, P.; Hasi, J.; Herrmann, S.; Kenney, C.; Segal, J.; Stan, C. A.; Tomada, A.

    2016-06-01

    Free-electron lasers (FELs) opened a new window on imaging the motion of atoms and molecules. At SLAC, FEL experiments are performed at LCLS using 120 Hz pulses with 10^12 to 10^13 photons in 10 fs (billions of times brighter than at the most powerful synchrotrons). Concurrently, users and staff operate under high pressure due to flexible and often rapidly changing setups and low tolerance for system malfunction. This extreme detection environment raises unique challenges, from obvious to surprising, and leads to treating detectors as consumables. We discuss in detail the detector damage mechanisms observed in 7 years of operation at LCLS, together with the corresponding damage mitigation strategies and their effectiveness. Main types of damage mechanisms already identified include: (1) x-ray radiation damage (from “catastrophic” to “classical”), (2) direct and indirect damage caused by optical lasers, (3) sample induced damage, (4) vacuum related damage, (5) high-pressure environment. In total, 19 damage mechanisms have been identified. We also present general strategies for reducing damage risk or minimizing the impact of detector damage on the science program. These include availability of replacement parts and skilled operators and also careful planning, incident investigation resulting in updated designs, procedures and operator training.

  9. Probabilistic Cellular Automata

    PubMed Central

    Agapie, Alexandru; Giuclea, Marius

    2014-01-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case—connecting the probability of a configuration in the stationary distribution to its number of zero-one borders—the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
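    A minimal sketch of a synchronous probabilistic cellular automaton of this kind, assuming a simple voter-like local rule on a ring (a cell becomes 1 with probability equal to the fraction of ones among itself and its two neighbours); repeated runs estimate the long-run absorption behaviour discussed above.

      import numpy as np

      rng = np.random.default_rng(7)

      def run(n_cells=30, max_steps=10_000):
          x = rng.integers(0, 2, n_cells)
          for _ in range(max_steps):
              ones = (x + np.roll(x, 1) + np.roll(x, -1)) / 3.0   # neighbourhood fraction of ones
              x = (rng.random(n_cells) < ones).astype(int)        # synchronous probabilistic update
              if x.all() or not x.any():                          # absorbed in all-ones / all-zeros
                  return int(x.all())
          return int(x.mean() > 0.5)                              # fallback, rarely reached

      outcomes = [run() for _ in range(500)]
      print("fraction of runs absorbed in all-ones:", sum(outcomes) / len(outcomes))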

  10. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case-connecting the probability of a configuration in the stationary distribution to its number of zero-one borders-the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.

  11. Probabilistic population aging

    PubMed Central

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  12. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.

  13. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{ w: X(w) < x }. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random composition of these two systems.
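    The sketch below approximates such a distribution function empirically: words are sampled from a small, assumed two-state probabilistic automaton over the binary alphabet, each truncated word is read as a radix-2 expansion X(w), and F is the empirical CDF of X. The automaton is hypothetical, chosen only for illustration.

      import numpy as np

      rng = np.random.default_rng(11)

      # Assumed automaton: in state 0 emit '1' with probability 0.3, in state 1 with 0.8;
      # the next state equals the emitted letter.
      emit_one = {0: 0.3, 1: 0.8}

      def sample_x(depth=40):
          state, x, weight = 0, 0.0, 0.5
          for _ in range(depth):
              letter = int(rng.random() < emit_one[state])
              x += weight * letter          # radix-2 expansion of X(w)
              weight *= 0.5
              state = letter
          return x

      xs = np.sort([sample_x() for _ in range(20_000)])
      for q in (0.25, 0.5, 0.75):
          # F(q) = Prob_M{ w: X(w) < q }, estimated from the sample
          print(f"F({q}) = {np.searchsorted(xs, q) / xs.size:.3f}")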

  14. Probabilistic reversal learning is impaired in Parkinson's disease

    PubMed Central

    Peterson, David A.; Elliott, Christian; Song, David D.; Makeig, Scott; Sejnowski, Terrence J.; Poizner, Howard

    2009-01-01

    In many everyday settings, the relationship between our choices and their potentially rewarding outcomes is probabilistic and dynamic. In addition, the difficulty of the choices can vary widely. Although a large body of theoretical and empirical evidence suggests that dopamine mediates rewarded learning, the influence of dopamine in probabilistic and dynamic rewarded learning remains unclear. We adapted a probabilistic rewarded learning task originally used to study firing rates of dopamine cells in primate substantia nigra pars compacta (Morris et al. 2006) for use as a reversal learning task with humans. We sought to investigate how the dopamine depletion in Parkinson's disease (PD) affects probabilistic reward learning and adaptation to a reversal in reward contingencies. Over the course of 256 trials subjects learned to choose the more favorable from among pairs of images with small or large differences in reward probabilities. During a subsequent otherwise identical reversal phase, the reward probability contingencies for the stimuli were reversed. Seventeen Parkinson's disease (PD) patients of mild to moderate severity were studied off of their dopaminergic medications and compared to 15 age-matched controls. Compared to controls, PD patients had distinct pre- and post-reversal deficiencies depending upon the difficulty of the choices they had to learn. The patients also exhibited compromised adaptability to the reversal. A computational model of the subjects’ trial-by-trial choices demonstrated that the adaptability was sensitive to the gain with which patients weighted pre-reversal feedback. Collectively, the results implicate the nigral dopaminergic system in learning to make choices in environments with probabilistic and dynamic reward contingencies. PMID:19628022

  15. Probabilistic QoS Analysis In Wireless Sensor Networks

    DTIC Science & Technology

    2012-04-01

    ... and A. O. Fapojuwo, "TDMA scheduling with optimized energy efficiency and minimum delay in clustered wireless sensor networks," IEEE Trans. on Mobile ... Wang, Yunbo, "Probabilistic QoS Analysis in Wireless Sensor Networks" (2012). Computer Science and Engineering: Theses, Dissertations, and Student Research, University of ...

  16. Process for computing geometric perturbations for probabilistic analysis

    DOEpatents

    Fitch, Simeon H. K. (Charlottesville, VA); Riha, David S. (San Antonio, TX); Thacker, Ben H. (San Antonio, TX)

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  17. Probabilistic simulation of multi-scale composite behavior

    NASA Technical Reports Server (NTRS)

    Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.

    1993-01-01

    A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.

  18. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  19. Empirical Fragility Analysis of Buildings and Boats Damaged By the 2011 Great East Japan Tsunami and Their Practical Application

    NASA Astrophysics Data System (ADS)

    Suppasri, A.; Charvet, I.; Leelawat, N.; Fukutani, Y.; Muhari, A.; Futami, T.; Imamura, F.

    2014-12-01

    This study focused in turn on detailed data on building and boat damage caused by the 2011 tsunami in order to understand its main causes and provide damage probability estimates. Tsunami-induced building damage data were collected from field surveys and include inundation depth, building material, number of stories and occupancy type for more than 80,000 buildings. Numerical simulations with high-resolution bathymetry and topography data were conducted to obtain characteristic tsunami measures such as flow velocity. These data were analyzed using advanced statistical methods (ordinal regression analysis) to create not only empirical 2D tsunami fragility curves but also, for the first time, 3D tsunami fragility surfaces. The effect of floating debris was also considered, by using a binary indicator of debris impact based on the proximity of a structure to a debris source (i.e., a washed-away building). Both the 2D and 3D fragility analyses provided results for each building damage level and for different topography. While 2D fragility curves provide easily interpretable results relating tsunami flow depth to damage probability for different damage levels, 3D fragility surfaces allow several influential tsunami parameters to be taken into account, thus reducing uncertainty in the probability estimates. More than 20,000 damaged boats were used in an analysis similar to the one carried out on the buildings. Detailed data for each boat comprise information on the damage ratio (paid value over insured value), tonnage, engine type, material type and damage classification. The 2D and 3D fragility analyses were developed using representative tsunami heights for each port obtained from field surveys and flow velocities obtained from the aforementioned simulations. The results are currently being adapted for practical disaster mitigation. They are being integrated with the probabilistic tsunami hazard analysis, in order to create offshore and onshore ...
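    As a simplified stand-in for the ordinal regression analysis described above, the sketch below fits a single-damage-level lognormal fragility curve P(damage | inundation depth) by maximum likelihood to synthetic data; the data and the lognormal form are assumptions for illustration only, not the survey data or the estimator used in the study.

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      rng = np.random.default_rng(5)

      # Synthetic "survey": inundation depths [m] and a binary damage indicator
      h = rng.lognormal(mean=0.5, sigma=0.6, size=2000)
      true_mu, true_sigma = np.log(2.5), 0.5
      y = (rng.random(h.size) < norm.cdf((np.log(h) - true_mu) / true_sigma)).astype(int)

      def neg_log_lik(theta):
          mu, log_sigma = theta
          p = np.clip(norm.cdf((np.log(h) - mu) / np.exp(log_sigma)), 1e-12, 1 - 1e-12)
          return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

      fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
      mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
      print("fitted median damaging depth [m]:", np.exp(mu_hat))
      print("fitted lognormal sigma          :", sigma_hat)
      print("P(damage | h = 3 m)             :", norm.cdf((np.log(3.0) - mu_hat) / sigma_hat))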

  20. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    NASA Astrophysics Data System (ADS)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

    Floods are one of the most frequent and disruptive natural hazards that affect man. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? Thus, a consistent framework to incorporate PFMs into decision-making is required. In this paper, a novel methodology based on Decision-Making under Uncertainty theories, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
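    A minimal sketch of the Value of Information idea at a single floodplain location, assuming perfect information, an illustrative protection cost C and flood damage D, and a flooding probability p read from a probabilistic flood map; the paper's VOI mapping is spatial and considerably more elaborate.

      def value_of_information(p, C, D):
          cost_without_info = min(C, p * D)   # best action under uncertainty: protect or accept risk
          cost_with_info = p * min(C, D)      # with perfect information, act only if the flood occurs
          return cost_without_info - cost_with_info

      for p in (0.05, 0.2, 0.5, 0.8):
          print(f"p = {p:.2f}  VOI = {value_of_information(p, C=100.0, D=1000.0):.1f}")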

  1. Hurricane Katrina winds damaged longleaf pine less than loblolly pine

    Treesearch

    Kurt H. Johnsen; John R. Butnor; John S. Kush; Ronald C. Schmidtling; C. Dana Nelson

    2009-01-01

    Some evidence suggests that longleaf pine might be more tolerant of high winds than either slash pine (Pinus elliottii Engelm.) or loblolly pine (Pinus taeda L.). We studied wind damage to these three pine species in a common garden experiment in southeast Mississippi following Hurricane Katrina,...

  2. Structure of a Novel DNA-binding Domain of Helicase-like Transcription Factor (HLTF) and Its Functional Implication in DNA Damage Tolerance.

    PubMed

    Hishiki, Asami; Hara, Kodai; Ikegaya, Yuzu; Yokoyama, Hideshi; Shimizu, Toshiyuki; Sato, Mamoru; Hashimoto, Hiroshi

    2015-05-22

    HLTF (helicase-like transcription factor) is a yeast RAD5 homolog found in mammals. HLTF has E3 ubiquitin ligase and DNA helicase activities, and plays a pivotal role in the template-switching pathway of DNA damage tolerance. HLTF has an N-terminal domain that has been designated the HIRAN (HIP116 and RAD5 N-terminal) domain. The HIRAN domain has been hypothesized to play a role in DNA binding; however, the structural basis of, and functional evidence for, the HIRAN domain in DNA binding have remained unclear. Here we show for the first time the crystal structure of the HIRAN domain of human HLTF in complex with DNA. The HIRAN domain is composed of six β-strands and two α-helices, forming an OB-fold structure frequently found in ssDNA-binding proteins, including in replication factor A (RPA). Interestingly, this study reveals that the HIRAN domain interacts not only with single-stranded DNA but also with duplex DNA. Furthermore, the structure unexpectedly clarifies that the HIRAN domain specifically recognizes the 3'-end of DNA. These results suggest that the HIRAN domain functions as a sensor for the 3'-end of the primer strand at the stalled replication fork and that the domain facilitates fork regression. HLTF is recruited to a damaged site through the HIRAN domain at the stalled replication fork. Furthermore, our results have implications for the mechanism of template switching. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  3. Probabilistic Meteorological Characterization for Turbine Loads

    NASA Astrophysics Data System (ADS)

    Kelly, M.; Larsen, G.; Dimitrov, N. K.; Natarajan, A.

    2014-06-01

    Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface layer. Based on data from multiple sites as well as theoretical bases from boundary-layer meteorology and atmospheric turbulence, we offer probabilistic descriptions of shear and turbulence intensity, elucidating the connection of each to the other as well as to atmospheric stability and terrain. These are used as input to loads calculation, and with a statistical loads output description, they allow for improved design and loads calculations.

  4. Probabilistic evaluation of SSME structural components

    NASA Astrophysics Data System (ADS)

    Rajagopal, K. R.; Newell, J. F.; Ho, H.

    1991-05-01

    The application of the Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components is described. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have typically been used in the past in deterministic analysis studies.

  5. Status Report on Irradiation Capsules Containing Welded FeCrAl Specimens for Radiation Tolerance Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Field, Kevin G.; Howard, Richard H.

    2016-02-26

    This status report provides the background and current status of a series of irradiation capsules, or “rabbits”, that were designed and built to test the contributions of microstructure, composition, damage dose, and irradiation temperature on the radiation tolerance of candidate FeCrAl alloys being developed to have enhanced weldability and radiation tolerance. These rabbits will also test the validity of using an ultra-miniature tensile specimen to assess the mechanical properties of irradiated FeCrAl base metal and weldments. All rabbits are to be irradiated in the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory (ORNL) to damage doses up to ≥15 dpa at temperatures between 200 and 550°C.

  6. Nematode and grape rootstock interactions including an improved understanding of tolerance.

    PubMed

    McKenry, M V; Anwar, Safdar A

    2006-09-01

    Sixteen cultivars of grape were screened over a two-year period in the presence or absence of 10 different nematode populations. Populations of Meloidogyne spp., Xiphinema index, and Mesocriconema xenoplax developed more rapidly and caused greater damage than populations of X. americanum and Tylenchulus semipenetrans. Populations of mixed Meloidogyne spp. having a history of feeding on grape were among the fastest developing populations. Tolerance to nematode parasitism appeared to be based on different mechanisms. Slow developing, less pathogenic nematode populations often stimulated vine growth, thus vines appeared to possess tolerance. Likewise, cultivars selected for nematode resistance often stimulated vine growth when fed upon by the nematode. However, tolerance sources that resulted from nematode resistance are vulnerable due to the occurrence of populations that break resistance mechanisms. Growth of cultivars with phylloxera (Daktalospharia vitifoliae) resistance was unchanged by the presence of nematodes, indicating that phylloxera resistance may provide a useful source of nematode relief. These and several additional sources of specific tolerance are discussed.

  7. Nematode and Grape Rootstock Interactions Including an Improved Understanding of Tolerance

    PubMed Central

    McKenry, M.V.; Anwar, Safdar A.

    2006-01-01

    Sixteen cultivars of grape were screened over a two-year period in the presence or absence of 10 different nematode populations. Populations of Meloidogyne spp., Xiphinema index, and Mesocriconema xenoplax developed more rapidly and caused greater damage than populations of X. americanum and Tylenchulus semipenetrans. Populations of mixed Meloidogyne spp. having a history of feeding on grape were among the fastest developing populations. Tolerance to nematode parasitism appeared to be based on different mechanisms. Slow developing, less pathogenic nematode populations often stimulated vine growth, thus vines appeared to possess tolerance. Likewise, cultivars selected for nematode resistance often stimulated vine growth when fed upon by the nematode. However, tolerance sources that resulted from nematode resistance are vulnerable due to the occurrence of populations that break resistance mechanisms. Growth of cultivars with phylloxera (Daktalospharia vitifoliae) resistance was unchanged by the presence of nematodes, indicating that phylloxera resistance may provide a useful source of nematode relief. These and several additional sources of specific tolerance are discussed. PMID:19259534

  8. PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.

    1998-01-01

    PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform deterministic as well as probabilistic analyses to predict thermomechanical properties. This user's guide details the step-by-step procedure to create an input file and update or modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte Carlo simulation methods are available for the uncertainty simulation. Various options available in the code to simulate probabilistic material properties and quantify the sensitivity of the primitive random variables are described. Deterministic as well as probabilistic results are described using demonstration problems. For a detailed theoretical description of the deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996, and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites," NASA TM 4766, June 1997.

  9. Bayesian probabilistic population projections for all countries.

    PubMed

    Raftery, Adrian E; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K

    2012-08-28

    Projections of countries' future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out-of-sample experiment in which data from 1950-1990 are used for estimation, and applied to predict 1990-2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20-64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades.

  10. Bayesian probabilistic population projections for all countries

    PubMed Central

    Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.

    2012-01-01

    Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249

  11. Application of tolerance limits to the characterization of image registration performance.

    PubMed

    Fedorov, Andriy; Wells, William M; Kikinis, Ron; Tempany, Clare M; Vangel, Mark G

    2014-07-01

    Deformable image registration is used increasingly in image-guided interventions and other applications. However, validation and characterization of registration performance remain areas that require further study. We propose an analysis methodology for deriving tolerance limits on the initial conditions for deformable registration that reliably lead to a successful registration. This approach results in a concise summary of the probability of registration failure, while accounting for the variability in the test data. The (β, γ) tolerance limit can be interpreted as a value of the input parameter that leads to a successful registration outcome in at least 100β% of cases with 100γ% confidence. The utility of the methodology is illustrated by summarizing the performance of a deformable registration algorithm evaluated in three different experimental setups of increasing complexity. Our examples are based on clinical data collected during MRI-guided prostate biopsy, registered using a publicly available deformable registration tool. The results indicate that the proposed methodology can be used to generate concise graphical summaries of the experiments, as well as a probabilistic estimate of the registration outcome for a future sample. Its use may facilitate improved objective assessment, comparison and retrospective stress-testing of deformable registration.
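    A minimal sketch of a nonparametric one-sided (β, γ) tolerance limit in this spirit, assuming synthetic per-case values of the largest initial misalignment that still led to a successful registration; the paper's exact estimator and data may differ.

      import numpy as np
      from scipy.stats import binom

      rng = np.random.default_rng(9)
      x = np.sort(rng.normal(12.0, 3.0, 60))   # synthetic tolerated misalignments [mm], one per case
      n, beta, gamma = x.size, 0.90, 0.95

      # Largest order-statistic index k such that at least a fraction beta of future cases
      # tolerate more than x[k-1], with confidence gamma: P(Binomial(n, 1-beta) >= k) >= gamma
      k = max(k for k in range(1, n + 1)
              if 1.0 - binom.cdf(k - 1, n, 1.0 - beta) >= gamma)
      print(f"({beta}, {gamma}) lower tolerance limit: {x[k - 1]:.2f} mm "
            f"(the {k}-th smallest of {n} cases)")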

  12. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    PubMed

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery, and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods for eliminating noise from the resulting depictions. The authors hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided in this work between the gradual threshold increase method in probabilistic and in deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and the Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CNs. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p < 0.001; Wilcoxon signed-rank test). The false ...

  13. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    PubMed

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity, in combination with intrinsic plasticity, generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner, the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  14. The Role of Probabilistic Design Analysis Methods in Safety and Affordability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2016-01-01

    For the last several years, NASA and its contractors have been working together to build space launch systems to commercialize space. Developing commercial, affordable, and safe launch systems becomes very important and requires a paradigm shift. This paradigm shift enforces the need for an integrated systems engineering environment where cost, safety, reliability, and performance need to be considered to optimize the launch system design. In such an environment, rule-based and deterministic engineering design practices alone may not be sufficient to optimize margins and fault tolerance to reduce cost. As a result, introduction of Probabilistic Design Analysis (PDA) methods to support the current deterministic engineering design practices becomes a necessity to reduce cost without compromising reliability and safety. This paper discusses the importance of PDA methods in NASA's new commercial environment, their applications, and the key role they can play in designing reliable, safe, and affordable launch systems. More specifically, this paper discusses: 1) the involvement of NASA in PDA; 2) why PDA is needed; 3) a PDA model structure; 4) a PDA example application; and 5) the PDA link to safety and affordability.

  15. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a probabilistic model for the solar energetic particle environment; that is, develop a tool to provide a reference solar particle radiation environment that 1) will not be exceeded at a user-specified confidence level and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.
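    A toy Monte Carlo of the confidence-level idea, assuming Poisson event counts and lognormal event-integrated proton fluences with illustrative parameters; these are placeholders, not the calibrated model.

      import numpy as np

      rng = np.random.default_rng(13)
      n_missions = 100_000
      rate_per_year, mission_years = 6.0, 2.0    # assumed SEP event rate and mission length
      mu, sigma = np.log(3e8), 1.5               # assumed lognormal event fluence [protons/cm^2]

      n_events = rng.poisson(rate_per_year * mission_years, n_missions)
      fluence = np.array([rng.lognormal(mu, sigma, k).sum() for k in n_events])

      for conf in (0.50, 0.90, 0.95, 0.99):
          print(f"mission fluence not exceeded at {conf:.0%} confidence: "
                f"{np.quantile(fluence, conf):.2e} protons/cm^2")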

  16. RecA-mediated SOS response provides a geraniol tolerance in Escherichia coli.

    PubMed

    Shah, Asad Ali; Wang, Chonglong; Yoon, Sang-Hwal; Kim, Jae-Yean; Choi, Eui-Sung; Kim, Seon-Won

    2013-09-20

    Geraniol is an important industrial material and a potential candidate for advanced biofuels. One challenge of microbial geraniol production is its toxicity to hosts, and the poor understanding of the geraniol tolerance mechanism is an obstacle to developing geraniol-tolerant hosts. This study performed a genome-wide screen of a shotgun DNA library of Escherichia coli and found that recA is able to confer geraniol tolerance in E. coli. The recA knockout mutant was found to be extremely sensitive to geraniol. Based on our data, recA was deciphered to provide tolerance through the SOS response network responding to DNA damage caused by geraniol. The RecA-mediated SOS response activates homologous recombinational repair by RecB and RecN for corrective DNA maintenance. This protection mechanism suggests an effective strategy to combat geraniol toxicity in E. coli. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Development of probabilistic design method for annular fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozawa, Takayuki

    2007-07-01

    The increase of linear power and burn-up during reactor operation is considered as one measure to ensure the utility of fast reactors in the future; for this, the application of annular oxide fuels is under consideration. The annular fuel design code CEPTAR was developed in the Japan Atomic Energy Agency (JAEA) and verified by using many irradiation experiences with oxide fuels. In addition, the probabilistic fuel design code BORNFREE was also developed to provide a safe and reasonable fuel design and to evaluate the design margins quantitatively. This study aimed at the development of a probabilistic design method for annular oxide fuels; this was implemented in the developed BORNFREE-CEPTAR code, and the code was used to make a probabilistic evaluation with regard to the permissible linear power. (author)
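
    As a rough illustration of the probabilistic-margin idea (not the BORNFREE/CEPTAR methodology or its data), the sketch below propagates assumed scatter in linear power, fuel conductivity, and pellet surface temperature through a highly simplified solid-pellet centerline-temperature relation, dT = q'/(4*pi*k), and reads off a melt probability and a high percentile.

      import numpy as np

      rng = np.random.default_rng(3)
      N = 500_000

      # Illustrative, highly simplified pellet model; all distributions below are
      # assumptions for the sketch, not design-code data.
      q_lin  = rng.normal(48e3, 4e3, N)      # linear power [W/m]
      k_fuel = rng.normal(2.5, 0.3, N)       # fuel thermal conductivity [W/m/K]
      t_surf = rng.normal(750.0, 50.0, N)    # pellet surface temperature [C]
      t_melt = 2740.0                        # assumed melting limit [C]

      t_center = t_surf + q_lin / (4.0 * np.pi * k_fuel)
      print(f"P(centerline T > melt) ~ {np.mean(t_center > t_melt):.2e}")
      print(f"99.9th percentile centerline T ~ {np.percentile(t_center, 99.9):.0f} C")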

  18. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    Formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural, and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a mathematical ability test as having high mathematical ability. The subjects were given probability tasks covering sample space, probability of an event, and probability comparison. The data analysis consisted of categorization, reduction, interpretation, and conclusion. Credibility of the data was established through time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's probabilistic thinking was at a higher level than the girl's. The results could contribute to curriculum developers in formulating probability learning goals for elementary school students. Indeed, teachers could teach probability with regard to gender differences.

  19. Root Damage under Alkaline Stress Is Associated with Reactive Oxygen Species Accumulation in Rice (Oryza sativa L.).

    PubMed

    Zhang, Hui; Liu, Xiao-Long; Zhang, Rui-Xue; Yuan, Hai-Yan; Wang, Ming-Ming; Yang, Hao-Yu; Ma, Hong-Yuan; Liu, Duo; Jiang, Chang-Jie; Liang, Zheng-Wei

    2017-01-01

    Alkaline stress (high pH) severely damages root cells and consequently inhibits rice (Oryza sativa L.) seedling growth. In this study, we demonstrate the accumulation of reactive oxygen species (ROS) in root cells under alkaline stress. Seedlings of two rice cultivars with different alkaline tolerances, 'Dongdao-4' (moderately alkaline-tolerant) and 'Jiudao-51' (alkaline-sensitive), were subjected to alkaline stress simulated by 15 mM sodium carbonate (Na2CO3). Alkaline stress greatly reduced seedling survival rate, shoot and root growth, and root vigor. Moreover, severe root cell damage was observed under alkaline stress, as shown by increased membrane injury, malondialdehyde accumulation, and Evans Blue staining. The expression of the cell death-related genes OsKOD1, OsHsr203j, OsCP1, and OsNAC4 was consistently upregulated, while that of a cell death-suppressor gene, OsBI1, was downregulated. Analysis of the ROS contents revealed that alkaline stress induced a marked accumulation of superoxide anions (O2-) and hydrogen peroxide (H2O2) in rice roots. The application of procyanidins (a potent antioxidant) to rice seedlings 24 h prior to alkaline treatment significantly alleviated alkalinity-induced root damage and reduced seedling growth inhibition, both concomitant with reduced ROS accumulation. These results suggest that root cell damage, and consequently growth inhibition, of rice seedlings under alkaline stress is closely associated with ROS accumulation. The activities of the antioxidant enzymes superoxide dismutase, catalase, peroxidase, and ascorbate peroxidase increased under alkaline stress in the roots, probably in response to the cellular damage induced by oxidative stress. However, this response mechanism may be overwhelmed by the excess ROS accumulation observed under stress, resulting in oxidative damage to root cells. Our findings provide physiological insights into the molecular mechanisms of alkalinity-induced damage to root cells in rice seedlings.

  20. Local-based damage detection of cyclically loaded bridge piers using wireless sensing units

    NASA Astrophysics Data System (ADS)

    Hou, Tsung-Chin; Lynch, Jerome P.; Parra-Montesinos, Gustavo

    2005-05-01

    Concrete bridge piers are a common structural element employed in the design of bridges and elevated roadways. In order to ensure adequate behavior under earthquake-induced displacements, extensive reinforcement detailing in the form of closely spaced ties or spirals is necessary, leading to congestion problems and difficulties during concrete casting. Further, costly repairs, which in some cases involve the total or partial shutdown of the bridge, are often necessary in bridge piers after a major earthquake. In order to increase the damage tolerance while relaxing the transverse reinforcement requirements of bridge piers, the use of high-performance fiber reinforced cementitious composites (HPFRCC) in earthquake-resistant bridge piers is explored. HPFRCCs are a relatively new class of cementitious material for civil structures with tensile strain-hardening behavior and high damage tolerance. To monitor the behavior of this new class of material in the field, low-cost wireless monitoring technologies will be adopted to provide HPFRCC structural elements with the capability to accurately monitor their performance and health. In particular, the computational core of a wireless sensing unit can be harnessed to screen HPFRCC components for damage in real time. A seismic damage index initially proposed for flexure-dominated reinforced concrete elements is modified to serve as an algorithmic tool for the rapid assessment of damage (due to flexure and shear) in HPFRCC bridge piers subjected to large shear reversals. Traditional and non-traditional sensor strategies for an HPFRCC bridge pier are proposed to optimize the correlation between the proposed damage index model and the damage observed in a circular pier test specimen. Damage index models are shown to be a sufficiently accurate rough measure of the degree of local-area damage that can then be wirelessly communicated to bridge officials.
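
    The paper's modified damage index is not reproduced here; as a hedged sketch of the kind of local computation a wireless sensing unit could host, the following evaluates a Park-Ang-style index (a common choice for flexure-dominated reinforced concrete) on a hypothetical force-displacement history generated by a simple elastoplastic model. All numbers are illustrative.

      import numpy as np

      def park_ang_index(disp, force, disp_ult, force_yield, beta=0.05):
          """Park-Ang-style damage index from a displacement/force hysteresis history.
          D = max|d|/d_ult + beta * E_hyst / (F_y * d_ult); beta is typically 0.05-0.15.
          """
          e_hyst = np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(disp))  # dissipated energy [J]
          return np.max(np.abs(disp)) / disp_ult + beta * e_hyst / (force_yield * disp_ult)

      def elastoplastic_force(disp, k=25e6, f_y=250e3):
          """Elastic-perfectly-plastic restoring force for a given displacement history."""
          force, plastic = np.zeros_like(disp), 0.0
          for i, d in enumerate(disp):
              f_trial = k * (d - plastic)
              if abs(f_trial) > f_y:                  # yielding: update plastic offset
                  f_trial = np.sign(f_trial) * f_y
                  plastic = d - f_trial / k
              force[i] = f_trial
          return force

      # Hypothetical growing-amplitude cyclic drift history for a pier specimen
      t = np.linspace(0.0, 10.0, 4001)
      disp = 0.06 * (t / 10.0) * np.sin(2.0 * np.pi * t)   # m
      force = elastoplastic_force(disp)

      D = park_ang_index(disp, force, disp_ult=0.10, force_yield=250e3)
      print(f"Park-Ang-style damage index ~ {D:.2f}")

    In this convention, index values approaching 1 conventionally indicate severe damage, which is the kind of threshold a sensing unit could compare against before alerting bridge officials.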

  1. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated as relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  2. DISCOUNTING OF DELAYED AND PROBABILISTIC LOSSES OVER A WIDE RANGE OF AMOUNTS

    PubMed Central

    Green, Leonard; Myerson, Joel; Oliveira, Luís; Chang, Seo Eun

    2014-01-01

    The present study examined delay and probability discounting of hypothetical monetary losses over a wide range of amounts (from $20 to $500,000) in order to determine how amount affects the parameters of the hyperboloid discounting function. In separate conditions, college students chose between immediate payments and larger, delayed payments and between certain payments and larger, probabilistic payments. The hyperboloid function accurately described both types of discounting, and amount of loss had little or no systematic effect on the degree of discounting. Importantly, the amount of loss also had little systematic effect on either the rate parameter or the exponent of the delay and probability discounting functions. The finding that the parameters of the hyperboloid function remain relatively constant across a wide range of amounts of delayed and probabilistic loss stands in contrast to the robust amount effects observed with delayed and probabilistic rewards. At the individual level, the degree to which delayed losses were discounted was uncorrelated with the degree to which probabilistic losses were discounted, and delay and probability loaded on two separate factors, similar to what is observed with delayed and probabilistic rewards. Taken together, these findings argue that although delay and probability discounting involve fundamentally different decision-making mechanisms, nevertheless the discounting of delayed and probabilistic losses share an insensitivity to amount that distinguishes it from the discounting of delayed and probabilistic gains. PMID:24745086
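
    For reference, the hyperboloid forms commonly used in this literature are V = A/(1 + kD)^s for delayed outcomes and V = A/(1 + h*theta)^s, with odds against theta = (1 - p)/p, for probabilistic outcomes. The sketch below fits the delay form to hypothetical indifference points; the data, starting values, and bounds are illustrative, not the study's.

      import numpy as np
      from scipy.optimize import curve_fit

      def hyperboloid_delay(delay, k, s):
          """Subjective value (fraction of the nominal amount) of a delayed outcome."""
          return 1.0 / (1.0 + k * delay) ** s

      def hyperboloid_prob(p, h, s):
          """Subjective value of a probabilistic outcome; theta = (1-p)/p is odds against."""
          theta = (1.0 - p) / p
          return 1.0 / (1.0 + h * theta) ** s

      # Hypothetical group-median indifference points (fraction of the nominal loss)
      delays = np.array([1, 7, 30, 180, 365, 1825], dtype=float)   # days
      values = np.array([0.95, 0.90, 0.82, 0.66, 0.55, 0.35])

      (k, s), _ = curve_fit(hyperboloid_delay, delays, values,
                            p0=[0.01, 1.0], bounds=(1e-6, 10.0))
      print(f"fitted k = {k:.4f} per day, s = {s:.2f}")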

  3. Molecular biology of freezing tolerance.

    PubMed

    Storey, Kenneth B; Storey, Janet M

    2013-07-01

    Winter survival for many kinds of animals involves freeze tolerance, the ability to endure the conversion of about 65% of total body water into extracellular ice and the consequences that freezing imposes including interruption of vital processes (e.g., heartbeat and breathing), cell shrinkage, elevated osmolality, anoxia/ischemia, and potential physical damage from ice. Freeze-tolerant animals include various terrestrially hibernating amphibians and reptiles, many species of insects, and numerous other invertebrates inhabiting both terrestrial and intertidal environments. Well-known strategies of freezing survival include accumulation of low molecular mass carbohydrate cryoprotectants (e.g., glycerol), use of ice nucleating agents/proteins for controlled triggering of ice growth and of antifreeze proteins that inhibit ice recrystallization, and good tolerance of anoxia and dehydration. The present article focuses on more recent advances in our knowledge of the genes and proteins that support freeze tolerance and the metabolic regulatory mechanisms involved. Important roles have been identified for aquaporins and transmembrane channels that move cryoprotectants, heat shock proteins and other chaperones, antioxidant defenses, and metabolic rate depression. Genome and proteome screening has revealed many new potential targets that respond to freezing, in particular implicating cytoskeleton remodeling as a necessary facet of low temperature and/or cell volume adaptation. Key regulatory mechanisms include reversible phosphorylation control of metabolic enzymes and microRNA control of gene transcript expression. These help to remodel metabolism to preserve core functions while suppressing energy expensive metabolic activities such as the cell cycle. All of these advances are providing a much more complete picture of life in the frozen state. © 2013 American Physiological Society.

  4. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  5. Significance testing as perverse probabilistic reasoning

    PubMed Central

    2011-01-01

    Truth claims in the medical literature rely heavily on statistical significance testing. Unfortunately, most physicians misunderstand the underlying probabilistic logic of significance tests and consequently often misinterpret their results. This near-universal misunderstanding is highlighted by means of a simple quiz which we administered to 246 physicians at two major academic hospitals, on which the proportion of incorrect responses exceeded 90%. A solid understanding of the fundamental concepts of probability theory is becoming essential to the rational interpretation of medical information. This essay provides a technically sound review of these concepts that is accessible to a medical audience. We also briefly review the debate in the cognitive sciences regarding physicians' aptitude for probabilistic inference. PMID:21356064
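
    A hedged numerical illustration of the probabilistic point at issue, with an assumed prior and power (not data from the quiz): a result that is significant at p < 0.05 does not imply a 95% probability that the tested hypothesis is true.

      # Posterior probability that a hypothesis is true given a "significant" result.
      # The prior and power below are assumptions chosen only for illustration.
      prior_true = 0.10   # fraction of tested hypotheses that are actually true
      power      = 0.80   # P(significant | hypothesis true)
      alpha      = 0.05   # P(significant | hypothesis false)

      p_signif  = power * prior_true + alpha * (1.0 - prior_true)
      posterior = power * prior_true / p_signif
      print(f"P(hypothesis true | p < 0.05) = {posterior:.2f}")   # ~0.64, not 0.95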

  6. Probabilistic micromechanics for metal matrix composites

    NASA Astrophysics Data System (ADS)

    Engelstad, S. P.; Reddy, J. N.; Hopkins, Dale A.

    A probabilistic micromechanics-based nonlinear analysis procedure is developed to predict and quantify the variability in the properties of high temperature metal matrix composites. Monte Carlo simulation is used to model the probabilistic distributions of the constituent level properties including fiber, matrix, and interphase properties, volume and void ratios, strengths, fiber misalignment, and nonlinear empirical parameters. The procedure predicts the resultant ply properties and quantifies their statistical scatter. Graphite/copper and silicon carbide/titanium aluminide (SCS-6/Ti-15) unidirectional plies are considered to demonstrate the predictive capabilities. The procedure is believed to have a high potential for use in material characterization and selection to precede and assist in experimental studies of new high temperature metal matrix composites.
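
    A minimal sketch of the Monte Carlo idea, using a simple rule-of-mixtures relation for the longitudinal ply modulus and assumed constituent scatter; it is not the reference's micromechanics equations or material data.

      import numpy as np

      rng = np.random.default_rng(4)
      N = 200_000

      # Hypothetical constituent scatter for a unidirectional ply (illustrative only)
      E_f = rng.normal(380e9, 15e9, N)        # fiber modulus [Pa]
      E_m = rng.normal(110e9, 8e9, N)         # matrix modulus [Pa]
      V_f = rng.normal(0.35, 0.02, N)         # fiber volume fraction
      V_v = np.clip(rng.normal(0.02, 0.01, N), 0.0, None)   # void fraction

      # Rule of mixtures, degraded by voids, for the longitudinal ply modulus
      E_11 = (E_f * V_f + E_m * (1.0 - V_f)) * (1.0 - V_v)

      mean, std = E_11.mean(), E_11.std()
      print(f"E11: mean {mean/1e9:.1f} GPa, CoV {100*std/mean:.1f}%")
      print(f"  5th/95th percentiles [GPa]: {np.percentile(E_11, [5, 95])/1e9}")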

  7. A New Scheme for Probabilistic Teleportation and Its Potential Applications

    NASA Astrophysics Data System (ADS)

    Wei, Jia-Hua; Dai, Hong-Yi; Zhang, Ming

    2013-12-01

    We propose a novel scheme to probabilistically teleport an unknown two-level quantum state when the information about the partially entangled state is available only to the sender. This is in contrast with previous typical teleportation schemes, in which the receiver must know the non-maximally entangled state. Additionally, we illustrate two potential applications of the novel scheme for probabilistic teleportation from a sender to a receiver with the help of an assistant, who plays distinct roles under different communication conditions, and our results show that the novel proposal could enlarge the range of application of probabilistic teleportation.
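
    For orientation only: in the simplest probabilistic teleportation protocols over a partially entangled channel a|00> + b|11> (with |a| >= |b| and |a|^2 + |b|^2 = 1), the optimal probability of faithful teleportation is the textbook value 2|b|^2. The sketch below merely tabulates this value; it is not the scheme proposed in the paper.

      import numpy as np

      # Smaller Schmidt coefficient b of the shared channel a|00> + b|11>, |a| >= |b|.
      # Success probability of faithful probabilistic teleportation: 2*|b|^2.
      b = np.linspace(0.0, 1.0 / np.sqrt(2.0), 6)
      p_success = 2.0 * b ** 2

      for bi, pi in zip(b, p_success):
          print(f"b = {bi:.3f}  ->  P(success) = {pi:.3f}")
      # At b = 1/sqrt(2) the channel is maximally entangled and teleportation is deterministic.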

  8. Emergence of spontaneous anticipatory hand movements in a probabilistic environment

    PubMed Central

    Bruhn, Pernille

    2013-01-01

    In this article, we present a novel experimental approach to the study of anticipation in probabilistic cuing. We implemented a modified spatial cuing task in which participants made an anticipatory hand movement toward one of two probabilistic targets while the (x, y)-computer mouse coordinates of their hand movements were sampled. This approach allowed us to tap into anticipatory processes as they occurred, rather than just measuring their behavioral outcome through reaction time to the target. In different conditions, we varied the participants’ degree of certainty of the upcoming target position with probabilistic pre-cues. We found that participants initiated spontaneous anticipatory hand movements in all conditions, even when they had no information on the position of the upcoming target. However, participants’ hand position immediately before the target was affected by the degree of certainty concerning the target’s position. This modulation of anticipatory hand movements emerged rapidly in most participants as they encountered a constant probabilistic relation between a cue and an upcoming target position over the course of the experiment. Finally, we found individual differences in the way anticipatory behavior was modulated with an uncertain/neutral cue. Implications of these findings for probabilistic spatial cuing are discussed. PMID:23833694

  9. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal being to maximize the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.

  10. Genetic compensation of high dose radiation-induced damage in an anhydrobiotic insect

    NASA Astrophysics Data System (ADS)

    Gusev, Oleg; Nakahara, Yuichi; Sakashita, Tetsuya; Kikawada, Takahiro; Okuda, Takashi

    Anhydrobiotic larvae of the African chironomid Polypedilum vanderplanki are known to show an extremely high tolerance against a range of stresses. The tolerance to various extreme environments exhibited by this insect may be due to its almost complete desiccation, with water replaced by trehalose, a state in which little or no chemical reaction occurs. Since 2005, dried larvae of this insect have been used in a number of space experiments, both inside and outside the ISS, as a model organism for estimating the limits of higher organisms' resistance to space-environment stresses and for long-term storage of live anhydrobiotic organisms during continuous spaceflight. We have shown previously that both hydrated and dried larvae of Polypedilum vanderplanki have very high tolerance to both high- and low-linear energy transfer (LET) radiation, surviving 7000 Gy irradiation. It was suggested that the larvae have an effective DNA repair system in addition to the protection provided by the water-free glassy state. In the present study, we analyzed stress-related gene expression in the larvae after 70-2000 Gy irradiation. Both the level of DNA damage and the activity of DNA repair, anti-apoptotic, and protein-damage-related genes were analyzed. Direct visualization of DNA damage in larval fat body cells using the comet assay showed that DNA fragmented by radiation is re-arranged within 76-98 hours after exposure. We found that massive overexpression of hsp and antioxidant genes occurs in larvae entering anhydrobiosis and supports refolding of proteins after rehydration. In the irradiated larvae, overexpression of DNA repair enzymes and anti-apoptotic genes was confirmed, suggesting that survival after high-dose irradiation results from a combination of highly effective blocking of apoptosis after severe DNA damage and DNA repair.

  11. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    NASA Astrophysics Data System (ADS)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people, and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran, and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose, a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists of evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for the Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking the Newmark displacement Dn to the Arias intensity Ia and to the slope critical acceleration ac. These formulae were employed to evaluate
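
    The Iran-specific regressions developed in the study are not reproduced here; as a hedged illustration of the Newmark-displacement step, the sketch below uses a widely cited regression of the Jibson (1998) form, log10 Dn = 1.521 log10 Ia - 1.993 log10 ac - 1.546, with Dn in cm, Ia in m/s, and ac in g.

      import numpy as np

      def newmark_displacement_cm(arias_intensity, critical_acc_g):
          """Empirical Newmark displacement [cm] from Arias intensity [m/s] and
          critical acceleration [g], using a widely cited Jibson (1998)-form regression:
          log10 Dn = 1.521*log10(Ia) - 1.993*log10(ac) - 1.546.
          The region-specific coefficients derived in the study are not reproduced here.
          """
          return 10.0 ** (1.521 * np.log10(arias_intensity)
                          - 1.993 * np.log10(critical_acc_g) - 1.546)

      # Example: shaking with Ia = 2.0 m/s on slopes of varying critical acceleration
      for ac in (0.05, 0.10, 0.20, 0.40):
          dn = newmark_displacement_cm(2.0, ac)
          print(f"ac = {ac:.2f} g  ->  Dn ~ {dn:6.1f} cm")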

  12. A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’

    PubMed Central

    2017-01-01

    Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. Contents: 1 Introduction; 2 Preemption; 3 Structural Equation Models; 4 The Halpern and Pearl Definition of ‘Actual Cause’; 5 Preemption Again; 6 The Probabilistic Case; 7 Probabilistic Causal Models; 8 A Proposed Probabilistic Extension of Halpern and Pearl’s Definition; 9 Twardy and Korb’s Account; 10 Probabilistic Fizzling; 11 Conclusion. PMID:29593362

  13. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

    Probabilistic Seismic Loss Estimation is a methodology used as a quantitative and explicit expression of the performance of buildings using terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach for seismic loss assessment of buildings requires using Incremental Dynamic Analysis (IDA), which needs hundreds of time-consuming analyses, which in turn hinders its wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand propagation prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to achieve this purpose and their appropriateness has been evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA driven response predictions of 34 code conforming benchmark structures and was proven to be sufficiently precise while offering a great deal of efficiency. The loss values were estimated by replacing IDA with the proposed ETM-based procedure in the ATC 58 procedure and it was found that these values suffer from varying inaccuracies, which were attributed to the discretized nature of damage and loss prediction functions provided by ATC 58.

  14. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  15. Balancing repair and tolerance of DNA damage caused by alkylating agents (Series: Genomic Instability in Cancer)

    PubMed Central

    Fu, Dragony; Calvo, Jennifer A.; Samson, Leona D

    2013-01-01

    Alkylating agents comprise a major class of frontline chemotherapeutic drugs that inflict cytotoxic DNA damage as their main mode of action, in addition to collateral mutagenic damage. Numerous cellular pathways, including direct DNA damage reversal, base excision repair (BER), and mismatch repair (MMR) respond to alkylation damage to defend against alkylation-induced cell death or mutation. However, maintaining a proper balance of activity both within and between these pathways is crucial for an organism's favorable response to alkylating agents. Furthermore, an individual's response to alkylating agents can vary considerably from tissue to tissue and from person to person, pointing to genetic and epigenetic mechanisms that modulate alkylating agent toxicity. PMID:22237395

  16. Damage detection of engine bladed-disks using multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Fang, X.; Tang, J.

    2006-03-01

    The timely detection of damage in aero-engine bladed-disks is an extremely important and challenging research topic. Bladed-disks have high modal density and, particularly, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade difference, or mistuning), operating condition changes, and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed-disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction, and denoising. The non-model-based damage detection is achieved by analyzing the change between response features of the healthy structure and of the damaged one. We facilitate such comparison by incorporating Hotelling's T2 statistic, which yields damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
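
    A minimal sketch (on synthetic features, not blade-tip-timing data) of the PCA-plus-Hotelling's-T2 screening step: fit principal components to healthy-condition features, set a control limit from the training T2 values, and declare damage when a new response exceeds it.

      import numpy as np

      rng = np.random.default_rng(5)

      # Training set: correlated vibration features from the healthy structure (synthetic)
      n_train, n_feat, n_pc = 200, 12, 3
      healthy = rng.normal(0.0, 1.0, (n_train, n_feat)) @ rng.normal(0.0, 1.0, (n_feat, n_feat))

      mean = healthy.mean(axis=0)
      X = healthy - mean
      _, S, Vt = np.linalg.svd(X, full_matrices=False)
      P = Vt[:n_pc].T                           # retained principal directions
      var = (S[:n_pc] ** 2) / (n_train - 1)     # variance captured by each component

      def hotelling_t2(x):
          """T2 statistic of a feature vector in the retained principal subspace."""
          scores = (x - mean) @ P
          return float(np.sum(scores ** 2 / var))

      # Control limit from the healthy data (e.g. 99th percentile of training T2 values)
      limit = np.percentile([hotelling_t2(row) for row in healthy], 99)

      # Simulated "damaged" response: shifted 5 standard deviations along the first PC
      test = mean + 5.0 * np.sqrt(var[0]) * P[:, 0]
      print(f"T2 limit ~ {limit:.1f}, test T2 ~ {hotelling_t2(test):.1f}")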

  17. Expectancy Learning from Probabilistic Input by Infants

    PubMed Central

    Romberg, Alexa R.; Saffran, Jenny R.

    2013-01-01

    Across the first few years of life, infants readily extract many kinds of regularities from their environment, and this ability is thought to be central to development in a number of domains. Numerous studies have documented infants’ ability to recognize deterministic sequential patterns. However, little is known about the processes infants use to build and update representations of structure in time, and how infants represent patterns that are not completely predictable. The present study investigated how infants’ expectations for a simple structure develop over time, and how infants update their representations with new information. We measured 12-month-old infants’ anticipatory eye movements to targets that appeared in one of two possible locations. During the initial phase of the experiment, infants either saw targets that appeared consistently in the same location (Deterministic condition) or probabilistically in either location, with one side more frequent than the other (Probabilistic condition). After this initial divergent experience, both groups saw the same sequence of trials for the rest of the experiment. The results show that infants readily learn from both deterministic and probabilistic input, with infants in both conditions reliably predicting the most likely target location by the end of the experiment. Local context had a large influence on behavior: infants adjusted their predictions to reflect changes in the target location on the previous trial. This flexibility was particularly evident in infants with more variable prior experience (the Probabilistic condition). The results provide some of the first data showing how infants learn in real time. PMID:23439947

  18. Analyses of flooding tolerance of soybean varieties at emergence and varietal differences in their proteomes.

    PubMed

    Nanjo, Yohei; Jang, Hee-Young; Kim, Hong-Sig; Hiraga, Susumu; Woo, Sun-Hee; Komatsu, Setsuko

    2014-10-01

    Flooding of fields due to heavy and/or continuous rainfall influences soybean production. To identify soybean varieties with flooding tolerance at the seedling emergence stage, 128 soybean varieties were evaluated using a flooding tolerance index, which is based on plant survival rates, the lack of apparent damage and lateral root development, and post-flooding radicle elongation rate. The soybean varieties were ranked according to their flooding tolerance index, and it was found that the tolerance levels of soybean varieties exhibit a continuum of differences between varieties. Subsequently, tolerant, moderately tolerant and sensitive varieties were selected and subjected to comparative proteomic analysis to clarify the tolerance mechanism. Proteomic analysis of the radicles, combined with correlation analysis, showed that the ratios of RNA binding/processing related proteins and flooding stress indicator proteins were significantly correlated with flooding tolerance index. The RNA binding/processing related proteins were positively correlated in untreated soybeans, whereas flooding stress indicator proteins were negatively correlated in flooded soybeans. These results suggest that flooding tolerance is regulated by mechanisms through multiple factors and is associated with abundance levels of the identified proteins. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Probabilistic thinking and death anxiety: a terror management based study.

    PubMed

    Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S

    2014-01-01

    Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.

  20. A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography

    DTIC Science & Technology

    2010-04-01

    A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in... criteria for aligning curves and particularly tracts. In this work, we present a global probabilistic approach inspired by the voting procedure provided