Science.gov

Sample records for probabilistic damage tolerance

  1. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation focuses on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are given in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  2. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) the traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or, with minimal training, by any stress analyst for fatigue life estimates of structural components.
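The two methods described above can be sketched in a few lines. This is an illustrative reconstruction, not the FATIG source: it assumes an S-N curve of the form N(S) = c / S^b and narrow-band Gaussian stress with Rayleigh-distributed amplitudes, and compares the numerically integrated Miner sum (truncated at k*sigma, as the abstract describes) against the closed-form Gamma-function result:

```python
import math

def rayleigh_pdf(s, sigma):
    # Amplitude distribution for a narrow-band Gaussian process with rms sigma
    return (s / sigma**2) * math.exp(-s**2 / (2 * sigma**2))

def damage_numeric(n_cycles, sigma, b, c, k=6.0, steps=20000):
    # Miner's rule integrated over Rayleigh amplitudes up to k*sigma:
    # D = n * integral( p(S) / N(S) ) dS, with N(S) = c / S^b
    ds = k * sigma / steps
    total = 0.0
    for i in range(steps):
        s = (i + 0.5) * ds  # midpoint rule
        total += (s ** b) * rayleigh_pdf(s, sigma) * ds
    return n_cycles * total / c

def damage_closed_form(n_cycles, sigma, b, c):
    # Integral version of the Palmgren-Miner rule over all amplitudes:
    # E[D] = (n / c) * (sqrt(2) * sigma)^b * Gamma(1 + b/2)
    return (n_cycles / c) * (math.sqrt(2) * sigma) ** b * math.gamma(1 + b / 2)
```

For example, with b = 4, sigma = 10, and c = 1e12 the two routes agree to within a fraction of a percent once the truncation limit k is large enough, which is exactly the consistency check the two methods in the abstract permit.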

  3. Certification of damage tolerant composite structure

    NASA Technical Reports Server (NTRS)

    Rapoff, Andrew J.; Dill, Harold D.; Sanger, Kenneth B.; Kautz, Edward F.

    1990-01-01

    A reliability based certification testing methodology for impact damage tolerant composite structure was developed. Cocured, adhesively bonded, and impact damaged composite static strength and fatigue life data were statistically analyzed to determine the influence of test parameters on the data scatter. The impact damage resistance and damage tolerance of various structural configurations were characterized through the analysis of an industry wide database of impact test results. Realistic impact damage certification requirements were proposed based on actual fleet aircraft data. The capabilities of available impact damage analysis methods were determined through correlation with experimental data. Probabilistic methods were developed to estimate the reliability of impact damaged composite structures.

  4. Damage Tolerance of Composites

    NASA Technical Reports Server (NTRS)

    Hodge, Andy

    2007-01-01

    Fracture control requirements have been developed to address damage tolerance of composites for manned space flight hardware. The requirements provide the framework for critical and noncritical hardware assessment and testing. The need for damage threat assessments, impact damage protection plans, and nondestructive evaluation is also addressed. Hardware intended to be damage tolerant has extensive coupon, sub-element, and full-scale testing requirements in line with the Building Block Approach concept from MIL-HDBK-17, the Department of Defense Composite Materials Handbook.

  5. Composites Damage Tolerance Workshop

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne

    2006-01-01

    The Composite Damage Tolerance Workshop included participants from NASA, academia, and private industry. The objectives of the workshop were to begin dialogue in order to establish a working group within the Agency, create awareness of damage tolerance requirements for Constellation, and discuss potential composite hardware for the Crew Launch Vehicle (CLV) Upper Stage (US) and Crew Module. It was proposed that a composites damage tolerance working group be created that acts within the framework of the existing NASA Fracture Control Methodology Panel. The working group charter would be to identify damage tolerance gaps and obstacles for implementation of composite structures into manned space flight systems and to develop strategies and recommendations to overcome these obstacles.

  6. Damage Tolerance Assessment Branch

    NASA Technical Reports Server (NTRS)

    Walker, James L.

    2013-01-01

    The Damage Tolerance Assessment Branch evaluates the ability of a structure to perform reliably throughout its service life in the presence of a defect, crack, or other form of damage. Such assessment is fundamental to the use of structural materials and requires an integral blend of materials engineering, fracture testing and analysis, and nondestructive evaluation. The vision of the Branch is to increase the safety of manned space flight by improving the fracture control and the associated nondestructive evaluation processes through the development and application of standards, guidelines, and advanced test and analytical methods. The Branch also strives to assist with and solve non-aerospace-related NDE and damage tolerance problems, providing consultation, prototyping, and inspection services.

  7. Damage identification with probabilistic neural networks

    SciTech Connect

    Klenke, S.E.; Paez, T.L.

    1995-12-01

    This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions regarding the effectiveness of the approaches.
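The first (classical) PNN described above is essentially a Parzen-window Bayes classifier. A minimal one-dimensional sketch, with made-up exemplar feature values and a hand-picked kernel width h (both are assumptions for illustration, not values from the paper):

```python
import math

def kernel_density(x, exemplars, h):
    # Parzen-window (Gaussian kernel) estimate of the class density at x
    norm = len(exemplars) * h * math.sqrt(2.0 * math.pi)
    return sum(math.exp(-((x - e) ** 2) / (2.0 * h * h)) for e in exemplars) / norm

def pnn_classify(x, undamaged_exemplars, damaged_exemplars, h=0.5):
    # Bayesian decision with equal priors: pick the class whose estimated
    # density at the measurement x is larger
    p_u = kernel_density(x, undamaged_exemplars, h)
    p_d = kernel_density(x, damaged_exemplars, h)
    return "damaged" if p_d > p_u else "undamaged"
```

The second PNN in the abstract uses only the undamaged density: the same `kernel_density` estimate, thresholded, yields a novelty-detection judgment when no damaged exemplars exist.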

  8. Damage Tolerance: Assessment Handbook. Volume 2: Airframe Damage Tolerance Evaluation

    DTIC Science & Technology

    1993-10-01

    Airframe Damage Tolerance Evaluation ... Research and Special Programs Administration, John A. Volpe National Transportation Systems...permission of John Wiley and Sons, New York, N.Y.] [4-5] CORRODED END Magnesium Magnesium alloys Zinc Galvanized steel or galvanized wrought iron Aluminum...Reprinted from M. M. Ratwani and D. P. Wilhem, Development and Evaluation of Methods of Plane Strain Fracture Analysis, Northrop Corporation, AFFDL-TR-73-42

  9. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
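The core move in such a method, propagating scatter in primitive variables up to a structural-scale reliability, can be illustrated with a toy Monte Carlo loop. The strength and stress distributions below are invented for illustration only; the report's actual simulation chain runs through full material-behavior and structural models:

```python
import random

def reliability_monte_carlo(n=100_000, seed=7):
    # Primitive variables with scatter (illustrative values, not from the report):
    # applied stress ~ N(70, 8) MPa, material strength ~ N(100, 10) MPa.
    # Reliability is estimated as P(strength > stress) by direct sampling.
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n) if rng.gauss(70.0, 8.0) >= rng.gauss(100.0, 10.0)
    )
    return 1.0 - failures / n
```

With these numbers the analytic reliability is about 0.990, so the sampled estimate also quantifies its own scatter: rerunning with different seeds shows the sampling uncertainty that a formal method must account for.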

  10. Damage Tolerance of Large Shell Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Chamis, C. C.

    1999-01-01

    Progressive damage and fracture of large shell structures is investigated. A computer model is used for the assessment of structural response, progressive fracture resistance, and defect/damage tolerance characteristics. Critical locations of a stiffened conical shell segment are identified. Defective and defect-free computer models are simulated to evaluate structural damage/defect tolerance. Safe pressurization levels are assessed for the retention of structural integrity in the presence of damage/defects. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Damage propagation and burst pressures for defective and defect-free shells are compared to evaluate damage tolerance. Design implications with regard to defect and damage tolerance of a large steel pressure vessel are examined.

  11. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models share a common trait: complex damaging processes are described by simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities that were affected during the 2002 flood of the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of the probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
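The bagging idea behind BT-FLEMO, an ensemble of trees fit on bootstrap resamples whose spread of predictions forms the damage distribution, can be sketched with a deliberately simplified stand-in for a decision tree. Here each "tree" is a single depth-threshold stump and all data are synthetic; the real model uses full multi-variate trees on flood and building variables:

```python
import random
import statistics

def bootstrap_damage_distribution(depths, damages, query_depth,
                                  n_trees=200, seed=1):
    # Toy stand-in for bagged decision trees: each "tree" is a depth-threshold
    # stump fit on a bootstrap resample, predicting the mean damage of the
    # resampled cases on the query's side of the split.
    rng = random.Random(seed)
    n = len(depths)
    preds = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        sample = [(depths[i], damages[i]) for i in idx]
        split = statistics.median(d for d, _ in sample)
        side = [dm for d, dm in sample if (d > split) == (query_depth > split)]
        preds.append(statistics.mean(side) if side else
                     statistics.mean(dm for _, dm in sample))
    return preds  # empirical distribution of damage estimates
```

The returned list is the point of the exercise: instead of one deterministic stage-damage value, the ensemble yields a distribution whose spread quantifies the prediction uncertainty.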

  12. Derivation Of Probabilistic Damage Definitions From High Fidelity Deterministic Computations

    SciTech Connect

    Leininger, L D

    2004-10-26

    This paper summarizes a methodology used by the Underground Analysis and Planning System (UGAPS) at Lawrence Livermore National Laboratory (LLNL) for the derivation of probabilistic damage curves for US Strategic Command (USSTRATCOM). UGAPS uses high fidelity finite element and discrete element codes on the massively parallel supercomputers to predict damage to underground structures from military interdiction scenarios. These deterministic calculations can be riddled with uncertainty, especially when intelligence, the basis for this modeling, is uncertain. The technique presented here attempts to account for this uncertainty by bounding the problem with reasonable cases and using those bounding cases as a statistical sample. Probability of damage curves are computed and represented that account for uncertainty within the sample and enable the war planner to make informed decisions. This work is flexible enough to incorporate any desired damage mechanism and can utilize the variety of finite element and discrete element codes within the national laboratory and government contractor community.
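Treating the bounding deterministic runs as a statistical sample, a probability-of-damage curve reduces to the empirical exceedance probability at each damage threshold. A sketch of that step, with placeholder damage values rather than anything from the UGAPS analyses:

```python
def probability_of_damage_curve(damage_samples, thresholds):
    # Empirical P(damage >= t) computed from a small set of bounding-case runs;
    # each run contributes one damage value to the sample.
    n = len(damage_samples)
    return [sum(1 for d in damage_samples if d >= t) / n for t in thresholds]
```

Plotting the returned probabilities against the thresholds gives the damage curve a planner reads off: the fraction of plausible scenarios in which damage meets or exceeds each level.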

  13. 77 FR 4890 - Damage Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures, and Damage Tolerance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ... static strength of composite rotorcraft structures using a damage tolerance evaluation, or a fatigue... regulations to require evaluation of fatigue and residual static strength of composite rotorcraft...

  14. A Novel Approach to Rotorcraft Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Everett, Richard A.; Newman, John A.

    2002-01-01

    Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in an HCF component means that a design based on a traditional DT method will either be impractical because of frequent inspections or too heavy to operate efficiently. Furthermore, once an HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing an HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.

  15. Damage-tolerance strategies for nacre tablets.

    PubMed

    Wang, Shengnan; Zhu, Xinqiao; Li, Qiyang; Wang, Rizhi; Wang, Xiaoxiang

    2016-05-01

    Nacre, a natural armor, exhibits prominent penetration resistance against predatory attacks. Unraveling its hierarchical toughening mechanisms and damage-tolerance design strategies may provide significant inspiration for the pursuit of high-performance artificial armors. In this work, relationships between the structure and mechanical performance of nacre were investigated. The results show that other than their brick-and-mortar structure, individual nacre tablets significantly contribute to the damage localization of nacre. Affected by intracrystalline organics, the tablets exhibit a unique fracture behavior. The synergistic action of the nanoscale deformation mechanisms increases the energy dissipation efficiency of the tablets and contributes to the preservation of the structural and functional integrity of the shell.

  16. Durability and Damage Tolerance of Aluminum Castings

    DTIC Science & Technology

    1988-09-01

    casting alloys A357 and A201. On completion of the program, revisions to material, process, and DADT specifications will be recommended, if necessary...and the effects of process variables on the properties of A357 -T6 and A201-T7 castings were described. This second interim report covers additional...damage tolerance properties of A357 -T6 and A201-T7 produced using the specifications selected earlier [1] in Task 2 were determined. These alloys were

  17. 75 FR 11734 - Damage Tolerance Data for Repairs and Alterations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... TRANSPORTATION Federal Aviation Administration 14 CFR Part 26 RIN 2120-AI32 Damage Tolerance Data for Repairs and... make damage tolerance data for repairs and alterations to fatigue critical airplane structure available... of design approvals to make available to operators damage tolerance (DT) data for repairs...

  18. Damage Tolerance of Composite Laminates from an Empirical Perspective

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2009-01-01

    Damage tolerance consists of analysis and experimentation working together. Impact damage is usually of most concern for laminated composites. Once impacted, the residual compression strength is usually of most interest. Other properties may be of more interest than compression (application dependent). A damage tolerance program is application specific (not everyone is building aircraft). The "Building Block Approach" is suggested for damage tolerance. Advantage can be taken of the excellent fatigue resistance of damaged laminates to save time and costs.

  19. Damage Tolerance of Integral Structure in Rotorcraft

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Urban, Michael R.

    2003-01-01

    The rotorcraft industry has rapidly implemented integral structures into aircraft to benefit from the weight and cost advantages over traditionally riveted structure. The cost to manufacture an integral structure, where the entire component is machined from a single plate of material, is about one-fifth that of a riveted structure. Furthermore, the integral structure can weigh only one-half that of a riveted structure through optimal design of stiffening structure and part reduction. Finally, inspection and repair of damage in the field can be less costly than riveted structure. There are no rivet heads to inspect under, reducing inspection time, and damage can be removed or patched readily without altering the primary structure, reducing replacement or repair costs. In this paper, the authors will investigate the damage tolerance implications of fielding an integral structure manufactured from thick plate aluminum.

  20. High damage tolerance of electrochemically lithiated silicon

    PubMed Central

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-01-01

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries. PMID:26400671

  1. High damage tolerance of electrochemically lithiated silicon

    DOE PAGES

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; ...

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. In this paper, we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Finally, our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  2. High damage tolerance of electrochemically lithiated silicon

    SciTech Connect

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. In this paper, we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Finally, our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  3. High damage tolerance of electrochemically lithiated silicon.

    PubMed

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X; Zhu, Ting; Xia, Shuman

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro-chemo-mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  4. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
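The surrogate-model idea, replacing the expensive finite element model with a cheap interpolant fit to a handful of precomputed runs, can be sketched with a one-dimensional piecewise-linear interpolant. The paper's surrogate is more sophisticated; this only illustrates the pattern of precompute-then-interpolate:

```python
import bisect

def build_surrogate(xs, ys):
    # Piecewise-linear interpolant over precomputed (input, response) pairs,
    # standing in for the costly high-fidelity model; xs must be sorted.
    def surrogate(x):
        i = bisect.bisect_left(xs, x)
        if i <= 0:
            return ys[0]      # clamp below the sampled range
        if i >= len(xs):
            return ys[-1]     # clamp above the sampled range
        t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        return ys[i - 1] + t * (ys[i] - ys[i - 1])
    return surrogate
```

Once built, the surrogate can be evaluated millions of times inside an uncertainty-quantification loop at negligible cost, which is what makes probabilistic remaining-useful-life prediction tractable.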

  5. Impact Damage Tolerance of a Carbon Fibre Composite Laminate.

    DTIC Science & Technology

    1984-05-01

    design of composite structures. 8 CONCLUSIONS These carbon fibre/epoxy resin laminates are susceptible to low energy impact damage, especially...ROYAL AIRCRAFT ESTABLISHMENT Technical Report 84049 May 1984 GARTEUR/TP-007 IMPACT DAMAGE TOLERANCE OF A CARBON FIBRE COMPOSITE LAMINATE by G...007 Received for printing 3 May 1984 IMPACT DAMAGE TOLERANCE OF A CARBON FIBRE COMPOSITE LAMINATE by G. Dorey, P. Sigety*, K. Stellbrink**, W. G. J. 't

  6. High performance implementation of probabilistic damage tolerance analysis

    NASA Astrophysics Data System (ADS)

    Crosby, Nathan

    A number of recent studies have demonstrated that large-scale extratropical wave activity is characterized by quasi-periodic behavior on timescales of 20-30 days, particularly in the Southern Hemisphere. This phenomenon has been termed the Baroclinic Annular Mode (BAM), and is responsible for the modulation of eddy heat fluxes, eddy kinetic energy, and precipitation on large scales. However, the extent to which this periodic modulation is discernable or significant on smaller spatial scales had not yet been established. Using data from the ECMWF Interim Reanalysis for the period 1979-2014, this study extensively examines the spatial structure of the BAM. Spectral analyses reveal the spatial limitations of the periodic behavior, while lag-correlation analyses reveal the patterns of propagation and development of anomalies that give rise to the observed periodicity. Periodic behavior is more robust in the Southern Hemisphere than in the Northern Hemisphere, but filtering out low wavenumbers from NH data helps clarify the BAM signal. Additionally, it is demonstrated that the BAM appears very differently in two relatively similar global climate models, suggesting further study is needed to determine how modern GCMs capture the BAM. Supplementing our analyses of observed and modeled data is a simple two-way linear feedback model, which is utilized to demonstrate the principal mechanism underlying periodic behavior in the BAM. The model makes it apparent that the BAM can be modeled as a simple linear feedback between baroclinicity and eddy heat fluxes. The periodicity seen on larger scales is a product of differential advection rates affecting the development of spatially overlapping, out-of-phase anomalies. 
The large-scale nature of the periodic behavior, however, makes it difficult to draw conclusions about the potential utility of the BAM for weather analysts and forecasters, and the limitations of this study limit our ability to describe its role in the climate system. It is hoped that the research presented here will pave the way to future studies which may more thoroughly answer such questions.

  7. Damage Tolerance Analysis of a Pressurized Liquid Oxygen Tank

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Harvin, Stephen F.; Gregory, Peyton B.; Mason, Brian H.; Thompson, Joe E.; Hoffman, Eric K.

    2006-01-01

    A damage tolerance assessment was conducted of an 8,000 gallon pressurized Liquid Oxygen (LOX) tank. The LOX tank is constructed of a stainless steel pressure vessel enclosed by a thermal-insulating vacuum jacket. The vessel is pressurized to 2,250 psi with gaseous nitrogen, resulting in both thermal and pressure stresses on the tank wall. Finite element analyses were performed on the tank to characterize the stresses from operation. Engineering material data were obtained from both the construction of the tank and the technical literature. An initial damage state was assumed based on records of a nondestructive inspection performed on the tank. The damage tolerance analyses were conducted using the NASGRO computer code. This paper contains the assumptions and justifications made for the input parameters to the damage tolerance analyses, the results of those analyses, and a discussion of the operational safety of the LOX tank.

  8. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  9. Progressive Fracture and Damage Tolerance of Composite Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Gotsis, Pascal K.; Minnetyan, Levon

    1997-01-01

    Structural performance (integrity, durability, and damage tolerance) of fiber reinforced composite pressure vessels, designed as pressurized shelters for planetary exploration, is investigated via computational simulation. An integrated computer code is utilized for the simulation of damage initiation, growth, and propagation under pressure. Aramid fibers are considered in a rubbery polymer matrix for the composite system. Effects of fiber orientation and of fabrication defects/accidental damage are investigated with regard to the safety and durability of the shelter. Results show the viability of fiber reinforced pressure vessels as damage tolerant shelters for planetary colonization.

  10. Multiaxial and thermomechanical fatigue considerations in damage tolerant design

    NASA Technical Reports Server (NTRS)

    Leese, G. E.; Bill, R. C.

    1985-01-01

    In considering damage tolerant design concepts for gas turbine hot section components, several challenging concerns arise: Complex multiaxial loading situations are encountered; Thermomechanical fatigue loading involving very wide temperature ranges is imposed on components; Some hot section materials are extremely anisotropic; and coatings and environmental interactions play an important role in crack propagation. The effects of multiaxiality and thermomechanical fatigue are considered from the standpoint of their impact on damage tolerant design concepts. Recently obtained research results as well as results from the open literature are examined and their implications for damage tolerant design are discussed. Three important needs required to advance analytical capabilities in support of damage tolerant design become readily apparent: (1) a theoretical basis to account for the effect of nonproportional loading (mechanical and mechanical/thermal); (2) the development of practical crack growth parameters that are applicable to thermomechanical fatigue situations; and (3) the development of crack growth models that address multiple crack failures.

  11. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
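
    As a concrete illustration of the Bayesian estimation described above, the sketch below infers a single damage-size parameter from noisy strain data with a plain Metropolis sampler. The one-line forward model, the noise level, the uniform prior, and all numerical values are illustrative assumptions standing in for the paper's finite element model and DRAM sampler.

```python
import math
import random

# Hypothetical 1-D forward model: strain at a sensor as a function of damage
# size a. This stands in for the costly finite element (or surrogate) model.
def forward_strain(a):
    return 1.0 + 0.5 * a

def log_likelihood(a, data, sigma=0.05):
    # Gaussian measurement-error model for the strain sensors
    return -sum((d - forward_strain(a)) ** 2 for d in data) / (2 * sigma ** 2)

def metropolis(data, n_samples=5000, step=0.05, a0=0.5, seed=1):
    rng = random.Random(seed)
    a, ll = a0, log_likelihood(a0, data)
    samples = []
    for _ in range(n_samples):
        cand = a + rng.gauss(0.0, step)
        if 0.0 <= cand <= 1.0:  # uniform prior on damage size: reject outside
            ll_cand = log_likelihood(cand, data)
            if math.log(rng.random()) < ll_cand - ll:
                a, ll = cand, ll_cand
        samples.append(a)
    return samples

# Synthetic sensor data generated at a "true" damage size of 0.3
rng = random.Random(0)
true_size = 0.3
data = [forward_strain(true_size) + rng.gauss(0.0, 0.05) for _ in range(20)]
post = metropolis(data)
est = sum(post[1000:]) / len(post[1000:])  # posterior mean after burn-in
```

    After burn-in, the posterior mean recovers the true damage size to within the sampling uncertainty; the posterior spread is exactly the uncertainty quantification the abstract refers to.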

  12. Design of highly damage-tolerant sandwich panels

    NASA Astrophysics Data System (ADS)

    Hiel, Clement; Ishai, Ori

    The effects of different fabrication procedures on the damage tolerance of sandwich panels were studied. Baseline panels consisted of a 25.4 mm premolded core, surfaced with 177 C cure film adhesive and carbon-bismaleimide prepreg, which were subsequently cocured onto the core. It was found that panels with a prefabricated skin, subsequently bonded onto the core with a room temperature cure adhesive, showed greatly increased damage tolerance.

  13. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.

  14. Some Examples of the Relations Between Processing and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2012-01-01

    Most structures made of laminated polymer matrix composites (PMCs) must be designed to some damage tolerance requirement that includes foreign object impact damage. Thus, from the beginning of a part's life, impact damage is assumed to exist in the material, and the part is designed to carry the required load with the prescribed impact damage present. By doing this, some processing defects may automatically be accounted for in the design allowable already reduced for these impacts. This paper presents examples of how a given level of impact damage and certain processing defects affect the compression strength of a laminate that contains both. Knowledge of the impact damage tolerance requirements before processing begins can broaden material options and processing techniques, since the structure is not being designed to pristine properties.

  15. Damage Tolerant Microstructures for Shock Environments

    NASA Astrophysics Data System (ADS)

    Cerreta, Ellen; Dennis-Koller, Darcie; Escobedo, Juan Pablo; Fensin, Saryu; Valone, Steve; Trujillo, Carl; Bronkhorst, Curt; Lebensohn, Ricardo

    While dynamic failure due to shock loading has been studied for many years, our current ability to predict and simulate evolving damage during dynamic loading remains limited. One reason for this is the lack of understanding of the linkages between process-induced as well as evolved microstructure and damage. To this end, the role of microstructure in the early stages of dynamic damage has been studied in high purity Ta and Cu. This work, which utilizes plate-impact experiments to interrogate these effects, has recently been extended to a subset of Cu-alloys (Cu-Pb, Cu-Nb, and Cu-Ag). These multi-length-scale studies have identified a number of linkages between damage nucleation and growth and microstructural features such as grain boundary type, grain boundary orientation with respect to the loading direction, grain orientation, and bi-metal interfaces. A combination of modeling and simulation techniques along with experimental observation has been utilized to examine the mechanisms of ductile damage processes such as nucleation, growth, and coalescence. This work has identified differing features of importance for damage nucleation in high purity and alloyed materials, lending insight into features of concern for mitigating shock-induced damage in more complicated alloy systems.

  16. Damage tolerance and structural monitoring for wind turbine blades

    PubMed Central

    McGugan, M.; Pereira, G.; Sørensen, B. F.; Toftegaard, H.; Branner, K.

    2015-01-01

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind. PMID:25583858

  17. Damage tolerance and structural monitoring for wind turbine blades.

    PubMed

    McGugan, M; Pereira, G; Sørensen, B F; Toftegaard, H; Branner, K

    2015-02-28

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind.

  18. Probabilistic Model for Laser Damage to the Human Retina

    DTIC Science & Technology

    2012-03-01

    the beam. Power density may be measured in radiant exposure, J/cm2, or by irradiance, W/cm2. In the experimental database used in this study and...to quantify a binary response, either lethal or non-lethal, within a population such as insects or rats. In directed energy research, probit...value of the normalized Arrhenius damage integral. In a one-dimensional simulation, the source term is determined as a spatially averaged irradiance (W
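
    The snippet above references the normalized Arrhenius damage integral, Omega(t_end) = ∫ A·exp(-Ea/(R·T(t))) dt, with thermal damage conventionally predicted once Omega reaches 1. A minimal numerical sketch follows; the rate constants A and Ea are illustrative burn-model-style values, not the fitted constants from this report.

```python
import math

# Normalized Arrhenius damage integral, evaluated with the midpoint rule.
# T_of_t is the tissue temperature history in kelvin; damage is predicted
# once the returned Omega reaches 1. A (1/s) and Ea (J/mol) are illustrative.
def arrhenius_damage(T_of_t, t_end, A=3.1e98, Ea=6.28e5, R=8.314, n=10000):
    dt = t_end / n
    omega = 0.0
    for i in range(n):
        T = T_of_t((i + 0.5) * dt)  # midpoint of each time step
        omega += A * math.exp(-Ea / (R * T)) * dt
    return omega

omega_hot = arrhenius_damage(lambda t: 343.0, 1.0)   # 70 C held for 1 s
omega_warm = arrhenius_damage(lambda t: 310.0, 1.0)  # 37 C held for 1 s
```

    The strong temperature sensitivity of the exponential is the point: a one-second exposure at 70 C drives the integral past the damage threshold, while at body temperature it remains many orders of magnitude below it.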

  19. Design Manual for Impact Damage Tolerant Aircraft Structure

    DTIC Science & Technology

    1981-10-01

    amenable to statistical analysis. Figure 2-9 shows typical small arms projectile damage measurements in a notch-sensitive high-strength aluminum alloy...impacts by small arms projectiles, missile warhead fragments, and the fragmentation and blast effects of high-explosive projectiles. The responses...Effect of Several Parameters on Gunfire Damage of Metal Structure. Since damage tolerance also depends on material properties, material selection is an

  20. Damage Tolerance Issues as Related to Metallic Rotorcraft Dynamic Components

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Elber, W.

    2005-01-01

    In this paper, issues related to the use of damage tolerance in the life management of rotorcraft dynamic components are reviewed. In the past, rotorcraft fatigue design has combined constant amplitude tests of full-scale parts with flight loads and usage data in a conservative manner to provide "safe life" component replacement times. In contrast to the safe life approach, over the past twenty years the United States Air Force and several other NATO nations have used damage tolerance design philosophies for fixed wing aircraft to improve safety and reliability. The reliability of the safe life approach used in rotorcraft began to be questioned shortly after presentations at an American Helicopter Society specialists' meeting in 1980 showed predicted fatigue lives for a hypothetical pitch-link problem to vary from a low of 9 hours to a high in excess of 2594 hours, with serious cost, weight, and reliability implications. Somewhat after the U.S. Army introduced its six-nines reliability requirement on fatigue life, attention shifted toward a possible damage tolerance approach to the life management of rotorcraft dynamic components, which is the subject of this paper. This review starts with past studies on applying damage tolerance life management to existing helicopter parts that were designed to safe life criteria. Also covered is a successful attempt at certifying, by damage tolerance, a tail rotor pitch rod that had been designed using the safe life approach. The FAA review of rotorcraft fatigue design and its recommendations, along with ongoing U.S. industry research in damage tolerance for rotorcraft, are reviewed. Finally, possible problems and future needs for research are highlighted.

  1. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most damage models applied in standard practice have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. This presentation shows approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discusses their potential and limitations. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K.; Kreibich, H.; Vogel, K.; Riggelsen, C.; Scherbaum, F.; Merz, B. (2014): How useful are complex flood damage models? Water Resources Research, 50(4), 3378-3395.
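
    A stage-damage function of the kind the abstract contrasts with probabilistic, multi-variate models can be sketched in a few lines, together with a minimal Monte Carlo variant that treats the curve's slope as uncertain. The curve shape and the scatter distribution below are assumptions for illustration only, not a calibrated model.

```python
import random
import statistics

# Deterministic stage-damage function: damage ratio as a function of water
# depth in metres (illustrative piecewise-linear curve, capped at total loss).
def stage_damage(depth_m):
    return min(1.0, max(0.0, 0.25 * depth_m))

# Minimal probabilistic counterpart: treat the curve's slope as uncertain and
# propagate that uncertainty by Monte Carlo (assumed Gaussian scatter).
def probabilistic_damage(depth_m, n=2000, seed=4):
    rng = random.Random(seed)
    draws = [min(1.0, max(0.0, rng.gauss(0.25, 0.05) * depth_m))
             for _ in range(n)]
    return statistics.mean(draws), statistics.stdev(draws)

mean_d, sd_d = probabilistic_damage(2.0)  # 2 m inundation depth
```

    Instead of the single number the deterministic curve returns, the probabilistic version yields a distribution (here summarized by mean and standard deviation), which is what makes uncertainty statements about estimated losses possible.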

  2. On the enhancement of impact damage tolerance of composite laminates

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Lance, D. G.

    1993-01-01

    This paper examines the use of a thin layer of Ultra High Molecular Weight Polyethylene (UHMWPE) on the outer surface of carbon/epoxy composite materials as a method of improving impact resistance and damage tolerance through hybridization. Flat 16-ply laminates as well as honeycomb sandwich structures with eight-ply facesheets were tested in this study. Instrumented drop-weight impact testing was used to inflict damage upon the specimens. Evaluation of damage resistance included instrumented impact data, visual examination, C-scanning and compression after impact (CAI) testing. The results show that only one lamina of UHMWPE did not improve the damage tolerance (strength retention) of the 16-ply flat laminate specimens or the honeycomb sandwich beams, however, a modest gain in impact resistance (detectable damage) was found for the honeycomb sandwich specimens that contained an outer layer of UHMWPE.

  3. Rapid Damage eXplorer (RDX): A Probabilistic Framework for Learning Changes From Bitemporal Images

    SciTech Connect

    Vatsavai, Raju

    2012-01-01

    The recent decade has witnessed major changes on the Earth, for example, deforestation, varying cropping and human settlement patterns, and crippling damage due to disasters. Accurate assessment of the damage caused by major natural and anthropogenic disasters is becoming critical due to increases in human and economic losses. This increase in loss of life and severe damage can be attributed to the growing population, as well as human migration to the disaster-prone regions of the world. Rapid assessment of these changes and dissemination of accurate information are critical for creating an effective emergency response. Change detection using high-resolution satellite images is a primary tool in assessing damage, monitoring biomass and critical infrastructure, and identifying new settlements. In this demo, we present a novel supervised probabilistic framework for identifying changes using very high-resolution, multispectral, bitemporal remote sensing images. Our demo shows that the Rapid Damage eXplorer (RDX) system is resilient to registration errors and differing sensor characteristics.
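
    A minimal per-pixel probabilistic change score for a bitemporal image pair might look as follows. The Gaussian no-change model, the uniform change model, and the prior are illustrative assumptions for the sketch, not the RDX system's actual supervised classifier.

```python
import math

# Per-pixel posterior probability of change between two co-registered images.
# No-change differences are modelled as zero-mean Gaussian; changed pixels as
# uniform over 256 intensity levels (both modelling assumptions).
def change_probability(before, after, sigma=10.0, prior_change=0.1):
    p_map = []
    for b_row, a_row in zip(before, after):
        row = []
        for b, a in zip(b_row, a_row):
            d = a - b
            l_nochange = (math.exp(-d * d / (2 * sigma ** 2))
                          / (sigma * math.sqrt(2 * math.pi)))
            l_change = 1.0 / 256.0
            num = prior_change * l_change  # Bayes' rule, two hypotheses
            row.append(num / (num + (1 - prior_change) * l_nochange))
        p_map.append(row)
    return p_map

# Tiny 1x2 example: first pixel nearly unchanged, second strongly changed
p = change_probability([[100, 100]], [[102, 200]])
```

    A small intensity difference yields a low change probability, while a large one drives the posterior toward certainty; a real system would learn the class likelihoods from training data rather than assume them.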

  4. Ontogenetic contingency of tolerance mechanisms in response to apical damage

    PubMed Central

    Gruntman, Michal; Novoplansky, Ariel

    2011-01-01

    Background and Aims Plants are able to tolerate tissue loss through vigorous branching which is often triggered by release from apical dominance and activation of lateral meristems. However, damage-induced branching might not be a mere physiological outcome of released apical dominance, but an adaptive response to environmental signals, such as damage timing and intensity. Here, branching responses to both factors were examined in the annual plant Medicago truncatula. Methods Branching patterns and allocation to reproductive traits were examined in response to variable clipping intensities and timings in M. truncatula plants from two populations that vary in the onset of reproduction. Phenotypic selection analysis was used to evaluate the strength and direction of selection on branching under the damage treatments. Key Results Plants of both populations exhibited an ontogenetic shift in tolerance mechanisms: while early damage induced greater meristem activation, late damage elicited investment in late-determined traits, including mean pod and seed biomass, and supported greater germination rates. Severe damage mostly elicited simultaneous development of multiple-order lateral branches, but this response was limited to early damage. Selection analyses revealed positive directional selection on branching in plants under early- compared with late- or no-damage treatments. Conclusions The results demonstrate that damage-induced meristem activation is an adaptive response that could be modified according to the plant's developmental stage, severity of tissue loss and their interaction, stressing the importance of considering these effects when studying plastic responses to apical damage. PMID:21873259

  5. Mechanical Data for Use in Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; James, Mark A.; Newman, John A.; Everett, Richard A., Jr.; Johnston, William M., Jr.

    2004-01-01

    This report describes the results of a research program to determine the damage tolerance properties of metallic propeller materials. Three alloys were selected for investigation: 2025-T6 Aluminum, D6AC Steel and 4340 Steel. Mechanical response, fatigue (S-N) and fatigue crack growth rate data are presented for all of the alloys. The main conclusions that can be drawn from this study are as follows. The damage tolerant design of a propeller system will require a complete understanding of the fatigue crack growth threshold. There exists no experimental procedure to reliably develop the fatigue crack growth threshold data that is needed for damage tolerant design methods. Significant research will be required to fully understand the fatigue crack growth threshold. The development of alternative precracking methods, evaluating the effect of specimen configuration and attempting to identify micromechanical issues are simply the first steps to understanding the mechanics of the threshold.

  6. Damage Tolerance Characteristics of Composite Sandwich Structures

    DTIC Science & Technology

    2000-02-01

    and very simplified modelling Unit of the EH-101 helicopter is made of a composite skeleton of the damage introduced by impact; second, the evaluation...the delamination boundary. The Multi Point Constraint If the delamination growth data from the Teflon strip element of NASTRAN is used for modelling ... component level. These kinds of tests are composite sandwich structures used by the helicopter industry, carried out not only to verify load paths and

  7. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.
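
    The probabilistic treatment of the IM tolerance can be illustrated with a Monte Carlo estimate of the probability that the realized spacing deviates from the assigned interval by more than the tolerance. The two Gaussian error components and their magnitudes below are hypothetical stand-ins, not the paper's formal model of the tolerance components.

```python
import random

# Probability that the realized spacing interval deviates from the assigned
# interval by more than the IM tolerance, estimated by Monte Carlo.
# Error components (in seconds) are assumed Gaussian and independent.
def prob_tolerance_violation(tolerance_s=10.0, n=100_000, seed=7):
    rng = random.Random(seed)
    violations = 0
    for _ in range(n):
        speed_err = rng.gauss(0.0, 4.0)  # flight-deck speed-tracking error
        wind_err = rng.gauss(0.0, 3.0)   # wind-forecast contribution
        if abs(speed_err + wind_err) > tolerance_s:
            violations += 1
    return violations / n

p_violation = prob_tolerance_violation()
```

    With these assumed components the combined deviation is Gaussian with a 5 s standard deviation, so a 10 s tolerance is a two-sigma bound and the violation probability comes out near 0.045; the same machinery supports the kind of modular, component-by-component safety claims the paper formalizes.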

  8. An Experimental Investigation of Damage Resistances and Damage Tolerance of Composite Materials

    NASA Technical Reports Server (NTRS)

    Prabhakaran, R.

    2003-01-01

    The project included three lines of investigation, aimed at a better understanding of the damage resistance and damage tolerance of pultruded composites. The three lines of investigation were: (i) measurement of permanent dent depth after transverse indentation at different load levels, and correlation with other damage parameters such as damage area (from x-radiography) and back surface crack length, (ii) estimation of point stress and average stress characteristic dimensions corresponding to measured damage parameters, and (iii) an attempt to measure the damage area by a reflection photoelastic technique. All the three lines of investigation were pursued.

  9. Damage tolerant composite wing panels for transport aircraft

    NASA Technical Reports Server (NTRS)

    Smith, Peter J.; Wilson, Robert D.; Gibbins, M. N.

    1985-01-01

    Commercial aircraft advanced composite wing surface panels were tested for durability and damage tolerance. The wing of a fuel-efficient, 200-passenger airplane for 1990 delivery was sized using graphite-epoxy materials. The damage tolerance program was structured to allow a systematic progression from material evaluations to optimized large panel verification tests. The program included coupon testing to evaluate toughened material systems, static and fatigue tests of compression coupons with varying amounts of impact damage, element tests of three-stiffener panels to evaluate upper wing panel design concepts, and a study of the wing structure damage environment. A series of technology demonstration tests of large compression panels was performed, and a repair investigation was included in the final large panel test.

  10. Life assessment and damage tolerance of wind turbines

    NASA Astrophysics Data System (ADS)

    Wanhill, R. J. H.

    1983-11-01

    Safe and durable operation of fatigue critical structures in high technology windmills, including safe life assessment and the possible application of damage tolerance principles, is surveyed. A research program to assist safe and durable operation of windmill rotors in the Netherlands is reviewed.

  11. A Computationally-Efficient Inverse Approach to Probabilistic Strain-Based Damage Diagnosis

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.; Leser, William P.; Leser, Patrick E.; Newman, John A

    2016-01-01

    This work presents a computationally-efficient inverse approach to probabilistic damage diagnosis. Given strain data at a limited number of measurement locations, Bayesian inference and Markov Chain Monte Carlo (MCMC) sampling are used to estimate probability distributions of the unknown location, size, and orientation of damage. Substantial computational speedup is obtained by replacing a three-dimensional finite element (FE) model with an efficient surrogate model. The approach is experimentally validated on cracked test specimens where full field strains are determined using digital image correlation (DIC). Access to full field DIC data allows for testing of different hypothetical sensor arrangements, facilitating the study of strain-based diagnosis effectiveness as the distance between damage and measurement locations increases. The ability of the framework to effectively perform both probabilistic damage localization and characterization in cracked plates is demonstrated and the impact of measurement location on uncertainty in the predictions is shown. Furthermore, the analysis time to produce these predictions is orders of magnitude less than a baseline Bayesian approach with the FE method by utilizing surrogate modeling and effective numerical sampling approaches.
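
    The surrogate-model idea, replacing an expensive forward solve with a cheap precomputed approximation so that each MCMC sample is inexpensive, can be sketched as follows. The stand-in "expensive" function, the grid resolution, and the linear interpolation scheme are illustrative; the paper itself builds a surrogate for a three-dimensional finite element model.

```python
import bisect
import math

# Stand-in for a costly forward model (imagine an FE strain solution as a
# function of crack length a in [0, 1]).
def expensive_model(a):
    return math.sin(2.0 * a) + 0.1 * a ** 2

# Offline phase: evaluate the expensive model once on a coarse grid.
grid = [i / 20 for i in range(21)]
values = [expensive_model(a) for a in grid]

# Online phase: each sampler evaluation costs only a table lookup plus
# linear interpolation, not a full forward solve.
def surrogate(a):
    i = min(max(bisect.bisect_right(grid, a) - 1, 0), len(grid) - 2)
    t = (a - grid[i]) / (grid[i + 1] - grid[i])
    return values[i] + t * (values[i + 1] - values[i])

# Worst-case surrogate error over a fine test grid
max_err = max(abs(surrogate(x / 200) - expensive_model(x / 200))
              for x in range(201))
```

    For a smooth forward model even this crude piecewise-linear surrogate is accurate to well under a percent of the response, which is why replacing the FE solve inside the sampling loop yields the orders-of-magnitude speedup reported above.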

  12. Heat tolerance of higher plants cenosis to damaging air temperatures

    NASA Astrophysics Data System (ADS)

    Ushakova, Sofya; Shklavtsova, Ekaterina

    In designing sustained biological-technical life support systems (BTLSS) that include higher plants as part of a photosynthesizing unit, it is important to foresee the reaction of the multi-species cenosis to various stress factors. One such factor is a change of air temperature in the BTLSS (because of failure of a thermoregulation system) up to values leading to irreversible damage of photosynthetic processes. However, it is possible to increase, within certain limits, the tolerance of the plant cenosis to unfavorable temperatures through the choice of higher plants possessing resistance both to elevated and to lowered air temperatures. In addition, the heat tolerance of the plants can be increased by subjecting them during growth to hardening temperatures. We conclude that it is possible to increase the heat tolerance of a multi-species cenosis under the damaging effect of an air temperature of 45 °C.

  13. Homologous recombination maintenance of genome integrity during DNA damage tolerance

    PubMed Central

    Prado, Félix

    2014-01-01

    The DNA strand exchange protein Rad51 provides a safe mechanism for the repair of DNA breaks using the information of a homologous DNA template. Homologous recombination (HR) also plays a key role in the response to DNA damage that impairs the advance of the replication forks by providing mechanisms to circumvent the lesion and fill in the tracks of single-stranded DNA that are generated during the process of lesion bypass. These activities postpone repair of the blocking lesion to ensure that DNA replication is completed in a timely manner. Experimental evidence generated over the last few years indicates that HR participates in this DNA damage tolerance response together with additional error-free (template switch) and error-prone (translesion synthesis) mechanisms through intricate connections, which are presented here. The choice between repair and tolerance, and the mechanism of tolerance, is critical to avoid increased mutagenesis and/or genome rearrangements, which are both hallmarks of cancer. PMID:27308329

  14. Optimization of Aerospace Structure Subject to Damage Tolerance Criteria

    NASA Technical Reports Server (NTRS)

    Akgun, Mehmet A.

    1999-01-01

    The objective of this cooperative agreement was to seek computationally efficient ways to optimize aerospace structures subject to damage tolerance criteria. Optimization was to involve sizing as well as topology optimization. The work was done in collaboration with Steve Scotti, Chauncey Wu and Joanne Walsh at the NASA Langley Research Center. Computation of constraint sensitivity is normally the most time-consuming step of an optimization procedure. The cooperative work first focused on this issue and implemented the adjoint method of sensitivity computation in an optimization code (runstream) written in Engineering Analysis Language (EAL). The method was implemented both for bar and plate elements, including buckling sensitivity for the latter. Lumping of constraints was investigated as a means to reduce the computational cost. Adjoint sensitivity computation was developed and implemented for lumped stress and buckling constraints. The cost of the direct method and the adjoint method was compared for various structures with and without lumping. The results were reported in two papers. It is desirable to optimize the topology of an aerospace structure subject to a large number of damage scenarios so that a damage tolerant structure is obtained. Including damage scenarios in the design procedure is critical in order to avoid large mass penalties at later stages. A common method for topology optimization is that of compliance minimization, which has not been used for damage tolerant design. In the present work, topology optimization is treated as a conventional problem aiming to minimize the weight subject to stress constraints. Multiple damage configurations (scenarios) are considered. Each configuration has its own structural stiffness matrix and, normally, requires factoring of the matrix and solution of the system of equations. Damage that is expected to be tolerated is local and represents a small change in the stiffness matrix compared to the baseline (undamaged) structure.
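
    Because each damage scenario is described as a local, small change to the baseline stiffness matrix, a rank-one update formula such as Sherman-Morrison can reuse baseline solves instead of refactoring the matrix for every scenario. The sketch below is a generic illustration of that idea, not the EAL implementation from the agreement; the tiny 3-DOF system and damage vector are made up.

```python
# Solve (K - u uᵀ) x = f for a damaged scenario using only solves against the
# baseline stiffness K (Sherman-Morrison), avoiding a new factorization.

def solve(K, f):
    # Gaussian elimination with partial pivoting (small dense systems only)
    n = len(K)
    A = [row[:] + [f[i]] for i, row in enumerate(K)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, n):
            m = A[r][c] / A[c][c]
            for k in range(c, n + 1):
                A[r][k] -= m * A[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][k] * x[k] for k in range(r + 1, n))) / A[r][r]
    return x

def damaged_solve(K, u, f):
    # Sherman-Morrison: (K - u uᵀ)⁻¹ f = y + (uᵀy / (1 - uᵀz)) z,
    # where y = K⁻¹ f and z = K⁻¹ u reuse the baseline operator.
    y, z = solve(K, f), solve(K, u)
    ut_y = sum(ui * yi for ui, yi in zip(u, y))
    ut_z = sum(ui * zi for ui, zi in zip(u, z))
    alpha = ut_y / (1.0 - ut_z)
    return [yi + alpha * zi for yi, zi in zip(y, z)]

K = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]  # baseline stiffness
u = [0.5, 0.0, 0.0]                                       # local damage mode
f = [1.0, 2.0, 3.0]                                       # load vector
x = damaged_solve(K, u, f)
```

    With a factored baseline K, each extra damage scenario costs only back-substitutions, which is exactly the saving sought when many damage configurations must be analyzed inside an optimization loop.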

  15. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    DOE PAGES

    Chen, Y.; Yu, K. Y.; Liu, Y.; ...

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  16. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Yu, K. Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-04-01

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  17. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the materials behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
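
    Propagating scatter in primitive variables up to a structural response, the core of the method described above, can be sketched with a plain Monte Carlo loop. The cantilever deflection formula, the distributions, and the deflection limit below are illustrative assumptions, not the paper's turbine-engine models.

```python
import random

# Structural response as a function of primitive variables: tip deflection of
# a cantilever, delta = P L^3 / (3 E I), with load P and modulus E uncertain.
def deflection(P, E, L=1.0, I=1e-6):
    return P * L ** 3 / (3.0 * E * I)

# Propagate the primitive-variable scatter to a probability of exceeding a
# deflection limit (assumed Gaussian scatters, illustrative magnitudes).
def failure_probability(limit=0.032, n=50_000, seed=3):
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        E = rng.gauss(70e9, 3.5e9)   # Pa, ~5% scatter in modulus
        P = rng.gauss(6000.0, 600.0)  # N, ~10% scatter in load
        if deflection(P, E) > limit:
            fails += 1
    return fails / n

p_fail = failure_probability()
```

    The same loop, run at several limit values, traces out the cumulative distribution function of the response; sensitivities follow by perturbing one primitive variable's scatter at a time and observing the shift in probability.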

  18. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1998-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the materials behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from these methods demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.

  19. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    PubMed Central

    Chen, Y.; Yu, K Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-01-01

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials. PMID:25906997

  20. Development of the Damage Tolerance Criteria for an Aging Fleet

    DTIC Science & Technology

    2014-10-20


  1. Multiaxial and Thermomechanical Fatigue Considerations in Damage Tolerant Design.

    DTIC Science & Technology

    1985-01-01

    application to disk materials. In figure 5, similar multiple cracking patterns, observed early in the life of Inconel 718, are shown in replicas taken... of this program is the incorporation of damage-tolerant concepts in the engine design, combined with mission-oriented testing directed toward the

  2. Fatigue Crack Growth Database for Damage Tolerance Analysis

    NASA Technical Reports Server (NTRS)

    Forman, R. G.; Shivakumar, V.; Cardinal, J. W.; Williams, L. C.; McKeighan, P. C.

    2005-01-01

    The objective of this project was to begin the process of developing a fatigue crack growth database (FCGD) of metallic materials for use in damage tolerance analysis of aircraft structure. For this initial effort, crack growth rate data in the NASGRO (Registered trademark) database, the United States Air Force Damage Tolerant Design Handbook, and other publicly available sources were examined and used to develop a database that characterizes crack growth behavior for specific applications (materials). The focus of this effort was on materials for general commercial aircraft applications, including large transport airplanes, small transport commuter airplanes, general aviation airplanes, and rotorcraft. The end products of this project are the FCGD software and this report. The specific goal of this effort was to present fatigue crack growth data in three usable formats: (1) NASGRO equation parameters, (2) Walker equation parameters, and (3) tabular data points. The development of this FCGD will begin the process of developing a consistent set of standard fatigue crack growth material properties. It is envisioned that the end product of the process will be a general repository for credible and well-documented fracture properties that may be used as a default standard in damage tolerance analyses.
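As a sketch of the second of the three formats above, the Walker equation relates crack growth rate to stress-intensity range and stress ratio; the parameter values below are illustrative placeholders, not entries from the FCGD.

```python
def walker_dadn(delta_k, r, c, n, gamma):
    """Walker-equation fatigue crack growth rate:
        da/dN = c * (delta_k / (1 - r)**(1 - gamma))**n
    where delta_k is the stress-intensity range and r the stress ratio."""
    return c * (delta_k / (1.0 - r) ** (1.0 - gamma)) ** n

# Illustrative (made-up) parameters for a metallic alloy:
c, n, gamma = 1.0e-11, 3.2, 0.5
rate_r0 = walker_dadn(delta_k=15.0, r=0.0, c=c, n=n, gamma=gamma)
rate_r05 = walker_dadn(delta_k=15.0, r=0.5, c=c, n=n, gamma=gamma)
# For gamma < 1, a higher stress ratio raises the effective delta_k
# and hence the growth rate.
```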

  3. Elastic properties, strength and damage tolerance of pultruded composites

    NASA Astrophysics Data System (ADS)

    Saha, Mrinal Chandra

    Pultruded composites are candidate materials for civil engineering infrastructural applications due to their higher corrosion resistance and lower life cycle cost. Efficient use of such materials in structural members requires a thorough understanding of the mechanisms that affect their response. The present investigation addresses the modeling and characterization of E-glass fiber/polyester resin matrix pultruded composites in the form of sheets of various thicknesses. The elastic constants were measured using static, vibration, and ultrasonic methods. Two types of piezoelectric crystals were used in ultrasonic measurements. Finally, the feasibility of using a single specimen, in the form of a circular disk, was shown in measuring all the elastic constants using the ultrasonic technique. The effects of stress gradient on tensile strength were investigated. A large number of specimens, parallel and transverse to the pultrusion direction, were tested in tension, 3-point flexure, and 4-point flexure. A 2-parameter Weibull model was applied to predict the tensile strength from the flexure tests. The measured and Weibull-predicted ratios did not show consistent agreement. Microstructural observations suggested that the flaw distribution in the material was not uniform, although uniformity appears to be a basic requirement for the Weibull distribution. Compressive properties were measured using a short-block compression test specimen 44.4 mm long and 25.4 mm wide. Specimens were tested at 0°, 30°, 45°, 60° and 90° orientations. The compression test specimen was modeled using 4-noded isoparametric layered plate and shell elements. The predicted elastic properties for the roving layer and the continuous strand mat layer were used for the finite element study. The damage resistance and damage tolerance were investigated experimentally. Using a quasi-static indentation loading, damage was induced at various incrementally increased force levels to investigate the damage growth process. Damage
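The 2-parameter Weibull prediction mentioned above rests on a weakest-link size/stress-field effect. A minimal sketch, assuming equal specimen volumes and the classical 3-point-flexure result (the modulus value used is hypothetical):

```python
def flexure_to_tension_ratio(m):
    """Classical weakest-link prediction for the ratio of 3-point
    flexural strength to tensile strength of equal-volume specimens
    under a 2-parameter Weibull model with modulus m:
        sigma_f / sigma_t = (2 * (m + 1)**2)**(1 / m)"""
    return (2.0 * (m + 1.0) ** 2) ** (1.0 / m)

ratio = flexure_to_tension_ratio(10.0)  # m = 10, a typical modulus
# A non-uniform flaw distribution violates the model's assumptions
# and degrades the quality of this prediction.
```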

  4. Quantifying grain boundary damage tolerance with atomistic simulations

    NASA Astrophysics Data System (ADS)

    Foley, Daniel; Tucker, Garritt J.

    2016-10-01

    Grain boundaries play a pivotal role in defect evolution and accommodation within materials. Irradiated metals have been observed to form defect denuded zones in the vicinity of grain boundaries. This is especially apparent in nanocrystalline metals, which have an increased grain boundary concentration, as compared to their polycrystalline counterparts. Importantly, the effect of individual grain boundaries on microstructural damage tolerance is related to the character or structural state of the grain boundary. In this work, the damage accommodation behavior of a variety of copper grain boundaries is studied using atomistic simulations. Damage accumulation behavior is found to reach a saturation point where both the free volume and energy of a grain boundary fluctuate within an elliptical manifold, which varies in size for different boundary characters. Analysis of the grain boundaries shows that extrinsic damage accommodation occurs due to localized atomic shuffling accompanied by free volume rearrangement within the boundary. Continuous damage accumulation leads to altered atomic structural states that oscillate around a mean non-equilibrium state that is energetically metastable. Our results suggest that variation of grain boundary behavior, both from equilibrium and under saturation, is directly related to grain boundary equilibrium energy, and that some boundaries have a greater propensity to continually accommodate damage than others.

  5. Damage tolerance of a composite sandwich with interleaved foam core

    NASA Technical Reports Server (NTRS)

    Ishai, Ori; Hiel, Clement

    1992-01-01

    A composite sandwich panel consisting of carbon fiber-reinforced plastic (CFRP) skins and a syntactic foam core was selected as an appropriate structural concept for the design of wind tunnel compressor blades. Interleaving of the core with tough interlayers was done to prevent core cracking and to improve damage tolerance of the sandwich. Simply supported sandwich beam specimens were subjected to low-velocity drop-weight impacts as well as high-velocity ballistic impacts. The performance of the interleaved core sandwich panels was characterized by localized skin damage and minor cracking of the core. Residual compressive strength (RCS) of the skin, which was derived from flexural tests, shows the expected trend of decreasing with increasing size of the damage, impact energy, and velocity. In the case of skin damage, RCS values of around 50 percent of the virgin interleaved reference were obtained at the upper impact energy range. Based on the similarity between low-velocity and ballistic-impact effects, it was concluded that impact energy is the main variable controlling damage and residual strength, whereas velocity plays a minor role.

  6. Review of the Oconee-3 probabilistic risk assessment: external events, core damage frequency. Volume 2

    SciTech Connect

    Hanan, N.A.; Ilberg, D.; Xue, D.; Youngblood, R.; Reed, J.W.; McCann, M.; Talwani, T.; Wreathall, J.; Kurth, P.D.; Bandyopadhyay, K.

    1986-03-01

    A review of the Oconee-3 Probabilistic Risk Assessment (OPRA) was conducted with the broad objective of evaluating qualitatively and quantitatively (as much as possible) the OPRA assessment of the important sequences that are "externally" generated and lead to core damage. The review included a technical assessment of the assumptions and methods used in the OPRA within its stated objective and with the limited information available. Within this scope, BNL performed a detailed reevaluation of the accident sequences generated by internal floods and earthquakes and a less detailed review (in some cases a scoping review) for the accident sequences generated by fires, tornadoes, external floods, and aircraft impact. 12 refs., 24 figs., 31 tabs.

  7. Microplasticity and fatigue in a damage tolerant niobium aluminide intermetallic

    SciTech Connect

    Soboyejo, W.O.; DiPasquale, J.; Srivatsan, T.S.; Konitzer, D.

    1997-12-31

    In this paper, the micromechanisms of microplasticity and fatigue are elucidated for a damage tolerant niobium aluminide intermetallic deformed to failure under both monotonic and cyclic loading. Localized microplasticity is shown to occur by the formation of slip bands at stresses as low as 9% of the bulk yield stress. Formation and presence of slip bands is also observed upon application of the first cycle of fatigue load. The deformation and cracking phenomena are discussed in light of classical fatigue crack initiation and propagation models. The implications of microplasticity are elucidated for both fatigue crack initiation and crack growth.

  8. Damage-Tolerant Fan Casings for Jet Engines

    NASA Technical Reports Server (NTRS)

    2006-01-01

    All turbofan engines work on the same principle. A large fan at the front of the engine draws air in. A portion of the air enters the compressor, but a greater portion passes on the outside of the engine; this is called bypass air. The air that enters the compressor then passes through several stages of rotating fan blades that compress the air further, and then it passes into the combustor. In the combustor, fuel is injected into the airstream, and the fuel-air mixture is ignited. The hot gases produced expand rapidly to the rear, and the engine reacts by moving forward. If there is a flaw in the system, such as an unexpected obstruction, a fan blade can break, spin off, and harm other engine components. Fan casings, therefore, need to be strong enough to contain errant blades and damage-tolerant enough to withstand the punishment of a loose blade-turned-projectile. NASA has spearheaded research into improving jet engine fan casings, ultimately discovering a cost-effective approach to manufacturing damage-tolerant fan cases that also boast significant weight reduction. In an aircraft, weight reduction translates directly into fuel burn savings, increased payload, and greater aircraft range. This technology increases safety and structural integrity; is an attractive, viable option for engine manufacturers because of the low-cost manufacturing; and is a practical alternative for customers, as it has the added cost-saving benefits of the weight reduction.

  9. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    SciTech Connect

    Chen, Y.; Yu, K. Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  10. The use of a non-probabilistic artificial neural network to consider uncertainties in vibration-based-damage detection

    NASA Astrophysics Data System (ADS)

    Padil, Khairul H.; Bakhary, Norhisham; Hao, Hong

    2017-01-01

    The effectiveness of artificial neural networks (ANNs) when applied to pattern recognition in vibration-based damage detection has been demonstrated in many studies because they are capable of providing accurate results and the reliable identification of structural damage based on modal data. However, the use of ANNs has been questioned in terms of its reliability in the face of uncertainties in measurement and modeling data. Attempts to incorporate a probabilistic method into an ANN by treating the uncertainties as normally distributed random variables have delivered promising solutions to this problem, but the probabilistic method is less straightforward in practice because it is often not possible to obtain unbiased probabilistic distributions of the uncertainties. Moreover, the probabilistic ANN method is computationally complex, especially when generating output data. In this study, a non-probabilistic ANN is proposed to address the problem of uncertainty in vibration-based damage detection using ANNs. The input data for the network consist of natural frequencies and mode shapes, and the output is the Young's modulus (E values), which acts as an elemental stiffness parameter (ESP). Through the interval analysis method, the noise in measured frequencies and mode shapes is considered to be coupled rather than statistically distributed. This method calculates the interval bound (lower and upper bounds) of the ESP changes. The ANN is used to predict the output of this interval bound by considering the uncertainties in the input parameters. To establish the relationship between the input parameters and output parameters, a possibility of damage existence (PoDE) parameter is defined for the undamaged and damaged states. A stiffness reduction factor (SRF) is also used to represent changes in the stiffness parameter. A numerical model and a laboratory-tested steel portal frame demonstrate the efficacy of the method in improving the
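The interval-bound idea can be sketched as follows; this PoDE-style measure and the interval endpoints below are simplified assumptions for illustration and may differ from the cited formulation.

```python
def possibility_of_damage(undamaged, damaged):
    """Fraction of the damaged-state interval of the elemental
    stiffness parameter (ESP) lying below the lower bound of the
    undamaged interval. Intervals are (lower, upper) tuples."""
    u_lo, _u_hi = undamaged
    d_lo, d_hi = damaged
    width = d_hi - d_lo
    if width <= 0.0:
        # Degenerate interval: damage is certain only if it lies
        # entirely below the undamaged lower bound.
        return 1.0 if d_lo < u_lo else 0.0
    below = max(0.0, min(u_lo, d_hi) - d_lo)
    return below / width

# Undamaged ESP interval vs. one predicted under measurement noise:
pode = possibility_of_damage((0.95, 1.05), (0.80, 1.00))
```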

  11. Estimation of probability of failure for damage-tolerant aerospace structures

    NASA Astrophysics Data System (ADS)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates, maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem is collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools is lacking in several ways: the tools may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This
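The kind of model the abstract calls for can be caricatured with a small Monte Carlo sketch; the growth law, POD curve, and all numbers below are invented for illustration and are not the author's method.

```python
import math
import random

def simulate_fleet(n_structures=1_000, flights=3_000,
                   inspect_every=1_000, a_crit=25.0, seed=7):
    """Toy PDTA simulation: initial crack sizes grow geometrically per
    flight; periodic inspections detect cracks with a size-dependent
    probability of detection (POD) and trigger repair. Returns the
    failure probability and mean repairs per structure."""
    random.seed(seed)
    failures = repairs = 0
    for _ in range(n_structures):
        a = random.lognormvariate(-1.0, 0.5)  # initial crack size, mm
        g = random.gauss(1.2e-3, 3.0e-4)      # per-flight growth rate
        for f in range(1, flights + 1):
            a *= 1.0 + max(g, 0.0)
            if a >= a_crit:                   # crack reached critical size
                failures += 1
                break
            if f % inspect_every == 0:
                pod = 1.0 - math.exp(-a / 2.0)  # simple POD curve
                if random.random() < pod:
                    repairs += 1
                    a = random.lognormvariate(-1.0, 0.5)  # repaired part
    return failures / n_structures, repairs / n_structures

pf, repairs_per_structure = simulate_fleet()
```

Incorporating observed evidence, as the abstract urges, would amount to conditioning these sampled crack histories on inspection outcomes rather than simulating them unconditionally.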

  12. Durability and Damage Tolerance of High Temperature Polymeric Composites

    NASA Technical Reports Server (NTRS)

    Case, Scott W.; Reifsnider, Kenneth L.

    1996-01-01

    Modern durability and damage tolerance predictions for composite material systems rely on accurate estimates of the local stress and material states for each of the constituents, as well as the manner in which the constituents interact. In this work, a number of approaches to estimating the stress states and interactions are developed. First, an elasticity solution is presented for the problem of a penny-shaped crack in an N-phase composite material system opened by a prescribed normal pressure. The stress state around such a crack is then used to estimate the stress concentrations due to adjacent fiber fractures in composite materials. The resulting stress concentrations are then used to estimate the tensile strength of the composite. The predicted results are compared with experimental values. In addition, a cumulative damage model for fatigue is presented. Modifications to the model are made to include the effects of variable amplitude loading. These modifications are based upon the use of remaining strength as a damage metric and the definition of an equivalent generalized time. The model is initially validated using results from the literature. Also, experimental data from APC-2 laminates and IM7/K3B laminates are used in the model. The use of such data for notched laminates requires the use of an effective hole size, which is calculated based upon strain distribution measurements. Measured remaining strengths after fatigue loading are compared with the predicted values for specimens fatigued at room temperature and 350 °F (177 °C).

  13. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    O'Brien, T. Kevin

    1988-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation, as well as adequately fail-safe or damage tolerant for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: either analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low velocity impact.
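The strain energy release rate characterization used for delamination onset has a widely used closed form for edge delamination under uniaxial strain, G = (eps^2 * t / 2) * (E_lam - E*); the numerical values below are illustrative only, not data from the cited work.

```python
def delamination_onset_G(strain, thickness, E_lam, E_star):
    """Closed-form strain energy release rate for edge delamination
    under uniaxial strain:
        G = (strain**2 * thickness / 2) * (E_lam - E_star)
    where E_lam is the undelaminated laminate modulus and E_star the
    modulus after complete delamination. Consistent units are assumed
    (strain dimensionless, thickness in m, moduli in Pa -> G in J/m^2)."""
    return 0.5 * strain ** 2 * thickness * (E_lam - E_star)

# Illustrative (made-up) values for a graphite/epoxy laminate:
G = delamination_onset_G(strain=0.004, thickness=1.5e-3,
                         E_lam=55e9, E_star=48e9)
```

Delamination onset is then predicted by comparing G against a measured interlaminar toughness for the relevant mixed-mode ratio.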

  14. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    O'Brien, T. Kevin

    1990-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation, as well as adequately fail-safe or damage tolerant for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: either analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low velocity impact.

  15. Advanced information processing system - Status report. [for fault tolerant and damage tolerant data processing for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Lala, J.

    1986-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles. The AIPS architecture also has attributes to enhance system effectiveness such as graceful degradation, growth and change tolerance, integrability, etc. Two key building blocks being developed by the AIPS program are a fault and damage tolerant processor and communication network. A proof-of-concept system is now being built and will be tested to demonstrate the validity and performance of the AIPS concepts.

  16. Fuel containment and damage tolerance for large composite primary aircraft structures. Phase 1: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.

    1983-01-01

    Technical problems associated with fuel containment and damage tolerance of composite material wings for transport aircraft were identified. The major tasks are the following: (1) the preliminary design of damage tolerant wing surface using composite materials; (2) the evaluation of fuel sealing and lightning protection methods for a composite material wing; and (3) an experimental investigation of the damage tolerant characteristics of toughened resin graphite/epoxy materials. The test results, the test techniques, and the test data are presented.

  17. Intraspecific competition facilitates the evolution of tolerance to insect damage in the perennial plant Solanum carolinense.

    PubMed

    McNutt, David W; Halpern, Stacey L; Barrows, Kahaili; Underwood, Nora

    2012-12-01

    Tolerance to herbivory (the degree to which plants maintain fitness after damage) is a key component of plant defense, so understanding how natural selection and evolutionary constraints act on tolerance traits is important to general theories of plant-herbivore interactions. These factors may be affected by plant competition, which often interacts with damage to influence trait expression and fitness. However, few studies have manipulated competitor density to examine the evolutionary effects of competition on tolerance. In this study, we tested whether intraspecific competition affects four aspects of the evolution of tolerance to herbivory in the perennial plant Solanum carolinense: phenotypic expression, expression of genetic variation, the adaptive value of tolerance, and costs of tolerance. We manipulated insect damage and intraspecific competition for clonal lines of S. carolinense in a greenhouse experiment, and measured tolerance in terms of sexual and asexual fitness components. Compared to plants growing at low density, plants growing at high density had greater expression of and genetic variation in tolerance, and experienced greater fitness benefits from tolerance when damaged. Tolerance was not costly for plants growing at either density, and only plants growing at low density benefited from tolerance when undamaged, perhaps due to greater intrinsic growth rates of more tolerant genotypes. These results suggest that competition is likely to facilitate the evolution of tolerance in S. carolinense, and perhaps in other plants that regularly experience competition, while spatio-temporal variation in density may maintain genetic variation in tolerance.

  18. The combined effect of glass buffer strips and stitching on the damage tolerance of composites

    NASA Technical Reports Server (NTRS)

    Kullerd, Susan M.

    1993-01-01

    Recent research has demonstrated that through-the-thickness stitching provides major improvements in the damage tolerance of composite laminates loaded in compression. However, the brittle nature of polymer matrix composites makes them susceptible to damage propagation, requiring special material applications and designs to limit damage growth. Glass buffer strips, embedded within laminates, have shown the potential for improving the damage tolerance of unstitched composite laminates loaded in tension. The glass buffer strips, less stiff than the surrounding carbon fibers, arrest crack growth in composites under tensile loads. The present study investigates the damage tolerance characteristics of laminates that contain both stitching and glass buffer strips.

  19. Design, testing, and damage tolerance study of bonded stiffened composite wing cover panels

    NASA Technical Reports Server (NTRS)

    Madan, Ram C.; Sutton, Jason O.

    1988-01-01

    Results are presented from the application of damage tolerance criteria for composite panels to multistringer composite wing cover panels developed under NASA's Composite Transport Wing Technology Development contract. This conceptual wing design integrated aeroelastic stiffness constraints with an enhanced damage tolerance material system, in order to yield optimized producibility and structural performance. Damage tolerance was demonstrated in a test program using full-sized cover panel subcomponents; panel skins were impacted at midbay between stiffeners, directly over a stiffener, and over the stiffener flange edge. None of the impacts produced visible damage. NASTRAN analyses were performed to simulate NDI-detected invisible damage.

  20. Ethical Implications of Probabilistic Event Attribution for Policy Discussions about Loss and Damage

    NASA Astrophysics Data System (ADS)

    Otto, F. E. L.; Thompson, A.

    2014-12-01

    Warming of the global climate system is unequivocal, predominantly due to rising greenhouse gas concentrations, with direct implications of rising mean global temperatures for some slow-onset events such as sea level rise, which can therefore be linked directly to past emissions. In many regions, however, extreme weather events, like heatwaves, floods, and droughts, are associated with greater loss and damage. An increase in average temperatures will lead to an increase in the frequency or magnitude of some extreme weather events, including heat waves and droughts. For example, the deaths of at least thirty-five thousand people in Europe are attributable to the record-breaking heat wave of 2003. Extreme heat events and subsequent droughts can be directly linked to the loss of human life as well as damage to, or the significant diminishment of, economic productivity. Two points are crucial here. First, the science of attributing slow-onset phenomena, such as higher mean temperatures or rising sea levels, to greenhouse gas emissions and other anthropogenic climatic forcings is different from the science of attributing particular extreme weather events, such as heat waves and extreme precipitation, to anthropogenic global climate change. The latter requires a different statistical approach. Second, extreme weather events, at least in the short term, will cause more damage and thus adversely affect society more than slow-onset phenomena. But while there is widespread agreement that slow-onset climate effects can be reliably attributed to anthropogenic greenhouse gas emissions, our ability to attribute any particular extreme weather event to anthropogenic climate change is less accepted. However, with the emerging science of probabilistic event attribution, it is possible to attribute the fraction of risk caused by anthropogenic climate change to particular weather events and their associated losses. Even with high uncertainty, the robust link of only a small fraction of excessive
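The "fraction of risk" attributed to anthropogenic forcing in probabilistic event attribution is commonly expressed as the fraction of attributable risk (FAR); a minimal sketch with illustrative probabilities (not values from any actual attribution study):

```python
def fraction_attributable_risk(p_natural, p_actual):
    """Fraction of attributable risk:
        FAR = 1 - p0 / p1
    where p0 is the probability of exceeding the event threshold in a
    counterfactual climate without anthropogenic forcing and p1 the
    probability in the actual climate."""
    return 1.0 - p_natural / p_actual

# If a heat wave is four times as likely with anthropogenic forcing,
# three quarters of the risk is attributable to that forcing:
far = fraction_attributable_risk(p_natural=0.025, p_actual=0.1)
```

In practice p0 and p1 are estimated from large ensembles of climate-model simulations with and without anthropogenic forcings, which is where the quoted uncertainty enters.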

  1. The effect of resin on the impact damage tolerance of graphite-epoxy laminates

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Rhodes, M. D.

    1981-01-01

    The effect of the matrix resin on the impact damage tolerance of graphite-epoxy composite laminates was investigated. The materials were evaluated on the basis of the damage incurred due to local impact and on their ability to retain compression strength in the presence of impact damage. Twenty-four different resin systems were evaluated. Five of the systems demonstrated substantial improvements compared to the baseline system, including retention of compression strength in the presence of impact damage. Examination of the neat resin mechanical properties indicates that the resin tensile properties significantly influence laminate damage tolerance and that improvements in laminate damage tolerance are not necessarily made at the expense of room-temperature mechanical properties. Preliminary results indicate that a resin volume fraction on the order of 40 percent or greater may be required to permit the plastic flow between fibers necessary for improved damage tolerance.

  2. 76 FR 74655 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... complex materials that have unique advantages in fatigue strength, weight, and tolerance to damage. The... static strength of composite rotorcraft structures using a damage tolerance evaluation, or a fatigue... also harmonize this standard with international standards for evaluating the fatigue strength of...

  3. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  4. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  5. 75 FR 24502 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures; Reopening of Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF TRANSPORTATION Federal Aviation Administration 14 CFR Parts 27 and 29 RIN 2120-AJ52 Damage Tolerance and Fatigue... 793) Notice No. 09-12, entitled ``Damage Tolerance and Fatigue Evaluation of Composite...

  6. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  7. 77 FR 50576 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures; OMB Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... TRANSPORTATION Federal Aviation Administration 14 CFR Parts 27 and 29 RIN 2120-AJ52 Damage Tolerance and Fatigue... collection requirement contained in the FAA's final rule, ``Damage Tolerance and Fatigue Evaluation of... and Fatigue Evaluation of Composite Rotorcraft Structures,'' published in the Federal Register (76...

  8. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  9. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  10. Structurally Integrated, Damage-Tolerant, Thermal Spray Coatings

    NASA Astrophysics Data System (ADS)

    Vackel, Andrew; Dwivedi, Gopal; Sampath, Sanjay

    2015-07-01

    Thermal spray coatings are used extensively for the protection and life extension of engineering components exposed to harsh wear and/or corrosion during service in the aerospace, energy, and heavy machinery sectors. Cermet coatings applied via high-velocity thermal spray are used in aggressive wear situations almost always coupled with corrosive environments. In several instances (e.g., landing gear), coatings are considered as part of the structure, requiring system-level considerations. Despite their widespread use, the technology has lacked generalized scientific principles for robust coating design, manufacturing, and performance analysis. Advances in process and in situ diagnostics have provided significant insights into the process-structure-property-performance correlations, providing a framework for enhanced design. In this overview, critical aspects of materials, process parametrics, and performance are discussed through exemplary studies on relevant compositions. The underlying connective theme is understanding and controlling residual stress generation, which not only addresses process dynamics but also provides a linkage between process and properties for both the system (e.g., fatigue) and the surface (wear and corrosion). The anisotropic microstructure also invokes the need for damage-tolerant material design to meet future goals.

  11. Damage Tolerance Behavior of Friction Stir Welds in Aluminum Alloys

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of various aerospace structures. Self-reacting and conventional friction stir welding are variations of the friction stir weld process employed in the fabrication of cryogenic propellant tanks which are classified as pressurized structure in many spaceflight vehicle architectures. In order to address damage tolerance behavior associated with friction stir welds in these safety critical structures, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data describing fracture behavior, residual strength capability, and cyclic mission life capability of friction stir welds at ambient and cryogenic temperatures have been generated and will be presented in this paper. Fracture behavior will include fracture toughness and tearing (R-curve) response of the friction stir welds. Residual strength behavior will include an evaluation of the effects of lack of penetration on conventional friction stir welds, the effects of internal defects (wormholes) on self-reacting friction stir welds, and an evaluation of the effects of fatigue cycled surface cracks on both conventional and self-reacting welds. Cyclic mission life capability will demonstrate the effects of surface crack defects on service load cycle capability. The fracture data will be used to evaluate nondestructive inspection and proof test requirements for the welds.

  12. Damage Tolerance Assessment of Friction Pull Plug Welds

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process developed and patented by The Welding Institute in Cambridge, England. Friction stir welding has been implemented in the aerospace industry in the fabrication of longitudinal welds in pressurized cryogenic propellant tanks. As the industry looks to implement friction stir welding in circumferential welds in pressurized cryogenic propellant tanks, techniques to close out the termination hole associated with retracting the pin tool are being evaluated. Friction pull plug welding is under development as one means of closing out the termination hole. A friction pull plug weld placed in a friction stir weld results in a non-homogeneous weld joint where the initial weld, plug weld, their respective heat affected zones and the base metal all interact. The welded joint is a composite, plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size in the test or service environments. Test data relating residual strength capability to flaw size in two aluminum alloy friction plug weld configurations are presented.

  13. Water availability limits tolerance of apical damage in the Chilean tarweed Madia sativa

    NASA Astrophysics Data System (ADS)

    Gonzáles, Wilfredo L.; Suárez, Lorena H.; Molina-Montenegro, Marco A.; Gianoli, Ernesto

    2008-07-01

    Plant tolerance is the ability to reduce the negative impact of herbivory on plant fitness. Numerous studies have shown that plant tolerance is affected by nutrient availability, but the effect of soil moisture has received less attention. We evaluated tolerance of apical damage (clipping that mimicked insect damage) under two watering regimes (control watering and drought) in the tarweed Madia sativa (Asteraceae). We recorded number of heads with seeds and total number of heads as traits related to fitness. Net photosynthetic rate, water use efficiency, number of branches, shoot biomass, and the root:shoot biomass ratio were measured as traits potentially related to tolerance via compensatory responses to damage. In the drought treatment, damaged plants showed ≈43% reduction in reproductive fitness components in comparison with undamaged plants. In contrast, there was no significant difference in reproductive fitness between undamaged and damaged plants in the control watering treatment. Shoot biomass was not affected by apical damage. The number of branches increased after damage in both water treatments but this increase was limited by drought stress. Net photosynthetic rate increased in damaged plants only in the control watering treatment. Water use efficiency increased with drought stress and, in plants regularly watered, also increased after damage. Root:shoot ratio was higher in the low water treatment and damaged plants tended to reduce root:shoot ratio only in this water treatment. It is concluded that water availability limits tolerance to apical damage in M. sativa, and that putative compensatory mechanisms are differentially affected by water availability.

  14. Recent Advances in Durability and Damage Tolerance Methodology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Glaessgen, E. H.; Raju, I. S.; Harris, C. E.

    2007-01-01

    Durability and damage tolerance (D&DT) issues are critical to the development of lighter, safer and more efficient aerospace vehicles. Durability is largely an economic life-cycle design consideration whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. Both D&DT methodologies must address the deleterious effects of changes in material properties and the initiation and growth of damage that may occur during the vehicle's service lifetime. The result of unanticipated D&DT response is often manifested in the form of catastrophic and potentially fatal accidents. As such, durability and damage tolerance requirements must be rigorously addressed for commercial transport aircraft and NASA spacecraft systems. This paper presents an overview of the recent and planned future research in durability and damage tolerance analytical and experimental methods for both metallic and composite aerospace structures at NASA Langley Research Center (LaRC).

  15. Electronic hybridisation implications for the damage-tolerance of thin film metallic glasses

    PubMed Central

    Schnabel, Volker; Jaya, B. Nagamani; Köhler, Mathias; Music, Denis; Kirchlechner, Christoph; Dehm, Gerhard; Raabe, Dierk; Schneider, Jochen M.

    2016-01-01

    A paramount challenge in materials science is to design damage-tolerant glasses. Poisson’s ratio is commonly used as a criterion to gauge the brittle-ductile transition in glasses. However, our data, as well as results in the literature, are in conflict with the concept of Poisson’s ratio serving as a universal parameter for fracture energy. Here, we identify the electronic structure fingerprint associated with damage tolerance in thin film metallic glasses. Our correlative theoretical and experimental data reveal that the fraction of bonds stemming from hybridised states compared to the overall bonding can be associated with damage tolerance in thin film metallic glasses. PMID:27819318

  16. Electronic hybridisation implications for the damage-tolerance of thin film metallic glasses

    NASA Astrophysics Data System (ADS)

    Schnabel, Volker; Jaya, B. Nagamani; Köhler, Mathias; Music, Denis; Kirchlechner, Christoph; Dehm, Gerhard; Raabe, Dierk; Schneider, Jochen M.

    2016-11-01

    A paramount challenge in materials science is to design damage-tolerant glasses. Poisson’s ratio is commonly used as a criterion to gauge the brittle-ductile transition in glasses. However, our data, as well as results in the literature, are in conflict with the concept of Poisson’s ratio serving as a universal parameter for fracture energy. Here, we identify the electronic structure fingerprint associated with damage tolerance in thin film metallic glasses. Our correlative theoretical and experimental data reveal that the fraction of bonds stemming from hybridised states compared to the overall bonding can be associated with damage tolerance in thin film metallic glasses.

  17. Electronic hybridisation implications for the damage-tolerance of thin film metallic glasses.

    PubMed

    Schnabel, Volker; Jaya, B Nagamani; Köhler, Mathias; Music, Denis; Kirchlechner, Christoph; Dehm, Gerhard; Raabe, Dierk; Schneider, Jochen M

    2016-11-07

    A paramount challenge in materials science is to design damage-tolerant glasses. Poisson's ratio is commonly used as a criterion to gauge the brittle-ductile transition in glasses. However, our data, as well as results in the literature, are in conflict with the concept of Poisson's ratio serving as a universal parameter for fracture energy. Here, we identify the electronic structure fingerprint associated with damage tolerance in thin film metallic glasses. Our correlative theoretical and experimental data reveal that the fraction of bonds stemming from hybridised states compared to the overall bonding can be associated with damage tolerance in thin film metallic glasses.

  18. A modal H∞-norm-based performance requirement for damage-tolerant active controller design

    NASA Astrophysics Data System (ADS)

    Genari, Helói F. G.; Mechbal, Nazih; Coffignal, Gérard; Nóbrega, Eurípedes G. O.

    2017-04-01

    Damage-tolerant active control (DTAC) is a recent research area that encompasses control design methodologies resulting from the application of fault-tolerant control methods to vibration control of structures subject to damage. The possibility of damage occurrence is not usually considered in the active vibration control design requirements. Damage changes the structure dynamics, which may produce unexpected modal behavior of the closed-loop system, usually not anticipated by the controller design approaches. A modal H∞ norm and a respective robust controller design framework were recently introduced, and this method is here extended to address a new DTAC strategy implementation. Considering that damage affects each vibration mode differently, this paper adopts the modal H∞ norm to include damage as a design requirement. The basic idea is to create an appropriate energy distribution over the frequency range of interest and respective vibration modes, guaranteeing robustness, damage tolerance, and adequate overall performance, taking into account that it is common to have previous knowledge of the structure regions where damage may occur during its operational life. For this purpose, a structural health monitoring technique is applied to evaluate modal modifications caused by damage. This information is used to create modal weighting matrices, leading to the modal H∞ controller design. Finite element models are adopted for a case study structure, including different damage severities, in order to validate the proposed control strategy. Results show the effectiveness of the proposed methodology with respect to damage tolerance.
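    The per-mode weighting idea above can be sketched numerically: treat each vibration mode as a second-order transfer function, weight each mode's frequency-response peak individually, and penalise more heavily the modes a likely-damage region would affect. This is an illustrative sketch of the concept, not the authors' framework; all mode parameters and weights below are invented:

```python
import numpy as np

# Illustrative sketch of a modal H-infinity-type measure: each vibration
# mode is a second-order transfer function and receives its own weight,
# so modes where damage is expected can be penalised more heavily.
# Mode data and weights below are made up for illustration.

def mode_frf(omega, wn, zeta, gain):
    """Frequency response of one mode: gain / (wn^2 - w^2 + 2j*zeta*wn*w)."""
    return gain / (wn**2 - omega**2 + 2j * zeta * wn * omega)

def weighted_modal_hinf(modes, weights, omega_grid):
    """Largest weighted per-mode peak magnitude (grid estimate of the norm)."""
    peaks = []
    for (wn, zeta, gain), w in zip(modes, weights):
        mag = np.abs(mode_frf(omega_grid, wn, zeta, gain))
        peaks.append(w * mag.max())
    return max(peaks)

modes = [(10.0, 0.02, 1.0), (25.0, 0.01, 0.5)]  # (wn [rad/s], zeta, gain)
weights = [1.0, 3.0]   # heavier weight on the mode a damage region affects
omega = np.linspace(0.1, 50.0, 20000)
print(weighted_modal_hinf(modes, weights, omega))
```

Raising a mode's weight raises its contribution to the design norm, which is how prior knowledge of damage-prone regions enters the controller synthesis as a requirement.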

  19. Specialists Meeting on Impact Damage Tolerance of Structures

    DTIC Science & Technology

    1976-01-01

    damage in metal and fiber-composite structure. ... III.1.1 IMPACT DAMAGE IN METALS. The impact-damage response of metals depends upon many interrelated...established. III.1.2 IMPACT DAMAGE IN FIBER COMPOSITES. There has been very little parametric impact-damage testing of fiber composites. Figure 15 shows...type of structural material, and includes: 1. Bullets impacting metal 2. Bullets or fragments impacting fiber composites 3. Fragments impacting metal

  20. An Evaluation of the Applicability of Damage Tolerance to Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Le, Dy; Turnberg, Jay

    2005-01-01

    The Federal Aviation Administration, the National Aeronautics and Space Administration and the aircraft industry have teamed together to develop methods and guidance for the safe life-cycle management of dynamic systems. Based on the success of the United States Air Force damage tolerance initiative for airframe structure, a crack growth based damage tolerance approach is being examined for implementation into the design and management of dynamic systems. However, dynamic systems accumulate millions of vibratory cycles per flight hour, more than 12,000 times faster than an airframe system. If a detectable crack develops in a dynamic system, the time to failure is extremely short, less than 100 flight hours in most cases, leaving little room for error in the material characterization, life cycle analysis, nondestructive inspection and maintenance processes. In this paper, the authors review the damage tolerant design process focusing on uncertainties that affect dynamic systems and evaluate the applicability of damage tolerance on dynamic systems.
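    The cycle-accumulation arithmetic above is easy to check. The airframe baseline rate is an assumption here (roughly one major load cycle per flight hour); the 12,000x factor and the sub-100-flight-hour window come from the abstract:

```python
# Back-of-the-envelope check of the cycle-accumulation claim above.
# The airframe baseline rate is an assumption (roughly one major
# load cycle per flight hour); the 12,000x factor and the
# <100-flight-hour window are quoted in the abstract.

airframe_cycles_per_hour = 1.0        # assumed baseline
factor = 12_000                       # quoted in the abstract
dynamic_cycles_per_hour = airframe_cycles_per_hour * factor

# Vibratory frequency needed to reach that rate:
required_hz = dynamic_cycles_per_hour / 3600.0
print(f"{required_hz:.2f} Hz")        # a few Hz already exceeds the factor

# Cycles accumulated over the <100-flight-hour window to failure:
cycles_in_window = dynamic_cycles_per_hour * 100
print(f"{cycles_in_window:,.0f} vibratory cycles")
```

Even at this conservative rate, over a million vibratory cycles accumulate in the window between crack detectability and failure, which is why the paper stresses the small margin for error in characterization, analysis, and inspection.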

  1. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  2. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  3. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  4. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  5. Concepts for improving the damage tolerance of composite compression panels. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Rhodes, M. D.; Williams, J. G.

    1984-01-01

    The residual strength of specimens with damage and the sensitivity to damage while subjected to an applied in-plane compression load were determined for flat-plate specimens and blade-stiffened panels. The results suggest that matrix materials that fail by delamination have the lowest damage tolerance capability. Alternate matrix materials or laminates which are transversely reinforced suppress the delamination mode of failure and change the failure mode to transverse shear crippling, which occurs at a higher strain value. Several damage-tolerant blade-stiffened panel design concepts are evaluated. Structural efficiency studies show that only small mass penalties may result from incorporating these damage-tolerant features in panel design. The implications of the test results for the design of aircraft structures were examined with respect to FAR requirements.

  6. Influence of Fibre Architecture on Impact Damage Tolerance in 3D Woven Composites

    NASA Astrophysics Data System (ADS)

    Potluri, P.; Hogg, P.; Arshad, M.; Jetavat, D.; Jamshidi, P.

    2012-10-01

    3D woven composites, due to the presence of through-thickness fibre-bridging, have the potential to improve damage tolerance and at the same time to reduce the manufacturing costs. However, ability to withstand damage depends on weave topology as well as geometry of individual tows. There is an extensive literature on damage tolerance of 2D prepreg laminates but limited work is reported on the damage tolerance of 3D weaves. In view of the recent interest in 3D woven composites from aerospace as well as non-aerospace sectors, this paper aims to provide an understanding of the impact damage resistance as well as damage tolerance of 3D woven composites. Four different 3D woven architectures, orthogonal, angle interlocked, layer-to-layer and modified layer-to-layer structures, have been produced under identical weaving conditions. Two additional structures, Unidirectional (UD) cross-ply and 2D plain weave, have been developed for comparison with 3D weaves. All the four 3D woven laminates have similar order of magnitude of damage area and damage width, but significantly lower than UD and 2D woven laminates. Damage Resistance, calculated as impact energy per unit damage area, has been shown to be significantly higher for 3D woven laminates. Rate of change of CAI strength with impact energy appears to be similar for all four 3D woven laminates as well as UD laminate; 2D woven laminate has higher rate of degradation with respect to impact energy. Undamaged compression strength has been shown to be a function of average tow waviness angle. Additionally, 3D weaves exhibit a critical damage size; below this size there is no appreciable reduction in compression strength. 3D woven laminates have also exhibited a degree of plasticity during compression whereas UD laminates fail instantly. The experimental work reported in this paper forms a foundation for systematic development of computational models for 3D woven architectures for damage tolerance.
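    The damage-resistance metric used above (impact energy per unit damage area) reduces to a one-line calculation; the damage areas below are hypothetical values for comparison, not the paper's measurements:

```python
# Sketch of the damage-resistance metric described above: impact energy
# per unit projected damage area. Values are illustrative placeholders,
# not data from the paper.

def damage_resistance(impact_energy_j: float, damage_area_mm2: float) -> float:
    """Damage resistance in J/mm^2 (higher = more resistant)."""
    return impact_energy_j / damage_area_mm2

# Hypothetical comparison: same 20 J impact, different C-scan damage areas.
laminates = {"3D orthogonal": 150.0, "2D plain weave": 450.0, "UD cross-ply": 600.0}
for name, area in laminates.items():
    print(f"{name}: {damage_resistance(20.0, area):.3f} J/mm^2")
```

Under this metric, a laminate that confines the same impact energy to a smaller damage area scores higher, matching the paper's finding that 3D weaves outperform UD and 2D laminates.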

  7. Advanced Composite Wind Turbine Blade Design Based on Durability and Damage Tolerance

    SciTech Connect

    Abumeri, Galib; Abdi, Frank

    2012-02-16

    damage and fracture modes that resemble those reported in the tests. The results show that computational simulation can be relied on to enhance the design of tapered composite structures such as the ones used in turbine wind blades. A computational simulation for durability, damage tolerance (D&DT) and reliability of composite wind turbine blade structures in presence of uncertainties in material properties was performed. A composite turbine blade was first assessed with finite element based multi-scale progressive failure analysis to determine failure modes and locations as well as the fracture load. D&DT analyses were then validated with static test performed at Sandia National Laboratories. The work was followed by detailed weight analysis to identify contribution of various materials to the overall weight of the blade. The methodology ensured that certain types of failure modes, such as delamination progression, are contained to reduce risk to the structure. Probabilistic analysis indicated that composite shear strength has a great influence on the blade ultimate load under static loading. Weight was reduced by 12% with robust design without loss in reliability or D&DT. Structural benefits obtained with the use of enhanced matrix properties through nanoparticles infusion were also assessed. Thin unidirectional fiberglass layers enriched with silica nanoparticles were applied to the outer surfaces of a wind blade to improve its overall structural performance and durability. The wind blade was a 9-meter prototype structure manufactured and tested subject to three saddle static loading at Sandia National Laboratory (SNL). The blade manufacturing did not include the use of any nano-material. 
With silica nanoparticles in glass composite applied to the exterior surfaces of the blade, the durability and damage tolerance (D&DT) results from multi-scale PFA showed an increase in ultimate load of the blade by 9.2% as compared to baseline structural performance (without nano

  8. Strong, damage tolerant oxide-fiber/oxide matrix composites

    NASA Astrophysics Data System (ADS)

    Bao, Yahua

    cationic polyelectrolytes to have a positive surface charge and then dipped into diluted, negatively-charged AlPO4 colloidal suspension (0.05 M) at pH 7.5. Amorphous AlPO4 (crystallizes to tridymite- and cristobalite-forms at 1080°C) nanoparticles were coated on fibers layer-by-layer using an electrostatic attraction protocol. A uniform and smooth coating was formed which allowed fiber pullout from the matrix of a Nextel 720/alumina mini-composite hot-pressed at 1250°C/20 MPa. Reaction-bonded mullite (RBM), with low formation temperature and sintering shrinkage, was synthesized by incorporation of mixed-rare-earth-oxide (MREO) and mullite seeds. Pure mullite formed with 7.5wt% MREO at 1300°C. Introduction of 5wt% mullite seeds gave RBM with less than 3% shrinkage and 20% porosity. AlPO4-coated Nextel 720/RBM composites were successfully fabricated by EPID and pressureless sintering at 1300°C. Significant fiber pullout occurred and the 4-point bend strength was around 170 MPa (with 25-30vol% fibers) at room temperature and 1100°C, with a work-of-fracture of 7 kJ/m2. At 1200°C, the composite failed in shear due to the MREO-based glassy phase in the matrix. AlPO4-coated Nextel 720 fiber/aluminosilicate (no MREO) showed damage tolerance at 1200°C with a bend strength of 170 MPa.

  9. Some Observations on Damage Tolerance Analyses in Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Dawicke, David S.; Hampton, Roy W.

    2017-01-01

    AIAA standards S080 and S081 are applicable for certification of metallic pressure vessels (PV) and composite overwrap pressure vessels (COPV), respectively. These standards require damage tolerance analyses with a minimum reliable detectible flaw/crack and demonstration of safe life four times the service life with these cracks at the worst-case location in the PVs and oriented perpendicular to the maximum principal tensile stress. The standards require consideration of semi-elliptical surface cracks in the range of aspect ratios (crack depth a to half of the surface length c, i.e., (a/c)) of 0.2 to 1. NASA-STD-5009 provides the minimum reliably detectible standard crack sizes (90/95 probability of detection, POD) for several non-destructive evaluation (NDE) methods (eddy current (ET), penetrant (PT), radiography (RT), and ultrasonic (UT)) for the two limits of the aspect ratio range required by the AIAA standards. This paper tries to answer the questions: can the safe life analysis consider only the life for the crack sizes at the two required limits, or endpoints, of the (a/c) range for the NDE method used, or does the analysis need to consider values within that range? What would be an appropriate method to interpolate 90/95 POD crack sizes at intermediate (a/c) values? Several procedures to develop combinations of a and c within the specified range are explored. A simple linear relationship between a and c is chosen to compare the effects of seven different approaches to determine combinations of a_j and c_j that are between the (a/c) endpoints. Two of the seven are selected for evaluation: Approach I, the simple linear relationship, and a more conservative option, Approach III. For each of these two Approaches, the lives are computed for initial semi-elliptic crack configurations in a plate subjected to remote tensile fatigue loading with an R-ratio of 0.1, for an assumed material evaluated using NASGRO (registered trademark) version 8.1.
These calculations demonstrate
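    The simple linear a-c relationship described above (Approach I) can be sketched as an interpolation between the two required aspect-ratio endpoints. The endpoint crack sizes below are illustrative, not NASA-STD-5009 values:

```python
import numpy as np

# Sketch of generating intermediate surface-crack sizes between the two
# NDE-detectable endpoints required by the AIAA standards, using a simple
# linear a-c relationship (in the spirit of Approach I). Endpoint crack
# sizes below are illustrative, not NASA-STD-5009 values.

def linear_a_c_combinations(a1, c1, a2, c2, n=5):
    """Linearly interpolate n (a_j, c_j) pairs between the (a/c) = 0.2
    and (a/c) = 1.0 endpoint crack sizes."""
    t = np.linspace(0.0, 1.0, n)
    a = a1 + t * (a2 - a1)
    c = c1 + t * (c2 - c1)
    return list(zip(a, c))

# Illustrative endpoints: a shallow long crack (a/c = 0.2) and a deep
# semicircular crack (a/c = 1.0), in millimetres.
pairs = linear_a_c_combinations(a1=0.50, c1=2.50, a2=1.25, c2=1.25, n=5)
for a, c in pairs:
    print(f"a = {a:.3f} mm, c = {c:.3f} mm, a/c = {a/c:.2f}")
```

Each intermediate pair would then be run through the crack-growth analysis so that safe life is not judged from the two endpoints alone.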

  10. Use of a New Portable Instrumented Impactor on the NASA Composite Crew Module Damage Tolerance Program

    NASA Technical Reports Server (NTRS)

    Jackson, Wade C.; Polis, Daniel L.

    2014-01-01

    Damage tolerance performance is critical to composite structures because surface impacts at relatively low energies may result in a significant strength loss. For certification, damage tolerance criteria require aerospace vehicles to meet design loads while containing damage at critical locations. Data from standard small coupon testing are difficult to apply to larger more complex structures. Due to the complexity of predicting both the impact damage and the residual properties, damage tolerance is demonstrated primarily by testing. A portable, spring-propelled, impact device was developed which allows the impact damage response to be investigated on large specimens, full-scale components, or entire vehicles. During impact, both the force history and projectile velocity are captured. The device was successfully used to demonstrate the damage tolerance performance of the NASA Composite Crew Module. The impactor was used to impact 18 different design features at impact energies up to 35 J. Detailed examples of these results are presented, showing impact force histories, damage inspection results, and response to loading.

  11. Phosphorylation of human INO80 is involved in DNA damage tolerance

    SciTech Connect

    Kato, Dai; Waki, Mayumi; Umezawa, Masaki; Aoki, Yuka; Utsugi, Takahiko; Ohtsu, Masaya; Murakami, Yasufumi

    2012-01-06

    Highlights: (1) Depletion of hINO80 significantly reduced PCNA ubiquitination. (2) Depletion of hINO80 significantly reduced the nuclear dot intensity of RAD18 after UV irradiation. (3) Western blot analyses showed a phosphorylated hINO80 C-terminus. (4) Overexpression of a phosphorylation-mutant hINO80 reduced PCNA ubiquitination. -- Abstract: Double strand breaks (DSBs) are the most serious type of DNA damage. DSBs can be generated directly by exposure to ionizing radiation or indirectly by replication fork collapse. The DNA damage tolerance pathway, which is conserved from bacteria to humans, prevents this collapse by overcoming replication blockages. The INO80 chromatin remodeling complex plays an important role in the DNA damage response, and the yeast INO80 complex participates in the DNA damage tolerance pathway. The mechanisms regulating the yINO80 complex are not fully understood, but the yeast INO80 complex is necessary for efficient proliferating cell nuclear antigen (PCNA) ubiquitination and for recruitment of Rad18 to replication forks. In contrast, the function of the mammalian INO80 complex in DNA damage tolerance is less clear. Here, we show that human INO80 was necessary for PCNA ubiquitination and recruitment of Rad18 to DNA damage sites. Moreover, the C-terminal region of human INO80 was phosphorylated, and overexpression of a phosphorylation-deficient mutant of human INO80 resulted in decreased ubiquitination of PCNA during DNA replication. These results suggest that the human INO80 complex, like the yeast complex, is involved in the DNA damage tolerance pathway and that phosphorylation of human INO80 participates in this pathway. These findings provide new insights into the DNA damage tolerance pathway in mammalian cells.

  12. Low velocity instrumented impact testing of four new damage tolerant carbon/epoxy composite systems

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1990-01-01

    Low velocity drop weight instrumented impact testing was utilized to examine the damage resistance of four recently developed carbon fiber/epoxy resin systems. A fifth material, T300/934, for which a large data base exists, was also tested for comparison purposes. A 16-ply quasi-isotropic lay-up configuration was used for all the specimens. Force/absorbed energy-time plots were generated for each impact test. The specimens were cross-sectionally analyzed to record the damage corresponding to each impact energy level. Maximum force of impact versus impact energy plots were constructed to compare the various systems for impact damage resistance. Results show that the four new damage tolerant fiber/resin systems far outclassed the T300/934 material. The most damage tolerant material tested was the IM7/1962 fiber/resin system.

  13. INSYDE: a synthetic, probabilistic flood damage model based on explicit cost analysis

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Figueiredo, Rui; Martina, Mario L. V.; Molinari, Daniela; Scorzini, Anna Rita

    2016-12-01

    Methodologies to estimate economic flood damages are increasingly important for flood risk assessment and management. In this work, we present a new synthetic flood damage model based on a component-by-component analysis of physical damage to buildings. The damage functions are designed using an expert-based approach with the support of existing scientific and technical literature, loss adjustment studies, and damage surveys carried out for past flood events in Italy. The model structure is designed to be transparent and flexible, and therefore it can be applied in different geographical contexts and adapted to the actual knowledge of hazard and vulnerability variables. The model was tested on a recent flood event in northern Italy. Validation results provided good estimates of post-event damages, with similar or superior performance when compared with other damage models available in the literature. In addition, a local sensitivity analysis was performed in order to identify the hazard variables that have the most influence on damage assessment results.
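    The component-by-component idea in this abstract can be sketched in code. The component list, damage fractions, and unit costs below are invented for illustration and are not the actual INSYDE damage functions:

    ```python
    def clamp(x, lo=0.0, hi=1.0):
        """Bound a damage fraction to [0, 1]."""
        return max(lo, min(hi, x))

    def building_damage(depth_m, area_m2, unit_costs):
        """Sum component-by-component damages for one building.

        depth_m:    water depth inside the building (hazard variable)
        area_m2:    building footprint
        unit_costs: replacement cost per m^2 for each component (EUR/m^2)
        """
        # Hypothetical depth-damage fractions, one per building component.
        fractions = {
            "finishing":  clamp(depth_m / 1.5),          # plaster, paint
            "doors":      clamp((depth_m - 0.1) / 1.0),
            "electrical": clamp((depth_m - 0.3) / 1.2),
        }
        return sum(fractions[c] * unit_costs[c] * area_m2 for c in fractions)

    costs = {"finishing": 40.0, "doors": 15.0, "electrical": 25.0}
    base = building_damage(1.0, 100.0, costs)

    # One-at-a-time local sensitivity to water depth, conceptually similar
    # to the local sensitivity analysis described in the abstract.
    perturbed = building_damage(1.1, 100.0, costs)
    print(base, (perturbed - base) / base)
    ```

    Because each component carries its own cost and depth-damage function, the total adapts naturally when a component is unknown or irrelevant in a given context: it is simply dropped from the dictionary.
    
    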

  14. Collection, processing, and reporting of damage tolerant design data for non-aerospace structural materials

    NASA Technical Reports Server (NTRS)

    Huber, P. D.; Gallagher, J. P.

    1994-01-01

    This report describes the organization, format and content of the NASA Johnson damage tolerant database which was created to store damage tolerant property data for non-aerospace structural materials. The database is designed to store fracture toughness data (K_IC, K_c, J_IC, and CTOD_IC), resistance curve data (K_R vs. Δa_eff and J_R vs. Δa_eff), as well as subcritical crack growth data (a vs. N and da/dN vs. ΔK). The database contains complementary material property data for both stainless and alloy steels, as well as for aluminum, nickel, and titanium alloys which were not incorporated into the Damage Tolerant Design Handbook database.
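    A typical use of the subcritical crack growth data (da/dN vs. ΔK) stored in such a database is integrating a Paris-type growth law to estimate crack growth life. A minimal sketch, with assumed Paris constants and a generic edge-crack geometry factor rather than values from the handbook database:

    ```python
    import math

    def delta_K(stress_range_mpa, a_m, Y=1.12):
        """Stress intensity factor range for an edge crack (Y ~ 1.12),
        in MPa*sqrt(m)."""
        return Y * stress_range_mpa * math.sqrt(math.pi * a_m)

    def cycles_to_grow(a0_m, af_m, stress_range_mpa, C, m, da=1e-5):
        """Numerically integrate the Paris law dN = da / (C * dK^m)
        from initial crack size a0 to final size af."""
        N, a = 0.0, a0_m
        while a < af_m:
            dK = delta_K(stress_range_mpa, a)
            N += da / (C * dK ** m)
            a += da
        return N

    # Illustrative Paris constants for a generic steel (assumed values,
    # C in (m/cycle)/(MPa*sqrt(m))^m).
    N = cycles_to_grow(a0_m=0.001, af_m=0.01, stress_range_mpa=100.0,
                       C=1e-11, m=3.0)
    print(f"{N:.3e} cycles")
    ```

    In practice the constants C and m would be fit to the tabulated da/dN vs. ΔK records for the specific alloy and environment.
    
    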

  15. Damage tolerance of candidate thermoset composites for use on single stage to orbit vehicles

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Lance, D.; Hodge, A.

    1994-01-01

    Four fiber/resin systems were compared for resistance to damage and damage tolerance. One toughened epoxy and three toughened bismaleimide (BMI) resins were used, all with IM7 carbon fiber reinforcement. A statistical design of experiments technique was used to evaluate the effects of impact energy, specimen thickness, and impactor diameter on the damage area, as computed by C-scans, and residual compression-after-impact (CAI) strength. Results showed that two of the BMI systems sustained relatively large damage zones yet had an excellent retention of CAI strength.

  16. Safe-life and damage-tolerant design approaches for helicopter structures

    NASA Technical Reports Server (NTRS)

    Reddick, H. K., Jr.

    1983-01-01

    The safe-life and damage-tolerant design approaches discussed apply to both metallic and fibrous composite helicopter structures. The application of these design approaches to fibrous composite structures is emphasized. Safe-life and damage-tolerant criteria are applied to all helicopter flight-critical components, which are generally categorized as: dynamic components, comprising the main and tail rotor systems (blades, hub, and rotating controls) and the drive train (transmission and the main and interconnecting rotor shafts); and the airframe, composed of the fuselage, aerodynamic surfaces, and landing gear.

  17. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer-mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture-resistant concepts in the design, and to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM, was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression-after-impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margins of safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  18. Fuel containment, lightning protection and damage tolerance in large composite primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Griffin, Charles F.; James, Arthur M.

    1985-01-01

    The damage-tolerance characteristics of high strain-to-failure graphite fibers and toughened resins were evaluated. Test results show that conventional fuel tank sealing techniques are applicable to composite structures. Techniques were developed to prevent fuel leaks due to low-energy impact damage. For wing panels subjected to swept stroke lightning strikes, a surface protection of graphite/aluminum wire fabric and a fastener treatment proved effective in eliminating internal sparking and reducing structural damage. The technology features developed were incorporated and demonstrated in a test panel designed to meet the strength, stiffness, and damage tolerance requirements of a large commercial transport aircraft. The panel test results exceeded design requirements for all test conditions. Wing surfaces constructed with composites offer large weight savings if design allowable strains for compression can be increased from current levels.

  19. Eukaryotic Translesion Polymerases and Their Roles and Regulation in DNA Damage Tolerance

    PubMed Central

    Waters, Lauren S.; Minesinger, Brenda K.; Wiltrout, Mary Ellen; D'Souza, Sanjay; Woodruff, Rachel V.; Walker, Graham C.

    2009-01-01

    Summary: DNA repair and DNA damage tolerance machineries are crucial to overcome the vast array of DNA damage that a cell encounters during its lifetime. In this review, we summarize the current state of knowledge about the eukaryotic DNA damage tolerance pathway translesion synthesis (TLS), a process in which specialized DNA polymerases replicate across DNA lesions. TLS aids in resistance to DNA damage, presumably by restarting stalled replication forks or filling in gaps that remain in the genome due to the presence of DNA lesions. One consequence of this process is the potential risk of introducing mutations. Given the role of these translesion polymerases in mutagenesis, we discuss the significant regulatory mechanisms that control the five known eukaryotic translesion polymerases: Rev1, Pol ζ, Pol κ, Pol η, and Pol ι. PMID:19258535

  20. Damage Tolerant Repair Techniques for Pressurized Aircraft Fuselages

    DTIC Science & Technology

    1994-01-01


  1. Damage tolerance of woven graphite-epoxy buffer strip panels

    NASA Technical Reports Server (NTRS)

    Kennedy, John M.

    1990-01-01

    Graphite-epoxy panels with S glass buffer strips were tested in tension and shear to measure their residual strengths with crack-like damage. The buffer strips were regularly spaced narrow strips of continuous S glass. Panels were made with a uniweave graphite cloth where the S glass buffer material was woven directly into the cloth. Panels were made with different width and thickness buffer strips. The panels were loaded to failure while remote strain, strain at the end of the slit, and crack opening displacement were monitored. The notched region and nearby buffer strips were radiographed periodically to reveal crack growth and damage. Except for panels with short slits, the buffer strips arrested the propagating crack. The strength (or failing strain) of the panels was significantly higher than the strength of all-graphite panels with the same length slit. Panels with wide, thick buffer strips were stronger than panels with thin, narrow buffer strips. A shear-lag model predicted the failing strength of tension panels with wide buffer strips accurately, but over-estimated the strength of the shear panels and the tension panels with narrow buffer strips.

  2. USAF Damage Tolerant Design Handbook: Guidelines for the analysis and Design of Damage Tolerant Aircraft Structures. Revision A

    DTIC Science & Technology

    1979-03-01


  3. Modulation of inflammation and disease tolerance by DNA damage response pathways.

    PubMed

    Neves-Costa, Ana; Moita, Luis F

    2016-09-30

    The accurate replication and repair of DNA is central to organismal survival. This process is challenged by the many factors that can change genetic information, such as replication errors and direct damage to the DNA molecule by chemical and physical agents. DNA damage can also result from microorganism invasion as an integral step of their life cycle or as collateral damage from host defense mechanisms against pathogens. Here we review the complex crosstalk of DNA damage response and immune response pathways that might be evolutionarily connected and argue that DNA damage response pathways can be explored therapeutically to induce disease tolerance through the activation of tissue damage control processes. Such an approach may constitute the missing pillar in the treatment of critical illnesses caused by multiple organ failure, such as sepsis and septic shock.

  4. Probabilistic characteristics of random damage events and their quantification in acrylic bone cement.

    PubMed

    Qi, Gang; Wayne, Steven F; Penrose, Oliver; Lewis, Gladius; Hochstein, John I; Mann, Kenneth A

    2010-11-01

    The failure of brittle and quasi-brittle polymers can be attributed to a multitude of random microscopic damage modes, such as fibril breakage, crazing, and microfracture. As the load increases, new damage modes appear, and existing ones can transition into others. In the example polymer used in this study, a commercially available acrylic bone cement, these modes, as revealed by scanning electron microscopy of fracture surfaces, include nucleation of voids, cracking, and local detachment of the beads from the matrix. Here, we made acoustic measurements of the randomly generated microscopic events (RGME) that occurred in the material under pure tension and under three-point bending, and characterized the severity of the damage by the entropy (s) of the probability distribution of the observed acoustic signal amplitudes. We correlated s with the applied stress (σ) by establishing an empirical s-σ relationship, which quantifies the activities of RGME under Mode I stress. It reveals the state of random damage modes: when ds/dσ > 0, the number of damage modes present increases with increasing stress, whereas it decreases when ds/dσ < 0. When ds/dσ ≈ 0, no new random damage modes occur. In the s-σ curve, there exists a transition zone, with the stress at the "knee point" in this zone (center of the zone) corresponding to ~30 and ~35% of the cement's tensile and bending strengths, respectively. This finding explains the effects of RGME on material fatigue performance and may be used to approximate the fatigue limit.
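    The severity measure s can be illustrated with a small sketch: Shannon entropy computed from a histogram of acoustic signal amplitudes. The binning scheme and the amplitude values are assumptions for illustration, not the authors' actual signal processing:

    ```python
    import math
    from collections import Counter

    def amplitude_entropy(amplitudes, bin_width=1.0):
        """Shannon entropy s of the observed acoustic amplitude
        distribution (natural log). Amplitudes are binned into
        intervals of bin_width; the binning is an assumption."""
        bins = Counter(int(a // bin_width) for a in amplitudes)
        n = len(amplitudes)
        return -sum((c / n) * math.log(c / n) for c in bins.values())

    # A narrow amplitude distribution (few active damage modes) has
    # lower entropy than a broad one (many concurrent modes).
    narrow = [10.0, 10.2, 10.4, 10.1, 10.3]
    broad = [2.0, 15.0, 33.0, 48.0, 61.0]
    print(amplitude_entropy(narrow) < amplitude_entropy(broad))
    ```

    Tracking this entropy against the applied stress σ gives the empirical s-σ curve described in the abstract, with the sign of ds/dσ indicating whether damage modes are proliferating or dying out.
    
    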

  5. New discoveries linking transcription to DNA repair and damage tolerance pathways.

    PubMed

    Cohen, Susan E; Walker, Graham C

    2011-01-01

    In Escherichia coli, the transcription elongation factor NusA is associated with all elongating RNA polymerases where it functions in transcription termination and antitermination. Here, we review our recent results implicating NusA in the recruitment of DNA repair and damage tolerance mechanisms to sites of stalled transcription complexes.

  6. Fuel containment and damage tolerance in large composite primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Griffin, C. F.

    1983-01-01

    Technical problems related to fuel containment and damage tolerance of composite material wings for transport aircraft were investigated. The major tasks were the following: (1) the preliminary design of a damage tolerant wing surface using composite materials; (2) the evaluation of fuel sealing and lightning protection methods for a composite material wing; and (3) an experimental investigation of the damage tolerant characteristics of toughened resin graphite/epoxy materials. The design concepts investigated for the upper and lower surfaces of a composite wing for a transport aircraft are presented, and the relationship between weight savings and the design allowable strain used within the analysis is discussed. Experiments which compare the fuel sealing characteristics of bolt-bonded joints and bolted joints sealed with a polysulphide sealant are reviewed. Data from lightning strike tests on stiffened and unstiffened graphite/epoxy panels are presented. A wide variety of coupon tests were conducted to evaluate the relative damage tolerance of toughened resin graphite/epoxies. Data from these tests are presented and their relevance to the wing surface design concepts is discussed.

  7. Assessment of the Damage Tolerance of Postbuckled Hat-Stiffened Panels Using Single-Stringer Specimens

    NASA Technical Reports Server (NTRS)

    Bisagni, Chiara; Vescovini, Riccardo; Davila, Carlos G.

    2010-01-01

    A procedure is proposed for the assessment of the damage tolerance and collapse of stiffened composite panels using a single-stringer compression specimen. The dimensions of the specimen are determined such that the specimen's nonlinear response and collapse are representative of an equivalent multi-stringer panel in compression. Experimental tests are conducted on specimens with and without an embedded delamination. A shell-based finite element model with intralaminar and interlaminar damage capabilities is developed to predict the postbuckling response as well as the damage evolution from initiation to collapse.

  8. An assessment of buffer strips for improving damage tolerance

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.; Kennedy, J. M.

    1981-01-01

    Graphite/epoxy panels with buffer strips were tested in tension to measure their residual strength with crack-like damage. Panels were made with 45/0/-45/90(2S) and 45/0/450(2S) layups. The buffer strips were parallel to the loading direction. They were made by replacing narrow strips of the 0 deg graphite plies with strips of either 0 deg S-glass/epoxy or Kevlar-49/epoxy on either a one-for-one or a two-for-one basis. In a third case, 0 deg graphite/epoxy was used as the buffer material and thin, perforated Mylar strips were placed between the 0 deg plies and the cross-plies to weaken the interfaces and thus isolate the 0 deg plies. Some panels were made with buffer strips of different widths and spacings. The buffer strips arrested the cracks and increased the residual strengths significantly over those of plain laminates without buffer strips. A shear-lag type stress analysis correctly predicted the effects of layups, buffer material, buffer strip width and spacing, and the number of plies of buffer material.

  9. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
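    The approach described, propagating scatter in primitive variables up to a structural reliability measure, can be sketched with a plain Monte Carlo simulation. The distributions and the limit state below are invented for illustration and do not represent the actual engine-component data or the computational simulation tools used in the report:

    ```python
    import random

    random.seed(0)

    def simulate_margin():
        """One Monte Carlo sample of a structural margin (strength minus
        applied stress), propagating scatter in primitive variables.
        All distributions here are illustrative assumptions."""
        strength = random.gauss(900.0, 60.0)   # MPa, material scatter
        knockdown = random.gauss(1.0, 0.05)    # property degradation factor
        load = random.gauss(650.0, 80.0)       # MPa, load scatter
        return strength * knockdown - load

    samples = [simulate_margin() for _ in range(20000)]
    # Probability of structural failure = fraction of negative margins.
    p_fail = sum(m < 0.0 for m in samples) / len(samples)
    print(f"estimated failure probability ~ {p_fail:.4f}")
    ```

    Sorting the sampled margins also yields the cumulative distribution function of the response, the form in which results such as those in the report are usually presented.
    
    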

  10. Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti

    NASA Technical Reports Server (NTRS)

    Johnson, Jerry

    1992-01-01

    The Piaggio P-180 Avanti, a twin pusher-prop engine nine-passenger business aircraft was certified in 1990, to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerant methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris, was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.

  11. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    SciTech Connect

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal of maximizing the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.
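    The planned Monte Carlo treatment of material-property variation can be illustrated with a Weibull strength model, which is commonly used for brittle materials such as graphite. The Weibull modulus, scale, and applied stress below are assumed values, not VHTR component data; the sketch also checks the sampled failure probability against the closed-form Weibull expression:

    ```python
    import math
    import random

    random.seed(1)

    # Weibull strength model (illustrative assumed parameters).
    m_weibull, sigma0 = 10.0, 25.0   # shape (Weibull modulus), scale (MPa)
    applied = 18.0                   # peak component stress (MPa, assumed)

    # Monte Carlo estimate: fraction of sampled strengths below the load.
    n = 50000
    failures = sum(random.weibullvariate(sigma0, m_weibull) < applied
                   for _ in range(n))
    p_fail_mc = failures / n

    # Closed-form check: P_f = 1 - exp(-(sigma/sigma0)^m)
    p_fail_cf = 1.0 - math.exp(-(applied / sigma0) ** m_weibull)
    print(p_fail_mc, p_fail_cf)
    ```

    In a full CDM analysis the sampled strength would feed the stress-based part of the failure criterion element by element, rather than being compared to a single scalar stress as here.
    
    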

  12. Damage Tolerance Testing of a NASA TransHab Derivative Woven Inflatable Module

    NASA Technical Reports Server (NTRS)

    Edgecombe, John; delaFuente, Horacio; Valle, Gerard

    2009-01-01

    Current options for Lunar habitat architecture include inflatable habitats and airlocks. Inflatable structures can have mass and volume advantages over conventional structures. However, inflatable structures carry different inherent risks and are at a lower Technical Readiness Level (TRL) than more conventional metallic structures. One of the risks associated with inflatable structures is in understanding the tolerance to induced damage. The Damage Tolerance Test (DTT) is designed to study the structural integrity of an expandable structure. TransHab (Figure 1) was an experimental inflatable module developed at the NASA/Johnson Space Center in the 1990s. The TransHab design was originally envisioned for use in Mars transits but was also studied as a potential habitat for the International Space Station (ISS). The design of the TransHab module was based on a woven design using an Aramid fabric. Testing of this design demonstrated a high level of predictability and repeatability with analytical predictions of stresses and deflections. Based on JSC's experience with the design and analysis of woven inflatable structures, the Damage Tolerance Test article was designed and fabricated using a woven design. The DTT article was inflated to 45 psig, representing 25% of the ultimate burst pressure, and one of the one-inch wide longitudinal structural members was severed by initiating a Linear Shaped Charge (LSC). Strain gage measurements, at the interface between the expandable elements (straps) and the nonexpandable metallic elements for pre-selected longitudinal straps, were taken throughout pressurization of the module and strap separation. Strain gage measurements show no change in longitudinal strap loading at the bulkhead interface after strap separation, indicating loads in the restraint layer were re-distributed local to the damaged area due to the effects of friction under high internal pressure loading. The test completed all primary objectives with better than

  13. Damage tolerance and assessment of unidirectional carbon fiber composites: An experimental and numerical study

    NASA Astrophysics Data System (ADS)

    Flores, Mark David

    Composites are beginning to be used in a variety of different applications throughout industry. However, certification and damage tolerance are a growing concern in many aerospace and marine applications. Although compression-after-impact behavior has been studied thoroughly, a damage tolerance methodology that accurately characterizes the failure of composites has not been established. An experimental investigation was performed to study the effect of stacking sequence, low-velocity impact response, and residual strength due to compression and fatigue. Digital Image Correlation (DIC) captured the strains and deformation of the plate due to compression. Computational investigations integrated non-destructive techniques (C-scan, X-ray) to determine the extent of the damage created by the manufacturing process and impact, in order to accurately represent the pre-existing damage. Fiber/matrix cracking, delamination growth, buckling, as well as other failure mechanisms occur in the compression-after-impact laminated specimens examined experimentally. The results from this study provide knowledge of the compression-after-impact strength of plates, and a basis for validation of detailed modeling of progressive failure from impact-damaged composites.

  14. Advanced Damage Tolerance Analysis of International Space Station Pressure Wall Welds

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.

    2006-01-01

    EM20/MSFC has sponsored technology in the area of advanced damage tolerance analysis tools used to analyze the International Space Station (ISS) pressure wall welds. The ISS European modules did not receive non-destructive evaluation (NDE) inspection after proof test. In final assembly configuration, most welds could only be inspected from one side, and some welds were uninspectible. Therefore, advanced damage tolerance analysis was required to determine the critical initial flaw sizes and predicted safe life for the pressure wall welds. EM20 sponsored the development of new finite element tools using FEA-Crack and WARP3D to solve the problem. This presentation gives a brief overview of the new analytical tools and the analysis results.

  15. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue.

  16. Advanced Durability and Damage Tolerance Design and Analysis Methods for Composite Structures: Lessons Learned from NASA Technology Development Programs

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Shuart, Mark J.

    2003-01-01

    Aerospace vehicles are designed to be durable and damage tolerant. Durability is largely an economic life-cycle design consideration whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. However, both durability and damage tolerance design methodologies must address the deleterious effects of changes in material properties and the initiation and growth of microstructural damage that may occur during the service lifetime of the vehicle. Durability and damage tolerance design and certification requirements are addressed for commercial transport aircraft and NASA manned spacecraft systems. The state-of-the-art in advanced design and analysis methods is illustrated by discussing the results of several recently completed NASA technology development programs. These programs include the NASA Advanced Subsonic Technology Program demonstrating technologies for large transport aircraft and the X-33 hypersonic test vehicle demonstrating technologies for a single-stage-to-orbit space launch vehicle.

  17. Probabilistic Assessment of Fracture Progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank

    1999-01-01

    This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically, during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.

  18. Damage Tolerance of Pre-Stressed Composite Panels Under Impact Loads

    NASA Astrophysics Data System (ADS)

    Johnson, Alastair F.; Toso-Pentecôte, Nathalie; Schueler, Dominik

    2014-02-01

    An experimental test campaign studied the structural integrity of carbon fibre/epoxy panels preloaded in tension or compression and then subjected to gas gun impact tests causing significant damage. The test programme used representative composite aircraft fuselage panels composed of aerospace carbon fibre toughened epoxy prepreg laminates. Preload levels in tension were representative of design limit loads for fuselage panels of this size, and maximum compression preloads were in the post-buckle region. Two main impact scenarios were considered: notch damage from a 12 mm steel cube projectile, at velocities in the range 93-136 m/s; and blunt impact damage from 25 mm diameter glass balls, at velocities of 64-86 m/s. The combined influence of preload and impact damage on panel residual strengths was measured and the results were analysed in the context of damage tolerance requirements for composite aircraft panels. The tests showed structural integrity well above design limit loads for composite panels preloaded in tension and compression with visible notch impact damage from hard-body impact tests. However, blunt impact tests on buckled compression-loaded panels caused large delamination damage regions, which lowered the plate bending stiffness and significantly reduced compression strengths in buckling.

  19. An assessment of buffer strips for improving damage tolerance of composite laminates at elevated temperature

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1981-01-01

    Buffer strips greatly improve the damage tolerance of graphite/epoxy laminates loaded in tension. Graphite/polyimide buffer strip panels were made and tested to determine their residual strength at ambient and elevated (177 C) temperature. Each panel was cut in the center to represent damage. Panels were radiographed and crack-opening displacements were recorded to indicate fracture, fracture arrest, and the extent of damage in the buffer strip after arrest. All panels had the same buffer strip spacing and width. The buffer strip material was 0 deg S-glass/PMR-15. The buffer strips were made by replacing narrow strips of the 0 deg graphite plies with strips of the 0 deg S-glass on either a one-for-one or a two-for-one basis. Half of the panels were heated to 177 ± 3 C before and during the testing. Elevated temperature did not alter the fracture behavior of the buffer configuration.

  20. Modeling continuous-fiber reinforced polymer composites for exploration of damage tolerant concepts

    NASA Astrophysics Data System (ADS)

    Matthews, Peter J.

    This work aims to improve the predictive capability for fiber-reinforced polymer matrix composite laminates using the finite element method. A new tool for modeling composite damage was developed which considers the important modes of failure. Well-known micromechanical models were implemented to predict material values for material systems of interest to aerospace applications. These generated material values served as input to intralaminar and interlaminar damage models. A three-dimensional in-plane damage material model was implemented and its behavior verified. Deficiencies in current state-of-the-art interlaminar capabilities were explored using the virtual crack closure technique and the cohesive zone model. A user-defined cohesive element was implemented to discover the importance of traction-separation material constitutive behavior. A novel method for correlation of traction-separation parameters was created. This new damage modeling tool was used for evaluation of novel material systems to improve damage tolerance. Classical laminate plate theory was used in a full-factorial study of layerwise-hybrid laminates. Filament-wound laminated composite cylindrical shells were subjected to quasi-static loading to validate the finite element computational composite damage model. The new tool provides sufficient accuracy and generality for use on a wide range of problems.

  1. Seismic damages comparison of low-rise moderate reinforced concrete moment frames in the near- and far-field earthquakes by a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Dadashi, Reza; Nasserasadi, Kiarash

    2015-06-01

    Buildings and other structures experience more damage in near-field earthquakes owing to the high-period pulses present in near-field records. These pulses, however, do not appear in all near-field records. Therefore, to evaluate the effect of near-field earthquakes on structures realistically, a probabilistic approach is used to evaluate the probability of different damage states under near- and far-field earthquakes. In this method, the damage to the structure is evaluated by estimating its fragility function through numerous non-linear dynamic analyses under different ground motion records. To compare the effects of near-field and far-field earthquakes on low-rise moderate reinforced concrete moment frames, two- and three-story concrete frames were selected and designed according to the Iranian code. The fragility functions of the frames were estimated for near- and far-field earthquakes. For the near-field earthquakes, a mixture of pulse-like and non-pulse-like records was considered. The results showed no meaningful difference between the probabilities of failure under near- and far-field records. It can therefore be concluded that although a near-field earthquake may cause severe damage to structures because of the pulses in some records, from a probabilistic point of view, considering all near-field records, this effect is not significant.
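
    The fragility-function step can be sketched as a small maximum-likelihood fit. This assumes a lognormal fragility curve and uses synthetic analysis outcomes with a coarse grid search; the `norm_cdf` and `fit_fragility` helpers and all data are hypothetical, not the paper's frames, records, or intensity measure.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fit_fragility(ims, damaged, thetas, betas):
    """Grid-search maximum-likelihood fit of a lognormal fragility curve,
    P(damage | IM) = Phi(ln(IM / theta) / beta), to binary outcomes of
    non-linear dynamic analyses (illustration only)."""
    best_ll, best = -math.inf, None
    for theta in thetas:
        for beta in betas:
            ll = 0.0
            for im, hit in zip(ims, damaged):
                p = norm_cdf(math.log(im / theta) / beta)
                p = min(max(p, 1e-9), 1.0 - 1e-9)  # guard against log(0)
                ll += math.log(p) if hit else math.log(1.0 - p)
            if ll > best_ll:
                best_ll, best = ll, (theta, beta)
    return best

# Synthetic (intensity measure, damage-state exceeded?) pairs, IM in g.
ims = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0]
damaged = [0, 0, 0, 1, 0, 1, 1, 1]
theta, beta = fit_fragility(ims, damaged,
                            [0.25 + 0.05 * i for i in range(12)],
                            [0.2 + 0.1 * i for i in range(7)])
```

    Evaluating the fitted curve at a given intensity then yields the damage-state probability compared between the near- and far-field record sets.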

  2. DNA lesion identity drives choice of damage tolerance pathway in murine cell chromosomes

    PubMed Central

    Cohen, Isadora S.; Bar, Carmit; Paz-Elizur, Tamar; Ainbinder, Elena; Leopold, Karoline; de Wind, Niels; Geacintov, Nicholas; Livneh, Zvi

    2015-01-01

    DNA-damage tolerance (DDT) via translesion DNA synthesis (TLS) or homology-dependent repair (HDR) functions to bypass DNA lesions encountered during replication, and is critical for maintaining genome stability. Here, we present piggyBlock, a new chromosomal assay that, using piggyBac transposition of DNA containing a known lesion, measures the division of labor between the two DDT pathways. We show that in the absence of DNA damage response, tolerance of the most common sunlight-induced DNA lesion, TT-CPD, is achieved by TLS in mouse embryo fibroblasts. Meanwhile, BP-G, a major smoke-induced DNA lesion, is bypassed primarily by HDR, providing the first evidence for this mechanism being the main tolerance pathway for a biologically important lesion in a mammalian genome. We also show that, far from being a last-resort strategy as it is sometimes portrayed, TLS operates alongside nucleotide excision repair, handling 40% of TT-CPDs in repair-proficient cells. Finally, DDT acts in mouse embryonic stem cells, exhibiting the same pattern—mutagenic TLS included—despite the risk of propagating mutations along all cell lineages. The new method highlights the importance of HDR, and provides an effective tool for studying DDT in mammalian cells. PMID:25589543

  3. Damage Tolerance Testing of a NASA TransHab Derivative Woven Inflatable Module

    NASA Technical Reports Server (NTRS)

    Edgecombe, John; delaFuente, Horacio; Valle, Gerald D.

    2008-01-01

    Current options for Lunar habitat architecture include inflatable habitats and airlocks. Inflatable structures can have mass and volume advantages over conventional structures. Inflatable structures are perceived to carry additional risk because they are at a lower Technical Readiness Level (TRL) than conventional metallic structures. One of the risks associated with inflatable structures is understanding the tolerance to component damage and the resulting behavior of the system after the damage is introduced. The Damage Tolerance Test (DTT) is designed to study the structural integrity of an expandable structure during and subsequent to induced damage. TransHab was an experimental inflatable module developed at Johnson Space Center in the 1990s. The TransHab design was originally envisioned for use in Mars transits but was also studied as a potential habitat for the International Space Station (ISS). The design of the TransHab module was based on a woven design using an aramid fabric. Testing of this design demonstrated a high level of predictability and repeatability and good correlation with analytical predictions of stresses and deflections. Based on JSC's experience with the design and analysis of woven inflatable structures, the Damage Tolerance Test article was designed and fabricated using a woven design. The Damage Tolerance Test article consists of a load-bearing restraint layer, a bladder or gas barrier, and a structural metallic core. The test article restraint layer is fabricated from one-inch-wide Kevlar webbing that is woven in a basket weave pattern. Underneath the structural restraint layer is the bladder or gas barrier. For this test the bladder was required to maintain pressure for testing only and was not representative of a flight design. The bladder and structural restraint layer attach to the structural core of the module at steel bulkheads at each end.
The two bulkheads are separated by a 10 foot center tube which provides

  4. Reduction of female copulatory damage by resilin represents evidence for tolerance in sexual conflict

    PubMed Central

    Michels, Jan; Gorb, Stanislav N.; Reinhardt, Klaus

    2015-01-01

    Intergenomic evolutionary conflicts increase biological diversity. In sexual conflict, female defence against males is generally assumed to be resistance, which, however, often leads to trait exaggeration but not diversification. Here, we address whether tolerance, a female defence mechanism known from interspecific conflicts, exists in sexual conflict. We examined the traumatic insemination of female bed bugs via cuticle penetration by males, a textbook example of sexual conflict. Confocal laser scanning microscopy revealed large proportions of the soft and elastic protein resilin in the cuticle of the spermalege, the female defence organ. Reduced tissue damage and haemolymph loss were identified as adaptive female benefits from resilin. These did not arise from resistance because microindentation showed that the penetration force necessary to breach the cuticle was significantly lower at the resilin-rich spermalege than at other cuticle sites. Furthermore, a male survival analysis indicated that the spermalege did not impose antagonistic selection on males. Our findings suggest that the specific spermalege material composition evolved to tolerate the traumatic cuticle penetration. They demonstrate the importance of tolerance in sexual conflict and genitalia evolution, extend fundamental coevolution and speciation models and contribute to explaining the evolution of complexity. We propose that tolerance can drive trait diversity. PMID:25673297

  5. Hydrogen sulfide induces oxidative damage to RNA and DNA in a sulfide-tolerant marine invertebrate.

    PubMed

    Joyner-Matos, Joanna; Predmore, Benjamin L; Stein, Jenny R; Leeuwenburgh, Christiaan; Julian, David

    2010-01-01

    Hydrogen sulfide acts as an environmental toxin across a range of concentrations and as a cellular signaling molecule at very low concentrations. Despite its toxicity, many animals, including the mudflat polychaete Glycera dibranchiata, are periodically or continuously exposed to sulfide in their environment. We tested the hypothesis that a broad range of ecologically relevant sulfide concentrations induces oxidative stress and oxidative damage to RNA and DNA in G. dibranchiata. Coelomocytes exposed in vitro to sulfide (0-3 mmol L(-1) for 1 h) showed dose-dependent increases in oxidative stress (as 2',7'-dichlorofluorescein fluorescence) and superoxide production (as dihydroethidine fluorescence). Coelomocytes exposed in vitro to sulfide (up to 0.73 mmol L(-1) for 2 h) also acquired increased oxidative damage to RNA (detected as 8-oxo-7,8-dihydroguanosine) and DNA (detected as 8-oxo-7,8-dihydro-2'-deoxyguanosine). Worms exposed in vivo to sulfide (0-10 mmol L(-1) for 24 h) acquired elevated oxidative damage to RNA and DNA in both coelomocytes and body wall tissue. While the consequences of RNA and DNA oxidative damage are poorly understood, oxidatively damaged deoxyguanosine bases preferentially bind thymine, causing G-T transversions and potentially causing heritable point mutations. This suggests that sulfide can be an environmental mutagen in sulfide-tolerant invertebrates.

  6. Damage-Tolerance Characteristics of Composite Fuselage Sandwich Structures with Thick Facesheets

    NASA Technical Reports Server (NTRS)

    McGowan, David M.; Ambur, Damodar R.

    1997-01-01

    Damage tolerance characteristics and results from experimental and analytical studies of a composite fuselage keel sandwich structure subjected to low-speed impact damage and discrete-source damage are presented. The test specimens are constructed from graphite-epoxy skins bonded to a honeycomb core, and they are representative of a highly loaded fuselage keel structure. Results of compression-after-impact (CAI) and notch-length sensitivity studies of 5-in.-wide by 10-in.-long specimens are presented. A correlation between low-speed-impact dent depth, the associated damage area, and residual strength for different impact-energy levels is described; and a comparison of the strength for undamaged and damaged specimens with different notch-length-to-specimen-width ratios is presented. Surface strains in the facesheets of the undamaged specimens as well as surface strains that illustrate the load redistribution around the notch sites in the notched specimens are presented and compared with results from finite element analyses. Reductions in strength of as much as 53.1 percent for the impacted specimens and 64.7 percent for the notched specimens are observed.

  7. Unprecedented simultaneous enhancement in damage tolerance and fatigue resistance of zirconia/Ta composites.

    PubMed

    Smirnov, A; Beltrán, J I; Rodriguez-Suarez, T; Pecharromán, C; Muñoz, M C; Moya, J S; Bartolomé, J F

    2017-03-21

    Dense (>98% of theoretical density) and homogeneous ceramic/metal composites were obtained by spark plasma sintering (SPS) using ZrO2 and lamellar metallic powders of tantalum or niobium (20 vol.%) as starting materials. The present study has demonstrated a unique and unpredicted simultaneous enhancement in toughness and strength, with very high flaw tolerance, in zirconia/Ta composites. In addition to their excellent static mechanical properties, these composites also have exceptional resistance to fatigue loading. It has been shown that the major contributions to toughening are crack bridging and plastic deformation of the metallic particles, together with crack deflection and interfacial debonding, which is compatible with the coexistence in the composite of both strong and weak ceramic/metal interfaces, in agreement with predictions of ab-initio calculations. Therefore, these materials are promising candidates for designing damage-tolerant components for the aerospace industry, cutting and drilling tools, and biomedical implants, among many others.

  8. Unprecedented simultaneous enhancement in damage tolerance and fatigue resistance of zirconia/Ta composites

    PubMed Central

    Smirnov, A.; Beltrán, J. I.; Rodriguez-Suarez, T.; Pecharromán, C.; Muñoz, M. C.; Moya, J. S.; Bartolomé, J. F.

    2017-01-01

    Dense (>98% of theoretical density) and homogeneous ceramic/metal composites were obtained by spark plasma sintering (SPS) using ZrO2 and lamellar metallic powders of tantalum or niobium (20 vol.%) as starting materials. The present study has demonstrated a unique and unpredicted simultaneous enhancement in toughness and strength, with very high flaw tolerance, in zirconia/Ta composites. In addition to their excellent static mechanical properties, these composites also have exceptional resistance to fatigue loading. It has been shown that the major contributions to toughening are crack bridging and plastic deformation of the metallic particles, together with crack deflection and interfacial debonding, which is compatible with the coexistence in the composite of both strong and weak ceramic/metal interfaces, in agreement with predictions of ab-initio calculations. Therefore, these materials are promising candidates for designing damage-tolerant components for the aerospace industry, cutting and drilling tools, and biomedical implants, among many others. PMID:28322343

  9. Unprecedented simultaneous enhancement in damage tolerance and fatigue resistance of zirconia/Ta composites

    NASA Astrophysics Data System (ADS)

    Smirnov, A.; Beltrán, J. I.; Rodriguez-Suarez, T.; Pecharromán, C.; Muñoz, M. C.; Moya, J. S.; Bartolomé, J. F.

    2017-03-01

    Dense (>98% of theoretical density) and homogeneous ceramic/metal composites were obtained by spark plasma sintering (SPS) using ZrO2 and lamellar metallic powders of tantalum or niobium (20 vol.%) as starting materials. The present study has demonstrated a unique and unpredicted simultaneous enhancement in toughness and strength, with very high flaw tolerance, in zirconia/Ta composites. In addition to their excellent static mechanical properties, these composites also have exceptional resistance to fatigue loading. It has been shown that the major contributions to toughening are crack bridging and plastic deformation of the metallic particles, together with crack deflection and interfacial debonding, which is compatible with the coexistence in the composite of both strong and weak ceramic/metal interfaces, in agreement with predictions of ab-initio calculations. Therefore, these materials are promising candidates for designing damage-tolerant components for the aerospace industry, cutting and drilling tools, and biomedical implants, among many others.

  10. Development of pressure containment and damage tolerance technology for composite fuselage structures in large transport aircraft

    NASA Technical Reports Server (NTRS)

    Smith, P. J.; Thomson, L. W.; Wilson, R. D.

    1986-01-01

    NASA sponsored composites research and development programs were set in place to develop the critical engineering technologies in large transport aircraft structures. This NASA-Boeing program focused on the critical issues of damage tolerance and pressure containment generic to the fuselage structure of large pressurized aircraft. Skin-stringer and honeycomb sandwich composite fuselage shell designs were evaluated to resolve these issues. Analyses were developed to model the structural response of the fuselage shell designs, and a development test program evaluated the selected design configurations to appropriate load conditions.

  11. Dehydration rate determines the degree of membrane damage and desiccation tolerance in bryophytes.

    PubMed

    Cruz de Carvalho, Ricardo; Catalá, Myriam; Branquinho, Cristina; Marques da Silva, Jorge; Barreno, Eva

    2017-03-01

    Desiccation tolerant (DT) organisms are able to withstand an extended loss of body water and rapidly resume metabolism upon rehydration. This ability, however, is strongly dependent on a slow dehydration rate. Fast dehydration affects membrane integrity, leading to intracellular solute leakage upon rehydration and thereby impairing metabolism recovery. We tested the hypothesis that the increased cell membrane damage and membrane permeability observed under fast dehydration, compared with slow dehydration, are related to an increase in lipid peroxidation. Our results reject this hypothesis: following rehydration, lipid peroxidation remains unaltered, a fact that could be due to the large increase in NO upon rehydration. However, in fast-dried samples we found a strong red autofluorescence signal upon rehydration, which correlates with an increase in ROS production and with membrane leakage, particularly of phenolics. This could be used as a bioindicator of oxidative stress and membrane damage.

  12. Effect of Buckling Modes on the Fatigue Life and Damage Tolerance of Stiffened Structures

    NASA Technical Reports Server (NTRS)

    Davila, Carlos G.; Bisagni, Chiara; Rose, Cheryl A.

    2015-01-01

    The postbuckling response and the collapse of composite specimens with a co-cured hat stringer are investigated experimentally and numerically. These specimens are designed to evaluate the postbuckling response and the effect of an embedded defect on the collapse load and the mode of failure. Tests performed using controlled conditions and detailed instrumentation demonstrate that the damage tolerance, fatigue life, and collapse loads are closely tied with the mode of the postbuckling deformation, which can be different between two nominally identical specimens. Modes that tend to open skin/stringer defects are the most damaging to the structure. However, skin/stringer bond defects can also propagate under shearing modes. In the proposed paper, the effects of initial shape imperfections on the postbuckling modes and the interaction between different postbuckling deformations and the propagation of skin/stringer bond defects under quasi-static or fatigue loads will be examined.

  13. Probabilistic analysis of cascade failure dynamics in complex network

    NASA Astrophysics Data System (ADS)

    Zhang, Ding-Xue; Zhao, Dan; Guan, Zhi-Hong; Wu, Yonghong; Chi, Ming; Zheng, Gui-Lin

    2016-11-01

    The impact of the initial load and tolerance-parameter distributions on cascade failure is investigated. Using mean field theory, a probabilistic cascade failure model is established. Based on the model, the damage caused by an attack of a given size can be predicted, and the critical attack size that guarantees no collapse is derived from the condition for the cascade to terminate. For networks with randomly distributed tolerance parameters, the critical attack size is larger than in the constant-tolerance case. Comparing three typical distributions, simulation results indicate that a network whose initial load and tolerance parameter both follow a Weibull distribution performs better than the others.
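
    The flavor of such a model can be conveyed with a toy global-redistribution cascade, a minimal sketch inspired by the abstract rather than the authors' exact formulation. Every node starts with unit load, its tolerance parameter is drawn from a Weibull distribution, and the `surviving_fraction` helper and all parameter values are assumptions for illustration.

```python
import random

def surviving_fraction(n, attack_frac, weibull_scale, weibull_shape, seed=0):
    """Toy cascade-failure simulation. Each node carries unit load and has
    capacity 1 + alpha, where the tolerance parameter alpha is Weibull
    distributed. An attack removes a fraction of nodes; the total load is
    then shared evenly among survivors, any overloaded node fails, and the
    process repeats until the cascade ends (or the network collapses)."""
    rng = random.Random(seed)
    capacities = [1.0 + rng.weibullvariate(weibull_scale, weibull_shape)
                  for _ in range(n)]
    # Capacities are i.i.d., so dropping the first k entries is a random attack.
    alive = capacities[int(attack_frac * n):]
    while alive:
        load = n / len(alive)          # total unit load, evenly redistributed
        survivors = [c for c in alive if c >= load]
        if len(survivors) == len(alive):
            break                      # fixed point: the cascade has ended
        alive = survivors
    return len(alive) / n

small = surviving_fraction(1000, 0.10, 1.0, 2.0)  # mild attack
large = surviving_fraction(1000, 0.90, 1.0, 2.0)  # massive attack: collapse
```

    Sweeping `attack_frac` upward until the surviving fraction drops to zero gives a numerical estimate of the critical attack size for a given tolerance distribution.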

  14. Long-term hygrothermal effects on damage tolerance of hybrid composite sandwich panels

    NASA Technical Reports Server (NTRS)

    Ishai, Ori; Hiel, Clement; Luft, Michael

    1995-01-01

    A sandwich construction, composed of hybrid carbon-glass fiber-reinforced plastic skins and a syntactic foam core, was selected as the design concept for a wind tunnel compressor blade application, where high damage tolerance and durability are of major importance. Beam specimens were prepared from open-edge and encapsulated sandwich panels which had previously been immersed in water at different temperatures for periods of up to about two years in the extreme case. Moisture absorption and strength characteristics, as related to time of exposure to hygrothermal conditions, were evaluated for the sandwich specimens and their constituents (skins and foam). After different exposure periods, low-velocity impact damage was inflicted on most sandwich specimens and damage characteristics were related to impact energy. Eventually, the residual compressive strengths of the damaged (and undamaged) beams were determined flexurally. Test results show that exposure to hygrothermal conditions leads to significant strength reductions for foam specimens and open-edge sandwich panels, compared with reference specimens stored at room temperature. In the case of skin specimens and beams prepared from encapsulated sandwich panels that had previously been exposed to hygrothermal conditions, moisture absorption was found to improve strength relative to the reference case. The beneficial effect of moisture on skin performance was, however, limited to moisture contents below 1% (at 50 C and lower temperatures). Above this moisture level and at higher temperatures, strength degradation of the skin seems to prevail.

  15. Ultra-strong and damage tolerant metallic bulk materials: A lesson from nanostructured pearlitic steel wires

    PubMed Central

    Hohenwarter, A.; Völker, B.; Kapp, M. W.; Li, Y.; Goto, S.; Raabe, D.; Pippan, R.

    2016-01-01

    Structural materials used for safety critical applications require high strength and simultaneously high resistance against crack growth, referred to as damage tolerance. However, the two properties typically exclude each other, and research efforts towards ever stronger materials are hampered by a drastic loss of fracture resistance. Therefore, future development of novel ultra-strong bulk materials requires a fundamental understanding of the toughness-determining mechanisms. As a model material we used today’s strongest metallic bulk material, a nanostructured pearlitic steel wire, and measured the fracture toughness on micron-sized specimens in different crack growth directions, finding an unexpectedly strong anisotropy in the fracture resistance. Along the wire axis the material reveals ultra-high strength combined with so far unprecedented damage tolerance. We attribute this excellent property combination to the anisotropy in the fracture toughness, which induces a high propensity for micro-crack formation parallel to the wire axis. This effect causes a local crack tip stress relaxation and enables the high fracture toughness without being detrimental to the material’s strength. PMID:27624220

  16. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials.

    PubMed

    Montemayor, L C; Wong, W H; Zhang, Y-W; Greer, J R

    2016-02-03

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance the structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws, even when made entirely of intrinsically brittle materials.

  17. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials

    PubMed Central

    Montemayor, L. C.; Wong, W. H.; Zhang, Y.-W.; Greer, J. R.

    2016-01-01

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance the structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws, even when made entirely of intrinsically brittle materials. PMID:26837581

  18. Ultra-strong and damage tolerant metallic bulk materials: A lesson from nanostructured pearlitic steel wires

    NASA Astrophysics Data System (ADS)

    Hohenwarter, A.; Völker, B.; Kapp, M. W.; Li, Y.; Goto, S.; Raabe, D.; Pippan, R.

    2016-09-01

    Structural materials used for safety critical applications require high strength and simultaneously high resistance against crack growth, referred to as damage tolerance. However, the two properties typically exclude each other, and research efforts towards ever stronger materials are hampered by a drastic loss of fracture resistance. Therefore, future development of novel ultra-strong bulk materials requires a fundamental understanding of the toughness-determining mechanisms. As a model material we used today’s strongest metallic bulk material, a nanostructured pearlitic steel wire, and measured the fracture toughness on micron-sized specimens in different crack growth directions, finding an unexpectedly strong anisotropy in the fracture resistance. Along the wire axis the material reveals ultra-high strength combined with so far unprecedented damage tolerance. We attribute this excellent property combination to the anisotropy in the fracture toughness, which induces a high propensity for micro-crack formation parallel to the wire axis. This effect causes a local crack tip stress relaxation and enables the high fracture toughness without being detrimental to the material’s strength.

  19. Ultra-strong and damage tolerant metallic bulk materials: A lesson from nanostructured pearlitic steel wires.

    PubMed

    Hohenwarter, A; Völker, B; Kapp, M W; Li, Y; Goto, S; Raabe, D; Pippan, R

    2016-09-14

    Structural materials used for safety critical applications require high strength and simultaneously high resistance against crack growth, referred to as damage tolerance. However, the two properties typically exclude each other, and research efforts towards ever stronger materials are hampered by a drastic loss of fracture resistance. Therefore, future development of novel ultra-strong bulk materials requires a fundamental understanding of the toughness-determining mechanisms. As a model material we used today's strongest metallic bulk material, a nanostructured pearlitic steel wire, and measured the fracture toughness on micron-sized specimens in different crack growth directions, finding an unexpectedly strong anisotropy in the fracture resistance. Along the wire axis the material reveals ultra-high strength combined with so far unprecedented damage tolerance. We attribute this excellent property combination to the anisotropy in the fracture toughness, which induces a high propensity for micro-crack formation parallel to the wire axis. This effect causes a local crack tip stress relaxation and enables the high fracture toughness without being detrimental to the material's strength.

  20. Damage tolerance assessment of bonded composite doubler repairs for commercial aircraft applications

    SciTech Connect

    Roach, D.

    1998-08-01

    The Federal Aviation Administration has sponsored a project at its Airworthiness Assurance NDI Validation Center (AANC) to validate the use of bonded composite doublers on commercial aircraft. A specific application was chosen in order to provide a proof-of-concept driving force behind this test and analysis project. However, the data stemming from this study serves as a comprehensive evaluation of bonded composite doublers for general use. The associated documentation package provides guidance regarding the design, analysis, installation, damage tolerance, and nondestructive inspection of these doublers. This report describes a series of fatigue and strength tests which were conducted to study the damage tolerance of Boron-Epoxy composite doublers. Tension-tension fatigue and ultimate strength tests attempted to grow engineered flaws in coupons with composite doublers bonded to aluminum skin. An array of design parameters, including various flaw scenarios, the effects of surface impact, and other off-design conditions, were studied. The structural tests were used to: (1) assess the potential for interply delaminations and disbonds between the aluminum and the laminate, and (2) determine the load transfer and crack mitigation capabilities of composite doublers in the presence of severe defects. A series of specimens were subjected to ultimate tension tests in order to determine strength values and failure modes. It was demonstrated that even in the presence of extensive damage in the original structure (cracks, material loss) and in spite of non-optimum installations (adhesive disbonds), the composite doubler allowed the structure to survive more than 144,000 cycles of fatigue loading. Installation flaws in the composite laminate did not propagate over 216,000 fatigue cycles. Furthermore, the added impediments of impact--severe enough to deform the parent aluminum skin--and hot-wet exposure did not affect the doubler's performance. 

  1. DNA damage, apoptosis and Langerhans cells--Activators of UV-induced immune tolerance.

    PubMed

    Timares, Laura; Katiyar, Santosh K; Elmets, Craig A

    2008-01-01

    Solar UVR is highly mutagenic but is only partially absorbed by the outer stratum corneum of the epidermis. UVR can penetrate into the deeper layers of the epidermis, depending on melanin content, where it induces DNA damage and apoptosis in epidermal cells, including those in the germinative basal layer. The cellular decision to initiate either cellular repair or undergo apoptosis has evolved to balance the acute need to maintain skin barrier function with the long-term risk of retaining precancerous cells. Langerhans cells (LCs) are positioned suprabasally, where they may sense UV damage directly, or indirectly through recognition of apoptotic vesicles and soluble mediators derived from surrounding keratinocytes. Apoptotic vesicles will contain UV-induced altered proteins that may be presented to the immune system as foreign. The observation that UVR induces immune tolerance to skin-associated antigens suggests that this photodamage response has evolved to preserve the skin barrier by protecting it from autoimmune attack. LC involvement in this process remains unclear and controversial. We will highlight some basic concepts of photobiology and review recent advances pertaining to UV-induced DNA damage, apoptosis regulation, novel immunomodulatory mechanisms and the role of LCs in generating antigen-specific regulatory T cells.

  2. Static and damage tolerance tests of an advanced composite vertical fin for L-1011 aircraft

    NASA Technical Reports Server (NTRS)

    Dorward, F.; Ketola, R. N.

    1983-01-01

    This paper recounts the significant events which took place during the structural verification testing of two graphite/epoxy material, full-size vertical stabilizers. The ground test articles were tested to a high bending dynamic lateral gust condition. The first unit failed during static testing at 98 percent Design Ultimate Load. Failure began within the front spar cap. A detailed review of the failure was performed to identify all possible modes. This review resulted in a 'production line' type fix being designed for incorporation in the second ground test article prior to installation in the test fixture. The modified second unit sustained 106 percent of Design Ultimate Load without incident. One lifetime (36,000 flights) of damage tolerance testing was accomplished with the specimen purposely damaged at five locations. A fail-safe loading was performed successfully after simulating lightning strike damage to the fin box cover. A large area repair was substantiated by completing a second lifetime of spectrum loadings. The residual static strength was determined to be 119.7 percent of Design Ultimate Load.

  3. Role of interfaces in the design of ultra-high strength, radiation damage tolerant nanocomposites

    SciTech Connect

    Misra, Amit; Wang, Yongqiang; Nastasi, Michael A; Baldwin, Jon K; Wei, Qiangmin; Li, Nan; Mara, Nathan; Zhang, Xinghang; Fu, Engang; Anderoglu, Osman; Li, Hongqi; Bhattacharyya, Dhriti

    2010-12-09

    The combination of high strength and high radiation damage tolerance in nanolaminate composites can be achieved when the individual layers in these composites are only a few nanometers thick and contain special interfaces that act both as obstacles to slip, as well as sinks for radiation-induced defects. The morphological and phase stabilities and strength and ductility of these nano-composites under ion irradiation are explored as a function of layer thickness, temperature and interface structure. Magnetron sputtered metallic multilayers such as Cu-Nb and V-Ag with a range of individual layer thickness from approximately 2 nm to 50 nm and the corresponding 1000 nm thick single layer films were implanted with helium ions at room temperature. Cross-sectional Transmission Electron Microscopy (TEM) was used to measure the distribution of helium bubbles and correlated with the helium concentration profile measured via ion beam analysis techniques to obtain the helium concentration at which bubbles are detected in TEM. It was found that in multilayers the minimum helium concentration to form bubbles (approximately 1 nm in size) that are easily resolved in through-focus TEM imaging was several atomic %, orders of magnitude higher than that in single layer metal films. This observation is consistent with an increased solubility of helium at interfaces that is predicted by atomistic modeling of the atomic structures of fcc-bcc interfaces. At helium concentrations as high as 7 at.%, a uniform distribution of 1 nm diameter bubbles results in negligible irradiation hardening and loss of deformability in multilayers with layer thicknesses of a few nanometers. The control of atomic structures of interfaces to produce high helium solubility at interfaces is crucial in the design of nano-composite materials that are radiation damage tolerant. Reduced radiation damage also leads to a reduction in the irradiation hardening, particularly at layer thickness of approximately 5 nm

  4. The Stomatopod Dactyl Club: A Formidable Damage-Tolerant Biological Hammer

    SciTech Connect

    Weaver J. C.; DiMasi E.; Milliron, G.W.; Miserez, A.; Evans-Lutterodt, K.; Herrera, S.; Gallana, I.; Mershon, W.J.; Swanson, B.; Zavattieri, P.; Kisailus, D.

    2012-06-08

    Nature has evolved efficient strategies to synthesize complex mineralized structures that exhibit exceptional damage tolerance. One such example is found in the hypermineralized hammer-like dactyl clubs of the stomatopods, a group of highly aggressive marine crustaceans. The dactyl clubs from one species, Odontodactylus scyllarus, exhibit an impressive set of characteristics adapted for surviving high-velocity impacts on the heavily mineralized prey on which they feed. Consisting of a multiphase composite of oriented crystalline hydroxyapatite and amorphous calcium phosphate and carbonate, in conjunction with a highly expanded helicoidal organization of the fibrillar chitinous organic matrix, these structures display several effective lines of defense against catastrophic failure during repetitive high-energy loading events.

  5. Damage Tolerance Analysis of Space Shuttle External Tank Lug Fillet Welds Using NASGRO

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.

    2006-01-01

    The damage tolerance of the External Tank (ET) lug welds was reassessed because of an increase in the loads due to the removal of the protuberance air load (PAL) ramp. The analysis methods included detailed finite element analysis (FEA) of the ET welded lugs and FEA of the lug weld test configuration. The FEA results were used as input to the crack growth analysis code NASGRO to calculate the mission life capability of the ET lug welds and to predict the number of cycles to failure in the lug weld testing. The presentation describes the method of transferring the FEM results to the NASGRO model and gives correlations between FEM and NASGRO stress intensity calculations.
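    The transfer described in this abstract amounts to feeding FEA-derived stresses into a stress intensity solution. As a hedged, generic sketch (not NASGRO's actual solution library; the geometry factor and values below are illustrative assumptions), the basic Mode-I relation K = F·σ·√(πa) can be computed as:

    ```python
    import math

    def stress_intensity(sigma_mpa: float, a_m: float, F: float = 1.12) -> float:
        """Mode-I stress intensity factor K = F * sigma * sqrt(pi * a).

        sigma_mpa -- far-field stress taken from the FEA results (MPa)
        a_m       -- crack depth (m)
        F         -- geometry correction factor (1.12 is the classic
                     edge-crack value; codes like NASGRO use tabulated
                     solutions for each specific crack configuration)
        Returns K in MPa*sqrt(m).
        """
        return F * sigma_mpa * math.sqrt(math.pi * a_m)

    # Illustrative values: 150 MPa weld stress, 2 mm deep crack
    K = stress_intensity(150.0, 0.002)
    ```

    A crack growth code then compares K (or its range over a load cycle) against material da/dN data to march the crack forward cycle by cycle.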

  6. The Stomatopod Dactyl Club: A Formidable Damage-Tolerant Biological Hammer

    NASA Astrophysics Data System (ADS)

    Weaver, James C.; Milliron, Garrett W.; Miserez, Ali; Evans-Lutterodt, Kenneth; Herrera, Steven; Gallana, Isaias; Mershon, William J.; Swanson, Brook; Zavattieri, Pablo; DiMasi, Elaine; Kisailus, David

    2012-06-01

    Nature has evolved efficient strategies to synthesize complex mineralized structures that exhibit exceptional damage tolerance. One such example is found in the hypermineralized hammer-like dactyl clubs of the stomatopods, a group of highly aggressive marine crustaceans. The dactyl clubs from one species, Odontodactylus scyllarus, exhibit an impressive set of characteristics adapted for surviving high-velocity impacts on the heavily mineralized prey on which they feed. Consisting of a multiphase composite of oriented crystalline hydroxyapatite and amorphous calcium phosphate and carbonate, in conjunction with a highly expanded helicoidal organization of the fibrillar chitinous organic matrix, these structures display several effective lines of defense against catastrophic failure during repetitive high-energy loading events.

  7. Hierarchical flexural strength of enamel: transition from brittle to damage-tolerant behaviour.

    PubMed

    Bechtle, Sabine; Özcoban, Hüseyin; Lilleodden, Erica T; Huber, Norbert; Schreyer, Andreas; Swain, Michael V; Schneider, Gerold A

    2012-06-07

    Hard, biological materials are generally hierarchically structured from the nano- to the macro-scale in a somewhat self-similar manner consisting of mineral units surrounded by a soft protein shell. Considerable efforts are underway to mimic such materials because of their structurally optimized mechanical functionality of being hard and stiff as well as damage-tolerant. However, it is unclear how different hierarchical levels interact to achieve this performance. In this study, we consider dental enamel as a representative, biological hierarchical structure and determine its flexural strength and elastic modulus at three levels of hierarchy using focused ion beam (FIB) prepared cantilevers of micrometre size. The results are compared and analysed using a theoretical model proposed by Jäger and Fratzl and developed by Gao and co-workers. Both properties decrease with increasing hierarchical dimension along with a switch in mechanical behaviour from linear-elastic to elastic-inelastic. We found Gao's model matched the results very well.

  8. Plasticity and ductility in graphene oxide through a mechanochemically induced damage tolerance mechanism

    PubMed Central

    Wei, Xiaoding; Mao, Lily; Soler-Crespo, Rafael A.; Paci, Jeffrey T.; Espinosa, Horacio D.

    2015-01-01

    The ability to bias chemical reaction pathways is a fundamental goal for chemists and material scientists to produce innovative materials. Recently, two-dimensional materials have emerged as potential platforms for exploring novel mechanically activated chemical reactions. Here we report a mechanochemical phenomenon in graphene oxide membranes, covalent epoxide-to-ether functional group transformations that deviate from epoxide ring-opening reactions, discovered through nanomechanical experiments and density functional-based tight binding calculations. These mechanochemical transformations in a two-dimensional system are directionally dependent, and confer pronounced plasticity and damage tolerance to graphene oxide monolayers. Additional experiments on chemically modified graphene oxide membranes, with ring-opened epoxide groups, verify this unique deformation mechanism. These studies establish graphene oxide as a two-dimensional building block with highly tuneable mechanical properties for the design of high-performance nanocomposites, and stimulate the discovery of new bond-selective chemical transformations in two-dimensional materials. PMID:26289729

  9. Fatigue life estimation procedures for the endurance of a cardiac valve prosthesis: stress/life and damage-tolerant analyses.

    PubMed

    Ritchie, R O; Lubock, P

    1986-05-01

    Projected fatigue life analyses are performed to estimate the endurance of a cardiac valve prosthesis under physiological environmental and mechanical conditions. The analyses are conducted using both the classical stress-strain/life and the fracture mechanics-based damage-tolerant approaches, and provide estimates of expected life in terms of initial flaw sizes which may pre-exist in the metal prior to the valve entering service. The damage-tolerant analysis is further supplemented by consideration of the question of "short cracks," which represents a developing area in metal fatigue research not commonly applied in standard engineering design practice.
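    The two approaches contrasted in this abstract can be sketched in simplified form: a Basquin stress-life law for the stress/life estimate, and a closed-form Paris-law integration for the damage-tolerant estimate. All coefficients below are illustrative placeholders, not the paper's valve-alloy data:

    ```python
    import math

    def stress_life_cycles(sigma_a: float, sigma_f: float = 900.0, b: float = -0.1) -> float:
        """Stress/life (S-N) estimate via Basquin's law:
        sigma_a = sigma_f * (2N)^b, solved for N (cycles to failure)."""
        return 0.5 * (sigma_a / sigma_f) ** (1.0 / b)

    def damage_tolerant_cycles(a0: float, ac: float, dsigma: float,
                               C: float = 1e-11, m: float = 3.0, F: float = 1.0) -> float:
        """Damage-tolerant estimate: integrate the Paris law
        da/dN = C * (dK)^m with dK = F * dsigma * sqrt(pi * a)
        in closed form from initial flaw a0 to critical size ac (m != 2)."""
        k = C * (F * dsigma * math.sqrt(math.pi)) ** m
        p = 1.0 - m / 2.0
        return (ac ** p - a0 ** p) / (k * p)
    ```

    Note that the damage-tolerant life depends strongly on the assumed initial flaw size a0, which is why such analyses express endurance "in terms of initial flaw sizes which may pre-exist in the metal."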

  10. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    DOE PAGES

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; ...

    2013-07-31

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.

  11. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    SciTech Connect

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; Wong, Tak-Sing; Aizenberg, Joanna

    2013-07-31

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.

  12. Optimal Design and Damage Tolerance Verification of an Isogrid Structure for Helicopter Application

    NASA Technical Reports Server (NTRS)

    Baker, Donald J.; Fudge, Jack; Ambur, Damodar R.; Kassapoglou, Christos

    2003-01-01

    A composite isogrid panel design for application to a rotorcraft fuselage is presented. An optimum panel design for the lower fuselage of the rotorcraft that is subjected to combined in-plane compression and shear loads was generated using a design tool that utilizes a smeared-stiffener theory in conjunction with a genetic algorithm. A design feature was introduced along the edges of the panel that facilitates introduction of loads into the isogrid panel without producing undesirable local bending gradients. A low-cost manufacturing method for the isogrid panel that incorporates these design details is also presented. Axial compression tests were conducted on the undamaged and low-speed impact damaged panels to demonstrate the damage tolerance of this isogrid panel. A combined loading test fixture was designed and utilized that allowed simultaneous application of compression and shear loads to the test specimen. Results from finite element analyses are presented for the isogrid panel designs and these results are compared with experimental results. This study illustrates the isogrid concept to be a viable candidate for application to the helicopter lower fuselage structure.

  13. Structurally Integrated, Damage Tolerant Thermal Spray Coatings: Processing Effects on Surface and System Functionalities

    NASA Astrophysics Data System (ADS)

    Vackel, Andrew

    Thermal Spray (TS) coatings have seen extensive application as protective surfaces to enhance the service life of substrates prone to damage in their operating environment (wear, corrosion, heat, etc.). With the advent of high velocity TS processes, the ability to deposit highly dense (>99%) metallic and cermet coatings has further enhanced the protective ability of these coatings. In addition to surface functionality, the influence of the coating application on the mechanical performance of a coated component is of great concern when such a component will experience either static or cyclic loading during service. Using a process mapping methodology, the processing-property interplay for coating materials meant to provide damage tolerant surfaces or structural restoration is explored in terms of relevant mechanical properties. Most importantly, the residual stresses inherent in TS deposited coatings are shown to play a significant role in the integrated mechanical performance of these coatings. Unique to high velocity TS processes is the ability to produce compressive stresses within the deposit from the cold working induced by the high kinetic energy particles upon impact. The extent of these formation stresses is explored with different coating materials, as well as processing influence. The ability of dense TS coatings to carry significant structural load and synergistically strengthen coated tensile specimens is demonstrated as a function of coating material, processing, and thickness. The sharing of load between the substrate and otherwise brittle coating enables higher loads before yield for the bi-material specimens, offering a methodology to improve the tensile performance of coated components for structural repair or multi-functionality (surface and structure). The concern of cyclic fatigue damage in coated components is explored, since the majority of service applications are designed for loading well below the yield point. The role of

  14. Damage tolerance and arrest characteristics of pressurized graphite/epoxy tape cylinders

    NASA Technical Reports Server (NTRS)

    Ranniger, Claudia U.; Lagace, Paul A.; Graves, Michael J.

    1993-01-01

    An investigation of the damage tolerance and damage arrest characteristics of internally-pressurized graphite/epoxy tape cylinders with axial notches was conducted. An existing failure prediction methodology, developed and verified for quasi-isotropic graphite/epoxy fabric cylinders, was investigated for applicability to general tape layups. In addition, the effect of external circumferential stiffening bands on the direction of fracture path propagation and possible damage arrest was examined. Quasi-isotropic (90/0/±45)s and structurally anisotropic (±45/0)s and (±45/90)s coupons and cylinders were constructed from AS4/3501-6 graphite/epoxy tape. Notched and unnotched coupons were tested in tension and the data correlated using the equation of Mar and Lin. Cylinders with through-thickness axial slits were pressurized to failure achieving a far-field two-to-one biaxial stress state. Experimental failure pressures of the (90/0/±45)s cylinders agreed with predicted values for all cases but the specimen with the smallest slit. However, the failure pressures of the structurally anisotropic cylinders, (±45/0)s and (±45/90)s, were above the values predicted utilizing the predictive methodology in all cases. Possible factors neglected by the predictive methodology include structural coupling in the laminates and axial loading of the cylindrical specimens. Furthermore, applicability of the predictive methodology depends on the similarity of initial fracture modes in the coupon specimens and the cylinder specimens of the same laminate type. The existence of splitting which may be exacerbated by the axial loading in the cylinders, shows that this condition is not always met. The circumferential stiffeners were generally able to redirect fracture propagation from longitudinal to circumferential. A quantitative assessment for stiffener effectiveness in containing the fracture, based on cylinder

  15. Damage tolerance based life prediction in gas turbine engine blades under vibratory high cycle fatigue

    SciTech Connect

    Walls, D.P.; deLaneuville, R.E.; Cunningham, S.E.

    1997-01-01

    A novel fracture mechanics approach has been used to predict crack propagation lives in gas turbine engine blades subjected to vibratory high cycle fatigue (HCF). The vibratory loading included both a resonant mode and a nonresonant mode, with one blade subjected to only the nonresonant mode and another blade to both modes. A life prediction algorithm was utilized to predict HCF propagation lives for each case. The life prediction system incorporates a boundary integral element (BIE) derived hybrid stress intensity solution, which accounts for the transition from a surface crack to corner crack to edge crack. It also includes a derivation of threshold crack length from threshold stress intensity factors to give crack size limits for no propagation. The stress intensity solution was calibrated for crack aspect ratios measured directly from the fracture surfaces. The model demonstrates the ability to correlate predicted missions to failure with values deduced from fractographic analysis. This analysis helps to validate the use of fracture mechanics approaches for assessing damage tolerance in gas turbine engine components subjected to combined steady and vibratory stresses.
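    The threshold crack length derivation mentioned in this abstract follows from inverting the stress intensity relation. A sketch, assuming the simple form ΔK = F·Δσ·√(πa) with illustrative inputs (not the engine-blade geometry factors or hybrid BIE solution of the paper):

    ```python
    import math

    def threshold_crack_length(dK_th: float, dsigma: float, F: float = 1.0) -> float:
        """Crack size limit for no propagation, from the threshold
        stress intensity range: dK_th = F * dsigma * sqrt(pi * a_th),
        solved for a_th (metres; dK_th in MPa*sqrt(m), dsigma in MPa)."""
        return (dK_th / (F * dsigma)) ** 2 / math.pi

    # Illustrative: dK_th = 5 MPa*sqrt(m) under a 200 MPa vibratory stress range
    a_th = threshold_crack_length(5.0, 200.0)  # ~0.2 mm
    ```

    Cracks smaller than a_th are predicted not to grow under the given vibratory stress range, which is the "crack size limit for no propagation" that such a life prediction system uses.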

  16. Evaluation of Damage Tolerance of Advanced SiC/SiC Composites after Neutron Irradiation

    NASA Astrophysics Data System (ADS)

    Ozawa, Kazumi; Katoh, Yutai; Nozawa, Takashi; Hinoki, Tatsuya; Snead, Lance L.

    2011-10-01

    Silicon carbide composites (SiC/SiC) are attractive candidate materials for structural and functional components in fusion energy systems. The effect of neutron irradiation on damage tolerance of the nuclear grade SiC/SiC composites (plain woven Hi-Nicalon™ Type-S reinforced CVI matrix composites with multilayer interphase and unidirectional Tyranno™-SA3 reinforced NITE matrix with carbon mono-layer interphase) was evaluated by means of miniaturized single-edged notched beam test. No significant changes in crack extension behavior and in the load-loadpoint displacement characteristics such as the peak load and hysteresis loop width were observed after irradiation to 5.9 × 1025 n/m2 (E > 0.1 MeV) at 800°C and to 5.8 × 1025 n/m2 at 1300°C. By applying a global energy balance analysis based on non-linear fracture mechanics, the energy release rate for these composite materials was found to be unchanged by irradiation with a value of 3±2 kJ/m2. This has led to the conclusion that, for these fairly aggressive irradiation conditions, the effect of neutron irradiation on the fracture resistance of these composites appears insignificant.

  17. Fuel containment and damage tolerance in large composite primary aircraft structures. Phase 2: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.; Denny, A.; Wood, M. A.

    1985-01-01

    Technical issues associated with fuel containment and damage tolerance of composite wing structures for transport aircraft were investigated. Material evaluation tests were conducted on two toughened resin composites: Celion/HX1504 and Celion/5245. These consisted of impact, tension, compression, edge delamination, and double cantilever beam tests. Another test series was conducted on graphite/epoxy box beams simulating a wing cover to spar cap joint configuration of a pressurized fuel tank. These tests evaluated the effectiveness of sealing methods with various fastener types and spacings under fatigue loading and with pressurized fuel. Another test series evaluated the ability of the selected coatings, film, and materials to prevent fuel leakage through 32-ply AS4/2220-1 laminates at various impact energy levels. To verify the structural integrity of the technology demonstration article structural details, tests were conducted on blade stiffened panels and sections. Compression tests were performed on undamaged and impacted stiffened AS4/2220-1 panels and smaller element tests to evaluate stiffener pull-off, side load and failsafe properties. Compression tests were also performed on panels subjected to Zone 2 lightning strikes. All of these data were integrated into a demonstration article representing a moderately loaded area of a transport wing. This test combined lightning strike, pressurized fuel, impact, impact repair, fatigue and residual strength.

  18. Damage Tolerance Assessment of Friction Pull Plug Welds in an Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of cryogenic propellant tanks. Self-reacting friction stir welding is one variation of the friction stir weld process being developed for manufacturing tanks. Friction pull plug welding is used to seal the exit hole that remains in a circumferential self-reacting friction stir weld. A friction plug weld placed in a self-reacting friction stir weld results in a non-homogenous weld joint where the initial weld, plug weld, their respective heat affected zones and the base metal all interact. The welded joint is a composite plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data relating residual strength capability to flaw size in an aluminum alloy friction plug weld will be presented.

  19. Hierarchical flexural strength of enamel: transition from brittle to damage-tolerant behaviour

    PubMed Central

    Bechtle, Sabine; Özcoban, Hüseyin; Lilleodden, Erica T.; Huber, Norbert; Schreyer, Andreas; Swain, Michael V.; Schneider, Gerold A.

    2012-01-01

    Hard, biological materials are generally hierarchically structured from the nano- to the macro-scale in a somewhat self-similar manner consisting of mineral units surrounded by a soft protein shell. Considerable efforts are underway to mimic such materials because of their structurally optimized mechanical functionality of being hard and stiff as well as damage-tolerant. However, it is unclear how different hierarchical levels interact to achieve this performance. In this study, we consider dental enamel as a representative, biological hierarchical structure and determine its flexural strength and elastic modulus at three levels of hierarchy using focused ion beam (FIB) prepared cantilevers of micrometre size. The results are compared and analysed using a theoretical model proposed by Jäger and Fratzl and developed by Gao and co-workers. Both properties decrease with increasing hierarchical dimension along with a switch in mechanical behaviour from linear-elastic to elastic-inelastic. We found Gao's model matched the results very well. PMID:22031729

  20. DNA Damage Tolerance and a Web of Connections with DNA Repair at Yale

    PubMed Central

    Wood, Richard D.

    2013-01-01

    This short article summarizes some of the research carried out recently by my laboratory colleagues on the function of DNA polymerase zeta (polζ) in mammalian cells. Some personal background is also described, relevant to research associations with Yale University and its continuing influence. Polζ is involved in the bypass of many DNA lesions by translesion DNA synthesis and is responsible for the majority of DNA damage-induced point mutagenesis in mammalian cells (including human cells), as well as in yeast. We also found that the absence of this enzyme leads to gross chromosomal instability in mammalian cells and increased spontaneous tumorigenesis in mice. Recently, we discovered a further unexpectedly critical role for polζ: it plays an essential role in allowing continued rapid proliferation of cells and tissues. These observations and others indicate that polζ engages frequently during DNA replication to bypass and tolerate DNA lesions or unusual DNA structures that are barriers for the normal DNA replication machinery. PMID:24348215

  1. Aerothermal performance and damage tolerance of a Rene 41 metallic standoff thermal protection system at Mach 6.7

    NASA Technical Reports Server (NTRS)

    Avery, D. E.

    1984-01-01

    A flight-weight, metallic thermal protection system (TPS) model applicable to Earth-entry and hypersonic-cruise vehicles was subjected to multiple cycles of both radiant and aerothermal heating in order to evaluate its aerothermal performance, structural integrity, and damage tolerance. The TPS was designed for a maximum operating temperature of 2060 R and featured a shingled, corrugation-stiffened corrugated-skin heat shield of Rene 41, a nickel-base alloy. The model was subjected to 10 radiant heating tests and to 3 radiant preheat/aerothermal tests. Under radiant-heating conditions with a maximum surface temperature of 2050 R, the TPS performed as designed and limited the primary structure away from the support ribs to temperatures below 780 R. During the first attempt at aerothermal exposure, a failure in the panel-holder test fixture severely damaged the model. However, two radiant preheat/aerothermal tests were made with the damaged model to test its damage tolerance. During these tests, the damaged area did not enlarge; however, the rapidly increasing structural temperature measured during these tests indicates that had the damaged area been exposed to aerodynamic heating for the entire trajectory, an aluminum burn-through would have occurred.

  2. Damage tolerant functionally graded materials for advanced wear and friction applications

    NASA Astrophysics Data System (ADS)

    Prchlik, Lubos

    The research work presented in this dissertation focused on processing effects, microstructure development, characterization and performance evaluation of composite and graded coatings used for friction and wear control. The following issues were addressed. (1) Definition of prerequisites for a successful composite and graded coating formation by means of thermal spraying. (2) Improvement of characterization methods available for homogenous thermally sprayed coatings and their extension to composite and graded materials. (3) Development of novel characterization methods specifically for FGMs, with a focus on through thickness property measurement by indentation and in-situ curvature techniques. (4) Design of composite materials with improved properties compared to homogenous coatings. (5) Fabrication and performance assessment of FGMs with improved wear and impact damage properties. Materials. The materials studied included several material systems relevant to low friction and contact damage tolerant applications: Mo-Mo2C, WC-Co cermets as materials commonly used in sliding components of industrial machinery and NiCrAlY/8%-Yttria Partially Stabilized Zirconia composites as a potential solution for abradable sections of gas turbines and aircraft engines. In addition, uniform coatings such as molybdenum and Ni5%Al alloy were evaluated as model systems to assess the influence of microstructure variation on the mechanical properties and wear response. Methods. The contact response of the materials was investigated through several techniques. These included methods evaluating the relevant intrinsic coating properties such as elastic modulus, residual stress, fracture toughness, scratch resistance and tests measuring the abrasion and friction-sliding behavior. Dry-sand and wet two-body abrasion testing was performed in addition to traditional ball on disc sliding tests. Among all characterization techniques, spherical indentation deserved the most attention and enabled to

  3. Cell cycle stage-specific roles of Rad18 in tolerance and repair of oxidative DNA damage

    PubMed Central

    Yang, Yang; Durando, Michael; Smith-Roe, Stephanie L.; Sproul, Chris; Greenwalt, Alicia M.; Kaufmann, William; Oh, Sehyun; Hendrickson, Eric A.; Vaziri, Cyrus

    2013-01-01

    The E3 ubiquitin ligase Rad18 mediates tolerance of replication fork-stalling bulky DNA lesions, but whether Rad18 mediates tolerance of bulky DNA lesions acquired outside S-phase is unclear. Using synchronized cultures of primary human cells, we defined cell cycle stage-specific contributions of Rad18 to genome maintenance in response to ultraviolet C (UVC) and H2O2-induced DNA damage. UVC and H2O2 treatments both induced Rad18-mediated proliferating cell nuclear antigen mono-ubiquitination during G0, G1 and S-phase. Rad18 was important for repressing H2O2-induced (but not ultraviolet-induced) double strand break (DSB) accumulation and ATM S1981 phosphorylation only during G1, indicating a specific role for Rad18 in processing of oxidative DNA lesions outside S-phase. However, H2O2-induced DSB formation in Rad18-depleted G1 cells was not associated with increased genotoxin sensitivity, indicating that back-up DSB repair mechanisms compensate for Rad18 deficiency. Indeed, in DNA LigIV-deficient cells Rad18-depletion conferred H2O2-sensitivity, demonstrating functional redundancy between Rad18 and non-homologous end joining for tolerance of oxidative DNA damage acquired during G1. In contrast with G1-synchronized cultures, S-phase cells were H2O2-sensitive following Rad18-depletion. We conclude that although Rad18 pathway activation by oxidative lesions is not restricted to S-phase, Rad18-mediated trans-lesion synthesis by Polη is dispensable for damage-tolerance in G1 (because of back-up non-homologous end joining-mediated DSB repair), yet Rad18 is necessary for damage tolerance during S-phase. PMID:23295675

  4. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  5. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate that aircraft structures meet damage tolerance requirements throughout their service life. These requirements imply that a damaged aircraft structure must maintain adequate residual strength to sustain its integrity, which is ensured by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy, comprising elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed-form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the known expression for the capacity CDF of a three-element parallel system to parallel systems of up to six elements. These newly developed expressions are used to check the accuracy of a Monte Carlo simulation algorithm that determines the probability of failure of a parallel system comprising an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme, by utilizing the residual strength of the fasteners subjected to various initial load distributions and then to a new unequal load distribution resulting from subsequent sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the ...
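    The parallel-system reliability scheme summarized above can be sketched in a few lines of Python. This is a hedged illustration, not the author's code: the lognormal strength distribution, its parameters, and the load values are assumptions chosen only for the example, and the equal-load-sharing capacity is written in the classic Daniels-bundle form.

```python
import random

def bundle_capacity(strengths):
    # Equal-load-sharing parallel system: after each element failure the load
    # redistributes equally among the survivors, so the system capacity is
    # max over i of (n - i) * s[i] for strengths sorted in ascending order.
    s = sorted(strengths)
    n = len(s)
    return max((n - i) * s[i] for i in range(n))

def failure_probability(n_elements, load, n_trials=100_000, seed=42):
    # Direct Monte Carlo estimate of P(system capacity < applied load).
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # Statistically independent element strengths; the lognormal choice
        # and its parameters are illustrative assumptions.
        strengths = [rng.lognormvariate(0.0, 0.25) for _ in range(n_elements)]
        if bundle_capacity(strengths) < load:
            failures += 1
    return failures / n_trials
```

    For example, `failure_probability(6, 3.0)` estimates the failure probability of a six-element parallel system, the largest case for which the thesis derives closed-form CDF expressions to verify the simulation.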

  6. Resistance and tolerance of Terminalia sericea trees to simulated herbivore damage under different soil nutrient and moisture conditions.

    PubMed

    Katjiua, Mutjinde L J; Ward, David

    2006-07-01

    Resource availability, degree of herbivore damage, genetic variability, and their interactions influence the allocation of investment by plants to resistance and tolerance traits. We evaluated the independent and interactive effects of soil nutrients and moisture, and simulated the effects of herbivore damage on condensed tannins (resistance) and growth/regrowth (tolerance) traits of Terminalia sericea, a deciduous tree in the Kalahari desert that constitutes a major component of livestock diet. We used a completely crossed randomized-block design experiment to examine the effects of nutrients, water availability, and herbivore damage on regrowth and resistance traits of T. sericea seedlings. Plant height, number of branches, internode length, leaf area, leaf mass for each seedling, combined weight of stems and twigs, and root mass were recorded. Condensed tannin concentrations were 22.5 and 21.5% higher under low nutrients and low soil moisture than under high nutrient and high water treatment levels. Tannin concentrations did not differ significantly between control and experimental seedlings 2 mo after simulated herbivore damage. Tannin concentrations correlated more strongly with growth traits under low- than under high-nutrient conditions. No trade-offs were detected among individual growth traits, nor between growth traits and condensed tannins. T. sericea appeared to invest more in both resistance and regrowth traits when grown under low-nutrient conditions. Investment in the resistance trait (condensed tannin) under high-nutrient conditions was minimal and, to a lesser degree, correlated with plant growth. These results suggest that T. sericea displays both resistance and tolerance strategies, and that the degree to which each is expressed is resource-dependent.

  7. Design, analysis, and fabrication of a pressure box test fixture for tension damage tolerance testing of curved fuselage panels

    NASA Technical Reports Server (NTRS)

    Smith, P. J.; Bodine, J. B.; Preuss, C. H.; Koch, W. J.

    1993-01-01

    A pressure box test fixture was designed and fabricated to evaluate the effects of internal pressure, biaxial tension loads, curvature, and damage on the fracture response of composite fuselage structure. Previous work in composite fuselage tension damage tolerance, performed during NASA contract NAS1-17740, evaluated the above effects on unstiffened panels only. This work extends the tension damage tolerance testing to curved stiffened fuselage crown structure that contains longitudinal stringers and circumferential frame elements. The pressure box fixture was designed to apply internal pressure up to 20 psi, and axial tension loads up to 5000 lb/in, either separately or simultaneously. A NASTRAN finite element model of the pressure box fixture and composite stiffened panel was used to help design the test fixture, and was compared to a finite element model of a full composite stiffened fuselage shell. This was done to ensure that the test panel was loaded in a similar way to a panel in the full fuselage shell, and that the fixture and its attachment plates did not adversely affect the panel.

  8. Assessment of Damage Tolerance Requirements and Analysis - Task 1 report. Volume 2. Analytical Methods

    DTIC Science & Technology

    1986-03-31

    ... always smaller than kt; and for the same kt, kN decreases with decreasing p. Such a relationship conforms with the empirical observation that the ... To predict fatigue crack initiation under spectrum loading, cumulative damage computations must be performed. The Palmgren-Miner approach of linear ... Following the Palmgren-Miner approach, the cumulative damage is D = sum_i (n_i / N_i) (19). When the cumulative damage equals a predetermined value of Df, a ...
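    The Palmgren-Miner computation referenced in this excerpt can be illustrated with a short Python sketch. The Basquin-form S-N constants C and m, and the stress blocks in the usage note, are illustrative assumptions rather than values from the report.

```python
def basquin_life(stress_amplitude, C=1e12, m=3.0):
    # Basquin-form S-N curve: cycles to failure N = C * S**(-m).
    return C * stress_amplitude ** (-m)

def miner_damage(cycle_counts, lives):
    # Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i).
    return sum(n / N for n, N in zip(cycle_counts, lives))

def spectrum_damage(stress_blocks, C=1e12, m=3.0):
    # Damage for a load spectrum given as (stress amplitude, applied cycles)
    # blocks; crack initiation is predicted when D reaches the chosen Df.
    return sum(n / basquin_life(s, C, m) for s, n in stress_blocks)
```

    For instance, `spectrum_damage([(100.0, 1e5), (200.0, 1e4)])` sums the damage fractions of two stress blocks; failure is predicted when the total reaches a predetermined value Df (often taken as 1).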

  9. Micro-Energy Rates for Damage Tolerance and Durability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    In this paper, the adhesive bond strength of lap-jointed graphite/aluminum composites is examined by computational simulation. Computed micro-stress level energy release rates are used to identify the damage mechanisms associated with the corresponding acoustic emission (AE) signals. Computed damage regions are similarly correlated with ultrasonically scanned damage regions. Results show that computational simulation can be used with suitable NDE methods for credible in-service monitoring of composites.

  10. GENETIC AND MOLECULAR ANALYSIS OF DNA DAMAGE REPAIR AND TOLERANCE PATHWAYS.

    SciTech Connect

    SUTHERLAND, B.M.

    2001-07-26

    Radiation can damage cellular components, including DNA. Organisms have developed a panoply of means of dealing with DNA damage. Some repair pathways have rather narrow substrate specificity: photolyases, for example, act on specific pyrimidine photoproducts in a specific type (e.g., DNA) and conformation (double-stranded B conformation) of nucleic acid. Others, such as nucleotide excision repair, deal with larger classes of damage, in this case bulky adducts in DNA. A detailed discussion of DNA repair mechanisms is beyond the scope of this article; the excellent book of Friedberg et al. [1] provides further detail. However, some DNA damages, and the pathways for repairing them that are important for photobiology, are outlined below as a basis for the specific examples of genetic and molecular analysis that follow.

  11. Preliminary design of composite wing-box structures for global damage tolerance

    NASA Technical Reports Server (NTRS)

    Starnes, J. H., Jr.; Haftka, R. T.

    1980-01-01

    A procedure is presented that incorporates the influence of potential global damage conditions into the design process for minimum-mass wing-box structures. The procedure is based on mathematical-programming optimization techniques. Material-strength, minimum-gage, and panel-buckling constraints are introduced by penalty functions, and Newton's method with approximate second derivatives of the penalty terms is used as the search algorithm to obtain minimum-mass designs. A potential global damage condition is represented by a structural model with the damaged components removed. Example minimum-mass designs are obtained that simultaneously satisfy the constraints of the damaged and undamaged configurations of both graphite-epoxy and aluminum wing-box structural models. These examples are designed with and without the influence of potential damage conditions, and results indicate that for equal mass cases the residual strength of a damaged structure is higher when the influence of potential damage is properly included in the design from the outset. Results of these examples also identify the minimum structural mass increase required to increase residual strength levels.
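    The penalty-function optimization described above can be sketched for a one-variable analogue: minimize a mass objective subject to a stress constraint via a quadratic exterior penalty, using Newton's method with finite-difference (approximate) derivatives. This is a minimal illustration under assumed numbers; the load, allowable stress, and penalty weight are invented for the example and are not from the paper.

```python
def penalized_mass(t, load=1000.0, sigma_allow=250.0, r=1e-4):
    # Mass objective (proportional to thickness t) plus a quadratic exterior
    # penalty for violating the stress constraint load / t <= sigma_allow.
    violation = max(0.0, load / t - sigma_allow)
    return t + r * violation ** 2

def newton_minimize(f, x0, tol=1e-8, h=1e-5, max_iter=200):
    # Newton search on f'(x) = 0 with central finite differences, mirroring
    # the paper's use of approximate second derivatives of the penalty terms.
    x = x0
    for _ in range(max_iter):
        d1 = (f(x + h) - f(x - h)) / (2.0 * h)
        d2 = (f(x + h) - 2.0 * f(x) + f(x - h)) / h ** 2
        if abs(d2) < 1e-12:
            break
        step = d1 / d2
        x -= step
        if abs(step) < tol:
            break
    return x
```

    Starting from an infeasible thickness, e.g. `newton_minimize(penalized_mass, 3.0)`, the iteration settles below the feasible boundary t = 4 and approaches it as the penalty weight r grows, which is how exterior penalty methods trade constraint violation against mass.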

  12. Desiccation sensitivity and tolerance in the moss Physcomitrella patens: assessing limits and damage.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The moss Physcomitrella patens is becoming the model of choice for functional genomic studies at the cellular level. Studies report that P. patens survives moderate osmotic and salt stress, and that desiccation tolerance can be induced by exogenous ABA. Our goal was to quantify the extent of dehydr...

  13. New Technologies and Materials for Enhanced Damage and Fire Tolerance of Naval Vessels

    DTIC Science & Technology

    2011-02-01

    Technologies for fire and damage detection, abatement, and suppression on board naval vessels are reviewed. These include point and volume fire and damage sensors and ... Several approaches to enhancing the fire and flammability properties of non-metallic (polymeric) materials used on naval vessels are also reviewed. The ...

  14. Omega-3 improves glucose tolerance but increases lipid peroxidation and DNA damage in hepatocytes of fructose-fed rats.

    PubMed

    de Castro, Gabriela Salim Ferreira; dos Santos, Raquel Alves; Portari, Guilherme Vannucchi; Jordão, Alceu Afonso; Vannucchi, Helio

    2012-04-01

    The high consumption of fructose is linked to the increase in various characteristics of the metabolic syndrome. Fish oil is beneficial for the treatment of these comorbidities, such as insulin resistance, dyslipidemia, and hepatic steatosis. The objective of this study was to evaluate the consequences of the administration of fish oil concomitant to fructose ingestion during the experiment (45 days) and during the final 15 days in high-fructose-fed rats. Male Wistar rats were divided into 5 groups: control; those receiving 10% fish oil (FO); those receiving 60% fructose (Fr); those receiving 60% fructose and 10% fish oil for 45 days (FrFO); and those receiving fructose plus soybean oil for 30 days and fish oil for the final 15 days of the study (FrFO15). There was an increase in triacylglycerol, serum total cholesterol, and hepatic volume in the Fr group. The FO and FrFO groups experienced an increase in lipid peroxidation and a decrease in serum reduced glutathione. The FrFO group suffered greater hepatic injury, with increased alanine aminotransferase levels and DNA damage. Marked n-3 incorporation occurred in the groups receiving fish oil, favoring a better response to the oral glucose tolerance test. Fructose induced comorbidities of the metabolic syndrome, and the use of fish oil promoted a better glucose tolerance, although it was accompanied by more hepatocyte damage.

  15. Honey bee (Apis mellifera) drones survive oxidative stress due to increased tolerance instead of avoidance or repair of oxidative damage.

    PubMed

    Li-Byarlay, Hongmei; Huang, Ming Hua; Simone-Finstrom, Michael; Strand, Micheline K; Tarpy, David R; Rueppell, Olav

    2016-10-01

    Oxidative stress can lead to premature aging symptoms and cause acute mortality at higher doses in a range of organisms. Oxidative stress resistance and longevity are mechanistically and phenotypically linked; considerable variation in oxidative stress resistance exists among and within species and typically covaries with life expectancy. However, it is unclear whether stress-resistant, long-lived individuals avoid, repair, or tolerate molecular damage to survive longer than others. The honey bee (Apis mellifera L.) is an emerging model system that is well-suited to address this question. Furthermore, this species is the most economically important pollinator, whose health may be compromised by pesticide exposure, including oxidative stressors. Here, we develop a protocol for inducing oxidative stress in honey bee males (drones) via Paraquat injection. After injection, individuals from different colony sources were kept in common social conditions to monitor their survival compared to saline-injected controls. Oxidative stress was measured in susceptible and resistant individuals. Paraquat drastically reduced survival but individuals varied in their resistance to treatment within and among colony sources. Longer-lived individuals exhibited higher levels of lipid peroxidation than individuals dying early. In contrast, the level of protein carbonylation was not significantly different between the two groups. This first study of oxidative stress in male honey bees suggests that survival of an acute oxidative stressor is due to tolerance, not prevention or repair, of oxidative damage to lipids. It also demonstrates colony differences in oxidative stress resistance that might be useful for breeding stress-resistant honey bees.

  16. Exogenous nitric oxide improves salt tolerance during establishment of Jatropha curcas seedlings by ameliorating oxidative damage and toxic ion accumulation.

    PubMed

    Gadelha, Cibelle Gomes; Miranda, Rafael de Souza; Alencar, Nara Lídia M; Costa, José Hélio; Prisco, José Tarquinio; Gomes-Filho, Enéas

    2017-02-20

    Jatropha curcas is an oilseed species considered an excellent alternative to fossil-based fuel sources for cultivation in arid and semiarid regions, where salinity is becoming a stringent problem for crop production. Our working hypothesis was that nitric oxide (NO) priming enhances the salt tolerance of J. curcas during early seedling development. Under NaCl stress, seedlings arising from NO-treated seeds showed lower accumulation of Na(+) and Cl(-) than seedlings subjected to salinity alone, which was consistent with better growth at all analyzed time points. Also, although salinity promoted a significant increase in hydrogen peroxide (H2O2) content and membrane damage, the harmful effects were less severe in NO-primed seedlings. The lower oxidative damage in NO-primed stressed seedlings was attributed to the operation of a powerful antioxidant system, including greater glutathione (GSH) and ascorbate (AsA) contents as well as higher catalase (CAT) and glutathione reductase (GR) enzyme activities in both the endosperm and the embryo axis. Priming with NO was also found to rapidly up-regulate JcCAT1, JcCAT2, JcGR1 and JcGR2 gene expression in the embryo axis, suggesting that NO-induced salt responses include functional and transcriptional regulation. Thus, NO almost completely abolished the deleterious effects of salinity on reserve mobilization and seedling growth. In conclusion, NO priming improves the salt tolerance of J. curcas during seedling establishment by inducing an effective antioxidant system and limiting the accumulation of toxic ions and reactive oxygen species (ROS).

  17. DNA damage tolerance pathway involving DNA polymerase ι and the tumor suppressor p53 regulates DNA replication fork progression

    PubMed Central

    Hampp, Stephanie; Kiessling, Tina; Buechle, Kerstin; Mansilla, Sabrina F.; Thomale, Jürgen; Rall, Melanie; Ahn, Jinwoo; Pospiech, Helmut; Gottifredi, Vanesa; Wiesmüller, Lisa

    2016-01-01

    DNA damage tolerance facilitates the progression of replication forks that have encountered obstacles on the template strands. It involves either translesion DNA synthesis initiated by proliferating cell nuclear antigen monoubiquitination or less well-characterized fork reversal and template switch mechanisms. Herein, we characterize a novel tolerance pathway requiring the tumor suppressor p53, the translesion polymerase ι (POLι), the ubiquitin ligase Rad5-related helicase-like transcription factor (HLTF), and the SWI/SNF catalytic subunit (SNF2) translocase zinc finger ran-binding domain containing 3 (ZRANB3). This novel p53 activity is lost in the exonuclease-deficient but transcriptionally active p53(H115N) mutant. Wild-type p53, but not p53(H115N), associates with POLι in vivo. Strikingly, the concerted action of p53 and POLι decelerates nascent DNA elongation and promotes HLTF/ZRANB3-dependent recombination during unperturbed DNA replication. Particularly after cross-linker–induced replication stress, p53 and POLι also act together to promote meiotic recombination enzyme 11 (MRE11)-dependent accumulation of (phospho-)replication protein A (RPA)-coated ssDNA. These results implicate a direct role of p53 in the processing of replication forks encountering obstacles on the template strand. Our findings define an unprecedented function of p53 and POLι in the DNA damage response to endogenous or exogenous replication stress. PMID:27407148

  18. The key regulator of submergence tolerance, SUB1A, promotes photosynthetic and metabolic recovery from submergence damage in rice leaves.

    PubMed

    Alpuerto, Jasper Benedict; Hussain, Rana Muhammad Fraz; Fukao, Takeshi

    2016-03-01

    The submergence-tolerance regulator, SUBMERGENCE1A (SUB1A), of rice (Oryza sativa L.) modulates gene regulation, metabolism and elongation growth during submergence. Its benefits continue during desubmergence through protection from reactive oxygen species and dehydration, but there is limited understanding of SUB1A's role in physiological recovery from the stress. Here, we investigated the contribution of SUB1A to desubmergence recovery using the two near-isogenic lines, submergence-sensitive M202 and tolerant M202(Sub1). No visible damage was detected in the two genotypes after 3 d of submergence, but the sublethal stress differentially altered photosynthetic parameters and accumulation of energy reserves. Submergence inhibited photosystem II photochemistry and stimulated breakdown of protein and accumulation of several amino acids in both genotypes at similar levels. Upon desubmergence, however, more rapid return to homeostasis of these factors was observed in M202(Sub1). Submergence considerably restrained non-photochemical quenching (NPQ) in M202, whereas the value was unaltered in M202(Sub1) during the stress. Upon reaeration, submerged plants encounter sudden exposure to higher light. A greater capability for NPQ-mediated photoprotection can benefit the rapid recovery of photosynthetic performance and energy reserve metabolism in M202(Sub1). Our findings illuminate the significant role of SUB1A in active physiological recovery upon desubmergence, a component of enhanced tolerance to submergence.

  19. AGARD/SMP Review Damage Tolerance for Engine Structures. 1. Non-Destructive Evaluation

    DTIC Science & Technology

    1988-11-01

    [Figure 2: Composite sketch of typical rotor components of a gas turbine engine.] ... In light of the ... damage tolerant materials in satisfying low cycle fatigue and fracture mechanics life requirements, configurations must reflect realistic NDE limitations ...

  20. High Strength and Impact Damage Tolerant Syntactic Foam for High Performance Sandwich Structures

    DTIC Science & Technology

    2006-07-25

    ... it needs additional curing sources, its uniformity is not as good as that of other curing methods, and its shrinkage is usually high. Summary ... "Fast Repair of Laminated Beams Using UV Curing Composites," Composite Structures, 60(1), pp. 73-81 (2003). 3. S.S. Pang, G. Li, J.E. Helms, and ... strength and higher impact-tolerant syntactic foam for composite sandwich structures. A unique microstructure was designed and realized through a ...

  1. Ultraviolet-B-induced DNA damage and ultraviolet-B tolerance mechanisms in species with different functional groups coexisting in subalpine moorlands.

    PubMed

    Wang, Qing-Wei; Kamiyama, Chiho; Hidema, Jun; Hikosaka, Kouki

    2016-08-01

    High doses of ultraviolet-B (UV-B; 280-315 nm) radiation can have detrimental effects on plants, and especially damage their DNA. Plants have DNA repair and protection mechanisms to prevent UV-B damage. However, it remains unclear how DNA damage and tolerance mechanisms vary among field species. We studied DNA damage and tolerance mechanisms in 26 species with different functional groups coexisting in two moorlands at two elevations. We collected current-year leaves in July and August, and determined accumulation of cyclobutane pyrimidine dimer (CPD) as UV-B damage and photorepair activity (PRA) and concentrations of UV-absorbing compounds (UACs) and carotenoids (CARs) as UV-B tolerance mechanisms. DNA damage was greater in dicot than in monocot species, and higher in herbaceous than in woody species. Evergreen species accumulated more CPDs than deciduous species. PRA was higher in Poaceae than in species of other families. UACs were significantly higher in woody than in herbaceous species. The CPD level was not explained by the mechanisms across species, but was significantly related to PRA and UACs when we ignored species with low CPD, PRA and UACs, implying the presence of another effective tolerance mechanism. UACs were correlated negatively with PRA and positively with CARs. Our results revealed that UV-induced DNA damage significantly varies among native species, and this variation is related to functional groups. DNA repair, rather than UV-B protection, dominates in UV-B tolerance in the field. Our findings also suggest that UV-B tolerance mechanisms vary among species under evolutionary trade-off and synergism.

  2. Homologous Recombination and Translesion DNA Synthesis Play Critical Roles on Tolerating DNA Damage Caused by Trace Levels of Hexavalent Chromium

    PubMed Central

    Chen, Youjun; Zhou, Yi-Hui; Neo, Dayna; Clement, Jean; Takata, Minoru; Takeda, Shunichi; Sale, Julian; Wright, Fred A.; Swenberg, James A.; Nakamura, Jun

    2016-01-01

    Contamination of drinking water by potentially carcinogenic hexavalent chromium (Cr(VI)) is a major public health concern worldwide. However, little information is available regarding the biological effects of nanomolar amounts of Cr(VI). Here, we investigated the genotoxic effects of Cr(VI) at nanomolar levels and their repair pathways. We found that a DNA damage response, analyzed on the basis of the differential toxicity of isogenic cells deficient in various DNA repair proteins, is observed after a three-day incubation with K2CrO4 in REV1-deficient DT40 cells at 19.2 μg/L or higher, as well as in TK6 cells deficient in polymerase delta subunit 3 (POLD3) at 9.8 μg/L or higher. The genotoxicity of Cr(VI) decreased ~3000-fold when the incubation time was reduced from three days to ten minutes. The TK mutation rate also decreased significantly when exposure to Cr(VI) was shortened from 6 days to 1 day. The DNA damage response analysis suggests that DNA repair pathways, including homologous recombination and the REV1- and POLD3-mediated error-prone translesion synthesis pathways, are critical for cells to tolerate DNA damage caused by trace amounts of Cr(VI). PMID:27907204

  3. A damage tolerance comparison of 7075-T6 aluminum alloy and IM7/977-2 carbon/epoxy

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.; Lance, David G.; Hodge, Andrew J.

    1991-01-01

    This study compared low-velocity impact damage in one of the strongest aluminum alloys and in a new, damage-tolerant resin system used as a matrix for high-strength carbon fibers. The aluminum and composite materials were used as face sheets on a 0.13 g/cu cm aluminum honeycomb. Four levels of impact energy were used: 2.6 J, 5.3 J, 7.8 J, and 9.9 J. The beams were compared for static strength and fatigue life by means of the four-point bend flexure test. It was found that in the undamaged state the specific strength of the composite face sheets was about twice that of the aluminum face sheets. A sharp drop in strength was observed for the composite specimens impacted at the lowest (2.6 J) energy level, but their overall specific strength was still higher than that of the aluminum specimens. At all impact energy levels tested, the static specific strength of the composite face sheets was significantly higher than that of the aluminum face sheets. The fatigue life of the most severely damaged composite specimen was about 17 times greater than that of the undamaged aluminum specimens when cycled at 1 Hz between 20 percent and 85 percent of the ultimate breaking load.

  4. 75 FR 793 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-06

    ... amendment would require evaluation of fatigue and residual static strength of composite rotorcraft... static or fatigue loads. The proposal would require consideration of the effects of fatigue damage on... applicant must show that catastrophic failure due to static and fatigue loads, considering the intrinsic...

  5. Durability and Damage Tolerance of Bismaleimide Composites. Volume 1. Technical Report

    DTIC Science & Technology

    1988-06-01

    [Table/figure residue: specimen geometry legend — Composite Skin Width (1.00 in.); t = Nominal Composite Skin Thickness (6 plies); C = Honeycomb Core Height (1.50 in.); T = Metal Skin Thickness (0.125 in.)] ... 31, 1987. 9. Garbo, S.P. and Ogonowski, J.M., "Effect of Variances and Manufacturing Tolerances on the Design Strength and Life of Mechanically Fastened Composite Joints," AFWAL-TR-81-3041, Volumes 1, 2 and 3, April 1981.

  6. Damage tolerance in filament-wound graphite/epoxy pressure vessels

    NASA Astrophysics Data System (ADS)

    Simon, William E.; Ngueyen, Vinh D.; Chenna, Ravi K.

    1995-07-01

    Graphite/epoxy composites are extensively used in the aerospace and sporting goods industries due to their superior engineering properties compared to those of metals. However, graphite/epoxy is extremely susceptible to impact damage which can cause considerable and sometimes undetected reduction in strength. An inelastic impact model was developed to predict damage due to low-velocity impact. A transient dynamic finite element formulation was used in conjunction with the 3D Tsai-Wu failure criterion to determine and incorporate failure in the materials during impact. Material degradation can be adjusted from no degradation to partial degradation to full degradation. The developed software is based on an object-oriented implementation framework called Extensible Implementation Framework for Finite Elements (EIFFE).

  7. Recent development in the design, testing and impact-damage tolerance of stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Anderson, M. S.; Rhodes, M. D.; Starnes, J. H., Jr.; Stroud, W. J.

    1979-01-01

    Structural technology of laminated filamentary-composite stiffened-panel structures under combined inplane and lateral loadings is discussed. Attention is focused on: (1) methods for analyzing the behavior of these structures under load and for determining appropriate structural proportions for weight-efficient configurations; and (2) effects of impact damage and geometric imperfections on structural performance. Recent improvements in buckling analysis involving combined inplane compression and shear loadings and transverse shear deformations are presented. A computer code is described for proportioning or sizing laminate layers and cross-sectional dimensions, and the code is used to develop structural efficiency data for a variety of configurations, loading conditions, and constraint conditions. Experimental data on buckling of panels under inplane compression is presented. Mechanisms of impact damage initiation and propagation are described.

  8. Damage tolerance in filament-wound graphite/epoxy pressure vessels

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Nguyen, Vinh D.; Chenna, Ravi K.

    1995-01-01

    Graphite/epoxy composites are extensively used in the aerospace and sporting goods industries due to their superior engineering properties compared to those of metals. However, graphite/epoxy is extremely susceptible to impact damage which can cause considerable and sometimes undetected reduction in strength. An inelastic impact model was developed to predict damage due to low-velocity impact. A transient dynamic finite element formulation was used in conjunction with the 3D Tsai-Wu failure criterion to determine and incorporate failure in the materials during impact. Material degradation can be adjusted from no degradation to partial degradation to full degradation. The developed software is based on an object-oriented implementation framework called Extensible Implementation Framework for Finite Elements (EIFFE).
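
    The abstract's impact model applies the 3-D Tsai-Wu failure criterion; a plane-stress sketch of the criterion is shown below for brevity. The strength values used in the demo are hypothetical, and F12 uses a common default estimate rather than a measured interaction term.

    ```python
    import math

    def tsai_wu_index(s1, s2, s6, Xt, Xc, Yt, Yc, S):
        """Plane-stress Tsai-Wu failure index (failure predicted at >= 1).
        Xc and Yc are compressive strength magnitudes (positive numbers).
        The cited work uses the full 3-D form of the criterion."""
        F1, F2 = 1.0 / Xt - 1.0 / Xc, 1.0 / Yt - 1.0 / Yc
        F11, F22, F66 = 1.0 / (Xt * Xc), 1.0 / (Yt * Yc), 1.0 / S**2
        F12 = -0.5 * math.sqrt(F11 * F22)           # common default estimate
        return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
                + F66 * s6**2 + 2.0 * F12 * s1 * s2)
    ```

    By construction the index reaches exactly 1 when the ply is loaded to any one of its uniaxial strengths, which makes the criterion convenient for the kind of progressive material degradation the abstract describes.
    
    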

  9. Simplification of Fatigue Test Requirements for Damage Tolerance of Composite Interstage Launch Vehicle Hardware

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Hodge, A. J.; Jackson, J. R.

    2010-01-01

    The issue of fatigue loading of structures composed of composite materials is considered in a requirements document that is currently in place for manned launch vehicles. By taking into account the short life of these parts, coupled with design considerations, it is demonstrated that the necessary coupon-level fatigue data collapse to a static case. Data from a literature review of past studies that examined compressive fatigue loading after impact and data generated from this experimental study are presented to support this finding. Damage growth, monitored by infrared thermography, was difficult to detect due to the rapid degradation of compressive properties once damage growth initiated. Unrealistically high fatigue amplitudes were needed to fail 5 of 15 specimens before 10,000 cycles were reached. Since a typical vehicle structure, such as the Ares I interstage, only experiences a few cycles near limit load, it is concluded that static compression after impact (CAI) strength data will suffice for most launch vehicle structures.

  10. Preliminary Studies on Damage Tolerant Strategies for Composite Design and Health Monitoring

    DTIC Science & Technology

    2009-05-22

    methodology for detecting damage in thin-walled metallic plate structures, using 2-D ultrasonic phased arrays, was presented, obtaining beam-forming... active nonlinear acousto-ultrasonic based methods, and (2) active Lamb wave based methods. Lamb wave methods are based on the principle of detecting... open field. On the other hand, the nonlinear acousto-ultrasonic methods attempt to exploit the effect of anomalously high levels of nonlinearity in

  11. An examination of the damage tolerance enhancement of carbon/epoxy using an outer lamina of spectra (R)

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1991-01-01

    Low velocity instrumented impact testing was utilized to examine the effects of an outer lamina of ultra-high molecular weight polyethylene (Spectra) on the damage tolerance of carbon/epoxy composites. Four types of 16-ply quasi-isotropic panels (0, +45, 90, -45) were tested. Some panels contained no Spectra, while others had a lamina of Spectra bonded to the top (impacted side), bottom, or both sides of the composite plates. The specimens were impacted with energies up to 8.5 J. Force-time plots and maximum force versus impact energy graphs were generated for comparison purposes. Specimens were also subjected to cross-sectional analysis and compression-after-impact tests. The results show that while the Spectra improved the maximum load that the panels could withstand before fiber breakage, the Spectra seemingly reduced the residual strength of the composites.

  12. A study of the damage tolerance enhancement of carbon/epoxy laminates by utilizing an outer lamina of ultra high molecular weight polyethylene

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.; Lance, David G.

    1991-01-01

    The damage tolerance of carbon/epoxy was examined when an outer layer of ultra high molecular weight polyethylene (Spectra) material was utilized on the specimen. Four types of 16-ply quasi-isotropic panels, (0,+45,90,-45)s2, were tested. The first contained no Spectra, while the others had one lamina of Spectra placed on either the top (impacted side), bottom, or both surfaces of the composite plate. A range of impact energies up to approximately 8.5 Joules (6.3 ft-lbs) was used to inflict damage upon these specimens. Glass/Phenolic honeycomb beams with a core density of 314 N/m3 (2.0 lb/ft3) and 8-ply quasi-isotropic facesheets were also tested for compression-after-impact strength with and without Spectra at impact energies of 1, 2, 3, and 4 Joules (0.74, 1.47, 2.21, and 2.95 ft-lbs). It was observed that the composite plates had little change in damage tolerance due to the Spectra, while the honeycomb panels demonstrated a slight increase in damage tolerance when Spectra was added, with the improvement more pronounced at higher impact energies.

  13. Fatigue and damage tolerance of Y-TZP ceramics in layered biomechanical systems.

    PubMed

    Zhang, Yu; Pajares, Antonia; Lawn, Brian R

    2004-10-15

    The fatigue properties of fine-grain Y-TZP in cyclic flexural testing are studied. Comparative tests on a coarser-grain alumina provide a baseline control. A bilayer configuration with ceramic plates bonded to a compliant polymeric substrate and loaded with concentrated forces at the top surfaces, simulating basic layer structures in dental crowns and hip replacement prostheses, is used as a basic test specimen. Critical times to initiate radial crack failure at the ceramic undersurfaces at prescribed maximum surface loads are measured for Y-TZP with as-polished surfaces, mechanically predamaged undersurfaces, and after a thermal aging treatment. No differences in critical failure conditions are observed between monotonic and cyclic loading on as-polished surfaces, or between as-polished and mechanically damaged surfaces in monotonic loading, consistent with fatigue controlled by slow crack growth. However, the data for mechanically damaged and aged specimens show substantial declines in sustainable stresses and times to failure in cyclic loading, indicating an augmenting role of mechanical and thermal processes in certain instances. In all cases, however, the sustainable stresses in the Y-TZP remain higher than that of the alumina, suggesting that with proper measures to avoid inherent structural instabilities, Y-TZP could provide superior performance in biomechanical applications.
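
    Fatigue controlled by slow crack growth, as invoked in the abstract above, is commonly modeled with a power-law crack velocity da/dt = v0*(K/KIc)**n; a minimal lifetime-integration sketch follows. The parameter values in the demo (and the geometry factor Y) are illustrative assumptions, not data from the study.

    ```python
    import math

    def scg_lifetime(sigma, a_i, KIc, v0, n, Y=1.12, steps=20000):
        """Time for a flaw of initial size a_i to grow to criticality under a
        constant applied stress, via slow crack growth da/dt = v0*(K/KIc)**n
        with K = Y*sigma*sqrt(pi*a).  Numerical midpoint integration in a."""
        a_c = (KIc / (Y * sigma))**2 / math.pi      # size where K reaches KIc
        if a_i >= a_c:
            return 0.0                              # flaw already critical
        t = 0.0
        da = (a_c - a_i) / steps
        a = a_i
        for _ in range(steps):
            K = Y * sigma * math.sqrt(math.pi * (a + 0.5 * da))
            t += da / (v0 * (K / KIc)**n)           # dt = da / velocity
            a += da
        return t
    ```

    Because the velocity scales as sigma**n with n typically large for ceramics, a modest stress increase collapses the lifetime by orders of magnitude, which is why the sustainable-stress declines reported above matter so much.
    
    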

  14. Comparison of tissue damage caused by various laser systems with tissue tolerable plasma by light and laser scan microscopy

    NASA Astrophysics Data System (ADS)

    Vandersee, Staffan; Lademann, Jürgen; Richter, Heike; Patzelt, Alexa; Lange-Asschenfeldt, Bernhard

    2013-10-01

    Tissue tolerable plasma (TTP) represents a novel therapeutic method with promising capabilities in the field of dermatological interventions, in particular disinfection but also wound antisepsis and regeneration. The energy transfer by plasma into living tissue is not easily characterized, as a variety of features such as the medium’s actual molecule stream, the ions, electrons and free radicals involved, as well as the emission of ultraviolet, visible and infrared light contribute to its increasingly well characterized effects. Thus, attributing possible adverse effects, especially of prolonged exposure, to a single component of the plasma’s mode of action is difficult. Until now, severe adverse events connected to plasma exposure have not been reported when treatment is conducted according to existing therapeutic protocols. In this study, we have compared the tissue damage potential of CO2 and dye lasers with that of TTP in a porcine model. After exposure of pig ear skin to the three treatment modalities, all specimens were examined histologically and by means of laser scan microscopy (LSM). Light-microscopic tissue damage could only be shown in the case of the CO2 laser, whereas dye laser and plasma treatment resulted in no detectable impairment of the specimens. In the case of TTP, LSM examination revealed only an impairment of the uppermost corneal layers of the skin, thus stressing its safety when used in vivo.

  15. Investigation of radiation damage tolerance in interface-containing metallic nano structures

    SciTech Connect

    Greer, Julia R.

    2016-10-21

    The proposed work seeks to conduct a basic study by applying experimental and computational methods to quantify the influence of helium sink strength and proximity on He bubble nucleation and growth in He-irradiated nano-scale metallic structures, and the ensuing deformation mechanisms and mechanical properties. We utilized a combination of nano-scale in-situ tension and compression experiments on low-energy He-irradiated samples combined with site-specific microstructural characterization and modeling efforts. We also investigated the mechanical deformation of nano-architected materials, i.e. nanolattices, which are composed of 3-dimensional interwoven networks of hollow tubes with wall thickness in the nanometer range. This systematic approach will provide us with critical information for identifying key factors that govern He bubble nucleation and growth upon irradiation as a function of both sink strength and sink proximity through an experimentally-confirmed physical understanding. As an outgrowth of these efforts, we performed irradiations with self-ions (Ni2+) on Ni-Al-Zr metallic glass nanolattices to assess their resilience against radiation damage rather than He-ion implantation. We focused our attention on studying individual bcc/fcc interfaces within a single nano-structure (nano-pillar or a hollow tube): a single Fe (bcc)-Cu (fcc) boundary per pillar oriented perpendicular to the pillar axes, as well as pure bcc and fcc nano-structures. Additional interfaces of interest include bcc/bcc and metal/metallic glass, all within a single nano-structure volume. The model material systems are: (1) pure single crystalline Fe and Cu, (2) a single Fe (bcc)-Cu (fcc) boundary per nano-structure, and (3) a single metal-metallic glass interface, all oriented non-parallel to the loading direction so that their fracture strength can be tested. A nano-fabrication approach, which involves e-beam lithography and templated electroplating, as well as two

  16. Test validation of environmental barrier coating (EBC) durability and damage tolerance modeling approach

    NASA Astrophysics Data System (ADS)

    Abdul-Aziz, Ali; Najafi, Ali; Abdi, Frank; Bhatt, Ramakrishna T.; Grady, Joseph E.

    2014-03-01

    Protection of Ceramic Matrix Composites (CMCs) is an important element for engine manufacturers and aerospace companies seeking to improve the durability of their hot engine components. CMCs are typically porous materials, which permits some desirable infiltration that leads to strength enhancements. However, they experience various durability issues, such as degradation due to coating oxidation. These concerns are being addressed by introducing a high-temperature protective system, the Environmental Barrier Coating (EBC), that can operate in high-temperature applications [1,3]. In this paper, linear elastic progressive failure analyses are performed to evaluate conditions that would cause crack initiation in the EBC. The analysis determines the overall failure sequence under tensile loading conditions on different layers of material, including the EBC and CMC, in an attempt to develop a life/failure model. A 3D finite element model of a dogbone specimen is constructed for the analyses. Damage initiation, propagation, and final failure are captured using a progressive failure model considering tensile loading conditions at room temperature. It is expected that this study will establish a process for using a computational approach, validated at the specimen level, to predict component-level performance reliably in the future without resorting to extensive testing.

  17. Recent Developments and Challenges Implementing New and Improved Stress Intensity Factor (K) Solutions in NASGRO for Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Cardinal, Joseph W.; McClung, R. Craig; Lee, Yi-Der; Guo, Yajun; Beek, Joachim M.

    2014-01-01

    Fatigue crack growth analysis software has been available to damage tolerance analysts for many years in either commercial products or via proprietary in-house codes. The NASGRO software has been publicly available since the mid-80s (known as NASA/FLAGRO up to 1999) and since 2000 has been sustained and further developed by a collaborative effort between Southwest Research Institute® (SwRI®), the NASA Johnson Space Center (JSC), and the members of the NASGRO Industrial Consortium. Since the stress intensity factor (K) is the foundation of fracture mechanics and damage tolerance analysis of aircraft structures, a significant focus of development efforts in the past fifteen years has been geared towards enhancing legacy K solutions and developing new and efficient numerical K solutions that can handle the complicated stress gradients computed by today’s analysts using detailed finite element models of fatigue critical locations. This paper provides an overview of K solutions that have been recently implemented or improved for the analysis of geometries such as two unequal through cracks at a hole and two unequal corner cracks at a hole, as well as state-of-the-art weight function models capable of computing K in the presence of univariant and/or bivariant stress gradients and complicated residual stress distributions. Some historical background is provided to review how common K solutions have evolved over the years, including selective examples from the literature and from new research. Challenges and progress in rectifying discrepancies between older legacy solutions and newer models are reviewed as well as approaches and challenges for verification and validation of K solutions. Finally, a summary of current challenges and future research and development needs is presented. A key theme throughout the presentation of this paper will be how members of the aerospace industry have collaborated with software developers to develop a practical analysis tool that is
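
    The weight-function idea described above can be illustrated for the simplest geometry, a Griffith center crack, where the Green's function is known in closed form: K = 2*sqrt(a/pi) * ∫ sigma(x)/sqrt(a^2 - x^2) dx over 0..a. The sketch below is only the textbook case; NASGRO's univariant/bivariant weight functions generalize this to practical crack/hole geometries.

    ```python
    import math

    def K_center_crack(sigma_of_x, a, n=2000):
        """Stress-intensity factor for a through crack of half-length a with a
        symmetric crack-face stress sigma(x), via the Griffith-crack Green's
        function.  The substitution x = a*sin(theta) removes the endpoint
        singularity, leaving K = 2*sqrt(a/pi) * ∫_0^{pi/2} sigma(a*sin t) dt."""
        h = (math.pi / 2.0) / n
        s = sum(sigma_of_x(a * math.sin((k + 0.5) * h)) for k in range(n)) * h
        return 2.0 * math.sqrt(a / math.pi) * s
    ```

    For a uniform stress the quadrature reproduces the classical K = sigma*sqrt(pi*a) exactly, and a stress gradient that decays toward the crack tip yields a correspondingly lower K, which is the whole point of weight-function methods for residual-stress fields.
    
    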

  18. Fatigue and Damage Tolerance Analysis of a Hybrid Composite Tapered Flexbeam

    NASA Technical Reports Server (NTRS)

    Murri, Gretchen B.; Schaff, Jeffrey R.; Dobyns, Al

    2001-01-01

    The behavior of nonlinear tapered composite flexbeams under combined axial tension and cyclic bending loading was studied using coupon test specimens and finite element (FE) analyses. The flexbeams used a hybrid material system of graphite/epoxy and glass/epoxy and had internal dropped plies, dropped in an overlapping stepwise pattern. Two material configurations, differing only in the use of glass or graphite plies in the continuous plies near the midplane, were studied. Test specimens were cut from a full-size helicopter tail-rotor flexbeam and were tested in a hydraulic load frame under combined constant axial-tension load and transverse cyclic bending loads. The first delamination damage observed in the specimens occurred in the area around the tip of the outermost ply-drop group in the tapered region of the flexbeam, near the thick end. Delaminations grew slowly and stably, toward the thick end of the flexbeam, at the interfaces above and below the dropped-ply region. A 2D finite element model of the flexbeam was developed. The model was analyzed using a geometrically nonlinear analysis with both the ANSYS and ABAQUS FE codes. The global responses of each analysis agreed well with the test results. The ANSYS model was used to calculate strain energy release rates (G) for delaminations initiating at two different ply-ending locations. The results showed that delaminations were more inclined to grow at the locations where they were observed in the test specimens. Both ANSYS and ABAQUS were used to calculate G values associated with delamination initiating at the observed location but growing in different interfaces, either above or below the ply-ending group toward the thick end, or toward the thin end from the tip of the resin pocket. The different analysis codes generated the same trends and comparable peak values, within 5-11% for each delamination path. Both codes showed that delamination toward the thick region was largely mode II, and toward the thin

  19. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long-term durability and reliability. There are several types of fatigue that must be considered in the design. These include low-cycle, high-cycle, and combined fatigue for different cyclic loading conditions - for example, mechanical, thermal, erosion, etc. The traditional approach to evaluate fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time consuming, costly, and in general needs to be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. Main features in this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. The generic features of this approach are to probabilistically telescope scale local material point damage all the way up to the structural component and to probabilistically scale decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes material properties evolution and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the
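
    As one concrete instance of probabilistic fatigue evaluation, the expected Miner damage under Rayleigh-distributed stress amplitudes has a classical closed form derived from the integral version of Miner's rule, E[D] = n/C * (sqrt(2)*sigma_rms)**m * Gamma(1 + m/2), which a direct simulation can check. The S-N constants below are hypothetical.

    ```python
    import math
    import random

    def expected_miner_damage(n_cycles, sigma_rms, C, m):
        """Closed-form expected Miner damage for Rayleigh-distributed stress
        amplitudes and an S-N curve N_f(S) = C * S**(-m)."""
        return n_cycles / C * (math.sqrt(2.0) * sigma_rms)**m * math.gamma(1.0 + m / 2.0)

    def sampled_miner_damage(n_cycles, sigma_rms, C, m, seed=1):
        """Monte Carlo check: draw each cycle's amplitude from a Rayleigh
        distribution and accumulate 1/N_f(S) per cycle (Miner's rule)."""
        rng = random.Random(seed)
        damage = 0.0
        for _ in range(n_cycles):
            u = 1.0 - rng.random()                          # uniform in (0, 1]
            S = sigma_rms * math.sqrt(-2.0 * math.log(u))   # Rayleigh amplitude
            damage += S**m / C
        return damage
    ```

    The agreement between the closed form and the sampled sum is what lets damage be evaluated analytically instead of cycle by cycle.
    
    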

  20. Rad5 Template Switch Pathway of DNA Damage Tolerance Determines Synergism between Cisplatin and NSC109268 in Saccharomyces cerevisiae

    PubMed Central

    Jain, Dilip; Siede, Wolfram

    2013-01-01

    The success of cisplatin (CP) based therapy is often hindered by acquisition of CP resistance. We isolated NSC109268 as a compound altering cellular sensitivity to DNA damaging agents. Previous investigation revealed an enhancement of CP sensitivity by NSC109268 in wild-type Saccharomyces cerevisiae and CP-sensitive and -resistant cancer cell lines that correlated with a slower S phase traversal. Here, we extended these studies to determine the target pathway(s) of NSC109268 in mediating CP sensitization, using yeast as a model. We reasoned that mutants defective in the relevant target of NSC109268 should be hypersensitive to CP and the sensitization effect by NSC109268 should be absent or strongly reduced. A survey of various yeast deletion mutants converged on the Rad5 pathway of DNA damage tolerance by template switching as the likely target pathway of NSC109268 in mediating cellular sensitization to CP. Additionally, cell cycle delays following CP treatment were not synergistically influenced by NSC109268 in the CP hypersensitive rad5Δ mutant. The involvement of the known inhibitory activities of NSC109268 on 20S proteasome and phosphatases 2Cα and 2A was tested. In the CP hypersensitive ptc2Δptc3Δpph3Δ yeast strain, deficient for 2C and 2A-type phosphatases, cellular sensitization to CP by NSC109268 was greatly reduced. It is therefore suggested that NSC109268 affects CP sensitivity by inhibiting the activity of unknown protein(s) whose dephosphorylation is required for the template switch pathway. PMID:24130896

  1. Damage tolerance of pressurized graphite/epoxy tape cylinders under uniaxial and biaxial loading. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Priest, Stacy Marie

    1993-01-01

    The damage tolerance behavior of internally pressurized, axially slit, graphite/epoxy tape cylinders was investigated. Specifically, the effects of axial stress, structural anisotropy, and subcritical damage were considered. In addition, the limitations of a methodology which uses coupon fracture data to predict cylinder failure were explored. This predictive methodology was previously shown to be valid for quasi-isotropic fabric and tape cylinders but invalid for structurally anisotropic (±45/90)s and (±45/0)s cylinders. The effects of axial stress and structural anisotropy were assessed by testing tape cylinders with (90/0/±45)s, (±45/90)s, and (±45/0)s layups in a uniaxial test apparatus, specially designed and built for this work, and comparing the results to previous tests conducted in biaxial loading. Structural anisotropy effects were also investigated by testing cylinders with the quasi-isotropic (0/±45/90)s layup, which is a stacking sequence variation of the previously tested (90/0/±45)s layup with higher D16 and D26 terms but comparable D16/D11 and D26/D11 ratios. All cylinders tested and used for comparison are made from AS4/3501-6 graphite/epoxy tape and have a diameter of 305 mm. Cylinder slit lengths range from 12.7 to 50.8 mm. Failure pressures are lower for the uniaxially loaded cylinders in all cases. The smallest percent failure pressure decreases are observed for the (±45/90)s cylinders, while the greatest such decreases are observed for the (±45/0)s cylinders. The relative effects of the axial stress on the cylinder failure pressures do not correlate with the degree of structural coupling. The predictive methodology is not applicable for uniaxially loaded (±45/90)s and (±45/0)s cylinders, may be applicable for uniaxially loaded (90/0/±45)s cylinders, and is applicable for the biaxially loaded (90/0/±45)s and (0
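
    The coupon-to-cylinder predictive methodology can be caricatured in a few lines: estimate a fracture stress from a center-slit coupon and map it to the cylinder hoop stress p*R/t. The toughness Kc and wall thickness below are assumed purely for illustration; only the 305 mm diameter and the slit-length range come from the abstract.

    ```python
    import math

    def predicted_failure_pressure(slit_mm, R_mm=152.5, t_mm=1.0, Kc=40.0):
        """Illustrative coupon-to-cylinder prediction: fracture stress of a
        center-slit coupon, sigma_f = Kc / sqrt(pi*a) (Kc in MPa*sqrt(m),
        a = half slit length in m), equated to the hoop stress p*R/t.
        Kc and t are assumed values; R matches the 305 mm cylinder diameter."""
        a = slit_mm / 2.0 / 1000.0                  # half slit length, m
        sigma_f = Kc / math.sqrt(math.pi * a)       # coupon fracture stress, MPa
        return sigma_f * t_mm / R_mm                # failure pressure, MPa
    ```

    This kind of mapping is exactly what breaks down for the structurally anisotropic layups above, where bending-twisting coupling invalidates the flat-coupon fracture data.
    
    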

  2. Essential Roles of the Smc5/6 Complex in Replication through Natural Pausing Sites and Endogenous DNA Damage Tolerance

    PubMed Central

    Menolfi, Demis; Delamarre, Axel; Lengronne, Armelle; Pasero, Philippe; Branzei, Dana

    2015-01-01

    The essential functions of the conserved Smc5/6 complex remain elusive. To uncover its roles in genome maintenance, we established Saccharomyces cerevisiae cell-cycle-regulated alleles that enable restriction of Smc5/6 components to S or G2/M. Unexpectedly, the essential functions of Smc5/6 segregated fully and selectively to G2/M. Genetic screens that became possible with generated alleles identified processes that crucially rely on Smc5/6 specifically in G2/M: metabolism of DNA recombination structures triggered by endogenous replication stress, and replication through natural pausing sites located in late-replicating regions. In the first process, Smc5/6 modulates remodeling of recombination intermediates, cooperating with dissolution activities. In the second, Smc5/6 prevents chromosome fragility and toxic recombination instigated by prolonged pausing and the fork protection complex, Tof1-Csm3. Our results thus dissect Smc5/6 essential roles and reveal that combined defects in DNA damage tolerance and pausing site-replication cause recombination-mediated DNA lesions, which we propose to drive developmental and cancer-prone disorders. PMID:26698660

  3. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    PubMed Central

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-01-01

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ∼1 GPa, excellent ductility (∼60–70%) and exceptional fracture toughness (KJIc>200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip. PMID:26647978

  4. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi.

    PubMed

    Zhang, ZiJiao; Mao, M M; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X; George, Easo P; Yu, Qian; Ritchie, Robert O

    2015-12-09

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ∼ 1 GPa, excellent ductility (∼ 60-70%) and exceptional fracture toughness (KJIc>200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  5. Real-time immune cell interactions in target tissue during autoimmune-induced damage and graft tolerance

    PubMed Central

    Miska, Jason; Abdulreda, Midhat H.; Devarajan, Priyadharshini; Lui, Jen Bon; Suzuki, Jun; Pileggi, Antonello; Berggren, Per-Olof

    2014-01-01

    Real-time imaging studies are reshaping immunological paradigms, but a visual framework is lacking for self-antigen-specific T cells at the effector phase in target tissues. To address this issue, we conducted intravital, longitudinal imaging analyses of cellular behavior in nonlymphoid target tissues to illustrate some key aspects of T cell biology. We used mouse models of T cell–mediated damage and protection of pancreatic islet grafts. Both CD4+ and CD8+ effector T (Teff) lymphocytes directly engaged target cells. Strikingly, juxtaposed β cells lacking specific antigens were not subject to bystander destruction but grew substantially in days, likely by replication. In target tissue, Foxp3+ regulatory T (Treg) cells persistently contacted Teff cells with or without involvement of CD11c+ dendritic cells, an observation conciliating with the in vitro “trademark” of Treg function, contact-dependent suppression. This study illustrates tolerance induction by contact-based immune cell interaction in target tissues and highlights potentials of tissue regeneration under antigenic incognito in inflammatory settings. PMID:24567447

  6. An investigation of the fracture and fatigue crack growth behavior of forged damage-tolerant niobium aluminide intermetallics

    SciTech Connect

    Ye, F.; Mercer, C.; Soboyejo, W.O.

    1998-09-01

    The results of a recent study of the effects of ternary alloying with Ti on the fatigue and fracture behavior of a new class of forged damage-tolerant niobium aluminide (Nb-15Al-xTi) intermetallics are presented in this article. The alloys studied have the following nominal compositions: Nb-15Al-10Ti (10Ti alloy), Nb-15Al-25Ti (25Ti alloy), and Nb-15Al-40Ti (40Ti alloy). All compositions are quoted in atomic percentages unless stated otherwise. The 10Ti and 25Ti alloys exhibit fracture toughness levels between 10 and 20 MPa√m at room temperature. Fracture in these alloys occurs by brittle cleavage fracture modes. In contrast, a ductile dimpled fracture mode is observed at room temperature for the alloy containing 40 at. pct Ti. The 40Ti alloy also exhibits exceptional combinations of room-temperature strength (695 to 904 MPa), ductility (4 to 30 pct), fracture toughness (40 to 100 MPa√m), and fatigue crack growth resistance (comparable to Ti-6Al-4V, monolithic Nb, and Inconel 718). The implications of the results are discussed for potential structural applications of the 40Ti alloy in the intermediate-temperature (~700 °C to 750 °C) regime.

  7. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    DOE PAGES

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; ...

    2015-12-09

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ~1 GPa, excellent ductility (~60–70%) and exceptional fracture toughness (KJIc>200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. In conclusion, we further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  8. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    SciTech Connect

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-12-09

    Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ~1 GPa, excellent ductility (~60–70%) and exceptional fracture toughness (KJIc>200 MPa√m). Here through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. In conclusion, we further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  9. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is built on an integration of damage tolerance methodology, simulation methods, and a probabilistic algorithm, RPI (recursive probability integration), that accounts for maintenance in damage-tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment under various uncertainties, such as the variability in material properties (including crack growth rate), initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by the probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun whenever the maintenance plan changes, RPI can repeatedly reuse a small set of baseline random crack-growth histories, generated by a single MCS that excludes maintenance-related parameters, for various maintenance plans. To verify the RPI method, MC simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Because such MC simulations are time-consuming, they were run in parallel on DoD High Performance Computing (HPC) systems using a random number generator specialized for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
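    The brute-force Monte Carlo baseline that RPI is benchmarked against can be sketched in a few lines. The sketch below is a minimal illustration of the simulation loop only; the Paris-law constants, flaw-size distribution, POD curve, and inspection interval are all assumed placeholder values, not inputs from the paper:

```python
import math
import random

random.seed(0)

# Placeholder parameters (illustrative assumptions, not from the paper)
C, m = 1e-9, 3.0        # Paris law: da/dN = C * (dK)^m
CYCLES_PER_FLIGHT = 1000
A_CRIT = 25.0           # critical crack length, mm
INSPECT_EVERY = 2000    # flights between inspections
N_FLIGHTS = 6000        # service life to simulate

def pod(a):
    """Assumed exponential probability-of-detection curve vs crack length."""
    return 1.0 - math.exp(-a / 5.0)

def one_life():
    """Simulate one random crack-growth history; True if the part survives."""
    a = random.lognormvariate(math.log(0.5), 0.4)   # initial flaw size, mm
    for flight in range(1, N_FLIGHTS + 1):
        dK = 10.0 * math.sqrt(a)                    # crude stress-intensity range
        a += C * dK ** m * CYCLES_PER_FLIGHT        # crack growth over one flight
        if a >= A_CRIT:
            return False                            # failure between inspections
        if flight % INSPECT_EVERY == 0 and random.random() < pod(a):
            a = 0.5                                 # crack found -> repaired
    return True

N = 1000
pof = 1.0 - sum(one_life() for _ in range(N)) / N
print(f"estimated probability of failure: {pof:.3f}")
```

    Note that changing the maintenance plan (the inspection interval, the POD curve, the repair quality) forces a full rerun of this loop; reusing a single set of baseline crack-growth histories across maintenance plans is precisely the cost RPI avoids.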

  10. A nanoscale composite material for enhanced damage tolerance in micro and nano-electro-mechanical systems and structures

    NASA Astrophysics Data System (ADS)

    Paranjpye, Alok

    A laminar composite material with alternating layers of residual compressive and tensile stresses has previously been shown to offer enhanced tolerance to fracture in macroscale ceramic components. In this work, a similarly damage-tolerant composite material with micro- and nano-scale laminae has been developed as an alternative to monolithic silicon for the fabrication of Micro-Electro-Mechanical Systems (MEMS). The motivation for this work arises from the repeated mechanical failure of prototype MEMS-based microscale surgical tools when subjected to shock or impact loads, in spite of rigorous design features for minimizing such failures. This behavior can be attributed to the low fracture toughness of silicon and is a general characteristic of brittle materials, particularly ceramics. Fittingly, the solution proposed here is inspired by earlier research in the ceramics community. Cantilever structures of a silicon and silicon oxide laminar composite were fabricated with micrometer-range laminae widths. This represents a model, scalable material system due to the covalently bonded interface between the laminae materials. Tests performed on these cantilevers to measure their fracture properties showed higher minimum fracture stresses for composite cantilevers in comparison with identical monolithic silicon structures. Moreover, these minima match well with the "threshold" stress, a lower bound on the fracture stress of this composite predicted from theoretical considerations. A more complete model for the fracture properties of this material was also developed, removing an important assumption of the existing theory that limits its application to some material systems. The updated theory models the effect of the laminar structure of the composite as an effective anisotropy in its properties with regard to stress fields around any cracks in the material. The predictions from this model are shown to better replicate results from finite element simulations of laminate

  11. Probabilistic fatigue methodology for six nines reliability

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf

    1990-01-01

    Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight-critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are from the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods of defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant-amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
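    The linear cumulative damage (Palmgren-Miner) approach to which the round-robin reliability analysis was limited can be illustrated in a few lines. The S-N curve constants and block-loading spectrum below are invented for the illustration, not the round-robin data:

```python
# Palmgren-Miner linear cumulative damage sketch.
# S-N curve (assumed): N(S) = A * S**-b, with S in MPa.
A, b = 1.0e17, 5.0

def cycles_to_failure(S):
    return A * S ** -b

# One pass of a block-loading spectrum: (stress amplitude MPa, cycles applied)
spectrum = [(300.0, 5000), (450.0, 1000), (600.0, 100)]

# Miner's rule: damage D = sum(n_i / N_i); failure is predicted when D >= 1
D = sum(n / cycles_to_failure(S) for S, n in spectrum)
print(f"damage per spectrum pass: {D:.4f}")
print(f"predicted passes to failure: {1.0 / D:.2f}")
```

    A reliability analysis of the round-robin kind then treats the S-N constants and the spectrum itself as random variables and asks for the life at which the probability that D reaches 1 is below one in a million.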

  12. Analysis of the Static and Fatigue Strength of a Damage Tolerant 3D-Reinforced Joining Technology on Composite Single Lap Joints

    NASA Astrophysics Data System (ADS)

    Nogueira, A. C.; Drechsler, K.; Hombergsmeier, E.

    2012-07-01

    The increasing usage of carbon fiber reinforced plastics (CFRP) in aerospace, together with the constant drive for fuel efficiency and lightweight design, has imposed new challenges on next-generation structural assemblies and load-transfer-efficient joining methods. To address this issue, an innovative technology, denominated Redundant High Efficiency Assembly (RHEA) joints, is introduced as a high-performance lightweight joint that combines efficient load transfer with good damage tolerance. A review of the ongoing research involving the RHEA joint technology and its through-thickness reinforcement concept, along with the results of quasi-static and fatigue tensile investigations of single lap shear specimens, is presented and discussed. Improvements in ultimate static load, maximum joint deformation, damage tolerance, and fatigue life are found when comparing the performance of the RHEA lap shear joints to co-bonded reference specimens.

  13. Composites Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne

    2008-01-01

    This slide presentation reviews the development of MSFC-RQMT-3479 for requirements for fracture control of composites to be used in the Constellation program. This effort is part of the development of a revision of NASA-STD-5019(A), which will include MSFC-RQMT-3479. Examples of the requirement criteria and implementation are given.

  14. Brite-Euram programme: ACOUFAT acoustic fatigue and related damage tolerance of advanced composite and metallic structures

    NASA Astrophysics Data System (ADS)

    Tougard, D.

    1994-09-01

    The Brite/Euram programme ACOUFAT is concerned with 'Acoustic fatigue and related damage tolerance of advanced composite and metallic structures'. Three main fields of the ACOUFAT results are discussed: (1) The use of a 'frequency degradation' criterion, usually applied to classical metallic materials and early Carbon Fiber Reinforced Plastic (CFRP) materials, is not considered suitable as the sole parameter for determining CFRP specimen 'failure' in acoustic fatigue. It is suggested that a suitable criterion should be based, in further work, upon the degradation of the mechanical properties of the specimens; (2) On the basis of Wind-Tunnel (WT) calibration tests, a semi-empirical model of the spatio-temporal characteristics of the aero-acoustic loads exerted on a flat panel by the turbulent field created by a flap has been developed and utilized as 'Load Data Input' for Finite Element (FE) calculations. The WT tests have been reasonably well represented: the development of this semi-empirical model is an encouraging initial success. The results from the initial modelling suggest that it can be extended to the modelling of the acoustic loads in Progressive Wave Tubes (PWT); and (3) The excitation of structures by aero-acoustic loads may not be simulated fully in PWT by simply modifying and correctly shaping the spectral content. The effect of the spatial distribution of the loading is clearly different in the two cases, and the tested specimen endurance may be significantly different. It is clear that a theoretical approach based on a correct prediction of the responses to both types of environment is required.

  15. Effects of Weave Styles and Crimp Gradients on Damage Tolerance and Energy-Absorption Capacities of Woven Kevlar/Epoxy Composites

    DTIC Science & Technology

    2015-09-01

    Performing organization: Naval Undersea Warfare Center Division, 1176 Howell Street, Newport, RI 02841-1708. The report examines the effects of weave styles and crimp gradients (CGs) on the damage-tolerance levels and energy-absorption capacities of woven fabric-reinforced polymer (WFRP) composites via a comparative study. Subject terms: mechanics, fabrics, finite element analysis, functionally graded, impact, fracture, Kevlar, woven composites, woven fabric-reinforced polymer composites.

  16. DNA Polymerases ImuC and DinB Are Involved in DNA Alkylation Damage Tolerance in Pseudomonas aeruginosa and Pseudomonas putida

    PubMed Central

    Jatsenko, Tatjana; Sidorenko, Julia; Saumaa, Signe; Kivisaar, Maia

    2017-01-01

    Translesion DNA synthesis (TLS), facilitated by low-fidelity polymerases, is an important DNA damage tolerance mechanism. Here, we investigated the role and biological function of the TLS polymerase ImuC (formerly DnaE2), generally present in bacteria lacking DNA polymerase V, and the TLS polymerase DinB in response to DNA alkylation damage in Pseudomonas aeruginosa and P. putida. We found that the TLS DNA polymerases ImuC and DinB play a protective role against N- and O-methylation induced by N-methyl-N'-nitro-N-nitrosoguanidine (MNNG) in both P. aeruginosa and P. putida. DinB also appeared to be important for the survival of P. aeruginosa and rapidly growing P. putida cells in the presence of methyl methanesulfonate (MMS). The role of ImuC in protection against MMS-induced damage was uncovered under DinB-deficient conditions. Apart from this, both ImuC and DinB were critical for the survival of bacteria with impaired base excision repair (BER) functions upon alkylation damage, lacking the DNA glycosylases AlkA and/or Tag. Here, the increased sensitivity of imuC dinB double-deficient strains in comparison to single mutants suggested that the specificities of alkylated DNA lesion bypass by DinB and ImuC might also differ. Moreover, our results demonstrated that mutagenesis induced by MMS in pseudomonads was largely ImuC-dependent. Unexpectedly, we discovered that the growth temperature of bacteria affected the efficiency of DinB and ImuC in ensuring cell survival upon alkylation damage. Taken together, the results of our study disclosed the involvement of ImuC in DNA alkylation damage tolerance, especially at low temperatures, and its possible contribution to the adaptation of pseudomonads upon DNA alkylation damage via increased mutagenesis. PMID:28118378

  17. Deterministic and Probabilistic Creep and Creep Rupture Enhancement to CARES/Creep: Multiaxial Creep Life Prediction of Ceramic Structures Using Continuum Damage Mechanics and the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1998-01-01

    High-temperature and long-duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life-fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages.
Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and
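    The Kachanov-Rabotnov theory referred to above couples the uniaxial creep rate to a scalar damage variable ω. A commonly quoted isothermal, constant-stress form (written here with generic material constants A, n, M, χ, φ as an illustration, not the paper's exact notation) is:

```latex
\dot{\varepsilon} = A\left(\frac{\sigma}{1-\omega}\right)^{n},
\qquad
\dot{\omega} = \frac{M\,\sigma^{\chi}}{(1-\omega)^{\phi}},
\qquad 0 \le \omega < 1 .
```

    Integrating the damage-rate equation at constant stress from ω = 0 to ω = 1 gives a rupture time t_r = 1 / [(1 + φ) M σ^χ], which is how a CDM model of this type turns a damage evolution law into a creep-rupture life prediction.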

  18. Training probabilistic VLSI models on-chip to recognise biomedical signals under hardware nonidealities.

    PubMed

    Jiang, P C; Chen, H

    2006-01-01

    VLSI implementation of probabilistic models is attractive for many biomedical applications. However, hardware non-idealities can prevent probabilistic VLSI models from modelling data optimally through on-chip learning. This paper investigates the maximum computational errors that a probabilistic VLSI model can tolerate when modelling real biomedical data. VLSI circuits capable of achieving the required precision are also proposed.

  19. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
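    The Monte Carlo structure described above (sample uncertain impact parameters, propagate each scenario to a consequence, aggregate into a distribution) can be sketched as follows. Every distribution, density, and scaling constant here is an illustrative stand-in, not a calibrated PAIR input:

```python
import math
import random

random.seed(1)

# Illustrative assumptions only, not the PAIR model's calibrated inputs.
RHO = 2500.0        # assumed asteroid bulk density, kg/m^3
POP_DENSITY = 50.0  # assumed people per km^2 at the sampled impact point

def sample_scenario():
    """Sample one impact scenario and compute its affected population."""
    d = 20.0 * random.paretovariate(2.0)    # diameter, m (heavy-tailed size law)
    v = random.uniform(12e3, 25e3)          # entry speed, m/s
    mass = RHO * math.pi / 6.0 * d ** 3
    e_mt = 0.5 * mass * v * v / 4.184e15    # kinetic energy in megatons TNT
    r_damage_km = 2.0 * e_mt ** (1.0 / 3.0) # cube-root damage-radius scaling
    affected = POP_DENSITY * math.pi * r_damage_km ** 2
    return d, affected

scenarios = [sample_scenario() for _ in range(100_000)]
mean_affected = sum(a for _, a in scenarios) / len(scenarios)
print(f"mean affected population per impact: {mean_affected:,.0f}")
```

    Aggregating the per-scenario results into a distribution, rather than reporting only the mean, is what lets a risk-tolerance posture be applied to questions like the minimum threatening asteroid size.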

  20. A Damage Tolerance Comparison of Composite Hat-Stiffened and Honeycomb Sandwich Structure for Launch Vehicle Interstage Applications

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.

    2011-01-01

    In this study, a direct comparison was made of the compression-after-impact (CAI) strength of impact-damaged, hat-stiffened and honeycomb sandwich structure for launch vehicle use. The specimens used consisted of small substructure designed to carry a line load of approximately 3,000 lb/in. Damage was inflicted upon the specimens via drop-weight impact. Infrared thermography was used to examine the extent of planar damage in the specimens. The specimens were then compression tested to obtain residual compression strength versus damage severity curves. Results show that when the weight of the structure is factored in, both types of structure had about the same CAI strength for a given damage level. The main difference was that the hat-stiffened specimens exhibited a multiphase failure, whereas the honeycomb sandwich structure failed catastrophically.

  1. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
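    At the core of a CARES/Life-style stochastic-strength model is the Weibull distribution of brittle strength. A minimal two-parameter version can be sketched as follows (the Weibull modulus and characteristic strength below are invented illustration values, and the real code also accounts for stressed volume, multiaxiality, and slow crack growth):

```python
import math

# Two-parameter Weibull strength model (illustrative parameter values).
m = 10.0             # Weibull modulus: higher m means less strength scatter
sigma_theta = 400.0  # characteristic strength, MPa (Pf = 63.2% at this stress)

def failure_probability(stress_mpa):
    """Probability of failure of a specimen loaded to the given stress."""
    return 1.0 - math.exp(-((stress_mpa / sigma_theta) ** m))

for s in (200.0, 300.0, 400.0):
    print(f"{s:5.0f} MPa -> Pf = {failure_probability(s):.4f}")
```

    The ANSYS PDS side of the coupled system then treats quantities like sigma_theta, dimensions, and loads as random inputs, so the computed failure probability itself becomes a tracked stochastic response.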

  2. The Effects of Foam Thermal Protection System on the Damage Tolerance Characteristics of Composite Sandwich Structures for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Hodge, A. J.; Jackson, J. R.

    2011-01-01

    For any structure composed of laminated composite materials, impact damage is one of the greatest risks and therefore one of the most widely tested responses. Typically, impact damage testing and analysis assumes that a solid object comes into contact with the bare surface of the laminate (the outer ply). However, most launch vehicle structures will have a thermal protection system (TPS) covering the structure for the majority of its life. Thus, the impact response of the material with the TPS covering is the impact scenario of interest. In this study, laminates representative of the composite interstage structure for the Ares I launch vehicle were impact tested with and without the planned TPS covering, which consists of polyurethane foam. Response variables examined include maximum impact load, damage size as detected by nondestructive evaluation techniques, damage morphology, and compression-after-impact strength. Results show that there is little difference between TPS-covered and bare specimens, except that the residual strength was higher for TPS-covered specimens.

  3. The role of quasi-plasticity in the extreme contact damage tolerance of the stomatopod dactyl club.

    PubMed

    Amini, Shahrouz; Tadayon, Maryam; Idapalapati, Sridhar; Miserez, Ali

    2015-09-01

    The structure of the stomatopod dactyl club--an ultrafast, hammer-like device used by the animal to shatter hard seashells--offers inspiration for impact-tolerant ceramics. Here, we present the micromechanical principles and related micromechanisms of deformation that impart the club with high impact tolerance. By using depth-sensing nanoindentation with spherical and sharp contact tips in combination with post-indentation residual stress mapping by Raman microspectroscopy, we show that the impact surface region of the dactyl club exhibits a quasi-plastic contact response associated with the interfacial sliding and rotation of fluorapatite nanorods, endowing the club with localized yielding. We also show that the subsurface layers exhibit strain hardening by microchannel densification, which provides additional dissipation of impact energy. Our findings suggest that the club's macroscopic size is below the critical size above which Hertzian brittle cracks are nucleated.

  4. The role of quasi-plasticity in the extreme contact damage tolerance of the stomatopod dactyl club

    NASA Astrophysics Data System (ADS)

    Amini, Shahrouz; Tadayon, Maryam; Idapalapati, Sridhar; Miserez, Ali

    2015-09-01

    The structure of the stomatopod dactyl club--an ultrafast, hammer-like device used by the animal to shatter hard seashells--offers inspiration for impact-tolerant ceramics. Here, we present the micromechanical principles and related micromechanisms of deformation that impart the club with high impact tolerance. By using depth-sensing nanoindentation with spherical and sharp contact tips in combination with post-indentation residual stress mapping by Raman microspectroscopy, we show that the impact surface region of the dactyl club exhibits a quasi-plastic contact response associated with the interfacial sliding and rotation of fluorapatite nanorods, endowing the club with localized yielding. We also show that the subsurface layers exhibit strain hardening by microchannel densification, which provides additional dissipation of impact energy. Our findings suggest that the club’s macroscopic size is below the critical size above which Hertzian brittle cracks are nucleated.

  5. Probabilistic Modeling of Space Shuttle Debris Impact

    NASA Technical Reports Server (NTRS)

    Huyse, Luc J.; Waldhart, Chris J.; Riha, David S.; Larsen, Curtis E.; Gomez, Reynaldo J.; Stuart, Phillip C.

    2007-01-01

    On February 1, 2003, the Shuttle Columbia was lost during its return to Earth. Because debris impact during ascent was concluded to have caused the damage to the left wing of the Columbia Space Shuttle Vehicle (SSV), the Columbia Accident Investigation Board recommended that an assessment be performed of the debris environment experienced by the SSV during ascent. A flight rationale based on probabilistic assessment was used for the SSV return to flight. The assessment entails identifying all potential debris sources, their probable geometric and aerodynamic characteristics, and their potential for impacting and damaging critical Shuttle components. A probabilistic analysis tool, based on the SwRI-developed NESSUS probabilistic analysis software, predicts the probability of impact and damage to the Space Shuttle wing leading edge and thermal protection system components. Among other parameters, the likelihood of unacceptable damage depends on the time of release (Mach number of the orbiter) and the divot mass, as well as the impact velocity and impact angle. A typical result is visualized in the figures below. Probability of impact and damage, as well as their sensitivities with respect to the distribution assumptions, can be computed and visualized at each point on the orbiter or summarized per wing panel or tile zone.

  6. Identification of β Clamp-DNA Interaction Regions That Impair the Ability of E. coli to Tolerate Specific Classes of DNA Damage

    PubMed Central

    Nanfara, Michael T.; Babu, Vignesh M. P.; Ghazy, Mohamed A.; Sutton, Mark D.

    2016-01-01

    The E. coli dnaN-encoded β sliding clamp protein plays a pivotal role in managing the actions on DNA of the 5 bacterial DNA polymerases, proteins involved in mismatch repair, as well as several additional proteins involved in DNA replication. Results of in vitro experiments indicate that the loading of the β clamp onto DNA relies on both the DnaX clamp loader complex and several discrete sliding clamp-DNA interactions. However, the importance of these DNA interactions to E. coli viability, as well as to the ability of the β clamp to support the actions of its numerous partner proteins, has not yet been examined. To determine the contribution of β clamp-DNA interactions to the ability of E. coli to cope with different classes of DNA damage, we used alanine scanning to mutate 22 separate residues mapping to 3 distinct β clamp surfaces that are known to contact the DNA template or lie near those that do, including residues P20-L27 (referred to here as loop I), H148-Y154 (loop II), and 7 different residues lining the central pore of the β clamp through which the DNA template threads. Twenty of these 22 dnaN mutants supported bacterial growth. While none of these 20 conferred sensitivity to hydrogen peroxide or ultraviolet light, 12 were sensitized to NFZ, 5 were sensitized to MMS, 8 displayed modestly altered frequencies of DNA damage-induced mutagenesis, and 2 may be impaired for supporting hda function. Taken together, these results demonstrate that discrete β clamp-DNA interaction regions contribute to the ability of E. coli to tolerate specific classes of DNA damage. PMID:27685804

  7. Damage tolerance modeling and validation of a wireless sensory composite panel for a structural health monitoring system

    NASA Astrophysics Data System (ADS)

    Talagani, Mohamad R.; Abdi, Frank; Saravanos, Dimitris; Chrysohoidis, Nikos; Nikbin, Kamran; Ragalini, Rose; Rodov, Irena

    2013-05-01

    The paper proposes the diagnostic and prognostic modeling and test validation of a Wireless Integrated Strain Monitoring and Simulation System (WISMOS). The effort verifies a hardware- and web-based software tool that is able to evaluate and optimize sensorized aerospace composite structures for the purpose of Structural Health Monitoring (SHM). The tool is an extension of an existing suite of an SHM system based on a diagnostic-prognostic system (DPS) methodology. The goal of the extended SHM-DPS is to apply multi-scale, nonlinear, physics-based progressive failure analyses to the "as-is" structural configuration to determine residual strength, remaining service life, and future inspection intervals and maintenance procedures. The DPS solution meets the JTI Green Regional Aircraft (GRA) goals of low-weight, durable and reliable commercial aircraft. It takes advantage of the methodologies developed within the European Clean Sky JTI project WISMOS, with the capability to transmit, store and process strain data from a network of wireless sensors (e.g., strain gages, FBGA) and utilizes a DPS-based methodology, based on multi-scale progressive failure analysis (MS-PFA), to determine structural health and to advise with respect to condition-based inspection and maintenance. As part of the validation of the diagnostic and prognostic system, carbon/epoxy ASTM coupons were fabricated and tested to extract the mechanical properties. Subsequently, two composite stiffened panels were manufactured, instrumented and tested under compressive loading: 1) an undamaged stiffened buckling panel; and 2) a damaged stiffened buckling panel including an initial diamond cut. Next, numerical finite element models of the two panels were developed and analyzed under test conditions using Multi-Scale Progressive Failure Analysis (an extension of FEM) to evaluate the damage/fracture evolution process, as well as to identify the contributing failure modes. The comparisons

  8. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include nonlinear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  9. Resistance to UV-B induced DNA damage in extreme-tolerant cryptoendolithic Antarctic fungi: detection by PCR assays.

    PubMed

    Selbmann, Laura; Isola, Daniela; Zucconi, Laura; Onofri, Silvano

    2011-10-01

    Cryptoendolithic Antarctic black fungi are adapted to some of the harshest terrestrial conditions, such as those in the ice-free areas of the McMurdo Dry Valleys. Recently, their survival of simulated space conditions demonstrated their remarkable extremotolerance. To investigate potential DNA damage and the fungal response after UV-B exposure, two strains of Antarctic cryptoendolithic black fungi, Cryomyces antarcticus CCFEE 534 and Cryomyces minteri CCFEE 5187, were irradiated at different UV-B doses. Since conventional methods cannot be applied to these organisms, the effect on the genome was assessed by RAPD and rDNA-amplification PCR-based assays; the results were compared with the responses of Saccharomyces pastorianus DBVPG 6283 treated under the same conditions. Results showed that template activity was drastically inhibited in S. pastorianus after irradiation. Dramatic changes in the RAPD profiles appeared after 30 min of exposure, while the rDNA amplification of the SSU, LSU, and ITS portions failed after 30, 60, and 90 min of exposure, respectively. No alteration was detected in the templates of the Antarctic strains, where both RAPD profiles and rDNA PCR amplifications were unaffected even after 240 min of exposure. The electropherograms of the rDNA portions of the Cryomyces strains were perfectly readable and conserved, whilst the analyses revealed marked alteration in S. pastorianus, confirming the high resistance of the Antarctic strains to UV-B exposure.

  10. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
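    The guidance law can be sketched in miniature. Here a Metropolis-style transition matrix, one standard way to realize a prescribed stationary distribution (the paper's actual synthesis method may differ), drives statistically independent agents toward a desired four-bin density:

    ```python
    import random
    from collections import Counter

    def metropolis_matrix(pi):
        """Row-stochastic transition matrix whose stationary distribution is pi:
        uniform proposal over the other bins, accepted with min(1, pi_j / pi_i)."""
        n = len(pi)
        M = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                if i != j:
                    M[i][j] = (1.0 / (n - 1)) * min(1.0, pi[j] / pi[i])
            M[i][i] = 1.0 - sum(M[i])          # remaining mass: stay put
        return M

    def guide_swarm(pi, agents=5000, steps=60, seed=7):
        """Each agent applies the same Markov chain independently; the swarm's
        empirical bin occupancy converges to pi with no inter-agent communication."""
        rng = random.Random(seed)
        M = metropolis_matrix(pi)
        state = [rng.randrange(len(pi)) for _ in range(agents)]
        for _ in range(steps):
            for k, s in enumerate(state):
                r, cum = rng.random(), 0.0
                for j, p in enumerate(M[s]):
                    cum += p
                    if r < cum:
                        state[k] = j
                        break
        counts = Counter(state)
        return [counts[i] / agents for i in range(len(pi))]

    dist = guide_swarm([0.1, 0.2, 0.3, 0.4])   # desired density over 4 bins
    ```

    The self-repair property falls out for free: removing or displacing agents just restarts the same chain from a perturbed initial condition, which converges back to `pi`.
    
    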

  11. Coordinated Changes in Antioxidative Enzymes Protect the Photosynthetic Machinery from Salinity Induced Oxidative Damage and Confer Salt Tolerance in an Extreme Halophyte Salvadora persica L.

    PubMed Central

    Rangani, Jaykumar; Parida, Asish K.; Panda, Ashok; Kumari, Asha

    2016-01-01

    Salinity-induced modulations in growth, photosynthetic pigments, relative water content (RWC), lipid peroxidation, photosynthesis, photosystem II efficiency, and changes in the activity of various antioxidative enzymes were studied in the halophyte Salvadora persica treated with various levels of salinity (0, 250, 500, 750, and 1000 mM NaCl) to obtain insight into the salt tolerance ability of this halophyte. Both fresh and dry biomass as well as leaf area (LA) declined at all levels of salinity, whereas salinity caused an increase in leaf succulence. A gradual increase was observed in the Na+ content of the leaf with increasing salt concentration up to 750 mM NaCl, but at the highest salt concentration (1000 mM NaCl) the Na+ content surprisingly dropped to the level of 250 mM NaCl. The chlorophyll and carotenoid contents of the leaf remained unaffected by salinity. The photosynthetic rate (PN), stomatal conductance (gs), transpiration rate (E), quantum yield of PSII (ΦPSII), photochemical quenching (qP), and electron transport rate remained unchanged at low salinity (250 to 500 mM NaCl), whereas significant reductions in these parameters were observed at high salinity (750 to 1000 mM NaCl). The RWC (%) and water use efficiency (WUE) of the leaf remained unaffected by salinity. Salinity had no effect on the maximum quantum efficiency of PSII (Fv/Fm), which indicates that PSII is not perturbed by salinity-induced oxidative damage. Analysis of the isoforms of antioxidative enzymes revealed that the leaves of S. persica have two isoforms each of Mn-SOD and Fe-SOD, one isoform of Cu-Zn SOD, three isoforms of POX, two isoforms of APX, and one isoform of CAT. There were differential responses in the activity and expression of different isoforms of the various antioxidative enzymes. The malondialdehyde (MDA) content (a product of lipid peroxidation) of the leaf remained unchanged in S. persica treated with various levels of salinity. Our results suggest that the absence of pigment

  12. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the External Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides, and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into the monetary loss.

  13. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  14. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  15. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, all the more so when changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al., submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto, A.; Kreibich, H.; Merz, B.; Schröter, K. (submitted): Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
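    A minimal sketch of the bagging idea, assuming one-split decision stumps as base learners and synthetic stage-damage data (BT-FLEMO itself uses richer multi-variate trees and real loss records): the spread of the per-tree predictions is the model's predictive loss distribution.

    ```python
    import random
    import statistics

    def fit_stump(xs, ys):
        """One-split regression tree: predict the mean of y on each side of the
        best threshold (the split minimizing the summed squared error)."""
        best = None
        for s in sorted(set(xs)):
            left = [y for x, y in zip(xs, ys) if x <= s]
            right = [y for x, y in zip(xs, ys) if x > s]
            if not left or not right:
                continue
            ml, mr = statistics.mean(left), statistics.mean(right)
            sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
            if best is None or sse < best[0]:
                best = (sse, s, ml, mr)
        if best is None:                       # degenerate bootstrap sample
            m = statistics.mean(ys)
            return lambda x: m
        _, s, ml, mr = best
        return lambda x: ml if x <= s else mr

    def bagged_loss_distribution(xs, ys, depth, n_trees=200, seed=3):
        """Bootstrap-aggregated stumps: each tree sees a resampled dataset, so
        the n_trees predictions at `depth` form a crude predictive distribution."""
        rng = random.Random(seed)
        n = len(xs)
        preds = []
        for _ in range(n_trees):
            idx = [rng.randrange(n) for _ in range(n)]
            f = fit_stump([xs[i] for i in idx], [ys[i] for i in idx])
            preds.append(f(depth))
        return preds

    # synthetic stage-damage data: loss grows with water depth, with scatter
    rng = random.Random(0)
    depths = [rng.uniform(0.0, 3.0) for _ in range(80)]
    losses = [10.0 * d + rng.gauss(0.0, 4.0) for d in depths]
    preds = bagged_loss_distribution(depths, losses, depth=2.0)
    ```

    Reporting quantiles of `preds` instead of a single number is exactly the advantage claimed above: the uncertainty of the loss estimate is part of the output.
    
    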

  16. Damage tolerant light absorbing material

    DOEpatents

    Lauf, R.J.; Hamby, C. Jr.; Akerman, M.A.; Seals, R.D.

    1993-09-07

    A light absorbing article comprised of a composite of carbon-bonded carbon fibers, is prepared by: blending carbon fibers with a carbonizable organic powder to form a mixture; dispersing the mixture into an aqueous slurry; vacuum molding the aqueous slurry to form a green article; drying and curing the green article to form a cured article; and, carbonizing the cured article at a temperature of at least about 1000 °C to form a carbon-bonded carbon fiber light absorbing composite article having a bulk density less than 1 g/cm³. 9 figures.

  17. Damage tolerant light absorbing material

    DOEpatents

    Lauf, Robert J.; Hamby, Jr., Clyde; Akerman, M. Alfred; Seals, Roland D.

    1993-01-01

    A light absorbing article comprised of a composite of carbon-bonded carbon fibers, prepared by: blending carbon fibers with a carbonizable organic powder to form a mixture; dispersing the mixture into an aqueous slurry; vacuum molding the aqueous slurry to form a green article; drying and curing the green article to form a cured article; and, carbonizing the cured article at a temperature of at least about 1000 °C to form a carbon-bonded carbon fiber light absorbing composite article having a bulk density less than 1 g/cm³.

  18. Probabilistic composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study.

  19. Probabilistic Causation without Probability.

    ERIC Educational Resources Information Center

    Holland, Paul W.

    The failure of Hume's "constant conjunction" to describe apparently causal relations in science and everyday life has led to various "probabilistic" theories of causation of which the study by P. C. Suppes (1970) is an important example. A formal model that was developed for the analysis of comparative agricultural experiments…

  20. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  1. The desert moss Pterygoneurum lamellatum (Pottiaceae) exhibits an inducible ecological strategy of desiccation tolerance: effects of rate of drying on shoot damage and regeneration

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Premise of the study: Bryophytes are regarded as a clade incorporating constitutive desiccation tolerance, especially terrestrial species. Here we test the hypothesis that the opposing ecological strategy of desiccation tolerance, inducibility, is present in a desert moss, and addressed by varying r...

  2. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication, whereas in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue, and each dialogue requires multiple quantum as well as classical communication steps; (4) the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.

  3. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  4. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for: a) peak flux; b) event-integrated fluence; and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.

  5. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  6. Sensitivity of probabilistic MCO water content estimates to key assumptions

    SciTech Connect

    DUNCAN, D.R.

    1999-02-25

    Sensitivity of probabilistic multi-canister overpack (MCO) water content estimates to key assumptions is evaluated, with emphasis on the largest non-cladding film contributors: water borne by particulates adhering to damage sites, and water borne by canister particulate. Calculations considered different choices of damage-state degree of independence; different choices of percentile for reference high inputs; three types of input probability density functions (pdfs): triangular, log-normal, and Weibull; and the number of scrap baskets in an MCO.

  7. Cranes and Crops: Investigating Farmer Tolerances toward Crop Damage by Threatened Blue Cranes ( Anthropoides paradiseus) in the Western Cape, South Africa

    NASA Astrophysics Data System (ADS)

    van Velden, Julia L.; Smith, Tanya; Ryan, Peter G.

    2016-12-01

    The Western Cape population of Blue Cranes ( Anthropoides paradiseus) in South Africa is of great importance as the largest population throughout its range. However, Blue Cranes are strongly associated with agricultural lands in the Western Cape, and therefore may come into conflict with farmers who perceive them as damaging to crops. We investigated the viability of this population by exploring farmer attitudes toward crane damage in two regions of the Western Cape, the Swartland and Overberg, using semi-structured interviews. Perceptions of cranes differed widely between regions: farmers in the Swartland perceived crane flocks to be particularly damaging to the feed crop sweet lupin (65 % of farmers reported some level of damage by cranes), and 40 % of these farmers perceived cranes as more problematic than other common bird pests. Farmers in the Overberg did not perceive cranes as highly damaging, although there was concern about cranes eating feed at sheep troughs. Farmers who had experienced large flocks on their farms and farmers who ranked cranes as more problematic than other bird pests more often perceived cranes to be damaging to their livelihoods. Biographical variables and crop profiles could not be related to the perception of damage, indicating the complexity of this human-wildlife conflict. Farmers' need for management alternatives was related to the perceived severity of damage. These results highlight the need for location-specific management solutions to crop damage by cranes, and contribute to the management of this vulnerable species.

  8. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability, but not the consequences, of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.
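    The modularized-simulation idea can be caricatured as follows. Every distribution and constant here is a toy assumption for illustration, not a value from the article; the point is that chaining probabilistic inputs yields a casualty distribution rather than a point estimate:

    ```python
    import random

    def simulate_century(rng):
        """One 100-year Monte Carlo trial with invented inputs: a binomial
        (approximately Poisson) impact count, power-law diameters, and casualties
        only when the rare impact strikes a populated area."""
        casualties = 0
        n_impacts = sum(rng.random() < 0.005 for _ in range(100))  # mean 0.5/century
        for _ in range(n_impacts):
            d = 10.0 * rng.random() ** (-1.0 / 2.7)   # diameter (m), heavy tail
            if rng.random() < 0.03:                   # hits a populated area
                casualties += int(d ** 3 * 0.5)       # toy energy-to-casualty scaling
        return casualties

    rng = random.Random(42)
    trials = [simulate_century(rng) for _ in range(5000)]
    p_any = sum(t > 0 for t in trials) / len(trials)  # P(nonzero casualties / century)
    ```

    Most trials produce zero casualties while a few produce very large counts, which is exactly why the article argues that mean-casualty point estimates carry limited actionable information.
    
    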

  9. Tolerating Zero Tolerance?

    ERIC Educational Resources Information Center

    Moore, Brian N.

    2010-01-01

    The concept of zero tolerance dates back to the mid-1990s when New Jersey was creating laws to address nuisance crimes in communities. The main goal of these neighborhood crime policies was to have zero tolerance for petty crime such as graffiti or littering so as to keep more serious crimes from occurring. Next came the war on drugs. In federal…

  10. Probabilistic inspection strategies for minimizing service failures

    NASA Astrophysics Data System (ADS)

    Brot, Abraham

    1994-09-01

    The INSIM computer program, which simulates the 'limited fatigue life' environment in which aircraft structures generally operate, is described. The use of INSIM to develop inspection strategies that aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds, and customized inspections are simulated using the probability of failure as the driving parameter.
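    A toy fleet simulation in the spirit of this approach, with all lives, detection windows, and probabilities of detection invented for illustration (INSIM's actual models are richer):

    ```python
    import math
    import random

    def service_failure_rate(inspection_times, life_mean=40000.0, life_cov=0.3,
                             detectable_window=15000.0, pod=0.9,
                             n_aircraft=20000, seed=5):
        """Each airframe fails at a lognormally distributed life; its crack is
        detectable during the `detectable_window` flight hours before failure,
        and each inspection falling in that window finds it with probability
        `pod`. Returns the fraction of airframes that fail in service."""
        rng = random.Random(seed)
        sigma = math.sqrt(math.log(1.0 + life_cov ** 2))
        mu = math.log(life_mean) - 0.5 * sigma ** 2
        failures = 0
        for _ in range(n_aircraft):
            life = math.exp(rng.gauss(mu, sigma))
            caught = any(life - detectable_window <= t < life and rng.random() < pod
                         for t in inspection_times)
            if not caught and life < 60000.0:   # failure within the service life
                failures += 1
        return failures / n_aircraft

    no_insp = service_failure_rate([])
    with_insp = service_failure_rate([20000.0, 30000.0, 40000.0, 50000.0])
    ```

    Sweeping the inspection schedule against the resulting failure fraction is the kind of trade study the abstract describes, with probability of failure as the driving parameter.
    
    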

  11. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  12. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially when the more formal workflows are used. In this paper we focus on probabilistic workflows. We show algorithms for time computations in probabilistic workflows. With time of activities more precisely modeled, we can achieve improvement in the work cooperation and analyses of cooperation including simulation and visualization.

  13. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationship between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to consistent definition of probabilities for well formed formulae. In the quantum counterpart, we define probability amplitudes on Hinterpretations facilitating the model generations and verifications via quantum mechanical superpositions and entanglements. We cast the well formed formulae of the language as quantum mechanical observables thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  14. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, which connect the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
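    A minimal synchronous probabilistic automaton can make the Markov-chain view concrete. The noisy local-majority rule below is an assumed transition rule for illustration, not necessarily the one studied; the long-run density is estimated by time-averaging the chain:

    ```python
    import random

    def pca_step(config, rng, eps=0.2):
        """Synchronous update on a ring: each cell follows the majority of its
        3-cell neighborhood, but flips against it with noise probability eps.
        With eps > 0 the chain is ergodic, so a stationary distribution exists."""
        n = len(config)
        out = []
        for i in range(n):
            ones = config[i - 1] + config[i] + config[(i + 1) % n]
            p_one = 1.0 - eps if ones >= 2 else eps
            out.append(1 if rng.random() < p_one else 0)
        return out

    def long_run_density(n=64, steps=4000, burn=1000, seed=11):
        """Estimate the stationary mean density of ones by time-averaging
        after a burn-in period."""
        rng = random.Random(seed)
        cfg = [rng.randrange(2) for _ in range(n)]
        total = 0
        for t in range(steps):
            cfg = pca_step(cfg, rng)
            if t >= burn:
                total += sum(cfg)
        return total / ((steps - burn) * n)

    density = long_run_density()
    ```

    With symmetric noise the stationary distribution is symmetric in zeros and ones, so the time-averaged density sits near one half; taking eps to zero recovers the deterministic all-zeros/all-ones dichotomy mentioned above.
    
    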

  15. Is the basic conditional probabilistic?

    PubMed

    Goodwin, Geoffrey P

    2014-06-01

    Nine experiments examined whether individuals treat the meaning of basic conditional assertions as deterministic or probabilistic. In Experiments 1-4, participants were presented with either probabilistic or deterministic relations, which they had to describe with a conditional. These experiments consistently showed that people tend only to use the basic if p then q construction to describe deterministic relations between antecedent and consequent, whereas they use a probabilistically qualified construction, if p then probably q, to describe probabilistic relations, suggesting that the default interpretation of the conditional is deterministic. Experiments 5 and 6 showed that when directly asked, individuals typically report that conditional assertions admit no exceptions (i.e., they are seen as deterministic). Experiments 7-9 showed that individuals judge the truth of conditional assertions in accordance with this deterministic interpretation. Together, these results pose a challenge to probabilistic accounts of the meaning of conditionals and support mental models, formal rules, and suppositional accounts.

  16. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  17. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  18. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
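    The Monte Carlo procedure can be sketched with the longitudinal rule of mixtures and assumed scatter in the constituent properties (the means and coefficients of variation below are illustrative, not the study's inputs); the regression step reduces here to a simple predictor-response correlation:

    ```python
    import math
    import random
    import statistics

    def corrcoef(a, b):
        """Pearson correlation coefficient (stdlib-only, for portability)."""
        ma, mb = statistics.mean(a), statistics.mean(b)
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        var = sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)
        return cov / math.sqrt(var)

    def simulate_ply_modulus(n=4000, seed=2):
        """Monte Carlo on the longitudinal rule of mixtures
        E11 = Vf*Ef + (1 - Vf)*Em, with illustrative scatter in the fiber
        volume ratio and constituent moduli of a graphite/epoxy ply."""
        rng = random.Random(seed)
        vf = [min(max(rng.gauss(0.60, 0.03), 0.0), 1.0) for _ in range(n)]  # fiber volume ratio
        ef = [rng.gauss(230.0, 10.0) for _ in range(n)]                     # fiber modulus, GPa
        em = [rng.gauss(3.5, 0.2) for _ in range(n)]                        # matrix modulus, GPa
        e11 = [v * f + (1.0 - v) * m for v, f, m in zip(vf, ef, em)]
        return vf, e11

    vf, e11 = simulate_ply_modulus()
    corr = corrcoef(vf, e11)   # how strongly fiber volume ratio drives ply stiffness
    ```

    Repeating the correlation for each input variable ranks the uncertainty drivers, which is what the regression results in the study are used for.
    
    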

  19. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.

  20. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
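    The three-stage life summation lends itself to a compact Monte Carlo sketch. The lognormal medians and scatter below are assumptions for illustration, standing in for the slip-band decohesion, crack-tip opening displacement, and Paris-law models:

    ```python
    import math
    import random
    import statistics

    def total_life_samples(n=5000, seed=4):
        """Each trial sums three lognormally distributed stage lives
        (nucleation, small-crack growth, long-crack growth); the scatter in
        the sum mimics the macrostructural fatigue-life distribution."""
        rng = random.Random(seed)
        stages = [(math.log(2.0e4), 0.6),   # nucleation: largest scatter
                  (math.log(1.0e4), 0.4),   # small-crack growth
                  (math.log(3.0e4), 0.2)]   # long-crack (Paris-law) growth
        lives = []
        for _ in range(n):
            lives.append(sum(math.exp(rng.gauss(mu, s)) for mu, s in stages))
        return lives

    lives = sorted(total_life_samples())
    # 90th/10th percentile ratio: a simple measure of life scatter
    scatter = lives[int(0.9 * len(lives))] / lives[int(0.1 * len(lives))]
    ```

    Because the nucleation stage carries the largest scatter, it dominates the spread of the total life, mirroring the common observation that crack nucleation controls fatigue-life variability.
    
    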

  1. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  2. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, service environments, etc.), which result in uncertain behavior of the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  3. Common Difficulties with Probabilistic Reasoning.

    ERIC Educational Resources Information Center

    Hope, Jack A.; Kelly, Ivan W.

    1983-01-01

    Several common errors reflecting difficulties in probabilistic reasoning are identified, relating to ambiguity, previous outcomes, sampling, unusual events, and estimating. Knowledge of these mistakes and interpretations may help mathematics teachers understand the thought processes of their students. (MNS)

  4. Error Tolerant Plan Recognition: An Empirical Investigation

    DTIC Science & Technology

    2015-05-01

2015, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved. Approved for public release; distribution unlimited. Supplementary notes: Proceedings of the Twenty-Eighth Florida Artificial Intelligence … Record snippets: "PR's ability to tolerate input errors vs. …"; "… general model for online probabilistic plan recognition. Proceedings of the Eighteenth International Joint Conference on Artificial Intelligence (pp. …)"

  5. A Probabilistic Ontology Development Methodology

    DTIC Science & Technology

    2014-06-01

"It would be interesting to have a tool guiding the user on the steps necessary to create a probabilistic ontology and link this documentation to its implementation …" [4] … an extension that is beyond the scope of this work and includes methods such as ONIONS, FCA-Merge, and PROMPT. The interested reader may find these …

  6. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk and probability theory) in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  7. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  8. The hazard in using probabilistic seismic hazard analysis

    SciTech Connect

Krinitzsky, E.L. (Geotechnical Lab.)

    1993-11-01

Earthquake experts rely on probabilistic seismic hazard analysis for everything from emergency-response planning to development of building codes. Unfortunately, says the author, the analysis is defective for the large earthquakes that pose the greatest risks. Structures have short lifetimes, and the distances over which earthquakes cause damage are relatively small. Exceptions serve to prove the rule. To be useful in engineering, earthquake hazard assessment must focus narrowly in both time and space.

  9. Air exposure behavior of the semiterrestrial crab Neohelice granulata allows tolerance to severe hypoxia but not prevent oxidative damage due to hypoxia-reoxygenation cycle.

    PubMed

    de Lima, Tábata Martins; Geihs, Márcio Alberto; Nery, Luiz Eduardo Maia; Maciel, Fábio Everton

    2015-11-01

    The air exposure behavior of the semi-terrestrial crab Neohelice granulata during severe hypoxia was studied. This study also verified whether this behavior mitigates possible oxidative damage, namely lipoperoxidation, caused by hypoxia and reoxygenation cycles. The lethal time for 50% of the crabs subjected to severe hypoxia (0.5 mgO2 · L(-1)) with free access to air was compared to that of crabs subjected to severe hypoxia without access to air. Crabs were placed in aquaria divided into three zones: water (when the animal was fully submersed), land (when the animal was completely emerged) and intermediate (when the animal was in contact with both environments) zones. Then the crabs were held in this condition for 270 min, and the time spent in each zone was recorded. Lipid peroxidation (LPO) damage to the walking leg muscles was determined for the following four experimental conditions: a--normoxic water with free access to air; b--hypoxic water without access to air; c--hypoxic water followed by normoxic water without air access; and d--hypoxic water with free access to air. When exposed to hypoxic water, N. granulata spent significantly more time on land, 135.3 ± 17.7 min, whereas control animals (exposed to normoxic water) spent more time submerged, 187.4 ± 20.2 min. By this behavior, N. granulata was able to maintain a 100% survival rate when exposed to severe hypoxia. However, N. granulata must still return to water after periods of air exposure (~ 14 min), causing a sequence of hypoxia/reoxygenation events. Despite increasing the survival rate, hypoxia with air access does not decrease the lipid peroxidation damage caused by the hypoxia and reoxygenation cycle experienced by these crabs.

  10. The IEEE eighteenth international symposium on fault-tolerant computing (Digest of Papers)

    SciTech Connect

    Not Available

    1988-01-01

These proceedings collect papers on fault detection and computers. Topics include: software failure behavior, fault-tolerant distributed programs, parallel simulation of faults, concurrent built-in self-test techniques, fault-tolerant parallel processor architectures, probabilistic fault diagnosis, fault tolerance in hypercube processors, and cellular automata modeling.

  11. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g., development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. This technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R&D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically, the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.
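The integration-and-tracking idea described above can be illustrated in its simplest possible form: expert judgment encoded as a conjugate Beta prior on reliability, updated and tracked as sparse pass/fail test data arrive. This is only a toy analogue of the PREDICT process; all numbers are invented:

```python
# Expert judgment encoded as a Beta(alpha, beta) prior on reliability;
# sparse pass/fail test data update it conjugately.
def update_reliability(alpha, beta, successes, failures):
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    return alpha / (alpha + beta)

# Expert believes ~95% reliability, with weight equivalent to ~20 trials.
a, b = 19.0, 1.0
print(f"prior mean reliability:   {beta_mean(a, b):.3f}")

# Two test campaigns arrive over time; the estimate is tracked after each,
# mirroring performance tracking for a system under change.
for successes, failures in [(9, 1), (15, 0)]:
    a, b = update_reliability(a, b, successes, failures)
    print(f"updated mean reliability: {beta_mean(a, b):.3f}")
```

The full methodology also folds in historical data and model predictions; this sketch shows only the expert-prior-plus-test-data case.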

  12. Probabilistic exposure fusion.

    PubMed

    Song, Mingli; Tao, Dacheng; Chen, Chun; Bu, Jiajun; Luo, Jiebo; Zhang, Chengqi

    2012-01-01

The luminance of a natural scene is often of high dynamic range (HDR). In this paper, we propose a new scheme to handle HDR scenes by integrating locally adaptive scene detail capture and suppressing gradient reversals introduced by the local adaptation. The proposed scheme is novel for capturing an HDR scene by using a standard dynamic range (SDR) device and synthesizing an image suitable for SDR displays. In particular, we use an SDR capture device to record scene details (i.e., the visible contrasts and the scene gradients) in a series of SDR images with different exposure levels. Each SDR image responds to a fraction of the HDR and partially records scene details. With the captured SDR image series, we first calculate the image luminance levels, which maximize the visible contrasts, and then the scene gradients embedded in these images. Next, we synthesize an SDR image by using a probabilistic model that preserves the calculated image luminance levels and suppresses reversals in the image luminance gradients. The synthesized SDR image contains much more scene detail than any of the captured SDR images. Moreover, the proposed scheme also functions as a tone mapping of an HDR image to an SDR image, and it is superior to both global and local tone mapping operators. This is because global operators fail to preserve visual details when the contrast ratio of a scene is large, whereas local operators often produce halos in the synthesized SDR image. The proposed scheme does not require any human interaction or parameter tuning for different scenes. Subjective evaluations have shown that it is preferred over a number of existing approaches.
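The paper's full probabilistic synthesis model is not reproduced here. The sketch below shows a much simpler well-exposedness-weighted fusion of an SDR exposure series, which conveys the basic idea of combining partial recordings of an HDR scene; the weighting rule and the toy one-dimensional "images" are assumptions, not the authors' method:

```python
import math

def fuse_exposures(exposures, sigma=0.2):
    # Simplified exposure fusion: each output pixel is a weighted average
    # over the SDR series, with weights favouring well-exposed values
    # near mid-grey (0.5 on a 0..1 scale).
    def weight(v):
        return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2)) + 1e-6
    fused = []
    for pixel_series in zip(*exposures):
        ws = [weight(v) for v in pixel_series]
        fused.append(sum(w * v for w, v in zip(ws, pixel_series)) / sum(ws))
    return fused

# Three exposures of a 4-pixel scene: under-, mid-, and over-exposed.
under = [0.02, 0.10, 0.30, 0.45]
mid   = [0.10, 0.40, 0.70, 0.95]
over  = [0.35, 0.80, 0.98, 1.00]
print(fuse_exposures([under, mid, over]))
```

Each fused pixel lands between the corresponding values in the series, drawn toward whichever exposure recorded that region best.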

  13. The Xerophyta viscosa aldose reductase (ALDRXV4) confers enhanced drought and salinity tolerance to transgenic tobacco plants by scavenging methylglyoxal and reducing the membrane damage.

    PubMed

    Kumar, Deepak; Singh, Preeti; Yusuf, Mohd Aslam; Upadhyaya, Chandrama Prakash; Roy, Suchandra Deb; Hohn, Thomas; Sarin, Neera Bhalla

    2013-06-01

We report the efficacy of an aldose reductase (ALDRXV4) enzyme from Xerophyta viscosa Baker in enhancing the prospects of plant survival under abiotic stress. Transgenic tobacco plants overexpressing ALDRXV4 cDNA showed alleviation of NaCl and mannitol-induced abiotic stress. The transgenic plants survived longer periods of water deficiency and salinity stress and exhibited improved recovery after rehydration as compared to the wild-type plants. The increased synthesis of aldose reductase in transgenic plants correlated with reduced methylglyoxal and malondialdehyde accumulation and an elevated level of sorbitol under stress conditions. In addition, the transgenic lines showed better photosynthetic efficiency, less electrolyte damage, greater water retention, higher proline accumulation, and favorable ionic balance under stress conditions. Together, these findings suggest the potential of engineering aldose reductase levels for better performance of crop plants growing under drought and salt stress conditions.

  14. Probabilistic Prognosis of Non-Planar Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Newman, John A.; Warner, James E.; Leser, William P.; Hochhalter, Jacob D.; Yuan, Fuh-Gwo

    2016-01-01

    Quantifying the uncertainty in model parameters for the purpose of damage prognosis can be accomplished utilizing Bayesian inference and damage diagnosis data from sources such as non-destructive evaluation or structural health monitoring. The number of samples required to solve the Bayesian inverse problem through common sampling techniques (e.g., Markov chain Monte Carlo) renders high-fidelity finite element-based damage growth models unusable due to prohibitive computation times. However, these types of models are often the only option when attempting to model complex damage growth in real-world structures. Here, a recently developed high-fidelity crack growth model is used which, when compared to finite element-based modeling, has demonstrated reductions in computation times of three orders of magnitude through the use of surrogate models and machine learning. The model is flexible in that only the expensive computation of the crack driving forces is replaced by the surrogate models, leaving the remaining parameters accessible for uncertainty quantification. A probabilistic prognosis framework incorporating this model is developed and demonstrated for non-planar crack growth in a modified, edge-notched, aluminum tensile specimen. Predictions of remaining useful life are made over time for five updates of the damage diagnosis data, and prognostic metrics are utilized to evaluate the performance of the prognostic framework. Challenges specific to the probabilistic prognosis of non-planar fatigue crack growth are highlighted and discussed in the context of the experimental results.
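The surrogate-accelerated framework above is specific to the cited work. A toy analogue of its Bayesian-inference step is sketched below, with a planar Euler-integrated Paris-law model standing in for the high-fidelity non-planar crack model and a grid posterior standing in for MCMC; all parameter values, the noise level, and the inspection schedule are hypothetical:

```python
import math

# Forward model: Euler-integrated Paris-law growth. This planar surrogate
# stands in for the expensive crack-driving-force computation.
def crack_size(a0, C, m, dS, cycles, step=1000):
    a = a0
    for _ in range(0, cycles, step):
        dK = dS * math.sqrt(math.pi * a)   # stress intensity factor range
        a += C * dK ** m * step
    return a

# Synthetic "diagnosis data": crack sizes observed at three inspections.
TRUE_C, M, DS, A0 = 2e-11, 3.0, 150.0, 1e-3
obs = [(n, crack_size(A0, TRUE_C, M, DS, n)) for n in (20000, 40000, 60000)]
SIGMA = 5e-5  # assumed measurement noise (standard deviation)

# Posterior over the Paris coefficient C on a grid, uniform prior: the same
# Bayesian update MCMC would perform, kept deliberately simple.
grid = [1e-11 * (1 + 0.1 * i) for i in range(30)]
def log_like(C):
    return sum(-0.5 * ((a_obs - crack_size(A0, C, M, DS, n)) / SIGMA) ** 2
               for n, a_obs in obs)
lls = [log_like(C) for C in grid]
mx = max(lls)
w = [math.exp(l - mx) for l in lls]
post_mean_C = sum(C * wi for C, wi in zip(grid, w)) / sum(w)
print(f"posterior mean C ~ {post_mean_C:.2e} (true value {TRUE_C:.0e})")
```

Sampling future growth from this posterior, rather than from a point estimate, is what turns the diagnosis updates into a distribution over remaining useful life.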

  15. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  16. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1991-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables (primitive variables) that describe the truss. Initially, the truss is deterministically analyzed for member forces, and member(s) in which the axial force exceeds the Euler buckling load are identified. These member(s) are then discretized with several intermediate nodes and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing the buckled member(s) until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.
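The screening step described above (identify members whose axial force exceeds the Euler buckling load) extends naturally to a Monte Carlo check over the primitive variables. A minimal sketch for one pin-ended member, with hypothetical properties and scatter levels that are not taken from the report:

```python
import math, random

random.seed(1)

def euler_load(E, I, L):
    # Euler critical load for a pin-ended member.
    return math.pi ** 2 * E * I / L ** 2

def buckling_probability(axial_force_fn, E_mean, E_cov, I, L, trials=5000):
    # Scatter in modulus E and applied axial force F (primitive variables)
    # propagated by Monte Carlo; returns P(axial force exceeds Euler load).
    failures = 0
    for _ in range(trials):
        E = random.gauss(E_mean, E_cov * E_mean)
        F = axial_force_fn()
        if F > euler_load(E, I, L):
            failures += 1
    return failures / trials

# Hypothetical member: 10% scatter on E, 15% scatter on the axial force.
p = buckling_probability(lambda: random.gauss(9000.0, 1350.0),
                         E_mean=10.0e6, E_cov=0.10, I=0.5, L=75.0)
print(f"probability of member buckling ~ {p:.3f}")
```

In the report's procedure this check is repeated after each buckled member is removed, which is how progressive buckling up to collapse is traced.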

  17. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1994-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables that describe the truss. Initially, the truss is deterministically analyzed for member forces, and members in which the axial force exceeds the Euler buckling load are identified. These members are then discretized with several intermediate nodes, and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and the respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing buckled members until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.

  18. Probabilistic inversion: a preliminary discussion

    NASA Astrophysics Data System (ADS)

    Battista Rossi, Giovanni; Crenna, Francesco

    2015-02-01

We continue the discussion on the possibility of interpreting probability as a logic, which we started at the previous IMEKO TC1-TC7-TC13 Symposium. We show here how a probabilistic logic can be extended to include direct and inverse functions. We also discuss the relationship between this framework and the Bayes-Laplace rule, showing how the latter can be formally interpreted as a probabilistic inversion device. We suggest that these findings open a new perspective in the evaluation of measurement uncertainty.
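The reading of the Bayes-Laplace rule as an inversion device can be made concrete: a direct conditional distribution P(effect | cause) is inverted into P(cause | effect). A minimal sketch with invented numbers for a two-state measurement example (the paper itself works at the level of probabilistic logic, not this code):

```python
# The Bayes-Laplace rule as probabilistic inversion: the "direct function"
# is a conditional distribution P(effect | cause); inversion yields
# P(cause | effect).
def bayes_invert(prior, likelihood, observed_effect):
    # prior: {cause: P(cause)}; likelihood: {cause: {effect: P(effect|cause)}}
    unnorm = {c: prior[c] * likelihood[c].get(observed_effect, 0.0)
              for c in prior}
    z = sum(unnorm.values())
    return {c: p / z for c, p in unnorm.items()}

# Illustrative example: an instrument reads "high" or "low" depending on
# the true state, with known error rates.
prior = {"state_ok": 0.9, "state_faulty": 0.1}
likelihood = {"state_ok":     {"high": 0.05, "low": 0.95},
              "state_faulty": {"high": 0.80, "low": 0.20}}
posterior = bayes_invert(prior, likelihood, "high")
print(f"P(faulty | high) = {posterior['state_faulty']:.3f}")  # -> 0.640
```

The inversion is purely formal: nothing beyond the direct conditionals and a prior is needed, which is the point the paper makes about the rule.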

  19. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be any of the four modes.

  20. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  1. Making Probabilistic Relational Categories Learnable

    ERIC Educational Resources Information Center

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  2. On the applicability of probabilistics

    SciTech Connect

    Roth, P.G.

    1996-12-31

GEAE's traditional lifing approach, based on Low Cycle Fatigue (LCF) curves, is evolving for fracture critical powder metal components by incorporating probabilistic fracture mechanics analysis. Supporting this move is a growing validation database which convincingly demonstrates that probabilistics work given the right inputs. Significant efforts are being made to ensure the right inputs. For example, Heavy Liquid Separation (HLS) analysis has been developed to quantify and control inclusion content (1). Also, an intensive seeded fatigue program providing a model for crack initiation at inclusions is ongoing (2). Despite the optimism and energy, probabilistics are only tools and have limitations. Designing to low failure probabilities helps provide protection, but other strategies are needed to protect against surprises. A low risk design limit derived from a predicted failure distribution can lead to a high risk deployment if there are unaccounted-for deviations from analysis assumptions. Recognized deviations which are statistically quantifiable can be integrated into the probabilistic analysis (an advantage of the approach). When deviations are known to be possible but are not properly describable statistically, it may be more appropriate to maintain the traditional position of conservatively bounding relevant input parameters. Finally, safety factors on analysis results may be called for in cases where there is little experience supporting new design concepts or material applications (where unrecognized deviations might be expected).

  3. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  4. Probabilistic load simulation: Code development status

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.

    1991-01-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  5. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  6. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... not more than 6 pct which are damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No tolerance to reduce the required 70 pct of “light...

  7. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... not more than 6 pct which are damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No tolerance to reduce the required 70 pct of “light...

  8. A Preliminary Study of the Application of Probabilistic Risk Assessment Techniques to High-Energy Laser Safety

    DTIC Science & Technology

    2001-12-01

From the list of figures: probability that scintillation gain exceeds a given level (g); Figure 8, typical dose-response curve for laser-induced ocular damage; Figure 9, probit transformation of the dose-response curve for laser-induced ocular damage. From the discussion of protection criteria: an important element in the probabilistic risk assessment is the biological damage model (or dose-response curve). This describes …

  9. Probabilistic fatigue methodology and wind turbine reliability

    SciTech Connect

    Lange, C.H.

    1996-05-01

Wind turbines subjected to highly irregular loadings due to wind, gravity, and gyroscopic effects are especially vulnerable to fatigue damage. The objective of this study is to develop and illustrate methods for the probabilistic analysis and design of fatigue-sensitive wind turbine components. A computer program (CYCLES) that estimates fatigue reliability of structural and mechanical components has been developed. A FORM/SORM analysis is used to compute failure probabilities and importance factors of the random variables. The limit state equation includes uncertainty in environmental loading, gross structural response, and local fatigue properties. Several techniques are shown to better study fatigue loads data. Common one-parameter models, such as the Rayleigh and exponential models, are shown to produce dramatically different estimates of load distributions and fatigue damage. Improved fits may be achieved with the two-parameter Weibull model. High b values require better modeling of relatively large stress ranges; this is effectively done by matching at least two moments (Weibull) and better by matching still higher moments. For this purpose, a new, four-moment "generalized Weibull" model is introduced. Load and resistance factor design (LRFD) methodology for design against fatigue is proposed and demonstrated using data from two horizontal-axis wind turbines. To estimate fatigue damage, wind turbine blade loads have been represented by their first three statistical moments across a range of wind conditions. Based on the moments μ1 through μ3, new "quadratic Weibull" load distribution models are introduced. The fatigue reliability is found to be notably affected by the choice of load distribution model.
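The two-moment matching the abstract describes can be sketched for the two-parameter Weibull case: fit shape and scale from the mean and variance of the stress ranges, then accumulate Miner damage against an S-N curve N(S) = A·S^(-b). The load statistics, S-N constants, and cycle count below are illustrative, not turbine data:

```python
import math

def weibull_moments(k, lam):
    # Mean and variance of a two-parameter Weibull (shape k, scale lam).
    g1 = math.gamma(1 + 1 / k)
    g2 = math.gamma(1 + 2 / k)
    return lam * g1, lam ** 2 * (g2 - g1 ** 2)

def fit_weibull(mean, var):
    # Method of moments: bisect on the shape k, using the fact that the
    # squared coefficient of variation Gamma(1+2/k)/Gamma(1+1/k)^2 - 1
    # decreases monotonically in k.
    cv2 = var / mean ** 2
    lo, hi = 0.1, 50.0
    for _ in range(100):
        k = 0.5 * (lo + hi)
        g1 = math.gamma(1 + 1 / k)
        if math.gamma(1 + 2 / k) / g1 ** 2 - 1 > cv2:
            lo = k
        else:
            hi = k
    return k, mean / math.gamma(1 + 1 / k)

def miner_damage(k, lam, n_cycles, A, b, n_bins=200):
    # Expected Miner damage for S-N curve N(S) = A * S^(-b): damage per
    # cycle is S^b / A, averaged over the fitted load-range distribution
    # via quantile-bin midpoints of the Weibull inverse CDF.
    total = 0.0
    for i in range(n_bins):
        u = (i + 0.5) / n_bins
        s = lam * (-math.log(1 - u)) ** (1 / k)
        total += s ** b / A / n_bins
    return n_cycles * total

k, lam = fit_weibull(mean=50.0, var=400.0)   # hypothetical stress ranges, MPa
D = miner_damage(k, lam, n_cycles=1e7, A=1e12, b=3.0)
print(f"fitted Weibull: shape k={k:.2f}, scale={lam:.1f}; Miner damage D={D:.2f}")
```

Because damage scales with S^b, a high exponent b amplifies the tail of the load distribution, which is why the abstract argues for matching higher moments than the two used here.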

  10. Probabilistic Evaluation of Bolted Joints in Polymer Matrix Composites

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Minnetyan, L.

    1997-01-01

    Computational methods are described to probabilistically simulate fracture in bolted composite structures. Progressive fracture is simulated via an innovative approach independent of stress intensity factors and fracture toughness. The effect on structure damage of design variable uncertainties is quantified. The Fast Probability Integrator is used to assess the scatter in the composite structure response before and after damage. Sensitivity of the response to design variables is evaluated. The methods are demonstrated for bolted joint polymer matrix composite panels under end loads. The effects of fabrication process are included in the simulation of damage in the bolted panel. The results show that the most effective way to reduce the end displacement at fracture is to control the load and ply thickness.

  11. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.

  12. Synaptic Computation Underlying Probabilistic Inference

    PubMed Central

    Soltani, Alireza; Wang, Xiao-Jing

    2010-01-01

    In this paper we propose that synapses may be the workhorse of neuronal computations that underlie probabilistic reasoning. We built a neural circuit model for probabilistic inference when information provided by different sensory cues needs to be integrated, and the predictive powers of individual cues about an outcome are deduced through experience. We found that bounded synapses naturally compute, through reward-dependent plasticity, the posterior probability that a choice alternative is correct given that a cue is presented. Furthermore, a decision circuit endowed with such synapses makes choices based on the summated log posterior odds and performs near-optimal cue combination. The model is validated by reproducing salient observations of, and providing insights into, a monkey experiment using a categorization task. Our model thus suggests a biophysical instantiation of the Bayesian decision rule, while predicting important deviations from it similar to ‘base-rate neglect’ observed in human studies when alternatives have unequal priors. PMID:20010823
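The summated log posterior odds rule described above can be sketched for conditionally independent cues. The function below is a hypothetical illustration of the Bayesian benchmark that the circuit approximates, not of the synaptic model itself; the prior and likelihood-ratio inputs are assumed quantities.

```python
import math

def combine_cues(prior, likelihood_ratios):
    """Near-optimal cue combination: sum the log prior odds and the
    log likelihood ratio contributed by each independent cue, then
    convert the total log odds back to a posterior probability."""
    log_odds = math.log(prior / (1.0 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    return 1.0 / (1.0 + math.exp(-log_odds))
```

With a flat prior and two cues each favouring the alternative 3:1, the combined posterior is 9/10, because the log likelihood ratios simply add.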

  13. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
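The eigenvalue instability criterion mentioned above can be illustrated with plain Monte Carlo sampling on a single-DOF system (the paper itself uses fast probability integration and adaptive importance sampling on multi-DOF systems; the system and the damping distribution below are assumptions for illustration).

```python
import cmath
import random

def instability_probability(trials=20000, seed=1):
    """Monte Carlo estimate of P(instability) for the system
    m*x'' + c*x' + k*x = 0 with random damping c.  The system is
    unstable when any eigenvalue (root of m*s^2 + c*s + k = 0)
    has a positive real part."""
    rng = random.Random(seed)
    unstable = 0
    for _ in range(trials):
        m, k = 1.0, 4.0
        c = rng.gauss(0.05, 0.05)  # damping may go negative (assumed scatter)
        disc = cmath.sqrt(c * c - 4.0 * m * k)
        roots = ((-c + disc) / (2.0 * m), (-c - disc) / (2.0 * m))
        if any(r.real > 0.0 for r in roots):
            unstable += 1
    return unstable / trials
```

For this system instability coincides with negative damping, so the estimate should approach the Gaussian tail probability P(c < 0); importance sampling, as in the paper, concentrates samples in that tail to reach the same answer far more efficiently.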

  14. Probabilistic Simulation for Nanocomposite Characterization

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Coroneos, Rula M.

    2007-01-01

    A unique probabilistic theory is described to predict the properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strength properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions.

  15. Applications of Probabilistic Risk Assessment

    SciTech Connect

    Burns, K.J.; Chapman, J.R.; Follen, S.M.; O'Regan, P.J.

    1991-05-01

    This report provides a summary of potential and actual applications of Probabilistic Risk Assessment (PRA) technology and insights. Individual applications are derived from the experiences of a number of US nuclear utilities. This report identifies numerous applications of PRA techniques beyond those typically associated with PRAs. In addition, in the belief that future use of PRA techniques should not be limited to those of the past, areas of plant operations, maintenance, and financial resource allocation are discussed. 9 refs., 3 tabs.

  16. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDF) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  17. A probabilistic atlas of the cerebellar white matter.

    PubMed

    van Baarsen, K M; Kleinnijenhuis, M; Jbabdi, S; Sotiropoulos, S N; Grotenhuis, J A; van Cappellen van Walsum, A M

    2016-01-01

    Imaging of the cerebellar cortex, deep cerebellar nuclei and their connectivity is gaining attention, due to the important role the cerebellum plays in cognition and motor control. Atlases of the cerebellar cortex and nuclei are used to locate regions of interest in clinical and neuroscience studies. However, the white matter that connects these relay stations is of at least similar functional importance. Damage to these cerebellar white matter tracts may lead to serious language, cognitive and emotional disturbances, although the pathophysiological mechanism behind them is still debated. Differences in white matter integrity between patients and controls might shed light on structure-function correlations. A probabilistic parcellation atlas of the cerebellar white matter would help these studies by facilitating automatic segmentation of the cerebellar peduncles, the localization of lesions and the comparison of white matter integrity between patients and controls. In this work a digital three-dimensional probabilistic atlas of the cerebellar white matter is presented, based on high-quality 3T, 1.25 mm resolution diffusion MRI data from 90 subjects participating in the Human Connectome Project. The white matter tracts were estimated using probabilistic tractography. Results over 90 subjects were symmetrical, and trajectories of the superior, middle and inferior cerebellar peduncles resembled the anatomy as known from anatomical studies. This atlas will contribute to a better understanding of cerebellar white matter architecture. It may eventually aid in defining structure-function correlations in patients with cerebellar disorders.

  18. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  19. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates decisions on economic values alone and is relatively static (no reasoning, a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with such situations and responses were analysed and possibly applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
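The cost-loss rule described in the abstract reduces to a one-line comparison: act when the cost of responding, C, divided by the loss it would avoid, L, does not exceed the forecast probability p of the event. A minimal sketch (function name and arguments are illustrative):

```python
def issue_warning(cost, loss_avoided, flood_probability):
    """Cost-loss decision rule: respond when C/L <= p, i.e. when the
    expected avoided loss p*L is at least the cost C of responding."""
    return cost / loss_avoided <= flood_probability
```

For example, with response costs of 10, avoidable damage of 100, and a 20% forecast probability, the ratio 0.1 is below 0.2 and a warning is issued; tripling the response cost flips the decision.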

  20. Damage Tolerance Concepts for Critical Engine Components.

    DTIC Science & Technology

    1985-10-01

    18 (1984), 1235-1240. 21. James, M.N. and Knott, J.F., An assessment of crack closure and the extent of the short crack regime in QIN (HY80) steel. Manufacturing Technology for Nondestructive Evaluation (NDE) Acceptance Methods in Structural Components of a Medium Carbon Steel and a Medium-Strength Al-Mg Alloy, by C.M. Branco, Professor, University of Minho.

  1. Composite Structures Damage Tolerance Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Chang, James B.; Goyal, Vinay K.; Klug, John C.; Rome, Jacob I.

    2012-01-01

    This report presents the results of a literature review as part of the development of composite hardware fracture control guidelines funded by NASA Engineering and Safety Center (NESC) under contract NNL04AA09B. The objectives of the overall development tasks are to provide a broad information and database to the designers, analysts, and testing personnel who are engaged in space flight hardware production.

  2. Expression of the PsMTA1 gene in white poplar engineered with the MAT system is associated with heavy metal tolerance and protection against 8-hydroxy-2'-deoxyguanosine mediated-DNA damage.

    PubMed

    Balestrazzi, Alma; Botti, Silvia; Zelasco, Samantha; Biondi, Stefania; Franchin, Cinzia; Calligari, Paolo; Racchi, Milvia; Turchi, Adelaide; Lingua, Guido; Berta, Graziella; Carbonera, Daniela

    2009-08-01

    Marker-free transgenic white poplar (Populus alba L., cv 'Villafranca') plants, expressing the PsMT (A1) gene from Pisum sativum for a metallothionein-like protein, were produced by Agrobacterium tumefaciens-mediated transformation. The 35SCaMV-PsMT (A1)-NosT cassette was inserted into the ipt-type vector pMAT22. The occurrence of the abnormal ipt-shooty phenotype allowed the visual selection of transformants, while the yeast site-specific recombination R/RS system was responsible for the excision of the undesired vector sequences with the consequent recovery of normal marker-free transgenic plants. Molecular analyses confirmed the presence of the 35SCaMV-PsMT (A1)-NosT cassette and transgene expression. Five selected lines were further characterized, revealing the ability to withstand heavy metal toxicity. They survived 0.1 mM CuCl(2), a concentration which strongly affected the nontransgenic plants. Moreover, root development was only slightly affected by the ectopic expression of the transgene. Reactive oxygen species were accumulated to a lower extent in leaf tissues of multi-auto-transformation (MAT)-PsMT(A1) plants exposed to copper and zinc, compared to control plants. Tolerance to photo-oxidative stress induced by paraquat was another distinctive feature of the MAT-PsMT(A1) lines. Finally, low levels of DNA damage were detected by quantifying the amounts of 8-hydroxy-2'-deoxyguanosine in leaf tissues of the transgenic plants exposed to copper.

  3. Confronting uncertainty in flood damage predictions

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2015-04-01

    Reliable flood damage models are a prerequisite for practically useful model results. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the 5-quantile to 95-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
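The three scores named in the abstract (mean bias, mean absolute error, and coverage of the 5%-95% predictive interval) can be sketched as below. The function and its crude empirical quantiles are illustrative, not the study's actual evaluation code; the point prediction here is simply the mean of the predictive samples.

```python
def validation_scores(observed, predicted_samples):
    """Per-building scores for a probabilistic damage model:
    mean bias and MAE of the point prediction (the sample mean), and
    reliability as the share of observations falling inside the
    empirical 5%-95% predictive interval."""
    n = len(observed)
    bias = mae = hits = 0.0
    for obs, samples in zip(observed, predicted_samples):
        s = sorted(samples)
        point = sum(s) / len(s)
        bias += point - obs
        mae += abs(point - obs)
        q05 = s[max(0, int(0.05 * len(s)) - 1)]   # crude empirical quantiles
        q95 = s[min(len(s) - 1, int(0.95 * len(s)))]
        hits += 1.0 if q05 <= obs <= q95 else 0.0
    return bias / n, mae / n, hits / n
```

A well-calibrated model should put roughly 90% of held-out observations inside the interval; the "very good coverage" reported above corresponds to that third score staying close to 0.9 under spatial transfer.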

  4. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade were calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycle were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  5. 7 CFR 51.2280 - Tolerances for grade defects.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... on the basis of weight. (b) In order to allow for variations, other than for color and size, incident... in Table I: Table I Grade Tolerances for grade defects Total defects Serious damage Very serious... serious damage). Color Requirements...

  6. Mixed deterministic and probabilistic networks.

    PubMed

    Mateescu, Robert; Dechter, Rina

    2008-11-01

    The paper introduces mixed networks, a new graphical model framework for expressing and reasoning with probabilistic and deterministic information. The motivation to develop mixed networks stems from the desire to fully exploit the deterministic information (constraints) that is often present in graphical models. Several concepts and algorithms specific to belief networks and constraint networks are combined, achieving computational efficiency, semantic coherence and user-interface convenience. We define the semantics and graphical representation of mixed networks, and discuss the two main types of algorithms for processing them: inference-based and search-based. A preliminary experimental evaluation shows the benefits of the new model.

  7. Mixed deterministic and probabilistic networks

    PubMed Central

    Dechter, Rina

    2010-01-01

    The paper introduces mixed networks, a new graphical model framework for expressing and reasoning with probabilistic and deterministic information. The motivation to develop mixed networks stems from the desire to fully exploit the deterministic information (constraints) that is often present in graphical models. Several concepts and algorithms specific to belief networks and constraint networks are combined, achieving computational efficiency, semantic coherence and user-interface convenience. We define the semantics and graphical representation of mixed networks, and discuss the two main types of algorithms for processing them: inference-based and search-based. A preliminary experimental evaluation shows the benefits of the new model. PMID:20981243

  8. Probabilistic risk assessment: Number 219

    SciTech Connect

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)
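For independent basic events, the fault-tree arithmetic mentioned above reduces to two gate formulas; the helper functions and the example probabilities below are illustrative, not taken from the report.

```python
def or_gate(probs):
    """P(at least one basic event occurs), assuming independence:
    1 - product of (1 - p) over the inputs."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(probs):
    """P(all basic events occur), assuming independence:
    product of p over the inputs."""
    q = 1.0
    for p in probs:
        q *= p
    return q
```

For a hypothetical top event requiring an initiating event (p = 0.01) AND the failure of either of two redundant pumps (p = 0.1 and p = 0.05), the top-event probability is 0.01 * (1 - 0.9 * 0.95) = 0.00145.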

  9. Probabilistic approach to EMP assessment

    SciTech Connect

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program.

  10. Probabilistic Simulation for Nanocomposite Fracture

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.

  11. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  12. Damage Progression in Bolted Composites

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C.; Gotsis, Pascal K.

    1998-01-01

    Structural durability, damage tolerance, and progressive fracture characteristics of bolted graphite/epoxy composite laminates are evaluated via computational simulation. Constituent material properties and stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for bolted composites. Single and double bolted composite specimens with various widths and bolt spacings are evaluated. The effect of bolt spacing is investigated with regard to the structural durability of a bolted joint. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulations. Results show the damage progression sequence and structural fracture resistance during different degradation stages. A procedure is outlined for the use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of experimental results with insight for design decisions.

  13. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country started in 1968 with the setting up of the first seismographic station in Tirana, and more effectively after the Albanian Seismological Network began operation in 1976. There is rich evidence that over two thousand years Albania has been hit by many disastrous earthquakes. The highest magnitude estimated is 7.2. After the end of the Communist era and the opening of the country, a construction boom started in Albania and continues even now, making the production of accurate seismic hazard maps all the more indispensable for preventing damage from probable future earthquakes. Some efforts in seismic hazard assessment have already been made (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique has been used in a joint work between the Seismological Institute of Tirana, Albania and the Department of Geophysics of Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically conceived for this seismic hazard analysis and contains 530 events with magnitude M>4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, assigning to each the most representative fault characteristics. The computer code used for hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were Ambraseys et al., 1996; Sabetta and Pugliese, 1996; and Margaris et al., 2001. The hazard maps are obtained for 100, 475, 2375 and 4746-year return periods, for rock soil conditions. Analyzing the map of PGA values for a return period of 475 years, 5 zones with different ranges of PGA values are distinguished: 1) the zone with PGA (0.20 - 0.24 g), 1.8 percent of Albanian territory, 2) the zone with PGA (0.16 - 0.20 g), 22.6 percent of Albanian territory, 3) the
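The return periods quoted above map to exceedance probabilities through the conventional Poisson assumption (standard in probabilistic seismic hazard assessment, though not spelled out in the abstract): the probability of at least one exceedance during an exposure time t for a return period T is 1 - exp(-t/T), so a 475-year map corresponds to roughly a 10% chance of exceedance in 50 years.

```python
import math

def exceedance_probability(return_period_years, exposure_years):
    """Poisson link between a hazard map's return period T and the
    chance of at least one exceedance in t years: 1 - exp(-t/T)."""
    return 1.0 - math.exp(-exposure_years / return_period_years)
```

The same formula explains the longer return periods in the list: 2375- and 4746-year maps correspond to roughly 2% and 1% exceedance probabilities in 50 years.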

  14. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  15. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  16. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
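The Monte Carlo route to a hazard curve described above can be sketched in a few lines. Everything below is an assumption for illustration (the occurrence rate and the lognormal runup "source model" are invented, and real PTHA uses numerical propagation models, not a single distribution): simulate many years, draw a runup for each event, and tabulate annual exceedance probabilities per runup level.

```python
import random

def hazard_curve(levels, n_years=200000, annual_rate=0.01, seed=7):
    """Monte Carlo sketch of a tsunami hazard curve: in each simulated
    year a tsunami occurs with probability annual_rate, its runup is
    drawn from an assumed lognormal model (metres), and the result is
    the annual probability of exceeding each runup level."""
    rng = random.Random(seed)
    counts = [0] * len(levels)
    for _ in range(n_years):
        if rng.random() < annual_rate:
            runup = rng.lognormvariate(0.5, 0.6)  # assumed source model
            for i, h in enumerate(levels):
                if runup >= h:
                    counts[i] += 1
    return [c / n_years for c in counts]
```

The resulting curve is monotonically decreasing in runup level, and its value at level zero recovers the assumed annual event rate; empirical runup catalogs, where available, would be used to check or replace the assumed source distribution.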

  17. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  18. 7 CFR 51.306 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Tolerances § 51.306 Tolerances. In...: (1) U.S. Extra Fancy, U.S. Fancy, U.S. No. 1, and U.S. No. 1 Hail grades: 10 percent of the apples in... 5 percent, shall be allowed for apples which are seriously damaged, including therein not more...

  19. 7 CFR 51.306 - Tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Tolerances § 51.306 Tolerances. In...: (1) U.S. Extra Fancy, U.S. Fancy, U.S. No. 1, and U.S. No. 1 Hail grades: 10 percent of the apples in... 5 percent, shall be allowed for apples which are seriously damaged, including therein not more...

  20. Is probabilistic evidence a source of knowledge?

    PubMed

    Friedman, Ori; Turri, John

    2015-07-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B). Denial of knowledge for beliefs based on probabilistic evidence did not arise because participants viewed such beliefs as unjustified, nor because such beliefs leave open the possibility of error. These findings rule out traditional philosophical accounts for why probabilistic evidence does not produce knowledge. The experiments instead suggest that people deny knowledge because they distrust drawing conclusions about an individual based on reasoning about the population to which it belongs, a tendency previously identified by "judgment and decision making" researchers. Consistent with this, participants were more willing to ascribe knowledge for beliefs based on probabilistic evidence that is specific to a particular case (Experiments 3A and 3B).

  1. Probabilistic modeling of financial exposure to flood in France

    NASA Astrophysics Data System (ADS)

    Moncoulon, David; Quantin, Antoine; Leblois, Etienne

    2014-05-01

    CCR is a French reinsurance company which offers natural catastrophe covers with the State guarantee. Within this framework, CCR develops its own models to assess its financial exposure to floods, droughts, earthquakes and other perils, and thus the exposure of insurers and the French State. A probabilistic flood model has been developed in order to estimate the financial exposure of the Nat Cat insurance market to flood events, depending on their annual occurrence probability. This presentation is organized in two parts. The first part is dedicated to the development of a flood hazard and damage model (ARTEMIS). The model calibration and validation on historical events are then described. In the second part, the coupling of ARTEMIS with two generators of probabilistic events is achieved: a stochastic flow generator and a stochastic spatialized precipitation generator, adapted from the SAMPO model developed by IRSTEA. The complementary nature of these two generators is analyzed: the first allows generating floods on the French hydrological station network; the second allows simulating surface-water runoff and small-river floods, even on ungauged rivers. Thus, the simulation of thousands of non-occurred but possible events allows us to provide, for the first time, an estimate of the financial exposure to flooding in France at different scales (commune, department, country) and from different points of view (hazard, vulnerability and damages).

  2. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or...

  3. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or...

  4. Neural networks for damage identification

    SciTech Connect

    Paez, T.L.; Klenke, S.E.

    1997-11-01

    Efforts to optimize the design of mechanical systems for preestablished use environments and to extend the durations of use cycles establish a need for in-service health monitoring. Numerous studies have proposed measures of structural response for the identification of structural damage, but few have suggested systematic techniques to guide the decision as to whether or not damage has occurred based on real data. Such techniques are necessary because in field applications the environments in which systems operate and the measurements that characterize system behavior are random. This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions rating the effectiveness of the approaches.
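
    The second PNN's kernel-density idea can be sketched in a few lines. The following is an illustrative toy, not the paper's implementation: it assumes Gaussian kernels with a fixed bandwidth and synthetic two-dimensional feature data, with all names and values invented.

```python
import numpy as np

def kde_log_likelihood(x, exemplars, bandwidth=0.5):
    """Log-likelihood of feature vector x under a Gaussian kernel density
    estimate built from the exemplar measurements."""
    d = exemplars.shape[1]
    sq = np.sum((exemplars - x) ** 2, axis=1)
    log_k = -sq / (2 * bandwidth**2) - 0.5 * d * np.log(2 * np.pi * bandwidth**2)
    m = log_k.max()                       # log-mean-exp, numerically stable
    return m + np.log(np.mean(np.exp(log_k - m)))

def classify(x, undamaged, damaged, prior_undamaged=0.5):
    """Bayes decision between the undamaged and damaged exemplar classes."""
    lu = kde_log_likelihood(x, undamaged) + np.log(prior_undamaged)
    ld = kde_log_likelihood(x, damaged) + np.log(1 - prior_undamaged)
    return "undamaged" if lu > ld else "damaged"

rng = np.random.default_rng(0)
undamaged = rng.normal(0.0, 1.0, size=(200, 2))   # baseline response features
damaged = rng.normal(3.0, 1.0, size=(200, 2))     # shifted response features
print(classify(np.array([0.1, -0.2]), undamaged, damaged))
print(classify(np.array([3.2, 2.8]), undamaged, damaged))
```

    The first call lands near the baseline exemplars and the second near the damaged ones, so the Bayes decision separates them cleanly for these well-separated synthetic classes.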

  5. Probabilistic Reasoning for Plan Robustness

    NASA Technical Reports Server (NTRS)

    Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.

    2005-01-01

    A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing projected system state can be simplified in some cases. Common approximation and novel methods are compared for over-constrained and lightly constrained domains. The system compares a few common approximation methods for an iterative repair planner. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed by execution. The improvement is more significant for larger problems and problems with higher resource subscription levels but diminishes as the system is allowed to accept higher risk levels.
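
    Representing uncertain quantities as parametric distributions makes some projections closed-form, as the abstract notes. A hedged illustration with invented numbers: for a chain of independent Gaussian activity durations, the total is again Gaussian, so the probability of violating a deadline constraint comes straight from the normal survival function.

```python
import numpy as np
from scipy.stats import norm

# sequential activities with Gaussian (mean, sd) duration models (invented numbers)
activities = [(10.0, 2.0), (5.0, 1.0), (8.0, 3.0)]
mean = sum(m for m, _ in activities)
sd = np.sqrt(sum(s**2 for _, s in activities))   # independent durations add in variance

deadline = 30.0
p_violation = norm.sf(deadline, loc=mean, scale=sd)   # P(total duration > deadline)
print(f"P(miss deadline) = {p_violation:.3f}")
```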

  6. Probabilistic cloning of equidistant states

    SciTech Connect

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-08-15

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  7. Probabilistic direct counterfactual quantum communication

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng

    2017-02-01

    It is striking that the quantum Zeno effect can be used to launch a direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality comes at a price. Firstly, the transmission time is much longer than that of a classical transmission. Secondly, the chained-cycle structure makes such protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol can evolve to a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).

  8. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
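
    The probabilistic modules described here propagate input-parameter distributions through the deterministic code by sampling. A generic Monte Carlo sketch of that idea follows, with entirely hypothetical distributions and a toy multiplicative dose formula; it is not RESRAD's actual model, parameters, or units conventions.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000

# hypothetical input-parameter distributions for a toy dose calculation
soil_conc = rng.lognormal(np.log(100.0), 0.5, n)   # contaminant concentration
ingestion = rng.triangular(10, 25, 50, n)          # intake rate
dose_factor = rng.uniform(1e-6, 3e-6, n)           # dose conversion factor

dose = soil_conc * ingestion * dose_factor         # toy deterministic model

print(f"mean dose ~ {dose.mean():.3g}")
print(f"95th percentile ~ {np.percentile(dose, 95):.3g}")
```

    Reporting percentiles alongside the mean is the point of the exercise: the spread quantifies how uncertainty in the inputs shows up in the assessed dose.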

  9. Modeling neural activity with cumulative damage distributions.

    PubMed

    Leiva, Víctor; Tejo, Mauricio; Guiraud, Pierre; Schmachtenberg, Oliver; Orio, Patricio; Marmolejo-Ramos, Fernando

    2015-10-01

    Neurons transmit information as action potentials or spikes. Due to the inherent randomness of the inter-spike intervals (ISIs), probabilistic models are often used for their description. Cumulative damage (CD) distributions are a family of probabilistic models that has been widely considered for describing time-related cumulative processes. This family allows us to consider certain deterministic principles for modeling ISIs from a probabilistic viewpoint and to link its parameters to values with biological interpretation. The CD family includes the Birnbaum-Saunders and inverse Gaussian distributions, which possess distinctive properties and theoretical arguments useful for ISI description. We expand the use of CD distributions to the modeling of neural spiking behavior, mainly by testing the suitability of the Birnbaum-Saunders distribution, which has not been studied in the setting of neural activity. We validate this expansion with original experimental and simulated electrophysiological data.
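
    The Birnbaum-Saunders distribution is available in SciPy under the name `fatiguelife`, so the kind of ISI fit described here can be sketched as follows. The data and parameter values are simulated and illustrative, not from the paper's recordings.

```python
import numpy as np
from scipy.stats import fatiguelife   # SciPy's Birnbaum-Saunders distribution

rng = np.random.default_rng(1)
# simulate inter-spike intervals from a Birnbaum-Saunders (fatigue-life) law
true_shape, true_scale = 0.8, 0.05    # illustrative values, scale in seconds
isis = fatiguelife.rvs(true_shape, scale=true_scale, size=2000, random_state=rng)

# maximum-likelihood fit with the location pinned at 0 (ISIs are positive)
shape_hat, loc_hat, scale_hat = fatiguelife.fit(isis, floc=0)
print(f"shape ~ {shape_hat:.2f}, scale ~ {scale_hat:.3f} s")
```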

  10. Pesticide Tolerances

    EPA Pesticide Factsheets

    EPA regulates pesticides used to protect crops and sets limits on the amount of pesticide remaining in or on foods in the U.S. The limits on pesticides on foods are called tolerances in the U.S. (maximum residue limits (MRLs) in many other countries).

  11. Probabilistic population projections with migration uncertainty.

    PubMed

    Azose, Jonathan J; Ševčíková, Hana; Raftery, Adrian E

    2016-06-07

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated.

  12. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  13. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  14. Replicating damaged DNA in eukaryotes.

    PubMed

    Chatterjee, Nimrat; Siede, Wolfram

    2013-12-01

    DNA damage is one of many possible perturbations that challenge the mechanisms that preserve genetic stability during the copying of the eukaryotic genome in S phase. This short review provides, in the first part, a general introduction to the topic and an overview of checkpoint responses. In the second part, the mechanisms of error-free tolerance in response to fork-arresting DNA damage will be discussed in some detail.

  15. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
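
    One standard construction for this kind of probabilistic non-unitary computation (not necessarily the authors' circuit design) embeds a contraction M into a unitary on a doubled space via a Halmos dilation; applying the unitary and finding the ancilla unchanged applies M, with success probability equal to the norm of the transformed state. A small numerical sketch with an invented M:

```python
import numpy as np
from scipy.linalg import sqrtm

def dilate(M):
    """Halmos dilation: embed a contraction M (operator norm <= 1) into a
    unitary on a doubled space. Applying U to |psi>|0> and finding the
    ancilla still in |0> applies M, with success probability ||M psi||^2."""
    d = M.shape[0]
    D_M = sqrtm(np.eye(d) - M.conj().T @ M)
    D_Mdag = sqrtm(np.eye(d) - M @ M.conj().T)
    return np.block([[M, D_Mdag], [D_M, -M.conj().T]])

M = np.diag([0.6, 0.3])                  # a non-unitary contraction
U = dilate(M)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
out = U @ np.concatenate([psi, np.zeros(2)])
p_success = np.linalg.norm(out[:2]) ** 2   # ancilla measured in |0>
print(round(p_success, 3))                  # 0.225
```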

  16. Probabilistic micromechanics for high-temperature composites

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1993-01-01

    The three-year program of research had the following technical objectives: the development of probabilistic methods for micromechanics-based constitutive and failure models, application of the probabilistic methodology in the evaluation of various composite materials and simulation of expected uncertainties in unidirectional fiber composite properties, and influence of the uncertainties in composite properties on the structural response. The first year of research was devoted to the development of probabilistic methodology for micromechanics models. The second year of research focused on the evaluation of the Chamis-Hopkins constitutive model and Aboudi constitutive model using the methodology developed in the first year of research. The third year of research was devoted to the development of probabilistic finite element analysis procedures for laminated composite plate and shell structures.

  17. Probabilistic regularization in inverse optical imaging.

    PubMed

    De Micheli, E; Viano, G A

    2000-11-01

    The problem of object restoration in the case of spatially incoherent illumination is considered. A regularized solution to the inverse problem is obtained through a probabilistic approach, and a numerical algorithm based on the statistical analysis of the noisy data is presented. Particular emphasis is placed on the question of the positivity constraint, which is incorporated into the probabilistically regularized solution by means of a quadratic programming technique. Numerical examples illustrating the main steps of the algorithm are also given.
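
    The positivity-constrained, regularized inversion described here can be illustrated on a toy 1-D deblurring problem, using non-negative least squares in place of the paper's quadratic-programming machinery. All matrices and values below are invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
n = 40
x = np.arange(n)
# Gaussian blurring matrix standing in for an incoherent imaging operator
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 2.0) ** 2)
f_true = np.zeros(n)
f_true[10], f_true[25] = 1.0, 0.6        # two point sources
g = A @ f_true + 0.01 * rng.normal(size=n)

# minimize ||A f - g||^2 + lam ||f||^2  subject to f >= 0,
# written as an augmented non-negative least-squares problem
lam = 1e-2
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
g_aug = np.concatenate([g, np.zeros(n)])
f_hat, _ = nnls(A_aug, g_aug)
print(f_hat[8:13].sum(), f_hat[23:28].sum())   # recovered mass near each source
```

    The positivity constraint is what keeps the deconvolved estimate from oscillating below zero, which is the role the quadratic-programming step plays in the abstract's algorithm.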

  18. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  19. Probabilistic cloning of three symmetric states

    SciTech Connect

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-12-15

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  20. Parallel and Distributed Systems for Probabilistic Reasoning

    DTIC Science & Technology

    2012-12-01

    Fragmentary excerpts from the indexed report: table-of-contents entries ("High-Level Abstractions"; "Future Work"; "Scalable Online Probabilistic Reasoning"); a note that material for one chapter can be obtained from the online repository at http://gonzalezlabs/thesis; and, from Section 3.1 ("Belief Propagation"), the observation that this core operation in probabilistic models is not strictly novel, and that for online inference Russell and Norvig [1995] used the notion of Fixed Lag Smoothing to eliminate the ...

  1. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening effects). Due to their inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models for metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
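
    The crack-growth/equivalent-initial-flaw-size idea can be sketched with a closed-form Paris-law integration and Monte Carlo sampling of the initial flaw size. This is a generic illustration with invented parameter values, not the dissertation's calibrated model.

```python
import numpy as np

def cycles_to_failure(a0, ac, C, m, delta_sigma, Y=1.0):
    """Closed-form integration of Paris' law da/dN = C * (dK)^m with
    dK = Y * delta_sigma * sqrt(pi * a), from flaw size a0 to ac (m != 2).
    Units here: a in m, delta_sigma in MPa, C in (m/cycle)/(MPa*sqrt(m))^m."""
    k = C * (Y * delta_sigma * np.sqrt(np.pi)) ** m
    return (ac ** (1 - m / 2) - a0 ** (1 - m / 2)) / (k * (1 - m / 2))

# Monte Carlo over an uncertain (lognormal) equivalent initial flaw size
rng = np.random.default_rng(3)
a0 = rng.lognormal(mean=np.log(1e-4), sigma=0.3, size=10000)   # ~0.1 mm flaws
N = cycles_to_failure(a0, ac=0.01, C=1e-11, m=3.0, delta_sigma=100.0)
print(f"median life ~ {np.median(N):.3g} cycles")
print(f"1st-percentile life ~ {np.percentile(N, 1):.3g} cycles")
```

    The spread between the median and the low percentiles is exactly the kind of output a reliability-based maintenance framework would act on.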

  2. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Because reliable forecasts are a prerequisite for effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments are presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and an unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework is presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and an unreliable forecast. Application of this verification system is also examined within a real forecasting case study, highlighting the additional statistical power provided by the use of the Poisson-Binomial distribution.
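
    The Poisson-Binomial distribution referred to here (the number of successes among independent, non-identical Bernoulli trials) has an exact pmf computable by convolution. A minimal sketch with invented forecast probabilities:

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """Exact pmf of the number of successes among independent Bernoulli
    trials with (possibly different) probabilities, built by convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1 - p, p])
    return pmf

# forecast probabilities assigned to five events (invented values)
probs = [0.9, 0.8, 0.7, 0.6, 0.5]
pmf = poisson_binomial_pmf(probs)
print(pmf.sum())        # 1.0 up to rounding
print(np.argmax(pmf))   # most likely number of occurrences: 4
```

    Comparing the observed count of events against this exact distribution is the hypothesis test the abstract argues maximizes statistical power.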

  3. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  4. Nitrate tolerance.

    PubMed

    Parker, J O

    1987-11-16

    The organic nitrates are the most widely used agents in the management of patients with angina pectoris. When initially administered by the oral route, the nitrates produce profound changes in systemic hemodynamics and significant and prolonged improvement in exercise duration. It has been shown that during short periods of regular oral nitrate administration, the hemodynamic, antiischemic and antianginal effects of the nitrates are greatly reduced. Thus, when initially administered, oral isosorbide dinitrate prolongs exercise duration for a period of several hours, but during sustained 4-times-daily therapy, exercise tolerance is improved for only 2 hours after administration. Studies with transdermal preparations of isosorbide dinitrate and nitroglycerin also show improvement during short-term administration for up to 8 hours, but after several days of once-daily therapy, the effects of these agents are similar to placebo. It is apparent that nitrate tolerance is a clinically relevant problem. Although tolerance develops rapidly during nitrate therapy, it is reversed promptly during nitrate-free periods. Oral nitrates maintain their antianginal effects when given 2 or 3 times daily with provision of a nitrate-free period. Studies are currently underway to investigate the effects of intermittent administration schedules with transdermal nitrate preparations.

  5. A novel Bayesian imaging method for probabilistic delamination detection of composite materials

    NASA Astrophysics Data System (ADS)

    Peng, Tishun; Saxena, Abhinav; Goebel, Kai; Xiang, Yibing; Sankararaman, Shankar; Liu, Yongming

    2013-12-01

    A probabilistic framework for location and size determination for delamination in carbon-carbon composites is proposed in this paper. A probability image of delaminated area using Lamb wave-based damage detection features is constructed with the Bayesian updating technique. First, the algorithm for the probabilistic delamination detection framework using the proposed Bayesian imaging method (BIM) is presented. Next, a fatigue testing setup for carbon-carbon composite coupons is described. The Lamb wave-based diagnostic signal is then interpreted and processed. Next, the obtained signal features are incorporated in the Bayesian imaging method for delamination size and location detection, as well as the corresponding uncertainty bounds prediction. The damage detection results using the proposed methodology are compared with x-ray images for verification and validation. Finally, some conclusions are drawn and suggestions made for future works based on the study presented in this paper.
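
    The Bayesian-updating step at the core of such an imaging method can be illustrated as a per-pixel posterior update. A toy sketch with invented likelihood values, not the paper's BIM formulation:

```python
import numpy as np

def bayes_update(prior, like_damage, like_healthy):
    """Per-pixel Bayesian update of the delamination probability image,
    given likelihoods of the observed feature under each hypothesis."""
    num = prior * like_damage
    return num / (num + (1 - prior) * like_healthy)

prob = np.full((4, 4), 0.1)          # prior probability image
like_dam = np.ones((4, 4))
like_dam[1, 2] = 9.0                 # feature strongly supports damage at (1, 2)
like_heal = np.ones((4, 4))
prob = bayes_update(prob, like_dam, like_heal)
print(prob[1, 2], prob[0, 0])        # updated pixel vs. unchanged pixels
```

    Repeating this update over many actuator-sensor paths concentrates the probability image around the delaminated region, which is what the x-ray comparison in the paper is verifying.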

  6. Probabilistic analysis of deposit liquefaction

    SciTech Connect

    Loh, C.H.; Cheng, C.R.; Wen, Y.K.

    1995-12-31

    This paper presents a procedure to perform the risk analysis for ground failure by liquefaction. Liquefaction is defined as the result of cumulative damage caused by seismic loading. The fatigue life of soil can be determined on the basis of the S-N relationship and Miner's cumulative damage law. The rain-flow method is used to count the number of cycles of stress response of the soil deposit. Finally, the probability of liquefaction is obtained by integrating over all possible ground motions and the fragility curves of liquefaction potential.
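
    The rain-flow counting and Miner's-rule combination mentioned here can be sketched as follows. This is a simplified one-pass rainflow counter (closed cycles count 1.0, residual half-cycles count 0.5) with an invented S-N curve; it is illustrative only, not the paper's soil model.

```python
def rainflow_ranges(series):
    """Simplified one-pass rainflow counter, loosely after Downing & Socie:
    closed full cycles count 1.0; leftover half-cycles count 0.5."""
    # reduce the series to its turning points (local extrema)
    tp = [series[0]]
    for x in series[1:]:
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x                 # still moving in the same direction
        else:
            tp.append(x)
    cycles, stack = [], []
    for point in tp:
        stack.append(point)
        while len(stack) >= 3:
            X = abs(stack[-1] - stack[-2])
            Y = abs(stack[-2] - stack[-3])
            if X < Y:
                break
            cycles.append((Y, 1.0))    # closed full cycle of range Y
            del stack[-3:-1]
    for i in range(len(stack) - 1):    # leftover residue: half cycles
        cycles.append((abs(stack[i + 1] - stack[i]), 0.5))
    return cycles

series = [0, 5, -3, 4, -2, 6, -4, 2, 0]   # invented stress history
cycles = rainflow_ranges(series)
# Miner's rule with an invented S-N curve N(S) = 1e6 * S^(-3)
damage = sum(count * s**3 / 1e6 for s, count in cycles)
print(cycles)
print(f"Miner damage = {damage:.3g}")
```

    Failure is predicted when the accumulated Miner damage reaches 1; integrating such damage sums over a distribution of ground motions gives the liquefaction probability described in the abstract.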

  7. Probabilistic risk assessment familiarization training

    SciTech Connect

    Phillabaum, J.L.

    1989-01-01

    Philadelphia Electric Company (PECo) created a Nuclear Group Risk and Reliability Assessment Program Plan in order to focus the utilization of probabilistic risk assessment (PRA) in support of Limerick Generating Station and Peach Bottom Atomic Power Station. PECo committed to the U.S. Nuclear Regulatory Commission (NRC) to continue its PRA program prior to the issuance of an operating license for Limerick Unit 1. It is believed that increased use of PRA techniques to support activities at Limerick and Peach Bottom will enhance PECo's overall nuclear excellence. PRA familiarization training is designed to be attended once by all nuclear group personnel, so that they understand PRA and its potential effect on their jobs. The training content describes the history of PRA and how it applies to PECo's nuclear activities. Key PRA concepts serve as the foundation for the familiarization training. These key concepts are covered in all classes to facilitate an appreciation of the remaining material, which is tailored to the audience. Some of the concepts covered are comparison of regulatory philosophy to PRA techniques, fundamentals of risk/success, risk equation/risk summation, and fault trees and event trees. Building on the concepts, PRA insights and applications are then described, tailored to the audience.

  8. Probabilistic elastography: estimating lung elasticity.

    PubMed

    Risholm, Petter; Ross, James; Washko, George R; Wells, William M

    2011-01-01

    We formulate registration-based elastography in a probabilistic framework and apply it to study lung elasticity in the presence of emphysematous and fibrotic tissue. The elasticity calculations are based on a Finite Element discretization of a linear elastic biomechanical model. We marginalize over the boundary conditions (deformation) of the biomechanical model to determine the posterior distribution over elasticity parameters. Image similarity is included in the likelihood, an elastic prior is included to constrain the boundary conditions, and a Markov model is used to spatially smooth the inhomogeneous elasticity. We use a Markov Chain Monte Carlo (MCMC) technique to characterize the posterior distribution over elasticity, from which we extract the most probable elasticity as well as the uncertainty of this estimate. Even though registration-based lung elastography with inhomogeneous elasticity is challenging due to the problem's highly underdetermined nature and the sparse image information available in lung CT, we show promising preliminary results on estimating lung elasticity contrast in the presence of emphysematous and fibrotic tissue.
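
    The MCMC characterization of a posterior over elasticity can be illustrated with a bare-bones random-walk Metropolis sampler on a toy 1-D posterior. The paper's actual model is a spatial, FEM-based one; everything below is invented for illustration.

```python
import numpy as np

def metropolis(log_post, x0, step, n, seed=6):
    """Bare-bones random-walk Metropolis sampler for a 1-D posterior."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = np.empty(n)
    for i in range(n):
        xp = x + rng.normal(0.0, step)
        lpp = log_post(xp)
        if np.log(rng.random()) < lpp - lp:   # Metropolis accept/reject
            x, lp = xp, lpp
        samples[i] = x
    return samples

# toy Gaussian posterior over an elasticity parameter (mean 5.0, sd 0.8)
log_post = lambda e: -0.5 * ((e - 5.0) / 0.8) ** 2
s = metropolis(log_post, x0=0.0, step=1.0, n=20000)[5000:]   # discard burn-in
print(f"posterior mean ~ {s.mean():.2f}, sd ~ {s.std():.2f}")
```

    The sample mean plays the role of the "most probable elasticity" summary, and the sample spread is the uncertainty estimate the abstract refers to.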

  9. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    Little work has been done on the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and", written in cursive style as well as hand-print, was extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining students who may continue to produce letter formations as taught during lessons in school, determining students who will develop different forms and/or variations of those letter formations, and determining the number of different types of letter formations.

  10. Optimal probabilistic dense coding schemes

    NASA Astrophysics Data System (ADS)

    Kögler, Roger A.; Neves, Leonardo

    2017-04-01

    Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) the message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only for qudits (d-level quantum systems with d>2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.
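    The two decoding baselines (i) and (ii) have textbook closed-form optima for a pair of pure states, which the separation scheme of the abstract interpolates between: the minimum-error (Helstrom) success probability and the unambiguous-discrimination (IDP) success probability. A minimal check with two assumed example qubit states:

```python
import numpy as np

# Two non-orthogonal states carrying the messages (example states, assumed)
psi0 = np.array([1.0, 0.0])
theta = np.pi / 8
psi1 = np.array([np.cos(theta), np.sin(theta)])

c = abs(psi0 @ psi1)                        # overlap |<psi0|psi1>|

# (i) Minimum-error (Helstrom) strategy: always guess, smallest average error
p_helstrom = 0.5 * (1 + np.sqrt(1 - c**2))

# (ii) Unambiguous (IDP) strategy: never wrong, but may report failure
p_unambiguous = 1 - c

print(p_helstrom, p_unambiguous)
```

    For these states the overlap is large (cos π/8 ≈ 0.92), so strategy (i) succeeds about 69% of the time while (ii) conclusively identifies the message only about 8% of the time, illustrating the confidence/success trade-off the paper's interpolating scheme spans.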

  11. Probabilistic description of traffic flow

    NASA Astrophysics Data System (ADS)

    Mahnke, R.; Kaupužs, J.; Lubashevsky, I.

    2005-03-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic jams. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As a generalization, we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
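    The one-step (birth-death) master-equation dynamics can be simulated directly with Gillespie's algorithm. The rates below are illustrative placeholders, not the physically motivated ansatz the paper seeks; for this particular choice (constant attachment rate a, per-car detachment rate b·n) the stationary cluster-size distribution is Poisson with mean a/b, which gives a check.

```python
import numpy as np

rng = np.random.default_rng(1)

# One-step process for the size n of a car cluster (illustrative rates)
a, b = 3.0, 0.5            # attachment rate, per-car detachment rate
n, t, t_end = 0, 0.0, 2000.0
dwell, sizes = [], []
while t < t_end:
    w_plus, w_minus = a, b * n        # transition rates out of state n
    total = w_plus + w_minus
    dt = rng.exponential(1.0 / total)  # time until the next event
    t += dt
    dwell.append(dt); sizes.append(n)  # time spent in state n
    if rng.random() < w_plus / total:
        n += 1                         # a car joins the jam
    else:
        n -= 1                         # a car escapes the jam

# Time-weighted mean cluster size; stationary law here is Poisson(a/b)
mean_n = np.average(sizes, weights=dwell)
print(mean_n)   # close to a/b = 6
```

    Replacing the constant rates with density-dependent ones is exactly where the paper's "physically motivated ansatz" enters.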

  12. Symbolic representation of probabilistic worlds.

    PubMed

    Feldman, Jacob

    2012-04-01

    Symbolic representation of environmental variables is a ubiquitous and often debated component of cognitive science. Yet notwithstanding centuries of philosophical discussion, the efficacy, scope, and validity of such representation has rarely been given direct consideration from a mathematical point of view. This paper introduces a quantitative measure of the effectiveness of symbolic representation, and develops formal constraints under which such representation is in fact warranted. The effectiveness of symbolic representation hinges on the probabilistic structure of the environment that is to be represented. For arbitrary probability distributions (i.e., environments), symbolic representation is generally not warranted. But in modal environments, defined here as those that consist of mixtures of component distributions that are narrow ("spiky") relative to their spreads, symbolic representation can be shown to represent the environment with a relatively negligible loss of information. Modal environments support propositional forms, logical relations, and other familiar features of symbolic representation. Hence the assumption that our environment is, in fact, modal is a key tacit assumption underlying the use of symbols in cognitive science.

  13. Dynamical systems probabilistic risk assessment

    SciTech Connect

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  14. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
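    The risk integral that couples (i) the ground motion hazard with (ii) the structural fragility can be sketched numerically. All parameters below are assumptions for illustration; a power-law hazard curve with a lognormal fragility also admits a closed form, which gives a check.

```python
import numpy as np
from scipy.stats import lognorm

# Illustrative hazard curve and fragility (assumed, not from the paper)
k0, k = 1e-4, 2.5              # hazard: lambda(im) = k0 * im^-k  (im in g)
theta, beta = 1.0, 0.4         # fragility: median capacity (g), log-std

im = np.linspace(0.05, 5.0, 2000)
haz = k0 * im ** (-k)                           # annual exceedance frequency
frag = lognorm.cdf(im, s=beta, scale=theta)     # P(damage | IM = im)

# Risk integral: annual damage frequency = -integral of P(damage|im) d lambda(im)
lam_c = np.sum(0.5 * (frag[1:] + frag[:-1]) * -np.diff(haz))
print(lam_c)
# Closed form for this pair: k0 * theta^-k * exp(k^2 * beta^2 / 2) ~ 1.65e-4
```

    The post-mainshock adaptation of the paper replaces `haz` with a time-decaying aftershock hazard and `frag` with a damage-conditioned fragility; the integration step is unchanged.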

  15. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    SciTech Connect

    Rabiti, Cristian; Alfonsi, Andrea; Mandelli, Diego; Cogliati, Joshua; Kinoshita, Robert

    2014-06-01

    RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) pathway [2] under the Department of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular, this task is focused on the synergistic development with the RELAP-7 [4] code to advance the state of the art in the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU-hours). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide an adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately, the Monte Carlo approach is inefficient for a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in performing the assessment of the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling
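    Of the sampling strategies mentioned, Latin Hypercube is easy to sketch: one stratified draw per equal-probability bin in each dimension, with the strata decoupled across dimensions by independent shuffles. A minimal stand-alone implementation (this is not RAVEN's API):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified sample per equal-probability bin in each dimension."""
    # Row i lands in stratum [i/n, (i+1)/n) of every dimension...
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    # ...then each column is shuffled so strata pair up randomly across dims
    for j in range(n_dims):
        rng.shuffle(u[:, j])
    return u                    # uniform on [0,1)^d, one point per row

rng = np.random.default_rng(42)
x = latin_hypercube(10, 2, rng)
# Each dimension gets exactly one point in each of the 10 strata:
print(np.sort((x * 10).astype(int), axis=0)[:, 0])   # [0 1 2 3 4 5 6 7 8 9]
```

    Mapping each column through an inverse CDF then turns these uniform samples into stratified draws from the input distributions.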

  16. Probabilistic Methodology for Estimation of Number and Economic Loss (Cost) of Future Landslides in the San Francisco Bay Region, California

    USGS Publications Warehouse

    Crovelli, Robert A.; Coe, Jeffrey A.

    2008-01-01

    The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5-6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.
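    The "laws of expectation and variance" behind a PLACES-style estimate reduce to compound-sum formulas: a random number N of landslides, each with an independent random cost C. The per-landslide cost figures below are assumptions chosen only so the regional totals resemble the abstract's numbers.

```python
# Compound-sum logic (all numbers illustrative)
EN, VarN = 330.0, 330.0     # mean/variance of the 5-year landslide count
EC, VarC = 0.23, 0.05       # mean/variance of cost per landslide, $ million

E_total = EN * EC                        # E[sum] = E[N] * E[C]
Var_total = EN * VarC + VarN * EC**2     # law of total variance for the sum
print(round(E_total, 1), round(Var_total, 3))   # 75.9 33.957
```

    The spreadsheet applies the same two identities county by county and then sums the county-level expectations.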

  17. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
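    The facility-centric integration can be sketched by Monte Carlo: draw strokes from the error-ellipse density and count the fraction landing inside the radius of concern. The covariance, offset, and radius below are illustrative; the centered isotropic case doubles as a check against the Rayleigh closed form.

```python
import numpy as np

rng = np.random.default_rng(7)

def p_within(mean, cov, center, radius, n=400_000):
    """Monte Carlo estimate of P(stroke lands within `radius` of `center`)."""
    pts = rng.multivariate_normal(mean, cov, size=n)
    return np.mean(np.linalg.norm(pts - center, axis=1) <= radius)

# Sanity check against the closed form for a centered circular "ellipse":
# miss distance is Rayleigh, so P = 1 - exp(-R^2 / (2 sigma^2)).
sigma, R = 0.5, 1.0
p_mc = p_within([0, 0], np.eye(2) * sigma**2, [0, 0], R)
p_exact = 1 - np.exp(-R**2 / (2 * sigma**2))

# Off-center facility with a tilted error ellipse -- the case the closed
# form cannot handle (all numbers illustrative, units km):
cov = np.array([[0.25, 0.10],
                [0.10, 0.49]])
p_off = p_within([0, 0], cov, [0.5, 0.3], R)
print(p_mc, p_exact, p_off)
```

    The production technique integrates the density analytically rather than by sampling, but the quantity computed is the same.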

  18. Advanced Seismic Probabilistic Risk Assessment Demonstration Project Plan

    SciTech Connect

    Coleman, Justin

    2014-09-01

    Idaho National Laboratory (INL) has an ongoing research and development (R&D) project to remove excess conservatism from seismic probabilistic risk assessment (SPRA) calculations. These risk calculations should focus on providing best-estimate results, and associated insights, for evaluation and decision-making. This report presents a plan for improving our current traditional SPRA process using a seismic event recorded at a nuclear power plant site, with known outcomes, to improve the decision-making process. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismically induced core damage event. However, in general this approach has been conservative, and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility).

  19. Demonstrate Ames Laboratory capability in Probabilistic Risk Assessment (PRA)

    SciTech Connect

    Bluhm, D.; Greimann, L.; Fanous, F.; Challa, R.; Gupta, S.

    1993-07-01

    In response to the damage which occurred during the Three Mile Island nuclear accident, the Nuclear Regulatory Commission (NRC) has implemented a Probabilistic Risk Assessment (PRA) program to evaluate the safety of nuclear power facilities during events with a low probability of occurrence. The PRA can be defined as a mathematical technique to identify and rank the importance of event sequences that can lead to a severe nuclear accident. Another PRA application is the evaluation of nuclear containment buildings under earthquake loading. In order to perform a seismic PRA, the two conditional probabilities of ground motion and of structural failure of the different components given a specific earthquake are first studied. The first of these is termed the probability of exceedance, and the second the seismic fragility analysis. The seismic fragility analysis is then related to the ground motion measured in terms of "g" to obtain a plant-level fragility curve.

  20. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability versus time graphs that were in line with aerospace industry norms.

  1. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBC and human lymphocytes has been used to differentiate T cells from B cells. Rosetting assay is commonly used to determine the interaction of Fcγ receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments of rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10⁻⁵ μm⁴ for FcγRIII-IgG interaction, 4.66 × 10⁻³ μm⁴ for P-selectin-PSGL-1 interaction, and 0.94 × 10⁻³ μm⁴ for L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
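    The Poissonian size model can be sketched in a few lines: take the mean rosette size as the product of the two-dimensional affinity, the two site densities, and an effective contact area, then sample the predicted Poisson distribution. The affinity below is the FcγRIII-IgG value quoted in the abstract; the densities, contact area, and the exact product form of the mean are assumptions for illustration.

```python
import numpy as np

Ka2d = 7.19e-5           # apparent 2-D binding affinity, um^4 (from abstract)
m_r, m_l = 270.0, 50.0   # receptor / ligand site densities, um^-2 (assumed)
area = 3.0               # effective contact area per RBC, um^2 (assumed)

# Mean rosette size proportional to affinity and site densities
mean_size = Ka2d * m_r * m_l * area

# Predicted Poissonian distribution of rosette sizes across target cells
rng = np.random.default_rng(9)
sizes = rng.poisson(mean_size, size=10_000)
print(round(mean_size, 3), round(sizes.mean(), 3))
```

    Fitting the observed size histogram against this one-parameter Poisson law is what lets the assay report an effective affinity.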

  2. Validation of seismic probabilistic risk assessments of nuclear power plants

    SciTech Connect

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.
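    The HCLPF plant fragilities discussed above have a standard quantitative core: a lognormal fragility curve with median capacity Am, randomness β_R, and modeling uncertainty β_U. A sketch with assumed parameters (the 1.645 and 2.326 factors are the usual standard-normal quantiles for 95% and ~1%):

```python
import math

# Illustrative lognormal fragility parameters (assumed, not from the paper)
Am = 0.87          # median ground-acceleration capacity (g)
beta_r = 0.25      # logarithmic std dev: inherent randomness
beta_u = 0.35      # logarithmic std dev: modeling (epistemic) uncertainty

# High-Confidence (95%), Low-Probability-of-Failure (5%) capacity
hclpf = Am * math.exp(-1.645 * (beta_r + beta_u))

# Common composite-variability approximation (~1% point on the mean curve)
beta_c = math.hypot(beta_r, beta_u)
hclpf_c = Am * math.exp(-2.326 * beta_c)
print(round(hclpf, 3), round(hclpf_c, 3))   # 0.324 0.32
```

    The sensitivity findings of the abstract amount to perturbing β_R, β_U, and the fragility's functional form and observing how little the resulting sequence probabilities move compared with perturbing the hazard curve.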

  3. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment.
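    The per-pixel rigid sliding-block analysis can be sketched with Newmark's method: the block accumulates relative velocity only while ground acceleration exceeds the yield acceleration, and displacement until that velocity decays. The ground motion below is a synthetic sine pulse standing in for a real record; everything here is illustrative.

```python
import numpy as np

def newmark_displacement(acc, dt, a_yield):
    """Simplified Newmark rigid sliding-block displacement (one direction)."""
    vel, disp = 0.0, 0.0
    for a in acc:
        if vel > 0 or a > a_yield:
            vel += (a - a_yield) * dt   # excess acceleration drives sliding
            vel = max(vel, 0.0)         # the block cannot slide back uphill
            disp += vel * dt
    return disp

# Synthetic ground motion: 2 s of a 2 Hz sine at 0.4 g peak (illustrative)
dt = 0.005
t = np.arange(0, 2, dt)
acc = 0.4 * 9.81 * np.sin(2 * np.pi * 2 * t)     # m/s^2

d_weak = newmark_displacement(acc, dt, a_yield=0.3 * 9.81)
d_strong = newmark_displacement(acc, dt, a_yield=0.1 * 9.81)
print(d_weak, d_strong)   # lower yield acceleration -> larger displacement
```

    Repeating this for each peak-ground-acceleration bin of the hazard curve, and weighting by that bin's annual rate, yields the per-pixel displacement-exceedance probabilities the maps depict.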

  4. Historical overview of immunological tolerance.

    PubMed

    Schwartz, Ronald H

    2012-04-01

    A fundamental property of the immune system is its ability to mediate self-defense with a minimal amount of collateral damage to the host. The system uses several different mechanisms to achieve this goal, which is collectively referred to as the "process of immunological tolerance." This article provides an introductory historical overview to these various mechanisms, which are discussed in greater detail throughout this collection, and then briefly describes what happens when this process fails, a state referred to as "autoimmunity."

  5. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321

  6. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
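    A minimal example of a numerical routine that returns an uncertainty alongside its answer, in the spirit of the abstract: Monte Carlo integration reporting a Gaussian error model over the true integral. This is plain Monte Carlo, not the Bayesian quadrature and probabilistic solvers the paper surveys.

```python
import numpy as np

rng = np.random.default_rng(3)

def probabilistic_integral(f, n):
    """Estimate integral of f over [0,1] and the std. error of that estimate."""
    y = f(rng.random(n))
    return y.mean(), y.std(ddof=1) / np.sqrt(n)

mean, err = probabilistic_integral(np.exp, 100_000)
# True value is e - 1 = 1.71828...; `err` quantifies the numerical error
print(mean, err)
```

    Downstream computations can propagate `err` instead of treating `mean` as exact, which is precisely the composability argument made in the closing sentence of the abstract.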

  7. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 bn in 2010 and US$ 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1km x 1km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios.
Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to
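    The "annual expected damage" used in such risk maps is typically the integral of event damage over annual exceedance probability across the return periods for which inundation maps exist. A sketch with illustrative damage figures (not GLOFRIS output):

```python
import numpy as np

# Damage per event for a handful of return periods (illustrative figures)
return_periods = np.array([2, 10, 50, 100, 1000])       # years
damages = np.array([0.0, 0.4, 1.5, 2.2, 4.0])           # bn US$ per event

p_exc = 1.0 / return_periods                            # annual exceedance prob.
# Trapezoidal integration of damage d(p); p decreases along the array
ead = np.sum(0.5 * (damages[1:] + damages[:-1]) * -np.diff(p_exc))
print(round(ead, 4))   # expected annual damage, bn US$ -> 0.2024
```

    Repeating this per grid cell, for each Monte Carlo draw of the exposure projections, yields the probabilistic risk maps described above.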

  8. Orchid flowers tolerance to gamma-radiation

    NASA Astrophysics Data System (ADS)

    Kikuchi, Olivia Kimiko

    2000-03-01

    Cut flowers are fresh goods that may be treated with fumigants such as methyl bromide to meet the needs of the quarantine requirements of importing countries. Irradiation is a non-chemical alternative to substitute the methyl bromide treatment of fresh products. In this research, different cut orchids were irradiated to examine their tolerance to gamma-rays. A 200 Gy dose inhibited the Dendrobium phalaenopsis buds from opening, but did not cause visible damage to opened flowers. Doses of 800 and 1000 Gy were damaging because they caused the flowers to drop from the stem. Cattleya irradiated with 750 Gy did not show any damage, and were therefore eligible for the radiation treatment. Cymbidium tolerated up to 300 Gy and above this dose dropped prematurely. On the other hand, Oncidium did not tolerate doses above 150 Gy.

  9. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
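    A one-dimensional Monte Carlo sketch of the kind of variability propagation PEA performs: average daily dose ADD = C × IR / BW, with a distribution on each input. The equation form and every distribution below are assumptions for illustration, not values from the review.

```python
import numpy as np

rng = np.random.default_rng(11)

n = 100_000
C = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)   # concentration, mg/L
IR = rng.normal(1.5, 0.3, size=n).clip(min=0.1)          # intake rate, L/day
BW = rng.normal(70.0, 12.0, size=n).clip(min=30.0)       # body weight, kg

add = C * IR / BW                                        # dose, mg/(kg*day)
# Report the population median and an upper-percentile exposure
print(round(float(np.median(add)), 4), round(float(np.quantile(add, 0.95)), 4))
```

    Decision contexts differ mainly in which tail statistic is reported; two-dimensional PEA would additionally separate interindividual variability from parameter uncertainty with nested loops.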

  10. bayesPop: Probabilistic Population Projections

    PubMed Central

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  11. bayesPop: Probabilistic Population Projections.

    PubMed

    Ševčíková, Hana; Raftery, Adrian E

    2016-12-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  12. A probabilistic approach to spectral graph matching.

    PubMed

    Egozi, Amir; Keller, Yosi; Guterman, Hugo

    2013-01-01

    Spectral Matching (SM) is a computationally efficient approach to approximate the solution of pairwise matching problems that are NP-hard. In this paper, we present a probabilistic interpretation of spectral matching schemes and derive a novel Probabilistic Matching (PM) scheme that is shown to outperform previous approaches. We show that spectral matching can be interpreted as a Maximum Likelihood (ML) estimate of the assignment probabilities and that the Graduated Assignment (GA) algorithm can be cast as a Maximum a Posteriori (MAP) estimator. Based on this analysis, we derive a ranking scheme for spectral matchings based on their reliability, and propose a novel iterative probabilistic matching algorithm that relaxes some of the implicit assumptions used in prior works. We experimentally show our approaches to outperform previous schemes when applied to exhaustive synthetic tests as well as the analysis of real image sequences.
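    A minimal sketch of the spectral-matching idea this record builds on: the leading eigenvector of a pairwise-affinity matrix scores candidate assignments, and normalizing it gives the probability-like interpretation the authors formalize. The toy affinity matrix and the power-iteration solver below are illustrative assumptions, not the paper's PM algorithm.

```python
import numpy as np

def spectral_match_scores(W, iters=200):
    """Leading eigenvector of a symmetric, non-negative affinity matrix W
    via power iteration; its normalized entries act as probability-like
    scores for the candidate assignments."""
    v = np.ones(W.shape[0]) / np.sqrt(W.shape[0])
    for _ in range(iters):
        v = W @ v                 # power iteration step
        v /= np.linalg.norm(v)
    v = np.abs(v)
    return v / v.sum()            # normalize to probability-like scores

# toy affinity: candidate assignments 0 and 1 are mutually consistent,
# assignment 2 is an outlier
W = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
scores = spectral_match_scores(W)
```

    The two mutually consistent assignments receive higher scores than the outlier.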

  13. Probabilistic Cue Combination: Less is More

    PubMed Central

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2012-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the dilution effect, predictions made from the combination of two cues of different strengths are less accurate than those made from the stronger cue alone. Here we show that dilution is an adult problem; 11-month-old infants combine strong and weak predictors normatively. These results extend and add support for the less is more hypothesis: limited cognitive resources can lead children to represent probabilistic information differently from adults, and this difference in representation can have important downstream consequences for prediction. PMID:23432826

  14. Strategies in probabilistic categorization: Results from a new way of analyzing performance

    PubMed Central

    Meeter, Martijn; Myers, Catherine E.; Shohamy, Daphna; Hopkins, Ramona O.; Gluck, Mark A.

    2006-01-01

    The “Weather Prediction” task is widely used for investigating probabilistic category learning; in it, various cues are probabilistically (but not perfectly) predictive of class membership, so a given combination of cues sometimes belongs to one class and sometimes to another. Prior studies showed that subjects can improve their performance with training, and that there is considerable individual variation in the strategies subjects use to approach this task. Here, we discuss a recently introduced analysis of probabilistic categorization, which attempts to identify the strategy followed by a participant. Monte Carlo simulations show that the analysis can indeed reliably identify such a strategy if it is used, and can identify switches from one strategy to another. Analysis of data from normal young adults shows that the fitted strategy can predict subsequent responses. Moreover, learning is shown to be highly nonlinear in probabilistic categorization. Analysis of the performance of patients with dense memory impairments due to hippocampal damage shows that although these patients can change strategies, they are as likely to fall back to an inferior strategy as to move to more optimal ones. PMID:16547162

  15. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. A degradation monitoring system is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup, in order to keep the reactor in a safe state, to meet the designed reactor core lifetime, and to optimize scheduled maintenance. Model-based methods determine the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnosis of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing a filter's inability to estimate an abrupt change (the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering), we developed techniques that modify filtering algorithms by utilizing additional data sources to improve the filter's response. We utilized a reliability degradation database, which can be constructed from plant-specific operational experience and test and maintenance reports, to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities against the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis-Hastings algorithm, a well-known Markov chain Monte Carlo (MCMC) method. This multiple hypothesis testing
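    The core mechanism described above - a particle filter whose proposal is enriched from known degradation modes so it can follow an abrupt nominal-to-degraded jump - can be sketched as follows. The state model, the noise levels, and the single hypothetical degradation mode at mean 2.0 are assumptions for illustration, not the dissertation's models.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(obs, n=500, obs_sigma=0.2, degraded_mean=2.0):
    """Bootstrap particle filter for a piecewise-constant state.
    Each step, a small fraction of particles is drawn from a
    'degradation-mode' proposal (hypothetical mean `degraded_mean`),
    so the filter can react to an abrupt jump despite sample
    impoverishment."""
    particles = rng.normal(0.0, 0.1, n)   # start near the nominal state 0
    estimates = []
    for y in obs:
        k = n // 20                       # inject 5% from the proposal
        particles[:k] = rng.normal(degraded_mean, 0.1, k)
        particles = particles + rng.normal(0.0, 0.02, n)  # process jitter
        w = np.exp(-0.5 * ((y - particles) / obs_sigma) ** 2)
        w /= w.sum()
        estimates.append(float(w @ particles))
        particles = particles[rng.choice(n, size=n, p=w)]  # resample
    return estimates

# true state jumps from 0 (nominal) to 2 (degraded) halfway through
true_state = [0.0] * 25 + [2.0] * 25
obs = [s + rng.normal(0.0, 0.2) for s in true_state]
est = particle_filter(obs)
```

    Before the jump the injected particles carry negligible weight; after it they dominate the resampling, so the estimate snaps to the degraded state.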

  16. A Probabilistic Cell Tracking Algorithm

    NASA Astrophysics Data System (ADS)

    Steinacker, Reinhold; Mayer, Dieter; Leiding, Tina; Lexer, Annemarie; Umdasch, Sarah

    2013-04-01

    The research described below was carried out during the EU project Lolight - development of a low-cost, novel and accurate lightning mapping and thunderstorm (supercell) tracking system. The project aims to develop a small-scale tracking method to determine and nowcast characteristic trajectories and velocities of convective cells and cell complexes. The results of the algorithm will provide higher accuracy than current locating systems distributed on a coarse scale. Input data for the developed algorithm are two temporally separated lightning density fields. Additionally, a Monte Carlo method minimizing a cost function is utilized, which leads to a probabilistic forecast for the movement of thunderstorm cells. In the first step, the correlation coefficients between the first and the second density field are computed; the first field is shifted by all shifting vectors which are physically allowed. The maximum length of each vector is determined by the maximum possible speed of thunderstorm cells and the difference in time between both density fields. To eliminate ambiguities in the determination of directions and velocities, the so-called Random Walker of the Monte Carlo process is used. Using this method, a grid point is selected at random. Moreover, one vector out of all predefined shifting vectors is suggested - also at random, but with a probability that is related to the correlation coefficient. If this exchange of shifting vectors reduces the cost function, the new direction and velocity are accepted. Otherwise it is discarded. This process is repeated until the change of cost function falls below a defined threshold. The Monte Carlo run gives information about the percentage of accepted shifting vectors for all grid points. In the course of the forecast, amplifications of cell density are permitted. For this purpose, intensity changes between the investigated areas of both density fields are taken into account. 
Knowing the direction and speed of thunderstorm
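    The random-walker step described above - pick a grid point at random, propose a shift vector with probability tied to its correlation coefficient, and accept the exchange only if the cost function decreases - can be sketched on a toy 1-D grid. The correlation table, the smoothness term and its weight are invented for illustration.

```python
import random

random.seed(1)

# hypothetical setup: 10 grid points, 5 candidate shift vectors (pixels);
# correlation peaks at shift +1 everywhere, mimicking a uniformly moving cell
SHIFTS = [-2, -1, 0, 1, 2]
CORR = [{s: 1.0 - abs(s - 1) / 4 for s in SHIFTS} for _ in range(10)]

def cost(assign):
    """Penalize low correlation and rough (neighbour-inconsistent) fields."""
    mismatch = sum(1.0 - CORR[i][s] for i, s in enumerate(assign))
    roughness = sum(abs(assign[i] - assign[i + 1])
                    for i in range(len(assign) - 1))
    return mismatch + 0.1 * roughness

assign = [random.choice(SHIFTS) for _ in range(10)]
c0 = cost(assign)
for _ in range(2000):
    i = random.randrange(10)                    # random grid point
    weights = [CORR[i][s] for s in SHIFTS]      # propose ~ correlation
    s_new = random.choices(SHIFTS, weights=weights)[0]
    trial = assign[:i] + [s_new] + assign[i + 1:]
    if cost(trial) < cost(assign):              # keep only improvements
        assign = trial
```

    With these settings the field relaxes to the correlation-maximizing shift (+1) at every grid point.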

  17. 7 CFR 51.2544 - Tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....S. artificially opened U.S. non-split External (shell) Defects (tolerances by weight): (a) Non-split...) Damage by other means 1 1 2 3 10 N/A (e) Total External Defects 9 16 N/A N/A N/A N/A (f) Undersized (Less than 30/64 inch in diameter) 5 5 5 5 4 5 Table II—Tolerances Factorinternal (kernel)...

  18. 7 CFR 51.2544 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... U.S. No. 1 U.S. select U.S. artificially opened U.S. non-split External (shell) Defects (tolerances... stained, included in (c) 2 3 3 3 3 3 (d) Damage by other means 1 1 2 3 10 N/A (e) Total External Defects 9... Factorinternal (kernel) defects (tolerances by weight) U.S. fancy(percent) U.S. extraNo. 1 (percent) U.S. No....

  19. 7 CFR 51.2544 - Tolerances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....S. artificially opened U.S. non-split External (shell) Defects (tolerances by weight): (a) Non-split...) Damage by other means 1 1 2 3 10 N/A (e) Total External Defects 9 16 N/A N/A N/A N/A (f) Undersized (Less than 30/64 inch in diameter) 5 5 5 5 4 5 Table II—Tolerances Factorinternal (kernel)...

  20. 7 CFR 51.2544 - Tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... U.S. No. 1 U.S. select U.S. artificially opened U.S. non-split External (shell) Defects (tolerances... stained, included in (c) 2 3 3 3 3 3 (d) Damage by other means 1 1 2 3 10 N/A (e) Total External Defects 9... Factorinternal (kernel) defects (tolerances by weight) U.S. fancy(percent) U.S. extraNo. 1 (percent) U.S. No....

  1. Why are probabilistic laws governing quantum mechanics and neurobiology?

    NASA Astrophysics Data System (ADS)

    Kröger, Helmut

    2005-08-01

    We address the question: why are the dynamical laws governing quantum mechanics and neuroscience probabilistic rather than deterministic? We discuss some ideas showing that the probabilistic option offers advantages over the deterministic one.

  2. Towards Probabilistic Modelling in Event-B

    NASA Astrophysics Data System (ADS)

    Tarasyuk, Anton; Troubitsyna, Elena; Laibinis, Linas

    Event-B provides us with a powerful framework for correct-by-construction system development. However, while developing dependable systems we should not only guarantee their functional correctness but also quantitatively assess their dependability attributes. In this paper we investigate how to conduct probabilistic assessment of reliability of control systems modeled in Event-B. We show how to transform an Event-B model into a Markov model amenable to probabilistic reliability analysis. Our approach enables integration of reasoning about correctness with quantitative analysis of reliability.

  3. Probabilistic assessment of uncertain adaptive hybrid composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.

  4. A Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2000-01-01

    A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.
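    The workflow this record describes - propagate probability distributions on component efficiencies through a performance model, then read off a cumulative distribution function and a sensitivity ranking - can be sketched with a toy linear response surface. The means, scatter, and thrust model below are assumptions, not the NASA study's engine model.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 10000  # Monte Carlo samples

# assumed uncertain inputs: fan and compressor efficiencies
eta_fan = rng.normal(0.90, 0.010, N)
eta_comp = rng.normal(0.85, 0.015, N)

# hypothetical linear response surface for net thrust (not an engine model)
thrust = 100.0 * (1.0 + 2.0 * (eta_fan - 0.90) + 1.0 * (eta_comp - 0.85))

# cumulative probability at the nominal thrust, and a simple sensitivity
# ranking from input-output correlation
p_below = float(np.mean(thrust <= 100.0))
sens_fan = float(np.corrcoef(eta_fan, thrust)[0, 1])
sens_comp = float(np.corrcoef(eta_comp, thrust)[0, 1])
```

    The correlation-based sensitivities reproduce the qualitative output of such an assessment: the fan efficiency, with the larger leverage on thrust, ranks first.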

  5. The probabilistic approach to human reasoning.

    PubMed

    Oaksford, M; Chater, N

    2001-08-01

    A recent development in the cognitive science of reasoning has been the emergence of a probabilistic approach to the behaviour observed on ostensibly logical tasks. According to this approach the errors and biases documented on these tasks occur because people import their everyday uncertain reasoning strategies into the laboratory. Consequently participants' apparently irrational behaviour is the result of comparing it with an inappropriate logical standard. In this article, we contrast the probabilistic approach with other approaches to explaining rationality, and then show how it has been applied to three main areas of logical reasoning: conditional inference, Wason's selection task and syllogistic reasoning.

  6. Probabilistic modeling of condition-based maintenance strategies and quantification of its benefits for airliners

    NASA Astrophysics Data System (ADS)

    Pattabhiraman, Sriram

    Airplane fuselage structures are designed with the concept of damage tolerance, wherein small damage is allowed to remain on the airplane, while damage that would otherwise affect the safety of the structure is repaired. Damage critical to the safety of the fuselage is repaired by scheduling maintenance at pre-determined intervals. Scheduling maintenance is an interesting trade-off between damage tolerance and cost: tolerance of larger damage would require less frequent maintenance, and hence a lower cost, to maintain a certain level of reliability. Alternatively, condition-based maintenance techniques have been developed using on-board sensors, which track damage continuously and request maintenance only when the damage size crosses a particular threshold. This permits tolerance of larger damage than scheduled maintenance allows, leading to savings in cost. This work quantifies the savings of condition-based maintenance over scheduled maintenance, and also quantifies the conversion of those cost savings into weight savings. Structural health monitoring (SHM) will need time to establish itself as a stand-alone system for maintenance, owing to concerns about its diagnosis accuracy and reliability, so this work also investigates the effect of synchronizing a structural health monitoring system with scheduled maintenance. On-board SHM equipment is used to skip structural airframe maintenance (a subset of scheduled maintenance) whenever it is deemed unnecessary, while maintaining a desired level of structural safety. The work also predicts the necessary maintenance for a fleet of airplanes based on the current damage status of the airplanes, and analyzes the possibility of false alarms, wherein maintenance is requested with no critical damage on the airplane. Finally, SHM is used as a tool to identify lemons in a fleet of airplanes: those airplanes that would warrant more maintenance trips than the average behavior of the fleet.

  7. Flooding tolerance of forage legumes.

    PubMed

    Striker, Gustavo G; Colmer, Timothy D

    2016-06-20

    We review waterlogging and submergence tolerances of forage (pasture) legumes. Growth reductions from waterlogging in perennial species ranged from >50% for Medicago sativa and Trifolium pratense to <25% for Lotus corniculatus, L. tenuis, and T. fragiferum. For annual species, waterlogging reduced Medicago truncatula by ~50%, whereas Melilotus siculus and T. michelianum were not reduced. Tolerant species have higher root porosity (gas-filled volume in tissues) owing to aerenchyma formation. Plant dry mass (waterlogged relative to control) had a positive (hyperbolic) relationship to root porosity across eight species. Metabolism in hypoxic roots was influenced by internal aeration. Sugars accumulate in M. sativa due to growth inhibition from limited respiration and low energy in roots of low porosity (i.e. 4.5%). In contrast, L. corniculatus, with higher root porosity (i.e. 17.2%) and O2 supply allowing respiration, maintained growth better and sugars did not accumulate. Tolerant legumes form nodules, and internal O2 diffusion along roots can sustain metabolism, including N2 fixation, in submerged nodules. Shoot physiology depends on species tolerance. In M. sativa, photosynthesis soon declines and in the longer term (>10 d) leaves suffer chlorophyll degradation, damage, and N, P, and K deficiencies. In tolerant L. corniculatus and L. tenuis, photosynthesis is maintained longer, shoot N is less affected, and shoot P can even increase during waterlogging. Species also differ in tolerance of partial and complete shoot submergence. Gaps in knowledge include anoxia tolerance of roots, N2 fixation during field waterlogging, and identification of traits conferring the ability to recover after water subsides.

  8. Bounding probabilistic sea-level projections within the framework of the possibility theory

    NASA Astrophysics Data System (ADS)

    Le Cozannet, Gonéri; Manceau, Jean-Charles; Rohmer, Jeremy

    2017-01-01

    Despite progress in climate change science, projections of future sea-level rise remain highly uncertain, especially owing to large unknowns in the melting processes affecting the ice sheets in Greenland and Antarctica. Based on climate-model outcomes and the expertise of scientists concerned with these issues, the IPCC provided constraints on the quantiles of sea-level projections. Moreover, additional physical limits to future sea-level rise have been established, albeit approximately. However, many probability functions can comply with this imprecise knowledge. In this contribution, we provide a framework based on extra-probabilistic theories (namely the possibility theory) to model the uncertainties in sea-level rise projections by 2100 under the RCP 8.5 scenario. The results provide a concise representation of uncertainties in future sea-level rise and of their intrinsically imprecise nature, including a maximum bound on the total uncertainty. Today, coastal impact studies are increasingly moving away from deterministic sea-level projections, which underestimate the expected damages and adaptation needs compared with probabilistic laws. However, we show that the probability functions used so far have explored only a rather conservative subset of the sea-level projections compliant with the IPCC. As a consequence, coastal impact studies relying on these probabilistic sea-level projections are expected to underestimate the possibility of large damages and adaptation needs.
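    A minimal sketch of the possibility-theory framing used here: a trapezoidal possibility distribution encodes a "fully possible" core range and approximate physical bounds, and the possibility of an exceedance event upper-bounds its probability over all compliant probability functions. The numeric core and support below are illustrative, not the paper's calibrated values.

```python
import numpy as np

CORE = (0.5, 1.0)     # assumed 'fully possible' range (m of rise by 2100)
SUPPORT = (0.2, 2.5)  # assumed approximate physical bounds (m)

def possibility(x):
    """Trapezoidal possibility distribution: 1 on the core, falling
    linearly to 0 at the edges of the support."""
    a, d = SUPPORT
    b, c = CORE
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def upper_prob_exceed(s):
    """Upper bound on P(rise > s) over all probability functions
    consistent with the possibility distribution: the supremum of
    the possibility over (s, SUPPORT[1])."""
    if s >= SUPPORT[1]:
        return 0.0
    xs = np.linspace(s, SUPPORT[1], 1000)
    return float(max(possibility(x) for x in xs))
```

    Any probability function the IPCC constraints admit must assign an exceedance probability no larger than this possibility-based bound.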

  9. Probabilistic Grammars for Natural Languages. Psychology Series.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The purpose of this paper is to define the framework within which empirical investigations of probabilistic grammars can take place and to sketch how this attack can be made. The full presentation of empirical results will be left to other papers. In the detailed empirical work, the author has depended on the collaboration of E. Gammon and A.…

  10. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure, with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
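    The benchmark problem lends itself to a quick Monte Carlo cross-check: sample the internal pressure, push each sample through the elastic Lamé solution for radial stress, and read off the empirical CDF. The geometry and pressure distribution below are assumed, and this elastic sketch omits the plasticity and Weibull yield stress that the NESSUS study includes.

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical geometry: inner radius a, outer radius b, evaluation radius r
a, b, r = 1.0, 2.0, 1.5

def radial_stress(p):
    """Elastic Lamé solution for the radial stress in a thick-walled
    cylinder under internal pressure p (compressive stress is negative)."""
    return (p * a**2 / (b**2 - a**2)) * (1.0 - b**2 / r**2)

# internal pressure as a normal random variable (assumed mean and scatter)
p_samples = rng.normal(100.0, 10.0, 20000)
stress = radial_stress(p_samples)

def cdf(s):
    """Empirical CDF of the radial stress: P(stress <= s)."""
    return float(np.mean(stress <= s))
```

    Because the elastic response is linear in pressure, the stress CDF here is just a rescaled normal; the value of NESSUS lies in handling the nonlinear, plastic case where no such shortcut exists.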

  11. A probabilistic approach to composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
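    A Monte Carlo sketch in the spirit of this record: sample the fiber modulus, matrix modulus, and fiber volume ratio, push each sample through the rule of mixtures for the longitudinal ply modulus, and summarize the resulting scatter. The graphite/epoxy means and scatter are assumed, and the rule of mixtures stands in for the full micromechanics model.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50000  # Monte Carlo samples

# assumed graphite/epoxy scatter (means and standard deviations)
Ef = rng.normal(230e9, 11.5e9, N)   # fiber modulus, Pa (5% COV)
Em = rng.normal(3.5e9, 0.175e9, N)  # matrix modulus, Pa (5% COV)
kf = rng.normal(0.60, 0.02, N)      # fiber volume ratio

# rule of mixtures for the longitudinal ply modulus E11
E11 = kf * Ef + (1.0 - kf) * Em

mean_E11 = float(E11.mean())
cov_E11 = float(E11.std() / E11.mean())
```

    The simulated coefficient of variation of E11 shows how constituent-level scatter propagates to the ply property.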

  12. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  13. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is an incrementally updated Lagrangian formulation. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load and the loading rate are the dominant uncertainties, in that order.

  14. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  15. Pigeons' Discounting of Probabilistic and Delayed Reinforcers

    ERIC Educational Resources Information Center

    Green, Leonard; Myerson, Joel; Calvert, Amanda L.

    2010-01-01

    Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…

  16. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  17. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. 
The six parameters are space-to-chord ratio (SBYC), stagger angle

  18. Advanced Test Reactor probabilistic risk assessment methodology and results summary

    SciTech Connect

    Eide, S.A.; Atkinson, S.A.; Thatcher, T.A.

    1992-01-01

    The Advanced Test Reactor (ATR) probabilistic risk assessment (PRA) Level 1 report documents a comprehensive and state-of-the-art study to establish and reduce the risk associated with operation of the ATR, expressed as a mean frequency of fuel damage. The ATR Level 1 PRA effort is unique and outstanding because of its consistent and state-of-the-art treatment of all facets of the risk study, its comprehensive and cost-effective risk reduction effort while the risk baseline was being established, and its thorough and comprehensive documentation. The PRA includes many improvements to the state-of-the-art, including the following: establishment of a comprehensive generic data base for component failures, treatment of initiating event frequencies given significant plant improvements in recent years, performance of efficient identification and screening of fire and flood events using code-assisted vital area analysis, identification and treatment of significant seismic-fire-flood-wind interactions, and modeling of large loss-of-coolant accidents (LOCAs) and experiment loop ruptures leading to direct damage of the ATR core. 18 refs.

  19. Advanced neutron source reactor probabilistic flow blockage assessment

    SciTech Connect

    Ramsey, C.T.

    1995-08-01

    The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights into the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.

  20. Sensor Based Engine Life Calculation: A Probabilistic Perspective

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei; Chen, Philip

    2003-01-01

    It is generally known that an engine component will accumulate damage (life usage) during its lifetime of use in a harsh operating environment. The commonly used cycle count for engine component usage monitoring has an inherent range of uncertainty which can be overly costly or potentially less safe from an operational standpoint. With the advance of computer technology, engine operation modeling, and the understanding of damage accumulation physics, it is possible (and desirable) to use the available sensor information to make a more accurate assessment of engine component usage. This paper describes a probabilistic approach to quantify the effects of engine operating parameter uncertainties on the thermomechanical fatigue (TMF) life of a selected engine part. A closed-loop engine simulation with a TMF life model is used to calculate the life consumption of different mission cycles. A Monte Carlo simulation approach is used to generate the statistical life usage profile for different operating assumptions. The probabilities of failure of different operating conditions are compared to illustrate the importance of the engine component life calculation using sensor information. The results of this study clearly show that a sensor-based life cycle calculation can greatly reduce the risk of component failure as well as extend on-wing component life by avoiding unnecessary maintenance actions.
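    The mission-cycle Monte Carlo described above can be sketched as follows: sample an uncertain operating parameter (here a hypothetical metal temperature) for each mission, accumulate Miner's-rule damage against a made-up TMF life curve, and compare failure probabilities under two usage assumptions. None of the numbers come from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 20000  # Monte Carlo runs

def cycles_to_failure(t_metal):
    """Hypothetical TMF life curve: life shrinks as metal temperature rises."""
    return 1.0e4 * np.exp(-(t_metal - 1000.0) / 50.0)

def prob_failure(t_mean, t_sigma, missions, cycles_per_mission=10):
    """Sample a per-mission metal temperature, accumulate Miner's-rule
    damage, and count runs whose cumulative damage reaches 1."""
    damage = np.zeros(N)
    for _ in range(missions):
        t = rng.normal(t_mean, t_sigma, N)
        damage += cycles_per_mission / cycles_to_failure(t)
    return float(np.mean(damage >= 1.0))

p_hot = prob_failure(1050.0, 20.0, missions=400)   # harsher usage profile
p_cool = prob_failure(1000.0, 20.0, missions=400)  # benign usage profile
```

    Two fleets flying the same cycle count but different sensed temperatures end up with very different failure probabilities, which is the argument for sensor-based rather than cycle-count life tracking.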

  1. Low Velocity Impact Damage to Carbon/Epoxy Laminates

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2011-01-01

    Impact damage tends to be more detrimental to a laminate's compression strength than to its tensile strength. Proper use of Non-Destructive Evaluation (NDE) techniques can remove conservatism (weight) from many structures. Test the largest components economically feasible as coupons. If damage tolerance is a design driver, consider different resin systems. Do not use a single knockdown factor to account for damage.

  2. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Application of Tolerances § 51.1405... that at least one pecan which is seriously damaged by live insects inside the shell is...

  3. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (INSPECTION, CERTIFICATION, AND STANDARDS) United States Standards for Grades of Pecans in the Shell 1... tolerance of less than 5 percent, except that at least one pecan which is seriously damaged by live...

  4. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (INSPECTION, CERTIFICATION, AND STANDARDS) United States Standards for Grades of Pecans in the Shell 1... tolerance of less than 5 percent, except that at least one pecan which is seriously damaged by live...

  5. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Application of Tolerances § 51.1405... that at least one pecan which is seriously damaged by live insects inside the shell is...

  6. 7 CFR 51.1405 - Application of tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Application of Tolerances § 51.1405... that at least one pecan which is seriously damaged by live insects inside the shell is...

  7. A probabilistic estimate of maximum acceleration in rock in the contiguous United States

    USGS Publications Warehouse

    Algermissen, Sylvester Theodore; Perkins, David M.

    1976-01-01

    This paper presents a probabilistic estimate of the maximum ground acceleration to be expected from earthquakes occurring in the contiguous United States. It is based primarily upon the historic seismic record, which ranges from very incomplete before 1930 to moderately complete after 1960. Geologic data, primarily the distribution of faults, have been employed only to a minor extent, because most such data have not yet been interpreted with earthquake hazard evaluation in mind. The map provides a preliminary estimate of the relative hazard in various parts of the country. The report provides a method for evaluating the relative importance of the many parameters and assumptions in hazard analysis. The map and methods of evaluation described reflect the current state of understanding and are intended to be useful for engineering purposes in reducing the effects of earthquakes on buildings and other structures. Studies are underway on improved methods for evaluating the relative earthquake hazard of different regions. Comments on this paper are invited to help guide future research and revisions of the accompanying map. The earthquake hazard in the United States has been estimated in a variety of ways since the initial effort by Ulrich (see Roberts and Ulrich, 1950). In general, the earlier maps provided an estimate of the severity of ground shaking or damage, but the frequency of occurrence of the shaking or damage was not given. Ulrich's map showed the distribution of expected damage in terms of no damage (zone 0), minor damage (zone 1), moderate damage (zone 2), and major damage (zone 3). The zones were not defined further and the frequency of occurrence of damage was not suggested. Richter (1959) and Algermissen (1969) estimated the ground motion in terms of maximum Modified Mercalli intensity. Richter used the terms "occasional" and "frequent" to characterize intensity IX shaking, and Algermissen included recurrence curves for various parts of the country in the paper

  8. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... have more than 3 times the tolerance specified, except that at least three defective apples may...

  9. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... have more than 3 times the tolerance specified, except that at least three defective apples may...

  10. 7 CFR 51.2648 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades for Sweet Cherries 1 Tolerances § 51.2648... 2 —(1) U.S. No. 1. 8 percent for cherries which fail to meet the requirements for this grade... damage, including in this latter amount not more than one-half of 1 percent for cherries which...

  11. 7 CFR 51.2648 - Tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades for Sweet Cherries 1 Tolerances § 51.2648... 2 —(1) U.S. No. 1. 8 percent for cherries which fail to meet the requirements for this grade... damage, including in this latter amount not more than one-half of 1 percent for cherries which...

  12. 7 CFR 51.1306 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... percent of the pears in any lot may fail to meet the requirements of grade: Provided, That not more than 5 percent shall be seriously damaged by insects, and not more than 1 percent shall be allowed for decay or internal breakdown. (b) When applying the foregoing tolerances to the combination grade no part of...

  13. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  14. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  15. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of selected metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also 0.0001.

  16. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  17. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of selected metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also 0.0001.

  18. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
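    As a rough illustration of this kind of study, the sketch below propagates scatter in a few thermodynamic variables through a simplified Brayton-cycle efficiency formula and reports a distribution statistic plus correlation-based sensitivity factors. The cycle model and all parameter values are assumptions made for the example, not data from the record:

```python
import random

def brayton_eff(pr, gamma, eta_comp, eta_turb):
    """Ideal Brayton-cycle efficiency, crudely degraded by component
    efficiencies. A deliberately simplified stand-in for a full cycle deck."""
    return eta_comp * eta_turb * (1.0 - pr ** ((1.0 - gamma) / gamma))

def pearson(xs, ys):
    """Sample correlation coefficient, used here as a sensitivity factor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

rng = random.Random(0)
samples, prs, etas_c = [], [], []
for _ in range(20000):
    pr = rng.gauss(15.0, 0.5)       # pressure ratio scatter (assumed)
    eta_c = rng.gauss(0.85, 0.01)   # compressor efficiency scatter (assumed)
    eta_t = rng.gauss(0.90, 0.01)   # turbine efficiency scatter (assumed)
    samples.append(brayton_eff(pr, 1.4, eta_c, eta_t))
    prs.append(pr)
    etas_c.append(eta_c)

median = sorted(samples)[len(samples) // 2]
print("median efficiency:", round(median, 3))
print("sensitivity to pressure ratio:", round(pearson(prs, samples), 2))
print("sensitivity to compressor eff:", round(pearson(etas_c, samples), 2))
```

    Ranking the correlation-based sensitivities identifies which measurements matter most for health determination, which is the selection step the abstract describes.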

  19. Probabilistic Assessment of National Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M.; Chamis, C. C.

    1996-01-01

    A preliminary probabilistic structural assessment of the critical section of the National Wind Tunnel (NWT) is performed using the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) computer code, thereby demonstrating the capabilities of NESSUS in addressing reliability issues of the NWT. Uncertainties in the geometry, material properties, loads, and stiffener location are considered in the reliability assessment. Probabilistic stress, frequency, buckling, fatigue, and proof load analyses are performed. These analyses cover the major global and some local design requirements. Based on the assumed uncertainties, the results indicate a minimum reliability of 0.999 for the NWT. Preliminary life prediction results show that the life of the NWT is governed by fatigue of the welds. A reliability-based proof test assessment is also performed.

  20. Significance testing as perverse probabilistic reasoning

    PubMed Central

    2011-01-01

    Truth claims in the medical literature rely heavily on statistical significance testing. Unfortunately, most physicians misunderstand the underlying probabilistic logic of significance tests and consequently often misinterpret their results. This near-universal misunderstanding is highlighted by means of a simple quiz which we administered to 246 physicians at two major academic hospitals, on which the proportion of incorrect responses exceeded 90%. A solid understanding of the fundamental concepts of probability theory is becoming essential to the rational interpretation of medical information. This essay provides a technically sound review of these concepts that is accessible to a medical audience. We also briefly review the debate in the cognitive sciences regarding physicians' aptitude for probabilistic inference. PMID:21356064

  1. Multiscale/Multifunctional Probabilistic Composite Fatigue

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A multilevel (multiscale/multifunctional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.

  2. Bayesian Probabilistic Projection of International Migration.

    PubMed

    Azose, Jonathan J; Raftery, Adrian E

    2015-10-01

    We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model.

  3. Probabilistically teleporting arbitrary two-qubit states

    NASA Astrophysics Data System (ADS)

    Choudhury, Binayak S.; Dhara, Arpan

    2016-12-01

    In this paper we make use of two non-maximally entangled three-qubit channels for probabilistically teleporting arbitrary two particle states from a sender to a receiver. We also calculate the success probability of the teleportation. In the protocol we use two measurements of which one is a POVM and the other is a projective measurement. The POVM provides the protocol with operational advantage.

  4. Probabilistic Anisotropic Failure Criteria for Composite Materials.

    DTIC Science & Technology

    1987-12-01

    The worksheets were based on Microsoft Excel software. This work analytically described the failure criterion and probabilistic failure states of an anisotropic composite in a combined stress state.

  5. Maritime Threat Detection Using Probabilistic Graphical Models

    DTIC Science & Technology

    2012-01-01

    A CRF, unlike an HMM, can represent local features and does not require feature concatenation. For MLNs, we used Alchemy (Alchemy 2011), an...open source statistical relational learning and probabilistic inferencing package. Alchemy supports generative and discriminative weight learning, and...that Alchemy creates a new formula for every possible combination of the values for a1 and a2 that fit the type specified in their predicate

  6. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (the MAIMS principle): cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and that projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings on human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the
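    A minimal sketch of the MAIMS principle in a Monte Carlo cost roll-up is shown below, with independent three-parameter Weibull cost elements (the correlation structure described in the record is omitted for brevity). All element parameters and budgets are hypothetical:

```python
import random

def project_cost(rng, elements, maims=True):
    """Roll up total project cost from uncertain elements.

    With maims=True, money allocated is money spent: a task never costs
    less than its budget, so underruns are not recovered."""
    total = 0.0
    for loc, scale, shape, budget in elements:
        # Three-parameter Weibull draw: location + Weibull(scale, shape)
        actual = loc + rng.weibullvariate(scale, shape)
        total += max(actual, budget) if maims else actual
    return total

# (location, scale, shape, allocated budget) for three hypothetical elements
elements = [(10.0, 4.0, 1.5, 13.0),
            (20.0, 6.0, 1.5, 25.0),
            (5.0,  2.0, 1.5, 6.5)]

rng = random.Random(1)
ideal = [project_cost(rng, elements, maims=False) for _ in range(20000)]
maims = [project_cost(rng, elements, maims=True) for _ in range(20000)]
print("mean ideal cost:", round(sum(ideal) / len(ideal), 1))
print("mean MAIMS cost:", round(sum(maims) / len(maims), 1))
```

    Because allocated money is spent even when a task underruns, the MAIMS mean sits above the "ideal" mean; that gap is the systematic underestimation the paper warns about.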

  7. The probabilistic structure of planetary contamination models

    NASA Technical Reports Server (NTRS)

    Harrison, J. M.; North, W. D.

    1973-01-01

    The analytical basis for planetary quarantine standards and procedures is presented. The hierarchy of planetary quarantine decisions is explained and emphasis is placed on the determination of mission specifications to include sterilization. The influence of the Sagan-Coleman probabilistic model of planetary contamination on current standards and procedures is analyzed. A classical problem in probability theory which provides a close conceptual parallel to the type of dependence present in the contamination problem is presented.

  8. Probabilistic Network Approach to Decision-Making

    NASA Astrophysics Data System (ADS)

    Nicolis, Grégoire; Nicolis, Stamatios C.

    2015-06-01

    A probabilistic approach to decision-making is developed in which the states of the underlying stochastic process, assumed to be of the Markov type, represent the competing options. The principal parameters determining the dominance of a particular option versus the others are identified and the transduction of information associated to the transitions between states is quantified using a set of entropy-like quantities.

  9. Dynamic competitive probabilistic principal components analysis.

    PubMed

    López-Rubio, Ezequiel; Ortiz-DE-Lazcano-Lobato, Juan Miguel

    2009-04-01

    We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.

  10. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
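    The core idea, working with probability levels directly rather than by brute-force sampling, can be illustrated with a first-order reliability calculation for a linear limit state. This is a textbook sketch under assumed normal variables, not the fast-probability-integration algorithm itself, and all numbers are invented:

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Linear limit state g = R - S (resistance minus stress) with independent
# normal variables; failure occurs when g < 0.
mu_r, sd_r = 900.0, 50.0    # resistance, e.g. MPa (assumed)
mu_s, sd_s = 600.0, 80.0    # stress under random load/geometry (assumed)

beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)   # reliability index
p_fail = phi(-beta)
print(f"reliability index beta = {beta:.2f}, P(failure) = {p_fail:.2e}")
```

    For nonlinear limit states over many random variables, fast probability integration generalizes this by locating the most probable failure point in transformed standard-normal space instead of assuming linearity.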

  11. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes prompted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic-hazard-compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic-hazard-compatible vulnerability functions in terms of a structure-independent intensity measure (e.g., spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for

  12. Probabilistic Graph Layout for Uncertain Network Visualization.

    PubMed

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network-not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
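    The Monte Carlo decomposition step, drawing concrete graph instances from edge-existence probabilities, can be sketched as follows; the tiny graph and its probabilities are invented for illustration:

```python
import random

# A probabilistic graph: edges annotated with existence probabilities
edges = {("A", "B"): 0.9, ("B", "C"): 0.5, ("A", "C"): 0.2}

def sample_instance(rng):
    """Draw one concrete graph instance from the probabilistic graph."""
    return {e for e, p in edges.items() if rng.random() < p}

rng = random.Random(7)
counts = {e: 0 for e in edges}
n = 10000
for _ in range(n):
    for e in sample_instance(rng):
        counts[e] += 1

for e, p in edges.items():
    print(e, "target", p, "observed", round(counts[e] / n, 2))
```

    Each sampled instance would then be laid out (force-directed, in the paper's pipeline) and the resulting node positions accumulated, so that nodes with uncertain incident edges spread out spatially across samples.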

  13. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

    Orthology analysis, that is, finding out whether a pair of homologous genes are orthologs - stemming from a speciation - or paralogs - stemming from a gene duplication - is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree to the corresponding species tree (most commonly performed using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the, typically strong, signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and MrBayesMPR, a probabilistic orthology approach that is based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts.

  14. Multiclient Identification System Using Adaptive Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Siana, Linda; Shou, Yu-Wen; Yang, Chien-Ting

    2010-12-01

    This paper aims at integrating detection and identification of human faces in a more practical and real-time face recognition system. The proposed face detection system is based on the cascade Adaboost method to improve the precision and robustness toward unstable surrounding lightings. Our Adaboost method innovates to adjust the environmental lighting conditions by histogram lighting normalization and to accurately locate the face regions by a region-based-clustering process as well. We also address the problem of multi-scale faces in this paper by using 12 different scales of searching windows and 5 different orientations for each client in pursuit of the multi-view independent face identification. There are two major methodological parts in our face identification system, including PCA (principal component analysis) facial feature extraction and adaptive probabilistic model (APM). The structure of our implemented APM with a weighted combination of simple probabilistic functions constructs the likelihood functions by the probabilistic constraint in the similarity measures. In addition, our proposed method can online add a new client and update the information of registered clients due to the constructed APM. The experimental results eventually show the superior performance of our proposed system for both offline and real-time online testing.

  15. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, quantum amplification limit refers to the property of inevitable noise addition on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition that probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  16. A method for probabilistic flash flood forecasting

    NASA Astrophysics Data System (ADS)

    Hardy, Jill; Gourley, Jonathan J.; Kirstetter, Pierre-Emmanuel; Hong, Yang; Kong, Fanyou; Flamig, Zachary L.

    2016-10-01

    Flash flooding is one of the most costly and deadly natural hazards in the United States and across the globe. This study advances the use of high-resolution quantitative precipitation forecasts (QPFs) for flash flood forecasting. The QPFs are derived from a stormscale ensemble prediction system, and used within a distributed hydrological model framework to yield basin-specific, probabilistic flash flood forecasts (PFFFs). Before creating the PFFFs, it is important to characterize QPF uncertainty, particularly in terms of location, which is the most problematic for hydrological use of QPFs. The SAL methodology (Wernli et al., 2008), which stands for structure, amplitude, and location, is used for this error quantification, with a focus on location. Finally, the PFFF methodology is proposed that produces probabilistic hydrological forecasts. The main advantages of this method are: (1) identifying specific basin scales that are forecast to be impacted by flash flooding; (2) yielding probabilistic information about the forecast hydrologic response that accounts for the locational uncertainties of the QPFs; (3) improving lead time by using stormscale NWP ensemble forecasts; and (4) not requiring multiple simulations, which are computationally demanding.
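    In spirit, a PFFF reduces to the fraction of ensemble members whose simulated hydrologic response exceeds a flash-flood threshold for a basin. The toy rainfall-runoff relation and every number below are assumptions for illustration, not the distributed-model framework used in the paper:

```python
import random

def peak_flow(qpf_mm, basin_km2, runoff_coeff=0.3):
    """Toy rational-method-style response: peak flow scales with rain depth,
    basin area, and a runoff coefficient. Purely illustrative."""
    return runoff_coeff * qpf_mm * basin_km2 / 3.6   # rough m^3/s

def flash_flood_probability(qpf_members, basin_km2, threshold_m3s):
    """Fraction of ensemble QPF members whose response exceeds the threshold."""
    exceed = sum(peak_flow(q, basin_km2) > threshold_m3s for q in qpf_members)
    return exceed / len(qpf_members)

rng = random.Random(3)
members = [max(0.0, rng.gauss(40.0, 15.0)) for _ in range(30)]  # mm of rain
p = flash_flood_probability(members, basin_km2=50.0, threshold_m3s=180.0)
print(f"probability of flash flooding: {p:.2f}")
```

    Because the probability comes from running each QPF member through the same hydrologic response, locational QPF errors translate directly into spread in the exceedance fraction rather than a single yes/no forecast.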

  17. Can crops tolerate acid rain

    SciTech Connect

    Kaplan, J.K.

    1989-11-01

    This brief article describes work by scientists at the ARS Air Quality-Plant Growth and Development Laboratory in Raleigh, North Carolina, that indicates little damage to crops as a result of acid rain. In studies with simulated acid rain and 216 exposed varieties of 18 crops, there were no significant injuries nor was there reduced growth in most species. Results of chronic and acute exposures were correlated in sensitive tomato and soybean plants and in tolerant winter wheat and lettuce plants. These results suggest that 1-hour exposures could be used in the future to screen varieties for sensitivity to acid rain.

  18. Stochastic damage evolution in textile laminates

    NASA Technical Reports Server (NTRS)

    Dzenis, Yuris A.; Bogdanovich, Alexander E.; Pastore, Christopher M.

    1993-01-01

    A probabilistic model utilizing random material characteristics to predict damage evolution in textile laminates is presented. The model is based on dividing each ply into two sublaminas consisting of cells. The probability of cell failure is calculated using stochastic function theory and a maximal strain failure criterion. Three modes of failure are taken into account: fiber breakage, matrix failure in the transverse direction, and matrix or interface shear cracking. Computed failure probabilities are used to reduce cell stiffness based on the mesovolume concept. A numerical algorithm is developed that predicts the damage evolution and deformation history of textile laminates. The effect of scatter in fiber orientation on cell properties is discussed. The influence of the weave on damage accumulation is illustrated with the example of a Kevlar/epoxy laminate.
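
    The cell-failure step described above can be sketched numerically. The snippet below is an illustrative stand-in, not the paper's stochastic-function-theory formulation: it assumes independent Gaussian distributions for the cell strain and the strain-to-failure (parameter values are made up) and cross-checks the closed-form failure probability against a Monte Carlo estimate.

```python
import math
import random

def cell_failure_probability(mu_strain, sd_strain, mu_limit, sd_limit):
    """P(strain > strain-to-failure) for one cell under a maximal strain
    criterion, with independent Gaussian strain and strength."""
    mu_d = mu_strain - mu_limit               # mean of D = strain - limit
    sd_d = math.hypot(sd_strain, sd_limit)    # sd of the difference
    # P(D > 0) for D ~ N(mu_d, sd_d^2), via the Gaussian CDF
    return 0.5 * (1.0 + math.erf(mu_d / (sd_d * math.sqrt(2.0))))

def mc_failure_probability(mu_strain, sd_strain, mu_limit, sd_limit,
                           n=200_000, seed=1):
    """Brute-force Monte Carlo check of the closed form above."""
    rng = random.Random(seed)
    hits = sum(rng.gauss(mu_strain, sd_strain) > rng.gauss(mu_limit, sd_limit)
               for _ in range(n))
    return hits / n

p_exact = cell_failure_probability(0.8, 0.1, 1.0, 0.1)  # hypothetical % strains
p_mc = mc_failure_probability(0.8, 0.1, 1.0, 0.1)
```

    The two estimates agree closely; in a full laminate model this per-cell probability would feed the stiffness-reduction step.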

  19. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2005-12-13

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  20. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2009-04-14

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  1. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Rigali, Mark J.; Sutaria, Manish P.; Mulligan, Anthony C.; Popovich, Dragan

    2004-03-23

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  2. Right Hemisphere Brain Damage

    MedlinePlus

    Right hemisphere brain damage (RHD) is damage ...

  3. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of the blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.
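
    The core idea of deriving a response distribution from input scatter can be illustrated with a brute-force Monte Carlo sketch. This is not the NESSUS method itself (NESSUS uses far more efficient perturbation and fast-probability-integration techniques), and the load and cross-section statistics below are assumed for illustration only.

```python
import random
import statistics

def response_samples(n=100_000, seed=42):
    """Propagate scatter in load and cross-section to a stress response
    by direct sampling; a crude stand-in for probabilistic structural
    analysis of a component."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        load = rng.gauss(10_000.0, 800.0)   # N; assumed scatter
        area = rng.gauss(2.0e-4, 1.0e-5)    # m^2; assumed scatter
        samples.append(load / area)         # stress response, Pa
    return samples

stresses = response_samples()
mean_stress = statistics.fmean(stresses)
# probability that the stress response exceeds an assumed 60 MPa limit
p_exceed = sum(s > 6.0e7 for s in stresses) / len(stresses)
```

    Sorting the samples gives an empirical CDF of the response, the Monte Carlo analogue of the probability densities the abstract refers to.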

  4. Increased size of cotton root system does not impart tolerance to Meloidogyne incognita

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Plant tolerance or intolerance to parasitic nematodes represents a spectrum describing the degree of damage inflicted by the nematode on the host plant. Tolerance is typically measured in terms of yield suppression. Instances of plant tolerance to nematodes have been documented in some crops, inclu...

  5. 7 CFR 51.1215 - Application of tolerances to individual packages.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Grades of Peaches Application of Tolerances § 51.1215 Application of tolerances to individual packages... any lot shall have not more than double the tolerance specified, except that at least one peach which... percentage of defects: Provided, That not more than one peach which is seriously damaged by insects...

  6. A framework for probabilistic pluvial flood nowcasting for urban areas

    NASA Astrophysics Data System (ADS)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick

    2016-04-01

    Pluvial flood nowcasting is gaining ground not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members with a lead time of up to 2 hours using a 4 C-band radar composite as input. Forecasts' verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires to people in the region that were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the

  7. Probabilistic alternatives to Bayesianism: the case of explanationism

    PubMed Central

    Douven, Igor; Schupbach, Jonah N.

    2015-01-01

    There has been a probabilistic turn in contemporary cognitive science. Far and away, most of the work in this vein is Bayesian, at least in name. Coinciding with this development, philosophers have increasingly promoted Bayesianism as the best normative account of how humans ought to reason. In this paper, we make a push for exploring the probabilistic terrain outside of Bayesianism. Non-Bayesian, but still probabilistic, theories provide plausible competitors both to descriptive and normative Bayesian accounts. We argue for this general idea via recent work on explanationist models of updating, which are fundamentally probabilistic but assign a substantial, non-Bayesian role to explanatory considerations. PMID:25964769

  8. NASA workshop on impact damage to composites

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.

    1991-01-01

    A compilation of slides presented at the NASA Workshop on Impact Damage to Composites held on March 19 and 20, 1991, at the Langley Research Center, Hampton, Virginia is given. The objective of the workshop was to review technology for evaluating impact damage tolerance of composite structures and identify deficiencies. Research, development, design methods, and design criteria were addressed. Actions to eliminate technology deficiencies were developed. A list of those actions and a list of attendees are also included.

  9. Lactose tolerance tests

    MedlinePlus

    Hydrogen breath test for lactose tolerance ... Two common methods include the lactose tolerance blood test and the hydrogen breath test. The hydrogen breath test is the preferred method. It measures the amount of hydrogen in the air you breathe out. ...

  10. Revoking Pesticide Tolerances

    EPA Pesticide Factsheets

    EPA revokes pesticide tolerances when all registrations of a pesticide have been canceled in the U.S. and the tolerances are not needed for imported foods or when there are no registered uses for certain crops.

  11. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

    Probabilistic seismic loss estimation is a methodology for quantitatively and explicitly expressing the performance of buildings in terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach to seismic loss assessment of buildings requires Incremental Dynamic Analysis (IDA), which involves hundreds of time-consuming analyses and thus hinders wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand propagation prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to achieve this purpose, and their appropriateness was evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA-driven response predictions of 34 code-conforming benchmark structures and was proven to be sufficiently precise while offering a great deal of efficiency. The loss values were estimated by replacing IDA with the proposed ETM-based procedure in the ATC 58 framework, and it was found that these values suffer from varying inaccuracies, which were attributed to the discretized nature of the damage and loss prediction functions provided by ATC 58.

  12. Probabilistic consequence model of accidental or intentional chemical releases.

    SciTech Connect

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.

  13. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-06-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical, and geological parameters, which are considered influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter: a dynamic factor which includes the effect of heavy rainfall and its return period. Using the constructed spatial datasets, a multiple logistic regression model is applied and landslide hazard probability maps are produced showing the spatio-temporal distribution of landslide hazard probability over Japan. To represent the landslide hazard at different temporal scales, extreme precipitation for 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard probability exists in the mountain ranges on the western side of Japan (the Japan Sea side), including the Hida, Kiso, Iide, and Asahi ranges, the southern Chugoku range, the southern Kyushu range, the Dewa range, and the Hokuriku region. The landslide hazard probability maps developed in this study will assist authorities, policy makers, and decision makers responsible for infrastructure planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.
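
    The multiple-logistic-regression form used in such hazard models can be sketched as follows. The coefficients and predictor values below are hypothetical placeholders, not the study's fitted values.

```python
import math

def landslide_probability(intercept, coeffs, features):
    """Multiple logistic regression:
    p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    z = intercept + sum(b * x for b, x in zip(coeffs, features))
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical predictors: hydraulic gradient, slope angle (deg), relief (km)
p = landslide_probability(-6.0, [3.2, 0.05, 0.8], [1.1, 30.0, 0.4])
```

    Evaluating this probability on a grid of cells, with extreme precipitation of a given return period feeding the hydraulic-gradient predictor, is what produces a hazard probability map.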

  14. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-01-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical, and geological parameters, which are considered influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter: a dynamic factor which includes the effect of heavy rainfall and its return period. Using the constructed spatial datasets, a multiple logistic regression model is applied and landslide susceptibility maps are produced showing the spatio-temporal distribution of landslide hazard susceptibility over Japan. To represent the susceptibility at different temporal scales, extreme precipitation for 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard susceptibility exists in the mountain ranges on the western side of Japan (the Japan Sea side), including the Hida, Kiso, Iide, and Asahi ranges, the southern Chugoku range, the southern Kyushu range, the Dewa range, and the Hokuriku region. The landslide hazard susceptibility maps developed in this study will assist authorities, policy makers, and decision makers responsible for infrastructure planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.

  15. Need for Tolerances and Tolerance Exemptions for Minimum Risk Pesticides

    EPA Pesticide Factsheets

    The ingredients used in minimum risk products used on food, food crops, food contact surfaces, or animal feed commodities generally have a tolerance or tolerance exemption. Learn about tolerances and tolerance exemptions for minimum risk ingredients.

  16. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors, called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components, or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g., whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output in order to optimize the power available for experiments.

  17. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."

  18. Probabilistic forecasts based on radar rainfall uncertainty

    NASA Astrophysics Data System (ADS)

    Liguori, S.; Rico-Ramirez, M. A.

    2012-04-01

    The potential advantages resulting from integrating weather radar rainfall estimates into hydro-meteorological forecasting systems are limited by the inherent uncertainty affecting radar rainfall measurements, which is due to various sources of error [1-3]. Improved quality control and correction techniques are recognized to play a role in the future improvement of radar-based flow predictions. However, knowledge of the uncertainty affecting radar rainfall data can also be used effectively to build a hydro-meteorological forecasting system in a probabilistic framework. This work discusses the results of implementing a novel probabilistic forecasting system developed to improve ensemble predictions over a small urban area located in the north of England. An ensemble of radar rainfall fields can be determined as the sum of a deterministic component and a perturbation field, the latter informed by knowledge of the spatial-temporal characteristics of the radar error assessed with reference to rain-gauge measurements. This approach is similar to the REAL system [4] developed for use in the Southern Alps. The radar uncertainty estimate can then be propagated with a nowcasting model, used to extrapolate an ensemble of radar rainfall forecasts, which can ultimately drive hydrological ensemble predictions. A radar ensemble generator has been calibrated using radar rainfall data made available from the UK Met Office after applying post-processing and correction algorithms [5-6]. One-hour rainfall accumulations from 235 rain gauges recorded for the year 2007 provided the reference to determine the radar error. Statistics describing the spatial characteristics of the error (i.e., mean and covariance) have been computed off-line at gauge locations, along with the parameters describing the error's temporal correlation. A system has then been set up to impose the space-time error properties on stochastic perturbations, generated in real-time at
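
    The "deterministic component plus correlated perturbation" construction described in this record (truncated in the source) can be sketched in miniature. The snippet below is a toy version only: it replaces the full space-time error covariance with a simple AR(1) correlation along a short hypothetical rain-rate transect, and all parameter values are assumed.

```python
import math
import random

def perturbed_rainfall(det_field, err_sd, rho, n_members=20, seed=7):
    """Toy radar-ensemble generator: each member = deterministic field +
    Gaussian noise with AR(1) correlation rho along the field."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        noise = []
        prev = rng.gauss(0.0, err_sd)
        for _ in det_field:
            # AR(1) update keeps the marginal sd equal to err_sd
            prev = rho * prev + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, err_sd)
            noise.append(prev)
        # rainfall cannot be negative, so clip perturbed values at zero
        members.append([max(0.0, d + e) for d, e in zip(det_field, noise)])
    return members

# 20 members around a short hypothetical rain-rate transect (mm/h)
ens = perturbed_rainfall([2.0, 3.5, 5.0, 4.0], err_sd=0.8, rho=0.6)
```

    Each ensemble member could then be extrapolated forward by a nowcasting model and fed to the hydrological model, yielding probabilistic flow forecasts.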

  19. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and many believe it to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  20. Probabilistic design analysis using Composite Loads Spectra (CLS) coupled with Probabilistic Structural Analysis Methodologies (PSAM)

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H.

    1989-01-01

    Composite loads spectra (CLS) were applied to generate probabilistic loads for use in the PSAM nonlinear evaluation of stochastic structures under stress (NESSUS) finite element code. The CLS approach allows for quantifying loads as mean values and distributions around a central value rather than maximum or enveloped values typically used in deterministic analysis. NESSUS uses these loads to determine mean and perturbation responses. These results are probabilistically evaluated with the distributional information from CLS using a fast probabilistic integration (FPI) technique to define response distributions. The main example discussed describes a method of obtaining load descriptions and stress response of the second-stage turbine blade of the Space Shuttle Main Engine (SSME) high-pressure fuel turbopump (HPFTP). Additional information is presented on the on-going analysis of the high pressure oxidizer turbopump discharge duct (HPOTP) where probabilistic dynamic loads have been generated and are in the process of being used for dynamic analysis. Example comparisons of load analysis and engine data are furnished for partial verification and/or justification for the methodology.

  1. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but these have been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional, and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java, and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessments.
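
    The conversion between an annual exceedance probability and the chance of seeing such an event over a planning horizon follows directly from the time-independent Poissonian occurrence model that PTHA and PSHA share. A minimal sketch (the rate and horizon values are illustrative, not from the study):

```python
import math

def poisson_exceedance(annual_rate, years):
    """Probability of at least one exceedance in a time window under a
    Poissonian occurrence model: P = 1 - exp(-rate * T)."""
    return 1.0 - math.exp(-annual_rate * years)

# e.g. a 1-in-500-year tsunami height over a 50-year planning horizon
p50 = poisson_exceedance(1.0 / 500.0, 50.0)
```

    This is the sense in which a "500-year" hazard level still carries roughly a 10% chance of being exceeded within 50 years.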

  2. Probabilistic priority assessment of nurse calls.

    PubMed

    Ongenae, Femke; Myny, Dries; Dhaene, Tom; Defloor, Tom; Van Goubergen, Dirk; Verhoeve, Piet; Decruyenaere, Johan; De Turck, Filip

    2014-05-01

    Current nurse call systems are very static. Call buttons are fixed to the wall, and systems do not account for various factors specific to a situation. We have developed a software platform, the ontology-based Nurse Call System (oNCS), which supports the transition to mobile and wireless nurse call buttons and uses an intelligent algorithm to address nurse calls. This algorithm dynamically adapts to the situation at hand by taking the profile information of staff and patients into account by using an ontology. This article describes a probabilistic extension of the oNCS that supports a more sophisticated nurse call algorithm by dynamically assigning priorities to calls based on the risk factors of the patient and the kind of call. The probabilistic oNCS is evaluated through implementation of a prototype and simulations, based on a detailed dataset obtained from 3 nursing departments of Ghent University Hospital. The arrival times of nurses at the location of a call, the workload distribution of calls among nurses, and the assignment of priorities to calls are compared for the oNCS and the current nurse call system. Additionally, the performance of the system and the parameters of the priority assignment algorithm are explored. The execution time of the nurse call algorithm is on average 50.333 ms. Moreover, the probabilistic oNCS significantly improves the assignment of nurses to calls. Calls generally result in a nurse being present more quickly, the workload distribution among the nurses improves, and the priorities and kinds of calls are taken into account.

  3. Probabilistic Climate Scenario Information for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dairaku, K.; Ueno, G.; Takayabu, I.

    2014-12-01

    Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great interest. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared physics ensemble experiments using the 60 km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments with global atmosphere-ocean coupled models (CMIP3) under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. The variability in surface air temperature of the physics ensemble experiments with MRI-AGCM was within the range of one standard deviation of the CMIP3 models in the Asia region. On the other hand, the variability of precipitation was relatively well represented compared with the variation of the CMIP3 models. Models that show similar reproducibility of the present climate show different future climate changes, and we could not find clear relationships between the present climate and future climate change in temperature and precipitation. We develop a new method to produce probabilistic climate change scenario information by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method is easily applicable to other regions and other physical quantities, and also to downscaling to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
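
    The regression-based weighting the abstract cites can be illustrated with a toy two-model case: solve the least-squares normal equations for the weights that best combine the model outputs against observations. This is a minimal sketch of the idea behind the Krishnamurti et al. (1999) superensemble, with made-up data and no intercept term.

```python
def superensemble_weights(model_a, model_b, obs):
    """Least-squares weights for a 2-model combination via the
    normal equations of the regression obs ~ wa*A + wb*B."""
    saa = sum(a * a for a in model_a)
    sbb = sum(b * b for b in model_b)
    sab = sum(a * b for a, b in zip(model_a, model_b))
    sao = sum(a * o for a, o in zip(model_a, obs))
    sbo = sum(b * o for b, o in zip(model_b, obs))
    det = saa * sbb - sab * sab   # normal-equations determinant
    return ((sao * sbb - sbo * sab) / det,
            (sbo * saa - sao * sab) / det)

# if the observations are exactly 0.7*A + 0.3*B, the weights are recovered
a = [1.0, 2.0, 3.0, 4.0]
b = [2.0, 1.0, 4.0, 3.0]
obs = [0.7 * x + 0.3 * y for x, y in zip(a, b)]
wa, wb = superensemble_weights(a, b, obs)
```

    With real ensembles, the weights are fit over a training period and then applied to the forecast period, down-weighting models with poor reproducibility.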

  4. A Probabilistic Tsunami Hazard Assessment Methodology

    NASA Astrophysics Data System (ADS)

    Gonzalez, Frank; Geist, Eric; Jaffe, Bruce; Kanoglu, Utku; Mofjeld, Harold; Synolakis, Costas; Titov, Vasily; Arcas, Diego

    2010-05-01

    A methodology for probabilistic tsunami hazard assessment (PTHA) will be described for multiple near- and far-field seismic sources. The method integrates tsunami inundation modeling with the approach of probabilistic seismic hazard assessment (PSHA). A database of inundation simulations is developed, with each simulation corresponding to an earthquake source for which the seismic parameters and mean interevent time have been estimated. A Poissonian model is then adopted for estimating the probability that tsunami flooding will exceed a given level during a specified period of time, taking into account multiple sources and multiple causes of uncertainty. Uncertainty in the tidal stage at tsunami arrival is dealt with by developing a parametric expression for the probability density function of the sum of the tides and a tsunami; uncertainty in the slip distribution of the near-field source is dealt with probabilistically by considering multiple sources in which width and slip values vary, subject to the constraint of a constant moment magnitude. The method was applied to Seaside, Oregon, to obtain estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. These results will be presented and discussed, including the primary remaining sources of uncertainty -- those associated with interevent time estimates, the modeling of background sea level, and temporal changes in bathymetry and topography. PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk.

  5. Inferring cellular networks using probabilistic graphical models.

    PubMed

    Friedman, Nir

    2004-02-06

    High-throughput genome-wide molecular assays, which probe cellular networks from different perspectives, have become central to molecular biology. Probabilistic graphical models are useful for extracting meaningful biological insights from the resulting data sets. These models provide a concise representation of complex cellular networks by composing simpler submodels. Procedures based on well-understood principles for inferring such models from data facilitate a model-based methodology for analysis and discovery. This methodology and its capabilities are illustrated by several recent applications to gene expression data.

  6. Probabilistic graphical models for genetic association studies.

    PubMed

    Mourad, Raphaël; Sinoquet, Christine; Leray, Philippe

    2012-01-01

    Probabilistic graphical models have been widely recognized as a powerful formalism in the bioinformatics field, especially in gene expression studies and linkage analysis. Although they are less well known in association genetics, many successful methods have recently emerged to dissect the genetic architecture of complex diseases. In this review article, we cover the application of these models in the context of population association studies, including linkage disequilibrium modeling, fine mapping and candidate gene studies, and genome-scale association studies. Significant breakthroughs of the corresponding methods are highlighted, but emphasis is also given to their current limitations, in particular to the issue of scalability. Finally, we give promising directions for future research in this field.

  7. Ensemble postprocessing for probabilistic quantitative precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Bentzien, S.; Friederichs, P.

    2012-12-01

    Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may not be well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Moreover, we will show that statistical
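
    The raw probabilistic forecast from such an ensemble is simply the fraction of members exceeding a precipitation threshold; a time-lagged ensemble pools members from several recent model starts before computing that fraction. A minimal sketch with hypothetical member values (not DWD data):

```python
def exceedance_probability(members, threshold):
    """Raw probabilistic forecast from an ensemble: the fraction of
    members whose predicted precipitation exceeds the threshold."""
    return sum(1 for m in members if m > threshold) / len(members)

# Hypothetical 40-member time-lagged pool (mm of precipitation),
# e.g. a 20-member run pooled with the previous start's 20 members.
pooled = [0.0] * 25 + [2.5] * 10 + [12.0] * 5
p_heavy = exceedance_probability(pooled, 10.0)   # P(precip > 10 mm)
```

    Pooling lagged starts increases the effective ensemble size at no extra computational cost, which is why it helps calibration for rare, extreme events.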

  8. Probabilistic double guarantee kidnapping detection in SLAM.

    PubMed

    Tian, Yang; Ma, Shugen

    2016-01-01

    To determine whether kidnapping has happened, and which type of kidnapping it is, while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method was previously proposed. DGKD performs well in relatively small environments; however, our recent work revealed a limitation of DGKD in large-scale environments. To increase the adaptability of DGKD to large-scale environments, an improved method, called probabilistic double guarantee kidnapping detection, is proposed in this paper, combining the probabilities of the features' positions and of the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.

  9. Probabilistic Algorithm for Sampler Siting (PASS)

    SciTech Connect

    Lorenzetti, David M.; Sohn, Michael D.

    2007-05-29

    PASS (Probabilistic Approach to Sampler Siting) optimizes the placement of samplers in buildings. The program exhaustively checks every sampler-network that can be formed, evaluating each against user-supplied simulations of the possible release scenarios. The program identifies the networks that maximize the probability of detecting a release from among the suite of user-supplied scenarios. The user may specify how many networks to report, in order to provide a number of choices in cases where many networks have very similar behavior.
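
    The exhaustive search the record describes can be sketched as a brute-force scan over all k-sampler networks, scoring each by the fraction of equally likely scenarios it detects. The detection table and scenario names below are hypothetical placeholders for the user-supplied simulations:

```python
from itertools import combinations

def best_networks(detects, n_samplers, scenarios, k, top=3):
    """Exhaustively score every k-sampler network against a suite of
    equally likely release scenarios.  detects[(s, r)] is True when
    sampler location s would detect release scenario r (in practice
    this would come from transport simulations).  Returns the `top`
    networks with the highest fraction of scenarios detected."""
    scored = []
    for net in combinations(range(n_samplers), k):
        # A scenario counts as detected if ANY sampler in the net sees it.
        hits = sum(any(detects.get((s, r), False) for s in net)
                   for r in scenarios)
        scored.append((hits / len(scenarios), net))
    scored.sort(key=lambda x: (-x[0], x[1]))   # best score first
    return scored[:top]

# Toy table: 3 candidate locations, 2 scenarios.  Location 2 alone
# detects both scenarios, so it is the best single-sampler network.
detects = {(0, "a"): True, (1, "b"): True, (2, "a"): True, (2, "b"): True}
ranked = best_networks(detects, 3, ["a", "b"], k=1)
```

    Reporting several top networks, as the program does, is useful because near-optimal networks often differ only marginally in detection probability.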

  10. A Probabilistic Approach to Robotic NDE Inspection

    NASA Astrophysics Data System (ADS)

    Summan, R.; Dobie, G.; Hensman, J.; Pierce, S. G.; Worden, K.

    2010-02-01

    The application of wireless robotic inspection vehicles equipped with different NDE payloads has been introduced previously, with emphasis placed on inspection applications in hazardous and inaccessible environments. A particular challenge to the practical application of such robotic inspection lies in the localization of the devices. The authors here consider a probabilistic approach to both the positioning and defect problems by using the location of the robot and the NDE measurements (acquired from the onboard transducers) to make inference about defect existence and position. Using a particle filter approach running locally on the robots, the vehicle location is tracked by fusing noisy redundant data sets supplying positional information.
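
    A particle filter of the kind mentioned can be sketched as one predict/update/resample cycle. This is a generic 1-D illustration with assumed Gaussian motion and measurement noise, not the authors' multi-sensor implementation:

```python
import math
import random

def particle_filter_step(particles, control, measurement,
                         motion_noise=0.1, meas_noise=0.5):
    """One predict/update/resample cycle of a basic 1-D particle
    filter for robot localization (illustrative only)."""
    # Predict: propagate each particle through a noisy motion model.
    moved = [p + control + random.gauss(0.0, motion_noise)
             for p in particles]
    # Update: weight particles by the Gaussian measurement likelihood.
    weights = [math.exp(-0.5 * ((p - measurement) / meas_noise) ** 2)
               for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

# Usage: a diffuse prior sharpened by one position measurement at 1.0.
random.seed(42)
particles = [random.uniform(0.0, 2.0) for _ in range(2000)]
posterior = particle_filter_step(particles, 0.0, 1.0)
estimate = sum(posterior) / len(posterior)
```

    In the robotic inspection setting, the same cycle runs with several redundant (and noisy) positional data sets fused in the update step.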

  11. A periodic probabilistic photonic cluster state generator

    NASA Astrophysics Data System (ADS)

    Fanto, Michael L.; Smith, A. Matthew; Alsing, Paul M.; Tison, Christopher C.; Preble, Stefan F.; Lott, Gordon E.; Osman, Joseph M.; Szep, Attila; Kim, Richard S.

    2014-10-01

    The research detailed in this paper describes a Periodic Cluster State Generator (PCSG) consisting of a monolithic integrated waveguide device that employs four wave mixing, an array of probabilistic photon guns, single mode sequential entanglers and an array of controllable entangling gates between modes to create arbitrary cluster states. Utilizing the PCSG one is able to produce a cluster state with nearest neighbor entanglement in the form of a linear or square lattice. Cluster state resources of this type have been proven to be able to perform universal quantum computation.

  12. Probabilistic Interpolation of Wind Hazard Maps

    NASA Astrophysics Data System (ADS)

    Xu, L.

    2012-12-01

    Wind hazard maps are widely used to compute design loads and to evaluate insurance risks. While building codes often provide these maps for only a few return periods, wind hazard maps for other return periods are often needed for risk assessments. In this study, we evaluate a probabilistic interpolation approach for deriving wind hazard maps for return periods other than those available. The probabilistic interpolation approach assumes that probabilities of wind values in a wind hazard map follow a Gumbel distribution. Although most studies have been performed on data from individual weather stations, it remains to be seen how well Gumbel distribution-based interpolation performs for wind hazard maps. The Gumbel distribution F(V) = exp{-exp[-α(V - u)]} is assumed for the wind speed at a wind-map location, where α and u are parameters that vary with location. For large T, the wind speed of return period T is V_T = u + (1/α) ln T. If T0 and T1 are two given return periods with T1 > T0, then V_T = (1 - θ)V_T0 + θV_T1, where θ = (ln T - ln T0)/(ln T1 - ln T0); therefore, V_T is a weighted average of V_T0 and V_T1. Here we select the US and Mexican hazard maps to evaluate the probabilistic interpolation method. In ASCE 7-10 wind maps, the basic wind speed has a single value for most inland areas: 54, 51, and 47 m/s for the 1700-year, 700-year, and 300-year return periods, respectively. We use the 1700-year and 300-year values to obtain the 700-year value using Gumbel distribution-based interpolation. The computed 700-year value is 50.4 m/s, compared to 51 m/s provided in the code, about a 1% difference. For coastal regions subjected to hurricane winds, the relative errors between the interpolated 700-year values and the original 700-year values are within 2% for most areas, except where hurricane zones transition to inland non-hurricane zones; there the relative errors can increase to 4%. The Mexican wind code includes wind maps of three return periods: 10
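
    The interpolation itself is one line once the Gumbel assumption is in place: V_T is linear in ln T, so the value at any return period T between T0 and T1 is a log-weighted average. The sketch below reproduces the inland ASCE 7-10 check described in the abstract:

```python
import math

def interpolate_return_value(v0, t0, v1, t1, t):
    """Log-linear (Gumbel-consistent) interpolation of a hazard-map
    value between two known return periods: for large T,
    V_T = u + (1/alpha) * ln(T), so V_T is linear in ln(T)."""
    theta = (math.log(t) - math.log(t0)) / (math.log(t1) - math.log(t0))
    return (1.0 - theta) * v0 + theta * v1

# Inland ASCE 7-10 speeds: 47 m/s at 300 yr and 54 m/s at 1700 yr
# interpolate to about 50.4 m/s at 700 yr, vs. 51 m/s in the code.
v700 = interpolate_return_value(47.0, 300.0, 54.0, 1700.0, 700.0)
```

    The same two-point formula extends pointwise across a whole map, which is what makes the approach attractive for deriving maps at unpublished return periods.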

  13. The probabilistic cell: implementation of a probabilistic inference by the biochemical mechanisms of phototransduction.

    PubMed

    Houillon, Audrey; Bessière, Pierre; Droulez, Jacques

    2010-09-01

    When we perceive the external world, our brain has to deal with the incompleteness and uncertainty associated with sensory inputs, memory and prior knowledge. In theoretical neuroscience, probabilistic approaches have recently received growing interest, as they account for the ability to reason with incomplete knowledge and to efficiently describe perceptive and behavioral tasks. How can the probability distributions that need to be estimated in these models be represented and processed in the brain, in particular at the single-cell level? We consider the basic function carried out by photoreceptor cells, which consists in detecting the presence or absence of light. We give a system-level understanding of the process of phototransduction based on a Bayesian formalism: we show that the process of phototransduction is equivalent to a temporal probabilistic inference in a Hidden Markov Model (HMM) for estimating the presence or absence of light. Thus, the biochemical mechanisms of phototransduction underlie the estimation of the current state probability distribution of the presence of light. A classical descriptive model describes the interactions between the different molecular messengers, ions, enzymes and channel proteins occurring within the photoreceptor by a set of nonlinear coupled differential equations. In contrast, the probabilistic HMM model is described by a discrete recurrence equation. It appears that the binary HMM has a general solution in the case of constant input, which allows a detailed analysis of the dynamics of the system. The biochemical system and the HMM behave similarly under steady-state conditions; consequently, a formal equivalence can be found between the biochemical system and the HMM. Numerical simulations further extend the results to the dynamic case and to noisy input. All in all, we have derived a probabilistic model equivalent to a classical descriptive model of phototransduction, which has the additional advantage of assigning a
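
    The discrete recurrence for a binary HMM can be sketched as a two-state Bayesian filter: predict with the state-transition probability, then update with the observation likelihoods. The parameter values below are arbitrary illustrations, not the paper's fitted biochemical constants:

```python
def hmm_filter_step(p_on, obs, p_stay=0.99, p_hit=0.9, p_false=0.05):
    """One step of the discrete recurrence for a two-state HMM:
    predict the probability that light is present, then update it
    with a binary observation (e.g. a photon-triggered event)."""
    # Predict: either state persists with probability p_stay.
    prior = p_stay * p_on + (1.0 - p_stay) * (1.0 - p_on)
    # Update: Bayes' rule with the observation likelihoods.
    like_on = p_hit if obs else (1.0 - p_hit)
    like_off = p_false if obs else (1.0 - p_false)
    num = like_on * prior
    return num / (num + like_off * (1.0 - prior))

# Repeated "photon detected" observations drive the belief that
# light is present toward certainty.
p_light = 0.5
for _ in range(10):
    p_light = hmm_filter_step(p_light, True)
```

    Under constant input the recurrence converges to a fixed point, which mirrors the steady-state behavior the abstract reports for the biochemical system.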

  14. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013), with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters estimated for each of these source zones are the input variables for the seismic hazard estimation of the region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2, and 0.5% in 50 years. These exceedance values correspond to return periods of approximately 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
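
    Under the Poisson occurrence model standard in PSHA, a probability of exceedance (PE) over an exposure time maps to a return period via PE = 1 - exp(-t/T). A quick check of the correspondence quoted in the abstract (e.g. 10% in 50 years gives roughly a 475-year return period):

```python
import math

def return_period(pe, exposure_years):
    """Return period implied by a probability of exceedance `pe`
    over `exposure_years`, under the Poisson occurrence model:
    pe = 1 - exp(-exposure / T)."""
    return -exposure_years / math.log(1.0 - pe)

# 10% PE in 50 years -> ~475-year return period;
# 2% PE in 50 years -> ~2475-year return period.
t_10pct = return_period(0.10, 50.0)
t_2pct = return_period(0.02, 50.0)
```

    For small PE the exact formula approaches the familiar approximation T ≈ t/PE; for large PE (e.g. 50% in 50 years) the quoted round-number return periods rely on that approximation.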

  15. Perception of Speech Reflects Optimal Use of Probabilistic Speech Cues

    ERIC Educational Resources Information Center

    Clayards, Meghan; Tanenhaus, Michael K.; Aslin, Richard N.; Jacobs, Robert A.

    2008-01-01

    Listeners are exquisitely sensitive to fine-grained acoustic detail within phonetic categories for sounds and words. Here we show that this sensitivity is optimal given the probabilistic nature of speech cues. We manipulated the probability distribution of one probabilistic cue, voice onset time (VOT), which differentiates word initial labial…

  16. The Role of Language in Building Probabilistic Thinking

    ERIC Educational Resources Information Center

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  17. Understanding Probabilistic Thinking: The Legacy of Efraim Fischbein.

    ERIC Educational Resources Information Center

    Greer, Brian

    2001-01-01

    Honors the contribution of Efraim Fischbein to the study and analysis of probabilistic thinking. Summarizes Fischbein's early work, then focuses on the role of intuition in mathematical and scientific thinking; the development of probabilistic thinking; and the influence of instruction on that development. (Author/MM)

  18. A probabilistic bridge safety evaluation against floods.

    PubMed

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, most-probable-point (MPP) based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
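
    A plain Monte Carlo estimate of the series-system failure probability (the bridge fails if any limit state is violated) can be sketched as below. The paper accelerates this with a Bayesian SVM response surface; here only the direct MCS is shown, with a single hypothetical scour limit state:

```python
import random

def mcs_failure_probability(limit_states, sample_vars, n=100_000, seed=0):
    """Crude Monte Carlo estimate of a series-system failure
    probability: failure occurs when ANY limit state g_i(x) <= 0.
    `sample_vars` draws one realization of the random variables;
    `limit_states` is a list of limit-state functions g_i."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        x = sample_vars(rng)
        if any(g(x) <= 0.0 for g in limit_states):
            failures += 1
    return failures / n

# Toy example: scour depth ~ N(2, 0.5) m against a 3 m foundation
# depth; failure when scour reaches the foundation depth.
pf = mcs_failure_probability(
    [lambda x: 3.0 - x["scour"]],
    lambda rng: {"scour": rng.gauss(2.0, 0.5)})
```

    With five limit states and hydraulics-driven random variables, each sample becomes expensive, which is exactly why the paper substitutes a trained response surface before running the MCS.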

  19. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
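
    The second-moment techniques PFEM builds on can be illustrated with a first-order second-moment (FOSM) propagation: the response mean is evaluated at the input means, and the response variance follows from first-order sensitivities. The cantilever-deflection example below uses toy numbers, not data from the paper:

```python
def fosm_moments(g, means, sds, rel_h=1e-6):
    """First-order second-moment (FOSM) propagation: approximate the
    mean and standard deviation of a response g(x) from the means and
    standard deviations of independent random inputs via a first-order
    Taylor expansion about the mean point."""
    mu = g(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        h = rel_h * (abs(m) + 1.0)           # step scaled to the variable
        xp = list(means); xp[i] = m + h
        xm = list(means); xm[i] = m - h
        grad = (g(xp) - g(xm)) / (2.0 * h)   # central-difference sensitivity
        var += (grad * s) ** 2
    return mu, var ** 0.5

# Tip deflection of a cantilever, d = P L^3 / (3 E I), with uncertain
# load P and Young's modulus E (L = 2 m, I = 1e-4 m^4, toy numbers).
mean_d, sd_d = fosm_moments(
    lambda x: x[0] * 2.0 ** 3 / (3.0 * x[1] * 1.0e-4),
    means=[1000.0, 2.0e11],
    sds=[100.0, 1.0e10])
```

    This captures the abstract's point that such methods are accurate when randomness is moderate and the input densities have decaying tails; for strongly nonlinear responses the first-order expansion degrades and Monte Carlo comparison becomes necessary.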

  20. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1987-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties, and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.