Sample records for probabilistic based design

  1. Distributed collaborative probabilistic design of multi-failure structure with fluid-structure interaction using fuzzy neural network of regression

    NASA Astrophysics Data System (ADS)

    Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen

    2018-05-01

    To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on a fuzzy neural network of regression (FR), termed DCFRM, is proposed by integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated with the proposed method while accounting for fluid-structure interaction. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that DCFRM expands the possibilities of probabilistic analysis for multi-failure structures and improves computing efficiency while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and method of mechanical reliability design.

  2. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    NASA Astrophysics Data System (ADS)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of gas turbines, a distributed collaborative probabilistic design method based on support vector machine regression (SR), termed DCSRM, is proposed by integrating the distributed collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis yields the optimal static blade-tip clearance of the HPT for the BTRRC design, improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and method.
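
    A note on the shared mechanics of records 1 and 2: both replace each expensive sub-model (one per failure mode or discipline) with a cheap regression surrogate and then run Monte Carlo reliability analysis on the surrogates. The sketch below illustrates that pattern only; the limit states, thresholds, and the use of scikit-learn's SVR are invented stand-ins, not the papers' models.

      # Sketch: distributed-collaborative surrogate reliability analysis.
      # One regression surrogate per failure mode, trained on a small design
      # of experiments; Monte Carlo then runs on the cheap surrogates.
      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)

      # 1) Design of experiments over two random inputs (e.g. load, temperature).
      X = rng.uniform([0.8, 0.8], [1.2, 1.2], size=(200, 2))

      # 2) Stand-ins for expensive sub-model responses (stress, deformation).
      y_stress = 0.9 * X[:, 0] ** 2 + 0.1 * X[:, 1]
      y_deform = 0.5 * X[:, 0] + 0.6 * X[:, 1] ** 2

      # 3) One surrogate per failure mode (the "distributed collaborative" idea).
      surrogates = {name: SVR(C=100.0).fit(X, y)
                    for name, y in [("stress", y_stress), ("deform", y_deform)]}

      # 4) Monte Carlo on the surrogates; overall failure = any mode fails.
      limits = {"stress": 1.45, "deform": 1.30}     # illustrative thresholds
      S = rng.normal(1.0, 0.08, size=(100_000, 2))  # input uncertainty model
      fail = np.zeros(len(S), dtype=bool)
      for name, sur in surrogates.items():
          fail |= sur.predict(S) > limits[name]

      print("estimated overall failure probability:", fail.mean())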

  3. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
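
    The abstract gives no equations, but the core of probabilistic load simulation can be illustrated in a few lines: treat the composite load as a combination of random component loads and read design levels off the simulated spectrum. All distributions and numbers below are invented for illustration.

      # Toy composite-load simulation: the composite load on a component is a
      # combination of random individual load sources; the simulated sample
      # yields probabilistic load levels for downstream structural analysis.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      thrust    = rng.normal(1.00, 0.05, n)   # quasi-static load factor
      vibration = rng.rayleigh(0.08, n)       # dynamic (vibratory) component
      thermal   = rng.normal(0.20, 0.03, n)   # thermally induced load

      composite = thrust + vibration + thermal
      for q in (50, 90, 99, 99.9):
          print(f"composite load, {q}th percentile: "
                f"{np.percentile(composite, q):.3f}")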

  4. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

    A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. Uncertainties in the constituent materials (fiber and matrix) are propagated probabilistically to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are also included in this design methodology. A multi-factor interaction equation is used to evaluate the load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for a target structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
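
    The multi-factor interaction equation referenced above is usually published in a generic product form; the version below is a sketch of that general form from the open literature (the exact IPACS formulation and exponents are not given in this record):

      % Multi-factor interaction equation (MFIE), generic form: the ratio of
      % a degraded property P to its reference value P_0 is a product of
      % normalized effect terms (temperature, stress, cycles, time, ...).
      \[
        \frac{P}{P_0} \;=\; \prod_{i=1}^{n}
        \left( \frac{A_{i,F} - A_i}{A_{i,F} - A_{i,0}} \right)^{a_i}
      \]
      % A_i     : current value of the i-th effect (e.g. temperature)
      % A_{i,0} : reference value;  A_{i,F} : final (failure) value
      % a_i     : empirical exponent for the i-th effect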

  5. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  6. A probabilistic approach to aircraft design emphasizing stability and control uncertainties

    NASA Astrophysics Data System (ADS)

    Delaurentis, Daniel Andrew

    In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method is based on the premise that design is a decision-making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) consistent, traceable uncertainty classification and representation; (2) a concise mathematical statement of the Probabilistic Robust Design problem; (3) variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) probabilistic sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity. Specifically, the implementation involves the study of the relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.

  7. A simulation-based probabilistic design method for arctic sea transport systems

    NASA Astrophysics Data System (ADS)

    Bergström, Martin; Erikstad, Stein Ove; Ehlers, Sören

    2016-12-01

    When designing an arctic cargo ship, it is necessary to consider multiple stochastic factors. This paper evaluates the merits of a simulation-based probabilistic design method specifically developed to deal with this challenge. The outcome of the paper indicates that the incorporation of simulations and probabilistic design parameters into the design process enables more informed design decisions. For instance, it enables the assessment of the stochastic transport capacity of an arctic ship, as well as of its long-term ice exposure, which can be used to determine an appropriate level of ice-strengthening. The outcome of the paper also indicates that significant gains in transport system cost-efficiency can be obtained by extending the boundaries of the design task beyond the individual vessel. In the case of industrial shipping, this allows, for instance, the consideration of port-based cargo storage facilities that tolerate temporary shortages in transport capacity and thus permit a reduction in the required fleet size or ship capacity.

  8. Optimization of Systems with Uncertainty: Initial Developments for Performance, Robustness and Reliability Based Designs

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of the probability density functions and the moments, optimization problems that pursue performance, robustness and reliability based designs are studied. By specifying the desired outputs in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of the design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem and a meaningful probabilistic description of its solution is provided in the examples. In the stability control problem the analysis shows that standard deterministic approaches lead to designs with high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.
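
    The "full probabilistic description of the output in terms of the design variables" admits a compact illustration: when the output pdf is known in closed form, the probability of violating a requirement is an explicit function of the design variable and can be minimized directly. The toy model below (y = d * x with Gaussian x, plus an invented performance penalty) is a sketch of the idea, not an example from the paper.

      # Toy reliability-based design: y = d * x with x ~ N(1, 0.2^2), so the
      # output pdf is Gaussian in closed form and P(y > y_max) is analytic.
      # A performance term rewards larger d; the optimizer trades the two off.
      from scipy.stats import norm
      from scipy.optimize import minimize_scalar

      y_max = 1.25          # requirement: output should not exceed this
      penalty = 0.5         # weight on the (invented) performance loss

      def p_violation(d):
          mu, sigma = d * 1.0, 0.2 * d + 1e-12   # mean and std of y = d * x
          return norm.sf(y_max, loc=mu, scale=sigma)

      obj = lambda d: p_violation(d) + penalty * (1.0 - d)
      res = minimize_scalar(obj, bounds=(0.0, 1.0), method="bounded")
      print(f"optimal d = {res.x:.3f}, "
            f"P(violation) = {p_violation(res.x):.2e}")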

  9. Use of the LUS in sequence allele designations to facilitate probabilistic genotyping of NGS-based STR typing results.

    PubMed

    Just, Rebecca S; Irwin, Jodi A

    2018-05-01

    Some of the expected advantages of next generation sequencing (NGS) for short tandem repeat (STR) typing include enhanced mixture detection and genotype resolution via sequence variation among non-homologous alleles of the same length. However, at the same time that NGS methods for forensic DNA typing have advanced in recent years, many caseworking laboratories have implemented or are transitioning to probabilistic genotyping to assist the interpretation of complex autosomal STR typing results. Current probabilistic software programs are designed for length-based data, and were not intended to accommodate sequence strings as the product input. Yet to leverage the benefits of NGS for enhanced genotyping and mixture deconvolution, the sequence variation among same-length products must be utilized in some form. Here, we propose use of the longest uninterrupted stretch (LUS) in allele designations as a simple method to represent sequence variation within the STR repeat regions and facilitate, in the near term, probabilistic interpretation of NGS-based typing results. An examination of published population data indicated that a reference LUS region is straightforward to define for most autosomal STR loci, and that using repeat unit plus LUS length as the allele designator can represent greater than 80% of the alleles detected by sequencing. A proof of concept study performed using a freely available probabilistic software program demonstrated that the LUS length can be used in allele designations when a program does not require alleles to be integers, and that utilizing sequence information improves interpretation of both single-source and mixed contributor STR typing results as compared to using repeat unit information alone. The LUS concept for allele designation maintains the repeat-based allele nomenclature that will permit backward compatibility to extant STR databases, and the LUS lengths themselves will be concordant regardless of the NGS assay or analysis tools employed. Further, these biologically based, easy-to-derive designations uphold clear relationships between parent alleles and their stutter products, enabling analysis in fully continuous probabilistic programs that model stutter while avoiding the algorithmic complexities that come with string-based searches. Though using repeat unit plus LUS length as the allele designator does not capture variation that occurs outside of the core repeat regions, this straightforward approach would permit the large majority of known STR sequence variation to be used for mixture deconvolution and, in turn, result in more informative mixture statistics in the near term. Ultimately, the method could bridge the gap from current length-based probabilistic systems to facilitate broader adoption of NGS by forensic DNA testing laboratories. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
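
    The LUS designation itself is straightforward to compute from a sequence string, which is the point of the proposal. A minimal sketch follows; the motif and example sequences are invented, and real loci require locus-specific reference LUS regions.

      # Repeat-unit + LUS allele designation: the allele is written as
      # (number of repeat units, length of the longest uninterrupted stretch
      # of the reference motif).  Motif and sequences are invented examples.
      import re

      def lus_designation(seq: str, motif: str) -> tuple[int, int]:
          n_units = len(seq) // len(motif)            # length-based repeat count
          runs = re.findall(f"(?:{motif})+", seq)     # uninterrupted motif runs
          lus = max((len(r) // len(motif) for r in runs), default=0)
          return n_units, lus

      # Two same-length alleles that length-based (CE) typing cannot separate:
      a = "TCTA" * 11                                 # 11 uninterrupted repeats
      b = "TCTA" * 5 + "TCTG" + "TCTA" * 5            # interrupted: LUS = 5
      print(lus_designation(a, "TCTA"))   # -> (11, 11)
      print(lus_designation(b, "TCTA"))   # -> (11, 5)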

  10. WV R-EMAP SMALL WATERSHED CHARACTERIZATION, CLASSIFICATION, AND ASSESSMENT FOR WEST VIRGINIA UTILIZING EMAP DESIGN AND TOOLS

    EPA Science Inventory

    A probabilistic watershed-based framework was developed to encompass wadeable streams within all three ecoregions of West Virginia, with the exclusion noted below. In Phase I of the project (year 2001), we developed and applied a probabilistic watershed-based sampling framework ...

  11. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures: Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications besides the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) its mathematically consistent treatment of uncertainties, and (3) its explicit consideration of the lifetime of the critical structure as a criterion to formulate different risk goals. The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
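
    Once frequencies and conditional exceedance probabilities are assigned to each magnitude class, the four-step recipe in this abstract reduces to simple arithmetic over the classes. The sketch below uses invented Gutenberg-Richter-like numbers purely to show the bookkeeping, including the lifetime-based risk goal.

      # Sketch of the four-step scenario-based risk recipe: (1) magnitude
      # classes, (2) annual occurrence frequency per class, (3) a bounding
      # scenario per class, (4) conditional probability that the scenario
      # exceeds a critical design parameter.  All numbers invented.
      classes = [
          # (label, annual frequency, P(exceed design parameter | event))
          ("M5.0-5.9", 1e-2, 1e-4),
          ("M6.0-6.9", 1e-3, 5e-3),
          ("M7.0-7.9", 1e-4, 8e-2),
      ]
      lifetime_years = 40   # residual lifetime enters the risk goal

      annual = sum(freq * p_exc for _, freq, p_exc in classes)
      lifetime = 1 - (1 - annual) ** lifetime_years
      print(f"annual exceedance frequency: {annual:.2e} / yr")
      print(f"risk over {lifetime_years} yr residual lifetime: {lifetime:.2e}")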

  12. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    DTIC Science & Technology

    2017-03-13

    support of airborne laser designator use during test and training exercises on military ranges. The initial MATILDA tool, MATILDA PRO Version-1.6.1...was based on the 2007 PRA model developed to perform range safety clearances for the UK Thermal Imaging Airborne Laser Designator (TIALD) system...AFRL Technical Reports. This Technical Report, designated Part I, contains documentation of the computational procedures for probabilistic fault

  13. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.

  14. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.

  15. Optimization of Contrast Detection Power with Probabilistic Behavioral Information

    PubMed Central

    Cordes, Dietmar; Herzmann, Grit; Nandy, Rajesh; Curran, Tim

    2012-01-01

    Recent progress in the experimental design for event-related fMRI experiments made it possible to find the optimal stimulus sequence for maximum contrast detection power using a genetic algorithm. In this study, a novel algorithm is proposed for optimization of contrast detection power by including probabilistic behavioral information, based on pilot data, in the genetic algorithm. As a particular application, a recognition memory task is studied and the design matrix optimized for contrasts involving the familiarity of individual items (pictures of objects) and the recollection of qualitative information associated with the items (left/right orientation). Optimization of contrast efficiency is a complicated issue whenever subjects’ responses are not deterministic but probabilistic. Contrast efficiencies are not predictable unless behavioral responses are included in the design optimization. However, available software for design optimization does not include options for probabilistic behavioral constraints. If the anticipated behavioral responses are included in the optimization algorithm, the design is optimal for the assumed behavioral responses, and the resulting contrast efficiency is greater than what either a block design or a random design can achieve. Furthermore, improvements of contrast detection power depend strongly on the behavioral probabilities, the perceived randomness, and the contrast of interest. The present genetic algorithm can be applied to any case in which fMRI contrasts are dependent on probabilistic responses that can be estimated from pilot data. PMID:22326984

  16. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into the existing deterministic finite element analysis (FEA) of fabric-based engine containment systems through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research implements the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, in a deterministic optimization problem, even though the structures are cost-effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by simulation techniques performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, implementations of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem are presented and discussed.
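
    An RBDO loop is easiest to see on a one-variable toy problem: minimize weight (proportional to a bar's cross-section area) subject to a reliability constraint on the margin g = strength - load/area. With Gaussian load and strength the reliability index is closed-form, which keeps the optimization smooth. Everything below is an invented illustration, not the dissertation's fabric model.

      # Minimal RBDO sketch: min area  s.t.  beta(area) >= beta_target,
      # where beta is the reliability index of g = strength - load/area.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      mu_L, sig_L = 10_000.0, 1_500.0    # load [N]
      mu_S, sig_S = 250.0, 20.0          # strength [MPa]
      beta_target = 3.0                  # ~ pf of 1.3e-3

      def beta(area_mm2):
          mu_g = mu_S - mu_L / area_mm2
          sig_g = np.sqrt(sig_S**2 + (sig_L / area_mm2) ** 2)
          return mu_g / sig_g

      res = minimize(lambda a: a[0], x0=[80.0], method="SLSQP",
                     bounds=[(20.0, 500.0)],
                     constraints=[{"type": "ineq",
                                   "fun": lambda a: beta(a[0]) - beta_target}])
      a_opt = res.x[0]
      print(f"optimal area = {a_opt:.1f} mm^2, beta = {beta(a_opt):.2f}, "
            f"pf = {norm.sf(beta(a_opt)):.1e}")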

  17. Probabilistic confidence for decisions based on uncertain reliability estimates

    NASA Astrophysics Data System (ADS)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.

  18. Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Berk, Mario; Špačková, Olga; Straub, Daniel

    2017-12-01

    The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
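
    The First-Order Reliability Method named above searches for the most probable failure point in standard normal space; the classic HLRF fixed-point iteration fits in a dozen lines. The limit state below is an invented stand-in for a rainfall-runoff exceedance condition, not the paper's model.

      # Compact FORM (HLRF) sketch: find the design point of g(u) = 0 in
      # standard normal space; beta = |u*| and pf ~ Phi(-beta).  The toy
      # limit state mixes "rainfall intensity" u1 and "antecedent wetness" u2.
      import numpy as np
      from scipy.stats import norm

      def g(u):      # capacity 5.0 minus an invented runoff response
          return 5.0 - (1.2 * u[0] + 0.8 * u[1] + 0.3 * u[0] * u[1] + 2.0)

      def grad_g(u):
          return np.array([-1.2 - 0.3 * u[1], -0.8 - 0.3 * u[0]])

      u = np.zeros(2)
      for _ in range(100):                 # HLRF fixed-point iteration
          grd = grad_g(u)
          u_new = (grd @ u - g(u)) * grd / (grd @ grd)
          if np.linalg.norm(u_new - u) < 1e-10:
              u = u_new
              break
          u = u_new

      beta = np.linalg.norm(u)
      print(f"design point {u}, beta = {beta:.3f}, pf = {norm.cdf(-beta):.2e}")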

  19. Reliability, Risk and Cost Trade-Offs for Composite Designs

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1996-01-01

    Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for all naturally-occurring uncertainties including those in constituent material properties, fabrication variables, structure geometry and loading conditions. The probability density function of the first buckling load for a set of uncertain variables is computed. The probabilistic sensitivity factors of the uncertain variables with respect to the first buckling load are calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to the requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of the probabilistic sensitivity factors. With optimum design parameters such as the mean and coefficient of variation (representing the range of scatter) of the uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost due to failure occurrence are computationally determined. The results for the fuselage panel analysis show that the higher the cost due to failure occurrence, the smaller the optimum coefficient of variation of the fiber modulus (design parameter) in the longitudinal direction.

  20. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.

  21. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    DOE PAGES

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; ...

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPP design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  22. Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.

    PubMed

    Herzallah, Randa

    2015-03-01

    Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper, a generalised probabilistic controller design is presented for the minimisation of the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed-loop control system and an ideal joint pdf, emphasising how the uncertainty can be systematically incorporated in the absence of reliable system models. To achieve this objective, all probabilistic models of the system are estimated from process data using mixture density networks (MDNs), where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations for the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.

  23. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  24. Probabilistic distributions of pinhole defects in atomic layer deposited films on polymeric substrates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yersak, Alexander S., E-mail: alexander.yersak@colorado.edu; Lee, Yung-Cheng

    Pinhole defects in atomic layer deposition (ALD) coatings were measured over an area of 30 cm^2 in an ALD reactor, and these defects were represented by a probabilistic cluster model instead of a single defect density value (number of defects per unit area). With the probabilistic cluster model, the pinhole defects were simulated over a manufacturing-scale surface area of ~1 m^2. Large-area pinhole defect simulations were used to develop an improved and enhanced design method for ALD-based devices. A flexible thermal ground plane (FTGP) device requiring ALD hermetic coatings was used as an example. Using a single defect density value, it was determined that for an application with operation temperatures higher than 60 °C, the FTGP device would not be possible. The new probabilistic cluster model shows that up to 40.3% of the FTGP devices would be acceptable. With this new approach, the manufacturing yield of ALD-enabled or other thin-film-based devices with different design configurations can be determined, which is important to guide process optimization and control and design for manufacturability.
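
    The cluster model can be made concrete with a parent-offspring (Thomas) point process: Poisson cluster centers over the panel, each spawning a Poisson number of pinholes scattered around it; yield is then the fraction of device cells containing no defect. The parameters below are invented, not the measured values from the paper.

      # Clustered pinhole-defect sketch (Thomas process) and cell yield.
      import numpy as np

      rng = np.random.default_rng(7)
      W = 1.0                      # panel side [m]  (~1 m^2 scale-up)
      lam_parent = 50.0            # cluster centers per m^2
      mean_per_cluster = 8.0       # mean defects per cluster
      sigma = 0.005                # cluster spread [m]

      n_par = rng.poisson(lam_parent * W * W)
      parents = rng.uniform(0, W, size=(n_par, 2))
      groups = [p + rng.normal(0, sigma, size=(rng.poisson(mean_per_cluster), 2))
                for p in parents]
      defects = np.vstack(groups) if groups else np.empty((0, 2))

      # Yield over a grid of 10 cm x 10 cm device cells:
      cell = 0.1
      idx = np.floor(defects / cell).astype(int)
      inside = (idx >= 0).all(axis=1) & (idx < int(W / cell)).all(axis=1)
      occupied = {tuple(i) for i in idx[inside]}
      n_cells = int(W / cell) ** 2
      print(f"{len(defects)} defects; "
            f"defect-free cell yield = {1 - len(occupied) / n_cells:.1%}")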

  25. Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foye, Kevin C.; Soong, Te-Yang

    2012-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this specific example, relative density, which can be determined through field measurements, was selected as the field quality control parameter for waste placement. This technique can be extended to include a rigorous performance-based methodology using other parameters (void space criteria, debris-soil mix ratio, pre-loading, etc.). As shown in this example, each parameter range, or sets of parameter ranges, can be selected such that they can result in an acceptable, long-term differential settlement according to the probabilistic model. The methodology can also be used to re-evaluate the long-term differential settlement behavior at closed land disposal facilities to identify, if any, problematic facilities so that remedial action (e.g., reinforcement of upper and intermediate waste layers) can be implemented. Considering the inherent spatial variability in waste and earth materials and the need for engineers to apply sound quantitative practices to engineering analysis, it is important to apply the available probabilistic techniques to problems of differential settlement. One such method to implement probability-based differential settlement analyses for the design of landfill final covers has been presented. The design evaluation technique presented is one tool to bridge the gap from deterministic practice to probabilistic practice. (authors)
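
    The random-field idea can be sketched in a few lines: sample correlated settlement profiles along the cover, convert to slope changes between adjacent stations, and count how often a slope criterion is exceeded. Every number below is an invented placeholder, not a value from the paper.

      # Random-field differential-settlement sketch: Gaussian field with an
      # exponential covariance (via Cholesky), then P(max slope change > limit).
      import numpy as np

      rng = np.random.default_rng(3)
      x = np.arange(0.0, 100.0, 2.0)        # stations along the cover [m]
      corr_len = 20.0                       # correlation length [m]
      mu_s, sd_s = 0.50, 0.15               # settlement mean / std [m]

      C = sd_s**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
      Lc = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))

      n_sim, dx, slope_limit = 20_000, 2.0, 0.10   # allowable slope change
      z = mu_s + Lc @ rng.standard_normal((len(x), n_sim))
      d_slope = np.abs(np.diff(z, axis=0)) / dx
      p_exceed = np.mean(d_slope.max(axis=0) > slope_limit)
      print(f"P(max differential slope change > {slope_limit:.0%}) "
            f"= {p_exceed:.3f}")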

  26. Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design

    NASA Technical Reports Server (NTRS)

    Kuguoglu, Latife; Ludwiczak, Damian

    2006-01-01

    The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the authors evaluated the structural design of the JIMO Heat Rejection Subsystem (HRS). An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel, letting the sensitivity information identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed. The radiator panel structural performance was assessed in the presence of uncertainties in the loading, fabrication process variables, and material properties. A deterministic structural analysis at the mean values of the random variables was performed, and the resulting stress and displacement contours are presented. This is followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel structural performance. Based on uncertainties in material properties, structural geometry and loading, the results of the displacement and stress analysis are used as an input file for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as the maximum displacement, the maximum tensile and compressive stresses of the facesheet in the x and y directions, and the maximum von Mises stresses of the tube, to the loading and design variables is determined under the boundary condition where all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel are identified.

  27. Overview of Probabilistic Methods for SAE G-11 Meeting for Reliability and Uncertainty Quantification for DoD TACOM Initiative with SAE G-11 Division

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting during October 6-8 at the Best Western Sterling Inn, Sterling Heights (Detroit), Michigan is co-sponsored by US Army Tank-automotive & Armaments Command (TACOM). The meeting will provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members including members with national/international standing, the mission of the G-11's Probabilistic Methods Committee is to "enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development."

  28. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.

  29. Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.

    ERIC Educational Resources Information Center

    Ojeda, Mario Miguel; Sahai, Hardeo

    2002-01-01

    Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…

  30. Probabilistic performance-based design for high performance control systems

    NASA Astrophysics Data System (ADS)

    Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice

    2017-04-01

    High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for the mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life cycle cost analysis (LCA) approach is proposed to estimate the economic benefit of these systems over the entire life of the structure. The novelty resides in embedding the life cycle cost analysis within a performance-based design (PBD) approach tailored to multi-level wind hazards. This yields a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCAs are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.

  31. Stochastic model for fatigue crack size and cost effective design decisions. [for aerospace structures]

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Uppaluri, B.

    1975-01-01

    This paper describes a methodology for making cost-effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and a discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of a minimum expected cost or risk function and reliability bounds. Selection of the initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals is discussed.

  32. Probabilistic Meteorological Characterization for Turbine Loads

    NASA Astrophysics Data System (ADS)

    Kelly, M.; Larsen, G.; Dimitrov, N. K.; Natarajan, A.

    2014-06-01

    Beyond the existing, limited IEC prescription for describing fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface layer. Based on data from multiple sites as well as theoretical foundations from boundary-layer meteorology and atmospheric turbulence, we offer probabilistic descriptions of shear and turbulence intensity, elucidating the connection of each to the other as well as to atmospheric stability and terrain. These are used as input to loads calculations and, with a statistical description of the loads output, they allow for improved design and loads calculations.
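
    One way to encode such a joint probabilistic description of shear and turbulence intensity is a Gaussian copula: a marginal distribution for each quantity, coupled through a correlated pair of standard normals, sampled as input for load calculations. The marginals, correlation, and thresholds below are invented placeholders, not the paper's fitted models.

      # Joint wind-input sketch: shear exponent alpha (Gaussian) and
      # turbulence intensity TI (lognormal), correlated via a Gaussian copula.
      import numpy as np

      rng = np.random.default_rng(11)
      n, rho = 100_000, -0.4               # alpha-TI correlation (invented)

      L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
      u = L @ rng.standard_normal((2, n))  # correlated standard normals

      alpha = 0.14 + 0.05 * u[0]                 # alpha ~ N(0.14, 0.05^2)
      ti = np.exp(np.log(0.10) + 0.30 * u[1])    # TI lognormal, median 10%

      print(f"P(alpha > 0.25) = {np.mean(alpha > 0.25):.3f}")
      print(f"P(TI > 0.18)    = {np.mean(ti > 0.18):.3f}")
      print(f"P(both)         = "
            f"{np.mean((alpha > 0.25) & (ti > 0.18)):.5f}")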

  33. Lung Cancer Assistant: a hybrid clinical decision support application for lung cancer care.

    PubMed

    Sesen, M Berkan; Peake, Michael D; Banares-Alcantara, Rene; Tse, Donald; Kadir, Timor; Stanley, Roz; Gleeson, Fergus; Brady, Michael

    2014-09-06

    Multidisciplinary team (MDT) meetings are becoming the model of care for cancer patients worldwide. While MDTs have improved the quality of cancer care, the meetings impose substantial time pressure on the members, who generally attend several such MDTs. We describe Lung Cancer Assistant (LCA), a clinical decision support (CDS) prototype designed to assist the experts in the treatment selection decisions in the lung cancer MDTs. A novel feature of LCA is its ability to provide rule-based and probabilistic decision support within a single platform. The guideline-based CDS is based on clinical guideline rules, while the probabilistic CDS is based on a Bayesian network trained on the English Lung Cancer Audit Database (LUCADA). We assess rule-based and probabilistic recommendations based on their concordances with the treatments recorded in LUCADA. Our results reveal that the guideline rule-based recommendations perform well in simulating the recorded treatments with exact and partial concordance rates of 0.57 and 0.79, respectively. On the other hand, the exact and partial concordance rates achieved with probabilistic results are relatively poorer with 0.27 and 0.76. However, probabilistic decision support fulfils a complementary role in providing accurate survival estimations. Compared to recorded treatments, both CDS approaches promote higher resection rates and multimodality treatments.

  34. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.

  35. Robust Control Design for Systems With Probabilistic Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in the time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis, allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.

  36. Methods for combining payload parameter variations with input environment. [calculating design limit loads compatible with probabilistic structural design criteria]

    NASA Technical Reports Server (NTRS)

    Merchant, D. H.

    1976-01-01

    Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the method are also presented.
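
    The extreme-value relation at the heart of this definition is compact: if a single load peak has distribution function F and a mission contains n independent peaks, the largest-load-in-mission distribution is F^n, and the design limit load is a chosen percentile of it. A sketch with invented numbers:

      # Design limit load from extreme-value theory: solve F(x)^n = p_design.
      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import brentq

      mu, sigma, n_peaks = 100.0, 10.0, 1_000   # per-peak load model (invented)
      p_design = 0.99                           # design percentile of mission max

      F = lambda x: norm.cdf(x, mu, sigma)
      x_design = brentq(lambda x: F(x) ** n_peaks - p_design,
                        mu, mu + 10 * sigma)
      print(f"design limit load (99th pct of mission max): {x_design:.1f}")
      print(f"equivalent per-peak non-exceedance: "
            f"{p_design ** (1 / n_peaks):.6f}")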

  37. Safety design approach for external events in Japan sodium-cooled fast reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamano, H.; Kubo, S.; Tani, A.

    2012-07-01

    This paper describes a safety design approach for external events in the design study of the Japan sodium-cooled fast reactor. An emphasis is the introduction of a design extension external condition (DEEC). In addition to seismic design, other external events such as tsunami, strong wind, abnormal temperature, etc. were addressed in this study. From a wide variety of external events consisting of natural hazards and human-induced ones, a screening method was developed in terms of siting, consequence, and frequency to select representative events. Design approaches for these events were categorized on a probabilistic, statistical or deterministic basis. External hazard conditions were considered mainly for DEECs. In the probabilistic approach, the DEECs of earthquake, tsunami and strong wind were defined at 1/10 of the exceedance probability of the external design bases. The other representative DEECs were also defined based on statistical or deterministic approaches. (authors)

  38. The Use of the Direct Optimized Probabilistic Calculation Method in Design of Bolt Reinforcement for Underground and Mining Workings

    PubMed Central

    Krejsa, Martin; Janas, Petr; Yilmaz, Işık; Marschalko, Marian; Bouchal, Tomas

    2013-01-01

    The load-carrying system of each construction should fulfill several conditions which represent reliability criteria in the assessment procedure. It is the theory of structural reliability which determines the probability of a construction keeping its required properties. Using this theory, it is possible to apply probabilistic computations based on probability theory and mathematical statistics. Those methods have become more and more popular; they are used, in particular, in designs of load-carrying structures with a required level of reliability when at least some input variables in the design are random. The objective of this paper is to indicate the current scope which might be covered by the new method, Direct Optimized Probabilistic Calculation (DOProC), in assessments of the reliability of load-carrying structures. DOProC uses a purely numerical approach without any simulation techniques. This provides more accurate solutions to probabilistic tasks and, in some cases, considerably faster completion of computations. DOProC can be used to solve a number of probabilistic computations efficiently. A very good sphere of application for DOProC is the assessment of bolt reinforcement in underground and mining workings. For the purposes above, a special software application, "Anchor", has been developed. PMID:23935412
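
    The "purely numerical approach without any simulation techniques" can be illustrated on the simplest reliability task: discretize load S and resistance R into histograms and obtain the distribution of the margin Z = R - S by direct convolution, so that pf = P(Z < 0) comes out of a sum rather than sampling. This is a sketch of the general idea with invented Gaussians, not the DOProC or Anchor implementation.

      # Direct (simulation-free) failure probability via histogram convolution.
      import numpy as np
      from scipy.stats import norm

      dx = 0.05
      x = np.arange(-80, 80, dx)
      pR = norm.pdf(x, loc=30.0, scale=4.0) * dx   # resistance histogram
      pS = norm.pdf(x, loc=15.0, scale=3.0) * dx   # load-effect histogram

      # Z = R + (-S): convolve R with the mirror image of S.
      pZ = np.convolve(pR, pS[::-1])
      z = np.arange(len(pZ)) * dx + (x[0] - x[-1])  # support of Z

      pf = pZ[z < 0].sum()
      beta = (30.0 - 15.0) / np.hypot(4.0, 3.0)     # exact answer for checking
      print(f"pf numerical = {pf:.3e}, exact = {norm.cdf(-beta):.3e}")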

  39. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Mustafa Sacit; Flanagan, George F.

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  20. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual margin of safety remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the Use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.

  1. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades, and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  2. Use of Probabilistic Engineering Methods in the Detailed Design and Development Phases of the NASA Ares Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Fayssal, Safie; Weldon, Danny

    2008-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I launch vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had a major impact on the design and manufacturing of Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.

  3. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.

  4. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This program report is the final report covering all the work done on this project. The goal of this project is the transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between probabilistic design methodology and axiomatic design methodology.

  5. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, Gretchen G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.
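
    The contrast the authors draw between resubstitution and cross-validation accuracy is easy to reproduce with any classification tree library. A minimal sketch on synthetic data (not the lichen surveys), assuming scikit-learn:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic presence/absence data standing in for the lichen surveys.
X, y = make_classification(n_samples=400, n_features=10, n_informative=4,
                           random_state=0)

# Resubstitution accuracy: the tree is scored on its own training data.
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
resub = tree.score(X, y)

# Cross-validated accuracy: a far less optimistic estimate of true skill.
cv = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=10).mean()

print(f"resubstitution accuracy: {resub:.2f}")   # ~1.00 for an unpruned tree
print(f"cross-validated accuracy: {cv:.2f}")     # noticeably lower
```

    For an unpruned tree the resubstitution accuracy is near 1.0 by construction, which is exactly why the authors argue for reporting cross-validated rates instead.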

  6. Databases Don't Measure Motivation

    ERIC Educational Resources Information Center

    Yeager, Joseph

    2005-01-01

    Automated persuasion is the Holy Grail of quantitatively biased data base designers. However, data base histories are, at best, probabilistic estimates of customer behavior and do not make use of more sophisticated qualitative motivational profiling tools. While usually absent from web designer thinking, qualitative motivational profiling can be…

  7. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    NASA Astrophysics Data System (ADS)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.
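
    The GSS algorithm itself is not reproduced here, but the article's overall recipe, replacing the probabilistic constraints with deterministic ones and then handing the problem to a sequential quadratic programming solver, can be sketched. The limit state, the crude moment-based approximation standing in for the posterior approximation, and all numbers are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Toy RBDO sketch, not the GSS method itself: the design variables d are the
# means of normal random variables X ~ N(d, 0.3^2). We minimize cost subject
# to an approximated probabilistic constraint, mirroring the idea of turning
# probabilistic constraints into ordinary deterministic ones.

rng = np.random.default_rng(1)
U = rng.standard_normal((20000, 2))   # fixed sample -> Pf varies smoothly in d
SIGMA, PF_TARGET = 0.3, 1e-3

def g(x):                             # limit state: failure when g(x) < 0
    return x[:, 0] ** 2 * x[:, 1] / 20.0 - 1.0

def pf_approx(d):                     # crude moment-based approximation of Pf
    gs = g(d + SIGMA * U)
    return norm.cdf(-gs.mean() / gs.std())

cost = lambda d: d[0] + d[1]
res = minimize(cost, x0=np.array([4.0, 4.0]), method="SLSQP",
               bounds=[(0.5, 6.0)] * 2,
               constraints=[{"type": "ineq",
                             "fun": lambda d: PF_TARGET - pf_approx(d)}])
print("optimal design:", res.x, " approx Pf:", pf_approx(res.x))
```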

  8. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish the relationship between safety factors and reliability. Results obtained show that the use of the safety factor is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas the safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that probabilistic methods can eliminate existing over-design or under-design. The report includes three parts: Part 1-Random Actual Stress and Deterministic Yield Stress; Part 2-Deterministic Actual Stress and Random Yield Stress; Part 3-Both Actual Stress and Yield Stress Are Random.
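
    For the report's simplest setting (Part 3, with both actual stress and yield stress normal and independent), the link between a required reliability and a central safety factor can be written down directly. A minimal sketch; the coefficients of variation are illustrative:

```python
import numpy as np
from scipy.stats import norm

# Illustrative link between reliability and the central safety factor when
# both actual stress S and yield stress Y are normal and independent
# (the report's Part 3). Failure occurs when Y < S.

def reliability(mu_y, sig_y, mu_s, sig_s):
    beta = (mu_y - mu_s) / np.hypot(sig_y, sig_s)   # reliability index
    return norm.cdf(beta)

def required_safety_factor(target_r, cov_y, cov_s):
    # Central safety factor n = mu_Y / mu_S (stress mean normalized to 1).
    # beta = (n - 1) / sqrt(n^2 cov_y^2 + cov_s^2) is quadratic in n.
    beta = norm.ppf(target_r)
    a = 1.0 - beta**2 * cov_y**2
    c = 1.0 - beta**2 * cov_s**2
    return (1.0 + np.sqrt(1.0 - a * c)) / a

n = required_safety_factor(0.999, cov_y=0.10, cov_s=0.15)
print(f"safety factor for R = 0.999: n = {n:.2f}")
print(f"check: R = {reliability(n, 0.10 * n, 1.0, 0.15):.4f}")
```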

  9. On the security of a novel probabilistic signature based on bilinear square Diffie-Hellman problem and its extension.

    PubMed

    Zhao, Zhenguo; Shi, Wenbo

    2014-01-01

    Probabilistic signature (PS) schemes have been widely used in modern electronic commerce since they provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel PS scheme using the bilinear square Diffie-Hellman (BSDH) problem. They also extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS scheme and UDVS scheme. Through concrete attacks, we demonstrate that neither of their schemes is unforgeable. The security analysis shows that their schemes are not suitable for practical applications.

  10. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  11. Wind/tornado design criteria, development to achieve required probabilistic performance goals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, D.S.

    1991-06-01

    This paper describes the strategy for developing new design criteria for a critical facility to withstand loading induced by the wind/tornado hazard. The proposed design requirements for resisting wind/tornado loads are based on probabilistic performance goals. The proposed design criteria were prepared by a Working Group consisting of six experts in wind/tornado engineering and meteorology. Utilizing their best technical knowledge and judgment in the wind/tornado field, they met and discussed the methodologies and reviewed available data. A review of the available wind/tornado hazard model for the site, structural response evaluation methods, and conservative acceptance criteria led to proposed design criteria that have a high probability of achieving the required performance goals.
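
    The paper does not publish its hazard model, but the basic step of tying a design load to a probabilistic performance goal can be sketched: read the design wind speed off a site hazard curve at the target annual exceedance probability. The hazard curve below is hypothetical:

```python
import numpy as np

# Hypothetical site hazard curve: wind speed vs. annual exceedance
# probability. Values are invented for illustration only.
speeds = np.array([30., 40., 50., 60., 70., 80.])            # m/s
annual_exceedance = np.array([1e-1, 2e-2, 4e-3, 8e-4, 1.5e-4, 3e-5])

goal = 1e-4   # probabilistic performance goal: 1e-4 per year

# Hazard curves are roughly log-linear, so interpolate in log space
# (np.interp needs ascending x, hence the reversed arrays).
design_speed = np.interp(np.log(goal),
                         np.log(annual_exceedance[::-1]),
                         speeds[::-1])
print(f"design wind speed for {goal:.0e}/yr: {design_speed:.1f} m/s")
```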

  13. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and the fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  14. Preliminary Structural Sensitivity Study of Hypersonic Inflatable Aerodynamic Decelerator Using Probabilistic Methods

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2014-01-01

    Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology validation via flight testing. This paper explores the implementation of probabilistic methods in the sensitivity analysis of the structural response of a Hypersonic Inflatable Aerodynamic Decelerator (HIAD). HIAD architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during re-entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. In the example presented here, the structural parameters of an existing HIAD model have been varied to illustrate the design approach utilizing uncertainty-based methods. Surrogate models have been used to reduce the computational expense by several orders of magnitude. The suitability of the design is based on assessing variation in the resulting cone angle. The acceptable cone angle variation would rely on the aerodynamic requirements.
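
    The HIAD model itself is not available from the abstract, but the surrogate-based propagation it describes, fitting a cheap model to a handful of expensive runs and then sampling the surrogate, can be sketched. The stand-in expensive_model() and all numbers are invented for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    # Stand-in for a costly structural run (e.g., a finite element solve).
    return 70.0 + 3.0 * x[:, 0] - 2.0 * x[:, 1] + 0.5 * x[:, 0] * x[:, 1]

rng = np.random.default_rng(0)
X_doe = rng.uniform(-1.0, 1.0, size=(20, 2))     # small design of experiments
y_doe = expensive_model(X_doe)                   # e.g., cone angle in degrees

# Fit the surrogate once on the few expensive runs.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                              normalize_y=True).fit(X_doe, y_doe)

# Monte Carlo on the surrogate: thousands of samples at negligible cost.
X_mc = rng.normal(0.0, 0.3, size=(50000, 2))     # uncertain structural inputs
angle = gp.predict(X_mc)
print(f"cone angle: mean {angle.mean():.2f}, std {angle.std():.2f}")
```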

  15. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for the aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The approach shows that probabilistic analysis provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  16. Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, J.; Ayala, S.

    1999-01-01

    NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.

  17. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that received considerable emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanisms and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps understand failure mechanisms and improve component and system design. PRA is a system scenario based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk-informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail. It discusses their differences and similarities and how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.
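
    The fault tree/event tree machinery described above reduces, for a toy case, to simple probability algebra. A minimal sketch answering the three PRA questions for one initiating event and one mitigating system; all probabilities are illustrative:

```python
# Toy quantification of the PRA logic: an initiating event, a fault tree for
# the mitigating system, and an event tree enumerating the scenarios.
# All numbers are illustrative only.

P_INIT = 1e-2            # initiating event frequency (per mission)

# Fault tree for the mitigating system: TOP = A OR (B AND C), independent.
pA, pB, pC = 1e-3, 5e-2, 2e-2
p_system_fails = 1 - (1 - pA) * (1 - pB * pC)

# Event tree: initiating event, then mitigation success/failure branches.
scenarios = {
    "init + mitigated (degraded performance)": P_INIT * (1 - p_system_fails),
    "init + mitigation fails (loss of mission)": P_INIT * p_system_fails,
}
for name, p in scenarios.items():        # what can go wrong, and how likely
    print(f"{name}: {p:.3e}")
```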

  18. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  19. Probabilistic Assessment of National Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M.; Chamis, C. C.

    1996-01-01

    A preliminary probabilistic structural assessment of the critical section of the National Wind Tunnel (NWT) is performed using the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) computer code, thereby demonstrating the capabilities of the NESSUS code for addressing reliability issues of the NWT. Uncertainties in the geometry, material properties, loads, and stiffener location on the NWT are considered in the reliability assessment. Probabilistic stress, frequency, buckling, fatigue, and proof load analyses are performed. These analyses cover the major global and some local design requirements. Based on the assumed uncertainties, the results indicate a minimum reliability of 0.999 for the NWT. Preliminary life prediction analysis results show that the life of the NWT is governed by the fatigue of welds. Also, a reliability-based proof test assessment is performed.

  20. On the Security of a Novel Probabilistic Signature Based on Bilinear Square Diffie-Hellman Problem and Its Extension

    PubMed Central

    Zhao, Zhenguo; Shi, Wenbo

    2014-01-01

    Probabilistic signature (PS) schemes have been widely used in modern electronic commerce since they provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel PS scheme using the bilinear square Diffie-Hellman (BSDH) problem. They also extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS scheme and UDVS scheme. Through concrete attacks, we demonstrate that neither of their schemes is unforgeable. The security analysis shows that their schemes are not suitable for practical applications. PMID:25025083

  1. Robust Control Design for Uncertain Nonlinear Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Andrews, Lindsey; Giesy, Daniel P.

    2012-01-01

    Robustness to parametric uncertainty is fundamental to successful control system design and as such it has been at the core of many design methods developed over the decades. Despite its prominence, most of the work on robust control design has focused on linear models and uncertainties that are non-probabilistic in nature. Recently, researchers have acknowledged this disparity and have been developing theory to address a broader class of uncertainties. This paper presents an experimental application of robust control design for a hybrid class of probabilistic and non-probabilistic parametric uncertainties. The experimental apparatus is based upon the classic inverted pendulum on a cart. The physical uncertainty is realized by a known additional lumped mass at an unknown location on the pendulum. This unknown location has the effect of substantially altering the nominal frequency and controllability of the nonlinear system, and in the limit has the capability to make the system neutrally stable and uncontrollable. Another uncertainty to be considered is a direct current motor parameter. The control design objective is to design a controller that satisfies stability, tracking error, control power, and transient behavior requirements for the largest range of parametric uncertainties. This paper presents an overview of the theory behind the robust control design methodology and the experimental results.

  2. [National Health and Nutrition Survey 2012: design and coverage].

    PubMed

    Romero-Martínez, Martín; Shamah-Levy, Teresa; Franco-Núñez, Aurora; Villalpando, Salvador; Cuevas-Nasu, Lucía; Gutiérrez, Juan Pablo; Rivera-Dommarco, Juan Ángel

    2013-01-01

    To describe the design and population coverage of the National Health and Nutrition Survey 2012 (NHNS 2012). The design of the NHNS 2012 is reported as a probabilistic, population-based survey with multi-stage, stratified sampling, together with the sample's inferential properties, the logistical procedures, and the coverage obtained. The household response rate for the NHNS 2012 was 87%, with complete data from 50,528 households; 96,031 individual interviews selected by age and 14,104 interviews of ambulatory health service users were also obtained. The probabilistic design of the NHNS 2012 and its coverage allow inferences to be made about health and nutrition conditions, health program coverage, and access to health services. Because of the complex design, all estimations from the NHNS 2012 must use the survey design variables: weights, primary sampling units, and strata.
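
    The closing point, that estimates must use the survey design, can be illustrated with a design-weighted point estimate. The microdata below are fabricated; a real NHNS 2012 analysis would also use the strata and primary sampling unit variables for variance estimation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
weight = rng.uniform(50, 2000, n)        # expansion weights from the design

# Fabricated outcome that depends on the weight, to show why weighting
# matters: high-weight households have higher prevalence here.
p_true = np.where(weight > 1000, 0.25, 0.10)
y = rng.binomial(1, p_true)              # e.g., presence of a condition

print(f"unweighted prevalence:      {y.mean():.3f}")
print(f"design-weighted prevalence: {np.average(y, weights=weight):.3f}")
# A full analysis would also use strata and PSU identifiers to obtain
# design-correct standard errors (e.g., by Taylor linearization).
```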

  3. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  4. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
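
    The planner/scheduler time-line idea described above maps naturally onto a priority queue of timed events. A minimal sketch, not the IMM implementation; the event names, rates, and escalation rule are invented:

```python
import heapq
import random

# Timed events on a single time line: the "planner" pushes events onto a
# priority queue and the "scheduler" pops them in time order, with events
# able to spawn dependent follow-on events. Illustrative only.

random.seed(0)
MISSION_END = 30.0                       # days
timeline = [(5.0, "medical_event")]      # (time, event) min-heap

while timeline:
    t, event = heapq.heappop(timeline)
    if t > MISSION_END:
        break
    print(f"t = {t:5.1f} d: {event}")
    if event == "medical_event":
        if random.random() < 0.4:        # dependency: possible escalation
            heapq.heappush(timeline, (t + random.uniform(1, 5), "escalation"))
        heapq.heappush(timeline,         # next occurrence of the event
                       (t + random.expovariate(1 / 10.0), "medical_event"))
```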

  5. A Bayesian Framework for Analysis of Pseudo-Spatial Models of Comparable Engineered Systems with Application to Spacecraft Anomaly Prediction Based on Precedent Data

    NASA Astrophysics Data System (ADS)

    Ndu, Obibobi Kamtochukwu

    To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human, cognitive ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
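
    The dissertation's pipeline, expert rank-order dissimilarities embedded by non-metric multidimensional scaling and then interpolated by a Kriging-like model, can be sketched with standard tools. The dissimilarity matrix and reliability values below are invented, and Gaussian process regression stands in for the Kriging step:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.manifold import MDS

# Rank-order expert dissimilarities for four comparable systems; system 3
# is the conceptual variant whose reliability we want to infer.
D = np.array([[0, 1, 3, 4],
              [1, 0, 2, 3],
              [3, 2, 0, 1],
              [4, 3, 1, 0]], dtype=float)

# Non-metric MDS builds the 2-D "pseudo-space" from the dissimilarities.
coords = MDS(n_components=2, metric=False, dissimilarity="precomputed",
             random_state=0).fit_transform(D)

# Gaussian process regression stands in for Kriging over the pseudo-space.
gp = GaussianProcessRegressor(normalize_y=True).fit(coords[:3],
                                                    [0.95, 0.93, 0.88])
mean, std = gp.predict(coords[3:], return_std=True)
print(f"inferred reliability: {mean[0]:.3f} +/- {std[0]:.3f}")
```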

  6. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    NASA Astrophysics Data System (ADS)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation in addition to providing structural resilience of critical facilities, infrastructure, and key resources necessary for immediate response and economic and social recovery. Critical facilities would include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will be initially applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and states in the Pacific Northwest regions governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis. Engineering design must consider the occurrence of events greater than the scenarios in the historical record, and should properly be based on the underlying seismicity of subduction zones. Therefore, Probabilistic Tsunami Hazard Analysis (PTHA) consistent with source seismicity must be performed in addition to consideration of historical event scenarios. A method of Probabilistic Tsunami Hazard Analysis has been established that is generally consistent with Probabilistic Seismic Hazard Analysis in the treatment of uncertainty. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. Structural member acceptability criteria will be based on performance objectives for a 2,500-year Maximum Considered Tsunami. The approach developed by the Tsunami Loads and Effects Subcommittee of the ASCE 7 Standard would result in the first national unification of tsunami hazard criteria for design codes, reflecting the modern approach of performance-based engineering.

  7. Fully probabilistic control design in an adaptive critic framework.

    PubMed

    Herzallah, Randa; Kárný, Miroslav

    2011-12-01

    An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description from the desired one. Practical exploitation of fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem; in particular, very hard multivariate integration and approximate interpolation of the involved multivariate functions. This paper proposes a new fully probabilistic control algorithm that uses adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is the main contribution of this paper. Copyright © 2011 Elsevier Ltd. All rights reserved.
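
    The quantity FPD minimizes is concrete enough to show directly: the Kullback-Leibler divergence of the actual closed-loop description from the desired one. A minimal discrete sketch; the distributions are illustrative, and the adaptive critic machinery of the paper is not reproduced:

```python
import numpy as np

def kl(f, g):
    # Kullback-Leibler divergence D(f || g) for discrete distributions.
    f, g = np.asarray(f, float), np.asarray(g, float)
    mask = f > 0
    return float(np.sum(f[mask] * np.log(f[mask] / g[mask])))

# Next-state distributions of the closed loop under two candidate control
# laws, compared against the desired (ideal) closed-loop description.
ideal     = [0.80, 0.15, 0.05]
control_a = [0.70, 0.20, 0.10]
control_b = [0.50, 0.30, 0.20]

# FPD selects the law whose closed loop is closest to the ideal in KL sense.
for name, f in [("a", control_a), ("b", control_b)]:
    print(f"controller {name}: KL = {kl(f, ideal):.4f}")
```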

  8. Development of probabilistic design method for annular fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozawa, Takayuki

    2007-07-01

    The increase of linear power and burn-up during reactor operation is considered as one measure to ensure the utility of fast reactors in the future; to this end, the application of annular oxide fuels is under consideration. The annular fuel design code CEPTAR was developed in the Japan Atomic Energy Agency (JAEA) and verified using many irradiation experiences with oxide fuels. In addition, the probabilistic fuel design code BORNFREE was also developed to provide a safe and reasonable fuel design and to evaluate the design margins quantitatively. This study aimed at the development of a probabilistic design method for annular oxide fuels; this was implemented in the developed BORNFREE-CEPTAR code, and the code was used to make a probabilistic evaluation with regard to the permissible linear power. (author)

  9. Design for Reliability and Safety Approach for the NASA New Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal, M.; Weldon, Danny M.

    2007-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program intended for sending crew and cargo to the International Space Station (ISS), to the moon, and beyond. This program is called Constellation. As part of the Constellation program, NASA is developing new launch vehicles aimed at significantly increasing safety and reliability, reducing the cost of accessing space, and providing a growth path for manned space exploration. Achieving these goals requires a rigorous process that addresses reliability, safety, and cost upfront and throughout all the phases of the life cycle of the program. This paper discusses the "Design for Reliability and Safety" approach for the new NASA crew launch vehicle called ARES I. The ARES I is being developed by NASA Marshall Space Flight Center (MSFC) in support of the Constellation program. The ARES I consists of three major elements: a solid First Stage (FS), an Upper Stage (US), and a liquid Upper Stage Engine (USE). Stacked on top of the ARES I is the Crew Exploration Vehicle (CEV). The CEV consists of a Launch Abort System (LAS), Crew Module (CM), Service Module (SM), and a Spacecraft Adapter (SA). The CEV development is being led by NASA Johnson Space Center (JSC). Designing for high reliability and safety requires a good integrated working environment and a sound technical design approach. The "Design for Reliability and Safety" approach addressed in this paper covers both the environment and the technical process put in place to support the ARES I design. To address the integrated working environment, the ARES I project office has established a risk-based design group called the "Operability Design and Analysis" (OD&A) group. This group is intended to bring the engineering, design, and safety organizations together to optimize the system design for safety, reliability, and cost. On the technical side, the ARES I project has, through the OD&A environment, implemented a probabilistic approach to analyze and evaluate design uncertainties and understand their impact on safety, reliability, and cost. This paper focuses on the various probabilistic approaches that have been pursued by the ARES I project. Specifically, the paper discusses an integrated functional probabilistic analysis approach that addresses upfront some key areas to support the ARES I Design Analysis Cycle (DAC) pre-Preliminary Design (PD) phase. This functional approach is a probabilistic, physics-based approach that combines failure probabilities with system dynamics and engineering failure impact models to identify key system risk drivers and potential system design requirements. The paper also discusses other probabilistic risk assessment approaches planned by the ARES I project to support the PD phase and beyond.

  10. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic models of solar particle events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster focuses on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It also discusses methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
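
    The model-selection step described above, fitting several published spectral forms to each measured spectrum and keeping the best fit, can be sketched with two candidate forms. The "measured" fluences below are synthetic, and only a power law and an exponential are tried rather than the eight models the poster mentions:

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic event-integrated spectrum: fluence vs. energy, standing in for
# a measured SPE spectrum. Values are invented for illustration.
E = np.array([10., 30., 60., 100., 200., 400.])      # MeV/nucleon
F = 1e7 * E ** -2.1 * (1 + 0.05 * np.random.default_rng(0).standard_normal(6))

power_law = lambda E, A, g: A * E ** -g
exponential = lambda E, A, E0: A * np.exp(-E / E0)

best = None
for name, f, p0 in [("power law", power_law, (1e7, 2.0)),
                    ("exponential", exponential, (1e7, 100.0))]:
    p, _ = curve_fit(f, E, F, p0=p0, maxfev=20000)
    sse = np.sum((np.log(f(E, *p)) - np.log(F)) ** 2)   # compare in log space
    if best is None or sse < best[1]:
        best = (name, sse, p)

print(f"best-fitting form: {best[0]}, parameters {best[2]}")
```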

  11. Deployment Analysis of a Simple Tape-Spring Hinge Using Probabilistic Methods

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Horta, Lucas G.

    2012-01-01

    Acceptance of new deployable structures architectures and concepts requires validated design methods to minimize the expense involved with technology validation flight testing. Deployable concepts for large lightweight spacecraft include booms, antennae, and masts. This paper explores the implementation of probabilistic methods in the design process for the deployment of a strain-energy mechanism, specifically a simple tape-spring hinge. Strain-energy mechanisms are attractive for deployment in very lightweight systems because they do not require the added mass and complexity associated with motors and controllers. However, designers are hesitant to include free-deployment strain-energy mechanisms because of the potential for uncontrolled behavior. In the example presented here, the tape-spring cross-sectional dimensions have been varied and a target displacement during deployment has been selected as the design metric. Specifically, the tape-spring should reach the final position in the shortest time with the minimal amount of overshoot and oscillation. Surrogate models have been used to reduce computational expense. Parameter values to achieve the target response have been computed and used to demonstrate the approach. Based on these results, the application of probabilistic methods for the design of a tape-spring hinge shows promise as a means of designing strain-energy components for more complex space concepts.

  12. Overview of the SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division Activities and Technical Projects

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for which problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data, and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true. There are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - a probabilistic reliability literature search has been completed, along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - correct reliability computations at both the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - a draft of Volume 1, comprising aerospace applications, has been released. Volume 2, a compilation of real world applications of probabilistic methods with essential information demonstrating application type and time/cost savings from the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - too often, we say, 'The proof is in the pudding'. With help from many contributors, we hope to produce such a document. The problem is that not many people are coming forward, due to the proprietary nature of the information. So, we are asking contributors to document only minimum information, including the problem description, what method was used, whether it resulted in any savings, and how much; (5) Software Reliability - software reliability concepts, programs, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood and addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance and life.

  13. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
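
    The closed-form Weibull case mentioned above is the rare instance where the failure probability needs no numerical integration, which makes it a convenient check on the Monte Carlo approach the article introduces next. Parameters are illustrative:

```python
import numpy as np

# Two-parameter Weibull failure probability for a brittle component at an
# applied stress, checked against Monte Carlo. Parameters are illustrative.
m, sigma0 = 10.0, 300.0        # Weibull modulus and scale (MPa)
sigma_applied = 200.0          # applied stress (MPa)

pf_closed_form = 1.0 - np.exp(-(sigma_applied / sigma0) ** m)

# Monte Carlo check: sample strengths, count failures (strength < stress).
rng = np.random.default_rng(0)
strength = sigma0 * rng.weibull(m, size=1_000_000)
pf_mc = np.mean(strength < sigma_applied)

print(f"closed form Pf = {pf_closed_form:.5f}, Monte Carlo Pf = {pf_mc:.5f}")
```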

  14. Development of Maximum Considered Earthquake Ground Motion Maps

    USGS Publications Warehouse

    Leyendecker, E.V.; Hunt, R.J.; Frankel, A.D.; Rukstales, K.S.

    2000-01-01

    The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings use a design procedure that is based on spectral response acceleration rather than the traditional peak ground acceleration, peak ground velocity, or zone factors. The spectral response accelerations are obtained from maps prepared following the recommendations of the Building Seismic Safety Council's (BSSC) Seismic Design Procedures Group (SDPG). The SDPG-recommended maps, the Maximum Considered Earthquake (MCE) Ground Motion Maps, are based on the U.S. Geological Survey (USGS) probabilistic hazard maps with additional modifications incorporating deterministic ground motions in selected areas and the application of engineering judgment. The MCE ground motion maps included with the 1997 NEHRP Provisions also serve as the basis for the ground motion maps used in the seismic design portions of the 2000 International Building Code and the 2000 International Residential Code. Additionally, the design maps prepared for the 1997 NEHRP Provisions, combined with selected USGS probabilistic maps, are used with the 1997 NEHRP Guidelines for the Seismic Rehabilitation of Buildings.

  15. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    PubMed Central

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A.; Weinberger, Daniel R.

    2010-01-01

    Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502

  16. Probabilistic Structural Analysis of the Solid Rocket Booster Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, Jeff; Ayala, Samuel

    2000-01-01

    NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.

  17. Design of Critical Components

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.

    2001-01-01

    Critical component design is based on minimizing product failures that result in loss of life. Potential catastrophic failures are reduced to secondary failures when components are removed for cause or after a set operating time in the system. Issues of liability and the cost of component removal become of paramount importance. Deterministic design with factors of safety, and probabilistic design, address the problem but lack essential characteristics for the design of critical components. In deterministic design and fabrication there are heuristic rules and safety factors developed over time for large sets of structural/material components. These factors did not come without cost. Many designs failed, and many rules (codes) have standing committees to oversee their proper usage and enforcement. In probabilistic design, not only are failures a given, the failures are calculated; an element of risk is assumed based on empirical failure data for large classes of component operations. Failure of a class of components can be predicted, yet one cannot predict when a specific component will fail. The analogy is to the life insurance industry, where very careful statistics are kept on classes of individuals. For a specific class, life span can be predicted within statistical limits, yet the life span of a specific member of that class cannot be predicted.

  18. Application of Probabilistic Methods for the Determination of an Economically Robust HSCT Configuration

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.; Bandte, Oliver; Schrage, Daniel P.

    1996-01-01

    This paper outlines an approach for the determination of economically viable robust design solutions, using the High Speed Civil Transport (HSCT) as a case study. Furthermore, the paper states the advantages of probability-based aircraft design over the traditional point design approach. It also proposes a new methodology called Robust Design Simulation (RDS), which treats customer satisfaction as the ultimate design objective. RDS is based on a probabilistic approach to aerospace systems design, which views the chosen objective as a distribution function introduced by so-called noise or uncertainty variables. Since the designer has no control over these variables, a variability distribution is defined for each one of them. The cumulative effect of all these distributions causes the overall variability of the objective function. For cases where the selected objective function depends heavily on these noise variables, it may be desirable to obtain a design solution that minimizes this dependence. The paper outlines a step-by-step approach for achieving such a solution for the HSCT case study and introduces an evaluation criterion which guarantees the highest customer satisfaction. This customer satisfaction is expressed by the probability of achieving objective function values less than a desired target value.

  19. Target Coverage in Wireless Sensor Networks with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao

    2016-01-01

    Sensing coverage is a fundamental problem in wireless sensor networks (WSNs) and has attracted considerable attention. Conventional research on this topic focuses on the 0/1 coverage model, which is only a coarse approximation of the practical sensing model. In this paper, we study the target coverage problem, where the objective is to find the minimum number of sensor nodes in randomly-deployed WSNs based on the probabilistic sensing model. We analyze the joint detection probability of a target observed by multiple sensors. Based on the theoretical analysis of the detection probability, we formulate the minimum ϵ-detection coverage problem. We prove that the minimum ϵ-detection coverage problem is NP-hard and present an approximation algorithm called the Probabilistic Sensor Coverage Algorithm (PSCA) with provable approximation ratios. To evaluate our design, we analyze the performance of PSCA theoretically and also perform extensive simulations to demonstrate the effectiveness of our proposed algorithm. PMID:27618902
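
    A compact numeric sketch of the joint-detection computation follows. It assumes the common exponential-decay probabilistic sensing model p(d) = exp(-alpha*d^beta) with independent sensors; the paper's exact sensing model and parameter values are not reproduced here, so alpha, beta, and the distances below are illustrative.

      import numpy as np

      def single_sensor_detection(d, alpha=0.5, beta=1.0):
          """Detection probability of one sensor at distance d from the target
          (assumed exponential-decay probabilistic sensing model)."""
          return np.exp(-alpha * d**beta)

      def joint_detection(distances, alpha=0.5, beta=1.0):
          """Target is detected if at least one of the independent sensors detects it."""
          p_miss = np.prod([1.0 - single_sensor_detection(d, alpha, beta) for d in distances])
          return 1.0 - p_miss

      # epsilon-detection check: do these three sensors cover the target with P >= 0.9?
      distances = [1.0, 1.5, 2.2]
      p = joint_detection(distances)
      print(f"joint detection probability = {p:.3f}, covered: {p >= 0.9}")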

  20. Expert Design Advisor

    DTIC Science & Technology

    1990-10-01

    to economic, technological, spatial or logistic concerns, or involve training, man-machine interfaces, or integration into existing systems. Once the...probabilistic reasoning, mixed analysis- and simulation-oriented, mixed computation- and communication-oriented, nonpreemptive static priority...scheduling base, nonrandomized, preemptive static priority scheduling base, randomized, simulation-oriented, and static scheduling base. The selection of both

  1. User perception and interpretation of tornado probabilistic hazard information: Comparison of four graphical designs.

    PubMed

    Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans

    2017-11-01

    Effective design for presenting severe weather information is important to reduce the devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by the NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale", and "contour", on users' perception, interpretation, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that accuracy was significantly higher and response time faster when PHI was displayed on the map background as compared to the radar background, due to better contrast. When displayed on a radar background, the "grayscale" design resulted in a higher accuracy of responses. Possibly due to familiarity, participants reported the "four-color" design as their favorite, and it also resulted in the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving-judging-feeling-acting approach in processing probabilistic hazard information for tornadoes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
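
    The AMV idea is compact enough to sketch: a mean value (MV) linearization of the response supplies, for each probability level, an approximate most probable point, and the exact response is re-evaluated there to correct the CDF estimate. The response function g and the standard normal inputs below are illustrative stand-ins, not the paper's examples.

      import numpy as np
      from scipy.stats import norm

      def g(u):
          # Arbitrary nonlinear response of two standard normal variables.
          u = np.asarray(u, dtype=float)
          return (u[..., 0] + 2.0)**2 + 3.0 * u[..., 1]

      def grad_g(u, h=1e-6):
          u = np.asarray(u, dtype=float)
          return np.array([(g(u + h*e) - g(u - h*e)) / (2.0*h) for e in np.eye(len(u))])

      mean = np.zeros(2)
      grad = grad_g(mean)
      alpha = grad / np.linalg.norm(grad)            # unit sensitivity direction

      levels = np.array([0.01, 0.10, 0.50, 0.90, 0.99])
      betas = norm.ppf(levels)                       # signed distance to each MPP

      g_mv = g(mean) + np.linalg.norm(grad) * betas          # first-order (MV) CDF points
      g_amv = np.array([g(mean + b * alpha) for b in betas]) # AMV correction at the MPPs

      # Monte Carlo reference
      g_mc = np.quantile(g(np.random.default_rng(0).standard_normal((200_000, 2))), levels)
      for p, a, b, c in zip(levels, g_mv, g_amv, g_mc):
          print(f"P={p:4.2f}  MV={a:8.3f}  AMV={b:8.3f}  MC={c:8.3f}")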

  3. Ares I Static Tests Design

    NASA Technical Reports Server (NTRS)

    Carson, William; Lindemuth, Kathleen; Mich, John; White, K. Preston; Parker, Peter A.

    2009-01-01

    Probabilistic engineering design enhances safety and reduces costs by incorporating risk assessment directly into the design process. In this paper, we assess the format of the quantitative metrics for the vehicle which will replace the Space Shuttle, the Ares I rocket. Specifically, we address the metrics for in-flight measurement error in the vector position of the motor nozzle, dictated by limits on guidance, navigation, and control systems. Analyses include the propagation of error from measured to derived parameters, the time-series of dwell points for the duty cycle during static tests, and commanded versus achieved yaw angle during tests. Based on these analyses, we recommend a probabilistic template for specifying the maximum error in angular displacement and radial offset for the nozzle-position vector. Criteria for evaluating individual tests and risky decisions also are developed.
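
    The error-propagation step of such an analysis is easy to sketch with Monte Carlo sampling: perturb the measured quantities by their assumed noise and recompute the derived parameters. All values below (geometry, noise level, percentile) are hypothetical illustrations, not Ares I requirements.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      x_true, y_true = 2.0, -1.0      # lateral nozzle position, mm (hypothetical)
      sigma_meas = 0.25               # per-axis measurement noise, mm (hypothetical)
      pivot_len = 1500.0              # pivot-to-measurement-plane length, mm (hypothetical)

      x = x_true + rng.normal(0.0, sigma_meas, n)
      y = y_true + rng.normal(0.0, sigma_meas, n)

      radial_offset = np.hypot(x, y)                            # derived parameter 1
      angle_mrad = 1e3 * np.arctan2(radial_offset, pivot_len)   # derived parameter 2

      for name, v in [("radial offset (mm)", radial_offset),
                      ("angular displacement (mrad)", angle_mrad)]:
          print(f"{name}: mean={v.mean():.3f}, 99.7th percentile={np.percentile(v, 99.7):.3f}")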

  4. Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2003-01-01

    The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.
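
    The shape of such an evaluation is easy to sketch: sample the uncertain performance parameters, compute the system response, and report CDF points plus rank-correlation sensitivity factors. The hybrid-efficiency expression and all distributions below are illustrative stand-ins, not the paper's cycle model.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(2)
      n = 50_000
      eta_fc  = rng.normal(0.50, 0.02, n)   # fuel-cell efficiency (assumed)
      eta_gt  = rng.normal(0.30, 0.03, n)   # gas-turbine efficiency (assumed)
      f_split = rng.normal(0.70, 0.02, n)   # fraction of fuel energy to the fuel cell (assumed)

      # Toy series hybrid: the fuel cell converts f_split of the fuel energy,
      # the turbine converts the remainder.
      eta_sys = f_split * eta_fc + (1.0 - f_split) * eta_gt

      probs = [0.01, 0.50, 0.99]
      print("efficiency CDF points:",
            dict(zip(probs, np.round(np.quantile(eta_sys, probs), 4))))
      for name, v in [("eta_fc", eta_fc), ("eta_gt", eta_gt), ("f_split", f_split)]:
          rho, _ = spearmanr(v, eta_sys)    # rank correlation as a sensitivity factor
          print(f"sensitivity of eta_sys to {name}: {rho:+.2f}")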

  5. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    NASA Technical Reports Server (NTRS)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight Optimization System) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.

  6. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis makes it possible to realistically predict structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing, or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, with or without prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties, such as the shape of the tensile softening branch, high toughness, and ductility, are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows structural safety to be evaluated in terms of failure probability or, respectively, a reliability index. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty and randomness of the material properties, obtained from material tests, are accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated with results from two probabilistic studies of different types of concrete structures related to practical applications and made from various materials (with parameters obtained from real material tests).
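
    The reliability bookkeeping at the end of such an analysis is compact: each randomized nonlinear analysis yields one sample of structural resistance, which is compared with the load effect to estimate a failure probability and the corresponding reliability index beta = -Phi^(-1)(P_f). The sketch below substitutes an assumed lognormal resistance and a normal load effect for actual randomized finite element results.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(3)
      n = 1_000_000
      R = rng.lognormal(mean=np.log(550.0), sigma=0.10, size=n)  # resistance samples, kN (assumed)
      E = rng.normal(380.0, 30.0, size=n)                        # load-effect samples, kN (assumed)

      p_f = np.mean(R <= E)     # Monte Carlo failure probability
      beta = -norm.ppf(p_f)     # corresponding reliability index
      print(f"P_f = {p_f:.2e}, beta = {beta:.2f}")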

  7. Specifying design conservatism: Worst case versus probabilistic analysis

    NASA Technical Reports Server (NTRS)

    Miles, Ralph F., Jr.

    1993-01-01

    Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.
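
    The contrast is easy to make concrete with a tolerance stack: a worst-case analysis adds every tolerance extreme, while a probabilistic analysis asks how likely that combination is. A minimal sketch with purely illustrative numbers, assuming each tolerance spans plus-or-minus three sigma of a normal distribution:

      import numpy as np

      rng = np.random.default_rng(4)
      nominal, tol, n_parts = 10.0, 0.1, 5          # five parts, 10 mm +/- 0.1 mm each

      worst_case = n_parts * (nominal + tol)        # 50.5 mm; essentially never occurs

      sigma = tol / 3.0                             # +/- tol treated as +/- 3 sigma
      stack = rng.normal(nominal, sigma, size=(1_000_000, n_parts)).sum(axis=1)

      threshold = n_parts * nominal + 3 * tol       # even 3 of the 5 tolerances stacked
      print(f"worst case        : {worst_case:.3f} mm")
      print(f"99.87th percentile: {np.percentile(stack, 99.87):.3f} mm")
      print(f"P(stack > {threshold:.1f} mm) = {np.mean(stack > threshold):.1e}")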

  8. Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System

    DTIC Science & Technology

    2014-06-01

    in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the... inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. The Reference Architecture for...ontology, terrorism, inferential reasoning, architecture I. INTRODUCTION A. Background Whether by nature or design, the personas of terrorists are

  9. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and in industry. At present, however, these analyses are used for turbomachinery design with uncertainties accounted for by safety factors. This approach may lead to overly conservative designs, thereby reducing the potential for designing higher-efficiency engines. Integrating deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe, reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are probability density functions for the response properties. The probabilistic sensitivities of the response variables to uncertainty in the primitive variables are obtained as a byproduct of the FPI technique. The combined aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Of the 11 design parameters, 6 are considered to have probabilistic variation: space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow at Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density functions of the predicted aerodynamic damping and frequency for flutter and of the response amplitudes for forced response.

  10. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions, and elementary school students are expected to develop it as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural, and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a mathematical ability test for high math ability. The subjects were given probability tasks on sample space, the probability of an event, and probability comparison. Data analysis consisted of categorization, reduction, interpretation, and conclusion, with data credibility established through time triangulation. The results indicated that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's probabilistic thinking level was higher than the girl's. These results could assist curriculum developers in formulating probability learning goals for elementary school students, and teachers could teach probability with regard to gender differences.

  11. A Probabilistic Approach to Predict Thermal Fatigue Life for Ball Grid Array Solder Joints

    NASA Astrophysics Data System (ADS)

    Wei, Helin; Wang, Kuisheng

    2011-11-01

    Numerous studies of the reliability of solder joints have been performed. Most life prediction models are limited to a deterministic approach. However, manufacturing induces uncertainty in the geometry parameters of solder joints, and the environmental temperature varies widely due to end-user diversity, creating uncertainties in the reliability of solder joints. In this study, a methodology for accounting for variation in the lifetime prediction for lead-free solder joints of ball grid array packages (PBGA) is demonstrated. The key aspects of the solder joint parameters and the cyclic temperature range related to reliability are involved. Probabilistic solutions of the inelastic strain range and thermal fatigue life based on the Engelmaier model are developed to determine the probability of solder joint failure. The results indicate that the standard deviation increases significantly when more random variations are involved. Using the probabilistic method, the influence of each variable on the thermal fatigue life is quantified. This information can be used to optimize product design and process validation acceptance criteria. The probabilistic approach creates the opportunity to identify the root causes of failed samples from product fatigue tests and field returns. The method can be applied to better understand how variation affects parameters of interest in an electronic package design with area array interconnections.
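
    The flavor of the probabilistic solution can be sketched with the textbook SnPb form of the Engelmaier model; the paper addresses lead-free joints, whose fatigue constants differ, and every distribution and parameter below is hypothetical.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000

      # Random geometry and loading (hypothetical distributions)
      L_D = rng.normal(10.0, 0.3, n)     # distance from neutral point, mm
      h   = rng.normal(0.60, 0.03, n)    # solder-joint height, mm
      dT  = rng.normal(80.0, 8.0, n)     # cyclic temperature swing, K
      d_cte, F = 14e-6, 1.0              # CTE mismatch (1/K), empirical correction

      # Cyclic shear strain range, then Engelmaier mean-life estimate
      d_gamma = F * (L_D / h) * d_cte * dT
      T_sj, f = 40.0, 2.0                # mean cyclic solder temperature (C), cycles/day
      c = -0.442 - 6.0e-4 * T_sj + 1.74e-2 * np.log(1.0 + f)
      eps_f = 0.325                      # fatigue ductility coefficient, eutectic SnPb
      N_f = 0.5 * (d_gamma / (2.0 * eps_f)) ** (1.0 / c)

      print(f"median life         = {np.median(N_f):,.0f} cycles")
      print(f"1st-percentile life = {np.percentile(N_f, 1):,.0f} cycles")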

  12. Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support

    NASA Technical Reports Server (NTRS)

    Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun

    2012-01-01

    This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.

  13. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  14. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  15. A probabilistic Sperner's theorem, with applications to the problem of retrieving information from a data base

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.; Mceliece, R. J.; Rodemich, E. R.; Rumsey, H., Jr.

    1978-01-01

    The design of an optimal merged keycode data base information retrieval system is detailed. A probability distribution of n-bit binary words that minimized false drops was developed for the case where the set of desired records was a subset of tagged records.

  16. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  17. Are there Benefits to Combining Regional Probabilistic Survey and Historic Targeted Environmental Monitoring Data to Improve Our Understanding of Overall Regional Estuary Environmental Status?

    NASA Astrophysics Data System (ADS)

    Dasher, D. H.; Lomax, T. J.; Bethe, A.; Jewett, S.; Hoberg, M.

    2016-02-01

    A regional probabilistic survey of 20 randomly selected stations, where water and sediments were sampled, was conducted over an area of Simpson Lagoon and Gwydyr Bay in the Beaufort Sea adjacent to Prudhoe Bay, Alaska, in 2014. Sampling parameters included water-column temperature, salinity, dissolved oxygen, chlorophyll a, and nutrients, and sediment macroinvertebrates, chemistry (i.e., trace metals and hydrocarbons), and grain size. The 2014 probabilistic survey design allows inferences to be made about environmental status, for instance the spatial or areal distribution of sediment trace metals within the sampled design area. Historically, since the 1970s, a number of monitoring studies have been conducted in this estuary area using targeted rather than regional probabilistic designs. Targeted, non-random designs were used to assess specific points of interest and cannot be used to make inferences about the distributions of environmental parameters. Because of the differences in monitoring objectives between probabilistic and targeted designs, there has been limited assessment of whether benefits exist to combining the two approaches. This study evaluates whether a combined approach using the 2014 probabilistic survey sediment trace metal and macroinvertebrate results together with historical targeted monitoring data can provide a new perspective and a better understanding of the environmental status of these estuaries.

  18. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  19. Optimization of Adaptive Intraply Hybrid Fiber Composites with Reliability Considerations

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1994-01-01

    The reliability with bounded distribution parameters (mean, standard deviation) was maximized and the reliability-based cost was minimized for adaptive intraply hybrid fiber composites by using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry, and control-related parameters. Probabilistic sensitivity factors were computed and used in the optimization procedures. For actuated change in the angle of attack of an airfoil-like composite shell structure with an adaptive torque plate, the reliability was maximized to a probability of 0.9999, with constraints on the mean and standard deviation of the actuation material volume ratio (the percentage of actuation composite material in a ply) and the actuation strain coefficient. The reliability-based cost was minimized for an airfoil-like composite shell structure with an adaptive skin and with the mean actuation material volume ratio as the design parameter. At a 0.9 mean actuation material volume ratio, the minimum cost was obtained.

  20. The application of probabilistic design theory to high temperature low cycle fatigue

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1981-01-01

    Metal fatigue under stress and thermal cycling is a principal mode of failure in gas turbine engine hot-section components such as turbine blades, disks, and combustor liners. Designing for fatigue is subject to considerable uncertainty, e.g., scatter in cycles to failure, in the available fatigue test data, and in the operating environment data, as well as uncertainties in the models used to predict stresses. Methods of analyzing fatigue test data for probabilistic design purposes are summarized. The general strain-life model, as well as homoscedastic and heteroscedastic models, are considered. Modern probabilistic design theory is reviewed, and examples are presented which illustrate its application to the reliability analysis of gas turbine engine components.

  1. Accounting for Uncertainties in Strengths of SiC MEMS Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.

    2007-01-01

    A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. It is intended to serve as part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle-material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and their structural components show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle-material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design-and-analysis process: it accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
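
    The Weibull size-scaling computation at the core of such predictions is compact: strength parameters estimated from simple specimens are extrapolated to a part with a different stressed area. A minimal sketch with illustrative parameters, not measured SiC MEMS data:

      import numpy as np

      m = 8.0          # Weibull modulus from specimen tests (assumed)
      sigma_0 = 4.0    # characteristic specimen strength, GPa (assumed)
      A_spec = 1.0     # stressed area of the test specimen (arbitrary units)
      A_part = 25.0    # stressed area of the complexly shaped part

      def p_fail(sigma, area):
          """Failure probability of a uniformly stressed area at stress sigma."""
          return 1.0 - np.exp(-(area / A_spec) * (sigma / sigma_0) ** m)

      # A larger stressed area samples more flaws, so Pf rises at the same stress.
      for sigma in [2.0, 2.5, 3.0]:
          print(f"sigma={sigma} GPa: Pf(specimen)={p_fail(sigma, A_spec):.4f}, "
                f"Pf(part)={p_fail(sigma, A_part):.4f}")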

  2. A Case Study for Probabilistic Methods Validation (MSFC Center Director's Discretionary Fund, Project No. 94-26)

    NASA Technical Reports Server (NTRS)

    Price, J. M.; Ortega, R.

    1998-01-01

    Probabilistic methods are not a universally accepted approach for the design and analysis of aerospace structures. The validity of this approach must be demonstrated to encourage its acceptance as a viable design and analysis tool for estimating structural reliability. The objective of this study is to develop a well-characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure-mode level.

  3. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

    The primary objective of the three-year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the areas of design decision-making, probabilistic modeling, and optimization. The specific focus of the three-year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means of making optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In the final year of research, the focus was on comparing and contrasting the three methods researched: the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, contained in Task B, focused on exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regard to identifying research needs in the baseline method through implementation exercises. The end result of Task B was documentation of the evolution of the method over time and a technology transfer to the sponsor, such that an initial capability for execution could be obtained by the sponsor. Specifically, the result of the year-3 efforts was the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and tutorials are attached in electronic form on the enclosed CD.

  4. Probabilistic Neighborhood-Based Data Collection Algorithms for 3D Underwater Acoustic Sensor Networks.

    PubMed

    Han, Guangjie; Li, Shanshan; Zhu, Chunsheng; Jiang, Jinfang; Zhang, Wenbo

    2017-02-08

    Marine environmental monitoring provides crucial information and support for the exploitation, utilization, and protection of marine resources. With the rapid development of information technology, three-dimensional underwater acoustic sensor networks (3D UASNs) provide a novel strategy for acquiring marine environment information conveniently, efficiently, and accurately. However, the specific propagation effects of the acoustic communication channel lead to a decreasing probability of successful information delivery with increasing distance. Therefore, we investigate two probabilistic neighborhood-based data collection algorithms for 3D UASNs, which are based on a probabilistic acoustic communication model instead of the traditional deterministic one. An autonomous underwater vehicle (AUV) is employed to traverse a designed path and collect data from neighborhoods. For 3D UASNs without prior deployment knowledge, partitioning the network into grids allows the AUV to visit the central location of each grid for data collection. For 3D UASNs in which the deployment knowledge is known in advance, the AUV only needs to visit several selected locations, determined by constructing a minimum probabilistic neighborhood covering set, to reduce data latency. Furthermore, by increasing the number of transmission rounds, our proposed algorithms can trade off data collection latency against information gain. These algorithms are compared with a basic nearest-neighbor heuristic algorithm via simulations. Simulation analyses show that our proposed algorithms can efficiently reduce the average data collection completion time, corresponding to a decrease in data latency.

  5. A note on probabilistic models over strings: the linear algebra approach.

    PubMed

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.

  6. A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’

    PubMed Central

    2017-01-01

    Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. Outline: 1 Introduction; 2 Preemption; 3 Structural Equation Models; 4 The Halpern and Pearl Definition of ‘Actual Cause’; 5 Preemption Again; 6 The Probabilistic Case; 7 Probabilistic Causal Models; 8 A Proposed Probabilistic Extension of Halpern and Pearl’s Definition; 9 Twardy and Korb’s Account; 10 Probabilistic Fizzling; 11 Conclusion. PMID:29593362

  7. Probabilistic distance-based quantizer design for distributed estimation

    NASA Astrophysics Data System (ADS)

    Kim, Yoon Hak

    2016-12-01

    We consider the iterative design of independently operating local quantizers at nodes that must cooperate, without interaction, to achieve application objectives in distributed estimation systems. As a new cost function we suggest a probabilistic distance between the posterior distribution and its quantized version, expressed as the Kullback-Leibler (KL) divergence. We first present an analysis showing that minimizing the KL divergence in the cyclic generalized Lloyd design framework is equivalent to maximizing the average logarithm of the quantized posterior distribution, which can be further simplified computationally in our iterative design. We propose an iterative design algorithm that seeks to maximize this simplified version of the quantized posterior, show that the algorithm converges to a global optimum due to the convexity of the cost function, and show that it generates the most informative quantized measurements. We also provide an independent encoding technique that enables minimization of the cost function and can be efficiently simplified for practical use in power-constrained nodes. We finally demonstrate, through extensive experiments, a clear improvement in estimation performance as compared with typical designs and novel design techniques previously published.
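
    The cost function itself can be demonstrated on a toy problem: for a scalar Gaussian parameter and a one-bit quantizer, choose the threshold whose quantized posterior stays closest, on average, to the unquantized posterior in KL divergence. The grid search below is a simplified stand-in for the paper's iterative generalized Lloyd design, with all distributions assumed.

      import numpy as np
      from scipy.stats import norm

      theta = np.linspace(-5.0, 5.0, 801)   # parameter grid, theta ~ N(0, 1)
      dth = theta[1] - theta[0]
      prior = norm.pdf(theta)
      sigma = 0.8                           # measurement noise std, y = theta + noise
      ys = np.random.default_rng(6).normal(0.0, np.sqrt(1.0 + sigma**2), 500)

      def normalize(w):
          return w / (w.sum() * dth)

      def kl(p, q):
          eps = 1e-300
          return np.sum(p * np.log((p + eps) / (q + eps))) * dth

      def avg_kl(t):
          """Average KL between the posterior given y and the posterior given the bit y > t."""
          total = 0.0
          for y in ys:
              p = normalize(prior * norm.pdf(y, loc=theta, scale=sigma))
              lik = norm.sf(t, loc=theta, scale=sigma) if y > t else norm.cdf(t, loc=theta, scale=sigma)
              total += kl(p, normalize(prior * lik))
          return total / len(ys)

      thresholds = np.linspace(-1.5, 1.5, 13)
      best = thresholds[int(np.argmin([avg_kl(t) for t in thresholds]))]
      print(f"best one-bit threshold ~ {best:+.2f} (symmetric problem, so near 0)")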

  8. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  9. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It displays the probabilistic estimate of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels using SEISRISK III software.

  10. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometric dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.

  11. Safety and integrity of pipeline systems - philosophy and experience in Germany

    DOT National Transportation Integrated Search

    1997-01-01

    The design, construction and operation of gas pipeline systems in Germany are subject to the Energy Act and associated regulations. This legal structure is based on a deterministic rather than a probabilistic safety philosophy, consisting of technica...

  12. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach, which takes into account the pre-existing flaw distribution in the ceramic part to compute a probability of failure under the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself and accurate control of the manufacturing process. In the end, risk-reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, with both Zerodur and SiC: the Silex telescope structure, the Seviri primary mirror, the Herschel telescope, the Formosat-2 instrument, and other ceramic structures flying today. Throughout this period, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study "Mechanical Design and Verification Methodologies for Ceramic Structures", due to be concluded at the beginning of 2012, existing theories, the technical state of the art from international experts, and Astrium's experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
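
    The proof-testing argument can be made quantitative under a Weibull strength model: a part that survives a proof stress has a truncated flaw population, so its conditional failure probability at service stress drops, and it is zero in a time-independent model whenever the service stress stays below the proof stress. A sketch with illustrative parameters:

      import numpy as np

      m, sigma_0 = 10.0, 300.0    # Weibull modulus and scale, MPa (assumed)

      def p_fail(s):
          return 1.0 - np.exp(-(s / sigma_0) ** m)

      def p_fail_after_proof(s, s_p):
          """P(fail at s | survived proof at s_p); zero for s <= s_p when
          strength is time-independent."""
          if s <= s_p:
              return 0.0
          return 1.0 - np.exp((s_p / sigma_0) ** m - (s / sigma_0) ** m)

      print(f"Pf(220 MPa), no proof test          : {p_fail(220.0):.2e}")
      print(f"Pf(220 MPa | survived 260 MPa proof): {p_fail_after_proof(220.0, 260.0):.2e}")
      print(f"Pf(280 MPa | survived 260 MPa proof): {p_fail_after_proof(280.0, 260.0):.2e}")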

  13. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.

  14. Probabilistic DHP adaptive critic for nonlinear stochastic control systems.

    PubMed

    Herzallah, Randa

    2013-06-01

    Following the recently developed algorithms for fully probabilistic control design for general dynamic stochastic systems (Herzallah & Kárný, 2011; Kárný, 1996), this paper presents the solution to the probabilistic dual heuristic programming (DHP) adaptive critic method (Herzallah & Kárný, 2011) and a randomized control algorithm for stochastic nonlinear dynamical systems. The purpose of the randomized control input design is to make the joint probability density function of the closed-loop system as close as possible to a predetermined ideal joint probability density function. This paper completes the previous work (Herzallah & Kárný, 2011; Kárný, 1996) by formulating and solving the fully probabilistic control design problem for the more general case of nonlinear stochastic discrete-time systems. A simulated example is used to demonstrate the use of the algorithm, and encouraging results have been obtained. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. ASSESSING THE ECOLOGICAL CONDITION OF A COASTAL PLAIN WATERSHED USING A PROBABILISTIC SURVEY DESIGN

    EPA Science Inventory

    Using a probabilistic survey design, we assessed the ecological condition of the Florida (USA) portion of the Escambia River watershed using selected environmental and benthic macroinvertebrate data. Macroinvertebrates were sampled at 28 sites during July-August 1996, and 3414 i...

  16. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    NASA Technical Reports Server (NTRS)

    Abdi, Frank

    1996-01-01

    A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, GENOA, is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in this development were: (1) utilization of the power of parallel processing and static/dynamic load-balancing optimization to make the complex simulation of structure, material, and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics, and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of a framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed-workstation types of computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.

  17. Reliability assessment of slender concrete columns at the stability failure

    NASA Astrophysics Data System (ADS)

    Valašík, Adrián; Benko, Vladimír; Strauss, Alfred; Täubling, Benjamin

    2018-01-01

    The European Standard for designing concrete columns with non-linear methods shows deficiencies in terms of global reliability in cases where the columns fail by loss of stability. Buckling is a brittle failure mode that occurs without warning, and the probability of its occurrence depends on the column's slenderness. Experiments with slender concrete columns were carried out in cooperation with STRABAG Bratislava LTD in the Central Laboratory of the Faculty of Civil Engineering, SUT in Bratislava. The following article compares the global reliability of slender concrete columns with slenderness ratios of 90 and higher, designed according to the methods offered by EN 1992-1-1 [1]. The experiments served as the basis for deterministic nonlinear modelling of the columns and the subsequent probabilistic evaluation of the variability of the structural response. The final results may be utilized as load thresholds for the structural elements produced, and they present probabilistic design as less conservative than classic partial-safety-factor-based design and the alternative ECOV method.

  18. The Role of Probabilistic Design Analysis Methods in Safety and Affordability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2016-01-01

    For the last several years, NASA and its contractors have been working together to build space launch systems to commercialize space. Developing commercial, affordable, and safe launch systems becomes very important and requires a paradigm shift. This paradigm shift enforces the need for an integrated systems engineering environment in which cost, safety, reliability, and performance are considered together to optimize the launch system design. In such an environment, rule-based and deterministic engineering design practices alone may not be sufficient to optimize margins and fault tolerance to reduce cost. As a result, introducing Probabilistic Design Analysis (PDA) methods to supplement the current deterministic engineering design practices becomes a necessity to reduce cost without compromising reliability and safety. This paper discusses the importance of PDA methods in NASA's new commercial environment, their applications, and the key role they can play in designing reliable, safe, and affordable launch systems. More specifically, this paper discusses (1) the involvement of NASA in PDA, (2) why PDA is needed, (3) a PDA model structure, (4) a PDA example application, and (5) PDA's link to safety and affordability.

  19. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an ongoing effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e., they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework, which in turn compromises the reproducibility of model-based empirical studies. This work presents a software toolbox that provides generic, efficient, and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  20. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, the fabrication process, and the loadings on the material properties and the structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluations of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to the detailed probabilistic analysis of the two structures, the following were performed specifically for the CMC tube: (1) the failure load and the buckling load were predicted, (2) a coupled non-deterministic multi-disciplinary structural analysis was performed, and (3) it was demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  1. Reliability-based econometrics of aerospace structural systems: Design criteria and test options. Ph.D. Thesis - Georgia Inst. of Tech.

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hanagud, S.

    1974-01-01

    The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.
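
    The Bayesian updating step described here is, in its simplest conjugate form, a one-line computation. A minimal sketch, assuming a normal prior on mean strength, known test scatter, and invented test data:

    ```python
    # Conjugate normal update of a subjective strength distribution with
    # test results. All numbers are illustrative, not from the thesis.
    import numpy as np

    mu0, tau0 = 100.0, 15.0                  # prior mean and std of mean strength (assumed)
    sigma = 8.0                              # known test-to-test scatter (assumed)
    tests = np.array([112.0, 105.0, 109.0])  # hypothetical structural test results

    n = len(tests)
    post_var  = 1.0 / (1.0 / tau0**2 + n / sigma**2)   # posterior variance of the mean
    post_mean = post_var * (mu0 / tau0**2 + tests.sum() / sigma**2)
    print(f"posterior mean strength: {post_mean:.1f} +/- {post_var**0.5:.1f}")
    ```

    Each new test tightens the posterior, which is what makes the test-versus-design-factor trade in the thesis a well-posed decision problem.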

  2. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses reliability engineering applications in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbopump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  3. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
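
    One standard construction behind such probabilistic non-unitary computation is to embed a contraction M in a larger unitary and post-select on an ancilla outcome; the paper's method is more general, but a small numerical sketch of the dilation idea (with an invented 2x2 operator) looks like this:

    ```python
    # Sketch of a unitary dilation: scale the target non-unitary operator M
    # to a strict contraction, embed it in a 2d x 2d unitary, apply to
    # |psi>|0>, and post-select the ancilla on |0>. The success probability
    # is ||M|psi>||^2. The operator below is an invented example.
    import numpy as np
    from scipy.linalg import sqrtm

    M = np.array([[1.0, 0.3], [0.0, 0.5]])              # desired non-unitary map (example)
    M = M / (1.01 * np.linalg.svd(M, compute_uv=False)[0])  # contract: ||M|| < 1

    d = M.shape[0]
    A = sqrtm(np.eye(d) - M.conj().T @ M)               # defect operators
    B = sqrtm(np.eye(d) - M @ M.conj().T)
    U = np.block([[M, B], [A, -M.conj().T]])            # unitary dilation
    assert np.allclose(U.conj().T @ U, np.eye(2 * d), atol=1e-8)

    psi = np.array([1.0, 1.0]) / np.sqrt(2)             # input state
    out = U @ np.concatenate([psi, np.zeros(d)])        # ancilla starts in |0>
    p_success = np.linalg.norm(out[:d]) ** 2            # prob. ancilla measured as |0>
    result = out[:d] / np.linalg.norm(out[:d])          # post-selected state ~ M|psi>
    print("success probability:", p_success)
    print("post-selected state:", result)
    ```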

  4. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This research is a developmental study of probabilistic thinking-oriented learning tools for probability material at the ninth-grade level, aimed at producing a good set of such tools. The subjects were the IX-A students of MTs Model Bangkalan. The study used the 4-D development model, modified here to three stages: define, design, and develop. The teaching and learning tools consist of a lesson plan, a students' worksheet, instructional media, and a students' achievement test. The research instruments were a learning tools validation sheet, a teachers' activities sheet, a students' activities sheet, a students' response questionnaire, and the students' achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of valid, probabilistic thinking-oriented teaching and learning tools for ninth-grade probability: the tools were revised based on validation, and the classroom experiment showed that the teacher's classroom management was effective, students' activities were good, students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity, and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.

  5. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Xapsos, Michael

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to describe the radiation environment that can be expected at a specified confidence level. The task of the designer is then to choose a design that will operate in the model radiation environment. Probabilistic models have already been developed for solar proton events that describe the peak flux, event-integrated fluence and mission-integrated fluence. In addition, a probabilistic model has been developed that describes the mission-integrated fluence for the Z>2 elemental spectra. This talk will focus on completing this suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements.

  6. On the Accuracy of Probabilistic Buckling Load Prediction

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann; Starnes, James H.; Nemeth, Michael P.

    2001-01-01

    The buckling strength of thin-walled stiffened or unstiffened, metallic or composite shells is of major concern in aeronautical and space applications. The difficulty of predicting the behavior of axially compressed thin-walled cylindrical shells continues to worry design engineers as we enter the third millennium. Thanks to extensive research programs in the late sixties and early seventies and the contributions of many eminent scientists, it is known that buckling strength calculations are affected by the uncertainties in the definition of the parameters of the problem, such as the definition of loads, material properties, geometric variables, and edge support conditions, and by the accuracy of the engineering models and analysis tools used in the design phase. The NASA design criteria monographs from the late sixties account for these design uncertainties by the use of a lump-sum safety factor. This so-called 'empirical knockdown factor gamma' usually results in overly conservative design. Recently, new reliability-based probabilistic design procedures for buckling-critical imperfect shells have been proposed. They essentially consist of a stochastic approach which introduces an improved 'scientific knockdown factor lambda(sub a)' that is not as conservative as the traditional empirical one. In order to incorporate probabilistic methods into a High Fidelity Analysis Approach, one must be able to assess the accuracy of the various steps that must be executed to complete a reliability calculation. In the present paper, the effect of the size of the experimental input sample on the predicted value of the scientific knockdown factor lambda(sub a) calculated by the First-Order, Second-Moment Method is investigated.
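
    The First-Order, Second-Moment calculation mentioned above can be sketched in a few lines: linearize the limit state g = Pcr(X) - Papplied at the mean point and form the reliability index beta = mu_g / sigma_g. The buckling model and all statistics below are illustrative stand-ins, not the paper's shell data:

    ```python
    # Mean-value First-Order Second-Moment (FOSM) sketch for a buckling
    # limit state. The imperfection-sensitive buckling load model and all
    # statistics are invented for illustration.
    import numpy as np

    mu  = np.array([1.00, 0.10])   # means: shell quality factor, imperfection amplitude (assumed)
    sig = np.array([0.05, 0.03])   # standard deviations (assumed)
    P_classical, P_applied = 100.0, 55.0   # kN, illustrative

    def Pcr(x):
        q, xi = x
        return P_classical * q * (1.0 - 1.8 * xi)   # toy knockdown vs. imperfection

    # Finite-difference gradient of g = Pcr(X) - P_applied at the mean point.
    g_mu = Pcr(mu) - P_applied
    grad = np.zeros(2)
    for i in range(2):
        dx = np.zeros(2); dx[i] = 1e-6
        grad[i] = (Pcr(mu + dx) - Pcr(mu - dx)) / 2e-6

    sigma_g = np.sqrt(np.sum((grad * sig) ** 2))    # first-order std of g
    beta = g_mu / sigma_g                           # reliability index
    print(f"beta = {beta:.2f}")                     # Pf ~ Phi(-beta)
    ```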

  7. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  8. An analytical probabilistic model of the quality efficiency of a sewer tank

    NASA Astrophysics Data System (ADS)

    Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare

    2009-12-01

    The assessment of the efficiency of a storm water storage facility devoted to sewer overflow control in urban areas strictly depends on the ability to model the main features of the rainfall-runoff routing process and the related wet weather pollution delivery. In this paper, the applicability of the analytical probabilistic approach to developing a tank design method whose capabilities are similar to those of continuous simulation is demonstrated. Water quality issues of such devices are incorporated in the model derivation. The formulation is based on a Weibull probabilistic model of the main characteristics of the rainfall process and on a power law describing the relationship between the dimensionless storm water cumulative runoff volume and the dimensionless cumulative pollutograph. Following this approach, efficiency indexes are established. The proposed model was verified by comparing its results to those obtained by continuous simulations; satisfactory agreement is shown for the proposed efficiency indexes.

  9. Estimation of distribution algorithm with path relinking for the blocking flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Shao, Zhongshi; Pi, Dechang; Shao, Weishi

    2018-05-01

    This article presents an effective estimation of distribution algorithm, named P-EDA, to solve the blocking flow-shop scheduling problem (BFSP) with the makespan criterion. In the P-EDA, a Nawaz-Enscore-Ham (NEH)-based heuristic and the random method are combined to generate the initial population. Based on several superior individuals provided by a modified linear rank selection, a probabilistic model is constructed to describe the probabilistic distribution of the promising solution space. The path relinking technique is incorporated into EDA to avoid blindness of the search and improve the convergence property. A modified referenced local search is designed to enhance the local exploitation. Moreover, a diversity-maintaining scheme is introduced into EDA to avoid deterioration of the population. Finally, the parameters of the proposed P-EDA are calibrated using a design of experiments approach. Simulation results and comparisons with some well-performing algorithms demonstrate the effectiveness of the P-EDA for solving BFSP.
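
    The core EDA loop (learn a probabilistic model from superior individuals, then sample it) can be sketched compactly. The toy below uses a position-based probability matrix and a plain flow-shop makespan as a stand-in objective; the NEH seeding, path relinking, referenced local search, and diversity scheme of P-EDA are omitted:

    ```python
    # Minimal estimation-of-distribution sketch for a permutation problem:
    # build a position x job probability matrix from elite individuals and
    # sample offspring from it. Problem data are random toy values.
    import numpy as np

    rng = np.random.default_rng(1)
    n_jobs, pop_size, elite, iters = 8, 60, 15, 50
    proc = rng.uniform(1, 10, (n_jobs, 4))        # job x machine processing times (toy)

    def makespan(perm):                           # plain flow-shop proxy objective
        c = np.zeros(proc.shape[1])
        for j in perm:
            t = 0.0
            for m in range(proc.shape[1]):
                t = max(t, c[m]) + proc[j, m]
                c[m] = t
        return c[-1]

    def sample(P):                                # draw a permutation from the model
        perm, avail = [], list(range(n_jobs))
        for pos in range(n_jobs):
            w = P[pos, avail]; w = w / w.sum()
            j = rng.choice(avail, p=w)
            perm.append(j); avail.remove(j)
        return perm

    pop = [list(rng.permutation(n_jobs)) for _ in range(pop_size)]
    for _ in range(iters):
        pop.sort(key=makespan)
        P = np.full((n_jobs, n_jobs), 1e-3)       # smoothing avoids zero probabilities
        for perm in pop[:elite]:                  # model from superior individuals
            for pos, j in enumerate(perm):
                P[pos, j] += 1.0
        pop = pop[:elite] + [sample(P) for _ in range(pop_size - elite)]
    print("best makespan:", makespan(min(pop, key=makespan)))
    ```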

  10. A probabilistic multi-criteria decision making technique for conceptual and preliminary aerospace systems design

    NASA Astrophysics Data System (ADS)

    Bandte, Oliver

    It has always been the intention of systems engineering to invent or produce the best product possible. Many design techniques have been introduced over the course of decades that try to fulfill this intention. Unfortunately, no technique has succeeded in combining multi-criteria decision making with probabilistic design. The design technique developed in this thesis, the Joint Probabilistic Decision Making (JPDM) technique, successfully overcomes this deficiency by generating a multivariate probability distribution that serves in conjunction with a criterion value range of interest as a universally applicable objective function for multi-criteria optimization and product selection. This new objective function constitutes a meaningful metric, called Probability of Success (POS), that allows the customer or designer to make a decision based on the chance of satisfying the customer's goals. In order to incorporate a joint probabilistic formulation into the systems design process, two algorithms are created that allow for an easy implementation into a numerical design framework: the (multivariate) Empirical Distribution Function and the Joint Probability Model. The Empirical Distribution Function estimates the probability that an event occurred by counting how many times it occurred in a given sample. The Joint Probability Model, on the other hand, is an analytical parametric model for the multivariate joint probability. It is comprised of the product of the univariate criterion distributions, generated by the traditional probabilistic design process, multiplied with a correlation function that is based on available correlation information between pairs of random variables. JPDM is an excellent tool for multi-objective optimization and product selection because of its ability to transform disparate objectives into a single figure of merit, the likelihood of successfully meeting all goals, or POS. The advantage of JPDM over other multi-criteria decision making techniques is that POS constitutes a single optimizable function or metric that enables a comparison of all alternative solutions on an equal basis. Hence, POS allows for the use of any standard single-objective optimization technique available and simplifies a complex multi-criteria selection problem into a simple ordering problem, where the solution with the highest POS is best. By distinguishing between controllable and uncontrollable variables in the design process, JPDM can account for the uncertain values of the uncontrollable variables that are inherent to the design problem, while facilitating an easy adjustment of the controllable ones to achieve the highest possible POS. Finally, JPDM's superiority over current multi-criteria decision making techniques is demonstrated with an optimization of a supersonic transport concept and ten contrived equations as well as a product selection example, determining an airline's best choice among Boeing's B-747, B-777, Airbus' A340, and a Supersonic Transport. The optimization examples demonstrate JPDM's ability to produce a better solution with a higher POS than an Overall Evaluation Criterion or Goal Programming approach. Similarly, the product selection example demonstrates JPDM's ability to produce a better solution with a higher POS and different ranking than the Overall Evaluation Criterion or Technique for Order Preferences by Similarity to the Ideal Solution (TOPSIS) approach.
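
    A back-of-the-envelope illustration of POS: sample a joint (correlated) distribution over two criteria and count how often both land in their target ranges. The criteria, statistics, correlation, and goals below are all invented for the example:

    ```python
    # Monte Carlo estimate of a Probability of Success: the chance that all
    # criteria simultaneously meet their goals under a correlated joint
    # distribution. All numbers are fabricated for illustration.
    import numpy as np

    rng = np.random.default_rng(2)
    mean = [62_000.0, 210.0]           # gross weight [lb], seat cost [$/trip] (toy)
    cov  = [[3.0e6, 1.2e3],            # positive correlation ties the criteria
            [1.2e3, 2.5e1]]            # together; independence would misstate POS
    x = rng.multivariate_normal(mean, cov, size=200_000)

    ok_weight = x[:, 0] <= 65_000.0    # customer goal on criterion 1
    ok_cost   = x[:, 1] <= 215.0       # customer goal on criterion 2
    pos = np.mean(ok_weight & ok_cost) # joint probability of success
    print(f"POS = {pos:.3f}  (marginals: {ok_weight.mean():.3f}, {ok_cost.mean():.3f})")
    ```

    The joint probability is the optimizable figure of merit; comparing it with the product of the marginals shows why the correlation function in the Joint Probability Model matters.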

  11. Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device

    PubMed Central

    He, Xiang; Aloi, Daniel N.; Li, Jia

    2015-01-01

    Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design. PMID:26694387

  12. Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device.

    PubMed

    He, Xiang; Aloi, Daniel N; Li, Jia

    2015-12-14

    Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design.
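
    The online tracking phase described above rests on a particle filter. A toy sketch with one known access point, an invented motion model, and an RSSI-like range likelihood (all geometry and noise levels assumed):

    ```python
    # Toy particle filter: propagate particles with a noisy motion model,
    # weight them by a Gaussian range likelihood to a known access point,
    # and resample when the effective sample size degenerates.
    import numpy as np

    rng = np.random.default_rng(3)
    n_particles, steps = 2000, 30
    ap = np.array([10.0, 5.0])                     # access point position (assumed)
    true_pos = np.array([0.0, 0.0])
    particles = rng.normal(0.0, 2.0, (n_particles, 2))
    weights = np.ones(n_particles) / n_particles

    for _ in range(steps):
        true_pos = true_pos + np.array([0.4, 0.2])               # user walks NE
        particles += np.array([0.4, 0.2]) + rng.normal(0, 0.15, particles.shape)
        z = np.linalg.norm(true_pos - ap) + rng.normal(0, 0.5)   # noisy range (RSSI-like)
        d = np.linalg.norm(particles - ap, axis=1)
        weights *= np.exp(-0.5 * ((z - d) / 0.5) ** 2)           # Gaussian likelihood
        weights /= weights.sum()
        if 1.0 / np.sum(weights**2) < n_particles / 2:           # resample if degenerate
            idx = rng.choice(n_particles, n_particles, p=weights)
            particles, weights = particles[idx], np.full(n_particles, 1 / n_particles)

    print("estimate:", particles.T @ weights, " truth:", true_pos)
    ```

    A real system would fuse several access points and motion sensors, which resolves the range-ring ambiguity a single beacon leaves behind.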

  13. Probabilistic Neighborhood-Based Data Collection Algorithms for 3D Underwater Acoustic Sensor Networks

    PubMed Central

    Han, Guangjie; Li, Shanshan; Zhu, Chunsheng; Jiang, Jinfang; Zhang, Wenbo

    2017-01-01

    Marine environmental monitoring provides crucial information and support for the exploitation, utilization, and protection of marine resources. With the rapid development of information technology, three-dimensional underwater acoustic sensor networks (3D UASNs) provide a novel strategy to acquire marine environment information conveniently, efficiently, and accurately. However, the propagation characteristics of the acoustic communication channel cause the probability of successful information delivery to decrease with increasing distance. Therefore, we investigate two probabilistic neighborhood-based data collection algorithms for 3D UASNs which are based on a probabilistic acoustic communication model instead of the traditional deterministic acoustic communication model. An autonomous underwater vehicle (AUV) is employed to traverse along the designed path to collect data from neighborhoods. For 3D UASNs without prior deployment knowledge, partitioning the network into grids allows the AUV to visit the central location of each grid for data collection. For 3D UASNs in which the deployment knowledge is known in advance, the AUV only needs to visit several selected locations, determined by constructing a minimum probabilistic neighborhood covering set, which reduces data latency. In addition, by increasing the number of transmission rounds, our proposed algorithms can trade off data collection latency against information gain. These algorithms are compared with a basic Nearest-neighbor Heuristic algorithm via simulations. Simulation analyses show that our proposed algorithms efficiently reduce the average data collection completion time and, correspondingly, the data latency. PMID:28208735

  14. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  15. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  16. Design of Composite Structures for Reliability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1999-01-01

    A summary of research conducted during the first year is presented. The research objectives were sought by conducting two tasks: (1) investigation of probabilistic design techniques for reliability-based design of composite sandwich panels, and (2) examination of strain energy density failure criterion in conjunction with response surface methodology for global-local design of damage tolerant helicopter fuselage structures. This report primarily discusses the efforts surrounding the first task and provides a discussion of some preliminary work involving the second task.

  17. Life Predicted in a Probabilistic Design Space for Brittle Materials With Transient Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Palfi, Tamas; Reh, Stefan

    2005-01-01

    Analytical techniques have progressively become more sophisticated, and now we can consider the effect of the probabilistic nature of the entire space of random input variables on the lifetime reliability of brittle structures. This was demonstrated with NASA's CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code combined with the commercially available ANSYS/Probabilistic Design System (ANSYS/PDS), a probabilistic analysis tool that is an integral part of the ANSYS finite-element analysis program. ANSYS/PDS allows probabilistic loads, component geometry, and material properties to be considered in the finite-element analysis. CARES/Life predicts the time-dependent probability of failure of brittle material structures under generalized thermomechanical loading--such as that found in a turbine engine hot-section. Glenn researchers coupled ANSYS/PDS with CARES/Life to assess the effects of the stochastic variables of component geometry, loading, and material properties on the predicted life of the component for fully transient thermomechanical loading and cyclic loading.

  18. Award-Winning CARES/Life Ceramics Durability Evaluation Software Is Making Advanced Technology Accessible

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software developed at the NASA Lewis Research Center eases this task by providing a tool that uses probabilistic reliability analysis techniques to optimize the design and manufacture of brittle material components. CARES/Life is an integrated package that predicts the probability of a monolithic ceramic component's failure as a function of its time in service. It couples commercial finite element programs--which resolve a component's temperature and stress distribution--with reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. These routines are based on calculations of the probabilistic nature of the brittle material's strength.
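
    The strength-limiting-defect routines referenced above rest on Weibull statistics. A minimal two-parameter Weibull sketch of a component's failure probability at a given stress, with size (volume) scaling; the modulus, characteristic strength, and volume are illustrative, not CARES/Life outputs:

    ```python
    # Two-parameter Weibull failure probability with volume scaling:
    # Pf(sigma) = 1 - exp(-V * (sigma / sigma_0)^m). Values are assumed.
    import numpy as np

    m       = 10.0     # Weibull modulus (strength scatter, assumed)
    sigma_0 = 400.0    # characteristic strength [MPa] of unit volume (assumed)
    V       = 2.5      # stressed volume relative to the reference volume

    def p_fail(sigma):
        return 1.0 - np.exp(-V * (sigma / sigma_0) ** m)

    for s in (200.0, 300.0, 350.0):
        print(f"sigma = {s:5.1f} MPa -> Pf = {p_fail(s):.4f}")
    ```

    The steepness set by the modulus m is exactly why deterministic safety factors serve brittle materials poorly: a modest stress change swings the failure probability by orders of magnitude.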

  19. Development of Probabilistic Rigid Pavement Design Methodologies for Military Airfields.

    DTIC Science & Technology

    1983-12-01

    Report prepared under Project 4A161102AT22, Task AO, Work Unit 009, "Methodology for Considering Material Variability in Pavement Design." OCE Project Monitor was Mr. S. S. Gillespie. Contents: Volume I: State of the Art: Variability of Airfield Pavement Materials; Volume II: Mathematical Formulation of ...; Volume IV: Probabilistic Analysis of Rigid Airfield Design by Elastic Layered Theory.

  20. RANGE AND DENSITY OF ALIEN FISH IN WESTERN STREAMS AND RIVERS, US

    EPA Science Inventory

    Alien fish have become increasingly prevalent in Western U.S. waters. The EPA Environmental Monitoring and Assessment Program's Western Pilot (12 western states), which is based upon a probabilistic design, provides an opportunity to make inferences about the range and density of...

  1. Probabilistic safety assessment of the design of a tall building under extreme load

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk

    2016-06-08

    The paper describes experience from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. The uncertainties of the model and of the resistance of the structure are considered using simulation methods. The MONTE CARLO, LHS, and RSM probabilistic methods are compared with the deterministic results. The effectiveness of the probabilistic design of structures using finite element methods is demonstrated on an example probability analysis of the safety of a tall building.

  2. Probabilistic safety assessment of the design of a tall building under extreme load

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2016-06-01

    The paper describes experience from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. The uncertainties of the model and of the resistance of the structure are considered using simulation methods. The MONTE CARLO, LHS, and RSM probabilistic methods are compared with the deterministic results. The effectiveness of the probabilistic design of structures using finite element methods is demonstrated on an example probability analysis of the safety of a tall building.

  3. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    Interim Staff Guidance DC/COL-ISG-020, "Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment" (Agencywide Documents...).

  4. Fifth Annual Workshop on the Application of Probabilistic Methods for Gas Turbine Engines

    NASA Technical Reports Server (NTRS)

    Briscoe, Victoria (Compiler)

    2002-01-01

    These are the proceedings of the 5th Annual FAA/Air Force/NASA/Navy Workshop on Probabilistic Methods for Gas Turbine Engines, hosted by NASA Glenn Research Center and held at the Holiday Inn Cleveland West. The history of this series of workshops stems from the recognition that both military and commercial aircraft engines are subject to similar design and manufacturing principles. As such, it was eminently logical to combine knowledge bases on how some of these overlapping principles and methodologies are being applied. We have started that process by creating synergy and cooperation among the FAA, Air Force, Navy, and NASA in these workshops. The recent 3-day workshop was specifically designed to benefit the development of probabilistic methods for gas turbine engines by addressing recent technical accomplishments and forging new ideas. We accomplished our goals of minimizing duplication, maximizing the dissemination of information, and improving program planning for all concerned. These proceedings include the final agenda, abstracts, presentations, and panel notes, plus contact information for our presenters and attendees. We hope that these proceedings will be a tool to enhance understanding between the developers and users of probabilistic methods. The fifth workshop doubled its attendance and benefited from collaboration among the many diverse groups represented, including government, industry, academia, and our international partners. So, "Start your engines!" and use these proceedings to help create safer and more reliable gas turbine engines for our commercial and military partners.

  5. Constructing probabilistic scenarios for wide-area solar power generation

    DOE PAGES

    Woodruff, David L.; Deride, Julio; Staid, Andrea; ...

    2017-12-22

    Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.

  6. Constructing probabilistic scenarios for wide-area solar power generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodruff, David L.; Deride, Julio; Staid, Andrea

    Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.

  7. A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography

    DTIC Science & Technology

    2010-04-01

    A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in this work for multiple-subject diffusion MRI tractography.

  8. Modeling marine oily wastewater treatment by a probabilistic agent-based approach.

    PubMed

    Jing, Liang; Chen, Bing; Zhang, Baiyu; Ye, Xudong

    2018-02-01

    This study developed a novel probabilistic agent-based approach for modeling of marine oily wastewater treatment processes. It begins first by constructing a probability-based agent simulation model, followed by a global sensitivity analysis and a genetic algorithm-based calibration. The proposed modeling approach was tested through a case study of the removal of naphthalene from marine oily wastewater using UV irradiation. The removal of naphthalene was described by an agent-based simulation model using 8 types of agents and 11 reactions. Each reaction was governed by a probability parameter to determine its occurrence. The modeling results showed that the root mean square errors between modeled and observed removal rates were 8.73 and 11.03% for calibration and validation runs, respectively. Reaction competition was analyzed by comparing agent-based reaction probabilities, while agents' heterogeneity was visualized by plotting their real-time spatial distribution, showing a strong potential for reactor design and process optimization.
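
    The probability-governed reaction step is the heart of the approach: each candidate reaction fires only when a Bernoulli draw against its probability parameter succeeds. A minimal sketch with invented agents, a single invented reaction, and an assumed probability (the paper's UV/naphthalene chemistry uses 8 agent types and 11 reactions):

    ```python
    # Probability-governed agent reactions: every encounter between two
    # reactant agents fires with probability p. Agents, the reaction, and
    # p are invented for illustration.
    import random

    random.seed(4)
    agents = {"naphthalene": 500, "OH_radical": 300, "byproduct": 0}
    reactions = [
        # (reactant A, reactant B, product, occurrence probability per encounter)
        ("naphthalene", "OH_radical", "byproduct", 0.08),
    ]

    for step in range(100):
        for a, b, prod, p in reactions:
            encounters = min(agents[a], agents[b])
            fired = sum(1 for _ in range(encounters) if random.random() < p)
            agents[a] -= fired; agents[b] -= fired; agents[prod] += fired
    print(agents)
    ```

    Calibration in the paper amounts to tuning such probability parameters (with a genetic algorithm) until modeled removal rates match observations.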

  9. A WHOLE-LAKE WATER QUALITY SURVEY OF LAKE OAHE BASED ON A SPATIALLY-BALANCED PROBABILISTIC DESIGN

    EPA Science Inventory

    Assessing conditions on large bodies of water presents multiple statistical and logistical challenges. As part of the Upper Missouri River Program of the Environmental Monitoring and Assessment Project (EMAP) we surveyed water quality of Lake Oahe in July-August, 2002 using a spat...

  10. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
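
    The ensemble Kalman filter update mentioned at the end can be written compactly. A hedged sketch with a linear observation operator and invented numbers; real assimilation problems add localization, inflation, and nonlinear forecast models:

    ```python
    # Stochastic ensemble Kalman filter analysis step: correct a prior
    # ensemble toward a noisy observation via the sample-covariance gain.
    import numpy as np

    rng = np.random.default_rng(5)
    n_ens, n_state = 50, 3
    X = rng.normal(1.0, 0.5, (n_state, n_ens))   # prior ensemble (columns = members)
    H = np.array([[1.0, 0.0, 0.0]])              # observe first state component
    R = np.array([[0.04]])                       # observation error variance
    y = np.array([1.4])                          # the observation

    Xm = X.mean(axis=1, keepdims=True)
    A  = X - Xm                                  # ensemble anomalies
    Pf = A @ A.T / (n_ens - 1)                   # sample forecast covariance
    K  = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain

    # Perturbed observations keep the analysis ensemble spread consistent.
    Y_pert = y[:, None] + rng.normal(0, np.sqrt(R[0, 0]), (1, n_ens))
    X_post = X + K @ (Y_pert - H @ X)            # analysis ensemble
    print("prior mean:", Xm.ravel(), "posterior mean:", X_post.mean(axis=1))
    ```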

  11. Develop Probabilistic Tsunami Design Maps for ASCE 7

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.

    2014-12-01

    A national standard for engineering design for tsunami effects has not existed before, and this significant risk is mostly ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard. Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure, and will apply to designs as part of tsunami preparedness. The provisions will also have significance as a post-tsunami recovery tool, for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami model forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models. This ensures that the probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach consisting of a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights. The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map for the study site. ASCE's Energy Grade Line Analysis then uses these modeling constraints to derive hydrodynamic forces for structures within the tsunami design zone. The probabilistic tsunami design maps will be validated by comparison to state inundation maps under the coordination of the National Tsunami Hazard Mitigation Program.

  12. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  13. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling

    PubMed Central

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323
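
    The "(inverted) S-shaped probability weighting" discussed above is often modeled with a Prelec-type function w(p) = exp(-(-ln p)^gamma); whether this is the exact family used in the paper is not stated here, so treat it as an illustration. With gamma < 1, small probabilities are overweighted and large ones underweighted:

    ```python
    # Prelec probability weighting function, one common S-shaped family.
    # Parameter values are illustrative only.
    import numpy as np

    def prelec(p, gamma):
        return np.exp(-(-np.log(p)) ** gamma)

    for p in (0.05, 0.25, 0.50, 0.75, 0.95):
        print(f"p = {p:.2f}  w(p), gamma=0.5: {prelec(p, 0.5):.3f}"
              f"  gamma=1.0 (no distortion): {prelec(p, 1.0):.3f}")
    ```

    Hierarchical Bayesian estimation, as in the paper, would place a weakly informative prior over each individual's gamma rather than fitting one value per condition.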

  14. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    Treesearch

    Thomas C. Edwards; D. Richard Cutler; Niklaus E. Zimmermann; Linda Geiser; Gretchen G. Moisen

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by...

  15. Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels

    NASA Astrophysics Data System (ADS)

    Kurniasih, R.; Sujadi, I.

    2017-09-01

    This paper investigates students' levels of probabilistic thinking, that is, thinking about uncertainty in probability material. The research subjects were 8th-grade junior high school students. The main instrument was the researcher, with a probabilistic thinking skills test and interview guidelines as supporting instruments. Data were analyzed using the triangulation method. The results showed that before instruction the students' probabilistic thinking was at the subjective and transitional levels; after instruction, their probabilistic thinking levels changed, and some 8th-grade students reached the highest, numerical level. Students' probabilistic thinking levels can be used as a reference for designing learning materials and strategies.

  16. A Measure Approximation for Distributionally Robust PDE-Constrained Optimization Problems

    DOE PAGES

    Kouri, Drew Philip

    2017-12-19

    In numerous applications, scientists and engineers acquire varied forms of data that partially characterize the inputs to an underlying physical system. This data is then used to inform decisions such as controls and designs. Consequently, it is critical that the resulting control or design is robust to the inherent uncertainties associated with the unknown probabilistic characterization of the model inputs. In this work, we consider optimal control and design problems constrained by partial differential equations with uncertain inputs. We do not assume a known probabilistic model for the inputs, but rather we formulate the problem as a distributionally robust optimization problem where the outer minimization problem determines the control or design, while the inner maximization problem determines the worst-case probability measure that matches desired characteristics of the data. We analyze the inner maximization problem in the space of measures and introduce a novel measure approximation technique, based on the approximation of continuous functions, to discretize the unknown probability measure. Finally, we prove consistency of our approximated min-max problem and conclude with numerical results.
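
    Schematically, and in our own notation rather than the paper's, the distributionally robust problem has the min-max form below, with control z, PDE state u, uncertain input xi, and an ambiguity set A of admissible probability measures:

    ```latex
    % Schematic distributionally robust PDE-constrained problem (notation ours):
    \[
      \min_{z \in Z_{\mathrm{ad}}} \;
      \max_{\mu \in \mathcal{A}} \;
      \int_{\Xi} J\bigl(u(z,\xi), z\bigr)\, d\mu(\xi)
      \quad \text{s.t.} \quad
      e\bigl(u(z,\xi), z, \xi\bigr) = 0 \;\; \forall \xi \in \Xi,
    \]
    % where e(.) = 0 is the PDE constraint and the ambiguity set A collects
    % the probability measures matching the data-derived characteristics.
    ```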

  17. Design and analysis of DNA strand displacement devices using probabilistic model checking

    PubMed Central

    Lakin, Matthew R.; Parker, David; Cardelli, Luca; Kwiatkowska, Marta; Phillips, Andrew

    2012-01-01

    Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker prism, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs. PMID:22219398
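
    The analyses above operate on continuous-time Markov chain semantics for the chemistry. As a stand-in (not prism or DSD syntax), a tiny Gillespie-style simulation of a single displacement reaction, input + gate -> output + waste, with an invented rate constant:

    ```python
    # Gillespie-style simulation of one strand-displacement reaction as a
    # continuous-time Markov chain. Rate and counts are illustrative.
    import random

    random.seed(6)
    k = 5e-4                                   # displacement rate constant (assumed)
    counts = {"input": 30, "gate": 30, "output": 0, "waste": 0}
    t, t_end = 0.0, 10_000.0

    while t < t_end:
        propensity = k * counts["input"] * counts["gate"]
        if propensity == 0.0:
            break                              # no reactants left
        t += random.expovariate(propensity)    # exponential waiting time
        counts["input"]  -= 1; counts["gate"] -= 1
        counts["output"] += 1; counts["waste"] += 1

    print(t, counts)
    ```

    A model checker like prism computes such kinetics exactly over the whole state space rather than by sampled trajectories, which is what enables the correctness and reliability guarantees described above.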

  18. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling

    PubMed Central

    Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W. F.; Jeelani, Owase; Dunaway, David J.; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a variables correlation assessed various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction under-estimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of the facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face. PMID:29742139

  19. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling.

    PubMed

    Knoops, Paul G M; Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W F; Jeelani, Owase; Dunaway, David J; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a variables correlation assessed various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction under-estimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of the facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face.

  20. Alternate Methods in Refining the SLS Nozzle Plug Loads

    NASA Technical Reports Server (NTRS)

    Burbank, Scott; Allen, Andrew

    2013-01-01

    Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: 1) the main engines startup pressure and 2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis which showed a 31% higher safety factor compared to the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.

  1. Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction

    PubMed Central

    Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Random forests consisting of an ensemble of regression trees with equal weights are frequently used for design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approached the ensemble of probabilistic trees’ prediction from the perspectives of a mixture distribution and as a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic and cancer cell line encyclopedia dataset and illustrated that tree weights can be selected to reduce the average length of the CI without increase in mean error. PMID:27081304
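
    The mixture-distribution view of the weighted ensemble is easy to make concrete: each tree contributes a predictive mean and variance, and the weights combine them. The per-tree numbers and weights below are invented, and the interval uses a normal approximation to the mixture:

    ```python
    # Weighted ensemble prediction as a mixture distribution: combine
    # per-tree means and variances into a mixture mean, variance, and an
    # approximate confidence interval. All numbers are toy values.
    import numpy as np

    mu  = np.array([2.1, 1.8, 2.4, 2.0])      # per-tree predictive means (toy)
    var = np.array([0.20, 0.35, 0.15, 0.25])  # per-tree predictive variances (toy)
    w   = np.array([0.35, 0.15, 0.30, 0.20])  # optimized tree weights, sum to 1

    mix_mean = np.sum(w * mu)
    # Mixture variance = weighted within-tree variance + between-tree spread.
    mix_var = np.sum(w * (var + mu**2)) - mix_mean**2
    half = 1.96 * np.sqrt(mix_var)            # normal approximation to the mixture
    print(f"prediction {mix_mean:.2f}, ~95% CI [{mix_mean - half:.2f}, {mix_mean + half:.2f}]")
    ```

    Optimizing w then amounts to shrinking mix_var (and hence the CI length) subject to keeping the mean error comparable, which is the trade the paper studies.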

  2. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework, described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). The framework considers: probabilistic models that provide information specific to advanced SMRs; representation of specific SMR design issues, such as co-located modules and passive safety features; use of modern, open-source, and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  3. Integrating statistical and process-based models to produce probabilistic landslide hazard at regional scale

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.

    2017-12-01

    We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model for mapping probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding FRs for each SA on a grid-cell basis. Using landslide observations, we relate the susceptibility index to an empirically derived probability of landslide impact. This probability is combined with results from a physically based model to produce an integrated probabilistic map. Slope was key in landslide initiation, while deposition was linked to lithology and elevation. Vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
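
    The frequency ratio computation is simple enough to sketch end to end for one attribute. The FR for a class is the share of landslide cells in that class divided by the share of all cells in that class; a cell's susceptibility index sums the FRs of its attribute classes. The data below are fabricated:

    ```python
    # Frequency-ratio susceptibility for a single site attribute (slope
    # class). With more attributes, each cell's index sums their FRs.
    import numpy as np

    rng = np.random.default_rng(7)
    n_cells = 10_000
    slope_class = rng.integers(0, 4, n_cells)                     # 4 slope bins (toy)
    landslide   = rng.random(n_cells) < 0.02 * (1 + slope_class)  # steeper = more slides

    fr = np.zeros(4)
    for c in range(4):
        in_class = slope_class == c
        # FR = (landslides in class / all landslides) / (cells in class / all cells)
        fr[c] = (landslide[in_class].sum() / landslide.sum()) / in_class.mean()
    print("FR by slope class:", fr.round(2))

    susceptibility = fr[slope_class]          # per-cell susceptibility index
    print("top-decile cutoff:", np.quantile(susceptibility, 0.9).round(2))
    ```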

  4. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10(exp -11) due to the conservative nature of the factors of safety on the deterministic loads.

  5. Probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress modeled as a random field.
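    A one-variable sketch of the perturbation idea: for a response u(E) with a single random input of mean mu_E and variance var_E, the second-order estimate of the mean is u(mu_E) + 0.5*u''(mu_E)*var_E and the first-order variance is (u'(mu_E))^2 * var_E. The bar-displacement example below is illustrative, not taken from the paper:

    ```python
    import numpy as np

    # Tip displacement of an axial bar, u = P*L/(A*E), with random Young's
    # modulus E; a stand-in for a finite element response quantity.
    P, L, A = 10e3, 2.0, 1e-3            # load [N], length [m], area [m^2]
    mu_E, cov_E = 200e9, 0.10
    var_E = (cov_E * mu_E) ** 2

    u = lambda E: P * L / (A * E)
    h = 1e-3 * mu_E                      # finite-difference step
    du = (u(mu_E + h) - u(mu_E - h)) / (2 * h)
    d2u = (u(mu_E + h) - 2 * u(mu_E) + u(mu_E - h)) / h**2

    # Second-order estimate of the mean, first-order estimate of the variance.
    mean_u = u(mu_E) + 0.5 * d2u * var_E
    var_u = du**2 * var_E
    print(f"mean(u) = {mean_u:.4e} m, std(u) = {np.sqrt(var_u):.4e} m")
    ```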

  6. Estimate Tsunami Flow Conditions and Large-Debris Tracks for the Design of Coastal Infrastructures along Coastlines of the U.S. Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thomas, S.; Zhou, H.; Arcas, D.; Titov, V. V.

    2017-12-01

    Increasing tsunami hazards pose great challenges for infrastructure along the coastlines of the U.S. Pacific Northwest. Tsunami impact at a coastal site is usually assessed from deterministic scenarios based on 10,000 years of geological records in the Cascadia Subduction Zone (CSZ). Aside from these deterministic methods, the new ASCE 7-16 tsunami provisions provide engineering design criteria for tsunami loads on buildings based on a probabilistic approach. This work develops a site-specific model near Newport, OR, using high-resolution grids, and computes tsunami inundation depth and velocities at the study site resulting from credible probabilistic and deterministic earthquake sources in the Cascadia Subduction Zone. Three Cascadia scenarios, two deterministic (XXL1 and L1) and a 2,500-yr probabilistic scenario compliant with the new ASCE 7-16 standard, are simulated using a combination of a depth-averaged shallow water model for offshore propagation and a Boussinesq-type model for onshore inundation. We detail the methods and procedure used to obtain the 2,500-year probabilistic scenario for Newport that is compliant with the ASCE 7-16 tsunami provisions. We provide details of the model results, particularly the inundation depth and flow speed for a new building at Newport, Oregon, which will also be designated as a tsunami vertical evacuation shelter. We show that the ASCE 7-16 consistent hazards lie between those obtained from the deterministic L1 and XXL1 scenarios, and that the greatest impact on the building may come from later waves. As a further step, we use the inundation model results to numerically compute tracks of large vessels in the vicinity of the building site and estimate whether these vessels would strike the building site during the extreme XXL1 and ASCE 7-16 hazard-consistent scenarios. A two-step study is carried out, first tracking massless particles and then large vessels with assigned mass, considering drag force, inertial force, ship grounding, and mooring. The simulation results show that none of the large vessels strikes the building site in any tested scenario.

  7. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code, and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.
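    For the simplest limit state, the fast probability integration reduces to a closed form. The sketch below computes the Hasofer-Lind reliability index for a linear resistance-minus-stress limit state with independent normal variables; the numbers are illustrative, not NESSUS output:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Limit state g = R - S (resistance minus stress). For independent
    # normal R and S, the reliability index has a closed form, and
    # P_f = Phi(-beta).
    mu_R, sigma_R = 600.0, 50.0   # e.g. yield strength [MPa]
    mu_S, sigma_S = 400.0, 60.0   # e.g. applied stress [MPa]

    beta = (mu_R - mu_S) / np.hypot(sigma_R, sigma_S)
    p_f = norm.cdf(-beta)
    print(f"reliability index beta = {beta:.3f}, P_f = {p_f:.3e}")
    ```

    Nonlinear limit states require iterating to the most probable point, which is the part FPI accelerates relative to direct sampling.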

  8. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  9. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  10. Reliability Assessment of a Robust Design Under Uncertainty for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J. -W.; Newman, Perry A.

    2003-01-01

    The paper presents reliability assessment results for the robust designs under uncertainty of a 3-D flexible wing previously reported by the authors. Reliability assessments (additional optimization problems) of the active constraints at the various probabilistic robust design points are obtained and compared with the constraint values or target constraint probabilities specified in the robust design. In addition, reliability-based sensitivity derivatives with respect to design variable mean values are also obtained and shown to agree with finite difference values. These derivatives allow one to perform reliability-based design without having to obtain second-order sensitivity derivatives. However, an inner-loop optimization problem must be solved for each active constraint to find the most probable point on that constraint failure surface.

  11. An Outcomes-Based Assessment of Quality of Life in Social Services

    ERIC Educational Resources Information Center

    Gomez, Laura Elisabet; Arias, Benito; Verdugo, Miguel Angel; Navas, Patricia

    2012-01-01

    The goal of this article is to describe the calibration of an instrument to assess quality of life-related personal outcomes using Rasch analysis. The sample was composed of 3,029 recipients of social services from Catalonia (Spain) and was selected using a probabilistic multistage sampling design. Results related to unidimensionality, item…

  12. The Integrated Medical Model - A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Minard, Charles G.; Saile, Lynn; FreiredeCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma

    2010-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission planners and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight.

  13. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
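    A minimal sketch of an RBDO loop under stated assumptions (normally distributed load, deterministic yield strength, analytic inner reliability analysis; all values hypothetical). In general, the inner step would itself be a reliability analysis such as FORM or sampling:

    ```python
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize

    # Minimize the cross-sectional area of a bar subject to a reliability
    # constraint P[stress > yield] <= 1e-4.
    P_mean, P_cov = 50e3, 0.15          # axial load [N], coefficient of variation
    yield_strength = 250e6              # [Pa], treated as deterministic
    beta_target = -norm.ppf(1e-4)       # required reliability index

    def reliability_index(area):
        # Stress is normal because the load is normal; beta is analytic here.
        mu_stress = P_mean / area
        sigma_stress = P_cov * mu_stress
        return (yield_strength - mu_stress) / sigma_stress

    res = minimize(lambda a: a[0],       # objective: minimize area
                   x0=[1e-3],
                   constraints={"type": "ineq",
                                "fun": lambda a: reliability_index(a[0]) - beta_target},
                   bounds=[(1e-5, 1e-1)])
    print(f"optimal area = {res.x[0] * 1e6:.1f} mm^2")
    ```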

  14. Probabilistic description of probable maximum precipitation

    NASA Astrophysics Data System (ADS)

    Ben Alaya, Mohamed Ali; Zwiers, Francis W.; Zhang, Xuebin

    2017-04-01

    Probable Maximum Precipitation (PMP) is the key parameter used to estimate the Probable Maximum Flood (PMF). PMP and PMF are important for dam safety and civil engineering purposes. Even though current knowledge of storm mechanisms remains insufficient to properly evaluate limiting values of extreme precipitation, PMP estimation methods are still based on deterministic considerations and give only single values. This study aims to provide a probabilistic description of the PMP based on the commonly used method, so-called moisture maximization. To this end, a probabilistic bivariate extreme values model is proposed to address the limitations of traditional PMP estimates via moisture maximization, namely: (i) the inability to evaluate uncertainty and to provide a range of PMP values, (ii) the interpretation of a maximum of a data series as a physical upper limit, and (iii) the assumption that a PMP event has maximum moisture availability. Results from simulation outputs of the Canadian Regional Climate Model CanRCM4 over North America reveal the high uncertainties inherent in PMP estimates and the non-validity of the assumption that PMP events have maximum moisture availability. This latter assumption leads to overestimation of the PMP by an average of about 15% over North America, which may have serious implications for engineering design.

  15. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-01-01

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage addresses sensor placement to guarantee the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation to the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm. PMID:28587084
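    The collaborative detection probability has a standard at-least-one-sensor form. The sketch below assumes an exponential-decay sensing model, a common choice in this literature but not necessarily the exact model of the paper:

    ```python
    import numpy as np

    def p_detect(distance, alpha=0.5, r_max=30.0):
        """Assumed probabilistic sensing model: detection probability decays
        exponentially with distance, and is zero beyond the sensing range."""
        return np.exp(-alpha * distance) if distance <= r_max else 0.0

    def collaborative_detection(distances):
        """P(at least one sensor detects) = 1 - prod_i (1 - p_i)."""
        p_miss = np.prod([1.0 - p_detect(d) for d in distances])
        return 1.0 - p_miss

    # A target is epsilon-covered if the collaborative probability >= epsilon.
    print(collaborative_detection([2.0, 3.5, 5.0]))  # -> ~0.52
    ```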

  16. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors.

    PubMed

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-05-25

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage addresses sensor placement to guarantee the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation to the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm.

  17. Possibility-based robust design optimization for the structural-acoustic system with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan

    2018-03-01

    Conventional engineering optimization problems considering uncertainties are based on probabilistic models. However, a probabilistic model may be unavailable because of the lack of sufficient objective information to construct the precise probability distribution of uncertainties. This paper proposes a possibility-based robust design optimization (PBRDO) framework for the uncertain structural-acoustic system based on the fuzzy set model, which can be constructed from expert opinions. The objective of robust design is to optimize the expectation and variability of system performance with respect to uncertainties simultaneously. In the proposed PBRDO, the entropy of the fuzzy system response is used as the variability index; the weighted sum of the entropy and expectation of the fuzzy response is used as the objective function, and the constraints are established in the possibility context. The computations for the constraints and objective function of PBRDO are a triple-loop and a double-loop nested problem, respectively, whose computational costs are considerable. To improve the computational efficiency, the target performance approach is introduced to transform the calculation of the constraints into a double-loop nested problem. To further improve the computational efficiency, a Chebyshev fuzzy method (CFM) based on Chebyshev polynomials is proposed to estimate the objective function, and the Chebyshev interval method (CIM) is introduced to estimate the constraints, thereby transforming the optimization problem into a single-loop one. Numerical results on a shell structural-acoustic system verify the effectiveness and feasibility of the proposed methods.

  18. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    NASA Technical Reports Server (NTRS)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many people. It quantifies the model inputs with a Level of Evidence (LOE) ranking, based on the highest value of the data, and a Quality of Evidence (QOE) score that together assess the evidence base for each medical condition. The IMM evidence base has already provided invaluable information for designers and for other uses.

  19. Robust Design Optimization via Failure Domain Bounding

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2007-01-01

    This paper extends and applies the strategies recently developed by the authors for handling constraints under uncertainty to robust design optimization. For the scope of this paper, robust optimization is a methodology aimed at problems for which some parameters are uncertain and are only known to belong to some uncertainty set. This set can be described by either a deterministic or a probabilistic model. In the methodology developed herein, optimization-based strategies are used to bound the constraint violation region using hyper-spheres and hyper-rectangles. By comparing the resulting bounding sets with any given uncertainty model, it can be determined whether the constraints are satisfied for all members of the uncertainty model (i.e., constraints are feasible) or not (i.e., constraints are infeasible). If constraints are infeasible and a probabilistic uncertainty model is available, upper bounds to the probability of constraint violation can be efficiently calculated. The tools developed enable approximating not only the set of designs that make the constraints feasible but also, when required, the set of designs for which the probability of constraint violation is below a prescribed admissible value. When constraint feasibility is possible, several design criteria can be used to shape the uncertainty model of performance metrics of interest. Worst-case, least-second-moment, and reliability-based design criteria are considered herein. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, these strategies are easily applicable to a broad range of engineering problems.

  20. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and analyzes designs that determine stochastic properties of MEMS. For CMCs, this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin-film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
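    The Weibull size effect that the bulge test is meant to confirm can be stated compactly: under a two-parameter Weibull strength model, failure probability scales with specimen volume, so characteristic strength drops as V^(-1/m). A small illustration with hypothetical parameters:

    ```python
    import numpy as np

    # Two-parameter Weibull strength model with volume scaling:
    # P_f(sigma, V) = 1 - exp(-(V / V0) * (sigma / sigma0)**m)
    # Larger specimens are weaker on average: they sample more flaws.
    m, sigma0, V0 = 10.0, 400.0, 1.0   # modulus, scale [MPa], reference volume

    def p_fail(sigma, V):
        return 1.0 - np.exp(-(V / V0) * (sigma / sigma0) ** m)

    # Characteristic strength (63.2% failure probability) scales as V**(-1/m).
    for V in (0.1, 1.0, 10.0):
        sigma_c = sigma0 * (V0 / V) ** (1.0 / m)
        print(f"V = {V:5.1f}: characteristic strength = {sigma_c:6.1f} MPa")
    ```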

  1. Frontal and Parietal Contributions to Probabilistic Association Learning

    PubMed Central

    Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke

    2011-01-01

    Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842

  2. Enhancing Cost Realism through Risk-Driven Contracting: Designing Incentive Fees Based on Probabilistic Cost Estimates

    DTIC Science & Technology

    2012-04-01

    It appears the pendulum may be...the cost risk for requiring greater innovation. However, this natural flattening trend also leads to a potential drawback of the risk-driven

  3. Dynamic sensing model for accurate detectability of environmental phenomena using event wireless sensor network

    NASA Astrophysics Data System (ADS)

    Missif, Lial Raja; Kadhum, Mohammad M.

    2017-09-01

    Wireless Sensor Network (WSN) has been widely used for monitoring, where sensors are deployed to operate independently to sense abnormal phenomena. Most of the proposed environmental monitoring systems are designed based on a predetermined sensing range, which does not reflect sensor reliability, event characteristics, or environmental conditions. Measuring the capability of a sensor node to accurately detect an event within a sensing field is of great importance for monitoring applications. This paper presents an efficient mechanism for event detection based on a probabilistic sensing model. Different models are presented theoretically to examine their adaptability and applicability to real environment applications. The numerical results of the experimental evaluation show that the probabilistic sensing model provides accurate observation and detectability of an event, and that it can be utilized for different environment scenarios.

  4. Unsteady Probabilistic Analysis of a Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Brown, Marilyn

    2003-01-01

    In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.

  5. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.
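    A toy version of the maximum entropy construction: choose weights theta so that p(x) ∝ exp(theta·f(x)) over a candidate set matches observed feature means, then sample an ensemble. The feature functions and candidate trajectories below are synthetic placeholders, not the paper's:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Candidate "trajectories" summarized by two feature values each.
    candidates = rng.normal(size=(500, 2))
    observed_mean = np.array([0.3, -0.1])   # feature means of observed flights

    # Fit p(x) ∝ exp(theta · f(x)) by gradient ascent; at convergence the
    # maximum entropy distribution matches the observed feature expectations.
    theta = np.zeros(2)
    for _ in range(2000):
        w = np.exp(candidates @ theta)
        p = w / w.sum()
        theta += 0.5 * (observed_mean - p @ candidates)

    # Draw an ensemble of trajectories reflecting the variability in intent.
    ensemble = candidates[rng.choice(len(candidates), size=10, p=p)]
    print(f"learned theta = {theta}, model feature means = {p @ candidates}")
    ```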

  6. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.

  7. Constellation design with geometric and probabilistic shaping

    NASA Astrophysics Data System (ADS)

    Zhang, Shaoliang; Yaman, Fatih

    2018-02-01

    A systematic study, including theory, simulation and experiments, is carried out to review the generalized pairwise optimization algorithm for designing optimized constellation. In order to verify its effectiveness, the algorithm is applied in three testing cases: 2-dimensional 8 quadrature amplitude modulation (QAM), 4-dimensional set-partitioning QAM, and probabilistic-shaped (PS) 32QAM. The results suggest that geometric shaping can work together with PS to further bridge the gap toward the Shannon limit.

  8. High Cycle Fatigue (HCF) Science and Technology Program 2002 Annual Report

    DTIC Science & Technology

    2003-08-01

    [Table-of-contents excerpt: Probabilistic Design of Turbine Engine Airfoils, Phases I and II; Probabilistic Blade Design System; test and evaluation protocols.] ...transparent and opaque overlays for processing. The objective of the SBIR Phase I program was to identify and evaluate promising methods for

  9. Coupled Multi-Disciplinary Optimization for Structural Reliability and Affordability

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    A computational simulation method is presented for Non-Deterministic Multidisciplinary Optimization of engine composite materials and structures. A hypothetical engine duct made with ceramic matrix composites (CMC) is evaluated probabilistically in the presence of combined thermo-mechanical loading. The structure is tailored by quantifying the uncertainties in all relevant design variables such as fabrication, material, and loading parameters. The probabilistic sensitivities are used to select critical design variables for optimization. In this paper, two approaches for non-deterministic optimization are presented. The non-deterministic minimization of combined failure stress criterion is carried out by: (1) performing probabilistic evaluation first and then optimization and (2) performing optimization first and then probabilistic evaluation. The first approach shows that the optimization feasible region can be bounded by a set of prescribed probability limits and that the optimization follows the cumulative distribution function between those limits. The second approach shows that the optimization feasible region is bounded by 0.50 and 0.999 probabilities.

  10. Methods for Combining Payload Parameter Variations with Input Environment

    NASA Technical Reports Server (NTRS)

    Merchant, D. H.; Straayer, J. W.

    1975-01-01

    Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the methods are also presented.
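    The extreme-value construction can be sketched directly: if each of n independent load events has CDF F, the mission maximum has CDF F(x)^n, and the design limit load is a chosen quantile of that distribution. The numbers below are illustrative:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Largest load in a mission of n independent load events, each normally
    # distributed; the mission maximum has CDF F(x)**n.
    mu, sigma, n_events = 100.0, 10.0, 1000   # per-event load statistics
    target_prob = 0.99                        # non-exceedance level for design

    # Solve F(x)**n = target  ->  F(x) = target**(1/n), then invert F.
    per_event_quantile = target_prob ** (1.0 / n_events)
    design_limit_load = norm.ppf(per_event_quantile, loc=mu, scale=sigma)
    print(f"design limit load = {design_limit_load:.1f}")
    ```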

  11. A Darwinian approach to control-structure design

    NASA Technical Reports Server (NTRS)

    Zimmerman, David C.

    1993-01-01

    Genetic algorithms (GA's), as introduced by Holland (1975), are one form of directed random search. The form of direction is based on Darwin's 'survival of the fittest' theories. GA's are radically different from the more traditional design optimization techniques. GA's work with a coding of the design variables, as opposed to working with the design variables directly. The search is conducted from a population of designs (i.e., from a large number of points in the design space), unlike the traditional algorithms which search from a single design point. The GA requires only objective function information, as opposed to gradient or other auxiliary information. Finally, the GA is based on probabilistic transition rules, as opposed to deterministic rules. These features allow the GA to attack problems with local-global minima, discontinuous design spaces and mixed variable problems, all in a single, consistent framework.
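    A minimal GA sketch showing the features listed above: a population of designs, selection by a probabilistic fitness-proportional rule, crossover, and mutation. For brevity it uses real-valued genes rather than Holland's bit-string coding, and the objective is a stand-in multimodal test function:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(pop):
        # Negated Rastrigin: many local optima, global best at the origin.
        x, y = pop[:, 0], pop[:, 1]
        return -(x**2 + y**2 - 10*np.cos(2*np.pi*x) - 10*np.cos(2*np.pi*y) + 20)

    pop = rng.uniform(-5, 5, size=(50, 2))       # population of coded designs
    for _ in range(200):
        f = fitness(pop)
        # Probabilistic selection: fitter designs are more likely to reproduce.
        weights = np.exp(f - f.max())
        weights /= weights.sum()
        parents = pop[rng.choice(len(pop), size=len(pop), p=weights)]
        # Crossover: blend each parent with a randomly permuted partner.
        partners = parents[rng.permutation(len(parents))]
        alpha = rng.uniform(size=(len(parents), 1))
        pop = alpha * parents + (1 - alpha) * partners
        # Mutation: small random perturbation keeps the search exploring.
        pop += rng.normal(scale=0.1, size=pop.shape)

    print("best design:", pop[np.argmax(fitness(pop))])
    ```

    Note that only fitness values are used, never gradients, which is what lets a GA handle discontinuous and mixed-variable design spaces.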

  12. Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs

    NASA Astrophysics Data System (ADS)

    Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna

    2017-11-01

    The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II design storms, are no longer sufficient. Urgent clarification is needed regarding the methodology of standardized rainfall hyetographs that take into consideration the specifics of local storm rainfall temporal dynamics. The aim of this paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of collections of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs achieved as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.

  13. Process for computing geometric perturbations for probabilistic analysis

    DOEpatents

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  14. An ontology-based nurse call management system (oNCS) with probabilistic priority assessment

    PubMed Central

    2011-01-01

    Background The current, place-oriented nurse call systems are very static. A patient can only make calls with a button which is fixed to a wall of a room. Moreover, the system does not take into account various factors specific to a situation. In the future, there will be an evolution to a mobile button for each patient so that they can walk around freely and still make calls. The system would become person-oriented and the available context information should be taken into account to assign the correct nurse to a call. The aim of this research is (1) the design of a software platform that supports the transition to mobile and wireless nurse call buttons in hospitals and residential care and (2) the design of a sophisticated nurse call algorithm. This algorithm dynamically adapts to the situation at hand by taking the profile information of staff members and patients into account. Additionally, the priority of a call probabilistically depends on the risk factors, assigned to a patient. Methods The ontology-based Nurse Call System (oNCS) was developed as an extension of a Context-Aware Service Platform. An ontology is used to manage the profile information. Rules implement the novel nurse call algorithm that takes all this information into account. Probabilistic reasoning algorithms are designed to determine the priority of a call based on the risk factors of the patient. Results The oNCS system is evaluated through a prototype implementation and simulations, based on a detailed dataset obtained from Ghent University Hospital. The arrival times of nurses at the location of a call, the workload distribution of calls amongst nurses and the assignment of priorities to calls are compared for the oNCS system and the current, place-oriented nurse call system. Additionally, the performance of the system is discussed. Conclusions The execution time of the nurse call algorithm is on average 50.333 ms. Moreover, the oNCS system significantly improves the assignment of nurses to calls. Calls generally have a nurse present faster and the workload-distribution amongst the nurses improves. PMID:21294860

  15. Wind effects on long-span bridges: Probabilistic wind data format for buffeting and VIV load assessments

    NASA Astrophysics Data System (ADS)

    Hoffmann, K.; Srouji, R. G.; Hansen, S. O.

    2017-12-01

    The technology development within the structural design of long-span bridges in Norwegian fjords has created a need for reformulating the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.

  16. Probabilistic structural analysis using a general purpose finite element program

    NASA Astrophysics Data System (ADS)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.

  17. Augmenting Probabilistic Risk Assessment with Malevolent Initiators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; David Schwieder

    2011-11-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  18. Saul: Towards Declarative Learning Based Programming

    PubMed Central

    Kordjamshidi, Parisa; Roth, Dan; Wu, Hao

    2015-01-01

    We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction. PMID:26635465

  19. Saul: Towards Declarative Learning Based Programming.

    PubMed

    Kordjamshidi, Parisa; Roth, Dan; Wu, Hao

    2015-07-01

    We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction.

  20. Development of a Prototype Model-Form Uncertainty Knowledge Base

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  1. System Level Uncertainty Assessment for Collaborative RLV Design

    NASA Technical Reports Server (NTRS)

    Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew

    2002-01-01

    A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limitation of financial resources by both the government and industry, strategic decision makers need more than just traditional point designs, they need to be aware of the likelihood of these future designs to meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system level output metrics of interest for a RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.

  2. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle System. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  3. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  4. Development of a Physically-Based Methodology for Predicting Material Variability in Fatigue Crack Initiation and Growth Response

    DTIC Science & Technology

    2004-12-01

    64, (2000), Federal Aviation Administration, Washington, DC. 14. Y.T. Wu, M.P. Enright, and H.R. Millwater, "Probabilistic Methods for Design...Assessment of Reliability with Inspection," AIAA Journal, AIAA, 40 (5), (2002), 937-946. 15. M.P. Enright, L. Huyse, R.C. McClung, and H.R. Millwater

  5. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  6. CPT-based probabilistic and deterministic assessment of in situ seismic soil liquefaction potential

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Der Kiureghian, A.; Cetin, K.O.

    2006-01-01

    This paper presents a complete methodology for both probabilistic and deterministic assessment of seismic soil liquefaction triggering potential based on the cone penetration test (CPT). A comprehensive worldwide set of CPT-based liquefaction field case histories were compiled and back analyzed, and the data then used to develop probabilistic triggering correlations. Issues investigated in this study include improved normalization of CPT resistance measurements for the influence of effective overburden stress, and adjustment to CPT tip resistance for the potential influence of "thin" liquefiable layers. The effects of soil type and soil character (i.e., "fines" adjustment) for the new correlations are based on a combination of CPT tip and sleeve resistance. To quantify probability for performance-based engineering applications, Bayesian "regression" methods were used, and the uncertainties of all variables comprising both the seismic demand and the liquefaction resistance were estimated and included in the analysis. The resulting correlations were developed using a Bayesian framework and are presented in both probabilistic and deterministic formats. The results are compared to previous probabilistic and deterministic correlations. © 2006 ASCE.

  7. A Hypergraph and Arithmetic Residue-based Probabilistic Neural Network for classification in Intrusion Detection Systems.

    PubMed

    Raman, M R Gauthama; Somu, Nivethitha; Kirthivasan, Kannan; Sriram, V S Shankar

    2017-08-01

    Over the past few decades, the design of an intelligent Intrusion Detection System (IDS) has remained an open challenge to the research community. Continuous efforts by researchers have resulted in the development of several learning models based on Artificial Neural Networks (ANN) to improve the performance of IDSs. However, there exists a tradeoff between the stability of the ANN architecture and the detection rate for less frequent attacks. This paper presents a novel approach based on the Helly property of Hypergraph and an Arithmetic Residue-based Probabilistic Neural Network (HG AR-PNN) to address the classification problem in IDS. The Helly property of Hypergraph was exploited for the identification of the optimal feature subset, and the arithmetic residue of the optimal feature subset was used to train the PNN. The performance of HG AR-PNN was evaluated using the KDD CUP 1999 intrusion dataset. Experimental results prove the dominance of the HG AR-PNN classifier over existing classifiers with respect to stability and improved detection rate for less frequent attacks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Probabilistic Analysis and Design of a Raked Wing Tip for a Commercial Transport

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Chen, Tzi-Kang; Padula, Sharon L.; Ransom, Jonathan B.; Stroud, W. Jefferson

    2008-01-01

    An approach for conducting reliability-based design and optimization (RBDO) of a Boeing 767 raked wing tip (RWT) is presented. The goal is to evaluate the benefits of RBDO for design of an aircraft substructure. A finite-element (FE) model that includes eight critical static load cases is used to evaluate the response of the wing tip. Thirteen design variables that describe the thickness of the composite skins and stiffeners are selected to minimize the weight of the wing tip. A strain-based margin of safety is used to evaluate the performance of the structure. The randomness in the load scale factor and in the strain limits is considered. Of the 13 variables, the wing-tip design was controlled primarily by the thickness of the thickest plies in the upper skins. The report includes an analysis of the optimization results and recommendations for future reliability-based studies.

  9. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines) with the fast probability integration technique (FPI). FPI was developed by Southwest Research Institute under contract with the NASA Glenn Research Center. The results were plotted in the form of cumulative distribution functions and sensitivity analyses and were compared with results from the traditional deterministic approach. The comparison showed that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system. The current work addressed the application of the probabilistic approach to assess specific fuel consumption, engine thrust, and weight. Similarly, the approach can be used to assess other aspects of aeropropulsion system performance, such as cost, acoustic noise, and emissions. Additional information is included in the original extended abstract.

  10. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.
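
    As a hedged illustration of one element mentioned above, unequal probability weighting, the sketch below draws a sample of watershed polygons with selection weights proportional to an invented area attribute; the sequential weighted draw is a simplification of formal variable-probability survey designs:

```python
# Toy unequal-probability sample of watershed polygons (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
watershed_ids = np.arange(20)
area_km2 = rng.uniform(5, 500, size=watershed_ids.size)

# Selection probability proportional to watershed area (normalized to 1).
p = area_km2 / area_km2.sum()

# Draw a without-replacement sample of 5 watersheds.
sample = rng.choice(watershed_ids, size=5, replace=False, p=p)
print("sampled watersheds:", sample)
```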

  11. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.

    PubMed

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-08-31

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in simulation and in real environments.

  12. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera

    PubMed Central

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-01-01

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in simulation and in real environments. PMID:26404284
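
    A simplified sketch of the refinement step both records describe: starting from an initial pose such as a PnP solver would supply, minimize the sum of squared Mahalanobis distances between predicted and observed features. The 2D state and all landmark values below are invented for illustration:

```python
# Mahalanobis-distance pose refinement on a toy 2D problem.
import numpy as np
from scipy.optimize import minimize

map_points = np.array([[0.0, 0.0], [4.0, 1.0], [2.0, 3.0]])
covs = [np.diag([0.10, 0.10]), np.diag([0.30, 0.05]), np.diag([0.05, 0.30])]
true_pose = np.array([1.0, 2.0])
observed = map_points - true_pose          # noiseless relative observations

def cost(pose):
    # Sum of squared Mahalanobis distances of the feature residuals.
    total = 0.0
    for m, c, z in zip(map_points, covs, observed):
        r = (m - pose) - z
        total += r @ np.linalg.inv(c) @ r
    return total

initial_pose = np.array([0.5, 1.5])        # e.g., from a PnP-style initializer
result = minimize(cost, initial_pose)
print("refined pose:", result.x)           # converges to [1, 2]
```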

  13. A novel probabilistic framework for event-based speech recognition

    NASA Astrophysics Data System (ADS)

    Juneja, Amit; Espy-Wilson, Carol

    2003-10-01

    One of the reasons for the unsatisfactory performance of state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary-valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features (syllabic, sonorant and continuant) and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.

  14. Probabilistic performance assessment of complex energy process systems - The case of a self-sustained sanitation system.

    PubMed

    Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean

    2018-05-01

    A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and inherent characteristics of the system's input material, there are a number of stochastic variables which may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling Engine (SE) power output in the range of 61.5-73 W with a high confidence interval (CI) of 95%. In addition, there is high probability (with 95% CI) that the NMT system can achieve positive net power output between 15.8 and 35 W. A sensitivity study reveals the system power performance is mostly affected by SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements on the system design and operational strategy and this probabilistic assessment framework can also be applied to similar complex engineering systems.
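
    A hedged sketch of the non-intrusive workflow outlined above, with a quadratic polynomial standing in for the paper's ANN surrogate and a toy function standing in for the deterministic thermodynamic runs:

```python
# Surrogate-plus-Monte-Carlo sketch (illustrative stand-ins throughout).
import numpy as np

rng = np.random.default_rng(2)

def expensive_simulation(heater_temp_c):
    # Placeholder for one deterministic thermodynamic process simulation.
    return 0.05 * heater_temp_c - 0.00002 * heater_temp_c**2

# A finite number of deterministic runs at chosen design points.
train_x = np.linspace(400, 900, 9)
train_y = expensive_simulation(train_x)
surrogate = np.polynomial.Polynomial.fit(train_x, train_y, deg=2)

# Monte Carlo on the cheap surrogate with an assumed input distribution.
samples = rng.normal(650, 60, 50_000)
power = surrogate(samples)
lo, hi = np.percentile(power, [2.5, 97.5])
print(f"95% interval for toy power output: [{lo:.1f}, {hi:.1f}] W")
```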

  15. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design: once the design-basis tsunami height is set, the actual tsunami height may still exceed it because of uncertainties regarding the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard, estimating the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are the reasonable input data for structural and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure of the system analysis are still being developed. (authors)
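
    A toy version of the logic-tree computation described above: several branch models give different exceedance curves, and weighted statistics across the branches yield the mean and fractile hazard curves. All branch curves and weights below are invented:

```python
# Mean and fractile hazard curves from weighted logic-tree branches.
import numpy as np

heights = np.linspace(0.5, 10.0, 50)      # tsunami height grid (m)

# Three hypothetical branch exceedance curves and their logic-tree weights.
branches = np.array([1e-2 * np.exp(-heights / s) for s in (1.5, 2.0, 3.0)])
weights = np.array([0.3, 0.5, 0.2])

mean_curve = weights @ branches           # weighted mean hazard curve

def weighted_fractile(values, w, q):
    # Smallest branch value whose cumulative weight reaches quantile q.
    order = np.argsort(values)
    cw = np.cumsum(w[order])
    return values[order][np.searchsorted(cw, q)]

p16 = [weighted_fractile(branches[:, i], weights, 0.16) for i in range(50)]
p84 = [weighted_fractile(branches[:, i], weights, 0.84) for i in range(50)]

i2 = np.searchsorted(heights, 2.0)
print(f"exceedance at 2 m: mean={mean_curve[i2]:.2e}, "
      f"16th={p16[i2]:.2e}, 84th={p84[i2]:.2e}")
```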

  16. Error Discounting in Probabilistic Category Learning

    PubMed Central

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    Some current theories of probabilistic categorization assume that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report two probabilistic-categorization experiments that investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less responsive to errors, and learning was slowed for some time after the feedback shift. Both results are indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning. PMID:21355666

  17. A Study of Students' Reasoning about Probabilistic Causality: Implications for Understanding Complex Systems and for Instructional Design

    ERIC Educational Resources Information Center

    Grotzer, Tina A.; Solis, S. Lynneth; Tutwiler, M. Shane; Cuzzolino, Megan Powell

    2017-01-01

    Understanding complex systems requires reasoning about causal relationships that behave or appear to behave probabilistically. Features such as distributed agency, large spatial scales, and time delays obscure co-variation relationships and complex interactions can result in non-deterministic relationships between causes and effects that are best…

  18. Adolescents' Heightened Risk-Seeking in a Probabilistic Gambling Task

    ERIC Educational Resources Information Center

    Burnett, Stephanie; Bault, Nadege; Coricelli, Giorgio; Blakemore, Sarah-Jayne

    2010-01-01

    This study investigated adolescent males' decision-making under risk, and the emotional response to decision outcomes, using a probabilistic gambling task designed to evoke counterfactually mediated emotions (relief and regret). Participants were 20 adolescents (aged 9-11), 26 young adolescents (aged 12-15), 20 mid-adolescents (aged 15-18) and 17…

  19. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds for the definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods of μ-analysis and synthesis can lead to overly conservative controller design. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as to analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean, variance, and cumulative distribution functions of responses are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.

  20. A computational framework to empower probabilistic protein design

    PubMed Central

    Fromer, Menachem; Yanover, Chen

    2008-01-01

    Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the distributions predicted significantly differ from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717
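
    The positional marginals that the paper approximates with belief propagation can be computed exactly on a toy problem; the sketch below enumerates a two-position, three-residue sequence space under an invented energy function and a Boltzmann distribution:

```python
# Exact Boltzmann marginals on a tiny "design" problem (invented energies).
import itertools
import numpy as np

residues = ["A", "L", "E"]
beta = 1.0                                  # inverse temperature

def energy(seq):
    # Invented energy: positional preferences plus a pairwise matching term.
    pos = {"A": 0.0, "L": -0.5, "E": 0.3}
    pair = -1.0 if seq[0] == seq[1] else 0.0
    return pos[seq[0]] + pos[seq[1]] + pair

seqs = list(itertools.product(residues, repeat=2))
weights = np.array([np.exp(-beta * energy(s)) for s in seqs])
probs = weights / weights.sum()             # Boltzmann distribution

# Marginal amino acid probabilities at position 0.
for r in residues:
    marginal = probs[[s[0] == r for s in seqs]].sum()
    print(f"P(position 0 = {r}) = {marginal:.3f}")
```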

  1. Probabilistic Assessment of Fracture Progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank

    1999-01-01

    This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically, during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.

  2. Analysis of scale effect in compressive ice failure and implications for design

    NASA Astrophysics Data System (ADS)

    Taylor, Rocky Scott

    The main focus of the study was the analysis of scale effect in local ice pressure resulting from probabilistic (spalling) fracture and the relationship between local and global loads due to the averaging of pressures across the width of a structure. A review of fundamental theory, relevant ice mechanics and a critical analysis of data and theory related to the scale dependent pressure behavior of ice were completed. To study high pressure zones (hpzs), data from small-scale indentation tests carried out at the NRC-IOT were analyzed, including small-scale ice block and ice sheet tests. Finite element analysis was used to model a sample ice block indentation event using a damaging, viscoelastic material model and element removal techniques (for spalling). Medium scale tactile sensor data from the Japan Ocean Industries Association (JOIA) program were analyzed to study details of hpz behavior. The averaging of non-simultaneous hpz loads during an ice-structure interaction was examined using local panel pressure data. Probabilistic averaging methodology for extrapolating full-scale pressures from local panel pressures was studied and an improved correlation model was formulated. Panel correlations for high speed events were observed to be lower than panel correlations for low speed events. Global pressure estimates based on probabilistic averaging were found to give substantially lower average errors in estimation of load compared with methods based on linear extrapolation (no averaging). Panel correlations were analyzed for Molikpaq and compared with JOIA results. From this analysis, it was shown that averaging does result in decreasing pressure for increasing structure width. The relationship between local pressure and ice thickness for a panel of unit width was studied in detail using full-scale data from the STRICE, Molikpaq, Cook Inlet and Japan Ocean Industries Association (JOIA) data sets. A distinct trend of decreasing pressure with increasing ice thickness was observed. The pressure-thickness behavior was found to be well modeled by the power law relationships Pavg = 0.278h^(-0.408) MPa and Pstd = 0.172h^(-0.273) MPa for the mean and standard deviation of pressure, respectively. To study theoretical aspects of spalling fracture and the pressure-thickness scale effect, probabilistic failure models have been developed. A probabilistic model based on Weibull theory (tensile stresses only) was first developed. Estimates of failure pressure obtained with this model were orders of magnitude higher than the pressures observed from benchmark data due to the assumption of only tensile failure. A probabilistic fracture mechanics (PFM) model including both tensile and compressive (shear) cracks was developed. Criteria for unstable fracture in tensile and compressive (shear) zones were given. From these results a clear theoretical scale effect in peak (spalling) pressure was observed. This scale effect followed the relationship Pp,th = 0.15h^(-0.50) MPa, which agreed well with the benchmark data. The PFM model was applied to study the effect of ice edge shape (taper angle) and hpz eccentricity. Results indicated that specimens with flat edges spall at lower pressures while those with more tapered edges spall less readily. The mean peak (failure) pressure was also observed to decrease with increased eccentricity. It was concluded that hpzs centered about the middle of the ice thickness are the zones most likely to create the peak pressures that are of interest in design.
Promising results were obtained using the PFM model, which provides strong support for continued research in the development and application of probabilistic fracture mechanics to the study of scale effects in compressive ice failure and to guide the development of methods for the estimation of design ice pressures.
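
    The fitted scale-effect relationships quoted above translate directly into code (pressures in MPa, ice thickness h in metres); the thickness values below are arbitrary:

```python
# Power-law pressure-thickness relationships reported in the study.
def mean_local_pressure(h):
    return 0.278 * h ** -0.408

def std_local_pressure(h):
    return 0.172 * h ** -0.273

for h in (0.5, 1.0, 2.0):
    print(f"h = {h:.1f} m: mean = {mean_local_pressure(h):.3f} MPa, "
          f"std = {std_local_pressure(h):.3f} MPa")
```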

  3. Probabilistic analysis of mean-response along-wind induced vibrations on wind turbine towers using wireless network data sensors

    NASA Astrophysics Data System (ADS)

    Velazquez, Antonio; Swartz, Raymond A.

    2011-04-01

    Wind turbine systems are attracting considerable attention due to concerns regarding global energy consumption as well as sustainability. Advances in wind turbine technology promote the tendency to improve efficiency in the structures that support and produce this renewable power source, tending toward more slender and larger towers, larger gear boxes, and larger, lighter blades. The structural design optimization process must account for uncertainties and nonlinear effects (such as wind-induced vibrations, unmeasured disturbances, and material and geometric variabilities). In this study, a probabilistic monitoring approach is developed that measures the response of the turbine tower to stochastic loading and estimates peak demand and structural resistance (in terms of serviceability). The proposed monitoring system can provide a real-time estimate of the probability of exceedance of design serviceability conditions based on data collected in situ. Special attention is paid to wind and aerodynamic characteristics that are intrinsically present (although sometimes neglected in health monitoring analysis) and derived from observations or experiments. In particular, little attention has been devoted to buffeting, which is usually non-catastrophic but directly impacts the serviceability of the operating wind turbine. As a result, modal-based analysis methods for the study of flutter instability and buffeting response have been successfully applied to the assessment of the susceptibility of high-rise slender structures, including wind turbine towers. A detailed finite element model has been developed to generate data (calibrated to published experimental and analytical results). Risk assessment is performed for the effects of along-wind forces in a framework of quantitative risk analysis. Both structural resistance and wind load demands were considered probabilistic, with the latter assessed by dynamic analyses.

  4. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability etc.) and design parameters (injection rate, depth etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al. Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification in modeling CO2 injection, and its consequences can be stronger than those of neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
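
    A compact illustration of the stochastic model reduction described above, under stated assumptions: a toy response is projected onto an orthogonal (probabilists' Hermite) polynomial basis by collocation, and Monte Carlo is then run on the cheap expansion instead of the expensive model:

```python
# Polynomial-chaos-style collocation sketch (toy model, one random parameter).
import numpy as np
from numpy.polynomial import hermite_e as He

def model(xi):
    # Placeholder for an expensive multiphase flow simulation.
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Collocation: evaluate the model at Gauss-Hermite nodes, fit coefficients.
nodes, _ = He.hermegauss(5)
coeffs = He.hermefit(nodes, model(nodes), deg=2)

# Monte Carlo on the second-order expansion is now essentially free.
xi = np.random.default_rng(3).standard_normal(100_000)
approx = He.hermeval(xi, coeffs)
print("mean (expansion):", approx.mean(), "| mean (direct):", model(xi).mean())
```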

  5. Base-Rate Neglect as a Function of Base Rates in Probabilistic Contingency Learning

    ERIC Educational Resources Information Center

    Kutzner, Florian; Freytag, Peter; Vogel, Tobias; Fiedler, Klaus

    2008-01-01

    When humans predict criterion events based on probabilistic predictors, they often lend excessive weight to the predictor and insufficient weight to the base rate of the criterion event. In an operant analysis, using a matching-to-sample paradigm, Goodie and Fantino (1996) showed that humans exhibit base-rate neglect when predictors are associated…

  6. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.

  7. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it allows one to quantify the hazard due to hazardous phenomena and, differently from the deterministic approach, it accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this leads to the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and several featured Python-based libraries and modules. The pyPHaz tool allows the user to visualize the Hazard Curves (HC) calculated in a selected target area together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them by creating ensemble models. The pyPHaz software has been designed to store and access all the data through a MySQL database and to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format is easy to extend to other kinds of hazard, as will be shown in the example applications of the pyPHaz tool, which focus on a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra dispersal and fallout applied to the municipality of Naples.

  8. Students’ difficulties in probabilistic problem-solving

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems. It focuses on analyzing and describing students' errors while solving such problems. This research used the qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. Data in this research involve students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. The data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems can be divided into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate strategies for solving the problem; and third, difficulties with the computational process. These results suggest that students are not yet able to apply their knowledge and abilities to probabilistic problems. Therefore, it is important for mathematics teachers to plan probabilistic learning that could optimize students' probabilistic thinking ability.

  9. Probabilistic Assessment of a CMC Turbine Vane

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Brewer, Dave; Mital, Subodh K.

    2004-01-01

    In order to demonstrate the advanced CMC technology under development within the Ultra Efficient Engine Technology (UEET) program, it has been planned to fabricate, test and analyze an all-CMC turbine vane made of a SiC/SiC composite material. The objective was to utilize a 5-harness satin weave SiC/CVI SiC and MI SiC matrix material, developed in-house under the Enabling Propulsion Materials (EPM) program, to design and fabricate a stator vane that can successfully endure 1000 hours of operation at engine service conditions. The design requirements for the vane are to withstand a maximum of 2400 F within the substrate and a hot surface temperature of 2700 F with the aid of an in-house developed Environmental/Thermal Barrier Coating (EBC/TBC) system. The vane will be tested in a High Pressure Burner Rig at the NASA Glenn Research Center facility. This rig is capable of simulating the engine service environment. The present paper focuses on a probabilistic assessment of the vane. The material stress/strain relationship shows a bilinear behavior with a distinct knee corresponding to what is often termed the first matrix cracking strength. This is a critical life-limiting consideration for these materials. The vane is therefore designed such that the maximum stresses are within this limit so that the structure is never subjected to loads beyond the first matrix cracking strength. Any violation of this design requirement is considered a failure. Probabilistic analysis is performed in order to determine the probability of failure based on this assumption. In the analysis, material properties, strength, and pressures are considered random variables. The variations in properties and strength are based on actual experimental data generated in-house. The mean values for the pressures on the upper surface and the lower surface are known, but their distributions are unknown. In the present analysis the pressures are considered normally distributed with a nominal variation. The temperature profile on the vane is obtained by performing a CFD analysis and is assumed to be deterministic.
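
    A stripped-down sketch of the failure criterion stated above: treat the pressure-induced stress and the first matrix cracking strength as random variables and estimate the probability of exceedance by Monte Carlo. The distributions and the linear stress-pressure relation are placeholders, not the in-house SiC/SiC data:

```python
# Toy first-matrix-cracking exceedance probability by Monte Carlo.
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000

strength = rng.normal(180.0, 15.0, n)   # cracking strength (MPa), assumed
pressure = rng.normal(0.60, 0.03, n)    # surface pressure (MPa), assumed normal
stress = 250.0 * pressure               # invented linear stress-pressure relation

p_fail = np.mean(stress > strength)
print(f"estimated probability of exceeding first matrix cracking: {p_fail:.2e}")
```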

  10. Probabilistic classifiers with high-dimensional data

    PubMed Central

    Kim, Kyung In; Simon, Richard

    2011-01-01

    For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n, large p classification problems despite their importance in medical decision making. In this paper, we introduce two criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed two extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated or at least not "anticonservative" using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
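
    A hedged sketch of the well-calibratedness check discussed above: bin predicted class probabilities and compare each bin's mean prediction with the observed event rate. The data, and the deliberately overconfident classifier, are simulated:

```python
# Calibration (reliability) check for a simulated probabilistic classifier.
import numpy as np

rng = np.random.default_rng(4)
true_p = rng.uniform(0.05, 0.95, 5_000)
y = rng.random(true_p.size) < true_p                  # observed labels

# Mimic an anticonservative classifier by sharpening the true probabilities.
pred_p = np.clip(true_p + 0.16 * (true_p - 0.5), 0.001, 0.999)

bins = np.linspace(0.0, 1.0, 11)
idx = np.digitize(pred_p, bins) - 1
for b in range(10):
    mask = idx == b
    if mask.any():
        print(f"bin {b}: mean predicted {pred_p[mask].mean():.2f}, "
              f"observed rate {y[mask].mean():.2f}")
```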

  11. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    The oil pipeline network is one of the most important facilities for energy transportation, but an oil pipeline network accident may result in serious disasters. Analysis models for these accidents have been established mainly based on three methods: event trees, accident simulation, and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis. However, not all the important influencing factors have been considered, and a deployment rule for the factors has not been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.

  12. Free-Energy-Based Design Policy for Robust Network Control against Environmental Fluctuation.

    PubMed

    Iwai, Takuya; Kominami, Daichi; Murata, Masayuki; Yomo, Tetsuya

    2015-01-01

    Bioinspired network control is a promising approach for realizing robust network controls. It relies on a probabilistic mechanism composed of positive and negative feedback that allows the system to eventually stabilize on the best solution. When the best solution fails due to environmental fluctuation, the system cannot maintain its function until it finds another solution. To prevent this temporary loss of function, the system should prepare several candidate solutions and stochastically select an available one from them. However, most bioinspired network controls are not designed with this issue in mind. In this paper, we propose a thermodynamics-based design policy that allows systems to retain an appropriate degree of randomness depending on the degree of environmental fluctuation, which prepares the system for the occurrence of environmental fluctuation. Furthermore, we verify the design policy by using attractor selection model-based multipath routing in simulation experiments.

  13. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
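
    A minimal sketch of the iterative sampling at the heart of probabilistic bias analysis: draw sensitivity and specificity from assumed distributions, back-correct an invented 2x2 table for exposure misclassification, and collect the distribution of adjusted odds ratios:

```python
# Probabilistic bias analysis for exposure misclassification (toy counts).
import numpy as np

rng = np.random.default_rng(5)
a, b, c, d = 120, 80, 90, 110    # observed exposed/unexposed cases, controls

ors = []
for _ in range(10_000):
    se = rng.uniform(0.75, 0.95)          # assumed sensitivity distribution
    sp = rng.uniform(0.85, 0.99)          # assumed specificity distribution
    # Back-calculate "true" exposed counts among cases and controls.
    A = (a - (1 - sp) * (a + b)) / (se + sp - 1)
    C = (c - (1 - sp) * (c + d)) / (se + sp - 1)
    B, D = (a + b) - A, (c + d) - C
    if min(A, B, C, D) > 0:               # discard impossible corrections
        ors.append((A * D) / (B * C))

print("median adjusted OR:", round(float(np.median(ors)), 2))
print("95% simulation interval:", np.percentile(ors, [2.5, 97.5]).round(2))
```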

  14. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Technical Reports Server (NTRS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-01-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, as is the case in the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on the factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable, since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost, and the maximization of reliability and performance.

  15. Probabilistic assessment of uncertain adaptive hybrid composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.

  16. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  18. Probabilistic evaluation of damage potential in earthquake-induced liquefaction in a 3-D soil deposit

    NASA Astrophysics Data System (ADS)

    Halder, A.; Miller, F. J.

    1982-03-01

    A probabilistic model to evaluate the risk of liquefaction at a site and to limit or eliminate damage during earthquake-induced liquefaction is proposed. The model is extended to consider three-dimensional nonhomogeneous soil properties. The parameters relevant to the liquefaction phenomenon are identified, including: (1) soil parameters; (2) parameters required to consider laboratory test and sampling effects; and (3) loading parameters. The fundamentals of risk-based design concepts pertinent to liquefaction are reviewed. A detailed statistical evaluation of the soil parameters in the proposed liquefaction model is provided, and the uncertainty associated with the estimation of in situ relative density is evaluated for both direct and indirect methods. It is found that, for the liquefaction potential, the uncertainties in the load parameters could be higher than those in the resistance parameters.

  19. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This annual report summarizes the work completed during the third year of technical effort on the referenced contract. Principal developments continue to focus on the Probabilistic Finite Element Method (PFEM), which has been under development for three years. Essentially all of the linear capabilities within the PFEM code are in place. Major progress was achieved in the applications and verification phase. An EXPERT module architecture was designed and partially implemented. EXPERT is a user interface module which incorporates an expert system shell for the implementation of a rule-based interface utilizing the experience and expertise of the user community. The Fast Probability Integration (FPI) algorithm continues to demonstrate outstanding performance characteristics for the integration of probability density functions for multiple variables. Additionally, an enhanced Monte Carlo simulation algorithm was developed and demonstrated for a variety of numerical strategies.

  20. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    NASA Astrophysics Data System (ADS)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on the needs of emergency managers. Our experimental design uses National Oceanic and Atmospheric Administration (NOAA) Climate Prediction Center (CPC) climate outlooks, which provide probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to complete questions related to how they use weather information, how uncertainty is represented, and design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate there is significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm they better meet the needs of users.

  1. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture mechanics analysis. The goal of these predictions was to provide additional information to guide decisions on the potential of reusing existing and installed units prior to the new design certification.

  2. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings and also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
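
    A schematic of the Green's function summation described above: precomputed unit-slip waveforms for each subfault are scaled by a scenario's slip values and summed. The waveforms here are synthetic pulses, not hydrodynamic computations:

```python
# Linear superposition of precomputed subfault waveforms (toy signals).
import numpy as np

t = np.linspace(0, 3600, 721)             # time (s) at one coastal point

def unit_waveform(arrival_s, period_s):
    # Stand-in for a precomputed subfault tsunami waveform for unit slip.
    envelope = np.exp(-((t - arrival_s) / (2 * period_s)) ** 2)
    return envelope * np.sin(2 * np.pi * (t - arrival_s) / period_s)

greens = [unit_waveform(600, 300), unit_waveform(900, 300), unit_waveform(1200, 300)]
slip_m = [2.0, 5.0, 1.0]                  # one scenario's subfault slips (m)

scenario = sum(s * g for s, g in zip(slip_m, greens))
print(f"peak tsunami amplitude (toy units): {scenario.max():.2f}")
```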

  3. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in decision making for shuttle design and operation. Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? How likely is it to occur? And what are the consequences if it does? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the shuttle are reviewed.

  4. CARES/Life Used for Probabilistic Characterization of MEMS Pressure Sensor Membranes

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2002-01-01

    Microelectromechanical systems (MEMS) devices are typically made from brittle materials such as silicon using traditional semiconductor manufacturing techniques. They can be etched (or micromachined) from larger structures or can be built up with material deposition processes. Maintaining dimensional control and consistent mechanical properties is considerably more difficult for MEMS because feature size is on the micrometer scale. Therefore, the application of probabilistic design methodology becomes necessary for MEMS. This was demonstrated at the NASA Glenn Research Center and Case Western Reserve University in an investigation that used the NASA-developed CARES/Life brittle material design program to study the probabilistic fracture strength behavior of single-crystal SiC, polycrystalline SiC, and amorphous Si3N4 pressurized 1-mm-square thin-film diaphragms. These materials are of interest because of their superior high-temperature characteristics, which are desirable for harsh environment applications such as turbine engine and rocket propulsion system hot sections.

  5. A Generic Probabilistic Model and a Hierarchical Solution for Sensor Localization in Noisy and Restricted Conditions

    NASA Astrophysics Data System (ADS)

    Ji, S.; Yuan, X.

    2016-06-01

    A generic probabilistic model, based on the fundamental Bayes' rule and the Markov assumption, is introduced to integrate the process of mobile platform localization with optical sensors. Based on it, three relatively independent solutions (bundle adjustment, Kalman filtering and particle filtering) are deduced under different additional restrictions. We aim to show that, first, Kalman filtering may be a better initial-value supplier for bundle adjustment than traditional relative orientation in irregular strips and networks, or when tie-point extraction fails. Second, in highly noisy conditions, particle filtering can act as a bridge for gap binding when a large number of gross errors would cause a Kalman filter or a bundle adjustment to fail. Third, both filtering methods, which help reduce error propagation and eliminate gross errors, safeguard a global and static bundle adjustment, which requires the strictest initial values and control conditions. The main innovation is the integrated processing of stochastic errors and gross errors in sensor observations, and the integration of the three most widely used solutions (bundle adjustment, Kalman filtering and particle filtering) into a generic probabilistic localization model. Tests in noisy and restricted situations are designed and examined to support these claims.
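
    As a concrete illustration of the Kalman-filtering building block referred to above, here is a minimal scalar predict/update loop; the constant-position motion model and noise values are assumptions of this sketch, not the authors' formulation.

```python
def kalman_1d(z_measurements, x0=0.0, p0=1.0, q=0.01, r=0.25):
    """Scalar Kalman filter: constant-position motion model with process
    noise q and measurement noise r. Returns the filtered state estimates,
    e.g. for use as initial values in a subsequent bundle adjustment."""
    x, p, estimates = x0, p0, []
    for z in z_measurements:
        p = p + q                      # predict: uncertainty grows
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update with measurement innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

print(kalman_1d([1.2, 0.9, 1.1, 1.0, 1.3]))
```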

  6. Probabilistic analysis of wind-induced vibration mitigation of structures by fluid viscous dampers

    NASA Astrophysics Data System (ADS)

    Chen, Jianbing; Zeng, Xiaoshu; Peng, Yongbo

    2017-11-01

    High-rise buildings often suffer from excessively large wind-induced vibrations, and thus vibration control systems might be necessary. Fluid viscous dampers (FVDs) with a nonlinear power law against velocity are widely employed. With the transition of design methods from traditional frequency-domain approaches to more refined direct time-domain approaches, difficulties with the time integration of these systems sometimes arise. In the present paper, the underlying reason for the difficulty is first revealed by identifying that the equations of motion of high-rise buildings installed with FVDs are sometimes stiff differential equations. An approach effective for stiff differential systems, the backward differentiation formula (BDF), is then introduced and verified to be effective for the equation of motion of wind-induced vibration controlled systems. Comparative studies are performed among several methods, including the Newmark method, the KR-alpha method, the energy-based linearization method and the statistical linearization method. Based on the above results, a 20-story steel frame structure is taken as a practical example. In particular, the randomness of structural parameters and of the wind loading input is emphasized. The extreme values of the responses are examined, showing the effectiveness of the proposed approach and also the need for refined probabilistic analysis in the design of wind-induced vibration mitigation systems.
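
    To make the stiffness point concrete, the sketch below integrates a single-degree-of-freedom oscillator with a power-law damper force c*sign(v)*|v|^alpha using a stiff solver; SciPy's "BDF" method stands in for the backward differentiation formula discussed above, and the structural parameters and wind load are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative SDOF system with a nonlinear power-law fluid viscous damper:
#   m*x'' + c*sign(x')*|x'|**alpha + k*x = F(t)
m, k, c, alpha = 1.0e5, 4.0e6, 2.0e5, 0.3   # kg, N/m, N*(s/m)**alpha, exponent

def rhs(t, y):
    """First-order form of the equation of motion."""
    x, v = y
    f_wind = 5.0e4 * np.sin(0.5 * t)         # placeholder wind load (N)
    f_damp = c * np.sign(v) * np.abs(v) ** alpha
    return [v, (f_wind - f_damp - k * x) / m]

# With alpha < 1 the damper force has an unbounded slope near v = 0, which
# is what makes the system stiff at velocity reversals; BDF handles this.
sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0], method="BDF", rtol=1e-6, atol=1e-9)
print("peak displacement: %.4f m" % np.max(np.abs(sol.y[0])))
```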

  7. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    EPA Science Inventory

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  8. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  9. Probabilistic Design of a Mars Sample Return Earth Entry Vehicle Thermal Protection System

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Mitcheltree, Robert A.

    2002-01-01

    The driving requirement for the design of a Mars Sample Return mission is to assure containment of the returned samples. Designing to, and demonstrating compliance with, such a requirement requires physics-based tools that establish the relationship between an engineer's sizing margins and probabilities of failure. The traditional method of determining margins on ablative thermal protection systems, while conservative, provides little insight into the actual probability of an over-temperature during flight. The objective of this paper is to describe a new methodology for establishing margins in sizing the thermal protection system (TPS). Results of this Monte Carlo approach are compared with traditional methods.
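
    A toy sketch of the margin-versus-probability idea follows: sample the uncertain inputs, evaluate a thermal response model, and count over-temperature outcomes for a candidate TPS thickness. The response model and input distributions below are invented stand-ins for the real aerothermal and ablation analyses.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical uncertain inputs (not real Mars Sample Return data).
heat_load = rng.normal(1.0, 0.15, n)        # normalized entry heat load
conductivity = rng.lognormal(0.0, 0.10, n)  # normalized ablator conductivity

def bondline_temp(thickness, q, kcond):
    """Placeholder thermal response: hotter with load and conductivity,
    cooler with thickness. A real analysis would call an ablation code."""
    return 200.0 + 300.0 * q * kcond / thickness

T_limit = 525.0  # allowable bondline temperature (illustrative, deg C)
for thickness in (0.8, 1.0, 1.2):           # candidate normalized TPS sizings
    T = bondline_temp(thickness, heat_load, conductivity)
    print(f"thickness {thickness:.1f} -> P(over-temperature) = {np.mean(T > T_limit):.4f}")
```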

  10. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to a review of the methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of the different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of the different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. These deficiencies result in an inability to treat correctly the dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, the results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach, with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
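
    For reference, the traditional PSHA computation being critiqued can be written as a rate integral over magnitude and distance, lambda(A > a) = nu * sum_m sum_r P(m) P(r) P(A > a | m, r). The sketch below implements that sum for a single schematic source; the recurrence parameters and toy attenuation relation are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

nu = 0.05                                   # annual rate of earthquakes on the source
mags = np.arange(5.0, 7.6, 0.1)
b = 1.0
fm = 10.0 ** (-b * mags); fm /= fm.sum()    # truncated Gutenberg-Richter pmf
dists = np.array([20.0, 40.0, 80.0])        # source-to-site distances (km)
fr = np.array([0.3, 0.4, 0.3])              # pmf of those distances

def ln_pga_median(m, r):
    """Toy attenuation relation: ln of median PGA (g)."""
    return -3.5 + 1.0 * m - 1.3 * np.log(r)

sigma_ln = 0.6                              # aleatory ground-motion variability
for a in (0.1, 0.2, 0.4):                   # PGA levels in g
    p_exceed = norm.sf((np.log(a) - ln_pga_median(mags[:, None], dists[None, :])) / sigma_ln)
    lam = nu * np.sum(fm[:, None] * fr[None, :] * p_exceed)
    print(f"PGA > {a:.1f} g: annual exceedance rate = {lam:.2e}")
```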

  11. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability (but not the consequences) of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.

  12. Solving probability reasoning based on DNA strand displacement and probability modules.

    PubMed

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind" and shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
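
    As a plain numeric illustration of the two relations the DNA modules implement, the snippet below evaluates the law of total probability and a conditional (Bayes) probability for a two-hypothesis setting; the event probabilities are arbitrary.

```python
# Arbitrary illustrative numbers for a two-hypothesis diagnosis setting.
p_a = [0.3, 0.7]                 # prior P(A_i) over two genotypes
p_b_given_a = [0.9, 0.2]         # P(B | A_i): probability of the observed trait

# Law of total probability: P(B) = sum_i P(B | A_i) P(A_i)
p_b = sum(pb * pa for pb, pa in zip(p_b_given_a, p_a))

# Conditional probability (Bayes): P(A_0 | B) = P(B | A_0) P(A_0) / P(B)
p_a0_given_b = p_b_given_a[0] * p_a[0] / p_b
print(f"P(B) = {p_b:.3f}, P(A_0 | B) = {p_a0_given_b:.3f}")
```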

  13. Generalized probabilistic scale space for image restoration.

    PubMed

    Wong, Alexander; Mishra, Akshaya K

    2010-10-01

    A novel generalized sampling-based probabilistic scale space theory is proposed for image restoration. We explore extending the definition of scale space to better account for both noise and observation models, which is important for producing accurately restored images. A new class of scale-space realizations based on sampling and probability theory is introduced to realize this extended definition in the context of image restoration. Experimental results using 2-D images show that generalized sampling-based probabilistic scale-space theory can be used to produce more accurate restored images when compared with state-of-the-art scale-space formulations, particularly under situations characterized by low signal-to-noise ratios and image degradation.

  14. Probabilistic dual heuristic programming-based adaptive critic

    NASA Astrophysics Data System (ADS)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. The theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. A full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  15. Probabilistic seismic hazard analyses for ground motions and fault displacement at Yucca Mountain, Nevada

    USGS Publications Warehouse

    Stepp, J.C.; Wong, I.; Whitney, J.; Quittmeyer, R.; Abrahamson, N.; Toro, G.; Young, S.R.; Coppersmith, K.; Savy, J.; Sullivan, T.

    2001-01-01

    Probabilistic seismic hazard analyses were conducted to estimate both ground motion and fault displacement hazards at the potential geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. The study is believed to be the largest and most comprehensive analysis ever conducted for ground-shaking hazard and is a first-of-a-kind assessment of probabilistic fault displacement hazard. The major emphasis of the study was on the quantification of epistemic uncertainty. Six teams of three experts performed seismic source and fault displacement evaluations, and seven individual experts provided ground motion evaluations. State-of-the-practice expert elicitation processes involving structured workshops, consensus identification of parameters and issues to be evaluated, common sharing of data and information, and open exchanges about the basis for preliminary interpretations were implemented. Ground-shaking hazard was computed for a hypothetical rock outcrop at -300 m, the depth of the potential waste emplacement drifts, at the designated design annual exceedance probabilities of 10^-3 and 10^-4. The fault displacement hazard was calculated at the design annual exceedance probabilities of 10^-4 and 10^-5.

  16. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  17. New Proofs of Some q-Summation and q-Transformation Formulas

    PubMed Central

    Liu, Xian-Fang; Bi, Ya-Qing; Luo, Qiu-Ming

    2014-01-01

    We obtain an expectation formula and give probabilistic proofs of some summation and transformation formulas of q-series based on our expectation formula. Although these formulas are not themselves probabilistic results, the proofs given are based on probabilistic concepts. PMID:24895675

  18. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  19. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, this way facilitating the automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).

  20. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties and laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatter predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness, a unique feature of structural composites. The age of the underlying reference indicates that nothing fundamental has been done in this area since that time.

  1. Probabilistic evaluation of SSME structural components

    NASA Astrophysics Data System (ADS)

    Rajagopal, K. R.; Newell, J. F.; Ho, H.

    1991-05-01

    The application is described of Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have been typically used in the past in deterministic analysis studies.

  2. Demonstration of a Probabilistic Technique for the Determination of Economic Viability of Very Large Transport Configurations

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    1998-01-01

    Over the past few years, modern aircraft design has experienced a paradigm shift from designing for performance to designing for affordability. This report contains a probabilistic approach that allows traditional deterministic design methods to be extended to account for disciplinary, economic, and technological uncertainty. The probabilistic approach was facilitated by the Fast Probability Integration (FPI) technique, which allows the designer to gather valuable information about the vehicle's behavior in the design space. This technique is efficient for assessing multi-attribute, multi-constraint problems in a more realistic fashion. For implementation purposes, the technique is applied to illustrate how both the economic and the technological uncertainty associated with a Very Large Transport (VLT) aircraft concept may be assessed. The assessment is evaluated with the FPI technique to determine the cumulative probability distributions of the design space, as bounded by economic objectives and performance constraints. These distributions were compared to established targets for a comparable large-capacity aircraft, similar in size to the Boeing 747-400. The conventional baseline configuration design space was determined to be infeasible and only marginally viable, motivating the infusion of advanced technologies, including reductions in drag, specific fuel consumption, wing weight, and Research, Development, Testing, and Evaluation costs. The resulting system design space was qualitatively assessed with technology-metric "k" factors. The infusion of technologies shifted the VLT design into regions of feasibility and greater viability. The study also demonstrated a method and relationship by which the impact of new technologies may be assessed in a more system-focused approach.

  3. Algebraic and Probabilistic Bases for Fuzzy Sets and the Development of Fuzzy Conditioning

    DTIC Science & Technology

    1991-08-01

    (Only fragmentary text is available for this record. The recoverable portion cites, among others, Bruno & Gilio (1985) on the basic issue of combining implicatives compatible with conditional probability: Bruno, G. & Gilio, A. (1985), "Confronto fra eventi condizionati di probabilità nulla nell'inferenza statistica" [Comparison of conditional events of zero probability in statistical inference].)

  4. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    EPA Science Inventory

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  5. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
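
    Of the three methods, the method of moments is easily sketched: propagate small input standard deviations through a first-order Taylor expansion and compare against Monte Carlo. The weight function and uncertainty levels below are invented, not the paper's analysis code.

```python
import numpy as np

def gross_weight(x):
    """Toy analysis: aircraft weight as a nonlinear function of two
    configuration design variables (placeholder for the real code)."""
    return 1000.0 + 50.0 * x[0] ** 1.5 + 30.0 * x[0] * x[1]

mu = np.array([4.0, 2.0])       # nominal design variables
sig = np.array([0.05, 0.08])    # small input standard deviations

# First-order method of moments: sigma_y^2 ~ sum_i (df/dx_i)^2 * sigma_i^2
eps = 1e-6
grad = np.array([(gross_weight(mu + eps * e) - gross_weight(mu)) / eps
                 for e in np.eye(2)])
sigma_mom = np.sqrt(np.sum((grad * sig) ** 2))

# Monte Carlo check of the propagated uncertainty.
rng = np.random.default_rng(2)
samples = rng.normal(mu, sig, size=(200_000, 2))
sigma_mc = np.std(gross_weight(samples.T))

print(f"method of moments: {sigma_mom:.2f}, Monte Carlo: {sigma_mc:.2f}")
```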

  6. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in large-scale, regional maps. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources, including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions present a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
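
    The per-pixel calculation described above amounts to weighting a displacement fragility by the hazard-curve probability of each peak-ground-acceleration bin. A schematic sketch follows; the fragility form, yield acceleration, and hazard ordinates are all invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Invented annual P(PGA in bin) from a hazard curve, and bin midpoints (g).
pga_bins = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
p_bin = np.array([2e-2, 8e-3, 3e-3, 1e-3, 4e-4])

def p_disp_exceeds(d_m, pga, ky=0.15, sigma_ln=0.8):
    """Toy Newmark-style fragility: lognormal displacement whose median
    grows with PGA above the yield acceleration ky (values illustrative)."""
    median = np.where(pga > ky, 0.5 * (pga - ky) ** 2 / ky, 1e-9)  # meters
    return norm.sf((np.log(d_m) - np.log(median)) / sigma_ln)

for d in (0.1, 0.3, 1.0):       # displacement thresholds (m)
    p_annual = np.sum(p_bin * p_disp_exceeds(d, pga_bins))
    print(f"annual P(displacement > {d:.1f} m) = {p_annual:.2e}")
```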

  7. Integrated Technology Assessment Center (ITAC) Update

    NASA Technical Reports Server (NTRS)

    Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)

    2002-01-01

    The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP, and integrated concept models (ICMs) of these concepts are developed. Key technologies of interest are identified, and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and the technology projections are then fed into ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single-point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to the ASTP using a multivariate decision-making process to provide inputs for technology prioritization within the ASTP. The ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO) rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all-rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for its Flexible Aerospace System Solution for Tomorrow (FASST) concept, and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC-based concept in the near term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.

  8. Ranking of sabotage/tampering avoidance technology alternatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, W.B.; Tabatabai, A.S.; Powers, T.B.

    1986-01-01

    Pacific Northwest Laboratory conducted a study to evaluate alternatives to the design and operation of nuclear power plants, emphasizing a reduction of their vulnerability to sabotage. Estimates of core melt accident frequency during normal operations and from sabotage/tampering events were used to rank the alternatives. Core melt frequency for normal operations was estimated using sensitivity analysis of results of probabilistic risk assessments. Core melt frequency for sabotage/tampering was estimated by developing a model based on probabilistic risk analyses, historic data, engineering judgment, and safeguards analyses of plant locations where core melt events could be initiated. Results indicate the most effective alternatives focus on large areas of the plant, increase safety system redundancy, and reduce reliance on single locations for mitigation of transients. Less effective options focus on specific areas of the plant, reduce reliance on some plant areas for safe shutdown, and focus on less vulnerable targets.

  9. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.

  10. Robust Depth Image Acquisition Using Modulated Pattern Projection and Probabilistic Graphical Models

    PubMed Central

    Kravanja, Jaka; Žganec, Mario; Žganec-Gros, Jerneja; Dobrišek, Simon; Štruc, Vitomir

    2016-01-01

    Depth image acquisition with structured light approaches in outdoor environments is a challenging problem due to external factors, such as ambient sunlight, which commonly affect the acquisition procedure. This paper presents a novel structured light sensor designed specifically for operation in outdoor environments. The sensor exploits a modulated sequence of structured light projected onto the target scene to counteract environmental factors and estimate a spatial distortion map in a robust manner. The correspondence between the projected pattern and the estimated distortion map is then established using a probabilistic framework based on graphical models. Finally, the depth image of the target scene is reconstructed using a number of reference frames recorded during the calibration process. We evaluate the proposed sensor on experimental data in indoor and outdoor environments and present comparative experiments with other existing methods, as well as commercial sensors. PMID:27775570

  11. A probabilistic storm transposition approach for estimating exceedance probabilities of extreme precipitation depths

    NASA Astrophysics Data System (ADS)

    Foufoula-Georgiou, E.

    1989-05-01

    A storm transposition approach is investigated as a possible tool for assessing the frequency of extreme precipitation depths, that is, depths with return periods much greater than 100 years. This paper focuses on estimation of the annual exceedance probability of extreme average precipitation depths over a catchment. The probabilistic storm transposition methodology is presented, and the several conceptual and methodological difficulties arising in this approach are identified. The method is implemented and partially evaluated by means of a semihypothetical example involving extreme midwestern storms and two hypothetical catchments (of 100 and 1000 mi², or roughly 260 and 2600 km²) located in central Iowa. The results point out the need for further research to fully explore the potential of this approach as a tool for assessing the probabilities of rare storms, and eventually floods, a necessary element of risk-based analysis and design of large hydraulic structures.

  12. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.

  13. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
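
    A minimal sketch of the decentralized idea: each agent independently samples its next bin from a Markov chain whose stationary distribution equals the desired swarm density. Here the chain is built with a Metropolis-Hastings rule on a ring of bins, which is a construction choice of this sketch rather than necessarily the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)
n_bins, n_agents, n_steps = 8, 5000, 200

pi = np.array([1, 1, 2, 4, 4, 2, 1, 1], dtype=float)
pi /= pi.sum()                               # desired swarm density over bins

# Metropolis-Hastings chain on a ring: propose a uniform neighbor, accept
# with min(1, pi[j]/pi[i]); with a symmetric proposal this leaves pi stationary.
bins = rng.integers(0, n_bins, n_agents)     # arbitrary initial distribution
for _ in range(n_steps):
    proposal = (bins + rng.choice([-1, 1], n_agents)) % n_bins
    accept = rng.random(n_agents) < np.minimum(1.0, pi[proposal] / pi[bins])
    bins = np.where(accept, proposal, bins)  # each agent decides independently

empirical = np.bincount(bins, minlength=n_bins) / n_agents
print("target:   ", np.round(pi, 3))
print("empirical:", np.round(empirical, 3))
```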

  14. Posterior stabilized versus cruciate retaining total knee arthroplasty designs: conformity affects the performance reliability of the design over the patient population.

    PubMed

    Ardestani, Marzieh M; Moazen, Mehran; Maniei, Ehsan; Jin, Zhongmin

    2015-04-01

    Commercially available fixed-bearing knee prostheses are mainly divided into two groups: posterior stabilized (PS) and cruciate retaining (CR). Despite widespread comparative studies, the debate continues regarding the superiority of one type over the other. This study used combined finite element (FE) simulation and principal component analysis (PCA) to evaluate the "reliability" and "sensitivity" of two PS designs versus two CR designs over a patient population. Four fixed-bearing implants were chosen: PFC (DePuy), PFC Sigma (DePuy), NexGen (Zimmer) and Genesis II (Smith & Nephew). Using PCA, a large probabilistic knee joint motion and loading database was generated based on the available experimental data from the literature. The probabilistic knee joint data were applied to each implant in an FE simulation to calculate the potential envelopes of kinematics (i.e., anterior-posterior [AP] displacement and internal-external [IE] rotation) and contact mechanics. The performance envelopes were considered an indicator of performance reliability. For each implant, PCA was used to highlight how much the implant performance was influenced by changes in each input parameter (sensitivity). Results showed that (1) conformity directly affected the reliability of the knee implant over a patient population, such that lower-conformity designs (PS or CR) had higher kinematic variability and were more influenced by AP force and IE torque; (2) contact reliability did not differ noticeably among the different designs; and (3) CR or PS designs affected the relative rank of the critical factors that influenced the reliability of each design. Such investigations illuminate the underlying biomechanics of various implant designs and can be utilized to estimate the potential performance of an implant design over a patient population. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
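
    The PCA step described above can be sketched in a few lines: fit principal components to a set of load waveforms and sample component scores to generate new, statistically consistent loading profiles. The "experimental" waveforms below are synthetic placeholders, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials, n_t = 60, 100
t = np.linspace(0.0, 1.0, n_t)

# Placeholder 'experimental' force waveforms over the gait cycle (N).
data = (250.0 + 40.0 * rng.normal(size=(n_trials, 1))) * np.sin(np.pi * t) \
       + 15.0 * rng.normal(size=(n_trials, n_t))

# PCA via SVD of the centered data.
mean = data.mean(axis=0)
U, S, Vt = np.linalg.svd(data - mean, full_matrices=False)
scores_std = S / np.sqrt(n_trials - 1)      # std of scores on each component

# Probabilistic loading profiles: mean + random scores on the first k PCs.
k = 3
new_scores = rng.normal(0.0, scores_std[:k], size=(1000, k))
new_profiles = mean + new_scores @ Vt[:k]
print("generated profiles:", new_profiles.shape)  # (1000, 100) load histories
```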

  15. Commercialization of NESSUS: Status

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

    A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the ongoing commercialization effort is to begin the transfer of technology developed under the Probabilistic Structural Analysis Method (PSAM) effort into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS software system is a general-purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost-effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next-generation Space Shuttle Main Engine.

  16. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results for a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite element codes, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible, and therefore volume modeling is generally required.

  17. The analysis of the possibility of using 10-minute rainfall series to determine the maximum rainfall amount with 5 minutes duration

    NASA Astrophysics Data System (ADS)

    Kaźmierczak, Bartosz; Wartalska, Katarzyna; Wdowikowski, Marcin; Kotowski, Andrzej

    2017-11-01

    Modern scientific research in the area of heavy rainfall analysis with regard to sewerage design indicates the need to develop and use probabilistic rain models. One of the issues that remains to be resolved is the shortest rainfall duration to be analyzed. It is commonly held that the best choice is 5 minutes, while the shortest rainfall duration measured by the national services is often 10 or even 15 minutes. The main aim of this paper is to present the differences between the results of probabilistic rainfall models derived from rainfall time series that include and exclude the 5-minute rainfall duration. Analyses were made for the long period 1961-2010 at the Polish meteorological station Legnica. Four probability distributions were used to develop the probabilistic model best fitted to the rainfall measurement data. The results clearly indicate that models including the 5-minute rainfall duration remain more appropriate to use.
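
    The distribution-fitting step can be sketched as follows: fit a candidate extreme-value distribution (here a Gumbel, one of several a study like this would compare) to annual maximum rainfall depths and read off design depths for chosen return periods. The data below are synthetic, not the Legnica series.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(5)
# Synthetic 50-year series of annual maximum 5-min rainfall depths (mm).
annual_max = gumbel_r.rvs(loc=8.0, scale=2.5, size=50, random_state=rng)

loc, scale = gumbel_r.fit(annual_max)        # fit location and scale
for T in (2, 5, 10, 50):                     # return periods in years
    depth = gumbel_r.ppf(1.0 - 1.0 / T, loc, scale)
    print(f"T = {T:3d} yr -> design depth = {depth:.1f} mm")
```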

  18. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  19. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  20. Community-based early warning systems for flood risk mitigation in Nepal

    NASA Astrophysics Data System (ADS)

    Smith, Paul J.; Brown, Sarah; Dugar, Sumit

    2017-03-01

    This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based, physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability, with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.
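
    A toy sketch of the forecasting idea: a simple autoregressive water-level model, initialized from the latest observation as a crude stand-in for data assimilation, is propagated as an ensemble to the extended lead time to yield an exceedance probability. All coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
phi, mu, sigma_e = 0.92, 2.0, 0.12   # invented AR(1) river-stage parameters

def ensemble_forecast(last_obs, lead_hours, n_members=500):
    """Propagate an AR(1) model from the assimilated state as an ensemble,
    returning stage samples (m) at the requested lead time."""
    x = np.full(n_members, last_obs)
    for _ in range(lead_hours):
        x = mu + phi * (x - mu) + rng.normal(0.0, sigma_e, n_members)
    return x

stage = ensemble_forecast(last_obs=3.4, lead_hours=8)
danger_level = 3.0
print(f"P(stage > {danger_level} m at +8 h) = {np.mean(stage > danger_level):.2f}")
```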

  1. A probabilistic approach to randomness in geometric configuration of scalable origami structures

    NASA Astrophysics Data System (ADS)

    Liu, Ke; Paulino, Glaucio; Gardoni, Paolo

    2015-03-01

    Origami, an ancient paper folding art, has inspired many solutions to modern engineering challenges. The demand for actual engineering applications motivates further investigation in this field. Although rooted in the historic art form, many applications of origami are based on newly designed origami patterns to match the specific requirements of an engineering problem. The application of origami to structural design problems ranges from the micro-structure of materials to large-scale deployable shells. For instance, some origami-inspired designs have unique properties such as negative Poisson's ratio and flat foldability. However, origami structures are typically constrained by strict mathematical geometric relationships, which in reality can easily be violated due to, for example, random imperfections introduced during manufacturing, or non-uniform deformations under working conditions (e.g. due to non-uniform thermal effects). Therefore, the effects of uncertainties in origami-like structures need to be studied in further detail in order to provide a practical guide for scalable origami-inspired engineering designs. Through reliability and probabilistic analysis, we investigate the effect of randomness in origami structures on their mechanical properties. Dislocations of vertices of an origami structure have different impacts on different mechanical properties, and different origami designs could have different sensitivities to imperfections. Thus we aim to provide a preliminary understanding of the structural behavior of some common scalable origami structures subject to randomness in their geometric configurations in order to help transition the technology toward practical applications of origami engineering.

  2. QUANTIFYING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN USING A PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...
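
    The two-stage Monte Carlo structure can be sketched compactly: an outer loop samples the uncertain parameters of the exposure distributions (knowledge uncertainty), an inner loop samples individuals given those parameters (variability), and the result is an uncertainty distribution over a population exposure percentile. All numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n_outer, n_inner = 200, 2000

p95_estimates = []
for _ in range(n_outer):
    # Stage 1 (uncertainty): sample the parameters of the exposure-factor
    # distribution themselves.
    mu_ln = rng.normal(-1.0, 0.2)       # uncertain log-mean of daily dose
    sd_ln = rng.uniform(0.4, 0.7)       # uncertain log-sd
    # Stage 2 (variability): sample individuals given those parameters.
    doses = rng.lognormal(mu_ln, sd_ln, n_inner)   # mg/kg-day, illustrative
    p95_estimates.append(np.percentile(doses, 95))

lo, hi = np.percentile(p95_estimates, [5, 95])
print(f"population 95th-percentile dose: 90% uncertainty interval = [{lo:.3f}, {hi:.3f}]")
```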

  3. A ligand predication tool based on modeling and reasoning with imprecise probabilistic knowledge.

    PubMed

    Liu, Weiru; Yue, Anbu; Timson, David J

    2010-04-01

    Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most of the current available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool. 2009 Elsevier Ireland Ltd. All rights reserved.

  4. Design of robust reliable control for T-S fuzzy Markovian jumping delayed neutral type neural networks with probabilistic actuator faults and leakage delays: An event-triggered communication scheme.

    PubMed

    Syed Ali, M; Vadivel, R; Saravanakumar, R

    2018-06-01

    This study examines the problem of robust reliable control for Takagi-Sugeno (T-S) fuzzy Markovian jumping delayed neural networks with probabilistic actuator faults and leakage terms under an event-triggered communication scheme. First, the randomly occurring actuator faults and their failure rates are governed by two sets of unrelated random variables; to capture the probabilistic failures of every actuator, a new type of distribution-based event-triggered fault model is proposed, which utilizes the effect of transmission delay. Second, a Takagi-Sugeno (T-S) fuzzy model is adopted for the neural networks, and the randomness of actuator failures is modeled in a Markov jump model framework. Third, to guarantee that the considered closed-loop system is exponentially mean-square stable with a prescribed reliable control performance, a Markov jump event-triggered scheme is designed in this paper, which is the main purpose of our study. Fourth, by constructing an appropriate Lyapunov-Krasovskii functional and employing the Newton-Leibniz formulation and integral inequalities, several delay-dependent criteria for the solvability of the addressed problem are derived. The obtained stability criteria are stated in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Finally, numerical examples are given to illustrate the effectiveness and reduced conservatism of the proposed results over existing ones, one of which is supported by a real-life application to the benchmark problem. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.

  6. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE PAGES

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...

    2017-01-24

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  7. Visualising probabilistic flood forecast information: expert preferences and perceptions of best practice in uncertainty communication

    NASA Astrophysics Data System (ADS)

    Pappenberger, F.; Stephens, E. M.; Thielen, J.; Salomon, P.; Demeritt, D.; van Andel, S.; Wetterhall, F.; Alfieri, L.

    2011-12-01

    The aim of this paper is to understand and to contribute to improved communication of the probabilistic flood forecasts generated by Hydrological Ensemble Prediction Systems (HEPS), with a particular focus on inter-expert communication. Different users are likely to require different kinds of information from HEPS and thus different visualizations. The perceptions of this expert group are important because they are the designers and primary users of existing HEPS. Nevertheless, they have sometimes resisted the release of uncertainty information to the general public because of doubts about whether it can be successfully communicated in ways that would be readily understood by non-experts. In this paper we explore the strengths and weaknesses of existing HEPS visualization methods and thereby formulate some wider recommendations about best practice for HEPS visualization and communication. We suggest that specific training on probabilistic forecasting would foster the use of probabilistic forecasts in a wider range of applications. The results of a case study exercise showed that there is no overarching agreement between experts on how to display probabilistic forecasts, nor on what they consider essential information that should accompany plots and diagrams. In this paper we propose a list of minimum properties that, if consistently displayed with probabilistic forecasts, would make the products more easily understandable.

  8. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion, and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
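
    As a rough illustration of the two-parameter Weibull machinery that underlies this kind of probabilistic brittle-material design, the following is a minimal sketch with hypothetical rupture strengths; it is not the CARES/Life implementation, and it omits the volume/size scaling and time-dependent SCG terms that the real code handles.

      import numpy as np

      # Hypothetical flexural rupture strengths (MPa) for a brittle ceramic.
      strengths = np.array([312, 335, 288, 360, 341, 297, 325, 351, 305, 330], float)

      # Fit a two-parameter Weibull law P_f(s) = 1 - exp(-(s/s0)^m) by linear
      # regression on ln(-ln(1 - F)) vs ln(s), using median-rank plotting positions.
      s = np.sort(strengths)
      n = s.size
      F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)      # median ranks
      x, y = np.log(s), np.log(-np.log(1.0 - F))
      m, c = np.polyfit(x, y, 1)                        # slope = Weibull modulus
      s0 = np.exp(-c / m)                               # characteristic strength

      # Failure probability of a component whose peak stress is sigma_peak,
      # ignoring size/volume scaling for brevity.
      sigma_peak = 250.0
      P_f = 1.0 - np.exp(-(sigma_peak / s0) ** m)
      print(f"m = {m:.2f}, s0 = {s0:.1f} MPa, P_f({sigma_peak} MPa) = {P_f:.3e}")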

  9. A Step Made Toward Designing Microelectromechanical System (MEMS) Structures With High Reliability

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2003-01-01

    The mechanical design of microelectromechanical systems-particularly for micropower generation applications-requires the ability to predict the strength capacity of load-carrying components over the service life of the device. These microdevices, which typically are made of brittle materials such as polysilicon, show wide scatter (stochastic behavior) in strength as well as a different average strength for different sized structures (size effect). These behaviors necessitate either costly and time-consuming trial-and-error designs or, more efficiently, the development of a probabilistic design methodology for MEMS. Over the years, the NASA Glenn Research Center's Life Prediction Branch has developed the CARES/Life probabilistic design methodology to predict the reliability of advanced ceramic components. In this study, done in collaboration with Johns Hopkins University, the ability of the CARES/Life code to predict the reliability of polysilicon microsized structures with stress concentrations is successfully demonstrated.

  10. Constellation Probabilistic Risk Assessment (PRA): Design Consideration for the Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis

    2010-01-01

    Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the moon; lunar surface activities; ascension from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components/systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.
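
    To make the phased-mission logic concrete, here is a minimal sketch of how per-phase success probabilities roll up into a loss-of-mission figure when phases are treated as independent. The phase names mirror the mission profile above, but the probability values are purely illustrative and are not from the CEV PRA.

      # Hypothetical per-phase success probabilities for a multi-phase mission.
      phases = {
          "launch": 0.995,
          "rendezvous": 0.999,
          "lunar transit": 0.998,
          "surface ops": 0.997,
          "ascent": 0.996,
          "Earth return": 0.998,
      }

      # With independent phases, loss-of-mission (LOM) probability is one minus
      # the product of the phase success probabilities.
      p_success = 1.0
      for name, p in phases.items():
          p_success *= p
      print(f"P(mission success) = {p_success:.4f}, P(LOM) = {1 - p_success:.4f}")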

  11. Towards dropout training for convolutional neural networks.

    PubMed

    Wu, Haibing; Gu, Xiaodong

    2015-11-01

    Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in convolutional and pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking an activation based on a multinomial distribution at training time. In light of this insight, we advocate employing our proposed probabilistic weighted pooling, instead of commonly used max-pooling, to act as model averaging at test time. Empirical evidence validates the superiority of probabilistic weighted pooling. We also empirically show that the effect of convolutional dropout is not trivial, despite the dramatically reduced possibility of over-fitting due to the convolutional architecture. Elaborately designing dropout training simultaneously in max-pooling and fully-connected layers, we achieve state-of-the-art performance on MNIST, and very competitive results on CIFAR-10 and CIFAR-100, relative to other approaches without data augmentation. Finally, we compare max-pooling dropout and stochastic pooling, both of which introduce stochasticity based on multinomial distributions at the pooling stage.
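
    The multinomial view of max-pooling dropout admits a compact sketch. Assuming the Wu and Gu formulation (each unit in a pooling region dropped independently with probability p, max taken over survivors), the probability that the i-th smallest activation is selected is q * p^(n-i) with q = 1 - p, and test-time probabilistic weighted pooling is the corresponding expectation. The code below is an illustrative NumPy rendering, not the authors' implementation.

      import numpy as np

      def prob_weighted_pool(region, p_drop):
          """Test-time probabilistic weighted pooling over one pooling region:
          each sorted activation is weighted by the probability that
          max-pooling dropout would have selected it at training time."""
          a = np.sort(np.asarray(region, float))        # ascending
          n = a.size
          q = 1.0 - p_drop
          i = np.arange(1, n + 1)
          w = q * p_drop ** (n - i)   # P(a_i survives and all larger are dropped)
          return float(np.dot(w, a))  # the all-dropped event contributes 0

      def maxpool_dropout_train(region, p_drop, rng):
          """Training-time max-pooling dropout: drop each unit independently,
          then take the max of the survivors (0 if all are dropped)."""
          region = np.asarray(region, float)
          keep = rng.random(region.size) > p_drop
          return float(region[keep].max()) if keep.any() else 0.0

      rng = np.random.default_rng(0)
      region = [0.2, 1.3, 0.7, 0.9]
      print(prob_weighted_pool(region, p_drop=0.5))
      samples = [maxpool_dropout_train(region, 0.5, rng) for _ in range(100000)]
      print(np.mean(samples))   # should approach the weighted-pool value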

  12. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
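
    The probabilistic derivation can be illustrated with a Monte Carlo sketch: uncertain adjustment factors are sampled as lognormals and combined into a distribution for the target human dose, from which a 90% confidence interval is read off. All distributions and numbers below are hypothetical placeholders, not the paper's datasets or its specific aggregation of uncertainty and variability.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 100_000

      # Hypothetical lognormal uncertainty distributions for the factors that
      # enter a probabilistic reference-dose calculation (illustrative only):
      bmd_animal = rng.lognormal(np.log(10.0), 0.3, N)    # animal benchmark dose, mg/kg-d
      interspecies = rng.lognormal(np.log(3.0), 0.4, N)   # animal-to-human adjustment
      intraspecies = rng.lognormal(np.log(3.0), 0.5, N)   # human variability at incidence I

      # Target human dose protecting against effect magnitude M at incidence I.
      hd_mi = bmd_animal / (interspecies * intraspecies)

      lo, med, hi = np.percentile(hd_mi, [5, 50, 95])
      print(f"HDMI 90% CI: {lo:.2f} - {hi:.2f} mg/kg-d (median {med:.2f})")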

  13. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  14. A mediation model to explain decision making under conditions of risk among adolescents: the role of fluid intelligence and probabilistic reasoning.

    PubMed

    Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina

    2014-01-01

    This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk--that is, in contexts in which information on the choice options (gains, losses, and probabilities) was explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability that acted as a mediator. Specifically, fluid intelligence may enhance ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by incrementing the mathematical prerequisites necessary to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.
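
    The indirect-effect logic of a mediation model can be sketched with the product-of-coefficients approach: regress the mediator on the predictor (path a), then the outcome on both (path b and direct effect c'). The data below are simulated, with the same sample size as the study but otherwise arbitrary effect sizes; this is a generic mediation sketch, not the authors' analysis pipeline.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 282  # same sample size as the study; data are synthetic

      # Simulate: fluid intelligence X -> probabilistic reasoning M -> decision making Y
      X = rng.normal(size=n)
      M = 0.5 * X + rng.normal(scale=0.8, size=n)
      Y = 0.4 * M + 0.1 * X + rng.normal(scale=0.8, size=n)

      def ols(design, y):
          beta, *_ = np.linalg.lstsq(design, y, rcond=None)
          return beta

      one = np.ones(n)
      a = ols(np.column_stack([one, X]), M)[1]           # X -> M path
      b = ols(np.column_stack([one, X, M]), Y)[2]        # M -> Y path, controlling X
      c_prime = ols(np.column_stack([one, X, M]), Y)[1]  # direct effect of X

      print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")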

  15. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.

  16. Building a high-resolution T2-weighted MR-based probabilistic model of tumor occurrence in the prostate.

    PubMed

    Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R

    2018-02-19

    We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and serve as a diagnostic tool to aid radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. The probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with a higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation to the prostate template. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as selection of anatomical sites for MR-guided prostate biopsies.

  17. Great Balls of Fire: A probabilistic approach to quantify the hazard related to ballistics - A case study at La Fossa volcano, Vulcano Island, Italy

    NASA Astrophysics Data System (ADS)

    Biass, Sébastien; Falcone, Jean-Luc; Bonadonna, Costanza; Di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino

    2016-10-01

    We present a probabilistic approach to quantify the hazard posed by volcanic ballistic projectiles (VBP) and their potential impact on the built environment. A model named Great Balls of Fire (GBF) is introduced to describe ballistic trajectories of VBPs accounting for a variable drag coefficient and topography. It relies on input parameters easily identifiable in the field and is designed to model large numbers of VBPs stochastically. Associated functions come with the GBF code to post-process model outputs into a comprehensive probabilistic hazard assessment for VBP impacts. Outcomes include probability maps to exceed given thresholds of kinetic energies at impact, hazard curves and probabilistic isoenergy maps. Probabilities are calculated either on equally-sized pixels or zones of interest. The approach is calibrated, validated and applied to La Fossa volcano, Vulcano Island (Italy). We constructed a generic eruption scenario based on stratigraphic studies and numerical inversions of the 1888-1890 long-lasting Vulcanian cycle of La Fossa. Results suggest a ~10^-2 % probability of occurrence of VBP impacts with kinetic energies ≥ 10^4 J at the touristic locality of Porto. In parallel, the vulnerability to roof perforation was estimated by combining field observations and published literature, allowing for a first estimate of the potential impact of VBPs during future Vulcanian eruptions. Results indicate a high physical vulnerability to the VBP hazard, with half of the building stock having a ≥ 2.5 × 10^-3 % probability of roof perforation.
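
    The core of such a model is repeated integration of drag-affected ballistic trajectories under sampled eruption parameters. The sketch below shows the idea on flat topography with a constant drag coefficient and hypothetical block properties and launch distributions; the actual GBF code additionally handles variable drag and terrain.

      import numpy as np

      rng = np.random.default_rng(3)
      g, rho_air = 9.81, 1.2           # m/s^2, kg/m^3
      mass, diam, cd = 5.0, 0.15, 0.8  # hypothetical block: kg, m, drag coefficient
      area = np.pi * (diam / 2) ** 2

      def impact_energy(v0, theta, dt=0.02):
          """Integrate a 2-D ballistic trajectory with quadratic drag until the
          projectile returns to vent altitude; return kinetic energy at impact."""
          v = np.array([v0 * np.cos(theta), v0 * np.sin(theta)])
          pos = np.zeros(2)
          while True:
              speed = np.linalg.norm(v)
              drag = -0.5 * rho_air * cd * area * speed * v / mass
              v = v + (drag + np.array([0.0, -g])) * dt
              pos = pos + v * dt
              if pos[1] < 0.0 and v[1] < 0.0:
                  return 0.5 * mass * np.dot(v, v)

      # Stochastic eruption scenario: sample ejection speed and angle.
      energies = np.array([
          impact_energy(rng.uniform(100, 200), rng.uniform(np.pi / 6, np.pi / 3))
          for _ in range(500)
      ])
      print(f"P(E_impact >= 1e4 J) = {(energies >= 1e4).mean():.3f}")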

  18. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  19. Multi-disciplinary coupling effects for integrated design of propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions which govern the accurate response of propulsion systems. Results are presented for propulsion system responses including multi-disciplinary coupling effects using coupled multi-discipline thermal, structural, and acoustic tailoring; an integrated system of multi-disciplinary simulators; coupled material behavior/fabrication process tailoring; sensitivities using a probabilistic simulator; and coupled materials, structures, fracture, and probabilistic behavior simulator. The results demonstrate that superior designs can be achieved if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated coupled multi-discipline numerical propulsion system simulator.

  20. Multi-disciplinary coupling for integrated design of propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions for determining the true response of propulsion systems. Results are presented for propulsion system responses including multi-discipline coupling effects via (1) coupled multi-discipline tailoring, (2) an integrated system of multidisciplinary simulators, (3) coupled material-behavior/fabrication-process tailoring, (4) sensitivities using a probabilistic simulator, and (5) coupled materials/structures/fracture/probabilistic behavior simulator. The results show that the best designs can be determined if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated interactive multi-discipline numerical propulsion system simulator.

  1. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    NASA Astrophysics Data System (ADS)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram, which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
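
    The contrast between a spatially constant random variable and a variogram-controlled field can be illustrated with a small unconditional Gaussian simulation. The sketch below generates correlated realizations of uniaxial compressive strength along a 1-D profile using an exponential covariance and a Cholesky factorization; it is a simplified stand-in for sequential Gaussian simulation, and the mean, standard deviation, and range are hypothetical.

      import numpy as np

      rng = np.random.default_rng(4)

      # 1-D profile of UCS along a borehole, exponential variogram model.
      x = np.linspace(0.0, 200.0, 101)                 # positions, m
      mean_ucs, sd_ucs, corr_range = 80.0, 15.0, 30.0  # MPa, MPa, practical range in m

      h = np.abs(x[:, None] - x[None, :])              # pairwise lag distances
      cov = sd_ucs**2 * np.exp(-3.0 * h / corr_range)  # exponential covariance
      L = np.linalg.cholesky(cov + 1e-8 * np.eye(x.size))

      # Each realization honours the spatial correlation structure; a spatially
      # constant random variable would instead assign one value to the whole domain.
      realizations = mean_ucs + L @ rng.standard_normal((x.size, 50))
      print("realization 0, first 5 nodes (MPa):", np.round(realizations[:5, 0], 1))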

  2. Control of Stochastic Master Equation Models of Genetic Regulatory Networks by Approximating Their Average Behavior

    NASA Astrophysics Data System (ADS)

    Umut Caglar, Mehmet; Pal, Ranadip

    2010-10-01

    The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid." However, this assumption is not exactly correct in most cases. There are many feedback loops and interactions between different levels of systems. These types of interactions are hard to analyze due to the lack of data at the cellular level and the probabilistic nature of interactions. Probabilistic models like Stochastic Master Equation (SME) or deterministic models like differential equations (DE) can be used to analyze these types of interactions. SME models based on the chemical master equation (CME) can provide a detailed representation of a genetic regulatory system, but their use is restricted by the large data requirements and computational costs of calculations. The differential equation models, on the other hand, have low calculation costs and are much better suited to generating control procedures for the system, but they are not adequate for investigating the probabilistic nature of interactions. In this work the success of the mapping between SME and DE is analyzed, and the success of a control policy generated by the DE model with respect to the SME model is examined. Index terms: Stochastic Master Equation models, Differential Equation Models, Control Policy Design, Systems biology
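
    The SME-versus-DE comparison can be made concrete with the simplest gene-expression motif, a birth-death process: the stochastic model is sampled with Gillespie's algorithm and its average is compared against the deterministic rate equation. This is a generic textbook sketch, not the networks analyzed in the paper.

      import numpy as np

      rng = np.random.default_rng(5)
      k_prod, k_deg, x0, t_end = 10.0, 0.1, 0, 50.0  # production/degradation rates

      def gillespie(x, t):
          """One SSA trajectory of: 0 -> X at rate k_prod, X -> 0 at rate k_deg * X."""
          while t < t_end:
              rates = np.array([k_prod, k_deg * x])
              total = rates.sum()
              t += rng.exponential(1.0 / total)
              if rng.random() < rates[0] / total:
                  x += 1
              else:
                  x -= 1
          return x

      # Average many stochastic runs and compare with the deterministic ODE
      # dx/dt = k_prod - k_deg * x, whose steady state is k_prod / k_deg.
      finals = [gillespie(x0, 0.0) for _ in range(500)]
      print(f"SSA mean at t={t_end}: {np.mean(finals):.1f}, "
            f"ODE steady state: {k_prod / k_deg:.1f}")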

  3. Aeromechanics and Vehicle Configuration Demonstrations. Volume 3: A Hybrid Probabilistic Method to Estimate Design Margin

    DTIC Science & Technology

    2014-02-01


  4. Probabilistic Reverse dOsimetry Estimating Exposure Distribution (PROcEED)

    EPA Pesticide Factsheets

    PROcEED is a web-based application used to conduct probabilistic reverse dosimetry calculations. The tool is used for estimating a distribution of exposure concentrations likely to have produced biomarker concentrations measured in a population.
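
    The underlying idea can be sketched in a few lines: invert a forward exposure-to-biomarker model over sampled pharmacokinetic uncertainty to recover an exposure distribution. The linear forward model and all numbers below are hypothetical illustrations, not PROcEED's actual models or data.

      import numpy as np

      rng = np.random.default_rng(12)

      # Minimal reverse-dosimetry sketch: assume a linear forward model,
      # biomarker = k * exposure, with uncertain conversion factor k.
      k = rng.lognormal(np.log(0.05), 0.3, 100_000)          # (ug/L) per (ug/day), hypothetical
      biomarkers = rng.lognormal(np.log(2.0), 0.5, 100_000)  # measured population levels, ug/L

      exposure = biomarkers / k                              # inferred intake, ug/day
      print("exposure percentiles (ug/day):",
            np.round(np.percentile(exposure, [5, 50, 95]), 1))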

  5. Quantification and Segmentation of Brain Tissues from MR Images: A Probabilistic Neural Network Approach

    PubMed Central

    Wang, Yue; Adalı, Tülay; Kung, Sun-Yuan; Szabo, Zsolt

    2007-01-01

    This paper presents a probabilistic neural network based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms the conventional classification based approaches. PMID:18172510
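
    A probabilistic neural network is essentially a Parzen-window density estimator per class, with classification by the largest estimated density. The sketch below shows that core idea on synthetic two-class data with overlapping distributions; it is a generic PNN illustration, not the paper's self-organizing mixture or relaxation-labeling machinery.

      import numpy as np

      def pnn_classify(x, train_X, train_y, sigma=0.5):
          """Probabilistic neural network: Parzen-window class densities with a
          Gaussian kernel; pick the class with the largest estimated density."""
          scores = {}
          for label in np.unique(train_y):
              pts = train_X[train_y == label]
              d2 = np.sum((pts - x) ** 2, axis=1)
              scores[label] = np.mean(np.exp(-d2 / (2.0 * sigma**2)))
          return max(scores, key=scores.get)

      rng = np.random.default_rng(6)
      # Two synthetic, overlapping "tissue" intensity distributions.
      X0 = rng.normal([1.0, 1.0], 0.6, (100, 2))
      X1 = rng.normal([2.2, 2.2], 0.6, (100, 2))
      train_X = np.vstack([X0, X1])
      train_y = np.array([0] * 100 + [1] * 100)

      print(pnn_classify(np.array([1.1, 0.9]), train_X, train_y))  # expect 0
      print(pnn_classify(np.array([2.3, 2.4]), train_X, train_y))  # expect 1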

  6. APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels

    NASA Astrophysics Data System (ADS)

    Klüser, L.; Killius, N.; Gesell, G.

    2015-10-01

    The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still build the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. It builds upon the physical principles that have served well in the original APOLLO scheme. Nevertheless, a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is no longer performed as a binary yes/no decision based on these physical principles. It is rather expressed as cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned from being sure to reliably identify clear pixels to conditions of reliably identifying definitely cloudy pixels, depending on the purpose. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-realtime use and for application to large amounts of historical satellite data. The radiative transfer solution is approximated by the same two-stream approach which also had been used for the original APOLLO. This allows the algorithm to be applied to a wide range of sensors without the necessity of sensor-specific tuning. Moreover it allows for online calculation of the radiative transfer (i.e., within the retrieval algorithm) giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy it is based on. Furthermore a couple of example results from NOAA-18 are presented.

  7. APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels

    NASA Astrophysics Data System (ADS)

    Klüser, L.; Killius, N.; Gesell, G.

    2015-04-01

    The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still build the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. While building upon the physical principles that have served well in the original APOLLO, a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is not performed as a binary yes/no decision based on these physical principles but is expressed as cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned from confidently identifying clear pixels to confidently identifying cloudy pixels, depending on the purpose. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-realtime use and for application to large amounts of historical satellite data. Thus the radiative transfer solution is approximated by the same two-stream approach that had also been used for the original APOLLO. This allows the algorithm to be robust enough to be applied to a wide range of sensors without the necessity of sensor-specific tuning. Moreover it allows for online calculation of the radiative transfer (i.e., within the retrieval algorithm) giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy it is based on. Furthermore a couple of example results from NOAA-18 are presented.

  8. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
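
    The effect of migration uncertainty on prediction-interval width can be illustrated with a toy stochastic cohort-free projection: sample annual natural-growth and net-migration rates, compound them, and compare interval widths with migration stochastic versus fixed. All rates and spreads below are hypothetical, and the sketch ignores the age structure and Bayesian hierarchical models used in the actual projections.

      import numpy as np

      rng = np.random.default_rng(7)
      N, years = 10_000, 35                 # trajectories, horizon (illustrative)
      pop0 = 10.0                           # initial population, millions

      # Hypothetical annual per-capita rates; migration gets the widest
      # uncertainty, echoing the paper's motivation.
      growth = rng.normal(0.010, 0.002, (N, years))      # births minus deaths
      migration = rng.normal(0.002, 0.004, (N, years))   # net migration rate

      pop = pop0 * np.prod(1.0 + growth + migration, axis=1)
      lo, med, hi = np.percentile(pop, [5, 50, 95])
      print(f"population in {years} years: median {med:.1f}M, 90% PI [{lo:.1f}, {hi:.1f}]M")

      # Width of the interval with migration fixed at its mean, for comparison.
      pop_det = pop0 * np.prod(1.0 + growth + 0.002, axis=1)
      print(f"PI width with deterministic migration: "
            f"{np.subtract(*np.percentile(pop_det, [95, 5])):.1f}M vs "
            f"{hi - lo:.1f}M with stochastic migration")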

  9. PROTOCOL TO EVALUATE THE MOISTURE DURABILITY OF ENERGY-EFFICIENT WALLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, Philip R; Pallin, Simon B; Hun, Diana E

    Walls account for about 8% of the energy used in residential buildings. This energy penalty can be reduced with higher insulation levels and increased airtightness. However, these measures can compromise the moisture durability and long-term performance of wall assemblies because they can lead to lower moisture tolerance due to reduced drying potential. To avert these problems, a moisture durability protocol was developed to evaluate the probability that an energy-efficient wall design will experience mold growth. This protocol examines the effects of moisture sources in walls through a combination of simulations and lab experiments, uses the mold growth index as the moisture durability indicator, and is based on a probabilistic approach that utilizes stochastically varying input parameters. The simulation tools used include a new validated method for taking into account the effects of air leakage in wall assemblies. This paper provides an overview of the developed protocol, discusses the probabilistic simulation approach, and describes results from the evaluation of two wall assemblies in Climate Zones 2, 4, and 6. The protocol will be used to supply builders with wall designs that are energy efficient, moisture durable and cost-effective.

  10. Reliability-based design optimization of reinforced concrete structures including soil-structure interaction using a discrete gravitational search algorithm and a proposed metamodel

    NASA Astrophysics Data System (ADS)

    Khatibinia, M.; Salajegheh, E.; Salajegheh, J.; Fadaee, M. J.

    2013-10-01

    A new discrete gravitational search algorithm (DGSA) and a metamodelling framework are introduced for reliability-based design optimization (RBDO) of reinforced concrete structures. The RBDO of structures with soil-structure interaction (SSI) effects is investigated in accordance with performance-based design. The proposed DGSA is based on the standard gravitational search algorithm (GSA) to optimize the structural cost under deterministic and probabilistic constraints. The Monte-Carlo simulation (MCS) method is considered the most reliable method for estimating reliability probabilities. In order to reduce the computational time of MCS, the proposed metamodelling framework is employed to predict the responses of the SSI system in the RBDO procedure. The metamodel consists of a weighted least squares support vector machine (WLS-SVM) and a wavelet kernel function, which is called WWLS-SVM. Numerical results demonstrate the efficiency and computational advantages of DGSA and the proposed metamodel for RBDO of reinforced concrete structures.

  11. Against all odds -- Probabilistic forecasts and decision making

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Zappa, Massimiliano

    2015-04-01

    In the city of Zurich (Switzerland), the damage potential due to flooding of the river Sihl is estimated at about 5 billion US dollars. The flood forecasting system used by the administration for decision making has run continuously since 2007. It has a time horizon of at most five days and operates at hourly time steps. The flood forecasting system includes three different model chains. Two of those are run by the deterministic NWP models COSMO-2 and COSMO-7 and one is driven by the probabilistic NWP COSMO-Leps. The model chains have been consistent since February 2010, so five full years are available for the evaluation of the system. The system was evaluated continuously and is a very nice example of the added value that lies in probabilistic forecasts. The forecasts are available to the decision makers on an online platform. Several graphical representations of the forecasts and forecast history are available to support decision making and to rate the current situation. The communication between forecasters and decision-makers is quite close. In short, an ideal situation. However, an event, or better put a non-event, in summer 2014 showed that knowledge of the general superiority of probabilistic forecasts doesn't necessarily mean that the decisions taken in a specific situation will be based on that probabilistic forecast. Some years of experience allow gaining confidence in the system, both for the forecasters and for the decision-makers. Even if, from the theoretical point of view, the handling during crisis situations is well designed, a first event demonstrated that the dialog with the decision-makers still lacks exercise during such situations. We argue that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also fit, but, in our case, we are very happy not to report about this option.

  12. Efficient Sensitivity Methods for Probabilistic Lifing and Engine Prognostics

    DTIC Science & Technology

    2010-09-01


  13. Probabilistic Risk Assessment of a Turbine Disk

    NASA Astrophysics Data System (ADS)

    Carter, Jace A.; Thomas, Michael; Goswami, Tarun; Fecke, Ted

    Current Federal Aviation Administration (FAA) rotor design certification practices perform risk assessment using a probabilistic framework focused on only the life-limiting defect location of a component. This method generates conservative approximations of the operational risk. The first section of this article covers a discretization method, which allows for a transition from this relative risk to an absolute risk where the component is discretized into regions called zones. General guidelines were established for the zone-refinement process based on the stress gradient topology in order to reach risk convergence. The second section covers a risk assessment method for predicting the total fatigue life due to fatigue-induced damage. The total fatigue life incorporates a dual-mechanism approach including the crack initiation life and propagation life while simultaneously determining the associated initial flaw sizes. A microstructure-based model was employed to address uncertainties in material response and relate crack initiation life with crack size, while propagation life was characterized by large-crack growth laws. The two proposed methods were applied to a representative Inconel 718 turbine disk. The zone-based method reduces the conservatism of the life-limiting-location approach, while showing the effects of feature-based inspection on the risk assessment. In the fatigue damage assessment, the predicted initial crack distribution was found to be the most sensitive probabilistic parameter and can be used to establish enhanced inspection planning.
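
    The zone-based aggregation step is simple to state in code: with the component discretized into zones, each with its own failure probability, the component risk is the complement of every zone surviving. The per-zone values below are illustrative placeholders, not results for the Inconel 718 disk.

      import numpy as np

      # Hypothetical per-zone fracture probabilities for a discretized disk;
      # per the paper's guidelines, zones are refined near steep stress
      # gradients until the aggregated risk converges.
      zone_pof = np.array([1e-9, 4e-9, 2e-8, 9e-8, 3e-7])  # per-flight, illustrative

      # Treating zone failures as independent events, component risk is the
      # complement of every zone surviving (close to the sum for small values).
      disk_pof = 1.0 - np.prod(1.0 - zone_pof)
      print(f"disk P(fracture) = {disk_pof:.3e}  (sum approx. {zone_pof.sum():.3e})")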

  14. Morphological and wavelet features towards sonographic thyroid nodules evaluation.

    PubMed

    Tsantis, Stavros; Dimitropoulos, Nikos; Cavouras, Dionisis; Nikiforidis, George

    2009-03-01

    This paper presents a computer-based classification scheme that utilizes various morphological and novel wavelet-based features for malignancy risk evaluation of thyroid nodules in ultrasonography. The study comprised ultrasound images from 85 patients that were cytologically confirmed (54 low-risk and 31 high-risk). A set of 20 features (12 based on the nodule's boundary shape and 8 based on wavelet local maxima located within each nodule) was generated. Two powerful pattern recognition algorithms (support vector machines and probabilistic neural networks) were designed and developed in order to quantify the discriminatory power of the introduced features. A comparative study was also performed to estimate the impact speckle had on the classification procedure. The diagnostic sensitivity and specificity of both classifiers were assessed by means of receiver operating characteristic (ROC) analysis. In the speckle-free feature set, the area under the ROC curve was 0.96 for the support vector machines classifier, whereas for the probabilistic neural network it was 0.91. In the feature set with speckle, the corresponding areas under the ROC curves were 0.88 and 0.86, respectively. The proposed features can increase the classification accuracy and decrease the rate of missed diagnoses and misdiagnoses in thyroid cancer control.

  15. Failed rib region prediction in a human body model during crash events with precrash braking.

    PubMed

    Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S

    2018-02-28

    The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case where they were equal. The failed region patterns between models are similar; however, there are differences that arise due to stress reduced from element elimination that cause probabilistic failed regions to continue to rise after no deterministic failed region would be predicted. Both the probabilistic and deterministic methods indicate similar trends with regards to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.

  16. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    PubMed

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided with this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p < 0.001; Wilcoxon signed-rank test). The false-positive error of the last obtained depiction was also significantly lower in probabilistic than in deterministic tracking (p < 0.001). The HCP data yielded significantly better results in terms of the Dice coefficient in probabilistic tracking (p < 0.001, Mann-Whitney U-test) and in deterministic tracking (p = 0.02). The false-positive errors were smaller in HCP data in deterministic tracking (p < 0.001) and showed a strong trend toward significance in probabilistic tracking (p = 0.06). In the clinical cases, the probabilistic method visualized 7 of 10 attempted CNs accurately, compared with 3 correct depictions with deterministic tracking. CONCLUSIONS High angular resolution DTI scans are preferable for the DTI-based depiction of the cranial nerves. Probabilistic tracking with a gradual PICo threshold increase is more effective for this task than the previously described deterministic tracking with a gradual FA threshold increase and might represent a method that is useful for depicting cranial nerves with DTI since it eliminates the erroneous fibers without manual intervention.
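
    The two evaluation metrics used above, the Dice similarity coefficient and the false-positive error, are straightforward to compute on binary masks. The following is a generic sketch on synthetic 3-D volumes, not the study's tractography pipeline.

      import numpy as np

      def dice(seg, truth):
          """Dice similarity coefficient between two binary masks."""
          inter = np.logical_and(seg, truth).sum()
          return 2.0 * inter / (seg.sum() + truth.sum())

      def false_positive_error(seg, truth):
          """Fraction of depicted voxels lying outside the ground truth."""
          return np.logical_and(seg, ~truth).sum() / max(seg.sum(), 1)

      rng = np.random.default_rng(10)
      truth = np.zeros((20, 20, 20), bool)
      truth[8:12, 8:12, :] = True               # synthetic "nerve" ground truth
      seg = truth.copy()
      seg[rng.random(truth.shape) < 0.01] = True  # sprinkle spurious fibers
      print(f"Dice = {dice(seg, truth):.3f}, "
            f"FP error = {false_positive_error(seg, truth):.3f}")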

  17. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
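
    The first-order Markov model of colony-site dynamics is easy to simulate and, in the perfect-detection case, its extinction and colonization parameters are estimated by simple transition frequencies. The sketch below illustrates that case with arbitrary parameter values; the robust-design estimators in the paper additionally handle detection probabilities below one.

      import numpy as np

      rng = np.random.default_rng(8)
      eps, gamma = 0.2, 0.3        # local extinction / colonization probabilities
      n_sites, n_years = 60, 15

      # Simulate presence-absence of colonies under a first-order Markov process.
      occ = np.zeros((n_sites, n_years), dtype=int)
      occ[:, 0] = rng.random(n_sites) < 0.5
      for t in range(1, n_years):
          stay = (occ[:, t - 1] == 1) & (rng.random(n_sites) > eps)
          colonize = (occ[:, t - 1] == 0) & (rng.random(n_sites) < gamma)
          occ[:, t] = (stay | colonize).astype(int)

      # With perfect detection, the MLEs are simple transition frequencies.
      prev, curr = occ[:, :-1].ravel(), occ[:, 1:].ravel()
      eps_hat = np.mean(curr[prev == 1] == 0)
      gamma_hat = np.mean(curr[prev == 0] == 1)
      print(f"extinction: true {eps}, est {eps_hat:.3f}; "
            f"colonization: true {gamma}, est {gamma_hat:.3f}")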

  18. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

    Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g., flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.

  19. Weighing costs and losses: A decision making game using probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan

    2017-04-01

    Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit to using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic works, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the event forecast becomes reality. In this paper, we present the results of a simple game that aims to explore how decision makers are influenced by the costs required for taking a response and the potential losses they face in case the forecast flood event occurs. Participants play the role of one of three possible different shop owners. Each type of shop has losses of quite different magnitude, should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring, which would inundate their shop and lead to those losses. In response, they have to decide if they want to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost; and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating at two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting and water-related risks, and one of the audiences comprised a group of experts in probabilistic forecasting. Results show that the different shop owners do take the costs of taking action and the potential losses into account in their decisions. Shop owners with a low cost/loss ratio were found to be more inclined to take actions based on the forecasts, though the absolute value of the losses also increased the willingness to take action. Little differentiation was found between the different groups of players.
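
    The decision rule at the heart of such games is the classic cost-loss model: a rational player acts when the forecast probability of the event exceeds the ratio of the cost of acting to the loss avoided. The sketch below applies this rule to three hypothetical shop types with illustrative costs and losses, not the game's actual payoff tables.

      # Classic cost-loss reasoning behind the game: act when the forecast
      # probability of flooding exceeds the shop's cost/loss ratio.
      shops = {                     # illustrative action costs and flood losses
          "florist":  {"cost": 200.0,  "loss": 2_000.0},
          "bookshop": {"cost": 500.0,  "loss": 20_000.0},
          "jeweller": {"cost": 1000.0, "loss": 200_000.0},
      }

      p_flood = 0.15                # probability from the ensemble forecast
      for name, s in shops.items():
          ratio = s["cost"] / s["loss"]
          expected_idle = p_flood * s["loss"]
          act = p_flood > ratio     # expected loss avoided exceeds cost of acting
          print(f"{name}: cost/loss = {ratio:.3f} -> "
                f"{'take action' if act else 'do nothing'} "
                f"(expected loss if idle: {expected_idle:.0f})")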

  20. Design for a Crane Metallic Structure Based on Imperialist Competitive Algorithm and Inverse Reliability Strategy

    NASA Astrophysics Data System (ADS)

    Fan, Xiao-Ning; Zhi, Bo

    2017-07-01

    Uncertainties in parameters such as materials, loading, and geometry are inevitable in designing metallic structures for cranes. When considering these uncertainty factors, reliability-based design optimization (RBDO) offers a more reasonable design approach. However, existing RBDO methods for crane metallic structures are prone to low convergence speed and high computational cost. A unilevel RBDO method, combining a discrete imperialist competitive algorithm with an inverse reliability strategy based on the performance measure approach, is developed. Application of the imperialist competitive algorithm at the optimization level significantly improves the convergence speed of this RBDO method. At the reliability analysis level, the inverse reliability strategy is used to determine the feasibility of each probabilistic constraint at each design point by calculating its α-percentile performance, thereby avoiding convergence failure, calculation error, and disproportionate computational effort encountered using conventional moment and simulation methods. Application of the RBDO method to an actual crane structure shows that the developed RBDO realizes a design with the best tradeoff between economy and safety together with about one-third of the convergence speed and the computational cost of the existing method. This paper provides a scientific and effective design approach for the design of metallic structures of cranes.
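
    The feasibility check in the performance measure approach asks whether the α-percentile of the limit-state function, at the percentile implied by the target reliability index, remains non-negative. The sketch below evaluates that check by brute-force Monte Carlo on a toy strength-minus-load limit state; the paper's method instead uses an inverse-FORM search, and the limit state and distributions here are hypothetical.

      import math
      import numpy as np

      rng = np.random.default_rng(11)
      beta_target = 3.0                                       # target reliability index
      alpha = 0.5 * (1.0 + math.erf(-beta_target / math.sqrt(2.0)))  # Phi(-beta)

      # Toy limit state for one probabilistic constraint: allowable stress minus
      # acting stress in a girder section (numbers illustrative).
      def g(strength, load):
          return strength - 1.8 * load

      strength = rng.normal(300.0, 20.0, 1_000_000)
      load = rng.normal(100.0, 10.0, 1_000_000)

      # Performance measure approach: the design point is feasible if the
      # alpha-percentile performance is still non-negative at the target reliability.
      g_alpha = np.percentile(g(strength, load), 100.0 * alpha)
      print(f"alpha = {alpha:.2e}, g_alpha = {g_alpha:.1f} "
            f"({'feasible' if g_alpha >= 0 else 'infeasible'})")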

  1. Probabilistic Design of a Plate-Like Wing to Meet Flutter and Strength Requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Krishnamurthy, T.; Mason, Brian H.; Smith, Steven A.; Naser, Ahmad S.

    2002-01-01

    An approach is presented for carrying out reliability-based design of a metallic, plate-like wing to meet strength and flutter requirements that are given in terms of risk/reliability. The design problem is to determine the thickness distribution such that wing weight is a minimum and the probability of failure is less than a specified value. Failure is assumed to occur if either the flutter speed is less than a specified allowable or the stress caused by a pressure loading is greater than a specified allowable. Four uncertain quantities are considered: wing thickness, calculated flutter speed, allowable stress, and magnitude of a uniform pressure load. The reliability-based design optimization approach described herein starts with a design obtained using conventional deterministic design optimization with margins on the allowables. Reliability is calculated using Monte Carlo simulation with response surfaces that provide values of stresses and flutter speed. During the reliability-based design optimization, the response surfaces and move limits are coordinated to ensure accuracy of the response surfaces. Studies carried out in the paper show the relationship between reliability and weight and indicate that, for the design problem considered, increases in reliability can be obtained with modest increases in weight.
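
    The reliability-evaluation step described above, Monte Carlo simulation over response surfaces, can be sketched compactly: replace the expensive stress analysis with a fitted surrogate, sample the uncertain quantities, and count limit-state violations. The quadratic surface and all distributions below are hypothetical stand-ins for the paper's fitted surfaces and four uncertain quantities.

      import numpy as np

      rng = np.random.default_rng(9)
      N = 200_000

      # Hypothetical response surface from deterministic analyses:
      # stress (MPa) as a quadratic in skin thickness t (mm) and pressure p (kPa).
      def stress_surface(t, p):
          return 400.0 - 60.0 * t + 3.0 * t**2 + 1.5 * p

      # Uncertain quantities (illustrative distributions).
      t = rng.normal(4.0, 0.1, N)             # manufactured thickness
      p = rng.normal(60.0, 6.0, N)            # pressure load
      allowable = rng.normal(320.0, 15.0, N)  # allowable stress

      # Failure if the surrogate stress exceeds the sampled allowable.
      p_fail = np.mean(stress_surface(t, p) > allowable)
      print(f"P(stress failure) = {p_fail:.2e}")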

  2. TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation in the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead, robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques, including the use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined, as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting the advantages of robust plans in terms of target and normal tissue coverage. Limitations of robust planning as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust planning as a clinical alternative to margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.
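
    A minimal sketch of a coverage probability evaluation of the kind described above: a one-dimensional target inside a flat high-dose plateau, with a random systematic setup shift sampled per treatment course. All geometry and uncertainty values are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # 1-D toy geometry: a high-dose plateau of half-width `field` centred on
    # the planned target position; the CTV of half-width `ctv` is displaced
    # by a random systematic setup error each treatment course.
    ctv, field = 2.0, 2.5      # cm (hypothetical); effective margin = 0.5 cm
    sigma_setup = 0.3          # cm, systematic setup error SD (hypothetical)

    shifts = rng.normal(0.0, sigma_setup, 100_000)
    covered = np.abs(shifts) <= (field - ctv)   # CTV fully inside the plateau
    print(f"coverage probability = {covered.mean():.3f}")
    # With these numbers CP is ~0.90, i.e. this margin fails a >95% CP goal.
    ```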

  3. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    The development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. The PDA will give engineers a quantitative assessment of calculated reliability during the design process, help them choose better and more robust designs, and help identify the critical tests needed to demonstrate key reliability issues and thereby improve confidence in engine capabilities. Rocketdyne's involvement with the Composite Load Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and both are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.

  4. Reliability Based Geometric Design of Horizontal Circular Curves

    NASA Astrophysics Data System (ADS)

    Rajbongshi, Pabitra; Kalita, Kuldeep

    2018-06-01

    Geometric design of a horizontal circular curve primarily involves the radius of the curve and the stopping sight distance at the curve section. The minimum radius is decided by the lateral thrust exerted on vehicles, and the minimum stopping sight distance is provided to maintain safety in the longitudinal direction of travel. The sight distance available at a site can be regulated by changing the radius and the middle ordinate at the curve section. Both radius and sight distance depend on design speed. Vehicle speed at any road section is a variable parameter, and therefore the 98th percentile speed is normally taken as the design speed. This work presents a probabilistic approach for evaluating stopping sight distance that considers the variability of all its input parameters. It is observed that the 98th percentile sight distance value is much lower than the sight distance corresponding to the 98th percentile speed. The distribution of the sight distance parameter is also studied and found to follow a lognormal distribution. Finally, reliability-based design charts accounting for the effect of lateral thrust are presented for both plain and hill regions.
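
    The contrast drawn above between the 98th percentile of the sight distance distribution and the sight distance computed at the 98th percentile speed can be illustrated with a small Monte Carlo sketch built on the standard stopping-sight-distance model SSD = 0.278Vt + V^2/(254f) (V in km/h, reaction time t in s, longitudinal friction f). The input distributions and design values below are assumed for illustration, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200_000

    # Sampled inputs (assumed distributions):
    V = rng.normal(80.0, 8.0, n)               # operating speed, km/h
    t = rng.lognormal(np.log(2.0), 0.2, n)     # perception-reaction time, s
    f = rng.normal(0.35, 0.03, n)              # longitudinal friction coefficient

    ssd = 0.278 * V * t + V**2 / (254.0 * f)   # stopping sight distance, m

    # Deterministic design value: 98th percentile speed with fixed t and f.
    v98 = np.percentile(V, 98)
    ssd_det = 0.278 * v98 * 2.5 + v98**2 / (254.0 * 0.35)

    print(f"98th percentile of simulated SSD : {np.percentile(ssd, 98):6.1f} m")
    print(f"SSD at 98th percentile speed     : {ssd_det:6.1f} m")
    ```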

  5. Probabilistic commodity-flow-based focusing of monitoring activities to facilitate early detection of Phytophthora ramorum outbreaks

    Treesearch

    Steven C. McKelvey; William D. Smith; Frank Koch

    2012-01-01

    This project summary describes a probabilistic model developed with funding support from the Forest Health Monitoring Program of the Forest Service, U.S. Department of Agriculture (BaseEM Project SO-R-08-01). The model has been implemented in SODBuster, a standalone software package developed using the Java software development kit from Sun Microsystems.

  6. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. To reduce the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors, and the cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling of the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated from all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
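
    A compact sketch of the error-scenario generation step described above (GMM fit, analytic mixture CDF, inverse transform sampling via Monte Carlo). The error data here are synthetic, the component count is arbitrary, and the swinging door ramp extraction is not shown.

    ```python
    import numpy as np
    from scipy.stats import norm
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)

    # Synthetic stand-in for historical wind power forecasting errors.
    errors = np.concatenate([rng.normal(-0.05, 0.02, 3_000),
                             rng.normal(0.02, 0.05, 7_000)])

    gmm = GaussianMixture(n_components=3, random_state=0).fit(errors.reshape(-1, 1))
    w = gmm.weights_
    mu = gmm.means_.ravel()
    sd = np.sqrt(gmm.covariances_.ravel())

    # Mixture CDF on a grid, then inverse transform sampling by interpolation.
    x = np.linspace(errors.min() - 0.1, errors.max() + 0.1, 2001)
    cdf = sum(wi * norm.cdf(x, mi, si) for wi, mi, si in zip(w, mu, sd))

    u = rng.uniform(size=50_000)
    scenarios = np.interp(u, cdf, x)   # forecasting error scenarios
    print(f"scenario mean = {scenarios.mean():+.4f}, std = {scenarios.std():.4f}")
    ```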

  7. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. To reduce the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors, and the cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling of the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated from all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  8. Design Of An Intelligent Robotic System Organizer Via Expert System Techniques

    NASA Astrophysics Data System (ADS)

    Yuan, Peter H.; Valavanis, Kimon P.

    1989-02-01

    Intelligent Robotic Systems are a special type of Intelligent Machines. When modeled based on the theory of Intelligent Controls, they are composed of three interactive levels, namely organization, coordination, and execution, ordered according to the Principle of Increasing Intelligence with Decreasing Precision. Expert System techniques are used to design an Intelligent Robotic System Organizer with a dynamic Knowledge Base and an interactive Inference Engine. Task plans are formulated using either or both of a Probabilistic Approach and a Forward Chaining Methodology, depending on pertinent information associated with a specific requested job. The Intelligent Robotic System Organizer is implemented and tested on a prototype system operating in an uncertain environment. An evaluation of the performance of the prototype system is conducted based upon the probability of generating a successful task sequence versus the number of trials taken by the organizer.

  9. PubMed related articles: a probabilistic topic-based model for content similarity

    PubMed Central

    Lin, Jimmy; Wilbur, W John

    2007-01-01

    Background We present a probabilistic topic-based model for content similarity called pmra that underlies the related article search feature in PubMed. Whether or not a document is about a particular topic is computed from term frequencies, modeled as Poisson distributions. Unlike previous probabilistic retrieval models, we do not attempt to estimate relevance; rather, our focus is "relatedness", the probability that a user would want to examine a particular document given known interest in another. We also describe a novel technique for estimating parameters that does not require human relevance judgments; instead, the process is based on the existence of MeSH® terms in MEDLINE®. Results The pmra retrieval model was compared against bm25, a competitive probabilistic model that shares theoretical similarities. Experiments using the test collection from the TREC 2005 genomics track show a small but statistically significant improvement of pmra over bm25 in terms of precision. Conclusion Our experiments suggest that the pmra model provides an effective ranking algorithm for related article search. PMID:17971238

  10. Simulation optimization of PSA-threshold based prostate cancer screening policies

    PubMed Central

    Zhang, Jingyu; Denton, Brian T.; Shah, Nilay D.; Inman, Brant A.

    2013-01-01

    We describe a simulation optimization method to design PSA screening policies based on expected quality-adjusted life years (QALYs). Our method integrates a simulation model into a genetic algorithm that uses a probabilistic method for selecting the best policy. We present computational results on the efficiency of our algorithm. The best policy generated by our algorithm is compared to previously recommended screening policies. Using the policies determined by our model, we present evidence that patients should be screened more aggressively but for a shorter length of time than previously published guidelines recommend. PMID:22302420

  11. Lossed in translation: an off-the-shelf method to recover probabilistic beliefs from loss-averse agents.

    PubMed

    Offerman, Theo; Palley, Asa B

    2016-01-01

    Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of [Formula: see text], and (ii) for moderate beliefs agents simply report [Formula: see text]. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
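
    For context, a quadratic (Brier-type) scoring rule is strictly proper: a risk-neutral agent maximizes its expected score by reporting its true belief. The sketch below verifies this and shows how a crude loss-averse valuation pulls the optimal report toward one half, in the spirit of the behavioral findings above. The loss-averse model here is an invented stand-in, not the paper's prospect-theory model or its corrected assessment mechanism.

    ```python
    import numpy as np

    # Quadratic scoring rule for a binary event: score(r, y) = 1 - (y - r)**2.
    def expected_score(report, belief):
        return belief * (1 - (1 - report) ** 2) + (1 - belief) * (1 - report ** 2)

    belief = 0.7
    reports = np.linspace(0.0, 1.0, 101)
    best = reports[np.argmax(expected_score(reports, belief))]
    print(f"risk-neutral optimal report = {best:.2f}")       # -> 0.70 (truthful)

    # Crude loss aversion: shortfalls below a reference score ref are weighted
    # k times more heavily than gains (placeholder model, k and ref assumed).
    def loss_averse_utility(report, belief, k=2.0, ref=0.75):
        s1, s0 = 1 - (1 - report) ** 2, 1 - report ** 2
        value = lambda s: (s - ref) if s >= ref else k * (s - ref)
        return belief * value(s1) + (1 - belief) * value(s0)

    best_la = reports[np.argmax([loss_averse_utility(r, belief) for r in reports])]
    print(f"loss-averse optimal report  = {best_la:.2f}")    # pulled toward 0.5
    ```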

  12. Quantification of uncertainties in the performance of smart composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1993-01-01

    A composite wing with spars, bulkheads, and built-in control devices is evaluated using a method for the probabilistic assessment of smart composite structures. Structural responses (such as change in angle of attack, vertical displacements, and stresses in regular plies with traditional materials and in control plies with mixed traditional and actuation materials) are probabilistically assessed to quantify their respective scatter. Probabilistic sensitivity factors are computed to identify those parameters that have a significant influence on a specific structural response. Results show that the uncertainties in the responses of smart composite structures can be quantified. Responses such as structural deformation, ply stresses, frequencies, and buckling loads in the presence of defects can be reliably controlled to satisfy specified design requirements.

  13. An extended continuous estimation of distribution algorithm for solving the permutation flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Shao, Zhongshi; Pi, Dechang; Shao, Weishi

    2017-11-01

    This article proposes an extended continuous estimation of distribution algorithm (ECEDA) to solve the permutation flow-shop scheduling problem (PFSP). In ECEDA, to make a continuous estimation of distribution algorithm (EDA) suitable for the PFSP, the largest order value rule is applied to convert continuous vectors to discrete job permutations. A probabilistic model based on a mixed Gaussian and Cauchy distribution is built to maintain the exploration ability of the EDA. Two effective local search methods, i.e. revolver-based variable neighbourhood search and Hénon chaotic-based local search, are designed and incorporated into the EDA to enhance the local exploitation. The parameters of the proposed ECEDA are calibrated by means of a design of experiments approach. Simulation results and comparisons based on some benchmark instances show the efficiency of the proposed algorithm for solving the PFSP.
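
    A sketch of the largest order value (LOV) decoding step described above, together with sampling from a mixed Gaussian/Cauchy model. The per-dimension mixing scheme and its probability are assumed for illustration and are not necessarily the article's exact formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Largest order value (LOV) rule: a continuous vector is decoded into a
    # job permutation by ranking its components in descending order.
    def lov_decode(x):
        return list(np.argsort(-np.asarray(x)))    # job of largest value first

    print(lov_decode([0.8, 1.9, -0.3, 1.1, 0.4]))  # -> [1, 3, 0, 4, 2]

    # Mixed Gaussian/Cauchy sampling to preserve exploration: per dimension,
    # draw from a Cauchy tail with probability p_cauchy, else from a Gaussian.
    def sample_individual(mu, sigma, p_cauchy=0.2):
        gauss = rng.normal(mu, sigma)
        cauchy = mu + sigma * rng.standard_cauchy(len(mu))
        use_cauchy = rng.uniform(size=len(mu)) < p_cauchy
        return np.where(use_cauchy, cauchy, gauss)

    mu, sigma = np.zeros(5), np.ones(5)
    print(lov_decode(sample_individual(mu, sigma)))
    ```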

  14. Probabilistic Risk Assessment to Inform Decision Making: Frequently Asked Questions

    EPA Pesticide Factsheets

    This factsheet presents the general concepts and principles of Probabilistic Risk Assessment (PRA), describes how PRA can improve the bases of Agency decisions, and provides illustrations of how PRA has been used in risk estimation and in describing the uncertainty in decision making.

  15. Applying Probabilistic Decision Models to Clinical Trial Design

    PubMed Central

    Smith, Wade P; Phillips, Mark H

    2018-01-01

    Clinical trial design most often focuses on a single or several related outcomes with corresponding calculations of statistical power. We consider a clinical trial to be a decision problem, often with competing outcomes. Using a current controversy in the treatment of HPV-positive head and neck cancer, we apply several different probabilistic methods to help define the range of outcomes given different possible trial designs. Our model incorporates the uncertainties in the disease process and treatment response and the inhomogeneities in the patient population. Instead of expected utility, we have used a Markov model to calculate quality adjusted life expectancy as a maximization objective. Monte Carlo simulations over realistic ranges of parameters are used to explore different trial scenarios given the possible ranges of parameters. This modeling approach can be used to better inform the initial trial design so that it will more likely achieve clinical relevance. PMID:29888075

  16. [Methodological design of the National Health and Nutrition Survey 2016].

    PubMed

    Romero-Martínez, Martín; Shamah-Levy, Teresa; Cuevas-Nasu, Lucía; Gómez-Humarán, Ignacio Méndez; Gaona-Pineda, Elsa Berenice; Gómez-Acosta, Luz María; Rivera-Dommarco, Juan Ángel; Hernández-Ávila, Mauricio

    2017-01-01

    We describe the design methodology of the halfway National Health and Nutrition Survey (Ensanut-MC) 2016. The Ensanut-MC is a national probabilistic survey whose target population is the inhabitants of private households in Mexico. The sample size was determined to allow inferences on urban and rural areas in four regions. We describe the main design elements: target population, topics of study, sampling procedure, measurement procedure, and logistics organization. The final sample comprised 9 479 completed household interviews and 16 591 individual interviews. The response rate was 77.9% for households and 91.9% for individuals. The Ensanut-MC probabilistic design allows valid statistical inferences about parameters of interest for Mexico's public health and nutrition, specifically overweight, obesity, and diabetes mellitus. The updated information also supports the monitoring, updating, and formulation of new policies and priority programs.

  17. Design for Reliability and Safety Approach for the New NASA Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Weldon, Danny M.

    2007-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program, called Constellation, intended to send crew and cargo to the International Space Station (ISS), to the Moon, and beyond. As part of the Constellation program, NASA is developing new launch vehicles aimed at significantly increasing safety and reliability, reducing the cost of accessing space, and providing a growth path for manned space exploration. Achieving these goals requires a rigorous process that addresses reliability, safety, and cost upfront and throughout all phases of the program life cycle. This paper discusses the "Design for Reliability and Safety" approach for the new NASA launch vehicles, the Ares I and Ares V. Specifically, the paper addresses the use of an integrated probabilistic functional analysis to support the design analysis cycle and a probabilistic risk assessment (PRA) to support the preliminary design and beyond.

  18. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.

  19. Probabilistic brain tissue segmentation in neonatal magnetic resonance imaging.

    PubMed

    Anbeek, Petronella; Vincken, Koen L; Groenendaal, Floris; Koeman, Annemieke; van Osch, Matthias J P; van der Grond, Jeroen

    2008-02-01

    A fully automated method has been developed for segmentation of four different structures in the neonatal brain: white matter (WM), central gray matter (CEGM), cortical gray matter (COGM), and cerebrospinal fluid (CSF). The segmentation algorithm is based on information from T2-weighted (T2-w) and inversion recovery (IR) scans. The method uses a K nearest neighbor (KNN) classification technique with features derived from spatial information and voxel intensities. Probabilistic segmentations of each tissue type were generated. By applying thresholds on these probability maps, binary segmentations were obtained. These final segmentations were evaluated by comparison with a gold standard. The sensitivity, specificity, and Dice similarity index (SI) were calculated for quantitative validation of the results. High sensitivity and specificity with respect to the gold standard were reached: sensitivity >0.82 and specificity >0.9 for all tissue types. Tissue volumes were calculated from the binary and probabilistic segmentations. The probabilistic segmentation volumes of all tissue types accurately estimated the gold standard volumes. The KNN approach offers valuable ways for neonatal brain segmentation. The probabilistic outcomes provide a useful tool for accurate volume measurements. The described method is based on routine diagnostic magnetic resonance imaging (MRI) and is suitable for large population studies.
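
    A minimal sketch of the KNN step on placeholder data: per-voxel features, probabilistic tissue maps obtained from neighbor voting, and binary masks produced by thresholding. The feature values and labels below are random stand-ins for the spatial coordinates, T2-weighted and IR intensities, and gold-standard labels of registered scans.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(5)

    # Placeholder training set: features per voxel (x, y, T2, IR) and labels
    # 0..3 for WM, CEGM, COGM, CSF. Real data would come from labeled scans.
    X_train = rng.normal(size=(4_000, 4))
    y_train = rng.integers(0, 4, 4_000)

    knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)

    X_new = rng.normal(size=(10, 4))        # voxels of a new scan
    prob_maps = knn.predict_proba(X_new)    # one probability map per tissue

    csf_mask = prob_maps[:, 3] >= 0.5       # binary CSF segmentation
    print(prob_maps.round(2))
    print("CSF voxels:", int(csf_mask.sum()))
    ```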

  20. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.

  1. A Multiatlas Segmentation Using Graph Cuts with Applications to Liver Segmentation in CT Scans

    PubMed Central

    2014-01-01

    An atlas-based segmentation approach is presented that combines low-level operations, an affine probabilistic atlas, and a multiatlas-based segmentation. The proposed combination provides highly accurate segmentation due to registrations and atlas selections based on the regions of interest (ROIs) and coarse segmentations. Our approach shares the following common elements between the probabilistic atlas and multiatlas segmentation: (a) the spatial normalisation and (b) the segmentation method, which is based on minimising a discrete energy function using graph cuts. The method is evaluated for the segmentation of the liver in computed tomography (CT) images. Low-level operations define a ROI around the liver from an abdominal CT. We generate a probabilistic atlas using an affine registration based on geometry moments from manually labelled data. Next, a coarse segmentation of the liver is obtained from the probabilistic atlas with low computational effort. Then, a multiatlas segmentation approach improves the accuracy of the segmentation. Both the atlas selections and the nonrigid registrations of the multiatlas approach use a binary mask defined by coarse segmentation. We experimentally demonstrate that this approach performs better than atlas selections and nonrigid registrations in the entire ROI. The segmentation results are comparable to those obtained by human experts and to other recently published results. PMID:25276219

  2. GoDisco: Selective Gossip Based Dissemination of Information in Social Community Based Overlays

    NASA Astrophysics Data System (ADS)

    Datta, Anwitaman; Sharma, Rajesh

    We propose and investigate a gossip-based, decentralized mechanism (GoDisco), inspired by social principles and behavior, to disseminate information in online social community networks, using exclusively social links and exploiting semantic context to keep the dissemination process selective to relevant nodes. Such a dissemination scheme, gossiping over an egocentric social network, is unique and arguably a concept whose time has arrived: it emulates word-of-mouth behavior and can have interesting applications such as probabilistic publish/subscribe, decentralized recommendation, and contextual advertisement systems, to name a few. Simulation-based experiments show that, despite using only local knowledge and contacts, the system has good global coverage and behavior.

  3. Automated Database Schema Design Using Mined Data Dependencies.

    ERIC Educational Resources Information Center

    Wong, S. K. M.; Butz, C. J.; Xiang, Y.

    1998-01-01

    Describes a bottom-up procedure for discovering multivalued dependencies in observed data without knowing a priori the relationships among the attributes. The proposed algorithm is an application of a technique designed for learning conditional independencies in probabilistic reasoning; a prototype system for automated database schema design has…

  4. The case for probabilistic forecasting in hydrology

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, Roman

    2001-08-01

    That forecasts should be stated in probabilistic, rather than deterministic, terms has been argued from common sense and decision-theoretic perspectives for almost a century. Yet most operational hydrological forecasting systems produce deterministic forecasts and most research in operational hydrology has been devoted to finding the 'best' estimates rather than quantifying the predictive uncertainty. This essay presents a compendium of reasons for probabilistic forecasting of hydrological variates. Probabilistic forecasts are scientifically more honest, enable risk-based warnings of floods, enable rational decision making, and offer additional economic benefits. The growing demand for information about risk and the rising capability to quantify predictive uncertainties create an unparalleled opportunity for the hydrological profession to dramatically enhance the forecasting paradigm.

  5. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods have historically been preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The object of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992) and the Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013; Garavaglia et al., 2010) and the Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimates than standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with return period, remaining relatively moderate up to 100-year return levels. Results and discussion are illustrated throughout with the example of five watersheds located in the south of France. References: O. CAYLA: Probability calculation of design floods and inflows - SPEED. Waterpower 1995, San Francisco, California, 1995. CFGB: Design flood determination by the gradex method. Bulletin du Comité Français des Grands Barrages News 96, 18th congress CIGB-ICOLD no. 2, nov:108, 1994. F. GARAVAGLIA et al.: Introducing a rainfall compound distribution model based on weather patterns sub-sampling. Hydrology and Earth System Sciences, 14, 951-964, 2010. J. LAVABRE et al.: SHYREG: une méthode pour l'estimation régionale des débits de crue. Application aux régions méditerranéennes françaises. Ingénierie EAT, 97-111, 2003. M. MARGOUM: Estimation des crues rares et extrêmes: le modèle AGREGEE. Conceptions et premières validations. PhD thesis, Ecole des Mines de Paris, 1992. R. NAULET et al.: Flood frequency analysis on the Ardèche river using French documentary sources from the last two centuries. Journal of Hydrology, 313:58-78, 2005. E. PAQUET et al.: The SCHADEX method: A semi-continuous rainfall-runoff simulation for extreme flood estimation. Journal of Hydrology, 495, 23-37, 2013.

  6. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during the hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of magnitude-frequency distributions (MFDs): a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in the seismicity input parameters, such as the maximum magnitude value. CRISIS offers a set of built-in GMPEs, as well as the possibility of defining new ones by providing information in a tabular format. Our study shows that, in the case of the Ajaristkali HPP study area, a significant contribution to the seismic hazard comes from local sources with quite low Mmax values; thus the two attenuation laws considered give quite different PGA and SA values.
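
    The Poisson occurrence assumption mentioned above links the annual exceedance rate taken from the hazard curve to an exceedance probability over a design life; a short worked sketch of that conversion, including the familiar 10%-in-50-years level:

    ```python
    import math

    # Poisson occurrence model used in PSHA: if lam is the annual rate of
    # exceeding a ground-motion level, the probability of at least one
    # exceedance in T years is P = 1 - exp(-lam * T).
    def prob_exceedance(annual_rate, years):
        return 1.0 - math.exp(-annual_rate * years)

    def annual_rate_from_prob(prob, years):
        return -math.log(1.0 - prob) / years

    # The common "10% in 50 years" design level corresponds to a return
    # period of roughly 475 years:
    lam = annual_rate_from_prob(0.10, 50)
    print(f"annual rate = {lam:.6f} -> return period = {1/lam:.0f} years")
    print(f"check: P(exceedance in 50 y) = {prob_exceedance(lam, 50):.3f}")
    ```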

  7. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of a PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude of the earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method within a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred at Dunaharaszti, in Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
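
    A toy version of the performance-based calculation described above: the annual rate of liquefaction is obtained by weighting a conditional probability of liquefaction by the joint PGA-magnitude annual rate increments from the hazard disaggregation. The hazard increments and the logistic triggering model below are placeholders, not the Cetin et al. or Boulanger and Idriss models.

    ```python
    import math

    # Hypothetical disaggregated hazard: (PGA [g], magnitude, annual rate
    # increment) triples standing in for the PSHA disaggregation matrices.
    hazard_increments = [
        (0.10, 5.5, 4.0e-3),
        (0.20, 6.0, 1.5e-3),
        (0.30, 6.5, 5.0e-4),
        (0.45, 7.0, 1.0e-4),
    ]

    def p_liq(pga, mag):
        """Placeholder logistic conditional-probability model."""
        z = -4.0 + 8.0 * pga + 0.5 * (mag - 6.0)
        return 1.0 / (1.0 + math.exp(-z))

    rate = sum(p_liq(a, m) * dlam for a, m, dlam in hazard_increments)
    print(f"annual rate of liquefaction = {rate:.2e} "
          f"(return period = {1/rate:.0f} years)")
    ```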

  8. Probabilistic thinking and death anxiety: a terror management based study.

    PubMed

    Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S

    2014-01-01

    Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.

  9. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.

  10. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  11. Conceptual design study of Fusion Experimental Reactor (FY86 FER): Safety

    NASA Astrophysics Data System (ADS)

    Seki, Yasushi; Iida, Hiromasa; Honda, Tsutomu

    1987-08-01

    This report describes the safety study for the FER (Fusion Experimental Reactor), which has been designed as the next-step machine after the JT-60. Although the final purpose of this study is to form an image of the design basis accident and the maximum credible accident for the FER plant system and to assess their risk and probability, the emphasis of this year's study is placed on the fuel-gas circulation system, where the tritium inventory is largest. The report consists of two chapters. The first chapter summarizes the FER system and describes an FMEA (Failure Mode and Effects Analysis) and the related accident progression sequences for the FER plant system as a whole. The second chapter focuses on the fuel-gas circulation system, including purification, isotope separation and storage. Risk probability is assessed by the probabilistic risk analysis (PRA) procedure based on FMEA, ETA and FTA.

  12. Computer-Simulation Surrogates for Optimization: Application to Trapezoidal Ducts and Axisymmetric Bodies

    NASA Technical Reports Server (NTRS)

    Otto, John C.; Paraschivoiu, Marius; Yesilyurt, Serhat; Patera, Anthony T.

    1995-01-01

    Engineering design and optimization efforts using computational systems rapidly become resource intensive. The goal of the surrogate-based approach is to perform a complete optimization with limited resources. In this paper we present a Bayesian-validated approach that informs the designer as to how well the surrogate performs; in particular, our surrogate framework provides precise (albeit probabilistic) bounds on the errors incurred in the surrogate-for-simulation substitution. The theory and algorithms of our computer-simulation surrogate framework are first described. The utility of the framework is then demonstrated through two illustrative examples: maximization of the flowrate of fully developed flow in trapezoidal ducts, and design of an axisymmetric body that achieves a target Stokes drag.

  13. Robustness-Based Design Optimization Under Data Uncertainty

    NASA Technical Reports Server (NTRS)

    Zaman, Kais; McDonald, Mark; Mahadevan, Sankaran; Green, Lawrence

    2010-01-01

    This paper proposes formulations and algorithms for design optimization under both aleatory uncertainty (i.e., natural or physical variability) and epistemic uncertainty (i.e., imprecise probabilistic information), from the perspective of system robustness. The proposed formulations deal with epistemic uncertainty arising from both sparse and interval data without any assumption about the probability distributions of the random variables. A decoupled approach is proposed to un-nest the robustness-based design from the analysis of the non-design epistemic variables in order to achieve computational efficiency. The proposed methods are illustrated for the upper stage design problem of a two-stage-to-orbit (TSTO) vehicle, where the information on the random design inputs is available only as sparse point and/or interval data. As collecting more data reduces uncertainty but increases cost, the effect of sample size on the optimality and robustness of the solution is also studied. A method is developed to determine the optimal sample size for sparse point data that leads to solutions of the design problem that are least sensitive to variations in the input random variables.

  14. Probabilistic modeling approach for evaluating the compliance of ready-to-eat foods with new European Union safety criteria for Listeria monocytogenes.

    PubMed

    Koutsoumanis, Konstantinos; Angelidis, Apostolos S

    2007-08-01

    Among the new microbiological criteria that have been incorporated into EU Regulation 2073/2005, of particular interest are those concerning Listeria monocytogenes in ready-to-eat (RTE) foods, because for certain food categories they no longer require zero tolerance but rather specify a maximum allowable concentration of 100 CFU/g or ml. This study presents a probabilistic modeling approach for evaluating the compliance of RTE sliced meat products with the new safety criteria for L. monocytogenes. The approach was based on the combined use of (i) growth/no growth boundary models, (ii) kinetic growth models, (iii) product characteristics data (pH, a(w), shelf life) collected from 160 meat products from the Hellenic retail market, and (iv) storage temperature data recorded from 50 retail stores in Greece. This study shows that probabilistic analysis of the above components using Monte Carlo simulation, which takes into account the variability of the factors affecting microbial growth, can lead to a realistic estimation of the behavior of L. monocytogenes throughout the food supply chain, and the quantitative output generated can further be used by food managers as a decision-making tool for the design or modification of a product's formulation or its "use-by" date in order to ensure compliance with the new safety criteria. The study also argues that compliance of RTE foods with the new safety criteria should not be considered a parameter with a discrete and binary outcome, because it depends on factors such as product characteristics, storage temperature, and initial contamination level, which display considerable variability even among different packages of the same RTE product. Rather, compliance should be expressed, and therefore regulated, in a more probabilistic fashion.
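
    A minimal sketch of this kind of compliance simulation: sample initial contamination levels and storage temperatures, grow the population with a simple secondary model, and estimate the probability of exceeding the 100 CFU/g criterion at the end of shelf life. All distributions and parameters below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000

    log_N0 = rng.normal(-1.0, 0.5, n)   # initial level, log10 CFU/g (assumed)
    T = rng.normal(7.0, 2.0, n)         # retail storage temperature, deg C
    shelf_life = 10.0                   # days

    # Square-root-type secondary model for the growth rate (log10 CFU/g per
    # day), with no growth below the minimum growth temperature T_min.
    b, T_min = 0.04, -1.5
    mu = np.where(T > T_min, (b * (T - T_min)) ** 2, 0.0)

    log_N = np.minimum(log_N0 + mu * shelf_life, 8.0)  # cap at stationary phase
    print(f"P(>100 CFU/g at end of shelf life) = {np.mean(log_N > 2.0):.4f}")
    ```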

  15. The Use of Probabilistic Methods to Evaluate the Systems Impact of Component Design Improvements on Large Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Packard, Michael H.

    2002-01-01

    Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (Quantitative Risk Assessment Software (QRAS) etc.). Hypothetical PSA results for a number of structural components along with mitigation factors that would restrict the failure mode from propagating to a Loss of Mission (LOM) failure were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to the overall Mission Success (MS) is also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, inclusion of new sensor detection of faults and other upgrades were evaluated in determining overall turbine engine reliability.

  16. New Criterion and Tool for Caltrans Seismic Hazard Characterization

    NASA Astrophysics Data System (ADS)

    Shantz, T.; Merriam, M.; Turner, L.; Chiou, B.; Liu, X.

    2008-12-01

    Caltrans recently adopted new procedures for the development of response spectra for structure design. These procedures incorporate both deterministic and probabilistic criteria. The Next Generation Attenuation (NGA) models (2008) are used for deterministic assessment (using a revised late-Quaternary age fault database), and the USGS 2008 5% in 50-year hazard maps are used for probabilistic assessment. A minimum deterministic spectrum based on a M6.5 earthquake at 12 km is also included. These spectra are enveloped and the largest values used. A new publicly available web-based design tool for calculating the design spectrum will be used for calculations. The tool is built on a Windows-Apache-MySQL-PHP (WAMP) platform and integrates GoogleMaps for increased flexibility in the tool's use. Links to Caltrans data such as pre-construction logs of test borings assist in the estimation of Vs30 values used in the new procedures. Basin effects based on new models developed for the CFM, for the San Francisco Bay area by the USGS, and by Thurber (2008) are also incorporated. It is anticipated that additional layers such as CGS Seismic Hazard Zone maps will be added in the future. Application of the new criterion will result in expected higher levels of ground motion at many bridges west of the Coast Ranges. In eastern California, use of the NGA relationships for strike-slip faulting (the dominant sense of motion in California) will often result in slightly lower expected values for bridges. The expected result is a more realistic prediction of ground motions at bridges, in keeping with those motions developed for other large-scale and important structures. The tool is based on a simplified fault map of California, so it will not be used for more detailed evaluations such as surface rupture determination. Announcements regarding tool availability (expected to be in early 2009) are at http://www.dot.ca.gov/research/index.htm

  17. A probabilistic approach to identify putative drug targets in biochemical networks.

    PubMed

    Murabito, Ettore; Smallbone, Kieran; Swinton, Jonathan; Westerhoff, Hans V; Steuer, Ralf

    2011-06-06

    Network-based drug design holds great promise in clinical research as a way to overcome the limitations of traditional approaches in the development of drugs with high efficacy and low toxicity. This novel strategy aims to study how a biochemical network as a whole, rather than its individual components, responds to specific perturbations in different physiological conditions. Proteins exerting little control over normal cells and larger control over altered cells may be considered as good candidates for drug targets. The application of network-based drug design would greatly benefit from using an explicit computational model describing the dynamics of the system under investigation. However, creating a fully characterized kinetic model is not an easy task, even for relatively small networks, as it is still significantly hampered by the lack of data about kinetic mechanisms and parameters values. Here, we propose a Monte Carlo approach to identify the differences between flux control profiles of a metabolic network in different physiological states, when information about the kinetics of the system is partially or totally missing. Based on experimentally accessible information on metabolic phenotypes, we develop a novel method to determine probabilistic differences in the flux control coefficients between the two observable phenotypes. Knowledge of how differences in flux control are distributed among the different enzymatic steps is exploited to identify points of fragility in one of the phenotypes. Using a prototypical cancerous phenotype as an example, we demonstrate how our approach can assist researchers in developing compounds with high efficacy and low toxicity. © 2010 The Royal Society

  18. Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls

    NASA Astrophysics Data System (ADS)

    Guha Ray, A.; Baidya, D. K.

    2012-09-01

    Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall highlights the fact that high sensitivity of a particular variable for a particular failure mode does not necessarily imply a remarkable contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (Rf) for each random variable based on the combined effect of the failure probability (Pf) of each failure mode of a gravity retaining wall and the sensitivity of each random variable for these failure modes. Pf is calculated by Monte Carlo simulation, and the sensitivity of each random variable is assessed by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that Rf for the friction angle of the backfill soil (φ1) increases, and that for the cohesion of the foundation soil (c2) decreases, with increasing variation of φ1, while Rf for the unit weights (γ1 and γ2) of both soils and for the friction angle of the foundation soil (φ2) remains almost constant under variation of the soil properties. The results compare well with some existing deterministic and probabilistic methods and are found to be cost-effective. If the variation of φ1 remains within 5%, a significant reduction in cross-sectional area can be achieved, but if the variation exceeds 7-8%, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.

  19. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods for managing product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
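
    The core of the recommended process is a short Monte Carlo estimate of the failure probability from a load-resistance model, which can then feed a quantitative risk or life cycle cost objective. A minimal sketch with hypothetical distributions and cost figures:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 1_000_000

    # Load-resistance interference: failure occurs when stress S exceeds
    # strength R. Both distributions are illustrative placeholders.
    R = rng.lognormal(mean=np.log(500.0), sigma=0.08, size=n)  # strength, MPa
    S = rng.lognormal(mean=np.log(320.0), sigma=0.15, size=n)  # stress, MPa

    pf = np.mean(S >= R)
    print(f"P(failure) = {pf:.2e}")

    # Expected life cycle cost for a fleet: acquisition cost plus failure
    # cost weighted by the failure probability (hypothetical figures).
    c_unit, c_failure, fleet = 1_000.0, 250_000.0, 400
    print(f"expected fleet cost = {fleet * (c_unit + pf * c_failure):,.0f}")
    ```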

  20. Probabilistic vs. deterministic fiber tracking and the influence of different seed regions to delineate cerebellar-thalamic fibers in deep brain stimulation.

    PubMed

    Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M

    2017-06-01

    This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied to each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts, and to processing time. Two sets of regions of interest (ROIs) guided deterministic tractography of the DRTT or the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking, and lasted substantially longer. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent but consistently more posterior to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although the sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  1. Optimal design of groundwater remediation system using a probabilistic multi-objective fast harmony search algorithm under uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun

    2014-11-01

    This study develops a new probabilistic multi-objective fast harmony search algorithm (PMOFHS) for optimal design of groundwater remediation systems under uncertainty associated with the hydraulic conductivity (K) of aquifers. The PMOFHS integrates the previously developed deterministic multi-objective optimization method, namely multi-objective fast harmony search algorithm (MOFHS) with a probabilistic sorting technique to search for Pareto-optimal solutions to multi-objective optimization problems in a noisy hydrogeological environment arising from insufficient K data. The PMOFHS is then coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, to identify the optimal design of groundwater remediation systems for a two-dimensional hypothetical test problem and a three-dimensional Indiana field application involving two objectives: (i) minimization of the total remediation cost through the engineering planning horizon, and (ii) minimization of the mass remaining in the aquifer at the end of the operational period, whereby the pump-and-treat (PAT) technology is used to clean up contaminated groundwater. Also, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology. Comprehensive analysis indicates that the proposed PMOFHS can find Pareto-optimal solutions with low variability and high reliability and is a potentially effective tool for optimizing multi-objective groundwater remediation problems under uncertainty.

  2. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  3. Probabilistic Seismic Hazard Assessment for Iraq

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997, and an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years out of date; b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design; and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.

  4. Learning Additional Languages as Hierarchical Probabilistic Inference: Insights From First Language Processing.

    PubMed

    Pajak, Bozena; Fine, Alex B; Kleinschmidt, Dave F; Jaeger, T Florian

    2016-12-01

    We present a framework of second and additional language (L2/Ln) acquisition motivated by recent work on socio-indexical knowledge in first language (L1) processing. The distribution of linguistic categories covaries with socio-indexical variables (e.g., talker identity, gender, dialects). We summarize evidence that implicit probabilistic knowledge of this covariance is critical to L1 processing, and propose that L2/Ln learning uses the same type of socio-indexical information to probabilistically infer latent hierarchical structure over previously learned and new languages. This structure guides the acquisition of new languages based on their inferred place within that hierarchy, and is itself continuously revised based on new input from any language. This proposal unifies L1 processing and L2/Ln acquisition as probabilistic inference under uncertainty over socio-indexical structure. It also offers a new perspective on crosslinguistic influences during L2/Ln learning, accommodating gradient and continued transfer (both negative and positive) from previously learned to novel languages, and vice versa.

  5. Learning Additional Languages as Hierarchical Probabilistic Inference: Insights From First Language Processing

    PubMed Central

    Pajak, Bozena; Fine, Alex B.; Kleinschmidt, Dave F.; Jaeger, T. Florian

    2015-01-01

    We present a framework of second and additional language (L2/Ln) acquisition motivated by recent work on socio-indexical knowledge in first language (L1) processing. The distribution of linguistic categories covaries with socio-indexical variables (e.g., talker identity, gender, dialects). We summarize evidence that implicit probabilistic knowledge of this covariance is critical to L1 processing, and propose that L2/Ln learning uses the same type of socio-indexical information to probabilistically infer latent hierarchical structure over previously learned and new languages. This structure guides the acquisition of new languages based on their inferred place within that hierarchy, and is itself continuously revised based on new input from any language. This proposal unifies L1 processing and L2/Ln acquisition as probabilistic inference under uncertainty over socio-indexical structure. It also offers a new perspective on crosslinguistic influences during L2/Ln learning, accommodating gradient and continued transfer (both negative and positive) from previously learned to novel languages, and vice versa. PMID:28348442

  6. The European ASAMPSA_E project : towards guidance to model the impact of high amplitude natural hazards in the probabilistic safety assessment of nuclear power plants. Information on the project progress and needs from the geosciences.

    NASA Astrophysics Data System (ADS)

    Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst

    2015-04-01

    The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or safety authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP for low-probability events with high-amplitude consequences. In reality, the development of such PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10^-4 per year. Today, even lower probabilities, down to 10^-8, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect an NPP, and assigning them some meaningful probability of occurrence, is difficult. The European project ASAMPSA_E (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, the US and Japan and aims at identifying meaningful practices to extend the scope and the quality of the existing probabilistic safety analyses developed for nuclear power plants. It offers a framework to discuss, at a technical level, how "extended PSA" can be developed efficiently and be used to verify whether the robustness of NPPs in their environment is sufficient. The paper will present the objectives of this project and some first lessons, and introduce the type of guidance being developed. It will explain the need for expertise from the geosciences to support nuclear safety assessment in the different areas (seismotectonic, hydrological, meteorological and biological hazards, …).

  7. Simulation of probabilistic wind loads and building analysis

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.

  8. An investigation into the probabilistic combination of quasi-static and random accelerations

    NASA Technical Reports Server (NTRS)

    Schock, R. W.; Tuell, L. P.

    1984-01-01

    The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to simultaneous vehicle dynamic vibration (quasi-static) and acoustically generated random vibration, requires the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the randomly generated responses separately, and arithmetically add or root-sum-square them to get combined accelerations. Since the combination of a probabilistic and a deterministic function yields a probabilistic function, a viable alternative approach is to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The present paper develops this mechanism and provides graphical data to select combined accelerations for the most popular percentile levels.
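
    A small numerical illustration of the idea, assuming (as a choice of this sketch, not necessarily of the paper) that peaks of the random component are Rayleigh-distributed about the deterministic quasi-static level:

```python
# Percentile-based combination of a deterministic quasi-static acceleration and
# a random-vibration component, compared with arithmetic addition and RSS.
# The Rayleigh-peak assumption and all numbers are illustrative.
import numpy as np
from scipy.stats import rayleigh

a_qs = 6.0     # quasi-static acceleration (g), deterministic
sigma = 2.5    # 1-sigma random-vibration response (g)
p = 0.9987     # target one-sided percentile (~3-sigma level)

a_pct = a_qs + rayleigh.ppf(p, scale=sigma)   # percentile of the combined PDF
a_add = a_qs + 3 * sigma                      # arithmetic 3-sigma addition
a_rss = np.sqrt(a_qs**2 + (3 * sigma)**2)     # root-sum-square combination

print(f"percentile: {a_pct:.2f} g, additive: {a_add:.2f} g, RSS: {a_rss:.2f} g")
```

    The spread between the three values is the paper's point: the combination rule, not just the inputs, drives the resulting design load factor.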

  9. Dominating Scale-Free Networks Using Generalized Probabilistic Methods

    PubMed Central

    Molnár, F.; Derzsy, N.; Czabarka, É.; Székely, L.; Szymanski, B. K.; Korniss, G.

    2014-01-01

    We study ensemble-based graph-theoretical methods aiming to approximate the size of the minimum dominating set (MDS) in scale-free networks. We analyze both analytical upper bounds of dominating sets and numerical realizations for applications. We propose two novel probabilistic dominating set selection strategies that are applicable to heterogeneous networks. One of them obtains the smallest probabilistic dominating set and also outperforms the deterministic degree-ranked method. We show that a degree-dependent probabilistic selection method becomes optimal in its deterministic limit. In addition, we also find the precise limit where selecting high-degree nodes exclusively becomes inefficient for network domination. We validate our results on several real-world networks, and provide highly accurate analytical estimates for our methods. PMID:25200937
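
    A rough sketch of degree-dependent probabilistic selection (a generic illustration of the idea, not the exact strategy derived in the paper), with a patch-up pass so the result is a valid dominating set:

```python
# Degree-dependent probabilistic dominating-set selection on a scale-free
# graph, plus a patch-up pass that guarantees a valid dominating set.
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)
G = nx.barabasi_albert_graph(5000, m=2, seed=7)

deg = dict(G.degree())
dmax = max(deg.values())
# select node v with a probability that increases with its degree
selected = {v for v in G if rng.random() < (deg[v] / dmax) ** 0.5}

# patch-up: any node left undominated adds itself, ensuring full domination
dominated = set(selected)
for v in selected:
    dominated.update(G.neighbors(v))
for v in G:
    if v not in dominated:
        selected.add(v)
        dominated.add(v)
        dominated.update(G.neighbors(v))

assert all(v in selected or any(u in selected for u in G.neighbors(v)) for v in G)
print(f"dominating set: {len(selected)} of {G.number_of_nodes()} nodes")
```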

  10. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-05-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  11. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  12. A rule-based expert system applied to moisture durability of building envelopes

    DOE PAGES

    Boudreaux, Philip R.; Pallin, Simon B.; Accawi, Gina K.; ...

    2018-01-09

    The moisture durability of an envelope component such as a wall or roof is difficult to predict. Moisture durability depends on all the construction materials used, as well as the climate, orientation, air tightness, and indoor conditions. Modern building codes require more insulation and tighter construction but provide little guidance about how to ensure these energy-efficient assemblies remain moisture durable. Furthermore, as new products and materials are introduced, builders are increasingly uncertain about the long-term durability of their building envelope designs. Oak Ridge National Laboratory and the US Department of Energy’s Building America Program are applying a rule-based expert system methodology in a web tool to help designers determine whether a given wall design is likely to be moisture durable and provide expert guidance on moisture risk management specific to a wall design and climate. Finally, the expert system is populated with knowledge from both expert judgment and probabilistic hygrothermal simulation results.

  13. The use of subjective expert opinions in cost optimum design of aerospace structures. [probabilistic failure models

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hanagud, S.

    1975-01-01

    The results of two questionnaires sent to engineering experts are statistically analyzed and compared with objective data from Saturn V design and testing. Engineers were asked how likely it was for structural failure to occur at load increments above and below analysts' stress limit predictions. They were requested to estimate the relative probabilities of different failure causes, and of failure at each load increment given a specific cause. Three mathematical models are constructed based on the experts' assessment of causes. The experts' overall assessment of prediction strength fits the Saturn V data better than the models do, but a model test option (T-3) based on the overall assessment gives more design change likelihood to overstrength structures than does an older standard test option. T-3 compares unfavorably with the standard option in a cost optimum structural design problem. The report reflects a need for subjective data when objective data are unavailable.

  14. A rule-based expert system applied to moisture durability of building envelopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, Philip R.; Pallin, Simon B.; Accawi, Gina K.

    The moisture durability of an envelope component such as a wall or roof is difficult to predict. Moisture durability depends on all the construction materials used, as well as the climate, orientation, air tightness, and indoor conditions. Modern building codes require more insulation and tighter construction but provide little guidance about how to ensure these energy-efficient assemblies remain moisture durable. Furthermore, as new products and materials are introduced, builders are increasingly uncertain about the long-term durability of their building envelope designs. Oak Ridge National Laboratory and the US Department of Energy’s Building America Program are applying a rule-based expert system methodology in a web tool to help designers determine whether a given wall design is likely to be moisture durable and provide expert guidance on moisture risk management specific to a wall design and climate. Finally, the expert system is populated with knowledge from both expert judgment and probabilistic hygrothermal simulation results.

  15. Probabilistic analysis of the influence of the bonding degree of the stem-cement interface in the performance of cemented hip prostheses.

    PubMed

    Pérez, M A; Grasa, J; García-Aznar, J M; Bea, J A; Doblaré, M

    2006-01-01

    The long-term behavior of the stem-cement interface is one of the most frequent topics of discussion in the design of cemented total hip replacements, especially with regard to the process of damage accumulation in the cement layer. This effect is analyzed here by comparing two different situations of the interface: completely bonded, and debonded with friction. This comparative analysis is performed using a probabilistic computational approach that considers the variability and uncertainty of determinant factors that directly affect damage accumulation in the cement mantle. This stochastic technique is based on the combination of probabilistic finite elements (PFEM) and a cumulative damage approach known as the B-model. Three random variables were considered: muscle and joint contact forces at the hip (both for walking and stair climbing), cement damage, and fatigue properties of the cement. The results predicted that the regions with higher failure probability in the bulk cement are completely different depending on the stem-cement interface characteristics. In a bonded interface, critical sites appeared at the distal and medial parts of the cement, while for debonded interfaces, the critical regions were found distally and proximally. In bonded interfaces, the failure probability was higher than in debonded ones. The same conclusion may be established for stair climbing in comparison with walking activity.

  16. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2011-01-01

    A methodology to compute the probabilistically-combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of the response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  17. Probabilistic Simulation for Combined Cycle Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute the probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of the response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/- 45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  18. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute the probabilistically-combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of the response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  19. Towards Real-time, On-board, Hardware-Supported Sensor and Software Health Management for Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Rozier, Kristin Y.; Reinbacher, Thomas; Mengshoel, Ole J.; Mbaya, Timmy; Ippolito, Corey

    2013-01-01

    Unmanned aerial systems (UASs) can only be deployed if they can effectively complete their missions and respond to failures and uncertain environmental conditions while maintaining safety with respect to other aircraft as well as humans and property on the ground. In this paper, we design a real-time, on-board system health management (SHM) capability to continuously monitor sensors, software, and hardware components for detection and diagnosis of failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and/or software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; (3) an unobtrusive, lightweight, read-only, low-power realization using Field Programmable Gate Arrays (FPGAs) that avoids overburdening limited computing resources or costly re-certification of flight software due to instrumentation. Our implementation provides a novel approach of combining modular building blocks, integrating responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. We demonstrate this approach using actual data from the NASA Swift UAS, an experimental all-electric aircraft.

  20. Agent-based simulation for human-induced hazard analysis.

    PubMed

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that uses agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.

  1. A probabilistic maintenance model for diesel engines

    NASA Astrophysics Data System (ADS)

    Pathirana, Shan; Abeygunawardane, Saranga Kumudu

    2018-02-01

    In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from inspection and maintenance histories of diesel engines and experts' views. Reliability indices and costs were calculated for the present maintenance policy of diesel engines. A sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life-cycle cost of diesel engines.

  2. Probabilistic Analysis of Radiation Doses for Shore-Based Individuals in Operation Tomodachi

    DTIC Science & Technology

    2013-05-01

    [No abstract was recovered for this record; the extracted text consisted only of front matter (reference entries and a unit-conversion table) from the report DTRA-TR-12-002, Probabilistic Analysis of Radiation Doses for Shore-Based Individuals in Operation Tomodachi.]

  3. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. Solution of the stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure that corresponds to unity for reliability p (or p = 1). Weight can be reduced to a small value for the most failure-prone design with a reliability that approaches zero (p = 0). Reliability can be changed for different components of an airframe structure. For example, the landing gear can be designed for a very high reliability, whereas reliability can be relaxed somewhat for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code was the deterministic analysis tool, (2) the fast probabilistic integrator, or the FPI module of the NESSUS software, was the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.
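
    A toy version of the weight-versus-reliability trade described above, assuming a single member with normal load and strength (all values invented, in consistent units) whose weight scales with its cross-sectional area:

```python
# Required cross-sectional area (a proxy for weight) as a function of the
# target reliability p, for a single member with normal load and strength.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

mu_S, sd_S = 400.0, 40.0   # material strength
mu_L, sd_L = 200.0, 50.0   # applied load; member stress = load / area

def reliability(A):
    m = mu_S - mu_L / A             # mean safety margin
    s = np.hypot(sd_S, sd_L / A)    # standard deviation of the margin
    return norm.cdf(m / s)

for p in (0.10, 0.50, 0.90, 0.99, 0.999, 0.999999):
    A = brentq(lambda a: reliability(a) - p, 1e-3, 1e3)
    print(f"p = {p:<9} required area ~ {A:.3f} (weight scales with A)")
```

    Sweeping p from near 0 to near 1 traces the inverted-S described above: the required area grows slowly through the mid-range and climbs steeply as p approaches 1.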

  4. Applications of Response Surface-Based Methods to Noise Analysis in the Conceptual Design of Revolutionary Aircraft

    NASA Technical Reports Server (NTRS)

    Hill, Geoffrey A.; Olson, Erik D.

    2004-01-01

    Due to the growing problem of noise in today's air transportation system, the need has arisen to incorporate noise considerations into the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300-passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels, and the resulting trade space is explored. Methods demonstrated include: single-point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
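
    The response-surface conversion is easy to prototype. A hedged sketch: a quadratic polynomial surrogate is fitted by least squares to a handful of runs of a stand-in "expensive" noise model (the two inputs and the toy function are invented), then reused for a cheap probabilistic trade study.

```python
# Quadratic response surface fitted to a few expensive model runs, then used
# for a rapid Monte Carlo study. The ground-truth function is a placeholder.
import numpy as np

rng = np.random.default_rng(3)

def expensive_noise_model(thrust, altitude):
    # stand-in for a full source-noise prediction (EPNdB-like scale)
    return 95 + 8 * np.log10(thrust) - 0.002 * altitude + 0.3 * np.log10(thrust) ** 2

# design of experiments: a small set of model runs
T = rng.uniform(50, 300, 40)    # thrust
h = rng.uniform(0, 3000, 40)    # altitude
y = expensive_noise_model(T, h)

# quadratic basis [1, T, h, T^2, h^2, T*h] and least-squares fit
X = np.column_stack([np.ones_like(T), T, h, T**2, h**2, T * h])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# rapid probabilistic analysis on the surrogate: Monte Carlo over uncertain inputs
Tm = rng.normal(180, 15, 100_000)
hm = rng.normal(1500, 200, 100_000)
Xm = np.column_stack([np.ones_like(Tm), Tm, hm, Tm**2, hm**2, Tm * hm])
noise = Xm @ coef
print(f"predicted noise: mean {noise.mean():.2f}, 95th pct {np.percentile(noise, 95):.2f}")
```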

  5. Probabilistic Tractography of the Cranial Nerves in Vestibular Schwannoma.

    PubMed

    Zolal, Amir; Juratli, Tareq A; Podlesek, Dino; Rieger, Bernhard; Kitzler, Hagen H; Linn, Jennifer; Schackert, Gabriele; Sobottka, Stephan B

    2017-11-01

    Multiple recent studies have reported on diffusion tensor-based fiber tracking of cranial nerves in vestibular schwannoma, with conflicting results as to the accuracy of the method and the occurrence of cochlear nerve depiction. Probabilistic nontensor-based tractography might offer advantages in terms of better extraction of directional information from the underlying data in cranial nerves, which are of subvoxel size. Twenty-one patients with large vestibular schwannomas were recruited. The probabilistic tracking was run preoperatively and the position of the potential depictions of the facial and cochlear nerves was estimated postoperatively by 3 independent observers in a blinded fashion. The true position of the nerve was determined intraoperatively by the surgeon. Thereafter, the imaging-based estimated position was compared with the intraoperatively determined position. Tumor size, cystic appearance, and postoperative House-Brackmann score were analyzed with regard to the accuracy of the depiction of the nerves. The probabilistic tracking showed a connection that correlated to the position of the facial nerve in 81% of the cases and to the position of the cochlear nerve in 33% of the cases. Altogether, the resulting depiction did not correspond to the intraoperative position of any of the nerves in 3 cases. In a majority of cases, the position of the facial nerve, but not of the cochlear nerve, could be estimated by evaluation of the probabilistic tracking results. However, false depictions not corresponding to any nerve do occur and cannot be discerned as such from the image only. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.

    PubMed

    Zhao, Yuchao; Frey, H Christopher

    2004-11-01

    Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
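
    For an uncensored data set, the parametric bootstrap step might look like the following sketch, with synthetic measurements standing in for the Jacksonville emission-factor data:

```python
# Parametric bootstrap of the mean emission factor: fit a lognormal by MLE,
# resample from the fit, and report a 95% uncertainty range. Data are synthetic.
import numpy as np

rng = np.random.default_rng(11)
ef = rng.lognormal(mean=np.log(0.8), sigma=0.6, size=25)   # pretend measured factors

# lognormal MLE: sample mean and standard deviation of the log-data
mu_hat, sigma_hat = np.log(ef).mean(), np.log(ef).std(ddof=0)

B, n = 5000, len(ef)
boot_means = np.exp(rng.normal(mu_hat, sigma_hat, size=(B, n))).mean(axis=1)
lo, hi = np.percentile(boot_means, [2.5, 97.5])
mean = ef.mean()
print(f"mean EF {mean:.3f}; 95% range {100*(lo-mean)/mean:+.0f}% to {100*(hi-mean)/mean:+.0f}%")
```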

  7. ProbCD: enrichment analysis accounting for categorization uncertainty.

    PubMed

    Vêncio, Ricardo Z N; Shmulevich, Ilya

    2007-10-12

    As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, or to deal with uncertainty in high-throughput datasets, current enrichment methods largely ignore this probabilistic information since they are mainly based on variants of the Fisher Exact Test. We developed open-source R-based software for probabilistic categorical data analysis, ProbCD, that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, concerning enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of the high-throughput experimental techniques and (ii) probabilistic gene annotation.
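
    The expectation-built contingency table is simple to illustrate. In the sketch below (illustrative probabilities and variable names, not ProbCD's API), each gene contributes its membership probabilities rather than a hard 0/1 count:

```python
# Expected 2x2 enrichment table when both "selected" and "annotated" are
# probabilistic: cell counts become sums of Bernoulli expectations per gene.
import numpy as np

rng = np.random.default_rng(5)
n_genes = 1000
p_selected = rng.uniform(0, 1, n_genes)   # P(gene is in the study set)
p_annot = rng.uniform(0, 1, n_genes)      # P(gene belongs to the category)

# expectations under independent per-gene Bernoulli indicators
e11 = np.sum(p_selected * p_annot)              # selected and annotated
e10 = np.sum(p_selected * (1 - p_annot))        # selected, not annotated
e01 = np.sum((1 - p_selected) * p_annot)        # not selected, annotated
e00 = np.sum((1 - p_selected) * (1 - p_annot))  # neither
print(np.array([[e11, e10], [e01, e00]]).round(1))
```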

  8. Prediction of Human Intestinal Absorption of Compounds Using Artificial Intelligence Techniques.

    PubMed

    Kumar, Rajnish; Sharma, Anju; Siddiqui, Mohammed Haris; Tiwari, Rajesh Kumar

    2017-01-01

    Information about the pharmacokinetics of compounds is an essential component of drug design and development. Modeling pharmacokinetic properties requires identification of the factors affecting the absorption, distribution, metabolism and excretion of compounds. There have been continuous attempts at the prediction of intestinal absorption of compounds using various artificial intelligence methods, in the effort to reduce the attrition rate of drug candidates entering preclinical and clinical trials. Currently, there are large numbers of individual predictive models available for absorption using machine learning approaches. Six artificial intelligence methods, namely support vector machine, k-nearest neighbor, probabilistic neural network, artificial neural network, partial least squares and linear discriminant analysis, were used for prediction of the absorption of compounds. The prediction accuracies of support vector machine, k-nearest neighbor, probabilistic neural network, artificial neural network, partial least squares and linear discriminant analysis for prediction of intestinal absorption of compounds were found to be 91.54%, 88.33%, 84.30%, 86.51%, 79.07% and 80.08%, respectively. Comparative analysis of all six prediction models suggested that the support vector machine with a radial basis function kernel is comparatively better for binary classification of compounds by human intestinal absorption and may be useful at preliminary stages of drug design and development. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
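
    A minimal version of the best-performing setup reported above, an RBF-kernel SVM for binary absorbed/not-absorbed classification, with synthetic features standing in for molecular descriptors:

```python
# RBF-kernel SVM for binary classification; descriptors and labels are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))                                      # stand-in descriptor matrix
y = (X[:, :3].sum(axis=1) + rng.normal(0, 0.5, 600) > 0).astype(int)  # stand-in labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.3f}")
```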

  9. Evaluation of Sex-Specific Movement Patterns in Judo Using Probabilistic Neural Networks.

    PubMed

    Miarka, Bianca; Sterkowicz-Przybycien, Katarzyna; Fukuda, David H

    2017-10-01

    The purpose of the present study was to create a probabilistic neural network to clarify the understanding of movement patterns in international judo competitions by gender. Analysis of 773 male and 638 female bouts was utilized to identify movements during the approach, gripping, attack (including biomechanical designations), groundwork, defense, and pause phases. Probabilistic neural network and chi-square (χ²) tests modeled and compared frequencies (p ≤ .05). Women (mean [interquartile range]: 9.9 [4; 14]) attacked more than men (7.0 [3; 10]) while attempting a greater number of arm/leg lever (women: 2.7 [1; 6]; men: 4.0 [0; 4]) and trunk/leg lever (women: 0.8 [0; 1]; men: 2.4 [0; 4]) techniques but fewer maximal length-moment arm techniques (women: 0.7 [0; 1]; men: 1.0 [0; 2]). Male athletes displayed one-handed gripping of the back and sleeve, whereas female athletes executed a greater number of groundwork techniques. An optimized probabilistic neural network model, using patterns from the gripping, attack, groundwork, and pause phases, produced an overall prediction accuracy of 76% for discrimination between men and women.
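
    A probabilistic neural network is essentially a Parzen-window classifier; a compact sketch (synthetic feature vectors standing in for the phase-frequency data, and a hand-picked kernel width):

```python
# Minimal probabilistic neural network: each class density is a sum of Gaussian
# kernels centred on its training patterns. All data here are synthetic.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Classify each test row by the class with the largest kernel-density sum."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2 * sigma**2)).mean(axis=1))
    return classes[np.argmax(np.column_stack(scores), axis=1)]

rng = np.random.default_rng(2)
X0 = rng.normal(0.0, 1.0, (200, 4))   # e.g. "male bout" feature vectors
X1 = rng.normal(0.8, 1.0, (200, 4))   # e.g. "female bout" feature vectors
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

idx = rng.permutation(400)
tr, te = idx[:300], idx[300:]
pred = pnn_predict(X[tr], y[tr], X[te])
print(f"holdout accuracy: {(pred == y[te]).mean():.2f}")
```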

  10. Combining MLC and SVM Classifiers for Learning Based Decision Making: Analysis and Evaluations

    PubMed Central

    Zhang, Yi; Ren, Jinchang; Jiang, Jianmin

    2015-01-01

    Maximum likelihood classifier (MLC) and support vector machines (SVM) are two commonly used approaches in machine learning. MLC is based on Bayesian theory in estimating parameters of a probabilistic model, whilst SVM is an optimization-based nonparametric method in this context. Recently, it has been found that SVM in some cases is equivalent to MLC in probabilistically modeling the learning process. In this paper, MLC and SVM are combined in learning and classification, which helps to yield probabilistic output for SVM and facilitates soft decision making. In total, four groups of data are used for evaluations, covering sonar, vehicle, breast cancer, and DNA sequences. The data samples are characterized in terms of Gaussian/non-Gaussian distributions and balanced/unbalanced samples, which are then further used for performance assessment in comparing the SVM and the combined SVM-MLC classifier. Interesting results are reported to indicate how the combined classifier may work under various conditions. PMID:26089862
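
    One straightforward way to realize such a combination, sketched under assumptions of this illustration (a Gaussian maximum-likelihood classifier via quadratic discriminant analysis, Platt-scaled SVM probabilities, and an arbitrary 50/50 average; not necessarily the paper's exact scheme):

```python
# Combining a Gaussian MLC (quadratic discriminant) with an SVM that outputs
# Platt-scaled probabilities, averaged into one probabilistic classifier.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)  # one of the four data groups above
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

mlc = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)        # Gaussian MLC
svm = SVC(probability=True, random_state=1).fit(X_tr, y_tr)  # SVM with Platt scaling

p = 0.5 * mlc.predict_proba(X_te) + 0.5 * svm.predict_proba(X_te)
acc = (p.argmax(axis=1) == y_te).mean()
print(f"combined SVM-MLC accuracy: {acc:.3f}")
```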

  11. Combining MLC and SVM Classifiers for Learning Based Decision Making: Analysis and Evaluations.

    PubMed

    Zhang, Yi; Ren, Jinchang; Jiang, Jianmin

    2015-01-01

    Maximum likelihood classifier (MLC) and support vector machines (SVM) are two commonly used approaches in machine learning. MLC is based on Bayesian theory in estimating parameters of a probabilistic model, whilst SVM is an optimization-based nonparametric method in this context. Recently, it has been found that SVM in some cases is equivalent to MLC in probabilistically modeling the learning process. In this paper, MLC and SVM are combined in learning and classification, which helps to yield probabilistic output for SVM and facilitates soft decision making. In total, four groups of data are used for evaluations, covering sonar, vehicle, breast cancer, and DNA sequences. The data samples are characterized in terms of Gaussian/non-Gaussian distributions and balanced/unbalanced samples, which are then further used for performance assessment in comparing the SVM and the combined SVM-MLC classifier. Interesting results are reported to indicate how the combined classifier may work under various conditions.

  12. Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.

    PubMed

    Frommholz, Ingo; Roelleke, Thomas

    2016-01-01

    Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea to model Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules, and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore, HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view on the methods of the HySpirit system to make PDatalog applicable in real-scale applications that involve a wide range of requirements typical for data (information) management and analysis.

  13. NESSUS (Numerical Evaluation of Stochastic Structures Under Stress)/EXPERT: Bridging the gap between artificial intelligence and FORTRAN

    NASA Technical Reports Server (NTRS)

    Fink, Pamela K.; Palmer, Karol K.

    1988-01-01

    The development of a probabilistic structural analysis methodology (PSAM) is described. In the near term, the methodology will be applied to designing critical components of the next generation space shuttle main engine. In the long term, PSAM will be applied very broadly, providing designers with a new technology for more effective design of structures whose character and performance are significantly affected by random variables. The software under development to implement the ideas developed in PSAM resembles, in many ways, conventional deterministic structural analysis code. However, several additional capabilities regarding the probabilistic analysis make the input data requirements and the resulting output even more complex. As a result, an intelligent front- and back-end to the code is being developed to assist the design engineer in providing the input data in a correct and appropriate manner. The type of knowledge that this entails is, in general, heuristically based, allowing the fairly well-understood technology of production rules to apply with little difficulty. However, the PSAM code, called NESSUS, is written in FORTRAN-77 and runs on a DEC VAX. Thus, the associated expert system, called NESSUS/EXPERT, must run on a DEC VAX as well, and integrate effectively and efficiently with the existing FORTRAN code. This paper discusses the process undergone to select a suitable tool, identify an appropriate division between the functions that should be performed in FORTRAN and those that should be performed by production rules, and how integration of the conventional and AI technologies was achieved.

  14. Probabilistic interpretation of Peelle's pertinent puzzle and its resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, Kenneth M.; Kawano, T.; Talou, P.

    2004-01-01

    Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.

  15. Probabilistic Interpretation of Peelle's Pertinent Puzzle and its Resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, Kenneth M.; Kawano, Toshihiko; Talou, Patrick

    2005-05-24

    Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.

  16. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  17. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    NASA Astrophysics Data System (ADS)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. On the other hand, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in the different typical weather conditions was acquired via maximum likelihood parameter estimation. The Monte Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different frequency orders as well as the total harmonic distortion (THD) in typical weather conditions. The case study was based on the IEEE 33-bus system, and the resulting probabilistic distributions of harmonic voltage content and THD in the typical weather conditions were compared.
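
    The Monte Carlo THD step reduces to sampling harmonic magnitudes and aggregating. A sketch with invented per-harmonic statistics for one weather class (the gamma shapes and mean magnitudes are placeholders, not fitted values):

```python
# Monte Carlo distribution of total harmonic distortion (THD) from sampled
# per-order harmonic voltage magnitudes. All statistics are illustrative.
import numpy as np

rng = np.random.default_rng(9)
n = 100_000
V1 = 1.0  # fundamental voltage, per-unit

# assumed mean per-unit magnitudes of the characteristic harmonics (sunny day)
orders = np.array([5, 7, 11, 13])
mean_mag = np.array([0.020, 0.014, 0.008, 0.006])
mags = rng.gamma(shape=4.0, scale=mean_mag / 4.0, size=(n, orders.size))

thd = np.sqrt((mags**2).sum(axis=1)) / V1 * 100   # THD in percent
print(f"THD: mean {thd.mean():.2f}%, 95th percentile {np.percentile(thd, 95):.2f}%")
```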

  18. ASSESSING THE ECOLOGICAL CONDITION OF SOUTHEAST U. S. ESTUARIES

    EPA Science Inventory

    As a means to assess ecological condition, 151 stations located in southeastern estuaries from Cape Henry, Virginia to Biscayne Bay, Florida were sampled by state agencies during the summer of 2000 using a probabilistic design. The design used 8 size classes of estuaries ranging ...

  19. DESIGNING MONITORING AND ASSESSMENT STRATEGIES TO INCLUDE NEARSHORE ECOSYSTEMS OF THE GREAT LAKES

    EPA Science Inventory

    An expectation for monitoring and assessment of very large aquatic systems is that we can develop a strategy that recognizes and reports on ecologically-important subareas using spatially-stratified, probabilistic sampling designs. Ongoing efforts monitor the main-body, offshore ...

  20. BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods

    NASA Astrophysics Data System (ADS)

    Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.

    2017-12-01

    Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in the frequency and intensity of heavy rainfall events in many areas and ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple water-level- or rainfall-based loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data were gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss-influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss-influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. With the ability to cope with incomplete information and use expert knowledge, as well as inherently providing quantitative uncertainty information, loss models based on BNs are shown to be superior to deterministic approaches for pluvial flood risk assessment.
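
    A toy discrete Bayesian network conveys the inference pattern; the structure and all probabilities below are invented, and real BN-FLEMOps networks are far larger. Note how evidence can be partial:

```python
# Tiny discrete Bayesian network: water depth and precaution drive a loss
# class; inference by enumeration marginalizes over unobserved variables.
import numpy as np

p_depth = np.array([0.5, 0.3, 0.2])   # P(depth = low, medium, high)
p_prec = np.array([0.6, 0.4])         # P(precaution = no, yes)
# P(loss class | depth, precaution), shape (depth, precaution, loss=low/med/high)
p_loss = np.array([
    [[0.80, 0.15, 0.05], [0.90, 0.08, 0.02]],
    [[0.50, 0.35, 0.15], [0.65, 0.25, 0.10]],
    [[0.20, 0.40, 0.40], [0.35, 0.40, 0.25]],
])

def loss_posterior(depth=None, prec=None):
    """P(loss) given optional evidence on depth and/or precaution."""
    post = np.zeros(3)
    for d in range(3):
        if depth is not None and d != depth:
            continue
        for k in range(2):
            if prec is not None and k != prec:
                continue
            post += p_depth[d] * p_prec[k] * p_loss[d, k]
    return post / post.sum()

print("P(loss | high depth, no precaution):", loss_posterior(depth=2, prec=0).round(3))
print("P(loss | precaution observed only):", loss_posterior(prec=1).round(3))
```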

  1. A probabilistic and continuous model of protein conformational space for template-free modeling.

    PubMed

    Zhao, Feng; Peng, Jian; Debartolo, Joe; Freed, Karl F; Sosnick, Tobin R; Xu, Jinbo

    2010-06-01

    One of the major challenges with protein template-free modeling is an efficient sampling algorithm that can explore a huge conformation space quickly. The popular fragment assembly method constructs a conformation by stringing together short fragments extracted from the Protein Data Bank (PDB). The discrete nature of this method may limit generated conformations to a subspace in which the native fold does not belong. Another worry is that a protein with a truly new fold may contain some fragments not in the PDB. This article presents a probabilistic model of protein conformational space to overcome the above two limitations. This probabilistic model employs directional statistics to model the distribution of backbone angles and second-order conditional random fields (CRFs) to describe the sequence-angle relationship. Using this probabilistic model, we can sample protein conformations in a continuous space, as opposed to the widely used fragment assembly and lattice model methods that work in a discrete space. We show that when coupled with a simple energy function, this probabilistic method compares favorably with the fragment assembly method in the blind CASP8 evaluation, especially on alpha or small beta proteins. To our knowledge, this is the first probabilistic method that can search conformations in a continuous space and achieve favorable performance. Our method also generated three-dimensional (3D) models better than template-based methods for a couple of CASP8 hard targets. The method described in this article can also be applied to protein loop modeling, model refinement, and even RNA tertiary structure prediction.
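
    The directional-statistics ingredient is easy to demonstrate: backbone (phi, psi) angles drawn from von Mises distributions, so sampling stays in a continuous angular space. The parameters below are hand-picked, whereas in the paper's full model they would be produced per residue by the CRF conditioned on sequence:

```python
# Continuous sampling of backbone dihedral angles from von Mises distributions.
# The mu/kappa values are illustrative alpha-helix-like preferences.
import numpy as np

rng = np.random.default_rng(8)

phi = rng.vonmises(mu=np.radians(-60), kappa=8.0, size=50)  # phi angles (radians)
psi = rng.vonmises(mu=np.radians(-45), kappa=8.0, size=50)  # psi angles (radians)

angles = np.degrees(np.column_stack([phi, psi]))
print("first three (phi, psi) samples in degrees:\n", angles[:3].round(1))
```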

  2. Elasto-limited plastic analysis of structures for probabilistic conditions

    NASA Astrophysics Data System (ADS)

    Movahedi Rad, M.

    2018-06-01

    By applying plastic analysis and design methods, significant savings in material can be obtained. However, as a result of this benefit, excessive plastic deformations and large residual displacements might develop, which in turn might lead to unserviceability and collapse of the structure. In this study, for the deterministic problem, the residual deformation of structures is limited by a constraint on the complementary strain energy of the residual forces. For the probabilistic problem, the constraint on the complementary strain energy of the residual forces is given randomly, and the critical stresses are updated during the iteration. Limit curves are presented for the plastic limit load factors. The results show that these constraints have significant effects on the load factors. The formulations of the deterministic and probabilistic problems lead to mathematical programming problems, which are solved using a nonlinear algorithm.

  3. Cost-utility analysis of screening for diabetic retinopathy in Japan: a probabilistic Markov modeling study.

    PubMed

    Kawasaki, Ryo; Akune, Yoko; Hiratsuka, Yoshimune; Fukuhara, Shunichi; Yamada, Masakazu

    2015-02-01

    To evaluate the cost-effectiveness of screening intervals longer than 1 year for detecting diabetic retinopathy (DR), through the estimation of incremental costs per quality-adjusted life year (QALY), based on the best available clinical data in Japan. A Markov model with a probabilistic cohort analysis was framed to calculate the incremental costs per QALY gained by implementing a screening program detecting DR in Japan. A 1-year cycle length and a population size of 50,000 with a 50-year time horizon (age 40-90 years) were used. The best available clinical data from publications and national surveillance data were used, and the model was designed to include current diagnosis and management of DR with the corresponding visual outcomes. One-way and probabilistic sensitivity analyses were performed considering uncertainties in the parameters. In the base-case analysis, the strategy with a screening program resulted in an incremental cost of 5,147 Japanese yen (¥; US$64.6) and an incremental effectiveness of 0.0054 QALYs per person screened. The incremental cost-effectiveness ratio was ¥944,981 (US$11,857) per QALY. The simulation suggested that screening would result in a significant reduction in blindness in people aged 40 years or over (-16%). Sensitivity analyses suggested that in order to achieve both reductions in blindness and cost-effectiveness in Japan, the screening program should screen those aged 53-84 years, at intervals of 3 years or less. An eye screening program in Japan would be cost-effective in detecting DR and preventing blindness from DR, even allowing for the uncertainties in the estimates of costs, utility, and current management of DR.
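
    A minimal cohort-style Markov sketch of the kind of calculation behind such a model is shown below. States, transition probabilities, costs, and utilities are hypothetical placeholders rather than the study's calibrated Japanese inputs; only the structure (annual cycles, a 50-year horizon, and an ICER comparing two strategies) mirrors the abstract.

```python
import numpy as np

# Minimal Markov cohort sketch of a screening model. All numbers below are
# hypothetical placeholders, not the Japanese model's calibrated inputs.

states = ["no_DR", "DR", "blind", "dead"]

def run(p_detect):
    # Annual transitions; screening raises p_detect, which diverts DR
    # patients to treatment and lowers progression to blindness (assumed).
    p_blind = 0.020 * (1.0 - 0.6 * p_detect)
    T = np.array([
        [0.960, 0.030, 0.000, 0.010],              # no_DR
        [0.000, 0.970 - p_blind, p_blind, 0.030],  # DR
        [0.000, 0.000, 0.950, 0.050],              # blind
        [0.000, 0.000, 0.000, 1.000],              # dead (absorbing)
    ])
    cohort = np.array([1.0, 0.0, 0.0, 0.0])        # everyone starts without DR
    utils = np.array([0.88, 0.80, 0.50, 0.0])      # QALY weights (assumed)
    cost_state = np.array([0.0, 300.0, 2000.0, 0.0])  # placeholder cost units
    screen_cost = 30.0 * p_detect                  # crude program-cost proxy
    qaly = cost = 0.0
    for _ in range(50):                            # 50-year horizon, 1-year cycles
        qaly += cohort @ utils
        cost += cohort @ cost_state + screen_cost * cohort[:3].sum()
        cohort = cohort @ T
    return qaly, cost

q0, c0 = run(p_detect=0.1)   # opportunistic detection only
q1, c1 = run(p_detect=0.6)   # with a screening program
print(f"ICER = {(c1 - c0) / (q1 - q0):.0f} cost units per QALY")
```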

  4. Discrete Element Method Modeling of Bedload Transport: Towards a physics-based link between bed surface variability and particle entrainment statistics

    NASA Astrophysics Data System (ADS)

    Ghasemi, A.; Borhani, S.; Viparelli, E.; Hill, K. M.

    2017-12-01

    The Exner equation provides a formal mathematical link between sediment transport and bed morphology. It is typically represented in a discrete formulation in which there is a sharp geometric interface between the bedload layer and the bed, below which no particles are entrained. For models with high temporal and spatial resolution this is strictly correct, but it is typically applied in such a way that spatial and temporal fluctuations in the bed surface (bedforms and otherwise) are not captured. This limits the extent to which the exchange between particles in transport and the sediment bed is properly represented, which is particularly problematic for mixed grain size distributions that exhibit segregation. Nearly two decades ago, Parker (2000) provided a framework for a solution to this dilemma in the form of a probabilistic Exner equation, partially validated experimentally by Wong et al. (2007). We present a computational study designed to develop a physics-based framework for understanding the interplay between physical parameters of the bed and flow and parameters in the Parker (2000) probabilistic formulation. To do so we use Discrete Element Method simulations to relate local time-varying parameters to long-term macroscopic parameters. These include relating the local grain size distribution and particle entrainment and deposition rates to the long-term average bed shear stress and the standard deviation of bed height variations. While relatively simple, these simulations reproduce long-accepted, empirically determined transport behaviors such as the Meyer-Peter and Müller (1948) relationship. We also find that these simulations reproduce statistical relationships proposed by Wong et al. (2007), such as a Gaussian distribution of bed heights whose standard deviation increases with increasing bed shear stress. We demonstrate how the ensuing probabilistic formulations provide insight into the transport and deposition of both narrow and wide grain size distributions.
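
    The probabilistic-Exner idea can be illustrated with a short calculation: if instantaneous bed elevations are Gaussian about the mean bed, the probability that a grain at elevation z is exposed, and hence available for entrainment, is the Gaussian CDF evaluated at z. The parameters below are invented, and the assumed growth of the bed-height standard deviation with Shields stress only mimics the trend reported above.

```python
import numpy as np
from scipy.stats import norm

# Sketch of the probabilistic-Exner idea: with bed elevations Gaussian about
# the mean bed, the probability that a grain resting at elevation z is exposed
# (and hence entrainable) equals the probability that the instantaneous bed
# surface sits below z. All parameter values are illustrative only.

eta_mean = 0.0                             # mean bed elevation (m)
tau_star = np.array([0.06, 0.09, 0.12])    # Shields stresses to compare

for ts in tau_star:
    # Assumed: bed-height standard deviation grows with bed shear stress,
    # qualitatively matching the Wong et al. (2007) observations.
    sigma = 0.002 + 0.05 * ts
    z = np.linspace(eta_mean - 3 * sigma, eta_mean + 3 * sigma, 7)
    p_exposed = norm.cdf(z, loc=eta_mean, scale=sigma)
    print(f"tau* = {ts:.2f}, sigma = {sigma * 1000:.1f} mm:",
          np.round(p_exposed, 3))
```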

  5. A Parallel Particle Swarm Optimization Algorithm Accelerated by Asynchronous Evaluations

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Sobieszczanski-Sobieski, Jaroslaw

    2005-01-01

    A parallel Particle Swarm Optimization (PSO) algorithm is presented. Particle swarm optimization is a fairly recent addition to the family of non-gradient-based, probabilistic search algorithms that is based on a simplified social model and is closely tied to swarming theory. Although PSO algorithms present several attractive properties to the designer, they are plagued by high computational cost as measured by elapsed time. One approach to reducing the elapsed time is to make use of coarse-grained parallelization to evaluate the design points. Previous parallel PSO algorithms were mostly implemented in a synchronous manner, where all design points within a design iteration are evaluated before the next iteration is started. This approach leads to poor parallel speedup in cases where a heterogeneous parallel environment is used and/or where the analysis time depends on the design point being analyzed. This paper introduces an asynchronous parallel PSO algorithm that greatly improves the parallel efficiency. The asynchronous algorithm is benchmarked on a cluster assembled from Apple Macintosh G5 desktop computers, using the multi-disciplinary optimization of a typical transport aircraft wing as an example.
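
    The following serial sketch mimics the asynchronous update pattern: each particle's velocity, position, and the global best are refreshed immediately after that particle's evaluation, with no per-iteration barrier. The hyperparameters and toy objective are generic defaults rather than the paper's settings, and a real implementation would farm the evaluation call out to worker processors.

```python
import numpy as np

# Minimal serial sketch of the asynchronous PSO update pattern: particles
# are moved and the global best is refreshed immediately after each
# evaluation, with no per-iteration barrier. Generic defaults throughout.

rng = np.random.default_rng(1)

def sphere(x):                      # stand-in objective (cheap to evaluate)
    return float(np.sum(x * x))

dim, n_particles, iters = 5, 20, 200
w, c1, c2 = 0.7, 1.5, 1.5           # inertia and acceleration coefficients

x = rng.uniform(-5, 5, (n_particles, dim))
v = np.zeros_like(x)
pbest_x = x.copy()
pbest_f = np.array([sphere(p) for p in x])
g = int(np.argmin(pbest_f))         # index of current global best

for _ in range(iters):
    for i in range(n_particles):    # no barrier: gbest may change mid-sweep
        r1, r2 = rng.random(dim), rng.random(dim)
        v[i] = (w * v[i] + c1 * r1 * (pbest_x[i] - x[i])
                         + c2 * r2 * (pbest_x[g] - x[i]))
        x[i] = x[i] + v[i]
        f = sphere(x[i])            # in parallel, this call is farmed out
        if f < pbest_f[i]:
            pbest_x[i], pbest_f[i] = x[i].copy(), f
            if f < pbest_f[g]:
                g = i               # asynchronous: adopt the new best at once

print("best f:", pbest_f[g], "at", np.round(pbest_x[g], 4))
```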

  6. Characterizing the Benefits of Seismic Isolation for Nuclear Structures: A Framework for Risk-Based Decision Making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolisetti, Chandrakanth; Yu, Chingching; Coleman, Justin

    This report provides a framework for assessing the benefits of seismic isolation and exercises the framework on a Generic Department of Energy Nuclear Facility (GDNF). These benefits are (1) a reduction in the risk of unacceptable seismic performance, with a dramatic reduction in the probability of unacceptable performance at beyond-design-basis shaking, and (2) a reduction in capital cost at sites with moderate to high seismic hazard. The framework includes probabilistic risk assessment and estimates of overnight capital cost for the GDNF.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, T; Ruan, D

    Purpose: The growing size and heterogeneity of training atlases necessitate sophisticated schemes to identify only the most relevant atlases for a specific multi-atlas-based image segmentation problem. This study aims to develop a model to infer the inaccessible oracle geometric relevance metric from surrogate image similarity metrics and, based on such a model, provide guidance for atlas selection in multi-atlas-based image segmentation. Methods: We relate the oracle geometric relevance metric in label space to the surrogate metric in image space by a monotonically non-decreasing function with additive random perturbations. Subsequently, a surrogate's ability to prognosticate the oracle order for atlas subset selection is quantified probabilistically. Finally, important insights and guidance are provided for the design of the fusion set size, balancing the competing demands to include the most relevant atlases and to exclude the most irrelevant ones. A systematic solution is derived based on an optimization framework. Model verification and performance assessment are performed on clinical prostate MR images. Results: The proposed surrogate model was exemplified by a linear map with normally distributed perturbation and verified with several commonly used surrogates, including MSD, NCC and (N)MI. The derived behaviors of the different surrogates in atlas selection and their corresponding performance in the ultimate label estimate were validated. The performance of NCC and (N)MI was similarly superior to MSD, with a 10% higher atlas selection probability and a segmentation performance increase in DSC of 0.10, with first and third quartiles of (0.83, 0.89) compared to (0.81, 0.89). The derived optimal fusion set size, valued at 7/8/8/7 for MSD/NCC/MI/NMI, agreed well with the appropriate range [4, 9] from empirical observation. Conclusion: This work has developed an efficacious probabilistic model to characterize image-based surrogate metrics for atlas selection. Analytical insights lead to valid guiding principles on fusion set size design.

  8. Trait-Dependent Biogeography: (Re)Integrating Biology into Probabilistic Historical Biogeographical Models.

    PubMed

    Sukumaran, Jeet; Knowles, L Lacey

    2018-06-01

    The development of process-based probabilistic models for historical biogeography has transformed the field by grounding it in modern statistical hypothesis testing. However, most of these models abstract away biological differences, reducing species to interchangeable lineages. We present here the case for reintegration of biology into probabilistic historical biogeographical models, allowing a broader range of questions about biogeographical processes beyond ancestral range estimation or simple correlation between a trait and a distribution pattern, as well as allowing us to assess how inferences about ancestral ranges themselves might be impacted by differential biological traits. We show how new approaches to inference might cope with the computational challenges resulting from the increased complexity of these trait-based historical biogeographical models. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Distinct Roles of Dopamine and Subthalamic Nucleus in Learning and Probabilistic Decision Making

    ERIC Educational Resources Information Center

    Coulthard, Elizabeth J.; Bogacz, Rafal; Javed, Shazia; Mooney, Lucy K.; Murphy, Gillian; Keeley, Sophie; Whone, Alan L.

    2012-01-01

    Even simple behaviour requires us to make decisions based on combining multiple pieces of learned and new information. Making such decisions requires both learning the optimal response to each given stimulus as well as combining probabilistic information from multiple stimuli before selecting a response. Computational theories of decision making…

  10. Probabilistic and structural reliability analysis of laminated composite structures based on the IPACS code

    NASA Technical Reports Server (NTRS)

    Sobel, Larry; Buttitta, Claudio; Suarez, James

    1993-01-01

    Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched, 1M6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.

  11. A unified probabilistic approach to improve spelling in an event-related potential-based brain-computer interface.

    PubMed

    Kindermans, Pieter-Jan; Verschore, Hannes; Schrauwen, Benjamin

    2013-10-01

    In recent years, in an attempt to maximize performance, machine learning approaches for event-related potential (ERP) spelling have become more and more complex. In this paper, we have taken a step back: we wanted to improve performance without building an overly complex model that cannot be used by the community. Our research resulted in a unified probabilistic model for ERP spelling, which is based on only three assumptions and incorporates language information. On top of that, the probabilistic nature of our classifier yields a natural dynamic stopping strategy. Furthermore, our method uses the same parameters across 25 subjects from three different datasets. We show that our classifier, when enhanced with language models and dynamic stopping, improves the spelling speed and accuracy drastically. Additionally, we would like to point out that as our model is entirely probabilistic, it can easily be used as the foundation for complex systems in future work. All our experiments are executed on publicly available datasets to allow for future comparison with similar techniques.

  12. Renewable energy in electric utility capacity planning: a decomposition approach with application to a Mexican utility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staschus, K.

    1985-01-01

    In this dissertation, efficient algorithms for electric-utility capacity expansion planning with renewable energy are developed. The algorithms include a deterministic phase that quickly finds a near-optimal expansion plan using derating and a linearized approximation to the time-dependent availability of nondispatchable energy sources. A probabilistic second phase needs comparatively few computer-time-consuming probabilistic simulation iterations to modify this solution towards the optimal expansion plan. For the deterministic first phase, two algorithms, based on a Lagrangian Dual decomposition and a Generalized Benders Decomposition, are developed. The probabilistic second phase uses a Generalized Benders Decomposition approach. Extensive computational tests of the algorithms are reported. Among the deterministic algorithms, the one based on Lagrangian Duality proves fastest. The two-phase approach is shown to save up to 80% in computing time as compared to a purely probabilistic algorithm. The algorithms are applied to determine the optimal expansion plan for the Tijuana-Mexicali subsystem of the Mexican electric utility system. A strong recommendation to push conservation programs in the desert city of Mexicali results from this implementation.

  13. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of the lifetime of an engine and components thereof. Developed to fill the need for a more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of the levels of risk created by engineering decisions in designing the system. Implements a mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate the effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  14. Probabilistic and Possibilistic Analyses of the Strength of a Bonded Joint

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Krishnamurthy, T.; Smith, Steven A.

    2001-01-01

    The effects of uncertainties on the strength of a single lap shear joint are explained. Probabilistic and possibilistic methods are used to account for uncertainties. Linear and geometrically nonlinear finite element analyses are used in the studies. To evaluate the strength of the joint, fracture in the adhesive and material strength failure in the strap are considered. The study shows that linear analyses yield conservative predictions for failure loads. The possibilistic approach for treating uncertainties appears to be viable for preliminary design, but with several qualifications.

  15. Essays on variational approximation techniques for stochastic optimization problems

    NASA Astrophysics Data System (ADS)

    Deride Silva, Julio A.

    This dissertation presents five essays on approximation and modeling techniques, based on variational analysis, applied to stochastic optimization problems. It is divided into two parts, where the first is devoted to equilibrium problems and maxinf optimization, and the second corresponds to two essays in statistics and uncertainty modeling. Stochastic optimization lies at the core of this research as we were interested in relevant equilibrium applications that contain an uncertain component, and the design of a solution strategy. In addition, every stochastic optimization problem relies heavily on the underlying probability distribution that models the uncertainty. We studied these distributions, in particular, their design process and theoretical properties such as their convergence. Finally, the last aspect of stochastic optimization that we covered is the scenario creation problem, in which we described a procedure based on a probabilistic model to create scenarios for the applied problem of power estimation of renewable energies. In the first part, Equilibrium problems and maxinf optimization, we considered three Walrasian equilibrium problems: from economics, we studied a stochastic general equilibrium problem in a pure exchange economy, described in Chapter 3, and a stochastic general equilibrium with financial contracts, in Chapter 4; finally from engineering, we studied an infrastructure planning problem in Chapter 5. We stated these problems as belonging to the maxinf optimization class and, in each instance, we provided an approximation scheme based on the notion of lopsided convergence and non-concave duality. This strategy is the foundation of the augmented Walrasian algorithm, whose convergence is guaranteed by lopsided convergence, that was implemented computationally, obtaining numerical results for relevant examples. The second part, Essays about statistics and uncertainty modeling, contains two essays covering a convergence problem for a sequence of estimators, and a problem for creating probabilistic scenarios on renewable energies estimation. In Chapter 7 we re-visited one of the "folk theorems" in statistics, where a family of Bayes estimators under 0-1 loss functions is claimed to converge to the maximum a posteriori estimator. This assertion is studied under the scope of the hypo-convergence theory, and the density functions are included in the class of upper semicontinuous functions. We conclude this chapter with an example in which the convergence does not hold true, and we provided sufficient conditions that guarantee convergence. The last chapter, Chapter 8, addresses the important topic of creating probabilistic scenarios for solar power generation. Scenarios are a fundamental input for the stochastic optimization problem of energy dispatch, especially when incorporating renewables. We proposed a model designed to capture the constraints induced by physical characteristics of the variables based on the application of an epi-spline density estimation along with a copula estimation, in order to account for partial correlations between variables.

  16. Robust Observation Detection for Single Object Tracking: Deterministic and Probabilistic Patch-Based Approaches

    PubMed Central

    Zulkifley, Mohd Asyraf; Rawlinson, David; Moran, Bill

    2012-01-01

    In video analytics, robust observation detection is very important as the content of the videos varies a lot, especially for tracking implementations. In contrast to the image processing field, the problems of blurring, moderate deformation, low-illumination surroundings, illumination change and homogeneous texture are normally encountered in video analytics. Patch-Based Observation Detection (PBOD) is developed to improve detection robustness in complex scenes by fusing both feature- and template-based recognition methods. While feature-based detectors are more distinctive, matching between frames is best achieved by a collection of points, as in template-based detectors. Two methods of PBOD, the deterministic and probabilistic approaches, have been tested to find the best mode of detection. Both algorithms start by building comparison vectors at each detected point of interest. The vectors are matched to build candidate patches based on their respective coordinates. For the deterministic method, patch matching is done in a two-level test where threshold-based position and size smoothing are applied to the patch with the highest correlation value. For the second approach, patch matching is done probabilistically by modelling the histograms of the patches with Poisson distributions for both the RGB and HSV colour models. Then, maximum likelihood is applied for position smoothing while a Bayesian approach is applied for size smoothing. The results showed that probabilistic PBOD outperforms the deterministic approach, with an average distance error of 10.03% compared with 21.03%. Due to its heavy processing requirements, this algorithm is best implemented as a complement to other, simpler detection methods. PMID:23202226

  17. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritterbusch, Stanley; Golay, Michael; Duran, Felicia

    2003-01-29

    This report summarizes the methods proposed for risk-informing the design and regulation of future nuclear power plants. All elements of the historical design and regulation process are preserved, but the methods proposed for new plants use probabilistic risk assessment as the primary decision-making tool.

  18. Application of Deterministic and Probabilistic System Design Methods and Enhancements of Conceptual Design Tools for ERA Project

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.; Schutte, Jeff S.

    2016-01-01

    This report documents work done by the Aerospace Systems Design Lab (ASDL) at the Georgia Institute of Technology, Daniel Guggenheim School of Aerospace Engineering for the National Aeronautics and Space Administration, Aeronautics Research Mission Directorate, Integrated System Research Program, Environmentally Responsible Aviation (ERA) Project. This report was prepared under contract NNL12AA12C, "Application of Deterministic and Probabilistic System Design Methods and Enhancement of Conceptual Design Tools for ERA Project". The research within this report addressed the Environmentally Responsible Aviation (ERA) project goal stated in the NRA solicitation "to advance vehicle concepts and technologies that can simultaneously reduce fuel burn, noise, and emissions." Identifying technology and vehicle solutions that simultaneously meet these three metrics requires the use of system-level analysis with the appropriate level of fidelity to quantify feasibility, benefits and degradations, and associated risk. In order to perform the system-level analysis, the Environmental Design Space (EDS) [Kirby 2008, Schutte 2012a] environment developed by ASDL was used to model both conventional and unconventional configurations as well as to assess technologies from the ERA and N+2 timeframe portfolios. A well-established system design approach was used to perform aircraft conceptual design studies, including technology trade studies, to identify technology portfolios capable of accomplishing the ERA project goal and to obtain accurate tradeoffs between performance, noise, and emissions. The ERA goal, shown in Figure 1, is to simultaneously achieve the N+2 benefits of a cumulative noise margin of 42 EPNdB relative to Stage 4, a 75 percent reduction in LTO NOx emissions relative to CAEP 6, and a 50 percent reduction in fuel burn relative to the 2005 best-in-class aircraft. There were five research tasks associated with this research: (1) identify technology collectors, (2) model technology collectors in EDS, (3) model and assess ERA technologies, (4) predict LTO and cruise emissions, and (5) perform probabilistic analysis of technology collectors and portfolios.

  19. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
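
    A plain Monte Carlo version of the probability-of-instability computation is sketched below for a hypothetical 2-DOF system M q'' + C q' + K q = 0 with a circulatory (follower-force-like) stiffness coupling: sample the uncertain parameters, assemble the first-order state matrix, and count samples with eigenvalues in the right half-plane. The paper's fast probability integration and adaptive importance sampling are far more efficient; plain sampling is used here only for clarity.

```python
import numpy as np

# Monte Carlo sketch of a probability-of-instability computation for
# M q'' + C q' + K q = 0. The 2-DOF numbers are hypothetical, and plain
# sampling stands in for the paper's fast probability integration.

rng = np.random.default_rng(42)
n_samples, n_unstable = 20_000, 0

M = np.eye(2)
C = 0.4 * np.eye(2)                       # light symmetric damping
for _ in range(n_samples):
    k = rng.normal(100.0, 10.0)           # uncertain stiffness
    kappa = rng.normal(80.0, 30.0)        # circulatory (follower-force) coupling
    K = np.array([[2 * k, -k + kappa],
                  [-k - kappa, 2 * k]])   # nonsymmetric -> flutter possible
    # State matrix A for z' = A z with z = (q, q').
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
    if np.linalg.eigvals(A).real.max() > 1e-9:
        n_unstable += 1

print(f"estimated P(instability) ~= {n_unstable / n_samples:.3f}")
```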

  20. Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.

    PubMed

    Carriger, John F; Barron, Mace G; Newman, Michael C

    2016-12-20

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.

  1. Impact of refining the assessment of dietary exposure to cadmium in the European adult population.

    PubMed

    Ferrari, Pietro; Arcella, Davide; Heraud, Fanny; Cappé, Stefano; Fabiansson, Stefan

    2013-01-01

    Exposure assessment constitutes an important step in any risk assessment of potentially harmful substances present in food. The European Food Safety Authority (EFSA) first assessed dietary exposure to cadmium in Europe using a deterministic framework, resulting in mean values of exposure in the range of health-based guidance values. Since then, the characterisation of foods has been refined to better match occurrence and consumption data, and a new strategy to handle left-censoring in occurrence data was devised. A probabilistic assessment was performed and compared with deterministic estimates, using occurrence values at the European level and consumption data from 14 national dietary surveys. Mean estimates in the probabilistic assessment ranged from 1.38 (95% CI = 1.35-1.44) to 2.08 (1.99-2.23) µg kg⁻¹ bodyweight (bw) week⁻¹ across the different surveys, which were less than 10% lower than the deterministic (middle-bound) mean values that ranged from 1.50 to 2.20 µg kg⁻¹ bw week⁻¹. Probabilistic 95th percentile estimates of dietary exposure ranged from 2.65 (2.57-2.72) to 4.99 (4.62-5.38) µg kg⁻¹ bw week⁻¹, which were, with the exception of one survey, between 3% and 17% higher than the middle-bound deterministic estimates. Overall, the proportion of subjects exceeding the tolerable weekly intake of 2.5 µg kg⁻¹ bw ranged from 14.8% (13.6-16.0%) to 31.2% (29.7-32.5%) according to the probabilistic assessment. The results of this work indicate that mean values of dietary exposure to cadmium in the European population were of similar magnitude using deterministic or probabilistic assessments. For higher exposure levels, probabilistic estimates were almost consistently larger than their deterministic counterparts, reflecting the impact of using the full distribution of occurrence values to determine exposure levels. It is considered prudent to use probabilistic methodology should exposure estimates be close to or exceed health-based guidance values.
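
    The probabilistic side of such an assessment reduces to propagating full distributions instead of point values. The sketch below simulates weekly intake from invented consumption, occurrence, and bodyweight distributions and reports the fraction exceeding the tolerable weekly intake; the EFSA assessment used survey-specific consumption data and measured occurrence values rather than these placeholders.

```python
import numpy as np

# Probabilistic exposure sketch: weekly cadmium intake per kg bodyweight is
# simulated by combining consumption, occurrence and bodyweight distributions.
# All distribution parameters are invented for illustration.

rng = np.random.default_rng(7)
n = 100_000

consumption = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)   # kg food/week
occurrence = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n)   # mg Cd/kg food
bodyweight = rng.normal(70.0, 12.0, size=n).clip(min=35.0)         # kg bw

intake = 1000.0 * consumption * occurrence / bodyweight  # ug/kg bw/week

twi = 2.5  # tolerable weekly intake, ug/kg bw
print(f"mean intake : {intake.mean():.2f} ug/kg bw/week")
print(f"95th pctile : {np.percentile(intake, 95):.2f} ug/kg bw/week")
print(f"P(> TWI)    : {(intake > twi).mean():.1%}")
```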

  2. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios that best represent the probabilistic earthquake hazard in a given region. The method is based on the simulation of a large dataset of potential earthquakes representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between the hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then performed multiple times with various input data, taking into account the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology can substantially shorten the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes, yet requires far less computation power. The authors have used this approach for risk assessment towards identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
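
    A toy version of the scenario-reduction MILP is sketched below using the PuLP modeling library: binary variables select K of N synthetic scenarios, and continuous weights are fit so the reduced hazard curve tracks the full one in total absolute error. The synthetic rates and ground motions are random stand-ins for a real catalogue, and the formulation is a plausible simplification of, not a reproduction of, the authors' program.

```python
import numpy as np
import pulp  # pip install pulp

# Toy scenario-reduction MILP: pick K of N synthetic earthquakes and reweight
# them so the reduced hazard curve tracks the full one. Data are random
# stand-ins; a real model starts from the regional seismotectonic catalogue.

rng = np.random.default_rng(3)
N, K = 60, 8                                # full and reduced scenario counts
levels = np.linspace(0.05, 0.5, 10)         # ground-motion levels (g)

rate = rng.uniform(1e-4, 1e-2, N)           # annual rate of each scenario
pga = rng.lognormal(np.log(0.15), 0.5, N)   # ground motion it produces
exceed = (pga[:, None] >= levels[None, :]).astype(float)
H_full = rate @ exceed                      # full-set hazard curve

prob = pulp.LpProblem("scenario_reduction", pulp.LpMinimize)
y = [pulp.LpVariable(f"y{i}", cat="Binary") for i in range(N)]
w = [pulp.LpVariable(f"w{i}", lowBound=0) for i in range(N)]
d = [pulp.LpVariable(f"d{j}", lowBound=0) for j in range(len(levels))]

prob += pulp.lpSum(d)                       # total absolute curve error
prob += pulp.lpSum(y) == K                  # cardinality of the reduced set
big_m = float(rate.sum())
for i in range(N):
    prob += w[i] <= big_m * y[i]            # weight only selected scenarios
for j in range(len(levels)):
    fit_j = pulp.lpSum(w[i] * exceed[i, j] for i in range(N))
    prob += d[j] >= fit_j - H_full[j]       # d_j >= |fit_j - H_j|
    prob += d[j] >= H_full[j] - fit_j

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [i for i in range(N) if y[i].value() > 0.5]
print("selected scenarios:", chosen, " error:", pulp.value(prob.objective))
```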

  3. Using a probabilistic approach in an ecological risk assessment simulation tool: test case for depleted uranium (DU).

    PubMed

    Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A

    2005-06-01

    A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations to assess parameter and risk probabilistic distributions. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproductive effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through a reduction in the size and weight of offspring. Based on the available DU data for the firing range at APG, DU uptake will not likely affect the survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rats, white-throated woodrats, deer, and milfoil, the body burden concentrations observed fall within the distributions simulated at both sites.

  4. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information exist in site characterization: linguistic, probabilistic, and possibilistic. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables according to its nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.

  5. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
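
    The two- and three-parameter Weibull forms mentioned above differ only in the threshold stress. A minimal sketch, with illustrative rather than measured parameters:

```python
import numpy as np

# Sketch of the Weibull strength model used in brittle-ceramic reliability:
# failure probability under a uniaxial stress sigma with the three-parameter
# Weibull form. Parameter values are illustrative, not material data.

def weibull_pf(sigma, m, sigma_0, sigma_u=0.0):
    """P(failure) = 1 - exp(-((sigma - sigma_u) / sigma_0)**m) above threshold."""
    s = np.maximum(np.asarray(sigma, dtype=float) - sigma_u, 0.0)
    return 1.0 - np.exp(-((s / sigma_0) ** m))

stresses = np.array([150.0, 250.0, 350.0])  # MPa
# Setting the threshold sigma_u = 0 recovers the common two-parameter form.
for su in (0.0, 100.0):
    pf = weibull_pf(stresses, m=10.0, sigma_0=300.0, sigma_u=su)
    print(f"sigma_u = {su:5.1f} MPa -> Pf = {np.round(pf, 4)}")
```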

  6. Probabilistic modeling of discourse-aware sentence processing.

    PubMed

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  7. Evaluating the Use of Existing Data Sources, Probabilistic Linkage, and Multiple Imputation to Build Population-based Injury Databases Across Phases of Trauma Care

    PubMed Central

    Newgard, Craig; Malveau, Susan; Staudenmayer, Kristan; Wang, N. Ewen; Hsia, Renee Y.; Mann, N. Clay; Holmes, James F.; Kuppermann, Nathan; Haukoos, Jason S.; Bulger, Eileen M.; Dai, Mengtao; Cook, Lawrence J.

    2012-01-01

    Objectives The objective was to evaluate the process of using existing data sources, probabilistic linkage, and multiple imputation to create large population-based injury databases matched to outcomes. Methods This was a retrospective cohort study of injured children and adults transported by 94 emergency medical systems (EMS) agencies to 122 hospitals in seven regions of the western United States over a 36-month period (2006 to 2008). All injured patients evaluated by EMS personnel within specific geographic catchment areas were included, regardless of field disposition or outcome. The authors performed probabilistic linkage of EMS records to four hospital and postdischarge data sources (emergency department [ED] data, patient discharge data, trauma registries, and vital statistics files) and then handled missing values using multiple imputation. The authors compare and evaluate matched records, match rates (proportion of matches among eligible patients), and injury outcomes within and across sites. Results There were 381,719 injured patients evaluated by EMS personnel in the seven regions. Among transported patients, match rates ranged from 14.9% to 87.5% and were directly affected by the availability of hospital data sources and proportion of missing values for key linkage variables. For vital statistics records (1-year mortality), estimated match rates ranged from 88.0% to 98.7%. Use of multiple imputation (compared to complete case analysis) reduced bias for injury outcomes, although sample size, percentage missing, type of variable, and combined-site versus single-site imputation models all affected the resulting estimates and variance. Conclusions This project demonstrates the feasibility and describes the process of constructing population-based injury databases across multiple phases of care using existing data sources and commonly available analytic methods. Attention to key linkage variables and decisions for handling missing values can be used to increase match rates between data sources, minimize bias, and preserve sampling design. PMID:22506952

  8. The Stag Hunt Game: An Example of an Excel-Based Probabilistic Game

    ERIC Educational Resources Information Center

    Bridge, Dave

    2016-01-01

    With so many role-playing simulations already in the political science education literature, the recent repeated calls for new games is both timely and appropriate. This article answers and extends those calls by advocating the creation of probabilistic games using Microsoft Excel. I introduce the example of the Stag Hunt Game--a short, effective,…

  9. Probability versus Representativeness in Infancy: Can Infants Use Naïve Physics to Adjust Population Base Rates in Probabilistic Inference?

    ERIC Educational Resources Information Center

    Denison, Stephanie; Trikutam, Pallavi; Xu, Fei

    2014-01-01

    A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar…

  10. Decomposing biodiversity data using the Latent Dirichlet Allocation model, a probabilistic multivariate statistical method

    Treesearch

    Denis Valle; Benjamin Baiser; Christopher W. Woodall; Robin Chazdon; Jerome Chave

    2014-01-01

    We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates...
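
    For readers who want to try the approach, a minimal sketch using scikit-learn's LDA implementation on a random site-by-species count matrix is given below; a real analysis would substitute abundance data such as forest-inventory plot counts.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Sketch of decomposing assemblages into component communities with LDA.
# The site-by-species count matrix is random filler for illustration.

rng = np.random.default_rng(11)
n_sites, n_species, n_communities = 40, 25, 3
counts = rng.poisson(lam=2.0, size=(n_sites, n_species))

lda = LatentDirichletAllocation(n_components=n_communities, random_state=0)
site_mix = lda.fit_transform(counts)          # site x community proportions
site_mix = site_mix / site_mix.sum(axis=1, keepdims=True)

print("site 0 community mixture:", np.round(site_mix[0], 3))
# Species composition of each latent component community:
comm = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
print("community 0 top species:", np.argsort(comm[0])[::-1][:5])
```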

  11. Dynamic Stability of Uncertain Laminated Beams Under Subtangential Loads

    NASA Technical Reports Server (NTRS)

    Goyal, Vijay K.; Kapania, Rakesh K.; Adelman, Howard (Technical Monitor); Horta, Lucas (Technical Monitor)

    2002-01-01

    Because of the inherent complexity of fiber-reinforced laminated composites, it can be challenging to manufacture composite structures according to their exact design specifications, resulting in unwanted material and geometric uncertainties. In this research, we focus on the deterministic and probabilistic stability analysis of laminated structures subject to subtangential loading, a combination of conservative and nonconservative tangential loads, using the dynamic criterion. Thus, a shear-deformable laminated beam element, including warping effects, is derived to study the deterministic and probabilistic response of laminated beams. This twenty-one-degree-of-freedom element can be used for solving both static and dynamic problems. In the first-order shear-deformable model used here, we have employed a more accurate method to obtain the transverse shear correction factor. The dynamic version of the principle of virtual work for laminated composites is expressed in its nondimensional form, and the element tangent stiffness and mass matrices are obtained using analytical integration. The stability is studied by giving the structure a small disturbance about an equilibrium configuration and observing whether the resulting response remains small. In order to study the dynamic behavior with uncertainties included in the problem, three models were developed: exact Monte Carlo simulation, sensitivity-based Monte Carlo simulation, and probabilistic FEA. These methods were integrated into the developed finite element analysis. Perturbation and sensitivity analyses have also been used to study nonconservative problems, as well as the stability analysis, using the dynamic criterion.

  12. Torsional Ultrasound Sensor Optimization for Soft Tissue Characterization

    PubMed Central

    Melchor, Juan; Muñoz, Rafael; Rus, Guillermo

    2017-01-01

    Torsional mechanical waves have the capability to characterize the shear stiffness moduli of soft tissue. Under this hypothesis, a computational methodology is proposed to design and optimize a piezoelectric-based transmitter and receiver to generate and measure the response of torsional ultrasonic waves. The procedure is divided into two steps: (i) a finite element method (FEM) is developed to obtain the transmitted and received waveforms as well as the resonance frequency of a prior geometry, validated against a semi-analytical simplified model, and (ii) a probabilistic optimality criterion for the design, based on an inverse problem estimating the robust probability of detection (RPOD), is used to maximize the detection of the pathology, defined in terms of changes in shear stiffness. This study collects different design options in two separate models, for transmission and contact, respectively. The main contribution of this work is a framework establishing the forward, inverse and optimization procedures needed to choose an appropriate set of transducer parameters. This methodological framework may be generalizable to other applications. PMID:28617353

  13. Stability metrics for multi-source biomedical data based on simplicial projections from probability distribution distances.

    PubMed

    Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M

    2017-02-01

    Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions and defined in the context of data quality assessment: a global probabilistic deviation metric and a source probabilistic outlyingness metric. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of the normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources' PDFs. The metrics have been evaluated and demonstrated correct behaviour on a simulated benchmark and on real multi-source biomedical data using the UCI Heart Disease data set. Biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
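
    The building block behind both metrics is the matrix of Jensen-Shannon distances among the source PDFs. The sketch below computes that matrix for three simulated sources and summarizes it with crude spread and outlyingness proxies; the paper's actual metrics come from a simplex projection of this distance matrix, which is not reproduced here.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

# Pairwise Jensen-Shannon distances between the PDFs of several data
# sources, with crude summary proxies. The simulated sources are invented;
# the paper derives its metrics from a simplex projection of this matrix.

rng = np.random.default_rng(5)
bins = np.linspace(-4, 4, 31)

# Three simulated sources; source 2 is deliberately biased.
samples = [rng.normal(0.0, 1.0, 2000),
           rng.normal(0.1, 1.1, 2000),
           rng.normal(1.5, 1.0, 2000)]
pdfs = [np.histogram(s, bins=bins, density=True)[0] + 1e-12 for s in samples]
pdfs = [p / p.sum() for p in pdfs]

n = len(pdfs)
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        D[i, j] = jensenshannon(pdfs[i], pdfs[j], base=2)

print("JS distance matrix:\n", np.round(D, 3))
print("mean pairwise distance (global variability proxy):",
      round(D[np.triu_indices(n, 1)].mean(), 3))
print("per-source mean distance (outlyingness proxy):",
      np.round(D.sum(axis=1) / (n - 1), 3))
```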

  14. A New Approach to Extreme Value Estimation Applicable to a Wide Variety of Random Variables

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.

    1997-01-01

    Designing reliable structures requires an estimate of the maximum and minimum values (i.e., strength and load) that may be encountered in service. Yet designs based on very extreme values (to ensure safety) can result in extra material usage and, hence, uneconomic systems. In aerospace applications, severe over-design cannot be tolerated, making it almost mandatory to design closer to the assumed limits of the design random variables. The issue then is predicting extreme values that are practical, i.e., neither too conservative nor non-conservative. Obtaining design values by employing safety factors is well known to often result in overly conservative designs. Safety factor values have historically been selected rather arbitrarily, often lacking a sound rational basis. The question of how safe a design needs to be has led design theorists to probabilistic and statistical methods. The so-called three-sigma approach is one such method and has been described as the first step in utilizing information about the data dispersion. However, this method is based on the assumption that the random variable is dispersed symmetrically about the mean and is essentially limited to normally distributed random variables. Use of this method can therefore result in unsafe or overly conservative design allowables if the common assumption of normality is incorrect.
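
    The failure mode of the three-sigma rule is easy to demonstrate numerically: for a skewed variable, mean plus three standard deviations does not carry the nominal 0.135% exceedance probability. The sketch below uses an arbitrary lognormal load for illustration.

```python
import numpy as np
from scipy.stats import lognorm, norm

# Why the three-sigma rule can mislead for skewed variables: compare the
# mean + 3*sigma "maximum" with the quantile carrying the same nominal
# exceedance probability (~0.135%). Parameters are arbitrary illustrations.

shape, scale = 0.6, 100.0            # lognormal sigma and exp(mu)
dist = lognorm(s=shape, scale=scale)

mu, sd = dist.mean(), dist.std()
three_sigma_val = mu + 3.0 * sd
p_target = 1.0 - norm.cdf(3.0)       # exceedance prob. implied by 3-sigma
true_quantile = dist.ppf(1.0 - p_target)

print(f"mean + 3 sigma        : {three_sigma_val:8.1f}")
print(f"true 99.865% quantile : {true_quantile:8.1f}")
print(f"actual P(X > mu+3sd)  : {dist.sf(three_sigma_val):.4%} (nominal 0.1350%)")
```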

  15. Optimization of the kernel functions in a probabilistic neural network analyzing the local pattern distribution.

    PubMed

    Galleske, I; Castellanos, J

    2002-05-01

    This article proposes a procedure for the automatic determination of the elements of the covariance matrix of the Gaussian kernel function of probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, can be calculated by analyzing the local environment of each training pattern. Their combination forms the covariance matrix of each training pattern. This automation has two advantages: first, it frees the neural network designer from specifying the complete covariance matrix, and second, it results in a network with better generalization ability than the original model. A variation of the famous two-spiral problem and real-world examples from the UCI Machine Learning Repository show a classification rate not only better than that of the original probabilistic neural network but also that this model can outperform other well-known classification techniques.
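
    A heavily simplified PNN sketch is given below with one isotropic Gaussian kernel per training pattern, where the paper's per-pattern covariance (rotation matrix plus variances) is reduced to a scalar bandwidth taken from each pattern's nearest-neighbour distance. This illustrates the per-pattern-kernel idea only, not the proposed procedure.

```python
import numpy as np

# Simplified probabilistic neural network (PNN): one isotropic Gaussian
# kernel per training pattern, with a per-pattern bandwidth from the
# nearest-neighbour distance standing in for the paper's full covariance.

rng = np.random.default_rng(2)
X0 = rng.normal([-1, -1], 0.5, (30, 2))   # class 0 training patterns
X1 = rng.normal([+1, +1], 0.5, (30, 2))   # class 1 training patterns

def local_bandwidth(X):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1) + 1e-6            # nearest-neighbour spacing

def class_density(x, X, h):
    diff = X - x
    return np.mean(np.exp(-np.sum(diff**2, axis=1) / (2 * h**2)) / h**2)

h0, h1 = local_bandwidth(X0), local_bandwidth(X1)
for x in (np.array([-1.0, -0.8]), np.array([0.9, 1.2]), np.zeros(2)):
    f0, f1 = class_density(x, X0, h0), class_density(x, X1, h1)
    print(x, "-> class", int(f1 > f0), f"(f0={f0:.3f}, f1={f1:.3f})")
```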

  16. Probabilistic graphlet transfer for photo cropping.

    PubMed

    Zhang, Luming; Song, Mingli; Zhao, Qi; Liu, Xiao; Bu, Jiajun; Chen, Chun

    2013-02-01

    As one of the most basic photo manipulation processes, photo cropping is widely used in the printing, graphic design, and photography industries. In this paper, we introduce graphlets (i.e., small connected subgraphs) to represent a photo's aesthetic features, and propose a probabilistic model to transfer aesthetic features from the training photo onto the cropped photo. In particular, by segmenting each photo into a set of regions, we construct a region adjacency graph (RAG) to represent the global aesthetic feature of each photo. Graphlets are then extracted from the RAGs, and these graphlets capture the local aesthetic features of the photos. Finally, we cast photo cropping as a candidate-searching procedure on the basis of a probabilistic model, and infer the parameters of the cropped photos using Gibbs sampling. The proposed method is fully automatic. Subjective evaluations have shown that it is preferred over a number of existing approaches.

  17. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries producing significant damage to many population centres in the region. The highest hazard is related to external Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all the three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. Unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into the monetary loss.

  18. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    PubMed

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review risk (cumulative incidence) of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
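
    The core of such probabilistic prognostic reasoning is Bayes' theorem applied to test characteristics. The sketch below converts an assumed sensitivity and specificity into predictive values and likelihood ratios at two hypothetical prevalences, illustrating why the same CHR instrument performs very differently in a specialized clinic and in the general population.

```python
# Bayes' theorem applied to prognostic test characteristics. The sensitivity,
# specificity and prevalences are hypothetical, chosen only to mirror the
# kinds of values discussed for CHR assessments.

def post_test(sens, spec, prevalence):
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / ((1 - sens) * prevalence + spec * (1 - prevalence))
    lr_pos = sens / (1 - spec)      # positive likelihood ratio
    lr_neg = (1 - sens) / spec      # negative likelihood ratio
    return ppv, npv, lr_pos, lr_neg

# Same instrument, two settings: a specialized CHR clinic vs. the general
# population. Low prevalence drags the positive predictive value down.
for prev in (0.20, 0.01):
    ppv, npv, lrp, lrn = post_test(sens=0.90, spec=0.50, prevalence=prev)
    print(f"prevalence {prev:.0%}: PPV={ppv:.2f} NPV={npv:.2f} "
          f"LR+={lrp:.1f} LR-={lrn:.2f}")
```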

  19. Diffusion tensor tractography of the arcuate fasciculus in patients with brain tumors: Comparison between deterministic and probabilistic models

    PubMed Central

    Li, Zhixi; Peck, Kyung K.; Brennan, Nicole P.; Jenabi, Mehrnaz; Hsu, Meier; Zhang, Zhigang; Holodny, Andrei I.; Young, Robert J.

    2014-01-01

    Purpose The purpose of this study was to compare the deterministic and probabilistic tracking methods of diffusion tensor white matter fiber tractography in patients with brain tumors. Materials and Methods We identified 29 patients with left brain tumors <2 cm from the arcuate fasciculus who underwent pre-operative language fMRI and DTI. The arcuate fasciculus was reconstructed using a deterministic Fiber Assignment by Continuous Tracking (FACT) algorithm and a probabilistic method based on an extended Monte Carlo Random Walk algorithm. Tracking was controlled using two ROIs corresponding to Broca's and Wernicke's areas. Tracts in tumor-affected hemispheres were examined for extension between Broca's and Wernicke's areas, anterior-posterior length and volume, and compared with the normal contralateral tracts. Results Probabilistic tracts displayed more complete anterior extension to Broca's area than did FACT tracts on both the tumor-affected and normal sides (p < 0.0001). The median length ratio for tumor:normal sides was greater for probabilistic tracts than FACT tracts (p < 0.0001). The median tract volume ratio for tumor:normal sides was also greater for probabilistic tracts than FACT tracts (p = 0.01). Conclusion Probabilistic tractography reconstructs the arcuate fasciculus more completely and performs better through areas of tumor and/or edema. The FACT algorithm tends to underestimate the anterior-most fibers of the arcuate fasciculus, which are crossed by primary motor fibers. PMID:25328583

  20. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.

    PubMed

    Mørk, Søren; Holmes, Ian

    2012-03-01

    Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two most widely used model structures performs best in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.

  1. A probabilistic NF2 relational algebra for integrated information retrieval and database systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuhr, N.; Roelleke, T.

    The integration of information retrieval (IR) and database systems requires a data model which allows for modelling documents as entities, representing uncertainty and vagueness and performing uncertain inference. For this purpose, we present a probabilistic data model based on relations in non-first-normal-form (NF2). Here, tuples are assigned probabilistic weights giving the probability that a tuple belongs to a relation. Thus, the set of weighted index terms of a document is represented as a probabilistic subrelation. In a similar way, imprecise attribute values are modelled as a set-valued attribute. We redefine the relational operators for this type of relation such that the result of each operator is again a probabilistic NF2 relation, where the weight of a tuple gives the probability that this tuple belongs to the result. By ordering the tuples according to decreasing probabilities, the model yields a ranking of answers as in most IR models. This effect can also be used for typical database queries involving imprecise attribute values as well as for combinations of database and IR queries.
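
    To give a flavour of such an algebra, the sketch below models a probabilistic relation as a mapping from tuples to membership probabilities. The selection, join, and ranking operators, and the independence assumption inside the join, are our simplifications for illustration, not the authors' full NF2 formalism.

    ```python
    # Minimal sketch of weighted relations ranked by decreasing probability.

    def select(relation, predicate):
        """Selection keeps a tuple's probability unchanged."""
        return {t: p for t, p in relation.items() if predicate(t)}

    def join(r, s):
        """Join on the first field; probabilities multiply under an
        assumed independence of tuple events (our simplification)."""
        return {(a, b, c): p * q
                for (a, b), p in r.items()
                for (a2, c), q in s.items() if a == a2}

    def ranked(relation):
        """Rank answers by decreasing probability, as in IR models."""
        return sorted(relation.items(), key=lambda kv: -kv[1])

    # Document-term weights as a probabilistic subrelation (toy data).
    index = {("d1", "retrieval"): 0.9, ("d1", "database"): 0.4,
             ("d2", "retrieval"): 0.6}
    topics = {("d1", "ir"): 1.0, ("d2", "ir"): 1.0}
    print(ranked(join(index, topics)))
    ```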

  2. How Do High School Students Solve Probability Problems? A Mixed Methods Study on Probabilistic Reasoning

    ERIC Educational Resources Information Center

    Heyvaert, Mieke; Deleye, Maarten; Saenen, Lore; Van Dooren, Wim; Onghena, Patrick

    2018-01-01

    When studying a complex research phenomenon, a mixed methods design allows researchers to answer a broader set of research questions and to tap into different aspects of this phenomenon, compared to a monomethod design. This paper reports on how a sequential equal status design (QUAN → QUAL) was used to examine students' reasoning processes when solving…

  3. Relative potentials of concentrating and two-axis tracking flat-plate photovoltaic arrays for central-station applications

    NASA Technical Reports Server (NTRS)

    Borden, C. S.; Schwartz, D. L.

    1984-01-01

    The purpose of this study is to assess the relative economic potentials of concentrating and two-axis tracking flat-plate photovoltaic arrays for central-station applications in the mid-1990s. Specific objectives of this study are to provide information on concentrator photovoltaic collector probabilistic price and efficiency levels to illustrate critical areas of R&D for concentrator cells and collectors, and to compare concentrator and flat-plate PV price and efficiency alternatives for several locations, based on their implied costs of energy. To deal with the uncertainties surrounding research and development activities in general, a probabilistic assessment of commercially achievable concentrator photovoltaic collector efficiencies and prices (at the factory loading dock) is performed. The results of this projection of concentrator photovoltaic technology are then compared with a previous flat-plate module price analysis (performed early in 1983). To focus this analysis on specific collector alternatives and their implied energy costs for different locations, similar two-axis tracking designs are assumed for both concentrator and flat-plate options.

  4. A probabilistic sizing tool and Monte Carlo analysis for entry vehicle ablative thermal protection systems

    NASA Astrophysics Data System (ADS)

    Mazzaracchio, Antonio; Marchetti, Mario

    2010-03-01

    Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool uses the Monte Carlo technique to run simulations over stochastic series, performing an uncertainty and sensitivity analysis that estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems, and are proposed as an alternative to traditional approaches such as the Root-Sum-Square method. The developed tool was verified by comparing its results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry-standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins for sizing the heat shields currently proposed for vehicles using rigid aeroshells in future aerocapture missions at Neptune, and to identify the major sources of uncertainty in the material response.
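
    A minimal sketch of the Monte Carlo sizing loop follows; the one-line bondline-temperature model, the input distributions, and the temperature requirement are invented stand-ins for the tool's ablation and thermal response physics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical stand-in for the ablation/thermal response model: bondline
    # temperature falls off with heat-shield thickness and conductivity.
    def bondline_temp(thickness, heat_flux, conductivity):
        return 300.0 + 0.1 * heat_flux * np.exp(-thickness * conductivity / 0.004)

    heat_flux = rng.normal(1.0e5, 1.0e4, n)      # W/m^2, assumed uncertainty
    conductivity = rng.normal(0.4, 0.05, n)      # W/(m K), assumed uncertainty
    T_LIMIT = 520.0                              # K, assumed requirement

    # Sweep candidate thicknesses and estimate the probability of compliance,
    # which is how a stochastic margin (rather than Root-Sum-Square) emerges.
    for thickness in (0.03, 0.04, 0.05):         # metres
        T = bondline_temp(thickness, heat_flux, conductivity)
        print(f"{thickness:.2f} m -> P(T <= {T_LIMIT:.0f} K) = {(T <= T_LIMIT).mean():.3f}")
    ```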

  5. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PSAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc. on design parameters, then the response sensitivities calculated may be of significantly higher accuracy.

  6. Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling

    NASA Technical Reports Server (NTRS)

    Yang, Lee C.; Kuchar, James K.

    2000-01-01

    Methods for maintaining separation between aircraft in the current airspace system have been built from a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational error becomes more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled in a number of different ways, but in this thesis, it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties from the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial-and-error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate for probabilistic performance measures. To overcome the limitations in the current iterative design method, a new direct approach is presented where the performance measures are directly computed and used to perform the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near real-time. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility for use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.
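
    The core of such a Monte Carlo probe can be sketched in a few lines: sample trajectory uncertainty, propagate both aircraft, and count the fraction of runs in which separation is ever lost. The geometry, error model, and separation minimum below are illustrative assumptions, not the thesis's calibrated values.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 50_000

    dt, steps = 10.0, 60                        # 10 s steps over 10 minutes
    own = np.array([0.0, 0.0]); own_v = np.array([230.0, 0.0])          # m, m/s
    intr = np.array([140_000.0, 8_000.0]); intr_v = np.array([-230.0, 0.0])

    SEP_MIN = 9_260.0                           # ~5 NM in metres
    conflict = np.zeros(n, dtype=bool)
    pos_err = rng.normal(0.0, 1_500.0, (n, 2))  # assumed per-run relative position error

    for k in range(steps):
        t = k * dt
        rel = (intr + intr_v * t) - (own + own_v * t) + pos_err
        conflict |= np.hypot(rel[:, 0], rel[:, 1]) < SEP_MIN

    print("P(conflict) ~", conflict.mean())
    ```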

  7. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing.

    PubMed

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models.

  8. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models. PMID:29062288

  9. Mechanical system reliability for long life space systems

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1994-01-01

    The creation of a compendium of mechanical limit states was undertaken in order to provide a reference base for the application of first-order reliability methods to mechanical systems in the context of the development of a system level design methodology. The compendium was conceived as a reference source specific to the problem of developing the noted design methodology, and not an exhaustive or exclusive compilation of mechanical limit states. The compendium is not intended to be a handbook of mechanical limit states for general use. The compendium provides a diverse set of limit-state relationships for use in demonstrating the application of probabilistic reliability methods to mechanical systems. The compendium is to be used in the reliability analysis of moderately complex mechanical systems.

  10. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314
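
    The fault-tree logic can be made concrete in a few lines of Python; the event structure and probabilities below are hypothetical placeholders standing in for the paper's calibrated model.

    ```python
    # Fault-tree style collision risk chain: AND gates multiply event
    # probabilities, OR gates combine as 1 - prod(1 - p). All numbers
    # are illustrative, not the paper's values.

    def and_gate(*probs):
        out = 1.0
        for p in probs:
            out *= p
        return out

    def or_gate(*probs):
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

    p_encounter = 0.05                      # fish passes the rotor-swept area
    p_no_detection = or_gate(0.30, 0.20)    # low visibility OR high current speed
    p_strike_given_pass = 0.40              # blade/body geometry

    p_collision = and_gate(p_encounter, p_no_detection, p_strike_given_pass)
    print(f"per-passage collision probability ~ {p_collision:.4f}")
    ```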

  11. Upgrades to the REA method for producing probabilistic climate change projections

    NASA Astrophysics Data System (ADS)

    Xu, Ying; Gao, Xuejie; Giorgi, Filippo

    2010-05-01

    We present an augmented version of the Reliability Ensemble Averaging (REA) method designed to generate probabilistic climate change information from ensembles of climate model simulations. Compared to the original version, the augmented one includes consideration of multiple variables and statistics in the calculation of the performance-based weights. In addition, the model convergence criterion previously employed is removed. The method is applied to the calculation of changes in mean and variability for temperature and precipitation over different sub-regions of East Asia based on the recently completed CMIP3 multi-model ensemble. Comparison of the new and old REA methods, along with the simple averaging procedure, and the use of different combinations of performance metrics shows that at fine sub-regional scales the choice of weighting is relevant. This is mostly because the models show a substantial spread in performance for the simulation of precipitation statistics, a result that supports the use of model weighting as a useful option to account for wide ranges of quality of models. The REA method, and in particular the upgraded one, provides a simple and flexible framework for assessing the uncertainty related to the aggregation of results from ensembles of models in order to produce climate change information at the regional scale.

    KEY WORDS: REA method, Climate change, CMIP3
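
    A sketch of the performance-weighting step, assuming the original REA-style reliability factor in which weights shrink as a model's bias exceeds a natural-variability threshold; the biases, projected changes, and threshold below are invented for illustration.

    ```python
    import numpy as np

    # REA-flavoured performance weighting (illustrative values only):
    model_bias = np.array([0.4, 1.1, 0.2, 0.8])        # |model - obs| per model
    projected_change = np.array([1.8, 2.6, 2.1, 3.0])  # degC, hypothetical

    eps = 0.5                                          # natural-variability scale
    weights = np.minimum(1.0, eps / model_bias)        # reliability factor, capped at 1
    weights /= weights.sum()

    print("weights:", np.round(weights, 3))
    print("weighted change:", np.round(weights @ projected_change, 2))
    ```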

  12. Progress Implementing a Model-Based Iterative Reconstruction Algorithm for Ultrasound Imaging of Thick Concrete

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almansouri, Hani; Johnson, Christi R; Clayton, Dwight A

    All commercial nuclear power plants (NPPs) in the United States contain concrete structures. These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and the degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Concrete structures in NPPs are often inaccessible and contain large volumes of massively thick concrete. While acoustic imaging using the synthetic aperture focusing technique (SAFT) works adequately well for thin specimens of concrete such as concrete transportation structures, enhancements are needed for heavily reinforced, thick concrete. We argue that image reconstruction quality for acoustic imaging in thick concrete could be improved with Model-Based Iterative Reconstruction (MBIR) techniques. MBIR works by designing a probabilistic model for the measurements (forward model) and a probabilistic model for the object (prior model). Both models are used to formulate an objective function (cost function). The final step in MBIR is to optimize the cost function. Previously, we have demonstrated a first implementation of MBIR for an ultrasonic transducer array system. The original forward model has been upgraded to account for direct arrival signal. Updates to the forward model will be documented and the new algorithm will be assessed with synthetic and empirical samples.

  13. Brain function during probabilistic learning in relation to IQ and level of education.

    PubMed

    van den Bos, Wouter; Crone, Eveline A; Güroğlu, Berna

    2012-02-15

    Knowing how to adapt your behavior based on feedback lies at the core of successful learning. We investigated the relation between brain function, grey matter volume, educational level and IQ in a Dutch adolescent sample. In total 45 healthy volunteers between ages 13 and 16 were recruited from schools for pre-vocational and pre-university education. For each individual, IQ was estimated using two subtests from the WISC-III-R (similarities and block design). While in the magnetic resonance imaging (MRI) scanner, participants performed a probabilistic learning task. Behavioral comparisons showed that participants with higher IQ used a more adaptive learning strategy after receiving positive feedback. Analysis of neural activation revealed that higher IQ was associated with increased activation in DLPFC and dACC when receiving positive feedback, specifically for rules with low reward probability (i.e., unexpected positive feedback). Furthermore, VBM analyses revealed that IQ correlated positively with grey matter volume within these regions. These results provide support for IQ-related individual differences in the developmental time courses of neural circuitry supporting feedback-based learning. Current findings are interpreted in terms of a prolonged window of flexibility and opportunity for adolescents with higher IQ scores.

  14. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored at any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault tree based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  15. Sequence similarity is more relevant than species specificity in probabilistic backtranslation.

    PubMed

    Ferro, Alfredo; Giugno, Rosalba; Pigola, Giuseppe; Pulvirenti, Alfredo; Di Pietro, Cinzia; Purrello, Michele; Ragusa, Marco

    2007-02-21

    Backtranslation is the process of decoding a sequence of amino acids into the corresponding codons. All synthetic gene design systems include a backtranslation module. The degeneracy of the genetic code makes backtranslation potentially ambiguous since most amino acids are encoded by multiple codons. The common approach to overcome this difficulty is based on imitation of codon usage within the target species. This paper describes EasyBack, a new parameter-free, fully-automated software for backtranslation using Hidden Markov Models. EasyBack is not based on imitation of codon usage within the target species, but instead uses a sequence-similarity criterion. The model is trained with a set of proteins with known cDNA coding sequences, constructed from the input protein by querying the NCBI databases with BLAST. Unlike existing software, the proposed method allows the quality of prediction to be estimated. When tested on a group of proteins that show different degrees of sequence conservation, EasyBack outperforms other published methods in terms of precision. The prediction quality of a protein backtranslation method is markedly increased by replacing the criterion of the most-used codon in the same species with a Hidden Markov Model trained on a set of the most similar sequences from all species. Moreover, the proposed method allows the quality of prediction to be estimated probabilistically.

  16. Progress implementing a model-based iterative reconstruction algorithm for ultrasound imaging of thick concrete

    NASA Astrophysics Data System (ADS)

    Almansouri, Hani; Johnson, Christi; Clayton, Dwight; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2017-02-01

    All commercial nuclear power plants (NPPs) in the United States contain concrete structures. These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and the degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Concrete structures in NPPs are often inaccessible and contain large volumes of massively thick concrete. While acoustic imaging using the synthetic aperture focusing technique (SAFT) works adequately well for thin specimens of concrete such as concrete transportation structures, enhancements are needed for heavily reinforced, thick concrete. We argue that image reconstruction quality for acoustic imaging in thick concrete could be improved with Model-Based Iterative Reconstruction (MBIR) techniques. MBIR works by designing a probabilistic model for the measurements (forward model) and a probabilistic model for the object (prior model). Both models are used to formulate an objective function (cost function). The final step in MBIR is to optimize the cost function. Previously, we have demonstrated a first implementation of MBIR for an ultrasonic transducer array system. The original forward model has been upgraded to account for direct arrival signal. Updates to the forward model will be documented and the new algorithm will be assessed with synthetic and empirical samples.
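
    The MBIR recipe described here (forward model plus prior model, combined into a cost function that is then optimised) can be sketched with a linear toy problem. The operator, prior strength, and step size below are our assumptions, not the authors' ultrasound model.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_meas, n_pix = 120, 64

    A = rng.normal(size=(n_meas, n_pix)) / np.sqrt(n_meas)  # toy forward model
    x_true = np.zeros(n_pix); x_true[20:30] = 1.0           # toy reflector
    y = A @ x_true + rng.normal(scale=0.05, size=n_meas)    # noisy measurements

    D = np.eye(n_pix) - np.eye(n_pix, k=1)                  # finite differences
    beta = 0.5                                              # prior strength

    def cost(x):
        # data term (forward model) + smoothness term (prior model)
        return 0.5 * np.sum((y - A @ x) ** 2) + 0.5 * beta * np.sum((D @ x) ** 2)

    x = np.zeros(n_pix)
    for _ in range(500):
        grad = A.T @ (A @ x - y) + beta * (D.T @ D @ x)
        x -= 0.1 * grad                                     # fixed-step gradient descent

    print("final cost:", round(cost(x), 4))
    ```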

  17. FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting.

    PubMed

    Alomar, Miquel L; Canals, Vincent; Perez-Mora, Nicolas; Martínez-Moll, Víctor; Rosselló, Josep L

    2016-01-01

    Hardware implementation of artificial neural networks (ANNs) allows exploiting the inherent parallelism of these systems. Nevertheless, they require a large amount of resources in terms of area and power dissipation. Recently, Reservoir Computing (RC) has arisen as a strategic technique to design recurrent neural networks (RNNs) with simple learning capabilities. In this work, we show a new approach to implement RC systems with digital gates. The proposed method is based on the use of probabilistic computing concepts to reduce the hardware required to implement different arithmetic operations. The result is the development of a highly functional system with low hardware resources. The presented methodology is applied to chaotic time-series forecasting.
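
    The probabilistic-computing trick at the heart of this hardware saving is easy to demonstrate in software: a value in [0, 1] becomes a random bitstream, and a single AND gate multiplies two independent streams. The stream length below is an arbitrary choice; it trades precision for hardware time.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def to_stream(p, n_bits=4096):
        """Encode p in [0, 1] as a random bitstream with P(bit=1) = p."""
        return rng.random(n_bits) < p

    a, b = 0.6, 0.7
    stream_product = to_stream(a) & to_stream(b)   # a single AND gate
    print("stochastic:", stream_product.mean(), "exact:", a * b)
    ```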

  18. FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting

    PubMed Central

    Alomar, Miquel L.; Canals, Vincent; Perez-Mora, Nicolas; Martínez-Moll, Víctor; Rosselló, Josep L.

    2016-01-01

    Hardware implementation of artificial neural networks (ANNs) allows exploiting the inherent parallelism of these systems. Nevertheless, they require a large amount of resources in terms of area and power dissipation. Recently, Reservoir Computing (RC) has arisen as a strategic technique to design recurrent neural networks (RNNs) with simple learning capabilities. In this work, we show a new approach to implement RC systems with digital gates. The proposed method is based on the use of probabilistic computing concepts to reduce the hardware required to implement different arithmetic operations. The result is the development of a highly functional system with low hardware resources. The presented methodology is applied to chaotic time-series forecasting. PMID:26880876

  19. Query-based biclustering of gene expression data using Probabilistic Relational Models.

    PubMed

    Zhao, Hui; Cloots, Lore; Van den Bulcke, Tim; Wu, Yan; De Smet, Riet; Storms, Valerie; Meysman, Pieter; Engelen, Kristof; Marchal, Kathleen

    2011-02-15

    With the availability of large scale expression compendia it is now possible to view one's own findings in the light of what is already available and retrieve genes with an expression profile similar to a set of genes of interest (i.e., a query or seed set) for a subset of conditions. To that end, a query-based strategy is needed that maximally exploits the coexpression behaviour of the seed genes to guide the biclustering, but that at the same time is robust against the presence of noisy genes in the seed set, as seed genes are often assumed, but not guaranteed, to be coexpressed in the queried compendium. Therefore, we developed ProBic, a query-based biclustering strategy based on Probabilistic Relational Models (PRMs) that exploits the use of prior distributions to extract the information contained within the seed set. We applied ProBic to a large scale Escherichia coli compendium to extend partially described regulons with potentially novel members. We compared ProBic's performance with previously published query-based biclustering algorithms, namely ISA and QDB, from the perspective of bicluster expression quality, robustness of the outcome against noisy seed sets and biological relevance. This comparison shows that ProBic is able to retrieve biologically relevant, high quality biclusters that retain their seed genes and that it is particularly strong in handling noisy seeds. ProBic is a query-based biclustering algorithm developed in a flexible framework, designed to detect biologically relevant, high quality biclusters that retain relevant seed genes even in the presence of noise or when dealing with low quality seed sets.

  20. Role of ionotropic glutamate receptors in delay and probability discounting in the rat.

    PubMed

    Yates, Justin R; Batten, Seth R; Bardo, Michael T; Beckmann, Joshua S

    2015-04-01

    Discounting of delayed and probabilistic reinforcement is linked to increased drug use and pathological gambling. Understanding the neurobiology of discounting is important for designing treatments for these disorders. Glutamate is considered to be involved in addiction-like behaviors; however, the role of ionotropic glutamate receptors (iGluRs) in discounting remains unclear. The current study examined the effects of N-methyl-D-aspartate (NMDA) and α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) glutamate receptor blockade on performance in delay and probability discounting tasks. Following training in either delay or probability discounting, rats (n = 12, each task) received pretreatments of the NMDA receptor antagonists MK-801 (0, 0.01, 0.03, 0.1, or 0.3 mg/kg, s.c.) or ketamine (0, 1.0, 5.0, or 10.0 mg/kg, i.p.), as well as the AMPA receptor antagonist CNQX (0, 1.0, 3.0, or 5.6 mg/kg, i.p.). Hyperbolic discounting functions were used to estimate sensitivity to delayed/probabilistic reinforcement and sensitivity to reinforcer amount. An intermediate dose of MK-801 (0.03 mg/kg) decreased sensitivity to both delayed and probabilistic reinforcement. In contrast, ketamine did not affect the rate of discounting in either task but decreased sensitivity to reinforcer amount. CNQX did not alter sensitivity to reinforcer amount or delayed/probabilistic reinforcement. These results show that blockade of NMDA receptors, but not AMPA receptors, decreases sensitivity to delayed/probabilistic reinforcement (MK-801) and sensitivity to reinforcer amount (ketamine). The differential effects of MK-801 and ketamine demonstrate that sensitivities to delayed/probabilistic reinforcement and reinforcer amount are pharmacologically dissociable.
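
    For reference, the hyperbolic discounting function commonly fitted in these tasks is V = A / (1 + kD), where D is the delay (or the odds against reinforcement, in probability discounting) and k indexes sensitivity. A quick sketch with hypothetical k values:

    ```python
    # V = A / (1 + k * D); for probability discounting, D = (1 - p) / p
    # (odds against). Both k values below are hypothetical illustrations.

    def discounted_value(amount, delay, k):
        return amount / (1.0 + k * delay)

    for k in (0.05, 0.02):   # e.g., baseline vs. reduced sensitivity
        print(f"k={k}:", [round(discounted_value(100.0, d, k), 1)
                          for d in (0, 10, 30, 60)])
    ```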

  1. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
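
    The symmetric rank-one (SR1) update that replaces explicit Hessian evaluation can be written in a few lines; the safeguard threshold and the toy quadratic below are our illustrative choices.

    ```python
    import numpy as np

    def sr1_update(B, s, y, tol=1e-8):
        """SR1: B_new = B + (y - B s)(y - B s)^T / ((y - B s)^T s),
        skipping the update when the denominator is numerically unsafe."""
        r = y - B @ s
        denom = r @ s
        if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
            return B
        return B + np.outer(r, r) / denom

    # Toy quadratic with true Hessian H: the update recovers curvature info.
    H = np.array([[4.0, 1.0], [1.0, 3.0]])
    B = np.eye(2)
    s = np.array([1.0, 0.0])
    y = H @ s                  # gradient change along s for a quadratic
    print(sr1_update(B, s, y))
    ```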

  2. Global integrated drought monitoring and prediction system

    PubMed Central

    Hao, Zengchao; AghaKouchak, Amir; Nakhjiri, Navid; Farahmand, Alireza

    2014-01-01

    Drought is by far the most costly natural disaster that can lead to widespread impacts, including water and food crises. Here we present data sets available from the Global Integrated Drought Monitoring and Prediction System (GIDMaPS), which provides drought information based on multiple drought indicators. The system provides meteorological and agricultural drought information based on multiple satellite- and model-based precipitation and soil moisture data sets. GIDMaPS includes a near real-time monitoring component and a seasonal probabilistic prediction module. The data sets include historical drought severity data from the monitoring component, and probabilistic seasonal forecasts from the prediction module. The probabilistic forecasts provide essential information for early warning, taking preventive measures, and planning mitigation strategies. GIDMaPS data sets are a significant extension to current capabilities and data sets for global drought assessment and early warning. The presented data sets would be instrumental in reducing drought impacts especially in developing countries. Our results indicate that GIDMaPS data sets reliably captured several major droughts from across the globe. PMID:25977759

  3. Global integrated drought monitoring and prediction system.

    PubMed

    Hao, Zengchao; AghaKouchak, Amir; Nakhjiri, Navid; Farahmand, Alireza

    2014-01-01

    Drought is by far the most costly natural disaster that can lead to widespread impacts, including water and food crises. Here we present data sets available from the Global Integrated Drought Monitoring and Prediction System (GIDMaPS), which provides drought information based on multiple drought indicators. The system provides meteorological and agricultural drought information based on multiple satellite- and model-based precipitation and soil moisture data sets. GIDMaPS includes a near real-time monitoring component and a seasonal probabilistic prediction module. The data sets include historical drought severity data from the monitoring component, and probabilistic seasonal forecasts from the prediction module. The probabilistic forecasts provide essential information for early warning, taking preventive measures, and planning mitigation strategies. GIDMaPS data sets are a significant extension to current capabilities and data sets for global drought assessment and early warning. The presented data sets would be instrumental in reducing drought impacts especially in developing countries. Our results indicate that GIDMaPS data sets reliably captured several major droughts from across the globe.

  4. Probabilistic analysis for fatigue strength degradation of materials

    NASA Technical Reports Server (NTRS)

    Royce, Lola

    1989-01-01

    This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.
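
    A minimal simulation of this kind of probabilistic lifetime calculation is sketched below, assuming a Paris-law crack growth model with randomized coefficient, exponent, and initial crack size; every number is an illustrative placeholder rather than a RANDOM-code input.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sims = 2_000

    # All values are illustrative placeholders, not calibrated superalloy data.
    C = 10 ** rng.normal(-11.0, 0.2, n_sims)        # Paris coefficient, m/cycle
    m = rng.normal(3.0, 0.1, n_sims)                # Paris exponent
    a = rng.lognormal(np.log(2e-4), 0.2, n_sims)    # initial crack size, m
    A_CRIT, STRESS = 5e-3, 300.0                    # critical size (m), range (MPa)

    life = np.full(n_sims, np.nan)                  # cycles to critical size
    for block in range(4_000):                      # march in blocks of 100 cycles
        grow = a < A_CRIT
        dK = STRESS * np.sqrt(np.pi * a[grow])      # stress intensity range, MPa*sqrt(m)
        a[grow] += C[grow] * dK ** m[grow] * 100.0  # crack growth over one block
        just_failed = (a >= A_CRIT) & np.isnan(life)
        life[just_failed] = (block + 1) * 100.0

    print("median life (cycles):", np.nanmedian(life))
    print("P(life < 50,000 cycles):", np.mean(life < 50_000))
    ```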

  5. Probabilistic combination of static and dynamic gait features for verification

    NASA Astrophysics Data System (ADS)

    Bazin, Alex I.; Nixon, Mark S.

    2005-03-01

    This paper describes a novel probabilistic framework for biometric identification and data fusion. Based on intra- and inter-class variation extracted from training data, posterior probabilities describing the similarity between two feature vectors may be directly calculated from the data using the logistic function and Bayes' rule. Using a large publicly available database we show that the two imbalanced gait modalities may be fused using this framework. All fusion methods tested provide an improvement over the best single modality, with the weighted sum rule giving the best performance, hence showing that highly imbalanced classifiers may be fused in a probabilistic setting, improving not only the performance but also the generalized application capability.
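
    A toy version of the framework's two steps follows, with assumed logistic parameters standing in for ones fitted from intra/inter-class training data, and an illustrative fusion weight.

    ```python
    import numpy as np

    def logistic(score, a, b):
        """Map a match score to a posterior-like probability."""
        return 1.0 / (1.0 + np.exp(-(a * score + b)))

    static_score, dynamic_score = 0.82, 0.55           # hypothetical match scores
    p_static = logistic(static_score, a=8.0, b=-4.0)   # assumed fitted parameters
    p_dynamic = logistic(dynamic_score, a=6.0, b=-3.0)

    w = 0.7                                            # favour the stronger modality
    p_fused = w * p_static + (1.0 - w) * p_dynamic     # weighted sum rule
    print(round(p_static, 3), round(p_dynamic, 3), round(p_fused, 3))
    ```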

  6. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.

  7. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    PubMed

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can be directly compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions.

  8. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
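
    The bootstrap step the authors describe is straightforward to reproduce with synthetic data; the cost and eradication figures below are placeholders, and the statistic shown is cost per eradication rather than a full incremental comparison between strategies.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic patient-level data (placeholders, not trial data).
    cost = rng.gamma(shape=2.0, scale=400.0, size=200)   # per-patient cost
    cured = rng.random(200) < 0.85                       # eradication flag

    # Bootstrap: resample patients with replacement rather than assuming
    # parametric input distributions.
    n_boot = 5_000
    cost_per_cure = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, 200, 200)
        cost_per_cure[i] = cost[idx].mean() / cured[idx].mean()

    lo, hi = np.percentile(cost_per_cure, [2.5, 97.5])
    print(f"cost per eradication: 95% interval ({lo:.0f}, {hi:.0f})")
    ```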

  9. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    NASA Astrophysics Data System (ADS)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to capture the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range on the same global dataset, especially in regions with higher NDVI values, highlighting the importance of building the joint distribution around the dependence structure of ω and NDVI, which provides a more accurate probabilistic evaluation of water availability.

  10. Project Ukko - Design of a climate service visualisation interface for seasonal wind forecasts

    NASA Astrophysics Data System (ADS)

    Hemment, Drew; Stefaner, Moritz; Makri, Stephann; Buontempo, Carlo; Christel, Isadora; Torralba-Fernandez, Veronica; Gonzalez-Reviriego, Nube; Doblas-Reyes, Francisco; de Matos, Paula; Dykes, Jason

    2016-04-01

    Project Ukko is a prototype climate service to visually communicate probabilistic seasonal wind forecasts for the energy sector. In Project Ukko, an interactive visualisation enhances the accessibility and readability of the latest advances in seasonal wind speed prediction developed as part of the RESILIENCE prototype of the EUPORIAS (EC FP7) project. Climate services provide made-to-measure climate information, tailored to the specific requirements of different users and industries. In the wind energy sector, understanding of wind conditions in the next few months has high economic value, for instance for energy traders. Current energy practices use retrospective climatology, but access to reliable seasonal predictions based on recent advances in global climate models has the potential to improve their resilience to climate variability and change. Despite their potential benefits, a barrier to the development of commercially viable services is the complexity of the probabilistic forecast information, and the challenge of communicating complex and uncertain information to decision makers in industry. Project Ukko consists of an interactive climate service interface for wind energy users to explore probabilistic wind speed predictions for the coming season. This interface enables fast visual detection and exploration of interesting features and regions likely to experience unusual changes in wind speed in the coming months. The aim is not only to support users to better understand the future variability in wind power resources, but also to bridge the gap between practitioners' traditional approach and the advanced prediction systems developed by the climate science community. Project Ukko is presented as a case study of cross-disciplinary collaboration between climate science and design, for the development of climate services that are useful, usable and effective for industry users. The presentation will reflect on the challenge of developing a climate service for industry users in the wind energy sector, the background to this challenge, our approach, and the evaluation of the visualisation interface.

  11. Applying the food safety objective and related standards to thermal inactivation of Salmonella in poultry meat.

    PubMed

    Membré, Jeanne-Marie; Bassett, John; Gorris, Leon G M

    2007-09-01

    The objective of this study was to investigate the practicality of designing a heat treatment process in a food manufacturing operation for a product governed by a Food Safety Objective (FSO). Salmonella in cooked poultry meat was taken as the working example. Although there is no FSO for this product in current legislation, this may change in the (near) future. Four different process design calculations were explored by means of deterministic and probabilistic approaches to mathematical data handling and modeling. It was found that the probabilistic approach was a more objective, transparent, and quantifiable approach to establish the stringency of food safety management systems. It also allowed the introduction of specific prevalence rates. The key input analyzed in this study was the minimum time required for the heat treatment at a fixed temperature to produce a product that complied with the criterion for product safety, i.e., the FSO. By means of the four alternative process design calculations, the minimum time requirement at 70 degrees C was established and ranged from 0.26 to 0.43 min. This is comparable to the U.S. regulation recommendations and significantly less than that of 2 min at 70 degrees C used, for instance, in the United Kingdom regulation concerning vegetative microorganisms in ready-to-eat foods. However, the objective of this study was not to challenge existing regulations but to provide an illustration of how an FSO established by a competent authority can guide decisions on safe product and process designs in practical operation; it hopefully contributes to the collaborative work between regulators, academia, and industries that need to continue learning and gaining experience from each other in order to translate risk-based concepts such as the FSO into everyday operational practice.

  12. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks.

    PubMed

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses.
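
    Fuzzy C-means, the probabilistic analog of K-means used here, replaces hard labels with graded memberships. A compact sketch on toy two-cluster data follows; the data, fuzzifier, and iteration count are all assumed settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
    n, c, m = len(X), 2, 2.0                      # m > 1 is the fuzzifier

    U = rng.dirichlet(np.ones(c), size=n)         # initial graded memberships
    for _ in range(50):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]          # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                        # membership update
        U /= U.sum(axis=1, keepdims=True)

    print("cluster centers:\n", np.round(centers, 2))
    print("first sample memberships:", np.round(U[0], 3))
    ```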

  13. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks

    PubMed Central

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110

  14. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing type structures is described. The code performs a complete probabilistic analysis for composites taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of cumulative distribution functions (CDF) and probability density function (PDF) of the fatigue life of a wing type composite structure under different hygrothermal environments subjected to the random pressure. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.

  15. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.

  16. Probabilistic Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    different crash and blast scenarios. With the integration of the high fidelity neck and head model, a methodology to calculate the probability of injury...variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and...first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast.

  17. Scalable Quantum Networks for Distributed Computing and Sensing

    DTIC Science & Technology

    2016-04-01

    probabilistic measurement, so we developed quantum memories and guided-wave implementations of same, demonstrating controlled delay of a heralded single...Second, fundamental scalability requires a method to synchronize protocols based on quantum measurements, which are inherently probabilistic. To meet...AFRL-AFOSR-UK-TR-2016-0007 Scalable Quantum Networks for Distributed Computing and Sensing Ian Walmsley THE UNIVERSITY OF OXFORD Final Report 04/01

  18. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis while considering the associated uncertainties in the analysis parameters. However, FORM cannot be used directly in numerical slope stability evaluations because it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, in which an explicit performance function is developed from the results of the numerical simulations so that FORM can be applied. The implementation of the proposed methodology is demonstrated on a large potential rock wedge at Sumela Monastery, Turkey. The accuracy with which the developed performance function represents the true limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of a 24% error in accuracy.
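
    A minimal sketch of the core idea with invented slope parameters: fit an explicit (here quadratic) response surface to limit state values obtained from a stand-in numerical model, then run the HL-RF iteration of FORM on the fitted surface.

        import numpy as np
        from scipy.stats import norm

        # stand-in for an expensive numerical slope model: factor of safety from
        # cohesion c [kPa] and friction angle phi [deg]; invented, not the Sumela model
        def fos_model(c, phi):
            return 0.02 * c + 0.03 * phi - 0.7

        # 1) fit an explicit quadratic response surface to g = FoS - 1 at design points
        rng = np.random.default_rng(0)
        pts = rng.uniform([20, 25], [60, 45], size=(30, 2))
        y = np.array([fos_model(c, p) - 1.0 for c, p in pts])
        A = np.column_stack([np.ones(len(pts)), pts, pts ** 2])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        mu, sd = np.array([40.0, 35.0]), np.array([8.0, 4.0])   # assumed input statistics

        def g_and_grad(u):               # fitted limit state in standard normal space
            x = mu + sd * u
            g = coef[0] + coef[1:3] @ x + coef[3:5] @ x ** 2
            return g, (coef[1:3] + 2.0 * coef[3:5] * x) * sd    # chain rule d/du

        # 2) HL-RF iteration of FORM on the response surface
        u = np.zeros(2)
        for _ in range(50):
            g, grad = g_and_grad(u)
            u = ((grad @ u - g) / (grad @ grad)) * grad
        beta = np.linalg.norm(u)
        print(f"beta = {beta:.2f}, Pf = {norm.cdf(-beta):.3f}")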

  19. Bayesian networks improve causal environmental ...

    EPA Pesticide Factsheets

    Rule-based weight-of-evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Conventional weight-of-evidence approaches too often ignore sources of uncertainty that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improves the ability of models to incorporate the strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables given the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative weight-of-evidence approaches in capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
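
    At the heart of the probabilistic calculus is Bayes' rule; the minimal sketch below combines two conditionally independent lines of evidence in odds form, with invented priors and likelihoods.

        # combining two conditionally independent lines of evidence with Bayes' rule;
        # the prior and likelihoods below are invented for illustration
        prior_impaired = 0.30                      # P(site is impaired)

        # (line of evidence, P(observed | impaired), P(observed | not impaired))
        lines = [
            ("benthic index degraded",  0.80, 0.15),
            ("sediment chemistry high", 0.60, 0.20),
        ]

        odds = prior_impaired / (1 - prior_impaired)
        for name, p_e_h, p_e_not_h in lines:
            odds *= p_e_h / p_e_not_h              # multiply by the likelihood ratio
            post = odds / (1 + odds)
            print(f"after '{name}': P(impaired | evidence) = {post:.3f}")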

  20. Judgment under uncertainty; a probabilistic evaluation framework for decision-making about sanitation systems in low-income countries.

    PubMed

    Malekpour, Shirin; Langeveld, Jeroen; Letema, Sammy; Clemens, François; van Lier, Jules B

    2013-03-30

    This paper introduces a probabilistic evaluation framework to enable transparent and objective decision-making in technology selection for sanitation solutions in low-income countries. The probabilistic framework recognizes the often poor quality of the data available for evaluations. Within this framework, evaluations are based on the probabilities that the expected outcomes occur in practice, considering the uncertainties in the evaluation parameters. Consequently, the outcome of an evaluation is not a single point estimate but a range of possible outcomes. A first trial application of this framework to the evaluation of sanitation options in the Nyalenda settlement in Kisumu, Kenya, showed how the range of values that an evaluation parameter may take in practice influences the evaluation outcomes. In addition, as the probabilistic evaluation requires various site-specific data, a sensitivity analysis was performed to determine the influence of the quality of each data set on the evaluation outcomes. Based on that, data collection activities can be (re)directed in a trade-off between the required investment in those activities and the resolution of the decisions that are to be made. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Multi-atlas based segmentation using probabilistic label fusion with adaptive weighting of image similarity measures.

    PubMed

    Sjöberg, C; Ahnesjö, A

    2013-06-01

    Label fusion multi-atlas approaches to image segmentation can give better segmentation results than single-atlas methods. We present a multi-atlas label fusion strategy based on probabilistic weighting of distance maps. Relationships between image similarities and segmentation similarities are estimated in a learning phase and used to derive fusion weights that are proportional to the probability of each atlas improving the segmentation result. The method was tested using a leave-one-out strategy on a database of 21 pre-segmented prostate patients for different image registrations combined with different image similarity scorings. The probabilistic weighting yields results that are equal to or better than both fusion with equal weights and the STAPLE algorithm. The experiments demonstrate that label fusion by weighted distance maps is feasible, and that probabilistic weighted fusion improves segmentation quality more when the individual atlas segmentation quality depends more strongly on the corresponding registered image similarity. The regions used for evaluation of the image similarity measures were found to be more important than the choice of similarity measure. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
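
    A minimal sketch of label fusion by weighted signed distance maps, assuming the per-atlas weights have already been derived from the learned image-similarity relationship (here they are simply given); SciPy's Euclidean distance transform supplies the maps.

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def signed_distance(mask):
            """Signed distance map: negative inside the structure, positive outside."""
            return distance_transform_edt(~mask) - distance_transform_edt(mask)

        def fuse(atlas_masks, weights):
            """Fuse binary atlas segmentations by probabilistically weighted distance maps."""
            w = np.asarray(weights, float)
            w = w / w.sum()                        # weights from the learned similarity model
            sd = sum(wi * signed_distance(m) for wi, m in zip(w, atlas_masks))
            return sd < 0                          # fused label inside the weighted mean surface

        # toy example: three registered atlas masks (concentric discs) with given weights
        yy, xx = np.mgrid[:64, :64]
        masks = [(yy - 32) ** 2 + (xx - 32) ** 2 < r ** 2 for r in (12, 14, 16)]
        fused = fuse(masks, weights=[0.5, 0.3, 0.2])
        print(fused.sum(), "pixels in the fused segmentation")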

  2. Shear-wave velocity-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    USGS Publications Warehouse

    Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.

    2013-01-01

    Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.

  3. Flash-flood early warning using weather radar data: from nowcasting to forecasting

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Panziera, Luca; Germann, Urs; Zappa, Massimiliano

    2013-04-01

    In our study we explore the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for the 1389 hours between June 2007 and December 2010 for which NORA forecasts were issued owing to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was also remarkably skilful even with the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic forcing.

  4. Flash-flood early warning using weather radar data: from nowcasting to forecasting

    NASA Astrophysics Data System (ADS)

    Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.

    2013-01-01

    This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for the 1389 h between June 2007 and December 2010 for which NORA forecasts were issued owing to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was also remarkably skilful even with the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.
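
    A standard way to quantify such a preference for probabilistic over deterministic forecasts is the Brier score; the sketch below compares toy ensemble-derived exceedance probabilities against a deterministic forecast recast as 0/1 probabilities (the numbers are invented, not taken from the NORA or REAL-C2 verification).

        import numpy as np

        def brier(p_forecast, occurred):
            """Brier score for probabilistic exceedance forecasts (lower is better)."""
            p = np.asarray(p_forecast, float)
            o = np.asarray(occurred, float)
            return np.mean((p - o) ** 2)

        # toy verification: ensemble member fractions vs. a deterministic 0/1 forecast
        p_ens = np.array([0.08, 0.40, 0.76, 0.12, 0.92])  # fraction of members above threshold
        p_det = np.array([0.0,  0.0,  1.0,  0.0,  1.0])   # deterministic forecast as 0/1
        obs   = np.array([0,    1,    1,    0,    1])     # was the threshold exceeded?

        print("ensemble:", brier(p_ens, obs), " deterministic:", brier(p_det, obs))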

  5. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    PubMed

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
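
    A minimal sketch of estimating p(PEC/PNEC > 1) by Monte Carlo from log-normal PDFs; the distribution parameters are invented, not the fitted Loire or Moselle values.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 200_000

        # illustrative log-normal PDFs for bioavailable Cu exposure and effect levels
        pec = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)    # µg/L
        pnec = rng.lognormal(mean=np.log(6.0), sigma=0.4, size=n)   # µg/L

        risk = np.mean(pec / pnec > 1.0)
        print(f"p(PEC/PNEC > 1) ~ {risk:.3f}")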

  6. Probabilistic Multi-Scale, Multi-Level, Multi-Disciplinary Analysis and Optimization of Engine Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2000-01-01

    Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely and engines for new aircraft are progressively required to operate under more demanding technological and environmental requirements. Designs to effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability, and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to an open-ended framework for the various existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat transfer, and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission, coupled structural/thermal analysis, various composite property simulators, and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of this paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise, and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent and noise by 15 percent, and yields an order of magnitude improvement in reliability. Composite designs exist that increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.

  7. Probabilistic seismic hazard characterization and design parameters for the Pantex Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernreuter, D. L.; Foxall, W.; Savy, J. B.

    1998-10-19

    The Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) updated the seismic hazard and design parameters at the Pantex Plant. The probabilistic seismic hazard (PSH) estimates were first updated using the latest available data and knowledge from LLNL (1993, 1998), Frankel et al. (1996), and other relevant recent studies from several consulting companies. Special attention was given to account for the local seismicity and for the system of potentially active faults associated with the Amarillo-Wichita uplift. Aleatory (random) uncertainty was estimated from the available data and the epistemic (knowledge) uncertainty was taken from results of similar studies. Special attention was given to soil amplification factors for the site. Horizontal Peak Ground Acceleration (PGA) and 5% damped uniform hazard spectra were calculated for six return periods (100 yr., 500 yr., 1000 yr., 2000 yr., 10,000 yr., and 100,000 yr.). The design parameters were calculated following DOE standards (DOE-STD-1022 to 1024). Response spectra for design or evaluation of Performance Category 1 through 4 structures, systems, and components are presented.
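
    For orientation, the listed return periods map to exceedance probabilities through the standard Poisson-arrivals relation p = 1 - exp(-t/T); the small sketch below (not the LLNL computation) tabulates this for a 50-year exposure.

        import numpy as np

        return_periods = np.array([100, 500, 1000, 2000, 10_000, 100_000])  # years
        annual_p = 1.0 / return_periods          # annual exceedance probability

        t = 50                                   # exposure time [years]
        p_in_t = 1.0 - np.exp(-t / return_periods)
        for T, pa, pt in zip(return_periods, annual_p, p_in_t):
            print(f"T = {T:>6} yr: annual p = {pa:.1e}, p(at least one event in {t} yr) = {pt:.3f}")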

  8. Development of Testing Methodologies for the Mechanical Properties of MEMS

    NASA Technical Reports Server (NTRS)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS), as well as to investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This is a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
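
    The Weibull size effect follows from the weakest-link form Pf = 1 - exp(-(V/V0)(sigma/sigma0)^m); the sketch below, with invented parameters, shows how the characteristic strength drops as the stressed volume grows (a pressure-membrane test would integrate over area or edge length rather than volume).

        import numpy as np

        def weibull_pf(sigma, volume, sigma0=1.0e9, v0=1.0, m=10.0):
            """Weakest-link failure probability: Pf = 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
            return 1.0 - np.exp(-(volume / v0) * (sigma / sigma0) ** m)

        # characteristic strength (Pf = 63.2%) scales as (V0/V)^(1/m): the size effect
        for v in (0.1, 1.0, 10.0):
            s63 = 1.0e9 * (1.0 / v) ** (1.0 / 10.0)
            print(f"V/V0 = {v:>4}: characteristic strength = {s63 / 1e9:.3f} GPa,"
                  f" Pf at 0.8 GPa = {weibull_pf(0.8e9, v):.3f}")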

  9. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.

  10. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used increasingly to communicate forecast uncertainty. This uncertainty information is double-edged: it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.

  11. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    PubMed Central

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie PMID:29688379

  12. Adaptive predictors based on probabilistic SVM for real time disruption mitigation on JET

    NASA Astrophysics Data System (ADS)

    Murari, A.; Lungaroni, M.; Peluso, E.; Gaudio, P.; Vega, J.; Dormido-Canto, S.; Baruzzo, M.; Gelfusa, M.; JET Contributors

    2018-05-01

    Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, mitigation or avoidance. Traditional predictors based on machine learning techniques can perform very well if properly optimised, but they do not provide a natural estimate of the quality of their outputs and they typically age very quickly. In this paper a new set of tools, based on probabilistic extensions of support vector machines (SVM), are introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. An adaptive training strategy ‘from scratch’ has also been devised, which preserves performance even when the experimental conditions change significantly. Large JET databases of disruptions, covering entire campaigns and thousands of discharges, have been analysed for both the graphite and the ITER Like Wall. Performance significantly better than that of any previous predictor using adaptive training has been achieved, satisfying even the requirements of the next generation of devices. The adaptive approach to training has also provided unique information about the evolution of the operational space. The fact that the developed tools give the probability of disruption improves the interpretability of the results, provides an estimate of predictor quality and gives new insights into the physics. Moreover, the probabilistic treatment makes it easier to insert these classifiers into general decision support and control systems.
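
    A common route to probabilistic SVM outputs is Platt-type sigmoid calibration of the decision values, which scikit-learn exposes through SVC(probability=True); the sketch below, on fabricated stand-in features, thresholds the resulting disruption probability for an alarm (an illustration of the technique, not the authors' JET predictor).

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        # fabricated stand-in features for safe (0) vs. disrupted (1) discharges
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0.0, 1.0, (500, 4)), rng.normal(1.5, 1.0, (500, 4))])
        y = np.repeat([0, 1], 500)
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

        # probability=True fits a Platt-type sigmoid on the SVM decision values,
        # turning the classifier output into a probability of disruption
        clf = SVC(kernel="rbf", probability=True, random_state=0).fit(Xtr, ytr)
        p_disrupt = clf.predict_proba(Xte)[:, 1]

        # a mitigation trigger can then act only above a chosen probability threshold
        alarm = p_disrupt > 0.9
        print(f"{alarm.mean():.1%} of test discharges exceed the 0.9 alarm threshold")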

  13. A multiobjective decision support/numerical modeling approach for design and evaluation of shallow landfill burial systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ascough, II, James Clifford

    1992-05-01

    The capability to objectively evaluate the design performance of shallow landfill burial (SLB) systems is of great interest to diverse scientific disciplines, including hydrologists, engineers, environmental scientists, and SLB regulators. The goal of this work was to develop and validate a procedure for the nonsubjective evaluation of SLB designs under actual or simulated environmental conditions. A multiobjective decision module (MDM) based on scoring functions (Wymore, 1988) was implemented to evaluate SLB design performance. Input values to the MDM are provided by hydrologic models. The MDM assigns a total score to each SLB design alternative, thereby allowing for rapid and repeatable design performance evaluation. The MDM was validated for a wide range of SLB designs under different climatic conditions. Rigorous assessment of SLB performance also requires incorporation of hydrologic probabilistic analysis and hydrologic risk into the overall design. This was accomplished through the development of a frequency analysis module, which allows SLB design event magnitudes to be calculated based on the hydrologic return period. The multiobjective decision and frequency analysis modules were integrated in a decision support system (DSS) framework, SLEUTH (Shallow Landfill Evaluation Using Transport and Hydrology). SLEUTH is a Microsoft Windows™ application written in the Knowledge Pro Windows (Knowledge Garden, Inc., 1991) development language.

  14. Structural design of composite rotor blades with consideration of manufacturability, durability, and manufacturing uncertainties

    NASA Astrophysics Data System (ADS)

    Li, Leihong

    A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain the equivalent one-dimensional beam properties. Compared with traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first is to add manufacturing constraints to the design optimization. The introduction of manufacturing constraints complicates the optimization process, but a design that respects them benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines the fatigue analysis with the optimization process. The durability or fatigue analysis employs a strength-based model, and the design is subject to stiffness, frequency, and durability constraints. Finally, the impact of manufacturing uncertainty on rotor blade aeroelastic behavior is investigated, and a probabilistic design method is proposed to control the impact of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.

  15. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  16. Guided SAR image despeckling with probabilistic non local weights

    NASA Astrophysics Data System (ADS)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by granular disturbances called speckle, which makes visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces the heuristic parametric constants in the GGF-BNLM method with values derived dynamically from the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, achieve significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  17. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize the uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of the hazard quotient of ΣPAH8 in the surface waters of Taihu Lake. The probabilistic distributions of the hazard quotient obtained from MCS and LHS indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 for aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) calculated using fuzzy set theory, and (0.00016, 0.88) based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms is low. Each method has its own advantages and limitations, rooted in its underlying theory; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate to assess the risk of ΣPAH8 in the surface water of Taihu Lake, providing an important scientific foundation for the risk management and control of organic pollutants in water.

  18. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.

  19. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
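
    A minimal sketch of the Gaussian process regression step on fabricated data: two independent variables are mapped to a single motion descriptor with a predictive mean and uncertainty (scikit-learn's GaussianProcessRegressor; the variables, kernel and data are assumptions, not the authors' model).

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # fabricated training data: map two independent variables (body height [m],
        # barbell load [kg]) to one motion descriptor (peak knee flexion angle [deg])
        rng = np.random.default_rng(0)
        X = np.column_stack([rng.uniform(1.5, 1.9, 40), rng.uniform(20, 100, 40)])
        y = 80 + 20 * (X[:, 0] - 1.7) + 0.1 * X[:, 1] + rng.normal(0, 1.5, 40)

        kernel = RBF(length_scale=[0.2, 30.0]) + WhiteKernel(noise_level=1.0)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

        # predictive mean and standard deviation for a new subject/load combination
        mean, std = gpr.predict(np.array([[1.75, 60.0]]), return_std=True)
        print(f"predicted angle: {mean[0]:.1f} deg +/- {std[0]:.1f}")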

  20. Loss resilience for two-qubit state transmission using distributed phase sensitive amplification

    DOE PAGES

    Dailey, James; Agarwal, Anjali; Toliver, Paul; ...

    2015-11-12

    We transmit phase-encoded non-orthogonal quantum states through a 5-km long fibre-based distributed optical phase-sensitive amplifier (OPSA) using telecom-wavelength photonic qubit pairs. The gain is set to equal the transmission loss to probabilistically preserve input states during transmission. While neither state is optimally aligned to the OPSA, each input state is equally amplified with no measurable degradation in state quality. These results promise a new approach to reduce the effects of loss by encoding quantum information in a two-qubit Hilbert space which is designed to benefit from transmission through an OPSA.
