Science.gov

Sample records for failure assessment diagram

  1. Failure Assessment Diagram for Titanium Brazed Joints

    NASA Technical Reports Server (NTRS)

    Flom, Yury; Jones, Justin S.; Powell, Mollie M.; Puckett, David F.

    2011-01-01

    The interaction equation was used to predict failure in Ti-6Al-4V joints brazed with Al 1100 filler metal. The joints used in this study were geometrically similar to the joints in the brazed beryllium metering structure considered for the ATLAS telescope. This study confirmed that the interaction equation R(sub sigma) + R(sub Tau) = 1, where R(sub sigma) and R(sub Tau) are the normal and shear stress ratios, can be used as a conservative lower-bound estimate of the failure criterion in ATLAS brazed joints, as well as for construction of the Failure Assessment Diagram (FAD).
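
The linear interaction criterion described in this record can be sketched in a few lines. The stress and allowable values below are illustrative placeholders, not data from the study:

```python
def fad_assessment(sigma, tau, sigma_allow, tau_allow):
    """Evaluate the linear interaction criterion R_sigma + R_tau = 1.

    R_sigma and R_tau are the normal and shear stress ratios; points with
    R_sigma + R_tau < 1 fall inside the Failure Assessment Diagram (safe
    region) under this lower-bound criterion.
    """
    r_sigma = sigma / sigma_allow   # normal stress ratio
    r_tau = tau / tau_allow         # shear stress ratio
    interaction = r_sigma + r_tau
    margin_of_safety = 1.0 / interaction - 1.0  # MS > 0 means safe
    return interaction, margin_of_safety

# Illustrative numbers only (not from the study):
interaction, ms = fad_assessment(sigma=40.0, tau=25.0,
                                 sigma_allow=120.0, tau_allow=80.0)
print(f"R_sigma + R_tau = {interaction:.3f}, MS = {ms:.3f}")
```

Because the criterion is a conservative lower bound, a positive margin computed this way understates, rather than overstates, the joint's actual capacity.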

  2. Failure Assessment Diagram for Brazed 304 Stainless Steel Joints

    NASA Technical Reports Server (NTRS)

    Flom, Yory

    2011-01-01

    Interaction equations were proposed earlier to predict failure in Albemet 162 brazed joints. The present study demonstrates that the same interaction equations can be used for a lower-bound estimate of the failure criterion in 304 stainless steel joints brazed with silver-based filler metals, as well as for construction of Failure Assessment Diagrams (FADs).

  3. Evaluation of Brazed Joints Using Failure Assessment Diagram

    NASA Technical Reports Server (NTRS)

    Flom, Yury

    2012-01-01

    A fitness-for-service approach was used to perform structural analysis of brazed joints consisting of several base metal/filler metal combinations. Failure Assessment Diagrams (FADs) based on tensile and shear stress ratios were constructed and experimentally validated. It was shown that such FADs can provide a conservative estimate of safe combinations of stresses in brazed joints. Based on this approach, Margins of Safety (MS) of brazed joints subjected to multi-axial loading conditions can be evaluated.

  4. Improved reliability analysis method based on the failure assessment diagram

    NASA Astrophysics Data System (ADS)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
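
The point-density idea in this record can be illustrated with a 2D kernel density estimate over Monte Carlo assessment points in the (Lr, Kr) plane. This is a sketch under assumed distributions, not the paper's pipe analysis; the failure line used is the standard R6/BS 7910 Option 1 curve:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic assessment points (Lr = load ratio, Kr = toughness ratio);
# in a real PFM analysis these come from Monte Carlo sampling of loads,
# flaw sizes and material properties. Parameters here are invented.
lr = rng.normal(0.6, 0.08, 5000)
kr = rng.normal(0.5, 0.10, 5000)

# 2D kernel density estimate of the scattered assessment points
kde = gaussian_kde(np.vstack([lr, kr]))

# Evaluate the density on a grid, e.g. to draw contours in the FAD
grid_lr, grid_kr = np.mgrid[0:1.2:100j, 0:1.2:100j]
density = kde(np.vstack([grid_lr.ravel(),
                         grid_kr.ravel()])).reshape(grid_lr.shape)

# Fraction of points above the R6 Option 1 assessment line:
#   Kr = (1 - 0.14 Lr^2) * (0.3 + 0.7 exp(-0.65 Lr^6))
failure_line = (1 - 0.14 * lr**2) * (0.3 + 0.7 * np.exp(-0.65 * lr**6))
p_failure = np.mean(kr > failure_line)
print(f"estimated failure probability: {p_failure:.4f}")
```

Plotting `density` as contours over the FAD reproduces the visualization the authors describe: the cloud of assessment points is summarized by its density distribution rather than by individual dots.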

  5. Application of ISO22000, failure mode and effect analysis (FMEA), cause and effect diagrams and Pareto in conjunction with HACCP and risk assessment for processing of pastry products.

    PubMed

    Varzakas, Theodoros H

    2011-09-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of pastry processing. A tentative approach to FMEA application in the pastry industry was attempted in conjunction with ISO22000. Preliminary Hazard Analysis was used to analyze and predict the failure modes occurring in a food chain system (a pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or processes upon which the system depends. Critical Control Points were identified and incorporated in the cause and effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of ISO22000 analysis with HACCP is carried out for pastry processing and packaging. The main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were identified as the processes with the highest RPN (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause and effect, or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a pastry processing industry is considered imperative. PMID:21838557
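
The RPN quantification used in this record is the conventional FMEA product of severity, occurrence and detectability scores. The individual scores below are invented for illustration; the abstract reports only the resulting RPNs (225, 225, 144) and the 130 action limit:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of severity, occurrence and
    detectability scores (each conventionally rated 1-10)."""
    return severity * occurrence * detection

# Hypothetical score assignments chosen to reproduce the reported RPNs
hazards = {
    "raw material storage":  (9, 5, 5),   # RPN 225
    "final product storage": (9, 5, 5),   # RPN 225
    "freezing":              (6, 4, 6),   # RPN 144
}
ACTION_LIMIT = 130
for name, scores in hazards.items():
    value = rpn(*scores)
    flag = "corrective action required" if value > ACTION_LIMIT else "acceptable"
    print(f"{name}: RPN={value} ({flag})")
```

After corrective actions, the same computation is repeated with the updated occurrence and detection scores; the study reports that all recalculated RPNs fell below the 130 limit.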

  6. A comparative study of wide plate behavior of a range of structural steels using the failure assessment diagram

    SciTech Connect

    Bannister, A.C.; Harrison, P.L.

    1995-12-31

    In the field of structural integrity assessment, attention is currently focused on the ability of such methods to conservatively predict the deformation and fracture behavior of structural steels and their weldments. In the current paper, the results of a series of wide plate tests on a range of structural steels are presented and assessed in terms of CTOD-strain relationships, BS PD 6493 Levels 2 and 3, and the crack driving force approach. The behavior of the large-scale tests and the results of the various analyses are assessed with regard to the stress-strain characteristics of the individual steels. In a second step, the approach is extended to the assessment of a number of wide plate tests comprising welded joints with mismatched strength levels. Over-, under- and even-matched welded plates are compared with the behavior of normalized and quenched-and-tempered parent plates. The study demonstrates that the behavior of parent material wide plate tests can vary widely depending on the stress-strain characteristics of the material. The different behavior results from the consecutive effects of different steel processing conditions, microstructure, yield-to-tensile strength ratio and strain hardening exponent. These features are also manifested, to a greater or lesser extent, in the results of wide plate tests on welded plates of mismatched strength. Studies on mismatch effects should therefore pay equal attention to the stress-strain characteristics of the parent materials, as these may, in some circumstances, dominate any effects of weld strength mismatch.

  7. Automatically Assessing Graph-Based Diagrams

    ERIC Educational Resources Information Center

    Thomas, Pete; Smith, Neil; Waugh, Kevin

    2008-01-01

    To date there has been very little work on the machine understanding of imprecise diagrams, such as diagrams drawn by students in response to assessment questions. Imprecise diagrams exhibit faults such as missing, extraneous and incorrectly formed elements. The semantics of imprecise diagrams are difficult to determine. While there have been…

  8. Using Dynamic Master Logic Diagram for component partial failure analysis

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    A methodology using the Dynamic Master Logic Diagram (DMLD) for the evaluation of component partial failure is presented. Since past PRAs have not focused on partial failure effects, component reliability has been based only on the binary state assumption, i.e., defining a component as either fully failed or fully functioning. This paper develops an approach to predict and estimate component partial failure on the basis of a fuzzy state assumption. An example applying this methodology to the reliability function diagram of a centrifugal pump is presented.

  9. Failure mode diagram of rubble pile asteroids: Application to (25143) asteroid Itokawa

    NASA Astrophysics Data System (ADS)

    Hirabayashi, Masatoshi; Scheeres, Daniel J.

    2016-01-01

    This paper proposes a diagram, termed the failure mode diagram, that shows the variation in asteroidal failure as a function of spin period, and uses it to examine the failure modes and conditions of asteroid (25143) Itokawa. The diagram is useful for describing when and where failure occurs in an asteroid. Assuming that Itokawa is homogeneous, we use a plastic finite element code to obtain the diagram for this object. The results show that if the bulk cohesive strength is less than 0.1 Pa, Itokawa experiences compressional failure on the neck surface at the current spin period of 12.1 hours. At a spin period shorter than 4.5 hours, tension across the neck causes this asteroid to split into two components. It is also found that the motion of the components is bound only if the breakup spin period is longer than 5.2 hours. This implies that once Itokawa splits, the components may escape from one another.
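
A rough order-of-magnitude check on spin-driven failure, far simpler than the paper's plastic finite element analysis, is the classical spin limit of a homogeneous, strengthless sphere; the bulk density below is an assumed value of roughly Itokawa's (~1.9 g/cm^3):

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
RHO = 1.9e3     # assumed bulk density, kg/m^3 (~1.9 g/cm^3)

# For a homogeneous, cohesionless sphere, equatorial material is shed
# when centripetal demand equals self-gravity:
#   omega^2 R = G M / R^2  =>  T_crit = sqrt(3 * pi / (G * rho))
t_crit_s = math.sqrt(3 * math.pi / (G * RHO))
print(f"critical spin period: {t_crit_s / 3600:.2f} h")
```

The result (a little under 2.5 hours) is the familiar "spin barrier" for rubble piles; elongated, weakly cohesive bodies like Itokawa fail at substantially longer periods (4.5-5.2 hours in the study above), which is why a shape- and strength-resolving model is needed.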

  10. The Problem of Labels in E-Assessment of Diagrams

    ERIC Educational Resources Information Center

    Jayal, Ambikesh; Shepperd, Martin

    2009-01-01

    In this article we explore a problematic aspect of automated assessment of diagrams. Diagrams have partial and sometimes inconsistent semantics. Typically much of the meaning of a diagram resides in the labels; however, the choice of labeling is largely unrestricted. This means a correct solution may utilize differing yet semantically equivalent…

  11. Using Tree Diagrams as an Assessment Tool in Statistics Education

    ERIC Educational Resources Information Center

    Yin, Yue

    2012-01-01

    This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures in statistics have not been sufficiently assessed in statistics, despite their importance. This article first presents the rationale and method…

  12. Application of Failure Mode and Effect Analysis (FMEA), cause and effect analysis, and Pareto diagram in conjunction with HACCP to a corn curl manufacturing plant.

    PubMed

    Varzakas, Theodoros H; Arvanitoyannis, Ioannis S

    2007-01-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of corn curl manufacturing. A tentative approach to FMEA application in the snacks industry was attempted in an effort to exclude the presence of GMOs in the final product. This is of crucial importance from both the ethics and the legislation (Regulations EC 1829/2003; EC 1830/2003; Directive EC 18/2001) points of view. Preliminary Hazard Analysis and Fault Tree Analysis were used to analyze and predict the failure modes occurring in a food chain system (a corn curls processing plant), based on the functions, characteristics, and/or interactions of the ingredients or processes upon which the system depends. Critical Control Points were identified and incorporated in the cause and effect diagram (also known as the Ishikawa, tree, or fishbone diagram). Finally, Pareto diagrams were employed toward optimizing the GMO detection potential of FMEA. PMID:17457722

  13. Acute Effects of Vagotomy on Baroreflex Equilibrium Diagram in Rats with Chronic Heart Failure.

    PubMed

    Kawada, Toru; Li, Meihua; Zheng, Can; Sugimachi, Masaru

    2016-01-01

    The arterial baroreflex system can be divided into the neural arc, from pressure input to efferent sympathetic nerve activity (SNA), and the peripheral arc, from SNA to arterial pressure (AP). Plotting the neural and peripheral arcs on a pressure-SNA plane yields a baroreflex equilibrium diagram. We examined the effects of vagotomy on the open-loop static characteristics of the carotid sinus baroreflex in normal control rats (NC, n = 10) and rats with heart failure after myocardial infarction (MI, n = 10). In the NC group, vagotomy shifted the neural arc toward higher SNA and decreased the slope of the peripheral arc. Consequently, the operating-point SNA increased without a significant change in the operating-point AP on the baroreflex equilibrium diagram. These vagotomy-induced effects were not observed in the MI group, suggesting a loss of vagal modulation of the carotid sinus baroreflex function in heart failure. PMID:27594790

  14. Acute Effects of Vagotomy on Baroreflex Equilibrium Diagram in Rats with Chronic Heart Failure

    PubMed Central

    Kawada, Toru; Li, Meihua; Zheng, Can; Sugimachi, Masaru

    2016-01-01

    The arterial baroreflex system can be divided into the neural arc, from pressure input to efferent sympathetic nerve activity (SNA), and the peripheral arc, from SNA to arterial pressure (AP). Plotting the neural and peripheral arcs on a pressure–SNA plane yields a baroreflex equilibrium diagram. We examined the effects of vagotomy on the open-loop static characteristics of the carotid sinus baroreflex in normal control rats (NC, n = 10) and rats with heart failure after myocardial infarction (MI, n = 10). In the NC group, vagotomy shifted the neural arc toward higher SNA and decreased the slope of the peripheral arc. Consequently, the operating-point SNA increased without a significant change in the operating-point AP on the baroreflex equilibrium diagram. These vagotomy-induced effects were not observed in the MI group, suggesting a loss of vagal modulation of the carotid sinus baroreflex function in heart failure. PMID:27594790

  15. Students' Understanding of Diagrams for Solving Word Problems: A Framework for Assessing Diagram Proficiency

    ERIC Educational Resources Information Center

    Poch, Apryl L.; van Garderen, Delinda; Scheuermann, Amy M.

    2015-01-01

    A visual representation, such as a diagram, can be a powerful strategy for solving mathematical word problems. However, using a representation to solve mathematical word problems is not as simple as it seems! Many students with learning disabilities struggle to use a diagram effectively and efficiently. This article provides a framework for…

  16. Failure Assessment of Stainless Steel and Titanium Brazed Joints

    NASA Technical Reports Server (NTRS)

    Flom, Yury A.

    2012-01-01

    Following the successful application of Coulomb-Mohr and interaction equations for the evaluation of safety margins in Albemet 162 brazed joints, two additional base metal/filler metal systems were investigated. Specimens consisting of stainless steel brazed with a silver-based filler metal and titanium brazed with 1100 Al alloy were tested to failure under the combined action of tensile, shear, bending and torsion loads. Finite Element Analysis (FEA), hand calculations and digital image correlation (DIC) techniques were used to estimate failure stresses and construct Failure Assessment Diagrams (FADs). This study confirms that the interaction equation R(sub sigma) + R(sub tau) = 1, where R(sub sigma) and R(sub tau) are the normal and shear stress ratios, can be used as a conservative lower-bound estimate of the failure criterion in stainless steel and titanium brazed joints.

  17. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluating the effect of partial failure on current Probabilistic Risk Assessments (PRAs). An integrated methodology combining thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis in this approach identifies the partial operation effect of any PRA system function in a plant model. The DMLD is used to simulate system performance under partial failure and to inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full-scope PRA to reduce core damage frequency. An example application of the approach is presented. The partial failure data used in the example come from a survey study of partial failure effects in the Nuclear Plant Reliability Data System (NPRDS).

  18. Assessment of failure of cemented polyethylene acetabular component due to bone remodeling: A finite element study.

    PubMed

    Ghosh, Rajesh

    2016-09-01

    The aim of the study is to determine failure of the cemented polyethylene acetabular component that might occur due to excessive bone resorption, cement-bone interface debonding, or fatigue failure of the cement mantle. Three-dimensional finite element models of intact and implanted pelvic bone were developed, and a bone remodeling algorithm was implemented for the present analysis. The Soderberg fatigue failure diagram was used for fatigue assessment of the cement mantle, and the Hoffman failure criterion was considered for prediction of cement-bone interface debonding. Results indicate that fatigue failure of the cement mantle and implant-bone interface debonding might not occur due to bone remodeling. PMID:27408485
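
The Soderberg criterion named in this record reduces to a one-line safety-factor computation. The stress and strength values below are illustrative placeholders, not results from the cited finite element study:

```python
def soderberg_safety_factor(sigma_a, sigma_m, s_endurance, s_yield):
    """Fatigue safety factor n from the Soderberg line:
        sigma_a / S_e + sigma_m / S_y = 1 / n
    where sigma_a is the alternating stress, sigma_m the mean stress,
    S_e the endurance limit and S_y the yield strength. Load points
    with n > 1 lie below the line (no predicted fatigue failure).
    """
    return 1.0 / (sigma_a / s_endurance + sigma_m / s_yield)

# Hypothetical values in MPa, loosely in the range of PMMA bone cement
n = soderberg_safety_factor(sigma_a=4.0, sigma_m=6.0,
                            s_endurance=14.0, s_yield=35.0)
print(f"Soderberg safety factor: {n:.2f}")
```

Soderberg is the most conservative of the common mean-stress corrections (it uses yield rather than ultimate strength on the mean-stress axis), which suits a failure assessment of a load-bearing implant.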

  19. Ground water quality assessment using multi-rectangular diagrams.

    PubMed

    Ahmad, Niaz; Sen, Zekai; Ahmad, Manzoor

    2003-01-01

    A new graphical technique is proposed here for classifying chemical analyses of ground water. In this technique, a diagram is constructed using rectangular coordinates. The new diagram, called a multi-rectangular diagram (MRD), uses adjacent multi-rectangles in which each rectangle represents a specific ground water type. This new diagram has the capability to accommodate a large number of data sets. MRDs have been used to classify chemical analyses of ground water in the Chaj Doab area of Pakistan to illustrate this new approach. Using this graphical method, the differentiated ground water types are calcium bicarbonate, magnesium bicarbonate, sodium bicarbonate, and sodium sulfate. Sodium bicarbonate emerges as the most abundant ground water type. MRDs also offer a visual display of the Chebotarev sequence of ground water quality evolution. PMID:14649865

  20. Failure Assessment of Brazed Structures

    NASA Technical Reports Server (NTRS)

    Flom, Yuri

    2012-01-01

    Despite the great advances in analytical methods available to structural engineers, designers of brazed structures have great difficulty addressing fundamental questions related to the load-carrying capabilities of brazed assemblies. In this chapter we review why such common engineering tools as Finite Element Analysis (FEA), as well as many well-established theories (Tresca, von Mises, highest principal stress, etc.), do not work well for brazed joints. The chapter shows how the classic approach of using interaction equations and the less well-known Coulomb-Mohr failure criterion can be employed to estimate Margins of Safety (MS) in brazed joints.

  1. Assessing Students to Erase Failure

    ERIC Educational Resources Information Center

    Riggins, Cheryl G.

    2006-01-01

    Good education begins with an objective assessment and accurate diagnosis of specific "deficiencies." Transforming assessment from a system of primarily external judgments made by others to a system driven primarily by self-correction is the new frontier for instructional leaders. In this article, the author describes how student-centered…

  2. Assessment of Nonorganic Failure To Thrive.

    ERIC Educational Resources Information Center

    Wooster, Donna M.

    1999-01-01

    This article describes basic assessment considerations for infants and toddlers exhibiting nonorganic failure to thrive. The evaluation process must examine feeding, maternal-child interactions, child temperament, and environmental risks and behaviors. Early identification and intervention are necessary to minimize the long-term developmental…

  3. Nanoparticles in the environment: assessment using the causal diagram approach

    PubMed Central

    2012-01-01

    Nanoparticles (NPs) cause concern for health and safety, as their impact on the environment and humans is not known. Relatively few studies have investigated the toxicological and environmental effects of exposure to naturally occurring NPs (NNPs) and man-made or engineered NPs (ENPs), which are known to have a wide variety of effects once taken up into an organism. A review of recent knowledge (2000-2010) on NP sources and their behaviour, exposure and effects on the environment and humans was performed. An integrated approach was used to compile the available scientific information within an interdisciplinary logical framework, to identify knowledge gaps and to describe environment and health linkages for NNPs and ENPs. The causal diagram was developed as a method to handle the complexity of issues of NP safety, from exposure to effects on the environment and health. It gives an overview of available scientific information, starting with common sources of NPs and their interactions with various environmental processes that may pose threats to both human health and the environment. Effects of NNPs on dust cloud formation and decreases in sunlight intensity were found to be important environmental changes with direct and indirect implications for various human health problems. NNP and ENP exposure, and their accumulation in biological matrices such as microbiota, plants and humans, may result in various adverse effects. The impact of some NPs on human health via ROS generation was found to be one of the major causes of various diseases. A proposed cause-effect diagram for NPs is designed considering both NNPs and ENPs. It represents a valuable information package and user-friendly tool for various stakeholders, including students, researchers and policy makers, to better understand and communicate issues related to NPs. PMID:22759495

  4. Nanoparticles in the environment: assessment using the causal diagram approach.

    PubMed

    Smita, Suchi; Gupta, Shailendra K; Bartonova, Alena; Dusinska, Maria; Gutleb, Arno C; Rahman, Qamar

    2012-01-01

    Nanoparticles (NPs) cause concern for health and safety, as their impact on the environment and humans is not known. Relatively few studies have investigated the toxicological and environmental effects of exposure to naturally occurring NPs (NNPs) and man-made or engineered NPs (ENPs), which are known to have a wide variety of effects once taken up into an organism. A review of recent knowledge (2000-2010) on NP sources and their behaviour, exposure and effects on the environment and humans was performed. An integrated approach was used to compile the available scientific information within an interdisciplinary logical framework, to identify knowledge gaps and to describe environment and health linkages for NNPs and ENPs. The causal diagram was developed as a method to handle the complexity of issues of NP safety, from exposure to effects on the environment and health. It gives an overview of available scientific information, starting with common sources of NPs and their interactions with various environmental processes that may pose threats to both human health and the environment. Effects of NNPs on dust cloud formation and decreases in sunlight intensity were found to be important environmental changes with direct and indirect implications for various human health problems. NNP and ENP exposure, and their accumulation in biological matrices such as microbiota, plants and humans, may result in various adverse effects. The impact of some NPs on human health via ROS generation was found to be one of the major causes of various diseases. A proposed cause-effect diagram for NPs is designed considering both NNPs and ENPs. It represents a valuable information package and user-friendly tool for various stakeholders, including students, researchers and policy makers, to better understand and communicate issues related to NPs. PMID:22759495

  5. Insulation failure assessment under random energization overvoltages

    SciTech Connect

    Mahdy, A.M.; Anis, H.I.; El-Morshedy, A.

    1996-03-01

    This paper offers a new, simple approach to evaluating the risk of failure of external insulation in view of its known probabilistic nature. The approach is applied to EHV transmission systems subjected to energization overvoltages. The randomness in both the applied stresses and the insulation's withstand characteristics is numerically simulated and then integrated to assess the risk of failure. Overvoltage control methods are accounted for, such as the use of pre-insertion breaker resistors, series capacitive compensation, and the installation of shunt reactors.
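
The stress/withstand integration this record describes is the classical insulation coordination risk integral, R = ∫ f(V)·P(V) dV, where f(V) is the overvoltage density and P(V) the flashover probability at voltage V. A minimal numerical sketch, with invented Gaussian parameters rather than the paper's simulated distributions:

```python
import numpy as np
from math import erf

# Illustrative (assumed) parameters, in per-unit voltage
mu_stress, sd_stress = 2.0, 0.25   # energization overvoltage density
v50, sd_withstand = 2.8, 0.15      # 50% flashover voltage and deviation

v = np.linspace(0.5, 4.5, 4001)
dv = v[1] - v[0]

# Stress density f(V): Gaussian PDF
f_stress = (np.exp(-0.5 * ((v - mu_stress) / sd_stress) ** 2)
            / (sd_stress * np.sqrt(2 * np.pi)))

# Flashover probability P(V): Gaussian CDF about the 50% flashover voltage
p_flashover = 0.5 * (1 + np.vectorize(erf)(
    (v - v50) / (sd_withstand * np.sqrt(2))))

# Numerical integration of R = ∫ f(V) P(V) dV
risk = float(np.sum(f_stress * p_flashover) * dv)
print(f"risk of failure per energization: {risk:.3e}")
```

Overvoltage control measures such as pre-insertion resistors act by shifting and narrowing f(V), which this formulation captures directly as a reduced overlap with P(V).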

  6. Assessing legal responsibility for implant failure.

    PubMed

    Palat, M

    1991-04-01

    The number of malpractice suits related to implants has recently increased significantly, with awards that are among the largest in dentistry. This article discusses the principles involved in assessing liability for implant failure and the various clinical situations that can affect liability in implant practice. The author also provides a list of the interrogatories required of defendants in malpractice suits related to implants. PMID:1893392

  7. Ignition Failure Mode Radiochemical Diagnostics Initial Assessment

    SciTech Connect

    Fortner, R; Bernstein, L; Cerjan, C; Haan, S W; Harding, R; Hatchett, S; Hoffman, R; Koch, J; Moody, K; Schneider, D; Stoyer, M; Werner, C; Zimmerman, G

    2007-04-20

    Radiochemical diagnostic signatures are well known to be effective indicators of nuclear ignition and burn reaction conditions. Nuclear activation is already a reliable technique for measuring yield. More comprehensively, though, important quantities such as fuel areal density and ion temperature might be separately and more precisely monitored by a judicious choice of select nuclear reactions. This report details an initial assessment of this approach to diagnosing ignition failures on point-design cryogenic National Ignition Campaign targets. Using newly generated nuclear reaction cross-section data for scandium and iridium, modest uniform doping of the innermost ablator region provides clearly observable reaction product differences between robust burn and failure for either element. Both equatorial and polar tracer loading yield observable, but indistinguishable, signatures for either choice of element for the preliminary cases studied.

  8. Failure detection system risk reduction assessment

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert B. (Inventor); Huang, Zhaofeng (Inventor)

    2012-01-01

    A process includes determining the probability of a failure mode of a system being analyzed reaching a failure limit as a function of time to failure limit, determining the probability of a mitigation of the failure mode as a function of time to failure limit, and quantifying a risk reduction based on the probability of the failure mode reaching the failure limit and the probability of the mitigation.

  9. Derivation of Failure Rates and Probability of Failures for the International Space Station Probabilistic Risk Assessment Study

    NASA Technical Reports Server (NTRS)

    Vitali, Roberto; Lutomski, Michael G.

    2004-01-01

    The National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions but, more importantly, to compare different operational and management options, determine the lowest-risk option, and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and probabilities of failure of the basic events employed in the PRA model of the ISS. The paper shows how a Bayesian approach was used with different sources of data, including actual ISS on-orbit failures, to enhance confidence in the results of the PRA. As time progresses and more meaningful data are gathered from on-orbit failures, increasingly accurate failure rate probability distributions for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping ISS critical systems, such as propulsion, thermal control, and power generation, into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement unit (ORU). The ORU level was chosen to be consistent with the level of statistically meaningful data that could be obtained from the aerospace industry and from experts in the field. For example, data were gathered for the solenoid valves present in the propulsion system of the ISS. However, valves themselves are composed of parts, and the individual failure of these parts was not accounted for in the PRA model; in other words, the failure of a spring within a valve was considered a failure of the valve itself.
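
The Bayesian updating of failure rates described in this record is commonly done with the conjugate gamma-Poisson model: a Gamma(alpha, beta) prior on the rate combined with n failures in t exposure hours yields a Gamma(alpha + n, beta + t) posterior. The prior and "on-orbit" data below are invented for illustration, not ISS figures:

```python
# Conjugate gamma-Poisson update of a failure rate lambda.
# Prior Gamma(alpha0, beta0); evidence: `failures` events in `hours`.
alpha0, beta0 = 0.5, 1.0e5   # hypothetical prior (e.g. industry data)
failures, hours = 2, 4.0e5   # hypothetical on-orbit evidence

alpha_post = alpha0 + failures
beta_post = beta0 + hours
lambda_mean = alpha_post / beta_post   # posterior mean failure rate
print(f"posterior mean failure rate: {lambda_mean:.2e} per hour")
```

As more on-orbit failure data accumulate, the posterior tightens around the observed rate and the prior's influence fades, which is exactly the behavior the abstract describes.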

  10. Assessment of hoist failure rate for Payload Transporter III

    SciTech Connect

    Demmie, P.N.

    1994-02-01

    Assessment of the hoist failure rate for the Payload Transporter Type III (PT-III) hoist was completed as one of the ground transportation tasks for the Minuteman III (MMIII) Weapon System Safety Assessment. The failures of concern are those that lead to dropping a reentry system (RS) during hoist operations in a silo or in the assembly, storage, and inspection building of an MMIII wing. After providing a brief description of the PT-III hoist system, the author summarizes his search for historical data from industry and the military services on failures of electric hoist systems. Since such information was not found, the strategy for assessing a failure rate was to consider failure mechanisms that lead to load-drop accidents, estimate their rates, and sum the rates to obtain the PT-III hoist failure rate. The author discusses failure mechanisms and describes his assessment of a chain failure rate based on data from destructive testing of a chain of the type used for the PT-III hoist and on projected usage rates for hoist operations involving the RS. The main result provides upper bounds for chain failure rates based on these data. No test data were found to estimate failure rates due to mechanisms other than chain failure. The author did not attempt to quantify the effects of human factors on the PT-III hoist failure rate.

  11. Failure risk assessment by analysis and testing

    NASA Technical Reports Server (NTRS)

    Moore, N.; Ebbeler, D.; Creager, M.

    1992-01-01

    The sources of information on which to base an evaluation of reliability or failure risk of an aerospace flight system are (1) experience from tests and flights and (2) engineering analysis. It is rarely feasible to establish high reliability at high confidence by testing aerospace systems or components. Moreover, failure prediction by conventional, deterministic methods of engineering analysis can become arbitrary and subject to serious misinterpretation when uncertain or approximate information is used to establish analysis parameter values and to calibrate the accuracy of engineering models. The limitations of testing to evaluate failure risk are discussed, and a statistical approach which incorporates both engineering analysis and testing is presented.

  12. Assessment of Steel Reinforcement Corrosion State by Parameters of Potentiodynamic Diagrams

    NASA Astrophysics Data System (ADS)

    Krajči, Ľudovít; Jerga, Ján

    2015-12-01

    The environment of the steel reinforcement has a significant impact on the durability and service life of a concrete structure. Corrosion risk arises not only from aggressive substances in the surroundings, but also from the composition of the concrete mixture itself. The use of new types of cements, additives and admixtures must therefore be preceded by verification that they do not themselves initiate corrosion. A more precise physical interpretation of the parameters of potentiodynamic diagrams is needed to allow reliable assessment of the influence of the surrounding environment on the electrochemical behaviour of the reinforcement. An analysis of the zero-retardation limits of potentiodynamic curves is presented.

  13. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    Dana L. Kelly; Dale M. Rasmuson

    2008-09-01

    This paper describes the approach taken by the U. S. Nuclear Regulatory Commission to the treatment of common-cause failure in probabilistic risk assessment of operational events. The approach is based upon the Basic Parameter Model for common-cause failure, and examples are illustrated using the alpha-factor parameterization, the approach adopted by the NRC in their Standardized Plant Analysis Risk (SPAR) models. The cases of a failed component (with and without shared common-cause failure potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g., failure to start and failure to run) is a new feature of this paper. These methods are being applied by the NRC in assessing the risk significance of operational events for the Significance Determination Process (SDP) and the Accident Sequence Precursor (ASP) program.
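
    The alpha-factor parameterization mentioned above has a standard closed form (see NUREG/CR-5485): the probability of a specific common-cause basic event failing exactly k of m redundant components is Q_k = (k / C(m-1, k-1)) (α_k / α_t) Q_t, with α_t = Σ j·α_j. A sketch with illustrative alpha values and total failure probability (not NRC data):

```python
from math import comb

def ccf_probability(k, m, alphas, qt):
    """Alpha-factor model (non-staggered-testing form): probability that a
    specific common-cause basic event fails exactly k of m redundant
    components. alphas[j-1] is the alpha-factor for j-of-m failures;
    qt is the total failure probability of a single component."""
    alpha_t = sum(j * a for j, a in enumerate(alphas, start=1))
    return (k / comb(m - 1, k - 1)) * (alphas[k - 1] / alpha_t) * qt

# Illustrative 3-component group: alpha factors and total failure
# probability chosen arbitrarily for the example.
q_single = ccf_probability(1, 3, [0.95, 0.03, 0.02], 1e-3)  # independent failure
q_all = ccf_probability(3, 3, [0.95, 0.03, 0.02], 1e-3)     # all three fail
```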

  14. Vulnerability Assessment for Cascading Failures in Electric Power Systems

    SciTech Connect

    Baldick, R.; Chowdhury, Badrul; Dobson, Ian; Dong, Zhao Yang; Gou, Bei; Hawkins, David L.; Huang, Zhenyu; Joung, Manho; Kim, Janghoon; Kirschen, Daniel; Lee, Stephen; Li, Fangxing; Li, Juan; Li, Zuyi; Liu, Chen-Ching; Luo, Xiaochuan; Mili, Lamine; Miller, Stephen; Nakayama, Marvin; Papic, Milorad; Podmore, Robin; Rossmaier, John; Schneider, Kevin P.; Sun, Hongbin; Sun, Kai; Wang, David; Wu, Zhigang; Yao, Liangzhong; Zhang, Pei; Zhang, Wenjie; Zhang, Xiaoping

    2008-09-10

    Cascading failures present severe threats to power grid security, and thus vulnerability assessment of power grids is of significant importance. Focusing on analytic methods, this paper reviews the state of the art of vulnerability assessment methods in the context of cascading failures in three categories: steady-state modeling based analysis; dynamic modeling analysis; and non-traditional modeling approaches. The impact of emerging technologies including phasor technology, high-performance computing techniques, and visualization techniques on the vulnerability assessment of cascading failures is then addressed, and future research directions are presented.

  15. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  16. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  17. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  18. 45 CFR 156.961 - Failure to pay assessment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Failure to pay assessment. 156.961 Section 156.961 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS HEALTH... Administrative Review of QHP Issuer Sanctions in Federally-Facilitated Exchanges § 156.961 Failure to...

  19. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  20. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  1. Probabilistic failure assessment with application to solid rocket motors

    NASA Technical Reports Server (NTRS)

    Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.

    1990-01-01

    A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.

  2. Development and validation of standard area diagrams to aid assessment of pecan scab symptoms on pecan fruit

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pecan scab (Fusicladium effusum) causes losses of pecan nutmeat yield and quality in the southeastern U.S. Disease assessment relies on visual rating, which can be inaccurate and imprecise, with poor inter-rater reliability. A standard area diagram (SAD) set for pecan scab on fruit valves was develope...

  3. Methods for Assessing Honeycomb Sandwich Panel Wrinkling Failures

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Bednarcyk, Brett A.

    2012-01-01

    Efficient closed-form methods for predicting the facesheet wrinkling failure mode in sandwich panels are assessed. Comparisons were made with finite element model predictions for facesheet wrinkling, and a validated closed-form method was implemented in the HyperSizer structure sizing software.
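
    The abstract does not reproduce the closed-form expressions assessed in the paper. As a generic illustration of the type of formula involved, a classical Hoff-style facesheet wrinkling estimate, σ_wr = Q·(E_f·E_c·G_c)^(1/3) with an empirical knockdown Q ≈ 0.5, can be sketched; this is an assumption for illustration, not the paper's validated method:

```python
def wrinkling_stress(e_face, e_core, g_core, knockdown=0.5):
    """Hoff-type closed-form facesheet wrinkling estimate:
    sigma_wr = Q * (E_f * E_c * G_c) ** (1/3), where Q is an
    empirical knockdown factor (0.5 is a commonly used value)."""
    return knockdown * (e_face * e_core * g_core) ** (1.0 / 3.0)

# Example: aluminum facesheet (E ~ 70 GPa) on a soft honeycomb core
# (illustrative core moduli); all stresses and moduli in Pa.
sigma_wr = wrinkling_stress(70e9, 1e8, 4e7)
```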

  4. Methods of failure and reliability assessment for mechanical heart pumps.

    PubMed

    Patel, Sonna M; Allaire, Paul E; Wood, Houston G; Throckmorton, Amy L; Tribble, Curt G; Olsen, Don B

    2005-01-01

    Artificial blood pumps are today's most promising bridge-to-recovery (BTR), bridge-to-transplant (BTT), and destination therapy solutions for patients suffering from intractable congestive heart failure (CHF). Due to an increased need for effective, reliable, and safe long-term artificial blood pumps, each new design must undergo failure and reliability testing, an important step prior to approval from the United States Food and Drug Administration (FDA), for clinical testing and commercial use. The FDA has established no specific standards or protocols for these testing procedures and there are only limited recommendations provided by the scientific community when testing an overall blood pump system and individual system components. Product development of any medical device must follow a systematic and logical approach. As the most critical aspects of the design phase, failure and reliability assessments aid in the successful evaluation and preparation of medical devices prior to clinical application. The extent of testing, associated costs, and lengthy time durations to execute these experiments justify the need for an early evaluation of failure and reliability. During the design stages of blood pump development, a failure modes and effects analysis (FMEA) should be completed to provide a concise evaluation of the occurrence and frequency of failures and their effects on the overall support system. Following this analysis, testing of any pump typically involves four sequential processes: performance and reliability testing in simple hydraulic or mock circulatory loops, acute and chronic animal experiments, human error analysis, and ultimately, clinical testing. This article presents recommendations for failure and reliability testing based on the National Institutes of Health (NIH), Society for Thoracic Surgeons (STS) and American Society for Artificial Internal Organs (ASAIO), American National Standards Institute (ANSI), the Association for Advancement of

  5. How many standard area diagram sets are needed for accurate disease severity assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standard area diagram sets (SADs) are widely used in plant pathology: a rater estimates disease severity by comparing an unknown sample to actual severities in the SADs and interpolates an estimate as accurately as possible (although some SADs have been developed for categorizing disease too). Most ...

  6. Computed radionuclide urogram for assessing acute renal failure

    SciTech Connect

    Schlegel, J.U.; Lang, E.K.

    1980-05-01

    The computed radionuclide urogram is advocated as a noninvasive diagnostic method for differentiation of the most common prerenal, renal, and postrenal causes of acute renal failure. On the basis of characteristic changes in the effective renal plasma flow rate, the calculated filtration fraction, and the calculated glomerular filtration rate, prerenal conditions such as renal artery stenosis or thrombosis, renal conditions such as acute rejection or acute tubular necrosis, and postrenal conditions such as obstruction or leakage, which are the most common causes of acute renal failure, can be differentiated. In conjunction with morphologic criteria derived from sonograms, a diagnosis with acceptable confidence can be rendered in most instances. Both the computed radionuclide urogram and sonogram are noninvasive and can be used without adverse effects in the presence of azotemia and even anuria. This also makes feasible reexamination at intervals to assess effect of therapy and offer prognostic information.

  7. Selected component failure rate values from fusion safety assessment tasks

    SciTech Connect

    Cadwallader, L.C.

    1998-09-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers.

  9. Assessing Wind Farm Reliability Using Weather Dependent Failure Rates

    NASA Astrophysics Data System (ADS)

    Wilson, G.; McMillan, D.

    2014-06-01

    Using reliability data comprising two modern, large-scale wind farm sites and wind data from two on-site met masts, a model is developed that calculates wind-speed-dependent failure rates, which are used to populate a Markov chain. Monte Carlo simulation is then used to simulate three wind farms subjected to controlled wind speed conditions from three separate potential UK sites. The model then calculates and compares wind farm reliability, accounting for corrective maintenance and component failure rates influenced by the wind speed at each site. Results show that the components most affected by changes in average daily wind speed are the control system and the yaw system. A comparison between this model and a simpler estimate of site yield is undertaken. The model takes into account the effects of wind speed on the cost of operation and maintenance and also includes the impact of longer periods of downtime in the winter months and shorter periods in the summer. By taking these factors into account, a more detailed site assessment can be undertaken. The model is therefore of significant value to operators and manufacturers.
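
    A toy version of a wind-speed-dependent two-state (up/down) availability simulation, with a simple linear rate scaling assumed purely for illustration (the paper fits its rates from site reliability data):

```python
import random

def simulate_availability(wind_speeds, base_rate, k, repair_rate, seed=0):
    """Two-state (up/down) chain stepped once per day: the daily failure
    probability scales linearly with mean wind speed v as
    base_rate * (1 + k * v) -- a toy assumption, not a fitted model.
    Returns the fraction of days the turbine was available."""
    rng = random.Random(seed)  # seeded for reproducible Monte Carlo runs
    up, up_days = True, 0
    for v in wind_speeds:
        if up:
            if rng.random() < min(1.0, base_rate * (1 + k * v)):
                up = False          # turbine fails this day
            else:
                up_days += 1
        elif rng.random() < repair_rate:
            up = True               # corrective maintenance completes
    return up_days / len(wind_speeds)
```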

  10. Ab-initio calculations and phase diagram assessments of An-Al systems (An = U, Np, Pu)

    NASA Astrophysics Data System (ADS)

    Sedmidubský, D.; Konings, R. J. M.; Souček, P.

    2010-02-01

    The enthalpies of formation of the binary intermetallic compounds AnAl_n (n = 2, 3, 4; An = U, Np, Pu) were assessed from first-principles calculations of total energies performed using the full-potential APW+lo technique within density functional theory (WIEN2k). The substantial contribution to the entropies S°(298) arising from lattice vibrations was calculated by the direct method within the harmonic crystal approximation (Phonon software + VASP for obtaining the Hellmann-Feynman forces). The electronic heat capacity and the corresponding contribution to the entropy were estimated from the density of states at the Fermi level obtained from electronic structure calculations. The phase diagrams of the relevant An-Al systems were calculated from the thermodynamic data assessed from the ab-initio calculations together with known equilibrium and calorimetry data, employing the FactSage program.
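
    The electronic entropy contribution mentioned above is commonly estimated from the Sommerfeld free-electron relation S_el(T) = γT with γ = (π²/3)·k_B²·N(E_F). The relation is standard, but the numbers below are purely illustrative:

```python
from math import pi

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def electronic_entropy(dos_fermi, temperature):
    """Sommerfeld estimate of the electronic entropy: S_el = gamma * T,
    gamma = (pi**2 / 3) * k_B**2 * N(E_F), with N(E_F) in states/eV per
    formula unit; the result is in eV/K per formula unit."""
    gamma = (pi ** 2 / 3) * K_B_EV ** 2 * dos_fermi
    return gamma * temperature

# Illustrative: N(E_F) = 2 states/eV per formula unit at 300 K.
s_el = electronic_entropy(2.0, 300.0)
```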

  11. Use of standard area diagrams to improve assessment of pecan scab on fruit

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pecan scab (Fusicladium effusum) causes significant economic losses of pecan throughout the southeastern US. Disease assessment relies on visual rating of disease severity, which can be inaccurate and imprecise, with poor repeatability and reproducibility. Accurate, precise assessments are important fo...

  12. Comparison of Body Composition Assessment Methods in Pediatric Intestinal Failure

    PubMed Central

    Mehta, Nilesh M.; Raphael, Bram; Guteirrez, Ivan; Quinn, Nicolle; Mitchell, Paul D.; Litman, Heather J.; Jaksic, Tom; Duggan, Christopher P.

    2015-01-01

    Objectives To examine the agreement of multifrequency bioelectric impedance analysis (BIA) and anthropometry with reference methods for body composition assessment in children with intestinal failure (IF). Methods We conducted a prospective pilot study in children 14 years of age or younger with IF resulting from either short bowel syndrome (SBS) or motility disorders. Bland-Altman analysis was used to examine the agreement between BIA and deuterium dilution in measuring total body water (TBW) and lean body mass (LBM); and between BIA and dual X-ray absorptiometry (DXA) techniques in measuring LBM and FM. Fat mass (FM) and percent body fat (%BF) measurements by BIA and anthropometry were also compared in relation to those measured by deuterium dilution. Results Fifteen children with IF, median (IQR) age 7.2 (5.0, 10.0) years, 10 (67%) male, were studied. BIA and deuterium dilution were in good agreement with a mean bias (limits of agreement) of 0.9 (-3.2, 5.0) for TBW (L) and 0.1 (-5.4 to 5.6) for LBM (kg) measurements. The mean bias (limits) for FM (kg) and %BF measurements were 0.4 (-3.8, 4.6) kg and 1.7 (-16.9, 20.3)% respectively. The limits of agreement were within 1 SD of the mean bias in 12/14 (86%) subjects for TBW and LBM, and in 11/14 (79%) for FM and %BF measurements. Mean bias (limits) for LBM (kg) and FM (kg) between BIA and DXA were 1.6 (-3.0 to 6.3) kg and -0.1 (-3.2 to 3.1) kg, respectively. Mean bias (limits) for FM (kg) and %BF between anthropometry and deuterium dilution were 0.2 (-4.2, 4.6) and -0.2 (-19.5 to 19.1), respectively. The limits of agreement were within 1 SD of the mean bias in 10/14 (71%) subjects. Conclusions In children with intestinal failure, TBW and LBM measurements by the multifrequency BIA method were in agreement with isotope dilution and DXA methods, with small mean bias. In comparison to deuterium dilution, BIA was comparable to anthropometry for FM and %BF assessments with small mean bias. However, the limits of agreement
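
    The Bland-Altman agreement statistics used above (mean bias and 95% limits of agreement, bias ± 1.96·SD of the paired differences) can be sketched as:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 * SD of the
    pairwise differences) between two measurement methods applied to the
    same subjects."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```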

  13. Assessment of Commonly Available Educational Materials in Heart Failure Clinics

    PubMed Central

    Taylor-Clarke, Kimberli; Henry-Okafor, Queen; Murphy, Clare; Keyes, Madeline; Rothman, Russell; Churchwell, Andre; Mensah, George A.; Sawyer, Douglas; Sampson, Uchechukwu K. A.

    2014-01-01

    Background Health literacy (HL) is an established independent predictor of cardiovascular outcomes. Approximately 90 million Americans have limited HL and read at ≤ 5th grade-level. Therefore, we sought to determine the suitability and readability level of common cardiovascular patient education materials (PEM) related to heart failure and heart-healthy lifestyle. Methods and Results The suitability and readability of written PEMs were assessed using the suitability assessment of materials (SAM) and Fry readability formula. The SAM criteria are comprised of the following categories: message content, text appearance, visuals, and layout and design. We obtained a convenience sample of 18 English-written cardiovascular PEMs freely available from major health organizations. Two reviewers independently appraised the PEMs. Final suitability scores ranged from 12 to 87%. Readability levels ranged between 3rd and 15th grade-level; the average readability level was 8th grade. Ninety-four percent of the PEMs were rated either superior or adequate on text appearance, but ≥ 50% of the PEMs were rated inadequate on each of the other categories of the SAM criteria. Only two (11%) PEMs had the optimum suitability score of ≥ 70% and ≤ 5th grade readability level suitable for populations with limited HL. Conclusions Commonly available cardiovascular PEMs used by some major healthcare institutions are not suitable for the average American patient. The true prevalence of suboptimal PEMs needs to be determined as it potentially negatively impacts optimal healthcare delivery and outcomes. PMID:21743339

  14. A failure diagnosis and impact assessment prototype for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Baker, Carolyn G.; Marsh, Christopher A.

    1991-01-01

    NASA is investigating the use of advanced automation to enhance crew productivity for Space Station Freedom in numerous areas, one being failure management. A prototype is described that diagnoses failure sources and assesses the future impacts of those failures on other Freedom entities.

  15. Assessing patient preferences in heart failure using conjoint methodology

    PubMed Central

    Pisa, Giovanni; Eichmann, Florian; Hupfer, Stephan

    2015-01-01

    Aim The course of heart failure (HF) is characterized by frequent hospitalizations, a high mortality rate, as well as a severely impaired health-related quality of life (HRQoL). To optimize disease management, understanding of patient preferences is crucial. We aimed to assess patient preferences using conjoint methodology and HRQoL in patients with HF. Methods Two modules were applied: an initial qualitative module, consisting of in-depth interviews with 12 HF patients, and the main quantitative module in 300 HF patients from across Germany. Patients were stratified according to the time of their last HF hospitalization. Each patient was presented with ten different scenarios during the conjoint exercise. Additionally, patients completed the generic HRQoL instrument, EuroQol health questionnaire (EQ-5D™). Results The attribute with the highest relative importance was dyspnea (44%), followed by physical capacity (18%). Of similar importance were exhaustion during mental activities (13%), fear due to HF (13%), and autonomy (12%). The most affected HRQoL dimensions according to the EQ-5D questionnaire were anxiety/depression (23% with severe problems), pain/discomfort (19%), and usual activities (15%). Overall average EQ-5D score was 0.39 with stable, chronic patients (never hospitalized) having a significantly better health state vs the rest of the cohort. Conclusion This paper analyzed patient preference in HF using a conjoint methodology. The preference weights resulting from the conjoint analysis could be used in future to design HRQoL questionnaires which could better assess patient preferences in HF care. PMID:26345530
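
    In conjoint analysis, an attribute's relative importance is typically computed as the range of its part-worth utilities divided by the sum of the ranges over all attributes. A sketch with made-up part-worths (not the study's data):

```python
def relative_importance(partworths):
    """Attribute importance from conjoint part-worth utilities:
    range of an attribute's part-worths divided by the sum of the
    ranges over all attributes; the returned values sum to 1."""
    ranges = {attr: max(u) - min(u) for attr, u in partworths.items()}
    total = sum(ranges.values())
    return {attr: r / total for attr, r in ranges.items()}

# Hypothetical part-worth utilities for two attributes.
importance = relative_importance({
    "dyspnea": [0.0, 2.0],
    "autonomy": [0.0, 1.0],
})
```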

  16. Thermodynamic Diagrams

    NASA Astrophysics Data System (ADS)

    Chaston, Scot

    1999-02-01

    Thermodynamic data such as equilibrium constants, standard cell potentials, molar enthalpies of formation, and standard entropies of substances can be a very useful basis for an organized presentation of knowledge in diverse areas of applied chemistry. Thermodynamic data can become particularly useful when incorporated into thermodynamic diagrams that are designed to be easy to recall, to serve as a basis for reconstructing previous knowledge, and to determine whether reactions can occur exergonically or only with the help of an external energy source. Few students in our chemistry-based courses would want to acquire the depth of knowledge or rigor of professional thermodynamicists. But they should nevertheless learn how to make good use of thermodynamic data in their professional occupations that span the chemical, biological, environmental, and medical laboratory fields. This article discusses examples of three thermodynamic diagrams that have been developed for this purpose. They are the thermodynamic energy account (TEA), the total entropy scale, and the thermodynamic scale diagrams. These diagrams help in the teaching and learning of thermodynamics by bringing the imagination into the process of developing a better understanding of abstract thermodynamic functions, and by allowing the reader to keep track of specialist thermodynamic discourses in the literature.

  17. Generic component failure data base for light water and liquid sodium reactor PRAs (probabilistic risk assessments)

    SciTech Connect

    Eide, S.A.; Chmielewski, S.V.; Swantz, T.D.

    1990-02-01

    A comprehensive generic component failure data base has been developed for light water and liquid sodium reactor probabilistic risk assessments (PRAs). The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) and the Centralized Reliability Data Organization (CREDO) data bases were used to generate component failure rates. Using this approach, most of the failure rates are based on actual plant data rather than existing estimates. 21 refs., 9 tabs.

  18. Consequence assessment of large rock slope failures in Norway

    NASA Astrophysics Data System (ADS)

    Oppikofer, Thierry; Hermanns, Reginald L.; Horton, Pascal; Sandøy, Gro; Roberts, Nicholas J.; Jaboyedoff, Michel; Böhme, Martina; Yugsi Molina, Freddy X.

    2014-05-01

    Steep glacially carved valleys and fjords in Norway are prone to many landslide types, including large rockslides, rockfalls, and debris flows. Large rockslides and their secondary effects (rockslide-triggered displacement waves, inundation behind landslide dams and outburst floods from failure of landslide dams) pose a significant hazard to the population living in the valleys and along the fjord shorelines. The Geological Survey of Norway performs systematic mapping of unstable rock slopes in Norway and has detected more than 230 unstable slopes with significant postglacial deformation. This large number necessitates prioritisation of follow-up activities, such as more detailed investigations, periodic displacement measurements, continuous monitoring and early-warning systems. Prioritisation is achieved through a hazard and risk classification system, which has been developed by a panel of international and Norwegian experts (www.ngu.no/en-gb/hm/Publications/Reports/2012/2012-029). The risk classification system combines a qualitative hazard assessment with a consequence assessment focusing on potential loss of life. The hazard assessment is based on a series of nine geomorphological, engineering geological and structural criteria, as well as displacement rates, past events and other signs of activity. We present a method for consequence assessment comprising four main steps: 1. computation of the volume of the unstable rock slope; 2. run-out assessment based on the volume-dependent angle of reach (Fahrböschung) or detailed numerical run-out modelling; 3. assessment of possible displacement wave propagation and run-up based on empirical relations or modelling in 2D or 3D; and 4. estimation of the number of persons exposed to rock avalanches or displacement waves.
Volume computation of an unstable rock slope is based on the sloping local base level technique, which uses a digital elevation model to create a second-order curved surface between the mapped extent of
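
    The volume-dependent angle of reach in step 2 can be illustrated with Scheidegger's (1973) empirical regression for rock avalanches, log10(tan α) = 0.62419 − 0.15666·log10(V). The coefficients are the commonly quoted published values, but this sketch is an assumption and not the authors' exact implementation:

```python
from math import log10

def runout_length(drop_height_m, volume_m3):
    """Horizontal run-out length L = H / tan(alpha), with a
    volume-dependent angle of reach alpha from Scheidegger's (1973)
    regression: log10(tan(alpha)) = 0.62419 - 0.15666 * log10(V)."""
    tan_alpha = 10 ** (0.62419 - 0.15666 * log10(volume_m3))
    return drop_height_m / tan_alpha
```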

  20. Assessing mechanical vulnerability in water distribution networks under multiple failures

    NASA Astrophysics Data System (ADS)

    Berardi, Luigi; Ugarelli, Rita; Røstum, Jon; Giustolisi, Orazio

    2014-03-01

    Understanding the mechanical vulnerability of water distribution networks (WDN) is of direct relevance for water utilities since it serves two different purposes. On the one hand, it supports the identification of severe failure scenarios due to external causes (e.g., natural or intentional events) that result in the most critical consequences for WDN supply capacity. On the other hand, it aims to identify the WDN portions that are most prone to be affected by asset disruptions. The complexity of such analysis stems from the number of possible scenarios with single and multiple simultaneous shutdowns of asset elements, leading to modifications of network topology and insufficient water supply to customers. In this work, the search for the most disruptive combinations of multiple asset failure events is formulated and solved as a multiobjective optimization problem. The highest-vulnerability failure scenarios are detected as those causing the lowest supplied demand with the fewest simultaneous failures. The automatic detection of WDN topology, subsequent to the detachment of failed elements, is combined with pressure-driven analysis. The methodology is demonstrated on a real water distribution network. Results show that, besides the failures causing the detachment of reservoirs, tanks, or pumps, there are other topological modifications that may cause severe WDN service disruptions. Such information is of direct relevance for planning asset enhancement works and improving preparedness for extreme events.
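
    The core of the multiobjective search, keeping scenarios for which no other scenario achieves both fewer simultaneous failures and lower supplied demand, is a Pareto filter. A brute-force sketch over precomputed scenario results (illustrative tuples, not data from the paper):

```python
def pareto_worst(scenarios):
    """Keep the non-dominated failure scenarios: a scenario (n, d) with n
    simultaneous failures and supplied demand d is dominated if another
    scenario has n' <= n and d' <= d with at least one strict inequality."""
    front = []
    for i, (n_i, d_i) in enumerate(scenarios):
        dominated = any(
            n_j <= n_i and d_j <= d_i and (n_j < n_i or d_j < d_i)
            for j, (n_j, d_j) in enumerate(scenarios) if j != i
        )
        if not dominated:
            front.append((n_i, d_i))
    return front
```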

  1. Common-Cause Failure Treatment in Event Assessment: Basis for a Proposed New Model

    SciTech Connect

    Dana Kelly; Song-Hua Shen; Gary DeMoss; Kevin Coyne; Don Marksberry

    2010-06-01

    Event assessment is an application of probabilistic risk assessment in which observed equipment failures and outages are mapped into the risk model to obtain a numerical estimate of the event’s risk significance. In this paper, we focus on retrospective assessments to estimate the risk significance of degraded conditions such as equipment failure accompanied by a deficiency in a process such as maintenance practices. In modeling such events, the basic events in the risk model that are associated with observed failures and other off-normal situations are typically configured to be failed, while those associated with observed successes and unchallenged components are assumed capable of failing, typically with their baseline probabilities. This is referred to as the failure memory approach to event assessment. The conditioning of common-cause failure probabilities for the common cause component group associated with the observed component failure is particularly important, as it is insufficient to simply leave these probabilities at their baseline values, and doing so may result in a significant underestimate of risk significance for the event. Past work in this area has focused on the mathematics of the adjustment. In this paper, we review the Basic Parameter Model for common-cause failure, which underlies most current risk modelling, discuss the limitations of this model with respect to event assessment, and introduce a proposed new framework for common-cause failure, which uses a Bayesian network to model underlying causes of failure, and which has the potential to overcome the limitations of the Basic Parameter Model with respect to event assessment.

  2. Usefulness of the culturally adapted oxygen-cost diagram in the assessment of dyspnea in Puerto Rico

    PubMed Central

    Santos Rodríguez, Ruth A.; Dexter, Donald; Nieves-Plaza, Mariely; Nazario, Cruz M.

    2015-01-01

    Objective Breathlessness is a common and disabling symptom of pulmonary disease. Measuring its severity is recommended as such measurements can be helpful in both clinical and research settings. The oxygen-cost diagram (OCD) and the Medical Research Council (MRC) dyspnea scale were developed in English to measure severity of dyspnea. These scales were previously translated to Spanish and adapted for use in a Hispanic population. The objective of this study is to assess the psychometric properties of these scales. We hypothesized that the scales would correlate well with measures of physiological impairment. Methods Subjects having pulmonary disease rated their perceptions of dyspnea using the scales, performed spirometry, and completed a 6-min walk test. Spearman correlation coefficients (r) were used to correlate dyspnea scores with spirometric parameters and distance walked (6MWD). Results Sixty-six patients having stable asthma (n = 36), chronic obstructive pulmonary disease (n = 19), or interstitial lung disease (n = 11) participated in the study. OCD scores showed a significant correlation with FEV1 (r = 0.41; p<0.01), FEV1% (r = 0.36; p<0.01), FVC (r = 0.44; p<0.01), and FVC% (r = 0.37; p<0.01) in the study population. The OCD scores were highly correlated with 6MWD (r = 0.59, p<0.01). The MRC dyspnea scale showed a significant inverse correlation with FEV1 (r = −0.34; p<0.01) and 6MWD (r = −0.33; p<0.05), but these correlations were weaker than those with the OCD scale. Conclusions The severity of breathlessness as measured by the adapted Spanish OCD showed a moderate to high correlation with spirometric parameters and 6MWD; therefore, the adapted OCD should prove to be useful in Puerto Rico. PMID:25856872
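    Spearman's r, used throughout the study above, is simply the Pearson correlation of the ranks. A self-contained sketch (with midranks for ties, and invented sample data rather than the study's measurements):

```python
def ranks(xs):
    """Midranks: tied values get the average of their 1-based ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # extend over a run of tied values (adjacent once sorted)
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented example: dyspnea score vs. 6-minute walk distance (m)
ocd = [2, 4, 5, 7, 8]
walk = [150, 300, 320, 450, 500]
r = spearman(ocd, walk)
```

    Because Spearman's r works on ranks, it captures the monotone association reported above without assuming linearity between dyspnea scores and walk distance.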

  3. Proof test diagrams for Zerodur glass-ceramic

    NASA Technical Reports Server (NTRS)

    Tucker, D. S.

    1991-01-01

    Proof test diagrams for Zerodur glass-ceramics are calculated from available fracture mechanics data. It is shown that the environment has a large effect on minimum time-to-failure as predicted by proof test diagrams.

  4. A Probabilistic-Micro-mechanical Methodology for Assessing Zirconium Alloy Cladding Failure

    SciTech Connect

    Pan, Y.M.; Chan, K.S.; Riha, D.S.

    2007-07-01

    Cladding failure of fuel rods caused by hydride-induced embrittlement is a reliability concern for spent nuclear fuel after extended burnup. Uncertainties in the cladding temperature, cladding stress, oxide layer thickness, and the critical stress value for hydride reorientation preclude an assessment of the cladding failure risk. A set of micro-mechanical models for treating oxide cracking, blister cracking, delayed hydride cracking, and cladding fracture was developed and incorporated in a computer model. Results obtained from the preliminary model calculations indicate that at temperatures below a critical temperature of 318.5 deg. C [605.3 deg. F], the time to failure by delayed hydride cracking in Zr-2.5%Nb decreased with increasing cladding temperature. The overall goal of this project is to develop a probabilistic-micro-mechanical methodology for assessing the probability of hydride-induced failure in Zircaloy cladding and thereby establish performance criteria. (authors)

  5. Probabilistic assessment of failure in adhesively bonded composite laminates

    SciTech Connect

    Minnetyan, L.; Chamis, C.C.

    1997-07-01

    Damage initiation and progressive fracture of adhesively bonded graphite/epoxy composites is investigated under tensile loading. A computer code is utilized for the simulation of composite structural damage and fracture. Structural response is assessed probabilistically during degradation. The effects of design variable uncertainties on structural damage progression are quantified. The Fast Probability Integrator is used to assess the response scatter in the composite structure at damage initiation. Sensitivity of the damage response to design variables is computed. Methods are general purpose in nature and are applicable to all types of laminated composite structures and joints, starting from damage initiation to unstable damage propagation and collapse. Results indicate that composite constituent and adhesive properties have a significant effect on structural durability. Damage initiation/progression does not necessarily begin in the adhesive bond. Design implications with regard to damage tolerance of adhesively bonded joints are examined.

  6. Lumbar Transpedicular Implant Failure: A Clinical and Surgical Challenge and Its Radiological Assessment

    PubMed Central

    Ali, Abdel Mohsen Arafa

    2014-01-01

    Study Design This is a multicenter, controlled case-study review of a large series of pedicle-screw procedures performed from January 2000 to June 2010. The outcomes were compared to those of patients with no implant failure. Purpose The purpose of this study was to retrospectively review the outcome of 100 patients with implant failure in comparison to 100 control patients, and to study the causes of failure and its prevention. Overview of Literature Transpedicular fixation is associated with risks of hardware failure, such as screw/rod breakage and/or loosening at the screw-rod interface and difficulties in the system assembly, which remain a significant clinical problem. Removal or revision of the spinal hardware is often required. Methods Two hundred patients (88 women, 112 men) were divided into 2 major groups, with 100 patients in group I (implant failure group G1) and 100 patients in group II (successful fusion, control group G2). We subdivided the study groups into two subgroups: subgroup a (single-level instrumented group) and subgroup b (multilevel instrumented group). The implant status was assessed based on intraoperative and follow-up radiographs. Results Implant failure in general was present in 36% in G1a and in 64% in G1b, and types of implant failure included screw fracture (34%), rod fracture (24%), rod loosening (22%), screw loosening (16%), and failure of both rod and screw (4%). Most of the failures (90%) occurred within 6 months after surgery, with no reported cases 1 year postoperatively. Conclusions We tried to address the problem and study the causes of failure, and proposed solutions for its prevention. PMID:24967042

  7. Beyond ejection fraction: an integrative approach for assessment of cardiac structure and function in heart failure.

    PubMed

    Cikes, Maja; Solomon, Scott D

    2016-06-01

    Left ventricular ejection fraction (LVEF) has been the central parameter used for diagnosis and management in patients with heart failure. A good predictor of adverse outcomes in heart failure when below ∼45%, LVEF is less useful as a marker of risk as it approaches normal. As a measure of cardiac function, ejection fraction has several important limitations. Calculated as the stroke volume divided by end-diastolic volume, the estimation of ejection fraction is generally based on geometric assumptions that allow for assessment of volumes based on linear or two-dimensional measurements. Left ventricular ejection fraction is both preload- and afterload-dependent, can change substantially based on loading conditions, is only moderately reproducible, and represents only a single measure of risk in patients with heart failure. Moreover, the relationship between ejection fraction and risk in patients with heart failure is modified by factors such as hypertension, diabetes, and renal function. A more complete evaluation and understanding of left ventricular function in patients with heart failure requires a more comprehensive assessment: we conceptualize an integrative approach that incorporates measures of left and right ventricular function, left ventricular geometry, left atrial size, and valvular function, as well as non-imaging factors (such as clinical parameters and biomarkers), providing a comprehensive and accurate prediction of risk in heart failure. PMID:26417058

  8. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  9. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  10. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  11. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  12. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  13. ANALYSIS OF SEQUENTIAL FAILURES FOR ASSESSMENT OF RELIABILITY AND SAFETY OF MANUFACTURING SYSTEMS. (R828541)

    EPA Science Inventory

    Assessment of reliability and safety of a manufacturing system with sequential failures is an important issue in industry, since the reliability and safety of the system depend not only on all failed states of system components, but also on the sequence of occurrences of those...

  14. VALIDATION OF PROTOCOLS FOR ASSESSING EARLY PREGNANCY FAILURE IN THE RAT: CLOMIPHENE CITRATE

    EPA Science Inventory

    Following the assembly of a battery of protocols for the assessment of maternally-mediated toxicity during early pregnancy, the validation of this battery for its utility in detecting and defining mechanisms of early pregnancy failure is ongoing. This report describes the use of c...

  15. Application of ISO22000 and Failure Mode and Effect Analysis (fmea) for Industrial Processing of Poultry Products

    NASA Astrophysics Data System (ADS)

    Varzakas, Theodoros H.; Arvanitoyannis, Ioannis S.

    The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of poultry slaughtering and manufacturing. In this work, a comparison of ISO22000 analysis with HACCP is carried out for poultry slaughtering, processing, and packaging. Critical Control Points and Prerequisite Programs (PrPs) have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram).

  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Astrophysics Data System (ADS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-06-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
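    The step of folding test or flight experience into an analysis-based failure probability can be sketched with a conjugate Beta-binomial update. This is a generic illustration under invented numbers, not the PFA methodology's actual statistical procedure.

```python
# Beta(a, b) prior on the per-flight failure probability of one failure
# mode, e.g. encoding an engineering-analysis estimate (values invented).
a, b = 0.5, 99.5            # prior mean = a / (a + b) = 0.005

# Observed experience from tests and flights.
successes, failures = 40, 0

# Conjugate update: the posterior is Beta(a + failures, b + successes).
a_post, b_post = a + failures, b + successes
prior_mean = a / (a + b)
post_mean = a_post / (a_post + b_post)
```

    Forty failure-free demonstrations pull the mean failure probability down from 0.005, mirroring how the PFA structure lets flight experience modify analysis-derived failure distributions.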

  17. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Astrophysics Data System (ADS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-06-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  18. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  19. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  20. Performance improvement through proactive risk assessment: Using failure modes and effects analysis

    PubMed Central

    Yarmohammadian, Mohammad Hossein; Abadi, Tahereh Naseri Boori; Tofighi, Shahram; Esfahani, Sekine Saghaeiannejad

    2014-01-01

    Introduction: Cognizance of error-prone professional activities has a great impact on the continuity of professional organizations in a competitive atmosphere, particularly in the health care industry, where every second has critical value in saving patients' lives. Considering the invaluable functions of the medical record department, as a legal document and for the continuity of health care, "failure mode and effects analysis (FMEA)" was utilized to identify the ways a process can fail and how it can be made safer. Materials and Methods: The structured approach involved assembling a team of experts, employing a trained facilitator, introducing the rating scales and process during team orientation, and collectively scoring failure modes. The probability of the failure-effect combination was related to the frequency of occurrence, potential severity, and likelihood of detection before causing any harm to the staff or patients. Frequency, severity, and detectability were each given a score from 1 to 10. Risk priority numbers were calculated. Results: In total, 56 failure modes were identified across the subunits of the medical record department, including the admission unit (divided into emergency, outpatient, and inpatient classes), statistics, health data organizing and data processing, and medical coding units. Although most failure modes were classified as high risk, limited resources were an impediment to implementing all recommended actions at the same time. Conclusion: Proactive risk assessment methods such as FMEA enable health care administrators to identify where and what safeguards are needed to protect against a bad outcome even when an error does occur. PMID:25013821
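    The risk priority number (RPN) scoring described above is simply the product of the three 1-10 ratings. A minimal sketch with invented failure modes (not the study's actual 56):

```python
# (failure mode, occurrence O, severity S, detectability D), each 1-10.
# The entries are invented examples for illustration.
modes = [
    ("misfiled record", 7, 6, 5),
    ("wrong patient ID at admission", 3, 9, 4),
    ("coding error", 5, 7, 4),
]

# RPN = O * S * D (range 1-1000); higher RPN means higher priority.
ranked = sorted(
    ((name, o * s * d) for name, o, s, d in modes),
    key=lambda item: item[1],
    reverse=True,
)
```

    Sorting by RPN gives the triage order the abstract describes; when resources are limited, only the top of this list gets remediated first.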

  1. Assessing performance and validating finite element simulations using probabilistic knowledge

    SciTech Connect

    Dolin, Ronald M.; Rodriguez, E. A.

    2002-01-01

    Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability each event causes failure along with the event's likelihood of occurrence contribute to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
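    The Latin-hypercube sampling mentioned above can be sketched generically: split each variable's unit range into N equal strata, draw one point per stratum, and shuffle the strata independently per dimension. This is a textbook sketch under that standard scheme, not the paper's implementation.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """One sample per stratum per dimension, strata shuffled per dim."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        # point drawn uniformly inside stratum [s/n, (s+1)/n)
        cols.append([(s + rng.random()) / n_samples for s in strata])
    # transpose per-dimension columns into per-sample points
    return [tuple(col[i] for col in cols) for i in range(n_samples)]

pts = latin_hypercube(10, 2)
```

    Unlike plain Monte Carlo, every marginal stratum is hit exactly once, which is what makes the stochastic assessment of events efficient at small sample sizes.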

  2. The assessment of low probability containment failure modes using dynamic PRA

    NASA Astrophysics Data System (ADS)

    Brunett, Acacia Joann

    Although low probability containment failure modes in nuclear power plants may lead to large releases of radioactive material, these modes are typically crudely modeled in system level codes and have large associated uncertainties. Conventional risk assessment techniques (i.e. the fault-tree/event-tree methodology) are capable of accounting for these failure modes to some degree, however, they require the analyst to pre-specify the ordering of events, which can vary within the range of uncertainty of the phenomena. More recently, dynamic probabilistic risk assessment (DPRA) techniques have been developed which remove the dependency on the analyst. Through DPRA, it is now possible to perform a mechanistic and consistent analysis of low probability phenomena, with the timing of the possible events determined by the computational model simulating the reactor behavior. The purpose of this work is to utilize DPRA tools to assess low probability containment failure modes and the driving mechanisms. Particular focus is given to the risk-dominant containment failure modes considered in NUREG-1150, which has long been the standard for PRA techniques. More specifically, this work focuses on the low probability phenomena occurring during a station blackout (SBO) with late power recovery in the Zion Nuclear Power Plant, a Westinghouse pressurized water reactor (PWR). Subsequent to the major risk study performed in NUREG-1150, significant experimentation and modeling regarding the mechanisms driving containment failure modes have been performed. In light of this improved understanding, NUREG-1150 containment failure modes are reviewed in this work using the current state of knowledge. For some unresolved mechanisms, such as containment loading from high pressure melt ejection and combustion events, additional analyses are performed using the accident simulation tool MELCOR to explore the bounding containment loads for realistic scenarios. A dynamic treatment in the

  3. Preliminary Master Logic Diagram for ITER operation

    SciTech Connect

    Cadwallader, L.C.; Taylor, N.P.; Poucet, A.E.

    1998-04-01

    This paper describes the work performed to develop a Master Logic Diagram (MLD) for the operations phase of the International Thermonuclear Experimental Reactor (ITER). The MLD is a probabilistic risk assessment tool used to identify the broad set of potential initiating events that could lead to an offsite radioactive or toxic chemical release from the facility under study. The MLD described here is complementary to the failure modes and effects analyses (FMEAs) that have been performed for ITER's major plant systems in the engineering evaluation of the facility design. While the FMEAs are a bottom-up or component level approach, the MLD is a top-down or facility level approach to identifying the broad spectrum of potential events. Strengths of the MLD are that it analyzes the entire plant, depicts completeness in the accident initiator process, provides an independent method for identification, and can also identify potential system interactions. MLDs have been used successfully as a hazard analysis tool. This paper describes the process used for the ITER MLD to treat the variety of radiological and toxicological source terms present in the ITER design. One subtree of the nineteen-page MLD is shown to illustrate the levels of the diagram.

  4. Phase Equilibria Diagrams Database

    National Institute of Standards and Technology Data Gateway

    SRD 31 NIST/ACerS Phase Equilibria Diagrams Database (PC database for purchase)   The Phase Equilibria Diagrams Database contains commentaries and more than 21,000 diagrams for non-organic systems, including those published in all 21 hard-copy volumes produced as part of the ACerS-NIST Phase Equilibria Diagrams Program (formerly titled Phase Diagrams for Ceramists): Volumes I through XIV (blue books); Annuals 91, 92, 93; High Tc Superconductors I & II; Zirconium & Zirconia Systems; and Electronic Ceramics I. Materials covered include oxides as well as non-oxide systems such as chalcogenides and pnictides, phosphates, salt systems, and mixed systems of these classes.

  5. Noninvasive radiographic assessment of cardiovascular function in acute and chronic respiratory failure

    SciTech Connect

    Berger, H.J.; Matthay, R.A.

    1981-04-01

    Noninvasive radiographic techniques have provided a means of studying the natural history and pathogenesis of cardiovascular performance in acute and chronic respiratory failure. Chest radiography, radionuclide angiocardiography and thallium-201 imaging, and M mode and cross-sectional echocardiography have been employed. Each of these techniques has specific uses, attributes and limitations. For example, measurement of descending pulmonary arterial diameters on the plain chest radiograph allows determination of the presence or absence of pulmonary arterial hypertension. Right and left ventricular performance can be evaluated at rest and during exercise using radionuclide angiocardiography. The biventricular response to exercise and to therapeutic interventions also can be assessed with this approach. Evaluation of the pulmonary valve echogram and echocardiographic right ventricular dimensions have been shown to reflect right ventricular hemodynamics and size. Each of these noninvasive techniques has been applied to the study of patients with respiratory failure and has provided important physiologic data.

  6. Endothelial dysfunction as assessed with magnetic resonance imaging - A major determinant in chronic heart failure.

    PubMed

    Kovačić, Slavica; Plazonić, Željko; Batinac, Tanja; Miletić, Damir; Ružić, Alen

    2016-05-01

    Chronic heart failure (CHF) is a clinical syndrome resulting from the interaction of different structural and functional disturbances, leading to a decreased ability of the heart to ensure an adequate supply of oxygenated blood to tissues and to meet metabolic needs in cases of normal or increased afterload. Endothelial dysfunction (ED) is a pathological condition characterized by a general imbalance of all major endothelial mechanisms, with a key role in the development and progression of atherosclerotic disease. ED has been associated with most cardiovascular risk factors. There is increasing interest in assessing endothelial function non-invasively, leading to the development and evaluation of new diagnostic methods. We suggest that MRI is a safe and reliable test that offers important advantages over ultrasound for the detection of ED and monitoring of the expected therapeutic effect. We believe that ED plays a pivotal role in chronic heart failure development and progression, regardless of its etiology, and that MRI should be introduced as a "gold standard" in the diagnostic procedure and treatment. PMID:27063091

  7. Assessing Safety in Distillation Column Using Dynamic Simulation and Failure Mode and Effect Analysis (FMEA)

    NASA Astrophysics Data System (ADS)

    Werner, Suhendra; Fred, Witt; Compart

    Safety assessment has become an important activity in the chemical industry, given the need to comply with general legal requirements in addition to achieving a safer plant and profitability. This paper reviews some of the most frequent causes of distillation column malfunction. First, case histories are analyzed to provide guidelines for identifying potential trouble spots in a distillation column. Operational failures are then simulated dynamically as the basis for assessing the consequences. A case study of a side-stream distillation column is used to show the implementation of the concept. A framework for assessing safety in the column is proposed using Failure Mode and Effect Analysis (FMEA). Further, trouble-free operation to reduce the risk associated with column malfunction is described.

  8. Factors affecting nurses' intent to assess for depression in heart failure patients.

    PubMed

    Lea, Patricia

    2014-01-01

    The association between depression and cardiovascular disease has been well established and has been shown to decrease patients' quality of life and increase the risk of mortality, frequency and duration of hospitalization, and health care costs. The inpatient setting provides a potentially valuable opportunity to assess and treat depression among patients with acute cardiac illness, allowing for daily monitoring of treatment side effects. Although systematic depression screening appears to be feasible, efficient, and well accepted on inpatient cardiac units, the current lack of consistent inpatient assessment for depression in heart failure patients suggests the presence of barriers influencing the effective diagnosis and treatment of depression among inpatients with heart failure. The theory of planned behavior describes the cognitive mechanism by which behavioral intent is formed, giving some insight into how nurses' attitudes and beliefs affect their performance of routine depression screening. In addition, application of this cognitive theory suggests that nurses may be influenced to adopt more positive attitudes and beliefs about depression through educational intervention, leading to greater likelihood of routine assessment for depression, ultimately leading to more timely diagnosis and treatment and improved patient outcomes. PMID:25280199

  9. Sample handler for x-ray tomographic microscopy and image-guided failure assessment

    SciTech Connect

    Wyss, Peter; Thurner, Philipp; Broennimann, Rolf; Sennhauser, Urs; Stampanoni, Marco; Abela, Rafael; Mueller, Ralph

    2005-07-15

    X-ray tomographic microscopy (XTM) yields a three-dimensional data model of an investigated specimen. XTM providing micrometer resolution requires synchrotron light, high-resolution area detectors, and a precise sample handler. The sample handler has a height of only 270 mm, is usable at 1 µm resolution, and is able to carry loading machines weighing up to 20 kg. This allows exposing samples to load between scans for image-guided failure assessment. This system has been used in the XTM end station of the materials science beamline of the Swiss Light Source at the Paul Scherrer Institut.

  10. Assessing hospital readmission risk factors in heart failure patients enrolled in a telemonitoring program.

    PubMed

    Zai, Adrian H; Ronquillo, Jeremiah G; Nieves, Regina; Chueh, Henry C; Kvedar, Joseph C; Jethwani, Kamal

    2013-01-01

    The purpose of this study was to validate a previously developed heart failure readmission predictive algorithm based on psychosocial factors, develop a new model based on patient-reported symptoms from a telemonitoring program, and assess the impact of weight fluctuations and other factors on hospital readmission. Clinical, demographic, and telemonitoring data was collected from 100 patients enrolled in the Partners Connected Cardiac Care Program between July 2008 and November 2011. 38% of study participants were readmitted to the hospital within 30 days. Ten different heart-failure-related symptoms were reported 17,389 times, with the top three contributing approximately 50% of the volume. The psychosocial readmission model yielded an AUC of 0.67, along with sensitivity 0.87, specificity 0.32, positive predictive value 0.44, and negative predictive value 0.8 at a cutoff value of 0.30. In summary, hospital readmission models based on psychosocial characteristics, standardized changes in weight, or patient-reported symptoms can be developed and validated in heart failure patients participating in an institutional telemonitoring program. However, more robust models will need to be developed that use a comprehensive set of factors in order to have a significant impact on population health. PMID:23710170
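The operating point reported above can be cross-checked by reconstructing the confusion matrix it implies, using only the quoted numbers (n = 100, 38% readmitted within 30 days, sensitivity 0.87, specificity 0.32):

```python
# Reconstruct the approximate confusion matrix implied by the reported
# operating point of the psychosocial readmission model.

n_pos, n_neg = 38, 62            # readmitted vs. not readmitted (n = 100)

tp = round(0.87 * n_pos)         # true positives
tn = round(0.32 * n_neg)         # true negatives
fn = n_pos - tp                  # false negatives
fp = n_neg - tn                  # false positives

ppv = tp / (tp + fp)             # positive predictive value
npv = tn / (tn + fn)             # negative predictive value

print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # PPV = 0.44, NPV = 0.80
```

The recovered PPV (0.44) and NPV (0.80) match the reported values, confirming the four statistics are internally consistent.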

  11. Risk assessment for Industrial Control Systems quantifying availability using mean failure cost (MFC)

    DOE PAGESBeta

    Chen, Qian; Abercrombie, Robert K; Sheldon, Frederick T.

    2015-09-23

    Industrial Control Systems (ICS) are commonly used in industries such as oil and natural gas, transportation, electric, water and wastewater, chemical, pharmaceutical, pulp and paper, food and beverage, as well as discrete manufacturing (e.g., automotive, aerospace, and durable goods). SCADA systems are generally used to control dispersed assets using centralized data acquisition and supervisory control. Originally, ICS implementations were susceptible primarily to local threats because most of their components were located in physically secure areas (i.e., ICS components were not connected to IT networks or systems). The trend toward integrating ICS systems with IT networks (e.g., for efficiency and the Internet of Things) provides significantly less isolation for ICS from the outside world, thus creating greater risk due to external threats. Indeed, the availability of ICS/SCADA systems is critical to assuring safety, security and profitability. Such systems form the backbone of our national cyber-physical infrastructure. We extend the concept of mean failure cost (MFC) to quantify availability in a way that harmonizes well with ICS security risk assessment. This new measure is based on the classic formulation of availability combined with mean failure cost. The metric offers a computational basis for estimating the availability of a system in terms of the loss that each stakeholder stands to sustain as a result of security violations or breakdowns (e.g., deliberate malicious failures).
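In the classic MFC formulation this abstract builds on, each stakeholder's mean failure cost is a chain of matrix products, MFC = ST · DP · IM · PT: stakes per violated requirement (ST), requirement failure given component failure (DP), component failure given a materialized threat (IM), and threat emergence probabilities (PT). A minimal sketch of that chain; all matrix values below are illustrative, not from the paper:

```python
# Mean-failure-cost sketch: MFC = ST * DP * IM * PT (all numbers made up).

def matvec(m, v):
    """Multiply matrix m by vector v."""
    return [sum(a * b for a, b in zip(row, v)) for row in m]

# PT: probability that each threat materializes during unit operating time
PT = [0.01, 0.05]

# IM[i][j]: probability component i fails given threat j materializes
IM = [[0.5, 0.1],
      [0.2, 0.4]]

# DP[r][i]: probability requirement r is violated given component i fails
DP = [[0.9, 0.3],
      [0.1, 0.8]]

# ST[s][r]: cost ($/h) stakeholder s sustains if requirement r is violated
ST = [[1000.0, 200.0],
      [ 300.0, 900.0]]

mfc = matvec(ST, matvec(DP, matvec(IM, PT)))
for s, cost in enumerate(mfc):
    print(f"stakeholder {s}: MFC = ${cost:.2f}/h")
```

Extending the metric to availability, as the paper proposes, amounts to letting the threat and impact terms capture outage-inducing failures so each stakeholder's expected downtime loss appears in the same per-stakeholder vector.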

  12. Risk assessment for Industrial Control Systems quantifying availability using mean failure cost (MFC)

    SciTech Connect

    Chen, Qian; Abercrombie, Robert K; Sheldon, Frederick T.

    2015-09-23

    Industrial Control Systems (ICS) are commonly used in industries such as oil and natural gas, transportation, electric, water and wastewater, chemical, pharmaceutical, pulp and paper, food and beverage, as well as discrete manufacturing (e.g., automotive, aerospace, and durable goods). SCADA systems are generally used to control dispersed assets using centralized data acquisition and supervisory control. Originally, ICS implementations were susceptible primarily to local threats because most of their components were located in physically secure areas (i.e., ICS components were not connected to IT networks or systems). The trend toward integrating ICS systems with IT networks (e.g., for efficiency and the Internet of Things) provides significantly less isolation for ICS from the outside world, thus creating greater risk due to external threats. Indeed, the availability of ICS/SCADA systems is critical to assuring safety, security and profitability. Such systems form the backbone of our national cyber-physical infrastructure. We extend the concept of mean failure cost (MFC) to quantify availability in a way that harmonizes well with ICS security risk assessment. This new measure is based on the classic formulation of availability combined with mean failure cost. The metric offers a computational basis for estimating the availability of a system in terms of the loss that each stakeholder stands to sustain as a result of security violations or breakdowns (e.g., deliberate malicious failures).

  13. Assessment of Three Finite Element Approaches for Modeling the Ballistic Impact Failure of Metal Plates

    NASA Astrophysics Data System (ADS)

    Mansur, Ali; Nganbe, Michel

    2015-03-01

    The ballistic impact was numerically modeled for AISI 450 steel struck by a 17.3 g ogive nose WC-Co projectile using Abaqus/Explicit. The model was validated using experimental results and data for different projectiles and metal targets. The Abaqus ductile-shear, local principal strain to fracture, and absorbed strain energy at failure criteria were investigated. Due to the highly dynamic nature of ballistic impacts, the absorbed strain energy approach posed serious challenges in estimating the effective deformation volume and yielded the largest critical plate thicknesses for through-thickness penetration (failure). In contrast, the principal strain criterion yielded the lowest critical thicknesses and provided the best agreement with experimental ballistic test data with errors between 0 and 30%. This better accuracy was due to early failure definition when the very first mesh at the target back side reached the strain to fracture, which compensated for the overall model overestimation. The ductile-shear criterion yielded intermediate results between those of the two comparative approaches. In contrast to the ductile-shear criterion, the principal strain criterion requires only basic data readily available for practically all materials. Therefore, it is a viable alternative for an initial assessment of the ballistic performance and pre-screening of a large number of new candidate materials as well as for supporting the development of novel armor systems.

  14. Gravity wave transmission diagram

    NASA Astrophysics Data System (ADS)

    Tomikawa, Yoshihiro

    2016-07-01

    The possibility of gravity wave propagation from a source region to the airglow layer around the mesopause has been discussed based on the gravity wave blocking diagram, which takes into account critical-level filtering alone. This paper proposes a new gravity wave transmission diagram in which both the critical-level filtering and turning-level reflection of gravity waves are considered. It shows a significantly different distribution of gravity wave transmissivity from the blocking diagram.
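The two mechanisms the transmission diagram combines can be stated compactly: for a wave of horizontal phase speed c and horizontal wavenumber k in a background wind U(z) with buoyancy frequency N, the vertical wavenumber satisfies m² ≈ N²/(c − U)² − k²; the wave is absorbed at a critical level where c = U, and reflected at a turning level where m² < 0. A minimal sketch of that transmissivity check along one propagation direction, with an illustrative wind profile (not from the paper):

```python
import math

# Transmission check for a gravity wave: filtered at a critical level
# (c == U) and reflected at a turning level (m^2 < 0). Illustrative values.

N = 0.02                    # buoyancy frequency [1/s]
k = 2 * math.pi / 100e3     # horizontal wavenumber [1/m] (100 km wavelength)

def transmitted(c, winds):
    """True if a wave of phase speed c reaches the top of the wind profile."""
    for U in winds:
        if abs(c - U) < 1e-6:             # critical level: wave absorbed
            return False
        m2 = N**2 / (c - U)**2 - k**2     # vertical wavenumber squared
        if m2 < 0:                        # turning level: wave reflected
            return False
    return True

winds = [5.0 * i for i in range(10)]      # U(z) from 0 to 45 m/s
print(transmitted(30.0, winds))           # hits critical level at U = 30
print(transmitted(60.0, winds))           # faster than all winds: transmitted
```

Sweeping c and the propagation azimuth over such a profile, and shading where `transmitted` is true, reproduces the kind of transmissivity map the diagram displays.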

  15. Causes of death in fulminant hepatic failure and relationship to quantitative histological assessment of parenchymal damage.

    PubMed

    Gazzard, B G; Portmann, B; Murray-Lyon, I M; Williams, R

    1975-10-01

    The clinical course and causes of death in 132 consecutive patients with fulminant hepatic failure and grade III or IV encephalopathy have been reviewed. 105 patients died, and in 96 of these an autopsy examination was performed. In 36 patients there was cerebral oedema, and the mean age of this group was significantly younger than that of the other fatal cases. In 28 patients death was attributed to major haemorrhage, which originated in the gastrointestinal tract in 25. The prothrombin time ratio was not significantly greater in patients with major bleeding than in those without, but they did have a significantly lower platelet count. Sepsis contributed to death in 12 patients. In 25 patients massive hepatic necrosis only was found at autopsy and death was considered to be due solely to hepatic failure. The degree of hepatocyte loss was assessed in 80 fatal cases by a histological morphometric technique on a needle specimen of liver taken immediately post-mortem. The proportion of the liver volume occupied by hepatocytes (hepatocyte volume fraction, HVF) was greatly reduced in all patients (normal 85 ± 5 (SD) percent), but the mean value was significantly higher in the patients dying with sepsis, cerebral oedema or haemorrhage than in the group in whom death was attributed solely to hepatic failure. There were ten patients in whom liver function was improving at the time of death, which was due to cerebral oedema (9) or haemorrhage (1). These observations suggest that many patients presently dying from fulminant hepatic failure may be expected to survive once more effective therapy is available for the complications of the illness. PMID:172938

  16. The assessment of low probability containment failure modes using dynamic PRA

    NASA Astrophysics Data System (ADS)

    Brunett, Acacia Joann

    Although low probability containment failure modes in nuclear power plants may lead to large releases of radioactive material, these modes are typically crudely modeled in system level codes and have large associated uncertainties. Conventional risk assessment techniques (i.e. the fault-tree/event-tree methodology) are capable of accounting for these failure modes to some degree, however, they require the analyst to pre-specify the ordering of events, which can vary within the range of uncertainty of the phenomena. More recently, dynamic probabilistic risk assessment (DPRA) techniques have been developed which remove the dependency on the analyst. Through DPRA, it is now possible to perform a mechanistic and consistent analysis of low probability phenomena, with the timing of the possible events determined by the computational model simulating the reactor behavior. The purpose of this work is to utilize DPRA tools to assess low probability containment failure modes and the driving mechanisms. Particular focus is given to the risk-dominant containment failure modes considered in NUREG-1150, which has long been the standard for PRA techniques. More specifically, this work focuses on the low probability phenomena occurring during a station blackout (SBO) with late power recovery in the Zion Nuclear Power Plant, a Westinghouse pressurized water reactor (PWR). Subsequent to the major risk study performed in NUREG-1150, significant experimentation and modeling regarding the mechanisms driving containment failure modes have been performed. In light of this improved understanding, NUREG-1150 containment failure modes are reviewed in this work using the current state of knowledge. For some unresolved mechanisms, such as containment loading from high pressure melt ejection and combustion events, additional analyses are performed using the accident simulation tool MELCOR to explore the bounding containment loads for realistic scenarios. A dynamic treatment in the

  17. Hertzsprung-Russell Diagram

    NASA Astrophysics Data System (ADS)

    Chiosi, C.; Murdin, P.

    2000-11-01

    The Hertzsprung-Russell diagram (HR diagram), pioneered independently by EJNAR HERTZSPRUNG and HENRY NORRIS RUSSELL, is a plot of stellar luminosity versus surface temperature. It stems from the basic relation for an object emitting thermal radiation as a black body: ...
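The black-body relation behind the HR diagram is L = 4πR²σT⁴, linking luminosity to radius and effective temperature through the Stefan-Boltzmann law. Plugging in solar values recovers the solar luminosity:

```python
import math

# Stefan-Boltzmann relation underlying the HR diagram: L = 4*pi*R^2*sigma*T^4.
sigma = 5.670374e-8          # Stefan-Boltzmann constant [W m^-2 K^-4]

def luminosity(radius_m, teff_k):
    """Black-body luminosity [W] for a sphere of given radius and T_eff."""
    return 4 * math.pi * radius_m**2 * sigma * teff_k**4

# Solar radius and effective temperature recover the solar luminosity.
L_sun = luminosity(6.957e8, 5772.0)
print(f"L_sun ~ {L_sun:.3e} W")    # ~3.8e26 W
```

On the diagram itself, this relation makes lines of constant radius appear as diagonals in the (T, L) plane, since L scales as R² at fixed temperature.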

  18. Strategic environmental assessment can help solve environmental impact assessment failures in developing countries

    SciTech Connect

    Alshuwaikhat, Habib M. E-mail: habibms@kfupm.edu.sa

    2005-05-15

    The current trend of industrialization and urbanization in developing nations has a huge impact on anthropogenic and natural ecosystems. Pollution sources increase with the expansion of cities and cause contamination of water, air and soil. The absence of urban environmental planning and management strategies has resulted in greater concern for future urban development. This paper advocates the adoption of strategic environmental assessment (SEA) as a means to achieve sustainable development in developing countries. It investigates project-level environmental impact assessment (EIA) and its limitations. The exploration of SEA and its features are addressed. The effective implementation of SEA can create a roadmap for sustainable development. In many developing countries, the lack of transparency and accountability and ineffective public participation in the development of the policy, plan and program (PPP) would be mitigated by the SEA process. Moreover, the proactive and broadly based characteristics of SEA would benefit the institutional development of the PPP process, which is rarely experienced in many developing countries. The paper also explores the prospects for SEA and its guiding principles in developing countries. Finally, the paper calls for a coordinated effort between all government, nongovernment and international organizations involved with PPPs to enable developing countries to pursue a path of sustainable development through the development and application of strategic environmental assessment.

  19. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-01

    The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
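Probability-of-detection (POD) curves of the kind used in model-assisted POD studies are commonly parameterized with a lognormal link between flaw size and detection probability, POD(a) = Φ((ln a − μ)/σ). A minimal sketch of such a curve; the parameters below are illustrative, not those fitted in the paper:

```python
import math

# Lognormal-link POD curve: POD(a) = Phi((ln a - mu) / sigma), where Phi
# is the standard normal CDF. Parameters below are illustrative only.

mu, sigma = math.log(0.5), 0.6    # a50 = 0.5 mm (50% detection), scatter 0.6

def pod(a_mm):
    """Probability of detecting a flaw of size a_mm [mm]."""
    z = (math.log(a_mm) - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(f"POD(0.5 mm) = {pod(0.5):.2f}")    # 0.50 at a50 by construction
print(f"POD(1.0 mm) = {pod(1.0):.2f}")
```

Combining such a POD curve with the sizing uncertainty, as the abstract describes, yields the posterior PDF of the actual flaw size that feeds the fatigue reliability estimate.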

  20. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    SciTech Connect

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-18

    The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.

  1. Noninvasive assessment of right and left ventricular function in acute and chronic respiratory failure

    SciTech Connect

    Matthay, R.A.; Berger, H.J.

    1983-05-01

    This review evaluates noninvasive techniques for assessing cardiovascular performance in acute and chronic respiratory failure. Radiographic, radionuclide, and echocardiographic methods for determining ventricular volumes, right (RV) and left ventricular (LV) ejection fractions, and pulmonary artery pressure (PAP) are emphasized. These methods include plain chest radiography, radionuclide angiocardiography, thallium-201 myocardial imaging, and M mode and 2-dimensional echocardiography, which have recently been applied in patients to detect pulmonary artery hypertension (PAH), right ventricular enlargement, and occult ventricular performance abnormalities at rest or exercise. Moreover, radionuclide angiocardiography has proven useful in combination with hemodynamic measurements, for evaluating the short-and long-term cardiovascular effects of therapeutic agents, such as oxygen, digitalis, theophylline, beta-adrenergic agents, and vasodilators.

  2. Modeling of Electrical Cable Failure in a Dynamic Assessment of Fire Risk

    NASA Astrophysics Data System (ADS)

    Bucknor, Matthew D.

    complexity to existing cable failure techniques and tuned to empirical data can better approximate the temperature response of cables located in tightly packed cable bundles. The new models also provide a way to determine the conditions inside a cable bundle, which allows for separate treatment of cables on the interior of the bundle from cables on the exterior. The results from the DET analysis show that the overall assessed probability of cable failure can be significantly reduced by more realistically accounting for the influence that the fire brigade has on a fire progression scenario. The shielding analysis results demonstrate a significant reduction in the temperature response of a shielded versus a non-shielded cable bundle; however, the computational cost of using a fire progression model that can capture these effects may be prohibitive for performing DET analyses with currently available computational fluid dynamics models and computational resources.

  3. Square Source Type Diagram

    NASA Astrophysics Data System (ADS)

    Aso, N.; Ohta, K.; Ide, S.

    2014-12-01

    Deformation in a small volume of the Earth's interior is expressed by a symmetric moment tensor located at a point source. The tensor contains information on characteristic directions, source amplitude, and source type, such as isotropic, double-couple, or compensated-linear-vector-dipole (CLVD). Although we often assume a double couple as the source type of an earthquake, a significant non-double-couple component, including an isotropic component, is often reported for induced and volcanic earthquakes. For discussions of source types spanning double-couple and non-double-couple components, it is helpful to display them using visual diagrams. Since source-type information has two degrees of freedom, it can be displayed on a two-dimensional plane. Although the diagram developed by Hudson et al. [1989] is popular, the trace corresponding to a combination of two mechanisms is not always a smooth line. To overcome this problem, Chapman and Leaney [2012] developed a new diagram. This diagram has the advantage that a straight line passing through the center corresponds to the mechanism obtained by combining an arbitrary mechanism with a double-couple [Tape and Tape, 2012], but it has some difficulties in use. First, it is slightly difficult to produce because of its curved shape. Second, it is also difficult to read off the ratios among isotropic, double-couple, and CLVD components, which we want to obtain from the estimated moment tensors, because they do not appear directly on the horizontal or vertical axes. In the present study, we developed a new square diagram that overcomes the difficulties of the previous diagrams. This diagram is an orthogonal system of isotropic and deviatoric axes, so it is easy to obtain the ratios among isotropic, double-couple, and CLVD components.
Our diagram has another advantage that the probability density is obtained simply from the area within the diagram if the probability density
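The ISO/DC/CLVD ratios that such diagrams display are computed from the moment tensor eigenvalues. A sketch of one common decomposition convention (Vavryčuk-style percentages; sign and normalization conventions differ between authors, so treat this as one choice among several):

```python
# Decompose moment-tensor eigenvalues into isotropic (ISO), double-couple
# (DC) and CLVD percentages, following one common convention.

def iso_dc_clvd(eigs):
    m_iso = sum(eigs) / 3.0
    dev = sorted((e - m_iso for e in eigs), key=abs)   # |m1| <= |m2| <= |m3|
    m1, m3 = dev[0], dev[2]
    if m3 == 0:                       # purely isotropic source
        return 100.0, 0.0, 0.0
    eps = -m1 / abs(m3)               # CLVD parameter, in [-0.5, 0.5]
    p_iso = 100.0 * m_iso / (abs(m_iso) + abs(m3))
    p_clvd = 2.0 * eps * (100.0 - abs(p_iso))
    p_dc = 100.0 - abs(p_iso) - abs(p_clvd)
    return p_iso, p_dc, p_clvd

print(iso_dc_clvd([1.0, 0.0, -1.0]))   # pure double couple
print(iso_dc_clvd([2.0, -1.0, -1.0]))  # pure CLVD
```

On an orthogonal isotropic/deviatoric diagram like the one proposed above, these percentages can be read off the axes directly, which is the readability advantage the abstract claims over the Hudson and Chapman-Leaney diagrams.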

  4. Review of nutritional screening and assessment tools and clinical outcomes in heart failure.

    PubMed

    Lin, Hong; Zhang, Haifeng; Lin, Zheng; Li, Xinli; Kong, Xiangqin; Sun, Gouzhen

    2016-09-01

    Recent studies have suggested that undernutrition as defined using multidimensional nutritional evaluation tools may affect clinical outcomes in heart failure (HF). The evidence supporting this correlation is unclear. Therefore, we conducted this systematic review to critically appraise the use of multidimensional evaluation tools in the prediction of clinical outcomes in HF. We performed descriptive analyses of all identified articles involving qualitative analyses. We used STATA to conduct meta-analyses when at least three studies that tested the same type of nutritional assessment or screening tools and used the same outcome were identified. Sensitivity analyses were conducted to validate our positive results. We identified 17 articles with qualitative analyses and 11 with quantitative analysis after comprehensive literature searching and screening. We determined that the prevalence of malnutrition is high in HF (range 16-90%), particularly in advanced and acute decompensated HF (approximately 75-90%). Undernutrition as identified by multidimensional evaluation tools may be significantly associated with hospitalization, length of stay and complications and is particularly strongly associated with high mortality. The meta-analysis revealed that compared with other tools, Mini Nutritional Assessment (MNA) scores were the strongest predictors of mortality in HF [HR 4.32, 95% CI 2.30-8.11]. Our results remained reliable after conducting sensitivity analyses. The prevalence of malnutrition is high in HF, particularly in advanced and acute decompensated HF. Moreover, undernutrition as identified by multidimensional evaluation tools is significantly associated with unfavourable prognoses and high mortality in HF. PMID:26920682

  5. Performance of the Automated Neuropsychological Assessment Metrics (ANAM) in Detecting Cognitive Impairment in Heart Failure Patients

    PubMed Central

    Xie, Susan S.; Goldstein, Carly M.; Gathright, Emily C.; Gunstad, John; Dolansky, Mary A.; Redle, Joseph; Hughes, Joel W.

    2015-01-01

    Objective: Evaluate the capacity of the Automated Neuropsychological Assessment Metrics (ANAM) to detect cognitive impairment (CI) in heart failure (HF) patients. Background: CI is a key prognostic marker in HF. Though the most widely used cognitive screen in HF, the Mini-Mental State Examination (MMSE) is insufficiently sensitive. The ANAM has demonstrated sensitivity to cognitive domains affected by HF, but has not been assessed in this population. Methods: Investigators administered the ANAM and MMSE to 57 HF patients, compared against a composite model of cognitive function. Results: ANAM efficiency (p < .05) and accuracy scores (p < .001) successfully differentiated CI and non-CI. ANAM efficiency and accuracy scores classified 97.7% and 93.0% of non-CI patients, and 14.3% and 21.4% with CI, respectively. Conclusions: The ANAM is more effective than the MMSE for detecting CI, but further research is needed to develop a more optimal cognitive screen for routine use in HF patients. PMID:26354858

  6. A Model for Assessment of Failure of LWR Fuel during an RIA

    SciTech Connect

    Liu, Wenfeng; Kazimi, Mujid S.

    2007-07-01

    This paper presents a model for Pellet-Cladding Mechanical Interaction (PCMI) failure of LWR fuel during an RIA. The model uses the J-integral as a driving parameter to characterize the failure potential during PCMI. The model is implemented in the FRAPTRAN code and is validated against CABRI and NSRR simulated RIA test data. Simulations of PWR and BWR conditions are conducted with FRAPTRAN to evaluate the fuel failure potential using this model. Model validation and simulation results are compared with the strain-based failure model of PNNL and the SED/CSED model of EPRI. Our fracture mechanics model has good capability to differentiate failure from non-failure cases. The results reveal a significant effect of power pulse width: a wider pulse width generally increases the threshold for fuel failure. However, this effect is less obvious for highly corroded cladding. (authors)

  7. Software Tools for Lifetime Assessment of Thermal Barrier Coatings Part I — Thermal Ageing Failure and Thermal Fatigue Failure

    NASA Astrophysics Data System (ADS)

    Renusch, Daniel; Rudolphi, Mario; Schütze, Michael

    Thermal barrier coatings (TBCs) increase the service lifetime of specific components in, for example, gas turbines or airplane engines and allow higher operating temperatures to increase efficiency. Lifetime prediction models are therefore of both academic and applied interest; either to test new coatings or to determine operational conditions that can ensure a certain lifetime, for example 25,000 hr for gas turbines. Driven by these demands, the equations used in lifetime prediction have become more and more sophisticated and consequently are complicated to apply. A collection of software tools for lifetime assessment was therefore developed to provide an easy-to-use graphical user interface whilst incorporating the recent improvements in modeling equations. The Windows-based software is compatible with other Windows applications, such as PowerPoint, Excel, or Origin. Laboratory lifetime data from isothermal, thermal cyclic and/or burner rig testing can be loaded into the software for analysis and the program provides confidence limits and an accuracy assessment of the analysis model. The main purpose of the software tool is to predict TBC spallation for a given bond coat temperature, temperature gradient across the coating, and thermal cycle frequency.

  8. Weyl card diagrams

    SciTech Connect

    Jones, Gregory; Wang, John E.

    2005-06-15

    To capture important physical properties of a spacetime we construct a new diagram, the card diagram, which accurately draws generalized Weyl spacetimes in arbitrary dimensions by encoding their global spacetime structure, singularities, horizons, and some aspects of causal structure including null infinity. Card diagrams draw only nontrivial directions providing a clearer picture of the geometric features of spacetimes as compared to Penrose diagrams, and can change continuously as a function of the geometric parameters. One of our main results is to describe how Weyl rods are traversable horizons and the entirety of the spacetime can be mapped out. We review Weyl techniques and as examples we systematically discuss properties of a variety of solutions including Kerr-Newman black holes, black rings, expanding bubbles, and recent spacelike-brane solutions. Families of solutions will share qualitatively similar cards. In addition we show how card diagrams not only capture information about a geometry but also its analytic continuations by providing a geometric picture of analytic continuation. Weyl techniques are generalized to higher dimensional charged solutions and applied to generate perturbations of bubble and S-brane solutions by Israel-Khan rods.

  9. Upgrading Diagnostic Diagrams

    NASA Astrophysics Data System (ADS)

    Proxauf, B.; Kimeswenger, S.; Öttl, S.

    2014-04-01

    Diagnostic diagrams of forbidden lines have been a useful tool for observers in astrophysics for many decades. They are used to obtain information on the basic physical properties of thin gaseous nebulae. Moreover, they are also the initial tool for deriving thermodynamic properties of the plasma from observations, to obtain ionization correction factors and thus proper abundances of the nebulae. Some diagnostic diagrams lie in wavelength domains that were difficult to observe, either due to missing wavelength coverage or the low resolution of older spectrographs; thus they were hardly used in the past. An upgrade of this useful tool is necessary because most of the diagrams were calculated treating the species involved as a single-atom gas, although several are affected by well-known fluorescence mechanisms as well. Additionally, the atomic data have improved up to the present time. The new diagnostic diagrams are calculated using large grids of parameter space in the photoionization code CLOUDY. For a given basic parameter, the input radiation field is varied to find the solutions in cooling-heating equilibrium. Empirical numerical functions are fitted to provide formulas usable in, e.g., data reduction pipelines. The resulting diagrams differ significantly from those used up to now and will improve the thermodynamic calculations.

  10. Trace element indiscrimination diagrams

    NASA Astrophysics Data System (ADS)

    Li, Chusi; Arndt, Nicholas T.; Tang, Qingyan; Ripley, Edward M.

    2015-09-01

    We tested the accuracy of trace element discrimination diagrams for basalts using new datasets from two petrological databases, PetDB and GEOROC. Both binary and ternary diagrams using Zr, Ti, V, Y, Th, Hf, Nb, Ta, Sm, and Sc do a poor job of discriminating between basalts generated in various tectonic environments (continental flood basalt, mid-ocean ridge basalt, ocean island basalt, oceanic plateau basalt, back-arc basin basalt, and various types of arc basalt). The overlaps between the different types of basalt are too large for the confident application of such diagrams when used in the absence of geological and petrological constraints. None of the diagrams we tested can clearly discriminate between back-arc basin basalt and mid-ocean ridge basalt, between continental flood basalt and oceanic plateau basalt, and between different types of arc basalt (intra-oceanic, island and continental arcs). Only ocean island basalt and some mid-ocean ridge basalt are generally distinguishable in the diagrams, and even in this case, mantle-normalized trace element patterns offer a better solution for discriminating between the two types of basalt.

  11. Weyl card diagrams

    NASA Astrophysics Data System (ADS)

    Jones, Gregory; Wang, John E.

    2005-06-01

    To capture important physical properties of a spacetime we construct a new diagram, the card diagram, which accurately draws generalized Weyl spacetimes in arbitrary dimensions by encoding their global spacetime structure, singularities, horizons, and some aspects of causal structure including null infinity. Card diagrams draw only nontrivial directions providing a clearer picture of the geometric features of spacetimes as compared to Penrose diagrams, and can change continuously as a function of the geometric parameters. One of our main results is to describe how Weyl rods are traversable horizons and the entirety of the spacetime can be mapped out. We review Weyl techniques and as examples we systematically discuss properties of a variety of solutions including Kerr-Newman black holes, black rings, expanding bubbles, and recent spacelike-brane solutions. Families of solutions will share qualitatively similar cards. In addition we show how card diagrams not only capture information about a geometry but also its analytic continuations by providing a geometric picture of analytic continuation. Weyl techniques are generalized to higher dimensional charged solutions and applied to generate perturbations of bubble and S-brane solutions by Israel-Khan rods.

  12. Methods for the development and assessment of atrial fibrillation and heart failure dog models

    PubMed Central

    Urban, Jon F; Gerhart, Renee L; Krzeszak, Jason R; Leet, Corey R; Lentz, Linnea R; McClay, Carolyn B

    2011-01-01

    Objective: To report Medtronic experiences with the development of animal models for atrial fibrillation (AF) and chronic heart failure (CHF) using high-rate pacing for AF and microemboli for CHF. Methods: For the AF model, an atrial lead was attached to a Medtronic Synergy™ neurostimulator, which was programmed to stimulate at 50 Hz in an on-off duty cycle. Atrial natriuretic peptide (ANP), brain natriuretic peptide (BNP) and N-terminal pro brain natriuretic peptide (NT-proBNP) were assayed at select time points. For the CHF model, serial injections of 90 µm polystyrene microspheres at 62,400 beads/mL (Polybead, Polysciences, Inc.) were performed to induce global ischemia, either with a weekly monitoring and embolization schedule (group 1, n = 25) or with a biweekly monitoring and embolization schedule (group 2, n = 36). Echocardiograms were used along with ventriculograms and magnetic resonance imaging scans weekly to assess cardiac function, and ANP, BNP and NT-proBNP were monitored. Results: For the AF model, the days to sustained AF for the four animals following surgery were 7, 25, 21 and 19; for the CHF model, the days to meet CHF endpoints were 116 in group 1 and 89 in group 2. For both AF and CHF models, NT-proBNP correlated well with the development of the disease states. Conclusion: Our experience with the development and assessment of AF and CHF dog models may help researchers searching for an animal model for assessing the safety and efficacy of a device-based therapy. PMID:22783299

  13. Moraine-dammed lake failures in Patagonia and assessment of outburst susceptibility in the Baker Basin

    NASA Astrophysics Data System (ADS)

    Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.

    2014-12-01

    Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥ 106 m3) Glacial Lake Outburst Floods (GLOFs) damaging inhabited areas. GLOF hazard studies in Patagonia have been mainly based on the analysis of short-term series (≤ 50 years) of flood data, and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytical Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least seven moraine-dammed lakes have failed in historic time. We identified 386 moraine-dammed lakes in the Baker Basin, of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (> 8°) to steep (> 15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream from inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.
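
    The Analytical Hierarchy Process mentioned above derives factor weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch of that step (the factors and judgment values below are hypothetical illustrations, not the matrix used in the study):

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
# outburst-susceptibility factors: glacier contact, dam outlet slope,
# lake area. A[i, j] = importance of factor i relative to factor j,
# with A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0, 2.0, 4.0],
    [1.0 / 2.0, 1.0, 3.0],
    [1.0 / 4.0, 1.0 / 3.0, 1.0],
])

# AHP weights: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency ratio CR = (lambda_max - n) / ((n - 1) * RI); RI = 0.58 for n = 3.
n = A.shape[0]
CR = (eigvals.real[k] - n) / ((n - 1) * 0.58)
```

    A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgments are acceptably coherent; the weights then score each lake's susceptibility factors.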

  14. Probabilistic exposure assessment model to estimate aseptic-UHT product failure rate.

    PubMed

    Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2015-01-01

    Aseptic-Ultra-High-Temperature (UHT) products are manufactured to be free of microorganisms capable of growing in the food at normal non-refrigerated conditions at which the food is likely to be held during manufacture, distribution and storage. Two important phases within the process are widely recognised as critical in controlling microbial contamination: the sterilisation steps and the following aseptic steps. Of the microbial hazards, the pathogen spore formers Clostridium botulinum and Bacillus cereus are deemed the most pertinent to be controlled. In addition, due to a relatively high thermal resistance, Geobacillus stearothermophilus spores are considered a concern for spoilage of low acid aseptic-UHT products. A probabilistic exposure assessment model has been developed in order to assess the aseptic-UHT product failure rate associated with these three bacteria. It was a Modular Process Risk Model, based on nine modules. They described: i) the microbial contamination introduced by the raw materials, either from the product (i.e. milk, cocoa and dextrose powders and water) or the packaging (i.e. bottle and sealing component), ii) the sterilisation processes, of either the product or the packaging material, iii) the possible recontamination during subsequent processing of both product and packaging. The Sterility Failure Rate (SFR) was defined as the sum of bottles contaminated for each batch, divided by the total number of bottles produced per process line run (10^6 batches simulated per process line). The SFR associated with the three bacteria was estimated at the last step of the process (i.e. after Module 9) but also after each module, allowing for the identification of modules, and responsible contamination pathways, with higher or lower intermediate SFR. The model contained 42 controlled settings associated with factory environment, process line or product formulation, and more than 55 probabilistic inputs corresponding to inputs with variability
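
    The SFR definition above (contaminated bottles divided by total bottles per process line run) is simple to state; a toy Monte Carlo sketch, with a single hypothetical per-bottle contamination probability standing in for the nine-module model:

```python
import random

def sterility_failure_rate(n_batches, bottles_per_batch, p_contam, seed=42):
    """Toy estimate of SFR: contaminated bottles / total bottles produced.

    p_contam is a hypothetical per-bottle contamination probability; the
    published model derives contamination from nine process modules instead.
    """
    rng = random.Random(seed)
    contaminated = sum(
        sum(1 for _ in range(bottles_per_batch) if rng.random() < p_contam)
        for _ in range(n_batches)
    )
    return contaminated / (n_batches * bottles_per_batch)

sfr = sterility_failure_rate(n_batches=1000, bottles_per_batch=100, p_contam=1e-3)
```

    With these illustrative inputs the estimate converges on the assumed per-bottle rate; the value of the modular structure in the paper is that intermediate SFRs can be read off after each module.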

  15. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ...: On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment Draft NUREG, ``Common- Cause Failure Analysis in Event and Condition Assessment: Guidance and...-2011-0254. Discussion On November 2, 2011 (76 FR 67764), the NRC published for public comment...

  16. Assessing the Value-Added by the Environmental Testing Process with the Aide of Physics/Engineering of Failure Evaluations

    NASA Technical Reports Server (NTRS)

    Cornford, S.; Gibbel, M.

    1997-01-01

    NASA's Code QT Test Effectiveness Program is funding a series of applied research activities focused on utilizing the principles of physics and engineering of failure and those of engineering economics to assess and improve the value-added by the various validation and verification activities to organizations.

  17. Materials Degradation & Failure: Assessment of Structure and Properties. Resources in Technology.

    ERIC Educational Resources Information Center

    Technology Teacher, 1991

    1991-01-01

    This module provides information on materials destruction (through corrosion, oxidation, and degradation) and failure. A design brief includes objective, student challenge, resources, student outcomes, and quiz. (SK)

  18. Direct and indirect assessment of skeletal muscle blood flow in chronic congestive heart failure

    SciTech Connect

    LeJemtel, T.H.; Scortichini, D.; Katz, S.

    1988-09-09

    In patients with chronic congestive heart failure (CHF), skeletal muscle blood flow can be measured directly by the continuous thermodilution technique and by the xenon-133 clearance method. The continuous thermodilution technique requires retrograde catheterization of the femoral vein and, thus, cannot be repeated conveniently in patients during evaluation of pharmacologic interventions. The xenon-133 clearance method, which requires only an intramuscular injection, allows repeated determination of skeletal muscle blood flow. In patients with severe CHF, a fixed capacity of the skeletal muscle vasculature to dilate appears to limit maximal exercise performance. Moreover, the changes in peak skeletal muscle blood flow noted during long-term administration of captopril, an angiotensin-converting enzyme inhibitor, appear to correlate with the changes in aerobic capacity. In patients with CHF, resting supine deep femoral vein oxygen content can be used as an indirect measurement of resting skeletal muscle blood flow. The absence of a steady state complicates the determination of peak skeletal muscle blood flow reached during graded bicycle or treadmill exercise in patients with chronic CHF. Indirect assessments of skeletal muscle blood flow and metabolism during exercise performed at submaximal work loads are currently being developed in patients with chronic CHF.

  19. Assessment of adult patients with chronic liver failure for liver transplantation in 2015: who and when?

    PubMed

    McCaughan, G W; Crawford, M; Sandroussi, C; Koorey, D J; Bowen, D G; Shackel, N A; Strasser, S I

    2016-04-01

    In 2015, there are a few absolute contraindications to liver transplantation. In adult patients, survival post-liver transplant is excellent, with a 1-year survival rate >90%, 5-year survival rates >80% and a predicted median allograft survival beyond 20 years. Patients with a Child-Turcotte-Pugh score ≥9 or a model for end-stage liver disease (MELD) score >15 should be referred for liver transplantation, with patients who have a MELD score >17 showing a 1-year survival benefit with liver transplantation. Careful selection of hepatocellular cancer patients results in excellent outcomes, while consideration of extra-hepatic disease (reversible vs irreversible) and social support structures is crucial to patient assessment. Alcoholic liver disease remains a challenge, and the potential to cure hepatitis C virus infection, together with the emerging issue of non-alcoholic fatty liver disease-associated chronic liver failure, will change the landscape of the 'who' in the years ahead. The 'when' will continue to be determined largely by the severity of liver disease based on the MELD score for the foreseeable future. PMID:27062203
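
    The MELD thresholds quoted above come from the standard (pre-2016) MELD formula; a sketch, assuming the classic coefficient set and the usual flooring and capping conventions:

```python
import math

def meld_score(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Classic MELD score. Inputs below 1.0 are floored at 1.0 and
    creatinine is capped at 4.0 mg/dL, per the standard convention;
    the rounded integer score is returned."""
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = (3.78 * math.log(bili) + 11.2 * math.log(inr)
             + 9.57 * math.log(crea) + 6.43)
    return round(score)

def refer_for_transplant(meld):
    # Referral threshold from the abstract: MELD score > 15.
    return meld > 15
```

    For example, a patient with bilirubin 3 mg/dL, INR 2 and creatinine 2 mg/dL scores 25 and clearly crosses the referral threshold.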

  20. Optimal Allocation of Gold Standard Testing under Constrained Availability: Application to Assessment of HIV Treatment Failure

    PubMed Central

    Liu, Tao; Hogan, Joseph W.; Wang, Lisa; Zhang, Shangxuan; Kantor, Rami

    2013-01-01

    The World Health Organization (WHO) guidelines for monitoring the effectiveness of HIV treatment in resource-limited settings (RLS) are mostly based on clinical and immunological markers (e.g., CD4 cell counts). Recent research indicates that the guidelines are inadequate and can result in high error rates. Viral load (VL) is considered the “gold standard”, yet its widespread use is limited by cost and infrastructure. In this paper, we propose a diagnostic algorithm that uses information from routinely-collected clinical and immunological markers to guide a selective use of VL testing for diagnosing HIV treatment failure, under the assumption that VL testing is available only at a certain portion of patient visits. Our algorithm identifies the patient sub-population such that the use of limited VL testing on them minimizes a pre-defined risk (e.g., misdiagnosis error rate). Diagnostic properties of our proposed algorithm are assessed by simulations. For illustration, data from the Miriam Hospital Immunology Clinic (RI, USA) are analyzed. PMID:24672142

  1. A non-stationary earthquake probability assessment with the Mohr-Coulomb failure criterion

    NASA Astrophysics Data System (ADS)

    Wang, J. P.; Xu, Y.

    2015-10-01

    From theory to experience, the earthquake probability associated with an active fault should gradually increase with time since the last event. In this paper, a new non-stationary earthquake assessment derived from the Mohr-Coulomb failure criterion is introduced. Unlike other non-stationary earthquake analyses, the new model can more clearly define and calculate the stress states between two characteristic earthquakes. In addition to the model development and the algorithms, this paper also presents an example calculation to help explain and validate the new model. Using best-estimate model parameters, the example calculation shows a 7.6% probability for the Meishan fault in central Taiwan to induce a major earthquake in the years 2015-2025; if the earthquake does not occur by 2025, the probability for 2025-2035 rises to 8%, demonstrating that the model yields a non-stationary earthquake probability that increases with time, as it should.

  2. Assessing Strategies for Heart Failure with Preserved Ejection Fraction at the Outpatient Clinic

    PubMed Central

    Jorge, Antonio José Lagoeiro; Rosa, Maria Luiza Garcia; Ribeiro, Mario Luiz; Fernandes, Luiz Claudio Maluhy; Freire, Monica Di Calafiori; Correia, Dayse Silva; Teixeira, Patrick Duarte; Mesquita, Evandro Tinoco

    2014-01-01

    Background: Heart failure with preserved ejection fraction (HFPEF) is the most common form of heart failure (HF), and its diagnosis is a challenge in outpatient clinic practice. Objective: To describe and compare two strategies derived from the algorithms of the European Society of Cardiology Diastology Guidelines for the diagnosis of HFPEF. Methods: Cross-sectional study with 166 consecutive ambulatory patients (67.9±11.7 years; 72% women). The strategies to confirm HFPEF were established according to the European Society of Cardiology Diastology Guidelines criteria. In strategy 1 (S1), tissue Doppler echocardiography (TDE) and electrocardiography (ECG) were used; in strategy 2 (S2), B-type natriuretic peptide (BNP) measurement was included. Results: In S1, patients were divided into groups based on the E/E' ratio as follows: GI, E/E' > 15 (n = 16; 9%); GII, E/E' 8 to 15 (n = 79; 48%); and GIII, E/E' < 8 (n = 71; 43%). HFPEF was confirmed in GI and excluded in GIII. In GII, TDE [left atrial volume index (LAVI) ≥ 40 mL/m2; left ventricular mass index (LVMI) > 122 g/m2 for women and > 149 g/m2 for men] and ECG (atrial fibrillation) parameters were assessed, confirming HFPEF in 33 more patients, adding up to 49 (29%). In S2, patients were divided into three groups based on BNP levels. GI (BNP > 200 pg/mL) consisted of 12 patients, HFPEF being confirmed in all of them. GII (BNP ranging from 100 to 200 pg/mL) consisted of 20 patients with LAVI > 29 mL/m2, or LVMI ≥ 96 g/m2 for women or ≥ 116 g/m2 for men, or E/E' ≥ 8 or atrial fibrillation on ECG, and the diagnosis of HFPEF was confirmed in 15. GIII (BNP < 100 pg/mL) consisted of 134 patients, 26 of whom had the diagnosis of HFPEF confirmed when GII parameters were used. Measuring BNP levels in S2 identified 4 more patients (8%) with HFPEF as compared with those identified in S1. Conclusion: The association of BNP measurement and TDE data is better than the isolated use of those parameters. BNP can be useful in
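
    Strategy 1 described above is essentially a three-branch decision rule; a sketch with hypothetical argument names and the thresholds quoted in the abstract:

```python
def hfpef_strategy1(e_over_e_prime, lavi_ml_m2, lvmi_g_m2, sex, afib):
    """Strategy S1 from the abstract: confirm HFPEF if E/E' > 15,
    exclude if E/E' < 8; in the 8-15 gray zone, confirm on a dilated
    left atrium (LAVI >= 40 mL/m2), LV hypertrophy (LVMI > 122 g/m2
    for women, > 149 g/m2 for men) or atrial fibrillation on ECG."""
    if e_over_e_prime > 15:
        return True
    if e_over_e_prime < 8:
        return False
    lvmi_cut = 122 if sex == "F" else 149
    return lavi_ml_m2 >= 40 or lvmi_g_m2 > lvmi_cut or afib
```

    Strategy S2 layers BNP cutoffs (> 200, 100-200, < 100 pg/mL) on top of similar structural criteria, which is what picks up the additional patients reported.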

  3. Impulse-Momentum Diagrams

    ERIC Educational Resources Information Center

    Rosengrant, David

    2011-01-01

    Multiple representations are a valuable tool to help students learn and understand physics concepts. Furthermore, representations help students learn how to think and act like real scientists. These representations include: pictures, free-body diagrams, energy bar charts, electrical circuits, and, more recently, computer simulations and…

  4. 30 CFR 1218.41 - Assessments for failure to submit payment of same amount as Form ONRR-2014 or bill document or to...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 3 2014-07-01 2014-07-01 false Assessments for failure to submit payment of same amount as Form ONRR-2014 or bill document or to provide adequate information. 1218.41 Section 1218... General Provisions § 1218.41 Assessments for failure to submit payment of same amount as Form ONRR-2014...

  5. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Astrophysics Data System (ADS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-06-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
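
    The core PFA idea of propagating parameter uncertainty through an engineering failure model can be illustrated with a toy stress-strength Monte Carlo (the distributions and values below are illustrative assumptions, not the documented methodology):

```python
import random

def failure_probability(n=100_000, seed=1):
    """Toy stress-strength interference model: a trial fails when the
    applied stress exceeds the strength. Parameter uncertainty is
    represented by sampling both quantities from assumed normal
    distributions (illustrative values only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        strength = rng.gauss(100.0, 10.0)  # e.g. MPa
        stress = rng.gauss(60.0, 15.0)
        if stress > strength:
            failures += 1
    return failures / n

p_fail = failure_probability()
```

    In the actual PFA framework the failure model would be a conventional engineering analysis (e.g. a fatigue life model), and the resulting failure probability distribution is further updated with test and flight experience.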

  6. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  7. Fuzzy-logic assessment of failure hazard in pipelines due to mining activity

    NASA Astrophysics Data System (ADS)

    Malinowska, A. A.; Hejmanowski, R.

    2015-11-01

    The present research is aimed at a critical analysis of a method presently used for evaluating failure hazard in linear objects in mining areas. A fuzzy model of the failure hazard of a linear object was created on the basis of the experience gathered so far. The rules of a Mamdani fuzzy model were used in the analyses. Finally, the scaled model was integrated with a Geographic Information System (GIS), which was used to evaluate failure hazard in a water pipeline in a mining area.
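
    Mamdani inference of the kind used here fires each rule at the minimum of its antecedent memberships, aggregates rule outputs by maximum, and defuzzifies (e.g. by centroid). A minimal pure-Python sketch with hypothetical membership functions and rules, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def hazard_mamdani(ground_strain, pipe_age):
    """Two toy Mamdani rules (illustrative only):
    R1: IF strain is high AND age is old  THEN hazard is high
    R2: IF strain is low                  THEN hazard is low
    Implication = min, aggregation = max, centroid defuzzification."""
    w_high = min(tri(ground_strain, 2, 5, 8), tri(pipe_age, 20, 50, 80))
    w_low = tri(ground_strain, -3, 0, 3)
    num = den = 0.0
    for i in range(101):                      # hazard universe [0, 1]
        z = i / 100.0
        mu = max(min(w_high, tri(z, 0.5, 1.0, 1.5)),
                 min(w_low, tri(z, -0.5, 0.0, 0.5)))
        num += z * mu
        den += mu
    return num / den if den else 0.0
```

    High strain on an old pipe defuzzifies near 1, low strain near 0; in the study such a crisp hazard value is what gets mapped per pipeline segment in the GIS.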

  8. Tectonic discrimination diagrams revisited

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter

    2006-06-01

    The decision boundaries of most tectonic discrimination diagrams are drawn by eye. Discriminant analysis is a statistically more rigorous way to determine the tectonic affinity of oceanic basalts based on their bulk-rock chemistry. This method was applied to a database of 756 oceanic basalts of known tectonic affinity (ocean island, mid-ocean ridge, or island arc). For each of these training data, up to 45 major, minor, and trace elements were measured. Discriminant analysis assumes multivariate normality. If the same covariance structure is shared by all the classes (i.e., tectonic affinities), the decision boundaries are linear, hence the term linear discriminant analysis (LDA). In contrast with this, quadratic discriminant analysis (QDA) allows the classes to have different covariance structures. To solve the statistical problems associated with the constant-sum constraint of geochemical data, the training data must be transformed to log-ratio space before performing a discriminant analysis. The results can be mapped back to the compositional data space using the inverse log-ratio transformation. An exhaustive exploration of 14,190 possible ternary discrimination diagrams yields the Ti-Si-Sr system as the best linear discrimination diagram and the Na-Nb-Sr system as the best quadratic discrimination diagram. The best linear and quadratic discrimination diagrams using only immobile elements are Ti-V-Sc and Ti-V-Sm, respectively. As little as 5% of the training data are misclassified by these discrimination diagrams. Testing them on a second database of 182 samples that were not part of the training data yields a more reliable estimate of future performance. Although QDA misclassifies fewer training data than LDA, the opposite is generally true for the test data. Therefore LDA is a cruder but more robust classifier than QDA. Another advantage of LDA is that it provides a powerful way to reduce the dimensionality of the multivariate geochemical data in a similar
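
    The log-ratio-then-discriminate pipeline can be sketched for a two-class case. The additive log-ratio (alr) transform removes the constant-sum constraint, after which a Fisher linear discriminant is fitted; the compositions below are synthetic stand-ins, not the actual training database:

```python
import numpy as np

def alr(X):
    """Additive log-ratio transform: log of each part over the last part.
    Removes the constant-sum constraint of compositional data."""
    return np.log(X[:, :-1] / X[:, -1:])

def fisher_lda(X0, X1):
    """Two-class linear discriminant: w = S^-1 (mu1 - mu0), with a
    midpoint decision threshold."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    S = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(S, mu1 - mu0)
    return w, w @ (mu0 + mu1) / 2.0

# Synthetic three-element compositions (rows sum to 1) for two invented
# "tectonic affinities"; real training data would be measured basalts.
rng = np.random.default_rng(0)
class0 = rng.dirichlet([8.0, 3.0, 1.0], size=200)
class1 = rng.dirichlet([4.0, 6.0, 2.0], size=200)

w, t = fisher_lda(alr(class0), alr(class1))
accuracy1 = float((alr(class1) @ w > t).mean())   # class-1 samples above threshold
accuracy0 = float((alr(class0) @ w <= t).mean())
```

    Working in log-ratio space (rather than on raw percentages) is what makes the linear boundary statistically meaningful; the boundary can be mapped back to a ternary diagram with the inverse transform.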

  9. Predictive Values of Red Blood Cell Distribution Width in Assessing Severity of Chronic Heart Failure.

    PubMed

    Liu, Sen; Wang, Ping; Shen, Ping-Ping; Zhou, Jian-Hua

    2016-01-01

    BACKGROUND: This retrospective study was performed to evaluate the value of baseline red blood cell distribution width (RDW) for predicting the severity of chronic heart failure (CHF) compared with N-terminal prohormone brain natriuretic peptide (NT-ProBNP) and other hematological and biochemical parameters. MATERIAL AND METHODS: Hematological and biochemical parameters were obtained from 179 patients with New York Heart Association (NYHA) CHF class I (n=44), II (n=39), III (n=41), and IV (n=55). Receiver operating characteristic (ROC) curves were used for assessing predictive values. RESULTS: RDW increased significantly in class III and IV compared with class I (14.3±2.3% and 14.3±1.7% vs. 12.9±0.8%, P<0.01). Areas under the ROC curves (AUCs) of RDW and NT-ProBNP for class IV HF were 0.817 and 0.840, respectively. RDW was markedly elevated in the mortality group compared with the survival group (15.8±1.8 vs. 13.7±1.7, P<0.01). The predictive value of RDW was lower than that of NT-ProBNP but was comparable to white blood cell (WBC), neutrophil (NEU), lymphocyte (L), and neutrophil/lymphocyte ratio (N/L) for mortality during hospitalization, with AUCs of 0.837, 0.939, 0.858, 0.891, 0.885, and 0.885, respectively. RDW and NT-proBNP showed low predictive values for repeated admission (≥3). RDW was an independent risk factor for mortality (OR=2.531, 95% CI: 1.371-4.671). CONCLUSIONS: RDW increased significantly in class III and IV patients and in the mortality group. The predictive value of RDW is comparable to that of NT-proBNP for class IV and lower than that of NT-proBNP for mortality. Elevated RDW is an independent risk factor for mortality. PMID:27324271
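
    The AUC values reported above can be computed without fitting a curve: for a single marker, the AUC equals the Mann-Whitney probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A sketch with hypothetical RDW values:

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the fraction of (positive,
    negative) pairs in which the positive case scores higher; ties
    count as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical RDW values (%): non-survivors tend to run higher.
rdw_died = [15.8, 16.2, 14.9, 17.1]
rdw_survived = [12.9, 13.4, 15.1, 13.1, 12.7]
auc_rdw = auc(rdw_died, rdw_survived)   # 19 of 20 pairs -> 0.95
```

    An AUC of 0.5 corresponds to a marker with no discriminating power, which is why values like 0.84 for NT-ProBNP indicate useful (though imperfect) discrimination.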

  10. Minimal Effects of Acute Liver Injury/Acute Liver Failure on Hemostasis as Assessed by Thromboelastography

    PubMed Central

    Stravitz, R. Todd; Lisman, Ton; Luketic, Velimir A.; Sterling, Richard K.; Puri, Puneet; Fuchs, Michael; Ibrahim, Ashraf; Lee, William M.; Sanyal, Arun J.

    2016-01-01

    Background & Aims: Patients with acute liver injury/failure (ALI/ALF) are assumed to have a bleeding diathesis on the basis of elevated INR; however, clinically significant bleeding is rare. We hypothesized that patients with ALI/ALF have normal hemostasis despite elevated INR. Methods: Fifty-one patients with ALI/ALF were studied prospectively using thromboelastography (TEG), which measures the dynamics and physical properties of clot formation in whole blood. ALI was defined as an INR ≥1.5 in a patient with no previous liver disease, and ALF as ALI with hepatic encephalopathy. Results: Thirty-seven of 51 patients (73%) had ALF and 22 patients (43%) underwent liver transplantation or died. Despite a mean INR of 3.4±1.7 (range 1.5–9.6), mean TEG parameters were normal, and 5 individual TEG parameters were normal in 32 (63%). Low maximum amplitude, the measure of ultimate clot strength, was confined to patients with platelet counts <126 × 109/L. Maximum amplitude was higher in patients with ALF than ALI and correlated directly with venous ammonia concentrations and with increasing severity of liver injury assessed by elements of the systemic inflammatory response syndrome. All patients had markedly decreased procoagulant factor V and VII levels, which were proportional to decreases in anticoagulant proteins and inversely proportional to elevated factor VIII levels. Conclusions: Despite elevated INR, most patients with ALI/ALF maintain normal hemostasis by TEG, the mechanisms of which include an increase in clot strength with increasing severity of liver injury, increased factor VIII levels, and a commensurate decline in pro- and anticoagulant proteins. PMID:21703173

  11. Sepsis-related organ failure assessment and withholding or withdrawing life support from critically ill patients

    PubMed Central

    Miguel, Nolla; León, María A; Ibáñez, Jordi; Díaz, Rosa M; Merten, Alfredo; Gahete, Francesc

    1998-01-01

    Background: We studied the incidence of withholding or withdrawing therapeutic measures in intensive care unit (ICU) patients, as well as the possible implications of sepsis-related organ failure assessment (SOFA) in the decision-making process and the ethical conflicts emerging from these measures. Methods: The patients (n = 372) were placed in different groups: those surviving 1 year after ICU admission (S; n = 301), deaths at home (DH; n = 2), deaths in the hospital after ICU discharge (DIH; n = 13) and deaths in the ICU (DI; n = 56). The last group was divided into the following subgroups: two cardiovascular deaths (CVD), 20 brain deaths (BD), 25 deaths after withholding of life support (DWH) and nine deaths after withdrawal of life support (DWD). Results: APACHE III, daily therapeutic intervention scoring system (TISS) and daily SOFA scores were good mortality predictors. The length of ICU stay in DIH (20 days) and in DWH (14 days) was significantly greater than in BD (5 days) or in S (7 days). The number of days with a maximum SOFA score was greater in DWD (5 days) than in S, BD or DWH (2 days). Conclusions: Daily SOFA is a useful parameter when the decision to withhold or withdraw treatment has to be considered, especially if the established measures do not improve the clinical condition of the patient. Although making decisions based on the use of severity parameters may cause ethical problems, it may reduce the anxiety level. Additionally, it may help when considering the need for extraordinary measures or new investigative protocols for better management of resources. PMID:11056711

  12. Fluid Volume Overload and Congestion in Heart Failure: Time to Reconsider Pathophysiology and How Volume Is Assessed.

    PubMed

    Miller, Wayne L

    2016-08-01

    Volume regulation, assessment, and management remain basic issues in patients with heart failure. The discussion presented here is directed at opening a reassessment of the pathophysiology of congestion in congestive heart failure and the methods by which we determine volume overload status. The peer-reviewed historical and contemporary literature is reviewed. Volume overload and fluid congestion remain primary issues for patients with chronic heart failure. The pathophysiology is complex, and the simple concept of intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to the central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can alert clinicians to changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in patients with chronic heart failure and help guide individualized, appropriate therapy; not all volume overload is the same. PMID:27436837

  13. Random Forest for automatic assessment of heart failure severity in a telemonitoring scenario.

    PubMed

    Guidi, G; Pettenati, M C; Miniati, R; Iadanza, E

    2013-01-01

    In this study, we describe an automatic classifier of patients with heart failure designed for a telemonitoring scenario, improving the results obtained in our previous works. Our previous studies showed that the technique that best processes the typical heart failure telemonitoring parameters is the Classification Tree. We therefore decided to analyze the data with its direct evolution, the Random Forest algorithm. The results show an improvement both in accuracy and in limiting critical errors. PMID:24110416
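
    The ensemble-vote idea behind Random Forest can be illustrated with a toy forest of depth-1 trees (decision stumps), each fitted to a bootstrap resample; this is a didactic sketch with invented telemonitoring-style data, not the study's classifier, and real work would use a full library implementation:

```python
import random

def train_stump(sample):
    """Fit the best single-feature threshold split on (features, label) pairs."""
    def majority(labels):
        return max(set(labels), key=labels.count) if labels else 0
    best = None
    n_features = len(sample[0][0])
    for j in range(n_features):
        for x, _ in sample:
            t = x[j]
            left = majority([y for xi, y in sample if xi[j] <= t])
            right = majority([y for xi, y in sample if xi[j] > t])
            acc = sum((left if xi[j] <= t else right) == y
                      for xi, y in sample) / len(sample)
            if best is None or acc > best[0]:
                best = (acc, j, t, left, right)
    return best[1:]

def random_forest(data, n_trees=25, seed=0):
    """Train each stump on a bootstrap resample; predictions are majority votes."""
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data]) for _ in range(n_trees)]

def predict(forest, x):
    votes = [(left if x[j] <= t else right) for j, t, left, right in forest]
    return max(set(votes), key=votes.count)

# Invented telemonitoring-style data: (heart rate, weight gain in kg)
# labelled 1 for severe decompensation, 0 otherwise.
data = [((70, 0.2), 0), ((68, 0.1), 0), ((72, 0.4), 0), ((75, 0.3), 0),
        ((95, 2.1), 1), ((100, 1.8), 1), ((92, 2.5), 1), ((98, 2.0), 1)]
forest = random_forest(data)
```

    The averaging over bootstrap-trained trees is what reduces the variance of a single Classification Tree, which matches the improvement over the authors' earlier single-tree results.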

  14. Impulse-Momentum Diagrams

    NASA Astrophysics Data System (ADS)

    Rosengrant, David

    2011-01-01

    Multiple representations are a valuable tool to help students learn and understand physics concepts. Furthermore, representations help students learn how to think and act like real scientists. These representations include: pictures, free-body diagrams, energy bar charts, electrical circuits, and, more recently, computer simulations and animations. However, instructors have limited choices when they want to help their students understand impulse and momentum. One of the only available options is the impulse-momentum bar chart. The bar charts can effectively show the magnitude of the momentum as well as help students understand conservation of momentum, but they do not easily show the actual direction. This paper highlights a new representation instructors can use to help their students with momentum and impulse, the impulse-momentum diagram (IMD).

  15. TEP process flow diagram

    SciTech Connect

    Wilms, R Scott; Carlson, Bryan; Coons, James; Kubic, William

    2008-01-01

    This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and most-demanding design values. The proposed PFD is shown to meet specifications under the most-common and most-demanding design values.

  16. Assessment of the probability of failure for EC nondestructive testing based on intrusive spectral stochastic finite element method

    NASA Astrophysics Data System (ADS)

    Oudni, Zehor; Féliachi, Mouloud; Mohellebi, Hassane

    2014-06-01

    This work was undertaken to study the reliability of eddy current nondestructive testing (EC-NDT) when the defect involves a change in a physical property of the material. To this end, an intrusive spectral stochastic finite element method (SSFEM) was developed for the 2D electromagnetic harmonic equation. The electrical conductivity is treated as a random variable and is expanded in a series of Hermite polynomials. The developed model is validated against measurements on an NDT device and is applied to the assessment of the probability of failure in steam generator tubing of nuclear power plants. The exploitation of the model covers the impedance calculation of the sensor and the assessment of the probability of failure. Random defect geometry is also considered and results are given.
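    The core ingredient of the intrusive SSFEM named above, expanding a random electrical conductivity in Hermite polynomials, can be sketched for a lognormal conductivity, whose polynomial-chaos coefficients are known in closed form. The log-mean and log-standard deviation below are hypothetical, not taken from the paper.

```python
# Illustrative sketch: truncated Hermite-chaos expansion of a lognormal
# random conductivity sigma(xi) = exp(mu + s*xi), xi ~ N(0, 1).
# Closed form: sigma = sum_n c_n He_n(xi), c_n = exp(mu + s^2/2) * s^n / n!
# (from the generating function of probabilists' Hermite polynomials).
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import exp, factorial

mu, s = np.log(1.0e6), 0.1   # hypothetical log-mean and log-std of sigma (S/m)

order = 6
c = np.array([exp(mu + s**2 / 2) * s**n / factorial(n) for n in range(order + 1)])

xi = 0.7                      # one realization of the standard normal germ
exact = exp(mu + s * xi)      # the lognormal conductivity itself
approx = hermeval(xi, c)      # truncated Hermite-chaos evaluation
rel_err = abs(approx - exact) / exact
```

    In the intrusive SSFEM the same coefficients multiply the stochastic stiffness matrices; the sketch only checks that the truncated series reproduces the random coefficient itself.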

  17. Failure-Oriented Training.

    ERIC Educational Resources Information Center

    Pickens, Diana; Lorenz, Paul

    This document consists of a number of figures and diagrams suitable for overhead transparencies that illustrate and elaborate on the principles of failure-oriented training (a model for improving the effectiveness of instructional analysis). By adding a few simple steps to analysis, the resulting training will be closer to the idealized tutor:…

  18. Proactive Risk Assessment of Blood Transfusion Process, in Pediatric Emergency, Using the Health Care Failure Mode and Effects Analysis (HFMEA)

    PubMed Central

    Dehnavieh, Reza; Ebrahimipour, Hossein; Molavi-Taleghani, Yasamin; Vafaee-Najar, Ali; Hekmat, Somayeh Noori; Esmailzdeh, Hamid

    2015-01-01

    Introduction: The pediatric emergency ward is considered a high-risk area, and blood transfusion is a unique clinical measure; this study was therefore conducted to perform a proactive risk assessment of the blood transfusion process in the pediatric emergency ward of the Qaem education-treatment center in Mashhad, using the Healthcare Failure Mode and Effects Analysis (HFMEA) methodology. Methodology: This cross-sectional study analyzed the failure modes and effects of the blood transfusion process with a mixed quantitative-qualitative method. The proactive HFMEA was used to identify and analyze the potential failures of the process. The information for the items in the HFMEA forms was collected after reaching a consensus of the experts’ panel via interviews and focus group discussion sessions. Results: A total of 77 failure modes were identified for the 24 sub-processes of the 8 processes of blood transfusion. In all, 13 failure modes were identified as non-acceptable risks (a hazard score above 8) in the blood transfusion process and were transferred to the decision tree. Root causes of the high-risk modes were discussed in cause-and-effect meetings and classified according to the approved classification model of the UK National Health Service (NHS). Action types were classified into acceptance (11.6%), control (74.2%), and elimination (14.2%). Recommendations were placed in 7 categories using TRIZ (the “Theory of Inventive Problem Solving”). Conclusion: Re-engineering processes for the required changes, standardizing and updating the blood transfusion procedure, root cause analysis of catastrophic blood transfusion events, patient identification bracelets, training classes and educational pamphlets to raise personnel awareness, and monthly meetings of the transfusion medicine committee have all been placed on the work agenda in the pediatric emergency ward as executive strategies. PMID:25560332
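    The HFMEA scoring step described above (hazard scores above 8 are forwarded to the decision tree) can be sketched as follows. The failure modes and their severity/probability ratings are hypothetical illustrations, not values from the study.

```python
# Illustrative sketch of HFMEA hazard scoring: each failure mode is rated
# for severity and probability (each 1-4); hazard score = severity * probability;
# modes scoring above a threshold go on to the HFMEA decision tree.
failure_modes = [
    # (description, severity 1-4, probability 1-4) -- hypothetical examples
    ("wrong patient identification", 4, 3),
    ("mislabelled blood sample", 4, 2),
    ("delayed product delivery", 2, 3),
]

ACTIONABLE = 8  # hazard scores above this are transferred to the decision tree

def hazard_score(severity, probability):
    """HFMEA hazard score: severity rating times probability rating."""
    return severity * probability

to_decision_tree = [
    name for name, sev, prob in failure_modes
    if hazard_score(sev, prob) > ACTIONABLE
]
# Only "wrong patient identification" (score 12) exceeds the threshold here.
```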

  19. Assessment and management of cerebral edema and intracranial hypertension in acute liver failure.

    PubMed

    Mohsenin, Vahid

    2013-10-01

    Acute liver failure is an uncommon but not rare complication of liver injury. It can happen after ingestion of acetaminophen and exposure to toxins and hepatitis viruses. The defining clinical symptoms are coagulopathy and encephalopathy occurring within days or weeks of the primary insult in patients without preexisting liver injury. Acute liver failure is often complicated by multiorgan failure and sepsis. The most life-threatening complications are sepsis, multiorgan failure, and brain edema. The clinical signs of increased intracranial pressure (ICP) are nonspecific except for neurologic deficits in impending brain stem herniation. Computed tomography of the brain is not sensitive enough in gauging intracranial hypertension or ruling out brain edema. Intracranial pressure monitoring, transcranial Doppler, and jugular venous oximetry provide valuable information for monitoring ICP and guiding therapeutic measures in patients with encephalopathy grade III or IV. Osmotic therapy using hypertonic saline and mannitol, therapeutic hypothermia, and propofol sedation are shown to improve ICPs and stabilize the patient for liver transplantation. In this article, diagnosis and management of hepatic encephalopathy and cerebral edema in patients with acute liver failure are reviewed. PMID:23683564

  20. Predictive Values of Red Blood Cell Distribution Width in Assessing Severity of Chronic Heart Failure

    PubMed Central

    Liu, Sen; Wang, Ping; Shen, Ping-Ping; Zhou, Jian-Hua

    2016-01-01

    Background This retrospective study was performed to evaluate the value of baseline red blood cell distribution width (RDW) for predicting the severity of chronic heart failure (CHF) compared with N-terminal prohormone brain natriuretic peptide (NT-ProBNP) and other hematological and biochemical parameters. Material/Methods Hematological and biochemical parameters were obtained from 179 patients with New York Heart Association (NYHA) CHF class I (n=44), II (n=39), III (n=41), and IV (n=55). Receiver operator characteristic (ROC) curves were used for assessing predictive values. Results RDW increased significantly in class III and IV compared with class I (14.3±2.3% and 14.3±1.7% vs. 12.9±0.8%, P<0.01). Areas under ROCs (AUCs) of RDW and NT-ProBNP for class IV HF were 0.817 and 0.840, respectively. RDW was markedly elevated in the mortality group compared with the survival group (15.8±1.8 vs. 13.7±1.7, P<0.01). The predictive value of RDW was lower than that of NT-ProBNP but was comparable to white blood cell (WBC), neutrophil (NEU), lymphocyte (L), and neutrophil/lymphocyte ratio (N/L) for mortality during hospitalization, with AUCs of 0.837, 0.939, 0.858, 0.891, 0.885, and 0.885, respectively. RDW and NT-proBNP showed low predictive values for repeated admission (≥3). RDW was an independent risk factor for mortality (OR=2.531, 95% CI: 1.371–4.671). Conclusions RDW increased significantly in class III and IV patients and in the mortality group. The predictive value of RDW is comparable to NT-proBNP for class IV and lower than that of NT-proBNP for mortality. Elevated RDW is an independent risk factor for mortality. PMID:27324271
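    The ROC/AUC comparison used above for RDW and NT-proBNP can be sketched on synthetic data. The two markers and their discriminative strengths below are invented for illustration, not the study's cohort.

```python
# Illustrative sketch: comparing two continuous markers for a binary outcome
# via the area under the ROC curve (0.5 = chance, 1.0 = perfect).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 200
outcome = rng.integers(0, 2, n)          # 1 = event (e.g. in-hospital death)

# Two synthetic markers: marker_a separates the groups strongly, marker_b weakly.
marker_a = outcome * 2.0 + rng.normal(0, 0.8, n)
marker_b = outcome * 0.5 + rng.normal(0, 1.0, n)

auc_a = roc_auc_score(outcome, marker_a)
auc_b = roc_auc_score(outcome, marker_b)
# The stronger marker yields the larger AUC, mirroring how NT-proBNP's
# higher AUC indicates better discrimination than RDW for mortality.
```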

  1. Traditional and new composite endpoints in heart failure clinical trials: facilitating comprehensive efficacy assessments and improving trial efficiency.

    PubMed

    Anker, Stefan D; Schroeder, Stefan; Atar, Dan; Bax, Jeroen J; Ceconi, Claudio; Cowie, Martin R; Crisp, Adam; Dominjon, Fabienne; Ford, Ian; Ghofrani, Hossein-Ardeschir; Gropper, Savion; Hindricks, Gerhard; Hlatky, Mark A; Holcomb, Richard; Honarpour, Narimon; Jukema, J Wouter; Kim, Albert M; Kunz, Michael; Lefkowitz, Martin; Le Floch, Chantal; Landmesser, Ulf; McDonagh, Theresa A; McMurray, John J; Merkely, Bela; Packer, Milton; Prasad, Krishna; Revkin, James; Rosano, Giuseppe M C; Somaratne, Ransi; Stough, Wendy Gattis; Voors, Adriaan A; Ruschitzka, Frank

    2016-05-01

    Composite endpoints are commonly used as the primary measure of efficacy in heart failure clinical trials to assess the overall treatment effect and to increase the efficiency of trials. Clinical trials still must enrol large numbers of patients to accrue a sufficient number of outcome events and have adequate power to draw conclusions about the efficacy and safety of new treatments for heart failure. Additionally, the societal and health system perspectives on heart failure have raised interest in ascertaining the effects of therapy on outcomes such as repeat hospitalization and the patient's burden of disease. Thus, novel methods for using composite endpoints in clinical trials (e.g. clinical status composite endpoints, recurrent event analyses) are being applied in current and planned trials. Endpoints that measure functional status or reflect the patient experience are important but must be used cautiously because heart failure treatments may improve function yet have adverse effects on mortality. This paper discusses the use of traditional and new composite endpoints, identifies qualities of robust composites, and outlines opportunities for future research. PMID:27071916

  2. Heart Failure

    MedlinePlus

    ... Heart Failure What is Heart Failure? In heart failure, the heart cannot pump enough ... failure often experience tiredness and shortness of breath. Heart Failure is Serious Heart failure is a serious and ...

  3. Wilson Loop Diagrams and Positroids

    NASA Astrophysics Data System (ADS)

    Agarwala, Susama; Marin-Amat, Eloi

    2016-07-01

    In this paper, we study a new application of the positive Grassmannian to Wilson loop diagrams (or MHV diagrams) for scattering amplitudes in N = 4 super Yang-Mills theory (N = 4 SYM). There has been much interest in studying this theory via the positive Grassmannians using BCFW recursion. This is the first attempt to study MHV diagrams for planar Wilson loop calculations (or planar amplitudes) in terms of positive Grassmannians. We codify Wilson loop diagrams completely in terms of matroids. This allows us to apply the combinatorial tools of matroid theory used to identify positroids (non-negative Grassmannians) to Wilson loop diagrams. In doing so, we find that certain non-planar Wilson loop diagrams define positive Grassmannians. While non-planar diagrams do not have physical meaning, this finding suggests that they may have value as an algebraic tool, and deserve further investigation.

  4. ASSESSMENT OF SYNTHETIC MEMBRANE SUCCESSES AND FAILURES AT WASTE STORAGE AND DISPOSAL SITES

    EPA Science Inventory

    Data from 27 lined facilities provided by five vendors were analyzed to determine the factors that contributed to success or failure of the liner at those facilities. The sites studied included a wide variety of wastes handled, liner types, geographic locations, facility ages, fa...

  5. Risk assessment of the emergency processes: Healthcare failure mode and effect analysis

    PubMed Central

    Taleghani, Yasamin Molavi; Rezaei, Fatemeh; Sheikhbardsiri, Hojat

    2016-01-01

    BACKGROUND: Ensuring patient safety is the first vital step in improving the quality of care, and the emergency ward is known as a high-risk area in health care. The present study was conducted to evaluate selected high-risk processes of the emergency surgery department of the Qaem treatment-educational center in Mashhad using Healthcare Failure Mode and Effects Analysis (HFMEA). METHODS: In this combined (qualitative action research and quantitative cross-sectional) study, the failure modes and effects of 5 high-risk procedures of the emergency surgery department were identified and analyzed according to HFMEA. The “nursing errors in clinical management model (NECM)” was used to classify the failure modes, the “Eindhoven model” to classify the causes of error, and the “theory of inventive problem solving” to determine improvement strategies. Descriptive statistics (total scores) were used to analyze the quantitative data; content analysis and member agreement were used for the qualitative data. RESULTS: In the 5 processes selected by rated voting, 23 steps, 61 sub-processes, and 217 potential failure modes were identified by HFMEA. 25 (11.5%) failure modes were detected as high-risk errors and transferred to the decision tree. The most and fewest failure modes fell into the categories of care errors (54.7%) and knowledge and skill (9.5%), respectively. Also, 29.4% of preventive measures belonged to the human resource management strategy category. CONCLUSION: “Revision and re-engineering of processes”, “continuous monitoring of the work”, “preparation and revision of operating procedures and policies”, “developing criteria for evaluating personnel performance”, “designing educational content suited to the needs of employees”,

  6. Warped penguin diagrams

    SciTech Connect

    Csaki, Csaba; Grossman, Yuval; Tanedo, Philip; Tsai, Yuhsin

    2011-04-01

    We present an analysis of the loop-induced magnetic dipole operator in the Randall-Sundrum model of a warped extra dimension with anarchic bulk fermions and an IR brane-localized Higgs. These operators are finite at one-loop order and we explicitly calculate the branching ratio for μ→eγ using the mixed position/momentum space formalism. The particular bound on the anarchic Yukawa and Kaluza-Klein (KK) scales can depend on the flavor structure of the anarchic matrices. It is possible for a generic model to either be ruled out or unaffected by these bounds without any fine-tuning. We quantify how these models realize this surprising behavior. We also review tree-level lepton flavor bounds in these models and show that these are on the verge of tension with the μ→eγ bounds from typical models with a 3 TeV Kaluza-Klein scale. Further, we illuminate the nature of the one-loop finiteness of these diagrams and show how to accurately determine the degree of divergence of a five-dimensional loop diagram using both the five-dimensional and KK formalism. This power counting can be obfuscated in the four-dimensional Kaluza-Klein formalism and we explicitly point out subtleties that ensure that the two formalisms agree. Finally, we remark on the existence of a perturbative regime in which these one-loop results give the dominant contribution.

  7. Assessment of congestive heart failure in chest radiographs. Observer performance with two common film-screen systems.

    PubMed

    Henriksson, L; Sundin, A; Smedby, O; Albrektsson, P

    1990-09-01

    The effect of observer variations and film-screen quality on the diagnosis of congestive heart failure based on chest radiographs was studied in 27 patients. For each patient, two films were exposed, one with the Kodak Lanex Medium system and one with the Agfa MR 400 system. The films were presented to three observers who assessed the presence of congestive heart failure on a three-grade scale. The results showed no significant difference between the two systems but large systematic differences between the observers. There were also differences between the two ratings by the same observer that could not be explained by the film-screen factor. It is concluded that the choice between these two systems is of little importance in view of the interobserver and intraobserver variability that can exist within the same department. PMID:2261292

  8. Using Eye Tracking to Investigate Semantic and Spatial Representations of Scientific Diagrams during Text-Diagram Integration

    ERIC Educational Resources Information Center

    Jian, Yu-Cin; Wu, Chao-Jung

    2015-01-01

    We investigated strategies used by readers when reading a science article with a diagram and assessed whether semantic and spatial representations were constructed while reading the diagram. Seventy-one undergraduate participants read a scientific article while tracking their eye movements and then completed a reading comprehension test. Our…

  9. Risk-benefit assessment of ivabradine in the treatment of chronic heart failure

    PubMed Central

    Urbanek, Irmina; Kaczmarek, Krzysztof; Cygankiewicz, Iwona; Ptaszynski, Pawel

    2014-01-01

    Heart rate is not only a major risk marker in heart failure but also a general risk marker. Within the last few years, it has been demonstrated that reduction of resting heart rate to <70 bpm is of significant benefit for patients with heart failure, especially those with impaired left ventricular systolic function. Ivabradine is the first innovative drug synthesized to reduce heart rate. It selectively and specifically inhibits the pacemaker If ionic current, which reduces cardiac pacemaker activity. Therefore, the main effect of ivabradine therapy is a substantial lowering of heart rate. Ivabradine does not influence intracardiac conduction, contractility, or ventricular repolarization. According to the European Society of Cardiology guidelines, ivabradine should be considered in symptomatic patients (New York Heart Association functional class II–IV) with sinus rhythm, left ventricular ejection fraction ≤35%, and heart rate ≥70 bpm despite optimal treatment with a beta-blocker, angiotensin-converting enzyme inhibitor/angiotensin receptor blocker, and a mineralocorticoid receptor antagonist. As shown in numerous clinical studies, ivabradine improves clinical outcomes and quality of life and reduces the risk of death from heart failure or cardiovascular causes. Treatment with ivabradine is very well tolerated and safe, even at maximal recommended doses. PMID:24855390

  10. Using Eye Tracking to Investigate Semantic and Spatial Representations of Scientific Diagrams During Text-Diagram Integration

    NASA Astrophysics Data System (ADS)

    Jian, Yu-Cin; Wu, Chao-Jung

    2015-02-01

    We investigated strategies used by readers when reading a science article with a diagram and assessed whether semantic and spatial representations were constructed while reading the diagram. Seventy-one undergraduate participants read a scientific article while tracking their eye movements and then completed a reading comprehension test. Our results showed that the text-diagram referencing strategy was commonly used. However, some readers adopted other reading strategies, such as reading the diagram or text first. We found that all readers who referred to the diagram spent roughly the same amount of time reading and performed equally well. However, some participants who ignored the diagram performed more poorly on questions that tested understanding of basic facts. This result indicates that dual coding theory may explain the phenomenon. Eye movement patterns indicated that at least some readers had extracted semantic information from the scientific terms when first looking at the diagram. Readers who read the scientific terms on the diagram first tended to spend less time looking at the same terms in the text, which they read afterward. Moreover, clear diagrams can help readers process both semantic and spatial information, thereby facilitating an overall understanding of the article. In addition, although text-first and diagram-first readers spent similar total reading times on the text and diagram parts of the article, respectively, text-first readers made significantly fewer saccades between text and diagram than diagram-first readers. This result might be explained by text-directed reading.

  11. Micro-compression: a novel technique for the nondestructive assessment of local bone failure.

    PubMed

    Müller, R; Gerber, S C; Hayes, W C

    1998-12-01

    Many bones within the axial and appendicular skeleton are subjected to repetitive, cyclic loading during the course of ordinary daily activities. If this repetitive loading is of sufficient magnitude or duration, fatigue failure of the bone tissue may result. In clinical orthopedics, trabecular fatigue fractures are observed as compressive stress fractures in the proximal femur, vertebrae, calcaneus and tibia, and are often preceded by buckling and bending of microstructural elements. However, the relative importance of bone density and architecture in the etiology of these fractures is poorly understood. The aim of the study was to investigate failure mechanisms of 3D trabecular bone using micro-computed tomography (microCT). Because of its nondestructive nature, microCT represents an ideal approach for performing not only static measurements of bone architecture but also dynamic measurements of failure initiation and propagation as well as damage accumulation. For the purpose of the study, a novel micro-compression device was devised to measure loaded trabecular bone specimens directly in a micro-tomographic system. The measurement window in the device was made of a radiolucent, highly stiff plastic to enable X-rays to penetrate the material. The micro-compressor has an outer diameter of 19 mm and a total length of 65 mm. The internal load chamber fits wet or dry bone specimens with maximal diameters of 9 mm and maximal lengths of 22 mm. For the actual measurement, first, the unloaded bone is measured in the microCT. Second, a load-displacement curve is recorded where the load is measured with an integrated mini-button load cell and the displacement is computed directly from the microCT scout-view. For each load case, a 3D snap-shot of the structure under load is taken providing 34 μm nominal resolution. Initial measurements included specimens from bovine tibiae and whale spine to investigate the influence of the structure type on the failure mechanism.

  12. Argument Diagramming: The Araucaria Project

    NASA Astrophysics Data System (ADS)

    Rowe, Glenn; Reed, Chris

    Formal arguments, such as those used in science, medicine and law to establish a conclusion by providing supporting evidence, are frequently represented by diagrams such as trees and graphs. We describe the software package Araucaria, which allows textual arguments to be marked up and represented as standard, Toulmin or Wigmore diagrams. Since each of these diagramming techniques was devised for a particular domain of argumentation, we discuss some of the issues involved in translating between diagrams. The exercise of translating between different diagramming types illustrates that any one diagramming system often cannot capture all of the nuances inherent in an argument. Finally, we describe some areas, such as critical thinking courses in colleges and universities and the analysis of evidence in court cases, where Araucaria has been put to practical use.

  13. Program Synthesizes UML Sequence Diagrams

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2006-01-01

    A computer program called "Rational Sequence" generates Universal Modeling Language (UML) sequence diagrams of a target Java program running on a Java virtual machine (JVM). Rational Sequence thereby performs a reverse engineering function that aids in the design documentation of the target Java program. Whereas previously, the construction of sequence diagrams was a tedious manual process, Rational Sequence generates UML sequence diagrams automatically from the running Java code.

  14. Risk Assessment of Using Entonox for the Relief of Labor Pain: A Healthcare Failure Modes and Effects Analysis Approach

    PubMed Central

    Najafi, Tahereh Fathi; Bahri, Narjes; Ebrahimipour, Hosein; Najar, Ali Vafaee; Taleghani, Yasamin Molavi

    2016-01-01

    Introduction In order to prevent medical errors, it is important to know why they occur and to identify their causes. Healthcare failure modes and effects analysis (HFMEA) is a qualitative, descriptive method used to evaluate risk. The aim of this study was to assess the risks of using Entonox for labor pain by HFMEA. Methods A mixed-methods design (qualitative action research and quantitative cross-sectional research) was used. The modes and effects of failures in the process of using Entonox were detected and analyzed during 2013–2014 at Hefdahe Shahrivar Hospital, Mashhad, Iran. Overall, 52 failure modes were identified, with 25 recognized as high-risk modes. Results The results revealed that 48.5% of these errors fall into the care-process type, 22.05% belong to the communicative type, 19.1% fall into the administrative type, and 10.2% are of the knowledge-and-skills type. Strategies were presented in the forms of acceptance (3.2%), control (90.3%), and elimination (6.4%). Conclusion The following actions are suggested for improving the process of using Entonox: close supervision by the midwife, precise recording of all stages of the process in the woman’s medical record, the presence of the anesthesiologist at the woman’s bedside during labor, confirmation of the indications for use of Entonox, and close monitoring to ensure the safety of the gas cylinder guards. PMID:27123224

  15. Development of Methodology to Assess the Failure Behaviour of Bamboo Single Fibre by Acoustic Emission Technique

    NASA Astrophysics Data System (ADS)

    Alam, Md. Saiful; Gulshan, Fahmida; Ahsan, Qumrul; Wevers, Martine; Pfeiffer, Helge; van Vuure, Aart-Willem; Osorio, Lina; Verpoest, Ignaas

    2016-06-01

    Acoustic emission (AE) was used as a tool for detecting, evaluating, and better understanding the damage mechanisms and failure behavior of composites during mechanical loading. A methodology was developed for the tensile testing of natural fibres (bamboo single fibre). A series of experiments was performed, and load drops (one or two) were observed in the load-versus-time graphs. From the observed AE parameters, such as amplitude, energy, and duration, significant information corresponding to the load drops was extracted. These AE signals at the load drops originated from failures such as debonding between two elementary fibres or at the join of elementary fibres at an edge. The load at the first load drop was not consistent across samples (for one particular sample the value was 8 N, corresponding to a stress of 517.51 MPa). The final breaking of the fibre corresponded to the saturation amplitude of the preamplifier (99.9 dB) for all samples, so it was not possible to determine the exact AE energy value for final breaking. The same methodology was used for tensile tests of three single fibres, which gave a clear indication of a load drop before the final breaking of the first and second fibres.

  16. Failure assessment of aluminum liner based filament-wound hybrid riser subjected to internal hydrostatic pressure

    NASA Astrophysics Data System (ADS)

    Dikshit, Vishwesh; Seng, Ong Lin; Maheshwari, Muneesh; Asundi, A.

    2015-03-01

    The present study describes the burst behavior of an aluminum-liner-based prototype filament-wound hybrid riser under internal hydrostatic pressure. The main objectives were to develop an internal pressure test rig for the filament-wound hybrid riser and to investigate its failure modes under internal hydrostatic burst pressure loading. The prototype riser used for the burst test consists of an internal aluminum liner and an outer composite layer. The carbon-epoxy composite layers of the filament-wound hybrid risers were manufactured in a [±55°] lay-up pattern with a total composite thickness of 1.6 mm using a CNC filament-winding machine. The burst test was monitored by video camera, which helped in analyzing the failure mechanism of the fractured filament-wound hybrid riser. A Fiber Bragg Grating (FBG) sensor was used to monitor and record strain changes during the burst test of the prototype riser. This study shows a good improvement in the burst strength of the filament-wound hybrid riser compared with a monolithic metallic riser. Since strain measurement using FBG sensors has proven to be a reliable method, we aim to use this technique for further detailed understanding.

  17. Sex Differences in Patients With Acute Decompensated Heart Failure: Insights From the Heart Function Assessment Registry Trial in Saudi Arabia.

    PubMed

    AlFaleh, Hussam F; Thalib, Lukman; Kashour, Tarek; Hersi, Ahmad; Mimish, Layth; Elasfar, Abdelfatah A; Almasood, Ali; Al Ghamdi, Saleh; Ghabashi, Abdullah; Malik, Asif; Hussein, Gamal A; Al-Murayeh, Mushabab; Abuosa, Ahmed; Al Habeeb, Waleed; Al Habib, Khalid F

    2016-08-01

    We assessed sex-specific differences in clinical features and outcomes of patients with acute heart failure (AHF). The Heart function Assessment Registry Trial in Saudi Arabia (HEARTS), a prospective registry, enrolled 2609 patients with AHF (34.2% women) between 2009 and 2010. Women were older and more likely to have risk factors for atherosclerosis, history of heart failure (HF), and rheumatic heart and valve disease. Ischemic heart disease was the prime cause for HF in men and women but more so in men (P < .001). Women had higher rates of hypertensive heart disease and primary valve disease (P < .001, for both comparisons). Men were more likely to have severe left ventricular systolic dysfunction. On discharge, a higher use of angiotensin-converting enzyme inhibitors, β-blockers, and aldosterone inhibitors was observed in men (P < .001 for all comparisons). Apart from higher atrial fibrillation in women and higher ventricular arrhythmias in men, no differences were observed in hospital outcomes. The overall survival did not differ between men and women (hazard ratio: 1.0, 95% confidence interval: 0.8-1.2, P = .981). Men and women with AHF differ significantly in baseline clinical characteristics and management but not in adverse outcomes. PMID:26438635

  18. Risk assessment of Giardia from a full scale MBR sewage treatment plant caused by membrane integrity failure.

    PubMed

    Zhang, Yu; Chen, Zhimin; An, Wei; Xiao, Shumin; Yuan, Hongying; Zhang, Dongqing; Yang, Min

    2015-04-01

    Membrane bioreactors (MBR) are highly efficient at intercepting particles and microbes and have become an important technology for wastewater reclamation. However, many pathogens can accumulate in activated sludge due to the long residence time usually adopted in MBR, and thus may pose health risks when membrane integrity problems occur. This study presents data from a survey on the occurrence of water-borne Giardia pathogens in reclaimed water from a full-scale wastewater treatment plant with MBR experiencing membrane integrity failure, and assessed the associated risk for green space irrigation. Due to membrane integrity failure, the MBR effluent turbidity varied between 0.23 and 1.90 NTU over a period of eight months. Though this turbidity level still met reclaimed water quality standards (≤5 NTU), Giardia were detected at concentrations of 0.3 to 95 cysts/10 L, with a close correlation between effluent turbidity and Giardia concentration. All β-giardin gene sequences of Giardia in the WWTP influents were genotyped as Assemblages A and B, both of which are known to infect humans. An exponential dose-response model was applied to assess the risk of infection by Giardia. The risk in the MBR effluent with chlorination was 9.83×10⁻³, higher than the acceptable annual risk of 1.0×10⁻⁴. This study suggested that membrane integrity is very important for keeping a low pathogen level, and multiple barriers are needed to ensure the biological safety of MBR effluent. PMID:25872734
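    The exponential dose-response model named above can be sketched as follows. The dose-response parameter and exposure figures below are hypothetical placeholders for illustration, not the study's values.

```python
# Illustrative sketch of an exponential dose-response risk calculation:
# per-event infection probability P = 1 - exp(-r * dose), then an annual
# risk from repeated independent exposure events.
from math import exp

r = 0.0199            # assumed dose-response parameter for Giardia
cysts_per_10L = 5.0   # hypothetical effluent concentration (cysts per 10 L)
ingested_L = 0.001    # hypothetical accidental ingestion per irrigation event

dose = cysts_per_10L / 10.0 * ingested_L        # cysts ingested per event
p_event = 1.0 - exp(-r * dose)                  # per-event infection risk

events_per_year = 50                            # hypothetical exposure frequency
p_annual = 1.0 - (1.0 - p_event) ** events_per_year
```

    The annual figure is what gets compared against an acceptable-risk benchmark such as the 1.0×10⁻⁴ per year cited in the abstract.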

  19. An assessment of BWR (boiling water reactor) Mark-II containment challenges, failure modes, and potential improvements in performance

    SciTech Connect

    Kelly, D.L.; Jones, K.R.; Dallman, R.J. ); Wagner, K.C. )

    1990-07-01

    This report assesses challenges to BWR Mark II containment integrity that could potentially arise from severe accidents. Also assessed are some potential improvements that could prevent core damage or containment failure, or could mitigate the consequences of such failure by reducing the release of fission products to the environment. These challenges and improvements are analyzed via a limited quantitative risk/benefit analysis of a generic BWR/4 reactor with Mark II containment. Point estimate frequencies of the dominant core damage sequences are obtained and simple containment event trees are constructed to evaluate the response of the containment to these severe accident sequences. The resulting containment release modes are then binned into source term release categories, which provide inputs to the consequence analysis. The output of the consequence analysis is used to construct an overall base case risk profile. Potential improvements and sensitivities are evaluated by modifying the event tree split fractions, thus generating a revised risk profile. Several important sensitivity cases are examined to evaluate the impact of phenomenological uncertainties on the final results. 75 refs., 25 figs., 65 tabs.
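    The containment event tree quantification described above can be sketched as follows: each release path's frequency is the initiating (core damage) frequency multiplied by the branch split fractions along its path. All numbers below are hypothetical, not taken from the report.

```python
# Illustrative sketch of containment event tree quantification: sequence
# frequency = initiator frequency * product of branch probabilities.
initiator_freq = 1.0e-5   # hypothetical core-damage frequency (per reactor-year)

# Hypothetical split fractions: probability of the "failure" branch at each
# top event of the containment event tree.
p_no_isolation = 0.01     # containment fails to isolate
p_early_failure = 0.05    # early structural failure, given isolation success

# Mutually exclusive release paths (binned into release categories).
release_paths = {
    "bypass (isolation failure)": initiator_freq * p_no_isolation,
    "early failure":  initiator_freq * (1 - p_no_isolation) * p_early_failure,
    "intact/late":    initiator_freq * (1 - p_no_isolation) * (1 - p_early_failure),
}
total = sum(release_paths.values())   # paths must sum back to the initiator frequency
```

    A sensitivity case, in this scheme, is simply a re-evaluation of the same tree with one split fraction changed.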

  20. Potential-pH Diagrams.

    ERIC Educational Resources Information Center

    Barnum, Dennis W.

    1982-01-01

    Potential-pH diagrams show the domains of redox potential and pH in which major species are most stable. Constructing such diagrams gives students opportunities to decide which species must be considered, to search the literature for equilibrium constants and free energies of formation, and to practice using the Nernst equation. (Author/JN)
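
    As an exercise of the kind the article suggests, the Nernst slope of a proton-coupled half-reaction at 25 °C can be computed directly; the water-stability lines below use standard textbook values.

    ```python
    def nernst_ph_line(E0, n_e, n_H):
        """Return E(pH) for a half-reaction with n_e electrons and n_H
        protons at 25 °C: E = E0 - 0.05916 * (n_H / n_e) * pH."""
        def E(pH):
            return E0 - 0.05916 * (n_H / n_e) * pH
        return E

    # Stability limits of water (standard values at 25 °C, 1 atm):
    oxygen_line = nernst_ph_line(1.229, n_e=4, n_H=4)    # O2 + 4H+ + 4e- -> 2H2O
    hydrogen_line = nernst_ph_line(0.000, n_e=2, n_H=2)  # 2H+ + 2e- -> H2

    for pH in (0, 7, 14):
        print(pH, round(oxygen_line(pH), 3), round(hydrogen_line(pH), 3))
    ```

    Species whose equilibrium potential falls between the two lines are stable in contact with water; this is the sort of boundary a student plots when building the diagram.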

  1. Contingency diagrams as teaching tools

    PubMed Central

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching. PMID:22478208

  2. Moving Toward Comprehensive Acute Heart Failure Risk Assessment in the Emergency Department

    PubMed Central

    Collins, Sean P.; Storrow, Alan B.

    2013-01-01

    Nearly 700,000 emergency department (ED) visits were due to acute heart failure (AHF) in 2009. Most visits result in a hospital admission and account for the largest proportion of a projected $70 billion to be spent on heart failure care by 2030. ED-based risk prediction tools in AHF rarely impact disposition decision making. This is a major factor contributing to the 80% admission rate for ED patients with AHF, which has remained unchanged over the last several years. Self-care behaviors such as symptom monitoring, medication taking, dietary adherence, and exercise have been associated with decreased hospital readmissions, yet self-care remains largely unaddressed in ED patients with AHF and thus represents a significant lost opportunity to improve patient care and decrease ED visits and hospitalizations. Furthermore, shared decision making encourages collaborative interaction between patients, caregivers, and providers to drive a care path based on mutual agreement. The observation that "difficult decisions now will simplify difficult decisions later" has particular relevance to the ED, given that this is the venue for many such issues. We hypothesize that patients as complex and heterogeneous as ED patients with AHF may need both an objective evaluation of physiologic risk and an evaluation of barriers to ideal self-care, along with strategies to overcome those barriers. Combining physician gestalt, physiologic risk prediction instruments, an evaluation of self-care, and an information exchange between patient and provider using shared decision making may provide the critical inertia necessary to discharge patients home after a brief ED evaluation. PMID:24159563

  3. Risk assessment of drain valve failure in the K-West basin south loadout pit

    SciTech Connect

    MORGAN, R.G.

    1999-06-23

    The drain valve located in the bottom of the K-West Basin South Loadout Pit (SLOP) could provide an additional leak path from the K Basins if the drain valve were damaged during construction, installation, or operation of the cask loading system. For the K-West Basin SLOP the immersion pail support structure (IPSS) has already been installed, but the immersion pail has not been installed in the IPSS. The objective of this analysis is to evaluate the risk of damaging the drain valve during the remaining installation activities or operation of the cask loading system. Valve damage, as used in this analysis, does not necessarily imply that large amounts of water will be released quickly from the basin; rather, it implies that the valve's integrity has been compromised. The analysis is a risk-based uncertainty analysis in which best engineering judgement is used to represent each variable, and the uncertainty associated with each variable is represented by a probability distribution. The uncertainty is propagated through the analysis by Monte Carlo convolution techniques. The results are developed as a probability distribution, and the risk is expressed in terms of the corresponding complementary cumulative distribution function (the "risk curve"). The total risk is the area under the risk curve. The risk of dropping a cask into or on the IPSS and damaging the drain valve is approximately 1×10⁻⁴ to 2×10⁻⁵ per year. The risk of objects falling behind the IPSS and damaging the valve is 3×10⁻² to 6×10⁻³ per year. Both risks are expressed as drain valve failure frequencies. The risk of objects falling behind the IPSS and damaging the valve can be significantly reduced by an impact limiter and/or by installing a grating or plate over the area bounded by the back of the IPSS and the wall of the SLOP.
With either of these actions there is a 90 percent confidence that the frequency of drain valve
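
    The Monte Carlo convolution and risk-curve construction described above can be sketched as follows; the sampled frequency distribution is a placeholder, not the analysis's actual input distributions.

    ```python
    import random

    random.seed(1)

    # Hypothetical uncertainty in annual drain-valve damage frequency,
    # log-uniform between 1e-5 and 1e-2 per year (illustrative only).
    samples = [10 ** random.uniform(-5, -2) for _ in range(10_000)]

    def ccdf(data, x):
        """Complementary cumulative distribution: P(X > x)."""
        return sum(1 for v in data if v > x) / len(data)

    # The "risk curve": exceedance probability at a few frequency levels.
    for level in (1e-4, 1e-3):
        print(f"P(freq > {level:g}) = {ccdf(samples, level):.3f}")

    # Total risk corresponds to the area under the risk curve,
    # i.e. the mean of the sampled frequencies.
    total_risk = sum(samples) / len(samples)
    ```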

  4. Assessing the Effects of the "Rocket Math" Program with a Primary Elementary School Student at Risk for School Failure: A Case Study

    ERIC Educational Resources Information Center

    Smith, Christina R.; Marchand-Martella, Nancy E.; Martella, Ronald C.

    2011-01-01

    This study assessed the effects of the "Rocket Math" program on the math fluency skills of a first grade student at risk for school failure. The student received instruction in the "Rocket Math" program over 6 months. He was assessed using a pre- and posttest curriculum-based measurement (CBM) and individualized fluency checkouts within the…

  5. Consequences and assessment of human vestibular failure: implications for postural control.

    PubMed

    Colebatch, James G

    2002-01-01

    Labyrinthine afferents respond to both angular velocity (semicircular canals) and linear acceleration (otoliths), including gravity. Given their response to gravity, the otoliths are likely to have an important role in the postural functions of the vestibular apparatus. Unilateral vestibular ablation has dramatic effects on posture in many animals, but less so in primates. Nevertheless, bilateral vestibular lesions lead to disabling symptoms in man related to disturbed ocular and postural control and impaired perception of slopes and accelerations. While semicircular canal function can be assessed through its effects on vestibular ocular reflexes, assessment of otolith function in man has traditionally been much more difficult. Recent definition of a short latency vestibulocollic reflex, activated by sound and appearing to arise from the saccule, shows promise as a new method of non-invasive assessment of otolith function. PMID:12171099

  6. Concrete and abstract Voronoi diagrams

    SciTech Connect

    Klein, R. )

    1989-01-01

    The Voronoi diagram of a set of sites is a partition of the plane into regions, one to each site, such that the region of each site contains all points of the plane that are closer to this site than to the other ones. Such partitions are of great importance to computer science and many other fields. The challenge is to compute Voronoi diagrams quickly. The problem is that their structure depends on the notion of distance and the sort of site. In this book the author proposes a unifying approach by introducing abstract Voronoi diagrams. These are based on the concept of bisecting curves, which are required to have some simple properties that are actually possessed by most bisectors of concrete Voronoi diagrams. Abstract Voronoi diagrams can be computed efficiently, and there exists a worst-case efficient algorithm of divide-and-conquer type that applies to all abstract Voronoi diagrams satisfying a certain constraint. The author shows that this constraint is fulfilled by the concrete diagrams based on large classes of metrics in the plane.
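
    For the concrete Euclidean case, the defining property of a Voronoi region — every point belongs to its nearest site — can be illustrated with a brute-force partition of a grid (abstract Voronoi diagrams generalize the bisectors, which this sketch does not model):

    ```python
    import math

    sites = [(1.0, 1.0), (4.0, 1.0), (2.5, 4.0)]

    def nearest_site(p, sites):
        """Index of the site closest to point p (Euclidean metric)."""
        return min(range(len(sites)),
                   key=lambda i: math.dist(p, sites[i]))

    # Partition a small grid into Voronoi regions.
    regions = {}
    for x in range(6):
        for y in range(6):
            regions.setdefault(nearest_site((x, y), sites), []).append((x, y))

    print({i: len(pts) for i, pts in regions.items()})
    ```

    Efficient algorithms (divide-and-conquer, sweepline) replace this O(grid × sites) scan, but the partition they compute is the same.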

  7. Assessment of systolic and diastolic function in heart failure using ambulatory monitoring with acoustic cardiography.

    PubMed

    Dillier, Roger; Zuber, Michel; Arand, Patricia; Erne, Susanne; Erne, Paul

    2011-08-01

    INTRODUCTION. The circadian variation of heart function and heart sounds in patients with and without heart failure (HF) is poorly understood. We hypothesized HF patients would exhibit less circadian variation with worsened cardiac function and sleep apnea. METHODS. We studied 67 HF patients (age 67.4 ± 8.2 years; 42% acute HF) and 63 asymptomatic control subjects with no history of HF (age 61.6 ± 7.7 years). Subjects wore a heart sound/ECG/respiratory monitor. The data were analyzed for sleep apnea, diastolic heart sounds, and systolic time intervals. RESULTS. The HF group had significantly greater prevalence of the third heart sound and prolongation of electro-mechanical activation time, while the control group had an age-related increase in the prevalence of the fourth heart sound. The control group showed more circadian variation in cardiac function. The HF subjects had more sleep apnea and higher occurrence of heart rate non-dipping. CONCLUSIONS. The control subjects demonstrated an increasing incidence of diastolic dysfunction with age, while systolic function was mostly unchanged with aging. Parameters related to systolic function were significantly worse in the HF group with little diurnal variation, indicating a constant stimulation of sympathetic tone in HF and reduction of diurnal regulation. PMID:21361859

  8. Failure Impact Analysis of Key Management in AMI Using Cybernomic Situational Assessment (CSA)

    SciTech Connect

    Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R; Lantz, Margaret W; Mili, Ali

    2013-01-01

    In earlier work, we presented a computational framework for quantifying the security of a system in terms of the average loss a stakeholder stands to sustain as a result of threats to the system, named the Cyberspace Security Econometrics System (CSES). In this paper, we refine the framework and apply it to cryptographic key management within the Advanced Metering Infrastructure (AMI) as an example. The stakeholders, requirements, components, and threats are determined. We then populate the matrices with justified values by addressing the AMI at a higher level, rather than trying to consider every piece of hardware and software involved. We accomplish this task by leveraging the recently established NISTIR 7628 guideline for smart grid security, which allowed us to choose the stakeholders, requirements, components, and threats realistically. We reviewed the literature and worked with an industry technical working group to select three representative threats from a collection of 29. From this subset, we populate the stakes, dependency, and impact matrices and the threat vector with realistic numbers. Each stakeholder's Mean Failure Cost is then computed.
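
    The CSES computation reduces to a chain of matrix-vector products, MFC = ST · DP · IM · PT; the matrices below are small hypothetical placeholders, not the paper's AMI values.

    ```python
    # ST: stakes ($/h) each stakeholder loses if a requirement fails.
    # DP: probability a requirement fails given a component fails.
    # IM: probability a component fails given a threat materializes.
    # PT: probability each threat materializes (per unit time).
    # All entries are illustrative placeholders.

    def mat_vec(M, v):
        return [sum(m * x for m, x in zip(row, v)) for row in M]

    ST = [[100.0, 40.0],     # stakeholder 1 (e.g. utility)
          [ 20.0, 80.0]]     # stakeholder 2 (e.g. customer)
    DP = [[0.8, 0.1],        # 2 requirements x 2 components
          [0.2, 0.9]]
    IM = [[0.5, 0.0, 0.3],   # 2 components x 3 threats
          [0.1, 0.6, 0.2]]
    PT = [0.01, 0.005, 0.02]  # 3 threats

    mfc = mat_vec(ST, mat_vec(DP, mat_vec(IM, PT)))
    print(mfc)   # expected loss per stakeholder
    ```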

  9. Assessment of filter dust characteristics that cause filter failure during hot-gas filtration

    SciTech Connect

    John P. Hurley; Biplab Mukherjee; Michael D. Mann

    2006-08-15

    The high-temperature filtration of particulates from gases is greatly limited by the development of dust cakes that are difficult to remove and can bridge between candle filters, causing them to break. Understanding the conditions that lead to the formation of cohesive dust can prevent costly filter failures and ensure higher efficiency of solid-fuel, direct-fired turbine power generation systems. The University of North Dakota Energy & Environmental Research Center is working with the New Energy and Industrial Technology Development Organization and the U.S. Department of Energy to characterize and determine the factors that cause the development of such dust cakes. Changes in the tensile strength, bridging propensity, and plasticity of filter dust cakes were measured as a function of temperature and filter pressure drop for a coal and a biomass filter dust. The biomass filter dust showed that potential filtering problems can exist at temperatures as low as 400°C, while the coal filter dust showed good filtering characteristics up to 750°C. A statistically valid model that can indicate the propensity of filters to fail under given system operating conditions was developed. A detailed analysis of the chemical aspects of the dusts is also presented to explore the causes of such stickiness. 16 refs., 10 figs., 3 tabs.

  10. Students' different understandings of class diagrams

    NASA Astrophysics Data System (ADS)

    Boustedt, Jonas

    2012-03-01

    The software industry needs well-trained software designers, and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task for many students. This article reports empirical findings from a phenomenographic investigation of how students understand class diagrams, Unified Modeling Language (UML) symbols, and relations to object-oriented (OO) concepts. The informants were 20 Computer Science students from four different universities in Sweden. The results show qualitatively different ways of understanding and describing UML class diagrams and the "diamond symbols" representing aggregation and composition. The purpose of class diagrams was understood in varied ways, from describing them as documentation to a more advanced view related to communication. The descriptions of class diagrams varied from seeing them as a specification of classes to a more advanced view in which they show hierarchic structures of classes and relations. The diamond symbols were seen as "relations," and a more advanced way was seeing the white and black diamonds as distinct symbols for aggregation and composition. As a consequence of the results, it is recommended that UML be adopted in courses. It is briefly indicated how the phenomenographic results, in combination with variation theory, can be used by teachers to enhance students' possibilities of reaching an advanced understanding of phenomena related to UML class diagrams. Moreover, teachers should put more effort into assessing skills in proper usage of the basic symbols and models, and students should be provided with opportunities to practise collaborative design, e.g. using whiteboards.

  11. 30 CFR 218.40 - Assessments for incorrect or late reports and failure to report.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... INTERIOR MINERALS REVENUE MANAGEMENT COLLECTION OF MONIES AND PROVISION FOR GEOTHERMAL CREDITS AND... MMS by the designated due date for geothermal, solid minerals, and Indian oil and gas leases. (b) An... geothermal, solid minerals, and Indian oil and gas leases. (c) For purpose of assessments discussed in...

  12. Risk assessment of failure modes of gas diffuser liner of V94.2 siemens gas turbine by FMEA method

    NASA Astrophysics Data System (ADS)

    Mirzaei Rafsanjani, H.; Rezaei Nasab, A.

    2012-05-01

    Failure of the welded connection between the gas diffuser liner and the exhaust casing is one of the failure modes of V94.2 gas turbines and has occurred in some power plants. This defect is one of the uncertainties customers face when deciding whether to accept final commissioning of the product. Accordingly, the risk priority of this failure was evaluated by the failure modes and effects analysis (FMEA) method to determine whether it is catastrophic for turbine performance or harmful to humans. Using the history of 110 gas turbines of this model operating in several power plants, the severity, occurrence, and detection numbers of the failure were determined, and consequently its Risk Priority Number (RPN). Finally, a criticality matrix of potential failures was created, showing that the failure modes lie in the safe zone.
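
    The RPN computation used in traditional FMEA/FMECA is simply the product of the three scores; the failure modes and scores below are illustrative, not those of the V94.2 study.

    ```python
    # RPN = Severity x Occurrence x Detection, each scored 1-10.
    # Failure modes and scores are invented for illustration.
    failure_modes = {
        "diffuser liner weld crack": (7, 4, 5),
        "exhaust casing distortion": (5, 3, 6),
        "liner support wear":        (4, 6, 3),
    }

    rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
    ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
    for mode, value in ranked:
        print(f"{mode}: RPN = {value}")
    ```

    The ranked list drives the criticality matrix: modes with high severity and high occurrence land outside the safe zone regardless of detectability.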

  13. Cardiac status assessment with a multi-signal device for improved home-based congestive heart failure management.

    PubMed

    Muehlsteff, Jens; Carvalho, Paulo; Henriques, Jorge; Paiva, Rui P; Reiter, Harald

    2011-01-01

    State-of-the-art disease management for Congestive Heart Failure (CHF) patients is still based on easy-to-acquire measures such as heart rate (HR), weight, and blood pressure (BP). However, these measures respond late to changes in patient health status and provide limited information for personalizing and adapting medication therapy. This paper describes the concept we call "Cardiac Status Assessment," investigated within the European project "HeartCycle" toward next-generation home-based disease management of CHF. In this concept we analyze non-invasive surrogate measures of cardiovascular function, in particular systolic time intervals and pulse wave characteristics, to estimate Cardiac Output (CO) and Systemic Vascular Resistance (SVR), both established clinical measures. We discuss the underlying concept, the measurement system developed, and first results. PMID:22254450

  14. Personalized risk assessment of heart failure patients: more perspectives from transforming growth factor super-family members.

    PubMed

    Goletti, S; Gruson, D

    2015-03-30

    More personalized risk assessment of patients with heart failure (HF) is important for developing more tailored care and for better allocation of resources. The measurement of biomarkers is now part of the standards of care and is important for the sub-phenotyping of HF patients, demonstrating the activation of pathophysiological pathways engaged in the worsening of HF. Sub-phenotyping can therefore lead to a more personalized selection of treatment. Several members of the transforming growth factor β (TGF-β) super-family, such as myostatin, activin A, GDF-15 and GDF-11, are involved in cardiac remodeling, and the evaluation of their circulating levels might provide new insights into the course of the disease and help guide prognostication and therapeutic selection for HF patients. PMID:25260834

  15. The Hertzsprung-Russell Diagram.

    ERIC Educational Resources Information Center

    Woodrow, Janice

    1991-01-01

    Describes a classroom use of the Hertzsprung-Russell diagram to infer not only the properties of a star but also the star's probable stage in evolution, life span, and age of the cluster in which it is located. (ZWH)

  16. Atemporal diagrams for quantum circuits

    SciTech Connect

    Griffiths, Robert B.; Wu Shengjun; Yu Li; Cohen, Scott M.

    2006-05-15

    A system of diagrams is introduced that allows the representation of various elements of a quantum circuit, including measurements, in a form which makes no reference to time (hence 'atemporal'). It can be used to relate quantum dynamical properties to those of entangled states (map-state duality), and suggests useful analogies, such as the inverse of an entangled ket. Diagrams clarify the role of channel kets, transition operators, dynamical operators (matrices), and Kraus rank for noisy quantum channels. Positive (semidefinite) operators are represented by diagrams with a symmetry that aids in understanding their connection with completely positive maps. The diagrams are used to analyze standard teleportation and dense coding, and for a careful study of unambiguous (conclusive) teleportation. A simple diagrammatic argument shows that a Kraus rank of 3 is impossible for a one-qubit channel modeled using a one-qubit environment in a mixed state.

  17. Particles, Feynman Diagrams and All That

    ERIC Educational Resources Information Center

    Daniel, Michael

    2006-01-01

    Quantum fields are introduced in order to give students an accurate qualitative understanding of the origin of Feynman diagrams as representations of particle interactions. Elementary diagrams are combined to produce diagrams representing the main features of the Standard Model.

  18. An assessment of the state of the art in predicting the failure of ceramics: Final report

    SciTech Connect

    Boulet, J.A.M.

    1988-03-01

    The greatest weakness in existing design strategies for brittle fracture is in the narrow range of conditions for which the strategies are adequate. The primary reason for this weakness is the use of simplistic mechanical models of fracture processes and unverified statistical models of materials. To improve the design methodology, the models must first be improved. Specifically recommended research goals are: to develop models of cracks with realistic geometry under arbitrary stress states; to identify and model the most important relationships between fracture processes and microstructural features; to assess the technology available for acquiring statistical data on microstructure and flaw populations, and to establish the amount of data required for verification of statistical models; and to establish a computer-based fracture simulation that can incorporate a wide variety of mechanical and statistical models and crack geometries, as well as arbitrary stress states. 204 refs., 2 tabs.

  19. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy.

    PubMed

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-12-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events. PMID:25379678

  20. Bootstrapping & Separable Monte Carlo Simulation Methods Tailored for Efficient Assessment of Probability of Failure of Dynamic Systems

    NASA Astrophysics Data System (ADS)

    Jehan, Musarrat

    The response of a dynamic system is random: there is randomness in both the applied loads and the strength of the system. To account for this uncertainty, the safety of the system must be quantified using its probability of survival (reliability). Monte Carlo Simulation (MCS) is a widely used method for probabilistic analysis because of its robustness; however, its high computational cost limits the accuracy of reliability assessment. Haftka et al. [2010] developed an improved sampling technique for reliability assessment called separable Monte Carlo (SMC) that can significantly increase the accuracy of estimation without increasing the cost of sampling. However, that method was applied only to time-invariant problems involving two random variables. This dissertation extends SMC to random vibration problems with multiple random variables. The research also develops a novel method for estimating the standard deviation of the probability of failure of a structure under static or random vibration loading. The methods are demonstrated on quarter-car models and a wind turbine, and the proposed method is validated using repeated standard MCS.
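
    The idea behind separable Monte Carlo — reusing independent load and capacity samples in all combinations rather than in fixed pairs — can be sketched for a scalar load-vs-capacity problem; the distributions are arbitrary illustrations.

    ```python
    import bisect
    import random

    random.seed(0)
    N = 2000
    loads = [random.gauss(50.0, 10.0) for _ in range(N)]       # stress samples
    capacities = [random.gauss(80.0, 8.0) for _ in range(N)]   # strength samples

    # Standard ("crude") MCS: each load is paired with one capacity,
    # giving N pass/fail trials.
    p_crude = sum(l > c for l, c in zip(loads, capacities)) / N

    # Separable MC: because load and capacity are independent, every load
    # sample can be compared against every capacity sample, giving N*N
    # effective comparisons from the same 2N evaluations.
    cap_sorted = sorted(capacities)
    p_smc = sum(bisect.bisect_left(cap_sorted, l) for l in loads) / (N * N)
    print(p_crude, p_smc)
    ```

    The SMC estimate has much lower variance for the same sampling cost, which is the advantage the dissertation extends to random vibration problems.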

  1. Classification tree for risk assessment in patients suffering from congestive heart failure via long-term heart rate variability.

    PubMed

    Melillo, Paolo; De Luca, Nicola; Bracale, Marcello; Pecchia, Leandro

    2013-05-01

    This study aims to develop an automatic classifier for risk assessment in patients suffering from congestive heart failure (CHF). The proposed classifier separates lower risk patients from higher risk ones, using standard long-term heart rate variability (HRV) measures. Patients are labeled as lower or higher risk according to the New York Heart Association (NYHA) classification. A retrospective analysis on two public Holter databases was performed, analyzing the data of 12 patients suffering from mild CHF (NYHA I and II), labeled as lower risk, and 32 suffering from severe CHF (NYHA III and IV), labeled as higher risk. Only patients with a fraction of total heartbeat intervals (RR) classified as normal-to-normal (NN) intervals (NN/RR) higher than 80% were selected as eligible, in order to have satisfactory signal quality. Classification and regression tree (CART) analysis was employed to develop the classifiers. A total of 30 higher risk and 11 lower risk patients were included in the analysis. The proposed classification trees achieved a sensitivity of 93.3% and a specificity of 63.6% in identifying higher risk patients. Finally, the rules obtained by CART are comprehensible and consistent with the consensus shown by previous studies that depressed HRV is a useful tool for risk assessment in patients suffering from CHF. PMID:24592473
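
    A single CART-style split can be illustrated with a hand-rolled decision stump on one long-term HRV measure; the data and the 100 ms SDNN cut-point are invented for illustration and are not the paper's learned tree.

    ```python
    patients = [
        # (SDNN in ms, higher_risk label) -- invented data
        (62, True), (71, True), (80, True), (95, True), (55, True),
        (120, False), (135, False), (98, False), (142, False),
    ]

    def predict_higher_risk(sdnn, cut=100.0):
        """Depressed HRV (low SDNN) predicts higher risk."""
        return sdnn < cut

    tp = sum(1 for s, y in patients if y and predict_higher_risk(s))
    fn = sum(1 for s, y in patients if y and not predict_higher_risk(s))
    tn = sum(1 for s, y in patients if not y and not predict_higher_risk(s))
    fp = sum(1 for s, y in patients if not y and predict_higher_risk(s))

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(sensitivity, specificity)
    ```

    CART learns such cut-points (and further splits) automatically by minimizing node impurity, which is why its rules remain readable to clinicians.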

  2. Analysis of Japanese banks’ historical tree diagram

    NASA Astrophysics Data System (ADS)

    Ueno, Hiromichi; Mizuno, Takayuki; Takayasu, Misako

    2007-09-01

    Using historical data from the Japanese banks' database at "The Bankers Library" of the Japanese Bankers Association, we analyze the historical network of banks from 1868 to 2006. First, we represent each bank in each year as a particle and draw the space-time evolution of mergers, divisions, establishments, and failures as a tree diagram. We found that the tree basin size distributions of the real data and the simulation results fit well. Second, we analyze raw financial statement data of banks collected by the National Diet Library. We confirm that the distributions of deposit amounts are fat-tailed every year; however, small deviations are observed that relate to governmental policy.

  3. Assessment of formulation robustness for nano-crystalline suspensions using failure mode analysis or derisking approach.

    PubMed

    Nakach, Mostafa; Authelin, Jean-René; Voignier, Cecile; Tadros, Tharwat; Galet, Laurence; Chamayou, Alain

    2016-06-15

    The small particle size of nano-crystalline suspensions can be responsible for their physical instability during drug product preparation (downstream processing), storage and administration. The commercial formulation therefore needs to be sufficiently robust to various triggering conditions, such as ionic strength, shear rate, wetting/dispersing agent desorption by dilution, temperature and pH variation. In our previous work we described a systematic approach to select a suitable wetting/dispersing agent for the stabilization of nano-crystalline suspensions. In this paper, we describe the assessment of the robustness of a formulation stabilized with a mixture of sodium dodecyl sulfate (SDS) and polyvinylpyrrolidone (PVP) by measuring the rate of perikinetic (diffusion-controlled) and orthokinetic (shear-induced) aggregation as a function of ionic strength, temperature, pH and dilution. The results showed that, with the SDS/PVP system, the critical coagulation concentration is about five times higher than that reported in the literature for suspensions that are colloidally stable at high concentration. The nano-suspension was also found to be very stable at ambient temperature and at different pH conditions. A desorption test confirmed the high affinity between the API and the wetting/dispersing agent. However, the suspension undergoes aggregation at high temperature due to desorption of the wetting/dispersing agent and disaggregation of SDS micelles. Furthermore, aggregation occurs at very high shear rate (orthokinetic aggregation) by overcoming the energy barrier responsible for the colloidal stability of the system. PMID:27102992
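
    The perikinetic (diffusion-limited) baseline against which such stability is judged is the Smoluchowski rapid-coagulation rate; the sketch below uses textbook constants for water at 25 °C, not the paper's measured rates.

    ```python
    # Smoluchowski rapid-coagulation rate constant for equal-size spheres:
    # k_fast = 8 k_B T / (3 mu). A stability ratio W > 1 (from the
    # adsorbed wetting/dispersing agent's energy barrier) slows this to
    # k = k_fast / W. Constants are textbook values, not the paper's data.
    KB = 1.380649e-23      # Boltzmann constant, J/K
    T = 298.15             # K
    MU_WATER = 8.9e-4      # viscosity of water at 25 C, Pa*s

    k_fast = 8 * KB * T / (3 * MU_WATER)   # m^3/s, diffusion-limited rate

    def slow_rate(W):
        """Aggregation rate constant reduced by stability ratio W."""
        return k_fast / W

    print(f"k_fast = {k_fast:.3e} m^3/s")
    print(f"stabilized (W = 1e5): {slow_rate(1e5):.3e} m^3/s")
    ```

    A high critical coagulation concentration corresponds to a large W being maintained even as added electrolyte compresses the double layer.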

  4. Kidney Failure

    MedlinePlus


  5. 30 CFR 1218.41 - Assessments for failure to submit payment of same amount as Form ONRR-2014 or bill document or to...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... same amount as Form ONRR-2014 or bill document or to provide adequate information. 1218.41 Section 1218... General Provisions § 1218.41 Assessments for failure to submit payment of same amount as Form ONRR-2014 or... Form ONRR-2014, Form ONRR-4430, or a bill document, unless ONRR has authorized the difference in...

  6. 30 CFR 1218.41 - Assessments for failure to submit payment of same amount as Form MMS-2014 or bill document or to...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... same amount as Form MMS-2014 or bill document or to provide adequate information. 1218.41 Section 1218... Provisions § 1218.41 Assessments for failure to submit payment of same amount as Form MMS-2014 or bill... leases is not equivalent in amount to the total of individual line items on the associated Form...

  7. 30 CFR 218.41 - Assessments for failure to submit payment of same amount as Form MMS-2014 or bill document or to...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... same amount as Form MMS-2014 or bill document or to provide adequate information. 218.41 Section 218.41... Assessments for failure to submit payment of same amount as Form MMS-2014 or bill document or to provide... equivalent in amount to the total of individual line items on the associated Form MMS-2014, Form MMS-4430,...

  8. Pseudohaptic interaction with knot diagrams

    NASA Astrophysics Data System (ADS)

    Weng, Jianguang; Zhang, Hui

    2012-07-01

    To make progress in understanding knot theory, we need to interact with the projected representations of mathematical knots, which are continuous in three dimensions (3-D) but significantly interrupted in the projective images. One way to achieve such a goal is to design an interactive system that allows us to sketch two-dimensional (2-D) knot diagrams by taking advantage of a collision-sensing controller and explore their underlying smooth structures through a continuous motion. Recent advances in interaction techniques make progress in this direction possible. Pseudohaptics, which simulates haptic effects using pure visual feedback, can be used to develop such an interactive system. We outline one such pseudohaptic knot diagram interface. Our interface derives from the familiar pencil-and-paper process of drawing 2-D knot diagrams and provides haptic-like sensations to facilitate the creation and exploration of knot diagrams. A centerpiece of the interaction model simulates a physically reactive mouse cursor, which is exploited to resolve the apparent conflict between the continuous structure of the actual smooth knot and the visual discontinuities in the knot diagram representation. Another value in exploiting pseudohaptics is that an acceleration (or deceleration) of the mouse cursor (or surface locator) can be used to indicate the slope of the curve (or surface) whose projective image is being explored. By exploiting these additional visual cues, we proceed to a full-featured extension to a pseudohaptic four-dimensional (4-D) visualization system that simulates continuous navigation on 4-D objects and allows us to sense the bumps and holes in the fourth dimension. Preliminary tests of the software show that the main features of the interface overcome some expected perceptual limitations in our interaction with 2-D knot diagrams of 3-D knots and 3-D projective images of 4-D mathematical objects.

  9. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with higher reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed.
For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
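
    The linear combination described above (the expected loss given failure as the mode-specific expected losses weighted by the conditional probabilities with which each mode initiates failure) can be sketched in a few lines; the failure modes and monetary values below are hypothetical, not taken from the paper:

```python
def expected_loss_given_failure(modes):
    """Expected loss given failure for mutually exclusive failure modes:
    E[L | failure] = sum_k p_k * E[L_k], where p_k is the conditional
    probability that mode k initiates the failure (the p_k sum to 1).
    modes: list of (conditional_probability, expected_loss) pairs."""
    total_p = sum(p for p, _ in modes)
    if abs(total_p - 1.0) > 1e-9:
        raise ValueError("conditional probabilities must sum to 1")
    return sum(p * loss for p, loss in modes)

# Three hypothetical failure modes of a component and their expected losses:
modes = [(0.5, 10_000.0), (0.3, 40_000.0), (0.2, 150_000.0)]
print(round(expected_loss_given_failure(modes), 2))  # 47000.0
```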

  10. Sequential organ failure assessment scoring and prediction of patient's outcome in Intensive Care Unit of a tertiary care hospital

    PubMed Central

    Jain, Aditi; Palta, Sanjeev; Saroa, Richa; Palta, Anshu; Sama, Sonu; Gombar, Satinder

    2016-01-01

    Background and Aims: The objective was to determine the accuracy of the sequential organ failure assessment (SOFA) score in predicting the outcome of patients in the Intensive Care Unit (ICU). Material and Methods: Forty-four consecutive patients between 15 and 80 years admitted to the ICU over an 8-week period were studied prospectively. Three patients were excluded. The SOFA score was determined 24 h postadmission to the ICU and subsequently every 48 h for the first 10 days. Patients were followed till discharge/death/transfer from the ICU. The initial SOFA score and the highest and mean SOFA scores were calculated and correlated with mortality and duration of stay in the ICU. Results: The mortality rate was 39% and the mean duration of stay in the ICU was 9 days. The maximum score in survivors (3.92 ± 2.17) was significantly lower than in nonsurvivors (8.9 ± 3.45). The initial SOFA score had a strong statistical correlation with mortality. The cardiovascular score on days 1 and 3, the respiratory score on day 7, and the coagulation profile on day 3 correlated significantly with the outcome. Duration of stay did not correlate with survival (P = 0.461). Conclusion: The SOFA score is a simple but effective prognostic indicator and evaluator of patient progress in the ICU. Day 1 SOFA can triage patients into risk categories. For further management, the mean and maximum scores help determine the severity of illness and can act as a guide for the intensity of therapy required for each patient.
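
    The three score summaries used in the study (initial, highest, and mean SOFA over the serial measurements) are straightforward to compute; the serial scores below are hypothetical, not the trial's data:

```python
def sofa_summary(scores):
    """Summaries of serial SOFA scores (first taken 24 h post-admission,
    then every 48 h): initial, maximum, and mean score."""
    if not scores:
        raise ValueError("need at least one score")
    return {"initial": scores[0],
            "maximum": max(scores),
            "mean": sum(scores) / len(scores)}

# Hypothetical serial scores for one ICU patient:
print(sofa_summary([6, 8, 9, 7, 5]))  # {'initial': 6, 'maximum': 9, 'mean': 7.0}
```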

  11. Generalized discriminant analysis for congestive heart failure risk assessment based on long-term heart rate variability.

    PubMed

    Shahbazi, Fatemeh; Asl, Babak Mohammadzadeh

    2015-11-01

    The aims of this study are summarized in the following items: first, to investigate the class discrimination power of long-term heart rate variability (HRV) features for risk assessment in patients suffering from congestive heart failure (CHF); second, to introduce the most discriminative features of HRV for discriminating low-risk patients (LRPs) from high-risk patients (HRPs); and third, to examine the influence of feature dimension reduction on achieving the desired classification accuracy. We analyzed two public Holter databases: data from 12 patients suffering from mild CHF (NYHA class I and II), labeled as LRPs, and data from 32 patients suffering from severe CHF (NYHA class III and IV), labeled as HRPs. A K-nearest neighbor classifier was used to evaluate the performance of the feature set in the classification. Moreover, to reduce the number of features as well as the overlap of the samples of the two classes in feature space, we used generalized discriminant analysis (GDA) as a feature extraction method. By applying GDA to the discriminative nonlinear features, we achieved sensitivity and specificity of 100% with the fewest features. Finally, the results were compared with other similar studies regarding the performance of the feature selection procedure and the classifier, as well as the number of features used in training. PMID:26344584
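
    A minimal sketch of the K-nearest-neighbour vote used for LRP/HRP classification, assuming two already-reduced features per patient; the GDA feature-extraction step is omitted and all feature values below are hypothetical:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Majority vote among the k nearest neighbours (Euclidean distance).
    train: list of (feature_vector, label) pairs; query: feature vector."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D feature vectors (e.g. two HRV indices after reduction):
train = [((0.1, 0.2), "LRP"), ((0.2, 0.1), "LRP"), ((0.9, 0.8), "HRP"),
         ((0.8, 0.9), "HRP"), ((0.85, 0.75), "HRP")]
print(knn_predict(train, (0.15, 0.15)))  # LRP
```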

  12. Failure of Passive Immune Transfer in Calves: A Meta-Analysis on the Consequences and Assessment of the Economic Impact.

    PubMed

    Raboisson, Didier; Trillat, Pauline; Cahuzac, Clélia

    2016-01-01

    Low colostrum intake at birth results in the failure of passive transfer (FPT) due to the inadequate ingestion of colostral immunoglobulins (Ig). FPT is associated with an increased risk of mortality and decreased health and longevity. Despite the known management practices associated with low FPT, it remains an important issue in the field. Neither a quantitative analysis of FPT consequences nor an assessment of its total cost are available. To address this point, a meta-analysis on the adjusted associations between FPT and its outcomes was first performed. Then, the total costs of FPT in European systems were calculated using a stochastic method with adjusted values as the input parameters. The adjusted risks (and 95% confidence intervals) for mortality, bovine respiratory disease, diarrhoea and overall morbidity in the case of FPT were 2.12 (1.43-3.13), 1.75 (1.50-2.03), 1.51 (1.05-2.17) and 1.91 (1.63-2.24), respectively. The mean (and 95% prediction interval) total costs per calf with FPT were estimated to be €60 (€10-109) and €80 (€20-139) for dairy and beef, respectively. As a result of the double-step stochastic method, the proposed economic estimation constitutes the first estimate available for FPT. The results are presented in a way that facilitates their use in the field and, with limited effort, combines the cost of each contributor to increase the applicability of the economic assessment to the situations farm-advisors may face. The present economic estimates are also an important tool to evaluate the profitability of measures that aim to improve colostrum intake and FPT prevention. PMID:26986832
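
    A meta-analysis of adjusted risks like the one above typically pools log risk ratios by inverse-variance weighting. A fixed-effect sketch, with hypothetical per-study values rather than the paper's actual inputs:

```python
import math

def pooled_risk_ratio(rr_ci_list):
    """Fixed-effect inverse-variance pooling of risk ratios.
    rr_ci_list: list of (RR, lower 95% CI, upper 95% CI) triples.
    The SE of each log RR is recovered from the CI width; weights are 1/SE^2."""
    num = den = 0.0
    for rr, lo, hi in rr_ci_list:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        num += w * math.log(rr)
        den += w
    return math.exp(num / den)

# Hypothetical per-study adjusted mortality risk ratios with 95% CIs:
studies = [(2.0, 1.2, 3.3), (2.3, 1.4, 3.8), (1.9, 1.1, 3.2)]
print(round(pooled_risk_ratio(studies), 2))
```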

  14. Voronoi Diagrams and Spring Rain

    ERIC Educational Resources Information Center

    Perham, Arnold E.; Perham, Faustine L.

    2011-01-01

    The goal of this geometry project is to use Voronoi diagrams, a powerful modeling tool across disciplines, and the integration of technology to analyze spring rainfall from rain gauge data over a region. In their investigation, students use familiar equipment from their mathematical toolbox: triangles and other polygons, circumcenters and…
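
    The standard Voronoi treatment of rain gauge data is Thiessen weighting: every point of the region inherits the rainfall of its nearest gauge, so each gauge's contribution is weighted by the area of its Voronoi cell. A discrete sketch with hypothetical gauges and a coarse sample grid:

```python
import math

def voronoi_rainfall_estimate(gauges, region_points):
    """Discrete Thiessen-polygon estimate: each sample point in the region
    is assigned the rainfall of its nearest gauge (its Voronoi cell owner),
    and the regional mean is the average over the sample grid.
    gauges: list of ((x, y), rainfall_mm); region_points: list of (x, y)."""
    total = 0.0
    for p in region_points:
        _, rain = min(gauges, key=lambda g: math.dist(g[0], p))
        total += rain
    return total / len(region_points)

# Two hypothetical gauges and a 3x3 sample grid over the unit square:
gauges = [((0.0, 0.0), 12.0), ((1.0, 1.0), 30.0)]
grid = [(x / 2, y / 2) for x in range(3) for y in range(3)]
print(voronoi_rainfall_estimate(gauges, grid))  # 18.0
```

    A finer grid converges to the exact area-weighted (Thiessen) mean.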

  15. Assessment of the risk of failure of high voltage substations due to environmental conditions and pollution on insulators.

    PubMed

    Castillo Sierra, Rafael; Oviedo-Trespalacios, Oscar; Candelo, John E; Soto, Jose D

    2015-07-01

    Pollution on electrical insulators is one of the greatest causes of failure of substations subjected to high levels of salinity and environmental pollution. Considering leakage current as the main indicator of pollution on insulators, this paper focuses on establishing the effect of environmental conditions on the risk of failure due to pollution on insulators and on determining the significant change in the magnitude of the pollution on the insulators during dry and humid periods. Hierarchical segmentation analysis was used to establish the effect of environmental conditions on the risk of failure due to pollution on insulators. The Kruskal-Wallis test was utilized to determine the significant changes in the magnitude of the pollution due to climate periods. An important result was the discovery that leakage current was more common on insulators during dry periods than during humid ones. There was also a higher risk of failure due to pollution during dry periods. During the humid period, various temperatures and wind directions produced a small change in the risk of failure. As a technical result, operators of electrical substations can now identify the cause of an increase in the risk of failure due to pollution in the area. The research contributes to understanding the behaviour of the leakage current under conditions similar to those of the Colombian Caribbean coast and how they affect the risk of failure of the substation due to pollution. PMID:25634366
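
    The Kruskal-Wallis test used above to compare pollution levels across climate periods can be sketched in a few lines; the leakage-current readings below are hypothetical, not the study's data:

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic for k independent samples.
    Ties receive their average rank; no tie correction is applied."""
    pooled = sorted(v for g in groups for v in g)
    n = len(pooled)
    # Average 1-based rank of each distinct value (mean of ranks i+1 .. j).
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2
        i = j
    s = sum(sum(ranks[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * s - 3 * (n + 1)

# Hypothetical leakage-current readings in three climate periods:
dry = [8.2, 9.1, 7.5, 8.8]
transition = [6.0, 6.9, 7.2]
humid = [4.1, 5.0, 4.7, 5.5]
print(round(kruskal_wallis_h([dry, transition, humid]), 2))  # 8.91
```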

  16. Comparison of linear-elastic-plastic, elastic-plastic, and fully plastic failure models in the assessment of piping integrity

    SciTech Connect

    Streit, R.D.

    1981-01-01

    A double-ended guillotine break in the primary coolant loop of a pressurized water reactor (PWR) is a postulated loss-of-coolant accident which can result in extreme dynamic loads (i.e., the asymmetric blowdown load) on the reactor pressure vessel (RPV) and vessel internals. Design and construction of the RPV and support systems to withstand these extreme dynamic loads are very difficult. Similar high loading would also be experienced in a boiling water reactor given a similar accident. Although such a break would be an extremely rare event, its obvious safety and design implications demand that it be carefully evaluated. The work discussed here is part of the Load Combinations Program at Lawrence Livermore National Laboratory to estimate the probability of a double-ended guillotine break in the primary reactor coolant loop of a selected PWR. The program employs a fracture mechanics based fatigue model to propagate cracks from an initial flaw distribution. It was found that while most of the large cracks grew into leaks, a complete (or nearly complete) circumferential crack could lead to a double-ended pipe break without prior leaking and thus without warning. It is important to assess under what loads such a crack will result in complete pipe severance. The loads considered in this evaluation result from pressure, dead weight and seismic stresses. For the PWR hot leg considered in this investigation the internal pressure contributes the most to the load-controlled stresses (i.e., stresses which can cause piping failure) and thus the problem is treated as axisymmetric with uniform axial loading.

  17. Relation of Longitudinal Changes in Quality of Life Assessments to Changes in Functional Capacity in Patients With Heart Failure With and Without Anemia.

    PubMed

    Cooper, Trond J; Anker, Stefan D; Comin-Colet, Josep; Filippatos, Gerasimos; Lainscak, Mitja; Lüscher, Thomas F; Mori, Claudio; Johnson, Patrick; Ponikowski, Piotr; Dickstein, Kenneth

    2016-05-01

    Clinical status in heart failure is conventionally assessed by the physician's evaluation, patients' own perception of their symptoms, quality of life (QoL) tools, and a measure of functional capacity. These aspects can be measured with tools such as the New York Heart Association functional class, QoL tools such as the EuroQoL-5 Dimension, the Kansas City Cardiomyopathy Questionnaire, and patient global assessment (PGA), and by the 6-minute walk test (6MWT), respectively. The Ferric Carboxymaltose in Patients with Heart Failure and Iron Deficiency (FAIR-HF) trial demonstrated that treatment with intravenous ferric carboxymaltose in iron-deficient patients with symptomatic heart failure with reduced left ventricular function significantly improved all 5 outcome measures. This analysis assessed the correlations between the longitudinal changes in the measures of clinical status, as measured by QoL tools, and the changes in the measures of functional capacity, as measured by the 6MWT. This analysis used the database from the FAIR-HF trial, which randomized 459 patients with chronic heart failure (reduced left ventricular ejection fraction) and iron deficiency, with or without anemia, to ferric carboxymaltose or placebo. The degree of correlation between QoL tools and the 6MWT was assessed at 4, 12, and 24 weeks. The data demonstrate highly significant correlations between QoL and functional capacity, as measured by the 6MWT, at all time points (p <0.001). Changes in PGA, the Kansas City Cardiomyopathy Questionnaire, and the EuroQoL-5D correlated increasingly over time with changes in 6MWT performance. Interestingly, the strongest correlation at 24 weeks is for the PGA, which is a simple numerical scale (r = -0.57, p <0.001). This analysis provides evidence that QoL assessments show a significant correlation with functional capacity, as measured by the 6MWT. The strength of these correlations increased over time. PMID:27015889

  18. Spectral Determinants on Mandelstam Diagrams

    NASA Astrophysics Data System (ADS)

    Hillairet, Luc; Kalvin, Victor; Kokotov, Alexey

    2016-04-01

    We study the regularized determinant of the Laplacian as a functional on the space of Mandelstam diagrams (noncompact translation surfaces glued from finite and semi-infinite cylinders). A Mandelstam diagram can be considered as a compact Riemann surface equipped with a conformal flat singular metric {|ω|^2}, where {ω} is a meromorphic one-form with simple poles such that all its periods are pure imaginary and all its residues are real. The main result is an explicit formula for the determinant of the Laplacian in terms of the basic objects on the underlying Riemann surface (the prime form, theta-functions, the canonical meromorphic bidifferential) and the divisor of the meromorphic form {ω}. As an important intermediate result we prove a decomposition formula of the type of Burghelea-Friedlander-Kappeler for the determinant of the Laplacian for flat surfaces with cylindrical ends and conical singularities.

  19. Hero's journey in bifurcation diagram

    NASA Astrophysics Data System (ADS)

    Monteiro, L. H. A.; Mustaro, P. N.

    2012-06-01

    The hero's journey is a narrative structure identified by several authors in comparative studies on folklore and mythology. This storytelling template presents the stages of inner metamorphosis undergone by the protagonist after being called to an adventure. In a simplified version, this journey is divided into three acts separated by two crucial moments. Here we propose a discrete-time dynamical system for representing the protagonist's evolution. The suffering along the journey is taken as the control parameter of this system. The bifurcation diagram exhibits stationary, periodic and chaotic behaviors. In this diagram, there are transitions from fixed point to chaos and from limit cycle to fixed point. We found that the values of the control parameter corresponding to these two transitions are in quantitative agreement with the two critical moments of the three-act hero's journey identified in 10 movies appearing in the list of the 200 worldwide highest-grossing films.
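
    The paper's dynamical system is not reproduced here, so as a hypothetical stand-in the logistic map x_{n+1} = r*x_n*(1-x_n) can illustrate how a bifurcation diagram exhibits the stationary, periodic, and chaotic regimes described, as the control parameter r varies:

```python
def attractor(r, x0=0.5, transient=500, keep=64):
    """Iterate the logistic map past a transient, then return the set of
    distinct attractor values (rounded to 6 decimals)."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 6))
    return sorted(set(orbit))

print(attractor(2.5))            # fixed point at 0.6 (stationary regime)
print(len(attractor(3.2)))       # 2 (period-2 limit cycle)
print(len(attractor(3.9)) > 10)  # True (chaotic band)
```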

  20. Causal diagrams in systems epidemiology

    PubMed Central

    2012-01-01

    Methods of diagrammatic modelling have been greatly developed in the past two decades. Outside the context of infectious diseases, systematic use of diagrams in epidemiology has been mainly confined to the analysis of a single link: that between a disease outcome and its proximal determinant(s). Transmitted causes ("causes of causes") tend not to be systematically analysed. The infectious disease epidemiology modelling tradition models the human population in its environment, typically with the exposure-health relationship and the determinants of exposure being considered at individual and group/ecological levels, respectively. Some properties of the resulting systems are quite general, and are seen in unrelated contexts such as biochemical pathways. Confining analysis to a single link misses the opportunity to discover such properties. The structure of a causal diagram is derived from knowledge about how the world works, as well as from statistical evidence. A single diagram can be used to characterise a whole research area, not just a single analysis - although this depends on the degree of consistency of the causal relationships between different populations - and can therefore be used to integrate multiple datasets. Additional advantages of system-wide models include: the use of instrumental variables - now emerging as an important technique in epidemiology in the context of mendelian randomisation, but under-used in the exploitation of "natural experiments"; the explicit use of change models, which have advantages with respect to inferring causation; and in the detection and elucidation of feedback. PMID:22429606

  1. Looking inside the butterfly diagram

    NASA Astrophysics Data System (ADS)

    Ternullo, M.

    2007-12-01

    The suitability of Maunder's butterfly diagram to give a realistic picture of the large-scale distribution of photospheric magnetic flux is discussed. The evolution of the sunspot zone in cycles 20 through 23 is described. To reduce the noise which covers any structure in the diagram, a smoothing algorithm has been applied to the sunspot data. This operation has eliminated any short-period fluctuation and given visibility to long-duration phenomena. One of these phenomena is the fact that the equatorward drift of the spot zone center of mass results from the alternation of several prograde (namely, equatorward) segments with other stationary or poleward segments. The long duration of the stationary/retrograde phases as well as the similarities among the spot zone alternating paths in the cycles under examination prevent us from considering these features as meaningless fluctuations, randomly superimposed on the continuous equatorward migration. On the contrary, these features should be considered physically meaningful phenomena, requiring adequate explanations. Moreover, even the smoothed spotted area markedly oscillates. The compared examination of area and spot zone evolution allows us to infer details about the spotted area distribution inside the butterfly diagram. Links between the changing structure of the spot zone and the tachocline rotation rate oscillations are proposed.

  2. Failure of platelet parameters and biomarkers to correlate platelet function to severity and etiology of heart failure in patients enrolled in the EPCOT trial. With special reference to the Hemodyne hemostatic analyzer. Whole Blood Impedance Aggregometry for the Assessment of Platelet Function in Patients with Congestive Heart Failure.

    PubMed

    Serebruany, Victor L; McKenzie, Marcus E; Meister, Andrew F; Fuzaylov, Sergey Y; Gurbel, Paul A; Atar, Dan; Gattis, Wendy A; O'Connor, Christopher M

    2002-01-01

    Data from small studies have suggested the presence of platelet abnormalities in patients with congestive heart failure (CHF). We sought to characterize the diagnostic utility of different platelet parameters and platelet-endothelial biomarkers in a random outpatient CHF population investigated in the EPCOT ('Whole Blood Impedance Aggregometry for the Assessment of Platelet Function in Patients with Congestive Heart Failure') Trial. Blood samples were obtained for measurement of platelet contractile force (PCF), whole blood aggregation, shear-induced closure time, expression of glycoprotein (GP) IIb/IIIa, and P-selectin in 100 consecutive patients with CHF. Substantial interindividual variability of platelet characteristics exists in patients with CHF. There were no statistically significant differences when patients were grouped according to incidence of vascular events, emergency revascularization needs, survival, or etiology of heart failure. Aspirin use did not affect instrument readings either. PCF correlates very poorly with whole blood aggregometry (r(2) = 0.023), closure time (r(2) = 0.028), platelet GP IIb/IIIa (r(2) = 0.0028), and P-selectin (r(2) = 0.002) expression. Furthermore, there was no correlation with brain natriuretic peptide concentrations, a marker of severity and prognosis in heart failure reflecting the neurohumoral status. Patients with heart failure enrolled in the EPCOT Trial exhibited a marginal, sometimes oppositely directed change in platelet function, challenging the diagnostic utility of these platelet parameters and biomarkers to serve as useful tools for the identification of platelet abnormalities, for predicting clinical outcomes, or for monitoring antiplatelet strategies in this population. The usefulness of these measurements for assessing platelets in the different clinical settings remains to be explored. 
Taken together, opposite to our expectations, major clinical characteristics of heart failure did not correlate well with

  3. Twistor Diagrams and Quantum Field Theory.

    NASA Astrophysics Data System (ADS)

    O'Donald, Lewis

    Available from UMI in association with The British Library. Requires signed TDF. This thesis uses twistor diagram theory, as developed by Penrose (1975) and Hodges (1990c), to try to approach some of the difficulties inherent in the standard quantum field theoretic description of particle interactions. The resolution of these issues is the eventual goal of the twistor diagram program. First, twistor diagram theory is introduced from a physical viewpoint, with the aim of studying larger diagrams than have been typically explored. Methods are evolved to tackle the double box and triple box diagrams. These lead to three methods of constructing an amplitude for the double box, and two ways for the triple box. Next this theory is applied to translate the channels of a Yukawa Feynman diagram, which has more than four external states, into various twistor diagrams. This provides a test of the skeleton hypothesis (of Hodges, 1990c) in these cases, and also shows that conformal breaking must enter into twistor diagrams before the translation of loop-level Feynman diagrams. The issue of divergent Feynman diagrams is then considered. By using a twistor equivalent of the sum-over-states idea of quantum field theory, twistor translations of loop diagrams are conjectured. The various massless propagator corrections and vacuum diagrams calculated give results consistent with Feynman theory. Two diagrams are also found that give agreement with the finite parts of the Feynman "fish" diagrams of phi^4-theory. However it is found that a more rigorous translation for the time-like fish requires new boundaries to be added to the twistor sum-over-states. The twistor diagram obtained is found to give the finite part of the relevant Feynman diagram.

  4. Kidney Failure

    MedlinePlus

    ... enough red blood cells. This is called kidney failure. If your kidneys fail, you need treatment to ... providers, family, and friends, most people with kidney failure can lead full and active lives. NIH: National ...

  5. Respiratory Failure

    MedlinePlus

    Respiratory failure happens when not enough oxygen passes from your lungs into your blood. Your body's organs, such ... brain, need oxygen-rich blood to work well. Respiratory failure also can happen if your lungs can't ...

  6. Assessment of the magnitude and associated factors of immunological failure among adult and adolescent HIV-infected patients in St. Luke and Tulubolo Hospital, Oromia Region, Ethiopia

    PubMed Central

    Bayou, Bekelech; Sisay, Abay; Kumie, Abera

    2015-01-01

    Introduction The use of antiretroviral therapy (ART) has become a standard of care for the treatment of HIV infection. However, cost and resistance to ART are major obstacles to access to treatment, especially in resource-limited settings. In this study, we aimed to assess the magnitude and associated factors of immunological failure among adult and adolescent HIV-infected patients (aged ≥15 years) on highly active antiretroviral therapy (HAART) in St. Luke and Tulu Bolo Hospitals, Oromia Region, Ethiopia. Methods A retrospective follow-up study was conducted among HIV-infected patients who initiated first-line ART at St. Luke and Tulu Bolo Hospitals, South West Shoa Zone, Oromia, Ethiopia. Results A total of 828 patient charts were reviewed. 477 (57.6%) were female and the median age was 32 years. The median baseline CD4 count was 148 cells/mm3. The most commonly prescribed ART was TDF based (36.7%). Of the 828 patient charts reviewed, 56 (6.8%) showed immunological failure, of whom only 20 (2.4%) were detected and put on a second-line regimen. The incidence of immunological failure was 1.8 cases per 100 person-years of follow-up. Patients who had not disclosed their HIV status to anyone had a higher risk of immunological failure compared with patients who had disclosed their HIV status (AHR, 0.429; 95% CI 0.206 - 0.893; P-value=0.024). Conclusion Non-disclosure of HIV status and ambulatory baseline functional status were found to be predictors of immunological failure. Most of the immunological failure cases were not detected early and not switched to a second-line ARV regimen. So patients with the above risk factors should be considered for a timely switch to second-line HAART. PMID:26587140
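
    The reported incidence of 1.8 cases per 100 person-years follows directly from the event count and the total follow-up time. The person-years figure below is back-calculated from the abstract's numbers, not reported directly:

```python
def incidence_per_100py(events, person_years):
    """Incidence rate expressed as events per 100 person-years of follow-up."""
    return 100.0 * events / person_years

# 56 failures at ~1.8 per 100 person-years implies roughly 3100 person-years
# of follow-up (a back-calculated, approximate figure):
print(round(incidence_per_100py(56, 3100), 1))  # 1.8
```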

  7. Differential Effectiveness of Two Science Diagram Types.

    ERIC Educational Resources Information Center

    Holliday, William G.

    Reported is an Aptitude Treatment Instruction (ATI) study designed to evaluate the aptitude of verbal comprehension in terms of two unitary complex science diagram types: a single complex block word diagram and a single complex picture word diagram. ATI theory and research indicate that different effective instructional treatments tend to help…

  8. Arrows in Comprehending and Producing Mechanical Diagrams

    ERIC Educational Resources Information Center

    Heiser, Julie; Tversky, Barbara

    2006-01-01

    Mechanical systems have structural organizations--parts, and their relations--and functional organizations--temporal, dynamic, and causal processes--which can be explained using text or diagrams. Two experiments illustrate the role of arrows in diagrams of mechanical systems. In Experiment 1, people described diagrams with or without arrows,…

  9. Understanding machines from text and diagrams

    NASA Astrophysics Data System (ADS)

    Hegarty, Mary; Just, Marcel A.

    1987-12-01

    Instructional materials typically use both text and diagrams to explain how machines work. In this paper we give an account of what information is involved in understanding a mechanical device and the role that diagrams might play in communicating this information. We propose a model of how people read a text and inspect an accompanying diagram which states that people inspect diagrams for three reasons: (1) to form a representation of information read in the text, (2) to reactivate information that has already been represented, and (3) to encode information that is absent from the text. Using data from subjects' eye fixations while they read a text and inspected an accompanying diagram, we find that low-ability subjects need to inspect diagrams more often than high-ability subjects. The data also suggest that knowledge of what is relevant in a diagram might be a prerequisite for encoding new information from a diagram.

  10. Diagram, a Learning Environment for Initiation to Object-Oriented Modeling with UML Class Diagrams

    ERIC Educational Resources Information Center

    Py, Dominique; Auxepaules, Ludovic; Alonso, Mathilde

    2013-01-01

    This paper presents Diagram, a learning environment for object-oriented modelling (OOM) with UML class diagrams. Diagram is an open environment, in which the teacher can add new exercises without constraints on the vocabulary or the size of the diagram. The interface includes methodological help, encourages self-correcting and self-monitoring, and…

  11. An assessment of BWR (boiling water reactor) Mark III containment challenges, failure modes, and potential improvements in performance

    SciTech Connect

    Schroeder, J.A.; Pafford, D.J.; Kelly, D.L.; Jones, K.R.; Dallman, F.J.

    1991-01-01

    This report describes risk-significant challenges posed to Mark III containment systems by severe accidents as identified for Grand Gulf. Design similarities and differences between the Mark III plants that are important to containment performance are summarized. The accident sequences responsible for the challenges and the postulated containment failure modes associated with each challenge are identified and described. Improvements are discussed that have the potential either to prevent or delay containment failure, or to mitigate the offsite consequences of a fission product release. For each of these potential improvements, a qualitative analysis is provided. A limited quantitative risk analysis is provided for selected potential improvements. 21 refs., 5 figs., 46 tabs.

  12. Optical generation of Voronoi diagram.

    PubMed

    Giavazzi, F; Cerbino, R; Mazzoni, S; Giglio, M; Vailati, A

    2008-03-31

    We present results of experiments of diffraction by an amplitude screen, made of randomly distributed circular holes. By careful selection of the experimental parameters we obtain an intensity pattern strongly connected to the Voronoi diagram (VD) generated by the centers of the apertures. With the help of simulations we give a description of the observed phenomenon and elucidate the optimal parameters for its observation. Finally, we also suggest how it can be used for a fast, all-optical generation of VDs. PMID:18542580
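
    The Voronoi diagram (VD) recovered optically here can also be sketched numerically: each point of the field belongs to the cell of its nearest aperture center. Below is a minimal pure-Python sketch (the grid size, number of sites, and random seed are arbitrary choices for illustration, not parameters from the paper):

```python
import random

def voronoi_labels(sites, width, height):
    """Label each integer grid point with the index of its nearest site:
    a brute-force discrete Voronoi diagram."""
    labels = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            labels[y][x] = min(
                range(len(sites)),
                key=lambda i: (x - sites[i][0]) ** 2 + (y - sites[i][1]) ** 2,
            )
    return labels

random.seed(0)  # reproducible hypothetical "aperture centers"
sites = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(5)]
labels = voronoi_labels(sites, 20, 20)
# Each Voronoi cell is the connected set of grid points sharing one label.
```

For large grids one would normally use a library implementation (e.g. a k-d tree or Fortune's algorithm) rather than this O(grid × sites) scan.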

  13. A Hubble Diagram for Quasars

    NASA Astrophysics Data System (ADS)

    Risaliti, Guido; Lusso, Elisabetta

    2015-09-01

    We present a new method to test the cosmological model at high z, and measure the cosmological parameters, based on the non-linear correlation between UV and X-ray luminosity in quasars. While the method can be successfully tested with the data available today, a deep X-ray survey matching the future LSST and Euclid quasar catalogs is needed to achieve a high precision. Athena could provide a Hubble diagram for quasars analogous to that available today for supernovae, but extending up to z > 6.

  14. Cell flipping in permutation diagrams

    NASA Astrophysics Data System (ADS)

    Golumbic, Martin Charles; Kaplan, Haim

    Permutation diagrams have been used in circuit design to model a set of single point nets crossing a channel, where the minimum number of layers needed to realize the diagram equals the clique number ω(G) of its permutation graph, the value of which can be calculated in O(n log n) time. We consider a generalization of this model motivated by "standard cell" technology in which the numbers on each side of the channel are partitioned into consecutive subsequences, or cells, each of which can be left unchanged or flipped (i.e., reversed). We ask, for what choice of flippings will the resulting clique number be minimum or maximum. We show that when one side of the channel is fixed (no flipping), an optimal flipping for the other side can be found in O(n log n) time for the maximum clique number. We prove that the general problem is NP-complete for the minimum clique number and O(n^2) for the maximum clique number. Moreover, since the complement of a permutation graph is also a permutation graph, the same complexity results hold for the independence number.
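
    The ω(G) computation mentioned above can be sketched directly: for a permutation graph, the clique number equals the length of the longest strictly decreasing subsequence of the permutation (a set of pairwise-crossing nets), which patience sorting on the negated sequence finds in O(n log n). A minimal sketch; the example permutations are illustrative, not from the paper:

```python
from bisect import bisect_right

def clique_number(perm):
    """Clique number omega(G) of the permutation graph of `perm`:
    the length of the longest strictly decreasing subsequence,
    computed as an O(n log n) longest-increasing-subsequence run
    on the negated values."""
    tails = []  # tails[k] = smallest tail of an increasing subsequence of length k + 1
    for v in (-x for x in perm):
        i = bisect_right(tails, v)
        if i == len(tails):
            tails.append(v)   # extend the longest subsequence found so far
        else:
            tails[i] = v      # keep tails minimal to allow later extensions
    return len(tails)

print(clique_number([3, 1, 4, 2]))  # e.g. the decreasing pair 4, 2 -> 2
```

Two nets i < j cross exactly when perm[i] > perm[j], so a mutually crossing set is a decreasing subsequence; that is why the LIS machinery applies.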

  15. Phase Diagrams of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Caplan, Matthew; Horowitz, Chuck; Berry, Don; da Silva Schneider, Andre

    2016-03-01

    In the inner crust of neutron stars, where matter is near the saturation density, protons and neutrons arrange themselves into complex structures called nuclear pasta. Early theoretical work predicted a simple graduated hierarchy of pasta phases, consisting of spheres, cylinders, slabs, and uniform matter with voids. Previous work has simulated these phases with a simple classical model and has shown that the formation of these structures is dependent on the temperature, density, and proton fraction. However, previous work only studied a limited range of these parameters due to computational limitations. Thanks to recent advances in computing it is now possible to survey the structure of nuclear pasta for a larger range of parameters. By simulating nuclear pasta with constant temperature and proton fraction in an expanding simulation volume we are able to study the phase transitions in nuclear pasta, and thus produce a set of phase diagrams. We report on these phase diagrams as well as newly identified phases of nuclear pasta and discuss their implications for neutron star observables.

  16. The Importance of Design in Learning from Node-Link Diagrams

    ERIC Educational Resources Information Center

    van Amelsvoort, Marije; van der Meij, Jan; Anjewierden, Anjo; van der Meij, Hans

    2013-01-01

    Diagrams organize by location. They give spatial cues for finding and recognizing information and for making inferences. In education, diagrams are often used to help students understand and recall information. This study assessed the influence of perceptual cues on reading behavior and subsequent retention. Eighty-two participants were assigned…

  17. Heber Geothermal Binary Demonstration Project. Final design availability assessment. Revision 1

    SciTech Connect

    Mulvihill, R.J.; Reny, D.A.; Geumlek, J.M.; Purohit, G.P.

    1983-02-01

    An availability assessment of the principal systems of the Heber Geothermal Power Plant has been carried out based on the final issue of the process descriptions, process flow diagrams, and the approved-for-design piping and instrumentation diagrams (P&IDs) prepared by Fluor Power Services, Inc. (FPS). The principal systems are those which contribute most to plant unavailability. The plant equivalent availability, considering forced and deferred corrective maintenance outages, was computed using a 91-state Markov model to represent the 29 principal system failure configurations and their significant combinations. The failure configurations and associated failure and repair rates were defined from system/subsystem availability assessments conducted using the EPRI GO methodology and availability block diagram models. The availability and unavailability ranking of the systems and major equipment is presented.
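
    The report's 91-state Markov model is far richer than anything reproduced here, but the steady-state availability arithmetic that underlies such models can be sketched for the simplest case: a repairable component with constant failure rate λ and repair rate μ has availability A = μ/(λ + μ), and a series system is up only when every component is up. All rates below are hypothetical:

```python
def steady_state_availability(failure_rate, repair_rate):
    """Steady-state availability of a two-state (up/down) Markov
    component: A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

def series_availability(components):
    """A series system is available only when all components are;
    steady-state availabilities multiply."""
    a = 1.0
    for lam, mu in components:
        a *= steady_state_availability(lam, mu)
    return a

# Hypothetical rates per hour: MTBF = 1000 h, MTTR = 10 h, three components.
print(series_availability([(1 / 1000, 1 / 10)] * 3))
```

Models like the one in the report extend this by enumerating combined failure configurations as explicit Markov states rather than assuming independence.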

  18. Hubble's diagram and cosmic expansion

    PubMed Central

    Kirshner, Robert P.

    2004-01-01

    Edwin Hubble's classic article on the expanding universe appeared in PNAS in 1929 [Hubble, E. P. (1929) Proc. Natl. Acad. Sci. USA 15, 168–173]. The chief result, that a galaxy's distance is proportional to its redshift, is so well known and so deeply embedded into the language of astronomy through the Hubble diagram, the Hubble constant, Hubble's Law, and the Hubble time, that the article itself is rarely referenced. Even though Hubble's distances have a large systematic error, Hubble's velocities come chiefly from Vesto Melvin Slipher, and the interpretation in terms of the de Sitter effect is out of the mainstream of modern cosmology, this article opened the way to investigation of the expanding, evolving, and accelerating universe that engages today's burgeoning field of cosmology. PMID:14695886

  19. Phase diagram of ammonium nitrate

    NASA Astrophysics Data System (ADS)

    Dunuwille, M.; Yoo, C. S.

    2014-05-01

    Ammonium Nitrate (AN) has often been subjected to use in improvised explosive devices, due to its wide availability as a fertilizer and its capability of becoming explosive with slight additions of organic and inorganic compounds. Yet, the origin of enhanced energetic properties of impure AN (or AN mixtures) is neither chemically unique nor well understood, resulting in rather catastrophic disasters in the past [1] and thereby a significant burden on safety in using ammonium nitrates even today. To remedy this situation, we have carried out an extensive study to investigate the phase stability of AN at high pressure and temperature, using diamond anvil cells and micro-Raman spectroscopy. The present results confirm the recently proposed phase IV-to-IV' transition above 17 GPa [2] and provide new constraints for the melting and phase diagram of AN to 40 GPa and 400 °C.

  20. Application of Failure Mode and Effect Analysis (FMEA) and cause and effect analysis in conjunction with ISO 22000 to a snails (Helix aspersa) processing plant; A case study.

    PubMed

    Arvanitoyannis, Ioannis S; Varzakas, Theodoros H

    2009-08-01

    Failure Mode and Effect Analysis (FMEA) has been applied for the risk assessment of snail manufacturing. A tentative approach of FMEA application to the snails industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (snails processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO 22000 analysis with HACCP is carried out over snails processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells and poisonous mushrooms were the processes identified as the ones with the highest RPN (280, 240, 147, 144, respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a snails processing industry is considered imperative. PMID:19582641
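
    The RPN quantification described above is simply the product of severity, occurrence, and detection ratings (each conventionally scored 1-10). A minimal sketch; the individual ratings below are hypothetical factorizations chosen only to reproduce the RPNs of 280 and 240 quoted in the abstract, and 130 is the acceptability limit it cites:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: the product of the three FMEA ratings."""
    return severity * occurrence * detection

RPN_LIMIT = 130  # upper acceptable limit cited in the abstract

# Hypothetical (severity, occurrence, detection) ratings per hazard.
hazards = {
    "sterilization of tins": (8, 5, 7),
    "heavy-metal bioaccumulation": (8, 6, 5),
}
for name, scores in hazards.items():
    value = rpn(*scores)
    verdict = "corrective action needed" if value > RPN_LIMIT else "acceptable"
    print(f"{name}: RPN = {value} ({verdict})")
```

After corrective actions, the same calculation is repeated with updated ratings to verify every hazard falls below the limit.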

  1. No Such Thing as Failure, Only Feedback: Designing Innovative Opportunities for E-Assessment and Technology-Mediated Feedback

    ERIC Educational Resources Information Center

    Miller, Charles; Doering, Aaron; Scharber, Cassandra

    2010-01-01

    In this paper we challenge designers, researchers, teachers, students, and parents to re-assess and re-envision the value of technology-mediated feedback and e-assessment by examining the innovative roles feedback and assessment played in the design of three contemporary web-based learning environments. These environments include 1) an…

  2. The neptunium-iron phase diagram

    NASA Astrophysics Data System (ADS)

    Gibson, J. K.; Haire, R. G.; Beahm, E. C.; Gensini, M. M.; Maeda, A.; Ogawa, T.

    1994-08-01

    The phase relations in the Np-Fe alloy system have been elucidated using differential thermal analysis. A phase diagram for this system is postulated based upon the experimental results, regular-solution model calculations, and an expected correspondence to the U-Fe and Pu-Fe diagrams. The postulated Np-Fe diagram is characterized by limited terminal solid solubilities, two intermetallic solid phases, NpFe2 and Np6Fe, and two eutectics.

  3. Productive Failure

    ERIC Educational Resources Information Center

    Kapur, Manu

    2008-01-01

    This study demonstrates an existence proof for "productive failure": engaging students in solving complex, ill-structured problems without the provision of support structures can be a productive exercise in failure. In a computer-supported collaborative learning setting, eleventh-grade science students were randomly assigned to one of two…

  4. Ion potential diagrams for electrochromic devices

    SciTech Connect

    Varsano, F.; Cahen, D.; Decker, F.; Guillemoles, J.F.; Masetti, E.

    1998-12-01

    Ion potential diagrams can facilitate the description of systems in which ionic species are mobile. They depict qualitatively the spatial dependence of the potential energy for mobile ions, somewhat akin to band diagrams for electrons. The authors construct ion potential diagrams for the mixed conducting (oxide), optically active electrodes of five-layer electrochromic devices, based on reversible Li{sup +} intercalation. These serve to analyze stability problems that arise in these systems. The authors then use them as building blocks to arrive at ion diagrams for complete devices. This allows analyses of (dis)coloration kinetics.

  5. Influence Diagram Use With Respect to Technology Planning and Investment

    NASA Technical Reports Server (NTRS)

    Levack, Daniel J. H.; DeHoff, Bryan; Rhodes, Russel E.

    2009-01-01

    Influence diagrams are relatively simple, but powerful, tools for assessing the impact of choices or resource allocations on goals or requirements. They are very general and can be used on a wide range of problems. They can be used for any problem that has defined goals, a set of factors that influence the goals or the other factors, and a set of inputs. Influence diagrams show the relationship among a set of results and the attributes that influence them and the inputs that influence the attributes. If the results are goals or requirements of a program, then the influence diagram can be used to examine how the requirements are affected by changes to technology investment. This paper uses an example to show how to construct and interpret influence diagrams, how to assign weights to the inputs and attributes, how to assign weights to the transfer functions (influences), and how to calculate the resulting influences of the inputs on the results. A study is also presented as an example of how using influence diagrams can help in technology planning and investment. The Space Propulsion Synergy Team (SPST) used this technique to examine the impact of R&D spending on the Life Cycle Cost (LCC) of a space transportation system. The question addressed was the effect on the recurring and the non-recurring portions of LCC of the proportion of R&D resources spent to impact technology objectives versus the proportion spent to impact operational dependability objectives. The goals, attributes, and the inputs were established. All of the linkages (influences) were determined. The weighting of each of the attributes and each of the linkages was determined. Finally the inputs were varied and the impacts on the LCC determined and are presented. The paper discusses how each of these was accomplished both for credibility and as an example for future studies using influence diagrams for technology planning and investment planning.
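
    The propagation step the paper walks through (inputs influence attributes, attributes influence results, each link carrying a weight) can be sketched as weighted sums over a topologically ordered diagram. The node names and weights below are hypothetical stand-ins, not the SPST study's values:

```python
def propagate(inputs, influences):
    """Evaluate an influence diagram given as {node: [(source, weight), ...]}.
    Each node's value is the weighted sum of its sources' values.
    Nodes must be listed in topological order (sources before dependents)."""
    values = dict(inputs)
    for node, edges in influences.items():
        values[node] = sum(values[src] * w for src, w in edges)
    return values

# Hypothetical example: R&D spend split between technology and
# dependability objectives, feeding an operations-cost result.
inputs = {"tech_spend": 0.6, "dependability_spend": 0.4}
influences = {
    "performance": [("tech_spend", 0.8)],
    "ops_cost": [("dependability_spend", 0.9), ("performance", 0.3)],
}
values = propagate(inputs, influences)
print(values["ops_cost"])
```

Varying the input split and re-running the propagation is exactly the kind of sensitivity exercise the paper describes for technology-investment planning.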

  6. Assessment of nutritional status by composite index for anthropometric failure: a study among slum children in Bankura, West Bengal.

    PubMed

    Shit, Subhadeep; Taraphdar, Pranita; Mukhopadhyay, Dipta K; Sinhababu, Apurba; Biswas, Akhil B

    2012-01-01

    A community-based cross-sectional study was conducted to find out the prevalence of the composite index of anthropometric failure (CIAF) among 117 slum-dwelling under-five children in Bankura town, West Bengal, and its relation with some common socio-economic factors. Among the study population, the prevalence of underweight was 41.6%, whereas CIAF was 80.3%. CIAF gave a near-complete estimation of undernutrition, unlike underweight alone. Children who were unimmunized, had more siblings, lived in a nuclear family, or had illiterate mothers were more likely to be undernourished. PMID:23354144
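
    The gap between the underweight and CIAF prevalences above reflects how the index is built: a child counts as an anthropometric failure if it shows any of stunting, wasting, or underweight, so CIAF is the union of the three conditions. A simplified sketch with a hypothetical cohort (not the study's data):

```python
def ciaf(stunted, wasted, underweight):
    """Composite Index of Anthropometric Failure (simplified): a child
    fails if it shows any of the three anthropometric failures."""
    return stunted or wasted or underweight

# Hypothetical cohort: each tuple is (stunted, wasted, underweight).
children = [
    (True, False, False),   # stunted only: missed by underweight alone
    (False, True, False),   # wasted only: also missed
    (False, False, True),   # underweight
    (False, False, False),  # no failure
]
underweight_rate = sum(1 for c in children if c[2]) / len(children)
ciaf_rate = sum(1 for c in children if ciaf(*c)) / len(children)
print(underweight_rate, ciaf_rate)
```

Because the union can only be at least as large as any single indicator, CIAF prevalence always bounds underweight prevalence from above, as in the 80.3% versus 41.6% figures reported.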

  7. The changing face of liver transplantation for acute liver failure: Assessment of current status and implications for future practice.

    PubMed

    Donnelly, Mhairi C; Hayes, Peter C; Simpson, Kenneth J

    2016-04-01

    The etiology and outcomes of acute liver failure (ALF) have changed since the definition of this disease entity in the 1970s. In particular, the role of emergency liver transplantation has evolved over time, with the development of prognostic scoring systems to facilitate listing of appropriate patients, and a better understanding of transplant benefit in patients with ALF. This review examines the changing etiology of ALF, transplant benefit, outcomes following transplantation, and future alternatives to emergency liver transplantation in this devastating condition. Liver Transplantation 22 527-535 2016 AASLD. PMID:26823231

  8. Influence Diagrams as Decision-Making Tools for Pesticide Risk Management

    EPA Science Inventory

    The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk, however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...

  9. Phase diagram of ammonium nitrate

    SciTech Connect

    Dunuwille, Mihindra; Yoo, Choong-Shik

    2013-12-07

    Ammonium Nitrate (AN) is a fertilizer, yet becomes an explosive upon a small addition of chemical impurities. The origin of enhanced chemical sensitivity in impure AN (or AN mixtures) is not well understood, posing significant safety issues in using AN even today. To remedy the situation, we have carried out an extensive study to investigate the phase stability of AN and its mixtures with hexane (ANFO: AN mixed with fuel oil) and aluminum (Ammonal) at high pressures and temperatures, using diamond anvil cells (DAC) and micro-Raman spectroscopy. The results indicate that pure AN decomposes to N{sub 2}, N{sub 2}O, and H{sub 2}O at the onset of the melt, whereas the mixtures, ANFO and Ammonal, decompose at substantially lower temperatures. The present results also confirm the recently proposed phase IV-IV{sup ′} transition above 17 GPa and provide new constraints for the melting and phase diagram of AN to 40 GPa and 400°C.

  10. Phase Diagram of Ammonium Nitrate

    NASA Astrophysics Data System (ADS)

    Dunuwille, Mihindra; Yoo, Choong-Shik

    2013-06-01

    Ammonium Nitrate (AN) has often been subjected to use in improvised explosive devices, due to its wide availability as a fertilizer and its capability of becoming explosive with slight additions of organic and inorganic compounds. Yet, the origin of enhanced energetic properties of impure AN (or AN mixtures) is neither chemically unique nor well understood, resulting in rather catastrophic disasters in the past [1] and thereby a significant burden on safety in using ammonium nitrates even today. To remedy this situation, we have carried out an extensive study to investigate the phase stability of AN, in different chemical environments, at high pressure and temperature, using diamond anvil cells and micro-Raman spectroscopy. The present results confirm the recently proposed phase IV-to-IV' transition above 15 GPa [2] and provide new constraints for the melting and phase diagram of AN to 40 GPa and 673 K. The present study has been supported by the U.S. DHS under Award Number 2008-ST-061-ED0001.

  11. Looking Forward, Looking Back: Assessing Variations in Hospital Resource Use and Outcomes for Elderly Patients with Heart Failure

    PubMed Central

    Ong, Michael K.; Mangione, Carol M.; Romano, Patrick S.; Zhou, Qiong; Auerbach, Andrew D.; Chun, Alein; Davidson, Bruce; Ganiats, Theodore G.; Greenfield, Sheldon; Gropper, Michael A.; Malik, Shaista; Rosenthal, J. Thomas; Escarce, José J.

    2009-01-01

    Background Recent studies have found substantial variation in hospital resource utilization by expired Medicare beneficiaries with chronic illnesses. By analyzing only expired patients, these studies cannot identify differences across hospitals in health outcomes like mortality. This study examines the association between mortality and resource utilization at the hospital level, when all Medicare beneficiaries hospitalized for heart failure are examined. Methods and Results 3,999 individuals hospitalized with a principal diagnosis of heart failure at six California teaching hospitals between January 1, 2001 and June 30, 2005 were analyzed with multivariate risk-adjustment models for total hospital days, total hospital direct costs, and mortality within 180-days after initial admission (“Looking Forward”). A subset of 1,639 individuals who died during the study period were analyzed with multivariate risk-adjustment models for total hospital days and total hospital direct costs within 180-days prior to death (“Looking Back”). “Looking Forward” risk-adjusted hospital means ranged from 17.0% to 26.0% for mortality, 7.8 to 14.9 days for total hospital days, and 0.66 to 1.30 times the mean value for indexed total direct costs. Spearman rank correlation coefficients were −0.68 between mortality and hospital days, and −0.93 between mortality and indexed total direct costs. “Looking Back” risk-adjusted hospital means ranged from 9.1 to 21.7 days for total hospital days and 0.91 to 1.79 times the mean value for indexed total direct costs. Differences in resource utilization site rankings between the expired subset and all individuals were not significant. Conclusions California teaching hospitals that used more resources caring for patients hospitalized for heart failure had lower mortality rates. Focusing only on expired individuals may overlook mortality variation as well as associations between greater resource utilization and lower mortality.

  12. Reading fitness landscape diagrams through HSAB concepts

    NASA Astrophysics Data System (ADS)

    Vigneresse, Jean-Louis

    2014-10-01

    Fitness landscapes are conceived as range of mountains, with local peaks and valleys. In terms of potential, such topographic variations indicate places of local instability or stability. The chemical potential, or electronegativity, its value changed of sign, carries similar information. In addition to chemical descriptors defined through hard-soft acid-base (HSAB) concepts and computed through density functional theory (DFT), the principles that rule chemical reactions allow the design of such landscape diagrams. The simplest diagram uses electrophilicity and hardness as coordinates. It allows examining the influence of maximum hardness or minimum electrophilicity principles. A third dimension is introduced within such a diagram by mapping the topography of electronegativity, polarizability or charge exchange. Introducing charge exchange during chemical reactions, or mapping a third parameter (f.i. polarizability) reinforces the information carried by a simple binary diagram. Examples of such diagrams are provided, using data from Earth Sciences, simple oxides or ligands.

  13. Testicular failure

    MedlinePlus

    ... Blood tests may show a low level of testosterone and high levels of prolactin, FSH , and LH . ... testes will be ordered. Testicular failure and low testosterone level may be hard to diagnose in older ...

  14. Heart Failure

    MedlinePlus

    ... together. About Rise Above HF Rise Above Heart Failure seeks to increase the dialogue about HF and improve the lives of people affected by the condition through awareness, education and support. Through the initiative, AHA strives to ...

  15. Structure-retention diagrams of ceramides established for their identification.

    PubMed

    Gaudin, Karen; Chaminade, Pierre; Baillet, Arlette

    2002-10-11

    Molecular species analysis of ceramides was carried out using porous graphitic carbon with gradient elution: chloroform-methanol from 45:55 to 85:15 with a slope at 2.7%/min. These conditions gave a linear relationship between retention data and structure of ceramides. It was demonstrated that linearity occurred when a high slope value of linear gradient elution was used. Thereby the linear diagram was evolved by plotting the adjusted retention time against the total number of carbon atoms of ceramide molecules. Each line represents one ceramide class. Such a Structure-Retention Diagram describes ceramide retention and thus constitutes an identification method using only retention data. This Structure-Retention Diagram was assessed and compared to another obtained from octadecyl-grafted silica in terms of their reproducibility, precision and ability to provide ceramide identification. Better identification was obtained using the results from both Structure-Retention Diagrams. This approach with a two-dimensional separation system allowed us to take advantage of the specificity of both identification models. PMID:12437165
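
    The identification scheme described above (one fitted line of adjusted retention time versus total carbon number per ceramide class, then assignment of an unknown to the closest line) can be sketched with ordinary least squares. All calibration data below are invented for illustration, not the paper's measurements:

```python
def fit_line(points):
    """Ordinary least-squares fit t = a + b * n through (n, t) pairs;
    returns (intercept a, slope b)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def identify(carbons, retention, class_lines):
    """Assign an unknown to the class whose fitted line best predicts
    its adjusted retention time."""
    return min(
        class_lines,
        key=lambda c: abs(class_lines[c][0] + class_lines[c][1] * carbons - retention),
    )

# Hypothetical calibration: (total carbon atoms, adjusted retention time in min).
lines = {
    "class A": fit_line([(34, 10.0), (36, 11.0), (38, 12.0)]),
    "class B": fit_line([(34, 14.0), (36, 15.0), (38, 16.0)]),
}
print(identify(36, 15.2, lines))
```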

  16. Faceting diagram for sticky steps

    NASA Astrophysics Data System (ADS)

    Akutsu, Noriko

    2016-03-01

    Faceting diagrams for the step-faceting zone, the step droplet zone, and the Gruber-Mullins-Pokrovsky-Talapov (GMPT) zone for a crystal surface are obtained by using the density matrix renormalization group method to calculate the surface tension. The model based on these calculations is the restricted solid-on-solid (RSOS) model with a point-contact-type step-step attraction (p-RSOS model) on a square lattice. The point-contact-type step-step attraction represents the energy gain obtained by forming a bonding state with orbital overlap at the meeting point of the neighboring steps. In the step-faceting zone, disconnectedness in the surface tension leads to the formation of a faceted macrostep on a vicinal surface at equilibrium. The disconnectedness in the surface tension also causes the first-order shape transition for the equilibrium shape of a crystal droplet. The lower zone boundary line (ZBL), which separates the step-faceting zone and the step droplet zone, is obtained by the condition γ_1 = lim_{n→∞} γ_n/n, where γ_n is the step tension of the n-th merged step. The upper ZBL, which separates the GMPT zone and the step droplet zone, is obtained by the conditions A_{q,eff} = 0 and B_{q,eff} = 0, where A_{q,eff} and B_{q,eff} are the coefficients of the |q|^2 and |q|^3 terms, respectively, in the |q|-expansion of the surface free energy f_eff(q). Here, q is the surface gradient relative to the (111) surface. The reason why the vicinal surface inclined in the <101> direction does not exhibit step-faceting is explained in terms of one-dimensional spinless quasi-impenetrable attractive bosons at absolute zero.

  17. Development and Validation of the First Iranian Questionnaire to Assess Quality of Life in Patients With Heart Failure: IHF-QoL

    PubMed Central

    Naderi, Nasim; Bakhshandeh, Hooman; Amin, Ahmad; Taghavi, Sepideh; Dadashi, Masoumeh; Maleki, Majid

    2012-01-01

    Background: In its Constitution of 1948, WHO defined health as “a state of complete physical, mental, and social well-being, and not merely the absence of disease and infirmity”. In 1994, the Agency for Health Care Policy and Research published clinical practice guidelines recommending providers to routinely evaluate patients’ HRQoL (Health-Related Quality of Life) and use their assessment to modify and guide patient care. Objectives: To create a valid, sensitive, disease-specific Persian health status quality of life questionnaire for patients with chronic heart failure in Iran. Materials and Methods: Considering the existing relevant inventories and scientific literature, the authors designed the first draft of the questionnaire, which was modified and validated using expert opinions and finalized in an expert panel session. The questionnaire was processed among 130 patients with heart failure. Construct validity was evaluated by principal component factor analysis, and the promax method was used for factor rotation. The MacNew quality of life questionnaire was selected to assess convergent validity, and the agreements were measured in 60 patients. Discriminant validity was also assessed. Thirty patients were followed for 3 months and the responsiveness of the questionnaire was measured. Cronbach's alpha, item analysis, and intra-class correlation coefficients (ICCs) were used to investigate the reliability of the questionnaire. SPSS 15 for Windows was applied for statistical analysis. Results: Principal component factor analysis revealed 4 main components. Sub-group analysis suggested that the IHF-QoL questionnaire demonstrated an acceptable discriminant validity. High conformity between this inventory and the MacNew questionnaire revealed an appropriate convergent validity. Cronbach's alpha (α) for the overall questionnaire was 0.922. Intra-class correlation coefficients (ICCs) for all components were significant (from 0.708 to 0.883; all P values < 0.001). Patients follow

  18. Positron Emission Tomography for Assessing Local Failure After Stereotactic Body Radiotherapy for Non-Small-Cell Lung Cancer

    SciTech Connect

    Zhang Xu; Liu Hui; Balter, Peter; Allen, Pamela K.; Komaki, Ritsuko; Pan, Tinsu; Chuang, Hubert H.; Chang, Joe Y.

    2012-08-01

    Purpose: We analyzed whether positron emission tomography (PET)/computed tomography standardized uptake values (SUVs) after stereotactic body radiotherapy (SBRT) could predict local recurrence (LR) in non-small-cell lung cancer (NSCLC). Methods and Materials: This study comprised 128 patients with Stage I (n = 68) or isolated recurrent/secondary parenchymal (n = 60) NSCLC treated with image-guided SBRT to 50 Gy over 4 consecutive days; prior radiotherapy was allowed. PET/computed tomography scans were obtained before therapy and at 1 to 6 months after therapy, as well as subsequently as clinically indicated. Continuous variables were analyzed with Kruskal-Wallis tests and categorical variables with Pearson chi-square or Fisher exact tests. Actuarial local failure rates were calculated with the Kaplan-Meier method. Results: At a median follow-up of 31 months (range, 6-71 months), the actuarial 1-, 2-, and 3-year local control rates were 100%, 98.5%, and 98.5%, respectively, in the Stage I group and 95.8%, 87.6%, and 85.8%, respectively, in the recurrent group. The cumulative rates of regional nodal recurrence and distant metastasis were 8.8% (6 of 68) and 14.7% (10 of 68), respectively, for the Stage I group and 11.7% (7 of 60) and 16.7% (10 of 60), respectively, for the recurrent group. Univariate analysis showed that SUVs obtained 12.1 to 24 months after treatment for the Stage I group (p = 0.007) and 6.1 to 12 months and 12.1 to 24 months after treatment for the recurrent group were associated with LR (p < 0.001 for both). Of the 128 patients, 17 (13.3%) had ipsilateral consolidation after SBRT but no elevated metabolic activity on PET; none had LR. The cutoff maximum SUV of 5 was found to have 100% sensitivity, 91% specificity, a 50% positive predictive value, and a 100% negative predictive value for predicting LR. Conclusions: PET was helpful for distinguishing SBRT-induced consolidation from LR. SUVs obtained more than 6 months after SBRT for NSCLC were
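
    The cutoff performance reported for the maximum SUV of 5 follows from standard confusion-matrix definitions. A minimal sketch; the counts below are hypothetical, chosen only so that the four metrics reproduce the quoted figures (100% sensitivity, ~91% specificity, 50% PPV, 100% NPV), not the study's actual patient counts:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all with LR
        "specificity": tn / (tn + fp),  # true negatives among all without LR
        "ppv": tp / (tp + fp),          # chance a positive test means LR
        "npv": tn / (tn + fn),          # chance a negative test means no LR
    }

# Hypothetical counts consistent with the abstract's reported metrics.
metrics = diagnostic_metrics(tp=5, fp=5, tn=50, fn=0)
print(metrics)
```

A perfect NPV with a modest PPV is typical of a screening-style cutoff: a negative PET is reassuring, while a positive one still warrants confirmation.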

  19. Radial dyssynchrony assessed by cardiovascular magnetic resonance in relation to left ventricular function, myocardial scarring and QRS duration in patients with heart failure

    PubMed Central

    2009-01-01

    Background Intuitively, cardiac dyssynchrony is the inevitable result of myocardial injury. We hypothesized that radial dyssynchrony reflects left ventricular remodeling, myocardial scarring, QRS duration and impaired LV function and that, accordingly, it is detectable in all patients with heart failure. Methods 225 patients with heart failure, grouped according to QRS duration of <120 ms (A, n = 75), between 120-149 ms (B, n = 75) or ≥150 ms (C, n = 75), and 50 healthy controls underwent assessment of radial dyssynchrony using the cardiovascular magnetic resonance tissue synchronization index (CMR-TSI = SD of time to peak inward endocardial motion in up to 60 myocardial segments). Results Compared to 50 healthy controls (21.8 ± 6.3 ms [mean ± SD]), CMR-TSI was higher in A (74.8 ± 34.6 ms), B (92.4 ± 39.5 ms) and C (104.6 ± 45.6 ms) (all p < 0.0001). Adopting a cut-off CMR-TSI of 34.4 ms (21.8 plus 2×SD for controls) for the definition of dyssynchrony, it was present in 91% in A, 95% in B and 99% in C. Amongst patients in NYHA class III or IV, with a LVEF<35% and a QRS>120 ms, 99% had dyssynchrony. Amongst those with a QRS<120 ms, 91% had dyssynchrony. Across the study sample, CMR-TSI was related positively to left ventricular volumes (p < 0.0001) and inversely to LVEF (CMR-TSI = 178.3·e^(−0.033·LVEF) ms, p < 0.0001). Conclusion Radial dyssynchrony is almost universal in patients with heart failure. This argues against the notion that a lack of response to CRT is related to a lack of dyssynchrony. PMID:19930713
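
    The index definition and the fitted LVEF relation quoted above lend themselves to a short sketch. Whether the study used sample or population SD is not stated, so sample SD is an assumption here, and the example segment timings are invented:

```python
import math
from statistics import stdev

def cmr_tsi(times_to_peak_ms):
    """CMR tissue synchronization index: the SD of segmental times to
    peak inward endocardial motion (sample SD assumed)."""
    return stdev(times_to_peak_ms)

def predicted_tsi(lvef_percent):
    """The study's fitted relation: CMR-TSI = 178.3 * exp(-0.033 * LVEF) ms."""
    return 178.3 * math.exp(-0.033 * lvef_percent)

# Hypothetical times to peak motion (ms) for a handful of segments.
print(cmr_tsi([120.0, 150.0, 180.0, 210.0]))
print(predicted_tsi(35.0))
```

The negative exponent captures the inverse relation reported: the lower the ejection fraction, the larger the predicted dyssynchrony index.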

  20. A pilot study to assess the feasibility of a submaximal exercise test to measure individual response to cardiac medication in dogs with acquired heart failure.

    PubMed

    Ferasin, L; Marcora, S

    2007-08-01

    Exercise testing is not commonly used in canine medicine because of several limitations. The aim of this study was to investigate the suitability of a treadmill test to measure the exercise capacity of untrained canine cardiac patients and to measure some biological parameters that might reflect the tolerance of dogs with heart failure to submaximal exercise. The exercise capacity of seven dogs with naturally occurring heart failure was evaluated before the institution of cardiac medication and 7 days after the beginning of the study. An additional re-examination was requested after 28 days. The exercise test was performed on a motorized treadmill at three different speeds (0.5 m/s, 1.0 m/s and 1.5 m/s). The following parameters were measured at the end of each stage and after 20 min recovery: heart rate, rectal temperature, glucose, lactate, aspartate aminotransferase, creatine kinase, PvO2, PvCO2, pH, haematocrit, bicarbonate, sodium, potassium and chloride. Serum cardiac troponin-I was also measured at the beginning of the test and at the end of the recovery period. Owners' perception reflected the ability of their dogs to exercise on the treadmill. Lactate level increased noticeably with the intensity of the exercise test, and its variation coincided with different exercise tolerance observed by the owners. Heart rate seemed to follow a similar trend in the few dogs presented in sinus rhythm. None of the remaining parameters appeared to be sensitive indicators of activity level in the dogs used in this study. The treadmill exercise test in dogs with acquired heart failure is feasible and might provide useful information for assessing individual response to cardiac medication. Lactate and heart rate seemed to reflect individual levels of exercise tolerance, although further studies are necessary to confirm the reliability and repeatability of this test. PMID:17253114

  1. Algorithmic Identification for Wings in Butterfly Diagrams.

    NASA Astrophysics Data System (ADS)

    Illarionov, E. A.; Sokolov, D. D.

    2012-12-01

    We investigate to what extent the wings of solar butterfly diagrams can be separated without explicit use of Hale's polarity law or of the location of the solar equator. Two cluster-analysis algorithms, DBSCAN and C-means, have demonstrated their ability to separate the wings of contemporary butterfly diagrams based solely on the sunspot group density in the diagram. Here we generalize the method to continuous tracers, give results concerning the migration velocities, and present the clusters for cycles 12-20.
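    The wing-separation idea can be illustrated with a density-based clustering of (time, latitude) points. The sketch below uses a minimal hand-rolled DBSCAN on synthetic "wings" (two latitude bands drifting equatorward over a cycle); real work would use a library implementation such as scikit-learn's, and the data here are invented, not actual sunspot records:

```python
import math
import random

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point (-1 = noise).
    Educational sketch only."""
    labels = [None] * len(points)

    def neighbours(i):
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1          # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_pts:   # core point: expand the cluster
                queue.extend(nb)
    return labels

# Synthetic butterfly wings: northern and southern sunspot bands drifting
# toward the equator over one cycle (illustrative data only).
random.seed(1)
ts = [k / 100 for k in range(100)]
north = [(t, 30 - 25 * t + random.uniform(-2, 2)) for t in ts]
south = [(t, -30 + 25 * t + random.uniform(-2, 2)) for t in ts]
labels = dbscan(north + south, eps=5.0, min_pts=4)
print(len({c for c in labels if c >= 0}))  # 2 clusters: the two wings
```

    The key property exploited is that the density gap near the equator keeps the two wings in separate clusters without any polarity information.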

  2. Learning from Failures.

    ERIC Educational Resources Information Center

    Saffran, Murray

    1991-01-01

    Describes mistakes made in trying to change the Nutrition and Digestion section of a medical biochemistry course. Author tried to make the section student taught and reports nine mistakes including the following: ignoring active opposition of colleagues, failure to assess the receptivity of the class to a new form of teaching, and overestimating…

  3. Assessment of RELAP5/MOD3 with the LOFT L9-1/L3-3 experiment simulating an anticipated transient with multiple failures

    SciTech Connect

    Bang, Y.S.; Seul, K.W.; Kim, H.J.

    1994-02-01

    The RELAP5/MOD3 5m5 code is assessed using the L9-1/L3-3 test carried out in the LOFT facility, a 1/60-scaled experimental reactor, simulating a loss-of-feedwater accident with multiple failures and the sequentially induced small-break loss-of-coolant accident. The code predictability is evaluated for four separate sub-periods with respect to the system response: the initial heatup phase, the spray and power-operated relief valve (PORV) cycling phase, the blowdown phase and the recovery phase. Based on comparisons of the calculated results with the experimental data, it is shown that the overall thermal-hydraulic behavior important to the scenario, such as heat removal between the primary and secondary sides and system depressurization, can be well predicted, and that the code could be applied to a full-scale nuclear power plant for an anticipated transient with multiple failures with reasonable accuracy. Minor discrepancies between the prediction and the experiment are identified in the reactor scram time, the post-scram behavior in the initial heatup phase, an excessive heatup rate in the cycling phase, insufficient energy convected out of the PORV under the hot-leg stratified condition in the saturated blowdown phase, and the void distribution in the secondary side in the recovery phase. These may stem from code uncertainties in predicting the spray mass flow rate, the associated condensation in the pressurizer, and the junction fluid density under stratified conditions.

  4. A Hubble Diagram for Quasars

    NASA Astrophysics Data System (ADS)

    Risaliti, G.; Lusso, E.

    2015-12-01

    We present a new method to test the ΛCDM cosmological model and to estimate cosmological parameters based on the nonlinear relation between the ultraviolet and X-ray luminosities of quasars. We built a data set of 1138 quasars by merging several samples from the literature with X-ray measurements at 2 keV and SDSS photometry, which was used to estimate the extinction-corrected 2500 Å flux. We obtained three main results: (1) we checked the nonlinear relation between X-ray and UV luminosities in small redshift bins up to z ≈ 6, confirming that the relation holds at all redshifts with the same slope; (2) we built a Hubble diagram for quasars up to z ≈ 6, which is well matched to that of supernovae in the common z = 0-1.4 redshift interval and extends the test of the cosmological model up to z ≈ 6; and (3) we showed that this nonlinear relation is a powerful tool for estimating cosmological parameters. Using the present data and assuming a ΛCDM model, we obtain Ω_M = 0.22 (+0.10/−0.08) and Ω_Λ = 0.92 (+0.18/−0.30) (Ω_M = 0.28 ± 0.04 and Ω_Λ = 0.73 ± 0.08 from a joint quasar-SNe fit). Much more precise measurements will be achieved with future surveys. A few thousand SDSS quasars already have serendipitous X-ray observations from Chandra or XMM-Newton, and at least 100,000 quasars with UV and X-ray data will be made available by the extended ROentgen Survey with an Imaging Telescope Array all-sky survey in a few years. The Euclid, Large Synoptic Survey Telescope, and Advanced Telescope for High ENergy Astrophysics surveys will further increase the sample size to at least several hundred thousand. Our simulations show that these samples will provide tight constraints on the cosmological parameters and will allow us to test for possible deviations from the standard model with higher precision than is possible today.
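    The core of the method is that the X-ray/UV relation is linear in log space, log L_X = γ log L_UV + β, so the slope γ can be fitted by ordinary least squares. A self-contained sketch on synthetic log-luminosities (the slope of 0.6 is built into the fake data here as a plausible round value, not a measurement):

```python
# Sketch of the core fitting step for the quasar Hubble-diagram method:
# the nonlinear L_X-L_UV relation is log-linear, log L_X = gamma*log L_UV + beta.
def fit_line(xs, ys):
    """Ordinary least squares for y = gamma * x + beta."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    gamma = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    beta = my - gamma * mx
    return gamma, beta

# Synthetic log-luminosities with a slope of 0.6 built in (illustration only).
log_luv = [28.0 + 0.1 * k for k in range(40)]
log_lx = [0.6 * x + 8.0 for x in log_luv]
gamma, beta = fit_line(log_luv, log_lx)
print(round(gamma, 3), round(beta, 3))  # 0.6 8.0
```

    Because the relation holds with the same slope at all redshifts, the fitted relation turns observed fluxes into relative distances, which is what populates the Hubble diagram.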

  5. Phase diagram for passive electromagnetic scatterers.

    PubMed

    Lee, Jeng Yi; Lee, Ray-Kuang

    2016-03-21

    With the conservation of power, a phase diagram defined by the amplitude square and phase of the scattering coefficient for each spherical harmonic channel is introduced as a universal map for any passive electromagnetic scatterer. Physically allowable solutions for scattering coefficients in this diagram clearly show the power competition between scattering and absorption. It also illustrates a variety of exotic scattering and absorption phenomena, from resonant scattering and invisibility cloaking to coherent perfect absorption. With electrically small core-shell scatterers as an example, we demonstrate a systematic method to design field-controllable structures based on the allowed trajectories in this diagram. The proposed phase diagram and inverse design can provide tools to design functional electromagnetic devices. PMID:27136839
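    The passivity constraint behind such a phase diagram can be sketched per channel: writing the channel's S-matrix element as s = 1 + 2a for scattering coefficient a, power conservation requires |s| ≤ 1, with equality for a lossless scatterer. The code below assumes the common normalization in which scattered power goes as |a|² and absorbed power as −(Re a + |a|²); conventions differ between references, so treat this as an illustrative sketch:

```python
import cmath

def channel_powers(a):
    """Normalized scattered and absorbed power for one harmonic channel
    with scattering coefficient a (normalization convention assumed)."""
    p_scat = abs(a) ** 2
    p_abs = -(a.real + abs(a) ** 2)   # equals (1 - |1 + 2a|^2) / 4
    return p_scat, p_abs

def is_passive(a, tol=1e-12):
    """Passivity: absorbed power non-negative, i.e. |1 + 2a| <= 1."""
    return abs(1 + 2 * a) <= 1 + tol

# Lossless scatterer: a = (e^{2i delta} - 1)/2 lies on the |1 + 2a| = 1
# circle, so absorption vanishes for any phase shift delta.
a_lossless = (cmath.exp(2j * 0.7) - 1) / 2
p_scat, p_abs = channel_powers(a_lossless)
print(is_passive(a_lossless), abs(p_abs) < 1e-12)  # True True

# A gain-like coefficient outside the circle violates passivity.
print(is_passive(0.2 + 0.9j))  # False
```

    The "allowed trajectories" of the abstract are then paths of a that stay inside this circle while trading scattered power against absorbed power.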

  6. A Smart Thermal Block Diagram Tool

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn; Miyake, Robert; Dodge, Kyle

    2008-01-01

    The presentation describes a Smart Thermal Block Diagram Tool. It is used by JPL's Team X in studying missions during the Pre-Phase A. It helps generate cost and mass estimates using proprietary data bases.

  7. The Art of Free-Body Diagrams.

    ERIC Educational Resources Information Center

    Puri, Avinash

    1996-01-01

    Discusses the difficulty of drawing free-body diagrams which only show forces exerted on a body from its neighbors. Presents three ways a body may be modeled: a particle, rigid extended, and nonrigid extended. (MKR)

  8. An Improved Mnemonic Diagram for Thermodynamic Relationships.

    ERIC Educational Resources Information Center

    Rodriguez, Joaquin; Brainard, Alan J.

    1989-01-01

    Considers pressure, volume, entropy, temperature, Helmholtz free energy, Gibbs free energy, enthalpy, and internal energy. Suggests the mnemonic diagram is for use with simple systems that are defined as macroscopically homogeneous, isotropic, uncharged, and chemically inert. (MVL)
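    Given the eight quantities listed, the mnemonic is presumably a variant of the thermodynamic square (Born square), from which differentials and Maxwell relations can be read off. As one example of what such a diagram encodes, starting from the internal-energy differential:

```latex
% From dU = T\,dS - P\,dV, equality of mixed second derivatives of U
% gives the Maxwell relation read off the mnemonic square:
\left(\frac{\partial T}{\partial V}\right)_{\!S}
  = -\left(\frac{\partial P}{\partial S}\right)_{\!V}
```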

  9. Use of influence diagrams in gas transfer system option prioritization

    SciTech Connect

    Heger, A.S.; Garcia, M.D.

    1995-08-01

    A formal decision-analysis methodology was applied to aid the Department of Energy (DOE) in deciding which of several gas transfer system (GTS) options should be selected. The decision objectives for this case study, i.e., risk and cost, were directly derived from the DOE guidelines. Influence diagrams were used to define the structure of the decision problem and clearly delineate the flow of information. A set of performance measures was used in conjunction with the influence diagrams to assess and evaluate the degree to which the objectives of the case study were met. These performance measures were extracted from technical models, design and operating data, and professional judgments. The results were aggregated to provide an overall evaluation of the different design options of the gas transfer system. Consequently, the results of this analysis were used to aid the DOE in selecting a viable GTS option.
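    The final aggregation step described above (combining performance measures on the risk and cost objectives into an overall evaluation per option) is commonly a weighted sum. The option names, scores and weights below are entirely hypothetical; the study's actual measures came from technical models, design data and expert judgment:

```python
def overall_score(scores, weights):
    """Weighted sum of normalized performance measures (higher = better)."""
    return sum(weights[obj] * val for obj, val in scores.items())

weights = {"risk": 0.6, "cost": 0.4}            # assumed objective weights
options = {                                     # invented GTS options/scores
    "GTS-A": {"risk": 0.8, "cost": 0.5},
    "GTS-B": {"risk": 0.6, "cost": 0.9},
    "GTS-C": {"risk": 0.9, "cost": 0.3},
}
ranked = sorted(options,
                key=lambda o: overall_score(options[o], weights),
                reverse=True)
print(ranked[0])  # GTS-B (0.72 vs 0.68 and 0.66)
```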

  10. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, R. E.

    1971-01-01

    A method and a computer program are presented to calculate probability of system success from an arbitrary reliability block diagram. The class of reliability block diagrams that can be handled include any active/standby combination of redundancy, and the computations include the effects of dormancy and switching in any standby redundancy. The mechanics of the program are based on an extension of the probability tree method of computing system probabilities.
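    The basic arithmetic of reliability block diagrams composes series and parallel (active-redundant) blocks. A minimal sketch of those two rules follows; the program described in the record also handles standby redundancy, dormancy and switching effects, which are omitted here:

```python
def series(*blocks):
    """All blocks must work: reliabilities multiply."""
    r = 1.0
    for b in blocks:
        r *= b
    return r

def parallel(*blocks):
    """System works if any block works: 1 - product of failure probabilities."""
    q = 1.0
    for b in blocks:
        q *= (1.0 - b)
    return 1.0 - q

# Example diagram: unit A in series with an actively redundant pair B1 || B2.
r_system = series(0.95, parallel(0.9, 0.9))
print(round(r_system, 4))  # 0.9405
```

    Diagrams that are not pure series/parallel (bridge networks, shared standbys) need the probability-tree or decomposition approach the paper describes.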