Science.gov

Sample records for failure assessment diagram

  1. Failure Assessment Diagram for Titanium Brazed Joints

    NASA Technical Reports Server (NTRS)

    Flom, Yury; Jones, Justin S.; Powell, Mollie M.; Puckett, David F.

    2011-01-01

    The interaction equation was used to predict failure in Ti-6Al-4V joints brazed with Al 1100 filler metal. The joints used in this study were geometrically similar to the joints in the brazed beryllium metering structure considered for the ATLAS telescope. This study confirmed that the interaction equation Rσ + Rτ = 1, where Rσ and Rτ are the normal and shear stress ratios, can be used as a conservative lower-bound estimate of the failure criterion in ATLAS brazed joints, as well as for construction of the Failure Assessment Diagram (FAD).
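
    As a worked illustration of how such an interaction-equation FAD is applied (a sketch only; the stresses and allowables below are hypothetical, not values from the NTRS record), the check reduces to forming the two stress ratios and testing whether their sum stays below one:

```python
# Hypothetical check of a brazed-joint stress state against the linear-interaction
# Failure Assessment Diagram R_sigma + R_tau = 1.

def fad_check(sigma, tau, sigma_allow, tau_allow):
    """Return the normal and shear stress ratios and whether the point lies inside the FAD."""
    r_sigma = sigma / sigma_allow   # normal stress ratio
    r_tau = tau / tau_allow         # shear stress ratio
    return r_sigma, r_tau, (r_sigma + r_tau) < 1.0

# Made-up applied stresses and joint allowables, in MPa.
r_s, r_t, safe = fad_check(sigma=40.0, tau=25.0, sigma_allow=90.0, tau_allow=70.0)
print(f"R_sigma = {r_s:.2f}, R_tau = {r_t:.2f}, inside FAD: {safe}")
```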

  2. Failure Assessment Diagram for Brazed 304 Stainless Steel Joints

    NASA Technical Reports Server (NTRS)

    Flom, Yury

    2011-01-01

    Interaction equations were proposed earlier to predict failure in Albemet 162 brazed joints. The present study demonstrates that the same interaction equations can be used for a lower-bound estimate of the failure criterion in 304 stainless steel joints brazed with silver-based filler metals, as well as for construction of Failure Assessment Diagrams (FADs).

  3. Evaluation of Brazed Joints Using Failure Assessment Diagram

    NASA Technical Reports Server (NTRS)

    Flom, Yury

    2012-01-01

    A fitness-for-service approach was used to perform structural analysis of brazed joints consisting of several base metal/filler metal combinations. Failure Assessment Diagrams (FADs) based on tensile and shear stress ratios were constructed and experimentally validated. It was shown that such FADs can provide a conservative estimate of safe combinations of stresses in the brazed joints. Based on this approach, Margins of Safety (MS) of brazed joints subjected to multi-axial loading conditions can be evaluated.

  4. Improved reliability analysis method based on the failure assessment diagram

    NASA Astrophysics Data System (ADS)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
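
    A minimal sketch of the kind of visualization described above — Monte Carlo sampling of FAD assessment points followed by a 2D kernel density estimate of the point cloud — is given below; the input distributions and the generic Option 1 limit curve are assumptions for illustration, not the authors' model of the cracked pipe:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Monte Carlo sample of assessment points (Lr = load ratio, Kr = fracture ratio);
# the distribution parameters are purely illustrative.
lr = rng.normal(0.8, 0.1, 5000)
kr = rng.lognormal(mean=np.log(0.7), sigma=0.15, size=5000)

# 2D kernel density estimate of the assessment-point cloud (the quantity the
# improved method visualizes as contours on the FAD).
density = gaussian_kde(np.vstack([lr, kr]))(np.vstack([lr, kr]))

# Generic Option 1 FAD limit curve, used here only as the failure boundary.
kr_limit = (1 - 0.14 * lr**2) * (0.3 + 0.7 * np.exp(-0.65 * lr**6))
print("Estimated failure probability:", np.mean(kr > kr_limit))
print("Peak point density:", density.max())
```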

  5. Application of ISO22000, Failure Mode and Effect Analysis (FMEA), cause and effect diagrams and Pareto in conjunction with HACCP and risk assessment for processing of pastry products.

    PubMed

    Varzakas, Theodoros H

    2011-09-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of the ISO22000 analysis with HACCP is carried out over pastry processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were identified as the processes with the highest RPN values (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause and effect, or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a pastry processing industry is considered imperative. PMID:21838557
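
    The RPN quantification above follows the usual FMEA arithmetic, RPN = severity × occurrence × detection; the sketch below reproduces the cited values with invented severity/occurrence/detection factorizations (only the acceptance limit of 130 comes from the abstract):

```python
# Minimal FMEA Risk Priority Number calculation: RPN = severity * occurrence * detection.
# The 1-10 scores are invented; 130 is the upper acceptable limit cited in the abstract.
ACCEPTABLE_RPN = 130

processes = {
    "storage of raw materials":  {"severity": 9, "occurrence": 5, "detection": 5},  # -> 225
    "storage of final products": {"severity": 9, "occurrence": 5, "detection": 5},  # -> 225
    "freezing":                  {"severity": 8, "occurrence": 6, "detection": 3},  # -> 144
}

for name, s in processes.items():
    rpn = s["severity"] * s["occurrence"] * s["detection"]
    status = "corrective action needed" if rpn > ACCEPTABLE_RPN else "acceptable"
    print(f"{name}: RPN = {rpn} ({status})")
```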

  6. A comparative study of wide plate behavior of a range of structural steels using the failure assessment diagram

    SciTech Connect

    Bannister, A.C.; Harrison, P.L.

    1995-12-31

    In the field of structural integrity assessments, attention is currently focused on the ability of such methods to conservatively predict the deformation and fracture behavior of structural steels and their weldments. In the current paper, the results of a series of wide plate tests on a range of structural steels are presented and the results assessed in terms of CTOD-strain relationships, BS PD 6493 Levels 2 and 3, and the crack driving force approach. The behavior of the large scale tests and the results of the various analyses are assessed with regard to the stress-strain characteristics of the individual steels. In a second step, the approach is extended to the assessment of a number of wide plate tests comprising welded joints with mismatched strength levels. Over-, under- and even-matched welded plates are compared with the behavior of normalized and quenched-and-tempered parent plates. The study demonstrates that the behavior of parent material wide plate tests can vary widely depending on the stress-strain characteristics of the material. The different behavior is a result of the consecutive effects of different steel processing conditions, microstructure, yield-to-tensile strength ratio and strain hardening exponent. These features are also manifested, to a greater or lesser extent, in the results of wide plate tests on welded plates of mismatched strength. Studies on mismatch effects should therefore give equal attention to the stress-strain characteristics of the parent materials, as these may, in some circumstances, dominate any effects of weld strength mismatch.

  7. Automatically Assessing Graph-Based Diagrams

    ERIC Educational Resources Information Center

    Thomas, Pete; Smith, Neil; Waugh, Kevin

    2008-01-01

    To date there has been very little work on the machine understanding of imprecise diagrams, such as diagrams drawn by students in response to assessment questions. Imprecise diagrams exhibit faults such as missing, extraneous and incorrectly formed elements. The semantics of imprecise diagrams are difficult to determine. While there have been…

  8. Using Dynamic Master Logic Diagram for component partial failure analysis

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    A methodology using the Dynamic Master Logic Diagram (DMLD) for the evaluation of component partial failure is presented. Since past PRAs have not focused on partial failure effects, component reliability has been based only on the binary-state assumption, i.e., a component is defined as either fully failed or fully functioning. This paper develops an approach to predict and estimate component partial failure on the basis of the fuzzy-state assumption. One example of the application of this methodology with the reliability function diagram of a centrifugal pump is presented.

  9. Failure mode diagram of rubble pile asteroids: Application to (25143) asteroid Itokawa

    NASA Astrophysics Data System (ADS)

    Hirabayashi, Masatoshi; Scheeres, Daniel J.

    2016-01-01

    This paper proposes a diagram, termed the failure mode diagram, that shows the variation in asteroidal failure as a function of spin period, and uses it to consider the failure modes and conditions of asteroid (25143) Itokawa. The diagram is useful for describing when and where failure occurs in an asteroid. Assuming that Itokawa is homogeneous, we use a plastic finite element code to obtain the diagram for this object. The results show that if the bulk cohesive strength is less than 0.1 Pa, Itokawa experiences compressional failure on the neck surface at the current spin period of 12.1 hours. At a spin period shorter than 4.5 hours, tension across the neck causes this asteroid to split into two components. It is also found that if the breakup spin period is longer than 5.2 hours, the motion of the resulting components is bounded. This implies that once Itokawa splits, the components may escape from one another.

  10. The Problem of Labels in E-Assessment of Diagrams

    ERIC Educational Resources Information Center

    Jayal, Ambikesh; Shepperd, Martin

    2009-01-01

    In this article we explore a problematic aspect of automated assessment of diagrams. Diagrams have partial and sometimes inconsistent semantics. Typically much of the meaning of a diagram resides in the labels; however, the choice of labeling is largely unrestricted. This means a correct solution may utilize differing yet semantically equivalent…

  11. Using Tree Diagrams as an Assessment Tool in Statistics Education

    ERIC Educational Resources Information Center

    Yin, Yue

    2012-01-01

    This study examines the potential of the tree diagram, a type of graphic organizer, as an assessment tool to measure students' knowledge structures in statistics education. Students' knowledge structures in statistics have not been sufficiently assessed in statistics, despite their importance. This article first presents the rationale and method…

  12. Application of Failure Mode and Effect Analysis (FMEA), cause and effect analysis, and Pareto diagram in conjunction with HACCP to a corn curl manufacturing plant.

    PubMed

    Varzakas, Theodoros H; Arvanitoyannis, Ioannis S

    2007-01-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of corn curl manufacturing. A tentative approach of FMEA application to the snack industry was attempted in an effort to exclude the presence of GMOs in the final product. This is of crucial importance from both the ethical and the legislative (Regulations EC 1829/2003; EC 1830/2003; Directive EC 18/2001) points of view. The Preliminary Hazard Analysis and the Fault Tree Analysis were used to analyze and predict the occurring failure modes in a food chain system (corn curl processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as the Ishikawa, tree, or fishbone diagram). Finally, Pareto diagrams were employed toward optimizing the GMO detection potential of FMEA. PMID:17457722
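
    The Pareto step mentioned above simply ranks failure modes by RPN and accumulates their share of the total so that corrective effort can target the dominant few; a short sketch with made-up hazards and scores:

```python
# Pareto ranking of FMEA failure modes by RPN (all values hypothetical).
rpns = {
    "GMO contamination of corn grits": 210,
    "extrusion temperature deviation": 96,
    "metal detection failure": 60,
    "packaging seal defect": 36,
}

total = sum(rpns.values())
cumulative = 0
for mode, rpn in sorted(rpns.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += rpn
    print(f"{mode:35s} RPN={rpn:4d}  cumulative share={100 * cumulative / total:5.1f}%")
```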

  13. Acute Effects of Vagotomy on Baroreflex Equilibrium Diagram in Rats with Chronic Heart Failure.

    PubMed

    Kawada, Toru; Li, Meihua; Zheng, Can; Sugimachi, Masaru

    2016-01-01

    The arterial baroreflex system can be divided into the neural arc, from pressure input to efferent sympathetic nerve activity (SNA), and the peripheral arc, from SNA to arterial pressure (AP). Plotting the neural and peripheral arcs on a pressure-SNA plane yields a baroreflex equilibrium diagram. We examined the effects of vagotomy on the open-loop static characteristics of the carotid sinus baroreflex in normal control rats (NC, n = 10) and rats with heart failure after myocardial infarction (MI, n = 10). In the NC group, vagotomy shifted the neural arc toward higher SNA and decreased the slope of the peripheral arc. Consequently, the operating-point SNA increased without a significant change in the operating-point AP on the baroreflex equilibrium diagram. These vagotomy-induced effects were not observed in the MI group, suggesting a loss of vagal modulation of the carotid sinus baroreflex function in heart failure. PMID:27594790
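
    The operating point referred to above is the intersection of the two arcs on the pressure-SNA plane; the sketch below locates it numerically for an assumed sigmoidal neural arc and linear peripheral arc (all parameter values are illustrative, not the study's data):

```python
import numpy as np
from scipy.optimize import brentq

def neural_arc(pressure, slope=0.1, midpoint=100.0, lower=60.0, span=40.0):
    """Assumed sigmoidal neural arc: SNA (%) falls as carotid sinus pressure (mmHg) rises."""
    return lower + span / (1.0 + np.exp(slope * (pressure - midpoint)))

def peripheral_arc(sna, gain=0.9, intercept=40.0):
    """Assumed linear peripheral arc: arterial pressure (mmHg) rises with SNA (%)."""
    return gain * sna + intercept

# Closed-loop operating point: the pressure at which the peripheral arc output equals
# the pressure driving the neural arc.
ap_op = brentq(lambda p: peripheral_arc(neural_arc(p)) - p, 40.0, 200.0)
print(f"Operating point: AP = {ap_op:.1f} mmHg, SNA = {neural_arc(ap_op):.1f} %")
```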

  14. Acute Effects of Vagotomy on Baroreflex Equilibrium Diagram in Rats with Chronic Heart Failure

    PubMed Central

    Kawada, Toru; Li, Meihua; Zheng, Can; Sugimachi, Masaru

    2016-01-01

    The arterial baroreflex system can be divided into the neural arc, from pressure input to efferent sympathetic nerve activity (SNA), and the peripheral arc, from SNA to arterial pressure (AP). Plotting the neural and peripheral arcs on a pressure–SNA plane yields a baroreflex equilibrium diagram. We examined the effects of vagotomy on the open-loop static characteristics of the carotid sinus baroreflex in normal control rats (NC, n = 10) and rats with heart failure after myocardial infarction (MI, n = 10). In the NC group, vagotomy shifted the neural arc toward higher SNA and decreased the slope of the peripheral arc. Consequently, the operating-point SNA increased without a significant change in the operating-point AP on the baroreflex equilibrium diagram. These vagotomy-induced effects were not observed in the MI group, suggesting a loss of vagal modulation of the carotid sinus baroreflex function in heart failure. PMID:27594790

  15. Students' Understanding of Diagrams for Solving Word Problems: A Framework for Assessing Diagram Proficiency

    ERIC Educational Resources Information Center

    Poch, Apryl L.; van Garderen, Delinda; Scheuermann, Amy M.

    2015-01-01

    A visual representation, such as a diagram, can be a powerful strategy for solving mathematical word problems. However, using a representation to solve mathematical word problems is not as simple as it seems! Many students with learning disabilities struggle to use a diagram effectively and efficiently. This article provides a framework for…

  16. Failure Assessment of Stainless Steel and Titanium Brazed Joints

    NASA Technical Reports Server (NTRS)

    Flom, Yury A.

    2012-01-01

    Following the successful application of Coulomb-Mohr and interaction equations for the evaluation of safety margins in Albemet 162 brazed joints, two additional base metal/filler metal systems were investigated. Specimens consisting of stainless steel brazed with silver-base filler metal and titanium brazed with 1100 Al alloy were tested to failure under the combined action of tensile, shear, bending and torsion loads. Finite Element Analysis (FEA), hand calculations and digital image correlation (DIC) techniques were used to estimate failure stresses and construct Failure Assessment Diagrams (FADs). This study confirms that the interaction equation Rσ + Rτ = 1, where Rσ and Rτ are the normal and shear stress ratios, can be used as a conservative lower-bound estimate of the failure criterion in stainless steel and titanium brazed joints.

  17. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluate the partial failure effect on current Probabilistic Risk Assessments (PRAs). An integrated methodology of thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis in this approach identifies the partial-operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance under the partial failure effect and to inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full scope PRA to reduce core damage frequency. An example of the application of the approach is presented. The partial failure data used in the example are from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  18. Assessment of failure of cemented polyethylene acetabular component due to bone remodeling: A finite element study.

    PubMed

    Ghosh, Rajesh

    2016-09-01

    The aim of the study is to determine failure of the cemented polyethylene acetabular component, which might occur due to excessive bone resorption, cement-bone interface debonding and fatigue failure of the cement mantle. Three-dimensional finite element models of intact and implanted pelvic bone were developed, and a bone remodeling algorithm was implemented for the present analysis. The Soderberg fatigue failure diagram was used for fatigue assessment of the cement mantle. The Hoffman failure criterion was considered for prediction of cement-bone interface debonding. Results indicate that fatigue failure of the cement mantle and implant-bone interface debonding might not occur due to bone remodeling. PMID:27408485
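
    The Soderberg diagram used for the cement mantle reduces to the criterion σ_a/S_e + σ_m/S_y ≤ 1 on the alternating and mean stresses; a minimal check with hypothetical values (not the study's material data):

```python
def soderberg_usage(sigma_a, sigma_m, endurance_limit, yield_strength):
    """Soderberg criterion: sigma_a/S_e + sigma_m/S_y <= 1 implies no fatigue failure."""
    return sigma_a / endurance_limit + sigma_m / yield_strength

# Hypothetical cyclic stresses in the cement mantle and cement strength values (MPa).
usage = soderberg_usage(sigma_a=4.0, sigma_m=6.0, endurance_limit=10.0, yield_strength=25.0)
print(f"Soderberg usage factor = {usage:.2f} ({'safe' if usage <= 1.0 else 'fatigue failure predicted'})")
```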

  19. Ground water quality assessment using multi-rectangular diagrams.

    PubMed

    Ahmad, Niaz; Sen, Zekai; Ahmad, Manzoor

    2003-01-01

    A new graphical technique is proposed here for classifying chemical analyses of ground water. In this technique, a diagram is constructed using rectangular coordinates. The new diagram, called a multi-rectangular diagram (MRD), uses adjacent multi-rectangles in which each rectangle represents a specific ground water type. This new diagram has the capability to accommodate a large number of data sets. MRDs have been used to classify chemical analyses of ground water in the Chaj Doab area of Pakistan to illustrate this new approach. Using this graphical method, the differentiated ground water types are calcium bicarbonate, magnesium bicarbonate, sodium bicarbonate, and sodium sulfate. Sodium bicarbonate emerges as the most abundant ground water type. MRDs also offer a visual display of the Chebotarev sequence of ground water quality evolution. PMID:14649865
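
    The idea behind an MRD-style classification — naming the water type after the dominant cation and anion of each analysis — can be sketched as below; the rule and the sample values are a simplification for illustration, not the published MRD construction:

```python
# Toy hydrochemical classifier in the spirit of a multi-rectangular diagram:
# the water type is taken from the dominant cation and dominant anion (meq/L basis).
def water_type(cations_meq, anions_meq):
    dominant_cation = max(cations_meq, key=cations_meq.get)
    dominant_anion = max(anions_meq, key=anions_meq.get)
    return f"{dominant_cation} {dominant_anion}"

sample = {
    "cations": {"calcium": 2.1, "magnesium": 1.4, "sodium": 4.8},
    "anions": {"bicarbonate": 5.9, "sulfate": 1.6, "chloride": 0.8},
}
print(water_type(sample["cations"], sample["anions"]))  # -> "sodium bicarbonate"
```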

  20. Failure Assessment of Brazed Structures

    NASA Technical Reports Server (NTRS)

    Flom, Yuri

    2012-01-01

    Despite the great advances in analytical methods available to structural engineers, designers of brazed structures have great difficulties in addressing fundamental questions related to the load-carrying capabilities of brazed assemblies. In this chapter we review why such common engineering tools as Finite Element Analysis (FEA), as well as many well-established theories (Tresca, von Mises, Highest Principal Stress, etc.), do not work well for brazed joints. This chapter shows how the classic approach of using interaction equations and the less known Coulomb-Mohr failure criterion can be employed to estimate Margins of Safety (MS) in brazed joints.
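
    One way such an interaction equation yields a Margin of Safety is to scale the applied stress state proportionally until it reaches the failure envelope; the sketch below does this for the linear interaction equation (a simplification under stated assumptions, not necessarily the chapter's exact procedure):

```python
def margin_of_safety(sigma, tau, sigma_allow, tau_allow):
    """MS against the linear interaction envelope R_sigma + R_tau = 1 for proportional
    loading: the load factor at failure is 1 / (R_sigma + R_tau)."""
    usage = sigma / sigma_allow + tau / tau_allow
    return 1.0 / usage - 1.0

# Hypothetical brazed-joint stresses and allowables (MPa).
print(f"MS = {margin_of_safety(sigma=30.0, tau=20.0, sigma_allow=90.0, tau_allow=70.0):.2f}")
```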

  1. Assessing Students to Erase Failure

    ERIC Educational Resources Information Center

    Riggins, Cheryl G.

    2006-01-01

    Good education begins with an objective assessment and accurate diagnosis of specific "deficiencies." Transforming assessment from a system of primarily external judgments made by others to a system driven primarily by self-correction is the new frontier for instructional leaders. In this article, the author describes how student-centered…

  2. Assessment of Nonorganic Failure To Thrive.

    ERIC Educational Resources Information Center

    Wooster, Donna M.

    1999-01-01

    This article describes basic assessment considerations for infants and toddlers exhibiting nonorganic failure to thrive. The evaluation process must examine feeding, maternal-child interactions, child temperament, and environmental risks and behaviors. Early identification and intervention are necessary to minimize the long-term developmental…

  3. Nanoparticles in the environment: assessment using the causal diagram approach.

    PubMed

    Smita, Suchi; Gupta, Shailendra K; Bartonova, Alena; Dusinska, Maria; Gutleb, Arno C; Rahman, Qamar

    2012-01-01

    Nanoparticles (NPs) cause concern for health and safety as their impact on the environment and humans is not known. Relatively few studies have investigated the toxicological and environmental effects of exposure to naturally occurring NPs (NNPs) and man-made or engineered NPs (ENPs), which are known to have a wide variety of effects once taken up into an organism. A review of recent knowledge (between 2000-2010) on NP sources, and their behaviour, exposure and effects on the environment and humans, was performed. An integrated approach was used to compile the available scientific information within an interdisciplinary logical framework, to identify knowledge gaps and to describe environment and health linkages for NNPs and ENPs. The causal diagram has been developed as a method to handle the complexity of issues on NP safety, from their exposure to the effects on the environment and health. It gives an overview of available scientific information starting with common sources of NPs and their interactions with various environmental processes that may pose threats to both human health and the environment. Effects of NNPs on dust cloud formation and decrease in sunlight intensity were found to be important environmental changes with direct and indirect implications for various human health problems. NNP and ENP exposure and their accumulation in biological matrices such as microbiota, plants and humans may result in various adverse effects. The impact of some NPs on human health via ROS generation was found to be one of the major causes of various diseases. A proposed cause-effects diagram for NPs is designed considering both NNPs and ENPs. It represents a valuable information package and user-friendly tool for various stakeholders, including students, researchers and policy makers, to better understand and communicate on issues related to NPs. PMID:22759495

  4. Nanoparticles in the environment: assessment using the causal diagram approach

    PubMed Central

    2012-01-01

    Nanoparticles (NPs) cause concern for health and safety as their impact on the environment and humans is not known. Relatively few studies have investigated the toxicological and environmental effects of exposure to naturally occurring NPs (NNPs) and man-made or engineered NPs (ENPs), which are known to have a wide variety of effects once taken up into an organism. A review of recent knowledge (between 2000-2010) on NP sources, and their behaviour, exposure and effects on the environment and humans, was performed. An integrated approach was used to compile the available scientific information within an interdisciplinary logical framework, to identify knowledge gaps and to describe environment and health linkages for NNPs and ENPs. The causal diagram has been developed as a method to handle the complexity of issues on NP safety, from their exposure to the effects on the environment and health. It gives an overview of available scientific information starting with common sources of NPs and their interactions with various environmental processes that may pose threats to both human health and the environment. Effects of NNPs on dust cloud formation and decrease in sunlight intensity were found to be important environmental changes with direct and indirect implications for various human health problems. NNP and ENP exposure and their accumulation in biological matrices such as microbiota, plants and humans may result in various adverse effects. The impact of some NPs on human health via ROS generation was found to be one of the major causes of various diseases. A proposed cause-effects diagram for NPs is designed considering both NNPs and ENPs. It represents a valuable information package and user-friendly tool for various stakeholders, including students, researchers and policy makers, to better understand and communicate on issues related to NPs. PMID:22759495

  5. Insulation failure assessment under random energization overvoltages

    SciTech Connect

    Mahdy, A.M.; Anis, H.I.; El-Morshedy, A.

    1996-03-01

    This paper offers a new, simple approach to the evaluation of the risk of failure of external insulation in view of its known probabilistic nature. The approach is applied to EHV transmission systems subjected to energization overvoltages. The randomness in both the applied stresses and the insulation's withstand characteristics is numerically simulated and then integrated to assess the risk of failure. Overvoltage control methods are accounted for, such as the use of pre-insertion breaker resistors, series capacitive compensation, and the installation of shunt reactors.

  6. Assessing legal responsibility for implant failure.

    PubMed

    Palat, M

    1991-04-01

    The number of malpractice suits related to implants has recently increased significantly, with awards that are among the largest in dentistry. This article discusses the principles involved in assessing liability for implant failure and the various clinical situations that can affect liability in implant practice. The author also provides a list of the interrogatories required of defendants in malpractice suits related to implants. PMID:1893392

  7. Ignition Failure Mode Radiochemical Diagnostics Initial Assessment

    SciTech Connect

    Fortner, R; Bernstein, L; Cerjan, C; Haan, S W; Harding, R; Hatchett, S; Hoffman, R; Koch, J; Moody, K; Schneider, D; Stoyer, M; Werner, C; Zimmerman, G

    2007-04-20

    Radiochemical diagnostic signatures are well known to be effective indicators of nuclear ignition and burn reaction conditions. Nuclear activation is already a reliable technique to measure yield. More comprehensively, though, important quantities such as fuel areal density and ion temperature might be separately and more precisely monitored by a judicious choice of select nuclear reactions. This report details an initial assessment of this approach to diagnosing ignition failures on point-design cryogenic National Ignition Campaign targets. Using newly generated nuclear reaction cross section data for Scandium and Iridium, modest uniform doping of the innermost ablator region provides clearly observable reaction product differences between robust burn and failure for either element. Both equatorial and polar tracer loading yield observable, but indistinguishable, signatures for either choice of element for the preliminary cases studied.

  8. Failure detection system risk reduction assessment

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert B. (Inventor); Huang, Zhaofeng (Inventor)

    2012-01-01

    A process includes determining a probability of a failure mode of a system being analyzed reaching a failure limit as a function of time to failure limit, determining a probability of a mitigation of the failure mode as a function of a time to failure limit, and quantifying a risk reduction based on the probability of the failure mode reaching the failure limit and the probability of the mitigation.
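
    Read at its simplest, the quantified risk reduction is the probability that the failure mode reaches its limit multiplied by the probability that the mitigation acts in time; the functional forms below are assumptions chosen for illustration, not the patented method:

```python
import math

def p_failure_reaches_limit(t, mttf=1000.0):
    """Assumed exponential model for the failure mode reaching its limit by time t."""
    return 1.0 - math.exp(-t / mttf)

def p_mitigation(t, detection_prob=0.95, response_time=5.0):
    """Assumed probability that detection and response complete within time t."""
    return detection_prob * (1.0 - math.exp(-t / response_time))

t = 50.0  # time to failure limit, arbitrary units
p_fail, p_mit = p_failure_reaches_limit(t), p_mitigation(t)
print(f"P(fail) = {p_fail:.3f}, P(mitigate) = {p_mit:.3f}, risk reduction = {p_fail * p_mit:.3f}")
```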

  9. Derivation of Failure Rates and Probability of Failures for the International Space Station Probabilistic Risk Assessment Study

    NASA Technical Reports Server (NTRS)

    Vitali, Roberto; Lutomski, Michael G.

    2004-01-01

    The National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions but, more importantly, to compare different operational and management options, determine the lowest-risk option, and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and the probabilities of failure of the basic events employed in the PRA model of the ISS. The paper shows how a Bayesian approach was used with different sources of data, including the actual ISS on-orbit failures, to enhance confidence in the results of the PRA. As time progresses and more meaningful data are gathered from on-orbit failures, an increasingly accurate failure rate probability distribution for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping the ISS critical systems, such as propulsion, thermal control, or power generation, into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement units (ORUs). The ORU level was chosen consistent with the level of statistically meaningful data that could be obtained from the aerospace industry and from experts in the field. For example, data were gathered for the solenoid valves present in the propulsion system of the ISS. However, valves themselves are composed of parts, and the individual failure of these parts was not accounted for in the PRA model. In other words, the failure of a spring within a valve was considered a failure of the valve itself.
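
    Bayesian updating of this kind is commonly done with a conjugate gamma prior on the failure rate and a Poisson likelihood for the observed failure count; a minimal sketch with invented numbers (not ISS data):

```python
# Conjugate gamma-Poisson update of a constant failure rate (a sketch, not the ISS PRA model).
# Prior: lambda ~ Gamma(shape=alpha0, rate=beta0), e.g. from industry data or expert judgment.
alpha0, beta0 = 2.0, 2.0e4             # invented prior: mean 1e-4 failures per hour

n_failures, exposure_hours = 3, 5.0e4  # hypothetical on-orbit experience for one ORU type

alpha_post = alpha0 + n_failures       # posterior shape
beta_post = beta0 + exposure_hours     # posterior rate (hours of exposure)

print(f"Posterior mean failure rate: {alpha_post / beta_post:.2e} per hour")
```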

  10. Assessment of hoist failure rate for Payload Transporter III

    SciTech Connect

    Demmie, P.N.

    1994-02-01

    Assessment of the hoist failure rate for the Payload Transporter Type III (PT-III) hoist was completed as one of the ground transportation tasks for the Minuteman III (MMIII) Weapon System Safety Assessment. The failures of concern are those that lead to dropping a reentry system (RS) during hoist operations in a silo or in the assembly, storage, and inspection building for an MMIII wing. After providing a brief description of the PT-III hoist system, the author summarizes his search for historical data from industry and the military services on failures of electric hoist systems. Since such information was not found, the strategy for assessing a failure rate was to consider failure mechanisms that lead to load-drop accidents, estimate their rates, and sum the rates to obtain the PT-III hoist failure rate. The author discusses failure mechanisms and describes his assessment of a chain failure rate that is based on data from destructive testing of a chain of the type used for the PT-III hoist and on projected usage rates for hoist operations involving the RS. The main result provides upper bounds for chain failure rates based on these data. No test data were found to estimate failure rates due to mechanisms other than chain failure. The author did not attempt to quantify the effects of human factors on the PT-III hoist failure rate.

  11. Failure risk assessment by analysis and testing

    NASA Technical Reports Server (NTRS)

    Moore, N.; Ebbeler, D.; Creager, M.

    1992-01-01

    The sources of information on which to base an evaluation of reliability or failure risk of an aerospace flight system are (1) experience from tests and flights and (2) engineering analysis. It is rarely feasible to establish high reliability at high confidence by testing aerospace systems or components. Moreover, failure prediction by conventional, deterministic methods of engineering analysis can become arbitrary and subject to serious misinterpretation when uncertain or approximate information is used to establish analysis parameter values and to calibrate the accuracy of engineering models. The limitations of testing to evaluate failure risk are discussed, and a statistical approach which incorporates both engineering analysis and testing is presented.

  12. Assessment of Steel Reinforcement Corrosion State by Parameters of Potentiodynamic Diagrams

    NASA Astrophysics Data System (ADS)

    Krajči, Ľudovít; Jerga, Ján

    2015-12-01

    The environment of the steel reinforcement has a significant impact on the durability and service life of a concrete structure. This environment is determined not only by the presence of aggressive substances from the surroundings, but also by the composition of the concrete mixture itself. The use of new types of cements, additives and admixtures must therefore be preceded by verification that they do not themselves initiate corrosion. A closer physical interpretation of the parameters of potentiodynamic diagrams is needed to allow reliable assessment of the influence of the surrounding environment on the electrochemical behaviour of the reinforcement. An analysis of the zero retardation limits of potentiodynamic curves is presented.

  13. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    Dana L. Kelly; Dale M. Rasmuson

    2008-09-01

    This paper describes the approach taken by the U. S. Nuclear Regulatory Commission to the treatment of common-cause failure in probabilistic risk assessment of operational events. The approach is based upon the Basic Parameter Model for common-cause failure, and examples are illustrated using the alpha-factor parameterization, the approach adopted by the NRC in their Standardized Plant Analysis Risk (SPAR) models. The cases of a failed component (with and without shared common-cause failure potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g., failure to start and failure to run) is a new feature of this paper. These methods are being applied by the NRC in assessing the risk significance of operational events for the Significance Determination Process (SDP) and the Accident Sequence Precursor (ASP) program.
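
    For reference, the alpha-factor parameterization mentioned above gives the probability of a common-cause basic event failing exactly k of m components (non-staggered-testing form) as Q_k = [k / C(m-1, k-1)] (α_k / α_t) Q_t, with α_t = Σ k·α_k; a short sketch with invented values:

```python
from math import comb

def alpha_factor_q(k, m, alphas, q_total):
    """Common-cause basic-event probability Q_k for a group of m components,
    non-staggered-testing alpha-factor form; alphas[i-1] holds alpha_i."""
    alpha_t = sum(i * a for i, a in enumerate(alphas, start=1))
    return k / comb(m - 1, k - 1) * alphas[k - 1] / alpha_t * q_total

# Invented 3-component group: alpha factors and total per-component failure probability.
alphas, q_total = [0.95, 0.03, 0.02], 1.0e-3
for k in (1, 2, 3):
    print(f"Q_{k} = {alpha_factor_q(k, 3, alphas, q_total):.2e}")
```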

  14. Vulnerability Assessment for Cascading Failures in Electric Power Systems

    SciTech Connect

    Baldick, R.; Chowdhury, Badrul; Dobson, Ian; Dong, Zhao Yang; Gou, Bei; Hawkins, David L.; Huang, Zhenyu; Joung, Manho; Kim, Janghoon; Kirschen, Daniel; Lee, Stephen; Li, Fangxing; Li, Juan; Li, Zuyi; Liu, Chen-Ching; Luo, Xiaochuan; Mili, Lamine; Miller, Stephen; Nakayama, Marvin; Papic, Milorad; Podmore, Robin; Rossmaier, John; Schneider, Kevin P.; Sun, Hongbin; Sun, Kai; Wang, David; Wu, Zhigang; Yao, Liangzhong; Zhang, Pei; Zhang, Wenjie; Zhang, Xiaoping

    2008-09-10

    Cascading failures present severe threats to power grid security, and thus vulnerability assessment of power grids is of significant importance. Focusing on analytic methods, this paper reviews the state of the art of vulnerability assessment methods in the context of cascading failures in three categories: steady-state modeling based analysis; dynamic modeling analysis; and non-traditional modeling approaches. The impact of emerging technologies including phasor technology, high-performance computing techniques, and visualization techniques on the vulnerability assessment of cascading failures is then addressed, and future research directions are presented.

  15. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  16. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  17. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  18. 45 CFR 156.961 - Failure to pay assessment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Failure to pay assessment. 156.961 Section 156.961 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS HEALTH... Administrative Review of QHP Issuer Sanctions in Federally-Facilitated Exchanges § 156.961 Failure to...

  19. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  20. 45 CFR 150.461 - Failure to pay assessment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Failure to pay assessment. 150.461 Section 150.461 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.461 Failure to...

  1. Probabilistic failure assessment with application to solid rocket motors

    NASA Technical Reports Server (NTRS)

    Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.

    1990-01-01

    A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.

  2. Development and validation of standard area diagrams to aide assessment of pecan scab symptoms on pecan fruit

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pecan scab (Fusicladium effusum) causes losses of pecan nutmeat yield and quality in the southeastern U.S. Disease assessment relies on visual rating, which can be inaccurate and imprecise, with poor inter-rater reliability. A standard area diagram (SAD) set for pecan scab on fruit valves was developed...

  3. Methods for Assessing Honeycomb Sandwich Panel Wrinkling Failures

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Bednarcyk, Brett A.

    2012-01-01

    Efficient closed-form methods for predicting the facesheet wrinkling failure mode in sandwich panels are assessed. Comparisons were made with finite element model predictions for facesheet wrinkling, and a validated closed-form method was implemented in the HyperSizer structure sizing software.

  4. How many standard area diagram sets are needed for accurate disease severity assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standard area diagram sets (SADs) are widely used in plant pathology: a rater estimates disease severity by comparing an unknown sample to actual severities in the SADs and interpolates an estimate as accurately as possible (although some SADs have been developed for categorizing disease too). Most ...

  5. Methods of failure and reliability assessment for mechanical heart pumps.

    PubMed

    Patel, Sonna M; Allaire, Paul E; Wood, Houston G; Throckmorton, Amy L; Tribble, Curt G; Olsen, Don B

    2005-01-01

    Artificial blood pumps are today's most promising bridge-to-recovery (BTR), bridge-to-transplant (BTT), and destination therapy solutions for patients suffering from intractable congestive heart failure (CHF). Due to an increased need for effective, reliable, and safe long-term artificial blood pumps, each new design must undergo failure and reliability testing, an important step prior to approval from the United States Food and Drug Administration (FDA), for clinical testing and commercial use. The FDA has established no specific standards or protocols for these testing procedures and there are only limited recommendations provided by the scientific community when testing an overall blood pump system and individual system components. Product development of any medical device must follow a systematic and logical approach. As the most critical aspects of the design phase, failure and reliability assessments aid in the successful evaluation and preparation of medical devices prior to clinical application. The extent of testing, associated costs, and lengthy time durations to execute these experiments justify the need for an early evaluation of failure and reliability. During the design stages of blood pump development, a failure modes and effects analysis (FMEA) should be completed to provide a concise evaluation of the occurrence and frequency of failures and their effects on the overall support system. Following this analysis, testing of any pump typically involves four sequential processes: performance and reliability testing in simple hydraulic or mock circulatory loops, acute and chronic animal experiments, human error analysis, and ultimately, clinical testing. This article presents recommendations for failure and reliability testing based on the National Institutes of Health (NIH), Society for Thoracic Surgeons (STS) and American Society for Artificial Internal Organs (ASAIO), American National Standards Institute (ANSI), the Association for Advancement of

  6. Computed radionuclide urogram for assessing acute renal failure

    SciTech Connect

    Schlegel, J.U.; Lang, E.K.

    1980-05-01

    The computed radionuclide urogram is advocated as a noninvasive diagnostic method for differentiation of the most common prerenal, renal, and postrenal causes of acute renal failure. On the basis of characteristic changes in the effective renal plasma flow rate, the calculated filtration fraction, and the calculated glomerular filtration rate, prerenal conditions such as renal artery stenosis or thrombosis, renal conditions such as acute rejection or acute tubular necrosis, and postrenal conditions such as obstruction or leakage, which are the most common causes of acute renal failure, can be differentiated. In conjunction with morphologic criteria derived from sonograms, a diagnosis with acceptable confidence can be rendered in most instances. Both the computed radionuclide urogram and sonogram are noninvasive and can be used without adverse effects in the presence of azotemia and even anuria. This also makes feasible reexamination at intervals to assess effect of therapy and offer prognostic information.

  7. Selected component failure rate values from fusion safety assessment tasks

    SciTech Connect

    Cadwallader, L.C.

    1998-09-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers.
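
    Failure and repair rates of this kind feed directly into scoping availability estimates via the steady-state relation A = μ / (λ + μ); the rates in the sketch below are invented, not values from the report:

```python
# Scoping steady-state availability from a constant failure rate and repair rate:
# A = mu / (lambda + mu) = MTTF / (MTTF + MTTR). Numbers are illustrative only.
failure_rate = 1.0e-5    # per hour  -> MTTF = 1e5 h
repair_rate = 1.0 / 8.0  # per hour  -> MTTR = 8 h

availability = repair_rate / (failure_rate + repair_rate)
print(f"Steady-state availability: {availability:.6f}")
```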

  8. Selected Component Failure Rate Values from Fusion Safety Assessment Tasks

    SciTech Connect

    Cadwallader, Lee Charles

    1998-09-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers.

  9. Assessing Wind Farm Reliability Using Weather Dependent Failure Rates

    NASA Astrophysics Data System (ADS)

    Wilson, G.; McMillan, D.

    2014-06-01

    Using reliability data from two modern, large-scale wind farm sites and wind data from two on-site met masts, a model is developed that calculates wind-speed-dependent failure rates, which are used to populate a Markov chain. Monte Carlo simulation is then used to simulate three wind farms subjected to controlled wind speed conditions from three separate potential UK sites. The model then calculates and compares wind farm reliability due to corrective maintenance and component failure rates influenced by the wind speed at each of the sites. Results show that the components affected most by changes in average daily wind speed are the control system and the yaw system. A comparison between this model and a simpler estimation of site yield is undertaken. The model takes into account the effects of wind speed on the cost of operation and maintenance and also includes the impact of longer periods of downtime in the winter months and shorter periods in the summer. By taking these factors into account, a more detailed site assessment can be undertaken. There is significant value in this model for operators and manufacturers.
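
    A stripped-down version of such a model draws a daily wind speed, maps it to a failure rate, and simulates failures and repairs over a year; the rates, the wind-speed dependence and the repair time below are invented for illustration, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(1)

def daily_failure_rate(wind_speed):
    """Invented wind-speed-dependent daily failure probability for one turbine."""
    return 1.0e-3 * (1.0 + 0.15 * max(wind_speed - 8.0, 0.0))

def simulate_turbine(days=365, mean_wind=9.0, repair_days=3):
    """Two-state (up/down) simulation driven by daily wind speed."""
    up, downtime, remaining_repair = True, 0, 0
    for _ in range(days):
        wind = rng.weibull(2.0) * mean_wind        # daily mean wind speed, m/s
        if up:
            if rng.random() < daily_failure_rate(wind):
                up, remaining_repair = False, repair_days
        else:
            downtime += 1
            remaining_repair -= 1
            if remaining_repair <= 0:
                up = True
    return downtime / days

print(f"Simulated unavailability: {np.mean([simulate_turbine() for _ in range(200)]):.4f}")
```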

  10. Ab-initio calculations and phase diagram assessments of An-Al systems (An = U, Np, Pu)

    NASA Astrophysics Data System (ADS)

    Sedmidubský, D.; Konings, R. J. M.; Souček, P.

    2010-02-01

    The enthalpies of formation of the binary intermetallic compounds AnAl_n (n = 2, 3, 4; An = U, Np, Pu) were assessed from first-principles calculations of total energies performed using the full-potential APW+lo technique within density functional theory (WIEN2k). The substantial contribution to the entropies, S°(298), arising from lattice vibrations was calculated by the direct method within the harmonic crystal approximation (Phonon software + VASP for obtaining the Hellmann-Feynman forces). The electronic heat capacity and the corresponding contribution to the entropy were estimated from the density of states at the Fermi level obtained from electronic structure calculations. The phase diagrams of the relevant An-Al systems were calculated from the thermodynamic data assessed from the ab-initio calculations, together with known equilibrium and calorimetry data, by employing the FactSage program.

  11. Use of standard area diagrams to improve assessment of pecan scab on fruit

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pecan scab (Fusicladium effusum) causes significant economic losses of pecan throughout the southeastern US. Disease assessment relies on visual rating of disease severity, which can be inaccurate, imprecise, with poor repeatability and reproducibility. Accurate, precise assessments are important fo...

  12. Comparison of Body Composition Assessment Methods in Pediatric Intestinal Failure

    PubMed Central

    Mehta, Nilesh M.; Raphael, Bram; Guteirrez, Ivan; Quinn, Nicolle; Mitchell, Paul D.; Litman, Heather J.; Jaksic, Tom; Duggan, Christopher P.

    2015-01-01

    Objectives To examine the agreement of multifrequency bioelectric impedance analysis (BIA) and anthropometry with reference methods for body composition assessment in children with intestinal failure (IF). Methods We conducted a prospective pilot study in children 14 years of age or younger with IF resulting from either short bowel syndrome (SBS) or motility disorders. Bland-Altman analysis was used to examine the agreement between BIA and deuterium dilution in measuring total body water (TBW) and lean body mass (LBM), and between BIA and dual X-ray absorptiometry (DXA) in measuring LBM and fat mass (FM). FM and percent body fat (%BF) measurements by BIA and anthropometry were also compared with those measured by deuterium dilution. Results Fifteen children with IF, median (IQR) age 7.2 (5.0, 10.0) years, 10 (67%) male, were studied. BIA and deuterium dilution were in good agreement, with a mean bias (limits of agreement) of 0.9 (-3.2, 5.0) for TBW (L) and 0.1 (-5.4 to 5.6) for LBM (kg) measurements. The mean bias (limits) for FM (kg) and %BF measurements were 0.4 (-3.8, 4.6) kg and 1.7 (-16.9, 20.3)%, respectively. The limits of agreement were within 1 SD of the mean bias in 12/14 (86%) subjects for TBW and LBM, and in 11/14 (79%) for FM and %BF measurements. Mean bias (limits) for LBM (kg) and FM (kg) between BIA and DXA were 1.6 (-3.0 to 6.3) kg and -0.1 (-3.2 to 3.1) kg, respectively. Mean bias (limits) for FM (kg) and %BF between anthropometry and deuterium dilution were 0.2 (-4.2, 4.6) and -0.2 (-19.5 to 19.1), respectively. The limits of agreement were within 1 SD of the mean bias in 10/14 (71%) subjects. Conclusions In children with intestinal failure, TBW and LBM measurements by the multifrequency BIA method were in agreement with isotope dilution and DXA methods, with small mean bias. In comparison to deuterium dilution, BIA was comparable to anthropometry for FM and %BF assessments, with small mean bias. However, the limits of agreement
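
    The Bland-Altman quantities reported above are simply the mean of the paired differences (bias) with limits of agreement at ±1.96 SD; a short sketch on made-up paired measurements (not the study's data):

```python
import numpy as np

# Bland-Altman bias and limits of agreement for two methods measuring the same quantity.
method_a = np.array([18.2, 22.5, 15.9, 27.1, 20.3, 24.8])  # e.g. BIA total body water, L
method_b = np.array([17.5, 23.0, 15.1, 26.0, 19.8, 25.6])  # e.g. deuterium dilution, L

diff = method_a - method_b
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)
print(f"Bias = {bias:.2f} L, limits of agreement = ({bias - half_width:.2f}, {bias + half_width:.2f}) L")
```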

  13. Assessment of Commonly Available Educational Materials in Heart Failure Clinics

    PubMed Central

    Taylor-Clarke, Kimberli; Henry-Okafor, Queen; Murphy, Clare; Keyes, Madeline; Rothman, Russell; Churchwell, Andre; Mensah, George A.; Sawyer, Douglas; Sampson, Uchechukwu K. A.

    2014-01-01

    Background Health literacy (HL) is an established independent predictor of cardiovascular outcomes. Approximately 90 million Americans have limited HL and read at ≤ 5th grade-level. Therefore, we sought to determine the suitability and readability level of common cardiovascular patient education materials (PEM) related to heart failure and heart-healthy lifestyle. Methods and Results The suitability and readability of written PEMs were assessed using the suitability assessment of materials (SAM) and Fry readability formula. The SAM criteria are comprised of the following categories: message content, text appearance, visuals, and layout and design. We obtained a convenience sample of 18 English-written cardiovascular PEMs freely available from major health organizations. Two reviewers independently appraised the PEMs. Final suitability scores ranged from 12 to 87%. Readability levels ranged between 3rd and 15th grade-level; the average readability level was 8th grade. Ninety-four percent of the PEMs were rated either superior or adequate on text appearance, but ≥ 50% of the PEMs were rated inadequate on each of the other categories of the SAM criteria. Only two (11%) PEMs had the optimum suitability score of ≥ 70% and ≤ 5th grade readability level suitable for populations with limited HL. Conclusions Commonly available cardiovascular PEMs used by some major healthcare institutions are not suitable for the average American patient. The true prevalence of suboptimal PEMs needs to be determined as it potentially negatively impacts optimal healthcare delivery and outcomes. PMID:21743339

  14. A failure diagnosis and impact assessment prototype for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Baker, Carolyn G.; Marsh, Christopher A.

    1991-01-01

    NASA is investigating the use of advanced automation to enhance crew productivity for Space Station Freedom in numerous areas, one being failure management. A prototype is described that diagnoses failure sources and assesses the future impacts of those failures on other Freedom entities.

  15. Assessing patient preferences in heart failure using conjoint methodology

    PubMed Central

    Pisa, Giovanni; Eichmann, Florian; Hupfer, Stephan

    2015-01-01

    Aim The course of heart failure (HF) is characterized by frequent hospitalizations, a high mortality rate, as well as a severely impaired health-related quality of life (HRQoL). To optimize disease management, understanding of patient preferences is crucial. We aimed to assess patient preferences using conjoint methodology and HRQoL in patients with HF. Methods Two modules were applied: an initial qualitative module, consisting of in-depth interviews with 12 HF patients, and the main quantitative module in 300 HF patients from across Germany. Patients were stratified according to the time of their last HF hospitalization. Each patient was presented with ten different scenarios during the conjoint exercise. Additionally, patients completed the generic HRQoL instrument, EuroQol health questionnaire (EQ-5D™). Results The attribute with the highest relative importance was dyspnea (44%), followed by physical capacity (18%). Of similar importance were exhaustion during mental activities (13%), fear due to HF (13%), and autonomy (12%). The most affected HRQoL dimensions according to the EQ-5D questionnaire were anxiety/depression (23% with severe problems), pain/discomfort (19%), and usual activities (15%). Overall average EQ-5D score was 0.39 with stable, chronic patients (never hospitalized) having a significantly better health state vs the rest of the cohort. Conclusion This paper analyzed patient preference in HF using a conjoint methodology. The preference weights resulting from the conjoint analysis could be used in future to design HRQoL questionnaires which could better assess patient preferences in HF care. PMID:26345530

  16. Thermodynamic Diagrams

    NASA Astrophysics Data System (ADS)

    Chaston, Scot

    1999-02-01

    Thermodynamic data such as equilibrium constants, standard cell potentials, molar enthalpies of formation, and standard entropies of substances can be a very useful basis for an organized presentation of knowledge in diverse areas of applied chemistry. Thermodynamic data can become particularly useful when incorporated into thermodynamic diagrams that are designed to be easy to recall, to serve as a basis for reconstructing previous knowledge, and to determine whether reactions can occur exergonically or only with the help of an external energy source. Few students in our chemistry-based courses would want to acquire the depth of knowledge or rigor of professional thermodynamicists. But they should nevertheless learn how to make good use of thermodynamic data in their professional occupations that span the chemical, biological, environmental, and medical laboratory fields. This article discusses examples of three thermodynamic diagrams that have been developed for this purpose. They are the thermodynamic energy account (TEA), the total entropy scale, and the thermodynamic scale diagrams. These diagrams help in the teaching and learning of thermodynamics by bringing the imagination into the process of developing a better understanding of abstract thermodynamic functions, and by allowing the reader to keep track of specialist thermodynamic discourses in the literature.
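
    The link between tabulated data and exergonicity that such diagrams exploit is ΔG° = -RT ln K (with ΔG° = ΔH° - TΔS°); a short check in code, using an arbitrary equilibrium constant for illustration:

```python
import math

R = 8.314   # J mol^-1 K^-1
T = 298.15  # K

def delta_g_from_k(K):
    """Standard Gibbs energy change from an equilibrium constant: dG = -RT ln K."""
    return -R * T * math.log(K)

K = 2.5e4  # arbitrary illustrative equilibrium constant
dG = delta_g_from_k(K)
print(f"dG = {dG / 1000:.1f} kJ/mol ->", "exergonic" if dG < 0 else "requires external energy")
```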

  17. Generic component failure data base for light water and liquid sodium reactor PRAs (probabilistic risk assessments)

    SciTech Connect

    Eide, S.A.; Chmielewski, S.V.; Swantz, T.D.

    1990-02-01

    A comprehensive generic component failure data base has been developed for light water and liquid sodium reactor probabilistic risk assessments (PRAs). The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) and the Centralized Reliability Data Organization (CREDO) data bases were used to generate component failure rates. Using this approach, most of the failure rates are based on actual plant data rather than existing estimates. 21 refs., 9 tabs.
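
    A common convention for turning plant event counts into failure rates in PRA data work is a Poisson model with a Jeffreys prior, which gives a posterior mean of (n + 0.5)/T for n failures in T component-hours. The sketch below is a generic illustration under that assumption, not the specific procedure used to build the NUCLARR/CREDO-based data base, and the counts are invented.

        # Generic per-hour failure-rate estimate from event counts, using a
        # Jeffreys prior for a Poisson process: posterior is Gamma(n + 0.5, T).
        from scipy import stats

        n_failures = 3             # hypothetical observed failures
        exposure_hours = 2.4e6     # hypothetical cumulative operating time

        posterior = stats.gamma(a=n_failures + 0.5, scale=1.0 / exposure_hours)
        lo, hi = posterior.ppf([0.05, 0.95])
        print(f"posterior mean failure rate = {posterior.mean():.2e} per hour")
        print(f"90% credible interval       = [{lo:.2e}, {hi:.2e}] per hour")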

  18. Consequence assessment of large rock slope failures in Norway

    NASA Astrophysics Data System (ADS)

    Oppikofer, Thierry; Hermanns, Reginald L.; Horton, Pascal; Sandøy, Gro; Roberts, Nicholas J.; Jaboyedoff, Michel; Böhme, Martina; Yugsi Molina, Freddy X.

    2014-05-01

    Steep glacially carved valleys and fjords in Norway are prone to many landslide types, including large rockslides, rockfalls, and debris flows. Large rockslides and their secondary effects (rockslide-triggered displacement waves, inundation behind landslide dams and outburst floods from failure of landslide dams) pose a significant hazard to the population living in the valleys and along the fjord shorelines. The Geological Survey of Norway performs systematic mapping of unstable rock slopes in Norway and has detected more than 230 unstable slopes with significant postglacial deformation. This large number necessitates prioritisation of follow-up activities, such as more detailed investigations, periodic displacement measurements, continuous monitoring and early-warning systems. Prioritisation is achieved through a hazard and risk classification system, which has been developed by a panel of international and Norwegian experts (www.ngu.no/en-gb/hm/Publications/Reports/2012/2012-029). The risk classification system combines a qualitative hazard assessment with a consequence assessment focusing on potential loss of life. The hazard assessment is based on a series of nine geomorphological, engineering geological and structural criteria, as well as displacement rates, past events and other signs of activity. We present a method for consequence assessment comprising four main steps: 1. computation of the volume of the unstable rock slope; 2. run-out assessment based on the volume-dependent angle of reach (Fahrböschung) or detailed numerical run-out modelling; 3. assessment of possible displacement wave propagation and run-up based on empirical relations or modelling in 2D or 3D; and 4. estimation of the number of persons exposed to rock avalanches or displacement waves. Volume computation of an unstable rock slope is based on the sloping local base level technique, which uses a digital elevation model to create a second-order curved surface between the mapped extent of
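
    Step 2 of this method, the run-out assessment from a volume-dependent angle of reach (Fahrböschung), can be sketched as below. The regression form tan(α) = 10^a · V^b follows Scheidegger-type empirical relations for rock avalanches, but the coefficients and the input volume and drop height are illustrative placeholders rather than the values used by the authors.

        import math

        def runout_length(volume_m3, drop_height_m, a=0.624, b=-0.157):
            """Horizontal run-out L = H / tan(alpha), with a volume-dependent
            angle of reach tan(alpha) = 10**a * V**b (illustrative coefficients)."""
            tan_alpha = 10.0**a * volume_m3**b
            return drop_height_m / tan_alpha

        V = 5.0e6    # hypothetical unstable rock volume, m^3
        H = 900.0    # hypothetical drop height, m
        alpha = math.degrees(math.atan(10.0**0.624 * V**-0.157))
        print(f"angle of reach ~ {alpha:.1f} degrees")
        print(f"estimated run-out length ~ {runout_length(V, H):.0f} m")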

  20. Assessing mechanical vulnerability in water distribution networks under multiple failures

    NASA Astrophysics Data System (ADS)

    Berardi, Luigi; Ugarelli, Rita; Røstum, Jon; Giustolisi, Orazio

    2014-03-01

    Understanding mechanical vulnerability of water distribution networks (WDN) is of direct relevance for water utilities since it serves two different purposes. On the one hand, it might support the identification of severe failure scenarios due to external causes (e.g., natural or intentional events) which result in the most critical consequences for WDN supply capacity. On the other hand, it aims to identify the WDN portions that are more prone to be affected by asset disruptions. The complexity of such an analysis stems from the number of possible scenarios with single and multiple simultaneous shutdowns of asset elements leading to modifications of network topology and insufficient water supply to customers. In this work, the search for the most disruptive combinations of multiple asset failure events is formulated and solved as a multiobjective optimization problem. The highest-vulnerability failure scenarios are identified as those causing the lowest supplied demand with the smallest number of simultaneous failures. The automatic detection of WDN topology, subsequent to the detachment of failed elements, is combined with pressure-driven analysis. The methodology is demonstrated on a real water distribution network. Results show that, besides the failures causing the detachment of reservoirs, tanks, or pumps, there are other topological modifications which may cause severe WDN service disruptions. Such information is of direct relevance to support the planning of asset enhancement works and to improve preparedness for extreme events.
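
    The search described above ranks combinations of simultaneous element failures by two competing objectives: how little demand can still be supplied and how few elements must fail to cause it. The brute-force sketch below illustrates that idea for a toy network; supplied_demand_fraction is a hypothetical stand-in for the pressure-driven hydraulic solver used in the paper, and the multiobjective search of the actual methodology is replaced here by exhaustive enumeration.

        from itertools import combinations

        PIPES = ["P1", "P2", "P3", "P4", "P5"]   # toy asset list

        def supplied_demand_fraction(closed):
            """Hypothetical stand-in for a pressure-driven network solver: fraction
            of customer demand still supplied after closing the given pipes."""
            penalty = {"P1": 0.40, "P2": 0.25, "P3": 0.10, "P4": 0.10, "P5": 0.05}
            return max(0.0, 1.0 - sum(penalty[p] for p in closed))

        # Enumerate failure scenarios with up to three simultaneous shutdowns.
        scenarios = [(k, supplied_demand_fraction(combo), combo)
                     for k in range(1, 4) for combo in combinations(PIPES, k)]

        # Pareto front: minimise both the number of failures and the supplied demand.
        def dominates(a, b):
            return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

        pareto = [s for s in scenarios if not any(dominates(o, s) for o in scenarios)]
        for k, q, combo in sorted(pareto):
            print(f"{k} simultaneous failure(s) {combo}: supplied demand = {q:.2f}")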

  1. Usefulness of the culturally adapted oxygen-cost diagram in the assessment of dyspnea in Puerto Rico

    PubMed Central

    Santos Rodríguez, Ruth A.; Dexter, Donald; Nieves-Plaza, Mariely; Nazario, Cruz M.

    2015-01-01

    Objective Breathlessness is a common and disabling symptom of pulmonary disease. Measuring its severity is recommended as such measurements can be helpful in both clinical and research settings. The oxygen-cost diagram (OCD) and the Medical Research Council (MRC) dyspnea scale were developed in English to measure severity of dyspnea. These scales were previously translated to Spanish and adapted for use in a Hispanic population. The objective of this study is to assess the psychometric properties of these scales. We hypothesized that the scales would correlate well with measures of physiological impairment. Methods Subjects with pulmonary disease rated their perceptions of dyspnea using the scales, performed a spirometry test, and completed a 6-min walk. Spearman correlation coefficients (r) were used to correlate dyspnea scores with spirometric parameters and distance walked (6MWD). Results Sixty-six patients with stable asthma (n = 36), chronic obstructive pulmonary disease (n = 19), or interstitial lung disease (n = 11) participated in the study. OCD scores showed a significant correlation with FEV1 (r = 0.41; p<0.01), FEV1% (r = 0.36; p<0.01), FVC (r = 0.44; p<0.01), and FVC% (r = 0.37; p<0.01) in the study population. The OCD scores were highly correlated with 6MWD (r = 0.59, p<0.01). The MRC dyspnea scale showed significant inverse correlation with FEV1 (r = −0.34; p<0.01) and 6MWD (r = −0.33; p<0.05), but the correlations were weaker compared to the correlations with the OCD scale. Conclusions The severity of breathlessness as measured by the adapted Spanish OCD showed a moderate to high correlation with spirometric parameters and 6MWD; therefore, the adapted OCD should prove to be useful in Puerto Rico. PMID:25856872

  2. Common-Cause Failure Treatment in Event Assessment: Basis for a Proposed New Model

    SciTech Connect

    Dana Kelly; Song-Hua Shen; Gary DeMoss; Kevin Coyne; Don Marksberry

    2010-06-01

    Event assessment is an application of probabilistic risk assessment in which observed equipment failures and outages are mapped into the risk model to obtain a numerical estimate of the event’s risk significance. In this paper, we focus on retrospective assessments to estimate the risk significance of degraded conditions such as equipment failure accompanied by a deficiency in a process such as maintenance practices. In modeling such events, the basic events in the risk model that are associated with observed failures and other off-normal situations are typically configured to be failed, while those associated with observed successes and unchallenged components are assumed capable of failing, typically with their baseline probabilities. This is referred to as the failure memory approach to event assessment. The conditioning of common-cause failure probabilities for the common cause component group associated with the observed component failure is particularly important, as it is insufficient to simply leave these probabilities at their baseline values, and doing so may result in a significant underestimate of risk significance for the event. Past work in this area has focused on the mathematics of the adjustment. In this paper, we review the Basic Parameter Model for common-cause failure, which underlies most current risk modelling, discuss the limitations of this model with respect to event assessment, and introduce a proposed new framework for common-cause failure, which uses a Bayesian network to model underlying causes of failure, and which has the potential to overcome the limitations of the Basic Parameter Model with respect to event assessment.
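
    The warning that leaving common-cause probabilities at their baseline values understates the risk can be illustrated with the Basic Parameter Model for a two-component redundant group: if Q1 is the independent failure probability of one component and Q2 the probability that both fail from a shared cause, observing component A failed raises the probability that B is also failed from about Q1 + Q2 to (Q1^2 + Q2)/(Q1 + Q2). The numbers below are illustrative only, not plant data.

        # Basic Parameter Model, two-component common-cause group
        # (rare-event approximation).
        Q1 = 1.0e-3   # independent failure probability of a single component
        Q2 = 5.0e-5   # common-cause failure probability (both components together)

        p_fail_baseline = Q1 + Q2             # P(B failed), no evidence about A
        p_both          = Q1 * Q1 + Q2        # P(A and B both failed)
        p_fail_given_A  = p_both / (Q1 + Q2)  # P(B failed | A observed failed)

        print(f"baseline    P(B failed)            = {p_fail_baseline:.2e}")
        print(f"conditional P(B failed | A failed) = {p_fail_given_A:.2e}")
        print(f"increase factor                    = {p_fail_given_A / p_fail_baseline:.0f}x")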

  3. Proof test diagrams for Zerodur glass-ceramic

    NASA Technical Reports Server (NTRS)

    Tucker, D. S.

    1991-01-01

    Proof test diagrams for Zerodur glass-ceramics are calculated from available fracture mechanics data. It is shown that the environment has a large effect on minimum time-to-failure as predicted by proof test diagrams.
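
    Proof test diagrams of this kind are usually built from the slow-crack-growth relation for the minimum time to failure guaranteed by a proof test, t_min ≈ B·σ_p^(n−2)·σ_a^(−n), where σ_p is the proof stress, σ_a the service stress, and n and B are crack-growth parameters that depend strongly on the environment (which is why the environment has such a large effect on the predicted minimum lifetime). The sketch below uses that relation with purely illustrative parameter values, not the published Zerodur fracture mechanics data.

        def t_min(sigma_proof_MPa, sigma_service_MPa, n=30.0, B=2.0e-3):
            """Minimum time to failure after a proof test,
            t_min = B * sigma_p**(n-2) / sigma_a**n.
            n and B are illustrative placeholders; real values come from
            fracture mechanics data for the relevant environment."""
            return B * sigma_proof_MPa**(n - 2) / sigma_service_MPa**n

        for sigma_a in (10.0, 15.0, 20.0):          # service stresses, MPa
            t = t_min(2.0 * sigma_a, sigma_a)       # proof ratio of 2 assumed
            print(f"sigma_a = {sigma_a:4.1f} MPa -> guaranteed t_min = {t:.2e} s")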

  4. A Probabilistic-Micro-mechanical Methodology for Assessing Zirconium Alloy Cladding Failure

    SciTech Connect

    Pan, Y.M.; Chan, K.S.; Riha, D.S.

    2007-07-01

    Cladding failure of fuel rods caused by hydride-induced embrittlement is a reliability concern for spent nuclear fuel after extended burnup. Uncertainties in the cladding temperature, cladding stress, oxide layer thickness, and the critical stress value for hydride reorientation preclude an assessment of the cladding failure risk. A set of micro-mechanical models for treating oxide cracking, blister cracking, delayed hydride cracking, and cladding fracture was developed and incorporated in a computer model. Results obtained from the preliminary model calculations indicate that at temperatures below a critical temperature of 318.5 deg. C [605.3 deg. F], the time to failure by delayed hydride cracking in Zr-2.5%Nb decreased with increasing cladding temperature. The overall goal of this project is to develop a probabilistic-micro-mechanical methodology for assessing the probability of hydride-induced failure in Zircaloy cladding and thereby establish performance criteria. (authors)

  5. Probabilistic assessment of failure in adhesively bonded composite laminates

    SciTech Connect

    Minnetyan, L.; Chamis, C.C.

    1997-07-01

    Damage initiation and progressive fracture of adhesively bonded graphite/epoxy composites is investigated under tensile loading. A computer code is utilized for the simulation of composite structural damage and fracture. Structural response is assessed probabilistically during degradation. The effects of design variable uncertainties on structural damage progression are quantified. The Fast Probability Integrator is used to assess the response scatter in the composite structure at damage initiation. Sensitivity of the damage response to design variables is computed. Methods are general purpose in nature and are applicable to all types of laminated composite structures and joints, starting from damage initiation to unstable damage propagation and collapse. Results indicate that composite constituent and adhesive properties have a significant effect on structural durability. Damage initiation/progression does not necessarily begin in the adhesive bond. Design implications with regard to damage tolerance of adhesively bonded joints are examined.

  6. Lumbar Transpedicular Implant Failure: A Clinical and Surgical Challenge and Its Radiological Assessment

    PubMed Central

    Ali, Abdel Mohsen Arafa

    2014-01-01

    Study Design This is a multicenter, controlled case-study review of a large series of pedicle-screw procedures performed from January 2000 to June 2010. The outcomes were compared to those of patients with no implant failure. Purpose The purpose of this study was to review retrospectively the outcome of 100 patients with implant failure in comparison to 100 control-patients, and to study the causes of failure and its prevention. Overview of Literature Transpedicular fixation is associated with risks of hardware failure, such as screw/rod breakage and/or loosening at the screw-rod interface and difficulties in the system assembly, which remain a significant clinical problem. Removal or revision of the spinal hardware is often required. Methods Two hundred patients (88 women, 112 men) were divided into 2 major groups, with 100 patients in group I (implant failure group G1) and 100 patients in group II (successful fusion, control group G2). We subdivided the study groups into two subgroups: subgroup a (single-level instrumented group) and subgroup b (multilevel instrumented group). The implant status was assessed based on intraoperative and follow-up radiographs. Results Implant failure in general was present in 36% in G1a, and in 64% in G1b, and types of implant failure included screw fracture (34%), rod fracture (24%), rod loosening (22%), screw loosening (16%), and failure of both rod and screw (4%). Most of the failures (90%) occurred within 6 months after surgery, with no reported cases 1 year postoperatively. Conclusions We tried to address the problem and study the causes of failure, and proposed solutions for its prevention. PMID:24967042

  7. Beyond ejection fraction: an integrative approach for assessment of cardiac structure and function in heart failure.

    PubMed

    Cikes, Maja; Solomon, Scott D

    2016-06-01

    Left ventricular ejection fraction (LVEF) has been the central parameter used for diagnosis and management in patients with heart failure. A good predictor of adverse outcomes in heart failure when below ∼45%, LVEF is less useful as a marker of risk as it approaches normal. As a measure of cardiac function, ejection fraction has several important limitations. Calculated as the stroke volume divided by end-diastolic volume, the estimation of ejection fraction is generally based on geometric assumptions that allow for assessment of volumes based on linear or two-dimensional measurements. Left ventricular ejection fraction is both preload- and afterload-dependent, can change substantially based on loading conditions, is only moderately reproducible, and represents only a single measure of risk in patients with heart failure. Moreover, the relationship between ejection fraction and risk in patients with heart failure is modified by factors such as hypertension, diabetes, and renal function. A more complete evaluation and understanding of left ventricular function in patients with heart failure requires a more comprehensive assessment: we conceptualize an integrative approach that incorporates measures of left and right ventricular function, left ventricular geometry, left atrial size, and valvular function, as well as non-imaging factors (such as clinical parameters and biomarkers), providing a comprehensive and accurate prediction of risk in heart failure. PMID:26417058
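
    For reference, the geometric definition mentioned above can be written out directly; the volumes below are illustrative values in millilitres, not patient data.

        # Ejection fraction as defined in the abstract: stroke volume divided by
        # end-diastolic volume. Volumes are illustrative.
        edv, esv = 120.0, 70.0            # end-diastolic / end-systolic volume (mL)
        lvef = (edv - esv) / edv * 100.0  # stroke volume / EDV, in percent
        print(f"LVEF = {lvef:.0f} %")     # ~42 %, i.e. below the ~45 % threshold noted above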

  8. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  9. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  10. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  11. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  12. 26 CFR 301.6685-1 - Assessable penalties with respect to private foundations' failure to comply with section 6104(d).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... foundations' failure to comply with section 6104(d). 301.6685-1 Section 301.6685-1 Internal Revenue INTERNAL... Additional Amounts § 301.6685-1 Assessable penalties with respect to private foundations' failure to comply... private foundations' annual returns, and who fails so to comply, if such failure is willful, shall pay...

  13. ANALYSIS OF SEQUENTIAL FAILURES FOR ASSESSMENT OF RELIABILITY AND SAFETY OF MANUFACTURING SYSTEMS. (R828541)

    EPA Science Inventory

    Assessment of reliability and safety of a manufacturing system with sequential failures is an important issue in industry, since the reliability and safety of the system depend not only on all failed states of system components, but also on the sequence of occurrences of those...

  14. VALIDATION OF PROTOCOLS FOR ASSESSING EARLY PREGNANCY FAILURE IN THE RAT: CLOMIPHENE CITRATE

    EPA Science Inventory

    Following the assembly of a battery of protocols for the assessment of maternally-mediated toxicity during early pregnancy, the validation of this battery for its utility in detecting and defining mechanisms of early pregnancy failure is ongoing. This report describes the use of c...

  15. Application of ISO22000 and Failure Mode and Effect Analysis (FMEA) for Industrial Processing of Poultry Products

    NASA Astrophysics Data System (ADS)

    Varzakas, Theodoros H.; Arvanitoyannis, Ioannis S.

    The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of poultry slaughtering and manufacturing. In this work, a comparison of ISO22000 analysis with HACCP is carried out for poultry slaughtering, processing, and packaging. Critical Control Points and Prerequisite Programs (PrPs) have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram).

  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
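
    The step of modifying an analytically derived failure-probability distribution to reflect test or flight experience can be illustrated, in a greatly simplified form, by a conjugate Bayesian update: a Beta prior on the per-flight failure probability (standing in for the engineering-analysis result) updated with observed successes and failures. This is only a generic illustration under those assumptions, not the statistical structure documented for PFA, and all numbers are invented.

        from scipy import stats

        # Prior on the per-flight failure probability, standing in for the
        # distribution produced by the engineering failure-phenomenon model.
        prior_a, prior_b = 0.5, 200.0
        prior = stats.beta(prior_a, prior_b)

        # Hypothetical operating experience: 30 flights/tests, no failures observed.
        failures, successes = 0, 30
        posterior = stats.beta(prior_a + failures, prior_b + successes)

        print(f"prior mean failure probability     = {prior.mean():.2e}")
        print(f"posterior mean failure probability = {posterior.mean():.2e}")
        print(f"posterior 95th percentile          = {posterior.ppf(0.95):.2e}")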

  17. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  18. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Astrophysics Data System (ADS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-06-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  19. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Astrophysics Data System (ADS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-06-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  20. Performance improvement through proactive risk assessment: Using failure modes and effects analysis

    PubMed Central

    Yarmohammadian, Mohammad Hossein; Abadi, Tahereh Naseri Boori; Tofighi, Shahram; Esfahani, Sekine Saghaeiannejad

    2014-01-01

    Introduction: Awareness of error-prone professional activities has a great impact on the continuity of organizations in a competitive environment, particularly in the health care industry, where every second is critical to saving patients' lives. Considering the invaluable functions of the medical record department, both as a legal document and for the continuity of health care, failure mode and effects analysis (FMEA) was utilized to identify the ways a process can fail and how it can be made safer. Materials and Methods: The structured approach involved assembling a team of experts, employing a trained facilitator, introducing the rating scales and process during team orientation, and collectively scoring failure modes. The probability of each failure-effect combination was related to the frequency of occurrence, potential severity, and likelihood of detection before causing any harm to the staff or patients. Frequency, severity, and detectability were each given a score from 1 to 10, and risk priority numbers were calculated. Results: In total, 56 failure modes were identified across the subunits of the medical record department, including the admission unit (emergency, outpatient, and inpatient classes), statistics, health data organization and data processing, and medical coding units. Although most failure modes were classified as high risk, limited resources were an impediment to implementing all recommended actions at the same time. Conclusion: Proactive risk assessment methods such as FMEA enable health care administrators to identify where and what safeguards are needed to protect against a bad outcome even when an error does occur. PMID:25013821
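
    The scoring step described above (occurrence, severity, and detectability each rated from 1 to 10 and combined into a risk priority number) can be sketched as follows; the failure modes and ratings are invented examples, not those identified in the study.

        # Risk priority number (RPN) = occurrence x severity x detectability,
        # each rated 1-10 by the FMEA team. Entries are invented examples.
        failure_modes = [
            # (description,                      occurrence, severity, detectability)
            ("record misfiled at admission",              6,        7,             5),
            ("incomplete discharge summary",              4,        8,             6),
            ("wrong code assigned in medical coding",     3,        9,             7),
        ]

        ranked = sorted(((o * s * d, name) for name, o, s, d in failure_modes),
                        reverse=True)
        for rpn, name in ranked:
            print(f"RPN {rpn:4d}  {name}")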

  1. Assessing performance and validating finite element simulations using probabilistic knowledge

    SciTech Connect

    Dolin, Ronald M.; Rodriguez, E. A.

    2002-01-01

    Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability each event causes failure along with the event's likelihood of occurrence contribute to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
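
    The stochastic sampling approach described above can be sketched with Latin hypercube sampling over uncertain inputs and a simple limit-state function; the limit state and input distributions here are arbitrary placeholders, not the finite element response or influence diagram used in the report.

        import numpy as np
        from scipy.stats import norm, qmc

        # Latin hypercube sample of two uncertain inputs (load L and capacity C).
        sampler = qmc.LatinHypercube(d=2, seed=0)
        u = sampler.random(n=10_000)                          # uniform samples in [0, 1)^2

        load     = norm(loc=100.0, scale=15.0).ppf(u[:, 0])   # illustrative distributions
        capacity = norm(loc=150.0, scale=20.0).ppf(u[:, 1])

        # Limit state g = C - L; failure when g < 0.
        p_failure = np.mean(capacity - load < 0.0)
        print(f"estimated probability of failure = {p_failure:.4f}")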

  2. The assessment of low probability containment failure modes using dynamic PRA

    NASA Astrophysics Data System (ADS)

    Brunett, Acacia Joann

    Although low probability containment failure modes in nuclear power plants may lead to large releases of radioactive material, these modes are typically crudely modeled in system level codes and have large associated uncertainties. Conventional risk assessment techniques (i.e. the fault-tree/event-tree methodology) are capable of accounting for these failure modes to some degree, however, they require the analyst to pre-specify the ordering of events, which can vary within the range of uncertainty of the phenomena. More recently, dynamic probabilistic risk assessment (DPRA) techniques have been developed which remove the dependency on the analyst. Through DPRA, it is now possible to perform a mechanistic and consistent analysis of low probability phenomena, with the timing of the possible events determined by the computational model simulating the reactor behavior. The purpose of this work is to utilize DPRA tools to assess low probability containment failure modes and the driving mechanisms. Particular focus is given to the risk-dominant containment failure modes considered in NUREG-1150, which has long been the standard for PRA techniques. More specifically, this work focuses on the low probability phenomena occurring during a station blackout (SBO) with late power recovery in the Zion Nuclear Power Plant, a Westinghouse pressurized water reactor (PWR). Subsequent to the major risk study performed in NUREG-1150, significant experimentation and modeling regarding the mechanisms driving containment failure modes have been performed. In light of this improved understanding, NUREG-1150 containment failure modes are reviewed in this work using the current state of knowledge. For some unresolved mechanisms, such as containment loading from high pressure melt ejection and combustion events, additional analyses are performed using the accident simulation tool MELCOR to explore the bounding containment loads for realistic scenarios. A dynamic treatment in the

  3. Preliminary Master Logic Diagram for ITER operation

    SciTech Connect

    Cadwallader, L.C.; Taylor, N.P.; Poucet, A.E.

    1998-04-01

    This paper describes the work performed to develop a Master Logic Diagram (MLD) for the operations phase of the International Thermonuclear Experimental Reactor (ITER). The MLD is a probabilistic risk assessment tool used to identify the broad set of potential initiating events that could lead to an offsite radioactive or toxic chemical release from the facility under study. The MLD described here is complementary to the failure modes and effects analyses (FMEAs) that have been performed for ITER's major plant systems in the engineering evaluation of the facility design. While the FMEAs are a bottom-up or component level approach, the MLD is a top-down or facility level approach to identifying the broad spectrum of potential events. Strengths of the MLD are that it analyzes the entire plant, depicts completeness in the accident initiator process, provides an independent method for identification, and can also identify potential system interactions. MLDs have been used successfully as a hazard analysis tool. This paper describes the process used for the ITER MLD to treat the variety of radiological and toxicological source terms present in the ITER design. One subtree of the nineteen-page MLD is shown to illustrate the levels of the diagram.

  4. Phase Equilibria Diagrams Database

    National Institute of Standards and Technology Data Gateway

    SRD 31 NIST/ACerS Phase Equilibria Diagrams Database (PC database for purchase)   The Phase Equilibria Diagrams Database contains commentaries and more than 21,000 diagrams for non-organic systems, including those published in all 21 hard-copy volumes produced as part of the ACerS-NIST Phase Equilibria Diagrams Program (formerly titled Phase Diagrams for Ceramists): Volumes I through XIV (blue books); Annuals 91, 92, 93; High Tc Superconductors I & II; Zirconium & Zirconia Systems; and Electronic Ceramics I. Materials covered include oxides as well as non-oxide systems such as chalcogenides and pnictides, phosphates, salt systems, and mixed systems of these classes.

  5. Endothelial dysfunction as assessed with magnetic resonance imaging - A major determinant in chronic heart failure.

    PubMed

    Kovačić, Slavica; Plazonić, Željko; Batinac, Tanja; Miletić, Damir; Ružić, Alen

    2016-05-01

    Chronic heart failure (CHF) is a clinical syndrome resulting from the interaction of different structural and functional disturbances that decrease the heart's ability to ensure an adequate supply of oxygenated blood to the tissues and to meet metabolic needs under normal or increased afterload. Endothelial dysfunction (ED) is a pathological condition characterized by a general imbalance of all major endothelial mechanisms, with a key role in the development and progression of atherosclerotic disease. ED has been associated with most cardiovascular risk factors. There is increasing interest in assessing endothelial function non-invasively, leading to the development and evaluation of new diagnostic methods. We suggest that MRI is a safe and reliable test that offers important advantages over ultrasound for the detection of ED and for monitoring the expected therapeutic effect. We believe that ED plays a pivotal role in chronic heart failure development and progression, regardless of its etiology, and that MRI should be introduced as a "gold standard" in the diagnostic procedure and treatment. PMID:27063091

  6. Noninvasive radiographic assessment of cardiovascular function in acute and chronic respiratory failure

    SciTech Connect

    Berger, H.J.; Matthay, R.A.

    1981-04-01

    Noninvasive radiographic techniques have provided a means of studying the natural history and pathogenesis of cardiovascular performance in acute and chronic respiratory failure. Chest radiography, radionuclide angiocardiography and thallium-201 imaging, and M mode and cross-sectional echocardiography have been employed. Each of these techniques has specific uses, attributes and limitations. For example, measurement of descending pulmonary arterial diameters on the plain chest radiograph allows determination of the presence or absence of pulmonary arterial hypertension. Right and left ventricular performance can be evaluated at rest and during exercise using radionuclide angiocardiography. The biventricular response to exercise and to therapeutic interventions also can be assessed with this approach. Evaluation of the pulmonary valve echogram and echocardiographic right ventricular dimensions have been shown to reflect right ventricular hemodynamics and size. Each of these noninvasive techniques has been applied to the study of patients with respiratory failure and has provided important physiologic data.

  7. Assessing Safety in Distillation Column Using Dynamic Simulation and Failure Mode and Effect Analysis (FMEA)

    NASA Astrophysics Data System (ADS)

    Werner, Suhendra; Fred, Witt; Compart

    Safety assessment has become an important activity in the chemical industries, driven by the need to comply with general legal requirements as well as to achieve safer plants and profitability. This paper reviews some of the most frequent causes of distillation column malfunction. First, an analysis of case histories is discussed to provide guidelines for identifying potential trouble spots in distillation columns. A dynamic simulation of operational failures is then performed as the basis for assessing the consequences. A case study of a side-stream distillation column is used to show the implementation of the concept. A framework for assessing safety in the column is proposed using Failure Mode and Effect Analysis (FMEA). Further, measures for trouble-free operation that reduce the risk associated with column malfunction are described.

  8. Factors affecting nurses' intent to assess for depression in heart failure patients.

    PubMed

    Lea, Patricia

    2014-01-01

    The association between depression and cardiovascular disease has been well established and has been shown to decrease patients' quality of life and increase the risk of mortality, frequency and duration of hospitalization, and health care costs. The inpatient setting provides a potentially valuable opportunity to assess and treat depression among patients with acute cardiac illness, allowing for daily monitoring of treatment side effects. Although systematic depression screening appears to be feasible, efficient, and well accepted on inpatient cardiac units, the current lack of consistent inpatient assessment for depression in heart failure patients suggests the presence of barriers influencing the effective diagnosis and treatment of depression among inpatients with heart failure. The theory of planned behavior describes the cognitive mechanism by which behavioral intent is formed, giving some insight into how nurses' attitudes and beliefs affect their performance of routine depression screening. In addition, application of this cognitive theory suggests that nurses may be influenced to adopt more positive attitudes and beliefs about depression through educational intervention, leading to greater likelihood of routine assessment for depression, ultimately leading to more timely diagnosis and treatment and improved patient outcomes. PMID:25280199

  9. Gravity wave transmission diagram

    NASA Astrophysics Data System (ADS)

    Tomikawa, Yoshihiro

    2016-07-01

    A possibility of gravity wave propagation from a source region to the airglow layer around the mesopause has been discussed based on the gravity wave blocking diagram taking into account the critical level filtering alone. This paper proposes a new gravity wave transmission diagram in which both the critical level filtering and turning level reflection of gravity waves are considered. It shows a significantly different distribution of gravity wave transmissivity from the blocking diagram.
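
    The two filters combined in the proposed transmission diagram can be sketched for a single wave and a prescribed background wind profile: critical-level filtering where the wind equals the horizontal phase speed, and reflection where the vertical wavenumber squared becomes negative. The dispersion relation used below is the simplified form m² = N²/(c − U)² − k², which neglects wind curvature and density scale-height terms, and the profile and wave parameters are illustrative only.

        import numpy as np

        N = 0.02                  # buoyancy frequency (1/s), illustrative
        c = 20.0                  # horizontal phase speed of the wave (m/s)
        k = 2 * np.pi / 50e3      # horizontal wavenumber (1/m), 50 km wavelength

        z = np.linspace(0.0, 100e3, 201)                    # altitude (m)
        U = 10.0 + 25.0 * np.sin(2 * np.pi * z / 80e3)      # illustrative wind profile (m/s)

        with np.errstate(divide="ignore"):
            m2 = N**2 / (c - U)**2 - k**2                   # simplified dispersion relation

        critical   = np.isclose(U, c, atol=0.5)             # critical-level filtering
        evanescent = m2 < 0.0                               # turning-level reflection

        transmitted = not (critical.any() or evanescent.any())
        print("wave transmitted to the airglow layer:", transmitted)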

  10. Sample handler for x-ray tomographic microscopy and image-guided failure assessment

    SciTech Connect

    Wyss, Peter; Thurner, Philipp; Broennimann, Rolf; Sennhauser, Urs; Stampanoni, Marco; Abela, Rafael; Mueller, Ralph

    2005-07-15

    X-ray tomographic microscopy (XTM) yields a three-dimensional data model of an investigated specimen. XTM providing micrometer resolution requires synchrotron light, high resolution area detectors, and a precise sample handler. The sample handler has a height of 270 mm only, is usable for 1 µm resolution, and is able to carry loading machines with a weight of up to 20 kg. This allows exposing samples to load between scans for image-guided failure assessment. This system has been used in the XTM end station of the materials science beamline of the Swiss Light Source at the Paul Scherrer Institut.

  11. Risk assessment for Industrial Control Systems quantifying availability using mean failure cost (MFC)

    DOE PAGESBeta

    Chen, Qian; Abercrombie, Robert K; Sheldon, Frederick T.

    2015-09-23

    Industrial Control Systems (ICS) are commonly used in industries such as oil and natural gas, transportation, electric, water and wastewater, chemical, pharmaceutical, pulp and paper, food and beverage, as well as discrete manufacturing (e.g., automotive, aerospace, and durable goods.) SCADA systems are generally used to control dispersed assets using centralized data acquisition and supervisory control. Originally, ICS implementations were susceptible primarily to local threats because most of their components were located in physically secure areas (i.e., ICS components were not connected to IT networks or systems). The trend toward integrating ICS systems with IT networks (e.g., efficiency and the Internet of Things) provides significantly less isolation for ICS from the outside world thus creating greater risk due to external threats. Albeit, the availability of ICS/SCADA systems is critical to assuring safety, security and profitability. Such systems form the backbone of our national cyber-physical infrastructure. We extend the concept of mean failure cost (MFC) to address quantifying availability to harmonize well with ICS security risk assessment. This new measure is based on the classic formulation of Availability combined with Mean Failure Cost (MFC). The metric offers a computational basis to estimate the availability of a system in terms of the loss that each stakeholder stands to sustain as a result of security violations or breakdowns (e.g., deliberate malicious failures).

  12. Risk assessment for Industrial Control Systems quantifying availability using mean failure cost (MFC)

    SciTech Connect

    Chen, Qian; Abercrombie, Robert K; Sheldon, Frederick T.

    2015-09-23

    Industrial Control Systems (ICS) are commonly used in industries such as oil and natural gas, transportation, electric, water and wastewater, chemical, pharmaceutical, pulp and paper, food and beverage, as well as discrete manufacturing (e.g., automotive, aerospace, and durable goods.) SCADA systems are generally used to control dispersed assets using centralized data acquisition and supervisory control. Originally, ICS implementations were susceptible primarily to local threats because most of their components were located in physically secure areas (i.e., ICS components were not connected to IT networks or systems). The trend toward integrating ICS systems with IT networks (e.g., efficiency and the Internet of Things) provides significantly less isolation for ICS from the outside world thus creating greater risk due to external threats. Albeit, the availability of ICS/SCADA systems is critical to assuring safety, security and profitability. Such systems form the backbone of our national cyber-physical infrastructure. We extend the concept of mean failure cost (MFC) to address quantifying availability to harmonize well with ICS security risk assessment. This new measure is based on the classic formulation of Availability combined with Mean Failure Cost (MFC). The metric offers a computational basis to estimate the availability of a system in terms of the loss that each stakeholder stands to sustain as a result of security violations or breakdowns (e.g., deliberate malicious failures).
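
    The idea of expressing risk as the loss each stakeholder stands to sustain can be sketched, in simplified form, as an expected loss: each stakeholder attaches a stake to each security or availability requirement, and each requirement has some probability of being violated over the period of interest. The toy matrices below are invented for illustration and do not reproduce the full chain of stakes, dependency, impact, and threat matrices used in the authors' MFC formulation.

        import numpy as np

        stakeholders = ["plant operator", "utility customer", "regulator"]
        requirements = ["availability", "integrity", "confidentiality"]

        # Stakes matrix: cost ($ per hour of violation) each stakeholder attaches
        # to each requirement. Values are invented.
        stakes = np.array([[500.0, 300.0, 100.0],
                           [ 50.0,  20.0,   5.0],
                           [ 10.0, 200.0, 150.0]])

        # Probability that each requirement is violated during one hour of
        # operation (invented values).
        p_violation = np.array([1e-3, 5e-4, 2e-4])

        mfc = stakes @ p_violation      # expected loss per stakeholder, $ per hour
        for name, cost in zip(stakeholders, mfc):
            print(f"{name:18s} mean failure cost = ${cost:.3f} per hour")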

  13. Assessment of Three Finite Element Approaches for Modeling the Ballistic Impact Failure of Metal Plates

    NASA Astrophysics Data System (ADS)

    Mansur, Ali; Nganbe, Michel

    2015-03-01

    The ballistic impact was numerically modeled for AISI 450 steel struck by a 17.3 g ogive nose WC-Co projectile using Abaqus/Explicit. The model was validated using experimental results and data for different projectiles and metal targets. The Abaqus ductile-shear, local principal strain to fracture, and absorbed strain energy at failure criteria were investigated. Due to the highly dynamic nature of ballistic impacts, the absorbed strain energy approach posed serious challenges in estimating the effective deformation volume and yielded the largest critical plate thicknesses for through-thickness penetration (failure). In contrast, the principal strain criterion yielded the lowest critical thicknesses and provided the best agreement with experimental ballistic test data with errors between 0 and 30%. This better accuracy was due to early failure definition when the very first mesh at the target back side reached the strain to fracture, which compensated for the overall model overestimation. The ductile-shear criterion yielded intermediate results between those of the two comparative approaches. In contrast to the ductile-shear criterion, the principal strain criterion requires only basic data readily available for practically all materials. Therefore, it is a viable alternative for an initial assessment of the ballistic performance and pre-screening of a large number of new candidate materials as well as for supporting the development of novel armor systems.

  14. Assessing hospital readmission risk factors in heart failure patients enrolled in a telemonitoring program.

    PubMed

    Zai, Adrian H; Ronquillo, Jeremiah G; Nieves, Regina; Chueh, Henry C; Kvedar, Joseph C; Jethwani, Kamal

    2013-01-01

    The purpose of this study was to validate a previously developed heart failure readmission predictive algorithm based on psychosocial factors, develop a new model based on patient-reported symptoms from a telemonitoring program, and assess the impact of weight fluctuations and other factors on hospital readmission. Clinical, demographic, and telemonitoring data was collected from 100 patients enrolled in the Partners Connected Cardiac Care Program between July 2008 and November 2011. 38% of study participants were readmitted to the hospital within 30 days. Ten different heart-failure-related symptoms were reported 17,389 times, with the top three contributing approximately 50% of the volume. The psychosocial readmission model yielded an AUC of 0.67, along with sensitivity 0.87, specificity 0.32, positive predictive value 0.44, and negative predictive value 0.8 at a cutoff value of 0.30. In summary, hospital readmission models based on psychosocial characteristics, standardized changes in weight, or patient-reported symptoms can be developed and validated in heart failure patients participating in an institutional telemonitoring program. However, more robust models will need to be developed that use a comprehensive set of factors in order to have a significant impact on population health. PMID:23710170

  15. Causes of death in fulminant hepatic failure and relationship to quantitative histological assessment of parenchymal damage.

    PubMed

    Gazzard, B G; Portmann, B; Murray-Lyon, I M; Williams, R

    1975-10-01

    The clinical course and causes of death in 132 consecutive patients with fulminant hepatic failure and grade III or IV encephalopathy have been reviewed. 105 patients died and in 96 of these an autopsy examination was performed. In 36 patients there was cerebral oedema, and the mean age of this group was significantly lower than that of the other fatal cases. In 28 patients death was attributed to major haemorrhage, which originated in the gastrointestinal tract in 25. The prothrombin time ratio was not significantly greater in patients with major bleeding than in those without, but they did have a significantly lower platelet count. Sepsis contributed to death in 12 patients. In 25 patients massive hepatic necrosis only was found at autopsy and death was considered to be due solely to hepatic failure. The degree of hepatocyte loss was assessed in 80 fatal cases by a histological morphometric technique on a needle specimen of liver taken immediately post-mortem. The proportion of the liver volume occupied by hepatocytes (hepatocyte volume fraction, HVF) was greatly reduced in all patients (normal 85 +/- 5 percent, SD) but the mean value was significantly higher in the patients dying with sepsis, cerebral oedema or haemorrhage than in the group in whom death was attributed solely to hepatic failure. There were ten patients in whom liver function was improving at the time of death, which was due to cerebral oedema (9) or haemorrhage (1). These observations suggest that many patients presently dying from fulminant hepatic failure may be expected to survive, once more effective therapy is available for the complications of the illness. PMID:172938

  17. Hertzsprung-Russell Diagram

    NASA Astrophysics Data System (ADS)

    Chiosi, C.; Murdin, P.

    2000-11-01

    The Hertzsprung-Russell diagram (HR-diagram), pioneered independently by EJNAR HERTZSPRUNG and HENRY NORRIS RUSSELL, is a plot of the star luminosity versus the surface temperature. It stems from the basic relation for an object emitting thermal radiation as a black body: ...
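
    The truncated "basic relation" referred to above is presumably the Stefan-Boltzmann luminosity relation for a black body,

        \[
          L \;=\; 4\pi R^{2}\,\sigma T_{\mathrm{eff}}^{4},
        \]

    which links the two axes of the diagram through the stellar radius R, so that loci of constant radius appear as straight diagonal lines in the log L versus log T_eff plane.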

  18. Strategic environmental assessment can help solve environmental impact assessment failures in developing countries

    SciTech Connect

    Alshuwaikhat, Habib M. . E-mail: habibms@kfupm.edu.sa

    2005-05-15

    The current trend of industrialization and urbanization in developing nations has a huge impact on anthropogenic and natural ecosystems. Pollution sources increase with the expansion of cities and cause contamination of water, air and soil. The absence of urban environmental planning and management strategies has resulted in greater concern for future urban development. This paper advocates the adoption of strategic environmental assessment (SEA) as a means to achieve sustainable development in developing countries. It investigates project-level environmental impact assessment (EIA) and its limitations. The exploration of SEA and its features are addressed. The effective implementation of SEA can create a roadmap for sustainable development. In many developing countries, the lack of transparency and accountability and ineffective public participation in the development of the policy, plan and program (PPP) would be mitigated by the SEA process. Moreover, the proactive and broadly based characteristics of SEA would benefit the institutional development of the PPP process, which is rarely experienced in many developing countries. The paper also explores the prospects for SEA and its guiding principles in developing countries. Finally, the paper calls for a coordinated effort between all government, nongovernment and international organizations involved with PPPs to enable developing countries to pursue a path of sustainable development through the development and application of strategic environmental assessment.

  19. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    SciTech Connect

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-18

    The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.

  20. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-01

    The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
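
    The "no flaw indication" scenario mentioned in the abstract has a standard probabilistic treatment: the posterior density of the true flaw size a is proportional to the prior density weighted by the probability that such a flaw would have been missed, f(a | no indication) ∝ f(a)·(1 − POD(a)). The sketch below implements that relation with an assumed POD curve and prior; all parameter values are illustrative, not those of the rotor inspection analysed in the paper.

        import numpy as np

        a = np.linspace(0.01, 10.0, 2000)      # flaw size grid (mm)
        da = a[1] - a[0]

        # Assumed prior flaw-size density (exponential, mean 1 mm) and POD curve
        # (log-logistic shape, 50% detection at 1.5 mm); both purely illustrative.
        prior = np.exp(-a / 1.0)
        prior /= prior.sum() * da
        pod = 1.0 / (1.0 + (1.5 / a) ** 4.0)

        # Posterior density of the true flaw size given that the ultrasonic
        # inspection produced no flaw indication.
        unnorm = prior * (1.0 - pod)
        posterior = unnorm / (unnorm.sum() * da)

        print(f"mean flaw size: prior {np.sum(a * prior) * da:.2f} mm "
              f"-> posterior {np.sum(a * posterior) * da:.2f} mm")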

  1. Noninvasive assessment of right and left ventricular function in acute and chronic respiratory failure

    SciTech Connect

    Matthay, R.A.; Berger, H.J.

    1983-05-01

    This review evaluates noninvasive techniques for assessing cardiovascular performance in acute and chronic respiratory failure. Radiographic, radionuclide, and echocardiographic methods for determining ventricular volumes, right (RV) and left ventricular (LV) ejection fractions, and pulmonary artery pressure (PAP) are emphasized. These methods include plain chest radiography, radionuclide angiocardiography, thallium-201 myocardial imaging, and M-mode and 2-dimensional echocardiography, which have recently been applied in patients to detect pulmonary artery hypertension (PAH), right ventricular enlargement, and occult ventricular performance abnormalities at rest or during exercise. Moreover, radionuclide angiocardiography has proven useful, in combination with hemodynamic measurements, for evaluating the short- and long-term cardiovascular effects of therapeutic agents such as oxygen, digitalis, theophylline, beta-adrenergic agents, and vasodilators.

  2. Modeling of Electrical Cable Failure in a Dynamic Assessment of Fire Risk

    NASA Astrophysics Data System (ADS)

    Bucknor, Matthew D.

    complexity to existing cable failure techniques and tuned to empirical data can better approximate the temperature response of cables located in tightly packed cable bundles. The new models also provide a way to determine the conditions inside a cable bundle, which allows for separate treatment of cables on the interior of the bundle and cables on the exterior of the bundle. The results from the DET analysis show that the overall assessed probability of cable failure can be significantly reduced by more realistically accounting for the influence that the fire brigade has on a fire progression scenario. The shielding analysis results demonstrate a significant reduction in the temperature response of a shielded versus a non-shielded cable bundle; however, the computational cost of using a fire progression model that can capture these effects may be prohibitive for performing DET analyses with currently available computational fluid dynamics models and computational resources.

  3. Square Source Type Diagram

    NASA Astrophysics Data System (ADS)

    Aso, N.; Ohta, K.; Ide, S.

    2014-12-01

    Deformation in a small volume of the Earth's interior is expressed by a symmetric moment tensor located at a point source. The tensor contains information on characteristic directions, source amplitude, and source type, such as isotropic, double-couple, or compensated-linear-vector-dipole (CLVD). Although we often assume a double couple as the source type of an earthquake, significant non-double-couple components, including an isotropic component, are often reported for induced earthquakes and volcanic earthquakes. For discussions of source types including double-couple and non-double-couple components, it is helpful to display them using visual diagrams. Since the information on source type has two degrees of freedom, it can be displayed on a two-dimensional flat plane. Although the diagram developed by Hudson et al. [1989] is popular, the trace corresponding to a combination of two mechanisms is not always a smooth line. To overcome this problem, Chapman and Leaney [2012] developed a new diagram. This diagram has the advantage that a straight line passing through the center corresponds to the mechanism obtained by combining an arbitrary mechanism and a double couple [Tape and Tape, 2012], but it has some difficulties in use. First, it is slightly difficult to produce the diagram because of its curved shape. Second, it is also difficult to read off the ratios among isotropic, double-couple, and CLVD components, which we want to obtain from the estimated moment tensors, because they do not appear directly on the horizontal or vertical axes. In the present study, we developed another new square diagram that overcomes the difficulties of previous diagrams. This diagram is an orthogonal system of isotropic and deviatoric axes, so it is easy to obtain the ratios among isotropic, double-couple, and CLVD components. Our diagram has another advantage that the probability density is obtained simply from the area within the diagram if the probability density
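
    The ratios that such diagrams display can be computed directly from a moment tensor. The sketch below is not the authors' method; it uses one common convention for the CLVD parameter (conventions differ between papers), and the example tensor is arbitrary.

        # Illustrative sketch (not the paper's diagram code): split a symmetric moment
        # tensor into isotropic, double-couple, and CLVD parts using one common
        # convention for the deviatoric parameter epsilon.
        import numpy as np

        M = np.array([[1.2, 0.3, 0.0],
                      [0.3, 0.8, 0.1],
                      [0.0, 0.1, -0.5]])              # hypothetical moment tensor (arbitrary units)

        iso = np.trace(M) / 3.0
        M_dev = M - iso * np.eye(3)                   # deviatoric (trace-free) part
        dev_eigs = np.linalg.eigvalsh(M_dev)          # eigenvalues of the deviatoric part

        # epsilon = -(deviatoric eigenvalue of smallest magnitude) / |largest magnitude|,
        # so epsilon = 0 corresponds to a pure double couple.
        by_abs = dev_eigs[np.argsort(np.abs(dev_eigs))]
        eps = -by_abs[0] / abs(by_abs[2])

        clvd_frac = 2.0 * abs(eps)                    # fraction of deviatoric moment that is CLVD
        dc_frac = 1.0 - clvd_frac
        print(f"isotropic part: {iso:.3f}, DC fraction: {dc_frac:.2f}, CLVD fraction: {clvd_frac:.2f}")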

  4. Review of nutritional screening and assessment tools and clinical outcomes in heart failure.

    PubMed

    Lin, Hong; Zhang, Haifeng; Lin, Zheng; Li, Xinli; Kong, Xiangqin; Sun, Gouzhen

    2016-09-01

    Recent studies have suggested that undernutrition as defined using multidimensional nutritional evaluation tools may affect clinical outcomes in heart failure (HF). The evidence supporting this correlation is unclear. Therefore, we conducted this systematic review to critically appraise the use of multidimensional evaluation tools in the prediction of clinical outcomes in HF. We performed descriptive analyses of all identified articles involving qualitative analyses. We used STATA to conduct meta-analyses when at least three studies that tested the same type of nutritional assessment or screening tools and used the same outcome were identified. Sensitivity analyses were conducted to validate our positive results. We identified 17 articles with qualitative analyses and 11 with quantitative analysis after comprehensive literature searching and screening. We determined that the prevalence of malnutrition is high in HF (range 16-90 %), particularly in advanced and acute decompensated HF (approximate range 75-90 %). Undernutrition as identified by multidimensional evaluation tools may be significantly associated with hospitalization, length of stay and complications and is particularly strongly associated with high mortality. The meta-analysis revealed that compared with other tools, Mini Nutritional Assessment (MNA) scores were the strongest predictors of mortality in HF [HR (4.32, 95 % CI 2.30-8.11)]. Our results remained reliable after conducting sensitivity analyses. The prevalence of malnutrition is high in HF, particularly in advanced and acute decompensated HF. Moreover, undernutrition as identified by multidimensional evaluation tools is significantly associated with unfavourable prognoses and high mortality in HF. PMID:26920682

  5. Performance of the Automated Neuropsychological Assessment Metrics (ANAM) in Detecting Cognitive Impairment in Heart Failure Patients

    PubMed Central

    Xie, Susan S.; Goldstein, Carly M.; Gathright, Emily C.; Gunstad, John; Dolansky, Mary A.; Redle, Joseph; Hughes, Joel W.

    2015-01-01

    Objective Evaluate capacity of the Automated Neuropsychological Assessment Metrics (ANAM) to detect cognitive impairment (CI) in heart failure (HF) patients. Background CI is a key prognostic marker in HF. Though the most widely used cognitive screen in HF, the Mini-Mental State Examination (MMSE) is insufficiently sensitive. The ANAM has demonstrated sensitivity to cognitive domains affected by HF, but has not been assessed in this population. Methods Investigators administered the ANAM and MMSE to 57 HF patients, compared against a composite model of cognitive function. Results ANAM efficiency (p < .05) and accuracy scores (p < .001) successfully differentiated CI and non-CI. ANAM efficiency and accuracy scores classified 97.7% and 93.0% of non-CI patients, and 14.3% and 21.4% with CI, respectively. Conclusions The ANAM is more effective than the MMSE for detecting CI, but further research is needed to develop a more optimal cognitive screen for routine use in HF patients. PMID:26354858

  6. A Model for Assessment of Failure of LWR Fuel during an RIA

    SciTech Connect

    Liu, Wenfeng; Kazimi, Mujid S.

    2007-07-01

    This paper presents a model for Pellet-Cladding Mechanical Interaction (PCMI) failure of LWR fuel during an RIA. The model uses the J-integral as a driving parameter to characterize the failure potential during PCMI. The model is implemented in the FRAPTRAN code and is validated with CABRI and NSRR simulated RIA test data. Simulations of PWR and BWR conditions are conducted with FRAPTRAN to evaluate the fuel failure potential using this model. Model validation and simulation results are compared with the strain-based failure model of PNNL and the SED/CSED model of EPRI. Our fracture mechanics model has good capability to differentiate failure from non-failure cases. The results reveal significant effects of power pulse width: a wider pulse width generally increases the threshold for fuel failure. However, this effect is less obvious for highly corroded cladding. (authors)

  7. Upgrading Diagnostic Diagrams

    NASA Astrophysics Data System (ADS)

    Proxauf, B.; Kimeswenger, S.; Öttl, S.

    2014-04-01

    Diagnostic diagrams of forbidden lines have been a useful tool for observers in astrophysics for many decades now. They are used to obtain information on the basic physical properties of thin gaseous nebulae. Moreover, they are also the initial tool to derive thermodynamic properties of the plasma from observations, to get ionization correction factors and thus to obtain proper abundances of the nebulae. Some diagnostic diagrams lie in wavelength domains which were difficult to observe, either due to missing wavelength coverage or the low resolution of older spectrographs. Thus they were hardly used in the past. An upgrade of this useful tool is necessary because most of the diagrams were calculated treating the species involved as a single-atom gas, although several are affected by well-known fluorescence mechanisms as well. Additionally, the atomic data have improved up to the present time. The new diagnostic diagrams are calculated by using large grids of parameter space in the photoionization code CLOUDY. For a given basic parameter, the input radiation field is varied to find the solutions with cooling-heating equilibrium. Empirical numerical functions are fitted to provide formulas usable in e.g. data reduction pipelines. The resulting diagrams differ significantly from those used up to now and will improve the thermodynamic calculations.

  8. Trace element indiscrimination diagrams

    NASA Astrophysics Data System (ADS)

    Li, Chusi; Arndt, Nicholas T.; Tang, Qingyan; Ripley, Edward M.

    2015-09-01

    We tested the accuracy of trace element discrimination diagrams for basalts using new datasets from two petrological databases, PetDB and GEOROC. Both binary and ternary diagrams using Zr, Ti, V, Y, Th, Hf, Nb, Ta, Sm, and Sc do a poor job of discriminating between basalts generated in various tectonic environments (continental flood basalt, mid-ocean ridge basalt, ocean island basalt, oceanic plateau basalt, back-arc basin basalt, and various types of arc basalt). The overlaps between the different types of basalt are too large for the confident application of such diagrams when used in the absence of geological and petrological constraints. None of the diagrams we tested can clearly discriminate between back-arc basin basalt and mid-ocean ridge basalt, between continental flood basalt and oceanic plateau basalt, and between different types of arc basalt (intra-oceanic, island and continental arcs). Only ocean island basalt and some mid-ocean ridge basalt are generally distinguishable in the diagrams, and even in this case, mantle-normalized trace element patterns offer a better solution for discriminating between the two types of basalt.

  9. Weyl card diagrams

    SciTech Connect

    Jones, Gregory; Wang, John E.

    2005-06-15

    To capture important physical properties of a spacetime we construct a new diagram, the card diagram, which accurately draws generalized Weyl spacetimes in arbitrary dimensions by encoding their global spacetime structure, singularities, horizons, and some aspects of causal structure including null infinity. Card diagrams draw only nontrivial directions providing a clearer picture of the geometric features of spacetimes as compared to Penrose diagrams, and can change continuously as a function of the geometric parameters. One of our main results is to describe how Weyl rods are traversable horizons and the entirety of the spacetime can be mapped out. We review Weyl techniques and as examples we systematically discuss properties of a variety of solutions including Kerr-Newman black holes, black rings, expanding bubbles, and recent spacelike-brane solutions. Families of solutions will share qualitatively similar cards. In addition we show how card diagrams not only capture information about a geometry but also its analytic continuations by providing a geometric picture of analytic continuation. Weyl techniques are generalized to higher dimensional charged solutions and applied to generate perturbations of bubble and S-brane solutions by Israel-Khan rods.

  10. Weyl card diagrams

    NASA Astrophysics Data System (ADS)

    Jones, Gregory; Wang, John E.

    2005-06-01

    To capture important physical properties of a spacetime we construct a new diagram, the card diagram, which accurately draws generalized Weyl spacetimes in arbitrary dimensions by encoding their global spacetime structure, singularities, horizons, and some aspects of causal structure including null infinity. Card diagrams draw only nontrivial directions providing a clearer picture of the geometric features of spacetimes as compared to Penrose diagrams, and can change continuously as a function of the geometric parameters. One of our main results is to describe how Weyl rods are traversable horizons and the entirety of the spacetime can be mapped out. We review Weyl techniques and as examples we systematically discuss properties of a variety of solutions including Kerr-Newman black holes, black rings, expanding bubbles, and recent spacelike-brane solutions. Families of solutions will share qualitatively similar cards. In addition we show how card diagrams not only capture information about a geometry but also its analytic continuations by providing a geometric picture of analytic continuation. Weyl techniques are generalized to higher dimensional charged solutions and applied to generate perturbations of bubble and S-brane solutions by Israel-Khan rods.

  11. Software Tools for Lifetime Assessment of Thermal Barrier Coatings Part I — Thermal Ageing Failure and Thermal Fatigue Failure

    NASA Astrophysics Data System (ADS)

    Renusch, Daniel; Rudolphi, Mario; Schütze, Michael

    Thermal barrier coatings (TBCs) increase the service lifetime of specific components in, for example, gas turbines or airplane engines and allow higher operating temperatures to increase efficiency. Lifetime prediction models are therefore of both academic and applied interest, either to test new coatings or to determine operational conditions that can ensure a certain lifetime, for example 25,000 hr for gas turbines. Driven by these demands, the equations used in lifetime prediction have become more and more sophisticated and consequently are complicated to apply. A collection of software tools for lifetime assessment was therefore developed to provide an easy-to-use graphical user interface while incorporating the recent improvements in modeling equations. The Windows-based software is compatible with other Windows applications, such as PowerPoint, Excel, or Origin. Laboratory lifetime data from isothermal, thermal cyclic and/or burner rig testing can be loaded into the software for analysis, and the program provides confidence limits and an accuracy assessment of the analysis model. The main purpose of the software tool is to predict TBC spallation for a given bond coat temperature, temperature gradient across the coating, and thermal cycle frequency.

  12. Moraine-dammed lake failures in Patagonia and assessment of outburst susceptibility in the Baker Basin

    NASA Astrophysics Data System (ADS)

    Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.

    2014-12-01

    Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥ 10⁶ m³) Glacial Lake Outburst Floods (GLOFs), damaging inhabited areas. GLOF hazard studies in Patagonia have been based mainly on the analysis of short-term series (≤ 50 years) of flood data, and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytical Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least seven moraine-dammed lakes have failed in historic time. We identified 386 moraine-dammed lakes in the Baker Basin, of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (> 8°) to steep (> 15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream from inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.
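
    The Analytical Hierarchy Process (AHP) mentioned above derives criterion weights from pairwise comparisons. The sketch below is a generic AHP weight calculation, not the study's scheme; the criteria names and the comparison matrix are invented for the example.

        # Illustrative sketch (not the study's classification scheme): AHP priority
        # weights from a hypothetical pairwise comparison matrix of susceptibility criteria.
        import numpy as np

        criteria = ["glacier contact", "dam outlet slope", "lake area change"]
        # A[i, j] = how much more important criterion i is than criterion j (Saaty 1-9 scale)
        A = np.array([[1.0,  2.0,  4.0],
                      [0.5,  1.0,  3.0],
                      [0.25, 1/3,  1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        principal = eigvecs[:, np.argmax(eigvals.real)].real
        weights = principal / principal.sum()          # normalized priority vector

        # Consistency check (random index RI = 0.58 for a 3x3 matrix)
        lam_max = eigvals.real.max()
        ci = (lam_max - len(A)) / (len(A) - 1)
        cr = ci / 0.58
        for name, w in zip(criteria, weights):
            print(f"{name}: weight = {w:.2f}")
        print(f"consistency ratio = {cr:.2f} (values below about 0.1 are usually considered acceptable)")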

  13. Methods for the development and assessment of atrial fibrillation and heart failure dog models

    PubMed Central

    Urban, Jon F; Gerhart, Renee L; Krzeszak, Jason R; Leet, Corey R; Lentz, Linnea R; McClay, Carolyn B

    2011-01-01

    Objective To report Medtronic experiences with the development of animal models for atrial fibrillation (AF) and chronic heart failure (CHF) using high-rate pacing for AF and microemboli for CHF. Methods For the AF model, an atrial lead was attached to a Medtronic Synergy™ neurostimulator, which was programmed to stimulate at 50 Hz in an on-off duty cycle. Atrial natriuretic peptide (ANP), brain natriuretic peptide (BNP) and N-terminal pro brain natriuretic peptide (NT-proBNP) were assayed at select time points. For the CHF model, serial injections of 90 µm polystyrene microspheres at 62,400 beads/mL (Polybead, Polysciences, Inc.) were performed to induce global ischemia, either with a weekly monitoring and embolization schedule (group 1, n = 25) or with a biweekly monitoring and embolization schedule (group 2, n = 36). Echocardiograms were used along with ventriculograms and magnetic resonance imaging scans weekly to assess cardiac function, and ANP, BNP and NT-proBNP were monitored. Results For the AF model, the days to sustained AF for four animals following surgery were 7, 25, 21 and 19, respectively; for the CHF model, the days to meet CHF endpoints were 116 in group 1 and 89 in group 2. For both AF and CHF models, NT-proBNP correlated well with the development of disease states. Conclusion Our experience with the development and assessment of AF and CHF dog models may help researchers who are in search of an animal model for assessing the safety and efficacy of a device-based therapy. PMID:22783299

  14. Probabilistic exposure assessment model to estimate aseptic-UHT product failure rate.

    PubMed

    Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2015-01-01

    Aseptic-Ultra-High-Temperature (UHT) products are manufactured to be free of microorganisms capable of growing in the food at normal non-refrigerated conditions at which the food is likely to be held during manufacture, distribution and storage. Two important phases within the process are widely recognised as critical in controlling microbial contamination: the sterilisation steps and the following aseptic steps. Of the microbial hazards, the pathogen spore formers Clostridium botulinum and Bacillus cereus are deemed the most pertinent to be controlled. In addition, due to a relatively high thermal resistance, Geobacillus stearothermophilus spores are considered a concern for spoilage of low acid aseptic-UHT products. A probabilistic exposure assessment model has been developed in order to assess the aseptic-UHT product failure rate associated with these three bacteria. It was a Modular Process Risk Model, based on nine modules. They described: i) the microbial contamination introduced by the raw materials, either from the product (i.e. milk, cocoa and dextrose powders and water) or the packaging (i.e. bottle and sealing component), ii) the sterilisation processes, of either the product or the packaging material, iii) the possible recontamination during subsequent processing of both product and packaging. The Sterility Failure Rate (SFR) was defined as the sum of bottles contaminated for each batch, divided by the total number of bottles produced per process line run (10⁶ batches simulated per process line). The SFR associated with the three bacteria was estimated at the last step of the process (i.e. after Module 9) but also after each module, allowing for the identification of modules, and responsible contamination pathways, with higher or lower intermediate SFR. The model contained 42 controlled settings associated with factory environment, process line or product formulation, and more than 55 probabilistic inputs corresponding to inputs with variability
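
    To make the modular structure concrete, the following toy Monte Carlo is a rough analogue of how a sterility failure rate can be estimated from chained modules. It is not the published nine-module model: the two modules, the distributions and every parameter are invented for illustration.

        # Illustrative sketch (not the published model): toy Monte Carlo estimate of a
        # sterility failure rate with a hypothetical raw-material contamination module
        # and a hypothetical sterilization module.
        import numpy as np

        rng = np.random.default_rng(0)
        n_batches = 100_000
        bottles_per_batch = 10_000

        # Module "raw material": mean spores per bottle before sterilization (invented lognormal).
        initial_load = rng.lognormal(mean=1.0, sigma=1.0, size=n_batches)

        # Module "sterilization": log reduction achieved per batch (invented normal).
        log_reduction = rng.normal(loc=6.0, scale=0.5, size=n_batches)
        surviving_mean = initial_load * 10.0 ** (-log_reduction)   # expected survivors per bottle

        # A bottle is non-sterile if at least one spore survives (Poisson assumption).
        p_nonsterile = 1.0 - np.exp(-surviving_mean)
        contaminated = rng.binomial(bottles_per_batch, p_nonsterile)

        sfr = contaminated.sum() / (n_batches * bottles_per_batch)
        print(f"estimated SFR: {sfr:.2e} non-sterile bottles per bottle produced")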

  15. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ...: On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment Draft NUREG, "Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and...-2011-0254. Discussion On November 2, 2011 (76 FR 67764), the NRC published for public comment...

  16. Assessing the Value-Added by the Environmental Testing Process with the Aide of Physics/Engineering of Failure Evaluations

    NASA Technical Reports Server (NTRS)

    Cornford, S.; Gibbel, M.

    1997-01-01

    NASA's Code QT Test Effectiveness Program is funding a series of applied research activities focused on utilizing the principles of physics and engineering of failure and those of engineering economics to assess and improve the value-added by the various validation and verification activities to organizations.

  17. Materials Degradation & Failure: Assessment of Structure and Properties. Resources in Technology.

    ERIC Educational Resources Information Center

    Technology Teacher, 1991

    1991-01-01

    This module provides information on materials destruction (through corrosion, oxidation, and degradation) and failure. A design brief includes objective, student challenge, resources, student outcomes, and quiz. (SK)

  18. Optimal Allocation of Gold Standard Testing under Constrained Availability: Application to Assessment of HIV Treatment Failure

    PubMed Central

    Liu, Tao; Hogan, Joseph W.; Wang, Lisa; Zhang, Shangxuan; Kantor, Rami

    2013-01-01

    The World Health Organization (WHO) guidelines for monitoring the effectiveness of HIV treatment in resource-limited settings (RLS) are mostly based on clinical and immunological markers (e.g., CD4 cell counts). Recent research indicates that the guidelines are inadequate and can result in high error rates. Viral load (VL) is considered the “gold standard”, yet its widespread use is limited by cost and infrastructure. In this paper, we propose a diagnostic algorithm that uses information from routinely collected clinical and immunological markers to guide a selective use of VL testing for diagnosing HIV treatment failure, under the assumption that VL testing is available only at a certain portion of patient visits. Our algorithm identifies the patient sub-population such that the use of limited VL testing on them minimizes a pre-defined risk (e.g., misdiagnosis error rate). Diagnostic properties of our proposed algorithm are assessed by simulations. For illustration, data from the Miriam Hospital Immunology Clinic (RI, USA) are analyzed. PMID:24672142

  19. Direct and indirect assessment of skeletal muscle blood flow in chronic congestive heart failure

    SciTech Connect

    LeJemtel, T.H.; Scortichini, D.; Katz, S.

    1988-09-09

    In patients with chronic congestive heart failure (CHF), skeletal muscle blood flow can be measured directly by the continuous thermodilution technique and by the xenon-133 clearance method. The continuous thermodilution technique requires retrograde catheterization of the femoral vein and, thus, cannot be repeated conveniently in patients during evaluation of pharmacologic interventions. The xenon-133 clearance method, which requires only an intramuscular injection, allows repeated determination of skeletal muscle blood flow. In patients with severe CHF, a fixed capacity of the skeletal muscle vasculature to dilate appears to limit maximal exercise performance. Moreover, the changes in peak skeletal muscle blood flow noted during long-term administration of captopril, an angiotensin-converting enzyme inhibitor, appear to correlate with the changes in aerobic capacity. In patients with CHF, resting supine deep femoral vein oxygen content can be used as an indirect measure of resting skeletal muscle blood flow. The absence of a steady state complicates the determination of peak skeletal muscle blood flow reached during graded bicycle or treadmill exercise in patients with chronic CHF. Indirect assessments of skeletal muscle blood flow and metabolism during exercise performed at submaximal work loads are currently being developed in patients with chronic CHF.

  20. A non-stationary earthquake probability assessment with the Mohr-Coulomb failure criterion

    NASA Astrophysics Data System (ADS)

    Wang, J. P.; Xu, Y.

    2015-10-01

    From theory to experience, the earthquake probability associated with an active fault should gradually increase with time since the last event. In this paper, a new non-stationary earthquake assessment motivated by and derived from the Mohr-Coulomb failure criterion is introduced. Unlike other non-stationary earthquake analyses, the new model can more clearly define and calculate the stress states between two characteristic earthquakes. In addition to the model development and the algorithms, this paper also presents an example calculation to help explain and validate the new model. On the condition of best-estimate model parameters, the example calculation shows a 7.6 % probability for the Meishan fault in central Taiwan to induce a major earthquake in the years 2015-2025, and if the earthquake does not occur by 2025, the earthquake probability will increase to 8 % in 2025-2035, illustrating that the new model calculates a non-stationary earthquake probability that varies with time, as it should.
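
    For reference, and not taken from the paper itself, the classical Mohr-Coulomb criterion underlying such models states that shear failure occurs on a plane once the shear stress reaches the cohesion plus the frictional resistance, so a time-dependent failure probability can be framed schematically as the probability that the accumulating shear stress has reached that limit. The symbols below are the standard ones, not the paper's notation:

        \tau_f = c + \sigma_n \tan\phi,
        \qquad
        P_f(t) = \Pr\left[\, \tau(t) \ge c + \sigma_n \tan\phi \,\right],

    where c is the cohesion, \sigma_n the effective normal stress, \phi the friction angle, and \tau(t) the shear stress accumulating through tectonic loading since the last characteristic earthquake; the paper's specific parameterization of the inter-event stress states is not reproduced here.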

  1. Assessment of adult patients with chronic liver failure for liver transplantation in 2015: who and when?

    PubMed

    McCaughan, G W; Crawford, M; Sandroussi, C; Koorey, D J; Bowen, D G; Shackel, N A; Strasser, S I

    2016-04-01

    In 2015, there are a few absolute contraindications to liver transplantation. In adult patients, survival post-liver transplant is excellent, with a 1-year survival rate >90%, 5-year survival rates >80% and a predicted median allograft survival beyond 20 years. Patients with a Child-Turcotte-Pugh score ≥9 or a model for end-stage liver disease (MELD) score >15 should be referred for liver transplantation, with patients who have a MELD score >17 showing a 1-year survival benefit with liver transplantation. Careful selection of hepatocellular cancer patients results in excellent outcomes, while consideration of extra-hepatic disease (reversible vs irreversible) and social support structures is crucial to patient assessment. Alcoholic liver disease remains a challenge, and the potential to cure hepatitis C virus infection, together with the emerging issue of non-alcoholic fatty liver disease-associated chronic liver failure, will change the landscape of the 'who' in the years ahead. The 'when' will continue to be determined largely by the severity of liver disease based on the MELD score for the foreseeable future. PMID:27062203

  2. Impulse-Momentum Diagrams

    ERIC Educational Resources Information Center

    Rosengrant, David

    2011-01-01

    Multiple representations are a valuable tool to help students learn and understand physics concepts. Furthermore, representations help students learn how to think and act like real scientists. These representations include: pictures, free-body diagrams, energy bar charts, electrical circuits, and, more recently, computer simulations and…

  3. Assessing Strategies for Heart Failure with Preserved Ejection Fraction at the Outpatient Clinic

    PubMed Central

    Jorge, Antonio José Lagoeiro; Rosa, Maria Luiza Garcia; Ribeiro, Mario Luiz; Fernandes, Luiz Claudio Maluhy; Freire, Monica Di Calafiori; Correia, Dayse Silva; Teixeira, Patrick Duarte; Mesquita, Evandro Tinoco

    2014-01-01

    Background: Heart failure with preserved ejection fraction (HFPEF) is the most common form of heart failure (HF), and its diagnosis is a challenge in outpatient clinic practice. Objective: To describe and compare two strategies derived from the algorithms of the European Society of Cardiology Diastology Guidelines for the diagnosis of HFPEF. Methods: Cross-sectional study with 166 consecutive ambulatory patients (67.9±11.7 years; 72% women). The strategies to confirm HFPEF were established according to the European Society of Cardiology Diastology Guidelines criteria. In strategy 1 (S1), tissue Doppler echocardiography (TDE) and electrocardiography (ECG) were used; in strategy 2 (S2), B-type natriuretic peptide (BNP) measurement was included. Results: In S1, patients were divided into groups based on the E/E' ratio as follows: GI, E/E' > 15 (n = 16; 9%); GII, E/E' 8 to 15 (n = 79; 48%); and GIII, E/E' < 8 (n = 71; 43%). HFPEF was confirmed in GI and excluded in GIII. In GII, TDE [left atrial volume index (LAVI) ≥ 40 mL/m²; left ventricular mass index (LVMI) > 122 g/m² for women and > 149 g/m² for men] and ECG (atrial fibrillation) parameters were assessed, confirming HFPEF in 33 more patients, adding up to 49 (29%). In S2, patients were divided into three groups based on BNP levels. GI (BNP > 200 pg/mL) consisted of 12 patients, HFPEF being confirmed in all of them. GII (BNP ranging from 100 to 200 pg/mL) consisted of 20 patients with LAVI > 29 mL/m², or LVMI ≥ 96 g/m² for women or ≥ 116 g/m² for men, or E/E' ≥ 8, or atrial fibrillation on ECG, and the diagnosis of HFPEF was confirmed in 15. GIII (BNP < 100 pg/mL) consisted of 134 patients, 26 of whom had the diagnosis of HFPEF confirmed when the GII parameters were used. Measuring BNP levels in S2 identified 4 more patients (8%) with HFPEF compared with those identified in S1. Conclusion: The association of BNP measurement and TDE data is better than the isolated use of those parameters. BNP can be useful in
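
    The two strategies are essentially threshold-based decision rules. The sketch below is a schematic encoding of them using only the cutoffs quoted in the abstract; it is a simplification of the full guideline criteria and is for illustration only, not clinical guidance.

        # Illustrative sketch only (not clinical guidance): schematic encoding of the two
        # diagnostic strategies described above, using only the cutoffs quoted in the abstract.
        def strategy_1(e_over_eprime, lavi, lvmi, female, atrial_fib):
            """TDE + ECG strategy (S1): returns True if HFPEF is confirmed, False otherwise."""
            if e_over_eprime > 15:
                return True                        # HFPEF confirmed
            if e_over_eprime < 8:
                return False                       # HFPEF excluded
            lvmi_cutoff = 122 if female else 149   # g/m2, as quoted in the abstract
            return lavi >= 40 or lvmi > lvmi_cutoff or atrial_fib

        def strategy_2(bnp, e_over_eprime, lavi, lvmi, female, atrial_fib):
            """BNP-first strategy (S2), applying the GII criteria to intermediate and low BNP."""
            supportive = (lavi > 29
                          or lvmi >= (96 if female else 116)
                          or e_over_eprime >= 8
                          or atrial_fib)
            if bnp > 200:
                return True
            return supportive

        print(strategy_1(10, 42, 120, female=True, atrial_fib=False))
        print(strategy_2(150, 9, 28, 100, female=True, atrial_fib=False))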

  4. 30 CFR 1218.41 - Assessments for failure to submit payment of same amount as Form ONRR-2014 or bill document or to...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 3 2014-07-01 2014-07-01 false Assessments for failure to submit payment of same amount as Form ONRR-2014 or bill document or to provide adequate information. 1218.41 Section 1218... General Provisions § 1218.41 Assessments for failure to submit payment of same amount as Form ONRR-2014...

  5. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
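
    The generic pattern described above, propagating parameter and model uncertainty through a deterministic engineering failure model to obtain a failure probability, can be illustrated with a toy Monte Carlo. The sketch below is not the PFA software; the Basquin-type fatigue-life model, the distributions and every number are hypothetical.

        # Illustrative sketch (not the PFA software): propagate uncertain loads and
        # material parameters through a deterministic fatigue-life model to estimate a
        # probability of failure within a hypothetical mission.
        import numpy as np

        rng = np.random.default_rng(1)
        n_samples = 200_000
        service_cycles = 1.0e5                              # hypothetical mission duty cycles

        # Uncertain inputs: stress amplitude (loads analysis) and material S-N parameters.
        stress = rng.normal(300.0, 20.0, n_samples)         # MPa (invented)
        A = rng.lognormal(np.log(1.0e18), 0.3, n_samples)   # Basquin coefficient (invented)
        m = rng.normal(5.0, 0.2, n_samples)                 # Basquin exponent (invented)

        cycles_to_failure = A * stress ** (-m)              # N_f = A * S^(-m)
        p_fail = np.mean(cycles_to_failure < service_cycles)
        print(f"estimated probability of fatigue failure within the mission: {p_fail:.3e}")

    In the full methodology the resulting failure-probability distribution would then be updated with test and flight experience; that Bayesian-style updating step is not shown here.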

  6. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Astrophysics Data System (ADS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-06-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  7. Tectonic discrimination diagrams revisited

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter

    2006-06-01

    The decision boundaries of most tectonic discrimination diagrams are drawn by eye. Discriminant analysis is a statistically more rigorous way to determine the tectonic affinity of oceanic basalts based on their bulk-rock chemistry. This method was applied to a database of 756 oceanic basalts of known tectonic affinity (ocean island, mid-ocean ridge, or island arc). For each of these training data, up to 45 major, minor, and trace elements were measured. Discriminant analysis assumes multivariate normality. If the same covariance structure is shared by all the classes (i.e., tectonic affinities), the decision boundaries are linear, hence the term linear discriminant analysis (LDA). In contrast with this, quadratic discriminant analysis (QDA) allows the classes to have different covariance structures. To solve the statistical problems associated with the constant-sum constraint of geochemical data, the training data must be transformed to log-ratio space before performing a discriminant analysis. The results can be mapped back to the compositional data space using the inverse log-ratio transformation. An exhaustive exploration of 14,190 possible ternary discrimination diagrams yields the Ti-Si-Sr system as the best linear discrimination diagram and the Na-Nb-Sr system as the best quadratic discrimination diagram. The best linear and quadratic discrimination diagrams using only immobile elements are Ti-V-Sc and Ti-V-Sm, respectively. As little as 5% of the training data are misclassified by these discrimination diagrams. Testing them on a second database of 182 samples that were not part of the training data yields a more reliable estimate of future performance. Although QDA misclassifies fewer training data than LDA, the opposite is generally true for the test data. Therefore LDA is a cruder but more robust classifier than QDA. Another advantage of LDA is that it provides a powerful way to reduce the dimensionality of the multivariate geochemical data in a similar
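
    The key preprocessing step mentioned above, a log-ratio transform followed by discriminant analysis, is easy to reproduce in outline. The sketch below is not the paper's analysis: the compositions are randomly generated stand-ins for real Ti-Si-Sr measurements, and the class labels are invented.

        # Illustrative sketch (not the paper's analysis): linear discriminant analysis of
        # synthetic basalt "compositions" after an additive log-ratio transform.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)

        def fake_compositions(center, n):
            """Generate positive pseudo-concentrations around a center, closed to sum to 1."""
            x = np.abs(rng.normal(center, 0.05, size=(n, 3)))
            return x / x.sum(axis=1, keepdims=True)

        X = np.vstack([fake_compositions([0.5, 0.3, 0.2], 100),   # stand-in for MORB
                       fake_compositions([0.3, 0.3, 0.4], 100),   # stand-in for OIB
                       fake_compositions([0.2, 0.5, 0.3], 100)])  # stand-in for arc basalt
        y = np.repeat(["MORB", "OIB", "ARC"], 100)

        # The additive log-ratio transform removes the constant-sum constraint before LDA.
        alr = np.log(X[:, :2] / X[:, 2:3])

        lda = LinearDiscriminantAnalysis().fit(alr, y)
        print("training accuracy on synthetic data:", lda.score(alr, y))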

  8. Fuzzy-logic assessment of failure hazard in pipelines due to mining activity

    NASA Astrophysics Data System (ADS)

    Malinowska, A. A.; Hejmanowski, R.

    2015-11-01

    The present research is aimed at a critical analysis of a method presently used for evaluating failure hazard in linear objects in mining areas. A fuzzy model of the failure hazard of a linear object was created on the basis of the experience gathered so far. The rules of a Mamdani fuzzy model have been used in the analyses. Finally, the scaled model was integrated with a Geographic Information System (GIS), which was used to evaluate the failure hazard of a water pipeline in a mining area.
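
    A Mamdani model fuzzifies crisp inputs, fires if-then rules with min/max operators and defuzzifies an aggregated output set. The sketch below is not the authors' GIS-integrated model: the two inputs (ground strain and pipe condition), the membership functions and the rules are all invented.

        # Illustrative sketch (not the authors' model): a minimal Mamdani-style fuzzy
        # evaluation of pipeline failure hazard from two hypothetical inputs.
        import numpy as np

        def trimf(x, a, b, c):
            """Triangular membership function with peak at b (requires a < b < c)."""
            x = np.asarray(x, dtype=float)
            return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

        hazard_axis = np.linspace(0.0, 10.0, 501)        # output universe: hazard score 0-10
        hazard_low = trimf(hazard_axis, 0.0, 2.0, 5.0)
        hazard_high = trimf(hazard_axis, 5.0, 8.0, 10.0)

        def assess(strain_mm_per_m, condition_score):
            # Fuzzify the crisp inputs (membership degrees in invented linguistic sets).
            strain_small = trimf(strain_mm_per_m, 0.0, 1.0, 4.0)
            strain_large = trimf(strain_mm_per_m, 2.0, 6.0, 10.0)
            condition_poor = trimf(condition_score, 0.0, 1.0, 5.0)
            condition_good = trimf(condition_score, 3.0, 8.0, 10.0)

            # Mamdani rules: min for AND, clip the output sets, max to aggregate.
            r1 = float(np.minimum(strain_large, condition_poor))   # -> high hazard
            r2 = float(np.minimum(strain_small, condition_good))   # -> low hazard
            aggregated = np.maximum(np.minimum(hazard_high, r1), np.minimum(hazard_low, r2))

            # Centroid defuzzification gives a crisp hazard score.
            return np.sum(aggregated * hazard_axis) / (np.sum(aggregated) + 1e-12)

        print("hazard score:", round(assess(strain_mm_per_m=5.0, condition_score=2.0), 2))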

  9. Minimal Effects of Acute Liver Injury/Acute Liver Failure on Hemostasis as Assessed by Thromboelastography

    PubMed Central

    Stravitz, R. Todd; Lisman, Ton; Luketic, Velimir A.; Sterling, Richard K.; Puri, Puneet; Fuchs, Michael; Ibrahim, Ashraf; Lee, William M.; Sanyal, Arun J.

    2016-01-01

    Background & Aims Patients with acute liver injury/failure (ALI/ALF) are assumed to have a bleeding diathesis on the basis of elevated INR; however, clinically significant bleeding is rare. We hypothesized that patients with ALI/ALF have normal hemostasis despite an elevated INR. Methods Fifty-one patients with ALI/ALF were studied prospectively using thromboelastography (TEG), which measures the dynamics and physical properties of clot formation in whole blood. ALI was defined as an INR ≥1.5 in a patient with no previous liver disease, and ALF as ALI with hepatic encephalopathy. Results Thirty-seven of 51 patients (73%) had ALF and 22 patients (43%) underwent liver transplantation or died. Despite a mean INR of 3.4±1.7 (range 1.5–9.6), mean TEG parameters were normal, and all 5 individual TEG parameters were normal in 32 (63%). Low maximum amplitude, the measure of ultimate clot strength, was confined to patients with platelet counts <126 × 10⁹/L. Maximum amplitude was higher in patients with ALF than ALI and correlated directly with venous ammonia concentrations and with increasing severity of liver injury assessed by elements of the systemic inflammatory response syndrome. All patients had markedly decreased procoagulant factor V and VII levels, which were proportional to decreases in anticoagulant proteins and inversely proportional to elevated factor VIII levels. Conclusions Despite elevated INR, most patients with ALI/ALF maintain normal hemostasis by TEG, the mechanisms of which include an increase in clot strength with increasing severity of liver injury, increased factor VIII levels, and a commensurate decline in pro- and anticoagulant proteins. PMID:21703173

  10. Predictive Values of Red Blood Cell Distribution Width in Assessing Severity of Chronic Heart Failure.

    PubMed

    Liu, Sen; Wang, Ping; Shen, Ping-Ping; Zhou, Jian-Hua

    2016-01-01

    BACKGROUND This retrospective study was performed to evaluate the value of baseline red blood cell distribution width (RDW) for predicting the severity of chronic heart failure (CHF) compared with N-terminal prohormone brain natriuretic peptide (NT-ProBNP) and other hematological and biochemical parameters. MATERIAL AND METHODS Hematological and biochemical parameters were obtained from 179 patients with New York Heart Association (NYHA) CHF class I (n=44), II (n=39), III (n=41), and IV (n=55). Receiver operator characteristic (ROC) curves were used for assessing predictive values. RESULTS RDW increased significantly in class III and IV compared with class I (14.3±2.3% and 14.3±1.7% vs. 12.9±0.8%, P<0.01). Areas under ROCs (AUCs) of RDW and NT-ProBNP for class IV HF were 0.817 and 0.840, respectively. RDW was markedly elevated in the mortality group compared with the survival group (13.7±1.7 vs. 15.8±1.8, P<0.01). The predictive value of RDW was lower than that of NT-ProBNP but was comparable to white blood cell (WBC), neutrophil (NEU), lymphocyte (L), and neutrophil/lymphocyte ratio (N/L) for mortality during hospitalization, with AUCs of 0.837, 0.939, 0.858, 0.891, 0.885, and 0.885, respectively. RDW and NT-proBNP showed low predictive values for repeated admission (≥3). RDW was an independent risk factor for mortality (OR=2.531, 95% CI: 1.371-4.671). CONCLUSIONS RDW increased significantly in class III and IV patients and in the mortality group. The predictive value of RDW is comparable to NT-proBNP for class IV and lower than that of NT-proBNP for mortality. Elevated RDW is an independent risk factor for mortality. PMID:27324271

  11. Sepsis-related organ failure assessment and withholding or withdrawing life support from critically ill patients

    PubMed Central

    Miguel, Nolla; León, Mariá A; Ibáñez, Jordi; Díaz, Rosa M; Merten, Alfredo; Gahete, Francesc

    1998-01-01

    Background: We studied the incidence of withholding or withdrawing therapeutic measures in intensive care unit (ICU) patients, as well as the possible implications of sepsis-related organ failure assessment (SOFA) in the decision-making process and the ethical conflicts emerging from these measures. Methods: The patients (n = 372) were placed in different groups: those surviving 1 year after ICU admission (S; n = 301), deaths at home (DH; n = 2), deaths in the hospital after ICU discharge (DIH; n = 13) and deaths in the ICU (DI; n = 56). The last group was divided into the following subgroups: two cardiovascular deaths (CVD), 20 brain deaths (BD), 25 deaths after withholding of life support (DWH) and nine deaths after withdrawal of life support (DWD). Results: APACHE III, daily therapeutic intervention scoring system (TISS) and daily SOFA scores were good mortality predictors. The length of ICU stay in DIH (20 days) and in DWH (14 days) was significantly greater than in BD (5 days) or in S (7 days). The number of days with a maximum SOFA score was greater in DWD (5 days) than in S, BD or DWH (2 days). Conclusions: Daily SOFA is a useful parameter when the decision to withhold or withdraw treatment has to be considered, especially if the established measures do not improve the clinical condition of the patient. Although making decisions based on the use of severity parameters may cause ethical problems, it may reduce the anxiety level. Additionally, it may help when considering the need for extraordinary measures or new investigative protocols for better management of resources. PMID:11056711

  12. Fluid Volume Overload and Congestion in Heart Failure: Time to Reconsider Pathophysiology and How Volume Is Assessed.

    PubMed

    Miller, Wayne L

    2016-08-01

    Volume regulation, assessment, and management remain basic issues in patients with heart failure. The discussion presented here is directed at opening a reassessment of the pathophysiology of congestion in congestive heart failure and the methods by which we determine volume overload status. Peer-reviewed historical and contemporary literatures are reviewed. Volume overload and fluid congestion remain primary issues for patients with chronic heart failure. The pathophysiology is complex, and the simple concept of intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can alert clinicians of changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in patients with chronic heart failure and help guide individualized, appropriate therapy-not all volume overload is the same. PMID:27436837

  13. Impulse-Momentum Diagrams

    NASA Astrophysics Data System (ADS)

    Rosengrant, David

    2011-01-01

    Multiple representations are a valuable tool to help students learn and understand physics concepts. Furthermore, representations help students learn how to think and act like real scientists. These representations include: pictures, free-body diagrams, energy bar charts, electrical circuits, and, more recently, computer simulations and animations. However, instructors have limited choices when they want to help their students understand impulse and momentum. One of the only available options is the impulse-momentum bar chart. The bar charts can effectively show the magnitude of the momentum as well as help students understand conservation of momentum, but they do not easily show the actual direction. This paper highlights a new representation instructors can use to help their students with momentum and impulse: the impulse-momentum diagram (IMD).

  14. TEP process flow diagram

    SciTech Connect

    Wilms, R Scott; Carlson, Bryan; Coons, James; Kubic, William

    2008-01-01

    This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and the most-demanding design values. The proposed PFD is shown to meet specifications under both the most-common and the most-demanding design values.

  15. Random Forest for automatic assessment of heart failure severity in a telemonitoring scenario.

    PubMed

    Guidi, G; Pettenati, M C; Miniati, R; Iadanza, E

    2013-01-01

    In this study, we describe an automatic classifier of patients with Heart Failure designed for a telemonitoring scenario, improving the results obtained in our previous works. Our previous studies showed that the technique that best processes the typical heart failure telemonitoring parameters is the Classification Tree. We therefore decided to analyze the data with its direct evolution, the Random Forest algorithm. The results show an improvement both in accuracy and in limiting critical errors. PMID:24110416
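
    The setup described above, a Random Forest trained on routine telemonitoring parameters, can be sketched as follows. This is not the authors' classifier: the features, the synthetic data and the severity labels are invented for the example.

        # Illustrative sketch (not the authors' classifier): Random Forest on synthetic
        # stand-ins for heart-failure telemonitoring parameters.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n = 600

        # Hypothetical telemonitoring features: weight, heart rate, systolic BP, SpO2.
        X = np.column_stack([
            rng.normal(80, 12, n),     # body weight (kg)
            rng.normal(75, 12, n),     # heart rate (bpm)
            rng.normal(125, 18, n),    # systolic blood pressure (mmHg)
            rng.normal(95, 3, n),      # oxygen saturation (%)
        ])
        # Invented rule just to give the labels some structure: higher heart rate and
        # lower SpO2 push a patient toward the "severe" class.
        score = 0.04 * (X[:, 1] - 75) - 0.3 * (X[:, 3] - 95) + rng.normal(0, 1, n)
        y = (score > 0.5).astype(int)   # 1 = severe, 0 = mild/stable (toy labels)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("5-fold CV accuracy on synthetic data:", cross_val_score(clf, X, y, cv=5).mean())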

  16. Assessment of the probability of failure for EC nondestructive testing based on intrusive spectral stochastic finite element method

    NASA Astrophysics Data System (ADS)

    Oudni, Zehor; Féliachi, Mouloud; Mohellebi, Hassane

    2014-06-01

    This work is undertaken to study the reliability of eddy current nondestructive testing (EC-NDT) when the defect concerns a change of a physical property of the material. To this end, an intrusive spectral stochastic finite element method (SSFEM) is developed for the 2D electromagnetic harmonic equation. The electrical conductivity is considered as a random variable and is expanded in a series of Hermite polynomials. The developed model is validated against measurements on an NDT device and is applied to the assessment of the probability of failure in steam generator tubing of nuclear power plants. The exploitation of the model concerns the impedance calculation of the sensor and the assessment of the probability of failure. Random defect geometry is also considered and results are given.
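
    As background, and as a generic formulation rather than the paper's exact discretization, an intrusive SSFEM expands the random conductivity and the unknown field in a Hermite polynomial chaos basis and applies a Galerkin projection of the weak form onto each basis polynomial; P, \sigma_i, A_j and \xi below are generic symbols, not values from the study:

        \sigma(\theta) = \sum_{i=0}^{P} \sigma_i \, \Psi_i\bigl(\xi(\theta)\bigr),
        \qquad
        A(\theta) = \sum_{j=0}^{P} A_j \, \Psi_j\bigl(\xi(\theta)\bigr),
        \qquad
        \langle \Psi_i \Psi_j \rangle = \delta_{ij} \, \langle \Psi_i^2 \rangle,

    where the \Psi_i are Hermite polynomials in the standard Gaussian variable \xi. Projecting the harmonic weak form onto each \Psi_k couples the modal coefficients A_j into one enlarged deterministic system, which is what makes the approach "intrusive".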

  17. Failure-Oriented Training.

    ERIC Educational Resources Information Center

    Pickens, Diana; Lorenz, Paul

    This document consists of a number of figures and diagrams suitable for overhead transparencies that illustrate and elaborate on the principles of failure-oriented training (a model for improving the effectiveness of instructional analysis). By adding a few simple steps to analysis, the resulting training will be closer to the idealized tutor:…

  18. Proactive Risk Assessment of Blood Transfusion Process, in Pediatric Emergency, Using the Health Care Failure Mode and Effects Analysis (HFMEA)

    PubMed Central

    Dehnavieh, Reza; Ebrahimipour, Hossein; Molavi-Taleghani, Yasamin; Vafaee-Najar, Ali; Hekmat, Somayeh Noori; Esmailzdeh, Hamid

    2015-01-01

    Introduction: Pediatric emergency has been considered a high-risk area, and blood transfusion is known as a unique clinical measure; this study was therefore conducted to perform a proactive risk assessment of the blood transfusion process in the Pediatric Emergency of the Qaem education-treatment center in Mashhad, using the Healthcare Failure Mode and Effects Analysis (HFMEA) methodology. Methodology: This cross-sectional study analyzed the failure modes and effects of the blood transfusion process with a mixed quantitative-qualitative method. The proactive HFMEA was used to identify and analyze the potential failures of the process. Information for the items in the HFMEA forms was collected after obtaining consensus among an expert panel via interviews and focus group discussion sessions. Results: A total of 77 failure modes were identified for 24 sub-processes within the 8 processes of blood transfusion. In total, 13 failure modes were identified as non-acceptable risks (hazard score above 8) in the blood transfusion process and were transferred to the decision tree. Root causes of high-risk modes were discussed in cause-and-effect meetings and were classified based on the approved classification model of the UK National Health Service (NHS). Action types were classified as acceptance (11.6%), control (74.2%) and elimination (14.2%). Recommendations were placed in 7 categories using TRIZ (the Theory of Inventive Problem Solving). Conclusion: Re-engineering the process for the required changes, standardizing and updating the blood transfusion procedure, root cause analysis of catastrophic blood transfusion events, patient identification bracelets, training classes and educational pamphlets to raise personnel awareness, and a monthly meeting of the transfusion medicine committee have all been included as executive strategies in the pediatric emergency work agenda. PMID:25560332
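
    The hazard score used to decide which failure modes go to the decision tree is, in HFMEA-style scoring, the product of a severity rating and a probability rating. The sketch below illustrates that arithmetic only; the failure modes listed are invented, and the abstract's own threshold ("above 8") is approximated here by the commonly cited cutoff of 8 or more.

        # Illustrative sketch (not the study's worksheet): HFMEA-style hazard scoring,
        # hazard = severity x probability, with scores of 8 or more escalated.
        failure_modes = [
            # (description, severity 1-4, probability 1-4) -- invented examples
            ("wrong patient identification before transfusion", 4, 2),
            ("blood unit stored outside the cold chain", 3, 2),
            ("incomplete documentation of vital signs", 2, 2),
        ]

        for description, severity, probability in failure_modes:
            hazard = severity * probability
            escalate = hazard >= 8          # commonly used HFMEA cutoff; the abstract says "above 8"
            verdict = "to decision tree" if escalate else "acceptable without further analysis"
            print(f"{description}: hazard score {hazard} ({verdict})")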

  19. Assessment and management of cerebral edema and intracranial hypertension in acute liver failure.

    PubMed

    Mohsenin, Vahid

    2013-10-01

    Acute liver failure is an uncommon but not rare complication of liver injury. It can occur after ingestion of acetaminophen and exposure to toxins and hepatitis viruses. The defining clinical symptoms are coagulopathy and encephalopathy occurring within days or weeks of the primary insult in patients without preexisting liver injury. Acute liver failure is often complicated by multiorgan failure and sepsis. The most life-threatening complications are sepsis, multiorgan failure, and brain edema. The clinical signs of increased intracranial pressure (ICP) are nonspecific, except for neurologic deficits in impending brain stem herniation. Computed tomography of the brain is not sensitive enough for gauging intracranial hypertension or ruling out brain edema. Intracranial pressure monitoring, transcranial Doppler, and jugular venous oximetry provide valuable information for monitoring ICP and guiding therapeutic measures in patients with encephalopathy grade III or IV. Osmotic therapy using hypertonic saline and mannitol, therapeutic hypothermia, and propofol sedation have been shown to improve ICP and stabilize the patient for liver transplantation. In this article, the diagnosis and management of hepatic encephalopathy and cerebral edema in patients with acute liver failure are reviewed. PMID:23683564

  20. Predictive Values of Red Blood Cell Distribution Width in Assessing Severity of Chronic Heart Failure

    PubMed Central

    Liu, Sen; Wang, Ping; Shen, Ping-Ping; Zhou, Jian-Hua

    2016-01-01

    Background This retrospective study was performed to evaluate the value of baseline red blood cell distribution width (RDW) for predicting the severity of chronic heart failure (CHF) compared with N-terminal prohormone brain natriuretic peptide (NT-ProBNP) and other hematological and biochemical parameters. Material/Methods Hematological and biochemical parameters were obtained from 179 patients with New York Heart Association (NYHA) CHF class I (n=44), II (n=39), III (n=41), and IV (n=55). Receiver operator characteristic (ROC) curves were used for assessing predictive values. Results RDW increased significantly in class III and IV compared with class I (14.3±2.3% and 14.3±1.7% vs. 12.9±0.8%, P<0.01). Areas under ROCs (AUCs) of RDW and NT-ProBNP for class IV HF were 0.817 and 0.840, respectively. RDW was markedly elevated in the mortality group compared with the survival group (13.7±1.7 vs. 15.8±1.8, P<0.01). The predictive value of RDW was lower than that of NT-ProBNP but was comparable to white blood cell (WBC), neutrophil (NEU), lymphocyte (L), and neutrophil/lymphocyte ratio (N/L) for mortality during hospitalization, with AUCs of 0.837, 0.939, 0.858, 0.891, 0.885, and 0.885, respectively. RDW and NT-proBNP showed low predictive values for repeated admission (≥3). RDW was an independent risk factor for mortality (OR=2.531, 95% CI: 1.371–4.671). Conclusions RDW increased significantly in class III and IV patients and in the mortality group. The predictive value of RDW is comparable to NT-proBNP for class IV and lower than that of NT-proBNP for mortality. Elevated RDW is an independent risk factor for mortality. PMID:27324271

  1. Traditional and new composite endpoints in heart failure clinical trials: facilitating comprehensive efficacy assessments and improving trial efficiency.

    PubMed

    Anker, Stefan D; Schroeder, Stefan; Atar, Dan; Bax, Jeroen J; Ceconi, Claudio; Cowie, Martin R; Crisp, Adam; Dominjon, Fabienne; Ford, Ian; Ghofrani, Hossein-Ardeschir; Gropper, Savion; Hindricks, Gerhard; Hlatky, Mark A; Holcomb, Richard; Honarpour, Narimon; Jukema, J Wouter; Kim, Albert M; Kunz, Michael; Lefkowitz, Martin; Le Floch, Chantal; Landmesser, Ulf; McDonagh, Theresa A; McMurray, John J; Merkely, Bela; Packer, Milton; Prasad, Krishna; Revkin, James; Rosano, Giuseppe M C; Somaratne, Ransi; Stough, Wendy Gattis; Voors, Adriaan A; Ruschitzka, Frank

    2016-05-01

    Composite endpoints are commonly used as the primary measure of efficacy in heart failure clinical trials to assess the overall treatment effect and to increase the efficiency of trials. Clinical trials still must enrol large numbers of patients to accrue a sufficient number of outcome events and have adequate power to draw conclusions about the efficacy and safety of new treatments for heart failure. Additionally, the societal and health system perspectives on heart failure have raised interest in ascertaining the effects of therapy on outcomes such as repeat hospitalization and the patient's burden of disease. Thus, novel methods for using composite endpoints in clinical trials (e.g. clinical status composite endpoints, recurrent event analyses) are being applied in current and planned trials. Endpoints that measure functional status or reflect the patient experience are important but used cautiously because heart failure treatments may improve function yet have adverse effects on mortality. This paper discusses the use of traditional and new composite endpoints, identifies qualities of robust composites, and outlines opportunities for future research. PMID:27071916

  2. Heart Failure

    MedlinePlus

    Heart Failure What is Heart Failure? In heart failure, the heart cannot pump enough ... failure often experience tiredness and shortness of breath. Heart Failure is Serious Heart failure is a serious and ...

  3. Wilson Loop Diagrams and Positroids

    NASA Astrophysics Data System (ADS)

    Agarwala, Susama; Marin-Amat, Eloi

    2016-07-01

    In this paper, we study a new application of the positive Grassmannian to Wilson loop diagrams (or MHV diagrams) for scattering amplitudes in N = 4 Super Yang-Mills theory (N = 4 SYM). There has been much interest in studying this theory via the positive Grassmannians using BCFW recursion. This is the first attempt to study MHV diagrams for planar Wilson loop calculations (or planar amplitudes) in terms of positive Grassmannians. We codify Wilson loop diagrams completely in terms of matroids. This allows us to apply the combinatorial tools in matroid theory used to identify positroids (non-negative Grassmannians) to Wilson loop diagrams. In doing so, we find that certain non-planar Wilson loop diagrams define positive Grassmannians. While non-planar diagrams do not have physical meaning, this finding suggests that they may have value as an algebraic tool, and deserve further investigation.

  4. ASSESSMENT OF SYNTHETIC MEMBRANE SUCCESSES AND FAILURES AT WASTE STORAGE AND DISPOSAL SITES

    EPA Science Inventory

    Data from 27 lined facilities provided by five vendors were analyzed to determine the factors that contributed to success or failure of the liner at those facilities. The sites studied included a wide variety of wastes handled, liner types, geographic locations, facility ages, fa...

  5. Risk assessment of the emergency processes: Healthcare failure mode and effect analysis

    PubMed Central

    Taleghani, Yasamin Molavi; Rezaei, Fatemeh; Sheikhbardsiri, Hojat

    2016-01-01

    BACKGROUND: Ensuring patient safety is the first vital step in improving the quality of care, and the emergency ward is known as a high-risk area in health care. The present study was conducted to evaluate selected high-risk processes of the emergency surgery department of the Qaem treatment-educational center in Mashhad, using failure mode and effects analysis adapted to health care. METHODS: In this mixed-methods study (qualitative action research and quantitative cross-sectional), the failure modes and effects of 5 high-risk procedures of the emergency surgery department were identified and analyzed according to Healthcare Failure Mode and Effects Analysis (HFMEA). Failure modes were classified using the nursing errors in clinical management model (NECM), the causes of error were classified using the Eindhoven model, and improvement strategies were derived from the theory of inventive problem solving. Descriptive statistics (total scores) were used to analyze the quantitative data; content analysis and agreement among team members were used to analyze the qualitative data. RESULTS: In the 5 processes selected by rating-based voting, 23 steps, 61 sub-processes and 217 potential failure modes were identified by HFMEA. 25 (11.5%) failure modes were identified as high-risk errors and transferred to the decision tree. The most and the least frequent failure modes fell into the categories of care errors (54.7%) and knowledge and skill (9.5%), respectively. Also, 29.4% of preventive measures were in the category of human resource management strategy. CONCLUSION: "Revision and re-engineering of processes", "continuous monitoring of the work", "preparation and revision of operating procedures and policies", "developing criteria for evaluating the performance of personnel", "designing educational content suited to the needs of employees",
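    For readers unfamiliar with the scoring step behind "transferred to the decision tree", the sketch below computes HFMEA-style hazard scores as severity × probability and flags modes at or above a cut-off; the 1–4 scales, the cut-off of 8, and the example failure modes are assumptions made for illustration, not data from this study.

    # Illustrative HFMEA hazard scoring (severity x probability on 1-4 scales).
    # Failure modes, ratings, and the cut-off of 8 are hypothetical assumptions.
    failure_modes = [
        {"name": "wrong patient identification",  "severity": 4, "probability": 2},
        {"name": "delayed triage documentation",  "severity": 2, "probability": 3},
        {"name": "missing surgical consent form", "severity": 3, "probability": 1},
    ]
    CUTOFF = 8

    for fm in failure_modes:
        fm["hazard_score"] = fm["severity"] * fm["probability"]

    for fm in sorted(failure_modes, key=lambda f: -f["hazard_score"]):
        flag = "-> decision tree" if fm["hazard_score"] >= CUTOFF else ""
        print(f'{fm["name"]:32s} score = {fm["hazard_score"]:2d} {flag}')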

  6. Warped penguin diagrams

    SciTech Connect

    Csaki, Csaba; Grossman, Yuval; Tanedo, Philip; Tsai, Yuhsin

    2011-04-01

    We present an analysis of the loop-induced magnetic dipole operator in the Randall-Sundrum model of a warped extra dimension with anarchic bulk fermions and an IR brane-localized Higgs. These operators are finite at one-loop order and we explicitly calculate the branching ratio for μ → eγ using the mixed position/momentum space formalism. The particular bound on the anarchic Yukawa and Kaluza-Klein (KK) scales can depend on the flavor structure of the anarchic matrices. It is possible for a generic model to either be ruled out or unaffected by these bounds without any fine-tuning. We quantify how these models realize this surprising behavior. We also review tree-level lepton flavor bounds in these models and show that these are on the verge of tension with the μ → eγ bounds from typical models with a 3 TeV Kaluza-Klein scale. Further, we illuminate the nature of the one-loop finiteness of these diagrams and show how to accurately determine the degree of divergence of a five-dimensional loop diagram using both the five-dimensional and KK formalism. This power counting can be obfuscated in the four-dimensional Kaluza-Klein formalism and we explicitly point out subtleties that ensure that the two formalisms agree. Finally, we remark on the existence of a perturbative regime in which these one-loop results give the dominant contribution.

  7. Assessment of congestive heart failure in chest radiographs. Observer performance with two common film-screen systems.

    PubMed

    Henriksson, L; Sundin, A; Smedby, O; Albrektsson, P

    1990-09-01

    The effect of observer variations and film-screen quality on the diagnosis of congestive heart failure based on chest radiographs was studied in 27 patients. For each patient, two films were exposed, one with the Kodak Lanex Medium system and one with the Agfa MR 400 system. The films were presented to three observers who assessed the presence of congestive heart failure on a three-graded scale. The results showed no significant difference between the two systems but large systematic differences between the observers. There were also differences between the two ratings by the same observer that could not be explained by the film-screen factor. It is concluded that the choice between these two systems is of little importance in view of the interobserver and intraobserver variability that can exist within the same department. PMID:2261292

  8. Using Eye Tracking to Investigate Semantic and Spatial Representations of Scientific Diagrams during Text-Diagram Integration

    ERIC Educational Resources Information Center

    Jian, Yu-Cin; Wu, Chao-Jung

    2015-01-01

    We investigated strategies used by readers when reading a science article with a diagram and assessed whether semantic and spatial representations were constructed while reading the diagram. Seventy-one undergraduate participants read a scientific article while tracking their eye movements and then completed a reading comprehension test. Our…

  9. Risk-benefit assessment of ivabradine in the treatment of chronic heart failure

    PubMed Central

    Urbanek, Irmina; Kaczmarek, Krzysztof; Cygankiewicz, Iwona; Ptaszynski, Pawel

    2014-01-01

    Heart rate is not only a major risk marker in heart failure but also a general risk marker. Within the last few years, it has been demonstrated that reduction of resting heart rate to <70 bpm is of significant benefit for patients with heart failure, especially those with impaired left ventricular systolic function. Ivabradine is the first innovative drug synthesized to reduce heart rate. It selectively and specifically inhibits the pacemaker If ionic current, which reduces cardiac pacemaker activity. Therefore, the main effect of ivabradine therapy is a substantial lowering of heart rate. Ivabradine does not influence intracardiac conduction, contractility, or ventricular repolarization. According to the European Society of Cardiology guidelines, ivabradine should be considered in symptomatic patients (New York Heart Association functional class II–IV) with sinus rhythm, left ventricular ejection fraction ≤35%, and heart rate ≥70 bpm despite optimal treatment with a beta-blocker, angiotensin-converting enzyme inhibitor/angiotensin receptor blocker, and a mineralocorticoid receptor antagonist. As shown in numerous clinical studies, ivabradine improves clinical outcomes and quality of life and reduces the risk of death from heart failure or cardiovascular causes. Treatment with ivabradine is very well tolerated and safe, even at maximal recommended doses. PMID:24855390

  10. Using Eye Tracking to Investigate Semantic and Spatial Representations of Scientific Diagrams During Text-Diagram Integration

    NASA Astrophysics Data System (ADS)

    Jian, Yu-Cin; Wu, Chao-Jung

    2015-02-01

    We investigated strategies used by readers when reading a science article with a diagram and assessed whether semantic and spatial representations were constructed while reading the diagram. Seventy-one undergraduate participants read a scientific article while tracking their eye movements and then completed a reading comprehension test. Our results showed that the text-diagram referencing strategy was commonly used. However, some readers adopted other reading strategies, such as reading the diagram or text first. We found that all readers who had referred to the diagram spent roughly the same amount of time reading and performed equally well. However, some participants who ignored the diagram performed more poorly on questions that tested understanding of basic facts. This result indicates that dual coding theory may be a possible explanation for the phenomenon. Eye movement patterns indicated that at least some readers had extracted semantic information from the scientific terms when first looking at the diagram. Readers who read the scientific terms on the diagram first tended to spend less time looking at the same terms in the text, which they read afterwards. In addition, clear diagrams can help readers process both semantic and spatial information, thereby facilitating an overall understanding of the article. Finally, although text-first and diagram-first readers spent similar total reading time on the text and diagram parts of the article, respectively, text-first readers made significantly fewer saccades between the text and the diagram than diagram-first readers. This result might be explained by text-directed reading.

  11. Argument Diagramming: The Araucaria Project

    NASA Astrophysics Data System (ADS)

    Rowe, Glenn; Reed, Chris

    Formal arguments, such as those used in science, medicine and law to establish a conclusion by providing supporting evidence, are frequently represented by diagrams such as trees and graphs. We describe the software package Araucaria, which allows textual arguments to be marked up and represented as standard, Toulmin or Wigmore diagrams. Since each of these diagramming techniques was devised for a particular domain of argumentation, we discuss some of the issues involved in translating between diagrams. The exercise of translating between different diagramming types illustrates that any one diagramming system often cannot capture all of the nuances inherent in an argument. Finally, we describe some areas, such as critical thinking courses in colleges and universities and the analysis of evidence in court cases, where Araucaria has been put to practical use.

  12. Micro-compression: a novel technique for the nondestructive assessment of local bone failure.

    PubMed

    Müller, R; Gerber, S C; Hayes, W C

    1998-12-01

    Many bones within the axial and appendicular skeleton are subjected to repetitive, cyclic loading during the course of ordinary daily activities. If this repetitive loading is of sufficient magnitude or duration, fatigue failure of the bone tissue may result. In clinical orthopedics, trabecular fatigue fractures are observed as compressive stress fractures in the proximal femur, vertebrae, calcaneus and tibia, and are often preceded by buckling and bending of microstructural elements. However, the relative importance of bone density and architecture in the etiology of these fractures is poorly understood. The aim of the study was to investigate failure mechanisms of 3D trabecular bone using micro-computed tomography (microCT). Because of its nondestructive nature, microCT represents an ideal approach for performing not only static measurements of bone architecture but also dynamic measurements of failure initiation and propagation as well as damage accumulation. For the purpose of the study, a novel micro-compression device was devised to measure loaded trabecular bone specimens directly in a micro-tomographic system. The measurement window in the device was made of a radiolucent, highly stiff plastic to enable X-rays to penetrate the material. The micro-compressor has an outer diameter of 19 mm and a total length of 65 mm. The internal load chamber fits wet or dry bone specimens with maximal diameters of 9 mm and maximal lengths of 22 mm. For the actual measurement, first, the unloaded bone is measured in the microCT. Second, a load-displacement curve is recorded where the load is measured with an integrated mini-button load cell and the displacement is computed directly from the microCT scout-view. For each load case, a 3D snap-shot of the structure under load is taken providing 34 µm nominal resolution. Initial measurements included specimens from bovine tibiae and whale spine to investigate the influence of the structure type on the failure mechanism. In a

  13. Program Synthesizes UML Sequence Diagrams

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2006-01-01

    A computer program called "Rational Sequence" generates Unified Modeling Language (UML) sequence diagrams of a target Java program running on a Java virtual machine (JVM). Rational Sequence thereby performs a reverse engineering function that aids in the design documentation of the target Java program. Whereas previously the construction of sequence diagrams was a tedious manual process, Rational Sequence generates UML sequence diagrams automatically from the running Java code.

  14. Risk Assessment of Using Entonox for the Relief of Labor Pain: A Healthcare Failure Modes and Effects Analysis Approach

    PubMed Central

    Najafi, Tahereh Fathi; Bahri, Narjes; Ebrahimipour, Hosein; Najar, Ali Vafaee; Taleghani, Yasamin Molavi

    2016-01-01

    Introduction In order to prevent medical errors, it is important to know why they occur and to identify their causes. Healthcare failure modes and effects analysis (HFMEA) is a qualitative, descriptive method used to evaluate risk. The aim of this study was to assess the risks of using Entonox for labor pain by HFMEA. Methods A mixed-methods design (qualitative action research and quantitative cross-sectional research) was used. The modes and effects of failures in the process of using Entonox were detected and analyzed during 2013–2014 at Hefdahe Shahrivar Hospital, Mashhad, Iran. Overall, 52 failure modes were identified, with 25 being recognized as high-risk modes. Results The results revealed that 48.5% of these errors fall into the care process type, 22.05% belong to the communicative type, 19.1% fall into the administrative type, and 10.2% are of the knowledge and skills type. Strategies were presented in the forms of acceptance (3.2%), control (90.3%), and elimination (6.4%). Conclusion The following actions are suggested for improving the process of using Entonox: close supervision by the midwife, precise recording of all the stages of the process in the woman's medical record, the presence of the anesthesiologist at the woman's bedside during labor, confirming the indications for use of Entonox, and close monitoring to ensure the safety of the gas cylinder guards. PMID:27123224

  15. Development of Methodology to Assess the Failure Behaviour of Bamboo Single Fibre by Acoustic Emission Technique

    NASA Astrophysics Data System (ADS)

    Alam, Md. Saiful; Gulshan, Fahmida; Ahsan, Qumrul; Wevers, Martine; Pfeiffer, Helge; van Vuure, Aart-Willem; Osorio, Lina; Verpoest, Ignaas

    2016-06-01

    Acoustic emission (AE) was used as a tool for detecting, evaluating and better understanding the damage mechanisms and failure behavior of composites during mechanical loading. A methodology was developed for the tensile testing of natural fibres (single bamboo fibres). A series of experiments was performed and load drops (one or two) were observed in the load versus time graphs. From the observed AE parameters, such as amplitude, energy and duration, significant information corresponding to the load drops was found. These AE signals at the load drops originated from failures such as debonding between two elementary fibres or from the junction of elementary fibres at an edge. The load at the first load drop was not consistent across samples (for one particular sample the value was 8 N, corresponding to a stress of 517.51 MPa). Final breaking of the fibre corresponded to the saturated AE amplitude of the preamplifier (99.9 dB) for all samples. Therefore, it was not possible to determine the exact AE energy value for the final breaking. The same methodology was used for tensile tests of three single fibres, which gave a clear indication of load drops before the final breaking of the first and second fibres.

  16. Failure assessment of aluminum liner based filament-wound hybrid riser subjected to internal hydrostatic pressure

    NASA Astrophysics Data System (ADS)

    Dikshit, Vishwesh; Seng, Ong Lin; Maheshwari, Muneesh; Asundi, A.

    2015-03-01

    The present study describes the burst behavior of an aluminum liner based prototype filament-wound hybrid riser under internal hydrostatic pressure. The main objectives were to develop an internal pressure test rig set-up for the filament-wound hybrid riser and to investigate its failure modes under internal hydrostatic burst pressure loading. The prototype filament-wound hybrid riser used for the burst test consists of an internal aluminum liner and an outer composite layer. The carbon-epoxy composite layers of the filament-wound hybrid riser were manufactured with a [±55°] lay-up pattern and a total composite layer thickness of 1.6 mm using a CNC filament-winding machine. The burst test was monitored by a video camera, which helped in analyzing the failure mechanism of the fractured filament-wound hybrid riser. A Fiber Bragg Grating (FBG) sensor was used to monitor and record the strain changes during the burst test of the prototype filament-wound hybrid riser. This study shows a good improvement in the burst strength of the filament-wound hybrid riser compared to a monolithic metallic riser. Since strain measurement using FBG sensors has proven to be a reliable method, we aim to use this technique for further detailed understanding.

  17. Sex Differences in Patients With Acute Decompensated Heart Failure: Insights From the Heart Function Assessment Registry Trial in Saudi Arabia.

    PubMed

    AlFaleh, Hussam F; Thalib, Lukman; Kashour, Tarek; Hersi, Ahmad; Mimish, Layth; Elasfar, Abdelfatah A; Almasood, Ali; Al Ghamdi, Saleh; Ghabashi, Abdullah; Malik, Asif; Hussein, Gamal A; Al-Murayeh, Mushabab; Abuosa, Ahmed; Al Habeeb, Waleed; Al Habib, Khalid F

    2016-08-01

    We assessed sex-specific differences in clinical features and outcomes of patients with acute heart failure (AHF). The Heart function Assessment Registry Trial in Saudi Arabia (HEARTS), a prospective registry, enrolled 2609 patients with AHF (34.2% women) between 2009 and 2010. Women were older and more likely to have risk factors for atherosclerosis, history of heart failure (HF), and rheumatic heart and valve disease. Ischemic heart disease was the prime cause for HF in men and women but more so in men (P < .001). Women had higher rates of hypertensive heart disease and primary valve disease (P < .001, for both comparisons). Men were more likely to have severe left ventricular systolic dysfunction. On discharge, a higher use of angiotensin-converting enzyme inhibitors, β-blockers, and aldosterone inhibitors was observed in men (P < .001 for all comparisons). Apart from higher atrial fibrillation in women and higher ventricular arrhythmias in men, no differences were observed in hospital outcomes. The overall survival did not differ between men and women (hazard ratio: 1.0, 95% confidence interval: 0.8-1.2, P = .981). Men and women with AHF differ significantly in baseline clinical characteristics and management but not in adverse outcomes. PMID:26438635

  18. Risk assessment of Giardia from a full scale MBR sewage treatment plant caused by membrane integrity failure.

    PubMed

    Zhang, Yu; Chen, Zhimin; An, Wei; Xiao, Shumin; Yuan, Hongying; Zhang, Dongqing; Yang, Min

    2015-04-01

    Membrane bioreactors (MBR) are highly efficient at intercepting particles and microbes and have become an important technology for wastewater reclamation. However, many pathogens can accumulate in activated sludge due to the long residence time usually adopted in MBR, and thus may pose health risks when membrane integrity problems occur. This study presents data from a survey on the occurrence of water-borne Giardia pathogens in reclaimed water from a full-scale wastewater treatment plant with MBR experiencing membrane integrity failure, and assessed the associated risk for green space irrigation. Due to membrane integrity failure, the MBR effluent turbidity varied between 0.23 and 1.90 NTU over a period of eight months. Though this turbidity level still met reclaimed water quality standards (≤5 NTU), Giardia were detected at concentrations of 0.3 to 95 cysts/10 L, with a close correlation between effluent turbidity and Giardia concentration. All β-giardin gene sequences of Giardia in the WWTP influents were genotyped as Assemblages A and B, both of which are known to infect humans. An exponential dose-response model was applied to assess the risk of infection by Giardia. The risk in the MBR effluent with chlorination was 9.83×10⁻³, higher than the acceptable annual risk of 1.0×10⁻⁴. This study suggested that membrane integrity is very important for keeping a low pathogen level, and multiple barriers are needed to ensure the biological safety of MBR effluent. PMID:25872734
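    The infection risk figures quoted above come from an exponential dose-response model of the form P = 1 - exp(-r·d). A minimal sketch is shown below; the dose-response parameter, ingestion volume, and exposure frequency are illustrative placeholders rather than the values used in the study.

    # Exponential dose-response sketch for Giardia exposure (illustrative parameters).
    import math

    r = 0.0199              # Giardia dose-response parameter (commonly cited value; treat as an assumption)
    cysts_per_10L = 10.0    # effluent concentration, cysts per 10 L (hypothetical)
    ingested_L = 0.001      # volume accidentally ingested per irrigation event, litres (hypothetical)
    events_per_year = 50    # exposure events per year (hypothetical)

    dose = cysts_per_10L / 10.0 * ingested_L              # cysts ingested per event
    p_event = 1.0 - math.exp(-r * dose)                   # per-event infection probability
    p_annual = 1.0 - (1.0 - p_event) ** events_per_year   # annual infection probability
    print(f"per-event risk = {p_event:.2e}, annual risk = {p_annual:.2e}")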

  19. An assessment of BWR (boiling water reactor) Mark-II containment challenges, failure modes, and potential improvements in performance

    SciTech Connect

    Kelly, D.L.; Jones, K.R.; Dallman, R.J. ); Wagner, K.C. )

    1990-07-01

    This report assesses challenges to BWR Mark II containment integrity that could potentially arise from severe accidents. Also assessed are some potential improvements that could prevent core damage or containment failure, or could mitigate the consequences of such failure by reducing the release of fission products to the environment. These challenges and improvements are analyzed via a limited quantitative risk/benefit analysis of a generic BWR/4 reactor with Mark II containment. Point estimate frequencies of the dominant core damage sequences are obtained and simple containment event trees are constructed to evaluate the response of the containment to these severe accident sequences. The resulting containment release modes are then binned into source term release categories, which provide inputs to the consequence analysis. The output of the consequence analysis is used to construct an overall base case risk profile. Potential improvements and sensitivities are evaluated by modifying the event tree split fractions, thus generating a revised risk profile. Several important sensitivity cases are examined to evaluate the impact of phenomenological uncertainties on the final results. 75 refs., 25 figs., 65 tabs.

  20. Contingency diagrams as teaching tools

    PubMed Central

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching. PMID:22478208

  1. Potential-pH Diagrams.

    ERIC Educational Resources Information Center

    Barnum, Dennis W.

    1982-01-01

    Potential-pH diagrams show the domains of redox potential and pH in which major species are most stable. Constructing such diagrams provides students with opportunities to decide what species must be considered, search the literature for equilibrium constants and free energies of formation, and practice using the Nernst equation. (Author/JN)
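    As a concrete illustration of the Nernst-equation step mentioned above, the sketch below computes points on a potential-pH boundary line at 25 °C for a half-reaction consuming m protons and n electrons; the half-reaction and its standard potential are an arbitrary textbook example, not taken from the article.

    # Slope of a potential-pH (Pourbaix) boundary at 25 C: E = E0 - 0.05916*(m/n)*pH
    # Example half-reaction (illustrative): MnO2 + 4 H+ + 2 e- -> Mn2+ + 2 H2O, E0 ~ 1.23 V
    E0 = 1.23     # standard potential, volts (approximate literature value; treat as an assumption)
    m, n = 4, 2   # protons consumed and electrons transferred

    def boundary_potential(pH, log10_activity_ratio=0.0):
        """Boundary potential (V vs SHE) at a given pH, unit activities by default."""
        return E0 - 0.05916 * (m / n) * pH - (0.05916 / n) * log10_activity_ratio

    for pH in (0, 4, 7, 10, 14):
        print(f"pH {pH:2d}: E = {boundary_potential(pH):+.3f} V")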

  2. Moving Toward Comprehensive Acute Heart Failure Risk Assessment in the Emergency Department

    PubMed Central

    Collins, Sean P.; Storrow, Alan B.

    2013-01-01

    Nearly 700,000 emergency department (ED) visits were due to acute heart failure (AHF) in 2009. Most visits result in a hospital admission and account for the largest proportion of a projected $70 billion to be spent on heart failure care by 2030. ED-based risk prediction tools in AHF rarely impact disposition decision making. This is a major factor contributing to the 80% admission rate for ED patients with AHF, which has remained unchanged over the last several years. Self-care behaviors such as symptom monitoring, medication taking, dietary adherence, and exercise have been associated with decreased hospital readmissions, yet self-care remains largely unaddressed in ED patients with AHF and thus represents a significant lost opportunity to improve patient care and decrease ED visits and hospitalizations. Furthermore, shared decision making encourages collaborative interaction between patients, caregivers, and providers to drive a care path based on mutual agreement. The observation that “difficult decisions now will simplify difficult decisions later” has particular relevance to the ED, given this is the venue for many such issues. We hypothesize patients as complex and heterogeneous as ED patients with AHF may need both an objective evaluation of physiologic risk as well as an evaluation of barriers to ideal self-care, along with strategies to overcome these barriers. Combining physician gestalt, physiologic risk prediction instruments, an evaluation of self-care, and an information exchange between patient and provider using shared decision making may provide the critical inertia necessary to discharge patients home after a brief ED evaluation. PMID:24159563

  3. Risk assessment of drain valve failure in the K-West basin south loadout pit

    SciTech Connect

    MORGAN, R.G.

    1999-06-23

    The drain valve located in the bottom of the K-West Basin South Loadout Pit (SLOP) could provide an additional leak path from the K Basins if the drain valve were damaged during construction, installation, or operation of the cask loading system. For the K-West Basin SLOP the immersion pail support structure (IPSS) has already been installed, but the immersion pail has not been installed in the IPSS. The objective of this analysis is to evaluate the risk of damaging the drain valve during the remaining installation activities or operation of the cask loading system. Valve damage, as used in this analysis, does not necessarily imply that large amounts of water will be released quickly from the basin; rather, valve damage implies that the valve's integrity has been compromised. The analysis process is a risk-based uncertainty analysis where best engineering judgement is used to represent each variable in the analysis. The uncertainty associated with each variable is represented by a probability distribution. The uncertainty is propagated through the analysis by Monte Carlo convolution techniques. The corresponding results are developed as a probability distribution and the risk is expressed in terms of the corresponding complementary cumulative distribution function ("risk curve"). The total risk is the area under the "risk curve". The risk of potentially dropping a cask into or on the IPSS and damaging the drain valve is approximately 1 × 10⁻⁴ to 2 × 10⁻⁵ per year. The risk of objects falling behind the IPSS and damaging the valve is 3 × 10⁻² to 6 × 10⁻³ per year. Both risks are expressed as drain valve failure frequencies. The risk of objects falling behind the IPSS and damaging the valve can be significantly reduced by an impact limiter and/or by installing a grating or plate over the area bounded by the back of the IPSS and the wall of the SLOP. With either of these actions there is a 90 percent confidence that the frequency of drain valve
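    A minimal sketch of the Monte Carlo convolution and "risk curve" (complementary cumulative distribution function) construction described above is given below; the input distributions and parameter values are placeholders, not those of the K-West Basin analysis.

    # Monte Carlo propagation of uncertain inputs into a damage-frequency estimate,
    # followed by a complementary cumulative distribution ("risk curve").
    # All distributions and parameters below are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    lifts_per_year  = rng.triangular(5, 10, 20, size=n)                    # cask lifts per year
    p_drop_per_lift = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=n)  # P(drop | lift)
    p_valve_damage  = rng.uniform(0.05, 0.3, size=n)                       # P(valve damage | drop)

    freq = lifts_per_year * p_drop_per_lift * p_valve_damage               # damage frequency, 1/yr

    levels = np.logspace(-8, -3, 6)
    for x in levels:
        print(f"P(frequency > {x:.0e}/yr) = {(freq > x).mean():.3f}")
    print(f"mean = {freq.mean():.2e}/yr, 90th percentile = {np.percentile(freq, 90):.2e}/yr")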

  4. Concrete and abstract Voronoi diagrams

    SciTech Connect

    Klein, R. )

    1989-01-01

    The Voronoi diagram of a set of sites is a partition of the plane into regions, one to each site, such that the region of each site contains all points of the plane that are closer to this site than to the other ones. Such partitions are of great importance to computer science and many other fields. The challenge is to compute Voronoi diagrams quickly. The problem is that their structure depends on the notion of distance and the sort of site. In this book the author proposes a unifying approach by introducing abstract Voronoi diagrams. These are based on the concept of bisecting curves, which are required to have some simple properties that are actually possessed by most bisectors of concrete Voronoi diagrams. Abstract Voronoi diagrams can be computed efficiently and there exists a worst-case efficient algorithm of divide-and-conquer type that applies to all abstract Voronoi diagrams satisfying a certain constraint. The author shows that this constraint is fulfilled by the concrete diagrams based on large classes of metrics in the plane.
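    For the concrete Euclidean case that abstract Voronoi diagrams generalize, a minimal computation sketch using SciPy is shown below; the point set is arbitrary.

    # Concrete (Euclidean) Voronoi diagram of a small point set.
    import numpy as np
    from scipy.spatial import Voronoi

    sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
    vor = Voronoi(sites)

    print("Voronoi vertices:")
    print(vor.vertices)
    # Each ridge is a piece of the perpendicular bisector of the two sites it separates;
    # a vertex index of -1 means the ridge extends to infinity.
    for (p, q), verts in zip(vor.ridge_points, vor.ridge_vertices):
        print(f"sites {p} and {q} share a ridge with vertex indices {verts}")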

  5. Assessing the Effects of the "Rocket Math" Program with a Primary Elementary School Student at Risk for School Failure: A Case Study

    ERIC Educational Resources Information Center

    Smith, Christina R.; Marchand-Martella, Nancy E.; Martella, Ronald C.

    2011-01-01

    This study assessed the effects of the "Rocket Math" program on the math fluency skills of a first grade student at risk for school failure. The student received instruction in the "Rocket Math" program over 6 months. He was assessed using a pre- and posttest curriculum-based measurement (CBM) and individualized fluency checkouts within the…

  6. Consequences and assessment of human vestibular failure: implications for postural control.

    PubMed

    Colebatch, James G

    2002-01-01

    Labyrinthine afferents respond to both angular velocity (semicircular canals) and linear acceleration (otoliths), including gravity. Given their response to gravity, the otoliths are likely to have an important role in the postural functions of the vestibular apparatus. Unilateral vestibular ablation has dramatic effects on posture in many animals, but less so in primates. Nevertheless, bilateral vestibular lesions lead to disabling symptoms in man related to disturbed ocular and postural control and impaired perception of slopes and accelerations. While semicircular canal function can be assessed through its effects on vestibular ocular reflexes, assessment of otolith function in man has traditionally been much more difficult. Recent definition of a short latency vestibulocollic reflex, activated by sound and appearing to arise from the saccule, shows promise as a new method of non-invasive assessment of otolith function. PMID:12171099

  7. Students' different understandings of class diagrams

    NASA Astrophysics Data System (ADS)

    Boustedt, Jonas

    2012-03-01

    The software industry needs well-trained software designers, and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task for many students. This article reports empirical findings from a phenomenographic investigation of how students understand class diagrams, Unified Modeling Language (UML) symbols, and relations to object-oriented (OO) concepts. The informants were 20 Computer Science students from four different universities in Sweden. The results show qualitatively different ways to understand and describe UML class diagrams and the "diamond symbols" representing aggregation and composition. The purpose of class diagrams was understood in varied ways, from describing them as documentation to a more advanced view related to communication. The descriptions of class diagrams varied from seeing them as a specification of classes to a more advanced view, where they were described as showing hierarchical structures of classes and relations. The diamond symbols were seen as "relations", and a more advanced way was seeing the white and the black diamonds as different symbols for aggregation and composition. As a consequence of the results, it is recommended that UML be adopted in courses. It is briefly indicated how the phenomenographic results in combination with variation theory can be used by teachers to enhance students' possibilities to reach an advanced understanding of phenomena related to UML class diagrams. Moreover, it is recommended that teachers put more effort into assessing skills in the proper usage of the basic symbols and models, and that students be provided with opportunities to practise collaborative design, e.g. using whiteboards.

  8. Assessment of filter dust characteristics that cause filter failure during hot-gas filtration

    SciTech Connect

    John P. Hurley; Biplab Mukherjee; Michael D. Mann

    2006-08-15

    The high-temperature filtration of particulates from gases is greatly limited by the development of dust cakes that are difficult to remove and can bridge between candle filters, causing them to break. Understanding the conditions leading to the formation of cohesive dust can prevent costly filter failures and ensure higher efficiency of solid fuel, direct-fired turbine power generation systems. The University of North Dakota Energy & Environmental Research Center is working with the New Energy and Industrial Technology Development Organization and the U.S. Department of Energy to perform research to characterize and determine the factors that cause the development of such dust cakes. Changes in the tensile strength, bridging propensity, and plasticity of filter dust cakes were measured as a function of temperature and filter pressure drop for a coal and a biomass filter dust. The biomass filter dust indicated that potential filtering problems can exist at temperatures as low as 400°C, while the coal filter dust showed good filtering characteristics up to 750°C. A statistically valid model that can indicate the propensity of filters to fail under given system operating conditions was developed. A detailed analysis of the chemical aspects of the dusts is also presented in order to explore the causes of such stickiness. 16 refs., 10 figs., 3 tabs.

  9. Assessment of systolic and diastolic function in heart failure using ambulatory monitoring with acoustic cardiography.

    PubMed

    Dillier, Roger; Zuber, Michel; Arand, Patricia; Erne, Susanne; Erne, Paul

    2011-08-01

    INTRODUCTION. The circadian variation of heart function and heart sounds in patients with and without heart failure (HF) is poorly understood. We hypothesized HF patients would exhibit less circadian variation with worsened cardiac function and sleep apnea. METHODS. We studied 67 HF patients (age 67.4 ± 8.2 years; 42% acute HF) and 63 asymptomatic control subjects with no history of HF (age 61.6 ± 7.7 years). Subjects wore a heart sound/ECG/respiratory monitor. The data were analyzed for sleep apnea, diastolic heart sounds, and systolic time intervals. RESULTS. The HF group had significantly greater prevalence of the third heart sound and prolongation of electro-mechanical activation time, while the control group had an age-related increase in the prevalence of the fourth heart sound. The control group showed more circadian variation in cardiac function. The HF subjects had more sleep apnea and higher occurrence of heart rate non-dipping. CONCLUSIONS. The control subjects demonstrated an increasing incidence of diastolic dysfunction with age, while systolic function was mostly unchanged with aging. Parameters related to systolic function were significantly worse in the HF group with little diurnal variation, indicating a constant stimulation of sympathetic tone in HF and reduction of diurnal regulation. PMID:21361859

  10. Failure Impact Analysis of Key Management in AMI Using Cybernomic Situational Assessment (CSA)

    SciTech Connect

    Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R; Lantz, Margaret W; Mili, Ali

    2013-01-01

    In earlier work, we presented a computational framework for quantifying the security of a system in terms of the average loss a stakeholder stands to sustain as a result of threats to the system. We named this system the Cyberspace Security Econometrics System (CSES). In this paper, we refine the framework and apply it to cryptographic key management within the Advanced Metering Infrastructure (AMI) as an example. The stakeholders, requirements, components, and threats are determined. We then populate the matrices with justified values by addressing the AMI at a higher level, rather than trying to consider every piece of hardware and software involved. We accomplish this task by leveraging the recently established NISTIR 7628 guideline for smart grid security. This allowed us to choose the stakeholders, requirements, components, and threats realistically. We reviewed the literature and worked with an industry technical working group to select three representative threats from a collection of 29 threats. From this subset, we populate the stakes, dependency, and impact matrices, and the threat vector with realistic numbers. Each stakeholder's Mean Failure Cost is then computed.
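    A minimal numpy sketch of the kind of matrix computation the CSES framework describes, chaining a stakes matrix, a dependency matrix, an impact matrix, and a threat-probability vector into a mean failure cost per stakeholder. The dimensions and every numeric value are illustrative assumptions, not the AMI figures from the paper.

    # Mean Failure Cost (MFC) style computation: MFC = ST @ DP @ IM @ PT
    # 2 stakeholders, 3 requirements, 2 components, 3 threats -- all values hypothetical.
    import numpy as np

    ST = np.array([[100.0, 50.0, 20.0],   # loss per stakeholder if each requirement fails ($/period)
                   [ 30.0, 80.0, 10.0]])
    DP = np.array([[0.6, 0.2],            # P(requirement fails | component fails)
                   [0.3, 0.7],
                   [0.1, 0.9]])
    IM = np.array([[0.5, 0.1, 0.0],       # P(component fails | threat materializes)
                   [0.2, 0.4, 0.3]])
    PT = np.array([0.01, 0.005, 0.02])    # P(threat materializes) during the period

    mfc = ST @ DP @ IM @ PT               # expected loss per stakeholder
    for i, cost in enumerate(mfc):
        print(f"stakeholder {i}: mean failure cost = {cost:.2f} $/period")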

  11. 30 CFR 218.40 - Assessments for incorrect or late reports and failure to report.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... INTERIOR MINERALS REVENUE MANAGEMENT COLLECTION OF MONIES AND PROVISION FOR GEOTHERMAL CREDITS AND... MMS by the designated due date for geothermal, solid minerals, and Indian oil and gas leases. (b) An... geothermal, solid minerals, and Indian oil and gas leases. (c) For purpose of assessments discussed in...

  12. Risk assessment of failure modes of gas diffuser liner of V94.2 siemens gas turbine by FMEA method

    NASA Astrophysics Data System (ADS)

    Mirzaei Rafsanjani, H.; Rezaei Nasab, A.

    2012-05-01

    Failure of the welded connection between the gas diffuser liner and the exhaust casing is one of the failure modes of V94.2 gas turbines and has occurred in some power plants. This defect is one of the uncertainties customers face when deciding whether to accept the final commissioning of this product. Accordingly, the risk priority of this failure was evaluated by the failure modes and effects analysis (FMEA) method to find out whether this failure is catastrophic for turbine performance and harmful to humans. Using the service history of 110 gas turbines of this model operating in several power plants, the severity number, occurrence number and detection number of the failure were determined, and consequently the Risk Priority Number (RPN) of the failure was determined. Finally, a criticality matrix of potential failures was created, illustrating that the failure modes are located in the safe zone.
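    A minimal sketch of the RPN step described above, where RPN = severity × occurrence × detection on 1–10 scales; the failure modes and ratings shown are hypothetical, not the values derived from the 110-turbine history.

    # RPN = Severity x Occurrence x Detection (1-10 scales); all ratings hypothetical.
    failure_modes = {
        "weld crack, diffuser liner to exhaust casing": (7, 4, 5),
        "liner buckling from thermal cycling":          (6, 3, 4),
        "loose fastener in diffuser support":           (4, 5, 3),
    }

    ranked = sorted(failure_modes.items(), key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
    for name, (s, o, d) in ranked:
        print(f"{name:46s} S={s} O={o} D={d} RPN={s * o * d}")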

  13. Cardiac status assessment with a multi-signal device for improved home-based congestive heart failure management.

    PubMed

    Muehlsteff, Jens; Carvalho, Paulo; Henriques, Jorge; Paiva, Rui P; Reiter, Harald

    2011-01-01

    State-of-the-art disease management for Congestive Heart Failure (CHF) patients is still based on easy-to-acquire measures such as heart rate (HR), weight and blood pressure (BP). However, these measures respond late to changes in the patient's health status and provide limited information to personalize and adapt medication therapy. This paper describes our concept called "Cardiac Status Assessment", which we have been investigating within the European project "HeartCycle" towards next-generation home-based disease management of CHF. In our concept we analyze non-invasive surrogate measures of cardiovascular function, in particular systolic time intervals and pulse wave characteristics, to estimate Cardiac Output (CO) and Systemic Vascular Resistance (SVR), both of which are established clinical measures. We discuss the underlying concept, a developed measurement system and first results. PMID:22254450

  14. Personalized risk assessment of heart failure patients: more perspectives from transforming growth factor super-family members.

    PubMed

    Goletti, S; Gruson, D

    2015-03-30

    More personalized risk assessment of patients with heart failure (HF) is important to develop more tailored care and for a better allocation of resources. The measurement of biomarkers is now part of the standards of care and is important for the sub-phenotyping of HF patients, to demonstrate the activation of pathophysiological pathways engaged in the worsening of HF. Sub-phenotyping of patients can therefore lead to a more personalized selection of treatment. Several members of the transforming growth factor β (TGF-β) super-family, such as myostatin, activin A, GDF-15 and GDF-11, are involved in cardiac remodeling, and the evaluation of their circulating levels might provide new insights into the course of the disease and also guide prognostication and therapeutic selection of HF patients. PMID:25260834

  15. The Hertzsprung-Russell Diagram.

    ERIC Educational Resources Information Center

    Woodrow, Janice

    1991-01-01

    Describes a classroom use of the Hertzsprung-Russell diagram to infer not only the properties of a star but also the star's probable stage in evolution, life span, and age of the cluster in which it is located. (ZWH)

  16. Atemporal diagrams for quantum circuits

    SciTech Connect

    Griffiths, Robert B.; Wu Shengjun; Yu Li; Cohen, Scott M.

    2006-05-15

    A system of diagrams is introduced that allows the representation of various elements of a quantum circuit, including measurements, in a form which makes no reference to time (hence 'atemporal'). It can be used to relate quantum dynamical properties to those of entangled states (map-state duality), and suggests useful analogies, such as the inverse of an entangled ket. Diagrams clarify the role of channel kets, transition operators, dynamical operators (matrices), and Kraus rank for noisy quantum channels. Positive (semidefinite) operators are represented by diagrams with a symmetry that aids in understanding their connection with completely positive maps. The diagrams are used to analyze standard teleportation and dense coding, and for a careful study of unambiguous (conclusive) teleportation. A simple diagrammatic argument shows that a Kraus rank of 3 is impossible for a one-qubit channel modeled using a one-qubit environment in a mixed state.

  17. Particles, Feynman Diagrams and All That

    ERIC Educational Resources Information Center

    Daniel, Michael

    2006-01-01

    Quantum fields are introduced in order to give students an accurate qualitative understanding of the origin of Feynman diagrams as representations of particle interactions. Elementary diagrams are combined to produce diagrams representing the main features of the Standard Model.

  18. An assessment of the state of the art in predicting the failure of ceramics: Final report

    SciTech Connect

    Boulet, J.A.M.

    1988-03-01

    The greatest weakness in existing design strategies for brittle fracture is in the narrow range of conditions for which the strategies are adequate. The primary reason for this weakness is the use of simplistic mechanical models of fracture processes and unverified statistical models of materials. To improve the design methodology, the models must first be improved. Specifically recommended research goals are: to develop models of cracks with realistic geometry under arbitrary stress states; to identify and model the most important relationships between fracture processes and microstructural features; to assess the technology available for acquiring statistical data on microstructure and flaw populations, and to establish the amount of data required for verification of statistical models; and to establish a computer-based fracture simulation that can incorporate a wide variety of mechanical and statistical models and crack geometries, as well as arbitrary stress states. 204 refs., 2 tabs.

  19. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy.

    PubMed

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-12-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events. PMID:25379678

  20. Analysis of Japanese banks’ historical tree diagram

    NASA Astrophysics Data System (ADS)

    Ueno, Hiromichi; Mizuno, Takayuki; Takayasu, Misako

    2007-09-01

    Using the historical data from the Japanese banks' database at "The Bankers Library" of the Japanese Bankers Association, we analyze the historical network of banks from 1868 to 2006. Firstly, we represent each bank in each year by a particle and draw the space-time evolution of mergers, divisions, establishments, and failures as a tree diagram structure. We found that the distributions of tree basin size for the real data and the simulation results fit well. Secondly, we analyze the raw data of financial statements of banks collected by the National Diet Library. We confirm that the distributions of deposit amounts have fat tails every year; however, small deviations are observed related to governmental policy.

  1. Classification tree for risk assessment in patients suffering from congestive heart failure via long-term heart rate variability.

    PubMed

    Melillo, Paolo; De Luca, Nicola; Bracale, Marcello; Pecchia, Leandro

    2013-05-01

    This study aims to develop an automatic classifier for risk assessment in patients suffering from congestive heart failure (CHF). The proposed classifier separates lower risk patients from higher risk ones, using standard long-term heart rate variability (HRV) measures. Patients are labeled as lower or higher risk according to the New York Heart Association (NYHA) classification. A retrospective analysis of two public Holter databases was performed, analyzing the data of 12 patients suffering from mild CHF (NYHA I and II), labeled as lower risk, and 32 suffering from severe CHF (NYHA III and IV), labeled as higher risk. Only patients with a fraction of total heartbeat intervals (RR) classified as normal-to-normal (NN) intervals (NN/RR) higher than 80% were selected as eligible, in order to ensure satisfactory signal quality. Classification and regression tree (CART) analysis was employed to develop the classifiers. A total of 30 higher risk and 11 lower risk patients were included in the analysis. The proposed classification trees achieved a sensitivity and a specificity of 93.3% and 63.6%, respectively, in identifying higher risk patients. Finally, the rules obtained by CART are comprehensible and consistent with the consensus shown by previous studies that depressed HRV is a useful tool for risk assessment in patients suffering from CHF. PMID:24592473
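    A minimal sketch of the classification step described above, training a decision tree (CART) on long-term HRV features and reporting sensitivity and specificity for the higher-risk class; the feature matrix and labels are randomly generated placeholders, not Holter data.

    # CART classifier on HRV-like features with sensitivity/specificity (synthetic data).
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(42)
    n = 200
    # Columns stand in for standard long-term HRV measures (e.g. SDNN, RMSSD, LF/HF); values are synthetic.
    X = rng.normal(size=(n, 3))
    y = (X[:, 0] + 0.5 * rng.normal(size=n) < 0).astype(int)   # 1 = higher risk (synthetic labelling rule)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

    tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
    print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")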

  2. Bootstrapping & Separable Monte Carlo Simulation Methods Tailored for Efficient Assessment of Probability of Failure of Dynamic Systems

    NASA Astrophysics Data System (ADS)

    Jehan, Musarrat

    The response of a dynamic system is random. There is randomness in both the applied loads and the strength of the system. Therefore, to account for the uncertainty, the safety of the system must be quantified using its probability of survival (reliability). Monte Carlo Simulation (MCS) is a widely used method for probabilistic analysis because of its robustness. However, a challenge in reliability assessment using MCS is that the high computational cost limits the accuracy of MCS. Haftka et al. [2010] developed an improved sampling technique for reliability assessment called separable Monte Carlo (SMC) that can significantly increase the accuracy of estimation without increasing the cost of sampling. However, this method was applied to time-invariant problems involving two random variables only. This dissertation extends SMC to random vibration problems with multiple random variables. This research also develops a novel method for estimation of the standard deviation of the probability of failure of a structure under static or random vibration. The method is demonstrated on quarter car models and a wind turbine. The proposed method is validated using repeated standard MCS.
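    A minimal sketch contrasting crude Monte Carlo with the separable reuse of samples when the limit state splits into a response term and a capacity term; the distributions and sample sizes are placeholders, and the simple pairing scheme is only an illustration of the idea, not the dissertation's method.

    # Crude MCS vs. separable reuse of samples for P(failure) = P(response > capacity).
    # Distributions and sample sizes are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(7)
    M = 2000   # response samples (typically expensive, e.g. from dynamic simulation)
    N = 2000   # capacity samples (typically cheap, e.g. material strength data)

    response = rng.lognormal(mean=np.log(80.0), sigma=0.25, size=M)   # e.g. peak stress, MPa
    capacity = rng.normal(loc=120.0, scale=15.0, size=N)              # e.g. strength, MPa

    # Crude MCS: pair sample i with sample i, giving M comparisons.
    pf_crude = np.mean(response > capacity[:M])

    # Separable reuse: compare every response sample with every capacity sample, M*N comparisons.
    pf_separable = np.mean(response[:, None] > capacity[None, :])

    print(f"crude MCS       P_f ~ {pf_crude:.4f}")
    print(f"separable reuse P_f ~ {pf_separable:.4f}")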

  3. Assessment of formulation robustness for nano-crystalline suspensions using failure mode analysis or derisking approach.

    PubMed

    Nakach, Mostafa; Authelin, Jean-René; Voignier, Cecile; Tadros, Tharwat; Galet, Laurence; Chamayou, Alain

    2016-06-15

    The small particle size of nano-crystalline suspensions can be responsible for their physical instability during drug product preparation (downstream processing), storage and administration. For that reason, the commercial formulation needs to be sufficiently robust to various triggering conditions, such as ionic strength, shear rate, wetting/dispersing agent desorption by dilution, temperature and pH variation. In our previous work we described a systematic approach to select a suitable wetting/dispersing agent for the stabilization of a nano-crystalline suspension. In this paper, we describe the assessment of the robustness of a formulation stabilized using a mixture of sodium dodecyl sulfate (SDS) and polyvinylpyrrolidone (PVP) by measuring the rate of perikinetic (diffusion-controlled) and orthokinetic (shear-induced) aggregation as a function of ionic strength, temperature, pH and dilution. The results showed that, using the SDS/PVP system, the critical coagulation concentration is about five times higher than that reported in the literature for suspensions that are colloidally stable at high concentration. The nano-suspension was also found to be very stable at ambient temperature and at different pH conditions. A desorption test confirmed the high affinity between the API and the wetting/dispersing agent. However, the suspension undergoes aggregation at high temperature due to the desorption of the wetting/dispersing agent and disaggregation of SDS micelles. Furthermore, aggregation occurs at very high shear rate (orthokinetic aggregation) by overcoming the energy barrier responsible for the colloidal stability of the system. PMID:27102992

  4. Pseudohaptic interaction with knot diagrams

    NASA Astrophysics Data System (ADS)

    Weng, Jianguang; Zhang, Hui

    2012-07-01

    To make progress in understanding knot theory, we need to interact with the projected representations of mathematical knots, which are continuous in three dimensions (3-D) but significantly interrupted in the projective images. One way to achieve such a goal is to design an interactive system that allows us to sketch two-dimensional (2-D) knot diagrams by taking advantage of a collision-sensing controller and explore their underlying smooth structures through a continuous motion. Recent advances of interaction techniques have been made that allow progress in this direction. Pseudohaptics that simulate haptic effects using pure visual feedback can be used to develop such an interactive system. We outline one such pseudohaptic knot diagram interface. Our interface derives from the familiar pencil-and-paper process of drawing 2-D knot diagrams and provides haptic-like sensations to facilitate the creation and exploration of knot diagrams. A centerpiece of the interaction model simulates a physically reactive mouse cursor, which is exploited to resolve the apparent conflict between the continuous structure of the actual smooth knot and the visual discontinuities in the knot diagram representation. Another value in exploiting pseudohaptics is that an acceleration (or deceleration) of the mouse cursor (or surface locator) can be used to indicate the slope of the curve (or surface) of which the projective image is being explored. By exploiting these additional visual cues, we proceed to a full-featured extension to a pseudohaptic four-dimensional (4-D) visualization system that simulates the continuous navigation on 4-D objects and allows us to sense the bumps and holes in the fourth dimension. Preliminary tests of the software show that main features of the interface overcome some expected perceptual limitations in our interaction with 2-D knot diagrams of 3-D knots and 3-D projective images of 4-D mathematical objects.

  5. Kidney Failure

    MedlinePlus


  6. 30 CFR 1218.41 - Assessments for failure to submit payment of same amount as Form ONRR-2014 or bill document or to...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... same amount as Form ONRR-2014 or bill document or to provide adequate information. 1218.41 Section 1218... General Provisions § 1218.41 Assessments for failure to submit payment of same amount as Form ONRR-2014 or... Form ONRR-2014, Form ONRR-4430, or a bill document, unless ONRR has authorized the difference in...

  7. 30 CFR 1218.41 - Assessments for failure to submit payment of same amount as Form MMS-2014 or bill document or to...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... same amount as Form MMS-2014 or bill document or to provide adequate information. 1218.41 Section 1218... Provisions § 1218.41 Assessments for failure to submit payment of same amount as Form MMS-2014 or bill... leases is not equivalent in amount to the total of individual line items on the associated Form...

  8. 30 CFR 218.41 - Assessments for failure to submit payment of same amount as Form MMS-2014 or bill document or to...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... same amount as Form MMS-2014 or bill document or to provide adequate information. 218.41 Section 218.41... Assessments for failure to submit payment of same amount as Form MMS-2014 or bill document or to provide... equivalent in amount to the total of individual line items on the associated Form MMS-2014, Form MMS-4430,...

  9. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
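    A minimal sketch of the stated relationship between the expected loss given failure and the per-mode expected losses weighted by the conditional probabilities that each mode initiates failure; the mode names, probabilities, and loss figures are hypothetical.

    # Expected loss given failure as a probability-weighted sum over mutually exclusive failure modes.
    # Mode names, conditional probabilities, and per-mode expected losses are hypothetical.
    failure_modes = {
        "seal leak":       {"p_given_failure": 0.50, "expected_loss": 10_000.0},
        "bearing seizure": {"p_given_failure": 0.30, "expected_loss": 45_000.0},
        "shaft fracture":  {"p_given_failure": 0.20, "expected_loss": 200_000.0},
    }
    assert abs(sum(m["p_given_failure"] for m in failure_modes.values()) - 1.0) < 1e-9

    loss_given_failure = sum(m["p_given_failure"] * m["expected_loss"] for m in failure_modes.values())

    failures_per_year = 0.15   # expected number of failures per year (hypothetical)
    print(f"E[loss | failure]      = {loss_given_failure:,.0f}")
    print(f"expected yearly losses = {failures_per_year * loss_given_failure:,.0f}")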

  10. Generalized discriminant analysis for congestive heart failure risk assessment based on long-term heart rate variability.

    PubMed

    Shahbazi, Fatemeh; Asl, Babak Mohammadzadeh

    2015-11-01

The aims of this study are threefold: first, to investigate the class discrimination power of long-term heart rate variability (HRV) features for risk assessment in patients suffering from congestive heart failure (CHF); second, to identify the HRV features that best discriminate low-risk patients (LRPs) from high-risk patients (HRPs); and third, to examine the influence of feature dimension reduction on classification accuracy. We analyzed two public Holter databases: 12 recordings from patients with mild CHF (NYHA class I and II), labeled as LRPs, and 32 recordings from patients with severe CHF (NYHA class III and IV), labeled as HRPs. A K-nearest neighbor classifier was used to evaluate the performance of the feature set. Moreover, to reduce the number of features as well as the overlap of the samples of the two classes in feature space, we used generalized discriminant analysis (GDA) as a feature extraction method. By applying GDA to the discriminative nonlinear features, we achieved 100% sensitivity and specificity with the smallest number of features. Finally, the results were compared with similar studies with respect to the feature selection procedure, the classifier, and the number of features used in training. PMID:26344584
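
    A compact sketch of the pipeline described above (discriminant feature extraction followed by a K-nearest-neighbor classifier) is given below. GDA is a kernelized Fisher discriminant that scikit-learn does not ship directly, so plain linear discriminant analysis is used here as a stand-in, and the HRV feature matrix is random placeholder data rather than Holter-derived features.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import LeaveOneOut, cross_val_score
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      X = rng.normal(size=(44, 20))          # 44 recordings x 20 HRV features (placeholder data)
      y = np.array([0] * 12 + [1] * 32)      # 12 low-risk, 32 high-risk labels

      clf = make_pipeline(LinearDiscriminantAnalysis(n_components=1),
                          KNeighborsClassifier(n_neighbors=3))
      accuracy = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
      print(f"leave-one-out accuracy: {accuracy:.2f}")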

  11. Sequential organ failure assessment scoring and prediction of patient's outcome in Intensive Care Unit of a tertiary care hospital

    PubMed Central

    Jain, Aditi; Palta, Sanjeev; Saroa, Richa; Palta, Anshu; Sama, Sonu; Gombar, Satinder

    2016-01-01

Background and Aims: The objective was to determine the accuracy of the sequential organ failure assessment (SOFA) score in predicting the outcome of patients in the Intensive Care Unit (ICU). Material and Methods: Forty-four consecutive patients aged 15-80 years admitted to the ICU over an 8-week period were studied prospectively. Three patients were excluded. SOFA score was determined 24 h postadmission to ICU and subsequently every 48 h for the first 10 days. Patients were followed till discharge/death/transfer from the ICU. Initial SOFA score, highest and mean SOFA scores were calculated and correlated with mortality and duration of stay in ICU. Results: The mortality rate was 39% and the mean duration of stay in the ICU was 9 days. The maximum score in survivors (3.92 ± 2.17) was significantly lower than in nonsurvivors (8.9 ± 3.45). The initial SOFA score had a strong statistical correlation with mortality. Cardiovascular scores on days 1 and 3, respiratory score on day 7, and coagulation profile on day 3 correlated significantly with the outcome. Duration of stay did not correlate with survival (P = 0.461). Conclusion: The SOFA score is a simple but effective prognostic indicator and evaluator for patient progress in ICU. Day 1 SOFA can triage the patients into risk categories. For further management, the mean and maximum scores help determine the severity of illness and can act as a guide for the intensity of therapy required for each patient.

  12. Failure of Passive Immune Transfer in Calves: A Meta-Analysis on the Consequences and Assessment of the Economic Impact

    PubMed Central

    Raboisson, Didier; Trillat, Pauline; Cahuzac, Clélia

    2016-01-01

    Low colostrum intake at birth results in the failure of passive transfer (FPT) due to the inadequate ingestion of colostral immunoglobulins (Ig). FPT is associated with an increased risk of mortality and decreased health and longevity. Despite the known management practices associated with low FPT, it remains an important issue in the field. Neither a quantitative analysis of FPT consequences nor an assessment of its total cost are available. To address this point, a meta-analysis on the adjusted associations between FPT and its outcomes was first performed. Then, the total costs of FPT in European systems were calculated using a stochastic method with adjusted values as the input parameters. The adjusted risks (and 95% confidence intervals) for mortality, bovine respiratory disease, diarrhoea and overall morbidity in the case of FPT were 2.12 (1.43–3.13), 1.75 (1.50–2.03), 1.51 (1.05–2.17) and 1.91 (1.63–2.24), respectively. The mean (and 95% prediction interval) total costs per calf with FPT were estimated to be €60 (€10–109) and €80 (€20–139) for dairy and beef, respectively. As a result of the double-step stochastic method, the proposed economic estimation constitutes the first estimate available for FPT. The results are presented in a way that facilitates their use in the field and, with limited effort, combines the cost of each contributor to increase the applicability of the economic assessment to the situations farm-advisors may face. The present economic estimates are also an important tool to evaluate the profitability of measures that aim to improve colostrum intake and FPT prevention. PMID:26986832

  13. Failure of Passive Immune Transfer in Calves: A Meta-Analysis on the Consequences and Assessment of the Economic Impact.

    PubMed

    Raboisson, Didier; Trillat, Pauline; Cahuzac, Clélia

    2016-01-01

    Low colostrum intake at birth results in the failure of passive transfer (FPT) due to the inadequate ingestion of colostral immunoglobulins (Ig). FPT is associated with an increased risk of mortality and decreased health and longevity. Despite the known management practices associated with low FPT, it remains an important issue in the field. Neither a quantitative analysis of FPT consequences nor an assessment of its total cost are available. To address this point, a meta-analysis on the adjusted associations between FPT and its outcomes was first performed. Then, the total costs of FPT in European systems were calculated using a stochastic method with adjusted values as the input parameters. The adjusted risks (and 95% confidence intervals) for mortality, bovine respiratory disease, diarrhoea and overall morbidity in the case of FPT were 2.12 (1.43-3.13), 1.75 (1.50-2.03), 1.51 (1.05-2.17) and 1.91 (1.63-2.24), respectively. The mean (and 95% prediction interval) total costs per calf with FPT were estimated to be €60 (€10-109) and €80 (€20-139) for dairy and beef, respectively. As a result of the double-step stochastic method, the proposed economic estimation constitutes the first estimate available for FPT. The results are presented in a way that facilitates their use in the field and, with limited effort, combines the cost of each contributor to increase the applicability of the economic assessment to the situations farm-advisors may face. The present economic estimates are also an important tool to evaluate the profitability of measures that aim to improve colostrum intake and FPT prevention. PMID:26986832
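
    A rough sketch of a double-step stochastic (Monte Carlo) cost estimate of the kind described above is shown below. The relative risks are the adjusted values quoted in the abstract; the baseline incidences, unit costs, and the lognormal spread are invented for illustration and are not calibrated to reproduce the published €60-€80 figures.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      contributors = {   # (adjusted relative risk, assumed baseline incidence, assumed unit cost in EUR)
          "mortality":           (2.12, 0.05, 500.0),
          "respiratory disease": (1.75, 0.15, 60.0),
          "diarrhoea":           (1.51, 0.20, 40.0),
      }
      total = np.zeros(n)
      for rr, p0, cost in contributors.values():
          rr_draw = rng.lognormal(mean=np.log(rr), sigma=0.15, size=n)   # crude uncertainty on the risk
          extra_risk = np.clip(p0 * (rr_draw - 1.0), 0.0, 1.0)           # extra risk attributable to FPT
          total += extra_risk * cost
      print(f"mean extra cost per FPT calf: EUR {total.mean():.0f} "
            f"(95% PI {np.percentile(total, 2.5):.0f}-{np.percentile(total, 97.5):.0f})")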

  14. Voronoi Diagrams and Spring Rain

    ERIC Educational Resources Information Center

    Perham, Arnold E.; Perham, Faustine L.

    2011-01-01

    The goal of this geometry project is to use Voronoi diagrams, a powerful modeling tool across disciplines, and the integration of technology to analyze spring rainfall from rain gauge data over a region. In their investigation, students use familiar equipment from their mathematical toolbox: triangles and other polygons, circumcenters and…
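
    A minimal sketch of the underlying construction, assuming the rain gauges are treated as Voronoi (Thiessen) sites and each gauge's rainfall is weighted by the area of its cell; the coordinates and rainfall values below are made up.

      import numpy as np
      from scipy.spatial import Voronoi, ConvexHull

      gauges = np.array([[0.1, 0.2], [0.8, 0.3], [0.5, 0.9], [0.4, 0.6], [0.9, 0.8]])   # x, y (made up)
      rain_mm = np.array([12.0, 8.5, 15.2, 10.1, 9.7])                                  # made up

      vor = Voronoi(gauges)
      for i, region_index in enumerate(vor.point_region):
          region = vor.regions[region_index]
          if -1 in region or len(region) == 0:
              print(f"gauge {i}: unbounded cell (clip to the study region in practice)")
              continue
          area = ConvexHull(vor.vertices[region]).volume   # for 2-D input, .volume is the polygon area
          print(f"gauge {i}: cell area {area:.3f}, rainfall {rain_mm[i]} mm")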

  15. Assessment of the risk of failure of high voltage substations due to environmental conditions and pollution on insulators.

    PubMed

    Castillo Sierra, Rafael; Oviedo-Trespalacios, Oscar; Candelo, John E; Soto, Jose D

    2015-07-01

    Pollution on electrical insulators is one of the greatest causes of failure of substations subjected to high levels of salinity and environmental pollution. Considering leakage current as the main indicator of pollution on insulators, this paper focuses on establishing the effect of the environmental conditions on the risk of failure due to pollution on insulators and determining the significant change in the magnitude of the pollution on the insulators during dry and humid periods. Hierarchical segmentation analysis was used to establish the effect of environmental conditions on the risk of failure due to pollution on insulators. The Kruskal-Wallis test was utilized to determine the significant changes in the magnitude of the pollution due to climate periods. An important result was the discovery that leakage current was more common on insulators during dry periods than humid ones. There was also a higher risk of failure due to pollution during dry periods. During the humid period, various temperatures and wind directions produced a small change in the risk of failure. As a technical result, operators of electrical substations can now identify the cause of an increase in risk of failure due to pollution in the area. The research provides a contribution towards the behaviour of the leakage current under conditions similar to those of the Colombian Caribbean coast and how they affect the risk of failure of the substation due to pollution. PMID:25634366
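
    The seasonal comparison described above can be sketched as a Kruskal-Wallis test on leakage-current samples grouped by climate period; the measurements below are invented placeholders.

      from scipy.stats import kruskal

      leakage_dry   = [4.2, 5.1, 6.0, 4.8, 5.5, 6.3]   # mA, invented
      leakage_humid = [2.9, 3.4, 3.1, 3.8, 2.7, 3.3]   # mA, invented

      statistic, p_value = kruskal(leakage_dry, leakage_humid)
      print(f"H = {statistic:.2f}, p = {p_value:.4f}")
      if p_value < 0.05:
          print("significant difference in leakage current between climate periods")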

  16. Comparison of linear-elastic-plastic, elastic-plastic, and fully plastic failure models in the assessment of piping integrity

    SciTech Connect

    Streit, R.D.

    1981-01-01

A double-ended guillotine break in the primary coolant loop of a pressurized water reactor (PWR) is a postulated loss-of-coolant accident which can result in extreme dynamic loads (i.e., the asymmetric blowdown load) on the reactor pressure vessel (RPV) and vessel internals. Design and construction of the RPV and support systems to withstand these extreme dynamic loads is very difficult. Similar high loading would also be experienced in a boiling water reactor given a similar accident. Although such a break would be an extremely rare event, its obvious safety and design implications demand that it be carefully evaluated. The work discussed here is part of the Load Combinations Program at Lawrence Livermore National Laboratory to estimate the probability of a double-ended guillotine break in the primary reactor coolant loop of a selected PWR. The program employs a fracture-mechanics-based fatigue model to propagate cracks from an initial flaw distribution. It was found that while most of the large cracks grew into leaks, a complete (or nearly complete) circumferential crack could lead to a double-ended pipe break without prior leaking and thus without warning. It is important to assess under what loads such a crack will result in complete pipe severance. The loads considered in this evaluation result from pressure, dead weight and seismic stresses. For the PWR hot leg considered in this investigation the internal pressure contributes the most to the load-controlled stresses (i.e., stresses which can cause piping failure) and thus, the problem is treated as axisymmetric with uniform axial loading.

  17. Spectral Determinants on Mandelstam Diagrams

    NASA Astrophysics Data System (ADS)

    Hillairet, Luc; Kalvin, Victor; Kokotov, Alexey

    2016-04-01

    We study the regularized determinant of the Laplacian as a functional on the space of Mandelstam diagrams (noncompact translation surfaces glued from finite and semi-infinite cylinders). A Mandelstam diagram can be considered as a compact Riemann surface equipped with a conformal flat singular metric {|ω|^2}, where {ω} is a meromorphic one-form with simple poles such that all its periods are pure imaginary and all its residues are real. The main result is an explicit formula for the determinant of the Laplacian in terms of the basic objects on the underlying Riemann surface (the prime form, theta-functions, the canonical meromorphic bidifferential) and the divisor of the meromorphic form {ω}. As an important intermediate result we prove a decomposition formula of the type of Burghelea-Friedlander-Kappeler for the determinant of the Laplacian for flat surfaces with cylindrical ends and conical singularities.

  18. Hero's journey in bifurcation diagram

    NASA Astrophysics Data System (ADS)

    Monteiro, L. H. A.; Mustaro, P. N.

    2012-06-01

The hero's journey is a narrative structure identified by several authors in comparative studies on folklore and mythology. This storytelling template presents the stages of inner metamorphosis undergone by the protagonist after being called to an adventure. In a simplified version, this journey is divided into three acts separated by two crucial moments. Here we propose a discrete-time dynamical system for representing the protagonist's evolution. The suffering along the journey is taken as the control parameter of this system. The bifurcation diagram exhibits stationary, periodic and chaotic behaviors. In this diagram, there are transitions from fixed point to chaos and from limit cycle to fixed point. We found that the values of the control parameter corresponding to these two transitions are in quantitative agreement with the two critical moments of the three-act hero's journey identified in 10 movies appearing in the list of the 200 worldwide highest-grossing films.
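
    The generic recipe for producing such a bifurcation diagram from a discrete-time map is sketched below. The paper's actual map is not reproduced in the abstract, so the logistic map is used as a stand-in, with the control parameter r playing the role of the protagonist's suffering.

      import numpy as np
      import matplotlib.pyplot as plt

      r_values = np.linspace(2.5, 4.0, 800)     # control parameter ("suffering", stand-in range)
      x = np.full_like(r_values, 0.5)
      for _ in range(500):                      # discard transients
          x = r_values * x * (1.0 - x)

      r_plot, x_plot = [], []
      for _ in range(200):                      # record the long-term states
          x = r_values * x * (1.0 - x)
          r_plot.append(r_values.copy())
          x_plot.append(x.copy())

      plt.plot(np.concatenate(r_plot), np.concatenate(x_plot), ",k")
      plt.xlabel("control parameter r")
      plt.ylabel("long-term states x")
      plt.title("Bifurcation diagram (logistic-map stand-in)")
      plt.show()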

  19. Relation of Longitudinal Changes in Quality of Life Assessments to Changes in Functional Capacity in Patients With Heart Failure With and Without Anemia.

    PubMed

    Cooper, Trond J; Anker, Stefan D; Comin-Colet, Josep; Filippatos, Gerasimos; Lainscak, Mitja; Lüscher, Thomas F; Mori, Claudio; Johnson, Patrick; Ponikowski, Piotr; Dickstein, Kenneth

    2016-05-01

Clinical status in heart failure is conventionally assessed by the physician's evaluation, patients' own perception of their symptoms, quality of life (QoL) tools, and a measure of functional capacity. These aspects can be measured with tools such as the New York Heart Association functional class, QoL tools such as the EuroQol-5 Dimension, the Kansas City Cardiomyopathy Questionnaire, patient global assessment (PGA), and the 6-minute walk test (6MWT), respectively. The Ferric Carboxymaltose in Patients with Heart Failure and Iron Deficiency (FAIR-HF) trial demonstrated that treatment with intravenous ferric carboxymaltose in iron-deficient patients with symptomatic heart failure with reduced left ventricular function significantly improved all 5 outcome measures. This analysis assessed the correlations between the longitudinal changes in the measures of clinical status, as measured by QoL tools, and the changes in the measures of functional capacity as measured by the 6MWT. This analysis used the database from the FAIR-HF trial, which randomized 459 patients with chronic heart failure (reduced left ventricular ejection fraction) and iron deficiency, with or without anemia, to ferric carboxymaltose or placebo. The degree of correlation between QoL tools and the 6MWT was assessed at 4, 12, and 24 weeks. The data demonstrate highly significant correlations between QoL and functional capacity, as measured by the 6MWT, at all time points (p <0.001). Changes in PGA, Kansas City Cardiomyopathy Questionnaire, and EuroQoL-5D correlated increasingly over time with changes in 6MWT performance. Interestingly, the strongest correlation at 24 weeks is for the PGA, which is a simple numerical scale (r = -0.57, p <0.001). This analysis provides evidence that QoL assessments show a significant correlation with functional capacity, as measured by the 6MWT. The strength of these correlations increased over time. PMID:27015889

  20. Causal diagrams in systems epidemiology

    PubMed Central

    2012-01-01

    Methods of diagrammatic modelling have been greatly developed in the past two decades. Outside the context of infectious diseases, systematic use of diagrams in epidemiology has been mainly confined to the analysis of a single link: that between a disease outcome and its proximal determinant(s). Transmitted causes ("causes of causes") tend not to be systematically analysed. The infectious disease epidemiology modelling tradition models the human population in its environment, typically with the exposure-health relationship and the determinants of exposure being considered at individual and group/ecological levels, respectively. Some properties of the resulting systems are quite general, and are seen in unrelated contexts such as biochemical pathways. Confining analysis to a single link misses the opportunity to discover such properties. The structure of a causal diagram is derived from knowledge about how the world works, as well as from statistical evidence. A single diagram can be used to characterise a whole research area, not just a single analysis - although this depends on the degree of consistency of the causal relationships between different populations - and can therefore be used to integrate multiple datasets. Additional advantages of system-wide models include: the use of instrumental variables - now emerging as an important technique in epidemiology in the context of mendelian randomisation, but under-used in the exploitation of "natural experiments"; the explicit use of change models, which have advantages with respect to inferring causation; and in the detection and elucidation of feedback. PMID:22429606

  1. Looking inside the butterfly diagram

    NASA Astrophysics Data System (ADS)

    Ternullo, M.

    2007-12-01

    The suitability of Maunder's butterfly diagram to give a realistic picture of the photospheric magnetic flux large scale distribution is discussed. The evolution of the sunspot zone in cycle 20 through 23 is described. To reduce the noise which covers any structure in the diagram, a smoothing algorithm has been applied to the sunspot data. This operation has eliminated any short period fluctuation, and given visibility to long duration phenomena. One of these phenomena is the fact that the equatorward drift of the spot zone center of mass results from the alternation of several prograde (namely, equatorward) segments with other stationary or poleward segments. The long duration of the stationary/retrograde phases as well as the similarities among the spot zone alternating paths in the cycles under examination prevent us from considering these features as meaningless fluctuations, randomly superimposed on the continuous equatorward migration. On the contrary, these features should be considered physically meaningful phenomena, requiring adequate explanations. Moreover, even the smoothed spotted area markedly oscillates. The compared examination of area and spot zone evolution allows us to infer details about the spotted area distribution inside the butterfly diagram. Links between the changing structure of the spot zone and the tachocline rotation rate oscillations are proposed.
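
    The kind of smoothing described above can be sketched as a centred running mean over a monthly latitude series, which suppresses short-period fluctuations while keeping long-duration structure; the series below is synthetic, not actual sunspot data.

      import numpy as np

      rng = np.random.default_rng(7)
      months = np.arange(132)                                        # roughly one 11-year cycle
      latitude = 30.0 * np.exp(-months / 60.0) + rng.normal(0.0, 1.5, months.size)  # synthetic drift + noise

      window = 13                                                    # 13-month running mean
      smoothed = np.convolve(latitude, np.ones(window) / window, mode="valid")
      print(smoothed[:5].round(2))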

  2. Failure of platelet parameters and biomarkers to correlate platelet function to severity and etiology of heart failure in patients enrolled in the EPCOT trial. With special reference to the Hemodyne hemostatic analyzer. Whole Blood Impedance Aggregometry for the Assessment of Platelet Function in Patients with Congestive Heart Failure.

    PubMed

    Serebruany, Victor L; McKenzie, Marcus E; Meister, Andrew F; Fuzaylov, Sergey Y; Gurbel, Paul A; Atar, Dan; Gattis, Wendy A; O'Connor, Christopher M

    2002-01-01

    Data from small studies have suggested the presence of platelet abnormalities in patients with congestive heart failure (CHF). We sought to characterize the diagnostic utility of different platelet parameters and platelet-endothelial biomarkers in a random outpatient CHF population investigated in the EPCOT ('Whole Blood Impedance Aggregometry for the Assessment of Platelet Function in Patients with Congestive Heart Failure') Trial. Blood samples were obtained for measurement of platelet contractile force (PCF), whole blood aggregation, shear-induced closure time, expression of glycoprotein (GP) IIb/IIIa, and P-selectin in 100 consecutive patients with CHF. Substantial interindividual variability of platelet characteristics exists in patients with CHF. There were no statistically significant differences when patients were grouped according to incidence of vascular events, emergency revascularization needs, survival, or etiology of heart failure. Aspirin use did not affect instrument readings either. PCF correlates very poorly with whole blood aggregometry (r(2) = 0.023), closure time (r(2) = 0.028), platelet GP IIb/IIIa (r(2) = 0.0028), and P-selectin (r(2) = 0.002) expression. Furthermore, there was no correlation with brain natriuretic peptide concentrations, a marker of severity and prognosis in heart failure reflecting the neurohumoral status. Patients with heart failure enrolled in the EPCOT Trial exhibited a marginal, sometimes oppositely directed change in platelet function, challenging the diagnostic utility of these platelet parameters and biomarkers to serve as useful tools for the identification of platelet abnormalities, for predicting clinical outcomes, or for monitoring antiplatelet strategies in this population. The usefulness of these measurements for assessing platelets in the different clinical settings remains to be explored. Taken together, opposite to our expectations, major clinical characteristics of heart failure did not correlate well with

  3. Twistor Diagrams and Quantum Field Theory.

    NASA Astrophysics Data System (ADS)

    O'Donald, Lewis

    Available from UMI in association with The British Library. Requires signed TDF. This thesis uses twistor diagram theory, as developed by Penrose (1975) and Hodges (1990c), to try to approach some of the difficulties inherent in the standard quantum field theoretic description of particle interactions. The resolution of these issues is the eventual goal of the twistor diagram program. First twistor diagram theory is introduced from a physical view-point, with the aim of studying larger diagrams than have been typically explored. Methods are evolved to tackle the double box and triple box diagrams. These lead to three methods of constructing an amplitude for the double box, and two ways for the triple box. Next this theory is applied to translate the channels of a Yukawa Feynman diagram, which has more than four external states, into various twistor diagrams. This provides a test of the skeleton hypothesis (of Hodges, 1990c) in these cases, and also shows that conformal breaking must enter into twistor diagrams before the translation of loop level Feynman diagrams. The issue of divergent Feynman diagrams is then considered. By using a twistor equivalent of the sum-over -states idea of quantum field theory, twistor translations of loop diagrams are conjectured. The various massless propagator corrections and vacuum diagrams calculated give results consistent with Feynman theory. Two diagrams are also found that give agreement with the finite parts of the Feynman "fish" diagrams of phi^4 -theory. However it is found that a more rigorous translation for the time-like fish requires new boundaries to be added to the twistor sum-over-states. The twistor diagram obtained is found to give the finite part of the relevant Feynman diagram.

  4. Arrows in Comprehending and Producing Mechanical Diagrams

    ERIC Educational Resources Information Center

    Heiser, Julie; Tversky, Barbara

    2006-01-01

    Mechanical systems have structural organizations--parts, and their relations--and functional organizations--temporal, dynamic, and causal processes--which can be explained using text or diagrams. Two experiments illustrate the role of arrows in diagrams of mechanical systems. In Experiment 1, people described diagrams with or without arrows,…

  5. Differential Effectiveness of Two Science Diagram Types.

    ERIC Educational Resources Information Center

    Holliday, William G.

    Reported is an Aptitude Treatment Instruction (ATI) Study designed to evaluate the aptitude of verbal comprehension in terms of two unitary complex science diagram types: a single complex block word diagram and a single complex picture word diagram.. ATI theory and research indicate that different effective instructional treatments tend to help…

  6. Respiratory Failure

    MedlinePlus

    Respiratory failure happens when not enough oxygen passes from your lungs into your blood. Your body's organs, such ... brain, need oxygen-rich blood to work well. Respiratory failure also can happen if your lungs can't ...

  7. Kidney Failure

    MedlinePlus

    ... enough red blood cells. This is called kidney failure. If your kidneys fail, you need treatment to ... providers, family, and friends, most people with kidney failure can lead full and active lives. NIH: National ...

  8. Assessment of the magnitude and associated factors of immunological failure among adult and adolescent HIV-infected patients in St. Luke and Tulubolo Hospital, Oromia Region, Ethiopia

    PubMed Central

    Bayou, Bekelech; Sisay, Abay; Kumie, Abera

    2015-01-01

Introduction The use of antiretroviral therapy (ART) has become a standard of care for the treatment of HIV infection. However, cost and resistance to ART are major obstacles for access to treatment, especially in resource-limited settings. In this study, we aimed to assess the magnitude and associated factors of immunological failure among adult and adolescent HIV-infected patients (aged ≥ 15 years) on Highly Active Antiretroviral Therapy (HAART) in St. Luke and Tulu Bolo Hospitals, Oromia Region, Ethiopia. Methods A retrospective follow-up study was conducted among HIV-infected patients initiated on first-line ART at St. Luke and Tulu Bolo Hospitals, South West Shoa Zone, Oromia, Ethiopia. Results A total of 828 patient charts were reviewed. 477 (57.6%) were female and the median age was 32 years. The median baseline CD4 count was 148 cells/mm3. The most commonly prescribed ART regimen was TDF-based (36.7%). Of the 828 patient charts reviewed, 6.8% (56) developed immunological failure. Of these, only 20 (2.4%) were detected and put on a second-line regimen. The incidence of immunological failure was 1.8 cases per 100 person-years of follow-up. Patients who had not disclosed their HIV status to anyone had a higher risk of immunological failure than those who had disclosed their HIV status (AHR, 0.429; 95% CI 0.206 - 0.893; P-value=0.024). Conclusion Non-disclosure of HIV status and an ambulatory baseline functional status were found to be predictors of immunological failure. Most of the immunological failure cases were not detected early and not switched to a second-line ARV regimen. Patients with the above risk factors should be considered for a timely switch to second-line HAART. PMID:26587140

  9. Understanding machines from text and diagrams

    NASA Astrophysics Data System (ADS)

    Hegarty, Mary; Just, Marcel A.

    1987-12-01

Instructional materials typically use both text and diagrams to explain how machines work. In this paper we give an account of what information is involved in understanding a mechanical device and the role that diagrams might play in communicating this information. We propose a model of how people read a text and inspect an accompanying diagram which states that people inspect diagrams for three reasons: (1) to form a representation of information read in the text, (2) to reactivate information that has already been represented, and (3) to encode information that is absent from the text. Using data from subjects' eye fixations while they read a text and inspected an accompanying diagram, we find that low-ability subjects need to inspect diagrams more often than high-ability subjects. The data also suggest that knowledge of what is relevant in a diagram might be a prerequisite for encoding new information from a diagram.

  10. Diagram, a Learning Environment for Initiation to Object-Oriented Modeling with UML Class Diagrams

    ERIC Educational Resources Information Center

    Py, Dominique; Auxepaules, Ludovic; Alonso, Mathilde

    2013-01-01

    This paper presents Diagram, a learning environment for object-oriented modelling (OOM) with UML class diagrams. Diagram an open environment, in which the teacher can add new exercises without constraints on the vocabulary or the size of the diagram. The interface includes methodological help, encourages self-correcting and self-monitoring, and…

  11. A Hubble Diagram for Quasars

    NASA Astrophysics Data System (ADS)

    Risaliti, Guido; Lusso, Elisabeta

    2015-09-01

    We present a new method to test the cosmological model at high z, and measure the cosmological parameters, based on the non-linear correlation between UV and X-ray luminosity in quasars. While the method can be successfully tested with the data available today, a deep X-ray survey matching the future LSST and Euclid quasar catalogs is needed to achieve a high precision. Athena could provide a Hubble diagram for quasar analogous to that available today for supernovae, but extending up to z>6.

  12. Optical generation of Voronoi diagram.

    PubMed

    Giavazzi, F; Cerbino, R; Mazzoni, S; Giglio, M; Vailati, A

    2008-03-31

    We present results of experiments of diffraction by an amplitude screen, made of randomly distributed circular holes. By careful selection of the experimental parameters we obtain an intensity pattern strongly connected to the Voronoi diagram (VD) generated by the centers of the apertures. With the help of simulations we give a description of the observed phenomenon and elucidate the optimal parameters for its observation. Finally, we also suggest how it can be used for a fast, all-optical generation of VDs. PMID:18542580

  13. An assessment of BWR (boiling water reactor) Mark III containment challenges, failure modes, and potential improvements in performance

    SciTech Connect

    Schroeder, J.A.; Pafford, D.J.; Kelly, D.L.; Jones, K.R.; Dallman, F.J. )

    1991-01-01

    This report describes risk-significant challenges posed to Mark III containment systems by severe accidents as identified for Grand Gulf. Design similarities and differences between the Mark III plants that are important to containment performance are summarized. The accident sequences responsible for the challenges and the postulated containment failure modes associated with each challenge are identified and described. Improvements are discussed that have the potential either to prevent or delay containment failure, or to mitigate the offsite consequences of a fission product release. For each of these potential improvements, a qualitative analysis is provided. A limited quantitative risk analysis is provided for selected potential improvements. 21 refs., 5 figs., 46 tabs.

  14. Cell flipping in permutation diagrams

    NASA Astrophysics Data System (ADS)

Golumbic, Martin Charles; Kaplan, Haim

Permutation diagrams have been used in circuit design to model a set of single-point nets crossing a channel, where the minimum number of layers needed to realize the diagram equals the clique number ω(G) of its permutation graph, the value of which can be calculated in O(n log n) time. We consider a generalization of this model motivated by "standard cell" technology in which the numbers on each side of the channel are partitioned into consecutive subsequences, or cells, each of which can be left unchanged or flipped (i.e., reversed). We ask for what choice of flippings the resulting clique number will be minimum or maximum. We show that when one side of the channel is fixed (no flipping), an optimal flipping for the other side can be found in O(n log n) time for the maximum clique number. We prove that the general problem is NP-complete for the minimum clique number and O(n^2) for the maximum clique number. Moreover, since the complement of a permutation graph is also a permutation graph, the same complexity results hold for the independence number.
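
    For a fixed permutation (no flipping), the clique number of the permutation graph equals the length of the longest decreasing subsequence of the permutation, i.e. the largest set of pairwise crossing nets, and it can be computed in O(n log n) with patience sorting. The sketch below shows only that computation, not the flipping optimisation itself.

      import bisect

      def clique_number(perm):
          """Length of the longest decreasing subsequence of perm (patience sorting on -perm)."""
          tails = []
          for value in (-v for v in perm):
              pos = bisect.bisect_left(tails, value)
              if pos == len(tails):
                  tails.append(value)
              else:
                  tails[pos] = value
          return len(tails)

      print(clique_number([3, 1, 4, 2, 5]))   # 2: at most two nets pairwise cross
      print(clique_number([5, 4, 3, 2, 1]))   # 5: all nets pairwise cross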

  15. Phase Diagrams of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Caplan, Matthew; Horowitz, Chuck; Berry, Don; da Silva Schneider, Andre

    2016-03-01

In the inner crust of neutron stars, where matter is near the saturation density, protons and neutrons arrange themselves into complex structures called nuclear pasta. Early theoretical work predicted a simple graduated hierarchy of pasta phases, consisting of spheres, cylinders, slabs, and uniform matter with voids. Previous work has simulated these phases with a simple classical model and has shown that the formation of these structures is dependent on the temperature, density, and proton fraction. However, previous work only studied a limited range of these parameters due to computational limitations. Thanks to recent advances in computing it is now possible to survey the structure of nuclear pasta for a larger range of parameters. By simulating nuclear pasta with constant temperature and proton fraction in an expanding simulation volume we are able to study the phase transitions in nuclear pasta, and thus produce a set of phase diagrams. We report on these phase diagrams as well as newly identified phases of nuclear pasta and discuss their implications for neutron star observables.

  16. The Importance of Design in Learning from Node-Link Diagrams

    ERIC Educational Resources Information Center

    van Amelsvoort, Marije; van der Meij, Jan; Anjewierden, Anjo; van der Meij, Hans

    2013-01-01

    Diagrams organize by location. They give spatial cues for finding and recognizing information and for making inferences. In education, diagrams are often used to help students understand and recall information. This study assessed the influence of perceptual cues on reading behavior and subsequent retention. Eighty-two participants were assigned…

  17. Heber Geothermal Binary Demonstration Project. Final design availability assessment. Revision 1

    SciTech Connect

    Mulvihill, R.J.; Reny, D.A.; Geumlek, J.M.; Purohit, G.P.

    1983-02-01

An availability assessment of the principal systems of the Heber Geothermal Power Plant has been carried out based on the final issue of the process descriptions, process flow diagrams, and the approved-for-design P&IDs prepared by Fluor Power Services, Inc. (FPS). The principal systems are those which contribute most to plant unavailability. The plant equivalent availability, considering forced and deferred corrective maintenance outages, was computed using a 91-state Markov model to represent the 29 principal system failure configurations and their significant combinations. The failure configurations and associated failure and repair rates were defined from system/subsystem availability assessments that were conducted using the EPRI GO methodology and availability block diagram models. The availability and unavailability ranking of the systems and major equipment is presented.
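
    The study itself used a 91-state Markov model covering forced and deferred maintenance outages; the sketch below only illustrates the simplest version of that arithmetic, combining assumed failure and repair rates of independent repairable configurations into steady-state availabilities and a series (block-diagram) product.

      # Assumed per-configuration failure and repair rates (per hour); illustrative only.
      failure_rate = [1e-4, 5e-5, 2e-4]
      repair_rate  = [1 / 24, 1 / 48, 1 / 12]   # mean repair times of 24 h, 48 h and 12 h

      availabilities = [m / (l + m) for l, m in zip(failure_rate, repair_rate)]
      series_availability = 1.0
      for a in availabilities:
          series_availability *= a              # series block diagram: all blocks must be up
      print([round(a, 5) for a in availabilities], round(series_availability, 5))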

  18. Hubble's diagram and cosmic expansion

    PubMed Central

    Kirshner, Robert P.

    2004-01-01

    Edwin Hubble's classic article on the expanding universe appeared in PNAS in 1929 [Hubble, E. P. (1929) Proc. Natl. Acad. Sci. USA 15, 168–173]. The chief result, that a galaxy's distance is proportional to its redshift, is so well known and so deeply embedded into the language of astronomy through the Hubble diagram, the Hubble constant, Hubble's Law, and the Hubble time, that the article itself is rarely referenced. Even though Hubble's distances have a large systematic error, Hubble's velocities come chiefly from Vesto Melvin Slipher, and the interpretation in terms of the de Sitter effect is out of the mainstream of modern cosmology, this article opened the way to investigation of the expanding, evolving, and accelerating universe that engages today's burgeoning field of cosmology. PMID:14695886

  19. Phase diagram of ammonium nitrate

    NASA Astrophysics Data System (ADS)

    Dunuwille, M.; Yoo, C. S.

    2014-05-01

Ammonium Nitrate (AN) has often been subjected to use in improvised explosive devices, due to its wide availability as a fertilizer and its capability of becoming explosive with slight additions of organic and inorganic compounds. Yet, the origin of enhanced energetic properties of impure AN (or AN mixtures) is neither chemically unique nor well understood - resulting in rather catastrophic disasters in the past [1] and thereby a significant burden on safety in using ammonium nitrates even today. To remedy this situation, we have carried out an extensive study to investigate the phase stability of AN at high pressure and temperature, using diamond anvil cells and micro-Raman spectroscopy. The present results confirm the recently proposed phase IV-to-IV' transition above 17 GPa [2] and provide new constraints for the melting and phase diagram of AN to 40 GPa and 400 °C.

  20. The neptunium-iron phase diagram

    NASA Astrophysics Data System (ADS)

    Gibson, J. K.; Haire, R. G.; Beahm, E. C.; Gensini, M. M.; Maeda, A.; Ogawa, T.

    1994-08-01

    The phase relations in the Np-Fe alloy system have been elucidated using differential thermal analysis. A phase diagram for this system is postulated based upon the experimental results, regular-solution model calculations, and an expected correspondence to the U-Fe and Pu-Fe diagrams. The postulated Np-Fe diagram is characterized by limited terminal solid solubilities, two intermetallic solid phases, NpFe 2 and Np 6Fe, and two eutectics.

  1. Application of Failure Mode and Effect Analysis (FMEA) and cause and effect analysis in conjunction with ISO 22000 to a snails (Helix aspersa) processing plant; A case study.

    PubMed

    Arvanitoyannis, Ioannis S; Varzakas, Theodoros H

    2009-08-01

    Failure Mode and Effect Analysis (FMEA) has been applied for the risk assessment of snails manufacturing. A tentative approach of FMEA application to the snails industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (snails processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO22000 analysis with HACCP is carried out over snails processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells and poisonous mushrooms, were the processes identified as the ones with the highest RPN (280, 240, 147, 144, respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a snails processing industry is considered imperative. PMID:19582641
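
    The RPN bookkeeping behind such an FMEA is sketched below: RPN = severity x occurrence x detectability, compared against the acceptance limit of 130 quoted in the abstract. The severity/occurrence/detectability decomposition shown is invented; only the resulting RPN values follow the figures in the abstract.

      ACCEPTABLE_RPN = 130   # upper acceptable limit quoted in the abstract

      hazards = [   # (process hazard, severity, occurrence, detectability) - scores are illustrative
          ("sterilization of tins",       8, 7, 5),
          ("heavy-metal bioaccumulation", 8, 6, 5),
          ("packaging of shells",         7, 7, 3),
      ]
      for name, severity, occurrence, detectability in hazards:
          rpn = severity * occurrence * detectability
          verdict = "corrective action required" if rpn > ACCEPTABLE_RPN else "acceptable"
          print(f"{name:30s} RPN = {rpn:3d} -> {verdict}")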

  2. No Such Thing as Failure, Only Feedback: Designing Innovative Opportunities for E-Assessment and Technology-Mediated Feedback

    ERIC Educational Resources Information Center

    Miller, Charles; Doering, Aaron; Scharber, Cassandra

    2010-01-01

    In this paper we challenge designers, researchers, teachers, students, and parents to re-assess and re-envision the value of technology-mediated feedback and e-assessment by examining the innovative roles feedback and assessment played in the design of three contemporary web-based learning environments. These environments include 1) an…

  3. Ion potential diagrams for electrochromic devices

    SciTech Connect

    Varsano, F. |; Cahen, D.; Decker, F.; Guillemoles, J.F. |; Masetti, E.

    1998-12-01

    Ion potential diagrams can facilitate the description of systems in which ionic species are mobile. They depict qualitatively the spatial dependence of the potential energy for mobile ions, somewhat akin to band diagrams for electrons. The authors construct ion potential diagrams for the mixed conducting (oxide), optically active electrodes of five-layer electrochromic devices, based on reversible Li{sup +} intercalation. These serve to analyze stability problems that arise in these systems. The authors then use them as building blocks to arrive at ion diagrams for complete devices. This allows analyses of (dis)coloration kinetics.

  4. Influence Diagram Use With Respect to Technology Planning and Investment

    NASA Technical Reports Server (NTRS)

    Levack, Daniel J. H.; DeHoff, Bryan; Rhodes, Russel E.

    2009-01-01

    Influence diagrams are relatively simple, but powerful, tools for assessing the impact of choices or resource allocations on goals or requirements. They are very general and can be used on a wide range of problems. They can be used for any problem that has defined goals, a set of factors that influence the goals or the other factors, and a set of inputs. Influence diagrams show the relationship among a set of results and the attributes that influence them and the inputs that influence the attributes. If the results are goals or requirements of a program, then the influence diagram can be used to examine how the requirements are affected by changes to technology investment. This paper uses an example to show how to construct and interpret influence diagrams, how to assign weights to the inputs and attributes, how to assign weights to the transfer functions (influences), and how to calculate the resulting influences of the inputs on the results. A study is also presented as an example of how using influence diagrams can help in technology planning and investment. The Space Propulsion Synergy Team (SPST) used this technique to examine the impact of R&D spending on the Life Cycle Cost (LCC) of a space transportation system. The question addressed was the effect on the recurring and the non-recurring portions of LCC of the proportion of R&D resources spent to impact technology objectives versus the proportion spent to impact operational dependability objectives. The goals, attributes, and the inputs were established. All of the linkages (influences) were determined. The weighting of each of the attributes and each of the linkages was determined. Finally the inputs were varied and the impacts on the LCC determined and are presented. The paper discusses how each of these was accomplished both for credibility and as an example for future studies using influence diagrams for technology planning and investment planning.
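
    As a minimal illustration of the weighted-influence arithmetic described above, the sketch below propagates input weights through attribute links to results. The node names and link weights are hypothetical and are not the values used in the SPST study.

      inputs = {"R&D on technology": 0.6, "R&D on dependability": 0.4}   # resource split (hypothetical)

      attr_links = {   # attribute <- {input: link weight}
          "vehicle performance":       {"R&D on technology": 0.8, "R&D on dependability": 0.2},
          "operational dependability": {"R&D on technology": 0.3, "R&D on dependability": 0.7},
      }
      result_links = {   # result <- {attribute: link weight}
          "non-recurring cost impact": {"vehicle performance": 0.7, "operational dependability": 0.3},
          "recurring cost impact":     {"vehicle performance": 0.2, "operational dependability": 0.8},
      }

      attributes = {a: sum(w * inputs[i] for i, w in links.items())
                    for a, links in attr_links.items()}
      results = {r: sum(w * attributes[a] for a, w in links.items())
                 for r, links in result_links.items()}
      print(attributes)
      print(results)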

  5. Productive Failure

    ERIC Educational Resources Information Center

    Kapur, Manu

    2008-01-01

    This study demonstrates an existence proof for "productive failure": engaging students in solving complex, ill-structured problems without the provision of support structures can be a productive exercise in failure. In a computer-supported collaborative learning setting, eleventh-grade science students were randomly assigned to one of two…

  6. The changing face of liver transplantation for acute liver failure: Assessment of current status and implications for future practice.

    PubMed

    Donnelly, Mhairi C; Hayes, Peter C; Simpson, Kenneth J

    2016-04-01

    The etiology and outcomes of acute liver failure (ALF) have changed since the definition of this disease entity in the 1970s. In particular, the role of emergency liver transplantation has evolved over time, with the development of prognostic scoring systems to facilitate listing of appropriate patients, and a better understanding of transplant benefit in patients with ALF. This review examines the changing etiology of ALF, transplant benefit, outcomes following transplantation, and future alternatives to emergency liver transplantation in this devastating condition. Liver Transplantation 22 527-535 2016 AASLD. PMID:26823231

  7. Assessment of nutritional status by composite index for anthropometric failure: a study among slum children in Bankura, West Bengal.

    PubMed

    Shit, Subhadeep; Taraphdar, Pranita; Mukhopadhyay, Dipta K; Sinhababu, Apurba; Biswas, Akhil B

    2012-01-01

    A community-based cross-sectional study was conducted to find out the prevalence of composite index of anthropometric failure (CIAF) among 117 slum dwelling under-five children in Bankura town, West Bengal and its relation with some common socio-economic factors. Among study population, the prevalence of underweight was 41.6%, whereas CIAF was 80.3%. CIAF gave a near complete estimation of undernutrition unlike underweight. Children who were unimmunized, with more number of siblings, living in a nuclear family, or with illiterate mothers were more likely to be undernourished. PMID:23354144

  8. Influence Diagrams as Decision-Making Tools for Pesticide Risk Management

    EPA Science Inventory

    The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk, however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...

  9. Phase diagram of ammonium nitrate

    SciTech Connect

    Dunuwille, Mihindra; Yoo, Choong-Shik

    2013-12-07

    Ammonium Nitrate (AN) is a fertilizer, yet becomes an explosive upon a small addition of chemical impurities. The origin of enhanced chemical sensitivity in impure AN (or AN mixtures) is not well understood, posing significant safety issues in using AN even today. To remedy the situation, we have carried out an extensive study to investigate the phase stability of AN and its mixtures with hexane (ANFO–AN mixed with fuel oil) and Aluminum (Ammonal) at high pressures and temperatures, using diamond anvil cells (DAC) and micro-Raman spectroscopy. The results indicate that pure AN decomposes to N{sub 2}, N{sub 2}O, and H{sub 2}O at the onset of the melt, whereas the mixtures, ANFO and Ammonal, decompose at substantially lower temperatures. The present results also confirm the recently proposed phase IV-IV{sup ′} transition above 17 GPa and provide new constraints for the melting and phase diagram of AN to 40 GPa and 400°C.

  10. Phase Diagram of Ammonium Nitrate

    NASA Astrophysics Data System (ADS)

    Dunuwille, Mihindra; Yoo, Choong-Shik

    2013-06-01

Ammonium Nitrate (AN) has often been subjected to use in improvised explosive devices, due to its wide availability as a fertilizer and its capability of becoming explosive with slight additions of organic and inorganic compounds. Yet, the origin of enhanced energetic properties of impure AN (or AN mixtures) is neither chemically unique nor well understood - resulting in rather catastrophic disasters in the past [1] and thereby a significant burden on safety, in using ammonium nitrates even today. To remedy this situation, we have carried out an extensive study to investigate the phase stability of AN, in different chemical environments, at high pressure and temperature, using diamond anvil cells and micro-Raman spectroscopy. The present results confirm the recently proposed phase IV-to-IV' transition above 15 GPa [2] and provide new constraints for the melting and phase diagram of AN to 40 GPa and 673 K. The present study has been supported by the U.S. DHS under Award Number 2008-ST-061-ED0001.

  11. Reading fitness landscape diagrams through HSAB concepts

    NASA Astrophysics Data System (ADS)

    Vigneresse, Jean-Louis

    2014-10-01

Fitness landscapes are conceived as a range of mountains, with local peaks and valleys. In terms of potential, such topographic variations indicate places of local instability or stability. The chemical potential, or the electronegativity with its sign reversed, carries similar information. In addition to chemical descriptors defined through hard-soft acid-base (HSAB) concepts and computed through density functional theory (DFT), the principles that rule chemical reactions allow the design of such landscape diagrams. The simplest diagram uses electrophilicity and hardness as coordinates. It allows examining the influence of maximum hardness or minimum electrophilicity principles. A third dimension is introduced within such a diagram by mapping the topography of electronegativity, polarizability or charge exchange. Introducing charge exchange during chemical reactions, or mapping a third parameter (f.i. polarizability) reinforces the information carried by a simple binary diagram. Examples of such diagrams are provided, using data from Earth Sciences, simple oxides or ligands.

  12. Looking Forward, Looking Back: Assessing Variations in Hospital Resource Use and Outcomes for Elderly Patients with Heart Failure

    PubMed Central

    Ong, Michael K.; Mangione, Carol M.; Romano, Patrick S.; Zhou, Qiong; Auerbach, Andrew D.; Chun, Alein; Davidson, Bruce; Ganiats, Theodore G.; Greenfield, Sheldon; Gropper, Michael A.; Malik, Shaista; Rosenthal, J. Thomas; Escarce, José J.

    2009-01-01

    Background Recent studies have found substantial variation in hospital resource utilization by expired Medicare beneficiaries with chronic illnesses. By analyzing only expired patients, these studies cannot identify differences across hospitals in health outcomes like mortality. This study examines the association between mortality and resource utilization at the hospital level, when all Medicare beneficiaries hospitalized for heart failure are examined. Methods and Results 3,999 individuals hospitalized with a principal diagnosis of heart failure at six California teaching hospitals between January 1, 2001 and June 30, 2005 were analyzed with multivariate risk-adjustment models for total hospital days, total hospital direct costs, and mortality within 180-days after initial admission (“Looking Forward”). A subset of 1,639 individuals who died during the study period were analyzed with multivariate risk-adjustment models for total hospital days and total hospital direct costs within 180-days prior to death (“Looking Back”). “Looking Forward” risk-adjusted hospital means ranged from 17.0% to 26.0% for mortality, 7.8 to 14.9 days for total hospital days, and 0.66 to 1.30 times the mean value for indexed total direct costs. Spearman rank correlation coefficients were −0.68 between mortality and hospital days, and −0.93 between mortality and indexed total direct costs. “Looking Back” risk-adjusted hospital means ranged from 9.1 to 21.7 days for total hospital days and 0.91 to 1.79 times the mean value for indexed total direct costs. Variation in resource utilization site ranks between expired and all individuals were due to insignificant differences. Conclusions California teaching hospitals that used more resources caring for patients hospitalized for heart failure had lower mortality rates. Focusing only on expired individuals may overlook mortality variation as well as associations between greater resource utilization and lower mortality

  13. Structure-retention diagrams of ceramides established for their identification.

    PubMed

    Gaudin, Karen; Chaminade, Pierre; Baillet, Arlette

    2002-10-11

Molecular species analysis of ceramides was carried out using porous graphitic carbon with gradient elution: chloroform-methanol from 45:55 to 85:15 with a slope of 2.7%/min. These conditions gave a linear relationship between retention data and structure of ceramides. It was demonstrated that linearity occurred when a high slope value of linear gradient elution was used. The linear diagram was then constructed by plotting the adjusted retention time against the total number of carbon atoms of ceramide molecules. Each line represents one ceramide class. Such a Structure-Retention Diagram describes ceramide retention and thus constitutes an identification method using only retention data. This Structure-Retention Diagram was assessed and compared to another obtained from octadecyl-grafted silica in terms of their reproducibility, precision and ability to provide ceramide identification. Better identification was obtained using the results from both Structure-Retention Diagrams. This approach with a two-dimensional separation system made it possible to take advantage of the specificity of both identification models. PMID:12437165
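
    Building one line of such a Structure-Retention Diagram amounts to a linear fit of adjusted retention time against total carbon number for one ceramide class; the retention times below are invented for illustration.

      import numpy as np

      carbon_number = np.array([34, 36, 38, 40, 42, 44])                 # total carbon atoms
      adj_retention_min = np.array([6.1, 7.0, 7.8, 8.7, 9.6, 10.4])      # adjusted retention time (made up)

      slope, intercept = np.polyfit(carbon_number, adj_retention_min, 1)
      predicted = slope * carbon_number + intercept
      ss_res = np.sum((adj_retention_min - predicted) ** 2)
      ss_tot = np.sum((adj_retention_min - adj_retention_min.mean()) ** 2)
      print(f"t_adj = {slope:.3f} * C + {intercept:.3f}  (R^2 = {1 - ss_res / ss_tot:.4f})")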

  14. Heart Failure

    MedlinePlus

    ... together. About Rise Above HF Rise Above Heart Failure seeks to increase the dialogue about HF and improve the lives of people affected by the condition through awareness, education and support. Through the initiative, AHA strives to ...

  15. Testicular failure

    MedlinePlus

    ... Blood tests may show a low level of testosterone and high levels of prolactin, FSH , and LH . ... testes will be ordered. Testicular failure and low testosterone level may be hard to diagnose in older ...

  16. Faceting diagram for sticky steps

    NASA Astrophysics Data System (ADS)

    Akutsu, Noriko

    2016-03-01

Faceting diagrams for the step-faceting zone, the step droplet zone, and the Gruber-Mullins-Pokrovsky-Talapov (GMPT) zone for a crystal surface are obtained by using the density matrix renormalization group method to calculate the surface tension. The model based on these calculations is the restricted solid-on-solid (RSOS) model with a point-contact-type step-step attraction (p-RSOS model) on a square lattice. The point-contact-type step-step attraction represents the energy gain obtained by forming a bonding state with orbital overlap at the meeting point of the neighboring steps. In the step-faceting zone, disconnectedness in the surface tension leads to the formation of a faceted macrostep on a vicinal surface at equilibrium. The disconnectedness in the surface tension also causes the first-order shape transition for the equilibrium shape of a crystal droplet. The lower zone boundary line (ZBL), which separates the step-faceting zone and the step droplet zone, is obtained by the condition γ_1 = lim_{n→∞} γ_n/n, where γ_n is the step tension of the n-th merged step. The upper ZBL, which separates the GMPT zone and the step droplet zone, is obtained by the conditions A_{q,eff} = 0 and B_{q,eff} = 0, where A_{q,eff} and B_{q,eff} are the coefficients of the |q|^2 and |q|^3 terms, respectively, in the |q|-expansion of the surface free energy f_eff(q). Here, q is the surface gradient relative to the (111) surface. The reason why the vicinal surface inclined in the <101> direction does not exhibit step-faceting is explained in terms of the one-dimensional spinless quasi-impenetrable attractive bosons at absolute zero.

  17. Development and Validation of the First Iranian Questionnaire to Assess Quality of Life in Patients With Heart Failure: IHF-QoL

    PubMed Central

    Naderi, Nasim; Bakhshandeh, Hooman; Amin, Ahmad; Taghavi, Sepideh; Dadashi, Masoumeh; Maleki, Majid

    2012-01-01

Background: In its Constitution of 1948, WHO defined health as “a state of complete physical, mental, and social well-being, and not merely the absence of disease and infirmity”. In 1994, the Agency for Health Care Policy and Research published clinical practice guidelines recommending providers to routinely evaluate patients’ HRQoL (Health Related Quality of Life) and use their assessment to modify and guide patient care. Objectives: to create a valid, sensitive, disease-specific Persian health status quality of life questionnaire for patients with chronic heart failure in Iran. Materials and Methods: Considering the existing relevant inventories and scientific literature, the authors designed the first draft of questionnaire which was modified and validated, using expert opinions and finalized in a session of expert panel. The questionnaire was processed among 130 patients with heart failure. Construct validity evaluated by principle component factor analysis, and promax method was used for factor rotation. MacNew quality of life questionnaire was selected to assess convergence validity, and the agreements were measured in 60 patients. Discriminant validity was also assessed. Thirty patients were followed for 3 months and responsiveness of questionnaire was measured. Cronbach's alpha, item analysis, and Intra-class correlation coefficients (ICCs) were used to investigate reliability of questionnaire. SPSS 15 for Windows was applied for statistical analysis. Results: Principle component factor analysis revealed 4 main components. Sub-group analysis suggested that IHF-QoL questionnaire demonstrated an acceptable discriminant validity. High conformity between this inventory and MacNew questionnaire revealed an appropriate convergence validity. Cronbach's alpha (α) for the overall questionnaire was equal to 0.922. Intra-class correlation coefficients (ICCs) for all components were significant (from 0.708 to 0.883; all P values < 0.001). Patients fallow

  18. Positron Emission Tomography for Assessing Local Failure After Stereotactic Body Radiotherapy for Non-Small-Cell Lung Cancer

    SciTech Connect

    Zhang Xu; Liu Hui; Balter, Peter; Allen, Pamela K.; Komaki, Ritsuko; Pan, Tinsu; Chuang, Hubert H.; Chang, Joe Y.

    2012-08-01

    Purpose: We analyzed whether positron emission tomography (PET)/computed tomography standardized uptake values (SUVs) after stereotactic body radiotherapy (SBRT) could predict local recurrence (LR) in non-small-cell lung cancer (NSCLC). Methods and Materials: This study comprised 128 patients with Stage I (n = 68) or isolated recurrent/secondary parenchymal (n = 60) NSCLC treated with image-guided SBRT to 50 Gy over 4 consecutive days; prior radiotherapy was allowed. PET/computed tomography scans were obtained before therapy and at 1 to 6 months after therapy, as well as subsequently as clinically indicated. Continuous variables were analyzed with Kruskal-Wallis tests and categorical variables with Pearson chi-square or Fisher exact tests. Actuarial local failure rates were calculated with the Kaplan-Meier method. Results: At a median follow-up of 31 months (range, 6-71 months), the actuarial 1-, 2-, and 3-year local control rates were 100%, 98.5%, and 98.5%, respectively, in the Stage I group and 95.8%, 87.6%, and 85.8%, respectively, in the recurrent group. The cumulative rates of regional nodal recurrence and distant metastasis were 8.8% (6 of 68) and 14.7% (10 of 68), respectively, for the Stage I group and 11.7% (7 of 60) and 16.7% (10 of 60), respectively, for the recurrent group. Univariate analysis showed that SUVs obtained 12.1 to 24 months after treatment for the Stage I group (p = 0.007) and 6.1 to 12 months and 12.1 to 24 months after treatment for the recurrent group were associated with LR (p < 0.001 for both). Of the 128 patients, 17 (13.3%) had ipsilateral consolidation after SBRT but no elevated metabolic activity on PET; none had LR. The cutoff maximum SUV of 5 was found to have 100% sensitivity, 91% specificity, a 50% positive predictive value, and a 100% negative predictive value for predicting LR. Conclusions: PET was helpful for distinguishing SBRT-induced consolidation from LR. SUVs obtained more than 6 months after SBRT for NSCLC were
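
    The operating characteristics quoted for the maximum-SUV cutoff of 5 follow directly from a 2×2 confusion table. The sketch below shows that arithmetic only; the counts are hypothetical placeholders chosen to be consistent with the reported percentages and are not the study's data.

```python
# Hedged illustration of sensitivity/specificity/PPV/NPV for an "SUVmax > 5" rule.
# The counts (tp, fp, fn, tn) are hypothetical, not taken from the study.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = diagnostic_metrics(tp=4, fp=4, fn=0, tn=40)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} PPV={ppv:.0%} NPV={npv:.0%}")
```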

  19. Radial dyssynchrony assessed by cardiovascular magnetic resonance in relation to left ventricular function, myocardial scarring and QRS duration in patients with heart failure

    PubMed Central

    2009-01-01

    Background Intuitively, cardiac dyssynchrony is the inevitable result of myocardial injury. We hypothesized that radial dyssynchrony reflects left ventricular remodeling, myocardial scarring, QRS duration and impaired LV function and that, accordingly, it is detectable in all patients with heart failure. Methods 225 patients with heart failure, grouped according to QRS duration of <120 ms (A, n = 75), between 120-149 ms (B, n = 75) or ≥150 ms (C, n = 75), and 50 healthy controls underwent assessment of radial dyssynchrony using the cardiovascular magnetic resonance tissue synchronization index (CMR-TSI = SD of time to peak inward endocardial motion in up to 60 myocardial segments). Results Compared to 50 healthy controls (21.8 ± 6.3 ms [mean ± SD]), CMR-TSI was higher in A (74.8 ± 34.6 ms), B (92.4 ± 39.5 ms) and C (104.6 ± 45.6 ms) (all p < 0.0001). Adopting a cut-off CMR-TSI of 34.4 ms (the control mean of 21.8 ms plus 2 × SD) for the definition of dyssynchrony, it was present in 91% in A, 95% in B and 99% in C. Amongst patients in NYHA class III or IV, with an LVEF < 35% and a QRS > 120 ms, 99% had dyssynchrony. Amongst those with a QRS < 120 ms, 91% had dyssynchrony. Across the study sample, CMR-TSI was related positively to left ventricular volumes (p < 0.0001) and inversely to LVEF (CMR-TSI = 178.3·e^(-0.033·LVEF) ms, p < 0.0001). Conclusion Radial dyssynchrony is almost universal in patients with heart failure. This argues against the notion that a lack of response to CRT is related to a lack of dyssynchrony. PMID:19930713
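
    Two of the quantitative rules quoted above, the dyssynchrony cutoff (control mean plus two standard deviations) and the fitted exponential relation between CMR-TSI and LVEF, can be written down directly. The sketch below simply evaluates those published numbers; it is not the study's analysis code.

```python
# Evaluate the dyssynchrony cutoff and the reported CMR-TSI vs LVEF fit.
import math

control_mean_ms, control_sd_ms = 21.8, 6.3
cutoff_ms = control_mean_ms + 2 * control_sd_ms        # 34.4 ms, as in the study
print(f"dyssynchrony cutoff = {cutoff_ms:.1f} ms")

def predicted_cmr_tsi(lvef_percent):
    """CMR-TSI (ms) from the reported fit 178.3 * exp(-0.033 * LVEF)."""
    return 178.3 * math.exp(-0.033 * lvef_percent)

for lvef in (20, 35, 60):
    print(f"LVEF {lvef}% -> predicted CMR-TSI ~ {predicted_cmr_tsi(lvef):.0f} ms")
```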

  20. A pilot study to assess the feasibility of a submaximal exercise test to measure individual response to cardiac medication in dogs with acquired heart failure.

    PubMed

    Ferasin, L; Marcora, S

    2007-08-01

    Exercise testing is not commonly used in canine medicine because of several limitations. The aim of this study was to investigate the suitability of a treadmill test to measure the exercise capacity of untrained canine cardiac patients and to measure some biological parameters that might reflect the tolerance of dogs with heart failure to submaximal exercise. The exercise capacity of seven dogs with naturally occurring heart failure was evaluated before the institution of cardiac medication and 7 days after the beginning of the study. An additional re-examination was requested after 28 days. The exercise test was performed on a motorized treadmill at three different speeds (0.5 m/s, 1.0 m/s and 1.5 m/s). The following parameters were measured at the end of each stage and after 20 min of recovery: heart rate, rectal temperature, glucose, lactate, aspartate aminotransferase, creatine kinase, PvO2, PvCO2, pH, haematocrit, bicarbonate, sodium, potassium and chloride. Serum cardiac troponin-I was also measured at the beginning of the test and at the end of the recovery period. Owners' perception reflected the ability of their dogs to exercise on the treadmill. Lactate level increased noticeably with the intensity of the exercise test, and its variation coincided with the differences in exercise tolerance observed by the owners. Heart rate seemed to follow a similar trend in the few dogs presenting in sinus rhythm. None of the remaining parameters appeared to be sensitive indicators of activity level in the dogs used in this study. The treadmill exercise test in dogs with acquired heart failure is feasible and might provide useful information for assessing individual response to cardiac medication. Lactate and heart rate seemed to reflect individual levels of exercise tolerance, although further studies are necessary to confirm the reliability and repeatability of this test. PMID:17253114

  1. Algorithmic Identification for Wings in Butterfly Diagrams.

    NASA Astrophysics Data System (ADS)

    Illarionov, E. A.; Sokolov, D. D.

    2012-12-01

    We investigate to what extent the wings of solar butterfly diagrams can be separated without explicit use of Hale's polarity law or the location of the solar equator. Two cluster-analysis algorithms, DBSCAN and C-means, have demonstrated their ability to separate the wings of contemporary butterfly diagrams based only on the sunspot-group density in the diagram. Here we generalize the method to continuous tracers, give results concerning the migration velocities, and present the clusters obtained for cycles 12-20.
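
    A minimal sketch of the clustering step described above, using scikit-learn's DBSCAN on (time, latitude) points. The synthetic wings below stand in for real sunspot-group records, and the eps/min_samples values are assumptions, not the authors' settings.

```python
# Separate the two wings of a synthetic butterfly diagram with DBSCAN.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
t = rng.uniform(0.0, 10.0, 1500)                           # years since cycle start
north = 28.0 - 2.0 * t + rng.normal(0.0, 2.0, t.size)      # northern wing drifting equatorward
south = -(28.0 - 2.0 * t) + rng.normal(0.0, 2.0, t.size)   # mirror-image southern wing
points = np.column_stack([np.r_[t, t], np.r_[north, south]])

labels = DBSCAN(eps=1.5, min_samples=10).fit_predict(points)  # label -1 marks noise
print("wings found:", sorted(set(labels) - {-1}))
```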

  2. A Hubble Diagram for Quasars

    NASA Astrophysics Data System (ADS)

    Risaliti, G.; Lusso, E.

    2015-12-01

    We present a new method to test the ΛCDM cosmological model and to estimate cosmological parameters based on the nonlinear relation between the ultraviolet and X-ray luminosities of quasars. We built a data set of 1138 quasars by merging several samples from the literature with X-ray measurements at 2 keV and SDSS photometry, which was used to estimate the extinction-corrected 2500 Å flux. We obtained three main results: (1) we checked the nonlinear relation between X-ray and UV luminosities in small redshift bins up to z ≈ 6, confirming that the relation holds at all redshifts with the same slope; (2) we built a Hubble diagram for quasars up to z ≈ 6, which is well matched to that of supernovae in the common z = 0-1.4 redshift interval and extends the test of the cosmological model up to z ≈ 6; and (3) we showed that this nonlinear relation is a powerful tool for estimating cosmological parameters. Using the present data and assuming a ΛCDM model, we obtain Ω_M = 0.22 (+0.10/-0.08) and Ω_Λ = 0.92 (+0.18/-0.30) (Ω_M = 0.28 ± 0.04 and Ω_Λ = 0.73 ± 0.08 from a joint quasar-SNe fit). Much more precise measurements will be achieved with future surveys. A few thousand SDSS quasars already have serendipitous X-ray observations from Chandra or XMM-Newton, and at least 100,000 quasars with UV and X-ray data will be made available by the extended ROentgen Survey with an Imaging Telescope Array all-sky survey in a few years. The Euclid, Large Synoptic Survey Telescope, and Advanced Telescope for High ENergy Astrophysics surveys will further increase the sample size to at least several hundred thousand. Our simulations show that these samples will provide tight constraints on the cosmological parameters and will allow us to test for possible deviations from the standard model with higher precision than is possible today.
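
    At the core of the method is a linear fit in log-log space of the X-ray versus UV luminosities. The sketch below illustrates that step on synthetic data; the slope of about 0.6 is the value commonly quoted for this relation, while the intercept, scatter, and luminosity range are arbitrary assumptions.

```python
# Fit log L_X = gamma * log L_UV + beta on synthetic quasar luminosities.
import numpy as np

rng = np.random.default_rng(2)
log_Luv = rng.uniform(28.0, 32.0, 1138)           # synthetic log monochromatic UV luminosities
gamma_true, beta_true, scatter = 0.6, 8.0, 0.24   # slope ~0.6; intercept and scatter arbitrary
log_Lx = gamma_true * log_Luv + beta_true + rng.normal(0.0, scatter, log_Luv.size)

gamma_fit, beta_fit = np.polyfit(log_Luv, log_Lx, 1)   # linear fit in log-log space
print(f"fitted slope gamma = {gamma_fit:.2f}, intercept beta = {beta_fit:.1f}")
```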

  3. Assessment of RELAP5/MOD3 with the LOFT L9-1/L3-3 experiment simulating an anticipated transient with multiple failures

    SciTech Connect

    Bang, Y.S.; Seul, K.W.; Kim, H.J.

    1994-02-01

    The RELAP5/MOD3 5m5 code is assessed using the L9-1/L3-3 test carried out in the LOFT facility, a 1/60-scaled experimental reactor, simulating a loss-of-feedwater accident with multiple failures and the sequentially induced small-break loss-of-coolant accident. The code's predictive capability is evaluated for four separate sub-periods of the system response: the initial heatup phase, the spray and power-operated relief valve (PORV) cycling phase, the blowdown phase, and the recovery phase. Based on comparisons of the calculated results with the experimental data, it is shown that the overall thermal-hydraulic behavior important to the scenario, such as heat removal between the primary and secondary sides and system depressurization, is well predicted, and that the code could be applied to a full-scale nuclear power plant for an anticipated transient with multiple failures with reasonable accuracy. Minor discrepancies between the prediction and the experiment are identified in the reactor scram time, the post-scram behavior in the initial heatup phase, an excessive heatup rate in the cycling phase, insufficient energy convected out of the PORV under the stratified hot-leg condition in the saturated blowdown phase, and the void distribution in the secondary side in the recovery phase. These may stem from code uncertainties in predicting the spray mass flow rate, the associated condensation in the pressurizer, and the junction fluid density under stratified conditions.

  4. Learning from Failures.

    ERIC Educational Resources Information Center

    Saffran, Murray

    1991-01-01

    Describes mistakes made in trying to change the Nutrition and Digestion section of a medical biochemistry course. Author tried to make the section student taught and reports nine mistakes including the following: ignoring active opposition of colleagues, failure to assess the receptivity of the class to a new form of teaching, and overestimating…

  5. A Smart Thermal Block Diagram Tool

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn; Miyake, Robert; Dodge, Kyle

    2008-01-01

    The presentation describes a Smart Thermal Block Diagram Tool used by JPL's Team X in Pre-Phase A mission studies. It helps generate cost and mass estimates using proprietary databases.

  6. The Art of Free-Body Diagrams.

    ERIC Educational Resources Information Center

    Puri, Avinash

    1996-01-01

    Discusses the difficulty of drawing free-body diagrams, which show only the forces exerted on a body by its neighbors. Presents three ways a body may be modeled: as a particle, as a rigid extended body, and as a nonrigid extended body. (MKR)

  7. Phase diagram for passive electromagnetic scatterers.

    PubMed

    Lee, Jeng Yi; Lee, Ray-Kuang

    2016-03-21

    With the conservation of power, a phase diagram defined by amplitude square and phase of scattering coefficients for each spherical harmonic channel is introduced as a universal map for any passive electromagnetic scatterers. Physically allowable solutions for scattering coefficients in this diagram clearly show power competitions among scattering and absorption. It also illustrates a variety of exotic scattering or absorption phenomena, from resonant scattering, invisible cloaking, to coherent perfect absorber. With electrically small core-shell scatterers as an example, we demonstrate a systematic method to design field-controllable structures based on the allowed trajectories in this diagram. The proposed phase diagram and inverse design can provide tools to design functional electromagnetic devices. PMID:27136839

  8. An Improved Mnemonic Diagram for Thermodynamic Relationships.

    ERIC Educational Resources Information Center

    Rodriguez, Joaquin; Brainard, Alan J.

    1989-01-01

    Considers pressure, volume, entropy, temperature, Helmholtz free energy, Gibbs free energy, enthalpy, and internal energy. Suggests the mnemonic diagram is for use with simple systems that are defined as macroscopically homogeneous, isotropic, uncharged, and chemically inert. (MVL)

  9. Use of influence diagrams in gas transfer system option prioritization

    SciTech Connect

    Heger, A.S.; Garcia, M.D.

    1995-08-01

    A formal decision-analysis methodology was applied to aid the Department of Energy (DOE) in deciding which of several gas transfer system (GTS) options should be selected. The decision objectives for this case study, i.e., risk and cost, were directly derived from the DOE guidelines. Influence diagrams were used to define the structure of the decision problem and clearly delineate the flow of information. A set of performance measures was used in conjunction with the influence diagrams to assess and evaluate the degree to which the objectives of the case study were met. These performance measures were extracted from technical models, design and operating data, and professional judgments. The results were aggregated to provide an overall evaluation of the different design options of the gas transfer system. Consequently, the results of this analysis were used as an aid to DOE to select a viable GTS option.

  10. Elementary diagrams in nuclear and neutron matter

    SciTech Connect

    Wiringa, R.B.

    1995-08-01

    Variational calculations of nuclear and neutron matter are currently performed using a diagrammatic cluster expansion with the aid of nonlinear integral equations for evaluating expectation values. These are the Fermi hypernetted chain (FHNC) and single-operator chain (SOC) equations, which are a way of doing partial diagram summations to infinite order. A more complete summation can be made by adding elementary diagrams to the procedure. The simplest elementary diagrams appear at the four-body cluster level; there is one such E{sub 4} diagram in Bose systems, but 35 diagrams in Fermi systems, which gives a level of approximation called FHNC/4. We developed a novel technique for evaluating these diagrams, by computing and storing 6 three-point functions, S{sub xyz}(r{sub 12}, r{sub 13}, r{sub 23}), where xyz (= ccd, cce, ddd, dde, dee, or eee) denotes the exchange character at the vertices 1, 2, and 3. All 35 Fermi E{sub 4} diagrams can be constructed from these 6 functions and other two-point functions that are already calculated. The elementary diagrams are known to be important in some systems like liquid {sup 3}He. We expect them to be small in nuclear matter at normal density, but they might become significant at higher densities appropriate for neutron star calculations. This year we programmed the FHNC/4 contributions to the energy and tested them in a number of simple model cases, including liquid {sup 3}He and Bethe's homework problem. We get reasonable, but not exact, agreement with earlier published work. In nuclear and neutron matter with the Argonne v{sub 14} interaction these contributions are indeed small corrections at normal density and grow to only 5-10 MeV/nucleon at 5 times normal density.

  11. Lattice and Phase Diagram in QCD

    SciTech Connect

    Lombardo, Maria Paola

    2008-10-13

    Model calculations have produced a number of very interesting expectations for the QCD phase diagram, and the task of lattice calculations is to put these studies on quantitative grounds. I will give an overview of the current status of the lattice analysis of the QCD phase diagram, from the quantitative results of mature calculations at zero and small baryochemical potential to the exploratory studies of the colder, denser phase.

  12. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, R. E.

    1971-01-01

    A method and a computer program are presented to calculate probability of system success from an arbitrary reliability block diagram. The class of reliability block diagrams that can be handled include any active/standby combination of redundancy, and the computations include the effects of dormancy and switching in any standby redundancy. The mechanics of the program are based on an extension of the probability tree method of computing system probabilities.
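
    The basic computation such a tool performs reduces to combining independent block reliabilities through series and parallel structures. The sketch below is a generic illustration under an independence assumption, not the referenced program, and it omits the dormancy and switching effects handled there.

```python
# Combine block reliabilities for a simple series/parallel reliability block diagram.
def series(*r):
    """Series blocks: the system works only if every block works."""
    p = 1.0
    for ri in r:
        p *= ri
    return p

def parallel(*r):
    """Active-parallel blocks: the system works if at least one block works."""
    q = 1.0
    for ri in r:
        q *= (1.0 - ri)
    return 1.0 - q

# Example: two redundant 0.95 units in series with a single 0.99 unit.
print(f"system reliability = {series(parallel(0.95, 0.95), 0.99):.4f}")
```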

  13. Fluctuations and the QCD phase diagram

    SciTech Connect

    Schaefer, B.-J.

    2012-06-15

    In this contribution the role of quantum fluctuations for the QCD phase diagram is discussed. This concerns in particular the importance of the matter back-reaction to the gluonic sector. The impact of these fluctuations on the location of the confinement/deconfinement and the chiral transition lines as well as their interrelation are investigated. Consequences of our findings for the size of a possible quarkyonic phase and location of a critical endpoint in the phase diagram are drawn.

  14. A universal structured-design diagramer

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Program (FLOWCHARTER) generates standardized flowcharts and concordances for development and debugging of programs in any language. User describes programming-language grammar, providing syntax rules in Backus-Naur form (BNF), list of semantic rules, and set of concordance rules. Once grammar is described, user supplies only source code of program to be diagrammed. FLOWCHARTER automatically produces flow diagram and concordance. Source code for program is written for PASCAL Release 2 compiler, as distributed by University of Minnesota.

  15. ISS EPS Orbital Replacement Unit Block Diagrams

    NASA Technical Reports Server (NTRS)

    Schmitz, Gregory V.

    2001-01-01

    The attached documents are being provided to Switching Power Magazine for information purposes. This magazine is writing a feature article on the International Space Station Electrical Power System, focusing on the switching power processors. These units include the DC-DC Converter Unit (DDCU), the Bi-directional Charge/Discharge Unit (BCDU), and the Sequential Shunt Unit (SSU). These diagrams are high-level schematics/block diagrams depicting the overall functionality of each unit.

  16. Fault Tree, Event Tree, and Piping and Instrumentation Diagram (FEP) editors, Version 4.0

    SciTech Connect

    McKay, M.K.; Skinner, N.L.; Wood, S.T.

    1992-05-01

    The Fault Tree, Event Tree, and Piping and Instrumentation Diagram (FEP) editors allow the user to graphically build and edit fault trees, event trees, and piping and instrumentation diagrams (P&IDs). The software is designed to enable the use of the graphical editors found in the Integrated Reliability and Risk Assessment System (IRRAS). FEP is made up of three separate editors (Fault Tree, Event Tree, and Piping and Instrumentation Diagram) and a utility module. This reference manual provides a screen-by-screen walkthrough of the entire FEP System.

  17. Class diagram based evaluation of software performance

    NASA Astrophysics Data System (ADS)

    Pham, Huong V.; Nguyen, Binh N.

    2013-03-01

    The evaluation of software performance in the early stages of the software life cycle is important and has been widely studied. In software model specification, the class diagram is the principal object-oriented specification model. Measures based on a class diagram have been widely studied to evaluate software quality attributes such as complexity, maintainability, and reuse capability. However, software performance evaluation based on the class model has not been widely studied, especially for the object-oriented design of embedded software. Therefore, in this paper we propose a new approach to evaluating software performance directly from class diagrams. From a class diagram, we determine the parameters used to evaluate and build formulas for measures such as Size of Class Variables, Size of Class Methods, Size of Instance Variables, and Size of Instance Methods. We then analyze the dependence of performance on these measures and build a performance evaluation function from the class diagram, which allows the best class diagram to be chosen on the basis of this evaluation function.
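
    The size measures named above can be extracted mechanically from source code or a class model. The sketch below is an assumed illustration (not the paper's tool) that counts class variables, methods, and instance variables of a toy Python class using the standard ast module.

```python
# Count class-diagram-style size measures for each class in a source string.
import ast

source = """
class Sensor:
    unit = "K"                     # class variable
    def __init__(self):
        self.value = 0.0           # instance variable
    def read(self):
        return self.value
"""

for node in ast.walk(ast.parse(source)):
    if isinstance(node, ast.ClassDef):
        methods = [m for m in node.body if isinstance(m, ast.FunctionDef)]
        class_vars = [a for a in node.body if isinstance(a, ast.Assign)]
        instance_vars = {t.attr for m in methods for a in ast.walk(m)
                         if isinstance(a, ast.Assign)
                         for t in a.targets
                         if isinstance(t, ast.Attribute)
                         and isinstance(t.value, ast.Name) and t.value.id == "self"}
        print(node.name, "- methods:", len(methods),
              "class vars:", len(class_vars), "instance vars:", len(instance_vars))
```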

  18. Pressure-enthalpy diagrams for alternative refrigerants

    SciTech Connect

    Chen, J.; Kruse, H.

    1996-10-01

    Thermodynamic diagrams, particularly log(p)-h diagrams, have become very convenient tools for the refrigeration and air-conditioning industries. To promote the development and application of alternative refrigerants, there is an urgent need to provide these industries with reliable engineering diagrams for the most promising candidate refrigerants. A computer program has been developed for automatically producing log(p)-h diagrams for alternative refrigerants. The Lee-Kesler-Ploecker (LKP) equation of state was used to calculate the thermodynamic data, with some modifications made to the LKP to improve calculation convergence. In this paper, three sample diagrams, for R134a, the binary blend R410A, and the ternary blend R407B, are included and analyzed. To investigate the accuracy of the LKP calculations in detail, an extensive deviation analysis was made for R134a. For mixed refrigerants, good calculation accuracy was achieved by optimizing the binary interaction parameters. The system can produce log(p)-h diagrams with reliable accuracy, high quality, and the flexibility to meet any size and color requirements.
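
    For readers who want to reproduce the general shape of such a chart, the sketch below draws the saturation dome of a log(p)-h diagram for R134a. It uses the open-source CoolProp property library as a stand-in for the paper's LKP-based program; the refrigerant choice and pressure range are assumptions.

```python
# Saturation dome of a log(p)-h diagram for R134a using CoolProp (not the LKP code).
import numpy as np
import matplotlib.pyplot as plt
from CoolProp.CoolProp import PropsSI

p = np.linspace(1e5, 3.9e6, 200)   # Pa, kept below R134a's critical pressure (~4.06 MPa)
h_liq = [PropsSI('H', 'P', pi, 'Q', 0, 'R134a') / 1e3 for pi in p]   # kJ/kg, saturated liquid
h_vap = [PropsSI('H', 'P', pi, 'Q', 1, 'R134a') / 1e3 for pi in p]   # kJ/kg, saturated vapor

plt.semilogy(h_liq, p / 1e5, 'b', h_vap, p / 1e5, 'r')   # pressure shown in bar
plt.xlabel('enthalpy h [kJ/kg]')
plt.ylabel('pressure p [bar]')
plt.title('R134a saturation dome on log(p)-h axes')
plt.show()
```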

  19. Heart Failure

    MedlinePlus

    ... Tiredness and shortness of breath Common causes of heart failure are coronary artery disease, high blood pressure and diabetes. It is more common in people who are 65 years old or older, African Americans, people who are overweight, and people who have ...

  20. Respiratory Failure

    MedlinePlus

    ... from inhaling smoke or harmful fumes Treatment for respiratory failure depends on whether the condition is acute (short-term) or chronic (ongoing) and how severe it is. It also depends on the underlying cause. You may receive oxygen therapy and other treatment to help you breathe. NIH: ...

  1. Margins in high temperature leak-before-break assessments

    SciTech Connect

    Budden, P.J.; Hooton, D.G.

    1997-04-01

    Developments in the defect assessment procedure R6 to include high-temperature mechanisms in Leak-before-Break arguments are described. In particular, the effect of creep on the time available to detect a leak and on the crack opening area, and hence leak rate, is discussed. The competing influence of these two effects is emphasized by an example. The application to Leak-before-Break of the time-dependent failure assessment diagram approach for high temperature defect assessment is then outlined. The approach is shown to be of use in assessing the erosion of margins by creep.

  2. Imaging assessment of a portable hemodialysis device: detection of possible failure modes and monitoring of functional performance

    PubMed Central

    Olorunsola, Olufoladare G.; Kim, Steven H.; Chang, Ryan; Kuo, Yuo-Chen; Hetts, Steven W.; Heller, Alex; Kant, Rishi; Saeed, Maythem; Fissell, William H.; Roy, Shuvo; Wilson, Mark W.

    2014-01-01

    Background The purpose of this study was to investigate the utility and limitations of various imaging modalities in the noninvasive assessment of a novel compact hemodialyzer under development for renal replacement therapy, with the specific aim of monitoring its functional performance. Methods The prototype is a 4×3×6 cm aluminum cartridge housing “blood” and “dialysate” flow paths arranged in parallel. A sheet of semipermeable silicon nanopore membranes forms the blood-dialysate interface, allowing passage of small molecules. Blood flow was simulated using a peristaltic pump to instill iodinated contrast through the blood compartment, while de-ionized water was instilled through the dialysate compartment at a matched rate in the countercurrent direction. Images were acquired under these flow conditions using multi-detector computed tomography (MDCT), fluoroscopy, high-resolution quantitative computed tomography (HR-QCT), and magnetic resonance imaging (MRI). MDCT was used to monitor contrast diffusion efficiency by plotting contrast density as a function of position along the path of flow through the cartridge during steady-state infusion at 1 and 20 mL/min. Both linear and exponential regressions were used to model contrast decay along the flow path. Results Both linear and exponential models of contrast decay appeared to be reasonable approximations, yielding similar results for contrast diffusion during a single pass through the cartridge. There was no measurable difference in contrast diffusion when comparing 1 mL/min and 20 mL/min flow rates. Fluoroscopy allowed a gross qualitative assessment of flow within the device, and revealed flow inhomogeneity within the corner of the cartridge opposite the blood inlet port. MRI and HR-QCT were both severely limited due to the paramagnetic properties and high atomic number of the target material, respectively. During testing, we encountered several causes of device malfunction, including leak formation
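
    The linear and exponential contrast-decay models mentioned in the Methods can be fitted with a few lines of standard scientific Python. The sketch below uses synthetic contrast-density measurements along the flow path; it is an assumed illustration, not the study's code or data.

```python
# Fit linear and exponential models of contrast decay along the flow path.
import numpy as np
from scipy.optimize import curve_fit

position = np.linspace(0.0, 6.0, 25)                      # cm along the cartridge (synthetic)
density = (300.0 * np.exp(-0.15 * position)
           + np.random.default_rng(3).normal(0.0, 5.0, position.size))

slope, intercept = np.polyfit(position, density, 1)       # linear model
exp_model = lambda x, a, k: a * np.exp(-k * x)            # exponential model
(a, k), _ = curve_fit(exp_model, position, density, p0=(300.0, 0.1))

print(f"linear slope = {slope:.1f} per cm, exponential decay constant k = {k:.3f} per cm")
```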

  3. Application of ISO 22000 and Failure Mode and Effect Analysis (FMEA) for industrial processing of salmon: a case study.

    PubMed

    Arvanitoyannis, Ioannis S; Varzakas, Theodoros H

    2008-05-01

    The Failure Mode and Effect Analysis (FMEA) model was applied for risk assessment of salmon manufacturing. A tentative approach to FMEA application in the salmon industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the failure modes occurring in a food chain system (a salmon processing plant), based on the functions, characteristics, and/or interactions of the ingredients or processes upon which the system depends. Critical Control Points were identified and incorporated in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work, a comparison of ISO 22000 analysis with HACCP is carried out for salmon processing and packaging. However, the main emphasis was placed on quantifying the risk assessment by determining the RPN for each identified processing hazard. Fish receiving, casing/marking, blood removal, evisceration, filet-making, cooling/freezing, and distribution were identified as the processes with the highest RPNs (252, 240, 210, 210, 210, 210, and 200, respectively), and corrective actions were undertaken. After the application of corrective actions, a second calculation of RPN values was carried out, resulting in substantially lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of the conclusions derived from the risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a salmon processing plant is anticipated to prove advantageous to industrialists, state food inspectors, and consumers. PMID:18464031
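
    The RPN values quoted above come from the usual FMEA product of severity, occurrence, and detectability scores. The sketch below shows that arithmetic; the individual S/O/D scores are hypothetical factorizations consistent with two of the quoted RPNs, not the study's actual ratings.

```python
# Risk Priority Number: severity x occurrence x detectability (1-10 scale for each factor).
def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

steps = {
    "fish receiving": (7, 6, 6),   # hypothetical S, O, D -> RPN 252
    "evisceration":   (7, 6, 5),   # hypothetical S, O, D -> RPN 210
}
for step, (s, o, d) in steps.items():
    print(f"{step:15s} RPN = {rpn(s, o, d)}")
```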

  4. Adding point of care ultrasound to assess volume status in heart failure patients in a nurse-led outpatient clinic. A randomised study

    PubMed Central

    Gundersen, Guri Holmen; Norekval, Tone M; Haug, Hilde Haugberg; Skjetne, Kyrre; Kleinau, Jens Olaf; Graven, Torbjorn; Dalen, Havard

    2016-01-01

    Objectives Medical history, physical examination and laboratory testing are not optimal for the assessment of volume status in heart failure (HF) patients. We aimed to study the clinical influence of focused ultrasound of the pleural cavities and inferior vena cava (IVC) performed by specialised nurses to assess volume status in HF patients at an outpatient clinic. Methods HF outpatients were prospectively included and underwent laboratory testing, history recording and clinical examination by two nurses with and without an ultrasound examination of the pleural cavities and IVC using a pocket-size imaging device, in random order. Each nurse worked in a team with a cardiologist. The influence of the different diagnostic tests on diuretic dosing was assessed descriptively and in linear regression analyses. Results Sixty-two patients were included and 119 examinations were performed. Mean±SD age was 74±12 years, EF was 34±14%, and N-terminal pro-brain natriuretic peptide (NT-proBNP) value was 3761±3072 ng/L. Dosing of diuretics differed between the teams in 31 out of 119 consultations. Weight change and volume status assessed clinically with and without ultrasound predicted dose adjustment of diuretics at follow-up (p<0.05). Change of oedema, NT-proBNP, creatinine, and symptoms did not (p≥0.10). In adjusted analyses, only volume status based on ultrasound predicted dose adjustments of diuretics at first visit and follow-up (all ultrasound p≤0.01, all other p≥0.2). Conclusions Ultrasound examinations of the pleural cavities and IVC by nurses may improve diagnostics and patient care in HF patients at an outpatient clinic, but more studies are needed to determine whether these examinations have an impact on clinical outcomes. Trial registration number NCT01794715. PMID:26438785

  5. Metallization failures

    NASA Technical Reports Server (NTRS)

    Beatty, R.

    1971-01-01

    Metallization-related failure mechanisms were shown to be a major cause of integrated circuit failures under accelerated stress conditions, as well as in actual use under field operation. The integrated circuit industry is aware of the problem and is attempting to solve it in one of two ways: (1) better understanding of the aluminum system, which is the most widely used metallization material for silicon integrated circuits both as a single level and multilevel metallization, or (2) evaluating alternative metal systems. Aluminum metallization offers many advantages, but also has limitations particularly at elevated temperatures and high current densities. As an alternative, multilayer systems of the general form, silicon device-metal-inorganic insulator-metal, are being considered to produce large scale integrated arrays. The merits and restrictions of metallization systems in current usage and systems under development are defined.

  6. Common Cause Failure Modeling

    NASA Technical Reports Server (NTRS)

    Hark, Frank; Britton, Paul; Ring, Rob; Novack, Steven D.

    2015-01-01

    Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are a set of dependent failures that can be caused by, for example, system environments, manufacturing, transportation, storage, maintenance, and assembly. Since there are many factors that contribute to CCFs, the effects can be reduced, but they are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and CCF (dependent) failures, and data are limited, especially for launch vehicles. The Probabilistic Risk Assessment (PRA) group of NASA's Safety and Mission Assurance Directorate at Marshall Space Flight Center (MSFC) is using generic data from the Nuclear Regulatory Commission's database of common cause failures at nuclear power plants to estimate CCF due to the lack of a more appropriate data source. There remains uncertainty in the actual magnitude of the common cause risk estimates for different systems at this stage of the design. Given the limited data about launch vehicle CCF and that launch vehicles are a highly redundant system by design, it is important to make design decisions that account for a range of values for independent failures and CCFs. When investigating the design of the one-out-of-two component redundant system for launch vehicles, a response surface was constructed to represent the impact of the independent failure rate versus the common cause beta factor on a system's failure probability. This presentation will define a CCF and review estimation calculations. It gives a summary of reduction methodologies and a review of examples of historical CCFs. Finally, it presents the response surface and discusses the effects of the different CCF assumptions on the reliability of a one-out-of-two system.

  7. Common Cause Failure Modeling

    NASA Technical Reports Server (NTRS)

    Hark, Frank; Britton, Paul; Ring, Rob; Novack, Steven D.

    2016-01-01

    Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are a set of dependent failures that can be caused by, for example, system environments, manufacturing, transportation, storage, maintenance, and assembly. Since there are many factors that contribute to CCFs, the effects can be reduced, but they are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and CCF (dependent) failures, and data are limited, especially for launch vehicles. The Probabilistic Risk Assessment (PRA) group of NASA's Safety and Mission Assurance Directorate at Marshall Space Flight Center (MSFC) is using generic data from the Nuclear Regulatory Commission's database of common cause failures at nuclear power plants to estimate CCF due to the lack of a more appropriate data source. There remains uncertainty in the actual magnitude of the common cause risk estimates for different systems at this stage of the design. Given the limited data about launch vehicle CCF and that launch vehicles are a highly redundant system by design, it is important to make design decisions that account for a range of values for independent failures and CCFs. When investigating the design of the one-out-of-two component redundant system for launch vehicles, a response surface was constructed to represent the impact of the independent failure rate versus the common cause beta factor on a system's failure probability. This presentation will define a CCF and review estimation calculations. It gives a summary of reduction methodologies and a review of examples of historical CCFs. Finally, it presents the response surface and discusses the effects of the different CCF assumptions on the reliability of a one-out-of-two system.
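
    The beta-factor treatment described above has a compact closed form for a one-out-of-two redundant system: a fraction beta of each unit's failure probability is assumed to be common cause and to defeat both channels at once. The sketch below illustrates this; the unit failure probability and beta values are purely illustrative, not NASA or NRC figures.

```python
# 1oo2 system failure probability under the simple beta-factor common-cause model.
def one_out_of_two_failure(q_unit, beta):
    q_ccf = beta * q_unit            # common-cause portion fails both units together
    q_ind = (1.0 - beta) * q_unit    # independent portion of each unit
    return q_ccf + q_ind ** 2

for beta in (0.0, 0.05, 0.10):
    print(f"beta={beta:.2f}  P(system fails) = {one_out_of_two_failure(1e-3, beta):.2e}")
```

    Even a small beta dominates the system failure probability, which is why the redundancy credit taken in a PRA is so sensitive to the common cause assumption.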

  8. Assessment of vasodilator therapy in patients with severe congestive heart failure: limitations of measurements of left ventricular ejection fraction and volumes

    SciTech Connect

    Firth, B.G.; Dehmer, G.J.; Markham, R.V. Jr.; Willerson, J.T.; Hillis, L.D.

    1982-11-01

    Although noninvasive techniques are often used to assess the effect of vasodilator therapy in patients with congestive heart failure, it is unknown whether changes in noninvasively determined left ventricular ejection fraction, volume, or dimension reliably reflect alterations in intracardiac pressure and flow. Accordingly, we compared the acute effect of sodium nitroprusside on left ventricular volume and ejection fraction (determined scintigraphically) with its effect on intracardiac pressure and forward cardiac index (determined by thermodilution) in 12 patients with severe, chronic congestive heart failure and a markedly dilated left ventricle. Nitroprusside (infused at 1.3 +/- 1.1 (mean +/- standard deviation) microgram/kg/min) caused a decrease in mean systemic arterial, mean pulmonary arterial, and mean pulmonary capillary wedge pressure as well as a concomitant increase in forward cardiac index. Simultaneously, left ventricular end-diastolic and end-systolic volume indexes decreased, but the scintigraphically determined cardiac index did not change significantly. Left ventricular ejection fraction averaged 0.19 +/- 0.05 before nitroprusside administration and increased by less than 0.05 units in response to nitroprusside in 11 of 12 patients. The only significant correlation between scintigraphically and invasively determined variables was that between the percent change in end-diastolic volume index and the percent change in pulmonary capillary wedge pressure (r = 0.68, p = 0.01). Although nitroprusside produced changes in scintigraphically determined left ventricular ejection fraction, end-systolic volume index, and cardiac index, these alterations bore no predictable relation to changes in intracardiac pressure, forward cardiac index, or vascular resistance. Furthermore, nitroprusside produced a considerably greater percent change in the invasively measured variables than in the scintigraphically determined ones.

  9. Clinical usefulness of blood metal measurements to assess the failure of metal-on-metal hip implants

    PubMed Central

    Sampson, Barry; Hart, Alister

    2012-01-01

    In April 2010, a Medicines and Healthcare Products Regulatory Agency safety alert concerning all metal-on-metal (MOM) hip replacements recommended measuring chromium and cobalt concentrations when managing patients with painful prostheses. The need for this review is illustrated by the recent surge in requests for these blood tests from orthopaedic surgeons following this alert. The aim is to provide guidance to laboratories in assessing these requests and advising clinicians on interpretation. First, we summarize the basic terminology regarding the types of hip replacements, with emphasis on the MOM type. Second, we describe the clinical concerns over implant-derived wear debris in the local tissues and distant sites. Analytical aspects of the measurement of the relevant metal ions and what factors affect the levels measured are discussed. The application of inductively coupled plasma mass spectrometry techniques to the measurement of these metals is considered in detail. The biological effects of metal wear products are summarized with local toxicity and systemic biological effects considered, including carcinogenicity, genotoxicity and systemic toxicity. Clinical cases are used to illustrate pertinent points. PMID:22155921

  10. TIME-TEMPERATURE-TRANSFORMATION (TTT) DIAGRAMS FOR FUTURE WASTE COMPOSITIONS

    SciTech Connect

    Billings, A.; Edwards, T.

    2010-07-08

    As a part of the Waste Acceptance Product Specifications (WAPS) for Vitrified High-Level Waste Forms defined by the Department of Energy - Office of Environmental Management, the waste form stability must be determined for each of the projected high-level waste (HLW) types at the Savannah River Site (SRS). Specifically, WAPS 1.4.1 requires the glass transition temperature (T{sub g}) to be defined and time-temperature-transformation (TTT) diagrams to be developed. The T{sub g} of a glass is an indicator of the approximate temperature where the supercooled liquid converts to a solid on cooling or conversely, where the solid begins to behave as a viscoelastic solid on heating. A TTT diagram identifies the crystalline phases that can form as a function of time and temperature for a given waste type or more specifically, the borosilicate glass waste form. In order to assess durability, the Product Consistency Test (PCT) was used and the durability results compared to the Environmental Assessment (EA) glass. The measurement of glass transition temperature and the development of TTT diagrams have already been performed for the seven Defense Waste Processing Facility (DWPF) projected compositions as defined in the Waste Form Compliance Plan (WCP) and in SRNL-STI-2009-00025. Additional phase transformation information exists for other projected compositions, but overall these compositions did not cover composition regions estimated for future waste processing. To develop TTT diagrams for future waste types, the Savannah River National Laboratory (SRNL) fabricated two caches of glass from reagent grade oxides to simulate glass compositions which would be likely processed with and without Al dissolution. These were used for glass transition temperature measurement and TTT diagram development. The glass transition temperatures of both glasses were measured using differential scanning calorimetry (DSC) and were recorded to be 448 C and 452 C. Using the previous TTT diagrams as

  11. Non-planar on-shell diagrams

    NASA Astrophysics Data System (ADS)

    Franco, Sebastián; Galloni, Daniele; Penante, Brenda; Wen, Congkao

    2015-06-01

    We initiate a systematic study of non-planar on-shell diagrams in SYM and develop powerful technology for doing so. We introduce canonical variables generalizing face variables, which make the d log form of the on-shell form explicit. We make significant progress towards a general classification of arbitrary on-shell diagrams by means of two classes of combinatorial objects: generalized matching and matroid polytopes. We propose a boundary measurement that connects general on-shell diagrams to the Grassmannian. Our proposal exhibits two important and non-trivial properties: positivity in the planar case and it matches the combinatorial description of the diagrams in terms of generalized matroid polytopes. Interestingly, non-planar diagrams exhibit novel phenomena, such as the emergence of constraints on Plücker coordinates beyond Plücker relations when deleting edges, which are neatly captured by the generalized matching and matroid polytopes. This behavior is tied to the existence of a new type of poles in the on-shell form at which combinations of Plücker coordinates vanish. Finally, we introduce a prescription, applicable beyond the MHV case, for writing the on-shell form as a function of minors directly from the graph.

  12. Identifying conservation successes, failures and future opportunities; assessing recovery potential of wild ungulates and tigers in Eastern Cambodia.

    PubMed

    O'Kelly, Hannah J; Evans, Tom D; Stokes, Emma J; Clements, Tom J; Dara, An; Gately, Mark; Menghor, Nut; Pollard, Edward H B; Soriyun, Men; Walston, Joe

    2012-01-01

    Conservation investment, particularly for charismatic and wide-ranging large mammal species, needs to be evidence-based. Despite the prevalence of this theme within the literature, examples of robust data being generated to guide conservation policy and funding decisions are rare. We present the first published case-study of tiger conservation in Indochina, from a site where an evidence-based approach has been implemented for this iconic predator and its prey. Despite the persistence of extensive areas of habitat, Indochina's tiger and ungulate prey populations are widely supposed to have precipitously declined in recent decades. The Seima Protection Forest (SPF), and broader Eastern Plains Landscape, was identified in 2000 as representing Cambodia's best hope for tiger recovery; reflected in its designation as a Global Priority Tiger Conservation Landscape. Since 2005 distance sampling, camera-trapping and detection-dog surveys have been employed to assess the recovery potential of ungulate and tiger populations in SPF. Our results show that while conservation efforts have ensured that small but regionally significant populations of larger ungulates persist, and density trends in smaller ungulates are stable, overall ungulate populations remain well below theoretical carrying capacity. Extensive field surveys failed to yield any evidence of tiger, and we contend that there is no longer a resident population within the SPF. This local extirpation is believed to be primarily attributable to two decades of intensive hunting; but importantly, prey densities are also currently below the level necessary to support a viable tiger population. Based on these results and similar findings from neighbouring sites, Eastern Cambodia does not currently constitute a Tiger Source Site nor meet the criteria of a Global Priority Tiger Landscape. However, SPF retains global importance for many other elements of biodiversity. It retains high regional importance for ungulate

  13. The Semiotic Structure of Geometry Diagrams: How Textbook Diagrams Convey Meaning

    ERIC Educational Resources Information Center

    Dimmel, Justin K.; Herbst, Patricio G.

    2015-01-01

    Geometry diagrams use the visual features of specific drawn objects to convey meaning about generic mathematical entities. We examine the semiotic structure of these visual features in two parts. One, we conduct a semiotic inquiry to conceptualize geometry diagrams as mathematical texts that comprise choices from different semiotic systems. Two,…

  14. A method for developing design diagrams for ceramic and glass materials using fatigue data

    NASA Technical Reports Server (NTRS)

    Heslin, T. M.; Magida, M. B.; Forrest, K. A.

    1986-01-01

    The service lifetime of glass and ceramic materials can be expressed as a plot of time-to-failure versus applied stress that is parametric in the percent probability of failure. This type of plot is called a design diagram. Confidence interval estimates for such plots depend on the type of test used to generate the data, on the assumptions made concerning the statistical distribution of the test results, and on the type of analysis used. This report outlines, in engineering terms, the development of design diagrams for glass and ceramic materials using static or dynamic fatigue tests, assuming either no particular statistical distribution of the test results or a Weibull distribution, and using either median-value or homologous-ratio analysis of the test results.
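
    A design diagram of the kind described can be sketched by combining a Weibull inert-strength distribution with a power-law slow-crack-growth lifetime relation, t_f = B · S_i^(n-2) · σ^(-n). The short example below does exactly that; the relation is the standard static-fatigue form and every parameter value is an invented placeholder, not data from the report.

```python
# Sketch a design diagram: time-to-failure vs applied stress, parametric in failure probability.
import numpy as np
import matplotlib.pyplot as plt

n, B = 16.0, 200.0     # fatigue exponent and constant (MPa^2 * s) -- assumed values
S0, m = 70.0, 8.0      # Weibull scale strength (MPa) and modulus -- assumed values

def inert_strength(p_fail):
    """Inert strength at a given failure probability from the Weibull distribution."""
    return S0 * (-np.log(1.0 - p_fail)) ** (1.0 / m)

sigma = np.linspace(10.0, 50.0, 200)            # applied stress, MPa
for p_fail in (0.01, 0.10, 0.50):
    t_f = B * inert_strength(p_fail) ** (n - 2) * sigma ** (-n)
    plt.loglog(sigma, t_f, label=f"P_f = {p_fail:.0%}")

plt.xlabel("applied stress [MPa]")
plt.ylabel("time to failure [s]")
plt.legend()
plt.title("Design diagram sketch (assumed parameters)")
plt.show()
```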

  15. The Use of Computational Diagrams and Nomograms in Higher Education.

    ERIC Educational Resources Information Center

    Brandenburg, Richard K.; Simpson, William A.

    1984-01-01

    The use of computational diagrams and nomographs for the calculations that frequently occur in college administration is examined. Steps in constructing a nomograph and a four-dimensional computational diagram are detailed, and uses of three- and four-dimensional diagrams are covered. Diagrams and nomographs are useful in the following cases: (1)…

  16. 49 CFR 1152.10 - System diagram map.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false System diagram map. 1152.10 Section 1152.10... TRANSPORTATION UNDER 49 U.S.C. 10903 System Diagram § 1152.10 System diagram map. (a) Each carrier shall prepare a diagram of its rail system on a map, designating all lines in its system by the...

  17. Fishbone Diagrams: Organize Reading Content with a "Bare Bones" Strategy

    ERIC Educational Resources Information Center

    Clary, Renee; Wandersee, James

    2010-01-01

    Fishbone diagrams, also known as Ishikawa diagrams or cause-and-effect diagrams, are one of the many problem-solving tools created by Dr. Kaoru Ishikawa, a University of Tokyo professor. Part of the brilliance of Ishikawa's idea resides in the simplicity and practicality of the diagram's basic model--a fish's skeleton. This article describes how…

  18. 49 CFR 1152.10 - System diagram map.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false System diagram map. 1152.10 Section 1152.10... TRANSPORTATION UNDER 49 U.S.C. 10903 System Diagram § 1152.10 System diagram map. (a) Each carrier shall prepare a diagram of its rail system on a map, designating all lines in its system by the...

  19. 49 CFR 1152.10 - System diagram map.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 8 2014-10-01 2014-10-01 false System diagram map. 1152.10 Section 1152.10... TRANSPORTATION UNDER 49 U.S.C. 10903 System Diagram § 1152.10 System diagram map. (a) Each carrier shall prepare a diagram of its rail system on a map, designating all lines in its system by the...

  20. 49 CFR 1152.10 - System diagram map.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 8 2011-10-01 2011-10-01 false System diagram map. 1152.10 Section 1152.10... TRANSPORTATION UNDER 49 U.S.C. 10903 System Diagram § 1152.10 System diagram map. (a) Each carrier shall prepare a diagram of its rail system on a map, designating all lines in its system by the...

  1. 49 CFR 1152.10 - System diagram map.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false System diagram map. 1152.10 Section 1152.10... TRANSPORTATION UNDER 49 U.S.C. 10903 System Diagram § 1152.10 System diagram map. (a) Each carrier shall prepare a diagram of its rail system on a map, designating all lines in its system by the...

  2. Computationally useful bridge diagram series. II. Diagrams in h-bonds

    NASA Astrophysics Data System (ADS)

    Perkyns, John S.; Dyer, Kippi M.; Pettitt, B. Montgomery

    2002-06-01

    Equations for calculating accurate 4-point and 5-point bridge diagrams in terms of h-bonds have been presented and solved for various phase points of the Lennard-Jones fluid. A method of finding a self-consistent solution for the bridge function and the radial distribution function is demonstrated. The significance of this result over bridge diagrams expressed as f-bonds, in terms of its applicability to charged and dipolar models is discussed. Two very simple phenomenological bridge diagram forms for the bridge function for this model are examined and found to give results almost as accurate and in some cases more accurate than previous forms in the literature. This work represents the first use of directly calculated 5-point bridge diagrams in terms of h-bonds, and the many extra orders of f-bond diagrams which they include, in an integral equation result.

  3. Computationally Useful Bridge Diagram Series. II. Diagrams in H-Bonds

    SciTech Connect

    Perkyns, John S.; Dyer, Kippi M.; Pettitt, Bernard M.

    2002-06-01

    Equations for calculating accurate 4-point and 5-point bridge diagrams in terms of h-bonds have been presented and solved for various phase points of the Lennard-Jones fluid. A method of finding a self-consistent solution for the bridge function and the radial distribution function is demonstrated. The significance of this result over bridge diagrams expressed as f-bonds, in terms of its applicability to charged and dipolar models is discussed. Two very simple phenomenological bridge diagram forms for the bridge function for this model are examined and found to give results almost as accurate and in some cases more accurate than previous forms in the literature. This work represents the first use of directly calculated 5-point bridge diagrams in terms of h-bonds, and the many extra orders of f-bond diagrams which they include, in an integral equation result.

  4. A new method for diagramming pacemaker electrocardiograms.

    PubMed

    Hesselson, A B; Parsonnet, V

    1994-08-01

    Advancements in technology have made paced ECG interpretation increasingly difficult. A new method for depicting the complex pacemaker/heart interactions that eliminates the extensive use of symbols and repetitious use of refractory period and rate limit information of previous methods has been devised. The method uses a framework of parallel horizontal lines drawn on grid paper underneath the ECG. The lines are spaced apart by the actual programmed values (lower rate, AV, VA intervals) of the pacemaker in question. This framework allows the simultaneous use of the horizontal and vertical directions for the diagram of pacemaker timing intervals. Also, a single representation of refractory periods, upper rate intervals, and other variables can be labeled vertically and extrapolated horizontally across the entire diagram. Single chamber, dual chamber, and rate-modulated ECGs are readily represented. The diagram is easily plotted on standard ECG paper and flexible enough to represent complex ECGs. PMID:7526348

  5. The renormalon diagram in gauge theories on

    NASA Astrophysics Data System (ADS)

    Anber, Mohamed M.; Sulejmanpasic, Tin

    2015-01-01

    We analyze the renormalon diagram of gauge theories on . In particular, we perform exact one loop calculations for the vacuum polarization in QCD with adjoint matter and observe that all infrared logarithms, as functions of the external momentum, cancel between the vacuum part and finite volume part, which eliminates the IR renormalon problem. We argue that the singularities in the Borel plane, arising from the topological neutral bions, are not associated with the renormalon diagram, but with the proliferation of the Feynman diagrams. As a byproduct, we obtain, for the first time, an exact one-loop result of the vacuum polarization which can be adapted to the case of thermal compactification of QCD.

  6. The Butterfly diagram leopard skin pattern

    NASA Astrophysics Data System (ADS)

    Ternullo, Maurizio

    2011-08-01

    A time-latitude diagram in which spot groups are given relevance proportional to their area is presented. The diagram reveals that the spotted-area distribution is highly inhomogeneous, most of it being concentrated in a few small portions ("knots") of the butterfly diagram; because of this structure, the butterfly diagram may properly be described as a cluster of knots. The conventional description, which assumes that spots scatter around a "spot mean latitude" steadily drifting equatorward, is challenged. Indeed, spots cluster at as many latitudes as there are knots; a knot may appear at either lower or higher latitudes than previous ones, in a seemingly random way; accordingly, the spot mean latitude abruptly drifts equatorward or even poleward at each knot activation, in spite of any smoothing procedure. Preliminary analyses suggest that the activity splits, in each hemisphere, into two or more distinct "activity waves", drifting equatorward at a rate higher than that of the spot zone as a whole.

  7. Phase diagram of a truncated tetrahedral model.

    PubMed

    Krcmar, Roman; Gendiar, Andrej; Nishino, Tomotoshi

    2016-08-01

    Phase diagram of a discrete counterpart of the classical Heisenberg model, the truncated tetrahedral model, is analyzed on the square lattice, when the interaction is ferromagnetic. Each spin is represented by a unit vector that can point to one of the 12 vertices of the truncated tetrahedron, which is a continuous interpolation between the tetrahedron and the octahedron. Phase diagram of the model is determined by means of the statistical analog of the entanglement entropy, which is numerically calculated by the corner transfer matrix renormalization group method. The obtained phase diagram consists of four different phases, which are separated by five transition lines. In the parameter region, where the octahedral anisotropy is dominant, a weak first-order phase transition is observed. PMID:27627273

  8. A pseudo-haptic knot diagram interface

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Weng, Jianguang; Hanson, Andrew J.

    2011-01-01

    To make progress in understanding knot theory, we will need to interact with the projected representations of mathematical knots which are of course continuous in 3D but significantly interrupted in the projective images. One way to achieve such a goal would be to design an interactive system that allows us to sketch 2D knot diagrams by taking advantage of a collision-sensing controller and explore their underlying smooth structures through a continuous motion. Recent advances of interaction techniques have been made that allow progress to be made in this direction. Pseudo-haptics that simulates haptic effects using pure visual feedback can be used to develop such an interactive system. This paper outlines one such pseudo-haptic knot diagram interface. Our interface derives from the familiar pencil-and-paper process of drawing 2D knot diagrams and provides haptic-like sensations to facilitate the creation and exploration of knot diagrams. A centerpiece of the interaction model simulates a "physically" reactive mouse cursor, which is exploited to resolve the apparent conflict between the continuous structure of the actual smooth knot and the visual discontinuities in the knot diagram representation. Another value in exploiting pseudo-haptics is that an acceleration (or deceleration) of the mouse cursor (or surface locator) can be used to indicate the slope of the curve (or surface) of whom the projective image is being explored. By exploiting these additional visual cues, we proceed to a full-featured extension to a pseudo-haptic 4D visualization system that simulates the continuous navigation on 4D objects and allows us to sense the bumps and holes in the fourth dimension. Preliminary tests of the software show that main features of the interface overcome some expected perceptual limitations in our interaction with 2D knot diagrams of 3D knots and 3D projective images of 4D mathematical objects.

  9. The role of post-failure brittleness of soft rocks in the assessment of stability of intact masses: FDEM technique applications to ideal problems

    NASA Astrophysics Data System (ADS)

    Lollino, Piernicola; Andriani, Gioacchino Francesco; Fazio, Nunzio Luciano; Perrotti, Michele

    2016-04-01

    Strain-softening under low confinement stress, i.e. the drop of strength that occurs in the post-failure stage, is a key feature of the stress-strain behavior of rocks. However, this feature is generally underestimated or even neglected in the assessment of boundary value problems of intact soft rock masses. This is typically the case when the stability of intact rock masses is treated by means of limit equilibrium or finite element analyses, for which rigid-plastic or elastic perfectly-plastic constitutive models, generally implementing the peak strength of the rock, are respectively used. These numerical techniques have intrinsic limitations that do not allow material brittleness to be accounted for, due either to the method assumptions or to numerical stability problems, as in the case of the finite element method, unless sophisticated regularization techniques are implemented. However, for problems that concern the stability of intact soft rock masses at low stress levels, such as the stability of shallow underground caves or of rock slopes, the brittle stress-strain response of the rock in the post-failure stage cannot be disregarded, owing to the risk of overestimating the stability factor. This work aims to highlight the role of post-peak brittleness of soft rocks in the analysis of specific ideal problems by means of a hybrid finite-discrete element technique (FDEM) that allows the brittle stress-strain behavior of the rock to be simulated properly. In particular, the stability of two ideal cases, a shallow underground rectangular cave and a vertical cliff, has been analyzed by implementing a post-peak brittle behavior of the rock, and the comparison with a non-brittle response of the rock mass is also explored. To this purpose, the mechanical behavior of a soft calcarenite belonging to the Calcarenite di Gravina formation, extensively

  10. Minkowski diagram in relativity and holography.

    PubMed

    Abramson, N

    1988-05-01

    Now that ultrashort laser pulses can be used in holography, the temporal and spatial resolution approach the same order of magnitude. In that case the limited speed of light sometimes causes large measuring errors if correction methods are not introduced. Therefore, we want to revive the Minkowski diagram, which was invented in 1908 to visualize relativistic relations between time and space. We show how this diagram in a modified form can be used to derive both the static holodiagram, used for conventional holography, including ultrahigh-speed recordings of wavefronts, and a dynamic holodiagram used for studying the apparent distortions of objects recorded at relativistic speeds. PMID:20531662

  11. B-Fe-U Phase Diagram

    NASA Astrophysics Data System (ADS)

    Dias, Marta; Carvalho, Patrícia Almeida; Mardolcar, Umesh Vinaica; Tougait, Olivier; Noël, Henri; Gonçalves, António Pereira

    2014-04-01

    The liquidus projection of the U-rich corner of the B-Fe-U phase diagram is proposed based on X-ray powder diffraction measurements, differential thermal analysis, and scanning electron microscopy observations complemented with energy- and wavelength-dispersive X-ray spectroscopies. Two ternary reactions in this U-rich region were observed and their approximate temperatures were established. In addition, an overview of the complete phase diagram is given, including the liquidus projection; isothermal sections at 1053 K, 1223 K, and 1373 K (780 °C, 950 °C, and 1100 °C); and a U:(Fe,B) = 1:5 isopleth.

  12. The phase diagram of solid hydrogen at high pressure: A challenge for first principles calculations

    NASA Astrophysics Data System (ADS)

    Azadi, Sam; Foulkes, Matthew

    2015-03-01

    We present comprehensive results for the high-pressure phase diagram of solid hydrogen. We focus on the energetically most favorable molecular and atomic crystal structures. To obtain the ground-state static enthalpy and phase diagram, we use semi-local and hybrid density functional theory (DFT) as well as diffusion quantum Monte Carlo (DMC) methods. The closure of the band gap with increasing pressure is investigated utilizing quasi-particle many-body calculations within the GW approximation. The dynamical phase diagram is calculated by adding proton zero-point energies (ZPE) to static enthalpies. Density functional perturbation theory is employed to calculate the proton ZPE and the infra-red and Raman spectra. Our results clearly demonstrate the failure of DFT-based methods to provide an accurate static phase diagram, especially when comparing insulating and metallic phases. Our dynamical phase diagram obtained using fully many-body DMC calculations shows that the molecular-to-atomic phase transition happens at the experimentally accessible pressure of 374 GPa. We claim that going beyond mean-field schemes to obtain derivatives of the total energy and optimize crystal structures at the many-body level is crucial. This work was supported by the UK Engineering and Physical Sciences Research Council under Grant EP/I030190/1, and made use of computing facilities provided by HECTOR, and by the Imperial College London high performance computing centre.

  13. Prospective Assessment of Patterns of Failure After High-Precision Definitive (Chemo)Radiation in Head-and-Neck Squamous Cell Carcinoma

    SciTech Connect

    Gupta, Tejpal

    2011-06-01

    Purpose: To prospectively analyze patterns of failure in patients with head-and-neck squamous cell carcinoma treated with definitive high-precision radiotherapy with a focus on location of failure relative to target volume coverage. Methods and Materials: Sixty patients treated with three-dimensional conformal radiotherapy or intensity-modulated radiation therapy were included. Locoregional failure volume was defined on the planning data set at relapse, and dose received was analyzed by use of dose-volume histograms. Results: Thirteen patients were deemed to have had locoregional failures, of which two did not have any viable tumor on salvage neck dissection, leaving eleven patients with proven persistent or recurrent locoregional disease. Of these, 9 patients had in-field failure, 1 marginal failure, and 1 both in-field and marginal failures. Overall, only 2 of 11 patients (18%) with relapse had any marginal failure. Of the 20 sites of locoregional failure, 15 (75%) were in-field and 5 (25%) marginal. Distant metastases were detected in 3 patients, whereas a second new primary developed in 3 others. With a median follow-up of 26 months (interquartile range, 18-31 months) for surviving patients, the 3-year local control, locoregional control, disease-free survival, and overall survival rates were 75.3%, 74%, 67.2%, and 60.5%, respectively. Conclusions: Locoregional relapse remains the predominant pattern of failure in head-and-neck squamous cell carcinoma treated with high-precision definitive radiotherapy with the majority of failures occurring 'in-field' within the high-dose volume. Marginal failures can occur, particularly in the vicinity of the spared parotid gland. The therapeutic index of high-precision conformal radiotherapy is largely dependent on adequate selection and delineation of target volumes and organs at risk.

  14. Heart Failure in South America

    PubMed Central

    Bocchi, Edimar Alcides

    2013-01-01

    Continued assessment of temporal trends in the mortality and epidemiology of specific heart failure in South America is needed to provide a scientific basis for rational allocation of the limited health care resources, and for strategies to reduce risk and predict the future burden of heart failure. The epidemiology of heart failure in South America was reviewed. Heart failure is the main cause of hospitalization based on available data from approximately 50% of the South American population. The main etiologies of heart failure are ischemic, idiopathic dilated cardiomyopathy, valvular, hypertensive and chagasic. In endemic areas, Chagas heart disease may be responsible for 41% of HF cases. Heart failure also carries high mortality, especially in patients with Chagas etiology. Heart failure and etiologies associated with heart failure may be responsible for 6.3% of causes of death. Rheumatic fever is the leading cause of valvular heart disease. However, a tendency toward reduced HF mortality due to Chagas heart disease from 1985 to 2006, and reduced mortality due to HF from 1999 to 2005, were observed in selected states in Brazil. The findings have important public health implications because the allocation of health care resources, and strategies to reduce the risk of heart failure, should also consider the control of neglected Chagas disease and rheumatic fever in South American countries. PMID:23597301

  15. Usefulness of Geriatric Nutritional Risk Index for Assessing Nutritional Status and Its Prognostic Impact in Patients Aged ≥65 Years With Acute Heart Failure.

    PubMed

    Honda, Yasuyuki; Nagai, Toshiyuki; Iwakami, Naotsugu; Sugano, Yasuo; Honda, Satoshi; Okada, Atsushi; Asaumi, Yasuhide; Aiba, Takeshi; Noguchi, Teruo; Kusano, Kengo; Ogawa, Hisao; Yasuda, Satoshi; Anzai, Toshihisa

    2016-08-15

    Malnutrition is becoming one of the most important determinants of worse clinical outcomes in patients with acute heart failure (AHF). However, appropriate tools for evaluating the nutritional status in patients aged ≥65 years with AHF remain unclear. We examined 490 consecutive patients aged ≥65 years with AHF. They were divided into 2 groups according to Geriatric Nutritional Risk Index (GNRI; cut-off value = 92). During a median period of 189 days, the mortality rate was significantly higher in the lower GNRI group than the higher GNRI group (p <0.001). In multivariate analyses, lower GNRI was an independent determinant of adverse events (HR 0.92, 95% CI 0.88 to 0.95, p <0.001). The GNRI showed the best prognostic value (C-statistic: 0.70) among other nutritional indexes. Adding GNRI to an existing outcome prediction model for mortality in AHF significantly increased the C-statistic from 0.68 to 0.74 (p = 0.017). The net reclassification improvement afforded by GNRI was 60% overall, 27% for events, and 33% for nonevents (p <0.001). In conclusion, lower GNRI on admission was independently associated with worse clinical outcomes in patients aged ≥65 years with AHF, and it was superior to other nutritional parameters. Furthermore, the assessment of nutritional status using GNRI is very helpful for risk stratification. PMID:27324158
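
    For context, a minimal sketch of a GNRI-based grouping is shown below; the formula is the widely used Bouillanne et al. definition, which is an assumption here, since the abstract only states the cut-off value of 92.

        def gnri(albumin_g_per_l, weight_kg, ideal_weight_kg):
            # Assumed standard formula: GNRI = 1.489 * albumin [g/L] + 41.7 * (weight / ideal weight);
            # the weight ratio is commonly capped at 1 when weight exceeds the ideal weight.
            ratio = min(weight_kg / ideal_weight_kg, 1.0)
            return 1.489 * albumin_g_per_l + 41.7 * ratio

        def lower_gnri_group(score, cutoff=92.0):
            # Dichotomization used in the study (cut-off value = 92).
            return score < cutoff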

  16. The sequential organ failure assessment score as a useful predictor for estimating the prognosis of systemic inflammatory response syndrome patients being treated with extracorporeal blood purification.

    PubMed

    Kikuchi, Hiroshi; Maruyama, Hiroki; Omori, Saori; Kazama, Junichiro J; Gejyo, Fumitake

    2003-08-01

    Systemic inflammatory response syndrome (SIRS) is a major cause of morbidity and mortality in critically ill patients. Extracorporeal blood purification procedures are becoming important for treating these patients. However, the cost of these procedures is high. Therefore, a prognostic marker would be helpful. To establish the reliability of the Sequential Organ Failure Assessment (SOFA) score as a prognostic indicator, we evaluated daily changes in the SOFA score of 40 SIRS patients who needed blood purification procedures such as continuous renal replacement therapy (CRRT), endotoxin adsorption, bilirubin adsorption, and/or plasma exchange. Twenty patients survived and 20 died. Although the baseline scores of the two groups (survivors and non-survivors) did not differ, both the maximum value of the SOFA score and the DeltaSOFA score (the difference between the maximum SOFA and baseline SOFA scores) were significantly higher in the non-survivor group. The mortality rate among patients with a maximum SOFA score greater than or equal to 18 or a DeltaSOFA score greater than or equal to 3 was higher than for the rest of the patients. The changes in the SOFA score correlated well with the outcomes of the SIRS patients. The maximum SOFA score and the DeltaSOFA score are therefore likely to be useful prognostic markers. PMID:12887731
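
    A minimal sketch of the quantities used above, the maximum SOFA and DeltaSOFA scores, together with the high-risk thresholds reported in the abstract; the function names are illustrative.

        def sofa_summary(daily_sofa):
            # daily_sofa: list of SOFA scores, first entry = baseline score.
            baseline = daily_sofa[0]
            max_sofa = max(daily_sofa)
            delta_sofa = max_sofa - baseline
            return max_sofa, delta_sofa

        def high_risk(daily_sofa):
            # Thresholds reported above: max SOFA >= 18 or DeltaSOFA >= 3.
            max_sofa, delta_sofa = sofa_summary(daily_sofa)
            return max_sofa >= 18 or delta_sofa >= 3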

  17. Initial Sequential Organ Failure Assessment score versus Simplified Acute Physiology score to analyze multiple organ dysfunction in infectious diseases in Intensive Care Unit

    PubMed Central

    Nair, Remyasri; Bhandary, Nithish M.; D’Souza, Ashton D.

    2016-01-01

    Aims: To investigate the initial Sequential Organ Failure Assessment (SOFA) score of patients in the Intensive Care Unit (ICU) who were diagnosed with infectious disease, as an indicator of multiple organ dysfunction, and to examine whether the initial SOFA score is a better mortality predictor than the Simplified Acute Physiology Score (SAPS). Materials and Methods: A hospital-based study was done in a medical ICU from June to September 2014 with a sample size of 48. Patients aged 18 years and above who were diagnosed with infectious disease were included. Patients with a history of chronic illness (renal/hepatic/pulmonary/cardiovascular), diabetes, hypertension, chronic obstructive pulmonary disease, heart disease, those on immunosuppressive therapy/chemoradiotherapy for malignancy and patients in an immunocompromised state were excluded. Blood investigations were obtained. Six organ dysfunctions were assessed using the initial SOFA score and graded from 0 to 4. SAPS was calculated as the sum of points assigned to each of the 17 variables (12 physiological, age, type of admission, and three underlying diseases). The outcome measure was survival status at ICU discharge. Results: We categorized infectious diseases into dengue fever, leptospirosis, malaria, respiratory tract infections, and others, which included undiagnosed febrile illness, meningitis, urinary tract infection and gastroenteritis. The initial SOFA score was both sensitive and specific; SAPS lacked sensitivity. We found no significant association between age and survival status. Both SAPS and the initial SOFA score were found to be statistically significant as mortality predictors. There is a significant association of the initial SOFA score with organ dysfunction in infectious diseases (P < 0.001). SAPS showed no statistical significance. There was a statistically significant (P = 0.015) percentage of nonsurvivors with moderate and severe dysfunction, based on the SOFA score. Nonsurvivors had higher SAPS, but this was not statistically significant (P

  18. NFHS Court and Field Diagram Guide.

    ERIC Educational Resources Information Center

    Gillis, John, Ed.

    This guide contains a comprehensive collection of diagrams and specifications of playing fields and courts used in interscholastic and recreational sports, along with information on how to set up various formats of tournament drawings, how to compute golf handicaps, and how to convert metric-to-English distances. Lists are provided of national…

  19. Journeys on the H-R diagram

    SciTech Connect

    Kaler, J.B.

    1988-05-01

    The evolution of various types of stars along the H-R diagram is discussed. Star birth and youth is addressed, and the events that occur due to core contraction, shell burning, and double-shell burning are described. The evolutionary courses of planetary nebulae, white dwarfs, and supernovas are examined.

  20. On phase diagrams of magnetic reconnection

    SciTech Connect

    Cassak, P. A.; Drake, J. F.

    2013-06-15

    Recently, “phase diagrams” of magnetic reconnection were developed to graphically organize the present knowledge of what type, or phase, of reconnection is dominant in systems with given characteristic plasma parameters. Here, a number of considerations that require caution in using the diagrams are pointed out. First, two known properties of reconnection are omitted from the diagrams: the history dependence of reconnection and the absence of reconnection for small Lundquist number. Second, the phase diagrams mask a number of features. For one, the predicted transition to Hall reconnection should be thought of as an upper bound on the Lundquist number, and it may happen for considerably smaller values. For another, reconnection is never “slow”; it is always “fast” in the sense that the normalized reconnection rate is always at least 0.01. This has important implications for reconnection onset models. Finally, the definition of the relevant Lundquist number is nuanced and may differ greatly from the value based on characteristic scales. These considerations are important for applications of the phase diagrams. This is demonstrated by example for solar flares, where it is argued that it is unlikely that collisional reconnection can occur in the corona.

  1. Fog Machines, Vapors, and Phase Diagrams

    ERIC Educational Resources Information Center

    Vitz, Ed

    2008-01-01

    A series of demonstrations is described that elucidate the operation of commercial fog machines by using common laboratory equipment and supplies. The formation of fogs, or "mixing clouds", is discussed in terms of the phase diagram for water and other chemical principles. The demonstrations can be adapted for presentation suitable for elementary…

  2. Spin wave Feynman diagram vertex computation package

    NASA Astrophysics Data System (ADS)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and to have another means of checking the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model, where non-collinear terms contribute to the vertex interactions.

  3. Dynamic Tactile Diagram Simplification on Refreshable Displays

    ERIC Educational Resources Information Center

    Rastogi, Ravi; Pawluk, Dianne T. V.

    2013-01-01

    The increasing use of visual diagrams in educational and work environments, and even our daily lives, has created obstacles for individuals who are blind or visually impaired to "independently" access the information they represent. Although physical tactile pictures can be created to convey the visual information, it is typically a slow,…

  4. Computer-Generated Diagrams for the Classroom.

    ERIC Educational Resources Information Center

    Carle, Mark A.; Greenslade, Thomas B., Jr.

    1986-01-01

    Describes 10 computer programs used to draw diagrams usually drawn on chalkboards, such as addition of three vectors, vector components, range of a projectile, lissajous figures, beats, isotherms, Snell's law, waves passing through a lens, magnetic field due to Helmholtz coils, and three curves. Several programming tips are included. (JN)
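
    As a small illustration of one of the diagrams listed above, a Lissajous figure can be generated in a few lines; the frequency ratio and phase are arbitrary example values.

        import numpy as np
        import matplotlib.pyplot as plt

        t = np.linspace(0.0, 2.0 * np.pi, 2000)
        a, b, delta = 3, 2, np.pi / 2        # frequency ratio 3:2, 90-degree phase shift
        x = np.sin(a * t + delta)
        y = np.sin(b * t)

        plt.plot(x, y)
        plt.gca().set_aspect("equal")
        plt.title("Lissajous figure, frequency ratio 3:2")
        plt.show()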

  5. The Binary Temperature-Composition Phase Diagram

    ERIC Educational Resources Information Center

    Sanders, Philip C.; Reeves, James H.; Messina, Michael

    2006-01-01

    The equations for the liquid and gas lines in the binary temperature-composition phase diagram are derived by approximating that the ΔH_vap values of the two liquids are equal. It is shown that within this approximation, the resulting equations are not too difficult to present in an undergraduate physical chemistry lecture.
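
    A sketch of the construction implied above: with a single shared ΔH_vap, the Clausius-Clapeyron equation gives each pure-component vapor pressure, and Raoult's law then yields the liquid (bubble) and gas (dew) lines. The total pressure, boiling points and ΔH_vap value below are illustrative, not taken from the article.

        import numpy as np

        R = 8.314                  # J mol^-1 K^-1
        P = 101325.0               # total pressure, Pa
        dH_vap = 35e3              # shared enthalpy of vaporization, J mol^-1 (the approximation)
        Tb1, Tb2 = 353.0, 384.0    # illustrative boiling points of the two pure liquids, K

        def p_sat(T, Tb):
            # Clausius-Clapeyron: P* = P exp[-(dH_vap / R) (1/T - 1/Tb)]
            return P * np.exp(-dH_vap / R * (1.0 / T - 1.0 / Tb))

        T = np.linspace(Tb1, Tb2, 50)
        p1, p2 = p_sat(T, Tb1), p_sat(T, Tb2)
        x1 = (P - p2) / (p1 - p2)  # liquid-line composition of component 1 (Raoult's law)
        y1 = x1 * p1 / P           # gas-line (dew) composition of component 1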

  6. Image Attributes: A Study of Scientific Diagrams.

    ERIC Educational Resources Information Center

    Brunskill, Jeff; Jorgensen, Corinne

    2002-01-01

    Discusses advancements in imaging technology and increased user access to digital images, as well as efforts to develop adequate indexing and retrieval methods for image databases. Describes preliminary results of a study of undergraduates that explored the attributes naive subjects use to describe scientific diagrams. (Author/LRW)

  7. Constructing Causal Diagrams to Learn Deliberation

    ERIC Educational Resources Information Center

    Easterday, Matthew W.; Aleven, Vincent; Scheines, Richard; Carver, Sharon M.

    2009-01-01

    Policy problems like "What should we do about global warming?" are ill-defined in large part because we do not agree on a system to represent them the way we agree Algebra problems should be represented by equations. As a first step toward building a policy deliberation tutor, we investigated: (a) whether causal diagrams help students learn to…

  8. Fine structure of the butterfly diagram revisited

    NASA Astrophysics Data System (ADS)

    Major, Balázs

    The latitudinal time distribution of sunspots (butterfly diagram) was studied by Becker (1959) and Antalová & Gnevyshev (1985). Our goal is to revisit these studies. In the first case we check whether there is a poleward migration in sunspot activity. In the second case we confirm the results, and make more quantitative statements concerning their significance and the position of the activity peaks.

  9. Phase diagram of spiking neural networks

    PubMed Central

    Seyed-allaei, Hamed

    2015-01-01

    In computer simulations of spiking neural networks, it is often assumed that every two neurons of the network are connected with a probability of 2%, and that 20% of neurons are inhibitory and 80% are excitatory. These common values are based on experiments, observations, and trial and error, but here I take a different perspective, inspired by evolution: I systematically simulate many networks, each with a different set of parameters, and then try to figure out what makes the common values desirable. I stimulate networks with pulses and then measure their dynamic range, the dominant frequency of population activities, the total duration of activities, the maximum population rate and the occurrence time of the maximum rate. The results are organized in a phase diagram. This phase diagram gives an insight into the space of parameters – excitatory-to-inhibitory ratio, sparseness of connections and synaptic weights. This phase diagram can be used to decide the parameters of a model. The phase diagrams show that networks which are configured according to the common values have a good dynamic range in response to an impulse, that their dynamic range is robust with respect to synaptic weights, and that for some synaptic weights they oscillate in α or β frequencies, independent of external stimuli. PMID:25788885
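
    A minimal sketch of how one point of such a parameter space might be set up (connection probability, excitatory/inhibitory split, synaptic weights); the network size and weight values are placeholders, not the paper's.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 1000                      # number of neurons (illustrative)
        p_conn = 0.02                 # 2% connection probability (the common value above)
        frac_inh = 0.20               # 20% inhibitory, 80% excitatory
        w_exc, w_inh = 0.5, -2.0      # synaptic weights (assumed placeholder values)

        inhibitory = rng.random(N) < frac_inh
        weights = np.where(inhibitory, w_inh, w_exc)     # sign set by the presynaptic neuron
        adjacency = rng.random((N, N)) < p_conn          # row = presynaptic, column = postsynaptic
        np.fill_diagonal(adjacency, False)               # no self-connections
        W = adjacency * weights[:, None]                 # weighted connectivity matrix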

  10. Complexities of One-Component Phase Diagrams

    ERIC Educational Resources Information Center

    Ciccioli, Andrea; Glasser, Leslie

    2011-01-01

    For most materials, the solid at and near the triple-point temperature is denser than the liquid with which it is in equilibrium. However, for water and certain other materials, the densities of the phases are reversed, with the solid being less dense. The profound consequences for the appearance of the "pVT" diagram of one-component materials…

  11. Impersonal parameters from Hertzsprung-Russell diagrams

    NASA Astrophysics Data System (ADS)

    Wilson, R. E.; Hurley, Jarrod R.

    2003-10-01

    An objective process for estimation of star cluster parameters from Hertzsprung-Russell (HR) diagrams is introduced, with direct inclusion of multiple stars, a least-squares fitting criterion, and standard error estimates. No role is played by conventional isochrones. Instead the quantity compared between observation and theory is the areal density of points as it varies over the diagram. With the areal density as the effective observable quantity, standard parameter adjustment theory can be brought to bear on HR diagram analysis. Here we use the method of differential corrections with a least-squares fitting criterion, but any of the many known fitting methods should be applicable to comparison of observed and theoretical distributions. Diverse numerical schemes were developed to make the overall algorithm workable, including two that improve the differentiability of the areal density by rendering point distributions effectively equivalent to continuous distributions in certain respects. Statistics of distributions are handled not via Monte Carlo methods but by the Functional Statistics Algorithm (hereafter FSA), a statistical algorithm that has been developed for HR diagram fitting but should serve as an alternative to Monte Carlo in many other applications. FSA accomplishes the aims of Monte Carlo with orders of magnitude less computation. Analysis of luminosity functions is included within the HR diagram algorithm as a special case. Areal density analysis of HR diagrams is acceptably fast because we handle stellar evolution via approximation functions, whose output also is more precisely differentiable than that of a full stellar evolution program. Evolution by approximation functions is roughly a million times as fast as full evolution and has virtually no numerical noise. The algorithmic ideas that lead to objective solutions can be applied to many kinds of HR diagram analysis that are now done subjectively. The present solution program is limited by speed considerations to use of one evolution
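
    A minimal sketch of the observable described above, the areal density of points binned over the diagram, with a Gaussian smoothing step standing in for the schemes that make the point distribution effectively continuous; bin counts and smoothing width are arbitrary choices.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def areal_density(color, magnitude, color_bins=100, mag_bins=100, sigma=1.0):
            # 2D histogram over the colour-magnitude plane (a Hess-type density map),
            # smoothed so that it behaves more like a continuous, differentiable field.
            H, c_edges, m_edges = np.histogram2d(color, magnitude,
                                                 bins=[color_bins, mag_bins],
                                                 density=True)
            return gaussian_filter(H, sigma=sigma), c_edges, m_edges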

  12. Evaluating Risk Of Failure With Limited Information

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Creager, M.; Newlin, L. E.; Sutharshana, S.

    1993-01-01

    This report describes probabilistic failure assessment (PFA), developed for application to spaceflight systems for which sufficient testing of hardware to ensure reliability is not feasible. For such systems it must nevertheless be ascertained that critical failure modes are extremely unlikely to occur during service. PFA can be applied to any failure mode described by quantitative models of the physics and mechanics of failure phenomena, such as fatigue crack initiation or propagation in structures, leakage of seals, wear in bearings, and erosion of arcjet thrustor cathodes.

  13. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well known and widely used experimental reliability "passport" of a mass manufactured electronic or photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increasing failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution was used for the statistical failure rate, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. Future work should include investigations of how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.

  14. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure; or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior, by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  15. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1994-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure; or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior, by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  16. Layout pattern analysis using the Voronoi diagram of line segments

    NASA Astrophysics Data System (ADS)

    Dey, Sandeep Kumar; Cheilaris, Panagiotis; Gabrani, Maria; Papadopoulou, Evanthia

    2016-01-01

    Early identification of problematic patterns in very large scale integration (VLSI) designs is of great value as the lithographic simulation tools face significant timing challenges. To reduce the processing time, such a tool selects only a fraction of possible patterns which have a probable area of failure, with the risk of missing some problematic patterns. We introduce a fast method to automatically extract patterns based on their structure and context, using the Voronoi diagram of line-segments as derived from the edges of VLSI design shapes. Designers put line segments around the problematic locations in patterns called "gauges," along which the critical distance is measured. The gauge center is the midpoint of a gauge. We first use the Voronoi diagram of VLSI shapes to identify possible problematic locations, represented as gauge centers. Then we use the derived locations to extract windows containing the problematic patterns from the design layout. The problematic locations are prioritized by the shape and proximity information of the design polygons. We perform experiments for pattern selection in a portion of a 22-nm random logic design layout. The design layout had 38,584 design polygons (consisting of 199,946 line segments) on layer Mx, and 7079 markers generated by an optical rule checker (ORC) tool. The optical rules specify requirements for printing circuits with minimum dimension. Markers are the locations of some optical rule violations in the layout. We verify our approach by comparing the coverage of our extracted patterns to the ORC-generated markers. We further derive a similarity measure between patterns and between layouts. The similarity measure helps to identify a set of representative gauges that reduces the number of patterns for analysis.
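
    A small sketch of two of the geometric steps mentioned above, taking gauge centers as segment midpoints and cutting a clip window around each; the array layout and window size are assumptions, and the Voronoi construction itself is not reproduced here.

        import numpy as np

        def gauge_centers(gauges):
            # gauges: (N, 4) array of segment endpoints (x1, y1, x2, y2);
            # the gauge center is the midpoint of the segment.
            g = np.asarray(gauges, dtype=float)
            return 0.5 * (g[:, 0:2] + g[:, 2:4])

        def clip_window(center, half_size):
            # Axis-aligned window around a gauge center from which the
            # surrounding layout pattern is extracted.
            cx, cy = center
            return (cx - half_size, cy - half_size, cx + half_size, cy + half_size)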

  17. Assessment of Under Nutrition Using Composite Index of Anthropometric Failure (CIAF) amongst Toddlers Residing in Urban Slums of Raipur City, Chhattisgarh, India

    PubMed Central

    Soni, G.P.; Jain, Kamlesh; Agrawal, Shubhra

    2015-01-01

    Introduction: Several indicators have been used in the past for the measurement of undernutrition. They overlap, and none of them individually provides a comprehensive count of the undernourished in the community. The effort here is to discuss the use of an alternative indicator of malnutrition, the composite index of anthropometric failure (CIAF). Aim: To study the prevalence of undernutrition in toddlers using CIAF and to compare the prevalence of undernutrition obtained by the primitive indicators and by CIAF. Materials and Methods: A cross-sectional community-based study was carried out in urban slums of Raipur (C.G.) from Jan 01, 2014 to Sept 30, 2014 with a sample size of 602. Slums were selected by multistage random sampling and the subjects were selected by convenience sampling, i.e. starting from a random point, a house-to-house survey was carried out until the desired number of subjects (according to PPS) was covered, assuming that the slum population is evenly distributed. Attendants of toddlers were interviewed with a semi-structured proforma, and height and weight were measured with a measuring tape and a Salter's weighing machine, respectively. Informed consent was obtained. MS Excel was used for data analysis after compilation. Results: Girls and boys each made up 50% of the sample. By CIAF the prevalence of undernutrition was found to be 62.1%, while underweight, stunting and wasting showed it to be 45.2%, 46.6% and 17.8%, respectively. Conclusion: The primitive indices underestimate the burden of undernutrition, and CIAF should be used as a screening tool for assessing undernutrition. PMID:26393147
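
    For illustration, the composite index can be computed as the fraction of children showing any anthropometric failure; this any-failure definition is the usual CIAF convention and is assumed here, since the abstract does not restate it.

        import numpy as np

        def ciaf(stunted, wasted, underweight):
            # Boolean arrays, one entry per child; a child counts toward CIAF
            # if it shows at least one form of anthropometric failure.
            any_failure = np.asarray(stunted) | np.asarray(wasted) | np.asarray(underweight)
            return any_failure.mean()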

  18. Phase diagram of a single lane roundabout

    NASA Astrophysics Data System (ADS)

    Echab, H.; Lakouari, N.; Ez-Zahraouy, H.; Benyoussef, A.

    2016-03-01

    Using a cellular automata model, we numerically study the traffic dynamics in a single-lane roundabout system with four entry/exit points. The boundaries are controlled by the injecting rates α1, α2 and the extracting rate β. Systems with and without splitter islands of width Lsp are considered. The phase diagram in the (α1 , β) space and its variation with the roundabout size, Pagg (i.e. the probability of aggressive entry), and Pexit (i.e. the probability of preferential exit) are constructed. The results show that the phase diagram in both cases consists of three phases: free flow, congested and jammed. However, as Lsp increases the free-flow phase enlarges while the congested and jammed ones shrink. On the other hand, the short roundabout shows better performance in the free-flow phase, while the large one is more optimal in the congested phase. The density profiles are also investigated.

  19. Prediction of boron carbon nitrogen phase diagram

    NASA Astrophysics Data System (ADS)

    Yao, Sanxi; Zhang, Hantao; Widom, Michael

    We studied the phase diagram of boron, carbon and nitrogen, including the boron-carbon and boron-nitrogen binaries and the boron-carbon-nitrogen ternary. Based on the idea of electron counting and using a technique of mixing similar primitive cells, we constructed many ''electron precise'' structures. First-principles calculations are performed on these structures at either zero or high pressure. For the BN binary, our calculation confirms that a rhombohedral phase can be stabilized at high pressure, consistent with some experimental results. For the BCN ternary, a new ground-state structure is discovered and an Ising-like phase transition is suggested. Moreover, we model the BCN ternary phase diagram and show continuous solubility from boron carbide to the boron subnitride phase.

  20. Penguin diagrams for improved staggered fermions

    SciTech Connect

    Lee, Weonjong

    2005-01-01

    We calculate, at the one-loop level, penguin diagrams for improved staggered fermion operators constructed using various fat links. The main result is that diagonal mixing coefficients with penguin operators are identical between the unimproved operators and the improved operators using such fat links as Fat7, Fat7+Lepage, Fat7, HYP (I) and HYP (II). In addition, it turns out that the off-diagonal mixing vanishes for those constructed using fat links of Fat7, Fat7 and HYP (II). This is a consequence of the fact that the improvement by various fat links changes only the mixing with higher dimension operators and off-diagonal operators. The results of this paper, combined with those for current-current diagrams, provide complete matching at the one-loop level with all corrections of O(g{sup 2}) included.

  1. Direct Measurement of the Fluid Phase Diagram.

    PubMed

    Bao, Bo; Riordon, Jason; Xu, Yi; Li, Huawei; Sinton, David

    2016-07-19

    The thermodynamic phase of a fluid (liquid, vapor or supercritical) is fundamental to all chemical processes, and the critical point is particularly important for supercritical chemical extraction. Conventional phase measurement methods require hours to obtain a single datum on the pressure and temperature diagram. Here, we present the direct measurement of the full pressure-temperature phase diagram, with 10 000 microwells. Orthogonal, linear, pressure and temperature gradients are obtained with 100 parallel microchannels (spanning the pressure range), each with 100 microwells (spanning the temperature range). The phase-mapping approach is demonstrated with both a pure substance (CO2) and a mixture (95% CO2 + 5% N2). Liquid, vapor, and supercritical regions are clearly differentiated, and the critical pressure is measured at 1.2% error with respect to the NIST standard. This approach provides over 100-fold improvement in measurement speed over conventional methods. PMID:27331613

  2. Krajewski diagrams and the standard model

    SciTech Connect

    Stephan, Christoph A.

    2009-04-15

    This paper provides a complete list of Krajewski diagrams representing the standard model of particle physics. We will give the possible representations of the algebra and the anomaly-free lifts which provide the representation of the standard model gauge group on the fermionic Hilbert space. The algebra representations following from the Krajewski diagrams are not complete in the sense that the corresponding spectral triples do not necessarily obey the axiom of Poincare duality. This defect may be repaired by adding new particles to the model, i.e., by building models beyond the standard model. The aim of this list of finite spectral triples (up to Poincare duality) is therefore to provide a basis for model building beyond the standard model.

  3. MHV diagrams in momentum twistor space

    NASA Astrophysics Data System (ADS)

    Bullimore, Mathew; Mason, Lionel; Skinner, David

    2010-12-01

    We show that there are remarkable simplifications when the MHV diagram formalism for N = 4 super Yang-Mills is reformulated in momentum twistor space. The vertices are replaced by unity while each propagator becomes a dual superconformal `R-invariant' whose arguments may be read off from the diagram, and include an arbitrarily chosen reference twistor. The momentum twistor MHV rules generate a formula for the full, all-loop planar integrand for the super Yang-Mills S-matrix that is manifestly dual superconformally invariant up to the choice of a reference twistor. We give a general proof of this reformulation and illustrate its use by computing the momentum twistor NMHV and N^2MHV tree amplitudes and the integrands of the MHV and NMHV 1-loop and the MHV 2-loop planar amplitudes.

  4. Band diagram of strained graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Prabhakar, Sanjay; Melnik, Roderick; Bonilla, Luis

    2016-04-01

    The influence of ripple waves on the band diagram of zigzag strained graphene nanoribbons (GNRs) is analyzed by utilizing the finite element method. Such waves have their origin in electromechanical effects. With a novel model, we demonstrate that electron-hole band diagrams of GNRs are highly influenced (i.e. level crossing of the bands are possible) by two combined effects: pseudo-magnetic fields originating from electroelasticity theory and external magnetic fields. In particular, we show that the level crossing point can be observed at large external magnetic fields (B ≈ 100T ) in strained GNRs, when the externally applied tensile edge stress is on the order of -100 eV/nm and the amplitude of the out-of-plane ripple waves is on the order of 1nm.

  5. Ring-Diagram Analysis: Status and Perspectives

    NASA Astrophysics Data System (ADS)

    Hill, F.

    Ring diagram analysis is now more than a decade old. While the details of the technique are still evolving, the application of the method to MDI, TON, Mt. Wilson, HLH, and GONG data is providing intriguing results. Thanks to the work of many people, it is now becoming possible to observationally infer the complicated dynamics in the outer 15 Mm of the solar convection zone, investigate the depth dependence of meridional flow, and get a closer look at zonal jet-stream structures in the mid-latitudes. We may soon be able to similarly investigate the spatio-temporal distribution of scalar fields. As ring diagrams and other local helioseismology methods such as time-distance and acoustic imaging continue to mature, the comparison of results from different techniques on common data sets will provide a useful reality check.

  6. Persistence diagrams of cortical surface data.

    PubMed

    Chung, Moo K; Bubenik, Peter; Kim, Peter T

    2009-01-01

    We present a novel framework for characterizing signals in images using techniques from computational algebraic topology. This technique is general enough to deal with noisy multivariate data, including geometric noise. The main tool is persistent homology, which can be encoded in persistence diagrams. These diagrams visually show how the number of connected components of the sublevel sets of the signal changes. The use of local critical values of a function differs from the usual statistical parametric mapping framework, which mainly uses the mean signal in quantifying imaging data. Our proposed method uses all the local critical values in characterizing the signal and by doing so offers a completely new data reduction and analysis framework for quantifying the signal. As an illustration, we apply this method to a 1D simulated signal and 2D cortical thickness data. In the case of the latter, extra homological structures are evident in the control group compared with the autistic group. PMID:19694279
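
    A minimal sketch (not the authors' code) of a 0-dimensional persistence diagram for the sublevel sets of a 1D signal, tracking births and deaths of connected components with a union-find structure.

        import numpy as np

        def sublevel_persistence_1d(f):
            # 0-dimensional persistence pairs (birth, death) of the sublevel-set
            # filtration of a signal sampled on a line graph.
            f = np.asarray(f, dtype=float)
            n = len(f)
            parent = np.full(n, -1)          # -1: vertex not yet in the filtration
            birth = np.zeros(n)              # birth value stored at each component root
            pairs = []

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i

            for v in np.argsort(f, kind="stable"):   # add vertices by increasing value
                parent[v] = v
                birth[v] = f[v]
                for u in (v - 1, v + 1):             # line-graph neighbours
                    if 0 <= u < n and parent[u] != -1:
                        ru, rv = find(u), find(v)
                        if ru == rv:
                            continue
                        if birth[ru] > birth[rv]:
                            ru, rv = rv, ru          # rv is now the younger component
                        pairs.append((birth[rv], f[v]))   # younger component dies here
                        parent[rv] = ru
            pairs.append((f.min(), np.inf))          # the oldest component never dies
            return pairs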

  7. Extracting parameters from colour-magnitude diagrams

    NASA Astrophysics Data System (ADS)

    Bonatto, C.; Campos, F.; Kepler, S. O.; Bica, E.

    2015-07-01

    We present a simple approach for obtaining robust values of astrophysical parameters from the observed colour-magnitude diagrams (CMDs) of star clusters. The basic inputs are the Hess diagram built with the photometric measurements of a star cluster and a set of isochrones covering wide ranges of age and metallicity. In short, each isochrone is shifted in apparent distance modulus and colour excess until it crosses over the maximum possible Hess density. Repeating this step for all available isochrones leads to the construction of the solution map, in which the optimum values of age and metallicity - as well as foreground/background reddening and distance from the Sun - can be searched for. Controlled tests with simulated CMDs show that the approach is efficient in recovering the input values. We apply the approach to the open clusters M 67, NGC 6791 and NGC 2635, which are characterized by different ages, metallicities and distances from the Sun.
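
    A rough sketch of the shift-and-score idea described above: one isochrone is moved over a grid of distance moduli and colour excesses and scored by the Hess density it crosses. The extinction coefficient R and the smoothing are assumptions, and hess, c_edges and m_edges stand for a precomputed Hess diagram and its bin edges.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def shift_score_map(hess, c_edges, m_edges, iso_color, iso_mag,
                            dm_grid, ebv_grid, R=3.1):
            # Score = total (smoothed) Hess density crossed by the shifted isochrone.
            smooth = gaussian_filter(hess, sigma=1.0)
            scores = np.zeros((len(dm_grid), len(ebv_grid)))
            for i, dm in enumerate(dm_grid):
                for j, ebv in enumerate(ebv_grid):
                    c = iso_color + ebv              # redden the isochrone colours
                    m = iso_mag + dm + R * ebv       # shift to apparent, extincted magnitudes
                    inside = ((c >= c_edges[0]) & (c <= c_edges[-1]) &
                              (m >= m_edges[0]) & (m <= m_edges[-1]))
                    ci = np.clip(np.digitize(c[inside], c_edges) - 1, 0, smooth.shape[0] - 1)
                    mi = np.clip(np.digitize(m[inside], m_edges) - 1, 0, smooth.shape[1] - 1)
                    scores[i, j] = smooth[ci, mi].sum()
            return scores   # the maximum marks the best (distance modulus, colour excess)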

  8. Phase Coexistence in a Dynamic Phase Diagram.

    PubMed

    Gentile, Luigi; Coppola, Luigi; Balog, Sandor; Mortensen, Kell; Ranieri, Giuseppe A; Olsson, Ulf

    2015-08-01

    Metastability and phase coexistence are important concepts in colloidal science. Typically, the phase diagram of colloidal systems is considered at equilibrium without the presence of an external field. However, several studies have reported phase transitions under mechanical deformation. The reason behind phase coexistence under shear flow is not fully understood. Here, multilamellar vesicle (MLV)-to-sponge (L3) and MLV-to-Lα transitions upon increasing temperature are detected using flow small-angle neutron scattering techniques. Coexistence of Lα and MLV phases at 40 °C under shear flow is detected by using flow NMR spectroscopy. The unusual rheological behavior observed by studying the lamellar phase of a non-ionic surfactant is explained, using 2H NMR and diffusion flow NMR spectroscopy, by the coexistence of planar lamellar and multilamellar vesicle structures. Moreover, a dynamic phase diagram over a wide range of temperatures is proposed. PMID:26083451

  9. Phase diagram of superconductor-ferromagnet superlattices

    SciTech Connect

    Radovic, Z.; Dobrosavljevic-Grujic, L.

    1994-12-31

    Recent progress in the proximity effect theory of superconductor-ferromagnet superlattices is reviewed. The phase diagram calculations, transition temperature Tc and upper critical fields Hc2, are presented. Characteristic features of the Tc and Hc2(T) dependence on layer thicknesses, including the predicted unusual oscillatory variations and a new inhomogeneous superconducting state with a nontrivial phase difference between neighboring superconducting layers, are discussed and compared with experimental data for V/Fe and Nb/Gd superlattices.

  10. Mixed waste integrated program: Logic diagram

    SciTech Connect

    Mayberry, J.; Stelle, S.; O`Brien, M.; Rudin, M.; Ferguson, J.; McFee, J.

    1994-11-30

    The Mixed Waste Integrated Program Logic Diagram was developed to provide technical alternatives for mixed waste projects for the Office of Technology Development's Mixed Waste Integrated Program (MWIP). Technical solutions in the areas of characterization, treatment, and disposal were matched to a select number of US Department of Energy (DOE) treatability groups represented by waste streams found in the Mixed Waste Inventory Report (MWIR).

  11. Displaying multimedia environmental partitioning by triangular diagrams

    SciTech Connect

    Lee, S.C.; Mackay, D.

    1995-11-01

    It is suggested that equilateral triangular diagrams are a useful method of depicting the equilibrium partitioning of organic chemicals among the three primary environmental media of the atmosphere, the hydrosphere, and the organosphere (natural organic matter and biotic lipids and waxes). The technique is useful for grouping chemicals into classes according to their partitioning tendencies, for depicting the incremental effects of substituents such as alkyl groups and chlorine, and for showing how partitioning changes in response to changes in temperature.
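
    For reference, a sketch of the standard mapping from three fractions (atmosphere, hydrosphere, organosphere) onto Cartesian coordinates inside an equilateral triangle; the corner assignment is an arbitrary choice.

        import numpy as np

        def ternary_xy(f_air, f_water, f_organic):
            # Corners: air at (0, 0), water at (1, 0), organic matter at (0.5, sqrt(3)/2).
            f = np.array([f_air, f_water, f_organic], dtype=float)
            f = f / f.sum()                      # normalize in case of rounding error
            x = f[1] + 0.5 * f[2]
            y = 0.5 * np.sqrt(3.0) * f[2]
            return x, y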

  12. Nonthermal Radio Emission and the HR Diagram

    NASA Technical Reports Server (NTRS)

    Gibson, D. M.

    1985-01-01

    Perhaps the most reliable indicator of non-radiative heating/momentum in a stellar atmosphere is the presence of nonthermal radio emission. To date, 77 normal stellar objects have been detected and identified as nonthermal sources. These stellar objects are tabulated herein. It is apparent that non-thermal radio emission is not ubiquitous across the HR diagram. This is clearly the case for the single stars; it is not as clear for the binaries unless the radio emission is associated with their late-type components. Choosing to make this association, we plot the single stars and the late-type components together. The following picture emerges: (1) there are four locations on the HR diagram where non-thermal radio stars are found; (2) the peak incoherent 5 GHz luminosities show a surprisingly small range for stars within each class; (3) the fraction of stellar energy that escapes as radio emission can be estimated by comparing the integrated maximum radio luminosity to the bolometric luminosity; (4) there are no apparent differences in L_R between binaries with two cool components, binaries with one hot and one cool component, and single stars for classes C and D; and (5) the late-type stars (classes B, C, and D) are located in parts of the HR diagram where there is reason to suspect that the surfaces of the stars are being braked with respect to their interiors.

  13. Regime Diagrams for K-Theory Dispersion

    NASA Astrophysics Data System (ADS)

    Smith, Ronald B.

    2011-06-01

    In atmospheric dispersion, the "non-Gaussian" effects of gravitational settling, the vertical gradient in diffusivity and the surface deposition do not enter uniformly but rather break up parameter space into several discrete regimes. Here, we describe regime diagrams that are constructed for K-theory dispersion of effluent from a surface line source in unsheared inhomogeneous turbulence, using a previously derived Fourier-Hankel method. This K-theory formulation differs from the traditional one by keeping a non-zero diffusivity at the ground. This change allows for turbulent exchange between the canopy and the atmosphere and allows new natural length scales to emerge. The axes on the regime diagrams are non-dimensional distance defined as the ratio of downwind distance to the characteristic length scale for each effect. For each value of the ratio of settling speed to the K gradient, two to four regimes are found. Concentration formulae are given for each regime. The regime diagrams allow real dispersion problems to be categorized and the validity of end-state concentration formulae to be judged.

  14. Phase diagram and dynamics of Yukawa systems

    NASA Astrophysics Data System (ADS)

    Robbins, Mark. O.; Kremer, Kurt; Grest, Gary S.

    1988-03-01

    The phase diagram and dynamical properties of systems of particles interacting through a repulsive screened Coulomb (Yukawa) potential have been calculated using molecular and lattice dynamics techniques. The phase diagram contains both a melting transition and a transition from fcc to bcc crystalline phases. These phase transitions have been studied as a function of potential shape (screening length) and compared to phenomenological criteria for transition temperatures such as those of Lindemann and of Hansen and Verlet. The transition from fcc to bcc with increasing temperature is shown to result from a higher entropy in the bcc phase because of its softer shear modes. Even when the stable solid phase below the melting temperature is fcc, bcc-like local order is found in the liquid phase. This may substantially slow crystallization. The calculated phase diagram and shear modulus are in good agreement with experiments on colloidal suspensions of polystyrene spheres. The single particle dynamics of Yukawa systems show several unusual features. There is a pronounced subdiffusive regime in liquids near and below the melting temperature. This regime reflects the existence of two time scales: a typical phonon period, and the time for a particle to feel a new environment. The second time scale becomes longer as the temperature is lowered or the range of interaction (screening length) increases.
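
    For reference, the repulsive screened Coulomb (Yukawa) pair potential used in such studies takes only a couple of lines; the prefactor and units below are illustrative.

        import numpy as np

        def yukawa(r, q_eff=1.0, screening_length=1.0):
            # U(r) = (q_eff^2 / r) * exp(-r / screening_length); purely repulsive.
            r = np.asarray(r, dtype=float)
            return (q_eff ** 2 / r) * np.exp(-r / screening_length)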

  15. Recognition and processing of logic diagrams

    NASA Astrophysics Data System (ADS)

    Darwish, Ahmed M.; Bashandy, Ahmed R.

    1996-03-01

    In this paper we present a vision system that is capable of interpreting schematic logic diagrams, i.e. determining the output as a logic function of the inputs. The system is composed of a number of modules, each designed to perform a specific subtask. Each module bears a minor contribution in the form of a new mixture of known algorithms or extensions to handle actual real-life image imperfections which researchers tend to ignore when they develop their theoretical foundations. The main contribution, thus, is not in any individual module; it is rather in their integration to achieve the target job. The system is organized more or less in a classical fashion. Aside from the image acquisition and preprocessing modules, interesting modules include: the segmenter, the identifier, the connector and the grapher. A good segmentation output is one reason for the success of the presented system. Several novelties exist in the presented approach. Following segmentation, the type of each logic gate and its topological connectivity are determined. The logic diagram is then transformed to a directed acyclic graph in which the final node is the output logic gate. The logic function is then determined by backtracking techniques. The system is not only aimed at recognition applications. In fact, its main usage may be to target other processing applications such as storage compression and graphics modification and manipulation of the diagram, as is explained.
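
    A toy sketch (not the paper's system) of the final step described above: once the gates and their connectivity form a directed acyclic graph, the output can be evaluated as a logic function of the inputs by recursing from the output node.

        GATE_FUNCS = {
            "AND":  lambda v: all(v),
            "OR":   lambda v: any(v),
            "NOT":  lambda v: not v[0],
            "NAND": lambda v: not all(v),
            "NOR":  lambda v: not any(v),
            "XOR":  lambda v: sum(v) % 2 == 1,
        }

        def evaluate(node, gates, inputs, cache=None):
            # gates: {gate name: (gate type, [fan-in node names])};
            # primary inputs appear only as keys of `inputs`.
            if cache is None:
                cache = {}
            if node in inputs:
                return inputs[node]
            if node not in cache:
                gate_type, fan_in = gates[node]
                cache[node] = GATE_FUNCS[gate_type](
                    [evaluate(u, gates, inputs, cache) for u in fan_in])
            return cache[node]

        # Example: out = (a AND b) OR (NOT c)
        gates = {"g1": ("AND", ["a", "b"]), "g2": ("NOT", ["c"]), "out": ("OR", ["g1", "g2"])}
        print(evaluate("out", gates, {"a": True, "b": False, "c": False}))   # prints True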

  16. The Critical Importance of Russell's Diagram

    NASA Astrophysics Data System (ADS)

    Gingerich, O.

    2013-04-01

    The idea of dwarf and giant stars, but not the nomenclature, was first established by Ejnar Hertzsprung in 1905; his first diagrams in support appeared in 1911. In 1913 Henry Norris Russell could demonstrate the effect far more strikingly because he had measured the parallaxes of many stars at Cambridge, and could plot absolute magnitude against spectral type for many points. The general concept of dwarf and giant stars was essential in the galactic structure work of Harlow Shapley, Russell's first graduate student. In order to calibrate the period-luminosity relation of Cepheid variables, he was obliged to fall back on statistical parallax using only 11 Cepheids, a very sparse sample. Here the insight provided by the Russell diagram became critical. The presence of yellow K giant stars in globular clusters credentialed his calibration of the period-luminosity relation by showing that the calibrated luminosity of the Cepheids was comparable to the luminosity of the K giants. It is well known that in 1920 Shapley did not believe in the cosmological distances of Heber Curtis' spiral nebulae. It is not so well known that in 1920 Curtis' plot of the period-luminosity relation suggests that he didn't believe it was a physical relation, and also that he failed to appreciate the significance of the Russell diagram for understanding the large size of the Milky Way.

  17. 75 FR 61512 - Outer Continental Shelf Official Protraction Diagrams

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-05

    ... Bureau of Ocean Energy Management, Regulation and Enforcement Outer Continental Shelf Official... Outer Continental Shelf Official Protraction Diagrams (OPDs) located within Atlantic Ocean areas, with... informational purposes only. Outer Continental Shelf Official Protraction Diagrams in the North Atlantic,...

  18. Placing the Forces on Free-Body Diagrams.

    ERIC Educational Resources Information Center

    Sperry, Willard

    1994-01-01

    Discusses the problem of drawing free-body diagrams to analyze the conditions of static equilibrium. Presents a method based on the correct placement of the normal force on the body. Includes diagrams. (MVL)

  19. Teaching Verbal Chains Using Flow Diagrams and Texts

    ERIC Educational Resources Information Center

    Holliday, William G.

    1976-01-01

    A discussion of the recent diagram and attention theory and research surprisingly suggests that a single flow diagram with instructive questions constitutes an effective learning medium in terms of verbal chaining. (Author)

  20. Massive basketball diagram for a thermal scalar field theory

    NASA Astrophysics Data System (ADS)

    Andersen, Jens O.; Braaten, Eric; Strickland, Michael

    2000-08-01

    The "basketball diagram" is a three-loop vacuum diagram for a scalar field theory that cannot be expressed in terms of one-loop diagrams. We calculate this diagram for a massive scalar field at nonzero temperature, reducing it to expressions involving three-dimensional integrals that can be easily evaluated numerically. We use this result to calculate the free energy for a massive scalar field with a φ4 interaction to three-loop order.

  1. Variation in Cognitive Failures: An Individual Differences Investigation of Everyday Attention and Memory Failures

    ERIC Educational Resources Information Center

    Unsworth, Nash; Brewer, Gene A.; Spillers, Gregory J.

    2012-01-01

    The present study examined individual differences in everyday cognitive failures assessed by diaries. A large sample of participants completed various cognitive ability measures in the laboratory. Furthermore, a subset of these participants also recorded everyday cognitive failures (attention, retrospective memory, and prospective memory failures)…

  2. Oak Ridge National Laboratory Technology Logic Diagram. Executive Summary

    SciTech Connect

    Not Available

    1993-06-30

    This executive summary contains a description of the logic diagram format; some examples from the diagram (Vol. 2) and associated technology evaluation data sheets (Vol. 3); a complete (albeit condensed) listing of the RA, D&D, and WM problems at ORNL; and a complete listing of the technology rankings for all the areas covered by the diagram.

  3. The Classroom as Rhizome: New Strategies for Diagramming Knotted Interactions

    ERIC Educational Resources Information Center

    de Freitas, Elizabeth

    2012-01-01

    This article calls attention to the unexamined role of diagrams in educational research and offers examples of alternative diagramming practices or tools that shed light on classroom interaction as a rhizomatic process. Drawing extensively on the work of Latour, Deleuze and Guattari, and Chatelet, this article explores the power of diagramming as…

  4. Differential Cognitive and Affective Responses to Flow Diagrams in Science

    ERIC Educational Resources Information Center

    Holliday, William G.; And Others

    1977-01-01

    Describes a study in which tenth-grade biology students who were low verbal performers scored significantly higher on achievement tests when provided with picture-word diagrams of biological concepts than when provided with block-word diagrams. Students and teachers also preferred picture-word diagrams as indicated by a questionnaire. (MLH)

  5. 30 CFR 256.8 - Leasing maps and diagrams.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Leasing maps and diagrams. 256.8 Section 256.8..., General § 256.8 Leasing maps and diagrams. (a) Any area of the OCS which has been appropriately platted as... 12(d) of the Act. (b) The MMS shall prepare leasing maps and official protraction diagrams of...

  6. Students' Learning Activities While Studying Biological Process Diagrams

    ERIC Educational Resources Information Center

    Kragten, Marco; Admiraal, Wilfried; Rijlaarsdam, Gert

    2015-01-01

    Process diagrams describe how a system functions (e.g. photosynthesis) and are an important type of representation in Biology education. In the present study, we examined students' learning activities while studying process diagrams, related to their resulting comprehension of these diagrams. Each student completed three learning tasks. Verbal…

  7. Science Visual Literacy: Learners' Perceptions and Knowledge of Diagrams

    ERIC Educational Resources Information Center

    McTigue, Erin M.; Flowers, Amanda C.

    2011-01-01

    Constructing meaning from science texts relies not only on comprehending the words but also the diagrams and other graphics. The goal of this study was to explore elementary students' perceptions of science diagrams and their skills related to diagram interpretation. 30 students, ranging from second grade through middle school, completed a diagram…

  8. The Space Station Freedom Reliability and Maintainability Assessment Tool

    NASA Technical Reports Server (NTRS)

    Blumentritt, Will; Doran, Linda; Sample, Keith

    1993-01-01

    The Reliability and Maintainability Assessment Tool is a stochastic, event-oriented simulation model that has been developed to analyze the functional reliability, availability, and maintainability characteristics of the Space Station Freedom. This tool simulates failures and performs corrective and preventive maintenance tasks, utilizing user-specified maintenance resources, including crewmembers and/or robotics, and accommodates the growth of the station. The model dynamically interfaces with minimal cut sets derived from reliability block diagrams to assess functional status and to determine queuing priorities.
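
    As a rough illustration of how minimal cut sets can drive a functional-status check (an assumed sketch, not the tool's actual implementation; the component names are hypothetical), a function is lost when every component in at least one minimal cut set has failed:

      # Hypothetical minimal cut sets derived from a reliability block diagram.
      minimal_cut_sets = [
          {"pumpA", "pumpB"},        # both pumps down -> loss of the function
          {"controller"},            # single-point failure
      ]

      def function_failed(failed_components):
          # The function is lost if some cut set is entirely contained in the failed set.
          return any(cut <= failed_components for cut in minimal_cut_sets)

      print(function_failed({"pumpA"}))             # False
      print(function_failed({"pumpA", "pumpB"}))    # True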

  9. The Space Station Freedom Reliability and Maintainability Assessment Tool

    NASA Astrophysics Data System (ADS)

    Blumentritt, Will; Doran, Linda; Sample, Keith

    The Reliability and Maintainability Assessment Tool is a stochastic, event-oriented simulation model that has been developed to analyze the functional reliability, availability, and maintainability characteristics of the Space Station Freedom. This tool simulates failures and performs corrective and preventive maintenance tasks, utilizing user-specified maintenance resources, including crewmembers and/or robotics, and accommodates the growth of the station. The model dynamically interfaces with minimal cut sets derived from reliability block diagrams to assess functional status and to determine queuing priorities.

  10. Ivabradine: Heart Failure and Beyond.

    PubMed

    Chaudhary, Rahul; Garg, Jalaj; Krishnamoorthy, Parasuram; Shah, Neeraj; Lanier, Gregg; Martinez, Mathew W; Freudenberger, Ronald

    2016-07-01

    Heart failure affects over 5 million people in the United States and carries a high rate of mortality. Ivabradine, a new agent, has been added to the current medical options for managing heart failure. It is a selective funny current (If) inhibitor in the sinoatrial node; it slows the node's firing rate, prolonging diastolic depolarization without a negative inotropic effect. Ivabradine was only recently approved by the Food and Drug Administration, after the results of the Systolic Heart Failure Treatment with the If Inhibitor Ivabradine (SHIFT) trial, for reduction of rehospitalizations from chronic heart failure. This trial assessed patients with stable heart failure with reduced ejection fraction and a resting heart rate of at least 70 beats per minute on maximally tolerated beta-blocker therapy, and demonstrated a statistically significant reduction in heart failure hospitalizations and deaths. Additionally, ivabradine has been associated with reduced cardiac remodeling, reduced heart rate variability, improvement in exercise tolerance, improved New York Heart Association heart failure class, and better quality of life. It has also been tried in other conditions, such as inappropriate sinus tachycardia and cardiogenic shock, and is currently in a phase II trial for patients with newly diagnosed multiple organ dysfunction syndrome. PMID:26721645

  11. 24 CFR 902.62 - Failure to submit data.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Failure to submit data. 902.62... DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.62 Failure to submit data. (a) Failure to... receive a presumptive rating of failure for its unaudited information and shall receive zero points...

  12. 24 CFR 902.62 - Failure to submit data.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Failure to submit data. 902.62... DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.62 Failure to submit data. (a) Failure to... receive a presumptive rating of failure for its unaudited information and shall receive zero points...

  13. 24 CFR 902.62 - Failure to submit data.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Failure to submit data. 902.62... DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.62 Failure to submit data. (a) Failure to... receive a presumptive rating of failure for its unaudited information and shall receive zero points...

  14. Urea distribution in renal failure

    PubMed Central

    Blackmore, D. J.; Elder, W. J.; Bowden, C. H.

    1963-01-01

    An assessment of intracellular urea removed during haemodialysis has been made from urea extraction and plasma urea estimations. An apparent wide variation in the movement of intracellular urea in patients with acute renal failure from obstetric and traumatic causes and with chronic renal failure is reported. A method for the estimation of red cell water urea is presented. In two patients with chronic renal failure the red cell urea level was much higher than would have been expected from the plasma urea level before dialysis. In two obstetric patients there was no such discrepancy. The conclusion is drawn that research should be directed to variations of intracellular metabolism in renal failure before a more rational approach can be made to its management. PMID:16811009

  15. Acute kidney failure

    MedlinePlus

    Kidney failure; Renal failure; Renal failure - acute; ARF; Kidney injury - acute ... There are many possible causes of kidney damage. They include: ... cholesterol (cholesterol emboli) Decreased blood flow due to very ...

  16. What Is Heart Failure?

    MedlinePlus

    ... page from the NHLBI on Twitter. What Is Heart Failure? Heart failure is a condition in which the heart can' ... force. Some people have both problems. The term "heart failure" doesn't mean that your heart has stopped ...

  17. Assessment of a primary care-based telemonitoring intervention for home care patients with heart failure and chronic lung disease. The TELBIL study

    PubMed Central

    2011-01-01

    Background Telemonitoring technology offers one of the most promising alternatives for the provision of health care services at the patient's home. The primary aim of this study is to evaluate the impact of a primary care-based telemonitoring intervention on the frequency of hospital admissions. Methods/design A primary care-based randomised controlled trial will be carried out to assess the impact of a telemonitoring intervention aimed at home care patients with heart failure (HF) and/or chronic lung disease (CLD). The results will be compared with those obtained with standard health care practice. The duration of the study will be of one year. Sixty patients will be recruited for the study. In-home patients, diagnosed with HF and/or CLD, aged 14 or above and with two or more hospital admissions in the previous year will be eligible. For the intervention group, telemonitoring will consist of daily patient self-measurements of respiratory-rate, heart-rate, blood pressure, oxygen saturation, weight and body temperature. Additionally, the patients will complete a qualitative symptom questionnaire daily using the telemonitoring system. Routine telephone contacts will be conducted every fortnight and additional telephone contacts will be carried out if the data received at the primary care centre are out of the established limits. The control group will receive usual care. The primary outcome measure is the number of hospital admissions due to any cause that occurred in a period of 12 months post-randomisation. The secondary outcome measures are: duration of hospital stay, hospital admissions due to HF or CLD, mortality rate, use of health care resources, quality of life, cost-effectiveness, compliance and patient and health care professional satisfaction with the new technology. Discussion The results of this study will shed some light on the effects of telemonitoring for the follow-up and management of chronic patients from a primary care setting. The study may

  18. Revisiting Stress Corrosion Cracking of Steel in Caustic Solutions for Developing Cracking Susceptibility Diagrams for Improved Applicability

    NASA Astrophysics Data System (ADS)

    Pal, Sarvesh; Raman, R. K. Singh; Ibrahim, R. N.

    2012-06-01

    Stress corrosion cracking tests were conducted using Bayer solutions of different chemistry at different temperatures for extraction of alumina from bauxite ores. The validity of the commonly used caustic cracking susceptibility (CS) diagram for steels exposed to plain caustic solutions was assessed by testing the notched and precracked specimens. This study presents first results toward the development of a model susceptibility diagram for actual Bayer solutions, and for improved applicability of the traditional plain caustic diagram. For mechanistic understanding of caustic cracking, tests were also carried out under imposed electrochemical conditions.

  19. Fault Tree, Event Tree, and Piping and Instrumentation Diagram (FEP) editors, Version 4.0. Reference manual

    SciTech Connect

    McKay, M.K.; Skinner, N.L.; Wood, S.T.

    1992-05-01

    The Fault Tree, Event Tree, and Piping & Instrumentation Diagram (FEP) editors allow the user to graphically build and edit fault trees, event trees, and piping & instrumentation diagrams (P & IDs). The software is designed to enable the use of graphical-based editors found in the Integrated Reliability and Risk Assessment System (IRRAS). FEP is made up of three separate editors (Fault Tree, Event Tree, and Piping & Instrumentation Diagram) and a utility module. This reference manual provides a screen-by-screen walkthrough of the entire FEP System.
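
    The following toy Python fragment is included only to make the AND/OR fault-tree semantics concrete; it is not FEP or IRRAS code, and the event names are invented:

      # A fault-tree node is either ("EVENT", name) or ("AND"/"OR", [child nodes]).
      def evaluate(node, basic_event_states):
          kind = node[0]
          if kind == "EVENT":
              return basic_event_states[node[1]]
          children = [evaluate(c, basic_event_states) for c in node[1]]
          return all(children) if kind == "AND" else any(children)

      top = ("OR", [("EVENT", "valve_stuck"),
                    ("AND", [("EVENT", "pump1_fails"), ("EVENT", "pump2_fails")])])

      # Top event occurs here because both pumps have failed.
      print(evaluate(top, {"valve_stuck": False, "pump1_fails": True, "pump2_fails": True}))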

  20. Generic Phase Diagram of Binary Superlattices

    NASA Astrophysics Data System (ADS)

    Tkachenko, Alexei

    Emergence of a large variety of self-assembled superlattices is a dramatic recent trend in the fields of nanoparticle and colloidal sciences. Motivated by this development, we propose a model that combines simplicity with a remarkably rich phase behavior, applicable to a wide range of such self-assembled systems. Those include nanoparticle and colloidal assemblies driven by DNA-mediated interactions, electrostatics, and possibly, by controlled drying. In our model, a binary system of Large and Small hard spheres (L and S) interact via a selective short-range ("sticky") attraction. In its simplest version, this Binary Sticky Sphere model features attraction only between S and L particles. We demonstrate that in the limit when this attraction is sufficiently strong compared to kT, the problem becomes purely geometrical: the thermodynamically preferred state should maximize the number of S-L contacts. A general procedure for constructing the phase diagram as a function of system composition f and particle size ratio r is outlined. In this way, the global phase behavior can be calculated very efficiently for a given set of plausible candidate phases. Furthermore, the geometric nature of the problem enables us to generate those candidate phases through a well-defined and intuitive construction. We calculate the phase diagrams both for 2D and 3D systems, and compare the results with existing experiments. Most of the 3D superlattices observed to date are featured in our phase diagram, while several more are yet to be discovered. The research was carried out at the CFN, DOE Office of Science Facility, at BNL, under Contract No. DE-SC0012704.

  1. State-transition diagrams for biologists.

    PubMed

    Bersini, Hugues; Klatzmann, David; Six, Adrien; Thomas-Vaslin, Véronique

    2012-01-01

    It is clearly in the tradition of biologists to conceptualize the dynamical evolution of biological systems in terms of state-transitions of biological objects. This paper is mainly concerned with (but obviously not limited to) the immunological branch of biology and shows how the adoption of UML (Unified Modeling Language) state-transition diagrams can ease the modeling, the understanding, the coding, the manipulation or the documentation of population-based immune software models generally defined as a set of ordinary differential equations (ODE), describing the evolution in time of populations of various biological objects. Moreover, that same UML adoption naturally entails a far from negligible representational economy since one graphical item of the diagram might have to be repeated in various places of the mathematical model. First, the main graphical elements of the UML state-transition diagram and how they can be mapped onto a corresponding ODE mathematical model are presented. Then, two already published immune models of thymocyte behavior and time evolution in the thymus, the first one originally conceived as an ODE population-based model whereas the second one as an agent-based one, are refactored and expressed in a state-transition form so as to make them much easier to understand and their respective code easier to access, to modify and run. As an illustrative proof, for any immunologist, it should be possible to understand faithfully enough what the two software models are supposed to reproduce and how they execute with no need to plunge into the Java or Fortran lines. PMID:22844438
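
    As a hedged illustration of the mapping described above, a single transition between two states (arbitrarily named "precursor" and "mature" here, with a made-up rate k) corresponds to the ODE pair dP/dt = -kP, dM/dt = +kP, which can be integrated directly:

      # Forward-Euler integration of the two-state transition sketched above.
      # The rate value, time step and state names are invented for the example.
      k, dt = 0.1, 0.01
      P, M = 1000.0, 0.0
      for _ in range(int(10 / dt)):        # integrate 10 time units
          flow = k * P * dt                # amount moving along the transition arrow
          P, M = P - flow, M + flow

      print(round(P, 1), round(M, 1))      # P ~ 1000*exp(-1) ~ 368 after 10 time units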

  2. State-Transition Diagrams for Biologists

    PubMed Central

    Bersini, Hugues; Klatzmann, David; Six, Adrien; Thomas-Vaslin, Véronique

    2012-01-01

    It is clearly in the tradition of biologists to conceptualize the dynamical evolution of biological systems in terms of state-transitions of biological objects. This paper is mainly concerned with (but obviously not limited to) the immunological branch of biology and shows how the adoption of UML (Unified Modeling Language) state-transition diagrams can ease the modeling, the understanding, the coding, the manipulation or the documentation of population-based immune software models generally defined as a set of ordinary differential equations (ODE), describing the evolution in time of populations of various biological objects. Moreover, that same UML adoption naturally entails a far from negligible representational economy since one graphical item of the diagram might have to be repeated in various places of the mathematical model. First, the main graphical elements of the UML state-transition diagram and how they can be mapped onto a corresponding ODE mathematical model are presented. Then, two already published immune models of thymocyte behavior and time evolution in the thymus, the first one originally conceived as an ODE population-based model whereas the second one as an agent-based one, are refactored and expressed in a state-transition form so as to make them much easier to understand and their respective code easier to access, to modify and run. As an illustrative proof, for any immunologist, it should be possible to understand faithfully enough what the two software models are supposed to reproduce and how they execute with no need to plunge into the Java or Fortran lines. PMID:22844438

  3. Thermodynamic modeling of the UO2-ThO2 phase diagram

    NASA Astrophysics Data System (ADS)

    Kim, Jinwon; Kim, Sung S.

    2016-02-01

    The phase diagram in the UO2-ThO2 system has been assessed by thermodynamic modeling with existing data from the literature. The subregular solution model was used to represent the Gibbs free energies of the liquid and the solid phases. By considering the liquidus, solidus and miscibility gap data, the interaction parameters of the liquid and the solid phases were optimized through a multiple linear regression method. A consistent set of interaction parameters was derived for describing the miscibility gap as well as the liquidus/solidus. The phase diagram calculated in the present work is in good agreement with experimental data in the literature.
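
    For orientation only, the general form of such a subregular (two-parameter Redlich-Kister) solution model can be written as follows; the optimized interaction parameters reported in the paper are not reproduced here:

      % x_1, x_2 = mole fractions of UO2 and ThO2; L_0, L_1 = interaction parameters
      G_{m} = x_{1}\,{}^{\circ}G_{\mathrm{UO_2}} + x_{2}\,{}^{\circ}G_{\mathrm{ThO_2}}
            + RT\,(x_{1}\ln x_{1} + x_{2}\ln x_{2}) + G^{\mathrm{xs}},
      \qquad
      G^{\mathrm{xs}} = x_{1} x_{2}\,[\,L_{0} + L_{1}(x_{1}-x_{2})\,].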

  4. Massless sunset diagrams in finite asymmetric volumes

    NASA Astrophysics Data System (ADS)

    Niedermayer, F.; Weisz, P.

    2016-06-01

    This paper discusses the methods and the results used in an accompanying paper describing the matching of effective chiral Lagrangians in dimensional and lattice regularizations. We present methods to compute 2-loop massless sunset diagrams in finite asymmetric volumes in the framework of these regularizations. We also consider 1-loop sums in both regularizations, extending the results of Hasenfratz and Leutwyler for the case of dimensional regularization and we introduce a new method to calculate precisely the expansion coefficients of the 1-loop lattice sums.

  5. Toward a phase diagram for stocks

    NASA Astrophysics Data System (ADS)

    Ivanova, K.

    1999-08-01

    A display of the tentatively chosen basic parameters of stocks, i.e., the daily closing price and the daily transaction volume, is presented, eliminating the time variable between them. The “phase diagram” looks like a triangular region similar to the two-phase region of traffic diagrams. The data are taken for two companies (SGP and OXHP), which present different long-range correlations in the closing price value, as examined by the linearly detrended fluctuation analysis (DFA) statistical method. Substructures observed in the “phase diagram” are attributed to changes in management policy, e.g., stock splits.
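
    A compact sketch of linearly detrended fluctuation analysis is given below; it is a generic implementation applied to a synthetic random walk, not to the SGP or OXHP price series used in the paper:

      import numpy as np

      def dfa(x, scales):
          """DFA-1: linear detrending in windows of size n, RMS fluctuation vs. scale."""
          y = np.cumsum(x - np.mean(x))               # integrated profile
          F = []
          for n in scales:
              m = len(y) // n
              segs = y[:m * n].reshape(m, n)
              t = np.arange(n)
              rms = []
              for seg in segs:
                  coef = np.polyfit(t, seg, 1)        # linear (DFA-1) detrending
                  rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
              F.append(np.mean(rms))
          # slope of log F(n) vs. log n is the scaling exponent alpha
          return np.polyfit(np.log(scales), np.log(F), 1)[0]

      rng = np.random.default_rng(0)
      walk = np.cumsum(rng.standard_normal(4000))     # synthetic random walk, alpha ~ 1.5
      print(round(dfa(walk, [16, 32, 64, 128, 256]), 2))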

  6. Heart failure - medicines

    MedlinePlus

    CHF - medicines; Congestive heart failure - medicines; Cardiomyopathy - medicines; HF - medicines ... You will need to take most of your heart failure medicines every day. Some medicines are taken ...

  7. A new symbolic language for diagramming pacemaker/heart interaction.

    PubMed

    Brownlee, R R; Shimmel-Golden, J B; Del Marco, C J; Furman, S

    1982-09-01

    A new symbolic language is presented that can be used to diagram pacemaker/heart interactions. The language symbolically indicates "normally" conducted and ectopic events; pacemaker stimuli; pacemaker capture of the chamber; pacemaker stimuli triggered by "normal" or ectopic events; as well as such anomalous pacemaker/heart interactions as failure to sense; "crosstalk" between electrodes; and "normal," ectopic, or paced events in one chamber sensed by the electrode in the other chamber. In addition, symbols are provided to represent antegrade and retrograde accessory pathway conduction, and electronic and physiological refractory intervals. A common baseline is used to separate symbols for atrial activity, above the baseline, from those for ventricular activity, below the baseline. Parallel baselines are used to plot refractory intervals. Thus, even complex dual-chamber pacemaker operating modes can be represented with intrinsic and stimulated cardiac response to pacemaker operation. The language can be sketched out informally for description of general concepts or drafted on millimeter grid paper to make precise timing notations. It is especially useful for interdisciplinary communication of ideas about complex pacemaker/heart interactions. PMID:6182542

  8. Component failure data handbook

    SciTech Connect

    Gentillon, C.D.

    1991-04-01

    This report presents generic component failure rates that are used in reliability and risk studies of commercial nuclear power plants. The rates are computed using plant-specific data from published probabilistic risk assessments supplemented by selected other sources. Each data source is described. For rates with four or more separate estimates among the sources, plots show the data that are combined. The method for combining data from different sources is presented. The resulting aggregated rates are listed with upper bounds that reflect the variability observed in each rate across the nuclear power plant industry. Thus, the rates are generic. Both per hour and per demand rates are included. They may be used for screening in risk assessments or for forming distributions to be updated with plant-specific data.
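
    Purely as an illustration of rate pooling (an assumed toy scheme, not the combination method actually used in the handbook), per-demand rates from several hypothetical sources might be aggregated as:

      import statistics

      # Hypothetical per-demand failure rates for one component type, one per source.
      source_rates = [1.2e-3, 3.5e-3, 0.8e-3, 2.1e-3]

      mean_rate = statistics.mean(source_rates)   # aggregated point estimate
      upper_bound = max(source_rates)             # crude bound reflecting source-to-source spread
      print(f"{mean_rate:.1e} per demand (upper bound {upper_bound:.1e})")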

  9. The Mental Health Outcomes of Drought: A Systematic Review and Causal Process Diagram

    PubMed Central

    Vins, Holly; Bell, Jesse; Saha, Shubhayu; Hess, Jeremy J.

    2015-01-01

    Little is understood about the long term, indirect health consequences of drought (a period of abnormally dry weather). In particular, the implications of drought for mental health via pathways such as loss of livelihood, diminished social support, and rupture of place bonds have not been extensively studied, leaving a knowledge gap for practitioners and researchers alike. A systematic review of literature was performed to examine the mental health effects of drought. The systematic review results were synthesized to create a causal process diagram that illustrates the pathways linking drought effects to mental health outcomes. Eighty-two articles using a variety of methods in different contexts were gathered from the systematic review. The pathways in the causal process diagram with greatest support in the literature are those focusing on the economic and migratory effects of drought. The diagram highlights the complexity of the relationships between drought and mental health, including the multiple ways that factors can interact and lead to various outcomes. The systematic review and resulting causal process diagram can be used in both practice and theory, including prevention planning, public health programming, vulnerability and risk assessment, and research question guidance. The use of a causal process diagram provides a much needed avenue for integrating the findings of diverse research to further the understanding of the mental health implications of drought. PMID:26506367

  10. The Mental Health Outcomes of Drought: A Systematic Review and Causal Process Diagram.

    PubMed

    Vins, Holly; Bell, Jesse; Saha, Shubhayu; Hess, Jeremy J

    2015-10-01

    Little is understood about the long term, indirect health consequences of drought (a period of abnormally dry weather). In particular, the implications of drought for mental health via pathways such as loss of livelihood, diminished social support, and rupture of place bonds have not been extensively studied, leaving a knowledge gap for practitioners and researchers alike. A systematic review of literature was performed to examine the mental health effects of drought. The systematic review results were synthesized to create a causal process diagram that illustrates the pathways linking drought effects to mental health outcomes. Eighty-two articles using a variety of methods in different contexts were gathered from the systematic review. The pathways in the causal process diagram with greatest support in the literature are those focusing on the economic and migratory effects of drought. The diagram highlights the complexity of the relationships between drought and mental health, including the multiple ways that factors can interact and lead to various outcomes. The systematic review and resulting causal process diagram can be used in both practice and theory, including prevention planning, public health programming, vulnerability and risk assessment, and research question guidance. The use of a causal process diagram provides a much needed avenue for integrating the findings of diverse research to further the understanding of the mental health implications of drought. PMID:26506367

  11. Instability Regions in the Upper HR Diagram

    NASA Technical Reports Server (NTRS)

    deJager, Cornelis; Lobel, Alex; Nieuwenhuijzen, Hans; Stothers, Richard; Hansen, James E. (Technical Monitor)

    2001-01-01

    The following instability regions for blueward evolving supergiants are outlined and compared: (1) Areas in the Hertzsprung-Russell (HR) diagram where stars are dynamically unstable. (2) Areas where the effective acceleration in the upper part of the photospheres is negative, hence directed outward. (3) Areas where the sonic points of the stellar wind (where wind velocity = sound velocity) are situated inside the photospheres, at a level deeper than tau(sub Ross) = 0.01. We compare the results with the positions of actual stars in the HR diagram and we find evidence that the recent strong contraction of the yellow hypergiant HR8752 was initiated in a period during which (g(sub eff)) was less than 0, whereupon the star became dynamically unstable. The instability and extreme shells around IRC+10420 are suggested to be related to three factors: (g(sub eff)) is less than 0; the sonic point is situated inside the photosphere; and the star is dynamically unstable.

  12. Non-equilibrium phases and phase diagrams

    SciTech Connect

    Massalski, T.B.; Rizzo, H.F.

    1988-03-01

    In this paper we consider the degree of usefulness of the phase diagram and the related thermodynamics in predicting and understanding the formation of metastable phases during quenching, or during low-temperature solid-state interdiffusion, or during co-deposition. Recent research has demonstrated that many such metastable phases are formed because the more stable intermediate phases that are favored thermodynamically are nevertheless bypassed kinetically. The kinetic elimination of intermediate phases provides conditions where a metastable equilibrium can be established at low temperatures between the supercooled liquid and the terminal solid solutions, leading to metastable partitioned two-phase regions. Alternatively, the range of the metastable phases may be governed by the T/sub 0/ principle related to the crossover of the respective free energy curves, or may be controlled mainly by kinetic considerations. Which particular thermodynamic conditions apply appears to depend on the initial form of the phase diagram and the specific technique used. The occurrence of massive transformations also is discussed. 34 refs., 10 figs.

  13. Phase diagrams of disordered Weyl semimetals

    NASA Astrophysics Data System (ADS)

    Shapourian, Hassan; Hughes, Taylor L.

    2016-02-01

    Weyl semimetals are gapless quasitopological materials with a set of isolated nodal points forming their Fermi surface. They manifest their quasitopological character in a series of topological electromagnetic responses including the anomalous Hall effect. Here, we study the effect of disorder on Weyl semimetals while monitoring both their nodal/semimetallic and topological properties through computations of the localization length and the Hall conductivity. We examine three different lattice tight-binding models which realize the Weyl semimetal in part of their phase diagram and look for universal features that are common to all of the models, and interesting distinguishing features of each model. We present detailed phase diagrams of these models for large system sizes and we find that weak disorder preserves the nodal points up to the diffusive limit, but does affect the Hall conductivity. We show that the trend of the Hall conductivity is consistent with an effective picture in which disorder causes the Weyl nodes to move within the Brillouin zone along a specific direction that depends deterministically on the properties of the model and the neighboring phases to the Weyl semimetal phase. We also uncover an unusual (nonquantized) anomalous Hall insulator phase which can only exist in the presence of disorder.

  14. Isolated pulsar spin evolution on the P-Ṗ diagram

    NASA Astrophysics Data System (ADS)

    Ridley, J. P.; Lorimer, D. R.

    2010-05-01

    We look at two contrasting spin-down models for isolated radio pulsars and, accounting for selection effects, synthesize observable populations. While our goal is to reproduce all of the observable characteristics, in this paper we pay particular attention to the form of the spin period versus period derivative (P-Ṗ) diagram and its dependence on various pulsar properties. We analyse the initial spin period, the braking index, the magnetic field, various beaming models as well as the pulsar's luminosity. In addition to considering the standard magnetic dipole model for pulsar spin-down, we also consider the recent hybrid model proposed by Contopoulos and Spitkovsky. The magnetic dipole model, however, does a better job of reproducing the observed pulsar population. We conclude that random alignment angles and period-dependent luminosity distributions are essential to reproduce the observed P-Ṗ diagram. We also consider the time decay of alignment angles and attempt to reconcile various models currently being studied. We conclude that in order to account for recent evidence for the alignment found by Weltevrede and Johnston, the braking torque on a neutron star should not depend strongly on the inclination. Our simulation code is publicly available and includes a web-based interface to examine the results and make predictions for yields of current and future surveys.
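
    For reference, the standard magnetic-dipole spin-down relations usually invoked in this context are quoted below in their textbook form; sign and constant conventions vary between treatments, and these are not specific to the models compared in the paper:

      % Textbook magnetic-dipole spin-down relations (conventions vary)
      \dot{\nu} = -K\,\nu^{\,n}, \quad n = 3 \ \text{for a pure dipole}, \qquad
      B_{\mathrm{surf}} \simeq 3.2\times 10^{19}\,\sqrt{P\dot{P}}\ \mathrm{G}, \qquad
      \tau_{c} = \frac{P}{2\dot{P}}.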

  15. Revisiting the phase diagram of hard ellipsoids

    NASA Astrophysics Data System (ADS)

    Odriozola, Gerardo

    2012-04-01

    In this work, the well-known Frenkel-Mulder phase diagram of hard ellipsoids of revolution [D. Frenkel and B. M. Mulder, Mol. Phys. 55, 1171 (1985), 10.1080/00268978500101971] is revisited by means of replica exchange Monte Carlo simulations. The method provides good sampling of dense systems and so, solid phases can be accessed without the need of imposing a given structure. At high densities, we found plastic solids and fcc-like crystals for semi-spherical ellipsoids (prolates and oblates), and SM2 structures [P. Pfleiderer and T. Schilling, Phys. Rev. E 75, 020402 (2007)] for x : 1-prolates and 1 : x-oblates with x ≥ 3. The revised fluid-crystal and isotropic-nematic transitions reasonably agree with those presented in the Frenkel-Mulder diagram. An interesting result is that, for small system sizes (100 particles), we obtained 2:1- and 1.5:1-prolate equations of state without transitions, while some order is developed at large densities. Furthermore, the symmetric oblate cases are also reluctant to form ordered phases.

  16. Revisiting the phase diagram of hard ellipsoids.

    PubMed

    Odriozola, Gerardo

    2012-04-01

    In this work, the well-known Frenkel-Mulder phase diagram of hard ellipsoids of revolution [D. Frenkel and B. M. Mulder, Mol. Phys. 55, 1171 (1985)] is revisited by means of replica exchange Monte Carlo simulations. The method provides good sampling of dense systems and so, solid phases can be accessed without the need of imposing a given structure. At high densities, we found plastic solids and fcc-like crystals for semi-spherical ellipsoids (prolates and oblates), and SM2 structures [P. Pfleiderer and T. Schilling, Phys. Rev. E 75, 020402 (2007)] for x : 1-prolates and 1 : x-oblates with x ≥ 3. The revised fluid-crystal and isotropic-nematic transitions reasonably agree with those presented in the Frenkel-Mulder diagram. An interesting result is that, for small system sizes (100 particles), we obtained 2:1- and 1.5:1-prolate equations of state without transitions, while some order is developed at large densities. Furthermore, the symmetric oblate cases are also reluctant to form ordered phases. PMID:22482570

  17. Automated D/3 to Visio Analog Diagrams

    Energy Science and Technology Software Center (ESTSC)

    2000-08-10

    ADVAD1 reads an ASCII file containing the D/3 DCS MDL input for analog points for a D/3 continuous database. It uses the information in the files to create a series of Visio files representing the structure of each analog chain, one drawing per Visio file. The actual drawing function is performed by Visio (requires Visio version 4.5+). The user can configure the program to select which fields in the database are shown on the diagram and how the information is to be presented. This gives a visual representation of the structure of the analog chains, showing selected fields in a consistent manner. Updating documentation can be done easily, and the automated approach eliminates human error in the drafting process. The program can also create the drawings far faster than a human operator, producing approximately 270 typical diagrams in about 8 minutes on a Pentium II 400 MHz PC. The program allows for multiple option sets to be saved to provide different settings (i.e., different fields, different field presentations, and/or different diagram layouts) for various scenarios or facilities on one workstation. Option sets may be exported from the Windows registry to allow duplication of settings on another workstation.

  18. Critical point analysis of phase envelope diagram

    SciTech Connect

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy Puspita, Dila Sidarto, Kuntjoro A. Soewono, Edy; Gunawan, Agus Y.

    2014-03-24

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir studies. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated. Meanwhile, the dew point is the first point at which liquid is formed when the gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and dissolution of certain chemicals. In this paper, we determine the critical point analytically. The result is then compared with numerical calculations based on the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
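
    A minimal Newton-Raphson sketch for the vapor-like compressibility root of the Peng-Robinson equation, written in its standard cubic form in Z, is shown below; the dimensionless A and B values are arbitrary illustrative inputs, not taken from the paper's mixtures:

      # Solve Z^3 - (1-B)Z^2 + (A - 2B - 3B^2)Z - (AB - B^2 - B^3) = 0 by Newton-Raphson,
      # starting from Z = 1 so the iteration lands on the vapor-like root.
      def pr_vapor_Z(A, B, Z=1.0, tol=1e-10):
          for _ in range(50):
              f  = Z**3 - (1 - B)*Z**2 + (A - 2*B - 3*B**2)*Z - (A*B - B**2 - B**3)
              df = 3*Z**2 - 2*(1 - B)*Z + (A - 2*B - 3*B**2)
              step = f / df
              Z -= step
              if abs(step) < tol:
                  break
          return Z

      print(round(pr_vapor_Z(A=0.05, B=0.01), 4))   # ~0.96 for this arbitrary choice of A, B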

  19. Recent Results in Ring-Diagram Analysis

    NASA Astrophysics Data System (ADS)

    Rabello-Soares, M. C.

    2013-12-01

    The ring-diagram technique was developed by Frank Hill 25 years ago and matured quickly during the late 1990s. It is nowadays one of the most commonly used techniques in local helioseismology. The method consists in the power spectral analysis of solar acoustic oscillations on small regions (2° to 30°) of the solar surface. The power spectrum resembles a set of trumpets nested inside each other and, at a given frequency, it looks like a ring, hence the technique's name. It provides information on the horizontal flow field and the thermodynamic structure in the layers immediately below the photosphere. With data regularly provided by MDI, GONG, and more recently HMI, many important results have been achieved. In recent years, these results include estimates of the meridional circulation and its evolution with the solar cycle; flows associated with active regions, as well as flow divergence and vorticity; and the thermal structure beneath and around active regions. Much progress is expected with the data now provided by HMI's high spatial resolution observations and high duty cycle. There are two data processing pipelines (GONG and HMI) providing free access to the data and the results of the ring-diagram analysis. Here we discuss the most recent results and improvements in the technique, as well as the many challenges that still remain.

  20. Phase diagram of quantum square ice

    NASA Astrophysics Data System (ADS)

    Henry, Louis-Paul; Holdsworth, Peter; Mila, Frederic; Roscilde, Tommaso

    2013-03-01

    We have investigated the ground-state and finite-temperature phase diagram of quantum square ice - realized by the transverse-field Ising model on a checkerboard lattice - using both linear spin-wave (LSW) theory and quantum Monte Carlo (QMC). We generalize the model with different couplings between nearest (J1) and next-to-nearest (J2) neighbors on the checkerboard lattice. Our QMC approach generalizes the loop algorithm - very efficient in the study of constrained classical systems - to a "brane algorithm" for quantum systems. At the LSW level the vast degeneracy of the ground state for J1 = J2 and J2 > J1 remains intact; moreover, LSW theory breaks down in extended regions of the phase diagram, pointing at non-classical states. Our QMC study goes beyond perturbative schemes and addresses directly the nature of the low-temperature phases. We have critically examined the possibility of a resonating-plaquette state for J1 = J2, suggested by degenerate perturbation theory on the ice-rule manifold for weak fields. Our QMC results for finite fields confirm the absence of Néel or collinear order, but they do not confirm the presence of resonating-plaquette order, pointing at a possibly more complex non-classical state.

  1. Critical point analysis of phase envelope diagram

    NASA Astrophysics Data System (ADS)

    Soetikno, Darmadi; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Siagian, Ucok W. R.; Soewono, Edy; Gunawan, Agus Y.

    2014-03-01

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir studies. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated. Meanwhile, the dew point is the first point at which liquid is formed when the gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and dissolution of certain chemicals. In this paper, we determine the critical point analytically. The result is then compared with numerical calculations based on the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.

  2. Identifying Liquid-Gas System Misconceptions and Addressing Them Using a Laboratory Exercise on Pressure-Temperature Diagrams of a Mixed Gas Involving Liquid-Vapor Equilibrium

    ERIC Educational Resources Information Center

    Yoshikawa, Masahiro; Koga, Nobuyoshi

    2016-01-01

    This study focuses on students' understandings of a liquid-gas system with liquid-vapor equilibrium in a closed system using a pressure-temperature ("P-T") diagram. By administrating three assessment questions concerning the "P-T" diagrams of liquid-gas systems to students at the beginning of undergraduate general chemistry…

  3. Effects of Student-Generated Diagrams versus Student-Generated Summaries on Conceptual Understanding of Causal and Dynamic Knowledge in Plate Tectonics.

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Clement, John J.

    1999-01-01

    Grade five students' (n=58) conceptual understanding of plate tectonics was measured by analysis of student-generated summaries and diagrams, and by posttest assessment of both the spatial/static and causal/dynamic aspects of the domain. The diagram group outperformed the summary and text-only groups on the posttest measures. Discusses the effects…

  4. A Markov Model for Assessing the Reliability of a Digital Feedwater Control System

    SciTech Connect

    Chu,T.L.; Yue, M.; Martinez-Guridi, G.; Lehner, J.

    2009-02-11

    A Markov approach has been selected to represent and quantify the reliability model of a digital feedwater control system (DFWCS). The system state, i.e., whether the system fails or not, is determined by the status of the components, which can be characterized by component failure modes. Starting from the system state with no component failures, the possible transitions out of it are the failure modes of all components in the system. Each additional component failure mode defines a different system state that may or may not be a system failure state. The Markov transition diagram is developed by strictly following the sequences of component failures (i.e., failure sequences), because different orders of the same set of failures may affect the system in completely different ways. The formulation and quantification of the Markov model, together with the proposed FMEA (Failure Modes and Effects Analysis) approach and the development of the supporting automated FMEA tool, are considered the three major elements of a generic conceptual framework under which the reliability of digital systems can be assessed.
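
    A heavily simplified sketch of the Markov idea (two hypothetical components with made-up failure rates, far smaller than the DFWCS model) is shown below; the state probabilities evolve according to dp/dt = pQ and are integrated here with forward Euler:

      import numpy as np

      lam_a, lam_b = 1e-3, 2e-3           # failure rates per hour (invented)
      # States: 0 = all OK, 1 = A failed, 2 = B failed, 3 = A and B failed (system failure)
      Q = np.array([
          [-(lam_a + lam_b), lam_a,   lam_b,   0.0   ],
          [0.0,              -lam_b,  0.0,     lam_b ],
          [0.0,              0.0,     -lam_a,  lam_a ],
          [0.0,              0.0,     0.0,     0.0   ],
      ])

      p = np.array([1.0, 0.0, 0.0, 0.0])  # start with everything working
      dt, t_end = 0.1, 1000.0             # hours
      for _ in range(int(t_end / dt)):    # forward-Euler integration of dp/dt = p Q
          p = p + dt * (p @ Q)

      # Probability of the system-failure state; for independent failures this is
      # approximately (1 - exp(-lam_a*t)) * (1 - exp(-lam_b*t)) ~ 0.55 here.
      print(round(p[3], 3))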

  5. The Diagram as Story: Unfolding the Event-Structure of the Mathematical Diagram

    ERIC Educational Resources Information Center

    de Freitas, Elizabeth

    2012-01-01

    This paper explores the role of narrative in decoding diagrams. I focus on two fundamental facets of narrative: (1) the recounting of causally related sequences of events, and (2) the positioning of the narrator through point-of-view and voice. In the first two sections of the paper I discuss philosophical and semiotic frameworks for making sense…

  6. Diagrams: A Visual Survey of Graphs, Maps, Charts and Diagrams for the Graphic Designer.

    ERIC Educational Resources Information Center

    Lockwood, Arthur

    Since the ultimate success of any diagram rests in its clarity, it is important that the designer select a method of presentation which will achieve this aim. He should be aware of the various ways in which statistics can be shown diagrammatically, how information can be incorporated in maps, and how events can be plotted in chart or graph form.…

  7. Advanced Heart Failure

    MedlinePlus

    ... High Blood Pressure Tools & Resources Stroke More Advanced Heart Failure Updated:Oct 8,2015 When heart failure (HF) ... content was last reviewed on 04/06/2015. Heart Failure • Home • About Heart Failure • Causes and Risks for ...

  8. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data, except in structural applications where interlaminar stresses are important, since these may cause failure mechanisms such as debonding or delamination.
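
    For orientation, two of the ply-level criteria mentioned above are quoted below in their commonly cited plane-stress forms (index 1 denotes the fiber direction; the allowables are assumed known); these are generic textbook statements, not necessarily the exact forms implemented in COMET:

      % Maximum strain criterion: failure when any strain component exceeds its allowable
      \frac{\varepsilon_{11}}{\varepsilon_{11}^{\mathrm{allow}}} \ge 1
      \quad\text{or}\quad
      \frac{\varepsilon_{22}}{\varepsilon_{22}^{\mathrm{allow}}} \ge 1
      \quad\text{or}\quad
      \frac{\gamma_{12}}{\gamma_{12}^{\mathrm{allow}}} \ge 1
      % Hashin fiber-tension mode (\sigma_{11} > 0), commonly quoted plane-stress form
      \left(\frac{\sigma_{11}}{X_{T}}\right)^{2} + \left(\frac{\tau_{12}}{S_{L}}\right)^{2} \ge 1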

  9. Global Phase Diagram in Layered Organic Conductors

    NASA Astrophysics Data System (ADS)

    Chen, Yan; Guo, Yaowu

    2014-03-01

    Layered organic superconductors serve as model systems for Mott physics with geometrical frustration. The global phase diagram of such a system is obtained by using the Gutzwiller variational method to study a Hubbard model including a spin exchange coupling term. Five possible candidate ground states are obtained, including a spin liquid insulating state at large on-site Coulomb repulsion U and large lattice frustration t'/t, an antiferromagnetic state at large U and small t'/t, two Gossamer superconducting states at medium U with either gapless d(x²-y²)-wave (small t'/t) or gapped d+id-wave symmetry (large t'/t), and a metallic Fermi liquid state at small U. Moreover, we study the evolution of the double occupancy number d in terms of different U and t'/t parameters, mimicking the pressure effect. Our results are qualitatively consistent with the main experimental results in organic superconductors.

  10. Phase diagrams of bosonic ABn chains

    NASA Astrophysics Data System (ADS)

    Cruz, G. J.; Franco, R.; Silva-Valencia, J.

    2016-04-01

    The AB(N-1) chain is a system that consists of a repeating unit cell with N sites, where there is an energy difference of λ between the A and B sites. We considered bosons in these special lattices and took into account the kinetic energy, the local two-body interaction, and the inhomogeneous local energy in the Hamiltonian. We found charge density wave (CDW), superfluid, and Mott insulator phases, and constructed the phase diagram for N = 2 and 3 in the thermodynamic limit. The system exhibits insulator phases for densities ρ = α/N, with α being an integer. We obtained that superfluid regions separate the insulator phases for densities larger than one. For any N value, we found that for integer densities ρ, the system exhibits ρ + 1 insulator phases: a Mott insulator phase and ρ CDW phases. For non-integer densities larger than one, several CDW phases appear.

  11. The Phase Diagram of Superionic Ice

    NASA Astrophysics Data System (ADS)

    Sun, Jiming; Clark, Bryan; Car, Roberto

    2014-03-01

    Using the variable cell Car-Parrinello molecular dynamics method, we study the phase diagram of superionic ice from 200GPa to 2.5TPa. We present evidence that at very high pressure the FCC structure of the oxygen sublattice may become unstable allowing for a new superionic ice phase, in which the oxygen sublattice takes the P21 structure found in zero-temperature total energy calculations. We also report on how the melting temperature of the hydrogen sublattice is affected by this new crystalline structure of the oxygen sublattice. This work was supported by the NSF under grant DMS-1065894(J.S. and R.C.) and PHY11-25915(B.C.).

  12. Phase diagram of a Schelling segregation model

    NASA Astrophysics Data System (ADS)

    Gauvin, L.; Vannimenus, J.; Nadal, J.-P.

    2009-07-01

    The collective behavior in a variant of Schelling’s segregation model is characterized with methods borrowed from statistical physics, in a context where their relevance was not conspicuous. A measure of segregation based on cluster geometry is defined, and several quantities analogous to those used to describe physical lattice models at equilibrium are introduced. This physical approach allows one to distinguish quantitatively between several regimes and to characterize the transitions between them, leading to the construction of a phase diagram. Some of the transitions evoke empirical sudden ethnic turnovers. We also establish links with ‘spin-1’ models in physics. Our approach provides generic tools to analyze the dynamics of other socio-economic systems.
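
    A bare-bones Schelling-type update rule is sketched below to make the model class concrete; it is a generic variant with arbitrary parameters, not the specific variant or the physics-style observables studied in the paper:

      import random

      N, threshold, steps = 30, 0.5, 20000          # grid size, tolerance, update attempts
      n_empty = N * N // 10
      n_each = (N * N - n_empty) // 2
      cells = [1] * n_each + [2] * n_each + [0] * (N * N - 2 * n_each)   # 0 = empty site
      random.shuffle(cells)
      grid = [cells[i * N:(i + 1) * N] for i in range(N)]

      def unhappy(i, j):
          me = grid[i][j]
          nbrs = [grid[(i + di) % N][(j + dj) % N]
                  for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
          occupied = [n for n in nbrs if n != 0]
          return bool(occupied) and sum(n == me for n in occupied) / len(occupied) < threshold

      for _ in range(steps):                        # random sequential update
          i, j = random.randrange(N), random.randrange(N)
          if grid[i][j] != 0 and unhappy(i, j):
              ei, ej = random.randrange(N), random.randrange(N)
              if grid[ei][ej] == 0:                 # move the unhappy agent to an empty site
                  grid[ei][ej], grid[i][j] = grid[i][j], 0

      like = total = 0                              # like-neighbor fraction as a crude segregation measure
      for i in range(N):
          for j in range(N):
              if grid[i][j] == 0:
                  continue
              for di, dj in ((0, 1), (1, 0)):
                  n = grid[(i + di) % N][(j + dj) % N]
                  if n != 0:
                      total += 1
                      like += (n == grid[i][j])
      print(round(like / total, 2))                 # rises above ~0.5 as segregation develops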

  13. Reentrant Phase Diagram of Network Fluids

    NASA Astrophysics Data System (ADS)

    Russo, J.; Tavares, J. M.; Teixeira, P. I. C.; Telo da Gama, M. M.; Sciortino, F.

    2011-02-01

    We introduce a microscopic model for particles with dissimilar patches which displays an unconventional “pinched” phase diagram, similar to the one predicted by Tlusty and Safran in the context of dipolar fluids [Science 290, 1328 (2000)]. The model—based on two types of patch interactions, which account, respectively, for chaining and branching of the self-assembled networks—is studied both numerically via Monte Carlo simulations and theoretically via first-order perturbation theory. The dense phase is rich in junctions, while the less-dense phase is rich in chain ends. The model provides a reference system for a deep understanding of the competition between condensation and self-assembly into equilibrium-polymer chains.

  14. Phase diagram in the entanglement PNJL model

    NASA Astrophysics Data System (ADS)

    Friesen, A.; Kalinovsky, Y.; Toneev, V.

    2016-01-01

    Effects of the vector interaction in the Nambu-Jona-Lasinio model with Polyakov loop are studied in combination with the entanglement interaction between the quark and pure gauge sectors. We investigate the QCD phase diagram and find that the first-order chiral phase transition at finite baryon chemical potential and its critical endpoint disappear for sufficiently large values of the vector interaction constant Gv. The presence of an entanglement interaction between the quark and pure gauge sectors leads to an increase in the value of Gv for which the first-order transition disappears. The influence of a nonzero Gv on the curvature of the crossover boundary in the T - μ plane near μ = 0 is also examined for both cases.

  15. Expression of Superparamagnetic Particles on FORC Diagrams

    NASA Astrophysics Data System (ADS)

    Hirt, A. M.; Kumari, M.; Crippa, F.; Petri-Fink, A.

    2015-12-01

    Identification of superparamagnetic (SP) particles in natural materials provides information on processes that lead to the new formation or dissolution of iron oxides. SP particles express themselves on first-order reversal curve (FORC) diagrams as a distribution centered near the origin of the diagram. Pike et al. (2001, GJI, 145, 721) demonstrated that thermal relaxation produces an upward shift in the FORC distribution, and attributed this to a pause encountered at each reversal field. In this study we examine the relationship between this upward shift and particle size on two sets of synthetic iron oxide nanoparticles. One set of coated magnetite particles has well-constrained particle sizes of 9, 16 and 20 nm in diameter. A second set, from the FeraSpin™ Series and consisting of FeraSpinXS, M and XL, was also evaluated. Rock magnetic experiments indicate that the first set of samples is exclusively magnetite, whereas the FeraSpin samples contain predominantly magnetite with some degree of oxidation. Samples from both sets show that the upward shift of the FORC distribution at the origin increases with decreasing particle size. The amount of shift in the FeraSpin series is less when compared to the samples from the first set. This is attributed to the effect of interaction, which counteracts the thermal relaxation behavior of the SP particles. The FeraSpin series also shows a broader FORC distribution on the vertical axis that appears to be related to non-saturation of the hysteresis curve at the maximum applied field. This non-saturation behavior can be due to spins of very fine particles or to oxidation to hematite. AC susceptibility at low temperature indicates that particle interaction may affect the effective magnetic particle size. Our results suggest that the FORC distribution in pure SP particle systems provides information on the particle size distribution or oxidation, which can be further evaluated with low-temperature techniques.
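
    For reference, the FORC distribution underlying such diagrams is the standard mixed second derivative of the magnetization measured on first-order reversal curves, quoted here in its usual form (conventions for the rotated coordinates vary):

      % H_a = reversal field, H_b = measurement field; (H_c, H_u) are the rotated coordinates
      \rho(H_a, H_b) = -\frac{1}{2}\,\frac{\partial^{2} M(H_a, H_b)}{\partial H_a\,\partial H_b},
      \qquad H_c = \frac{H_b - H_a}{2}, \quad H_u = \frac{H_b + H_a}{2}.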

  16. Steam generator tube failures

    SciTech Connect

    MacDonald, P.E.; Shah, V.N.; Ward, L.W.; Ellison, P.G.

    1996-04-01

    A review and summary of the available information on steam generator tubing failures and the impact of these failures on plant safety is presented. The following topics are covered: pressurized water reactor (PWR), Canadian deuterium uranium (CANDU) reactor, and Russian water moderated, water cooled energy reactor (VVER) steam generator degradation, PWR steam generator tube ruptures, the thermal-hydraulic response of a PWR plant with a faulted steam generator, the risk significance of steam generator tube rupture accidents, tubing inspection requirements and fitness-for-service criteria in various countries, and defect detection reliability and sizing accuracy. A significant number of steam generator tubes are defective and are removed from service or repaired each year. This widespread damage has been caused by many diverse degradation mechanisms, some of which are difficult to detect and predict. In addition, spontaneous tube ruptures have occurred at the rate of about one every 2 years over the last 20 years, and incipient tube ruptures (tube failures usually identified with leak detection monitors just before rupture) have been occurring at the rate of about one per year. These ruptures have caused complex plant transients which have not always been easy for the reactor operators to control. Our analysis shows that if more than 15 tubes rupture during a main steam line break, the system response could lead to core melting. Although spontaneous and induced steam generator tube ruptures are small contributors to the total core damage frequency calculated in probabilistic risk assessments, they are risk significant because the radionuclides are likely to bypass the reactor containment building. The frequency of steam generator tube ruptures can be significantly reduced through appropriate and timely inspections and repairs or removal from service.

  17. Preparing for Failure

    SciTech Connect

    Murphy, L.T.

    2006-07-01

    Risk management is one of the most complex project management processes, requiring rigorous management and discipline. Unfortunately, for many organizations, the risk management process has become contaminated by poor management practices, an absence of meaningful risk assessments, meaningless risk event descriptions, incomplete and vague risk impact analyses, poor follow-through on risk mitigation activities and a general lack of attention to accuracy, completeness and quality. At this point, the risk register, instead of being a key tool used by the organization to systematically identify and eliminate risk while exploiting opportunities, has become a list of pre-prepared excuses based on the repetition of failures encountered on past projects. However, organizations are not condemned to repeat past failures. By returning to the basics of risk management, and through the application of some basic management guidelines, the risk register, instead of being an 'Excuse Register', can become the cornerstone of a comprehensive risk management program to promote a systematic, pro-active approach within an organization that will result in accomplishing mitigation activities, reducing risk and gaining advantage through opportunities. (authors)

  18. Thermodynamic Venn diagrams: Sorting out forces, fluxes, and Legendre transforms

    NASA Astrophysics Data System (ADS)

    Kerr, W. C.; Macosko, J. C.

    2011-09-01

    We show how to use a Venn diagram to illuminate the relations among the different thermodynamic potentials, forces, and fluxes of a simple system. A single diagram shows all of the thermodynamic potentials obtainable by Legendre transformations starting from the internal energy as the fundamental potential. From the diagram, we can also read off the Maxwell relations deduced from each of these potentials. We construct a second Venn diagram that shows the analogous information for the Massieu functions, obtained by Legendre transformations starting from the entropy as the fundamental thermodynamic function.
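
    As a concrete illustration of the Legendre-transform structure such a diagram encodes (standard relations, stated here for orientation rather than quoted from the paper), starting from the internal energy U(S, V) one obtains

      H = U + pV, \qquad F = U - TS, \qquad G = U - TS + pV,

    and each potential yields a Maxwell relation; for example, from dG = -S\,dT + V\,dp,

      \left( \frac{\partial S}{\partial p} \right)_T = -\left( \frac{\partial V}{\partial T} \right)_p .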

  19. 54. PHOTOCOPY OF DRAWING AMMONIA LEACHING PLANT FLOW DIAGRAM, REPRESENTING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    54. PHOTOCOPY OF DRAWING AMMONIA LEACHING PLANT FLOW DIAGRAM, REPRESENTING ONE COMPLETE CYCLE - Kennecott Copper Corporation, On Copper River & Northwestern Railroad, Kennicott, Valdez-Cordova Census Area, AK

  20. 55. PHOTOCOPY OF DRAWING AMMONIA LEACHING PLANT FLOW DIAGRAM, REPRESENTING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    55. PHOTOCOPY OF DRAWING AMMONIA LEACHING PLANT FLOW DIAGRAM, REPRESENTING ONE COMPLETE CYCLE - Kennecott Copper Corporation, On Copper River & Northwestern Railroad, Kennicott, Valdez-Cordova Census Area, AK

  1. 53. PHOTOCOPY OF DRAWING AMMONIA LEACHING PLANT FLOW DIAGRAM, REPRESENTING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    53. PHOTOCOPY OF DRAWING AMMONIA LEACHING PLANT FLOW DIAGRAM, REPRESENTING ONE COMPLETE CYCLE - Kennecott Copper Corporation, On Copper River & Northwestern Railroad, Kennicott, Valdez-Cordova Census Area, AK

  2. Massive basketball diagram for a thermal scalar field theory

    SciTech Connect

    Andersen, Jens O.; Braaten, Eric; Strickland, Michael

    2000-08-15

    The "basketball diagram" is a three-loop vacuum diagram for a scalar field theory that cannot be expressed in terms of one-loop diagrams. We calculate this diagram for a massive scalar field at nonzero temperature, reducing it to expressions involving three-dimensional integrals that can be easily evaluated numerically. We use this result to calculate the free energy for a massive scalar field with a φ^4 interaction to three-loop order. (c) 2000 The American Physical Society.

  3. Integrated 3-parameter diagram for determining thermodynamic properties of fluids

    NASA Astrophysics Data System (ADS)

    Zhao, Guochang; Deng, Xiaoxue; Zhu, Mingshan

    1987-04-01

    The importance of the thermodynamic properties of fluids has motivated recent studies on methods of calculating them. Among the various methods, computational diagrams are a commonly used engineering tool. Conventional diagrams, however, do not take into consideration the internal relationships among the various thermodynamic properties; here, those internal relationships are considered. The Lee-Kesler three-parameter equations were used to construct an integrated three-parameter diagram for determining the thermodynamic properties of fluids; the curves were generated using an ai-M/6 microcomputer with an attached Sr 6602 plotter. The diagram is considered sufficiently accurate for engineering calculations.

  4. Oak Ridge K-25 Site Technology Logic Diagram. Volume 2, Technology Logic Diagrams

    SciTech Connect

    Fellows, R.L.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration and waste management problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remedial action, and decontamination and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. This volume, Volume 2, contains logic diagrams with an index. Volume 3 has been divided into two separate volumes to facilitate handling and use.

  5. Depression and congestive heart failure.

    PubMed

    Guck, Thomas P; Elsasser, Gary N; Kavan, Michael G; Barone, Eugene J

    2003-01-01

    The prevalence rates of depression in congestive heart failure patients range from 24%-42%. Depression is a graded, independent risk factor for readmission to the hospital, functional decline, and mortality in patients with congestive heart failure. Physicians can assess depression by using the SIG E CAPS + mood mnemonic, or any of a number of easily administered and scored self-report inventories. Cognitive-behavior therapy is the preferred psychological treatment. Cognitive-behavior therapy emphasizes the reciprocal interactions among physiology, environmental events, thoughts, and behaviors, and how these may be altered to produce changes in mood and behavior. Pharmacologically, the selective serotonin reuptake inhibitors are recommended, whereas the tricyclic antidepressants are not recommended for depression in congestive heart failure patients. The combination of a selective serotonin reuptake inhibitor with cognitive-behavior therapy is often the most effective treatment. PMID:12826775

  6. The potential failure of Monte Nuovo at Ischia Island (Southern Italy): numerical assessment of a likely induced tsunami and its effects on a densely inhabited area

    NASA Astrophysics Data System (ADS)

    Zaniboni, F.; Pagnoni, G.; Tinti, S.; Della Seta, M.; Fredi, P.; Marotta, E.; Orsi, G.

    2013-11-01

    Ischia is the emergent top of a large volcanic complex that rises more than 1,000 m above the sea floor, at the north-western end of the Gulf of Naples. Caldera resurgence in the central part of the island has resulted in the formation of differentially displaced blocks, among which Mt. Epomeo (787 m a.s.l.) is the most uplifted. Deformation and slope instability have been recognised as common features induced by a block resurgence mechanism that causes uplift and favours gravitational loading and flank failure. The Monte Nuovo block, a topographic high on the north-western flank of Mt. Epomeo, has recently been interpreted as a block affected by deep-seated gravitational slope deformation. This block may undergo a catastrophic failure in the case of renewal of magmatic activity. This paper investigates the potential failure of the Monte Nuovo block as a rockslide-debris avalanche, the consequent tsunami generation and wave propagation, and discusses the catastrophic effects of such an event. Mobilization-prone volume has been estimated at about 160·10⁶ m³ and would move from a maximum elevation of 400 m a.s.l. The landslide itself would sweep away a densely populated territory as large as 3.5 km². The highest waves generated by the tsunami, on which this paper is mainly focussed, would hit the northern and western shores of Ischia. However, the high coast would prevent inundation and limit devastation to beaches, harbours and surrounding areas. Most of the tsunami energy would head towards the north-east, hitting the Campania coast. Severe inundation would affect an area of up to 20 km² around the mouth of the Volturno river, including the urban area of Castel Volturno. In contrast, less energy would travel towards the south, and the Gulf of Naples would be perturbed by long persisting waves of limited damaging potential.

  7. Assessment of predictive models for the failure of titanium and ferrous alloys due to hydrogen effects. Report for the period of June 16 to September 15, 1981

    SciTech Connect

    Archbold, T.F.; Bower, R.B.; Polonis, D.H.

    1982-04-01

    The 1977 version of the Simpson-Puls-Dutton model appears to be the most amenable with respect to utilizing known or readily estimated quantities. The Pardee-Paton model requires extensive calculations involving estimated quantities. Recent observations by Koike and Suzuki on vanadium support the general assumption that crack growth in hydride forming metals is determined by the rate of hydride formation, and their hydrogen atmosphere-displacive transformation model is of potential interest in explaining hydrogen embrittlement in ferrous alloys as well as hydride formers. The discontinuous nature of cracking due to hydrogen embrittlement appears to depend very strongly on localized stress intensities, thereby pointing to the role of microstructure in influencing crack initiation, fracture mode and crack path. The initiation of hydrogen induced failures over relatively short periods of time can be characterized with fair reliability using measurements of the threshold stress intensity. The experimental conditions for determining K_Th and ΔK_Th are designed to ensure plane strain conditions in most cases. Plane strain test conditions may be viewed as a conservative basis for predicting delayed failure. The physical configuration of nuclear waste canisters may involve elastic/plastic conditions rather than a state of plane strain, especially with thin-walled vessels. Under these conditions, alternative predictive tests may be considered, including COD and R-curve methods. The double cantilever beam technique employed by Boyer and Spurr on titanium alloys offers advantages for examining hydrogen induced delayed failure over long periods of time. 88 references. (DLC)
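
    For reference, the plane-strain requirement discussed above is usually checked against the standard thickness criterion of linear-elastic fracture mechanics (a textbook rule of thumb, not a value taken from this report),

      B \ge 2.5 \left( \frac{K}{\sigma_{ys}} \right)^2 ,

    where B is the specimen or wall thickness, K the applied stress intensity (here the threshold value K_Th) and \sigma_{ys} the yield strength. Thin-walled canisters that violate this bound fall into the elastic/plastic regime, which is why COD and R-curve methods are mentioned as alternatives.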

  8. Left atrioventricular remodeling in the assessment of the left ventricle diastolic function in patients with heart failure: a review of the currently studied echocardiographic variables

    PubMed Central

    Danzmann, Luiz C; Bodanese, Luiz Carlos; Köhler, Ilmar; Torres, Marco R

    2008-01-01

    Multiparametric echocardiographic imaging of the failing heart is increasingly used and useful in decision making in heart failure. The reasons for this lie in the need for different strategies for managing these patients, such as differentiating systolic from diastolic dysfunction, as well as in the range of approaches available, such as percutaneous and surgical revascularization, device implantation, and correction of valvular regurgitation and stenosis. Congestive heart failure in patients with normal left ventricular diameters or preserved left ventricular ejection fraction has recently been reported in as many as 40 to 50 percent of cases of heart failure, mainly due to epidemics in developed countries, such as poorly controlled metabolic states (obesity and diabetes), but also due to the situation in developing countries, such as the hypertension epidemic and its lack of adequate control. As a matter of public utility, guidelines for the diagnosis and treatment of such patients will have to be cheap, available, and easily reproducible, and should ideally furnish answers to the clinician's questions not in a binary "black or white" manner but with graduations, so that the assessment is, if possible, quantitative. The present paper aims to focus on the current clinical applications of tissue Doppler and of left atrial function and remodeling, and their pathophysiologic relationship with the left ventricle, as will be made clear in the review of echocardiography that follows, considering that the need for universal data on the syndrome of the failing heart does not mean, unfortunately, that all patients and clinicians in developing countries have at their own health facilities the same imaging tools, since these are, as a general rule, expensive. PMID:19014611

  9. Plasma Glutamine Concentrations in Liver Failure

    PubMed Central

    Helling, Gunnel; Wahlin, Staffan; Smedberg, Marie; Pettersson, Linn; Tjäder, Inga; Norberg, Åke; Rooyackers, Olav; Wernerman, Jan

    2016-01-01

    Background Higher than normal plasma glutamine concentration at admission to an intensive care unit is associated with an unfavorable outcome. Very high plasma glutamine levels are sometimes seen in both acute and chronic liver failure. We aimed to systematically explore the relation between different types of liver failure and plasma glutamine concentrations. Methods Four different groups of patients were studied: chronic liver failure (n = 40), acute-on-chronic liver failure (n = 20), acute fulminant liver failure (n = 20), and post-hepatectomy liver failure (n = 20). Child-Pugh and Model for End-stage Liver Disease (MELD) scores were assessed as indices of liver function. All groups except the chronic liver failure group were followed longitudinally during hospitalisation. Outcomes were recorded up to 48 months after study inclusion. Results All groups had individuals with very high plasma glutamine concentrations. In the total group of patients (n = 100), severity of liver failure correlated significantly with plasma glutamine concentration, but the correlation was not strong. Conclusion Liver failure, regardless of severity and course of illness, may be associated with a high plasma glutamine concentration. Further studies are needed to understand whether high glutamine levels should be regarded as a biomarker or as a contributor to symptomatology in liver failure. PMID:26938452

  10. Heart failure - home monitoring

    MedlinePlus

    ... this page: //medlineplus.gov/ency/patientinstructions/000113.htm Heart failure - home monitoring ... body and the symptoms that tell you your heart failure is getting worse will help you stay healthier ...

  11. Diagram of Cell to Cell Communication

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Diagram depicts the importance of cell-cell communication as central to the understanding of cancer growth and progression, the focus of the NASA bioreactor demonstration system (BDS-05) investigation. Microgravity studies will allow us to unravel the signaling and communication between these cells with the host and potential development of therapies for the treatment of cancer metastasis. The NASA Bioreactor provides a low turbulence culture environment which promotes the formation of large, three-dimensional cell clusters. Due to their high level of cellular organization and specialization, samples constructed in the bioreactor more closely resemble the original tumor or tissue found in the body. The Bioreactor is rotated to provide gentle mixing of fresh and spent nutrient without inducing shear forces that would damage the cells. The work is sponsored by NASA's Office of Biological and Physical Research. The bioreactor is managed by the Biotechnology Cell Science Program at NASA's Johnson Space Center (JSC). NASA-sponsored bioreactor research has been instrumental in helping scientists to better understand normal and cancerous tissue development. In cooperation with the medical community, the bioreactor design is being used to prepare better models of human colon, prostate, breast and ovarian tumors. Cartilage, bone marrow, heart muscle, skeletal muscle, pancreatic islet cells, liver and kidney are just a few of the normal tissues being cultured in rotating bioreactors by investigators. Credit: Emory University.

  12. Phase diagram for inertial granular flows.

    PubMed

    DeGiuli, E; McElwaine, J N; Wyart, M

    2016-07-01

    Flows of hard granular materials depend strongly on the interparticle friction coefficient μ_{p} and on the inertial number I, which characterizes proximity to the jamming transition where flow stops. Guided by numerical simulations, we derive the phase diagram of dense inertial flow of spherical particles, finding three regimes for 10^{-4}≲I≲10^{-1}: frictionless, frictional sliding, and rolling. These are distinguished by the dominant means of energy dissipation, changing from collisional to sliding friction, and back to collisional, as μ_{p} increases from zero at constant I. The three regimes differ in their kinetics and rheology; in particular, the velocity fluctuations and the stress ratio both display nonmonotonic behavior with μ_{p}, corresponding to transitions between the three regimes of flow. We rationalize the phase boundaries between these regimes, show that energy balance yields scaling relations between microscopic properties in each of them, and derive the strain scale at which particles lose memory of their velocity. For the frictional sliding regime most relevant experimentally, we find for I≥10^{-2.5} that the growth of the macroscopic friction μ(I) with I is induced by an increase of collisional dissipation. This implies in that range that μ(I)-μ(0)∼I^{1-2b}, where b≈0.2 is an exponent that characterizes both the dimensionless velocity fluctuations L∼I^{-b} and the density of sliding contacts χ∼I^{b}. PMID:27575203

  13. Phase diagram for inertial granular flows

    NASA Astrophysics Data System (ADS)

    DeGiuli, E.; McElwaine, J. N.; Wyart, M.

    2016-07-01

    Flows of hard granular materials depend strongly on the interparticle friction coefficient μ_p and on the inertial number I, which characterizes proximity to the jamming transition where flow stops. Guided by numerical simulations, we derive the phase diagram of dense inertial flow of spherical particles, finding three regimes for 10^{-4} ≲ I ≲ 10^{-1}: frictionless, frictional sliding, and rolling. These are distinguished by the dominant means of energy dissipation, changing from collisional to sliding friction, and back to collisional, as μ_p increases from zero at constant I. The three regimes differ in their kinetics and rheology; in particular, the velocity fluctuations and the stress ratio both display nonmonotonic behavior with μ_p, corresponding to transitions between the three regimes of flow. We rationalize the phase boundaries between these regimes, show that energy balance yields scaling relations between microscopic properties in each of them, and derive the strain scale at which particles lose memory of their velocity. For the frictional sliding regime most relevant experimentally, we find for I ≥ 10^{-2.5} that the growth of the macroscopic friction μ(I) with I is induced by an increase of collisional dissipation. This implies in that range that μ(I) - μ(0) ∼ I^{1-2b}, where b ≈ 0.2 is an exponent that characterizes both the dimensionless velocity fluctuations L ∼ I^{-b} and the density of sliding contacts χ ∼ I^{b}.
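
    As a quick numerical reading of the quoted scaling, using only the reported exponent b ≈ 0.2,

      \mu(I) - \mu(0) \sim I^{1-2b} \approx I^{0.6}, \qquad \mathcal{L} \sim I^{-0.2}, \qquad \chi \sim I^{0.2},

    so over the frictional-sliding range 10^{-2.5} \le I \le 10^{-1} (1.5 decades) the macroscopic friction excess grows by roughly a factor of 10^{1.5 \times 0.6} \approx 8.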

  14. Failures in psychodynamic psychotherapy.

    PubMed

    Gold, Jerry; Stricker, George

    2011-11-01

    This article addresses the issue of failures in psychodynamic psychotherapy. Drawing on the clinical and research literatures, and utilizing our clinical experiences, we first describe and define criteria for success and failure in treatment. We then review five factors that can lead to failure: client factors, therapist factors, technical factors, relationship factors, and environmental factors. We illustrate our presentation with a case example, and conclude by discussing ways in which the likelihood of failures in psychodynamic treatment can be lowered. PMID:21935934

  15. Improving Students' Diagram Comprehension with Classroom Instruction

    ERIC Educational Resources Information Center

    Cromley, Jennifer G.; Perez, Tony C.; Fitzhugh, Shannon L.; Newcombe, Nora S.; Wills, Theodore W.; Tanaka, Jacqueline C.

    2013-01-01

    The authors tested whether students can be taught to better understand conventional representations in diagrams, photographs, and other visual representations in science textbooks. The authors developed a teacher-delivered, workbook-and-discussion-based classroom instructional method called Conventions of Diagrams (COD). The authors trained 1…

  16. Symbol-and-Arrow Diagrams in Teaching Pharmacokinetics.

    ERIC Educational Resources Information Center

    Hayton, William L.

    1990-01-01

    Symbol-and-arrow diagrams are helpful adjuncts to equations derived from pharmacokinetic models. Both show relationships among dependent and independent variables. Diagrams show only qualitative relationships, but clearly show which variables are dependent and which are independent, helping students understand complex but important functional…

  17. 49 CFR Appendix B to Part 230 - Diagrams and Drawings

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Diagrams and Drawings B Appendix B to Part 230 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... to Part 230—Diagrams and Drawings ER17No99.015 ER17No99.016 ER17No99.017 ER17No99.018...

  18. Pourbaix ("E"-pH-M) Diagrams in Three Dimensions

    ERIC Educational Resources Information Center

    Pesterfield, Lester L.; Maddox, Jeremy B.; Crocker, Michael S.; Schweitzer, George K.

    2012-01-01

    "E"-pH (Pourbaix) diagrams provide an important graphical link between the thermodynamic calculations of potential, pH, equilibrium constant, concentration, and changes in Gibbs energy and the experimentally observed behavior of species in aqueous solutions. The utility of "E"-pH diagrams is extended with the introduction of an additional…

  19. Water, Water Everywhere: Phase Diagrams of Ordinary Water Substance

    ERIC Educational Resources Information Center

    Glasser, L.

    2004-01-01

    The full phase diagram of water in the form of a graphical representation of the three-dimensional (3D) PVT diagram using authentic data is presented. An interesting controversy regarding the phase behavior of water was the much-touted proposal of a solid phase of water, polywater, supposedly stable under atmospheric conditions.

  20. Argument Diagramming and Critical Thinking in Introductory Philosophy

    ERIC Educational Resources Information Center

    Harrell, Maralee

    2011-01-01

    In a multi-study naturalistic quasi-experiment involving 269 students in a semester-long introductory philosophy course, we investigated the effect of teaching argument diagramming (AD) on students' scores on argument analysis tasks. An argument diagram is a visual representation of the content and structure of an argument. In each study, all of…

  1. 30 CFR 556.8 - Leasing maps and diagrams.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 2 2012-07-01 2012-07-01 false Leasing maps and diagrams. 556.8 Section 556.8... Management, General § 556.8 Leasing maps and diagrams. (a) Any area of the OCS which has been appropriately... operation under section 12(d) of the Act. (b) BOEM shall prepare leasing maps and official...

  2. 49 CFR Appendix B to Part 230 - Diagrams and Drawings

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Diagrams and Drawings B Appendix B to Part 230 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... to Part 230—Diagrams and Drawings ER17No99.015 ER17No99.016 ER17No99.017 ER17No99.018...

  3. 49 CFR Appendix B to Part 230 - Diagrams and Drawings

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Diagrams and Drawings B Appendix B to Part 230 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... to Part 230—Diagrams and Drawings ER17No99.015 ER17No99.016 ER17No99.017 ER17No99.018...

  4. 49 CFR Appendix B to Part 230 - Diagrams and Drawings

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Diagrams and Drawings B Appendix B to Part 230 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... to Part 230—Diagrams and Drawings ER17No99.015 ER17No99.016 ER17No99.017 ER17No99.018...

  5. Diagram, Gesture, Agency: Theorizing Embodiment in the Mathematics Classroom

    ERIC Educational Resources Information Center

    de Freitas, Elizabeth; Sinclair, Nathalie

    2012-01-01

    In this paper, we use the work of philosopher Gilles Chatelet to rethink the gesture/diagram relationship and to explore the ways mathematical agency is constituted through it. We argue for a fundamental philosophical shift to better conceptualize the relationship between gesture and diagram, and suggest that such an approach might open up new…

  6. Diagramming Word Problems: A Strategic Approach for Instruction

    ERIC Educational Resources Information Center

    van Garderen, Delinda; Scheuermann, Amy M.

    2015-01-01

    While often recommended as a strategy to use in order to solve word problems, drawing a diagram is a complex process that requires a good depth of understanding. Many middle school students with learning disabilities (LD) often struggle to use diagrams in an effective and efficient manner. This article presents information for teaching middle…

  7. Adding Value to Force Diagrams: Representing Relative Force Magnitudes

    ERIC Educational Resources Information Center

    Wendel, Paul

    2011-01-01

    Nearly all physics instructors recognize the instructional value of force diagrams, and this journal has published several collections of exercises to improve student skill in this area. Yet some instructors worry that too few students perceive the conceptual and problem-solving utility of force diagrams, and over recent years a rich variety of…

  8. Using a Spreadsheet To Explore Melting, Dissolving and Phase Diagrams.

    ERIC Educational Resources Information Center

    Goodwin, Alan

    2002-01-01

    Compares phase diagrams relating to the solubilities and melting points of various substances in textbooks with those generated by a spreadsheet using data from the literature. Argues that differences between the diagrams give rise to new chemical insights. (Author/MM)

  9. 30 CFR 556.8 - Leasing maps and diagrams.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 2 2013-07-01 2013-07-01 false Leasing maps and diagrams. 556.8 Section 556.8... Management, General § 556.8 Leasing maps and diagrams. (a) Any area of the OCS which has been appropriately... operation under section 12(d) of the Act. (b) BOEM shall prepare leasing maps and official...

  10. 30 CFR 556.8 - Leasing maps and diagrams.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 2 2014-07-01 2014-07-01 false Leasing maps and diagrams. 556.8 Section 556.8... Management, General § 556.8 Leasing maps and diagrams. (a) Any area of the OCS which has been appropriately... operation under section 12(d) of the Act. (b) BOEM shall prepare leasing maps and official...

  11. 30 CFR 256.8 - Leasing maps and diagrams.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false Leasing maps and diagrams. 256.8 Section 256.8... Oil, Gas, and Sulphur Management, General § 256.8 Leasing maps and diagrams. (a) Any area of the OCS... restricted from operation under section 12(d) of the Act. (b) The MMS shall prepare leasing maps and...

  12. Heuristic Diagrams as a Tool to Teach History of Science

    ERIC Educational Resources Information Center

    Chamizo, Jose A.

    2012-01-01

    The graphic organizer called here the heuristic diagram, an improvement on Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams are intended to help students (or teachers, or researchers) understand their own research, given that asking questions and problem-solving are central to scientific activity. The…

  13. Using Cabri3D Diagrams for Teaching Geometry

    ERIC Educational Resources Information Center

    Accascina, Giuseppe; Rogora, Enrico

    2006-01-01

    Cabri3D is a potentially very useful software for learning and teaching 3D geometry. The dynamic nature of the digital diagrams produced with it provides a useful aid for helping students to better develop concept images of geometric concepts. However, since any Cabri3D diagram represents three-dimensional objects on the two dimensional screen of…

  14. Diagram This Headline in One Minute, if You Can

    ERIC Educational Resources Information Center

    Landecker, Heidi

    2009-01-01

    Say "sentence diagramming" to people of a certain age, and one gets different reactions. Say it to most college students, and one gets a blank look. But not from the 24 students in Lucy Ferriss's "Constructing Thought," a half-credit course in the English department at Trinity College. They know how to diagram a sentence--and they are passionate…

  15. Diversions: Spatial Thinking Tasks--Cube Diagrams and Drawings

    ERIC Educational Resources Information Center

    Gough, John

    2009-01-01

    This article illustrates spatial thinking tasks through cube diagrams and drawings. The author talks about the pentacube diagram that is based on the principle that a vertical cube-edge is shown "vertically". The author describes how to extend isometric drawing to include triangular wedges that are made by slicing single cubes, bi-cubes,…

  16. An Introductory Idea for Teaching Two-Component Phase Diagrams

    ERIC Educational Resources Information Center

    Peckham, Gavin D.; McNaught, Ian J.

    2011-01-01

    The teaching of two-component phase diagrams has attracted little attention in this "Journal," and it is hoped that this article will make a useful contribution. Current physical chemistry textbooks describe two-component phase diagrams adequately, but do so in a piecemeal fashion one section at a time; first solid-liquid equilibria, then…

  17. Persistence Diagrams of High-Resolution Temporal Rainfall

    NASA Astrophysics Data System (ADS)

    Fernández Méndez, F.; Carsteanu, A. A.

    2015-12-01

    This study applies Topological Data Analysis (TDA) by generating persistence diagrams to uncover patterns in high-resolution temporal rainfall intensity data from Iowa City (IIHR, U of Iowa). Persistence diagrams are a way to identify essential cycles in state-space representations of the data.
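
    For readers unfamiliar with the tool, a persistence diagram is simply the multiset of birth-death pairs of topological features of a filtered space (a generic definition, not specific to the rainfall study),

      D = \{ (b_i, d_i) : d_i > b_i \}, \qquad \mathrm{pers}(i) = d_i - b_i ,

    where points far from the diagonal d = b correspond to the essential cycles mentioned above, while points close to it are usually treated as noise.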

  18. Telemonitoring in chronic heart failure.

    PubMed

    Hasan, Ayesha; Paul, Vince

    2011-06-01

    Clinical management of refractory heart failure remains challenging, with a high rate of rehospitalizations despite advances in medical and device therapy. Care can be provided in person, via telehomecare (by telephone), or telemonitoring, which involves wireless technology for remote follow-up. Telemonitoring wirelessly transmits parameters such as weight, heart rate, or blood pressure for review by health-care professionals. Cardiac implantable devices (defibrillators and cardiac resynchronization therapy) also transmit continually interrogated physiological data, such as heart rate variability or intrathoracic impedance, which may be of value to predict patients at greater risk of hospitalization for heart failure. The use of remote monitoring techniques facilitates a rapid and regular review of such data by health-care workers as part of a heart failure management programme. Current evidence supports the feasibility of such an approach but routinely assessed parameters have been shown not to impact patient outcomes. Devices that directly assess cardiac haemodynamic status through invasive measurement of pressures are currently under investigation and could potentially increase the sensitivity and specificity of predicting heart failure events. The current evidence for telemonitoring and remote monitoring, including implantable haemodynamic devices, will be reviewed. PMID:21289040

  19. Ammonia tank failure

    SciTech Connect

    Sweat, M.E.

    1983-04-01

    An ammonia tank failure at Hawkeye Chemical of Clinton, Iowa is discussed. The tank was a double-wall, 27,000 metric-ton tank built in 1968 and commissioned in December 1969. The paper presented covers the cause of the failure, repair, and procedural changes made to prevent recurrence of the failure. (JMT)

  20. In Support of Failure

    ERIC Educational Resources Information Center

    Carr, Allison

    2013-01-01

    In this essay, I propose a concerted effort to begin devising a theory and pedagogy of failure. I review the discourse of failure in Western culture as well as in composition pedagogy, ultimately suggesting that failure is not simply a judgement or indication of rank but is a relational, affect-bearing concept with tremendous relevance to…

  1. Sensor-Failure Simulator

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Delaat, John C.; Merrill, Walter C.; Oberle, Lawrence G.; Sadler, Gerald G.

    1988-01-01

    Outputs of defective sensors simulated for studies of reliability of control systems. Real-time sensor-failure simulator (SFS) designed and built for use with Advanced Detection, Isolation, and Accommodation (ADIA) program. Equipment consists of IBM PC/XT computer and associated analog circuitry. User defines failure scenarios to determine which sensor signals fail and method(s) used to simulate failure.
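
    The record above gives no implementation details, but the idea of user-defined failure scenarios can be sketched in a few lines; the following Python fragment is purely illustrative (names, failure modes and signatures are assumptions, not the ADIA sensor-failure simulator itself):

      # Hypothetical sensor-failure injector: applies a chosen failure mode to a
      # recorded signal from a given sample onward.  Illustrative only.
      import random

      def inject_failure(samples, mode="bias", magnitude=1.0, start=0):
          """Return a copy of samples with a simulated failure applied from index start."""
          out = list(samples)
          for i in range(start, len(out)):
              if mode == "bias":        # constant offset after failure onset
                  out[i] += magnitude
              elif mode == "drift":     # error that ramps up over time
                  out[i] += magnitude * (i - start)
              elif mode == "stuck":     # output frozen at the last good value
                  out[i] = out[start - 1] if start > 0 else out[0]
              elif mode == "noise":     # extra random scatter
                  out[i] += random.gauss(0.0, magnitude)
          return out

      # Example: a slowly varying signal with a drift failure starting at sample 50.
      clean = [100.0 + 0.1 * k for k in range(100)]
      faulty = inject_failure(clean, mode="drift", magnitude=0.5, start=50)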

  2. Heart failure - palliative care

    MedlinePlus

    Chronic heart failure very often gets worse over time. Many people who have heart failure die of ... failure to take in enough calories and nutrients. Wasting of muscles and weight loss are part of the natural disease process. It can help to eat several small ...

  3. The coupling of thermochemistry and phase diagrams for group III-V semiconductor systems. Final report

    SciTech Connect

    Anderson, T.J.

    1998-07-21

    The project was directed at linking the thermochemical properties of III-V compound semiconductor systems with the reported phase diagrams. The solid-liquid phase equilibrium problem was formulated, and three approaches to calculating the reduced standard state chemical potential were identified and values were calculated. In addition, thermochemical values for critical properties were measured using solid state electrochemical techniques. These values, along with the standard state chemical potentials and other available thermochemical and phase diagram data, were combined with a critical assessment of selected III-V systems. This work culminated in a comprehensive assessment of all the III-V binary systems. A novel aspect of the experimental part of this project was the demonstration of the use of a liquid encapsulant to measure component activities by a solid state emf technique in liquid III-V systems that exhibit high vapor pressures at the measurement temperature.
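
    For orientation, the solid-state emf measurements mentioned above rest on the standard relation between open-circuit cell voltage and Gibbs energy (a textbook relation, not a number from this report),

      \Delta G = -z F E ,

    where E is the measured emf, z the number of electrons transferred and F the Faraday constant; for an activity cell the relative partial Gibbs energy of the migrating component follows as \Delta \bar{G}_i = RT \ln a_i = -zFE, which is how emf data yield the component activities and chemical potentials used in the assessment.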

  4. Updating the Nomographical Diagrams for Dimensioning the Beams

    NASA Astrophysics Data System (ADS)

    Pop, Maria T.

    2015-12-01

    In order to reduce the time needed for structural design it is strongly recommended to use nomographical diagrams. The formation and updating of nomographical diagrams is based on the charts presented in various technical publications. The updated charts use the same algorithm and calculation elements as the former diagrams, in accordance with the latest prescriptions and European standards. The result is a chart with the same properties, similar to the nomographical diagrams already in use. As a general conclusion, even today the nomographical diagrams are very easy to use. Given the value of the moment, it is easy to find the necessary reinforcement area and, vice versa, given the reinforcement area one can find the capable moment. The diagrams thus remain a useful tool for pre-sizing and designing reinforced concrete sections.

  5. An Automated Approach to Transform Use Cases into Activity Diagrams

    NASA Astrophysics Data System (ADS)

    Yue, Tao; Briand, Lionel C.; Labiche, Yvan

    Use cases are commonly used to structure and document requirements while UML activity diagrams are often used to visualize and formalize use cases, for example to support automated test case generation. Therefore the automated support for the transition from use cases to activity diagrams would provide significant, practical help. Additionally, traceability could be established through automated transformation, which could then be used for instance to relate requirements to design decisions and test cases. In this paper, we propose an approach to automatically generate activity diagrams from use cases while establishing traceability links. Data flow information can also be generated and added to these activity diagrams. Our approach is implemented in a tool, which we used to perform five case studies. The results show that high quality activity diagrams can be generated. Our analysis also shows that our approach outperforms existing academic approaches and commercial tools.
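
    The authors' tool is not reproduced here, but the core idea can be illustrated with a toy sketch that turns an ordered list of use-case steps into activity-diagram nodes and edges while recording a traceability link for each node (all names below are assumptions for illustration, not the authors' implementation):

      # Toy transformation: sequential use-case steps -> activity nodes, control-flow
      # edges, and a traceability map from each node back to its originating step.
      def use_case_to_activity(steps):
          nodes, edges, trace = ["start"], [], {}
          for i, step in enumerate(steps, 1):
              node = "a%d" % i
              edges.append((nodes[-1], node))          # sequential control flow
              nodes.append(node)
              trace[node] = "step %d: %s" % (i, step)  # traceability link
          edges.append((nodes[-1], "end"))
          nodes.append("end")
          return nodes, edges, trace

      # Example use case: a simple login flow.
      nodes, edges, trace = use_case_to_activity([
          "User submits credentials",
          "System validates credentials",
          "System displays dashboard",
      ])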

  6. Calculation of Gallium-metal-Arsenic phase diagrams

    NASA Technical Reports Server (NTRS)

    Scofield, J. D.; Davison, J. E.; Ray, A. E.; Smith, S. R.

    1991-01-01

    Electrical contacts and metallization to GaAs solar cells must survive at high temperatures for several minutes under specific mission scenarios. Which metallizations or alloy systems are able to withstand extreme thermal excursions with minimum degradation of solar cell performance can be predicted from properly calculated temperature-constitution phase diagrams. A method for calculating a ternary diagram and its three constituent binary phase diagrams is briefly outlined, and ternary phase diagrams for three Ga-As-X alloy systems are presented. Free energy functions of the liquid and solid phases are approximated by regular solution theory. Phase diagrams calculated using this method are presented for the Ga-As-Ge and Ga-As-Ag systems.
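
    As a reminder of the functional form being fitted (the standard regular-solution approximation, written here with generic symbols rather than the paper's own notation), the molar Gibbs energy of mixing of a binary A-B phase is

      \Delta G_{mix} = RT \left( x_A \ln x_A + x_B \ln x_B \right) + \Omega \, x_A x_B ,

    where \Omega is a phase-specific interaction parameter; equating the chemical potentials of each component in the liquid and solid phases (the common-tangent construction) then yields the calculated liquidus and solidus lines.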

  7. VennMaster: Area-proportional Euler diagrams for functional GO analysis of microarrays

    PubMed Central

    Kestler, Hans A; Müller, André; Kraus, Johann M; Buchholz, Malte; Gress, Thomas M; Liu, Hongfang; Kane, David W; Zeeberg, Barry R; Weinstein, John N

    2008-01-01

    Background Microarray experiments generate vast amounts of data. The functional context of differentially expressed genes can be assessed by querying the Gene Ontology (GO) database via GoMiner. Directed acyclic graph representations, which are used to depict GO categories enriched with differentially expressed genes, are difficult to interpret and, depending on the particular analysis, may not be well suited for formulating new hypotheses. Additional graphical methods are therefore needed to augment the GO graphical representation. Results We present an alternative visualization approach, area-proportional Euler diagrams, showing set relationships with semi-quantitative size information in a single diagram to support biological hypothesis formulation. The cardinalities of sets and intersection sets are represented by area-proportional Euler diagrams and their corresponding graphical (circular or polygonal) intersection areas. Optimally proportional representations are obtained using swarm and evolutionary optimization algorithms. Conclusion VennMaster's area-proportional Euler diagrams effectively structure and visualize the results of a GO analysis by indicating to what extent flagged genes are shared by different categories. In addition to reducing the complexity of the output, the visualizations facilitate generation of novel hypotheses from the analysis of seemingly unrelated categories that share differentially expressed genes. PMID:18230172

  8. Nasal diagrams: a tool for recording the distribution of nasal lesions in rats and mice.

    PubMed

    Mery, S; Gross, E A; Joyner, D R; Godo, M; Morgan, K T

    1994-01-01

    Knowledge of patterns of lesion distribution can provide insight into the relative roles played by regional tissue dose and local tissue susceptibility in toxic responses to xenobiotics in the nose and assist assessment of potential human risk. A consistent approach is needed for recording lesion distribution patterns in the complex nasal airways of rats and mice. The present work provides a series of diagrams of the nasal passages of the Fischer-344 rat and B6C3F1 mouse, designed for mapping nasal lesions. The diagrams present each of the major cross-sectional airway profiles, provide adequate space for nasal mucosal lesion recording, and are suitable for duplication in a commercial photocopier. Sagittal diagrams are also provided to permit transfer of lesion location data observed in transverse sections onto the long axis of the nose. The distribution of lesions induced by a selected range of xenobiotics is presented. Approaches to application of the diagrams and interpretation of results obtained are discussed in relation to factors responsible for lesion distribution in the nose and their relevance to interspecies extrapolation. A modified approach to anatomical classification of the ethmoturbinates of the rodent is also presented. PMID:7817125

  9. The ruthenium-yttrium system: An experimental calorimetric study with a phase diagram optimization

    SciTech Connect

    Selhaoui, N.; Bouirden, L.; Charles, J.; Gachon, J.C.; Kleppa, O.J.

    1998-07-01

    After an experimental determination of the standard enthalpies of formation of Ru_{0.67}Y_{0.33} and Ru_{0.286}Y_{0.714}, the Ru-Y system was numerically assessed with the help of the NANCYUN software to check the consistency between the experimental results and the phase diagram proposed in the literature.

  10. Moving toward comprehensive acute heart failure risk assessment in the emergency department: the importance of self-care and shared decision making.

    PubMed

    Collins, Sean P; Storrow, Alan B

    2013-08-01

    Nearly 700,000 emergency department (ED) visits were due to acute heart failure (AHF) in 2009. Most visits result in a hospital admission and account for the largest proportion of a projected $70 billion to be spent on heart failure care by 2030. ED-based risk prediction tools in AHF rarely impact disposition decision making. This is a major factor contributing to the 80% admission rate for ED patients with AHF, which has remained unchanged over the last several years. Self-care behaviors such as symptom monitoring, medication taking, dietary adherence, and exercise have been associated with decreased hospital readmissions, yet self-care remains largely unaddressed in ED patients with AHF and thus represents a significant lost opportunity to improve patient care and decrease ED visits and hospitalizations. Furthermore, shared decision making encourages collaborative interaction between patients, caregivers, and providers to drive a care path based on mutual agreement. The observation that “difficult decisions now will simplify difficult decisions later” has particular relevance to the ED, given this is the venue for many such issues. We hypothesize patients as complex and heterogeneous as ED patients with AHF may need both an objective evaluation of physiologic risk as well as an evaluation of barriers to ideal self-care, along with strategies to overcome these barriers. Combining physician gestalt, physiologic risk prediction instruments, an evaluation of self-care, and an information exchange between patient and provider using shared decision making may provide the critical inertia necessary to discharge patients home after a brief ED evaluation. PMID:24159563

  11. Assessment of the Relationship between Galectin-3 and Ejection Fraction and Functional Capacity in the Patients with Compensated Systolic Heart Failure

    PubMed Central

    Atabakhshian, Roya; Kazerouni, Faranak; Raygan, Fariba; Amirrasouli, Hushang; Rahimipour, Ali; Shakeri, Nezhat

    2014-01-01

    Background: Galectin-3 is a soluble β-galactoside-binding lectin released by activated cardiac macrophages. Galectin-3 has been proposed for diagnosis and prognosis of HF patients. Objectives: The present study aimed to investigate the relationship between galectin-3 as a biomarker and ejection fraction and functional capacity in patients with compensated systolic heart failure. Patients and Methods: In this study, serum levels of Galectin-3 were measured in 76 patients with compensated heart failure with New York Heart Association class I–IV and left ventricular ejection fraction < 45%. Galectin-3 was measured by an ELISA kit. In addition, echocardiography was used to evaluate left ventricular ejection fraction. Additionally, functional capacity was determined based on the patients' ability to perform a set of activities. Finally, the data were analyzed using the t-test, Kruskal-Wallis test, one-way ANOVA, and chi-square test. P < 0.05 was considered statistically significant. Results: The patients' ages ranged from 45 to 75 years, with a mean age of 63.85 ± 9 years. In addition, 57.9% of the patients were male. The results revealed no significant correlation between Galectin-3 and age, body mass index, or estimated glomerular filtration rate. Also, no significant correlation was observed between Galectin-3 levels and left ventricular ejection fraction (P = 0.166) or functional capacity (P = 0.420). Yet, a significant difference was found between males and females regarding the mean of Galectin-3 (P = 0.039). Conclusions: The study results suggested that Galectin-3 could not be used as a marker of disease progression in the patients under treatment, which could probably be the result of medication use in these patients. PMID:25614856

  12. Effects of Three Diagram Instruction Methods on Transfer of Diagram Comprehension Skills: The Critical Role of Inference While Learning

    ERIC Educational Resources Information Center

    Cromley, Jennifer G.; Bergey, Bradley W.; Fitzhugh, Shannon; Newcombe, Nora; Wills, Theodore W.; Shipley, Thomas F.; Tanaka, Jacqueline C.

    2013-01-01

    Can students be taught to better comprehend the diagrams in their textbooks? Can such teaching transfer to uninstructed diagrams in the same domain or even in a new domain? What methods work best for these goals? Building on previous research showing positive results compared to control groups in both laboratory studies and short-term…

  13. Oak Ridge National Laboratory Technology Logic Diagram. Volume 2, Technology Logic Diagram: Part B, Remedial Action

    SciTech Connect

    Not Available

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1 (Technology Evaluation), Vol. 2 (Technology Logic Diagram), and Vol. 3 (Technology Evaluation Data Sheets). Part A of Vols. 1 and 2 focuses on D&D. Part B of Vols. 1 and 2 focuses on the RA of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. Remedial action is the focus of Vol. 2, Pt. B, which has been divided into the three necessary subelements of the RA: characterization, RA, and robotics and automation. Each of these sections addresses general ORNL problems, which are then broken down by problem area/constituents and linked to potential remedial technologies. The diagrams also contain summary information about a technology's status, its science and technology needs, and its implementation needs.

  14. The high-z quasar Hubble Diagram

    SciTech Connect

    Melia, Fulvio

    2014-01-01

    Two recent discoveries have made it possible for us to begin using high-z quasars as standard candles to construct a Hubble Diagram (HD) at z > 6. These are (1) the recognition from reverberation mapping that a relationship exists between the optical/UV luminosity and the distance of line-emitting gas from the central ionizing source. Thus, together with a measurement of the velocity of the line-emitting gas, e.g., via the width of BLR lines, such as Mg II, a single observation can therefore in principle provide a determination of the black hole's mass; and (2) the identification of quasar ULAS J1120+0641 at z = 7.085, which has significantly extended the redshift range of these sources, providing essential leverage when fitting theoretical luminosity distances to the data. In this paper, we use the observed fluxes and Mg II line-widths of these sources to show that one may reasonably test the predicted high-z distance versus redshift relationship, and we assemble a sample of 20 currently available high-z quasars for this exercise. We find a good match between theory and observations, suggesting that a more complete, high-quality survey may indeed eventually produce an HD to complement the highly-detailed study already underway (e.g., with Type Ia SNe, GRBs, and cosmic chronometers) at lower redshifts. With the modest sample we have here, we show that the R_h = ct Universe and ΛCDM both fit the data quite well, though the smaller number of free parameters in the former produces a more favorable outcome when we calculate likelihoods using the Akaike, Kullback, and Bayes Information Criteria. These three statistical tools result in similar probabilities, indicating that the R_h = ct Universe is more likely than ΛCDM to be correct, by a ratio of about 85% to 15%.
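
    For context, the reverberation-mapping argument invoked above rests on two standard relations (quoted here in generic form; the exact calibration used in the paper is not reproduced):

      R_{BLR} \propto L^{1/2}, \qquad M_{BH} \simeq \frac{f \, R_{BLR} \, \Delta v^2}{G},

    so a single spectrum giving the Mg II line width \Delta v together with the observed flux F_{obs} = L / (4 \pi d_L^2) ties the inferred black-hole mass to the assumed luminosity distance d_L, which is what allows theoretical d_L(z) predictions to be tested against the observed fluxes and line widths once the R-L normalization and the virial factor f are calibrated at low redshift.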

  15. Dynamic phase diagram of soft nanocolloids.

    PubMed

    Gupta, Sudipta; Camargo, Manuel; Stellbrink, Jörg; Allgaier, Jürgen; Radulescu, Aurel; Lindner, Peter; Zaccarelli, Emanuela; Likos, Christos N; Richter, Dieter

    2015-09-01

    We present a comprehensive experimental and theoretical study covering micro-, meso- and macroscopic length and time scales, which enables us to establish a generalized view in terms of structure-property relationship and equilibrium dynamics of soft colloids. We introduce a new, tunable block copolymer model system, which allows us to vary the aggregation number, and consequently its softness, by changing the solvophobic-to-solvophilic block ratio (m : n) over two orders of magnitude. Based on a simple and general coarse-grained model of the colloidal interaction potential, we verify the significance of the interaction length σ_int governing both structural and dynamic properties. We put forward a quantitative comparison between theory and experiment without adjustable parameters, covering a broad range of experimental polymer volume fractions (0.001 ≤ ϕ ≤ 0.5) and regimes from ultra-soft star-like to hard sphere-like particles, that finally results in the dynamic phase diagram of soft colloids. In particular, we find throughout the concentration domain a strong correlation between mesoscopic diffusion and macroscopic viscosity, irrespective of softness, manifested in data collapse on master curves using the interaction length σ_int as the only relevant parameter. A clear reentrance in the glass transition at high aggregation numbers is found, recovering the predicted hard-sphere (HS) value in the hard-sphere like limit. Finally, the excellent agreement between our new experimental systems with different but already established model systems shows the relevance of block copolymer micelles as a versatile realization of soft colloids and the general validity of a coarse-grained approach for the description of the structure and dynamics of soft colloids. PMID:26219628

  16. Superconducting phase diagram of itinerant antiferromagnets

    NASA Astrophysics Data System (ADS)

    Rømer, A. T.; Eremin, I.; Hirschfeld, P. J.; Andersen, B. M.

    2016-05-01

    We study the phase diagram of the Hubbard model in the weak-coupling limit for coexisting spin-density-wave order and spin-fluctuation-mediated superconductivity. Both longitudinal and transverse spin fluctuations contribute significantly to the effective interaction potential, which creates Cooper pairs of the quasiparticles of the antiferromagnetic metallic state. We find a dominant d_{x^2-y^2}-wave solution in both electron- and hole-doped cases. In the quasi-spin-triplet channel, the longitudinal fluctuations give rise to an effective attraction supporting a p-wave gap, but are overcome by repulsive contributions from the transverse fluctuations which disfavor p-wave pairing compared to d_{x^2-y^2}. The subleading pair instability is found to be in the g-wave channel, but complex admixtures of d and g are not energetically favored since their nodal structures coincide. Inclusion of interband pairing, in which each fermion in the Cooper pair belongs to a different spin-density-wave band, is considered for a range of electron dopings in the regime of well-developed magnetic order. We demonstrate that these interband pairing gaps, which are nonzero in the magnetic state, must have the same parity under inversion as the normal intraband gaps. The self-consistent solution to the full system of five coupled gap equations gives intraband and interband pairing gaps of d_{x^2-y^2} structure and similar gap magnitude. In conclusion, the d_{x^2-y^2} gap dominates for both hole and electron doping inside the spin-density-wave phase.

  17. Penguin-like diagrams from the standard model

    NASA Astrophysics Data System (ADS)

    Ping, Chia Swee

    2015-04-01

    The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher order effects in electroweak interactions. One specific class of processes is the penguin-like diagram. Such class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at the tree level in the Standard Model. They are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription by Chia and Chong which gives an expression for the counter term identical to that obtained by employing Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to the flavor changing processes estimated.

  18. Penguin-like diagrams from the standard model

    SciTech Connect

    Ping, Chia Swee

    2015-04-24

    The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher order effects in electroweak interactions. One specific class of processes is the penguin-like diagram. This class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at the tree level in the Standard Model. They are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription by Chia and Chong which gives an expression for the counterterm identical to that obtained by employing the Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via the Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to the flavour-changing processes estimated.

  19. Plotting and Analyzing Data Trends in Ternary Diagrams Made Easy

    NASA Astrophysics Data System (ADS)

    John, Cédric M.

    2004-04-01

    Ternary plots are used in many fields of science to characterize a system based on three components. Triangular plotting is thus useful to a broad audience in the Earth sciences and beyond. Unfortunately, it is typically the most expensive commercial software packages that offer the option to plot data in ternary diagrams, and they lack features that are paramount to the geosciences, such as the ability to plot data directly into a standardized diagram and the possibility to analyze temporal and stratigraphic trends within this diagram. To address these issues, δPlot was developed with a strong emphasis on ease of use, community orientation, and availability free of charge. This "freeware" supports a fully graphical user interface where data can be imported as text files, or by copying and pasting. A plot is automatically generated, and any standard diagram can be selected for plotting in the background using a simple pull-down menu. Standard diagrams are stored in an external database of PDF files that currently holds some 30 diagrams that deal with different fields of the Earth sciences. Using any drawing software supporting PDF, one can easily produce new standard diagrams to be used with δPlot by simply adding them to the library folder. An independent column of values, commonly stratigraphic depths or ages, can be used to sort the data sets.
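
    As an illustration of the arithmetic behind ternary plotting (not δPlot's own code), the following minimal Python sketch maps a three-component composition onto the usual unit triangle; the sample compositions are hypothetical.

```python
# Minimal sketch of the barycentric -> Cartesian mapping behind ternary plots.
# The compositions below are hypothetical; dPlot's internals are not shown here.
import matplotlib.pyplot as plt

def ternary_xy(a, b, c):
    """Map a three-component composition to 2D coordinates in a unit triangle."""
    total = a + b + c
    a, b, c = a / total, b / total, c / total   # normalize to a + b + c = 1
    x = 0.5 * (2.0 * b + c)                     # horizontal position
    y = (3 ** 0.5 / 2.0) * c                    # vertical position (apex = pure c)
    return x, y

# Hypothetical three-component samples (e.g. percentages of each end member)
samples = [(70, 20, 10), (30, 30, 40), (10, 60, 30)]
xs, ys = zip(*(ternary_xy(*s) for s in samples))

# Triangle outline plus the sample points
plt.plot([0, 1, 0.5, 0], [0, 0, 3 ** 0.5 / 2, 0], color="black")
plt.scatter(xs, ys)
plt.axis("equal")
plt.axis("off")
plt.show()
```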

  20. Laboratory assessment of anti-thrombotic therapy in heart failure, atrial fibrillation and coronary artery disease: insights using thrombelastography and a micro-titre plate assay of thrombogenesis and fibrinolysis.

    PubMed

    Lau, Y C; Xiong, Q; Ranjit, P; Lip, G Y H; Blann, A D

    2016-08-01

    As heart failure, coronary artery disease and atrial fibrillation all bring a risk of thrombosis, anti-thrombotic therapy is recommended. Despite such treatment, major cardiovascular events such as myocardial infarction and stroke still occur, implying inadequate suppression of thrombus formation. Accordingly, identification of patients whose haemostasis remains unimpaired by treatment is valuable. We compared indices for assessing thrombogenesis and fibrinolysis by two different techniques in patients on different anti-thrombotic agents, i.e. aspirin or warfarin. We determined fibrin clot formation and fibrinolysis by a microplate assay and thromboelastography, and the platelet marker soluble P-selectin, in 181 patients with acute or chronic heart failure or coronary artery disease who were taking either aspirin or warfarin. Five thromboelastograph indices and four microplate assay indices were different on aspirin versus warfarin (p < 0.05). In multivariate regression analysis, only the microplate assay indices rate of clot formation and rate of clot dissolution were independently related to aspirin or warfarin use (p ≤ 0.001). Five microplate assay indices, but no thromboelastograph index, were different (p < 0.001) in aspirin users. Three microplate assay indices were different (p ≤ 0.002) in warfarin users. The microplate assay indices of lag time and rate of clot formation were abnormal in chronic heart failure patients on aspirin, suggesting increased risk of thrombosis despite anti-platelet use. Soluble P-selectin was lower in patients on aspirin (p = 0.0175) but failed to correlate with any other index of haemostasis. The microplate assay shows promise as a tool for dissecting thrombogenesis and fibrinolysis in cardiovascular disease, and the impact of antithrombotic therapy. Prospective studies are required to determine a role in predicting thrombotic risk. PMID:26942726

  1. Cu-Zn binary phase diagram and diffusion couples

    NASA Technical Reports Server (NTRS)

    Mccoy, Robert A.

    1992-01-01

    The objectives of this paper are to learn: (1) what information a binary phase diagram can yield; (2) how to construct and heat treat a simple diffusion couple; (3) how to prepare a metallographic sample; (4) how to operate a metallograph; (5) how to correlate phases found in the diffusion couple with phases predicted by the phase diagram; (6) how diffusion couples held at various temperatures could be used to construct a phase diagram; (7) the relation between the thickness of an intermetallic phase layer and the diffusion time; and (8) the effect of one species of atoms diffusing faster than another species in a diffusion couple.
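
    Objective (7) above concerns the relation between intermetallic layer thickness and diffusion time. A minimal sketch of the commonly assumed parabolic growth law follows; the rate constant is hypothetical, not a measured Cu-Zn value.

```python
# Sketch of the parabolic growth law often used to relate intermetallic layer
# thickness to diffusion time: x(t) = sqrt(k * t).  The rate constant k below
# is hypothetical, not a measured value for the Cu-Zn system.
import math

k = 1.0e-14          # assumed parabolic rate constant, m^2/s
hours = [1, 4, 9, 16, 25]

for h in hours:
    t = h * 3600.0                     # convert hours to seconds
    x = math.sqrt(k * t)               # layer thickness, m
    print(f"{h:3d} h -> {x * 1e6:6.2f} um")   # quadrupling time doubles thickness
```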

  2. Endorectal MRI assessment of local relapse after surgery for prostate cancer: A model to define treatment field guidelines for adjuvant radiotherapy in patients at high risk for local failure

    SciTech Connect

    Miralbell, Raymond . E-mail: Raymond.Miralbell@hcuge.ch; Vees, Hansjoerg; Lozano, Joan; Khan, Haleem; Molla, Meritxell; Hidalgo, Alberto; Linero, Dolors; Rouzaud, Michel

    2007-02-01

    Purpose: To assess the role of endorectal magnetic resonance imaging (MRI) in defining local relapse after radical prostatectomy for prostate cancer to help to reassess the clinical target volume (CTV) for adjuvant postprostatectomy radiotherapy. Methods and Materials: Sixty patients undergoing an endorectal MRI before salvage radiotherapy were selected. Spatial coordinates of the relapses were assessed using two reference points: the inferior border of the pubic symphysis (point 1) and the urethro-vesical anastomosis (point 2). Every lesion on MRI was delineated on the planning computed tomography, and center-of-mass coordinates were plotted in two separate diagrams (along the x, y, and z axes) with the urethro-vesical anastomosis as the coordinate origin. An 'ideal' CTV was constructed, centered at a point defined by the mathematical means of each of the three coordinates, with dimensions defined as twice 2 standard deviations (i.e., mean ± 2 SD) in each of the three axes. The dosimetric impact of the new CTV definition was evaluated in six adjuvantly treated patients. Results: The ideal CTV center of mass was located at coordinates 0 (x), -5 (y), and -3 (z) mm with SDs of 6 (x), 6 (y), and 9 (z) mm, respectively. The CTV size was 24 (x) x 24 (y) x 36 (z) mm. Significant rectal sparing was observed with the new CTV. Conclusions: A CTV with an approximately cylindrical shape (≈4 x 3 cm) centered 5 mm posterior and 3 mm inferior to the urethro-vesical anastomosis was defined. Such CTV may reduce the irradiation of normal nontarget tissue in the pelvis potentially improving treatment tolerance.
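
    The CTV construction described above (centre at the mean relapse coordinates, extent of twice 2 standard deviations per axis) can be illustrated with a short sketch; the relapse coordinates below are hypothetical, not the study data.

```python
# Sketch of the CTV construction described above: centre at the mean relapse
# position and an extent of mean +/- 2 standard deviations along each axis.
# The relapse coordinates below are hypothetical, not the study data.
import statistics as stats

relapses = [  # (x, y, z) in mm, relative to the urethro-vesical anastomosis
    (2, -8, -10), (-4, -1, 5), (6, -9, -7), (-3, -2, 3), (1, -5, -6),
]

for axis, values in zip("xyz", zip(*relapses)):
    mean = stats.mean(values)
    sd = stats.stdev(values)
    size = 4 * sd                       # twice 2 SD, as in the abstract
    print(f"{axis}: centre = {mean:+.1f} mm, CTV extent = {size:.1f} mm "
          f"({mean - 2 * sd:+.1f} to {mean + 2 * sd:+.1f} mm)")
```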

  3. Cosmological test with the QSO Hubble diagram

    NASA Astrophysics Data System (ADS)

    López-Corredoira, M.; Melia, F.; Lusso, E.; Risaliti, G.

    2016-03-01

    A Hubble diagram (HD) has recently been constructed in the redshift range 0 ≲ z ≲ 6.5 using a nonlinear relation between the ultraviolet (UV) and X-ray luminosities of quasi-stellar objects (QSOs). The Type Ia Supernovae (SN) HD has already provided a high-precision test of cosmological models, but the fact that the QSO distribution extends well beyond the supernova range (z ≲ 1.8) in principle provides us with an important complementary diagnostic whose significantly greater leverage in z can impose tighter constraints on the distance versus redshift relationship. In this paper, we therefore perform an independent test of nine different cosmological models, among which six are expanding, while three are static. Many of these are disfavored by other kinds of observations (including the aforementioned Type Ia SNe). We wish to examine whether the QSO HD confirms or rejects these earlier conclusions. We find that four of these models (Einstein-de Sitter, the Milne universe, the static universe with simple tired light and the static universe with plasma tired light) are excluded at the > 99% C.L. The quasi-steady state model is excluded at > 95% C.L. The remaining four models (ΛCDM/wCDM, the R_h = ct universe, the Friedmann open universe and a static universe with a linear Hubble law) all pass the test. However, only ΛCDM/wCDM and R_h = ct also pass the Alcock-Paczyński (AP) test. The optimized parameters in ΛCDM/wCDM are Ω_m = 0.20 (+0.24/-0.20) and w_de = -1.2 (+1.6/-∞) (the dark energy equation of state). Combined with the AP test, these values become Ω_m = 0.38 (+0.20/-0.19) and w_de = -0.28 (+0.52/-0.40). But whereas this optimization of parameters in ΛCDM/wCDM creates some tension with their concordance values, the R_h = ct universe has the advantage of fitting the QSO and AP data without any free parameters.
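
    For readers unfamiliar with how model curves are generated for such a Hubble diagram, the following minimal sketch computes the distance modulus for a flat ΛCDM universe by numerical integration; the parameter values are illustrative, not the paper's best-fit values.

```python
# Minimal flat-LambdaCDM distance calculation of the kind used to fit a Hubble
# diagram.  Omega_m and H0 below are illustrative, not the paper's best-fit values.
from scipy.integrate import quad
import numpy as np

C_KM_S = 299792.458      # speed of light, km/s

def luminosity_distance(z, omega_m=0.3, h0=70.0):
    """Luminosity distance in Mpc for a flat LambdaCDM universe."""
    e_inv = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp) ** 3 + (1 - omega_m))
    comoving, _ = quad(e_inv, 0.0, z)            # dimensionless integral of 1/E(z)
    return (1 + z) * (C_KM_S / h0) * comoving    # Mpc

def distance_modulus(z, **kw):
    return 5.0 * np.log10(luminosity_distance(z, **kw) * 1e6 / 10.0)

for z in (0.5, 2.0, 6.0):
    print(f"z = {z:>3}: mu = {distance_modulus(z):.2f} mag")
```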

  4. “Not the ‘Grim Reaper Service’”: An Assessment of Provider Knowledge, Attitudes, and Perceptions Regarding Palliative Care Referral Barriers in Heart Failure

    PubMed Central

    Kavalieratos, Dio; Mitchell, Emma M.; Carey, Timothy S.; Dev, Sandesh; Biddle, Andrea K.; Reeve, Bryce B.; Abernethy, Amy P.; Weinberger, Morris

    2014-01-01

    Background Although similar to cancer patients regarding symptom burden and prognosis, patients with heart failure (HF) tend to receive palliative care far less frequently. We sought to explore factors perceived by cardiology, primary care, and palliative care providers to impede palliative care referral for HF patients. Methods and Results We conducted semistructured interviews regarding (1) perceived needs of patients with advanced HF; (2) knowledge, attitudes, and experiences with specialist palliative care; (3) perceived indications for and optimal timing of palliative care referral in HF; and (4) perceived barriers to palliative care referral. Two investigators analyzed data using template analysis, a qualitative technique. We interviewed 18 physician, nurse practitioner, and physician assistant providers from 3 specialties: cardiology, primary care, and palliative care. Providers had limited knowledge regarding what palliative care is, and how it can complement traditional HF therapy to decrease HF‐related suffering. Interviews identified several potential barriers: the unpredictable course of HF; lack of clear referral triggers across the HF trajectory; and ambiguity regarding what differentiates standard HF therapy from palliative care. Nevertheless, providers expressed interest for integrating palliative care into traditional HF care, but were unsure of how to initiate collaboration. Conclusions Palliative care referral for HF patients may be suboptimal due to limited provider knowledge and misperceptions of palliative care as a service reserved for those near death. These factors represent potentially modifiable targets for provider education, which may help to improve palliative care referral for HF patients with unresolved disease‐related burden. PMID:24385453

  5. [Orthodontic failures in Class II cases].

    PubMed

    Boileau, Marie-José

    2016-03-01

    In Class II treatment, as with all malformations, therapeutic failure can impact some or all of our treatment aims, whether occlusal, functional or esthetic. Using clinical cases, we will first define the concept of failure and the limits of what is acceptable in these different areas. We will then attempt to determine the main causes underlying our failures in order to better avoid them. An analysis of the literature and of the clinical cases demonstrates that our failures are most often caused by a misevaluation of the amount and direction of residual growth, poor control of the vertical dimension, inadequate management of functional problems, or an inadequate position of the maxillary and mandibular incisors. In addition to these major treatment errors, one also encounters insufficient patient cooperation, which needs to be assessed and maintained in order to limit the number of failures and treatment drop-outs. PMID:27083232

  6. Tensile failure criteria for fiber composite materials

    NASA Technical Reports Server (NTRS)

    Rosen, B. W.; Zweben, C. H.

    1972-01-01

    The analysis provides insight into the failure mechanics of these materials and defines criteria which serve as tools for preliminary design material selection and for material reliability assessment. The model incorporates both dispersed and propagation type failures and includes the influence of material heterogeneity. The important effects of localized matrix damage and post-failure matrix shear stress transfer are included in the treatment. The model is used to evaluate the influence of key parameters on the failure of several commonly used fiber-matrix systems. Analyses of three possible failure modes were developed. These modes are the fiber break propagation mode, the cumulative group fracture mode, and the weakest link mode. Application of the new model to composite material systems has indicated several results which require attention in the development of reliable structural composites. Prominent among these are the size effect and the influence of fiber strength variability.

  7. X ray computed tomography for failure analysis

    NASA Astrophysics Data System (ADS)

    Bossi, Richard H.; Crews, Alan R.; Georgeson, Gary E.

    1992-08-01

    Under a preliminary testing task assignment of the Advanced Development of X-Ray Computed Tomography Application program, computed tomography (CT) has been studied for its potential as a tool to assist in failure analysis investigations. CT provides three-dimensional spatial distribution of material that can be used to assess internal configurations and material conditions nondestructively. This capability has been used in failure analysis studies to determine the position of internal components and their operation. CT is particularly advantageous on complex systems, composite failure studies, and testing under operational or environmental conditions. CT plays an important role in reducing the time and effort of a failure analysis investigation. Aircraft manufacturing or logistical facilities perform failure analysis operations routinely and could be expected to reduce schedules, reduce costs and/or improve evaluation on about 10 to 30 percent of the problems they investigate by using CT.

  8. How to Recognize Success and Failure: Practical Assessment of an Evolving, First-Semester Laboratory Program Using Simple, Outcome-Based Tools

    ERIC Educational Resources Information Center

    Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield

    2013-01-01

    This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…

  9. The development of a risk of failure evaluation tool for small dams in Mzingwane Catchment, Zimbabwe

    NASA Astrophysics Data System (ADS)

    Mufute, N. L.; Senzanje, A.; Kaseke, E.

    Small dams in Mzingwane Catchment in southern Zimbabwe are mostly in poor physical condition mainly due to lack of resources for repair and maintenance. Most of these dams are likely to fail, thereby adversely affecting water availability and livelihoods in the area. To assist those involved in maintenance, repair and rehabilitation of small dams in resource-poor and data-sparse areas such as Mzingwane Catchment, a non-probabilistic but numerical risk-of-failure evaluation tool was developed. The tool helps to systematically and objectively classify the risk of failure of small dams, and hence assists in ranking the dams to prioritise and attend to first. This is important where resources are limited. The tool makes use of factors such as seepage, erosion and others that are traditionally used to assess the condition of dams. In the development of the tool, an assessment of the physical condition of 44 dams (1 medium-sized and 43 small) was done and the factors were identified and listed according to guidelines for design and maintenance of small dams. The description of the extent to which the factors affect the physical condition of small dams was then standardised. This was mainly guided by standard-based and risk-based approaches to dam safety evaluation. Cause-effect diagrams were used to determine the stage at which each factor is involved in contributing to dam failure. Weights were then allocated to each factor depending on its stage or level in the process of causing dam failure. Scores were allocated to each factor based on its description and weight. Small-dam design and maintenance guidelines were also used to guide the ranking and weighting of the factors. The tool was used to classify 10 dams. The risk of failure was low for one dam, moderate for one, high for four and very high for four dams, two of which had already failed. It was concluded that the tool could be used to rank the risk of failure of small dams in semi-arid areas. The tool needs to be
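
    A minimal sketch of the weighted factor-scoring idea described above follows; the factor names, weights, condition scores and risk thresholds are hypothetical, not those of the published tool.

```python
# Sketch of a weighted factor-scoring scheme of the kind described above.
# Factor names, weights, condition scores and risk thresholds are hypothetical.
factors = {
    # factor: (weight reflecting its stage in the failure process, score 0-3)
    "seepage through embankment": (3, 2),
    "erosion of downstream face": (2, 3),
    "spillway blockage":          (2, 1),
    "cracking of crest":          (1, 1),
}

total = sum(weight * score for weight, score in factors.values())
maximum = sum(weight * 3 for weight, _ in factors.values())
ratio = total / maximum

if ratio < 0.25:
    risk = "low"
elif ratio < 0.5:
    risk = "moderate"
elif ratio < 0.75:
    risk = "high"
else:
    risk = "very high"

print(f"weighted score {total}/{maximum} -> risk of failure: {risk}")
```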

  10. Nitrile/Buna N Material Failure Assessment for an O-Ring used on the Gaseous Hydrogen Flow Control Valve (FCV) of the Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Wingard, Doug

    2006-01-01

    After the rollout of Space Shuttle Discovery in April 2005 in preparation for return-to-flight, there was a failure of the Orbiter (OV-103) helium signature leak test in the gaseous hydrogen (GH2) system. Leakage was attributed to the Flow Control Valve (FCV) in Main Engine 3. The FCV determined to be the source of the leak for OV-103 is designated as LV-58. The nitrile/Buna N rubber O-ring seal was removed from LV-58, and failure analysis indicated radial cracks providing leak paths in one quadrant. Cracks were eventually found in 6 of 9 FCV O-rings among the three Shuttle Orbiters, though none were as severe as those for LV-58, OV-103. Testing by EM10 at MSFC on all 9 FCV O-rings included: laser dimensional, Shore A hardness and properties from a dynamic mechanical analyzer (DMA) and an Instron tensile machine. The following test data were obtained on the cracked quadrant of the LV-58, OV-103 O-ring: (1) the estimated compression set was only 9.5%, compared to none for the rest of the O-ring; (2) Shore A hardness for the O.D. was higher by almost 4 durometer points than for the rest of the O-ring; and (3) DMA data showed that the storage/elastic modulus E was almost 25% lower than for the rest of the O-ring. Of the 8 FCV O-rings tested on an Instron, 4 yielded tensile strengths that were below the MIL spec requirement of 1350 psi, a likely influence of rubber cracking. Comparisons were made between values of modulus determined by DMA (elastic) and Instron (Young's). Each nitrile/Buna N O-ring used in the FCV conforms to the MIL-P-25732C specification. A number of such O-rings taken from shelf storage at MSFC and Kennedy Space Center (KSC) were used to generate a reference curve of DMA glass transition temperature (Tg) vs. shelf storage time ranging from 8 to 26 years. A similar reference curve of TGA onset temperature (of rubber weight loss) vs. shelf storage time was also generated. The DMA and TGA data for the used FCV O-rings were compared to the reference
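
    For reference, the compression set quoted above is conventionally defined as the fraction of the applied deflection that is not recovered. A minimal sketch of that standard (ASTM D395-style) calculation follows, with hypothetical thickness values rather than the O-ring measurements discussed here.

```python
# Sketch of the standard compression-set calculation for an elastomer seal
# (ASTM D395-style): the fraction of the applied deflection that is not recovered.
# Thickness values are hypothetical and are not the O-ring data discussed above.
t_original  = 3.53   # mm, original cross-section thickness
t_spacer    = 2.65   # mm, thickness while compressed (spacer height)
t_recovered = 3.45   # mm, thickness after release and recovery

compression_set = (t_original - t_recovered) / (t_original - t_spacer) * 100.0
print(f"compression set = {compression_set:.1f} %")
```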

  11. 61. THREE LINE WIRING DIAGRAM, 33 KV & NO. 1 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    61. THREE LINE WIRING DIAGRAM, 33 KV & NO. 1 TRANS BANK, SANTA ANA NO. 1 HYDRO PLANT, OCTOBER 27, 1958. SCE drawing no. 428058-1. - Santa Ana River Hydroelectric System, SAR-1 Powerhouse, Redlands, San Bernardino County, CA

  12. Penguin diagram dominance in radiative weak decays of bottom baryons

    SciTech Connect

    Kohara, Yoji

    2005-05-01

    Radiative weak decays of antitriplet bottom baryons are studied under the assumption of penguin diagram dominance and flavor-SU(3) (or SU(2)) symmetry. Relations among decay rates of various decay modes are derived.

  13. Exercises in Drawing and Utilizing Free-Body Diagrams.

    ERIC Educational Resources Information Center

    Fisher, Kurt

    1999-01-01

    Finds that students taking algebra-based introductory physics have difficulty with one- and two-body problems in particle mechanics. Provides graded exercises for drawing and utilizing free-body diagrams. (CCM)

  14. Flame Deflector Section, Elevation, Water Supply Flow Diagram, Exploded ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Flame Deflector - Section, Elevation, Water Supply Flow Diagram, Exploded Deflector Manifolds, and Interior Perspective - Marshall Space Flight Center, F-1 Engine Static Test Stand, On Route 565 between Huntsville and Decatur, Huntsville, Madison County, AL

  15. 22. Power plant engine piping-compressed air piping diagram and sections, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    22. Power plant engine piping-compressed air piping diagram and sections, sheet 81 of 130 - Naval Air Station Fallon, Power Plant, 800 Complex, off Carson Road near intersection of Pasture & Berney Roads, Fallon, Churchill County, NV

  16. 21. Power plant engine fuel oil piping diagrams, sheet 83 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. Power plant engine fuel oil piping diagrams, sheet 83 of 130 - Naval Air Station Fallon, Power Plant, 800 Complex, off Carson Road near intersection of Pasture & Berney Roads, Fallon, Churchill County, NV

  17. Search the Foot and Ankle: Interactive Foot Diagram

    MedlinePlus

    Interactive diagram of the foot and ankle for indicating where it hurts (MedlinePlus interactive tool).

  18. Influence diagrams as oil spill decision science tools

    EPA Science Inventory

    Making inferences on risks to ecosystem services (ES) from ecological crises can be more reliably handled using decision science tools. Influence diagrams (IDs) are probabilistic networks that explicitly represent the decisions related to a problem and evidence of their influence...

  19. Heuristic Diagrams as a Tool to Teach History of Science

    NASA Astrophysics Data System (ADS)

    Chamizo, José A.

    2012-05-01

    The graphic organizer called here a heuristic diagram, an improvement on Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that questions and problem-solving are central to scientific activity. The left side, originally related in Gowin's Vee to philosophies, theories, models, laws or regularities, now corresponds to Toulmin's concepts (language, models as representation techniques, and application procedures). Mexican science teachers without experience in science education research used the heuristic diagram to learn about the history of chemistry, also considering on the left side two different historical times: past and present. Teachers' attitudes toward the heuristic diagram were evaluated through a semantic differential scale, and its usefulness was demonstrated.

  20. 8. Photocopy of top half of an 1855 organizational diagram ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Photocopy of top half of an 1855 organizational diagram of the New York and Erie Railroad. Original in the collections of the Library of Congress. - Erie Railway, New Jersey, New York, Pennsylvania, Deposit, Broome County, NY

  1. 9. Photocopy of bottom half of an 1855 organizational diagram ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. Photocopy of bottom half of an 1855 organizational diagram of the New York and Erie Railroad. Original in the collections of the Library of Congress. - Erie Railway, New Jersey, New York, Pennsylvania, Deposit, Broome County, NY

  2. 129. PLAN OF IMPROVEMENT, HUNTINGTON BEACH MUNICIPAL PIER: LIGHTING DIAGRAM. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    129. PLAN OF IMPROVEMENT, HUNTINGTON BEACH MUNICIPAL PIER: LIGHTING DIAGRAM. Sheet 10 of 11 (#3283) - Huntington Beach Municipal Pier, Pacific Coast Highway at Main Street, Huntington Beach, Orange County, CA

  3. Adding Value to Force Diagrams: Representing Relative Force Magnitudes

    NASA Astrophysics Data System (ADS)

    Wendel, Paul

    2011-05-01

    Nearly all physics instructors recognize the instructional value of force diagrams, and this journal has published several collections of exercises to improve student skill in this area.1-4 Yet some instructors worry that too few students perceive the conceptual and problem-solving utility of force diagrams,4-6 and over recent years a rich variety of approaches has been proposed to add value to force diagrams. Suggestions include strategies for identifying candidate forces,6,7 emphasizing the distinction between "contact" and "noncontact" forces,5,8 and the use of computer-based tutorials.9,10 Instructors have suggested a variety of conventions for constructing force diagrams, including approaches to arrow placement and orientation2,11-13 and proposed notations for locating forces or marking action-reaction force pairs.8,11,14,15

  4. An individual patient meta-analysis of five randomized trials assessing the effects of cardiac resynchronization therapy on morbidity and mortality in patients with symptomatic heart failure

    PubMed Central

    Cleland, John G.; Abraham, William T.; Linde, Cecilia; Gold, Michael R.; Young, James B.; Claude Daubert, J.; Sherfesee, Lou; Wells, George A.; Tang, Anthony S.L.

    2013-01-01

    Aims Cardiac resynchronization therapy (CRT) with or without a defibrillator reduces morbidity and mortality in selected patients with heart failure (HF) but response can be variable. We sought to identify pre-implantation variables that predict the response to CRT in a meta-analysis using individual patient-data. Methods and results An individual patient meta-analysis of five randomized trials, funded by Medtronic, comparing CRT either with no active device or with a defibrillator was conducted, including the following baseline variables: age, sex, New York Heart Association class, aetiology, QRS morphology, QRS duration, left ventricular ejection fraction (LVEF), and systolic blood pressure. Outcomes were all-cause mortality and first hospitalization for HF or death. Of 3782 patients in sinus rhythm, median (inter-quartile range) age was 66 (58–73) years, QRS duration was 160 (146–176) ms, LVEF was 24 (20–28)%, and 78% had left bundle branch block. A multivariable model suggested that only QRS duration predicted the magnitude of the effect of CRT on outcomes. Further analysis produced estimated hazard ratios for the effect of CRT on all-cause mortality and on the composite of first hospitalization for HF or death that suggested increasing benefit with increasing QRS duration, the 95% confidence bounds excluding 1.0 at ∼140 ms for each endpoint, suggesting a high probability of substantial benefit from CRT when QRS duration exceeds this value. Conclusion QRS duration is a powerful predictor of the effects of CRT on morbidity and mortality in patients with symptomatic HF and left ventricular systolic dysfunction who are in sinus rhythm. QRS morphology did not provide additional information about clinical response. ClinicalTrials.gov numbers NCT00170300, NCT00271154, NCT00251251. PMID:23900696

  5. A randomized, double-blind, placebo-controlled, multicentre study to assess haemodynamic effects of serelaxin in patients with acute heart failure

    PubMed Central

    Ponikowski, Piotr; Mitrovic, Veselin; Ruda, Mikhail; Fernandez, Alberto; Voors, Adriaan A.; Vishnevsky, Alexander; Cotter, Gad; Milo, Olga; Laessing, Ute; Zhang, Yiming; Dahlke, Marion; Zymlinski, Robert; Metra, Marco

    2014-01-01

    Aims The aim of this study was to evaluate the haemodynamic effects of serelaxin (30 µg/kg/day 20-h infusion and 4-h post-infusion period) in patients with acute heart failure (AHF). Methods and results This double-blind, multicentre study randomized 71 AHF patients with pulmonary capillary wedge pressure (PCWP) ≥18 mmHg, systolic blood pressure (BP) ≥115 mmHg, and estimated glomerular filtration rate ≥30 mL/min/1.73 m2 to serelaxin (n = 34) or placebo (n = 37) within 48 h of hospitalization. Co-primary endpoints were peak change from baseline in PCWP and cardiac index (CI) during the first 8 h of infusion. Among 63 patients eligible for haemodynamic analysis (serelaxin, n = 32; placebo, n = 31), those treated with serelaxin had a significantly higher decrease in peak PCWP during the first 8 h of infusion (difference vs. placebo: −2.44 mmHg, P = 0.004). Serelaxin showed no significant effect on the peak change in CI vs. placebo. Among secondary haemodynamic endpoints, a highly significant reduction in pulmonary artery pressure (PAP) was observed throughout the serelaxin infusion (largest difference in mean PAP vs. placebo: −5.17 mmHg at 4 h, P < 0.0001). Right atrial pressure, systemic/pulmonary vascular resistance, and systolic/diastolic BP decreased from baseline with serelaxin vs. placebo and treatment differences reached statistical significance at some time points. Serelaxin administration improved renal function and decreased N-terminal pro-brain natriuretic peptide levels vs. placebo. Treatment with serelaxin was well tolerated with no apparent safety concerns. Conclusion The haemodynamic effects of serelaxin observed in the present study provide plausible mechanistic support for improvement in signs and symptoms of congestion observed with this agent in AHF patients. ClinicalTrials.gov identifier NCT01543854. PMID:24255129

  6. 76 FR 54787 - Outer Continental Shelf Official Protraction Diagram, Lease Maps, and Supplemental Official Outer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... Bureau of Ocean Energy Management, Regulation and Enforcement Outer Continental Shelf Official Protraction Diagram, Lease Maps, and Supplemental Official Outer Continental Shelf Block Diagrams AGENCY... revised North American Datum of 1927 (NAD 27) Outer Continental Shelf Official Protraction Diagram,...

  7. 76 FR 2919 - Outer Continental Shelf Official Protraction Diagram and Supplemental Official Outer Continental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-18

    ... Bureau of Ocean Energy Management, Regulation and Enforcement Outer Continental Shelf Official Protraction Diagram and Supplemental Official Outer Continental Shelf Block Diagrams AGENCY: Bureau of Ocean... American Datum of 1983 (NAD 83) Outer Continental Shelf Official Protraction Diagram and...

  8. Phase diagram calculation of AIIIBV binary solutions of the eutectic type in the generalized lattice model

    NASA Astrophysics Data System (ADS)

    Panov, G. A.; Zakharov, M. A.

    2015-11-01

    The present work is devoted to the calculation of phase diagrams of AIIIBV systems within the framework of the generalized lattice model, taking account of volume effects. The theoretically calculated phase diagrams are compared with the corresponding experimental diagrams.

  9. Analysing Vee Diagram Reflections to Explore Pre-Service Science Teachers' Understanding the Nature of Science in Biology

    ERIC Educational Resources Information Center

    Savran-Gencer, Ayse

    2014-01-01

    Vee diagrams have been a metacognitive tool to help in learning the nature and structure of knowledge by reflecting on the scientific process and making knowledge much more explicit to learners during the practical work. This study aimed to assess pre-service science teachers' understanding of some aspects of NOS by analyzing their reflections…

  10. CUEMAP: A tool for generating hierarchical charts and dataflow diagrams

    SciTech Connect

    Lee, J.W.

    1987-12-01

    CUEMAP is a preprocessor to the MAPPER program, which generates report quality visual aids. CUEMAP uses text blocks, symbols, and line connectors to lay out hierarchical charts and dataflow diagrams. A grid is specified as a reference point on which the labels and symbols can be placed. Connectors are added to complete the diagram. Modifications and enhancements require knowledge of the MAPPER syntax. 1 ref., 2 figs.

  11. Construction of Penrose Diagrams for Dynamic Black Holes

    NASA Technical Reports Server (NTRS)

    Brown, Beth A.; Lindesay, James

    2008-01-01

    A set of Penrose diagrams is constructed in order to examine the large-scale causal structure of black holes with dynamic horizons. Coordinate dependencies of significant features, such as the event horizon and radial mass scale, are demonstrated on the diagrams. Unlike in static Schwarzschild geometries, the radial mass scale is clearly seen to differ from the horizon. Trajectories for photons near the horizon are briefly discussed.
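
    The coordinate compactification that underlies any Penrose diagram can be illustrated in the simplest setting; the sketch below shows the standard construction for flat (Minkowski) spacetime, not the dynamic black-hole geometries analysed in the paper.

```python
# Sketch of the conformal compactification used to draw a Penrose diagram,
# shown here for flat (Minkowski) spacetime rather than the dynamic black-hole
# geometries discussed above.  Radial null rays remain 45-degree lines.
import math

def penrose_coords(t, r):
    """Map (t, r) with r >= 0 to compactified Penrose coordinates (T, R)."""
    u, v = t - r, t + r                     # retarded / advanced null coordinates
    U, V = math.atan(u), math.atan(v)       # squeeze infinities into a finite range
    return V + U, V - U                     # time-like and space-like Penrose axes

for t, r in [(0.0, 1.0), (10.0, 1.0), (0.0, 1e6), (1e6, 1e6)]:
    T, R = penrose_coords(t, r)
    print(f"(t={t:>9.1f}, r={r:>9.1f}) -> (T={T:+.3f}, R={R:+.3f})")
```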

  12. Diagnostic accuracy of refractometer and Brix refractometer to assess failure of passive transfer in calves: protocol for a systematic review and meta-analysis.

    PubMed

    Buczinski, S; Fecteau, G; Chigerwe, M; Vandeweerd, J M

    2016-06-01

    Calves are highly dependent on colostrum (and antibody) intake because they are born agammaglobulinemic. The transfer of passive immunity in calves can be assessed directly by measuring immunoglobulin G (IgG) or indirectly by refractometry or Brix refractometry. The latter are easier to perform routinely in the field. This paper presents a protocol for a systematic review and meta-analysis to assess the diagnostic accuracy of refractometry or Brix refractometry versus IgG measurement as the reference standard test. With this review protocol we aim to be able to report refractometer and Brix refractometer accuracy in terms of sensitivity and specificity as well as to quantify the impact of any study characteristic on test accuracy. PMID:27427188
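
    A minimal sketch of the 2x2 accuracy calculation such a review would report (sensitivity and specificity of the Brix test against the IgG reference standard) follows; the cell counts are hypothetical.

```python
# Sketch of the 2x2 accuracy calculation behind a sensitivity/specificity report.
# Counts are hypothetical; the IgG reference standard defines "true" failure of
# passive transfer, the Brix cut-off defines the index-test result.
tp, fp, fn, tn = 42, 8, 6, 144   # Brix-positive/IgG-positive, and so on

sensitivity = tp / (tp + fn)     # proportion of true failures detected by Brix
specificity = tn / (tn + fp)     # proportion of adequate-transfer calves ruled out

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```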

  13. Renal failure in burn patients: a review

    PubMed Central

    Emara, S.S.; Alzaylai, A.A.

    2013-01-01

    Summary Burn care providers are usually challenged by multiple complications during the management of acute burns. One of the most common complications worldwide is renal failure. This article reviews the various aspects of renal failure management in burn patients. Two different types of renal failure develop in these patients. The different aetiological factors, incidence, suspected prognosis, ways of diagnosing, as well as prevention methods, and the most accepted treatment modalities are all discussed. A good understanding and an effective assessment of the problem help to reduce both morbidity and mortality in burn management. PMID:23966893

  14. Students' Learning Activities While Studying Biological Process Diagrams

    NASA Astrophysics Data System (ADS)

    Kragten, Marco; Admiraal, Wilfried; Rijlaarsdam, Gert

    2015-08-01

    Process diagrams describe how a system functions (e.g. photosynthesis) and are an important type of representation in Biology education. In the present study, we examined students' learning activities while studying process diagrams, related to their resulting comprehension of these diagrams. Each student completed three learning tasks. Verbal data and eye-tracking data were collected as indications of students' learning activities. For the verbal data, we applied a fine-grained coding scheme to optimally describe students' learning activities. For the eye-tracking data, we used fixation time and transitions between areas of interest in the process diagrams as indices of learning activities. Various learning activities while studying process diagrams were found that distinguished between more and less successful students. Results showed that between-student variance in comprehension score was highly predicted by meaning making of the process arrows (80%) and fixation time in the main area (65%). Students employed successful learning activities consistently across learning tasks. Furthermore, compared to unsuccessful students, successful students used a more coherent approach of interrelated learning activities for comprehending process diagrams.
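
    A minimal sketch of the kind of regression behind the reported explained-variance figures (e.g. fixation time predicting comprehension) follows; the data points are hypothetical.

```python
# Sketch of regressing comprehension score on fixation time and reporting the
# explained variance (R^2).  The data points below are hypothetical.
from scipy.stats import linregress

fixation_time = [12.0, 18.5, 22.0, 9.5, 27.0, 15.0, 30.5, 20.0]   # s in main area
comprehension = [3.0, 5.0, 6.0, 2.5, 7.5, 4.0, 8.0, 5.5]          # test score

fit = linregress(fixation_time, comprehension)
print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue ** 2:.2f}")
```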

  15. CALCULATING ASTEROSEISMIC DIAGRAMS FOR SOLAR-LIKE OSCILLATIONS

    SciTech Connect

    White, Timothy R.; Bedding, Timothy R.; Stello, Dennis; Huber, Daniel; Christensen-Dalsgaard, Jorgen; Kjeldsen, Hans

    2011-12-20

    With the success of the Kepler and CoRoT missions, the number of stars with detected solar-like oscillations has increased by several orders of magnitude; for the first time we are able to perform large-scale ensemble asteroseismology of these stars. In preparation for this golden age of asteroseismology we have computed expected values of various asteroseismic observables from models of varying mass and metallicity. The relationships between these asteroseismic observables, such as the separations between mode frequencies, are able to significantly constrain estimates of the ages and masses of these stars. We investigate the scaling relation between the large frequency separation, Δν, and mean stellar density. Furthermore we present model evolutionary tracks for several asteroseismic diagrams. We have extended the so-called C-D diagram beyond the main sequence to the subgiants and the red giant branch. We also consider another asteroseismic diagram, the ε diagram, which is more sensitive to variations in stellar properties at the subgiant stages and can aid in determining the correct mode identification. The recent discovery of gravity-mode period spacings in red giants forms the basis for a third asteroseismic diagram. We compare the evolutionary model tracks in these asteroseismic diagrams with results from pre-Kepler studies of solar-like oscillations and early results from Kepler.
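
    The scaling relation between the large frequency separation and mean stellar density mentioned above can be sketched in a few lines; the solar reference value of about 135 μHz is an assumed round number, and the masses and radii are illustrative.

```python
# Sketch of the standard scaling relation between the large frequency separation
# and mean stellar density: Dnu scales as sqrt(rho).  The solar reference value
# of ~135 microHz is an assumed round number; masses and radii are illustrative.
def large_separation(mass_msun, radius_rsun, dnu_sun=135.0):
    """Approximate large frequency separation (microHz) from the density scaling."""
    mean_density_ratio = mass_msun / radius_rsun ** 3   # rho / rho_sun
    return dnu_sun * mean_density_ratio ** 0.5

# A Sun-like dwarf, a subgiant and a red giant (illustrative masses and radii)
for name, m, r in [("dwarf", 1.0, 1.0), ("subgiant", 1.2, 2.0), ("red giant", 1.3, 10.0)]:
    print(f"{name:9s}: Dnu ~ {large_separation(m, r):6.1f} microHz")
```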

  16. Physical modelling of failure in composites.

    PubMed

    Talreja, Ramesh

    2016-07-13

    Structural integrity of composite materials is governed by failure mechanisms that initiate at the scale of the microstructure. The local stress fields evolve with the progression of the failure mechanisms. Within the full span from initiation to criticality of the failure mechanisms, the governing length scales in a fibre-reinforced composite change from the fibre size to the characteristic fibre-architecture sizes, and eventually to a structural size, depending on the composite configuration and structural geometry as well as the imposed loading environment. Thus, a physical modelling of failure in composites must necessarily be of multi-scale nature, although not always with the same hierarchy for each failure mode. With this background, the paper examines the currently available main composite failure theories to assess their ability to capture the essential features of failure. A case is made for an alternative in the form of physical modelling and its skeleton is constructed based on physical observations and systematic analysis of the basic failure modes and associated stress fields and energy balances. This article is part of the themed issue 'Multiscale modelling of the structural integrity of composite materials'. PMID:27242307

  17. Acute respiratory failure in pregnancy.

    PubMed

    Lapinsky, Stephen E

    2015-09-01

    Respiratory failure affects up to 0.2% of pregnancies, more commonly in the postpartum period. Altered maternal respiratory physiology affects the assessment and management of these patients. Respiratory failure may result from pregnancy-specific conditions such as preeclampsia, amniotic fluid embolism or peripartum cardiomyopathy. Pregnancy may increase the risk or severity of other conditions, including thromboembolism, asthma, viral pneumonitis, and gastric acid aspiration. Management during pregnancy is similar to the nonpregnant patient. Endotracheal intubation in pregnancy carries an increased risk, due to airway edema and rapid oxygen desaturation following apnea. Few data are available to direct prolonged mechanical ventilation in pregnancy. Chest wall compliance is reduced, perhaps permitting slightly higher airway pressures. Optimizing oxygenation is important, but data on the use of permissive hypercapnia are limited. Delivery of the fetus does not always improve maternal respiratory function, but should be considered if benefit to the fetus is anticipated. PMID:27512467

  18. High Concordance Between Mental Stress–Induced and Adenosine-Induced Myocardial Ischemia Assessed Using SPECT in Heart Failure Patients: Hemodynamic and Biomarker Correlates

    PubMed Central

    Wawrzyniak, Andrew J.; Dilsizian, Vasken; Krantz, David S.; Harris, Kristie M.; Smith, Mark F.; Shankovich, Anthony; Whittaker, Kerry S.; Rodriguez, Gabriel A.; Gottdiener, John; Li, Shuying; Kop, Willem; Gottlieb, Stephen S.

    2016-01-01

    Mental stress can trigger myocardial ischemia, but the prevalence of mental stress–induced ischemia in congestive heart failure (CHF) patients is unknown. We characterized mental stress–induced and adenosine-induced changes in myocardial perfusion and neurohormonal activation in CHF patients with reduced left-ventricular function using SPECT to precisely quantify segment-level myocardial perfusion. Methods Thirty-four coronary artery disease patients (mean age ± SD, 62 ± 10 y) with CHF longer than 3 mo and ejection fraction less than 40% underwent both adenosine and mental stress myocardial perfusion SPECT on consecutive days. Mental stress consisted of anger recall (anger-provoking speech) followed by subtraction of serial sevens. The presence and extent of myocardial ischemia was quantified using the conventional 17-segment model. Results Sixty-eight percent of patients had 1 ischemic segment or more during mental stress and 81% during adenosine. On segment-by-segment analysis, perfusion with mental stress and adenosine were highly correlated. No significant differences were found between any 2 time points for B-type natriuretic peptide, tumor necrosis factor-α, IL-1b, troponin, vascular endothelial growth factor, IL-17a, matrix metallopeptidase-9, or C-reactive protein. However, endothelin-1 and IL-6 increased, and IL-10 decreased, between the stressor and 30 min after stress. Left-ventricular end-diastolic volume was 179 ± 65 mL at rest and increased to 217 ± 71 after mental stress and 229 ± 86 after adenosine (P < 0.01 for both). Resting end-systolic volume was 129 ± 60 mL at rest and increased to 158 ± 66 after mental stress (P < 0.05) and 171 ± 87 after adenosine (P < 0.07), with no significant differences between adenosine and mental stress. Ejection fraction was 30 ± 12 at baseline, 29 ± 11 with mental stress, and 28 ± 10 with adenosine (P = not significant). Conclusion There was high concordance between ischemic perfusion defects induced

  19. Can the material properties of regenerate bone be predicted with non-invasive methods of assessment? Exploring the correlation between dual X-ray absorptiometry and compression testing to failure in an animal model of distraction osteogenesis.

    PubMed

    Monsell, Fergal; Hughes, Andrew William; Turner, James; Bellemore, Michael C; Bilston, Lynne

    2014-04-01

    Evaluation of the material properties of regenerate bone is of fundamental importance to a successful outcome following distraction osteogenesis using an external fixator. Plain radiographs are in widespread use for assessment of alignment and the distraction gap but are unable to detect bone formation in the early stages of distraction osteogenesis and do not quantify accurately the structural properties of the regenerate. Dual X-ray absorptiometry (DXA) is a widely available non-invasive imaging modality that, unlike X-ray, can be used to measure bone mineral content (BMC) and density quantitatively. In order to be useful as a clinical investigation, however, the structural two-dimensional geometry and density distributions assessed by DXA should reflect material properties such as modulus and also predict the structural mechanical properties of the regenerate bone formed. We explored the hypothesis that there is a relationship between DXA assessment of regenerate bone and structural mechanical properties in an animal model of distraction osteogenesis. Distraction osteogenesis was carried out on the tibial diaphysis of 41 male, 12 week old, New Zealand white rabbits as part of a larger study. Distraction started after a latent period of 24 h at a rate of 0.375 mm every 12 h and continued for 10 days, achieving an average lengthening of 7.1 mm. Following an 18-day period of consolidation, the regenerate bone was subjected to bone density measurements using a total body dual-energy X-ray densitometer. This produced measurements of BMC, bone mineral density (BMD) and volumetric bone mineral density (vBMD). The tibiae were then disarticulated and cleaned of soft tissue before loading in compression to failure using an Instron mechanical testing machine (Instron Corporation, Massachusetts, USA). Using Spearman rank correlation and linear regression, there was a significant correlation between vBMD and the Modulus of Elasticity, Yield Stress and Failure Stress of the
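
    A minimal sketch of the Spearman rank correlation between a DXA-derived quantity (vBMD) and a mechanical property (failure stress) follows; the paired values are hypothetical, not the rabbit data.

```python
# Sketch of the Spearman rank correlation between DXA-derived vBMD and the
# failure stress measured in compression.  The paired values are hypothetical.
from scipy.stats import spearmanr

vbmd = [0.21, 0.35, 0.28, 0.42, 0.19, 0.31, 0.38, 0.25]             # g/cm^3
failure_stress = [14.0, 29.0, 21.0, 36.0, 11.0, 24.0, 33.0, 18.0]   # MPa

rho, p_value = spearmanr(vbmd, failure_stress)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```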

  20. Birth control failure.

    PubMed

    Sophocles, A M

    1986-10-01

    Birth control failure usually results from the incorrect or inconsistent use of contraceptives. By providing anticipatory counseling, based on an understanding of the reasons for birth control failure, family physicians can help curtail the current epidemic of unwanted pregnancies. PMID:3766356