Sample records for quantitative functional failure

  1. Of pacemakers and statistics: the actuarial method extended.

    PubMed

    Dussel, J; Wolbarst, A B; Scott-Millar, R N; Obel, I W

    1980-01-01

Pacemakers cease functioning because of either natural battery exhaustion (nbe) or component failure (cf). A study of four series of pacemakers shows that a simple extension of the actuarial method, so as to incorporate Normal statistics, makes possible a quantitative differentiation between the two modes of failure. This involves separating the overall failure probability density function PDF(t) into constituent parts PDF_nbe(t) and PDF_cf(t). The approach should allow a meaningful comparison of the characteristics of different pacemaker types.
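The decomposition described above can be illustrated with a toy mixture model: natural battery exhaustion is modeled as a Normal density in time (consistent with the abstract's use of Normal statistics) and component failure as a constant-hazard exponential. All numeric parameters below are illustrative, not values from the study.

```python
import math

def pdf_nbe(t, mu=60.0, sigma=8.0):
    """Normal density for natural battery exhaustion (time in months)."""
    return math.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def pdf_cf(t, lam=0.01):
    """Exponential density for random component failure (rate per month)."""
    return lam * math.exp(-lam * t)

def pdf_total(t, w_nbe=0.8):
    """Overall failure density as a mixture of the two constituent modes."""
    return w_nbe * pdf_nbe(t) + (1 - w_nbe) * pdf_cf(t)

# At the nominal battery-exhaustion time, nbe dominates the mixture
print(pdf_total(60.0))
```

Fitting the mixture weight and the Normal parameters to observed explant times is what would let the two failure modes be compared quantitatively across pacemaker types.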

  2. Improving the Estimates of International Space Station (ISS) Induced K-Factor Failure Rates for On-Orbit Replacement Unit (ORU) Supportability Analyses

    NASA Technical Reports Server (NTRS)

    Anderson, Leif F.; Harrington, Sean P.; Omeke, Ojei, II; Schwaab, Douglas G.

    2009-01-01

This is a case study on revised estimates of induced failure for International Space Station (ISS) on-orbit replacement units (ORUs). We devise a heuristic to leverage operational experience data by aggregating ORU, associated-function (vehicle sub-system), and vehicle effective k-factors using actual failure experience. With this input, we determine a significant failure threshold and minimize the difference between the actual and predicted failure rates. We conclude with a discussion of both the qualitative and quantitative improvements from the heuristic method and its potential benefits to ISS supportability engineering analysis.
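The abstract does not give the heuristic's details, but its core step, choosing an effective k-factor that minimizes the difference between actual and predicted failure rates, can be sketched as a least-squares regression through the origin. The ORU rates below are hypothetical.

```python
def fit_k_factor(inherent_rates, observed_rates):
    """Least-squares effective k-factor: minimizes the sum over ORUs of
    (observed - k * inherent)**2, i.e. a regression through the origin."""
    num = sum(o * i for i, o in zip(inherent_rates, observed_rates))
    den = sum(i * i for i in inherent_rates)
    return num / den

# Hypothetical ORU failure rates (failures/year): inherent prediction vs. on-orbit experience
inherent = [0.10, 0.25, 0.05, 0.40]
observed = [0.18, 0.40, 0.09, 0.70]
k = fit_k_factor(inherent, observed)
print(round(k, 3))
```

The fitted k then multiplies the inherent rate of each ORU in the aggregation level (ORU, sub-system, or vehicle) being analyzed.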

  3. Predictive factors for renal failure and a control and treatment algorithm

    PubMed Central

    Cerqueira, Denise de Paula; Tavares, José Roberto; Machado, Regimar Carla

    2014-01-01

Objectives: to evaluate the renal function of patients in an intensive care unit, to identify the predisposing factors for the development of renal failure, and to develop an algorithm to help in the control of the disease. Method: exploratory, descriptive, prospective study with a quantitative approach. Results: a total of 30 patients (75.0%) were diagnosed with kidney failure, and the main factors associated with this disease were advanced age, systemic arterial hypertension, diabetes mellitus, lung diseases, and antibiotic use. Of these, 23 patients (76.6%) showed a reduction in creatinine clearance in the first 24 hours of hospitalization. Conclusion: a decline in renal function was observed in a significant number of subjects; therefore, an algorithm was developed with the aim of helping in the control of renal failure in a practical and functional way. PMID:26107827

  4. Comprehensive, Quantitative Risk Assessment of CO2 Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

A Quantitative Failure Modes and Effects Analysis (QFMEA) model was developed to conduct comprehensive, quantitative risk assessments on CO2 capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects, and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs, and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection, and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors, and an insurance schedule were developed to support the QFMEA model.
Comprehensive, quantitative risk assessments were conducted on three (3) sites using the QFMEA model: (1) the SACROC Northern Platform CO2-EOR site in the Permian Basin, Scurry County, TX; (2) the Pump Canyon CO2-ECBM site in the San Juan Basin, San Juan County, NM; and (3) the Farnsworth Unit CO2-EOR site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
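A minimal sketch of the QFMEA ranking step: each risk is scored on probability of failure, severity, difficulty of early detection, and fatality potential, and risks are prioritized by the product of the scores. The risk names and scores below are invented for illustration and are not from the DOE study.

```python
# Hypothetical QFMEA-style ranking: each risk scored 1-10 on probability of
# failure, severity, difficulty of early detection, and fatality potential.
risks = {
    "wellbore casing leak": (3, 9, 7, 4),
    "pipeline rupture":     (2, 8, 3, 6),
    "induced seismicity":   (2, 6, 5, 2),
    "caprock fracture":     (1, 9, 8, 3),
}

def priority(scores):
    """Multiply the four scores into a single priority number."""
    p = 1
    for s in scores:
        p *= s
    return p

# Highest-priority risk first
ranked = sorted(risks, key=lambda name: priority(risks[name]), reverse=True)
print(ranked)
```

In a real assessment each ranked risk would then carry its estimated recovery and mitigation costs, so that the priority ordering can be weighed against the cost-benefit of prevention.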

  5. SPECT and PET in ischemic heart failure.

    PubMed

    Angelidis, George; Giamouzis, Gregory; Karagiannis, Georgios; Butler, Javed; Tsougos, Ioannis; Valotassiou, Varvara; Giannakoulas, George; Dimakopoulos, Nikolaos; Xanthopoulos, Andrew; Skoularigis, John; Triposkiadis, Filippos; Georgoulias, Panagiotis

    2017-03-01

Heart failure is a common clinical syndrome associated with significant morbidity and mortality worldwide. Ischemic heart disease is the leading cause of heart failure, at least in the industrialized countries. Proper diagnosis of the syndrome and management of patients with heart failure require anatomical and functional information obtained through various imaging modalities. Nuclear cardiology techniques play a central role in the evaluation of heart failure. Myocardial single photon emission computed tomography (SPECT) with thallium-201 or technetium-99m-labelled tracers offers valuable data regarding ventricular function, myocardial perfusion, viability, and intraventricular synchronism. Moreover, positron emission tomography (PET) permits accurate evaluation of myocardial perfusion, metabolism, and viability, providing high-quality images and the ability to perform quantitative analysis. As these imaging techniques assess different parameters of cardiac structure and function, variations in sensitivity and specificity have been reported among them. In addition, the role of SPECT- and PET-guided therapy remains controversial. In this comprehensive review, we address these controversies and report the advances in the investigation of patients with SPECT and PET in ischemic heart failure. Furthermore, we present the innovations in technology that are expected to strengthen the role of nuclear cardiology modalities in the investigation of heart failure.

  6. Physiological and biochemical basis of clinical liver function tests: a review.

    PubMed

    Hoekstra, Lisette T; de Graaf, Wilmar; Nibourg, Geert A A; Heger, Michal; Bennink, Roelof J; Stieger, Bruno; van Gulik, Thomas M

    2013-01-01

    To review the literature on the most clinically relevant and novel liver function tests used for the assessment of hepatic function before liver surgery. Postoperative liver failure is the major cause of mortality and morbidity after partial liver resection and develops as a result of insufficient remnant liver function. Therefore, accurate preoperative assessment of the future remnant liver function is mandatory in the selection of candidates for safe partial liver resection. A MEDLINE search was performed using the key words "liver function tests," "functional studies in the liver," "compromised liver," "physiological basis," and "mechanistic background," with and without Boolean operators. Passive liver function tests, including biochemical parameters and clinical grading systems, are not accurate enough in predicting outcome after liver surgery. Dynamic quantitative liver function tests, such as the indocyanine green test and galactose elimination capacity, are more accurate as they measure the elimination process of a substance that is cleared and/or metabolized almost exclusively by the liver. However, these tests only measure global liver function. Nuclear imaging techniques ((99m)Tc-galactosyl serum albumin scintigraphy and (99m)Tc-mebrofenin hepatobiliary scintigraphy) can measure both total and future remnant liver function and potentially identify patients at risk for postresectional liver failure. Because of the complexity of liver function, one single test does not represent overall liver function. In addition to computed tomography volumetry, quantitative liver function tests should be used to determine whether a safe resection can be performed. Presently, (99m)Tc-mebrofenin hepatobiliary scintigraphy seems to be the most valuable quantitative liver function test, as it can measure multiple aspects of liver function in, specifically, the future remnant liver.

  7. Staging Hemodynamic Failure With Blood Oxygen-Level-Dependent Functional Magnetic Resonance Imaging Cerebrovascular Reactivity: A Comparison Versus Gold Standard (15O-)H2O-Positron Emission Tomography.

    PubMed

    Fierstra, Jorn; van Niftrik, Christiaan; Warnock, Geoffrey; Wegener, Susanne; Piccirelli, Marco; Pangalu, Athina; Esposito, Giuseppe; Valavanis, Antonios; Buck, Alfred; Luft, Andreas; Bozinov, Oliver; Regli, Luca

    2018-03-01

Increased stroke risk correlates with hemodynamic failure, which can be assessed with (15O-)H2O positron emission tomography (PET) cerebral blood flow (CBF) measurements. This gold standard technique, however, is not established for routine clinical imaging. Standardized blood oxygen-level-dependent (BOLD) functional magnetic resonance imaging+CO2 is a noninvasive and potentially widely applicable tool to assess whole-brain quantitative cerebrovascular reactivity (CVR). We examined the agreement between the 2 imaging modalities and hypothesized that quantitative CVR can be a surrogate imaging marker to assess hemodynamic failure. Nineteen data sets of subjects with chronic cerebrovascular steno-occlusive disease (age, 60±11 years; 4 women) and unilaterally impaired perfusion reserve on Diamox-challenged (15O-)H2O PET were studied and compared with a standardized BOLD functional magnetic resonance imaging+CO2 examination within 6 weeks (8±19 days). Agreement between quantitative CBF- and CVR-based perfusion reserve was assessed. Hemodynamic failure was staged according to PET findings: stage 0, normal CBF, normal perfusion reserve; stage I, normal CBF, decreased perfusion reserve; and stage II, decreased CBF, decreased perfusion reserve. The BOLD CVR data set of the same subjects was then matched to the corresponding stage of hemodynamic failure. PET-based stage I versus stage II could also be clearly separated with BOLD CVR measurements (CVR for stage I, 0.11, versus CVR for stage II, -0.03; P<0.01). Hemispheric and middle cerebral artery territory difference analyses (ie, affected versus unaffected side) showed a significant correlation for CVR impairment in the affected hemisphere and middle cerebral artery territory (P<0.01, R²=0.47 and P=0.02, R²=0.25, respectively).
BOLD CVR corresponded well to CBF perfusion reserve measurements obtained with (15O-)H2O-PET, especially for detecting hemodynamic failure in the affected hemisphere and middle cerebral artery territory and for identifying hemodynamic failure stage II. BOLD CVR may, therefore, be considered for prospective studies assessing stroke risk in patients with chronic cerebrovascular steno-occlusive disease, in particular because it can potentially be implemented in routine clinical imaging. © 2018 American Heart Association, Inc.

  8. [Evaluation of intraventricular dyssynchrony by quantitative tissue velocity imaging in rats of post-infarction heart failure].

    PubMed

    Wang, Yan; Zhu, Wenhui; Duan, Xingxing; Zhao, Yongfeng; Liu, Wengang; Li, Ruizhen

    2011-04-01

To evaluate intraventricular systolic dyssynchrony in rats with post-infarction heart failure by quantitative tissue velocity imaging combined with synchronous electrocardiography. A total of 60 male SD rats were randomly assigned to 3 groups: a 4-week post-operative group and an 8-week post-operative group (each n=25, with the anterior descending branch of the left coronary artery ligated), and a sham operation group (n=10, with thoracotomy and opened pericardium but no ligation of the artery). The time to peak systolic velocity of regional myocardium was measured and an index of left intraventricular dyssynchrony was calculated. All indexes of heart function declined as the heart failure worsened, except the left ventricle index in the post-operative groups. All indexes of dyssynchrony lengthened in the post-operative groups (P<0.05), while the changes in the sham operation group were not significant (P>0.05). Quantitative tissue velocity imaging combined with synchronous electrocardiography can accurately analyse intraventricular systolic dyssynchrony.

  9. Fatigue in older adults with stable heart failure.

    PubMed

    Stephen, Sharon A

    2008-01-01

    The purpose of this study was to describe fatigue and the relationships among fatigue intensity, self-reported functional status, and quality of life in older adults with stable heart failure. A descriptive, correlational design was used to collect quantitative data with reliable and valid instruments. Fifty-three eligible volunteers completed a questionnaire during an interview. Those with recent changes in their medical regimen, other fatigue-inducing illnesses, and isolated diastolic dysfunction were excluded. Fatigue intensity (Profile of Mood States fatigue subscale) was associated with lower quality of life, perceived health, and satisfaction with life. Fatigue was common, and no relationship was found between fatigue intensity and self-reported functional status. Marital status was the only independent predictor of fatigue. In stable heart failure, fatigue is a persistent symptom. Clinicians need to ask patients about fatigue and assess the impact on quality of life. Self-reported functional status cannot serve as a proxy measure for fatigue.

  10. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and laser tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.
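A hedged sketch of the frequency trade-off this study quantifies: the longer the interval between QA tests, the more severity-weighted failures accumulate undetected, so tests with high occurrence times severity earn shorter intervals. The rates and scores below are illustrative, not TG-142 or study values.

```python
# Illustrative QA tests: (observed failures per year, severity score 1-10)
tests = {
    "output":            (6.0, 5),
    "lasers":            (5.0, 4),
    "imaging isocenter": (1.0, 9),
    "ODI":               (0.5, 3),
}

def relative_risk(rate_per_year, severity, interval_days):
    """Expected severity-weighted failures accumulating between tests."""
    return rate_per_year * (interval_days / 365.0) * severity

# Compare daily vs. weekly testing for the output check: weekly testing
# carries seven times the undetected-failure exposure of daily testing.
daily = relative_risk(*tests["output"], interval_days=1)
weekly = relative_risk(*tests["output"], interval_days=7)
print(daily, weekly)
```

Under a cap on total physicist time, intervals would be chosen per test so that the highest relative-risk tests get the shortest intervals, which is the spirit of the composite ranking in the abstract.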

  11. Nkx2.5 enhances the efficacy of mesenchymal stem cells transplantation in treatment heart failure in rats.

    PubMed

    Deng, Bo; Wang, Jin Xin; Hu, Xing Xing; Duan, Peng; Wang, Lin; Li, Yang; Zhu, Qing Lei

    2017-08-01

The aim of this study was to determine whether Nkx2.5 transfection of transplanted bone marrow mesenchymal stem cells (MSCs) improves the efficacy of treatment of adriamycin-induced heart failure in a rat model. Nkx2.5 was transfected into MSCs by lentiviral vector transduction. The expressions of Nkx2.5 and cardiac-specific genes in MSCs and Nkx2.5-transfected mesenchymal stem cells (MSCs-Nkx2.5) were analyzed with quantitative real-time PCR and Western blot in vitro. Rat heart failure models were induced with adriamycin, and the rats were then randomly divided into 3 groups injected with saline, MSCs, or MSCs-Nkx2.5 via the femoral vein, respectively. Four weeks after injection, cardiac function, cardiac-specific gene expression, fibrosis formation and collagen volume fraction in the myocardium, as well as the expressions of GATA4 and MEF2, were analyzed with echocardiography, immunohistochemistry, Masson staining, quantitative real-time PCR, and Western blot, respectively. Nkx2.5 enhanced cardiac-specific gene expression, including α-MHC, TNI, CK-MB, and connexin-43, in MSCs-Nkx2.5 in vitro. Both MSCs and MSCs-Nkx2.5 improved cardiac function, promoted the differentiation of transplanted MSCs into cardiomyocyte-like cells, decreased fibrosis formation and collagen volume fraction in the myocardium, and increased the expressions of GATA4 and MEF2 in adriamycin-induced rat heart failure models. Moreover, the effect was much more remarkable in the MSCs-Nkx2.5 group than in the MSCs group. This study found that Nkx2.5 enhances the efficacy of MSC transplantation in the treatment of adriamycin-induced heart failure in rats. Nkx2.5 transfection of transplanted MSCs provides a potentially effective approach to heart failure. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
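The reported pattern, frequent step failures but rare overall system failure, can be sketched with a simple series model in which the system fails only when a step fails and no workaround rescues it. The probabilities below are illustrative, not the nurses' estimates from the study.

```python
# Each medication-processing step: (P(step fails), P(workaround succeeds given failure))
steps = [
    (0.20, 0.95),
    (0.10, 0.90),
    (0.30, 0.98),
]

def p_system_failure(steps):
    """System fails if any step fails AND its workaround also fails."""
    p_ok = 1.0
    for p_fail, p_workaround in steps:
        p_step_ok = 1 - p_fail * (1 - p_workaround)
        p_ok *= p_step_ok
    return 1 - p_ok

print(round(p_system_failure(steps), 4))
```

Even with step failure probabilities of 10-30%, effective workarounds push end-to-end failure below 3% here, at the cost of the extra processing time the abstract identifies as the real inefficiency.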

  13. Post-anoxic quantitative MRI changes may predict emergence from coma and functional outcomes at discharge.

    PubMed

    Reynolds, Alexandra S; Guo, Xiaotao; Matthews, Elizabeth; Brodie, Daniel; Rabbani, Leroy E; Roh, David J; Park, Soojin; Claassen, Jan; Elkind, Mitchell S V; Zhao, Binsheng; Agarwal, Sachin

    2017-08-01

Traditional predictors of neurological prognosis after cardiac arrest are unreliable after targeted temperature management. Absence of pupillary reflexes remains a reliable predictor of poor outcome. Diffusion-weighted imaging has emerged as a potential predictor of recovery, and here we compare imaging characteristics to the pupillary exam. We identified 69 patients who had MRIs within seven days of arrest and used a semi-automated algorithm to perform quantitative volumetric analysis of apparent diffusion coefficient (ADC) sequences at various thresholds. Areas under receiver operating characteristic curves (ROC-AUC) were estimated to compare the predictive value of quantitative MRI with the pupillary exam at days 3, 5, and 7 post-arrest, for persistence of coma and functional outcomes at discharge. Cerebral Performance Category scores of 3-4 were considered poor outcome. Excluding patients in whom life support was withdrawn, ≥2.8% diffusion restriction of the entire brain at an ADC of ≤650×10⁻⁶ mm²/s was 100% specific and 68% sensitive for failure to wake up from coma before discharge. The ROC-AUCs of ADC changes at ≤450×10⁻⁶ mm²/s and ≤650×10⁻⁶ mm²/s were significantly superior in predicting failure to wake up from coma compared to bilateral absence of pupillary reflexes. Among survivors, >0.01% diffusion restriction of the entire brain at an ADC ≤450×10⁻⁶ mm²/s was 100% specific and 46% sensitive for poor functional outcome at discharge. The ROC curve predicting poor functional outcome at ADC ≤450×10⁻⁶ mm²/s had an AUC of 0.737 (0.574-0.899, p=0.04). Post-anoxic diffusion changes on quantitative brain MRI may aid in predicting persistent coma and poor functional outcomes at hospital discharge. Copyright © 2017 Elsevier B.V. All rights reserved.
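The volumetric biomarker can be sketched as a threshold fraction over an ADC map: the fraction of brain voxels at or below a cutoff (in units of 10⁻⁶ mm²/s) is compared against the study's ≥2.8% threshold. The voxel data here are synthetic (a single Gaussian), not patient data.

```python
import numpy as np

# Synthetic ADC map (brain voxels only), values in 1e-6 mm^2/s
rng = np.random.default_rng(0)
adc = rng.normal(800, 120, size=100_000)

def fraction_below(adc_map, cutoff):
    """Fraction of voxels with ADC at or below the cutoff."""
    return float(np.mean(adc_map <= cutoff))

frac_650 = fraction_below(adc, 650)
flag_poor_wake_up = frac_650 >= 0.028   # >=2.8% threshold from the abstract
print(round(frac_650, 3), flag_poor_wake_up)
```

In practice the ADC map would first be skull-stripped and restricted to brain tissue by the semi-automated segmentation the study describes; the threshold comparison itself is this simple.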

  14. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    PubMed

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar, and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF > 50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution 3 T k-t perfusion imaging and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR), and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02, respectively). No significant differences in absolute stress perfusion rate or MPR were observed when comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.

  15. RNA splicing regulated by RBFOX1 is essential for cardiac function in zebrafish.

    PubMed

    Frese, Karen S; Meder, Benjamin; Keller, Andreas; Just, Steffen; Haas, Jan; Vogel, Britta; Fischer, Simon; Backes, Christina; Matzas, Mark; Köhler, Doreen; Benes, Vladimir; Katus, Hugo A; Rottbauer, Wolfgang

    2015-08-15

    Alternative splicing is one of the major mechanisms through which the proteomic and functional diversity of eukaryotes is achieved. However, the complex nature of the splicing machinery, its associated splicing regulators and the functional implications of alternatively spliced transcripts are only poorly understood. Here, we investigated the functional role of the splicing regulator rbfox1 in vivo using the zebrafish as a model system. We found that loss of rbfox1 led to progressive cardiac contractile dysfunction and heart failure. By using deep-transcriptome sequencing and quantitative real-time PCR, we show that depletion of rbfox1 in zebrafish results in an altered isoform expression of several crucial target genes, such as actn3a and hug. This study underlines that tightly regulated splicing is necessary for unconstrained cardiac function and renders the splicing regulator rbfox1 an interesting target for investigation in human heart failure and cardiomyopathy. © 2015. Published by The Company of Biologists Ltd.

  16. 33 CFR 154.804 - Review, certification, and initial inspection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...

  17. 33 CFR 154.804 - Review, certification, and initial inspection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...

  18. 33 CFR 154.804 - Review, certification, and initial inspection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...

  19. 33 CFR 154.804 - Review, certification, and initial inspection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...

  20. 33 CFR 154.2020 - Certification and recertification-owner/operator responsibilities.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Procedures,” and in Military Standard MIL-STD-882B for a quantitative failure analysis. For assistance in... quantitative failure analysis is also conducted, the level of safety attained is at least one order of...

  1. Rodent heart failure models do not reflect the human circulating microRNA signature in heart failure.

    PubMed

    Vegter, Eline L; Ovchinnikova, Ekaterina S; Silljé, Herman H W; Meems, Laura M G; van der Pol, Atze; van der Velde, A Rogier; Berezikov, Eugene; Voors, Adriaan A; de Boer, Rudolf A; van der Meer, Peter

    2017-01-01

    We recently identified a set of plasma microRNAs (miRNAs) that are downregulated in patients with heart failure in comparison with control subjects. To better understand their meaning and function, we sought to validate these circulating miRNAs in 3 different well-established rat and mouse heart failure models, and correlated the miRNAs to parameters of cardiac function. The previously identified let-7i-5p, miR-16-5p, miR-18a-5p, miR-26b-5p, miR-27a-3p, miR-30e-5p, miR-199a-3p, miR-223-3p, miR-423-3p, miR-423-5p and miR-652-3p were measured by means of quantitative real time polymerase chain reaction (qRT-PCR) in plasma samples of 8 homozygous TGR(mREN2)27 (Ren2) transgenic rats and 8 (control) Sprague-Dawley rats, 6 mice with angiotensin II-induced heart failure (AngII) and 6 control mice, and 8 mice with ischemic heart failure and 6 controls. Circulating miRNA levels were compared between the heart failure animals and healthy controls. Ren2 rats, AngII mice and mice with ischemic heart failure showed clear signs of heart failure, exemplified by increased left ventricular and lung weights, elevated end-diastolic left ventricular pressures, increased expression of cardiac stress markers and reduced left ventricular ejection fraction. All miRNAs were detectable in plasma from rats and mice. No significant differences were observed between the circulating miRNAs in heart failure animals when compared to the healthy controls (all P>0.05) and no robust associations with cardiac function could be found. The previous observation that miRNAs circulate in lower levels in human patients with heart failure could not be validated in well-established rat and mouse heart failure models. These results question the translation of data on human circulating miRNA levels to experimental models, and vice versa the validity of experimental miRNA data for human heart failure.

  2. The bond rupture force for sulfur chains calculated from quantum chemistry simulations and its relevance to the tensile strength of vulcanized rubber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, David Edward; Barber, John L.

From quantum chemistry simulations using density functional theory, we obtain the total electronic energy of an eight-atom sulfur chain as its end-to-end distance is extended until S–S bond rupture occurs. We find that a sulfur chain can be extended by about 40% beyond its nominally straight conformation, where it experiences rupture at an end-to-end tension of about 1.5 nN. Using this rupture force as the chain failure limit in an explicit polymer network simulation model (EPnet), we predict the tensile failure stress for sulfur-crosslinked (vulcanized) natural rubber. Furthermore, quantitative agreement with published experimental data for the failure stress is obtained in these simulations if we assume that only about 30% of the sulfur chains produce viable network crosslinks. Surprisingly, we also find that the failure stress of a rubber network does not scale linearly with the chain failure force limit.

  3. The bond rupture force for sulfur chains calculated from quantum chemistry simulations and its relevance to the tensile strength of vulcanized rubber

    DOE PAGES

    Hanson, David Edward; Barber, John L.

    2017-11-20

From quantum chemistry simulations using density functional theory, we obtain the total electronic energy of an eight-atom sulfur chain as its end-to-end distance is extended until S–S bond rupture occurs. We find that a sulfur chain can be extended by about 40% beyond its nominally straight conformation, where it experiences rupture at an end-to-end tension of about 1.5 nN. Using this rupture force as the chain failure limit in an explicit polymer network simulation model (EPnet), we predict the tensile failure stress for sulfur-crosslinked (vulcanized) natural rubber. Furthermore, quantitative agreement with published experimental data for the failure stress is obtained in these simulations if we assume that only about 30% of the sulfur chains produce viable network crosslinks. Surprisingly, we also find that the failure stress of a rubber network does not scale linearly with the chain failure force limit.
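A back-of-envelope check of the scaling implied above: failure stress can be estimated as the rupture force per chain times the areal density of load-bearing chains crossing the failure plane, discounted by the ~30% viable-crosslink fraction. Only the 1.5 nN force and the 30% fraction come from the abstract; the areal chain density is an assumed round number.

```python
rupture_force = 1.5e-9        # N, S-S chain rupture force from the DFT result
viable_fraction = 0.30        # fraction of sulfur chains forming viable crosslinks
chain_areal_density = 5.0e16  # chains per m^2 crossing the failure plane (assumed)

# Naive series estimate: stress = force per chain x load-bearing chains per area
failure_stress = viable_fraction * chain_areal_density * rupture_force
print(f"{failure_stress / 1e6:.1f} MPa")
```

This lands in the tens of MPa, the right order of magnitude for vulcanized natural rubber, though the abstract's finding that failure stress does not scale linearly with the chain force limit means the full EPnet network simulation, not this linear estimate, is needed for quantitative agreement.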

  4. Nonlinear viscoelasticity and generalized failure criterion for biopolymer gels

    NASA Astrophysics Data System (ADS)

    Divoux, Thibaut; Keshavarz, Bavand; Manneville, Sébastien; McKinley, Gareth

    2016-11-01

    Biopolymer gels display a multiscale microstructure that is responsible for their solid-like properties. Upon external deformation, these soft viscoelastic solids exhibit a generic nonlinear mechanical response characterized by pronounced stress- or strain-stiffening prior to irreversible damage and failure, most often through macroscopic fractures. Here we show on a model acid-induced protein gel that the nonlinear viscoelastic properties of the gel can be described in terms of a 'damping function' which predicts the gel mechanical response quantitatively up to the onset of macroscopic failure. Using a nonlinear integral constitutive equation built upon the experimentally-measured damping function in conjunction with power-law linear viscoelastic response, we derive the form of the stress growth in the gel following the start up of steady shear. We also couple the shear stress response with Bailey's durability criteria for brittle solids in order to predict the critical values of the stress σc and strain γc for failure of the gel, and how they scale with the applied shear rate. This provides a generalized failure criterion for biopolymer gels in a range of different deformation histories. This work was funded by the MIT-France seed fund and by the CNRS PICS-USA scheme (#36939). BK acknowledges financial support from Axalta Coating Systems.
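
    A minimal numerical sketch of the constitutive framework described above, assuming a critical-gel power-law relaxation modulus G(s) = S·s⁻ⁿ (so memory function m(s) = n·S·s⁻ⁿ⁻¹) and a simple illustrative damping function h(γ) = 1/(1 + (γ/γ*)²); all parameter values are invented for demonstration and are not from the paper.

```python
import numpy as np

# Illustrative material parameters for a critical-gel-like material.
S, n_exp, g_star = 1.0, 0.5, 1.0

def startup_stress(gamma_dot, t, npts=4000):
    """sigma(t) = int_0^t m(s) * h(gamma_dot*s) * gamma_dot*s ds for startup
    of steady shear at rate gamma_dot, with m(s) = n*S*s**(-n-1)."""
    s = np.logspace(-8, np.log10(t), npts)  # log grid handles the s -> 0 singularity
    integrand = (n_exp * S * s**(-n_exp - 1.0) * (gamma_dot * s)
                 / (1.0 + (gamma_dot * s / g_star)**2))
    # trapezoidal rule on the nonuniform grid
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(s)))

# Small-rate check: in the linear regime (h ~ 1) the integral reduces to
# sigma(t) = gamma_dot * S * n * t**(1-n) / (1-n).
sigma_lin = startup_stress(1e-2, 1.0)
```

    At low rates the quadrature reproduces the linear power-law growth; at high rates the damping function suppresses the stress below the linear extrapolation, which is the stiffening-then-softening behavior the integral model captures.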

  5. Role of heat shock transcription factor 1(HSF1)-upregulated macrophage in ameliorating pressure overload-induced heart failure in mice.

    PubMed

    Du, Peizhao; Chang, Yaowei; Dai, Fangjie; Wei, Chunyan; Zhang, Qi; Li, Jiming

    2018-08-15

    To explore the role of macrophages in HSF1-mediated alleviation of heart failure, a mouse model of pressure overload-induced heart failure was established using transverse aortic constriction (TAC). Changes in cardiac function and morphology were studied in the TAC and SHAM groups using ultrasound, tissue staining, electron microscopy, real-time quantitative polymerase chain reaction (RT-qPCR), and Western blotting. Mice in the TAC group showed evidence of impaired cardiac function and aggravated fibrosis on ultrasonic and histopathological examination compared with the SHAM group. Expression of HSF1, LC3II/LC3I, Beclin-1, and HIF-1, as well as autophagosome formation, was greater in the TAC group than in the SHAM group. On subgroup analyses within the TAC group, improved cardiac function and alleviation of fibrosis were observed in the HSF1 TG subgroup compared with the wild-type subgroup; expression of LC3II/LC3I, Beclin-1, and HIF-1 likewise showed a marked increase, and increased autophagosome formation was observed on electron microscopy. Opposite results were observed in the HSF1 KO subgroup. These results collectively suggest that in the pressure-overload heart failure model, HSF1 promoted macrophage formation by upregulating HIF-1 expression, thereby ameliorating heart failure. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Prediction of Hip Failure Load: In Vitro Study of 80 Femurs Using Three Imaging Methods and Finite Element Models-The European Fracture Study (EFFECT).

    PubMed

    Pottecher, Pierre; Engelke, Klaus; Duchemin, Laure; Museyko, Oleg; Moser, Thomas; Mitton, David; Vicaut, Eric; Adams, Judith; Skalli, Wafa; Laredo, Jean Denis; Bousson, Valérie

    2016-09-01

    Purpose To evaluate the performance of three imaging methods (radiography, dual-energy x-ray absorptiometry [DXA], and quantitative computed tomography [CT]) and that of a numerical analysis with finite element modeling (FEM) in the prediction of failure load of the proximal femur and to identify the best densitometric or geometric predictors of hip failure load. Materials and Methods Institutional review board approval was obtained. A total of 40 pairs of excised cadaver femurs (mean patient age at time of death, 82 years ± 12 [standard deviation]) were examined with (a) radiography to measure geometric parameters (lengths, angles, and cortical thicknesses), (b) DXA (reference standard) to determine areal bone mineral densities (BMDs), (c) quantitative CT with dedicated three-dimensional analysis software to determine volumetric BMDs and geometric parameters (neck axis length, cortical thicknesses, volumes, and moments of inertia), and (d) quantitative CT-based FEM to calculate a numerical value of failure load. The 80 femurs were fractured via mechanical testing, with random assignment of one femur from each pair to the single-limb stance configuration (hereafter, stance configuration) and assignment of the paired femur to the sideways fall configuration (hereafter, side configuration). Descriptive statistics, univariate correlations, and stepwise regression models were obtained for each imaging method and for FEM to enable us to predict failure load in both configurations. Results Statistics reported are for stance and side configurations, respectively. For radiography, the strongest correlation with mechanical failure load was obtained by using a geometric parameter combined with a cortical thickness (r² = 0.66, P < .001; r² = 0.65, P < .001). For DXA, the strongest correlation with mechanical failure load was obtained by using total BMD (r² = 0.73, P < .001) and trochanteric BMD (r² = 0.80, P < .001). 
For quantitative CT, in both configurations, the best model combined volumetric BMD and a moment of inertia (r² = 0.78, P < .001; r² = 0.85, P < .001). FEM explained 87% (P < .001) and 83% (P < .001) of bone strength, respectively. By combining (a) radiography and DXA and (b) quantitative CT and DXA, correlations with mechanical failure load increased to 0.82 (P < .001) and 0.84 (P < .001), respectively, for radiography and DXA and to 0.80 (P < .001) and 0.86 (P < .001), respectively, for quantitative CT and DXA. Conclusion Quantitative CT-based FEM was the best method with which to predict the experimental failure load; however, combining quantitative CT and DXA yielded a performance as good as that attained with FEM. The quantitative CT-DXA combination may be easier to use in fracture prediction, provided standardized software is developed. These findings also highlight the major influence on femoral failure load, particularly in the trochanteric region, of a densitometric parameter combined with a geometric parameter. © RSNA, 2016. Online supplemental material is available for this article.
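
    The stepwise-regression idea underlying these results can be illustrated with ordinary least squares on synthetic data: combining a densitometric predictor (BMD) with a geometric one (a moment of inertia) raises R² over either predictor alone. All data and coefficients below are fabricated for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the measured quantities (values are illustrative):
# DXA areal BMD (g/cm^2), a CT-derived moment of inertia (cm^4), failure load (N).
n = 80
bmd = rng.normal(0.8, 0.15, n)
moi = rng.normal(50.0, 10.0, n)
load = 4000.0 * bmd + 30.0 * moi + rng.normal(0.0, 300.0, n)

def r_squared(X, y):
    """Least-squares fit with intercept; returns the coefficient of determination."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_bmd = r_squared(bmd[:, None], load)                    # densitometric alone
r2_both = r_squared(np.column_stack([bmd, moi]), load)    # combined model
```

    On training data, adding the geometric regressor can never lower R², and with informative predictors it raises it, mirroring how the combined quantitative CT-DXA models outperformed the single-parameter ones.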

  7. Modelling passive diastolic mechanics with quantitative MRI of cardiac structure and function.

    PubMed

    Wang, Vicky Y; Lam, H I; Ennis, Daniel B; Cowan, Brett R; Young, Alistair A; Nash, Martyn P

    2009-10-01

    The majority of patients with clinically diagnosed heart failure have normal systolic pump function and are commonly categorized as suffering from diastolic heart failure. The left ventricle (LV) remodels its structure and function to adapt to pathophysiological changes in geometry and loading conditions, which in turn can alter the passive ventricular mechanics. In order to better understand passive ventricular mechanics, a LV finite element (FE) model was customized to geometric data segmented from in vivo tagged magnetic resonance images (MRI) data and myofibre orientation derived from ex vivo diffusion tensor MRI (DTMRI) of a canine heart using nonlinear finite element fitting techniques. MRI tissue tagging enables quantitative evaluation of cardiac mechanical function with high spatial and temporal resolution, whilst the direction of maximum water diffusion in each voxel of a DTMRI directly corresponds to the local myocardial fibre orientation. Due to differences in myocardial geometry between in vivo and ex vivo imaging, myofibre orientations were mapped into the geometric FE model using host mesh fitting (a free form deformation technique). Pressure recordings, temporally synchronized to the tagging data, were used as the loading constraints to simulate the LV deformation during diastole. Simulation of diastolic LV mechanics allowed us to estimate the stiffness of the passive LV myocardium based on kinematic data obtained from tagged MRI. Integrated physiological modelling of this kind will allow more insight into mechanics of the LV on an individualized basis, thereby improving our understanding of the underlying structural basis of mechanical dysfunction under pathological conditions.

  8. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions, providing a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, can estimate the probability of failure of components under varying loading and environmental conditions. It performs sensitivity analysis of all the input variables and reports their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and walks through the stepwise process the interface uses by means of an example.

  9. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions, providing a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, can estimate the probability of failure of components under varying loading and environmental conditions. It performs sensitivity analysis of all the input variables and reports their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and walks through the stepwise process the interface uses by means of an example.

  10. Left ventricular performance in various heart diseases with or without heart failure:--an appraisal by quantitative one-plane cineangiocardiography.

    PubMed

    Lien, W P; Lee, Y S; Chang, F Z; Chen, J J; Shieh, W B

    1978-01-01

    Quantitative one-plane cineangiocardiography in the right anterior oblique position for evaluation of LV performance was carried out in 62 patients with various heart diseases and in 13 subjects with normal LV. Parameters for evaluating both pump and muscle performance were derived from volume and pressure measurements. Of 31 patients with either systolic hypertension or LV myocardial disease (coronary artery disease or idiopathic cardiomyopathy), 14 had clinical evidence of LV failure before the study. Mean VCF and EF were found to be the most sensitive indicators of impaired LV performance among the various parameters. There was a close correlation between mean VCF and EF, yet discordant changes of the two parameters were noted in some patients. Furthermore, wall motion abnormalities were not infrequently observed in patients with coronary artery disease or primary cardiomyopathy. Therefore, assessment of at least three ejection properties (EF, mean VCF, and wall motion abnormalities) is considered essential for a full understanding of the derangement of LV function in heart disease, especially in patients with coronary artery disease. LV behavior in relation to different pathological stresses or lesions, such as chronic pressure or volume load, myocardial disease, and mitral stenosis, was also studied, and the possible cause of impaired LV myocardial function in mitral stenosis is discussed.
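
    The two indices the study found most sensitive can be computed from standard cineangiocardiographic measurements. The formulas below are the conventional definitions (not specific to this paper), and the example values are illustrative.

```python
# Ejection fraction: EF = (EDV - ESV) / EDV, from end-diastolic and
# end-systolic volumes. Mean velocity of circumferential fiber shortening:
# mean VCF = (EDD - ESD) / (EDD * ET), from end-diastolic/end-systolic
# diameters and the ejection time, in circumferences per second.

def ejection_fraction(edv_ml, esv_ml):
    return (edv_ml - esv_ml) / edv_ml

def mean_vcf(edd_cm, esd_cm, ejection_time_s):
    return (edd_cm - esd_cm) / (edd_cm * ejection_time_s)

ef = ejection_fraction(150.0, 60.0)   # illustrative volumes -> EF = 0.60
vcf = mean_vcf(5.0, 3.5, 0.30)        # illustrative diameters -> 1.0 circ/s
```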

  11. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-01-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.
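
    The abstract does not name the functional form fitted to the cumulative failure data; a common choice for fatigue life is a two-parameter Weibull distribution, sketched here on noise-free synthetic data so the linearized least-squares fit recovers the assumed parameters exactly.

```python
import math

# Assumed form for the cumulative interconnect failure fraction after N
# thermal cycles: F(N) = 1 - exp(-(N/eta)**beta) (two-parameter Weibull).
# Linearize: ln(-ln(1 - F)) = beta*ln(N) - beta*ln(eta), then least squares.
beta_true, eta_true = 2.0, 5000.0          # illustrative shape and scale
cycles = [1000.0, 2000.0, 4000.0, 8000.0, 12000.0]
frac_failed = [1.0 - math.exp(-(N / eta_true)**beta_true) for N in cycles]

x = [math.log(N) for N in cycles]
y = [math.log(-math.log(1.0 - F)) for F in frac_failed]
m = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(u * v for u, v in zip(x, y))
beta_fit = (m * sxy - sx * sy) / (m * sxx - sx * sx)     # slope = beta
eta_fit = math.exp(-(sy - beta_fit * sx) / (m * beta_fit))  # from intercept
```

    With real (scattered) data the same fit yields the statistical fatigue curve: percentile lines of F(N) rather than a single deterministic strain-cycle curve.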

  12. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-11-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.

  13. Study of Experiment on Rock-like Material Consist of fly-ash, Cement and Mortar

    NASA Astrophysics Data System (ADS)

    Nan, Qin; Hongwei, Wang; Yongyan, Wang

    2018-03-01

    We studied the uniaxial compression behavior of a rock-like material consisting of fly ash, cement, and mortar by varying the sand-cement ratio, fly-ash replacement fraction, grain diameter, water-binder ratio, and height-diameter ratio, and obtained the quantitative relations between these factors and the material's uniaxial compression characteristics. The effects can be summarized as follows: the specimen's uniaxial compressive strength and elastic modulus tend to decrease with increasing sand-cement ratio, fly-ash replacement fraction, and water-binder ratio, following a power-function relation. As the height-diameter ratio increases, the uniaxial compressive strength and elastic modulus decrease along an inverse-function curve, and the tensile strength decreases gradually with increasing fly-ash content. The observed uniaxial compression failure is consistent with the common failure patterns of real rock.
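
    The reported power-function relation can be recovered from measurements by a log-log least-squares fit. The amplitude and exponent below are hypothetical stand-ins, shown on noise-free data so the fit recovers them exactly.

```python
import numpy as np

# Hypothetical power-function relation between uniaxial compressive strength
# and sand-cement ratio: sigma = A * ratio**b, with b < 0 so strength falls
# as the ratio rises. A and b are illustrative, not measured values.
A_true, b_true = 40.0, -0.8
ratios = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
strength = A_true * ratios**b_true       # demo data, MPa

# A straight-line fit in log-log space recovers the power law.
b_fit, logA = np.polyfit(np.log(ratios), np.log(strength), 1)
A_fit = np.exp(logA)
```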

  14. Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508

    NASA Astrophysics Data System (ADS)

    Hayek, A.; Bokhaiti, M. Al; Schwarz, M. H.; Boercsoek, J.

    2012-05-01

    With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among a number of techniques and measures for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 (one-out-of-four) architecture presented in recent work, and introduces sophisticated calculations for the required parameters. The 1oo4 architecture is an advanced safety architecture based on on-chip redundancy and is 3-failure safe: at least one of the four channels has to work correctly in order to trigger the safety function.
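
    A common textbook low-demand approximation for the average PFD of a 1-out-of-N architecture is (λ_DU·T)^N/(N+1): the system fails on demand only if all N channels have failed undetected, and (1 − e^(−λt))^N ≈ (λt)^N is averaged over the proof-test interval T. This sketch ignores common-cause failures, diagnostics, and repair, so it is not the full IEC 61508 treatment; the rate and interval are illustrative.

```python
# Simplified 1ooN average probability of failure on demand (low-demand mode).
def pfd_avg_1oon(lambda_du, t_proof, n_channels):
    x = lambda_du * t_proof
    return x**n_channels / (n_channels + 1)

lam = 1.0e-6    # dangerous undetected failure rate, per hour (illustrative)
T = 8760.0      # one-year proof-test interval, hours

pfd_1oo1 = pfd_avg_1oon(lam, T, 1)   # ~ lam*T/2
pfd_1oo4 = pfd_avg_1oon(lam, T, 4)   # four-channel redundancy
```

    The fourth-power dependence is why the 3-failure-safe 1oo4 architecture drives the PFD many orders of magnitude below a single channel.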

  15. Downregulation of MicroRNA-126 Contributes to the Failing Right Ventricle in Pulmonary Arterial Hypertension.

    PubMed

    Potus, François; Ruffenach, Grégoire; Dahou, Abdellaziz; Thebault, Christophe; Breuils-Bonnet, Sandra; Tremblay, Ève; Nadeau, Valérie; Paradis, Renée; Graydon, Colin; Wong, Ryan; Johnson, Ian; Paulin, Roxane; Lajoie, Annie C; Perron, Jean; Charbonneau, Eric; Joubert, Philippe; Pibarot, Philippe; Michelakis, Evangelos D; Provencher, Steeve; Bonnet, Sébastien

    2015-09-08

    Right ventricular (RV) failure is the most important factor of both morbidity and mortality in pulmonary arterial hypertension (PAH). However, the underlying mechanisms resulting in the failed RV in PAH remain unknown. There is growing evidence that angiogenesis and microRNAs are involved in PAH-associated RV failure. We hypothesized that microRNA-126 (miR-126) downregulation decreases microvessel density and promotes the transition from a compensated to a decompensated RV in PAH. We studied RV free wall tissues from humans with normal RV (n=17), those with compensated RV hypertrophy (n=8), and patients with PAH with decompensated RV failure (n=14). Compared with RV tissues from patients with compensated RV hypertrophy, patients with decompensated RV failure had decreased miR-126 expression (quantitative reverse transcription-polymerase chain reaction; P<0.01) and capillary density (CD31(+) immunofluorescence; P<0.001), whereas left ventricular tissues were not affected. miR-126 downregulation was associated with increased Sprouty-related EVH1 domain-containing protein 1 (SPRED-1), leading to decreased activation of RAF (phosphorylated RAF/RAF) and mitogen-activated protein kinase (MAPK); (phosphorylated MAPK/MAPK), thus inhibiting the vascular endothelial growth factor pathway. In vitro, Matrigel assay showed that miR-126 upregulation increased angiogenesis of primary cultured endothelial cells from patients with decompensated RV failure. Furthermore, in vivo miR-126 upregulation (mimic intravenous injection) improved cardiac vascular density and function of monocrotaline-induced PAH animals. RV failure in PAH is associated with a specific molecular signature within the RV, contributing to a decrease in RV vascular density and promoting the progression to RV failure. More importantly, miR-126 upregulation in the RV improves microvessel density and RV function in experimental PAH. © 2015 American Heart Association, Inc.

  16. Fibrosis-Related Gene Expression in Single Ventricle Heart Disease.

    PubMed

    Nakano, Stephanie J; Siomos, Austine K; Garcia, Anastacia M; Nguyen, Hieu; SooHoo, Megan; Galambos, Csaba; Nunley, Karin; Stauffer, Brian L; Sucharov, Carmen C; Miyamoto, Shelley D

    2017-12-01

    To evaluate fibrosis and fibrosis-related gene expression in the myocardium of pediatric subjects with single ventricle with right ventricular failure. Real-time quantitative polymerase chain reaction was performed on explanted right ventricular myocardium of pediatric subjects with single ventricle disease and controls with nonfailing heart disease. Subjects were divided into 3 groups: single ventricle failing (right ventricular failure before or after stage I palliation), single ventricle nonfailing (infants listed for primary transplantation with normal right ventricular function), and stage III (Fontan or right ventricular failure after stage III). To evaluate subjects of similar age and right ventricular volume loading, single ventricle disease with failure was compared with single ventricle without failure and stage III was compared with nonfailing right ventricular disease. Histologic fibrosis was assessed in all hearts. Mann-Whitney tests were performed to identify differences in gene expression. Collagen (Col1α, Col3) expression is decreased in single ventricle congenital heart disease with failure compared with nonfailing single ventricle congenital heart disease (P = .019 and P = .035, respectively), and is equivalent in stage III compared with nonfailing right ventricular heart disease. Tissue inhibitors of metalloproteinase (TIMP-1, TIMP-3, and TIMP-4) are downregulated in stage III compared with nonfailing right ventricular heart disease (P = .0047, P = .013 and P = .013, respectively). Matrix metalloproteinases (MMP-2, MMP-9) are similar between nonfailing single ventricular heart disease and failing single ventricular heart disease, and between stage III heart disease and nonfailing right ventricular heart disease. 
There is no difference in the prevalence of right ventricular fibrosis by histology between subjects with single ventricle heart disease and right ventricular failure (18%) and those with normal right ventricular function (38%). Fibrosis is not a primary contributor to right ventricular failure in infants and young children with single ventricle heart disease. Additional studies are required to understand whether antifibrotic therapies are beneficial in this population. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
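
    The propagation idea at the core of an FFM can be sketched as a small directed graph: edges carry failure effects from failure-mode origins toward observable points, and diagnosis asks which modes can reach an observed effect. All node names below are invented for illustration; a real FFM also encodes timing, detectability, and effect severity.

```python
from collections import deque

# Toy failure-effect graph: failure modes and intermediate effects point to
# downstream effects; "low_thrust_sensor" is the observable point.
effects = {
    "valve_stuck":       ["no_fuel_flow"],
    "pump_degraded":     ["low_fuel_pressure"],
    "no_fuel_flow":      ["low_fuel_pressure"],
    "low_fuel_pressure": ["low_thrust_sensor"],
}

def reachable(graph, start):
    """All effects downstream of a failure mode (breadth-first traversal)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Diagnosis: which failure modes could explain the observed effect?
modes = ["valve_stuck", "pump_degraded"]
candidates = [m for m in modes if "low_thrust_sensor" in reachable(effects, m)]
```

    Because the model is purely qualitative (graph topology, no rates), it can be built early in design, then refined as the system matures, which is the benefit the paper emphasizes.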

  18. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
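
    The timing measure compared above reduces to a simple rule: a swallow shows compromised airway protection when airway closure completes after the bolus reaches the upper esophageal sphincter. The records below are synthetic and illustrate how a swallow can be flagged as delayed even though its PEN/ASP score is normal, which is the discrepancy the study quantifies.

```python
# Each synthetic record: (patient, PEN/ASP normal?, closure time s, bolus-at-UES time s)
swallows = [
    ("p1", True,  0.42, 0.50),
    ("p2", True,  0.61, 0.48),   # closure completes after bolus arrival -> delayed
    ("p3", False, 0.70, 0.45),
    ("p4", True,  0.30, 0.44),
]

def closure_delayed(closure_t, ues_arrival_t):
    """Delay = airway closure completing after the bolus reaches the UES."""
    return closure_t - ues_arrival_t > 0.0

# Swallows missed by the observational score alone: normal PEN/ASP but delayed closure.
missed = [p for p, pen_normal, ct, ut in swallows
          if pen_normal and closure_delayed(ct, ut)]
```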

  19. Midi-maxi computer interaction in the interpretation of nuclear medicine procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlapper, G.A.

    1977-01-01

    A study of renal function with an Anger Gamma Camera coupled with a Digital Equipment Corporation Gamma-11 System and an IBM System 370 demonstrates the potential of quantitative determinations of physiological function through the application of midi-maxi computer interaction in the interpretation of nuclear medicine procedures. It is shown that radiotracers can provide an opportunity to assess physiological processes of renal function by noninvasively following the path of a tracer as a function of time. Time-activity relationships obtained over seven anatomically defined regions are related to parameters of a seven-compartment model employed to describe the renal clearance process. The values obtained for clinically significant parameters agree with known renal pathophysiology. Differentiation of acute, chronic, and obstructive forms of failure is indicated.

  20. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-01-01

    Simulations using IPCC-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
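
    The ROC AUC used to assess the SVM committee equals the probability that a randomly chosen crashed run receives a higher classifier score than a randomly chosen successful run (the Mann-Whitney statistic). The sketch below uses made-up scores and a simple score-averaging committee rather than the paper's actual classifiers.

```python
def roc_auc(pos_scores, neg_scores):
    """Fraction of (positive, negative) pairs ranked correctly; ties count 1/2."""
    wins = 0.0
    for p in pos_scores:
        for q in neg_scores:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def committee(score_rows):
    """A minimal 'committee': average several classifiers' scores per run."""
    return [sum(row) / len(row) for row in score_rows]

# Synthetic per-run scores from two hypothetical classifiers.
crashed_scores = committee([(0.9, 0.8), (0.7, 0.6), (0.4, 0.6)])   # failed runs
ok_scores      = committee([(0.2, 0.3), (0.6, 0.5), (0.35, 0.25)]) # successful runs
auc = roc_auc(crashed_scores, ok_scores)
```

    Here 8 of the 9 (crashed, ok) pairs are ranked correctly, so AUC = 8/9; an AUC above 0.96, as reported, means the committee orders almost every such pair correctly.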

  1. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.

  2. Recent advances in computational structural reliability analysis methods

    NASA Astrophysics Data System (ADS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
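
    The contrast drawn above between safety factors and quantified reliability can be made concrete with a Monte Carlo sketch: treat resistance R and load S as random variables and estimate P(failure) = P(S > R). The distribution parameters are illustrative.

```python
import random
from statistics import NormalDist

random.seed(42)
mu_r, sd_r = 3.0, 0.3    # structural resistance (illustrative units)
mu_s, sd_s = 2.0, 0.4    # applied load

# Monte Carlo estimate of the failure probability P(S > R).
n = 200_000
failures = sum(random.gauss(mu_s, sd_s) > random.gauss(mu_r, sd_r)
               for _ in range(n))
pf_mc = failures / n

# Analytic check for normal R and S: reliability index
# beta = (mu_r - mu_s) / sqrt(sd_r**2 + sd_s**2), and P_f = Phi(-beta).
beta = (mu_r - mu_s) / (sd_r**2 + sd_s**2) ** 0.5
pf_exact = NormalDist().cdf(-beta)
```

    The central safety factor mu_r/mu_s = 1.5 says nothing by itself about the failure probability; the same factor with larger scatter gives a much higher P_f, which is exactly the quantitative basis the deterministic approach lacks.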

  3. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given that many different modes of failure are usually possible, achieving this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community has been seen recently, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus here is on early results and demonstrations of advanced reliability methods applied to structural system problems. These include structures that can fail as a result of multiple component failures (e.g., a redundant truss) and structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  4. Interrelation of structure and operational states in cascading failure of overloading lines in power grids

    NASA Astrophysics Data System (ADS)

    Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying

    2017-09-01

    As the modern power system evolves toward a more intelligent and efficient form, i.e. the smart grid, and toward serving as the central backbone of an energy internet with free energy interactions, security concerns related to cascading failures and their potentially catastrophic consequences have grown. Topological analyses based on complex networks have contributed greatly to revealing structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, resting on inappropriate modeling assumptions, still cannot separate the effects of network structure from those of the operational state, and so offers little meaningful guidance for system operation. This paper reveals the interrelation between network structure and operational state in cascading failures and gives a quantitative evaluation integrating both perspectives. For the structural analysis, cascading paths are identified by extended betweenness and quantitatively described by cascading drop and cascading gradient; the operational state along a cascading path is described by its loading level. The risk of cascading failure along a specific path can then be quantitatively evaluated from these two factors, and the maximum cascading gradient over all possible cascading paths serves as an overall metric of a power grid's susceptibility to cascading failure. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; the simulation evidence presented in this paper suggests that the model can identify the structural causes of cascading failure and is promising for guiding the protection of system operation in the future.

  5. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived, each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate the evidence from the different models. The basic probability measures are assigned using quantitative information extracted from the mathematical model and from online computations performed with it.
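Dempster's rule of combination, as used here to fuse evidence from parallel diagnostic models, can be sketched as follows (the two basic probability assignments are made-up examples, not outputs of real parity equations):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments whose
    focal elements are frozensets over a common frame of discernment.
    Conflicting mass (empty intersections) is removed and the rest is
    renormalized."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    norm = 1.0 - conflict
    return {s: w / norm for s, w in combined.items()}

F = frozenset({"fault"})
N = frozenset({"nominal"})
FN = F | N  # total ignorance: "either could be true"

# Hypothetical evidence from two diagnostic models (invented numbers):
m1 = {F: 0.6, N: 0.1, FN: 0.3}
m2 = {F: 0.5, N: 0.3, FN: 0.2}

m = dempster_combine(m1, m2)
for s, w in m.items():
    print(sorted(s), round(w, 3))
```

Because both models lean toward "fault", the combined mass on the fault hypothesis (about 0.74) exceeds either individual assignment, while the residual ignorance shrinks.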

  6. Spectral mechanisms of spatially induced blackness: data and quantitative model.

    PubMed

    Shinomori, K; Schefrin, B E; Werner, J S

    1997-02-01

    Spectral efficiency functions and tests of additivity were obtained from three observers to identify possible chromatic contributions to spatially induced blackness. Stimuli consisted of a series of monochromatic (400-700 nm; 10-nm steps), 52-arcmin circular test lights surrounded by broadband (x = 0.31, y = 0.37), 63-138-arcmin annuli of fixed retinal illuminance. The stimuli were imaged on the fovea in Maxwellian view as 500-ms flashes with 10-s interstimulus intervals. Observers decreased the intensity of the test center until it was first perceived as completely black. Action spectra determined for two surround levels (2.5 and 3.5 log trolands) had three sensitivity peaks (at approximately 440, 540, and 600 nm). However, when monochromatic surrounds were adjusted to induce blackness in a broadband center, action spectra were unimodal and identical to functions obtained by heterochromatic flicker photometry. Tests of additivity revealed that when blackness is induced by a broadband surround into a bichromatic center, there is an additivity failure of the cancellation type. This additivity failure indicates that blackness induction is influenced, in part, by signals from opponent-chromatic pathways. A quantitative model is presented to account for these data. The model assumes that blackness induction is determined by the ratio of responses to the stimulus center and the annulus; while signals from the annulus are based only on achromatic information, responses from the center are based on both chromatic and achromatic properties of the stimulus.

  7. Variation in neurophysiological function and evidence of quantitative electroencephalogram discordance: predicting cocaine-dependent treatment attrition.

    PubMed

    Venneman, Sandy; Leuchter, Andrew; Bartzokis, George; Beckson, Mace; Simon, Sara L; Schaefer, Melodie; Rawson, Richard; Newton, Tom; Cook, Ian A; Uijtdehaage, Sebastian; Ling, Walter

    2006-01-01

    Cocaine treatment trials suffer from a high rate of attrition. We examined pretreatment neurophysiological factors to identify participants at greatest risk. Twenty-five participants were divided into concordant and discordant groups on the basis of electroencephalogram (EEG) measures recorded prior to a double-blind, placebo-controlled treatment trial. Three possible outcomes were examined: successful completion, dropout, and removal. Concordant (high perfusion correlate) participants had an 85% rate of successful completion, while discordant participants had a 15% rate of successful completion. Twenty-five percent of dropouts and 50% of participants removed were discordant (low perfusion correlate), while only 25% of those who completed were discordant. Failure to complete the trial was not explained by depression, craving, benzoylecgonine levels, or quantitative electroencephalogram (QEEG) power; thus cordance may help identify attrition risk.

  8. On possibilities of using global monitoring in effective prevention of tailings storage facilities failures.

    PubMed

    Stefaniak, Katarzyna; Wróżyńska, Magdalena

    2018-02-01

    Protection of common natural goods is one of the greatest challenges man faces every day. Extracting and processing natural resources such as mineral deposits contributes to the transformation of the natural environment. A number of activities designed to keep this balance are undertaken in accordance with the concept of integrated order; one of them is the use of comprehensive tailings storage facility monitoring systems. Despite such monitoring, failures still occur. The quantitative record of these failures illustrates both the scale of the problem and the magnitude of their consequences. The paper presents the vast possibilities offered by global monitoring for the effective prevention of these failures. Particular attention is drawn to the potential of multidirectional monitoring, including technical and environmental monitoring, illustrated by one of the world's biggest hydrotechnical constructions: the Żelazny Most Tailings Storage Facility (TSF), Poland. Analysis of monitoring data makes it possible to take preventive action against construction failures of facility dams, which can have devastating effects on human life and the natural environment.

  9. Adherence and drug resistance: predictions for therapy outcome.

    PubMed Central

    Wahl, L M; Nowak, M A

    2000-01-01

    We combine standard pharmacokinetics with an established model of viral replication to predict the outcome of therapy as a function of adherence to the drug regimen. We consider two types of treatment failure: failure to eliminate the wild-type virus, and the emergence of drug-resistant virus. Specifically, we determine the conditions under which resistance dominates as a result of imperfect adherence. We derive this result for both single- and triple-drug therapies, with attention to conditions which favour the emergence of viral strains that are resistant to one or more drugs in a cocktail. Our analysis provides quantitative estimates of the degree of adherence necessary to prevent resistance. We derive results specific to the treatment of human immunodeficiency virus infection, but emphasize that our method is applicable to a range of viral or other infections treated by chemotherapy. PMID:10819155
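A drastically simplified caricature of the adherence/resistance trade-off (not the paper's pharmacokinetic model; every number here is invented) can illustrate the three therapy outcomes:

```python
# Toy caricature of the adherence/resistance trade-off.  A strain grows
# when its reproductive ratio under therapy exceeds 1; drug pressure
# scales with the fraction of doses taken (adherence a), and the
# resistant strain pays a fitness cost (lower R0) but is less affected
# by the drug (lower efficacy term).  All rates are invented.
def reproductive_ratio(r0, efficacy, adherence):
    return r0 * (1.0 - efficacy * adherence)

R0_WT, EFF_WT = 4.0, 1.00     # wild type: fit, fully drug-sensitive
R0_RES, EFF_RES = 2.0, 0.55   # resistant: fitness cost, partly drug-tolerant

results = {}
for a in (0.60, 0.85, 0.95):
    r_wt = reproductive_ratio(R0_WT, EFF_WT, a)
    r_res = reproductive_ratio(R0_RES, EFF_RES, a)
    if r_wt > 1.0:
        outcome = "wild type persists"
    elif r_res > 1.0:
        outcome = "resistance emerges"
    else:
        outcome = "both strains suppressed"
    results[a] = outcome
    print(f"adherence {a:.2f}: R_wt={r_wt:.2f}, R_res={r_res:.2f} -> {outcome}")
```

The qualitative picture matches the abstract's point: resistance dominates in an intermediate adherence window where the drug suppresses the wild type but not the resistant strain.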

  10. Quantitative evolutionary design

    PubMed Central

    Diamond, Jared

    2002-01-01

    The field of quantitative evolutionary design uses evolutionary reasoning (in terms of natural selection and ultimate causation) to understand the magnitudes of biological reserve capacities, i.e. excesses of capacities over natural loads. Ratios of capacities to loads, defined as safety factors, fall in the range 1.2-10 for most engineered and biological components, even though engineered safety factors are specified intentionally by humans while biological safety factors arise through natural selection. Familiar examples of engineered safety factors include those of buildings, bridges and elevators (lifts), while biological examples include factors of bones and other structural elements, of enzymes and transporters, and of organ metabolic performances. Safety factors serve to minimize the overlap zone (resulting in performance failure) between the low tail of capacity distributions and the high tail of load distributions. Safety factors increase with coefficients of variation of load and capacity, with capacity deterioration with time, and with cost of failure, and decrease with costs of initial construction, maintenance, operation, and opportunity. Adaptive regulation of many biological systems involves capacity increases with increasing load; several quantitative examples suggest sublinear increases, such that safety factors decrease towards 1.0. Unsolved questions include safety factors of series systems, parallel or branched pathways, elements with multiple functions, enzyme reaction chains, and equilibrium enzymes. The modest sizes of safety factors imply the existence of costs that penalize excess capacities. Those costs are likely to involve wasted energy or space for large or expensive components, but opportunity costs of wasted space at the molecular level for minor components. PMID:12122135
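Under the illustrative assumption of independent, normally distributed load and capacity, the safety factor needed to hold the failure probability (the overlap of the two distributions) to a target level can be computed directly; the sketch below also shows safety factors growing with the coefficients of variation, as the abstract describes:

```python
from math import sqrt
from statistics import NormalDist

def required_safety_factor(cv_load, cv_capacity, p_fail_target):
    """Safety factor SF = mean(capacity) / mean(load) such that
    P(load > capacity) == p_fail_target, assuming independent normally
    distributed load and capacity (an illustrative assumption)."""
    beta = NormalDist().inv_cdf(1.0 - p_fail_target)  # reliability index
    # With mean load 1 and capacity ~ N(SF, (SF*cv_capacity)^2), failure
    # probability p requires SF - 1 = beta * sqrt(SF^2 cv_capacity^2 + cv_load^2),
    # which is a quadratic in SF with the positive root below.
    a = 1.0 - beta ** 2 * cv_capacity ** 2
    c = 1.0 - beta ** 2 * cv_load ** 2
    return (1.0 + sqrt(1.0 - a * c)) / a

for cv in (0.05, 0.10, 0.20):
    sf = required_safety_factor(cv, cv, 1e-3)
    print(f"cv = {cv:.2f}: required safety factor = {sf:.2f}")
```

For a 0.1% failure probability the required factors fall roughly between 1.2 and 3, comfortably inside the 1.2-10 range quoted above.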

  11. Problems experienced by informal caregivers of individuals with heart failure: An integrative review.

    PubMed

    Grant, Joan S; Graven, Lucinda J

    2018-04-01

    The purpose of this review was to examine and synthesize recent literature regarding problems experienced by informal caregivers when providing care for individuals with heart failure in the home. Integrative literature review. A review of current empirical literature was conducted utilizing PubMed, CINAHL, Embase, Sociological Abstracts, Social Sciences Full Text, PsycARTICLES, PsycINFO, Health Source: Nursing/Academic Edition, and Cochrane computerized databases. 19 qualitative, 16 quantitative, and 2 mixed methods studies met the inclusion criteria for review. Computerized databases were searched for a combination of subject terms (i.e., MeSH) and keywords related to informal caregivers, problems, and heart failure. The title and abstract of identified articles and reference lists were reviewed. Studies were included if they were published in English between January 2000 and December 2016 and examined problems experienced by informal caregivers in providing care for individuals with heart failure in the home. Studies were excluded if not written in English or if elements of caregiving in heart failure were not present in the title, abstract, or text. Unpublished and duplicate empirical literature as well as articles related to specific end-stage heart failure populations also were excluded. Methodology described by Cooper and others for integrative reviews of quantitative and qualitative research was used. Quality appraisal of the included studies was evaluated using the Joanna Briggs Institute critical appraisal tools for cross-sectional quantitative and qualitative studies. 
Informal caregivers experienced four key problems when providing care for individuals with heart failure in the home: performing multifaceted activities and roles that evolve around daily heart failure demands; maintaining caregiver physical, emotional, social, spiritual, and financial well-being; having insufficient caregiver support; and performing caregiving with uncertainty and inadequate knowledge. Informal caregivers of individuals with heart failure thus experience complex problems in the home when providing care, problems which affect all aspects of their lives. Incorporating advice from informal caregivers of individuals with heart failure will assist in the development of interventions to reduce negative caregiver outcomes. Given the complex roles involved in caring for individuals with heart failure, multicomponent interventions are a promising way to assist informal caregivers in performing these roles. Published by Elsevier Ltd.

  12. Commonalities and Differences in Functional Safety Systems Between ISS Payloads and Industrial Applications

    NASA Astrophysics Data System (ADS)

    Malyshev, Mikhail; Kreimer, Johannes

    2013-09-01

    Safety analyses for electrical, electronic and/or programmable electronic (E/E/PE) safety-related systems used in payload applications on board the International Space Station (ISS) are often based on failure modes, effects and criticality analysis (FMECA). For industrial applications of E/E/PE safety-related systems, comparable strategies exist and are defined in the IEC 61508 standard. This standard defines quantitative criteria based on potential failure modes (for example, the Safe Failure Fraction). These criteria can be calculated for an E/E/PE system or its components to assess their compliance with the requirements of a particular Safety Integrity Level (SIL); the standard defines several SILs depending on how much risk has to be mitigated by a safety-critical system. When a FMECA is available for an ISS payload or its subsystem, it may be possible to calculate the same or similar parameters as defined in IEC 61508. One example of a payload with a dedicated functional safety subsystem is the Electromagnetic Levitator (EML), a high-temperature materials processing facility planned for operation on board the ISS starting in 2014. Its dedicated subsystem, the "Hazard Control Electronics" (HCE), implements the failure tolerance required of payloads by the ISS Program: it limits sample processing parameters so that generation of potentially toxic by-products stays within safe limits. The objective of this paper is to assess the implementation of the HCE in the EML against the criteria for functional safety systems in IEC 61508 and to evaluate commonalities and differences with respect to the safety requirements levied on ISS payloads. An attempt is made to assess the possibility of using commercially available components and systems certified for compliance with industrial functional safety standards in ISS payloads.
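The Safe Failure Fraction mentioned above can be computed from the kind of failure-mode breakdown a FMECA provides; the failure-rate numbers below are invented for illustration, not data for any real component:

```python
# Illustrative Safe Failure Fraction (SFF) calculation in the spirit of
# IEC 61508: the fraction of the overall failure rate that is either safe
# or dangerous-but-detected by diagnostics.  Rates are made-up numbers in
# FIT (failures per 10^9 hours).
lambda_safe = 400.0                 # failures with a safe outcome
lambda_dangerous_detected = 350.0   # dangerous, but caught by diagnostics
lambda_dangerous_undetected = 50.0  # dangerous and unrevealed

total = lambda_safe + lambda_dangerous_detected + lambda_dangerous_undetected
sff = (lambda_safe + lambda_dangerous_detected) / total
print(f"SFF = {sff:.1%}")
```

In IEC 61508 the computed SFF is then read against tables (together with hardware fault tolerance) to decide which SIL the architecture can claim.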

  13. A study of Mariner 10 flight experiences and some flight piece part failure rate computations

    NASA Technical Reports Server (NTRS)

    Paul, F. A.

    1976-01-01

    The problems and failures encountered during the Mariner 10 flight are discussed, and the data made available through a quantitative accounting of all electronic piece parts on the spacecraft are summarized. Computed failure rates for electronic piece parts are also presented. These computed data are intended for use in the continued updating of the failure-rate base used for trade-off studies and predictions for future JPL space missions.
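A standard way to turn piece-part operating experience of this kind into failure rates with confidence bounds (assuming a Poisson failure process; the counts below are invented) is the chi-square method:

```python
from scipy.stats import chi2

# Illustrative point estimate and one-sided upper confidence bound for a
# piece-part failure rate from operating experience (made-up numbers).
failures = 2
part_hours = 4.0e6          # total accumulated part operating hours

point_estimate = failures / part_hours
# Standard chi-square upper bound for a Poisson failure process:
upper_90 = chi2.ppf(0.90, 2 * failures + 2) / (2.0 * part_hours)

print(f"point estimate:  {point_estimate:.2e} failures/hour")
print(f"90% upper bound: {upper_90:.2e} failures/hour")
```

The upper bound is what conservative trade-off studies typically carry forward, since the point estimate is unstable when the failure count is small.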

  14. Goal-Function Tree Modeling for Systems Engineering and Fault Management

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Johnson, Stephen B.

    2013-01-01

    The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASAHDBK- 1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. 
This brief description of the relationships between SE, SHM, and FM provides hints toward a modeling approach that gives formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites SE goals and functions with the failure to achieve those goals and functions (SHM/FM). This methodology and corresponding model, known as a Goal-Function Tree (GFT), provides a means to represent, decompose, and elaborate system goals and functions in a rigorous manner that connects directly to design through use of state variables that translate natural language requirements and goals into logical-physical state language. The state variable-based approach also provides the means to directly connect FM to the design, by specifying the range in which state variables must be controlled to achieve goals, and conversely, the failures that exist if system behavior goes out of range. This in turn allows the systems engineers and SHM/FM engineers to determine which state variables to monitor, and what action(s) to take should the system fail to achieve that goal. In sum, the GFT representation provides a unified approach to early-phase SE and FM development. This representation and methodology has been successfully developed and implemented using Systems Modeling Language (SysML) on the NASA Space Launch System (SLS) Program. It enabled early design trade studies of failure detection coverage to ensure complete detection coverage of all crew-threatening failures. The representation maps directly both to FM algorithm designs, and to failure scenario definitions needed for design analysis and testing. 
The GFT representation also provided the basis for mapping abort triggers into scenarios, both of which were needed for the initial and successful quantitative analyses of abort effectiveness (detection of and response to crew-threatening events).

  15. The chromatin-binding protein Smyd1 restricts adult mammalian heart growth

    PubMed Central

    Kimball, Todd; Rasmussen, Tara L.; Rosa-Garrido, Manuel; Chen, Haodong; Tran, Tam; Miller, Mickey R.; Gray, Ricardo; Jiang, Shanxi; Ren, Shuxun; Wang, Yibin; Tucker, Haley O.; Vondriska, Thomas M.

    2016-01-01

    All terminally differentiated organs face two challenges: maintaining their cellular identity and restricting organ size. The molecular mechanisms responsible for these decisions are of critical importance to organismal development, and perturbations in their normal balance can lead to disease. A hallmark of heart failure, a condition affecting millions of people worldwide, is hypertrophic growth of cardiomyocytes. The various forms of heart failure in human and animal models share conserved transcriptome remodeling events that lead to expression of genes normally silenced in the healthy adult heart. However, the chromatin remodeling events that maintain cell and organ size are incompletely understood; insights into these mechanisms could provide new targets for heart failure therapy. Using a quantitative proteomics approach to identify muscle-specific chromatin regulators in a mouse model of hypertrophy and heart failure, we identified upregulation of the histone methyltransferase Smyd1 during disease. Inducible loss-of-function studies in vivo demonstrate that Smyd1 is responsible for restricting growth in the adult heart, with its absence leading to cellular hypertrophy, organ remodeling, and fulminant heart failure. Molecular studies reveal Smyd1 to be a muscle-specific regulator of gene expression and indicate that Smyd1 modulates expression of gene isoforms whose expression is associated with cardiac pathology. Importantly, activation of Smyd1 can prevent pathological cell growth. These findings have basic implications for our understanding of cardiac pathologies and open new avenues to the treatment of cardiac hypertrophy and failure by modulating Smyd1. PMID:27663768

  16. The chromatin-binding protein Smyd1 restricts adult mammalian heart growth.

    PubMed

    Franklin, Sarah; Kimball, Todd; Rasmussen, Tara L; Rosa-Garrido, Manuel; Chen, Haodong; Tran, Tam; Miller, Mickey R; Gray, Ricardo; Jiang, Shanxi; Ren, Shuxun; Wang, Yibin; Tucker, Haley O; Vondriska, Thomas M

    2016-11-01

    All terminally differentiated organs face two challenges: maintaining their cellular identity and restricting organ size. The molecular mechanisms responsible for these decisions are of critical importance to organismal development, and perturbations in their normal balance can lead to disease. A hallmark of heart failure, a condition affecting millions of people worldwide, is hypertrophic growth of cardiomyocytes. The various forms of heart failure in human and animal models share conserved transcriptome remodeling events that lead to expression of genes normally silenced in the healthy adult heart. However, the chromatin remodeling events that maintain cell and organ size are incompletely understood; insights into these mechanisms could provide new targets for heart failure therapy. Using a quantitative proteomics approach to identify muscle-specific chromatin regulators in a mouse model of hypertrophy and heart failure, we identified upregulation of the histone methyltransferase Smyd1 during disease. Inducible loss-of-function studies in vivo demonstrate that Smyd1 is responsible for restricting growth in the adult heart, with its absence leading to cellular hypertrophy, organ remodeling, and fulminant heart failure. Molecular studies reveal Smyd1 to be a muscle-specific regulator of gene expression and indicate that Smyd1 modulates expression of gene isoforms whose expression is associated with cardiac pathology. Importantly, activation of Smyd1 can prevent pathological cell growth. These findings have basic implications for our understanding of cardiac pathologies and open new avenues to the treatment of cardiac hypertrophy and failure by modulating Smyd1. Copyright © 2016 the American Physiological Society.

  17. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 microm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
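A rocking-curve fit with a symmetric Pearson type VII function can be sketched with SciPy; the data here are synthetic, generated from the profile itself plus noise, and the parameter values are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def pearson_vii(theta, amp, center, hwhm, m):
    """Symmetric Pearson type VII profile.  hwhm is the half width at half
    maximum; m controls the tail weight (m=1 Lorentzian, m->inf Gaussian)."""
    return amp * (1.0 + ((theta - center) / hwhm) ** 2
                  * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

rng = np.random.default_rng(2)
theta = np.linspace(-20.0, 20.0, 201)             # analyser angle (synthetic)
true_curve = pearson_vii(theta, 1.0, 0.0, 5.0, 1.8)
data = true_curve + rng.normal(0.0, 0.01, theta.size)  # noisy rocking curve

popt, _ = curve_fit(pearson_vii, theta, data, p0=[0.9, 0.5, 4.0, 1.5])
amp, center, hwhm, m = popt
print(f"fit: amp={amp:.3f} center={center:.3f} hwhm={hwhm:.3f} m={m:.2f}")
```

The fitted analytic curve and its derivative are what make the two-image phase retrieval described above tractable; a linear or Gaussian fit constrains the tails of the rocking curve less faithfully.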

  18. Failure-probability driven dose painting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose-response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose-response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%-91%), compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.
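The redistribution idea can be caricatured with a two-compartment toy model (a textbook logistic TCP curve and invented volumes and radiosensitivities, not the paper's fitted five-compartment model):

```python
import numpy as np

def tcp_logit(dose, d50, gamma50):
    """Logistic tumour-control-probability model (a standard textbook
    form, not the paper's fitted model)."""
    return 1.0 / (1.0 + (d50 / dose) ** (4.0 * gamma50))

# Two toy compartments: a recurrence-prone core (higher D50) and a
# low-risk periphery, with made-up volumes and radiosensitivities.
vol = np.array([0.2, 0.8])          # relative volumes
d50 = np.array([72.0, 55.0])        # Gy
gamma50 = np.array([2.0, 2.0])

mean_dose = 66.0                    # Gy, held fixed ("constant intensity")
best = None
for core_dose in np.arange(60.0, 85.5, 0.5):   # respect the 85 Gy cap
    # Keep the volume-weighted mean dose constant while boosting the core.
    periph_dose = (mean_dose - vol[0] * core_dose) / vol[1]
    doses = np.array([core_dose, periph_dose])
    tcp = np.prod(tcp_logit(doses, d50, gamma50))  # must control every compartment
    if best is None or tcp > best[1]:
        best = (core_dose, tcp)

uniform_tcp = np.prod(tcp_logit(np.array([66.0, 66.0]), d50, gamma50))
print(f"uniform 66 Gy:  TCP = {uniform_tcp:.2f}")
print(f"boost core to {best[0]:.1f} Gy: TCP = {best[1]:.2f}")
```

Because the failures concentrate in the small core, shifting dose from the periphery to the core raises the product TCP at the same integral dose, which is the qualitative effect reported above.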

  19. Simulating Initial and Progressive Failure of Open-Hole Composite Laminates under Tension

    NASA Astrophysics Data System (ADS)

    Guo, Zhangxin; Zhu, Hao; Li, Yongcun; Han, Xiaoping; Wang, Zhihua

    2016-12-01

    A finite element (FE) model is developed for the progressive failure analysis of fiber reinforced polymer laminates. The failure criterion for fiber and matrix failure is implemented in the FE code Abaqus using the user-defined material subroutine UMAT. The gradual degradation of the material properties is controlled by the individual fracture energies of fiber and matrix. The failure and damage in composite laminates containing a central hole subjected to uniaxial tension are simulated. The numerical results show that the damage model can be used to accurately predict the progressive failure behaviour both qualitatively and quantitatively.
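The per-ply failure check inside such a UMAT is often a Hashin-type criterion; the sketch below uses that form purely as an illustration (the paper's actual criterion, and all strength and stress values here, are assumptions):

```python
def hashin_fiber_tension(s11, s12, xt, s12_allow):
    """Hashin-type fibre tensile failure index (failure when index >= 1).
    Illustrative only: the paper's UMAT may use a different criterion."""
    return (s11 / xt) ** 2 + (s12 / s12_allow) ** 2

def hashin_matrix_tension(s22, s12, yt, s12_allow):
    """Hashin-type matrix tensile failure index (failure when index >= 1)."""
    return (s22 / yt) ** 2 + (s12 / s12_allow) ** 2

# Representative carbon/epoxy ply strengths (MPa, illustrative values)
XT, YT, S12 = 2000.0, 50.0, 80.0

# In-plane stress state near the hole edge (MPa, made up for the example)
s11, s22, s12 = 1400.0, 42.0, 50.0

fi_fiber = hashin_fiber_tension(s11, s12, XT, S12)
fi_matrix = hashin_matrix_tension(s22, s12, YT, S12)
print(f"fiber index  = {fi_fiber:.2f}")   # < 1: fibres still intact
print(f"matrix index = {fi_matrix:.2f}")  # >= 1: matrix cracking has begun
```

In a progressive-failure scheme, the index exceeding 1 triggers stiffness degradation for that mode, with the degradation rate governed by the corresponding fracture energy.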

  20. Hormonal and metabolic defects in a prader-willi syndrome mouse model with neonatal failure to thrive.

    PubMed

    Stefan, M; Ji, H; Simmons, R A; Cummings, D E; Ahima, R S; Friedman, M I; Nicholls, R D

    2005-10-01

    Prader-Willi syndrome (PWS) has a biphasic clinical phenotype with failure to thrive in the neonatal period followed by hyperphagia and severe obesity commencing in childhood among other endocrinological and neurobehavioral abnormalities. The syndrome results from loss of function of several clustered, paternally expressed genes in chromosome 15q11-q13. PWS is assumed to result from a hypothalamic defect, but the pathophysiological basis of the disorder is unknown. We hypothesize that a fetal developmental abnormality in PWS leads to the neonatal phenotype, whereas the adult phenotype results from a failure in compensatory mechanisms. To address this hypothesis and better characterize the neonatal failure to thrive phenotype during postnatal life, we studied a transgenic deletion PWS (TgPWS) mouse model that shares similarities with the first stage of the human syndrome. TgPWS mice have fetal and neonatal growth retardation associated with profoundly reduced insulin and glucagon levels. Consistent with growth retardation, TgPWS mice have deregulated liver expression of IGF system components, as revealed by quantitative gene expression studies. Lethality in TgPWS mice appears to result from severe hypoglycemia after postnatal d 2 after depletion of liver glycogen stores. Consistent with hypoglycemia, TgPWS mice appear to have increased fat oxidation. Ghrelin levels increase in TgPWS reciprocally with the falling glucose levels, suggesting that the rise in ghrelin reported in PWS patients may be secondary to a perceived energy deficiency. Together, the data reveal defects in endocrine pancreatic function as well as glucose and hepatic energy metabolism that may underlie the neonatal phenotype of PWS.

  1. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    The theory of System Health Management (SHM) and of its operational subset Fault Management (FM) states that FM is implemented as a "meta" control loop, known as an FM Control Loop (FMCL). The FMCL detects that all or part of a system is now failed, or in the future will fail (that is, cannot be controlled within acceptable limits to achieve its objectives), and takes a control action (a response) to return the system to a controllable state. In terms of control theory, the effectiveness of each FMCL is estimated based on its ability to correctly estimate the system state, and on the speed of its response to the current or impending failure effects. This paper describes how this theory has been successfully applied on the National Aeronautics and Space Administration's (NASA) Space Launch System (SLS) Program to quantitatively estimate the effectiveness of proposed abort triggers so as to select the most effective suite to protect the astronauts from catastrophic failure of the SLS. The premise behind this process is to be able to quantitatively provide the value versus risk trade-off for any given abort trigger, allowing decision makers to make more informed decisions. All current and planned crewed launch vehicles have some form of vehicle health management system integrated with an emergency launch abort system to ensure crew safety. While the design can vary, the underlying principle is the same: detect imminent catastrophic vehicle failure, initiate launch abort, and extract the crew to safety. Abort triggers are the detection mechanisms that identify that a catastrophic launch vehicle failure is occurring or is imminent and cause the initiation of a notification to the crew vehicle that the escape system must be activated. While ensuring that the abort triggers provide this function, designers must also ensure that the abort triggers do not signal that a catastrophic failure is imminent when in fact the launch vehicle can successfully achieve orbit. 
That is, the abort triggers must have low false negative rates to be sure that real crew-threatening failures are detected, and also low false positive rates to ensure that the crew does not abort from non-crew-threatening launch vehicle behaviors. The analysis process described in this paper is a compilation of over six years of lessons learned and refinements from experiences developing abort triggers for NASA's Constellation Program (Ares I Project) and the SLS Program, as well as the simultaneous development of SHM/FM theory. The paper will describe the abort analysis concepts and process, developed in conjunction with SLS Safety and Mission Assurance (S&MA) to define a common set of mission phase, failure scenario, and Loss of Mission Environment (LOME) combinations upon which the SLS Loss of Mission (LOM) Probabilistic Risk Assessment (PRA) models are built. This abort analysis also requires strong coordination with the Multi-Purpose Crew Vehicle (MPCV) and SLS Structures and Environments (STE) to formulate a series of abortability tables that encapsulate explosion dynamics over the ascent mission phase. The design and assessment of abort conditions and triggers to estimate their Loss of Crew (LOC) Benefits also requires in-depth integration with other groups, including Avionics, Guidance, Navigation and Control(GN&C), the Crew Office, Mission Operations, and Ground Systems. The outputs of this analysis are a critical input to SLS S&MA's LOC PRA models. The process described here may well be the first full quantitative application of SHM/FM theory to the selection of a sensor suite for any aerospace system.
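
    The value-versus-risk trade described above can be sketched as a toy calculation. The trigger names and all probabilities are hypothetical illustrations, not SLS figures: each candidate trigger is scored by the crew risk that remains after accounting for missed detections, failed aborts, and false aborts.

```python
def p_loc(p_fail, detect, p_fp, p_abort_ok=0.9):
    """Toy probability of loss of crew with a given abort trigger.

    p_fail:     probability of a catastrophic vehicle failure in flight
    detect:     trigger detection probability (1 - false-negative rate)
    p_fp:       probability of a false abort on a healthy vehicle
    p_abort_ok: probability an initiated abort saves the crew
    """
    missed = p_fail * (1.0 - detect)                      # failure, no abort
    failed_abort = p_fail * detect * (1.0 - p_abort_ok)   # abort attempted, lost
    false_abort_loss = (1.0 - p_fail) * p_fp * (1.0 - p_abort_ok)
    return missed + failed_abort + false_abort_loss

# Hypothetical candidate triggers: (detection probability, false-positive prob.)
triggers = {"chamber-pressure": (0.95, 1e-3), "turbopump-vibe": (0.80, 1e-5)}
# Rank by residual crew risk (lower is better)
ranked = sorted(triggers, key=lambda t: p_loc(0.01, *triggers[t]))
```

    Even this crude model shows the tension the abstract describes: a sensitive trigger buys detection coverage at the cost of more false aborts, and the selection problem is to minimize the combined risk.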

  2. An academic medical center's response to widespread computer failure.

    PubMed

    Genes, Nicholas; Chary, Michael; Chason, Kevin W

    2013-01-01

    As hospitals incorporate information technology (IT), their operations become increasingly vulnerable to technological breakdowns and attacks. Proper emergency management and business continuity planning require an approach to identify, mitigate, and work through IT downtime. Hospitals can prepare for these disasters by reviewing case studies. This case study details the disruption of computer operations at Mount Sinai Medical Center (MSMC), an urban academic teaching hospital. The events, and MSMC's response, are narrated and the impact on hospital operations is analyzed. MSMC's disaster management strategy prevented computer failure from compromising patient care, although walkouts and time-to-disposition in the emergency department (ED) notably increased. This incident highlights the importance of disaster preparedness and mitigation. It also demonstrates the value of using operational data to evaluate hospital responses to disasters. Quantifying normal hospital functions, just as with a patient's vital signs, may help quantitatively evaluate and improve disaster management and business continuity planning.

  3. Current Understanding of the Pathophysiology of Myocardial Fibrosis and Its Quantitative Assessment in Heart Failure

    PubMed Central

    Liu, Tong; Song, Deli; Dong, Jianzeng; Zhu, Pinghui; Liu, Jie; Liu, Wei; Ma, Xiaohai; Zhao, Lei; Ling, Shukuan

    2017-01-01

    Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis. PMID:28484397

  4. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground in engineering practice, but it is still the industry standard to dimension components with deterministic safety-margin approaches and to manage product risks with qualitative methods. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes because of the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
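
    The Monte Carlo load-resistance step can be sketched as follows. The normal distributions and their parameters are illustrative assumptions, not values from the paper; in practice they would come from measured scatter in loads and material strength.

```python
import random

def failure_probability(n=100_000, seed=1):
    """Monte Carlo estimate of P(load > resistance).

    Draws n (load, resistance) pairs and counts interference events,
    i.e. samples where the applied load exceeds the strength.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(300.0, 40.0)        # MPa, applied stress (assumed)
        resistance = rng.gauss(480.0, 50.0)  # MPa, strength (assumed)
        if load > resistance:
            failures += 1
    return failures / n
```

    For these placeholder distributions the analytic interference result is about 2.5e-3 (margin 180 MPa over a combined standard deviation of roughly 64 MPa), so the simulation also serves as a sanity check on the sampling.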

  5. Top-Down Quantitative Proteomics Identified Phosphorylation of Cardiac Troponin I as a Candidate Biomarker for Chronic Heart Failure

    PubMed Central

    Zhang, Jiang; Guy, Moltu J.; Norman, Holly S.; Chen, Yi-Chen; Xu, Qingge; Dong, Xintong; Guner, Huseyin; Wang, Sijian; Kohmoto, Takushi; Young, Ken H.; Moss, Richard L.; Ge, Ying

    2011-01-01

    The rapid increase in the prevalence of chronic heart failure (CHF) worldwide underscores an urgent need to identify biomarkers for the early detection of CHF. Post-translational modifications (PTMs) are associated with many critical signaling events during disease progression and thus offer a plethora of candidate biomarkers. We have employed top-down quantitative proteomics methodology for comprehensive assessment of PTMs in whole proteins extracted from normal and diseased tissues. We have systematically analyzed thirty-six clinical human heart tissue samples and identified phosphorylation of cardiac troponin I (cTnI) as a candidate biomarker for CHF. The relative percentages of the total phosphorylated cTnI forms over the entire cTnI populations (%Ptotal) were 56.4±3.5%, 36.9±1.6%, 6.1±2.4%, and 1.0±0.6% for postmortem hearts with normal cardiac function (n=7), early-stage of mild hypertrophy (n=5), severe hypertrophy/dilation (n=4), and end-stage CHF (n=6), respectively. In fresh transplant samples, the %Ptotal of cTnI from non-failing donor (n=4), and end-stage failing hearts (n=10) were 49.5±5.9% and 18.8±2.9%, respectively. Top-down MS with electron capture dissociation unequivocally localized the altered phosphorylation sites to Ser22/23 and determined the order of phosphorylation/dephosphorylation. This study represents the first clinical application of top-down MS-based quantitative proteomics for biomarker discovery from tissues, highlighting the potential of PTM as disease biomarkers. PMID:21751783

  6. A cascading failure model for analyzing railway accident causation

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model describes the cascading process of a railway accident more accurately than previous models and can quantitatively analyze the sensitivities and influence of the causes. In conclusion, this model can help reveal the latent rules of accident causation and thus reduce the occurrence of railway accidents.
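
    The load-redistribution idea can be sketched as a toy graph model. The node names, loads, causal strengths, and threshold below are hypothetical; the paper's actual redistribution rule and threshold derivation are more involved.

```python
def cascade(loads, edges, threshold):
    """Toy cascading-failure sketch: a node whose load exceeds the
    threshold fails and sheds its load to not-yet-failed neighbours in
    proportion to the causal strength of each outgoing edge.

    loads: {node: initial load}
    edges: {node: {neighbour: causal strength}} (directed)
    Returns the set of failed nodes once the cascade stops.
    """
    loads = dict(loads)
    failed = set()
    frontier = [n for n, load in loads.items() if load > threshold]
    while frontier:
        node = frontier.pop()
        if node in failed:
            continue
        failed.add(node)
        targets = {m: w for m, w in edges.get(node, {}).items()
                   if m not in failed}
        if not targets:
            continue
        total = sum(targets.values())
        for m, w in targets.items():
            loads[m] += loads[node] * w / total
            if loads[m] > threshold:
                frontier.append(m)
    return failed
```

    Raising the threshold in this sketch plays the role of a prevention measure: below the critical value an initial overload propagates through the causation graph, above it the cascade dies out.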

  7. Failure mechanisms of fibrin-based surgical tissue adhesives

    NASA Astrophysics Data System (ADS)

    Sierra, David Hugh

    A series of studies was performed to investigate the potential impact of heterogeneity in the matrix of multiple-component fibrin-based tissue adhesives upon their mechanical and biomechanical properties both in vivo and in vitro. Investigations into the failure mechanisms by stereological techniques demonstrated that heterogeneity could be measured quantitatively and that the variation in heterogeneity could be altered both by the means of component mixing and delivery and by the formulation of the sealant. Ex vivo tensile adhesive strength was found to be inversely proportional to the amount of heterogeneity. In contrast, in vivo tensile wound-closure strength was found to be relatively unaffected by the degree of heterogeneity, while in vivo parenchymal organ hemostasis in rabbits was found to be affected: greater heterogeneity appeared to correlate with an increase in hemostasis time and in the amount of sealant necessary to effect hemostasis. Tensile testing of the bulk sealant showed that mechanical parameters were proportional to fibrin concentration and that the physical characteristics of the failure supported a ductile mechanism. Strain hardening as a function of strain percentage and strain rate was observed for both concentrations, and syneresis was observed at low strain rates for the lower fibrin concentration. Blister testing demonstrated that burst pressure and failure energy were proportional to fibrin concentration and decreased with increasing flow rate. The higher fibrin concentration demonstrated predominantly compact-morphology debonds with cohesive failure loci, demonstrating shear or viscous failure in a viscoelastic rubbery adhesive. The lower fibrin concentration sealant exhibited predominantly fractal-morphology debonds with cohesive failure loci, supporting an elastoviscous material condition. The failure mechanism for these was hypothesized and shown to be flow-induced ductile fracture. 
Based on these findings, the failure mechanism was stochastic in nature because the mean failure energy and burst pressure values were not predictive of locus and morphology. Instead, flow rate and fibrin concentration showed the most predictive value, with the outcome best described as a probability distribution rather than a specific deterministic outcome.

  8. Clinical assessment of social cognitive function in neurological disorders.

    PubMed

    Henry, Julie D; von Hippel, William; Molenberghs, Pascal; Lee, Teresa; Sachdev, Perminder S

    2016-01-01

    Social cognition broadly refers to the processing of social information in the brain that underlies abilities such as the detection of others' emotions and responding appropriately to these emotions. Social cognitive skills are critical for successful communication and, consequently, mental health and wellbeing. Disturbances of social cognition are early and salient features of many neuropsychiatric, neurodevelopmental and neurodegenerative disorders, and often occur after acute brain injury. Its assessment in the clinic is, therefore, of paramount importance. Indeed, the most recent edition of the American Psychiatric Association's Diagnostic and Statistical Manual for Mental Disorders (DSM-5) introduced social cognition as one of six core components of neurocognitive function, alongside memory and executive control. Failures of social cognition most often present as poor theory of mind, reduced affective empathy, impaired social perception or abnormal social behaviour. Standard neuropsychological assessments lack the precision and sensitivity needed to adequately inform treatment of these failures. In this Review, we present appropriate methods of assessment for each of the four domains, using an example disorder to illustrate the value of these approaches. We discuss the clinical applications of testing for social cognitive function, and finally suggest a five-step algorithm for the evaluation and treatment of impairments, providing quantitative evidence to guide the selection of social cognitive measures in clinical practice.

  9. Noninvasive estimation of tissue edema in healthy volunteers and in patients suffering from heart failure

    NASA Astrophysics Data System (ADS)

    Gurfinkel, Yuri I.; Mikhailov, Valery M.; Kudutkina, Marina I.

    2004-06-01

    Capillaries play a critical role in cardiovascular function as the point of exchange of nutrients and waste products between tissues and the circulation. A common problem both for healthy volunteers examined during isolation and for patients suffering from heart failure is the quantitative estimation of tissue oedema. Until now, no objective assessment of body fluid retention in tissues has existed. Optical imaging of living capillaries is a challenging and medically important scientific problem. The goal of the investigation was to study the dynamics of microcirculation parameters, including tissue oedema, in healthy volunteers during extended isolation and relative hypokinesia as a model of a mission to the International Space Station (ISS). The other aim was to study the dynamics of microcirculation parameters, including tissue oedema, in patients suffering from heart failure under treatment. Healthy volunteers and patients: we studied four healthy male subjects aged 41, 37, 40, and 48 before the experiment (June 1999) and during the 240-d isolation period starting from July 3, 1999. Unique hermetic chambers with artificial environmental parameters allowed this study to be performed with maximum similarity to real conditions on the ISS. Three times a week, at the same time of day, each subject recorded three video episodes with a total length of one minute using the optical computerized capillaroscope for noninvasive measurement of capillary diameters, capillary blood velocity, and the size of the perivascular zone. The same microcirculation parameters were determined over three weeks in 15 patients (10 male, 5 female, aged 62.2±8.8) suffering from heart failure and receiving Furosemide 40 mg twice a week as a diuretic. Results: about 1500 episodes were recorded on laser disks and analyzed during this experiment. Every subject showed wave-like variations of capillary blood velocity within the minute, week, and month ranges. 
    It was found that the increase in perivascular zone size during isolation correlates with the subjects' body mass and probably depends on the retention of body fluids in tissues. Computerized capillaroscopy provides a new opportunity for noninvasive quantitative estimation of tissue oedema and supports precise management of patients suffering from heart failure under diuretic treatment.

  10. The Six Minute Walk Test Revisited

    NASA Astrophysics Data System (ADS)

    Mazumder, M.

    2017-12-01

    Background and Purpose: Heart failure is the leading cause of death and often alters or severely restricts human mobility, an essential life function. Motion capture is an emerging tool for analyzing human movement and extremity articulation, providing quantitative information on gait and range of motion. This study uses BioStamp mechanosensors to identify differences in motion for the duration of the Six Minute Walk Test and signature patterns of muscle contraction and posture in patients with advanced heart failure compared to healthy subjects. Identification and close follow up of these patterns may allow enhanced diagnosis and the possibility for early intervention before disease worsening. Additionally, movement parameters represent a new family of potential biomarkers to track heart failure onset, progression and therapy. Methods: Prior to the Six Minute Walk Test, BioStamps (MC10) were applied to the chest, upper and lower extremities of heart failure and healthy patients and data were streamed and recorded revealing the pattern of movement in three separate axes. Conjointly, before and after the Six Minute Walk Test, the following vitals were measured per subject: heart rate, respiratory rate, blood pressure, oxygen saturation, dyspnea and leg fatigue (self-reported with Borg scale). During the test, patients were encouraged to walk as far as they can in 6 minutes on a 30m course, as we recorded the number of laps completed and oxygen saturation every minute. Results and Conclusions: The sensors captured and quantified whole body and regional motion parameters including: a. motion extent, position, acceleration and angle via incorporated accelerometers and gyroscopes; b. muscle contraction via incorporated electromyogram (EMG). Accelerometry and gyroscopic data for the last five steps of a healthy and heart failure patient are shown. 
While significant differences in motion for the duration of the test were not found, each category of patients had a distinct pattern of motion - with identifiable qualitative and quantitative differences. These wearable conformal skin adherent sensors allow on-body, mobile, personalized determination of motion and flexibility parameters. This tool and method hold promise for providing motion "biomarker" data in health and disease.

  11. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure modes can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to a lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPNs) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error in reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN, resulting in a more accurate estimation than that of traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. 
The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural system under failure correlations.
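
    The weighted-geometric-mean step of the method can be sketched with triangular fuzzy ratings. This simplified version covers only a single failure mode, omits the MCS combination and Copula correlation handling, and uses hypothetical ratings and weights.

```python
def fuzzy_wgm_rpn(factors, weights):
    """Weighted geometric mean RPN over triangular fuzzy ratings.

    factors: list of (low, mode, high) fuzzy ratings, e.g. for
             severity, occurrence, and detectability
    weights: importance weights that sum to 1
    Returns a triangular fuzzy RPN, computed component-wise.
    """
    assert abs(sum(weights) - 1.0) < 1e-9
    rpn = [1.0, 1.0, 1.0]
    for (lo, mid, hi), w in zip(factors, weights):
        rpn[0] *= lo ** w
        rpn[1] *= mid ** w
        rpn[2] *= hi ** w
    return tuple(rpn)

def defuzzify(tri):
    """Centroid defuzzification of a triangular fuzzy number."""
    return sum(tri) / 3.0
```

    With equal weights and identical ratings the fuzzy RPN simply reproduces the rating, which is a convenient sanity check; unequal weights shift the result toward the more important risk factors.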

  12. Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure

    NASA Astrophysics Data System (ADS)

    Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak

    2017-09-01

    RR-interval time series in congestive heart failure have been studied with a variety of methods, including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure; 29 subjects) and normal (54 subjects), are analyzed with this technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased and normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences appear. This quantitative parameter obtained using visibility graph analysis can thereby be used as a potential bio-marker as well as a subsequent alarm generation mechanism for predicting the onset of congestive heart failure.
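
    The natural visibility graph construction is standard and can be sketched directly: each sample becomes a node, and two samples are linked when the straight line between them clears every intermediate sample. A summary statistic of the resulting graph, such as average degree, would be one candidate for the kind of quantitative parameter the authors describe (the specific parameter used in the paper is not reproduced here).

```python
def visibility_graph(series):
    """Natural visibility graph of a time series.

    Nodes are sample indices; i and j are linked when every sample
    strictly between them lies below the straight line joining
    (i, series[i]) and (j, series[j]).
    """
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j]
                + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges
```

    Adjacent samples are always mutually visible, so the graph of an n-point series has at least n - 1 edges; peaks act as hubs because they see past their lower neighbours.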

  13. Perfluorocarbon Nanoparticles for Physiological and Molecular Imaging and Therapy

    PubMed Central

    Chen, Junjie; Pan, Hua; Lanza, Gregory M.; Wickline, Samuel A.

    2014-01-01

    Herein we review the use of non-nephrotoxic perfluorocarbon nanoparticles (PFC NP) for noninvasive detection and therapy of kidney diseases, and provide a synopsis of other related literature pertinent to anticipated clinical application. Recent reports indicate that PFC NP allow quantitative mapping of kidney perfusion and oxygenation after ischemia-reperfusion injury with the use of a novel multi-nuclear 1H/19F magnetic resonance imaging (MRI) approach. Furthermore, when conjugated with targeting ligands, the functionalized PFC NP offer unique and quantitative capabilities for imaging inflammation in the kidney of atherosclerotic ApoE-null mice. Additionally, PFC NP can facilitate drug delivery for treatment of inflammation, thrombosis, and angiogenesis in selected conditions that are comorbidities of kidney failure. The excellent safety profile of PFC NP with respect to kidney injury positions these nanomedicine approaches as promising diagnostic and therapeutic candidates for treating and following acute and chronic kidney diseases. PMID:24206599

  14. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    NASA Technical Reports Server (NTRS)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and the risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. 
    The groundrules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the failure mode occurrence to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).

  15. Structural health monitoring and damage evaluation for steel confined reinforced concrete column using the acoustic emission technique

    NASA Astrophysics Data System (ADS)

    Du, Fangzhu; Li, Dongsheng

    2018-03-01

    As a new kind of composite structure, steel-confined reinforced concrete columns are attracting increasing attention in civil engineering. During the damage process, this new structure exhibits a highly complex and invisible failure mechanism due to the combined effects of steel tubes, concrete, and steel rebar. The acoustic emission (AE) technique has been extensively studied in nondestructive testing (NDT) and is currently applied in civil engineering for structural health monitoring (SHM) and damage evaluation. In the present study, the damage properties and failure evolution of steel-confined and unconfined reinforced concrete (RC) columns are investigated under quasi-static loading through AE signals. Significantly improved loading capacity and excellent energy dissipation characteristics demonstrate the practicality of the proposed structure. AE monitoring results indicate that the progressive deformation of the test specimens occurs in three stages representing different damage conditions. The sentry function, which compares the logarithm of the ratio between the stored strain energy (Es) and the released acoustic energy (Ea), explicitly discloses the damage growth and failure mechanism of the test specimens. Other extended AE features, including an index of damage (ID) and the relax ratio, are calculated to quantitatively evaluate the damage severity and critical points. The complicated temporal evolution of different AE features confirms the potential importance of integrated analysis of two or more parameters. The proposed multi-indicator analysis is capable of revealing the damage growth and failure mechanism of steel-confined RC columns and providing critical warning information before structural failure.
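
    The sentry function comparison of stored strain energy and released acoustic energy can be sketched as follows; this is a minimal illustration of the logarithmic ratio only, and published definitions vary in how the energies are normalized and windowed.

```python
import math

def sentry(strain_energy, acoustic_energy, eps=1e-12):
    """Sentry function: log of stored strain energy over cumulative
    released acoustic energy at the same load step. A falling value
    signals that acoustic energy release (damage) is outpacing the
    specimen's capacity to store strain energy.
    """
    return math.log(strain_energy / max(acoustic_energy, eps))
```

    Tracking this quantity over the loading history is what lets sudden drops mark damage growth events, as the abstract describes.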

  16. Analysis of the microRNA signature driving adaptive right ventricular hypertrophy in an ovine model of congenital heart disease.

    PubMed

    Kameny, Rebecca Johnson; He, Youping; Zhu, Terry; Gong, Wenhui; Raff, Gary W; Chapin, Cheryl J; Datar, Sanjeev A; Boehme, Jason; Hata, Akiko; Fineman, Jeffrey R

    2018-06-15

    The right ventricular (RV) response to pulmonary arterial hypertension (PAH) is heterogeneous. Most patients have maladaptive changes with RV dilation and failure, while some, especially patients with PAH secondary to congenital heart disease (CHD), have an adaptive response with hypertrophy and preserved systolic function. Mechanisms for RV adaptation to PAH are unknown despite RV function being a primary determinant of mortality. In our CHD ovine model with a fetally-implanted aortopulmonary shunt (shunt lambs), we previously demonstrated an adaptive physiologic RV response to increased afterload with hypertrophy. In this study, we examined small noncoding microRNA (miRNA) expression in the shunt RV and characterized the downstream effects of a key miRNA. RV tissue was harvested from 4-week-old shunt and control lambs (n=5), and miRNA, mRNA, and proteins were quantitated. We found differential expression of 40 cardiovascular-specific miRNAs in the shunt RV. Interestingly, this miRNA signature is distinct from models of RV failure, suggesting that miRNAs might contribute to adaptive RV hypertrophy. Among RV miRNAs, miR-199b is decreased in the RV with eventual downregulation of nuclear factor of activated T-cells (NFAT)/calcineurin signaling. Furthermore, anti-fibrotic miR-29a is increased in the shunt RV with reduction of the miR-29 targets Collagen A1 and 3A1 and decreased fibrosis. Thus, we conclude that the miRNA signature specific to shunt lambs is distinct from RV failure and drives gene expression required for adaptive RV hypertrophy. We propose that the adaptive RV miRNA signature may serve as a prognostic and therapeutic tool in patients with PAH to attenuate or prevent progression of RV failure and premature death.

  17. The SAM framework: modeling the effects of management factors on human behavior in risk analysis.

    PubMed

    Murphy, D M; Paté-Cornell, M E

    1996-08-01

    Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.

  18. Ultrashort echo time magnetization transfer (UTE-MT) imaging of cortical bone.

    PubMed

    Chang, Eric Y; Bae, Won C; Shao, Hongda; Biswas, Reni; Li, Shihong; Chen, Jun; Patil, Shantanu; Healey, Robert; D'Lima, Darryl D; Chung, Christine B; Du, Jiang

    2015-07-01

    Magnetization transfer (MT) imaging is one way to indirectly assess pools of protons with fast transverse relaxation. However, conventional MT imaging sequences are not applicable to short T2 tissues such as cortical bone. Ultrashort echo time (UTE) sequences with TE values as low as 8 µs can detect signals from different water components in cortical bone. In this study we aim to evaluate two-dimensional UTE-MT imaging of cortical bone and its application in assessing cortical bone porosity as measured by micro-computed tomography (μCT) and biomechanical properties. In total, 38 human cadaveric distal femur and proximal tibia bones were sectioned to produce 122 rectangular pieces of cortical bone for quantitative UTE-MT MR imaging, μCT, and biomechanical testing. Off-resonance saturation ratios (OSRs) with a series of MT pulse frequency offsets (Δf) were calculated and compared with porosity assessed with μCT, as well as elastic (modulus, yield stress, and strain) and failure (ultimate stress, failure strain, and energy) properties, using Pearson correlation and linear regression. A moderately strong negative correlation was observed between OSR and μCT porosity (R² = 0.46-0.51), while a moderate positive correlation was observed between OSR and yield stress (R² = 0.25-0.30) and failure stress (R² = 0.31-0.35), and a weak positive correlation (R² = 0.09-0.12) between OSR and Young's modulus at all off-resonance saturation frequencies. OSR determined with the UTE-MT sequence provides quantitative information on cortical bone and is sensitive to μCT porosity and biomechanical function. Copyright © 2015 John Wiley & Sons, Ltd.
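
    The correlation analysis reported above can be illustrated with a plain Pearson computation; the OSR and porosity values below are synthetic stand-ins, not the study's data:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Synthetic example: OSR falls as muCT porosity rises (negative correlation)
porosity = [5.0, 8.0, 12.0, 15.0, 20.0, 25.0]   # porosity, %
osr      = [0.86, 0.84, 0.79, 0.78, 0.71, 0.66]  # off-resonance saturation ratio

r = pearson_r(porosity, osr)
r_squared = r ** 2
```

    Squaring r gives the R² values quoted in the abstract; the sign of r carries the direction of the relationship that R² discards.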

  19. Development of a quantitative model for the mechanism of raveling failure in highway rock slopes using LIDAR.

    DOT National Transportation Integrated Search

    2013-03-01

    Rock falls on highways, while dangerous, are unpredictable. Most rock falls are of the raveling type and are not conducive to stability calculations, and even the failure mechanisms are not well understood. LIDAR (LIght Detection And Ranging) has been sh...

  20. The effect of heart failure and left ventricular assist device treatment on right ventricular mechanics: a computational study.

    PubMed

    Park, Jun I K; Heikhmakhtiar, Aulia Khamas; Kim, Chang Hyun; Kim, Yoo Seok; Choi, Seong Wook; Song, Kwang Soup; Lim, Ki Moo

    2018-05-22

    Although it is important to analyze the hemodynamic factors related to the right ventricle (RV) after left ventricular assist device (LVAD) implantation, previous studies have focused only on the alteration of the ventricular shape and lack quantitative analysis of the various hemodynamic parameters. Therefore, we quantitatively analyzed various hemodynamic parameters related to the RV under normal conditions, heart failure (HF), and HF treated with continuous-flow LVAD therapy, using a computational model. In this study, we combined a three-dimensional finite element electromechanical model of the ventricles, based on human ventricular morphology captured by magnetic resonance imaging (MRI), with a lumped model of the circulatory system and continuous-flow LVAD function in order to construct an integrated model of an LVAD-implanted cardiovascular system. To induce systolic dysfunction, the magnitude of the calcium transient function under the HF condition was reduced to 70% of the normal value, and the time constant was reduced by 30% of the normal value. Under the HF condition, the left ventricular end-systolic pressure decreased, the left ventricular end-diastolic pressure increased, and the pressures in the right atrium (RA), RV, and pulmonary artery (PA) increased compared with the normal condition. The LVAD therapy decreased the end-systolic pressure of the LV by 41%, RA by 29%, RV by 53%, and PA by 71%, but increased the right ventricular ejection fraction by 52% and cardiac output (CO) by 40%, while the stroke work was reduced by 67% compared with the HF condition without LVAD. The end-systolic ventricular tension and strain decreased with the LVAD treatment. LVAD therapy enhances CO and mechanical unloading of the LV as well as of the RV, and prevents the pulmonary hypertension that can be induced by HF.

  1. “I’d eat a bucket of nails if you told me it would help me sleep:” Perceptions of insomnia and its treatment in patients with stable heart failure

    PubMed Central

    Andrews, Laura Kierol; Coviello, Jessica; Hurley, Elisabeth; Rose, Leonie; Redeker, Nancy S.

    2014-01-01

    Background Poor sleep, including insomnia, is common among patients with heart failure (HF). However, little is known about the efficacy of interventions for insomnia in this population. Prior to developing interventions, there is a need for a better understanding of patient perceptions about insomnia and its treatment. Objectives To evaluate HF patients’ perceptions about 1) insomnia and its consequences; 2) predisposing, precipitating, and perpetuating factors for insomnia; 3) self-management strategies and treatments for insomnia; and 4) preferences for insomnia treatment. Methods The study, guided by the “3 P” model of insomnia, employed a parallel convergent mixed methods design in which we obtained qualitative data through focus groups and quantitative data through questionnaires (sleep quality, insomnia severity, dysfunctional beliefs and attitudes about sleep, sleep-related daytime symptoms, and functional performance). Content analysis was used to evaluate themes arising from the focus group data, and descriptive statistics were used to analyze the quantitative data. The results of both forms of data collection were compared and synthesized. Results HF patients perceived insomnia as having a negative impact on daytime function, and viewed comorbid health problems, pain, nocturia, and psychological factors as perpetuating factors. They viewed the use of hypnotic medications as often necessary but disliked their negative daytime side effects. They used a variety of strategies to manage their insomnia, but generally did not mention their sleep concerns to physicians, whom they perceived as not interested in sleep. Conclusions HF patients believe insomnia is important and multi-factorial. Behavioral treatments for insomnia, such as Cognitive Behavioral Therapy, may be efficacious in modifying perpetuating factors and are likely to be acceptable to patients. PMID:23998381

  2. Left atrial function in heart failure with impaired and preserved ejection fraction.

    PubMed

    Fang, Fang; Lee, Alex Pui-Wai; Yu, Cheuk-Man

    2014-09-01

    Left atrial structural and functional changes in heart failure are a relatively neglected part of cardiac assessment. This review illustrates the pathophysiological and functional changes of the left atrium in heart failure, as well as their prognostic value. Heart failure can be divided into heart failure with systolic dysfunction and heart failure with preserved ejection fraction (HFPEF). Left atrial enlargement and dysfunction commonly occur in systolic heart failure, in particular in idiopathic dilated cardiomyopathy. Atrial enlargement and dysfunction also carry important prognostic value in systolic heart failure, independently of known parameters such as left ventricular ejection fraction. In HFPEF, there is evidence of left atrial enlargement, impaired atrial compliance, and reduction of atrial pump function. This occurs not only at rest but also during exercise, indicating significant impairment of atrial contractile reserve. Furthermore, atrial dyssynchrony is common in HFPEF. These factors further contribute to the development of new-onset atrial arrhythmias, in particular atrial fibrillation, or their progression. Left atrial function is an integral part of cardiac function, and its structural and functional changes in heart failure are common. As changes of left atrial structure and function have different clinical implications in systolic heart failure and HFPEF, routine assessment is warranted.

  3. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used. Copyright © 2016 Elsevier Inc. All rights reserved.
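
    In skin sensitization QRA, an acceptable exposure level (AEL) is typically derived by dividing a no-expected-sensitization induction level (NESIL) by sensitization assessment factors, and risk is acceptable when the consumer exposure level (CEL) does not exceed it. A minimal sketch, with illustrative numbers only:

```python
def acceptable_exposure_level(nesil, safs):
    """AEL = NESIL divided by the product of sensitization assessment factors."""
    product = 1.0
    for factor in safs:
        product *= factor
    return nesil / product

# Illustrative values only (not taken from the review):
nesil = 2500.0          # ug/cm2, no-expected-sensitization induction level
safs = [10, 3, 1]       # inter-individual, matrix, and use considerations
ael = acceptable_exposure_level(nesil, safs)

cel = 50.0              # ug/cm2, hypothetical consumer exposure level
acceptable = cel <= ael  # the "risk = hazard x exposure" comparison
```

    The review's two criteria map directly onto this sketch: rigorous QRA means defensible SAFs, and adequate exposure assessment means a realistic CEL.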

  4. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG power plant to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guideline API 581 by the American Petroleum Institute (API) was used for risk analysis of the HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of the API 581 standard place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
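
    The probability/consequence categories above (e.g., 4C, 3C) come from a risk matrix. A toy banding in the spirit of the semi-quantitative API 581 approach — the thresholds here are invented, not copied from the standard — might look like:

```python
def risk_level(prob_cat, cons_cat):
    """Combine a probability category (1-5) and a consequence category
    ("A"-"E") into a qualitative risk band. Banding is illustrative only."""
    cons_idx = "ABCDE".index(cons_cat) + 1
    score = prob_cat * cons_idx
    if score >= 15:
        return "high"
    if score >= 10:
        return "medium-high"
    if score >= 6:
        return "medium"
    return "low"
```

    Under this made-up banding, the abstract's 4C cell lands in "medium-high" and 3C in "medium", which is how such matrices turn two ordinal ratings into an inspection priority.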

  5. Confessions of a Quantitative Educational Researcher Trying to Teach Qualitative Research.

    ERIC Educational Resources Information Center

    Stallings, William M.

    1995-01-01

    Describes one quantitative educational researcher's experiences teaching qualitative research, the approach used in classes, and the successes and failures. These experiences are examined from the viewpoint of a traditionally trained professor who has now been called upon to master and teach qualitative research. (GR)

  6. Icariin protects rats against 5/6 nephrectomy-induced chronic kidney failure by increasing the number of renal stem cells.

    PubMed

    Huang, Zhongdi; He, Liqun; Huang, Di; Lei, Shi; Gao, Jiandong

    2015-10-21

    Chronic kidney disease (CKD) poses a serious health problem worldwide, with increasing prevalence and a lack of effective treatment. This study aimed to investigate the mechanism by which icariin alleviates chronic renal failure induced by 5/6 nephrectomy in rats. The chronic renal failure model was established by a two-phased 5/6 nephrectomy procedure. The model rats were given daily doses of water or icariin for 8 weeks. Kidney morphology was checked by HE staining. The levels of blood urea nitrogen (BUN), serum creatinine, and serum uric acid were measured by colorimetric methods. The expression of specified genes was analyzed by quantitative real-time PCR and immunohistochemical staining. The number of renal stem/progenitor cells was analyzed by CD133 and CD24 immunohistochemical staining. Icariin protected against CKD-caused damage to kidney histology and improved renal function, significantly reducing the levels of BUN, creatinine, and uric acid. Icariin inhibited the expression of TGF-β1, whereas it upregulated HGF, BMP-7, WT-1, and Pax2 expression. Moreover, icariin significantly increased the expression of CD24, CD133, Osr1, and Nanog in the remnant kidney and the number of CD133(+)/CD24(+) renal stem/progenitor cells. These data demonstrated that icariin effectively alleviated 5/6 nephrectomy-induced chronic renal failure by increasing renal stem/progenitor cells.
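
    The abstract reports quantitative real-time PCR; relative expression in such studies is commonly quantified with the 2^(-ΔΔCt) method (a standard approach, not necessarily the authors' exact pipeline). A minimal sketch with invented Ct values:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^(-ddCt) method.

    Assumes ~100% amplification efficiency for both target and
    reference genes; Ct values here are illustrative."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical example: target crosses threshold 2 cycles earlier relative
# to the reference gene in treated vs. control tissue -> 4-fold upregulation
upregulation = fold_change(24.0, 18.0, 26.0, 18.0)
```

    A fold change above 1 indicates upregulation relative to control, matching the direction of the HGF/BMP-7/WT-1/Pax2 findings above.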

  7. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
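
    Crossing fragility curves with PSHA output, as described above, amounts to summing failure probabilities over hazard bins. A minimal numerical sketch with invented hazard rates and fragility parameters:

```python
import math

def lognormal_fragility(pga, median, beta):
    """Probability of component failure at a given PGA, modeled as a
    lognormal CDF (a common fragility form; parameters are illustrative)."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

# Illustrative PSHA output: annual occurrence rates of PGA bins (in g)
pga_bins     = [0.1, 0.2, 0.3, 0.4, 0.5]
annual_rates = [1e-2, 3e-3, 1e-3, 3e-4, 1e-4]

# Annual frequency of seismically induced loss of containment:
# sum of P(fail | PGA bin) times the annual rate of that bin
annual_failure_freq = sum(lognormal_fragility(p, median=0.6, beta=0.5) * r
                          for p, r in zip(pga_bins, annual_rates))
```

    The resulting frequency then feeds the consequence analysis (fire, explosion, domino effects) that produces the risk contour plots mentioned above.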

  8. Application of Function-Failure Similarity Method to Rotorcraft Component Design

    NASA Technical Reports Server (NTRS)

    Roberts, Rory A.; Stone, Robert E.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Performance and safety are the top concerns of high-risk aerospace applications at NASA. Eliminating or reducing performance and safety problems can be achieved with a thorough understanding of potential failure modes in the designs that lead to these problems. The majority of techniques use prior knowledge and experience, as well as Failure Modes and Effects Analysis, as methods to determine the potential failure modes of aircraft. During the design of aircraft, a general technique is needed to ensure that every potential failure mode is considered, while avoiding spending time on improbable failure modes. In this work, this is accomplished by mapping failure modes to specific components, which are described by their functionality. The failure modes are then linked to the basic functions that are carried out within the components of the aircraft. Using this technique, designers can examine the basic functions and select appropriate analyses to eliminate or design out the potential failure modes. The fundamentals of this method were previously introduced for a simple rotating machine test rig with basic functions that are common to a rotorcraft. In this paper, the technique is applied to the engine and power train of a rotorcraft, using failures and functions obtained from accident reports and engineering drawings.

  9. Identification of emergent off-nominal operational requirements during conceptual architecting of the more electric aircraft

    NASA Astrophysics Data System (ADS)

    Armstrong, Michael James

    Increases in power demands and changes in the design practices of overall equipment manufacturers have led to a new paradigm in vehicle systems definition. The development of unique power systems architectures is of increasing importance to overall platform feasibility and must be pursued early in the aircraft design process. Many vehicle systems architecture trades must be conducted concurrent to platform definition. With the increased complexity introduced during conceptual design, accurate predictions of unit level sizing requirements must be made. Architecture-specific emergent requirements must be identified which arise due to the complex integrated effect of unit behaviors. Off-nominal operating scenarios present sizing critical requirements to the aircraft vehicle systems. These requirements are architecture specific and emergent. Standard heuristically defined failure mitigation is sufficient for sizing traditional and evolutionary architectures. However, architecture concepts which vary significantly in terms of structure and composition require that unique failure mitigation strategies be defined for accurate estimation of unit level requirements. Identifying these off-nominal emergent operational requirements requires extensions to traditional safety and reliability tools and the systematic identification of optimal performance degradation strategies. Discrete operational constraints posed by traditional Functional Hazard Assessment (FHA) are replaced by continuous relationships between function loss and operational hazard. These relationships pose the objective function for hazard minimization. Load shedding optimization is performed for all statistically significant failures by varying the allocation of functional capability throughout the vehicle systems architecture.
    Expressing hazards, and thereby reliability requirements, as continuous relationships with the magnitude and duration of functional failure requires augmentations to the traditional means of system safety assessment (SSA). The traditional two-state and discrete system reliability assessment proves insufficient. Reliability is, therefore, handled in an analog fashion: as a function of magnitude of failure and failure duration. A series of metrics are introduced which characterize system performance in terms of analog hazard probabilities. These include analog and cumulative system and functional risk, hazard correlation, and extensions to the traditional component importance metrics. Continuous FHA, load shedding optimization, and analog SSA constitute the SONOMA process (Systematic Off-Nominal Requirements Analysis). Analog system safety metrics inform both architecture optimization (changes in unit level capability and reliability) and architecture augmentation (changes in architecture structure and composition). This process was applied to two vehicle systems concepts (conventional and 'more-electric') in terms of loss/hazard relationships with varying degrees of fidelity. Application of this process shows that the traditional assumptions regarding the structure of the function loss vs. hazard relationship apply undue design bias to functions and components during exploratory design. This bias is illustrated in terms of inaccurate estimations of the system and function level risk and unit level importance. It was also shown that off-nominal emergent requirements must be defined specific to each architecture concept. Quantitative comparisons of architecture-specific off-nominal performance were obtained which provide evidence of the need for accurate definition of load shedding strategies during architecture exploratory design.
    Formally expressing performance degradation strategies in terms of the minimization of a continuous hazard space enhances the system architect's ability to accurately predict sizing critical emergent requirements concurrent to architecture definition. Furthermore, the methods and frameworks generated here provide a structured and flexible means for eliciting these architecture-specific requirements during the performance of architecture trades.
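
    The "analog" risk idea — hazard as a continuous function of the magnitude and duration of function loss — can be sketched as follows; the functional form and numbers are invented for illustration and are not the dissertation's models:

```python
def hazard(loss_fraction, duration_hr):
    """Continuous loss/hazard relationship replacing a discrete FHA category.
    Illustrative form: hazard grows with both loss magnitude and duration."""
    return min(1.0, loss_fraction ** 2 * (duration_hr / 10.0))

# (probability per flight hour, loss fraction, duration in hours)
# for a few hypothetical functional-failure states
failure_states = [
    (1e-5, 0.3, 1.0),   # partial loss, short duration
    (1e-7, 1.0, 5.0),   # total loss, long duration
]

# Cumulative functional risk: probability-weighted continuous hazard
cumulative_risk = sum(p * hazard(m, d) for p, m, d in failure_states)
```

    A discrete FHA would assign each state a fixed severity category; the continuous form instead lets the load-shedding optimizer trade small, brief losses against rare, severe ones on a single scale.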

  10. Deriving Function-failure Similarity Information for Failure-free Rotorcraft Component Design

    NASA Technical Reports Server (NTRS)

    Roberts, Rory A.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Performance and safety are the top concerns of high-risk aerospace applications at NASA. Eliminating or reducing performance and safety problems can be achieved with a thorough understanding of potential failure modes in the design that lead to these problems. The majority of techniques use prior knowledge and experience, as well as Failure Modes and Effects Analysis, as methods to determine the potential failure modes of aircraft. The aircraft design needs to be passed through a general technique that ensures every potential failure mode is considered, while avoiding spending time on improbable failure modes. In this work, this is accomplished by mapping failure modes to certain components, which are described by their functionality. In turn, the failure modes are linked to the basic functions that are carried out within the components of the aircraft. Using the technique proposed in this paper, designers can examine the basic functions and select appropriate analyses to eliminate or design out the potential failure modes. This method was previously applied to a simple rotating machine test rig with basic functions that are common to a rotorcraft. In this paper, the technique is applied to the engine and power train of a rotorcraft, using failures and functions obtained from accident reports and engineering drawings.

  11. 76 FR 27710 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Order Granting Approval of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-12

    ... 8, 2010.\\3\\ The Commission subsequently extended the time period in which to either approve the... BX Venture Market will have minimal quantitative listing standards, but will have qualitative... another national securities exchange for failure to meet quantitative listing standards (including price...

  12. Failure detection system risk reduction assessment

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert B. (Inventor); Huang, Zhaofeng (Inventor)

    2012-01-01

    A process includes determining a probability of a failure mode of a system being analyzed reaching a failure limit as a function of time to failure limit, determining a probability of a mitigation of the failure mode as a function of a time to failure limit, and quantifying a risk reduction based on the probability of the failure mode reaching the failure limit and the probability of the mitigation.
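
    One plausible reading of the quantification step — assuming the mitigated fraction of the failure probability is the reduction, which the abstract does not state explicitly — is:

```python
def residual_risk(p_failure, p_mitigation):
    """Probability that the failure mode reaches its limit AND is not
    mitigated (treating the two as independent; an assumption)."""
    return p_failure * (1.0 - p_mitigation)

def risk_reduction(p_failure, p_mitigation):
    """Reduction in risk attributable to the detection/mitigation system."""
    return p_failure - residual_risk(p_failure, p_mitigation)
```

    For example, a failure mode with a 1% probability of reaching its limit and a 90% mitigation probability leaves a residual risk of 0.1%, a nine-fold reduction. In the process described above, both probabilities would be evaluated as functions of the time to the failure limit.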

  13. Comprehensive Analysis of Gene Expression Profiles of Sepsis-Induced Multiorgan Failure Identified Its Valuable Biomarkers.

    PubMed

    Wang, Yumei; Yin, Xiaoling; Yang, Fang

    2018-02-01

    Sepsis is an inflammation-related disease, and severe sepsis can induce multiorgan dysfunction, which is the most common cause of death of patients in noncoronary intensive care units. Novel therapeutic strategies have proven to have little impact on the mortality of severe sepsis, and unfortunately, its mechanisms still remain poorly understood. In this study, we analyzed gene expression profiles of severe sepsis with failure of the lung, kidney, and liver for the identification of potential biomarkers. We first downloaded the gene expression profiles from the Gene Expression Omnibus and performed preprocessing of the raw microarray data sets and identification of differentially expressed genes (DEGs) using the R programming language; then, significantly enriched functions of DEGs in lung, kidney, and liver failure sepsis samples were obtained from the Database for Annotation, Visualization, and Integrated Discovery; finally, a protein-protein interaction network was constructed for the DEGs based on the STRING database, and network modules were obtained through the MCODE clustering method. As a result, lung failure sepsis had the highest number of DEGs, at 859, whereas the numbers of DEGs in the kidney and liver failure sepsis samples were 178 and 175, respectively. In addition, 17 overlaps were obtained among the three lists of DEGs. Biological processes related to the immune and inflammatory response were found to be significantly enriched in the DEGs. Network and module analysis identified four gene clusters in which all or most of the genes were upregulated. The expression changes of Icam1 and Socs3 were further validated through quantitative PCR analysis. This study should shed light on the development of sepsis and provide potential therapeutic targets for sepsis-induced multiorgan failure.
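
    The overlap step reduces to set intersection. The gene lists below are tiny invented stand-ins (the study reported 859, 178, and 175 DEGs with 17 three-way overlaps; only Icam1 and Socs3 are named in the abstract — the other identifiers here are hypothetical):

```python
# Toy DEG lists per failing organ (invented, for illustration only)
lung_degs   = {"Icam1", "Socs3", "Il6", "Tnf", "Ccl2"}
kidney_degs = {"Icam1", "Socs3", "Tnf", "Havcr1"}
liver_degs  = {"Icam1", "Socs3", "Tnf", "Crp"}

# Genes dysregulated in all three organs: candidate shared biomarkers
shared_all = lung_degs & kidney_degs & liver_degs
```

    Genes surviving this intersection are the natural candidates for cross-organ biomarkers, which is why the shared Icam1 and Socs3 changes were singled out for qPCR validation.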

  14. Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)

    1999-01-01

    Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.

  15. Premature ovarian failure due to tetrasomy X in an adolescent girl.

    PubMed

    Kara, Cengiz; Üstyol, Ala; Yılmaz, Ayşegül; Altundağ, Engin; Oğur, Gönül

    2014-12-01

    Tetrasomy X associated with premature ovarian failure has been described in a few patients, and the parental origin of the extra X chromosomes has not been investigated so far in this group. A 15-year-old girl with mental retardation and minor physical anomalies showed secondary amenorrhea, high gonadotropin levels, and osteoporosis. Molecular analysis of the fibroblast cells revealed pure 48,XXXX constitution despite 48,XXXX/47,XXX mosaicism in peripheral blood. Analysis of the polymorphic markers (X22, DXYS218, DXYS267, HPRT) on the X chromosome by the quantitative fluorescent polymerase chain reaction (QF-PCR) method demonstrated that the extra X chromosomes were maternal in origin. Patients with tetrasomy X syndrome should be screened for ovarian insufficiency during early adolescence because hormone replacement therapy may be required for prevention of osteoporosis. In order to understand a potential impact of the parental origin of the extra X chromosomes on ovarian development and function, further studies are needed.

  16. Impact of self-healing capability on network robustness

    NASA Astrophysics Data System (ADS)

    Shang, Yilun

    2015-04-01

    A wide spectrum of real-life systems ranging from neurons to botnets display spontaneous recovery ability. Using the generating function formalism applied to static uncorrelated random networks with arbitrary degree distributions, the microscopic mechanism underlying the depreciation-recovery process is characterized and the effect of varying self-healing capability on network robustness is revealed. It is found that the self-healing capability of nodes has a profound impact on the phase transition in the emergence of percolating clusters, and that salient difference exists in upholding network integrity under random failures and intentional attacks. The results provide a theoretical framework for quantitatively understanding the self-healing phenomenon in varied complex systems.

  17. Impact of self-healing capability on network robustness.

    PubMed

    Shang, Yilun

    2015-04-01

    A wide spectrum of real-life systems ranging from neurons to botnets display spontaneous recovery ability. Using the generating function formalism applied to static uncorrelated random networks with arbitrary degree distributions, the microscopic mechanism underlying the depreciation-recovery process is characterized and the effect of varying self-healing capability on network robustness is revealed. It is found that the self-healing capability of nodes has a profound impact on the phase transition in the emergence of percolating clusters, and that salient difference exists in upholding network integrity under random failures and intentional attacks. The results provide a theoretical framework for quantitatively understanding the self-healing phenomenon in varied complex systems.
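
    A toy version of the idea — not the paper's generating-function formalism — treats self-healing as reducing the effective fraction of removed nodes and solves for the giant component of an Erdős–Rényi network by fixed-point iteration:

```python
import math

def giant_component(mean_degree, removed_frac, heal_prob):
    """Giant-component fraction of an Erdos-Renyi network after random node
    removal, where each removed node independently recovers ("self-heals")
    with probability heal_prob. Toy model for illustration only."""
    effective_removed = removed_frac * (1.0 - heal_prob)
    phi = 1.0 - effective_removed          # fraction of functional nodes
    s = 0.5                                # initial guess
    for _ in range(200):                   # iterate S = phi * (1 - exp(-c*S))
        s = phi * (1.0 - math.exp(-mean_degree * s))
    return s
```

    With mean degree 4 and half the nodes hit, a healing probability of 0.8 roughly doubles the surviving giant component relative to no healing, mirroring the qualitative effect on robustness described above; intentional attacks, which target high-degree nodes, would require a degree-dependent removal rule instead.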

  18. Multifunctional Nano-engineered Polymer Surfaces with Enhanced Mechanical Resistance and Superhydrophobicity

    NASA Astrophysics Data System (ADS)

    Hernández, Jaime J.; Monclús, Miguel A.; Navarro-Baena, Iván; Viela, Felipe; Molina-Aldareguia, Jon M.; Rodríguez, Isabel

    2017-03-01

    This paper presents a multifunctional polymer surface that provides superhydrophobicity and self-cleaning functions together with an enhancement in mechanical and electrical performance. These functionalities are produced by nanoimprinting high-aspect-ratio pillar arrays on a polymeric matrix incorporating functional reinforcing elements. Two distinct matrix-filler systems are investigated, specifically carbon-nanotube-reinforced polystyrene (CNT-PS) and reduced-graphene-oxide-reinforced polyvinylidene difluoride (RGO-PVDF). Mechanical characterization of the topographies by quantitative nanoindentation and nanoscratch tests demonstrates a considerable increase in stiffness, Young’s modulus, and critical failure load with respect to the pristine polymers. The improvement in the mechanical properties is rationalized in terms of effective dispersion and penetration of the fillers into the imprinted structures, as determined by confocal Raman and SEM studies. In addition, an increase in the degree of crystallization for the PVDF-RGO imprinted nanocomposite possibly accounts for the larger enhancement observed. Improving the mechanical ruggedness of functional textured surfaces with appropriate fillers will enable the implementation of multifunctional nanotextured materials in real applications.

  19. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation into the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
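    As a toy illustration of using the coefficient of variation as the quantitative criticality factor, the sketch below ranks two hypothetical process-parameter settings by the CV of simulated per-tablet coating mass; the data and setting names are invented, not from the study:

```python
import statistics

def coefficient_of_variation(masses):
    """CV of per-tablet coating mass: lower CV = better uniformity."""
    return statistics.pstdev(masses) / statistics.fmean(masses)

# Hypothetical simulated coating masses (mg) for two parameter settings
slow_pan = [10.1, 9.8, 10.0, 10.2, 9.9]
fast_pan = [11.5, 8.4, 10.6, 9.0, 10.5]

# Rank settings: the one with the smallest CV is least critical
ranked = sorted([("slow_pan", coefficient_of_variation(slow_pan)),
                 ("fast_pan", coefficient_of_variation(fast_pan))],
                key=lambda kv: kv[1])
print(ranked[0][0])  # setting with the best coating mass uniformity
```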

  20. β1-Adrenergic blocker bisoprolol reverses down-regulated ion channels in sinoatrial node of heart failure rats.

    PubMed

    Du, Yuan; Zhang, Junbo; Xi, Yutao; Wu, Geru; Han, Ke; Huang, Xin; Ma, Aiqun; Wang, Tingzhong

    2016-06-01

    Bisoprolol, an antagonist of β1-adrenergic receptors, is effective in reducing morbidity and mortality in patients with heart failure (HF). It has been found that HF is accompanied by dysfunction of the sinoatrial node (SAN). However, whether bisoprolol reverses the decreased SAN function in HF, and how the relevant ion channels in the SAN change, have been relatively little studied. SAN function and messenger RNA (mRNA) expression of sodium channel and hyperpolarization-activated cyclic nucleotide-gated (HCN) channel subunits were assessed in sham-operated rats, abdominal arterio-venous shunt (volume overload)-induced HF rats, and bisoprolol-treated HF rats. SAN cells were isolated by laser capture microdissection. Quantitative real-time PCR analysis was used to quantify mRNA expression of sodium channel and HCN channel subunits in the SAN. Intrinsic heart rate declined and sinus node recovery time was prolonged in HF rats, indicating suppressed SAN function, which could be improved by bisoprolol treatment. Nav1.1, Nav1.6, and HCN4 mRNA expression was reduced in the SAN of HF rats compared with control rats. Treatment with bisoprolol could partially reverse both SAN function and Nav1.1, Nav1.6, and HCN4 mRNA expression. These data indicate that bisoprolol is effective in HF treatment partially because it improves SAN function by reversing the down-regulation of sodium channel (Nav1.1 and Nav1.6) and HCN channel (HCN4) subunits in the SAN of failing hearts.

  1. A Comparison of Functional Models for Use in the Function-Failure Design Method

    NASA Technical Reports Server (NTRS)

    Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.

    2006-01-01

    When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur in products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. These other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base it draws from, and it is therefore of utmost importance to develop a knowledge base suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: at what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the detail necessary for an applicable knowledge base that can be used by designers in both new designs and redesigns. High-level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To best record these data, standardized functional and failure mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly, however, when applying the EFDM, high-level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.

  2. Common Cause Failure Modeling in Space Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hark, Frank; Ring, Rob; Novack, Steven D.; Britton, Paul

    2015-01-01

    Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are dependent failures that can be caused by, for example, system environments, manufacturing, transportation, storage, maintenance, and assembly. Since many factors contribute to CCFs, they can be reduced but are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent failures and dependent CCFs. Because common cause failure data are limited in the aerospace industry, the Probabilistic Risk Assessment (PRA) Team at Bastion Technology Inc. is estimating CCF risk using generic data collected by the Nuclear Regulatory Commission (NRC). Consequently, common cause risk estimates based on this database, when applied to other industry applications, are highly uncertain. It is therefore important to account for a range of values for independent and CCF risk and to communicate the uncertainty to decision makers. There is an existing methodology for reducing CCF risk during design, which includes a checklist of 40+ factors grouped into eight categories. Using this checklist, an approach is being investigated that quantitatively relates these factors to produce a beta factor estimate. In this example, the checklist will be tailored to space launch vehicles, a quantitative approach will be described, and an example of the method will be presented.
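    A hedged sketch of how a checklist-based beta factor might be quantified: each of the eight categories receives a 0-1 defense score, the scores scale an assumed worst-case beta, and the resulting beta splits a unit's failure probability into independent and common-cause parts. The scaling rule and all numbers below are illustrative assumptions, not the method under investigation:

```python
def beta_factor(scores, beta_max=0.1):
    """Map defense-checklist scores (0 = no defense, 1 = full defense,
    one entry per category) onto a common-cause beta factor.
    beta_max is an assumed worst-case fraction of failures that are
    common cause; stronger defenses drive beta toward zero.
    Hypothetical scaling, not the published methodology."""
    defense = sum(scores) / len(scores)
    return beta_max * (1.0 - defense)

def dual_redundant_failure_prob(p, beta):
    """Failure probability of a two-unit redundant system: either both
    units fail independently, or one common-cause event takes out both."""
    p_ind = (1.0 - beta) * p
    p_ccf = beta * p
    return p_ind ** 2 + p_ccf

p = 1e-3                        # single-unit failure probability
weak = beta_factor([0.2] * 8)   # poorly defended design
strong = beta_factor([0.9] * 8) # well defended design
print(dual_redundant_failure_prob(p, weak) > dual_redundant_failure_prob(p, strong))
```

Even with identical hardware, the common-cause term dominates the redundant-system risk, which is why the checklist defenses matter more than adding further redundancy.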

  3. Know thy eHealth user: Development of biopsychosocial personas from a study of older adults with heart failure.

    PubMed

    Holden, Richard J; Kulanthaivel, Anand; Purkayastha, Saptarshi; Goggins, Kathryn M; Kripalani, Sunil

    2017-12-01

    Personas are a canonical user-centered design method increasingly used in health informatics research. Personas, empirically derived user archetypes, can be used by eHealth designers to gain a robust understanding of their target end users, such as patients. Objective: to develop biopsychosocial personas of older patients with heart failure using quantitative analysis of survey data. Data were collected using standardized surveys and medical record abstraction from 32 older adults with heart failure recently hospitalized for acute heart failure exacerbation. Hierarchical cluster analysis was performed on a final dataset of n=30. Nonparametric analyses were used to identify differences between clusters on 30 clustering variables and seven outcome variables. Six clusters were produced, ranging in size from two to eight patients per cluster. Clusters differed significantly on these biopsychosocial domains and subdomains: demographics (age, sex); medical status (comorbid diabetes); functional status (exhaustion, household work ability, hygiene care ability, physical ability); psychological status (depression, health literacy, numeracy); technology (Internet availability); healthcare system (visit by home healthcare, trust in providers); social context (informal caregiver support, cohabitation, marital status); and economic context (employment status). Tabular and narrative persona descriptions provide an easy reference guide for informatics designers. Personas development using approaches such as clustering of structured survey data is an important tool for health informatics professionals. We describe insights from our study of patients with heart failure, then recommend a generic ten-step personas development process. Methodological strengths and limitations of the study, and of personas development generally, are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
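    The hierarchical cluster analysis used to derive the personas can be sketched with a toy single-linkage agglomerative clusterer; the patient vectors and variables below are hypothetical, and the study itself clustered on 30 variables rather than two:

```python
def single_linkage(points, k):
    """Agglomerative single-linkage clustering of survey-score vectors
    down to k clusters; a toy stand-in for the hierarchical cluster
    analysis used to derive personas."""
    clusters = [[i] for i in range(len(points))]
    def dist(a, b):  # Euclidean distance between two score vectors
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters

# Hypothetical (age, depression-score) vectors for six patients
patients = [(82, 9), (80, 8), (81, 9), (65, 2), (66, 3), (64, 2)]
print(sorted(map(sorted, single_linkage(patients, 2))))
# → two candidate persona groups: older/depressed vs. younger/not
```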

  4. Causes of catastrophic failure in complex systems

    NASA Astrophysics Data System (ADS)

    Thomas, David A.

    2010-08-01

    Root causes of mission critical failures and major cost and schedule overruns in complex systems and programs are studied through the post-mortem analyses compiled for several examples, including the Hubble Space Telescope, the Challenger and Columbia Shuttle accidents, and the Three Mile Island nuclear power plant accident. The roles of organizational complexity, cognitive biases in decision making, the display of quantitative data, and cost and schedule pressure are all considered. Recommendations for mitigating the risk of similar failures in future programs are also provided.

  5. Diuretics as pathogenetic treatment for heart failure

    PubMed Central

    Guglin, Maya

    2011-01-01

    Increased intracardiac filling pressure, or congestion, causes symptoms and leads to hospital admissions in patients with heart failure, regardless of their systolic function. A history of hospital admission, in turn, predicts further hospitalizations and morbidity, and a higher number of hospitalizations determines higher mortality. Congestion is therefore the driving force of the natural history of heart failure. Congestion is the syndrome shared by heart failure with preserved and reduced systolic function. These two conditions have almost identical morbidity, mortality, and survival because the outcomes are driven by congestion. A small difference in favor of heart failure with preserved systolic function comes from the decreased ejection fraction and left ventricular remodeling that are only present in heart failure with decreased systolic function. The magnitude of this difference reflects the contribution of decreased systolic function and ventricular remodeling to the progression of heart failure. The only treatment available for congestion is fluid removal via diuretics, ultrafiltration, or dialysis. It is the only treatment that works equally well for heart failure with reduced and preserved systolic function because it affects congestion, the main pathogenetic feature of the disease. Diuretics are pathogenetic therapy for heart failure. PMID:21403798

  6. Probabilistic failure assessment with application to solid rocket motors

    NASA Technical Reports Server (NTRS)

    Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.

    1990-01-01

    A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.
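    The stochastic framework described above can be caricatured with a Monte Carlo stress-strength calculation: sample the uncertain governing parameters and count how often stress exceeds strength. The distributions below are invented for illustration and are not the engineering models used in the methodology:

```python
import random

def debond_failure_probability(n=100_000, seed=1):
    """Monte Carlo sketch of a stochastic failure assessment: sample
    uncertain load and strength distributions and count how often the
    bond stress exceeds the bond strength. Distributions are
    illustrative, not the actual SRM debonding models."""
    random.seed(seed)
    failures = 0
    for _ in range(n):
        stress = random.gauss(50.0, 8.0)     # applied stress (arbitrary units)
        strength = random.gauss(80.0, 10.0)  # bond strength
        if stress > strength:
            failures += 1
    return failures / n

print(debond_failure_probability())  # a small probability, roughly 1e-2
```

Widening either distribution (i.e., admitting more incomplete knowledge or intrinsic variability) raises the estimated failure probability, which is the core point of the probabilistic approach.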

  7. Wood-adhesive bonding failure : modeling and simulation

    Treesearch

    Zhiyong Cai

    2010-01-01

    The mechanism of wood bonding failure when exposed to wet conditions or wet/dry cycles is not fully understood, and the role of the resulting internal stresses exerted upon the wood-adhesive bondline has yet to be quantitatively determined. Unlike previous modeling, this study has developed a new two-dimensional internal-stress model on the basis of the mechanics of...

  8. Assessing Preservice Teachers' Mathematics Cognitive Failures as Related to Mathematics Anxiety and Performance in Undergraduate Calculus

    ERIC Educational Resources Information Center

    Awofala, Adeneye O. A.; Odogwu, Helen N.

    2017-01-01

    The study investigated mathematics cognitive failures as related to mathematics anxiety, gender and performance in calculus among 450 preservice teachers from four public universities in the South West geo-political zone of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  9. Classroom Assessment Strategies: What Do Students At-Risk and Teachers Perceive as Effective and Useful?

    ERIC Educational Resources Information Center

    Rieg, Sue A.

    2007-01-01

    With the focus on standardized tests, it appears that we are leaving classroom assessments and students at-risk of school failure behind. This quantitative study investigated the perceptions of junior high school teachers, and students at risk of school failure, on the effectiveness and level of use of various classroom assessments and…

  10. 75 FR 82098 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing of Amendment No. 1 to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-29

    .... The Commission subsequently extended the time period in which to either approve the proposed rule... Market.'' The BX Venture Market will have minimal quantitative listing standards, but have qualitative... securities exchange for failure to meet quantitative listing standards (including price or other market value...

  11. The Positive Alternative Credit Experience (PACE) Program a Quantitative Comparative Study

    ERIC Educational Resources Information Center

    Warren, Rebecca Anne

    2011-01-01

    The purpose of this quantitative comparative study was to evaluate the Positive Alternative Credit Experience (PACE) Program using an objectives-oriented approach to a formative program evaluation. The PACE Program was a semester-long high school alternative education program designed to serve students at-risk for academic failure or dropping out…

  12. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and the prioritization of research to provide a technical basis both for the standards and for the analysis related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.
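    One source of the prediction uncertainty discussed here is the measured acceleration factor. A minimal sketch, assuming a simple Arrhenius temperature-acceleration model (a common choice for thermally activated degradation, not necessarily the tests in this paper), shows how sensitive the factor is to the assumed activation energy:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Acceleration factor between field and chamber temperatures for a
    thermally activated degradation mode; Ea is the (uncertain)
    activation energy that dominates lifetime-prediction uncertainty."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

# A modest change in the assumed Ea shifts the field-equivalent
# exposure of an 85 C damp-heat test at a 25 C site considerably:
print(round(arrhenius_af(0.7, 25.0, 85.0), 1))
print(round(arrhenius_af(0.9, 25.0, 85.0), 1))
```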

  13. Fidelity Failures in Brief Strategic Family Therapy for Adolescent Drug Abuse: A Clinical Analysis.

    PubMed

    Lebensohn-Chialvo, Florencia; Rohrbaugh, Michael J; Hasler, Brant P

    2018-04-30

    As evidence-based family treatments for adolescent substance use and conduct problems gain traction, cutting edge research moves beyond randomized efficacy trials to address questions such as how these treatments work and how best to disseminate them to community settings. A key factor in effective dissemination is treatment fidelity, which refers to implementing an intervention in a manner consistent with an established manual. While most fidelity research is quantitative, this study offers a qualitative clinical analysis of fidelity failures in a large, multisite effectiveness trial of Brief Strategic Family Therapy (BSFT) for adolescent drug abuse, where BSFT developers trained community therapists to administer this intervention in their own agencies. Using case notes and video recordings of therapy sessions, an independent expert panel first rated 103 cases on quantitative fidelity scales grounded in the BSFT manual and the broader structural-strategic framework that informs BSFT intervention. Because fidelity was generally low, the panel reviewed all cases qualitatively to identify emergent types or categories of fidelity failure. Ten categories of failures emerged, characterized by therapist omissions (e.g., failure to engage key family members, failure to think in threes) and commissions (e.g., off-model, nonsystemic formulations/interventions). Of these, "failure to think in threes" appeared basic and particularly problematic, reflecting the central place of this idea in structural theory and therapy. Although subject to possible bias, our observations highlight likely stumbling blocks in exporting a complex family treatment like BSFT to community settings. These findings also underscore the importance of treatment fidelity in family therapy research. © 2018 Family Process Institute.

  14. Polymorphism and Elastic Response of Molecular Materials from First Principles: How Hard Can it Be?

    NASA Astrophysics Data System (ADS)

    Reilly, Anthony; Tkatchenko, Alexandre

    2014-03-01

    Molecular materials are of great fundamental and applied importance in science and industry, with numerous applications in pharmaceuticals, electronics, sensing, and catalysis. A key challenge for theory has been the prediction of their stability, polymorphism and response to perturbations. While pairwise models of van der Waals (vdW) interactions have improved the ability of density functional theory (DFT) to model these systems, substantial quantitative and even qualitative failures remain. In this contribution we show how a many-body description of vdW interactions can dramatically improve the accuracy of DFT for molecular materials, yielding quantitative description of stabilities and polymorphism for these challenging systems. Moreover, the role of many-body vdW interactions goes beyond stabilities to response properties. In particular, we have studied the elastic properties of a series of molecular crystals, finding that many-body vdW interactions can account for up to 30% of the elastic response, leading to quantitative and qualitative changes in elastic behavior. We will illustrate these crucial effects with the challenging case of the polymorphs of aspirin, leading to a better understanding of the conflicting experimental and theoretical studies of this system.

  15. Comparison of Gated SPECT Myocardial Perfusion Imaging with Echocardiography for the Measurement of Left Ventricular Volumes and Ejection Fraction in Patients With Severe Heart Failure

    PubMed Central

    Shojaeifard, Maryam; Ghaedian, Tahereh; Yaghoobi, Nahid; Malek, Hadi; Firoozabadi, Hasan; Bitarafan-Rajabi, Ahmad; Haghjoo, Majid; Amin, Ahmad; Azizian, Nasrin; Rastgou, Feridoon

    2015-01-01

    Background: Gated single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) is known as a feasible tool for the measurement of left ventricular ejection fraction (EF) and volumes, which are of great importance in the management and follow-up of patients with coronary artery disease. However, considering the technical shortcomings of SPECT in the presence of perfusion defects, the accuracy of this method in heart failure patients is still controversial. Objectives: The aim of the present study was to compare left ventricular volume and function data derived from gated SPECT MPI with those derived from echocardiography in heart failure patients. Patients and Methods: Forty-one patients with severely reduced left ventricular systolic function (EF ≤ 35%) who were referred for gated SPECT MPI were prospectively enrolled. Quantification of EF, end-diastolic volume (EDV), and end-systolic volume (ESV) was performed using the Quantitative Gated SPECT (QGS, version 0.4, May 2009) and Emory Cardiac Toolbox (ECTb, revision 1.0, copyright 2007) software packages. EF, EDV, and ESV were also measured with two-dimensional echocardiography within 3 days after MPI. Results: A good correlation was found between echocardiographically derived EF, EDV, and ESV and the values derived using QGS (r = 0.67, r = 0.78, and r = 0.80 for EF, EDV, and ESV, respectively; P < 0.001) and ECTb (r = 0.68, r = 0.79, and r = 0.80 for EF, EDV, and ESV, respectively; P < 0.001). However, Bland-Altman plots indicated significantly different mean values for EF (differences of 11.4 and 20.9 using QGS and ECTb, respectively, as compared with echocardiography). ECTb-derived EDV was also significantly higher than the EDV measured with echocardiography and QGS. The highest correlation between echocardiography and gated SPECT MPI was found for ESV. Conclusions: Gated SPECT MPI has a good correlation with echocardiography for the measurement of left ventricular EF, EDV, and ESV in patients with severe heart failure. However, the absolute values of these functional parameters from echocardiography and from gated SPECT MPI measured with different software packages should not be used interchangeably. PMID:26889455
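    The distinction the authors draw between correlation and agreement can be illustrated with a small Bland-Altman computation; the EF readings below are hypothetical and chosen so the two methods correlate strongly yet disagree systematically:

```python
import statistics

def bland_altman_bias(method_a, method_b):
    """Bland-Altman agreement summary: mean difference (bias) and 95%
    limits of agreement between two measurement methods. Correlation
    alone cannot reveal the kind of systematic offset reported between
    the gated SPECT software packages and echocardiographic EF."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical EF readings (%): strongly correlated but offset
echo = [20, 25, 30, 35, 28]
spect = [31, 37, 41, 45, 40]
bias, limits = bland_altman_bias(spect, echo)
print(round(bias, 1))  # systematic overestimation despite high correlation
```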

  16. How do cardiorespiratory fitness improvements vary with physical training modality in heart failure patients? A quantitative guide

    PubMed Central

    Smart, Neil A

    2013-01-01

    BACKGROUND: Peak oxygen consumption (VO2) is the gold standard measure of cardiorespiratory fitness and a reliable predictor of survival in chronic heart failure patients. Furthermore, any form of physical training usually improves cardiorespiratory fitness, although the magnitude of improvement in peak VO2 may vary across different training prescriptions. OBJECTIVE: To quantify, and subsequently rank, the magnitude of improvement in peak VO2 for different physical training prescriptions using data from published meta-analyses and randomized controlled trials. METHODS: Prospective randomized controlled parallel trials and meta-analyses of exercise training in chronic heart failure patients that provided data on change in peak VO2 for nine a priori comparative analyses were examined. RESULTS: All forms of physical training were beneficial, although the improvement in peak VO2 varied with modality. High-intensity interval exercise yielded the largest increase in peak VO2, followed in descending order by moderate-intensity aerobic exercise, functional electrical stimulation, inspiratory muscle training, combined aerobic and resistance training, and isolated resistance training. With regard to setting, the present study was unable to determine whether outpatient or unsupervised home exercise provided greater benefits in terms of peak VO2 improvement. CONCLUSIONS: Interval exercise is not suitable for all patients, especially the high-intensity variety; however, when indicated, this form of exercise should be adopted to optimize peak VO2 adaptations. Other forms of activity, such as functional electrical stimulation, may be more appropriate for patients who are not capable of high-intensity interval training, especially for severely deconditioned patients who are initially unable to exercise. PMID:24294043

  17. REMUS100 AUV with an integrated microfluidic system for explosives detection.

    PubMed

    Adams, André A; Charles, Paul T; Veitch, Scott P; Hanson, Alfred; Deschamps, Jeffrey R; Kusterbeck, Anne W

    2013-06-01

    Quantitating explosive materials at trace concentrations in real time, on site, within the marine environment may prove critical to protecting civilians, waterways, and military personnel during this era of increased threat of widespread terroristic activity. Presented herein are results from recent field trials that demonstrate detection and quantitation of small nitroaromatic molecules using novel high-throughput microfluidic immunosensors (HTMI) to perform displacement-based immunoassays onboard a HYDROID REMUS100 autonomous underwater vehicle. Missions were conducted 2-3 m above the sea floor, and no HTMI failures were observed due to clogging from biomass infiltration. Additionally, no device leaks were observed during the trials. HTMIs maintained immunoassay functionality during 2 h deployments while continuously sampling seawater without any pretreatment at a flow rate of 2 mL/min. This 20-fold increase in the nominal flow rate of the assay resulted in an order of magnitude reduction in both lag and assay times. Contaminated seawater that contained 20-175 ppb trinitrotoluene was analyzed.

  18. Revised Risk Priority Number in Failure Mode and Effects Analysis Model from the Perspective of Healthcare System

    PubMed Central

    Rezaei, Fatemeh; Yarmohammadian, Mohmmad H.; Haghshenas, Abbas; Fallah, Ali; Ferdosi, Masoud

    2018-01-01

    Background: The Failure Mode and Effects Analysis (FMEA) methodology is known as an important risk assessment tool and an accreditation requirement for many organizations. For prioritizing failures, the “risk priority number (RPN)” index is used, largely for its ease of use; it is based on subjective evaluations of the occurrence, severity, and detectability of each failure. In this study, we have tried to make the FMEA model more compatible with healthcare systems by redefining the RPN index to be closer to reality. Methods: We used a combined quantitative and qualitative approach in this research. In the qualitative domain, focus group discussions were used to collect data. A quantitative approach was used to calculate RPN scores. Results: We studied the patient's journey in the surgery ward from the holding area to the operating room. The highest-priority failures were determined by (1) defining inclusion criteria as severity of incident (clinical effect, claim consequence, waste of time, and financial loss), occurrence of incident (time-unit occurrence and degree of exposure to risk), and preventability (degree of preventability and defensive barriers), and then (2) quantifying risk priority using the RPN index (361 for the highest-rated failure). Reassessment of the improved RPN scores by root cause analysis showed some variation. Conclusions: We conclude that standard criteria should be developed consistent with clinical language and specialty-specific scientific fields. Cooperation and partnership between technical and clinical groups are therefore necessary to modify these models. PMID:29441184
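    The conventional RPN index that the study sets out to revise is simply the product of three 1-10 ratings. A minimal sketch, with invented surgery-ward failure modes and ratings (not those from the study):

```python
def rpn(severity, occurrence, detectability):
    """Classic FMEA risk priority number: the product of 1-10 ratings
    for severity, occurrence, and (lack of) detectability."""
    for rating in (severity, occurrence, detectability):
        assert 1 <= rating <= 10, "ratings are on a 1-10 scale"
    return severity * occurrence * detectability

# Hypothetical surgery-ward failure modes; the revised model in the
# study decomposes each rating into clinical sub-criteria first.
failures = {
    "wrong-site marking missed": rpn(9, 3, 5),
    "consent form incomplete":   rpn(4, 6, 2),
    "delayed transfer to OR":    rpn(3, 8, 3),
}
print(max(failures, key=failures.get))  # highest-priority failure mode
```

The multiplicative form is exactly what makes classic RPN subjective: very different (S, O, D) triples can yield the same product, which motivates the paper's redefined criteria.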

  19. [Understanding heart failure].

    PubMed

    Boo, José Fernando Guadalajara

    2006-01-01

    Heart failure is a disease with several definitions. The term "heart failure" is used with different meanings, which has brought about confusion in the terminology. For this reason, the value of the ejection fraction (< 0.40 or < 0.35) is used in most meta-analyses on the treatment of heart failure, avoiding the term "heart failure," which is a confounding concept. In this paper we carefully analyze the meaning of contractility, ventricular function or performance, preload, afterload, heart failure, compensation mechanisms in heart failure, myocardial oxygen consumption, inadequate, adequate and inappropriate hypertrophy, systole, diastole, compliance, problems of relaxation, and diastolic dysfunction. Their definitions are supported by the original scientific descriptions in an attempt to clarify the concepts about ventricular function and heart failure and, in this way, use the same scientific language about the meaning of ventricular function, heart failure, and diastolic dysfunction.

  20. A major X-linked locus affects kidney function in mice

    PubMed Central

    Leduc, Magalie S.; Savage, Holly S.; Stearns, Timothy M.; Cario, Clinton L.; Walsh, Kenneth A.; Paigen, Beverly; Berndt, Annerose

    2012-01-01

    Chronic kidney disease is a common disease with increasing prevalence in the western population. One common reason for chronic kidney failure is diabetic nephropathy. Diabetic nephropathy and hyperglycemia are characteristics of the mouse inbred strain KK/HlJ, which is predominantly used as a model for metabolic syndrome due to its inherited glucose intolerance and insulin resistance. We used KK/HlJ, an albuminuria-sensitive strain, and C57BL/6J, an albuminuria-resistant strain, to perform a quantitative trait locus (QTL) cross to identify the genetic basis for chronic kidney failure. Albumin-creatinine ratio (ACR) was measured in 130 F2 male offspring. One significant QTL was identified on chromosome (Chr) X and four suggestive QTLs were found on Chrs 6, 7, 12, and 13. Narrowing of the QTL region was focused on the X-linked QTL and performed by incorporating genotype and expression analyses for genes located in the region. From the 485 genes identified in the X-linked QTL region, a few candidate genes were identified using a combination of bioinformatic evidence based on genomic comparison of the parental strains and known function in urine homeostasis. Finally, this study demonstrates the significance of the X chromosome in the genetic determination of albuminuria. PMID:23011808

  1. Mode of action and effects of standardized collaborative disease management on mortality and morbidity in patients with systolic heart failure: the Interdisciplinary Network for Heart Failure (INH) study.

    PubMed

    Angermann, Christiane E; Störk, Stefan; Gelbrich, Götz; Faller, Hermann; Jahns, Roland; Frantz, Stefan; Loeffler, Markus; Ertl, Georg

    2012-01-01

    Trials investigating efficacy of disease management programs (DMP) in heart failure reported contradictory results. Features rendering specific interventions successful are often ill defined. We evaluated the mode of action and effects of a nurse-coordinated DMP (HeartNetCare-HF, HNC). Patients hospitalized for systolic heart failure were randomly assigned to HNC or usual care (UC). Besides telephone-based monitoring and education, HNC addressed individual problems raised by patients, pursued networking of health care providers and provided training for caregivers. End points were time to death or rehospitalization (combined primary), heart failure symptoms, and quality of life (SF-36). Of 1007 consecutive patients, 715 were randomly assigned (HNC: n=352; UC: n=363; age, 69±12 years; 29% female; 40% New York Heart Association class III-IV). Within 180 days, 130 HNC and 137 UC patients reached the primary end point (hazard ratio, 1.02; 95% confidence interval, 0.81-1.30; P=0.89), since more HNC patients were readmitted. Overall, 32 HNC and 52 UC patients died (1 UC patient and 4 HNC patients after dropout); thus, uncensored hazard ratio was 0.62 (0.40-0.96; P=0.03). HNC patients improved more regarding New York Heart Association class (P=0.05), physical functioning (P=0.03), and physical health component (P=0.03). Except for HNC, health care utilization was comparable between groups. However, HNC patients requested counseling for noncardiac problems even more frequently than for cardiovascular or heart-failure-related issues. The primary end point of this study was neutral. However, mortality risk and surrogates of well-being improved significantly. Quantitative assessment of patient requirements suggested that besides (tele)monitoring individualized care considering also noncardiac problems should be integrated in efforts to achieve more sustainable improvement in heart failure outcomes. URL: http://www.controlled-trials.com. Unique identifier: ISRCTN23325295.

  2. The Considere Condition and Rapid Stretching of Linear and Branched Polymer Melts

    NASA Technical Reports Server (NTRS)

    McKinley, Gareth H.; Hassager, Ole

    1999-01-01

    We analyze the onset of "necking" and subsequent filament failure during the transient uniaxial elongation of viscoelastic fluid samples in extensional rheometers. In the limit of rapid elongation (such that no molecular relaxation occurs), the external work applied is all stored elastically and the Considere criterion originally developed in solid mechanics can be used to quantitatively predict the critical Hencky strain to failure. By comparing the predictions of the Doi-Edwards model for linear homopolymer melts with those of the "Pom-Pom" model for prototypical branched melts we show that the critical strain to failure in rapid elongation of a rubbery material is intimately linked to the molecular topology of the chain, especially the degree of chain branching. The onset of necking instability is monotonically shifted to larger Hencky strains as the number of branches is increased. Numerical computations at finite Deborah numbers also show that there is an optimal range of deformation rates over which homogeneous extensions can be maintained to large strain. We also consider other rapid homogeneous stretching deformations, such as biaxial and planar stretching, and show that the degree of stabilization afforded by inclusion of material with long-chain branching is a sensitive function of the imposed mode of deformation.
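    The Considère condition itself is compact: in uniaxial extension the tensile force F = σA peaks, and necking can begin, once dσ/dε = σ in true-stress/Hencky-strain terms. A minimal pure-Python sketch using the textbook power-law hardening solid σ = K·εⁿ (hypothetical K and n; the classical result for this material is ε_crit = n — this is the solid-mechanics form, not the Doi-Edwards or Pom-Pom computation of the paper):

```python
def considere_critical_strain(stress, eps_max=3.0, n_steps=300000):
    """Numerically locate the Considere point: the Hencky strain at which the
    tensile force F ~ stress(eps) * exp(-eps) peaks, i.e. where d(stress)/d(eps)
    first drops below stress(eps)."""
    d_eps = eps_max / n_steps
    for i in range(1, n_steps):
        eps = i * d_eps
        dsde = (stress(eps + d_eps) - stress(eps - d_eps)) / (2 * d_eps)
        if dsde < stress(eps):
            return eps
    return None  # no force maximum: the sample stretches homogeneously

# Power-law hardening solid, sigma = K * eps**n (hypothetical K, n):
K, n = 100.0, 0.35
eps_crit = considere_critical_strain(lambda e: K * e**n)
```

    The numerical crossing lands at the analytical value ε_crit = n, independent of K.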

  3. Differential reliability : probabilistic engineering applied to wood members in bending-tension

    Treesearch

    Stanley K. Suddarth; Frank E. Woeste; William L. Galligan

    1978-01-01

    Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...
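    The contrast described above can be sketched with a stress-strength interference model: load and strength are random variables, failure occurs when load exceeds strength, and two cases identical in all respects but one are compared through their failure probabilities. All distribution parameters below are hypothetical:

```python
import random

def prob_failure(load_mean, load_sd, strength_mean, strength_sd,
                 trials=200000, seed=1):
    """Monte Carlo estimate of P(load > strength) for normally distributed
    load and strength (a simple stress-strength interference model)."""
    rng = random.Random(seed)
    fails = sum(
        rng.gauss(load_mean, load_sd) > rng.gauss(strength_mean, strength_sd)
        for _ in range(trials)
    )
    return fails / trials

# Two cases similar in all respects but one (strength variability):
pf_low_var  = prob_failure(1000, 150, 1800, 200)
pf_high_var = prob_failure(1000, 150, 1800, 450)
```

    The contrast between the two estimated probabilities isolates the effect of the single differing factor, which is the "differential reliability" idea of the paper.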

  4. The Minimum Grading Controversy: Results of a Quantitative Study of Seven Years of Grading Data from an Urban High School

    ERIC Educational Resources Information Center

    Carey, Theodore; Carifio, James

    2012-01-01

    In an effort to reduce failure and drop-out rates, schools have been implementing minimum grading. One form involves raising catastrophically low student quarter grades to a predetermined minimum--typically a 50. Proponents argue it gives struggling students a reasonable chance to recover from failure. Critics contend the practice induces grade…

  5. A novel strategy for rapid detection of NT-proBNP

    NASA Astrophysics Data System (ADS)

    Cui, Qiyao; Sun, Honghao; Zhu, Hui

    2017-09-01

    In order to establish a simple, rapid, sensitive, and specific quantitative assay to detect the biomarkers of heart failure, in this study, biotin-streptavidin technology was employed with fluorescence immunochromatographic assay to detect the concentration of the biomarkers in serum, and this method was applied to detect NT-proBNP, which is valuable for diagnostic evaluation of heart failure.

  6. Students' Failure to Submit Research Projects on Time: A Case Study from Masvingo Regional Centre at Zimbabwe Open University

    ERIC Educational Resources Information Center

    Chabaya, Owence; Chiome, Chrispen; Chabaya, Raphinos A.

    2009-01-01

    The study sought to determine lecturers' and students' perceptions of factors contributing to students' failure to submit research projects on time in three departments of the Zimbabwe Open University. The study employed a descriptive survey design and was both quantitative and qualitative. The questionnaire used as a data-gathering instrument had…

  7. A novel approach for evaluating the risk of health care failure modes.

    PubMed

    Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang

    2012-12-01

    Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of the health care failure modes and taking priority action for safety improvement. The purpose of this paper is to propose a novel approach to data analysis: integrating FMEA with a mathematical tool, data envelopment analysis (DEA) with the "slack-based measure" (SBM). The risk indexes (severity, occurrence, and detection) of FMEA are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach are illustrated by one health care case. Being a systematic approach for improving the service quality of health care, the approach can offer quantitative corrective information on the risk indexes that thereafter reduces failure possibility. For safety improvement, these new targets of the risk indexes could be used for management by objectives. FMEA alone cannot provide quantitative corrective information on the risk indexes; the novel approach overcomes this chief shortcoming. By combining the DEA SBM model with FMEA, the two goals of increased patient safety and reduced medical cost can be achieved together.

  8. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can force the HRSG power plant to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of the HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presented a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of the API 581 standard rated the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation was done with the aim of reducing risk by optimizing the risk assessment activities.
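    The semi-quantitative ranking reduces to a matrix lookup: a probability category (1-5) and a consequence category (A-E) index a risk band. A small sketch, with a banding that is an assumption for illustration rather than the actual API 581 tables:

```python
# Illustrative 5x5 risk matrix for a semi-quantitative RBI ranking
# (probability category 1-5, consequence category A-E). The banding below
# is an assumption for illustration, not the actual API 581 tables.
RISK_BANDS = {
    "low":         {"1A", "1B", "1C", "2A", "2B", "3A"},
    "medium":      {"1D", "2C", "2D", "3B", "3C", "4A"},
    "medium-high": {"1E", "2E", "3D", "4B", "4C", "5A", "5B"},
    "high":        {"3E", "4D", "4E", "5C", "5D", "5E"},
}

def risk_level(prob_cat, cons_cat):
    """Map a (probability, consequence) category pair like (4, 'C') to a band."""
    cell = f"{prob_cat}{cons_cat}"
    for level, cells in RISK_BANDS.items():
        if cell in cells:
            return level
    raise ValueError(f"unknown matrix cell {cell}")

for item, (p, c) in {"HP superheater": (4, "C"),
                     "HP evaporator": (4, "C"),
                     "HP economizer": (3, "C")}.items():
    print(item, risk_level(p, c))
```

    With this banding, the cells reported in the abstract map to medium-high (4C) and medium (3C).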

  9. Dynamics of functional failures and recovery in complex road networks

    NASA Astrophysics Data System (ADS)

    Zhan, Xianyuan; Ukkusuri, Satish V.; Rao, P. Suresh C.

    2017-11-01

    We propose a new framework for modeling the evolution of functional failures and recoveries in complex networks, with traffic congestion on road networks as the case study. Differently from conventional approaches, we transform the evolution of functional states into an equivalent dynamic structural process: dual-vertex splitting and coalescing embedded within the original network structure. The proposed model successfully explains traffic congestion and recovery patterns at the city scale based on high-resolution data from two megacities. Numerical analysis shows that certain network structural attributes can amplify or suppress cascading functional failures. Our approach represents a new general framework to model functional failures and recoveries in flow-based networks and allows understanding of the interplay between structure and function for flow-induced failure propagation and recovery.

  10. Floating Node Method and Virtual Crack Closure Technique for Modeling Matrix Cracking-Delamination Interaction

    NASA Technical Reports Server (NTRS)

    DeCarvalho, N. V.; Chen, B. Y.; Pinho, S. T.; Baiz, P. M.; Ratcliffe, J. G.; Tay, T. E.

    2013-01-01

    A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.
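    For reference, the quantity VCCT supplies at each crack-front node is an energy release rate; in its simplest 2-D one-step form it combines the crack-tip nodal force with the opening displacement one element behind the tip. The numbers below are hypothetical:

```python
def vcct_mode_I(nodal_force, crack_opening, width, element_length):
    """One-step VCCT estimate of the mode-I energy release rate:
    G_I = F_y * delta_v / (2 * b * delta_a), where F_y is the nodal force at
    the crack tip, delta_v the opening displacement one element behind the
    tip, b the specimen width, and delta_a the element length at the front."""
    return nodal_force * crack_opening / (2.0 * width * element_length)

# Hypothetical values (N, mm); G_Ic threshold is likewise assumed:
G_I = vcct_mode_I(nodal_force=120.0, crack_opening=0.004,
                  width=25.0, element_length=0.5)
crack_grows = G_I >= 0.2  # fracture-mechanics failure criterion: G_I >= G_Ic
```

    Delamination is advanced (node released) wherever the computed G_I reaches the assumed toughness G_Ic.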

  11. Floating Node Method and Virtual Crack Closure Technique for Modeling Matrix Cracking-Delamination Migration

    NASA Technical Reports Server (NTRS)

    DeCarvalho, Nelson V.; Chen, B. Y.; Pinho, Silvestre T.; Baiz, P. M.; Ratcliffe, James G.; Tay, T. E.

    2013-01-01

    A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.

  12. Failure mode and effects analysis: too little for too much?

    PubMed

    Dean Franklin, Bryony; Shebl, Nada Atef; Barber, Nick

    2012-07-01

    Failure mode and effects analysis (FMEA) is a structured prospective risk assessment method that is widely used within healthcare. FMEA involves a multidisciplinary team mapping out a high-risk process of care, identifying the failures that can occur, and then characterising each of these in terms of probability of occurrence, severity of effects and detectability, to give a risk priority number used to identify failures most in need of attention. One might assume that such a widely used tool would have an established evidence base. This paper considers whether or not this is the case, examining the evidence for the reliability and validity of its outputs, the mathematical principles behind the calculation of a risk priority number, and variation in how it is used in practice. We also consider the likely advantages of this approach, together with the disadvantages in terms of the healthcare professionals' time involved. We conclude that although FMEA is popular and many published studies have reported its use within healthcare, there is little evidence to support its use for the quantitative prioritisation of process failures. It lacks both reliability and validity, and is very time consuming. We would not recommend its use as a quantitative technique to prioritise, promote or study patient safety interventions. However, the stage of FMEA involving multidisciplinary process mapping seems valuable, and work is now needed to identify the best way of converting this into plans for action.
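    The risk priority number at the center of this critique is just the product of three ordinal scores. A minimal sketch of the calculation (failure modes and scores are hypothetical), which also makes the paper's point visible: multiplying ordinal scales lets quite different (S, O, D) profiles collapse onto nearly identical ranks:

```python
# RPN = severity x occurrence x detectability, each scored 1-10.
# Failure modes and scores below are hypothetical illustrations.
failure_modes = {
    "wrong drug dispensed":     {"S": 9, "O": 3, "D": 4},
    "dose calculation error":   {"S": 8, "O": 4, "D": 5},
    "label misread at bedside": {"S": 7, "O": 5, "D": 3},
}

def rpn(scores):
    """Risk priority number of one failure mode."""
    return scores["S"] * scores["O"] * scores["D"]

# Rank failure modes by descending RPN, as conventional FMEA prescribes:
ranked = sorted(failure_modes, key=lambda m: rpn(failure_modes[m]), reverse=True)
```

    Note that the first and last modes score 108 and 105 despite very different severity/occurrence profiles, one of the validity problems the paper discusses.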

  13. The Failing Heart Relies on Ketone Bodies as a Fuel.

    PubMed

    Aubert, Gregory; Martin, Ola J; Horton, Julie L; Lai, Ling; Vega, Rick B; Leone, Teresa C; Koves, Timothy; Gardell, Stephen J; Krüger, Marcus; Hoppel, Charles L; Lewandowski, E Douglas; Crawford, Peter A; Muoio, Deborah M; Kelly, Daniel P

    2016-02-23

    Significant evidence indicates that the failing heart is energy starved. During the development of heart failure, the capacity of the heart to utilize fatty acids, the chief fuel, is diminished. Identification of alternate pathways for myocardial fuel oxidation could unveil novel strategies to treat heart failure. Quantitative mitochondrial proteomics was used to identify energy metabolic derangements that occur during the development of cardiac hypertrophy and heart failure in well-defined mouse models. As expected, the amounts of proteins involved in fatty acid utilization were downregulated in myocardial samples from the failing heart. Conversely, expression of β-hydroxybutyrate dehydrogenase 1, a key enzyme in the ketone oxidation pathway, was increased in the heart failure samples. Studies of relative oxidation in an isolated heart preparation using ex vivo nuclear magnetic resonance combined with targeted quantitative myocardial metabolomic profiling using mass spectrometry revealed that the hypertrophied and failing heart shifts to oxidizing ketone bodies as a fuel source in the context of reduced capacity to oxidize fatty acids. Distinct myocardial metabolomic signatures of ketone oxidation were identified. These results indicate that the hypertrophied and failing heart shifts to ketone bodies as a significant fuel source for oxidative ATP production. Specific metabolite biosignatures of in vivo cardiac ketone utilization were identified. Future studies aimed at determining whether this fuel shift is adaptive or maladaptive could unveil new therapeutic strategies for heart failure. © 2016 American Heart Association, Inc.

  14. A manual for pyrotechnic design, development and qualification

    NASA Astrophysics Data System (ADS)

    Bement, Laurence J.; Schimmel, Morry L.

    1995-06-01

    Although pyrotechnic devices have been singularly responsible for the success of many of the critical mechanical functions in aerospace programs for over 30 years, ground and in-flight failures continue to occur. Subsequent investigations reveal that little or no quantitative information is available on measuring the effects on performance of system variables or on determining functional margins. Pyrotechnics are considered to be readily available and, therefore, can be managed by any subsystem in which they are applied, such as structure, propulsion, electric power, or life support. The primary purpose of this manual is to alter the concept that the use of pyrotechnics is an art and refute 'justifications' that applications do not need to be understood by providing information on pyrotechnic design, development, and qualification on an engineering basis. Included are approaches to demonstrate functional reliability with less than 10 units, how to manage pyrotechnic-unique requirements, and methods to assure that the system is properly assembled and will perform the required tasks.
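    One reason demonstrating reliability with fewer than 10 units is hard by go/no-go testing alone is the success-run relation: with n firings and zero failures, the reliability demonstrated at confidence C is only R = (1 − C)^(1/n). A short sketch of that standard binomial argument (a background result, not necessarily the manual's margin-based method):

```python
import math

def demonstrated_reliability(n_successes, confidence=0.90):
    """Zero-failure (success-run) demonstration: with n consecutive successes,
    the reliability demonstrated at the given confidence is R = (1-C)**(1/n)."""
    return (1.0 - confidence) ** (1.0 / n_successes)

def units_required(reliability, confidence=0.90):
    """Units needed to demonstrate a target reliability with zero failures:
    n = ln(1-C) / ln(R), rounded up."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))
```

    Ten failure-free firings demonstrate only about R = 0.79 at 90% confidence, and demonstrating R = 0.95 that way takes 45 units, which is one motivation for the functional-margin approach the manual describes.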

  15. A manual for pyrotechnic design, development and qualification

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Schimmel, Morry L.

    1995-01-01

    Although pyrotechnic devices have been singularly responsible for the success of many of the critical mechanical functions in aerospace programs for over 30 years, ground and in-flight failures continue to occur. Subsequent investigations reveal that little or no quantitative information is available on measuring the effects on performance of system variables or on determining functional margins. Pyrotechnics are considered to be readily available and, therefore, can be managed by any subsystem in which they are applied, such as structure, propulsion, electric power, or life support. The primary purpose of this manual is to alter the concept that the use of pyrotechnics is an art and refute 'justifications' that applications do not need to be understood by providing information on pyrotechnic design, development, and qualification on an engineering basis. Included are approaches to demonstrate functional reliability with less than 10 units, how to manage pyrotechnic-unique requirements, and methods to assure that the system is properly assembled and will perform the required tasks.

  16. Reliability analysis and fault-tolerant system development for a redundant strapdown inertial measurement unit. [inertial platforms

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate, both the gyro and accelerometer failure rates together, false alarms, probability of failure detection, probability of failure isolation, probability of damage effects, and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
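    The structure of such a Markov evaluation model can be illustrated with a far smaller analog than the 27-state model: three states (fully operational; fail-operational after a covered first failure; system failure), with detection/isolation coverage as a parameter. A pure-Python Euler-integration sketch with hypothetical rates:

```python
def markov_reliability(lam, coverage, t_end, dt=0.01):
    """Euler integration of a 3-state Markov reliability model:
    state 0 = fully operational, state 1 = fail-operational (first failure
    detected and isolated with probability `coverage`), state 2 = system
    failure. Returns the probability that state 2 has not been reached."""
    p = [1.0, 0.0, 0.0]
    for _ in range(int(t_end / dt)):
        to1  = lam * coverage * p[0] * dt        # covered first failure
        to2a = lam * (1 - coverage) * p[0] * dt  # uncovered first failure
        to2b = lam * p[1] * dt                   # second failure while fail-op
        p = [p[0] - to1 - to2a, p[1] + to1 - to2b, p[2] + to2a + to2b]
    return p[0] + p[1]

# Hypothetical failure rate (per hour) and two coverage values:
r_good = markov_reliability(lam=1e-3, coverage=0.99, t_end=100.0)
r_poor = markov_reliability(lam=1e-3, coverage=0.80, t_end=100.0)
```

    The comparison shows the sensitivity the paper studies: reliability degrades as the probability of failure detection and isolation (coverage) drops.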

  17. Exercise Training for Heart Failure Patients with and without Systolic Dysfunction: An Evidence-Based Analysis of How Patients Benefit

    PubMed Central

    Smart, Neil

    2011-01-01

    Significant benefits can be derived by heart failure patients from exercise training. This paper provides an evidence-based assessment of expected clinical benefits of exercise training for heart failure patients. Meta-analyses and randomized, controlled trials of exercise training in heart failure patients were reviewed from a search of PubMed, Cochrane Controlled Trial Registry (CCTR), CINAHL, and EMBASE. Exercise training improves functional capacity, quality of life, hospitalization, and systolic and diastolic function in heart failure patients. Heart failure patients with preserved systolic function (HFnEF) participating in exercise training studies are more likely to be women and are 5–7 years older than their systolic heart failure (CHF) counterparts. All patients exhibit low functional capacities, although in HFnEF patients this may be age related, therefore subtle differences in exercise prescriptions are required. Published works report that exercise training is beneficial for heart failure patients with and without systolic dysfunction. PMID:20953365

  18. An Illustration of Determining Quantitatively the Rock Mass Quality Parameters of the Hoek-Brown Failure Criterion

    NASA Astrophysics Data System (ADS)

    Wu, Li; Adoko, Amoussou Coffi; Li, Bo

    2018-04-01

    In tunneling, determining quantitatively the rock mass strength parameters of the Hoek-Brown (HB) failure criterion is useful since it can improve the reliability of the design of tunnel support systems. In this study, a quantitative method is proposed to determine the rock mass quality parameters of the HB failure criterion, namely the Geological Strength Index (GSI) and the disturbance factor (D), based on the structure of the drilling core and the weathering condition of the rock mass, combined with an acoustic wave test to calculate the strength of the rock mass. The Rock Mass Structure Index and the Rock Mass Weathering Index are used to quantify the GSI, while the longitudinal wave velocity (Vp) is employed to derive the value of D. The DK383+338 tunnel face of Yaojia tunnel of the Shanghai-Kunming passenger dedicated line served as an illustration of how the methodology is implemented. The values of GSI and D are obtained using the HB criterion and then using the proposed method. The measured in situ stress is used to evaluate their accuracy. To this end, the major and minor principal stresses are calculated based on the GSI and D given by the HB criterion and by the proposed method. The results indicated that both methods were close to the field observation, which suggests that the proposed method can be used for determining quantitatively the rock quality parameters as well. However, these results remain valid only for rock mass quality and rock type similar to those of the DK383+338 tunnel face of Yaojia tunnel.
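    For context, GSI and D feed the generalized Hoek-Brown criterion through its rock-mass constants. A sketch of the 2002-edition relations with hypothetical input values (the paper's contribution, deriving GSI from core structure/weathering indexes and D from Vp, is not reproduced here):

```python
import math

def hoek_brown_params(gsi, d, mi):
    """Generalized Hoek-Brown (2002 edition) rock-mass constants from the
    Geological Strength Index (GSI) and disturbance factor D."""
    mb = mi * math.exp((gsi - 100.0) / (28.0 - 14.0 * d))
    s = math.exp((gsi - 100.0) / (9.0 - 3.0 * d))
    a = 0.5 + (math.exp(-gsi / 15.0) - math.exp(-20.0 / 3.0)) / 6.0
    return mb, s, a

def sigma1(sigma3, sigma_ci, gsi, d, mi):
    """Major principal stress at failure for a given confinement sigma3."""
    mb, s, a = hoek_brown_params(gsi, d, mi)
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# Hypothetical rock mass: sigma_ci = 60 MPa, mi = 10, GSI = 55, D = 0.3
strength = sigma1(sigma3=2.0, sigma_ci=60.0, gsi=55, d=0.3, mi=10)
```

    As a sanity check, intact rock (GSI = 100, D = 0) recovers mb = mi, s = 1, a = 0.5.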

  19. Numerical simulation of damage and progressive failures in composite laminates using the layerwise plate theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, Y.S.

    1992-01-01

    The failure behavior of composite laminates is modeled numerically using the Generalized Layerwise Plate Theory (GLPT) of Reddy and a progressive failure algorithm. The Layerwise Theory of Reddy assumes a piecewise continuous displacement field through the thickness of the laminate and therefore has the ability to capture the interlaminar stress fields near the free edges and cut outs more accurately. The progressive failure algorithm is based on the assumption that the material behaves like a stable progressively fracturing solid. A three-dimensional stiffness reduction scheme is developed and implemented to study progressive failures in composite laminates. The effect of various parameters such as out-of-plane material properties, boundary conditions, and stiffness reduction methods on the failure stresses and strains of a quasi-isotropic composite laminate with free edges subjected to tensile loading is studied. The ultimate stresses and strains predicted by the Generalized Layerwise Plate Theory (GLPT) and the more widely used First Order Shear Deformation Theory (FSDT) are compared with experimental results. The predictions of the GLPT are found to be in good agreement with the experimental results both qualitatively and quantitatively, while the predictions of FSDT are found to be different from experimental results both qualitatively and quantitatively. The predictive ability of various phenomenological failure criteria is evaluated with reference to the experimental results available in the literature. The effect of geometry of the test specimen and the displacement boundary conditions at the grips on the ultimate stresses and strains of a composite laminate under compressive loading is studied. The ultimate stresses and strains are found to be quite sensitive to the geometry of the test specimen and the displacement boundary conditions at the grips. The degree of sensitivity is observed to depend strongly on the lamination sequence.
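    The flavor of a stiffness reduction scheme can be conveyed by a toy analog: elements strained in parallel, each losing its stiffness (by a discount factor) once its stress exceeds its strength, with the released load redistributed to the survivors. The moduli and strengths below are hypothetical and the scheme is illustrative, not the GLPT implementation:

```python
def progressive_failure(moduli, strengths, discount=1e-6,
                        d_strain=1e-4, max_strain=0.05):
    """Toy ply-discount progressive failure for elements strained in parallel:
    at each strain step, any element whose stress exceeds its strength has its
    modulus multiplied by `discount`; surviving elements pick up the released
    load. Returns the peak total stress and the strain at which it occurred."""
    e = [float(m) for m in moduli]
    peak_stress, peak_strain, strain = 0.0, 0.0, 0.0
    while strain < max_strain:
        strain += d_strain
        for i, (mod, strength) in enumerate(zip(e, strengths)):
            if mod * strain > strength:
                e[i] = mod * discount  # stiffness reduction on failure
        total = sum(mod * strain for mod in e)
        if total > peak_stress:
            peak_stress, peak_strain = total, strain
    return peak_stress, peak_strain

# Two hypothetical plies: stiff-brittle and compliant-ductile (arbitrary units)
peak, at_strain = progressive_failure(moduli=[100.0, 40.0], strengths=[1.0, 1.2])
```

    The laminate-level ultimate stress (here the peak of the load-strain curve) emerges from the sequence of element failures rather than from any single ply strength.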

  20. Impaired immune function in children and adults with Fanconi anemia.

    PubMed

    Myers, Kasiani C; Sauter, Sharon; Zhang, Xue; Bleesing, Jacob J; Davies, Stella M; Wells, Susanne I; Mehta, Parinda A; Kumar, Ashish; Marmer, Daniel; Marsh, Rebecca; Brown, Darron; Butsch Kovacic, Melinda

    2017-11-01

    Fanconi anemia (FA) is a rare genetic disorder characterized by genome instability, bone marrow failure, and cancer predisposition. Previously, small studies have reported heterogeneous immune dysfunction in FA. We performed a detailed immunologic assessment in a large FA cohort who have not undergone bone marrow transplantation or developed malignancies. Comprehensive quantitative and functional immunologic assessment of 29 FA individuals was compared to healthy age-matched controls. Compared to non-FA persons of similar ages, FA individuals showed lower absolute total B cells (P < 0.001), lower memory B cells (P < 0.001), and decreased IgM (P < 0.001) but normal IgG. NK cells (P < 0.001) and NK cytotoxicity (P < 0.001) were decreased. CD4+ T cells were decreased (P = 0.022), while CD8+ T cell and absolute T-cell numbers were comparable. Cytotoxic T cells (P < 0.003) and antigen proliferation responses to tetanus (P = 0.019) and candida (P = 0.019) were diminished in FA. Phytohemagglutinin responses and plasma cytokines were normal. Within FA subjects, adults and older children (≥10 years) exhibited higher CD8+ T cells than younger children (P = 0.004). Documented atypical infections were infrequent, although oral human papilloma virus (HPV) prevalence was higher (31% positive) in FA. Overall, these results demonstrate a high rate of significant humoral and cellular immune dysfunction. Continued longitudinal study of immune function is critical to understand evolution with age, bone marrow failure, and cancer development. © 2017 Wiley Periodicals, Inc.

  1. Failure Modes Effects and Criticality Analysis, an Underutilized Safety, Reliability, Project Management and Systems Engineering Tool

    NASA Astrophysics Data System (ADS)

    Mullin, Daniel Richard

    2013-09-01

    The majority of space programs, whether manned or unmanned, for science or exploration, require that a Failure Modes Effects and Criticality Analysis (FMECA) be performed as part of their safety and reliability activities. This comes as no surprise given that FMECAs have been an integral part of the reliability engineer's toolkit since the 1950s. The reasons for performing a FMECA are well known, including fleshing out system single point failures, system hazards, and critical components and functions. However, the author's ten years of experience as a space systems safety and reliability engineer demonstrate that the FMECA is often performed as an afterthought, simply to meet contract deliverable requirements, and is often started long after the system requirements allocation and preliminary design have been completed. There are also important qualitative and quantitative components often missing which can provide useful data to all project stakeholders. These include probability of occurrence, probability of detection, time to effect, time to detect and, finally, the Risk Priority Number. This is unfortunate, as the FMECA is a powerful system design tool that, when used effectively, can help optimize system function while minimizing the risk of failure. When performed as early as possible in conjunction with writing the top level system requirements, the FMECA can provide instant feedback on the viability of the requirements while providing a valuable sanity check early in the design process. It can indicate which areas of the system will require redundancy and which areas are inherently the most risky from the onset.
Based on historical and practical examples, it is this author's contention that FMECAs are an immense source of important information for all involved stakeholders in a given project and can provide several benefits including, efficient project management with respect to cost and schedule, system engineering and requirements management, assembly integration and test (AI&T) and operations if applied early, performed to completion and updated along with system design.

  2. SU-F-R-46: Predicting Distant Failure in Lung SBRT Using Multi-Objective Radiomics Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Z; Folkert, M; Iyengar, P

    2016-06-15

    Purpose: To predict distant failure in lung stereotactic body radiation therapy (SBRT) in early stage non-small cell lung cancer (NSCLC) by using a new multi-objective radiomics model. Methods: Currently, most available radiomics models use the overall accuracy as the objective function. However, due to data imbalance, a single objective may not reflect the performance of a predictive model. Therefore, we developed a multi-objective radiomics model which considers both sensitivity and specificity as the objective functions simultaneously. The new model is used to predict distant failure in lung SBRT using 52 patients treated at our institute. Quantitative imaging features of PET and CT as well as clinical parameters are utilized to build the predictive model. Image features include intensity features (9), textural features (12) and geometric features (8). Clinical parameters for each patient include demographic parameters (4), tumor characteristics (8), treatment fraction schemes (4) and pretreatment medicines (6). The modelling procedure consists of two steps: extracting features from segmented tumors in PET and CT; and selecting features and training model parameters based on the multiple objectives. Support Vector Machine (SVM) is used as the predictive model, while a nondominated sorting-based multi-objective evolutionary computation algorithm II (NSGA-II) is used for solving the multi-objective optimization. Results: The accuracies for PET, clinical, CT, PET+clinical, PET+CT, CT+clinical, PET+CT+clinical are 71.15%, 84.62%, 84.62%, 85.54%, 82.69%, 84.62%, 86.54%, respectively. The sensitivities for the above seven combinations are 41.76%, 58.33%, 50.00%, 50.00%, 41.67%, 41.67%, 58.33%, while the specificities are 80.00%, 92.50%, 90.00%, 97.50%, 92.50%, 97.50%, 97.50%. Conclusion: A new multi-objective radiomics model for predicting distant failure in NSCLC treated with SBRT was developed. The experimental results show that the best performance can be obtained by combining all features.
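    The dominance relation that NSGA-II-style sorting is built on is easy to state: a candidate survives if no other candidate is at least as good on both sensitivity and specificity and strictly better on one. A small sketch with hypothetical (sensitivity, specificity) pairs for candidate models:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

def nondominated(points):
    """Pareto front of (sensitivity, specificity) pairs: keep a point unless
    some other point is >= in both objectives and > in at least one (the
    dominance test that NSGA-II's nondominated sorting is built on)."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] >= p[0] and q[1] >= p[1] and (q[0] > p[0] or q[1] > p[1])
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidate models evaluated on a validation set:
models = [(0.58, 0.975), (0.42, 0.925), (0.50, 0.90), (0.58, 0.90)]
front = nondominated(models)
```

    Optimizing both objectives at once, rather than overall accuracy, is what keeps an imbalanced data set from rewarding models that simply predict the majority class.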

  3. Color Shift Failure Prediction for Phosphor-Converted White LEDs by Modeling Features of Spectral Power Distribution with a Nonlinear Filter Approach

    PubMed Central

    Mohamed, Moumouni Guero; Fan, Xuejun; Zhang, Guoqi; Pecht, Michael

    2017-01-01

    With the expanding application of light-emitting diodes (LEDs), the color quality of white LEDs has attracted much attention in several color-sensitive application fields, such as museum lighting, healthcare lighting and displays. Reliability concerns for white LEDs are changing from the luminous efficiency to color quality. However, most of the current available research on the reliability of LEDs is still focused on luminous flux depreciation rather than color shift failure. The spectral power distribution (SPD), defined as the radiant power distribution emitted by a light source over the range of visible wavelengths, contains the most fundamental luminescence mechanisms of a light source. SPD is used as the quantitative inference of an LED’s optical characteristics, including color coordinates that are widely used to represent the color shift process. Thus, to model the color shift failure of white LEDs during aging, this paper first extracts the features of an SPD, representing the characteristics of blue LED chips and phosphors, by multi-peak curve-fitting and modeling them with statistical functions. Then, because the shift processes of extracted features in aged LEDs are always nonlinear, a nonlinear state-space model is then developed to predict the color shift failure time within a self-adaptive particle filter framework. The results show that: (1) the failure mechanisms of LEDs can be identified by analyzing the extracted features of SPD with statistical curve-fitting and (2) the developed method can dynamically and accurately predict the color coordinates, correlated color temperatures (CCTs), and color rendering indexes (CRIs) of phosphor-converted (pc)-white LEDs, and also can estimate the residual color life. PMID:28773176

  4. Color Shift Failure Prediction for Phosphor-Converted White LEDs by Modeling Features of Spectral Power Distribution with a Nonlinear Filter Approach.

    PubMed

    Fan, Jiajie; Mohamed, Moumouni Guero; Qian, Cheng; Fan, Xuejun; Zhang, Guoqi; Pecht, Michael

    2017-07-18

    With the expanding application of light-emitting diodes (LEDs), the color quality of white LEDs has attracted much attention in several color-sensitive application fields, such as museum lighting, healthcare lighting and displays. Reliability concerns for white LEDs are changing from the luminous efficiency to color quality. However, most of the current available research on the reliability of LEDs is still focused on luminous flux depreciation rather than color shift failure. The spectral power distribution (SPD), defined as the radiant power distribution emitted by a light source over the range of visible wavelengths, contains the most fundamental luminescence mechanisms of a light source. SPD is used as the quantitative inference of an LED's optical characteristics, including color coordinates that are widely used to represent the color shift process. Thus, to model the color shift failure of white LEDs during aging, this paper first extracts the features of an SPD, representing the characteristics of blue LED chips and phosphors, by multi-peak curve-fitting and modeling them with statistical functions. Then, because the shift processes of extracted features in aged LEDs are always nonlinear, a nonlinear state-space model is then developed to predict the color shift failure time within a self-adaptive particle filter framework. The results show that: (1) the failure mechanisms of LEDs can be identified by analyzing the extracted features of SPD with statistical curve-fitting and (2) the developed method can dynamically and accurately predict the color coordinates, correlated color temperatures (CCTs), and color rendering indexes (CRIs) of phosphor-converted (pc)-white LEDs, and also can estimate the residual color life.
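    A pure-Python stand-in for the feature-extraction step (the paper fits multiple statistical peak functions; here a hypothetical two-Gaussian pc-white SPD is sampled and its peak positions and phosphor-to-blue ratio are read off as the aging-sensitive features):

```python
import math

def gaussian(x, amp, mu, sigma):
    return amp * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def spd(wavelength):
    """Hypothetical pc-white LED spectral power distribution: a narrow blue
    chip peak plus a broad phosphor band (values are illustrative only)."""
    return gaussian(wavelength, 1.0, 450.0, 10.0) + gaussian(wavelength, 0.6, 560.0, 45.0)

def peak_features(curve, lo=380, hi=780):
    """Local maxima of a sampled spectrum: (wavelength, height) of each peak."""
    xs = list(range(lo, hi + 1))
    ys = [curve(x) for x in xs]
    return [(xs[i], ys[i]) for i in range(1, len(xs) - 1)
            if ys[i] > ys[i - 1] and ys[i] > ys[i + 1]]

peaks = peak_features(spd)
blue, phosphor = peaks[0], peaks[1]
ratio = phosphor[1] / blue[1]  # phosphor-to-blue peak ratio, an aging feature
```

    Tracking how such features drift over aging time is what the paper's state-space model and particle filter operate on.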

  5. Risk Based Reliability Centered Maintenance of DOD Fire Protection Systems

    DTIC Science & Technology

    1999-01-01

2.2.3 Failure Mode and Effect Analysis (FMEA) ... 2.2.4 Failure Mode Risk Characterization ... Step 2 - System functions and functional failures definition; Step 3 - Failure mode and effect analysis (FMEA); Step 4 - Failure mode risk ... system). The Interface Location column identifies the location where the FMEA of the fire protection system began or stopped. For example, for the fire

  6. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
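The baseline-and-ranking idea can be illustrated with a toy event-tree quantification. The failure modes, probabilities, and mitigation branches below are hypothetical and are not part of the patented QRAS:

```python
# Toy event-tree quantification in the spirit of a QRAS-style baseline:
# an end-state scenario probability is the product of its branch probabilities.

failure_modes = {"valve_stuck": 1e-3, "sensor_drift": 5e-4}    # per mission (assumed)
mitigation_fails = {"valve_stuck": 0.10, "sensor_drift": 0.02} # branch probabilities (assumed)

# Scenario probability = P(initiating failure mode) * P(mitigation fails)
scenarios = {m: p * mitigation_fails[m] for m, p in failure_modes.items()}

# Rank particular risks against the stored baseline of lowest-level scenarios
ranked = sorted(scenarios.items(), key=lambda kv: kv[1], reverse=True)
total_risk = sum(scenarios.values())
print(ranked[0][0], total_risk)
```

Sensitivity analysis in this picture amounts to perturbing the branch probabilities and re-running the same computation over the fixed baseline of scenarios.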

  7. Gut microbiota and obesity.

    PubMed

    Scarpellini, Emidio; Campanale, Mariachiara; Leone, Diana; Purchiaroni, Flaminia; Vitale, Giovanna; Lauritano, Ernesto Cristiano; Gasbarrini, Antonio

    2010-10-01

The intestinal epithelium, mucosal immune system, and bacterial flora form a morpho-functional system in dynamic balance, responsible for intestinal metabolic and trophic functions and for the regulation of the host's mucosal and systemic immunity. Obesity, a pathological condition affecting a growing number of people, especially in Western countries, results from the failure of the organism's energy balance, which depends on the precise equality of intake, expenditure, and storage. Recent evidence explains the mechanisms of microbial regulation of the host's metabolism, both in health and in disease. In particular, animal studies have shown how qualitative and quantitative changes in microflora composition can affect nutrient absorption and energy distribution. Antibiotics, prebiotics, probiotics, and symbiotics are the instruments used in current clinical practice to modulate the intestinal bacterial flora in humans, both in health and in pathological conditions, with promising preliminary results for the prevention and therapy of obesity and related metabolic diseases.

  8. Effect of age and gender on sudomotor and cardiovagal function and blood pressure response to tilt in normal subjects

    NASA Technical Reports Server (NTRS)

    Low, P. A.; Denq, J. C.; Opfer-Gehrking, T. L.; Dyck, P. J.; O'Brien, P. C.; Slezak, J. M.

    1997-01-01

    Normative data are limited on autonomic function tests, especially beyond age 60 years. We therefore evaluated these tests in a total of 557 normal subjects evenly distributed by age and gender from 10 to 83 years. Heart rate (HR) response to deep breathing fell with increasing age. Valsalva ratio varied with both age and gender. QSART (quantitative sudomotor axon-reflex test) volume was consistently greater in men (approximately double) and progressively declined with age for all three lower extremity sites but not the forearm site. Orthostatic blood pressure reduction was greater with increasing age. HR at rest was significantly higher in women, and the increment with head-up tilt fell with increasing age. For no tests did we find a regression to zero, and some tests seem to level off with increasing age, indicating that diagnosis of autonomic failure was possible to over 80 years of age.

  9. [Biomechanical modeling of pelvic organ mobility: towards personalized medicine].

    PubMed

    Cosson, Michel; Rubod, Chrystèle; Vallet, Alexandra; Witz, Jean-François; Brieu, Mathias

    2011-11-01

    Female pelvic mobility is crucial for urinary, bowel and sexual function and for vaginal delivery. This mobility is ensured by a complex organ suspension system composed of ligaments, fascia and muscles. Impaired pelvic mobility affects one in three women of all ages and can be incapacitating. Surgical management has a high failure rate, largely owing to poor knowledge of the organ support system, including the barely discernible ligamentous system. We propose a 3D digital model of the pelvic cavity based on MRI images and quantitative tools, designed to locate the pelvic ligaments. We thus obtain a coherent anatomical and functional model which can be used to analyze pelvic pathophysiology. This work represents a first step towards creating a tool for localizing and characterizing the source of pelvic imbalance. We examine possible future applications of this model, in terms of personalized therapy and prevention.

  10. Intraoperative Transesophageal Echocardiography and Right Ventricular Failure After Left Ventricular Assist Device Implantation.

    PubMed

    Silverton, Natalie A; Patel, Ravi; Zimmerman, Josh; Ma, Jianing; Stoddard, Greg; Selzman, Craig; Morrissey, Candice K

    2018-02-15

To determine whether intraoperative measures of right ventricular (RV) function using transesophageal echocardiography are associated with subsequent RV failure after left ventricular assist device (LVAD) implantation. Retrospective, nonrandomized, observational study. Single tertiary-level, university-affiliated hospital. The study comprised 100 patients with systolic heart failure undergoing elective LVAD implantation. Transesophageal echocardiographic images before and after cardiopulmonary bypass were analyzed to quantify RV function using tricuspid annular plane systolic excursion (TAPSE), tricuspid annular systolic velocity (S'), fractional area change (FAC), RV global longitudinal strain, and RV free wall strain. A chart review was performed to determine which patients subsequently developed RV failure (right ventricular assist device placement or prolonged inotrope requirement ≥14 days). Nineteen patients (19%) subsequently developed RV failure. Postbypass FAC was the only measure of RV function that distinguished between the RV failure and non-RV failure groups (21.2% v 26.5%; p = 0.04). The sensitivity, specificity, and area under the curve of an abnormal RV FAC (<35%) for RV failure after LVAD implantation were 84%, 20%, and 0.52, respectively. No other intraoperative measure of RV function was associated with subsequent RV failure. RV failure increased ventilator time, intensive care unit and hospital length of stay, and mortality. Intraoperative measures of RV function such as tricuspid annular plane systolic excursion, tricuspid annular systolic velocity, and RV strain were not associated with RV failure after LVAD implantation. Decreased postbypass FAC was significantly associated with RV failure but showed poor discrimination.
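The quoted sensitivity and specificity follow from an ordinary confusion-matrix calculation. The counts below are reconstructed assumptions chosen to be consistent with the reported 84%/20% rates in a 100-patient cohort with 19 events; they are not data taken from the study:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for the FAC < 35% cutoff: 16 of 19 RV-failure patients
# flagged as abnormal, and only 16 of 81 non-failure patients flagged as normal.
sensitivity, specificity = sens_spec(tp=16, fn=3, tn=16, fp=65)
print(round(sensitivity, 2), round(specificity, 2))
```

A high sensitivity paired with a very low specificity, as here, is exactly what produces an area under the curve near 0.5 — the cutoff flags almost everyone and so discriminates poorly.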

  11. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

Qualitative risk assessment methods are often used as the first step in determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid in evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated, and allocates resources in a manner that manages risk to an acceptable level.
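The quantitative step — estimating a probability of failure over the design space — can be sketched with a plain Monte Carlo simulation. The response-surface model, parameter distributions, and specification limit below are invented for illustration and are not the paper's ciprofloxacin model:

```python
import random

random.seed(1)

def dissolution(hardness, lubricant):
    # Hypothetical linear response-surface model for a critical quality
    # attribute (percent dissolved); not the paper's fitted model.
    return 95.0 - 2.0 * (hardness - 8.0) - 15.0 * (lubricant - 0.5)

# Monte Carlo over parameter uncertainty: count batches failing a
# hypothetical specification of at least 90% dissolved.
n, failures = 20000, 0
for _ in range(n):
    hardness = random.gauss(8.0, 1.0)    # tablet hardness, kp (assumed spread)
    lubricant = random.gauss(0.5, 0.1)   # lubricant level, % w/w (assumed)
    if dissolution(hardness, lubricant) < 90.0:
        failures += 1

p_fail = failures / n
print(p_fail)
```

Repeating this calculation over a grid of nominal set points maps out where the probability of failure stays below an acceptable level, i.e., the design space boundary.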

  12. Renal function monitoring in heart failure – what is the optimal frequency? A narrative review

    PubMed Central

    Wright, David; Devonald, Mark Alexander John; Pirmohamed, Munir

    2017-01-01

The second most common cause of hospitalization due to adverse drug reactions in the UK is renal dysfunction due to diuretics, particularly in patients with heart failure, where diuretic therapy is a mainstay of treatment regimens. The optimal frequency for monitoring renal function in these patients is therefore an important consideration for preventing renal failure and hospitalization. This review examines the current evidence for optimal monitoring of renal function in patients with heart failure according to national and international guidelines on the management of heart failure (AHA/NICE/ESC/SIGN). Current guidance on renal function monitoring is largely based on expert opinion, and clinical studies that specifically evaluate the optimal frequency of renal function monitoring in patients with heart failure are lacking. Furthermore, there is variability between guidelines, and recommendations are typically nonspecific. Safer prescribing of diuretics in combination with other anti-heart-failure treatments requires better evidence on the frequency of renal function monitoring. We suggest moving toward more personalized monitoring rather than relying on the current medication-based guidance. Such flexible clinical guidelines could be implemented using intelligent clinical decision support systems. Personalized renal function monitoring would be more effective in preventing renal decline, rather than reacting to it. PMID:28901643

  13. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

Rock slides can be better managed through systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk, and vulnerability. For a quantitative or semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. Quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, providing better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, identification of elements at risk, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analyses of elements at risk, vulnerability assessment, and risk assessment. Its implementation for a single rock slide case is illustrated for a rock slope in Norway. Rock slides from Mount Ramnefjell into Lake Loen are considered one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in western Norway. Mount Ramnefjell is heavily jointed, leading to the formation of vertical rock slices 400-450 m in height and 7-10 m in width. These slices threaten the settlements around the Loen Valley and the tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. Several rock slides were recorded from Mount Ramnefjell between 1905 and 1950.
Among them, four of the slides generated tsunami waves that washed up to 74 m above the lake level, and two caused many fatalities in the inner part of the Loen Valley as well as great damage. There are three predominant joint structures in Mount Ramnefjell that control failure and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and third joint sets are almost perpendicular and parallel to the mountainside and scarp, respectively. These three joint sets form slices of rock columns with widths of 7-10 m and heights of 400-450 m. The joints in set II are reported to have opened by 1-2 m, which may allow water to collect during heavy rainfall or snowmelt, pressing the slices outward. Water in the vertical joints is estimated both to reduce the shear strength of the sliding plane and to reduce the normal stress on it through the formation of an uplift force. Hence, rock slides on Mount Ramnefjell occur in a plane failure mode. Quantitative evaluation of rock slide risk requires probabilistic analysis of rock slope stability and identification of the consequences should a rock slide occur. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM). In order to use the calculated probability of failure (Pf) in risk analyses, this Pf must be associated with frequency-based probabilities (i.e., Pf per year), since the computed failure probability is a measure of hazard and not a measure of risk unless it is associated with the consequences of failure. This can be done either by considering the time-dependent behavior of the basic variables in the probabilistic models or by associating the computed Pf with the frequency of failures in the region.
In this study, the frequency of rock slides at Ramnefjell in the previous century is used to derive the frequency-based probability used in the risk assessment. The major consequence of a rock slide is the generation of a tsunami in Lake Loen, causing inundation of the residential areas around the lake. Risk is assessed by adapting the damage probability matrix approach, which was originally developed for the risk assessment of buildings in earthquakes.
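The FORM step maps a reliability index β to a failure probability via Pf = Φ(−β), with Φ the standard normal CDF, which can then be associated with a regional slide frequency as the abstract describes. A minimal sketch, with β and the slide frequency assumed purely for illustration:

```python
from math import erf, sqrt

def pf_from_beta(beta):
    """First-order reliability estimate: Pf = Phi(-beta), with Phi the
    standard normal CDF expressed through erf."""
    return 0.5 * (1.0 + erf(-beta / sqrt(2.0)))

# Hypothetical reliability index for one rock slice in plane failure mode
beta = 2.0
pf = pf_from_beta(beta)

# Associate Pf with a frequency-based probability using an assumed regional
# rate of roughly 6 recorded slides per century
annual_rate = 6 / 100.0
pf_per_year = pf * annual_rate
print(round(pf, 5), round(pf_per_year, 6))
```

The annualized Pf, not the raw Φ(−β), is what enters the damage probability matrix, since risk requires a frequency alongside the consequence.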

  14. Life Cost Based FMEA Manual: A Step by Step Guide to Carrying Out a Cost-based Failure Modes and Effects Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhee, Seung; Spencer, Cherrill; /Stanford U. /SLAC

    2009-01-23

Failure occurs when one or more of the intended functions of a product are no longer fulfilled to the customer's satisfaction. The most critical product failures are those that escape design reviews and in-house quality inspection and are found by the customer. The product may work for a while until its performance degrades to an unacceptable level, or it may not have worked even before the customer took possession of it. Failures whose end results lead to unsafe conditions or major losses of the main function are rated high in severity. Failure Modes and Effects Analysis (FMEA) is a tool widely used in the automotive, aerospace, and electronics industries to identify, prioritize, and eliminate known potential failures, problems, and errors from systems under design, before the product is released (Stamatis, 1997). Several industrial FMEA standards, such as those published by the Society of Automotive Engineers, the US Department of Defense, and the Automotive Industry Action Group, employ the Risk Priority Number (RPN) to measure the risk and severity of failures. The RPN is a product of 3 indices: Occurrence (O), Severity (S), and Detection (D). In a traditional FMEA process, design engineers typically analyze the 'root cause' and 'end effects' of potential failures in a subsystem or component and assign penalty points through the O, S, and D values for each failure. The analysis is organized around categories called failure modes, which link the causes and effects of failures. A few actions are taken upon completing the FMEA worksheet. The RPN column generally identifies the high-risk areas. The idea of performing FMEA is to eliminate or reduce known and potential failures before they reach the customers; thus, a plan of action must be in place for the next task. Not all failures can be resolved during the product development cycle, so actions must be prioritized within the design group.
One definition of detection difficulty (D) is how well the organization controls the development process. Another definition relates to the detectability of a particular failure in the product once it is in the hands of the customer. The former asks, 'What is the chance of catching the problem before we give it to the customer?' The latter asks, 'What is the chance of the customer catching the problem before it results in a catastrophic failure?' (Palady, 1995). These differing definitions confuse FMEA users trying to determine detection difficulty. Are we trying to measure how easy it is to detect where a failure has occurred, or when it has occurred? Or are we trying to measure how easy or difficult it is to prevent failures? Ordinal scale variables are used to rank-order services such as hotels, restaurants, and movies (note that a 4-star hotel is not necessarily twice as good as a 2-star hotel). Ordinal values preserve rank within a group of items, but the distance between the values cannot be measured, since no distance function exists. Thus, the product or sum of ordinal variables loses its meaning, since each parameter has a different scale. Because the RPN is a product of 3 independent ordinal variables, it can indicate that some failure types are 'worse' than others, but it gives no quantitative indication of their relative effects. To resolve the ambiguity of measuring detection difficulty and the questionable logic of multiplying 3 ordinal indices, a new methodology, Life Cost-Based FMEA, was created to overcome these shortcomings. Life Cost-Based FMEA measures failure risk in terms of monetary cost. Cost is a universal parameter that engineers and others can easily relate to severity.
Thus, failure cost can be estimated in its simplest form as: Expected Failure Cost = Σ_{i=1}^{n} p_i c_i, where p_i is the probability of a particular failure occurring, c_i is the monetary cost associated with that failure, and n is the total number of failure scenarios. FMEA is most effective when all concerned disciplines of the product development team provide input. However, FMEA is a long process and becomes tedious and ineffective if too many people participate. An ideal team has 3 to 4 people drawn, where possible, from the design, manufacturing, and service departments. Depending on how complex the system is, the entire process can take anywhere from one to four weeks of full-time work. It is therefore important to agree on the time commitment before starting the analysis; otherwise, anxious managers might stop the procedure before it is completed.
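The expected-failure-cost formula contrasts directly with the ordinal RPN. A short sketch, with the scenario probabilities, costs, and O/S/D ratings chosen purely for illustration:

```python
def expected_failure_cost(scenarios):
    """Life Cost-Based FMEA: sum of p_i * c_i over all failure scenarios."""
    return sum(p * c for p, c in scenarios)

def rpn(occurrence, severity, detection):
    """Traditional RPN = O * S * D: an ordinal rank with no cost meaning."""
    return occurrence * severity * detection

# Hypothetical scenarios as (probability per unit, cost in dollars) pairs
scenarios = [(0.01, 5000.0), (0.001, 200000.0), (0.05, 300.0)]
cost = expected_failure_cost(scenarios)
print(cost, rpn(3, 7, 4))
```

The cost figure is in real monetary units and so can be summed and compared across failure types, which is exactly what the product of three ordinal indices cannot legitimately do.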

  15. Rockfall hazard and risk assessments along roads at a regional scale: example in Swiss Alps

    NASA Astrophysics Data System (ADS)

    Michoud, C.; Derron, M.-H.; Horton, P.; Jaboyedoff, M.; Baillifard, F.-J.; Loye, A.; Nicolet, P.; Pedrazzini, A.; Queyrel, A.

    2012-03-01

Unlike fragmental rockfall runout assessments, there are only a few robust methods to quantify rock-mass failure susceptibility at the regional scale. A detailed slope angle analysis of recent digital elevation models (DEMs) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper supplements the Slope Angle Distribution of the cliff unit with its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Taking both susceptibility results into consideration, this approach can be used to establish, after calibration, hazard and risk maps at the regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.

  16. Flexible materials technology

    NASA Technical Reports Server (NTRS)

    Steurer, W. H.

    1980-01-01

A survey of all presently defined or proposed large space systems indicated an ever-increasing demand for flexible components and materials, primarily as a result of the widening disparity between the stowage space of launch vehicles and the size of advanced systems. Typical flexible components and material requirements were identified on the basis of recurrence and/or functional commonality. This was followed by the evaluation of candidate materials and the search for material capabilities that promise to satisfy the postulated requirements. Particular attention was paid to thin films and to the requirements of deployable antennas. The assessment of the performance of specific materials was based primarily on the failure mode, derived from a detailed failure analysis. In view of extensive ongoing work on thermal and environmental degradation effects, prime emphasis was placed on assessing the performance loss caused by meteoroid damage. Quantitative data were generated for tension members and antenna reflector materials. A methodology was developed for representing overall materials performance as related to system service life. A number of promising new concepts for flexible materials were identified.

  17. Failure detection in high-performance clusters and computers using chaotic map computations

    DOEpatents

    Rao, Nageswara S.

    2015-09-01

A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures, and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure, and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
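The detection principle is that identical computations on healthy hardware must produce identical chaotic trajectories, while chaos amplifies even a tiny arithmetic fault until the trajectories visibly diverge. A sketch using the logistic map; the map choice, parameters, and tolerance are illustrative, not the patented implementation:

```python
def logistic_trajectory(x0, steps, r=3.9):
    """Iterate the logistic map x <- r*x*(1-x), chaotic for r near 3.9."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def diverged(traj_a, traj_b, tol=1e-6):
    """Flag a fault when two supposedly identical trajectories disagree."""
    return any(abs(a - b) > tol for a, b in zip(traj_a, traj_b))

healthy = logistic_trajectory(0.3, 50)
replica = logistic_trajectory(0.3, 50)           # identical computation
faulty = logistic_trajectory(0.3 + 1e-12, 50)    # injected arithmetic error

print(diverged(healthy, replica), diverged(healthy, faulty))
```

Sensitive dependence on initial conditions does the heavy lifting here: a perturbation of 10^-12 grows past any practical tolerance within a few dozen iterations.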

  18. Introducing anisotropic Minkowski functionals and quantitative anisotropy measures for local structure analysis in biomedical imaging

    NASA Astrophysics Data System (ADS)

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2013-03-01

The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10^-4). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications.
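The fractional anisotropy measure borrowed from diffusion tensor imaging has a standard closed form over the three eigenvalues of a local tensor. A minimal sketch; the eigenvalue inputs are illustrative, not values from the study:

```python
from math import sqrt

def fractional_anisotropy(l1, l2, l3):
    """FA = sqrt(3/2) * ||lambda - mean|| / ||lambda|| over tensor eigenvalues."""
    mean = (l1 + l2 + l3) / 3.0
    num = sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    den = sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return sqrt(1.5) * num / den

# Perfectly isotropic structure -> FA = 0; a single dominant direction -> FA = 1
print(fractional_anisotropy(1.0, 1.0, 1.0), fractional_anisotropy(1.0, 0.0, 0.0))
```

FA is thus a scale-free number in [0, 1], which is what makes it suitable for comparing anisotropy distributions across anatomical regions.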

  19. Introducing Anisotropic Minkowski Functionals and Quantitative Anisotropy Measures for Local Structure Analysis in Biomedical Imaging

    PubMed Central

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2017-01-01

    The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10−4). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications. PMID:29170580

  20. Fractography: determining the sites of fracture initiation.

    PubMed

    Mecholsky, J J

    1995-03-01

    Fractography is the analysis of fracture surfaces. Here, it refers to quantitative fracture surface analysis (FSA) in the context of applying the principles of fracture mechanics to the topography observed on the fracture surface of brittle materials. The application of FSA is based on the principle that encoded on the fracture surface of brittle materials is the entire history of the fracture process. It is our task to develop the skills and knowledge to decode this information. There are several motivating factors for applying our knowledge of FSA. The first and foremost is that there is specific, quantitative information to be obtained from the fracture surface. This information includes the identification of the size and location of the fracture initiating crack or defect, the stress state at failure, the existence, or not, of local or global residual stress, the existence, or not, of stress corrosion and a knowledge of local processing anomalies which affect the fracture process. The second motivating factor is that the information is free. Once a material is tested to failure, the encoded information becomes available. If we decide to observe the features produced during fracture then we are rewarded with much information. If we decide to ignore the fracture surface, then we are left to guess and/or reason as to the cause of the failure without the benefit of all of the possible information available. This paper addresses the application of quantitative fracture surface analysis to basic research, material and product development, and "trouble-shooting" of in-service failures. First, the basic principles involved will be presented. Next, the methodology necessary to apply the principles will be presented. Finally, a summary of the presentation will be made showing the applicability to design and reliability.
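One quantitative use of FSA is recovering the stress at failure from the size of the fracture-initiating flaw via the standard fracture-mechanics relation sigma_f = K_Ic / (Y * sqrt(c)). The toughness, geometry factor, and flaw size below are illustrative assumptions, not values from the paper:

```python
from math import sqrt

def failure_stress(k_ic, c, y=1.24):
    """Failure stress from fracture mechanics: sigma_f = K_Ic / (Y * sqrt(c)),
    where c is the size of the fracture-initiating flaw."""
    return k_ic / (y * sqrt(c))

# Illustrative values for a glass-like brittle material (assumed): fracture
# toughness K_Ic = 0.75 MPa*sqrt(m), geometry factor Y = 1.24 for a
# semicircular surface flaw, flaw radius c = 50 micrometres.
sigma_f = failure_stress(0.75, 50e-6)
print(round(sigma_f, 1))  # MPa
```

In practice the calculation also runs in reverse: measuring the initiating flaw on the fracture surface and knowing the material's toughness lets the fractographer estimate the stress state at failure.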

  1. Heart Failure Self-care Within the Context of Patient and Informal Caregiver Dyadic Engagement: A Mixed Methods Study.

    PubMed

    Buck, Harleah G; Hupcey, Judith; Wang, Hsiao-Lan; Fradley, Michael; Donovan, Kristine A; Watach, Alexa

    Recent heart failure (HF) patient and informal caregiver (eg, dyadic) studies have either examined self-care from a qualitative or quantitative perspective. To date, the 2 types of data have not been integrated. The aim of this study was to understand HF self-care within the context of dyadic engagement. This was a cross-sectional, mixed methods (quantitative/qualitative) study. Heart failure self-care was measured with the Self-care of Heart Failure Index (v.6) dichotomized to adequate (≥70) or inadequate (<69). Dyadic symptom management type was assessed with the Dyadic Symptom Management Type scale. Interviews regarding self-care were conducted with both dyad members present. Content analytic techniques were used. Data were integrated using an information matrix and triangulated using Creswell and Plano Clark's methods. Of the 27 dyads, HF participants were 56% men, with a mean age of 77 years. Caregivers were 74% women, with a mean age of 66 years, representing spouses (n = 14) and adult children (n = 7). Quantitatively, few dyads scored as adequate (≥70) in self-care; the qualitative data described the impact of adequacy on the dyads' behavior. Dyads who scored higher, individually or both, on self-care self-efficacy and self-care management were less likely to change from their life course pattern. Either the patient or dyad continued to handle all self-care as they always had, rather than trying new strategies or reaching out for help as the patient's condition deteriorated. Our data suggest links that should be explored between dyadic adequacy and response to patients' symptoms. Future studies should assess dyadic adequacy longitudinally and examine its relationship to event-free survival and health services cost.

  2. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for the mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides at contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslides at contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities of failure are low; however, if a landslide occurs, the probabilities of exceeding EQSs are high, and the probability of at least a 10% increase in the contaminant load within one year is also high. Copyright © 2013 Elsevier B.V. All rights reserved.
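The failure logic described above, where overall risk is governed primarily by the landslide probability, can be sketched as a simple chain of conditional probabilities. This is a hypothetical illustration, not the paper's actual model, and all numbers are invented:

```python
# Hedged sketch: overall probability of an environmental failure as the
# product of a landslide probability and a conditional exceedance
# probability. All numbers are invented for illustration.

def p_failure(p_landslide: float, p_exceed_eqs_given_slide: float) -> float:
    """Annual probability that a landslide occurs AND the environmental
    quality standard (EQS) is exceeded as a consequence."""
    return p_landslide * p_exceed_eqs_given_slide

# A low landslide probability dominates the overall risk even when the
# conditional probability of exceeding the EQS after a slide is high.
p_slide = 1e-3   # assumed annual landslide probability
p_exceed = 0.9   # assumed P(exceed EQS | landslide)
overall = p_failure(p_slide, p_exceed)
print(round(overall, 6))  # 0.0009
```

The example mirrors the case-study finding: a high conditional probability of exceeding the EQS still yields a low overall probability when the triggering landslide is rare.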

  3. Cardiac systolic dysfunction in doxorubicin-challenged rats is associated with upregulation of MuRF2 and MuRF3 E3 ligases

    PubMed Central

    da Silva, Marcia Gracindo; Mattos, Elisabete; Camacho-Pereira, Juliana; Domitrovic, Tatiana; Galina, Antonio; Costa, Mauro W; Kurtenbach, Eleonora

    2012-01-01

    Doxorubicin (DOXO) is an efficient and low-cost chemotherapeutic agent. The use of DOXO is limited by its side effects, including cardiotoxicity, that may progress to cardiac failure as a result of multifactorial events that have not yet been fully elucidated. In the present study, the effects of DOXO at two different doses were analyzed to identify early functional and molecular markers of cardiac distress. One group of rats received 7.5 mg/kg of DOXO (low-dose group) and was followed for 20 weeks. A subset of these animals was then subjected to an additional cycle of DOXO treatment, generating a cumulative dose of 20 mg/kg (high-dose group). Physiological and biochemical parameters were assessed in both treatment groups and in a control group that received saline. Systolic dysfunction was observed only in the high-dose group. Mitochondrial function analysis showed a clear reduction in oxidative cellular respiration for animals in both DOXO treatment groups, with evidence of complex I damage being observed. Transcriptional analysis by quantitative polymerase chain reaction revealed an increase in atrial natriuretic peptide transcript in the high-dose group, which is consistent with cardiac failure. Analysis of transcription levels of key components of the cardiac ubiquitin-proteasome system found that the ubiquitin E3 ligase muscle ring finger 1 (MuRF1) was upregulated in both the low- and high-dose DOXO groups. MuRF2 and MuRF3 were also upregulated in the high-dose group but not in the low-dose group. This molecular profile may be useful as an early physiological and energetic cardiac failure indicator for testing therapeutic interventions in animal models. PMID:23620696
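The abstract does not state its quantification method, but relative transcript levels from quantitative PCR of the kind reported here (e.g., for MuRF1-3 or atrial natriuretic peptide) are commonly expressed as fold changes via the standard 2^(-ΔΔCt) calculation. A minimal sketch with invented Ct values, offered only as a generic illustration:

```python
# Standard 2^(-ddCt) relative-quantification sketch (Livak method).
# Ct values below are invented; the paper's raw data are not reproduced.

def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# A target amplifying 2 cycles earlier (relative to the reference gene)
# in treated animals corresponds to a 4-fold upregulation.
print(fold_change(22.0, 18.0, 24.0, 18.0))  # 4.0
```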

  4. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection, and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted using this probabilistic modification. With this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
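The contrast between the two scorings can be sketched as follows: traditional FMEA multiplies three categorical 1-10 scores into an RPN, while the probabilistic variant estimates the frequency of an undetected failure as the product of an occurrence frequency and a non-detection frequency, keeping only severity categorical. A minimal sketch with invented numbers, not the paper's actual data:

```python
# Sketch of traditional vs. probabilistic FMEA scoring.
# Scores and frequencies are invented for illustration.

def rpn(occurrence: int, detection: int, severity: int) -> int:
    """Traditional FMEA: Risk Priority Number from categorical 1-10 scores."""
    return occurrence * detection * severity

def undetected_frequency(f_occurrence: float, f_nondetection: float) -> float:
    """Probabilistic variant: estimated relative frequency of a failure mode
    occurring AND escaping detection (severity stays categorical)."""
    return f_occurrence * f_nondetection

# One failure mode of a hypothetical NIR screening procedure:
print(rpn(4, 6, 8))                                 # 192
print(round(undetected_frequency(0.02, 0.1), 6))    # 0.002

# Aggregate over a set of failure modes for the full procedure:
modes = [(0.02, 0.1), (0.005, 0.5), (0.001, 0.9)]
total = sum(undetected_frequency(f, nd) for f, nd in modes)
print(round(total, 6))                              # 0.0054
```

Unlike the dimensionless RPN, the probabilistic estimate has a direct frequency interpretation and can be summed over failure modes, which is the point of the proposed modification.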

  5. Fault tree applications within the safety program of Idaho Nuclear Corporation

    NASA Technical Reports Server (NTRS)

    Vesely, W. E.

    1971-01-01

Computerized fault tree analyses are used to obtain both qualitative and quantitative information about the safety and reliability of an electrical control system that shuts the reactor down when certain safety criteria are exceeded, in the design of a nuclear plant protection system, and in an investigation of a backup emergency system for reactor shutdown. The fault tree yields the modes by which a system failure or accident will occur, the most critical failure- or accident-causing areas, detailed failure probabilities, and the response of safety or reliability to design modifications and maintenance schemes.
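The quantitative side of such an analysis can be illustrated with the usual minimal-cut-set calculation: assuming independent basic events, each cut set fails only if all of its events occur, and the top-event probability follows from the union of the cut sets. A hedged sketch; the event names and probabilities are invented:

```python
from math import prod

# Minimal-cut-set evaluation sketch for a fault tree with independent
# basic events. Cut sets and probabilities are invented for illustration.

def cut_set_probability(cut_set, p):
    """A minimal cut set fails only if every basic event in it occurs."""
    return prod(p[e] for e in cut_set)

def top_event_probability(cut_sets, p):
    """Exact evaluation would need inclusion-exclusion over cut sets;
    this uses the common independence-based approximation
    P(top) ~= 1 - prod(1 - P(cut set))."""
    return 1.0 - prod(1.0 - cut_set_probability(cs, p) for cs in cut_sets)

p = {"relay_stuck": 1e-3, "sensor_drift": 5e-4, "power_loss": 1e-4}
cut_sets = [("relay_stuck", "sensor_drift"), ("power_loss",)]
print(f"{top_event_probability(cut_sets, p):.3e}")
```

The single-event cut set ("power_loss",) dominates the result, which is exactly the kind of "most critical area" a fault tree is meant to expose.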

  6. The VHCF experimental investigation of FV520B-I with surface roughness Ry

    NASA Astrophysics Data System (ADS)

    Wang, J. L.; Zhang, Y. L.; Ding, M. C.; Zhao, Q. C.

    2018-05-01

Different surface roughness types (Ra and Ry) have different effects on VHCF failure and life. Ra is widely employed as the quantitative expression of surface roughness, but there are few fatigue failure mechanism analyses and experimental studies under surface roughness Ry. A VHCF experiment was carried out using specimens with different surface roughness values. Surface roughness Ry was employed as the major research object to investigate the relationship and distribution tendency among Ry, fatigue life, and the distance between internal inclusions and the surface, and a new VHCF failure characteristic is proposed.

  7. Analytical Method to Evaluate Failure Potential During High-Risk Component Development

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Communicating failure mode information during design and manufacturing is a crucial task for failure prevention. Most processes use Failure Modes and Effects types of analyses, as well as prior knowledge and experience, to determine the potential modes of failures a product might encounter during its lifetime. When new products are being considered and designed, this knowledge and information is expanded upon to help designers extrapolate based on their similarity with existing products and the potential design tradeoffs. This paper makes use of similarities and tradeoffs that exist between different failure modes based on the functionality of each component/product. In this light, a function-failure method is developed to help the design of new products with solutions for functions that eliminate or reduce the potential of a failure mode. The method is applied to a simplified rotating machinery example in this paper, and is proposed as a means to account for helicopter failure modes during design and production, addressing stringent safety and performance requirements for NASA applications.
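The core of a function-failure mapping can be sketched with two matrices: one linking components to the functions they carry, and one linking components to the failure modes they have historically exhibited; their product relates functions to likely failure modes. This is an illustrative sketch under assumed data, not the paper's actual matrices or taxonomy:

```python
# Function-failure mapping sketch: relate product functions to failure
# modes through shared components. All data are invented for illustration.

components = ["shaft", "bearing"]
functions = ["transmit torque", "support load"]
failure_modes = ["fatigue", "wear"]

# EC[i][j] = 1 if component i carries function j (assumed)
EC = [[1, 0],
      [0, 1]]
# EF[i][k] = number of recorded failures of component i in mode k (assumed)
EF = [[3, 0],
      [1, 2]]

# FF = EC^T * EF: failure counts accumulated per function, so a designer
# choosing a solution for a function sees which failure modes to guard against.
FF = [[sum(EC[i][j] * EF[i][k] for i in range(len(components)))
       for k in range(len(failure_modes))]
      for j in range(len(functions))]

for f, row in zip(functions, FF):
    print(f, row)
# "transmit torque" inherits the shaft's fatigue history;
# "support load" inherits the bearing's wear history.
```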

  8. Failure Mode Identification Through Clustering Analysis

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Research has shown that nearly 80% of the costs and problems are created in product development and that cost and quality are essentially designed into products in the conceptual stage. Currently, failure identification procedures (such as FMEA (Failure Modes and Effects Analysis), FMECA (Failure Modes, Effects and Criticality Analysis) and FTA (Fault Tree Analysis)) and design of experiments are being used for quality control and for the detection of potential failure modes during the detail design stage or post-product launch. Though all of these methods have their own advantages, they do not give information as to what are the predominant failures that a designer should focus on while designing a product. This work uses a functional approach to identify failure modes, which hypothesizes that similarities exist between different failure modes based on the functionality of the product/component. In this paper, a statistical clustering procedure is proposed to retrieve information on the set of predominant failures that a function experiences. The various stages of the methodology are illustrated using a hypothetical design example.
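A clustering step of this kind can be sketched by representing each function as a set of failure modes observed for the components that carry it, then grouping functions with similar failure profiles. A crude stdlib-only single-link grouping with invented data, not the paper's actual statistical procedure:

```python
# Group functions by similarity of their failure-mode profiles.
# Crude single-link clustering with a Jaccard-similarity threshold;
# all profiles are invented for illustration.

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 1.0

# function -> failure modes seen for components carrying it (assumed data)
profiles = {
    "transmit torque": {"fatigue", "fracture"},
    "support load":    {"fatigue", "wear"},
    "seal fluid":      {"leak", "corrosion"},
    "contain fluid":   {"leak"},
}

def cluster(profiles, threshold=0.3):
    clusters = []
    for name, modes in profiles.items():
        for c in clusters:
            # join the first cluster containing a similar-enough member
            if any(jaccard(modes, profiles[m]) >= threshold for m in c):
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

print(cluster(profiles))  # functions grouped by shared predominant failures
```

Each resulting group suggests a set of predominant failures a designer should focus on when providing that family of functions.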

  9. Fatigue crack growth in an aluminum alloy-fractographic study

    NASA Astrophysics Data System (ADS)

    Salam, I.; Muhammad, W.; Ejaz, N.

    2016-08-01

A two-fold approach was adopted to understand the fatigue crack growth process in an aluminum alloy: fatigue crack growth testing of samples and analysis of the fractured surfaces. Fatigue crack growth tests were conducted on middle tension M(T) samples prepared from an aluminum alloy cylinder. The tests were conducted under constant amplitude loading at an R ratio of 0.1, at applied stresses of 20, 30, and 40 percent of the yield stress of the material. The fatigue crack growth data were recorded. After fatigue testing, the samples were subjected to detailed scanning electron microscopic (SEM) analysis. The resulting fracture surfaces were subjected to qualitative and quantitative fractographic examination. Quantitative fracture analysis included an estimation of the crack growth rate (CGR) in different regions. The effect of microstructural features on fatigue crack growth was examined. It was observed that in stage II (the crack growth region), the failure mode changes from intergranular to transgranular as the stress level increases. In the region of intergranular failure, localized brittle failure was observed and fatigue striations are difficult to reveal. In the region of transgranular failure, however, the crack path is independent of the microstructural features; localized ductile failure was observed and well-defined fatigue striations were present in the wake of the fatigue crack. The effect of the interaction of the growing fatigue crack with microstructural features was not substantial. The final fracture (stage III) was ductile in all cases.
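The abstract does not give its crack-growth model, but stage II growth rates of the kind estimated here are conventionally summarized with the Paris law, da/dN = C(ΔK)^m. A generic sketch under that assumption; the constants C and m are invented, not the alloy's measured values:

```python
# Paris-law sketch for stage II fatigue crack growth: da/dN = C * (dK)^m.
# C and m are invented placeholders; in practice they are fitted to the
# measured crack-growth data for the specific alloy.

def crack_growth_rate(delta_k: float, C: float = 1e-11, m: float = 3.0) -> float:
    """Crack growth per cycle (m/cycle) from the stress-intensity range
    delta_k (MPa*sqrt(m))."""
    return C * delta_k ** m

# For m = 3, doubling the stress-intensity range raises da/dN eightfold,
# which is why raising the applied stress fraction accelerates growth so sharply.
for dk in (5.0, 10.0, 20.0):
    print(dk, crack_growth_rate(dk))
```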

  10. PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.

    2009-12-01

The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented in which we assess the risks associated with the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that decomposes the system into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g., the cover of the piles) and the failure of the geological barrier (e.g., the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.
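The decomposition described, engineered cover, geological barrier, and three receptors, can be sketched as a small event-chain evaluation combining conditional probabilities. All numbers are invented and the independence assumptions are for the sketch only, not the study's values:

```python
from math import prod

# PRA sketch: system failure = contamination of at least one receptor.
# Each receptor is reached only if the engineered cover fails AND the
# relevant barrier then fails. All probabilities are invented.

def receptor_probability(p_cover_fails: float,
                         p_barrier_fails_given_cover: float) -> float:
    """Chain of conditional events leading to contamination of one receptor."""
    return p_cover_fails * p_barrier_fails_given_cover

receptors = {
    "karst aquifer":        receptor_probability(0.05, 0.20),
    "surface stream":       receptor_probability(0.05, 0.10),
    "well protection area": receptor_probability(0.05, 0.02),
}

# System fails if ANY receptor is contaminated (receptors treated as
# independent here purely to keep the sketch simple).
p_system = 1.0 - prod(1.0 - p for p in receptors.values())
print(round(p_system, 6))
```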

  11. Renal function monitoring in heart failure - what is the optimal frequency? A narrative review.

    PubMed

    Al-Naher, Ahmed; Wright, David; Devonald, Mark Alexander John; Pirmohamed, Munir

    2018-01-01

The second most common cause of hospitalization due to adverse drug reactions in the UK is renal dysfunction due to diuretics, particularly in patients with heart failure, where diuretic therapy is a mainstay of treatment regimens. The optimal frequency for monitoring renal function in these patients is therefore an important consideration for preventing renal failure and hospitalization. This review examines the current evidence for optimal monitoring practices of renal function in patients with heart failure according to national and international guidelines on the management of heart failure (AHA/NICE/ESC/SIGN). Current guidance on renal function monitoring is largely based on expert opinion, with a lack of clinical studies that have specifically evaluated the optimal frequency of renal function monitoring in patients with heart failure. Furthermore, there is variability between guidelines, and recommendations are typically nonspecific. Safer prescribing of diuretics in combination with other anti-heart-failure treatments requires better evidence on the frequency of renal function monitoring. We suggest developing more personalized monitoring rather than relying on the current medication-based guidance. Such flexible clinical guidelines could be implemented using intelligent clinical decision support systems. Personalized renal function monitoring would be more effective in preventing renal decline than reacting to it. © 2017 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.

  12. Markov and semi-Markov processes as a failure rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabski, Franciszek

    2016-06-08

In this paper the reliability function is defined by a stochastic failure rate process with nonnegative and right-continuous trajectories. Equations for the conditional reliability functions of an object are derived under the assumption that the failure rate is a semi-Markov process with an at most countable state space, and a corresponding theorem is presented. Linear systems of equations for the appropriate Laplace transforms allow the reliability functions to be found for the alternating, Poisson, and Furry-Yule failure rate processes.
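In this formulation the reliability function is the expectation of an exponential functional of the random failure-rate process; a standard statement of the definition (the notation is assumed, since the abstract gives none):

```latex
% Reliability with a stochastic failure-rate process \{\lambda(t) : t \ge 0\}:
R(t) = \mathbb{E}\!\left[\exp\!\left(-\int_0^t \lambda(s)\,ds\right)\right],
\qquad t \ge 0,
% and, conditioning on the initial state of the semi-Markov rate process,
R_i(t) = \mathbb{E}\!\left[\exp\!\left(-\int_0^t \lambda(s)\,ds\right)
\,\middle|\, \lambda(0) = \lambda_i\right].
```

The systems of equations mentioned in the abstract couple the conditional functions R_i(t) across the (at most countable) states of the rate process, which is why Laplace transforms turn them into solvable linear systems.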

  13. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
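The statistical step described, modifying an analysis-based failure-probability distribution to reflect test or flight experience, can be illustrated generically with a conjugate Beta-Binomial update. This is an assumption chosen for illustration, not the PFA procedure itself, and all numbers are invented:

```python
# Generic Bayesian-update sketch: a prior belief about a failure probability
# (standing in for an analysis-based distribution) updated with observed
# operating experience. This illustrates the idea of combining engineering
# analysis with test/flight data; it is NOT the PFA procedure itself.

def beta_update(alpha: float, beta: float, failures: int, successes: int):
    """Conjugate update of a Beta(alpha, beta) prior with Binomial data."""
    return alpha + failures, beta + successes

def beta_mean(alpha: float, beta: float) -> float:
    return alpha / (alpha + beta)

# Prior from (hypothetical) engineering analysis: mean failure probability 1%.
a0, b0 = 1.0, 99.0
# Operating experience: 0 failures in 50 demonstrations.
a1, b1 = beta_update(a0, b0, failures=0, successes=50)

print(round(beta_mean(a0, b0), 4))  # 0.01
print(round(beta_mean(a1, b1), 4))  # failure-free experience pulls it down
```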

  14. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  15. Systolic versus diastolic heart failure in community practice: clinical features, outcomes, and the use of angiotensin-converting enzyme inhibitors.

    PubMed

    Philbin, E F; Rocco, T A; Lindenmuth, N W; Ulrich, K; Jenkins, P L

    2000-12-01

Among patients with heart failure, there is controversy about whether there are clinical features and laboratory tests that can differentiate patients who have low ejection fractions from those with normal ejection fractions. The usefulness of angiotensin-converting enzyme (ACE) inhibitors among heart failure patients who have normal left ventricular ejection fractions is also not known. From a registry of 2,906 unselected consecutive patients with heart failure who were admitted to 10 acute-care community hospitals during 1995 and 1997, we identified 1,291 who had a quantitative measurement of their left ventricular ejection fraction. Patients were separated into three groups based on ejection fraction: < or =0.39 (n = 741, 57%), 0.40 to 0.49 (n = 238, 18%), and > or =0.50 (n = 312, 24%). In-hospital mortality, prescription of ACE inhibitors at discharge, subsequent rehospitalization, quality of life, and survival were measured; survivors were observed for at least 6 months after hospitalization. The mean (+/- SD) age of the sample was 75 +/- 11 years; the majority (55%) of patients were women. In multivariate models, age >75 years, female sex, weight >72.7 kg, and a valvular etiology for heart failure were associated with an increased probability of having an ejection fraction > or =0.50; a prior history of heart failure, an ischemic or idiopathic cause of heart failure, and radiographic cardiomegaly were associated with a lower probability of having an ejection fraction > or =0.50. Total mortality was lower in patients with an ejection fraction > or =0.50 than in those with an ejection fraction < or =0.39 (odds ratio [OR] = 0.69, 95% confidence interval [CI] 0.49 to 0.98, P = 0.04). Among hospital survivors with an ejection fraction of 0.40 to 0.49, the 65% who were prescribed ACE inhibitors at discharge had better mean adjusted quality-of-life scores (7.0 versus 6.2, P = 0.02) and lower adjusted mortality (OR = 0.34, 95% CI 0.17 to 0.70, P = 0.01) during follow-up than those who were not prescribed ACE inhibitors. Among hospital survivors with an ejection fraction > or =0.50, the 45% who were prescribed ACE inhibitors at discharge had a better (lower) adjusted New York Heart Association (NYHA) functional class (2.1 versus 2.4, P = 0.04), although there was no significant improvement in survival. Among patients treated for heart failure in community hospitals, 42% of those whose ejection fraction was measured had relatively normal systolic function (ejection fraction > or =0.40). The clinical characteristics and mortality of these patients differed from those of patients with low ejection fractions. Among the patients with ejection fractions > or =0.40, the prescription of ACE inhibitors at discharge was associated with favorable effects.

  16. Reducing the Risk of Human Space Missions with INTEGRITY

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merill, Robin L.; Tri, Terry O.; Henninger, Donald L.

    2003-01-01

    The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology to decompose the system into subsystems and components, to quantify the failure risk as a function of the design elements and their corresponding probability of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.

  17. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature `correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
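The `correlation' feature used here is a standard gray-level co-occurrence matrix (GLCM) statistic. A minimal NumPy sketch of a GLCM at one offset and its Haralick correlation feature, run on a toy image; this is a generic implementation for illustration, not the study's MDCT pipeline:

```python
import numpy as np

# Minimal GLCM 'correlation' feature on a toy image. One offset only:
# co-occurrence of gray levels 1 pixel to the right. Generic sketch,
# not the study's pipeline.

def glcm(image: np.ndarray, levels: int) -> np.ndarray:
    """Normalized co-occurrence counts for horizontally adjacent pixel pairs."""
    P = np.zeros((levels, levels))
    for i, j in zip(image[:, :-1].ravel(), image[:, 1:].ravel()):
        P[i, j] += 1
    return P / P.sum()

def glcm_correlation(P: np.ndarray) -> float:
    """Haralick correlation: normalized covariance of the pair distribution."""
    n = P.shape[0]
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    return float(((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j))

rng = np.random.default_rng(0)
toy = rng.integers(0, 8, size=(32, 32))   # stand-in for a BMD-valued ROI
c = glcm_correlation(glcm(toy, levels=8))
print(-1.0 <= c <= 1.0)                   # True: the feature is bounded
```

In practice a library routine (e.g., scikit-image's graycomatrix/graycoprops) would replace the hand-rolled loop; the sketch only shows what the feature measures.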

  18. Kinematic Characterization of Left Ventricular Chamber Stiffness and Relaxation

    NASA Astrophysics Data System (ADS)

    Mossahebi, Sina

Heart failure is the most common cause of hospitalization today, and diastolic heart failure accounts for 40-50% of cases. It is therefore critical to identify diastolic dysfunction at a subclinical stage so that appropriate therapy can be administered before ventricular function is further, and perhaps irreversibly, impaired. Basic concepts in physics such as kinematic modeling provide a unique method with which to characterize cardiovascular physiology, specifically diastolic function (DF). The advantage of an approach that is standard in physics, such as kinematic modeling, is its causal formulation, in contrast to the correlative approaches traditionally utilized in the life sciences. Our research group has pioneered theoretical and experimental quantitative analysis of DF in humans, using both non-invasive (echocardiography, cardiac MRI) and invasive (simultaneous catheterization-echocardiography) methods. Our group developed and validated the Parametrized Diastolic Filling (PDF) formalism, which is motivated by basic physiologic principles (the LV is a mechanical suction pump at mitral valve opening) that obey Newton's laws. The PDF formalism is a kinematic model of filling employing an equation of motion whose solution accurately predicts all E-wave contours in accordance with the rules of damped harmonic oscillatory motion. The equation's lumped parameters---ventricular stiffness, ventricular viscoelasticity/relaxation, and ventricular load---are obtained by solving the 'inverse problem'. The parameters' physiologic significance and clinical utility have been repeatedly demonstrated in multiple clinical settings. In this work we apply our kinematic modeling approach to better understand how the heart works as it fills, in order to advance the relationship between physiology and mathematical modeling. Through the use of this modeling, we define and validate novel, causal indexes of diastolic function such as early rapid filling energy, diastatic stiffness, and the relaxation and stiffness components of E-wave deceleration time.
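The equation of motion referred to above is that of a damped harmonic oscillator. In the notation conventionally used for the formalism (the symbols are assumed here, since the abstract does not define them), filling is governed by:

```latex
% PDF formalism: E-wave filling as damped harmonic motion of an effective
% oscillator displacement x(t), with damping c (viscoelasticity/relaxation),
% stiffness k, and initial load x_0:
\frac{d^2x}{dt^2} + c\,\frac{dx}{dt} + k\,x = 0,
\qquad x(0) = x_0, \quad \frac{dx}{dt}(0) = 0,
% the transmitral E-wave velocity contour being proportional to dx/dt.
```

Fitting this solution to a measured E-wave contour is the `inverse problem' the abstract mentions: it recovers the lumped parameters c, k, and x_0 for each beat.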

  19. Resounding failure to replicate links between developmental language disorder and cerebral lateralisation

    PubMed Central

    Bishop, Dorothy V.M.

    2018-01-01

    Background It has been suggested that failure to establish cerebral lateralisation may be related to developmental language disorder (DLD). There has been weak support for any link with handedness, but more consistent reports of associations with functional brain lateralisation for language. The consistency of lateralisation across different functions may also be important. We aimed to replicate previous findings of an association between DLD and reduced laterality on a quantitative measure of hand preference (reaching across the midline) and on language laterality assessed using functional transcranial Doppler ultrasound (fTCD). Methods From a sample of twin children aged from 6;0 to 11;11 years, we identified 107 cases of DLD and 156 typically-developing comparison cases for whom we had useable data from fTCD yielding a laterality index (LI) for language function during an animation description task. Handedness data were also available for these children. Results Indices of handedness and language laterality for this twin sample were similar to those previously reported for single-born children. There were no differences between the DLD and TD groups on measures of handedness or language lateralisation, or on a categorical measure of consistency of left hemisphere dominance. Contrary to prediction, there was a greater incidence of right lateralisation for language in the TD group (19.90%) than the DLD group (9.30%), confirming that atypical laterality is not inconsistent with typical language development. We also failed to replicate associations between language laterality and language test scores. Discussion and Conclusions Given the large sample studied here and the range of measures, we suggest that previous reports of atypical manual or language lateralisation in DLD may have been false positives. PMID:29333343

  20. JBP485 improves gentamicin-induced acute renal failure by regulating the expression and function of Oat1 and Oat3 in rats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Xinjin; Meng, Qiang; Liu, Qi

    2013-09-01

We investigated the effects of JBP485 (an anti-inflammatory dipeptide and a substrate of OAT) on regulation of the expression and function of renal Oat1 and Oat3, which can accelerate the excretion of accumulated uremic toxins (e.g. indoxyl sulfate) in the kidney to improve gentamicin-induced ARF in rats. JBP485 caused a significant decrease in the accumulation of endogenous substances (creatinine, blood urea nitrogen and indoxyl sulfate) in vivo, an increase in the excretion of exogenous compounds (lisinopril and inulin) into urine, and up-regulation of the expressions of renal Oat1 and Oat3 in the kidney tissues and slices via substrate induction. To determine the effect of JBP485 on the accelerated excretion of uremic toxins mediated by Oat1 and Oat3, the mRNA and protein expression levels of renal basolateral Oats were assessed by quantitative real-time PCR, western blot, immunohistochemical analysis and an immunofluorescence method. Gentamicin down-regulated the expression of Oats mRNA and protein in rat kidney, and these effects were reversed after administration of JBP485. In addition, JBP485 caused a significant decrease in MPO and MDA levels in the kidney, and improved the pathological condition of rat kidney. These results indicated that JBP485 improved acute renal failure by increasing the expression and function of Oat1 and Oat3, and by decreasing overoxidation of the kidney in gentamicin-induced ARF rats. - Highlights: • JBP485 could up-regulate function and expression of Oat1 and Oat3 in kidney. • Effects of JBP485 on ARF are mediated by stimulating excretion of uremic toxins. • JBP485 protected against gentamicin-induced ARF by decreasing MPO and MDA.

  1. Comment on ``The application of the thermodynamic perturbation theory to study the hydrophobic hydration'' [J. Chem. Phys. 139, 024101 (2013)]

    NASA Astrophysics Data System (ADS)

    Graziano, Giuseppe

    2013-09-01

    It is shown that the behaviour of the hydration thermodynamic functions obtained in the 3D Mercedes-Benz model of water by Mohoric et al. [J. Chem. Phys. 139, 024101 (2013)] is not qualitatively correct with respect to experimental data for a solute whose diameter is 1.5-fold larger than that of a water molecule. It is also pointed out that the failure is due to the fact that the used 3D Mercedes-Benz model of water [A. Bizjak, T. Urbic, V. Vlachy, and K. A. Dill, J. Chem. Phys. 131, 194504 (2009)] does not reproduce in a quantitatively correct manner the peculiar temperature dependence of water density.

  2. [Effects of Fluoxetine on Nogo Expression and Collagen Production with Decrease of Pulmonary Artery Pressure in Rats with Right Ventricular Failure].

    PubMed

    Ran, Xun; Zhao, Jian-Xun; Nie, Hu; Chen, Yu-Cheng

    2016-11-01

    To investigate the effect of fluoxetine on neurite growth inhibitor (Nogo) expression and collagen production of cardiac tissue in rats with right heart failure and pulmonary hypertension. Thirty-one male SD rats were randomly divided into the treatment group, the right heart failure group and the normal control group. The rats in the treatment group and the right heart failure group received an intraperitoneal injection of monocrotaline (MCT, 60 mg/kg) to induce pulmonary hypertension and right heart failure. After 21 days, the rats in the treatment group were given fluoxetine at 10 mg/(kg·d) by gavage for 21 days, while the rats in the other two groups were given saline. HE staining was used to observe the pulmonary artery and right ventricular myocardial tissue. Collagen formation in the right ventricular myocardium was observed by Masson staining. The expression of Nogo-A, Nogo-B, type 1 collagen and type 3 collagen mRNA in myocardium was measured by real-time fluorescent quantitative PCR, while the Nogo protein level was measured semi-quantitatively by Western blot. After the intervention with fluoxetine, pulmonary artery stenosis was significantly reduced, myocardial tissue lesions decreased, and collagen synthesis in the right ventricular myocardium decreased. RT-PCR showed that Nogo-A mRNA decreased and Nogo-B mRNA increased (P < 0.05). Western blot showed that Nogo-A protein expression decreased while Nogo-B1 protein expression increased (P < 0.05); Nogo-B2 expression was not significantly changed (P > 0.05). Nogo may affect collagen synthesis in right heart failure and may be partly involved in myocardial fibrosis.

  3. Effects of self-management intervention on health outcomes of patients with heart failure: a systematic review of randomized controlled trials

    PubMed Central

    Jovicic, Aleksandra; Holroyd-Leduc, Jayna M; Straus, Sharon E

    2006-01-01

    Background Heart failure is the most common cause of hospitalization among adults over 65. Over 60% of patients die within 10 years of first onset of symptoms. The objective of this study is to determine the effectiveness of self-management interventions on hospital readmission rates, mortality, and health-related quality of life in patients diagnosed with heart failure. Methods The study is a systematic review of randomized controlled trials. The following data sources were used: MEDLINE (1966-11/2005), EMBASE (1980-11/2005), CINAHL (1982-11/2005), the ACP Journal Club database (to 11/2005), the Cochrane Central Trial Registry and the Cochrane Database of Systematic Reviews (to 11/2005); article reference lists; and experts in the field. We included randomized controlled trials of self-management interventions that enrolled patients 18 years of age or older who were diagnosed with heart failure. The primary outcomes of interest were all-cause hospital readmissions, hospital readmissions due to heart failure, and mortality. Secondary outcomes were compliance with treatment and quality of life scores. Three reviewers independently assessed the quality of each study and abstracted the results. For each included study, we computed the pooled odds ratios (OR) for all-cause hospital readmission, hospital readmission due to heart failure, and death. We used a fixed effects model to quantitatively synthesize results. We were not able to pool effects on health-related quality of life and measures of compliance with treatment, but we summarized the findings from the relevant studies. We also summarized the reported cost savings. Results From 671 citations that were identified, 6 randomized trials with 857 patients were included in the review. Self-management decreased all-cause hospital readmissions (OR 0.59; 95% confidence interval (CI) 0.44 to 0.80, P = 0.001) and heart failure readmissions (OR 0.44; 95% CI 0.27 to 0.71, P = 0.001). 
The effect on mortality was not significant (OR = 0.93; 95% CI 0.57 to 1.51, P = 0.76). Adherence to prescribed medical advice improved, but there was no significant difference in functional capabilities, symptom status and quality of life. The reported savings ranged from $1300 to $7515 per patient per year. Conclusion Self-management programs targeted for patients with heart failure decrease overall hospital readmissions and readmissions for heart failure. PMID:17081306
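A fixed-effects synthesis of the kind described above is typically an inverse-variance average of study log-odds ratios, with each study's standard error recovered from its confidence-interval width. A minimal sketch with made-up study data (not the trials included in this review):

```python
import math

def pooled_or_fixed(ors, cis):
    """Inverse-variance fixed-effects pooling of odds ratios.
    Each CI is (lower, upper) at the 95% level; the SE is
    recovered from the CI width on the log scale."""
    num = den = 0.0
    for or_, (lo, hi) in zip(ors, cis):
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # inverse-variance weight
        num += w * log_or
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical studies, each favouring the intervention
or_hat, lo, hi = pooled_or_fixed(
    [0.55, 0.70, 0.50],
    [(0.35, 0.86), (0.45, 1.09), (0.28, 0.90)])
print(round(or_hat, 2), round(lo, 2), round(hi, 2))
```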

  4. The second Sandia Fracture Challenge. Predictions of ductile failure under quasi-static and moderate-rate dynamic loading

    DOE PAGES

    Boyce, B. L.; Kramer, S. L. B.; Bosiljevac, T. R.; ...

    2016-03-14

    Ductile failure of structural metals is relevant to a wide range of engineering scenarios. Computational methods are employed to anticipate the critical conditions of failure, yet they sometimes provide inaccurate and misleading predictions. Challenge scenarios, such as the one presented in the current work, provide an opportunity to assess the blind, quantitative predictive ability of simulation methods against a previously unseen failure problem. Instead of evaluating the predictions of a single simulation approach, the Sandia Fracture Challenge relied on numerous volunteer teams with expertise in computational mechanics to apply a broad range of computational methods, numerical algorithms, and constitutive models to the challenge. This exercise is intended to evaluate the state of health of technologies available for failure prediction. In the first Sandia Fracture Challenge, a wide range of issues were raised in ductile failure modeling, including a lack of consistency in failure models, the importance of shear calibration data, and difficulties in quantifying the uncertainty of prediction [see Boyce et al. (Int J Fract 186:5–68, 2014) for details of these observations]. This second Sandia Fracture Challenge investigated the ductile rupture of a Ti–6Al–4V sheet under both quasi-static and modest-rate dynamic loading (failure in ~ 0.1 s). Like the previous challenge, the sheet had an unusual arrangement of notches and holes that added geometric complexity and fostered a competition between tensile- and shear-dominated failure modes. The teams were asked to predict the fracture path and quantitative far-field failure metrics such as the peak force and displacement to cause crack initiation. Fourteen teams contributed blind predictions, and the experimental outcomes were quantified in three independent test labs. 
In addition, shortcomings were revealed in this second challenge such as inconsistency in the application of appropriate boundary conditions, need for a thermomechanical treatment of the heat generation in the dynamic loading condition, and further difficulties in model calibration based on limited real-world engineering data. As with the prior challenge, this work not only documents the ‘state-of-the-art’ in computational failure prediction of ductile tearing scenarios, but also provides a detailed dataset for non-blind assessment of alternative methods.

  5. The second Sandia Fracture Challenge. Predictions of ductile failure under quasi-static and moderate-rate dynamic loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyce, B. L.; Kramer, S. L. B.; Bosiljevac, T. R.

    Ductile failure of structural metals is relevant to a wide range of engineering scenarios. Computational methods are employed to anticipate the critical conditions of failure, yet they sometimes provide inaccurate and misleading predictions. Challenge scenarios, such as the one presented in the current work, provide an opportunity to assess the blind, quantitative predictive ability of simulation methods against a previously unseen failure problem. Instead of evaluating the predictions of a single simulation approach, the Sandia Fracture Challenge relied on numerous volunteer teams with expertise in computational mechanics to apply a broad range of computational methods, numerical algorithms, and constitutive models to the challenge. This exercise is intended to evaluate the state of health of technologies available for failure prediction. In the first Sandia Fracture Challenge, a wide range of issues were raised in ductile failure modeling, including a lack of consistency in failure models, the importance of shear calibration data, and difficulties in quantifying the uncertainty of prediction [see Boyce et al. (Int J Fract 186:5–68, 2014) for details of these observations]. This second Sandia Fracture Challenge investigated the ductile rupture of a Ti–6Al–4V sheet under both quasi-static and modest-rate dynamic loading (failure in ~ 0.1 s). Like the previous challenge, the sheet had an unusual arrangement of notches and holes that added geometric complexity and fostered a competition between tensile- and shear-dominated failure modes. The teams were asked to predict the fracture path and quantitative far-field failure metrics such as the peak force and displacement to cause crack initiation. Fourteen teams contributed blind predictions, and the experimental outcomes were quantified in three independent test labs. 
In addition, shortcomings were revealed in this second challenge such as inconsistency in the application of appropriate boundary conditions, need for a thermomechanical treatment of the heat generation in the dynamic loading condition, and further difficulties in model calibration based on limited real-world engineering data. As with the prior challenge, this work not only documents the ‘state-of-the-art’ in computational failure prediction of ductile tearing scenarios, but also provides a detailed dataset for non-blind assessment of alternative methods.

  6. Environment assisted degradation mechanisms in advanced light metals

    NASA Technical Reports Server (NTRS)

    Gangloff, R. P.; Stoner, G. E.; Swanson, R. E.

    1989-01-01

    A multifaceted research program on the performance of advanced light metallic alloys in aggressive aerospace environments, and associated environmental failure mechanisms was initiated. The general goal is to characterize alloy behavior quantitatively and to develop predictive mechanisms for environmental failure modes. Successes in this regard will provide the basis for metallurgical optimization of alloy performance, for chemical control of aggressive environments, and for engineering life prediction with damage tolerance and long term reliability.

  7. Extubation failure influences clinical and functional outcomes in patients with traumatic brain injury*

    PubMed Central

    dos Reis, Helena França Correia; Almeida, Mônica Lajana Oliveira; da Silva, Mário Ferreira; Rocha, Mário de Seixas

    2013-01-01

    OBJECTIVE: To evaluate the association between extubation failure and outcomes (clinical and functional) in patients with traumatic brain injury (TBI). METHODS: A prospective cohort study involving 311 consecutive patients with TBI. The patients were divided into two groups according to extubation outcome: extubation success; and extubation failure (defined as reintubation within 48 h after extubation). A multivariate model was developed in order to determine whether extubation failure was an independent predictor of in-hospital mortality. RESULTS: The mean age was 35.7 ± 13.8 years. Males accounted for 92.3%. The incidence of extubation failure was 13.8%. In-hospital mortality was 4.5% and 20.9% in successfully extubated patients and in those with extubation failure, respectively (p = 0.001). Tracheostomy was more common in the extubation failure group (55.8% vs. 1.9%; p < 0.001). The median length of hospital stay was significantly greater in the extubation failure group than in the extubation success group (44 days vs. 27 days; p = 0.002). Functional status at discharge was worse among the patients in the extubation failure group. The multivariate analysis showed that extubation failure was an independent predictor of in-hospital mortality (OR = 4.96; 95% CI, 1.86-13.22). CONCLUSIONS: In patients with TBI, extubation failure appears to lengthen hospital stays; to increase the frequency of tracheostomy and of pulmonary complications; to worsen functional outcomes; and to increase mortality. PMID:23857695

  8. Interoperability-oriented Integration of Failure Knowledge into Functional Knowledge and Knowledge Transformation based on Concepts Mapping

    NASA Astrophysics Data System (ADS)

    Koji, Yusuke; Kitamura, Yoshinobu; Kato, Yoshikiyo; Tsutsui, Yoshio; Mizoguchi, Riichiro

    In conceptual design, it is important to develop functional structures which reflect the rich experience in the knowledge from previous design failures. Especially, if a designer learns possible abnormal behaviors from a previous design failure, he or she can add an additional function which prevents such abnormal behaviors and faults. To do this, it is a crucial issue to share such knowledge about possible faulty phenomena and how to cope with them. In fact, a part of such knowledge is described in FMEA (Failure Mode and Effect Analysis) sheets, function structure models for systematic design and fault trees for FTA (Fault Tree Analysis).
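The knowledge integration described above links a failure mode (as captured in an FMEA row) to the abnormal behavior it implies and to a candidate preventive function in the functional structure. A toy representation in that spirit (class and instance names are hypothetical illustrations, not the paper's ontology):

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One FMEA row: a component, its failure mode, and the effect."""
    component: str
    mode: str
    effect: str

@dataclass
class AbnormalBehavior:
    """Faulty phenomenon implied by a failure mode."""
    description: str
    caused_by: FailureMode

@dataclass
class PreventiveFunction:
    """Function added to the design to block an abnormal behavior."""
    name: str
    prevents: AbnormalBehavior

# Hypothetical example: a worn seal leads to leakage, countered
# by a function that retains the lubricant
fm = FailureMode("seal", "wear-out", "fluid leakage")
ab = AbnormalBehavior("lubricant escapes housing", caused_by=fm)
pf = PreventiveFunction("retain lubricant (secondary seal)", prevents=ab)
print(pf.prevents.caused_by.component)  # seal
```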

  9. Reliability of pathogen control in direct potable reuse: Performance evaluation and QMRA of a full-scale 1 MGD advanced treatment train.

    PubMed

    Pecson, Brian M; Triolo, Sarah C; Olivieri, Simon; Chen, Elise C; Pisarenko, Aleksey N; Yang, Chao-Chun; Olivieri, Adam; Haas, Charles N; Trussell, R Shane; Trussell, R Rhodes

    2017-10-01

    To safely progress toward direct potable reuse (DPR), it is essential to ensure that DPR systems can provide public health protection equivalent to or greater than that of conventional drinking water sources. This study collected data over a one-year period from a full-scale DPR demonstration facility, and used both performance distribution functions (PDFs) and quantitative microbial risk assessment (QMRA) to define and evaluate the reliability of the advanced water treatment facility (AWTF). The AWTF's ability to control enterovirus, Giardia, and Cryptosporidium was characterized using online monitoring of surrogates in a treatment train consisting of ozone, biological activated carbon, microfiltration, reverse osmosis, and ultraviolet light with an advanced oxidation process. This process train was selected to improve reliability by providing redundancy, defined as the provision of treatment beyond the minimum needed to meet regulatory requirements. The PDFs demonstrated treatment that consistently exceeded the 12/10/10-log thresholds for virus, Giardia, and Cryptosporidium, as currently required for potable reuse in California (via groundwater recharge and surface water augmentation). Because no critical process failures impacted pathogen removal performance during the yearlong testing, hypothetical failures were incorporated into the analysis to understand the benefit of treatment redundancy on performance. Each unit process was modeled with a single failure per year lasting four different failure durations: 15 min, 60 min, 8 h, and 24 h. QMRA was used to quantify the impact of failures on pathogen risk. The median annual risk of infection for Cryptosporidium was 4.9 × 10⁻¹¹ in the absence of failures, and reached a maximum of 1.1 × 10⁻⁵ assuming one 24-h failure per process per year. 
With the inclusion of free chlorine disinfection as part of the treatment process, enterovirus had a median annual infection risk of 1.5 × 10⁻¹⁴ (no failures) and a maximum annual value of 2.1 × 10⁻⁵ (assuming one 24-h failure per year). Even with conservative failure assumptions, pathogen risk from this treatment train remains below the risk targets for both the U.S. (10⁻⁴ infections/person/year) and the WHO (approximately 10⁻³ infections/person/year, equivalent to 10⁻⁶ DALY/person/year), demonstrating the value of a failure prevention strategy based on treatment redundancy. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
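The effect of a hypothetical failure window on annual risk can be sketched by mixing normal and failed operation: during the failure, one process contributes no log removal, and a dose-response model converts the resulting exposure to infection risk. A deliberately simplified sketch (exponential dose-response, invented concentrations and log-removal values; the study's QMRA is more detailed):

```python
import math

def annual_infection_risk(conc_per_L, daily_intake_L, log_removal,
                          failed_process_logs, failure_hours, k=0.2):
    """Toy QMRA: exponential dose-response P = 1 - exp(-k * dose).
    During the failure window one process contributes no removal
    (a simplification of the study's hypothetical-failure model)."""
    hours_per_year = 365 * 24

    def daily_risk(total_logs):
        dose = conc_per_L * daily_intake_L * 10 ** (-total_logs)
        return 1 - math.exp(-k * dose)

    p_normal = daily_risk(log_removal)
    p_failed = daily_risk(log_removal - failed_process_logs)
    frac_failed = failure_hours / hours_per_year
    p_day = (1 - frac_failed) * p_normal + frac_failed * p_failed
    return 1 - (1 - p_day) ** 365   # annualise the daily risk

# Hypothetical numbers: 1 organism/L raw water, 2 L/day intake,
# a 12-log train, one 24-h failure of a 4-log process per year
r = annual_infection_risk(1.0, 2.0, 12, 4, 24)
print(f"{r:.2e}")
```

Even in this toy version, the redundancy argument is visible: a brief loss of one process barely moves the annual risk when the remaining train still provides deep removal.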

  10. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
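In the spirit of the PFA structure described above, an analytical failure model can be evaluated under sampled parameter and modeling-accuracy uncertainties to produce a failure-probability estimate. A Monte Carlo sketch (the life model and distributions are illustrative stand-ins, not the report's actual engineering models):

```python
import random

def pfa_failure_probability(n_samples=100_000, n_missions=100, seed=1):
    """Monte Carlo sketch of a PFA-style assessment: sample
    uncertain parameters, evaluate an analytical life model, and
    estimate the probability that life falls short of the mission
    duration. Hypothetical model and distributions."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        median_life = 1_000.0                     # missions (assumed)
        scatter = rng.lognormvariate(0.0, 0.6)    # material scatter
        model_err = rng.lognormvariate(0.0, 0.3)  # modeling accuracy
        life = median_life * scatter * model_err
        if life < n_missions:
            failures += 1
    return failures / n_samples

print(pfa_failure_probability())
```

The same sampled distribution could then be updated with test or flight experience, which is the step the PFA statistical procedures formalize.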

  11. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  12. An Abrupt Transition to an Intergranular Failure Mode in the Near-Threshold Fatigue Crack Growth Regime in Ni-Based Superalloys

    NASA Astrophysics Data System (ADS)

    Telesman, J.; Smith, T. M.; Gabb, T. P.; Ring, A. J.

    2018-06-01

    Cyclic near-threshold fatigue crack growth (FCG) behavior of two disk superalloys was evaluated and was shown to exhibit an unexpected sudden failure mode transition from a mostly transgranular failure mode at higher stress intensity factor ranges to an almost completely intergranular failure mode in the threshold regime. The change in failure modes was associated with a crossover of FCG resistance curves in which the conditions that produced higher FCG rates in the Paris regime resulted in lower FCG rates and increased ΔKth values in the threshold region. High-resolution scanning and transmission electron microscopy were used to carefully characterize the crack tips at these near-threshold conditions. Formation of stable Al-oxide followed by Cr-oxide and Ti-oxides was found to occur at the crack tip prior to formation of unstable oxides. To contrast with the threshold failure mode regime, a quantitative assessment of the role that the intergranular failure mode has on cyclic FCG behavior in the Paris regime was also performed. It was demonstrated that even a very limited intergranular failure content dominates the FCG response under mixed mode failure conditions.

  13. Accelerated life assessment of coating on the radar structure components in coastal environment.

    PubMed

    Liu, Zhe; Ming, ZhiMao

    2016-07-04

    This paper aimed to build an accelerated life test scheme and to carry out a quantitative comparison between accelerated life testing in the laboratory and actual service for a coating composed of epoxy primer and polyurethane paint on structural components of a radar serving in the coastal environment of the South China Sea. The accelerated life test scheme was built based on the service environment and failure analysis of the coating. The quantitative comparison between accelerated testing and actual service was conducted by comparing the gloss loss, discoloration, chalking, blistering, cracking and electrochemical impedance spectroscopy of the coating. The main factors leading to coating failure were ultraviolet radiation, temperature, moisture, salt fog and loads, so the accelerated life test included ultraviolet radiation, damp heat, thermal shock, fatigue and salt spray. It was established that one cycle of the accelerated life test was equal to one year of actual service, which provides a precise way for the manufacturer to predict the actual service life of newly developed coatings.

  14. Functional Renal Imaging with 2-Deoxy-2-18F-Fluorosorbitol PET in Rat Models of Renal Disorders.

    PubMed

    Werner, Rudolf A; Wakabayashi, Hiroshi; Chen, Xinyu; Hirano, Mitsuru; Shinaji, Tetsuya; Lapa, Constantin; Rowe, Steven P; Javadi, Mehrbod S; Higuchi, Takahiro

    2018-05-01

    Precise regional quantitative assessment of renal function is limited with conventional 99mTc-labeled renal radiotracers. A recent study reported that the PET radiotracer 2-deoxy-2-18F-fluorosorbitol (18F-FDS) has ideal pharmacokinetics for functional renal imaging. Furthermore, 18F-FDS is available via simple reduction from routinely used 18F-FDG. We aimed to further investigate the potential of 18F-FDS PET as a functional renal imaging agent using rat models of kidney disease. Methods: Two different rat models of renal impairment were investigated: induction of acute renal failure by intramuscular administration of glycerol in the hind legs, and induction of unilateral ureteral obstruction by ligation of the left ureter. At 24 h after these procedures, dynamic 30-min 18F-FDS PET data were acquired using a dedicated small-animal PET system. Urine 18F-FDS radioactivity 30 min after radiotracer injection was measured together with coinjected 99mTc-diethylenetriaminepentaacetic acid urine activity. Results: Dynamic PET imaging demonstrated rapid 18F-FDS accumulation in the renal cortex and rapid radiotracer excretion via the kidneys in healthy control rats. On the other hand, significantly delayed renal radiotracer uptake (continuous slow uptake) was observed in acute renal failure rats and unilateral ureteral obstruction kidneys. Measured urine radiotracer concentrations of 18F-FDS and 99mTc-diethylenetriaminepentaacetic acid correlated well with each other (R = 0.84, P < 0.05). Conclusion: 18F-FDS PET demonstrated favorable kinetics for functional renal imaging in rat models of kidney diseases. 18F-FDS PET imaging, with its advantages of high spatiotemporal resolution and simple tracer production, could potentially complement or replace conventional renal scintigraphy in select cases and significantly improve the diagnostic performance of renal functional imaging. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.

  15. A quantitative model of honey bee colony population dynamics.

    PubMed

    Khoury, David S; Myerscough, Mary R; Barron, Andrew B

    2011-04-18

    Since 2006 the rate of honey bee colony failure has increased significantly. As an aid to testing hypotheses for the causes of colony failure we have developed a compartment model of honey bee colony population dynamics to explore the impact of different death rates of forager bees on colony growth and development. The model predicts a critical threshold forager death rate beneath which colonies regulate a stable population size. If death rates are sustained higher than this threshold rapid population decline is predicted and colony failure is inevitable. The model also predicts that high forager death rates draw hive bees into the foraging population at much younger ages than normal, which acts to accelerate colony failure. The model suggests that colony failure can be understood in terms of observed principles of honey bee population dynamics, and provides a theoretical framework for experimental investigation of the problem.
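A compartment model of this kind can be sketched with two coupled equations: brood eclosion feeds the hive-bee compartment, hive bees are recruited to foraging (with social inhibition from existing foragers), and foragers die at rate m. The sketch below uses simple Euler integration with illustrative parameter values, not the paper's fitted ones, but it reproduces the qualitative threshold behavior described above:

```python
def simulate_colony(m, days=250, dt=0.1):
    """Two-compartment hive/forager model in the spirit of the
    paper: eclosion feeds hive bees H, hive bees are recruited to
    foraging F with social inhibition, foragers die at rate m.
    Parameter values are illustrative, not the paper's."""
    L, w = 2000.0, 27000.0        # max eclosion rate, brood saturation
    alpha, sigma = 0.25, 0.75     # base recruitment, social inhibition
    H, F = 16000.0, 8000.0        # initial hive bees, foragers
    for _ in range(int(days / dt)):
        N = H + F
        eclosion = L * N / (w + N)
        recruit = max(alpha - sigma * F / N, 0.0) * H
        H += dt * (eclosion - recruit)
        F += dt * (recruit - m * F)
    return H + F

low = simulate_colony(m=0.10)    # modest forager death rate
high = simulate_colony(m=0.60)   # sustained high death rate
print(round(low), round(high))   # high death rate -> colony decline
```

Below the threshold the population settles at a stable size; above it, recruitment pulls hive bees into foraging faster than the brood can replace them and the colony collapses, matching the paper's qualitative prediction.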

  16. Failure Behavior Characterization of Mo-Modified Ti Surface by Impact Test and Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Ma, Yong; Qin, Jianfeng; Zhang, Xiangyu; Lin, Naiming; Huang, Xiaobo; Tang, Bin

    2015-07-01

    Using the impact test and finite element simulation, the failure behavior of the Mo-modified layer on pure Ti was investigated. In the impact test, four loads of 100, 300, 500, and 700 N and 10⁴ impacts were adopted. The three-dimensional residual impact dents were examined using an optical microscope (Olympus-DSX500i), indicating that the impact resistance of the Ti surface was improved. Two failure modes, cohesive failure and wearing, were elucidated by electron backscatter diffraction and energy-dispersive spectrometry performed in a field-emission scanning electron microscope. Through finite element forward analysis performed at a typical impact load of 300 N, stress-strain distributions in the Mo-modified Ti were quantitatively determined. In addition, the failure behavior of the Mo-modified layer was determined and an ideal failure model was proposed for high-load impact, based on the experimental and finite element forward analysis results.

  17. Effect of exercise on diastolic function in heart failure patients: a systematic review and meta-analysis.

    PubMed

    Pearson, M J; Mungovan, S F; Smart, N A

    2017-03-01

    Diastolic dysfunction contributes to the development and progression of heart failure. Conventional echocardiography and tissue Doppler imaging are widely utilised in clinical research providing a number of indices of diastolic function valuable in the diagnosis and prognosis of heart failure patients. The aim of this meta-analysis was to quantify the effect of exercise training on diastolic function in patients with heart failure. Exercise training studies that investigate different indices of diastolic function in patients with heart failure have reported that exercise training improves diastolic function in these patients. We sought to add to the current literature by quantifying, where possible, the effect of exercise training on diastolic function. We conducted database searches (PubMed, EBSCO, EMBASE, and Cochrane Trials Register to 31 July 2016) for exercise based rehabilitation trials in heart failure, using the search terms 'exercise training, diastolic function and diastolic dysfunction'. Data from six studies, with a total of 266 heart failure with reduced ejection fraction (HFrEF) participants, 144 in intervention groups and 122 in control groups, indicated a significant reduction in the ratio of early diastolic transmitral velocity (E) to early diastolic tissue velocity (E') (E/E' ratio) with exercise training, exercise vs. control mean difference (MD) of -2.85 (95% CI -3.66 to -2.04, p < 0.00001). Data from five studies in heart failure with preserved ejection fraction (HFpEF) patients, with a total of 204 participants, 115 in intervention groups and 89 in control groups, also demonstrated a significant improvement in E/E' in exercise vs. control MD of -2.38 (95% CI -3.47 to -1.28, p < 0.0001).

  18. Forecasting volcanic eruptions and other material failure phenomena: An evaluation of the failure forecast method

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.

    2011-08-01

    Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
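The FFM linearization works because, for a power-law exponent near 1, the inverse precursor rate decreases linearly in time and reaches zero at the failure time. A minimal sketch with noise-free synthetic data (which also illustrates the paper's point: on real data the least-squares fit to inverse rates mishandles the error structure, which is why a Generalized Linear Model is preferred):

```python
import numpy as np

# Synthetic accelerating precursor rate following a power law with
# exponent alpha = 1: rate(t) = k / (tf - t), failure at tf = 100
tf, k = 100.0, 50.0
t = np.arange(10.0, 95.0, 5.0)
rate = k / (tf - t)

# FFM: for alpha ~ 1 the inverse rate decreases linearly in time
# and crosses zero at the failure time tf
inv_rate = 1.0 / rate                   # equals (tf - t) / k exactly
slope, intercept = np.polyfit(t, inv_rate, 1)
t_forecast = -intercept / slope
print(round(t_forecast, 1))             # recovers tf = 100.0
```

With noise-free data the extrapolation is exact; with counting noise, the same regression on inverse rates becomes biased and imprecise, which is the failure mode the paper quantifies.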

  19. [Clinical characteristics and medium-term prognosis of patients with heart failure and preserved systolic function. Do they differ in systolic dysfunction?].

    PubMed

    Ojeda, Soledad; Anguita, Manuel; Muñoz, Juan F; Rodríguez, Marcos T; Mesa, Dolores; Franco, Manuel; Ureña, Isabel; Vallés, Federico

    2003-11-01

    To assess the prevalence, clinical profile and medium-term prognosis in patients with heart failure and preserved systolic ventricular function compared to those with systolic dysfunction. 153 patients were included, 62 with preserved systolic ventricular function (left ventricular ejection fraction > or = 45%) and 91 with impaired systolic ventricular function (left ventricular ejection fraction < 45%). The mean follow-up period was 25 ± 10 months. Mean age was similar (66 ± 10 vs. 65 ± 10; p = 0.54). There was a higher proportion of women among patients with preserved systolic function (53% vs. 28%; p < 0.01). Ischemic and idiopathic cardiomyopathy were the most common causes of heart failure in patients with systolic dysfunction, whereas valvular disease and hypertensive cardiopathy were the most common in patients with preserved systolic function. Angiotensin-converting enzyme inhibitors and beta-blockers were more often prescribed in patients with impaired systolic ventricular function (86% vs. 52%; p < 0.01 and 33% vs. 11%; p < 0.01, respectively). There were no differences between the groups in terms of mortality rate (37% vs. 29%), readmission rate for other causes (29% vs. 23%), readmission rate for heart failure (45% vs. 45%), cumulative survival (51% vs. 62%) and the likelihood of not being readmitted for heart failure (50% vs. 52%). In the multivariate analysis, left ventricular ejection fraction was not a predictor of death or readmission because of heart failure. In a large proportion of patients with heart failure, systolic ventricular function is preserved. Despite the clinical differences between patients with preserved and impaired systolic ventricular function, the medium-term prognosis was similar in both groups.

  20. Mitochondrial function as a therapeutic target in heart failure

    PubMed Central

    Brown, David A.; Perry, Justin B.; Allen, Mitchell E.; Sabbah, Hani N.; Stauffer, Brian L.; Shaikh, Saame Raza; Cleland, John G. F.; Colucci, Wilson S.; Butler, Javed; Voors, Adriaan A.; Anker, Stefan D.; Pitt, Bertram; Pieske, Burkert; Filippatos, Gerasimos; Greene, Stephen J.; Gheorghiade, Mihai

    2017-01-01

    Heart failure is a pressing worldwide public-health problem with millions of patients having worsening heart failure. Despite all the available therapies, the condition carries a very poor prognosis. Existing therapies provide symptomatic and clinical benefit, but do not fully address molecular abnormalities that occur in cardiomyocytes. This shortcoming is particularly important given that most patients with heart failure have viable dysfunctional myocardium, in which an improvement or normalization of function might be possible. Although the pathophysiology of heart failure is complex, mitochondrial dysfunction seems to be an important target for therapy to improve cardiac function directly. Mitochondrial abnormalities include impaired mitochondrial electron transport chain activity, increased formation of reactive oxygen species, shifted metabolic substrate utilization, aberrant mitochondrial dynamics, and altered ion homeostasis. In this Consensus Statement, insights into the mechanisms of mitochondrial dysfunction in heart failure are presented, along with an overview of emerging treatments with the potential to improve the function of the failing heart by targeting mitochondria. PMID:28004807

  1. Improving Attachments of Non-Invasive (Type III) Electronic Data Loggers to Cetaceans

    DTIC Science & Technology

    2015-09-30

    animals in human care will be performed to test and validate this approach. The cadaver trials will enable controlled testing to failure or with both...quantitative metrics and analysis tools to assess the impact of a tag on the animal. Here we will present: 1) the characterization of the mechanical...fine scale motion analysis for swimming animals. 2 APPROACH Our approach is divided into four subtasks: Task 1: Forces and failure modes

  2. Acoustic emission spectral analysis of fiber composite failure mechanisms

    NASA Technical Reports Server (NTRS)

    Egan, D. M.; Williams, J. H., Jr.

    1978-01-01

    The acoustic emission of graphite fiber polyimide composite failure mechanisms was investigated with emphasis on frequency spectrum analysis. Although visual examination of spectral densities could not distinguish among fracture sources, a paired-sample t statistical analysis of mean normalized spectral densities did provide quantitative discrimination among acoustic emissions from 10°, 90°, and [±45]s specimens. Comparable discrimination was not obtained for 0° specimens.
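The paired-sample t analysis described above can be sketched with standard-library Python: the statistic is computed over matched frequency bins of mean normalized spectral densities from two specimen types. All numeric values below are illustrative, not data from the study.

```python
import math

def paired_t(a, b):
    """Paired-sample t statistic and degrees of freedom for two
    equal-length samples of matched observations."""
    assert len(a) == len(b) and len(a) > 1
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n), n - 1

# Hypothetical mean normalized spectral densities in six matched
# frequency bins for two specimen lay-ups (arbitrary units).
bins_90deg = [0.12, 0.18, 0.25, 0.22, 0.15, 0.08]
bins_10deg = [0.10, 0.14, 0.20, 0.19, 0.11, 0.06]
t, dof = paired_t(bins_90deg, bins_10deg)  # large |t| discriminates sources
```

A large |t| relative to the t distribution with `dof` degrees of freedom indicates that the two emission sources differ systematically across bins, which is the kind of discrimination the visual inspection failed to provide.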

  3. The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.

    2010-01-01

    HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions and their corresponding failures in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedures, equipment design, and the need for automation.

  4. In vivo optical imaging of the viable epidermis around the nailfold capillaries for the assessment of heart failure severity in humans.

    PubMed

    Shirshin, Evgeny A; Gurfinkel, Yury I; Matskeplishvili, Simon T; Sasonko, Maria L; Omelyanenko, Nikolai P; Yakimov, Boris P; Lademann, Juergen; Darvin, Maxim E

    2018-05-29

    Heart failure (HF) is a socially significant disease, affecting over 2% of the adult population in developed countries. Diagnosis of HF severity remains complicated due to the absence of specific symptoms and objective criteria. Here we present an indicator of HF severity based on imaging tissue parameters around the nailfold capillaries. High resolution nailfold video capillaroscopy was performed to determine the perivascular zone (PZ) size around nailfold capillaries, and two-photon tomography with fluorescence lifetime imaging was used to investigate PZ composition. We found that the size of the PZ around the nailfold capillaries strongly correlates with heart failure severity. Further investigations using two-photon tomography demonstrated that the PZ corresponds to the border of the viable epidermis, and it was suggested that the PZ size variations were due to differing amounts of interstitial fluid that potentially translate into clinically significant oedema. The obtained results allow for the development of a quantitative indicator of oedematous syndrome, which can be used in various applications to monitor the dynamics of interstitial fluid retention. We therefore suggest PZ size measured with nailfold video capillaroscopy as a novel quantitative, sensitive, non-invasive marker of heart failure severity.

  5. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
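The core PFA idea, propagating uncertainty about analysis parameters through an engineering failure model to obtain a failure probability, can be illustrated with a minimal stress-strength Monte Carlo sketch. The lognormal distributions and their parameters below are illustrative assumptions, not values from the PFA reports, and this is not the documented PFA software.

```python
import random

def monte_carlo_failure_prob(n_trials=100_000, seed=1):
    """Crude stress-strength sketch of probabilistic failure assessment:
    a trial fails when the sampled load exceeds the sampled strength,
    each drawn from a distribution expressing parameter uncertainty."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # Hypothetical lognormal load and strength; medians and
        # log-sigmas are illustrative only.
        load = rng.lognormvariate(mu=0.0, sigma=0.15)      # median 1.0
        strength = rng.lognormvariate(mu=0.5, sigma=0.10)  # median ~1.65
        if load >= strength:
            failures += 1
    return failures / n_trials

p = monte_carlo_failure_prob()  # estimated failure probability per demand
```

In the actual methodology, the resulting failure probability distribution would then be updated with test and flight experience; the sketch stops at the prior estimate.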

  6. Analysis of base fuze functioning of HESH ammunitions through high-speed photographic technique

    NASA Astrophysics Data System (ADS)

    Biswal, T. K.

    2007-01-01

    High-speed photography plays a major role in a test range, where direct access to a dynamic process is possible through imaging, and both qualitative and quantitative data are obtained through subsequent image processing and analysis. In one trial it was difficult to understand the performance of HESH ammunition on rolled homogeneous armour: there was no consistency in scab formation even though all other parameters, such as propellant charge mass, charge temperature, and impact velocity, were held constant. To understand the event thoroughly, high-speed photography was deployed to give a frontal view of the whole process. Clear information on shell impact, embedding of the HE propellant on the armour, and base fuze initiation was obtained. In scab-forming rounds these three processes are clearly observed in sequence. In non-scab rounds, however, the base fuze is initiated before the embedding process is complete, so the threshold thrust on the armour needed to cause a scab is not available. This was revealed in two rounds in which scab formation failed. As a quantitative measure, the fuze delay was calculated for each round, and premature functioning of the base fuze was thereby ascertained for the non-scab rounds. This capability of high-speed photography is described in detail in this paper.

  7. Quantitative nondestructive in-service evaluation of stay cables of cable-stayed bridges: methods and practical experience

    NASA Astrophysics Data System (ADS)

    Weischedel, Herbert R.; Hoehle, Hans-Werner

    1995-05-01

    Stay cables of cable-stayed bridges have corrosion protection systems that can be elaborate. For example, such a system may simply consist of one or several coats of paint, or, more complex, of plastic pipes that are wrapped with tape and filled with grout. Frequently, these corrosion protection systems prevent visual inspections. Therefore, alternative nondestructive examination methods are called for. For example, modern dual-function electromagnetic (EM) instruments allow the simultaneous detection of external and internal localized flaws (such as external and internal broken wires and corrosion pitting) and the measurement of loss of metallic cross-sectional area (typically caused by external or internal corrosion or wear). Initially developed for mining and skiing applications, these instruments have been successfully used for the inspection of stays of cable-stayed bridges, and for the inspection of guys of smoke stacks, flare stacks, broadcast towers, suspended roofs, etc. As a rule, guys and bridge cables are not subjected to wear and bending stresses. However, their safety can be compromised by corrosion caused by the failure of corrosion protection systems. Furthermore, live loads and wind forces create intermittent tensile stresses that can cause fatigue breaks of wires. This paper discusses the use of dual-function EM instruments for the detection and the nondestructive quantitative evaluation of cable deterioration. It explains the underlying principles. Experiences with this method together with field inspection results will be presented.

  8. Donor Indocyanine Green Clearance Test Predicts Graft Quality and Early Graft Prognosis After Liver Transplantation.

    PubMed

    Tang, Yunhua; Han, Ming; Chen, Maogen; Wang, Xiaoping; Ji, Fei; Zhao, Qiang; Zhang, Zhiheng; Ju, Weiqiang; Wang, Dongping; Guo, Zhiyong; He, Xiaoshun

    2017-11-01

    Transplantation centers have given much attention to donor availability. However, no reliable quantitative methods have been employed to accurately assess graft quality before transplantation. Here, we report that the indocyanine green (ICG) clearance test is a valuable index for liver grafts. We performed the ICG clearance test on 90 brain-dead donors within 6 h before organ procurement between March 2015 and November 2016. We also analyzed the relationship between graft liver function and early graft survival after liver transplantation (LT). Our results suggest that the ICG retention rate at 15 min (ICGR15) of donors before procurement was independently associated with 3-month graft survival after LT. The best donor ICGR15 cutoff value was 11.0%/min, and we observed a significant increase in 3-month graft failure among patients with a donor ICGR15 above this value. On the other hand, a donor ICGR15 value of ≤ 11.0%/min could be used as an early assessment index of graft quality, because it provides additional information to the transplant surgeon or organ procurement organization members who must maintain or improve organ function before LT. An ICG clearance test before liver procurement might be an effective quantitative method to predict graft availability and improve early graft prognosis after LT.
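The reported cutoff amounts to a simple dichotomization of donors at 11.0 %/min; a minimal sketch (the function name and group labels are ours, not the study's):

```python
def graft_risk_group(icgr15_percent_per_min, cutoff=11.0):
    """Dichotomize a donor ICGR15 value at the abstract's reported
    cutoff: values above 11.0 %/min were associated with higher
    3-month graft failure, values at or below it with lower risk."""
    return "higher-risk" if icgr15_percent_per_min > cutoff else "lower-risk"
```

For example, `graft_risk_group(14.2)` falls in the higher-risk group, while `graft_risk_group(8.5)` does not; the abstract's survival claims attach to these two groups, not to the continuous ICGR15 value.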

  9. Membrane raft association is a determinant of plasma membrane localization.

    PubMed

    Diaz-Rohrer, Blanca B; Levental, Kandice R; Simons, Kai; Levental, Ilya

    2014-06-10

    The lipid raft hypothesis proposes lateral domains driven by preferential interactions between sterols, sphingolipids, and specific proteins as a central mechanism for the regulation of membrane structure and function; however, experimental limitations in defining raft composition and properties have prevented unequivocal demonstration of their functional relevance. Here, we establish a quantitative, functional relationship between raft association and subcellular protein sorting. By systematic mutation of the transmembrane and juxtamembrane domains of a model transmembrane protein, linker for activation of T-cells (LAT), we generated a panel of variants possessing a range of raft affinities. These mutations revealed palmitoylation, transmembrane domain length, and transmembrane sequence to be critical determinants of membrane raft association. Moreover, plasma membrane (PM) localization was strictly dependent on raft partitioning across the entire panel of unrelated mutants, suggesting that raft association is necessary and sufficient for PM sorting of LAT. Abrogation of raft partitioning led to mistargeting to late endosomes/lysosomes because of a failure to recycle from early endosomes. These findings identify structural determinants of raft association and validate lipid-driven domain formation as a mechanism for endosomal protein sorting.

  10. Membrane raft association is a determinant of plasma membrane localization

    PubMed Central

    Diaz-Rohrer, Blanca B.; Levental, Kandice R.; Simons, Kai; Levental, Ilya

    2014-01-01

    The lipid raft hypothesis proposes lateral domains driven by preferential interactions between sterols, sphingolipids, and specific proteins as a central mechanism for the regulation of membrane structure and function; however, experimental limitations in defining raft composition and properties have prevented unequivocal demonstration of their functional relevance. Here, we establish a quantitative, functional relationship between raft association and subcellular protein sorting. By systematic mutation of the transmembrane and juxtamembrane domains of a model transmembrane protein, linker for activation of T-cells (LAT), we generated a panel of variants possessing a range of raft affinities. These mutations revealed palmitoylation, transmembrane domain length, and transmembrane sequence to be critical determinants of membrane raft association. Moreover, plasma membrane (PM) localization was strictly dependent on raft partitioning across the entire panel of unrelated mutants, suggesting that raft association is necessary and sufficient for PM sorting of LAT. Abrogation of raft partitioning led to mistargeting to late endosomes/lysosomes because of a failure to recycle from early endosomes. These findings identify structural determinants of raft association and validate lipid-driven domain formation as a mechanism for endosomal protein sorting. PMID:24912166

  11. Comprehensive Understanding of the Zipingpu Reservoir to the Ms8.0 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Cheng, H.; Pang, Y. J.; Zhang, H.; Shi, Y.

    2014-12-01

    After the Wenchuan earthquake occurred, the question of whether the earthquake was triggered by impoundment of the Zipingpu Reservoir attracted wide attention in the international academic community. In addition to qualitative discussion, many scholars adopted quantitative methods to calculate the stress changes, but their results differed and they drew very different conclusions. Here, we take the dispute between different teams over the quantitative calculation for the Zipingpu Reservoir as a starting point. To identify the key factors influencing the quantitative calculation and to characterize the uncertainties in the numerical simulations, we analyze the factors that may cause the differences. The preliminary results show that the calculation method (analytical or numerical), the dimension of the model (2-D or 3-D), the diffusion model, the diffusion coefficient, and the assumed focal mechanism are the main sources of the differences, especially the diffusion coefficient of the fractured rock mass. The change in Coulomb failure stress at the epicenter of the Wenchuan earthquake obtained from a 2-D model is about 3 times that from a 3-D model. It is not reasonable to consider only fault permeability (treating the permeability of the rock mass as infinite) or only homogeneous isotropic rock-mass permeability (ignoring the fault permeability). Different focal mechanisms can also dramatically affect the computed change in Coulomb failure stress at the epicenter, with differences of 2-7 times, and different diffusion coefficients can change it by several hundred times. Given existing research indicating that the Coulomb failure stress change is on the order of several kPa, we cannot rule out the possibility that the Zipingpu Reservoir triggered the 2008 Wenchuan earthquake. However, because the background stress is not known and the Coulomb failure stress change is small, we also cannot be sure that there is a connection between the reservoir and the earthquake. In future work, we should build on field surveys and laboratory experiments, improve the model, and develop high-performance simulations.
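The quantity under dispute, the change in Coulomb failure stress on a receiver fault, is conventionally written ΔCFS = Δτ + μ′Δσn, where Δτ is the shear stress change in the slip direction, Δσn the normal stress change (positive for unclamping), and μ′ an effective friction coefficient that folds in pore-pressure effects. A minimal sketch with illustrative kPa-scale inputs (the numbers are not from any of the disputed calculations):

```python
def delta_cfs(delta_shear_mpa, delta_normal_mpa, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault:
    dCFS = d(tau) + mu' * d(sigma_n), in MPa, with d(sigma_n)
    positive for unclamping and mu' an effective friction
    coefficient absorbing pore-pressure effects."""
    return delta_shear_mpa + mu_eff * delta_normal_mpa

# Illustrative values of the kPa order discussed in the abstract.
change = delta_cfs(0.002, 0.003)  # 0.0032 MPa, i.e. 3.2 kPa
```

The abstract's point is that the inputs to this simple expression (diffusion coefficient, model dimension, focal mechanism) vary so much between studies that the computed ΔCFS at the epicenter differs by factors of several to several hundred.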

  12. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of abort triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of abort triggers.

  13. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of Abort Triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of Abort Triggers.

  14. A Framework for Creating a Function-based Design Tool for Failure Mode Identification

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Knowledge of potential failure modes during design is critical for prevention of failures. Currently industries use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis, or Failure Modes, Effects and Criticality Analysis (FMECA), as well as knowledge and experience, to determine potential failure modes. When new products are being developed, there is often a lack of sufficient knowledge of potential failure modes and/or a lack of sufficient experience to identify all failure modes. This gives rise to a situation in which engineers are unable to extract maximum benefit from the above procedures. This work describes a function-based failure identification methodology, which would act as a storehouse of information and experience, providing useful information about the potential failure modes for the design under consideration, as well as enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.

  15. Availability Estimate of a Conceptual ESM System.

    DTIC Science & Technology

    1979-06-01

    affect mission operation. A functional block level failure modes and effects analysis ( FMEA ) performed on the filter resulted in an assessed failure rate...is based on an FMEA of failures that disable the function (see Appendix A). A further examination of the filter piece-parts reveals that the driver...Digital-to-analog converter DC Direct current DF Direction finding ESM Electronic Support Measures FMEA Failure modes and effects analysis FMPO

  16. To the systematization of failure analysis for perturbed systems (in German)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haller, U.

    1974-01-01

    The paper investigates the reliable functioning of complex technical systems. Of main importance is the question of how the functioning of technical systems that may fail, or whose design still contains faults, can be determined at the very earliest planning stages. The present paper develops a functioning schedule and looks for possible methods of systematic failure analysis of systems with stochastic failures. (RW/AK)

  17. Quantitative modeling of failure propagation in intelligent transportation systems.

    DOT National Transportation Integrated Search

    2014-08-01

    Unmanned vehicles are projected to reach consumer use within this decade - related legislation has already passed in California. The most significant technical challenge associated with these vehicles is their integration in transportation environm...

  18. Absolute and Functional Iron Deficiency Is a Common Finding in Patients With Heart Failure and After Heart Transplantation.

    PubMed

    Przybylowski, P; Wasilewski, G; Golabek, K; Bachorzewska-Gajewska, H; Dobrzycki, S; Koc-Zorawska, E; Malyszko, J

    2016-01-01

    Anemia is relatively common in patients with heart failure and in heart transplant recipients. Both absolute and functional iron deficiency may contribute to anemia in these populations. Functional iron deficiency (defined as ferritin greater than 200 ng/mL with transferrin saturation, TSAT, less than 20%) is characterized by the presence of adequate iron stores as defined by conventional criteria, but with insufficient iron mobilization to adequately support erythropoiesis. The aim of this study was to determine the prevalence of absolute and functional iron deficiency in patients with heart failure (n = 269) and after heart transplantation (n = 130) and their relation to parameters of iron status and inflammation. Iron status, complete blood count, and creatinine levels were assessed using standard laboratory methods. C-reactive protein, hepcidin, and hemojuvelin were measured using commercially available kits. Absolute iron deficiency was present in 15% of patients with heart failure and 30% of heart transplant recipients, whereas functional iron deficiency was present in 18% of patients with heart failure and 17% of heart transplant recipients. Functional iron deficiency was associated with significantly higher C-reactive protein and hepcidin levels in heart failure patients, and with higher hepcidin and lower estimated glomerular filtration rates in heart transplant recipients. The prevalence of anemia (according to the World Health Organization definition) was significantly higher in heart transplant recipients (40% vs 22%, P < .001), who were also younger but had worse kidney function than the patients with heart failure. Both absolute and functional iron deficiency were present in a considerable proportion of patients. This population should be carefully screened for possible reversible causes of inflammation.
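The abstract's definition of functional iron deficiency reduces to a two-threshold test; a minimal sketch using only the thresholds stated above (absolute deficiency is not defined in the abstract, so it is deliberately not implemented):

```python
def is_functional_iron_deficiency(ferritin_ng_ml, tsat_percent):
    """Functional iron deficiency per the abstract's definition:
    adequate stores (ferritin > 200 ng/mL) combined with impaired
    mobilization (transferrin saturation, TSAT, < 20%)."""
    return ferritin_ng_ml > 200 and tsat_percent < 20
```

For example, a patient with ferritin 350 ng/mL and TSAT 15% meets the definition, whereas the same ferritin with TSAT 25% does not; low ferritin falls outside this category regardless of TSAT.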

  19. [Cardiac failure in endocrine diseases].

    PubMed

    Hashizume, K

    1993-05-01

    Several endocrine diseases produce symptoms of cardiac failure. Among them, patients with acromegaly show a specific cardiomyopathy that results in severe left-sided cardiac failure. Hypoparathyroidism also induces cardiac failure, resulting from hypocalcemia and low levels of serum parathyroid hormone. In hypothyroidism, patients with myxedema coma show severe cardiac failure, characterized by disturbances of the central nervous system, renal function, and cardiac function. In patients with thyroid crisis (storm), cardiac failure results from a great reduction of cardiac output together with dehydration. The reduction of circulating volume observed in patients with pheochromocytoma easily induces cardiac failure (shock) just after removal of the adrenal tumor. In patients with malignant carcinoid syndrome, right-sided ventricular failure, which may occur through the actions of biogenic amines, is observed.

  20. A failure management prototype: DR/Rx

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Baker, Carolyn G.; Kelly, Christine M.; Marsh, Christopher A.

    1991-01-01

    This failure management prototype performs failure diagnosis and recovery management of hierarchical, distributed systems. The prototype, which evolved from a series of previous prototypes following a spiral model for development, focuses on two functions: (1) the diagnostic reasoner (DR) performs integrated failure diagnosis in distributed systems; and (2) the recovery expert (Rx) develops plans to recover from the failure. Issues related to expert system prototype design and the previous history of this prototype are discussed. The architecture of the current prototype is described in terms of the knowledge representation and functionality of its components.

  1. Advances in functional brain imaging technology and developmental neuro-psychology: their applications in the Jungian analytic domain.

    PubMed

    Petchkovsky, Leon

    2017-06-01

    Analytical psychology shares with many other psychotherapies the important task of repairing the consequences of developmental trauma. The majority of analytic patients come from compromised early developmental backgrounds: they may have experienced neglect, abuse, or failures of empathic resonance from their carers. Functional brain imagery techniques including Quantitative Electroencephalogram (QEEG), and functional Magnetic Resonance Imagery (fMRI), allow us to track mental processes in ways beyond verbal reportage and introspection. This independent perspective is useful for developing new psychodynamic hypotheses, testing current ones, providing diagnostic markers, and monitoring treatment progress. Jung, with the Word Association Test, grasped these principles 100 years ago. Brain imaging techniques have contributed to powerful recent advances in our understanding of neurodevelopmental processes in the first three years of life. If adequate nurturance is compromised, a range of difficulties may emerge. This has important implications for how we understand and treat our psychotherapy clients. The paper provides an overview of functional brain imaging and advances in developmental neuropsychology, and looks at applications of some of these findings (including neurofeedback) in the Jungian psychotherapy domain.

  2. Fault Damage Zone Permeability in Crystalline Rocks from Combined Field and Laboratory Measurements

    NASA Astrophysics Data System (ADS)

    Mitchell, T.; Faulkner, D.

    2008-12-01

    In nature, permeability is enhanced in the damage zone of faults, where fracturing occurs on a wide range of scales. Here we analyze the contribution of microfracture damage on the permeability of faults that cut through low porosity, crystalline rocks by combining field and laboratory measurements. Microfracture densities surrounding strike-slip faults with well-constrained displacements ranging over 3 orders of magnitude (~0.12 m - 5000 m) have been analyzed. The faults studied are excellently exposed within the Atacama Fault Zone, where exhumation from 6-10 km has occurred. Microfractures in the form of fluid inclusion planes (FIPs) show a log-linear decrease in fracture density with perpendicular distance from the fault core. Damage zone widths defined by the density of FIPs scale with fault displacement, and an empirical relationship for microfracture density distribution throughout the damage zone with displacement is derived. Damage zone rocks will have experienced differential stresses that were less than, but some proportion of, the failure stress. As such, permeability data from progressively loaded, initially intact laboratory samples, in the pre-failure region provide useful insights into fluid flow properties of various parts of the damage zone. The permeability evolution of initially intact crystalline rocks under increasing differential load leading to macroscopic failure was determined at water pore pressures of 50 MPa and effective pressure of 10 MPa. Permeability is seen to increase by up to, and over, two orders of magnitude prior to macroscopic failure. Further experiments were stopped at various points in the loading history in order to correlate microfracture density within the samples with permeability. 
By combining the empirical relationships determined from both quantitative fieldwork and experiments, we present a model that allows the microfracture permeability distribution throughout the damage zone to be determined as a function of increasing fault displacement.

  3. Effects of enhanced external counterpulsation on skeletal muscle gene expression in patients with severe heart failure.

    PubMed

    Melin, Michael; Montelius, Andreas; Rydén, Lars; Gonon, Adrian; Hagerman, Inger; Rullman, Eric

    2018-01-01

    Enhanced external counterpulsation (EECP) is a non-invasive treatment in which leg cuff compressions increase diastolic aortic pressure and coronary perfusion. EECP is offered to patients with refractory angina pectoris and increases physical capacity. Benefits in heart failure patients have been noted, but EECP is still considered to be experimental and its effects must be confirmed. The mechanism of action is still unclear. The aim of this study was to evaluate the effect of EECP on skeletal muscle gene expression and physical performance in patients with severe heart failure. Patients (n = 9) in NYHA class III-IV despite pharmacological therapy were subjected to 35 h of EECP over 7 weeks. Before and after, vastus lateralis muscle biopsies were obtained, and functional capacity was evaluated with a 6-min walk test. Skeletal muscle gene expression was evaluated using Affymetrix Hugene 1.0 arrays. Maximum walking distance increased by 15%, which is on par with that achieved after aerobic exercise training in similar patients. Skeletal muscle gene expression analysis using Ingenuity Pathway Analysis showed an increased expression of two networks of genes with FGF-2 and IGF-1 as central regulators. The increase in gene expression was quantitatively small, and no overlap with gene expression profiles after exercise training could be detected despite adequate statistical power. EECP treatment leads to a robust improvement in walking distance in patients with severe heart failure and does induce a skeletal muscle transcriptional response, but this response is small and shows no significant overlap with the transcriptional signature seen after exercise training.

  4. Development of a calibrated software reliability model for flight and supporting ground software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1991-01-01

    The object of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study consisted of SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software). There are ten models in SMERFS. For a first run, the results obtained in modeling the cumulative number of failures versus execution time showed fairly good results for our data. Plots of cumulative software failures versus calendar weeks were made and the model results were compared with the historical data on the same graph. If the model agrees with actual historical behavior for a set of data then there is confidence in future predictions for this data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases. It is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model assumptions.
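The cumulative-failures-versus-execution-time fits described above can be sketched with one model family of the kind SMERFS includes, the Goel-Okumoto nonhomogeneous Poisson process, in which expected cumulative failures follow mu(t) = a(1 - e^(-bt)). The weekly data, parameter values, and tolerances below are synthetic illustrations, not figures from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def cumulative_failures(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative failures by time t."""
    return a * (1.0 - np.exp(-b * t))

# Synthetic cumulative weekly failure counts generated from assumed parameters
# (a = 120 total latent faults, b = 0.05 per-week detection rate) plus noise.
weeks = np.arange(1, 41)
rng = np.random.default_rng(0)
true_a, true_b = 120.0, 0.05
observed = cumulative_failures(weeks, true_a, true_b) + rng.normal(0.0, 2.0, weeks.size)

# Calibrate the model against the "historical" data, then estimate the number
# of errors remaining in the software, as described in the abstract.
(a_hat, b_hat), _ = curve_fit(cumulative_failures, weeks, observed, p0=(100.0, 0.1))
remaining = a_hat - cumulative_failures(weeks[-1], a_hat, b_hat)
print(round(a_hat), round(b_hat, 3), round(remaining))
```

The remaining-fault estimate is simply the fitted asymptote minus the failures predicted to have surfaced by the last observed week.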

  5. Longitudinal Evaluation of Fatty Acid Metabolism in Normal and Spontaneously Hypertensive Rat Hearts with Dynamic MicroSPECT Imaging

    DOE PAGES

    Reutter, Bryan W.; Huesman, Ronald H.; Brennan, Kathleen M.; ...

    2011-01-01

The goal of this project is to develop radionuclide molecular imaging technologies using a clinical pinhole SPECT/CT scanner to quantify changes in cardiac metabolism using the spontaneously hypertensive rat (SHR) as a model of hypertensive-related pathophysiology. This paper quantitatively compares fatty acid metabolism in hearts of SHR and Wistar-Kyoto normal rats as a function of age and thereby tracks physiological changes associated with the onset and progression of heart failure in the SHR model. The fatty acid analog, 123I-labeled BMIPP, was used in longitudinal metabolic pinhole SPECT imaging studies performed every seven months for 21 months. The uniqueness of this project is the development of techniques for estimating the blood input function from projection data acquired by a slowly rotating camera that is imaging fast circulation and the quantification of the kinetics of 123I-BMIPP by fitting compartmental models to the blood and tissue time-activity curves.

  6. Right ventricular performance and mass by use of cine MRI late after atrial repair of transposition of the great arteries.

    PubMed

    Lorenz, C H; Walker, E S; Graham, T P; Powers, T A

    1995-11-01

    The long-term adaptation of the right ventricle after atrial repair of transposition of the great arteries (TGA) remains a subject of major concern. Cine magnetic resonance imaging (MRI), with its tomographic capabilities, allows unique quantitative evaluation of both right and left ventricular function and mass. Our purpose was to use MRI and an age-matched normal population to examine the typical late adaptation of the right and left ventricles after atrial repair of TGA. Cine MRI was used to study ventricular function and mass in 22 patients after atrial repair of TGA. Images were obtained in short-axis sections from base to apex to derive normalized right and left ventricular mass (RVM and LVM, g/m2), interventricular septal mass (IVSM, g/m2), RV and LV end-diastolic volumes (EDV, mL/m2), and ejection fractions (EF). Results 8 to 23 years after repair were compared with analysis of 24 age- and sex-matched normal volunteers and revealed markedly elevated RVM, decreased LVM and IVSM, normal RV size, and only mildly depressed RVEF. Only 1 of 22 patients had clinical RV dysfunction, and this patient had increased RVM. Cine MRI allows quantitative evaluation of both RV and LV mass and function late after atrial repair of TGA. Longitudinal studies that include these measurements should prove useful in determining the mechanism of late RV failure in these patients. On the basis of these early data, inadequate hypertrophy does not appear to be the cause of late dysfunction in this patient group.

  7. ACCELERATED FAILURE TIME MODELS PROVIDE A USEFUL STATISTICAL FRAMEWORK FOR AGING RESEARCH

    PubMed Central

    Swindell, William R.

    2009-01-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model “deceleration factor”. AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data. PMID:19007875
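The AFT "deceleration factor" described above can be illustrated with a minimal simulation (the lifespans and the factor of 1.3 below are invented for illustration): under a lognormal AFT model, a treatment multiplies every survival-time quantile by the same factor, which is recoverable as the exponential of the difference in mean log survival times.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated mouse lifespans (days): the treated group's lifespans are the same
# lognormal distribution stretched by an assumed deceleration factor of 1.3,
# i.e. a multiplicative, age-independent effect on survivorship.
theta = 1.3
control = rng.lognormal(mean=np.log(800.0), sigma=0.25, size=500)
treated = theta * rng.lognormal(mean=np.log(800.0), sigma=0.25, size=500)

# For a lognormal AFT model, regressing log-lifespan on group membership
# reduces to a difference of mean log times; exponentiate to get theta-hat.
theta_hat = np.exp(np.log(treated).mean() - np.log(control).mean())
print(round(theta_hat, 2))
```

Unlike a hazard ratio, theta_hat reads directly as "treated animals live about 1.3 times as long," which is the intuitive interpretation the abstract highlights.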

  8. Accelerated failure time models provide a useful statistical framework for aging research.

    PubMed

    Swindell, William R

    2009-03-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model "deceleration factor". AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data.

  9. Loss-of-function DNA sequence variant in the CLCNKA chloride channel implicates the cardio-renal axis in interindividual heart failure risk variation.

    PubMed

    Cappola, Thomas P; Matkovich, Scot J; Wang, Wei; van Booven, Derek; Li, Mingyao; Wang, Xuexia; Qu, Liming; Sweitzer, Nancy K; Fang, James C; Reilly, Muredach P; Hakonarson, Hakon; Nerbonne, Jeanne M; Dorn, Gerald W

    2011-02-08

Common heart failure has a strong undefined heritable component. Two recent independent cardiovascular SNP array studies identified a common SNP at 1p36 in intron 2 of the HSPB7 gene as being associated with heart failure. HSPB7 resequencing identified other risk alleles but no functional gene variants. Here, we further show no effect of the HSPB7 SNP on cardiac HSPB7 mRNA levels or splicing, suggesting that the SNP marks the position of a functional variant in another gene. Accordingly, we used massively parallel platforms to resequence all coding exons of the adjacent CLCNKA gene, which encodes the K(a) renal chloride channel (ClC-K(a)). Of 51 exonic CLCNKA variants identified, one SNP (rs10927887, encoding Arg83Gly) was common, in linkage disequilibrium with the heart failure risk SNP in HSPB7, and associated with heart failure in two independent Caucasian referral populations (n = 2,606 and 1,168; combined P = 2.25 × 10⁻⁶). Individual genotyping of rs10927887 in the two study populations and a third independent heart failure cohort (combined n = 5,489) revealed an additive allele effect on heart failure risk that is independent of age, sex, and prior hypertension (odds ratio = 1.27 per allele copy; P = 8.3 × 10⁻⁷). Functional characterization of recombinant wild-type Arg83 and variant Gly83 ClC-K(a) chloride channel currents revealed ≈ 50% loss-of-function of the variant channel. These findings identify a common, functionally significant genetic risk factor for Caucasian heart failure. The variant CLCNKA risk allele, telegraphed by linked variants in the adjacent HSPB7 gene, uncovers a previously overlooked genetic mechanism affecting the cardio-renal axis.
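An additive allele effect means the log-odds of heart failure rise linearly with risk-allele count, so the odds ratio for a genotype is the per-allele odds ratio (1.27 in the abstract) raised to the number of copies. The helper below is a sketch of that arithmetic, not code from the study.

```python
def odds_ratio_for_genotype(per_allele_or, copies):
    """Odds ratio relative to zero risk-allele copies under a log-additive model."""
    return per_allele_or ** copies

# A per-allele OR of 1.27 implies roughly 1.61 for homozygous risk-allele carriers.
print(odds_ratio_for_genotype(1.27, 2))
```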

  10. Reduced Gray Matter Volume Is Associated With Poorer Instrumental Activities of Daily Living Performance in Heart Failure.

    PubMed

    Alosco, Michael L; Brickman, Adam M; Spitznagel, Mary Beth; Narkhede, Atul; Griffith, Erica Y; Cohen, Ronald; Sweet, Lawrence H; Josephson, Richard; Hughes, Joel; Gunstad, John

    2016-01-01

    Heart failure patients require assistance with instrumental activities of daily living in part because of the high rates of cognitive impairment in this population. Structural brain insult (eg, reduced gray matter volume) is theorized to underlie cognitive dysfunction in heart failure, although no study has examined the association among gray matter, cognition, and instrumental activities of daily living in heart failure. The aim of this study was to investigate the associations among gray matter volume, cognitive function, and functional ability in heart failure. A total of 81 heart failure patients completed a cognitive test battery and the Lawton-Brody self-report questionnaire to assess instrumental activities of daily living. Participants underwent magnetic resonance imaging to quantify total gray matter and subcortical gray matter volume. Impairments in instrumental activities of daily living were common in this sample of HF patients. Regression analyses controlling for demographic and medical confounders showed that smaller total gray matter volume predicted decreased scores on the instrumental activities of daily living composite, with specific associations noted for medication management and independence in driving. Interaction analyses showed that reduced total gray matter volume interacted with worse attention/executive function and memory to negatively impact instrumental activities of daily living. Smaller gray matter volume is associated with greater impairment in instrumental activities of daily living in persons with heart failure, possibly via cognitive dysfunction. Prospective studies are needed to clarify the utility of clinical correlates of gray matter volume (eg, cognitive dysfunction) in identifying heart failure patients at risk for functional decline and determine whether interventions that target improved brain and cognitive function can preserve functional independence in this high-risk population.

  11. Effectiveness of Quantitative Real Time PCR in Long-Term Follow-up of Chronic Myeloid Leukemia Patients.

    PubMed

    Savasoglu, Kaan; Payzin, Kadriye Bahriye; Ozdemirkiran, Fusun; Berber, Belgin

    2015-08-01

To determine the use of the Quantitative Real Time PCR (RQ-PCR) assay in follow-up of Chronic Myeloid Leukemia (CML) patients. Cross-sectional observational. Izmir Ataturk Education and Research Hospital, Izmir, Turkey, from 2009 to 2013. Cytogenetic, FISH, and RQ-PCR test results from the materials of 177 CML patients, collected between 2009 and 2013, were compiled for comparison analysis. Statistical analysis was performed to compare the FISH, karyotype, and RQ-PCR results of the patients. Karyotyping and FISH specificity and sensitivity rates were determined by ROC analysis against RQ-PCR results. The chi-square test was used to compare test failure rates. Sensitivity and specificity values were 17.6 - 98% for karyotyping (p=0.118, p > 0.05) and 22.5 - 96% for FISH (p=0.064, p > 0.05), respectively. FISH sensitivity was slightly higher than that of karyotyping, and a strong correlation was found between the two (p < 0.001). The RQ-PCR test failure rate did not correlate with those of the other two tests (p > 0.05); however, the correlation between the karyotyping and FISH test failure rates was statistically significant (p < 0.001). Apart from situations that require karyotype analysis, the RQ-PCR assay can be used alone in the follow-up of CML disease.
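Sensitivity and specificity of karyotyping against an RQ-PCR reference reduce to simple ratios of a 2×2 confusion table. The counts below are hypothetical, chosen only so that the ratios reproduce the 17.6% / 98% figures reported above.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 9 of 51 reference-positive cases also detected by
# karyotyping; 98 of 100 reference-negative cases also negative by karyotyping.
sensitivity, specificity = sens_spec(tp=9, fn=42, tn=98, fp=2)
print(round(sensitivity, 3), specificity)
```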

  12. Use of Modal Acoustic Emission to Monitor Damage Progression in Carbon Fiber/Epoxy Tows and Implications for Composite Structures

    NASA Technical Reports Server (NTRS)

    Waller, Jess M.; Saulsberry, Regor L.; Nichols, Charles T.; Wentzel, Daniel J.

    2010-01-01

This slide presentation reviews the use of modal acoustic emission to monitor damage progression in carbon fiber/epoxy tows. There is a risk of catastrophic failure of composite overwrapped pressure vessels (COPVs) due to burst-before-leak (BBL) stress rupture (SR) failure of carbon-epoxy (C/Ep) COPVs. A lack of quantitative nondestructive evaluation (NDE) is causing problems in current and future spacecraft designs. It is therefore important to develop and demonstrate critical NDE that can be implemented during stages of the design process, since the observed rupture can occur with little or no advance warning. A program was therefore required to develop quantitative acoustic emission (AE) procedures specific to C/Ep overwraps, which also have utility for monitoring damage accumulation in composite structures in general, and to lay the groundwork for establishing critical thresholds for accumulated damage in composite structures, such as COPVs, so that precautionary or preemptive engineering steps can be implemented to minimize or obviate the risk of catastrophic failure. A computed Felicity Ratio (FR) coupled with fast Fourier transform (FFT) frequency analysis shows promise as an analytical pass/fail criterion. The FR analysis and the waveform and FFT analyses are reviewed.

  13. Mode I Failure of Armor Ceramics: Experiments and Modeling

    NASA Astrophysics Data System (ADS)

    Meredith, Christopher; Leavy, Brian

    2017-06-01

The pre-notched edge-on impact (EOI) experiment is a technique for benchmarking the damage and fracture of ceramics subjected to projectile impact. A cylindrical projectile impacts the edge of a thin rectangular plate with a pre-notch on the opposite edge. Tension is generated at the notch tip, resulting in the initiation and propagation of a mode I crack back toward the impact edge. The crack can be quantitatively measured using an optical method called Digital Gradient Sensing, which measures the crack-tip deformation by simultaneously quantifying two orthogonal surface slopes via small deflections of light rays from a specularly reflective surface around the crack. The deflections in ceramics are small, so the high-speed camera needs a very high pixel count. This work reports results from pre-notched EOI experiments on SiC and B4C plates. The experimental data are quantitatively compared to impact simulations using an advanced continuum damage model: the Kayenta ceramic model in Alegra will be used to compare fracture propagation speeds, bifurcations, and inhomogeneous initiation of failure. This will provide insight into the driving mechanisms required for the macroscale failure modeling of ceramics.

  14. The calibration of photographic and spectroscopic films. 1: Film batch variations of reciprocity failure in IIaO film. 2: Thermal and aging effects in relationship to reciprocity failure. 3: Shifting of reciprocity failure points as a function of thermal and aging effects

    NASA Technical Reports Server (NTRS)

    Peters, K. A.; Atkinson, P. F.; Hammond, E. C., Jr.

    1986-01-01

Reciprocity failure was examined for IIaO spectroscopic film. Three separate experiments were performed in order to study film batch variations, thermal and aging effects in relationship to reciprocity failure, and shifting of reciprocity failure points as a function of thermal and aging effects. Failure was examined over exposure times between 5 and 60 seconds. The variation in illuminance was obtained by using thirty neutral density filters. A standard sensitometer device imprinted the wedge pattern on the film as exposure time was varied. The results indicate that film batch differences, temperature, and aging play an important role in the reciprocity failure of IIaO spectroscopic film. A shifting of the failure points was also observed in various batches of film.

  15. Pathophysiological relationships between heart failure and depression and anxiety.

    PubMed

    Chapa, Deborah W; Akintade, Bimbola; Son, Heesook; Woltz, Patricia; Hunt, Dennis; Friedmann, Erika; Hartung, Mary Kay; Thomas, Sue Ann

    2014-04-01

    Depression and anxiety are common comorbid conditions in patients with heart failure. Patients with heart failure and depression have increased mortality. The association of anxiety with increased mortality in patients with heart failure is not established. The purpose of this article is to illustrate the similarities of the underlying pathophysiology of heart failure, depression, and anxiety by using the Biopsychosocial Holistic Model of Cardiovascular Health. Depression and anxiety affect biological processes of cardiovascular function in patients with heart failure by altering neurohormonal function via activation of the hypothalamic-pituitary-adrenal axis, autonomic dysregulation, and activation of cytokine cascades and platelets. Patients with heart failure and depression or anxiety may exhibit a continued cycle of heart failure progression, increased depression, and increased anxiety. Understanding the underlying pathophysiological relationships in patients with heart failure who experience comorbid depression and/or anxiety is critical in order to implement appropriate treatments, educate patients and caregivers, and educate other health professionals.

  16. Contact thermal shock test of ceramics

    NASA Technical Reports Server (NTRS)

    Rogers, W. P.; Emery, A. F.

    1992-01-01

    A novel quantitative thermal shock test of ceramics is described. The technique employs contact between a metal-cooling rod and hot disk-shaped specimen. In contrast with traditional techniques, the well-defined thermal boundary condition allows for accurate analyses of heat transfer, stress, and fracture. Uniform equibiaxial tensile stresses are induced in the center of the test specimen. Transient specimen temperature and acoustic emission are monitored continuously during the thermal stress cycle. The technique is demonstrated with soda-lime glass specimens. Experimental results are compared with theoretical predictions based on a finite-element method thermal stress analysis combined with a statistical model of fracture. Material strength parameters are determined using concentric ring flexure tests. Good agreement is found between experimental results and theoretical predictions of failure probability as a function of time and initial specimen temperature.
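A statistical model of fracture of the kind combined with the finite-element stress analysis above is typically a two-parameter Weibull distribution, in which the failure probability at an equibiaxial tensile stress sigma is P_f = 1 - exp(-(sigma/sigma0)^m). The sigma0 and m values below are placeholders for illustration, not the strength parameters measured in the study's concentric ring flexure tests.

```python
import math

def weibull_failure_probability(stress_mpa, sigma0_mpa, m):
    """Two-parameter Weibull model of brittle fracture: P_f = 1 - exp(-(sigma/sigma0)^m)."""
    return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

# Placeholder strength parameters (sigma0 = 100 MPa, Weibull modulus m = 7):
# at the characteristic strength, failure probability is 1 - 1/e, about 0.632.
p_at_sigma0 = weibull_failure_probability(100.0, 100.0, 7.0)
p_at_half = weibull_failure_probability(50.0, 100.0, 7.0)
print(round(p_at_sigma0, 3), p_at_half < p_at_sigma0)
```

A higher Weibull modulus m means less scatter in strength, so failure probability rises more steeply near sigma0.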

  17. The role of flexible polymer interconnects in chronic tissue response induced by intracortical microelectrodes--a modeling and an in vivo study.

    PubMed

    Subbaroyan, Jeyakumar; Kipke, Daryl R

    2006-01-01

Chronic tissue response induced by tethering is one of the major causes of implant failure in intracortical microelectrodes. In this study, we explored the hypothesis that flexible interconnects could provide strain relief against forces of "micromotion" and hence could help maintain healthy tissue surrounding the implant. Finite element modeling results indicated that flexible interconnects, namely polyimide (E=2 GPa) and polydimethylsiloxane (PDMS, E=6 MPa), reduced the interfacial strain by 66% and two orders of magnitude, respectively. Quantitative immunohistochemistry results indicated that significant neuronal loss occurred up to 60 μm from the implant interface. This was strongly correlated to both glial fibrillary acidic protein (GFAP) expression and simulated strain as a function of distance away from the implant.

  18. Systems for lung volume standardization during static and dynamic MDCT-based quantitative assessment of pulmonary structure and function.

    PubMed

    Fuld, Matthew K; Grout, Randall W; Guo, Junfeng; Morgan, John H; Hoffman, Eric A

    2012-08-01

    Multidetector-row computed tomography (MDCT) has emerged as a tool for quantitative assessment of parenchymal destruction, air trapping (density metrics), and airway remodeling (metrics relating airway wall and lumen geometry) in chronic obstructive pulmonary disease (COPD) and asthma. Critical to the accuracy and interpretability of these MDCT-derived metrics is the assurance that the lungs are scanned during a breathhold at a standardized volume. A computer monitored turbine-based flow meter system was developed to control patient breathholds and facilitate static imaging at fixed percentages of the vital capacity. Because of calibration challenges with gas density changes during multibreath xenon CT, an alternative system was required. The design incorporated dual rolling seal pistons. Both systems were tested in a laboratory environment and human subject trials. The turbine-based system successfully controlled lung volumes in 32/37 subjects, having a linear relationship for CT measured air volume between repeated scans: for all scans, the mean and confidence interval of the differences (scan1-scan2) was -9 mL (-169, 151); for total lung capacity alone 6 mL (-164, 177); for functional residual capacity alone, -23 mL (-172, 126). The dual-piston system successfully controlled lung volume in 31/41 subjects. Study failures related largely to subject noncompliance with verbal instruction and gas leaks around the mouthpiece. We demonstrate the successful use of a turbine-based system for static lung volume control and demonstrate its inadequacies for dynamic xenon CT studies. Implementation of a dual-rolling seal spirometer has been shown to adequately control lung volume for multibreath wash-in xenon CT studies. These systems coupled with proper patient coaching provide the tools for the use of CT to quantitate regional lung structure and function. 
The wash-in xenon CT method for assessing regional lung function, although not necessarily practical for routine clinical studies, provides for a dynamic protocol against which newly emerging single breath, dual-energy xenon CT measures can be validated. Copyright © 2012 AUR. Published by Elsevier Inc. All rights reserved.

  19. Reaction Times to Consecutive Automation Failures: A Function of Working Memory and Sustained Attention.

    PubMed

    Jipp, Meike

    2016-12-01

    This study explored whether working memory and sustained attention influence cognitive lock-up, which is a delay in the response to consecutive automation failures. Previous research has demonstrated that the information that automation provides about failures and the time pressure that is associated with a task influence cognitive lock-up. Previous research has also demonstrated considerable variability in cognitive lock-up between participants. This is why individual differences might influence cognitive lock-up. The present study tested whether working memory-including flexibility in executive functioning-and sustained attention might be crucial in this regard. Eighty-five participants were asked to monitor automated aircraft functions. The experimental manipulation consisted of whether or not an initial automation failure was followed by a consecutive failure. Reaction times to the failures were recorded. Participants' working-memory and sustained-attention abilities were assessed with standardized tests. As expected, participants' reactions to consecutive failures were slower than their reactions to initial failures. In addition, working-memory and sustained-attention abilities enhanced the speed with which participants reacted to failures, more so with regard to consecutive than to initial failures. The findings highlight that operators with better working memory and sustained attention have small advantages when initial failures occur, but their advantages increase across consecutive failures. The results stress the need to consider personnel selection strategies to mitigate cognitive lock-up in general and training procedures to enhance the performance of low ability operators. © 2016, Human Factors and Ergonomics Society.

  20. Reliable Control Using Disturbance Observer and Equivalent Transfer Function for Position Servo System in Current Feedback Loop Failure

    NASA Astrophysics Data System (ADS)

    Ishikawa, Kaoru; Nakamura, Taro; Osumi, Hisashi

A reliable control method is proposed for multiple-loop control systems. After a feedback loop failure, such as a sensor breakdown, the control system becomes unstable and exhibits large fluctuations even if it has a disturbance observer. To cope with this problem, the proposed method uses an equivalent transfer function (ETF) as active redundancy compensation after the loop failure. The ETF is designed so that it does not change the transfer function of the whole system before and after the loop failure. In this paper, the characteristics of a reliable control system that uses an ETF and a disturbance observer are examined in an experiment using a DC servo motor with a current feedback loop failure in the position servo system.
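The ETF design constraint, that the overall transfer function be unchanged before and after the loop failure, can be checked in the frequency domain: if the healthy inner loop is G/(1+GH), a series compensator C = 1/(1+GH) applied once the feedback is lost reproduces it exactly. The first-order plant and feedback gain below are assumptions for illustration, not the paper's DC-motor model.

```python
def G(s):
    """Assumed plant: first-order lag (illustrative, not the paper's motor model)."""
    return 10.0 / (s + 2.0)

def H(s):
    """Assumed current-feedback gain."""
    return 0.5

def closed_loop(s):
    """Healthy inner loop before the sensor failure: G / (1 + G*H)."""
    return G(s) / (1.0 + G(s) * H(s))

def etf(s):
    """Equivalent transfer function inserted in series after the loop fails."""
    return 1.0 / (1.0 + G(s) * H(s))

# The series combination ETF*G matches the healthy closed loop at every
# frequency sampled along the imaginary axis s = jw.
errors = [abs(etf(1j * w) * G(1j * w) - closed_loop(1j * w)) for w in (0.1, 1.0, 10.0)]
print(max(errors) < 1e-12)
```

In practice the ETF must be realized from the nominal plant model, so its fidelity degrades with plant uncertainty, which is one reason the paper pairs it with a disturbance observer.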

  1. The 'aerobic/resistance/inspiratory muscle training hypothesis in heart failure'.

    PubMed

    Laoutaris, Ioannis D

    2018-01-01

    Evidence from large multicentre exercise intervention trials in heart failure patients, investigating both moderate continuous aerobic training and high intensity interval training, indicates that the 'crème de la crème' exercise programme for this population remains to be found. The 'aerobic/resistance/inspiratory (ARIS) muscle training hypothesis in heart failure' is introduced, suggesting that combined ARIS muscle training may result in maximal exercise pathophysiological and functional benefits in heart failure patients. The hypothesis is based on the decoding of the 'skeletal muscle hypothesis in heart failure' and on revision of experimental evidence to date showing that exercise and functional intolerance in heart failure patients are associated not only with reduced muscle endurance, indication for aerobic training (AT), but also with reduced muscle strength and decreased inspiratory muscle function contributing to weakness, dyspnoea, fatigue and low aerobic capacity, forming the grounds for the addition of both resistance training (RT) and inspiratory muscle training (IMT) to AT. The hypothesis will be tested by comparing all potential exercise combinations, ARIS, AT/RT, AT/IMT, AT, evaluating both functional and cardiac indices in a large sample of heart failure patients of New York Heart Association class II-III and left ventricular ejection fraction ≤35% ad hoc by the multicentre randomized clinical trial, Aerobic Resistance, InSpiratory Training OutcomeS in Heart Failure (ARISTOS-HF trial).

  2. Clinical Correlates and Prognostic Value of Proenkephalin in Acute and Chronic Heart Failure.

    PubMed

    Matsue, Yuya; Ter Maaten, Jozine M; Struck, Joachim; Metra, Marco; O'Connor, Christopher M; Ponikowski, Piotr; Teerlink, John R; Cotter, Gad; Davison, Beth; Cleland, John G; Givertz, Michael M; Bloomfield, Daniel M; Dittrich, Howard C; van Veldhuisen, Dirk J; van der Meer, Peter; Damman, Kevin; Voors, Adriaan A

    2017-03-01

Proenkephalin (pro-ENK) has emerged as a novel biomarker associated with both renal function and cardiac function. However, its clinical and prognostic value has not been well evaluated in symptomatic patients with heart failure. The association between pro-ENK and markers of renal function was evaluated in 95 patients with chronic heart failure who underwent renal hemodynamic measurements, including renal blood flow (RBF) and glomerular filtration rate (GFR) with the use of 131I-Hippuran and 125I-iothalamate clearances, respectively. The association between pro-ENK and clinical outcome in acute heart failure was assessed in another 1589 patients. Pro-ENK was strongly correlated with both RBF (P < .001) and GFR (P < .001), but not with renal tubular markers. In the acute heart failure cohort, pro-ENK was a predictor of death through 180 days, heart failure rehospitalization through 60 days, and death or cardiovascular or renal rehospitalization through day 60 in univariable analyses, but its predictive value was lost in a multivariable model when other renal markers were entered in the model. In patients with chronic and acute heart failure, pro-ENK is strongly associated with glomerular function, but not with tubular damage. Pro-ENK provides limited prognostic information in patients with acute heart failure on top of established renal markers. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Contributions of nuclear magnetic resonance to renal biochemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, B.; Freeman, D.; Chan, L.

31P NMR as a descriptive technique is of interest to nephrologists. Particular contributions of 31P NMR to our understanding of renal function may be enumerated: Free metabolite levels are different from those classically accepted; in particular, ADP and Pi are low, with implications for the control of renal metabolism and Pi transport and, via the phosphorylation potential, for Na+ transport. Renal pH is heterogeneous; between cortex, outer medulla, and papilla, and between cell and lumen, a large pH gradient exists. Also, quantitation of the pH gradient between cytosol and mitochondrion is now feasible. In acute renal failure of either ischemic or nonischemic origin, both ATP depletion and acidification of the renal cell result in damage, with increasing evidence for the importance of the latter. Measurements of renal metabolic rate in vivo suggest the existence of a prodromal phase of acute renal failure, which could lead to its detection at an earlier and possibly reversible stage. Human renal cancers show a unique 31P NMR spectrum and a very acidic environment. Cancer chemotherapy may alter this, and detection of such changes with NMR offers a method of therapeutic monitoring with significance beyond nephrology. Renal cortex and medulla have different T1 relaxation times, possibly due to differences in lipid composition. It seems that NMR spectroscopy has much to offer to the future understanding of the relationship between renal biochemistry and function. 56 references.

  4. The intratumoral balance between metabolic and immunologic gene expression is associated with anti-PD-1 response in patients with renal cell carcinoma

    PubMed Central

    Ascierto, Maria Libera; McMiller, Tracee L.; Berger, Alan E.; Danilova, Ludmila; Anders, Robert A.; Netto, George J.; Xu, Haiying; Pritchard, Theresa S.; Fan, Jinshui; Cheadle, Chris; Cope, Leslie; Drake, Charles G.; Pardoll, Drew M.; Taube, Janis M.; Topalian, Suzanne L.

    2016-01-01

    Pretreatment tumor PD-L1 expression correlates with response to anti-PD-1/PD-L1 therapies. Yet, most patients with PD-L1+ tumors do not respond to treatment. The current study was undertaken to investigate mechanisms underlying the failure of PD-1–targeted therapies in patients with advanced renal cell carcinoma (RCC) whose tumors express PD-L1. Formalin-fixed, paraffin-embedded (FFPE) pretreatment tumor biopsies expressing PD-L1 were derived from 13 RCC patients. RNA was isolated from PD-L1+ regions and subjected to whole genome microarray and multiplex quantitative (q)RT-PCR gene expression analysis. A balance between gene expression profiles reflecting metabolic pathways and immune functions was associated with clinical outcomes following anti-PD-1 therapy. In particular, the expression of genes involved in metabolic and solute transport functions such as UGT1A family members, also found in kidney cancer cell lines, was associated with treatment failure in patients with PD-L1+ RCC. Conversely, tumors from responding patients overexpressed immune markers such as BACH2, a regulator of CD4+ T cell differentiation, and CCL3, involved in leukocyte migration. These findings suggest that tumor cell–intrinsic metabolic factors may contribute to treatment resistance in RCC, thus serving as predictive markers for treatment outcomes and potential new targets for combination therapy regimens with anti-PD-1. PMID:27491898

  5. Future remnant liver function as predictive factor for the hypertrophy response after portal vein embolization.

    PubMed

    Cieslak, Kasia P; Huisman, Floor; Bais, Thomas; Bennink, Roelof J; van Lienden, Krijn P; Verheij, Joanne; Besselink, Marc G; Busch, Olivier R C; van Gulik, Thomas M

    2017-07-01

    Preoperative portal vein embolization is widely used to increase the future remnant liver. Identification of nonresponders to portal vein embolization is essential because these patients may benefit from associating liver partition and portal vein ligation for staged hepatectomy (ALPPS), which induces a more powerful hypertrophy response. 99mTc-mebrofenin hepatobiliary scintigraphy is a quantitative method for assessment of future remnant liver function with a calculated cutoff value for the prediction of postoperative liver failure. The aim of this study was to analyze future remnant liver function before portal vein embolization to predict sufficient functional hypertrophy response after portal vein embolization. Sixty-three patients who underwent preoperative portal vein embolization and computed tomography imaging were included. Hepatobiliary scintigraphy was performed to determine pre-portal vein embolization and post-portal vein embolization future remnant liver function. Receiver operator characteristic analysis of pre-portal vein embolization future remnant liver function was performed to identify patients who would meet the post-portal vein embolization cutoff value for sufficient function (ie, 2.7%/min/m2). Mean pre-portal vein embolization future remnant liver function was 1.80% ± 0.45%/min/m2 and increased to 2.89% ± 0.97%/min/m2 post-portal vein embolization. Receiver operator characteristic analysis in 33 patients who did not receive chemotherapy revealed that a pre-portal vein embolization future remnant liver function of ≥1.72%/min/m2 was able to identify patients who would meet the safe future remnant liver function cutoff value 3 weeks after portal vein embolization (area under the curve = 0.820). The predictive value was less pronounced in 30 patients treated with neoadjuvant chemotherapy (area under the curve = 0.618). 
A total of 45 of 63 patients underwent liver resection, of whom 5 of 45 developed postoperative liver failure; 4 of 5 patients had a post-portal vein embolization future remnant liver function below the cutoff value for safe resection. When selecting patients for portal vein embolization, future remnant liver function assessed with hepatobiliary scintigraphy can be used as a predictor of insufficient functional hypertrophy after portal vein embolization, especially in nonchemotherapy patients. These patients are potential candidates for ALPPS. Copyright © 2017 Elsevier Inc. All rights reserved.
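
    The receiver operator characteristic step in this record can be sketched in a few lines. This is a minimal illustration with invented values (not the study's data), using a rank-based AUC and Youden's J to pick a cutoff:

```python
# Illustrative sketch: find a pre-embolization cutoff on future remnant liver
# function that predicts meeting the post-embolization safety threshold.
# All values below are invented for the example, not the study's data.
pre_pve = [1.2, 1.4, 1.5, 1.6, 1.7, 1.75, 1.8, 1.9, 2.0, 2.1, 2.2, 2.4]
met_cutoff = [0, 0, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1]   # 1 = reached 2.7 %/min/m2

pos = [s for s, y in zip(pre_pve, met_cutoff) if y == 1]
neg = [s for s, y in zip(pre_pve, met_cutoff) if y == 0]

# Rank-based AUC: probability that a responder outranks a non-responder.
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))

def youden(threshold):
    """Youden's J = sensitivity + specificity - 1 at a candidate cutoff."""
    sens = sum(s >= threshold for s in pos) / len(pos)
    spec = sum(s < threshold for s in neg) / len(neg)
    return sens + spec - 1.0

cutoff = max(sorted(set(pre_pve)), key=youden)
```

The study's reported cutoff (≥1.72%/min/m2) and AUC come from its own patient data; with the invented values above the search simply returns whichever candidate maximizes J.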

  6. The effects of aircraft certification rules on general aviation accidents

    NASA Astrophysics Data System (ADS)

    Anderson, Carolina Lenz

    The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed methods methodology which involves both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. 
The Chi-Square test indicated that there was no significant difference in the number of accidents among the different certification categories when either Controlled Flight into Terrain or Structural Failure was listed as cause. However, there was a significant difference in the frequency of accidents with regard to Loss of Control and Engine Failure accidents. The results of the ANCOVA test indicated that there was no significant difference in the accident rate with regard to Loss of Control, Controlled Flight into Terrain, or Structural Failure accidents. There was, however, a significant difference in Engine Failure accidents between Experimental-Amateur Built and the other categories. The text mining analysis of the narrative causes of Loss of Control accidents indicated that only the Civil Air Regulations 3 category airplanes had clusters of words associated with visual flight into instrument meteorological conditions. Civil Air Regulations 3 airplanes were designed and manufactured prior to the 1960s and in most cases have not been retrofitted to take advantage of newer technologies that could help prevent Loss of Control accidents. The study indicated that General Aviation aircraft certification rules do not have a statistically significant effect on aircraft accidents except for Loss of Control and Engine Failure. According to the literature, government oversight could have become an obstacle in the implementation of safety enhancing equipment that could reduce Loss of Control accidents. Oversight should focus on ensuring that Experimental-Amateur Built aircraft owners perform a functional test that could prevent some of the Engine Failure accidents.
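
    The Chi-Square portion of such an analysis can be sketched as follows; the accident counts are invented for illustration, and the Pearson statistic is compared against the tabulated critical value:

```python
# Pearson chi-square test of accident counts across certification categories.
# The counts are hypothetical numbers for the sketch, not the study's data.
counts = {
    "Part 23":       {"Loss of Control": 120, "Engine Failure": 45},
    "CAR 3":         {"Loss of Control": 150, "Engine Failure": 60},
    "Light Sport":   {"Loss of Control": 30,  "Engine Failure": 20},
    "Exp-Am. Built": {"Loss of Control": 90,  "Engine Failure": 85},
}

rows = list(counts)
cols = ["Loss of Control", "Engine Failure"]
table = [[counts[r][c] for c in cols] for r in rows]

row_tot = [sum(r) for r in table]
col_tot = [sum(t[j] for t in table) for j in range(len(cols))]
grand = sum(row_tot)

# Chi-square statistic: sum over cells of (observed - expected)^2 / expected,
# with expected = row total * column total / grand total.
chi2 = sum(
    (table[i][j] - row_tot[i] * col_tot[j] / grand) ** 2
    / (row_tot[i] * col_tot[j] / grand)
    for i in range(len(rows))
    for j in range(len(cols))
)
dof = (len(rows) - 1) * (len(cols) - 1)
# Compare chi2 against the critical value (7.815 at alpha = 0.05, dof = 3).
```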

  7. Game-Theoretic strategies for systems of components using product-form utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; Ma, Cheng-Yu; Hausken, K.

    Many critical infrastructures are composed of multiple systems of components which are correlated so that disruptions to one may propagate to others. We consider such infrastructures with correlations characterized in two ways: (i) an aggregate failure correlation function specifies the conditional failure probability of the infrastructure given the failure of an individual system, and (ii) a pairwise correlation function between two systems specifies the failure probability of one system given the failure of the other. We formulate a game for ensuring the resilience of the infrastructure, wherein the utility functions of the provider and attacker are products of an infrastructure survival probability term and a cost term, both expressed in terms of the numbers of system components attacked and reinforced. The survival probabilities of individual systems satisfy first-order differential conditions that lead to simple Nash Equilibrium conditions. We then derive sensitivity functions that highlight the dependence of infrastructure resilience on the cost terms, correlation functions, and individual system survival probabilities. We apply these results to simplified models of distributed cloud computing and energy grid infrastructures.
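
    A toy version of such a game can be sketched as follows. The survival-probability and cost forms below are assumptions chosen for illustration, not the paper's model; the sketch searches exhaustively for pure-strategy profiles where each side's choice is its (first) best response:

```python
# Toy product-form game: the attacker picks x components to attack, the
# provider reinforces y, and each utility is a product of a survival term
# and a cost term. All functional forms and constants are assumptions.
import itertools

N = 10                        # components in the system (hypothetical)
c_attack, c_defend = 2.0, 1.5
benefit = 30.0

def p_survive(x, y):
    # Assumed survival probability: reinforcement dilutes the attack.
    return max(0.0, 1.0 - x / (N + 2.0 * y))

def u_provider(x, y):
    return p_survive(x, y) * (benefit - c_defend * y)

def u_attacker(x, y):
    return (1.0 - p_survive(x, y)) * (benefit - c_attack * x)

# Pure-strategy Nash equilibria found by checking best responses.
nash = []
for x, y in itertools.product(range(N + 1), repeat=2):
    best_x = max(range(N + 1), key=lambda a: u_attacker(a, y))
    best_y = max(range(N + 1), key=lambda d: u_provider(x, d))
    if x == best_x and y == best_y:
        nash.append((x, y))
```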

  8. Heart Failure Virtual Consultation: bridging the gap of heart failure care in the community - A mixed-methods evaluation.

    PubMed

    Gallagher, Joseph; James, Stephanie; Keane, Ciara; Fitzgerald, Annie; Travers, Bronagh; Quigley, Etain; Hecht, Christina; Zhou, Shuaiwei; Watson, Chris; Ledwidge, Mark; McDonald, Kenneth

    2017-08-01

    We undertook a mixed-methods evaluation of a Web-based conferencing service (virtual consult) between general practitioners (GPs) and cardiologists in managing patients with heart failure in the community to determine its effect on use of specialist heart failure services and acceptability to GPs. All cases from June 2015 to October 2016 were recorded using a standardized recording template, which recorded patient demographics, medical history, medications, and outcome of the virtual consult for each case. Quantitative surveys and qualitative interviewing of 17 participating GPs were also undertaken. During this time, 142 cases were discussed: 68 relating to a new diagnosis of heart failure, 53 relating to emerging deterioration in a known heart failure patient, and 21 relating to therapeutic issues. Only 17% required review in the outpatient department following the virtual consultation. GPs reported increased confidence in heart failure management, a broadening of their knowledge base, and a perception of overall better patient outcomes. These data from an initial experience with Heart Failure Virtual Consultation present a very positive impact of this strategy on the provision of heart failure care in the community and acceptability to users. Further research on the implementation and expansion of this strategy is warranted. © 2017 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.

  9. Failures and Inabilities of High School Students about Quadratic Equations and Functions

    ERIC Educational Resources Information Center

    Memnun, Dilek Sezgin; Aydin, Bünyamin; Dinç, Emre; Çoban, Merve; Sevindik, Fatma

    2015-01-01

    In this research study, it was aimed to examine failures and inabilities of eleventh grade students about quadratic equations and functions. For this purpose, these students were asked ten open-ended questions. The analysis of the answers given by the students to these questions indicated that a significant part of these students had failures and…

  10. Comprehensive reliability allocation method for CNC lathes based on cubic transformed functions of failure mode and effects analysis

    NASA Astrophysics Data System (ADS)

    Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin

    2015-03-01

    Reliability allocation of computerized numerical controlled (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. To address the problem of allocating reliability in CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponential transformed FMEA method are investigated. Subsequently, a cubic transformed function is established to overcome these limitations. Properties of the new transformed functions are discussed by considering the failure severity and the failure occurrence. Designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as examples to verify the new allocation method. Seven criteria are considered to compare the results of the new method with traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.
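
    The allocation idea can be sketched generically. The specific cubic transform from the paper is not reproduced here; the weight function, subsystem names, and FMEA scores below are all assumptions for illustration:

```python
# Generic sketch of FMEA-weighted reliability allocation. The cubic weight
# f(r) = r**3 and the subsystem scores are assumptions, not the paper's model.
system_failure_rate = 1e-4   # target failure rate for the whole lathe (1/h)

# Hypothetical subsystems with FMEA severity S and occurrence O on 1-10 scales.
subsystems = {
    "spindle":   {"S": 8, "O": 4},
    "turret":    {"S": 6, "O": 5},
    "tailstock": {"S": 4, "O": 3},
    "coolant":   {"S": 3, "O": 7},
}

def weight(S, O, smax=10.0):
    # Cubic transform of the normalized criticality score: compresses
    # low-criticality subsystems and keeps the weight in (0, 1].
    return ((S * O) / (smax * smax)) ** 3

# Invert the weights so that more critical subsystems receive a TIGHTER
# (smaller) share of the allowed system failure rate.
inv = {k: 1.0 / weight(v["S"], v["O"]) for k, v in subsystems.items()}
total = sum(inv.values())
allocation = {k: system_failure_rate * w / total for k, w in inv.items()}
```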

  11. Effect of perturbations and a meal on superior mesenteric artery flow in patients with orthostatic hypotension

    NASA Technical Reports Server (NTRS)

    Fujimura, J.; Camilleri, M.; Low, P. A.; Novak, V.; Novak, P.; Opfer-Gehrking, T. L.

    1997-01-01

    Our aims were to evaluate the role of superior mesenteric blood flow in the pathophysiology of orthostatic hypotension in patients with generalized autonomic failure. METHODS: Twelve patients with symptomatic neurogenic orthostatic hypotension and 12 healthy controls underwent superior mesenteric artery flow measurements using Doppler ultrasonography during head-up tilt and tilt plus meal ingestion. Autonomic failure was assessed using standard tests of sympathetic adrenergic, cardiovagal, and postganglionic sympathetic sudomotor function. RESULTS: Superior mesenteric flow volume and time-averaged velocity were similar in patients and controls at supine rest; however, responses to the cold pressor test and upright tilt were attenuated (p < 0.05) in patients compared to controls. Head-up tilt after the meal evoked a profound fall of blood pressure and mesenteric blood flow in the patients; the reduction of mesenteric blood flow correlated (r = 0.89) with the fall of blood pressure in these patients, providing another manifestation of failed baroreflexes. We make the novel finding that the severity of postprandial orthostatic hypotension regressed negatively with the postprandial increase in mesenteric flow in patients with orthostatic hypotension. CONCLUSION: Mesenteric flow is under baroreflex control, which, when defective, results in or worsens orthostatic hypotension. Its large size and baroreflexivity render it quantitatively important in the maintenance of postural normotension. The effects of orthostatic stress can be significantly attenuated by reducing the splanchnic-mesenteric volume increase in response to food. Evaluation of mesenteric flow in response to eating and head-up tilt provides important information on intra-abdominal sympathetic adrenergic function and the ability of the patient to cope with orthostatic stress.

  12. Uterine Dysfunction in Biglycan and Decorin Deficient Mice Leads to Dystocia during Parturition

    PubMed Central

    Wu, Zhiping; Aron, Abraham W.; Macksoud, Elyse E.; Iozzo, Renato V.; Hai, Chi-Ming; Lechner, Beatrice E.

    2012-01-01

    Cesarean birth rates are rising. Uterine dysfunction, the exact mechanism of which is unknown, is a common indication for Cesarean delivery. Biglycan and decorin are two small leucine-rich proteoglycans expressed in the extracellular matrix of reproductive tissues and muscle. Mice deficient in biglycan display a mild muscular dystrophy, and, along with mice deficient in decorin, are models of Ehlers-Danlos Syndrome, a connective tissue anomaly associated with uterine rupture. As a variant of Ehlers-Danlos Syndrome is caused by a genetic mutation resulting in abnormal biglycan and decorin secretion, we hypothesized that biglycan and decorin play a role in uterine function. Thus, we assessed wild-type, biglycan, decorin and double knockout pregnancies for timing of birth and uterine function. Uteri were harvested at embryonic days 12, 15 and 18. Nonpregnant uterine samples of the same genotypes were assessed for tissue failure rate and spontaneous and oxytocin-induced contractility. We discovered that biglycan/decorin mixed double-knockout dams displayed dystocia, were at increased risk of delayed labor onset, and showed increased tissue failure in a predominantly decorin-dependent manner. In vitro spontaneous uterine contractile amplitude and oxytocin-induced contractile force were decreased in all biglycan and decorin knockout genotypes compared to wild-type. Notably, we found no significant compensation between biglycan and decorin using quantitative real time PCR or immunohistochemistry. We conclude that the biglycan/decorin mixed double knockout mouse is a model of dystocia and delayed labor onset. Moreover, decorin is necessary for uterine function in a dose-dependent manner, while biglycan exhibits partial compensatory mechanisms in vivo. Thus, this model is poised for use as a model for testing novel targets for preventive or therapeutic manipulation of uterine dysfunction. PMID:22253749

  13. Safety evaluation of driver cognitive failures and driving errors on right-turn filtering movement at signalized road intersections based on Fuzzy Cellular Automata (FCA) model.

    PubMed

    Chai, Chen; Wong, Yiik Diew; Wang, Xuesong

    2017-07-01

    This paper proposes a simulation-based approach to estimate the safety impact of driver cognitive failures and driving errors. Fuzzy Logic, which involves linguistic terms and uncertainty, is incorporated with a Cellular Automata model to simulate the decision-making process of right-turn filtering movement at signalized intersections. Simulation experiments are conducted to estimate the relationships of cognitive failures and driving errors with safety performance. Simulation results show that different types of cognitive failures have varied relationships with driving errors and safety performance. For right-turn filtering movement, cognitive failures are more likely to result in driving errors with a denser conflicting traffic stream. Moreover, different driving errors are found to have different safety impacts. The study serves to provide a novel approach to linguistically assess cognitions and replicate the decision-making procedures of the individual driver. Compared with crash analysis, the proposed FCA model allows quantitative estimation of particular cognitive failures, and of the impact of cognitions on driving errors and safety performance. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Nonlinear deformation and localized failure of bacterial streamers in creeping flows

    PubMed Central

    Biswas, Ishita; Ghosh, Ranajay; Sadrzadeh, Mohtada; Kumar, Aloke

    2016-01-01

    We investigate the failure of bacterial floc mediated streamers in a microfluidic device in a creeping flow regime using both experimental observations and analytical modeling. The quantification of streamer deformation and failure behavior is possible due to the use of 200 nm fluorescent polystyrene beads which firmly embed in the extracellular polymeric substance (EPS) and act as tracers. The streamers, which form soon after the commencement of flow, begin to deviate from an apparently quiescent fully formed state in spite of steady background flow and limited mass accretion, indicating significant mechanical nonlinearity. This nonlinear behavior shows distinct phases of deformation with mutually different characteristic times and comes to an end with a distinct localized failure of the streamer far from the walls. We investigate this deformation and failure behavior for two separate bacterial strains and develop a simplified but nonlinear analytical model describing the experimentally observed instability phenomena assuming a necking route to instability. Our model leads to a power law relation between the critical strain at failure and the fluid velocity scale, exhibiting excellent qualitative and quantitative agreement with the experimental rupture behavior. PMID:27558511
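
    A power-law relation of this kind can be recovered from data by least squares in log-log space. The velocity/strain pairs below are invented stand-ins for measurements:

```python
import math

# Fit strain = A * U**b by ordinary least squares on (log U, log strain).
# The (velocity scale, critical strain) pairs are invented for the sketch.
data = [(1.0, 0.50), (2.0, 0.36), (4.0, 0.25), (8.0, 0.18), (16.0, 0.125)]

xs = [math.log(u) for u, _ in data]
ys = [math.log(e) for _, e in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n

# Slope b is the power-law exponent; intercept gives the prefactor A.
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum(
    (x - xbar) ** 2 for x in xs
)
A = math.exp(ybar - b * xbar)
```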

  15. What Can We Learn from a Simple Physics-Based Earthquake Simulator?

    NASA Astrophysics Data System (ADS)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2018-03-01

    Physics-based earthquake simulators are becoming a popular tool to investigate the earthquake occurrence process. So far, the development of earthquake simulators is commonly led by the approach "the more physics, the better". However, this approach may hamper comprehension of the simulator's outcomes; in fact, within complex models, it may be difficult to understand which physical parameters are the most relevant to the features of the seismic catalog in which we are interested. For this reason, here we take the opposite approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple simulator may be more informative than a complex one for some specific scientific objectives, because it is more understandable. Our earthquake simulator has three main components: the first is a realistic tectonic setting, i.e., a fault data set of California; the second is the application of quantitative laws for earthquake generation on each individual fault; and the last is the modeling of fault interaction through the Coulomb Failure Function. The analysis of this simple simulator shows that: (1) short-term clustering can be reproduced by a set of faults with an almost periodic behavior, which interact according to a Coulomb failure function model; (2) a long-term behavior showing supercycles of the seismic activity exists only in a markedly deterministic framework, and quickly disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault; (3) faults that are strongly coupled in terms of the Coulomb failure function model are synchronized in time only in a markedly deterministic framework, and, as before, such synchronization disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault. 
Overall, the results show that even in a simple and perfectly known earthquake occurrence world, introducing a small degree of stochasticity may blur most of the deterministic time features, such as long-term trend and synchronization among nearby coupled faults.
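
    The simulator's three ingredients (constant tectonic loading, a failure threshold per fault, and Coulomb-style stress transfer between coupled faults) can be sketched minimally; all parameter values here are invented:

```python
import random

# Minimal sketch: faults loaded at constant rates fail at a stress threshold
# and transfer a Coulomb-style stress increment to coupled neighbors.
random.seed(1)

n_faults = 3
threshold = 1.0
load_rate = [0.010, 0.011, 0.012]   # stress accumulated per time step
coupling = 0.05                     # stress transferred on a neighbor's event
noise = 0.0                         # set > 0 to add recurrence stochasticity

stress = [random.random() * threshold for _ in range(n_faults)]
catalog = []                        # (time step, fault index) events

for t in range(5000):
    for i in range(n_faults):
        stress[i] += load_rate[i]   # tectonic loading
    for i in range(n_faults):
        if stress[i] >= threshold:
            catalog.append((t, i))
            # Reset the failed fault (noisy reset if noise > 0).
            stress[i] = max(0.0, random.gauss(0.0, noise))
            for j in range(n_faults):
                if j != i:          # static stress transfer to coupled faults
                    stress[j] += coupling
```

With `noise = 0.0` the catalog is nearly periodic per fault; raising `noise` illustrates the paper's point that a little stochasticity blurs synchronization.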

  16. On a Stochastic Failure Model under Random Shocks

    NASA Astrophysics Data System (ADS)

    Cha, Ji Hwan

    2013-02-01

    In most conventional settings, the events caused by an external shock are initiated at the moments of its occurrence. In this paper, we study a new class of shock models, where each shock from a nonhomogeneous Poisson process can trigger a failure of a system not immediately, as in classical extreme shock models, but with a delay of some random time. We derive the corresponding survival and failure rate functions. Furthermore, we study the limiting behaviour of the failure rate function where it is applicable.
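
    The delayed-shock mechanism lends itself to a short Monte Carlo sketch: shocks arrive from a nonhomogeneous Poisson process (simulated by thinning) and each triggers failure after an independent random delay. The intensity and delay distributions below are assumptions for illustration:

```python
import random

random.seed(0)

# Monte Carlo sketch of the delayed-shock model. All distributions are
# illustrative choices, not the paper's specification.
T_MAX = 10.0
LAM_MAX = 2.0   # dominating rate for thinning (>= intensity on [0, T_MAX])

def intensity(t):
    return 0.5 + 0.15 * t            # assumed increasing shock intensity

def nhpp_shocks():
    """Shock arrival times on [0, T_MAX] by thinning a rate-LAM_MAX process."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(LAM_MAX)
        if t > T_MAX:
            return times
        if random.random() < intensity(t) / LAM_MAX:
            times.append(t)

def failure_time():
    """System fails at the earliest (shock time + its random delay)."""
    candidates = [s + random.expovariate(1.0) for s in nhpp_shocks()]
    return min(candidates) if candidates else float("inf")

samples = [failure_time() for _ in range(20000)]
survival_at_2 = sum(ft > 2.0 for ft in samples) / len(samples)
```

For this choice of intensity and unit-exponential delays, the survival function at t = 2 works out analytically to exp(-0.697) ≈ 0.50, which the Monte Carlo estimate approaches.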

  17. Role and Value of Clinical Pharmacy in Heart Failure Management.

    PubMed

    Stough, W G; Patterson, J H

    2017-08-01

    Effectively managing heart failure requires a multidisciplinary, holistic approach attuned to many factors: diagnosis of structural and functional cardiac abnormalities; medication, device, or surgical management; concomitant treatment of comorbidities; physical rehabilitation; dietary considerations; and social factors. This practice paper highlights the pharmacist's role in the management of patients with heart failure, the evidence supporting their functions, and steps to ensure the pharmacist resource is available to the broad population of patients with heart failure. © 2017 American Society for Clinical Pharmacology and Therapeutics.

  18. Elasticity dominates strength and failure in metallic glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Z. Q.; Qu, R. T.; Zhang, Z. F., E-mail: zhfzhang@imr.ac.cn

    2015-01-07

    Two distinct deformation mechanisms of shearing and volume dilatation are quantitatively analyzed in metallic glasses (MGs) from the fundamental thermodynamics. Their competition is deduced to intrinsically dominate the strength and failure behaviors of MGs. Both the intrinsic shear and normal strengths give rise to the critical mechanical energies to activate destabilization of amorphous structures, under pure shearing and volume dilatation, respectively, and can be determined in terms of elastic constants. By adopting an ellipse failure criterion, the strength and failure behaviors of MGs can be precisely described just according to their shear modulus and Poisson's ratio without mechanical testing. Quantitative relations are established systematically and verified by experimental results. Accordingly, the real-sense non-destructive failure prediction can be achieved in various MGs. By highlighting the broad key significance of elasticity, a “composition-elasticity-property” scheme is further outlined for better understanding and controlling the mechanical properties of MGs and other glassy materials from the elastic perspectives.
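
    An ellipse criterion of this kind can be sketched as follows. The proportionality constants tying the intrinsic strengths to the shear modulus and Poisson's ratio are placeholders, not the paper's fitted values, and the uniaxial estimate assumes failure on the 45-degree plane for simplicity:

```python
import math

# Ellipse criterion sketch: failure when (tau/tau0)^2 + (sigma/sigma0)^2 >= 1.
# The constants below are placeholders, not the paper's fitted values.
G = 34.0e9    # shear modulus of a hypothetical metallic glass (Pa)
nu = 0.36     # Poisson's ratio

tau0 = 0.036 * G                            # assumed intrinsic shear strength
sigma0 = tau0 * (1.0 + nu) / (1.0 - nu)     # assumed normal/shear strength ratio

def fails(tau, sigma):
    """Ellipse failure criterion for a (shear, normal) stress pair."""
    return (tau / tau0) ** 2 + (sigma / sigma0) ** 2 >= 1.0

def uniaxial_strength():
    """Uniaxial tension, simplified to the 45-degree plane where
    tau = sigma = s/2: solve (s/2tau0)^2 + (s/2sigma0)^2 = 1 for s."""
    return 2.0 / math.sqrt(1.0 / tau0**2 + 1.0 / sigma0**2)
```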

  19. Time prediction of failure a type of lamps by using general composite hazard rate model

    NASA Astrophysics Data System (ADS)

    Riaman; Lesmana, E.; Subartini, B.; Supian, S.

    2018-03-01

    This paper discusses estimation of a basic survival model to obtain the average predicted value of lamp failure time. The estimate is for a parametric model, the general composite hazard rate model. The random time variable model used is the exponential distribution model, as the basis, which has a constant hazard function. In this case, we discuss an example of survival model estimation for a composite hazard function, using an exponential model as its basis. This model is estimated by estimating its parameters through the construction of the survival function and the empirical cumulative function. The resulting model is then used to predict the average failure time for this type of lamp. By grouping the data into several intervals and taking the average failure value in each interval, the average failure time of the model is calculated on each interval; the p-value obtained from the test is 0.3296.
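
    The baseline exponential (constant-hazard) step can be sketched with invented failure times:

```python
import math

# Fit a constant-hazard (exponential) survival model to lamp failure times
# and predict the mean failure time. The times (hours) are invented.
times = [310, 450, 520, 640, 700, 810, 950, 1100, 1320, 1600]

n = len(times)
mean_ttf = sum(times) / n       # MLE of the exponential mean = 1/lambda
lam = 1.0 / mean_ttf            # constant hazard rate of the base model

def survival(t):
    """Model survival function S(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

def empirical_survival(t):
    """Empirical survival: fraction of lamps still working beyond t."""
    return sum(x > t for x in times) / n
```

Comparing `survival` with `empirical_survival` on a grid of times is the basic goodness-of-fit check that the record's grouped-interval test formalizes.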

  20. Clinical models of cardiovascular regulation after weightlessness

    NASA Technical Reports Server (NTRS)

    Robertson, D.; Jacob, G.; Ertl, A.; Shannon, J.; Mosqueda-Garcia, R.; Robertson, R. M.; Biaggioni, I.

    1996-01-01

    After several days in microgravity, return to earth is attended by alterations in cardiovascular function. The mechanisms underlying these effects are inadequately understood. Three clinical disorders of autonomic function represent possible models of this abnormal cardiovascular function after spaceflight. They are pure autonomic failure, baroreflex failure, and orthostatic intolerance. In pure autonomic failure, virtually complete loss of sympathetic and parasympathetic function occurs along with profound and immediate orthostatic hypotension. In baroreflex failure, various degrees of debuffering of blood pressure occur. In acute and complete baroreflex failure, there is usually severe hypertension and tachycardia, while with less complete and more chronic baroreflex impairment, orthostatic abnormalities may be more apparent. In orthostatic intolerance, blood pressure fall is minor, but orthostatic symptoms are prominent and tachycardia frequently occurs. Only careful autonomic studies of human subjects in the microgravity environment will permit us to determine which of these models most closely reflects the pathophysiology brought on by a period of time in the microgravity environment.

  1. A global analysis approach for investigating structural resilience in urban drainage systems.

    PubMed

    Mugume, Seith N; Gomez, Diego E; Fu, Guangtao; Farmani, Raziyeh; Butler, David

    2015-09-15

    Building resilience in urban drainage systems requires consideration of a wide range of threats that contribute to urban flooding. Existing hydraulic reliability based approaches have focused on quantifying functional failure caused by extreme rainfall or increase in dry weather flows that lead to hydraulic overloading of the system. Such approaches, however, do not explore the full system failure scenario space, because they exclude crucial threats such as equipment malfunction, pipe collapse, and blockage that can also lead to urban flooding. In this research, a new analytical approach based on global resilience analysis is investigated and applied to systematically evaluate the performance of an urban drainage system when subjected to a wide range of structural failure scenarios resulting from random cumulative link failure. Link failure envelopes, which represent the resulting loss of system functionality (impacts), are determined by computing the upper and lower limits of the simulation results for total flood volume (failure magnitude) and average flood duration (failure duration) at each link failure level. A new resilience index that combines the failure magnitude and duration into a single metric is applied to quantify system residual functionality at each considered link failure level. With this approach, resilience has been tested and characterised for an existing urban drainage system in Kampala city, Uganda. In addition, the effectiveness of potential adaptation strategies in enhancing its resilience to cumulative link failure has been tested. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
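
    One simple way to combine failure magnitude and duration into a single index can be sketched as below; the functional form and numbers are assumptions for illustration, not the paper's exact formula:

```python
# Sketch of a resilience index per link-failure level. The form
# 1 - magnitude * duration is an assumed combination, and all values
# are invented for the example.
total_inflow = 1000.0    # m^3 over the simulation
sim_duration = 24.0      # h

# (links failed, total flood volume m^3, mean flood duration h) per level
levels = [(0, 0.0, 0.0), (2, 80.0, 1.5), (4, 210.0, 3.0), (6, 420.0, 6.0)]

def resilience(flood_volume, flood_duration):
    magnitude = flood_volume / total_inflow    # failure magnitude in [0, 1]
    duration = flood_duration / sim_duration   # failure duration in [0, 1]
    return 1.0 - magnitude * duration          # residual functionality

index = {n: resilience(v, d) for n, v, d in levels}
```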

  2. A Statistical Perspective on Highly Accelerated Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Edward V.

    Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. 
The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning the assumed relationship between the stress level and performance. In addition, this document contains recommendations for conducting more informative accelerated tests.« less
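    The sensitivity the abstract warns about can be illustrated with a toy calculation under an assumed inverse power-law acceleration model; the exponent values and the 1000-hour test length below are invented for illustration, not drawn from the report:

    ```python
    def acceleration_factor(stress_test, stress_use, n):
        """Assumed inverse power-law model: AF = (S_test / S_use) ** n.
        The exponent n encodes exactly the knowledge the report says is
        uncertain, so the quantitative inference hinges on it."""
        return (stress_test / stress_use) ** n

    # The same zero-failure 1000-hour test at twice the use stress "demonstrates"
    # very different equivalent use-condition hours depending on the assumed n.
    equivalent_hours = {n: 1000.0 * acceleration_factor(2.0, 1.0, n) for n in (1, 2, 4)}
    ```

    A factor-of-eight spread in "demonstrated" life from a plausible range of exponents is the statistical objection in miniature.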

  3. Micro-RNA-122 levels in acute liver failure and chronic hepatitis C.

    PubMed

    Dubin, Perry H; Yuan, Hejun; Devine, Robert K; Hynan, Linda S; Jain, Mamta K; Lee, William M

    2014-09-01

    MicroRNA-122 (miR-122) is the foremost liver-related micro-RNA, but its role in the hepatocyte is not fully understood. To evaluate whether circulating levels of miR-122 are elevated in chronic-HCV for a reason other than hepatic injury, we compared serum levels in patients with chronic hepatitis C to those in other forms of liver injury, including acute liver failure, and in healthy controls. MiR-122 was quantitated using sera from 35 acute liver failure patients (20 acetaminophen-induced, 15 other etiologies), 39 chronic-HCV patients and 12 controls. In parallel, human genomic DNA (hgDNA) levels were measured to reflect quantitatively the extent of hepatic necrosis. Additionally, six HIV-HCV co-infected patients, who achieved viral clearance after undergoing therapy with interferon and ribavirin, had serial serum miR-122 and hgDNA levels measured before and throughout treatment. Serum miR-122 levels were elevated approximately 100-fold in both acute liver failure and chronic-HCV sera as compared to controls (P < 0.001), whereas hgDNA levels were only elevated in acute liver failure patients as compared to both chronic-HCV and controls (P < 0.001). Subgroup analysis showed that chronic-HCV sera with normal aminotransferase levels showed elevated miR-122 despite low levels of hepatocyte necrosis. All successfully treated HCV patients showed a significant Log10 decrease in miR-122 levels, ranging from 0.16 to 1.46, after sustained viral response. Chronic-HCV patients have very elevated serum miR-122 levels in the range of most patients with severe hepatic injury leading to acute liver failure. Eradication of HCV was associated with decreased miR-122 but not hgDNA. An additional mechanism besides hepatic injury may be active in chronic-HCV to explain the exaggerated circulating levels of miR-122 observed. © 2014 Wiley Periodicals, Inc.

  4. Echocardiographic predictors of change in renal function with intravenous diuresis for decompensated heart failure.

    PubMed

    Gannon, Stephen A; Mukamal, Kenneth J; Chang, James D

    2018-06-14

    The aim of this study was to identify echocardiographic predictors of improved or worsening renal function during intravenous diuresis for decompensated heart failure. A secondary aim was to define the incidence of, and clinical risk factors for, acute changes in renal function with decongestion. A retrospective review of 363 patients admitted to a single centre for decompensated heart failure who underwent intravenous diuresis and transthoracic echocardiography was conducted. Clinical, echocardiographic, and renal function data were retrospectively collected. A multinomial logistic regression model was created to determine relative risk ratios for improved renal function (IRF) or worsening renal function (WRF). Within this cohort, 36% of patients experienced WRF, 35% had stable renal function, and 29% had IRF. Patients with WRF were more likely to have a preserved left ventricular ejection fraction compared with those with stable renal function or IRF (P = 0.02). Patients with IRF were more likely to have a dilated, hypokinetic right ventricle compared with those with stable renal function or WRF (P ≤ 0.01), although this was not significant after adjustment for baseline characteristics. Left atrial size, left ventricular linear dimensions, and diastolic function did not significantly predict change in renal function. An acute change in renal function occurred in 65% of patients admitted with decompensated heart failure. WRF was statistically more likely in patients with a preserved left ventricular ejection fraction. A trend towards IRF was noted in patients with global right ventricular dysfunction. © 2018 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.

  5. Renal function assessment in heart failure.

    PubMed

    Pérez Calvo, J I; Josa Laorden, C; Giménez López, I

    Renal function is one of the most consistent prognostic determinants in heart failure. The prognostic information it provides is independent of the ejection fraction and functional status. This article reviews the various renal function assessment measures, with special emphasis on the fact that the patient's clinical situation and response to the heart failure treatment should be considered for the correct interpretation of the results. Finally, we review the literature on the performance of tubular damage biomarkers. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  6. Impact of Cardiac Progenitor Cells on Heart Failure and Survival in Single Ventricle Congenital Heart Disease.

    PubMed

    Sano, Toshikazu; Ousaka, Daiki; Goto, Takuya; Ishigami, Shuta; Hirai, Kenta; Kasahara, Shingo; Ohtsuki, Shinichi; Sano, Shunji; Oh, Hidemasa

    2018-03-30

    Intracoronary administration of cardiosphere-derived cells (CDCs) in patients with single ventricles resulted in a short-term improvement in cardiac function. To test the hypothesis that CDC infusion is associated with improved cardiac function and reduced mortality in patients with heart failure. We evaluated the effectiveness of CDCs using an integrated cohort study in 101 patients with single ventricles, including 41 patients who received CDC infusion and 60 controls treated with staged palliation alone. Heart failure with preserved ejection fraction (EF) or reduced EF was stratified by the cardiac function after surgical reconstruction. The main outcome measure was to evaluate the magnitude of improvement in cardiac function and all-cause mortality at 2 years. Animal studies were conducted to clarify the underlying mechanisms of heart failure with preserved EF and heart failure with reduced EF phenotypes. At 2 years, CDC infusion increased ventricular function (stage 2: +8.4±10.0% versus +1.6±6.4%, P =0.03; stage 3: +7.9±7.5% versus -1.1±5.5%, P <0.001) compared with controls. In all available follow-up data, survival did not differ between the 2 groups (log-rank P =0.225), whereas overall patients treated by CDCs had lower incidences of late failure ( P =0.022), adverse events ( P =0.013), and catheter intervention ( P =0.005) compared with controls. CDC infusion was associated with a lower risk of adverse events (hazard ratio, 0.411; 95% CI, 0.179-0.942; P =0.036). Notably, CDC infusion reduced mortality ( P =0.038) and late complications ( P <0.05) in patients with heart failure with reduced EF but not with heart failure with preserved EF. CDC-treated rats significantly reversed myocardial fibrosis with differential collagen deposition and inflammatory responses between the heart failure phenotypes. 
CDC administration in patients with single ventricles showed favorable effects on ventricular function and was associated with reduced late complications except for all-cause mortality after staged procedures. Patients with heart failure with reduced EF but not heart failure with preserved EF treated by CDCs resulted in significant improvement in clinical outcome. URL: http://www.clinicaltrials.gov. Unique identifiers: NCT01273857 and NCT01829750. © 2018 American Heart Association, Inc.

  7. The calibration of photographic and spectroscopic films. Part 1: Film batch variations of reciprocity failure in IIaO film. Part 2: Thermal and aging effects in relationship to reciprocity failure. Part 3: Shifting of reciprocity failure points as a function of thermal and aging effects

    NASA Technical Reports Server (NTRS)

    Peters, Kevin A.; Atkinson, Pamela F.; Hammond, Ernest C., Jr

    1987-01-01

    Reciprocity failure was examined for IIaO spectroscopic film. Three separate experiments were performed in order to study film batch variations, thermal and aging effects in relationship to reciprocity failure, and shifting of reciprocity failure points as a function of thermal and aging effects. The failure was examined over ranges of time between 5 and 60 seconds. The variation to illuminance was obtained by using thirty neutral density filters. A standard sensitometer device imprinted the wedge pattern on the film as exposure time was subjected to variation. Results indicate that film batch differences, temperature, and aging play an important role in reciprocity failure of IIaO spectroscopic film. A shifting of the failure points was also observed in various batches of film.

  8. Doctor-patient communication: some quantitative estimates of the role of cognitive factors in non-compliance.

    PubMed

    Ley, P

    1985-04-01

    Patients frequently fail to understand what they are told. Further, they frequently forget the information given to them. These factors have effects on patients' satisfaction with the consultation. All three of these factors--understanding, memory and satisfaction--have effects on the probability that a patient will comply with advice. The levels of failure to understand and remember and levels of dissatisfaction are described. Quantitative estimates of the effects of these factors on non-compliance are presented.

  9. Novel Contrast Mechanisms at 3 Tesla and 7 Tesla

    PubMed Central

    Regatte, Ravinder R.; Schweitzer, Mark E.

    2013-01-01

    Osteoarthritis (OA) is the most common musculoskeletal degenerative disease, affecting millions of people. Although OA has been considered primarily a cartilage disorder associated with focal cartilage degeneration, it is accompanied by well-known changes in subchondral and trabecular bone, including sclerosis and osteophyte formation. The exact cause of OA initiation and progression remains under debate, but OA typically first affects weightbearing joints such as the knee. Magnetic resonance imaging (MRI) has been recognized as a potential tool for quantitative assessment of cartilage abnormalities due to its excellent soft tissue contrast. Over the last two decades, several new MR biochemical imaging methods have been developed to characterize the disease process and possibly predict the progression of knee OA. These new MR biochemical methods play an important role not only for diagnosis of disease at an early stage, but also for their potential use in monitoring outcome of various drug therapies (success or failure). Recent advances in multicoil radiofrequency technology and high field systems (3 T and above) significantly improve the sensitivity and specificity of imaging studies for the diagnosis of musculoskeletal disorders. The current state-of-the-art MR imaging methods are briefly reviewed for the quantitative biochemical and functional imaging assessment of musculoskeletal systems. PMID:18850506

  10. Pouch functional outcomes after restorative proctocolectomy with ileal-pouch reconstruction in patients with ulcerative colitis: Japanese multi-center nationwide cohort study.

    PubMed

    Uchino, Motoi; Ikeuchi, Hiroki; Sugita, Akira; Futami, Kitaro; Watanabe, Toshiaki; Fukushima, Kouhei; Tatsumi, Kenji; Koganei, Kazutaka; Kimura, Hideaki; Hata, Keisuke; Takahashi, Kenichi; Watanabe, Kazuhiro; Mizushima, Tsunekazu; Funayama, Yuji; Higashi, Daijiro; Araki, Toshimitsu; Kusunoki, Masato; Ueda, Takeshi; Koyama, Fumikazu; Itabashi, Michio; Nezu, Riichiro; Suzuki, Yasuo

    2018-05-01

    Although several complications capable of causing pouch failure may develop after restorative proctocolectomy (RPC) for ulcerative colitis (UC), the incidences and causes are conflicting and vary according to country, race and institution. To avoid pouch failure, this study aimed to evaluate the rate of pouch failure and its risk factors in UC patients over the past decade via a nationwide cohort study. We conducted a retrospective, observational, multicenter study that included 13 institutions in Japan. Patients who underwent RPC between January 2005 and December 2014 were included. The characteristics and backgrounds of the patients before and during surgery and their postoperative courses and complications were reviewed. A total of 2376 patients were evaluated over 6.7 ± 3.5 years of follow-up. Twenty-seven non-functional pouches were observed, and the functional pouch rate was 98.9% after RPC. Anastomotic leakage (odds ratio, 9.1) was selected as a risk factor for a non-functional pouch. The cumulative pouch failure rate was 4.2%/10 years. A change in diagnosis to Crohn's disease/indeterminate colitis (hazard ratio, 13.2) was identified as an independent risk factor for pouch failure. The significant risk factor for a non-functional pouch was anastomotic leakage. The optimal staged surgical procedure should be selected according to a patient's condition to avoid anastomotic failure during RPC. Changes in diagnosis after RPC confer a substantial risk of pouch failure. Additional cohort studies are needed to obtain an understanding of the long-standing clinical course of and proper treatment for pouch failure.

  11. Downregulation of cardiac guanosine 5'-triphosphate-binding proteins in right atrium and left ventricle in pacing-induced congestive heart failure.

    PubMed Central

    Roth, D A; Urasawa, K; Helmer, G A; Hammond, H K

    1993-01-01

    The extent to which congestive heart failure (CHF) is dependent upon increased levels of the cardiac inhibitory GTP-binding protein (Gi), and the impact of CHF on the cardiac stimulatory GTP-binding protein (Gs) and mechanisms by which Gs may change, remain unexplored. We have addressed these unsettled issues using pacing-induced CHF in pigs to examine physiological, biochemical, and molecular features of the right atrium (RA) and left ventricle (LV). CHF was associated with an 85 +/- 20% decrease in LV segment shortening (P < 0.001) and a 3.5-fold increase (P = 0.006) in the ED50 for isoproterenol-stimulated heart rate responsiveness. Myocardial beta-adrenergic receptor number was decreased 54% in RA (P = 0.004) and 57% in LV (P < 0.001), and multiple measures of adenylyl cyclase activity were depressed 49 +/- 8% in RA (P < 0.005), and 44 +/- 9% in LV (P < 0.001). Quantitative immunoblotting established that Gi and Gs were decreased in RA (Gi: 59% reduction; P < 0.0001; Gs: 28% reduction; P < 0.007) and LV (Gi: 35% reduction; P < 0.008; Gs: 28% reduction; P < 0.01) after onset of CHF. Reduced levels of Gi and Gs were confirmed by ADP ribosylation studies, and diminished function of Gs was established in reconstitution studies. Steady state levels for Gs alpha mRNA were increased in RA and unchanged in LV, and significantly more Gs alpha was found in the supernatant (presumably cytosolic) fraction in RA and LV membrane homogenates after CHF, suggesting that increased Gs degradation, rather than decreased Gs synthesis, is the mechanism by which Gs is downregulated. We conclude that cardiac Gi content poorly predicts adrenergic responsiveness or contractile function, that decreased Gs is caused by increased degradation rather than decreased synthesis, and that alterations in beta-adrenergic receptors, adenylyl cyclase, and GTP-binding proteins are uniform in RA and LV in this model of congestive heart failure. PMID:8383705

  12. Analysis of the progressive failure of brittle matrix composites

    NASA Technical Reports Server (NTRS)

    Thomas, David J.

    1995-01-01

    This report investigates two of the most common modes of localized failures, namely, periodic fiber-bridged matrix cracks and transverse matrix cracks. A modification of Daniels' bundle theory is combined with Weibull's weakest link theory to model the statistical distribution of the periodic matrix cracking strength for an individual layer. Results of the model predictions are compared with experimental data from the open literature. Extensions to the model are made to account for possible imperfections within the layer (i.e., nonuniform fiber lengths, irregular crack spacing, and degraded in-situ fiber properties), and the results of these studies are presented. A generalized shear-lag analysis is derived which is capable of modeling the development of transverse matrix cracks in material systems having a general multilayer configuration and under states of full in-plane load. A method for computing the effective elastic properties for the damaged layer at the global level is detailed based upon the solution for the effects of the damage at the local level. This methodology is general in nature and is therefore also applicable to (0(sub m)/90(sub n))(sub s) systems. The characteristic stress-strain response for more general cases is shown to be qualitatively correct (experimental data is not available for a quantitative evaluation), and the damage evolution is recorded in terms of the matrix crack density as a function of the applied strain. Probabilistic effects are introduced to account for the statistical nature of the material strengths, thus allowing cumulative distribution curves for the probability of failure to be generated for each of the example laminates. Additionally, Oh and Finney's classic work on fracture location in brittle materials is extended and combined with the shear-lag analysis. 
The result is an analytical form for predicting the probability density function for the location of the next transverse crack occurrence within a crack bounded region. The results of this study verified qualitatively the validity of assuming a uniform crack spacing (as was done in the shear-lag model).

  13. Forecasting overhaul or replacement intervals based on estimated system failure intensity

    NASA Astrophysics Data System (ADS)

    Gannon, James M.

    1994-12-01

    System reliability can be expressed in terms of the pattern of failure events over time. Assuming a nonhomogeneous Poisson process and Weibull intensity function for complex repairable system failures, the degree of system deterioration can be approximated. Maximum likelihood estimators (MLE's) for the system Rate of Occurrence of Failure (ROCOF) function are presented. Evaluating the integral of the ROCOF over annual usage intervals yields the expected number of annual system failures. By associating a cost of failure with the expected number of failures, budget and program policy decisions can be made based on expected future maintenance costs. Monte Carlo simulation is used to estimate the range and the distribution of the net present value and internal rate of return of alternative cash flows based on the distributions of the cost inputs and confidence intervals of the MLE's.
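    The quantities this abstract describes can be sketched directly: the power-law (Weibull) intensity of a nonhomogeneous Poisson process and its integral over a usage interval, which gives the expected number of failures. The functional forms are the standard ones for this model; the parameter values in the comments are illustrative, not from the paper.

    ```python
    def rocof(t, beta, eta):
        """Weibull (power-law) intensity of a nonhomogeneous Poisson process:
        lambda(t) = (beta/eta) * (t/eta)**(beta - 1).
        beta > 1 indicates a deteriorating system (rising failure intensity)."""
        return (beta / eta) * (t / eta) ** (beta - 1)

    def expected_failures(t1, t2, beta, eta):
        """Integral of the ROCOF over the usage interval [t1, t2]:
        (t2/eta)**beta - (t1/eta)**beta."""
        return (t2 / eta) ** beta - (t1 / eta) ** beta
    ```

    With beta = 1 this reduces to a homogeneous Poisson process; with beta > 1 each successive annual interval accrues more expected failures, which, multiplied by a cost per failure, is what drives the overhaul-or-replace decision.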

  14. Application of a truncated normal failure distribution in reliability testing

    NASA Technical Reports Server (NTRS)

    Groves, C., Jr.

    1968-01-01

    A statistical truncated normal distribution function is applied as a time-to-failure distribution in equipment reliability estimations. The age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
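    A minimal sketch of such a time-to-failure density, assuming truncation at t = 0 with renormalization (the abstract does not state the truncation point, so this is an illustrative reading):

    ```python
    import math

    def truncated_normal_pdf(t, mu, sigma):
        """Normal(mu, sigma) density truncated to t >= 0 and renormalized,
        used here as an assumed time-to-failure density."""
        if t < 0:
            return 0.0
        phi = math.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
        # Probability mass of the untruncated normal above zero: 1 - Phi(-mu/sigma)
        mass = 0.5 * (1.0 + math.erf((mu / sigma) / math.sqrt(2.0)))
        return phi / mass
    ```

    Unlike the exponential model, this density rises and falls with age, which is what gives the age-dependent behaviour the abstract exploits.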

  15. Implementation of a Tabulated Failure Model Into a Generalized Composite Material Model Suitable for Use in Impact Problems

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Carney, Kelly S.; Dubois, Paul; Hoffarth, Canio; Khaled, Bilal; Shyamsunder, Loukham; Rajan, Subramaniam; Blankenhorn, Gunther

    2017-01-01

    The need for accurate material models to simulate the deformation, damage and failure of polymer matrix composites under impact conditions is becoming critical as these materials are gaining increased use in the aerospace and automotive communities. The aerospace community has identified several key capabilities which are currently lacking in the available material models in commercial transient dynamic finite element codes. To attempt to improve the predictive capability of composite impact simulations, a next generation material model is being developed for incorporation within the commercial transient dynamic finite element code LS-DYNA. The material model, which incorporates plasticity, damage and failure, utilizes experimentally based tabulated input to define the evolution of plasticity and damage and the initiation of failure as opposed to specifying discrete input parameters such as modulus and strength. The plasticity portion of the orthotropic, three-dimensional, macroscopic composite constitutive model is based on an extension of the Tsai-Wu composite failure model into a generalized yield function with a non-associative flow rule. For the damage model, a strain equivalent formulation is used to allow for the uncoupling of the deformation and damage analyses. For the failure model, a tabulated approach is utilized in which a stress or strain based invariant is defined as a function of the location of the current stress state in stress space to define the initiation of failure. Failure surfaces can be defined with any arbitrary shape, unlike traditional failure models where the mathematical functions used to define the failure surface impose a specific shape on the failure surface. In the current paper, the complete development of the failure model is described and the generation of a tabulated failure surface for a representative composite material is discussed.

  16. QSAR Modeling: Where Have You Been? Where Are You Going To?

    EPA Science Inventory

    Quantitative structure–activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this...

  17. Methods, apparatus and system for notification of predictable memory failure

    DOEpatents

    Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.

    2017-01-03

    A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
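    The four claimed steps can be sketched as follows; the monitored conditions, weights, and threshold rule here are invented stand-ins for illustration, not the patent's actual model:

    ```python
    def check_memory_health(correctable_errors, temperature_c,
                            error_weight=0.02, temp_weight=0.005,
                            threshold=0.5):
        """Steps: obtain condition info -> compute a failure probability from
        it -> compare against a threshold -> emit a signal predicting failure."""
        over_temp = max(0.0, temperature_c - 70.0)
        probability = min(1.0, correctable_errors * error_weight
                               + over_temp * temp_weight)
        signal = probability > threshold
        return probability, signal
    ```

    A runtime could poll such a check per memory module and, on a raised signal, migrate pages off the suspect module before an uncorrectable error occurs.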

  18. Relating design and environmental variables to reliability

    NASA Astrophysics Data System (ADS)

    Kolarik, William J.; Landers, Thomas L.

    The combination of space application and nuclear power source demands high-reliability hardware. The possibilities of failure, either an inability to provide power or a catastrophic accident, must be minimized. Nuclear power experiences on the ground have led to highly sophisticated probabilistic risk assessment procedures, most of which require quantitative information to adequately assess such risks. In the area of hardware risk analysis, reliability information plays a key role. One of the lessons learned from the Three Mile Island experience is that thorough analyses of critical components are essential. Nuclear grade equipment shows some reliability advantages over commercial, although no statistically significant difference has been found. A recent study pertaining to spacecraft electronics reliability examined some 2500 malfunctions on more than 300 aircraft. The study classified the equipment failures into seven general categories. Design deficiencies and lack of environmental protection accounted for about half of all failures. Within each class, limited reliability modeling was performed using a Weibull failure model.

  19. Brain proton magnetic resonance spectroscopy for hepatic encephalopathy

    NASA Astrophysics Data System (ADS)

    Ong, Chin-Sing; McConnell, James R.; Chu, Wei-Kom

    1993-08-01

    Liver failure can induce gradations of encephalopathy from mild to stupor to deep coma. The objective of this study is to investigate and quantify the variation of biochemical compounds in the brain in patients with liver failure and encephalopathy, through the use of water-suppressed, localized in-vivo Proton Magnetic Resonance Spectroscopy (HMRS). The spectral parameters of the compounds quantitated are: N-Acetyl Aspartate (NAA) to Creatine (Cr) ratio, Choline (Cho) to Creatine ratio, Inositol (Ins) to Creatine ratio and Glutamine-Glutamate Amino Acid (AA) to Creatine ratio. The study group consisted of twelve patients with proven advanced chronic liver failure and symptoms of encephalopathy. Comparison has been done with results obtained from five normal subjects without any evidence of encephalopathy or liver diseases.

  20. Cascading failure in scale-free networks with tunable clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Xue-Jun; Gu, Bo; Guan, Xiang-Min; Zhu, Yan-Bo; Lv, Ren-Li

    2016-02-01

    Cascading failure is ubiquitous in many networked infrastructure systems, such as power grids, the Internet and air transportation systems. In this paper, we extend the cascading failure model to a scale-free network with tunable clustering and focus on the effect of the clustering coefficient on system robustness. It is found that network robustness undergoes a nonmonotonic transition as the clustering coefficient increases: both highly and lowly clustered networks are fragile under intentional attack, and networks with moderate clustering coefficients can better resist the spread of cascading failures. We then provide an extensive explanation of this constructive phenomenon from a microscopic point of view and through quantitative analysis. Our work can be useful for the design and optimization of infrastructure systems.

  1. Quantitative ultrasonic evaluation of mechanical properties of engineering materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Current progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength properties of engineering materials is reviewed. Even where conventional NDE techniques have shown that a part is free of overt defects, advanced NDE techniques should be available to confirm the material properties assumed in the part's design. There are many instances where metallic, composite, or ceramic parts may be free of critical defects while still being susceptible to failure under design loads due to inadequate or degraded mechanical strength. This must be considered in any failure prevention scheme that relies on fracture analysis. This review will discuss the availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions.

  2. Pulmonary hypertension and isolated right heart failure complicating amiodarone induced hyperthyroidism.

    PubMed

    Wong, Sean-Man; Tse, Hung-Fat; Siu, Chung-Wah

    2012-03-01

    Hyperthyroidism is a common side effect encountered in patients prescribed long-term amiodarone therapy for cardiac arrhythmias. We previously studied 354 patients prescribed amiodarone in whom the occurrence of hyperthyroidism was associated with major adverse cardiovascular events including heart failure, myocardial infarction, ventricular arrhythmias, stroke and even death [1]. We now present a case of amiodarone-induced hyperthyroidism complicated by isolated right heart failure and pulmonary hypertension that resolved with treatment of hyperthyroidism. Detailed quantitative echocardiography enables improved understanding of the haemodynamic mechanisms underlying the condition. Copyright © 2011 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  3. Long- and short-time analysis of heartbeat sequences: correlation with mortality risk in congestive heart failure patients.

    PubMed

    Allegrini, P; Balocchi, R; Chillemi, S; Grigolini, P; Hamilton, P; Maestri, R; Palatella, L; Raffaelli, G

    2003-06-01

    We analyze RR heartbeat sequences with a dynamic model that satisfactorily reproduces both the long- and the short-time statistical properties of heart beating. These properties are expressed quantitatively by means of two significant parameters, the scaling delta concerning the asymptotic effects of long-range correlation, and the quantity 1-pi establishing the amount of uncorrelated fluctuations. We find a correlation between the position in the phase space (delta, pi) of patients with congestive heart failure and their mortality risk.

  4. Mechanisms Explaining the Influence of Subclinical Hypothyroidism on the Onset and Progression of Chronic Heart Failure.

    PubMed

    Triggiani, Vincenzo; Angelo Giagulli, Vito; De Pergola, Giovanni; Licchelli, Brunella; Guastamacchia, Edoardo; Iacoviello, Massimo

    2016-01-01

    Subclinical hypothyroidism can be associated with the onset and progression of chronic heart failure. We undertook a careful search of the literature aiming to review the possible pathogenetic mechanisms explaining the influence of subclinical hypothyroidism on the onset and progression of chronic heart failure. Thyroid hormones can influence the expression of genes involved in calcium handling and contractile properties of myocardiocytes. Subclinical hypothyroidism, therefore, can alter both cardiovascular morphology and function leading to changes in myocardiocytes shape and structure, and to alterations of both contractile and relaxing properties, impairing systolic as well as diastolic functions. Furthermore, it can favour dyslipidemia, endothelial dysfunction and diastolic hypertension, favouring atherogenesis and coronary heart disease, possibly evolving into chronic heart failure. Beside an influence on the onset of chronic heart failure, subclinical hypothyroidism can represent a risk factor for its progression, in particular hospitalization and mortality but the mechanisms involved need to be fully elucidated. Subclinical hypothyroidism can be associated with the onset of chronic heart failure, because it can favour two frequent conditions that can evolve in heart failure: coronary heart disease and hypertension; it can also alter both cardiovascular morphology and function leading to heart failure progression in patients already affected through mechanisms still not completely understood.

  5. On the calculation of charge transfer transitions with standard density functionals using constrained variational density functional theory.

    PubMed

    Ziegler, Tom; Krykunov, Mykhaylo

    2010-08-21

    It is well known that time-dependent density functional theory (TD-DFT) based on standard gradient corrected functionals affords both a quantitative and qualitative incorrect picture of charge transfer transitions between two spatially separated regions. It is shown here that the well known failure can be traced back to the use of linear response theory. Further, it is demonstrated that the inclusion of higher order terms readily affords a qualitatively correct picture even for simple functionals based on the local density approximation. The inclusion of these terms is done within the framework of a newly developed variational approach to excitation energies called constrained variational density functional theory (CV-DFT). To second order [CV(2)-DFT] this theory is identical to adiabatic TD-DFT within the Tamm-Dancoff approximation. With inclusion of fourth order corrections [CV(4)-DFT] it affords a qualitative correct description of charge transfer transitions. It is finally demonstrated that the relaxation of the ground state Kohn-Sham orbitals to first order in response to the change in density on excitation together with CV(4)-DFT affords charge transfer excitations in good agreement with experiment. The new relaxed theory is termed R-CV(4)-DFT. The relaxed scheme represents an effective way in which to introduce double replacements into the description of single electron excitations, something that would otherwise require a frequency dependent kernel.

  6. Both in- and out-hospital worsening of renal function predict outcome in patients with heart failure: results from the Coordinating Study Evaluating Outcome of Advising and Counseling in Heart Failure (COACH).

    PubMed

    Damman, Kevin; Jaarsma, Tiny; Voors, Adriaan A; Navis, Gerjan; Hillege, Hans L; van Veldhuisen, Dirk J

    2009-09-01

    The effect of worsening renal function (WRF) after discharge on outcome in patients with heart failure is unknown. We assessed estimated glomerular filtration rate (eGFR) and serum creatinine at admission, discharge, and 6 and 12 months after discharge in 1023 heart failure patients. Worsening renal function was defined as an increase in serum creatinine of >26.5 micromol/L and >25%. The primary endpoint was a composite of all-cause mortality and heart failure admissions. The mean age of patients was 71 +/- 11 years, and 62% were male. Mean eGFR at admission was 55 +/- 21 mL/min/1.73 m(2). In-hospital WRF occurred in 11% of patients, while 16% and 9% experienced WRF from 0 to 6 and 6 to 12 months after discharge, respectively. In multivariate landmark analysis, WRF at any point in time was associated with a higher incidence of the primary endpoint: hazard ratio (HR) 1.63 (1.10-2.40), P = 0.014 for in-hospital WRF, HR 2.06 (1.13-3.74), P = 0.018 for WRF between 0-6 months, and HR 5.03 (2.13-11.88), P < 0.001 for WRF between 6-12 months. Both in- and out-hospital worsening of renal function are independently related to poor prognosis in patients with heart failure, suggesting that renal function in heart failure patients should be monitored long after discharge.
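The WRF definition quoted above is a simple compound threshold. As a minimal sketch (function and variable names are illustrative, not from the COACH study), it can be expressed as:

```python
def worsening_renal_function(creat_baseline_umol_l, creat_followup_umol_l):
    """Flag worsening renal function (WRF) per the definition quoted
    above: a rise in serum creatinine of > 26.5 micromol/L AND > 25%."""
    if creat_baseline_umol_l <= 0:
        raise ValueError("baseline creatinine must be positive")
    rise = creat_followup_umol_l - creat_baseline_umol_l
    return rise > 26.5 and rise / creat_baseline_umol_l > 0.25

# 100 -> 130 micromol/L: a 30 micromol/L (30%) rise satisfies both criteria.
print(worsening_renal_function(100.0, 130.0))  # True
# 200 -> 230 micromol/L: 30 micromol/L absolute, but only a 15% rise.
print(worsening_renal_function(200.0, 230.0))  # False
```

Requiring both an absolute and a relative rise keeps small fluctuations from being flagged in patients with either very low or very high baseline creatinine.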

  7. Deformations and strains in adhesive joints by moire interferometry

    NASA Technical Reports Server (NTRS)

    Post, D.; Czarnek, R.; Wood, J.; John, D.; Lubowinski, S.

    1984-01-01

    Displacement fields in a thick adherend lap joint and a cracked lap shear specimen were measured by high sensitivity moire interferometry. Contour maps of in-plane U and V displacements were obtained across adhesive and adherend surfaces. Loading sequences ranged from modest loads to near-failure loads. Quantitative results are given for displacements and certain strains in the adhesive and along the adhesive/adherend boundary lines. The results show nonlinear displacements and strains as a function of load or stress, and they show viscoelastic, time-dependent response. Moire interferometry is an excellent method for experimental studies of adhesive joint performance. Subwavelength displacement resolution of a few micro-inches, and spatial resolution corresponding to 1600 fringes/inch (64 fringes/mm), were obtained in these studies. The whole-field contour maps offer insights not available from local measurements made by high sensitivity gages.

  8. Vectorcardiograph

    NASA Technical Reports Server (NTRS)

    Lintott, J.; Costello, M. J.

    1977-01-01

    A system for quantitating the cardiac electrical activity of Skylab crewmen was required for three medical experiments (M092, Lower Body Negative Pressure; M171, Metabolic Activity; and M093, In-flight Vectorcardiogram) designed to evaluate the effects of space flight on the human cardiovascular system. A Frank lead vectorcardiograph system was chosen for this task because of its general acceptability in the scientific community and its data quantification capabilities. To be used effectively in space flight, however, the system had to meet certain other requirements. The system was required to meet the specifications recommended by the American Heart Association. The vectorcardiograph had to withstand the extreme conditions of the space environment. The system had to provide features that permitted ease of use in the orbital environment. The vectorcardiograph system performed its intended function throughout all the Skylab missions without a failure. A description of this system follows.

  9. Hospital ownership and financial performance: what explains the different findings in the empirical literature?

    PubMed

    Shen, Yu-Chu; Eggleston, Karen; Lau, Joseph; Schmid, Christopher H

    2007-01-01

    This study applies meta-analytic methods to conduct a quantitative review of the empirical literature on hospital ownership since 1990. We examine four financial outcomes across 40 studies: cost, revenue, profit margin, and efficiency. We find that variation in the magnitudes of ownership effects can be explained by a study's research focus and methodology. Studies using empirical methods that control for few confounding factors tend to find larger differences between for-profit and not-for-profit hospitals than studies that control for a wider range of confounding factors. Functional form and sample size also matter. Failure to apply log transformation to highly skewed expenditure data yields misleadingly large estimated differences between for-profits and not-for-profits. Studies with fewer than 200 observations also produce larger point estimates and wide confidence intervals.

  10. Pooled nucleic acid testing to identify antiretroviral treatment failure during HIV infection.

    PubMed

    May, Susanne; Gamst, Anthony; Haubrich, Richard; Benson, Constance; Smith, Davey M

    2010-02-01

    Pooling strategies have been used to reduce the costs of polymerase chain reaction-based screening for acute HIV infection in populations in which the prevalence of acute infection is low (less than 1%). Only limited research has been done for conditions in which the prevalence of screening positivity is higher (greater than 1%). We present data on a variety of pooling strategies that incorporate the use of polymerase chain reaction-based quantitative measures to monitor for virologic failure among HIV-infected patients receiving antiretroviral therapy. For a prevalence of virologic failure between 1% and 25%, we demonstrate relative efficiency and accuracy of various strategies. These results could be used to choose the best strategy based on the requirements of individual laboratory and clinical settings such as required turnaround time of results and availability of resources. Virologic monitoring during antiretroviral therapy is not currently being performed in many resource-constrained settings largely because of costs. The presented pooling strategies may be used to significantly reduce the cost compared with individual testing, make such monitoring feasible, and limit the development and transmission of HIV drug resistance in resource-constrained settings. They may also be used to design efficient pooling strategies for other settings with quantitative screening measures.
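The efficiency argument behind pooling can be illustrated with the classic two-stage (Dorfman) scheme, in which a positive pool triggers individual retesting of its members. This is a simplified, qualitative stand-in for the quantitative strategies evaluated in the paper; the formula and names below are illustrative only:

```python
def expected_tests_per_specimen(p, k):
    """Expected tests per specimen under two-stage (Dorfman) pooling:
    1/k pooled tests per specimen, plus k individual retests whenever
    the pool is positive, which happens with probability 1 - (1-p)**k."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k

# Pooling pays off at low prevalence of virologic failure and loses its
# advantage as prevalence rises toward and beyond ~25%.
for p in (0.01, 0.10, 0.25):
    best_k = min(range(2, 51), key=lambda k: expected_tests_per_specimen(p, k))
    print(p, best_k, round(expected_tests_per_specimen(p, best_k), 3))
```

The printout shows the cost-minimizing pool size shrinking, and the per-specimen cost rising, as prevalence increases, which is the trade-off the abstract's efficiency comparisons quantify.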

  11. Evaluation: Review of the Past, Preview of the Future.

    ERIC Educational Resources Information Center

    Smith, M. F.

    1994-01-01

    This paper summarized contributors' ideas about evaluation as a field and where it is going. Topics discussed were qualitative versus quantitative debate; evaluation's purpose; professionalization; program failure; program development; evaluators as advocates; evaluation knowledge; evaluation expansion; and methodology and design. (SLD)

  12. Probability of Loss of Assured Safety in Systems with Multiple Time-Dependent Failure Modes: Incorporation of Delayed Link Failure in the Presence of Aleatory Uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helton, Jon C.; Brooks, Dusty Marie; Sallaberry, Cedric Jean-Marie.

    Probability of loss of assured safety (PLOAS) is modeled for weak link (WL)/strong link (SL) systems in which one or more WLs or SLs could potentially degrade into a precursor condition to link failure that will be followed by an actual failure after some amount of elapsed time. The following topics are considered: (i) Definition of precursor occurrence time cumulative distribution functions (CDFs) for individual WLs and SLs, (ii) Formal representation of PLOAS with constant delay times, (iii) Approximation and illustration of PLOAS with constant delay times, (iv) Formal representation of PLOAS with aleatory uncertainty in delay times, (v) Approximation and illustration of PLOAS with aleatory uncertainty in delay times, (vi) Formal representation of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, (vii) Approximation and illustration of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, and (viii) Procedures for the verification of PLOAS calculations for the three indicated definitions of delayed link failure.

  13. Failure rate and reliability of the KOMATSU hydraulic excavator in surface limestone mine

    NASA Astrophysics Data System (ADS)

    Harish Kumar N., S.; Choudhary, R. P.; Murthy, Ch. S. N.

    2018-04-01

    A model with a bathtub-shaped failure rate function is helpful in the reliability analysis of any system, and particularly in reliability-based preventive maintenance. The usual Weibull distribution is, however, not capable of modeling the complete lifecycle of a system with a bathtub-shaped failure rate function. In this paper, a failure rate and reliability analysis of the KOMATSU hydraulic excavator/shovel in a surface mine is presented, with the aim of improving the reliability and decreasing the failure rate of each subsystem of the shovel through preventive maintenance. The bathtub-shaped model for the shovel can also be seen as a simplification of the Weibull distribution.
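The limitation noted above is easy to see: the two-parameter Weibull hazard is monotone, so a single Weibull can never be bathtub-shaped, while a composite of a decreasing and an increasing Weibull term can be. A brief sketch (parameters are illustrative, not fitted to the excavator data):

```python
def weibull_hazard(t, beta, eta):
    """Two-parameter Weibull hazard: strictly decreasing for beta < 1,
    constant for beta = 1, increasing for beta > 1 -- so a single
    Weibull can never produce a bathtub shape."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

def bathtub_hazard(t):
    """A simple composite bathtub: a decreasing 'infant mortality' term
    plus an increasing 'wear-out' term (parameters are illustrative)."""
    return weibull_hazard(t, 0.5, 100.0) + weibull_hazard(t, 3.0, 1000.0)

# Hazard is high early, drops through a useful-life trough, then rises.
for hours in (1.0, 200.0, 2000.0):
    print(hours, round(bathtub_hazard(hours), 5))
```

Models of this additive (or exponentiated/modified Weibull) form are what "bathtub-shaped" reliability analyses typically fit in place of the plain Weibull.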

  14. Influencing factors of NT-proBNP level in heart failure patients with different cardiac functions and correlation with prognosis.

    PubMed

    Xu, Liang; Chen, Yanchun; Ji, Yanni; Yang, Song

    2018-06-01

    Factors influencing N-terminal pro-brain natriuretic peptide (NT-proBNP) level in heart failure patients with different cardiac functions were identified to explore the correlations with prognosis. Eighty heart failure patients with different cardiac functions treated in Yixing People's Hospital from January 2016 to June 2017 were selected and divided into two groups (cardiac function in class II and below, and cardiac function in class III and above) according to the cardiac function classification established by the New York Heart Association (NYHA). Blood biochemical tests and outcome analysis were conducted to measure serum NT-proBNP and matrix metalloproteinase-9 (MMP-9) levels in patients with different cardiac functions, and correlations between the levels of NT-proBNP and MMP-9 and the left ventricular ejection fraction (LVEF) were analyzed. In addition, risk factors for heart failure in patients with different cardiac functions were analyzed. Compared with the group with cardiac function in class III and above, the group with cardiac function in class II and below had significantly lower serum NT-proBNP and MMP-9 levels (p<0.05). For echocardiogram indexes, left ventricular end-diastolic diameter (LVEDD) and left ventricular end-systolic diameter (LVESD) in the group with cardiac function in class II and below were obviously lower than those in the group with cardiac function in class III and above (p<0.05), while LVEF was higher in the group with cardiac function in class II and below (p<0.05). NT-proBNP and MMP-9 levels were negatively correlated with LVEF (r=-0.8517 and -0.8517, respectively; p<0.001). Cardiac function in class III and above, increased NT-proBNP, increased MMP-9 and decreased LVEF were relevant and independent risk factors for heart failure in patients with different cardiac functions. NT-proBNP and MMP-9 levels are negatively correlated with LVEF regardless of the cardiac function class. Therefore, attention should be paid in clinical practice to patients who have cardiac function in class III and above, increased NT-proBNP and MMP-9 levels, and decreased LVEF, so as to actively prevent and treat heart failure.

  15. An accurate computational method for an order parameter with a Markov state model constructed using a manifold-learning technique

    NASA Astrophysics Data System (ADS)

    Ito, Reika; Yoshidome, Takashi

    2018-01-01

    Markov state models (MSMs) are a powerful approach for analyzing the long-time behavior of protein motion using molecular dynamics simulation data. However, their quantitative performance with respect to physical quantities is poor. We believe that this poor performance is caused by the failure to appropriately classify protein conformations into states when constructing MSMs. Herein, we show that the quantitative performance for an order parameter is improved when a manifold-learning technique is employed for the classification in the MSM. In contrast, MSM construction using the K-center method, which has previously been used for this classification, shows poor quantitative performance.
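Whatever classification method assigns conformations to states, the core MSM estimate is a row-stochastic transition matrix counted from the discretized trajectory at a chosen lag time. A minimal sketch (illustrative only, not the authors' pipeline):

```python
import numpy as np

def transition_matrix(states, n_states, lag=1):
    """Row-stochastic MSM transition matrix estimated by counting
    observed jumps at the given lag time in a discretized trajectory."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-lag], states[lag:]):
        counts[i, j] += 1.0
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # leave never-visited states as zero rows
    return counts / row_sums

# Toy trajectory over 3 states; each visited row of T sums to 1.
traj = [0, 0, 1, 1, 1, 0, 1, 2, 2, 1]
T = transition_matrix(traj, 3)
print(T[0])  # first row: [1/3, 2/3, 0]
```

The paper's point is that the quality of `states` (the state decomposition) dominates the quality of everything derived from this matrix, such as relaxation timescales and order parameters.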

  16. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
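The effect of imperfect coverage can be illustrated with a textbook duplex-system model (not the paper's digraph model of the F18 FCS): even with a spare available, an uncovered single fault brings the system down, so coverage enters the reliability expression directly. All parameter values below are illustrative:

```python
import math

def duplex_reliability(lmbda, t, coverage):
    """Reliability of a duplex (one active + one spare) system where a
    component failure is successfully detected and recovered with
    probability `coverage`; an uncovered single failure is fatal.
    R = r^2 + 2*c*r*(1-r), with r = exp(-lambda*t)."""
    r = math.exp(-lmbda * t)  # single-component reliability
    return r * r + 2.0 * coverage * r * (1.0 - r)

# Even a 1% coverage shortfall visibly erodes the redundancy benefit.
for c in (1.0, 0.99, 0.9):
    print(c, round(duplex_reliability(1e-4, 1000.0, c), 6))
```

This mirrors the abstract's conclusion: omitting coverage from the model overstates the reliability that redundancy actually delivers.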

  17. Quantitative proteomic changes during post myocardial infarction remodeling reveals altered cardiac metabolism and Desmin aggregation in the infarct region.

    PubMed

    Datta, Kaberi; Basak, Trayambak; Varshney, Swati; Sengupta, Shantanu; Sarkar, Sagartirtha

    2017-01-30

    Myocardial infarction is one of the leading causes of cardiac dysfunction, failure and sudden death. Post-infarction cardiac remodeling presents a poor prognosis, with 30%-45% of patients developing heart failure over a period of 5-25 years. Oxidative stress has been labelled as the primary causative factor for cardiac damage during infarction; however, the impact it may have during the process of post-infarction remodeling has not been well probed. In this study, we implemented iTRAQ proteomics to catalogue the proteins and functional processes participating both temporally (early and late phases) and spatially (infarct and remote zones) during post myocardial infarction remodeling of the heart, as functions of the differential oxidative stress manifest during the remodeling process. Cardiac metabolism was the dominant network affected during infarction and at the remodeling time points considered in this study. A distinctive expression pattern of cytoskeletal proteins was also observed with increasing remodeling time. Further, it was found that the cytoskeletal protein Desmin aggregated in the infarct zone during the remodeling process, mediated by the protease Calpain1. Taken together, these data may lay the foundation for understanding the effects of oxidative stress on the remodeling process and elaborating the mechanism behind the compromised cardiac function observed during post myocardial infarction remodeling. Oxidative stress is the major driving force for cardiac damage during myocardial infarction. However, the impact of oxidative stress on the process of post-MI remodeling in conducting the heart towards functional failure has not been well explored. In this study, a spatial and temporal approach was taken to elaborate the major proteins and cellular processes involved in post-MI remodeling. Based on the level/intensity of ROS, spatially, infarct and noninfarct zones were chosen for analysis, while on the temporal scale, early (30 days) and late (120 days) time points post MI were included in the study. This design enabled us to delineate the differential protein expression on a spectrum from maximum oxidative stress at the infarct zone during MI to minimum oxidative stress at the noninfarct zone at the late time point post MI. The proteome profiles of the study groups, when comparatively analysed, gave a holistic idea of the dominant cellular processes involved in post-MI remodeling, such as cardiac metabolism, for both short-term and long-term remodeling, as well as unique processes, such as Desmin-mediated cytoskeletal remodeling of the infarcted myocardium, that are involved in the compromise of cardiac function. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. The Impact of Family Functioning on Caregiver Burden among Caregivers of Veterans with Congestive Heart Failure

    ERIC Educational Resources Information Center

    Moore, Crystal Dea

    2010-01-01

    A cross-sectional study of 76 family caregivers of older veterans with congestive heart failure utilized the McMaster model of family functioning to examine the impact of family functioning variables (problem solving, communication, roles, affective responsiveness, and affective involvement) on caregiver burden dimensions (relationship burden,…

  19. How Do Cognitive Function and Knowledge Affect Heart Failure Self-Care?

    ERIC Educational Resources Information Center

    Dickson, Victoria Vaughan; Lee, Christopher S.; Riegel, Barbara

    2011-01-01

    Despite extensive patient education, few heart failure (HF) patients master self-care. Impaired cognitive function may explain why patient education is ineffective. A concurrent triangulation mixed methods design was used to explore how knowledge and cognitive function influence HF self-care. A total of 41 adults with HF participated in interviews…

  20. Cardiorenal Syndrome in Acute Heart Failure: Revisiting Paradigms.

    PubMed

    Núñez, Julio; Miñana, Gema; Santas, Enrique; Bertomeu-González, Vicente

    2015-05-01

    Cardiorenal syndrome has been defined as the simultaneous dysfunction of both the heart and the kidney. Worsening renal function that occurs in patients with acute heart failure has been classified as cardiorenal syndrome type 1. In this setting, worsening renal function is a common finding and is due to complex, multifactorial, and not fully understood processes involving hemodynamic (renal arterial hypoperfusion and renal venous congestion) and nonhemodynamic factors. Traditionally, worsening renal function has been associated with worse outcomes, but recent findings have revealed mixed and heterogeneous results, perhaps suggesting that the same phenotype represents a diversity of pathophysiological and clinical situations. Interpreting the magnitude and chronology of renal changes together with baseline renal function, fluid overload status, and clinical response to therapy might help clinicians to unravel the clinical meaning of renal function changes that occur during an episode of heart failure decompensation. In this article, we critically review the contemporary evidence on the pathophysiology and clinical aspects of worsening renal function in acute heart failure. Copyright © 2014 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  1. Liver failure in total artificial heart therapy.

    PubMed

    Dimitriou, Alexandros Merkourios; Dapunt, Otto; Knez, Igor; Wasler, Andrae; Oberwalder, Peter; Koerfer, Reiner; Tenderich, Gero; Spiliopoulos, Sotirios

    2016-07-01

    Congestive hepatopathy (CH) and acute liver failure (ALF) are common among biventricular heart failure patients. We sought to evaluate the impact of total artificial heart (TAH) therapy on hepatic function and associated clinical outcomes. A total of 31 patients received a Syncardia Total Artificial Heart. Preoperatively, 17 patients exhibited normal liver function or mild hepatic derangements that were clinically insignificant and did not qualify as acute or chronic liver failure, 5 patients exhibited ALF, and 9 exhibited various hepatic derangements owing to CH. Liver-associated mortality and the postoperative course of liver values were prospectively documented and retrospectively analyzed. Liver-associated mortality in the normal liver function, ALF and CH cases was 0%, 20% (P=0.03) and 44.4% (P=0.0008), respectively. 1/17 (5.8%) patients with normal liver function developed ALF, 4/5 (80%) patients with ALF experienced a marked improvement of hepatic function, and 6/9 (66.6%) patients with CH experienced a significant deterioration. TAH therapy results in recovery of hepatic function in ALF cases. Patients with CH prior to surgery form a high-risk group with increased liver-associated mortality.

  2. [Contribution of X-ray computed tomography in the evaluation of kidney performance].

    PubMed

    Lemoine, Sandrine; Rognant, Nicolas; Collet-Benzaquen, Diane; Juillard, Laurent

    2012-07-01

    X-ray computed tomography is an imaging method based on the attenuation of X-rays in tissue. This attenuation is proportional to the density of the tissue (without or after contrast media injection) in each pixel of the image. The spiral scanner, the electron beam computed tomography (EBCT) scanner and the multidetector computed tomography scanner allow renal anatomical measurements, such as cortical and medullary volume, as well as the measurement of renal functional parameters, such as regional renal perfusion, renal blood flow and glomerular filtration rate. These functional parameters are extracted by modeling the kinetics of the contrast media concentration in the vascular space and the renal tissue, using two main mathematical models (the gamma variate model and the Patlak model). Renal functional imaging allows quantitative parameters to be measured on each kidney separately, in a non-invasive manner, providing significant opportunities in nephrology for both experimental and clinical studies. However, this method uses contrast media that may alter renal function, thus limiting its use in patients with chronic renal failure. Moreover, the increased irradiation delivered to the patient with multidetector computed tomography (MDCT) should be considered. Copyright © 2011 Association Société de néphrologie. Published by Elsevier SAS. All rights reserved.
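As a rough illustration of the Patlak model mentioned above: plotting the tissue-to-plasma concentration ratio against the time-integral of the plasma curve normalized by the plasma concentration yields a line whose slope estimates the unidirectional uptake (filtration) rate. The helper below is a hypothetical sketch, not code from the article, and the synthetic curves are fabricated for the check:

```python
import numpy as np

def patlak_fit(t, c_plasma, c_tissue):
    """Least-squares fit of the Patlak plot: y = c_tissue/c_plasma
    versus x = (integral of c_plasma dt)/c_plasma. The slope
    approximates the unidirectional uptake rate; the intercept is the
    apparent distribution volume."""
    dt = np.diff(t)
    integral = np.concatenate(
        ([0.0], np.cumsum(0.5 * (c_plasma[1:] + c_plasma[:-1]) * dt)))
    x = integral / c_plasma
    y = c_tissue / c_plasma
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

# Synthetic check: constant plasma input, linear tissue uptake.
t = np.linspace(0.0, 10.0, 50)
c_p = np.ones_like(t)
c_t = 0.1 * t + 0.2          # uptake rate 0.1, distribution volume 0.2
K, V0 = patlak_fit(t, c_p, c_t)
print(round(K, 3), round(V0, 3))  # 0.1 0.2
```

In renal CT, applying this per-voxel or per-kidney to contrast enhancement curves is what turns the raw attenuation kinetics into a single-kidney filtration estimate.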

  3. Clinical types and drug therapy of renal impairment in cirrhosis

    PubMed Central

    Rodés, J.; Bosch, J.; Arroyo, V.

    1975-01-01

    Four separate types of renal failure in cirrhosis are described: functional renal failure; diuretic induced uraemia; acute tubular necrosis; chronic intrinsic renal disease. Functional renal failure may arise spontaneously or be precipitated by such factors as haemorrhage, surgery, or infection. It carries a poor prognosis but preliminary results of treating this condition with plasma volume expansion in combination with high doses of furosemide are encouraging. PMID:1234328

  4. [Influence of Mildrocard on the morpho-functional condition of the cardio-respiratory system in patients with chronic heart failure with concomitant chronic obstructive pulmonary disease].

    PubMed

    Ignatenko, G A; Mukhin, I V; Faierman, A O; Pola, M K; Taktashov, G S; Goncharov, O M; Rybalko, G S; Volodkina, N O

    2011-01-01

    This paper estimates the influence of the cytoprotective drug "Mildrocard" on the morpho-functional condition of the cardio-respiratory system in patients with chronic heart failure and concomitant chronic obstructive pulmonary disease. It is established that adding "Mildrocard" to the complex therapy of this combined pathology promotes a reduction in the clinical manifestations of heart failure and shows cardioprotective and pulmoprotective effects.

  5. POF-Darts: Geometric adaptive sampling for probability of failure

    DOE PAGES

    Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.; ...

    2016-06-18

    We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to the failure or non-failure region, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, regions uncovered by spheres will shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction step, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
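For contrast, the baseline that adaptive methods like POF-Darts improve upon is crude Monte Carlo estimation of the probability of failure, which spends one limit-state evaluation per sample. A minimal sketch with an illustrative limit-state function (true POF = 0.02); names and the example problem are assumptions, not from the paper:

```python
import random

def probability_of_failure(limit_state, sampler, n=100_000, seed=0):
    """Crude Monte Carlo estimate of P(limit_state(x) < 0). Adaptive
    schemes such as POF-Darts aim to reach a target POF with far fewer
    limit-state evaluations than this brute-force baseline."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sampler(rng)) < 0)
    return failures / n

# Illustrative limit state: failure when a uniform point in the unit
# square lands in the corner region x + y > 1.8 (true POF = 0.02).
pof = probability_of_failure(lambda xy: 1.8 - (xy[0] + xy[1]),
                             lambda rng: (rng.random(), rng.random()))
print(pof)
```

The Monte Carlo error scales as sqrt(POF/n), so rare failure regions need very many evaluations, which is precisely the cost that surrogate-based and disk-packing approaches try to avoid.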

  6. A quantitative method for defining high-arched palate using the Tcof1(+/-) mutant mouse as a model.

    PubMed

    Conley, Zachary R; Hague, Molly; Kurosaka, Hiroshi; Dixon, Jill; Dixon, Michael J; Trainor, Paul A

    2016-07-15

    The palate functions as the roof of the mouth in mammals, separating the oral and nasal cavities. Its complex embryonic development and assembly poses unique susceptibilities to intrinsic and extrinsic disruptions. Such disruptions may cause failure of the developing palatal shelves to fuse along the midline resulting in a cleft. In other cases the palate may fuse at an arch, resulting in a vaulted oral cavity, termed high-arched palate. There are many models available for studying the pathogenesis of cleft palate but a relative paucity for high-arched palate. One condition exhibiting either cleft palate or high-arched palate is Treacher Collins syndrome, a congenital disorder characterized by numerous craniofacial anomalies. We quantitatively analyzed palatal perturbations in the Tcof1(+/-) mouse model of Treacher Collins syndrome, which phenocopies the condition in humans. We discovered that 46% of Tcof1(+/-) mutant embryos and newborn pups exhibit either soft clefts or full clefts. In addition, 17% of Tcof1(+/-) mutants were found to exhibit high-arched palate, defined as two sigma above the corresponding wild-type population mean for height and angular-based arch measurements. Furthermore, palatal shelf length and shelf width were decreased in all Tcof1(+/-) mutant embryos and pups compared to controls. Interestingly, these phenotypes were subsequently ameliorated through genetic inhibition of p53. The results of our study therefore provide a simple, reproducible and quantitative method for investigating models of high-arched palate. Copyright © 2015 Elsevier Inc. All rights reserved.
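The two-sigma criterion quoted above is straightforward to operationalize. A minimal sketch with fabricated wild-type measurements (all names and numbers are illustrative, not data from the study):

```python
import statistics

def high_arched(measurement, wild_type_values):
    """Flag a palate measurement as 'high-arched' when it exceeds the
    wild-type population mean by more than two standard deviations,
    mirroring the two-sigma criterion quoted in the abstract."""
    mu = statistics.mean(wild_type_values)
    sigma = statistics.stdev(wild_type_values)
    return measurement > mu + 2.0 * sigma

# Fabricated wild-type arch heights (arbitrary units): mean 11, sd ~1.07,
# so the two-sigma threshold sits at ~13.1.
wild_type = [10.0, 12.0, 10.0, 12.0, 10.0, 12.0, 10.0, 12.0]
print(high_arched(14.0, wild_type))  # True
print(high_arched(12.0, wild_type))  # False
```

Applying the same threshold to both height and angle measurements, as the abstract describes, simply means running this test once per measurement type.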

  7. A quantitative method for defining high-arched palate using the Tcof1+/− mutant mouse as a model

    PubMed Central

    Conley, Zachary R.; Hague, Molly; Kurosaka, Hiroshi; Dixon, Jill; Dixon, Michael J.; Trainor, Paul A.

    2016-01-01

    The palate functions as the roof of the mouth in mammals, separating the oral and nasal cavities. Its complex embryonic development and assembly poses unique susceptibilities to intrinsic and extrinsic disruptions. Such disruptions may cause failure of the developing palatal shelves to fuse along the midline resulting in a cleft. In other cases the palate may fuse at an arch, resulting in a vaulted oral cavity, termed high-arched palate. There are many models available for studying the pathogenesis of cleft palate but a relative paucity for high-arched palate. One condition exhibiting either cleft palate or high-arched palate is Treacher Collins syndrome, a congenital disorder characterized by numerous craniofacial anomalies. We quantitatively analyzed palatal perturbations in the Tcof1+/− mouse model of Treacher Collins syndrome, which phenocopies the condition in humans. We discovered that 46% of Tcof1+/− mutant embryos and newborn pups exhibit either soft clefts or full clefts. In addition, 17% of Tcof1+/− mutants were found to exhibit high-arched palate, defined as two sigma above the corresponding wild-type population mean for height and angular-based arch measurements. Furthermore, palatal shelf length and shelf width were decreased in all Tcof1+/− mutant embryos and pups compared to controls. Interestingly, these phenotypes were subsequently ameliorated through genetic inhibition of p53. The results of our study therefore provide a simple, reproducible and quantitative method for investigating models of high-arched palate. PMID:26772999

  8. The predictive value of the antioxidative function of HDL for cardiovascular disease and graft failure in renal transplant recipients.

    PubMed

    Leberkühne, Lynn J; Ebtehaj, Sanam; Dimova, Lidiya G; Dikkers, Arne; Dullaart, Robin P F; Bakker, Stephan J L; Tietge, Uwe J F

    2016-06-01

    Protection of low-density lipoproteins (LDL) against oxidative modification is a key anti-atherosclerotic property of high-density lipoproteins (HDL). This study evaluated the predictive value of the HDL antioxidative function for cardiovascular mortality, all-cause mortality and chronic graft failure in renal transplant recipients (RTR). The capacity of HDL to inhibit native LDL oxidation was determined in vitro in a prospective cohort of renal transplant recipients (RTR, n = 495, median follow-up 7.0 years). The HDL antioxidative functionality was significantly higher in patients experiencing graft failure (57.4 ± 9.7%) than in those without (54.2 ± 11.3%; P = 0.039), while there were no differences for cardiovascular and all-cause mortality. Specifically, glomerular filtration rate (P = 0.001) and C-reactive protein levels (P = 0.006) associated independently with antioxidative functionality in multivariate linear regression analyses. Cox regression analysis demonstrated a significant relationship between antioxidative functionality of HDL and graft failure in age-adjusted analyses, but significance was lost following adjustment for baseline kidney function and inflammatory load. No significant association was found between HDL antioxidative functionality and cardiovascular and all-cause mortality. This study demonstrates that the antioxidative function of HDL (i) does not predict cardiovascular or all-cause mortality in RTR, but (ii) conceivably contributes to the development of graft failure, however, not independent of baseline kidney function and inflammatory load. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  9. Heart failure—potential new targets for therapy

    PubMed Central

    Nabeebaccus, Adam; Zheng, Sean; Shah, Ajay M.

    2016-01-01

    Abstract Introduction/background Heart failure is a major cause of cardiovascular morbidity and mortality. This review covers current heart failure treatment guidelines, emerging therapies that are undergoing clinical trial, and potential new therapeutic targets arising from basic science advances. Sources of data A non-systematic search of MEDLINE was carried out. International guidelines and relevant reviews were searched for additional articles. Areas of agreement Angiotensin-converting enzyme inhibitors and beta-blockers are first line treatments for chronic heart failure with reduced left ventricular function. Areas of controversy Treatment strategies to improve mortality in heart failure with preserved left ventricular function are unclear. Growing points Many novel therapies are being tested for clinical efficacy in heart failure, including those that target natriuretic peptides and myosin activators. A large number of completely novel targets are also emerging from laboratory-based research. Better understanding of pathophysiological mechanisms driving heart failure in different settings (e.g. hypertension, post-myocardial infarction, metabolic dysfunction) may allow for targeted therapies. Areas timely for developing research Therapeutic targets directed towards modifying the extracellular environment, angiogenesis, cell viability, contractile function and microRNA-based therapies. PMID:27365454

  10. The Study of Cognitive Function and Related Factors in Patients With Heart Failure

    PubMed Central

    Ghanbari, Atefeh; Moaddab, Fatemeh; Salari, Arsalan; Kazemnezhad Leyli, Ehsan; Sedghi Sabet, Mitra; Paryad, Ezzat

    2013-01-01

    Background: Cognitive impairment is increasingly recognized as a common adverse consequence of heart failure. Both heart failure and cognitive impairment are associated with frequent hospitalization and increased mortality, particularly when they occur simultaneously. Objectives: To determine cognitive function and related factors in patients with heart failure. Materials and Methods: In this descriptive cross-sectional study, we assessed 239 patients with heart failure. Data were collected using the Mini-Mental State Examination, the Charlson comorbidity index and the NYHA classification system. Data were analyzed using descriptive statistics, the Kolmogorov-Smirnov test, chi-square test, t-test and logistic regression analysis. Results: The mean score of cognitive function was 21.68 ± 4.51. In total, 155 patients (64.9%) had cognitive impairment. Significant associations were found between the status of cognitive impairment and gender (P < 0.002), education level (P < 0.001), living location (P < 0.001), marital status (P < 0.03), living arrangement (P < 0.001), employment status (P < 0.001), income (P < 0.02), being the head of family (P < 0.03), the family size (P < 0.02), having a supplemental insurance (P < 0.003) and the patient’s comorbidities (P < 0.02). However, in logistic regression analysis, only education and supplementary insurance could predict cognitive status, which indicates that patients with supplementary insurance and higher education levels were more likely to maintain optimal cognitive function. Conclusions: More than half of the subjects had cognitive impairment. As patients' cognitive functioning affects their behaviors and daily living activities, it is recommended that patients with heart failure be assessed for their cognitive functioning. PMID:25414874

  11. Does an inter-flaw length control the accuracy of rupture forecasting in geological materials?

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian B.; Heap, Michael J.; Main, Ian G.; Lavallée, Yan; Dingwell, Donald B.

    2017-10-01

    Multi-scale failure of porous materials is an important phenomenon in nature and in material physics - from controlled laboratory tests to rockbursts, landslides, volcanic eruptions and earthquakes. A key unsolved research question is how to accurately forecast the time of system-sized catastrophic failure, based on observations of precursory events such as acoustic emissions (AE) in laboratory samples, or, on a larger scale, small earthquakes. Until now, the length scale associated with precursory events has not been well quantified, resulting in forecasting tools that are often unreliable. Here we test the hypothesis that the accuracy of the forecast failure time depends on the inter-flaw distance in the starting material. We use new experimental datasets for the deformation of porous materials to infer the critical crack length at failure from a static damage mechanics model. The style of acceleration of AE rate prior to failure, and the accuracy of forecast failure time, both depend on whether the cracks can span the inter-flaw length or not. A smooth inverse power-law acceleration of AE rate to failure, and an accurate forecast, occurs when the cracks are sufficiently long to bridge pore spaces. When this is not the case, the predicted failure time is much less accurate and failure is preceded by an exponential AE rate trend. Finally, we provide a quantitative and pragmatic correction for the systematic error in the forecast failure time, valid for structurally isotropic porous materials, which could be tested against larger-scale natural failure events, with suitable scaling for the relevant inter-flaw distances.
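
    The smooth inverse power-law acceleration of AE rate described above underlies the classic failure-forecast approach, in which the inverse of the precursor event rate is extrapolated linearly to zero to estimate the failure time. A minimal sketch, using synthetic AE rates rather than the experimental datasets of the study:

```python
def forecast_failure_time(times, rates):
    """Failure forecast method: for an inverse power-law acceleration,
    rate ~ (t_f - t)**-1, the inverse rate decays linearly in time, so a
    least-squares line fitted to 1/rate extrapolates to zero at t_f."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(inv) / n
    slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, inv)) \
        / sum((t - mean_t) ** 2 for t in times)
    intercept = mean_y - slope * mean_t
    return -intercept / slope  # time at which 1/rate reaches zero

# Synthetic AE rates accelerating toward failure at t_f = 100 s
true_tf = 100.0
times = [i * 2.0 for i in range(46)]            # 0 .. 90 s
rates = [1.0 / (true_tf - t) for t in times]
print(round(forecast_failure_time(times, rates), 3))  # -> 100.0
```

    When cracks cannot bridge the inter-flaw length and the AE rate instead grows exponentially, this linear extrapolation of the inverse rate systematically misses the true failure time, which is the forecasting error the authors quantify.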

  12. Antibodies to variant surface antigens of Plasmodium falciparum infected erythrocytes associated with protection from treatment failure and development of anaemia in pregnancy

    PubMed Central

    Feng, Gaoqian; Aitken, Elizabeth; Yosaatmadja, Francisca; Kalilani, Linda; Meshnick, Steven R; Jaworowski, Anthony; Simpson, Julie A; Rogerson, Stephen J

    2009-01-01

    Background In pregnancy associated malaria (PAM), Plasmodium falciparum infected erythrocytes (IEs) express variant surface antigens (VSA-PAM) that evade existing immunity and mediate placental sequestration. Antibodies to VSA-PAM develop with gravidity and block placental adhesion or opsonise IEs for phagocytic clearance, protecting women from anaemia and low birth weight. Methods and findings Using sera from 141 parasitemic pregnant Malawian women enrolled in a randomized trial of antimalarials and VSA-PAM-expressing CS2 IEs, we quantitated levels of IgG to VSA-PAM by flow cytometry and opsonizing antibodies by measuring uptake of IEs by THP1 promonocytes. After controlling for gravidity and antimalarial treatment, IgG against VSA-PAM was associated with decreased anaemia at delivery (OR=0.66; 95% confidence interval [CI] 0.46, 0.93; P=0.018) and weakly associated with decreased parasitological failure (OR=0.78; 95% CI, 0.60, 1.03; P=0.075), especially re-infection (OR=0.73; 95% CI, 0.53, 1.01; P=0.057). Opsonizing antibodies to CS2 IEs were associated with less maternal anaemia (OR=0.31; 95% CI, 0.13, 0.74; P=0.008) and less treatment failure (OR=0.48; 95% CI, 0.25, 0.90; P=0.023), primarily due to recrudescent infection (OR=0.49; 95% CI, 0.21, 1.12; P=0.089). Conclusion Both IgG antibody to VSA-PAM and opsonizing antibody, a functional measure of immunity, correlate with parasite clearance and less anaemia in pregnancy-associated malaria. PMID:19500037

  13. A Mechanistic Thermal Fatigue Model for SnAgCu Solder Joints

    NASA Astrophysics Data System (ADS)

    Borgesen, Peter; Wentlent, Luke; Hamasha, Sa'd.; Khasawneh, Saif; Shirazi, Sam; Schmitz, Debora; Alghoul, Thaer; Greene, Chris; Yin, Liang

    2018-02-01

    The present work offers both a complete, quantitative model and a conservative acceleration factor expression for the life span of SnAgCu solder joints in thermal cycling. A broad range of thermal cycling experiments, conducted over many years, has revealed a series of systematic trends that are not compatible with common damage functions or constitutive relations. Complementary mechanical testing and systematic studies of the evolution of the microstructure and damage have led to a fundamental understanding of the progression of thermal fatigue and failure. A special experiment was developed to allow the effective deconstruction of conventional thermal cycling experiments and the finalization of our model. According to this model, the evolution of damage and failure in thermal cycling is controlled by a continuous recrystallization process which is dominated by the coalescence and rotation of dislocation cell structures continuously added to during the high-temperature dwell. The dominance of this dynamic recrystallization contribution is not consistent with the common assumption of a correlation between the number of cycles to failure and the total work done on the solder joint in question in each cycle. It is, however, consistent with an apparent dependence on the work done during the high-temperature dwell. Importantly, the onset of this recrystallization is delayed by pinning on the Ag3Sn precipitates until these have coarsened sufficiently, leading to a model with two terms where one tends to dominate in service and the other in accelerated thermal cycling tests. Accumulation of damage under realistic service conditions with varying dwell temperatures and times is also addressed.

  14. A Study to Compare the Failure Rates of Current Space Shuttle Ground Support Equipment with the New Pathfinder Equipment and Investigate the Effect that the Proposed GSE Infrastructure Upgrade Might Have to Reduce GSE Infrastructure Failures

    NASA Technical Reports Server (NTRS)

    Kennedy, Barbara J.

    2004-01-01

    The purposes of this study are to compare the current Space Shuttle Ground Support Equipment (GSE) infrastructure with the proposed GSE infrastructure upgrade modification. The methodology will include analyzing the first prototype installation equipment at Launch Pad B, called the "Pathfinder". This study will begin by comparing the failure rate of the current components associated with the Hardware Interface Module (HIM) at the Kennedy Space Center to the failure rate of the new Pathfinder components. Quantitative data will be gathered specifically on HIM components and on the Pad B Hypergolic Fuel facility and Hypergolic Oxidizer facility areas, which have the upgraded Pathfinder equipment installed. The proposed upgrades include utilizing industrial control modules, software, and a fiber optic network. The results of this study provide evidence that there is a significant difference in the failure rates of the two studied infrastructure equipment components. There is also evidence that the support staff for each infrastructure system is not equal. A recommendation to continue with future upgrades is based on a significant reduction of failures in the newly installed ground system components.

  15. Reliability analysis for the smart grid : from cyber control and communication to physical manifestations of failure.

    DOT National Transportation Integrated Search

    2010-01-01

    The Smart Grid is a cyber-physical system comprised of physical components, such as transmission lines and generators, and a : network of embedded systems deployed for their cyber control. Our objective is to qualitatively and quantitatively analyze ...

  16. Try Fault Tree Analysis, a Step-by-Step Way to Improve Organization Development.

    ERIC Educational Resources Information Center

    Spitzer, Dean

    1980-01-01

    Fault Tree Analysis, a systems safety engineering technology used to analyze organizational systems, is described. Explains the use of logic gates to represent the relationship between failure events, qualitative analysis, quantitative analysis, and effective use of Fault Tree Analysis. (CT)

  17. Transitional care for formerly incarcerated persons with HIV: protocol for a realist review.

    PubMed

    Tsang, Jenkin; Mishra, Sharmistha; Rowe, Janet; O'Campo, Patricia; Ziegler, Carolyn; Kouyoumdjian, Fiona G; Matheson, Flora I; Bayoumi, Ahmed M; Zahid, Shatabdy; Antoniou, Tony

    2017-02-13

    Little is known about the mechanisms that influence the success or failure of programs to facilitate re-engagement with health and social services for formerly incarcerated persons with HIV. This review aims to identify how interventions to address such transitions work, for whom and under what circumstances. We will use realist review methodology to conduct our analysis. We will systematically search electronic databases and grey literature for English language qualitative and quantitative studies of interventions. Two investigators will independently screen citations and full-text articles, abstract data, appraise study quality and synthesize the literature. Data analysis will include identifying context-mechanism-outcome configurations, exploring and comparing patterns in these configurations, making comparisons across contexts and developing explanatory frameworks. This review will identify mechanisms that influence the success or failure of transition interventions for formerly incarcerated individuals with HIV. The findings will be integrated with those from complementary qualitative and quantitative studies to inform future interventions. PROSPERO CRD42016040054.

  18. Assessment and Management of Volume Overload and Congestion in Chronic Heart Failure: Can Measuring Blood Volume Provide New Insights?

    PubMed

    Miller, Wayne L

    2017-01-01

    Volume overload and fluid congestion remain primary clinical challenges in the assessment and management of patients with chronic heart failure (HF). The pathophysiology of volume regulation is complex, and the simple concept of passive intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to the central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can alert of changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in chronic HF. The quantitative assessment of intravascular volume is an effective tool to help guide individualized, appropriate therapy. Not all volume overload is the same, and the measurement of intravascular volume identifies heterogeneity to guide tailored therapy.

  19. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
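
    As an illustration of the reliability-estimation idea (a minimal sketch under a constant-failure-rate assumption, not the specific JSC tools, which the handbook does not detail here), an exponential model estimates the failure rate from observed inter-failure times and then gives the probability of failure-free operation:

```python
import math

def failure_rate_mle(interfailure_times):
    """Maximum-likelihood estimate of a constant failure rate under an
    exponential model: failures observed / total accumulated test time."""
    return len(interfailure_times) / sum(interfailure_times)

def reliability(t, lam):
    """Probability of failure-free operation for a duration t."""
    return math.exp(-lam * t)

# Hypothetical inter-failure times (hours) logged during testing
times = [12.0, 8.0, 20.0, 15.0, 25.0]
lam = failure_rate_mle(times)  # 5 failures / 80 h = 0.0625 per hour
print(round(reliability(10.0, lam), 4))  # -> 0.5353
```

    Growth models used in practice additionally let the estimated rate decrease as faults are found and fixed; the constant-rate form above is the simplest building block for such tradeoff studies.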

  20. Failure dynamics of the global risk network.

    PubMed

    Szymanski, Boleslaw K; Lin, Xin; Asztalos, Andrea; Sreenivasan, Sameet

    2015-06-18

    Risks threatening modern societies form an intricately interconnected network that often underlies crisis situations. Yet, little is known about how risk materializations in distinct domains influence each other. Here we present an approach in which expert assessments of the likelihoods and influence of risks underlie a quantitative model of the global risk network dynamics. The modeled risks range from environmental to economic and technological, and include difficult-to-quantify risks, such as geo-political and social. Using maximum likelihood estimation, we find the optimal model parameters and demonstrate that the model including network effects significantly outperforms the others, uncovering the full value of the expert-collected data. We analyze the model dynamics and study its resilience and stability. Our findings include such risk properties as contagion potential, persistence, roles in cascades of failures and the identity of risks most detrimental to system stability. The model provides quantitative means for measuring the adverse effects of risk interdependencies and the materialization of risks in the network.

  1. Life prediction of thermally highly loaded components: modelling the damage process of a rocket combustion chamber hot wall

    NASA Astrophysics Data System (ADS)

    Schwarz, W.; Schwub, S.; Quering, K.; Wiedmann, D.; Höppel, H. W.; Göken, M.

    2011-09-01

    During their operational lifetime, actively cooled liners of cryogenic combustion chambers are known to exhibit a characteristic so-called doghouse deformation, followed by the formation of axial cracks. The present work aims at developing a model that quantitatively accounts for this failure mechanism. High-temperature material behaviour is characterised in a test programme, and it is shown that stress relaxation, strain rate dependence, isotropic and kinematic hardening as well as material ageing have to be taken into account in the model formulation. From fracture surface analyses of a thrust chamber it is concluded that the failure mode of the hot wall ligament at the tip of the doghouse is related to ductile rupture. A material model is proposed that captures all stated effects. Based on the concept of continuum damage mechanics, the model is further extended to incorporate softening effects due to material degradation. The model is assessed on experimental data and quantitative agreement is established for all tests available. A 3D finite element thermo-mechanical analysis is performed on a representative thrust chamber applying the developed material-damage model. The simulation successfully captures the observed progressive thinning of the hot wall and quantitatively reproduces the doghouse deformation.

  2. Cochrane Qualitative and Implementation Methods Group guidance series-paper 4: methods for assessing evidence on intervention implementation.

    PubMed

    Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane

    2018-05-01

    This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Efficacy of positive airway pressure on brain natriuretic peptide in patients with heart failure and sleep-disorder breathing: a meta-analysis of randomized controlled trials.

    PubMed

    Zhang, Xiao-Bin; Yuan, Ya-Ting; Du, Yan-Ping; Jiang, Xing-Tang; Zeng, Hui-Qing

    2015-04-01

    Positive airway pressure (PAP) has been recognized as an effective therapeutic option for sleep-disordered breathing (SDB) in patients with heart failure (HF), and it can improve left ventricular function. Whether PAP can ameliorate serum brain natriuretic peptide (BNP) levels, a biomarker of HF, is controversial. The purpose of the present study was to quantitatively assess the efficacy of PAP on BNP in patients with HF and SDB. A systematic search of PubMed, Embase, Web of Science and the Cochrane Library identified six randomized controlled trials (RCTs), in which PAP was compared with medical therapy, subtherapeutic PAP or different types of PAP. The BNP data were extracted and pooled in a meta-analysis using STATA 12.0. In total, 6 RCTs (7 cohorts) with 222 patients were included in the analysis. The quality of each study was high and heterogeneity (I(2) = 58.1%) was noted between studies. A significant reduction of BNP was observed after PAP treatment in patients with HF and SDB (SMD -0.517, 95% CI -0.764 to -0.270, z = 4.11, p < 0.001). Our meta-analysis of RCTs demonstrated that PAP elicits a significant reduction of BNP in patients with HF and SDB.
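
    The pooling step behind such a meta-analysis can be sketched with inverse-variance weights; the DerSimonian-Laird random-effects estimator below is a standard choice given the noted heterogeneity, and the per-study effects shown are hypothetical, not the six RCTs analysed above:

```python
def pooled_smd_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling: inverse-variance weights
    are inflated by the between-study variance tau^2, estimated from
    Cochran's Q, before taking the weighted mean and its standard error."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return pooled, se

# Hypothetical per-study standardized mean differences and variances
effects = [-0.80, -0.30, -0.60, -0.45]
variances = [0.05, 0.08, 0.06, 0.07]
pooled, se = pooled_smd_random_effects(effects, variances)
print(round(pooled, 3), round(se, 3))
```

    With homogeneous studies Q falls below its degrees of freedom, tau^2 is truncated to zero, and the estimator reduces to ordinary fixed-effect inverse-variance pooling.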

  4. What failure in collective decision-making tells us about metacognition

    PubMed Central

    Bahrami, Bahador; Olsen, Karsten; Bang, Dan; Roepstorff, Andreas; Rees, Geraint; Frith, Chris

    2012-01-01

    Condorcet (1785) proposed that a majority vote drawn from individual, independent and fallible (but not totally uninformed) opinions provides near-perfect accuracy if the number of voters is adequately large. Research in social psychology has since then repeatedly demonstrated that collectives can and do fail more often than expected by Condorcet. Since human collective decisions often follow from exchange of opinions, these failures provide an exquisite opportunity to understand human communication of metacognitive confidence. This question can be addressed by recasting collective decision-making as an information-integration problem similar to multisensory (cross-modal) perception. Previous research in systems neuroscience shows that one brain can integrate information from multiple senses nearly optimally. Inverting the question, we ask: under what conditions can two brains integrate information about one sensory modality optimally? We review recent work that has taken this approach and report discoveries about the quantitative limits of collective perceptual decision-making, and the role of the mode of communication and feedback in collective decision-making. We propose that shared metacognitive confidence conveys the strength of an individual's opinion and its reliability inseparably. We further suggest that a functional role of shared metacognition is to provide substitute signals in situations where outcome is necessary for learning but unavailable or impossible to establish. PMID:22492752

  5. Beyond auscultation: acoustic cardiography in clinical practice.

    PubMed

    Wen, Yong-Na; Lee, Alex Pui-Wai; Fang, Fang; Jin, Chun-Na; Yu, Cheuk-Man

    2014-04-01

    Cardiac auscultation by stethoscope is widely used but limited by low sensitivity and accuracy. The phonocardiogram was developed in an attempt to provide quantitative and qualitative information on heart sounds and murmurs by transforming acoustic signals into visual waveforms. Although the phonocardiogram provides objective heart sound information and holds diagnostic potential for various heart problems, its examination procedure is time-consuming and requires specially trained technicians to operate the device. Acoustic cardiography (AUDICOR, Inovise Medical, Inc., Portland, OR, USA) is a major recent advance in the evolution of cardiac auscultation technology. The technique is more efficient and less operator-dependent. It synchronizes cardiac auscultation with ECG recording and provides a comprehensive assessment of both the mechanical and electrical function of the heart. The application of acoustic cardiography extends far beyond auscultation alone. It generates various parameters which have been proven to correlate with gold standards in heart failure diagnosis and ischemic heart disease detection. Its application can be extended to other diseases, including LV hypertrophy, constrictive pericarditis, sleep apnea and ventricular fibrillation. The newly developed ambulatory acoustic cardiography can potentially be used for heart failure follow-up in both home and hospital settings. This review comprehensively summarizes acoustic cardiographic research, including the most recent developments. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Assessment of Pancreatic β-Cell Function: Review of Methods and Clinical Applications

    PubMed Central

    Cersosimo, Eugenio; Solis-Herrera, Carolina; Trautmann, Michael E.; Malloy, Jaret; Triplitt, Curtis L.

    2014-01-01

    Type 2 diabetes mellitus (T2DM) is characterized by a progressive failure of pancreatic β-cell function (BCF) with insulin resistance. Once insulin over-secretion can no longer compensate for the degree of insulin resistance, hyperglycemia becomes clinically significant and deterioration of residual β-cell reserve accelerates. This pathophysiology has important therapeutic implications. Ideally, therapy should address the underlying pathology and should be started early along the spectrum of decreasing glucose tolerance in order to prevent or slow β-cell failure and reverse insulin resistance. The development of an optimal treatment strategy for each patient requires accurate diagnostic tools for evaluating the underlying state of glucose tolerance. This review focuses on the most widely used methods for measuring BCF within the context of insulin resistance and includes examples of their use in prediabetes and T2DM, with an emphasis on the most recent therapeutic options (dipeptidyl peptidase-4 inhibitors and glucagon-like peptide-1 receptor agonists). Methods of BCF measurement include the homeostasis model assessment (HOMA); oral glucose tolerance tests, intravenous glucose tolerance tests (IVGTT), and meal tolerance tests; and the hyperglycemic clamp procedure. To provide a meaningful evaluation of BCF, it is necessary to interpret all observations within the context of insulin resistance. Therefore, this review also discusses methods utilized to quantitate insulin-dependent glucose metabolism, such as the IVGTT and the euglycemic-hyperinsulinemic clamp procedures. In addition, an example is presented of a mathematical modeling approach that can use data from BCF measurements to develop a better understanding of BCF behavior and the overall status of glucose tolerance. PMID:24524730

  7. Liver congestion in heart failure contributes to inappropriately increased serum hepcidin despite anemia.

    PubMed

    Ohno, Yukako; Hanawa, Haruo; Jiao, Shuang; Hayashi, Yuka; Yoshida, Kaori; Suzuki, Tomoyasu; Kashimura, Takeshi; Obata, Hiroaki; Tanaka, Komei; Watanabe, Tohru; Minamino, Tohru

    2015-01-01

    Hepcidin is a key regulator of mammalian iron metabolism and is mainly produced by the liver. Hepcidin excess causes iron deficiency and anemia by inhibiting iron absorption from the intestine and iron release from macrophage stores. Anemia frequently complicates heart failure. In heart failure patients, the most frequent histologic appearance of the liver is congestion. However, it remains unclear whether liver congestion associated with heart failure influences hepcidin production, thereby contributing to anemia and functional iron deficiency. In this study, we investigated this relationship in clinical and basic studies. In clinical studies of consecutive heart failure patients (n = 320), anemia was a common comorbidity (41%). In heart failure patients without active infection and ongoing cancer (n = 30), the log-serum hepcidin concentration of patients with liver congestion was higher than that of those without liver congestion (p = 0.0316). Moreover, in heart failure patients with liver congestion (n = 19), anemia was associated with higher serum hepcidin concentrations, consistent with a type of anemia characterized by hepcidin induction. Subsequently, we produced a rat model of heart failure with liver congestion by injecting monocrotaline, which causes pulmonary hypertension. The monocrotaline-treated rats displayed liver congestion with increased hepcidin expression at 4 weeks after monocrotaline injection, followed by anemia and functional iron deficiency observed at 5 weeks. We conclude that liver congestion induces hepcidin production, which may result in anemia and functional iron deficiency in some patients with heart failure.

  8. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    In order to rectify the problems that the component reliability model exhibits deviation and that the evaluation result is biased low because failure propagation is overlooked in traditional reliability evaluation of machine center components, a new reliability evaluation method based on cascading failure analysis and failure influenced degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influenced degrees of the system components are assessed by the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, with the following results: 1) The reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) The difference between the comprehensive and inherent reliability of a system component presents a positive correlation with the failure influenced degree of that component, which provides a theoretical basis for reliability allocation of the machine center system.
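
    The ranking idea can be sketched with a plain power-iteration PageRank over a component failure-propagation digraph; the four-component adjacency matrix below is hypothetical, not the machine center studied above:

```python
def pagerank(adj, d=0.85, iters=100):
    """Power-iteration PageRank on a directed graph given as an adjacency
    matrix (adj[i][j] = 1 if failure of component i propagates to j).
    Dangling components redistribute their rank uniformly."""
    n = len(adj)
    out = [sum(row) for row in adj]
    r = [1.0 / n] * n
    for _ in range(iters):
        dangling = sum(r[i] for i in range(n) if out[i] == 0) / n
        r = [(1 - d) / n
             + d * (sum(r[i] * adj[i][j] / out[i] for i in range(n) if out[i])
                    + dangling)
             for j in range(n)]
    return r

# Hypothetical 4-component graph: component 0's failures propagate widely,
# component 3 sits downstream of most cascades.
adj = [[0, 1, 1, 1],
       [0, 0, 1, 0],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
scores = pagerank(adj)
print(max(range(4), key=scores.__getitem__))  # -> 3 (most failure-influenced)
```

    Running the same iteration on the transposed matrix ranks components by how strongly their failures influence the rest of the system, mirroring the paper's use of the adjacency matrix and its transposition.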

  9. Structural and Functional Phenotyping of the Failing Heart: Is the Left Ventricular Ejection Fraction Obsolete?

    PubMed

    Bristow, Michael R; Kao, David P; Breathett, Khadijah K; Altman, Natasha L; Gorcsan, John; Gill, Edward A; Lowes, Brian D; Gilbert, Edward M; Quaife, Robert A; Mann, Douglas L

    2017-11-01

    Diagnosis, prognosis, treatment, and development of new therapies for diseases or syndromes depend on a reliable means of identifying phenotypes associated with distinct predictive probabilities for these various objectives. Left ventricular ejection fraction (LVEF) provides the current basis for combined functional and structural phenotyping in heart failure by classifying patients as those with heart failure with reduced ejection fraction (HFrEF) and those with heart failure with preserved ejection fraction (HFpEF). Recently the utility of LVEF as the major phenotypic determinant of heart failure has been challenged based on its load dependency and measurement variability. We review the history of the development and adoption of LVEF as a critical measurement of LV function and structure and demonstrate that, in chronic heart failure, load dependency is not an important practical issue, and we provide hemodynamic and molecular biomarker evidence that LVEF is superior or equal to more unwieldy methods of identifying phenotypes of ventricular remodeling. We conclude that, because it reliably measures both left ventricular function and structure, LVEF remains the best current method of assessing pathologic remodeling in heart failure in both individual clinical and multicenter group settings. Because of the present and future importance of left ventricular phenotyping in heart failure, LVEF should be measured by using the most accurate technology and methodologic refinements available, and improved characterization methods should continue to be sought. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  10. [Hazard function and life table: an introduction to the failure time analysis].

    PubMed

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
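
    The equivalence of the hazard rate and the force of mortality sketched above can be made concrete for the simplest case discussed, the exponential distribution, whose hazard is constant. A minimal numerical check (illustrative, not taken from the article):

```python
import math

def survival_exp(t, lam):
    """Exponential failure-time model: S(t) = exp(-lam * t)."""
    return math.exp(-lam * t)

def hazard(t, lam, dt=1e-6):
    """Hazard (force of mortality) h(t) = f(t) / S(t), approximated
    numerically as -d/dt log S(t); constant for the exponential model."""
    return -(math.log(survival_exp(t + dt, lam))
             - math.log(survival_exp(t, lam))) / dt

lam = 0.2
print(round(hazard(0.5, lam), 6), round(hazard(5.0, lam), 6))  # -> 0.2 0.2
```

    Proportional hazards models keep this structure but scale a baseline hazard by covariate effects, and cause-specific hazards of this kind are exactly the decrements entering a multiple decrement life table.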

  11. Fluid Volume Overload and Congestion in Heart Failure: Time to Reconsider Pathophysiology and How Volume Is Assessed.

    PubMed

    Miller, Wayne L

    2016-08-01

    Volume regulation, assessment, and management remain basic issues in patients with heart failure. The discussion presented here is directed at opening a reassessment of the pathophysiology of congestion in congestive heart failure and the methods by which we determine volume overload status. Peer-reviewed historical and contemporary literatures are reviewed. Volume overload and fluid congestion remain primary issues for patients with chronic heart failure. The pathophysiology is complex, and the simple concept of intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can alert clinicians of changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in patients with chronic heart failure and help guide individualized, appropriate therapy-not all volume overload is the same. © 2016 American Heart Association, Inc.

  12. Clinical outcome of patients with heart failure and preserved left ventricular function.

    PubMed

    Gotsman, Israel; Zwas, Donna; Planer, David; Azaz-Livshits, Tanya; Admon, Dan; Lotan, Chaim; Keren, Andre

    2008-11-01

    Patients with heart failure have a poor prognosis. However, it has been presumed that patients with heart failure and preserved left ventricular function (LVF) may have a more benign prognosis. We evaluated the clinical outcome of patients with heart failure and preserved LVF compared with patients with reduced function and the factors affecting prognosis. We prospectively evaluated 289 consecutive patients hospitalized with a definite clinical diagnosis of heart failure based on typical symptoms and signs. They were divided into 2 subsets based on echocardiographic LVF. Patients were followed clinically for a period of 1 year. Echocardiography showed that more than one third (36%) of the patients had preserved systolic LVF. These patients were more likely to be older and female and have less ischemic heart disease. The survival at 1 year in this group was poor and not significantly different from patients with reduced LVF (75% vs 71%, respectively). The adjusted survival by Cox regression analysis was not significantly different (P=.25). However, patients with preserved LVF had fewer rehospitalizations for heart failure (25% vs 35%, P<.05). Predictors of mortality in the whole group by multivariate analysis were age, diabetes, chronic renal failure, atrial fibrillation, residence in a nursing home, and serum sodium < or = 135 mEq/L. The prognosis of patients with clinical heart failure with or without preserved LVF is poor. Better treatment modalities are needed in both subsets.

  13. Taurine Supplementation Improves Functional Capacity, Myocardial Oxygen Consumption, and Electrical Activity in Heart Failure.

    PubMed

    Ahmadian, Mehdi; Dabidi Roshan, Valiollah; Ashourpore, Eadeh

    2017-07-04

    Taurine is an amino acid found in the heart in very high concentrations. It is assumed that taurine contributes to several physiological functions of mammalian cells, such as osmoregulation, anti-inflammation, membrane stabilization, ion transport modulation, and regulation of oxidative stress and mitochondrial protein synthesis. The objective of the current study was to evaluate the effectiveness of taurine supplementation on functional capacity, myocardial oxygen consumption, and electrical activity in patients with heart failure. In a double-blind and randomly designed study, 16 patients with heart failure were assigned to two groups: taurine (TG, n = 8) and placebo (PG, n = 8). TG received 500-mg taurine supplementation three times per day for two weeks. A significant decrease in the values of Q-T segments (p < 0.01) and a significant increase in the values of P-R segments (p < 0.01) were detected following exercise post-supplementation in TG but not in PG. Significantly higher values of taurine concentration, T wave, Q-T segment, physical capacities, and lower values of cardiovascular capacities were detected post-supplementation in TG as compared with PG (all p values <0.01). Taurine significantly enhanced physical function and significantly reduced the cardiovascular function parameters following exercise. Our results also suggest that short-term taurine supplementation is an effective strategy for improving some selected hemodynamic parameters in heart failure patients. Together, these findings support the view that taurine improves cardiac function and functional capacity in patients with heart failure. This idea warrants further study.

  14. Systems for Lung Volume Standardization during Static and Dynamic MDCT-based Quantitative Assessment of Pulmonary Structure and Function

    PubMed Central

    Fuld, Matthew K.; Grout, Randall; Guo, Junfeng; Morgan, John H.; Hoffman, Eric A.

    2013-01-01

    Rationale and Objectives Multidetector-row Computed Tomography (MDCT) has emerged as a tool for quantitative assessment of parenchymal destruction, air trapping (density metrics) and airway remodeling (metrics relating airway wall and lumen geometry) in chronic obstructive pulmonary disease (COPD) and asthma. Critical to the accuracy and interpretability of these MDCT-derived metrics is the assurance that the lungs are scanned during a breath-hold at a standardized volume. Materials and Methods A computer-monitored turbine-based flow-meter system was developed to control patient breath-holds and facilitate static imaging at fixed percentages of the vital capacity. Due to calibration challenges with gas density changes during multi-breath xenon-CT, an alternative system was required. The design incorporated dual rolling-seal pistons. Both systems were tested in a laboratory environment and human subject trials. Results The turbine-based system successfully controlled lung volumes in 32/37 subjects, having a linear relationship for CT-measured air volume between repeated scans: for all scans, the mean and confidence interval of the differences (scan1-scan2) was −9 ml (−169, 151); for TLC alone, 6 ml (−164, 177); for FRC alone, −23 ml (−172, 126). The dual-piston system successfully controlled lung volume in 31/41 subjects. Study failures related largely to subject non-compliance with verbal instruction and gas leaks around the mouthpiece. Conclusion We demonstrate the successful use of a turbine-based system for static lung volume control and demonstrate its inadequacies for dynamic xenon-CT studies. Implementation of a dual rolling-seal spirometer has been shown to adequately control lung volume for multi-breath wash-in xenon-CT studies. These systems, coupled with proper patient coaching, provide the tools for the use of CT to quantitate regional lung structure and function. 
The wash-in xenon-CT method for assessing regional lung function, while not necessarily practical for routine clinical studies, provides for a dynamic protocol against which newly emerging single breath, dual-energy xenon-CT measures can be validated. PMID:22555001

  15. Integrating FMEA in a Model-Driven Methodology

    NASA Astrophysics Data System (ADS)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA calls for engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.
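
    An FMEA worksheet of the kind the abstract describes is conventionally prioritized by a Risk Priority Number (RPN = severity × occurrence × detection). That ranking is standard FMEA practice rather than something specific to this paper's SysML approach; the worksheet rows below are hypothetical. A minimal sketch:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int    # 1-10 (10 = worst effect)
    occurrence: int  # 1-10 (10 = most frequent)
    detection: int   # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        """Risk Priority Number, the conventional FMEA ranking metric."""
        return self.severity * self.occurrence * self.detection

# Hypothetical worksheet rows (not from the cited case study).
rows = [
    FailureMode("valve", "stuck open", 8, 3, 4),
    FailureMode("sensor", "drift", 5, 6, 7),
    FailureMode("pump", "seal leak", 7, 2, 3),
]
for fm in sorted(rows, key=lambda r: r.rpn, reverse=True):
    print(f"{fm.component:6s} {fm.mode:10s} RPN={fm.rpn}")
```

    Sorting by RPN surfaces the failure modes that most deserve design attention, which is the quantitative analysis step the generated worksheet feeds.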

  16. Therapeutic potential of functional selectivity in the treatment of heart failure.

    PubMed

    Christensen, Gitte Lund; Aplin, Mark; Hansen, Jakob Lerche

    2010-10-01

    Adrenergic and angiotensin receptors are prominent targets in pharmacological alleviation of cardiac remodeling and heart failure, but their use is associated with cardiodepressant side effects. Recent advances in our understanding of seven transmembrane receptor signaling show that it is possible to design ligands with "functional selectivity," acting as agonists on certain signaling pathways while antagonizing others. This represents a major pharmaceutical opportunity to separate desired from adverse effects governed by the same receptor. Accordingly, functionally selective ligands are currently pursued as next-generation drugs for superior treatment of heart failure. Copyright © 2010 Elsevier Inc. All rights reserved.

  17. The Identification of Software Failure Regions

    DTIC Science & Technology

    1990-06-01

    be used to detect non-obviously redundant test cases. A preliminary examination of the manual analysis method is performed with a set of programs ...failure regions are defined and a method of failure region analysis is described in detail. The thesis describes how this analysis may be used to detect...is the termination of the ability of a functional unit to perform its required function. (Glossary, 1983) The presence of faults in program code

  18. Intravenous Milrinone Infusion Improves Congestive Heart Failure Caused by Diastolic Dysfunction

    PubMed Central

    Albrecht, Carlos A.; Giesler, Gregory M.; Kar, Biswajit; Hariharan, Ramesh; Delgado, Reynolds M.

    2005-01-01

    Although there have been significant advances in the medical treatment of heart failure patients with impaired systolic function, very little is known about the diagnosis and treatment of diastolic dysfunction. We report the cases of 3 patients in New York Heart Association functional class IV who had echocardiographically documented diastolic dysfunction as the main cause of heart failure. All 3 patients received medical therapy with long-term milrinone infusion. PMID:16107121

  19. Zebrafish Heart Failure Models for the Evaluation of Chemical Probes and Drugs

    PubMed Central

    Monte, Aaron; Cook, James M.; Kabir, Mohd Shahjahan; Peterson, Karl P.

    2013-01-01

    Abstract Heart failure is a complex disease that involves genetic, environmental, and physiological factors. As a result, current medication and treatment for heart failure produces limited efficacy, and better medication is in demand. Although mammalian models exist, simple and low-cost models will be more beneficial for drug discovery and mechanistic studies of heart failure. We previously reported that aristolochic acid (AA) caused cardiac defects in zebrafish embryos that resemble heart failure. Here, we showed that cardiac troponin T and atrial natriuretic peptide were expressed at significantly higher levels in AA-treated embryos, presumably due to cardiac hypertrophy. In addition, several human heart failure drugs could moderately attenuate the AA-induced heart failure by 10%–40%, further verifying the model for drug discovery. We then developed a drug screening assay using the AA-treated zebrafish embryos and identified three compounds. Mitogen-activated protein kinase kinase inhibitor (MEK-I), an inhibitor for the MEK-1/2 known to be involved in cardiac hypertrophy and heart failure, showed nearly 60% heart failure attenuation. C25, a chalcone derivative, and A11, a phenolic compound, showed around 80% and 90% attenuation, respectively. Time course experiments revealed that, to obtain 50% efficacy, these compounds were required within different hours of AA treatment. Furthermore, quantitative polymerase chain reaction showed that C25, not MEK-I or A11, strongly suppressed inflammation. Finally, C25 and MEK-I, but not A11, could also rescue the doxorubicin-induced heart failure in zebrafish embryos. In summary, we have established two tractable heart failure models for drug discovery and three potential drugs have been identified that seem to attenuate heart failure by different mechanisms. PMID:24351044

  20. Quantitative troponin and death, cardiogenic shock, cardiac arrest and new heart failure in patients with non-ST-segment elevation acute coronary syndromes (NSTE ACS): insights from the Global Registry of Acute Coronary Events.

    PubMed

    Jolly, Sanjit S; Shenkman, Heather; Brieger, David; Fox, Keith A; Yan, Andrew T; Eagle, Kim A; Steg, P Gabriel; Lim, Ki-Dong; Quill, Ann; Goodman, Shaun G

    2011-02-01

    The objective of this study was to determine if the extent of quantitative troponin elevation predicted mortality as well as in-hospital complications of cardiac arrest, new heart failure and cardiogenic shock. 16,318 patients with non-ST-segment elevation acute coronary syndromes (NSTE ACS) from the Global Registry of Acute Coronary Events (GRACE) were included. The maximum 24 h troponin value as a multiple of the local laboratory upper limit of normal was used. The population was divided into five groups based on the degree of troponin elevation, and outcomes were compared. An adjusted analysis was performed using quantitative troponin as a continuous variable with adjustment for known prognostic variables. For each approximate 10-fold increase in the troponin ratio, there was an associated increase in cardiac arrest, sustained ventricular tachycardia (VT) or ventricular fibrillation (VF) (1.0, 2.4, 3.4, 5.9 and 13.4%; p<0.001 for linear trend), cardiogenic shock (0.5, 1.4, 2.0, 4.4 and 12.7%; p<0.001), new heart failure (2.5, 5.1, 7.4, 11.6 and 15.8%; p<0.001) and mortality (0.8, 2.2, 3.0, 5.3 and 14.0%; p<0.001). These findings were replicated using the troponin ratio as a continuous variable and adjusting for covariates (cardiac arrest, sustained VT or VF, OR 1.56, 95% CI 1.39 to 1.74; cardiogenic shock, OR 1.87, 95% CI 1.61 to 2.18; and new heart failure, OR 1.57, 95% CI 1.45 to 1.71). The degree of troponin elevation was predictive of early mortality (HR 1.61, 95% CI 1.44 to 1.81; p<0.001 for days 0-14) and longer term mortality (HR 1.18, 95% CI 1.07 to 1.30, p=0.001 for days 15-180). The extent of troponin elevation is an independent predictor of morbidity and mortality.

  1. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  2. Statistical analysis of field data for aircraft warranties

    NASA Astrophysics Data System (ADS)

    Lakey, Mary J.

    Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions, which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.
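
    The maximum likelihood estimation and confidence-interval steps described above can be sketched minimally by assuming (for illustration only) exponentially distributed failure times and a large-sample normal approximation; the failure times below are hypothetical, not drawn from the Air Force or Navy data:

```python
import math

def fit_exponential_mle(failure_times, z=1.96):
    """Maximum-likelihood fit of an exponential failure-time distribution.

    For i.i.d. exponential failure times the MLE of the rate is
    lambda_hat = n / sum(t_i); a large-sample (normal-approximation)
    confidence interval is lambda_hat * (1 +/- z / sqrt(n)).
    """
    n = len(failure_times)
    lam = n / sum(failure_times)
    half_width = z * lam / math.sqrt(n)
    return lam, (lam - half_width, lam + half_width)

# Hypothetical operating-hour failure data, for illustration only.
times = [120.0, 340.0, 95.0, 410.0, 230.0, 180.0, 305.0, 150.0]
lam, ci = fit_exponential_mle(times)
print(f"rate = {lam:.5f} failures/hour, 95% CI = ({ci[0]:.5f}, {ci[1]:.5f})")
```

    Real warranty analyses would first run a goodness-of-fit test to choose among candidate distributions (exponential, Weibull, lognormal) before estimating parameters.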

  3. Spherically Actuated Motor

    NASA Technical Reports Server (NTRS)

    Peeples, Steven

    2015-01-01

    A three-degree-of-freedom (DOF) spherical actuator is proposed that will replace functions requiring three single-DOF actuators in robotic manipulators, providing space and weight savings while reducing the overall failure rate. Exploration satellites, Space Station payload manipulators, and rovers requiring pan, tilt, and rotate movements need an actuator for each function. Not only does each actuator introduce additional failure modes and require bulky mechanical gimbals, each contains many moving parts, decreasing mean time to failure. A conventional robotic manipulator is shown in figure 1. Spherical motors perform all three actuation functions, i.e., three DOF, with only one moving part. Given a standard three-actuator system whose actuators each have the same failure rate as a spherical motor, the three-actuator system is approximately three times as likely to fail as the latter. Jet Propulsion Laboratory reliability studies of NASA robotic spacecraft have shown that mechanical hardware/mechanism failures are more frequent and more likely to significantly affect mission success than are electronic failures. Unfortunately, previously designed spherical motors have been unable to provide the performance needed by space missions. This inadequacy is also why they are unavailable commercially. An improved patentable spherically actuated motor (SAM) is proposed to provide the performance and versatility required by NASA missions.
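
    The reliability comparison above is the standard series-system calculation: with per-actuator failure probability p, a system that needs all three actuators fails with probability 1 − (1 − p)³, which is approximately 3p when p is small. A minimal sketch (the failure probability used is hypothetical):

```python
def system_failure_probability(p_single, n=3):
    """Failure probability of a series system of n independent actuators:
    the system fails if at least one actuator fails, i.e. 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p_single) ** n

p = 0.01  # hypothetical per-actuator failure probability
p_system = system_failure_probability(p, n=3)
print(p_system, p_system / p)  # the ratio approaches 3 as p shrinks
```
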

  4. Retrospective Analysis of a Classical Biological Control Programme

    USDA-ARS?s Scientific Manuscript database

    1. Classical biological control has been a key technology in the management of invasive arthropod pests globally for over 120 years, yet rigorous quantitative evaluations of programme success or failure are rare. Here, I used life table and matrix model analyses, and life table response experiments ...

  5. Activation of PPAR-α in the early stage of heart failure maintained myocardial function and energetics in pressure-overload heart failure.

    PubMed

    Kaimoto, Satoshi; Hoshino, Atsushi; Ariyoshi, Makoto; Okawa, Yoshifumi; Tateishi, Shuhei; Ono, Kazunori; Uchihashi, Motoki; Fukai, Kuniyoshi; Iwai-Kanai, Eri; Matoba, Satoaki

    2017-02-01

    The failing heart loses its metabolic flexibility, relying increasingly on glucose as its preferential substrate and decreasing fatty acid oxidation (FAO). Peroxisome proliferator-activated receptor α (PPAR-α) is a key regulator of this substrate shift. However, its role during heart failure is complex and remains unclear. Recent studies reported that heart failure develops in the heart of myosin heavy chain-PPAR-α transgenic mice in a manner similar to that of diabetic cardiomyopathy, whereas cardiac dysfunction is enhanced in PPAR-α knockout mice in response to chronic pressure overload. We created a pressure-overload heart failure model in mice through transverse aortic constriction (TAC) and activated PPAR-α during heart failure using an inducible transgenic model. After 8 wk of TAC, left ventricular (LV) function had decreased with the reduction of PPAR-α expression in wild-type mice. We examined the effect of PPAR-α induction during heart failure using the Tet-Off system. Eight weeks after the TAC operation, LV contraction was preserved significantly by PPAR-α induction with an increase in PPAR-α-targeted genes related to fatty acid metabolism. The increase of expression of fibrosis-related genes was significantly attenuated by PPAR-α induction. Metabolic rates measured by isolated heart perfusions showed a reduction in FAO and glucose oxidation in TAC hearts, but the rate of FAO was preserved significantly owing to the induction of PPAR-α. Myocardial high-energy phosphates were significantly preserved by PPAR-α induction. These results suggest that PPAR-α activation during pressure-overloaded heart failure improved myocardial function and energetics. Thus activating PPAR-α and modulating FAO could be a promising therapeutic strategy for heart failure. NEW & NOTEWORTHY The present study demonstrates the role of PPAR-α activation in the early stage of heart failure using an inducible transgenic mouse model. 
Induction of PPAR-α preserved heart function and myocardial energetics. Activating PPAR-α and modulating fatty acid oxidation could be a promising therapeutic strategy for heart failure. Copyright © 2017 the American Physiological Society.

  6. Review and Analysis of Existing Mobile Phone Apps to Support Heart Failure Symptom Monitoring and Self-Care Management Using the Mobile Application Rating Scale (MARS).

    PubMed

    Masterson Creber, Ruth M; Maurer, Mathew S; Reading, Meghan; Hiraldo, Grenny; Hickey, Kathleen T; Iribarren, Sarah

    2016-06-14

    Heart failure is the most common cause of hospital readmissions among Medicare beneficiaries and these hospitalizations are often driven by exacerbations in common heart failure symptoms. Patient collaboration with health care providers and decision making is a core component of increasing symptom monitoring and decreasing hospital use. Mobile phone apps offer a potentially cost-effective solution for symptom monitoring and self-care management at the point of need. The purpose of this review of commercially available apps was to identify and assess the functionalities of patient-facing mobile health apps targeted toward supporting heart failure symptom monitoring and self-care management. We searched 3 Web-based mobile app stores using multiple terms and combinations (eg, "heart failure," "cardiology," "heart failure and self-management"). Apps meeting inclusion criteria were evaluated using the Mobile Application Rating Scale (MARS), IMS Institute for Healthcare Informatics functionality scores, and Heart Failure Society of America (HFSA) guidelines for nonpharmacologic management. Apps were downloaded and assessed independently by 2-4 reviewers, interclass correlations between reviewers were calculated, and consensus was met by discussion. Of 3636 potentially relevant apps searched, 34 met inclusion criteria. Most apps were excluded because they were unrelated to heart failure, not in English or Spanish, or were games. Interrater reliability between reviewers was high. AskMD app had the highest average MARS total (4.9/5). More than half of the apps (23/34, 68%) had acceptable MARS scores (>3.0). Heart Failure Health Storylines (4.6) and AskMD (4.5) had the highest scores for behavior change. Factoring MARS, functionality, and HFSA guideline scores, the highest performing apps included Heart Failure Health Storylines, Symple, ContinuousCare Health App, WebMD, and AskMD. Peer-reviewed publications were identified for only 3 of the 34 apps. 
This review suggests that few apps meet prespecified criteria for quality, content, or functionality, highlighting the need for further refinement and mapping to evidence-based guidelines and room for overall quality improvement in heart failure symptom monitoring and self-care related apps.

  7. Comparison of frequencies of left ventricular systolic and diastolic heart failure in Chinese living in Hong Kong.

    PubMed

    Yip, G W; Ho, P P; Woo, K S; Sanderson, J E

    1999-09-01

    There is a wide variation (13% to 74%) in the reported prevalence of heart failure associated with normal left ventricular (LV) systolic function (diastolic heart failure). There is no published information on this condition in China. To ascertain the prevalence of diastolic heart failure in this community, 200 consecutive patients with the typical features of congestive heart failure were studied with standard 2-dimensional Doppler echocardiography. A LV ejection fraction (LVEF) >45% was considered normal. The results showed that 12.5% had significant valvular heart disease. Of the remaining 175 patients, 132 had a LVEF >45% (75%). Therefore, 66% of patients with a clinical diagnosis of heart failure had a normal LVEF. Heart failure with normal LV systolic function was more common than systolic heart failure in those >70 years old (65% vs 47%; p = 0.015). Most (57%) had an abnormal relaxation pattern in diastole and 14% had a restrictive filling pattern. In the systolic heart failure group, a restrictive filling pattern was more common (46%). There were no significant differences in the sex distribution, etiology, or prevalence of LV hypertrophy between these 2 heart failure groups. In conclusion, heart failure with a normal LVEF or diastolic heart failure is more common than systolic heart failure in Chinese patients with the symptoms of heart failure. This may be related to older age at presentation and the high prevalence of hypertension in this community.

  8. Functional Interrupts and Destructive Failures from Single Event Effect Testing of Point-Of-Load Devices

    NASA Technical Reports Server (NTRS)

    Chen, Dakai; Phan, Anthony; Kim, Hak; Swonger, James; Musil, Paul; LaBel, Kenneth

    2013-01-01

    We show examples of single event functional interrupt and destructive failure in modern POL devices. The increasing complexity and diversity of the design and process introduce hard SEE modes that are triggered by various mechanisms.

  9. A finite element evaluation of the moment arm hypothesis for altered vertebral shear failure force.

    PubMed

    Howarth, Samuel J; Karakolis, Thomas; Callaghan, Jack P

    2015-01-01

    The mechanism of vertebral shear failure is likely a bending moment generated about the pars interarticularis by facet contact, and the moment arm length (MAL) between the centroid of facet contact and the location of pars interarticularis failure has been hypothesised to be an influential modulator of shear failure force. To quantitatively evaluate this hypothesis, anterior shear of C3 over C4 was simulated in a finite element model of the porcine C3-C4 vertebral joint with each combination of five compressive force magnitudes (0-60% of estimated compressive failure force) and three postures (flexed, neutral and extended). Bilateral locations of peak stress within C3's pars interarticularis were identified along with the centroids of contact force on the inferior facets. These measurements were used to calculate the MAL of facet contact force. Changes in MAL were also related to shear failure forces measured from similar in vitro tests. Flexed and extended vertebral postures respectively increased and decreased the MAL by 6.6% and 4.8%. The MAL decreased by only 2.6% from the smallest to the largest compressive force. Furthermore, altered MAL explained 70% of the variance in measured shear failure force from comparable in vitro testing with larger MALs being associated with lower shear failure forces. Our results confirmed that the MAL is indeed a significant modulator of vertebral shear failure force. Considering spine flexion is necessary when assessing low-back shear injury potential because of the association between altered facet articulation and lower vertebral shear failure tolerance.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.

    We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to failure or non-failure regions, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, regions uncovered with spheres will shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction one, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
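
    POF-Darts itself combines disk packing, Voronoi decomposition, and surrogate construction, which is beyond a short sketch. For contrast, the brute-force baseline such methods aim to beat, direct Monte Carlo estimation of the probability of failure for a known limit-state function, can be sketched as follows (the limit state and sampler below are toy assumptions):

```python
import random

def monte_carlo_pof(limit_state, sampler, n_samples=100_000, seed=0):
    """Estimate P(failure) = P(limit_state(x) < 0) by direct Monte Carlo.
    Unlike POF-Darts, no evaluations are reused and no surrogate is built;
    every sample costs one function evaluation."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_samples) if limit_state(sampler(rng)) < 0)
    return failures / n_samples

# Toy limit state: "failure" when x1 + x2 > 1.5 on the unit square,
# whose exact failure probability is 0.125 (a corner triangle).
g = lambda x: 1.5 - (x[0] + x[1])
sample = lambda rng: (rng.random(), rng.random())
print(monte_carlo_pof(g, sample))
```

    The sample-count advantage claimed for POF-Darts is against exactly this kind of estimator, whose accuracy improves only as 1/√n.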

  11. Investigation of advanced fault insertion and simulator methods

    NASA Technical Reports Server (NTRS)

    Dunn, W. R.; Cottrell, D.

    1986-01-01

    The cooperative agreement partly supported research leading to the open-literature publication cited. Additional efforts under the agreement included research into fault modeling of semiconductor devices. Results of this research are presented in this report which is summarized in the following paragraphs. As a result of the cited research, it appears that semiconductor failure mechanism data is abundant but of little use in developing pin-level device models. Failure mode data on the other hand does exist but is too sparse to be of any statistical use in developing fault models. What is significant in the failure mode data is that, unlike classical logic, MSI and LSI devices do exhibit more than 'stuck-at' and open/short failure modes. Specifically they are dominated by parametric failures and functional anomalies that can include intermittent faults and multiple-pin failures. The report discusses methods of developing composite pin-level models based on extrapolation of semiconductor device failure mechanisms, failure modes, results of temperature stress testing and functional modeling. Limitations of this model particularly with regard to determination of fault detection coverage and latency time measurement are discussed. Indicated research directions are presented.

  12. An Acuity Tool for Heart Failure Case Management: Quantifying Workload, Service Utilization, and Disease Severity.

    PubMed

    Kilgore, Matthew D

    The cardiology service line director at a health maintenance organization (HMO) in Washington State required a valid, reliable, and practical means for measuring workloads and other productivity factors for six heart failure (HF) registered nurse case managers located across three geographical regions. The Kilgore Heart Failure Case Management (KHFCM) Acuity Tool was systematically designed, developed, and validated to measure workload as a dependent function of the number of heart failure case management (HFCM) services rendered and the duration of times spent on various care duties. Research and development occurred at various HMO-affiliated internal medicine and cardiology offices throughout Western Washington. The concepts, methods, and principles used to develop the KHFCM Acuity Tool are applicable for any type of health care professional aiming to quantify workload using a high-quality objective tool. The content matter, scaling, and language on the KHFCM Acuity Tool are specific to HFCM settings. The content matter and numeric scales for the KHFCM Acuity Tool were developed and validated using a mixed-method participant action research method applied to a group of six outpatient HF case managers and their respective caseloads. The participant action research method was selected, because the application of this method requires research participants to become directly involved in the diagnosis of research problems, the planning and execution of actions taken to address those problems, and the implementation of progressive strategies throughout the course of the study, as necessary, to produce the most credible and practical practice improvements. Heart failure case managers served clients with New York Heart Association Functional Class III-IV HF, and encounters were conducted primarily by telephone or in-office consultation. 
A mix of qualitative and quantitative results demonstrated a variety of quality improvement outcomes achieved by the design and practice application of the KHFCM Acuity Tool. Quality improvement outcomes included a more valid reflection of encounter times and demonstration of the KHFCM Acuity Tool as a reliable, practical, credible, and satisfying tool for reflecting HF case manager workloads and HF disease severity. The KHFCM Acuity Tool defines workload simply as a function of the number of HFCM services performed and the duration of time spent on a client encounter. The design of the tool facilitates the measure of workload, service utilization, and HF disease characteristics, independently from the overall measure of acuity, so that differences in individual case manager practice, as well as client characteristics within sites, across sites, and potentially throughout annual seasons, can be demonstrated. Data produced from long-term applications of the KHFCM Acuity Tool, across all regions, could serve as a driver for establishing systemwide HFCM productivity benchmarks or standards of practice for HF case managers. Data produced from localized applications could serve as a reference for coordinating staffing resources or developing HFCM productivity benchmarks within individual regions or sites.
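The tool's core definition above, workload as a function of services performed and time spent per encounter, can be sketched in a few lines. This is a hypothetical illustration: the function names, field layout, and numbers are assumptions, not the published KHFCM scales.

```python
# Hypothetical sketch of the acuity tool's core definition: workload as a
# function of the number of HFCM services performed and the time spent per
# encounter. Values and names are illustrative, not the published scales.

def encounter_workload(num_services, minutes_per_service):
    """Workload units for one client encounter."""
    return num_services * minutes_per_service

def caseload_workload(encounters):
    """Total workload across a case manager's encounters."""
    return sum(encounter_workload(n, m) for n, m in encounters)

# three encounters: (services rendered, average minutes per service)
print(caseload_workload([(3, 15), (1, 40), (2, 20)]))  # 45 + 40 + 40 = 125
```

Measuring services and durations separately, as the tool does, is what lets service utilization and disease characteristics be reported independently of the overall acuity score.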

  13. Methods for the Determination of Rates of Glucose and Fatty Acid Oxidation in the Isolated Working Rat Heart

    PubMed Central

    Bakrania, Bhavisha; Granger, Joey P.; Harmancey, Romain

    2016-01-01

    The mammalian heart is a major consumer of ATP and requires a constant supply of energy substrates for contraction. Not surprisingly, alterations of myocardial metabolism have been linked to the development of contractile dysfunction and heart failure. Therefore, unraveling the link between metabolism and contraction should shed light on some of the mechanisms governing cardiac adaptation or maladaptation in disease states. The isolated working rat heart preparation can be used to follow, simultaneously and in real time, cardiac contractile function and flux of energy providing substrates into oxidative metabolic pathways. The present protocol aims to provide a detailed description of the methods used in the preparation and utilization of buffers for the quantitative measurement of the rates of oxidation for glucose and fatty acids, the main energy providing substrates of the heart. The methods used for sample analysis and data interpretation are also discussed. In brief, the technique is based on the supply of 14C-radiolabeled glucose and a 3H-radiolabeled long-chain fatty acid to an ex vivo beating heart via normothermic crystalloid perfusion. 14CO2 and 3H2O, the end products of the enzymatic reactions involved in the utilization of these energy providing substrates, are then quantitatively recovered from the coronary effluent. With knowledge of the specific activity of the radiolabeled substrates used, it is then possible to individually quantitate the flux of glucose and fatty acid in the oxidation pathways. Contractile function of the isolated heart can be determined in parallel with the appropriate recording equipment and directly correlated to metabolic flux values. The technique is extremely useful to study the metabolism/contraction relationship in response to various stress conditions such as alterations in preload and afterload, ischemia, a drug or a circulating factor, or following the alteration in the expression of a gene product. PMID:27768055
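The flux arithmetic described above reduces to dividing the labeled end product recovered per minute by the specific activity of the supplied tracer. A minimal sketch, with invented numbers and an assumed normalization per gram of dry heart weight:

```python
# Minimal sketch of the tracer flux calculation: substrate oxidation rate
# equals labeled end product recovered (dpm/min) divided by the specific
# activity of the supplied tracer (dpm/umol). The normalization per gram
# dry weight and all numbers are illustrative assumptions.

def oxidation_rate(product_dpm_per_min, specific_activity_dpm_per_umol,
                   heart_dry_weight_g):
    """Return substrate oxidation in umol / min / g dry weight."""
    return (product_dpm_per_min / specific_activity_dpm_per_umol
            / heart_dry_weight_g)

# e.g. 14CO2 recovered at 12,000 dpm/min from glucose supplied at
# 40,000 dpm/umol, for a heart of 0.15 g dry weight:
print(round(oxidation_rate(12000, 40000, 0.15), 2))  # 2.0 umol/min/g
```

The same arithmetic applies to the 3H2O recovered from fatty acid oxidation, using the specific activity of the labeled fatty acid.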

  14. Induction of ovarian function by using short-term human menopausal gonadotrophin in patients with ovarian failure following cytotoxic chemotherapy for haematological malignancy.

    PubMed

    Chatterjee, R; Mills, W; Katz, M; McGarrigle, H H; Goldstone, A H

    1993-07-01

    Currently no treatment has proved successful in inducing ovarian steroidogenic and/or gametogenic recovery in patients with haematological malignancies treated by cytotoxic chemotherapy once biochemical failure becomes manifest, i.e., when FSH levels exceed 40 IU/L. This paper reports two such cases with classical biochemical ovarian failure in which ovarian function was induced by brief stimulation with Human Menopausal Gonadotrophin (HMG).

  15. Risk factors for treatment failure and recurrence of anisometropic amblyopia.

    PubMed

    Kirandi, Ece Uzun; Akar, Serpil; Gokyigit, Birsen; Onmez, Funda Ebru Aksoy; Oto, Sibel

    2017-08-01

    The aim of this study was to identify factors associated with failed vision improvement and recurrence following occlusion therapy for anisometropic amblyopia in children aged 7-9 years. We retrospectively reviewed the medical records of 64 children aged 7-9 years who had been diagnosed as having anisometropic amblyopia and were treated with patching. Functional treatment failure was defined as final visual acuity in the amblyopic eye of worse than 20/32. Improvement of fewer than two logMAR lines was considered relative treatment failure. Recurrence was defined as the reduction of at least two logMAR lines of visual acuity after decreased or discontinued patching. Functional and relative success rates were 51.6% and 62.5%, respectively. The most important factor for functional treatment failure [adjusted odds ratio (OR) (95% confidence interval, CI) 11.57 (1.4-95.74)] and the only risk factor for recurrence [adjusted OR (95% CI) 3.04 (1.13-8.12)] were the same: high spherical equivalent (SE) of the amblyopic eye. A large interocular difference in best-corrected visual acuity was found to be a risk factor for both functional and relative failure. High SE of the amblyopic eye was the most influential risk factor for treatment failure and recurrence in compliant children aged 7-9 years.
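The adjusted odds ratios above come from multivariable models, which a hand calculation cannot reproduce, but the underlying arithmetic of a crude (unadjusted) OR and its 95% CI from a 2x2 table is straightforward. The counts below are invented for illustration:

```python
import math

# Sketch of a crude odds ratio and 95% CI from a 2x2 table
# (failures/successes by risk-factor exposure). The counts are invented;
# the paper's adjusted ORs come from a multivariable model, which this
# simple calculation does not reproduce.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = failures/successes among exposed; c,d = among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR), Woolf method
    lo, hi = (math.exp(math.log(or_) + s * se) for s in (-z, z))
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(12, 8, 10, 34)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.0, as for both reported risk factors, is what marks the association as statistically significant at the 5% level.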

  16. Metformin improves cardiac function in mice with heart failure after myocardial infarction by regulating mitochondrial energy metabolism.

    PubMed

    Sun, Dan; Yang, Fei

    2017-04-29

    To investigate whether metformin can improve cardiac function by improving mitochondrial function in a model of heart failure after myocardial infarction. Male C57/BL6 mice aged about 8 weeks were selected, and the anterior descending branch was ligated to establish the heart failure model after myocardial infarction. Cardiac function was evaluated via ultrasound after 3 days to confirm successful modeling, and the mice were randomly divided into two groups. The saline group (Saline) received intragastric administration of normal saline for 4 weeks, and the metformin group (Met) received intragastric administration of metformin for 4 weeks. A sham group (Sham) was also included. Changes in cardiac function were assessed at 4 weeks after operation. Hearts were harvested after 4 weeks, and apoptosis in myocardial tissue was detected using the TUNEL method; fresh mitochondria were isolated, and changes in the oxygen consumption rate (OCR) and respiratory control rate (RCR) of mitochondria in each group were measured with a bio-energy metabolism analyzer, while changes in the mitochondrial membrane potential (MMP) of myocardial tissue were detected via JC-1 staining; the expression of Bcl-2, Bax, Sirt3, PGC-1α and acetylated PGC-1α in myocardial tissue was detected by Western blot. RT-PCR was used to detect mRNA levels of Sirt3 in myocardial tissue. Metformin improved the systolic function of heart failure model mice after myocardial infarction and reduced the apoptosis of myocardial cells. Myocardial mitochondrial respiratory function and membrane potential were decreased after myocardial infarction, and metformin treatment significantly improved both; metformin also up-regulated the expression of Sirt3 and the activity of PGC-1α in myocardial tissue in heart failure after myocardial infarction.
Metformin decreases the acetylation level of PGC-1α by up-regulating Sirt3, mitigates the damage to mitochondrial membrane potential in the model of heart failure after myocardial infarction, and improves the respiratory function of mitochondria, thus improving cardiac function in mice. Copyright © 2017. Published by Elsevier Inc.

  17. Mitochondria and heart failure.

    PubMed

    Murray, Andrew J; Edwards, Lindsay M; Clarke, Kieran

    2007-11-01

    Energetic abnormalities in cardiac and skeletal muscle occur in heart failure and correlate with clinical symptoms and mortality. It is likely that the cellular mechanism leading to energetic failure involves mitochondrial dysfunction. Therefore, it is crucial to elucidate the causes of mitochondrial myopathy, in order to improve cardiac and skeletal muscle function, and hence quality of life, in heart failure patients. Recent studies identified several potential stresses that lead to mitochondrial dysfunction in heart failure. Chronically elevated plasma free fatty acid levels in heart failure are associated with decreased metabolic efficiency and cellular insulin resistance. Tissue hypoxia, resulting from low cardiac output and endothelial impairment, can lead to oxidative stress and mitochondrial DNA damage, which in turn causes dysfunction and loss of mitochondrial mass. Therapies aimed at protecting mitochondrial function have shown promise in patients and animal models with heart failure. Despite current therapies, which provide substantial benefit to patients, heart failure remains a relentlessly progressive disease, and new approaches to treatment are necessary. Novel pharmacological agents are needed that optimize substrate metabolism and maintain mitochondrial integrity, improve oxidative capacity in heart and skeletal muscle, and alleviate many of the clinical symptoms associated with heart failure.

  18. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that changing the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods outperform maximum likelihood. The sensitivity analyses show some sensitivity to shifts of the prior locations. 
They also show the robustness of the Bayesian analysis within the range between the true value and the maximum likelihood estimate.
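The competing-risks setup described above can be sketched with a small simulation: each unit fails at the earlier of two independent Weibull failure times, and the cause is recorded. As a simplifying assumption, the estimation step below fixes the shape parameter at 1 (the exponential special case of the Weibull), for which the cause-specific maximum likelihood estimate has a closed form; the paper's Bayesian analysis is not reproduced.

```python
import random

# Competing-risks simulation sketch: two independent Weibull causes, the
# system fails at the earlier time, and the cause is recorded. The MLE
# step assumes shape = 1 (exponential special case); parameters invented.

random.seed(1)

def weibull_draw(shape, scale):
    return random.weibullvariate(scale, shape)  # note: (scale, shape) order

def simulate(n, causes):
    """causes: dict name -> (shape, scale). Return (time, cause) pairs."""
    data = []
    for _ in range(n):
        draws = {c: weibull_draw(*p) for c, p in causes.items()}
        cause = min(draws, key=draws.get)
        data.append((draws[cause], cause))
    return data

data = simulate(5000, {"wear": (1.0, 100.0), "shock": (1.0, 250.0)})
total_time = sum(t for t, _ in data)
# Exponential MLE: cause-specific rate = failures from that cause / total time
for c in ("wear", "shock"):
    d = sum(1 for _, cause in data if cause == c)
    print(c, round(d / total_time, 5))  # true rates: 0.01 and 0.004
```

Replacing the closed-form MLE with a posterior computed from a prior on the rates would give the Bayesian counterpart studied in the paper.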

  19. Failure of platelet parameters and biomarkers to correlate platelet function to severity and etiology of heart failure in patients enrolled in the EPCOT trial. With special reference to the Hemodyne hemostatic analyzer. Whole Blood Impedance Aggregometry for the Assessment of Platelet Function in Patients with Congestive Heart Failure.

    PubMed

    Serebruany, Victor L; McKenzie, Marcus E; Meister, Andrew F; Fuzaylov, Sergey Y; Gurbel, Paul A; Atar, Dan; Gattis, Wendy A; O'Connor, Christopher M

    2002-01-01

    Data from small studies have suggested the presence of platelet abnormalities in patients with congestive heart failure (CHF). We sought to characterize the diagnostic utility of different platelet parameters and platelet-endothelial biomarkers in a random outpatient CHF population investigated in the EPCOT ('Whole Blood Impedance Aggregometry for the Assessment of Platelet Function in Patients with Congestive Heart Failure') Trial. Blood samples were obtained for measurement of platelet contractile force (PCF), whole blood aggregation, shear-induced closure time, expression of glycoprotein (GP) IIb/IIIa, and P-selectin in 100 consecutive patients with CHF. Substantial interindividual variability of platelet characteristics exists in patients with CHF. There were no statistically significant differences when patients were grouped according to incidence of vascular events, emergency revascularization needs, survival, or etiology of heart failure. Aspirin use did not affect instrument readings either. PCF correlates very poorly with whole blood aggregometry (r(2) = 0.023), closure time (r(2) = 0.028), platelet GP IIb/IIIa (r(2) = 0.0028), and P-selectin (r(2) = 0.002) expression. Furthermore, there was no correlation with brain natriuretic peptide concentrations, a marker of severity and prognosis in heart failure reflecting the neurohumoral status. Patients with heart failure enrolled in the EPCOT Trial exhibited a marginal, sometimes oppositely directed change in platelet function, challenging the diagnostic utility of these platelet parameters and biomarkers to serve as useful tools for the identification of platelet abnormalities, for predicting clinical outcomes, or for monitoring antiplatelet strategies in this population. The usefulness of these measurements for assessing platelets in the different clinical settings remains to be explored. 
Taken together, contrary to our expectations, major clinical characteristics of heart failure did not correlate well with the platelet characteristics investigated in this study. Copyright 2002 S. Karger AG, Basel

  20. Morphologic Risk Factors in Predicting Symptomatic Structural Failure of Arthroscopic Rotator Cuff Repairs: Tear Size, Location, and Atrophy Matter.

    PubMed

    Gasbarro, Gregory; Ye, Jason; Newsome, Hillary; Jiang, Kevin; Wright, Vonda; Vyas, Dharmesh; Irrgang, James J; Musahl, Volker

    2016-10-01

    To evaluate whether morphologic characteristics of rotator cuff tear have prognostic value in determining symptomatic structural failure of arthroscopic rotator cuff repair independent of age or gender. Arthroscopic rotator cuff repair cases performed by five fellowship-trained surgeons at our institution from 2006 to 2013 were retrospectively reviewed. Data extraction included demographics, comorbidities, repair technique, clinical examination, and radiographic findings. Failure in symptomatic patients was defined as structural defect on postoperative magnetic resonance imaging or pseudoparalysis on examination. Failures were age and gender matched with successful repairs in a 1:2 ratio. A total of 30 failures and 60 controls were identified. Supraspinatus atrophy (P = .03) and tear size (18.3 mm failures v 13.9 mm controls; P = .02) were significant risk factors for failure, as was the presence of an infraspinatus tear greater than 10 mm (62% v 17%, P < .01). Single-row repair (P = .06) and simple suture configuration (P = .17) were more common but similar between groups. Diabetes mellitus and active tobacco use were not significantly associated with increased failure risk but psychiatric medication use was more frequent in the failure group. This study confirms previous suspicions that tear size and fatty infiltration are associated with failure of arthroscopic rotator cuff repair but independent of age or gender in symptomatic patients. There is also a quantitative cutoff on magnetic resonance imaging for the size of infraspinatus involvement that can be used clinically as a predicting factor. Although reported in the literature, smoking and diabetes were not associated with failure. Level III, retrospective case control. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  1. The Use of Probabilistic Methods to Evaluate the Systems Impact of Component Design Improvements on Large Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Packard, Michael H.

    2002-01-01

    Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (Quantitative Risk Assessment Software (QRAS) etc.). Hypothetical PSA results for a number of structural components along with mitigation factors that would restrict the failure mode from propagating to a Loss of Mission (LOM) failure were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to the overall Mission Success (MS) is also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, inclusion of new sensor detection of faults and other upgrades were evaluated in determining overall turbine engine reliability.
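The roll-up the paper describes, per-mode failure-time distributions combined with mitigation factors into an overall Loss-of-Mission probability, can be sketched as a small Monte Carlo model. The modes, Weibull parameters, and mitigation probabilities below are invented for illustration and are not taken from the paper's generic turbofan model:

```python
import random

# Illustrative Monte Carlo roll-up: per-mode Weibull failure-time
# distributions plus mitigation factors combined into an overall
# Loss-of-Mission (LOM) probability. All numbers below are invented.

random.seed(7)

MODES = {
    # name: (weibull_shape, weibull_scale_hours, P(mitigation stops propagation))
    "blade_fatigue":      (2.5, 30000.0, 0.90),
    "disk_creep":         (3.0, 50000.0, 0.50),
    "sensor_electronics": (1.0, 20000.0, 0.99),
}

def mission_lost(mission_hours):
    """One trial: LOM if any mode occurs in-mission and is not mitigated."""
    for shape, scale, p_mitigated in MODES.values():
        t_fail = random.weibullvariate(scale, shape)
        if t_fail <= mission_hours and random.random() > p_mitigated:
            return True
    return False

N = 20000
lom = sum(mission_lost(5000.0) for _ in range(N)) / N
print(f"Estimated P(LOM) over a 5000-h mission: {lom:.4f}")
```

Tallying which mode triggered each loss in the same loop would give the per-mode rank and contribution to Mission Success that the paper reports.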

  2. Optimized Vertex Method and Hybrid Reliability

    NASA Technical Reports Server (NTRS)

    Smith, Steven A.; Krishnamurthy, T.; Mason, B. H.

    2002-01-01

    A method of calculating the fuzzy response of a system is presented. This method, called the Optimized Vertex Method (OVM), is based upon the vertex method but requires considerably fewer function evaluations. The method is demonstrated by calculating the response membership function of strain-energy release rate for a bonded joint with a crack. The possibility of failure of the bonded joint was determined over a range of loads. After completing the possibilistic analysis, the possibilistic (fuzzy) membership functions were transformed to probability density functions and the probability of failure of the bonded joint was calculated. This approach is called a possibility-based hybrid reliability assessment. The possibility and probability of failure are presented and compared to a Monte Carlo Simulation (MCS) of the bonded joint.
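For context, the plain vertex method that the OVM improves upon brackets the fuzzy response at each alpha-cut by evaluating the function at every vertex of the input interval box (valid when the response is monotonic in each input). A minimal sketch, with an arbitrary stand-in response function rather than the paper's strain-energy release rate:

```python
from itertools import product

# Plain vertex method sketch: at one alpha-cut, the response interval of a
# componentwise-monotonic function is bracketed by evaluating it at every
# vertex of the input interval box. The response function is a stand-in,
# not the bonded-joint strain-energy release rate of the paper.

def response(load, modulus):
    return load ** 2 / modulus  # stand-in for G(load, E)

def vertex_method(func, intervals):
    """intervals: list of (lo, hi) per fuzzy input at one alpha-cut."""
    values = [func(*v) for v in product(*intervals)]
    return min(values), max(values)  # 2**n evaluations; the OVM needs fewer

# alpha-cut intervals for the two fuzzy inputs:
lo, hi = vertex_method(response, [(90.0, 110.0), (190.0, 210.0)])
print(lo, hi)
```

Repeating this over a stack of alpha-cuts builds the response membership function; the OVM's contribution is reducing the 2**n evaluations per cut.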

  3. Role of neuropeptide Y in renal sympathetic vasoconstriction: studies in normal and congestive heart failure rats.

    PubMed

    DiBona, G F; Sawin, L L

    2001-08-01

    Sympathetic nerve activity, including that in the kidney, is increased in heart failure with increased plasma concentrations of norepinephrine and the vasoconstrictor cotransmitter neuropeptide Y (NPY). We examined the contribution of NPY to sympathetically mediated alterations in kidney function in normal and heart failure rats. Heart failure rats were created by left coronary ligation and myocardial infarction. In anesthetized normal rats, the NPY Y(1) receptor antagonist, H 409/22, at two doses, had no effect on heart rate, arterial pressure, or renal hemodynamic and excretory function. In conscious severe heart failure rats, high-dose H 409/22 decreased mean arterial pressure by 8 +/- 2 mm Hg but had no effect in normal and mild heart failure rats. During graded frequency renal sympathetic nerve stimulation (0 to 10 Hz), high-dose H 409/22 attenuated the decreases in renal blood flow only at 10 Hz (-36% +/- 5%, P <.05) in normal rats but did so at both 4 (-29% +/- 4%, P <.05) and 10 Hz (-33% +/- 5%, P <.05) in heart failure rats. The glomerular filtration rate, urinary flow rate, and sodium excretion responses to renal sympathetic nerve stimulation were not affected by high-dose H 409/22 in either normal or heart failure rats. NPY does not participate in the regulation of kidney function and arterial pressure in normal conscious or anesthetized rats. When sympathetic nervous system activity is increased, as in heart failure and intense renal sympathetic nerve stimulation, respectively, a small contribution of NPY to maintenance of arterial pressure and to sympathetic renal vasoconstrictor responses may be identified.

  4. History.edu: Essays on Teaching with Technology.

    ERIC Educational Resources Information Center

    Trinkle, Dennis A., Ed.; Merriman, Scott A., Ed.

    Intended to be equally useful to high school and college instructors, this book contains studies in history pedagogy, among them the first three published essays measuring qualitatively and quantitatively the successes and failures of "e-teaching" and distance learning. Collectively, the essays urge instructors to take the next step with…

  5. Quantitative PCR Analysis of Laryngeal Muscle Fiber Types

    ERIC Educational Resources Information Center

    Van Daele, Douglas J.

    2010-01-01

    Voice and swallowing dysfunction as a result of recurrent laryngeal nerve paralysis can be improved with vocal fold injections or laryngeal framework surgery. However, denervation atrophy can cause late-term clinical failure. A major determinant of skeletal muscle physiology is myosin heavy chain (MyHC) expression, and previous protein analyses…

  6. Simple noninvasive measurement of skin autofluorescence.

    PubMed

    Meerwaldt, Robbert; Links, Thera; Graaff, Reindert; Thorpe, Suzanne R; Baynes, John W; Hartog, Jasper; Gans, Reinold; Smit, Andries

    2005-06-01

    Accumulation of advanced glycation end products (AGEs) is thought to play a role in the pathogenesis of chronic complications of diabetes mellitus and renal failure. Several studies indicate that AGE accumulation in tissue may reflect the cumulative effect of hyperglycemia and oxidative stress over many years. Simple quantitation of AGE accumulation in tissue could provide a tool for assessing the risk of long-term complications. Because several AGEs exhibit autofluorescence, we developed a noninvasive autofluorescence reader (AFR). Skin autofluorescence measured with the AFR correlates with collagen-linked fluorescence and specific skin AGE levels from skin biopsy samples. Furthermore, skin autofluorescence correlates with long-term glycemic control and renal function, and preliminary results show correlations with the presence of long-term complications in diabetes. The AFR may be useful as a clinical tool for rapid assessment of risk for AGE-related long-term complications in diabetes and in other conditions associated with AGE accumulation.

  7. Estimating explosion properties of normal hydrogen-rich core-collapse supernovae

    NASA Astrophysics Data System (ADS)

    Pejcha, Ondrej

    2017-08-01

    Recent parameterized 1D explosion models of hundreds of core-collapse supernova progenitors suggest that success and failure are intertwined in a complex pattern that is not a simple function of the progenitor initial mass. This rugged landscape is present also in other explosion properties, allowing for quantitative tests of the neutrino mechanism from observations of hundreds of supernovae discovered every year. We present a new self-consistent and versatile method that derives photospheric radius and temperature variations of normal hydrogen-rich core-collapse supernovae based on their photometric measurements and expansion velocities. We construct SED and bolometric light curves, and determine explosion energies, ejecta and nickel masses while taking into account all uncertainties and covariances of the model. We describe the efforts to compare the inferences to the predictions of the neutrino mechanism. The model can be adapted to include more physical assumptions to utilize primarily photometric data coming from surveys such as LSST.

  8. Diagnosis and treatment of GH deficiency in Prader-Willi syndrome.

    PubMed

    Grugni, Graziano; Marzullo, Paolo

    2016-12-01

    Prader-Willi syndrome (PWS) results from under-expression of the paternally-derived chromosomal region 15q11-13. Growth failure is a recognized feature of PWS, and both quantitative and qualitative defects of the GH/IGF-I axis revealing GH deficiency (GHD) have been demonstrated in most children with PWS. In PWS adults, criteria for GHD are biochemically fulfilled in 8-38% of the studied cohorts. Published data support benefits of early institution of GH therapy (GHT) in PWS children, with positive effects on statural growth, body composition, metabolic homeostasis, and neurocognitive function. Like in pediatric PWS, GHT also yields beneficial effects on lean and body fat, exercise capacity, and quality of life of PWS adults. Although GHT has been generally administered safely in PWS children and adults, careful surveillance of risks is mandatory during prolonged GH replacement for all PWS individuals. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. User-perceived reliability of unrepairable shared protection systems with functionally identical units

    NASA Astrophysics Data System (ADS)

    Ozaki, Hirokazu; Kara, Atsushi; Cheng, Zixue

    2012-05-01

    In this article, we investigate the reliability of M-for-N (M:N) shared protection systems. We focus on the reliability that is perceived by an end user of one of N units. We assume that any failed unit is instantly replaced by one of the M units (if available). We describe the effectiveness of such a protection system in a quantitative manner under the condition that the failed units are not repairable. Mathematical analysis gives the closed-form solution of the reliability and mean time to failure (MTTF). We also analyse several numerical examples of the reliability and MTTF. This result can be applied, for example, to the analysis and design of an integrated circuit consisting of redundant backup components. In such a device, repairing a failed component is unrealistic. The analysis provides useful information for the design for general shared protection systems in which the failed units are not repaired.
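Under the simplest assumptions consistent with the setup above (N identical active units with exponential failure rate, M unrepairable spares, instant replacement) failures arrive at rate N*lam until the spares run out, so the protected pool survives until the (M+1)-th failure: an Erlang(M+1, N*lam) time. This sketch covers only that pool-level quantity; the article's closed forms for the reliability perceived by a single end user are more involved and are not reproduced here.

```python
import math

# Pool-level sketch for an unrepairable M:N shared protection system with
# exponential unit failures (rate lam) and instant replacement: the pool
# fails at the (M+1)-th failure, an Erlang(M+1, N*lam) time. This is an
# assumption-laden simplification, not the article's user-perceived metric.

def pool_mttf(n, m, lam):
    """Mean time to pool failure: (m+1) / (n * lam)."""
    return (m + 1) / (n * lam)

def pool_reliability(t, n, m, lam):
    """P(fewer than m+1 failures by time t) = Erlang survival function."""
    rate = n * lam
    return sum(math.exp(-rate * t) * (rate * t) ** k / math.factorial(k)
               for k in range(m + 1))

lam = 1e-4  # failures per hour per unit (illustrative)
print(pool_mttf(10, 2, lam))                        # 3 / (10 * 1e-4) hours
print(round(pool_reliability(1000.0, 10, 2, lam), 4))
```

Note the MTTF grows linearly in the number of spares M but shrinks with the number of protected units N, which is the basic design trade-off in sizing M.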

  10. Respiratory failure in diabetic ketoacidosis.

    PubMed

    Konstantinov, Nikifor K; Rohrscheib, Mark; Agaba, Emmanuel I; Dorin, Richard I; Murata, Glen H; Tzamaloukas, Antonios H

    2015-07-25

    Respiratory failure complicating the course of diabetic ketoacidosis (DKA) is a source of increased morbidity and mortality. Detection of respiratory failure in DKA requires focused clinical monitoring, careful interpretation of arterial blood gases, and investigation for conditions that can adversely affect respiration. Conditions caused by DKA that compromise respiratory function can be detected at presentation but are usually more prevalent during treatment. These conditions include deficits of potassium, magnesium and phosphate and hydrostatic or non-hydrostatic pulmonary edema. Conditions not caused by DKA that can worsen respiratory function under the added stress of DKA include infections of the respiratory system, pre-existing respiratory or neuromuscular disease and miscellaneous other conditions. Prompt recognition and management of the conditions that can lead to respiratory failure in DKA may prevent respiratory failure and reduce mortality from DKA.

  11. Respiratory failure in diabetic ketoacidosis

    PubMed Central

    Konstantinov, Nikifor K; Rohrscheib, Mark; Agaba, Emmanuel I; Dorin, Richard I; Murata, Glen H; Tzamaloukas, Antonios H

    2015-01-01

    Respiratory failure complicating the course of diabetic ketoacidosis (DKA) is a source of increased morbidity and mortality. Detection of respiratory failure in DKA requires focused clinical monitoring, careful interpretation of arterial blood gases, and investigation for conditions that can adversely affect respiration. Conditions caused by DKA that compromise respiratory function can be detected at presentation but are usually more prevalent during treatment. These conditions include deficits of potassium, magnesium and phosphate and hydrostatic or non-hydrostatic pulmonary edema. Conditions not caused by DKA that can worsen respiratory function under the added stress of DKA include infections of the respiratory system, pre-existing respiratory or neuromuscular disease and miscellaneous other conditions. Prompt recognition and management of the conditions that can lead to respiratory failure in DKA may prevent respiratory failure and reduce mortality from DKA. PMID:26240698

  12. Modeling Dynamic Helium Release as a Tracer of Rock Deformation

    DOE PAGES

    Gardner, W. Payton; Bauer, Stephen J.; Kuhlman, Kristopher L.; ...

    2017-11-03

    Here, we use helium released during mechanical deformation of shales as a signal to explore the effects of deformation and failure on material transport properties. A dynamic dual-permeability model with evolving pore and fracture networks is used to simulate gases released from shale during deformation and failure. Changes in material properties required to reproduce experimentally observed gas signals are explored. We model two different experiments of 4He flow rate measured from shale undergoing mechanical deformation, a core parallel to bedding and a core perpendicular to bedding. We also found that the helium signal is sensitive to fracture development and evolution as well as changes in the matrix transport properties. We constrain the timing and effective fracture aperture, as well as the increase in matrix porosity and permeability. Increases in matrix permeability are required to explain gas flow prior to macroscopic failure, and the short-term gas flow postfailure. Increased matrix porosity is required to match the long-term, postfailure gas flow. This model provides the first quantitative interpretation of helium release as a result of mechanical deformation. The sensitivity of this model to changes in the fracture network, as well as to matrix properties during deformation, indicates that helium release can be used as a quantitative tool to evaluate the state of stress and strain in earth materials.
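A toy dual-continuum sketch conveys the model class described above: helium held in a tight matrix leaks into a fracture network, which vents to the measured effluent, and fracture development is mimicked by stepping up the matrix-to-fracture exchange coefficient mid-run. All coefficients and the forward-Euler integration are illustrative assumptions; the paper's calibrated dual-permeability model is far richer.

```python
# Toy dual-continuum sketch: matrix helium leaks into a fracture network
# that vents to the measured effluent. Fracture development at t_frac is
# mimicked by stepping up the exchange coefficient. All values invented.

def simulate(t_end=100.0, dt=0.01, t_frac=50.0):
    m, f = 1.0, 0.0            # helium inventories: matrix, fracture
    k_mf, k_out = 0.005, 0.5   # exchange and venting rate constants (1/s)
    flow = []                  # (time, measured outflow rate)
    t = 0.0
    while t < t_end:
        flow.append((t, k_out * f))
        if t >= t_frac:
            k_mf = 0.05        # fracturing boosts matrix release tenfold
        dm = -k_mf * m               # matrix -> fracture (forward Euler)
        df = k_mf * m - k_out * f    # fracture gains, then vents
        m += dm * dt
        f += df * dt
        t += dt
    return flow

flow = simulate()
peak_t = max(flow, key=lambda p: p[1])[0]
print(round(peak_t, 1))  # outflow peaks shortly after fracturing at t = 50
```

The qualitative behavior matches the paper's argument: the effluent signal responds sharply to fracture development, which is what makes the gas record informative about the deformation history.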

  13. Modeling Dynamic Helium Release as a Tracer of Rock Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, W. Payton; Bauer, Stephen J.; Kuhlman, Kristopher L.

    Here, we use helium released during mechanical deformation of shales as a signal to explore the effects of deformation and failure on material transport properties. A dynamic dual-permeability model with evolving pore and fracture networks is used to simulate gases released from shale during deformation and failure. Changes in material properties required to reproduce experimentally observed gas signals are explored. We model two different experiments of 4He flow rate measured from shale undergoing mechanical deformation, a core parallel to bedding and a core perpendicular to bedding. We also found that the helium signal is sensitive to fracture development and evolution as well as changes in the matrix transport properties. We constrain the timing and effective fracture aperture, as well as the increase in matrix porosity and permeability. Increases in matrix permeability are required to explain gas flow prior to macroscopic failure, and the short-term gas flow postfailure. Increased matrix porosity is required to match the long-term, postfailure gas flow. This model provides the first quantitative interpretation of helium release as a result of mechanical deformation. The sensitivity of this model to changes in the fracture network, as well as to matrix properties during deformation, indicates that helium release can be used as a quantitative tool to evaluate the state of stress and strain in earth materials.

  14. Two independent proteomic approaches provide a comprehensive analysis of the synovial fluid proteome response to Autologous Chondrocyte Implantation.

    PubMed

    Hulme, Charlotte H; Wilson, Emma L; Fuller, Heidi R; Roberts, Sally; Richardson, James B; Gallacher, Pete; Peffers, Mandy J; Shirran, Sally L; Botting, Catherine H; Wright, Karina T

    2018-05-02

    Autologous chondrocyte implantation (ACI) has a failure rate of approximately 20%, but it is yet to be fully understood why. Biomarkers are needed that can pre-operatively predict in which patients it is likely to fail, so that alternative or individualised therapies can be offered. We previously used label-free quantitation (LF) with a dynamic range compression proteomic approach to assess the synovial fluid (SF) of ACI responders and non-responders. However, we were able to identify only a few differentially abundant proteins at baseline. In the present study, we built upon these previous findings by assessing higher-abundance proteins within this SF, providing a more global proteomic analysis on the basis of which more of the biology underlying ACI success or failure can be understood. Isobaric tagging for relative and absolute quantitation (iTRAQ) proteomic analysis was used to assess SF from ACI responders (mean Lysholm improvement of 33; n = 14) and non-responders (mean Lysholm decrease of 14; n = 13) at the two stages of surgery (cartilage harvest and chondrocyte implantation). Differentially abundant proteins in iTRAQ and combined iTRAQ and LF datasets were investigated using pathway and network analyses. iTRAQ proteomic analysis confirmed our previous finding that there is a marked proteomic shift in response to cartilage harvest (70 and 54 proteins demonstrating ≥ 2.0-fold change and p < 0.05 between stages I and II in responders and non-responders, respectively). Further, it highlighted 28 proteins that were differentially abundant between responders and non-responders to ACI, which were not found in the LF study, 16 of which were altered at baseline. The differential expression of two proteins (complement C1s subcomponent and matrix metalloproteinase 3) was confirmed biochemically. 
Combining the iTRAQ and LF proteomic datasets generated in-depth SF proteome information that was used to build interactome networks representing ACI success or failure. Functional pathways that are dysregulated in ACI non-responders were identified, including acute-phase response signalling. Several candidate biomarkers for baseline prediction of ACI outcome were identified. A holistic overview of the SF proteome in responders and non-responders to ACI has been profiled, providing a better understanding of the biological pathways underlying clinical outcome, particularly the differential response to cartilage harvest in non-responders.

  15. A probabilistic-based failure model for components fabricated from anisotropic graphite

    NASA Astrophysics Data System (ADS)

    Xiao, Chengfeng

    The nuclear moderators for high-temperature nuclear reactors are fabricated from graphite. During reactor operations, graphite components are subjected to complex stress states arising from structural loads, thermal gradients, neutron irradiation damage, and seismic events. Graphite is a quasi-brittle material. Two aspects of nuclear-grade graphite, i.e., material anisotropy and different behavior in tension and compression, are explicitly accounted for in this effort. Fracture mechanics methods are useful for metal alloys, but they are problematic for anisotropic materials with a microstructure that makes it difficult to identify a "critical" flaw. In fact, cracking in a graphite core component does not necessarily result in the loss of integrity of a nuclear graphite core assembly. A phenomenological failure criterion that does not rely on flaw detection has been derived that accounts for the material behaviors mentioned. The probability of failure of components fabricated from graphite is governed by the scatter in strength. The design protocols being proposed by international code agencies recognize that design and analysis of reactor core components must be based upon probabilistic principles. The reliability models proposed herein for isotropic graphite and for graphite that can be characterized as transversely isotropic are another set of design tools for the next-generation very high temperature reactors (VHTR) as well as molten salt reactors. The work begins with a review of phenomenologically based deterministic failure criteria. A number of failure models of this genre are compared with recent multiaxial nuclear-grade failure data, and aspects of each are shown to be lacking. The basic behavior of different failure strengths in tension and compression is exhibited by failure models derived for concrete, but attempts to extend these concrete models to anisotropy were unsuccessful. The phenomenological models are directly dependent on stress invariants. 
A set of invariants, known as an integrity basis, was developed for a non-linear elastic constitutive model. This integrity basis allowed the non-linear constitutive model to exhibit different behavior in tension and compression and moreover, the integrity basis was amenable to being augmented and extended to anisotropic behavior. This integrity basis served as the starting point in developing both an isotropic reliability model and a reliability model for transversely isotropic materials. At the heart of the reliability models is a failure function very similar in nature to the yield functions found in classic plasticity theory. The failure function is derived and presented in the context of a multiaxial stress space. States of stress inside the failure envelope denote safe operating states. States of stress on or outside the failure envelope denote failure. The phenomenological strength parameters associated with the failure function are treated as random variables. There is a wealth of failure data in the literature that supports this notion. The mathematical integration of a joint probability density function that is dependent on the random strength variables over the safe operating domain defined by the failure function provides a way to compute the reliability of a state of stress in a graphite core component fabricated from graphite. The evaluation of the integral providing the reliability associated with an operational stress state can only be carried out using a numerical method. Monte Carlo simulation with importance sampling was selected to make these calculations. The derivation of the isotropic reliability model and the extension of the reliability model to anisotropy are provided in full detail. Model parameters are cast in terms of strength parameters that can (and have been) characterized by multiaxial failure tests. 
Comparisons of model predictions with failure data are made, and a brief comparison is made to the reliability predictions called for in the ASME Boiler and Pressure Vessel Code. Future work is identified that would provide further verification and augmentation of the numerical methods used to evaluate model predictions.
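    The reliability integral described above can only be evaluated numerically. A bare-bones Monte Carlo sketch follows, using plain sampling rather than the importance sampling employed in the work, a scalar stress instead of a multiaxial state, and an assumed normal strength distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

def reliability_mc(stress, mean_strength, cov, n=200_000):
    """Estimate reliability by sampling the random strength parameter and
    counting states of stress on or outside the failure envelope. Here the
    failure function is simply f = stress - strength >= 0; the real model
    uses a multiaxial failure function with several random strengths."""
    strengths = rng.normal(mean_strength, cov * mean_strength, n)
    return 1.0 - np.mean(stress >= strengths)

low_stress = reliability_mc(stress=50.0, mean_strength=100.0, cov=0.1)
at_mean = reliability_mc(stress=100.0, mean_strength=100.0, cov=0.1)
```

    Importance sampling, as used in the thesis, concentrates samples near the failure surface so that the very small failure probabilities of interest can be estimated with far fewer samples.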

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashiwa, Bryan Andrew; Hull, Lawrence Mark

    Highlights of recent phenomenological studies of metal failure are given. Failures leading to spallation and fragmentation are typically of interest. The current ‘best model’ includes the following: a full-history stress in tension; nucleation initiating dynamic relaxation toward a tensile yield function; failure dependent on strain, strain rate, and temperature; a mean-preserving ‘macrodefect’ introduced when failure occurs in tension; and multifield theoretical refinements.

  17. Early Exercise Rehabilitation of Muscle Weakness in Acute Respiratory Failure Patients

    PubMed Central

    Berry, Michael J.; Morris, Peter E.

    2013-01-01

    Acute Respiratory Failure patients experience significant muscle weakness, which contributes to prolonged hospitalization and functional impairments post-hospital discharge. Based on our previous work, we hypothesize that an exercise intervention initiated early in the intensive care unit, aimed at improving skeletal muscle strength, could decrease hospital stay and attenuate the deconditioning and skeletal muscle weakness experienced by these patients. In summary, early exercise has the potential to decrease hospital length of stay and improve function in Acute Respiratory Failure patients. PMID:23873130

  18. Computer-based assessment of left ventricular regional ejection fraction in patients after myocardial infarction

    NASA Astrophysics Data System (ADS)

    Teo, S.-K.; Su, Y.; Tan, R. S.; Zhong, L.

    2014-03-01

    After myocardial infarction (MI), the left ventricle (LV) undergoes progressive remodeling which adversely affects heart function and may lead to development of heart failure. There is an escalating need to accurately depict the LV remodeling process for disease surveillance and monitoring of therapeutic efficacy. Current practice of using ejection fraction to quantitate LV function is less than ideal as it obscures regional variation and anomaly. Therefore, we sought to (i) develop a quantitative method to assess LV regional ejection fraction (REF) using a 16-segment method, and (ii) evaluate the effectiveness of REF in discriminating 10 patients 1-3 months after MI from 9 sex- and age-matched normal controls based on cardiac magnetic resonance (CMR) imaging. Late gadolinium enhancement (LGE) CMR scans were also acquired for the MI patients to assess scar extent. We observed that the REF at the basal, mid-cavity and apical regions for the patient group is significantly lower as compared to the control group (P < 0.001 using a two-tailed Student's t-test). In addition, we correlated the patient REF over these regions with their corresponding LGE score in terms of four categories: High LGE, Low LGE, Border and Remote. We observed that the median REF decreases with increasing severity of infarction. The results suggest that REF could potentially be used as a discriminator for MI and employed to measure myocardium homogeneity with respect to degree of infarction. The computational performance per data sample took approximately 25 sec, which demonstrates its clinical potential as a real-time cardiac assessment tool.
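    The group comparison reported above can be sketched with synthetic numbers. The REF values below are invented placeholders, not patient data, and the statistic is the unpooled (Welch) form of the two-sample t-test rather than necessarily the exact variant the authors used:

```python
import numpy as np

def welch_t(a, b):
    """Two-sample Welch t statistic (unequal variances), shown as a
    stand-in for the study's two-tailed t-test on regional ejection
    fraction."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

controls = [62, 58, 65, 60, 63, 59, 61, 64, 60]       # hypothetical REF (%), n = 9
patients = [45, 40, 48, 42, 50, 44, 41, 47, 43, 46]   # hypothetical REF (%), n = 10
t_stat = welch_t(controls, patients)                  # large t => groups differ
```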

  19. The treatment with pyridostigmine improves the cardiocirculatory function in rats with chronic heart failure.

    PubMed

    Sabino, João Paulo J; da Silva, Carlos Alberto Aguiar; de Melo, Rubens Fernando; Fazan, Rubens; Salgado, Helio C

    2013-01-01

    Sympathetic hyperactivity and its outcome in heart failure have been thoroughly investigated to determine the focus of pharmacologic approaches targeting the sympathetic nervous system in the treatment of this pathophysiological condition. On the other hand, therapeutic approaches aiming to protect the reduced cardiac parasympathetic function have not received much attention. The present study evaluated rats with chronic heart failure (six to seven weeks after coronary artery ligation) and the effects of an increased parasympathetic function by pyridostigmine (an acetylcholinesterase inhibitor) on the following aspects: arterial pressure (AP), heart rate (HR), baroreceptor and Bezold-Jarisch reflex, pulse interval (PI) and AP variability, cardiac sympathetic and parasympathetic tonus, intrinsic heart rate (i-HR) and cardiac function. Conscious rats with heart failure exhibited no change in HR, Bezold-Jarisch reflex, PI variability and cardiac sympathetic tonus. On the other hand, these animals presented hypotension and reduced baroreflex sensitivity, power in the low frequency (LF) band of the systolic AP spectrum, cardiac parasympathetic tonus and i-HR, while anesthetized rats exhibited reduced cardiac performance. Pyridostigmine prevented the attenuation of all the parameters examined, except basal AP and cardiac performance. In conclusion, the blockade of acetylcholinesterase with pyridostigmine was revealed to be an important pharmacological approach, which could be used to increase parasympathetic function and to improve a number of cardiocirculatory parameters in rats with heart failure. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. DRD2 Schizophrenia-Risk Allele Is Associated With Impaired Striatal Functioning in Unaffected Siblings of Schizophrenia Patients

    PubMed Central

    Vink, Matthijs; de Leeuw, Max; Luykx, Jurjen J.; van Eijk, Kristel R.; van den Munkhof, Hanna E.; van Buuren, Mariët; Kahn, René S.

    2016-01-01

    A recent Genome-Wide Association Study showed that the rs2514218 single nucleotide polymorphism (SNP) in close proximity to dopamine receptor D2 is strongly associated with schizophrenia. Further, an in silico experiment showed that rs2514218 has a cis expression quantitative trait locus effect in the basal ganglia. To date, however, the functional consequence of this SNP is unknown. Here, we used functional Magnetic resonance imaging to investigate the impact of this risk allele on striatal activation during proactive and reactive response inhibition in 45 unaffected siblings of schizophrenia patients. We included siblings to circumvent the illness specific confounds affecting striatal functioning independent from gene effects. Behavioral analyses revealed no differences between the carriers (n = 21) and noncarriers (n = 24). Risk allele carriers showed a diminished striatal response to increasing proactive inhibitory control demands, whereas overall level of striatal activation in carriers was elevated compared to noncarriers. Finally, risk allele carriers showed a blunted striatal response during successful reactive inhibition compared to the noncarriers. These data are consistent with earlier reports showing similar deficits in schizophrenia patients, and point to a failure to flexibly engage the striatum in response to contextual cues. This is the first study to demonstrate an association between impaired striatal functioning and the rs2514218 polymorphism. We take our findings to indicate that striatal functioning is impaired in carriers of the DRD2 risk allele, likely due to dopamine dysregulation at the DRD2 location. PMID:26598739

  1. Sildenafil ameliorates left ventricular T-tubule remodeling in a pressure overload-induced murine heart failure model

    PubMed Central

    Huang, Chun-kai; Chen, Bi-yi; Guo, Ang; Chen, Rong; Zhu, Yan-qi; Kutschke, William; Hong, Jiang; Song, Long-sheng

    2016-01-01

    Aim: Sildenafil, a phosphodiesterase 5 (PDE5) inhibitor, has been shown to exert beneficial effects in heart failure. The purpose of this study was to test whether sildenafil suppresses transverse-tubule (T-tubule) remodeling in left ventricular (LV) failure and thereby provides its therapeutic benefits. Methods: A pressure overload-induced murine heart failure model was established in mice by thoracic aortic banding (TAB). One day after TAB, the mice received sildenafil (100 mg·kg−1·d−1, sc) or saline for 5 weeks. At the end of treatment, echocardiography was used to examine LV function. The intact hearts were then dissected out and placed in a Langendorff perfusion chamber for in situ confocal imaging of T-tubule ultrastructure from epicardial myocytes. Results: TAB surgery resulted in heart failure accompanied by remarkable T-tubule remodeling. Sildenafil treatment significantly attenuated TAB-induced cardiac hypertrophy and congestive heart failure, improved LV contractile function, and preserved T-tubule integrity in LV cardiomyocytes, but did not significantly affect chamber dilation. The integrity of LV T-tubule structure was correlated with cardiac hypertrophy (R2=0.74, P<0.01) and global LV function (R2=0.47, P<0.01). Conclusion: Sildenafil effectively ameliorates LV T-tubule remodeling in TAB mice, revealing a novel mechanism underlying the therapeutic benefits of sildenafil in heart failure. PMID:26972492

  2. Patient characteristics as predictors of clinical outcome of distraction in treatment of severe ankle osteoarthritis.

    PubMed

    Marijnissen, A C A; Hoekstra, M C L; Pré, B C du; van Roermund, P M; van Melkebeek, J; Amendola, A; Maathuis, P; Lafeber, F P J G; Welsing, P M J

    2014-01-01

    Osteoarthritis (OA) is a slowly progressive joint disease. Joint distraction can be a treatment of choice in cases of severe OA. Prediction of failure will facilitate implementation of joint distraction in clinical practice. Patients with severe ankle OA who underwent joint distraction were included. Survival analysis was performed over 12 years (n = 25 after 12 years). Regression analyses were used to predict failures and clinical benefit at 2 years after joint distraction (n = 111). Survival analysis showed that 44% of the patients failed: 17% within 2 years and 37% within 5 years after joint distraction (n = 48 after 5 years). Survival analysis in subgroups showed that time to failure differed only by sex: 30% of women had failed after 2 years, whereas men had still not reached 30% failure after 11 years. In the multivariate analyses, female gender was predictive of failure 2 years after joint distraction. Gender and functional disability at baseline predicted more pain. Functional disability and pain at baseline were associated with more functional disability. Joint distraction shows a long-term clinical beneficial outcome. However, the failure rate is considerable over the years, and female patients have a higher chance of failure during follow-up. Unfortunately, not all potential predictors could be investigated, and other clinically significant predictors were not found. © 2013 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  3. The function and failure of sensory predictions.

    PubMed

    Bansal, Sonia; Ford, Judith M; Spering, Miriam

    2018-04-23

    Humans and other primates are equipped with neural mechanisms that allow them to automatically make predictions about future events, facilitating processing of expected sensations and actions. Prediction-driven control and monitoring of perceptual and motor acts are vital to normal cognitive functioning. This review provides an overview of corollary discharge mechanisms involved in predictions across sensory modalities and discusses consequences of predictive coding for cognition and behavior. Converging evidence now links impairments in corollary discharge mechanisms to neuropsychiatric symptoms such as hallucinations and delusions. We review studies supporting a prediction-failure hypothesis of perceptual and cognitive disturbances. We also outline neural correlates underlying prediction function and failure, highlighting similarities across the visual, auditory, and somatosensory systems. In linking basic psychophysical and psychophysiological evidence of visual, auditory, and somatosensory prediction failures to neuropsychiatric symptoms, our review furthers our understanding of disease mechanisms. © 2018 New York Academy of Sciences.

  4. Determinants of respiratory pump function in patients with cystic fibrosis.

    PubMed

    Dassios, Theodore

    2015-01-01

    Respiratory failure constitutes the major cause of morbidity and mortality in patients with Cystic Fibrosis (CF). Respiratory failure could either be due to lung parenchyma damage or to insufficiency of the respiratory pump which consists of the respiratory muscles, the rib cage and the neuromuscular transmission pathways. Airway obstruction, hyperinflation and malnutrition have been historically recognised as the major determinants of respiratory pump dysfunction in CF. Recent research has identified chronic infection, genetic predisposition, dietary and pharmaceutical interventions as possible additional determinants of this impairment. Furthermore, new methodological approaches in assessing respiratory pump function have led to a better understanding of the pathogenesis of respiratory pump failure in CF. Finally, respiratory muscle function could be partially preserved in CF patients with structured interventions such as aerobic exercise, inspiratory muscle training and non-invasive ventilation and CF patients could consequently be relatively protected from respiratory fatigue and respiratory failure. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Revision Distal Femoral Arthroplasty With the Compress(®) Prosthesis Has a Low Rate of Mechanical Failure at 10 Years.

    PubMed

    Zimel, Melissa N; Farfalli, German L; Zindman, Alexandra M; Riedel, Elyn R; Morris, Carol D; Boland, Patrick J; Healey, John H

    2016-02-01

    Patients with failed distal femoral megaprostheses often have bone loss that limits reconstructive options and contributes to the high failure rate of revision surgery. The Compress(®) Compliant Pre-stress (CPS) implant can reconstruct the femur even when there is little remaining bone. It differs from traditional stemmed prostheses because it requires only 4 to 8 cm of residual bone for fixation. Given the poor long-term results of stemmed revision constructs, we sought to determine the failure rate and functional outcomes of the CPS implant in revision surgery. (1) What is the cumulative incidence of mechanical and other types of implant failure when used to revise failed distal femoral arthroplasties placed after oncologic resection? (2) What complications are characteristic of this prosthesis? (3) What function do patients achieve after receiving this prosthesis? We retrospectively reviewed 27 patients who experienced failure of a distal femoral prosthesis and were revised to a CPS implant from April 2000 to February 2013. Indications for use included a minimum 2.5 mm cortical thickness of the remaining proximal femur, no prior radiation, life expectancy > 10 years, and compliance with protected weightbearing for 3 months. The cumulative incidence of failure was calculated for both mechanical (loss of compression between the implant anchor plug and spindle) and other failure modes using a competing risk analysis. Failure was defined as removal of the CPS implant. Followup was a minimum of 2 years or until implant removal. Median followup for patients with successful revision arthroplasty was 90 months (range, 24-181 months). Functional outcomes were measured with the Musculoskeletal Tumor Society (MSTS) functional assessment score. The cumulative incidence of mechanical failure was 11% (95% confidence interval [CI], 4%-33%) at both 5 and 10 years. These failures occurred early at a median of 5 months. 
The cumulative incidence of other failures was 18% (95% CI, 7%-45%) at 5 and 10 years, all of which were deep infection. Three patients required secondary operations for cortical insufficiency proximal to the anchor plug in bone not spanned by the CPS implant and unrelated to the prosthesis. Median MSTS score was 27 (range, 24-30). Revision distal femoral replacement arthroplasty after a failed megaprosthesis is often difficult as a result of a lack of adequate bone. Reconstruction with the CPS implant has an 11% failure rate at 10 years. Our results are promising and demonstrate the durable fixation provided by the CPS implant. Further studies to compare the CPS prosthesis and other reconstruction options with respect to survival and functional outcomes are warranted. Level IV, therapeutic study.

  6. Incorporation of Failure Into an Orthotropic Three-Dimensional Model with Tabulated Input Suitable for Use in Composite Impact Problems

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Carney, Kelly S.; Dubois, Paul; Hoffarth, Canio; Khaled, Bilal; Shyamsunder, Loukham; Rajan, Subramaniam; Blankenhorn, Gunther

    2017-01-01

    The need for accurate material models to simulate the deformation, damage and failure of polymer matrix composites under impact conditions is becoming critical as these materials are gaining increased use in the aerospace and automotive communities. The aerospace community has identified several key capabilities which are currently lacking in the available material models in commercial transient dynamic finite element codes. To attempt to improve the predictive capability of composite impact simulations, a next generation material model is being developed for incorporation within the commercial transient dynamic finite element code LS-DYNA. The material model, which incorporates plasticity, damage and failure, utilizes experimentally based tabulated input to define the evolution of plasticity and damage and the initiation of failure as opposed to specifying discrete input parameters such as modulus and strength. The plasticity portion of the orthotropic, three-dimensional, macroscopic composite constitutive model is based on an extension of the Tsai-Wu composite failure model into a generalized yield function with a non-associative flow rule. For the damage model, a strain equivalent formulation is used to allow for the uncoupling of the deformation and damage analyses. In the damage model, a semi-coupled approach is employed where the overall damage in a particular coordinate direction is assumed to be a multiplicative combination of the damage in that direction resulting from the applied loads in various coordinate directions. For the failure model, a tabulated approach is utilized in which a stress or strain based invariant is defined as a function of the location of the current stress state in stress space to define the initiation of failure. Failure surfaces can be defined with any arbitrary shape, unlike traditional failure models where the mathematical functions used to define the failure surface impose a specific shape on the failure surface. 
In the current paper, the complete development of the failure model is described and the generation of a tabulated failure surface for a representative composite material is discussed.
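    The tabulated-surface idea can be sketched in two dimensions: store the critical value of a stress invariant versus the direction of the stress state in stress space and look it up by interpolation. The table values below are invented for illustration, not drawn from the model:

```python
import numpy as np

def failure_initiated(stress_state, table_angles, table_critical):
    """Sketch of a tabulated failure surface in a 2-D principal stress
    space: the location of the stress state (an angle) indexes the table,
    and failure initiates when the stress invariant (radius) reaches the
    interpolated critical value. Arbitrary surface shapes are possible."""
    s1, s2 = stress_state
    theta = np.arctan2(s2, s1)            # location in stress space
    radius = np.hypot(s1, s2)             # invariant of the current state
    return radius >= np.interp(theta, table_angles, table_critical)

# Hypothetical tabulated surface: stronger toward compression (theta near +/- pi)
angles = np.linspace(-np.pi, np.pi, 9)
crit = np.array([220.0, 200.0, 160.0, 120.0, 100.0, 110.0, 130.0, 180.0, 220.0])
```

    Because the surface is a table rather than a closed-form function, its shape is not constrained by any particular mathematical family, which is the flexibility the model is after.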

  7. Micromechanical investigation of ductile failure in Al 5083-H116 via 3D unit cell modeling

    NASA Astrophysics Data System (ADS)

    Bomarito, G. F.; Warner, D. H.

    2015-01-01

    Ductile failure is governed by the evolution of micro-voids within a material. The micro-voids, which commonly initiate at second phase particles within metal alloys, grow and interact with each other until failure occurs. The evolution of the micro-voids, and therefore ductile failure, depends on many parameters (e.g., stress state, temperature, strain rate, void and particle volume fraction, etc.). In this study, the stress state dependence of the ductile failure of Al 5083-H116 is investigated by means of 3-D Finite Element (FE) periodic cell models. The cell models require only two pieces of information as inputs: (1) the initial particle volume fraction of the alloy and (2) the constitutive behavior of the matrix material. Based on this information, cell models are subjected to a given stress state, defined by the stress triaxiality and the Lode parameter. For each stress state, the cells are loaded in many loading orientations until failure. Material failure is assumed to occur in the weakest orientation, and so the orientation in which failure occurs first is considered the critical orientation. The result is a description of material failure that is derived from basic principles and requires no fitting parameters. Subsequently, the results of the simulations are used to construct a homogenized material model, which is used in a component-scale FE model. The component-scale FE model is compared to experiments and is shown to overpredict ductility. Because smaller nucleation events and load-path non-proportionality were excluded, it is concluded that accuracy could be gained by including more information about the true microstructure, emphasizing that its incorporation into micromechanical models is critical to developing quantitatively accurate physics-based ductile failure models.
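    The two parameters that define each imposed stress state, the triaxiality and the Lode parameter, follow from standard invariants of the stress tensor. A small sketch with an illustrative stress state:

```python
import numpy as np

def stress_state_params(sig):
    """Stress triaxiality (mean stress over von Mises stress) and the
    normalized-third-invariant Lode parameter, both in standard form."""
    sig = np.asarray(sig, float)
    p = np.trace(sig) / 3.0                   # hydrostatic (mean) stress
    s = sig - p * np.eye(3)                   # deviatoric part
    j2 = 0.5 * np.tensordot(s, s)             # second deviatoric invariant
    j3 = np.linalg.det(s)                     # third deviatoric invariant
    sig_eq = np.sqrt(3.0 * j2)                # von Mises equivalent stress
    return p / sig_eq, 13.5 * j3 / sig_eq**3  # triaxiality, Lode in [-1, 1]

# Uniaxial tension gives triaxiality 1/3 and Lode parameter +1.
T, L = stress_state_params(np.diag([100.0, 0.0, 0.0]))
```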

  8. A model for predicting embankment slope failures in clay-rich soils; A Louisiana example

    NASA Astrophysics Data System (ADS)

    Burns, S. F.

    2015-12-01

    It is well known that smectite-rich soils significantly reduce the stability of slopes; the question is how much smectite in the soil causes slope failures. A study of over 100 sites in north and south Louisiana, USA, compared slopes that failed during a major El Nino winter (heavy rainfall) in 1982-1983 to similar slopes that did not fail. Soils in the slopes were tested for percent clay, liquid limits, plasticity indices and semi-quantitative clay mineralogy. Slopes with a High Risk of failure (85-90% chance of failure in 8-15 years after construction) contained soils with a liquid limit > 54%, a plasticity index > 29%, and a clay content > 47%. Slopes with an Intermediate Risk (50-55% chance of failure in 8-15 years) contained soils with a liquid limit between 36-54%, a plasticity index between 16-29%, and a clay content between 32-47%. Slopes with a Low Risk of failure (< 5% chance of failure in 8-15 years after construction) contained soils with a liquid limit < 36%, a plasticity index < 16%, and a clay content < 32%. These data show that if one is constructing embankments and wants to prevent failure of the 3:1 slopes, the above soil characteristics should be checked before construction. If the soils fall into the Low Risk classification, construct the embankment normally. If the soils fall into the High Risk classification, lime stabilization or heat treatments will be needed to prevent failures. Soils in the Intermediate Risk class will have to be evaluated on a case-by-case basis.
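    The reported thresholds collapse into a simple screening rule. The "any single index in the high band" logic below is an assumption made for illustration, since the abstract does not state how conflicting indices are combined:

```python
def slope_failure_risk(liquid_limit, plasticity_index, clay_pct):
    """Screening sketch for 3:1 embankment slopes in clay-rich soils,
    using the liquid limit (%), plasticity index (%), and clay content (%)
    bands reported in the study. Conservatively, any single index in the
    high-risk band classifies the slope as High Risk."""
    if liquid_limit > 54 or plasticity_index > 29 or clay_pct > 47:
        return "High"          # 85-90% chance of failure in 8-15 years
    if liquid_limit >= 36 or plasticity_index >= 16 or clay_pct >= 32:
        return "Intermediate"  # evaluate case by case
    return "Low"               # < 5% chance of failure in 8-15 years
```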

  9. Building a Database for a Quantitative Model

    NASA Technical Reports Server (NTRS)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors; less obviously, entering data this way does not link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database, but a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
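    The described linkage, a unique metadata key joining each Basic Event to its data source plus the manipulations applied to that source data, can be sketched outside a spreadsheet as well. The records, rates, and the particular Bayesian update below are invented examples, not the paper's prescriptions:

```python
# Hypothetical data-source and Basic Event tables, keyed by unique IDs.
sources = {"SRC-001": {"rate_per_hour": 1e-6, "reference": "vendor handbook"}}
basic_events = {"BE-PUMP-FTR": {"source": "SRC-001", "stress_factor": 2.0}}

def event_rate(event_id, failures=0, exposure_hours=0.0, prior_weight=1e6):
    """Failure rate for a Basic Event: the source rate times a stressing
    factor, followed by a gamma-Poisson Bayesian update that treats the
    prior as 'prior_weight' hours of pseudo-evidence (one simple way to
    fold in flight and test experience)."""
    be = basic_events[event_id]
    base = sources[be["source"]]["rate_per_hour"] * be["stress_factor"]
    alpha = base * prior_weight + failures     # pseudo-failures + observed
    beta = prior_weight + exposure_hours       # pseudo-hours + observed
    return alpha / beta

prior = event_rate("BE-PUMP-FTR")                               # 2e-6 per hour
updated = event_rate("BE-PUMP-FTR", failures=0, exposure_hours=5e5)
```

    With failure-free operating experience the estimate drifts below the prior; observed failures push it up. The same lookup key ties every number back to its source, which is the traceability the paper argues for.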

  10. Diabetes mellitus is associated with adverse structural and functional cardiac remodelling in chronic heart failure with reduced ejection fraction.

    PubMed

    Walker, Andrew Mn; Patel, Peysh A; Rajwani, Adil; Groves, David; Denby, Christine; Kearney, Lorraine; Sapsford, Robert J; Witte, Klaus K; Kearney, Mark T; Cubbon, Richard M

    2016-09-01

    Diabetes mellitus is associated with an increased risk of death and hospitalisation in patients with chronic heart failure. Better understanding of potential underlying mechanisms may aid the development of diabetes mellitus-specific chronic heart failure therapeutic strategies. Prospective observational cohort study of 628 patients with chronic heart failure associated with left ventricular systolic dysfunction receiving contemporary evidence-based therapy. Indices of cardiac structure and function, along with symptoms and biochemical parameters, were compared in patients with and without diabetes mellitus at study recruitment and 1 year later. Patients with diabetes mellitus (24.2%) experienced higher rates of all-cause [hazard ratio, 2.3 (95% confidence interval, 1.8-3.0)] and chronic heart failure-specific mortality and hospitalisation despite comparable pharmacological and device-based therapies. At study recruitment, patients with diabetes mellitus were more symptomatic, required greater diuretic doses and more frequently had radiologic evidence of pulmonary oedema, despite higher left ventricular ejection fraction. They also exhibited echocardiographic evidence of increased left ventricular wall thickness and pulmonary arterial pressure. Diabetes mellitus was associated with reduced indices of heart rate variability and increased heart rate turbulence. During follow-up, patients with diabetes mellitus experienced less beneficial left ventricular remodelling and greater deterioration in renal function. Diabetes mellitus is associated with features of adverse structural and functional cardiac remodelling in patients with chronic heart failure. © The Author(s) 2016.

  11. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to build in redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
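
    The effect of correlation on the value of redundancy can be illustrated with a toy beta-factor common-cause model — a standard reliability construct used here only for illustration, not SMART's actual formulation; all numbers are hypothetical:

```python
def redundant_pair_failure_prob(p: float, beta: float) -> float:
    """Failure probability of a 1-out-of-2 redundant pair under the
    beta-factor common-cause model: a fraction `beta` of each unit's
    failure probability `p` is attributed to a shared common cause.
    (Illustrative sketch only, not SMART's algorithm.)"""
    p_common = beta * p           # common-cause event disables both units
    p_indep = (1.0 - beta) * p    # remaining independent failure probability
    return p_common + (1.0 - p_common) * p_indep ** 2

p_fail_indep = redundant_pair_failure_prob(0.1, 0.0)  # p**2 = 0.01: redundancy helps
p_fail_corr = redundant_pair_failure_prob(0.1, 1.0)   # p = 0.1: redundancy buys nothing
```

    The two extremes bracket the answer: fully independent backups square the failure probability, while fully correlated ones leave it unchanged, which is why a quantitative treatment of correlation matters in such trade studies.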

  12. First permanent implant of the Jarvik 2000 Heart.

    PubMed

    Westaby, S; Banning, A P; Jarvik, R; Frazier, O H; Pigott, D W; Jin, X Y; Catarino, P A; Saito, S; Robson, D; Freeland, A; Myers, T J; Poole-Wilson, P A

    2000-09-09

    Heart failure is a major public-health concern. Quality and duration of life on maximum medical therapy are poor. The availability of donor hearts is severely limited, therefore an alternative approach is necessary. We have explored the use of a new type of left-ventricular assist device intended as a long-term solution to end-stage heart failure. As part of a prospective clinical trial, we implanted the first permanent Jarvik 2000 Heart--an intraventricular device with an innovative power delivery system--into a 61-year-old man (New York Heart Association functional class IV) with dilated cardiomyopathy. We assessed the effect of this left-ventricular assist device on both native heart function and the symptoms and systemic characteristics of heart failure. The Jarvik 2000 Heart sustained the patient's circulation, and was practical and user-friendly. After 6 weeks, exercise tolerance, myocardial function, and end-organ function improved. Symptoms of heart failure have resolved, and continuous decreased pulse-pressure perfusion has had no adverse effects in the short term. There has been no significant haemolysis and no device-related complications. The skull-mounted pedestal is unobtrusive and has healed well. The initial success of this procedure raises the possibility of a new treatment for end-stage heart failure. In the longer term, its role will be determined by mechanical reliability.

  13. Gaussian fitting for carotid and radial artery pressure waveforms: comparison between normal subjects and heart failure patients.

    PubMed

    Liu, Chengyu; Zheng, Dingchang; Zhao, Lina; Liu, Changchun

    2014-01-01

    It has been reported that Gaussian functions can accurately and reliably model both carotid and radial artery pressure waveforms (CAPW and RAPW). However, the physiological relevance of the characteristic features of the modeled Gaussian functions has been little investigated. This study therefore aimed to determine characteristic features from the Gaussian functions and to compare them between normal subjects and heart failure patients. Fifty-six normal subjects and 51 patients with heart failure were studied, with the CAPW and RAPW signals recorded simultaneously. The two signals were first normalized and then modeled by three positive Gaussian functions, with their peak amplitude, peak time, and half-width determined. Comparisons of these features were then made between the two groups. Results indicated that the peak amplitude of the first Gaussian curve was significantly decreased in heart failure patients compared with normal subjects (P<0.001). Significantly increased peak amplitude of the second Gaussian curve (P<0.001) and significantly shortened peak times of the second and third Gaussian curves (both P<0.001) were also observed in heart failure patients. These results held for both CAPW and RAPW signals, indicating the clinical significance of the Gaussian modeling, which should provide essential tools for further understanding the underlying physiological mechanisms of the artery pressure waveform.
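
    A minimal sketch of the three-Gaussian modelling described above, using `scipy.optimize.curve_fit` on a synthetic normalised waveform; all parameter values below are illustrative, not taken from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def three_gaussians(t, a1, m1, w1, a2, m2, w2, a3, m3, w3):
    """Sum of three positive Gaussians; per component: a = peak amplitude,
    m = peak time, w = width parameter (related to the half-width)."""
    g = lambda a, m, w: a * np.exp(-((t - m) / w) ** 2)
    return g(a1, m1, w1) + g(a2, m2, w2) + g(a3, m3, w3)

# Synthetic normalised waveform built from known (illustrative) parameters.
t = np.linspace(0.0, 1.0, 200)
true_params = (0.9, 0.15, 0.08, 0.5, 0.40, 0.12, 0.3, 0.70, 0.15)
rng = np.random.default_rng(0)
y = three_gaussians(t, *true_params) + 0.005 * rng.standard_normal(t.size)

p0 = (1.0, 0.1, 0.1, 0.5, 0.4, 0.1, 0.3, 0.7, 0.1)  # rough initial guess
popt, _ = curve_fit(three_gaussians, t, y, p0=p0, bounds=(0.0, np.inf))
a1_hat = popt[0]  # peak amplitude of the first Gaussian -- the feature
                  # reported above as reduced in heart failure patients
```

    The fitted peak amplitudes, peak times, and widths of the three components are then the characteristic features compared between patient groups.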

  14. Health management system for rocket engines

    NASA Technical Reports Server (NTRS)

    Nemeth, Edward

    1990-01-01

    The functional framework of a failure detection algorithm for the Space Shuttle Main Engine (SSME) is developed. The basic algorithm is based only on existing SSME measurements. Supplemental measurements, expected to enhance failure detection effectiveness, are identified. To support the algorithm development, a figure of merit is defined to estimate the likelihood of SSME criticality 1 failure modes and the failure modes are ranked in order of likelihood of occurrence. Nine classes of failure detection strategies are evaluated and promising features are extracted as the basis for the failure detection algorithm. The failure detection algorithm provides early warning capabilities for a wide variety of SSME failure modes. Preliminary algorithm evaluation, using data from three SSME failures representing three different failure types, demonstrated indications of imminent catastrophic failure well in advance of redline cutoff in all three cases.

  15. Predicting device failure after percutaneous repair of functional mitral regurgitation in advanced heart failure: Implications for patient selection.

    PubMed

    Stolfo, Davide; De Luca, Antonio; Morea, Gaetano; Merlo, Marco; Vitrella, Giancarlo; Caiffa, Thomas; Barbati, Giulia; Rakar, Serena; Korcova, Renata; Perkan, Andrea; Pinamonti, Bruno; Pappalardo, Aniello; Berardini, Alessandra; Biagini, Elena; Saia, Francesco; Grigioni, Francesco; Rapezzi, Claudio; Sinagra, Gianfranco

    2018-04-15

    Patients with heart failure (HF) and severe symptomatic functional mitral regurgitation (FMR) may benefit from MitraClip implantation. With increasing numbers of patients being treated, the success of the procedure becomes a key issue. We sought to investigate the pre-procedural predictors of device failure in patients with advanced HF treated with MitraClip. From April 2012 to November 2016, 76 patients with poor functional class (NYHA class III-IV) and severe left ventricular (LV) remodeling underwent MitraClip implantation at the University Hospitals of Trieste and Bologna (Italy). Device failure was assessed according to MVARC criteria. Patients were subsequently followed to additionally assess patient success after 12 months. Mean age was 67±12 years, the mean Log-EuroSCORE was 23.4±16.5%, and the mean LV end-diastolic volume index and ejection fraction (EF) were 112±33 ml/m² and 30.6±8.9%, respectively. At short-term evaluation, device failure was observed in 22 (29%) patients. Univariate predictors of device failure were LVEF, LV and left atrial volumes, and anteroposterior mitral annulus diameter. Annulus dimension (OR 1.153, 95% CI 1.002-1.327, p=0.043) and LV end-diastolic volume (OR 1.024, 95% CI 1.000-1.049, p=0.049) were the only variables independently associated with the risk of device failure in the multivariate model. Pre-procedural anteroposterior mitral annulus diameter accurately predicted the risk of device failure after MitraClip in the setting of advanced HF. Its assessment might aid the selection of the best candidates for percutaneous correction of FMR. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Design of high temperature ceramic components against fast fracture and time-dependent failure using CARES/LIFE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jadaan, O.M.; Powers, L.M.; Nemeth, N.N.

    1995-08-01

    A probabilistic design methodology which predicts the fast fracture and time-dependent failure behavior of thermomechanically loaded ceramic components is discussed using the CARES/LIFE integrated design computer program. Slow crack growth (SCG) is assumed to be the mechanism responsible for delayed failure behavior. Inert strength and dynamic fatigue data obtained from testing coupon specimens (O-ring and C-ring specimens) are initially used to calculate the fast fracture and SCG material parameters as a function of temperature using the parameter estimation techniques available with the CARES/LIFE code. Finite element analysis (FEA) is used to compute the stress distributions for the tube as a function of applied pressure. Knowing the stress and temperature distributions and the fast fracture and SCG material parameters, the lifetime for a given tube can be computed. A stress-failure probability-time to failure (SPT) diagram is subsequently constructed for these tubes. Such a diagram can be used by design engineers to estimate the time to failure at a given failure probability level for a component subjected to a given thermomechanical load.
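
    The two ingredients of this methodology can be sketched with simplified uniaxial formulas — the CARES/LIFE code itself handles multiaxial stress fields and rigorous parameter estimation, so this is only a stand-in with hypothetical parameter values:

```python
import math

def weibull_failure_prob(sigma: float, sigma0: float, m: float) -> float:
    """Fast-fracture failure probability from a two-parameter Weibull
    model: P_f = 1 - exp(-(sigma/sigma0)**m), where sigma0 is the
    characteristic strength and m the Weibull modulus."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))

def scg_time_to_failure(t_ref: float, sigma_ref: float,
                        sigma: float, n: float) -> float:
    """Power-law slow-crack-growth scaling at fixed failure probability:
    t_f * sigma**N = const, so t_f = t_ref * (sigma_ref / sigma)**N,
    where N is the SCG exponent estimated from dynamic fatigue data."""
    return t_ref * (sigma_ref / sigma) ** n

# At sigma = sigma0 the Weibull model gives P_f = 1 - 1/e ~ 0.632;
# halving the stress with N = 20 multiplies the predicted life by 2**20.
```

    Evaluating these two relations over a grid of stresses and failure probabilities is essentially how a stress-failure probability-time to failure (SPT) diagram is populated.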

  17. Parylene MEMS patency sensor for assessment of hydrocephalus shunt obstruction.

    PubMed

    Kim, Brian J; Jin, Willa; Baldwin, Alexander; Yu, Lawrence; Christian, Eisha; Krieger, Mark D; McComb, J Gordon; Meng, Ellis

    2016-10-01

    Neurosurgical ventricular shunts inserted to treat hydrocephalus experience a cumulative failure rate of 80% over 12 years; obstruction is responsible for most failures, with a majority occurring at the proximal catheter. Current diagnosis of shunt malfunction is imprecise and involves neuroimaging studies and shunt tapping, an invasive measurement of intracranial pressure and shunt patency. These patients often present emergently, and a delay in care has dire consequences. A microelectromechanical systems (MEMS) patency sensor was developed to enable direct and quantitative tracking of shunt patency in order to detect proximal shunt occlusion prior to the development of clinical symptoms, thereby avoiding delays in treatment. The sensor was fabricated on a flexible polymer substrate to eventually allow integration into a shunt. In this study, the sensor was packaged for use with external ventricular drainage systems for clinical validation. Insights into the transduction mechanism of the sensor were obtained. The impact of electrode size, clinically relevant temperatures and flows, and hydrogen peroxide (H2O2) plasma sterilization on sensor function was evaluated. Sensor performance in the presence of static and dynamic obstruction was demonstrated using 3 different models of obstruction. Electrode size was found to have a minimal effect on sensor performance, and increased temperature and flow resulted in a slight decrease in the baseline impedance due to an increase in ionic mobility. However, sensor response did not vary within clinically relevant temperature and flow ranges. H2O2 plasma sterilization also had no effect on sensor performance. This low-power, simple-format sensor was developed with the intention of future integration into shunts for wireless monitoring of shunt state and, more importantly, a more accurate and timely diagnosis of shunt failure.

  18. Human lymphatic pumping measured in healthy and lymphoedematous arms by lymphatic congestion lymphoscintigraphy

    PubMed Central

    Modi, S; Stanton, A W B; Svensson, W E; Peters, A M; Mortimer, P S; Levick, J R

    2007-01-01

    Axillary surgery for breast cancer partially obstructs lymph outflow from the arm, chronically raising the lymphatic smooth muscle afterload. This may lead to pump failure, as in hypertensive cardiac failure, and could explain features of breast cancer treatment-related lymphoedema (BCRL) such as its delayed onset. A new method was developed to measure human lymphatic contractility non-invasively and test the hypothesis of contractile impairment. 99mTc-human IgG (Tc-HIG), injected into the hand dermis, drained into the arm lymphatic system, which was imaged using a gamma-camera. Lymph transit time from hand to axilla, t_transit, was 9.6 ± 7.2 min (mean ± s.d.) (velocity 8.9 cm min−1) in seven normal subjects. To assess lymphatic contractility, a sphygmomanometer cuff around the upper arm was inflated to 60 mmHg (P_cuff) before 99mTc-HIG injection and maintained for a period much longer than t_transit. When P_cuff exceeded the maximum pressure generated by the lymphatic pump (P_pump), radiolabelled lymph was held up at the distal cuff border. P_cuff was then lowered in 10 mmHg steps until 99mTc-HIG began to flow under the cuff to the axilla, indicating P_pump ≥ P_cuff. In 16 normal subjects P_pump was 39 ± 14 mmHg. P_pump was 38% lower in 16 women with BCRL, namely 24 ± 19 mmHg (P = 0.014, Student's unpaired t test), and correlated negatively with the degree of swelling (12–56%). Blood radiolabel accumulation proved an unreliable measure of lymphatic pump function. Lymphatic congestion lymphoscintigraphy thus provided a quantitative measure of human lymphatic contractility without surgical cut-down, and the results supported the hypothesis of lymphatic pump failure in BCRL. PMID:17569739

  19. A morphologic characterisation of the 1963 Vajont Slide, Italy, using long-range terrestrial photogrammetry

    NASA Astrophysics Data System (ADS)

    Wolter, Andrea; Stead, Doug; Clague, John J.

    2014-02-01

    The 1963 Vajont Slide in northeast Italy is an important engineering and geological event. Although the landslide has been extensively studied, new insights can be derived by applying modern techniques such as remote sensing and numerical modelling. This paper presents the first digital terrestrial photogrammetric analyses of the failure scar, landslide deposits, and the area surrounding the failure, with a focus on the scar. We processed photogrammetric models to produce discontinuity stereonets, residual maps and profiles, and slope and aspect maps, all of which provide information on the failure scar morphology. Our analyses enabled the creation of a preliminary semi-quantitative morphologic classification of the Vajont failure scar based on the large-scale tectonic folds and step-paths that define it. The analyses and morphologic classification have implications for the kinematics, dynamics, and mechanism of the slide. Metre- and decametre-scale features affected the initiation, direction, and displacement rate of sliding. The most complexly folded and stepped areas occur close to the intersection of orthogonal synclinal features related to the Dinaric and Neoalpine deformation events. Our analyses also highlight, for the first time, the evolution of the Vajont failure scar from 1963 to the present.

  20. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or risk of failure probability. While the consequences of failure are often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.
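
    The handbook-style prediction being cautioned against has the shape of a base rate scaled by multiplicative factors; a minimal sketch, with pi-factor and k-factor values that are purely hypothetical rather than taken from MIL-HDBK-217F tables:

```python
def part_failure_rate(lambda_b: float, pi_factors: dict) -> float:
    """MIL-HDBK-217-style part-stress prediction: a base failure rate
    lambda_b (failures per 1e6 h) scaled by multiplicative pi-factors
    for temperature, quality, environment, etc. Values used below are
    illustrative placeholders, not handbook entries."""
    rate = lambda_b
    for factor in pi_factors.values():
        rate *= factor
    return rate

predicted = part_failure_rate(0.013, {"pi_T": 1.5, "pi_Q": 1.0, "pi_E": 4.0})
# A hypothetical k-factor derived from operational experience, applied to
# account for manufacturing, assembly, and process-control contributions
# that the component-data prediction misses:
adjusted = predicted * 3.0
```

    The gap between `predicted` and `adjusted` is exactly the underestimation the paper warns about when predicted values alone feed a risk-acceptance decision.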

  1. Stress redistribution and damage in interconnects caused by electromigration

    NASA Astrophysics Data System (ADS)

    Chiras, Stefanie Ruth

    Electromigration has long been recognized as a phenomenon that induces mass redistribution in metals which, when constrained, can lead to the creation of stress. Since the development of the integrated circuit, electromigration in interconnects (the metal lines which carry current between devices in integrated circuits) has become a reliability concern. The primary failure mechanism in the interconnects is usually voiding, which causes electrical resistance increases in the circuit. In some cases, however, another failure mode occurs: fracture of the surrounding dielectric driven by electromigration-induced compressive stresses within the interconnect. It is this failure mechanism that is the focus of this thesis. To study dielectric fracture, both residual processing stresses and the development of electromigration-induced stress in isolated, constrained interconnects were measured. The high-resolution measurements were made using two types of piezospectroscopy, complemented by finite element analysis (FEA). Both procedures directly measured stress in the underlying or neighboring substrate and used FEA to determine interconnect stresses. These interconnect stresses were related to the resulting circuit failure mode through post-test scanning electron microscopy and resistance measurements taken during electromigration testing. The results provide qualitative evidence of electromigration-driven passivation fracture, and quantitative analysis of the theoretical model of the failure, the "immortal" interconnect concept.

  2. Formal Specification and Validation of a Hybrid Connectivity Restoration Algorithm for Wireless Sensor and Actor Networks †

    PubMed Central

    Imran, Muhammad; Zafar, Nazir Ahmad

    2012-01-01

    Maintaining inter-actor connectivity is extremely crucial in mission-critical applications of Wireless Sensor and Actor Networks (WSANs), as actors have to quickly plan optimal coordinated responses to detected events. Failure of a critical actor partitions the inter-actor network into disjoint segments besides leaving a coverage hole, and thus hinders the network operation. This paper presents a Partitioning detection and Connectivity Restoration (PCR) algorithm to tolerate critical actor failure. As part of pre-failure planning, PCR determines critical/non-critical actors based on localized information and designates each critical node with an appropriate backup (preferably non-critical). The pre-designated backup detects the failure of its primary actor and initiates a post-failure recovery process that may involve coordinated multi-actor relocation. To prove the correctness, we construct a formal specification of PCR using Z notation. We model WSAN topology as a dynamic graph and transform PCR to the corresponding formal specification using Z notation. The formal specification is analyzed and validated using the Z Eves tool. Moreover, we simulate the specification to quantitatively analyze the efficiency of PCR. Simulation results confirm the effectiveness of PCR and show that it outperforms contemporary schemes found in the literature.
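
    A "critical actor" — one whose failure partitions the network — corresponds to a cut vertex (articulation point) in graph terms. A compact global computation is sketched below for clarity; PCR itself approximates this from localized neighbourhood information rather than a full depth-first search:

```python
def critical_actors(adj):
    """Return the set of cut vertices (articulation points) of a simple
    undirected graph given as an adjacency list: nodes whose removal
    disconnects the network. Classic DFS low-link computation."""
    n = len(adj)
    disc, low = [0] * n, [0] * n
    visited = [False] * n
    critical = set()
    timer = [1]

    def dfs(u, parent):
        visited[u] = True
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v == parent:          # simple graph: skip the tree edge back
                continue
            if visited[v]:
                low[u] = min(low[u], disc[v])   # back edge
            else:
                dfs(v, u)
                children += 1
                low[u] = min(low[u], low[v])
                if parent != -1 and low[v] >= disc[u]:
                    critical.add(u)  # no back edge from v's subtree above u
        if parent == -1 and children > 1:
            critical.add(u)          # root with multiple DFS subtrees

    for start in range(n):
        if not visited[start]:
            dfs(start, -1)
    return critical

# Path 0-1-2 with a leaf 3 attached to node 1: only node 1 is critical.
print(critical_actors([[1], [0, 2, 3], [1], [1]]))  # {1}
```

    PCR's pre-failure planning then pairs each such critical node with a (preferably non-critical) backup, so the expensive part happens before any failure occurs.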

  3. Using diagnostic experiences in experience-based innovative design

    NASA Astrophysics Data System (ADS)

    Prabhakar, Sattiraju; Goel, Ashok K.

    1992-03-01

    Designing a novel class of devices requires innovation. Often, the design knowledge of these devices does not identify and address the constraints that are required for their performance in the real-world operating environment. So any new design adapted from these devices tends to be similarly sketchy. In order to address this problem, we propose a case-based reasoning method called performance driven innovation (PDI). We model the design as a dynamic process, arrive at a design by adaptation from the known designs, generate failures for this design for some new constraints, and then use this failure knowledge to generate the required design knowledge for the new constraints. In this paper, we discuss two aspects of PDI: the representation of PDI cases and the translation of the failure knowledge into design knowledge for a constraint. Each case in PDI has two components: design knowledge and failure knowledge. Both are represented using a substance-behavior-function model. Failure knowledge comprises internal device failure behaviors and external environmental behaviors. For a given constraint, the environmental behavior interacts with the design behaviors to produce the internal failure behavior. The failure adaptation strategy generates functions, from the failure knowledge, which can be addressed using routine design methods. These ideas are illustrated using a coffee-maker example.

  4. Evaluation of Encapsulated Liver Cell Spheroids in a Fluidised-Bed Bioartificial Liver for Treatment of Ischaemic Acute Liver Failure in Pigs in a Translational Setting

    PubMed Central

    Selden, Clare; Spearman, Catherine Wendy; Kahn, Delawir; Miller, Malcolm; Figaji, Anthony; Erro, Eloy; Bundy, James; Massie, Isobel; Chalmers, Sherri-Ann; Arendse, Hiram; Gautier, Aude; Sharratt, Peter; Fuller, Barry; Hodgson, Humphrey

    2013-01-01

    Liver failure is an increasing problem. Donor-organ shortage results in patients dying before receiving a transplant. Since the liver can regenerate, alternative therapies providing temporary liver-support are sought. A bioartificial-liver would temporarily substitute function in liver failure, buying time for liver regeneration/organ-procurement. Our aim: to develop a prototype bioartificial-liver-machine (BAL) comprising a human liver-derived cell-line, cultured to phenotypic competence and deliverable in a clinical setting to sites distant from its preparation. The objective of this study was to determine whether its use would improve functional parameters of liver failure in pigs with acute liver failure, to provide proof-of-principle. HepG2 cells encapsulated in alginate beads, proliferated in a fluidised-bed bioreactor providing a biomass of 4–6×10¹⁰ cells, were transported from the preparation laboratory to the point-of-use operating theatre (6000 miles) under perfluorodecalin at ambient temperature. Irreversible ischaemic liver failure was induced in anaesthetised pigs, after portal-systemic shunt, by hepatic-artery ligation. Biochemical parameters, intracranial pressure, and functional clotting were measured in animals connected in an extracorporeal bioartificial-liver circuit. Efficacy was demonstrated comparing outcomes between animals connected to a circuit containing alginate-encapsulated cells (Cell-bead BAL) and those connected to a circuit containing alginate capsules without cells (Empty-bead BAL). Cells of the biomass met regulatory standards for sterility and provenance. All animals developed progressive liver failure after ischaemia induction. Efficacy of BAL was demonstrated since animals connected to a functional biomass (+ cells) had significantly smaller rises in intracranial pressure, lower ammonia levels, more bilirubin conjugation, improved acidosis and clotting restoration compared to animals connected to the circuit without cells. In the +cell group, human proteins accumulated in the pigs' plasma. Delivery of biomass using a short-term cold-chain enabled transport and use without loss of function over 3 days. Thus, a fluidised-bed bioreactor containing alginate-encapsulated HepG2 cell spheroids improved important parameters of acute liver failure in pigs. The system can readily be up-scaled and transported to point-of-use, justifying development at clinical scale. PMID:24367515

  5. The kidney in congestive heart failure: 'are natriuresis, sodium, and diuretics really the good, the bad and the ugly?'.

    PubMed

    Verbrugge, Frederik H; Dupont, Matthias; Steels, Paul; Grieten, Lars; Swennen, Quirine; Tang, W H Wilson; Mullens, Wilfried

    2014-02-01

    This review discusses renal sodium handling in heart failure. Increased sodium avidity and tendency to extracellular volume overload, i.e. congestion, are hallmark features of the heart failure syndrome. Particularly in the case of concomitant renal dysfunction, the kidneys often fail to elicit potent natriuresis. Yet, assessment of renal function is generally performed by measuring serum creatinine, which has inherent limitations as a biomarker for the glomerular filtration rate (GFR). Moreover, glomerular filtration only represents part of the nephron's function. Alterations in the fractional reabsorptive rate of sodium are at least equally important in emerging therapy-refractory congestion. Indeed, renal blood flow decreases before the GFR is affected in congestive heart failure. The resulting increased filtration fraction changes Starling forces in peritubular capillaries, which drive sodium reabsorption in the proximal tubules. Congestion further stimulates this process by augmenting renal lymph flow. Consequently, fractional sodium reabsorption in the proximal tubules is significantly increased, limiting sodium delivery to the distal nephron. Orthosympathetic activation probably plays a pivotal role in those deranged intrarenal haemodynamics, which ultimately enhance diuretic resistance, stimulate neurohumoral activation with aldosterone breakthrough, and compromise the counter-regulatory function of natriuretic peptides. Recent evidence even suggests that intrinsic renal derangements might impair natriuresis early on, before clinical congestion or neurohumoral activation are evident. This represents a paradigm shift in heart failure pathophysiology, as it suggests that renal dysfunction-although not by conventional GFR measurements-is driving disease progression. In this respect, a better understanding of renal sodium handling in congestive heart failure is crucial to achieve more tailored decongestive therapy, while preserving renal function. 
© 2013 The Authors. European Journal of Heart Failure © 2013 European Society of Cardiology.

  6. Cardiorenal syndrome: new developments in the understanding and pharmacologic management.

    PubMed

    House, Andrew A

    2013-10-01

    Cardiorenal syndromes (CRSs) with bidirectional heart-kidney signaling are increasingly being recognized for their association with increased morbidity and mortality. In acute CRS, recognition of the importance of worsening kidney function complicating management of acute decompensated heart failure has led to the examination of this specific outcome in the context of acute heart failure clinical trials. In particular, the role of fluid overload and venous congestion has focused interest in the most effective use of diuretic therapy to relieve symptoms of heart failure while at the same time preserving kidney function. Additionally, many novel vasoactive therapies have been studied in recent years with the hopes of augmenting cardiac function, improving symptoms and patient outcomes, while maintaining or improving kidney function. Similarly, recent advances in our understanding of the pathophysiology of chronic CRS have led to reanalysis of kidney outcomes in pivotal trials in chronic congestive heart failure, and newer trials are including changes in kidney function as well as kidney injury biomarkers as prospectively monitored and adjudicated outcomes. This paper provides an overview of some new developments in the pharmacologic management of acute and chronic CRS, examines several reports that illustrate a key management principle for each subtype, and discusses opportunities for future research.

  7. Impact of Variations in Kidney Function on Nonvitamin K Oral Anticoagulant Dosing in Patients With Atrial Fibrillation and Recent Acute Heart Failure.

    PubMed

    Andreu-Cayuelas, José M; Pastor-Pérez, Francisco J; Puche, Carmen M; Mateo-Martínez, Alicia; García-Alberola, Arcadio; Flores-Blanco, Pedro J; Valdés, Mariano; Lip, Gregory Y H; Roldán, Vanessa; Manzano-Fernández, Sergio

    2016-02-01

    Renal impairment and fluctuations in renal function are common in patients recently hospitalized for acute heart failure and in those with atrial fibrillation. The aim of the present study was to evaluate the hypothetical need for dosage adjustment (based on fluctuations in kidney function) of dabigatran, rivaroxaban and apixaban during the first 6 months after hospital discharge in patients with concomitant atrial fibrillation and heart failure. An observational study was conducted in 162 patients with nonvalvular atrial fibrillation after hospitalization for acute decompensated heart failure who underwent creatinine determinations during follow-up. The hypothetical recommended dosage of dabigatran, rivaroxaban and apixaban according to renal function was determined at discharge. Variations in serum creatinine and creatinine clearance and consequent changes in the recommended dosage of these drugs were identified during 6 months of follow-up. Among the overall study population, 44% of patients would have needed dabigatran dosage adjustment during follow-up, 35% would have needed rivaroxaban adjustment, and 29% would have needed apixaban dosage adjustment. A higher proportion of patients with creatinine clearance < 60 mL/min or with advanced age (≥ 75 years) would have needed dosage adjustment during follow-up. The need for dosage adjustment of nonvitamin K oral anticoagulants during follow-up is frequent in patients with atrial fibrillation after acute decompensated heart failure, especially among older patients and those with renal impairment. Further studies are needed to clarify the clinical importance of these needs for drug dosing adjustment and the ideal renal function monitoring regime in heart failure and other subgroups of patients with atrial fibrillation. Copyright © 2015 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
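
    The renal-function estimate driving such dosage decisions is typically creatinine clearance from the Cockcroft-Gault equation, so a modest rise in serum creatinine can move a patient across a dosing threshold. The equation below is standard; the dose bands, however, are deliberately generic placeholders — actual cut-offs differ per drug and per regional label:

```python
def cockcroft_gault(age_years: float, weight_kg: float,
                    serum_creatinine_mg_dl: float, female: bool) -> float:
    """Cockcroft-Gault estimate of creatinine clearance (mL/min):
    CrCl = (140 - age) * weight / (72 * SCr), times 0.85 for women."""
    crcl = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def dose_band(crcl: float) -> str:
    """Illustrative bands only -- real thresholds are drug- and
    label-specific; consult the product information."""
    if crcl < 30:
        return "reduced dose or contraindicated"
    if crcl < 50:
        return "reduced dose"
    return "standard dose"

# A creatinine rise from 1.0 to 1.6 mg/dL in a 78-year-old, 70 kg man
# moves him from ~60 mL/min ("standard dose") to ~38 mL/min ("reduced dose").
print(dose_band(cockcroft_gault(78, 70, 1.0, female=False)))  # standard dose
print(dose_band(cockcroft_gault(78, 70, 1.6, female=False)))  # reduced dose
```

    Recomputing this at each follow-up creatinine determination is, in effect, what the study did retrospectively to count how often a hypothetical dosage adjustment would have been required.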

  8. Multiple imputation methods for nonparametric inference on cumulative incidence with missing cause of failure

    PubMed Central

    Lee, Minjung; Dignam, James J.; Han, Junhee

    2014-01-01

    We propose a nonparametric approach for cumulative incidence estimation when causes of failure are unknown or missing for some subjects. Under the missing at random assumption, we estimate the cumulative incidence function using multiple imputation methods. We develop asymptotic theory for the cumulative incidence estimators obtained from multiple imputation methods. We also discuss how to construct confidence intervals for the cumulative incidence function and perform a test for comparing the cumulative incidence functions in two samples with missing cause of failure. Through simulation studies, we show that the proposed methods perform well. The methods are illustrated with data from a randomized clinical trial in early stage breast cancer. PMID:25043107
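
    Under the missing-at-random assumption, the core idea can be sketched in a toy setting with no censoring; the paper's estimator additionally handles censoring nonparametrically and models the imputation distribution more carefully, so this is only a schematic illustration with hypothetical data:

```python
import random

def cif_with_imputation(times, causes, t, cause=1, m=20, seed=0):
    """Multiple-imputation estimate of the cumulative incidence of `cause`
    by time t when some causes are missing (None). Toy sketch: missing
    causes are imputed from the marginal observed cause distribution, the
    cumulative incidence is computed on each completed dataset, and the m
    estimates are averaged (the point-estimate part of Rubin's rules)."""
    rng = random.Random(seed)
    observed = [c for c in causes if c is not None]
    p1 = observed.count(cause) / len(observed)  # P(cause | cause observed)
    n = len(times)
    estimates = []
    for _ in range(m):
        imputed = [c if c is not None else (cause if rng.random() < p1 else 0)
                   for c in causes]
        estimates.append(sum(1 for ti, ci in zip(times, imputed)
                             if ti <= t and ci == cause) / n)
    return sum(estimates) / m

times = [2, 3, 5, 7, 8, 11, 12, 15]          # hypothetical failure times
causes = [1, 0, 1, None, 0, 1, None, 0]       # None = missing cause
print(cif_with_imputation(times, causes, t=10))
```

    Rubin's rules also combine the within- and between-imputation variances to get valid confidence intervals, which is where the asymptotic theory developed in the paper comes in.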

  9. Functionally different PIN proteins control auxin flux during bulbil development in Agave tequilana

    PubMed Central

    Abraham Juárez, María Jazmín; Hernández Cárdenas, Rocío; Santoyo Villa, José Natzul; O’Connor, Devin; Sluis, Aaron; Hake, Sarah; Ordaz-Ortiz, José; Terry, Leon; Simpson, June

    2015-01-01

    In Agave tequilana, reproductive failure or inadequate flower development stimulates the formation of vegetative bulbils at the bracteoles, ensuring survival in a hostile environment. Little is known about the signals that trigger this probably unique phenomenon in agave species. Here we report that auxin plays a central role in bulbil development and show that the localization of PIN1-related proteins is consistent with altered auxin transport during this process. Analysis of agave transcriptome data led to the identification of the A. tequilana orthologue of PIN1 (denoted AtqPIN1) and a second closely related gene from a distinct clade reported as ‘Sister of PIN1’ (denoted AtqSoPIN1). Quantitative real-time reverse transcription–PCR (RT-qPCR) analysis showed different patterns of expression for each gene during bulbil formation, and heterologous expression of the A. tequilana PIN1 and SoPIN1 genes in Arabidopsis thaliana confirmed functional differences between these genes. Although no free auxin was detected in induced pedicel samples, changes in the levels of auxin precursors were observed. Taken as a whole, the data support the model that AtqPIN1 and AtqSoPIN1 have co-ordinated but distinct functions in relation to auxin transport during the initial stages of bulbil formation. PMID:25911746

  10. Non-coding RNA in cystic fibrosis.

    PubMed

    Glasgow, Arlene M A; De Santi, Chiara; Greene, Catherine M

    2018-05-09

    Non-coding RNAs (ncRNAs) are an abundant class of RNAs that include small ncRNAs, long non-coding RNAs (lncRNA) and pseudogenes. The human ncRNA atlas includes thousands of these specialised RNA molecules that are further subcategorised based on their size or function. Two of the more well-known and widely studied ncRNA species are microRNAs (miRNAs) and lncRNAs. These are regulatory RNAs and their altered expression has been implicated in the pathogenesis of a variety of human diseases. Failure to express a functional cystic fibrosis (CF) transmembrane conductance regulator (CFTR) chloride ion channel in epithelial cells underpins CF. Secondary to the CFTR defect, it is known that other pathways can be altered and these may contribute to the pathophysiology of CF lung disease in particular. For example, quantitative alterations in expression of some ncRNAs are associated with CF. In recent years, there has been a series of published studies exploring ncRNA expression and function in CF. The majority have focussed principally on miRNAs, with just a handful of reports to date on lncRNAs. The present study reviews what is currently known about ncRNA expression and function in CF, and discusses the possibility of applying this knowledge to the clinical management of CF in the near future. © 2018 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  11. Reliability analysis of repairable systems using Petri nets and vague Lambda-Tau methodology.

    PubMed

    Garg, Harish

    2013-01-01

    The main objective of the paper is to develop a methodology, named vague Lambda-Tau, for reliability analysis of repairable systems. A Petri net tool is applied to represent the asynchronous and concurrent processing of the system instead of fault tree analysis. To enhance the relevance of the reliability study, vague set theory is used for representing the failure rate and repair times instead of classical (crisp) or fuzzy set theory, because vague sets are characterized by a truth membership function and a false (non-membership) function such that the sum of the two values is less than 1. The proposed methodology involves qualitative modeling using PN and quantitative analysis using the Lambda-Tau method of solution, with the basic events represented by intuitionistic fuzzy numbers with triangular membership functions. Sensitivity analysis has also been performed and the effects on system MTBF are addressed. The methodology addresses the shortcomings of the existing probabilistic approaches and gives a better understanding of the system behavior through its graphical representation. The washing unit of a paper mill situated in a northern part of India, producing approximately 200 tons of paper per day, has been considered to demonstrate the proposed approach. The results may be helpful for the plant personnel for analyzing the systems' behavior and to improve their performance by adopting suitable maintenance strategies. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
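The quantitative half of such an analysis can be illustrated with ordinary triangular fuzzy numbers and α-cut interval arithmetic (the vague/intuitionistic extension carries a second, non-membership function, which is omitted here; the numbers and function names are made up for illustration):

```python
def alpha_cut(tfn, a):
    """Interval of a triangular fuzzy number (l, m, u) at membership level a."""
    l, m, u = tfn
    return (l + a * (m - l), u - a * (u - m))

def or_gate_failure_rate(tfns, a):
    """Lambda-Tau rule for an OR gate (series logic): failure rates add,
    so the alpha-cut intervals add endpoint-wise."""
    cuts = [alpha_cut(t, a) for t in tfns]
    return (sum(c[0] for c in cuts), sum(c[1] for c in cuts))

def mtbf_interval(rate_interval):
    """MTBF = 1/lambda; inverting an interval swaps its endpoints."""
    lo, hi = rate_interval
    return (1.0 / hi, 1.0 / lo)
```

For an AND gate the Lambda-Tau expressions involve products of failure rates and repair times instead; defuzzifying the resulting intervals yields crisp indices such as system MTBF.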

  12. Strain, strain rate, and the force frequency relationship in patients with and without heart failure.

    PubMed

    Mak, Susanna; Van Spall, Harriette G C; Wainstein, Rodrigo V; Sasson, Zion

    2012-03-01

    The aim of this study was to examine the effect of heart rate (HR) on indices of deformation in adults with and without heart failure (HF) who underwent simultaneous high-fidelity catheterization of the left ventricle to describe the force-frequency relationship. Right atrial pacing to control HR and high-fidelity recordings of left ventricular (LV) pressure were used to inscribe the force-frequency relationship. Simultaneous two-dimensional echocardiographic imaging was acquired for speckle-tracking analysis. Thirteen patients with normal LV function and 12 with systolic HF (LV ejection fraction, 31 ± 13%) were studied. Patients with HF had depressed isovolumic contractility and impaired longitudinal strain and strain rate. HR-dependent increases in LV +dP/dt(max) (the force-frequency relationship) were demonstrated in both groups (normal LV function, baseline to 100 beats/min: 1,335 ± 296 to 1,564 ± 320 mm Hg/sec, P < .0001; HF, baseline to 100 beats/min: 970 ± 207 to 1,083 ± 233 mm Hg/sec, P < .01). Longitudinal strain decreased significantly (normal LV function, baseline to 100 beats/min: 18.0 ± 3.5% to 10.8 ± 6.0%, P < .001; HF: 9.4 ± 4.1% to 7.5 ± 3.4%, P < .01). The decrease in longitudinal strain was related to a decrease in LV end-diastolic dimensions. Strain rate did not change with right atrial pacing. Despite the inotropic effect of increasing HR, longitudinal strain decreases in parallel with stroke volume, as both are load-dependent indices of ejection. Strain rate did not reflect the modest HR-related changes in contractility; on the other hand, the use of strain rate for quantitative stress imaging is also less likely to be confounded by chronotropic responses. Copyright © 2012 American Society of Echocardiography. Published by Mosby, Inc. All rights reserved.

  13. High Risk of Graft Failure in Emerging Adult Heart Transplant Recipients.

    PubMed

    Foster, B J; Dahhou, M; Zhang, X; Dharnidharka, V; Ng, V; Conway, J

    2015-12-01

    Emerging adulthood (17-24 years) is a period of high risk for graft failure in kidney transplant. Whether a similar association exists in heart transplant recipients is unknown. We sought to estimate the relative hazards of graft failure at different current ages, compared with patients between 20 and 24 years old. We evaluated 11 473 patients recorded in the Scientific Registry of Transplant Recipients who received a first transplant at <40 years old (1988-2013) and had at least 6 months of graft function. Time-dependent Cox models were used to estimate the association between current age (time-dependent) and failure risk, adjusted for time since transplant and other potential confounders. Failure was defined as death following graft failure or retransplant; observation was censored at death with graft function. There were 2567 failures. Crude age-specific graft failure rates were highest in 21-24 year olds (4.2 per 100 person-years). Compared to individuals with the same time since transplant, 21-24 year olds had significantly higher failure rates than all other age periods except 17-20 years (HR 0.92 [95%CI 0.77, 1.09]) and 25-29 years (0.86 [0.73, 1.03]). Among young first heart transplant recipients, graft failure risks are highest in the period from 17 to 29 years of age. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.

  14. Organ dysfunction, injury and failure in acute heart failure: from pathophysiology to diagnosis and management. A review on behalf of the Acute Heart Failure Committee of the Heart Failure Association (HFA) of the European Society of Cardiology (ESC).

    PubMed

    Harjola, Veli-Pekka; Mullens, Wilfried; Banaszewski, Marek; Bauersachs, Johann; Brunner-La Rocca, Hans-Peter; Chioncel, Ovidiu; Collins, Sean P; Doehner, Wolfram; Filippatos, Gerasimos S; Flammer, Andreas J; Fuhrmann, Valentin; Lainscak, Mitja; Lassus, Johan; Legrand, Matthieu; Masip, Josep; Mueller, Christian; Papp, Zoltán; Parissis, John; Platz, Elke; Rudiger, Alain; Ruschitzka, Frank; Schäfer, Andreas; Seferovic, Petar M; Skouri, Hadi; Yilmaz, Mehmet Birhan; Mebazaa, Alexandre

    2017-07-01

    Organ injury and impairment are commonly observed in patients with acute heart failure (AHF), and congestion is an essential pathophysiological mechanism of impaired organ function. Congestion is the predominant clinical profile in most patients with AHF; a smaller proportion presents with peripheral hypoperfusion or cardiogenic shock. Hypoperfusion further deteriorates organ function. The injury and dysfunction of target organs (i.e. heart, lungs, kidneys, liver, intestine, brain) in the setting of AHF are associated with increased risk for mortality. Improvement in organ function after decongestive therapies has been associated with a lower risk for post-discharge mortality. Thus, the prevention and correction of organ dysfunction represent a therapeutic target of interest in AHF and should be evaluated in clinical trials. Treatment strategies that specifically prevent, reduce or reverse organ dysfunction remain to be identified and evaluated to determine if such interventions impact mortality, morbidity and patient-centred outcomes. This paper reflects current understanding among experts of the presentation and management of organ impairment in AHF and suggests priorities for future research to advance the field. © 2017 The Authors. European Journal of Heart Failure © 2017 European Society of Cardiology.

  15. Pyrotechnic system failures: Causes and prevention

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.

    1988-01-01

    Although pyrotechnics have successfully accomplished many critical mechanical spacecraft functions, such as ignition, severance, jettisoning and valving (excluding propulsion), failures continue to occur. Provided is a listing of 84 failures of pyrotechnic hardware with completed design over a 23-year period, compiled informally by experts from every NASA Center, as well as the Air Force Space Division and the Naval Surface Warfare Center. Analyses are presented as to when and where these failures occurred, their technical source or cause, followed by the reasons why and how these kinds of failures persist. The major contributor is a fundamental lack of understanding of the functional mechanisms of pyrotechnic devices and systems, followed by not recognizing pyrotechnics as an engineering technology, insufficient manpower with hands-on experience, too few test facilities, and inadequate guidelines and specifications for design, development, qualification and acceptance. Recommendations are made on both a managerial and technical basis to prevent failures, increase reliability, improve existing and future designs, and develop the technology to meet future requirements.

  16. MicroRNAs in Heart Failure, Cardiac Transplantation, and Myocardial Recovery: Biomarkers with Therapeutic Potential.

    PubMed

    Shah, Palak; Bristow, Michael R; Port, J David

    2017-12-01

    Heart failure is increasing in prevalence with a lack of recently developed therapies that produce major beneficial effects on its associated mortality. MicroRNAs are small non-coding RNA molecules that regulate gene expression, are differentially regulated in heart failure, and are found in the circulation, serving as a biomarker of heart failure. Data suggest that microRNAs may be used to detect allograft rejection in cardiac transplantation and may predict the degree of myocardial recovery in patients with a left ventricular assist device or treated with beta-blocker therapy. Given their role in regulating cellular function, microRNAs are an intriguing target for oligonucleotide therapeutics, designed to mimic or antagonize (antagomir) their biological effects. We review the current state of microRNAs as biomarkers of heart failure and associated conditions, the mechanisms by which microRNAs control cellular function, and how specific microRNAs may be targeted with novel therapeutics designed to treat heart failure.

  17. Effect of dolutegravir functional monotherapy on HIV-1 virological response in integrase strand transfer inhibitor resistant patients.

    PubMed

    Naeger, Lisa K; Harrington, Patrick; Komatsu, Takashi; Deming, Damon

    2016-01-01

    VIKING-4 assessed the safety and efficacy of dolutegravir in heavily antiretroviral treatment-experienced patients who had documented integrase strand transfer inhibitor (INSTI) resistance-associated substitutions in their HIV. VIKING-4 had a placebo-controlled 7-day dolutegravir functional monotherapy phase followed by dolutegravir plus an optimized background regimen for 48 weeks. Independent resistance analyses evaluated week 48 virological responses in the VIKING-4 trial based on the presence of baseline INSTI resistance-associated substitutions and baseline dolutegravir phenotypic susceptibility. Response rates at week 48 based on baseline dolutegravir resistance subgroups were compared for the 7-day dolutegravir functional monotherapy arm and placebo-control arm. Additionally, genotypic and phenotypic resistance at day 8 and time of failure was analysed for the virological failures from both arms. Week 48 response rates for VIKING-4 were 23% (3/13) in the 7-day dolutegravir functional monotherapy arm compared with 60% (9/15) in the 7-day placebo arm. Response rates were consistently lower in the dolutegravir functional monotherapy arm across baseline INSTI genotypic and phenotypic subgroups. There was a higher proportion of virological failures in the 7-day dolutegravir functional monotherapy arm (n=6/13; 46%) compared with the 7-day placebo arm (n=3/15; 20%). Additionally, five virological failures in the dolutegravir arm had virus expressing emergent INSTI resistance-associated substitutions compared with two in the placebo arm. Analysis of response rates and resistance emergence in VIKING-4 suggests careful consideration should be given to the duration of functional monotherapy in future studies of highly treatment-experienced patients to reduce the risk of resistance and virological failure.

  18. Cannabidiol improves brain and liver function in a fulminant hepatic failure-induced model of hepatic encephalopathy in mice

    PubMed Central

    Avraham, Y; Grigoriadis, NC; Poutahidis, T; Vorobiev, L; Magen, I; Ilan, Y; Mechoulam, R; Berry, EM

    2011-01-01

    BACKGROUND AND PURPOSE Hepatic encephalopathy is a neuropsychiatric disorder of complex pathogenesis caused by acute or chronic liver failure. We investigated the effects of cannabidiol, a non-psychoactive constituent of Cannabis sativa with anti-inflammatory properties that activates the 5-hydroxytryptamine receptor 5-HT1A, on brain and liver functions in a model of hepatic encephalopathy associated with fulminant hepatic failure induced in mice by thioacetamide. EXPERIMENTAL APPROACH Female Sabra mice were injected with either saline or thioacetamide and were treated with either vehicle or cannabidiol. Neurological and motor functions were evaluated 2 and 3 days, respectively, after induction of hepatic failure, after which brains and livers were removed for histopathological analysis and blood was drawn for analysis of plasma liver enzymes. In a separate group of animals, cognitive function was tested after 8 days and brain 5-HT levels were measured 12 days after induction of hepatic failure. KEY RESULTS Neurological and cognitive functions were severely impaired in thioacetamide-treated mice and were restored by cannabidiol. Similarly, decreased motor activity in thioacetamide-treated mice was partially restored by cannabidiol. Increased plasma levels of ammonia, bilirubin and liver enzymes, as well as enhanced 5-HT levels in thioacetamide-treated mice were normalized following cannabidiol administration. Likewise, astrogliosis in the brains of thioacetamide-treated mice was moderated after cannabidiol treatment. CONCLUSIONS AND IMPLICATIONS Cannabidiol restores liver function, normalizes 5-HT levels and improves brain pathology in accordance with normalization of brain function. Therefore, the effects of cannabidiol may result from a combination of its actions in the liver and brain. PMID:21182490

  19. Cannabidiol improves brain and liver function in a fulminant hepatic failure-induced model of hepatic encephalopathy in mice.

    PubMed

    Avraham, Y; Grigoriadis, NC; Poutahidis, T; Vorobiev, L; Magen, I; Ilan, Y; Mechoulam, R; Berry, EM

    2011-04-01

    Hepatic encephalopathy is a neuropsychiatric disorder of complex pathogenesis caused by acute or chronic liver failure. We investigated the effects of cannabidiol, a non-psychoactive constituent of Cannabis sativa with anti-inflammatory properties that activates the 5-hydroxytryptamine receptor 5-HT(1A), on brain and liver functions in a model of hepatic encephalopathy associated with fulminant hepatic failure induced in mice by thioacetamide. Female Sabra mice were injected with either saline or thioacetamide and were treated with either vehicle or cannabidiol. Neurological and motor functions were evaluated 2 and 3 days, respectively, after induction of hepatic failure, after which brains and livers were removed for histopathological analysis and blood was drawn for analysis of plasma liver enzymes. In a separate group of animals, cognitive function was tested after 8 days and brain 5-HT levels were measured 12 days after induction of hepatic failure. Neurological and cognitive functions were severely impaired in thioacetamide-treated mice and were restored by cannabidiol. Similarly, decreased motor activity in thioacetamide-treated mice was partially restored by cannabidiol. Increased plasma levels of ammonia, bilirubin and liver enzymes, as well as enhanced 5-HT levels in thioacetamide-treated mice were normalized following cannabidiol administration. Likewise, astrogliosis in the brains of thioacetamide-treated mice was moderated after cannabidiol treatment. Cannabidiol restores liver function, normalizes 5-HT levels and improves brain pathology in accordance with normalization of brain function. Therefore, the effects of cannabidiol may result from a combination of its actions in the liver and brain. © 2011 The Authors. British Journal of Pharmacology © 2011 The British Pharmacological Society.

  20. Remedial Math Instruction Intervention: Efficacy of Constructivist Practices on Alternative Students with Disabilities Mathematics Achievement

    ERIC Educational Resources Information Center

    Mbwiri, Francis I.

    2017-01-01

    Many students with disabilities attending alternative high schools are not improving their mathematics ability scores. Failure to improve their mathematics ability scores has hampered their potential academic success and career prospects, resulting in many students dropping out of schools without graduating. The purpose of this quantitative study…

  1. The Relationship between Earned Value Management Metrics and Customer Satisfaction

    ERIC Educational Resources Information Center

    Plumer, David R.

    2010-01-01

    Information Technology (IT) products have a high rate of failure. Only 25% of IT projects were completed within budget and schedule, and 15% of completed projects were not operational. Researchers have not investigated the success of project management systems from the perspective of customer satisfaction. In this quantitative study, levels of…

  2. Environment assisted degradation mechanisms in advanced light metals

    NASA Technical Reports Server (NTRS)

    Gangloff, Richard P.; Stoner, Glenn E.; Swanson, Robert E.

    1988-01-01

    The general goals of the research program are to characterize alloy behavior quantitatively and to develop predictive mechanisms for environmental failure modes. Successes in this regard will provide the basis for metallurgical optimization of alloy performance, for chemical control of aggressive environments, and for engineering life prediction with damage tolerance and long term reliability.

  3. A Meta-Analysis of Predictors of Offender Treatment Attrition and Its Relationship to Recidivism

    ERIC Educational Resources Information Center

    Olver, Mark E.; Stockdale, Keira C.; Wormith, J. Stephen

    2011-01-01

    Objective: The failure of offenders to complete psychological treatment can pose significant concerns, including increased risk for recidivism. Although a large literature identifying predictors of offender treatment attrition has accumulated, there has yet to be a comprehensive quantitative review. Method: A meta-analysis of the offender…

  4. Number and placement of control system components considering possible failures. [for large space structures

    NASA Technical Reports Server (NTRS)

    Vander Velde, W. E.; Carignan, C. R.

    1984-01-01

    One of the first questions facing the designer of the control system for a large space structure is how many components - actuators and sensors - to specify and where to place them on the structure. This paper presents a methodology which is intended to assist the designer in making these choices. A measure of controllability is defined which is a quantitative indication of how well the system can be controlled with a given set of actuators. Similarly, a measure of observability is defined which is a quantitative indication of how well the system can be observed with a given set of sensors. Then the effect of component unreliability is introduced by computing the average expected degree of controllability (observability) over the operating lifetime of the system accounting for the likelihood of various combinations of component failures. The problem of component location is resolved by optimizing this performance measure over the admissible set of locations. The variation of this optimized performance measure with number of actuators (sensors) is helpful in deciding how many components to use.
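The construction can be sketched as follows: take the trace of a finite-horizon controllability Gramian as a scalar "degree of controllability", then form its expectation over actuator-failure states by weighting each surviving subset with its probability (the paper's actual measure and reliability model may differ; this is a hedged illustration):

```python
import numpy as np
from itertools import combinations

def gramian_measure(A, B, horizon=50):
    """Trace of the finite-horizon discrete-time controllability Gramian
    W = sum_k A^k B B^T (A^T)^k -- one simple scalar controllability proxy."""
    n = A.shape[0]
    W = np.zeros((n, n))
    Ak = np.eye(n)
    for _ in range(horizon):
        W += Ak @ B @ B.T @ Ak.T
        Ak = A @ Ak
    return float(np.trace(W))

def expected_measure(A, actuator_cols, p_surv):
    """Expected degree of controllability when actuator i (column i of B)
    survives independently with probability p_surv[i]."""
    m = len(actuator_cols)
    total = 0.0
    for k in range(1, m + 1):                 # empty set contributes zero
        for alive in combinations(range(m), k):
            prob = 1.0
            for i in range(m):
                prob *= p_surv[i] if i in alive else 1.0 - p_surv[i]
            if prob == 0.0:
                continue
            B = np.column_stack([actuator_cols[i] for i in alive])
            total += prob * gramian_measure(A, B)
    return total
```

Averaging such a measure over the operating lifetime with time-varying survival probabilities, then maximizing over the admissible locations, mirrors the placement procedure the abstract describes; the dual computation with the output matrix in place of B gives the observability measure.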

  5. Family Partner Intervention Influences Self-Care Confidence and Treatment Self-Regulation in Patients with Heart Failure

    PubMed Central

    Stamp, Kelly D.; Dunbar, Sandra B.; Clark, Patricia C.; Reilly, Carolyn M.; Gary, Rebecca A.; Higgins, Melinda; Ryan, Richard M

    2015-01-01

    Background Heart failure self-care requires confidence in one’s ability and motivation to perform a recommended behavior. Most self-care occurs within a family context, yet little is known about the influence of family on heart failure self-care or motivating factors. Aims To examine the association of family functioning and the self-care antecedents of confidence and motivation among heart failure participants and determine if a family partnership intervention would promote higher levels of perceived confidence and treatment self-regulation (motivation) at four and eight months compared to patient-family education or usual care groups. Methods Heart failure patients (N = 117) and a family member were randomized to a family partnership intervention, patient-family education or usual care groups. Measures of patient’s perceived family functioning, confidence, motivation for medications and following a low-sodium diet were analyzed. Data were collected at baseline, four and eight months. Results Family functioning was related to self-care confidence for diet (p=.02) and autonomous motivation for adhering to their medications (p=.05 and diet p=0.2). The family partnership intervention group significantly improved confidence (p=.05) and motivation (medications p=.004; diet p=.012) at four months, whereas the patient-family education and usual care groups did not change. Conclusion Perceived confidence and motivation for self-care was enhanced by the family partnership intervention, regardless of family functioning. Poor family functioning at baseline contributed to lower confidence. Family functioning should be assessed to guide tailored family-patient interventions for better outcomes. PMID:25673525

  6. Can magma-injection and groundwater forces cause massive landslides on Hawaiian volcanoes?

    USGS Publications Warehouse

    Iverson, R.M.

    1995-01-01

    Landslides with volumes exceeding 1000 km3 have occurred on the flanks of Hawaiian volcanoes. Because the flanks typically slope seaward no more than 12°, the mechanics of slope failure are problematic. Limit-equilibrium analyses of wedge-shaped slices of the volcano flanks show that magma injection at prospective headscarps might trigger the landslides, but only under very restrictive conditions. Additional calculations show that groundwater head gradients associated with topographically induced flow and sea-level change are less likely to be important. Thus a simple, quantitative explanation for failure of Hawaiian volcano flanks remains elusive, and more complex scenarios may merit investigation. -from Author
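The flavor of such a limit-equilibrium calculation can be conveyed by a planar-slide factor of safety with pore-water uplift and a horizontal magma push at the headscarp (a deliberately crude sketch; Iverson's wedge analysis is far more elaborate, and all symbols here are generic):

```python
import math

def factor_of_safety(W, theta_deg, c, A, phi_deg, U=0.0, P_magma=0.0):
    """Resisting forces (cohesion c over basal area A, plus friction on
    the effective normal force) divided by driving forces (downslope
    component of the weight W plus the slope-parallel component of a
    horizontal magma-pressure force P_magma); U is pore-water uplift."""
    th, ph = math.radians(theta_deg), math.radians(phi_deg)
    driving = W * math.sin(th) + P_magma * math.cos(th)
    normal = W * math.cos(th) - U - P_magma * math.sin(th)
    resisting = c * A + max(normal, 0.0) * math.tan(ph)
    return resisting / driving
```

With a dry, unpressurized, cohesionless plane this reduces to tan(phi)/tan(theta), which for a 12-degree flank exceeds 2.7 even at phi = 30 degrees, illustrating why some additional driving force such as magma pressure is needed to bring the factor of safety toward 1.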

  7. Experimental investigation of the crashworthiness of scaled composite sailplane fuselages

    NASA Technical Reports Server (NTRS)

    Kampf, Karl-Peter; Crawley, Edward F.; Hansman, R. John, Jr.

    1989-01-01

    The crash dynamics and energy absorption of composite sailplane fuselage segments undergoing nose-down impact were investigated. More than 10 quarter-scale structurally similar test articles, typical of high-performance sailplane designs, were tested. Fuselage segments were fabricated of combinations of fiberglass, graphite, Kevlar, and Spectra fabric materials. Quasistatic and dynamic tests were conducted. The quasistatic tests were found to replicate the strain history and failure modes observed in the dynamic tests. Failure modes of the quarter-scale model were qualitatively compared with full-scale crash evidence and quantitatively compared with current design criteria. By combining material and structural improvements, substantial increases in crashworthiness were demonstrated.

  8. Scalable Energy Efficiency with Resilience for High Performance Computing Systems: A Quantitative Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Li; Chen, Zizhong; Song, Shuaiwen

    2016-01-18

    Energy efficiency and resilience are two crucial challenges for HPC systems to reach exascale. While energy efficiency and resilience issues have been extensively studied individually, little has been done to understand the interplay between energy efficiency and resilience for HPC systems. Decreasing the supply voltage associated with a given operating frequency for processors and other CMOS-based components can significantly reduce power consumption. However, this often raises system failure rates and consequently increases application execution time. In this work, we present an energy saving undervolting approach that leverages the mainstream resilience techniques to tolerate the increased failures caused by undervolting.
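The interplay can be made concrete with a toy model (entirely illustrative, not the paper's model): dynamic power scales roughly as V²f, the failure rate grows exponentially as voltage drops below nominal, and each failure costs fixed recovery time, so expected energy is minimized at some voltage below nominal:

```python
import math

def expected_energy(v, v_nom=1.0, f=1.0, t0=100.0,
                    lam0=1e-3, k=8.0, recovery=50.0):
    """Return (expected_time, expected_energy) at supply voltage v.
    lam0 * exp(k * (v_nom - v)) is an assumed failure rate per unit time;
    each failure adds `recovery` time units (checkpoint/restart style)."""
    lam = lam0 * math.exp(k * (v_nom - v))
    t = t0 * (1.0 + lam * recovery)   # expected slowdown from recoveries
    power = v * v * f                 # dynamic power ~ C * V^2 * f
    return t, power * t

def best_voltage(vs, **kw):
    """Voltage among vs with the lowest expected energy."""
    return min(vs, key=lambda v: expected_energy(v, **kw)[1])
```

With these made-up parameters the optimum sits strictly between aggressive undervolting and nominal voltage, which is precisely the tension that resilience-assisted undervolting is designed to exploit: better fault tolerance flattens the recovery penalty and pushes the optimum lower.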

  9. Scalable Energy Efficiency with Resilience for High Performance Computing Systems: A Quantitative Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Li; Chen, Zizhong; Song, Shuaiwen Leon

    2015-11-16

    Energy efficiency and resilience are two crucial challenges for HPC systems to reach exascale. While energy efficiency and resilience issues have been extensively studied individually, little has been done to understand the interplay between energy efficiency and resilience for HPC systems. Decreasing the supply voltage associated with a given operating frequency for processors and other CMOS-based components can significantly reduce power consumption. However, this often raises system failure rates and consequently increases application execution time. In this work, we present an energy saving undervolting approach that leverages the mainstream resilience techniques to tolerate the increased failures caused by undervolting.

  10. A unified phase-field theory for the mechanics of damage and quasi-brittle failure

    NASA Astrophysics Data System (ADS)

    Wu, Jian-Ying

    2017-06-01

    Being one of the most promising candidates for the modeling of localized failure in solids, so far the phase-field method has been applied only to brittle fracture with very few exceptions. In this work, a unified phase-field theory for the mechanics of damage and quasi-brittle failure is proposed within the framework of thermodynamics. Specifically, the crack phase-field and its gradient are introduced to regularize the sharp crack topology in a purely geometric context. The energy dissipation functional due to crack evolution and the stored energy functional of the bulk are characterized by a crack geometric function of polynomial type and an energetic degradation function of rational type, respectively. Standard arguments of thermodynamics then yield the macroscopic balance equation coupled with an extra evolution law of gradient type for the crack phase-field, governed by the aforesaid constitutive functions. The classical phase-field models for brittle fracture are recovered as particular examples. More importantly, the constitutive functions optimal for quasi-brittle failure are determined such that the proposed phase-field theory converges to a cohesive zone model for a vanishing length scale. Those general softening laws frequently adopted for quasi-brittle failure, e.g., linear, exponential, hyperbolic and Cornelissen et al. (1986) ones, etc., can be reproduced or fit with high precision. Except for the internal length scale, all the other model parameters can be determined from standard material properties (i.e., Young's modulus, failure strength, fracture energy and the target softening law). Some representative numerical examples are presented for the validation. It is found that both the internal length scale and the mesh size have little influence on the overall global responses, so long as the former can be well resolved by a sufficiently fine mesh. In particular, for the benchmark tests of concrete the numerical results of load versus displacement curve and crack paths both agree well with the experimental data, showing validity of the proposed phase-field theory for the modeling of damage and quasi-brittle failure in solids.
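In standard phase-field notation, the regularized crack topology and the two constitutive functions named above take forms like the following (a sketch from the general phase-field literature, with b the internal length scale; the abstract's exact definitions may differ):

```latex
% Regularized crack surface, with crack phase-field d and length scale b
\Gamma_b(d) = \int_{\Omega} \frac{1}{c_0}\left(\frac{\alpha(d)}{b}
              + b\,\lvert\nabla d\rvert^{2}\right)\mathrm{d}V,
\qquad c_0 = 4\int_{0}^{1}\sqrt{\alpha(\beta)}\,\mathrm{d}\beta
% Crack geometric function of polynomial type and
% energetic degradation function of rational type
\alpha(d) = \xi\,d + (1-\xi)\,d^{2}, \qquad
g(d) = \frac{(1-d)^{p}}{(1-d)^{p} + Q(d)}, \quad Q(d)\ \text{a polynomial in } d
```

Calibrating the coefficients of Q(d) against a target softening law is what lets such a theory reproduce the linear, exponential, hyperbolic and Cornelissen-type laws the abstract mentions.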

  11. Economic impact of heart failure according to the effects of kidney failure.

    PubMed

    Sicras Mainar, Antoni; Navarro Artieda, Ruth; Ibáñez Nolla, Jordi

    2015-01-01

    To evaluate the use of health care resources and their cost according to the effects of kidney failure in heart failure patients during 2-year follow-up in a population setting. Observational retrospective study based on a review of medical records. The study included patients ≥ 45 years treated for heart failure from 2008 to 2010. The patients were divided into 2 groups according to the presence/absence of kidney failure. Main outcome variables were comorbidity, clinical status (functional class, etiology), metabolic syndrome, costs, and new cases of cardiovascular events and kidney failure. The cost model included direct and indirect health care costs. Statistical analysis included multiple regression models. The study recruited 1600 patients (prevalence, 4.0%; mean age 72.4 years; women, 59.7%). Of these patients, 70.1% had hypertension, 47.1% had dyslipidemia, and 36.2% had diabetes mellitus. We analyzed 433 patients (27.1%) with kidney failure and 1167 (72.9%) without kidney failure. Patients with kidney failure were associated with functional class III-IV (54.1% vs 40.8%) and metabolic syndrome (65.3% vs 51.9%, P<.01). The average unit cost was €10,711.40. The corrected cost in the presence of kidney failure was €14,868.20 vs €9,364.50 (P=.001). During follow-up, 11.7% of patients developed ischemic heart disease, 18.8% developed kidney failure, and 36.1% developed heart failure exacerbation. Comorbidity associated with heart failure is high. The presence of kidney failure increases the use of health resources and leads to higher costs within the National Health System. Copyright © 2014 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  12. [Obesity and the prognosis of heart failure: the obesity paradox, myth or reality?].

    PubMed

    Bounhoure, Jean-Paul; Galinier, Michel; Roncalli, Jerôme; Massabuau, Pierre

    2014-01-01

    Obesity has now reached epidemic proportions worldwide. Obesity is associated with numerous comorbidities, including hypertension, lipid disorders and type II diabetes, and is also a major cause of cardiovascular disease, coronary disease, heart failure, atrial fibrillation, and sudden death. Obesity is the main cause of heart failure in 11% and 14% of cases in men and women, respectively. The Framingham study showed that, after correction for other risk factors, each point increase in the body mass index raises the risk of heart failure by 5% in men and 7% in women. Obesity increases the heart workload, causes left ventricular hypertrophy, and impairs both diastolic and systolic function. The most common form of heart failure is diastolic dysfunction, and heart failure in obese individuals is associated with preserved systolic function. Despite these comorbidities and the severity of heart failure, numerous studies have revealed an "obesity paradox" in which overweight and obese individuals with heart failure appear to have a better prognosis than non-overweight subjects. This review summarizes the adverse cardiac effects of this nutritional disease, the results of some studies supporting the obesity paradox, and the better survival rate of obese patients with heart failure. Potential explanations for these surprising data include the possibility that a number of obese patients may simply not have heart failure, as well as methodological bias, and protective effects of adipose tissue. Further studies of large populations are needed to determine how obesity may improve the prognosis of heart failure.

  13. Differential expression of pancreatic protein and chemosensing receptor mRNAs in NKCC1-null intestine.

    PubMed

    Bradford, Emily M; Vairamani, Kanimozhi; Shull, Gary E

    2016-02-15

    To investigate the intestinal functions of the NKCC1 Na(+)-K(+)-2Cl(-) cotransporter (SLC12a2 gene), differential mRNA expression changes in NKCC1-null intestine were analyzed. Microarray analysis of mRNA from intestines of adult wild-type mice and gene-targeted NKCC1-null mice (n = 6 of each genotype) was performed to identify patterns of differential gene expression changes. Differential expression patterns were further examined by Gene Ontology analysis using the online GOrilla program, and expression changes of selected genes were verified using northern blot analysis and quantitative real-time polymerase chain reaction. Histological staining and immunofluorescence were performed to identify the cell types in which upregulated pancreatic digestive enzymes were expressed. Genes typically associated with pancreatic function were upregulated. These included lipase, amylase, elastase, and serine proteases indicative of pancreatic exocrine function, as well as insulin and regenerating islet genes, representative of endocrine function. Northern blot analysis and immunohistochemistry showed that differential expression of exocrine pancreas mRNAs was specific to the duodenum and localized to a subset of goblet cells. In addition, a major pattern of changes involving differential expression of olfactory receptors that function in chemical sensing, as well as other chemosensing G-protein coupled receptors, was observed. These changes in chemosensory receptor expression may be related to the failure of intestinal function and dependency on parenteral nutrition observed in humans with SLC12a2 mutations. The results suggest that loss of NKCC1 affects not only secretion, but also goblet cell function and chemosensing of intestinal contents via G-protein coupled chemosensory receptors.
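
    The abstract mentions verification of expression changes by quantitative real-time PCR. A common way such measurements are expressed, sketched here as an illustration rather than as the authors' actual pipeline, is the 2^(-ΔΔCt) method, which converts cycle-threshold (Ct) values into fold changes relativeative to a reference gene; all Ct values below are hypothetical:

```python
def fold_change_ddct(ct_target_ko, ct_ref_ko, ct_target_wt, ct_ref_wt):
    """Relative expression (knockout vs wild type) by the 2^-ddCt method."""
    d_ct_ko = ct_target_ko - ct_ref_ko   # normalize to the reference gene
    d_ct_wt = ct_target_wt - ct_ref_wt
    dd_ct = d_ct_ko - d_ct_wt
    return 2 ** (-dd_ct)

# Hypothetical Ct values: the target amplifies 3 cycles earlier in the
# null intestine after normalization, i.e. ~8-fold upregulation.
fc = fold_change_ddct(22.0, 18.0, 25.0, 18.0)  # -> 8.0
```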

  14. Exploring the gender gap in the conceptual survey of electricity and magnetism

    NASA Astrophysics Data System (ADS)

    Henderson, Rachel; Stewart, Gay; Stewart, John; Michaluk, Lynnette; Traxler, Adrienne

    2017-12-01

    The "gender gap" on various physics conceptual evaluations has been extensively studied. Men's average pretest scores on the Force Concept Inventory and Force and Motion Conceptual Evaluation are 13% higher than women's, and their post-test scores are on average 12% higher. This study analyzed gender differences on the Conceptual Survey of Electricity and Magnetism (CSEM), for which the gender gap has been less well studied and is less consistent. In the current study, data collected from 1407 students (77% men, 23% women) in a calculus-based physics course over ten semesters showed that male students outperformed female students on the CSEM pretest (by 5%) and post-test (by 6%). Separate analyses of qualitative and quantitative problems on lab quizzes and course exams showed that male students outperformed female students by 3% on qualitative quiz and exam problems, while male and female students performed equally on quantitative course exam problems. The gender gaps within CSEM post-test scores, qualitative lab quiz scores, and qualitative exam scores were insignificant for students with a CSEM pretest score of 25% or less but grew as pretest scores increased. Structural equation modeling demonstrated that a latent variable, called Conceptual Physics Performance/Non-Quantitative (CPP/NonQnt), orthogonal to quantitative test performance was useful in explaining the observed differences in qualitative performance; this variable was most strongly related to CSEM post-test scores. The CPP/NonQnt of male students was 0.44 standard deviations higher than that of female students. The CSEM pretest measured CPP/NonQnt much less accurately for women (R² = 4%) than for men (R² = 17%). The failure to detect a gender gap for students scoring 25% or less on the pretest suggests that the CSEM instrument itself is not gender biased. The failure to find a difference in quantitative test performance while detecting a gap in qualitative performance suggests that the qualitative differences do not result from psychological factors such as science anxiety or stereotype threat.

  15. A Methodology for Quantifying Certain Design Requirements During the Design Phase

    NASA Technical Reports Server (NTRS)

    Adams, Timothy; Rhodes, Russel

    2005-01-01

    A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding the redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to chance; consequently, there is no control of recurring costs or of the responsiveness of the system. The means that have been used to assess maintainability have been oriented toward determining the logistical sparing of components so that components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost (see figure). Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. The process uses three mathematical models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case of the binomial distribution approximates the commonly known exponential distribution, or "constant failure rate" distribution; either model can be used. The binomial distribution was selected for its modeling flexibility, because it conveniently addresses both the zero-fail and failure cases. The failure case is typically used for unmanned spacecraft, as with missiles.
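
    The three models named above are simple enough to sketch directly. The component reliabilities and counts below are hypothetical placeholders, not values from the methodology:

```python
import math

def series_reliability(component_reliabilities):
    """Reliability of a series system: every component must survive."""
    return math.prod(component_reliabilities)

def binomial_zero_fail(p_success, n):
    """P(zero failures in n independent trials): the zero-fail case."""
    return p_success ** n

def poisson_at_most(k, mean_failures):
    """P(X <= k) for a Poisson failure count: the less-than-or-equal-to case."""
    return sum(math.exp(-mean_failures) * mean_failures**i / math.factorial(i)
               for i in range(k + 1))

# The zero-fail binomial reproduces the constant-failure-rate (exponential)
# model: with p = exp(-lam), p**n equals exp(-n * lam).
r_sys = series_reliability([0.99, 0.995, 0.98])  # hypothetical components
p_zero = binomial_zero_fail(r_sys, 10)           # 10 missions, no failures
```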

  16. USGS approach to real-time estimation of earthquake-triggered ground failure - Results of 2015 workshop

    USGS Publications Warehouse

    Allstadt, Kate E.; Thompson, Eric M.; Wald, David J.; Hamburger, Michael W.; Godt, Jonathan W.; Knudsen, Keith L.; Jibson, Randall W.; Jessee, M. Anna; Zhu, Jing; Hearne, Michael; Baise, Laurie G.; Tanyas, Hakan; Marano, Kristin D.

    2016-03-30

    The U.S. Geological Survey (USGS) Earthquake Hazards and Landslide Hazards Programs are developing plans to add quantitative hazard assessments of earthquake-triggered landsliding and liquefaction to existing real-time earthquake products (ShakeMap, ShakeCast, PAGER) using open and readily available methodologies and products. To date, prototype global statistical models have been developed and are being refined, improved, and tested. These models are a good foundation, but much work remains to achieve robust and defensible models that meet the needs of end users. In order to establish an implementation plan and identify research priorities, the USGS convened a workshop in Golden, Colorado, in October 2015. This document summarizes current (as of early 2016) capabilities, research and operational priorities, and plans for further studies that were established at this workshop. Specific priorities established during the meeting include (1) developing a suite of alternative models; (2) making use of higher resolution and higher quality data where possible; (3) incorporating newer global and regional datasets and inventories; (4) reducing barriers to accessing inventory datasets; (5) developing methods for using inconsistent or incomplete datasets in aggregate; (6) developing standardized model testing and evaluation methods; (7) improving ShakeMap shaking estimates, particularly as relevant to ground failure, such as including topographic amplification and accounting for spatial variability; and (8) developing vulnerability functions for loss estimates.

  17. Heart Failure in Patients with Chronic Kidney Disease: A Systematic Integrative Review

    PubMed Central

    Segall, Liviu; Nistor, Ionut; Covic, Adrian

    2014-01-01

    Introduction. Heart failure (HF) is highly prevalent in patients with chronic kidney disease (CKD) and end-stage renal disease (ESRD) and is strongly associated with mortality in these patients. However, the treatment of HF in this population is largely unclear. Study Design. We conducted a systematic integrative review of the literature to assess the current evidence of HF treatment in CKD patients, searching electronic databases in April 2014. Synthesis used narrative methods. Setting and Population. We focused on adults with a primary diagnosis of CKD and HF. Selection Criteria for Studies. We included studies of any design, quantitative or qualitative. Interventions. HF treatment was defined as any formal means taken to improve the symptoms of HF and/or the heart structure and function abnormalities. Outcomes. Measures of all kinds were considered of interest. Results. Of 1,439 results returned by database searches, 79 articles met inclusion criteria. A further 23 relevant articles were identified by hand searching. Conclusions. Control of fluid overload, the use of beta-blockers and angiotensin-converting enzyme inhibitors or angiotensin receptor blockers, and optimization of dialysis appear to be the most important methods to treat HF in CKD and ESRD patients. Aldosterone antagonists and digitalis glycosides may additionally be considered; however, their use is associated with significant risks. The role of anemia correction, control of CKD-mineral and bone disorder, and cardiac resynchronization therapy are also discussed. PMID:24959595

  18. Health Literacy and Heart Failure

    PubMed Central

    Cajita, Maan Isabella; Cajita, Tara Rafaela; Han, Hae-Ra

    2015-01-01

    Background: Low health literacy affects millions of Americans, putting those affected at a disadvantage and at risk for poorer health outcomes. Low health literacy can act as a barrier to effective disease self-management; this is especially true for chronic diseases such as heart failure (HF) that require complicated self-care regimens. Purpose: This systematic review examined quantitative research literature published between 1999 and 2014 to explore the role of health literacy among HF patients. The specific aims were to (1) describe the prevalence of low health literacy among HF patients, (2) explore the predictors of low health literacy among HF patients, and (3) discuss the relationship between health literacy and HF self-care and common HF outcomes. Methods: A systematic search of PubMed, CINAHL Plus, Embase, PsycINFO, and Scopus was conducted using relevant keywords and clear inclusion and exclusion criteria. Conclusions: An average of 39% of HF patients have low health literacy. Age, race/ethnicity, years of education, and cognitive function are predictors of health literacy. In addition, adequate health literacy is consistently correlated with higher HF knowledge and higher salt knowledge. Clinical Implications: Considering the prevalence of low health literacy in the HF population, nurses and healthcare professionals need to recognize its consequences and adopt strategies that could minimize its detrimental effect on patients' health outcomes. PMID:25569150

  19. Profound impairment of adaptive immune responses by alkylating chemotherapy

    PubMed Central

    Litterman, Adam J.; Zellmer, David M.; Grinnen, Karen L.; Hunt, Matthew A.; Dudek, Arkadiusz Z.; Salazar, Andres M.; Ohlfest, John R.

    2013-01-01

    Cancer vaccines have overall had a record of failure as an adjuvant therapy for malignancies that are treated with alkylating chemotherapy, and the contribution of standard treatment to that failure remains unclear. Vaccines aim to harness the proliferative potential of the immune system by expanding a small number of tumor-specific lymphocytes into a large number of anti-tumor effectors. Clinical trials are often conducted after treatment with alkylating chemotherapy, given either as standard therapy or for immunomodulatory effect. There is mounting evidence for synergy between chemotherapy and adoptive immunotherapy or vaccination against self-antigens; however, the impact of chemotherapy on lymphocytes primed against tumor neo-antigens remains poorly defined. We report here that clinically relevant dosages of standard alkylating chemotherapies such as temozolomide and cyclophosphamide significantly inhibit the proliferative abilities of lymphocytes in mice. This proliferative impairment was long lasting and led to quantitative and qualitative defects in B and T cell responses to neo-antigen vaccines. High affinity responder lymphocytes receiving the strongest proliferative signals from vaccines experienced the greatest DNA damage responses, skewing the response toward lower affinity responders with inferior functional characteristics. Together these defects lead to inferior efficacy and overall survival in murine tumor models treated by neo-antigen vaccines. These results suggest that clinical protocols for cancer vaccines should be designed to avoid exposing responder lymphocytes to alkylating chemotherapy. PMID:23686484

  20. Efficient 3-D finite element failure analysis of compression loaded angle-ply plates with holes

    NASA Technical Reports Server (NTRS)

    Burns, S. W.; Herakovich, C. T.; Williams, J. G.

    1987-01-01

    Finite element stress analysis combined with the tensor polynomial failure criterion predicts that failure always initiates at the interface between layers at the hole edge for notched angle-ply laminates loaded in compression. The angular location of initial failure is a function of the fiber orientation in the laminate. The dominant stress components initiating failure are the shear stresses. It is shown that approximate symmetry can be used to reduce the computer resources required for the case of uniaxial loading.
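
    The tensor polynomial criterion referenced above is commonly evaluated in its plane-stress (Tsai-Wu) form. A minimal sketch follows; the ply strengths and the applied stress state are hypothetical, not values from the paper:

```python
import math

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Tensor polynomial (Tsai-Wu) failure index for a plane-stress ply.
    Xc and Yc are compressive strength magnitudes (positive numbers);
    failure is predicted when the index reaches 1."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * math.sqrt(F11 * F22)   # a common interaction-term estimate
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

# Hypothetical graphite/epoxy strengths (MPa) and a shear-dominated state:
idx = tsai_wu_index(s1=-300, s2=-20, t12=60,
                    Xt=1500, Xc=1200, Yt=40, Yc=200, S=70)
failed = idx >= 1.0
```

    A useful sanity check on the coefficients: a pure axial stress equal to the tensile strength Xt gives an index of exactly 1.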

  1. Failure mechanisms of uni-ply composite plates with a circular hole under static compressive loading

    NASA Technical Reports Server (NTRS)

    Khamseh, A. R.; Waas, A. M.

    1992-01-01

    The objective of the study was to identify and study the failure mechanisms associated with compressively loaded uni-ply graphite/epoxy square plates containing a central circular hole. It is found that the type of compressive failure depends on the hole size. For large holes, with a diameter-to-width ratio exceeding 0.062, fiber buckling/kinking initiated at the hole is found to be the dominant failure mechanism. In plates with smaller holes, failure initiates away from the hole edge or complete global failure occurs. Critical buckle wavelengths at failure are presented as a function of the normalized hole diameter.

  2. A Micromechanics-Based Elastoplastic Damage Model for Rocks with a Brittle-Ductile Transition in Mechanical Response

    NASA Astrophysics Data System (ADS)

    Hu, Kun; Zhu, Qi-zhi; Chen, Liang; Shao, Jian-fu; Liu, Jian

    2018-06-01

    As confining pressure increases, crystalline rocks of moderate porosity usually undergo a transition in failure mode from localized brittle fracture to diffuse damage and ductile failure. This transition has been widely reported experimentally for several decades; however, satisfactory modeling is still lacking. The present paper aims at modeling the brittle-ductile transition of rocks under conventional triaxial compression. Based on quantitative analyses of experimental results, it is found that the axial inelastic strain at failure varies almost linearly with the prescribed confining pressure. A micromechanics-based frictional damage model is then formulated using an associated plastic flow rule and a strain-energy-release-rate-based damage criterion. The analytical solution to the strongly coupled plasticity-damage problem is provided and applied to simulate the nonlinear mechanical behaviors of Tennessee marble, Indiana limestone and Jinping marble, each presenting a brittle-ductile transition in its stress-strain curves.
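
    The reported linearity between axial inelastic strain at failure and confining pressure is the kind of relationship checked with an ordinary least-squares fit. A minimal sketch; the triaxial test values below are hypothetical, not the paper's data:

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x, e.g. axial inelastic
    strain at failure (y) vs. confining pressure (x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Hypothetical triaxial tests: confining pressure (MPa) vs. inelastic
# axial strain at failure (%):
p = [5, 10, 20, 30, 40]
eps = [0.31, 0.52, 0.98, 1.47, 2.01]
a, b = linear_fit(p, eps)  # intercept and slope of the fitted line
```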

  3. Back-Analyses of Landfill Instability Induced by High Water Level: Case Study of Shenzhen Landfill

    PubMed Central

    Peng, Ren; Hou, Yujing; Zhan, Liangtong; Yao, Yangping

    2016-01-01

    In June 2008, the Shenzhen landfill slope failed. This case is used as an example to study the deformation characteristics and failure mode of a slope induced by high water levels. An integrated monitoring system, including water level gauges, electronic total stations, and inclinometers, was used to monitor the slope failure process. The field measurements suggest that the landfill landslide was caused by a deep slip along the weak interface of the composite liner system at the base of the landfill. The high water level is considered to be the main factor that caused this failure. To calculate the relative interface shear displacements in the geosynthetic multilayer liner system, a series of numerical direct shear tests was carried out. Based on the numerical results, the composite lining system was simplified, and the centrifuge modeling technique was used to quantitatively evaluate the effect of water levels on landfill instability. PMID:26771627

  4. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
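
    Statistical fatigue curves of the kind described can be sketched by pairing a classical strain-cycle law with a scatter distribution. The Coffin-Manson-type form, the lognormal scatter, and all constants below are illustrative assumptions, not the paper's fitted model:

```python
import math

def mean_cycles_to_failure(strain_range, C=0.5, m=2.0):
    """Classical strain-cycle curve, Coffin-Manson type:
    N = C * strain_range**(-m). C and m are hypothetical fit constants."""
    return C * strain_range ** (-m)

def failure_probability(n_cycles, strain_range, sigma_log=0.3):
    """Cumulative failure probability by n_cycles, modeling scatter about
    the mean curve as lognormal in life (sigma_log is assumed)."""
    n50 = mean_cycles_to_failure(strain_range)
    z = (math.log(n_cycles) - math.log(n50)) / sigma_log
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Expected cumulative failures in a hypothetical field of 10,000 solder
# joints cycled at 1% strain range (mean life here is 5000 cycles):
n_joints = 10_000
p = failure_probability(n_cycles=2_000, strain_range=0.01)
expected_failures = n_joints * p
```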

  6. Metabolic bone disease in chronic renal failure. II. Renal transplant patients.

    PubMed Central

    Huffer, W. E.; Kuzela, D.; Popovtzer, M. M.; Starzl, T. E.

    1975-01-01

    Trabecular vertebral bone of renal transplant patients was quantitatively compared with bone from normal individuals and from dialyzed and nondialyzed patients with chronic renal failure reported in detail in an earlier study. Long- and short-term transplant patients have increased bone resorption and mineralization defects similar to the renal osteodystrophy seen in dialyzed and nondialyzed patients. However, in transplant patients the magnitude of resorption is greater, and bone volume tends to decrease rather than increase. Resorptive activity in transplant patients is maximal during the first year after transplantation. Bone volume decreases continuously for at least 96 months after transplantation. Only decreased bone volume correlated with success or failure of the renal transplant. The morphologic findings in this study correlate with other clinical and morphologic data, suggesting that the reduction in bone volume in transplant patients results from a combination of persistent hyperparathyroidism and suppression of bone formation by steroid therapy. PMID:1091152

  7. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Software quality is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. The defense of management decisions can be greatly strengthened by combining engineering judgment with statistical analysis. Unlike hardware, software exhibits no wearout and makes redundancy costly, so traditional statistical analysis is not suitable for evaluating software reliability. A statistical model was developed to represent the number and types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria for terminating testing based on reliability objectives and methods for estimating the expected number of fixes required are also presented.
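
    The abstract does not name its statistical model. One standard choice for failures observed during testing, shown here purely as a sketch, is the Goel-Okumoto nonhomogeneous Poisson process, which also yields a testing-termination criterion of the kind described; the parameters and release threshold below are hypothetical:

```python
import math

def expected_failures(t, a, b):
    """Goel-Okumoto mean value function: expected failures found by time t,
    where a = total expected faults and b = per-fault detection rate."""
    return a * (1 - math.exp(-b * t))

def remaining_faults(t, a, b):
    """Expected faults still latent after testing until time t."""
    return a * math.exp(-b * t)

def reliability(x, t, a, b):
    """P(no failure in the next x time units, given testing until t)."""
    return math.exp(-(expected_failures(t + x, a, b) - expected_failures(t, a, b)))

# Hypothetical fit: a = 100 total faults, b = 0.05 per week.  Terminate
# testing once expected remaining faults drop below a threshold of 5:
t = 0.0
while remaining_faults(t, 100, 0.05) > 5:
    t += 1.0  # t ends at the first whole week meeting the objective
```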

  8. The Mistreatment of Women during Childbirth in Health Facilities Globally: A Mixed-Methods Systematic Review.

    PubMed

    Bohren, Meghan A; Vogel, Joshua P; Hunter, Erin C; Lutsiv, Olha; Makh, Suprita K; Souza, João Paulo; Aguiar, Carolina; Saraiva Coneglian, Fernando; Diniz, Alex Luíz Araújo; Tunçalp, Özge; Javadi, Dena; Oladapo, Olufemi T; Khosla, Rajat; Hindin, Michelle J; Gülmezoglu, A Metin

    2015-06-01

    Despite growing recognition of neglectful, abusive, and disrespectful treatment of women during childbirth in health facilities, there is no consensus at a global level on how these occurrences are defined and measured. This mixed-methods systematic review aims to synthesize qualitative and quantitative evidence on the mistreatment of women during childbirth in health facilities to inform the development of an evidence-based typology of the phenomenon. We searched PubMed, CINAHL, and Embase databases and grey literature using a predetermined search strategy to identify qualitative, quantitative, and mixed-methods studies on the mistreatment of women during childbirth across all geographical and income-level settings. We used a thematic synthesis approach to synthesize the qualitative evidence and assessed the confidence in the qualitative review findings using the CERQual approach. In total, 65 studies were included from 34 countries. Qualitative findings were organized under seven domains: (1) physical abuse, (2) sexual abuse, (3) verbal abuse, (4) stigma and discrimination, (5) failure to meet professional standards of care, (6) poor rapport between women and providers, and (7) health system conditions and constraints. Due to high heterogeneity of the quantitative data, we were unable to conduct a meta-analysis; instead, we present descriptions of study characteristics, outcome measures, and results. Additional themes identified in the quantitative studies are integrated into the typology. This systematic review presents a comprehensive, evidence-based typology of the mistreatment of women during childbirth in health facilities, and demonstrates that mistreatment can occur at the level of interaction between the woman and provider, as well as through systemic failures at the health facility and health system levels. 
We propose this typology be adopted to describe the phenomenon and be used to develop measurement tools and inform future research, programs, and interventions.

  9. Involvement of systemic venous congestion in heart failure.

    PubMed

    Rubio Gracia, J; Sánchez Marteles, M; Pérez Calvo, J I

    2017-04-01

    Systemic venous congestion has gained significant importance in the interpretation of the pathophysiology of acute heart failure, especially in the development of renal function impairment during exacerbations. In this study, we review the concept, clinical characterisation and identification of venous congestion. We update current knowledge on its importance in the pathophysiology of acute heart failure and its involvement in the prognosis. We pay special attention to the relationship between abdominal congestion, the pulmonary interstitium as filtering membrane, inflammatory phenomena and renal function impairment in acute heart failure. Lastly, we review decongestion as a new therapeutic objective and the measures available for its assessment. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  10. Quantitative investigation of red blood cell three-dimensional geometric and chemical changes in the storage lesion using digital holographic microscopy.

    PubMed

    Jaferzadeh, Keyvan; Moon, Inkyu

    2015-11-01

    Quantitative phase information obtained by digital holographic microscopy (DHM) can provide new insight into the functions and morphology of single red blood cells (RBCs). Since the functionality of an RBC is related to its three-dimensional (3-D) shape, quantifying the 3-D geometric changes induced by storage time can help hematologists determine its optimal period of functionality. We quantitatively investigate RBC 3-D geometric changes in the storage lesion using DHM. Our experimental results show that a substantial geometric transformation of biconcave-shaped RBCs into spherocytes occurs due to the RBC storage lesion. This transformation leads to progressive loss of cell surface area, surface-to-volume ratio, and functionality of RBCs. Furthermore, our quantitative analysis shows significant correlations between the chemical and morphological properties of RBCs.
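
    The geometric quantities discussed above (surface area, volume, and their ratio) are often summarized by a sphericity index, which approaches 1 as the biconcave disc becomes a spherocyte. A minimal sketch, using typical literature values for a fresh RBC rather than the paper's measurements:

```python
import math

def sphericity(volume, surface_area):
    """Sphericity: surface area of a sphere with the same volume divided
    by the cell's actual surface area (equals 1.0 for a perfect sphere)."""
    sphere_area = math.pi ** (1/3) * (6 * volume) ** (2/3)
    return sphere_area / surface_area

def surface_to_volume(volume, surface_area):
    """Surface-to-volume ratio, which falls as the cell rounds up."""
    return surface_area / volume

# Typical literature values for a fresh biconcave RBC (~90 fL volume,
# ~135 um^2 surface area), used here only as an illustration:
psi_fresh = sphericity(90.0, 135.0)       # well below 1 (biconcave)
sv_fresh = surface_to_volume(90.0, 135.0)
```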

  11. A review of state-of-the-art stereology for better quantitative 3D morphology in cardiac research.

    PubMed

    Mühlfeld, Christian; Nyengaard, Jens Randel; Mayhew, Terry M

    2010-01-01

    The aim of stereological methods in biomedical research is to obtain quantitative information about three-dimensional (3D) features of tissues, cells, or organelles from two-dimensional physical or optical sections. With immunogold labeling, stereology can even be used for the quantitative analysis of the distribution of molecules within tissues and cells. Nowadays, a large number of design-based stereological methods offer an efficient quantitative approach to intriguing questions in cardiac research, such as "Is there a significant loss of cardiomyocytes during progression from ventricular hypertrophy to heart failure?" or "Does a specific treatment reduce the degree of fibrosis in the heart?" Nevertheless, the use of stereological methods in cardiac research is rare. The present review article demonstrates how some of the potential pitfalls in quantitative microscopy may be avoided. To this end, we outline the concepts of design-based stereology and illustrate their practical applications to a wide range of biological questions in cardiac research. We hope that the present article will stimulate researchers in cardiac research to incorporate design-based stereology into their study designs, thus promoting an unbiased quantitative 3D microscopy.
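
    As one concrete example of the design-based estimators the review covers, the Cavalieri method estimates a structure's volume from the areas of systematic, uniformly spaced sections, with each area obtained by point counting. The section spacing, grid constant, and counts below are hypothetical:

```python
def cavalieri_volume(section_areas, section_spacing):
    """Cavalieri estimator: total volume = spacing T times the summed
    areas of systematic, uniformly spaced sections."""
    return section_spacing * sum(section_areas)

def area_by_point_counting(points_hit, area_per_point):
    """Section area estimated from grid points hitting the profile."""
    return points_hit * area_per_point

# Hypothetical heart sections every 0.5 cm, with a counting grid whose
# associated area per point is 0.02 cm^2:
point_counts = [55, 80, 92, 78, 40]
areas = [area_by_point_counting(p, 0.02) for p in point_counts]
volume = cavalieri_volume(areas, 0.5)  # estimated volume in cm^3
```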

  12. Selection of reference genes for gene expression studies in heart failure for left and right ventricles.

    PubMed

    Li, Mengmeng; Rao, Man; Chen, Kai; Zhou, Jianye; Song, Jiangping

    2017-07-15

    Real-time quantitative reverse transcriptase PCR (qRT-PCR) is a feasible tool for determining gene expression profiles, but the accuracy and reliability of the results depend on the stable expression of the selected housekeeping genes across samples. To date, research on stable housekeeping genes in human heart failure samples is scarce. Moreover, the effect of heart failure on the expression of housekeeping genes in the right and left ventricles has yet to be studied. We therefore aim to identify stable housekeeping genes for both ventricles in heart failure and normal heart samples. In this study, we selected seven commonly used housekeeping genes as candidates. Using qRT-PCR, the expression levels of ACTB, RAB7A, GAPDH, REEP5, RPL5, PSMB4 and VCP in eight heart failure and four normal heart samples were assessed. The stability of the candidate housekeeping genes was evaluated with the geNorm and NormFinder software. GAPDH showed the least variation across all heart samples. The results also indicated that gene expression differs between the left and right ventricles in heart failure. GAPDH had the highest expression stability in both heart failure and normal heart samples. We also propose using different sets of housekeeping genes for the left and right ventricles: the combination of RPL5, GAPDH and PSMB4 is suitable for the right ventricle, and the combination of GAPDH, REEP5 and RAB7A is suitable for the left ventricle. Copyright © 2017 Elsevier B.V. All rights reserved.
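
    A geNorm-style stability measure M, the average standard deviation of pairwise log2 expression ratios across samples, can be sketched as below; the candidate with the lowest M is taken as most stable. The expression values are invented for illustration and are not the study's data:

```python
import math
from statistics import stdev

def gene_stability_m(expr, gene):
    """geNorm-style stability M for one gene: the average, over all other
    candidate genes, of the standard deviation across samples of the
    log2 expression ratio.  expr maps gene name -> per-sample values."""
    pairwise = []
    for other in expr:
        if other == gene:
            continue
        ratios = [math.log2(a / b) for a, b in zip(expr[gene], expr[other])]
        pairwise.append(stdev(ratios))
    return sum(pairwise) / len(pairwise)

# Hypothetical relative expression for three candidates in four samples:
expr = {
    "GAPDH": [1.0, 1.1, 0.9, 1.0],
    "ACTB":  [1.0, 1.4, 0.7, 1.2],
    "RPL5":  [1.0, 1.0, 1.0, 1.1],
}
m_values = {g: gene_stability_m(expr, g) for g in expr}
most_stable = min(m_values, key=m_values.get)  # lowest M wins
```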

  13. Discharge clinical characteristics and 60-day readmission in patients hospitalized with heart failure.

    PubMed

    Anderson, Kelley M

    2014-01-01

    Heart failure is a clinical syndrome that incurs a high prevalence, mortality, morbidity, and economic burden in our society. Patients with heart failure may experience hospitalization because of an acute exacerbation of their condition, and recurrent hospitalizations soon after discharge are an unfortunate occurrence in this patient population. The purpose of this study was to explore the clinical and diagnostic characteristics of individuals hospitalized with a primary diagnosis of heart failure at the time of discharge and to compare the association of these indicators in individuals who did and did not experience a heart failure hospitalization within 60 days of the index stay. This was a descriptive, correlational, quantitative study using a retrospective review of the records of 134 individuals discharged with a primary diagnosis of heart failure. Records were reviewed for sociodemographic characteristics, health histories, clinical assessment findings, and diagnostic information. Significant predictors of 60-day heart failure readmission were dyspnea (β = 0.579), crackles (β = 1.688), and assistance with activities of daily living (β = 2.328), independent of age, gender, and multiple other factors. Using hierarchical logistic regression, a model was derived that correctly classified 77.4% of the cohort: 78.2% of those who had a readmission (sensitivity of the prediction) and 76.7% of the subjects in whom the predicted event, readmission, did not occur (specificity of the prediction). Hospitalizations for heart failure are markers of clinical instability. Future events after hospitalization are common in this patient population, and this study provides a novel understanding of clinical characteristics at the time of discharge that are associated with future outcomes, specifically 60-day heart failure readmissions. 
A consideration of these characteristics provides an additional perspective to guide clinical decision making and the evaluation of discharge readiness.
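The reported sensitivity and specificity follow directly from the model's classification table. A minimal sketch of how such figures are computed, using made-up labels rather than the study's data (1 = readmitted within 60 days):

```python
# Hypothetical illustration of sensitivity/specificity from a binary
# classification table; the labels below are invented, not the study's.

def confusion_counts(actual, predicted):
    """Count true/false positives and negatives for binary labels."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return tp, tn, fp, fn

def classification_summary(actual, predicted):
    tp, tn, fp, fn = confusion_counts(actual, predicted)
    sensitivity = tp / (tp + fn)      # readmissions correctly predicted
    specificity = tn / (tn + fp)      # non-readmissions correctly predicted
    accuracy = (tp + tn) / len(actual)
    return sensitivity, specificity, accuracy

actual    = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
predicted = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
sens, spec, acc = classification_summary(actual, predicted)
print(round(sens, 3), round(spec, 3), round(acc, 3))  # 0.75 0.833 0.8
```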

  14. Study of the Rock Mass Failure Process and Mechanisms During the Transformation from Open-Pit to Underground Mining Based on Microseismic Monitoring

    NASA Astrophysics Data System (ADS)

    Zhao, Yong; Yang, Tianhong; Bohnhoff, Marco; Zhang, Penghai; Yu, Qinglei; Zhou, Jingren; Liu, Feiyue

    2018-05-01

To quantitatively understand the failure process and failure mechanism of a rock mass during the transformation from open-pit mining to underground mining, the Shirengou Iron Mine was selected as an engineering case study. The study area was determined using the rock mass basic quality classification method and the kinematic analysis method. Based on the analysis of the variations in apparent stress and apparent volume over time, the rock mass failure process was analyzed. Drawing on recent research on the temporal and spatial evolution of microseismic events in location, energy, apparent stress, and displacement, the migration characteristics of rock mass damage were studied. A hybrid moment tensor inversion method was used to determine the rock mass fracture source mechanisms, the fracture orientations, and the fracture scales. The fracture area can be divided into three zones: Zone A, Zone B, and Zone C. A statistical analysis of the fracture plane orientations was carried out, and four dominant fracture planes were obtained. Finally, the slip tendency analysis method was employed, and the unstable fracture planes were identified. The results show that: (1) microseismic monitoring and hybrid moment tensor analysis can effectively characterize the failure process and failure mechanism of a rock mass; (2) during the transformation from open-pit to underground mining, the failure type of the rock mass is mainly shear failure, with tensile failure mostly concentrated in the roofs of goafs; and (3) the rock mass at the pit bottom and in the upper part of goaf No. 18 may sustain further damage.
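The slip tendency analysis mentioned above rates each fracture plane by the ratio of resolved shear stress to normal stress. A minimal sketch under an assumed homogeneous stress tensor (the stress values and plane orientation are illustrative, not the mine's stress field):

```python
# Minimal slip tendency sketch: Ts = tau / sigma_n on a fracture plane,
# for an assumed 3x3 stress tensor (compression positive). Illustrative
# values only -- not the Shirengou stress field.
import math

def slip_tendency(stress, normal):
    """Shear traction over normal traction on a plane with unit normal."""
    # traction vector t = sigma . n
    t = [sum(stress[i][j] * normal[j] for j in range(3)) for i in range(3)]
    sigma_n = sum(t[i] * normal[i] for i in range(3))          # normal stress
    tau = math.sqrt(max(sum(ti * ti for ti in t) - sigma_n ** 2, 0.0))
    return tau / sigma_n

# Principal stresses 30/20/10 MPa along x/y/z; plane normal 45 deg from x in xz
stress = [[30.0, 0.0, 0.0], [0.0, 20.0, 0.0], [0.0, 0.0, 10.0]]
n = [math.sqrt(0.5), 0.0, math.sqrt(0.5)]
print(round(slip_tendency(stress, n), 3))  # 0.5
```

Planes whose slip tendency approaches the rock's friction coefficient are the candidates flagged as unstable.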

  15. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA)

    PubMed Central

    Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad

    2016-01-01

Introduction: In any complex human system, human error is inevitable and cannot be eliminated simply by blaming wrongdoers. With the aim of improving the reliability of hospital Intensive Care Units (ICUs), this research identifies and analyzes ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive research, data was gathered qualitatively through observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis, however, was quantitative, based on the failures' Risk Priority Numbers (RPNs) derived from the Failure Modes and Effects Analysis (FMEA) method. In addition, some causes of failures were analyzed qualitatively using the Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failures from 99 ICU activities in hospital B were identified and evaluated. Then, at 90% reliability (RPN ≥ 100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed with ECM. Conclusions: Applying modified PFMEA to improve process reliability in two selected ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize, and analyze all potential failure modes, and also makes them eager to identify causes, recommend corrective actions, and even participate in improving processes without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
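The RPN screening at the heart of the method multiplies severity, occurrence, and detection scores and flags modes at or above the study's cutoff of 100. A hypothetical sketch (the failure modes and scores below are invented for illustration):

```python
# Hypothetical FMEA screening sketch: RPN = severity x occurrence x
# detection, each scored 1-10; modes with RPN >= 100 are non-acceptable.

failure_modes = [
    # (description, severity, occurrence, detection) -- all illustrative
    ("wrong drug dose programmed", 9, 3, 5),
    ("ventilator alarm ignored",   8, 2, 4),
    ("chart mislabeled",           4, 3, 3),
]

def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

non_acceptable = [(name, rpn(s, o, d))
                  for name, s, o, d in failure_modes
                  if rpn(s, o, d) >= 100]
print(non_acceptable)  # [('wrong drug dose programmed', 135)]
```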

  16. Impact of Functional Characteristics on Usage of LSS Methods in IT and Perceived Project Success

    ERIC Educational Resources Information Center

    Mushi, Francis Jeremiah

    2014-01-01

High rates of Information Technology (IT) project failure continue: projects miss established deadlines, exceed budgets, or do not deliver the agreed-upon functionality. Failure often results from a fundamental confusion over what is involved in the project. Methods that have provided project success in the Service and Manufacturing industries have not been…

  17. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly, which often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to inform the determination of whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood in the absence of system-level test data or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach provides a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.

  18. Risk measures for power failures in transmission systems

    NASA Astrophysics Data System (ADS)

    Cassidy, Alex; Feinstein, Zachary; Nehorai, Arye

    2016-11-01

    We present a novel framework for evaluating the risk of failures in power transmission systems. We use the concept of systemic risk measures from the financial mathematics literature with models of power system failures in order to quantify the risk of the entire power system for design and comparative purposes. The proposed risk measures provide the collection of capacity vectors for the components in the system that lead to acceptable outcomes. Keys to the formulation of our measures of risk are two elements: a model of system behavior that provides the (distribution of) outcomes based on component capacities and an acceptability criterion that determines whether a (random) outcome is acceptable from an aggregated point of view. We examine the effects of altering the line capacities on energy not served under a variety of networks, flow manipulation methods, load shedding schemes, and load profiles using Monte Carlo simulations. Our results provide a quantitative comparison of the performance of these schemes, measured by the required line capacity. These results provide more complete descriptions of the risks of power failures than the previous, one-dimensional metrics.
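The structure of such a risk measure — an acceptability criterion applied to the simulated outcome distribution, searched over component capacities — can be sketched as follows. This is a toy one-line system with an invented load model, not the paper's network flow simulation:

```python
# Toy risk-measure sketch: find the smallest line capacity whose Monte
# Carlo outcome distribution satisfies an acceptability criterion. The
# single-line system and uniform load model are invented for illustration.
import random

def energy_not_served(capacity, loads):
    """Load in excess of the line capacity is shed (not served)."""
    return sum(max(load - capacity, 0.0) for load in loads)

def acceptable(capacity, load_samples, threshold):
    """Acceptability criterion: mean energy not served below a threshold."""
    ens = [energy_not_served(capacity, loads) for loads in load_samples]
    return sum(ens) / len(ens) <= threshold

def minimal_capacity(load_samples, threshold, step=1.0):
    """Smallest capacity on a grid whose outcomes are acceptable."""
    c = 0.0
    while not acceptable(c, load_samples, threshold):
        c += step
    return c

random.seed(0)
# 1000 Monte Carlo draws of 5 uniform component loads each
samples = [[random.uniform(0.0, 10.0) for _ in range(5)] for _ in range(1000)]
print(minimal_capacity(samples, threshold=1.0))
```

The risk measure proper is the whole set of acceptable capacity vectors; the search above just reports its boundary on a one-dimensional grid.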

  19. Damage evaluation of fiber reinforced plastic-confined circular concrete-filled steel tubular columns under cyclic loading using the acoustic emission technique

    NASA Astrophysics Data System (ADS)

    Li, Dongsheng; Du, Fangzhu; Ou, Jinping

    2017-03-01

Glass-fiber reinforced plastic (GFRP)-confined circular concrete-filled steel tubular (CCFT) columns comprise concrete, steel, and GFRP and show complex failure mechanics under cyclic loading. This paper investigated the failure mechanism and damage evolution of GFRP-CCFT columns by performing uniaxial cyclic loading tests monitored using the acoustic emission (AE) technique. Characteristic AE parameters were obtained during the damage evolution of GFRP-CCFT columns. Based on the relationship between the loading curve and these parameters, the damage evolution of GFRP-CCFT columns was classified into three stages representing different degrees of damage. Damage evolution and failure mode were investigated by analyzing the b-value, the ratio of rise time to waveform amplitude, and the average frequency. The damage severity of GFRP-CCFT columns was quantitatively estimated according to the modified damage index and the NDIS-2421 damage assessment criteria corresponding to each loading step. The proposed method can explain the damage evolution and failure mechanism of GFRP-CCFT columns and provide critical warning information for composite structures.
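The b-value referred to above is conventionally estimated from AE amplitude statistics; one common form is the Aki maximum-likelihood estimator with AE magnitude taken as amplitude (dB)/20. A minimal sketch with illustrative amplitudes (this is a standard estimator, not necessarily the exact variant used in the paper):

```python
# Aki maximum-likelihood b-value sketch for AE data, with AE magnitude
# defined as amplitude_dB / 20. Amplitudes below are illustrative only.
import math

def b_value(amplitudes_db, completeness_db):
    """b = log10(e) / (mean magnitude - completeness magnitude),
    discarding events below the completeness threshold."""
    mags = [a / 20.0 for a in amplitudes_db if a >= completeness_db]
    mc = completeness_db / 20.0
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - mc)

# Illustrative amplitudes (dB); the 30 dB event falls below the 40 dB cutoff
amps = [30.0, 45.0, 50.0, 55.0]
print(round(b_value(amps, 40.0), 3))  # 0.869
```

A falling b-value over successive loading steps signals a shift from many small AE events toward fewer large ones, i.e. approaching macro-failure.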

  20. A novel approach for analyzing fuzzy system reliability using different types of intuitionistic fuzzy failure rates of components.

    PubMed

    Kumar, Mohit; Yadav, Shiv Prasad

    2012-03-01

This paper addresses fuzzy system reliability analysis using different types of intuitionistic fuzzy numbers. Until now, in the literature, to analyze fuzzy system reliability it has been assumed that the failure rates of all components of a system follow the same type of fuzzy set or intuitionistic fuzzy set. However, in practical problems, such situations rarely occur. Therefore, in the present paper, a new algorithm has been introduced to construct the membership function and non-membership function of the fuzzy reliability of a system whose components follow different types of intuitionistic fuzzy failure rates. Functions of intuitionistic fuzzy numbers are calculated to construct the membership and non-membership functions of fuzzy reliability via non-linear programming techniques. Using the proposed algorithm, membership and non-membership functions of the fuzzy reliability of a series system and a parallel system are constructed. Our study generalizes various works in the literature. Numerical examples are given to illustrate the proposed algorithm. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
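The crisp backbone of such an analysis is ordinary series/parallel reliability with exponential component failure rates; the paper's contribution is to propagate intuitionistic fuzzy failure rates through these same formulas via nonlinear programming. A minimal crisp sketch with illustrative rates:

```python
# Crisp series/parallel reliability sketch with exponential components.
# Failure rates are illustrative; the paper replaces these crisp rates
# with intuitionistic fuzzy numbers propagated through the same formulas.
import math

def component_reliability(lam, t):
    """Exponential component reliability R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

def series_reliability(lams, t):
    """All components must survive: product of reliabilities."""
    r = 1.0
    for lam in lams:
        r *= component_reliability(lam, t)
    return r

def parallel_reliability(lams, t):
    """System fails only if every component fails."""
    q = 1.0
    for lam in lams:
        q *= 1.0 - component_reliability(lam, t)
    return 1.0 - q

lams = [0.001, 0.002, 0.003]  # illustrative failure rates (per hour)
t = 100.0
print(round(series_reliability(lams, t), 4))    # 0.5488 (= exp(-0.6))
print(round(parallel_reliability(lams, t), 4))  # 0.9955
```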

  1. [Analysis of quality of life using the generic SF-36 questionnaire in patients with heart failure].

    PubMed

    López Castro, J; Cid Conde, L; Fernández Rodríguez, V; Failde Garrido, J M; Almazán Ortega, R

    2013-01-01

Heart failure is one of the major chronic diseases that affect health-related quality of life. The objective of this study was to evaluate quality of life in patients with New York Heart Association functional class I-III using the SF-36, in a cohort of survivors of the EPICOUR Study Group, and to compare their quality of life with that of the general Spanish population of the same sex and age group. An observational, prospective cohort study was conducted on survivors of the EPICOUR Study Group, on whom a clinical-progression-outcome review was performed along with the SF-36. Quality of life was studied in 50 patients (60% male). The average age of the men was 64.8 years and of the women 68.3 years. When analyzing the SF-36, the results were lower in the physical dimensions than in the mental dimensions. Quality of life worsened with increasing functional class (statistically significant differences on the physical functioning and social functioning scales, with borderline significance on the mental health scale). When comparing patients with the general population of the same age and sex, patients with heart failure showed lower scores on all scales (significant differences in physical functioning, bodily pain, vitality, and social role for men, and physical functioning and emotional role for women). Heart failure has a negative impact on quality of life, in physical functioning as well as psychosocial function, with the impairment worsening with increasing functional class. Copyright © 2013 SECA. Published by Elsevier Espana. All rights reserved.

  2. Reusable rocket engine intelligent control system framework design, phase 2

    NASA Technical Reports Server (NTRS)

    Nemeth, ED; Anderson, Ron; Ols, Joe; Olsasky, Mark

    1991-01-01

Elements of an advanced functional framework for reusable rocket engine propulsion system control are presented for the Space Shuttle Main Engine (SSME) demonstration case. Functional elements of the baseline functional framework are defined in detail. The SSME failure modes are evaluated, and specific failure modes are identified for inclusion in the advanced functional framework diagnostic system. Active control of the SSME start transient is investigated, leading to the identification of a promising approach to mitigating start transient excursions. Key elements of the functional framework are simulated and demonstration cases are provided. Finally, the advanced functional framework for control of reusable rocket engines is presented.

  3. Health literacy and global cognitive function predict e-mail but not internet use in heart failure patients.

    PubMed

    Schprechman, Jared P; Gathright, Emily C; Goldstein, Carly M; Guerini, Kate A; Dolansky, Mary A; Redle, Joseph; Hughes, Joel W

    2013-01-01

Background. The internet offers potential for improving patient knowledge, and e-mail may be used in patient communication with providers. However, barriers to internet and e-mail use, such as low health literacy and cognitive impairment, may prevent patients from using technological resources. Purpose. We investigated whether health literacy, heart failure knowledge, and cognitive function were related to internet and e-mail use in older adults with heart failure (HF). Methods. Older adults (N = 119) with heart failure (age 69.84 ± 9.09 years) completed measures of health literacy, heart failure knowledge, cognitive functioning, and internet use in a cross-sectional study. Results. Internet and e-mail use were reported by 78.2% and 71.4%, respectively, of this sample of patients with HF. Controlling for age and education, logistic regression analyses indicated that higher health literacy predicted e-mail (P < .05) but not internet use. Global cognitive function predicted e-mail (P < .05) but not internet use. Only 45% used the internet to obtain information on HF, and internet use was not associated with greater HF knowledge. Conclusions. The majority of HF patients use the internet and e-mail, but poor health literacy and cognitive impairment may prevent some patients from accessing these resources. Future studies that examine specific internet and e-mail interventions to increase HF knowledge are needed.

  4. Renal denervation in male rats with heart failure improves ventricular sympathetic nerve innervation and function

    PubMed Central

    Pinkham, Maximilian I.; Loftus, Michael T.; Amirapu, Satya; Guild, Sarah-Jane; Quill, Gina; Woodward, William R.; Habecker, Beth A.

    2017-01-01

Heart failure is characterized by the loss of sympathetic innervation to the ventricles, contributing to impaired cardiac function and arrhythmogenesis. We hypothesized that renal denervation (RDx) would reverse this loss. Male Wistar rats underwent myocardial infarction (MI) or sham surgery and progressed into heart failure for 4 wk before receiving bilateral RDx or sham RDx. After an additional 3 wk, left ventricular (LV) function was assessed, and ventricular sympathetic nerve fiber density was determined via histology. Post-MI heart failure rats displayed significant reductions in ventricular sympathetic innervation and tissue norepinephrine content (nerve fiber density in the LV of MI+sham RDx hearts was 0.31 ± 0.05% vs. 1.00 ± 0.10% in the sham MI+sham RDx group, P < 0.05), and RDx significantly increased ventricular sympathetic innervation (0.76 ± 0.14%, P < 0.05) and tissue norepinephrine content. MI was associated with an increase in fibrosis of the noninfarcted ventricular myocardium, which was attenuated by RDx. RDx improved LV ejection fraction and end-systolic and -diastolic areas when compared with pre-RDx levels. This is the first study to show an interaction between renal nerve activity and cardiac sympathetic nerve innervation in heart failure. Our findings show that denervating the renal nerves improves cardiac sympathetic innervation and function in the post-MI failing heart. PMID:28052866

  5. Alterations in left ventricular diastolic function in conscious dogs with pacing-induced heart failure

    NASA Technical Reports Server (NTRS)

    Komamura, K.; Shannon, R. P.; Pasipoularides, A.; Ihara, T.; Lader, A. S.; Patrick, T. A.; Bishop, S. P.; Vatner, S. F.

    1992-01-01

    We investigated in conscious dogs (a) the effects of heart failure induced by chronic rapid ventricular pacing on the sequence of development of left ventricular (LV) diastolic versus systolic dysfunction and (b) whether the changes were load dependent or secondary to alterations in structure. LV systolic and diastolic dysfunction were evident within 24 h after initiation of pacing and occurred in parallel over 3 wk. LV systolic function was reduced at 3 wk, i.e., peak LV dP/dt fell by -1,327 +/- 105 mmHg/s and ejection fraction by -22 +/- 2%. LV diastolic dysfunction also progressed over 3 wk of pacing, i.e., tau increased by +14.0 +/- 2.8 ms and the myocardial stiffness constant by +6.5 +/- 1.4, whereas LV chamber stiffness did not change. These alterations were associated with increases in LV end-systolic (+28.6 +/- 5.7 g/cm2) and LV end-diastolic stresses (+40.4 +/- 5.3 g/cm2). When stresses and heart rate were matched at the same levels in the control and failure states, the increases in tau and myocardial stiffness were no longer observed, whereas LV systolic function remained depressed. There were no increases in connective tissue content in heart failure. Thus, pacing-induced heart failure in conscious dogs is characterized by major alterations in diastolic function which are reversible with normalization of increased loading condition.

  6. Nonoperative management of blunt hepatic trauma: A systematic review.

    PubMed

    Boese, Christoph Kolja; Hackl, Michael; Müller, Lars Peter; Ruchholtz, Steffen; Frink, Michael; Lechler, Philipp

    2015-10-01

    Nonoperative management (NOM) has become the standard treatment in hemodynamically stable patients with blunt hepatic injuries. While the reported overall success rates of NOM are excellent, there is a lack of consensus regarding the risk factors predicting the failure of NOM. The aim of this systematic review was to identify the incidence and prognostic factors for failure of NOM in adult patients with blunt hepatic trauma. Prospective studies reporting prognostic factors for the failure of nonoperative treatment of blunt liver injuries were identified by searching MEDLINE and the Cochrane Central Register of Controlled Trials. We screened 798 titles and abstracts, of which 8 single-center prospective observational studies, reporting 410 patients, were included in the qualitative and quantitative synthesis. No randomized controlled trials were found. The pooled failure rate of NOM was 9.5% (0-24%). Twenty-six prognostic factors predicting the failure of NOM were reported, of which six reached statistical significance in one or more studies: blood pressure (p < 0.05), fluid resuscitation (p = 0.02), blood transfusion (p = 0.003), peritoneal signs (p < 0.0001), Injury Severity Score (ISS) (p = 0.03), and associated intra-abdominal injuries (p < 0.01). There is evidence that patients presenting with clinical signs of shock, a high ISS, associated intra-abdominal injuries, and peritoneal signs are at an increased risk of failure of NOM for the treatment of blunt hepatic injuries. Systematic review, level III.
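The pooled failure rate quoted above is, in its simplest form, total failures over total patients across studies, with the parenthetical range giving the per-study extremes. A minimal sketch with invented counts (not the eight studies of the review):

```python
# Minimal pooled failure-rate sketch: pool by total failures over total
# patients, and report the per-study range. Counts are invented for
# illustration, not taken from the review's eight studies.

studies = [
    # (failures of NOM, patients managed nonoperatively)
    (0, 40),
    (6, 80),
    (12, 50),
]

def pooled_rate(studies):
    failures = sum(f for f, n in studies)
    patients = sum(n for f, n in studies)
    return failures / patients

rates = [f / n for f, n in studies]
print(round(pooled_rate(studies), 3), min(rates), max(rates))  # 0.106 0.0 0.24
```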

  7. Ignition and growth modeling of detonation reaction zone experiments on single crystals of PETN and HMX

    NASA Astrophysics Data System (ADS)

    White, Bradley W.; Tarver, Craig M.

    2017-01-01

    It has long been known that detonating single crystals of solid explosives have much larger failure diameters than those of heterogeneous charges of the same explosive pressed or cast to 98 - 99% theoretical maximum density (TMD). In 1957, Holland et al. demonstrated that PETN single crystals have failure diameters of about 8 mm, whereas heterogeneous PETN charges have failure diameters of less than 0.5 mm. Recently, Fedorov et al. quantitatively determined nanosecond time resolved detonation reaction zone profiles of single crystals of PETN and HMX by measuring the interface particle velocity histories of the detonating crystals and LiF windows using a PDV system. The measured reaction zone time durations for PETN and HMX single crystal detonations were approximately 100 and 260 nanoseconds, respectively. These experiments provided the necessary data to develop Ignition and Growth (I&G) reactive flow model parameters for the single crystal detonation reaction zones. Using these parameters, the calculated unconfined failure diameter of a PETN single crystal was 7.5 +/- 0.5 mm, close to the 8 mm experimental value. The calculated failure diameter of an unconfined HMX single crystal was 15 +/- 1 mm. The unconfined failure diameter of an HMX single crystal has not yet been determined precisely, but Fedorov et al. detonated 14 mm diameter crystals confined by detonating a HMX-based plastic bonded explosive (PBX) without initially overdriving the HMX crystals.

  8. ADM guidance-Ceramics: guidance to the use of fractography in failure analysis of brittle materials.

    PubMed

    Scherrer, Susanne S; Lohbauer, Ulrich; Della Bona, Alvaro; Vichi, Alessandro; Tholey, Michael J; Kelly, J Robert; van Noort, Richard; Cesar, Paulo Francisco

    2017-06-01

To provide background information and guidance on the accurate use of fractography, a powerful tool for failure analysis of dental ceramic structures. An extended palette of qualitative and quantitative fractography is provided, both for in vivo and in vitro fracture surface analyses. As visual support, this guidance document provides micrographs of typical critical ceramic processing flaws, differentiating between pre- versus post-sintering cracks, grinding-damage-related failures, occlusal contact wear origins, and failures due to surface degradation. The documentation emphasizes good labeling of crack features, precise indication of the direction of crack propagation (dcp), identification of the fracture origin, and the use of fractographic photomontages of critical flaws or flaw labeling on strength data graphics. A compilation of recommendations for specific applications of fractography in Dentistry is also provided. This guidance document will contribute to a more accurate use of fractography and help researchers better identify, describe, and understand the causes of failure, in both clinical and laboratory-scale situations. If adequately performed at a large scale, fractography will assist in optimizing the methods of processing and designing restorative materials and components. Clinical failures may be better understood and consequently reduced by sending out the correct message regarding the fracture origin in clinical trials. Copyright © 2017 The Academy of Dental Materials. All rights reserved.

  9. [Renal failure in patients with liver transplant: incidence and predisposing factors].

    PubMed

    Gerona, S; Laudano, O; Macías, S; San Román, E; Galdame, O; Torres, O; Sorkin, E; Ciardullo, M; de Santibañes, E; Mastai, R

    1997-01-01

Renal failure is a common finding in patients undergoing orthotopic liver transplantation. The aim of the present study was to evaluate the incidence, the prognostic value of pre-, intra-, and postoperative factors, and the severity of renal dysfunction in patients who undergo liver transplantation. Therefore, the records of 38 consecutive adult patients were reviewed. Renal failure was defined arbitrarily as an increase in creatinine (> 1.5 mg/dl) and/or blood urea (> 80 mg/dl). Three patients were excluded from the final analysis (1 with acute liver failure and 2 with survival of less than 72 h). Twenty-one of the 35 patients had renal failure after orthotopic liver transplantation. Six of these episodes developed early, within the first 6 days. Late renal impairment occurred in 15 patients during hospitalization (40 +/- 10 days) (mean +/- SD). In the overall series, worse liver function (evaluated by the Child-Pugh classification), higher blood requirements, and higher cyclosporine levels were observed in those who experienced renal failure compared with those who did not (p < 0.05). Early renal failure was related to preoperative (liver function) and intraoperative (blood requirements) factors, and several causes other than cyclosporine (nephrotoxic drugs and graft failure) were present in patients who developed late renal impairment. No mortality was associated with renal failure. We conclude that renal failure (a) is a common finding after liver transplantation, (b) has a multifactorial pathogenesis, and (c) is not related to a poor outcome.

  10. Effect of Progressive Heart Failure on Cerebral Hemodynamics and Monoamine Metabolism in CNS.

    PubMed

    Mamalyga, M L; Mamalyga, L M

    2017-07-01

    Compensated and decompensated heart failure are characterized by different associations of disorders in the brain and heart. In compensated heart failure, the blood flow in the common carotid and basilar arteries does not change. Exacerbation of heart failure leads to severe decompensation and is accompanied by a decrease in blood flow in the carotid and basilar arteries. Changes in monoamine content occurring in the brain at different stages of heart failure are determined by various factors. The functional exercise test showed unequal monoamine-synthesizing capacities of the brain in compensated and decompensated heart failure. Reduced capacity of the monoaminergic systems in decompensated heart failure probably leads to overstrain of the central regulatory mechanisms, their gradual exhaustion, and failure of the compensatory mechanisms, which contributes to progression of heart failure.

  11. Proposal for a functional classification system of heart failure in patients with end-stage renal disease: proceedings of the acute dialysis quality initiative (ADQI) XI workgroup.

    PubMed

    Chawla, Lakhmir S; Herzog, Charles A; Costanzo, Maria Rosa; Tumlin, James; Kellum, John A; McCullough, Peter A; Ronco, Claudio

    2014-04-08

    Structural heart disease is highly prevalent in patients with chronic kidney disease requiring dialysis. More than 80% of patients with end-stage renal disease (ESRD) are reported to have cardiovascular disease. This observation has enormous clinical relevance because the leading causes of death for patients with ESRD are of cardiovascular disease etiology, including heart failure, myocardial infarction, and sudden cardiac death. The 2 systems most commonly used to classify the severity of heart failure are the New York Heart Association (NYHA) functional classification and the American Heart Association (AHA)/American College of Cardiology (ACC) staging system. With rare exceptions, patients with ESRD who do not receive renal replacement therapy (RRT) develop signs and symptoms of heart failure, including dyspnea and edema due to inability of the severely diseased kidneys to excrete sodium and water. Thus, by definition, nearly all patients with ESRD develop a symptomatology consistent with heart failure if fluid removal by RRT is delayed. Neither the AHA/ACC heart failure staging nor the NYHA functional classification system identifies the variable symptomatology that patients with ESRD experience depending upon whether evaluation occurs before or after fluid removal by RRT. Consequently, the incidence, severity, and outcomes of heart failure in patients with ESRD are poorly characterized. The 11th Acute Dialysis Quality Initiative has identified this issue as a critical unmet need for the proper evaluation and treatment of heart failure in patients with ESRD. We propose a classification schema based on patient-reported dyspnea assessed both pre- and post-ultrafiltration, in conjunction with echocardiography. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  12. Factors associated with health-related quality of life in stable ambulatory congestive heart failure patients: Systematic review.

    PubMed

    Baert, Anneleen; De Smedt, Delphine; De Sutter, Johan; De Bacquer, Dirk; Puddu, Paolo Emilio; Clays, Els; Pardaens, Sofie

    2018-03-01

Background: Since improved treatment of congestive heart failure has resulted in decreased mortality and hospitalisation rates, increasing self-perceived health-related quality of life (HRQoL) has become a major goal of congestive heart failure treatment. However, an overview of predictive factors of HRQoL is currently lacking in the literature. Purpose: The aim of this study was to identify key factors associated with HRQoL in stable ambulatory patients with congestive heart failure. Methods: A systematic review was performed. MEDLINE, Web of Science, and Embase were searched for the following combination of terms: heart failure, quality of life, health perception, or functional status, for the period between 2000 and February 2017. Literature screening was done by two independent reviewers. Results: Thirty-five studies out of 8374 titles were included for quality appraisal, of which 29 were selected for further data extraction. Four distinct categories grouping different types of variables were identified: socio-demographic characteristics, clinical characteristics, health and health behaviour, and care provider characteristics. Within these categories, the presence of depressive symptoms was most consistently related to a worse HRQoL, followed by a higher New York Heart Association functional class, younger age, and female gender. Conclusion: Through a systematic literature search, factors associated with HRQoL among congestive heart failure patients were investigated. Age, gender, New York Heart Association functional class, and depressive symptoms are the most consistent variables explaining the variance in HRQoL in patients with congestive heart failure. These findings are partly in line with previous research on predictors of hard endpoints in patients with congestive heart failure.

  13. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified against the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has documented 29 failure scenarios. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain. We characterize these five selected scenarios into three specific threat categories affecting confidentiality, integrity, and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.
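The game-theoretic core of such an analysis can be illustrated with a two-player attacker/defender game over a single AMI asset; the payoffs below are invented for illustration. Enumerating pure-strategy Nash equilibria shows why simulating mixed behavior matters — inspection-style games like this one often have no pure equilibrium:

```python
# Invented 2x2 attacker/defender game over one AMI asset.
# a: 0 = no attack, 1 = attack; d: 0 = idle, 1 = defend.
# payoffs[(a, d)] = (attacker_payoff, defender_payoff)
payoffs = {
    (0, 0): (0, 0),
    (0, 1): (0, -1),   # defense cost spent for nothing
    (1, 0): (5, -5),   # successful attack
    (1, 1): (-2, -1),  # attack blocked, attacker caught
}

def pure_nash(payoffs):
    """Enumerate pure-strategy Nash equilibria of the 2x2 game."""
    equilibria = []
    for a in (0, 1):
        for d in (0, 1):
            att, dfn = payoffs[(a, d)]
            att_best = all(att >= payoffs[(a2, d)][0] for a2 in (0, 1))
            dfn_best = all(dfn >= payoffs[(a, d2)][1] for d2 in (0, 1))
            if att_best and dfn_best:
                equilibria.append((a, d))
    return equilibria

print(pure_nash(payoffs))  # [] -- no pure-strategy equilibrium
```

The empty result is the point: neither pure strategy profile is stable, so both players must randomize, which is exactly the regime an agent-based simulation explores.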

  14. Physical Exercise and Patients with Chronic Renal Failure: A Meta-Analysis.

    PubMed

    Qiu, Zhenzhen; Zheng, Kai; Zhang, Haoxiang; Feng, Ji; Wang, Lizhi; Zhou, Hao

    2017-01-01

    Chronic renal failure is a severe clinical problem with significant socioeconomic impact worldwide; hemodialysis is an important way to maintain patients' health state, but improvement is difficult to achieve in a short time. The aim of our research was therefore to update and evaluate the effects of exercise on the health of patients with chronic renal failure. Databases were searched for relevant studies in English or Chinese, and the association between physical exercise and the health state of patients with chronic renal failure was investigated. A random-effects model was used to compare physical function and capacity in the exercise and control groups. Exercise is helpful in ameliorating blood pressure in patients with renal failure and significantly reduces VO2 in patients with renal failure. Subgroup analyses show that, in patients aged >50, physical activity can significantly reduce blood pressure in patients with renal failure. An activity program containing warm-up, strength, and aerobic exercises benefits blood pressure in these patients and improves their maximal oxygen consumption level. Such programs can help patients' physical function and aerobic capacity and may provide further benefits.
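
    A random-effects pooling step of the kind referenced above can be sketched with the DerSimonian-Laird estimator; the effect sizes and variances below are invented for illustration, not taken from the meta-analysis.

```python
# Minimal DerSimonian-Laird random-effects pooling of per-study effect sizes.
# Effect estimates and variances below are made up for illustration.
import math

effects   = [-4.0, -6.5, -3.2, -5.1]   # e.g. mean BP change (mmHg), exercise vs control
variances = [1.2, 2.0, 0.8, 1.5]       # within-study variances of each estimate

def dersimonian_laird(y, v):
    w = [1.0 / vi for vi in v]                               # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                            # between-study variance
    w_re = [1.0 / (vi + tau2) for vi in v]                   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

pooled, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled effect {pooled:.2f} ± {1.96 * se:.2f} (95% CI), tau² = {tau2:.2f}")
```

    The pooled estimate always lies within the range of the individual study effects, and tau² = 0 reduces the weights to the fixed-effect case.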

  15. Physical Exercise and Patients with Chronic Renal Failure: A Meta-Analysis

    PubMed Central

    Qiu, Zhenzhen; Zheng, Kai; Zhang, Haoxiang; Feng, Ji; Wang, Lizhi

    2017-01-01

    Chronic renal failure is a severe clinical problem with significant socioeconomic impact worldwide; hemodialysis is an important way to maintain patients' health state, but improvement is difficult to achieve in a short time. The aim of our research was therefore to update and evaluate the effects of exercise on the health of patients with chronic renal failure. Databases were searched for relevant studies in English or Chinese, and the association between physical exercise and the health state of patients with chronic renal failure was investigated. A random-effects model was used to compare physical function and capacity in the exercise and control groups. Exercise is helpful in ameliorating blood pressure in patients with renal failure and significantly reduces VO2 in patients with renal failure. Subgroup analyses show that, in patients aged >50, physical activity can significantly reduce blood pressure in patients with renal failure. An activity program containing warm-up, strength, and aerobic exercises benefits blood pressure in these patients and improves their maximal oxygen consumption level. Such programs can help patients' physical function and aerobic capacity and may provide further benefits. PMID:28316986

  16. Lungs in Heart Failure

    PubMed Central

    Apostolo, Anna; Giusti, Giuliano; Gargiulo, Paola; Bussotti, Maurizio; Agostoni, Piergiuseppe

    2012-01-01

    Lung function abnormalities both at rest and during exercise are frequently observed in patients with chronic heart failure, even in the absence of respiratory disease. Alterations of respiratory mechanics and of gas exchange capacity are strictly related to heart failure. Severe heart failure patients often show a restrictive respiratory pattern, secondary to heart enlargement and increased lung fluids, and impairment of alveolar-capillary gas diffusion, mainly due to an increased resistance to molecular diffusion across the alveolar-capillary membrane. Reduced gas diffusion contributes to exercise intolerance and to a worse prognosis. The cardiopulmonary exercise test is considered the “gold standard” for studying the cardiovascular, pulmonary, and metabolic adaptations to exercise in cardiac patients. During exercise, hyperventilation and a consequent reduction of ventilation efficiency are often observed in heart failure patients, resulting in an increased slope of the ventilation/carbon dioxide (VE/VCO2) relationship. Ventilatory efficiency is a strong prognostic and an important stratification marker. This paper describes the pulmonary abnormalities at rest and during exercise in patients with heart failure, highlighting the principal diagnostic tools for evaluation of lung function, the possible pharmacological interventions, and the parameters that could be useful in prognostic assessment of heart failure patients. PMID:23365739

  17. Muscle electrical stimulation improves neurovascular control and exercise tolerance in hospitalised advanced heart failure patients.

    PubMed

    Groehs, Raphaela V; Antunes-Correa, Ligia M; Nobre, Thais S; Alves, Maria-Janieire Nn; Rondon, Maria Urbana Pb; Barreto, Antônio Carlos Pereira; Negrão, Carlos E

    2016-10-01

    We investigated the effects of muscle functional electrical stimulation on muscle sympathetic nerve activity and muscle blood flow and, in addition, exercise tolerance in patients hospitalised for stabilisation of heart failure. Thirty patients hospitalised for treatment of decompensated heart failure, New York Heart Association class IV and ejection fraction ≤ 30%, were consecutively randomly assigned into two groups: functional electrical stimulation (n = 15; 54 ± 2 years) and control (n = 15; 49 ± 2 years). Muscle sympathetic nerve activity was directly recorded via microneurography and blood flow by venous occlusion plethysmography. Heart rate and blood pressure were evaluated on a beat-to-beat basis (Finometer), exercise tolerance by 6-minute walk test, quadriceps muscle strength by a dynamometer and quality of life by the Minnesota questionnaire. Functional electrical stimulation consisted of stimulating the lower limbs at 10 Hz frequency, 150 ms pulse width and 70 mA intensity for 60 minutes/day for 8-10 consecutive days. The control group underwent electrical stimulation at an intensity of < 20 mA. Baseline characteristics were similar between groups, except for age, which was higher, and C-reactive protein and forearm blood flow, which were lower, in the functional electrical stimulation group. Functional electrical stimulation significantly decreased muscle sympathetic nerve activity and increased muscle blood flow and muscle strength. No changes were found in the control group. Walking distance and quality of life increased in both groups; however, these changes were greater in the functional electrical stimulation group. Functional electrical stimulation reduces muscle sympathetic nerve activity and vasoconstriction and increases exercise tolerance, muscle strength and quality of life in hospitalised heart failure patients. These findings suggest that functional electrical stimulation may be useful to hospitalised patients with decompensated chronic heart failure. © The European Society of Cardiology 2016.

  18. Failure to Report Effect Sizes: The Handling of Quantitative Results in Published Health Education and Behavior Research

    ERIC Educational Resources Information Center

    Barry, Adam E.; Szucs, Leigh E.; Reyes, Jovanni V.; Ji, Qian; Wilson, Kelly L.; Thompson, Bruce

    2016-01-01

    Given the American Psychological Association's strong recommendation to always report effect sizes in research, scholars have a responsibility to provide complete information regarding their findings. The purposes of this study were to (a) determine the frequencies with which different effect sizes were reported in published, peer-reviewed…

  19. Failure to Get Admissions in a Discipline of Their Own Choice: Voices of Dejected Students

    ERIC Educational Resources Information Center

    Rana, Naeem Akhtar; Tuba, Naeem

    2017-01-01

    Attaining a professional engineering degree is a dream of many pre-engineering intermediate students in Pakistan. Several students face scarcity of resources to accomplish and enliven their dreams of getting admission into an engineering institute, which results in great hardships and turmoil for them. The literature reveals that quantitative work…

  20. The Long-Term Effects of Florida's Third Grade Retention Policy

    ERIC Educational Resources Information Center

    Smith, Andre K.

    2016-01-01

    The purpose of this quantitative causal-comparative study was to evaluate the long-term effects of Florida's Third-Grade Retention policy on low performing students' subsequent academic performance as measured by FCAT reading scores. The study included a random stratified sample of 1500 retained third graders for failure to meet Florida's…

  1. Correlation of Electronic Health Records Use and Reduced Prevalence of Diabetes Co-Morbidities

    ERIC Educational Resources Information Center

    Eller, James D.

    2013-01-01

    The general problem is Native American tribes have high prevalence rates of diabetes. The specific problem is the failure of IHS sites to adopt EHR may cause health care providers to miss critical opportunities to improve screening and triage processes that result in quality improvement. The purpose of the quantitative correlational study was to…

  2. The "Ins" and "Outs" of Physical Activity Policy Implementation: Inadequate Capacity, Inappropriate Outcome Measures, and Insufficient Funds

    ERIC Educational Resources Information Center

    Howie, Erin K.; Stevick, E. Doyle

    2014-01-01

    Background: Despite broad public support and legislative activity, policies intended to promote physical activity in schools have not produced positive outcomes in levels of physical activity or student health. What explains the broad failure of Physical Activity Policies (PAPs)? Thus far, PAP research has used limited quantitative methods to…

  3. MitoQ improves mitochondrial dysfunction in heart failure induced by pressure overload.

    PubMed

    Ribeiro Junior, Rogério Faustino; Dabkowski, Erinne Rose; Shekar, Kadambari Chandra; O Connell, Kelly A; Hecker, Peter A; Murphy, Michael P

    2018-03-01

    Heart failure remains a major public-health problem, with an increasing number of patients worsening from this disease. Despite current medical therapy, the condition still has a poor prognosis. Heart failure is complex, but mitochondrial dysfunction seems to be an important target to improve cardiac function directly. Our goal was to analyze the effects of MitoQ (100 µM in drinking water) on the development and progression of heart failure induced by pressure overload after 14 weeks. The main findings are that pressure overload-induced heart failure in rats decreased cardiac function in vivo, and this was not altered by MitoQ. However, we observed a reduction in right ventricular hypertrophy and lung congestion in heart failure animals treated with MitoQ. Heart failure also decreased total mitochondrial protein content and mitochondrial membrane potential in the intermyofibrillar mitochondria (IFM). MitoQ restored membrane potential in IFM but did not restore mitochondrial protein content. These alterations are associated with the impairment of basal and stimulated mitochondrial respiration in IFM and subsarcolemmal mitochondria (SSM) induced by heart failure. Moreover, MitoQ restored mitochondrial respiration in heart failure induced by pressure overload. We also detected higher levels of hydrogen peroxide production in heart failure, and MitoQ restored the increase in ROS production. MitoQ was also able to improve mitochondrial calcium retention capacity, mainly in the SSM, whereas in the IFM we observed only a small alteration. In summary, MitoQ improves mitochondrial dysfunction in heart failure induced by pressure overload by decreasing hydrogen peroxide formation, improving mitochondrial respiration and improving mitochondrial permeability transition pore (mPTP) opening. Published by Elsevier Inc.

  4. Cardiovascular mechanisms of SSRI drugs and their benefits and risks in ischemic heart disease and heart failure.

    PubMed

    Andrade, Chittaranjan; Kumar, Chethan B; Surya, Sandarsh

    2013-05-01

    Depression and heart disease are commonly comorbid. Selective serotonin reuptake inhibitors (SSRIs) are commonly used to treat depression. In March 2011, we carried out a 15-year search of PubMed for preclinical and clinical publications related to SSRIs and ischemic heart disease (IHD) or congestive heart failure (CHF). We identify and discuss a number of mechanisms by which SSRIs may influence cardiovascular functioning and health outcomes in patients with heart disease; many of the mechanisms that we present have received little attention in previous reviews. We examine studies with positive, neutral, and negative outcomes in IHD and CHF patients treated with SSRIs. SSRIs influence cardiovascular functioning and health through several different mechanisms; for example, they inhibit serotonin-mediated and collagen-mediated platelet aggregation, reduce inflammatory mediator levels, and improve endothelial function. SSRIs improve indices of ventricular functioning in IHD and heart failure without adversely affecting electrocardiographic parameters. SSRIs may also be involved in favorable or unfavorable drug interactions with medications that influence cardiovascular functions. The clinical evidence suggests that, in general, SSRIs are safe in patients with IHD and may, in fact, exert a cardioprotective effect. The clinical data are less clear in patients with heart failure, and the evidence for benefits with SSRIs is weak.

  5. Intermittent levosimendan infusions in advanced heart failure: favourable effects on left ventricular function, neurohormonal balance, and one-year survival.

    PubMed

    Malfatto, Gabriella; Della Rosa, Francesco; Villani, Alessandra; Rella, Valeria; Branzi, Giovanna; Facchini, Mario; Parati, Gianfranco

    2012-11-01

    The role of repeated infusions of levosimendan (LEVO) in patients with chronic advanced heart failure is still unclear. Thirty-three patients with chronic heart failure presenting clinical deterioration were randomized 2:1 to receive monthly infusions of LEVO (n = 22) or furosemide (controls, n = 11). At the first drug administration, noninvasive hemodynamic evaluation was performed; before and after each infusion, we assessed NYHA class, systolic and diastolic function, functional mitral regurgitation, and brain natriuretic peptide (BNP) levels. Noninvasive hemodynamic evaluation in the LEVO group showed vasodilation and a decrease in thoracic conductance (an index of pulmonary congestion), whereas in controls only a reduced thoracic conductance was observed. In the LEVO group, systolic and diastolic function, ventricular volumes, severity of mitral regurgitation, and BNP levels improved over time from baseline, and the improvement persisted 4 weeks after the last infusion (P < 0.01). In controls, no change developed over time in cardiac function and BNP levels. In LEVO-treated patients, 1-year mortality tended to be lower than in those treated with furosemide. In conclusion, serial LEVO infusions in advanced heart failure improved ventricular performance and favorably modulated neurohormonal activation. Multicenter randomized studies are warranted to test the effect of LEVO on long-term outcome.

  6. Cardiorenal Syndrome: New Developments in the Understanding and Pharmacologic Management

    PubMed Central

    2013-01-01

    Summary Cardiorenal syndromes (CRSs) with bidirectional heart-kidney signaling are increasingly being recognized for their association with increased morbidity and mortality. In acute CRS, recognition of the importance of worsening kidney function complicating management of acute decompensated heart failure has led to the examination of this specific outcome in the context of acute heart failure clinical trials. In particular, the role of fluid overload and venous congestion has focused interest in the most effective use of diuretic therapy to relieve symptoms of heart failure while at the same time preserving kidney function. Additionally, many novel vasoactive therapies have been studied in recent years with the hopes of augmenting cardiac function, improving symptoms and patient outcomes, while maintaining or improving kidney function. Similarly, recent advances in our understanding of the pathophysiology of chronic CRS have led to reanalysis of kidney outcomes in pivotal trials in chronic congestive heart failure, and newer trials are including changes in kidney function as well as kidney injury biomarkers as prospectively monitored and adjudicated outcomes. This paper provides an overview of some new developments in the pharmacologic management of acute and chronic CRS, examines several reports that illustrate a key management principle for each subtype, and discusses opportunities for future research. PMID:23929925

  7. Protective effects of ACLF sera on metabolic functions and proliferation of hepatocytes co-cultured with bone marrow MSCs in vitro

    PubMed Central

    Shi, Xiao-Lei; Gu, Jin-Yang; Zhang, Yue; Han, Bing; Xiao, Jiang-Qiang; Yuan, Xian-Wen; Zhang, Ning; Ding, Yi-Tao

    2011-01-01

    AIM: To investigate whether the function of hepatocytes co-cultured with bone marrow mesenchymal stem cells (MSCs) could be maintained in serum from acute-on-chronic liver failure (ACLF) patients. METHODS: Hepatocyte supportive functions and cytotoxicity of sera from 18 patients with viral hepatitis B-induced ACLF and 18 healthy volunteers were evaluated for porcine hepatocytes co-cultured with MSCs and for hepatocyte monolayer culture, respectively. The chemokine profile was also examined for the normal serum and liver failure serum. RESULTS: Hepatocyte growth factor (HGF) and tumor necrosis factor (TNF)-α were remarkably elevated in response to ACLF, while epidermal growth factor (EGF) and vascular endothelial growth factor (VEGF) levels were significantly decreased. Liver failure serum samples induced a higher detachment rate, lower viability and decreased liver support functions in the hepatocyte monolayer culture. Hepatocytes co-cultured with MSCs could tolerate the cytotoxicity of the serum from ACLF patients and had liver support functions similar to those of hepatocytes cultured with healthy human serum in vitro. In addition, co-cultured hepatocytes maintained a proliferative capability despite the insult from liver failure serum. CONCLUSION: ACLF serum does not impair the cell morphology, viability, proliferation and overall metabolic capacities of hepatocytes co-cultured with MSCs in vitro. PMID:21633639

  8. Magnetomotive optical coherence elastography for relating lung structure and function in cystic fibrosis

    NASA Astrophysics Data System (ADS)

    Chhetri, Raghav K.; Carpenter, Jerome; Superfine, Richard; Randell, Scott H.; Oldenburg, Amy L.

    2010-02-01

    Cystic fibrosis (CF) is a genetic defect in the cystic fibrosis transmembrane conductance regulator protein and is the most common life-limiting genetic condition affecting the Caucasian population. It is an autosomal recessive, monogenic inherited disorder characterized by failure of airway host defense against bacterial infection, which results in bronchiectasis, the breakdown of airway wall extracellular matrix (ECM). In this study, we show that in vitro models consisting of human tracheo-bronchial-epithelial (hBE) cells grown on porous supports with embedded magnetic nanoparticles (MNPs) at an air-liquid interface are suitable for long-term, non-invasive assessment of ECM remodeling using magnetomotive optical coherence elastography (MMOCE). We also examine the morphology of ex vivo CF and normal lung tissues using OCT, with a correlative histology study, and demonstrate a quantitative measure of normal and CF airway elasticity using MMOCE. The improved understanding of pathologic changes in CF lung structure and function and the novel method of longitudinal in vitro ECM assessment demonstrated in this study may lead to new in vivo imaging and elastography methods to monitor disease progression and treatment in cystic fibrosis.

  9. Stamping SERS for creatinine sensing

    NASA Astrophysics Data System (ADS)

    Li, Ming; Du, Yong; Zhao, Fusheng; Zeng, Jianbo; Santos, Greggy M.; Mohan, Chandra; Shih, Wei-Chuan

    2015-03-01

    Urine can be obtained easily, readily and non-invasively. The analysis of urine can provide metabolic information about the body and the condition of renal function. Creatinine is one of the major components of human urine associated with muscle metabolism. Since the amount of creatinine excreted into urine is relatively constant, it is used as an internal standard to normalize water variations. Moreover, the detection of creatinine concentration in urine is important for the renal clearance test, which can monitor the filtration function of the kidney and overall health status. In particular, kidney failure can be imminent when the creatinine concentration in urine is high. A simple device and protocol for creatinine sensing in urine samples can be valuable for point-of-care applications. We report quantitative analysis of creatinine in urine samples using the stamping surface enhanced Raman scattering (S-SERS) technique with a nanoporous gold disk (NPGD) based SERS substrate. The S-SERS technique enables label-free and multiplexed molecular sensing under dry conditions, while NPGDs provide a robust, controllable, and high-sensitivity SERS substrate. The performance of S-SERS with NPGDs is evaluated by the detection and quantification of pure creatinine and creatinine in artificial urine within physiologically relevant concentration ranges.
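
    Quantification in a scheme of this kind typically rests on a calibration curve relating signal intensity to concentration. The sketch below fits a hypothetical linear calibration by ordinary least squares and inverts it for an unknown sample; all intensities and concentrations are invented, not the paper's data.

```python
# Hypothetical calibration sketch: relate a SERS peak intensity to creatinine
# concentration by ordinary least squares, then invert to quantify an unknown.

def fit_line(x, y):
    """Return (slope, intercept) of the least-squares line through (x, y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

conc_mM   = [2, 4, 8, 12, 16]            # known creatinine standards (invented)
intensity = [210, 395, 820, 1190, 1605]  # measured SERS peak heights (a.u., invented)
slope, intercept = fit_line(conc_mM, intensity)

# Invert the calibration for an unknown sample's measured intensity.
unknown_intensity = 1000.0
estimated_conc = (unknown_intensity - intercept) / slope
print(round(estimated_conc, 2))   # estimated concentration in mM
```

    In practice the inversion would be reported with a confidence interval and checked against the calibration's linear range.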

  10. Prediction of postoperative outcome after hepatectomy with a new bedside test for maximal liver function capacity.

    PubMed

    Stockmann, Martin; Lock, Johan F; Riecke, Björn; Heyne, Karsten; Martus, Peter; Fricke, Michael; Lehmann, Sina; Niehues, Stefan M; Schwabe, Michael; Lemke, Arne-Jörn; Neuhaus, Peter

    2009-07-01

    To validate the LiMAx test, a new bedside test for the determination of maximal liver function capacity based on ¹³C-methacetin kinetics, and to investigate the diagnostic performance of different liver function tests and scores, including the LiMAx test, for the prediction of postoperative outcome after hepatectomy. Liver failure is a major cause of mortality after hepatectomy, and preoperative prediction of residual liver function has been limited so far. Sixty-four patients undergoing hepatectomy were analyzed in a prospective observational study. Volumetric analysis of the liver was carried out using preoperative computed tomography and intraoperative measurements. Perioperative factors associated with morbidity and mortality were analyzed. Cutoff values of the LiMAx test were evaluated by receiver operating characteristic (ROC) analysis. Residual LiMAx demonstrated an excellent linear correlation with residual liver volume (r = 0.94, P < 0.001) after hepatectomy. The multivariate analysis revealed LiMAx on postoperative day 1 as the only predictor of liver failure (P = 0.003) and mortality (P = 0.004). The AUROC for the prediction of liver failure and of liver failure-related death by the LiMAx test was 0.99 in both cases. Preoperative volume/function analysis combining CT volumetry and LiMAx allowed an accurate calculation of the remnant liver function capacity prior to surgery (r = 0.85, P < 0.001). Residual liver function is the major factor influencing the outcome of patients after hepatectomy and can be predicted preoperatively by a combination of LiMAx and CT volumetry.
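
    An AUROC like the one reported above can be computed, for a test where lower values indicate disease, via the Mann-Whitney pairwise-comparison identity. The sketch below uses invented LiMAx-like values, not the study's data.

```python
# Sketch of an AUROC computation via the rank-sum (Mann-Whitney) identity.

def auroc(scores_pos, scores_neg):
    """Probability that a randomly chosen positive case scores lower than a
    negative one (ties count half) -- here 'positive' = liver failure, and
    lower test values indicate worse function."""
    n_pairs = len(scores_pos) * len(scores_neg)
    favorable = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p < n:
                favorable += 1.0
            elif p == n:
                favorable += 0.5
    return favorable / n_pairs

# Hypothetical postoperative test values (µg/kg/h):
failure    = [42, 55, 60]          # patients who developed liver failure
no_failure = [95, 120, 150, 180]   # uneventful recovery
print(auroc(failure, no_failure))  # 1.0: perfect separation in this toy data
```

    With overlapping distributions the value falls below 1.0; an uninformative test gives 0.5.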

  11. Functional correlation approach to operational risk in banking organizations

    NASA Astrophysics Data System (ADS)

    Kühn, Reimer; Neu, Peter

    2003-05-01

    A Value-at-Risk-based model is proposed to compute the adequate equity capital necessary to cover potential losses due to operational risks, such as human and system process failures, in banking organizations. Exploring the analogy to a lattice gas model from physics, correlations between sequential failures are modeled as functionally defined, heterogeneous couplings between mutually supportive processes. In contrast to traditional risk models for market and credit risk, where correlations are described as equal-time correlations by a covariance matrix, the dynamics of the model shows collective phenomena such as bursts and avalanches of process failures.
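
    A minimal sketch of the coupled-failure idea, assuming invented couplings and rates rather than the paper's lattice-gas parameters: each process fails with a base probability that rises when the processes supporting it failed in the previous step, so failures can propagate as bursts and avalanches.

```python
# Toy simulation of functionally coupled process failures.
# Dependency graph, base rate, and coupling strength are invented.
import random

random.seed(42)

N_PROC = 5
BASE_P = 0.02      # stand-alone failure probability per step
COUPLING = 0.4     # added probability per failed supporting process
# depends_on[i] = processes whose failure impairs process i
depends_on = {0: [], 1: [0], 2: [0], 3: [1, 2], 4: [3]}

def step(failed_prev):
    """Advance one time step; return the set of processes that fail now."""
    failed = set()
    for i in range(N_PROC):
        p = BASE_P + COUPLING * sum(1 for j in depends_on[i] if j in failed_prev)
        if random.random() < min(p, 1.0):
            failed.add(i)
    return failed

losses = []
failed = set()
for _ in range(10_000):
    failed = step(failed)
    losses.append(len(failed))

print("mean failures/step:", sum(losses) / len(losses))
print("worst step (avalanche size):", max(losses))
```

    The heavy upper tail of the per-step loss distribution, rather than its mean, is what drives a Value-at-Risk capital figure.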

  12. Sustained choroid plexus function in human elderly and Alzheimer's disease patients.

    PubMed

    Spector, Reynold; Johanson, Conrad E

    2013-09-24

    We and other investigators have postulated deterioration of essential choroid plexus (CP) functions in some elderly and especially Alzheimer's disease patients based on apparent anatomical, histological and pathological changes in CP. We have termed this putative phenomenon CP failure. By focusing on four essential energy-requiring CP functions, specifically ascorbic acid (AA) and folate transport from blood into CSF, transthyretin synthesis and secretion into CSF, and electrolyte/acid-base balance in CSF, we were able to evaluate the hypothesis of CP failure by reviewing definitive human data. In both healthy elderly and Alzheimer's disease patients, the CP functions normally to transport AA and folates actively from blood into CSF, synthesize and secrete transthyretin into CSF, and maintain CSF acid-base balance and ion concentrations. These human CSF compositional data provide no support for the notion of CP failure in elderly humans and Alzheimer's disease patients.

  13. Towards real-time quantitative optical imaging for surgery

    NASA Astrophysics Data System (ADS)

    Gioux, Sylvain

    2017-07-01

    There is a pressing clinical need to provide image guidance during surgery. Currently, assessment of tissue that needs to be resected or avoided is performed subjectively, leading to a large number of failures, patient morbidity and increased healthcare cost. Because near-infrared (NIR) optical imaging is safe, does not require contact, and can provide relatively deep information (several mm), it offers unparalleled capabilities for providing image guidance during surgery. In this work, we introduce a novel concept that enables the quantitative imaging of endogenous molecular information over large fields of view. Because this concept can be implemented in real time, it is amenable to providing video-rate endogenous information during surgery.

  14. Continuous monitoring of regional function by a miniaturized ultrasound transducer allows early quantification of low-grade myocardial ischemia.

    PubMed

    Hyler, Stefan; Pischke, Søren E; Halvorsen, Per Steinar; Espinoza, Andreas; Bergsland, Jacob; Tønnessen, Tor Inge; Fosse, Erik; Skulstad, Helge

    2015-04-01

    Sensitive methods for the early detection of myocardial dysfunction are still needed, as ischemia is a leading cause of decreased ventricular function during and after heart surgery. The aim of this study was to test the hypothesis that low-grade ischemia could be detected quantitatively by a miniaturized epicardial ultrasound transducer (Ø = 3 mm), allowing continuous monitoring. In 10 pigs, transducers were positioned in the left anterior descending and circumflex coronary artery areas. Left ventricular pressure was obtained by a micromanometer. The left internal mammary artery was grafted to the left anterior descending coronary artery, which was occluded proximal to the anastomosis. Left internal mammary artery flow was reduced stepwise by 25%, 50%, and 75% for 18 min each. From the transducers, M-mode traces were obtained, allowing continuous tissue velocity traces and displacement measurements. Regional work was assessed as the left ventricular pressure-displacement loop area. Tissue lactate measured by intramyocardial microdialysis was used as the reference method to detect ischemia. All steps of coronary flow reduction demonstrated reduced peak systolic velocity (P < .05) and regional work (P < .01). The decreases in peak systolic velocity and regional work were closely related to the degree of ischemia, demonstrated by their correlations with lactate (R = -0.74, P < .01, and R = -0.64, P < .01, respectively). The circumflex coronary artery area was not affected by any of the interventions. The epicardially attached miniaturized ultrasound transducer allowed the precise detection of different levels of coronary flow reduction. The results also showed a quantitative and linear relationship among coronary flow, ischemia, and myocardial function. Thus, the ultrasound transducer has the potential to improve the monitoring of myocardial ischemia and to detect graft failure during and after heart surgery. Copyright © 2015 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
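
    The reported velocity-lactate correlations are plain Pearson coefficients; a minimal sketch with invented paired measurements (not the study's data):

```python
# Pearson correlation as used to relate a deformation index to tissue lactate.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

peak_velocity = [5.1, 4.4, 3.6, 2.9, 2.1]   # cm/s, falling with ischemia (invented)
lactate       = [1.0, 1.8, 2.9, 4.2, 5.6]   # mmol/L, rising with ischemia (invented)
r = pearson_r(peak_velocity, lactate)
print(round(r, 3))   # strongly negative, mirroring the sign of the reported R = -0.74
```

    A negative r here reflects the inverse relationship: as ischemia deepens, lactate rises while the velocity index falls.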

  15. Personality Changes as a Function of Minimum Competency Test Success or Failure.

    ERIC Educational Resources Information Center

    Richman, Charles L.; And Others

    1987-01-01

    The psychological effects of success and failure on the North Carolina Minimum Competency Test (MCT) were examined. Subjects were high school students, who were pre- and post-tested using the Rosenberg Self Esteem Scale and the High School Personality Questionnaire. Self-esteem decreased following knowledge of MCT failure. (LMO)

  16. Teachers' Perceptions of and Solutions for Student School Failure

    ERIC Educational Resources Information Center

    Maksic, Slavica

    2015-01-01

    School failure is an important aspect of students' development and their progression through the process of education, as well as for the functioning of the education system itself. The paper reports the results of a qualitative study exploring the relationship between primary school teachers' perceptions of student school failure and the…

  17. Metallic ureteral stents in malignant ureteral obstruction: clinical factors predicting stent failure.

    PubMed

    Chow, Po-Ming; Hsu, Jui-Shan; Huang, Chao-Yuan; Wang, Shuo-Meng; Lee, Yuan-Ju; Huang, Kuo-How; Yu, Hong-Jheng; Pu, Yeong-Shiau; Liang, Po-Chin

    2014-06-01

    To provide clinical outcomes of the Resonance metallic ureteral stent in patients with malignant ureteral obstruction, as well as clinical factors predicting stent failure. Cancer patients who received Resonance stents for ureteral obstruction from July 2009 to March 2012 were included for chart review. Stent failure was detected by clinical symptoms, imaging studies, and renal function tests. Survival analysis of stent duration was used to estimate the patency rate and factors predicting stent failure. A total of 117 stents were inserted successfully into 94 ureteral units in 79 patients. There were no major complications. These stents underwent survival analysis and proportional hazards regression. The median duration of the stents was 5.77 months. In multivariate analysis, age (P = 0.043), preoperative serum creatinine level (P = 0.0174), and cancer type (P = 0.0494) were significant factors associated with stent failure. Cancer treatment before and after stent insertion had no effect on stent duration. Resonance stents are effective and safe in relieving malignant ureteral obstruction. Old age and a high serum creatinine level are predictors of stent failure. Stents in patients with lower gastrointestinal cancers have a longer functional duration.
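
    Survival analysis of stent duration of the kind described can be sketched with a Kaplan-Meier estimator, reading the median functional duration off the survival curve; the durations and censoring flags below are invented, not the study's data.

```python
# Kaplan-Meier sketch for stent patency with right-censored observations.

def kaplan_meier(times, events):
    """Return (time, S(t)) steps; events[i]=True means failure, False censored."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data[i:] if tt == t and e)
        n_t = sum(1 for tt, _ in data[i:] if tt == t)   # ties at time t
        if deaths:
            s *= (1 - deaths / at_risk)
            curve.append((t, s))
        at_risk -= n_t
        i += n_t
    return curve

def median_survival(curve):
    """First time at which the survival estimate drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None   # median not reached

months = [2.0, 3.5, 5.0, 5.8, 6.1, 7.4, 9.0, 12.0]        # invented durations
failed = [True, False, True, True, True, False, True, False]  # False = censored
curve = kaplan_meier(months, failed)
print(median_survival(curve))   # 6.1 months with this toy data
```

    Censored stents (still patent at last follow-up) contribute person-time without counting as failures, which is why the median differs from the plain average of the failure times.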

  18. Independent Orbiter Assessment (IOA): Analysis of the auxiliary power unit

    NASA Technical Reports Server (NTRS)

    Barnes, J. E.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Auxiliary Power Unit (APU). The APUs are required to provide power to the Orbiter hydraulics systems during ascent and entry flight phases for aerosurface actuation, main engine gimballing, landing gear extension, and other vital functions. For analysis purposes, the APU system was broken down into ten functional subsystems. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. A preponderance of 1/1 criticality items were related to failures that allowed the hydrazine fuel to escape into the Orbiter aft compartment, creating a severe fire hazard, and failures that caused loss of the gas generator injector cooling system.
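
    The top-down FMEA bookkeeping described above can be sketched as a small table of failure modes with assigned criticality codes, from which potential critical items are surfaced; the entries below are paraphrased examples, not the actual IOA worksheet.

```python
# Minimal sketch of FMEA criticality screening: list failure modes, tag each
# with a criticality code, and surface the worst-category (potential critical)
# items. Modes and codes here are illustrative examples only.

FAILURE_MODES = [
    # (subsystem, failure mode, effect, criticality)
    ("fuel supply",   "hydrazine leak into aft compartment", "fire hazard",            "1/1"),
    ("gas generator", "loss of injector cooling",            "loss of crew/vehicle",   "1/1"),
    ("lube oil",      "pump degradation",                    "loss of one APU",        "2/1R"),
    ("controller",    "erroneous shutdown signal",           "loss of mission capability", "3/3"),
]

def critical_items(modes, worst_category="1/1"):
    """Return the modes assigned the worst criticality category."""
    return [m for m in modes if m[3] == worst_category]

for subsystem, mode, effect, crit in critical_items(FAILURE_MODES):
    print(f"[{crit}] {subsystem}: {mode} -> {effect}")
```

    In a real FMEA/CIL the code encodes severity and redundancy (e.g. 1/1 = loss of life/vehicle with no redundancy), and the resulting critical items list drives design and inspection priorities.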

  19. [Comorbidities of heart failure: sleep apnea].

    PubMed

    Woehrle, H; Oldenburg, O; Stadler, S; Arzt, M

    2018-05-01

    Since sleep apnea often occurs in heart failure, physicians regularly need to decide whether further diagnostic procedures and/or treatment are required. Which types of sleep apnea occur in heart failure patients? When is treatment needed? Which treatments and treatment goals are appropriate? Clinical trials and guidelines as well as their implementation in clinical practice are discussed. At least 40% of patients with heart failure, both with reduced and preserved left ventricular ejection fraction (HFrEF and HFpEF, respectively), suffer from relevant sleep apnea. In heart failure patients both obstructive and central sleep apnea are associated with increased mortality. In HFrEF as well as in HFpEF patients with obstructive sleep apnea, treatment with continuous positive airway pressure (CPAP) achieves symptomatic and functional improvements. In patients with HFpEF, positive airway pressure treatment of central sleep apnea may be beneficial. In patients with HFrEF and left ventricular ejection fraction ≤45%, adaptive servoventilation is contraindicated. Sleep apnea is highly prevalent in heart failure patients and its treatment in specific patient groups can improve symptoms and functional outcomes. Thus, testing for sleep apnea is recommended.

  20. Resetting the transcription factor network reverses terminal chronic hepatic failure

    PubMed Central

    Nishikawa, Taichiro; Bell, Aaron; Brooks, Jenna M.; Setoyama, Kentaro; Melis, Marta; Han, Bing; Fukumitsu, Ken; Handa, Kan; Tian, Jianmin; Kaestner, Klaus H.; Vodovotz, Yoram; Locker, Joseph; Soto-Gutierrez, Alejandro; Fox, Ira J.

    2015-01-01

    The cause of organ failure is enigmatic for many degenerative diseases, including end-stage liver disease. Here, using a CCl4-induced rat model of irreversible and fatal hepatic failure, which also exhibits terminal changes in the extracellular matrix, we demonstrated that chronic injury stably reprograms the critical balance of transcription factors and that diseased and dedifferentiated cells can be returned to normal function by re-expression of critical transcription factors, a process similar to the type of reprogramming that induces somatic cells to become pluripotent or to change their cell lineage. Forced re-expression of the transcription factor HNF4α induced expression of the other hepatocyte-expressed transcription factors; restored functionality in terminally diseased hepatocytes isolated from CCl4-treated rats; and rapidly reversed fatal liver failure in CCl4-treated animals by restoring diseased hepatocytes rather than replacing them with new hepatocytes or stem cells. Together, the results of our study indicate that disruption of the transcription factor network and cellular dedifferentiation likely mediate terminal liver failure and suggest reinstatement of this network has therapeutic potential for correcting organ failure without cell replacement. PMID:25774505
