Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with the system under evaluation provided estimates of failure rates for each point in this simplified fault tree, and the means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
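As a rough illustration of how such elicited step-failure estimates roll up through a fault tree, the sketch below combines hypothetical per-step probabilities through independent AND/OR gates; the step names, rates, and workaround term are invented for illustration, not taken from the study.

```python
# Minimal fault-tree roll-up sketch (hypothetical rates, not the paper's data).
# OR gate: P = 1 - prod(1 - p_i); AND gate: P = prod(p_i), assuming independence.
from math import prod

def p_or(probs):
    """Probability that at least one independent input event occurs."""
    return 1.0 - prod(1.0 - p for p in probs)

def p_and(probs):
    """Probability that all independent input events occur."""
    return prod(probs)

# Hypothetical mean failure-rate estimates elicited per interface step.
step_failure = {"order_entry": 0.12, "label_scan": 0.08, "dose_check": 0.05}
workaround_fails = 0.10  # chance that no workaround rescues a failed step

# A step causes system failure only if the step fails AND no workaround works,
# which is why frequent step failures can still yield rare system failure.
unrecovered = [p_and([p, workaround_fails]) for p in step_failure.values()]
print(f"system failure probability: {p_or(unrecovered):.4f}")
```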
NASA Astrophysics Data System (ADS)
Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying
2017-09-01
As the modern power system is expected to develop into a more intelligent and efficient version, i.e. the smart grid, or to become the central backbone of an energy internet for free energy interactions, security concerns related to cascading failures have been raised in view of their potentially catastrophic results. Research on topological analysis based on complex networks has contributed greatly to revealing structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, built on inappropriate modeling assumptions, still cannot distinguish the effects of structure from those of the operational state, and thus gives little meaningful guidance for system operation. This paper reveals the interrelation between network structure and operational states in cascading failure and gives a quantitative evaluation integrating both perspectives. For structural analysis, cascading paths are identified by extended betweenness and quantitatively described by the cascading drop and the cascading gradient. The operational state along cascading paths is then described by the loading level, so that the risk of cascading failure along a specific cascading path can be quantitatively evaluated from these two factors. The maximum cascading gradient over all possible cascading paths can serve as an overall metric of the entire power grid's cascading-failure characteristics. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; the simulation evidence presented in this paper suggests that the proposed model can identify the structural causes of cascading failure and is promising for guiding the protection of system operation in the future.
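The paper's extended betweenness is derived from power-flow physics; as a loose structural stand-in, the sketch below uses ordinary edge betweenness on a toy graph (via networkx) and scales it by a hypothetical per-line loading level to score a candidate cascading path.

```python
# Hedged stand-in: ordinary edge betweenness replaces the paper's power-flow
# "extended betweenness"; the graph, loadings, and path are all hypothetical.
import networkx as nx

g = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (1, 3), (2, 4)])  # toy grid
eb = {tuple(sorted(e)): v
      for e, v in nx.edge_betweenness_centrality(g).items()}

loading = {tuple(sorted(e)): 0.5 for e in g.edges}   # per-line loading level
loading[(2, 3)] = 0.9                                # one heavily loaded line

def path_risk(path_edges):
    """Structure term (betweenness) scaled by operating state (loading)."""
    return sum(eb[tuple(sorted(e))] * loading[tuple(sorted(e))]
               for e in path_edges)

cascading_path = [(1, 3), (2, 3), (2, 4)]
print(f"risk along path: {path_risk(cascading_path):.3f}")
```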
The Positive Alternative Credit Experience (PACE) Program a Quantitative Comparative Study
ERIC Educational Resources Information Center
Warren, Rebecca Anne
2011-01-01
The purpose of this quantitative comparative study was to evaluate the Positive Alternative Credit Experience (PACE) Program using an objectives-oriented approach to a formative program evaluation. The PACE Program was a semester-long high school alternative education program designed to serve students at-risk for academic failure or dropping out…
Evaluation: Review of the Past, Preview of the Future.
ERIC Educational Resources Information Center
Smith, M. F.
1994-01-01
This paper summarized contributors' ideas about evaluation as a field and where it is going. Topics discussed were qualitative versus quantitative debate; evaluation's purpose; professionalization; program failure; program development; evaluators as advocates; evaluation knowledge; evaluation expansion; and methodology and design. (SLD)
NASA Astrophysics Data System (ADS)
Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong
2014-06-01
Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating system reliability. Although single failure modes can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis, whereas in practice, owing to a lack of information, epistemic uncertainty is common in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimal cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, copula theory is applied to analyze the correlation of multiple failure modes and derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the copula modeling approach eliminates the error in reliability analysis. Furthermore, for quantitative analysis, importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The results provide important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
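A minimal sketch of the FWGM step for a single failure mode, assuming triangular fuzzy numbers for severity, occurrence, and detection and a generic graded-mean defuzzification; the scores and weights are hypothetical, and the paper's exact membership functions and copula step are not reproduced.

```python
# Sketch of a fuzzy weighted geometric mean RPN for one failure mode.
# Risk factors S, O, D are triangular fuzzy numbers (l, m, u); weights sum to 1.
import numpy as np

def fwgm(factors, weights):
    """Component-wise weighted geometric mean of triangular fuzzy numbers."""
    f = np.asarray(factors, dtype=float)        # shape (n_factors, 3)
    w = np.asarray(weights, dtype=float)[:, None]
    return np.prod(f ** w, axis=0)              # still a triangular number

def defuzzify(tfn):
    """Graded mean integration (l + 4m + u) / 6."""
    l, m, u = tfn
    return (l + 4.0 * m + u) / 6.0

severity, occurrence, detection = (6, 7, 8), (3, 4, 5), (5, 6, 7)  # hypothetical
frpn = fwgm([severity, occurrence, detection], [0.4, 0.35, 0.25])
print("fuzzy RPN:", frpn.round(3), "crisp:", round(defuzzify(frpn), 2))
```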
Wang, Yan; Zhu, Wenhui; Duan, Xingxing; Zhao, Yongfeng; Liu, Wengang; Li, Ruizhen
2011-04-01
To evaluate intraventricular systolic dyssynchrony in rats with post-infarction heart failure by quantitative tissue velocity imaging combined with synchronous electrocardiography. A total of 60 male SD rats were randomly assigned to 3 groups: a 4-week post-operative group and an 8-week post-operative group (each n=25, with the anterior descending branch of the left coronary artery ligated), and a sham operation group (n=10, with thoracotomy and opened pericardium, but no ligation of the artery). The time to peak systolic velocity of regional myocardium was measured and an index of left intraventricular dyssynchrony was calculated. All indexes of heart function became lower as the heart failure worsened, except the left ventricle index in the post-operative groups. All indexes of dyssynchrony lengthened in the post-operative groups (P<0.05), while the changes in the sham operation group were not significant (P>0.05). Quantitative tissue velocity imaging combined with synchronous electrocardiography can analyse intraventricular systolic dyssynchrony accurately.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
Pottecher, Pierre; Engelke, Klaus; Duchemin, Laure; Museyko, Oleg; Moser, Thomas; Mitton, David; Vicaut, Eric; Adams, Judith; Skalli, Wafa; Laredo, Jean Denis; Bousson, Valérie
2016-09-01
Purpose To evaluate the performance of three imaging methods (radiography, dual-energy x-ray absorptiometry [DXA], and quantitative computed tomography [CT]) and that of a numerical analysis with finite element modeling (FEM) in the prediction of failure load of the proximal femur, and to identify the best densitometric or geometric predictors of hip failure load. Materials and Methods Institutional review board approval was obtained. A total of 40 pairs of excised cadaver femurs (mean patient age at time of death, 82 years ± 12 [standard deviation]) were examined with (a) radiography to measure geometric parameters (lengths, angles, and cortical thicknesses), (b) DXA (reference standard) to determine areal bone mineral densities (BMDs), (c) quantitative CT with dedicated three-dimensional analysis software to determine volumetric BMDs and geometric parameters (neck axis length, cortical thicknesses, volumes, and moments of inertia), and (d) quantitative CT-based FEM to calculate a numerical value of failure load. The 80 femurs were fractured via mechanical testing, with random assignment of one femur from each pair to the single-limb stance configuration (hereafter, stance configuration) and assignment of the paired femur to the sideways fall configuration (hereafter, side configuration). Descriptive statistics, univariate correlations, and stepwise regression models were obtained for each imaging method and for FEM to enable us to predict failure load in both configurations. Results Statistics reported are for stance and side configurations, respectively. For radiography, the strongest correlation with mechanical failure load was obtained by using a geometric parameter combined with a cortical thickness (r² = 0.66, P < .001; r² = 0.65, P < .001). For DXA, the strongest correlation with mechanical failure load was obtained by using total BMD (r² = 0.73, P < .001) and trochanteric BMD (r² = 0.80, P < .001). For quantitative CT, in both configurations, the best model combined volumetric BMD and a moment of inertia (r² = 0.78, P < .001; r² = 0.85, P < .001). FEM explained 87% (P < .001) and 83% (P < .001) of bone strength, respectively. By combining (a) radiography and DXA and (b) quantitative CT and DXA, correlations with mechanical failure load increased to 0.82 (P < .001) and 0.84 (P < .001), respectively, for radiography and DXA, and to 0.80 (P < .001) and 0.86 (P < .001), respectively, for quantitative CT and DXA. Conclusion Quantitative CT-based FEM was the best method with which to predict the experimental failure load; however, combining quantitative CT and DXA yielded a performance as good as that attained with FEM. The quantitative CT/DXA combination may be easier to use in fracture prediction, provided standardized software is developed. These findings also highlight the major influence on femoral failure load, particularly in the trochanteric region, of a densitometric parameter combined with a geometric parameter. © RSNA, 2016 Online supplemental material is available for this article.
NASA Technical Reports Server (NTRS)
Motyka, P.
1983-01-01
A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined, based upon a candidate redundancy management system that can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate; the gyro and accelerometer failure rates together; false alarms; the probabilities of failure detection, failure isolation, and damage effects; and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
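The flavor of such a Markov evaluation model can be shown with a toy three-state chain (a stand-in for the paper's 27-state model), where imperfect failure detection routes some sensor failures directly to system loss; the rates and coverage value below are hypothetical.

```python
# Toy 3-state Markov reliability sketch (stand-in for the 27-state model):
# state 0 = fully operational, 1 = one sensor failed & isolated, 2 = system loss.
import numpy as np
from scipy.linalg import expm

lam = 1e-4      # per-hour sensor failure rate (hypothetical)
cov = 0.95      # probability a failure is detected and isolated (hypothetical)

# Generator matrix Q: rows sum to zero; undetected failures go straight to loss.
Q = np.array([
    [-(2 * lam),  2 * lam * cov,  2 * lam * (1 - cov)],
    [0.0,        -lam,            lam],
    [0.0,         0.0,            0.0],   # absorbing failure state
])

t = 10.0                                   # mission time, hours
p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)
print(f"reliability at t={t:g} h: {1 - p[2]:.8f}")
```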
A novel strategy for rapid detection of NT-proBNP
NASA Astrophysics Data System (ADS)
Cui, Qiyao; Sun, Honghao; Zhu, Hui
2017-09-01
To establish a simple, rapid, sensitive, and specific quantitative assay for the biomarkers of heart failure, this study employed biotin-streptavidin technology with a fluorescence immunochromatographic assay to measure biomarker concentrations in serum. The method was applied to detect NT-proBNP, which is valuable for the diagnostic evaluation of heart failure.
Rock Slide Risk Assessment: A Semi-Quantitative Approach
NASA Astrophysics Data System (ADS)
Duzgun, H. S. B.
2009-04-01
Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are the hazard, the elements at risk, and their vulnerability. For a quantitative or semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. The quantitative evaluation of rock slide risk enables comparison of the computed risk with the risk of other natural and/or human-made hazards, providing better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, identification of elements at risk, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analyses of elements at risk and vulnerability, and risk assessment. Implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from Mount Ramnefjell into lake Loen are considered one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in western Norway. Mount Ramnefjell is heavily jointed, leading to the formation of vertical rock slices with heights between 400 and 450 m and widths between 7 and 10 m. These slices threaten the settlements around the Loen Valley and the tourists visiting the fjord during the summer season, as released slides have the potential of creating a tsunami. Several rock slides were recorded at Mount Ramnefjell between 1905 and 1950. Among them, four slides caused tsunami waves that washed up to 74 m above the lake level, and two of the slides resulted in many fatalities in the inner part of the Loen Valley as well as great damage. There are three predominant joint structures in Mount Ramnefjell, which control failure and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and third joint sets are almost perpendicular and parallel to the mountainside and scarp, respectively. These three joint sets form slices of rock columns with widths ranging between 7 and 10 m and heights of 400-450 m. The joints in set II are reported to be open by 1-2 m, which may allow collection of water during heavy rainfall or snowmelt, causing the slices to be pressed out. Water in the vertical joints is estimated both to reduce the shear strength of the sliding plane and to reduce the normal stress on the sliding plane through the formation of an uplift force. Hence, rock slides at Mount Ramnefjell occur in a plane failure mode. The quantitative evaluation of rock slide risk requires probabilistic analysis of rock slope stability and identification of the consequences should a rock slide occur. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM).
Then, to use the calculated probability of failure (Pf) in risk analyses, this Pf must be associated with a frequency-based probability (i.e. Pf per year), since the computed failure probability is a measure of hazard and not of risk unless it is associated with the consequences of failure. This can be done either by considering the time-dependent behavior of the basic variables in the probabilistic models or by associating the computed Pf with the frequency of failures in the region. In this study, the frequency of rock slides at Ramnefjell over the previous century is used to derive the frequency-based probability used in the risk assessment. The major consequence of a rock slide is the generation of a tsunami in lake Loen, causing inundation of the residential areas around the lake. Risk is assessed by adapting the damage probability matrix approach, originally developed for the risk assessment of buildings in earthquakes.
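As a hedged illustration of the probabilistic stability step, the sketch below replaces FORM with plain Monte Carlo sampling of a planar-sliding limit state and then scales the result by the historical slide frequency; every parameter value is hypothetical rather than taken from the Ramnefjell analysis.

```python
# Crude Monte Carlo check on a plane-failure limit state (the paper uses FORM;
# plain sampling is shown here). All parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
phi = np.radians(rng.normal(32.0, 3.0, n))   # friction angle, degrees -> rad
c = rng.lognormal(np.log(60e3), 0.3, n)      # cohesion, Pa
beta = np.radians(35.0)                      # sliding-plane dip
W = 5.0e8                                    # slice weight, N
A = 4000.0                                   # sliding-plane area, m^2
U = rng.uniform(0.0, 0.3, n) * W             # uplift from water-filled joints

# Factor of safety for planar sliding; failure when FS < 1.
resisting = c * A + (W * np.cos(beta) - U) * np.tan(phi)
driving = W * np.sin(beta)
pf = np.mean(resisting / driving < 1.0)

slides_per_century = 4                       # observed frequency, from the record
print(f"Pf = {pf:.3e}; annualized ~ {pf * slides_per_century / 100:.2e}/yr")
```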
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.
2011-08-01
Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
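For context, the classical FFM linearization that the authors critique looks like the following sketch: fit a straight line to inverse rate versus time by ordinary least squares and extrapolate to zero (the paper's point is that a GLM with an appropriate error distribution should replace this fit). The synthetic data and constants are illustrative only.

```python
# Failure Forecast Method sketch: fit a line to inverse event rate vs. time and
# extrapolate to 1/rate = 0 to forecast failure time (synthetic data below).
import numpy as np

tf_true, k = 100.0, 500.0
t = np.linspace(0.0, 90.0, 30)
rate = k / (tf_true - t)                                      # power-law rate
rate *= np.random.default_rng(1).lognormal(0.0, 0.1, t.size)  # noise

slope, intercept = np.polyfit(t, 1.0 / rate, 1)
t_forecast = -intercept / slope              # where the fitted line hits zero
print(f"forecast failure time: {t_forecast:.1f} (true: {tf_true})")
```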
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when the information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
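A minimal sketch of the PFA idea under stated assumptions: parameter and model uncertainty in a Basquin-type fatigue-life relation are propagated by Monte Carlo to a failure-probability estimate for several mission lengths. The distributions, constants, and stress level are hypothetical, and the Bayesian updating against test or flight experience is omitted.

```python
# PFA-flavoured sketch: propagate parameter uncertainty through a fatigue-life
# model to failure probabilities (all distributions hypothetical).
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
# Basquin-type life model N = A * S**(-b) with uncertain A, b and scatter.
A = rng.lognormal(np.log(1e12), 0.4, n)             # model parameter uncertainty
b = rng.normal(3.0, 0.15, n)
S = 300.0                                           # stress amplitude, MPa
life = A * S ** (-b) * rng.lognormal(0.0, 0.3, n)   # cycles to failure

for mission in (1e3, 1e4, 1e5):
    print(f"P(failure before {mission:.0e} cycles) = {np.mean(life < mission):.4f}")
```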
Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics
NASA Technical Reports Server (NTRS)
Vary, A.
1980-01-01
Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.
NASA Technical Reports Server (NTRS)
Packard, Michael H.
2002-01-01
Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic, and electromechanical component failure modes be effectively combined into a top-level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (e.g., the Quantitative Risk Assessment Software, QRAS). Hypothetical PSA results for a number of structural components, along with mitigation factors that would restrict a failure mode from propagating to a Loss of Mission (LOM) failure, were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to overall Mission Success (MS) are also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes, the application of different maintenance intervals, the inclusion of new sensor detection of faults, and other upgrades were evaluated in determining overall turbine engine reliability.
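A hedged sketch of the roll-up described above: per-failure-mode times to failure (Weibull draws) are combined with mitigation probabilities into a system loss-of-mission estimate by Monte Carlo. The mode names, parameters, and mitigation factors are invented, and QRAS itself is not used here.

```python
# Sketch: combine per-failure-mode time-to-failure draws into a system-level
# loss-of-mission (LOM) probability with mitigation factors (values made up).
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
mission_hours = 5_000.0

# (Weibull shape, scale in hours, P(mode propagates to LOM if it occurs))
modes = {
    "blade_fatigue":  (2.5, 40_000.0, 0.30),
    "disk_creep":     (3.0, 90_000.0, 0.50),
    "controller_eee": (1.0, 60_000.0, 0.10),   # electronics: exponential-like
}

lom = np.zeros(n, dtype=bool)
for shape, scale, p_propagate in modes.values():
    ttf = scale * rng.weibull(shape, n)        # time to failure for this mode
    occurs = ttf < mission_hours
    lom |= occurs & (rng.random(n) < p_propagate)

print(f"P(LOM) per mission: {lom.mean():.5f}")
```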
Grant, Joan S; Graven, Lucinda J
2018-04-01
The purpose of this review was to examine and synthesize recent literature regarding problems experienced by informal caregivers when providing care for individuals with heart failure in the home. Integrative literature review. A review of current empirical literature was conducted utilizing PubMed, CINAHL, Embase, Sociological Abstracts, Social Sciences Full Text, PsycARTICLES, PsycINFO, Health Source: Nursing/Academic Edition, and Cochrane computerized databases. Nineteen qualitative, 16 quantitative, and 2 mixed-methods studies met the inclusion criteria for review. Computerized databases were searched for a combination of subject terms (i.e., MeSH) and keywords related to informal caregivers, problems, and heart failure. The titles and abstracts of identified articles and reference lists were reviewed. Studies were included if they were published in English between January 2000 and December 2016 and examined problems experienced by informal caregivers in providing care for individuals with heart failure in the home. Studies were excluded if not written in English or if elements of caregiving in heart failure were not present in the title, abstract, or text. Unpublished and duplicate empirical literature, as well as articles related to specific end-stage heart failure populations, were also excluded. The methodology described by Cooper and others for integrative reviews of quantitative and qualitative research was used. The quality of the included studies was appraised using the Joanna Briggs Institute critical appraisal tools for cross-sectional quantitative and qualitative studies. Informal caregivers experienced four key problems when providing care for individuals with heart failure in the home: performing multifaceted activities and roles that evolve around daily heart failure demands; maintaining caregiver physical, emotional, social, spiritual, and financial well-being; having insufficient caregiver support; and performing caregiving with uncertainty and inadequate knowledge. Informal caregivers of individuals with heart failure experience complex problems when providing care in the home, and these impact all aspects of their lives. Incorporating advice from informal caregivers of individuals with heart failure will assist in the development of interventions to reduce negative caregiver outcomes. Given the complex roles in caring for individuals with heart failure, multicomponent interventions are potentially promising in assisting informal caregivers in performing these roles. Published by Elsevier Ltd.
SPECT and PET in ischemic heart failure.
Angelidis, George; Giamouzis, Gregory; Karagiannis, Georgios; Butler, Javed; Tsougos, Ioannis; Valotassiou, Varvara; Giannakoulas, George; Dimakopoulos, Nikolaos; Xanthopoulos, Andrew; Skoularigis, John; Triposkiadis, Filippos; Georgoulias, Panagiotis
2017-03-01
Heart failure is a common clinical syndrome associated with significant morbidity and mortality worldwide. Ischemic heart disease is the leading cause of heart failure, at least in the industrialized countries. Proper diagnosis of the syndrome and management of patients with heart failure require anatomical and functional information obtained through various imaging modalities. Nuclear cardiology techniques play a central role in the evaluation of heart failure. Myocardial single photon emission computed tomography (SPECT) with thallium-201 or technetium-99m-labelled tracers offers valuable data regarding ventricular function, myocardial perfusion, viability, and intraventricular synchronism. Moreover, positron emission tomography (PET) permits accurate evaluation of myocardial perfusion, metabolism, and viability, providing high-quality images and the ability of quantitative analysis. As these imaging techniques assess different parameters of cardiac structure and function, variations of sensitivity and specificity have been reported among them. In addition, the role of SPECT- and PET-guided therapy remains controversial. In this comprehensive review, we address these controversies and report the advances in patient investigation with SPECT and PET in ischemic heart failure. Furthermore, we present the innovations in technology that are expected to strengthen the role of nuclear cardiology modalities in the investigation of heart failure.
Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on chip
NASA Astrophysics Data System (ADS)
Du, Xuecheng; He, Chaohui; Liu, Shuhuan; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang
2016-09-01
Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. In order to evaluate the system-on-chip's reliability and soft errors, the fault tree analysis method was used in this work. The system fault tree was constructed based on the Xilinx Zynq-7010 All Programmable SoC. Moreover, the soft error rates of different components in the Zynq-7010 SoC were measured with an americium-241 alpha radiation source. Furthermore, parameters used to evaluate the system's reliability and safety, such as the failure rate, unavailability, and mean time to failure (MTTF), were calculated using Isograph Reliability Workbench 11.0. Based on the fault tree analysis of the system-on-chip, the critical blocks and the system reliability were evaluated through qualitative and quantitative analysis.
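The reliability figures named in the abstract reduce, for a single constant-failure-rate block, to short closed forms; the sketch below shows them with hypothetical rates (Isograph Reliability Workbench would evaluate the full fault tree).

```python
# Sketch of the reliability figures named in the abstract, for one block with
# constant failure rate lam and repair rate mu (values hypothetical).
lam = 2e-5   # failures per hour
mu = 1e-1    # repairs per hour

mttf = 1.0 / lam                         # mean time to failure
unavailability = lam / (lam + mu)        # steady-state, repairable component
print(f"MTTF = {mttf:.3e} h, unavailability = {unavailability:.3e}")

# Two such blocks feeding an OR gate (either failing fails the system):
q = unavailability
print(f"system unavailability (OR of two): {1 - (1 - q) ** 2:.3e}")
```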
Gallagher, Joseph; James, Stephanie; Keane, Ciara; Fitzgerald, Annie; Travers, Bronagh; Quigley, Etain; Hecht, Christina; Zhou, Shuaiwei; Watson, Chris; Ledwidge, Mark; McDonald, Kenneth
2017-08-01
We undertook a mixed-methods evaluation of a Web-based conferencing service (virtual consult) between general practitioners (GPs) and cardiologists for managing patients with heart failure in the community, to determine its effect on the use of specialist heart failure services and its acceptability to GPs. All cases from June 2015 to October 2016 were recorded using a standardized recording template, which captured patient demographics, medical history, medications, and the outcome of the virtual consult for each case. Quantitative surveys and qualitative interviews of 17 participating GPs were also undertaken. During this time, 142 cases were discussed: 68 relating to a new diagnosis of heart failure, 53 relating to emerging deterioration in a known heart failure patient, and 21 relating to therapeutic issues. Only 17% required review in the outpatient department following the virtual consultation. GPs reported increased confidence in heart failure management, a broadening of their knowledge base, and a perception of overall better patient outcomes. These data from an initial experience with Heart Failure Virtual Consultation present a very positive impact of this strategy on the provision of heart failure care in the community and its acceptability to users. Further research on the implementation and expansion of this strategy is warranted. © 2017 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.
Retrospective Analysis of a Classical Biological Control Programme
USDA-ARS's Scientific Manuscript database
1. Classical biological control has been a key technology in the management of invasive arthropod pests globally for over 120 years, yet rigorous quantitative evaluations of programme success or failure are rare. Here, I used life table and matrix model analyses, and life table response experiments ...
A novel approach for evaluating the risk of health care failure modes.
Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang
2012-12-01
Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of health care failure modes and taking priority action for safety improvement. The purpose of this paper is to propose a novel approach to this analysis: integrating FMEA with a mathematical tool, data envelopment analysis (DEA) with the slack-based measure (SBM). The risk indexes of FMEA (severity, occurrence, and detection) are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach are illustrated by a health care case. As a systematic approach for improving the service quality of health care, it can offer quantitative corrective information on the risk indexes that thereafter reduces failure possibility, and these new targets for the risk indexes can be used for management by objectives. FMEA alone cannot provide such quantitative corrective information; the proposed approach overcomes this chief shortcoming. By combining the DEA SBM model with FMEA, the two goals of increased patient safety and reduced medical cost can be achieved together.
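A rough sense of the DEA step can be given in code. The paper uses the slack-based measure; the sketch below substitutes the simpler input-oriented CCR model (a plain radial DEA), with hypothetical severity/occurrence/detection scores as inputs and a dummy unit output per failure mode. Slacks from the same solution would supply the corrective targets for the risk indexes.

```python
# Hedged stand-in: input-oriented CCR DEA (not the paper's SBM model), with
# FMEA risk indexes as inputs and a unit output (data hypothetical).
import numpy as np
from scipy.optimize import linprog

X = np.array([[7, 4, 9, 5],            # severity for 4 failure modes
              [6, 3, 8, 2],            # occurrence
              [5, 6, 7, 4]], float)    # detection
Y = np.ones((1, X.shape[1]))           # dummy unit output per failure mode

def ccr_efficiency(o):
    """Solve min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o, lam >= 0."""
    m, n = X.shape
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[:, o], X]                       # X @ lam - theta*x_o <= 0
    A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]    # -(Y @ lam) <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(X.shape[1]):
    print(f"failure mode {o}: efficiency = {ccr_efficiency(o):.3f}")
```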
Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures
NASA Astrophysics Data System (ADS)
Ju, H. S.; Tittmann, B. R.
2010-02-01
A poly-methyl-methacrylate (PMMA) film on a silicon substrate is a key structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorous quantitative evaluation of interfacial weakness. The model is implemented in the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results predict that variations in the SAW velocity and reflectance are sensitive to the adhesion condition. Experimental results by the v(z) technique and SAW velocity reconstruction verify the prediction.
Materials testing protocol for small joint prostheses.
Savory, K M; Hutchinson, D T; Bloebaum, R
1994-10-01
In this article, a protocol for the evaluation of new materials for small joint prostheses is introduced. The testing methods employed in the protocol were developed by reviewing reported clinical failure modes and conditions found in vivo. The methods developed quantitatively evaluate the fatigue, fatigue crack propagation, and wear resistance properties of materials. For this study, a silicone elastomer similar to Dow Corning Silastic HP100, a radiation-stable polypropylene, and a copolymer of polypropylene and ethylene propylene-diene monomer (EPDM) were evaluated. None of the materials tested demonstrated the ideal properties sought in a self-hinging joint prosthesis. The silicone elastomer had excellent wear properties; however, cracks propagated quickly, causing catastrophic failure when fatigued. Conversely, the copolymer showed excellent fatigue crack propagation resistance but less favorable wear properties. The polypropylene did not perform well in any evaluation.
Boyce, B. L.; Kramer, S. L. B.; Bosiljevac, T. R.; ...
2016-03-14
Ductile failure of structural metals is relevant to a wide range of engineering scenarios. Computational methods are employed to anticipate the critical conditions of failure, yet they sometimes provide inaccurate and misleading predictions. Challenge scenarios, such as the one presented in the current work, provide an opportunity to assess the blind, quantitative predictive ability of simulation methods against a previously unseen failure problem. Instead of evaluating the predictions of a single simulation approach, the Sandia Fracture Challenge relied on numerous volunteer teams with expertise in computational mechanics to apply a broad range of computational methods, numerical algorithms, and constitutive models to the challenge. This exercise is intended to evaluate the state of health of technologies available for failure prediction. In the first Sandia Fracture Challenge, a wide range of issues were raised in ductile failure modeling, including a lack of consistency in failure models, the importance of shear calibration data, and difficulties in quantifying the uncertainty of prediction [see Boyce et al. (Int J Fract 186:5–68, 2014) for details of these observations]. This second Sandia Fracture Challenge investigated the ductile rupture of a Ti–6Al–4V sheet under both quasi-static and modest-rate dynamic loading (failure in ~ 0.1 s). Like the previous challenge, the sheet had an unusual arrangement of notches and holes that added geometric complexity and fostered a competition between tensile- and shear-dominated failure modes. The teams were asked to predict the fracture path and quantitative far-field failure metrics such as the peak force and displacement to cause crack initiation. Fourteen teams contributed blind predictions, and the experimental outcomes were quantified in three independent test labs. In addition, shortcomings were revealed in this second challenge such as inconsistency in the application of appropriate boundary conditions, need for a thermomechanical treatment of the heat generation in the dynamic loading condition, and further difficulties in model calibration based on limited real-world engineering data. As with the prior challenge, this work not only documents the 'state-of-the-art' in computational failure prediction of ductile tearing scenarios, but also provides a detailed dataset for non-blind assessment of alternative methods.
Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508
NASA Astrophysics Data System (ADS)
Hayek, A.; Bokhaiti, M. Al; Schwarz, M. H.; Boercsoek, J.
2012-05-01
With the publication and enforcement of the standard IEC 61508 for safety-related systems, new system architectures have been presented and evaluated. Among a number of techniques and measures for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and the mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 (one-out-of-four) architecture presented in recent work, and sophisticated calculations for the required parameters are introduced. The 1oo4 architecture is an advanced safety architecture based on on-chip redundancy and is 3-failure safe: at least one of the four channels has to work correctly in order to trigger the safety function.
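Ignoring common-cause failures, diagnostics, and repair, the textbook 1ooN approximation gives a quick order-of-magnitude check on the architecture (the paper's sophisticated calculation adds exactly those neglected terms); the rates below are hypothetical.

```python
# Back-of-envelope 1oo4 PFD_avg sketch, ignoring common-cause failures and
# diagnostic coverage (the paper's full calculation includes those terms).
lam_du = 1e-6        # dangerous undetected failure rate, per hour (hypothetical)
t_proof = 8760.0     # proof-test interval, hours (one year)

# Simplified 1ooN approximation: PFD_avg ~ (lam_du * T)**N / (N + 1)
n = 4
pfd_avg = (lam_du * t_proof) ** n / (n + 1)
print(f"1oo4 PFD_avg ~ {pfd_avg:.3e}")   # compare against IEC 61508 SIL bands
```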
Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W
2016-04-01
Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and a Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
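A minimal sketch of the simulation idea, assuming a made-up response surface for one critical quality attribute: sample residual uncertainty at each candidate set point and report the probability of failing the specification, which is the quantity used to draw probabilistic design-space boundaries.

```python
# Sketch of the Monte Carlo step: sample parameter/model uncertainty, push it
# through a (hypothetical) response model, and map P(fail spec) across the
# candidate design space. Names and numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(4)

def dissolution(temp, speed, noise):
    """Stand-in response surface for a critical quality attribute."""
    return 70 + 0.8 * (temp - 50) + 0.05 * (speed - 200) + noise

spec_lo = 75.0                         # acceptance limit for the CQA
for temp in (45, 50, 55):              # candidate set points
    for speed in (150, 200, 250):
        noise = rng.normal(0.0, 3.0, 50_000)   # residual/model uncertainty
        cqa = dissolution(temp, speed, noise)
        print(f"T={temp} rpm={speed}: P(fail) = {np.mean(cqa < spec_lo):.3f}")
```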
Rezaei, Fatemeh; Yarmohammadian, Mohmmad H.; Haghshenas, Abbas; Fallah, Ali; Ferdosi, Masoud
2018-01-01
Background: The methodology of Failure Mode and Effects Analysis (FMEA) is known as an important risk assessment tool and an accreditation requirement for many organizations. For prioritizing failures, the risk priority number (RPN) index is used, valued especially for its ease of use and its subjective evaluations of the occurrence, severity, and detectability of each failure. In this study, we have tried to make the FMEA model more compatible with health-care systems by redefining the RPN index to be closer to reality. Methods: We used a combined quantitative and qualitative approach in this research. In the qualitative domain, focus group discussions were used to collect data; a quantitative approach was used to calculate the RPN score. Results: We studied the patient's journey in the surgery ward from the holding area to the operating room. The highest-priority failures were determined by (1) defining inclusion criteria as severity of the incident (clinical effect, claim consequence, waste of time, and financial loss), occurrence of the incident (time-unit occurrence and degree of exposure to risk), and preventability (degree of preventability and defensive barriers), and then (2) quantifying the risk priority criteria using the RPN index (361 for the highest-rated failure). Reassessment of the improved RPN scores by root cause analysis showed some variation. Conclusions: We conclude that standard criteria should be developed consistent with clinical language and the specific scientific field. Therefore, cooperation and partnership of technical and clinical groups are necessary to modify these models. PMID:29441184
33 CFR 154.804 - Review, certification, and initial inspection.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...
33 CFR 154.804 - Review, certification, and initial inspection.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...
33 CFR 154.804 - Review, certification, and initial inspection.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...
33 CFR 154.804 - Review, certification, and initial inspection.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., property, and the environment if an accident were to occur; and (4) If a quantitative failure analysis is... quantitative failure analysis. (e) The certifying entity must conduct all initial inspections and witness all...
Predictive factors for renal failure and a control and treatment algorithm
Cerqueira, Denise de Paula; Tavares, José Roberto; Machado, Regimar Carla
2014-01-01
Objectives: to evaluate the renal function of patients in an intensive care unit, to identify the predisposing factors for the development of renal failure, and to develop an algorithm to help in the control of the disease. Method: exploratory, descriptive, prospective study with a quantitative approach. Results: a total of 30 patients (75.0%) were diagnosed with kidney failure and the main factors associated with this disease were: advanced age, systemic arterial hypertension, diabetes mellitus, lung diseases, and antibiotic use. Of these, 23 patients (76.6%) showed a reduction in creatinine clearance in the first 24 hours of hospitalization. Conclusion: a decline in renal function was observed in a significant number of subjects; therefore, an algorithm was developed with the aim of helping in the control of renal failure in a practical and functional way. PMID:26107827
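The article's algorithm itself is not reproduced in the abstract; as a stand-in, the sketch below flags a falling creatinine clearance over serial measurements using the standard Cockcroft-Gault estimate, with a hypothetical patient and an arbitrary 25% alert threshold.

```python
# Stand-in sketch, not the article's algorithm: flag falling renal function
# from serial creatinine using the standard Cockcroft-Gault estimate.
def cockcroft_gault(age, weight_kg, scr_mg_dl, female):
    """Estimated creatinine clearance, mL/min."""
    ccl = (140 - age) * weight_kg / (72.0 * scr_mg_dl)
    return ccl * 0.85 if female else ccl

# Serial serum creatinine over the first 24 h of ICU stay (hypothetical patient).
readings = [1.0, 1.3, 1.6]
ccls = [cockcroft_gault(70, 80, scr, female=False) for scr in readings]
drop = (ccls[0] - ccls[-1]) / ccls[0]
print([round(c, 1) for c in ccls], f"relative drop: {drop:.0%}")
if drop > 0.25:   # arbitrary illustrative threshold
    print("flag: falling clearance -> review nephrotoxins, fluids, dosing")
```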
33 CFR 154.2020 - Certification and recertification-owner/operator responsibilities.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Procedures,” and in Military Standard MIL-STD-882B for a quantitative failure analysis. For assistance in... quantitative failure analysis is also conducted, the level of safety attained is at least one order of...
NASA Astrophysics Data System (ADS)
Wu, Li; Adoko, Amoussou Coffi; Li, Bo
2018-04-01
In tunneling, quantitatively determining the rock mass strength parameters of the Hoek-Brown (HB) failure criterion is useful, since it can improve the reliability of tunnel support system design. In this study, a quantitative method is proposed to determine the rock mass quality parameters of the HB failure criterion, namely the Geological Strength Index (GSI) and the disturbance factor (D), based on the structure of the drilling core and the weathering condition of the rock mass, combined with an acoustic wave test, to calculate the strength of the rock mass. The Rock Mass Structure Index and the Rock Mass Weathering Index are used to quantify the GSI, while the longitudinal wave velocity (Vp) is employed to derive the value of D. The DK383+338 tunnel face of the Yaojia tunnel on the Shanghai-Kunming passenger dedicated line serves as an illustration of how the methodology is implemented. The values of GSI and D are obtained using the HB criterion and then using the proposed method, and the measured in situ stress is used to evaluate their accuracy. To this end, the major and minor principal stresses are calculated from the GSI and D given by the HB criterion and by the proposed method. The results indicate that both methods are close to the field observations, which suggests that the proposed method can likewise be used to determine the rock mass quality parameters quantitatively. However, these results remain valid only for rock mass quality and rock types similar to those of the DK383+338 tunnel face of the Yaojia tunnel.
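Once GSI and D are fixed, the published Hoek-Brown relations give the rock mass strength directly; the sketch below implements those standard equations with hypothetical inputs (not the Yaojia tunnel values).

```python
# Hoek-Brown strength sketch from GSI and D (standard published relations;
# the inputs below are hypothetical, not the Yaojia tunnel values).
import math

def hoek_brown(sigma3, sigma_ci, mi, gsi, d):
    """Major principal stress at failure, MPa, per the HB criterion."""
    mb = mi * math.exp((gsi - 100.0) / (28.0 - 14.0 * d))
    s = math.exp((gsi - 100.0) / (9.0 - 3.0 * d))
    a = 0.5 + (math.exp(-gsi / 15.0) - math.exp(-20.0 / 3.0)) / 6.0
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# Strength at a confining stress of 2 MPa for a moderately disturbed rock mass:
sigma1 = hoek_brown(sigma3=2.0, sigma_ci=80.0, mi=10.0, gsi=55.0, d=0.3)
print(f"sigma1 at failure ~ {sigma1:.1f} MPa")
```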
Quantitative ultrasonic evaluation of mechanical properties of engineering materials
NASA Technical Reports Server (NTRS)
Vary, A.
1978-01-01
Current progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength properties of engineering materials is reviewed. Even where conventional NDE techniques have shown that a part is free of overt defects, advanced NDE techniques should be available to confirm the material properties assumed in the part's design. There are many instances where metallic, composite, or ceramic parts may be free of critical defects while still being susceptible to failure under design loads due to inadequate or degraded mechanical strength. This must be considered in any failure prevention scheme that relies on fracture analysis. This review will discuss the availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions.
A finite element evaluation of the moment arm hypothesis for altered vertebral shear failure force.
Howarth, Samuel J; Karakolis, Thomas; Callaghan, Jack P
2015-01-01
The mechanism of vertebral shear failure is likely a bending moment generated about the pars interarticularis by facet contact, and the moment arm length (MAL) between the centroid of facet contact and the location of pars interarticularis failure has been hypothesised to be an influential modulator of shear failure force. To quantitatively evaluate this hypothesis, anterior shear of C3 over C4 was simulated in a finite element model of the porcine C3-C4 vertebral joint with each combination of five compressive force magnitudes (0-60% of estimated compressive failure force) and three postures (flexed, neutral and extended). Bilateral locations of peak stress within C3's pars interarticularis were identified along with the centroids of contact force on the inferior facets. These measurements were used to calculate the MAL of facet contact force. Changes in MAL were also related to shear failure forces measured from similar in vitro tests. Flexed and extended vertebral postures respectively increased and decreased the MAL by 6.6% and 4.8%. The MAL decreased by only 2.6% from the smallest to the largest compressive force. Furthermore, altered MAL explained 70% of the variance in measured shear failure force from comparable in vitro testing with larger MALs being associated with lower shear failure forces. Our results confirmed that the MAL is indeed a significant modulator of vertebral shear failure force. Considering spine flexion is necessary when assessing low-back shear injury potential because of the association between altered facet articulation and lower vertebral shear failure tolerance.
NASA Astrophysics Data System (ADS)
Telesman, J.; Smith, T. M.; Gabb, T. P.; Ring, A. J.
2018-06-01
Cyclic near-threshold fatigue crack growth (FCG) behavior of two disk superalloys was evaluated and was shown to exhibit an unexpected sudden failure mode transition from a mostly transgranular failure mode at higher stress intensity factor ranges to an almost completely intergranular failure mode in the threshold regime. The change in failure modes was associated with a crossover of FCG resistance curves in which the conditions that produced higher FCG rates in the Paris regime resulted in lower FCG rates and increased ΔKth values in the threshold region. High-resolution scanning and transmission electron microscopy were used to carefully characterize the crack tips at these near-threshold conditions. Formation of stable Al-oxide followed by Cr-oxide and Ti-oxides was found to occur at the crack tip prior to formation of unstable oxides. To contrast with the threshold failure mode regime, a quantitative assessment of the role that the intergranular failure mode has on cyclic FCG behavior in the Paris regime was also performed. It was demonstrated that even a very limited intergranular failure content dominates the FCG response under mixed mode failure conditions.
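For the Paris-regime behavior discussed above, crack growth integrates directly from the power law; the sketch below steps da/dN = C·ΔK^m cycle by cycle for a unit-geometry-factor center crack, with hypothetical constants rather than the superalloy data.

```python
# Paris-regime crack-growth sketch: integrate da/dN = C * dK**m with a simple
# through-crack geometry factor (material constants hypothetical).
import math

C, m = 1e-11, 3.2          # Paris constants (da/dN in m/cycle, dK in MPa*sqrt(m))
d_sigma = 400.0            # stress range, MPa
a, a_final, n = 0.5e-3, 5e-3, 0

while a < a_final:
    dk = d_sigma * math.sqrt(math.pi * a)     # dK for a center crack, Y = 1
    a += C * dk ** m                          # one-cycle increment
    n += 1
print(f"cycles from 0.5 mm to 5 mm: {n:,}")
```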
Comparison of three commercially available fit-test methods.
Janssen, Larry L; Luinenburg, D Michael; Mullins, Haskell E; Nelson, Thomas J
2002-01-01
American National Standards Institute (ANSI) standard Z88.10, Respirator Fit Testing Methods, includes criteria to evaluate new fit-tests. The standard allows generated aerosol, particle counting, or controlled negative pressure quantitative fit-tests to be used as the reference method to determine acceptability of a new test. This study examined (1) comparability of three Occupational Safety and Health Administration-accepted fit-test methods, all of which were validated using generated aerosol as the reference method; and (2) the effect of the reference method on the apparent performance of a fit-test method under evaluation. Sequential fit-tests were performed using the controlled negative pressure and particle counting quantitative fit-tests and the bitter aerosol qualitative fit-test. Of 75 fit-tests conducted with each method, the controlled negative pressure method identified 24 failures; bitter aerosol identified 22 failures; and the particle counting method identified 15 failures. The sensitivity of each method, that is, agreement with the reference method in identifying unacceptable fits, was calculated using each of the other two methods as the reference. None of the test methods met the ANSI sensitivity criterion of 0.95 or greater when compared with either of the other two methods. These results demonstrate that (1) the apparent performance of any fit-test depends on the reference method used, and (2) the fit-tests evaluated use different criteria to identify inadequately fitting respirators. Although "acceptable fit" cannot be defined in absolute terms at this time, the ability of existing fit-test methods to reject poor fits can be inferred from workplace protection factor studies.
The Long-Term Effects of Florida's Third Grade Retention Policy
ERIC Educational Resources Information Center
Smith, Andre K.
2016-01-01
The purpose of this quantitative causal-comparative study was to evaluate the long-term effects of Florida's Third-Grade Retention policy on low performing students' subsequent academic performance as measured by FCAT reading scores. The study included a random stratified sample of 1500 retained third graders for failure to meet Florida's…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, Y.S.
1992-01-01
The failure behavior of composite laminates is modeled numerically using the Generalized Layerwise Plate Theory (GLPT) of Reddy and a progressive failure algorithm. The Layerwise Theory of Reddy assumes a piecewise continuous displacement field through the thickness of the laminate and therefore has the ability to capture the interlaminar stress fields near the free edges and cut outs more accurately. The progressive failure algorithm is based on the assumption that the material behaves like a stable progressively fracturing solid. A three-dimensional stiffness reduction scheme is developed and implemented to study progressive failures in composite laminates. The effect of various parameters such as out-of-plane material properties, boundary conditions, and stiffness reduction methods on the failure stresses and strains of a quasi-isotropic composite laminate with free edges subjected to tensile loading is studied. The ultimate stresses and strains predicted by the Generalized Layerwise Plate Theory (GLPT) and the more widely used First Order Shear Deformation Theory (FSDT) are compared with experimental results. The predictions of the GLPT are found to be in good agreement with the experimental results both qualitatively and quantitatively, while the predictions of FSDT are found to be different from experimental results both qualitatively and quantitatively. The predictive ability of various phenomenological failure criteria is evaluated with reference to the experimental results available in the literature. The effect of geometry of the test specimen and the displacement boundary conditions at the grips on the ultimate stresses and strains of a composite laminate under compressive loading is studied. The ultimate stresses and strains are found to be quite sensitive to the geometry of the test specimen and the displacement boundary conditions at the grips. The degree of sensitivity is observed to depend strongly on the lamination sequence.
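The essence of a progressive failure algorithm with stiffness reduction can be sketched briefly; the load-sharing rule, max-stress criterion, and material numbers below are simplifying assumptions for illustration, not the GLPT formulation:

```python
# Minimal progressive-failure loop: ply stresses from a stiffness-weighted
# load share, a max-stress failure criterion, and a stiffness-reduction
# scheme. All numbers are hypothetical, not material data from the report.
def ply_stresses(plies, load):
    total_EA = sum(p["E"] * p["area"] for p in plies)
    return [load * p["E"] / total_EA for p in plies]  # stress in each ply

def progressive_failure(plies, loads, reduction=0.01):
    for load in loads:
        changed = True
        while changed:                    # redistribute until stable
            changed = False
            for p, s in zip(plies, ply_stresses(plies, load)):
                if not p["failed"] and s > p["strength"]:
                    p["E"] *= reduction   # stable progressive fracturing
                    p["failed"] = True
                    changed = True
    return [p["failed"] for p in plies]

plies = [{"E": 70e9, "area": 1e-4, "strength": 600e6, "failed": False},
         {"E": 10e9, "area": 1e-4, "strength": 40e6,  "failed": False}]
print(progressive_failure(plies, loads=[40e3, 60e3, 80e3]))
```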
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
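The core statistical move, propagating parameter uncertainty through an analytical failure model into a failure-probability estimate, can be illustrated with a toy stress/strength Monte Carlo; the lognormal parameters are assumptions, not PFA inputs:

```python
import random

# Toy illustration: uncertain strength and uncertain driving load, each drawn
# from an assumed lognormal distribution, yield a failure-probability
# estimate for one failure mode. Parameters are invented.
random.seed(1)

def failure_probability(n=100_000):
    failures = 0
    for _ in range(n):
        strength = random.lognormvariate(6.0, 0.10)  # uncertain capability
        stress   = random.lognormvariate(5.8, 0.15)  # uncertain load
        if stress >= strength:                       # failure mode occurs
            failures += 1
    return failures / n

print(f"estimated P(failure) = {failure_probability():.4f}")
```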
Designing dual-plate meteoroid shields: A new analysis
NASA Technical Reports Server (NTRS)
Swift, H. F.; Bamford, R.; Chen, R.
1982-01-01
Physics governing ultrahigh velocity impacts onto dual-plate meteor armor is discussed. Meteoroid shield design methodologies are considered: failure mechanisms, qualitative features of effective meteoroid shield designs, evaluating/processing meteoroid threat models, and quantitative techniques for optimizing effective meteoroid shield designs. Related investigations are included: use of Kevlar cloth/epoxy panels in meteoroid shields for the Halley's Comet intercept vehicle, mirror exposure dynamics, and evaluation of ion fields produced around the Halley Intercept Mission vehicle by meteoroid impacts.
Ware, Patrick; Ross, Heather J; Cafazzo, Joseph A; Laporte, Audrey; Seto, Emily
2018-05-03
Meta-analyses of telemonitoring for patients with heart failure conclude that it can lower the utilization of health services and improve health outcomes compared with the standard of care. A smartphone-based telemonitoring program is being implemented as part of the standard of care at a specialty care clinic for patients with heart failure in Toronto, Canada. The objectives of this study are to (1) evaluate the impact of the telemonitoring program on health service utilization, patient health outcomes, and their ability to self-care; (2) identify the contextual barriers and facilitators of implementation at the physician, clinic, and institutional level; (3) describe patient usage patterns to determine adherence and other behaviors in the telemonitoring program; and (4) evaluate the costs associated with implementation of the telemonitoring program from the perspective of the health care system (ie, public payer), hospital, and patient. The evaluation will use a mixed-methods approach. The quantitative component will include a pragmatic pre- and posttest study design for the impact and cost analyses, which will make use of clinical data and questionnaires administered to at least 108 patients at baseline and 6 months. Furthermore, outcome data will be collected at 1, 12, and 24 months to explore the longitudinal impact of the program. In addition, quantitative data related to implementation outcomes and patient usage patterns of the telemonitoring system will be reported. The qualitative component involves an embedded single case study design to identify the contextual factors that influenced the implementation. The implementation evaluation will be completed using semistructured interviews with clinicians, and other program staff at baseline, 4 months, and 12 months after the program start date. Interviews conducted with patients will be triangulated with usage data to explain usage patterns and adherence to the system. The telemonitoring program was launched in August 2016 and patient enrollment is ongoing. The methods described provide an example for conducting comprehensive evaluations of telemonitoring programs. The combination of impact, implementation, and cost evaluations will inform the quality improvement of the existing program and will yield insights into the sustainability of smartphone-based telemonitoring programs for patients with heart failure within a specialty care setting. ©Patrick Ware, Heather J Ross, Joseph A Cafazzo, Audrey Laporte, Emily Seto. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 03.05.2018.
NASA Technical Reports Server (NTRS)
Waller, Jess M.; Saulsberry, Regor L.; Nichols, Charles T.; Wentzel, Daniel J.
2010-01-01
This slide presentation reviews the use of Modal Acoustic Emission to monitor damage progression in carbon fiber/epoxy tows. There is a risk of catastrophic failure of composite overwrapped pressure vessels (COPVs) due to burst-before-leak (BBL) stress rupture (SR) failure of carbon-epoxy (C/Ep) COPVs. A lack of quantitative nondestructive evaluation (NDE) is causing problems in current and future spacecraft designs. It is therefore important to develop and demonstrate critical NDE that can be implemented during stages of the design process, since the observed rupture can occur with little or no advance warning. A program was therefore required to develop quantitative acoustic emission (AE) procedures specific to C/Ep overwraps, but which also have utility for monitoring damage accumulation in composite structures in general, and to lay the groundwork for establishing critical thresholds for accumulated damage in composite structures, such as COPVs, so that precautionary or preemptive engineering steps can be implemented to minimize or obviate the risk of catastrophic failure. A computed Felicity Ratio (FR) coupled with fast Fourier Transform (FFT) frequency analysis shows promise as an analytical pass/fail criterion. The FR analysis, waveform analysis, and FFT analysis are reviewed.
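A minimal sketch of the Felicity Ratio computation, assuming each load cycle records the prior maximum load and the load at which significant AE resumes on reload (the loads below are invented):

```python
# Felicity ratio per load-hold cycle; FR < 1 indicates emission before the
# prior maximum load is reached, i.e. accumulating damage (violation of the
# Kaiser effect). Loads are illustrative only.
def felicity_ratio(ae_onset_load, previous_max_load):
    return ae_onset_load / previous_max_load

cycles = [(900.0, 1000.0), (1150.0, 1250.0), (1200.0, 1500.0)]
for onset, prev_max in cycles:
    print(f"FR = {felicity_ratio(onset, prev_max):.2f}")
```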
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
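Why correlated failures erode the value of redundancy is easy to show numerically; a hedged sketch with invented probabilities, not SMART's propositional-logic machinery:

```python
# Success probability of two redundant units whose failures share a common
# cause. Probabilities are illustrative, not mission numbers.
p_independent_fail = 0.10   # per-unit independent failure probability
p_common_cause     = 0.02   # probability both fail together

# Both fail either through the common cause or through two independent
# failures; compare with 0.01 under full independence.
p_both_fail = p_common_cause + (1 - p_common_cause) * p_independent_fail**2
print(f"P(system fails) = {p_both_fail:.4f}")
```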
Integrating FMEA in a Model-Driven Methodology
NASA Astrophysics Data System (ADS)
Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno
2016-08-01
Failure Mode and Effects Analysis (FMEA) is a well known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system, by integration into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate FMEA worksheets and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.
Chai, Chen; Wong, Yiik Diew; Wang, Xuesong
2017-07-01
This paper proposes a simulation-based approach to estimate the safety impact of driver cognitive failures and driving errors. Fuzzy Logic, which involves linguistic terms and uncertainty, is incorporated with a Cellular Automata model to simulate the decision-making process of right-turn filtering movement at signalized intersections. Simulation experiments are conducted to estimate the relationships of cognitive failures and driving errors with safety performance. Simulation results show that different types of cognitive failures have varied relationships with driving errors and safety performance. For right-turn filtering movement, cognitive failures are more likely to result in driving errors with a denser conflicting traffic stream. Moreover, different driving errors are found to have different safety impacts. The study serves to provide a novel approach to linguistically assess cognitions and replicate the decision-making procedures of the individual driver. Compared with crash analysis, the proposed FCA model allows quantitative estimation of particular cognitive failures, and of the impact of cognitions on driving errors and safety performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Ekiko, Mbong C.
2014-01-01
The research problem was the lack of knowledge about the effect of leadership style of the project champion on global information technology (IT) project outcomes, resulting in a high failure rate of IT projects accompanied by significant waste of resources. The purpose of this quantitative, nonexperimental study was to evaluate the relationship…
Procedures to evaluate the efficiency of protective clothing worn by operators applying pesticide.
Espanhol-Soares, Melina; Nociti, Leticia A S; Machado-Neto, Joaquim Gonçalves
2013-10-01
The evaluation of the efficiency of whole-body protective clothing against pesticides has already been carried out through field tests and procedures defined by international standards, but there is a need to determine the useful life of these garments to ensure worker safety. The aim of this article is to compare the procedures for evaluating the efficiency of two whole-body protective garments, both new and previously used by applicators of herbicides, using a laboratory test with a mannequin and in the field with the operator. The evaluation of the efficiency of protective clothing used both quantitative and qualitative methodologies, leading to a proposal for classification according to efficiency, and determination of the useful life of protective clothing for use against pesticides, based on a quantitative assessment. The procedures used were in accordance with the standards of the modified American Society for Testing and Materials (ASTM) F 1359:2007 and International Organization for Standardization 17491-4. The protocol used in the field was World Health Organization Vector Biology and Control (VBC)/82.1. The clothing tested was water-repellent personal protective clothing for use against pesticides. Two varieties of fabric were tested: Beige (100% cotton) and Camouflaged (31% polyester and 69% cotton). The efficiency in exposure control of the personal protective clothing was measured before use and after 5, 10, 20, and 30 uses and washes under field conditions. Personal protective clothing was worn by workers in the field during the application of the herbicide glyphosate on weed species in mature sugar cane plantations using a knapsack sprayer. The modified ASTM F 1359:2007 procedure was chosen as the most appropriate due to its greater repeatability (lower coefficient of variation). This procedure provides the quantitative evaluation needed to determine the efficiency and useful life of individual protective clothing, not just at specific points of failure, but according to dermal protection as a whole. The qualitative assessment, which is suitable for verification of garment design and stitching flaws, does not aid in determining useful life, but does complement the quantitative evaluation. The proposed classification is appropriate and accurate for determining the useful life of personal protective clothing against pesticide materials relative to the number of uses and washes after each use. For example, the Beige garment had a useful life of 30 uses and washes, while the Camouflaged garment had a useful life of 5 uses and washes. The quantitative evaluation aids in determining the efficiency and useful life of individual protective clothing according to dermal protection as a whole, not just at specific points of failure.
NASA Astrophysics Data System (ADS)
Du, Fangzhu; Li, Dongsheng
2018-03-01
As a new kind of composite structure, steel-confined reinforced concrete columns have attracted increasing attention in civil engineering. During the damage process, this new structure exhibits a highly complex and invisible failure mechanism due to the combined effects of steel tubes, concrete, and steel rebar. The acoustic emission (AE) technique has been extensively studied in nondestructive testing (NDT) and is currently applied in civil engineering for structural health monitoring (SHM) and damage evaluation. In the present study, the damage properties and failure evolution of steel-confined and unconfined reinforced concrete (RC) columns are investigated under quasi-static loading through AE signals. Significantly improved loading capacity and excellent energy dissipation characteristics demonstrated the practicality of the proposed structure. AE monitoring results indicated that the progressive deformation of the test specimens occurs in three stages representing different damage conditions. The sentry function compares the logarithmic ratio between the stored strain energy (Es) and the released acoustic energy (Ea), explicitly disclosing the damage growth and failure mechanism of the test specimens. Other extended AE features, including the index of damage (ID) and relax ratio, are calculated to quantitatively evaluate the damage severity and critical point. The complicated temporal evolution of different AE features confirms the potential importance of integrated analysis of two or more parameters. The proposed multi-indicator analysis is capable of revealing the damage growth and failure mechanism of steel-confined RC columns, and of providing critical warning information for structural failure.
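A minimal sketch of the sentry function s = ln(Es/Ea), using invented cumulative energies (values and units are illustrative only):

```python
import math

# Sentry function: log ratio of cumulative stored strain energy to
# cumulative released AE energy. Sudden drops flag instants of significant
# internal damage release.
def sentry(strain_energy, acoustic_energy):
    return [math.log(es / ea) for es, ea in zip(strain_energy, acoustic_energy)]

Es = [10.0, 25.0, 45.0, 60.0]   # cumulative strain energy (J), invented
Ea = [0.1, 0.4, 2.0, 8.0]       # cumulative AE energy (arb. units), invented
print([f"{v:.2f}" for v in sentry(Es, Ea)])
```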
Mans, D R A; Kent, A D; Hu, R V; Lai A Fat, E J; Schoone, G J; Adams, E R; Rood, E J; Alba, S; Sabajo, L O A; Lai A Fat, R F; de Vries, H J C; Schallig, H D F H
2016-08-01
Leishmania (Viannia) guyanensis is believed to be the principal cause of cutaneous leishmaniasis (CL) in Suriname. This disease is treated with pentamidine isethionate (PI), but treatment failure has increasingly been reported. To evaluate PI for its clinical efficacy, to compare parasite load, and to assess the possibility of treatment failure due to other infecting Leishmania species. Parasite load of patients with CL was determined in skin biopsies using real-time quantitative PCR before treatment and 6 and 12 weeks after treatment. Clinical responses were evaluated at week 12 and compared with parasite load. In parallel, molecular species differentiation was performed. L. (V.) guyanensis was the main infecting species in 129 of 143 patients (about 90%). PI treatment led to a significant decrease (P < 0.001) in parasite counts, and cured about 75% of these patients. Treatment failure was attributable to infections with Leishmania (Viannia) braziliensis, Leishmania (Leishmania) amazonensis and L. (V.) guyanensis (1/92, 1/92 and 22/92 evaluable cases, respectively). There was substantial agreement beyond chance between the parasite load at week 6 and the clinical outcome at week 12, as indicated by the κ value of 0.61. L. (V.) guyanensis is the main infecting species of CL in Suriname, followed by L. (V.) braziliensis and L. (L.) amazonensis. Furthermore, patient response to PI can be better anticipated based on the parasite load 6 weeks after the treatment rather than on parasite load before treatment. © 2015 The Authors Clinical and Experimental Dermatology published by John Wiley & Sons Ltd on behalf of British Association of Dermatologists, North American Clinical Dermatologic Society and St Johns Dermatological Society.
Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad
2016-01-01
Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. With the aim of improving Intensive Care Unit (ICU) reliability in hospitals, this research tries to identify and analyze ICU process failure modes using a systematic approach to errors. Methods: In this descriptive research, data was gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis, however, was quantitative, based on the failures' Risk Priority Number (RPN) using the Failure Modes and Effects Analysis (FMEA) method. In addition, some causes of failures were analyzed with the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failures from 99 ICU activities in hospital B were identified and evaluated. Then, with 90% reliability (RPN≥100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying a modified PFMEA to improve the reliability of two selected ICUs' processes in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and also makes them eager to identify causes, recommend corrective actions and even participate in improving processes without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
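For context, the RPN threshold used above is the product of severity, occurrence, and detection ratings, each scored 1-10; a hedged sketch with invented failure modes, not items from the two ICUs:

```python
# RPN = severity x occurrence x detection, with RPN >= 100 as the
# non-acceptable threshold cited in the abstract. Failure modes are invented.
failure_modes = [
    ("wrong infusion rate set",      8, 4, 5),
    ("ventilator alarm not audible", 9, 2, 3),
    ("patient ID band missing",      6, 3, 4),
]

for name, sev, occ, det in failure_modes:
    rpn = sev * occ * det
    flag = "NON-ACCEPTABLE" if rpn >= 100 else "acceptable"
    print(f"{name}: RPN={rpn} ({flag})")
```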
Miller, Wayne L
2017-01-01
Volume overload and fluid congestion remain primary clinical challenges in the assessment and management of patients with chronic heart failure (HF). The pathophysiology of volume regulation is complex, and the simple concept of passive intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to the central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can alert clinicians to changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in chronic HF. The quantitative assessment of intravascular volume is an effective tool to help guide individualized, appropriate therapy. Not all volume overload is the same, and the measurement of intravascular volume identifies heterogeneity to guide tailored therapy.
Evaluation of methods for determining hardware projected life
NASA Technical Reports Server (NTRS)
1971-01-01
An investigation of existing methods of predicting hardware life is summarized by reviewing programs having long life requirements, current research efforts on long life problems, and technical papers reporting work on life prediction techniques. The results indicate that there are no accurate quantitative means to predict hardware life for system level hardware. The effectiveness of test programs and the causes of hardware failures are considered.
Exploring the gender gap in the conceptual survey of electricity and magnetism
NASA Astrophysics Data System (ADS)
Henderson, Rachel; Stewart, Gay; Stewart, John; Michaluk, Lynnette; Traxler, Adrienne
2017-12-01
The "gender gap" on various physics conceptual evaluations has been extensively studied. Men's average pretest scores on the Force Concept Inventory and Force and Motion Conceptual Evaluation are 13% higher than women's, and post-test scores are on average 12% higher than women's. This study analyzed the gender differences within the Conceptual Survey of Electricity and Magnetism (CSEM) in which the gender gap has been less well studied and is less consistent. In the current study, data collected from 1407 students (77% men, 23% women) in a calculus-based physics course over ten semesters showed that male students outperformed female students on the CSEM pretest (5%) and post-test (6%). Separate analyses were conducted for qualitative and quantitative problems on lab quizzes and course exams and showed that male students outperformed female students by 3% on qualitative quiz and exam problems. Male and female students performed equally on the quantitative course exam problems. The gender gaps within CSEM post-test scores, qualitative lab quiz scores, and qualitative exam scores were insignificant for students with a CSEM pretest score of 25% or less but grew as pretest scores increased. Structural equation modeling demonstrated that a latent variable, called Conceptual Physics Performance/Non-Quantitative (CPP/NonQnt), orthogonal to quantitative test performance was useful in explaining the differences observed in qualitative performance; this variable was most strongly related to CSEM post-test scores. The CPP/NonQnt of male students was 0.44 standard deviations higher than female students. The CSEM pretest measured CPP/NonQnt much less accurately for women (R2=4 % ) than for men (R2=17 % ). The failure to detect a gender gap for students scoring 25% or less on the pretest suggests that the CSEM instrument itself is not gender biased. The failure to find a performance difference in quantitative test performance while detecting a gap in qualitative performance suggests the qualitative differences do not result from psychological factors such as science anxiety or stereotype threat.
Miller, Wayne L
2016-08-01
Volume regulation, assessment, and management remain basic issues in patients with heart failure. The discussion presented here is directed at opening a reassessment of the pathophysiology of congestion in congestive heart failure and the methods by which we determine volume overload status. Peer-reviewed historical and contemporary literature is reviewed. Volume overload and fluid congestion remain primary issues for patients with chronic heart failure. The pathophysiology is complex, and the simple concept of intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can alert clinicians to changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in patients with chronic heart failure and help guide individualized, appropriate therapy; not all volume overload is the same. © 2016 American Heart Association, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mankamo, T.; Kim, I.S.; Yang, Ji Wu
Failures in the auxiliary feedwater (AFW) system of pressurized water reactors (PWRs) are considered to involve substantial risk whether a decision is made to either continue power operation while repair is being done, or to shut down the plant to undertake repairs. Technical specification action requirements usually require immediate plant shutdown in the case of multiple failures in the system (in some cases, immediate repair of one train is required when all AFW trains fail). This paper presents a probabilistic risk assessment-based method to quantitatively evaluate and compare both the risks of continued power operation and of shutting the plant down, given known failures in the system. The method is applied to the AFW system for four different PWRs. Results show that the risk of continued power operation and plant shutdown both are substantial, but the latter is larger than the former over the usual repair time. This was proven for four plants with different designs: two operating Westinghouse plants, one operating Asea-Brown Boveri Combustion Engineering plant, and one of evolutionary design. The method can be used to analyze individual plant design and to improve AFW action requirements using risk-informed evaluations.
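The comparison reduces to integrating a degraded-state risk frequency over the repair window and weighing it against the one-time risk of the shutdown transition; a toy sketch with assumed numbers, not results for the four plants:

```python
# Continued operation: elevated core-damage frequency integrated over the
# repair time. Shutdown: a one-shot transition risk. Numbers are invented.
cdf_degraded  = 5e-5 / 8760.0   # per-hour core-damage frequency, AFW degraded
repair_hours  = 24.0
risk_continue = cdf_degraded * repair_hours

risk_shutdown = 2e-7            # assumed risk of the shutdown transition

print(f"continued operation: {risk_continue:.2e}")
print(f"shutdown:            {risk_shutdown:.2e}")
print("shutdown riskier" if risk_shutdown > risk_continue else "operation riskier")
```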
A pilot rating scale for evaluating failure transients in electronic flight control systems
NASA Technical Reports Server (NTRS)
Hindson, William S.; Schroeder, Jeffery A.; Eshow, Michelle M.
1990-01-01
A pilot rating scale was developed to describe the effects of transients in helicopter flight-control systems on safety-of-flight and on pilot recovery action. The scale was applied to the evaluation of hardovers that could potentially occur in the digital flight-control system being designed for a variable-stability UH-60A research helicopter. Tests were conducted in a large moving-base simulator and in flight. The results of the investigation were combined with existing airworthiness criteria to determine quantitative reliability design goals for the control system.
Lien, W P; Lee, Y S; Chang, F Z; Chen, J J; Shieh, W B
1978-01-01
Quantitative one-plane cineangiocardiography in the right anterior oblique position for evaluation of LV performance was carried out in 62 patients with various heart diseases and in 13 subjects with normal LV. Parameters for evaluating both pump and muscle performance were derived from volume and pressure measurements. Of 31 patients with either systolic hypertension or LV myocardial diseases (coronary artery disease or idiopathic cardiomyopathy), 14 had clinical evidence of LV failure before the study. It was found that mean VCF and EF were the most sensitive indicators of impaired LV performance among the various parameters. There was a close correlation between mean VCF and EF, yet discordant changes of both parameters were noted in some patients. Furthermore, wall motion abnormalities were not infrequently observed in patients with coronary artery disease or primary cardiomyopathy. Therefore, assessment of at least three ejection properties (EF, mean VCF and wall motion abnormalities) is considered essential for full understanding of the derangement of LV function in heart disease. This is especially true of patients with coronary artery disease. LV behavior in relation to different pathological stresses or lesions, such as chronic pressure or volume load, myocardial disease and mitral stenosis, was also studied, and a possible cause of impaired LV myocardial function in mitral stenosis was discussed.
An academic medical center's response to widespread computer failure.
Genes, Nicholas; Chary, Michael; Chason, Kevin W
2013-01-01
As hospitals incorporate information technology (IT), their operations become increasingly vulnerable to technological breakdowns and attacks. Proper emergency management and business continuity planning require an approach to identify, mitigate, and work through IT downtime. Hospitals can prepare for these disasters by reviewing case studies. This case study details the disruption of computer operations at Mount Sinai Medical Center (MSMC), an urban academic teaching hospital. The events, and MSMC's response, are narrated and the impact on hospital operations is analyzed. MSMC's disaster management strategy prevented computer failure from compromising patient care, although walkouts and time-to-disposition in the emergency department (ED) notably increased. This incident highlights the importance of disaster preparedness and mitigation. It also demonstrates the value of using operational data to evaluate hospital responses to disasters. Quantifying normal hospital functions, just as with a patient's vital signs, may help quantitatively evaluate and improve disaster management and business continuity planning.
Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane
2018-05-01
This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
Back-Analyses of Landfill Instability Induced by High Water Level: Case Study of Shenzhen Landfill
Peng, Ren; Hou, Yujing; Zhan, Liangtong; Yao, Yangping
2016-01-01
In June 2008, the Shenzhen landfill slope failed. This case is used as an example to study the deformation characteristics and failure mode of a slope induced by high water levels. An integrated monitoring system, including water level gauges, electronic total stations, and inclinometers, was used to monitor the slope failure process. The field measurements suggest that the landfill landslide was caused by a deep slip along the weak interface of the composite liner system at the base of the landfill. The high water level is considered to be the main factor that caused this failure. To calculate the relative interface shear displacements in the geosynthetic multilayer liner system, a series of numerical direct shear tests were carried out. Based on the numerical results, the composite lining system was simplified, and the centrifuge modeling technique was used to quantitatively evaluate the effect of water levels on landfill instability. PMID:26771627
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Quality of software not only is vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, time needed for software verification and integration, as well as launching schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, thus making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to provide a representation of the number as well as types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
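The flavor of such a termination criterion can be shown with a simple exponential reliability-growth model; the Goel-Okumoto-type form and the parameters below are assumptions for illustration, not the report's actual model:

```python
import math

# With mu(t) = a * (1 - exp(-b t)) failures expected by test time t, the
# expected number of fixes still required after testing to time T is
# a * exp(-b T). Parameters a, b are assumed fitted values.
a, b = 120.0, 0.015          # total expected failures, detection rate (1/hr)

def remaining_failures(T):
    return a * math.exp(-b * T)

for T in (100, 200, 400):    # candidate test-termination times (hours)
    print(f"T={T}h: expected remaining failures = {remaining_failures(T):.1f}")
# Testing could stop once the remaining count meets the reliability objective.
```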
Obonaga, Ricardo; Fernández, Olga Lucía; Valderrama, Liliana; Rubiano, Luisa Consuelo; Castro, Maria del Mar; Barrera, Maria Claudia; Gomez, Maria Adelaida
2014-01-01
Treatment failure and parasite drug susceptibility in dermal leishmaniasis caused by Leishmania (Viannia) species are poorly understood. Prospective evaluation of drug susceptibility of strains isolated from individual patients before drug exposure and at clinical failure allows intrinsic and acquired differences in susceptibility to be discerned and analyzed. To determine whether intrinsic susceptibility or loss of susceptibility to miltefosine contributed to treatment failure, we evaluated the miltefosine susceptibility of intracellular amastigotes and promastigotes of six Leishmania (Viannia) braziliensis and six Leishmania (Viannia) panamensis strains isolated sequentially, at diagnosis and treatment failure, from two children and four adults ≥55 years old with concurrent conditions. Four patients presented only cutaneous lesions, one had mucosal disease, and one had disseminated mucocutaneous disease. Expression of the Leishmania drug transporter genes abca2, abca3, abcc2, abcc3, abcg4, abcg6, and LbMT was evaluated by quantitative reverse transcription-PCR (qRT-PCR). Intracellular amastigotes (median 50% effective concentration [EC50], 10.7 μmol/liter) were more susceptible to miltefosine than promastigotes (median EC50, 55.3 μmol/liter) (P < 0.0001). Loss of susceptibility at failure, demonstrated by a miltefosine EC50 of >32 μmol/liter (the upper limit of intracellular amastigote assay), occurred in L. panamensis infection in a child and in L. braziliensis infection in an adult and was accompanied by decreased expression of the miltefosine transporter LbMT (LbMT/β-tubulin, 0.42- to 0.26-fold [P = 0.039] and 0.70- to 0.57-fold [P = 0.009], respectively). LbMT gene polymorphisms were not associated with susceptibility phenotype. Leishmania ABCA3 transporter expression was inversely correlated with miltefosine susceptibility (r = −0.605; P = 0.037). Loss of susceptibility is one of multiple factors involved in failure of miltefosine treatment in dermal leishmaniasis. PMID:24145529
Zhao, Feng; Wang, Chuan; Fan, Yubo
2015-01-01
Wear of polyethylene (PE) tibial inserts is a significant cause of implant failure in total knee arthroplasty (TKA). Measurement and evaluation of PE insert wear are therefore central to TKA research. There are many methods to measure insert wear. Qualitative methods such as observation are used to determine the wear and its type. Quantitative methods such as gravimetric analysis, coordinate measuring machines (CMM) and micro-computed tomography (micro-CT) are used to measure the mass, volume and geometry of wear. In this paper, the principles, characteristics and research progress of the main insert wear evaluation methods are introduced, and their problems and disadvantages are analyzed.
Micro-RNA-122 levels in acute liver failure and chronic hepatitis C.
Dubin, Perry H; Yuan, Hejun; Devine, Robert K; Hynan, Linda S; Jain, Mamta K; Lee, William M
2014-09-01
MicroRNA-122 (miR-122) is the foremost liver-related micro-RNA, but its role in the hepatocyte is not fully understood. To evaluate whether circulating levels of miR-122 are elevated in chronic-HCV for a reason other than hepatic injury, we compared serum levels in patients with chronic hepatitis C with those in other forms of liver injury, including patients with acute liver failure, and healthy controls. MiR-122 was quantitated using sera from 35 acute liver failure patients (20 acetaminophen-induced, 15 other etiologies), 39 chronic-HCV patients and 12 controls. In parallel, human genomic DNA (hgDNA) levels were measured to reflect quantitatively the extent of hepatic necrosis. Additionally, six HIV-HCV co-infected patients, who achieved viral clearance after undergoing therapy with interferon and ribavirin, had serial sera miR-122 and hgDNA levels measured before and throughout treatment. Serum miR-122 levels were elevated approximately 100-fold in both acute liver failure and chronic-HCV sera as compared to controls (P < 0.001), whereas hgDNA levels were only elevated in acute liver failure patients as compared to both chronic-HCV and controls (P < 0.001). Subgroup analysis showed that chronic-HCV sera with normal aminotransferase levels showed elevated miR-122 despite low levels of hepatocyte necrosis. All successfully treated HCV patients showed a significant Log10 decrease in miR-122 levels, ranging from 0.16 to 1.46, after sustained viral response. Chronic-HCV patients have very elevated serum miR-122 levels, in the range of most patients with severe hepatic injury leading to acute liver failure. Eradication of HCV was associated with decreased miR-122 but not hgDNA. An additional mechanism besides hepatic injury may be active in chronic-HCV to explain the exaggerated circulating levels of miR-122 observed. © 2014 Wiley Periodicals, Inc.
[INTESTINAL FAILURE IN PEDIATRIC PATIENTS: EXPERIENCE AND MANAGEMENT BY A MULTIDISCIPLINARY GROUP].
Giraldo Villa, Adriana; Martínez Volkmar, María Isabel; Valencia Quintero, Andrés Felipe; Montoya Delgado, Diana Catalina; Henao Roldan, Catherine; Ruiz Navas, Patricia; García Loboguerrero, Fanny; Contreras Ramírez, Mónica María
2015-12-01
Background: Institutions with multidisciplinary teams have shown improvements in outcomes for patients with intestinal failure. A multidisciplinary approach allows integral management and effective communication between families and care teams. Objective: To describe the multidisciplinary management and outcomes in pediatric patients with intestinal failure. Methods: Retrospective study in patients 18 years old or less, with intestinal failure and requiring Total Parenteral Nutrition (TPN). Simple frequencies and percentages were used for qualitative variables, and central tendency and dispersion measures were used for quantitative variables. Results: 33 patients with a median follow-up of 281 days were evaluated. The median duration of TPN was 68 days, and the mean number of catheter-related infections was 2.26 per patient. In 31 patients oral or enteral nutrition was provided, starting in 61.3% of cases through a tube with continuous infusion. As concomitant treatment, 72.7% of children received ursodeoxycholic acid, 67.7% cholestyramine, 57.6% loperamide, 48.5% antibiotics and 36.4% probiotics. The families of 24 patients were evaluated by social work professionals. Intestinal autonomy was achieved in 69.7% of cases; 72.7% of them showed an improvement in the weight z-score and a final albumin significantly higher than the initial value (p value: 0.012). Conclusions: The management of patients with intestinal failure is a challenge for health institutions and requires care based on a standardized protocol and a multidisciplinary group. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Yepes-Rios, Monica; Dudek, Nancy; Duboyce, Rita; Curtis, Jerri; Allard, Rhonda J; Varpio, Lara
2016-11-01
Many clinical educators feel unprepared and/or unwilling to report unsatisfactory trainee performance. This systematic review consolidates knowledge from medical, nursing, and dental literature on the experiences and perceptions of evaluators or assessors with this failure to fail phenomenon. We searched the English language literature in CINAHL, EMBASE, and MEDLINE from January 2005 to January 2015. Qualitative and quantitative studies were included. Following our review protocol, registered with BEME, reviewers worked in pairs to identify relevant articles. The investigators participated in thematic analysis of the qualitative data reported in these studies. Through several cycles of analysis, discussion and reflection, the team identified the barriers and enablers to failing a trainee. From 5330 articles, we included 28 publications in the review. The barriers identified were (1) assessor's professional considerations, (2) assessor's personal considerations, (3) trainee related considerations, (4) unsatisfactory evaluator development and evaluation tools, (5) institutional culture and (6) consideration of available remediation for the trainee. The enablers identified were: (1) duty to patients, to society, and to the profession, (2) institutional support such as backing a failing evaluation, support from colleagues, evaluator development, and strong assessment systems, and (3) opportunities for students after failing. The inhibiting and enabling factors to failing an underperforming trainee were common across the professions included in this study, across the 10 years of data, and across the educational continuum. We suggest that these results can inform efforts aimed at addressing the failure to fail problem.
Fibrosis-Related Gene Expression in Single Ventricle Heart Disease.
Nakano, Stephanie J; Siomos, Austine K; Garcia, Anastacia M; Nguyen, Hieu; SooHoo, Megan; Galambos, Csaba; Nunley, Karin; Stauffer, Brian L; Sucharov, Carmen C; Miyamoto, Shelley D
2017-12-01
To evaluate fibrosis and fibrosis-related gene expression in the myocardium of pediatric subjects with single ventricle heart disease and right ventricular failure. Real-time quantitative polymerase chain reaction was performed on explanted right ventricular myocardium of pediatric subjects with single ventricle disease and of nonfailing controls. Subjects were divided into 3 groups: single ventricle failing (right ventricular failure before or after stage I palliation), single ventricle nonfailing (infants listed for primary transplantation with normal right ventricular function), and stage III (Fontan or right ventricular failure after stage III). To evaluate subjects of similar age and right ventricular volume loading, single ventricle disease with failure was compared with single ventricle disease without failure, and stage III was compared with nonfailing right ventricular disease. Histologic fibrosis was assessed in all hearts. Mann-Whitney tests were performed to identify differences in gene expression. Collagen (Col1α, Col3) expression is decreased in failing single ventricle congenital heart disease compared with nonfailing single ventricle congenital heart disease (P = .019 and P = .035, respectively), and is equivalent in stage III compared with nonfailing right ventricular heart disease. Tissue inhibitors of metalloproteinases (TIMP-1, TIMP-3, and TIMP-4) are downregulated in stage III compared with nonfailing right ventricular heart disease (P = .0047, P = .013 and P = .013, respectively). Matrix metalloproteinases (MMP-2, MMP-9) are similar between nonfailing and failing single ventricle heart disease, and between stage III heart disease and nonfailing right ventricular heart disease. There is no difference in the prevalence of right ventricular fibrosis by histology in subjects with single ventricle heart disease and right ventricular failure (18%) compared with those with normal right ventricular function (38%). Fibrosis is not a primary contributor to right ventricular failure in infants and young children with single ventricle heart disease. Additional studies are required to understand whether antifibrotic therapies are beneficial in this population. Copyright © 2017 Elsevier Inc. All rights reserved.
Modeling Dynamic Helium Release as a Tracer of Rock Deformation
Gardner, W. Payton; Bauer, Stephen J.; Kuhlman, Kristopher L.; ...
2017-11-03
Here, we use helium released during mechanical deformation of shales as a signal to explore the effects of deformation and failure on material transport properties. A dynamic dual-permeability model with evolving pore and fracture networks is used to simulate gases released from shale during deformation and failure. Changes in material properties required to reproduce experimentally observed gas signals are explored. We model two different experiments of 4He flow rate measured from shale undergoing mechanical deformation, a core parallel to bedding and a core perpendicular to bedding. We also found that the helium signal is sensitive to fracture development and evolution as well as changes in the matrix transport properties. We constrain the timing and effective fracture aperture, as well as the increase in matrix porosity and permeability. Increases in matrix permeability are required to explain gas flow prior to macroscopic failure, and the short-term gas flow postfailure. Increased matrix porosity is required to match the long-term, postfailure gas flow. This model provides the first quantitative interpretation of helium release as a result of mechanical deformation. The sensitivity of this model to changes in the fracture network, as well as to matrix properties during deformation, indicates that helium release can be used as a quantitative tool to evaluate the state of stress and strain in earth materials.
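A stripped-down sketch of the dual-permeability idea, two helium reservoirs (matrix and fracture) with an exchange rate standing in for matrix permeability; rates and volumes are invented, not fitted to the experiments:

```python
# Helium moves matrix -> fracture -> measured outflow. Deformation-driven
# increases in matrix permeability (here, the exchange rate k_mf) raise the
# released-gas flow, mimicking the prefailure/postfailure signal changes.
def simulate(k_mf, k_out, n_m0=1.0, steps=2000, dt=0.01):
    n_m, n_f, flow = n_m0, 0.0, []
    for _ in range(steps):
        exchange = k_mf * n_m            # matrix -> fracture transfer
        release  = k_out * n_f           # fracture -> measured outflow
        n_m -= exchange * dt
        n_f += (exchange - release) * dt
        flow.append(release)
    return flow

baseline = simulate(k_mf=0.05, k_out=0.5)
failed   = simulate(k_mf=0.50, k_out=0.5)   # post-failure matrix permeability
print(round(max(baseline), 4), round(max(failed), 4))
```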
Miller, Wayne L; Mullan, Brian P
2014-06-01
This study sought to quantitate total blood volume (TBV) in patients hospitalized for decompensated chronic heart failure (DCHF) and to determine the extent of volume overload, and the magnitude and distribution of blood volume and body water changes following diuretic therapy. The accurate assessment and management of volume overload in patients with DCHF remains problematic. TBV was measured by a radiolabeled-albumin dilution technique with intravascular volume, pre-to-post-diuretic therapy, evaluated at hospital admission and at discharge. Change in body weight in relation to quantitated TBV was used to determine interstitial volume contribution to total fluid loss. Twenty-six patients were prospectively evaluated. Two patients had normal TBV at admission. Twenty-four patients were hypervolemic with TBV (7.4 ± 1.6 liters) increased by +39 ± 22% (range, +9.5% to +107%) above the expected normal volume. With diuresis, TBV decreased marginally (+30 ± 16%). Body weight declined by 6.9 ± 5.2 kg, and fluid intake/fluid output was a net negative 8.4 ± 5.2 liters. Interstitial compartment fluid loss was calculated at 6.2 ± 4.0 liters, accounting for 85 ± 15% of the total fluid reduction. TBV analysis demonstrated a wide range in the extent of intravascular overload. Dismissal measurements revealed marginally reduced intravascular volume post-diuretic therapy despite large reductions in body weight. Mobilization of interstitial fluid to the intravascular compartment with diuresis accounted for this disparity. Intravascular volume, however, remained increased at dismissal. The extent, composition, and distribution of volume overload are highly variable in DCHF, and this variability needs to be taken into account in the approach to individualized therapy. TBV quantitation, particularly serial measurements, can facilitate informed volume management with respect to a goal of treating to euvolemia. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
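The volume bookkeeping follows from two simple relations: indicator dilution for plasma volume, and attribution of the weight loss not explained by the intravascular change to the interstitium. A hedged numeric sketch (all values invented, not patient data):

```python
# Indicator dilution: plasma volume = injected tracer / equilibrated plasma
# concentration; total blood volume adds the red-cell fraction.
tracer_dose   = 15.0     # injected radiolabeled-albumin activity (arbitrary)
concentration = 3.4      # equilibrated plasma activity per liter
hematocrit    = 0.38

plasma_volume = tracer_dose / concentration          # dilution principle
tbv           = plasma_volume / (1.0 - hematocrit)   # add red-cell mass

# Interstitial loss = weight loss not explained by the intravascular change.
tbv_admit, tbv_discharge = 7.4, 6.7   # liters (assumed)
weight_loss_liters = 6.9              # ~1 kg of fluid per liter

intravascular_loss = tbv_admit - tbv_discharge
interstitial_loss  = weight_loss_liters - intravascular_loss
print(f"TBV = {tbv:.1f} L; interstitial share = "
      f"{interstitial_loss / weight_loss_liters:.0%}")
```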
Risk measures for power failures in transmission systems
NASA Astrophysics Data System (ADS)
Cassidy, Alex; Feinstein, Zachary; Nehorai, Arye
2016-11-01
We present a novel framework for evaluating the risk of failures in power transmission systems. We use the concept of systemic risk measures from the financial mathematics literature with models of power system failures in order to quantify the risk of the entire power system for design and comparative purposes. The proposed risk measures provide the collection of capacity vectors for the components in the system that lead to acceptable outcomes. Keys to the formulation of our measures of risk are two elements: a model of system behavior that provides the (distribution of) outcomes based on component capacities and an acceptability criterion that determines whether a (random) outcome is acceptable from an aggregated point of view. We examine the effects of altering the line capacities on energy not served under a variety of networks, flow manipulation methods, load shedding schemes, and load profiles using Monte Carlo simulations. Our results provide a quantitative comparison of the performance of these schemes, measured by the required line capacity. These results provide more complete descriptions of the risks of power failures than the previous, one-dimensional metrics.
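The acceptability test behind such a risk measure can be sketched directly: given a candidate capacity vector, simulate random component failures, compute energy not served, and accept the vector if the expected outcome meets a criterion. The two-line system and numbers below are invented:

```python
import random

# Expected energy not served (ENS) for a candidate line-capacity vector under
# random independent outages; a capacity vector is "acceptable" if E[ENS]
# meets the criterion. This toy ignores network flow constraints entirely.
random.seed(7)

def expected_ens(capacities, load=100.0, p_fail=0.05, trials=20_000):
    ens = 0.0
    for _ in range(trials):
        available = sum(c for c in capacities if random.random() > p_fail)
        ens += max(0.0, load - available)
    return ens / trials

for caps in ([60.0, 60.0], [80.0, 80.0], [120.0, 120.0]):
    e = expected_ens(caps)
    print(caps, f"E[ENS]={e:.2f}", "acceptable" if e <= 1.0 else "unacceptable")
```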
NASA Astrophysics Data System (ADS)
Li, Dongsheng; Du, Fangzhu; Ou, Jinping
2017-03-01
Glass-fiber reinforced plastic (GFRP)-confined circular concrete-filled steel tubular (CCFT) columns are composed of concrete, steel, and GFRP and show complex failure mechanics under cyclic loading. This paper investigated the failure mechanism and damage evolution of GFRP-CCFT columns by performing uniaxial cyclic loading tests that were monitored using the acoustic emission (AE) technique. Characteristic AE parameters were obtained during the damage evolution of GFRP-CCFT columns. Based on the relationship between the loading curve and these parameters, the damage evolution of GFRP-CCFT columns was classified into three stages representing different degrees of damage. Damage evolution and failure mode were investigated by analyzing the b-value, the ratio of rise time to waveform amplitude, and the average frequency. The damage severity of GFRP-CCFT columns was quantitatively estimated according to the modified damage index and the NDIS-2421 damage assessment criteria corresponding to each loading step. The proposed method can explain the damage evolution and failure mechanism of GFRP-CCFT columns and provide critical warning information for composite structures.
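The b-value analysis mentioned above can be illustrated with a short sketch. Following the Gutenberg-Richter analogy commonly used in AE studies, the code below converts peak amplitudes in dB to a magnitude-like scale and applies Aki's maximum-likelihood estimator; the dB/20 convention, completeness threshold, and synthetic data are assumptions for illustration, not values from the paper.

```python
import numpy as np

def ae_b_value(amplitudes_db, completeness_db):
    """Maximum-likelihood b-value (Aki, 1965) from AE peak amplitudes.

    AE amplitude in dB is converted to a magnitude-like scale (dB/20),
    mirroring the Gutenberg-Richter analysis used for earthquakes; only
    hits above the assumed completeness threshold are retained.
    """
    mags = np.asarray(amplitudes_db, dtype=float) / 20.0
    m_c = completeness_db / 20.0
    mags = mags[mags >= m_c]
    return np.log10(np.e) / (mags.mean() - m_c)

# A falling b-value as large-amplitude hits multiply is commonly read as
# macro-crack growth approaching final failure.
hits = 40 + 10 * np.abs(np.random.default_rng(1).standard_normal(500))
print(round(ae_b_value(hits, completeness_db=40), 2))
```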
Zebrafish Heart Failure Models for the Evaluation of Chemical Probes and Drugs
Monte, Aaron; Cook, James M.; Kabir, Mohd Shahjahan; Peterson, Karl P.
2013-01-01
Heart failure is a complex disease that involves genetic, environmental, and physiological factors. As a result, current medication and treatment for heart failure produces limited efficacy, and better medication is in demand. Although mammalian models exist, simple and low-cost models will be more beneficial for drug discovery and mechanistic studies of heart failure. We previously reported that aristolochic acid (AA) caused cardiac defects in zebrafish embryos that resemble heart failure. Here, we showed that cardiac troponin T and atrial natriuretic peptide were expressed at significantly higher levels in AA-treated embryos, presumably due to cardiac hypertrophy. In addition, several human heart failure drugs could moderately attenuate the AA-induced heart failure by 10%–40%, further verifying the model for drug discovery. We then developed a drug screening assay using the AA-treated zebrafish embryos and identified three compounds. Mitogen-activated protein kinase kinase inhibitor (MEK-I), an inhibitor for the MEK-1/2 known to be involved in cardiac hypertrophy and heart failure, showed nearly 60% heart failure attenuation. C25, a chalcone derivative, and A11, a phenolic compound, showed around 80% and 90% attenuation, respectively. Time course experiments revealed that, to obtain 50% efficacy, these compounds were required within different hours of AA treatment. Furthermore, quantitative polymerase chain reaction showed that C25, not MEK-I or A11, strongly suppressed inflammation. Finally, C25 and MEK-I, but not A11, could also rescue the doxorubicin-induced heart failure in zebrafish embryos. In summary, we have established two tractable heart failure models for drug discovery, and three potential drugs have been identified that seem to attenuate heart failure by different mechanisms. PMID:24351044
NASA Technical Reports Server (NTRS)
Lysak, Daniel B.
2003-01-01
In this report we examine the applicability of shearography techniques for nondestructive inspection and evaluation in two unique application areas. In the first application, shearography is used to evaluate the quality of adhesive bonds holding lead tiles to the BAT gamma ray mask for the NASA Swift program. By exciting the mask with a vibration, the more poorly bonded tiles can be distinguished by their greater displacement response, which is readily identifiable in the shearography image. A quantitative analysis is presented that compares the shearography results with a destructive pull test measuring the force at bond failure. Generally speaking, the results show good agreement. Further investigation would be useful to optimize certain test parameters such as vibration frequency and amplitude. The second application is to evaluate the bonding between the skin and core of a honeycomb structure with a specular (mirror-like) surface. In standard shearography techniques, the object under test must have a diffuse surface to generate the speckle patterns in laser light, which are then sheared. A novel configuration using the specular surface as a mirror to image speckles from a diffuser is presented, opening up the use of shearography to a new class of objects that could not have been examined with the traditional approach. This new technique readily identifies large scale bond failures in the panel, demonstrating the validity of this approach. For the particular panel examined here, some scaling issues should be examined further to resolve the measurement scale down to the very small size of the core cells. In addition, further development should be undertaken to determine the general applicability of the new approach and to establish a firm quantitative foundation.
Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.
Kendall, Katherine A
2017-10-01
Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. Level of evidence: 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
Evaluation of a Postdischarge Call System Using the Logic Model.
Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary
2018-02-01
This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived, each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
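Dempster's rule of combination, used above to integrate evidence from the parallel diagnostic models, can be sketched in a few lines. The frame of discernment, mass assignments, and failure-mode names below are hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.

    Masses are dicts mapping frozensets of hypotheses (e.g. suspected
    failed components) to belief mass; mass falling on the empty
    intersection is conflict and is renormalized away.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two diagnostic models each spread mass over candidate failed components.
F = frozenset
model_a = {F({"sensor"}): 0.6, F({"sensor", "actuator"}): 0.4}
model_b = {F({"actuator"}): 0.3, F({"sensor", "actuator"}): 0.7}
print(dempster_combine(model_a, model_b))
```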
Gasbarro, Gregory; Ye, Jason; Newsome, Hillary; Jiang, Kevin; Wright, Vonda; Vyas, Dharmesh; Irrgang, James J; Musahl, Volker
2016-10-01
To evaluate whether morphologic characteristics of rotator cuff tear have prognostic value in determining symptomatic structural failure of arthroscopic rotator cuff repair independent of age or gender. Arthroscopic rotator cuff repair cases performed by five fellowship-trained surgeons at our institution from 2006 to 2013 were retrospectively reviewed. Data extraction included demographics, comorbidities, repair technique, clinical examination, and radiographic findings. Failure in symptomatic patients was defined as a structural defect on postoperative magnetic resonance imaging or pseudoparalysis on examination. Failures were age and gender matched with successful repairs in a 1:2 ratio. A total of 30 failures and 60 controls were identified. Supraspinatus atrophy (P = .03) and tear size (18.3 mm failures v 13.9 mm controls; P = .02) were significant risk factors for failure, as was the presence of an infraspinatus tear greater than 10 mm (62% v 17%, P < .01). Single-row repair (P = .06) and simple suture configuration (P = .17) were more common in the failure group, but the differences between groups were not statistically significant. Diabetes mellitus and active tobacco use were not significantly associated with increased failure risk, but psychiatric medication use was more frequent in the failure group. This study confirms previous suspicions that tear size and fatty infiltration are associated with failure of arthroscopic rotator cuff repair in symptomatic patients, independent of age or gender. There is also a quantitative cutoff on magnetic resonance imaging for the size of infraspinatus involvement that can be used clinically as a predictive factor. Although reported in the literature, smoking and diabetes were not associated with failure. Level III, retrospective case control. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Vibroacoustic test plan evaluation: Parameter variation study
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloef, H. R.
1976-01-01
Statistical decision models are shown to provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology developed provides a major step toward a realistic tool for quantitatively tailoring test programs to specific payloads. Testing is considered at the no-test, component, subassembly, or system level of assembly. Component redundancy and partial loss of flight data are considered. Both deterministic and probabilistic costs are considered, and incipient failures resulting from ground tests are treated. Optimums defining both component and assembly test levels are indicated for the modified test plans considered, although modeling simplifications must be taken into account when interpreting the results for a particular payload. New parameters introduced were a no-test option, flight-by-flight failure probabilities, and a cost to design components for higher vibration requirements. Parameters varied were the shuttle payload bay internal acoustic environment, the STS launch cost, the component retest/repair cost, and the amount of redundancy in the housekeeping section of the payload reliability model.
Common cause evaluations in applied risk analysis of nuclear power plants. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taniguchi, T.; Ligon, D.; Stamatelatos, M.
1983-04-01
Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
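For context, the original Beta Factor model that the new method extends can be sketched as follows; the train count, failure probability, and beta values are illustrative only, and the expression is a first-order approximation that ignores cross terms.

```python
def beta_factor_unavailability(q_total, beta, n_trains=3):
    """One-out-of-n system unavailability under the classic Beta Factor model.

    q_total: total failure probability of a single train.
    beta:    fraction of that probability attributed to common cause.
    Independent contributions must all fail together; the common-cause
    part fails every train at once.
    """
    q_indep = (1.0 - beta) * q_total
    q_ccf = beta * q_total
    return q_indep ** n_trains + q_ccf   # one-out-of-three success logic

# Even a modest beta dominates the result for redundant trains:
print(beta_factor_unavailability(q_total=1e-2, beta=0.0))   # ~1e-6
print(beta_factor_unavailability(q_total=1e-2, beta=0.1))   # ~1e-3
```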
Strength of SiCf-SiCm composite tube under uniaxial and multiaxial loading
NASA Astrophysics Data System (ADS)
Shapovalov, Kirill; Jacobsen, George M.; Alva, Luis; Truesdale, Nathaniel; Deck, Christian P.; Huang, Xinyu
2018-03-01
The authors report mechanical strength of nuclear grade silicon carbide fiber reinforced silicon carbide matrix composite (SiCf-SiCm) tubing under several different stress states. The composite tubing was fabricated via a Chemical Vapor Infiltration (CVI) process, and is being evaluated for accident tolerant nuclear fuel cladding. Several experimental techniques were applied including uniaxial tension, elastomer insert burst test, open and closed end hydraulic bladder burst test, and torsion test. These tests provided critical stress and strain values at proportional limit and at ultimate failure points. Full field strain measurements using digital image correlation (DIC) were obtained in order to acquire quantitative information on localized deformation during application of stress. Based on the test results, a failure map was constructed for the SiCf-SiCm composites.
Embedded Resistors and Capacitors in Organic and Inorganic Substrates
NASA Technical Reports Server (NTRS)
Gerke, Robert David; Ator, Danielle
2006-01-01
Embedded resistors and capacitors were purchased in two technologies: organic printed wiring board (PWB) and inorganic low-temperature co-fired ceramic (LTCC). Small groups of each substrate were exposed to four environmental tests and several characterization tests to evaluate their performance and reliability. Even though all passive components maintained electrical performance throughout environmental testing, differences between the two technologies were observed. Environmental testing was taken beyond the manufacturers' reported testing, but generally not taken to failure. When possible, the data were quantitatively compared to the manufacturers' data.
Stefaniak, Katarzyna; Wróżyńska, Magdalena
2018-02-01
Protection of common natural goods is one of the greatest challenges man faces every day. Extracting and processing natural resources such as mineral deposits contributes to the transformation of the natural environment. A number of activities designed to maintain this balance are undertaken in accordance with the concept of integrated order. One of them is the use of comprehensive systems of tailings storage facility monitoring. Despite the monitoring, system failures still occur. The number of recorded failures illustrates both the scale of the problem and the magnitude of the consequences of tailings storage facility failures. The paper presents the vast possibilities provided by global monitoring for the effective prevention of these failures. Particular attention is drawn to the potential of using multidirectional monitoring, including technical and environmental monitoring, using the example of one of the world's biggest hydrotechnical structures, the Żelazny Most Tailings Storage Facility (TSF), Poland. Analysis of monitoring data allows preventive action to be taken against construction failures of facility dams, which can have devastating effects on human life and the natural environment.
Recent advances in computational structural reliability analysis methods
NASA Astrophysics Data System (ADS)
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-10-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
An Adaptive Failure Detector Based on Quality of Service in Peer-to-Peer Networks
Dong, Jian; Ren, Xiao; Zuo, Decheng; Liu, Hongwei
2014-01-01
The failure detector is one of the fundamental components for maintaining the high availability of Peer-to-Peer (P2P) networks. Under different network conditions, an adaptive failure detector based on quality of service (QoS) can achieve the detection time and accuracy required by upper applications with lower detection overhead. In P2P systems, the complexity of the network and high churn lead to a high message loss rate. To reduce the impact on detection accuracy, a baseline detection strategy based on a retransmission mechanism has been widely employed in many P2P applications; however, Chen's classic adaptive model cannot describe this kind of detection strategy. In order to provide an efficient failure detection service in P2P systems, this paper establishes a novel QoS evaluation model for the baseline detection strategy. The relationship between the detection period and the QoS is discussed and, on this basis, an adaptive failure detector (B-AFD) is proposed, which can meet the quantitative QoS metrics under a changing network environment. Meanwhile, it is observed from the experimental analysis that B-AFD achieves better detection accuracy and time with lower detection overhead compared to the traditional baseline strategy and the adaptive detectors based on Chen's model. Moreover, B-AFD has better adaptability to P2P networks. PMID:25198005
A model for predicting embankment slope failures in clay-rich soils; A Louisiana example
NASA Astrophysics Data System (ADS)
Burns, S. F.
2015-12-01
It is well known that smectite-rich soils significantly reduce the stability of slopes. The question is how much smectite in the soil causes slope failures. A study of over 100 sites in north and south Louisiana, USA, compared slopes that failed during a major El Nino winter (heavy rainfall) in 1982-1983 to similar slopes that did not fail. Soils in the slopes were tested for percent clay, liquid limits, plasticity indices, and semi-quantitative clay mineralogy. Slopes with a High Risk of failure (85-90% chance of failure in 8-15 years after construction) contained soils with a liquid limit > 54%, a plasticity index > 29%, and a clay content > 47%. Slopes with an Intermediate Risk (50-55% chance of failure in 8-15 years) contained soils with a liquid limit between 36% and 54%, a plasticity index between 16% and 29%, and a clay content between 32% and 47%. Slopes with a Low Risk of failure (< 5% chance of failure in 8-15 years after construction) contained soils with a liquid limit < 36%, a plasticity index < 16%, and a clay content < 32%. These data show that, when constructing embankments with 3:1 slopes, the above soil characteristics should be checked before construction to prevent slope failure. If the soils fall into the Low Risk classification, construct the embankment normally. If the soils fall into the High Risk classification, lime stabilization or heat treatments will be needed to prevent failures. Soils in the Intermediate Risk class will have to be evaluated on a case-by-case basis.
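A minimal sketch of the resulting classification rule, using the thresholds quoted above (with the intermediate plasticity-index range inferred from the high- and low-risk bounds), might look like this:

```python
def slope_failure_risk(liquid_limit, plasticity_index, clay_pct):
    """Classify embankment slope failure risk from the study's thresholds.

    Returns 'high' (~85-90% chance of failure within 8-15 years of
    construction), 'low' (<5%), or 'intermediate' (case-by-case review).
    """
    if liquid_limit > 54 and plasticity_index > 29 and clay_pct > 47:
        return "high"
    if liquid_limit < 36 and plasticity_index < 16 and clay_pct < 32:
        return "low"
    return "intermediate"

print(slope_failure_risk(58, 31, 50))   # high: consider lime stabilization
print(slope_failure_risk(30, 12, 25))   # low: construct normally
```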
A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems
Shin, Sangmin; Lee, Seungyub; Judi, David; ...
2018-02-07
Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.
Li, Mengmeng; Rao, Man; Chen, Kai; Zhou, Jianye; Song, Jiangping
2017-07-15
Real-time quantitative reverse transcriptase-PCR (qRT-PCR) is a feasible tool for determining gene expression profiles, but the accuracy and reliability of the results depend on the stable expression of selected housekeeping genes across samples. So far, research on stable housekeeping genes in human heart failure samples is rare. Moreover, the effect of heart failure on the expression of housekeeping genes in the right and left ventricles is yet to be studied. We therefore aim to provide stable housekeeping genes for both ventricles in heart failure and normal heart samples. In this study, we selected seven commonly used housekeeping genes as candidates. Using qRT-PCR, the expression levels of ACTB, RAB7A, GAPDH, REEP5, RPL5, PSMB4, and VCP in eight heart failure and four normal heart samples were assessed. The stability of the candidate housekeeping genes was evaluated with the geNorm and NormFinder software. GAPDH showed the least variation in all heart samples. The results also indicated that differences in gene expression exist between heart failure left and right ventricles. GAPDH had the highest expression stability in both heart failure and normal heart samples. We also propose using different sets of housekeeping genes for the left and right ventricles: the combination of RPL5, GAPDH, and PSMB4 is suitable for the right ventricle, and the combination of GAPDH, REEP5, and RAB7A is suitable for the left ventricle. Copyright © 2017 Elsevier B.V. All rights reserved.
Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc
2017-05-01
Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.
NASA Astrophysics Data System (ADS)
Shen, Weidian
2005-03-01
Plastic film packaging is widely used these days, especially in the convenience food industry, due to its flexibility, boilability, and microwavability. Almost every package is printed with ink. The adhesion of ink on plastic films merits increasing attention to ensure quality packaging. However, inks and plastic films are polymeric materials with complicated molecular structures. The thickness of the jelly-like ink is only 500 nm or less, and the thickness of the soft and flexible film is no more than 50 μm, which makes the quantitative measurement of their adhesion very challenging. Up to now, no scientific quantitative method for measuring the adhesion of ink on plastic films has been documented. We have tried a technique in which a Nano-Indenter and a Scanning Probe Microscope were used to quantitatively evaluate the adhesion strength of ink deposited on plastic films, as well as to examine the configurations of adhesion failure. It was helpful in better understanding the adhesion mechanism, thus giving direction as to how to improve the adhesion.
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since to manage risk it is necessary first to analyze and evaluate it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and assessing them quantitatively; that is, risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider fault tree analysis (FTA) as a risk assessment technique. The objectives are: to understand the purpose of FTA, to understand and apply the rules of Boolean algebra, to analyse a simple system using FTA, and to weigh the advantages and disadvantages of FTA. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the top event. The steps of the analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the causes of the top event. Results: The study yields the critical areas, the fault tree logic diagrams, and the probability of the top event. These results can be used for risk assessment analyses.
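As an illustration of the probability propagation step, the sketch below evaluates a two-gate fault tree under the usual independence assumption; the tree structure and basic-event probabilities are hypothetical.

```python
def gate_probability(gate, inputs):
    """Probability of a fault tree gate output, assuming independent inputs.

    OR gate:  P = 1 - prod(1 - p_i)   (any input event causes the output)
    AND gate: P = prod(p_i)           (all input events must occur)
    """
    prod = 1.0
    if gate == "AND":
        for p in inputs:
            prod *= p
        return prod
    for p in inputs:
        prod *= (1.0 - p)
    return 1.0 - prod

# Hypothetical tree: TOP = (pump_a fails AND pump_b fails) OR control failure
p_pumps = gate_probability("AND", [0.05, 0.05])
p_top = gate_probability("OR", [p_pumps, 0.01])
print(f"P(top event) = {p_top:.4f}")   # 0.0125
```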
NASA Astrophysics Data System (ADS)
Vasilenko, Irina; Vlasova, Elizaveta; Metelin, Vladislav; Kardasheva, Ziver
2018-02-01
The development of robust non-invasive laboratory screening methods for early diagnosis on an out-patient basis is quite relevant for practical medicine. It is known that the platelet is an original biosensor, a detector of early changes in the state of hemostasis. The aim of this study was to assess the potential of the quantitative phase imaging (QPI) technique for real-time evaluation of the influence of low-molecular-weight and unfractionated heparin on platelets in patients with end-stage chronic renal failure treated with program hemodialysis (PHD). The main group consisted of 21 patients who were administered low-molecular-weight heparin for hypocoagulation during the hemodialysis procedure. The control group (15 patients) received unfractionated heparin. The morphodensitometric state of living platelets was evaluated by QPI using the computer phase-interference microscope MIM (Moscow, Russia). We analyzed the optical-geometrical parameters and the morphological features of living platelets, which reflect the degree of their activation, at the beginning of PHD (before administration of heparin), 15 minutes after it, and at the end of the procedure. The results allow us to conclude that the use of low-molecular-weight heparin provides a better efficacy/safety ratio and causes a reduction of platelet activation during the hemodialysis procedure. Practical implementation of QPI for clinical monitoring of platelets makes it possible to obtain important information on hemostasis cells. It opens new opportunities to assess the efficacy of treatment, as well as for early diagnosis of complications.
A study of Mariner 10 flight experiences and some flight piece part failure rate computations
NASA Technical Reports Server (NTRS)
Paul, F. A.
1976-01-01
The problems and failures encountered in the Mariner 10 flight are discussed, and the data available through a quantitative accounting of all electronic piece parts on the spacecraft are summarized. Computed failure rates for electronic piece parts are also shown. It is intended that these computed data be used in the continued updating of the failure rate base used for trade-off studies and predictions for future JPL space missions.
Melin, Michael; Montelius, Andreas; Rydén, Lars; Gonon, Adrian; Hagerman, Inger; Rullman, Eric
2018-01-01
Enhanced external counterpulsation (EECP) is a non-invasive treatment in which leg cuff compressions increase diastolic aortic pressure and coronary perfusion. EECP is offered to patients with refractory angina pectoris and increases physical capacity. Benefits in heart failure patients have been noted, but EECP is still considered to be experimental and its effects must be confirmed. The mechanism of action is still unclear. The aim of this study was to evaluate the effect of EECP on skeletal muscle gene expression and physical performance in patients with severe heart failure. Patients (n = 9) in NYHA III-IV despite pharmacological therapy were subjected to 35 h of EECP during 7 weeks. Before and after, lateral vastus muscle biopsies were obtained, and functional capacity was evaluated with a 6-min walk test. Skeletal muscle gene expression was evaluated using Affymetrix Hugene 1.0 arrays. Maximum walking distance increased by 15%, which is in parity to that achieved after aerobic exercise training in similar patients. Skeletal muscle gene expression analysis using Ingenuity Pathway Analysis showed an increased expression of two networks of genes with FGF-2 and IGF-1 as central regulators. The increase in gene expression was quantitatively small and no overlap with gene expression profiles after exercise training could be detected despite adequate statistical power. EECP treatment leads to a robust improvement in walking distance in patients with severe heart failure and does induce a skeletal muscle transcriptional response, but this response is small and with no significant overlap with the transcriptional signature seen after exercise training. © 2016 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
Anderson, Kelley M
2014-01-01
Heart failure is a clinical syndrome that incurs a high prevalence, mortality, morbidity, and economic burden in our society. Patients with heart failure may experience hospitalization because of an acute exacerbation of their condition. Recurrent hospitalizations soon after discharge are an unfortunate occurrence in this patient population. The purpose of this study was to explore the clinical and diagnostic characteristics of individuals hospitalized with a primary diagnosis of heart failure at the time of discharge and to compare the association of these indicators in individuals who did and did not experience a heart failure hospitalization within 60 days of the index stay. This descriptive, correlational, quantitative study used a retrospective review of 134 individuals discharged with a primary diagnosis of heart failure. Records were reviewed for sociodemographic characteristics, health histories, clinical assessment findings, and diagnostic information. Significant predictors of 60-day heart failure readmissions were dyspnea (β = 0.579), crackles (β = 1.688), and assistance with activities of daily living (β = 2.328), independent of age, gender, and multiple other factors. Using hierarchical logistic regression, a model was derived that correctly classified 77.4% of the cohort, 78.2% of those who had a readmission (sensitivity of the prediction), and 76.7% of the subjects in whom the predicted event, readmission, did not occur (specificity of the prediction). Hospitalizations for heart failure are markers of clinical instability. Future events after hospitalization are common in this patient population, and this study provides a novel understanding of clinical characteristics at the time of discharge that are associated with future outcomes, specifically 60-day heart failure readmissions. A consideration of these characteristics provides an additional perspective to guide clinical decision making and the evaluation of discharge readiness.
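A sketch of how the reported logistic coefficients could be turned into a predicted readmission probability is given below. The intercept was not reported in the abstract, so the value used here is a placeholder assumption; the function is illustrative, not the study's fitted model.

```python
import math

# Reported coefficients: dyspnea 0.579, crackles 1.688, ADL assistance 2.328.
BETA = {"dyspnea": 0.579, "crackles": 1.688, "adl_assist": 2.328}
INTERCEPT = -2.5   # hypothetical: not reported in the abstract

def readmission_probability(dyspnea, crackles, adl_assist):
    """Predicted 60-day heart failure readmission probability (logistic).

    Inputs are 0/1 indicators for each discharge characteristic.
    """
    logit = (INTERCEPT + BETA["dyspnea"] * dyspnea
             + BETA["crackles"] * crackles
             + BETA["adl_assist"] * adl_assist)
    return 1.0 / (1.0 + math.exp(-logit))

# A patient discharged with crackles and needing help with daily living:
print(f"{readmission_probability(0, 1, 1):.2f}")
```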
Stochastic availability analysis of operational data systems in the Deep Space Network
NASA Technical Reports Server (NTRS)
Issa, T. N.
1991-01-01
Existing availability models of standby redundant systems consider only an operator's performance and its interaction with the hardware performance. In the case of operational data systems in the Deep Space Network (DSN), in addition to an operator-system interface, a controller reconfigures the system and links a standby unit into the network data path upon failure of the operating unit. A stochastic (Markovian) process technique is used to model and analyze the availability performance of such systems, and the occurrence of degradation due to partial failures is quantitatively incorporated into the model. Exact expressions for the steady-state availability and the proportion of degraded performance are derived for the systems under study. The interaction among the hardware, operator, and controller performance parameters, and that interaction's effect on data availability, are evaluated and illustrated for an operational data processing system.
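A minimal numerical sketch of such a Markovian availability model is shown below: a three-state generator matrix (operating, degraded, down) is solved for its stationary distribution. The states and rates are hypothetical, not the DSN system's parameters.

```python
import numpy as np

# States: 0 = operating, 1 = degraded (partial failure, standby linked in
# by the controller), 2 = down. Q[i, j] is the rate from state i to j
# (per hour); rows sum to zero. Rates below are purely illustrative.
lam, lam_d, mu = 1e-3, 5e-4, 1e-1
Q = np.array([
    [-(lam + lam_d), lam_d,       lam],
    [mu,             -(mu + lam), lam],
    [mu,             0.0,         -mu],
])

# Stationary probabilities solve pi @ Q = 0 with sum(pi) = 1: replace one
# balance equation with the normalization condition to get a square system.
A = np.vstack([Q.T[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(f"availability = {pi[0] + pi[1]:.6f}, degraded share = {pi[1]:.6f}")
```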
Simulating Initial and Progressive Failure of Open-Hole Composite Laminates under Tension
NASA Astrophysics Data System (ADS)
Guo, Zhangxin; Zhu, Hao; Li, Yongcun; Han, Xiaoping; Wang, Zhihua
2016-12-01
A finite element (FE) model is developed for the progressive failure analysis of fiber-reinforced polymer laminates. The failure criterion for fiber and matrix failure is implemented in the FE code Abaqus using the user-defined material subroutine UMAT. The gradual degradation of the material properties is controlled by the individual fracture energies of fiber and matrix. The failure and damage in composite laminates containing a central hole subjected to uniaxial tension are simulated. The numerical results show that the damage model can be used to accurately predict the progressive failure behaviour both qualitatively and quantitatively.
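The abstract does not name the specific fiber/matrix criterion implemented in the UMAT; as one representative possibility, the sketch below evaluates 2D Hashin-type initiation indices. Strength values are illustrative.

```python
def hashin_2d(s11, s22, s12, Xt, Xc, Yt, Yc, S):
    """2D Hashin-type initiation indices for fiber and matrix failure.

    Failure initiates in a mode when its index reaches 1; a damage model
    then degrades the corresponding stiffnesses at a rate controlled by
    that mode's fracture energy. Hashin's criterion is a representative
    choice here, not necessarily the one used by the authors.
    """
    if s11 >= 0:                                   # fiber tension
        f_fiber = (s11 / Xt) ** 2 + (s12 / S) ** 2
    else:                                          # fiber compression
        f_fiber = (s11 / Xc) ** 2
    if s22 >= 0:                                   # matrix tension
        f_matrix = (s22 / Yt) ** 2 + (s12 / S) ** 2
    else:                                          # matrix compression
        f_matrix = ((s22 / (2 * S)) ** 2
                    + ((Yc / (2 * S)) ** 2 - 1) * s22 / Yc
                    + (s12 / S) ** 2)
    return f_fiber, f_matrix

# Stresses and strengths in MPa (illustrative values):
print(hashin_2d(s11=1200.0, s22=30.0, s12=40.0,
                Xt=2000.0, Xc=1500.0, Yt=50.0, Yc=200.0, S=80.0))
```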
NASA Technical Reports Server (NTRS)
Saulsberry, Regor; Greene, Nathanael; Cameron, Ken; Madaras, Eric; Grimes-Ledesma, Lorie; Thesken, John; Phoenix, Leigh; Murthy, Pappu; Revilock, Duane
2007-01-01
Many aging composite overwrapped pressure vessels (COPVs) used by the National Aeronautics and Space Administration (NASA) are currently under evaluation to better quantify their reliability and clarify their likelihood of failure due to stress rupture and age-dependent issues. As a result, some test and analysis programs have been successfully accomplished and other related programs are still in progress at the NASA Johnson Space Center (JSC) White Sands Test Facility (WSTF) and other NASA centers, with assistance from the commercial sector. To support this effort, a group of Nondestructive Evaluation (NDE) experts was assembled to provide NDE competence for pretest evaluation of test articles and for application of NDE technology to real-time testing. Techniques were required to provide assurance that the test article had adequate structural integrity and manufacturing consistency to be considered acceptable for testing, and these techniques were successfully applied. Destructive testing is also being accomplished to better understand the physical and chemical property changes associated with progression toward "stress rupture" (SR) failure, and it is being associated with NDE response so that it can potentially be used to help with life prediction. Destructive work also includes the evaluation of residual stresses during dissection of the overwrap, laboratory evaluation of specimens extracted from the overwrap to evaluate physical property changes, and quantitative microscopy to inform the theoretical micromechanics.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
NASA Technical Reports Server (NTRS)
Lysak, Daniel B.
2003-01-01
The applicability of shearography techniques for non-destructive evaluation in two unique application areas is examined. In the first application, shearography is used to evaluate the quality of adhesive bonds holding lead tiles to the BAT gamma ray mask for the NASA Swift program. Using a vibration excitation, the more poorly bonded tiles are readily identifiable in the shearography image. A quantitative analysis is presented that compares the shearography results with a destructive pull test measuring the force at bond failure. The second application is to evaluate the bonding between the skin and core of a honeycomb structure with a specular (mirror-like) surface. In standard shearography techniques, the object under test must have a diffuse surface to generate the speckle patterns in laser light, which are then sheared. A novel configuration using the specular surface as a mirror to image speckles from a diffuser is presented, opening up the use of shearography to a new class of objects that could not have been examined with the traditional approach. This new technique readily identifies large scale bond failures in the panel, demonstrating the validity of this approach.
Liu, Tong; Song, Deli; Dong, Jianzeng; Zhu, Pinghui; Liu, Jie; Liu, Wei; Ma, Xiaohai; Zhao, Lei; Ling, Shukuan
2017-01-01
Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis. PMID:28484397
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roach, Dennis Patrick; Jauregui, David Villegas; Daumueller, Andrew Nicholas
2012-02-01
Recent structural failures such as the I-35W Mississippi River Bridge in Minnesota have underscored the urgent need for improved methods and procedures for evaluating our aging transportation infrastructure. This research seeks to develop a basis for a Structural Health Monitoring (SHM) system to provide quantitative information related to the structural integrity of metallic structures, to support appropriate management decisions and ensure public safety. This research employs advanced structural analysis and nondestructive testing (NDT) methods for an accurate fatigue analysis. Metal railroad bridges in New Mexico will be the focus since many of these structures are over 100 years old and classified as fracture-critical. The term fracture-critical indicates that failure of a single component may result in complete collapse of the structure such as the one experienced by the I-35W Bridge. Failure may originate from sources such as loss of section due to corrosion or cracking caused by fatigue loading. Because standard inspection practice is primarily visual, these types of defects can go undetected due to oversight, lack of access to critical areas, or, in riveted members, hidden defects that are beneath fasteners or connection angles. Another issue is that it is difficult to determine the fatigue damage that a structure has experienced and the rate at which damage is accumulating due to uncertain history and load distribution in supporting members. A SHM system has several advantages that can overcome these limitations. SHM allows critical areas of the structure to be monitored more quantitatively under actual loading. The research needed to apply SHM to metallic structures was performed and a case study was carried out to show the potential of SHM-driven fatigue evaluation to assess the condition of critical transportation infrastructure and to guide inspectors to potential problem areas. This project combines the expertise in transportation infrastructure at New Mexico State University with the expertise at Sandia National Laboratories in the emerging field of SHM.
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it builds on the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
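A minimal sketch of the recommended core step, estimating a failure probability by Monte Carlo sampling of a load-resistance model, might look like the following; the distributions and parameters are placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def failure_probability(n=1_000_000):
    """Monte Carlo estimate of P(load > resistance) for one component.

    Distributions and parameters are illustrative. The same samples can
    feed a life cycle cost model by pricing each simulated failure event.
    """
    resistance = rng.normal(loc=500.0, scale=40.0, size=n)        # e.g. MPa
    load = rng.lognormal(mean=np.log(350.0), sigma=0.15, size=n)  # e.g. MPa
    return np.mean(load > resistance)

pf = failure_probability()
print(f"P_f ~ {pf:.2e}")   # compare against a target, e.g. 1e-4 per life
```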
Estimation of submarine mass failure probability from a sequence of deposits with age dates
Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.
2013-01-01
The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
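The role of the open intervals can be seen in a small sketch of the exponential (Poisson) case: censored exposure before the first and after the last deposit enters the numerator of the maximum-likelihood mean return time but not the event count. The deposit ages below are hypothetical, and age-dating uncertainty (handled by Monte Carlo in the study) is ignored.

```python
import numpy as np

def exp_mean_return_time(event_ages, record_start, record_end):
    """MLE mean return time for an exponential (Poisson) renewal model.

    Ages in ka. The open intervals before the first and after the last
    dated deposit are right-censored observations: they add exposure
    time but no events.
    """
    ages = np.sort(np.asarray(event_ages, dtype=float))[::-1]  # oldest first
    closed = -np.diff(ages)                   # complete inter-event times
    open_old = record_start - ages[0]         # before the first event
    open_young = ages[-1] - record_end        # after the last event
    return (closed.sum() + open_old + open_young) / closed.size

# Hypothetical mass-transport deposit ages (ka) in a record spanning 60-0 ka:
print(exp_mean_return_time([52.0, 41.0, 24.0, 11.0],
                           record_start=60.0, record_end=0.0))   # 20.0 ka
```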
A cascading failure model for analyzing railway accident causation
NASA Astrophysics Data System (ADS)
Liu, Jin-Tao; Li, Ke-Ping
2018-01-01
In this paper, a new cascading failure model is proposed for quantitatively analyzing railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model can describe the cascading process of the railway accident more accurately than previous models and can quantitatively analyze the sensitivities and the influence of the causes. In conclusion, this model can help reveal the latent rules of accident causation and thereby reduce the occurrence of railway accidents.
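A toy version of the load-redistribution mechanism can be sketched as follows; the graph, weights, loads, and capacities are illustrative, whereas the real model derives causal strengths and the critical threshold from accident data.

```python
def cascade(loads, capacity, weights):
    """Simulate cascading failure with weighted load redistribution.

    When a node's load exceeds its capacity it fails, and its load is
    passed to intact successors in proportion to weights[i][j] (standing
    in for the strength of the causal relationship between causes i and
    j). Returns the set of failed nodes.
    """
    n = len(loads)
    loads = list(loads)
    failed = set()
    frontier = {i for i in range(n) if loads[i] > capacity[i]}
    while frontier:
        failed |= frontier
        nxt = set()
        for i in frontier:
            alive = [j for j in range(n)
                     if j not in failed and weights[i][j] > 0]
            total = sum(weights[i][j] for j in alive)
            for j in alive:
                loads[j] += loads[i] * weights[i][j] / total
                if loads[j] > capacity[j]:
                    nxt.add(j)
        frontier = nxt - failed
    return failed

W = [[0, 1, 1], [0, 0, 2], [0, 0, 0]]          # directed causal strengths
print(cascade(loads=[1.2, 0.6, 0.7], capacity=[1.0, 1.0, 1.0], weights=W))
```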
Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lepinski, James
2013-09-30
A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects, and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs, and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection, and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors, and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments were conducted on three (3) sites using the QFMEA model: (1) SACROC Northern Platform CO{sub 2}-EOR Site in the Permian Basin, Scurry County, TX, (2) Pump Canyon CO{sub 2}-ECBM Site in the San Juan Basin, San Juan County, NM, and (3) Farnsworth Unit CO{sub 2}-EOR Site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
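A stripped-down illustration of the ranking step might look like the following; the risk names, 1-10 scores, and multiplicative priority are generic FMEA-style assumptions rather than the QFMEA model's actual scheme, which also folds in fatality potential and recovery-cost estimates.

```python
# Each entry: (risk, P(failure), severity, difficulty of early detection),
# all scored 1-10. Values below are hypothetical.
risks = [
    ("wellbore casing leak",         4, 9, 7),
    ("caprock fracture migration",   2, 10, 9),
    ("surface facility CO2 release", 5, 6, 3),
]

# Rank by a multiplicative priority score, highest first.
ranked = sorted(risks, key=lambda r: r[1] * r[2] * r[3], reverse=True)
for name, p, s, d in ranked:
    print(f"{p * s * d:4d}  {name}")
```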
Wilkinson, Irene J; Pisaniello, Dino; Ahmad, Junaid; Edwards, Suzanne
2010-09-01
To present the evaluation of a large-scale quantitative respirator-fit testing program, a concurrent questionnaire survey of fit testers and test subjects was conducted in ambulatory care, home nursing care, and acute care hospitals across South Australia. Quantitative facial-fit testing was performed with TSI PortaCount instruments for healthcare workers (HCWs) who wore 5 different models of a disposable P2 (N95-equivalent) respirator. The questionnaire included questions about the HCW's age, sex, race, occupational category, main area of work, smoking status, facial characteristics, prior training and experience in use of respiratory masks, and number of attempts to obtain a respirator fit. A total of 6,160 HCWs were successfully fitted during the period from January through July 2007. Of the 4,472 HCWs who responded to the questionnaire and were successfully fitted, 3,707 (82.9%) were successfully fitted with the first tested respirator, 551 (12.3%) required testing with a second model, and 214 (4.8%) required 3 or more tests. We noted an increased pass rate on the first attempt over time. Asians (excluding those from South and Central Asia) had the highest failure rate (16.3% [45 of 276 Asian HCWs were unsuccessfully fitted]), and whites had the lowest (9.8% [426 of 4,338 white HCWs]). Race was highly correlated with facial shape. Among occupational groups, doctors had the highest failure rate (13.4% [81 of 604 doctors]), but they also had the highest proportion of Asians. Prior education and/or training in respirator use was not associated with a higher pass rate. Certain facial characteristics were associated with higher or lower pass rates with regard to fit testing, and fit testers were able to select a suitable respirator on the basis of a visual assessment in the majority of cases. For the fit tester, training and experience were important factors; however, for the HCW being fitted, prior experience in respirator use was not an important factor.
Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure
NASA Astrophysics Data System (ADS)
Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak
2017-09-01
The RR-interval time series in congestive heart failure has been studied with various methods, including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, namely the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure, 29 in number) and normal (54 in number), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased and the normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences appear. This quantitative parameter obtained using visibility graph analysis can therefore be used as a potential bio-marker and as a subsequent alarm-generation mechanism for predicting the onset of congestive heart failure.
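A natural visibility graph connects two samples whenever the straight line between them clears every intermediate sample; network metrics built on the resulting graph (the abstract does not specify which parameter the authors use) then serve as candidate markers. A minimal sketch on toy RR intervals:

```python
import numpy as np

def visibility_graph(y):
    """Natural visibility graph: samples i and j are linked if the straight
    line between them clears every intermediate sample."""
    n, edges = len(y), set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

rr = np.array([0.81, 0.83, 0.80, 0.95, 0.78, 0.82, 0.79])  # toy RR intervals (s)
edges = visibility_graph(rr)
degrees = [sum(i in e for e in edges) for i in range(len(rr))]
print(degrees)  # metrics built on the degree sequence act as candidate markers
```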
NASA Technical Reports Server (NTRS)
Anderson, Leif F.; Harrington, Sean P.; Omeke, Ojei, II; Schwaab, Douglas G.
2009-01-01
This is a case study on revised estimates of induced failure for International Space Station (ISS) on-orbit replacement units (ORUs). We devise a heuristic to leverage operational experience data by aggregating ORU, associated function (vehicle sub-system), and vehicle 'effective' k-factors using actual failure experience. With this input, we determine a significant failure threshold and minimize the difference between the actual and predicted failure rates. We conclude with a discussion of both qualitative and quantitative improvements to the heuristic methods and potential benefits to ISS supportability engineering analysis.
Quantitative validation of carbon-fiber laminate low velocity impact simulations
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
2015-09-26
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks, and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
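The abstract does not name the statistical comparison used; a two-sample Kolmogorov-Smirnov test is one common choice for checking an experimental sample against a simulation-predicted response distribution. A sketch with synthetic stand-in data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Stand-ins for absorbed impact energy (J): an ensemble of simulations with
# sampled material parameters vs. a handful of repeated experiments.
sim_energy = rng.normal(11.8, 0.9, size=200)
exp_energy = rng.normal(12.1, 1.1, size=15)

stat, p = stats.ks_2samp(sim_energy, exp_energy)
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")  # large p: no evidence of mismatch
```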
Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang
Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.
Park, Joung-Man; Kim, Jin-Won; Yoon, Dong-Jin
2002-03-01
Interfacial and microfailure properties of carbon fiber/epoxy composites were evaluated using both tensile fragmentation and compressive Broutman tests with the aid of acoustic emission (AE). A monomeric and two polymeric coupling agents were applied via electrodeposition (ED) and dipping. A monomeric and a polymeric coupling agent showed significant and comparable improvements in interfacial shear strength (IFSS) compared with the untreated case under both tensile and compressive tests. Typical microfailure modes, including cone-shaped fiber break, matrix cracking, and partial interlayer failure, were observed under tension, whereas diagonal slipped failure at both ends of the fractured fiber was exhibited under compression. Adsorption and shear displacement mechanisms at the interface were described in terms of electrical attraction and primary and secondary bonding forces. For both the untreated and the treated cases, AE distributions were well separated in tension, whereas they closely overlapped in compression, probably because of the difference in molecular failure energies and failure mechanisms between tension and compression. The maximum AE voltage of the waveform for either carbon or large-diameter basalt fiber breakage was much larger in tension than in compression. AE can thus provide quantitative information on interfacial adhesion and microfailure.
Capoccia, Massimo; Marconi, Silvia; Singh, Sanjeet Avtaar; Pisanelli, Domenico M; De Lazzari, Claudio
2018-05-02
Modelling and simulation may become clinically applicable tools for detailed evaluation of the cardiovascular system and for clinical decision-making to guide therapeutic intervention. Models based on the pressure-volume relationship and a zero-dimensional representation of the cardiovascular system may be a suitable choice given their simplicity and versatility. This approach has great potential for application in heart failure, where left ventricular assist devices have played a significant role as a bridge to transplant and, more recently, as a long-term solution for non-eligible candidates. We sought to investigate the value of simulation in the context of three heart failure patients with a view to predicting or guiding further management. CARDIOSIM© was the software used for this purpose. The study was based on retrospective analysis of haemodynamic data previously discussed at a multidisciplinary meeting. The outcome of the simulations addressed the value of a more quantitative approach in the clinical decision process. Although previous experience, co-morbidities, and the risk of potentially fatal complications play a role in clinical decision-making, patient-specific modelling may become a daily approach for selection and optimisation of device-based treatment for heart failure patients. Willingness to adopt this integrated approach may be the key to further progress.
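CARDIOSIM© itself is proprietary, but the zero-dimensional, pressure-volume style of model the abstract describes can be illustrated with a two-element Windkessel; every parameter below is a generic textbook value, not a patient-specific one.

```python
import numpy as np

# Two-element Windkessel: C * dP/dt = Q_in(t) - P/R (generic values).
R, C = 1.0, 1.5           # peripheral resistance (mmHg*s/mL), compliance (mL/mmHg)
dt = 0.001                # integration step (s)
t = np.arange(0.0, 10.0, dt)

def q_in(ti, period=0.8, t_ej=0.3, sv=70.0):
    """Pulsatile inflow: half-sine ejection delivering ~sv mL per beat."""
    phase = ti % period
    if phase >= t_ej:
        return 0.0
    return (np.pi * sv / (2 * t_ej)) * np.sin(np.pi * phase / t_ej)

p = np.empty_like(t)
p[0] = 80.0               # initial arterial pressure (mmHg)
for k in range(1, len(t)):
    p[k] = p[k - 1] + dt * (q_in(t[k - 1]) - p[k - 1] / R) / C

print(f"pressure over the last beat: {p[-800:].min():.0f}-{p[-800:].max():.0f} mmHg")
```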
Distant failure prediction for early stage NSCLC by analyzing PET with sparse representation
NASA Astrophysics Data System (ADS)
Hao, Hongxia; Zhou, Zhiguo; Wang, Jing
2017-03-01
Positron emission tomography (PET) imaging has been widely explored for treatment outcome prediction. Radiomics-driven methods provide new insight for quantitatively exploring underlying information in PET images. However, automatically extracting clinically meaningful features for prognosis remains a challenging problem. In this work, we develop a PET-guided distant failure predictive model for early stage non-small cell lung cancer (NSCLC) patients after stereotactic ablative radiotherapy (SABR) by using sparse representation. The proposed method does not need precalculated features and can learn intrinsically distinctive features contributing to classification of patients with distant failure. The proposed framework includes two main parts: 1) intra-tumor heterogeneity description; and 2) dictionary pair learning based sparse representation. Tumor heterogeneity is initially captured through an anisotropic kernel and represented as a set of concatenated vectors, which forms the sample gallery. Then, given a test tumor image, its identity (i.e., distant failure or not) is classified by applying the dictionary pair learning based sparse representation. We evaluate the proposed approach on 48 NSCLC patients treated by SABR at our institute. Experimental results show that the proposed approach can achieve an area under the receiver operating characteristic curve (AUC) of 0.70 with a sensitivity of 69.87% and a specificity of 69.51% using five-fold cross validation.
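The paper's dictionary pair learning is more elaborate, but the underlying sparse-representation classification idea, coding a test sample over a gallery of training vectors and picking the class whose atoms reconstruct it with the smallest residual, can be sketched with orthogonal matching pursuit; all data here are synthetic.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n_feat, n_per_class = 50, 20
# Synthetic gallery of concatenated tumor-feature vectors, one column per case.
D0 = rng.normal(0.0, 1.0, (n_feat, n_per_class))   # class 0: no distant failure
D1 = rng.normal(0.5, 1.0, (n_feat, n_per_class))   # class 1: distant failure
D = np.hstack([D0, D1])
D /= np.linalg.norm(D, axis=0)                     # unit-norm atoms

x = rng.normal(0.5, 1.0, n_feat)                   # test case drawn like class 1
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(D, x)
alpha = omp.coef_

# Classify by which class's atoms reconstruct the test vector best.
residuals = []
for idx in (slice(0, n_per_class), slice(n_per_class, None)):
    a = np.zeros_like(alpha)
    a[idx] = alpha[idx]
    residuals.append(np.linalg.norm(x - D @ a))
print("predicted class:", int(np.argmin(residuals)))
```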
NASA Technical Reports Server (NTRS)
Smart, Christian
1998-01-01
During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the ground rules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Ground rules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and the risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The ground rules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the failure mode occurrence to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).
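Quantifying one path through an event sequence diagram reduces to chaining conditional branch probabilities from the failure-mode occurrence to an end state; the numbers below are illustrative, not SSME values.

```python
# Illustrative branch probabilities, not SSME values.
p_occur    = 1e-4   # failure-mode occurrence per mission
p_detected = 0.90   # redline detection succeeds
p_shutdown = 0.95   # safe shutdown succeeds, given detection

p_safe_shutdown = p_occur * p_detected * p_shutdown
p_catastrophic  = (p_occur * (1 - p_detected)
                   + p_occur * p_detected * (1 - p_shutdown))

print(f"P(engine shutdown)      = {p_safe_shutdown:.2e}")
print(f"P(catastrophic failure) = {p_catastrophic:.2e}")
```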
Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.
Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F
2015-02-01
The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion was then applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
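The stiffness metric here is simply the slope of the moment-displacement curve, i.e. a least-squares line fit; a minimal sketch with invented readings:

```python
import numpy as np

# Invented moment-displacement readings from one fused L4-L5 segment.
displacement = np.array([0.0, 0.1, 0.2, 0.3, 0.4])   # mm
moment       = np.array([0.0, 1.9, 4.1, 5.8, 8.2])   # N*mm

stiffness, _ = np.polyfit(displacement, moment, 1)   # slope of the curve
print(f"flexion stiffness = {stiffness:.1f} N*mm/mm")
```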
Pecson, Brian M; Triolo, Sarah C; Olivieri, Simon; Chen, Elise C; Pisarenko, Aleksey N; Yang, Chao-Chun; Olivieri, Adam; Haas, Charles N; Trussell, R Shane; Trussell, R Rhodes
2017-10-01
To safely progress toward direct potable reuse (DPR), it is essential to ensure that DPR systems can provide public health protection equivalent to or greater than that of conventional drinking water sources. This study collected data over a one-year period from a full-scale DPR demonstration facility, and used both performance distribution functions (PDFs) and quantitative microbial risk assessment (QMRA) to define and evaluate the reliability of the advanced water treatment facility (AWTF). The AWTF's ability to control enterovirus, Giardia, and Cryptosporidium was characterized using online monitoring of surrogates in a treatment train consisting of ozone, biological activated carbon, microfiltration, reverse osmosis, and ultraviolet light with an advanced oxidation process. This process train was selected to improve reliability by providing redundancy, defined as the provision of treatment beyond the minimum needed to meet regulatory requirements. The PDFs demonstrated treatment that consistently exceeded the 12/10/10-log thresholds for virus, Giardia, and Cryptosporidium, as currently required for potable reuse in California (via groundwater recharge and surface water augmentation). Because no critical process failures impacted pathogen removal performance during the yearlong testing, hypothetical failures were incorporated into the analysis to understand the benefit of treatment redundancy on performance. Each unit process was modeled with a single failure per year lasting four different failure durations: 15 min, 60 min, 8 h, and 24 h. QMRA was used to quantify the impact of failures on pathogen risk. The median annual risk of infection for Cryptosporidium was 4.9 × 10^-11 in the absence of failures, and reached a maximum of 1.1 × 10^-5 assuming one 24-h failure per process per year. With the inclusion of free chlorine disinfection as part of the treatment process, enterovirus had a median annual infection risk of 1.5 × 10^-14 (no failures) and a maximum annual value of 2.1 × 10^-5 (assuming one 24-h failure per year). Even with conservative failure assumptions, pathogen risk from this treatment train remains below the risk targets for both the U.S. (10^-4 infections/person/year) and the WHO (approximately 10^-3 infections/person/year, equivalent to 10^-6 DALY/person/year), demonstrating the value of a failure prevention strategy based on treatment redundancy. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
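The failure-sensitivity logic can be sketched as a simplified QMRA roll-up: a failure window temporarily lowers the train's log removal, daily doses map to risk through a dose-response model, and daily risks compound into an annual figure. All parameter values below are illustrative, including the dose-response constant.

```python
import numpy as np

raw_conc   = 1.0    # Cryptosporidium oocysts per liter in raw water
volume     = 2.0    # liters ingested per person per day
lrv_normal = 13.0   # baseline log removal of the redundant train
lrv_failed = 9.0    # degraded removal during the failure window
r          = 0.09   # exponential dose-response parameter (illustrative)

def daily_risk(lrv):
    dose = raw_conc * volume * 10.0 ** (-lrv)
    return 1.0 - np.exp(-r * dose)

lrv_by_day = np.full(365, lrv_normal)
lrv_by_day[0] = lrv_failed                    # one 24-h failure per year
annual = 1.0 - np.prod([1.0 - daily_risk(l) for l in lrv_by_day])
print(f"annual infection risk = {annual:.1e}")  # compare against the 1e-4 target
```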
DOT National Transportation Integrated Search
2013-03-01
Rock falls on highways, while dangerous, are unpredictable. Most rock falls are of the raveling type and are not conducive to stability calculations, and even the failure mechanisms are not well understood. LIDAR (LIght Detection And Ranging) has been sh...
Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials
NASA Technical Reports Server (NTRS)
Vary, A.
1978-01-01
Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.
Confessions of a Quantitative Educational Researcher Trying to Teach Qualitative Research.
ERIC Educational Resources Information Center
Stallings, William M.
1995-01-01
Describes one quantitative educational researcher's experiences teaching qualitative research, the approach used in classes, and the successes and failures. These experiences are examined from the viewpoint of a traditionally trained professor who has now been called upon to master and teach qualitative research. (GR)
Application of Elements of TPM Strategy for Operation Analysis of Mining Machine
NASA Astrophysics Data System (ADS)
Brodny, Jaroslaw; Tutak, Magdalena
2017-12-01
The Total Productive Maintenance (TPM) strategy comprises a group of activities and actions intended to keep machines in a failure-free state, without breakdowns, by limiting failures, unplanned shutdowns, defects, and unplanned servicing of machines. These actions aim to increase the effective utilization of a company's machines and devices. A significant element of this strategy is the connection of technical actions with changes in how employees perceive them. The fundamental aim of introducing the strategy is to improve the economic efficiency of the enterprise. Increasing competition and the need to reduce production costs force mining enterprises, too, to adopt this strategy. The paper presents examples of using the OEE model for quantitative evaluation of selected mining machines. The OEE model is a quantitative tool of the TPM strategy and can be the basis for further implementation work. The OEE indicator is the product of three components: the availability and performance of the studied machine and the quality of the obtained product. The paper presents the results of an effectiveness analysis of a set of mining machines included in the longwall system, which is the first and most important link in the technological line of coal production. The set of analyzed machines included the longwall shearer, armored face conveyor, and crusher. From a reliability point of view, the analyzed set of machines is a system characterized by a serial structure. The analysis was based on data recorded by the industrial automation system used in the mines. This method of data acquisition ensured high credibility and full time synchronization. Conclusions from the research and analyses should be used to reduce breakdowns, failures, and unplanned downtime, increase performance, and improve production quality.
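A minimal OEE computation, with a hypothetical shift of a longwall shearer standing in for the paper's recorded data:

```python
# OEE for a hypothetical longwall-shearer shift (illustrative numbers).
planned_time  = 480.0    # minutes in the shift
downtime      = 96.0     # breakdowns and unplanned stops, minutes
ideal_rate    = 10.0     # tonnes per minute at nominal performance
actual_output = 3000.0   # tonnes actually produced
good_output   = 2900.0   # tonnes meeting quality requirements

availability = (planned_time - downtime) / planned_time
performance  = actual_output / (ideal_rate * (planned_time - downtime))
quality      = good_output / actual_output

oee = availability * performance * quality
print(f"A={availability:.2f} P={performance:.2f} Q={quality:.2f} OEE={oee:.2f}")
```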
Ultrashort echo time magnetization transfer (UTE-MT) imaging of cortical bone.
Chang, Eric Y; Bae, Won C; Shao, Hongda; Biswas, Reni; Li, Shihong; Chen, Jun; Patil, Shantanu; Healey, Robert; D'Lima, Darryl D; Chung, Christine B; Du, Jiang
2015-07-01
Magnetization transfer (MT) imaging is one way to indirectly assess pools of protons with fast transverse relaxation. However, conventional MT imaging sequences are not applicable to short T2 tissues such as cortical bone. Ultrashort echo time (UTE) sequences with TE values as low as 8 µs can detect signals from different water components in cortical bone. In this study we aim to evaluate two-dimensional UTE-MT imaging of cortical bone and its application in assessing cortical bone porosity as measured by micro-computed tomography (μCT) and biomechanical properties. In total, 38 human cadaveric distal femur and proximal tibia bones were sectioned to produce 122 rectangular pieces of cortical bone for quantitative UTE-MT MR imaging, μCT, and biomechanical testing. Off-resonance saturation ratios (OSRs) with a series of MT pulse frequency offsets (Δf) were calculated and compared with porosity assessed with μCT, as well as elastic (modulus, yield stress, and strain) and failure (ultimate stress, failure strain, and energy) properties, using Pearson correlation and linear regression. A moderately strong negative correlation was observed between OSR and μCT porosity (R^2 = 0.46-0.51), while a moderate positive correlation was observed between OSR and yield stress (R^2 = 0.25-0.30) and failure stress (R^2 = 0.31-0.35), and a weak positive correlation (R^2 = 0.09-0.12) between OSR and Young's modulus at all off-resonance saturation frequencies. OSR determined with the UTE-MT sequence provides quantitative information on cortical bone and is sensitive to μCT porosity and biomechanical function. Copyright © 2015 John Wiley & Sons, Ltd.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... 8, 2010. The Commission subsequently extended the time period in which to either approve the... BX Venture Market will have minimal quantitative listing standards, but will have qualitative... another national securities exchange for failure to meet quantitative listing standards (including price...
Accelerated failure time models provide a useful statistical framework for aging research.
Swindell, William R
2009-03-01
Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model "deceleration factor". AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data.
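One way to fit an AFT model and read off a deceleration factor, assuming the Python lifelines package and a toy survivorship table; in a Weibull AFT fit, exp(coef) of a covariate is exactly the multiplicative, age-independent stretch of the survival curve described above.

```python
import pandas as pd
from lifelines import WeibullAFTFitter

# Toy survivorship table: lifespan (days), an event flag (0 = censored),
# and one binary genetic manipulation; values are illustrative.
df = pd.DataFrame({
    "lifespan": [620, 710, 805, 880, 940, 1010, 760, 900, 980, 1100, 1150, 1240],
    "observed": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1],
    "treated":  [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="lifespan", event_col="observed")
# exp(coef) for `treated` is the deceleration factor.
print(aft.summary.loc[("lambda_", "treated"), "exp(coef)"])
```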
Evaluation and treatment of students with difficulties passing the Step examinations.
Laatsch, Linda
2009-05-01
The author designed this retrospective case series study both to systematically examine characteristics of individuals referred for treatment after multiple failures on the United States Medical Licensing Examinations (USMLE) Step 1 or 2 administered by the National Board of Medical Examiners and to evaluate treatment effectiveness in a uniform sample. Six medical students referred to rehabilitation psychology met selection criteria. All students completed the requisite neuropsychological, academic, and psychological testing to identify cognitive and emotional strengths and weaknesses. All six underwent individualized cognitive rehabilitation (CR) with a primary focus on reading fluency and accuracy. All participants improved on a quantitative measure of reading speed and accuracy, and five of the six passed their next USMLE Step examination in spite of past failures. Medical students with identified difficulties on reading fluency, but no history of a learning disability, may benefit from systematic CR that addresses cognitive weaknesses related to test-taking abilities. The strong relationships between language and reading skills and the USMLE Step examinations suggest that some students may fail these examinations because of a relative weakness in language processing and reading fluency that may prohibit their successful completion of the Step examinations.
Tensile failure properties of the perinatal, neonatal, and pediatric cadaveric cervical spine.
Luck, Jason F; Nightingale, Roger W; Song, Yin; Kait, Jason R; Loyd, Andre M; Myers, Barry S; Bass, Cameron R Dale
2013-01-01
Biomechanical tensile testing of perinatal, neonatal, and pediatric cadaveric cervical spines to failure. To assess the tensile failure properties of the cervical spine from birth to adulthood. Pediatric cervical spine biomechanical studies have been few due to the limited availability of pediatric cadavers. Therefore, scaled data based on human adult and juvenile animal studies have been used to augment the limited pediatric cadaver data. Despite these efforts, substantial uncertainty remains in our understanding of pediatric cervical spine biomechanics. A total of 24 cadaveric osteoligamentous head-neck complexes, 20 weeks gestation to 18 years, were sectioned into segments (occiput-C2 [O-C2], C4-C5, and C6-C7) and tested in tension to determine axial stiffness, displacement at failure, and load-to-failure. Tensile stiffness-to-failure (N/mm) increased by age (O-C2: 23-fold, neonate: 22 ± 7, 18 yr: 504; C4-C5: 7-fold, neonate: 71 ± 14, 18 yr: 509; C6-C7: 7-fold, neonate: 64 ± 17, 18 yr: 456). Load-to-failure (N) increased by age (O-C2: 13-fold, neonate: 228 ± 40, 18 yr: 2888; C4-C5: 9-fold, neonate: 207 ± 63, 18 yr: 1831; C6-C7: 10-fold, neonate: 174 ± 41, 18 yr: 1720). Normalized displacement at failure (mm/mm) decreased by age (O-C2: 6-fold, neonate: 0.34 ± 0.076, 18 yr: 0.059; C4-C5: 3-fold, neonate: 0.092 ± 0.015, 18 yr: 0.035; C6-C7: 2-fold, neonate: 0.088 ± 0.019, 18 yr: 0.037). Cervical spine tensile stiffness-to-failure and load-to-failure increased nonlinearly, whereas normalized displacement at failure decreased nonlinearly, from birth to adulthood. Pronounced ligamentous laxity observed at younger ages in the O-C2 segment quantitatively supports the prevalence of spinal cord injury without radiographic abnormality in the pediatric population. This study provides important and previously unavailable data for validating pediatric cervical spine models, for evaluating current scaling techniques and animal surrogate models, and for the development of more biofidelic pediatric crash test dummies.
Riegel, Barbara; Dickson, Victoria V; Hoke, Linda; McMahon, Janet P; Reis, Brendali F; Sayers, Steven
2006-01-01
Self-care is an integral component of successful heart failure (HF) management. Engaging patients in self-care can be challenging. Fifteen patients with HF enrolled during hospitalization received a motivational intervention designed to improve HF self-care. A mixed-method, pretest-posttest design was used to evaluate the proportion of patients in whom the intervention was beneficial and the mechanism of effectiveness. Participants received, on average, 3.0 ± 1.5 home visits (median 3, mode 3, range 1-6) over a three-month period from an advanced practice nurse trained in motivational interviewing and family counseling. Quantitative and qualitative data were used to judge individual patients in whom the intervention produced a clinically significant improvement in HF self-care. Audiotaped intervention sessions were analyzed using qualitative methods to assess the mechanism of intervention effectiveness. Congruence between quantitative and qualitative judgments of improved self-care revealed that 71.4% of participants improved in self-care after receiving the intervention. Analysis of transcribed intervention sessions revealed themes of 1) communication (reflective listening, empathy); 2) making it fit (acknowledging cultural beliefs, overcoming barriers and constraints, negotiating an action plan); and 3) bridging the transition from hospital to home (providing information, building skills, activating support resources). An intervention that incorporates the core elements of motivational interviewing may be effective in improving HF self-care, but further research is needed.
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
NASA Astrophysics Data System (ADS)
Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin
2016-04-01
Corrosion is a major problem that most often occurs in the power plant, and the heat recovery steam generator (HRSG) is a high-risk item of equipment there. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating; furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk levels of the HRSG equipment were analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative risk assessment following the API 581 standard places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
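A semi-quantitative API 581-style ranking combines a probability category (1-5) and a consequence category (A-E) into labels such as "4C"; the score boundaries below are invented so the toy output mirrors the abstract, and are not taken from the standard.

```python
# Hypothetical probability/consequence-to-risk mapping, not API 581's table.

def risk_level(prob_cat, cons_cat):
    score = prob_cat + "ABCDE".index(cons_cat) + 1    # 2..10
    if score >= 9:
        return "high"
    if score >= 7:
        return "medium-high"
    if score >= 5:
        return "medium"
    return "low"

for item, (p, c) in {"HP superheater": (4, "C"),
                     "HP evaporator":  (4, "C"),
                     "HP economizer":  (3, "C")}.items():
    print(f"{item}: {p}{c} -> {risk_level(p, c)}")
```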
Andrews, Laura Kierol; Coviello, Jessica; Hurley, Elisabeth; Rose, Leonie; Redeker, Nancy S.
2014-01-01
Background Poor sleep, including insomnia, is common among patients with heart failure (HF). However, little is known about the efficacy of interventions for insomnia in this population. Prior to developing interventions, there is a need for better understanding of patient perceptions about insomnia and its treatment. Objectives To evaluate HF patients’ perceptions about 1) insomnia and its consequences; 2) predisposing, precipitating, and perpetuating factors for insomnia; 3) self-management strategies and treatments for insomnia; and 4) preferences for insomnia treatment. Methods The study, guided by the “3 P” model of insomnia, employed a parallel convergent mixed methods design in which we obtained qualitative data through focus groups and quantitative data through questionnaires (sleep quality, insomnia severity, dysfunctional beliefs and attitudes about sleep; sleep-related daytime symptoms and functional performance). Content analysis was used to evaluate themes arising from the focus group data, and descriptive statistics were used to analyze the quantitative data. The results of both forms of data collection were compared and synthesized. Results HF patients perceived insomnia as having a negative impact on daytime function and comorbid health problems, pain, nocturia, and psychological factors as perpetuating factors. They viewed use of hypnotic medications as often necessary but disliked negative daytime side effects. They used a variety of strategies to manage their insomnia, but generally did not mention their sleep concerns to physicians whom they perceived as not interested in sleep. Conclusions HF patients believe insomnia is important and multi-factorial. Behavioral treatments, such as Cognitive Behavioral Therapy, for insomnia may be efficacious in modifying perpetuating factors and likely to be acceptable to patients. PMID:23998381
Rastgou, Fereydoon; Shojaeifard, Maryam; Amin, Ahmad; Ghaedian, Tahereh; Firoozabadi, Hasan; Malek, Hadi; Yaghoobi, Nahid; Bitarafan-Rajabi, Ahmad; Haghjoo, Majid; Amouzadeh, Hedieh; Barati, Hossein
2014-12-01
Recently, the phase analysis of gated single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) has become feasible via several software packages for the evaluation of left ventricular mechanical dyssynchrony. We compared two quantitative software packages, quantitative gated SPECT (QGS) and Emory cardiac toolbox (ECTb), with tissue Doppler imaging (TDI) as the conventional method for the evaluation of left ventricular mechanical dyssynchrony. Thirty-one patients with severe heart failure (ejection fraction ≤35%) and regular heart rhythm, who referred for gated-SPECT MPI, were enrolled. TDI was performed within 3 days after MPI. Dyssynchrony parameters derived from gated-SPECT MPI were analyzed by QGS and ECTb and were compared with the Yu index and septal-lateral wall delay measured by TDI. QGS and ECTb showed a good correlation for assessment of phase histogram bandwidth (PHB) and phase standard deviation (PSD) (r = 0.664 and r = 0.731, P < .001, respectively). However, the mean value of PHB and PSD by ECTb was significantly higher than that of QGS. No significant correlation was found between ECTb and QGS and the Yu index. Nevertheless, PHB, PSD, and entropy derived from QGS revealed a significant (r = 0.424, r = 0.478, r = 0.543, respectively; P < .02) correlation with septal-lateral wall delay. Despite a good correlation between QGS and ECTb software packages, different normal cut-off values of PSD and PHB should be defined for each software package. There was only a modest correlation between phase analysis of gated-SPECT MPI and TDI data, especially in the population of heart failure patients with both narrow and wide QRS complex.
Common Cause Failure Modeling in Space Launch Vehicles
NASA Technical Reports Server (NTRS)
Hark, Frank; Ring, Rob; Novack, Steven D.; Britton, Paul
2015-01-01
Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are dependent failures that can be caused, for example, by system environments, manufacturing, transportation, storage, maintenance, and assembly. Since many factors contribute to CCFs, they can be reduced, but are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent failures and dependent CCFs. Because common cause failure data is limited in the aerospace industry, the Probabilistic Risk Assessment (PRA) Team at Bastion Technology Inc. is estimating CCF risk using generic data collected by the Nuclear Regulatory Commission (NRC). Consequently, common cause risk estimates based on this database, when applied to other industry applications, are highly uncertain. Therefore, it is important to account for a range of values for independent and CCF risk and to communicate the uncertainty to decision makers. There is an existing methodology for reducing CCF risk during design, which includes a checklist of 40+ factors grouped into eight categories. Using this checklist, an approach is being investigated that quantitatively relates these factors to produce a beta-factor estimate. In this example, the checklist will be tailored to space launch vehicles, a quantitative approach will be described, and an example of the method will be presented.
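The beta-factor model mentioned above splits a component's total failure rate into an independent part, which redundancy defeats, and a common-cause part, which it does not; a minimal sketch with illustrative numbers:

```python
import math

lam_total = 2.0e-5   # total failure rate of one unit, per hour (illustrative)
beta      = 0.05     # common-cause fraction, e.g. scored from the checklist

lam_ccf = beta * lam_total          # disables every redundant unit at once
lam_ind = (1 - beta) * lam_total    # defeated by redundancy

t = 100.0                           # mission hours
p_one = 1 - math.exp(-lam_ind * t)
# 1-out-of-2 redundant pair; the small overlap term is neglected.
p_pair_loss = p_one ** 2 + (1 - math.exp(-lam_ccf * t))
print(f"P(loss of redundant pair) = {p_pair_loss:.2e}")  # CCF term dominates
```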
NASA Astrophysics Data System (ADS)
Niu, Xiqun
Polybutylene (PB) is a semicrystalline thermoplastic. It has been widely used in potable water distribution piping systems. However, field practice shows that failure occurs much earlier than the expected service lifetime. The causes of these failures, and how to evaluate the lifetime appropriately, motivate this study. In this thesis, three parts of work have been done. The first is the understanding of PB, which includes thermal and mechanical characterization of the material, aging phenomena, and notch sensitivity. The second part analyzes the applicability of the existing lifetime testing method for PB. It is shown that PB is an anomaly in terms of the temperature-lifetime relation because of the fracture mechanism transition across the testing temperature range. The third part is the development of a methodology of lifetime prediction for PB pipe. The fracture process of PB pipe consists of three stages, i.e., crack initiation, slow crack growth (SCG), and crack instability. The practical lifetime of PB pipe is primarily determined by the duration of the first two stages. The mechanism of crack initiation and the quantitative estimation of the time to crack initiation are studied by employing an environment stress cracking technique. A fatigue slow crack growth testing method has been developed and applied in the study of SCG. By using the Paris-Erdogan equation, a model is constructed to evaluate the time for SCG. As a result, the total lifetime is determined. Through this work, the failure mechanisms of PB pipe have been analyzed and the lifetime prediction methodology has been developed.
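Evaluating the SCG-stage lifetime via the Paris-Erdogan equation amounts to integrating da/dN = C (ΔK)^m from the initial defect size to instability; the constants below are placeholders, not measured PB properties.

```python
import numpy as np

# da/dN = C * (dK)^m integrated from the initial defect to the wall
# thickness; all constants are placeholders, not measured PB properties.
C, m  = 1e-7, 3.5       # Paris constants (mm/cycle, MPa*sqrt(mm) units)
sigma = 4.0             # hoop stress amplitude, MPa
Y     = 1.12            # geometry factor for a shallow surface crack
a, af = 0.1, 4.0        # initial crack depth and wall thickness, mm

N, da = 0.0, 0.001      # cycle count and crack-growth increment (mm)
while a < af:
    dK = Y * sigma * np.sqrt(np.pi * a)
    N += da / (C * dK ** m)     # cycles spent growing this increment
    a += da
print(f"cycles for the SCG stage = {N:.3g}")
```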
Of pacemakers and statistics: the actuarial method extended.
Dussel, J; Wolbarst, A B; Scott-Millar, R N; Obel, I W
1980-01-01
Pacemakers cease functioning because of either natural battery exhaustion (nbe) or component failure (cf). A study of four series of pacemakers shows that a simple extension of the actuarial method, so as to incorporate Normal statistics, makes possible a quantitative differentiation between the two modes of failure. This involves the separation of the overall failure probability density function PDF(t) into constituent parts pdf_nbe(t) and pdf_cf(t). The approach should allow a meaningful comparison of the characteristics of different pacemaker types.
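One plausible reading of the extension is a two-component mixture: a Normal density for battery exhaustion, which clusters around a mean life, plus an exponential density for random component failure, fitted by maximum likelihood. A sketch on synthetic failure times:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(2)
# Synthetic failure times (months): battery exhaustion clusters near a mean
# life (Normal); component failure is roughly constant-rate (exponential).
t = np.concatenate([rng.normal(60, 8, 300), rng.exponential(80, 60)])

def neg_loglik(params):
    w, mu, sd, lam = params
    pdf = w * stats.norm.pdf(t, mu, sd) + (1 - w) * stats.expon.pdf(t, scale=lam)
    return -np.sum(np.log(pdf + 1e-300))

res = optimize.minimize(neg_loglik, x0=[0.7, 50.0, 10.0, 50.0],
                        bounds=[(0.01, 0.99), (1, 150), (1, 50), (1, 300)])
w, mu, sd, lam = res.x
print(f"nbe share = {w:.2f}, mean battery life = {mu:.1f} months")
```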
Causes of catastrophic failure in complex systems
NASA Astrophysics Data System (ADS)
Thomas, David A.
2010-08-01
Root causes of mission critical failures and major cost and schedule overruns in complex systems and programs are studied through the post-mortem analyses compiled for several examples, including the Hubble Space Telescope, the Challenger and Columbia Shuttle accidents, and the Three Mile Island nuclear power plant accident. The roles of organizational complexity, cognitive biases in decision making, the display of quantitative data, and cost and schedule pressure are all considered. Recommendations for mitigating the risk of similar failures in future programs are also provided.
Probabilistic failure assessment with application to solid rocket motors
NASA Technical Reports Server (NTRS)
Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.
1990-01-01
A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.
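At its core, such a stochastic framework evaluates a limit state under uncertain inputs. A minimal Monte Carlo stress-strength sketch for a debond-type failure, with distributions and parameters invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# Uncertain driving stress vs. uncertain bond strength (illustrative).
stress   = rng.lognormal(mean=np.log(1.0), sigma=0.25, size=n)   # MPa
strength = rng.lognormal(mean=np.log(2.0), sigma=0.30, size=n)   # MPa

p_fail = np.mean(stress > strength)
print(f"estimated P(debond) = {p_fail:.2e}")
```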
Quantitative risk analysis of oil storage facilities in seismic areas.
Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto
2005-08-31
Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
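Crossing a fragility curve with a PSHA hazard curve amounts to integrating the conditional failure probability against the hazard's occurrence rates; a minimal sketch with illustrative curves:

```python
import numpy as np
from scipy import stats

pga = np.linspace(0.01, 2.0, 400)                     # peak ground accel., g
hazard = 1e-3 * (pga / 0.1) ** -2.2                   # annual exceedance rate
fragility = stats.lognorm(s=0.5, scale=0.6).cdf(pga)  # P(tank failure | PGA)

# lambda_fail = -integral of fragility(a) dH(a); H decreases with a,
# so the sign flip makes the rate positive.
lam_fail = -np.trapz(fragility, hazard)
print(f"annual failure rate = {lam_fail:.2e}")
```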
Wood-adhesive bonding failure : modeling and simulation
Zhiyong Cai
2010-01-01
The mechanism of wood bonding failure when exposed to wet conditions or wet/dry cycles is not fully understood, and the role of the resulting internal stresses exerted upon the wood-adhesive bondline has yet to be quantitatively determined. Unlike previous modeling, this study has developed a new two-dimensional internal-stress model on the basis of the mechanics of...
ERIC Educational Resources Information Center
Awofala, Adeneye O. A.; Odogwu, Helen N.
2017-01-01
The study investigated mathematics cognitive failures as related to mathematics anxiety, gender and performance in calculus among 450 preservice teachers from four public universities in the South West geo-political zone of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…
ERIC Educational Resources Information Center
Rieg, Sue A.
2007-01-01
With the focus on standardized tests, it appears that we are leaving classroom assessments and students at-risk of school failure behind. This quantitative study investigated the perceptions of junior high school teachers, and students at risk of school failure, on the effectiveness and level of use of various classroom assessments and…
Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.
Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo
2015-02-12
Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution 3 T k-t perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.
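Fermi deconvolution models the myocardial impulse response as a Fermi function and fits it so that the arterial input convolved with the response reproduces the tissue curve; a sketch on synthetic signals, with all shapes and constants invented:

```python
import numpy as np
from scipy.optimize import curve_fit

dt = 0.5                                   # s per dynamic frame
t = np.arange(0.0, 60.0, dt)
aif = (t / 4.0) ** 2 * np.exp(-t / 4.0)    # toy arterial input function

def fermi(t, F, tau, k):
    return F / (1.0 + np.exp((t - tau) / k))

def tissue_model(t, F, tau, k):
    # tissue curve = AIF convolved with the Fermi impulse response
    return dt * np.convolve(aif, fermi(t, F, tau, k))[: len(t)]

truth = tissue_model(t, 0.02, 6.0, 2.0)
noisy = truth + np.random.default_rng(4).normal(0.0, 1e-4, len(t))

(F, tau, k), _ = curve_fit(tissue_model, t, noisy, p0=[0.01, 5.0, 1.0])
print(f"fitted flow parameter F = {F:.3f} (true 0.020)")
```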
Press, Craig A; Morgan, Lindsey; Mills, Michele; Stack, Cynthia V; Goldstein, Joshua L; Alonso, Estella M; Wainwright, Mark S
2017-01-01
Spectral electroencephalogram analysis is a method for automated analysis of electroencephalogram patterns that can be performed at the bedside. We sought to determine the utility of spectral electroencephalography for grading hepatic encephalopathy in children with acute liver failure, in a retrospective cohort study at a tertiary care pediatric hospital. Patients between 0 and 18 years old who presented with acute liver failure and were admitted to the PICU were included; there were no interventions. Electroencephalograms were analyzed by spectral analysis, including total power, relative δ, relative θ, relative α, relative β, θ-to-δ ratio, and α-to-δ ratio. Normal values and ranges were first derived using normal electroencephalograms from 70 children of 0-18 years old. Age had a significant effect on each variable measured (p < 0.03). Electroencephalograms from 33 patients with acute liver failure were available for spectral analysis. The median age was 4.3 years, 14 of 33 were male, and the majority had an indeterminate etiology of acute liver failure. Neuroimaging was performed in 26 cases and was normal in 20 cases (77%). The majority (64%) survived, and 82% had a good outcome with a score of 1-3 on the Pediatric Glasgow Outcome Scale-Extended at the time of discharge. Hepatic encephalopathy grade correlated with the qualitative visual electroencephalogram scores assigned by blinded neurophysiologists (rs = 0.493; p < 0.006). Spectral electroencephalogram characteristics varied significantly with the qualitative electroencephalogram classification (p < 0.05). Spectral electroencephalogram variables including relative δ, relative θ, relative α, θ-to-δ ratio, and α-to-δ ratio all varied significantly with the qualitative electroencephalogram (p < 0.025). Moderate to severe hepatic encephalopathy was correlated with a total power of less than or equal to 50% of normal for children 0-3 years old, and with a relative θ of less than or equal to 50% of normal for children more than 3 years old (p > 0.05). Spectral electroencephalogram classification correlated with outcome (p < 0.05). Spectral electroencephalogram analysis can be used to evaluate even young patients for hepatic encephalopathy and correlates with outcome. Spectral electroencephalogram may allow improved quantitative and reproducible assessment of hepatic encephalopathy grade in children with acute liver failure.
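The spectral quantities used here, relative band powers and band ratios from a power spectral density, can be computed with a standard Welch estimate; a sketch on a synthetic signal (a real analysis would use artifact-free EEG epochs):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 256                                   # sampling rate, Hz
eeg = rng.normal(size=60 * fs)             # stand-in for one clean EEG channel

f, psd = signal.welch(eeg, fs=fs, nperseg=4 * fs)

def band_power(lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.trapz(psd[mask], f[mask])

total = band_power(0.5, 30.0)
delta, theta = band_power(0.5, 4.0), band_power(4.0, 8.0)
alpha = band_power(8.0, 13.0)

print(f"relative delta = {delta / total:.2f}, theta/delta = {theta / delta:.2f}")
print(f"relative alpha = {alpha / total:.2f}, alpha/delta = {alpha / delta:.2f}")
```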
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
... The Commission subsequently extended the time period in which to either approve the proposed rule... Market." The BX Venture Market will have minimal quantitative listing standards, but have qualitative... securities exchange for failure to meet quantitative listing standards (including price or other market value...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L
Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and prioritization of research to provide a technical basis both for the standards and for the analysis related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.
NASA Technical Reports Server (NTRS)
Monaghan, Mark W.; Gillespie, Amanda M.
2013-01-01
During the shuttle era, NASA utilized a failure reporting system called Problem Reporting and Corrective Action (PRACA); its purpose was to identify and track system non-conformance. Over the years the PRACA system evolved from a relatively nominal way to identify system problems into a very complex tracking and report-generating database. The PRACA system became the primary method to categorize any and all anomalies, from corrosion to catastrophic failure. The systems documented in the PRACA system range from flight hardware to ground or facility support equipment. While the PRACA system is complex, it does possess all the failure modes, times of occurrence, lengths of system delay, parts repaired or replaced, and corrective actions performed. The difficulty lies in mining the data and then utilizing them to estimate component, Line Replaceable Unit (LRU), and system reliability metrics. In this paper, we identify a methodology to categorize qualitative data from the ground system PRACA database for common ground or facility support equipment. Then, utilizing a heuristic developed for review of the PRACA data, we determine which reports identify a credible failure. These data are then used to determine inter-arrival times for estimating repairable component or LRU reliability. This analysis is used to determine failure modes of the equipment, to determine the probability of the component failure mode, and to support various quantitative techniques for repairable system analysis. The result is an effective and concise reliability estimate for components used in manned space flight operations. The advantage is that components or LRUs are evaluated in the same environment and conditions that occur during the launch process.
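Once credible failures are reduced to inter-arrival times, simple repairable-system estimates follow; the sketch below computes a point MTBF and a Laplace trend statistic on invented arrival times for one LRU.

```python
import numpy as np

# Cumulative operating hours at each credible failure of one LRU (invented).
arrivals = np.array([310.0, 800.0, 1450.0, 1900.0, 2600.0])
T = 3000.0                       # total observation window, hours

n = len(arrivals)
mtbf = T / n                     # point estimate for a repairable unit
# Laplace trend test: u near 0 supports a constant rate; u >> 0 suggests
# deterioration, u << 0 improvement (reliability growth).
u = (arrivals.mean() - T / 2) / (T * np.sqrt(1.0 / (12 * n)))
print(f"MTBF = {mtbf:.0f} h, Laplace u = {u:.2f}")
```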
Fidelity Failures in Brief Strategic Family Therapy for Adolescent Drug Abuse: A Clinical Analysis.
Lebensohn-Chialvo, Florencia; Rohrbaugh, Michael J; Hasler, Brant P
2018-04-30
As evidence-based family treatments for adolescent substance use and conduct problems gain traction, cutting edge research moves beyond randomized efficacy trials to address questions such as how these treatments work and how best to disseminate them to community settings. A key factor in effective dissemination is treatment fidelity, which refers to implementing an intervention in a manner consistent with an established manual. While most fidelity research is quantitative, this study offers a qualitative clinical analysis of fidelity failures in a large, multisite effectiveness trial of Brief Strategic Family Therapy (BSFT) for adolescent drug abuse, where BSFT developers trained community therapists to administer this intervention in their own agencies. Using case notes and video recordings of therapy sessions, an independent expert panel first rated 103 cases on quantitative fidelity scales grounded in the BSFT manual and the broader structural-strategic framework that informs BSFT intervention. Because fidelity was generally low, the panel reviewed all cases qualitatively to identify emergent types or categories of fidelity failure. Ten categories of failures emerged, characterized by therapist omissions (e.g., failure to engage key family members, failure to think in threes) and commissions (e.g., off-model, nonsystemic formulations/interventions). Of these, "failure to think in threes" appeared basic and particularly problematic, reflecting the central place of this idea in structural theory and therapy. Although subject to possible bias, our observations highlight likely stumbling blocks in exporting a complex family treatment like BSFT to community settings. These findings also underscore the importance of treatment fidelity in family therapy research. © 2018 Family Process Institute.
NDE of ceramics and ceramic composites
NASA Technical Reports Server (NTRS)
Vary, Alex; Klima, Stanley J.
1991-01-01
Although nondestructive evaluation (NDE) techniques for ceramics are fairly well developed, in many cases they are difficult to apply for high-probability detection of the minute flaws that can cause failure in monolithic ceramics. Conventional NDE techniques are available for monolithic and fiber-reinforced ceramic matrix composites, but the more exact quantitative techniques that are needed are still being investigated and developed. Needs range from detection of flaws below the 100-micron level in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in ceramic composites. NDE techniques that will ultimately be applicable to production and quality control of ceramic structures are still emerging from the laboratory. Needs differ depending on the processing stage, fabrication method, and nature of the finished product. NDE techniques are being developed in concert with materials processing research, where they can provide feedback to processing development and quality improvement. NDE techniques also serve as research tools for materials characterization and for understanding failure processes, e.g., during thermomechanical testing.
Lognormal Uncertainty Estimation for Failure Rates
NASA Technical Reports Server (NTRS)
Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.
2017-01-01
"Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
Cardiomyocyte-Specific Telomere Shortening is a Distinct Signature of Heart Failure in Humans.
Sharifi-Sanjani, Maryam; Oyster, Nicholas M; Tichy, Elisia D; Bedi, Kenneth C; Harel, Ofer; Margulies, Kenneth B; Mourkioti, Foteini
2017-09-07
Telomere defects are thought to play a role in cardiomyopathies, but the specific cell type affected by the disease in human hearts is not yet identified. The aim of this study was to systematically evaluate the cell type specificity of telomere shortening in patients with heart failure in relation to their cardiac disease, age, and sex. We studied cardiac tissues from patients with heart failure by utilizing telomere quantitative fluorescence in situ hybridization, a highly sensitive method with single-cell resolution. In this study, a total of 63 human left ventricular samples, including 37 diseased and 26 nonfailing donor hearts, were stained for telomeres in combination with cardiomyocyte- or α-smooth muscle cell-specific markers, cardiac troponin T, and smooth muscle actin, respectively, and assessed for telomere length. Patients with heart failure demonstrate shorter cardiomyocyte telomeres compared with nonfailing donors, which is specific only to cardiomyocytes within diseased human hearts and is associated with cardiomyocyte DNA damage. Our data further reveal that hypertrophic hearts with reduced ejection fraction exhibit the shortest telomeres. In contrast to other reported cell types, no difference in cardiomyocyte telomere length is evident with age. However, under the disease state, telomere attrition manifests in both young and older patients with cardiac hypertrophy. Finally, we demonstrate that cardiomyocyte-telomere length is better sustained in women than men under diseased conditions. This study provides the first evidence of cardiomyocyte-specific telomere shortening in heart failure. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Non-Linear Optical Microscopy Sheds Light on Cardiovascular Disease
Caorsi, Valentina; Toepfer, Christopher; Sikkel, Markus B.; Lyon, Alexander R.; MacLeod, Ken; Ferenczi, Mike A.
2013-01-01
Many cardiac diseases have been associated with increased fibrosis and changes in the organization of fibrillar collagen. The degree of fibrosis is routinely analyzed with invasive histological and immunohistochemical methods, giving a limited and qualitative understanding of the tissue's morphological adaptation to disease. Our aim is to quantitatively evaluate the increase in fibrosis by three-dimensional imaging of the collagen network in the myocardium using the non-linear optical microscopy techniques Two-Photon Excitation microscopy (TPE) and Second Harmonic signal Generation (SHG). No sample staining is needed because numerous endogenous fluorophores are excited by a two-photon mechanism and highly non-centrosymmetric structures such as collagen generate strong second harmonic signals. We propose for the first time a 3D quantitative analysis to carefully evaluate the increased fibrosis in tissue from a rat model of heart failure post myocardial infarction. We show how to measure changes in fibrosis from the backward SHG (BSHG) alone, as only backward-propagating SHG is accessible for true in vivo applications. A 5-fold increase in collagen I fibrosis is detected in the remote surviving myocardium measured 20 weeks after infarction. The spatial distribution is also shown to change markedly, providing insight into the morphology of disease progression. PMID:23409139
Differential reliability : probabilistic engineering applied to wood members in bending-tension
Stanley K. Suddarth; Frank E. Woeste; William L. Galligan
1978-01-01
Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...
ERIC Educational Resources Information Center
Carey, Theodore; Carifio, James
2012-01-01
In an effort to reduce failure and drop-out rates, schools have been implementing minimum grading. One form involves raising catastrophically low student quarter grades to a predetermined minimum--typically a 50. Proponents argue it gives struggling students a reasonable chance to recover from failure. Critics contend the practice induces grade…
ERIC Educational Resources Information Center
Chabaya, Owence; Chiome, Chrispen; Chabaya, Raphinos A.
2009-01-01
The study sought to determine lecturers' and students' perceptions of factors contributing to students' failure to submit research projects on time in three departments of the Zimbabwe Open University. The study employed a descriptive survey design and was both quantitative and qualitative. The questionnaire used as a data-gathering instrument had…
Review of nutritional screening and assessment tools and clinical outcomes in heart failure.
Lin, Hong; Zhang, Haifeng; Lin, Zheng; Li, Xinli; Kong, Xiangqin; Sun, Gouzhen
2016-09-01
Recent studies have suggested that undernutrition as defined using multidimensional nutritional evaluation tools may affect clinical outcomes in heart failure (HF). The evidence supporting this correlation is unclear. Therefore, we conducted this systematic review to critically appraise the use of multidimensional evaluation tools in the prediction of clinical outcomes in HF. We performed descriptive analyses of all identified articles involving qualitative analyses. We used STATA to conduct meta-analyses when at least three studies that tested the same type of nutritional assessment or screening tools and used the same outcome were identified. Sensitivity analyses were conducted to validate our positive results. We identified 17 articles with qualitative analyses and 11 with quantitative analysis after comprehensive literature searching and screening. We determined that the prevalence of malnutrition is high in HF (range 16-90%), particularly in advanced and acute decompensated HF (approximate range 75-90%). Undernutrition as identified by multidimensional evaluation tools may be significantly associated with hospitalization, length of stay and complications and is particularly strongly associated with high mortality. The meta-analysis revealed that compared with other tools, Mini Nutritional Assessment (MNA) scores were the strongest predictors of mortality in HF [HR 4.32, 95% CI 2.30-8.11]. Our results remained reliable after conducting sensitivity analyses. The prevalence of malnutrition is high in HF, particularly in advanced and acute decompensated HF. Moreover, undernutrition as identified by multidimensional evaluation tools is significantly associated with unfavourable prognoses and high mortality in HF.
Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis
NASA Astrophysics Data System (ADS)
Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang
2017-07-01
In traditional reliability evaluation of machine center components, the component reliability model exhibits deviation and the evaluation result is low because failure propagation is overlooked. To rectify these problems, a new reliability evaluation method based on cascading failure analysis and failure influenced degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influenced degrees of the system components are assessed using the adjacency matrix and its transpose, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, and it shows the following: 1) the reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) the difference between the comprehensive and inherent reliability of a system component is positively correlated with its failure influenced degree, which provides a theoretical basis for reliability allocation in machine center systems.
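A minimal Python sketch of the PageRank step follows; the directed cascading-failure graph, damping factor, and component count are hypothetical placeholders, not values from the study.

    # Minimal sketch: ranking components by failure influenced degree via
    # a PageRank-style power iteration on a directed propagation graph.
    import numpy as np

    # A[i, j] = 1 if a failure of component i can propagate to component j.
    A = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [1, 0, 0, 0]], dtype=float)

    def pagerank(adj, damping=0.85, tol=1e-10):
        n = adj.shape[0]
        out_degree = adj.sum(axis=1)
        # Column-stochastic transition matrix (no dangling nodes here).
        M = np.zeros_like(adj)
        np.divide(adj.T, out_degree, out=M, where=out_degree > 0)
        r = np.full(n, 1.0 / n)
        while True:
            r_next = (1 - damping) / n + damping * M @ r
            if np.abs(r_next - r).sum() < tol:
                return r_next
            r = r_next

    # Larger score -> larger failure influenced degree.
    print(pagerank(A))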
NASA Technical Reports Server (NTRS)
DeCarvalho, N. V.; Chen, B. Y.; Pinho, S. T.; Baiz, P. M.; Ratcliffe, J. G.; Tay, T. E.
2013-01-01
A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.
Failure mode and effects analysis: too little for too much?
Dean Franklin, Bryony; Shebl, Nada Atef; Barber, Nick
2012-07-01
Failure mode and effects analysis (FMEA) is a structured prospective risk assessment method that is widely used within healthcare. FMEA involves a multidisciplinary team mapping out a high-risk process of care, identifying the failures that can occur, and then characterising each of these in terms of probability of occurrence, severity of effects and detectability, to give a risk priority number used to identify failures most in need of attention. One might assume that such a widely used tool would have an established evidence base. This paper considers whether or not this is the case, examining the evidence for the reliability and validity of its outputs, the mathematical principles behind the calculation of a risk priority number, and variation in how it is used in practice. We also consider the likely advantages of this approach, together with the disadvantages in terms of the healthcare professionals' time involved. We conclude that although FMEA is popular and many published studies have reported its use within healthcare, there is little evidence to support its use for the quantitative prioritisation of process failures. It lacks both reliability and validity, and is very time consuming. We would not recommend its use as a quantitative technique to prioritise, promote or study patient safety interventions. However, the stage of FMEA involving multidisciplinary mapping of the process seems valuable, and work is now needed to identify the best way of converting this into plans for action.
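For readers unfamiliar with the arithmetic being critiqued, a minimal Python sketch of the RPN calculation follows; the failure modes and their 1-10 scores are hypothetical.

    # Minimal sketch: RPN = occurrence x severity x detectability,
    # then rank failure modes by descending RPN.
    failure_modes = {  # name: (occurrence, severity, detectability)
        "wrong patient record opened":  (3, 9, 4),
        "dose entered incorrectly":     (4, 10, 3),
        "alarm silenced and forgotten": (5, 6, 6),
    }

    rpn = {name: o * s * d for name, (o, s, d) in failure_modes.items()}
    for name, score in sorted(rpn.items(), key=lambda kv: kv[1],
                              reverse=True):
        print(f"RPN {score:4d}  {name}")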
The Failing Heart Relies on Ketone Bodies as a Fuel.
Aubert, Gregory; Martin, Ola J; Horton, Julie L; Lai, Ling; Vega, Rick B; Leone, Teresa C; Koves, Timothy; Gardell, Stephen J; Krüger, Marcus; Hoppel, Charles L; Lewandowski, E Douglas; Crawford, Peter A; Muoio, Deborah M; Kelly, Daniel P
2016-02-23
Significant evidence indicates that the failing heart is energy starved. During the development of heart failure, the capacity of the heart to utilize fatty acids, the chief fuel, is diminished. Identification of alternate pathways for myocardial fuel oxidation could unveil novel strategies to treat heart failure. Quantitative mitochondrial proteomics was used to identify energy metabolic derangements that occur during the development of cardiac hypertrophy and heart failure in well-defined mouse models. As expected, the amounts of proteins involved in fatty acid utilization were downregulated in myocardial samples from the failing heart. Conversely, expression of β-hydroxybutyrate dehydrogenase 1, a key enzyme in the ketone oxidation pathway, was increased in the heart failure samples. Studies of relative oxidation in an isolated heart preparation using ex vivo nuclear magnetic resonance combined with targeted quantitative myocardial metabolomic profiling using mass spectrometry revealed that the hypertrophied and failing heart shifts to oxidizing ketone bodies as a fuel source in the context of reduced capacity to oxidize fatty acids. Distinct myocardial metabolomic signatures of ketone oxidation were identified. These results indicate that the hypertrophied and failing heart shifts to ketone bodies as a significant fuel source for oxidative ATP production. Specific metabolite biosignatures of in vivo cardiac ketone utilization were identified. Future studies aimed at determining whether this fuel shift is adaptive or maladaptive could unveil new therapeutic strategies for heart failure. © 2016 American Heart Association, Inc.
Angermann, Christiane E; Störk, Stefan; Gelbrich, Götz; Faller, Hermann; Jahns, Roland; Frantz, Stefan; Loeffler, Markus; Ertl, Georg
2012-01-01
Trials investigating efficacy of disease management programs (DMP) in heart failure reported contradictory results. Features rendering specific interventions successful are often ill defined. We evaluated the mode of action and effects of a nurse-coordinated DMP (HeartNetCare-HF, HNC). Patients hospitalized for systolic heart failure were randomly assigned to HNC or usual care (UC). Besides telephone-based monitoring and education, HNC addressed individual problems raised by patients, pursued networking of health care providers and provided training for caregivers. End points were time to death or rehospitalization (combined primary), heart failure symptoms, and quality of life (SF-36). Of 1007 consecutive patients, 715 were randomly assigned (HNC: n=352; UC: n=363; age, 69±12 years; 29% female; 40% New York Heart Association class III-IV). Within 180 days, 130 HNC and 137 UC patients reached the primary end point (hazard ratio, 1.02; 95% confidence interval, 0.81-1.30; P=0.89), since more HNC patients were readmitted. Overall, 32 HNC and 52 UC patients died (1 UC patient and 4 HNC patients after dropout); thus, uncensored hazard ratio was 0.62 (0.40-0.96; P=0.03). HNC patients improved more regarding New York Heart Association class (P=0.05), physical functioning (P=0.03), and physical health component (P=0.03). Except for HNC, health care utilization was comparable between groups. However, HNC patients requested counseling for noncardiac problems even more frequently than for cardiovascular or heart-failure-related issues. The primary end point of this study was neutral. However, mortality risk and surrogates of well-being improved significantly. Quantitative assessment of patient requirements suggested that besides (tele)monitoring individualized care considering also noncardiac problems should be integrated in efforts to achieve more sustainable improvement in heart failure outcomes. URL: http://www.controlled-trials.com. Unique identifier: ISRCTN23325295.
Quantitative risk assessment system (QRAS)
NASA Technical Reports Server (NTRS)
Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)
2001-01-01
A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest-level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for quantitative risk computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
CT-derived indices of canine osteosarcoma-affected antebrachial strength.
Garcia, Tanya C; Steffey, Michele A; Zwingenberger, Allison L; Daniel, Leticia; Stover, Susan M
2017-05-01
To improve the prediction of fractures in dogs with bone tumors of the distal radius by identifying computed tomography (CT) indices that correlate with antebrachial bone strength and fracture location. Prospective experimental study. Dogs with antebrachial osteosarcoma (n=10) and normal cadaver bones (n=9). Antebrachia were imaged with quantitative CT prior to biomechanical testing to failure. CT indices of structural properties were compared to yield force and maximum force using Pearson correlation tests. Straight beam failure (Fs), axial rigidity, curved beam failure (Fc), and craniocaudal bending moment of inertia (MOICrCd) CT indices most highly correlated (0.77 > R > 0.57) with yield and maximum forces when OSA-affected and control bones were included in the analysis. Considering only OSA-affected bones, Fs, Fc, and axial rigidity correlated highly (0.85 > R > 0.80) with maximum force. In affected bones, the location of minimum axial rigidity and maximum MOICrCd correlated highly (R > 0.85) with the actual fracture location. CT-derived axial rigidity, Fs, and MOICrCd have strong linear relationships with yield and maximum force. These indices should be further evaluated prospectively in OSA-affected dogs that do, and do not, experience pathologic fracture. © 2017 The American College of Veterinary Surgeons.
A New Quantitative Method for the Rapid Evaluation of Buildings against Earthquakes
NASA Astrophysics Data System (ADS)
Mahmoodzadeh, Amir; Mazaheri, Mohammad Mehdi
2008-07-01
At the present time there exist numerous weak buildings that are not able to withstand earthquakes. At the same time, because financial resources and time are limited, both private and public developers are trying to use scientific methods to prioritize and allocate budgets for reinforcing these structures. In recent years the procedure of seismic assessment before rehabilitation of vulnerable buildings has been implemented in many countries. It now seems logical to reinforce the existing procedures with the mass of available data about the effects of earthquakes on buildings. The main idea is derived from FMEA (Failure Mode and Effect Analysis) in quality management, where the main procedure is to recognize each failure, its causes, and the priority of each cause and failure. By specifying the causes and effects that lead to a certain shortcoming in structural behavior during earthquakes, an inventory is developed and each building is rated through a yes-or-no procedure. In this way, the rating of the structure is based on standard forms which, along with relative weights, are developed in this study. The resulting rapid-assessment criteria indicate whether the structure is to be demolished, has high, medium, or low vulnerability, or is invulnerable.
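A minimal Python sketch of the weighted yes-or-no rating idea follows; the causes, weights, and answers are hypothetical placeholders, not the standard forms or relative weights developed in the study.

    # Minimal sketch: each deficiency found in a building contributes its
    # relative weight to an overall vulnerability score.
    causes = {                      # hypothetical relative weights
        "soft storey":           0.25,
        "short columns":         0.15,
        "plan irregularity":     0.10,
        "poor concrete quality": 0.30,
        "pounding risk":         0.20,
    }

    # One yes-or-no checklist answer per cause (True = present).
    answers = {"soft storey": True, "short columns": False,
               "plan irregularity": True, "poor concrete quality": False,
               "pounding risk": True}

    score = sum(w for cause, w in causes.items() if answers[cause])
    print(f"vulnerability score: {score:.2f} (0 = none, 1 = worst)")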
Laffont, I; Hoffmann, G; Dizien, O; Revol, M; Roby-Brami, A
2007-07-01
Prospective control cohort study. To develop a new test to qualitatively analyse grasping strategies in C6/C7 tetraplegic patients, and to quantify the effect of musculo-tendinous transfers. France. Twelve C6/C7 tetraplegic adults (17 arms; 31.3+/-7.9 years) and 17 healthy subjects (30.9+/-9.4 years) completed the study. We assessed participants' ability to grasp, move and release standardized balls of variable sizes and weights. Outcome measures were failures, movement duration (MD), grip patterns, and forearm orientation during transport. In patients as well as in controls, the number of digits involved in prehension increased proportionally to the size and weight of the ball. C6 non-operated tetraplegic patients failed 38.2% of the tasks. They frequently used supine transport (51.4% of successful tasks). MD was longer, with a large distribution of values. The presence of active elbow extension had little influence on the failure rate or grip configuration, but significantly reduced MD and supine transport (34%). Patients who were evaluated after hand surgery showed a trend towards improved MD and more frequent completion (failure 30%), especially for middle-sized and middle-weighted balls. Grip patterns were deeply modified, and all transports were made in pronation. The 'Tetra Ball Test' reveals the characteristics of grasping in tetraplegic patients and those influenced by surgery. It may be useful in understanding the effects of surgical procedures. This preliminary study must be completed to evaluate the quantitative responsiveness and reproducibility of this test and to develop instrumented electronic balls to optimise it.
An Approach for Reducing the Error Rate in Automated Lung Segmentation
Gill, Gurman; Beichel, Reinhard R.
2016-01-01
Robust lung segmentation is challenging, especially when tens of thousands of lung CT scans need to be processed, as required by large multi-center studies. The goal of this work was to develop and assess a method for the fusion of segmentation results from two different methods to generate lung segmentations that have a lower failure rate than individual input segmentations. As the basis for the fusion approach, lung segmentations generated with region-growing and model-based approaches were utilized. The fusion result was generated by comparing input segmentations and selectively combining them using a trained classification system. The method was evaluated on a diverse set of 204 CT scans of normal and diseased lungs. The fusion approach resulted in a Dice coefficient of 0.9855 ± 0.0106 and showed a statistically significant improvement compared to both input segmentation methods. In addition, the failure rate at different segmentation accuracy levels was assessed. For example, when requiring that lung segmentations must have a Dice coefficient of better than 0.97, the fusion approach had a failure rate of 6.13%. In contrast, the failure rate for region growing and model-based methods was 18.14% and 15.69%, respectively. Therefore, the proposed method improves the quality of the lung segmentations, which is important for subsequent quantitative analysis of lungs. Also, to enable a comparison with other methods, results on the LOLA11 challenge test set are reported. PMID:27447897
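A minimal Python sketch of the Dice coefficient used to grade the segmentations follows; the masks are synthetic, and the pick-the-better-candidate rule is a toy stand-in for the paper's trained classification system.

    # Minimal sketch: Dice overlap between binary masks, plus a trivial
    # selection rule between two candidate segmentations.
    import numpy as np

    def dice(a, b):
        """Dice coefficient between two binary masks."""
        intersection = np.logical_and(a, b).sum()
        return 2.0 * intersection / (a.sum() + b.sum())

    rng = np.random.default_rng(0)
    reference = rng.integers(0, 2, size=(64, 64))
    candidate1 = reference.copy()
    candidate1[:4] = 0                               # mild corruption
    candidate2 = rng.integers(0, 2, size=(64, 64))   # unrelated mask

    for i, c in enumerate((candidate1, candidate2), start=1):
        print(f"candidate {i}: Dice = {dice(c, reference):.3f}")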
Quantitative Risk Analysis on the Transport of Dangerous Goods Through a Bi-Directional Road Tunnel.
Caliendo, Ciro; De Guglielmo, Maria Luisa
2017-01-01
A quantitative risk analysis (QRA) regarding dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated in order to assess their impact on the risk level. The risk associated with an alternative route running completely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, as F/N curves, show an increased risk level with an increase in the VHP or the percentage of HGVs, and with a failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the alternative route running in the open air depending on the type of dangerous goods transported. In particular, risk was found to be greater in the tunnel for two fire scenarios (no explosion). In contrast, the risk level for the exposed population was found to be greater for the alternative route in three possible accident scenarios associated with explosions and toxic releases. Therefore, one should be wary of stating that an itinerary running completely in the open air may be used for the transport of dangerous goods if that route passes through a populated area. The QRA may help decision-makers both to implement additional safety measures and to understand whether to allow, forbid, or limit circulation of DGVs. © 2016 Society for Risk Analysis.
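A minimal Python sketch of how a societal-risk F/N curve is assembled from scenario frequencies and consequences follows; the scenario values are hypothetical.

    # Minimal sketch: F(N) = cumulative frequency (per year) of accident
    # scenarios causing at least N fatalities.
    scenarios = [  # (frequency per year, fatalities)
        (1e-3, 1), (5e-4, 3), (1e-4, 10), (2e-5, 50), (1e-6, 200),
    ]

    for n in sorted({n for _, n in scenarios}):
        f = sum(freq for freq, m in scenarios if m >= n)
        print(f"N >= {n:4d}: F = {f:.2e} /year")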
Modelling passive diastolic mechanics with quantitative MRI of cardiac structure and function.
Wang, Vicky Y; Lam, H I; Ennis, Daniel B; Cowan, Brett R; Young, Alistair A; Nash, Martyn P
2009-10-01
The majority of patients with clinically diagnosed heart failure have normal systolic pump function and are commonly categorized as suffering from diastolic heart failure. The left ventricle (LV) remodels its structure and function to adapt to pathophysiological changes in geometry and loading conditions, which in turn can alter the passive ventricular mechanics. In order to better understand passive ventricular mechanics, an LV finite element (FE) model was customized to geometric data segmented from in vivo tagged magnetic resonance imaging (MRI) data and myofibre orientation derived from ex vivo diffusion tensor MRI (DTMRI) of a canine heart using nonlinear finite element fitting techniques. MRI tissue tagging enables quantitative evaluation of cardiac mechanical function with high spatial and temporal resolution, whilst the direction of maximum water diffusion in each voxel of a DTMRI directly corresponds to the local myocardial fibre orientation. Due to differences in myocardial geometry between in vivo and ex vivo imaging, myofibre orientations were mapped into the geometric FE model using host mesh fitting (a free form deformation technique). Pressure recordings, temporally synchronized to the tagging data, were used as the loading constraints to simulate the LV deformation during diastole. Simulation of diastolic LV mechanics allowed us to estimate the stiffness of the passive LV myocardium based on kinematic data obtained from tagged MRI. Integrated physiological modelling of this kind will allow more insight into mechanics of the LV on an individualized basis, thereby improving our understanding of the underlying structural basis of mechanical dysfunction under pathological conditions.
Risk analysis of heat recovery steam generator with semi-quantitative risk-based inspection API 581
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. The impact of corrosion damage causes the HRSG power plant to stop operating. Furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for risk analysis of the HRSG 1. By using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presented a case study relating to risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI to process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of the API 581 standard place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
Fierstra, Jorn; van Niftrik, Christiaan; Warnock, Geoffrey; Wegener, Susanne; Piccirelli, Marco; Pangalu, Athina; Esposito, Giuseppe; Valavanis, Antonios; Buck, Alfred; Luft, Andreas; Bozinov, Oliver; Regli, Luca
2018-03-01
Increased stroke risk correlates with hemodynamic failure, which can be assessed with (15O-)H2O positron emission tomography (PET) cerebral blood flow (CBF) measurements. This gold standard technique, however, is not established for routine clinical imaging. Standardized blood oxygen-level-dependent (BOLD) functional magnetic resonance imaging+CO2 is a noninvasive and potentially widely applicable tool to assess whole-brain quantitative cerebrovascular reactivity (CVR). We examined the agreement between the 2 imaging modalities and hypothesized that quantitative CVR can be a surrogate imaging marker to assess hemodynamic failure. Nineteen data sets of subjects with chronic cerebrovascular steno-occlusive disease (age, 60±11 years; 4 women) and unilaterally impaired perfusion reserve on Diamox-challenged (15O-)H2O PET were studied and compared with a standardized BOLD functional magnetic resonance imaging+CO2 examination within 6 weeks (8±19 days). Agreement between quantitative CBF- and CVR-based perfusion reserve was assessed. Hemodynamic failure was staged according to PET findings: stage 0: normal CBF, normal perfusion reserve; stage I: normal CBF, decreased perfusion reserve; and stage II: decreased CBF, decreased perfusion reserve. The BOLD CVR data set of the same subjects was then matched to the corresponding stage of hemodynamic failure. PET-based stage I versus stage II could also be clearly separated with BOLD CVR measurements (CVR for stage I 0.11 versus CVR for stage II -0.03; P<0.01). Hemispheric and middle cerebral artery territory difference analyses (ie, affected versus unaffected side) showed a significant correlation for CVR impairment in the affected hemisphere and middle cerebral artery territory (P<0.01, R^2=0.47 and P=0.02, R^2=0.25, respectively). BOLD CVR corresponded well to CBF perfusion reserve measurements obtained with (15O-)H2O PET, especially for detecting hemodynamic failure in the affected hemisphere and middle cerebral artery territory and for identifying hemodynamic failure stage II. BOLD CVR may, therefore, be considered for prospective studies assessing stroke risk in patients with chronic cerebrovascular steno-occlusive disease, in particular because it can potentially be implemented in routine clinical imaging. © 2018 American Heart Association, Inc.
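A minimal Python sketch of a CVR computation follows, assuming CVR is taken as the regression slope of BOLD percent signal change against end-tidal CO2; the time series are synthetic and none of the study's processing details are reproduced.

    # Minimal sketch: CVR as %BOLD signal change per mmHg CO2.
    import numpy as np

    rng = np.random.default_rng(1)
    etco2 = 40 + 5 * np.sin(np.linspace(0, 4 * np.pi, 120))  # mmHg
    bold = 100 * (1 + 0.002 * (etco2 - 40)) + rng.normal(0, 0.05, 120)

    pct = 100 * (bold - bold.mean()) / bold.mean()
    slope, _ = np.polyfit(etco2, pct, 1)
    # Positive slope ~ intact reactivity; negative values indicate the
    # paradoxical response seen in hemodynamic failure.
    print(f"CVR = {slope:.3f} %BOLD / mmHg CO2")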
NASA Astrophysics Data System (ADS)
Taylor, Gabriel James
The failure of electrical cables exposed to severe thermal fire conditions is a safety concern for operating commercial nuclear power plants (NPPs). The Nuclear Regulatory Commission (NRC) has promoted the use of risk-informed and performance-based methods for fire protection, which resulted in a need to develop realistic methods to quantify the risk of fire to NPP safety. Recent electrical cable testing has been conducted to provide empirical data on the failure modes and likelihood of fire-induced damage. This thesis evaluated numerous aspects of the data. Circuit characteristics affecting fire-induced electrical cable failure modes have been evaluated. In addition, thermal failure temperatures corresponding to cable functional failures have been evaluated to develop realistic single-point thermal failure thresholds and probability distributions for specific cable insulation types. Finally, the data were used to evaluate the prediction capabilities of a one-dimensional conductive heat transfer model used to predict cable failure.
Santarelli, Simona; Russo, Veronica; Lalle, Irene; De Berardinis, Benedetta; Vetrone, Francesco; Magrini, Laura; Di Stasio, Enrico; Piccoli, Antonio; Codognotto, Marta; Mion, Monica M; Castello, Luigi M; Avanzi, Gian Carlo; Di Somma, Salvatore
2017-06-01
The objective of this study was to investigate the prognostic role of quantitative reduction of congestion during hospitalization assessed by Bioelectrical Impedance Vector Analysis (BIVA) serial evaluations in patients admitted for acute heart failure (AHF). AHF is a frequent reason for patients to be admitted. Exacerbation of chronic heart failure is linked with a progressive worsening of the disease with increased incidence of death. Fluid overload is the main mechanism underlying acute decompensation in these patients. BIVA is a validated technique able to quantify fluid overload. This was a prospective, multicentre, observational study of AHF and non-AHF patients in three Emergency Department centres in Italy. Clinical data and BIVA evaluations were performed at admission (t0) and discharge (tdis). A follow-up phone call was carried out at 90 days. Three hundred and thirty-six patients were enrolled (221 AHF and 115 non-AHF patients). We found that clinical signs showed the most powerful prognostic relevance. In particular, the presence of rales and lower-limb oedema at tdis was linked with event relapse at 90 days. At t0, congestion detected by BIVA was observed only in the AHF group, and significantly decreased at tdis. An increase of resistance variation (dR/H) >11 Ω/m during hospitalization was associated with survival. BIVA showed significant results in predicting total events, both at t0 (area under the curve (AUC) 0.56, p<0.04) and at tdis (AUC 0.57, p<0.03). When combined with clinical signs, BIVA showed a very good predictive value for cardiovascular events at 90 days (AUC 0.97, p<0.0001). In AHF patients, an accurate physical examination evaluating the presence of rales and lower-limb oedema remains the cornerstone in the management of patients with AHF. A congestion reduction, obtained as a consequence of therapies and detected through BIVA analysis, with an increase of dR/H >11 Ω/m during hospitalization, seems to be associated with increased 90-day survival in patients admitted for AHF.
Kalolo, Albino; Radermacher, Ralf; Stoermer, Manfred; Meshack, Menoris; De Allegri, Manuela
2015-01-01
Background Despite the implementation of various initiatives to address low enrollment in voluntary micro health insurance (MHI) schemes in sub-Saharan Africa, the problem of low enrollment remains unresolved. The lack of process evaluations of such interventions makes it difficult to ascertain whether their poor results are because of design failures or implementation weaknesses. Objective In this paper, we describe a process evaluation protocol aimed at opening the ‘black box’ to evaluate the implementation processes of the Redesigned Community Health Fund (CHF) program in the Dodoma region of Tanzania. Design The study employs a cross-sectional mixed methods design and is being carried out 3 years after the launch of the Redesigned CHF program. The study is grounded in a conceptual framework which rests on the Diffusion of Innovation Theory and the Implementation Fidelity Framework. The study utilizes a mixture of quantitative and qualitative data collection tools (questionnaires, focus group discussions, in-depth interviews, and document review), and aligns the evaluation to the Theory of Intervention developed by our team. Quantitative data will be used to measure program adoption, implementation fidelity, and their moderating factors. Qualitative data will be used to explore the responses of stakeholders to the intervention, contextual factors, and moderators of adoption, implementation fidelity, and sustainability. Discussion This protocol describes a systematic process evaluation in relation to the implementation of a reformed MHI. We trust that the theoretical approaches and methodologies described in our protocol may be useful to inform the design of future process evaluations focused on the assessment of complex interventions, such as MHI schemes. PMID:26679408
Fractography: determining the sites of fracture initiation.
Mecholsky, J J
1995-03-01
Fractography is the analysis of fracture surfaces. Here, it refers to quantitative fracture surface analysis (FSA) in the context of applying the principles of fracture mechanics to the topography observed on the fracture surface of brittle materials. The application of FSA is based on the principle that encoded on the fracture surface of brittle materials is the entire history of the fracture process. It is our task to develop the skills and knowledge to decode this information. There are several motivating factors for applying our knowledge of FSA. The first and foremost is that there is specific, quantitative information to be obtained from the fracture surface. This information includes the identification of the size and location of the fracture initiating crack or defect, the stress state at failure, the existence, or not, of local or global residual stress, the existence, or not, of stress corrosion and a knowledge of local processing anomalies which affect the fracture process. The second motivating factor is that the information is free. Once a material is tested to failure, the encoded information becomes available. If we decide to observe the features produced during fracture then we are rewarded with much information. If we decide to ignore the fracture surface, then we are left to guess and/or reason as to the cause of the failure without the benefit of all of the possible information available. This paper addresses the application of quantitative fracture surface analysis to basic research, material and product development, and "trouble-shooting" of in-service failures. First, the basic principles involved will be presented. Next, the methodology necessary to apply the principles will be presented. Finally, a summary of the presentation will be made showing the applicability to design and reliability.
Buck, Harleah G; Hupcey, Judith; Wang, Hsiao-Lan; Fradley, Michael; Donovan, Kristine A; Watach, Alexa
Recent heart failure (HF) patient and informal caregiver (eg, dyadic) studies have either examined self-care from a qualitative or quantitative perspective. To date, the 2 types of data have not been integrated. The aim of this study was to understand HF self-care within the context of dyadic engagement. This was a cross-sectional, mixed methods (quantitative/qualitative) study. Heart failure self-care was measured with the Self-care of Heart Failure Index (v.6) dichotomized to adequate (≥70) or inadequate (<69). Dyadic symptom management type was assessed with the Dyadic Symptom Management Type scale. Interviews regarding self-care were conducted with both dyad members present. Content analytic techniques were used. Data were integrated using an information matrix and triangulated using Creswell and Plano Clark's methods. Of the 27 dyads, HF participants were 56% men, with a mean age of 77 years. Caregivers were 74% women, with a mean age of 66 years, representing spouses (n = 14) and adult children (n = 7). Quantitatively, few dyads scored as adequate (≥70) in self-care; the qualitative data described the impact of adequacy on the dyads' behavior. Dyads who scored higher, individually or both, on self-care self-efficacy and self-care management were less likely to change from their life course pattern. Either the patient or dyad continued to handle all self-care as they always had, rather than trying new strategies or reaching out for help as the patient's condition deteriorated. Our data suggest links that should be explored between dyadic adequacy and response to patients' symptoms. Future studies should assess dyadic adequacy longitudinally and examine its relationship to event-free survival and health services cost.
A methodology for estimating risks associated with landslides of contaminated soil into rivers.
Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars
2014-02-15
Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load within one year is also high. Copyright © 2013 Elsevier B.V. All rights reserved.
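A minimal Python sketch of the probabilistic combination described above follows; the landslide probability, contaminant-load distribution, and acceptable-load criterion are hypothetical.

    # Minimal sketch: P(failure) = P(landslide) * P(exceed criterion |
    # landslide), with the conditional term from Monte Carlo sampling.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    p_landslide = 1e-3                 # assumed annual probability
    load = rng.lognormal(mean=2.0, sigma=0.8, size=n)  # mobilised kg
    acceptable_load = 20.0             # assumed criterion, kg

    p_exceed = (load > acceptable_load).mean()
    print(f"P(exceed | landslide) = {p_exceed:.3f}")
    print(f"annual P(failure)     = {p_landslide * p_exceed:.2e}")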
High-resolution three-dimensional imaging and analysis of rock falls in Yosemite valley, California
Stock, Gregory M.; Bawden, G.W.; Green, J.K.; Hanson, E.; Downing, G.; Collins, B.D.; Bond, S.; Leslar, M.
2011-01-01
We present quantitative analyses of recent large rock falls in Yosemite Valley, California, using integrated high-resolution imaging techniques. Rock falls commonly occur from the glacially sculpted granitic walls of Yosemite Valley, modifying this iconic landscape but also posing significant potential hazards and risks. Two large rock falls occurred from the cliff beneath Glacier Point in eastern Yosemite Valley on 7 and 8 October 2008, causing minor injuries and damaging structures in a developed area. We used a combination of gigapixel photography, airborne laser scanning (ALS) data, and ground-based terrestrial laser scanning (TLS) data to characterize the rock-fall detachment surface and adjacent cliff area, quantify the rock-fall volume, evaluate the geologic structure that contributed to failure, and assess the likely failure mode. We merged the ALS and TLS data to resolve the complex, vertical to overhanging topography of the Glacier Point area in three dimensions, and integrated these data with gigapixel photographs to fully image the cliff face in high resolution. Three-dimensional analysis of repeat TLS data reveals that the cumulative failure consisted of a near-planar rock slab with a maximum length of 69.0 m, a mean thickness of 2.1 m, a detachment surface area of 2750 m², and a volume of 5663 ± 36 m³. Failure occurred along a surface-parallel, vertically oriented sheeting joint in a clear example of granitic exfoliation. Stress concentration at crack tips likely propagated fractures through the partially attached slab, leading to failure. Our results demonstrate the utility of high-resolution imaging techniques for quantifying far-range (>1 km) rock falls occurring from the largely inaccessible, vertical rock faces of Yosemite Valley, and for providing highly accurate and precise data needed for rock-fall hazard assessment. © 2011 Geological Society of America.
Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya
2016-12-01
To examine the abilities of a traditional failure mode and effects analysis (FMEA) and a modified healthcare FMEA (m-HFMEA) scoring method by comparing the degree of congruence in identifying high-risk failures. The authors applied two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently compared to determine if the results and rated risks matched. The authors' results showed an agreement of 85% between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA-RPN scores. In the m-HFMEA analysis, once the risk score is determined, the failure mode should be investigated more thoroughly on the basis of the established HFMEA Decision Tree™. m-HFMEA is inductive because it requires the identification of consequences from causes, and semi-quantitative because it allows the prioritization of high risks and mitigation measures. It is therefore a useful prospective risk analysis tool for radiotherapy.
Risk analysis of analytical validations by probabilistic modification of FMEA.
Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J
2012-05-01
Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
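A minimal Python sketch of the proposed modification follows; the failure modes, frequencies, and detection probabilities are illustrative.

    # Minimal sketch: probabilistic FMEA. The frequency of an undetected
    # failure mode is f_occurrence * (1 - p_detection); severity stays
    # categorical.
    modes = {  # name: (occurrences per 1000 runs, P(detection), severity)
        "baseline drift":      (20.0, 0.95, "minor"),
        "wrong reference set": ( 2.0, 0.60, "critical"),
        "sample mix-up":       ( 0.5, 0.40, "critical"),
    }

    total = 0.0
    for name, (f_occ, p_det, sev) in modes.items():
        f_undet = f_occ * (1 - p_det)
        total += f_undet
        print(f"{name:20s} {f_undet:5.2f} undetected/1000 runs ({sev})")
    print(f"full procedure: {total:.2f} undetected failures/1000 runs")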
Fault tree applications within the safety program of Idaho Nuclear Corporation
NASA Technical Reports Server (NTRS)
Vesely, W. E.
1971-01-01
Computerized fault tree analyses are used to obtain both qualitative and quantitative information about the safety and reliability of an electrical control system that shuts the reactor down when certain safety criteria are exceeded, in the design of a nuclear plant protection system, and in an investigation of a backup emergency system for reactor shutdown. The fault tree yields the modes by which the system failure or accident will occur, the most critical failure- or accident-causing areas, detailed failure probabilities, and the response of safety or reliability to design modifications and maintenance schemes.
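A minimal Python sketch of the quantitative side of such an analysis follows: evaluating a small fault tree of independent basic events. The tree layout and probabilities are hypothetical, not those of the systems studied.

    # Minimal sketch: OR gates combine as 1 - prod(1 - p); AND gates as
    # prod(p), assuming independent basic events.
    from math import prod

    def evaluate(node):
        kind = node[0]
        if kind == "basic":
            return node[1]
        children = [evaluate(c) for c in node[1]]
        if kind == "and":
            return prod(children)
        if kind == "or":
            return 1 - prod(1 - p for p in children)
        raise ValueError(kind)

    # Miniature shutdown example: the top event needs both the primary
    # AND backup channels to fail; each channel fails if its sensor OR
    # its logic fails.
    tree = ("and", [
        ("or", [("basic", 1e-3), ("basic", 5e-4)]),  # primary channel
        ("or", [("basic", 2e-3), ("basic", 1e-3)]),  # backup channel
    ])
    print(f"top event probability: {evaluate(tree):.2e}")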
The VHCF experimental investigation of FV520B-I with surface roughness Ry
NASA Astrophysics Data System (ADS)
Wang, J. L.; Zhang, Y. L.; Ding, M. C.; Zhao, Q. C.
2018-05-01
Different surface roughness measures (Ra and Ry) have different effects on VHCF failure and life. Ra is widely employed as the quantitative expression of surface roughness, but there are few fatigue failure mechanism analyses and experimental studies based on surface roughness Ry. The VHCF experiment was conducted using specimens with different surface roughness values. The surface roughness Ry is employed as the major research object to investigate the relationship and distribution tendency among Ry, fatigue life, and the distance between internal inclusions and the surface, and a new VHCF failure characteristic is proposed.
Fatigue crack growth in an aluminum alloy-fractographic study
NASA Astrophysics Data System (ADS)
Salam, I.; Muhammad, W.; Ejaz, N.
2016-08-01
A two-fold approach was adopted to understand the fatigue crack growth process in an aluminum alloy: fatigue crack growth testing of samples and analysis of the fractured surfaces. Fatigue crack growth tests were conducted on middle tension M(T) samples prepared from an aluminum alloy cylinder. The tests were conducted under constant amplitude loading at an R ratio of 0.1. The applied stress was 20, 30, and 40 percent of the yield stress of the material. The fatigue crack growth data were recorded. After fatigue testing, the samples were subjected to detailed scanning electron microscopic (SEM) analysis. The resulting fracture surfaces were subjected to qualitative and quantitative fractographic examinations. Quantitative fracture analysis included an estimation of the crack growth rate (CGR) in different regions. The effect of microstructural features on fatigue crack growth was examined. It was observed that in stage II (the crack growth region), the failure mode changes from intergranular to transgranular as the stress level increases. In the region of intergranular failure, localized brittle failure was observed and fatigue striations were difficult to reveal. However, in the region of transgranular failure the crack path is independent of the microstructural features. In this region, a localized ductile failure mode was observed and well-defined fatigue striations were present in the wake of the fatigue crack. The effect of the interaction of the growing fatigue crack with microstructural features was not substantial. The final fracture (stage III) was ductile in all cases.
PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study
NASA Astrophysics Data System (ADS)
Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.
2009-12-01
The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented in which we assess the risks associated with the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that allows the system to be decomposed into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g. the cover of the piles) and the failure of the geological barrier (e.g. the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability databases.
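A minimal Python sketch of the cut-set quantification follows, using the rare-event approximation (the top-event probability is approximately the sum, over minimal cut sets, of the product of their event probabilities); the events and probabilities are hypothetical stand-ins for the barrier failures discussed above.

    # Minimal sketch: P(top) ~ sum over minimal cut sets of prod(p_i).
    from math import prod

    p = {  # hypothetical annual event probabilities
        "cover_failure":      1e-2,  # engineered pile cover fails
        "clay_layer_breach":  5e-3,  # geological barrier fails
        "stream_overtopping": 2e-2,  # flood reaches the piles
    }

    minimal_cut_sets = [
        {"cover_failure", "clay_layer_breach"},   # karst aquifer path
        {"cover_failure", "stream_overtopping"},  # surface stream path
    ]

    p_top = sum(prod(p[e] for e in cut) for cut in minimal_cut_sets)
    print(f"rare-event approximation of P(failure): {p_top:.2e}")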
NASA Astrophysics Data System (ADS)
Yang, Sheng-Qi; Tian, Wen-Ling; Ranjith, P. G.
2017-11-01
The deformation failure characteristics of marble subjected to triaxial cyclic loading are significant when evaluating the stability and safety of deep excavation damage zones. To date, however, there have been notably few triaxial experimental studies on marble under triaxial cyclic loading. Therefore, in this research, a series of triaxial cyclic tests was conducted to analyze the mechanical damage characteristics of a marble. The post-peak deformation of the marble changed gradually from strain softening to strain hardening as the confining pressure increased from 0 to 10 MPa. Under uniaxial compression, marble specimens showed brittle failure characteristics with a number of axial splitting tensile cracks; in the range of σ3 = 2.5-7.5 MPa, the marble specimens exhibited single shear fracture characteristics with large fracture angles of about 65°. However, at σ3 = 10 MPa, the marble specimens showed no obvious shear fracture surfaces. The triaxial cyclic experimental results indicate that in the range of the tested confining pressures, the triaxial strengths of the marble specimens under cyclic loading were approximately equal to those under monotonic loading. With the increase in cycle number, the elastic strains of the marble specimens all increased at first and later decreased, achieving maximum values, but the plastic strains of the marble specimens increased nonlinearly. To evaluate quantitatively the damage extent of the marble under triaxial cyclic loading, a damage variable is defined according to the irreversible deformation for each cycle. The evolution of the elastic modulus for the marble was characterized by four stages: material strengthening, material degradation, material failure and structure slippage. Based on the experimental results of the marble specimens under complex cyclic loading, the cohesion of the marble decreased linearly, but the internal friction angle did not depend on the damage extent. To describe the peak strength characteristics of the marble specimens under complex cyclic loadings with various deformation positions, a revised strength criterion for damaged rocks is offered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lah, J; Manger, R; Kim, G
Purpose: To examine the ability of traditional failure mode and effects analysis (FMEA) and a light version of Healthcare FMEA (HFMEA), called Scenario Analysis of FMEA (SAFER), by comparing their outputs in terms of the risks identified and their severity rankings. Methods: We applied two prospective quality-management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation are based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity and detectability. The SAFER approach utilized two indices, frequency and severity, which were defined by a multidisciplinary team. A criticality matrix was divided into four categories: very low, low, high and very high. For high-risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. Results: The two methods were independently compared to determine whether the results and rated risks matched. Our results showed an agreement of 67% between the FMEA and SAFER approaches for the 15 riskiest SIG-specific failure modes. The main differences between the two approaches were in the distribution of the values, and the failure modes (No. 52, 54, 154) that have high SAFER scores do not necessarily have high FMEA RPN scores. In our results, there were additional risks identified by both methods with little correspondence. In SAFER, once the risk score is determined, the underlying decision tree or failure mode should be investigated further. Conclusion: The FMEA method takes into account the probability that an error passes without being detected. SAFER is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of risks and mitigation measures, and thus is readily applicable to the clinical parts of radiotherapy.
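A minimal sketch contrasting the two scoring schemes described above; the index scales and criticality cut-offs are assumptions for illustration, not the study's calibration:

```python
# Traditional FMEA RPN vs. a two-index SAFER-style criticality matrix.

def rpn(occurrence, severity, detectability):
    """Traditional FMEA risk priority number (each index on a 1-10 scale)."""
    return occurrence * severity * detectability

def safer_category(frequency, severity):
    """Two-index criticality matrix with four bands (assumed thresholds)."""
    score = frequency * severity          # each index on a 1-4 scale
    if score <= 2:
        return "very low"
    if score <= 6:
        return "low"
    if score <= 10:
        return "high"
    return "very high"

# One hypothetical failure mode scored both ways
print(rpn(occurrence=4, severity=8, detectability=6))   # -> 192
print(safer_category(frequency=3, severity=4))          # -> 'very high'
```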
Failure rates of mini-implants placed in the infrazygomatic region.
Uribe, Flavio; Mehr, Rana; Mathur, Ajay; Janakiraman, Nandakumar; Allareddy, Veerasathpurush
2015-01-01
The purpose of this pilot study was to evaluate the failure rates of mini-implants placed in the infrazygomatic region and to evaluate factors that affect their stability. A retrospective cohort of 30 consecutive patients (55 mini-implants) who received infrazygomatic mini-implants at a university clinic was evaluated for failure rates. Patient, mini-implant, orthodontic, surgical, and mini-implant maintenance factors were evaluated by univariate logistic regression models for association with failure rates. A 21.8% failure rate of mini-implants placed in the infrazygomatic region was observed. None of the predictor variables was significantly associated with higher or lower odds of implant failure. Failure rates for infrazygomatic mini-implants were slightly higher than those reported for other maxilla-mandibular osseous locations. No predictor variables were found to be associated with failure rates.
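A minimal sketch of the univariate logistic screening described above, using statsmodels on fabricated data; the predictor name and all values are hypothetical:

```python
# Univariate logistic regression of implant failure on one candidate factor.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
cortical_thickness = rng.normal(1.2, 0.3, size=55)     # hypothetical predictor
failed = (rng.random(55) < 0.218).astype(int)          # ~21.8% failure rate

X = sm.add_constant(cortical_thickness)
model = sm.Logit(failed, X).fit(disp=False)

print(model.params)              # intercept and log-odds per unit predictor
print(model.pvalues)             # association test for the predictor
print(np.exp(model.params[1]))   # odds ratio for the predictor
```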
Ran, Xun; Zhao, Jian-Xun; Nie, Hu; Chen, Yu-Cheng
2016-11-01
To investigate the effect of fluoxetine on neurite outgrowth inhibitor (Nogo) expression and collagen production in cardiac tissue of rats with right heart failure and pulmonary hypertension. Thirty-one male SD rats were randomly divided into a treatment group, a right heart failure group and a normal control group. The rats in the treatment group and the right heart failure group received an intraperitoneal injection of monocrotaline (MCT, 60 mg/kg) to induce pulmonary hypertension and right heart failure. After 21 days, the rats in the treatment group were given fluoxetine at 10 mg/(kg·d) by gavage for 21 days; the rats in the other two groups were given saline. HE staining was used to observe the pulmonary artery and right ventricular myocardial tissue. Collagen formation in the right ventricular myocardium was observed by Masson staining. The expressions of Nogo-A, Nogo-B, type 1 collagen and type 3 collagen mRNA in myocardium were measured by real-time fluorescence quantitative PCR, while Nogo protein levels were measured semi-quantitatively by Western blot. After the fluoxetine intervention, pulmonary artery stenosis was significantly reduced, myocardial tissue lesions decreased, and collagen synthesis in the right ventricular myocardium decreased. RT-PCR showed that Nogo-A mRNA decreased and Nogo-B mRNA increased (P < 0.05). Western blot showed that Nogo-A protein expression decreased while Nogo-B1 protein expression increased (P < 0.05); Nogo-B2 expression was not significantly changed (P > 0.05). Nogo may affect collagen synthesis in right heart failure and may be partly involved in myocardial fibrosis.
Deng, Bo; Wang, Jin Xin; Hu, Xing Xing; Duan, Peng; Wang, Lin; Li, Yang; Zhu, Qing Lei
2017-08-01
The aim of this study was to determine whether Nkx2.5 transfection of transplanted bone marrow mesenchymal stem cells (MSCs) improves the efficacy of treatment of adriamycin-induced heart failure in a rat model. Nkx2.5 was transfected into MSCs by lentiviral vector transduction. The expression of Nkx2.5 and cardiac-specific genes in MSCs and Nkx2.5-transfected mesenchymal stem cells (MSCs-Nkx2.5) was analyzed with quantitative real-time PCR and Western blot in vitro. Heart failure was induced in rats with adriamycin; the rats were then randomly divided into 3 groups that were injected with saline, MSCs or MSCs-Nkx2.5 via the femoral vein, respectively. Four weeks after injection, cardiac function, cardiac-specific gene expression, fibrosis formation and collagen volume fraction in the myocardium, as well as the expressions of GATA4 and MEF2, were analyzed with echocardiography, immunohistochemistry, Masson staining, quantitative real-time PCR and Western blot, respectively. Nkx2.5 enhanced cardiac-specific gene expression, including α-MHC, TNI, CKMB and connexin-43, in MSCs-Nkx2.5 in vitro. Both MSCs and MSCs-Nkx2.5 improved cardiac function, promoted the differentiation of transplanted MSCs into cardiomyocyte-like cells, decreased fibrosis formation and collagen volume fraction in the myocardium, and increased the expressions of GATA4 and MEF2 in adriamycin-induced rat heart failure models. Moreover, the effect was much more remarkable in the MSCs-Nkx2.5 group than in the MSCs group. This study found that Nkx2.5 enhances the efficacy of MSC transplantation in treating adriamycin-induced heart failure in rats. Nkx2.5 transfection of transplanted MSCs thus provides a potentially effective approach to heart failure. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Steurer, W. H.
1980-01-01
A survey of all presently defined or proposed large space systems indicated an ever-increasing demand for flexible components and materials, primarily as a result of the widening disparity between the stowage space of launch vehicles and the size of advanced systems. Typical flexible components and material requirements were identified on the basis of recurrence and/or functional commonality. This was followed by the evaluation of candidate materials and the search for material capabilities that promise to satisfy the postulated requirements. Particular attention was paid to thin films and to the requirements of deployable antennas. The assessment of the performance of specific materials was based primarily on the failure mode, derived from a detailed failure analysis. In view of extensive ongoing work on thermal and environmental degradation effects, prime emphasis was placed on assessing the performance loss from meteoroid damage. Quantitative data were generated for tension members and antenna reflector materials. A methodology was developed for representing overall materials performance as related to system service life. A number of promising new concepts for flexible materials were identified.
Quantifying Pilot Contribution to Flight Safety during Hydraulic Systems Failure
NASA Technical Reports Server (NTRS)
Kramer, Lynda J.; Etherington, Timothy J.; Bailey, Randall E.; Kennedy, Kellie D.
2017-01-01
Accident statistics cite the flight crew as a causal factor in over 60% of large transport aircraft fatal accidents. Yet, a well-trained and well-qualified pilot is acknowledged as the critical center point of aircraft systems safety and an integral safety component of the entire commercial aviation system. The latter statement, while generally accepted, cannot be verified because little or no quantitative data exists on how and how many accidents/incidents are averted by crew actions. A joint NASA/FAA high-fidelity motion-base human-in-the-loop test was conducted using a Level D certified Boeing 737-800 simulator to evaluate the pilot's contribution to safety-of-flight during routine air carrier flight operations and in response to aircraft system failures. To quantify the human's contribution, crew complement (two-crew, reduced crew, single pilot) was used as the independent variable in a between-subjects design. This paper details the crew's actions, including decision-making, and responses while dealing with a hydraulic systems leak - one of 6 total non-normal events that were simulated in this experiment.
[Reliability theory based on quality risk network analysis for Chinese medicine injection].
Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui
2014-08-01
A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. It thus transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influence and improve system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a user case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve product quality.
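A minimal sketch of the idea of ranking node criticality by its effect on overall risk, using a toy two-cause network with hand-coded conditional probabilities; the Shengmai network in the paper is far larger, and these numbers are invented:

```python
# Toy Bayesian network: two cause nodes A, B feeding one failure node.

def p_failure(p_a, p_b, cpt):
    """P(failure) for causes A, B with conditional table cpt[(a, b)]."""
    total = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
            total += p_ab * cpt[(a, b)]
    return total

# P(quality failure | A, B): e.g. A = sterilization fault, B = raw-material fault
cpt = {(0, 0): 0.001, (1, 0): 0.30, (0, 1): 0.20, (1, 1): 0.80}
p_a, p_b = 0.05, 0.02

baseline = p_failure(p_a, p_b, cpt)
# Node criticality: how much system risk drops if a cause is fully controlled
for name, fixed in [("A", p_failure(0.0, p_b, cpt)),
                    ("B", p_failure(p_a, 0.0, cpt))]:
    print(f"eliminating {name}: risk {baseline:.4f} -> {fixed:.4f}")
```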
An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.
1998-01-01
Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large-scale, highly complex systems with a varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and the limitations of the MSFC QRA approach and of QRA technology in general.
Cardiac magnetic resonance imaging in heart failure: where the alphabet begins!
Aljizeeri, Ahmed; Sulaiman, Abdulbaset; Alhulaimi, Naji; Alsaileek, Ahmed; Al-Mallah, Mouaz H
2017-07-01
Cardiac Magnetic Resonance Imaging has become a cornerstone in the evaluation of heart failure. It provides a comprehensive evaluation by answering all the pertinent clinical questions across the full pathological spectrum of heart failure. Nowadays, CMR is considered the gold standard in evaluation of ventricular volumes, wall motion and systolic function. Through its unique ability of tissue characterization, it provides incremental diagnostic and prognostic information and thus has emerged as a comprehensive imaging modality in heart failure. This review outlines the role of main conventional CMR sequences in the evaluation of heart failure and their impact in the management and prognosis.
A Bayesian paradigm for decision-making in proof-of-concept trials.
Pulkstenis, Erik; Patra, Kaushik; Zhang, Jianliang
2017-01-01
Decision-making is central to every phase of drug development, especially at the proof-of-concept stage, where risk and evidence must be weighed carefully, often in the presence of significant uncertainty. The decision whether to proceed to large, expensive Phase 3 trials has significant implications for patients and sponsors alike. Recent experience has shown that Phase 3 failure rates remain high. We present a flexible Bayesian quantitative decision-making paradigm that evaluates evidence relative to achieving a multilevel target product profile. A framework for operating characteristics is provided that allows the drug developer to design a proof-of-concept trial in light of its ability to support decision-making rather than merely achieve statistical significance. The operating characteristics are shown to be superior to those of traditional p-value-based methods. In addition, sample size considerations, application to interim futility analysis, and the incorporation of prior historical information are discussed.
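A minimal sketch of a Bayesian go/no-go rule against a two-level target product profile, assuming a binary endpoint with a Beta-Binomial model; the prior, thresholds, and data are illustrative rather than the paper's:

```python
# Posterior probabilities of exceeding two target-product-profile levels.
from scipy.stats import beta

# Prior Beta(1, 1); observed 16 responders out of 40 in the PoC trial
a, b = 1 + 16, 1 + 40 - 16

base_tpp = 0.25      # minimally acceptable response rate (assumed)
target_tpp = 0.40    # desired, differentiated response rate (assumed)

p_above_base = beta.sf(base_tpp, a, b)       # P(rate > base | data)
p_above_target = beta.sf(target_tpp, a, b)   # P(rate > target | data)

# Example decision rule: GO if strongly above base and plausibly at target
go = p_above_base > 0.90 and p_above_target > 0.30
print(f"P(>base) = {p_above_base:.2f}, "
      f"P(>target) = {p_above_target:.2f}, GO = {go}")
```

Simulating this rule over a grid of true response rates yields the operating characteristics the abstract refers to, in place of a single p-value.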
In-flight demonstration of a Real-Time Flush Airdata Sensing (RT-FADS) system
NASA Technical Reports Server (NTRS)
Whitmore, Stephen A.; Davis, Roy J.; Fife, John Michael
1995-01-01
A prototype real-time flush airdata sensing (RT-FADS) system has been developed and flight tested at the NASA Dryden Flight Research Center. This system uses a matrix of pressure orifices on the vehicle nose to estimate airdata parameters in real time using nonlinear regression. The algorithm is robust to sensor failures and noise in the measured pressures. The RT-FADS system has been calibrated using inertial trajectory measurements that were bootstrapped for atmospheric conditions using meteorological data. Mach numbers as high as 1.6 and angles of attack greater than 45 deg have been tested. The system performance has been evaluated by comparing the RT-FADS to the ship system airdata computer measurements to give a quantitative evaluation relative to an accepted measurement standard. Nominal agreements of approximately 0.003 in Mach number and 0.20 deg in angle of attack and angle of sideslip have been achieved.
The book availability study as an objective measure of performance in a health sciences library.
Kolner, S J; Welch, E C
1985-01-01
In its search for an objective overall diagnostic evaluation, the University of Illinois Library of the Health Sciences' Program Evaluation Committee selected a book availability measure; it is easy to administer and repeat, results are reproducible, and comparable data exist for other academic and health sciences libraries. The study followed the standard methodology in the literature with minor modifications. Patrons searching for particular books were asked to record item(s) needed and the outcome of the search. Library staff members then determined the reasons for failures in obtaining desired items. The results of the study are five performance scores. The first four represent the percentage probability of a library's operating with ideal effectiveness; the last provides an overall performance score. The scores of the Library of the Health Sciences demonstrated no unusual availability problems. The study was easy to implement and provided meaningful, quantitative, and objective data. PMID:3995202
Environment assisted degradation mechanisms in advanced light metals
NASA Technical Reports Server (NTRS)
Gangloff, R. P.; Stoner, G. E.; Swanson, R. E.
1989-01-01
A multifaceted research program on the performance of advanced light metallic alloys in aggressive aerospace environments, and associated environmental failure mechanisms was initiated. The general goal is to characterize alloy behavior quantitatively and to develop predictive mechanisms for environmental failure modes. Successes in this regard will provide the basis for metallurgical optimization of alloy performance, for chemical control of aggressive environments, and for engineering life prediction with damage tolerance and long term reliability.
Evaluation of a Linear Cumulative Damage Failure Model for Epoxy Adhesive
NASA Technical Reports Server (NTRS)
Richardson, David E.; Batista-Rodriquez, Alicia; Macon, David; Totman, Peter; McCool, Alex (Technical Monitor)
2001-01-01
Recently, a significant amount of work has been conducted to provide more complex and accurate material models for use in the evaluation of adhesive bondlines. Some of this has been prompted by recent studies into the effects of residual stresses on the integrity of bondlines. Several techniques have been developed for the analysis of bondline residual stresses. Key to these analyses is the criterion used for predicting failure. Residual stress loading of an adhesive bondline can occur over the life of the component; for many bonded systems, this can be several years. It is impractical to directly characterize failure of adhesive bondlines under a constant load for several years, so alternative approaches for predicting bondline failure are required. In the past, cumulative damage failure models have been developed, ranging from very simple to very complex. This paper documents the generation and evaluation of some of the simplest linear damage-accumulation tensile failure models for an epoxy adhesive. It shows how several variations on the failure model were generated and presents an evaluation of the accuracy of these failure models in predicting creep failure of the adhesive, showing that a simple failure model can be generated from short-term failure data for accurate predictions of long-term adhesive performance.
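A minimal sketch of a linear damage-accumulation (Miner-type) creep-failure prediction, assuming a power-law fit of time-to-failure versus stress from short-term tests; the constants and stress history are invented, not the epoxy data of the paper:

```python
# Linear cumulative damage: failure when summed (time / time-to-failure) >= 1.
def time_to_failure(stress_mpa, A=1e10, n=4.0):
    """Assumed short-term calibration: t_f = A * stress**(-n), in hours."""
    return A * stress_mpa ** (-n)

def accumulated_damage(stress_history):
    """stress_history: list of (stress_MPa, duration_hours) segments."""
    return sum(dt / time_to_failure(s) for s, dt in stress_history)

# Residual bondline stress relaxing over the component's life (assumed)
history = [(30.0, 1000), (25.0, 5000), (20.0, 20000)]
D = accumulated_damage(history)
print(f"damage = {D:.2f}  ->  {'failed' if D >= 1.0 else 'intact'}")
```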
Accelerated life assessment of coating on the radar structure components in coastal environment.
Liu, Zhe; Ming, ZhiMao
2016-07-04
This paper aimed to build an accelerated life test scheme and carry out a quantitative analysis between accelerated life testing in the laboratory and actual service for a coating composed of epoxy primer and polyurethane paint on structural components of a radar serving in the coastal environment of the South China Sea. The accelerated life test scheme was built based on the service environment and failure analysis of the coating. The quantitative analysis between accelerated life testing and actual service was conducted by comparing the gloss loss, discoloration, chalking, blistering, cracking and electrochemical impedance spectroscopy of the coating. The main factors leading to coating failure were ultraviolet radiation, temperature, moisture, salt fog and loads; accordingly, the accelerated life test included ultraviolet radiation, damp heat, thermal shock, fatigue and salt spray. It was established that one cycle of the accelerated life test was equal to one year of actual service. This provides a precise way to predict the actual service life of newly developed coatings for the manufacturer.
Subbaraman, Ramnath; Nolan, Laura; Sawant, Kiran; Shitole, Shrutika; Shitole, Tejal; Nanarkar, Mahesh; Patil-Deshmukh, Anita; Bloom, David E
2015-01-01
A focus on bacterial contamination has limited many studies of water service delivery in slums, with diarrheal illness being the presumed outcome of interest. We conducted a mixed methods study in a slum of 12,000 people in Mumbai, India to measure deficiencies in a broader array of water service delivery indicators and their adverse life impacts on the slum's residents. Six focus group discussions and 40 individual qualitative interviews were conducted using purposeful sampling. Quantitative data on water indicators-quantity, access, price, reliability, and equity-were collected via a structured survey of 521 households selected using population-based random sampling. In addition to negatively affecting health, the qualitative findings reveal that water service delivery failures have a constellation of other adverse life impacts-on household economy, employment, education, quality of life, social cohesion, and people's sense of political inclusion. In a multivariate logistic regression analysis, price of water is the factor most strongly associated with use of inadequate water quantity (≤20 liters per capita per day). Water service delivery failures and their adverse impacts vary based on whether households fetch water or have informal water vendors deliver it to their homes. Deficiencies in water service delivery are associated with many non-health-related adverse impacts on slum households. Failure to evaluate non-health outcomes may underestimate the deprivation resulting from inadequate water service delivery. Based on these findings, we outline a multidimensional definition of household "water poverty" that encourages policymakers and researchers to look beyond evaluation of water quality and health. Use of multidimensional water metrics by governments, slum communities, and researchers may help to ensure that water supplies are designed to advance a broad array of health, economic, and social outcomes for the urban poor.
Balinda, Sheila N; Ondoa, Pascale; Obuku, Ekwaro A; Kliphuis, Aletta; Egau, Isaac; Bronze, Michelle; Kasambula, Lordwin; Schuurman, Rob; Spieker, Nicole; Rinke de Wit, Tobias F; Kityo, Cissy
2016-01-01
WHO recommends regular viral load (VL) monitoring of patients on antiretroviral therapy (ART) for timely detection of virological failure, prevention of acquired HIV drug resistance (HIVDR) and avoidance of unnecessary switching to second-line ART. However, the cost and complexity of routine VL testing remain prohibitive in most resource-limited settings (RLS). We evaluated a simple, low-cost, qualitative viral-failure assay (VFA) on dried blood spots (DBS) in three clinical settings in Uganda. We conducted a cross-sectional diagnostic accuracy study in three HIV/AIDS treatment centres at the Joint Clinical Research Centre in Uganda. The VFA employs semi-quantitative detection of HIV-1 RNA amplified from the LTR gene. We used paired DBS and plasma with the COBAS AmpliPrep/COBAS TaqMan, Roche version 2 (VLref) as the reference assay. We used the VFA at two viral load thresholds (>5,000 or >1,000 copies/ml). 496 paired VFA and VLref results were available for comparative analysis. Overall, the VFA demonstrated 78.4% sensitivity (95% CI: 69.7%-87.1%), 93% specificity (95% CI: 89.7%-96.4%), 89.3% accuracy (95% CI: 85%-92%) and an agreement kappa of 0.72 compared with the VLref. The predictive values of positivity and negativity among patients on ART for >12 months were 72.7% and 99.3%, respectively. The VFA allowed correct classification of virological failure in 89% of cases. Only 11% of the patients were misclassified, with the potential of an unnecessary or late switch to second-line ART. Our findings present an opportunity to roll out simple and affordable VL monitoring for HIV-1 treatment in RLS.
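A minimal sketch of the diagnostic-accuracy arithmetic behind the figures above; the 2x2 counts are a hypothetical reconstruction chosen only to be consistent with the reported rates, not the study's actual table:

```python
# Sensitivity, specificity, accuracy, and Cohen's kappa from a 2x2 table.
def accuracy_metrics(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / n
    # Cohen's kappa: observed vs. chance agreement
    p_o = acc
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_o - p_e) / (1 - p_e)
    return sens, spec, acc, kappa

# Hypothetical counts consistent with n=496, sens 78.4%, spec 93%, acc 89.3%
sens, spec, acc, kappa = accuracy_metrics(tp=98, fp=26, fn=27, tn=345)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} "
      f"accuracy={acc:.1%} kappa={kappa:.2f}")
```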
A quantitative model of honey bee colony population dynamics.
Khoury, David S; Myerscough, Mary R; Barron, Andrew B
2011-04-18
Since 2006 the rate of honey bee colony failure has increased significantly. As an aid to testing hypotheses for the causes of colony failure we have developed a compartment model of honey bee colony population dynamics to explore the impact of different death rates of forager bees on colony growth and development. The model predicts a critical threshold forager death rate beneath which colonies regulate a stable population size. If death rates are sustained higher than this threshold rapid population decline is predicted and colony failure is inevitable. The model also predicts that high forager death rates draw hive bees into the foraging population at much younger ages than normal, which acts to accelerate colony failure. The model suggests that colony failure can be understood in terms of observed principles of honey bee population dynamics, and provides a theoretical framework for experimental investigation of the problem.
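A minimal sketch of a hive-bee/forager compartment model in the spirit of the one described above; the eclosion and recruitment terms follow the published structure as commonly quoted, and the parameter values are assumptions to be treated as illustrative:

```python
# Two-compartment colony model: hive bees H and foragers F.
from scipy.integrate import solve_ivp

L, w = 2000.0, 27000.0     # max daily eclosion, brood-rearing saturation constant
alpha, sigma = 0.25, 0.75  # base recruitment rate, social-inhibition strength

def colony(t, y, m):
    H, F = y
    N = max(H + F, 1e-9)                  # guard against an empty colony
    recruit = alpha - sigma * F / N       # hive-to-forager transition rate
    eclosion = L * N / (w + N)            # new adult bees per day
    return [eclosion - recruit * H, recruit * H - m * F]

for m in (0.15, 0.40):  # forager death rates below/above the critical threshold
    sol = solve_ivp(colony, (0, 300), [12000.0, 4000.0], args=(m,), max_step=1.0)
    H_end, F_end = sol.y[:, -1]
    print(f"m = {m}: colony size after 300 days = {H_end + F_end:.0f}")
```

Sweeping the forager death rate m makes the threshold behavior visible: below it the population settles to a stable size, above it the colony declines toward failure.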
NASA Astrophysics Data System (ADS)
Ma, Yong; Qin, Jianfeng; Zhang, Xiangyu; Lin, Naiming; Huang, Xiaobo; Tang, Bin
2015-07-01
Using impact tests and finite element simulation, the failure behavior of the Mo-modified layer on pure Ti was investigated. In the impact tests, four loads of 100, 300, 500, and 700 N and 10^4 impacts were adopted. The three-dimensional residual impact dents were examined using an optical microscope (Olympus-DSX500i), indicating that the impact resistance of the Ti surface was improved. Two failure modes, cohesive failure and wearing, were elucidated by electron backscatter diffraction and energy-dispersive spectrometry performed in a field-emission scanning electron microscope. Through finite element forward analysis performed at a typical impact load of 300 N, the stress-strain distributions in the Mo-modified Ti were quantitatively determined. In addition, the failure behavior of the Mo-modified layer was determined and an idealized failure model was proposed for high-load impact, based on the experimental and finite element forward analysis results.
Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability
2015-07-01
12th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP12), Vancouver, Canada, July 12-15, 2015. Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability. Marwan M. Harajli, Graduate Student, Dept. of Civil and Environ... criterion is usually the failure probability. In this paper, we examine the buffered failure probability as an attractive alternative to the failure...
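A minimal sketch of importance sampling for a small failure probability, assuming a scalar limit state g(x) <= 0 defines failure; note that the buffered failure probability studied in the paper is a related but distinct quantity:

```python
# Importance sampling of a rare failure event for a standard normal input.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
g = lambda x: 3.5 - x          # failure when x >= 3.5, with x ~ N(0, 1)

n = 10_000
proposal_mean = 3.5            # proposal centered near the failure region
x = rng.normal(proposal_mean, 1.0, n)
weights = norm.pdf(x) / norm.pdf(x, loc=proposal_mean)  # likelihood ratio

p_fail = np.mean((g(x) <= 0) * weights)
print(f"IS estimate = {p_fail:.3e}, exact = {norm.sf(3.5):.3e}")
```

Naive Monte Carlo with the same n would see only a couple of failures on average; shifting the sampling density and reweighting recovers the estimate at far lower variance.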
NASA Astrophysics Data System (ADS)
Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel
2015-03-01
While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
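A minimal sketch of the GLCM feature extraction step, using scikit-image (the gray* spellings assume version >= 0.19); the ROI here is random noise standing in for the BMD-converted trabecular region:

```python
# Second-order GLCM texture features from a quantized ROI.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(2)
roi = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)  # stand-in BMD values

glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                    levels=64, symmetric=True, normed=True)

for feature in ("correlation", "contrast", "homogeneity", "energy"):
    value = graycoprops(glcm, feature).mean()   # average over the two offsets
    print(f"{feature}: {value:.3f}")
```

In the study's pipeline, such feature vectors (plus mean BMD) would feed the GRBF regressor that predicts failure load.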
Improving Attachments of Non-Invasive (Type III) Electronic Data Loggers to Cetaceans
2015-09-30
animals in human care will be performed to test and validate this approach. The cadaver trials will enable controlled testing to failure or with both... quantitative metrics and analysis tools to assess the impact of a tag on the animal. Here we will present: 1) the characterization of the mechanical... fine-scale motion analysis for swimming animals. 2 APPROACH Our approach is divided into four subtasks: Task 1: Forces and failure modes
Acoustic emission spectral analysis of fiber composite failure mechanisms
NASA Technical Reports Server (NTRS)
Egan, D. M.; Williams, J. H., Jr.
1978-01-01
The acoustic emission of graphite fiber polyimide composite failure mechanisms was investigated with emphasis on frequency spectrum analysis. Although visual examination of spectral densities could not distinguish among fracture sources, a paired-sample t statistical analysis of mean normalized spectral densities did provide quantitative discrimination among acoustic emissions from 10 deg, 90 deg, and [±45]s specimens. Comparable discrimination was not obtained for 0 deg specimens.
The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks
NASA Technical Reports Server (NTRS)
Hamlin, Teri L.
2010-01-01
HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failure in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedural, equipment design and need for automation.
Shirshin, Evgeny A; Gurfinkel, Yury I; Matskeplishvili, Simon T; Sasonko, Maria L; Omelyanenko, Nikolai P; Yakimov, Boris P; Lademann, Juergen; Darvin, Maxim E
2018-05-29
Heart failure (HF) is among the socially significant diseases, affecting over 2% of the adult population in developed countries. Diagnosis of HF severity remains complicated due to the absence of specific symptoms and objective criteria. Here we present an indicator of HF severity based on imaging the tissue parameters around the nailfold capillaries. High-resolution nailfold video capillaroscopy was performed to determine the perivascular zone (PZ) size around nailfold capillaries, and two-photon tomography with fluorescence lifetime imaging was used to investigate PZ composition. We found that the size of the PZ around the nailfold capillaries strongly correlates with heart failure severity. Further investigations using two-photon tomography demonstrated that the PZ corresponds to the border of viable epidermis, and it was suggested that the PZ size variations were due to different amounts of interstitial fluid, which potentially further translates into clinically significant oedema. The obtained results allow for the development of a quantitative indicator of oedematous syndrome, which can be used in various applications to monitor the dynamics of interstitial fluid retention. We therefore suggest PZ size measured with nailfold video capillaroscopy as a novel quantitative, sensitive, non-invasive marker of heart failure severity.
Comprehensive Understanding of the Zipingpu Reservoir to the Ms8.0 Wenchuan Earthquake
NASA Astrophysics Data System (ADS)
Cheng, H.; Pang, Y. J.; Zhang, H.; Shi, Y.
2014-12-01
After the Wenchuan earthquake occurred, the question of whether the storage of the Zipingpu Reservoir triggered the big earthquake attracted wide attention in the international academic community. In addition to qualitative discussion, many scholars have adopted quantitative analysis methods to calculate the stress changes, but because their results differ, they draw very different conclusions. Here, we take the dispute among different teams over the quantitative calculation for the Zipingpu Reservoir as a starting point. In order to find the key factors influencing the quantitative calculation and to understand the sources of uncertainty in the numerical simulation, we analyze the factors that may cause the differences. The preliminary results show that the calculation method (analytical or numerical), the dimension of the model (2-D or 3-D), the diffusion model, the diffusion coefficient and the focal mechanism are the main factors responsible for the differences, especially the diffusion coefficient of the fractured rock mass. The change of Coulomb failure stress at the epicenter of the Wenchuan earthquake obtained from the 2-D model is about 3 times that of the 3-D model. It is not reasonable to consider only the fault permeability (assuming the permeability of the rock mass as infinite) or only a homogeneous isotropic rock-mass permeability (ignoring the fault permeability). Different focal mechanisms can also dramatically affect the change of Coulomb failure stress at the epicenter, with differences reaching 2-7 times, and the differences in the change of Coulomb failure stress can reach several hundred times when different diffusion coefficients are selected. Given existing research indicating that the magnitude of the Coulomb failure stress change is about several kPa, we cannot rule out the possibility that the Zipingpu Reservoir triggered the 2008 Wenchuan earthquake. However, since the background stress is not clear and the Coulomb failure stress change is small, we are also not sure that there must be a connection between the reservoir and the earthquake. In future work, we should build on field surveys and laboratory experiments, improve the model, and develop high-performance simulations.
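A minimal sketch of the Coulomb failure stress change at the center of this debate, assuming the standard form with pore-pressure and friction terms; the input values are illustrative only, not results for Zipingpu:

```python
# Coulomb failure stress change resolved on a receiver fault.
def delta_cfs(d_shear_kpa, d_normal_kpa, d_pore_kpa, mu=0.6):
    """Delta CFS = d_tau + mu * (d_sigma_n + d_p); positive promotes failure.

    d_shear_kpa:  shear stress change in the slip direction
    d_normal_kpa: normal stress change (tension positive)
    d_pore_kpa:   pore-pressure change from reservoir diffusion
    mu:           effective friction coefficient (assumed)
    """
    return d_shear_kpa + mu * (d_normal_kpa + d_pore_kpa)

# Elastic load of the reservoir vs. added pore pressure after diffusion
print(delta_cfs(d_shear_kpa=1.2, d_normal_kpa=-0.8, d_pore_kpa=2.5))  # ~2.2 kPa
```

The abstract's point maps directly onto these inputs: the pore term depends on the assumed diffusion coefficient, and the shear and normal terms depend on model dimension and the receiver focal mechanism.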
Health management system for rocket engines
NASA Technical Reports Server (NTRS)
Nemeth, Edward
1990-01-01
The functional framework of a failure detection algorithm for the Space Shuttle Main Engine (SSME) is developed. The basic algorithm is based only on existing SSME measurements. Supplemental measurements, expected to enhance failure detection effectiveness, are identified. To support the algorithm development, a figure of merit is defined to estimate the likelihood of SSME criticality 1 failure modes and the failure modes are ranked in order of likelihood of occurrence. Nine classes of failure detection strategies are evaluated and promising features are extracted as the basis for the failure detection algorithm. The failure detection algorithm provides early warning capabilities for a wide variety of SSME failure modes. Preliminary algorithm evaluation, using data from three SSME failures representing three different failure types, demonstrated indications of imminent catastrophic failure well in advance of redline cutoff in all three cases.
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of abort triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of abort triggers.
NASA Astrophysics Data System (ADS)
Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.; Lerallut, Jean-Francois
2006-03-01
Pulmonary diseases such as bronchiectasis, asthma, and emphysema are characterized by abnormalities in airway dimensions. Multi-slice computed tomography (MSCT) has become one of the primary means to depict these abnormalities, as the availability of high-resolution near-isotropic data makes it possible to evaluate airways at oblique angles to the scanner plane. However, currently, clinical evaluation of airways is typically limited to subjective visual inspection only: systematic evaluation of the airways to take advantage of high-resolution data has not proved practical without automation. We present an automated method to quantitatively evaluate airway lumen diameter, wall thickness and broncho-arterial ratios. In addition, our method provides 3D visualization of these values, graphically illustrating the location and extent of disease. Our algorithm begins by automatic airway segmentation to extract paths to the distal airways, and to create a map of airway diameters. Normally, airway diameters decrease as paths progress distally; failure to taper indicates abnormal dilatation. Our approach monitors airway lumen diameters along each airway path in order to detect abnormal profiles, allowing even subtle degrees of pathologic dilatation to be identified. Our method also systematically computes the broncho-arterial ratio at every terminal branch of the tree model, as a ratio above 1 indicates potentially abnormal bronchial dilatation. Finally, the airway wall thickness is computed at corresponding locations. These measurements are used to highlight abnormal branches for closer inspection, and can be summed to compute a quantitative global score for the entire airway tree, allowing reproducible longitudinal assessment of disease severity. Preliminary tests on patients diagnosed with bronchiectasis demonstrated rapid identification of lack of tapering, which also was confirmed by corresponding demonstration of elevated broncho-arterial ratios.
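A minimal sketch of the two checks described above, flagging failure to taper along an airway path and an elevated broncho-arterial ratio; the diameters are invented for illustration:

```python
# Taper monitoring and broncho-arterial ratio checks along an airway path.
import numpy as np

def flag_dilatation(diameters_mm, tolerance=0.0):
    """Indices along a path where the lumen widens instead of tapering."""
    d = np.asarray(diameters_mm)
    return np.where(np.diff(d) > tolerance)[0] + 1

path = [8.0, 7.2, 6.5, 6.9, 5.8, 5.9]   # lumen diameters toward the periphery
print(flag_dilatation(path))             # -> [3 5]

def abnormal_ba(bronchus_mm, artery_mm):
    """Ratio above 1 suggests pathologic bronchial dilatation."""
    return bronchus_mm / artery_mm > 1.0

print(abnormal_ba(4.2, 3.6))             # -> True
```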
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-01
... Friday, except Federal holidays. The AD docket contains this proposed AD, the regulatory evaluation, any... three criteria address the failure types under evaluation: single failures, single failures in... evaluations included consideration of previous actions taken that may mitigate the need for further action...
Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure
ERIC Educational Resources Information Center
Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.
2014-01-01
Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently (e.g., project monitoring) or to the post-hoc assessment of program activities.…
Quantitative modeling of failure propagation in intelligent transportation systems.
DOT National Transportation Integrated Search
2014-08-01
Unmanned vehicles are projected to reach consumer use within this decade - related legislation has already passed in California. The : most significant technical challenge associated with these vehicles is their integration in transportation environm...
NASA Technical Reports Server (NTRS)
Bundick, W. T.
1985-01-01
The application of the failure detection filter to the detection and identification of aircraft control element failures was evaluated in a linear digital simulation of the longitudinal dynamics of a B-737 aircraft. Simulation results show that, with a simple correlator and threshold detector used to process the filter residuals, the failure detection performance is seriously degraded by the effects of turbulence.
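A minimal sketch of residual post-processing of this kind: correlate the filter residual against a known failure signature and require the correlation to persist before declaring a failure. The signature vector, noise level, and thresholds are assumptions, not the paper's design:

```python
# Correlator plus persistence-threshold detector on filter residuals.
import numpy as np

def detect(residuals, signature, threshold=0.95, persist=5):
    """First sample index where |cos| between the residual and the failure
    signature stays above the threshold for `persist` consecutive samples."""
    d = signature / np.linalg.norm(signature)
    run = 0
    for k, r in enumerate(residuals):
        c = abs(np.dot(r, d)) / (np.linalg.norm(r) + 1e-12)
        run = run + 1 if c > threshold else 0
        if run >= persist:
            return k - persist + 1
    return None

rng = np.random.default_rng(3)
signature = np.array([0.9, 0.1, 0.4])             # assumed failure direction
residuals = rng.normal(0.0, 0.05, size=(200, 3))  # turbulence-like residuals
residuals[120:] += 0.5 * signature                # failure onset at sample 120

print(detect(residuals, signature))               # -> near 120
```

Raising the turbulence level relative to the signature magnitude is exactly what erodes detection performance in the scenario the abstract describes.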
Brown, Dorothy Cimino; Bell, Margie; Rhodes, Linda
2013-12-01
To determine the optimal method for use of the Canine Brief Pain Inventory (CBPI) to quantitate responses of dogs with osteoarthritis to treatment with carprofen or placebo. 150 dogs with osteoarthritis. Data were analyzed from 2 studies with identical protocols in which owner-completed CBPIs were used. Treatment for each dog was classified as a success or failure by comparing the pain severity score (PSS) and pain interference score (PIS) on day 0 (baseline) with those on day 14. Treatment success or failure was defined on the basis of various combinations of reduction in the 2 scores when inclusion criteria were set as a PSS and PIS ≥ 1, 2, or 3 at baseline. Statistical analyses were performed to select the definition of treatment success that had the greatest statistical power to detect differences between carprofen and placebo treatments. Defining treatment success as a reduction of ≥ 1 in PSS and ≥ 2 in PIS in each dog had consistently robust power. Power was 62.8% in the population that included only dogs with baseline scores ≥ 2 and 64.7% in the population that included only dogs with baseline scores ≥ 3. The CBPI had robust statistical power to evaluate the treatment effect of carprofen in dogs with osteoarthritis when protocol success criteria were predefined as a reduction ≥ 1 in PIS and ≥ 2 in PSS. Results indicated the CBPI can be used as an outcome measure in clinical trials to evaluate new pain treatments when it is desirable to evaluate success in individual dogs rather than overall mean or median scores in a test population.
Savasoglu, Kaan; Payzin, Kadriye Bahriye; Ozdemirkiran, Fusun; Berber, Belgin
2015-08-01
To determine the utility of the quantitative real-time PCR (RQ-PCR) assay in the follow-up of chronic myeloid leukemia (CML) patients. Cross-sectional observational study. Izmir Ataturk Education and Research Hospital, Izmir, Turkey, from 2009 to 2013. Cytogenetic, FISH and RQ-PCR test results from the materials of 177 CML patients, collected between 2009 and 2013, were set up for comparison analysis. Statistical analysis was performed to compare the FISH, karyotype and RQ-PCR results of the patients. Karyotyping and FISH specificity and sensitivity rates were determined by ROC analysis against the RQ-PCR results. The chi-square test was used to compare test failure rates. Sensitivity and specificity values were 17.6% and 98% for karyotyping (p=0.118, p > 0.05) and 22.5% and 96% for FISH (p=0.064, p > 0.05), respectively. FISH sensitivity was slightly higher than that of karyotyping, and a strong correlation was found between the two (p < 0.001). The RQ-PCR test failure rate did not correlate with that of the other two tests (p > 0.05); however, the difference between the karyotyping and FISH test failure rates was statistically significant (p < 0.001). Apart from situations requiring karyotype analysis, the RQ-PCR assay can be used alone in the follow-up of CML.
Mode I Failure of Armor Ceramics: Experiments and Modeling
NASA Astrophysics Data System (ADS)
Meredith, Christopher; Leavy, Brian
2017-06-01
The pre-notched edge-on-impact (EOI) experiment is a technique for benchmarking the damage and fracture of ceramics subjected to projectile impact. A cylindrical projectile impacts the edge of a thin rectangular plate with a pre-notch on the opposite edge. Tension is generated at the notch tip, resulting in the initiation and propagation of a mode I crack back toward the impact edge. The crack can be quantitatively measured using an optical method called Digital Gradient Sensing, which measures the crack-tip deformation by simultaneously quantifying two orthogonal surface slopes via small deflections of light rays from a specularly reflective surface around the crack. The deflections in ceramics are small, so the high-speed camera needs a very high pixel count. This work reports results from pre-notched EOI experiments on SiC and B4C plates. The experimental data are quantitatively compared to impact simulations using an advanced continuum damage model: the Kayenta ceramic model in Alegra will be used to compare fracture propagation speeds, bifurcations, and inhomogeneous initiation of failure. This will provide insight into the driving mechanisms required for macroscale failure modeling of ceramics.
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing method to show the effectiveness of the proposed model.
Nayak, Moksha; Babshet, Medha
2016-01-01
Background: The prevalence of apical periodontitis in diabetes mellitus patients is high. The altered immunity in diabetes affects the healing of periapical tissue. Single-visit endodontic treatment has been shown to increase the periapical healing rate with better patient compliance. Hence the present study aims at evaluating the clinical and radiographic healing outcome of single-visit endodontic treatment in type 2 diabetes mellitus patients with periapical disease. Material and Methods: Eighty patients with periapical disease were divided into 2 groups of 40 each: Group I, control subjects, and Group II, type 2 diabetics. Glycosylated hemoglobin levels were assessed preoperatively and at follow-up intervals in diabetics. Pre-operative assessment of periapical status was done using CPDR (clinical periapical diagnosis of root), QLDR (qualitative radiographic diagnosis of tooth) and QTDR (quantitative radiographic diagnosis of tooth) criteria. Postoperative healing following single-visit endodontic treatment was evaluated by Strindberg criteria. Results: Group II subjects had chronic and exacerbating lesions that were significantly larger (p=0.029). Complete (100%) clinical healing in the diabetic group was seen within two months. Group II showed 85% success at one year on radiographic evaluation. Poorly controlled diabetics showed failure compared with those under fair and good control. Conclusions: Type 2 diabetics had chronic and larger lesions compared with control subjects. The periapical lesions in patients with poor diabetic control showed failure. The clinical and radiographic healing outcome of single-visit endodontic therapy was delayed in diabetic patients. Key words: Apical periodontitis, diabetes mellitus type 2, endodontics, periapical lesion, Strindberg criteria. PMID:27957260
DEPEND - A design environment for prediction and evaluation of system dependability
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.; Iyer, Ravishankar K.
1990-01-01
The development of DEPEND, an integrated simulation environment for the design and dependability analysis of fault-tolerant systems, is described. DEPEND models both hardware and software components at a functional level, and allows automatic failure injection to assess system performance and reliability. It relieves the user of the work needed to inject failures, maintain statistics, and output reports. The automatic failure injection scheme is geared toward evaluating a system under high stress (workload) conditions. The failures that are injected can affect both hardware and software components. To illustrate the capability of the simulator, a distributed system which employs a prediction-based, dynamic load-balancing heuristic is evaluated. Experiments were conducted to determine the impact of failures on system performance and to identify the failures to which the system is especially susceptible.
NASA Astrophysics Data System (ADS)
Mallineni, Jaya krishna
This study evaluates two photovoltaic (PV) power plants based on electrical performance measurements, diode checks, visual inspections and infrared scanning. The purpose of this study is to measure degradation rates of performance parameters (Pmax, Isc, Voc, Vmax, Imax and FF) and to identify the failure modes in a "hot-dry desert" climate, along with quantitative determination of safety failure rates and reliability failure rates. The data obtained from this study can be used by module manufacturers in determining the warranty limits of their modules, and by banks, investors, project developers and users in determining appropriate financing or decommissioning models. In addition, the data will be helpful in selecting appropriate accelerated stress tests that would replicate the field failures for new modules and predict their lifetime. The study was conducted at two single-axis tracking monocrystalline silicon (c-Si) power plants, Site 3 and Site 4c of Salt River Project (SRP). The Site 3 power plant is located in Glendale, Arizona, and the Site 4c power plant is located in Mesa, Arizona, both considered "hot-dry" field conditions. The Site 3 power plant has 2,352 modules (named Model-G) rated at 250 kW DC output. The mean and median degradation rates of these 12-year-old modules are 0.95%/year and 0.96%/year, respectively. The major cause of degradation found at Site 3 is high series resistance (potentially due to solder-bond thermo-mechanical fatigue), and the failure mode is ribbon-ribbon solder bond failure/breakage. The Site 4c power plant has 1,280 modules (named Model-H) providing 243 kW DC output. The mean and median degradation rates of these 4-year-old modules are 0.96%/year and 1%/year, respectively. At Site 4c, practically no module failures were observed. The average soiling loss is 6.9% at Site 3 and 5.5% at Site 4c; the difference in soiling level is attributed to the rural and urban surroundings of the two power plants.
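A minimal sketch of the degradation-rate arithmetic used above, relative to nameplate rating; the nameplate and measured powers are hypothetical values chosen only to land near the reported 0.95%/year:

```python
# Average annual power degradation relative to nameplate rating.
def degradation_rate(p_nameplate_w, p_measured_w, years):
    """Percent power loss per year, linearized over the fielded lifetime."""
    return (p_nameplate_w - p_measured_w) / p_nameplate_w / years * 100.0

# A 12-year-old module measured at roughly 88.6% of its nameplate rating
print(f"{degradation_rate(106.0, 93.9, 12):.2f} %/year")   # -> 0.95 %/year
```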
When should we use nitrates in congestive heart failure?
Vizzardi, Enrico; Bonadei, Ivano; Rovetta, Riccardo; D'Aloia, Antonio; Quinzani, Filippo; Curnis, Antonio; Dei Cas, Livio
2013-02-01
Organic nitrates remain among the oldest and most commonly employed drugs in cardiology. Although their use in acute and chronic heart failure is, in most cases, based on clinical practice, only a few clinical trials have been conducted to evaluate it, most of which compare nitrates with other drugs against differing endpoints. The purpose of this review is to examine the various trials that have evaluated the use of nitrates in acute and chronic heart failure. © 2012 Blackwell Publishing Ltd.
Quantifying Pilot Contribution to Flight Safety During an In-Flight Airspeed Failure
NASA Technical Reports Server (NTRS)
Etherington, Timothy J.; Kramer, Lynda J.; Bailey, Randall E.; Kennedy, Kellie D.
2017-01-01
Accident statistics cite the flight crew as a causal factor in over 60% of large transport fatal accidents. Yet a well-trained and well-qualified crew is acknowledged as the critical center point of aircraft systems safety and an integral component of the entire commercial aviation system. A human-in-the-loop test was conducted using a Level D certified Boeing 737-800 simulator to evaluate the pilot's contribution to safety-of-flight during routine air carrier flight operations and in response to system failures. To quantify the human's contribution, crew complement was used as an independent variable in a between-subjects design. This paper details the crew's actions and responses while dealing with an in-flight airspeed failure. Accident statistics often cite flight crew error (Baker, 2001) as the primary contributor in accidents and incidents in transport category aircraft. However, the Air Line Pilots Association (2011) suggests "a well-trained and well-qualified pilot is acknowledged as the critical center point of the aircraft systems safety and an integral safety component of the entire commercial aviation system." This is generally acknowledged but cannot be verified because little or no quantitative data exists on how or how many accidents/incidents are averted by crew actions. Anecdotal evidence suggests crews handle failures on a daily basis, and Aviation Safety Action Program data generally supports this assertion, even if the data is not released to the public. However, without hard evidence, the contribution and means by which pilots achieve safety of flight are difficult to define. Thus, ways to improve the human ability to contribute or overcome deficiencies are ill-defined.
Quantifying Pilot Contribution to Flight Safety during Drive Shaft Failure
NASA Technical Reports Server (NTRS)
Kramer, Lynda J.; Etherington, Tim; Last, Mary Carolyn; Bailey, Randall E.; Kennedy, Kellie D.
2017-01-01
Accident statistics cite the flight crew as a causal factor in over 60% of large transport aircraft fatal accidents. Yet, a well-trained and well-qualified pilot is acknowledged as the critical center point of aircraft systems safety and an integral safety component of the entire commercial aviation system. The latter statement, while generally accepted, cannot be verified because little or no quantitative data exists on how and how many accidents/incidents are averted by crew actions. A joint NASA/FAA high-fidelity motion-base simulation experiment specifically addressed this void by collecting data to quantify the human (pilot) contribution to safety-of-flight and the methods they use in today's National Airspace System. A human-in-the-loop test was conducted using the FAA's Oklahoma City Flight Simulation Branch Level D-certified B-737-800 simulator to evaluate the pilot's contribution to safety-of-flight during routine air carrier flight operations and in response to aircraft system failures. These data are fundamental to and critical for the design and development of future increasingly autonomous systems that can better support the human in the cockpit. Eighteen U.S. airline crews flew various normal and non-normal procedures over a two-day period and their actions were recorded in response to failures. To quantify the human's contribution to safety of flight, crew complement was used as the experiment independent variable in a between-subjects design. Pilot actions and performance during single pilot and reduced crew operations were measured for comparison against the normal two-crew complement during normal and non-normal situations. This paper details the crew's actions, including decision-making, and responses while dealing with a drive shaft failure - one of six non-normal events that were simulated in this experiment.
Nonlinear deformation and localized failure of bacterial streamers in creeping flows
Biswas, Ishita; Ghosh, Ranajay; Sadrzadeh, Mohtada; Kumar, Aloke
2016-01-01
We investigate the failure of bacterial floc mediated streamers in a microfluidic device in a creeping flow regime using both experimental observations and analytical modeling. The quantification of streamer deformation and failure behavior is possible due to the use of 200 nm fluorescent polystyrene beads which firmly embed in the extracellular polymeric substance (EPS) and act as tracers. The streamers, which form soon after the commencement of flow, begin to deviate from an apparently quiescent fully formed state in spite of steady background flow and limited mass accretion, indicating significant mechanical nonlinearity. This nonlinear behavior shows distinct phases of deformation with mutually different characteristic times and ends with a distinct localized failure of the streamer far from the walls. We investigate this deformation and failure behavior for two separate bacterial strains and develop a simplified but nonlinear analytical model describing the experimentally observed instability phenomena, assuming a necking route to instability. Our model leads to a power law relation between the critical strain at failure and the fluid velocity scale, exhibiting excellent qualitative and quantitative agreement with the experimentally observed rupture behavior. PMID:27558511
NASA Astrophysics Data System (ADS)
Weischedel, Herbert R.; Hoehle, Hans-Werner
1995-05-01
Stay cables of cable-stayed bridges have corrosion protection systems that can be elaborate. For example, such a system may consist simply of one or several coats of paint or, in more complex cases, of plastic pipes that are wrapped with tape and filled with grout. Frequently, these corrosion protection systems prevent visual inspections. Therefore, alternative nondestructive examination methods are called for. For example, modern dual-function electromagnetic (EM) instruments allow the simultaneous detection of external and internal localized flaws (such as external and internal broken wires and corrosion pitting) and the measurement of loss of metallic cross-sectional area (typically caused by external or internal corrosion or wear). Initially developed for mining and skiing applications, these instruments have been successfully used for the inspection of stays of cable-stayed bridges, and for the inspection of guys of smoke stacks, flare stacks, broadcast towers, suspended roofs, etc. As a rule, guys and bridge cables are not subjected to wear and bending stresses. However, their safety can be compromised by corrosion caused by the failure of corrosion protection systems. Furthermore, live loads and wind forces create intermittent tensile stresses that can cause fatigue breaks of wires. This paper discusses the use of dual-function EM instruments for the detection and nondestructive quantitative evaluation of cable deterioration and explains the underlying principles. Experiences with this method, together with field inspection results, are presented.
Reslan, Summar; Axelrod, Bradley N
2017-01-01
The purpose of the current study was to compare three potential profiles of the Medical Symptom Validity Test (MSVT; Pass, Genuine Memory Impairment Profile [GMIP], and Fail) on other freestanding and embedded performance validity tests (PVTs). Notably, a quantitatively computed version of the GMIP was utilized in this investigation. Data obtained from veterans referred for a neuropsychological evaluation in a metropolitan Veterans Affairs medical center were included (N = 494). Individuals aged 65 and older were excluded to keep individuals with dementia out of the investigation. The sample revealed 222 (45%) in the Pass group. Of the 272 who failed the easy subtests of the MSVT, 221 (81%) met quantitative criteria for the GMIP and 51 (19%) were classified as Fail. The Pass group failed fewer freestanding and embedded PVTs and obtained higher raw scores on all PVTs than both the GMIP and Fail groups. The differences in performance between the GMIP and Fail groups were minimal. Specifically, GMIP protocols failed fewer freestanding PVTs than the Fail group; failure on embedded PVTs did not differ between GMIP and Fail. The MSVT GMIP incorporates the presence of clinical correlates of disability to assist with this distinction, but future research should consider performances on other freestanding measures of performance validity to differentiate cognitive impairment from invalidity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, David Edward; Barber, John L.
2017-11-20
From quantum chemistry simulations using density functional theory, we obtain the total electronic energy of an eight-atom sulfur chain as its end-to-end distance is extended until S–S bond rupture occurs. We find that a sulfur chain can be extended by about 40% beyond its nominally straight conformation, where it experiences rupture at an end-to-end tension of about 1.5 nN. Using this rupture force as the chain failure limit in an explicit polymer network simulation model (EPnet), we predict the tensile failure stress for sulfur crosslinked (vulcanized) natural rubber. Furthermore, quantitative agreement with published experimental data for the failure stress is obtained in these simulations if we assume that only about 30% of the sulfur chains produce viable network crosslinks. Surprisingly, we also find that the failure stress of a rubber network does not scale linearly with the chain failure force limit.
Elasticity dominates strength and failure in metallic glasses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Z. Q.; Qu, R. T.; Zhang, Z. F., E-mail: zhfzhang@imr.ac.cn
2015-01-07
Two distinct deformation mechanisms of shearing and volume dilatation are quantitatively analyzed in metallic glasses (MGs) from fundamental thermodynamics. Their competition is deduced to intrinsically dominate the strength and failure behaviors of MGs. Both the intrinsic shear and normal strengths give rise to the critical mechanical energies to activate destabilization of amorphous structures, under pure shearing and volume dilatation, respectively, and can be determined in terms of elastic constants. By adopting an ellipse failure criterion, the strength and failure behaviors of MGs can be precisely described just according to their shear modulus and Poisson's ratio, without mechanical testing. Quantitative relations are established systematically and verified by experimental results. Accordingly, real-sense non-destructive failure prediction can be achieved in various MGs. By highlighting the broad key significance of elasticity, a "composition-elasticity-property" scheme is further outlined for better understanding and controlling the mechanical properties of MGs and other glassy materials from the elastic perspective.
NASA Astrophysics Data System (ADS)
Huda, Nizlel; Sutawidjaja, Akbar; Subanji; Rahardjo, Swasono
2018-04-01
Metacognitive activity is very important in mathematical problem solving. Metacognitive activity consists of metacognitive awareness, metacognitive evaluation and metacognitive regulation. This study aimed to reveal errors of metacognitive evaluation in students' metacognitive failure in solving mathematical problems. Twenty students were grouped into three groups: the first group comprised students who experienced one metacognitive failure, the second group students who experienced two metacognitive failures, and the third group students who experienced three metacognitive failures. One person was taken from each group as a research subject. The research data were collected from worksheets completed using think-aloud, followed by interviews with the research subjects based on the results of their work. The findings indicate that students who experienced metacognitive failure in solving mathematical problems tended to err in metacognitive evaluation when considering the effectiveness and limitations of their thinking and the effectiveness of their chosen solution strategy.
A Statistical Perspective on Highly Accelerated Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Edward V.
Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning the assumed relationship between the stress level and performance. In addition, this document contains recommendations for conducting more informative accelerated tests.
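The statistical crux of the "zero failures demonstrate reliability" claim can be made concrete with the classical success-run (binomial zero-failure) bound. A minimal sketch follows; note that it quantifies only the sampling uncertainty and says nothing about the stress-to-use extrapolation whose uncertainty the document emphasizes.

```python
def demonstrated_reliability(n_units, confidence=0.90):
    """Lower confidence bound on per-unit reliability when n_units are
    tested and none fail (success-run / binomial zero-failure formula)."""
    return (1.0 - confidence) ** (1.0 / n_units)

for n in (5, 10, 30):
    print(f"{n} units, 0 failures -> R >= {demonstrated_reliability(n):.3f} at 90% confidence")
```

Even thirty failure-free units demonstrate only about 93% reliability at 90% confidence, before any stress-extrapolation uncertainty is added.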
Cross-layer restoration with software defined networking based on IP over optical transport networks
NASA Astrophysics Data System (ADS)
Yang, Hui; Cheng, Lei; Deng, Junni; Zhao, Yongli; Zhang, Jie; Lee, Young
2015-10-01
The IP over optical transport network is a very promising networking architecture applied to the interconnection of geographically distributed data centers due to the performance guarantee of low delay, huge bandwidth and high reliability at a low cost. It can enable efficient resource utilization and support heterogeneous bandwidth demands in a highly available, cost-effective and energy-efficient manner. In the case of a cross-layer link failure, ensuring a high level of quality of service (QoS) for user requests after the failure has become a research focus. In this paper, we propose a novel cross-layer restoration scheme for data center services with software defined networking based on IP over optical network. The cross-layer restoration scheme can enable joint optimization of IP network and optical network resources, and enhance the data center service restoration responsiveness to the dynamic end-to-end service demands. We quantitatively evaluate the feasibility and performance through simulation under a heavy traffic load scenario in terms of path blocking probability and path restoration latency. Numerical results show that the cross-layer restoration scheme improves the recovery success rate and minimizes the overall recovery time.
Fracturing as a Quantitative Indicator of Lava Flow Dynamics
NASA Astrophysics Data System (ADS)
Kilburn, C. R.; Solana, C.
2005-12-01
The traditional classification of lava flows into pahoehoe and aa varieties reflects differences in how a flow can fracture its surface during advance. Both types of lava have a low strength upon eruption and require surface cooling to produce a crust that can fracture. Among pahoehoe lavas, applied stresses are small enough to allow the growth of a continuous crust, which is broken intermittently as the flow advances by propagating a collection of lava tongues. Among aa lavas, in contrast, applied stresses are large enough to maintain persistent crustal failure. The differences in fracturing characteristics have been used to quantify the transition between flow regimes and suggest that shear fracture may dominate tensile failure. Applied to Lanzarote, the model confirms the inference from incomplete eye-witness accounts of the 1730-36 Timanfaya eruption that pahoehoe flows were able to advance about an order of magnitude more quickly than would have been expected by analogy with Hawaiian pahoehoe flow-fields of similar dimensions. Surface texture and morphology, therefore, are insufficient guides for constraining the rate and style of pahoehoe emplacement. Applications include improved hazard assessments during effusive eruptions and new evaluations of the emplacement conditions for very large-volume pahoehoe lava flows.
Real time in-situ sensing of damage evolution in nanocomposite bonded surrogate energetic materials
NASA Astrophysics Data System (ADS)
Sengezer, Engin C.; Seidel, Gary D.
2016-04-01
The current work aims to explore the potential for in-situ structural health monitoring in polymer bonded energetic materials through the introduction of carbon nanotubes (CNTs) into the binder phase as a means to establish a significant piezoresistive response through the resulting nanocomposite binder. The experimental effort herein is focused on electro-mechanical characterization of surrogate materials in place of actual energetic (explosive) materials in order to provide proof of concept for strain and damage sensing. The electrical conductivity and the piezoresistive behavior of samples containing randomly oriented MWCNTs introduced into the epoxy (EPON 862) binder of 70 wt% ammonium perchlorate-epoxy hybrid composites are quantitatively and qualitatively evaluated. The mechanical response of the MWCNT-ammonium perchlorate-epoxy hybrid composites shows linear elastic behavior followed by brittle failure: microcracks form and reduce the composite's load-carrying capacity, and macrocracks then lead to eventual failure. Incorporating MWCNTs into the local polymer binder improves the effective stiffness by about 40% compared to neat ammonium perchlorate-polymer samples. The real-time in-situ relative change in resistance of the MWCNT hybrid composites was detected along with the applied strains through the piezoresistive response.
Ley, P
1985-04-01
Patients frequently fail to understand what they are told. Further, they frequently forget the information given to them. These factors have effects on patients' satisfaction with the consultation. All three of these factors--understanding, memory and satisfaction--have effects on the probability that a patient will comply with advice. The levels of failure to understand and remember and levels of dissatisfaction are described. Quantitative estimates of the effects of these factors on non-compliance are presented.
Fatigue strength of common tibial intramedullary nail distal locking screws
Griffin, Lanny V; Harris, Robert M; Zubak, Joseph J
2009-01-01
Background Premature failure of either the nail and/or locking screws with unstable fracture patterns may lead to angulation, shortening, malunion, and IM nail migration. Up to thirty percent of all unreamed nail locking screws can break after initial weight bearing is allowed at 8–10 weeks if union has not occurred. The primary problem this presents is hardware removal during revision surgery. The purposes of our study were to evaluate the relative fatigue resistance of distal locking screws and bolts from representative manufacturers of tibial IM nail systems, and to develop a relative risk assessment of the screws and materials used. Evaluations included quantitative and qualitative measures of the relative performance of these screws. Methods Fatigue tests were conducted to simulate a comminuted fracture that was treated by IM nailing, assuming that all load was carried by the screws. Each screw type was tested ten times in a single-screw configuration. One screw type was tested an additional ten times in a two-screw parallel configuration. Fatigue tests were performed using a servohydraulic materials testing system and custom fixturing that simulated screws placed in the distal region of an appropriately sized tibial IM nail. Fatigue loads were estimated based on a seventy-five kilogram individual at full weight bearing. The test duration was one million cycles (roughly one year), or screw fracture, whichever occurred first. Failure analysis of a representative sample of titanium alloy and stainless steel screws included scanning electron microscopy (SEM) and quantitative metallography. Results The average fatigue life of a single screw with a diameter of 4.0 mm was 1200 cycles, which would correspond roughly to half a day of full weight bearing. Single screws with a diameter of 4.5 mm or larger have approximately a 50 percent probability of withstanding a week of weight bearing, whereas a single 5.0 mm diameter screw has a greater than 90 percent probability of withstanding more than a week of weight bearing. If two small diameter screws are used, our tests showed that the probability of withstanding a week of weight bearing increases from zero to about 20 percent, which is similar to having a single 4.5 mm diameter screw providing fixation. Conclusion Our results show that selecting the system that uses the largest distal locking screws would offer the best fatigue resistance for an unstable fracture pattern subjected to full weight bearing. Furthermore, using multiple screws will substantially reduce the risk of premature hardware failure. PMID:19371438
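For readers wanting to reproduce survival probabilities of the kind quoted above, a two-parameter Weibull model is the usual choice for cycles-to-failure data. The sketch below uses hypothetical scale and shape parameters and an assumed cycle count per day of walking; neither is taken from the study.

```python
import math

def weibull_survival(n_cycles, eta, beta):
    """P(fatigue life > n_cycles) under a two-parameter Weibull model
    with scale eta (characteristic life) and shape beta."""
    return math.exp(-((n_cycles / eta) ** beta))

cycles_per_day = 2400          # assumed loading cycles per day of full weight bearing
week = 7 * cycles_per_day
print(f"P(survive one week) = {weibull_survival(week, eta=30000.0, beta=1.5):.2f}")
```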
A standardized model for predicting flap failure using indocyanine green dye
NASA Astrophysics Data System (ADS)
Zimmermann, Terence M.; Moore, Lindsay S.; Warram, Jason M.; Greene, Benjamin J.; Nakhmani, Arie; Korb, Melissa L.; Rosenthal, Eben L.
2016-03-01
Techniques that provide a non-invasive method for evaluation of intraoperative skin flap perfusion are currently available but underutilized. We hypothesize that intraoperative vascular imaging can be used to reliably assess skin flap perfusion and elucidate areas of future necrosis by means of a standardized critical perfusion threshold. Five animal groups (negative controls, n=4; positive controls, n=5; chemotherapy group, n=5; radiation group, n=5; chemoradiation group, n=5) underwent pre-flap treatments two weeks prior to undergoing random pattern dorsal fasciocutaneous flaps with a length to width ratio of 2:1 (3 × 1.5 cm). Flap perfusion was assessed via laser-assisted indocyanine green dye angiography and compared to standard clinical assessment for predictive accuracy of flap necrosis. For estimating flap-failure, clinical prediction achieved a sensitivity of 79.3% and a specificity of 90.5%. When average flap perfusion was more than three standard deviations below the average flap perfusion for the negative control group at the time of the flap procedure (144.3 ± 17.05 absolute perfusion units), laser-assisted indocyanine green dye angiography achieved a sensitivity of 81.1% and a specificity of 97.3%. When absolute perfusion units were seven standard deviations below the average flap perfusion for the negative control group, specificity of necrosis prediction was 100%. Quantitative absolute perfusion units can improve specificity for intraoperative prediction of viable tissue. Using this strategy, a positive predictive threshold of flap failure can be standardized for clinical use.
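The decision rule in this abstract reduces to a simple threshold on absolute perfusion units. The sketch below computes the 3- and 7-standard-deviation cutoffs from control-group perfusion values; the listed control values are hypothetical, chosen only to match the reported control mean.

```python
import statistics

def perfusion_threshold(control_values, k):
    """Cutoff k standard deviations below the negative-control mean,
    in absolute perfusion units (APU)."""
    return statistics.mean(control_values) - k * statistics.stdev(control_values)

controls = [128.0, 135.5, 150.2, 163.5]   # hypothetical, mean = 144.3 APU
print(f"3-SD threshold: {perfusion_threshold(controls, 3):.1f} APU (flag likely necrosis)")
print(f"7-SD threshold: {perfusion_threshold(controls, 7):.1f} APU (specificity ~100%)")
```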
NASA Astrophysics Data System (ADS)
Fan, Linfeng; Lehmann, Peter; Or, Dani
2015-10-01
Evidence suggests that the sudden triggering of rainfall-induced shallow landslides is preceded by accumulation of local internal failures in the soil mantle before their abrupt coalescence into a landslide failure plane. The mechanical status of a hillslope at any given time reflects competition between local damage accumulated during antecedent rainfall events and rates of mechanical healing (e.g., rebonding of microcracks and root regrowth). This dynamic interplay between damage accumulation and healing rates determines the initial mechanical state for landslide modeling. We evaluated the roles of these dynamic processes on landslide characteristics and patterns using a hydromechanical landslide-triggering model for a sequence of rainfall scenarios. The progressive nature of soil failure was represented by the fiber bundle model formalism that considers threshold strength of mechanical bonds linking adjacent soil columns and bedrock. The antecedent damage induced by prior rainfall events was expressed by the fraction of broken fibers that gradually regain strength or mechanically heal at rates specific to soil and roots. Results indicate that antecedent damage accelerates landslide initiation relative to pristine (undamaged) hillslopes. The volumes of first triggered landslides increase with increasing antecedent damage; however, for heavily damaged hillslopes, landslide volumes tend to decrease. Elapsed time between rainfall events allows mechanical healing that reduces the effects of antecedent damage. This study proposed a quantitative framework for systematically incorporating hydromechanical loading history and information on precursor events (e.g., such as recorded by acoustic emissions) into shallow landslide hazard assessment.
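The fiber bundle formalism mentioned above can be sketched in a few lines. The version below is a generic equal-load-sharing bundle with uniform strength thresholds, with antecedent damage modeled as a fraction of randomly pre-broken, un-healed bonds; it is a toy stand-in for the paper's hydromechanical model, not a reimplementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fbm_peak_load(n_fibers=100_000, antecedent_damage=0.0):
    """Equal-load-sharing fiber bundle with uniform strength thresholds.
    antecedent_damage is the fraction of bonds already broken (and not
    yet healed) by prior loading; returns the peak sustainable load per
    original fiber."""
    thresholds = rng.uniform(0.0, 1.0, n_fibers)
    alive = rng.uniform(0.0, 1.0, n_fibers) > antecedent_damage  # random prior breaks
    t = np.sort(thresholds[alive])
    survivors = np.arange(t.size, 0, -1)   # fibers still intact at each threshold
    load = t * survivors / n_fibers        # bundle load per original fiber
    return load.max()

for d in (0.0, 0.1, 0.3):
    print(f"antecedent damage {d:.0%}: peak load {fbm_peak_load(antecedent_damage=d):.3f}")
```

With these assumptions the peak sustainable load falls roughly in proportion to the un-healed damage fraction, consistent with the qualitative claim that antecedent damage accelerates triggering.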
NASA Technical Reports Server (NTRS)
Schuecker, Clara; Davila, Carlos G.; Rose, Cheryl A.
2010-01-01
Five models for matrix damage in fiber reinforced laminates are evaluated for matrix-dominated loading conditions under plane stress and are compared both qualitatively and quantitatively. The emphasis of this study is on a comparison of the response of embedded plies subjected to a homogeneous stress state. Three of the models are specifically designed for modeling the non-linear response due to distributed matrix cracking under homogeneous loading, and also account for non-linear (shear) behavior prior to the onset of cracking. The remaining two models are localized damage models intended for predicting local failure at stress concentrations. The modeling approaches of distributed vs. localized cracking as well as the different formulations of damage initiation and damage progression are compared and discussed.
QSAR Modeling: Where Have You Been? Where Are You Going To?.
Quantitative structure–activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this...
Relating design and environmental variables to reliability
NASA Astrophysics Data System (ADS)
Kolarik, William J.; Landers, Thomas L.
The combination of space application and nuclear power source demands high-reliability hardware. The possibilities of failure, either an inability to provide power or a catastrophic accident, must be minimized. Nuclear power experiences on the ground have led to highly sophisticated probabilistic risk assessment procedures, most of which require quantitative information to adequately assess such risks. In the area of hardware risk analysis, reliability information plays a key role. One of the lessons learned from the Three Mile Island experience is that thorough analyses of critical components are essential. Nuclear grade equipment shows some reliability advantages over commercial equipment; however, no statistically significant difference has been found. A recent study pertaining to spacecraft electronics reliability examined some 2500 malfunctions on more than 300 aircraft. The study classified the equipment failures into seven general categories. Design deficiencies and lack of environmental protection accounted for about half of all failures. Within each class, limited reliability modeling was performed using a Weibull failure model.
Analysis of rockbolt performance at the Waste Isolation Pilot Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terrill, L.J.; Francke, C.T.; Saeb, S.
Rockbolt failures at the Waste Isolation Pilot Plant have been recorded since 1990 and are categorized in terms of mode of failure. The failures are evaluated in terms of physical location of installation within the mine, local excavation geometry and stratigraphy, proximity to other excavations or shafts, and excavation age. The database of failures has revealed discrete areas of the mine containing relatively large numbers of failures. The results of metallurgical analyses and standard rockbolt load testing have generally been in agreement with the in situ evaluations.
A System Computational Model of Implicit Emotional Learning
Puviani, Luca; Rama, Sidita
2016-01-01
Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unluckily, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available from the literature, a computational model of implicit emotional learning based both on prediction errors computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance-to-extinction of the traumatic emotional responses, (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotions modulation. PMID:27378898
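For contrast with the authors' inference-based model, the classical-conditioning baseline they critique can be reduced to a delta-rule (Rescorla-Wagner-type) update, which always predicts smooth extinction once the outcome stops arriving and so cannot reproduce resistance-to-extinction. A minimal generic sketch, not the authors' model:

```python
def delta_rule(outcomes, alpha=0.2):
    """One-parameter prediction-error learner: associative strength v
    moves toward each observed outcome by alpha times the prediction error."""
    v, history = 0.0, []
    for r in outcomes:
        v += alpha * (r - v)
        history.append(round(v, 3))
    return history

# Ten acquisition trials (outcome present) then ten extinction trials (absent)
print(delta_rule([1] * 10 + [0] * 10))
```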
Brain proton magnetic resonance spectroscopy for hepatic encephalopathy
NASA Astrophysics Data System (ADS)
Ong, Chin-Sing; McConnell, James R.; Chu, Wei-Kom
1993-08-01
Liver failure can induce gradations of encephalopathy ranging from mild impairment to stupor to deep coma. The objective of this study is to investigate and quantify the variation of biochemical compounds in the brain in patients with liver failure and encephalopathy, through the use of water-suppressed, localized in-vivo Proton Magnetic Resonance Spectroscopy (HMRS). The spectral parameters of the compounds quantitated are: N-Acetyl Aspartate (NAA) to Creatine (Cr) ratio, Choline (Cho) to Creatine ratio, Inositol (Ins) to Creatine ratio and Glutamine-Glutamate Amino Acid (AA) to Creatine ratio. The study group consisted of twelve patients with proven advanced chronic liver failure and symptoms of encephalopathy. Results were compared with those obtained from five normal subjects without any evidence of encephalopathy or liver disease.
Cascading failure in scale-free networks with tunable clustering
NASA Astrophysics Data System (ADS)
Zhang, Xue-Jun; Gu, Bo; Guan, Xiang-Min; Zhu, Yan-Bo; Lv, Ren-Li
2016-02-01
Cascading failure is ubiquitous in many networked infrastructure systems, such as power grids, the Internet and air transportation systems. In this paper, we extend the cascading failure model to a scale-free network with tunable clustering and focus on the effect of the clustering coefficient on system robustness. It is found that network robustness undergoes a nonmonotonic transition as the clustering coefficient increases: both highly and lowly clustered networks are fragile under intentional attack, and networks with a moderate clustering coefficient better resist the spread of cascading failures. We then provide an extensive explanation for this constructive phenomenon from a microscopic point of view and through quantitative analysis. Our work can be useful to the design and optimization of infrastructure systems.
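A compact way to reproduce this kind of experiment is to pair a scale-free generator with tunable clustering (the Holme-Kim model) with a betweenness-based overload cascade in the spirit of Motter and Lai. The sketch below is an assumed stand-in for the paper's model, with a hypothetical tolerance parameter alpha:

```python
import networkx as nx

def cascade_fraction(g, alpha=0.2):
    """Motter-Lai-style cascade: node capacity is (1 + alpha) times its
    initial betweenness load. Removing the most-loaded node (intentional
    attack) may overload others, which fail in turn until the load map
    stabilizes. Returns the fraction of nodes lost."""
    n0 = g.number_of_nodes()
    load = nx.betweenness_centrality(g, normalized=False)
    capacity = {n: (1 + alpha) * load[n] for n in g}
    h = g.copy()
    h.remove_node(max(load, key=load.get))
    while True:
        load = nx.betweenness_centrality(h, normalized=False)
        overloaded = [n for n in h if load[n] > capacity[n]]
        if not overloaded:
            break
        h.remove_nodes_from(overloaded)
    return 1.0 - h.number_of_nodes() / n0

# Holme-Kim graphs: scale-free degree sequence, clustering tuned by triad probability p
for p in (0.0, 0.3, 0.9):
    g = nx.powerlaw_cluster_graph(200, 2, p, seed=1)
    print(f"triad probability {p}: cascade removes {cascade_fraction(g):.1%} of nodes")
```

Sweeping the triad probability p varies the clustering coefficient at a roughly fixed degree distribution, isolating the effect studied in the paper.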
Wong, Sean-Man; Tse, Hung-Fat; Siu, Chung-Wah
2012-03-01
Hyperthyroidism is a common side effect encountered in patients prescribed long-term amiodarone therapy for cardiac arrhythmias. We previously studied 354 patients prescribed amiodarone in whom the occurrence of hyperthyroidism was associated with major adverse cardiovascular events including heart failure, myocardial infarction, ventricular arrhythmias, stroke and even death [1]. We now present a case of amiodarone-induced hyperthyroidism complicated by isolated right heart failure and pulmonary hypertension that resolved with treatment of hyperthyroidism. Detailed quantitative echocardiography enables improved understanding of the haemodynamic mechanisms underlying the condition. Copyright © 2011 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
Mortimer, Duncan; Segal, Leonie
2006-01-01
To propose methods for the inclusion of within-family external effects in clinical and economic evaluations. To demonstrate the extent of bias due to the exclusion of within-family external effects when measuring the relative performance of interventions for problem drinking and alcohol dependence. The timing and magnitude of treatment effects are modified to accommodate the external health-related quality of life impact of having a problem or dependent drinker in the family home. The inclusion of within-family external effects reduces cost per QALY estimates of interventions for problem drinking and alcohol dependence thereby improving the performance of all evaluated interventions. In addition, the inclusion of within-family external effects improves the relative performance of interventions targeted at those with moderate-to-severe alcohol dependence as compared to interventions targeted at less severe alcohol problems. Failure to take account of external effects in clinical and economic evaluations results in an uneven playing field. Interventions with readily quantifiable health benefits (where social costs and benefits are predominantly comprised of private costs and benefits) are at a distinct advantage when competing for public funding against interventions with quantitatively important external effects.
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
The theory of System Health Management (SHM) and of its operational subset Fault Management (FM) states that FM is implemented as a "meta" control loop, known as an FM Control Loop (FMCL). The FMCL detects that all or part of a system is now failed, or in the future will fail (that is, cannot be controlled within acceptable limits to achieve its objectives), and takes a control action (a response) to return the system to a controllable state. In terms of control theory, the effectiveness of each FMCL is estimated based on its ability to correctly estimate the system state, and on the speed of its response to the current or impending failure effects. This paper describes how this theory has been successfully applied on the National Aeronautics and Space Administration's (NASA) Space Launch System (SLS) Program to quantitatively estimate the effectiveness of proposed abort triggers so as to select the most effective suite to protect the astronauts from catastrophic failure of the SLS. The premise behind this process is to be able to quantitatively provide the value versus risk trade-off for any given abort trigger, allowing decision makers to make more informed decisions. All current and planned crewed launch vehicles have some form of vehicle health management system integrated with an emergency launch abort system to ensure crew safety. While the design can vary, the underlying principle is the same: detect imminent catastrophic vehicle failure, initiate launch abort, and extract the crew to safety. Abort triggers are the detection mechanisms that identify that a catastrophic launch vehicle failure is occurring or is imminent and cause the initiation of a notification to the crew vehicle that the escape system must be activated. While ensuring that the abort triggers provide this function, designers must also ensure that the abort triggers do not signal that a catastrophic failure is imminent when in fact the launch vehicle can successfully achieve orbit. That is, the abort triggers must have low false negative rates to be sure that real crew-threatening failures are detected, and also low false positive rates to ensure that the crew does not abort from non-crew-threatening launch vehicle behaviors. The analysis process described in this paper is a compilation of over six years of lessons learned and refinements from experiences developing abort triggers for NASA's Constellation Program (Ares I Project) and the SLS Program, as well as the simultaneous development of SHM/FM theory. The paper will describe the abort analysis concepts and process, developed in conjunction with SLS Safety and Mission Assurance (S&MA) to define a common set of mission phase, failure scenario, and Loss of Mission Environment (LOME) combinations upon which the SLS Loss of Mission (LOM) Probabilistic Risk Assessment (PRA) models are built. This abort analysis also requires strong coordination with the Multi-Purpose Crew Vehicle (MPCV) and SLS Structures and Environments (STE) to formulate a series of abortability tables that encapsulate explosion dynamics over the ascent mission phase. The design and assessment of abort conditions and triggers to estimate their Loss of Crew (LOC) benefits also requires in-depth integration with other groups, including Avionics, Guidance, Navigation and Control (GN&C), the Crew Office, Mission Operations, and Ground Systems. The outputs of this analysis are a critical input to SLS S&MA's LOC PRA models.
The process described here may well be the first full quantitative application of SHM/FM theory to the selection of a sensor suite for any aerospace system.
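The value-versus-risk trade-off for a single abort trigger can be written down directly: the trigger's sensitivity buys protection against real failures, while its false-positive rate buys exposure to the (nonzero) risk of an unnecessary abort. All probabilities below are illustrative assumptions, not SLS numbers.

```python
def loss_of_crew_probability(p_fail, sensitivity, false_positive_rate,
                             p_loc_if_missed=1.0, p_loc_if_abort=0.1,
                             p_loc_if_false_abort=0.05):
    """Per-mission P(LOC) for one abort trigger, combining detection
    benefit against the risk an abort itself poses to the crew."""
    real = p_fail * (sensitivity * p_loc_if_abort
                     + (1.0 - sensitivity) * p_loc_if_missed)
    false = (1.0 - p_fail) * false_positive_rate * p_loc_if_false_abort
    return real + false

no_trigger = loss_of_crew_probability(0.01, 0.0, 0.0)
with_trigger = loss_of_crew_probability(0.01, 0.95, 0.001)
print(f"P(LOC): {no_trigger:.5f} without trigger, {with_trigger:.5f} with trigger")
```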
Allegrini, P; Balocchi, R; Chillemi, S; Grigolini, P; Hamilton, P; Maestri, R; Palatella, L; Raffaelli, G
2003-06-01
We analyze RR heartbeat sequences with a dynamic model that satisfactorily reproduces both the long- and the short-time statistical properties of heart beating. These properties are expressed quantitatively by means of two significant parameters: the scaling delta, concerning the asymptotic effects of long-range correlation, and the quantity 1 - pi, establishing the amount of uncorrelated fluctuations. We find a correlation between the position in the phase space (delta, pi) of patients with congestive heart failure and their mortality risk.
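The scaling parameter in such analyses is typically estimated with a fluctuation method. The sketch below implements generic detrended fluctuation analysis (DFA) on an RR series; it is a standard estimator of long-range-correlation scaling, not the specific dynamic model used by the authors.

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(s) versus log s,
    where F(s) is the RMS deviation from a per-window linear trend of
    the integrated, mean-centered series."""
    y = np.cumsum(np.asarray(x) - np.mean(x))
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        t = np.arange(s)
        mse = 0.0
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            mse += np.mean((seg - trend) ** 2)
        fluct.append(np.sqrt(mse / n_seg))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rr = np.random.default_rng(2).normal(0.8, 0.05, 4096)  # white-noise surrogate series
print(f"DFA exponent: {dfa_exponent(rr):.2f} (about 0.5 for uncorrelated beats)")
```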
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
Microtensile bond strength of enamel after bleaching.
Lago, Andrea Dias Neves; Garone-Netto, Narciso
2013-01-01
To evaluate the bond strength of a composite resin to bovine enamel bleached with 35% hydrogen peroxide, an etch-and-rinse adhesive system was used immediately, 7 days, and 14 days after bleaching. Twenty bovine teeth were randomly distributed into 4 groups (n = 5), 3 experimental and 1 control. G1: Unbleached + restoration 14 days after storage in artificial saliva (control); G2: Bleached + restoration immediately after bleaching; G3: Bleached + restoration 7 days after bleaching; G4: Bleached + restoration 14 days after bleaching. Their buccal enamel surfaces were flattened, and a 25 mm² (5 × 5 mm) area of each was outlined so as to standardize the experimental region. Universal hybrid composite resin Filtek™ Z350 was inserted in four layers of 1 mm each and photo-activated. The bond strength was quantitatively evaluated by a microtensile test (1.0 mm/min) 24 h after the restorative procedures. The failure mode was assessed through scanning electron microscopy (SEM). There was a significant reduction in the bond strength of teeth restored immediately after bleaching (G2). There were no significant differences in enamel bond strength between groups G1, G3, and G4. Adhesive and mixed (cohesive + adhesive) failures predominated in all groups. The 7-day period after the end of bleaching with 35% hydrogen peroxide was sufficient to achieve appropriate bond strength values to the enamel.
Pooled nucleic acid testing to identify antiretroviral treatment failure during HIV infection.
May, Susanne; Gamst, Anthony; Haubrich, Richard; Benson, Constance; Smith, Davey M
2010-02-01
Pooling strategies have been used to reduce the costs of polymerase chain reaction-based screening for acute HIV infection in populations in which the prevalence of acute infection is low (less than 1%). Only limited research has been done for conditions in which the prevalence of screening positivity is higher (greater than 1%). We present data on a variety of pooling strategies that incorporate the use of polymerase chain reaction-based quantitative measures to monitor for virologic failure among HIV-infected patients receiving antiretroviral therapy. For a prevalence of virologic failure between 1% and 25%, we demonstrate relative efficiency and accuracy of various strategies. These results could be used to choose the best strategy based on the requirements of individual laboratory and clinical settings such as required turnaround time of results and availability of resources. Virologic monitoring during antiretroviral therapy is not currently being performed in many resource-constrained settings largely because of costs. The presented pooling strategies may be used to significantly reduce the cost compared with individual testing, make such monitoring feasible, and limit the development and transmission of HIV drug resistance in resource-constrained settings. They may also be used to design efficient pooling strategies for other settings with quantitative screening measures.
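The efficiency arithmetic behind such pooling is easy to reproduce for the simplest two-stage (Dorfman) scheme: test each pool once, then retest members of positive pools individually. The sketch below finds the pool size minimizing expected tests per sample at a given failure prevalence; the authors' quantitative strategies are more elaborate, so treat this as a baseline only.

```python
def tests_per_sample(pool_size, prevalence):
    """Expected tests per sample for two-stage (Dorfman) pooling:
    one test per pool, plus individual retests when the pool is positive."""
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    return 1.0 / pool_size + p_pool_positive

for prev in (0.01, 0.05, 0.25):
    best = min(range(2, 21), key=lambda k: tests_per_sample(k, prev))
    print(f"prevalence {prev:.0%}: pool of {best} -> "
          f"{tests_per_sample(best, prev):.2f} tests/sample")
```

At 1% prevalence the optimal pool cuts testing to about a fifth of individual testing, while at 25% prevalence the saving shrinks markedly, matching the intuition that pooling pays off when positives are rare.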
Fan, Ruoxun; Liu, Jie; Jia, Zhengbin; Deng, Ying; Liu, Jun
2018-01-01
Macro-level failure in bone structure can be diagnosed through pain or physical examination. However, diagnosing tissue-level failure in a timely manner is challenging due to the difficulty of observing the interior mechanical environment of bone tissue. Because most fractures begin with tissue-level failure caused by continually applied loading, researchers attempt to monitor the tissue-level failure of bone and provide corresponding measures to prevent fracture. Many tissue-level mechanical parameters of bone can be predicted or measured; however, their values may vary among specimens of the same kind of bone structure, even at the same age and anatomical site. These variations make it difficult to represent tissue-level bone failure. Therefore, determining an appropriate tissue-level failure evaluation standard is necessary. In this study, the yield and failure processes of rat femoral cortical bones were first simulated through a hybrid computational-experimental method. Subsequently, the tissue-level strains and the ratio between tissue-level failure and yield strains in cortical bones were predicted. The results indicated that tissue-level strains differed appreciably among specimens, whereas the ratio varied only slightly among different cortical bones. Therefore, the ratio between tissue-level failure and yield strains for a kind of bone structure can be determined, and this ratio may be regarded as an appropriate tissue-level failure evaluation standard to represent the mechanical status of bone tissue.
Mureddu, Gian F; Nistri, Stefano; Faggiano, Pompilio; Fimiani, Biagio; Misuraca, Gianfranco; Maggi, Antonio; Gori, Anna M; Uguccioni, Massimo; Tavazzi, Luigi; Zito, Giovanni B
2016-07-01
Early detection of heart failure, while still preclinical, is fundamental. Therefore, it is important to assess whether preclinical heart failure management by cardiologists is adequate. The VASTISSIMO study ('EValuation of the AppropriateneSs of The preclInical phase (Stage A and Stage B) of heart failure Management in Outpatient clinics in Italy') is a prospective nationwide study aimed at evaluating the appropriateness of diagnosis and management of preclinical heart failure (stages A and B) by cardiologists working in outpatient clinics in Italy. Secondary goals are to verify whether an online educational course for cardiologists can improve management of preclinical heart failure, and to evaluate how well cardiologists are aware of patients' adherence to medications. The study involves 80 outpatient cardiology clinics distributed throughout Italy, affiliated either with the Hospital Cardiologists Association or with the Regional Association of Outpatient Cardiologists, and is designed with two phases of consecutive outpatient enrolment, each lasting 1 month. In phase 1, physicians' awareness of the risk of heart failure and their decision-making process are recorded. Subsequently, half of the cardiologists are randomized to an online educational course aimed at improving preclinical heart failure management through implementation of guideline recommendations. At the end of the course, all cardiologists are evaluated (phase 2) to see whether changes in clinical management have occurred in those who underwent the educational program versus those who did not. Patients' adherence to prescribed medications will be assessed through the Morisky Self-report Questionnaire. This study should provide valuable information about cardiologists' awareness of preclinical heart failure and the appropriateness of clinical practice in outpatient cardiology clinics in Italy.
NASA Astrophysics Data System (ADS)
Ito, Reika; Yoshidome, Takashi
2018-01-01
Markov state models (MSMs) are a powerful approach for analyzing the long-time behavior of protein motion using molecular dynamics simulation data. However, their quantitative performance with respect to physical quantities is often poor. We believe that this poor performance is caused by the failure to appropriately classify protein conformations into states when constructing MSMs. Herein, we show that the quantitative performance of an order parameter is improved when a manifold-learning technique is employed for the classification in the MSM. By contrast, MSMs constructed with the K-center method, which has previously been used for this classification, show poor quantitative performance.
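Whatever classification is used, the downstream MSM step is the same: count transitions between states at a fixed lag time and row-normalize. A minimal sketch, assuming trajectories already discretized into integer state labels:

```python
import numpy as np

def transition_matrix(dtrajs, n_states, lag=1):
    """Row-stochastic MSM transition matrix estimated by counting
    transitions at a fixed lag in discretized trajectories."""
    counts = np.zeros((n_states, n_states))
    for traj in dtrajs:
        for a, b in zip(traj[:-lag], traj[lag:]):
            counts[a, b] += 1
    counts += 1e-12  # guard against division by zero for unvisited states
    return counts / counts.sum(axis=1, keepdims=True)

# Toy discretized trajectory over 3 states
T = transition_matrix([[0, 0, 1, 1, 2, 2, 0, 1, 2, 2, 1, 0]], n_states=3)
print(np.round(T, 2))
```

The paper's point is that the quality of the upstream discretization (manifold learning versus K-center clustering) largely determines how faithful the resulting matrix, and any order parameter computed from it, will be.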
A quantitative analysis of the F18 flight control system
NASA Technical Reports Server (NTRS)
Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann
1993-01-01
This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
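The sensitivity of reliability to coverage can be seen even in a textbook two-unit redundant system, where a unit failure is detected and handled successfully only with probability c. A minimal sketch with hypothetical failure-rate and mission-time values (not F18 FCS numbers):

```python
import math

def duplex_reliability(lam, t, coverage):
    """Reliability of a two-unit redundant pair in which a unit failure
    is successfully detected and handled with probability `coverage`;
    an uncovered failure takes the whole system down."""
    r = math.exp(-lam * t)                       # single-unit reliability
    return r * r + 2.0 * coverage * r * (1.0 - r)

lam, t = 1e-4, 1000.0                            # hypothetical rate (1/h) and hours
for c in (1.0, 0.99, 0.9):
    print(f"coverage {c}: system reliability {duplex_reliability(lam, t, c):.5f}")
```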
Desmoulin, Franck; Galinier, Michel; Trouillet, Charlotte; Berry, Matthieu; Delmas, Clément; Turkieh, Annie; Massabuau, Pierre; Taegtmeyer, Heinrich; Smih, Fatima; Rouet, Philippe
2013-01-01
Objective Mortality in acute heart failure (AHF) remains high, especially during the first days of hospitalization. New prognostic biomarkers may help to optimize treatment. The aim of the study was to determine metabolites with a high prognostic value. Methods We conducted a prospective study on a training cohort of AHF patients (n = 126) admitted to the cardiac intensive care unit and assessed survival at 30 days. Venous plasma collected at admission was used for 1H NMR-based metabonomics analysis. Differences between plasma metabolite profiles allowed determination of discriminating metabolites. A cohort of AHF patients was subsequently constituted (n = 74) to validate the findings. Results Lactate and cholesterol were the major discriminating metabolites predicting 30-day mortality. Mortality was increased in patients with high lactate and low total cholesterol concentrations at admission. The accuracies of lactate, cholesterol concentration and the lactate to cholesterol (Lact/Chol) ratio in predicting 30-day mortality were evaluated using ROC analysis. The Lact/Chol ratio provided the best accuracy with an AUC of 0.82 (P < 0.0001). The acute physiology and chronic health evaluation (APACHE) II scoring system provided an AUC of 0.76 for predicting 30-day mortality. APACHE II score, cardiogenic shock (CS) state and Lact/Chol ratio ≥ 0.4 (cutoff value with 82% sensitivity and 64% specificity) were significant independent predictors of 30-day mortality, with hazard ratios (HR) of 1.11, 4.77 and 3.59, respectively. In CS patients, the HR of 30-day mortality risk for a plasma Lact/Chol ratio ≥ 0.4 was 3.26 compared to a Lact/Chol ratio < 0.4 (P = 0.018). The predictive power of the Lact/Chol ratio for 30-day mortality was confirmed with the independent validation cohort. Conclusion This study identifies the plasma Lact/Chol ratio as a useful, objective and simple parameter for evaluating short-term prognosis, which could be integrated into quantitative guidance for decision making in heart failure care. PMID:23573279
The contribution of simple random sampling to observed variations in faecal egg counts.
Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I
2012-09-10
It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conform to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, with illustrative examples of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
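The Poisson confidence interval the authors have in mind is straightforward to compute. The sketch below uses the exact (Garwood) chi-square formulation and an assumed McMaster multiplication factor of 50, a common but technique-dependent value:

```python
from scipy.stats import chi2

def epg_confidence_interval(eggs_counted, multiplier=50, conf=0.95):
    """Exact Poisson confidence interval for a McMaster count, scaled to
    eggs per gram (epg) by the slide's multiplication factor."""
    a = 1.0 - conf
    lo = 0.0 if eggs_counted == 0 else chi2.ppf(a / 2, 2 * eggs_counted) / 2
    hi = chi2.ppf(1 - a / 2, 2 * (eggs_counted + 1)) / 2
    return lo * multiplier, hi * multiplier

for n in (2, 10, 50):
    lo, hi = epg_confidence_interval(n)
    print(f"{n} eggs on slide -> {n * 50} epg, 95% CI [{lo:.0f}, {hi:.0f}] epg")
```

Two eggs on a slide report 100 epg, but the 95% interval spans roughly 12 to 360 epg, which is exactly the intrinsic sampling variation the paper warns about.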
Does an inter-flaw length control the accuracy of rupture forecasting in geological materials?
NASA Astrophysics Data System (ADS)
Vasseur, Jérémie; Wadsworth, Fabian B.; Heap, Michael J.; Main, Ian G.; Lavallée, Yan; Dingwell, Donald B.
2017-10-01
Multi-scale failure of porous materials is an important phenomenon in nature and in material physics - from controlled laboratory tests to rockbursts, landslides, volcanic eruptions and earthquakes. A key unsolved research question is how to accurately forecast the time of system-sized catastrophic failure, based on observations of precursory events such as acoustic emissions (AE) in laboratory samples, or, on a larger scale, small earthquakes. Until now, the length scale associated with precursory events has not been well quantified, resulting in forecasting tools that are often unreliable. Here we test the hypothesis that the accuracy of the forecast failure time depends on the inter-flaw distance in the starting material. We use new experimental datasets for the deformation of porous materials to infer the critical crack length at failure from a static damage mechanics model. The style of acceleration of AE rate prior to failure, and the accuracy of forecast failure time, both depend on whether the cracks can span the inter-flaw length or not. A smooth inverse power-law acceleration of AE rate to failure, and an accurate forecast, occurs when the cracks are sufficiently long to bridge pore spaces. When this is not the case, the predicted failure time is much less accurate and failure is preceded by an exponential AE rate trend. Finally, we provide a quantitative and pragmatic correction for the systematic error in the forecast failure time, valid for structurally isotropic porous materials, which could be tested against larger-scale natural failure events, with suitable scaling for the relevant inter-flaw distances.
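The "smooth inverse power-law acceleration" case admits the classic materials-failure forecast: when the event rate grows as (t_f - t)^(-1), the inverse rate falls linearly in time, so a straight-line fit crosses zero at the failure time. A minimal sketch on synthetic data (the exponent and rates are illustrative, not the paper's measurements):

```python
import numpy as np

def forecast_failure_time(times, rates):
    """Inverse-rate failure forecast: fit a line to 1/rate versus time
    and return its zero crossing, the predicted failure time."""
    inv = 1.0 / np.asarray(rates)
    slope, intercept = np.polyfit(times, inv, 1)
    return -intercept / slope

# Synthetic AE rates accelerating toward a true failure time t_f = 100 s
t = np.linspace(0.0, 90.0, 10)
rates = 1.0 / (100.0 - t)                 # rate ~ (t_f - t)^(-1)
print(f"forecast t_f = {forecast_failure_time(t, rates):.1f} s")
```

The paper's contribution is essentially a statement of when this works: only once cracks can span the inter-flaw length does the rate follow the smooth power law that makes the extrapolation accurate.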
NASA Technical Reports Server (NTRS)
Kennedy, Barbara J.
2004-01-01
The purposes of this study are to compare the current Space Shuttle Ground Support Equipment (GSE) infrastructure with the proposed GSE infrastructure upgrade modification. The methodology includes analyzing the first prototype installation at Launch Pad B, called the "Pathfinder". The study begins by comparing the failure rate of the current components associated with the Hardware Interface Module (HIM) at the Kennedy Space Center to the failure rate of the new Pathfinder components. Quantitative data were gathered specifically on HIM components and on the Pad B hypergolic fuel facility and hypergolic oxidizer facility areas, which have the upgraded Pathfinder equipment installed. The proposed upgrades include utilizing industrial control modules, software, and a fiber optic network. The results of this study provide evidence that there is a significant difference in the failure rates of the two studied infrastructure equipment components. There is also evidence that the support staffing for the two infrastructure systems is not equal. A recommendation to continue with future upgrades is based on a significant reduction of failures in the newly installed ground system components.
DOT National Transportation Integrated Search
2010-01-01
The Smart Grid is a cyber-physical system comprised of physical components, such as transmission lines and generators, and a : network of embedded systems deployed for their cyber control. Our objective is to qualitatively and quantitatively analyze ...
Try Fault Tree Analysis, a Step-by-Step Way to Improve Organization Development.
ERIC Educational Resources Information Center
Spitzer, Dean
1980-01-01
Fault Tree Analysis, a systems safety engineering technology used to analyze organizational systems, is described. Explains the use of logic gates to represent the relationship between failure events, qualitative analysis, quantitative analysis, and effective use of Fault Tree Analysis. (CT)
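For readers unfamiliar with the quantitative side of fault tree analysis: assuming independent basic events, an AND gate multiplies input probabilities and an OR gate combines them as the complement of the product of complements. A minimal sketch, with illustrative event probabilities:

```python
# Minimal quantitative fault tree evaluation, assuming independent
# basic events: an AND gate multiplies probabilities, an OR gate
# combines them as 1 - prod(1 - p). Event names are illustrative.
from math import prod

def and_gate(*p):  # all inputs must fail
    return prod(p)

def or_gate(*p):   # any input failing suffices
    return 1.0 - prod(1.0 - x for x in p)

# Toy tree: top event fails if (A and B) or C.
p_A, p_B, p_C = 0.01, 0.02, 0.001
p_top = or_gate(and_gate(p_A, p_B), p_C)
print(f"P(top event) = {p_top:.6f}")  # 1 - (1 - 0.0002)(1 - 0.001)
```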
Transitional care for formerly incarcerated persons with HIV: protocol for a realist review.
Tsang, Jenkin; Mishra, Sharmistha; Rowe, Janet; O'Campo, Patricia; Ziegler, Carolyn; Kouyoumdjian, Fiona G; Matheson, Flora I; Bayoumi, Ahmed M; Zahid, Shatabdy; Antoniou, Tony
2017-02-13
Little is known about the mechanisms that influence the success or failure of programs to facilitate re-engagement with health and social services for formerly incarcerated persons with HIV. This review aims to identify how interventions to address such transitions work, for whom and under what circumstances. We will use realist review methodology to conduct our analysis. We will systematically search electronic databases and grey literature for English language qualitative and quantitative studies of interventions. Two investigators will independently screen citations and full-text articles, abstract data, appraise study quality and synthesize the literature. Data analysis will include identifying context-mechanism-outcome configurations, exploring and comparing patterns in these configurations, making comparisons across contexts and developing explanatory frameworks. This review will identify mechanisms that influence the success or failure of transition interventions for formerly incarcerated individuals with HIV. The findings will be integrated with those from complementary qualitative and quantitative studies to inform future interventions. PROSPERO CRD42016040054.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
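The handbook's specific reliability tools are not detailed here, but the simplest instance of the idea of measuring the probability of failure-free operation is the constant-failure-rate exponential model, sketched below with illustrative numbers.

```python
# A minimal sketch of reliability estimation with the basic exponential
# model: probability of failure-free operation R(t) = exp(-lambda * t),
# with the failure rate estimated as failures per unit execution time.
# The specific models/tools used by the Space Station team are not
# stated here; numbers are illustrative.
import math

failures = 4             # failures observed during test
exposure_hours = 2000.0  # cumulative execution time under test
lam = failures / exposure_hours   # MLE of a constant failure rate

mission_hours = 100.0
reliability = math.exp(-lam * mission_hours)
print(f"lambda = {lam:.2e}/h, R({mission_hours:.0f} h) = {reliability:.3f}")
```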
Failure dynamics of the global risk network.
Szymanski, Boleslaw K; Lin, Xin; Asztalos, Andrea; Sreenivasan, Sameet
2015-06-18
Risks threatening modern societies form an intricately interconnected network that often underlies crisis situations. Yet, little is known about how risk materializations in distinct domains influence each other. Here we present an approach in which expert assessments of likelihoods and influence of risks underlie a quantitative model of the global risk network dynamics. The modeled risks range from environmental to economic and technological, and include difficult to quantify risks, such as geo-political and social. Using the maximum likelihood estimation, we find the optimal model parameters and demonstrate that the model including network effects significantly outperforms the others, uncovering full value of the expert collected data. We analyze the model dynamics and study its resilience and stability. Our findings include such risk properties as contagion potential, persistence, roles in cascades of failures and the identity of risks most detrimental to system stability. The model provides quantitative means for measuring the adverse effects of risk interdependencies and the materialization of risks in the network.
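As a loose, toy-scale illustration of the kind of dynamics such a model captures (not the authors' formulation), the sketch below simulates risks that materialize internally or by contagion from active neighbours and recover at a fixed rate; the topology and probabilities are invented.

```python
# Toy discrete-time contagion sketch in the spirit of a risk network
# model: each inactive risk materializes internally with prob p_int,
# plus contagion prob p_con per active neighbour; active risks recover
# with prob p_rec. Parameters and topology are illustrative only.
import random

random.seed(1)
edges = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
active = {0: True, 1: False, 2: False, 3: False}
p_int, p_con, p_rec = 0.01, 0.15, 0.10

for step in range(50):
    nxt = {}
    for node, neigh in edges.items():
        if active[node]:
            nxt[node] = random.random() > p_rec   # stays active unless it recovers
        else:
            k = sum(active[n] for n in neigh)      # active neighbours
            p_fail = 1 - (1 - p_int) * (1 - p_con) ** k
            nxt[node] = random.random() < p_fail
    active = nxt
print("active risks at end:", [n for n, a in active.items() if a])
```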
Accident hazard evaluation and control decisions on forested recreation sites
Lee A. Paine
1971-01-01
Accident hazard associated with trees on recreation sites is inherently concerned with probabilities. The major factors include the probabilities of mechanical failure and of target impact if failure occurs, the damage potential of the failure, and the target value. Hazard may be evaluated as the product of these factors; i.e., expected loss during the current...
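A worked example of the hazard-as-a-product idea, with invented numbers: expected loss is the product of failure probability, the probability of target impact given failure, the damage potential, and the target value.

```python
# Worked example of the hazard-as-product idea: expected loss is the
# product of failure probability, probability of hitting a target given
# failure, damage potential, and target value. Numbers are illustrative.
p_failure = 0.02       # annual probability of mechanical (tree) failure
p_impact = 0.10        # probability a failure strikes an occupied target
damage_fraction = 0.8  # fraction of target value lost in a typical strike
target_value = 50_000  # dollars

expected_loss = p_failure * p_impact * damage_fraction * target_value
print(f"expected annual loss: ${expected_loss:,.2f}")  # $80.00
```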
NASA Astrophysics Data System (ADS)
Schwarz, W.; Schwub, S.; Quering, K.; Wiedmann, D.; Höppel, H. W.; Göken, M.
2011-09-01
During their operational lifetime, actively cooled liners of cryogenic combustion chambers are known to exhibit a characteristic so-called doghouse deformation, followed by the formation of axial cracks. The present work aims at developing a model that quantitatively accounts for this failure mechanism. High-temperature material behaviour is characterised in a test programme, and it is shown that stress relaxation, strain rate dependence, isotropic and kinematic hardening, as well as material ageing have to be taken into account in the model formulation. From fracture surface analyses of a thrust chamber it is concluded that the failure mode of the hot wall ligament at the tip of the doghouse is related to ductile rupture. A material model is proposed that captures all stated effects. Based on the concept of continuum damage mechanics, the model is further extended to incorporate softening effects due to material degradation. The model is assessed on experimental data and quantitative agreement is established for all tests available. A 3D finite element thermo-mechanical analysis is performed on a representative thrust chamber applying the developed material-damage model. The simulation successfully captures the observed accrued thinning of the hot wall and quantitatively reproduces the doghouse deformation.
Fang, Zhong; Giambini, Hugo; Zeng, Heng; Camp, Jon J.; Dadsetan, Mahrokh; Robb, Richard A.; An, Kai-Nan; Yaszemski, Michael J.
2014-01-01
A novel biodegradable copolymer, poly(propylene fumarate-co-caprolactone) [P(PF-co-CL)], has been developed in our laboratory as an injectable scaffold for bone defect repair. In the current study, we evaluated the ability of P(PF-co-CL) to reconstitute the load-bearing capacity of vertebral bodies with lytic lesions. Forty vertebral bodies from four fresh-frozen cadaveric thoracolumbar spines were used for this study. They were randomly divided into four groups: intact vertebral body (intact control), simulated defect without treatment (negative control), defect treated with P(PF-co-CL) (copolymer group), and defect treated with poly(methyl methacrylate) (PMMA group). Simulated metastatic lytic defects were made by removing a central core of the trabecular bone in each vertebral body with an approximate volume of 25% through an access hole in the side of the vertebrae. Defects were then filled by injecting either P(PF-co-CL) or PMMA in situ crosslinkable formulations. After the spines were imaged with quantitative computerized tomography, single vertebral body segments were harvested for mechanical testing. Specimens were compressed until failure or to 25% reduction in body height and ultimate strength and elastic modulus of each specimen were then calculated from the force–displacement data. The average failure strength of the copolymer group was 1.83 times stronger than the untreated negative group and it closely matched the intact vertebral bodies (intact control). The PMMA-treated vertebrae, however, had a failure strength 1.64 times larger compared with the intact control. The elastic modulus followed the same trend. This modulus mismatch between PMMA-treated vertebrae and the host vertebrae could potentially induce a fracture cascade and degenerative changes in adjacent intervertebral discs. In contrast, P(PF-co-CL) restored the mechanical properties of the treated segments similar to the normal, intact, vertebrae. Therefore, P(PF-co-CL) may be a suitable alternative to PMMA for vertebroplasty treatment of vertebral bodies with lytic defects. PMID:24256208
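The mechanical data reduction described, ultimate strength and elastic modulus from force-displacement records, can be sketched as follows; the specimen geometry, fit window, and all data values are illustrative assumptions, not the study's measurements.

```python
# Sketch of reducing force-displacement data to ultimate strength and
# an apparent elastic modulus, as in compression-to-failure testing.
# Specimen geometry and the fit window are illustrative assumptions.
import numpy as np

area_mm2, height_mm = 600.0, 25.0          # loaded cross-section, specimen height
force_N = np.array([0, 400, 820, 1250, 1600, 1820, 1900, 1850])
disp_mm = np.array([0.0, 0.1, 0.2, 0.3, 0.45, 0.65, 0.9, 1.1])

stress = force_N / area_mm2                # MPa (N/mm^2)
strain = disp_mm / height_mm               # dimensionless

ultimate_strength = stress.max()
# Apparent modulus: slope of the initial, roughly linear region.
lin = strain <= 0.012
modulus = np.polyfit(strain[lin], stress[lin], 1)[0]
print(f"ultimate strength = {ultimate_strength:.2f} MPa, E = {modulus:.0f} MPa")
```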
Application of Laser Based Ultrasound for NDE of Damage in Thick Stitched Composites
NASA Technical Reports Server (NTRS)
Anastasi, Robert F.; Friedman, Adam D.; Hinders, Mark K.; Madaras, Eric I.
1997-01-01
As design engineers implement new composite systems such as thick, load-bearing composite structures, they must have certifiable confidence in the structure's durability and worthiness. This confidence builds from understanding the structural response and failure characteristics of simple components loaded in testing machines to tests on full-scale sections. Nondestructive evaluation is an important element which can provide quantitative information on the damage initiation, propagation, and final failure modes of composite structural components. Although ultrasound is generally accepted as a test method, the use of conventional ultrasound for in-situ monitoring of damage during tests of large structures is not practical. The use of lasers to both generate and detect ultrasound extends the application of ultrasound to in-situ sensing of damage in a deformed structure remotely and in a non-contact manner. The goal of the present research is to utilize this technology to monitor damage progression during testing. The present paper describes the application of laser based ultrasound to quantify damage in thick stitched composite structural elements to demonstrate the method. This method involves using a Q-switched laser to generate a rapid, local linear thermal strain on the surface of the structure. This local strain causes the generation of ultrasonic waves into the material. A second laser used with a Fabry-Perot interferometer detects the surface deflections. The use of fiber optics provides for eye safety and a convenient method of delivering the laser over long distances to the specimens. The material for these structural elements is composed of several stacks of composite material assembled by stitching through the laminate thickness, which ranges from 0.5 to 0.8 inches. The specimens used for these nondestructive evaluation studies had either impact damage or skin/stiffener interlaminar failure. Although little or no visible surface damage existed, internal damage was detected by laser based ultrasound.
USAF Evaluation of an Automated Adaptive Flight Training System
1975-10-01
system. C. What is the most effective way to utilize the system in operational training? Student opinion for this question is equally divided... None; utility hydraulic failure; flap failure; left engine failure; right engine failure; stab aug failure; no-gyro approach procedure, no MIDI
Environmental control system transducer development study
NASA Technical Reports Server (NTRS)
Brudnicki, M. J.
1973-01-01
A failure evaluation of the transducers used in the environmental control systems of the Apollo command service module, lunar module, and portable life support system is presented in matrix form for several generic categories of transducers to enable identification of chronic failure modes. Transducer vendors were contacted and asked to supply detailed information. The evaluation data generated for each category of transducer were compiled and published in failure design evaluation reports. The evaluation reports also present a review of the failure and design data for the transducers and suggest both design criteria to improve reliability of the transducers and, where necessary, design concepts for required redesign of the transducers. Remedial designs were implemented on a family of pressure transducers and on the oxygen flow transducer. The design concepts were subjected to analysis, breadboard fabrication, and verification testing.
NESTEM-QRAS: A Tool for Estimating Probability of Failure
NASA Technical Reports Server (NTRS)
Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.
2002-01-01
An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.
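NESTEM's probabilistic formulation is not reproduced here, but the underlying idea of estimating a component's probability of failure under uncertain loading can be sketched with a generic Monte Carlo stress-strength model; the distributions and parameters below are illustrative.

```python
# Generic Monte Carlo sketch of component probability-of-failure
# estimation under uncertain load and strength (a stress-strength
# model). Distributions and parameters are illustrative; they are not
# the NESTEM formulation.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
strength = rng.normal(500.0, 40.0, n)   # material strength, MPa
load = rng.lognormal(mean=np.log(350.0), sigma=0.15, size=n)  # applied stress

pof = np.mean(load >= strength)
print(f"estimated probability of failure: {pof:.2e}")
```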
Evaluation of lightweight material concepts for aircraft turbine engine rotor failure protection
DOT National Transportation Integrated Search
1997-07-01
Results of the evaluation of lightweight materials for aircraft turbine engine rotor failure protection are presented in this report. The program consisted of two phases. Phase 1 was an evaluation of a group of composite materials which could possibl...
Jolly, Sanjit S; Shenkman, Heather; Brieger, David; Fox, Keith A; Yan, Andrew T; Eagle, Kim A; Steg, P Gabriel; Lim, Ki-Dong; Quill, Ann; Goodman, Shaun G
2011-02-01
The objective of this study was to determine if the extent of quantitative troponin elevation predicted mortality as well as in-hospital complications of cardiac arrest, new heart failure and cardiogenic shock. 16,318 patients with non-ST-segment elevation acute coronary syndromes (NSTE ACS) from the Global Registry of Acute Coronary Events (GRACE) were included. The maximum 24 h troponin value as a multiple of the local laboratory upper limit of normal was used. The population was divided into five groups based on the degree of troponin elevation, and outcomes were compared. An adjusted analysis was performed using quantitative troponin as a continuous variable with adjustment for known prognostic variables. For each approximate 10-fold increase in the troponin ratio, there was an associated increase in cardiac arrest, sustained ventricular tachycardia (VT) or ventricular fibrillation (VF) (1.0, 2.4, 3.4, 5.9 and 13.4%; p<0.001 for linear trend), cardiogenic shock (0.5, 1.4, 2.0, 4.4 and 12.7%; p<0.001), new heart failure (2.5, 5.1, 7.4, 11.6 and 15.8%; p<0.001) and mortality (0.8, 2.2, 3.0, 5.3 and 14.0%; p<0.001). These findings were replicated using the troponin ratio as a continuous variable and adjusting for covariates (cardiac arrest, sustained VT or VF, OR 1.56, 95% CI 1.39 to 1.74; cardiogenic shock, OR 1.87, 95% CI 1.61 to 2.18; and new heart failure, OR 1.57, 95% CI 1.45 to 1.71). The degree of troponin elevation was predictive of early mortality (HR 1.61, 95% CI 1.44 to 1.81; p<0.001 for days 0-14) and longer term mortality (HR 1.18, 95% CI 1.07 to 1.30, p=0.001 for days 15-180). The extent of troponin elevation is an independent predictor of morbidity and mortality.
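A sketch of the kind of adjusted analysis described: logistic regression with the log-transformed troponin ratio as a continuous predictor, so that the exponentiated coefficient is an odds ratio per 10-fold increase. The data are simulated and the covariate set is a placeholder, not the GRACE dataset.

```python
# Sketch of the kind of adjusted analysis described: logistic regression
# with log10(troponin ratio) as a continuous predictor, so exp(coef) is
# the odds ratio per 10-fold troponin increase. Synthetic data; variable
# names and covariates are illustrative, not the GRACE dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
log_trop = rng.normal(0.5, 0.8, n)           # log10 of troponin ratio
age = rng.normal(65.0, 10.0, n)
logit = -6.0 + 0.45 * log_trop + 0.05 * age  # true model for simulation
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = sm.add_constant(np.column_stack([log_trop, age]))
fit = sm.Logit(y.astype(float), X).fit(disp=False)
print("OR per 10-fold troponin increase:", np.exp(fit.params[1]).round(2))
```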
A Study on the Effects of Ball Defects on the Fatigue Life in Hybrid Bearings
NASA Technical Reports Server (NTRS)
Tang, Ching-Yao; Foerster, Chad E.; O'Brien, Michael J.; Hardy, Brian S.; Goyal, Vinay K.; Nelson, Benjamin A.; Robinson, Ernest Y.; Ward, Peter C.; Hilton, Michael R.
2014-01-01
Hybrid ball bearings using silicon nitride ceramic balls with steel rings are increasingly being used in space mechanism applications due to their high wear resistance and long rolling contact fatigue life. However, qualitative and quantitative reports of the effects of ball defects that cause early fatigue failure are rare. We report on our approach to study these effects. Our strategy includes characterization of defects encountered in use, generation of similar defects in a laboratory setting, execution of full-scale bearing tests to obtain lifetimes, post-test characterization, and related finite-element modeling to understand the stress concentration of these defects. We have confirmed that at least one type of defect of appropriate size can significantly reduce fatigue life. Our method can be used to evaluate other defects as they occur or are encountered.
Defining Aggressive Prostate Cancer Using a 12-Gene Model
Riva, Alberto; Kim, Robert; Varambally, Sooryanarayana; He, Le; Kutok, Jeff; Aster, Jonathan C; Tang, Jeffery; Kuefer, Rainer; Hofer, Matthias D; Febbo, Phillip G; Chinnaiyan, Arul M; Rubin, Mark A
2006-01-01
Abstract The critical clinical question in prostate cancer research is: How do we develop means of distinguishing aggressive disease from indolent disease? Using a combination of proteomic and expression array data, we identified a set of 36 genes with concordant dysregulation of protein products that could be evaluated in situ by quantitative immunohistochemistry. Another five prostate cancer biomarkers were also included. Using linear discriminant analysis, we determined that the optimal model used to predict prostate cancer progression consisted of 12 proteins. Using a separate patient population, transcriptional levels of the 12 genes encoding for these proteins predicted prostate-specific antigen failure in 79 men following surgery for clinically localized prostate cancer (P = .0015). This study demonstrates that cross-platform models can lead to predictive models with the possible advantage of being more robust through this selection process. PMID:16533427
A novel rotometer based on a RISC microcontroller.
Heredia-López, F J; Bata-García, J L; Alvarez-Cervera, F J; Góngora-Alfaro, J L
2002-08-01
A new, low-cost rotometer, based on a reduced instruction set computer (RISC) microcontroller, is presented. Like earlier devices, it counts the number and direction of full turns for predetermined time periods during the evaluation of turning behavior induced by drug administration in rats. The present stand-alone system includes a nonvolatile memory for long-term data storage and a serial port for data transmission. It also contains a display for monitoring the experiments and has battery backup to avoid interruptions owing to power failures. A high correlation was found (r > .988, p < 2 × 10⁻¹⁴) between the counts of the rotometer and those of two trained observers. The system reflects quantitative differences in turning behavior owing to pharmacological manipulations. It provides the most common counting parameters and is inexpensive, flexible, highly reliable, and completely portable (weight including batteries, 159 g).
NASA Technical Reports Server (NTRS)
Lintott, J.; Costello, M. J.
1977-01-01
A system for quantitating the cardiac electrical activity of Skylab crewmen was required for three medical experiments (M092, Lower Body Negative Pressure; M171, Metabolic Activity; and M093, In-flight Vectorcardiogram) designed to evaluate the effects of space flight on the human cardiovascular system. A Frank lead vectorcardiograph system was chosen for this task because of its general acceptability in the scientific community and its data quantification capabilities. To be used effectively in space flight, however, the system had to meet certain other requirements. The system was required to meet the specifications recommended by the American Heart Association. The vectorcardiograph had to withstand the extreme conditions of the space environment. The system had to provide features that permitted ease of use in the orbital environment. The vectorcardiograph system performed its intended function throughout all the Skylab missions without a failure. A description of this system follows.
Failure tolerance strategy of space manipulator for large load carrying tasks
NASA Astrophysics Data System (ADS)
Chen, Gang; Yuan, Bonan; Jia, Qingxuan; Sun, Hanxu; Guo, Wen
2018-07-01
During the execution of large load carrying tasks in long-term service, there is a notable risk of a space manipulator suffering locked-joint failure; the space manipulator should therefore have sufficient failure tolerance. This paper investigates the evaluation of failure tolerance performance and the re-planning of feasible task trajectories for a space manipulator performing large load carrying tasks. The effects of locked-joint failure on critical performance (reachability and load carrying capacity) of the space manipulator are analyzed first. According to the requirements of load carrying tasks, we further propose a new concept of failure tolerance workspace with load carrying capacity (FTWLCC) to evaluate failure tolerance performance, and improve the classic A* algorithm to search for a feasible task trajectory. Through the normalized FTWLCC and the improved A* algorithm, the reachability and load carrying capacity of the degraded space manipulator are evaluated, and a reachable and capable trajectory can be obtained. The establishment of the FTWLCC provides a novel idea that combines mathematical statistics with failure tolerance performance to illustrate the distribution of load carrying capacity in three-dimensional space, so multiple performance indices can be analyzed simultaneously and visually. The full consideration of all possible failure situations and motion states makes the FTWLCC and the improved A* algorithm universal and effective enough to be appropriate for random joint failure and a variety of requirements of large load carrying tasks, so they can be extended to other types of manipulators.
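The paper's FTWLCC-based improvements are not reproduced here, but the classic A* search they build on can be sketched on a grid, with an infeasible-cell set standing in for configurations that fail the reachability/load-capacity screening; all names and the grid itself are illustrative.

```python
# Plain A* on a grid, with an `infeasible` set standing in for cells
# that fail the reachability/load-capacity check (the paper's FTWLCC
# screening is far richer; this is only the search skeleton).
import heapq

def astar(start, goal, infeasible, w, h):
    def hcost(a):  # Manhattan distance heuristic
        return abs(a[0] - goal[0]) + abs(a[1] - goal[1])
    openq = [(hcost(start), 0, start, None)]
    came, gbest = {}, {start: 0}
    while openq:
        _, g, cur, parent = heapq.heappop(openq)
        if cur in came:
            continue
        came[cur] = parent
        if cur == goal:  # reconstruct path
            path = [cur]
            while came[path[-1]] is not None:
                path.append(came[path[-1]])
            return path[::-1]
        x, y = cur
        for nxt in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
            if 0 <= nxt[0] < w and 0 <= nxt[1] < h and nxt not in infeasible:
                ng = g + 1
                if ng < gbest.get(nxt, float("inf")):
                    gbest[nxt] = ng
                    heapq.heappush(openq, (ng + hcost(nxt), ng, nxt, cur))
    return None  # no feasible trajectory

print(astar((0, 0), (4, 4), {(1, 1), (2, 1), (3, 1)}, 5, 5))
```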
The role of quantitative safety evaluation in regulatory decision making of drugs.
Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat
2016-01-01
Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.
Roberts, Shelley; McInnes, Elizabeth; Bucknall, Tracey; Wallis, Marianne; Banks, Merrilyn; Chaboyer, Wendy
2017-02-13
As pressure ulcers contribute to significant patient burden and increased health care costs, their prevention is a clinical priority. Our team developed and tested a complex intervention, a pressure ulcer prevention care bundle promoting patient participation in care, in a cluster-randomised trial. The UK Medical Research Council recommends process evaluation of complex interventions to provide insight into why they work or fail and how they might be improved. This study aimed to evaluate processes underpinning implementation of the intervention and explore end-users' perceptions of it, in order to give a deeper understanding of its effects. A pre-specified, mixed-methods process evaluation was conducted as an adjunct to the main trial, guided by a framework for process evaluation of cluster-randomised trials. Data were collected across eight Australian hospitals but mainly focused on the four intervention hospitals. Quantitative and qualitative data were collected across the evaluation domains: recruitment, reach, intervention delivery and response to intervention, at both cluster and individual patient level. Quantitative data were analysed using descriptive and inferential statistics. Qualitative data were analysed using thematic analysis. In the context of the main trial, which found a 42% reduction in risk of pressure ulcer with the intervention that was not significant after adjusting for clustering and covariates, this process evaluation provides important insights. Recruitment and reach among clusters and individuals were high, indicating that patients, nurses and hospitals are willing to engage with a pressure ulcer prevention care bundle. Of 799 intervention patients in the trial, 96.7% received the intervention, which took under 10 min to deliver. Patients and nurses accepted the care bundle, recognising its benefits and describing how it enabled participation in pressure ulcer prevention (PUP) care. This process evaluation found no major failures relating to implementation of the intervention. The care bundle was found to be easy to understand and deliver; it reached a large proportion of the target population and was acceptable to patients and nurses. Therefore, it may be an effective way of engaging patients in their pressure ulcer prevention care and promoting evidence-based practice.
Significance of Sarcopenia Evaluation in Acute Decompensated Heart Failure.
Tsuchida, Keiichi; Fujihara, Yuki; Hiroki, Jiro; Hakamata, Takahiro; Sakai, Ryohei; Nishida, Kota; Sudo, Koji; Tanaka, Komei; Hosaka, Yukio; Takahashi, Kazuyoshi; Oda, Hirotaka
2018-01-27
In patients with chronic heart failure (HF), the clinical importance of sarcopenia has been recognized in relation to disease severity, reduced exercise capacity, and adverse clinical outcome. Nevertheless, its impact on acute decompensated heart failure (ADHF) is still poorly understood. Dual-energy X-ray absorptiometry (DXA) is a technique for quantitatively analyzing muscle mass and the degree of sarcopenia. Fat-free mass index (FFMI) is a noninvasive and easily applicable marker of muscle mass. This was a prospective observational cohort study comprising 38 consecutive patients hospitalized for ADHF. Sarcopenia, derived from DXA, was defined as a skeletal muscle mass index (SMI) two standard deviations below the mean for healthy young subjects. FFMI (kg/m²) was calculated as 7.38 + 0.02908 × urinary creatinine (mg/day) divided by the square of height (m²). Sarcopenia was present in 52.6% of study patients. B-type natriuretic peptide (BNP) levels were significantly higher in ADHF patients with sarcopenia than in those without sarcopenia (1666 versus 429 pg/mL, P < 0.0001). Receiver operating characteristic curves were used to compare the predictive accuracy of SMI and FFMI for higher BNP levels. Areas under the curve for SMI and FFMI were 0.743 and 0.717, respectively. Multiple logistic regression analysis showed sarcopenia to be a predictor of higher BNP level (OR = 18.4; 95% CI, 1.86-181.27; P = 0.013). Sarcopenia is associated with increased disease severity in ADHF. SMI based on DXA is potentially superior to FFMI in terms of predicting the degree of severity, but FFMI is also associated with ADHF severity.
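The FFMI formula quoted above can be applied directly; the sketch below assumes the standard grouping of the creatinine-based fat-free mass estimate (7.38 + 0.02908 × creatinine) divided by height squared, with illustrative input values.

```python
# Direct implementation of the FFMI formula quoted above, assuming the
# whole creatinine-based fat-free mass estimate is divided by height^2:
# FFMI (kg/m^2) = (7.38 + 0.02908 * urinary creatinine [mg/day]) / height^2 [m^2]
def ffmi(urinary_creatinine_mg_day: float, height_m: float) -> float:
    return (7.38 + 0.02908 * urinary_creatinine_mg_day) / height_m ** 2

print(f"FFMI = {ffmi(1200.0, 1.65):.1f} kg/m^2")  # illustrative values
```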
Lay Consultations in Heart Failure Symptom Evaluation.
Reeder, Katherine M; Sims, Jessica L; Ercole, Patrick M; Shetty, Shivan S; Wallendorf, Michael
2017-01-01
Lay consultations can facilitate or impede healthcare. However, little is known about how lay consultations for symptom evaluation affect treatment decision-making. The purpose of this study was to explore the role of lay consultations in symptom evaluation prior to hospitalization among patients with heart failure. Semi-structured interviews were conducted with 60 patients hospitalized for acute decompensated heart failure. Chi-square and Fisher's exact tests, along with logistic regression were used to characterize lay consultations in this sample. A large proportion of patients engaged in lay consultations for symptom evaluation and decision-making before hospitalization. Lay consultants provided attributions and advice and helped make the decision to seek medical care. Men consulted more often with their spouse than women, while women more often consulted with adult children. Findings have implications for optimizing heart failure self-management interventions, improving outcomes, and reducing hospital readmissions.
NASA Technical Reports Server (NTRS)
Kalinowski, Kevin F.; Tucker, George E.; Moralez, Ernesto, III
2006-01-01
Engineering development and qualification of a Research Flight Control System (RFCS) for the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) JUH-60A has motivated the development of a pilot rating scale for evaluating failure transients in fly-by-wire flight control systems. The RASCAL RFCS includes a highly-reliable, dual-channel Servo Control Unit (SCU) to command and monitor the performance of the fly-by-wire actuators and protect against the effects of erroneous commands from the flexible, but single-thread Flight Control Computer. During the design phase of the RFCS, two piloted simulations were conducted on the Ames Research Center Vertical Motion Simulator (VMS) to help define the required performance characteristics of the safety monitoring algorithms in the SCU. Simulated failures, including hard-over and slow-over commands, were injected into the command path, and the aircraft response and safety monitor performance were evaluated. A subjective Failure/Recovery Rating (F/RR) scale was developed as a means of quantifying the effects of the injected failures on the aircraft state and the degree of pilot effort required to safely recover the aircraft. A brief evaluation of the rating scale was also conducted on the Army/NASA CH-47B variable stability helicopter to confirm that the rating scale was likely to be equally applicable to in-flight evaluations. Following the initial research flight qualification of the RFCS in 2002, a flight test effort was begun to validate the performance of the safety monitors and to validate their design for the safe conduct of research flight testing. Simulated failures were injected into the SCU, and the F/RR scale was applied to assess the results. The results validate the performance of the monitors, and indicate that the Failure/Recovery Rating scale is a very useful tool for evaluating failure transients in fly-by-wire flight control systems.
Fluid removal in acute heart failure: diuretics versus devices.
Krishnamoorthy, Arun; Felker, G Michael
2014-10-01
Fluid removal and relief of congestion are central to treatment of acute heart failure. Diuretics have been the decongestive mainstay but their known limitations have led to the exploration of alternative strategies. This review compares diuretics with ultrafiltration and examines the recent evidence evaluating their use. Relevant recent studies are the Diuretic Optimization Strategies Evaluation trial (of diuretics) and the Cardiorenal Rescue Study in Acute Decompensated Heart Failure (of ultrafiltration). The Diuretic Optimization Strategies Evaluation study evaluated strategies of loop diuretic use during acute heart failure (continuous infusion versus intermittent bolus and high dose versus low dose). After 72 h, there was no significant difference with either comparison for the coprimary end points. Patients treated with a high-dose strategy tended to have greater diuresis and more decongestion compared with low-dose therapy, at the cost of transient changes in renal function. The Cardiorenal Rescue Study in Acute Decompensated Heart Failure study showed that in acute heart failure patients with persistent congestion and worsening renal function, ultrafiltration, as compared with a medical therapy, was associated with similar weight loss but greater increase in serum creatinine and more adverse events. Decongestion remains a major challenge in acute heart failure. Although recent studies provide useful data to guide practice, the relatively poor outcomes point to the continued need to identify better strategies for safe and effective decongestion.
Lifetime evaluation of large format CMOS mixed signal infrared devices
NASA Astrophysics Data System (ADS)
Linder, A.; Glines, Eddie
2015-09-01
New large-scale foundry processes continue to produce reliable products, and large-format devices built on them are screened for failure mechanisms and validated for long lifetime using industry best practice. Failure-in-Time (FIT) analysis, in conjunction with foundry qualification information, can be used to evaluate large-format device lifetimes. This analysis is a helpful tool when zero-failure life tests are typical. The reliability of the device is estimated by applying the failure rate to the use conditions. JEDEC publications remain the industry-accepted methods.
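A common way to turn a zero-failure life test into an upper-bound failure rate (and hence a FIT figure) is the chi-square bound; the acceleration factor, sample size, and confidence level below are illustrative assumptions.

```python
# Sketch of a Failure-in-Time (FIT) estimate from a zero-failure life
# test: the upper-bound failure rate at confidence CL uses the chi-square
# distribution, lambda <= chi2(CL, 2*(f+1)) / (2 * device_hours). The
# acceleration factor and test size here are illustrative assumptions.
from scipy.stats import chi2

devices, test_hours, failures = 100, 1000.0, 0
accel_factor = 50.0          # stress-to-use acceleration (assumed)
cl = 0.60                    # confidence level

equiv_hours = devices * test_hours * accel_factor
lam_upper = chi2.ppf(cl, 2 * (failures + 1)) / (2.0 * equiv_hours)
print(f"upper-bound failure rate: {lam_upper * 1e9:.1f} FIT at {cl:.0%} CL")
```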
History.edu: Essays on Teaching with Technology.
ERIC Educational Resources Information Center
Trinkle, Dennis A., Ed.; Merriman, Scott A., Ed.
Intended to be equally useful to high school and college instructors, this book contains studies in history pedagogy, among them the first three published essays measuring qualitatively and quantitatively the successes and failures of "e-teaching" and distance learning. Collectively, the essays urge instructors to take the next step with…
Quantitative PCR Analysis of Laryngeal Muscle Fiber Types
ERIC Educational Resources Information Center
Van Daele, Douglas J.
2010-01-01
Voice and swallowing dysfunction as a result of recurrent laryngeal nerve paralysis can be improved with vocal fold injections or laryngeal framework surgery. However, denervation atrophy can cause late-term clinical failure. A major determinant of skeletal muscle physiology is myosin heavy chain (MyHC) expression, and previous protein analyses…
Real-time adjustment of ventricular restraint therapy in heart failure.
Ghanta, Ravi K; Lee, Lawrence S; Umakanthan, Ramanan; Laurence, Rita G; Fox, John A; Bolman, Ralph Morton; Cohn, Lawrence H; Chen, Frederick Y
2008-12-01
Current ventricular restraint devices do not allow for either the measurement or adjustment of ventricular restraint level. Periodic adjustment of restraint level post-device implantation may improve therapeutic efficacy. We evaluated the feasibility of an adjustable quantitative ventricular restraint (QVR) technique utilizing a fluid-filled polyurethane epicardial balloon to measure and adjust restraint level post-implantation guided by physiologic parameters. QVR balloons were implanted in nine ovine with post-infarction dilated heart failure. Restraint level was defined by the maximum restraint pressure applied by the balloon to the epicardium at end-diastole. An access line connected the balloon lumen to a subcutaneous portacath to allow percutaneous access. Restraint level was adjusted while left ventricular (LV) end-diastolic volume (EDV) and cardiac output was assessed with simultaneous transthoracic echocardiography. All nine ovine successfully underwent QVR balloon implantation. Post-implantation, restraint level could be measured percutaneously in real-time and dynamically adjusted by instillation and withdrawal of fluid from the balloon lumen. Using simultaneous echocardiography, restraint level could be adjusted based on LV EDV and cardiac output. After QVR therapy for 21 days, LV EDV decreased from 133+/-15 ml to 113+/-17 ml (p<0.05). QVR permits real-time measurement and physiologic adjustment of ventricular restraint therapy after device implantation.
Clinical assessment of social cognitive function in neurological disorders.
Henry, Julie D; von Hippel, William; Molenberghs, Pascal; Lee, Teresa; Sachdev, Perminder S
2016-01-01
Social cognition broadly refers to the processing of social information in the brain that underlies abilities such as the detection of others' emotions and responding appropriately to these emotions. Social cognitive skills are critical for successful communication and, consequently, mental health and wellbeing. Disturbances of social cognition are early and salient features of many neuropsychiatric, neurodevelopmental and neurodegenerative disorders, and often occur after acute brain injury. Its assessment in the clinic is, therefore, of paramount importance. Indeed, the most recent edition of the American Psychiatric Association's Diagnostic and Statistical Manual for Mental Disorders (DSM-5) introduced social cognition as one of six core components of neurocognitive function, alongside memory and executive control. Failures of social cognition most often present as poor theory of mind, reduced affective empathy, impaired social perception or abnormal social behaviour. Standard neuropsychological assessments lack the precision and sensitivity needed to adequately inform treatment of these failures. In this Review, we present appropriate methods of assessment for each of the four domains, using an example disorder to illustrate the value of these approaches. We discuss the clinical applications of testing for social cognitive function, and finally suggest a five-step algorithm for the evaluation and treatment of impairments, providing quantitative evidence to guide the selection of social cognitive measures in clinical practice.
Evaluation of the fuselage lap joint fatigue and terminating action repair
NASA Technical Reports Server (NTRS)
Samavedam, Gopal; Thomson, Douglas; Jeong, David Y.
1994-01-01
Terminating action is a remedial repair which entails the replacement of shear head countersunk rivets with universal head rivets that have a larger shank diameter. The procedure was developed to eliminate the risk of widespread fatigue damage (WFD) in the upper rivet row of a fuselage lap joint. A test and evaluation program has been conducted by Foster-Miller, Inc. (FMI) to evaluate the terminating action repair of the upper rivet row of a commercial aircraft fuselage lap splice. Two full-scale fatigue tests were conducted on fuselage panels to study the growth of fatigue cracks in the lap joint. The second test was performed to evaluate the effectiveness of the terminating action repair. In both tests, cyclic pressurization loading was applied to the panels while crack propagation was recorded at all rivet locations at regular intervals to generate detailed data on conditions of fatigue crack initiation, ligament link-up, and fuselage fracture. This program demonstrated that the terminating action repair substantially increases the fatigue life of a fuselage panel structure and effectively eliminates the occurrence of cracking in the upper rivet row of the lap joint. While high-cycle crack growth was recorded in the middle rivet row during the second test, failure was not imminent when the test was terminated after cycling to well beyond the service life. The program also demonstrated that the initiation, propagation, and linkup of WFD in full-scale fuselage structures can be simulated and quantitatively studied in the laboratory. This paper presents an overview of the testing program and provides a detailed discussion of the data analysis and results. Crack distribution and propagation rates and directions, as well as frequency of cracking, are presented for both tests. The progression of damage to linkup of adjacent cracks and to eventual overall panel failure is discussed. In addition, an assessment of the effectiveness of the terminating action repair and the occurrence of cracking in the middle rivet row is provided, and conclusions of practical interest are drawn.
PREDICE score as a predictor of 90 days mortality in patients with heart failure
NASA Astrophysics Data System (ADS)
Purba, D. P. S.; Hasan, R.
2018-03-01
Hospitalization in chronic heart failure patients is associated with high mortality and morbidity. The 90-day post-discharge period following hospitalization in heart failure patients is known as the vulnerable phase and carries a high risk of poor outcomes. Identification of high-risk individuals by prognostic evaluation is intended to enable closer and more intensive follow-up to decrease the morbidity and mortality of heart failure. To determine whether the PREDICE score could predict mortality within 90 days in patients with heart failure, we conducted an observational cohort study in patients hospitalized for worsening chronic heart failure. Patients were followed up for up to 90 days after initial evaluation, with death as the primary endpoint. We found a statistically significant difference in PREDICE score between the survival and mortality groups (p = 0.001), with a value of 84% (95% CI: 60.9% - 97.4%). In conclusion, the PREDICE score has a good ability to predict mortality within 90 days in patients with heart failure.
NASA Technical Reports Server (NTRS)
Weiss, Jerold L.; Hsu, John Y.
1986-01-01
The use of a decentralized approach to failure detection and isolation for use in restructurable control systems is examined. This work has produced: (1) A method for evaluating fundamental limits to FDI performance; (2) Application using flight recorded data; (3) A working control element FDI system with maximal sensitivity to critical control element failures; (4) Extensive testing on realistic simulations; and (5) A detailed design methodology involving parameter optimization (with respect to model uncertainties) and sensitivity analyses. This project has concentrated on detection and isolation of generic control element failures since these failures frequently lead to emergency conditions and since knowledge of remaining control authority is essential for control system redesign. The failures are generic in the sense that no temporal failure signature information was assumed. Thus, various forms of functional failures are treated in a unified fashion. Such a treatment results in a robust FDI system (i.e., one that covers all failure modes) but sacrifices some performance when detailed failure signature information is known, useful, and employed properly. It was assumed throughout that all sensors are validated (i.e., contain only in-spec errors) and that only the first failure of a single control element needs to be detected and isolated. The FDI system which has been developed will handle a class of multiple failures.
Evaluation of a Multi-Axial, Temperature, and Time Dependent (MATT) Failure Model
NASA Technical Reports Server (NTRS)
Richardson, D. E.; Anderson, G. L.; Macon, D. J.; Rudolphi, Michael (Technical Monitor)
2002-01-01
To obtain a better understanding of the response of the structural adhesives used in the Space Shuttle's Reusable Solid Rocket Motor (RSRM) nozzle, an extensive effort has been conducted to characterize in detail the failure properties of these adhesives. This effort involved the development of a failure model that includes the effects of multi-axial loading, temperature, and time. An understanding of the effects of these parameters on the failure of the adhesive is crucial to understanding and predicting the safety of the RSRM nozzle. This paper documents the use of this newly developed multi-axial, temperature, and time (MATT) dependent failure model for modeling failure of the adhesives TIGA 321, EA913NA, and EA946. The development of the mathematical failure model using constant-load-rate normal and shear test data is presented. Verification of the accuracy of the failure model is shown through comparisons between predictions and measured creep and multi-axial failure data. The verification indicates that the failure model performs well for a wide range of conditions (loading, temperature, and time) for the three adhesives. The failure criterion is shown to be accurate through the glass transition for the adhesive EA946. Though this failure model has been developed and evaluated with adhesives, the concepts are applicable to other isotropic materials.
NASA Technical Reports Server (NTRS)
Glover, R. C.; Kelley, B. A.; Tischer, A. E.
1986-01-01
The results of a review of the Space Shuttle Main Engine (SSME) failure data for the period 1980 through 1983 are presented. The data was collected, evaluated, and ranked according to procedures established during this study. A number of conclusions and recommendations are made based upon this failure data review. The results of a state-of-the-art diagnostic survey are also presented. This survey covered a broad range of diagnostic sensors and techniques and the findings were evaluated for application to the SSME. Finally, a discussion of the initial activities for the on-going SSME diagnostic evaluation is included.
Maturo, Donna; Powell, Alexis; Major-Wilson, Hannah; Sanchez, Kenia; De Santis, Joseph P; Friedman, Lawrence B
2015-01-01
Advances in care and treatment of adolescents/young adults with HIV infection have made survival into adulthood possible, requiring transition to adult care. Researchers have documented that the transition process is challenging for adolescents/young adults. To ensure successful transition, a formal transition protocol is needed. Despite existing research, little quantitative evaluation of the transition process has been conducted. The purpose of the study was to pilot test the "Movin' Out" Transitioning Protocol, a formalized protocol developed to assist transition to adult care. A retrospective medical/nursing record review was conducted with 38 clients enrolled in the "Movin' Out" Transitioning Protocol at a university-based adolescent medicine clinic providing care to adolescents/young adults with HIV infection. Almost half of the participants were able to successfully transition to adult care. Reasons for failure to transition included relocation, attrition, loss to follow-up, and transfer to another adult service. Failure to transition to adult care was not related to adherence issues, χ²(1, N = 38) = 2.49, p = .288; substance use, χ²(1, N = 38) = 1.71, p = .474; mental health issues, χ²(1, N = 38) = 2.23, p = .322; or pregnancy/childrearing, χ²(1, N = 38) = 0.00, p = .627. Despite the small sample size, the "Movin' Out" Transitioning Protocol appears to be useful in guiding the transition process of adolescents/young adults with HIV infection to adult care. More research is needed with a larger sample to fully evaluate the "Movin' Out" Transitioning Protocol. Copyright © 2015 Elsevier Inc. All rights reserved.
Assessment of Pancreatic β-Cell Function: Review of Methods and Clinical Applications
Cersosimo, Eugenio; Solis-Herrera, Carolina; Trautmann, Michael E.; Malloy, Jaret; Triplitt, Curtis L.
2014-01-01
Type 2 diabetes mellitus (T2DM) is characterized by a progressive failure of pancreatic β-cell function (BCF) with insulin resistance. Once insulin over-secretion can no longer compensate for the degree of insulin resistance, hyperglycemia becomes clinically significant and deterioration of residual β-cell reserve accelerates. This pathophysiology has important therapeutic implications. Ideally, therapy should address the underlying pathology and should be started early along the spectrum of decreasing glucose tolerance in order to prevent or slow β-cell failure and reverse insulin resistance. The development of an optimal treatment strategy for each patient requires accurate diagnostic tools for evaluating the underlying state of glucose tolerance. This review focuses on the most widely used methods for measuring BCF within the context of insulin resistance and includes examples of their use in prediabetes and T2DM, with an emphasis on the most recent therapeutic options (dipeptidyl peptidase-4 inhibitors and glucagon-like peptide-1 receptor agonists). Methods of BCF measurement include the homeostasis model assessment (HOMA); oral glucose tolerance tests, intravenous glucose tolerance tests (IVGTT), and meal tolerance tests; and the hyperglycemic clamp procedure. To provide a meaningful evaluation of BCF, it is necessary to interpret all observations within the context of insulin resistance. Therefore, this review also discusses methods utilized to quantitate insulin-dependent glucose metabolism, such as the IVGTT and the euglycemic-hyperinsulinemic clamp procedures. In addition, an example is presented of a mathematical modeling approach that can use data from BCF measurements to develop a better understanding of BCF behavior and the overall status of glucose tolerance. PMID:24524730
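As a concrete instance of the simplest method listed, the widely used HOMA1 formulas (Matthews et al.) can be computed directly, assuming fasting glucose in mmol/L and fasting insulin in µU/mL:

```python
# The classic HOMA1 formulas referenced in such reviews (Matthews et al.),
# assuming fasting glucose in mmol/L and fasting insulin in microU/mL:
#   HOMA-IR = (glucose * insulin) / 22.5
#   HOMA-B  = (20 * insulin) / (glucose - 3.5)   # percent beta-cell function
def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    return glucose_mmol_l * insulin_uU_ml / 22.5

def homa_b(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    return 20.0 * insulin_uU_ml / (glucose_mmol_l - 3.5)

print(homa_ir(5.5, 10.0), homa_b(5.5, 10.0))  # ~2.4 and 100 for these values
```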
Facebook Facts: Breast Reconstruction Patient-Reported Outcomes Using Social Media.
Tang, Sherry Y Q; Israel, Jacqueline S; Poore, Samuel O; Afifi, Ahmed M
2018-05-01
Social media are used for information sharing among patients with similar health conditions, and analysis of social media activity could inform clinical decision-making. The aim of this study was to use Facebook to evaluate a cohort of individuals' perceptions of and satisfaction with breast reconstruction. In this observational study, the authors collected and analyzed posts pertaining to autologous and implant-based breast reconstruction from active Facebook groups. Patient satisfaction data were categorized, and a thematic analysis of posts was conducted. Qualitative posts were grouped based on common themes and quantitatively compared using frequency and chi-square analysis. The authors evaluated 500 posts from two Facebook groups. Two hundred sixty-four posts referenced deep inferior epigastric perforator (DIEP) flap reconstruction and 117 were related to implant-based reconstruction. Among individuals referencing DIEP flap reconstruction, 52 percent were satisfied, compared with 20 percent of individuals who referenced satisfaction with implant-based reconstruction (p < 0.0001). Individuals posting about DIEP flaps reported a higher rate of unexpected side effects (p < 0.001) and numbness (p = 0.004). When referencing implant-based reconstruction, individuals reported significantly higher rates of infection, contracture, and implant failure (p < 0.001). Based on the authors' review of social media activity, individuals undergoing DIEP flap breast reconstruction expressed relatively high individual satisfaction despite difficult postoperative recovery. Individuals who referenced implant-based reconstruction mentioned infection and implant failure, leading to high rates of dissatisfaction. Social media appear to provide informational and emotional support to patients. Plastic surgeons can use social media to gather unbiased information of patients' experience to inform clinical conversation and guide clinical practice.
Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.
1997-01-01
A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C1 plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
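Of the criteria listed, the maximum strain criterion is the simplest to state: a lamina fails when any strain component exceeds its allowable. A minimal sketch, with invented allowables and strain state:

```python
# Sketch of the maximum strain failure criterion for a single lamina:
# failure is flagged when any strain component exceeds its allowable.
# Allowables and the applied strain state are illustrative numbers.
def max_strain_failure(eps1, eps2, gamma12, allow):
    checks = {
        "fiber":  allow["e1c"] <= eps1 <= allow["e1t"],
        "matrix": allow["e2c"] <= eps2 <= allow["e2t"],
        "shear":  abs(gamma12) <= allow["g12"],
    }
    return {mode: not ok for mode, ok in checks.items()}  # True = failed

allowables = {"e1t": 0.0105, "e1c": -0.0085, "e2t": 0.0050,
              "e2c": -0.0200, "g12": 0.0150}
print(max_strain_failure(0.011, 0.002, 0.004, allowables))
# -> {'fiber': True, 'matrix': False, 'shear': False}
```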
IRAC Full-Scale Flight Testbed Capabilities
NASA Technical Reports Server (NTRS)
Lee, James A.; Pahle, Joseph; Cogan, Bruce R.; Hanson, Curtis E.; Bosworth, John T.
2009-01-01
Overview: Provide validation of adaptive control law concepts through full-scale flight evaluation in a representative avionics architecture. Develop an understanding of aircraft dynamics of current vehicles in damaged and upset conditions. Real-world conditions include: a) turbulence, sensor noise, feedback biases; and b) coupling between pilot and adaptive system. Simulated damage includes: 1) "B" matrix (surface) failures; and 2) "A" matrix failures. Evaluate robustness of control systems to anticipated and unanticipated failures.
POF-Darts: Geometric adaptive sampling for probability of failure
Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.; ...
2016-06-18
We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to failure or non-failure regions, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, regions uncovered with spheres will shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction one, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
Waldréus, Nana; Jaarsma, Tiny; van der Wal, Martje Hl; Kato, Naoko P
2018-03-01
Patients with heart failure can experience thirst distress. However, there is no instrument to measure this in patients with heart failure. The aim of the present study was to develop the Thirst Distress Scale for patients with Heart Failure (TDS-HF) and to evaluate the psychometric properties of the scale. The TDS-HF was developed to measure thirst distress in patients with heart failure. Face and content validity were confirmed using expert panels including patients and healthcare professionals. Data on the TDS-HF were collected from patients with heart failure at outpatient heart failure clinics and hospitals in Sweden, the Netherlands and Japan. Psychometric properties were evaluated using data from 256 heart failure patients (age 72±11 years). Concurrent validity of the scale was assessed using a thirst intensity visual analogue scale. Patients did not have any difficulties answering the questions, and time taken to answer the questions was about five minutes. Factor analysis of the scale showed one factor. After psychometric testing, one item was deleted. For the eight-item TDS-HF, a single factor explained 61% of the variance and Cronbach's alpha was 0.90. The eight-item TDS-HF was significantly associated with the thirst intensity score (r = 0.55, p < 0.001). Regarding test-retest reliability, the intraclass correlation coefficient was 0.88, and the weighted kappa values ranged from 0.29 to 0.60. The eight-item TDS-HF is valid and reliable for measuring thirst distress in patients with heart failure.
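Cronbach's alpha, reported above as 0.90 for the eight-item scale, is computed from the item variances and the variance of the total score; a sketch with simulated item responses (not the study's data):

```python
# Cronbach's alpha for a respondents-by-items score matrix, as used to
# report internal consistency (alpha = 0.90 for the 8-item scale). The
# toy data below are illustrative, not the study's responses.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: shape (n_respondents, n_items)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(200, 1))
items = latent + 0.6 * rng.normal(size=(200, 8))  # 8 correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")
```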
ERIC Educational Resources Information Center
Goodvin, Rebecca; Rolfson, Jacqueline
2014-01-01
Effects of feedback on children's self-evaluations are well established, yet little is known about how parents talk with children about everyday successes and failures, despite the importance of parent-child reminiscing in children's psychological understanding. We examine mothers' attributions and performance evaluations in conversations about…
NASA Astrophysics Data System (ADS)
Yang, Zhenyin
Metal-contact MEMS switches hold great promise for implementing agile radio frequency (RF) systems because of their small size, low fabrication cost, low power consumption, wide operational band, excellent isolation and exceptionally low signal insertion loss. Gold is often utilized as a contact material for metal-contact MEMS switches due to its excellent electrical conductivity and corrosion resistance. However, contact wear and stiction are the two major failure modes for these switches because of gold's softness and high surface adhesion energy. To strengthen the contact material, pure gold was alloyed with other metal elements. We designed and constructed a new micro-contacting test facility that closely mimics typical MEMS operation and utilized this facility to efficiently evaluate optimized contact materials. The Au-Ni binary alloy system was systematically investigated as a candidate contact material for MEMS switches. A correlation between contact material properties (e.g., microstructure, micro-hardness, electrical resistivity, topology, surface structures and composition) and micro-contacting performance was established. It was demonstrated that a nano-scale graded two-phase Au-Ni film could possibly yield improved device performance. Gold micro-contact degradation mechanisms were also systematically investigated by running the MEMS switching tests under a wide range of test conditions. According to our quantitative failure analysis, field evaporation could be the dominant failure mode for high-field (above a critical threshold field) hot switching; transient thermal-assisted wear could be the dominant failure mode for low-field hot switching; on the other hand, pure mechanical wear and steady current heating (1 mA) caused much less contact degradation in cold switching tests. Results from low-force (50 μN per micro-contact), low-current (0.1 mA) tests on real MEMS switches indicated that continuous adsorbed films from ambient air could degrade the switch contact resistance. Our work also contributes to the field of general nano-science and technology by resolving the transfer directionality of field evaporation of gold in atomic force microscope (AFM)/scanning tunneling microscope (STM) systems.
NASA Astrophysics Data System (ADS)
Malyshev, Mikhail; Kreimer, Johannes
2013-09-01
Safety analyses for electrical, electronic and/or programmable electronic (E/E/PE) safety-related systems used in payload applications on-board the International Space Station (ISS) are often based on failure modes, effects and criticality analysis (FMECA). For industrial applications of E/E/PE safety-related systems, comparable strategies exist and are defined in the IEC 61508 standard. This standard defines quantitative criteria based on potential failure modes (for example, the Safe Failure Fraction). These criteria can be calculated for an E/E/PE system or its components to assess their compliance with the requirements of a particular Safety Integrity Level (SIL). The standard defines several SILs depending on how much risk has to be mitigated by a safety-critical system. When a FMECA is available for an ISS payload or its subsystem, it may be possible to calculate the same or similar parameters as defined in IEC 61508. One example of a payload that has a dedicated functional safety subsystem is the Electromagnetic Levitator (EML), a high-temperature materials processing facility planned for operation on-board the ISS starting in 2014. Its dedicated subsystem, the "Hazard Control Electronics" (HCE), is implemented to ensure failure tolerance in limiting sample-processing parameters so that generation of potentially toxic by-products stays within safe limits, in line with the requirements applied to payloads by the ISS Program. The objective of this paper is to assess the implementation of the HCE in the EML against the criteria for functional safety systems in IEC 61508 and to evaluate commonalities and differences with respect to safety requirements levied on ISS payloads. An attempt is made to assess the possibility of using commercially available components and systems certified for compliance with industrial functional safety standards in ISS payloads.
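As a concrete illustration of the kind of criterion involved, the Safe Failure Fraction can be computed directly from a failure-rate partition of FMECA results. The rates below are invented; the formula follows IEC 61508's partition into safe, dangerous-detected, and dangerous-undetected failure rates.

```python
# Safe Failure Fraction from a failure-rate partition (IEC 61508 definition);
# the rates below are illustrative only.
lambda_safe = 4.0e-7   # safe failures per hour
lambda_dd = 3.0e-7     # dangerous failures detected by diagnostics
lambda_du = 1.0e-7     # dangerous failures left undetected

sff = (lambda_safe + lambda_dd) / (lambda_safe + lambda_dd + lambda_du)
print(f"Safe Failure Fraction = {sff:.1%}")  # compare against the SIL tables
```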
Sanaka, Tsutomu; Funaki, Takenori; Tanaka, Toshihisa; Hoshi, Sayako; Niwayama, Jyun; Taitoh, Takashi; Nishimura, Hideki; Higuchi, Chieko
2002-05-01
The plasma pentosidine levels in patients with renal disease were measured by a simple method established for plasma and urinary pentosidine determinations. The method, which can be completed within a few hours, involves pretreating plasma with a proteolytic enzyme (pronase) and measuring the concentration of pentosidine in the sample by ELISA using anti-pentosidine antibodies. The prepared antibodies showed no cross-reaction with the raw materials for pentosidine synthesis or with compounds having similar structures. SDS-PAGE indicated that the antibodies had a high purity. The reaction of the antibodies and keyhole limpet hemocyanin-pentosidine in the competitive ELISA system was inhibited by free pentosidine. Excellent standard curves for pentosidine determination were obtained. In actual measurements of clinical samples from patients, a good correlation (r = 0.9356) was obtained between the values measured by ELISA and HPLC. The plasma pentosidine level in patients with renal disease correlated significantly with plasma creatinine, urea nitrogen, beta2-microglobulin, and creatinine clearance, indicating its usefulness in evaluating the severity of renal disease. A significant elevation in plasma pentosidine levels was observed in mild renal dysfunction, whereas no significant increases in creatinine and urea nitrogen levels were detected, suggesting that the plasma pentosidine level is useful in the early diagnosis of incipient renal failure. In patients with chronic renal failure, no difference in plasma pentosidine levels was observed between diabetic nephropathy and chronic glomerulonephritis, while a significant correlation was observed with phosphatidylcholine hydroperoxide, suggesting the possibility that the plasma pentosidine level reflects oxidative injury. From these results, the quantitative measurement method developed by us is judged to be a superior method for measuring pentosidine in body fluids. The plasma pentosidine level may be useful for the early diagnosis of mild renal failure and for estimating the severity of renal disease. Copyright 2002 S. Karger AG, Basel
Using Seismic Signals to Forecast Volcanic Processes
NASA Astrophysics Data System (ADS)
Salvage, R.; Neuberg, J. W.
2012-04-01
Understanding the seismic signals generated during volcanic unrest has the potential to allow scientists to more accurately predict and understand active volcanoes, since these signals are intrinsically linked to rock failure at depth (Voight, 1988). In particular, low-frequency long-period signals (LP events) have been related to the movement of fluid and the brittle failure of magma at depth due to high strain rates (Hammer and Neuberg, 2009). This fundamentally relates to surface processes. However, there is currently no physical quantitative model for determining the likelihood of an eruption following precursory seismic signals, or the timing or type of eruption that will ensue (Benson et al., 2010). Since the beginning of its current eruptive phase, accelerating LP swarms (< 10 events per hour) have been a common feature at Soufriere Hills volcano, Montserrat, prior to surface expressions such as dome collapse or eruptions (Miller et al., 1998). The dynamical behaviour of such swarms can be related to accelerated magma ascent rates, since the seismicity is thought to be a consequence of magma deformation as it rises to the surface. In particular, acceleration rates can be used, in conjunction with the inverse material failure law, a linear relationship against time (Voight, 1988), to accurately predict the timing of volcanic eruptions. Currently, this has only been investigated for retrospective events (Hammer and Neuberg, 2009). The identification of LP swarms on Montserrat and analysis of their dynamical characteristics allow a better understanding of the nature of the seismic signals themselves, as well as their relationship to surface processes such as magma extrusion rates. Acceleration and deceleration rates of seismic swarms provide insights into the plumbing system of the volcano at depth. The application of the material failure law to multiple LP swarms allows a critical evaluation of the accuracy of the method, which further refines current understanding of the relationship between seismic signals and volcanic eruptions. It is hoped that such analysis will assist the development of real-time forecasting models.
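The inverse material failure law lends itself to a short worked example. In the sketch below, a synthetic swarm accelerates hyperbolically toward a hypothetical failure time, and the forecast is read off where a straight line fitted to the inverse event rate crosses zero; all numbers are invented for illustration, not taken from the Montserrat data.

```python
# Inverse-rate ("materials failure") forecast: for the common alpha = 2 case
# of Voight's relation, 1/rate decays linearly and hits zero at failure.
import numpy as np

t_f_true = 100.0                             # hypothetical failure time (hours)
t = np.linspace(0.0, 90.0, 30)               # observation times
rate = 50.0 / (t_f_true - t)                 # hyperbolically accelerating swarm
rate += np.random.default_rng(1).normal(0, 0.02, t.size)  # observational noise

inv_rate = 1.0 / rate
slope, intercept = np.polyfit(t, inv_rate, 1)   # linear fit to 1/rate vs time
t_f_forecast = -intercept / slope               # zero crossing of the fit
print(f"forecast failure time: {t_f_forecast:.1f} h (true: {t_f_true} h)")
```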
Micromechanical investigation of ductile failure in Al 5083-H116 via 3D unit cell modeling
NASA Astrophysics Data System (ADS)
Bomarito, G. F.; Warner, D. H.
2015-01-01
Ductile failure is governed by the evolution of micro-voids within a material. The micro-voids, which commonly initiate at second-phase particles within metal alloys, grow and interact with each other until failure occurs. The evolution of the micro-voids, and therefore ductile failure, depends on many parameters (e.g., stress state, temperature, strain rate, void and particle volume fraction). In this study, the stress-state dependence of the ductile failure of Al 5083-H116 is investigated by means of 3-D finite element (FE) periodic cell models. The cell models require only two pieces of information as inputs: (1) the initial particle volume fraction of the alloy and (2) the constitutive behavior of the matrix material. Based on this information, cell models are subjected to a given stress state, defined by the stress triaxiality and the Lode parameter. For each stress state, the cells are loaded in many loading orientations until failure. Material failure is assumed to occur in the weakest orientation, so the orientation in which failure occurs first is considered the critical orientation. The result is a description of material failure that is derived from basic principles and requires no fitting parameters. Subsequently, the results of the simulations are used to construct a homogenized material model, which is used in a component-scale FE model. The component-scale FE model is compared to experiments and is shown to overpredict ductility. Having excluded smaller nucleation events and load-path non-proportionality as explanations, it is concluded that accuracy could be gained by including more information about the true microstructure; incorporating such information into micromechanical models is critical to developing quantitatively accurate physics-based ductile failure models.
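For reference, the two stress-state measures used to index the cell simulations can be computed directly from principal stresses; the helper below uses the standard definitions (a sketch for orientation, not the authors' code).

```python
# Stress triaxiality and Lode parameter from principal stresses.
import numpy as np

def triaxiality_and_lode(s1, s2, s3):
    """Return (eta, L) for principal stresses ordered s1 >= s2 >= s3."""
    mean = (s1 + s2 + s3) / 3.0
    mises = np.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))
    eta = mean / mises                       # stress triaxiality
    lode = (2.0 * s2 - s1 - s3) / (s1 - s3)  # Lode parameter, in [-1, 1]
    return eta, lode

print(triaxiality_and_lode(3.0, 2.0, 1.0))   # eta ~ 1.155, L = 0.0
```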
Building a Database for a Quantitative Model
NASA Technical Reports Server (NTRS)
Kahn, C. Joseph; Kleinhammer, Roger
2014-01-01
A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not help link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database, but a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and the database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
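The linking idea is easy to mock up. The toy tables below use hypothetical field names and rates (not the authors' database) to show how a shared metadata key makes every Basic Event traceable to its source and to the manipulations behind its estimate.

```python
# A toy illustration of metadata-keyed traceability between a reliability
# model and its data sources; all names and numbers are invented.
import pandas as pd

sources = pd.DataFrame({
    "data_key":      ["VLV-001", "PMP-007"],
    "source":        ["MIL-HDBK-217F", "vendor test report"],
    "base_rate":     [2.0e-6, 5.0e-7],    # failures per hour
    "stress_factor": [1.5, 1.0],          # use/maintenance-cycle adjustment
})
sources["adjusted_rate"] = sources["base_rate"] * sources["stress_factor"]

model = pd.DataFrame({
    "basic_event": ["BE-104 valve fails closed", "BE-207 pump fails to start"],
    "data_key":    ["VLV-001", "PMP-007"],
})

# The join is what the shared metadata field buys: every Basic Event is
# traceable to its source and to the calculations behind its estimate.
print(model.merge(sources, on="data_key"))
```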
NASA Astrophysics Data System (ADS)
Wolter, Andrea; Stead, Doug; Clague, John J.
2014-02-01
The 1963 Vajont Slide in northeast Italy is an important engineering and geological event. Although the landslide has been extensively studied, new insights can be derived by applying modern techniques such as remote sensing and numerical modelling. This paper presents the first digital terrestrial photogrammetric analyses of the failure scar, landslide deposits, and the area surrounding the failure, with a focus on the scar. We processed photogrammetric models to produce discontinuity stereonets, residual maps and profiles, and slope and aspect maps, all of which provide information on the failure scar morphology. Our analyses enabled the creation of a preliminary semi-quantitative morphologic classification of the Vajont failure scar based on the large-scale tectonic folds and step-paths that define it. The analyses and morphologic classification have implications for the kinematics, dynamics, and mechanism of the slide. Metre- and decametre-scale features affected the initiation, direction, and displacement rate of sliding. The most complexly folded and stepped areas occur close to the intersection of orthogonal synclinal features related to the Dinaric and Neoalpine deformation events. Our analyses also highlight, for the first time, the evolution of the Vajont failure scar from 1963 to the present.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls, yet these sources often dominate component-level reliability or the probability of failure. While the consequence of failure is usually well understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach provides a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
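One simple way to see how a predicted rate shifts once real experience accrues (an illustration of the underestimation concern, not the paper's method) is a conjugate gamma-Poisson update of a handbook prediction with hypothetical operating hours and observed failures.

```python
# Gamma-Poisson update of a handbook-predicted failure rate; the prior,
# hours, and failure count are invented for illustration.
from scipy import stats

pred_rate = 1.0e-6                       # handbook prediction, failures/hour
alpha0, beta0 = 0.5, 0.5 / pred_rate     # weak gamma prior centred on it

failures, hours = 2, 200_000.0           # hypothetical operating experience
alpha, beta = alpha0 + failures, beta0 + hours

print(f"predicted rate : {pred_rate:.2e} /h")
print(f"posterior mean : {alpha / beta:.2e} /h")   # pulled up by the failures
print(f"95% upper bound: {stats.gamma.ppf(0.95, a=alpha, scale=1/beta):.2e} /h")
```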
Stress redistribution and damage in interconnects caused by electromigration
NASA Astrophysics Data System (ADS)
Chiras, Stefanie Ruth
Electromigration has long been recognized as a phenomenon that induces mass redistribution in metals which, when constrained, can lead to the creation of stress. Since the development of the integrated circuit, electromigration in interconnects (the metal lines that carry current between devices in integrated circuits) has become a reliability concern. The primary failure mechanism in the interconnects is usually voiding, which causes electrical resistance increases in the circuit. In some cases, however, another failure mode occurs: fracture of the surrounding dielectric driven by electromigration-induced compressive stresses within the interconnect. It is this failure mechanism that is the focus of this thesis. To study dielectric fracture, both the residual processing stresses and the development of electromigration-induced stress in isolated, constrained interconnects were measured. The high-resolution measurements were made using two types of piezospectroscopy, complemented by finite element analysis (FEA). Both procedures directly measured stress in the underlying or neighboring substrate and used FEA to determine interconnect stresses. These interconnect stresses were related to the observed circuit failure mode through post-test scanning electron microscopy and resistance measurements taken during electromigration testing. The results provide qualitative evidence of electromigration-driven passivation fracture, and quantitative analysis of the theoretical model of the failure, the "immortal" interconnect concept.
Imran, Muhammad; Zafar, Nazir Ahmad
2012-01-01
Maintaining inter-actor connectivity is extremely crucial in mission-critical applications of Wireless Sensor and Actor Networks (WSANs), as actors have to quickly plan optimal coordinated responses to detected events. Failure of a critical actor partitions the inter-actor network into disjoint segments besides leaving a coverage hole, and thus hinders the network operation. This paper presents a Partitioning detection and Connectivity Restoration (PCR) algorithm to tolerate critical actor failure. As part of pre-failure planning, PCR determines critical/non-critical actors based on localized information and designates each critical node with an appropriate backup (preferably non-critical). The pre-designated backup detects the failure of its primary actor and initiates a post-failure recovery process that may involve coordinated multi-actor relocation. To prove correctness, we construct a formal specification of PCR using Z notation. We model the WSAN topology as a dynamic graph and transform PCR into a corresponding formal specification using Z notation. The formal specification is analyzed and validated using the Z/EVES tool. Moreover, we simulate the specification to quantitatively analyze the efficiency of PCR. Simulation results confirm the effectiveness of PCR and show that it outperforms contemporary schemes found in the literature.
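In graph terms, the critical actors PCR protects are the cut vertices of the inter-actor network. The sketch below finds them with networkx's global articulation-point routine, as a stand-in for PCR's localized procedure; the topology and the backup-selection rule are invented for illustration.

```python
# Cut-vertex view of "critical actor" detection and backup designation.
import networkx as nx

G = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 2), (4, 5)])  # toy inter-actor net
critical = set(nx.articulation_points(G))
print("critical actors:", critical)       # {2, 4}: their failure partitions G

# Designate a backup for each critical actor, preferring non-critical
# neighbours (stable sort puts non-critical neighbours first).
for node in critical:
    neighbours = sorted(G.neighbors(node), key=lambda n: n in critical)
    print(f"backup for {node}: {neighbours[0]}")
```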
NASA trend analysis procedures
NASA Technical Reports Server (NTRS)
1993-01-01
This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.
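As one concrete example from the reliability category, the Laplace trend test is a standard statistic for deciding whether failures are arriving less frequently over time. The failure times below are hypothetical, and the test is offered as an illustration rather than the publication's prescribed procedure.

```python
# Laplace trend test for a time-truncated observation window [0, T].
import math

def laplace_trend(failure_times, T):
    """U < 0 suggests reliability growth (failures concentrated early)."""
    n = len(failure_times)
    return (sum(failure_times) / n - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))

times = [50, 300, 700, 1500, 2600]    # cumulative hours at each failure
print(f"Laplace U = {laplace_trend(times, T=3000.0):+.2f}")  # about -1.21
```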
Joint Multi-Leaf Segmentation, Alignment, and Tracking for Fluorescence Plant Videos.
Yin, Xi; Liu, Xiaoming; Chen, Jin; Kramer, David M
2018-06-01
This paper proposes a novel framework for fluorescence plant video processing. The plant research community is interested in the leaf-level photosynthetic analysis within a plant. A prerequisite for such analysis is to segment all leaves, estimate their structures, and track them over time. We identify this as a joint multi-leaf segmentation, alignment, and tracking problem. First, leaf segmentation and alignment are applied on the last frame of a plant video to find a number of well-aligned leaf candidates. Second, leaf tracking is applied on the remaining frames with leaf candidate transformation from the previous frame. We form two optimization problems with shared terms in their objective functions for leaf alignment and tracking respectively. A quantitative evaluation framework is formulated to evaluate the performance of our algorithm with four metrics. Two models are learned to predict the alignment accuracy and detect tracking failure respectively in order to provide guidance for subsequent plant biology analysis. The limitation of our algorithm is also studied. Experimental results show the effectiveness, efficiency, and robustness of the proposed method.
NASA Astrophysics Data System (ADS)
Steger, Stefan; Schmaltz, Elmar; Glade, Thomas
2017-04-01
Empirical landslide susceptibility maps spatially depict the areas where future slope failures are likely due to specific environmental conditions. The underlying statistical models are based on the assumption that future landsliding is likely to occur under similar circumstances (e.g. topographic conditions, lithology, land cover) as past slope failures. This principle is operationalized by applying a supervised classification approach (e.g. a regression model with a binary response: landslide presence/absence) that enables discrimination between conditions that favored past landslide occurrences and the circumstances typical for landslide absences. The derived empirical relation is then transferred to each spatial unit of an area. The literature reveals that the specific topographic conditions representative of landslide presences are frequently extracted from derivatives of digital terrain models at locations where past landslides were mapped. The underlying morphology-based landslide identification becomes possible due to the fact that the topography at a specific locality usually changes after landslide occurrence (e.g. hummocky surface, concave and steep scarp). In a strict sense, this implies that topographic predictors used within conventional statistical landslide susceptibility models relate to post-failure topographic conditions - and not to the required pre-failure situation. This study examines the assumption that models calibrated on the basis of post-failure topographies may not be appropriate to predict future landslide locations, because (i) post-failure and pre-failure topographic conditions may differ and (ii) areas where future landslides will occur do not yet exhibit such a distinct post-failure morphology. The study was conducted for an area located in the Walgau region (Vorarlberg, western Austria), where a detailed inventory of shallow landslides was available. The methodology comprised multiple systematic comparisons of models generated on the basis of post-failure conditions (i.e. the standard approach) with models based on an approximated pre-failure topography. Pre-failure topography was approximated by (i) erasing the area of mapped landslide polygons within a digital terrain model and (ii) filling these "empty" areas by interpolating elevation points located outside the mapped landslides. Landslide presence information was extracted from the respective landslide scarp locations while an equal number of randomly sampled points represented landslide absences. After an initial exploratory data analysis, mixed-effects logistic regression was applied to model landslide susceptibility on the basis of two predictor sets (post-failure versus pre-failure predictors). Furthermore, all analyses were conducted separately for five different modelling resolutions to examine the suspicion that the degree of generalization of topographic parameters may also play a role in how the respective models differ. Model evaluation was conducted by means of multiple procedures (i.e. odds ratios, k-fold cross validation, permutation-based variable importance, difference maps of predictions). The results revealed that models based on the highest resolutions (e.g. 1 m, 2.5 m) and post-failure topography performed best from a purely quantitative perspective.
A comparison of models (post-failure versus pre-failure based models) at an identical modelling resolution showed that validation results, modelled relationships and prediction patterns tended to converge with decreasing raster resolution. Based on the results, we concluded that an approximation of pre-failure topography does not significantly contribute to improved landslide susceptibility models in cases where (i) the underlying inventory consists of small landslide features and (ii) the models are based on coarse raster resolutions (e.g. 25 m). However, where modelling with high raster resolutions is envisaged (e.g. 1 m, 2.5 m), or the inventory mainly consists of larger events, a reconstruction of pre-failure conditions might be highly expedient, even though conventional validation results might indicate an opposite tendency. Finally, we recommend considering that topographic predictors highly useful for detecting past slope movements (e.g. roughness) are not necessarily valuable for predicting future slope instabilities.
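The supervised-classification core of such studies is compact enough to sketch. Below, an ordinary logistic regression is fitted to synthetic presence/absence samples with two invented terrain predictors; the study itself used mixed-effects models on real terrain derivatives, so this illustrates the principle only.

```python
# Logistic-regression core of empirical landslide susceptibility modelling,
# on synthetic data with invented predictors (slope angle, roughness).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 500
slope = np.concatenate([rng.normal(32, 6, n), rng.normal(18, 6, n)])  # degrees
rough = np.concatenate([rng.normal(0.6, 0.2, n), rng.normal(0.3, 0.2, n)])
X = np.column_stack([slope, rough])
y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = landslide presence

model = LogisticRegression().fit(X, y)
print("odds ratios (slope, roughness):", np.exp(model.coef_[0]).round(2))
print("susceptibility at slope=30, roughness=0.5:",
      model.predict_proba([[30.0, 0.5]])[0, 1].round(2))
```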
NASA Technical Reports Server (NTRS)
Bundick, W. Thomas
1990-01-01
A methodology for designing a failure detection and identification (FDI) system to detect and isolate control element failures in aircraft control systems is reviewed. An FDI system design for a modified B-737 aircraft resulting from this methodology is also reviewed, and the results of evaluating this system via simulation are presented. The FDI system performed well in a no-turbulence environment, but it experienced an unacceptable number of false alarms in atmospheric turbulence. An adaptive FDI system, which adjusts thresholds and other system parameters based on the estimated turbulence level, was developed and evaluated. The adaptive system performed well over all turbulence levels simulated, reliably detecting all but the smallest magnitude partially-missing-surface failures.
Predicting, examining, and evaluating FAC in US power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cohn, M.J.; Garud, Y.S.; Raad, J. de
1999-11-01
There have been many pipe failures in fossil and nuclear power plant piping systems caused by flow-accelerated corrosion (FAC). In some piping systems, this failure mechanism may be the most important type of damage to mitigate because FAC damage has led to catastrophic failures and fatalities. Detecting the damage and mitigating the problem can significantly reduce future forced outages and increase personnel safety. This article discusses the implementation of recent developments to select FAC inspection locations, perform cost-effective examinations, evaluate results, and mitigate FAC failures. These advances include implementing a combination of software to assist in selecting examination locations and an improved pulsed eddy current technique to scan for wall thinning without removing insulation. The use of statistical evaluation methodology and possible mitigation strategies are also discussed.
ERIC Educational Resources Information Center
Mbwiri, Francis I.
2017-01-01
Many students with disabilities attending alternative high schools are not improving their mathematics ability scores. Failure to improve their mathematics ability scores has hampered their potential academic success and career prospects, resulting in many students dropping out of schools without graduating. The purpose of this quantitative study…
The Relationship between Earned Value Management Metrics and Customer Satisfaction
ERIC Educational Resources Information Center
Plumer, David R.
2010-01-01
Information Technology (IT) products have a high rate of failure. Only 25% of IT projects were completed within budget and schedule, and 15% of completed projects were not operational. Researchers have not investigated the success of project management systems from the perspective of customer satisfaction. In this quantitative study, levels of…
Environment assisted degradation mechanisms in advanced light metals
NASA Technical Reports Server (NTRS)
Gangloff, Richard P.; Stoner, Glenn E.; Swanson, Robert E.
1988-01-01
The general goals of the research program are to characterize alloy behavior quantitatively and to develop predictive mechanisms for environmental failure modes. Successes in this regard will provide the basis for metallurgical optimization of alloy performance, for chemical control of aggressive environments, and for engineering life prediction with damage tolerance and long term reliability.
A Meta-Analysis of Predictors of Offender Treatment Attrition and Its Relationship to Recidivism
ERIC Educational Resources Information Center
Olver, Mark E.; Stockdale, Keira C.; Wormith, J. Stephen
2011-01-01
Objective: The failure of offenders to complete psychological treatment can pose significant concerns, including increased risk for recidivism. Although a large literature identifying predictors of offender treatment attrition has accumulated, there has yet to be a comprehensive quantitative review. Method: A meta-analysis of the offender…
NASA Technical Reports Server (NTRS)
Vander Velde, W. E.; Carignan, C. R.
1984-01-01
One of the first questions facing the designer of the control system for a large space structure is how many components - actuators and sensors - to specify and where to place them on the structure. This paper presents a methodology intended to assist the designer in making these choices. A measure of controllability is defined which gives a quantitative indication of how well the system can be controlled with a given set of actuators. Similarly, a measure of observability is defined which gives a quantitative indication of how well the system can be observed with a given set of sensors. The effect of component unreliability is then introduced by computing the average expected degree of controllability (observability) over the operating lifetime of the system, accounting for the likelihood of various combinations of component failures. The problem of component location is resolved by optimizing this performance measure over the admissible set of locations. The variation of this optimized performance measure with the number of actuators (sensors) is helpful in deciding how many components to use.
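Under simplifying assumptions, the reliability-weighted measure can be sketched in a few lines: the snippet below takes the trace of the controllability Gramian as the degree of controllability for a single illustrative mode and averages it over up/down states of two independent actuators. This is a snapshot average, whereas the paper averages over the operating lifetime, and all numbers are invented.

```python
# Expected degree of controllability over actuator-failure combinations.
import itertools
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0], [-4.0, -0.4]])            # lightly damped mode
B_cols = [np.array([[0.0], [1.0]]), np.array([[0.0], [0.5]])]

def degree_of_controllability(cols):
    if not cols:
        return 0.0                                    # no actuators left
    B = np.hstack(cols)
    W = solve_continuous_lyapunov(A, -B @ B.T)        # A W + W A.T = -B B.T
    return float(np.trace(W))

r = 0.9                                               # per-actuator reliability
expected = 0.0
for state in itertools.product([0, 1], repeat=len(B_cols)):
    p = np.prod([r if s else 1.0 - r for s in state])
    cols = [b for s, b in zip(state, B_cols) if s]
    expected += p * degree_of_controllability(cols)
print(f"expected degree of controllability: {expected:.3f}")
```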
Can magma-injection and groundwater forces cause massive landslides on Hawaiian volcanoes?
Iverson, R.M.
1995-01-01
Landslides with volumes exceeding 1000 km³ have occurred on the flanks of Hawaiian volcanoes. Because the flanks typically slope seaward no more than 12°, the mechanics of slope failure are problematic. Limit-equilibrium analyses of wedge-shaped slices of the volcano flanks show that magma injection at prospective headscarps might trigger the landslides, but only under very restrictive conditions. Additional calculations show that groundwater head gradients associated with topographically induced flow and sea-level change are less likely to be important. Thus a simple, quantitative explanation for failure of Hawaiian volcano flanks remains elusive, and more complex scenarios may merit investigation. -from Author
Experimental investigation of the crashworthiness of scaled composite sailplane fuselages
NASA Technical Reports Server (NTRS)
Kampf, Karl-Peter; Crawley, Edward F.; Hansman, R. John, Jr.
1989-01-01
The crash dynamics and energy absorption of composite sailplane fuselage segments undergoing nose-down impact were investigated. More than 10 quarter-scale, structurally similar test articles, typical of high-performance sailplane designs, were tested. Fuselage segments were fabricated of combinations of fiberglass, graphite, Kevlar, and Spectra fabric materials. Quasistatic and dynamic tests were conducted. The quasistatic tests were found to replicate the strain history and failure modes observed in the dynamic tests. Failure modes of the quarter-scale models were qualitatively compared with full-scale crash evidence and quantitatively compared with current design criteria. By combining material and structural improvements, substantial increases in crashworthiness were demonstrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Li; Chen, Zizhong; Song, Shuaiwen
2016-01-18
Energy efficiency and resilience are two crucial challenges for HPC systems to reach exascale. While energy efficiency and resilience issues have been extensively studied individually, little has been done to understand the interplay between energy efficiency and resilience for HPC systems. Decreasing the supply voltage associated with a given operating frequency for processors and other CMOS-based components can significantly reduce power consumption. However, this often raises system failure rates and consequently increases application execution time. In this work, we present an energy saving undervolting approach that leverages the mainstream resilience techniques to tolerate the increased failures caused by undervolting.
Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface
2017-02-01
Report ECBC-TR-1426 (Rastogi, Vipin, et al.). Surviving fragments of the report body indicate that each quantitative method was performed three times on three consecutive days.
Evaluation of Gusset Plate Safety in Steel Truss Bridges
DOT National Transportation Integrated Search
2011-10-01
Failure of the I-35W truss bridge in Minneapolis has been attributed to failure of a gusset plate, necessitating evaluation of gusset plate safety on bridges across the country. FHWA Publication IF-09-014 provides state DOTs with important guidance...
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
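The time-domain moments and spectral methods the report reviews reduce to a few library calls today. The snippet below computes RMS, kurtosis, and a Welch spectral peak for a synthetic signal with an emerging defect tone; it is an illustration in modern Python, not the report's original implementation.

```python
# Simple incipient-failure indicators: time-domain moments plus a spectral peak.
import numpy as np
from scipy import signal, stats

fs = 10_000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(3)
x = rng.normal(0, 1.0, t.size) + 0.8 * np.sin(2 * np.pi * 1_250 * t)  # tone

rms = np.sqrt(np.mean(x ** 2))
kurt = stats.kurtosis(x, fisher=False)         # impulsiveness indicator
f, pxx = signal.welch(x, fs=fs, nperseg=2048)  # power spectral density
print(f"RMS = {rms:.2f}, kurtosis = {kurt:.2f}, "
      f"spectral peak at {f[np.argmax(pxx)]:.0f} Hz")
```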
Friday, Laura; Zoller, James S; Hollerbach, Ann D; Jones, Katherine; Knofczynski, Greg
2015-01-01
Organizations are looking to new graduate nurses to fill expected staffing shortages over the next decade. Creative and effective onboarding programs will determine the success or failure of these graduates as they transition from student to professional nurse. This longitudinal quantitative study with repeated measures used the Casey-Fink Graduate Nurse Experience Survey to investigate the effects of offering a prelicensure extern program and postlicensure residency program on new graduate nurses and organizational outcomes versus a residency program alone. Compared with the nurse residency program alone, the combination of extern program and nurse residency program improved neither the transition factors most important to new nurse graduates during their first year of practice nor a measure important to organizations, retention rates. The additional cost of providing an extern program should be closely evaluated when making financially responsible decisions.
Progressive approach to eruption at Campi Flegrei caldera in southern Italy
NASA Astrophysics Data System (ADS)
Kilburn, Christopher R. J.; de Natale, Giuseppe; Carlino, Stefano
2017-05-01
Unrest at large calderas rarely ends in eruption, encouraging vulnerable communities to perceive emergency warnings of volcanic activity as false alarms. A classic example is the Campi Flegrei caldera in southern Italy, where three episodes of major uplift since 1950 have raised its central district by about 3 m without an eruption. Individual episodes have conventionally been treated as independent events, so that only data from an ongoing episode are considered pertinent to evaluating eruptive potential. An implicit assumption is that the crust relaxes accumulated stress after each episode. Here we apply a new model of elastic-brittle failure to test the alternative view that successive episodes promote a long-term accumulation of stress in the crust. The results provide the first quantitative evidence that Campi Flegrei is evolving towards conditions more favourable to eruption and identify field tests for predictions on how the caldera will behave during future unrest.
NASA Technical Reports Server (NTRS)
Low, P. A.; Denq, J. C.; Opfer-Gehrking, T. L.; Dyck, P. J.; O'Brien, P. C.; Slezak, J. M.
1997-01-01
Normative data are limited on autonomic function tests, especially beyond age 60 years. We therefore evaluated these tests in a total of 557 normal subjects evenly distributed by age and gender from 10 to 83 years. Heart rate (HR) response to deep breathing fell with increasing age. Valsalva ratio varied with both age and gender. QSART (quantitative sudomotor axon-reflex test) volume was consistently greater in men (approximately double) and progressively declined with age for all three lower extremity sites but not the forearm site. Orthostatic blood pressure reduction was greater with increasing age. HR at rest was significantly higher in women, and the increment with head-up tilt fell with increasing age. For no tests did we find a regression to zero, and some tests seem to level off with increasing age, indicating that diagnosis of autonomic failure was possible to over 80 years of age.
The relationship between fuel lubricity and diesel injection system wear
NASA Astrophysics Data System (ADS)
Lacy, Paul I.
1992-01-01
Use of low-lubricity fuel may have contributed to increased failure rates associated with critical fuel injection equipment during the 1991 Operation Desert Storm. However, accurate quantitative analysis of failed components from the field is almost impossible due to the unique service history of each pump. This report details the results of pump stand tests with fuels of equal viscosity, but widely different lubricity. Baseline tests were also performed using reference no. 2 diesel fuel. Use of poor lubricity fuel under these controlled conditions was found to greatly reduce both pump durability and engine performance. However, both improved metallurgy and fuel lubricity additives significantly reduced wear. Good correlation was obtained between standard bench tests and lightly loaded pump components. However, high contact loads on isolated components produced a more severe wear mechanism that is not well reflected by the Ball-on-Cylinder Lubricity Evaluator.
Analysis of lasers as a solution to efficiency droop in solid-state lighting
Chow, Weng W.; Crawford, Mary H.
2015-10-06
This letter analyzes the proposal to mitigate the efficiency droop in solid-state light emitters by replacing InGaN light-emitting diodes (LEDs) with lasers. The argument in favor of this approach is that carrier-population clamping after the onset of lasing limits carrier loss to that at threshold, while stimulated emission continues to grow with injection current. A fully quantized (carriers and light) theory that is applicable to LEDs and lasers (above and below threshold) is used to obtain a quantitative evaluation. The results confirm the potential advantage of higher laser output power and efficiency above lasing threshold, while also indicating disadvantages including low efficiency prior to lasing onset, sensitivity of lasing threshold to temperature, and the effects of catastrophic laser failure. As a result, a solution to some of these concerns is suggested that takes advantage of recent developments in nanolasers.
Paradise, Jordan; Wolf, Susan M; Kuzma, Jennifer; Kuzhabekova, Aliya; Tisdale, Alison W; Kokkoli, Efrosini; Ramachandran, Gurumurthy
2009-01-01
The emergence of nanotechnology, and specifically nanobiotechnology, raises major oversight challenges. In the United States, government, industry, and researchers are debating what oversight approaches are most appropriate. Among the federal agencies already embroiled in discussion of oversight approaches are the Food and Drug Administration (FDA), Environmental Protection Agency (EPA), Department of Agriculture (USDA), Occupational Safety and Health Administration (OSHA), and National Institutes of Health (NIH). All can learn from assessment of the successes and failures of past oversight efforts aimed at emerging technologies. This article reports on work funded by the National Science Foundation (NSF) aimed at learning the lessons of past oversight efforts. The article offers insights that emerge from comparing five oversight case studies that examine oversight of genetically engineered organisms (GEOs) in the food supply, pharmaceuticals, medical devices, chemicals in the workplace, and gene therapy. Using quantitative and qualitative analysis, the authors present a new way of evaluating oversight.
An Educational Intervention to Evaluate Nurses' Knowledge of Heart Failure.
Sundel, Siobhan; Ea, Emerson E
2018-07-01
Nurses are the main providers of patient education in inpatient and outpatient settings. Unfortunately, nurses may lack knowledge of chronic medical conditions, such as heart failure. The purpose of this one-group pretest-posttest intervention was to determine the effectiveness of teaching intervention on nurses' knowledge of heart failure self-care principles in an ambulatory care setting. The sample consisted of 40 staff nurses in ambulatory care. Nurse participants received a focused education intervention based on knowledge deficits revealed in the pretest and were then resurveyed within 30 days. Nurses were evaluated using the valid and reliable 20-item Nurses Knowledge of Heart Failure Education Principles Survey tool. The results of this project demonstrated that an education intervention on heart failure self-care principles improved nurses' knowledge of heart failure in an ambulatory care setting, which was statistically significant (p < .05). Results suggest that a teaching intervention could improve knowledge of heart failure, which could lead to better patient education and could reduce patient readmission for heart failure. J Contin Educ Nurs. 2018;49(7):315-321. Copyright 2018, SLACK Incorporated.
Shi, Lei; Shuai, Jian; Xu, Kui
2014-08-15
Fire and explosion accidents of steel oil storage tanks (FEASOST) occur occasionally during petroleum and chemical industry production and storage processes and often have a devastating impact on lives, the environment and property. To contribute towards the development of a quantitative approach for assessing the occurrence probability of FEASOST, a fault tree of FEASOST is constructed that identifies various potential causes. Traditional fault tree analysis (FTA) can achieve quantitative evaluation if failure data for all of the basic events (BEs) are available, which is almost impossible due to the lack of detailed data, as well as other uncertainties. This paper makes an attempt to perform FTA of FEASOST through a hybrid application of an expert-elicitation-based improved analytic hierarchy process (AHP) and fuzzy set theory, and the occurrence possibility of FEASOST is estimated for an oil depot in China. A comparison between statistical data and data calculated using fuzzy fault tree analysis (FFTA) based on traditional and improved AHP is also made. Sensitivity and importance analysis has been performed to identify the most crucial BEs leading to FEASOST, which will provide insights into how managers should focus effective mitigation. Copyright © 2014 Elsevier B.V. All rights reserved.
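The quantitative step of an FFTA can be sketched compactly: basic-event possibilities given as triangular fuzzy numbers are propagated through OR/AND gates with the usual vertex-wise approximation. The event names and numbers below are invented for illustration, and the expert weighting via AHP is omitted.

```python
# Vertex-wise fuzzy gate arithmetic on triangular fuzzy numbers (a, b, c).
import numpy as np

def gate_or(events):
    """P(OR) = 1 - prod(1 - p), applied to each vertex of the fuzzy numbers."""
    return 1.0 - np.prod(1.0 - np.array(events), axis=0)

def gate_and(events):
    return np.prod(np.array(events), axis=0)

static_spark = np.array([1e-4, 5e-4, 1e-3])   # (low, modal, high) possibility
hot_work     = np.array([5e-5, 2e-4, 8e-4])
vapour_cloud = np.array([1e-3, 4e-3, 9e-3])

ignition = gate_or([static_spark, hot_work])
fire = gate_and([ignition, vapour_cloud])      # top event: fuzzy possibility
print("top-event possibility (low, modal, high):", fire)
print("crisp centroid estimate:", fire.sum() / 3.0)  # triangular centroid
```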
Tate, Sonya C; Burke, Teresa F; Hartman, Daisy; Kulanthaivel, Palaniappan; Beckmann, Richard P; Cronier, Damien M
2016-03-15
Resistance to BRAF inhibition is a major cause of treatment failure for BRAF-mutated metastatic melanoma patients. Abemaciclib, a cyclin-dependent kinase 4 and 6 inhibitor, overcomes this resistance in xenograft tumours and offers a promising drug combination. The present work aims to characterise the quantitative pharmacology of the abemaciclib/vemurafenib combination using a semimechanistic pharmacokinetic/pharmacodynamic modelling approach and to identify an optimum dosing regimen for potential clinical evaluation. A PK/biomarker model was developed to connect abemaciclib/vemurafenib concentrations to changes in MAPK and cell cycle pathway biomarkers in A375 BRAF-mutated melanoma xenografts. Resultant tumour growth inhibition was described by relating (i) MAPK pathway inhibition to apoptosis, (ii) mitotic cell density to tumour growth and, under resistant conditions, (iii) retinoblastoma protein inhibition to cell survival. The model successfully described vemurafenib/abemaciclib-mediated changes in MAPK pathway and cell cycle biomarkers. Initial tumour shrinkage by vemurafenib, acquisition of resistance and subsequent abemaciclib-mediated efficacy were successfully captured and externally validated. Model simulations illustrate the benefit of intermittent vemurafenib therapy over continuous treatment, and indicate that continuous abemaciclib in combination with intermittent vemurafenib offers the potential for considerable tumour regression. The quantitative pharmacology of the abemaciclib/vemurafenib combination was successfully characterised and an optimised, clinically-relevant dosing strategy was identified.
A Methodology for Quantifying Certain Design Requirements During the Design Phase
NASA Technical Reports Server (NTRS)
Adams, Timothy; Rhodes, Russel
2005-01-01
A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to fall to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost (see figure). Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case of the binomial distribution approximates the commonly known exponential, or "constant failure rate", distribution. Either model can be used. The binomial distribution was selected for modeling flexibility because it conveniently addresses both the zero-fail and failure cases. The failure case is typically used for unmanned spacecraft, as with missiles.
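The three models named above fit in a few lines; all numbers are illustrative. The zero-fail binomial case also shows the "constant failure rate" connection noted in the abstract: (1 - p)^n is close to exp(-n p) when p is small.

```python
# The three math models: binomial (>= case), series reliability, Poisson (<=).
import math
from scipy import stats

p, n = 1.0e-3, 100                # per-trial failure probability, trials
print("P(zero failures), binomial   :", stats.binom.pmf(0, n, p))
print("P(zero failures), exponential:", math.exp(-n * p))

# (2) Series-system reliability is the product of element reliabilities.
elements = [0.999, 0.995, 0.998]
print("series-system reliability    :", round(math.prod(elements), 6))

# (3) Poisson, less-than-or-equal-to case: P(at most k demands per mission).
lam, k = 2.5, 3                   # expected demands, allowed demands
print("P(demands <= 3)              :", stats.poisson.cdf(k, lam))
```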
Quantitative Decision Support Requires Quantitative User Guidance
NASA Astrophysics Data System (ADS)
Smith, L. A.
2009-12-01
Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation is given of the language often employed in communicating climate model output: language which accurately states that models are “better”, have “improved”, and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. Third, a general approach for evaluating the relevance of quantitative climate model output for a given problem is presented. Based on climate science, meteorology, and the details of the question at hand, this approach identifies necessary (never sufficient) conditions required for the rational use of climate model output in quantitative decision support tools. Inasmuch as climate forecasting is a problem of extrapolation, there will always be harsh limits on our ability to establish where a model is fit for purpose; this does not, however, prevent us from identifying model noise as such, and thereby avoiding some cases of misapplication and over-interpretation of model output. Failure to clearly communicate the limits of today’s climate models in providing quantitative, decision-relevant climate information to today’s users would risk the credibility of tomorrow’s climate science, and of science-based policy more generally.
Identification and Evaluation of Deepwater Port Hose Inspection Methods
DOT National Transportation Integrated Search
1979-01-01
The work contained in this report consists of a review of deepwater port hose failures to date, and the causes leading to these failures, as well as an evaluation of current hose inspection techniques and procedures, and an examination of available n...
Tanaka, Hidetatsu; Mori, Yu; Noro, Atsushi; Kogure, Atsushi; Kamimura, Masayuki; Yamada, Norikazu; Hanada, Shuji; Masahashi, Naoya; Itoi, Eiji
2016-01-01
Ti-6Al-4V alloy is widely used as a material for orthopaedic implants because of its good corrosion resistance and biocompatibility. However, the discrepancy in Young’s modulus between a metal prosthesis and human cortical bone sometimes induces clinical problems such as thigh pain and bone atrophy due to stress shielding. We designed a Ti-Nb-Sn alloy with a low Young’s modulus to address these problems of stress disproportion. In this study, we assessed the effects of anodic oxidation with or without hot water treatment on the bone-bonding characteristics of the Ti-Nb-Sn alloy. We performed surface analyses and examined apatite formation by SEM micrographs, XPS and XRD analyses. We also evaluated biocompatibility in experimental animal models by measuring failure loads with a pull-out test and by quantitative histomorphometric analyses. By SEM, abundant apatite formation was observed on the surface of Ti-Nb-Sn alloy discs treated with anodic oxidation and hot water after incubation in Hank’s solution. A strong peak of apatite formation was detected on the surface using XRD analyses. XPS analysis revealed an increase of the H2O fraction in the O 1s spectrum. Results of the pull-out test showed that the failure loads of Ti-Nb-Sn alloy rods treated with anodic oxidation and hot water were greater than those of untreated rods. Quantitative histomorphometric analyses indicated that anodic oxidation and hot water treatment induced higher new bone formation around the rods. Our findings indicate that Ti-Nb-Sn alloy treated with anodic oxidation and hot water shows a greater capacity for apatite formation, stronger bone bonding and higher biocompatibility for osteosynthesis. Ti-Nb-Sn alloy treated with anodic oxidation and hot water is a promising material for orthopaedic implants, enabling better osteosynthesis and lower stress disproportion. PMID:26914329
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.
We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to the failure or non-failure region, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, the regions not covered by spheres shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction step, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
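The final estimation step described above (evaluate samples, fit a surrogate, then sample the surrogate exhaustively) can be illustrated with a minimal sketch. The limit-state function, sample budget, and the Gaussian-process surrogate below are stand-ins, not the POF-Darts disk-packing machinery itself:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

# Hypothetical limit-state function: "failure" wherever g(x) < 0.
def g(x):
    return 1.5 - np.sum(x**2, axis=-1)

rng = np.random.default_rng(0)
budget = 100                                    # function-evaluation budget
samples = rng.uniform(-2.0, 2.0, size=(budget, 2))
values = g(samples)                             # classify samples by evaluation

# Surrogate built from the evaluated samples; a GP stands in for the
# Voronoi/protection-sphere construction of the actual method.
surrogate = GaussianProcessRegressor().fit(samples, values)

# Exhaustive sampling of the cheap surrogate yields the POF estimate.
probe = rng.uniform(-2.0, 2.0, size=(50_000, 2))
pof = np.mean(surrogate.predict(probe) < 0.0)
print(f"estimated probability of failure: {pof:.4f}")
```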
Strength criteria for composite materials (a literature survey)
NASA Technical Reports Server (NTRS)
Roode, F.
1982-01-01
Literature concerning strength (failure) criteria for composite materials is reviewed, with emphasis on phenomenological failure criteria. These criteria are primarily intended to give a good estimate of the safety margin with respect to failure for arbitrary multiaxial stress states; they do not indicate the types of fracture that will occur in the material. The collection of failure criteria is evaluated for applicability to the glass-reinforced plastics used in mine detectors. Material tests necessary to determine the parameters in the failure criteria are discussed.
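A representative phenomenological criterion of this class is the Tsai-Wu criterion for a plane-stress lamina, sketched below; the strength values are illustrative placeholders, not data from the survey:

```python
import numpy as np

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Tsai-Wu failure index for a plane-stress lamina; >= 1 means failure."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * np.sqrt(F11 * F22)        # common default interaction term
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

# Placeholder strengths (MPa) for an illustrative glass/epoxy lamina.
print(tsai_wu_index(s1=400.0, s2=20.0, t12=30.0,
                    Xt=1000.0, Xc=600.0, Yt=30.0, Yc=120.0, S=70.0))
```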
Global resilience analysis of water distribution systems.
Diao, Kegong; Sweetapple, Chris; Farmani, Raziyeh; Fu, Guangtao; Ward, Sarah; Butler, David
2016-12-01
Evaluating and enhancing resilience in water infrastructure is a crucial step towards more sustainable urban water management. As a prerequisite to enhancing resilience, a detailed understanding is required of the inherent resilience of the underlying system. Differing from traditional risk analysis, here we propose a global resilience analysis (GRA) approach that shifts the objective from analysing multiple and unknown threats to analysing the more identifiable and measurable system responses to extreme conditions, i.e. potential failure modes. GRA aims to evaluate a system's resilience to a possible failure mode regardless of the causal threat(s) (known or unknown, external or internal). The method is applied to test the resilience of four water distribution systems (WDSs) with various features to three typical failure modes (pipe failure, excess demand, and substance intrusion). The study reveals GRA provides an overview of a water system's resilience to various failure modes. For each failure mode, it identifies the range of corresponding failure impacts and reveals extreme scenarios (e.g. the complete loss of water supply with only 5% pipe failure, or still meeting 80% of demand despite over 70% of pipes failing). GRA also reveals that increased resilience to one failure mode may decrease resilience to another and increasing system capacity may delay the system's recovery in some situations. It is also shown that selecting an appropriate level of detail for hydraulic models is of great importance in resilience analysis. The method can be used as a comprehensive diagnostic framework to evaluate a range of interventions for improving system resilience in future studies. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
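The stress-to-failure sweep at the heart of GRA can be sketched as follows; the hydraulic response function here is a placeholder stub, whereas a real analysis would rerun a hydraulic model (e.g. EPANET) with the sampled pipes removed:

```python
import numpy as np

def demand_met(network, failed_fraction, rng):
    """Stand-in for a hydraulic simulation returning the fraction of
    customer demand still met at a given pipe-failure level."""
    # Placeholder response curve with noise; a real GRA would rerun the
    # hydraulic model with the sampled pipes removed from `network`.
    return max(0.0, 1.0 - failed_fraction**0.7 + rng.normal(0, 0.05))

rng = np.random.default_rng(1)
stress_levels = np.linspace(0.0, 1.0, 21)   # fraction of pipes failed
n_repeats = 100                             # random pipe subsets per level
for f in stress_levels:
    strains = [demand_met(None, f, rng) for _ in range(n_repeats)]
    print(f"failure level {f:4.0%}: worst {min(strains):.2f}, "
          f"median {np.median(strains):.2f}")
```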
NASA Astrophysics Data System (ADS)
Hu, Kun; Zhu, Qi-zhi; Chen, Liang; Shao, Jian-fu; Liu, Jian
2018-06-01
As confining pressure increases, crystalline rocks of moderate porosity usually undergo a transition in failure mode from localized brittle fracture to diffuse damage and ductile failure. This transition has been widely reported experimentally for several decades; however, satisfactory modeling is still lacking. The present paper aims at modeling the brittle-ductile transition of rocks under conventional triaxial compression. Based on quantitative analyses of experimental results, it is found that there is a quite satisfactory linear relationship between the axial inelastic strain at failure and the prescribed confining pressure. A micromechanics-based frictional damage model is then formulated using an associated plastic flow rule and a strain energy release rate-based damage criterion. The analytical solution to the strongly coupled plasticity-damage problem is provided and applied to simulate the nonlinear mechanical behaviors of Tennessee marble, Indiana limestone and Jinping marble, each presenting a brittle-ductile transition in its stress-strain curves.
Interconnect fatigue design for terrestrial photovoltaic modules
NASA Technical Reports Server (NTRS)
Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.
1982-01-01
The results of a comprehensive investigation of interconnect fatigue, leading to the definition of useful reliability-design and life-prediction algorithms, are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
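A minimal sketch of such a statistical fatigue curve is given below, combining a Coffin-Manson-type mean strain-cycle relation with assumed lognormal scatter in cycles-to-failure; all constants are illustrative placeholders rather than the paper's fitted values:

```python
import numpy as np
from scipy import stats

# Placeholder mean fatigue curve: strain amplitude = C * N^(-beta).
C, beta = 0.5, 0.3
sigma_logN = 0.4        # assumed lognormal scatter in cycles-to-failure

def cycles_to_failure(strain, prob):
    """Cycles to failure at a given strain amplitude for failure
    probability `prob`, i.e. one point on a statistical fatigue curve."""
    logN_mean = (np.log(C) - np.log(strain)) / beta
    return np.exp(stats.norm.ppf(prob, loc=logN_mean, scale=sigma_logN))

for p in (0.01, 0.50, 0.99):
    print(f"P={p:.0%}: N_f ≈ {cycles_to_failure(0.01, p):,.0f} cycles")
```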
Metabolic bone disease in chronic renal failure. II. Renal transplant patients.
Huffer, W. E.; Kuzela, D.; Popovtzer, M. M.; Starzl, T. E.
1975-01-01
Trabecular vertebral bone of renal transplant patients was quantitatively compared with bone from normal individuals and from dialyzed and nondialyzed patients with chronic renal failure reported in detail in an earlier study. Long- and short-term transplant patients have increased bone resorption and mineralization defects similar to the renal osteodystrophy seen in dialyzed and nondialyzed patients. However, in transplant patients the magnitude of resorption is greater, and bone volume tends to decrease rather than increase. Resorptive activity in transplant patients is maximal during the first year after transplantation. Bone volume decreases continuously for at least 96 months after transplantation. Only decreased bone volume correlated with success or failure of the renal transplant. Morphologic findings in this study correlate with other clinical and morphologic data to suggest that the reduction in bone volume in transplant patients results from a combination of persistent hyperparathyroidism and suppression of bone formation by steroid therapy. PMID:1091152
Fatigue in older adults with stable heart failure.
Stephen, Sharon A
2008-01-01
The purpose of this study was to describe fatigue and the relationships among fatigue intensity, self-reported functional status, and quality of life in older adults with stable heart failure. A descriptive, correlational design was used to collect quantitative data with reliable and valid instruments. Fifty-three eligible volunteers completed a questionnaire during an interview. Those with recent changes in their medical regimen, other fatigue-inducing illnesses, and isolated diastolic dysfunction were excluded. Fatigue intensity (Profile of Mood States fatigue subscale) was associated with lower quality of life, perceived health, and satisfaction with life. Fatigue was common, and no relationship was found between fatigue intensity and self-reported functional status. Marital status was the only independent predictor of fatigue. In stable heart failure, fatigue is a persistent symptom. Clinicians need to ask patients about fatigue and assess the impact on quality of life. Self-reported functional status cannot serve as a proxy measure for fatigue.
Bohren, Meghan A; Vogel, Joshua P; Hunter, Erin C; Lutsiv, Olha; Makh, Suprita K; Souza, João Paulo; Aguiar, Carolina; Saraiva Coneglian, Fernando; Diniz, Alex Luíz Araújo; Tunçalp, Özge; Javadi, Dena; Oladapo, Olufemi T; Khosla, Rajat; Hindin, Michelle J; Gülmezoglu, A Metin
2015-06-01
Despite growing recognition of neglectful, abusive, and disrespectful treatment of women during childbirth in health facilities, there is no consensus at a global level on how these occurrences are defined and measured. This mixed-methods systematic review aims to synthesize qualitative and quantitative evidence on the mistreatment of women during childbirth in health facilities to inform the development of an evidence-based typology of the phenomenon. We searched PubMed, CINAHL, and Embase databases and grey literature using a predetermined search strategy to identify qualitative, quantitative, and mixed-methods studies on the mistreatment of women during childbirth across all geographical and income-level settings. We used a thematic synthesis approach to synthesize the qualitative evidence and assessed the confidence in the qualitative review findings using the CERQual approach. In total, 65 studies were included from 34 countries. Qualitative findings were organized under seven domains: (1) physical abuse, (2) sexual abuse, (3) verbal abuse, (4) stigma and discrimination, (5) failure to meet professional standards of care, (6) poor rapport between women and providers, and (7) health system conditions and constraints. Due to high heterogeneity of the quantitative data, we were unable to conduct a meta-analysis; instead, we present descriptions of study characteristics, outcome measures, and results. Additional themes identified in the quantitative studies are integrated into the typology. This systematic review presents a comprehensive, evidence-based typology of the mistreatment of women during childbirth in health facilities, and demonstrates that mistreatment can occur at the level of interaction between the woman and provider, as well as through systemic failures at the health facility and health system levels. We propose this typology be adopted to describe the phenomenon and be used to develop measurement tools and inform future research, programs, and interventions.
NASA Astrophysics Data System (ADS)
Sinescu, C.; Bradu, A.; Duma, V.-F.; Topala, F. I.; Negrutiu, M. L.; Podoleanu, A. G.
2018-02-01
We present a recent investigation on the use of optical coherence tomography (OCT) in monitoring the calibration loss of sintering ovens used in the manufacture of metal ceramic dental prostheses. Deviations of the temperatures of such ovens from their specifications lead to stress, and even cracks, in the prosthesis material, and therefore to the failure of the dental treatment. Current methods of evaluating oven calibration consist of firing supplemental samples; this is subjective, expensive, and time consuming. Using an in-house developed swept source (SS) OCT system, we have demonstrated that a quantitative assessment of the internal structure of the prostheses, and therefore of the temperature settings of the ovens, can be made. Using en-face OCT images acquired at similar depths inside the samples, the differences in reflectivity allow for the evaluation of differences in granulation (i.e., in the number and size of ceramic grains) of the prosthesis material. Fifty samples, divided into five groups, each sintered at a different temperature (lower than, higher than, or equal to the prescribed one), were analyzed. The consequences of the temperature variations with respect to the prescribed one were determined. Rules-of-thumb were extracted to monitor objectively, using only OCT images of currently manufactured samples, the settings of the oven. The proposed method makes it possible to avoid producing prostheses with defects. While such rules-of-thumb achieve a qualitative assessment, an insight into our ongoing work on the quantitative assessment of such calibration losses of dental ovens using OCT is also given.
Lorenz, C H; Walker, E S; Graham, T P; Powers, T A
1995-11-01
The long-term adaptation of the right ventricle after atrial repair of transposition of the great arteries (TGA) remains a subject of major concern. Cine magnetic resonance imaging (MRI), with its tomographic capabilities, allows unique quantitative evaluation of both right and left ventricular function and mass. Our purpose was to use MRI and an age-matched normal population to examine the typical late adaptation of the right and left ventricles after atrial repair of TGA. Cine MRI was used to study ventricular function and mass in 22 patients after atrial repair of TGA. Images were obtained in short-axis sections from base to apex to derive normalized right and left ventricular mass (RVM and LVM, g/m2), interventricular septal mass (IVSM, g/m2), RV and LV end-diastolic volumes (EDV, mL/m2), and ejection fractions (EF). Results obtained 8 to 23 years after repair were compared with those from 24 age- and sex-matched normal volunteers and revealed markedly elevated RVM, decreased LVM and IVSM, normal RV size, and only mildly depressed RVEF. Only 1 of 22 patients had clinical RV dysfunction, and this patient had increased RVM. Cine MRI allows quantitative evaluation of both RV and LV mass and function late after atrial repair of TGA. Longitudinal studies that include these measurements should prove useful in determining the mechanism of late RV failure in these patients. On the basis of these early data, inadequate hypertrophy does not appear to be the cause of late dysfunction in this patient group.
A review of state-of-the-art stereology for better quantitative 3D morphology in cardiac research.
Mühlfeld, Christian; Nyengaard, Jens Randel; Mayhew, Terry M
2010-01-01
The aim of stereological methods in biomedical research is to obtain quantitative information about three-dimensional (3D) features of tissues, cells, or organelles from two-dimensional physical or optical sections. With immunogold labeling, stereology can even be used for the quantitative analysis of the distribution of molecules within tissues and cells. Nowadays, a large number of design-based stereological methods offer an efficient quantitative approach to intriguing questions in cardiac research, such as "Is there a significant loss of cardiomyocytes during progression from ventricular hypertrophy to heart failure?" or "Does a specific treatment reduce the degree of fibrosis in the heart?" Nevertheless, the use of stereological methods in cardiac research is rare. The present review article demonstrates how some of the potential pitfalls in quantitative microscopy may be avoided. To this end, we outline the concepts of design-based stereology and illustrate their practical applications to a wide range of biological questions in cardiac research. We hope that the present article will stimulate researchers in cardiac research to incorporate design-based stereology into their study designs, thus promoting an unbiased quantitative 3D microscopy.
NASA Astrophysics Data System (ADS)
Zhao, Yong; Yang, Tianhong; Bohnhoff, Marco; Zhang, Penghai; Yu, Qinglei; Zhou, Jingren; Liu, Feiyue
2018-05-01
To quantitatively understand the failure process and failure mechanism of a rock mass during the transformation from open-pit mining to underground mining, the Shirengou Iron Mine was selected as an engineering case study. The study area was determined using the rock mass basic quality classification method and the kinematic analysis method. Based on the analysis of the variations in apparent stress and apparent volume over time, the rock mass failure process was analyzed. Building on recent research on the temporal and spatial evolution of microseismic events in location, energy, apparent stress, and displacement, the migration characteristics of rock mass damage were studied. A hybrid moment tensor inversion method was used to determine the rock mass fracture source mechanisms, the fracture orientations, and the fracture scales. The fracture area can be divided into three zones: Zone A, Zone B, and Zone C. A statistical analysis of the fracture plane orientations was carried out, and four dominant fracture planes were obtained. Finally, the slip tendency analysis method was employed, and the unstable fracture planes were identified. The results show: (1) microseismic monitoring and hybrid moment tensor analysis can effectively characterize the failure process and failure mechanism of a rock mass; (2) during the transformation from open-pit to underground mining, the failure type of the rock mass is mainly shear failure, and tensile failure is mostly concentrated in the roofs of goafs; and (3) the rock mass of the pit bottom and the upper part of goaf No. 18 may suffer further damage.
Borba, Marcelo; Deluiz, Daniel; Lourenço, Eduardo José Veras; Oliveira, Luciano; Tannure, Patrícia Nivoloni
2017-08-21
This study aimed to evaluate dental implant outcomes and to identify risk factors associated with implant failure over 12 years via dental records of patients attending an educational institution. Dental records of 202 patients receiving 774 dental implants from 2002 to 2014 were analyzed by adopting a more reliable statistical method to evaluate risk factors with patients as the unit [generalized estimating equation (GEE)]. Information regarding patient age at implantation, sex, use of tobacco, and history of systemic diseases was collected. Information about implant location in the arch region and implant length, diameter, and placement in a grafted area was evaluated after 2 years under load. Systemic and local risk factors for early and late implant failure were studied. A total of 18 patients experienced 25 implant failures, resulting in an overall survival rate of 96.8% (2.84% and 0.38% early and late implant failures, respectively). The patient-based survival rate was 91.8%. GEE univariate and multivariate analyses revealed that a significant risk factor for implant failure was the maxillary implant (p = 0.006 and p = 0.014, respectively). Bone grafting appeared to be a risk factor for implant failure (p = 0.054). According to GEE analyses, maxillary implants had significantly worse outcomes in this population and were considered to be a risk factor for implant failure. Our results suggested that implants placed in a bone augmentation area had a tendency to fail.
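The clustered analysis described above, with the patient as the unit and implants nested within patients, corresponds to a GEE logistic regression. A minimal sketch using statsmodels is given below; the records, variable names, and correlation structure are hypothetical stand-ins for the study's data:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical implant-level records; `patient_id` clusters implants
# within patients, which is what GEE accounts for.
data = pd.DataFrame({
    "failure":    [0, 0, 1, 0, 1, 0, 0, 1, 0, 0],
    "maxilla":    [1, 0, 1, 0, 1, 1, 0, 1, 0, 0],
    "grafted":    [0, 0, 1, 1, 1, 0, 0, 0, 1, 0],
    "patient_id": [1, 1, 2, 2, 3, 4, 4, 5, 5, 6],
})

# Logistic GEE with an exchangeable within-patient correlation structure.
model = smf.gee("failure ~ maxilla + grafted", groups="patient_id",
                data=data, family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```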
Parylene MEMS patency sensor for assessment of hydrocephalus shunt obstruction.
Kim, Brian J; Jin, Willa; Baldwin, Alexander; Yu, Lawrence; Christian, Eisha; Krieger, Mark D; McComb, J Gordon; Meng, Ellis
2016-10-01
Neurosurgical ventricular shunts inserted to treat hydrocephalus experience a cumulative failure rate of 80 % over 12 years; obstruction is responsible for most failures with a majority occurring at the proximal catheter. Current diagnosis of shunt malfunction is imprecise and involves neuroimaging studies and shunt tapping, an invasive measurement of intracranial pressure and shunt patency. These patients often present emergently and a delay in care has dire consequences. A microelectromechanical systems (MEMS) patency sensor was developed to enable direct and quantitative tracking of shunt patency in order to detect proximal shunt occlusion prior to the development of clinical symptoms thereby avoiding delays in treatment. The sensor was fabricated on a flexible polymer substrate to eventually allow integration into a shunt. In this study, the sensor was packaged for use with external ventricular drainage systems for clinical validation. Insights into the transduction mechanism of the sensor were obtained. The impact of electrode size, clinically relevant temperatures and flows, and hydrogen peroxide (H2O2) plasma sterilization on sensor function were evaluated. Sensor performance in the presence of static and dynamic obstruction was demonstrated using 3 different models of obstruction. Electrode size was found to have a minimal effect on sensor performance and increased temperature and flow resulted in a slight decrease in the baseline impedance due to an increase in ionic mobility. However, sensor response did not vary within clinically relevant temperature and flow ranges. H2O2 plasma sterilization also had no effect on sensor performance. This low power and simple format sensor was developed with the intention of future integration into shunts for wireless monitoring of shunt state and more importantly, a more accurate and timely diagnosis of shunt failure.
SU-F-P-07: Applying Failure Modes and Effects Analysis to Treatment Planning System QA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mathew, D; Alaei, P
2016-06-15
Purpose: A small-scale implementation of Failure Modes and Effects Analysis (FMEA) for treatment planning system QA by utilizing the methodology of the AAPM TG-100 report. Methods: FMEA requires numerical values for the severity (S), occurrence (O) and detectability (D) of each mode of failure. The product of these three values gives a risk priority number (RPN). We have implemented FMEA for the treatment planning system (TPS) QA for two clinics which use Pinnacle and Eclipse TPS. Quantitative monthly QA data dating back 4 years for Pinnacle and 1 year for Eclipse have been used to determine values for severity (deviations from predetermined doses at points or volumes) and the occurrence of such deviations. The TPS QA protocol includes a phantom containing solid water and lung- and bone-equivalent heterogeneities. Photon and electron plans have been evaluated in both systems. The dose values at multiple distinct points of interest (POI) within the solid water, lung, and bone-equivalent slabs, as well as mean doses to several volumes of interest (VOI), have been re-calculated monthly using the available algorithms. Results: The computed doses vary slightly month-over-month. There have been more significant deviations following software upgrades, especially if the upgrade involved re-modeling of the beams. TG-100 guidance and the data presented here suggest an occurrence (O) of 2 depending on the frequency of re-commissioning the beams, severity (S) of 3, and detectability (D) of 2, giving an RPN of 12. Conclusion: Computerized treatment planning systems could pose a risk due to dosimetric errors and suboptimal treatment plans. The FMEA analysis presented here suggests that TPS QA should immediately follow software upgrades, but does not need to be performed every month.
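The RPN arithmetic is simple enough to show directly. A toy sketch follows, with illustrative failure modes and scores; only the S=3, O=2, D=2 combination (RPN 12) comes from the abstract above:

```python
# Minimal sketch of an FMEA risk-priority calculation in the spirit of
# AAPM TG-100; the failure modes and most S/O/D scores are illustrative.
failure_modes = {
    "beam model drift after software upgrade": (3, 2, 2),   # (S, O, D)
    "wrong heterogeneity correction selected": (4, 1, 3),
    "stale CT-to-density calibration curve":   (3, 1, 2),
}

# Rank failure modes by descending RPN = S * O * D.
for mode, (s, o, d) in sorted(failure_modes.items(),
                              key=lambda kv: -kv[1][0] * kv[1][1] * kv[1][2]):
    print(f"RPN {s*o*d:3d}  (S={s}, O={o}, D={d})  {mode}")
```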
Network Science Based Quantification of Resilience Demonstrated on the Indian Railways Network.
Bhatia, Udit; Kumar, Devashish; Kodra, Evan; Ganguly, Auroop R
2015-01-01
The structure, interdependence, and fragility of systems ranging from power-grids and transportation to ecology, climate, biology and even human communities and the Internet have been examined through network science. While response to perturbations has been quantified, recovery strategies for perturbed networks have usually been either discussed conceptually or through anecdotal case studies. Here we develop a network science based quantitative framework for measuring, comparing and interpreting hazard responses as well as recovery strategies. The framework, motivated by the recently proposed temporal resilience paradigm, is demonstrated with the Indian Railways Network. Simulations inspired by the 2004 Indian Ocean Tsunami and the 2012 North Indian blackout as well as a cyber-physical attack scenario illustrate hazard responses and effectiveness of proposed recovery strategies. Multiple metrics are used to generate various recovery strategies, which are simply sequences in which system components should be recovered after a disruption. Quantitative evaluation of these strategies suggests that faster and more efficient recovery is possible through network centrality measures. Optimal recovery strategies may be different per hazard, per community within a network, and for different measures of partial recovery. In addition, topological characterization provides a means for interpreting the comparative performance of proposed recovery strategies. The methods can be directly extended to other Large-Scale Critical Lifeline Infrastructure Networks including transportation, water, energy and communications systems that are threatened by natural or human-induced hazards, including cascading failures. Furthermore, the quantitative framework developed here can generalize across natural, engineered and human systems, offering an actionable and generalizable approach for emergency management in particular as well as for network resilience in general.
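A minimal sketch of a centrality-guided recovery strategy of this kind is shown below on a synthetic graph; the network, the disruption, and the giant-component recovery metric are stand-ins for the Indian Railways data and the paper's multiple metrics:

```python
import networkx as nx

# Toy stand-in for a railway network; recovery order by betweenness
# centrality, with the giant-component fraction as the recovery metric.
G = nx.barabasi_albert_graph(200, 2, seed=42)
centrality = nx.betweenness_centrality(G)

failed = sorted(G.nodes, key=centrality.get, reverse=True)[:50]  # disruption
recovery_order = sorted(failed, key=centrality.get, reverse=True)

H = G.copy()
H.remove_nodes_from(failed)
for step, node in enumerate(recovery_order, start=1):
    H.add_node(node)  # restore the node and its links to restored neighbors
    H.add_edges_from((node, nbr) for nbr in G.neighbors(node) if nbr in H)
    giant = max(nx.connected_components(H), key=len)
    print(f"step {step:2d}: giant component = {len(giant)/len(G):.0%}")
```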
Houser-Marko, Linda; Sheldon, Kennon M
2008-11-01
These studies tested the hypothesis that evaluating goal feedback in terms of a primary, longer term goal can be risky for future motivation. Study 1 was a 2 x 2 experiment in which framing level (primary goal/subgoal) and feedback valence (success/failure) were manipulated for participants during a verbal skills task. In the primary goal failure condition, there was increased negative mood and decreased positive mood and expectancy for subsequent trials, even while controlling for goal difficulty and importance. Study 2 was an 8-week study throughout which participants were asked to evaluate their progress regarding a primary goal (class grade goal) or subgoal (weekly study hours goal), and success or failure varied naturally. When progress was lacking, participants in the primary goal condition experienced the largest decreases in mood and expectancy. These results suggest that it is optimal to evaluate goal progress at the lower, subgoal level, particularly after failure feedback.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly, yet these sources often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to decide whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system-level test data or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. The approach provides a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
Criteria for evaluating evidence on public health interventions.
Rychetnik, L; Frommer, M; Hawe, P; Shiell, A
2002-02-01
Public health interventions tend to be complex, programmatic, and context dependent. The evidence for their effectiveness must be sufficiently comprehensive to encompass that complexity. This paper asks whether, and to what extent, evaluative research on public health interventions can be adequately appraised by applying well established criteria for judging the quality of evidence in clinical practice. It is argued that these criteria are useful in evaluating some aspects of evidence. However, there are other important aspects of evidence on public health interventions that are not covered by the established criteria. The evaluation of evidence must distinguish between the fidelity of the evaluation process in detecting the success or failure of an intervention, and the success or failure of the intervention itself. Moreover, if an intervention is unsuccessful, the evidence should help to determine whether the intervention was inherently faulty (that is, failure of intervention concept or theory), or just badly delivered (failure of implementation). Furthermore, proper interpretation of the evidence depends upon the availability of descriptive information on the intervention and its context, so that the transferability of the evidence can be determined. Study design alone is an inadequate marker of evidence quality in public health intervention evaluation.
Nonoperative management of blunt hepatic trauma: A systematic review.
Boese, Christoph Kolja; Hackl, Michael; Müller, Lars Peter; Ruchholtz, Steffen; Frink, Michael; Lechler, Philipp
2015-10-01
Nonoperative management (NOM) has become the standard treatment in hemodynamically stable patients with blunt hepatic injuries. While the reported overall success rates of NOM are excellent, there is a lack of consensus regarding the risk factors predicting the failure of NOM. The aim of this systematic review was to identify the incidence and prognostic factors for failure of NOM in adult patients with blunt hepatic trauma. Prospective studies reporting prognostic factors for the failure of nonoperative treatment of blunt liver injuries were identified by searching MEDLINE and the Cochrane Central Register of Controlled Trials. We screened 798 titles and abstracts, of which 8 single-center prospective observational studies, reporting 410 patients, were included in the qualitative and quantitative synthesis. No randomized controlled trials were found. The pooled failure rate of NOM was 9.5% (0-24%). Twenty-six prognostic factors predicting the failure of NOM were reported, of which six reached statistical significance in one or more studies: blood pressure (p < 0.05), fluid resuscitation (p = 0.02), blood transfusion (p = 0.003), peritoneal signs (p < 0.0001), Injury Severity Score (ISS) (p = 0.03), and associated intra-abdominal injuries (p < 0.01). There is evidence that patients presenting with clinical signs of shock, a high ISS, associated intra-abdominal injuries, and peritoneal signs are at an increased risk of failure of NOM for the treatment of blunt hepatic injuries. Systematic review, level III.
NASA Astrophysics Data System (ADS)
White, Bradley W.; Tarver, Craig M.
2017-01-01
It has long been known that detonating single crystals of solid explosives have much larger failure diameters than those of heterogeneous charges of the same explosive pressed or cast to 98 - 99% theoretical maximum density (TMD). In 1957, Holland et al. demonstrated that PETN single crystals have failure diameters of about 8 mm, whereas heterogeneous PETN charges have failure diameters of less than 0.5 mm. Recently, Fedorov et al. quantitatively determined nanosecond time resolved detonation reaction zone profiles of single crystals of PETN and HMX by measuring the interface particle velocity histories of the detonating crystals and LiF windows using a PDV system. The measured reaction zone time durations for PETN and HMX single crystal detonations were approximately 100 and 260 nanoseconds, respectively. These experiments provided the necessary data to develop Ignition and Growth (I&G) reactive flow model parameters for the single crystal detonation reaction zones. Using these parameters, the calculated unconfined failure diameter of a PETN single crystal was 7.5 +/- 0.5 mm, close to the 8 mm experimental value. The calculated failure diameter of an unconfined HMX single crystal was 15 +/- 1 mm. The unconfined failure diameter of an HMX single crystal has not yet been determined precisely, but Fedorov et al. detonated 14 mm diameter crystals confined by detonating a HMX-based plastic bonded explosive (PBX) without initially overdriving the HMX crystals.
ADM guidance-Ceramics: guidance to the use of fractography in failure analysis of brittle materials.
Scherrer, Susanne S; Lohbauer, Ulrich; Della Bona, Alvaro; Vichi, Alessandro; Tholey, Michael J; Kelly, J Robert; van Noort, Richard; Cesar, Paulo Francisco
2017-06-01
To provide background information and guidance on how to use fractography accurately, a powerful tool for failure analysis of dental ceramic structures. An extended palette of qualitative and quantitative fractography is provided, both for in vivo and in vitro fracture surface analyses. As visual support, this guidance document provides micrographs of typical critical ceramic processing flaws, differentiating between pre- versus post-sintering cracks, grinding-damage-related failures, occlusal contact wear origins, and failures due to surface degradation. The documentation emphasizes good labeling of crack features, precise indication of the direction of crack propagation (dcp), identification of the fracture origin, the use of fractographic photomontages of critical flaws, and flaw labeling on strength data graphics. A compilation of recommendations for specific applications of fractography in Dentistry is also provided. This guidance document will contribute to a more accurate use of fractography and help researchers to better identify, describe, and understand the causes of failure, in both clinical and laboratory-scale situations. If adequately performed at a large scale, fractography will assist in optimizing the methods of processing and designing restorative materials and components. Clinical failures may be better understood, and consequently reduced, by sending out the correct message regarding the fracture origin in clinical trials. Copyright © 2017 The Academy of Dental Materials. All rights reserved.
Failure analysis of parameter-induced simulation crashes in climate models
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.
2013-01-01
Simulations using IPCC-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
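The classification step lends itself to a compact sketch. Below, a bagged committee of SVM classifiers is trained on synthetic stand-in data (random parameter vectors with a made-up crash rule) and scored with the ROC AUC, mirroring the workflow rather than reproducing the CCSM4/POP2 ensemble itself:

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical stand-in data: 18 parameter values per run, binary crash label.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(2000, 18))
y = (X[:, 0] * X[:, 3] + 0.5 * X[:, 7] > 0.6).astype(int)  # synthetic rule

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

# A bagged committee of SVM classifiers with probabilistic outputs.
committee = BaggingClassifier(SVC(probability=True), n_estimators=10,
                              random_state=0).fit(X_tr, y_tr)
p_fail = committee.predict_proba(X_va)[:, 1]   # predicted failure probability
print("validation AUC:", roc_auc_score(y_va, p_fail))
```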
NASA Technical Reports Server (NTRS)
Sanchez, Christopher M.
2011-01-01
NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, test and evaluation on composites that have been impact damaged is in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.
Hybrid Bearing Prognostic Test Rig
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Certo, Joseph M.; Handschuh, Robert F.; Dimofte, Florin
2005-01-01
The NASA Glenn Research Center has developed a new Hybrid Bearing Prognostic Test Rig to evaluate the performance of sensors and algorithms in predicting failures of rolling element bearings for aeronautics and space applications. The failure progression of both conventional and hybrid (ceramic rolling elements, metal races) bearings can be tested from fault initiation to total failure. The effects of different lubricants on bearing life can also be evaluated. Test conditions monitored and recorded during the test include load, oil temperature, vibration, and oil debris. New diagnostic research instrumentation will also be evaluated for hybrid bearing damage detection. This paper summarizes the capabilities of this new test rig.
Advanced detection, isolation and accommodation of sensor failures: Real-time evaluation
NASA Technical Reports Server (NTRS)
Merrill, Walter C.; Delaat, John C.; Bruton, William M.
1987-01-01
The objective of the Advanced Detection, Isolation, and Accommodation (ADIA) Program is to improve the overall demonstrated reliability of digital electronic control systems for turbine engines by using analytical redundancy to detect sensor failures. The results of a real-time hybrid computer evaluation of the ADIA algorithm are presented. Minimum detectable levels of sensor failures for an F100 engine control system are determined. Also included are details about the microprocessor implementation of the algorithm as well as a description of the algorithm itself.
Compound estimation procedures in reliability
NASA Technical Reports Server (NTRS)
Barnes, Ron
1990-01-01
At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability even when few or no failures have been recorded; point estimates for the latter case were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating reliability that consider the entire failure record for all design stages has great intuitive appeal. A typical subsystem consists of a number of different components, and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages, have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored, and preliminary models involving the bivariate Poisson distribution and the Consael process (a bivariate Poisson process) were developed. Possible shortcomings of the models are noted. An example is given to illustrate the procedures. These investigations are ongoing, with the aim of developing estimators that extend to components (and subsystems) with three or more design stages.
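The flavor of such two-stage Bayesian reliability estimation can be conveyed with a simple conjugate sketch: carry (discounted) failure evidence from an earlier design stage into the prior for the current stage. This Beta-Binomial toy, with made-up counts and an assumed discount weight, is not the compound Poisson model described above:

```python
from scipy import stats

# Earlier design stage: 3 failures in 40 demands -> posterior under a
# uniform Beta(1, 1) prior becomes the (discounted) prior for the redesign.
prior_a, prior_b = 1 + 3, 1 + 37
discount = 0.5                  # weight on carried-over evidence (assumed)
a0 = 1 + discount * (prior_a - 1)
b0 = 1 + discount * (prior_b - 1)

# Current design stage: zero failures in 12 demands -- the classical
# point estimate would be degenerate, but the posterior is not.
failures, demands = 0, 12
post = stats.beta(a0 + failures, b0 + demands - failures)

print(f"posterior mean failure prob: {post.mean():.4f}")
print(f"95% credible interval: ({post.ppf(0.025):.4f}, {post.ppf(0.975):.4f})")
```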
ERIC Educational Resources Information Center
Barry, Adam E.; Szucs, Leigh E.; Reyes, Jovanni V.; Ji, Qian; Wilson, Kelly L.; Thompson, Bruce
2016-01-01
Given the American Psychological Association's strong recommendation to always report effect sizes in research, scholars have a responsibility to provide complete information regarding their findings. The purposes of this study were to (a) determine the frequencies with which different effect sizes were reported in published, peer-reviewed…
Failure to Get Admissions in a Discipline of Their Own Choice: Voices of Dejected Students
ERIC Educational Resources Information Center
Rana, Naeem Akhtar; Tuba, Naeem
2017-01-01
Attaining a professional engineering degree is a dream of many pre-engineering intermediate students in Pakistan. Several students face scarcity of resources to accomplish and enliven their dreams of getting admission into an engineering institute, which results in great hardships and turmoil for them. The literature reveals that quantitative work…
Correlation of Electronic Health Records Use and Reduced Prevalence of Diabetes Co-Morbidities
ERIC Educational Resources Information Center
Eller, James D.
2013-01-01
The general problem is Native American tribes have high prevalence rates of diabetes. The specific problem is the failure of IHS sites to adopt EHR may cause health care providers to miss critical opportunities to improve screening and triage processes that result in quality improvement. The purpose of the quantitative correlational study was to…
ERIC Educational Resources Information Center
Howie, Erin K.; Stevick, E. Doyle
2014-01-01
Background: Despite broad public support and legislative activity, policies intended to promote physical activity in schools have not produced positive outcomes in levels of physical activity or student health. What explains the broad failure of Physical Activity Policies (PAPs)? Thus far, PAP research has used limited quantitative methods to…
Visual field defects may not affect safe driving.
Dow, Jamie
2011-10-01
In Quebec, a driver whose acquired visual field defect renders them ineligible for a driver's permit renewal may request an exemption from the visual field standard by demonstrating safe driving despite the defect. For safety reasons, it was decided to attempt to identify predictors of failure on the road test, in order to avoid placing driving evaluators in potentially dangerous situations when evaluating drivers with visual field defects. During a 4-month period in 2009, all requests for exemptions from the visual field standard were collected and analyzed. All available medical and visual field data were collated for 103 individuals, of whom 91 successfully completed the evaluation process and obtained a waiver. The collated data included age, sex, type of visual field defect, visual field characteristics, and concomitant medical problems. No single factor, or combination of factors, could predict failure of the road test. All 5 drivers who failed the road test had cognitive problems, but 6 of the successful drivers also had known cognitive problems. Thus, cognitive problems influence the risk of failure but do not predict certain failure. Most of the applicants for an exemption were able to complete the evaluation process successfully, thereby demonstrating safe driving despite their handicap. Consequently, jurisdictions that have visual field standards for their driving permits should implement procedures to evaluate drivers with visual field defects who cannot meet the standard but wish to continue driving.
Symons, Nicholas R A; Almoudaris, Alex M; Nagpal, Kamal; Vincent, Charles A; Moorthy, Krishna
2013-01-01
To investigate the nature of process failures in postoperative care, to assess their frequency and preventability, and to explore their relationship to adverse events. Adverse events are common and are frequently caused by failures in the process of care. These processes are often evaluated independently using clinical audit. There is little understanding of process failures in terms of their overall frequency, relative risk, and cumulative effect on the surgical patient. Patients were observed daily from the first postoperative day until discharge by an independent surgeon. Field notes on the circumstances surrounding any nonroutine or atypical event were recorded. Field notes were assessed by 2 surgeons to identify failures in the process of care. Preventability, the degree of harm caused to the patient, and the underlying etiology of process failures were evaluated by 2 independent surgeons. Fifty patients undergoing major elective general surgery were observed for a total of 659 days of postoperative care. A total of 256 process failures were identified, of which 85% were preventable and 51% directly led to patient harm. Process failures occurred in all aspects of care, the most frequent being medication prescribing and administration, management of lines, tubes, and drains, and pain control interventions. Process failures accounted for 57% of all preventable adverse events. Communication failures and delays were the main etiologies, leading to 54% of process failures. Process failures are common in postoperative care, are highly preventable, and frequently cause harm to patients. Interventions to prevent process failures will improve the reliability of surgical postoperative care and have the potential to reduce hospital stay.
Effect of Combined Loading Due to Bending and Internal Pressure on Pipe Flaw Evaluation Criteria
NASA Astrophysics Data System (ADS)
Miura, Naoki; Sakai, Shinsuke
In the context of rules for rationalizing the maintenance of Light Water Reactor piping, reliable flaw evaluation criteria are essential for determining whether a detected flaw will be detrimental to continued plant operation. Ductile fracture is one of the dominant failure modes that must be considered for carbon steel piping and can be analyzed by elastic-plastic fracture mechanics. Analytical efforts have provided various flaw evaluation criteria using load correction factors, such as the Z-factors in the JSME codes on fitness-for-service for nuclear power plants and Section XI of the ASME Boiler and Pressure Vessel Code. The present Z-factors were determined conventionally, taking conservatism and simplicity into account; however, the effect of internal pressure, an important factor under actual plant conditions, was not adequately considered. Recently, a J-estimation scheme, LBB.ENGC, was developed for the ductile fracture analysis of circumferentially through-wall-cracked pipes subjected to combined loading, allowing more accurate prediction under more realistic conditions. This method explicitly incorporates the contributions of both bending and tension due to internal pressure by means of a scheme that is compatible with an arbitrary combined-loading history. In this study, the effect of internal pressure on the flaw evaluation criteria was investigated using the new J-estimation scheme. The Z-factor obtained in this study was compared with the presently used Z-factors, and the predictive capability of the current flaw evaluation criteria was quantitatively evaluated in consideration of the internal pressure.
Towards real-time quantitative optical imaging for surgery
NASA Astrophysics Data System (ADS)
Gioux, Sylvain
2017-07-01
There is a pressing clinical need to provide image guidance during surgery. Currently, assessment of tissue that needs to be resected or avoided is performed subjectively leading to a large number of failures, patient morbidity and increased healthcare cost. Because near-infrared (NIR) optical imaging is safe, does not require contact, and can provide relatively deep information (several mm), it offers unparalleled capabilities for providing image guidance during surgery. In this work, we introduce a novel concept that enables the quantitative imaging of endogenous molecular information over large fields-of-view. Because this concept can be implemented in real-time, it is amenable to provide video-rate endogenous information during surgery.
Lanying Lin; Sheng He; Feng Fu; Xiping Wang
2015-01-01
Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, WFP is determined by visual inspection, which lacks efficiency. To improve on this, image processing methods can be applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
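A minimal sketch of intensity-based K-means segmentation for this task is shown below; the two-cluster assumption, the brighter-is-fibre heuristic, and the synthetic test image are illustrative stand-ins for the study's actual pipeline:

```python
import numpy as np
from sklearn.cluster import KMeans

def wood_failure_percentage(gray_image: np.ndarray) -> float:
    """Estimate WFP by clustering pixel intensities into two classes
    (exposed adhesive vs. torn wood fibre) -- a sketch of the general
    idea, not the paper's exact pipeline."""
    pixels = gray_image.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
    # Heuristic: assume the brighter cluster corresponds to wood fibre.
    means = [pixels[labels == k].mean() for k in (0, 1)]
    wood_label = int(np.argmax(means))
    return 100.0 * np.mean(labels == wood_label)

# Synthetic 100x100 test image: left half "adhesive", right half "wood".
img = np.hstack([np.full((100, 50), 60.0), np.full((100, 50), 180.0)])
print(f"WFP ≈ {wood_failure_percentage(img):.1f}%")
```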
Development of failure model for nickel cadmium cells
NASA Technical Reports Server (NTRS)
Gupta, A.
1980-01-01
The development of a method for the life prediction of nickel cadmium cells is discussed. The approach described involves acquiring an understanding of the mechanisms of degradation and failure and at the same time developing nondestructive evaluation techniques for the nickel cadmium cells. The development of a statistical failure model which will describe the mechanisms of degradation and failure is outlined.
Yildiz, Saliha; Soyoral, Yasemin; Demirkiran, Davut; Ozturk, Mustafa
2014-04-01
Hypoparathyroidism is an uncommon disease, and its coexistence with chronic renal failure is quite rare. Hypocalcemia and hyperphosphatemia are seen in both diseases. The diagnosis of hypoparathyroidism may be overlooked when the parathormone response is not evaluated in patients with chronic renal failure. A 19-year-old female patient who had been receiving hemodialysis for 3 years because of chronic renal failure was diagnosed with idiopathic hypoparathyroidism and Hashimoto thyroiditis. When her medical records from the first admission and her medical history were evaluated, hypoparathyroidism and Hashimoto thyroiditis were found to have been present when she started hemodialysis. Idiopathic hypoparathyroidism should be suspected when there is no parathormone response to hypocalcemia in patients with chronic renal failure. It should be taken into consideration that Hashimoto thyroiditis may accompany this condition, and the required analyses should be performed.
Does early reading failure decrease children's reading motivation?
Morgan, Paul L; Fuchs, Douglas; Compton, Donald L; Cordray, David S; Fuchs, Lynn S
2008-01-01
The authors used a pretest-posttest control group design with random assignment to evaluate whether early reading failure decreases children's motivation to practice reading. First, they investigated whether 60 first-grade children would report substantially different levels of interest in reading as a function of their relative success or failure in learning to read. Second, they evaluated whether increasing the word reading ability of 15 at-risk children would lead to gains in their motivation to read. Multivariate analyses of variance suggest marked differences in both motivation and reading practice between skilled and unskilled readers. However, bolstering at-risk children's word reading ability did not yield evidence of a causal relationship between early reading failure and decreased motivation to engage in reading activities. Instead, hierarchical regression analyses indicate a covarying relationship among early reading failure, poor motivation, and avoidance of reading.
Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C
2015-11-01
Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
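A toy illustration of the simulation arithmetic described above, assuming hypothetical cost and QALY distributions in place of the Seattle-score-driven projections of the actual TEAM-HF model; only the 10,000-pair structure and the ICER definition come from the abstract:

```python
import numpy as np

rng = np.random.default_rng(7)
n_pairs = 10_000                     # 10,000 pairs of virtual cohorts

# Stand-in per-cohort outcomes: lifetime cost ($) and QALYs (hypothetical).
cost_ctrl = rng.normal(95_000, 12_000, n_pairs)
cost_dm   = rng.normal(101_000, 12_000, n_pairs)   # disease management adds cost...
qaly_ctrl = rng.normal(4.6, 0.5, n_pairs)
qaly_dm   = rng.normal(4.9, 0.5, n_pairs)          # ...and quality-adjusted survival

# Incremental cost-effectiveness ratio from the paired cohorts.
icer = (cost_dm.mean() - cost_ctrl.mean()) / (qaly_dm.mean() - qaly_ctrl.mean())
print(f"ICER ~ ${icer:,.0f} per QALY gained")

# Probability of cost-effectiveness at a $50k/QALY willingness to pay.
nmb = 50_000 * (qaly_dm - qaly_ctrl) - (cost_dm - cost_ctrl)
print(f"P(cost-effective at $50k/QALY) = {np.mean(nmb > 0):.2f}")
```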
Iguchi, Toshihiro; Hiraki, Takao; Matsui, Yusuke; Fujiwara, Hiroyasu; Masaoka, Yoshihisa; Tanaka, Takashi; Sato, Takuya; Gobara, Hideo; Toyooka, Shinichi; Kanazawa, Susumu
2018-05-01
To retrospectively evaluate the technical success of computed tomography fluoroscopy-guided short hookwire placement before video-assisted thoracoscopic surgery and to identify the risk factors for initial placement failure. In total, 401 short hookwire placements for 401 lesions (mean diameter 9.3 mm) were reviewed. Technical success was defined as correct positioning of the hookwire. Possible risk factors for initial placement failure (i.e., requirement for placement of an additional hookwire or abortion of the attempt) were evaluated using logistic regression analysis for all procedures, and separately for procedures performed via the conventional route. Of the 401 initial placements, 383 were successful and 18 failed. Short hookwires were finally placed for 399 of 401 lesions (99.5%). Univariate logistic regression analyses revealed that across all 401 procedures only the transfissural approach was a significant independent predictor of initial placement failure (odds ratio, OR, 15.326; 95% confidence interval, CI, 5.429-43.267; p < 0.001), and that for the 374 procedures performed via the conventional route only lesion size was a significant independent predictor of failure (OR 0.793, 95% CI 0.631-0.996; p = 0.046). The technical success of preoperative short hookwire placement was extremely high. The transfissural approach was a predictor of initial placement failure for all procedures, and small lesion size was a predictor of initial placement failure for procedures performed via the conventional route. • Technical success of preoperative short hookwire placement was extremely high. • The transfissural approach was a significant independent predictor of initial placement failure for all procedures. • Small lesion size was a significant independent predictor of initial placement failure for procedures performed via the conventional route.
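For readers unfamiliar with how such odds ratios are obtained, a minimal sketch of a logistic regression of placement failure on candidate predictors; the data frame here is simulated, and the variable names are illustrative rather than the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({                                  # one row per procedure (simulated)
    "failure": rng.binomial(1, 0.05, 401),           # 1 = initial placement failed
    "transfissural": rng.binomial(1, 0.07, 401),     # 1 = transfissural approach
    "lesion_mm": rng.normal(9.3, 3.0, 401),          # lesion diameter (mm)
})

X = sm.add_constant(df[["transfissural", "lesion_mm"]])
fit = sm.Logit(df["failure"], X).fit(disp=0)

print(np.exp(fit.params))       # odds ratios; OR < 1 for lesion_mm would mean
print(np.exp(fit.conf_int()))   # larger lesions fail less often, as reported
```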
NASA Astrophysics Data System (ADS)
Gurfinkel, Yuri I.; Mikhailov, Valery M.; Kudutkina, Marina I.
2004-06-01
Capillaries play a critical role in cardiovascular function as the point of exchange of nutrients and waste products between tissues and the circulation. A common problem for healthy volunteers examined during isolation, and for patients suffering from heart failure, is the quantitative estimation of tissue oedema; until now, no objective assessment of body fluid retention in tissues has existed. Optical imaging of living capillaries is a challenging and medically important scientific problem. The goal of the investigation was to study the dynamics of microcirculation parameters, including tissue oedema, in healthy volunteers during extended isolation and relative hypokinesia as a model of a mission to the International Space Station (ISS). The other aim was to study the dynamics of microcirculation parameters, including tissue oedema, in patients suffering from heart failure under treatment. Healthy volunteers and patients: we studied four healthy male subjects aged 41, 37, 40 and 48 years before the experiment (June 1999) and during the 240-day isolation period starting July 3, 1999. Unique hermetic chambers with artificial environmental parameters allowed this study to be performed with maximum similarity to real conditions on the ISS. Three times a week, at the same time of day, each subject recorded three video episodes with a total length of one minute using the optical computerized capillaroscope for noninvasive measurement of capillary diameters, capillary blood velocity and the size of the perivascular zone. The same microcirculation parameters were determined over three weeks in 15 patients (10 male, 5 female, aged 62.2 ± 8.8 years) suffering from heart failure and receiving furosemide 40 mg twice a week as a diuretic. Results: about 1500 episodes were recorded on laser disks and analyzed during this experiment. Every subject showed wave-like variations of capillary blood velocity within minute, week and month ranges. It was found that increases in perivascular zone size during isolation correlated with the subjects' body mass and probably depend on retention of body fluids in tissues. Computerized capillaroscopy provides a new opportunity for non-invasive quantitative estimation of tissue oedema and supports precise management of patients suffering from heart failure under diuretic treatment.
The Six Minute Walk Test Revisited
NASA Astrophysics Data System (ADS)
Mazumder, M.
2017-12-01
Background and Purpose: Heart failure is the leading cause of death and often alters or severely restricts human mobility, an essential life function. Motion capture is an emerging tool for analyzing human movement and extremity articulation, providing quantitative information on gait and range of motion. This study uses BioStamp mechanosensors to identify differences in motion for the duration of the Six Minute Walk Test and signature patterns of muscle contraction and posture in patients with advanced heart failure compared to healthy subjects. Identification and close follow-up of these patterns may allow enhanced diagnosis and the possibility for early intervention before disease worsening. Additionally, movement parameters represent a new family of potential biomarkers to track heart failure onset, progression and therapy. Methods: Prior to the Six Minute Walk Test, BioStamps (MC10) were applied to the chest and upper and lower extremities of heart failure and healthy patients, and data were streamed and recorded, revealing the pattern of movement in three separate axes. Conjointly, before and after the Six Minute Walk Test, the following vitals were measured per subject: heart rate, respiratory rate, blood pressure, oxygen saturation, dyspnea and leg fatigue (self-reported with the Borg scale). During the test, patients were encouraged to walk as far as they could in 6 minutes on a 30 m course, while we recorded the number of laps completed and oxygen saturation every minute. Results and Conclusions: The sensors captured and quantified whole-body and regional motion parameters including: (a) motion extent, position, acceleration and angle via incorporated accelerometers and gyroscopes; (b) muscle contraction via an incorporated electromyogram (EMG). Accelerometry and gyroscopic data for the last five steps of a healthy and a heart failure patient are shown. While significant differences in motion for the duration of the test were not found, each category of patients had a distinct pattern of motion, with identifiable qualitative and quantitative differences. These wearable conformal skin-adherent sensors allow on-body, mobile, personalized determination of motion and flexibility parameters. This tool and method hold promise for providing motion "biomarker" data in health and disease.
Sensor failure detection for jet engines
NASA Technical Reports Server (NTRS)
Beattie, E. C.; Laprad, R. F.; Akhter, M. M.; Rock, S. M.
1983-01-01
Revisions to the advanced sensor failure detection, isolation, and accommodation (DIA) algorithm, developed under the sensor failure detection system program, were studied to eliminate the steady-state errors due to estimation filter biases. Three algorithm revisions were formulated and one was chosen for detailed evaluation. The selected revision modifies the DIA algorithm to feed back the actual sensor outputs to the integral portion of the control in the no-failure case; in case of a failure, the estimate of the failed sensor output is fed back to the integral portion. The estimator outputs are fed back to the linear regulator portion of the control at all times. The revised algorithm is evaluated and compared to the baseline algorithm developed previously.
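A schematic of the selected feedback revision, written as a single control update; the gain names, shapes, and boolean failure flags are placeholders, since the abstract describes only the switching structure:

```python
import numpy as np

def dia_control_update(y_meas, y_est, failed, x_est, K_reg, k_int, integ, ref, dt):
    """No failure: measured outputs drive the integral path (removing the
    steady-state error caused by estimator bias). Declared failure: the
    estimate of the failed sensor is substituted. The linear-regulator path
    always uses the estimator outputs."""
    y_int = np.where(failed, y_est, y_meas)   # per-sensor selection
    integ = integ + (ref - y_int) * dt        # integral portion of the control
    u = -K_reg @ x_est + k_int * integ        # regulator portion uses estimates
    return u, integ
```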
NASA Astrophysics Data System (ADS)
Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine
2016-04-01
Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions, and for some models it is possible to integrate land-use and climatic change. The major drawbacks are the large amount of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account; this is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event, and the dispersion of the distribution gives the uncertainty of the result. Finally, a map is created, displaying a probability of occurrence for each computing cell of the studied area. In order to take land-use change into account, a complementary module integrating the effects of vegetation on soil properties has recently been developed. In recent years, the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesian islands; (ii) at local scale (1:10,000) for two complex mountainous areas; and (iii) at site-specific scale (1:2,000) for one landslide. For each study the 3D geotechnical model was adapted. These studies made it possible (i) to discuss the different factors included in the model, especially the initial 3D geotechnical models; (ii) to specify the location of probable failures under different hydrological scenarios; and (iii) to test the effects of climatic change and land use on slopes for two cases. In this way, future changes in temperature, precipitation and vegetation cover can be analyzed, permitting assessment of the impacts of global change on landslides. Finally, the results show that reliable information about future slope failures can be obtained at different scales of work for different scenarios with an integrated approach.
The final information about landslide susceptibility (i.e. probability of failure) can be integrated into landslide hazard assessment and could be an essential information source for future land planning. As performed in the ANR project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes the first step in the chain of risk assessment for different climate and economic development scenarios, used to evaluate the resilience of mountainous areas.
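A minimal Monte Carlo sketch of the per-cell failure-probability step, using an infinite-slope factor of safety as a stand-in for the ALICE/Morgenstern-Price computation; all parameter values and distributions below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000                                # Monte Carlo draws for one computing cell

slope = np.radians(28.0)                  # slope angle
z, gamma, gamma_w = 2.0, 19.0, 9.81       # depth (m), unit weights (kN/m^3)
m = 0.8                                   # water-table ratio for the triggering event

c = rng.normal(6.0, 2.0, n).clip(min=0.0)        # cohesion c' (kPa), sampled
phi = np.radians(rng.normal(30.0, 3.0, n))       # friction angle phi', sampled

resisting = c + (gamma - m * gamma_w) * z * np.cos(slope)**2 * np.tan(phi)
driving = gamma * z * np.sin(slope) * np.cos(slope)
fs = resisting / driving

print(f"P(FS < 1) = {np.mean(fs < 1.0):.3f}")    # landslide probability for the event
print(f"std(FS)   = {fs.std():.2f}")             # dispersion -> uncertainty of result
```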
NASA Technical Reports Server (NTRS)
Fujimura, J.; Camilleri, M.; Low, P. A.; Novak, V.; Novak, P.; Opfer-Gehrking, T. L.
1997-01-01
Our aims were to evaluate the role of superior mesenteric blood flow in the pathophysiology of orthostatic hypotension in patients with generalized autonomic failure. METHODS: Twelve patients with symptomatic neurogenic orthostatic hypotension and 12 healthy controls underwent superior mesenteric artery flow measurements using Doppler ultrasonography during head-up tilt and tilt plus meal ingestion. Autonomic failure was assessed using standard tests of sympathetic adrenergic, cardiovagal and postganglionic sympathetic sudomotor function. RESULTS: Superior mesenteric flow volume and time-averaged velocity were similar in patients and controls at supine rest; however, responses to the cold pressor test and upright tilt were attenuated (p < 0.05) in patients compared with controls. Head-up tilt after the meal evoked a profound fall of blood pressure and mesenteric blood flow in the patients; the reduction of mesenteric blood flow correlated (r = 0.89) with the fall of blood pressure in these patients, providing another manifestation of failed baroreflexes. We make the novel finding that the severity of postprandial orthostatic hypotension regressed negatively with the postprandial increase in mesenteric flow in patients with orthostatic hypotension. CONCLUSION: Mesenteric flow is under baroreflex control which, when defective, results in or worsens orthostatic hypotension. The large size and baroreflexivity of the mesenteric bed render it quantitatively important in the maintenance of postural normotension. The effects of orthostatic stress can be significantly attenuated by reducing the splanchnic-mesenteric volume increase in response to food. Evaluation of mesenteric flow in response to eating and head-up tilt provides important information on intra-abdominal sympathetic adrenergic function and the ability of the patient to cope with orthostatic stress.
NASA Astrophysics Data System (ADS)
Zhang, Zhong
In this work, motivated by the need to coordinate transmission maintenance scheduling among a multiplicity of self-interested entities in the restructured power industry, a distributed decision support framework based on multiagent negotiation systems (MANS) is developed. An innovative risk-based transmission maintenance optimization procedure is introduced. Several models for linking condition monitoring information to equipment's instantaneous failure probability are presented, which enable quantitative evaluation of the effectiveness of maintenance activities in terms of system cumulative risk reduction. Methodologies of statistical processing, equipment deterioration evaluation and time-dependent failure probability calculation are also described. A novel framework capable of facilitating distributed decision-making through multiagent negotiation is developed. A multiagent negotiation model is developed and illustrated that accounts for uncertainty and enables social rationality. Some issues of multiagent negotiation convergence and scalability are discussed, and the relationships between agent-based negotiation and auction systems are identified. A four-step MAS design methodology for constructing multiagent systems for power system applications is presented. A generic multiagent negotiation system, capable of inter-agent communication and distributed decision support through inter-agent negotiations, is implemented. A multiagent system framework for facilitating the automated integration of condition monitoring information and maintenance scheduling for power transformers is developed, and simulations of multiagent negotiation-based maintenance scheduling among several independent utilities are provided. This is shown to be a viable alternative solution paradigm to the traditional centralized optimization approach in today's deregulated environment. The multiagent system framework not only facilitates decision-making among competing power system entities, but also provides a tool for studying competitive industry relative to monopolistic industry.
Rapid experimental measurements of physicochemical properties to inform models and testing.
Nicolas, Chantel I; Mansouri, Kamel; Phillips, Katherine A; Grulke, Christopher M; Richard, Ann M; Williams, Antony J; Rabinowitz, James; Isaacs, Kristin K; Yau, Alice; Wambaugh, John F
2018-05-02
The structures and physicochemical properties of chemicals are important for determining their potential toxicological effects, toxicokinetics, and route(s) of exposure. These data are needed to prioritize the risk for thousands of environmental chemicals, but experimental values are often lacking. In an attempt to efficiently fill data gaps in physicochemical property information, we generated new data for 200 structurally diverse compounds, which were rigorously selected from the USEPA ToxCast chemical library, and whose structures are available within the Distributed Structure-Searchable Toxicity Database (DSSTox). This pilot study evaluated rapid experimental methods to determine five physicochemical properties, including the log of the octanol:water partition coefficient (known as log(Kow) or logP), vapor pressure, water solubility, Henry's law constant, and the acid dissociation constant (pKa). For most compounds, experiments were successful for at least one property; log(Kow) yielded the largest return (176 values). It was determined that 77 ToxPrint structural features were enriched in chemicals with at least one measurement failure, indicating which features may have played a role in rapid method failures. To gauge consistency with traditional measurement methods, the new measurements were compared with previous measurements (where available). Since quantitative structure-activity/property relationship (QSAR/QSPR) models are used to fill gaps in physicochemical property information, 5 suites of QSPRs were evaluated for their predictive ability and chemical coverage or applicability domain of new experimental measurements. The ability to have accurate measurements of these properties will facilitate better exposure predictions in two ways: 1) direct input of these experimental measurements into exposure models; and 2) construction of QSPRs with a wider applicability domain, as their predicted physicochemical values can be used to parameterize exposure models in the absence of experimental data. Published by Elsevier B.V.
Multi-laboratory survey of qPCR enterococci analysis method performance
Quantitative polymerase chain reaction (qPCR) has become a frequently used technique for quantifying enterococci in recreational surface waters, but there are several methodological options. Here we evaluated how three method permutations (type of mastermix, sample extract dilution, and use of controls in results calculation) affect method reliability among multiple laboratories with respect to sample interference. Multiple samples from each of 22 sites representing an array of habitat types were analyzed using EPA Method 1611 and Method 1609 reagents with full-strength and five-fold diluted extracts. The presence of interference was assessed in three ways: using sample processing and PCR amplification controls; consistency of results across extract dilutions; and relative recovery of target genes from spiked enterococci in water samples compared to control matrices, with acceptable recovery defined as 50 to 200%. Method 1609, which is based on an environmental mastermix, was found to be superior to Method 1611, which is based on a universal mastermix. Method 1611 had over a 40% control assay failure rate with undiluted extracts and a 6% failure rate with diluted extracts. Method 1609 failed in only 11% and 3% of undiluted and diluted extract analyses. Use of sample processing control assay results in the delta-delta Ct method for calculating relative target gene recoveries increased the number of acceptable recovery results. Delta-delta tended to bias recoveries fr
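The delta-delta Ct calculation mentioned above, in a minimal sketch; the Ct values are invented, and a roughly 100% amplification efficiency (a factor of 2 per cycle) is assumed, as is standard for the method:

```python
def ddct_recovery(ct_target_sample, ct_spc_sample, ct_target_cal, ct_spc_cal):
    """Relative target gene recovery via delta-delta Ct:
    ddCt = (Ct_target - Ct_SPC)_sample - (Ct_target - Ct_SPC)_calibrator,
    recovery = 2**(-ddCt) relative to the control matrix."""
    ddct = (ct_target_sample - ct_spc_sample) - (ct_target_cal - ct_spc_cal)
    return 2.0 ** (-ddct)

rec = ddct_recovery(31.2, 24.9, 30.0, 24.5)      # hypothetical Ct values
print(f"relative recovery: {100 * rec:.0f}%")    # acceptable window here: 50-200%
```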
Allstadt, Kate E.; Thompson, Eric M.; Wald, David J.; Hamburger, Michael W.; Godt, Jonathan W.; Knudsen, Keith L.; Jibson, Randall W.; Jessee, M. Anna; Zhu, Jing; Hearne, Michael; Baise, Laurie G.; Tanyas, Hakan; Marano, Kristin D.
2016-03-30
The U.S. Geological Survey (USGS) Earthquake Hazards and Landslide Hazards Programs are developing plans to add quantitative hazard assessments of earthquake-triggered landsliding and liquefaction to existing real-time earthquake products (ShakeMap, ShakeCast, PAGER) using open and readily available methodologies and products. To date, prototype global statistical models have been developed and are being refined, improved, and tested. These models are a good foundation, but much work remains to achieve robust and defensible models that meet the needs of end users. In order to establish an implementation plan and identify research priorities, the USGS convened a workshop in Golden, Colorado, in October 2015. This document summarizes current (as of early 2016) capabilities, research and operational priorities, and plans for further studies that were established at this workshop. Specific priorities established during the meeting include (1) developing a suite of alternative models; (2) making use of higher resolution and higher quality data where possible; (3) incorporating newer global and regional datasets and inventories; (4) reducing barriers to accessing inventory datasets; (5) developing methods for using inconsistent or incomplete datasets in aggregate; (6) developing standardized model testing and evaluation methods; (7) improving ShakeMap shaking estimates, particularly as relevant to ground failure, such as including topographic amplification and accounting for spatial variability; and (8) developing vulnerability functions for loss estimates.
NASA Astrophysics Data System (ADS)
Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin
2014-07-01
Flow entropy is a measure of the uniformity of pipe flows in water distribution systems (WDSs). By maximizing flow entropy one can identify reliable layouts or connectivity in networks. To overcome the disadvantage of the common definition of flow entropy, which does not consider the impact of pipe diameter on reliability, an extended definition, termed diameter-sensitive flow entropy, is proposed. This new methodology is then assessed against other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (resilience index) integrated with water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of two benchmark networks. Correlation analysis is used to evaluate quantitatively the relationship between entropy and reliability, and a comparative analysis between simple flow entropy and the new method is conducted. The results demonstrate that diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than simple flow entropy. Therefore, the new flow entropy method can be taken as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem for WDSs. Sensitivity analysis results show that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
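For orientation, a minimal sketch of the Shannon-type measure underlying flow entropy, plus a diameter-weighted variant included purely as a hypothetical stand-in (the paper's exact diameter-sensitive formulation is not given in the abstract):

```python
import numpy as np

def flow_entropy(flows):
    """S = -sum(p_i * ln p_i) over outflow fractions: maximal when flows are uniform."""
    p = np.asarray(flows, float)
    p = p / p.sum()
    return -np.sum(p * np.log(p))

def diameter_weighted_entropy(flows, diameters, d_ref=1.0):
    """Hypothetical variant: weight each term by d_i/d_ref so that larger
    (typically more reliable) pipes earn more uniformity credit."""
    p = np.asarray(flows, float)
    d = np.asarray(diameters, float)
    p = p / p.sum()
    return -np.sum((d / d_ref) * p * np.log(p))

print(flow_entropy([10, 10, 10]))   # ln 3 ~ 1.099, perfectly uniform
print(flow_entropy([28, 1, 1]))     # skewed flows, much lower entropy
```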
Probability of loss of assured safety in systems with multiple time-dependent failure modes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helton, Jon Craig; Pilch, Martin.; Sallaberry, Cedric Jean-Marie.
2012-09-01
Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as the probability of loss of assured safety (PLOAS). Representations for PLOAS for situations in which both link physical properties and link failure properties are time-dependent are derived and numerically evaluated for a variety of WL/SL configurations, including PLOAS defined by (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory uncertainty and epistemic uncertainty in the definition and numerical evaluation of PLOAS are considered.
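The four PLOAS definitions reduce to order statistics of link failure times, which makes them straightforward to estimate by sampling. A minimal sketch under invented failure-time distributions (the paper's link models are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_sl, n_wl = 100_000, 2, 2

sl = rng.weibull(2.0, (n, n_sl)) * 120.0          # SL failure times (min), illustrative
wl = rng.lognormal(np.log(60.0), 0.4, (n, n_wl))  # WL failure times (min), illustrative

ploas = {
    "(i)   all SLs before any WL":  np.mean(sl.max(1) < wl.min(1)),
    "(ii)  any SL before any WL":   np.mean(sl.min(1) < wl.min(1)),
    "(iii) all SLs before all WLs": np.mean(sl.max(1) < wl.max(1)),
    "(iv)  any SL before all WLs":  np.mean(sl.min(1) < wl.max(1)),
}
for name, p in ploas.items():
    print(f"{name}: {p:.4f}")
```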
Quantitative evolutionary design
Diamond, Jared
2002-01-01
The field of quantitative evolutionary design uses evolutionary reasoning (in terms of natural selection and ultimate causation) to understand the magnitudes of biological reserve capacities, i.e. excesses of capacities over natural loads. Ratios of capacities to loads, defined as safety factors, fall in the range 1.2-10 for most engineered and biological components, even though engineered safety factors are specified intentionally by humans while biological safety factors arise through natural selection. Familiar examples of engineered safety factors include those of buildings, bridges and elevators (lifts), while biological examples include factors of bones and other structural elements, of enzymes and transporters, and of organ metabolic performances. Safety factors serve to minimize the overlap zone (resulting in performance failure) between the low tail of capacity distributions and the high tail of load distributions. Safety factors increase with coefficients of variation of load and capacity, with capacity deterioration with time, and with cost of failure, and decrease with costs of initial construction, maintenance, operation, and opportunity. Adaptive regulation of many biological systems involves capacity increases with increasing load; several quantitative examples suggest sublinear increases, such that safety factors decrease towards 1.0. Unsolved questions include safety factors of series systems, parallel or branched pathways, elements with multiple functions, enzyme reaction chains, and equilibrium enzymes. The modest sizes of safety factors imply the existence of costs that penalize excess capacities. Those costs are likely to involve wasted energy or space for large or expensive components, but opportunity costs of wasted space at the molecular level for minor components. PMID:12122135
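The overlap-zone argument above has a closed form when load and capacity are taken as normal variates, a convenient (if idealized) assumption; the numbers below are invented:

```python
from math import sqrt
from scipy.stats import norm

mu_L, sd_L = 100.0, 15.0     # load distribution (hypothetical units)
mu_C, sd_C = 160.0, 25.0     # capacity distribution; safety factor = 160/100 = 1.6

# C - L is normal, so P(failure) = P(C < L) = Phi((mu_L - mu_C)/sqrt(sd_C^2 + sd_L^2)).
p_fail = norm.cdf((mu_L - mu_C) / sqrt(sd_C**2 + sd_L**2))
print(f"safety factor {mu_C / mu_L:.2f} -> P(failure) = {p_fail:.4f}")
```

Raising either coefficient of variation widens the overlap zone and pushes the required safety factor up, which is the trend the abstract describes.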
Radhakrishna, K.; Bowles, K.; Zettek-Sumner, A.
2013-01-01
Background: Telehealth data overload through high alert generation is a significant barrier to sustained adoption of telehealth for managing HF patients. Objective: To explore the factors contributing to frequent telehealth alerts, including false alerts, for Medicare heart failure (HF) patients admitted to a home health agency. Materials and Methods: A mixed-methods design was employed that combined quantitative correlation analysis of patient characteristics with the number of telehealth alerts, and qualitative analysis of telehealth and visiting nurses' notes on follow-up actions to patients' telehealth alerts. All quantitative and qualitative data were collected through retrospective review of the home health agency's electronic records. Results: Subjects in the study had a mean age of 83 (SD = 7.6); 56% were female. Patient co-morbidities (p < 0.05) of renal disorders, anxiety, and cardiac arrhythmias emerged as predictors of telehealth alerts in a quantitative multiple regression analysis (n = 168). Inappropriate telehealth measurement technique by patients (54%) and home healthcare system inefficiencies (37%) contributed to most telehealth false alerts in the purposive qualitative sub-sample (n = 35) of patients with high telehealth alerts. Conclusion: Encouraging patient engagement with the telehealth process, fostering a collaborative approach among all the clinicians involved with the telehealth intervention, and tailoring telehealth alert thresholds to patient characteristics, along with establishing patient-centered telehealth outcome goals, may allow meaningful generation of telehealth alerts. Reducing avoidable telehealth alerts could vastly improve the efficiency and sustainability of telehealth programs for HF management. PMID:24454576
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to consider evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
Solar-cell interconnect design for terrestrial photovoltaic modules
NASA Technical Reports Server (NTRS)
Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.
1984-01-01
Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.
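One common way to give the fatigue curve the statistical scatter discussed above is to fit a Weibull distribution to cumulative failure data from accelerated cycling; a minimal sketch with invented cycle counts (the paper's fitted functional form is not specified in the abstract):

```python
import numpy as np

# Hypothetical cycles-to-failure from an accelerated thermal-cycling test.
cycles = np.sort([410., 520., 610., 700., 820., 950., 1100., 1300., 1550., 1900.])
n = cycles.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)     # median-rank failure fractions

# Weibull CDF F = 1 - exp(-(N/eta)^beta) linearizes as
# ln(-ln(1 - F)) = beta*ln(N) - beta*ln(eta); fit by least squares.
x, y = np.log(cycles), np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} cycles")

# Predicted cumulative failures at a design cycle count, e.g. 1200 cycles:
N = 1200.0
print(f"F({N:.0f}) = {1 - np.exp(-(N / eta) ** beta):.2%}")
```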
Genetics of common forms of heart failure: challenges and potential solutions.
Rau, Christoph D; Lusis, Aldons J; Wang, Yibin
2015-05-01
In contrast to many other human diseases, the use of genome-wide association studies (GWAS) to identify genes for heart failure (HF) has had limited success. We will discuss the underlying challenges as well as potential new approaches to understanding the genetics of common forms of HF. Recent research using intermediate phenotypes, more detailed and quantitative stratification of HF symptoms, founder populations and novel animal models has begun to allow researchers to make headway toward explaining the genetics underlying HF using GWAS techniques. By expanding analyses of HF to improved clinical traits, additional HF classifications and innovative model systems, the intractability of human HF GWAS should be ameliorated significantly.
King, J. W.; Kennedy, F. S.; Hanley, H. G.; Lierl, J. J.; Fowler, M. R.; White, M. C.
1986-01-01
The increasingly frequent use of endomyocardial biopsies for diagnosis has provided the opportunity to study myocardial metabolism in patients with cardiac diseases. The authors have tested microassays of the hexose monophosphate shunt, glycolytic pathway, and Krebs cycle and demonstrated that they are easily and reproducibly performed on small pieces of cardiac tissue. They have also used these assays to study myocardial metabolism in 2 patients with endocarditis uncomplicated by congestive heart failure and in 2 patients with congestive heart failure due to idiopathic dilated cardiomyopathy. The ability to quantitate myocardial metabolism in biopsies from patients with a variety of cardiac diseases may enhance our understanding of cardiac pathophysiology. PMID:3706492
Nonlinear viscoelasticity and generalized failure criterion for biopolymer gels
NASA Astrophysics Data System (ADS)
Divoux, Thibaut; Keshavarz, Bavand; Manneville, Sébastien; McKinley, Gareth
2016-11-01
Biopolymer gels display a multiscale microstructure that is responsible for their solid-like properties. Upon external deformation, these soft viscoelastic solids exhibit a generic nonlinear mechanical response characterized by pronounced stress- or strain-stiffening prior to irreversible damage and failure, most often through macroscopic fractures. Here we show, on a model acid-induced protein gel, that the nonlinear viscoelastic properties of the gel can be described in terms of a 'damping function' which predicts the gel's mechanical response quantitatively up to the onset of macroscopic failure. Using a nonlinear integral constitutive equation built upon the experimentally measured damping function in conjunction with the power-law linear viscoelastic response, we derive the form of the stress growth in the gel following the start-up of steady shear. We also couple the shear stress response with Bailey's durability criterion for brittle solids in order to predict the critical values of the stress σc and strain γc for failure of the gel, and how they scale with the applied shear rate. This provides a generalized failure criterion for biopolymer gels across a range of different deformation histories. This work was funded by the MIT-France seed fund and by the CNRS PICS-USA scheme (#36939). BK acknowledges financial support from Axalta Coating Systems.
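A plausible rendering of the two modeling ingredients named above, assuming the standard time-strain-separable integral form with a power-law relaxation modulus and Bailey's criterion in its usual statement; the measured damping function h(γ) and failure-time law τ_f(σ) are fit in the paper and are not reproduced here:

```latex
% Power-law linear response and separable nonlinear kernel:
G(t) = S\,t^{-n}, \qquad m(s) = -\frac{\mathrm{d}G}{\mathrm{d}s}
% Stress growth after start-up of steady shear at rate \dot{\gamma}:
\sigma^{+}(t) = \dot{\gamma}\,t\,h(\dot{\gamma}t)\,G(t)
  + \dot{\gamma}\int_{0}^{t} m(s)\,s\,h(\dot{\gamma}s)\,\mathrm{d}s
% Bailey's (linear damage accumulation) criterion for the failure time t_f:
\int_{0}^{t_f} \frac{\mathrm{d}t}{\tau_f\!\big(\sigma(t)\big)} = 1
```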
NASA Astrophysics Data System (ADS)
Lansard, Erick; Frayssinhes, Eric; Palmade, Jean-Luc
Basically, the problem of designing a multisatellite constellation involves many parameters with many possible combinations: total number of satellites, orbital parameters of each individual satellite, number of orbital planes, number of satellites in each plane, spacings between satellites of each plane, spacings between orbital planes, and relative phasings between consecutive orbital planes. Fortunately, some authors have theoretically solved this complex problem under simplified assumptions: the permanent (or continuous) coverage of the whole Earth and of zonal areas by single and multiple satellites has been entirely solved from a purely geometrical point of view. These solutions exhibit strong symmetry properties (e.g. Walker, Ballard, Rider, Draim constellations): altitude and inclination are identical, orbital planes and satellites are regularly spaced, etc. The problem with such constellations is their oversimplified and restrictive geometrical assumptions. In fact, the evaluation function used implicitly only takes into account the point-to-point visibility between users and satellites and does not deal with very important constraints and considerations that become mandatory when designing a real satellite system (e.g. robustness to satellite failures, total system cost, common view between satellites and ground stations, service availability and satellite reliability, launch and early operations phase, production constraints, etc.). An original and global methodology relying on a powerful optimization tool based on genetic algorithms has been developed at ALCATEL ESPACE. In this approach, symmetrical constellations can be used as initial conditions of the optimization process together with specific evaluation functions. A multi-criteria performance analysis is conducted and presented here in a parametric way in order to identify and evaluate the main sensitive parameters. Quantitative results are given for three examples in the fields of navigation, telecommunication and multimedia satellite systems. In particular, a new design pattern with very efficient properties in terms of robustness to satellite failures is presented and compared with classical Walker patterns.
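To make the optimization loop concrete, a deliberately tiny genetic-algorithm sketch over a four-gene constellation chromosome; the fitness function is a toy surrogate (no Walker geometry, coverage simulation or failure analysis), so every number here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(c):
    """Toy multi-criteria score for c = (planes, sats_per_plane, inclination, phasing);
    a real run would score coverage, robustness to satellite failures, cost, etc."""
    planes, spp, inc, _phasing = c
    return planes * spp * np.sin(np.radians(inc)) - 0.5 * planes * spp

def mutate(c):
    planes, spp, inc, phasing = c
    return (max(1, planes + rng.integers(-1, 2)),
            max(1, spp + rng.integers(-1, 2)),
            float(np.clip(inc + rng.normal(0, 2), 0, 90)),
            (phasing + rng.integers(-1, 2)) % 360)

pop = [(int(rng.integers(1, 9)), int(rng.integers(1, 12)),
        float(rng.uniform(30, 90)), 0) for _ in range(40)]
for _ in range(200):                 # evolve: keep the best half, mutate it
    pop.sort(key=fitness, reverse=True)
    pop = pop[:20] + [mutate(c) for c in pop[:20]]
print(max(pop, key=fitness))
```

Symmetric (e.g. Walker) constellations can seed `pop` so that the search starts from the known geometric optima, as the abstract suggests.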
Vegter, Eline L; Ovchinnikova, Ekaterina S; Silljé, Herman H W; Meems, Laura M G; van der Pol, Atze; van der Velde, A Rogier; Berezikov, Eugene; Voors, Adriaan A; de Boer, Rudolf A; van der Meer, Peter
2017-01-01
We recently identified a set of plasma microRNAs (miRNAs) that are downregulated in patients with heart failure in comparison with control subjects. To better understand their meaning and function, we sought to validate these circulating miRNAs in 3 different well-established rat and mouse heart failure models, and correlated the miRNAs to parameters of cardiac function. The previously identified let-7i-5p, miR-16-5p, miR-18a-5p, miR-26b-5p, miR-27a-3p, miR-30e-5p, miR-199a-3p, miR-223-3p, miR-423-3p, miR-423-5p and miR-652-3p were measured by quantitative real-time polymerase chain reaction (qRT-PCR) in plasma samples of 8 homozygous TGR(mREN2)27 (Ren2) transgenic rats and 8 (control) Sprague-Dawley rats, 6 mice with angiotensin II-induced heart failure (AngII) and 6 control mice, and 8 mice with ischemic heart failure and 6 controls. Circulating miRNA levels were compared between the heart failure animals and healthy controls. Ren2 rats, AngII mice and mice with ischemic heart failure showed clear signs of heart failure, exemplified by increased left ventricular and lung weights, elevated end-diastolic left ventricular pressures, increased expression of cardiac stress markers and reduced left ventricular ejection fraction. All miRNAs were detectable in plasma from rats and mice. No significant differences were observed between the circulating miRNAs in heart failure animals and healthy controls (all P>0.05), and no robust associations with cardiac function could be found. The previous observation that miRNAs circulate at lower levels in human patients with heart failure could not be validated in well-established rat and mouse heart failure models. These results question the translation of data on human circulating miRNA levels to experimental models and, vice versa, the validity of experimental miRNA data for human heart failure.
Kale-Pradhan, Pramodini B.; Mariani, Nicholas P.; Wilhelm, Sheila M.; Johnson, Leonard B.
2015-01-01
Background: Vancomycin is used to treat serious infections caused by methicillin-resistant Staphylococcus aureus (MRSA). It is unclear whether MRSA isolates with minimum inhibitory concentration (MIC) 1.5 to 2 µg/mL are successfully treated with vancomycin. Objective: To evaluate vancomycin failure rates in MRSA bacteremia with an MIC <1.5 versus ≥1.5 µg/mL, and MIC ≤1 versus ≥2 µg/mL. Methods: A literature search was conducted using the MeSH terms vancomycin, MRSA, bacteremia, MIC, treatment and vancomycin failure to identify human studies published in English. All studies of patients with MRSA bacteremia treated with vancomycin were included if they evaluated vancomycin failures, defined as mortality, and reported associated MICs determined by E-test. Study sample size, vancomycin failure rates, and corresponding MIC values were extracted and analyzed using RevMan 5.2.5. Results: Thirteen studies including 2955 patients met all criteria. Twelve studies including 2861 patients evaluated outcomes using an MIC cutoff of 1.5 µg/mL. A total of 413 of 1186 (34.8%) patients with an MIC <1.5 and 531 of 1675 (31.7%) patients with an MIC of ≥1.5 µg/mL experienced treatment failure (odds ratio = 0.72, 95% confidence interval = 0.49-1.04, P = .08). Six studies evaluated 728 patients using the cutoffs of ≤1 and ≥2 µg/mL. A total of 384 patients had isolates with MIC ≤1 µg/mL, and 344 had an MIC ≥2 µg/mL. Therapeutic failure occurred in 87 and 102 patients, respectively (odds ratio = 0.61, 95% confidence interval = 0.34-1.10, P = .10). As heterogeneity between the studies was high, a random-effects model was used. Conclusion: Vancomycin MIC may not be an optimal sole indicator of vancomycin treatment failure in MRSA bacteremia.
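A minimal sketch of the pooling arithmetic behind such a meta-analysis, using the DerSimonian-Laird random-effects estimator (a standard choice when heterogeneity is high); the 2x2 counts are invented:

```python
import numpy as np

# Hypothetical per-study counts: (failures_high_MIC, n_high, failures_low_MIC, n_low)
studies = [(40, 120, 30, 110), (55, 150, 60, 180), (20, 80, 25, 75)]

y, v = [], []
for f1, n1, f0, n0 in studies:
    a, b, c, d = f1, n1 - f1, f0, n0 - f0
    y.append(np.log((a * d) / (b * c)))   # log odds ratio
    v.append(1/a + 1/b + 1/c + 1/d)       # its approximate variance
y, v = np.array(y), np.array(v)

w = 1 / v                                  # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)
tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

w_re = 1 / (v + tau2)                      # random-effects weights
pooled = np.sum(w_re * y) / w_re.sum()
se = 1 / np.sqrt(w_re.sum())
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")
```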
Opasich, C; Cobelli, F; Riccardi, G; La Rovere, M T; Calsamiglia, G; Specchia, G
1988-04-01
The anaerobic threshold (AT) has been proposed as an index to assess the functional status of patients with chronic heart failure. The focus of this report was to evaluate in post-myocardial infarction patients the utility of the AT for (a) assessing the severity of exercise-induced left ventricular impairment, (b) determining the responses obtained from different treatments and (c) prescribing exercise training. We found that the AT level was lower in patients with abnormal haemodynamic patterns during exercise. The AT was correlated to different degrees of exercise-induced left ventricular impairment. The nitrate and calcium-antagonist effects have been evaluated in patients with abnormal exercise haemodynamics. The resting and exertional results were in agreement with the vasodilator effects. Moreover, the time from onset of exercise to the appearance of the AT was significantly increased by the treatments. Thus, AT during pharmacological treatments may be a non-invasive useful parameter for assessing their haemodynamic effects. Finally, a 4-week intermittent training programme based on AT level was evaluated in patients with abnormal resting and exertional haemodynamics. The results showed an improvement of the exercise cardiovascular tolerance without negative effects on left ventricular function. Therefore, the AT seems to be useful when prescribing a rational and individualized training programme.
Liu, Yitong
2018-05-18
An increased use of herbal dietary supplements has been associated with adverse liver effects such as elevated serum enzymes and liver failure. The safety assessment of herbal dietary supplements is challenging since they often contain complex mixtures of phytochemicals, most of which have unknown pharmacokinetic and toxicological properties. Rapid tools are needed to evaluate large numbers of phytochemicals for potential liver toxicity. The current study demonstrates a tiered approach combining identification of phytochemicals in liver-toxic botanicals, followed by in silico quantitative structure-activity relationship (QSAR) evaluation of these phytochemicals for absorption (e.g. permeability), metabolism (cytochromes P450) and liver toxicity (e.g. elevated transaminases). First, 255 phytochemicals from 20 botanicals associated with clinical liver injury were identified, and the phytochemical structures were subsequently used for QSAR evaluation. Of these, 193 phytochemicals were predicted to be absorbed and were then used to generate metabolites; both parents and metabolites were used to predict liver toxicity. Forty-eight phytochemicals were predicted to be liver toxic, due either to the parent phytochemical or to its metabolites. Among them, nineteen phytochemicals have previous evidence of liver toxicity (e.g. pyrrolizidine alkaloids), while the majority were newly discovered (e.g. sesquiterpenoids). These findings help reveal new toxic phytochemicals in herbal dietary supplements and prioritize future toxicological testing. Published by Elsevier Ltd.
Gadkar, Kapil; Lu, James; Sahasranaman, Srikumar; Davis, John; Mazer, Norman A.; Ramanujan, Saroja
2016-01-01
The recent failures of cholesteryl ester transport protein inhibitor drugs to decrease CVD risk, despite raising HDL cholesterol (HDL-C) levels, suggest that pharmacologic increases in HDL-C may not always reflect elevations in reverse cholesterol transport (RCT), the process by which HDL is believed to exert its beneficial effects. HDL-modulating therapies can affect HDL properties beyond total HDL-C, including particle numbers, size, and composition, and may contribute differently to RCT and CVD risk. The lack of validated easily measurable pharmacodynamic markers to link drug effects to RCT, and ultimately to CVD risk, complicates target and compound selection and evaluation. In this work, we use a systems pharmacology model to contextualize the roles of different HDL targets in cholesterol metabolism and provide quantitative links between HDL-related measurements and the associated changes in RCT rate to support target and compound evaluation in drug development. By quantifying the amount of cholesterol removed from the periphery over the short-term, our simulations show the potential for infused HDL to treat acute CVD. For the primary prevention of CVD, our analysis suggests that the induction of ApoA-I synthesis may be a more viable approach, due to the long-term increase in RCT rate. PMID:26522778
NASA Astrophysics Data System (ADS)
Sajun Prasad, K.; Panda, Sushanta Kumar; Kar, Sujoy Kumar; Sen, Mainak; Murty, S. V. S. Naryana; Sharma, Sharad Chandra
2017-04-01
Recently, aerospace industries have shown increasing interest in the forming limits of Inconel 718 sheet metal, which can be utilised in designing tools and selecting process parameters for the successful fabrication of components. In the present work, the stress-strain response with failure strains was evaluated by uniaxial tensile tests in different orientations, and two-stage work-hardening behavior was observed. In spite of a highly preferred texture, tensile properties showed minor variations across orientations due to the random distribution of nanoprecipitates. The forming limit strains were evaluated by deforming specimens along seven different strain paths using a limiting dome height (LDH) test facility. Mostly, the specimens failed without prior indication of localized necking. Thus, a fracture forming limit diagram (FFLD) was evaluated, and a bending correction was imposed due to the use of a sub-size hemispherical punch. The failure strains of the FFLD were converted into major-minor stress space (σ-FFLD) and effective plastic strain-stress triaxiality space (η-EPS-FFLD) as failure criteria to avoid strain path dependence. Moreover, an FE model was developed, and the LDH, strain distribution and failure location were predicted successfully using the above-mentioned failure criteria with two stages of work hardening. Fractographs were correlated with the fracture behavior and formability of the sheet metal.
How and why of orthodontic bond failures: An in vivo study
Vijayakumar, R. K.; Jagadeep, Raju; Ahamed, Fayyaz; Kanna, Aprose; Suresh, K.
2014-01-01
Introduction: The bonding of orthodontic brackets and their failure rates with both direct and indirect procedures are well documented in the orthodontic literature. Over the years, different adhesive materials and various indirect bonding transfer procedures have been compared and evaluated for bond failure rates. The aim of our study is to highlight the use of a simple, inexpensive and easily manipulated single thermoplastic transfer tray, and of a single light-cure adhesive, to evaluate bond failure rates in clinical situations. Materials and Methods: A total of 30 patients were randomly divided into two groups (Group A and Group B). A split-mouth study design was used for both groups so that they were distributed equally without bias. After initial prophylaxis, both procedures were performed as per the manufacturer's instructions. All patients were initially motivated and were reviewed for bond failure rates for 6 months. Results: Bond failure rates were assessed for the overall direct and indirect procedures, for anterior and posterior arches, and for individual teeth. A Z-test was used to statistically analyze the normal distribution of the sample in the split-mouth study. The results of the two groups were compared and the P value was calculated using a Z-proportion test to assess the significance of bond failure. Conclusion: Overall bond failure was higher for direct bonding. Anterior bracket failure was higher with direct bonding than with the indirect procedure, which showed more posterior bracket failures. For individual tooth bond failure, mandibular incisor and premolar brackets showed the most failures, followed by maxillary premolars and canines. PMID:25210392
Process Evaluation of a Workers' Health Surveillance Program for Meat Processing Workers.
van Holland, Berry J; Brouwer, Sandra; de Boer, Michiel R; Reneman, Michiel F; Soer, Remko
2017-09-01
Objective: To evaluate the implementation process of a workers' health surveillance (WHS) program in a Dutch meat processing company. Methods: Workers from five plants were eligible to participate in the WHS program. The program consisted of four evaluative components and an intervention component. Qualitative and quantitative methods were used to evaluate seven process aspects. Data were gathered through interviews with stakeholders, participant questionnaires, and registries of the company and the occupational health service. Results: Two recruitment strategies were used: open invitation or automatic participation. Of the 986 eligible workers, 305 participated in the program; average reach was 53%. Two of the five program components could not be assessed on dose delivered, dose received, and fidelity. Where components were assessable, 85-100% of the components were delivered, 66-100% were received by participants, and fidelity was 100%. Participants were satisfied with the WHS program (mean score 7.6). Contextual factors that facilitated implementation included societal developments and management support. Factors that formed barriers were program novelty and delayed follow-up. Conclusion: The WHS program was well received by participants. Not all participants were offered the same number of program components, and not all components were performed according to protocol. Deviation from protocol is an indication of program failure and may affect program effectiveness.
47,XXX in an adolescent with premature ovarian failure and autoimmune disease.
Holland, C M
2001-05-01
Premature ovarian failure (POF) may be idiopathic or may be associated with genetic or autoimmune disorders. The 47,XXX karyotype has been associated with POF and other genitourinary anomalies. A 17-year-old woman with a history of immune thrombocytopenic purpura was referred to the adolescent medicine clinic for evaluation of oligomenorrhea with secondary amenorrhea. Evaluation revealed hypergonadotrophic premature ovarian failure, a positive antinuclear antibody, and the 47,XXX karyotype. She has since developed a positive anti-cardiolipin antibody but does not meet diagnostic criteria for systemic lupus erythematosus. The presence of known autoimmune disease in a woman with POF should not dissuade the clinician from evaluating for a potential genetic cause.
Piller, Linda B; Davis, Barry R; Cutler, Jeffrey A; Cushman, William C; Wright, Jackson T; Williamson, Jeff D; Leenen, Frans HH; Einhorn, Paula T; Randall, Otelio S; Golden, John S; Haywood, L Julian
2002-01-01
Background: The Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) is a randomized, double-blind, active-controlled trial designed to compare the rate of coronary heart disease events in high-risk hypertensive participants initially randomized to a diuretic (chlorthalidone) versus each of three alternative antihypertensive drugs: alpha-adrenergic blocker (doxazosin), ACE inhibitor (lisinopril), and calcium-channel blocker (amlodipine). Combined cardiovascular disease risk was significantly increased in the doxazosin arm compared to the chlorthalidone arm (RR 1.25; 95% CI, 1.17–1.33; P < .001), with a doubling of heart failure (fatal, hospitalized, or non-hospitalized but treated) (RR 2.04; 95% CI, 1.79–2.32; P < .001). Questions about heart failure diagnostic criteria led to steps to validate these events further. Methods and Results: Baseline characteristics (age, race, sex, blood pressure) did not differ significantly between treatment groups (at P < .05) for participants with heart failure events. Post-event pharmacologic management was similar in both groups and generally conformed to accepted heart failure therapy. Central review of a small sample of cases showed high adherence to ALLHAT heart failure criteria. Of 105 participants with quantitative ejection fraction measurements provided (67% by echocardiogram, 31% by catheterization), 29/46 (63%) from the chlorthalidone group and 41/59 (70%) from the doxazosin group were at or below 40%. Two-year heart failure case-fatality rates (22% and 19% in the doxazosin and chlorthalidone groups, respectively) were as expected and did not differ significantly (RR 0.96; 95% CI, 0.67–1.38; P = 0.83). Conclusion: Results of the validation process supported findings of increased heart failure in the ALLHAT doxazosin treatment arm compared with the chlorthalidone treatment arm. PMID:12459039
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and to validate its feasibility by simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the external standard method and QAMS. No significant differences were found between the quantitative results for the five alkaloids in the 21 batches of S. flavescens determined by the external standard method and by QAMS. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
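Since the relative-correction-factor arithmetic is the core of QAMS, a minimal sketch may help. This assumes matrine as the single calibrated marker; the correction factors and peak areas below are invented for illustration and are not the paper's values:

```python
# Hypothetical sketch of QAMS arithmetic: quantify several alkaloids from one
# calibrated marker via relative correction factors (RCFs). All numbers invented.

# The RCF for analyte i against marker s is f_i = (A_s / C_s) / (A_i / C_i),
# estimated once from standards; afterwards only the marker needs a reference
# standard at assay time.

marker_area, marker_conc = 1520.0, 0.40      # peak area and known conc. (mg/mL) of matrine
rcf = {"oxymatrine": 1.12, "sophocarpine": 0.95,
       "oxysophocarpine": 1.08, "sophoridine": 0.89}   # hypothetical RCFs
sample_areas = {"oxymatrine": 2100.0, "sophocarpine": 880.0,
                "oxysophocarpine": 640.0, "sophoridine": 410.0}

marker_response = marker_area / marker_conc   # area per unit concentration
for analyte, area in sample_areas.items():
    # C_i = f_i * A_i / (A_s / C_s): the marker's response factor, scaled by the RCF
    conc = rcf[analyte] * area / marker_response
    print(f"{analyte}: {conc:.3f} mg/mL")
```

The design choice QAMS exploits is that only the marker requires a reference standard per assay; the other analytes ride on previously established correction factors.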
Hypercalcemia with renal failure.
Bhavani, Nisha; Praveen, Valiyaparambil Pavithran; Jayakumar, Rohinivilasam Vasukutty; Nair, Vasantha; Muraleedharan, Mangath; Kuma, Harish; Unnikrishnan, Ambika Gopalakrishnan; Menon, Vadayath Usha
2012-06-01
We report a case of nephrocalcinosis with renal failure which on evaluation was found to have hypercalcemia. Further investigations showed an inappropriately normal intact parathormone (iPTH) and 1,25 dihydroxy-vitamin D level in the setting of renal failure. Probing for a cause of non-PTH mediated hypercalcemia led to the diagnosis of sarcoidosis. Treatment with glucocorticoids could partially reverse the renal failure and control the hypercalcemia. This case illustrates the importance of careful interpretation of laboratory parameters, especially levels of iPTH and vitamin D metabolites, in renal failure.
Failure mechanisms of fibrin-based surgical tissue adhesives
NASA Astrophysics Data System (ADS)
Sierra, David Hugh
A series of studies was performed to investigate the potential impact of heterogeneity in the matrix of multiple-component fibrin-based tissue adhesives upon their mechanical and biomechanical properties both in vivo and in vitro. Investigations into the failure mechanisms by stereological techniques demonstrated that heterogeneity could be measured quantitatively and that the variation in heterogeneity could be altered both by the means of component mixing and delivery and by the formulation of the sealant. Ex vivo tensile adhesive strength was found to be inversely proportional to the amount of heterogeneity. In contrast, in vivo tensile wound-closure strength was found to be relatively unaffected by the degree of heterogeneity, while in vivo parenchymal organ hemostasis in rabbits was found to be affected: greater heterogeneity appeared to correlate with an increase in hemostasis time and in the amount of sealant necessary to effect hemostasis. Tensile testing of the bulk sealant showed that mechanical parameters were proportional to fibrin concentration and that the physical characteristics of the failure supported a ductile mechanism. Strain hardening as a function of strain percentage and strain rate was observed for both concentrations, and syneresis was observed at low strain rates for the lower fibrin concentration. Blister testing demonstrated that burst pressure and failure energy were proportional to fibrin concentration and decreased with increasing flow rate. The higher fibrin concentration demonstrated predominantly compact-morphology debonds with cohesive failure loci, demonstrating shear or viscous failure in a viscoelastic rubbery adhesive. The lower fibrin concentration sealant exhibited predominantly fractal-morphology debonds with cohesive failure loci, supporting an elastoviscous material condition. The failure mechanism for these was hypothesized and shown to be flow-induced ductile fracture. Based on these findings, the failure mechanism was stochastic in nature because the mean failure energy and burst pressure values were not predictive of locus and morphology. Instead, flow rate and fibrin concentration showed the most predictive value, with the outcome best described as a probability distribution rather than a specific deterministic outcome.
Lemmens, Karin M M; Rutten-Van Mölken, Maureen P M H; Cramm, Jane M; Huijsman, Robbert; Bal, Roland A; Nieboer, Anna P
2011-01-10
Disease management programmes (DMPs) have been developed to improve effectiveness and economic efficiency within chronic care delivery by combining patient-related, professional-directed, and organisational interventions. The benefits of DMPs within different settings, patient groups, and versions remain unclear. In this article we propose a protocol to evaluate a range of current DMPs by capturing them in a single conceptual framework, employing comparable structure, process, and outcome measures, and combining qualitative and quantitative research methods. To assess DMP effectiveness a practical clinical trial will be conducted. Twenty-two disease management experiments will be studied in various Dutch regions consisting of a variety of collaborations between organisations and/or professionals. Patient cohorts include those with cardiovascular diseases, chronic obstructive pulmonary disease, diabetes, stroke, depression, psychotic diseases, and eating disorders. Our methodological approach combines qualitative and quantitative research methods to enable a comprehensive evaluation of complex programmes. Process indicators will be collected from health care providers' data registries and measured via physician and staff questionnaires. Patient questionnaires include health care experiences, health care utilisation, and quality of life. Qualitative data will be gathered by means of interviews and document analysis for an in depth description of project interventions and the contexts in which DMPs are embedded, and an ethnographic process evaluation in five DMPs. Such a design will provide insight into ongoing DMPs and demonstrate which elements of the intervention are potentially (cost)-effective for which patient populations. It will also enable sound comparison of the results of the different programmes. The study will lead to a better understanding of (1) the mechanisms of disease management, (2) the feasibility, and cost-effectiveness of a disease management approach to improving health care, and (3) the factors that determine success and failure of DMPs. Our study results will be relevant to decision makers and managers who confront the challenge of implementing and integrating DMPs into the health care system. Moreover, it will contribute to the search for methods to evaluate complex healthcare interventions.
Pavlova, Viola; Grimm, Volker; Dietz, Rune; Sonne, Christian; Vorkamp, Katrin; Rigét, Frank F; Letcher, Robert J; Gustavson, Kim; Desforges, Jean-Pierre; Nabe-Nielsen, Jacob
2016-01-01
Polychlorinated biphenyls (PCBs) can cause endocrine disruption, cancer, immunosuppression, or reproductive failure in animals. We used an individual-based model to explore whether and how PCB-associated reproductive failure could affect the dynamics of a hypothetical polar bear (Ursus maritimus) population exposed to PCBs to the same degree as the East Greenland subpopulation. Dose-response data from experimental studies on a surrogate species, the mink (Mustela vison), were used in the absence of similar data for polar bears. Two alternative types of reproductive failure in relation to maternal sum-PCB concentrations were considered: increased abortion rate and increased cub mortality. We found that the quantitative impact of PCB-induced reproductive failure on population growth rate depended largely on the actual type of reproductive failure involved. Critical potencies of the dose-response relationship for decreasing the population growth rate were established for both modeled types of reproductive failure. Comparing the model predictions of the age-dependent trend of sum-PCB concentrations in females with actual field measurements from East Greenland indicated that it was unlikely that PCB exposure caused a high incidence of abortions in the subpopulation. However, on the basis of this analysis, it could not be excluded that PCB exposure contributes to higher cub mortality. Our results highlight the necessity for further research on the possible influence of PCBs on polar bear reproduction regarding their physiological pathway. This includes determining the exact cause of reproductive failure, i.e., in utero exposure versus lactational exposure of offspring; the timing of offspring death; and establishing the most relevant reference metrics for the dose-response relationship.
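The population-level logic can be illustrated with a far simpler projection than the paper's individual-based model. A hedged sketch, with invented vital rates and an invented logistic dose-response linking maternal sum-PCB concentration to cub mortality (nothing below reproduces the study's mink-derived curves):

```python
import math

# Toy projection contrasting how a PCB-driven rise in cub mortality depresses
# a crude population growth proxy. All numbers are invented for illustration.

def cub_mortality(pcb, base=0.35, slope=0.5, ec50=10.0):
    """Invented logistic dose-response: extra cub mortality rises with sum-PCB (ppm)."""
    extra = (1 - base) / (1 + math.exp(-slope * (pcb - ec50)))
    return base + extra

def growth_rate(cub_surv, adult_surv=0.93, litter=1.6, breed_prob=0.3):
    # One-generation growth proxy: surviving adults plus recruited female cubs.
    return adult_surv + 0.5 * breed_prob * litter * cub_surv  # 0.5 = female cubs only

for pcb in (0.0, 5.0, 10.0, 20.0):
    lam = growth_rate(1.0 - cub_mortality(pcb))
    print(f"sum-PCB {pcb:5.1f} ppm -> lambda ~ {lam:.3f}")
```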
Dynamic deformations and the M6.7, Northridge, California earthquake
Gomberg, J.
1997-01-01
A method of estimating the complete time-varying dynamic deformation field from commonly available three-component single-station seismic data has been developed and applied to study the relationship between dynamic deformation and ground failures and structural damage using observations from the 1994 Northridge, California earthquake. Estimates from throughout the epicentral region indicate that the horizontal strains exceed the vertical ones by more than a factor of two. The largest strains (exceeding ~100 μstrain) correlate with regions of greatest ground failure. There is a poor correlation between structural damage and peak strain amplitudes. The smallest strains, ~35 μstrain, are estimated in regions of no damage or ground failure. Estimates in the two regions with most severe and well mapped permanent deformation, Potrero Canyon and the Granada-Mission Hills regions, exhibit the largest strains; peak horizontal strain estimates in these regions equal ~139 and ~229 μstrain, respectively. Of note, the dynamic principal strain axes have strikes consistent with the permanent failure features, suggesting that, while gravity, sub-surface materials, and hydrologic conditions undoubtedly played fundamental roles in determining where and what types of failures occurred, the dynamic deformation field may have been favorably sized and oriented to initiate failure processes. These results support other studies that conclude that the permanent deformation resulted from ground shaking, rather than from static strains associated with primary or secondary faulting. They also suggest that such an analysis, either using data or theoretical calculations, may enable observations of paleo-ground failure to be used as quantitative constraints on the size and geometry of previous earthquakes. © 1997 Elsevier Science Limited.
Substantial vertebral body osteophytes protect against severe vertebral fractures in compression
Aubin, Carl-Éric; Chaumoître, Kathia; Mac-Thiong, Jean-Marc; Ménard, Anne-Laure; Petit, Yvan; Garo, Anaïs; Arnoux, Pierre-Jean
2017-01-01
Recent findings suggest that vertebral osteophytes increase the resistance of the spine to compression. However, the role of vertebral osteophytes in the biomechanical response of the spine under fast dynamic compression, up to failure, is unclear. Seventeen human spine specimens composed of three vertebrae (from T5-T7 to T11-L1) and their surrounding soft tissues were harvested from nine cadavers, aged 77 to 92 years. Specimens were imaged using quantitative computed tomography (QCT) for medical observation, classification of the intervertebral disc degeneration (Thomson grade) and measurement of the vertebral trabecular density (VTD), height and cross-sectional area. Specimens were divided into two groups (with (n = 9) or without (n = 8) substantial vertebral body osteophytes) and compressed axially at a dynamic displacement rate of 1 m/s, up to failure. Normalized force-displacement curves, videos and QCT images allowed characterization of failure parameters (force, displacement and energy at failure) and fracture patterns. Results were analyzed using chi-squared tests for sampling distributions and linear regression for correlations between VTD and failure parameters. Specimens with substantial vertebral body osteophytes presented higher stiffness (2.7 times on average) and force at failure (1.8 times on average) than the other segments. The presence of osteophytes significantly influenced the location, pattern and type of fracture. VTD was a good predictor of the dynamic force and energy at failure for specimens without substantial osteophytes. This study also showed that vertebral body osteophytes provide a protective mechanism to the underlying vertebra against severe compression fractures. PMID:29065144
Failure to Learn from Failure: Evaluating Computer Systems in Medicine
Grann, Richard P.
1980-01-01
Evaluation of ADP systems in medicine frequently becomes mired in problems of tenuous cost measurement, of proving illusory cost savings, of false precision, and of dubious discounting methods, while giving only superficial treatment to non-dollar benefits. It would frequently be more advantageous to study non-dollar impacts with greater care and rigor.
Ecological validity of the five digit test and the oral trails test.
Paiva, Gabrielle Chequer de Castro; Fialho, Mariana Braga; Costa, Danielle de Souza; Paula, Jonas Jardim de
2016-01-01
Tests evaluating the attentional-executive system are widely used in clinical practice. However, the proximity of an objective cognitive test to real-world situations (ecological validity) is not frequently investigated. The present study evaluated the association between measures of the Five Digit Test (FDT) and the Oral Trails Test (OTT) and self-reported cognitive failures in everyday life as measured by the Cognitive Failures Questionnaire (CFQ). Brazilian adults aged 18 to 65 years voluntarily performed the FDT and OTT and reported the frequency of cognitive failures in their everyday life through the CFQ. After controlling for the age effect, the measures of controlled attentional processes were associated with cognitive failures, yet the cognitive flexibility measures of both FDT and OTT accounted for the majority of variance in most of the CFQ factors. The FDT and OTT measures were predictive of real-world problems such as cognitive failures in everyday activities and situations.
The Impact of a Professional Learning Community on Student Academic Achievement in Mathematics
ERIC Educational Resources Information Center
Powell, Eldridge
2013-01-01
The continued failure of a large, suburban, Title I high school to meet adequate yearly performance indicators as defined in the No Child Left Behind Act is a concern for school personnel, district officials, and the school community. The purpose of this quantitative study was to determine whether 9th grade teachers working collaboratively, within…
Lies of the Reader: Disadvantages of the Sociological Research Methods for the Study of the Reading
ERIC Educational Resources Information Center
Tsvetkova, Milena I.
2018-01-01
The research problem of this study is the difficulty of explaining the phenomenon of reading, amid its accelerated transformations, using quantitative sociological methods, owing to their failure to account for a number of factors: first, the social aspects of the purchase, consumption and possession of reading materials have not yet been…
ERIC Educational Resources Information Center
Buhrman, B. R.
2010-01-01
Concerned educators have been implementing ninth-grade transition programs to help freshmen adjust to the demands in high school and to reduce ninth-grade failure rates. The purpose of this quasi-experimental quantitative study was to investigate the impact of a ninth-grade transition program. The research questions addressed impact on cumulative…
ERIC Educational Resources Information Center
Strand, Kerry J.
2013-01-01
A baccalaureate degree is essential to success in the contemporary United States. The degree offers improved economic security and the development of capabilities such as critical thinking, effective communication, quantitative reasoning, creativity, problem solving, personal and social responsibility, and social and cultural capital. Failure to…
ERIC Educational Resources Information Center
Pogrow, Stanley
2017-01-01
One of the major successes of advanced quantitative methods has been their seeming ability to provide unbiased determinations of which education practices are effective for education in general and for improving the educational achievement and opportunity of the neediest students. The power of this methodology as applied in the top education…
Effect of the infrastructure material on the failure behavior of prosthetic crowns.
Sonza, Queli Nunes; Della Bona, Alvaro; Borba, Márcia
2014-05-01
To evaluate the effect of infrastructure (IS) material on the fracture behavior of prosthetic crowns. Restorations were fabricated using a metal die simulating a prepared tooth. Four groups were evaluated: YZ-C, Y-TZP (In-Ceram YZ, Vita) IS produced by CAD-CAM; IZ-C, In-Ceram Zirconia (Vita) IS produced by CAD-CAM; IZ-S, In-Ceram Zirconia (Vita) IS produced by slip-cast; MC, metal IS (control). The IS were veneered with porcelain and resin cemented to fiber-reinforced composite dies. Specimens were loaded in compression to failure using a universal testing machine. The load was applied at a 30° angle by a spherical piston, in 37°C distilled water. Fractography was performed using a stereomicroscope and SEM. Data were statistically analyzed with ANOVA and Student-Newman-Keuls tests (α=0.05). Significant differences were found between groups (p=0.022). MC showed the highest mean failure load, statistically similar to YZ-C. There was no statistical difference between YZ-C, IZ-C and IZ-S. MC and YZ-C showed no catastrophic failure. IZ-C and IZ-S showed chipping and catastrophic failures. The fracture behavior is similar to reported clinical failures. Considering the ceramic systems evaluated, YZ-C and MC crowns present greater fracture load and a more favorable failure mode than In-Ceram Zirconia crowns, regardless of the fabrication type (CAD-CAM or slip-cast). Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
On the buckling of an elastic holey column
Hazel, A. L.; Pihler-Puzović, D.
2017-01-01
We report the results of a numerical and theoretical study of buckling in elastic columns containing a line of holes. Buckling is a common failure mode of elastic columns under compression, found over scales ranging from metres in buildings and aircraft to tens of nanometers in DNA. This failure usually occurs through lateral buckling, described for slender columns by Euler’s theory. When the column is perforated with a regular line of holes, a new buckling mode arises, in which adjacent holes collapse in orthogonal directions. In this paper, we firstly elucidate how this alternate hole buckling mode coexists and interacts with classical Euler buckling modes, using finite-element numerical calculations with bifurcation tracking. We show how the preferred buckling mode is selected by the geometry, and discuss the roles of localized (hole-scale) and global (column-scale) buckling. Secondly, we develop a novel predictive model for the buckling of columns perforated with large holes. This model is derived without arbitrary fitting parameters, and quantitatively predicts the critical strain for buckling. We extend the model to sheets perforated with a regular array of circular holes and use it to provide quantitative predictions of their buckling. PMID:29225498
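For orientation, the classical Euler threshold that the paper's alternate hole mode competes with can be written out (a textbook result under pinned-pinned boundary conditions, not the authors' perforated-column model):

```latex
P_{\mathrm{cr}} = \frac{\pi^{2} E I}{L^{2}},
\qquad
\varepsilon_{\mathrm{cr}} = \frac{P_{\mathrm{cr}}}{E A}
                          = \pi^{2}\left(\frac{r_{g}}{L}\right)^{2},
\qquad r_{g} = \sqrt{I/A}.
```

The critical strain thus depends only on slenderness; hole buckling becomes the preferred mode when the hole-scale instability sets in at a lower strain than this column-scale threshold.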
NASA Technical Reports Server (NTRS)
Panda, Binayak
2009-01-01
Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectrometry (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron, with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photoelectron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites; this technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger microscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.
Pharmaco-EEG: A Study of Individualized Medicine in Clinical Practice.
Swatzyna, Ronald J; Kozlowski, Gerald P; Tarnow, Jay D
2015-07-01
Pharmaco-electroencephalography (Pharmaco-EEG) studies using clinical EEG and quantitative EEG (qEEG) technologies have existed for more than 4 decades. This is a promising area that could improve psychotropic intervention using neurological data. One of the objectives in our clinical practice has been to collect EEG and quantitative EEG (qEEG) data. In the past 5 years, we have identified a subset of refractory cases (n = 386) found to share a small number of electrophysiological features across the following diagnostic categories: mood, anxiety, autistic spectrum, and attention deficit disorders. Four abnormalities were noted in the majority of medication-failure cases, and these abnormalities did not appear to align significantly with the diagnoses: encephalopathy, focal slowing, beta spindles, and transient discharges. To analyze the relationship noted, the abnormalities were tested for association with the assigned diagnoses. Fisher's exact test and binary logistic regression found very little (6%) association between particular EEG/qEEG abnormalities and diagnoses. Findings from studies of this type suggest that EEG/qEEG provides individualized understanding of pharmacotherapy failures and has the potential to improve medication selection. © EEG and Clinical Neuroscience Society (ECNS) 2014.
NASA Astrophysics Data System (ADS)
Lee, Seung Yup; Pakela, Julia M.; Helton, Michael C.; Vishwanath, Karthik; Chung, Yooree G.; Kolodziejski, Noah J.; Stapels, Christopher J.; McAdams, Daniel R.; Fernandez, Daniel E.; Christian, James F.; O'Reilly, Jameson; Farkas, Dana; Ward, Brent B.; Feinberg, Stephen E.; Mycek, Mary-Ann
2017-12-01
In reconstructive surgery, the ability to detect blood flow interruptions to grafted tissue represents a critical step in preventing postsurgical complications. We have developed and pilot tested a compact, fiber-based device that combines two complementary modalities, diffuse correlation spectroscopy (DCS) and diffuse reflectance spectroscopy, to quantitatively monitor blood perfusion. We present a proof-of-concept study on an in vivo porcine model (n=8). With a controllable arterial blood flow supply, occlusion studies (n=4) were performed on surgically isolated free flaps while the device simultaneously monitored blood flow through the supplying artery as well as flap perfusion from three orientations: the distal side of the flap and two transdermal channels. Further studies featuring long-term monitoring, arterial failure simulations, and venous failure simulations were performed on flaps that had undergone an anastomosis procedure (n=4). Additionally, benchtop verification of the DCS system was performed on liquid flow phantoms. Data revealed relationships between diffuse optical measures and state of occlusion, as well as the ability to detect arterial and venous compromise. The compact construction of the device, along with its noninvasive and quantitative nature, would make this technology suitable for clinical translation.
Vidal, José E; Gerhardt, Juliana; Peixoto de Miranda, Erique J; Dauar, Rafi F; Oliveira Filho, Gilberto S; Penalva de Oliveira, Augusto C; Boulware, David R
2012-05-01
This retrospective study aimed to evaluate the clinical, laboratory, and quantitative cerebrospinal fluid (CSF) cryptococcal cell counts for associations with in-hospital outcomes of HIV-infected patients with cryptococcal meningitis. Ninety-eight HIV-infected adult patients with CSF culture-proven cryptococcal meningitis were admitted between January 2006 and June 2008 at a referral center in Sao Paulo, Brazil. Cryptococcal meningitis was the first AIDS-defining illness in 69%, although 97% (95/98) had known prior HIV infection. The median CD4+ T-cell count was 39 cells/μL (interquartile range 17-87 cells/μL). Prior antiretroviral therapy was reported in 50%. Failure to sterilize the CSF by 7-14 days was associated with a baseline fungal burden of ≥ 10 yeasts/μL by quantitative CSF microscopy (odds ratio [OR] = 15.3, 95% confidence interval [CI] 4.1-56.7; P < 0.001) and positive blood cultures (OR = 11.5, 95% CI 1.2-109; P = 0.034). At 7-14 days, ≥ 10 yeasts/μL CSF was associated with positive CSF cultures in 98% versus 36% with <10 yeasts/μL CSF (P < 0.001). In-hospital mortality was 30% and was associated with symptom duration >14 days, altered mental status (P < 0.001), CSF white blood cell counts <5 cells/μL (P = 0.027), intracranial hypertension (P = 0.011), viral loads >50,000 copies/mL (P = 0.036), ≥ 10 yeasts/μL CSF at 7-14 days (P = 0.038), and intracranial pressure >50 cmH2O at 7-14 days (P = 0.007). In conclusion, most patients were aware of their HIV status. Fungal burden of ≥ 10 yeasts/μL by quantitative CSF microscopy predicted current CSF culture status and may be useful to customize the induction therapy. High uncontrolled intracranial pressure was associated with mortality. Copyright © 2012 Elsevier Inc. All rights reserved.
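The reported effect sizes are plain 2x2-table statistics. A minimal sketch of the standard odds ratio with a Wald 95% CI (counts invented for illustration, though chosen to land near the reported OR of 15.3; this is not the study's analysis code):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI for a 2x2 table: rows = exposed/unexposed, cols = event/no event."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se_log) for s in (-1, 1))
    return or_, lo, hi

# Invented counts: failure-to-sterilize vs baseline burden >= 10 yeasts/uL
print(odds_ratio_ci(28, 7, 12, 46))   # -> roughly (15.3, 5.4, 43.5)
```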
[Biochemical failure after curative treatment for localized prostate cancer].
Zouhair, Abderrahim; Jichlinski, Patrice; Mirimanoff, René-Olivier
2005-12-07
Biochemical failure after curative treatment for localized prostate cancer is frequent. The diagnosis of biochemical failure is clear when PSA levels rise after radical prostatectomy, but may be more difficult after external beam radiation therapy. The main difficulty once biochemical failure is diagnosed is to distinguish between local and distant failure, given the low sensitivity of standard work-up exams. Metabolic imaging techniques currently under evaluation may in the future help us to localize the site of failures. There are several therapeutic options depending on the initial curative treatment, each with morbidity risks that should be considered in multidisciplinary decision-making.
Evaluating Micrometeoroid and Orbital Debris Risk Assessments Using Anomaly Data
NASA Technical Reports Server (NTRS)
Squire, Michael
2017-01-01
The accuracy of micrometeoroid and orbital debris (MMOD) risk assessments can be difficult to evaluate. A team from the National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) has completed a study that compared MMOD-related failures on operational satellites to predictions of how many of those failures should occur using NASA's MMOD risk assessment methodology and tools. The study team used the Poisson probability to quantify the degree of inconsistency between the predicted and reported numbers of failures. Many elements go into a risk assessment, and each of those elements represents a possible source of uncertainty or bias that will influence the end result. There are also challenges in obtaining accurate and useful data on MMOD-related failures.
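One way such a Poisson consistency check can be phrased: given a predicted expected failure count lambda, how surprising is the observed count k? A minimal sketch with invented numbers (not the NESC study's figures):

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def tail_prob(k_obs, lam):
    """P(X >= k_obs) under Poisson(lam): small values flag under-prediction of failures."""
    return 1.0 - sum(poisson_pmf(k, lam) for k in range(k_obs))

# Invented example: the model predicts 2.1 MMOD failures over the fleet's exposure,
# while 6 were actually reported.
print(f"P(X >= 6 | lam = 2.1) = {tail_prob(6, 2.1):.4f}")
```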
NASA Astrophysics Data System (ADS)
Biały, Witold
2017-06-01
Failure frequency in the mining process, with a focus on the mining machine, has been presented and illustrated by the example of two coal-mines. Two mining systems have been subjected to analysis: a cutter-loader and a plough system. In order to reduce costs generated by failures, maintenance teams should regularly make sure that the machines are used and operated in a rational and effective way. Such activities will allow downtimes to be reduced, and, in consequence, will increase the effectiveness of a mining plant. The evaluation of mining machines' failure frequency contained in this study has been based on one of the traditional quality management tools - the Pareto chart.
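The Pareto analysis named in the abstract can be sketched in a few lines. A hedged illustration with invented failure categories and counts for a cutter-loader (matplotlib assumed available; none of these numbers come from the coal mines studied):

```python
import matplotlib.pyplot as plt

# Invented downtime-cause counts; the point is the 80/20 view: sort causes by
# frequency and overlay the cumulative share to spot the vital few.
causes = {"haulage chain": 42, "cutting drum": 27, "hydraulics": 16,
          "electrical": 9, "other": 6}
labels, counts = zip(*sorted(causes.items(), key=lambda kv: -kv[1]))
cum_pct = [sum(counts[:i + 1]) / sum(counts) * 100 for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(labels, counts)
ax1.set_ylabel("failures")
ax2 = ax1.twinx()
ax2.plot(labels, cum_pct, marker="o", color="tab:red")
ax2.set_ylabel("cumulative %")
ax2.set_ylim(0, 110)
plt.title("Pareto chart of failure causes (illustrative data)")
plt.tight_layout()
plt.show()
```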
[Renal failure in patients with liver transplant: incidence and predisposing factors].
Gerona, S; Laudano, O; Macías, S; San Román, E; Galdame, O; Torres, O; Sorkin, E; Ciardullo, M; de Santibañes, E; Mastai, R
1997-01-01
Renal failure is a common finding in patients undergoing orthotopic liver transplantation. The aim of the present study was to evaluate the incidence, the prognostic value of pre-, intra- and postoperative factors, and the severity of renal dysfunction in patients who undergo liver transplantation. Therefore, the records of 38 consecutive adult patients were reviewed. Renal failure was defined arbitrarily as an increase in creatinine (> 1.5 mg/dl) and/or blood urea (> 80 mg/dl). Three patients were excluded from the final analysis (1 with acute liver failure and 2 with survival shorter than 72 hours). Twenty-one of the 35 patients had renal failure after orthotopic liver transplantation. Six of these episodes developed early, having occurred within the first 6 days. Late renal impairment occurred in 15 patients within the hospitalization (40 +/- 10 days) (Mean +/- SD). In the overall series, worse liver function (evaluated by Child-Pugh classification), higher blood-product requirements, and higher cyclosporine levels were observed in those who experienced renal failure than in those who did not (p < 0.05). Early renal failure was related to preoperative (liver function) and intraoperative (blood requirements) factors, and several causes (nephrotoxic drugs and graft failure) other than cyclosporine were present in patients who developed late renal impairment. No mortality was associated with renal failure. We conclude that renal failure a) is a common finding after liver transplantation, b) has a multifactorial pathogenesis, and c) is not related to a poor outcome.
False positive acetaminophen concentrations in patients with liver injury.
Polson, Julie; Wians, Frank H; Orsulak, Paul; Fuller, Dwain; Murray, Natalie G; Koff, Jonathan M; Khan, Adil I; Balko, Jody A; Hynan, Linda S; Lee, William M
2008-05-01
Acetaminophen toxicity is the most common form of acute liver failure in the U.S. After acetaminophen overdoses, quantitation of plasma acetaminophen can aid in predicting severity of injury. However, recent case reports have suggested that acetaminophen concentrations may be falsely increased in the presence of hyperbilirubinemia. We tested sera obtained from 43 patients with acute liver failure, mostly unrelated to acetaminophen, utilizing 6 different acetaminophen quantitation systems to determine the significance of this effect. In 36 of the 43 samples, with bilirubin concentrations ranging from 1.0-61.5 mg/dl, no acetaminophen was detectable by gas chromatography-mass spectrometry. These 36 samples were then utilized to test the performance characteristics of 2 immunoassay and 4 enzymatic-colorimetric methods. Three of the four colorimetric methods demonstrated 'detectable' values for acetaminophen in from 4 to 27 of the 36 negative samples, low-concentration positive values being observed when serum bilirubin concentrations exceeded 10 mg/dl. By contrast, the 2 immunoassay methods (EMIT, FPIA) were virtually unaffected. The false positive values obtained were, in general, proportional to the quantity of bilirubin in the sample. However, prepared samples of normal human serum with added bilirubin showed a dose-response curve for only one of the 4 colorimetric assays. False positive acetaminophen tests may result when enzymatic-colorimetric assays are used, most commonly with bilirubin concentrations >10 mg/dl, leading to potential clinical errors in this setting. Bilirubin (or possibly other substances in acute liver failure sera) appears to affect the reliable measurement of acetaminophen, particularly with enzymatic-colorimetric assays.
Development and evaluation of endurance test system for ventricular assist devices.
Sumikura, Hirohito; Homma, Akihiko; Ohnuma, Kentaro; Taenaka, Yoshiyuki; Takewa, Yoshiaki; Mukaibayashi, Hiroshi; Katano, Kazuo; Tatsumi, Eisuke
2013-06-01
We developed a novel endurance test system that can arbitrarily set various circulatory conditions and has durability and stability for long-term continuous evaluation of ventricular assist devices (VADs), and we evaluated its fundamental performance and prolonged durability and stability. The circulation circuit of the present endurance test system consisted of a pulsatile pump with a small closed chamber (SCC), a closed chamber, a reservoir and an electromagnetic proportional valve. Two duckbill valves were mounted in the inlet and outlet of the pulsatile pump. The features of the circulation circuit are as follows: (1) the components of the circulation circuit consist of optimized industrial devices, giving durability; (2) the pulsatile pump can change the heart rate and stroke length (SL), as well as its compliance using the SCC. Therefore, the endurance test system can quantitatively reproduce various circulatory conditions. The range of reproducible circulatory conditions in the endurance test circuit was examined in terms of fundamental performance. Additionally, continuous operation for 6 months was performed in order to evaluate the durability and stability. The circulation circuit was able to set up a wide range of pressure and total flow conditions using the SCC and adjusting the pulsatile pump SL. The long-term continuous operation test demonstrated that stable, continuous operation for 6 months was possible without leakage or industrial device failure. The newly developed endurance test system demonstrated a wide range of reproducible circulatory conditions, durability and stability, and is a promising approach for evaluating the basic characteristics of VADs.
The chromatin-binding protein Smyd1 restricts adult mammalian heart growth.
Franklin, Sarah; Kimball, Todd; Rasmussen, Tara L; Rosa-Garrido, Manuel; Chen, Haodong; Tran, Tam; Miller, Mickey R; Gray, Ricardo; Jiang, Shanxi; Ren, Shuxun; Wang, Yibin; Tucker, Haley O; Vondriska, Thomas M
2016-11-01
All terminally differentiated organs face two challenges, maintaining their cellular identity and restricting organ size. The molecular mechanisms responsible for these decisions are of critical importance to organismal development, and perturbations in their normal balance can lead to disease. A hallmark of heart failure, a condition affecting millions of people worldwide, is hypertrophic growth of cardiomyocytes. The various forms of heart failure in human and animal models share conserved transcriptome remodeling events that lead to expression of genes normally silenced in the healthy adult heart. However, the chromatin remodeling events that maintain cell and organ size are incompletely understood; insights into these mechanisms could provide new targets for heart failure therapy. Using a quantitative proteomics approach to identify muscle-specific chromatin regulators in a mouse model of hypertrophy and heart failure, we identified upregulation of the histone methyltransferase Smyd1 during disease. Inducible loss-of-function studies in vivo demonstrate that Smyd1 is responsible for restricting growth in the adult heart, with its absence leading to cellular hypertrophy, organ remodeling, and fulminant heart failure. Molecular studies reveal Smyd1 to be a muscle-specific regulator of gene expression and indicate that Smyd1 modulates expression of gene isoforms whose expression is associated with cardiac pathology. Importantly, activation of Smyd1 can prevent pathological cell growth. These findings have basic implications for our understanding of cardiac pathologies and open new avenues to the treatment of cardiac hypertrophy and failure by modulating Smyd1. Copyright © 2016 the American Physiological Society.
Bone anchors or interference screws? A biomechanical evaluation for autograft ankle stabilization.
Jeys, Lee; Korrosis, Sotiris; Stewart, Todd; Harris, Nicholas J
2004-01-01
Autograft stabilization uses free semitendinosus tendon grafts to anatomically reconstruct the anterior talofibular ligament. The study aims were to evaluate the biomechanical properties of Mitek GII anchors compared with the Arthrex Bio-Tenodesis Screw for free tendon reconstruction of the anterior talofibular ligament, under the hypothesis that there are no differences in load to failure or percentage specimen elongation at failure between the 2 methods. Controlled laboratory study using porcine models. Sixty porcine tendon constructs were failure tested. Re-creating the pull of the anterior talofibular ligament, loads were applied at 70 degrees to the bones. Thirty-six tendons were fixed to porcine tali and tested using a single pull to failure; 10 were secured with anchors and No. 2 Ethibond, 10 with anchors and FiberWire, 10 with screws and FiberWire, and 6 with partially gripped screws. Cyclic preloading was conducted on 6 tendons fixed by anchors and on 6 tendons fixed by screws before failure testing. Two groups of 6 components fixed to the fibula were also tested. The talus single-pull anchor group produced a mean load of 114 N and elongation of 37% at failure. The talus single-pull screw group produced a mean load of 227 N and elongation of 22% at failure (P <.05). Cyclic preloading at 65% failure load before failure testing produced increases in load and decreases in elongation at failure. Partially gripped screws produced a load of 133 N and elongation of 30% at failure. The fibula model produced significant increases in load to failure for both fixation methods. The human anterior talofibular ligament fails at loads of 139 N, with instability occurring at 20% elongation. Interference screw fixation produced significantly greater failure strength and less elongation at failure than bone anchors. The improved biomechanics of interference screws suggests that these may be more suited than bone anchors to in vivo reconstruction of the anterior talofibular ligament.
Influence of Finite Element Size in Residual Strength Prediction of Composite Structures
NASA Technical Reports Server (NTRS)
Satyanarayana, Arunkumar; Bogert, Philip B.; Karayev, Kazbek Z.; Nordman, Paul S.; Razi, Hamid
2012-01-01
The sensitivity of failure load to the element size used in a progressive failure analysis (PFA) of carbon composite center-notched laminates is evaluated. The sensitivity study employs a PFA methodology previously developed by the authors, consisting of Hashin-Rotem intra-laminar fiber and matrix failure criteria and a complete stress degradation scheme for damage simulation. The approach is implemented with a user-defined subroutine in the ABAQUS/Explicit finite element package. The effect of element size near the notch tips on residual strength predictions was assessed for a brittle failure mode with a parametric study that included three laminates of varying material system, thickness and stacking sequence. The study resulted in the selection of an element size of 0.09 in. x 0.09 in., which was later used for predicting crack paths and failure loads in sandwich panels and monolithic laminated panels. Predicted crack paths and failure loads for these panels agreed well with experimental observations. Additionally, the element size vs. normalized failure load relationship, determined in the parametric study, was used to evaluate strength-scaling factors for three different element sizes. The failure loads predicted with all three element sizes converged to those obtained with the 0.09 in. x 0.09 in. element size. Though preliminary in nature, the strength-scaling concept has the potential to greatly reduce the computational time required for PFA and can enable the analysis of large-scale structural components where failure is dominated by fiber failure in tension.
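A minimal sketch of the intra-laminar Hashin-Rotem checks named above, for an in-plane ply stress state. The strength values are invented, and the paper's actual implementation is an ABAQUS/Explicit user subroutine, not this standalone function:

```python
# Hashin-Rotem ply failure indices for in-plane stresses (sigma11, sigma22, tau12), MPa.
# An index >= 1 means that mode has failed; the PFA then degrades the ply's stiffness.

def hashin_rotem(s11, s22, t12, XT=2000.0, XC=1200.0, YT=60.0, YC=200.0, S=90.0):
    # fiber mode: governed by the fiber-direction stress alone (tension vs compression)
    fiber = s11 / XT if s11 >= 0 else -s11 / XC
    # matrix mode: quadratic interaction of transverse normal and in-plane shear stresses
    Y = YT if s22 >= 0 else YC
    matrix = (s22 / Y) ** 2 + (t12 / S) ** 2
    return {"fiber": fiber, "matrix": matrix}

# Invented ply stresses near a notch tip:
print(hashin_rotem(s11=1500.0, s22=35.0, t12=60.0))
```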
An experimental evaluation of software redundancy as a strategy for improving reliability
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Caglayan, Alper K.; Knight, John C.; Lee, Larry D.; Mcallister, David F.; Vouk, Mladen A.; Kelly, John P. J.
1990-01-01
The strategy of using multiple versions of independently developed software as a means to tolerate residual software design faults is suggested by the success of hardware redundancy for tolerating hardware failures. Although, as generally accepted, the independence of hardware failures resulting from physical wearout can lead to substantial increases in reliability for redundant hardware structures, a similar conclusion is not immediate for software. The degree to which design faults are manifested as independent failures determines the effectiveness of redundancy as a method for improving software reliability. Interest in multi-version software centers on whether it provides an adequate measure of increased reliability to warrant its use in critical applications. The effectiveness of multi-version software is studied by comparing estimates of the failure probabilities of these systems with the failure probabilities of single versions. The estimates are obtained under a model of dependent failures and compared with estimates obtained when failures are assumed to be independent. The experimental results are based on twenty versions of an aerospace application developed and certified by sixty programmers from four universities. Descriptions of the application, development and certification processes, and operational evaluation are given together with an analysis of the twenty versions.
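The independence baseline in such comparisons is simple to state. A sketch under the (empirically violated) assumption that versions fail independently, for a hypothetical 3-version majority-voting configuration with an invented per-version failure probability:

```python
from math import comb

def majority_fail(p, n=3):
    """P(system failure) for n-version majority voting, assuming independent failures."""
    need = n // 2 + 1  # failures needed to outvote the correct versions
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

p = 1e-3  # invented single-version failure probability per demand
print(f"single version: {p:.1e}, 3-version majority: {majority_fail(p):.2e}")
# Correlated faults, as observed in the experiment, push the real system
# failure probability back up toward the single-version value.
```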
NASA Astrophysics Data System (ADS)
Nursyamsiah; Hasan, R.
2018-03-01
Hospitalization in patients with chronic heart failure is associated with high rates of mortality and morbidity, both during treatment and post-treatment. Despite the various therapies available today, mortality and re-hospitalization rates within 60 to 90 days post-hospitalization are still quite high; this period is known as the vulnerable phase. Prognostic evaluation tools in patients with heart failure are expected to help identify high-risk individuals, so that more rigorous monitoring and interventions can be undertaken. To determine whether hs-CRP has an impact on mortality within 90 days in hospitalized patients with heart failure, an observational cohort study was conducted in 39 patients with heart failure who were hospitalized due to worsening chronic heart failure. Patients were followed for up to 90 days after initial evaluation, with death as the primary endpoint. Among patients with hs-CRP values >4.25 mg/L, 70% died within 90 days, whereas among those with hs-CRP values <4.25 mg/L only 6.9% died (p = 0.000). In conclusion, hs-CRP values differed between patients with heart failure who died and those who survived to 90 days.
Management of heart failure in the new era: the role of scores.
Mantegazza, Valentina; Badagliacca, Roberto; Nodari, Savina; Parati, Gianfranco; Lombardi, Carolina; Di Somma, Salvatore; Carluccio, Erberto; Dini, Frank Lloyd; Correale, Michele; Magrì, Damiano; Agostoni, Piergiuseppe
2016-08-01
Heart failure is a widespread syndrome involving several organs, still characterized by high mortality and morbidity, and whose clinical course is heterogeneous and hard to predict. In this scenario, the assessment of heart failure prognosis represents a fundamental step in clinical practice. A single parameter alone cannot provide a precise prognosis; therefore, risk scores based on multiple parameters have been introduced, but their clinical utility remains modest. In this review, we evaluated several prognostic models for acute, right, chronic, and end-stage heart failure based on multiple parameters. In particular, for chronic heart failure we considered risk scores essentially based on clinical evaluation, comorbidities analysis, baroreflex sensitivity, heart rate variability, sleep disorders, laboratory tests, echocardiographic imaging, and cardiopulmonary exercise test parameters. What is established at present is that a single parameter is not sufficient for an accurate prediction of prognosis in heart failure because of the complex nature of the disease. However, none of the available scoring systems is widely used, being in some cases complex, not user-friendly, or based on expensive or not easily available parameters. We believe that multiparametric scores for risk assessment in heart failure are promising, but their widespread use has yet to be demonstrated in practice.
Unbiased multi-fidelity estimate of failure probability of a free plane jet
NASA Astrophysics Data System (ADS)
Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin
2017-11-01
Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
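A hedged sketch of the multi-fidelity importance-sampling step on a toy 1-D problem: a cheap surrogate locates the failure region and supplies the biasing density, and the unbiased estimate reweights a modest number of expensive-model evaluations. Both models, the threshold, and the budget are invented; the real application uses jet-flow solvers of several fidelities:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive(x):   # stand-in for the high-fidelity jet-width model
    return 2.0 + 0.5 * x + 0.05 * x**3

def cheap(x):       # invented low-fidelity surrogate of the same quantity
    return 2.0 + 0.55 * x

threshold = 0.8     # "failure" = jet width below this value

# Nominal input: x ~ N(0,1). Use a cheap sweep to centre a biasing density
# N(mu_b, 1) at the edge of the surrogate's failure region.
xs = np.linspace(-6, 6, 2001)
mu_b = xs[cheap(xs) < threshold].max()

n = 2000                                          # expensive-model budget
x = rng.normal(mu_b, 1.0, n)                      # draws from the biasing density
w = np.exp(-0.5 * x**2) / np.exp(-0.5 * (x - mu_b)**2)   # nominal/biasing density ratio
p_fail = np.mean((expensive(x) < threshold) * w)  # unbiased IS estimate
print(f"estimated failure probability: {p_fail:.2e}")
```

With several low-fidelity models, each surrogate yields one such estimator, and the optimal-fusion step combines them with weights chosen to minimize the variance of the pooled estimate.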
Behavioral interventions for agitation in older adults with dementia: an evaluative review.
Spira, Adam P; Edelstein, Barry A
2006-06-01
Older adults with dementia commonly exhibit agitated behavior that puts them at risk of injury and institutionalization and is associated with caregiver stress. A range of theoretical approaches has produced numerous interventions to manage these behavior problems. This paper critically reviews the empirical literature on behavioral interventions to reduce agitation in older adults with dementia. A literature search yielded 23 articles that met inclusion criteria. These articles described interventions that targeted wandering, disruptive vocalization, physical aggression, other agitated behaviors and a combination of these behaviors. Studies are summarized individually and then evaluated. Behavioral interventions targeting agitated behavior exhibited by older adults with dementia show considerable promise. A number of methodological issues must be addressed to advance this research area. Problem areas include inconsistent use of functional assessment techniques, failure to report quantitative findings and inadequate demonstrations of experimental control. The reviewed studies collectively provide evidence that warrants optimism regarding the application of behavioral principles to the management of agitation among older adults with dementia. Although the results of some studies were mixed and several studies revealed methodological shortcomings, many of them offered innovations that can be used in future, more rigorously designed, intervention studies.
Randomized in vivo evaluation of photodynamic antimicrobial chemotherapy on deciduous carious dentin
NASA Astrophysics Data System (ADS)
Steiner-Oliveira, Carolina; Longo, Priscila Larcher; Aranha, Ana Cecília Corrêa; Ramalho, Karen Müller; Mayer, Marcia Pinto Alves; de Paula Eduardo, Carlos
2015-10-01
The aim of this randomized in vivo study was to compare antimicrobial chemotherapies in primary carious dentin. Thirty-two participants aged 5 to 7 years underwent partial caries removal from deep carious dentin lesions in primary molars and were subsequently divided into three groups: control [chlorhexidine and resin-modified glass ionomer cement (RMGIC)], LEDTB [photodynamic antimicrobial chemotherapy (PACT) with a light-emitting diode associated with toluidine blue solution, and RMGIC], and LMB [PACT with laser associated with methylene blue solution, and RMGIC]. The participants underwent initial clinical and radiographic examinations. Demographic features and biofilm, gingival, and DMFT/DMFS indexes were evaluated, in addition to clinical and radiographic follow-ups at 6 and 12 months after treatment. Carious dentin was collected before and after each treatment, and the numbers of Streptococcus mutans, Streptococcus sobrinus, Lactobacillus casei, Fusobacterium nucleatum, Atopobium rimae, and total bacteria were established by quantitative polymerase chain reaction. No signs of pain or restoration failure were observed. All therapies were effective in reducing the number of microorganisms, except for S. sobrinus. No statistical differences were observed among the protocols used. All therapies may be considered effective modern approaches to minimal intervention in the management of deep primary caries.
Lewis, Gwyn N; Rosie, Juliet A
2012-01-01
To review quantitative and qualitative studies that have examined users' responses to virtual reality game-based interventions in people with movement disorders associated with chronic neurological conditions. We aimed to determine key themes that influenced users' enjoyment of and engagement in the games, and to develop suggestions as to how future systems could best address their needs and expectations. Only a limited number of studies evaluated user opinions. From those found, seven common themes emerged: technology limitations, user control and therapist assistance, the novel physical and cognitive challenge, feedback, social interaction, game purpose and expectations, and the virtual environments. Our key recommendations derived from the review were to avoid technology failure, maintain overt therapeutic principles within the games, encompass progression to promote continuing physical and cognitive challenge, and provide feedback that is easily and readily associated with success. While there have been few studies evaluating the users' perspective on virtual rehabilitation games, our findings indicate that canvassing these experiences provides valuable information on the needs of the intended users. Incorporating our recommendations may enhance the efficacy of future systems and optimize the rehabilitation benefits of virtual reality games.
Thoracic radiography in the cat: Identification of cardiomegaly and congestive heart failure.
Guglielmini, Carlo; Diana, Alessia
2015-12-01
Thoracic radiography is one of the most commonly employed diagnostic tools for the clinical evaluation of cats with suspected heart disease and is the standard diagnostic method in the confirmation of cardiogenic pulmonary edema. In the past, interpretation of feline radiographs focused on a description of the qualitative radiographic features of feline heart disease or the measurement of the cardiac silhouette in healthy cats and cats with different cardiovascular disorders. More recently, studies have begun to critically address the issue of the diagnostic accuracy of thoracic radiography in the diagnostic work-up of cats with heart disease. In these studies, qualitative and quantitative radiographic parameters were compared to echocardiographic findings to evaluate the usefulness of thoracic radiography for the identification of cardiac enlargement and pulmonary edema in the cat. Thoracic radiography is reasonably specific but has a low sensitivity when identifying cardiomegaly in cats with mild structural heart disease. Feline cardiogenic pulmonary edema has a variable radiographic presentation and several specific radiographic findings (i.e., enlargement of the left atrium and the pulmonary veins) can be absent or non-recognizable in affected cats. Copyright © 2015 Elsevier B.V. All rights reserved.
[Qualitative evaluation of blood products records in a hospital].
Lartigue, B; Catillon, E
2012-02-01
This study aimed to evaluate the qualitative performance of blood product traceability in paper and electronic medical records in a hospital. The quality of date/time documentation was assessed by detecting chronological errors and inter-source inconsistencies of 20 minutes or more in a random sample of 168 blood products transfused during 2009. A receipt date/time was confirmed in 52% of paper records; a data entry error was identified in 25% of paper records and 21% of electronic records. A transfusion date/time was documented in 93% of paper records, with a data entry error in 26% of paper records and 25% of electronic records. The patient medical record contained at least one date/time error in 18% of cases for receipt and 17% for transfusion. Environmental factors (clinical setting, urgency, blood product category) did not contribute to data error rates. Although blood product traceability performs well quantitatively, the quality of the recorded documentation is poor. In our study, data entry errors were similar in electronic and paper records, but the overall failure rate was lower in electronic records because omissions are controlled.
NASA Astrophysics Data System (ADS)
Bandurin, M. A.; Volosukhin, V. A.; Vanzha, V. V.; Mikheev, A. V.; Volosukhin, Y. V.
2018-05-01
At present, there is no theoretical foundation for fundamental methods of forecasting possible natural disasters or for quantitatively evaluating the remaining service life and technical state of landfall dams in high-risk mountain regions. In this article, the task was to carry out finite-element simulation of possible natural disasters under a changing climate and under modern seismic operating conditions in the high-risk mountain regions of the Greater Caucasus. The research is aimed at developing methods and principles for monitoring the safety of possible natural disasters, evaluating the remaining technical state of landfall dams with varying degrees of damage, and determining the risk of dam failure. In developing mathematical models of mudflow descent from tributaries into the main channel, a danger-intensity threshold was determined, taking into consideration the geomorphological characteristics of the earthflow courses, the physico-chemical and mechanical state of the mudflow mass, and the dynamics of its change. The consequences of mudflow descent into river basins were simulated, with assessment of the threats and risks to infrastructure projects located in the river floodplain.
NASA Technical Reports Server (NTRS)
McCarty, John P.; Lyles, Garry M.
1997-01-01
Propulsion system quality is defined in this paper as high reliability; that is, quality is a high probability of within-tolerance performance or operation. Since failures are out-of-tolerance performance, the probability of failures and their occurrence is the difference between high- and low-quality systems. Failures can be described at three levels: the system failure (the detectable end of a failure), the failure mode (the failure process), and the failure cause (the start). Failure causes can be evaluated and classified by type. The results of typing flight-history failures show that most failures occur in unrecognized modes and result from human error or noise; that is, failures are how engineers learn how things really work. Although the study is based on US launch vehicles, a sampling of failures from other countries indicates that the finding has broad application. The design parameters of a propulsion system are not single-valued but have dispersions associated with the manufacturing of parts. If the dispersions are large relative to the tolerances, many tests are needed to find failures, which could contribute to the large number of failures in unrecognized modes.
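The last point lends itself to a quick numerical check. The following minimal sketch (an illustration, not the paper's method) estimates the probability that a normally dispersed parameter falls outside its tolerance band; the function name and the 1.5%-dispersion example are assumptions chosen for demonstration.

```python
import random

def out_of_tolerance_probability(mean, sigma, lower_tol, upper_tol,
                                 n=100_000, seed=1):
    """Monte Carlo estimate of the probability that a normally dispersed
    parameter falls outside its tolerance band."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if not (lower_tol <= rng.gauss(mean, sigma) <= upper_tol)
    )
    return failures / n

# Example: a 1.5% (1-sigma) performance dispersion against a +/-3% band.
p_fail = out_of_tolerance_probability(mean=1.0, sigma=0.015,
                                      lower_tol=0.97, upper_tol=1.03)
print(f"Estimated out-of-tolerance probability: {p_fail:.4f}")
```

With dispersion small relative to the tolerance, the estimated probability is tiny, so very many trials are needed before a single failure is observed, consistent with the paper's observation.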
Fracasso, Tony; Karger, Bernd; Pfeiffer, Heidi; Sauerland, Cristina; Schmeling, Andreas
2010-11-01
Pulmonary fat embolism is a life-threatening event that may result in potentially fatal right ventricular failure. Although the pathophysiology of this phenomenon has been widely investigated, no immunohistochemical demonstration of right ventricular failure following pulmonary fat embolism has been reported to date. We performed an immunohistochemical investigation with the markers fibronectin and C5b-9 in 21 cases of polytrauma with bone fractures (study group: nine females and 12 males; mean age 64.6 years) compared with a control group of 21 forensic cases with various causes of death (nine females and 12 males; mean age 68.6 years). In each case, at least one tissue slide from both cardiac ventricles (free wall of the right ventricle, anterior and/or posterior wall of the left ventricle) was available. The reactions were classified semi-quantitatively, and the two groups were compared. In the study group, the occurrence of ischemic changes in the right ventricle was significantly higher than in controls. The determining aspect, however, appears to be the predominance of ischemic lesions in the right ventricle relative to the left. This may indicate primary involvement of the right ventricle and thus demonstrate right ventricular failure.
Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Brunett, Acacia
2015-04-26
The Risk-Informed Safety Margin Characterization (RISMC) methodology, developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load distribution and a capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact on statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals for the failure probability estimated in a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method used to explore the uncertainty space, and offers a quantitative way to gauge the increase in statistical accuracy gained by performing additional simulations.
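As a concrete illustration of attaching a frequentist confidence interval to a simulation-based failure probability, the sketch below computes a Wilson score interval from the count of load-exceeds-capacity events. The Wilson interval is one standard choice assumed here for illustration; the abstract does not specify which two methods the report actually proposes.

```python
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score confidence interval for a binomial failure probability:
    k observed failures (load >= capacity) in n independent simulations."""
    p_hat = k / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return max(0.0, center - half), min(1.0, center + half)

# 3 load-exceeds-capacity events observed in 10,000 simulations:
lo, hi = wilson_interval(3, 10_000)
print(f"95% CI for failure probability: [{lo:.2e}, {hi:.2e}]")
```

Running more simulations narrows the interval, which is exactly the accuracy-versus-cost trade-off the study quantifies.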
Simplified methods for evaluating road prism stability
William J. Elliot; Mark Ballerini; David Hall
2003-01-01
Mass failure is one of the most common failures of low-volume roads in mountainous terrain. Current methods for evaluating stability of these roads require a geotechnical specialist. A stability analysis program, XSTABL, was used to estimate the stability of 3,696 combinations of road geometry, soil, and groundwater conditions. A sensitivity analysis was carried out to...
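XSTABL itself is a limit-equilibrium slope-stability program, so the following is only a much simpler stand-in for intuition: a classical infinite-slope factor-of-safety calculation swept over slope angles. All parameter values are illustrative assumptions, and this is not the analysis method XSTABL implements.

```python
import math

def infinite_slope_fs(c_eff, phi_deg, slope_deg, depth,
                      gamma=18.0, gamma_w=9.81, m=0.5):
    """Infinite-slope factor of safety (dimensionless).
    c_eff: effective cohesion (kPa); phi_deg: friction angle (deg);
    slope_deg: slope angle (deg); depth: failure-plane depth (m);
    gamma: soil unit weight (kN/m^3); m: saturated fraction of depth."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = c_eff + (gamma - m * gamma_w) * depth \
        * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

# FS < 1 indicates predicted failure; sweep steepening fill slopes:
for slope in (20, 25, 30, 35, 40):
    fs = infinite_slope_fs(c_eff=5.0, phi_deg=30, slope_deg=slope, depth=1.5)
    print(f"slope {slope} deg: FS = {fs:.2f}")
```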
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.; Simonen, F.A.
1992-05-01
Probabilistic fracture mechanics analysis is a major element of the comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimate of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel.
Chung, Sheng-Heng; Han, Pauline; Manthiram, Arumugam
2017-06-07
The viability of employing high-capacity sulfur cathodes in building high-energy-density lithium-sulfur batteries is limited by rapid self-discharge, short shelf life, and severe structural degradation during cell resting (static instability). Unfortunately, static instability has largely been ignored in the literature. In this letter we present a long-term self-discharge study that quantitatively analyzes control lithium-sulfur batteries with a conventional cathode configuration, providing meaningful insights into the cathode failure mechanisms during resting. Lastly, utilizing the understanding obtained with the control cells, we design and present low self-discharge (LSD) lithium-sulfur batteries for investigating the long-term self-discharge effect and electrode stability.
Quantitative risk assessment of Cryptosporidium in tap water in Ireland.
Cummins, E; Kennedy, R; Cormican, M
2010-01-15
Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high-profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water and oocyst removal and decontamination following various process stages, including coagulation/flocculation, sedimentation, filtration, and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes, and private wells. Where surface water is used, additional physical and chemical water treatment is important for reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 × 10^-4 per year (the benchmark set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and of adequate coagulation/flocculation in conventional treatment. The frequency of treatment process failure is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies, and other stakeholders to evaluate the likely risk of infection given basic input data on the source water and the treatment processes used.
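A minimal sketch of this kind of Monte Carlo calculation, converting a daily waterborne exposure into an annual infection risk. The exponential dose-response form, the r value, and the lognormal variability term are illustrative assumptions, not the parameters of the published model.

```python
import math
import random

def annual_infection_risk(c_raw, log_removal, volume=1.0, r=0.0042,
                          n=50_000, seed=7):
    """Monte Carlo sketch of annual Cryptosporidium infection risk from
    daily tap-water consumption.
    c_raw: mean oocysts per litre in raw water; log_removal: total
    treatment log10 reduction; volume: litres consumed per day;
    r: exponential dose-response parameter (illustrative value only)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # lognormal variability around the mean raw-water concentration
        conc = c_raw * rng.lognormvariate(0.0, 1.0)
        dose = conc * 10 ** (-log_removal) * volume
        p_daily = 1.0 - math.exp(-r * dose)       # exponential dose-response
        total += 1.0 - (1.0 - p_daily) ** 365     # daily risk -> annual risk
    return total / n

# e.g. 0.1 oocysts/L in raw water and 3-log removal through treatment:
print(f"Mean annual risk: {annual_infection_risk(0.1, 3.0):.2e}")
```

Lowering log_removal (a treatment failure) raises the annual risk sharply, mirroring the sensitivity result reported above.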
Peters, Iain R; Helps, Chris R; Calvert, Emma L; Hall, Edward J; Day, Michael J
2005-01-01
To examine differences in the expression of messenger RNA (mRNA) transcripts for polymeric immunoglobulin receptor (pIgR), alpha-chain, and J-chain, determined by use of quantitative real-time reverse transcription-polymerase chain reaction (QRT-PCR) assays, in duodenal biopsy specimens obtained from dogs with and without chronic diarrhea. Biopsy specimens of the proximal portion of the duodenum were obtained endoscopically from 39 dogs evaluated because of chronic diarrhea (12 German Shepherd Dogs and 27 non-German Shepherd Dog breeds); specimens were also obtained from a control group of 7 dogs evaluated because of other gastrointestinal tract diseases and 2 dogs that were euthanatized as a result of non-gastrointestinal tract disease. Dogs were anesthetized, and multiple mucosal biopsy specimens were obtained endoscopically at the level of the caudal duodenal flexure by use of biopsy forceps; in 2 control dogs, samples were obtained from the descending duodenum within 5 minutes of euthanasia. One-step QRT-PCR was used to quantify the expression of transcripts for the housekeeping gene glyceraldehyde-3-phosphate dehydrogenase, pIgR, alpha-chain, and J-chain in duodenal mucosal tissue. There was no significant difference in the expression of any transcript among non-German Shepherd Dog breeds without diarrhea (control group), non-German Shepherd Dog breeds with chronic diarrhea, and German Shepherd Dogs with chronic diarrhea. Conclusions and Clinical Relevance: Results indicated that the susceptibility of German Shepherd Dogs to chronic diarrhea is not a result of simple failure of transcription of the key genes that encode molecules involved in mucosal IgA secretion.
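For readers unfamiliar with how QRT-PCR expression is normalized against a housekeeping gene, the sketch below shows the common 2^-ΔΔCt convention. This is a generic illustration assuming equal amplification efficiencies; the abstract does not state the authors' exact normalization, and the Ct values are invented.

```python
def relative_expression(ct_target, ct_housekeeping,
                        ct_target_ref, ct_housekeeping_ref):
    """2^-ddCt relative quantification: expression of a target gene
    (e.g. pIgR) normalized to a housekeeping gene (e.g. GAPDH) and to a
    reference sample. Assumes ~100% amplification efficiency for both."""
    d_ct_sample = ct_target - ct_housekeeping    # normalize within sample
    d_ct_ref = ct_target_ref - ct_housekeeping_ref
    return 2 ** -(d_ct_sample - d_ct_ref)        # fold change vs reference

# hypothetical Cts: pIgR 24.1 / GAPDH 18.0 in a case vs 23.5 / 18.2 in a control
print(round(relative_expression(24.1, 18.0, 23.5, 18.2), 2))  # ~0.57-fold
```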
Cong, Guang-Ting; Lebaschi, Amir H; Camp, Christopher L; Carballo, Camila B; Nakagawa, Yusuke; Wada, Susumu; Deng, Xiang-Hua; Rodeo, Scott A
2018-04-23
Subacromial impingement of the rotator cuff is understood to be a contributing factor in the development of rotator cuff tendinopathy. However, the changes that occur in the impinged tendon are poorly understood and warrant further study. To enable further study of rotator cuff tendinopathy, we performed a controlled laboratory study to determine the feasibility and baseline characteristics of a new murine model of subacromial impingement. The model involves surgically inserting a microvascular clip into the subacromial space in adult C57Bl/6 mice. Along with a sham surgery arm, 90 study animals were distributed among time-point groups for sacrifice up to 6 weeks. All animals underwent bilateral surgery (total N = 180). Biomechanical, histologic, and molecular analyses were performed to identify and quantify the progression of changes in the supraspinatus tendon. Decreases in failure force and stiffness were found in impinged tendon specimens compared with sham and no-surgery controls at all study time points. Semi-quantitative scoring of histologic specimens demonstrated significant, persistent tendinopathic changes over 6 weeks. Quantitative real-time polymerase chain reaction analysis of impinged tendon specimens demonstrated persistently increased expression of genes related to matrix remodeling, inflammation, and tendon development. Overall, this novel murine subacromial impingement model creates changes consistent with acute tendonitis, which may mimic the early stages of rotator cuff tendinopathy. Clinical Significance: A robust, simple, and reproducible animal model of rotator cuff tendinopathy is a valuable research tool that will allow further study of cellular and molecular mechanisms and evaluation of therapeutic interventions in rotator cuff tendinopathy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jiangjiang; Li, Weixuan; Lin, Guang
In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can introduce an extra error into the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
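The two-stage idea is easy to show in miniature. The sketch below screens all samples with a cheap surrogate and re-evaluates with the expensive model only those falling inside a band around the failure boundary. The Karhunen–Loève expansion, sliced inverse regression, and polynomial chaos construction are omitted, and the toy model, surrogate, band width, and threshold are all assumptions for illustration.

```python
import numpy as np

def two_stage_failure_probability(model, surrogate, samples, threshold,
                                  band=0.1):
    """Two-stage MC sketch: surrogate screening, then re-evaluation of
    samples near the failure boundary with the expensive model.
    'model' and 'surrogate' map a parameter vector to a scalar response
    compared against a (nonzero) 'threshold'."""
    g = np.array([surrogate(x) for x in samples])
    near = np.abs(g - threshold) <= band * abs(threshold)  # boundary band
    g_corr = g.copy()
    g_corr[near] = [model(samples[i]) for i in np.where(near)[0]]
    return float(np.mean(g_corr > threshold)), int(near.sum())

# toy usage: "expensive" model vs a slightly biased linear-coefficient surrogate
true_model = lambda x: x[0] ** 2 + x[1] ** 2
cheap_surr = lambda x: 1.1 * x[0] ** 2 + 0.9 * x[1] ** 2
xs = np.random.default_rng(0).normal(size=(20_000, 2))
p_fail, n_reeval = two_stage_failure_probability(true_model, cheap_surr,
                                                 xs, threshold=9.0)
print(f"failure probability {p_fail:.4f} with only {n_reeval} full-model runs")
```

Only the handful of boundary-band samples incur the full-model cost, which is where the reported 100-fold speed-up comes from.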
Razali, Haslina; O'Connor, Emily; Drews, Anna; Burke, Terry; Westerdahl, Helena
2017-07-28
High-throughput sequencing enables high-resolution genotyping of extremely duplicated genes. 454 amplicon sequencing (454) has become the standard technique for genotyping the major histocompatibility complex (MHC) genes in non-model organisms. However, Illumina MiSeq amplicon sequencing (MiSeq), which offers a much higher read depth, is now superseding 454. The aim of this study was to quantitatively and qualitatively evaluate the performance of MiSeq relative to 454 for genotyping MHC class I alleles, using a house sparrow (Passer domesticus) dataset with pedigree information. House sparrows provide a good study system for this comparison because their MHC class I genes have been studied previously and, consequently, we had prior expectations concerning the number of alleles per individual. We found that 454 and MiSeq performed equally well in genotyping amplicons with low diversity, i.e., amplicons from individuals with fewer than six alleles. Although there was a higher rate of failure in the 454 dataset in resolving amplicons with higher diversity (6-9 alleles), the same genotypes were identified by both 454 and MiSeq in 98% of cases. We conclude that low-diversity amplicons are equally well genotyped using either 454 or MiSeq, but the higher coverage afforded by MiSeq can lead to this approach outperforming 454 for amplicons with higher diversity.
Patra, Subir; Banerjee, Sourav
2017-01-01
Material state awareness of composites using conventional Nondestructive Evaluation (NDE) methods is limited to finding the size and locations of cracks and delaminations in a composite structure. To support progressive failure models based on slow-growth criteria, awareness of the precursor damage state and quantification of the degraded material properties are necessary, which is challenging with current NDE methods. To quantify the material state, a new offline NDE method is reported herein. The new method, named Quantitative Ultrasonic Image Correlation (QUIC), hybridizes concepts from microcontinuum mechanics with experimentally measured ultrasonic wave parameters. This unique combination results in a parameter called nonlocal damage entropy, used for precursor awareness. High-frequency (above 25 MHz) scanning acoustic microscopy is employed for the proposed QUIC. Eight woven carbon-fiber-reinforced-plastic composite specimens were tested under fatigue up to 70% of their remaining useful life. During the first 30% of the life, the proposed nonlocal damage entropy is plotted to demonstrate the degradation of the material properties via awareness of the precursor damage state. Visual proof of the precursor damage states is provided by digital images obtained from micro-optical microscopy, scanning acoustic microscopy, and scanning electron microscopy.
Tate, Sonya C; Burke, Teresa F; Hartman, Daisy; Kulanthaivel, Palaniappan; Beckmann, Richard P; Cronier, Damien M
2016-01-01
Background: Resistance to BRAF inhibition is a major cause of treatment failure for BRAF-mutated metastatic melanoma patients. Abemaciclib, a cyclin-dependent kinase 4 and 6 inhibitor, overcomes this resistance in xenograft tumours and offers a promising drug combination. The present work aims to characterise the quantitative pharmacology of the abemaciclib/vemurafenib combination using a semimechanistic pharmacokinetic/pharmacodynamic modelling approach and to identify an optimum dosing regimen for potential clinical evaluation. Methods: A PK/biomarker model was developed to connect abemaciclib/vemurafenib concentrations to changes in MAPK and cell cycle pathway biomarkers in A375 BRAF-mutated melanoma xenografts. The resultant tumour growth inhibition was described by relating (i) MAPK pathway inhibition to apoptosis, (ii) mitotic cell density to tumour growth and, under resistant conditions, (iii) retinoblastoma protein inhibition to cell survival. Results: The model successfully described vemurafenib/abemaciclib-mediated changes in MAPK pathway and cell cycle biomarkers. Initial tumour shrinkage by vemurafenib, acquisition of resistance and subsequent abemaciclib-mediated efficacy were successfully captured and externally validated. Model simulations illustrate the benefit of intermittent vemurafenib therapy over continuous treatment, and indicate that continuous abemaciclib in combination with intermittent vemurafenib offers the potential for considerable tumour regression. Conclusions: The quantitative pharmacology of the abemaciclib/vemurafenib combination was successfully characterised and an optimised, clinically relevant dosing strategy was identified.
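To make the intermittent-versus-continuous comparison concrete, here is a deliberately generic tumour-growth-inhibition ODE in the spirit described above. It is not the authors' fitted semimechanistic model: the growth and kill rates, the Emax-type effect term, and the dosing schedules are all invented for illustration.

```python
def simulate_tgi(days, dose_on, kg=0.08, kd_max=0.2, ec50=1.0,
                 v0=100.0, dt=0.1):
    """Generic tumour-growth-inhibition ODE, dV/dt = kg*V - kd(C)*V,
    integrated with forward Euler. dose_on(t) -> bool switches unit drug
    exposure on/off. All parameter values are arbitrary illustrations."""
    v, t, trajectory = v0, 0.0, []
    while t < days:
        c = 1.0 if dose_on(t) else 0.0   # unit exposure while dosing
        kd = kd_max * c / (ec50 + c)     # saturable (Emax-type) kill rate
        v += (kg - kd) * v * dt          # net growth or shrinkage
        trajectory.append(v)
        t += dt
    return trajectory

# intermittent schedule (2 weeks on / 1 week off) vs continuous dosing
intermittent = simulate_tgi(56, lambda t: (t % 21) < 14)
continuous = simulate_tgi(56, lambda t: True)
print(f"final volumes: intermittent {intermittent[-1]:.1f}, "
      f"continuous {continuous[-1]:.1f}")
```

A realistic comparison would also need a resistance mechanism, such as the retinoblastoma-protein term in the published model; this sketch only shows the basic growth/kill structure such models share.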
The Evaluator's Perspective: Evaluating the State Capacity Building Program.
ERIC Educational Resources Information Center
Madey, Doren L.
A historical antagonism between the advocates of quantitative evaluation methods and the proponents of qualitative evaluation methods has stymied the recognition of the value to be gained by utilizing both methodologies in the same study. The integration of quantitative and qualitative methods within a single evaluation has synergistic effects in…
The application of encapsulation material stability data to photovoltaic module life assessment
NASA Technical Reports Server (NTRS)
Coulbert, C. D.
1983-01-01
For any piece of hardware that degrades when subjected to environmental and application stresses, the route or sequence that describes the degradation process may be summarized in terms of six key words: LOADS, RESPONSE, CHANGE, DAMAGE, FAILURE, and PENALTY. Applied to photovoltaic modules, these six factors form the core outline of an expanded failure-analysis matrix for unifying and integrating relevant material degradation data and analyses. An important feature of this approach is the deliberate differentiation between factors such as CHANGE, DAMAGE, and FAILURE. Applying this outline to materials degradation research facilitates the distinction between quantifying material property changes and quantifying module damage or power loss, with their economic consequences. The recommended approach for relating material stability data to photovoltaic module life is to use the degree of DAMAGE to (1) optical coupling, (2) encapsulant package integrity, (3) PV circuit integrity, or (4) electrical isolation as the quantitative criterion for assessing a module's potential service life, rather than simply using module power loss.