Sample records for quantitatively evaluating failure

  1. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.

  2. [Evaluation of intraventricular dyssynchrony by quantitative tissue velocity imaging in rats with post-infarction heart failure].

    PubMed

    Wang, Yan; Zhu, Wenhui; Duan, Xingxing; Zhao, Yongfeng; Liu, Wengang; Li, Ruizhen

    2011-04-01

    To evaluate intraventricular systolic dyssynchrony in rats with post-infarction heart failure by quantitative tissue velocity imaging combined with synchronous electrocardiography. A total of 60 male SD rats were randomly assigned to 3 groups: a 4-week post-operative group and an 8-week post-operative group (each n=25, with the anterior descending branch of the left coronary artery ligated), and a sham operation group (n=10, with thoracotomy and opened pericardium but no ligation of the artery). The time to peak systolic velocity of regional myocardium was measured and an index of left intraventricular dyssynchrony was calculated. All indexes of heart function decreased as the heart failure worsened, except the left ventricle index in the post-operative groups. All dyssynchrony indexes were prolonged in the post-operative groups (P<0.05), while the changes in the sham operation group were not significant (P>0.05). Quantitative tissue velocity imaging combined with synchronous electrocardiography can accurately analyse intraventricular systolic dyssynchrony.

  3. Forecasting volcanic eruptions and other material failure phenomena: An evaluation of the failure forecast method

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.

    2011-08-01

    Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
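
    As a concrete illustration (not taken from the paper), the classic FFM amounts to an ordinary least-squares fit of inverse precursor rate against time, extrapolated to zero. The sketch below applies it to synthetic Poisson count data with the common power-law exponent of 2; taking reciprocals of counts distorts the Poisson error structure, which is precisely the bias the authors address by fitting a generalized linear model to the raw rates instead.

      import numpy as np

      # Synthetic precursor counts accelerating toward failure at t_f = 100
      # (rate ~ (t_f - t)^-1, so the inverse rate decays linearly to zero)
      rng = np.random.default_rng(0)
      t_f_true = 100.0
      t = np.arange(0.0, 95.0)
      obs = rng.poisson(1000.0 / (t_f_true - t))   # counts per unit time

      # Classic FFM: least squares on inverse rate vs. time; the forecast
      # failure time is where the fitted line reaches 1/rate = 0
      mask = obs > 0
      slope, intercept = np.polyfit(t[mask], 1.0 / obs[mask], 1)
      print(f"forecast failure time: {-intercept / slope:.1f}"
            f" (true: {t_f_true})")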

  4. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure modes can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis, while in fact, due to lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural
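
    As a hedged sketch of the core idea only (the paper's full FPWGM method additionally involves minimum cut sets and Copula-based correlation, omitted here), a weighted geometric mean RPN over triangular fuzzy ratings can be computed vertex-wise and then defuzzified; all ratings and weights below are invented for illustration.

      import numpy as np

      # Triangular fuzzy ratings (low, mode, high) on a 1-10 scale for one
      # failure mode: S = severity, O = occurrence, D = detection.
      # All ratings and weights here are illustrative, not from the paper.
      S = np.array([6.0, 7.0, 8.0])
      O = np.array([4.0, 5.0, 6.0])
      D = np.array([3.0, 4.0, 5.0])
      w_s, w_o, w_d = 0.4, 0.35, 0.25      # importance weights, sum to 1

      # Weighted geometric mean applied vertex-wise to the triangular numbers
      fuzzy_rpn = S**w_s * O**w_o * D**w_d

      # Centroid defuzzification of the resulting (approximately triangular)
      # number gives a crisp value usable for ranking failure modes
      print("fuzzy RPN:", fuzzy_rpn.round(2),
            "-> crisp:", fuzzy_rpn.mean().round(2))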

  5. Heart Failure Virtual Consultation: bridging the gap of heart failure care in the community - A mixed-methods evaluation.

    PubMed

    Gallagher, Joseph; James, Stephanie; Keane, Ciara; Fitzgerald, Annie; Travers, Bronagh; Quigley, Etain; Hecht, Christina; Zhou, Shuaiwei; Watson, Chris; Ledwidge, Mark; McDonald, Kenneth

    2017-08-01

    We undertook a mixed-methods evaluation of a Web-based conferencing service (virtual consult) between general practitioners (GPs) and cardiologists in managing patients with heart failure in the community, to determine its effect on use of specialist heart failure services and its acceptability to GPs. All cases from June 2015 to October 2016 were recorded using a standardized template, which captured patient demographics, medical history, medications, and the outcome of the virtual consult for each case. Quantitative surveys and qualitative interviewing of 17 participating GPs were also undertaken. During this time, 142 cases were discussed: 68 relating to a new diagnosis of heart failure, 53 to emerging deterioration in a known heart failure patient, and 21 to therapeutic issues. Only 17% required review in the outpatient department following the virtual consultation. GPs reported increased confidence in heart failure management, a broadening of their knowledge base, and a perception of overall better patient outcomes. These data from an initial experience with Heart Failure Virtual Consultation present a very positive impact of this strategy on the provision of heart failure care in the community and acceptability to users. Further research on the implementation and expansion of this strategy is warranted. © 2017 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.

  6. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived, each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
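
    The evidence-integration step can be illustrated with a minimal, self-contained implementation of Dempster's rule of combination for two basic probability assignments; the failure hypotheses and mass values below are invented for illustration.

      from itertools import product

      def dempster_combine(m1, m2):
          # Dempster's rule over basic probability assignments whose focal
          # elements are frozensets; mass on empty intersections (the
          # "conflict") is renormalized away.
          combined, conflict = {}, 0.0
          for (a, pa), (b, pb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + pa * pb
              else:
                  conflict += pa * pb
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      # Two diagnostic models assigning mass to suspected failure sources
      F = frozenset
      m1 = {F({"actuator"}): 0.6, F({"actuator", "sensor"}): 0.4}
      m2 = {F({"sensor"}): 0.3, F({"actuator", "sensor"}): 0.7}
      print(dempster_combine(m1, m2))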

  7. An Illustration of Determining Quantitatively the Rock Mass Quality Parameters of the Hoek-Brown Failure Criterion

    NASA Astrophysics Data System (ADS)

    Wu, Li; Adoko, Amoussou Coffi; Li, Bo

    2018-04-01

    In tunneling, determining quantitatively the rock mass strength parameters of the Hoek-Brown (HB) failure criterion is useful since it can improve the reliability of the design of tunnel support systems. In this study, a quantitative method is proposed to determine the rock mass quality parameters of the HB failure criterion, namely the Geological Strength Index (GSI) and the disturbance factor (D), based on the structure of the drilling core and the weathering condition of the rock mass, combined with an acoustic wave test to calculate the strength of the rock mass. The Rock Mass Structure Index and the Rock Mass Weathering Index are used to quantify the GSI, while the longitudinal wave velocity (Vp) is employed to derive the value of D. The DK383+338 tunnel face of the Yaojia tunnel of the Shanghai-Kunming passenger dedicated line served as an illustration of how the methodology is implemented. The values of the GSI and D are obtained using the HB criterion and then using the proposed method. The measured in situ stress is used to evaluate their accuracy. To this end, the major and minor principal stresses are calculated based on the GSI and D given by the HB criterion and by the proposed method. The results indicated that both methods were close to the field observation, which suggests that the proposed method can also be used for determining quantitatively the rock mass quality parameters. However, these results remain valid only for rock mass quality and rock type similar to those of the DK383+338 tunnel face of the Yaojia tunnel.
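
    For context, the generalized Hoek-Brown criterion that GSI and D feed into can be sketched using the widely published 2002-edition parameter relations; the input values below are illustrative and are not the Yaojia tunnel data.

      import numpy as np

      def hoek_brown_sigma1(sigma3, sigma_ci, m_i, GSI, D):
          # Major principal stress at failure from the generalized
          # Hoek-Brown criterion (2002-edition parameter relations)
          m_b = m_i * np.exp((GSI - 100.0) / (28.0 - 14.0 * D))
          s = np.exp((GSI - 100.0) / (9.0 - 3.0 * D))
          a = 0.5 + (np.exp(-GSI / 15.0) - np.exp(-20.0 / 3.0)) / 6.0
          return sigma3 + sigma_ci * (m_b * sigma3 / sigma_ci + s) ** a

      # Illustrative values only: a fair-quality rock mass with GSI = 55,
      # intact strength 80 MPa, m_i = 10, moderate disturbance D = 0.3
      print(hoek_brown_sigma1(sigma3=2.0, sigma_ci=80.0,
                              m_i=10.0, GSI=55.0, D=0.3), "MPa")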

  8. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  9. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    PubMed

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar, and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF > 50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution 3 T k-t perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR), and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02, respectively). No significant differences in absolute stress perfusion rate or MPR were observed when comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.

  10. Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A poly(methyl methacrylate) (PMMA) film on a silicon substrate is a key structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorously quantitative evaluation of interfacial weakness. The model is implemented in the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results show that the predicted variations in SAW velocity and reflectance are sensitive to the adhesion condition. Experimental results from the v(z) technique and SAW velocity reconstruction verify the prediction.

  11. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and laser tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.

  12. Lay Consultations in Heart Failure Symptom Evaluation.

    PubMed

    Reeder, Katherine M; Sims, Jessica L; Ercole, Patrick M; Shetty, Shivan S; Wallendorf, Michael

    2017-01-01

    Lay consultations can facilitate or impede healthcare. However, little is known about how lay consultations for symptom evaluation affect treatment decision-making. The purpose of this study was to explore the role of lay consultations in symptom evaluation prior to hospitalization among patients with heart failure. Semi-structured interviews were conducted with 60 patients hospitalized for acute decompensated heart failure. Chi-square and Fisher's exact tests, along with logistic regression were used to characterize lay consultations in this sample. A large proportion of patients engaged in lay consultations for symptom evaluation and decision-making before hospitalization. Lay consultants provided attributions and advice and helped make the decision to seek medical care. Men consulted more often with their spouse than women, while women more often consulted with adult children. Findings have implications for optimizing heart failure self-management interventions, improving outcomes, and reducing hospital readmissions.

  13. Lay Consultations in Heart Failure Symptom Evaluation

    PubMed Central

    Reeder, Katherine M.; Sims, Jessica L.; Ercole, Patrick M.; Shetty, Shivan S.; Wallendorf, Michael

    2017-01-01

    Purpose Lay consultations can facilitate or impede healthcare. However, little is known about how lay consultations for symptom evaluation affect treatment decision-making. The purpose of this study was to explore the role of lay consultations in symptom evaluation prior to hospitalization among patients with heart failure. Methods Semi-structured interviews were conducted with 60 patients hospitalized for acute decompensated heart failure. Chi-square and Fisher’s exact tests, along with logistic regression were used to characterize lay consultations in this sample. Results A large proportion of patients engaged in lay consultations for symptom evaluation and decision-making before hospitalization. Lay consultants provided attributions and advice and helped make the decision to seek medical care. Men consulted more often with their spouse than women, while women more often consulted with adult children. Conclusions Findings have implications for optimizing heart failure self-management interventions, improving outcomes, and reducing hospital readmissions. PMID:29399657

  14. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    In traditional reliability evaluation of machine center components, the component reliability model exhibits deviation and the evaluation results are low because failure propagation is overlooked. To rectify these problems, a new reliability evaluation method based on cascading failure analysis and assessment of the failure influence degree is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influence degrees of the system components are assessed with the adjacency matrix and its transpose, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center and shows the following: 1) the reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) the difference between the comprehensive and inherent reliability of a system component is positively correlated with its failure influence degree, which provides a theoretical basis for reliability allocation of the machine center system.
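
    A minimal sketch of the PageRank ingredient, assuming the networkx library and an invented component graph (the paper's actual formulation combines the adjacency matrix and its transpose with comprehensive failure probabilities, which is not reproduced here):

      import networkx as nx

      # Hypothetical directed graph: an edge u -> v means "a failure of
      # component u can propagate to component v"
      G = nx.DiGraph([("coolant", "spindle"), ("axis_drive", "spindle"),
                      ("spindle", "toolholder"), ("controller", "axis_drive"),
                      ("controller", "coolant")])

      # PageRank on the reversed graph scores components whose failures
      # reach many others; this one-liner is only an illustration of the
      # idea, not the paper's exact weighting
      influence = nx.pagerank(G.reverse(), alpha=0.85)
      for comp, score in sorted(influence.items(), key=lambda kv: -kv[1]):
          print(f"{comp:12s} {score:.3f}")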

  15. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make the measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual-perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. Level of evidence: 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  16. A finite element evaluation of the moment arm hypothesis for altered vertebral shear failure force.

    PubMed

    Howarth, Samuel J; Karakolis, Thomas; Callaghan, Jack P

    2015-01-01

    The mechanism of vertebral shear failure is likely a bending moment generated about the pars interarticularis by facet contact, and the moment arm length (MAL) between the centroid of facet contact and the location of pars interarticularis failure has been hypothesised to be an influential modulator of shear failure force. To quantitatively evaluate this hypothesis, anterior shear of C3 over C4 was simulated in a finite element model of the porcine C3-C4 vertebral joint with each combination of five compressive force magnitudes (0-60% of estimated compressive failure force) and three postures (flexed, neutral and extended). Bilateral locations of peak stress within C3's pars interarticularis were identified along with the centroids of contact force on the inferior facets. These measurements were used to calculate the MAL of facet contact force. Changes in MAL were also related to shear failure forces measured from similar in vitro tests. Flexed and extended vertebral postures respectively increased and decreased the MAL by 6.6% and 4.8%. The MAL decreased by only 2.6% from the smallest to the largest compressive force. Furthermore, altered MAL explained 70% of the variance in measured shear failure force from comparable in vitro testing with larger MALs being associated with lower shear failure forces. Our results confirmed that the MAL is indeed a significant modulator of vertebral shear failure force. Considering spine flexion is necessary when assessing low-back shear injury potential because of the association between altered facet articulation and lower vertebral shear failure tolerance.

  17. Evaluation of a Linear Cumulative Damage Failure Model for Epoxy Adhesive

    NASA Technical Reports Server (NTRS)

    Richardson, David E.; Batista-Rodriquez, Alicia; Macon, David; Totman, Peter; McCool, Alex (Technical Monitor)

    2001-01-01

    Recently a significant amount of work has been conducted to provide more complex and accurate material models for use in the evaluation of adhesive bondlines. Some of this has been prompted by recent studies into the effects of residual stresses on the integrity of bondlines. Several techniques have been developed for the analysis of bondline residual stresses. Key to these analyses is the criterion used for predicting failure. Residual stress loading of an adhesive bondline can occur over the life of the component. For many bonded systems, this can be several years. It is impractical to directly characterize failure of adhesive bondlines under a constant load for several years. Therefore, alternative approaches for predicting bondline failures are required. In the past, cumulative damage failure models have been developed, ranging from very simple to very complex. This paper documents the generation and evaluation of some of the simplest linear damage accumulation tensile failure models for an epoxy adhesive. It shows how several variations on the failure model were generated and presents an evaluation of the accuracy of these failure models in predicting creep failure of the adhesive. The paper shows that a simple failure model can be generated from short-term failure data for accurate predictions of long-term adhesive performance.
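
    A minimal sketch of a linear (Miner-type) damage accumulation prediction of the kind evaluated here, with an invented stress-life curve standing in for the short-term failure data:

      def time_to_failure(stress_mpa):
          # Hypothetical stress-life model for the epoxy: creep-rupture
          # time (hours) falling off as a power of applied stress
          return 1e6 * (stress_mpa / 10.0) ** -8.0

      # A load history of (stress level in MPa, hold time in hours)
      history = [(12.0, 2000.0), (15.0, 500.0), (18.0, 100.0)]

      # Linear damage accumulation: failure is predicted when the summed
      # fractions of used-up life reach 1
      damage = sum(dt / time_to_failure(s) for s, dt in history)
      print(f"accumulated damage fraction: {damage:.3f}",
            "-> failure predicted" if damage >= 1.0 else "-> survives")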

  18. Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure

    ERIC Educational Resources Information Center

    Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.

    2014-01-01

    Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently (e.g., project monitoring) or in the post-hoc assessment of program activities.…

  19. A novel approach for evaluating the risk of health care failure modes.

    PubMed

    Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang

    2012-12-01

    Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of health care failure modes and taking priority action for safety improvement. The purpose of this paper is to propose a novel approach to data analysis: integrating FMEA with a mathematical tool, data envelopment analysis (DEA) with the slack-based measure (SBM). The risk indexes (severity, occurrence, and detection) of FMEA are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach is illustrated by one health care case. As a systematic approach for improving the service quality of health care, the method can offer quantitative corrective information on the risk indexes that thereafter reduces failure possibility. For safety improvement, these new targets for the risk indexes could be used for management by objectives. FMEA alone cannot provide quantitative corrective information on risk indexes; the novel approach overcomes this chief shortcoming. After combining the DEA SBM model with FMEA, the two goals of increased patient safety and reduced medical cost can be achieved together.

  20. Implementation and Evaluation of a Smartphone-Based Telemonitoring Program for Patients With Heart Failure: Mixed-Methods Study Protocol

    PubMed Central

    Ross, Heather J; Cafazzo, Joseph A; Laporte, Audrey; Seto, Emily

    2018-01-01

    Background Meta-analyses of telemonitoring for patients with heart failure conclude that it can lower the utilization of health services and improve health outcomes compared with the standard of care. A smartphone-based telemonitoring program is being implemented as part of the standard of care at a specialty care clinic for patients with heart failure in Toronto, Canada. Objective The objectives of this study are to (1) evaluate the impact of the telemonitoring program on health service utilization, patient health outcomes, and their ability to self-care; (2) identify the contextual barriers and facilitators of implementation at the physician, clinic, and institutional level; (3) describe patient usage patterns to determine adherence and other behaviors in the telemonitoring program; and (4) evaluate the costs associated with implementation of the telemonitoring program from the perspective of the health care system (ie, public payer), hospital, and patient. Methods The evaluation will use a mixed-methods approach. The quantitative component will include a pragmatic pre- and posttest study design for the impact and cost analyses, which will make use of clinical data and questionnaires administered to at least 108 patients at baseline and 6 months. Furthermore, outcome data will be collected at 1, 12, and 24 months to explore the longitudinal impact of the program. In addition, quantitative data related to implementation outcomes and patient usage patterns of the telemonitoring system will be reported. The qualitative component involves an embedded single case study design to identify the contextual factors that influenced the implementation. The implementation evaluation will be completed using semistructured interviews with clinicians and other program staff at baseline, 4 months, and 12 months after the program start date. Interviews conducted with patients will be triangulated with usage data to explain usage patterns and adherence to the system. Results The

  21. Evaluation of Window Failure Modes

    DTIC Science & Technology

    1999-12-01

    U.S. Coast Guard Research and Development Center, 1082 Shennecossett Road, Groton, CT 06340-6096. Report No. CG-D-08-00. Sponsoring organization: U.S. Department of Transportation, United States Coast Guard.

  22. SPECT and PET in ischemic heart failure.

    PubMed

    Angelidis, George; Giamouzis, Gregory; Karagiannis, Georgios; Butler, Javed; Tsougos, Ioannis; Valotassiou, Varvara; Giannakoulas, George; Dimakopoulos, Nikolaos; Xanthopoulos, Andrew; Skoularigis, John; Triposkiadis, Filippos; Georgoulias, Panagiotis

    2017-03-01

    Heart failure is a common clinical syndrome associated with significant morbidity and mortality worldwide. Ischemic heart disease is the leading cause of heart failure, at least in the industrialized countries. Proper diagnosis of the syndrome and management of patients with heart failure require anatomical and functional information obtained through various imaging modalities. Nuclear cardiology techniques play a central role in the evaluation of heart failure. Myocardial single photon emission computed tomography (SPECT) with thallium-201 or technetium-99m labelled tracers offers valuable data regarding ventricular function, myocardial perfusion, viability, and intraventricular synchronism. Moreover, positron emission tomography (PET) permits accurate evaluation of myocardial perfusion, metabolism, and viability, providing high-quality images and the ability of quantitative analysis. As these imaging techniques assess different parameters of cardiac structure and function, variations in sensitivity and specificity have been reported among them. In addition, the role of SPECT- and PET-guided therapy remains controversial. In this comprehensive review, we address these controversies and report the advances in patient investigation with SPECT and PET in ischemic heart failure. Furthermore, we present the innovations in technology that are expected to strengthen the role of nuclear cardiology modalities in the investigation of heart failure.

  23. Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability

    DTIC Science & Technology

    2015-07-01

    Presented at the 12th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP12), Vancouver, Canada, July 12-15, 2015. Marwan M. Harajli, Graduate Student, Dept. of Civil and Environmental Engineering. From the abstract: "... criterion is usually the failure probability. In this paper, we examine the buffered failure probability as an attractive alternative to the failure probability."
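
    For orientation (an assumption-laden sketch, not the paper's implementation): the buffered failure probability can be estimated from Monte Carlo samples with the minimization formula bPOE_z(X) = min over a >= 0 of E[(a(X - z) + 1)^+] used in the buffered-probability literature.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def bpoe(samples, z):
          # Monte Carlo buffered probability of exceedance at threshold z,
          # via bPOE = min_{a >= 0} E[(a*(X - z) + 1)^+]
          x = np.asarray(samples)
          obj = lambda a: np.mean(np.maximum(a * (x - z) + 1.0, 0.0))
          res = minimize_scalar(obj, bounds=(0.0, 1e4), method="bounded")
          return min(res.fun, 1.0)

      rng = np.random.default_rng(1)
      x = rng.normal(size=100_000)         # stand-in limit-state samples
      print("POE :", np.mean(x > 2.0))     # ordinary failure probability
      print("bPOE:", bpoe(x, 2.0))         # conservative buffered version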

  24. Current Understanding of the Pathophysiology of Myocardial Fibrosis and Its Quantitative Assessment in Heart Failure

    PubMed Central

    Liu, Tong; Song, Deli; Dong, Jianzeng; Zhu, Pinghui; Liu, Jie; Liu, Wei; Ma, Xiaohai; Zhao, Lei; Ling, Shukuan

    2017-01-01

    Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis. PMID:28484397

  25. Interrelation of structure and operational states in cascading failure of overloading lines in power grids

    NASA Astrophysics Data System (ADS)

    Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying

    2017-09-01

    As the modern power system develops into a more intelligent and efficient version, the smart grid, and toward becoming the central backbone of an energy internet for free energy interactions, security concerns related to cascading failures and their potentially catastrophic results have been raised. Research on topological analysis based on complex networks has made great contributions to revealing the structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, relying on inappropriate modeling assumptions, still cannot distinguish between the effects of structure and of operational state, and so gives little meaningful guidance for system operation. This paper reveals the interrelation between network structure and operational states in cascading failure and gives a quantitative evaluation integrating both perspectives. For structure analysis, cascading paths are identified by extended betweenness and quantitatively described by cascading drop and cascading gradient. Furthermore, the operational state along cascading paths is described by loading level. The risk of cascading failure along a specific cascading path can then be quantitatively evaluated from these two factors. The maximum cascading gradient of all possible cascading paths can be used as an overall metric to evaluate the entire power grid's features related to cascading failure. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; simulation evidence presented in this paper suggests that the proposed model can identify the structural causes of cascading failure and is promising for giving meaningful guidance for the protection of system operation in the future.

  26. Zebrafish Heart Failure Models for the Evaluation of Chemical Probes and Drugs

    PubMed Central

    Monte, Aaron; Cook, James M.; Kabir, Mohd Shahjahan; Peterson, Karl P.

    2013-01-01

    Heart failure is a complex disease that involves genetic, environmental, and physiological factors. As a result, current medications and treatments for heart failure produce limited efficacy, and better medications are in demand. Although mammalian models exist, simple and low-cost models would be more beneficial for drug discovery and mechanistic studies of heart failure. We previously reported that aristolochic acid (AA) caused cardiac defects in zebrafish embryos that resemble heart failure. Here, we showed that cardiac troponin T and atrial natriuretic peptide were expressed at significantly higher levels in AA-treated embryos, presumably due to cardiac hypertrophy. In addition, several human heart failure drugs could moderately attenuate the AA-induced heart failure by 10%-40%, further validating the model for drug discovery. We then developed a drug screening assay using the AA-treated zebrafish embryos and identified three compounds. Mitogen-activated protein kinase kinase inhibitor (MEK-I), an inhibitor of MEK-1/2, which is known to be involved in cardiac hypertrophy and heart failure, showed nearly 60% heart failure attenuation. C25, a chalcone derivative, and A11, a phenolic compound, showed around 80% and 90% attenuation, respectively. Time course experiments revealed that, to obtain 50% efficacy, these compounds were required within different hours of AA treatment. Furthermore, quantitative polymerase chain reaction showed that C25, not MEK-I or A11, strongly suppressed inflammation. Finally, C25 and MEK-I, but not A11, could also rescue the doxorubicin-induced heart failure in zebrafish embryos. In summary, we have established two tractable heart failure models for drug discovery, and three potential drugs have been identified that seem to attenuate heart failure by different mechanisms. PMID:24351044

  27. Implementation and Evaluation of a Smartphone-Based Telemonitoring Program for Patients With Heart Failure: Mixed-Methods Study Protocol.

    PubMed

    Ware, Patrick; Ross, Heather J; Cafazzo, Joseph A; Laporte, Audrey; Seto, Emily

    2018-05-03

    Meta-analyses of telemonitoring for patients with heart failure conclude that it can lower the utilization of health services and improve health outcomes compared with the standard of care. A smartphone-based telemonitoring program is being implemented as part of the standard of care at a specialty care clinic for patients with heart failure in Toronto, Canada. The objectives of this study are to (1) evaluate the impact of the telemonitoring program on health service utilization, patient health outcomes, and their ability to self-care; (2) identify the contextual barriers and facilitators of implementation at the physician, clinic, and institutional level; (3) describe patient usage patterns to determine adherence and other behaviors in the telemonitoring program; and (4) evaluate the costs associated with implementation of the telemonitoring program from the perspective of the health care system (ie, public payer), hospital, and patient. The evaluation will use a mixed-methods approach. The quantitative component will include a pragmatic pre- and posttest study design for the impact and cost analyses, which will make use of clinical data and questionnaires administered to at least 108 patients at baseline and 6 months. Furthermore, outcome data will be collected at 1, 12, and 24 months to explore the longitudinal impact of the program. In addition, quantitative data related to implementation outcomes and patient usage patterns of the telemonitoring system will be reported. The qualitative component involves an embedded single case study design to identify the contextual factors that influenced the implementation. The implementation evaluation will be completed using semistructured interviews with clinicians and other program staff at baseline, 4 months, and 12 months after the program start date. Interviews conducted with patients will be triangulated with usage data to explain usage patterns and adherence to the system. The telemonitoring program was launched in

  28. A preliminary evaluation of a failure detection filter for detecting and identifying control element failures in a transport aircraft

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1985-01-01

    The application of the failure detection filter to the detection and identification of aircraft control element failures was evaluated in a linear digital simulation of the longitudinal dynamics of a B-737 aircraft. Simulation results show that with a simple correlator and threshold detector used to process the filter residuals, the failure detection performance is seriously degraded by the effects of turbulence.
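
    A toy version of such a residual correlator and threshold detector, with all signals, failure signatures, and thresholds invented for illustration:

      import numpy as np

      rng = np.random.default_rng(2)

      # Simulated 2-D filter residuals: noise only, then a control element
      # failure that pushes the residual along a known direction
      n = 400
      residual = rng.normal(scale=0.2, size=(n, 2))
      signature = np.array([0.8, 0.6])        # hypothetical failure direction
      residual[250:] += 0.5 * signature

      # Correlator: running average of the residual projected onto the
      # signature; declare a failure when it crosses a fixed threshold
      proj = residual @ signature
      window = 20
      smoothed = np.convolve(proj, np.ones(window) / window, mode="valid")
      alarm = np.argmax(smoothed > 0.3)
      print("failure declared near sample", alarm + window - 1)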

  29. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods suited to the situation and prior conditions of each study is an important approach for researchers.

  30. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    … four of the slides caused the formation of tsunami waves, which washed up to 74 m above the lake level. Two of the slides resulted in many fatalities in the inner part of the Loen Valley as well as great damage. There are three predominant joint structures in Ramnefjell Mountain, which control failure and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and third joint sets are almost perpendicular and parallel to the mountainside and scarp, respectively. These three joint sets form slices of rock columns with widths ranging between 7-10 m and heights of 400-450 m. The joints in set II have opened by 1-2 m, which may allow water to collect during heavy rainfall or snowmelt, causing the slices to be pressed out. It is estimated that water in the vertical joints both reduces the shear strength of the sliding plane and reduces the normal stress on the sliding plane through the formation of an uplift force. Hence, rock slides in Ramnefjell Mountain occur in a plane failure mode. The quantitative evaluation of rock slide risk requires a probabilistic analysis of rock slope stability and identification of the consequences should a rock slide occur. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM). Then, in order to use the calculated probability of failure (Pf) in risk analyses, this Pf must be associated with frequency-based probabilities (i.e., Pf per year), since a computed failure probability is a measure of hazard and not a measure of risk unless it is associated with the consequences of the failure. This can be done either by considering the time-dependent behavior of the basic variables in the probabilistic models or by associating the computed Pf with the frequency of failures in the region. In this study, the frequency of rock slides in the previous century in
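
    As a simplified illustration of the probabilistic step (a first-order second-moment approximation rather than the full FORM analysis used in the study, with assumed statistics rather than site data):

      import numpy as np
      from scipy.stats import norm

      # Treat the factor of safety FS of a plane-failure slice as roughly
      # normal; mean and standard deviation are assumed values only
      fs_mean, fs_std = 1.35, 0.20

      beta = (fs_mean - 1.0) / fs_std    # reliability index
      p_f = norm.cdf(-beta)              # probability that FS < 1
      print(f"beta = {beta:.2f}, Pf = {p_f:.3f} per analysis")

      # To use Pf in risk estimates it must still be tied to a time basis,
      # e.g. scaled by a (hypothetical) regional trigger frequency per year
      annual_trigger_rate = 0.05
      print("annual Pf ~", p_f * annual_trigger_rate)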

  31. Quantitative phase imaging of platelets in patients with chronic renal failure treated with hemodialysis

    NASA Astrophysics Data System (ADS)

    Vasilenko, Irina; Vlasova, Elizaveta; Metelin, Vladislav; Kardasheva, Ziver

    2018-02-01

    The development of robust non-invasive laboratory screening methods for early diagnosis on an out-patient basis is quite relevant for practical medicine. The platelet is a natural biosensor, a detector of early changes in the state of hemostasis. The aim of this study was to assess the potential of the quantitative phase imaging (QPI) technique for real-time evaluation of the influence of low-molecular-weight and unfractionated heparin on platelets in patients with end-stage chronic renal failure treated with program hemodialysis (PHD). The main group consisted of 21 patients who were administered a low-molecular-weight heparin for hypocoagulation during the hemodialysis procedure. The control group (15 patients) received unfractionated heparin. The morphodensitometric state of living platelets was evaluated by QPI using a computer phase-interference microscope, MIM (Moscow, Russia). We analyzed the optical-geometrical parameters and the morphological features of living platelets, which reflect the degree of their activation, at the beginning of PHD (before administration of heparin), 15 minutes after it, and at the end of the procedure. The results allow us to conclude that low-molecular-weight heparin provides a better efficacy/safety ratio and causes less platelet activation during the hemodialysis procedure. Practical implementation of QPI for clinical monitoring of platelets makes it possible to obtain important information on the cellular component of hemostasis. It opens new opportunities for assessing the efficacy of treatment and for early diagnosis of disease complications.

  32. An Educational Intervention to Evaluate Nurses' Knowledge of Heart Failure.

    PubMed

    Sundel, Siobhan; Ea, Emerson E

    2018-07-01

    Nurses are the main providers of patient education in inpatient and outpatient settings. Unfortunately, nurses may lack knowledge of chronic medical conditions, such as heart failure. The purpose of this one-group pretest-posttest intervention study was to determine the effectiveness of a teaching intervention on nurses' knowledge of heart failure self-care principles in an ambulatory care setting. The sample consisted of 40 staff nurses in ambulatory care. Nurse participants received a focused education intervention based on knowledge deficits revealed in the pretest and were then resurveyed within 30 days. Nurses were evaluated using the valid and reliable 20-item Nurses' Knowledge of Heart Failure Education Principles Survey tool. The results of this project demonstrated that an education intervention on heart failure self-care principles produced a statistically significant improvement in nurses' knowledge of heart failure in an ambulatory care setting (p < .05). Results suggest that a teaching intervention could improve knowledge of heart failure, which could lead to better patient education and could reduce patient readmission for heart failure. J Contin Educ Nurs. 2018;49(7):315-321. Copyright 2018, SLACK Incorporated.

  33. The Positive Alternative Credit Experience (PACE) Program: A Quantitative Comparative Study

    ERIC Educational Resources Information Center

    Warren, Rebecca Anne

    2011-01-01

    The purpose of this quantitative comparative study was to evaluate the Positive Alternative Credit Experience (PACE) Program using an objectives-oriented approach to a formative program evaluation. The PACE Program was a semester-long high school alternative education program designed to serve students at-risk for academic failure or dropping out…

  34. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  35. Quantitative evaluations of ankle spasticity and stiffness in neurological disorders using manual spasticity evaluator.

    PubMed

    Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J; Zhang, Li-Qun

    2011-01-01

    Spasticity and contracture are major sources of disability in people with neurological impairments that have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient to use in clinics but not quantitative or they are quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to the velocity, indicating that increased resistance at higher velocities was felt at further stiffer positions and, thus, that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke.

  36. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used. Copyright © 2016 Elsevier Inc. All rights reserved.
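
    The published skin-sensitization QRA arithmetic reduces to dividing a no-expected-sensitization induction level (NESIL) by sensitization assessment factors and comparing the result with the estimated exposure; the sketch below uses invented numbers only.

      # Hypothetical skin-sensitization QRA calculation; all values invented
      nesil = 4900.0             # ug/cm2, induction threshold from testing
      saf_interindividual = 10.0 # sensitization assessment factors
      saf_matrix = 3.0           # product matrix / vehicle effects
      saf_use = 3.0              # frequency and duration of use

      ael = nesil / (saf_interindividual * saf_matrix * saf_use)
      cel = 40.0                 # ug/cm2/day, estimated consumer exposure

      print(f"AEL = {ael:.1f} ug/cm2 vs CEL = {cel:.1f} ->",
            "acceptable" if cel <= ael else "risk management needed")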

  37. The errors of metacognitive evaluation on metacognitive failure of students in mathematical problem solving

    NASA Astrophysics Data System (ADS)

    Huda, Nizlel; Sutawidjaja, Akbar; Subanji; Rahardjo, Swasono

    2018-04-01

    Metacognitive activity is very important in mathematical problem solving. Metacognitive activity consists of metacognitive awareness, metacognitive evaluation, and metacognitive regulation. This study aimed to reveal the errors of metacognitive evaluation in students' metacognitive failure in solving mathematical problems. The 20 students taken as research subjects were grouped into three groups: the first group was students who experienced one metacognitive failure, the second group was students who experienced two metacognitive failures, and the third group was students who experienced three metacognitive failures. One person was taken from each group as a research subject. The research data were collected from worksheets completed using the think-aloud method, followed by interviews with the research subjects based on the results of their work. The findings of this study were that students who experienced metacognitive failure in solving mathematical problems tend to err in metacognitive evaluation when considering the effectiveness and limitations of their thinking and the effectiveness of their chosen solution strategy.

  38. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after using skin care products or beauty treatments, a quantitative evaluation method for skin surface state and texture is presented that is convenient, fast, and non-destructive. Human skin images were collected by image sensors. First, a 3 × 3 median filter is applied, and the locations of hair pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is then used to modify the gray values of the hair pixels in order to eliminate the negative effect of noise and fine hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters (second moment, contrast, entropy, and correlation) and their mean values are computed at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that this method's evaluation of skin condition agrees with biochemical skin evaluation indicators and is fully consistent with human visual assessment. The method avoids the skin damage and long waiting times of biochemical evaluation, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid, and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can quantitatively evaluate the subtle improvement of skin condition after using skin care products or beauty treatments.
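
    A minimal sketch of the GLCM feature computation, assuming scikit-image and a random stand-in for a preprocessed skin image (entropy is computed directly, since graycoprops does not provide it):

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      # Random stand-in for a preprocessed 8-bit grayscale skin patch
      img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)

      # GLCM at distance 1 for four directions spaced 45 degrees apart
      angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
      glcm = graycomatrix(img, distances=[1], angles=angles,
                          levels=256, symmetric=True, normed=True)

      second_moment = graycoprops(glcm, "ASM").mean()   # angular second moment
      contrast = graycoprops(glcm, "contrast").mean()
      correlation = graycoprops(glcm, "correlation").mean()
      # entropy of each directional GLCM, averaged over directions
      entropy = (-glcm * np.log2(glcm + 1e-12)).sum(axis=(0, 1)).mean()
      print(second_moment, contrast, entropy, correlation)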

  39. Immunity-based detection, identification, and evaluation of aircraft sub-system failures

    NASA Astrophysics Data System (ADS)

    Moncayo, Hever Y.

    This thesis describes the design, development, and flight-simulation testing of an integrated Artificial Immune System (AIS) for detection, identification, and evaluation of a wide variety of sensor, actuator, propulsion, and structural failures/damages, including the prediction of the achievable states and other limitations on performance and handling qualities. The AIS scheme achieves a high detection rate and a low number of false alarms for all the failure categories considered. Data collected using a motion-based flight simulator are used to define the self for an extended sub-region of the flight envelope. The NASA IFCS F-15 research aircraft model is used; it represents a supersonic fighter and includes model-following adaptive control laws based on nonlinear dynamic inversion and artificial neural network augmentation. The flight simulation tests are designed to analyze and demonstrate the performance of the immunity-based aircraft failure detection, identification and evaluation (FDIE) scheme. A general robustness analysis is also presented by determining the achievable limits for a desired performance in the presence of atmospheric perturbations. For the purpose of this work, the integrated AIS scheme is implemented with three main components. The first component performs the detection when one of the considered failures is present in the system. The second component consists of the identification of the failure category and classification according to the failed element. During the third phase, a general evaluation of the failure is performed, with estimation of the magnitude/severity of the failure and prediction of its effect in reducing the flight envelope of the aircraft system. Solutions and alternatives to specific design issues of the AIS scheme, such as data clustering and empty space optimization, data fusion and duplication removal, definition of features, dimensionality reduction, and selection of cluster/detector shape are also

  20. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE (ECBC-TR-1426; Vipin Rastogi). Experimental design: each quantitative method was performed three times on three consecutive days. For the CD runs, three ...

  1. Nuclear medicine and quantitative imaging research (instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1990-09-01

    This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418 entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P. I. and M. Cooper, Co-P.I. during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.

  2. Left ventricular performance in various heart diseases with or without heart failure:--an appraisal by quantitative one-plane cineangiocardiography.

    PubMed

    Lien, W P; Lee, Y S; Chang, F Z; Chen, J J; Shieh, W B

    1978-01-01

    Quantitative one-plane cineangiocardiography in right anterior oblique position for evaluation of LV performance was carried out in 62 patients with various heart diseases and in 13 subjects with normal LV. Parameters for evaluating both pump and muscle performances were derived from volume and pressure measurements. Of 31 patients with either systolic hypertension or LV myocardial diseases (coronary artery disease or idiopathic cardiomyopathy), 14 had clinical evidence of LV failure before the study. It was found that mean VCF and EF were the most sensitive indicators of impaired LV performance among the various parameters. There was a close correlation between mean VCF and EF, yet discordant changes of both parameters were noted in some patients. Furthermore, wall motion abnormalities were not infrequently observed in patients with coronary artery disease or primary cardiomyopathy. Therefore, assessment of at least three ejection properties (EF, mean VCF and wall motion abnormalities) is considered to be essential for full understanding of derangement of LV function in heart disease. This is especially true of patients with coronary artery disease. LV behavior in relation to different pathological stresses or lesions, such as chronic pressure or volume load, myocardial disease and mitral stenosis, was also studied, and the possible cause of impaired LV myocardial function in mitral stenosis was discussed.

  3. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, in which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject and the CTRL group mean, along salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
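
    A minimal numerical reading of the DEF construction (reference eigenspace, separating hyperplane, weighted distance from the CTRL group mean) can be sketched as follows; weighting components by the absolute discriminant coefficients is our assumption, and the paper's exact weighting scheme may differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def disease_evaluation_factor(reference, ctrl, subjects, labels, n_comp=20):
    """reference: feature rows defining the eigenspace; ctrl and subjects:
    rows to project; labels: CTRL/AD group labels for the subjects."""
    pca = PCA(n_components=n_comp).fit(reference)    # reference eigenspace
    ctrl_e, subj_e = pca.transform(ctrl), pca.transform(subjects)
    lda = LinearDiscriminantAnalysis().fit(subj_e, labels)
    weights = np.abs(lda.coef_[0])                   # salience per component
    mu = ctrl_e.mean(axis=0)                         # CTRL group mean
    return np.sqrt((((subj_e - mu) ** 2) * weights).sum(axis=1))
```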

  4. Evaluation of a Multi-Axial, Temperature, and Time Dependent (MATT) Failure Model

    NASA Technical Reports Server (NTRS)

    Richardson, D. E.; Anderson, G. L.; Macon, D. J.; Rudolphi, Michael (Technical Monitor)

    2002-01-01

    To obtain a better understanding of the response of the structural adhesives used in the Space Shuttle's Reusable Solid Rocket Motor (RSRM) nozzle, an extensive effort has been conducted to characterize in detail the failure properties of these adhesives. This effort involved the development of a failure model that includes the effects of multi-axial loading, temperature, and time. An understanding of the effects of these parameters on the failure of the adhesive is crucial to the understanding and prediction of the safety of the RSRM nozzle. This paper documents the use of this newly developed multi-axial, temperature, and time (MATT) dependent failure model for modeling failure of the adhesives TIGA 321, EA913NA, and EA946. The development of the mathematical failure model using constant-load-rate normal and shear test data is presented. Verification of the accuracy of the failure model is shown through comparisons between predictions and measured creep and multi-axial failure data. The verification indicates that the failure model performs well for a wide range of conditions (loading, temperature, and time) for the three adhesives. The failure criterion is shown to be accurate through the glass transition for the adhesive EA946. Though this failure model has been developed and evaluated with adhesives, the concepts are applicable to other isotropic materials.

  5. Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.

    PubMed

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-02-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
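
    The stiffness reported here is the slope of the moment-displacement curve; a minimal least-squares sketch follows (the array names and the choice of linear-region window are illustrative, not the authors' protocol).

```python
import numpy as np

def bending_stiffness(displacement_mm, moment_nmm, fit_range=(0.2, 0.8)):
    """Return stiffness (N*mm per mm) from a 4-point bend test record,
    fitting only the roughly linear 20-80% portion of the moment range."""
    m_max = moment_nmm.max()
    mask = (moment_nmm >= fit_range[0] * m_max) & (moment_nmm <= fit_range[1] * m_max)
    slope, _ = np.polyfit(displacement_mm[mask], moment_nmm[mask], 1)
    return slope
```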

  6. Quantitative phase-contrast digital holographic microscopy for cell dynamic evaluation

    NASA Astrophysics Data System (ADS)

    Yu, Lingfeng; Mohanty, Samarendra; Berns, Michael W.; Chen, Zhongping

    2009-02-01

    The laser microbeam uses lasers to alter and/or to ablate intracellular organelles and cellular and tissue samples, and, today, has become an important tool for cell biologists to study the molecular mechanism of complex biological systems by removing individual cells or sub-cellular organelles. However, absolute quantitation of the localized alteration/damage to transparent phase objects, such as the cell membrane or chromosomes, was not possible using conventional phase-contrast or differential interference contrast microscopy. We report the development of phase-contrast digital holographic microscopy for quantitative evaluation of cell dynamic changes in real time during laser microsurgery. Quantitative phase images are recorded during the process of laser microsurgery and thus, the dynamic change in phase can be continuously evaluated. Out-of-focus organelles are re-focused by numerical reconstruction algorithms.

  7. Evaluation of lightweight material concepts for aircraft turbine engine rotor failure protection

    DOT National Transportation Integrated Search

    1997-07-01

    Results of the evaluation of lightweight materials for aircraft turbine engine rotor failure protection are presented in this report. The program consisted of two phases. Phase 1 was an evaluation of a group of composite materials which could possibl...

  8. Failure and life cycle evaluation of watering valves.

    PubMed

    Gonzalez, David M; Graciano, Sandy J; Karlstad, John; Leblanc, Mathias; Clark, Tom; Holmes, Scott; Reuter, Jon D

    2011-09-01

    Automated watering systems provide a reliable source of ad libitum water to animal cages. Our facility uses an automated water delivery system to support approximately 95% of the housed population (approximately 14,000 mouse cages). Drinking valve failure rates from 2002 through 2006 never exceeded the manufacturer standard of 0.1% total failure, based on monthly cage census and the number of floods. In 2007, we noted an increase in both flooding and cases of clinical dehydration in our mouse population. Using manufacturer's specifications for a water flow rate of 25 to 50 mL/min, we initiated a wide-scale screening of all valves used. During a 4-mo period, approximately 17,000 valves were assessed, of which 2200 failed according to scoring criteria (12.9% overall; 7.2% low flow; 1.6% no flow; 4.1% leaky). Factors leading to valve failures included residual metal shavings, silicone flash, introduced debris or bedding, and (most common) distortion of the autoclave-rated internal diaphragm and O-ring. Further evaluation revealed that despite normal autoclave conditions of heat, pressure, and steam, an extreme negative vacuum pull caused the valves' internal silicone components (diaphragm and O-ring) to become distorted and water-permeable. Normal flow rate often returned after a 'drying out' period, but components then reabsorbed water while on the animal rack or during subsequent autoclave cycles to revert to a variable flow condition. On the basis of our findings, we recalibrated autoclaves and initiated a preventative maintenance program to mitigate the risk of future valve failure.

  10. Evaluation: Review of the Past, Preview of the Future.

    ERIC Educational Resources Information Center

    Smith, M. F.

    1994-01-01

    This paper summarized contributors' ideas about evaluation as a field and where it is going. Topics discussed were the qualitative versus quantitative debate; evaluation's purpose; professionalization; program failure; program development; evaluators as advocates; evaluation knowledge; evaluation expansion; and methodology and design. (SLD)

  11. Spectral Electroencephalogram Analysis for the Evaluation of Encephalopathy Grade in Children With Acute Liver Failure.

    PubMed

    Press, Craig A; Morgan, Lindsey; Mills, Michele; Stack, Cynthia V; Goldstein, Joshua L; Alonso, Estella M; Wainwright, Mark S

    2017-01-01

    Spectral electroencephalogram classification correlated with outcome (p < 0.05). Spectral electroencephalogram analysis can be used to evaluate even young patients for hepatic encephalopathy and correlates with outcome. Spectral electroencephalogram may allow improved quantitative and reproducible assessment of hepatic encephalopathy grade in children with acute liver failure.

  12. Kidney (Renal) Failure

    MedlinePlus

    Kidney failure, also known as renal failure, ... evaluated? How is kidney failure treated? What is kidney (renal) failure? The kidneys are designed to maintain ...

  13. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
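
    The criticality factor used above is the coefficient of variation (CV) of the per-tablet coating mass; a toy computation on simulated masses (the distribution below is an invented stand-in for the pan-coater model output):

```python
import numpy as np

rng = np.random.default_rng(4)
coating_mass_mg = rng.normal(5.0, 0.4, size=10000)   # simulated per-tablet mass

cv = coating_mass_mg.std(ddof=1) / coating_mass_mg.mean()
print(f"coating mass CV = {cv:.3f}")   # lower CV -> better mass uniformity
```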

  14. Development and psychometric evaluation of the Thirst Distress Scale for patients with heart failure.

    PubMed

    Waldréus, Nana; Jaarsma, Tiny; van der Wal, Martje Hl; Kato, Naoko P

    2018-03-01

    Patients with heart failure can experience thirst distress. However, there is no instrument to measure this in patients with heart failure. The aim of the present study was to develop the Thirst Distress Scale for patients with Heart Failure (TDS-HF) and to evaluate psychometric properties of the scale. The TDS-HF was developed to measure thirst distress in patients with heart failure. Face and content validity was confirmed using expert panels including patients and healthcare professionals. Data on the TDS-HF were collected from patients with heart failure at outpatient heart failure clinics and hospitals in Sweden, the Netherlands and Japan. Psychometric properties were evaluated using data from 256 heart failure patients (age 72±11 years). Concurrent validity of the scale was assessed using a thirst intensity visual analogue scale. Patients did not have any difficulties answering the questions, and time taken to answer the questions was about five minutes. Factor analysis of the scale showed one factor. After psychometric testing, one item was deleted. For the eight-item TDS-HF, a single factor explained 61% of the variance and Cronbach's alpha was 0.90. The eight-item TDS-HF was significantly associated with the thirst intensity score (r = 0.55, p < 0.001). Regarding test-retest reliability, the intraclass correlation coefficient was 0.88, and the weighted kappa values ranged from 0.29 to 0.60. The eight-item TDS-HF is valid and reliable for measuring thirst distress in patients with heart failure.
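
    The alpha = 0.90 quoted above follows from the standard internal-consistency formula; a self-contained sketch (the response-matrix layout is an assumption):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_subjects, n_items) matrix of item responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # 8 items for the TDS-HF
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)   # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)
```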

  15. Understanding the heterogeneity in volume overload and fluid distribution in decompensated heart failure is key to optimal volume management: role for blood volume quantitation.

    PubMed

    Miller, Wayne L; Mullan, Brian P

    2014-06-01

    This study sought to quantitate total blood volume (TBV) in patients hospitalized for decompensated chronic heart failure (DCHF) and to determine the extent of volume overload, and the magnitude and distribution of blood volume and body water changes following diuretic therapy. The accurate assessment and management of volume overload in patients with DCHF remains problematic. TBV was measured by a radiolabeled-albumin dilution technique with intravascular volume, pre-to-post-diuretic therapy, evaluated at hospital admission and at discharge. Change in body weight in relation to quantitated TBV was used to determine interstitial volume contribution to total fluid loss. Twenty-six patients were prospectively evaluated. Two patients had normal TBV at admission. Twenty-four patients were hypervolemic with TBV (7.4 ± 1.6 liters) increased by +39 ± 22% (range, +9.5% to +107%) above the expected normal volume. With diuresis, TBV decreased marginally (+30 ± 16%). Body weight declined by 6.9 ± 5.2 kg, and fluid intake/fluid output was a net negative 8.4 ± 5.2 liters. Interstitial compartment fluid loss was calculated at 6.2 ± 4.0 liters, accounting for 85 ± 15% of the total fluid reduction. TBV analysis demonstrated a wide range in the extent of intravascular overload. Dismissal measurements revealed marginally reduced intravascular volume post-diuretic therapy despite large reductions in body weight. Mobilization of interstitial fluid to the intravascular compartment with diuresis accounted for this disparity. Intravascular volume, however, remained increased at dismissal. The extent, composition, and distribution of volume overload are highly variable in DCHF, and this variability needs to be taken into account in the approach to individualized therapy. TBV quantitation, particularly serial measurements, can facilitate informed volume management with respect to a goal of treating to euvolemia. Copyright © 2014 American College of Cardiology Foundation. Published
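
    The interstitial estimate follows from simple bookkeeping between weight change and quantitated TBV, taking roughly 1 kg of weight change as 1 L of body water; the numbers below are illustrative stand-ins chosen to echo the reported group means, not the study data.

```python
def interstitial_fluid_loss(tbv_admit_l, tbv_discharge_l, weight_loss_kg):
    """Interstitial contribution = total fluid loss - intravascular loss,
    approximating total fluid loss by weight change (~1 kg per litre)."""
    total_fluid_loss_l = weight_loss_kg
    intravascular_loss_l = tbv_admit_l - tbv_discharge_l
    return total_fluid_loss_l - intravascular_loss_l

# TBV falls only marginally (~7.4 -> ~6.9 L) while ~6.9 kg of weight is lost:
# almost all of the removed fluid is mobilized from the interstitium.
print(interstitial_fluid_loss(7.4, 6.9, 6.9))   # -> 6.4 L
```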

  16. Advanced detection, isolation and accommodation of sensor failures: Real-time evaluation

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Delaat, John C.; Bruton, William M.

    1987-01-01

    The objective of the Advanced Detection, Isolation, and Accommodation (ADIA) Program is to improve the overall demonstrated reliability of digital electronic control systems for turbine engines by using analytical redundancy to detect sensor failures. The results of a real time hybrid computer evaluation of the ADIA algorithm are presented. Minimum detectable levels of sensor failures for an F100 engine control system are determined. Also included are details about the microprocessor implementation of the algorithm as well as a description of the algorithm itself.

  17. Evaluation of Brazed Joints Using Failure Assessment Diagram

    NASA Technical Reports Server (NTRS)

    Flom, Yury

    2012-01-01

    A fitness-for-service approach was used to perform structural analysis of brazed joints consisting of several base metal / filler metal combinations. Failure Assessment Diagrams (FADs) based on tensile and shear stress ratios were constructed and experimentally validated. It was shown that such FADs can provide a conservative estimate of safe combinations of stresses in the brazed joints. Based on this approach, Margins of Safety (MS) of brazed joints subjected to multi-axial loading conditions can be evaluated.
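
    A hedged sketch of an FAD acceptability check in tensile/shear stress-ratio space follows; the unit-circle interaction envelope is an assumed placeholder for the experimentally validated FADs in the paper, and only the stress-ratio bookkeeping is generic.

```python
import numpy as np

def margin_of_safety(sigma, tau, sigma_ult, tau_ult):
    """MS = 1/r - 1, with r the radius of the operating point along the
    proportional load line; MS >= 0 means the joint state is acceptable."""
    r = np.hypot(sigma / sigma_ult, tau / tau_ult)   # position in ratio space
    return np.inf if r == 0 else 1.0 / r - 1.0

print(margin_of_safety(sigma=60.0, tau=30.0, sigma_ult=120.0, tau_ult=80.0))
```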

  18. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
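
    A minimal sketch of a hierarchical weighted average: criterion scores are rolled up level by level with normalized weights. The criteria tree shown is a made-up example, not the paper's computer-system case study.

```python
def weighted_score(node):
    """node is either a raw score (float) or a list of (weight, child) pairs."""
    if isinstance(node, (int, float)):
        return float(node)
    total_w = sum(w for w, _ in node)
    return sum(w * weighted_score(child) for w, child in node) / total_w

design_a = [(0.5, [(0.7, 8.0), (0.3, 6.0)]),   # performance, with sub-criteria
            (0.3, 9.0),                        # cost score
            (0.2, 7.0)]                        # maintainability score
print(round(weighted_score(design_a), 2))      # composite score -> 7.8
```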

  19. Problems experienced by informal caregivers of individuals with heart failure: An integrative review.

    PubMed

    Grant, Joan S; Graven, Lucinda J

    2018-04-01

    The purpose of this review was to examine and synthesize recent literature regarding problems experienced by informal caregivers when providing care for individuals with heart failure in the home. Integrative literature review. A review of current empirical literature was conducted utilizing PubMed, CINAHL, Embase, Sociological Abstracts, Social Sciences Full Text, PsycARTICLES, PsycINFO, Health Source: Nursing/Academic Edition, and Cochrane computerized databases. 19 qualitative, 16 quantitative, and 2 mixed methods studies met the inclusion criteria for review. Computerized databases were searched for a combination of subject terms (i.e., MeSH) and keywords related to informal caregivers, problems, and heart failure. The title and abstract of identified articles and reference lists were reviewed. Studies were included if they were published in English between January 2000 and December 2016 and examined problems experienced by informal caregivers in providing care for individuals with heart failure in the home. Studies were excluded if not written in English or if elements of caregiving in heart failure were not present in the title, abstract, or text. Unpublished and duplicate empirical literature as well as articles related to specific end-stage heart failure populations also were excluded. Methodology described by Cooper and others for integrative reviews of quantitative and qualitative research was used. Quality appraisal of the included studies was evaluated using the Joanna Briggs Institute critical appraisal tools for cross-sectional quantitative and qualitative studies. Informal caregivers experienced four key problems when providing care for individuals with heart failure in the home, including performing multifaceted activities and roles that evolve around daily heart failure demands; maintaining caregiver physical, emotional, social, spiritual, and financial well-being; having insufficient caregiver support; and performing caregiving with uncertainty

  20. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
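
    The structure of a PFA-style calculation (uncertainty about model parameters propagated through an engineering failure model to give a distribution of failure probability for one failure mode) can be sketched generically; the lognormal fatigue-life model and every numerical value below are illustrative assumptions, not the PFA software or its inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

def failure_prob_samples(mission_cycles=1e4, n_outer=1000, n_inner=2000):
    probs = np.empty(n_outer)
    for i in range(n_outer):
        # Outer loop: sample uncertain model parameters (median life, scatter).
        median_life = rng.lognormal(mean=np.log(5e4), sigma=0.3)
        scatter = rng.uniform(0.4, 0.8)
        # Inner loop: failure probability given those parameter values.
        lives = rng.lognormal(np.log(median_life), scatter, n_inner)
        probs[i] = np.mean(lives < mission_cycles)
    return probs   # failure-probability distribution, updatable by test data

p = failure_prob_samples()
print(np.percentile(p, [50, 95]))   # median estimate and conservative bound
```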

  1. High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum

    PubMed Central

    Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.

    2015-01-01

    Objective: To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods: Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results: SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion: Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581
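
    A quantitative T2* value of the kind used here is typically obtained by fitting a mono-exponential decay to the multi-echo signal; the sketch below shows that generic fit with invented echo times and noise, not the study's actual acquisition or fitting pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_t2star(te_ms, signal):
    """Fit S(TE) = S0 * exp(-TE / T2*) and return T2* in ms."""
    decay = lambda te, s0, t2s: s0 * np.exp(-te / t2s)
    (s0, t2s), _ = curve_fit(decay, te_ms, signal, p0=(signal[0], 10.0))
    return t2s

te = np.array([0.05, 2.0, 4.0, 8.0, 12.0])    # UTE echo plus later echoes (ms)
sig = 100 * np.exp(-te / 6.0) + np.random.default_rng(3).normal(0, 1, te.size)
print(round(fit_t2star(te, sig), 1))          # recovers ~6 ms for this decay
```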

  2. Evaluation of a large-scale quantitative respirator-fit testing program for healthcare workers: survey results.

    PubMed

    Wilkinson, Irene J; Pisaniello, Dino; Ahmad, Junaid; Edwards, Suzanne

    2010-09-01

    To present the evaluation of a large-scale quantitative respirator-fit testing program. Concurrent questionnaire survey of fit testers and test subjects. Ambulatory care, home nursing care, and acute care hospitals across South Australia. Quantitative facial-fit testing was performed with TSI PortaCount instruments for healthcare workers (HCWs) who wore 5 different models of a disposable P2 (N95-equivalent) respirator. The questionnaire included questions about the HCW's age, sex, race, occupational category, main area of work, smoking status, facial characteristics, prior training and experience in use of respiratory masks, and number of attempts to obtain a respirator fit. A total of 6,160 HCWs were successfully fitted during the period from January through July 2007. Of the 4,472 HCWs who responded to the questionnaire and were successfully fitted, 3,707 (82.9%) were successfully fitted with the first tested respirator, 551 (12.3%) required testing with a second model, and 214 (4.8%) required 3 or more tests. We noted an increased pass rate on the first attempt over time. Asians (excluding those from South and Central Asia) had the highest failure rate (16.3% [45 of 276 Asian HCWs were unsuccessfully fitted]), and whites had the lowest (9.8% [426 of 4,338 white HCWs]). Race was highly correlated with facial shape. Among occupational groups, doctors had the highest failure rate (13.4% [81 of 604 doctors]), but they also had the highest proportion of Asians. Prior education and/or training in respirator use were not associated with a higher pass rate. Certain facial characteristics were associated with higher or lower pass rates with regard to fit testing, and fit testers were able to select a suitable respirator on the basis of a visual assessment in the majority of cases. For the fit tester, training and experience were important factors; however, for the HCW being fitted, prior experience in respirator use was not an important factor.

  3. Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.

    PubMed

    Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro

    2016-03-01

    Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on qualitative approach and surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of frontozygomatic suture, most superior point of temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen) and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index of each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval while considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002) and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and in different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might allow a quantitative approach for planning and following-up maxillo-facial surgery in OAVS patients.
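
    The per-axis and global asymmetry measures described can be sketched directly from paired landmark coordinates; mirroring the right-side landmarks across the mid-sagittal plane (x = 0) and the array layout are our assumptions about the reference frame, not the paper's exact procedure.

```python
import numpy as np

def asymmetry_indices(left_xyz, right_xyz):
    """left_xyz, right_xyz: (n_landmarks, 3) coordinate arrays in mm,
    expressed in head-centred axes with the mid-sagittal plane at x = 0."""
    mirrored = right_xyz * np.array([-1.0, 1.0, 1.0])   # reflect across x = 0
    per_axis = np.abs(left_xyz - mirrored)              # asymmetry on each axis
    global_index = np.linalg.norm(per_axis, axis=1)     # one value per landmark
    return per_axis, global_index
```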

  4. Design and evaluation of a failure detection and isolation algorithm for restructurable control systems

    NASA Technical Reports Server (NTRS)

    Weiss, Jerold L.; Hsu, John Y.

    1986-01-01

    The use of a decentralized approach to failure detection and isolation for use in restructurable control systems is examined. This work has produced: (1) A method for evaluating fundamental limits to FDI performance; (2) Application using flight recorded data; (3) A working control element FDI system with maximal sensitivity to critical control element failures; (4) Extensive testing on realistic simulations; and (5) A detailed design methodology involving parameter optimization (with respect to model uncertainties) and sensitivity analyses. This project has concentrated on detection and isolation of generic control element failures since these failures frequently lead to emergency conditions and since knowledge of remaining control authority is essential for control system redesign. The failures are generic in the sense that no temporal failure signature information was assumed. Thus, various forms of functional failures are treated in a unified fashion. Such a treatment results in a robust FDI system (i.e., one that covers all failure modes) but sacrifices some performance when detailed failure signature information is known, useful, and employed properly. It was assumed throughout that all sensors are validated (i.e., contain only in-spec errors) and that only the first failure of a single control element needs to be detected and isolated. The FDI system which has been developed will handle a class of multiple failures.

  5. In-Flight Validation of a Pilot Rating Scale for Evaluating Failure Transients in Electronic Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Kalinowski, Kevin F.; Tucker, George E.; Moralez, Ernesto, III

    2006-01-01

    Engineering development and qualification of a Research Flight Control System (RFCS) for the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) JUH-60A has motivated the development of a pilot rating scale for evaluating failure transients in fly-by-wire flight control systems. The RASCAL RFCS includes a highly-reliable, dual-channel Servo Control Unit (SCU) to command and monitor the performance of the fly-by-wire actuators and protect against the effects of erroneous commands from the flexible, but single-thread Flight Control Computer. During the design phase of the RFCS, two piloted simulations were conducted on the Ames Research Center Vertical Motion Simulator (VMS) to help define the required performance characteristics of the safety monitoring algorithms in the SCU. Simulated failures, including hard-over and slow-over commands, were injected into the command path, and the aircraft response and safety monitor performance were evaluated. A subjective Failure/Recovery Rating (F/RR) scale was developed as a means of quantifying the effects of the injected failures on the aircraft state and the degree of pilot effort required to safely recover the aircraft. A brief evaluation of the rating scale was also conducted on the Army/NASA CH-47B variable stability helicopter to confirm that the rating scale was likely to be equally applicable to in-flight evaluations. Following the initial research flight qualification of the RFCS in 2002, a flight test effort was begun to validate the performance of the safety monitors and to validate their design for the safe conduct of research flight testing. Simulated failures were injected into the SCU, and the F/RR scale was applied to assess the results. The results validate the performance of the monitors, and indicate that the Failure/Recovery Rating scale is a very useful tool for evaluating failure transients in fly-by-wire flight control systems.

  6. Predictive factors for renal failure and a control and treatment algorithm

    PubMed Central

    Cerqueira, Denise de Paula; Tavares, José Roberto; Machado, Regimar Carla

    2014-01-01

    Objectives: to evaluate the renal function of patients in an intensive care unit, to identify the predisposing factors for the development of renal failure, and to develop an algorithm to help in the control of the disease. Method: exploratory, descriptive, prospective study with a quantitative approach. Results: a total of 30 patients (75.0%) were diagnosed with kidney failure and the main factors associated with this disease were: advanced age, systemic arterial hypertension, diabetes mellitus, lung diseases, and antibiotic use. Of these, 23 patients (76.6%) showed a reduction in creatinine clearance in the first 24 hours of hospitalization. Conclusion: a decline in renal function was observed in a significant number of subjects; therefore, an algorithm was developed with the aim of helping in the control of renal failure in a practical and functional way. PMID:26107827

  7. Worsening renal function definition is insufficient for evaluating acute renal failure in acute heart failure.

    PubMed

    Shirakabe, Akihiro; Hata, Noritake; Kobayashi, Nobuaki; Okazaki, Hirotake; Matsushita, Masato; Shibata, Yusaku; Nishigoori, Suguru; Uchiyama, Saori; Asai, Kuniya; Shimizu, Wataru

    2018-06-01

    Whether or not the definition of a worsening renal function (WRF) is adequate for the evaluation of acute renal failure in patients with acute heart failure is unclear. One thousand and eighty-three patients with acute heart failure were analysed. A WRF, indicated by a change in serum creatinine ≥0.3 mg/dL during the first 5 days, occurred in 360 patients, while no-WRF, indicated by a change <0.3 mg/dL, occurred in 723 patients. Acute kidney injury (AKI) upon admission was defined based on the ratio of the serum creatinine value recorded on admission to the baseline creatinine value and placed into groups based on the degree of AKI: no-AKI (n = 751), Class R (risk; n = 193), Class I (injury; n = 41), or Class F (failure; n = 98). The patients were assigned to another set of four groups: no-WRF/no-AKI (n = 512), no-WRF/AKI (n = 211), WRF/no-AKI (n = 239), and WRF/AKI (n = 121). A multivariate logistic regression model found that no-WRF/AKI and WRF/AKI were independently associated with 365 day mortality (hazard ratio: 1.916; 95% confidence interval: 1.234-2.974 and hazard ratio: 3.622; 95% confidence interval: 2.332-5.624). Kaplan-Meier survival curves showed that the rate of any-cause death during 1 year was significantly poorer in the no-WRF/AKI and WRF/AKI groups than in the WRF/no-AKI and no-WRF/no-AKI groups and in Class I and Class F than in Class R and the no-AKI group. The presence of AKI on admission, especially Class I and Class F status, is associated with a poor prognosis despite the lack of a WRF within the first 5 days. The prognostic ability of AKI on admission may be superior to WRF within the first 5 days. © 2018 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.
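
    The grouping logic above can be sketched in a few lines, using the RIFLE creatinine-ratio thresholds (R: ≥1.5× baseline, I: ≥2×, F: ≥3×) on which this kind of classification is based, together with the 0.3 mg/dL WRF criterion; treat this as an illustration of the definitions rather than the study's analysis code.

```python
def aki_class(creatinine_admission, creatinine_baseline):
    """Admission AKI class from the admission/baseline creatinine ratio."""
    ratio = creatinine_admission / creatinine_baseline
    if ratio >= 3.0:
        return "Class F (failure)"
    if ratio >= 2.0:
        return "Class I (injury)"
    if ratio >= 1.5:
        return "Class R (risk)"
    return "no-AKI"

def wrf(creatinine_day5, creatinine_admission, threshold=0.3):
    """WRF: creatinine rise >= 0.3 mg/dL within the first 5 days."""
    return (creatinine_day5 - creatinine_admission) >= threshold

# Patients are then cross-classified, e.g. "no-WRF/AKI" vs. "WRF/no-AKI".
print(aki_class(2.4, 1.0), wrf(1.2, 1.0))
```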

  8. Evaluation of Progressive Failure Analysis and Modeling of Impact Damage in Composite Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Sanchez, Christopher M.

    2011-01-01

    NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, test and evaluation of impact-damaged composites is in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.

  9. Risk measures for power failures in transmission systems

    NASA Astrophysics Data System (ADS)

    Cassidy, Alex; Feinstein, Zachary; Nehorai, Arye

    2016-11-01

    We present a novel framework for evaluating the risk of failures in power transmission systems. We use the concept of systemic risk measures from the financial mathematics literature with models of power system failures in order to quantify the risk of the entire power system for design and comparative purposes. The proposed risk measures provide the collection of capacity vectors for the components in the system that lead to acceptable outcomes. Keys to the formulation of our measures of risk are two elements: a model of system behavior that provides the (distribution of) outcomes based on component capacities and an acceptability criterion that determines whether a (random) outcome is acceptable from an aggregated point of view. We examine the effects of altering the line capacities on energy not served under a variety of networks, flow manipulation methods, load shedding schemes, and load profiles using Monte Carlo simulations. Our results provide a quantitative comparison of the performance of these schemes, measured by the required line capacity. These results provide more complete descriptions of the risks of power failures than the previous, one-dimensional metrics.
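
    A toy version of the simulation study: estimate expected energy not served (ENS) for a candidate line-capacity vector by Monte Carlo and test it against an acceptability threshold, so that the acceptance set of capacity vectors can be explored. The single-line load model is a deliberate simplification of the network flow and load-shedding models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def expected_ens(capacities, load_mean, load_sd, n_trials=10000):
    loads = rng.normal(load_mean, load_sd, size=(n_trials, len(capacities)))
    shed = np.clip(loads - capacities, 0.0, None)   # unserved load per line
    return shed.sum(axis=1).mean()

def acceptable(capacities, threshold=1.0, **kw):
    """Capacity vector is in the risk measure's acceptance set if the
    mean ENS stays below the chosen threshold."""
    return expected_ens(np.asarray(capacities), **kw) <= threshold

print(acceptable([120, 80], load_mean=[100, 60], load_sd=[15, 10]))
```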

  10. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  11. Evaluating the Phoenix definition of biochemical failure after (125)I prostate brachytherapy: Can PSA kinetics distinguish PSA failures from PSA bounces?

    PubMed

    Thompson, Anna; Keyes, Mira; Pickles, Tom; Palma, David; Moravan, Veronika; Spadinger, Ingrid; Lapointe, Vincent; Morris, W James

    2010-10-01

    To evaluate the prostate-specific antigen (PSA) kinetics of PSA failure (PSAf) and PSA bounce (PSAb) after permanent (125)I prostate brachytherapy (PB). The study included 1,006 consecutive low and "low tier" intermediate-risk patients treated with (125)I PB, with a potential minimum follow-up of 4 years. Patients who met the Phoenix definition of biochemical failure (nadir + 2 ng/mL) were identified. If the PSA subsequently fell to ≤0.5 ng/mL without intervention, this was considered a PSAb. All others were scored as true PSAf. Patient, tumor and dosimetric characteristics were compared between groups using the chi-square test and analysis of variance to evaluate factors associated with PSAf or PSAb. Median follow-up was 54 months. Of the 1,006 men, 57 patients triggered the Phoenix definition of PSA failure; 32 (56%) were true PSAf and 25 (44%) were PSAb. The median time to trigger nadir + 2 was 20.6 months (range, 6-36) vs. 49 months (range, 12-83) for the PSAb vs. PSAf groups (p < 0.001). The PSAb patients were significantly younger (p < 0.0001), had a shorter time to reach the nadir (median 6 vs. 11.5 months, p = 0.001) and had a shorter PSA doubling time (p = 0.05). Men younger than age 70 who trigger nadir + 2 PSA failure within 38 months of implant have an 80% likelihood of having PSAb and a 20% chance of PSAf. With adequate follow-up, 44% of PSA failures by the Phoenix definition in our cohort were found to be benign PSA bounces. Our study reinforces the need for adequate follow-up when reporting PB PSA outcomes, to ensure accurate estimates of treatment efficacy and to avoid unnecessary secondary interventions. 2010. Published by Elsevier Inc. All rights reserved.

  12. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method.

    PubMed

    Deng, Xinyang; Jiang, Wen

    2017-09-12

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing method to show the effectiveness of the proposed model.
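
    For contrast with the paper's fuzzy/D-numbers approach, the traditional RPN baseline it criticizes is one line of arithmetic; the invented example below also shows the tie problem (three quite different failure modes, identical RPN) that motivates the richer models.

```python
# Traditional FMEA risk priority number: RPN = S x O x D on 1-10 scales.
failure_modes = {
    "seal leak":      (9, 3, 2),   # (severity, occurrence, detection)
    "sensor drift":   (6, 3, 3),
    "connector wear": (2, 9, 3),
}
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
for mode in sorted(rpn, key=rpn.get, reverse=True):
    print(mode, rpn[mode])   # all three tie at 54 despite different risk profiles
```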

  14. Quantitative Evaluation of Performance during Robot-assisted Treatment.

    PubMed

    Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G

    2016-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training, exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) underwent training with Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed to perform a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1. This improvement was functionally validated by a significant increase of the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is reliable with respect to a well-known clinical scale.

  15. A new method to evaluate image quality of CBCT images quantitatively without observers

    PubMed Central

    Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori

    2017-01-01

    Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343

  16. Prediction of Emergent Heart Failure Death by Semi-Quantitative Triage Risk Stratification

    PubMed Central

    Van Spall, Harriette G. C.; Atzema, Clare; Schull, Michael J.; Newton, Gary E.; Mak, Susanna; Chong, Alice; Tu, Jack V.; Stukel, Thérèse A.; Lee, Douglas S.

    2011-01-01

    Objectives: Generic triage risk assessments are widely used in the emergency department (ED), but have not been validated for prediction of short-term risk among patients with acute heart failure (HF). Our objective was to evaluate the Canadian Triage Acuity Scale (CTAS) for prediction of early death among HF patients. Methods: We included patients presenting with HF to an ED in Ontario from Apr 2003 to Mar 2007. We used the National Ambulatory Care Reporting System and vital statistics databases to examine care and outcomes. Results: Among 68,380 patients (76±12 years, 49.4% men), early mortality was stratified with death rates of 9.9%, 1.9%, 0.9%, and 0.5% at 1-day, and 17.2%, 5.9%, 3.8%, and 2.5% at 7-days, for CTAS 1, 2, 3, and 4–5, respectively. Compared to lower acuity (CTAS 4–5) patients, adjusted odds ratios (aOR) for 1-day death were 1.32 (95%CI; 0.93–1.88; p = 0.12) for CTAS 3, 2.41 (95%CI; 1.71–3.40; p<0.001) for CTAS 2, and highest for CTAS 1: 9.06 (95%CI; 6.28–13.06; p<0.001). Predictors of triage-critical (CTAS 1) status included oxygen saturation <90% (aOR 5.92, 95%CI; 3.09–11.81; p<0.001), respiratory rate >24 breaths/minute (aOR 1.96, 95%CI; 1.05–3.67; p = 0.034), and arrival by paramedic (aOR 3.52, 95%CI; 1.70–8.02; p = 0.001). While age/sex-adjusted CTAS score provided good discrimination for ED (c-statistic = 0.817) and 1-day (c-statistic = 0.724) death, mortality prediction was improved further after accounting for cardiac and non-cardiac co-morbidities (c-statistics 0.882 and 0.810, respectively; both p<0.001). Conclusions: A semi-quantitative triage acuity scale assigned at ED presentation and based largely on respiratory factors predicted emergent death among HF patients. PMID:21853068

  17. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  18. A Systematic Quantitative-Qualitative Model: How To Evaluate Professional Services

    ERIC Educational Resources Information Center

    Yoda, Koji

    1973-01-01

    The proposed evaluation model provides for the assignment of relative weights to each criterion, and establishes a weighting system for calculating a quantitative-qualitative raw score for each service activity of a faculty member being reviewed. (Author)

  20. Significance of Sarcopenia Evaluation in Acute Decompensated Heart Failure.

    PubMed

    Tsuchida, Keiichi; Fujihara, Yuki; Hiroki, Jiro; Hakamata, Takahiro; Sakai, Ryohei; Nishida, Kota; Sudo, Koji; Tanaka, Komei; Hosaka, Yukio; Takahashi, Kazuyoshi; Oda, Hirotaka

    2018-01-27

    In patients with chronic heart failure (HF), the clinical importance of sarcopenia has been recognized in relation to disease severity, reduced exercise capacity, and adverse clinical outcome. Nevertheless, its impact on acute decompensated heart failure (ADHF) is still poorly understood. Dual-energy X-ray absorptiometry (DXA) is a technique for quantitatively analyzing muscle mass and the degree of sarcopenia. Fat-free mass index (FFMI) is a noninvasive and easily applicable marker of muscle mass. This was a prospective observational cohort study comprising 38 consecutive patients hospitalized for ADHF. Sarcopenia, derived from DXA, was defined as a skeletal muscle mass index (SMI) two standard deviations below the mean for healthy young subjects. FFMI (kg/m²) was calculated as 7.38 + 0.02908 × urinary creatinine (mg/day) divided by the square of height (m²). Sarcopenia was present in 52.6% of study patients. B-type natriuretic peptide (BNP) levels were significantly higher in ADHF patients with sarcopenia than in those without sarcopenia (1666 versus 429 pg/mL, P < 0.0001). Receiver operating characteristic curves were used to compare the predictive accuracy of SMI and FFMI for higher BNP levels. Areas under the curve for SMI and FFMI were 0.743 and 0.717, respectively. Multiple logistic regression analysis showed sarcopenia as a predictor of higher BNP level (OR = 18.4; 95% CI, 1.86-181.27; P = 0.013). Sarcopenia is associated with increased disease severity in ADHF. SMI based on DXA is potentially superior to FFMI in terms of predicting the degree of severity, but FFMI is also associated with ADHF severity.
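
    The quoted FFMI formula is directly computable; the sketch below (Python) assumes the usual reading that the entire creatinine-derived fat-free mass estimate is divided by height squared, and the sample inputs are invented:

      def ffmi(urinary_creatinine_mg_per_day: float, height_m: float) -> float:
          """FFMI (kg/m^2): fat-free mass (kg) estimated as
          7.38 + 0.02908 * urinary creatinine (mg/day), divided by height^2."""
          fat_free_mass_kg = 7.38 + 0.02908 * urinary_creatinine_mg_per_day
          return fat_free_mass_kg / height_m ** 2

      print(round(ffmi(1200.0, 1.70), 1))  # (7.38 + 34.9) / 2.89 ~= 14.6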

  1. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  2. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu

    2017-11-01

    The result of a remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured at the end-of-life (EOL) stage of its lifecycle. The objectivity and accuracy of the evaluation are key to any such method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional scoring intuition. A quantitative evaluation model based on the catastrophe progression method is established and applied to the remanufacturing evaluation of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.
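
    For orientation, the aggregation step of the catastrophe progression method can be sketched as below (Python). The exponents follow the standard cusp/swallowtail/butterfly normalization formulas; the indicator hierarchy, scores, importance ordering, and the 'complementary' averaging convention are all illustrative assumptions, and the paper's comprehensive adjustment step is not reproduced:

      import numpy as np

      # Exponents of the standard catastrophe models, keyed by the number of
      # control variables: cusp (2), swallowtail (3), butterfly (4).
      EXPONENTS = {2: [1/2, 1/3], 3: [1/2, 1/3, 1/4], 4: [1/2, 1/3, 1/4, 1/5]}

      def catastrophe_level(values):
          """Aggregate normalized indicators (each in [0, 1]) one level up,
          averaging the transformed values ('complementary' convention) and
          assuming values are already ordered by importance."""
          exps = EXPONENTS[len(values)]
          return float(np.mean([v ** e for v, e in zip(values, exps)]))

      # Hypothetical bottom-level scores for a retired machine tool.
      condition = catastrophe_level([0.8, 0.7])       # e.g., precision, wear
      economy = catastrophe_level([0.9, 0.7, 0.6])    # e.g., value, cost, demand
      print(round(catastrophe_level([condition, economy]), 3))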

  3. A Framework for Quantitative Evaluation of Care Coordination Effectiveness

    ERIC Educational Resources Information Center

    Liu, Wei

    2017-01-01

    The U.S. healthcare system lacks incentives and quantitative evaluation tools for assessing coordination in a patient's care transition process. Such tools are needed because poor care coordination has been identified by many studies as one of the major root causes of the U.S. health system's inefficiency, poor outcomes, and high cost. Despite…

  4. Slope failures evaluation and landslides investigation using 2-D resistivity method

    NASA Astrophysics Data System (ADS)

    Nordiana, M. M.; Azwin, I. N.; Nawawi, M. N. M.; Khalil, A. E.

    2018-06-01

    Slope failure is a complex phenomenon that may lead to landslides. Buildings and infrastructure such as transportation facilities and pipelines located within the boundaries of a landslide can be damaged or destroyed. Slope failure classification and the various factors contributing to instability were studied using a 2-D resistivity survey conducted in Selangor, Malaysia. Six 2-D resistivity survey lines were acquired with a pole-dipole array and a minimum electrode spacing of 5 m. The data were processed using Res2Dinv and Surfer10 software to evaluate the subsurface characteristics. The 2-D resistivity results show that the subsurface consists of two main zones. The first zone was alluvium or highly weathered material with resistivity values of 100-1000 Ω m and depths of >30 m; it included saturated areas with resistivity values of 1-100 Ω m and boulders with resistivity values of 1200-7000 Ω m. The second zone, with resistivity values of >7000 Ω m, was interpreted as granitic bedrock. The study area was characterized by saturated zones, highly weathered material, and a high content of sand and boulders, conditions that can trigger slope failure through low soil strength, debris flow, and earth movement. On the basis of the case examples described, the 2-D resistivity method is a desirable and useful tool for the determination of slope failure and for future assessments.
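
    The interpretation above amounts to a resistivity-to-zone lookup; a toy version follows (Python), with boundaries taken from the quoted ranges and the gap between 1000 and 1200 Ω m resolved arbitrarily:

      def classify_resistivity(rho_ohm_m: float) -> str:
          """Map a resistivity value (ohm-m) to the zones interpreted above."""
          if rho_ohm_m < 100:
              return "saturated zone"
          if rho_ohm_m <= 1000:
              return "alluvium / highly weathered material"
          if rho_ohm_m <= 7000:
              return "boulders"
          return "granitic bedrock"

      for rho in (50, 400, 3000, 9000):
          print(rho, "->", classify_resistivity(rho))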

  5. Usability Evaluation of a Web-Based Symptom Monitoring Application for Heart Failure.

    PubMed

    Wakefield, Bonnie; Pham, Kassie; Scherubel, Melody

    2015-07-01

    Symptom recognition and reporting by patients with heart failure are critical to avoid hospitalization. This project evaluated a patient symptom tracking application. Fourteen end users (nine patients, five clinicians) from a Midwestern Veterans Affairs Medical Center evaluated the website using a think aloud protocol. A structured observation protocol was used to assess success or failure for each task. Measures included task time, success, and satisfaction. Patients had a mean age of 70 years; clinicians averaged 42 years in age. Patients took 9.3 min and clinicians took less than 3 min per scenario. Most patients needed some assistance, but few patients were completely unable to complete some tasks. Clinicians demonstrated few problems navigating the site. Patient System Usability Scale item scores ranged from 2.0 to 3.6; clinician item scores ranged from 1.8 to 4.0. Further work is needed to determine whether using the web-based tool improves symptom recognition and reporting. © The Author(s) 2015.

  6. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.

    PubMed

    Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-04-01

    To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.
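
    The two-step point counting reduces to simple arithmetic; in the sketch below (Python) the point counts, grid area per point, and section spacing are invented, chosen only so that the volume comes out near the 4.4 mm³ reported above:

      from collections import Counter

      # Step 2 analogue: hypothetical point counts pooled over all sections.
      counts = Counter(hyaline=220, fibrocartilage=140, fibrous=60,
                       bone=30, scaffold=40, other=10)
      total = sum(counts.values())
      fractions = {tissue: n / total for tissue, n in counts.items()}

      # Step 1 analogue: Cavalieri-type volume estimate from points hitting
      # the defect, the grid area per point, and the section spacing.
      points_in_defect, area_per_point_mm2, spacing_mm = 500, 0.01, 0.88
      defect_volume_mm3 = points_in_defect * area_per_point_mm2 * spacing_mm
      print(defect_volume_mm3, round(fractions["hyaline"], 2))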

  7. Prediction of Hip Failure Load: In Vitro Study of 80 Femurs Using Three Imaging Methods and Finite Element Models-The European Fracture Study (EFFECT).

    PubMed

    Pottecher, Pierre; Engelke, Klaus; Duchemin, Laure; Museyko, Oleg; Moser, Thomas; Mitton, David; Vicaut, Eric; Adams, Judith; Skalli, Wafa; Laredo, Jean Denis; Bousson, Valérie

    2016-09-01

    Purpose To evaluate the performance of three imaging methods (radiography, dual-energy x-ray absorptiometry [DXA], and quantitative computed tomography [CT]) and that of a numerical analysis with finite element modeling (FEM) in the prediction of failure load of the proximal femur and to identify the best densitometric or geometric predictors of hip failure load. Materials and Methods Institutional review board approval was obtained. A total of 40 pairs of excised cadaver femurs (mean patient age at time of death, 82 years ± 12 [standard deviation]) were examined with (a) radiography to measure geometric parameters (lengths, angles, and cortical thicknesses), (b) DXA (reference standard) to determine areal bone mineral densities (BMDs), (c) quantitative CT with dedicated three-dimensional analysis software to determine volumetric BMDs and geometric parameters (neck axis length, cortical thicknesses, volumes, and moments of inertia), and (d) quantitative CT-based FEM to calculate a numerical value of failure load. The 80 femurs were fractured via mechanical testing, with random assignment of one femur from each pair to the single-limb stance configuration (hereafter, stance configuration) and assignment of the paired femur to the sideways fall configuration (hereafter, side configuration). Descriptive statistics, univariate correlations, and stepwise regression models were obtained for each imaging method and for FEM to enable us to predict failure load in both configurations. Results Statistics reported are for stance and side configurations, respectively. For radiography, the strongest correlation with mechanical failure load was obtained by using a geometric parameter combined with a cortical thickness (r² = 0.66, P < .001; r² = 0.65, P < .001). For DXA, the strongest correlation with mechanical failure load was obtained by using total BMD (r² = 0.73, P < .001) and trochanteric BMD (r² = 0.80, P < .001). For quantitative CT, in both configurations

  8. The second Sandia Fracture Challenge. Predictions of ductile failure under quasi-static and moderate-rate dynamic loading

    DOE PAGES

    Boyce, B. L.; Kramer, S. L. B.; Bosiljevac, T. R.; ...

    2016-03-14

    Ductile failure of structural metals is relevant to a wide range of engineering scenarios. Computational methods are employed to anticipate the critical conditions of failure, yet they sometimes provide inaccurate and misleading predictions. Challenge scenarios, such as the one presented in the current work, provide an opportunity to assess the blind, quantitative predictive ability of simulation methods against a previously unseen failure problem. Instead of evaluating the predictions of a single simulation approach, the Sandia Fracture Challenge relied on numerous volunteer teams with expertise in computational mechanics to apply a broad range of computational methods, numerical algorithms, and constitutive models to the challenge. This exercise is intended to evaluate the state of health of technologies available for failure prediction. In the first Sandia Fracture Challenge, a wide range of issues were raised in ductile failure modeling, including a lack of consistency in failure models, the importance of shear calibration data, and difficulties in quantifying the uncertainty of prediction [see Boyce et al. (Int J Fract 186:5-68, 2014) for details of these observations]. This second Sandia Fracture Challenge investigated the ductile rupture of a Ti-6Al-4V sheet under both quasi-static and modest-rate dynamic loading (failure in ~0.1 s). Like the previous challenge, the sheet had an unusual arrangement of notches and holes that added geometric complexity and fostered a competition between tensile- and shear-dominated failure modes. The teams were asked to predict the fracture path and quantitative far-field failure metrics such as the peak force and displacement to cause crack initiation. Fourteen teams contributed blind predictions, and the experimental outcomes were quantified in three independent test labs. In addition, shortcomings were revealed in this second challenge such as inconsistency in the application of appropriate boundary

  9. The second Sandia Fracture Challenge. Predictions of ductile failure under quasi-static and moderate-rate dynamic loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyce, B. L.; Kramer, S. L. B.; Bosiljevac, T. R.

    Ductile failure of structural metals is relevant to a wide range of engineering scenarios. Computational methods are employed to anticipate the critical conditions of failure, yet they sometimes provide inaccurate and misleading predictions. Challenge scenarios, such as the one presented in the current work, provide an opportunity to assess the blind, quantitative predictive ability of simulation methods against a previously unseen failure problem. Instead of evaluating the predictions of a single simulation approach, the Sandia Fracture Challenge relied on numerous volunteer teams with expertise in computational mechanics to apply a broad range of computational methods, numerical algorithms, and constitutive models to the challenge. This exercise is intended to evaluate the state of health of technologies available for failure prediction. In the first Sandia Fracture Challenge, a wide range of issues were raised in ductile failure modeling, including a lack of consistency in failure models, the importance of shear calibration data, and difficulties in quantifying the uncertainty of prediction [see Boyce et al. (Int J Fract 186:5-68, 2014) for details of these observations]. This second Sandia Fracture Challenge investigated the ductile rupture of a Ti-6Al-4V sheet under both quasi-static and modest-rate dynamic loading (failure in ~0.1 s). Like the previous challenge, the sheet had an unusual arrangement of notches and holes that added geometric complexity and fostered a competition between tensile- and shear-dominated failure modes. The teams were asked to predict the fracture path and quantitative far-field failure metrics such as the peak force and displacement to cause crack initiation. Fourteen teams contributed blind predictions, and the experimental outcomes were quantified in three independent test labs. In addition, shortcomings were revealed in this second challenge such as inconsistency in the application of appropriate boundary

  10. Evaluating the operational risks of biomedical waste using failure mode and effects analysis.

    PubMed

    Chen, Ying-Chu; Tsai, Pei-Yi

    2017-06-01

    The potential problems and risks of biomedical waste generation have become increasingly apparent in recent years. This study applied a failure mode and effects analysis to evaluate the operational problems and risks of biomedical waste. The microbiological contamination of biomedical waste seldom receives the attention of researchers. In this study, the biomedical waste lifecycle was divided into seven processes: production, classification, packaging, sterilisation, weighing, storage, and transportation. Twenty main failure modes were identified in these phases, and risks were assessed based on their risk priority numbers. The failure modes in the production phase accounted for the highest proportion of the risk priority number score (27.7%). In the packaging phase, the failure mode 'sharp articles not placed in solid containers' had the highest risk priority number score, mainly owing to its high severity rating. The sterilisation process is the main difference between the treatment of infectious and non-infectious biomedical waste. The failure modes in the sterilisation phase were mainly owing to human factors (mostly related to operators). This study increases the understanding of the potential problems and risks associated with biomedical waste, thereby increasing awareness of how to improve the management of biomedical waste to better protect workers, the public, and the environment.
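
    The ranking machinery of such an FMEA is compact: RPN = severity × occurrence × detectability, with failure modes sorted by the product. In the sketch below (Python), the three failure modes and all ratings are invented for illustration:

      from dataclasses import dataclass

      @dataclass
      class FailureMode:
          phase: str
          description: str
          severity: int       # 1-10
          occurrence: int     # 1-10
          detectability: int  # 1-10 (10 = hardest to detect)

          @property
          def rpn(self) -> int:
              return self.severity * self.occurrence * self.detectability

      modes = [
          FailureMode("packaging", "sharps not placed in solid containers", 9, 5, 6),
          FailureMode("sterilisation", "cycle verification skipped", 8, 4, 5),
          FailureMode("production", "infectious waste mixed with general", 7, 6, 4),
      ]
      for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
          print(f"{m.rpn:4d}  {m.phase:13s} {m.description}")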

  11. Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.

    PubMed

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  12. Quantitative evaluation of the voice range profile in patients with voice disorder.

    PubMed

    Ikeda, Y; Masuda, T; Manako, H; Yamashita, H; Yamamoto, T; Komiyama, S

    1999-01-01

    In 1953, Calvet first displayed the fundamental frequency (pitch) and sound pressure level (intensity) of the voice on a two-dimensional plane, creating the voice range profile. This profile has been used clinically to evaluate various vocal disorders, although such evaluations to date have been subjective, without quantitative assessment. In the present study, a quantitative system was developed to evaluate the voice range profile utilizing a personal computer. The area of the voice range profile was defined as the voice volume. This volume was analyzed in 137 males and 175 females who were treated for various dysphonias at Kyushu University between 1984 and 1990. Ten normal subjects served as controls. The voice volume in cases with voice disorders significantly decreased irrespective of the disease and sex. Furthermore, cases having better improvement after treatment showed a tendency for the voice volume to increase. These findings demonstrate that the voice volume is a useful clinical test for evaluating voice control in cases with vocal disorders.

  13. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. © 2015 British Association of Dermatologists.
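
    The EN 1276 pass criterion quoted above is a simple log-reduction check; the colony counts below are hypothetical:

      import math

      def log10_reduction(n0_cfu_per_ml: float, n_cfu_per_ml: float) -> float:
          """Log10 reduction in viable count after exposure."""
          return math.log10(n0_cfu_per_ml / n_cfu_per_ml)

      reduction = log10_reduction(1e7, 50)   # ~5.3
      print(reduction, "pass" if reduction >= 5.0 else "fail")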

  14. Evaluating the risk of water distribution system failure: A shared frailty model

    NASA Astrophysics Data System (ADS)

    Clark, Robert M.; Thurnau, Robert C.

    2011-12-01

    Condition assessment (CA) modeling is drawing increasing interest as a technique that can assist in managing drinking water infrastructure. This paper develops a model based on the application of a Cox proportional hazards (PH)/shared frailty model and applies it to evaluating the risk of failure in drinking water networks, using data from the Laramie Water Utility (located in Laramie, Wyoming, USA). Using the risk model, a cost/benefit analysis incorporating the inspection value method (IVM) is used to assist in making improved repair, replacement, and rehabilitation decisions for selected drinking water distribution system pipes. A separate model is developed to predict failures in prestressed concrete cylinder pipe (PCCP). Various currently available inspection technologies are presented and discussed.
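
    A plain Cox proportional hazards fit of this kind can be sketched with the lifelines Python library; the synthetic pipe inventory below is invented, and the shared-frailty extension (a cluster-level random effect) used in the paper is not shown:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(1)
      n = 200
      diameter = rng.choice([100, 150, 200, 300], size=n)
      prior_breaks = rng.poisson(1.0, size=n)
      # Hypothetical hazard: rises with prior breaks, falls with diameter.
      t = rng.exponential(40.0, size=n) * np.exp(-0.3 * prior_breaks + 0.002 * diameter)
      df = pd.DataFrame({
          "duration": np.minimum(t, 60.0),      # censor at a 60-year window
          "failed": (t < 60.0).astype(int),
          "diameter_mm": diameter,
          "n_prior_breaks": prior_breaks,
      })
      cph = CoxPHFitter()
      cph.fit(df, duration_col="duration", event_col="failed")
      print(cph.summary[["coef", "exp(coef)", "p"]])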

  15. Determination of a tissue-level failure evaluation standard for rat femoral cortical bone utilizing a hybrid computational-experimental method.

    PubMed

    Fan, Ruoxun; Liu, Jie; Jia, Zhengbin; Deng, Ying; Liu, Jun

    2018-01-01

    Macro-level failure in bone structure can be diagnosed by pain or physical examination. However, diagnosing tissue-level failure in a timely manner is challenging due to the difficulty of observing the interior mechanical environment of bone tissue. Because most fractures begin with tissue-level failure caused by continually applied loading, efforts have been made to monitor the tissue-level failure of bone and to provide corresponding measures to prevent fracture. Many tissue-level mechanical parameters of bone can be predicted or measured; however, a parameter's value may vary among specimens of the same bone structure, even at the same age and anatomical site. These variations make it difficult to represent tissue-level bone failure with a single parameter value. Therefore, determining an appropriate tissue-level failure evaluation standard is necessary. In this study, the yield and failure processes of rat femoral cortical bones were simulated through a hybrid computational-experimental method. Subsequently, the tissue-level strains and the ratio between tissue-level failure and yield strains in cortical bones were predicted. The results indicated that certain differences existed in tissue-level strains; however, only slight variations in the ratio were observed among different cortical bones. Therefore, the ratio between tissue-level failure and yield strains for a given bone structure can be determined, and this ratio may be regarded as an appropriate tissue-level failure evaluation standard to represent the mechanical status of bone tissue.

  16. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

    Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4%, respectively. Reproducibility of the test was 99.2%, and interobserver variation was 8%, with a false positive rate of 3.4%. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
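
    The reported performance figures follow from a standard 2 × 2 confusion table; the counts below are reconstructed to be consistent with the reported rates and the n = 305 sample, not taken from the paper:

      def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
              "accuracy": (tp + tn) / (tp + fp + tn + fn),
          }

      # 68 + 8 + 229 + 0 = 305 subjects; reproduces 100/96.6/89.5/100/97.4%.
      for name, value in diagnostic_metrics(tp=68, fp=8, tn=229, fn=0).items():
          print(f"{name}: {100 * value:.1f}%")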

  17. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  18. Midterm prospective evaluation of TVT-Secur reveals high failure rate.

    PubMed

    Cornu, Jean-Nicolas; Sèbe, Philippe; Peyrat, Laurence; Ciofu, Calin; Cussenot, Olivier; Haab, Francois

    2010-07-01

    TVT-Secur has been described as a new minimally invasive sling for women's stress urinary incontinence (SUI) management, showing promising results in short-term studies. Our goal was to evaluate the outcome of this procedure after a midterm follow-up. A prospective evaluation involved 45 consecutive patients presenting SUI associated with urethral hypermobility. Fourteen patients preoperatively reported overactive bladder (OAB) symptoms, but none had objective detrusor overactivity. Eight patients had low maximal urethral closure pressure (MUCP). Four patients had pelvic organ prolapse (POP). Patients with POP were treated under general anesthesia by Prolift and TVT-Secur procedure. The 41 other patients received TVT-Secur under local anesthesia on an outpatient basis. All interventions were made by the same surgeon. Postoperative assessment included pad count, bladder diary, clinical examination with stress test, evaluation of satisfaction with the Patient Global Impression of Improvement (PGI-I) scale, and evaluation of side effects. Patients were classified as cured if they used no pads, had no leakage, and had a PGI-I score ≤ 2; as improved in case of reduction of SUI symptoms >50% and PGI-I score ≤ 3; and as failure otherwise. Mean postoperative follow-up was 30.2 ± 9.8 mo (range: 11-40 mo). Short-term evaluation showed a 93.5% success rate, but, at last follow-up, only 18 (40%) patients were cured, while 8 (18%) were improved, and 19 (42%) failed. Twelve patients underwent implantation of TVT or transobturator tape during follow-up. Age, MUCP, or OAB were not associated with failure. Side effects were limited to five cases of de novo OAB and three cases of urinary tract infection. This work is limited by the absence of a comparison group. Our experience shows that despite its good short-term efficacy, TVT-Secur is associated with a high recurrence rate of SUI. Therefore, TVT-Secur does not seem appropriate for SUI first-line management in women

  19. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715

  20. Quantitative framework for prospective motion correction evaluation.

    PubMed

    Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2016-02-01

    To establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI, considering motion variability between MRI scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, accounting for the fact that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was considered by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge fixations) were used to test the framework. Two metrics were investigated to quantify the improvement of image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance in comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC. The mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.

  1. TSS-1R Failure Mode Evaluation

    NASA Technical Reports Server (NTRS)

    Vaughn, Jason A.; McCollum, Matthew B.; Kamenetzky, Rachel R.

    1997-01-01

    Soon after the break of the tether during the Tethered Satellite System (TSS-1R) mission in February 1996, a Tiger Team was assembled at the George C. Marshall Space Flight Center to determine the tether failure mode. One possible failure scenario was that the Kevlar strength member of the tether failed because of degradation due to electrical discharge or electrical arcing. During the next several weeks, extensive electrical discharge testing in low vacuum and plasma environments was conducted in an attempt to reproduce the electrical activity recorded by on-board science instruments during the mission. The results of these tests are presented in this paper.

  2. Mothers' Attributions in Reminiscing Conversations about Children's Successes and Failures: Connections with Children's Self-Evaluations

    ERIC Educational Resources Information Center

    Goodvin, Rebecca; Rolfson, Jacqueline

    2014-01-01

    Effects of feedback on children's self-evaluations are well established, yet little is known about how parents talk with children about everyday successes and failures, despite the importance of parent-child reminiscing in children's psychological understanding. We examine mothers' attributions and performance evaluations in conversations about…

  3. Quantitative evaluation research of glare from automotive headlamps

    NASA Astrophysics Data System (ADS)

    Wang, Tiecheng; Qian, Rui; Cao, Ye; Gao, Mingqiu

    2018-01-01

    This study concerns the quantitative evaluation of glare from automotive headlamps. In current regulations, only one point on the test screen is used to judge whether a driver can bear the light from the headlamps of an oncoming vehicle. To evaluate the practical effect of glare, we adopt a glare zone carrying probability-distribution information about the oncoming driver's eye position. Within this focus area, the glare level of a headlamp is represented by a weighted luminous flux. To determine the illuminance value most comfortable to human eyes at 50 m, we used test point B50L as the observation position and collected 1,000 subjective evaluation data points from 20 test personnel of different ages over two months. Based on the assessment results, we calculated 0.60 lx as the recommended value for a standardized testing procedure at 25 m. We then derived 0.38 lm as the optimum value and 0.25/1.20 lm as the limiting values according to the regulations. We tested 40 sample vehicles of different classes to verify the sectional nonlinear quantitative evaluation method we designed, and analyzed the typical test results.
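
    The weighted-luminous-flux idea can be sketched as a probability-weighted sum over eye positions in the glare zone; the weights and per-position flux values below are invented:

      import numpy as np

      # Hypothetical eye-position grid in the glare zone: occupancy
      # probabilities (summing to 1) and per-position luminous flux (lm).
      probabilities = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
      flux_lm = np.array([0.20, 0.30, 0.45, 0.35, 0.25])
      weighted_flux = float(np.dot(probabilities, flux_lm))
      # Compare with the limits derived in the study (0.25-1.20 lm).
      print(weighted_flux, 0.25 <= weighted_flux <= 1.20)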

  4. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

    The focus of this paper is a study designed to explore the validity of quantitative approaches of student evaluation in a pre-registration degree programme. As managers of the students' education we were concerned that the quantitative method, which used lecturer criteria, may not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments used was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.

  5. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  6. Scare Tactics: Evaluating Problem Decompositions Using Failure Scenarios

    NASA Technical Reports Server (NTRS)

    Helm, B. Robert; Fickas, Stephen

    1992-01-01

    Our interest is in the design of multi-agent problem-solving systems, which we refer to as composite systems. We have proposed an approach to composite system design by decomposition of problem statements. An automated assistant called Critter provides a library of reusable design transformations which allow a human analyst to search the space of decompositions for a problem. In this paper we describe a method for evaluating and critiquing problem decompositions generated by this search process. The method uses knowledge stored in the form of failure decompositions attached to design transformations. We suggest the benefits of our critiquing method by showing how it could re-derive steps of a published development example. We then identify several open issues for the method.

  7. Failure analysis and evaluation of a six cylinders crankshaft for marine diesel generator

    NASA Astrophysics Data System (ADS)

    Khaeroman; Haryadi, Gunawan Dwi; Ismail, R.; Kim, Seon Jin

    2017-01-01

    This paper discusses the failure of a four-stroke, six-cylinder diesel engine crankshaft used in a marine diesel generator. Correct analysis and evaluation of the crankshaft dimensions are essential to prevent crankshaft fracture and cracks. The crankshaft is liable to deformation due to misalignment of the main journal bearings. This article presents the results of a crankshaft failure analysis based on measurements of the mean diameters of the rod journals and the main journals, covering wear, out-of-roundness, taper, etc. The measurement results must be compared with the acceptable values in the engine specification and service manual, and should also follow the American Bureau of Shipping (ABS) guidance notes on propulsion shafting alignment. The measurements show that the main journal diameter of the third cylinder exhibits excessive wear, 1.35% above the permissible lowest rate, with a taper of 0.23 mm and an out-of-roundness of 0.13 mm. The rod journal diameter also indicates excessive wear, 1.06% higher than the permissible lowest rate, with a taper of 0.41 mm and an out-of-roundness of 0.65 mm. The crankshaft warpage (journal run-out) and the crank web deflection are also evaluated and presented in this paper.
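
    The acceptance checks described reduce to simple comparisons of micrometer readings; in the sketch below (Python), the readings are hypothetical and the actual limits come from the engine manual and the ABS guidance notes:

      def out_of_roundness(diameters_at_section):
          """Max minus min diameter measured at one cross-section (mm)."""
          return max(diameters_at_section) - min(diameters_at_section)

      def taper(mean_dia_a, mean_dia_b):
          """Difference between mean diameters at the journal's two ends (mm)."""
          return abs(mean_dia_a - mean_dia_b)

      # Hypothetical readings (mm): two cross-sections, two orientations each.
      section_a = [249.40, 249.35]
      section_b = [249.81, 249.70]
      print("out-of-roundness:",
            max(out_of_roundness(section_a), out_of_roundness(section_b)))
      print("taper:", round(taper(sum(section_a) / 2, sum(section_b) / 2), 2))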

  8. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
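
    A Monte Carlo probability-of-failure estimate over candidate operating points can be sketched in a few lines; the quadratic response model, noise level, and specification below are invented stand-ins for a fitted design-of-experiments model:

      import numpy as np

      rng = np.random.default_rng(7)

      def p_failure(force_kn, binder_pct, n=10_000):
          """Fraction of simulated batches missing a hypothetical dissolution
          spec (>= 90) under a toy fitted response surface plus batch noise."""
          response = 85 + 0.8 * force_kn - 2.5 * (binder_pct - 3.0) ** 2
          noise = rng.normal(0.0, 1.5, size=n)
          return float(np.mean(response + noise < 90.0))

      for force, binder in [(10.0, 3.0), (5.0, 4.5)]:
          print(f"force={force}, binder={binder}: P(fail) ~ {p_failure(force, binder):.3f}")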

  9. A quantitative evaluation of the high elbow technique in front crawl.

    PubMed

    Suito, Hiroshi; Nunome, Hiroyuki; Ikegami, Yasuo

    2017-07-01

    Many coaches instruct swimmers to keep the elbow in a high position (high elbow position) during the early phase of the underwater stroke motion (pull phase) in front crawl; however, the high elbow position has never been quantitatively evaluated. The aims of this study were (1) to quantitatively evaluate the "high elbow" position, (2) to clarify the relationship between the high elbow position and the required upper limb configuration and (3) to examine the efficacy of the high elbow position on the resultant swimming velocity. Sixteen highly skilled and 6 novice male swimmers performed 25 m front crawl with maximal effort and their 3-dimensional arm stroke motion was captured at 60 Hz. An attempt was made to develop a new index to evaluate the high elbow position (I_he: high elbow index) using 3-dimensional coordinates of the shoulder, elbow and wrist joints. The I_he of skilled swimmers moderately correlated with the average shoulder internal rotation angle (r = -0.652, P < 0.01) and swimming velocity (r = -0.683, P < 0.01) during the pull phase. These results indicate that I_he is a useful index for evaluating high elbow arm stroke technique during the pull phase in front crawl.
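
    The abstract does not give the exact construction of I_he, so the sketch below (Python) is only one plausible geometric index, not necessarily the paper's definition: the perpendicular offset of the elbow from the shoulder-wrist line, normalized by upper-limb length, signed so that smaller values correspond to a higher elbow:

      import numpy as np

      def high_elbow_index(shoulder, elbow, wrist):
          """Illustrative index: signed vertical offset of the elbow from its
          projection onto the shoulder-wrist line, over total limb length.
          Coordinates are (x, y, z) with z vertical (up positive)."""
          s, e, w = map(np.asarray, (shoulder, elbow, wrist))
          axis = w - s
          proj = s + axis * np.dot(e - s, axis) / np.dot(axis, axis)
          drop = (e - proj)[2]
          limb = np.linalg.norm(e - s) + np.linalg.norm(w - e)
          return -drop / limb

      print(high_elbow_index((0, 0, 0), (0.15, 0, -0.12), (0.30, 0, -0.45)))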

  10. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and its content was divided into three themes: (1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; (2) measurement and estimation of exposures for better extrapolation to humans; and (3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the
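
    The benchmark dose (BMD) approach mentioned here can be sketched by fitting a dose-response model and solving for the dose at a chosen benchmark response; the toy exponential model and data below are invented, and real BMD work uses vetted model suites and confidence limits (BMDL):

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical mutation-frequency data across doses.
      dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
      response = np.array([1.0, 1.1, 1.3, 1.8, 2.9, 5.2])

      def model(d, a, b):
          return a * np.exp(b * d)  # toy dose-response model

      (a, b), _ = curve_fit(model, dose, response, p0=[1.0, 0.2])
      bmd10 = np.log(1.10) / b  # dose giving a 10% rise over modeled background
      print(f"BMD(10%) ~= {bmd10:.2f}")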

  11. Revised Risk Priority Number in Failure Mode and Effects Analysis Model from the Perspective of Healthcare System

    PubMed Central

    Rezaei, Fatemeh; Yarmohammadian, Mohmmad H.; Haghshenas, Abbas; Fallah, Ali; Ferdosi, Masoud

    2018-01-01

    Background: The methodology of Failure Mode and Effects Analysis (FMEA) is known as an important risk assessment tool and an accreditation requirement of many organizations. For prioritizing failures, the index of "risk priority number (RPN)" is used, valued especially for its ease of use and its subjective evaluations of the occurrence, severity, and detectability of each failure. In this study, we have tried to make the FMEA model more compatible with health-care systems by redefining the RPN index to be closer to reality. Methods: We used a quantitative and qualitative approach in this research. In the qualitative domain, focused group discussions were used to collect data. A quantitative approach was used to calculate the RPN score. Results: We studied the patient's journey in the surgery ward from the holding area to the operating room. The highest-priority failures were determined based on (1) defining inclusion criteria as severity of incident (clinical effect, claim consequence, waste of time and financial loss), occurrence of incident (time-unit occurrence and degree of exposure to risk) and preventability (degree of preventability and defensive barriers); then (2) risk priority criteria were quantified using the RPN index (361 for the highest-rated failure). The improved RPN scores, reassessed by root cause analysis, showed some variations. Conclusions: We concluded that standard criteria should be developed consistent with clinical terminology and specialized scientific fields. Therefore, cooperation and partnership of technical and clinical groups are necessary to modify these models. PMID:29441184

  12. Student evaluations of teaching: teaching quantitative courses can be hazardous to one’s career

    PubMed Central

    Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards. PMID:28503380

  13. Quantitative evaluation of morphological changes in activated platelets in vitro using digital holographic microscopy.

    PubMed

    Kitamura, Yutaka; Isobe, Kazushige; Kawabata, Hideo; Tsujino, Tetsuhiro; Watanabe, Taisuke; Nakamura, Masayuki; Toyoda, Toshihisa; Okudera, Hajime; Okuda, Kazuhiro; Nakata, Koh; Kawase, Tomoyuki

    2018-06-18

    Platelet activation and aggregation have been conventionally evaluated using an aggregometer. However, this method is suitable for short-term but not long-term quantitative evaluation of platelet aggregation, morphological changes, and/or adhesion to specific materials. The recently developed digital holographic microscopy (DHM) has enabled the quantitative evaluation of cell size and morphology without labeling or destruction. Here, we aimed to validate its applicability for quantitatively evaluating changes in cell morphology, especially the aggregation and spreading of activated platelets, modifying typical image analysis procedures to suit aggregated platelets. Freshly prepared platelet-rich plasma was washed with phosphate-buffered saline and treated with 0.1% CaCl2. Platelets were then fixed and subjected to DHM, scanning electron microscopy (SEM), atomic force microscopy, optical microscopy, and flow cytometry (FCM). Tightly aggregated platelets were identified as single cells. Data obtained from time-course experiments were plotted two-dimensionally according to average optical thickness versus attachment area and divided into four regions. The majority of the control platelets, which supposedly contained small and round platelets, were distributed in the lower left region. As activation time increased, however, this population dispersed toward the upper right region. The distribution shift demonstrated by DHM was essentially consistent with data obtained from SEM and FCM. Therefore, DHM was validated as a promising device for testing platelet function, given that it allows for the quantitative evaluation of activation-dependent morphological changes in platelets. DHM technology will be applicable to the quality assurance of platelet concentrates, as well as diagnosis and drug discovery related to platelet functions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  15. EVALUATION OF SAFETY IN A RADIATION ONCOLOGY SETTING USING FAILURE MODE AND EFFECTS ANALYSIS

    PubMed Central

    Ford, Eric C.; Gaudette, Ray; Myers, Lee; Vanderver, Bruce; Engineer, Lilly; Zellars, Richard; Song, Danny Y.; Wong, John; DeWeese, Theodore L.

    2013-01-01

    Purpose Failure mode and effects analysis (FMEA) is a widely used tool for prospectively evaluating safety and reliability. We report our experiences in applying FMEA in the setting of radiation oncology. Methods and Materials We performed an FMEA analysis for our external beam radiation therapy service, which consisted of the following tasks: (1) create a visual map of the process, (2) identify possible failure modes and assign risk probability numbers (RPN) to each failure mode based on tabulated scores for the severity, frequency of occurrence, and detectability, each on a scale of 1 to 10, and (3) identify improvements that are both feasible and effective. The RPN scores can span a range of 1 to 1000, with higher scores indicating the relative importance of a given failure mode. Results Our process map consisted of 269 different nodes. We identified 127 possible failure modes with RPN scores ranging from 2 to 160. Fifteen of the top-ranked failure modes, representing RPN scores of 75 and above, were considered for process improvements. These specific improvement suggestions were incorporated into our practice with a review and implementation by each department team responsible for the process. Conclusions The FMEA technique provides a systematic method for finding vulnerabilities in a process before they result in an error. The FMEA framework can naturally incorporate further quantification and monitoring. A general-use system for incident and near miss reporting would be useful in this regard. PMID:19409731

  16. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest-level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for quantitative risk computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  17. Treatment Failure and Miltefosine Susceptibility in Dermal Leishmaniasis Caused by Leishmania Subgenus Viannia Species

    PubMed Central

    Obonaga, Ricardo; Fernández, Olga Lucía; Valderrama, Liliana; Rubiano, Luisa Consuelo; Castro, Maria del Mar; Barrera, Maria Claudia; Gomez, Maria Adelaida

    2014-01-01

    Treatment failure and parasite drug susceptibility in dermal leishmaniasis caused by Leishmania (Viannia) species are poorly understood. Prospective evaluation of drug susceptibility of strains isolated from individual patients before drug exposure and at clinical failure allows intrinsic and acquired differences in susceptibility to be discerned and analyzed. To determine whether intrinsic susceptibility or loss of susceptibility to miltefosine contributed to treatment failure, we evaluated the miltefosine susceptibility of intracellular amastigotes and promastigotes of six Leishmania (Viannia) braziliensis and six Leishmania (Viannia) panamensis strains isolated sequentially, at diagnosis and treatment failure, from two children and four adults ≥55 years old with concurrent conditions. Four patients presented only cutaneous lesions, one had mucosal disease, and one had disseminated mucocutaneous disease. Expression of the Leishmania drug transporter genes abca2, abca3, abcc2, abcc3, abcg4, abcg6, and LbMT was evaluated by quantitative reverse transcription-PCR (qRT-PCR). Intracellular amastigotes (median 50% effective concentration [EC50], 10.7 μmol/liter) were more susceptible to miltefosine than promastigotes (median EC50, 55.3 μmol/liter) (P < 0.0001). Loss of susceptibility at failure, demonstrated by a miltefosine EC50 of >32 μmol/liter (the upper limit of the intracellular amastigote assay), occurred in L. panamensis infection in a child and in L. braziliensis infection in an adult and was accompanied by decreased expression of the miltefosine transporter LbMT (LbMT/β-tubulin, 0.42- to 0.26-fold [P = 0.039] and 0.70- to 0.57-fold [P = 0.009], respectively). LbMT gene polymorphisms were not associated with susceptibility phenotype. Leishmania ABCA3 transporter expression was inversely correlated with miltefosine susceptibility (r = −0.605; P = 0.037). Loss of susceptibility is one of multiple factors involved in failure of miltefosine treatment in dermal leishmaniasis.
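
    The EC50 values reported above are typically obtained by fitting a sigmoid dose-response model; the sketch below shows one such fit with a Hill equation on hypothetical survival data (not the study's measurements).

```python
# Hedged sketch: estimating a miltefosine EC50 by fitting a Hill (sigmoid
# dose-response) curve to percent parasite survival; data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([0.5, 1, 2, 4, 8, 16, 32, 64])        # umol/liter
survival = np.array([98, 95, 88, 72, 48, 25, 11, 5])  # % of drug-free control

def hill(c, top, bottom, ec50, n):
    return bottom + (top - bottom) / (1.0 + (c / ec50) ** n)

popt, _ = curve_fit(hill, conc, survival, p0=[100.0, 0.0, 8.0, 1.0])
print(f"EC50 ~ {popt[2]:.1f} umol/liter (Hill slope {popt[3]:.2f})")
# An EC50 above the assay's upper limit (>32 umol/liter here) would be
# reported as a loss of susceptibility, as in the study.
```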

  18. Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

    1997-01-01

    A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C1 plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
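
    As a concrete illustration of a ply-level failure check, the sketch below implements one common plane-stress form of Hashin's criterion; the exact expressions in the methodology may differ, and the stresses and strength values used are illustrative.

```python
# Minimal 2-D Hashin-type check (one common plane-stress form; the report's
# exact implementation may differ). Stresses and strengths are illustrative.
def hashin_2d(s11, s22, t12, XT, XC, YT, YC, S12):
    """Return failure indices; an index >= 1.0 predicts ply failure."""
    if s11 >= 0:   # fiber tension
        fiber = (s11 / XT) ** 2 + (t12 / S12) ** 2
    else:          # fiber compression
        fiber = (s11 / XC) ** 2
    if s22 >= 0:   # matrix tension
        matrix = (s22 / YT) ** 2 + (t12 / S12) ** 2
    else:          # matrix compression (simplified quadratic form)
        matrix = (s22 / YC) ** 2 + (t12 / S12) ** 2
    return {"fiber": fiber, "matrix": matrix}

# Ply stresses (MPa) from a laminate analysis, checked against
# carbon/epoxy-like strengths; here the matrix index exceeds 1.0:
print(hashin_2d(s11=1200.0, s22=35.0, t12=40.0,
                XT=1500.0, XC=1500.0, YT=40.0, YC=246.0, S12=68.0))
```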

  19. [Intestinal failure in pediatric patients: experience and management by a multidisciplinary group].

    PubMed

    Giraldo Villa, Adriana; Martínez Volkmar, María Isabel; Valencia Quintero, Andrés Felipe; Montoya Delgado, Diana Catalina; Henao Roldan, Catherine; Ruiz Navas, Patricia; García Loboguerrero, Fanny; Contreras Ramírez, Mónica María

    2015-12-01

    Introduction: Institutions with multidisciplinary teams have shown improvements in outcomes for patients with intestinal failure; a multidisciplinary approach allows integral management and effective communication between families and care teams. Objective: To describe the multidisciplinary management and outcomes in pediatric patients with intestinal failure. Methods: Retrospective study in patients 18 years old or younger with intestinal failure who required total parenteral nutrition (TPN). Simple frequencies and percentages were used for qualitative variables, and measures of central tendency and dispersion for quantitative variables. Results: 33 patients with a median follow-up of 281 days were evaluated. The median duration of TPN was 68 days, and the mean number of catheter-related infections was 2.26 per patient. Oral or enteral nutrition was provided in 31 patients, starting in 61.3% of cases via tube and continuous infusion. As concomitant treatment, 72.7% of children received ursodeoxycholic acid, 67.7% cholestyramine, 57.6% loperamide, 48.5% antibiotics, and 36.4% probiotics. The families of 24 patients were evaluated by social work professionals. Intestinal autonomy was achieved in 69.7% of cases; 72.7% of these patients showed an improvement in weight z-score, and final serum albumin was significantly higher than the initial value (p = 0.012). Conclusions: The management of patients with intestinal failure is a challenge for health institutions and requires care based on a standardized protocol and a multidisciplinary group. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  20. Numerical simulation of damage and progressive failures in composite laminates using the layerwise plate theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, Y.S.

    1992-01-01

    The failure behavior of composite laminates is modeled numerically using the Generalized Layerwise Plate Theory (GLPT) of Reddy and a progressive failure algorithm. The Layerwise Theory of Reddy assumes a piecewise continuous displacement field through the thickness of the laminate and therefore has the ability to capture the interlaminar stress fields near the free edges and cutouts more accurately. The progressive failure algorithm is based on the assumption that the material behaves like a stable progressively fracturing solid. A three-dimensional stiffness reduction scheme is developed and implemented to study progressive failures in composite laminates. The effect of various parameters such as out-of-plane material properties, boundary conditions, and stiffness reduction methods on the failure stresses and strains of a quasi-isotropic composite laminate with free edges subjected to tensile loading is studied. The ultimate stresses and strains predicted by the Generalized Layerwise Plate Theory (GLPT) and the more widely used First Order Shear Deformation Theory (FSDT) are compared with experimental results. The predictions of the GLPT are found to be in good agreement with the experimental results both qualitatively and quantitatively, while the predictions of FSDT are found to be different from experimental results both qualitatively and quantitatively. The predictive ability of various phenomenological failure criteria is evaluated with reference to the experimental results available in the literature. The effect of geometry of the test specimen and the displacement boundary conditions at the grips on the ultimate stresses and strains of a composite laminate under compressive loading is studied. The ultimate stresses and strains are found to be quite sensitive to the geometry of the test specimen and the displacement boundary conditions at the grips. The degree of sensitivity is observed to depend strongly on the lamination sequence.

  1. Common Cause Failure Modeling in Space Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hark, Frank; Ring, Rob; Novack, Steven D.; Britton, Paul

    2015-01-01

    Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are dependent failures that can be caused, for example, by system environments, manufacturing, transportation, storage, maintenance, and assembly. Because many factors contribute to CCFs, they can be reduced but are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent failures and dependent CCFs. Because common cause failure data are limited in the aerospace industry, the Probabilistic Risk Assessment (PRA) Team at Bastion Technology Inc. is estimating CCF risk using generic data collected by the Nuclear Regulatory Commission (NRC). Consequently, common cause risk estimates based on this database, when applied to other industry applications, are highly uncertain. Therefore, it is important to account for a range of values for independent and CCF risk and to communicate the uncertainty to decision makers. There is an existing methodology for reducing CCF risk during design, which includes a checklist of 40+ factors grouped into eight categories. Using this checklist, an approach is being investigated to produce a beta factor estimate that quantitatively relates these factors. In this paper, the checklist is tailored to space launch vehicles, a quantitative approach is described, and an example of the method is presented.
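
    A minimal sketch of how such a checklist might be mapped to a beta factor is given below; the categories, weights, bounds, and failure rate are all assumptions for illustration, not values from the methodology.

```python
# Hedged sketch of a checklist-driven beta-factor estimate: defense scores
# (0 = weak, 1 = strong) in each category are weighted and mapped linearly
# onto an assumed beta range. Categories, weights, and bounds are illustrative.
category_scores = {"separation": 0.8, "similarity": 0.4, "environment": 0.6,
                   "analysis": 0.7, "procedures": 0.5, "training": 0.9,
                   "control": 0.6, "tests": 0.7}
weights = {k: 1.0 / len(category_scores) for k in category_scores}  # equal weights

BETA_MAX, BETA_MIN = 0.20, 0.001        # assumed bounds on the beta factor
defense = sum(weights[k] * category_scores[k] for k in category_scores)
beta = BETA_MAX - defense * (BETA_MAX - BETA_MIN)   # stronger defenses -> lower beta

lam = 1.0e-4                            # independent failure rate per mission (assumed)
print(f"beta = {beta:.3f}; CCF rate = beta * lambda = {beta * lam:.2e} per mission")
```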

  2. Clinical Evaluation of an Affordable Qualitative Viral Failure Assay for HIV Using Dried Blood Spots in Uganda.

    PubMed

    Balinda, Sheila N; Ondoa, Pascale; Obuku, Ekwaro A; Kliphuis, Aletta; Egau, Isaac; Bronze, Michelle; Kasambula, Lordwin; Schuurman, Rob; Spieker, Nicole; Rinke de Wit, Tobias F; Kityo, Cissy

    2016-01-01

    WHO recommends regular viral load (VL) monitoring of patients on antiretroviral therapy (ART) for timely detection of virological failure, prevention of acquired HIV drug resistance (HIVDR), and avoidance of unnecessary switching to second-line ART. However, the cost and complexity of routine VL testing remain prohibitive in most resource-limited settings (RLS). We evaluated a simple, low-cost, qualitative viral-failure assay (VFA) on dried blood spots (DBS) in three clinical settings in Uganda. We conducted a cross-sectional diagnostic accuracy study in three HIV/AIDS treatment centres at the Joint Clinical Research Centre in Uganda. The VFA employs semi-quantitative detection of HIV-1 RNA amplified from the LTR gene. We used paired DBS and plasma samples, with the COBAS AmpliPrep/COBAS TaqMan, Roche version 2 (VLref) as the reference assay. We used the VFA at two viral load thresholds (>5,000 or >1,000 copies/ml). 496 paired VFA and VLref results were available for comparative analysis. Overall, the VFA demonstrated 78.4% sensitivity (95% CI: 69.7%-87.1%), 93% specificity (95% CI: 89.7%-96.4%), 89.3% accuracy (95% CI: 85%-92%), and an agreement kappa of 0.72 as compared with the VLref. The predictive values of positivity and negativity among patients on ART for >12 months were 72.7% and 99.3%, respectively. The VFA correctly classified 89% of virological failure cases. Only 11% of the patients were misclassified, with the potential of an unnecessary or late switch to second-line ART. Our findings present an opportunity to roll out simple and affordable VL monitoring for HIV-1 treatment in RLS.
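
    For reference, the headline metrics can be recomputed from a 2x2 contingency table as below; the counts are a hypothetical split of the 496 paired results, not the study's exact table.

```python
# Recomputing standard diagnostic-accuracy metrics from a 2x2 table
# (counts below are illustrative, not the study's exact table).
def diag_metrics(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / n
    ppv, npv = tp / (tp + fp), tn / (tn + fn)
    p_obs = acc
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_obs - p_exp) / (1 - p_exp)   # chance-corrected agreement
    return dict(sensitivity=sens, specificity=spec, accuracy=acc,
                PPV=ppv, NPV=npv, kappa=kappa)

print(diag_metrics(tp=91, fp=26, fn=25, tn=354))   # hypothetical 496-sample split
```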

  3. Quantitative and Qualitative Geospatial Analysis of a Probable Catastrophic Dam Failure

    NASA Astrophysics Data System (ADS)

    Oduor, P. G.; Stenehjem, J.

    2011-12-01

    Geospatial techniques were used to assess the inundation extents that would occur in the event of a catastrophic failure of Fort Peck dam. Fort Peck dam, located in Montana, USA, has a spillway design under which, in the event of dam failure, the flood crest is expected to reach Williston, a major economic hub in North Dakota, in 1.4 days, with a peak elevation of 1891 ft (576.377 m) msl (mean sea level). In this study, we address flooding extents and impacts on establishments with respect to a peak elevation of 1891 ft. From this study, we can unequivocally state that the City of Williston would be significantly impacted if Fort Peck dam failed, with almost all critical facilities, for example, gasoline stations, emergency facilities, and grocery stores, completely inundated. A secondary catastrophic event may be tied to the primary economic activity in Williston, that is, oil rigs, most of which lie in the path of an inadvertent flood crest. We also applied Discrete Fourier Transform (DFT) and Lomb-Scargle normalized periodogram analyses and fitting to Fort Peck dam reservoir level fluctuations to gauge (a) the likelihood of the dam overtopping and (b) its anticipated life span. Although the dam could be considered stable by direct comparison with other dams that have failed, we found a low likelihood of failure within a range of 99-232 years from construction. Overtopping rates did not coincide with dam failure rates.
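
    A Lomb-Scargle periodogram of (possibly unevenly sampled) reservoir levels can be computed as sketched below; the series here is synthetic with an assumed annual cycle, not the Fort Peck record.

```python
# Sketch of the periodicity analysis on unevenly sampled reservoir levels
# using scipy's Lomb-Scargle periodogram; the series here is synthetic.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 40.0, 300))   # years, uneven sampling
level = (2249.0 + 3.0 * np.sin(2 * np.pi * t / 1.0)
         + rng.normal(0.0, 1.0, t.size))   # ft msl, assumed annual cycle + noise

periods = np.linspace(0.2, 10.0, 2000)     # candidate periods, years
ang_freqs = 2 * np.pi / periods            # lombscargle expects angular frequencies
power = lombscargle(t, level - level.mean(), ang_freqs)

print(f"dominant period ~ {periods[power.argmax()]:.2f} years")  # expect ~1.0
```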

  4. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution has taken place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but in parallel precise and quantitative measurement methods were developed, allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are being integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint, whereas a whole array of precise, quantitative, and objective modalities is available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  5. Development and evaluation of a composite risk score to predict kidney transplant failure.

    PubMed

    Moore, Jason; He, Xiang; Shabir, Shazia; Hanvesakul, Rajesh; Benavente, David; Cockwell, Paul; Little, Mark A; Ball, Simon; Inston, Nicholas; Johnston, Atholl; Borrows, Richard

    2011-05-01

    Although risk factors for kidney transplant failure are well described, prognostic risk scores to estimate risk in prevalent transplant recipients are limited. Development and validation of risk-prediction instruments. The development data set included 2,763 prevalent patients more than 12 months posttransplant enrolled into the LOTESS (Long Term Efficacy and Safety Surveillance) Study. The validation data set included 731 patients who underwent transplant at a single UK center. Estimated glomerular filtration rate (eGFR) and other risk factors were evaluated using Cox regression. Scores for death-censored and overall transplant failure were based on the summed hazard ratios for baseline predictor variables. Predictive performance was assessed using calibration (Hosmer-Lemeshow statistic), discrimination (C statistic), and clinical reclassification (net reclassification improvement) compared with eGFR alone. In the development data set, 196 patients died and another 225 experienced transplant failure. eGFR, recipient age, race, serum urea and albumin levels, declining eGFR, and prior acute rejection predicted death-censored transplant failure. eGFR, recipient age, sex, serum urea and albumin levels, and declining eGFR predicted overall transplant failure. In the validation data set, 44 patients died and another 101 experienced transplant failure. The weighted scores comprising these variables showed adequate discrimination and calibration for death-censored (C statistic, 0.83; 95% CI, 0.75-0.91; Hosmer-Lemeshow χ² P = 0.8) and overall (C statistic, 0.70; 95% CI, 0.64-0.77; Hosmer-Lemeshow χ² P = 0.5) transplant failure. However, the scores failed to reclassify risk compared with eGFR alone (net reclassification improvements of 7.6% [95% CI, -0.2 to 13.4; P = 0.09] and 4.3% [95% CI, -2.7 to 11.8; P = 0.3] for death-censored and overall transplant failure, respectively). Retrospective analysis of predominantly cyclosporine-treated patients; limited study size and
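
    The construction of such a score and its discrimination statistic can be sketched as below; the hazard ratios and patient records are illustrative, not the LOTESS-derived coefficients, and Harrell's concordance is computed over usable pairs.

```python
# Sketch of a summed-hazard-ratio risk score and its discrimination (C statistic).
# Hazard ratios and patient data are illustrative, not the study's coefficients.
import numpy as np

hr = {"egfr_lt_30": 3.0, "age_gt_60": 1.6, "prior_rejection": 1.8,
      "low_albumin": 1.9, "declining_egfr": 2.4}

def risk_score(patient):              # sum the HRs of the risk factors present
    return sum(hr[k] for k, present in patient.items() if present and k in hr)

def c_statistic(scores, event_time, event):   # Harrell's C over usable pairs
    num = den = 0.0
    n = len(scores)
    for i in range(n):
        for j in range(n):
            if event[i] and event_time[i] < event_time[j]:   # i failed first
                den += 1
                if scores[i] > scores[j]:
                    num += 1.0
                elif scores[i] == scores[j]:
                    num += 0.5
    return num / den

scores = np.array([7.1, 1.0, 1.6, 5.4, 0.0])       # summed HRs per patient
event_time = np.array([1.2, 4.0, 6.5, 2.1, 7.0])   # years to failure/censoring
event = np.array([1, 1, 0, 1, 0])                  # 1 = transplant failure
print(f"C statistic = {c_statistic(scores, event_time, event):.2f}")
```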

  6. Quantitative Skills as a Graduate Learning Outcome: Exploring Students' Evaluative Expertise

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2017-01-01

    In the biosciences, quantitative skills are an essential graduate learning outcome. Efforts to evidence student attainment at the whole of degree programme level are rare and making sense of such data is complex. We draw on assessment theories from Sadler (evaluative expertise) and Boud (sustainable assessment) to interpret final-year bioscience…

  7. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    USDA-ARS?s Scientific Manuscript database

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  8. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  9. Quantitative nondestructive evaluation of ceramic matrix composite by the resonance method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watanabe, T.; Aizawa, T.; Kihara, J.

    The resonance method was developed to make quantitative nondestructive evaluations of mechanical properties without laborious procedures. Since the present method is insensitive to specimen geometry, both monolithic and ceramic matrix composite materials in process can be evaluated in a nondestructive manner. Al₂O₃, Si₃N₄, SiC/Si₃N₄, and various C/C composite materials are employed to demonstrate the validity and effectiveness of the present method.

  10. Simulating Initial and Progressive Failure of Open-Hole Composite Laminates under Tension

    NASA Astrophysics Data System (ADS)

    Guo, Zhangxin; Zhu, Hao; Li, Yongcun; Han, Xiaoping; Wang, Zhihua

    2016-12-01

    A finite element (FE) model is developed for the progressive failure analysis of fiber reinforced polymer laminates. The failure criterion for fiber and matrix failure is implemented in the FE code Abaqus using the user-defined material subroutine UMAT. The gradual degradation of the material properties is controlled by the individual fracture energies of fiber and matrix. The failure and damage in composite laminates containing a central hole subjected to uniaxial tension are simulated. The numerical results show that the damage model can be used to accurately predict the progressive failure behaviour both qualitatively and quantitatively.
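
    One common way to drive such fracture-energy-controlled degradation (a sketch of the general technique, not necessarily the paper's exact UMAT) is a linear-softening damage law regularized by a characteristic element length:

```python
# Fracture-energy-regularized damage sketch: linear softening after onset,
# with the failure strain set by the fracture energy G_f, the onset stress
# sigma_0, and the element characteristic length L_e. Numbers are illustrative.
def damage(eps, eps0, sigma0, Gf, Le):
    """Scalar damage d in [0, 1] for equivalent strain eps."""
    eps_f = 2.0 * Gf / (sigma0 * Le)   # strain at complete failure
    if eps <= eps0:
        return 0.0
    if eps >= eps_f:
        return 1.0
    return (eps_f / eps) * (eps - eps0) / (eps_f - eps0)

# Degraded fiber-direction response (illustrative carbon/epoxy-like numbers):
E, eps0, sigma0, Gf, Le = 140e3, 0.015, 2100.0, 80.0, 1.0  # MPa, -, MPa, N/mm, mm
for eps in (0.010, 0.020, 0.040):
    d = damage(eps, eps0, sigma0, Gf, Le)
    print(f"eps={eps:.3f}  d={d:.2f}  stress={(1 - d) * E * eps:7.1f} MPa")
```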

  11. Studies and analyses of the Space Shuttle Main Engine: SSME failure data review, diagnostic survey and SSME diagnostic evaluation

    NASA Technical Reports Server (NTRS)

    Glover, R. C.; Kelley, B. A.; Tischer, A. E.

    1986-01-01

    The results of a review of the Space Shuttle Main Engine (SSME) failure data for the period 1980 through 1983 are presented. The data was collected, evaluated, and ranked according to procedures established during this study. A number of conclusions and recommendations are made based upon this failure data review. The results of a state-of-the-art diagnostic survey are also presented. This survey covered a broad range of diagnostic sensors and techniques and the findings were evaluated for application to the SSME. Finally, a discussion of the initial activities for the on-going SSME diagnostic evaluation is included.

  12. Causes of catastrophic failure in complex systems

    NASA Astrophysics Data System (ADS)

    Thomas, David A.

    2010-08-01

    Root causes of mission critical failures and major cost and schedule overruns in complex systems and programs are studied through the post-mortem analyses compiled for several examples, including the Hubble Space Telescope, the Challenger and Columbia Shuttle accidents, and the Three Mile Island nuclear power plant accident. The roles of organizational complexity, cognitive biases in decision making, the display of quantitative data, and cost and schedule pressure are all considered. Recommendations for mitigating the risk of similar failures in future programs are also provided.

  13. First metatarsophalangeal joint arthrodesis: an evaluation of hardware failure.

    PubMed

    Bennett, Gordon L; Kay, David B; Sabatta, James

    2005-08-01

    First metatarsophalangeal joint (MTPJ) arthrodesis is commonly used for the treatment of a variety of conditions affecting the hallux. We used a method incorporating a ball-and-cup preparation of the first metatarsal and proximal phalanx, followed by fixation of the arthrodesis with a lag screw and a dorsal plate (Synthes Modular Hand Set). Ninety-five consecutive patients had first MTPJ arthrodesis using fixation with the Synthes Modular Hand Set. All patients were evaluated preoperatively, at regular intervals postoperatively, and at final followup. The American Orthopaedic Foot and Ankle Society (AOFAS) forefoot scoring system was used preoperatively and at final followup. Solid fusion occurred in 93 of 107 feet (86.9%). In the 14 that did not fuse, either the screws or plate, or both, broke. Ten of the 14 feet were symptomatic, but only three required further operative treatment. There were no hardware problems or failures in patients who had solid fusions. Preoperative AOFAS scores were improved after surgery in all patients. A solid first MTPJ fusion results in excellent function and pain relief, but the Synthes Modular Hand Set implants do not appear to be strong enough in all patients for this application; nonunion at the arthrodesis site and failure of hardware occurred in 13% of arthrodeses. We no longer recommend this implant for this application.

  14. Probabilistic failure assessment with application to solid rocket motors

    NASA Technical Reports Server (NTRS)

    Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.

    1990-01-01

    A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.

  15. Study on Failure of Third-Party Damage for Urban Gas Pipeline Based on Fuzzy Comprehensive Evaluation.

    PubMed

    Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong

    2016-01-01

    Focusing on the diversity, complexity, and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated based on the theory of analytic hierarchy process and fuzzy mathematics. The fault tree of third-party damage, containing 56 basic events, was built by hazard identification of third-party damage. Fuzzy evaluations of the basic event probabilities were conducted by the expert judgment method using membership functions of fuzzy sets. The determination of the weight of each expert and the modification of the evaluation opinions were accomplished using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a certain large provincial capital city as an example, the risk assessment results of the method were shown to conform to the actual situation, providing a basis for safety risk prevention.
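
    A minimal sketch of the expert-judgment and aggregation steps is shown below, assuming triangular fuzzy numbers, centroid defuzzification, and an OR gate over independent basic events; the event names, weights, and all numbers are illustrative.

```python
# Hedged sketch: each expert gives a triangular fuzzy probability for a basic
# event, opinions are aggregated with AHP-style expert weights, and the
# centroid defuzzifies to a crisp probability. The fault tree then combines
# basic events (here a simple OR gate). All values are illustrative.
import numpy as np

def aggregate_triangular(opinions, weights):
    """Weighted mean of triangular fuzzy numbers (l, m, u)."""
    w = np.asarray(weights) / np.sum(weights)
    return tuple(w @ np.asarray(opinions)[:, i] for i in range(3))

def centroid(tfn):                     # defuzzify a triangular fuzzy number
    l, m, u = tfn
    return (l + m + u) / 3.0

# Three experts judge one basic event ("excavation near pipeline", hypothetical):
opinions = [(1e-4, 5e-4, 1e-3), (2e-4, 6e-4, 2e-3), (5e-5, 3e-4, 8e-4)]
weights = [0.5, 0.3, 0.2]              # from the (improved) AHP expert comparison
p_event = centroid(aggregate_triangular(opinions, weights))

# OR gate over independent basic events: P = 1 - prod(1 - p_i)
basic_probs = [p_event, 2.1e-4, 7.5e-5]
p_top = 1.0 - np.prod([1.0 - p for p in basic_probs])
print(f"basic event p = {p_event:.2e}; top event p = {p_top:.2e}")
```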

  16. Qualitative and quantitative evaluation of avian demineralized bone matrix in heterotopic beds.

    PubMed

    Reza Sanaei, M; Abu, Jalila; Nazari, Mojgan; A B, Mohd Zuki; Allaudin, Zeenathul N

    2013-11-01

    To evaluate the osteogenic potential of avian demineralized bone matrix (DBM) in the context of implant geometry. Experimental. Rock pigeons (n = 24). Tubular and chipped forms of DBM were prepared by acid demineralization of long bones from healthy allogeneic donors and implanted bilaterally into the pectoral region of 24 pigeons. After euthanasia at 1, 4, 6, 8, 10, and 12 weeks, explants were evaluated histologically and compared by means of quantitative (bone area) and semi-quantitative (scores) measures. All explants had new bone at retrieval, with the exception of tubular implants at the end of week 1. The most reactive part in both implants was the interior region between the periosteal and endosteal surfaces, followed by the area at the implant-muscle interface. Quantitative measurements demonstrated a significantly (P = .012) greater percentage of new bone formation induced by tubular implants (80.28 ± 8.94) compared with chip implants (57.64 ± 3.12). There was minimal inflammation. Avian DBM initiates heterotopic bone formation in allogeneic recipients with low grades of immunogenicity. Implant geometry affects this phenomenon, as osteoconduction appeared to augment the magnitude of the effects in larger tubular implants. © Copyright 2013 by The American College of Veterinary Surgeons.

  17. Failure Analysis and Magnetic Evaluation of Tertiary Superheater Tube Used in Gas-Fired Boiler

    NASA Astrophysics Data System (ADS)

    Mohapatra, J. N.; Patil, Sujay; Sah, Rameshwar; Krishna, P. C.; Eswarappa, B.

    2018-02-01

    Failure analysis was carried out on a prematurely failed tertiary superheater tube used in a gas-fired boiler. The analysis included a comparative study of visual examination, chemical composition, hardness, and microstructure at the failed region, adjacent to and far from the failure, as well as on a fresh tube. The chemistry was found to match the standard specification, whereas the hardness of the failed tube was lower than that at the fish-mouth opening region and in the fresh tube. Microscopic examination of the failed sample revealed the presence of spheroidal carbides of Cr and Mo, predominantly along the grain boundaries. The primary cause of failure was found to be localized heating. Magnetic hysteresis loop (MHL) measurements were carried out to correlate the magnetic parameters with microstructure and mechanical properties, to establish a possible non-destructive evaluation (NDE) method for health monitoring of the tubes. The coercivity of the MHL showed a very good correlation with the deterioration of microstructure and mechanical properties, enabling a possible NDE technique for health monitoring of the tubes.

  18. An anthropomorphic phantom for quantitative evaluation of breast MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo

    2011-02-01

    In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of breast MRI protocols.

  19. Quantitative Decision Support Requires Quantitative User Guidance

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to the output of current state-of-the-art general circulation models. Second, a critical evaluation is given of the language often employed in communicating climate model output, a language which accurately states that models are “better”, have “improved”, and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach is outlined for evaluating the relevance of quantitative climate model output.

  20. Evaluation of failure criterion for graphite/epoxy fabric laminates

    NASA Technical Reports Server (NTRS)

    Tennyson, R. C.; Wharram, G. E.

    1985-01-01

    The development and application of the tensor polynomial failure criterion for composite laminate analysis is described. Emphasis is given to the fabrication and testing of Narmco Rigidite 5208-WT300, a plain weave fabric of Thornel 300 Graphite fibers impregnated with Narmco 5208 Resin. The quadratic failure criterion with F12 = 0 provides accurate estimates of failure stresses for the graphite/epoxy investigated. The cubic failure criterion was recast into an operationally easier form, providing design curves that can be applied to laminates fabricated from orthotropic woven fabric prepregs. In the form presented, no interaction strength tests are required, although recourse to the quadratic model and the principal strength parameters is necessary. However, insufficient test data exist at present to generalize this approach for all prepreg constructions, and its use must be restricted to the generic materials and configurations investigated to date.
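
    For concreteness, the quadratic (Tsai-Wu-type) tensor polynomial criterion with F12 = 0 can be evaluated as below; the stresses and strength values are illustrative, not the measured Rigidite 5208-WT300 properties.

```python
# Quadratic tensor polynomial (Tsai-Wu-type) failure check in plane stress,
# with the interaction term F12 set to zero as in the abstract.
# Strength values are illustrative, not the tested material's data.
def tensor_poly_f12_zero(s1, s2, t12, XT, XC, YT, YC, S):
    F1, F2 = 1 / XT - 1 / XC, 1 / YT - 1 / YC
    F11, F22, F66 = 1 / (XT * XC), 1 / (YT * YC), 1 / S ** 2
    # Criterion value; >= 1 predicts failure.
    return F1 * s1 + F2 * s2 + F11 * s1 ** 2 + F22 * s2 ** 2 + F66 * t12 ** 2

print(tensor_poly_f12_zero(s1=400.0, s2=20.0, t12=30.0,
                           XT=600.0, XC=570.0, YT=40.0, YC=140.0, S=70.0))
```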

  1. Analytical Method to Evaluate Failure Potential During High-Risk Component Development

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Communicating failure mode information during design and manufacturing is a crucial task for failure prevention. Most processes use Failure Modes and Effects types of analyses, as well as prior knowledge and experience, to determine the potential modes of failure a product might encounter during its lifetime. When new products are being considered and designed, this knowledge and information are extended to help designers extrapolate from similar existing products and weigh potential design tradeoffs. This paper makes use of the similarities and tradeoffs that exist between different failure modes based on the functionality of each component/product. In this light, a function-failure method is developed to help the design of new products with solutions for functions that eliminate or reduce the potential of a failure mode. The method is applied to a simplified rotating machinery example in this paper, and is proposed as a means to account for helicopter failure modes during design and production, addressing stringent safety and performance requirements for NASA applications.

  2. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harry, T; University of California, San Diego, La Jolla, CA; Manger, R

    Purpose: To compare the Veterans Affairs Healthcare Failure Mode and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Mode and Effects Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: HFMEA risk assessment analysis was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA as described by the Veterans Affairs provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth details, but at the cost of elevated effort.

  3. Application of Quality Management Tools for Evaluating the Failure Frequency of Cutter-Loader and Plough Mining Systems

    NASA Astrophysics Data System (ADS)

    Biały, Witold

    2017-06-01

    Failure frequency in the mining process, with a focus on the mining machine, is presented and illustrated by the example of two coal mines. Two mining systems were subjected to analysis: a cutter-loader and a plough system. In order to reduce the costs generated by failures, maintenance teams should regularly make sure that the machines are used and operated in a rational and effective way. Such activities will allow downtimes to be reduced and, in consequence, will increase the effectiveness of a mining plant. The evaluation of mining machines' failure frequency in this study is based on one of the traditional quality management tools, the Pareto chart.
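
    A Pareto analysis of failure causes reduces to sorting counts and accumulating percentages, as sketched below with illustrative data:

```python
# Pareto analysis of failure causes for a mining machine: sort causes by count
# and flag the "vital few" covering roughly 80% of events. Data illustrative.
from collections import Counter

failures = Counter({"haulage chain": 41, "cutting drum": 27, "hydraulics": 18,
                    "electrical": 9, "controls": 4, "other": 3})

total = sum(failures.values())
cum = 0
for cause, count in failures.most_common():
    cum += count
    marker = " *" if (cum - count) / total < 0.80 else ""
    print(f"{cause:14s} {count:3d}  cumulative {100 * cum / total:5.1f}%{marker}")
# Causes marked * are the Pareto "vital few" that maintenance should target first.
```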

  4. Micro-RNA-122 levels in acute liver failure and chronic hepatitis C.

    PubMed

    Dubin, Perry H; Yuan, Hejun; Devine, Robert K; Hynan, Linda S; Jain, Mamta K; Lee, William M

    2014-09-01

    MicroRNA-122 (miR-122) is the foremost liver-related micro-RNA, but its role in the hepatocyte is not fully understood. To evaluate whether circulating levels of miR-122 are elevated in chronic-HCV for a reason other than hepatic injury, we compared serum level in patients with chronic hepatitis C to other forms of liver injury including patients with acute liver failure and healthy controls. MiR-122 was quantitated using sera from 35 acute liver failure patients (20 acetaminophen-induced, 15 other etiologies), 39 chronic-HCV patients and 12 controls. In parallel, human genomic DNA (hgDNA) levels were measured to reflect quantitatively the extent of hepatic necrosis. Additionally, six HIV-HCV co-infected patients, who achieved viral clearance after undergoing therapy with interferon and ribavirin, had serial sera miR-122 and hgDNA levels measured before and throughout treatment. Serum miR-122 levels were elevated approximately 100-fold in both acute liver failure and chronic-HCV sera as compared to controls (P < 0.001), whereas hgDNA levels were only elevated in acute liver failure patients as compared to both chronic-HCV and controls (P < 0.001). Subgroup analysis showed that chronic-HCV sera with normal aminotransferase levels showed elevated miR-122 despite low levels of hepatocyte necrosis. All successfully treated HCV patients showed a significant Log10 decrease in miR-122 levels ranging from 0.16 to 1.46, after sustained viral response. Chronic-HCV patients have very elevated serum miR-122 levels in the range of most patients with severe hepatic injury leading to acute liver failure. Eradication of HCV was associated with decreased miR-122 but not hgDNA. An additional mechanism besides hepatic injury may be active in chronic-HCV to explain the exaggerated circulating levels of miR-122 observed. © 2014 Wiley Periodicals, Inc.

  5. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  6. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had actually turned away from nuclear energy are reconsidering the advisability of this decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  7. Hypercalcemia with renal failure.

    PubMed

    Bhavani, Nisha; Praveen, Valiyaparambil Pavithran; Jayakumar, Rohinivilasam Vasukutty; Nair, Vasantha; Muraleedharan, Mangath; Kuma, Harish; Unnikrishnan, Ambika Gopalakrishnan; Menon, Vadayath Usha

    2012-06-01

    We report a case of nephrocalcinosis with renal failure in which evaluation revealed hypercalcemia. Further investigations showed an inappropriately normal intact parathormone (iPTH) and 1,25-dihydroxyvitamin D level in the setting of renal failure. Probing for a cause of non-PTH-mediated hypercalcemia led to the diagnosis of sarcoidosis. Treatment with glucocorticoids partially reversed the renal failure and controlled the hypercalcemia. This case illustrates the importance of careful interpretation of laboratory parameters, especially levels of iPTH and vitamin D metabolites, in renal failure.

  8. Preoperative short hookwire placement for small pulmonary lesions: evaluation of technical success and risk factors for initial placement failure.

    PubMed

    Iguchi, Toshihiro; Hiraki, Takao; Matsui, Yusuke; Fujiwara, Hiroyasu; Masaoka, Yoshihisa; Tanaka, Takashi; Sato, Takuya; Gobara, Hideo; Toyooka, Shinichi; Kanazawa, Susumu

    2018-05-01

    To retrospectively evaluate the technical success of computed tomography fluoroscopy-guided short hookwire placement before video-assisted thoracoscopic surgery and to identify the risk factors for initial placement failure. In total, 401 short hookwire placements for 401 lesions (mean diameter 9.3 mm) were reviewed. Technical success was defined as correct positioning of the hookwire. Possible risk factors for initial placement failure (i.e., the need to place an additional hookwire or to abort the attempt) were evaluated using logistic regression analysis for all procedures, and separately for procedures performed via the conventional route. Of the 401 initial placements, 383 were successful and 18 failed. Short hookwires were finally placed for 399 of 401 lesions (99.5%). Univariate logistic regression analyses revealed that across all 401 procedures only the transfissural approach was a significant independent predictor of initial placement failure (odds ratio, OR, 15.326; 95% confidence interval, CI, 5.429-43.267; p < 0.001), and for the 374 procedures performed via the conventional route only lesion size was a significant independent predictor of failure (OR 0.793, 95% CI 0.631-0.996; p = 0.046). The technical success rate of preoperative short hookwire placement was extremely high. The transfissural approach was a predictor of initial placement failure for all procedures, and small lesion size was a predictor of initial placement failure for procedures performed via the conventional route. • Technical success of preoperative short hookwire placement was extremely high. • The transfissural approach was a significant independent predictor of initial placement failure for all procedures. • Small lesion size was a significant independent predictor of initial placement failure for procedures performed via the conventional route.

  9. Evaluation of strength and failure of brittle rock containing initial cracks under lithospheric conditions

    NASA Astrophysics Data System (ADS)

    Li, Xiaozhao; Qi, Chengzhi; Shao, Zhushan; Ma, Chao

    2018-02-01

    Natural brittle rock contains numerous randomly distributed microcracks. Crack initiation, growth, and coalescence play a predominant role in evaluating the strength and failure of brittle rocks. A new analytical method is proposed to predict the strength and failure of brittle rocks containing initial microcracks. The formulation of this method is based on an improved wing crack model and a suggested micro-macro relation. In this improved wing crack model, the crack angle is introduced as a variable, and an analytical stress-crack relation accounting for the crack angle effect is obtained. Coupling the proposed stress-crack relation with the suggested micro-macro relation describing the relation between crack growth and axial strain, a stress-strain constitutive relation is obtained to predict rock strength and failure. Considering different initial microcrack sizes, friction coefficients, and confining pressures, the effects of crack angle on the tensile wedge force acting on the initial crack interface are studied, and the effects of crack angle on the stress-strain constitutive relation of rocks are also analyzed. The strength and crack initiation stress under different crack angles are discussed, and the most disadvantageous angle, the one that first triggers crack initiation and rock failure, is found. The analytical results are similar to published results, verifying the rationality of the proposed analytical method.

  10. Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.

    PubMed

    Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R

    2014-08-01

    To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive method for assessing ventilation- and perfusion-related information in the lungs, where the perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. Values from 10 healthy volunteers are compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
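
    A simplified version of the quantification step might look like the sketch below, which normalizes parenchymal FD amplitudes by a fully blood-filled aortic region and scales by heart rate; the arrays, the ROI value, and the exact scaling are assumptions for illustration, not the paper's implementation.

```python
# Sketch of the quantification idea: normalize the perfusion-weighted FD
# amplitude of each lung pixel by an aortic ROI, then convert the resulting
# blood fraction to flow using the cardiac frequency. Values are illustrative.
import numpy as np

A_lung = np.array([[0.010, 0.013],
                   [0.011, 0.009]])     # FD perfusion amplitudes (a.u.)
A_aorta = 0.50                          # amplitude in a fully blood-filled aortic ROI
heart_rate = 70.0                       # beats/min, from the image time series

blood_fraction = A_lung / A_aorta       # fraction of pixel volume refilled per beat
pbf = blood_fraction * heart_rate * 100.0   # mL/min per 100 mL of lung tissue
print(np.round(pbf, 1))                 # values on the order of 150 mL/min/100 mL
```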

  11. An Abrupt Transition to an Intergranular Failure Mode in the Near-Threshold Fatigue Crack Growth Regime in Ni-Based Superalloys

    NASA Astrophysics Data System (ADS)

    Telesman, J.; Smith, T. M.; Gabb, T. P.; Ring, A. J.

    2018-06-01

    Cyclic near-threshold fatigue crack growth (FCG) behavior of two disk superalloys was evaluated and was shown to exhibit an unexpected sudden failure mode transition from a mostly transgranular failure mode at higher stress intensity factor ranges to an almost completely intergranular failure mode in the threshold regime. The change in failure modes was associated with a crossover of FCG resistance curves in which the conditions that produced higher FCG rates in the Paris regime resulted in lower FCG rates and increased ΔKth values in the threshold region. High-resolution scanning and transmission electron microscopy were used to carefully characterize the crack tips at these near-threshold conditions. Formation of stable Al-oxide followed by Cr-oxide and Ti-oxides was found to occur at the crack tip prior to formation of unstable oxides. To contrast with the threshold failure mode regime, a quantitative assessment of the role that the intergranular failure mode has on cyclic FCG behavior in the Paris regime was also performed. It was demonstrated that even a very limited intergranular failure content dominates the FCG response under mixed mode failure conditions.

  12. Fluorescent proteins for quantitative microscopy: important properties and practical evaluation.

    PubMed

    Shaner, Nathan Christopher

    2014-01-01

    More than 20 years after their discovery, fluorescent proteins (FPs) continue to be the subject of massive engineering efforts yielding continued improvements. Among these efforts are many aspects that should be of great interest to quantitative imaging users. With new variants frequently introduced into the research community, "tried and true" FPs that have been relied on for many years may now be due for upgrades to more modern variants. However, the dizzying array of FPs now available can make the initial act of narrowing down the potential choices an intimidating prospect. This chapter describes the FP properties that most strongly impact their performance in quantitative imaging experiments, along with their physical origins as they are currently understood. A workflow for evaluating a given FP in the researcher's chosen experimental system (e.g., a specific cell line) is described. © 2014 Elsevier Inc. All rights reserved.

  13. Top-Down Quantitative Proteomics Identified Phosphorylation of Cardiac Troponin I as a Candidate Biomarker for Chronic Heart Failure

    PubMed Central

    Zhang, Jiang; Guy, Moltu J.; Norman, Holly S.; Chen, Yi-Chen; Xu, Qingge; Dong, Xintong; Guner, Huseyin; Wang, Sijian; Kohmoto, Takushi; Young, Ken H.; Moss, Richard L.; Ge, Ying

    2011-01-01

    The rapid increase in the prevalence of chronic heart failure (CHF) worldwide underscores an urgent need to identify biomarkers for the early detection of CHF. Post-translational modifications (PTMs) are associated with many critical signaling events during disease progression and thus offer a plethora of candidate biomarkers. We have employed top-down quantitative proteomics methodology for comprehensive assessment of PTMs in whole proteins extracted from normal and diseased tissues. We have systematically analyzed thirty-six clinical human heart tissue samples and identified phosphorylation of cardiac troponin I (cTnI) as a candidate biomarker for CHF. The relative percentages of the total phosphorylated cTnI forms over the entire cTnI populations (%Ptotal) were 56.4±3.5%, 36.9±1.6%, 6.1±2.4%, and 1.0±0.6% for postmortem hearts with normal cardiac function (n=7), early-stage of mild hypertrophy (n=5), severe hypertrophy/dilation (n=4), and end-stage CHF (n=6), respectively. In fresh transplant samples, the %Ptotal of cTnI from non-failing donor (n=4), and end-stage failing hearts (n=10) were 49.5±5.9% and 18.8±2.9%, respectively. Top-down MS with electron capture dissociation unequivocally localized the altered phosphorylation sites to Ser22/23 and determined the order of phosphorylation/dephosphorylation. This study represents the first clinical application of top-down MS-based quantitative proteomics for biomarker discovery from tissues, highlighting the potential of PTM as disease biomarkers. PMID:21751783
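
    The %Ptotal metric itself is a simple intensity ratio once the proteoforms are resolved; a minimal sketch with illustrative (not study) intensities:

```python
# %P_total is the phosphorylated fraction of the whole cTnI population; with
# deconvoluted top-down MS intensities it is a one-line ratio. Intensities
# below are illustrative, not the study's measurements.
intensities = {"un": 4.2e6, "mono-p": 1.9e6, "bis-p": 0.8e6}  # relative abundances
p_total = 100.0 * (intensities["mono-p"] + intensities["bis-p"]) / sum(intensities.values())
print(f"%P_total = {p_total:.1f}%")   # the study reported ~50-56% in normal hearts
```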

  14. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
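    The incremental cost-effectiveness ratios the model reports reduce to the standard definition ICER = (mean cost difference) / (mean QALY difference) across the paired virtual cohorts. A minimal sketch with illustrative numbers; the cohort means and spreads below are assumptions, not outputs of the TEAM-HF model:

    ```python
    import numpy as np

    # ICER from paired simulated cohorts: cost difference per QALY gained.
    rng = np.random.default_rng(1)
    n = 10_000                                   # pairs of virtual cohorts
    cost_prog = rng.normal(52_000, 8_000, n)     # disease management program, $ (illustrative)
    cost_usual = rng.normal(48_000, 8_000, n)    # usual care, $ (illustrative)
    qaly_prog = rng.normal(4.9, 0.6, n)          # quality-adjusted life-years
    qaly_usual = rng.normal(4.6, 0.6, n)

    icer = (cost_prog.mean() - cost_usual.mean()) / (qaly_prog.mean() - qaly_usual.mean())
    print(f"ICER ≈ ${icer:,.0f} per QALY gained")
    ```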

  16. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  17. A cascading failure model for analyzing railway accident causation

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model can describe the cascading process of a railway accident more accurately than previous models, and can quantitatively analyze the sensitivities and the influence of the causes. In conclusion, this model can help reveal the latent rules of accident causation and thereby reduce the occurrence of railway accidents.
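    A minimal sketch of the load-redistribution idea: when a node fails, its load is passed to connected nodes in proportion to causal-link strength, and any node pushed past the critical threshold fails in turn. The graph, weights, and threshold below are illustrative assumptions, not the authors' calibrated railway model:

    ```python
    # Cascading-failure sketch on a weighted causal graph. All numbers illustrative.
    def cascade(loads, edges, threshold, initial_failure):
        failed = {initial_failure}
        frontier = [initial_failure]
        while frontier:
            node = frontier.pop()
            out = [(dst, w) for src, dst, w in edges if src == node and dst not in failed]
            total_w = sum(w for _, w in out)
            for dst, w in out:
                loads[dst] += loads[node] * w / total_w  # redistribute by causal strength
                if loads[dst] > threshold:
                    failed.add(dst)
                    frontier.append(dst)
        return failed

    loads = {"rail_defect": 0.9, "inspection_miss": 0.5, "overspeed": 0.4, "collision": 0.6}
    edges = [("rail_defect", "inspection_miss", 0.7), ("rail_defect", "overspeed", 0.3),
             ("inspection_miss", "collision", 1.0), ("overspeed", "collision", 1.0)]
    print(cascade(loads, edges, threshold=1.0, initial_failure="rail_defect"))
    ```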

  18. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions still needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts, and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples ranging from yeast to mouse and human, and to quantitative methods from label-free to O18/O16-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying the crisp values of the risk factors, namely the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank risks associated with failure modes that could appear in the functioning of the Turn 55 CNC lathe. Two case studies are presented to demonstrate the methodology thus developed. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN numbers and yield a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
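    For reference, the crisp baseline the paper criticizes is RPN = S × O × D on 1-10 rating scales. The sketch below, with made-up ratings, shows how easily two distinct failure modes collapse onto the same RPN, the duplication problem the fuzzy approach is meant to resolve:

    ```python
    # Crisp FMEA ranking: RPN = S * O * D on 1-10 scales. Ratings are illustrative.
    failure_modes = {
        "spindle bearing wear": (7, 4, 3),   # (severity, occurrence, detection)
        "tool breakage":        (6, 7, 2),
        "coolant pump failure": (4, 3, 6),
    }
    rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
    for name in sorted(rpn, key=rpn.get, reverse=True):
        print(f"{name}: RPN = {rpn[name]}")
    # Note that 7*4*3 = 6*7*2 = 84: two distinct failure modes share one RPN, the
    # kind of duplication a fuzzy rule base resolves by weighting S, O, D unequally.
    ```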

  20. Safety evaluation of driver cognitive failures and driving errors on right-turn filtering movement at signalized road intersections based on Fuzzy Cellular Automata (FCA) model.

    PubMed

    Chai, Chen; Wong, Yiik Diew; Wang, Xuesong

    2017-07-01

    This paper proposes a simulation-based approach to estimate the safety impact of driver cognitive failures and driving errors. Fuzzy Logic, which involves linguistic terms and uncertainty, is incorporated with a Cellular Automata model to simulate the decision-making process of right-turn filtering movement at signalized intersections. Simulation experiments are conducted to estimate the relationships of cognitive failures and driving errors with safety performance. Simulation results show that different types of cognitive failures have varied relationships with driving errors and safety performance. For right-turn filtering movement, cognitive failures are more likely to result in driving errors with a denser conflicting traffic stream. Moreover, different driving errors are found to have different safety impacts. The study provides a novel approach to linguistically assess cognitions and replicate the decision-making procedures of the individual driver. Compared with crash analysis, the proposed FCA model allows quantitative estimation of particular cognitive failures, and of the impact of cognitions on driving errors and safety performance. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Determining quantitative immunophenotypes and evaluating their implications

    NASA Astrophysics Data System (ADS)

    Redelman, Douglas; Hudig, Dorothy; Berner, Dave; Castell, Linda M.; Roberts, Don; Ensign, Wayne

    2002-05-01

    Quantitative immunophenotypes varied widely among > 100 healthy young males but were maintained at characteristic levels within individuals. The initial results (SPIE Proceedings 4260:226) that examined cell numbers and the quantitative expression of adhesion and lineage-specific molecules, e.g., CD2 and CD14, have now been confirmed and extended to include the quantitative expression of inducible molecules such as HLA-DR and perforin (Pf). Some properties, such as the ratio of T helper (Th) to T cytotoxic/suppressor (Tc/s) cells, are known to be genetically determined. Other properties, e.g., the T:B cell ratio, the amount of CD19 per B cell, etc., behaved similarly and may also be inherited traits. Since some patterns observed in these healthy individuals resembled those found in pathological situations, we tested whether the patterns could be associated with the occurrence of disease. The current study shows that there were associations between quantitative immunophenotypes and the subsequent incidence and severity of disease. For example, individuals with characteristically low levels of HLA-DR or B cells or reduced numbers of Pf+ Tc/s cells had more frequent and/or more severe upper respiratory infections. Quantitative immunophenotypes will be more widely measured if the necessary standards are available and if appropriate procedures are made more accessible.

  2. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  3. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions.

    PubMed

    Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji

    2018-06-05

    The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective means of upgrading medical services and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system, through a questionnaire survey within medical institutions, to assess medical technologies more objectively and accurately, promote the management of medical technology quality, and ensure the medical safety of various operations in hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied to identify the structure of the evaluation system and its indicators. The experts' judgments on the indicators were used to build pairwise comparison matrices from which the weight coefficients, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) were obtained. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom successfully responded to both rounds. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits), as well as 13 secondary indicators, were included in the evaluation system. A matrix was built to compute the λmax, CI, and CR for each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly. The two-round expert survey and statistical analysis were performed, and the credibility of the results was verified through consistency tests, establishing the quantitative evaluation system and its indicators.
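    The AHP quantities named above are standard: weights come from the principal eigenvector of a pairwise comparison matrix, CI = (λmax − n)/(n − 1), and CR = CI/RI, with RI taken from Saaty's random-index table. A minimal sketch with an illustrative 4 × 4 matrix, not the experts' actual judgments:

    ```python
    import numpy as np

    # AHP weights and consistency check for a pairwise comparison matrix
    # (illustrative values for four primary indicators: safety, effectiveness,
    # innovativeness, benefits).
    A = np.array([[1,   2,   2,   3],
                  [1/2, 1,   1,   3],
                  [1/2, 1,   1,   2],
                  [1/3, 1/3, 1/2, 1]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    lam_max = eigvals.real[k]              # maximum eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                           # normalized weight coefficients

    n = A.shape[0]
    CI = (lam_max - n) / (n - 1)           # consistency index
    RI = 0.90                              # Saaty's random index for n = 4
    CR = CI / RI                           # random consistency ratio; accept if CR < 0.10
    print(w.round(2), round(lam_max, 3), round(CR, 3))
    ```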

  4. An academic medical center's response to widespread computer failure.

    PubMed

    Genes, Nicholas; Chary, Michael; Chason, Kevin W

    2013-01-01

    As hospitals incorporate information technology (IT), their operations become increasingly vulnerable to technological breakdowns and attacks. Proper emergency management and business continuity planning require an approach to identify, mitigate, and work through IT downtime. Hospitals can prepare for these disasters by reviewing case studies. This case study details the disruption of computer operations at Mount Sinai Medical Center (MSMC), an urban academic teaching hospital. The events, and MSMC's response, are narrated and the impact on hospital operations is analyzed. MSMC's disaster management strategy prevented computer failure from compromising patient care, although walkouts and time-to-disposition in the emergency department (ED) notably increased. This incident highlights the importance of disaster preparedness and mitigation. It also demonstrates the value of using operational data to evaluate hospital responses to disasters. Quantifying normal hospital functions, just as with a patient's vital signs, may help quantitatively evaluate and improve disaster management and business continuity planning.

  5. [Clinical evaluation of a novel HBsAg quantitative assay].

    PubMed

    Takagi, Kazumi; Tanaka, Yasuhito; Naganuma, Hatsue; Hiramatsu, Kumiko; Iida, Takayasu; Takasaka, Yoshimitsu; Mizokami, Masashi

    2007-07-01

    The clinical implication of hepatitis B surface antigen (HBsAg) concentrations in HBV-infected individuals remains unclear. The aim of this study was to evaluate a novel fully automated Chemiluminescence Enzyme Immunoassay (Sysmex HBsAg quantitative assay) by comparative measurements of reference serum samples versus two independent commercial assays (Lumipulse f or Architect HBsAg QT). Furthermore, clinical usefulness was assessed for monitoring of serum HBsAg levels during antiviral therapy. A dilution test using 5 reference-serum samples showed a linear correlation curve in the range from 0.03 to 2,360 IU/ml. HBsAg was measured in a total of 400 serum samples, and 99.8% had consistent results between Sysmex and Lumipulse f. Additionally, a positive linear correlation was observed between Sysmex and Architect. To compare the Architect and Sysmex assays, both methods were applied to quantify HBsAg in serum samples with different HBV genotypes/subgenotypes, as well as in serum containing HBV vaccine escape mutants (126S, 145R). Correlation between the methods was observed in results for escape mutants and the genotypes (A, B, C) common in Japan. During lamivudine therapy, an increase in HBsAg and HBV DNA concentrations was observed to precede the aminotransferase (ALT) elevation associated with drug-resistant HBV variant emergence (breakthrough hepatitis). In conclusion, the reliability of the Sysmex HBsAg quantitative assay was confirmed for all HBV genetic variants common in Japan. Monitoring of serum HBsAg concentrations, in addition to HBV DNA quantification, is helpful in evaluating the response to lamivudine treatment and diagnosing breakthrough hepatitis.

  6. The SMART personalised self-management system for congestive heart failure: results of a realist evaluation.

    PubMed

    Bartlett, Yvonne K; Haywood, Annette; Bentley, Claire L; Parker, Jack; Hawley, Mark S; Mountain, Gail A; Mawson, Susan

    2014-11-25

    Technology has the potential to provide support for self-management to people with congestive heart failure (CHF). This paper describes the results of a realist evaluation of the SMART Personalised Self-Management System (PSMS) for CHF. The PSMS was used, at home, by seven people with CHF. Data describing system usage and usability as well as questionnaire and interview data were evaluated in terms of the context, mechanism and outcome hypotheses (CMOs) integral to realist evaluation. The CHF PSMS improved heart failure related knowledge in those with low levels of knowledge at baseline, through providing information and quizzes. Furthermore, participants perceived the self-regulatory aspects of the CHF PSMS as being useful in encouraging daily walking. The CMOs were revised to describe the context of use, and how this influences both the mechanisms and the outcomes. Participants with CHF engaged with the PSMS despite some technological problems. Some positive effects on knowledge were observed as well as the potential to assist with changing physical activity behaviour. Knowledge of CHF and physical activity behaviour change are important self-management targets for CHF, and this study provides evidence to direct the further development of a technology to support these targets.

  7. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  8. Wind Turbine Failures - Tackling current Problems in Failure Data Analysis

    NASA Astrophysics Data System (ADS)

    Reder, M. D.; Gonzalez, E.; Melero, J. J.

    2016-09-01

    The wind industry has been growing significantly over the past decades, resulting in a remarkable increase in installed wind power capacity. Turbine technologies are rapidly evolving in terms of complexity and size, and there is an urgent need for cost-effective operation and maintenance (O&M) strategies. Unplanned downtime in particular represents one of the main cost drivers of a modern wind farm. Here, reliability and failure prediction models can enable operators to apply preventive O&M strategies rather than corrective actions. In order to develop these models, the failure rates and downtimes of wind turbine (WT) components have to be understood profoundly. This paper is focused on tackling three of the main issues related to WT failure analyses: the non-uniform data treatment, the scarcity of available failure analyses, and the lack of investigation into alternative data sources. For this, a modernised form of an existing WT taxonomy is introduced. Additionally, an extensive analysis of historical failure and downtime data of more than 4300 turbines is presented. Finally, the possibilities of countering the lack of available failure data by complementing historical databases with Supervisory Control and Data Acquisition (SCADA) alarms are evaluated.
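    The failure rates and downtimes such models start from are exposure-normalized counts: failures per turbine-year and mean downtime per failure. A minimal sketch with invented records; the component names, counts, and observation period are placeholders, not figures from the 4300-turbine dataset:

    ```python
    # Failure rate per turbine-year and mean downtime per failure, the two basic
    # quantities a reliability model needs. All records below are illustrative.
    records = [  # (component, n_failures, turbine_years_observed, total_downtime_h)
        ("gearbox",   120, 4300 * 2.5, 18000),
        ("generator",  95, 4300 * 2.5,  9500),
        ("pitch",     310, 4300 * 2.5, 12400),
    ]
    for comp, n_fail, t_years, down_h in records:
        rate = n_fail / t_years             # failures per turbine-year
        mdt = down_h / n_fail               # mean downtime per failure, hours
        print(f"{comp}: {rate:.3f} failures/turbine-year, {mdt:.0f} h mean downtime")
    ```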

  9. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  10. Competitive evaluation of failure detection algorithms for strapdown redundant inertial instruments

    NASA Technical Reports Server (NTRS)

    Wilcox, J. C.

    1973-01-01

    Algorithms for failure detection, isolation, and correction of redundant inertial instruments in the strapdown dodecahedron configuration are competitively evaluated in a digital computer simulation that subjects them to identical environments. Their performance is compared in terms of orientation and inertial velocity errors and in terms of missed and false alarms. The algorithms appear in the simulation program in modular form, so that they may be readily extracted for use elsewhere. The simulation program and its inputs and outputs are described. The algorithms, along with an eighth algorithm that was not simulated, are also compared analytically to show the relationships among them.

  11. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology

    PubMed Central

    Zhang, Wen; Cao, Jieer

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in the well validated multi-rigid body (MB) model to compute the value of weighted injury criterion to quantitatively assess drivers’ overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct comparing with the one in low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference of WIC value among three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles. PMID:29240789

  12. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology.

    PubMed

    Zhang, Wen; Cao, Jieer; Xu, Jun

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in the well validated multi-rigid body (MB) model to compute the value of weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct comparing with the one in low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference of WIC value among three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles.
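    The weighted injury criterion used in the two records above combines individual injury measures into a single score. The sketch below implements the generic form, each measure normalized by a tolerance limit and weighted; the measures, limits, and weights shown are hypothetical stand-ins, not the WIC definition used in the paper:

    ```python
    # Generic weighted-injury-criterion sketch: normalize each injury measure by
    # its tolerance limit, then combine with weights that sum to one. All values
    # below are hypothetical, not the paper's WIC parameters.
    def weighted_injury(measures, limits, weights):
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(weights[k] * measures[k] / limits[k] for k in measures)

    measures = {"HIC": 650.0, "chest_acc_g": 42.0, "femur_force_kN": 6.1}
    limits   = {"HIC": 1000.0, "chest_acc_g": 60.0, "femur_force_kN": 10.0}
    weights  = {"HIC": 0.6, "chest_acc_g": 0.3, "femur_force_kN": 0.1}
    print(round(weighted_injury(measures, limits, weights), 3))  # lower is safer
    ```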

  13. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into related research in the future. © 2014 Wiley Periodicals, Inc.

  14. Evaluation of neutrophil/leukocyte ratio and organ failure score as predictors of reversibility and survival following an acute-on-chronic liver failure event.

    PubMed

    Agiasotelli, Danai; Alexopoulou, Alexandra; Vasilieva, Larisa; Kalpakou, Georgia; Papadaki, Sotiria; Dourakis, Spyros P

    2016-05-01

    Acute-on-chronic liver failure (ACLF) is defined as an acute deterioration of liver disease with high mortality in patients with cirrhosis. The early mortality in ACLF is associated with organ failure and high leukocyte count. The time needed to reverse this condition and the factors affecting mortality after the early 30-day period were evaluated. One hundred and ninety-seven consecutive patients with cirrhosis were included. Patients were prospectively followed up for 180 days. ACLF was diagnosed in 54.8% of the patients. Infection was the most common precipitating event in patients with ACLF. On multivariate analysis, only the neutrophil/leukocyte ratio and the Chronic Liver Failure Consortium Organ Failure (CLIF-C OF) score were associated with mortality. Hazard ratios for mortality in patients with ACLF compared with those without ACLF, assessed at different time end-points post-enrollment, revealed that the relative risk of death in the ACLF group was 8.54 during the first 30-day period and declined to 1.94 during the second period of observation. The time-varying effects of the neutrophil/leukocyte ratio and the CLIF-C score were negative (1% and 18% declines in the hazard ratio per month, respectively), while that of the Model for End-Stage Liver Disease (MELD) was positive (a 3% increase in the hazard ratio per month). The condition of ACLF was reversible in patients who survived. During the 30-180-day period following the acute event, the probability of death in ACLF became gradually similar to that in the non-ACLF group. The impact of inflammatory response and organ failure on survival is powerful during the first 30-day period and weakens thereafter, while that of MELD increases. © 2015 The Japan Society of Hepatology.

  15. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA)

    PubMed Central

    Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad

    2016-01-01

    Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. With the aim of improving Intensive Care Unit (ICU) reliability in hospitals, this research therefore tries to identify and analyze ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive research, data were gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis, however, was quantitative, based on the failures' Risk Priority Numbers (RPN) from the Failure Modes and Effects Analysis (FMEA) method used. In addition, some causes of failures were analyzed with the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failures from 99 ICU activities in hospital B were identified and evaluated. Then, with 90% reliability (RPN≥100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying the modified PFMEA to improve process reliability in two selected ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and also makes them eager to identify causes, recommend corrective actions and even participate in process improvement without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162

  16. Fluid Volume Overload and Congestion in Heart Failure: Time to Reconsider Pathophysiology and How Volume Is Assessed.

    PubMed

    Miller, Wayne L

    2016-08-01

    Volume regulation, assessment, and management remain basic issues in patients with heart failure. The discussion presented here is directed at opening a reassessment of the pathophysiology of congestion in congestive heart failure and the methods by which we determine volume overload status. Peer-reviewed historical and contemporary literatures are reviewed. Volume overload and fluid congestion remain primary issues for patients with chronic heart failure. The pathophysiology is complex, and the simple concept of intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can alert clinicians of changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in patients with chronic heart failure and help guide individualized, appropriate therapy-not all volume overload is the same. © 2016 American Heart Association, Inc.

  17. A model for predicting embankment slope failures in clay-rich soils; A Louisiana example

    NASA Astrophysics Data System (ADS)

    Burns, S. F.

    2015-12-01

    It is well known that smectite-rich soils significantly reduce the stability of slopes. The question is how much smectite in the soil causes slope failures. A study of over 100 sites in north and south Louisiana, USA, compared slopes that failed during a major El Nino winter (heavy rainfall) in 1982-1983 to similar slopes that did not fail. Soils in the slopes were tested for per cent clay, liquid limits, plasticity indices and semi-quantitative clay mineralogy. Slopes with a High Risk of failure (85-90% chance of failure in 8-15 years after construction) contained soils with a liquid limit > 54%, a plasticity index over 29%, and clay contents > 47%. Slopes with an Intermediate Risk (50-55% chance of failure in 8-15 years) contained soils with a liquid limit between 36-54%, a plasticity index between 16-19%, and a clay content between 32-47%. Slopes with a Low Risk of failure (< 5% chance of failure in 8-15 years after construction) contained soils with a liquid limit < 36%, a plasticity index < 16%, and a clay content < 32%. These data show that, when constructing 3:1 embankment slopes, one should check the above soil characteristics before construction to prevent slope failure. If the soils fall into the Low Risk classification, construct the embankment normally. If the soils fall into the High Risk classification, one will need to use lime stabilization or heat treatments to prevent failures. Soils in the Intermediate Risk class will have to be evaluated on a case by case basis.
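    The reported thresholds translate directly into a screening rule on liquid limit (LL), plasticity index (PI), and clay content. A minimal sketch encoding the classification as stated; the handling of values that straddle class boundaries is an assumption:

    ```python
    # Risk classification from the reported thresholds: liquid limit (LL, %),
    # plasticity index (PI, %), and clay content (%).
    def slope_risk(ll, pi, clay):
        if ll > 54 and pi > 29 and clay > 47:
            return "High (85-90% failure chance in 8-15 years)"
        if ll < 36 and pi < 16 and clay < 32:
            return "Low (<5% failure chance in 8-15 years)"
        return "Intermediate (evaluate case by case)"

    print(slope_risk(ll=58, pi=31, clay=50))   # -> High
    print(slope_risk(ll=30, pi=12, clay=25))   # -> Low
    ```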

  18. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared the quantitative paradigms (validity, reliability, neutrality, generalizability) with the qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/Credibility concerns the observational framework, while Reliability/Dependability refers to the range of stability in observations, Neutrality/Confirmability reflects influences between observers and subjects, and Generalizability/Transferability differ epistemologically in the way findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If we can assume stability to some extent, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework, with stability in all phases of observation, it is useful to use the qualitative paradigms to enhance the rigor of the study.

  19. Quantitative light-induced fluorescence technology for quantitative evaluation of tooth wear

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Kyeom; Lee, Hyung-Suk; Park, Seok-Woo; Lee, Eun-Song; de Josselin de Jong, Elbert; Jung, Hoi-In; Kim, Baek-Il

    2017-12-01

    Various technologies for objectively determining enamel thickness or dentin exposure have been suggested. However, most methods have clinical limitations. This study was conducted to confirm the potential of quantitative light-induced fluorescence (QLF), using the autofluorescence intensity of occlusal surfaces of worn teeth as a function of enamel grinding depth in vitro. Sixteen permanent premolars were used. Each tooth was gradationally ground down at the occlusal surface in the apical direction. QLF-digital and swept-source optical coherence tomography images were acquired at each grinding depth (in steps of 100 μm). All QLF images were converted to 8-bit grayscale images to calculate the fluorescence intensity. The maximum brightness (MB) values of the same sound regions in the grayscale images were calculated before grinding and at each stage of the grinding process. Finally, 13 samples were evaluated. MB increased over the grinding depth range with a strong correlation (r=0.994, P<0.001). In conclusion, the fluorescence intensity of the teeth and the grinding depth were strongly correlated in the QLF images. Therefore, QLF technology may be a useful noninvasive tool for monitoring the progression of tooth wear and conveniently estimating enamel thickness.

  20. On possibilities of using global monitoring in effective prevention of tailings storage facilities failures.

    PubMed

    Stefaniak, Katarzyna; Wróżyńska, Magdalena

    2018-02-01

    Protection of common natural goods is one of the greatest challenges man faces every day. Extracting and processing natural resources such as mineral deposits contributes to the transformation of the natural environment. A number of activities designed to maintain this balance are undertaken in accordance with the concept of integrated order. One of them is the use of comprehensive systems of tailings storage facility monitoring. Despite such monitoring, system failures still occur. The number of these failures illustrates both the scale of the problem and the magnitude of their consequences. The paper presents the vast possibilities provided by global monitoring for the effective prevention of these failures. Particular attention is drawn to the potential of multidirectional monitoring, including technical and environmental monitoring, illustrated by the example of one of the world's biggest hydrotechnical structures: the Żelazny Most Tailings Storage Facility (TSF) in Poland. Analysis of monitoring data allows preventive action to be taken against construction failures of facility dams, which can have devastating effects on human life and the natural environment.

  1. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue to prevent the occurrence of cirrhosis and to initiate appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness for detecting and evaluating tumors has been studied. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

  2. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  3. Quantitative methods for evaluating the efficacy of thalamic deep brain stimulation in patients with essential tremor.

    PubMed

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and the long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life.

  4. A Prospective, Quantitative Evaluation of Fatty Infiltration Before and After Rotator Cuff Repair.

    PubMed

    Lansdown, Drew A; Lee, Sonia; Sam, Craig; Krug, Roland; Feeley, Brian T; Ma, C Benjamin

    2017-07-01

    Current evaluation of muscle fatty infiltration has been limited by subjective classifications. Quantitative fat evaluation through magnetic resonance imaging (MRI) may allow for an improved longitudinal evaluation of the effect of surgical repair on the progression of fatty infiltration. We hypothesized that (1) patients with isolated full-thickness supraspinatus tendon tears would have less progression in fatty infiltration compared with patients with full-thickness tears of multiple tendons and (2) patients with eventual failed repair would have higher baseline levels of fatty infiltration. Cohort study; Level of evidence, 2. Thirty-five patients with full-thickness rotator cuff tears were followed longitudinally. All patients received a shoulder MRI, including the iterative decomposition of echoes of asymmetric length (IDEAL) sequence for fat measurement, prior to surgical treatment and at 6 months after surgical repair. Fat fractions were recorded for all 4 rotator cuff muscles from measurements on 4 sagittal slices centered at the scapular-Y. Demographics and tear characteristics were recorded. Baseline and follow-up fat fractions were compared for patients with isolated supraspinatus tears versus multitendon tears and for patients with intact repairs versus failed repairs. Statistical significance was set at P < .05. The mean fat fractions were significantly higher at follow-up than at baseline for the supraspinatus (9.8% ± 7.0% vs 8.3% ± 5.7%; P = .025) and infraspinatus (7.4% ± 6.1% vs 5.7% ± 4.4%; P = .027) muscles. Patients with multitendon tears showed no significant change for any rotator cuff muscle after repair. Patients with isolated supraspinatus tears showed a significant progression in the supraspinatus fat fraction from baseline to follow-up (from 6.8% ± 4.9% to 8.6% ± 6.8%; P = .0083). Baseline supraspinatus fat fractions were significantly higher in patients with eventual failed repairs compared with those with intact repairs (11.7% ± 6

  5. Quantitative evaluation of orbital hybridization in carbon nanotubes under radial deformation using π-orbital axis vector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohnishi, Masato, E-mail: masato.ohnishi@rift.mech.tohoku.ac.jp; Suzuki, Ken; Miura, Hideo, E-mail: hmiura@rift.mech.tohoku.ac.jp

    2015-04-15

    When a radial strain is applied to a carbon nanotube (CNT), the increase in local curvature induces orbital hybridization. The effect of the curvature-induced orbital hybridization on the electronic properties of CNTs, however, has not been evaluated quantitatively. In this study, the strength of orbital hybridization in CNTs under homogeneous radial strain was evaluated quantitatively. Our analyses revealed the detailed procedure of the change in electronic structure of CNTs. In addition, the dihedral angle, the angle between π-orbital axis vectors of adjacent atoms, was found to effectively predict the strength of local orbital hybridization in deformed CNTs.

  6. Evaluation of marginal failures of dental composite restorations by acoustic emission analysis.

    PubMed

    Gu, Ja-Uk; Choi, Nak-Sam

    2013-01-01

    In this study, a nondestructive method based on acoustic emission (AE) analysis was developed to evaluate the marginal failure states of dental composite restorations. Three types of ring-shaped substrates, which were modeled after a Class I cavity, were prepared from polymethyl methacrylate, stainless steel, and human molar teeth. A bonding agent and a composite resin were applied to the ring-shaped substrates and cured by light exposure. At each time-interval measurement, the tooth substrate presented a higher number of AE hits than the polymethyl methacrylate and steel substrates. Marginal disintegration estimates derived from cumulative AE hits and cumulative AE energy parameters showed that a significant portion of marginal gap formation was already realized within 1 min at the initial light-curing stage. Estimation based on cumulative AE energy gave a higher level of marginal failure than that based on AE hits. It was concluded that the AE analysis method developed in this study was a viable approach for efficiently predicting the clinical survival of dental composite restorations within a short test period.

  7. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  8. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  9. A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems

    DOE PAGES

    Shin, Sangmin; Lee, Seungyub; Judi, David; ...

    2018-02-07

    Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.

  10. A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Sangmin; Lee, Seungyub; Judi, David

    Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.

  11. Development of an adaptive failure detection and identification system for detecting aircraft control element failures

    NASA Technical Reports Server (NTRS)

    Bundick, W. Thomas

    1990-01-01

    A methodology for designing a failure detection and identification (FDI) system to detect and isolate control element failures in aircraft control systems is reviewed. An FDI system design for a modified B-737 aircraft resulting from this methodology is also reviewed, and the results of evaluating this system via simulation are presented. The FDI system performed well in a no-turbulence environment, but it experienced an unacceptable number of false alarms in atmospheric turbulence. An adaptive FDI system, which adjusts thresholds and other system parameters based on the estimated turbulence level, was developed and evaluated. The adaptive system performed well over all turbulence levels simulated, reliably detecting all but the smallest magnitude partially-missing-surface failures.
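    The adaptive idea is simple: scale the detection threshold with the estimated turbulence level so that turbulence-driven residuals do not trip false alarms. A minimal sketch of such a residual test, with illustrative constants rather than the tuned values from the B-737 study:

    ```python
    import numpy as np

    # Adaptive threshold test: the detection threshold grows with estimated
    # turbulence so false alarms stay rare in rough air. Constants illustrative.
    def detect_failure(residuals, turbulence_rms, base_threshold=0.5, k=3.0):
        threshold = base_threshold + k * turbulence_rms
        return np.abs(residuals) > threshold

    rng = np.random.default_rng(0)
    residuals = rng.normal(0.0, 0.4, size=1000)      # healthy-system residuals
    residuals[500:] += 2.5                           # injected control-surface failure
    alarms = detect_failure(residuals, turbulence_rms=0.4)
    print(alarms[:500].mean(), alarms[500:].mean())  # false-alarm rate vs detection rate
    ```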

  12. Analytical insight into "breathing" crack-induced acoustic nonlinearity with an application to quantitative evaluation of contact cracks.

    PubMed

    Wang, Kai; Liu, Menglong; Su, Zhongqing; Yuan, Shenfang; Fan, Zheng

    2018-08-01

    To characterize fatigue cracks, in the undersized stage in particular, preferably in a quantitative and precise manner, a two-dimensional (2D) analytical model is developed for interpreting the modulation mechanism of a "breathing" crack on guided ultrasonic waves (GUWs). In conjunction with a modal decomposition method and a variational principle-based algorithm, the model is capable of analytically depicting the propagating and evanescent waves induced owing to the interaction of probing GUWs with a "breathing" crack, and further extracting linear and nonlinear wave features (e.g., reflection, transmission, mode conversion and contact acoustic nonlinearity (CAN)). With the model, a quantitative correlation between CAN embodied in acquired GUWs and crack parameters (e.g., location and severity) is obtained, whereby a set of damage indices is proposed via which the severity of the crack can be evaluated quantitatively. The evaluation, in principle, does not entail a benchmarking process against baseline signals. As validation, the results obtained from the analytical model are compared with those from finite element simulation, showing good consistency. This has demonstrated accuracy of the developed analytical model in interpreting contact crack-induced CAN, and spotlighted its application to quantitative evaluation of fatigue damage. Copyright © 2018 Elsevier B.V. All rights reserved.
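    Damage indices of this kind are often built from the harmonic content of the received wave: a common contact-acoustic-nonlinearity measure is the second-harmonic amplitude relative to the squared fundamental. The sketch below computes that standard index from a synthetic signal; it illustrates the genre of index, not the paper's exact definition:

    ```python
    import numpy as np

    # Relative nonlinearity index A2 / A1**2 from the spectrum of a received
    # guided-wave signal. Signal, frequencies, and harmonic level are synthetic.
    fs, f0 = 10e6, 200e3                      # sampling rate and excitation, Hz
    t = np.arange(0, 5e-4, 1 / fs)
    x = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)

    X = np.abs(np.fft.rfft(x)) / len(x)       # single-sided amplitude spectrum
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    A1 = X[np.argmin(np.abs(freqs - f0))]     # fundamental amplitude
    A2 = X[np.argmin(np.abs(freqs - 2 * f0))] # second-harmonic amplitude
    print("relative nonlinearity index:", A2 / A1**2)
    ```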

  13. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) model was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection, and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk
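
    As a rough illustration of the ranking step, a risk priority number in the classic FMEA style is the product of the probability, severity, and detection-difficulty scores; the fatality weighting below is a hypothetical extension for the fourth ranking factor mentioned above, not the QFMEA model's actual formula:

        def risk_priority(probability, severity, detectability, fatality_factor=1.0):
            # Classic FMEA-style ranking: higher scores mean higher priority.
            # probability, severity, detectability are scored on, e.g., a 1-10 scale;
            # fatality_factor is a hypothetical multiplier for fatality potential.
            return probability * severity * detectability * fatality_factor

        # Hypothetical risks scored (probability, severity, detectability, fatality).
        risks = {"well-bore leakage": (7, 9, 6, 1.5), "pipeline rupture": (3, 8, 2, 1.2)}
        ranked = sorted(risks, key=lambda r: risk_priority(*risks[r]), reverse=True)
        print(ranked)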

  14. Integrating quantitative and qualitative evaluation methods to compare two teacher inservice training programs

    NASA Astrophysics Data System (ADS)

    Lawrenz, Frances; McCreath, Heather

    Qualitative and quantitative evaluation procedures were used to compare two physical-science teacher inservice training programs. The two programs followed the master teacher training model espoused by NSF but used different types of master teachers and types of activities. The two evaluation procedures produced different results and together they provided a much clearer picture of the strengths and weaknesses of the two programs. Using only one approach or the other would have substantially altered the conclusions.

  15. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods are able to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  16. Discharge clinical characteristics and 60-day readmission in patients hospitalized with heart failure.

    PubMed

    Anderson, Kelley M

    2014-01-01

    Heart failure is a clinical syndrome that incurs a high prevalence, mortality, morbidity, and economic burden in our society. Patients with heart failure may experience hospitalization because of an acute exacerbation of their condition. Recurrent hospitalizations soon after discharge are an unfortunate occurrence in this patient population. The purpose of this study was to explore the clinical and diagnostic characteristics of individuals hospitalized with a primary diagnosis of heart failure at the time of discharge and to compare the association of these indicators in individuals who did and did not experience a heart failure hospitalization within 60 days of the index stay. This is a descriptive, correlational, quantitative study using a retrospective review of 134 individuals discharged with a primary diagnosis of heart failure. Records were reviewed for sociodemographic characteristics, health histories, clinical assessment findings, and diagnostic information. Significant predictors of 60-day heart failure readmission were dyspnea (β = 0.579), crackles (β = 1.688), and assistance with activities of daily living (β = 2.328), independent of age, gender, and multiple other factors. Using hierarchical logistic regression, a model was derived that correctly classified 77.4% of the cohort, 78.2% of those who had a readmission (sensitivity of the prediction), and 76.7% of the subjects in whom the predicted event, readmission, did not occur (specificity of the prediction). Hospitalizations for heart failure are markers of clinical instability. Future events after hospitalization are common in this patient population, and this study provides a novel understanding of clinical characteristics at the time of discharge that are associated with future outcomes, specifically 60-day heart failure readmissions. A consideration of these characteristics provides an additional perspective to guide clinical decision making and the
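
    The reported sensitivity and specificity follow directly from the model's classification table; a small sketch of that computation (the prediction vectors here are made up, not the study's data):

        def sensitivity_specificity(y_true, y_pred):
            # Sensitivity: fraction of actual readmissions correctly predicted.
            # Specificity: fraction of non-readmissions correctly predicted.
            tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
            fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
            tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
            fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
            return tp / (tp + fn), tn / (tn + fp)

        sens, spec = sensitivity_specificity([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")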

  17. Quantitative Evaluation Method of Each Generation Margin for Power System Planning

    NASA Astrophysics Data System (ADS)

    Su, Su; Tanaka, Kazuyuki

    As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities. Therefore, an efficient system planning method has been expected. This paper proposes a quantitative evaluation method for the (N-1) generation margin considering overload and voltage stability restrictions. Concerning the generation margin related to overload, a fast solution method without recalculation of the (N-1) Y-matrix is proposed. Regarding voltage stability, this paper proposes an efficient method to search for the stability limit. The IEEE30 model system, which is composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost for the generation margin related to overload under the (N-1) condition, and specify its value quantitatively.

  18. Four hundred meters walking test in the evaluation of heart failure patients.

    PubMed

    Zdrenghea, D; Beudean, Maria; Pop, Dana; Zdrenghea, V

    2010-01-01

    The best evaluation of the severity and prognosis of heart failure patients is obtained by maximal exercise stress testing, but given the very large number of HF patients, submaximal stress tests, mainly the six-minute walking test (6MWT), are used to evaluate daily effort capacity. The limitation of the 6MWT is that patients are not motivated to walk during it, and the periphery, so important for heart failure patients, is not equally involved. The aim was to compare a new fixed-distance walking test, the 400-meter walking test (400mWT), with the 6MWT and maximal exercise testing. Twenty patients with dilated cardiomyopathy (DCM) were investigated. The patients were included in the study after relief of the congestive syndrome. Each patient underwent, on three consecutive days, a maximal symptom-limited exercise stress test on a cycloergometer, a six-minute walking test, and a 400-meter walking test. The last consisted of walking on a corridor 40 meters long, at a speed chosen by the patient himself. The results were expressed in seconds, representing the time necessary to cover the established distance of 400 meters. During the cycloergometer exercise stress test, the calculated mean peak VO2 was 15.2 +/- 1.4 mlO2/kg/min (4.32 METs). The mean distance walked during the 6MWT was 350 +/- 34 m, and the mean time needed to walk 400 m (400mWT) was 300 +/- 27 seconds. The correlation between peak VO2 and the distance walked during the 6MWT was 0.40, and a similar but negative value (r = -0.42) was registered between peak VO2 and the time recorded during the 400mWT. Only weak correlations were registered between LVEF and all three tests. In turn, the correlation between the distance registered during the 6MWT and the time registered during the 400mWT was excellent: r = -0.60. The 400mWT is a useful tool for the evaluation of the submaximal effort capacity of CHF patients. Its value for evaluating exercise capacity is similar to that of the 6MWT, but the 400mWT can assure a better evaluation of peripheral involvement.
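
    The reported r values are plain Pearson correlations; a minimal sketch with synthetic data (the numbers below are illustrative, not the study's measurements):

        import numpy as np

        peak_vo2 = np.array([13.5, 14.2, 15.0, 15.8, 16.4])   # mlO2/kg/min (synthetic)
        time_400m = np.array([330, 318, 301, 290, 276])        # seconds (synthetic)

        # Pearson correlation between peak VO2 and 400mWT time; a negative value
        # means fitter patients cover the 400 m faster.
        r = np.corrcoef(peak_vo2, time_400m)[0, 1]
        print(f"r = {r:.2f}")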

  19. Failure rates of mini-implants placed in the infrazygomatic region.

    PubMed

    Uribe, Flavio; Mehr, Rana; Mathur, Ajay; Janakiraman, Nandakumar; Allareddy, Veerasathpurush

    2015-01-01

    The purpose of this pilot study was to evaluate the failure rates of mini-implants placed in the infrazygomatic region and to evaluate factors that affect their stability. A retrospective cohort study of 30 consecutive patients (55 mini-implants) who had infrazygomatic mini-implants placed at a university clinic was conducted to evaluate failure rates. Patient, mini-implant, orthodontic, surgical, and mini-implant maintenance factors were evaluated by univariate logistic regression models for association with failure rates. A 21.8% failure rate of mini-implants placed in the infrazygomatic region was observed. None of the predictor variables was significantly associated with higher or lower odds of implant failure. Failure rates for infrazygomatic mini-implants were slightly higher than those reported for other maxillo-mandibular osseous locations. No predictor variables were found to be associated with the failure rates.

  20. Quantitative nondestructive in-service evaluation of stay cables of cable-stayed bridges: methods and practical experience

    NASA Astrophysics Data System (ADS)

    Weischedel, Herbert R.; Hoehle, Hans-Werner

    1995-05-01

    Stay cables of cable-stayed bridges have corrosion protection systems that can be elaborate. For example, such a system may simply consist of one or several coats of paint or, in more complex cases, of plastic pipes that are wrapped with tape and filled with grout. Frequently, these corrosion protection systems prevent visual inspections. Therefore, alternative nondestructive examination methods are called for. For example, modern dual-function electromagnetic (EM) instruments allow the simultaneous detection of external and internal localized flaws (such as external and internal broken wires and corrosion pitting) and the measurement of loss of metallic cross-sectional area (typically caused by external or internal corrosion or wear). Initially developed for mining and ski-lift applications, these instruments have been successfully used for the inspection of stays of cable-stayed bridges, and for the inspection of guys of smoke stacks, flare stacks, broadcast towers, suspended roofs, etc. As a rule, guys and bridge cables are not subjected to wear and bending stresses. However, their safety can be compromised by corrosion caused by the failure of corrosion protection systems. Furthermore, live loads and wind forces create intermittent tensile stresses that can cause fatigue breaks of wires. This paper discusses the use of dual-function EM instruments for the detection and nondestructive quantitative evaluation of cable deterioration. It explains the underlying principles. Experiences with this method, together with field inspection results, are presented.

  1. Elasticity dominates strength and failure in metallic glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Z. Q.; Qu, R. T.; Zhang, Z. F., E-mail: zhfzhang@imr.ac.cn

    2015-01-07

    Two distinct deformation mechanisms, shearing and volume dilatation, are quantitatively analyzed in metallic glasses (MGs) from fundamental thermodynamics. Their competition is deduced to intrinsically dominate the strength and failure behaviors of MGs. Both the intrinsic shear and normal strengths give rise to the critical mechanical energies to activate destabilization of amorphous structures, under pure shearing and volume dilatation, respectively, and can be determined in terms of elastic constants. By adopting an ellipse failure criterion, the strength and failure behaviors of MGs can be precisely described just according to their shear modulus and Poisson's ratio, without mechanical testing. Quantitative relations are established systematically and verified by experimental results. Accordingly, truly non-destructive failure prediction can be achieved in various MGs. By highlighting the broad key significance of elasticity, a “composition-elasticity-property” scheme is further outlined for better understanding and controlling the mechanical properties of MGs and other glassy materials from the elastic perspective.

  2. [Evaluation on methodological problems in reports concerning quantitative analysis of syndrome differentiation of diabetes mellitus].

    PubMed

    Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu

    2006-01-01

    To evaluate the quality of reports published in the last 10 years in China about quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports concerning quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.

  3. Assessment of Intralaminar Progressive Damage and Failure Analysis Using an Efficient Evaluation Framework

    NASA Technical Reports Server (NTRS)

    Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl

    2017-01-01

    Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further extended when attempting to integrate new fiber-reinforced composite materials, due to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework that uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.

  4. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    PubMed Central

    Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu

    2014-01-01

    Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movements during activities, and they can be used for quantitative assessment of symptoms of various diseases. We reviewed applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) for vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess more neurological or psychiatric disorders using actigraphy records. PMID:25214709

  5. Quantitative ultrasonic evaluation of mechanical properties of engineering materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Current progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength properties of engineering materials is reviewed. Even where conventional NDE techniques have shown that a part is free of overt defects, advanced NDE techniques should be available to confirm the material properties assumed in the part's design. There are many instances where metallic, composite, or ceramic parts may be free of critical defects while still being susceptible to failure under design loads due to inadequate or degraded mechanical strength. This must be considered in any failure prevention scheme that relies on fracture analysis. This review will discuss the availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions.

  6. Comparative analysis of quantitative efficiency evaluation methods for transportation networks.

    PubMed

    He, Yuxin; Qin, Jin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on the introduction and related mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information measured by them, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox can be explained in terms of each other, which indicates a better evaluation of the real operating condition of a transportation network. From the analysis of the network efficiency calculated by the Q-H method, it can also be concluded that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and a specific network structure contributing to the largest value of transportation network efficiency can be identified.

  7. Wood-adhesive bonding failure : modeling and simulation

    Treesearch

    Zhiyong Cai

    2010-01-01

    The mechanism of wood bonding failure when exposed to wet conditions or wet/dry cycles is not fully understood and the role of the resulting internal stresses exerted upon the wood-adhesive bondline has yet to be quantitatively determined. Unlike previous modeling this study has developed a new two-dimensional internal-stress model on the basis of the mechanics of...

  8. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating the attribute normalization functions φ_i(·) by integrating distributions of idealized movement and deviated movement. The weights w_i are derived from a therapist's pair-wise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated to the therapist's observations.
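
    A minimal sketch of the composite score y = Σ_i w_i φ_i(x_i) described above; the Gaussian-shaped normalizer and the weights here are placeholders (the paper derives φ_i from distributions of idealized and deviated movement, and w_i from a RankSVM over therapist comparisons):

        import math

        def phi(x, ideal_mean, ideal_std):
            # Placeholder normalizer: maps a raw kinematic attribute to (0, 1],
            # peaking when x matches the idealized movement distribution.
            return math.exp(-0.5 * ((x - ideal_mean) / ideal_std) ** 2)

        def movement_score(attributes, normalizers, weights):
            # Composite evaluation: weighted sum of normalized kinematic attributes.
            return sum(w * phi(x, m, s)
                       for x, (m, s), w in zip(attributes, normalizers, weights))

        # Two hypothetical attributes with their idealized (mean, std) and weights.
        score = movement_score([0.8, 1.9], [(1.0, 0.2), (2.0, 0.5)], [0.6, 0.4])
        print(score)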

  9. A study of Mariner 10 flight experiences and some flight piece part failure rate computations

    NASA Technical Reports Server (NTRS)

    Paul, F. A.

    1976-01-01

    The problems and failures encountered in the Mariner 10 flight are discussed, and the data available through a quantitative accounting of all electronic piece parts on the spacecraft are summarized. Computed failure rates for electronic piece parts are also presented. These computed data are intended for use in the continued updating of the failure-rate base used for trade-off studies and predictions for future JPL space missions.

  10. Quantitative oxygen concentration imaging in toluene atmospheres using Dual Imaging with Modeling Evaluation

    NASA Astrophysics Data System (ADS)

    Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim

    2013-01-01

    Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
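
    The oxygen concentration follows from inverting the Stern-Volmer relation τ0/τ = 1 + K_SV[O2]; a minimal sketch (the numerical values below are placeholders for illustration, not the updated slope reported in the paper):

        def o2_concentration(tau, tau0, k_sv):
            # Invert the Stern-Volmer relation: tau0/tau = 1 + K_SV * [O2].
            # tau and tau0 are the quenched and unquenched lifetimes (same units);
            # k_sv has units of inverse concentration, so [O2] = (tau0/tau - 1)/K_SV.
            return (tau0 / tau - 1.0) / k_sv

        # Placeholder values for illustration only.
        print(o2_concentration(tau=10.0, tau0=30.0, k_sv=0.4))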

  12. Rodent heart failure models do not reflect the human circulating microRNA signature in heart failure.

    PubMed

    Vegter, Eline L; Ovchinnikova, Ekaterina S; Silljé, Herman H W; Meems, Laura M G; van der Pol, Atze; van der Velde, A Rogier; Berezikov, Eugene; Voors, Adriaan A; de Boer, Rudolf A; van der Meer, Peter

    2017-01-01

    We recently identified a set of plasma microRNAs (miRNAs) that are downregulated in patients with heart failure in comparison with control subjects. To better understand their meaning and function, we sought to validate these circulating miRNAs in 3 different well-established rat and mouse heart failure models, and correlated the miRNAs to parameters of cardiac function. The previously identified let-7i-5p, miR-16-5p, miR-18a-5p, miR-26b-5p, miR-27a-3p, miR-30e-5p, miR-199a-3p, miR-223-3p, miR-423-3p, miR-423-5p and miR-652-3p were measured by means of quantitative real time polymerase chain reaction (qRT-PCR) in plasma samples of 8 homozygous TGR(mREN2)27 (Ren2) transgenic rats and 8 (control) Sprague-Dawley rats, 6 mice with angiotensin II-induced heart failure (AngII) and 6 control mice, and 8 mice with ischemic heart failure and 6 controls. Circulating miRNA levels were compared between the heart failure animals and healthy controls. Ren2 rats, AngII mice and mice with ischemic heart failure showed clear signs of heart failure, exemplified by increased left ventricular and lung weights, elevated end-diastolic left ventricular pressures, increased expression of cardiac stress markers and reduced left ventricular ejection fraction. All miRNAs were detectable in plasma from rats and mice. No significant differences were observed between the circulating miRNAs in heart failure animals when compared to the healthy controls (all P>0.05) and no robust associations with cardiac function could be found. The previous observation that miRNAs circulate in lower levels in human patients with heart failure could not be validated in well-established rat and mouse heart failure models. These results question the translation of data on human circulating miRNA levels to experimental models, and vice versa the validity of experimental miRNA data for human heart failure.

  13. Economic impact of heart failure according to the effects of kidney failure.

    PubMed

    Sicras Mainar, Antoni; Navarro Artieda, Ruth; Ibáñez Nolla, Jordi

    2015-01-01

    To evaluate the use of health care resources and their cost according to the effects of kidney failure in heart failure patients during a 2-year follow-up in a population setting. Observational retrospective study based on a review of medical records. The study included patients ≥ 45 years treated for heart failure from 2008 to 2010. The patients were divided into 2 groups according to the presence/absence of kidney failure (KF). Main outcome variables were comorbidity, clinical status (functional class, etiology), metabolic syndrome, costs, and new cases of cardiovascular events and kidney failure. The cost model included direct and indirect health care costs. Statistical analysis included multiple regression models. The study recruited 1600 patients (prevalence, 4.0%; mean age, 72.4 years; women, 59.7%). Of these patients, 70.1% had hypertension, 47.1% had dyslipidemia, and 36.2% had diabetes mellitus. We analyzed 433 patients (27.1%) with kidney failure and 1167 (72.9%) without kidney failure. Kidney failure was associated with functional class III-IV (54.1% vs 40.8%) and metabolic syndrome (65.3% vs 51.9%, P<.01). The average unit cost was €10,711.40. The corrected cost in the presence of kidney failure was €14,868.20 vs €9,364.50 (P=.001). During follow-up, 11.7% of patients developed ischemic heart disease, 18.8% developed kidney failure, and 36.1% developed heart failure exacerbation. Comorbidity associated with heart failure is high. The presence of kidney failure increases the use of health resources and leads to higher costs within the National Health System. Copyright © 2014 Sociedad Española de Cardiología. Published by Elsevier España. All rights reserved.

  14. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Quality of software is not only vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software is characterized by the absence of wearout and by costly redundancies, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.

  15. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
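
    For the dry-thickness approach mentioned above, the commonly used relation is σ = h·ρ·N_A/M_n (chains per unit area from dry film thickness h, bulk polymer density ρ, and number-average molar mass M_n); a minimal sketch under that assumption, with illustrative numbers:

        AVOGADRO = 6.022e23  # 1/mol

        def grafting_density(h_nm, rho_g_cm3, mn_g_mol):
            # sigma = h * rho * N_A / Mn, returned in chains/nm^2.
            # h in nm, rho in g/cm^3, Mn in g/mol.
            rho_g_nm3 = rho_g_cm3 * 1e-21          # g/cm^3 -> g/nm^3
            return h_nm * rho_g_nm3 * AVOGADRO / mn_g_mol

        # Example: 10 nm dry polystyrene film, rho ~ 1.05 g/cm^3, Mn = 50 kg/mol.
        print(grafting_density(10.0, 1.05, 50_000))  # ~0.13 chains/nm^2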

  16. Performance Evaluation and Quantitative Accuracy of Multipinhole NanoSPECT/CT Scanner for Theranostic Lu-177 Imaging

    NASA Astrophysics Data System (ADS)

    Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung

    2018-06-01

    SPECT plays an important role in peptide receptor targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate because a reliable assessment of tumor uptake and tumor-to-normal tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity and calibration factor for the mouse whole-body standard aperture. We also performed the experiments using Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177, with and without a uniform background added, was comparable to that of Tc-99m in the axial, radial and tangential directions. The system sensitivity measured for Lu-177 was almost three times lower than that of Tc-99m.

  17. Estimation of submarine mass failure probability from a sequence of deposits with age dates

    USGS Publications Warehouse

    Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.

    2013-01-01

    The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
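
    A minimal sketch of the Poisson-process model selection described above: fit an exponential distribution to inter-event times by maximum likelihood and score it with AIC (the return times below are synthetic; the study's actual analysis additionally handles age-dating uncertainty and open intervals):

        import math

        def exponential_mle_aic(intervals):
            # MLE for an exponential return-time model: lambda_hat = 1 / mean.
            n = len(intervals)
            lam = n / sum(intervals)
            log_l = n * math.log(lam) - lam * sum(intervals)
            aic = 2 * 1 - 2 * log_l          # one fitted parameter
            return lam, aic

        intervals_kyr = [12.0, 8.5, 15.2, 9.9, 11.3]   # synthetic inter-event times
        lam, aic = exponential_mle_aic(intervals_kyr)
        print(f"mean return time = {1/lam:.1f} kyr, AIC = {aic:.1f}")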

  18. Quantitative evolutionary design

    PubMed Central

    Diamond, Jared

    2002-01-01

    The field of quantitative evolutionary design uses evolutionary reasoning (in terms of natural selection and ultimate causation) to understand the magnitudes of biological reserve capacities, i.e. excesses of capacities over natural loads. Ratios of capacities to loads, defined as safety factors, fall in the range 1.2-10 for most engineered and biological components, even though engineered safety factors are specified intentionally by humans while biological safety factors arise through natural selection. Familiar examples of engineered safety factors include those of buildings, bridges and elevators (lifts), while biological examples include factors of bones and other structural elements, of enzymes and transporters, and of organ metabolic performances. Safety factors serve to minimize the overlap zone (resulting in performance failure) between the low tail of capacity distributions and the high tail of load distributions. Safety factors increase with coefficients of variation of load and capacity, with capacity deterioration with time, and with cost of failure, and decrease with costs of initial construction, maintenance, operation, and opportunity. Adaptive regulation of many biological systems involves capacity increases with increasing load; several quantitative examples suggest sublinear increases, such that safety factors decrease towards 1.0. Unsolved questions include safety factors of series systems, parallel or branched pathways, elements with multiple functions, enzyme reaction chains, and equilibrium enzymes. The modest sizes of safety factors imply the existence of costs that penalize excess capacities. Those costs are likely to involve wasted energy or space for large or expensive components, but opportunity costs of wasted space at the molecular level for minor components. PMID:12122135
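
    The overlap-zone idea lends itself to a quick numerical illustration: if load and capacity are independent and normally distributed, the failure probability is P(capacity < load) = Φ((μ_L - μ_C)/sqrt(σ_L² + σ_C²)). A sketch with arbitrary numbers:

        import math

        def failure_probability(mu_load, sd_load, mu_cap, sd_cap):
            # P(capacity < load) for independent normal load and capacity.
            z = (mu_load - mu_cap) / math.hypot(sd_load, sd_cap)
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        # Safety factor 1.5 with 10% coefficients of variation (arbitrary numbers).
        print(failure_probability(mu_load=100, sd_load=10, mu_cap=150, sd_cap=15))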

  19. Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on-chip

    NASA Astrophysics Data System (ADS)

    Du, Xuecheng; He, Chaohui; Liu, Shuhuan; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang

    2016-09-01

    Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. In order to evaluate a system-on-chip's reliability and soft error susceptibility, the fault tree analysis method was used in this work. The system fault tree was constructed based on the Xilinx Zynq-7010 All Programmable SoC. Moreover, the soft error rates of different components in the Zynq-7010 SoC were tested with an americium-241 alpha radiation source. Furthermore, some parameters used to evaluate the system's reliability and safety were calculated using Isograph Reliability Workbench 11.0, such as failure rate, unavailability and mean time to failure (MTTF). According to the fault tree analysis for the system-on-chip, the critical blocks and system reliability were evaluated through qualitative and quantitative analysis.
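
    A minimal sketch of the quantitative step: with constant failure rates, a series (OR-gate) fault tree has λ_sys = Σλ_i, MTTF = 1/λ_sys, and steady-state unavailability λ/(λ+μ) for a repairable block. The rates below are placeholders, not the measured Zynq-7010 values:

        def series_failure_rate(rates):
            # OR gate over independent basic events: rates add.
            return sum(rates)

        def mttf(rate):
            # Mean time to failure for a constant failure rate (exponential model).
            return 1.0 / rate

        def unavailability(rate, repair_rate):
            # Steady-state unavailability of a repairable block.
            return rate / (rate + repair_rate)

        block_rates = [2e-6, 5e-7, 1e-6]          # failures/hour (placeholders)
        lam = series_failure_rate(block_rates)
        print(mttf(lam), unavailability(lam, repair_rate=0.1))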

  20. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  1. Evaluation Criteria of Noninvasive Telemonitoring for Patients With Heart Failure: Systematic Review.

    PubMed

    Farnia, Troskah; Jaulent, Marie-Christine; Steichen, Olivier

    2018-01-16

    Telemonitoring can improve heart failure (HF) management, but there is no standardized evaluation framework to comprehensively evaluate its impact. Our objectives were to list the criteria used in published evaluations of noninvasive HF telemonitoring projects, describe how they are used in the evaluation studies, and organize them into a consistent scheme. Articles published from January 1990 to August 2015 were obtained through MEDLINE, Web of Science, and EMBASE. Articles were eligible if they were original reports of a noninvasive HF telemonitoring evaluation study in the English language. Studies of implantable telemonitoring devices were excluded. Each selected article was screened to extract the description of the telemonitoring project and the evaluation process and criteria. A qualitative synthesis was performed. We identified and reviewed 128 articles leading to 52 evaluation criteria classified into 6 dimensions: clinical, economic, user perspective, educational, organizational, and technical. The clinical and economic impacts were evaluated in more than 70% of studies, whereas the educational, organizational, and technical impacts were studied in fewer than 15%. User perspective was the most frequently covered dimension in the development phase of telemonitoring projects, whereas clinical and economic impacts were the focus of later phases. Telemonitoring evaluation frameworks should cover all 6 dimensions appropriately distributed along the telemonitoring project lifecycle. Our next goal is to build such a comprehensive evaluation framework for telemonitoring and test it on an ongoing noninvasive HF telemonitoring project. ©Troskah Farnia, Marie-Christine Jaulent, Olivier Steichen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 16.01.2018.

  2. [Acute renal failure after cardiac surgery: evaluation of the RIFLE criteria].

    PubMed

    Kallel, Sami; Triki, Zied; Abdenadher, Mohammed; Frikha, Imed; Jemel, Amine; Karoui, Abdelhamid

    2013-04-01

    Acute renal failure is a common complication in cardiac surgery under cardiopulmonary bypass. It is associated with increased morbidity and mortality. Acute kidney injury (AKI) is a clinical entity encompassing the entire spectrum of acute renal failure, from minor alterations to the need for renal replacement therapy. The RIFLE criteria have been proposed for defining and classifying AKI. The aim of our study was to apply the RIFLE criteria to a population of patients undergoing cardiac surgery with cardiopulmonary bypass (CPB) and to assess their relevance as a risk factor for hospital mortality compared to other risk factors. In this prospective observational study, we included patients who underwent scheduled cardiac surgery. Blood creatinine was assayed at admission, after surgery and daily for 5 days post-surgery. AKI was evaluated according to the RIFLE classification criteria. The patients were divided into three levels of severity based on plasma creatinine (R: Risk = creatinine × 1.5; I: Injury = creatinine × 2; F: Failure = creatinine × 3). We analyzed the different perioperative parameters and sought associations with the occurrence of AKI. We also studied the impact of AKI on length of stay in the ICU and on early and late mortality. One hundred and thirty-six patients were included. AKI was diagnosed in 17.6% of patients (RIFLE-R: 8.8%, RIFLE-I: 5.9% and RIFLE-F: 2.9%). AKI significantly prolonged the duration of ICU stay (7±3.8 versus 5±2.3 days; P=0.02). RIFLE-R patients had a mortality of 8.3%, compared to 12.5% for I and 50% for F. Patients without postoperative renal dysfunction had a mortality of 1.8%. In univariate analysis, age, the EuroSCORE, preoperative renal dysfunction, duration of aortic clamping, duration of CPB and C-reactive protein (CRP) were significantly associated with the occurrence of AKI. In multivariate analysis only preoperative renal dysfunction (clearance less than 63 mL/min) and CRP greater than 158
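
    The severity grading used in the study reduces to a ratio of postoperative to baseline creatinine; a minimal sketch of that classification (thresholds as given above; the full RIFLE scheme also uses GFR and urine-output criteria, omitted here):

        def rifle_class(creat_baseline, creat_post):
            # Classify AKI severity from the creatinine ratio, as in the study:
            # R (Risk) >= 1.5x, I (Injury) >= 2x, F (Failure) >= 3x baseline.
            ratio = creat_post / creat_baseline
            if ratio >= 3.0:
                return "F"
            if ratio >= 2.0:
                return "I"
            if ratio >= 1.5:
                return "R"
            return "no AKI"

        print(rifle_class(80.0, 180.0))  # creatinine in umol/L -> "I"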

  3. Selection of reference genes for gene expression studies in heart failure for left and right ventricles.

    PubMed

    Li, Mengmeng; Rao, Man; Chen, Kai; Zhou, Jianye; Song, Jiangping

    2017-07-15

    Real-time quantitative reverse transcriptase PCR (qRT-PCR) is a feasible tool for determining gene expression profiles, but the accuracy and reliability of the results depend on the stable expression of the selected housekeeping genes across samples. So far, research on stable housekeeping genes in human heart failure samples is rare. Moreover, the effect of heart failure on the expression of housekeeping genes in the right and left ventricles is yet to be studied. We therefore aim to provide stable housekeeping genes for both ventricles in heart failure and normal heart samples. In this study, we selected seven commonly used housekeeping genes as candidates. Using qRT-PCR, the expression levels of ACTB, RAB7A, GAPDH, REEP5, RPL5, PSMB4 and VCP in eight heart failure and four normal heart samples were assessed. The stability of the candidate housekeeping genes was evaluated with the geNorm and NormFinder software packages. GAPDH showed the least variation in all heart samples. The results also indicated that gene expression differs between the left and right ventricles in heart failure. GAPDH had the highest expression stability in both heart failure and normal heart samples. We also propose using different sets of housekeeping genes for the left and right ventricles. The combination of RPL5, GAPDH and PSMB4 is suitable for the right ventricle, and the combination of GAPDH, REEP5 and RAB7A is suitable for the left ventricle. Copyright © 2017 Elsevier B.V. All rights reserved.
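
    A minimal sketch of the geNorm-style stability measure used here: for each candidate gene, M is the mean standard deviation of its log2 expression ratios against every other candidate across samples, and lower M means a more stable gene (the expression matrix below is synthetic):

        import numpy as np

        def genorm_m(expr):
            # expr: genes x samples matrix of relative expression quantities.
            # M_j = mean over k != j of the std-dev across samples of
            # log2(expr_j / expr_k); lower M indicates a more stable gene.
            log_expr = np.log2(expr)
            n_genes = expr.shape[0]
            m = np.empty(n_genes)
            for j in range(n_genes):
                ratios = log_expr[j] - log_expr        # log ratios vs. every gene
                sds = ratios.std(axis=1, ddof=1)
                m[j] = np.delete(sds, j).mean()        # exclude self-comparison
            return m

        expr = np.random.default_rng(0).uniform(0.5, 2.0, size=(7, 12))
        print(genorm_m(expr))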

  4. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has a significant referential meaning for the decision making of global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivities of the CTSA program had a stable increase since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits from the CTSA program were assisting its members to build a robust academic home for the Clinical and Translational Science and to attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  5. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has a significant referential meaning for the decision making of global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivities of the CTSA program had a stable increase since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits from the CTSA program were assisting its members to build a robust academic home for the Clinical and Translational Science and to attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  6. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a web server implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this genome length bias. Therefore, we have made a simple benchmark for evaluating the "taxon counting" of a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on that simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
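
    The length bias itself is easy to demonstrate and to correct for: raw read counts scale with genome length, so dividing each taxon's count by its genome length recovers the equal-abundance answer. A sketch under the benchmark's equal-copy-number assumption (all numbers illustrative):

        # Equal numbers of genome copies, 150 bp reads covering each genome once.
        genome_lengths = {"taxon_A": 2_000_000, "taxon_B": 4_000_000, "taxon_C": 6_000_000}
        read_len, copies = 150, 1000

        raw_counts = {t: copies * length // read_len for t, length in genome_lengths.items()}

        # Length-normalized abundances: proportional to copy number, as desired.
        norm = {t: raw_counts[t] / genome_lengths[t] for t in raw_counts}
        total = sum(norm.values())
        print({t: round(v / total, 3) for t, v in norm.items()})  # ~1/3 each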

  7. Sensor failure detection for jet engines

    NASA Technical Reports Server (NTRS)

    Beattie, E. C.; Laprad, R. F.; Akhter, M. M.; Rock, S. M.

    1983-01-01

    Revisions to the advanced sensor failure detection, isolation, and accommodation (DIA) algorithm, developed under the sensor failure detection system program, were studied to eliminate the steady-state errors due to estimation filter biases. Three algorithm revisions were formulated, and one was chosen for detailed evaluation. The selected version modifies the DIA algorithm to feed back the actual sensor outputs to the integral portion of the control in the no-failure case. In case of a failure, the estimate of the failed sensor output is fed back to the integral portion. The estimator outputs are fed back to the linear regulator portion of the control at all times. The revised algorithm is evaluated and compared to the baseline algorithm developed previously.

  8. Quantitative evaluation of optically induced disorientation.

    DOT National Transportation Integrated Search

    1970-01-01

    The purpose of this study was to establish quantitatively and systematically the association between the speed of movement of an optical environment and the extent of disorientation experienced by an individual viewing this environment. The degree of...

  9. A TEM quantitative evaluation of strengthening in an Mg-RE alloy reinforced with SiC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabibbo, Marcello, E-mail: m.cabibbo@univpm.it; Spigarelli, Stefano

    2011-10-15

    Magnesium alloys containing rare earth elements are known to have high specific strength, good creep resistance and corrosion resistance up to 523 K. The addition of SiC ceramic particles strengthens the metal matrix composite, resulting in better wear and creep resistance while maintaining good machinability. The role of the reinforcement particles in enhancing strength can be quantitatively evaluated using transmission electron microscopy (TEM). This paper presents a quantitative evaluation of the different strengthening contributions, determined through TEM inspections, in an SiC Mg-RE composite alloy containing yttrium, neodymium, gadolinium and dysprosium. Compression tests at temperatures ranging between 290 and 573 K were carried out. The microstructure strengthening mechanism was studied for all the compression conditions. Strengthening was compared to the mechanical results, and the way the different contributions were combined is also discussed and justified. Research highlights: TEM evaluation of the yield strengthening terms in an Mg-RE SiC alloy; the evaluation was extended to different compression temperature conditions; linear and quadratic summation were proposed and validated; Hall-Petch was found to be the most prominent strengthening contribution.

  10. Tensile failure properties of the perinatal, neonatal, and pediatric cadaveric cervical spine.

    PubMed

    Luck, Jason F; Nightingale, Roger W; Song, Yin; Kait, Jason R; Loyd, Andre M; Myers, Barry S; Bass, Cameron R Dale

    2013-01-01

    Biomechanical tensile testing of perinatal, neonatal, and pediatric cadaveric cervical spines to failure. To assess the tensile failure properties of the cervical spine from birth to adulthood. Pediatric cervical spine biomechanical studies have been few due to the limited availability of pediatric cadavers. Therefore, scaled data based on human adult and juvenile animal studies have been used to augment the limited pediatric cadaver data. Despite these efforts, substantial uncertainty remains in our understanding of pediatric cervical spine biomechanics. A total of 24 cadaveric osteoligamentous head-neck complexes, 20 weeks gestation to 18 years, were sectioned into segments (occiput-C2 [O-C2], C4-C5, and C6-C7) and tested in tension to determine axial stiffness, displacement at failure, and load-to-failure. Tensile stiffness-to-failure (N/mm) increased by age (O-C2: 23-fold, neonate: 22 ± 7, 18 yr: 504; C4-C5: 7-fold, neonate: 71 ± 14, 18 yr: 509; C6-C7: 7-fold, neonate: 64 ± 17, 18 yr: 456). Load-to-failure (N) increased by age (O-C2: 13-fold, neonate: 228 ± 40, 18 yr: 2888; C4-C5: 9-fold, neonate: 207 ± 63, 18 yr: 1831; C6-C7: 10-fold, neonate: 174 ± 41, 18 yr: 1720). Normalized displacement at failure (mm/mm) decreased by age (O-C2: 6-fold, neonate: 0.34 ± 0.076, 18 yr: 0.059; C4-C5: 3-fold, neonate: 0.092 ± 0.015, 18 yr: 0.035; C6-C7: 2-fold, neonate: 0.088 ± 0.019, 18 yr: 0.037). Cervical spine tensile stiffness-to-failure and load-to-failure increased nonlinearly, whereas normalized displacement at failure decreased nonlinearly, from birth to adulthood. Pronounced ligamentous laxity observed at younger ages in the O-C2 segment quantitatively supports the prevalence of spinal cord injury without radiographic abnormality in the pediatric population. This study provides important and previously unavailable data for validating pediatric cervical spine models, for evaluating current scaling techniques and animal surrogate models, and for the development

  11. Design of the heart failure endpoint evaluation of AII-antagonist losartan (HEAAL) study in patients intolerant to ACE-inhibitor.

    PubMed

    Konstam, Marvin A; Poole-Wilson, Philip A; Dickstein, Kenneth; Drexler, Helmut; Justice, Steven J; Komajda, Michel; Malbecq, William; Martinez, Felipe A; Neaton, James D; Riegger, Gunter A J; Guptha, Soneil

    2008-09-01

    In patients with heart failure and reduced left ventricular ejection fraction, angiotensin receptor blockers have been found to reduce mortality and morbidity and to prevent or reverse left ventricular remodelling, compared to optimized background treatment. In light of these data, the Heart failure Endpoint evaluation of Angiotensin II Antagonist Losartan (HEAAL) study was developed to determine whether losartan 150 mg is superior to losartan 50 mg (antihypertensive dose) in reducing morbidity and mortality among patients with symptomatic heart failure who are intolerant of angiotensin-converting enzyme (ACE) inhibitors. To compare the effect of high and moderate doses of losartan on the primary endpoint of all-cause mortality and hospitalisation due to heart failure in patients (n = 3834) with symptomatic heart failure and an ejection fraction ≤ 40% who are intolerant of ACE-inhibitor treatment. This paper presents the rationale, trial design, and baseline characteristics of the study population. The study, which completed recruitment on 31 March 2005, is event-driven and is estimated to accrue the target of 1710 adjudicated primary events during the latter half of 2008. The results of HEAAL should facilitate selection of an optimal dosing regimen for losartan in patients with symptomatic heart failure who are intolerant of ACE-inhibitors.

  12. [Evaluation of a chronic fatigue in patients with moderate-to-severe chronic heart failure].

    PubMed

    Jasiukeviciene, Lina; Vasiliauskas, Donatas; Kavoliūniene, Ausra; Marcinkeviciene, Jolanta; Grybauskiene, Regina; Grizas, Vytautas; Tumyniene, Vida

    2008-01-01

    To evaluate chronic fatigue and its relation to the function of the hypothalamus-pituitary-adrenal axis in patients with New York Heart Association (NYHA) functional class III-IV chronic heart failure. A total of 170 patients with NYHA functional class III-IV chronic heart failure completed the MFI-20L, DUFS, and DEFS questionnaires assessing chronic fatigue and underwent echocardiography. Blood cortisol concentration was assessed at 8:00 am and 3:00 pm, and plasma N-terminal brain natriuretic pro-peptide (NT-proBNP) concentration was measured at 8:00 am. Neurohumoral investigations were repeated before and after a cardiopulmonary exercise test. The results of all questionnaires showed that 100% of patients with NYHA functional class III-IV heart failure complained of chronic fatigue. The level of overall fatigue was 54.5+/-31.5 points; physical fatigue, 56.8+/-24.6 points. Blood cortisol concentration at 8:00 am was normal (410.1+/-175.1 mmol/L) in the majority of patients. A decreased concentration was found in only four patients (122.4+/-15.5 mmol/L); one of these patients underwent heart transplantation. In the afternoon, blood cortisol concentration was insufficiently decreased (355.6+/-160.3 mmol/L), and the reaction to physical stress was attenuated (Delta 92.9 mmol/L). Plasma NT-proBNP concentration was 2188.9+/-1852.2 pg/L, and the reaction to physical stress was diminished (Delta 490.3 pg/L). All patients with NYHA class III-IV heart failure complained of daily chronic fatigue. The insufficiently decreased afternoon blood cortisol concentration indicates that, when chronic fatigue accompanies long-standing organic cardiovascular disease, a disorder of the hypothalamus-pituitary-adrenal axis is involved.

  13. Applying the Growth Failure in CKD Consensus Conference: evaluation and treatment algorithm in children with chronic kidney disease.

    PubMed

    Mahan, John D

    2006-07-01

    Growth failure is a common and significant clinical problem for children with chronic kidney disease (CKD), particularly those with chronic renal insufficiency (CRI). Children with CRI (typically defined by a glomerular filtration rate [GFR] <75 mL/min/1.73 m2) who have growth impairment exhibit a variety of medical and psychological problems in addition to increased mortality. Growth failure in children with CKD is usually multifactorial in etiology, including abnormalities in the growth hormone (GH)-insulin-like growth factor (IGF)-I axis and a variety of nutritional and metabolic concerns characteristic of CKD. Proper management of these factors contributes to better growth in affected children. Although the safety and efficacy of recombinant human GH (rhGH) therapy in promoting growth in children with CKD are well established, recent data indicate that the use of rhGH administration in children with CKD and growth failure remains low. Recently, guidelines were developed by the Consensus Conference for Evaluation and Treatment of Growth Failure in Children with CKD. This paper focuses on the application of these guidelines to children with CKD.

  14. An Adaptive Failure Detector Based on Quality of Service in Peer-to-Peer Networks

    PubMed Central

    Dong, Jian; Ren, Xiao; Zuo, Decheng; Liu, Hongwei

    2014-01-01

    The failure detector is one of the fundamental components that maintain high availability of Peer-to-Peer (P2P) networks. Under different network conditions, an adaptive failure detector based on quality of service (QoS) can achieve the detection time and accuracy required by upper applications with lower detection overhead. In P2P systems, network complexity and high churn lead to high message loss rates. To reduce the impact on detection accuracy, a baseline detection strategy based on a retransmission mechanism has been widely employed in many P2P applications; however, Chen's classic adaptive model cannot describe this kind of detection strategy. In order to provide an efficient failure detection service in P2P systems, this paper establishes a novel QoS evaluation model for the baseline detection strategy. The relationship between the detection period and the QoS is discussed, and on this basis an adaptive failure detector (B-AFD) is proposed that can meet quantitative QoS metrics under a changing network environment. Meanwhile, it is observed from the experimental analysis that B-AFD achieves better detection accuracy and time with lower detection overhead compared to the traditional baseline strategy and the adaptive detectors based on Chen's model. Moreover, B-AFD has better adaptability to P2P networks. PMID:25198005
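
    A minimal sketch may help fix the general idea: a Chen-style estimator predicts the next heartbeat arrival from recent inter-arrival times, and a retransmission-based (baseline) extension only suspects a peer after several consecutive deadlines are missed. The class below is illustrative only; the parameter names and the margin/retry defaults are assumptions, not the B-AFD algorithm itself.

    ```python
    from collections import deque

    class BaselineAdaptiveDetector:
        """Illustrative heartbeat failure detector: Chen-style arrival
        estimation plus a retransmission-style tolerance of missed beats."""

        def __init__(self, window=100, margin_s=0.5, max_missed=3):
            self.arrivals = deque(maxlen=window)  # recent heartbeat timestamps
            self.margin_s = margin_s              # safety margin added to deadline
            self.max_missed = max_missed          # beats tolerated before suspicion
            self.missed = 0

        def on_heartbeat(self, t_s):
            self.arrivals.append(t_s)
            self.missed = 0                       # fresh evidence the peer is alive

        def _mean_gap(self):
            xs = list(self.arrivals)
            gaps = [b - a for a, b in zip(xs, xs[1:])]
            return sum(gaps) / len(gaps)

        def check(self, now_s):
            """Call periodically; returns True once the peer is suspected."""
            if len(self.arrivals) < 2:
                return False
            deadline = (self.arrivals[-1]
                        + (self.missed + 1) * self._mean_gap()
                        + self.margin_s)
            if now_s > deadline:
                self.missed += 1                  # one more detection period lost
            return self.missed > self.max_missed
    ```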

  15. Echocardiographic evaluation of right ventricular stroke work index in advanced heart failure: a new index?

    PubMed

    Frea, Simone; Bovolo, Virginia; Bergerone, Serena; D'Ascenzo, Fabrizio; Antolini, Marina; Capriolo, Michele; Canavosio, Federico Giovanni; Morello, Mara; Gaita, Fiorenzo

    2012-12-01

    Right ventricular (RV) function plays a pivotal role in advanced heart failure patients, especially for screening those who may benefit from left ventricular assist device (LVAD) implantation. We introduce the RV contraction pressure index (RVCPI) as a new echo-Doppler parameter of RV function. The accuracy of RVCPI in detecting RV failure was compared with the criterion standard, the RV stroke work index (RVSWI) obtained through right heart catheterization, in advanced heart failure patients referred for heart transplantation or LVAD implantation. Right heart catheterization and echo-Doppler were simultaneously performed in 94 consecutive patients referred to our center for advanced heart failure (ejection fraction (EF) 24 ± 8.8%, 40% NYHA functional class IV). RV stroke volume and invasive pulmonary pressures were used to obtain RVSWI. The simplified RVCPI (sRVCPI) was derived as TAPSE × (RV-to-right atrial pressure gradient). A close positive correlation between sRVCPI and RVSWI was found (r = 0.68; P < .001). In logistic regression, increased sRVCPI was independently associated with a reduced risk (odds ratio 0.98, 95% confidence interval [CI] 0.97-0.99; P = .016) of presenting a depressed RVSWI (<0.25 mm Hg·L/m²). The simplified RVCPI showed high diagnostic accuracy (area under the receiver operating characteristic curve 0.94, 95% CI 0.89-0.99) and good sensitivity and specificity (92% and 85%, respectively) for predicting depressed RVSWI with the use of a cutoff value of <400 mm·mm Hg. In patients with advanced heart failure, the new simple bedside sRVCPI closely correlated with RVSWI, providing an independent, noninvasive, and easy tool for the evaluation of RV function. Copyright © 2012 Elsevier Inc. All rights reserved.
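
    Because the index is just a product of two bedside measurements, the reported cutoff is easy to operationalize. The snippet below is a sketch based solely on the formula and cutoff quoted above; the variable names are illustrative.

    ```python
    def srvcpi(tapse_mm: float, rv_ra_gradient_mmhg: float) -> float:
        """Simplified RV contraction pressure index: TAPSE (mm) multiplied by
        the RV-to-right-atrial pressure gradient (mm Hg); units mm*mmHg."""
        return tapse_mm * rv_ra_gradient_mmhg

    def depressed_rvswi_suspected(tapse_mm, rv_ra_gradient_mmhg, cutoff=400.0):
        """Apply the study's cutoff: sRVCPI < 400 mm*mmHg predicted a depressed
        RVSWI with 92% sensitivity and 85% specificity."""
        return srvcpi(tapse_mm, rv_ra_gradient_mmhg) < cutoff

    # Example: TAPSE 14 mm with a 25 mm Hg gradient gives sRVCPI = 350,
    # which would flag a probably depressed RVSWI.
    ```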

  16. Cardiac magnetic resonance imaging in heart failure: where the alphabet begins!

    PubMed

    Aljizeeri, Ahmed; Sulaiman, Abdulbaset; Alhulaimi, Naji; Alsaileek, Ahmed; Al-Mallah, Mouaz H

    2017-07-01

    Cardiac magnetic resonance (CMR) imaging has become a cornerstone in the evaluation of heart failure. It provides a comprehensive evaluation by answering all the pertinent clinical questions across the full pathological spectrum of heart failure. Nowadays, CMR is considered the gold standard in the evaluation of ventricular volumes, wall motion, and systolic function. Through its unique ability of tissue characterization, it provides incremental diagnostic and prognostic information, and has thus emerged as a comprehensive imaging modality in heart failure. This review outlines the role of the main conventional CMR sequences in the evaluation of heart failure and their impact on management and prognosis.

  17. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency can offer guidance for the optimal control of urban traffic. Based on the introduction and related mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information they measure, including network structure, traffic demand, travel choice behavior, and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different example networks shows that the Q-H method captures well the influence of network structure, traffic demand, and user route choice behavior on network efficiency. In addition, the efficiency measured by this method and Braess's paradox can be used to explain each other, indicating a better evaluation of the real operating condition of a transportation network. The analysis of network efficiency calculated by the Q-H method also shows that a given transportation network has a specific appropriate demand level. Meanwhile, under fixed demand, one can identify both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure that yields the largest value of transportation network efficiency. PMID:28399165
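
    As a point of reference, demand-weighted efficiency measures of this family typically take the form E = (1/n_W) Σ_w d_w/λ_w over origin-destination pairs w, with demand d_w and equilibrium travel disutility λ_w. Whether this exact form matches the paper's Q-H method is an assumption here; the sketch below only illustrates the general computation.

    ```python
    def network_efficiency(od_pairs):
        """Demand-weighted efficiency: mean of demand / equilibrium travel time
        over origin-destination (O-D) pairs. `od_pairs` is a list of
        (demand, equilibrium_travel_time) tuples; times must be positive."""
        return sum(d / t for d, t in od_pairs) / len(od_pairs)

    # Example: if removing a Braess-paradox link lowers every equilibrium
    # travel time, E increases even though the network has fewer links.
    print(network_efficiency([(100, 20.0), (50, 12.5)]))  # -> 4.5
    ```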

  18. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring the effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The method is based on a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that the statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate the governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Failure mode and effects analysis: too little for too much?

    PubMed

    Dean Franklin, Bryony; Shebl, Nada Atef; Barber, Nick

    2012-07-01

    Failure mode and effects analysis (FMEA) is a structured prospective risk assessment method that is widely used within healthcare. FMEA involves a multidisciplinary team mapping out a high-risk process of care, identifying the failures that can occur, and then characterising each of these in terms of probability of occurrence, severity of effects, and detectability, to give a risk priority number used to identify the failures most in need of attention. One might assume that such a widely used tool would have an established evidence base. This paper considers whether or not this is the case, examining the evidence for the reliability and validity of its outputs, the mathematical principles behind the calculation of the risk priority number, and the variation in how it is used in practice. We also consider the likely advantages of this approach, together with the disadvantages in terms of the healthcare professionals' time involved. We conclude that although FMEA is popular and many published studies have reported its use within healthcare, there is little evidence to support its use for the quantitative prioritisation of process failures. It lacks both reliability and validity, and is very time consuming. We would not recommend its use as a quantitative technique to prioritise, promote or study patient safety interventions. However, the stage of FMEA involving the multidisciplinary mapping of the process seems valuable, and work is now needed to identify the best way of converting this into plans for action.
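
    For readers unfamiliar with the arithmetic being criticized, the risk priority number is simply the product of the three ordinal team ratings. A minimal sketch (the 1-10 scales are the usual convention, not mandated by the paper):

    ```python
    def risk_priority_number(occurrence, severity, detectability):
        """FMEA risk priority number: the product of the three team-assigned
        ordinal ratings, each conventionally on a 1-10 scale. As the paper
        argues, multiplying ordinal scales has weak mathematical grounding,
        so the RPN should be treated as a coarse screen at best."""
        for score in (occurrence, severity, detectability):
            if not 1 <= score <= 10:
                raise ValueError("ratings are conventionally on a 1-10 scale")
        return occurrence * severity * detectability

    # Example: occurrence 7, severity 9, detectability 4 -> RPN = 252;
    # failure modes are then ranked by RPN to prioritise attention.
    ```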

  20. In vivo quantitative evaluation of tooth color with hand-held colorimeter and custom template.

    PubMed

    Shimada, Kazuki; Kakehashi, Yoshiyuki; Matsumura, Hideo; Tanoue, Naomi

    2004-04-01

    This article presents a technique for quantitatively evaluating the color of teeth, as well as color change in restorations and tooth surfaces. Through use of a custom template made of a thermoplastic polymer and a dental colorimeter, tooth surface color can be recorded periodically at the same location intraorally.

  1. Taking stock of four decades of quantitative research on stakeholder participation and evaluation use: a systematic map.

    PubMed

    Daigneault, Pierre-Marc

    2014-08-01

    Stakeholder participation and evaluation use have attracted a lot of attention from practitioners, theorists and researchers. A common hypothesis is that participation is positively associated with evaluation use. Whereas the number of empirical studies conducted on this topic is impressive, quantitative research has held a minority position within this scientific production. This study mobilizes systematic review methods to 'map' the empirical literature that has quantitatively studied participation and use. The goal is to take stock and assess the strength of evidence of this literature (but not to synthesize the findings) and, based on this assessment, to provide directions for future research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Program Helps In Analysis Of Failures

    NASA Technical Reports Server (NTRS)

    Stevenson, R. W.; Austin, M. E.; Miller, J. G.

    1993-01-01

    Failure Environment Analysis Tool (FEAT) computer program developed to enable people to see and better understand effects of failures in system. User selects failures from either engineering schematic diagrams or digraph-model graphics, and effects or potential causes of failures highlighted in color on same schematic-diagram or digraph representation. Uses digraph models to answer two questions: What will happen to system if set of failure events occurs? and What are possible causes of set of selected failures? Helps design reviewers understand exactly what redundancies built into system and where there is need to protect weak parts of system or remove them by redesign. Program also useful in operations, where it helps identify causes of failure after they occur. FEAT reduces costs of evaluation of designs, training, and learning how failures propagate through system. Written using Macintosh Programmers Workshop C v3.1. Can be linked with CLIPS 5.0 (MSC-21927, available from COSMIC).

  3. Methods for quantitative and qualitative evaluation of vaginal microflora during menstruation.

    PubMed Central

    Onderdonk, A B; Zamarchi, G R; Walsh, J A; Mellor, R D; Muñoz, A; Kass, E H

    1986-01-01

    The quantitative and qualitative changes in the bacterial flora of the vagina during menstruation have received inadequate study. Similarly, the effect of vaginal tampons on the microbial flora as well as the relationship between the microbial flora of the vagina and that of the tampon has not been adequately evaluated. The purposes of the present study were (i) to develop quantitative methods for studying the vaginal flora and the flora of tampons obtained during menstruation and (ii) to determine whether there were differences between the microflora of the tampon and that of the vaginal vault. Tampon and swab samples were obtained at various times from eight young healthy volunteers for 8 to 10 menstrual cycles. Samples consisted of swabs from women wearing menstrual pads compared with swab and tampon samples taken at various times during the menstrual cycle. Samples were analyzed for total facultative and anaerobic bacterial counts, and the six dominant bacterial species in each culture were identified. Statistical evaluation of the results indicates that total bacterial counts decreased during menstruation and that swab and tampon samples yielded similar total counts per unit weight of sample. The numbers of bacteria in tampons tended to be lower than in swabs taken at the same time. Overall, during menstruation, the concentrations of lactobacilli declined, but otherwise there was little difference among the species found during menstruation compared with those found in intermenstrual samples. Cotton tampons had little discernible effect on the microbial flora. PMID:3954346

  4. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. When should we use nitrates in congestive heart failure?

    PubMed

    Vizzardi, Enrico; Bonadei, Ivano; Rovetta, Riccardo; D'Aloia, Antonio; Quinzani, Filippo; Curnis, Antonio; Dei Cas, Livio

    2013-02-01

    Organic nitrates remain among the oldest and most commonly employed drugs in cardiology. Although their use in acute and chronic heart failure is, in most cases, based on clinical practice, only a few clinical trials have evaluated them in these settings, most comparing them with other drugs against differing endpoints. The purpose of this review is to examine the various trials that have evaluated the use of nitrates in acute and chronic heart failure. © 2012 Blackwell Publishing Ltd.

  6. Distant failure prediction for early stage NSCLC by analyzing PET with sparse representation

    NASA Astrophysics Data System (ADS)

    Hao, Hongxia; Zhou, Zhiguo; Wang, Jing

    2017-03-01

    Positron emission tomography (PET) imaging has been widely explored for treatment outcome prediction. Radiomics-driven methods provide new insight for quantitatively exploring underlying information in PET images. However, automatically extracting clinically meaningful features for prognosis remains a challenging problem. In this work, we develop a PET-guided distant failure predictive model for early stage non-small cell lung cancer (NSCLC) patients after stereotactic ablative radiotherapy (SABR) by using sparse representation. The proposed method does not need precalculated features and can learn intrinsically distinctive features contributing to classification of patients with distant failure. The proposed framework includes two main parts: 1) intra-tumor heterogeneity description; and 2) dictionary pair learning based sparse representation. Tumor heterogeneity is initially captured through an anisotropic kernel and represented as a set of concatenated vectors, which forms the sample gallery. Then, given a test tumor image, its identity (i.e., distant failure or not) is classified by applying the dictionary pair learning based sparse representation. We evaluate the proposed approach on 48 NSCLC patients treated by SABR at our institute. Experimental results show that the proposed approach can achieve an area under the receiver operating characteristic curve (AUC) of 0.70 with a sensitivity of 69.87% and a specificity of 69.51% using five-fold cross validation.
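
    Dictionary pair learning itself is too involved for a short example, but the core sparse-representation classification step can be sketched: code the test vector over the gallery of training vectors, then pick the class with the smallest class-wise reconstruction residual. Everything below (the function name, orthogonal matching pursuit as the sparse solver, the sparsity level) is an illustrative assumption, not the paper's algorithm:

    ```python
    import numpy as np
    from sklearn.linear_model import orthogonal_mp

    def src_classify(gallery, labels, x, n_nonzero=10):
        """Sparse-representation classification: `gallery` is a (n_features,
        n_samples) matrix whose columns are training feature vectors with
        class `labels` (e.g., distant failure vs not); `x` is a test vector."""
        D = gallery / np.linalg.norm(gallery, axis=0)      # unit-norm atoms
        coef = orthogonal_mp(D, x, n_nonzero_coefs=n_nonzero)
        residuals = {}
        for c in np.unique(labels):
            mask = np.asarray(labels) == c
            residuals[c] = np.linalg.norm(x - D[:, mask] @ coef[mask])
        return min(residuals, key=residuals.get)           # smallest residual wins
    ```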

  7. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
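
    As an illustration of the kind of multivariate calibration such evaluations involve, the sketch below fits a partial least squares model to mass-spectral fingerprints and reports a cross-validated error. PLS is assumed here for concreteness; the abstract does not name the specific regression procedures compared:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    def calibrate_dims(X, y, n_components=5):
        """X: (n_mixtures, n_mz_channels) DIMS intensities; y: known
        concentrations of one target volatile. Returns the fitted model and
        the root-mean-square error of 5-fold cross-validated prediction."""
        model = PLSRegression(n_components=n_components).fit(X, y)
        y_cv = cross_val_predict(PLSRegression(n_components=n_components),
                                 X, y, cv=5).ravel()
        rmsecv = float(np.sqrt(np.mean((np.asarray(y, dtype=float) - y_cv) ** 2)))
        return model, rmsecv
    ```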

  8. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have classically been evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that can evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being applied, with promising results, to peripheral nerve and plexus evaluation. DWI and DTI allow a qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow, or myelin integrity. A proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including the main nerve entrapment syndromes of both peripheral nerves and the brachial or lumbar plexus. PMID:28932698

  9. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  10. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule, and disposable endoscopes promise significant improvements in effectiveness, as well as in patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software, and the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality, and assurance of performance during clinical use of endoscopic technologies.
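
    The local magnification idea can be illustrated compactly: with detected corner coordinates of the square-grid target, ML is the image-space spacing per unit object-space spacing, and distortion is its percent deviation from the value at the image centre. A sketch under the assumption that the corners are already detected and arranged in grid order:

    ```python
    import numpy as np

    def ml_distortion(img_grid_px, pitch_mm):
        """img_grid_px: (rows, cols, 2) pixel coordinates of grid corners of a
        square target with known pitch (mm). Returns the local magnification
        between horizontal neighbours (px/mm) and percent distortion relative
        to the centre of the grid."""
        seg_px = np.linalg.norm(np.diff(img_grid_px, axis=1), axis=-1)
        ml = seg_px / pitch_mm                       # (rows, cols-1) map
        ml_centre = ml[ml.shape[0] // 2, ml.shape[1] // 2]
        distortion_pct = 100.0 * (ml / ml_centre - 1.0)
        return ml, distortion_pct
    ```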

  11. A methodology to quantitatively evaluate the safety of a glazing robot.

    PubMed

    Lee, Seungyeol; Yu, Seungnam; Choi, Junho; Han, Changsoo

    2011-03-01

    A new construction method using robots is spreading widely among construction sites in order to overcome labour shortages and frequent construction accidents. Along with economic efficiency, safety is a very important factor in evaluating the use of construction robots on construction sites. However, the quantitative evaluation of safety is difficult compared with that of economic efficiency. In this study, we suggest a safety evaluation methodology with two risk factors: the 'worker' factor, defined as posture load, and the 'work conditions' factor, defined as the work environment and the risk exposure time. The posture load evaluation reflects the risk of musculoskeletal disorders that can be caused by work posture and the risk of accidents that can be caused by reduced concentration. We evaluated the risk factors that may cause various accidents such as falling, colliding, capsizing, and squeezing in work environments, and evaluated the operational risk by considering worker exposure time to risky work environments. With the results of the evaluations for each factor, we calculated the overall operational risk and deduced the improvement ratio in operational safety achieved by introducing a construction robot. To verify these results, we compared the safety of the existing manual labour method and the proposed robotic method for manipulating large glass panels. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.
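
    A sketch of how such factor scores might be combined is given below. The multiplicative combination and the variable names are assumptions for illustration; the paper's exact weighting scheme is not reproduced here.

    ```python
    def operational_risk(posture_load, environment_risk, exposure_fraction):
        """Combine the 'worker' factor (posture load score), the 'work
        conditions' factor (environment risk score) and the fraction of task
        time spent exposed to that environment into a single risk figure.
        A multiplicative combination is assumed for illustration."""
        return posture_load * environment_risk * exposure_fraction

    def safety_improvement_ratio(risk_manual, risk_robot):
        """Relative reduction in operational risk when the robot replaces
        manual handling of the glass panels."""
        return (risk_manual - risk_robot) / risk_manual

    # Example: manual risk 0.60 vs robotic risk 0.21 -> 65% improvement.
    ```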

  12. An FMEA evaluation of intensity modulated radiation therapy dose delivery failures at tolerance criteria levels.

    PubMed

    Faught, Jacqueline Tonigan; Balter, Peter A; Johnson, Jennifer L; Kry, Stephen F; Court, Laurence E; Stingo, Francesco C; Followill, David S

    2017-11-01

    scores positively correlated (P < 0.01) for each FM as expected. No universal correlations were found between the demographic information collected and scoring, percent dose errors or ranking. Failure modes investigated overall were evaluated as low to medium risk, with average RPNs less than 110. The ranking of 11 failure modes was not agreed upon by the community. Large variability in FMEA scoring may be caused by individual interpretation and/or experience, reflecting the subjective nature of the FMEA tool. © 2017 American Association of Physicists in Medicine.

  13. Failure Progress of 3D Reinforced GFRP Laminate during Static Bending, Evaluated by Means of Acoustic Emission and Vibrations Analysis.

    PubMed

    Koziol, Mateusz; Figlus, Tomasz

    2015-12-14

    The work aimed to assess the failure progress in a glass fiber-reinforced polymer laminate with a 3D-woven and (as a comparison) plain-woven reinforcement during static bending, using acoustic emission signals. An innovative method was applied to separate the signal coming from fiber fracture from that coming from matrix fracture, using the energy of each acoustic event as the criterion. The failure progress during static bending was alternatively analyzed by evaluation of the vibration signal, which made it possible to validate the results of the acoustic emission. Acoustic emission as well as vibration signal analysis proved to be good and effective tools for registering failure effects in composite laminates. Vibration analysis is more complicated methodologically, yet more precise. The failure progress of the 3D laminate is "safer" and more beneficial than that of the plain-woven laminate: it exhibits less rapid load capacity drops and a higher fiber effort contribution at the moment of the main laminate failure.
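
    The energy criterion itself reduces to a simple threshold rule on each acoustic-emission hit. A sketch, where the threshold value is an assumption (it is material- and setup-specific and not given in this abstract):

    ```python
    def split_ae_events(events, energy_threshold_aj):
        """Separate acoustic emission hits by energy: high-energy events are
        attributed to fiber fracture, low-energy events to matrix cracking.
        `events` is an iterable of (time_s, energy_aJ) tuples."""
        fiber = [(t, e) for t, e in events if e >= energy_threshold_aj]
        matrix = [(t, e) for t, e in events if e < energy_threshold_aj]
        return fiber, matrix

    # Example: with an assumed 100 aJ threshold, a hit of 450 aJ at 12.3 s
    # would be counted as fiber fracture.
    ```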

  14. Evaluation of reference genes for quantitative RT-PCR in Lolium temulentum under abiotic stress

    USDA-ARS?s Scientific Manuscript database

    Lolium temulentum is a valuable model grass species for the study of stress in forage and turf grasses. Gene expression analysis by quantitative real time RT-PCR relies on the use of proper internal standards. The aim of this study was to identify and evaluate reference genes for use in real-time q...

  15. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  16. High-throughput sequencing: a failure mode analysis.

    PubMed

    Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A

    2005-01-04

    Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.

  17. Quantitative Evaluation of Musical Scale Tunings

    ERIC Educational Resources Information Center

    Hall, Donald E.

    1974-01-01

    The acoustical and mathematical basis of the problem of tuning the twelve-tone chromatic scale is reviewed. A quantitative measurement showing how well any tuning succeeds in providing just intonation for any specific piece of music is explained and applied to musical examples using a simple computer program. (DT)

  18. Biomarkers in acute heart failure.

    PubMed

    Mallick, Aditi; Januzzi, James L

    2015-06-01

    The care of patients with acutely decompensated heart failure is being reshaped by the availability and understanding of several novel and emerging heart failure biomarkers. The gold standard biomarkers in heart failure are B-type natriuretic peptide and N-terminal pro-B-type natriuretic peptide, which play an important role in the diagnosis, prognosis, and management of acute decompensated heart failure. Novel biomarkers reflecting the processes of myocardial injury, neurohormonal activation, and ventricular remodeling are showing promise in improving diagnosis and prognosis among patients with acute decompensated heart failure. These include midregional proatrial natriuretic peptide, soluble ST2, galectin-3, high-sensitivity troponin, and midregional proadrenomedullin. There has also been an emergence of biomarkers for evaluation of acute decompensated heart failure that assist in the differential diagnosis of dyspnea, such as procalcitonin (for identification of acute pneumonia), as well as markers that predict complications of acute decompensated heart failure, such as renal injury markers. In this article, we review the pathophysiology and usefulness of established and emerging biomarkers for the clinical diagnosis, prognosis, and management of acute decompensated heart failure. Copyright © 2015 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  19. Evaluation of Quantitative Literacy Series: Exploring Data and Exploring Probability. Program Report 87-5.

    ERIC Educational Resources Information Center

    Day, Roger P.; And Others

    A quasi-experimental design with two experimental groups and one control group was used to evaluate the use of two books in the Quantitative Literacy Series, "Exploring Data" and "Exploring Probability." Group X teachers were those who had attended a workshop on the use of the materials and were using the materials during the…

  20. Mode of action and effects of standardized collaborative disease management on mortality and morbidity in patients with systolic heart failure: the Interdisciplinary Network for Heart Failure (INH) study.

    PubMed

    Angermann, Christiane E; Störk, Stefan; Gelbrich, Götz; Faller, Hermann; Jahns, Roland; Frantz, Stefan; Loeffler, Markus; Ertl, Georg

    2012-01-01

    Trials investigating the efficacy of disease management programs (DMP) in heart failure have reported contradictory results, and the features rendering specific interventions successful are often ill defined. We evaluated the mode of action and effects of a nurse-coordinated DMP (HeartNetCare-HF, HNC). Patients hospitalized for systolic heart failure were randomly assigned to HNC or usual care (UC). Besides telephone-based monitoring and education, HNC addressed individual problems raised by patients, pursued networking of health care providers, and provided training for caregivers. End points were time to death or rehospitalization (combined primary end point), heart failure symptoms, and quality of life (SF-36). Of 1007 consecutive patients, 715 were randomly assigned (HNC: n=352; UC: n=363; age, 69±12 years; 29% female; 40% New York Heart Association class III-IV). Within 180 days, 130 HNC and 137 UC patients reached the primary end point (hazard ratio, 1.02; 95% confidence interval, 0.81-1.30; P=0.89), since more HNC patients were readmitted. Overall, 32 HNC and 52 UC patients died (1 UC patient and 4 HNC patients after dropout); thus, the uncensored hazard ratio was 0.62 (0.40-0.96; P=0.03). HNC patients improved more regarding New York Heart Association class (P=0.05), physical functioning (P=0.03), and the physical health component (P=0.03). Except for HNC itself, health care utilization was comparable between groups. However, HNC patients requested counseling for noncardiac problems even more frequently than for cardiovascular or heart-failure-related issues. The primary end point of this study was neutral. However, mortality risk and surrogates of well-being improved significantly. Quantitative assessment of patient requirements suggested that, besides (tele)monitoring, individualized care that also considers noncardiac problems should be integrated into efforts to achieve more sustainable improvement in heart failure outcomes. URL: http://www.controlled-trials.com. Unique identifier: ISRCTN23325295.

  1. On-Board Particulate Filter Failure Prevention and Failure Diagnostics Using Radio Frequency Sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sappok, Alex; Ragaller, Paul; Herman, Andrew

    The increasing use of diesel and gasoline particulate filters requires advanced on-board diagnostics (OBD) to prevent and detect filter failures and malfunctions. Early detection of upstream (engine-out) malfunctions is paramount to preventing irreversible damage to downstream aftertreatment system components. Such early detection can mitigate the failure of the particulate filter resulting in the escape of emissions exceeding permissible limits and extend the component life. However, despite best efforts at early detection and filter failure prevention, the OBD system must also be able to detect filter failures when they occur. In this study, radio frequency (RF) sensors were used to directly monitor the particulate filter state of health for both gasoline particulate filter (GPF) and diesel particulate filter (DPF) applications. The testing included controlled engine dynamometer evaluations, which characterized soot slip from various filter failure modes, as well as on-road fleet vehicle tests. The results show a high sensitivity to detect conditions resulting in soot leakage from the particulate filter, as well as potential for direct detection of structural failures including internal cracks and melted regions within the filter media itself. Furthermore, the measurements demonstrate, for the first time, the capability to employ a direct and continuous monitor of particulate filter diagnostics to both prevent and detect potential failure conditions in the field.

  2. An analysis of policy success and failure in formal evaluations of Australia's national mental health strategy (1992-2012).

    PubMed

    Grace, Francesca C; Meurk, Carla S; Head, Brian W; Hall, Wayne D; Harris, Meredith G; Whiteford, Harvey A

    2017-05-30

    Heightened fiscal constraints, increases in the chronic disease burden and in consumer expectations are among several factors contributing to the global interest in evidence-informed health policy. The present article builds on previous work that explored how the Australian Federal Government applied five instruments of policy, or policy levers, to implement a series of reforms under the Australian National Mental Health Strategy (NMHS). The present article draws on theoretical insights from political science to analyse the relative successes and failures of these levers, as portrayed in formal government evaluations of the NMHS. Documentary analysis of six evaluation documents corresponding to three National Mental Health Plans was undertaken. Both the content and approach of these government-funded, independently conducted evaluations were appraised. An overall improvement was apparent in the development and application of policy levers over time. However, this finding should be interpreted with caution due to variations in evaluation approach according to Plan and policy lever. Tabulated summaries of the success and failure of each policy initiative, ordered by lever type, are provided to establish a resource that could be consulted for future policy-making. This analysis highlights the complexities of health service reform and underscores the limitations of narrowly focused empirical approaches. A theoretical framework is provided that could inform the evaluation and targeted selection of appropriate policy levers in mental health.

  3. Qualitative and quantitative evaluation of some vocal function parameters following fitting of a prosthesis.

    PubMed

    Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G

    2001-12-01

    The insertion of a prosthesis and restoration with pectoralis major myocutaneous flaps for patients subjected to total pharyngolaryngectomy is a technique now universally accepted; however, the literature on the subject is lacking. Our study considered 10 patients subjected to total pharyngolaryngectomy and restoration with pectoralis major myocutaneous flaps who were fitted with vocal function prostheses, and a control group of 50 subjects treated with a total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. The quantitative measurements of voice intensity levels and the evaluation of the harmonics-to-noise ratio did not differ significantly (p > 0.05) between the two study groups at either high- or low-volume speech. On the contrary, statistically significant differences were found (p < 0.05) for the fundamental frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.

  4. Lungs in Heart Failure

    PubMed Central

    Apostolo, Anna; Giusti, Giuliano; Gargiulo, Paola; Bussotti, Maurizio; Agostoni, Piergiuseppe

    2012-01-01

    Lung function abnormalities both at rest and during exercise are frequently observed in patients with chronic heart failure, even in the absence of respiratory disease. Alterations of respiratory mechanics and of gas exchange capacity are strictly related to heart failure. Severe heart failure patients often show a restrictive respiratory pattern, secondary to heart enlargement and increased lung fluids, and impairment of alveolar-capillary gas diffusion, mainly due to an increased resistance to molecular diffusion across the alveolar-capillary membrane. Reduced gas diffusion contributes to exercise intolerance and to a worse prognosis. The cardiopulmonary exercise test is considered the "gold standard" for studying the cardiovascular, pulmonary, and metabolic adaptations to exercise in cardiac patients. During exercise, hyperventilation and a consequent reduction of ventilation efficiency are often observed in heart failure patients, resulting in an increased slope of the ventilation/carbon dioxide (VE/VCO2) relationship. Ventilatory efficiency is a strong prognostic and an important stratification marker. This paper describes the pulmonary abnormalities at rest and during exercise in patients with heart failure, highlighting the principal diagnostic tools for evaluation of lung function, the possible pharmacological interventions, and the parameters that could be useful in the prognostic assessment of heart failure patients. PMID:23365739
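
    The VE/VCO2 slope mentioned above is obtained by simple linear regression of minute ventilation on carbon dioxide output during incremental exercise. A minimal sketch (the prognostic threshold quoted in the comment is a general convention from the heart failure literature, not a value from this paper):

    ```python
    import numpy as np

    def ve_vco2_slope(vco2_l_min, ve_l_min):
        """Slope of minute ventilation (VE, L/min) regressed on CO2 output
        (VCO2, L/min) across exercise; higher slopes indicate reduced
        ventilatory efficiency (values above roughly 34-36 are commonly
        considered prognostically unfavourable in heart failure)."""
        slope, _intercept = np.polyfit(np.asarray(vco2_l_min, dtype=float),
                                       np.asarray(ve_l_min, dtype=float), 1)
        return slope
    ```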

  5. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, little is known about how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out a systematic comparative evaluation of these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions that are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. We have shown that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
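
    The scoring idea underlying such maps can be made concrete with the standard multiplicative null model, where the interaction score is the deviation of the measured double-mutant fitness from the product of the single-mutant fitnesses. The sketch below shows only this textbook formulation, not the paper's matrix approximation procedure:

    ```python
    import numpy as np

    def interaction_scores(double_fitness, single_fitness):
        """Multiplicative-model genetic interaction scores:
        eps[i, j] = W[i, j] - w[i] * w[j], where W is the measured
        double-mutant fitness matrix and w the single-mutant fitnesses.
        Negative eps suggests synthetic sickness; positive eps, alleviation."""
        w = np.asarray(single_fitness, dtype=float)
        expected = np.outer(w, w)          # rank-1 neutral-expectation matrix
        return np.asarray(double_fitness, dtype=float) - expected

    # Example: w = [0.9, 0.8] and a measured double-mutant fitness of 0.40
    # give eps = 0.40 - 0.72 = -0.32, a negative (synthetic sick) interaction.
    ```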

  6. Morphologic Risk Factors in Predicting Symptomatic Structural Failure of Arthroscopic Rotator Cuff Repairs: Tear Size, Location, and Atrophy Matter.

    PubMed

    Gasbarro, Gregory; Ye, Jason; Newsome, Hillary; Jiang, Kevin; Wright, Vonda; Vyas, Dharmesh; Irrgang, James J; Musahl, Volker

    2016-10-01

    To evaluate whether morphologic characteristics of rotator cuff tears have prognostic value in determining symptomatic structural failure of arthroscopic rotator cuff repair independent of age or gender. Arthroscopic rotator cuff repair cases performed by five fellowship-trained surgeons at our institution from 2006 to 2013 were retrospectively reviewed. Data extraction included demographics, comorbidities, repair technique, clinical examination, and radiographic findings. Failure in symptomatic patients was defined as a structural defect on postoperative magnetic resonance imaging or pseudoparalysis on examination. Failures were age and gender matched with successful repairs in a 1:2 ratio. A total of 30 failures and 60 controls were identified. Supraspinatus atrophy (P = .03) and tear size (18.3 mm failures v 13.9 mm controls; P = .02) were significant risk factors for failure, as was the presence of an infraspinatus tear greater than 10 mm (62% v 17%, P < .01). Single-row repair (P = .06) and simple suture configurations (P = .17) were more frequent among failures but not significantly different between groups. Diabetes mellitus and active tobacco use were not significantly associated with increased failure risk, but psychiatric medication use was more frequent in the failure group. This study confirms previous suspicions that tear size and fatty infiltration are associated with failure of arthroscopic rotator cuff repair, independent of age and gender, in symptomatic patients. There is also a quantitative cutoff on magnetic resonance imaging for the size of infraspinatus involvement that can be used clinically as a predicting factor. Although reported in the literature, smoking and diabetes were not associated with failure. Level III, retrospective case control. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  7. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
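
    Conceptually, crossing a fragility curve with a hazard curve amounts to summing failure probabilities over discretized intensity bins. A minimal sketch with a lognormal fragility; the site, the fragility parameters, and the bin rates are all illustrative assumptions, not values from the paper:

    ```python
    import numpy as np
    from scipy.stats import norm

    def annual_failure_probability(pga_g, bin_rates, median_g, beta):
        """Sum over intensity bins of P(failure | PGA) times the annual rate
        of ground motions in that bin; the fragility is lognormal with a
        median capacity `median_g` (g) and log-standard deviation `beta`."""
        pga = np.asarray(pga_g, dtype=float)
        p_fail = norm.cdf(np.log(pga / median_g) / beta)   # fragility curve
        return float(np.sum(p_fail * np.asarray(bin_rates, dtype=float)))

    # Illustrative tank fragility and coarse hazard discretization:
    print(annual_failure_probability([0.1, 0.2, 0.3, 0.5],
                                     [2e-2, 5e-3, 1e-3, 2e-4],
                                     median_g=0.4, beta=0.5))
    ```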

  8. Malnutrition and Cachexia in Heart Failure.

    PubMed

    Rahman, Adam; Jafry, Syed; Jeejeebhoy, Khursheed; Nagpal, A Dave; Pisani, Barbara; Agarwala, Ravi

    2016-05-01

    Heart failure is a growing public health concern. Advanced heart failure is frequently associated with severe muscle wasting, termed cardiac cachexia. This process is driven by systemic inflammation and tumor necrosis factor in a manner common to other forms of disease-related wasting seen with cancer or human immunodeficiency virus. A variable degree of malnutrition is often superimposed from poor nutrient intake. Cardiac cachexia significantly decreases quality of life and survival in patients with heart failure. This review outlines the evaluation of nutrition status in heart failure, explores the pathophysiology of cardiac cachexia, and discusses therapeutic interventions targeting wasting in these patients. © 2015 American Society for Parenteral and Enteral Nutrition.

  9. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed, and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays, which could assist assay development and validation activities.
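
    By analogy with PCR efficiency, a doubling-time metric can be estimated from the exponential phase of a real-time amplification curve. The sketch below fits log2(signal) against time over a crude window; both the window heuristic and this exact definition of IDT are assumptions here, not necessarily the paper's formulation:

    ```python
    import numpy as np

    def isothermal_doubling_time(t_min, signal):
        """Fit log2(signal) vs time over a crude exponential-phase window
        (10-60% of maximum signal) and return minutes per doubling. At
        least two points must fall inside the window."""
        t = np.asarray(t_min, dtype=float)
        s = np.asarray(signal, dtype=float)
        mask = (s > 0.1 * s.max()) & (s < 0.6 * s.max())
        slope, _intercept = np.polyfit(t[mask], np.log2(s[mask]), 1)
        return 1.0 / slope   # minutes per doubling of the fluorescence signal
    ```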

  10. Evaluation of chronic kidney disease in chronic heart failure: From biomarkers to arterial renal resistances

    PubMed Central

    Iacoviello, Massimo; Leone, Marta; Antoncecchi, Valeria; Ciccone, Marco Matteo

    2015-01-01

    Chronic kidney disease and its worsening are recurring conditions in chronic heart failure (CHF) which are independently associated with poor patient outcome. The heart and kidney share many pathophysiological mechanisms which can determine dysfunction in each organ. Cardiorenal syndrome is the condition in which these two organs negatively affect each other; therefore, an accurate evaluation of renal function in the clinical setting of CHF is essential. This review aims to examine the parameters currently used to evaluate renal dysfunction in CHF, with particular reference to the usefulness and limitations of biomarkers in evaluating glomerular dysfunction and tubular damage. Moreover, the possible utility of the renal arterial resistance index (a parameter associated with abnormalities in the renal vascular bed) for a better assessment of kidney dysfunction is discussed. PMID:25610846

  11. Fluid removal in acute heart failure: diuretics versus devices.

    PubMed

    Krishnamoorthy, Arun; Felker, G Michael

    2014-10-01

    Fluid removal and relief of congestion are central to treatment of acute heart failure. Diuretics have been the mainstay of decongestion, but their known limitations have led to the exploration of alternative strategies. This review compares diuretics with ultrafiltration and examines the recent evidence evaluating their use. Relevant recent studies are the Diuretic Optimization Strategies Evaluation trial (of diuretics) and the Cardiorenal Rescue Study in Acute Decompensated Heart Failure (of ultrafiltration). The Diuretic Optimization Strategies Evaluation study evaluated strategies of loop diuretic use during acute heart failure (continuous infusion versus intermittent bolus, and high dose versus low dose). After 72 h, there was no significant difference in either comparison for the coprimary end points. Patients treated with a high-dose strategy tended to have greater diuresis and more decongestion compared with low-dose therapy, at the cost of transient changes in renal function. The Cardiorenal Rescue Study in Acute Decompensated Heart Failure showed that in acute heart failure patients with persistent congestion and worsening renal function, ultrafiltration, as compared with medical therapy, was associated with similar weight loss but a greater increase in serum creatinine and more adverse events. Decongestion remains a major challenge in acute heart failure. Although recent studies provide useful data to guide practice, the relatively poor outcomes point to the continued need to identify better strategies for safe and effective decongestion.

  12. Diffusion tensor imaging with quantitative evaluation and fiber tractography of lumbar nerve roots in sciatica.

    PubMed

    Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang

    2015-04-01

    To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, to visualize nerve roots by tractography, and to compare the diagnostic efficacy of conventional magnetic resonance imaging (MRI) and DTI. Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for the L5-S1 lumbar nerve roots were calculated at three levels from the DTI images. Tractography was performed on the L3-S1 nerve roots. ROC analysis was performed for the FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at distal levels. DTI can quantitatively evaluate compressed nerve roots, and diffusion tensor tractography (DTT) enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
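
    FA, the quantitative parameter used throughout the study, is computed from the eigenvalues of the diffusion tensor. A minimal sketch of the standard formula (the example eigenvalues are illustrative, not values from the paper):

    ```python
    import numpy as np

    def fractional_anisotropy(eigenvalues):
        """Standard DTI fractional anisotropy from the three eigenvalues of
        the diffusion tensor; ranges from 0 (isotropic) to 1 (fully
        anisotropic). Compression is reported to lower FA along the nerve."""
        ev = np.asarray(eigenvalues, dtype=float)
        md = ev.mean()                               # mean diffusivity
        return np.sqrt(1.5 * ((ev - md) ** 2).sum() / (ev ** 2).sum())

    # Example: a well-ordered root such as (1.6, 0.4, 0.4) um^2/ms gives
    # FA ~ 0.71, while a nearly isotropic (0.9, 0.8, 0.7) gives FA ~ 0.12.
    ```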

  13. Psychometric Evaluation of Two Appetite Questionnaires in Patients With Heart Failure.

    PubMed

    Andreae, Christina; Strömberg, Anna; Sawatzky, Richard; Årestedt, Kristofer

    2015-12-01

    Decreased appetite in heart failure (HF) may lead to undernutrition, which could negatively influence prognosis. Appetite is a complex clinical issue that is often best measured with the use of self-report instruments. However, there is a lack of self-rated appetite instruments. The Council on Nutrition Appetite Questionnaire (CNAQ) and the Simplified Nutritional Appetite Questionnaire (SNAQ) are validated instruments developed primarily for elderly people. Yet their psychometric properties have not been evaluated in HF populations. The aim of the present study was to evaluate the psychometric properties of the CNAQ and SNAQ in patients with HF. A total of 186 outpatients with reduced ejection fraction and New York Heart Association (NYHA) functional classes II-IV were included (median age 72 y; 70% men). Data were collected with the use of a questionnaire that included the CNAQ and SNAQ. The psychometric evaluation included data quality, factor structure, construct validity, known-group validity, and internal consistency. Unidimensionality was supported by means of parallel analysis and confirmatory factor analyses (CFAs). The CFA results indicated sufficient model fit. Both construct validity and known-group validity were supported. Internal consistency reliability was acceptable, with ordinal coefficient alpha estimates of 0.82 for the CNAQ and 0.77 for the SNAQ. The CNAQ and SNAQ demonstrated sound psychometric properties and can be used to measure appetite in patients with HF. Copyright © 2015 Elsevier Inc. All rights reserved.
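
    For reference, the internal consistency of a questionnaire scale is conventionally summarized by coefficient alpha. The sketch below computes the classical Pearson-based alpha; note the paper reports *ordinal* alpha (based on a polychoric correlation matrix), for which this is only an approximation:

    ```python
    import numpy as np

    def cronbach_alpha(item_scores):
        """Classical coefficient alpha for an (n_respondents, k_items) array:
        alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
        X = np.asarray(item_scores, dtype=float)
        k = X.shape[1]
        item_var_sum = X.var(axis=0, ddof=1).sum()
        total_var = X.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var_sum / total_var)
    ```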

  14. Evaluation of road failure vulnerability section through integrated geophysical and geotechnical studies

    NASA Astrophysics Data System (ADS)

    Adiat, K. A. N.; Akinlalu, A. A.; Adegoroye, A. A.

    2017-06-01

    In order to investigate the competence of the proposed road for pavement stability, geotechnical and geophysical investigations involving Land Magnetic, Very Low Frequency Electromagnetic (VLF-EM) and Electrical Resistivity methods were carried out along the Akure-Ipinsa road, Southwestern Nigeria. The magnetic profile was qualitatively and quantitatively interpreted to produce a geomagnetic section that provides information on the basement topography and structural disposition beneath the proposed road. Similarly, the VLF-EM profile was interpreted to provide information on the possible occurrence of linear features beneath the study area. These linear features pose a potential risk to the proposed road as they are capable of undermining the stability of the pavement structure. The geoelectric parameters obtained from the quantitative interpretation of the vertical electrical sounding (VES) data were used to generate a geoelectric section. The geoelectric section shows that the study area is underlain by four geoelectric layers, namely the topsoil, the weathered layer, the partly weathered/fractured basement and the fresh basement. The major part of the topsoil, which constitutes the subgrade, is characterized by relatively low resistivity values (<100 Ωm), suggestive of weak zones that are capable of undermining the stability of the proposed road. This therefore suggests that the layer is composed of incompetent materials that are unsuitable for engineering structures. Furthermore, fractured basement was also delineated beneath some portions of the proposed road. Since a fracture is a weak zone, its presence can facilitate failure of the proposed road, especially when it occurs at shallow depth. The geotechnical results reveal that most of the investigated soil samples are clayey in nature. Integration of the results demonstrates that there is a good correlation between the geophysical and the geotechnical results. Furthermore, a vulnerability section that divided the road

  15. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Areas under the receiver operating characteristic curve (AUROC) were used to evaluate the diagnostic performance of both qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P < 0.001), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis obtained the best diagnostic performance in this study (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time thus showed superior diagnostic performance for the differentiation of breast lesions in comparison with conventional quantitative analysis.
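    A sketch of how such AUROC comparisons are typically computed (synthetic SWS values and labels generated below; these are not the study data):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Hypothetical data: 1 = malignant, 0 = benign; SWS-max in m/s per lesion
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 2, size=266)
    sws_max = np.where(labels == 1,
                       rng.normal(6.5, 1.5, 266),   # stiffer malignant lesions
                       rng.normal(4.0, 1.5, 266))   # softer benign lesions

    auc = roc_auc_score(labels, sws_max)
    fpr, tpr, thresholds = roc_curve(labels, sws_max)
    best = np.argmax(tpr - fpr)  # Youden index gives an operating cut-off
    print(f"AUROC = {auc:.3f}, suggested cut-off = {thresholds[best]:.2f} m/s")
    ```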

  16. Failure-probability driven dose painting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (PET positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose–response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of
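    A toy numerical sketch of the underlying idea, not the authors' fitted model: total TCP is taken as the product of per-compartment logistic dose–response curves, so redistributing dose toward compartments where failures concentrate can raise the product at constant mean dose. All D50 values, the slope, and the dose vectors below are invented for illustration.

    ```python
    import numpy as np

    def tcp_logistic(d, d50, gamma50):
        """Logistic tumor-control dose-response: TCP at uniform dose d (Gy)."""
        return 1.0 / (1.0 + (d50 / d) ** (4.0 * gamma50))

    # Five hypothetical compartments, centre -> periphery; parameters illustrative
    d50 = np.array([78.0, 72.0, 66.0, 55.0, 50.0])  # dose for 50% control, Gy
    gamma50 = 2.0                                   # normalized slope

    uniform = np.full(5, 68.0)                          # uniform prescription
    painted = np.array([82.0, 76.0, 68.0, 60.0, 54.0])  # redistributed, same mean

    for name, dose in (("uniform", uniform), ("painted", painted)):
        tcp = np.prod(tcp_logistic(dose, d50, gamma50))  # product over compartments
        print(f"{name}: mean dose {dose.mean():.1f} Gy, total TCP {tcp:.2f}")
    ```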

  17. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... Evaluation and Research (CBER) and suggestions for further development. The public workshop will include... Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...

  18. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates, step by step, the process the interface uses with an example.

  19. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates, step by step, the process the interface uses with an example.

  20. Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure

    NASA Astrophysics Data System (ADS)

    Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak

    2017-09-01

    The study of RR-interval time series in congestive heart failure has been an active area, using a range of methods including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure; 29 subjects) and normal (54 subjects), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased subjects and the normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences appear. This quantitative parameter obtained using visibility graph analysis can thereby be used as a potential bio-marker, as well as a subsequent alarm-generation mechanism, for predicting the onset of congestive heart failure.
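    The natural visibility graph construction referenced here is simple to state and implement; a minimal sketch follows (toy RR series; a naive O(n³) loop is used for clarity, real records would be hour-long Physionet segments):

    ```python
    import numpy as np

    def natural_visibility_degrees(y):
        """Degree sequence of the natural visibility graph of a time series.

        Points (a, y[a]) and (b, y[b]) are linked iff every intermediate sample
        lies strictly below the straight line joining them (Lacasa et al., 2008).
        """
        n = len(y)
        degree = np.zeros(n, dtype=int)
        for a in range(n):
            for b in range(a + 1, n):
                # Visible if y[c] < y[a] + (y[b]-y[a])*(c-a)/(b-a) for all a<c<b
                visible = all(
                    y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                    for c in range(a + 1, b)
                )
                if visible:
                    degree[a] += 1
                    degree[b] += 1
        return degree

    # Toy RR-interval series (seconds)
    rr = np.array([0.81, 0.79, 0.85, 0.78, 0.90, 0.82, 0.76, 0.88])
    deg = natural_visibility_degrees(rr)
    print(deg, deg.mean())  # summary statistics of the degree sequence
    ```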

  1. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  2. Quantitative evaluation of learning and memory trace in studies of mnemotropic effects of immunotropic drugs.

    PubMed

    Kiseleva, N M; Novoseletskaya, A V; Voevodina, Ye B; Kozlov, I G; Inozemtsev, A N

    2012-12-01

    Apart from restoration of disordered immunological parameters, tactivin and derinat exhibit a pronounced effect on the higher integrative functions of the brain. Experiments on Wistar rats have shown that these drugs accelerated conditioning of food and defense responses. New methods for quantitative evaluation of memory trace consolidation are proposed.

  3. Evaluating Failures and Near Misses in Human Spaceflight History for Lessons for Future Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Barr, Stephanie

    2009-01-01

    There have been a number of past studies drawing on lessons learned from human loss-of-life events. Generally, the systemic causes and proximate causes of fatal events have both been examined in considerable detail. However, an examination of near-fatal accidents and failures that narrowly missed being fatal could be equally useful, not only in detecting causes, both proximate and systemic, but also for determining what factors averted disaster and what design decisions and/or operator actions prevented catastrophe. Additionally, reviews of risk factors for upcoming or future programs will often look at trending statistics, generally focusing on failure/success statistics. Unfortunately, doing so can give a skewed or misleading view of past reliability, or a reliability that cannot be presumed to apply to a new program. One reason for this might be that failure/success criteria aren't the same across programs, but also apparent success can hide systemic faults that, under other circumstances, can be fatal to a program with different parameters. A program with a number of near misses can look more reliable than a consistently healthy program with a single out-of-family failure, and provide very misleading data if it is not examined in detail. This is particularly true for a manned space program, where failure/success includes more than making a particular orbit. Augmenting reliability evaluations with this near-miss data can provide insight and expand on the limitations of a strictly pass/fail evaluation. Even more importantly, a thorough understanding of these near-miss events can identify conditions that prevented fatalities. Those conditions may be key to a program's reliability, but, without insight into the repercussions if such conditions were not in place, their importance may not be readily clear. As programs mature and political and fiscal responsibilities come to the fore, often there is considerable incentive to eliminate unnecessary

  4. How is success or failure in river restoration projects evaluated? Feedback from French restoration projects.

    PubMed

    Morandi, Bertrand; Piégay, Hervé; Lamouroux, Nicolas; Vaudor, Lise

    2014-05-01

    Since the 1990s, French operational managers and scientists have been involved in the environmental restoration of rivers. The European Water Framework Directive (2000) highlights the need for feedback from restoration projects and for evidence-based evaluation of success. Based on 44 French pilot projects that included such an evaluation, the present study includes: 1) an introduction to restoration projects based on their general characteristics; 2) a description of evaluation strategies and authorities in charge of their implementation; and 3) a focus on the evaluation of results and the links between these results and evaluation strategies. The results show that: 1) the quality of an evaluation strategy often remains too poor to understand well the link between a restoration project and ecological changes; 2) in many cases, the conclusions drawn are contradictory, making it difficult to determine the success or failure of a restoration project; and 3) the projects with the poorest evaluation strategies generally have the most positive conclusions about the effects of restoration. Recommendations are that evaluation strategies should be designed early in the project planning process and be based on clearly-defined objectives. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. A combined pulmonary-radiology workshop for visual evaluation of COPD: study design, chest CT findings and concordance with quantitative evaluation.

    PubMed

    Barr, R Graham; Berkowitz, Eugene A; Bigazzi, Francesca; Bode, Frederick; Bon, Jessica; Bowler, Russell P; Chiles, Caroline; Crapo, James D; Criner, Gerard J; Curtis, Jeffrey L; Dass, Chandra; Dirksen, Asger; Dransfield, Mark T; Edula, Goutham; Erikkson, Leif; Friedlander, Adam; Galperin-Aizenberg, Maya; Gefter, Warren B; Gierada, David S; Grenier, Philippe A; Goldin, Jonathan; Han, MeiLan K; Hanania, Nicola A; Hansel, Nadia N; Jacobson, Francine L; Kauczor, Hans-Ulrich; Kinnula, Vuokko L; Lipson, David A; Lynch, David A; MacNee, William; Make, Barry J; Mamary, A James; Mann, Howard; Marchetti, Nathaniel; Mascalchi, Mario; McLennan, Geoffrey; Murphy, James R; Naidich, David; Nath, Hrudaya; Newell, John D; Pistolesi, Massimo; Regan, Elizabeth A; Reilly, John J; Sandhaus, Robert; Schroeder, Joyce D; Sciurba, Frank; Shaker, Saher; Sharafkhaneh, Amir; Silverman, Edwin K; Steiner, Robert M; Strange, Charlton; Sverzellati, Nicola; Tashjian, Joseph H; van Beek, Edwin J R; Washington, Lacey; Washko, George R; Westney, Gloria; Wood, Susan A; Woodruff, Prescott G

    2012-04-01

    The purposes of this study were: to describe chest CT findings in normal non-smoking controls and cigarette smokers with and without COPD; to compare the prevalence of CT abnormalities with severity of COPD; and to evaluate concordance between visual and quantitative chest CT (QCT) scoring. Volumetric inspiratory and expiratory CT scans of 294 subjects, including normal non-smokers, smokers without COPD, and smokers with GOLD Stage I-IV COPD, were scored at a multi-reader workshop using a standardized worksheet. There were 58 observers (33 pulmonologists, 25 radiologists); each scan was scored by 9-11 observers. Interobserver agreement was calculated using the kappa statistic. The median score of visual observations was compared with QCT measurements. Interobserver agreement was moderate for the presence or absence of emphysema and for the presence of panlobular emphysema; fair for the presence of centrilobular, paraseptal, and bullous emphysema subtypes and for the presence of bronchial wall thickening; and poor for gas trapping, centrilobular nodularity, mosaic attenuation, and bronchial dilation. Agreement was similar for radiologists and pulmonologists. The prevalence on CT readings of most abnormalities (e.g. emphysema, bronchial wall thickening, mosaic attenuation, expiratory gas trapping) increased significantly with greater COPD severity, while the prevalence of centrilobular nodularity decreased. Concordance between visual scoring and quantitative scoring of emphysema, gas trapping and airway wall thickening was 75%, 87% and 65%, respectively. Despite substantial inter-observer variation, visual assessment of chest CT scans in cigarette smokers provides information regarding lung disease severity; visual scoring may be complementary to quantitative evaluation.
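    For intuition, a minimal two-reader agreement sketch using Cohen's kappa (the study pooled 9-11 readers per scan, for which a multi-rater statistic such as Fleiss' kappa would be the natural choice; the ratings below are invented):

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical: two readers scoring presence (1) / absence (0) of emphysema
    reader_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    reader_b = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]

    # Kappa corrects raw agreement for chance: 1.0 = perfect, 0 = chance-level
    print(cohen_kappa_score(reader_a, reader_b))  # ~0.58, i.e. moderate agreement
    ```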

  6. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    USDA-ARS?s Scientific Manuscript database

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  7. Evaluation of Fuzzy Rulemaking for Expert Systems for Failure Detection

    NASA Technical Reports Server (NTRS)

    Laritz, F.; Sheridan, T. B.

    1984-01-01

    Computer aids in expert systems were proposed to diagnose failures in complex systems. It is shown that the fuzzy set theory of Zadeh offers a new perspective for modeling human thinking and language use. It is assumed that real expert human operators of aircraft, power plants and other systems do not think of their control tasks or failure-diagnosis tasks in terms of control laws in differential equation form, but rather keep in mind a set of rules of thumb in fuzzy form. Fuzzy set experiments are described.
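    A minimal sketch of a fuzzy "rule of thumb" of the kind described, using triangular membership functions and min for conjunction, as in classic fuzzy logic (the rule, variables, and thresholds are invented for illustration):

    ```python
    def tri(x, a, b, c):
        """Triangular fuzzy membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    # Hypothetical rule: IF temperature is high AND pressure is low
    # THEN coolant-leak likelihood is high  (AND = min)
    temp, pressure = 540.0, 1.8
    mu_temp_high = tri(temp, 450, 550, 650)      # membership "temperature is high"
    mu_press_low = tri(pressure, 0.0, 1.5, 3.0)  # membership "pressure is low"
    leak_likelihood = min(mu_temp_high, mu_press_low)
    print(f"rule activation = {leak_likelihood:.2f}")  # 0.80 for these inputs
    ```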

  8. Evaluation of the appropriateness of the preclinical phase (stage A and stage B) of heart failure Management in Outpatient clinics in Italy rationale and design of the 'VASTISSIMO' study.

    PubMed

    Mureddu, Gian F; Nistri, Stefano; Faggiano, Pompilio; Fimiani, Biagio; Misuraca, Gianfranco; Maggi, Antonio; Gori, Anna M; Uguccioni, Massimo; Tavazzi, Luigi; Zito, Giovanni B

    2016-07-01

    Early detection of heart failure, when still preclinical, is fundamental. Therefore, it is important to assess whether preclinical heart failure management by cardiologists is adequate. The VASTISSIMO study ('EValuation of the AppropriateneSs of The preclInical phase (Stage A and Stage B) of heart failure Management in Outpatient clinics in Italy') is a prospective nationwide study aimed to evaluate the appropriateness of diagnosis and management of preclinical heart failure (stages A and B) by cardiologists working in outpatient clinics in Italy. Secondary goals are to verify if an online educational course for cardiologists can improve management of preclinical heart failure, and evaluate how well cardiologists are aware of patients' adherence to medications. The study involves 80 outpatient cardiology clinics distributed throughout Italy, affiliated either to the Hospital Cardiologists Association or to the Regional Association of Outpatient Cardiologists, and is designed with two phases of consecutive outpatient enrolment each lasting 1 month. In phase 1, physicians' awareness of the risk of heart failure and their decision-making process are recorded. Subsequently, half of the cardiologists are randomized to undergo an online educational course aimed to improve preclinical heart failure management through implementation of guideline recommendations. At the end of the course, all cardiologists are evaluated (phase 2) to see whether changes in clinical management have occurred in those who underwent the educational program versus those who did not. Patients' adherence to prescribed medications will be assessed through the Morisky Self-report Questionnaire. This study should provide valuable information about cardiologists' awareness of preclinical heart failure and the appropriateness of clinical practice in outpatient cardiology clinics in Italy.

  9. Management of Arrhythmias in Heart Failure

    PubMed Central

    Masarone, Daniele; Limongelli, Giuseppe; Rubino, Marta; Valente, Fabio; Vastarella, Rossella; Ammendola, Ernesto; Gravino, Rita; Verrengia, Marina; Salerno, Gemma; Pacileo, Giuseppe

    2017-01-01

    Heart failure patients are predisposed to develop arrhythmias. Supraventricular arrhythmias can exacerbate heart failure symptoms by decreasing the effective cardiac output, and their control requires pharmacological, electrical, or catheter-based intervention. In the setting of atrial flutter or atrial fibrillation, anticoagulation becomes paramount to prevent systemic or cerebral embolism. Patients with heart failure are also prone to develop ventricular arrhythmias that can present a challenge to the managing clinician. The management strategy depends on the type of arrhythmia, the underlying structural heart disease, and the severity of heart failure, and ranges from optimization of heart failure therapy to catheter ablation. Patients with heart failure, irrespective of ejection fraction, are at high risk for sudden cardiac death; however, risk stratification is a clinical challenge and requires a multiparametric evaluation to identify patients who should undergo implantation of a cardioverter defibrillator. Finally, patients with heart failure can also develop symptomatic bradycardia, caused by sinus node dysfunction or atrio-ventricular block. The treatment of bradycardia in these patients with pacing is usually straightforward but raises some specific issues. PMID:29367535

  10. Periodontitis in Chronic Heart Failure.

    PubMed

    Fröhlich, Hanna; Herrmann, Kristina; Franke, Jennifer; Karimi, Alamara; Täger, Tobias; Cebola, Rita; Katus, Hugo A; Zugck, Christian; Frankenstein, Lutz

    2016-08-01

    Periodontal disease has been associated with an increased risk of cardiovascular events. The purpose of our study was to investigate whether a correlation between periodontitis and chronic heart failure exists, as well as the nature of the underlying cause. We enrolled 71 patients (mean age, 54 ± 13 yr; 56 men) who had stable chronic heart failure; all underwent complete cardiologic and dental evaluations. The periodontal screening index was used to quantify the degree of periodontal disease. We compared the findings to those in the general population with use of data from the 4th German Dental Health Survey. Gingivitis, moderate periodontitis, and severe periodontitis were present in 17 (24%), 17 (24%), and 37 (52%) patients, respectively. Severe periodontitis was more prevalent among chronic heart failure patients than in the general population. In contrast, moderate periodontitis was more prevalent in the general population (P <0.00001). The severity of periodontal disease was not associated with the cause of chronic heart failure or the severity of heart failure symptoms. Six-minute walking distance was the only independent predictor of severe periodontitis. Periodontal disease is highly prevalent in chronic heart failure patients regardless of the cause of heart failure. Prospective trials are warranted to clarify the causal relationship between both diseases.

  11. Periodontitis in Chronic Heart Failure

    PubMed Central

    Fröhlich, Hanna; Herrmann, Kristina; Franke, Jennifer; Karimi, Alamara; Täger, Tobias; Cebola, Rita; Katus, Hugo A.; Zugck, Christian

    2016-01-01

    Periodontal disease has been associated with an increased risk of cardiovascular events. The purpose of our study was to investigate whether a correlation between periodontitis and chronic heart failure exists, as well as the nature of the underlying cause. We enrolled 71 patients (mean age, 54 ± 13 yr; 56 men) who had stable chronic heart failure; all underwent complete cardiologic and dental evaluations. The periodontal screening index was used to quantify the degree of periodontal disease. We compared the findings to those in the general population with use of data from the 4th German Dental Health Survey. Gingivitis, moderate periodontitis, and severe periodontitis were present in 17 (24%), 17 (24%), and 37 (52%) patients, respectively. Severe periodontitis was more prevalent among chronic heart failure patients than in the general population. In contrast, moderate periodontitis was more prevalent in the general population (P <0.00001). The severity of periodontal disease was not associated with the cause of chronic heart failure or the severity of heart failure symptoms. Six-minute walking distance was the only independent predictor of severe periodontitis. Periodontal disease is highly prevalent in chronic heart failure patients regardless of the cause of heart failure. Prospective trials are warranted to clarify the causal relationship between both diseases. PMID:27547136

  12. Fidelity Failures in Brief Strategic Family Therapy for Adolescent Drug Abuse: A Clinical Analysis.

    PubMed

    Lebensohn-Chialvo, Florencia; Rohrbaugh, Michael J; Hasler, Brant P

    2018-04-30

    As evidence-based family treatments for adolescent substance use and conduct problems gain traction, cutting-edge research moves beyond randomized efficacy trials to address questions such as how these treatments work and how best to disseminate them to community settings. A key factor in effective dissemination is treatment fidelity, which refers to implementing an intervention in a manner consistent with an established manual. While most fidelity research is quantitative, this study offers a qualitative clinical analysis of fidelity failures in a large, multisite effectiveness trial of Brief Strategic Family Therapy (BSFT) for adolescent drug abuse, in which BSFT developers trained community therapists to administer the intervention in their own agencies. Using case notes and video recordings of therapy sessions, an independent expert panel first rated 103 cases on quantitative fidelity scales grounded in the BSFT manual and the broader structural-strategic framework that informs BSFT intervention. Because fidelity was generally low, the panel reviewed all cases qualitatively to identify emergent types or categories of fidelity failure. Ten categories of failure emerged, characterized by therapist omissions (e.g., failure to engage key family members, failure to think in threes) and commissions (e.g., off-model, nonsystemic formulations/interventions). Of these, "failure to think in threes" appeared basic and particularly problematic, reflecting the central place of this idea in structural theory and therapy. Although subject to possible bias, our observations highlight likely stumbling blocks in exporting a complex family treatment like BSFT to community settings. These findings also underscore the importance of treatment fidelity in family therapy research. © 2018 Family Process Institute.

  13. Global left atrial failure in heart failure.

    PubMed

    Triposkiadis, Filippos; Pieske, Burkert; Butler, Javed; Parissis, John; Giamouzis, Gregory; Skoularigis, John; Brutsaert, Dirk; Boudoulas, Harisios

    2016-11-01

    The left atrium plays an important role in the maintenance of cardiovascular and neurohumoral homeostasis in heart failure. However, with progressive left ventricular dysfunction, left atrial (LA) dilation and mechanical failure develop, which frequently culminate in atrial fibrillation. Moreover, LA mechanical failure is accompanied by LA endocrine failure [deficient atrial natriuretic peptide (ANP) processing-synthesis/development of ANP resistance] and LA regulatory failure (dominance of sympathetic nervous system excitatory mechanisms, excessive vasopressin release), contributing to neurohumoral overactivity, vasoconstriction, and volume overload (global LA failure). The purpose of the present review is to describe the characteristics and emphasize the clinical significance of global LA failure in patients with heart failure. © 2016 The Authors. European Journal of Heart Failure © 2016 European Society of Cardiology.

  14. POF-Darts: Geometric adaptive sampling for probability of failure

    DOE PAGES

    Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.; ...

    2016-06-18

    We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to failure or non-failure regions, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, regions uncovered with spheres will shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction one, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
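    POF-Darts itself involves disk-packing and surrogate construction; as a point of reference, here is the crude Monte Carlo estimator that such methods are designed to outperform (the limit-state function and input distribution below are invented toy choices):

    ```python
    import numpy as np

    def mc_probability_of_failure(limit_state, sampler, n=100_000, seed=0):
        """Crude Monte Carlo POF: fraction of samples with limit_state(x) < 0.

        POF-Darts spends the same evaluation budget far more efficiently by
        covering safe/failure regions with spheres and fitting a surrogate;
        this brute-force estimator is only the baseline for comparison.
        """
        rng = np.random.default_rng(seed)
        x = sampler(rng, n)
        g = limit_state(x)                # g < 0 marks the failure region
        return np.mean(g < 0.0)

    # Toy 2-D problem: failure when x1 + x2 exceeds a threshold of 5
    limit_state = lambda x: 5.0 - (x[:, 0] + x[:, 1])
    sampler = lambda rng, n: rng.normal(0.0, 1.5, size=(n, 2))  # uncertain inputs
    print(mc_probability_of_failure(limit_state, sampler))      # ~0.009
    ```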

  15. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases like heart infarction or stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% of lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
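    The spectral angle mapper step reduces to a per-pixel angle between a measured spectrum and a reference spectrum; a minimal sketch (the three-band reflectance values and the classification threshold below are hypothetical):

    ```python
    import numpy as np

    def spectral_angle(pixel, reference):
        """Spectral angle (radians) between a measured and a reference spectrum."""
        cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    # Hypothetical reflectances at the 1150/1200/1300 nm bands
    lipid_ref = np.array([0.62, 0.35, 0.58])  # lipid absorbs near 1200 nm
    pixel = np.array([0.60, 0.38, 0.55])
    theta = spectral_angle(pixel, lipid_ref)
    # Small angle = spectrally similar to the lipid reference
    print("lipid" if theta < 0.1 else "non-lipid", f"(angle = {theta:.3f} rad)")
    ```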

  16. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    PubMed

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and produced acceptable orders. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
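    A brute-force toy version of the ordering criterion, under simplifying assumptions (invented unit names and transfer counts, all migration intervals weighted equally; the paper's actual graphical model is richer):

    ```python
    from itertools import permutations

    # Hypothetical transfer counts: flow[u][v] = patients moving from unit u to v
    units = ["ED", "ICU", "Ward", "Discharge lounge"]
    flow = [
        [0, 40, 120, 10],
        [5, 0, 60, 2],
        [2, 15, 0, 90],
        [0, 0, 3, 0],
    ]

    def new_to_old_cost(order):
        """Transfers crossing from an already-migrated unit to a not-yet-migrated one."""
        rank = {u: i for i, u in enumerate(order)}
        return sum(
            flow[i][j]
            for i in range(len(units))
            for j in range(len(units))
            if rank[units[i]] < rank[units[j]]  # u already on new system, v still old
        )

    # Small enough to enumerate; larger hospitals would need a heuristic search
    best = min(permutations(units), key=new_to_old_cost)
    print(best, new_to_old_cost(best))
    ```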

  17. Development of failure model for nickel cadmium cells

    NASA Technical Reports Server (NTRS)

    Gupta, A.

    1980-01-01

    The development of a method for the life prediction of nickel cadmium cells is discussed. The approach described involves acquiring an understanding of the mechanisms of degradation and failure and at the same time developing nondestructive evaluation techniques for the nickel cadmium cells. The development of a statistical failure model which will describe the mechanisms of degradation and failure is outlined.

  18. Quantitative image quality evaluation of MR images using perceptual difference models

    PubMed Central

    Miao, Jun; Huo, Donglai; Wilson, David L.

    2008-01-01

    The authors are using a perceptual difference model (Case-PDM) to quantitatively evaluate the image quality of the thousands of test images which can be created when optimizing fast magnetic resonance (MR) imaging strategies and reconstruction techniques. In this validation study, they compared human evaluation of MR images from multiple organs and from multiple image reconstruction algorithms to Case-PDM and similar models. The authors found that Case-PDM compared very favorably to human observers in double-stimulus continuous-quality scale and functional measurement theory studies over a large range of image quality. The Case-PDM threshold for nonperceptible differences in a 2-alternative forced choice study varied with the type of image under study, but was ≈1.1 for diffuse image effects, providing a rule of thumb. Ordering the image quality evaluation models, the authors found overall Case-PDM ≈ IDM (Sarnoff Corporation) ≈ SSIM [Wang et al. IEEE Trans. Image Process. 13, 600–612 (2004)] > mean squared error ≈ NR [Wang et al. (2004) (unpublished)] > DCTune (NASA) > IQM (MITRE Corporation). The authors conclude that Case-PDM is very useful in MR image evaluation but that one should probably restrict studies to similar images and similar processing, normally not a limitation in image reconstruction studies. PMID:18649487

  19. A real-time simulation evaluation of an advanced detection, isolation and accommodation algorithm for sensor failures in turbine engines

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.; Delaat, J. C.

    1986-01-01

    An advanced sensor failure detection, isolation, and accommodation (ADIA) algorithm has been developed for use with an aircraft turbofan engine control system. In a previous paper the authors described the ADIA algorithm and its real-time implementation. Subsequent improvements made to the algorithm and implementation are discussed, and the results of an evaluation presented. The evaluation used a real-time, hybrid computer simulation of an F100 turbofan engine.

  20. Cardiomyocyte-Specific Telomere Shortening is a Distinct Signature of Heart Failure in Humans.

    PubMed

    Sharifi-Sanjani, Maryam; Oyster, Nicholas M; Tichy, Elisia D; Bedi, Kenneth C; Harel, Ofer; Margulies, Kenneth B; Mourkioti, Foteini

    2017-09-07

    Telomere defects are thought to play a role in cardiomyopathies, but the specific cell type affected by the disease in human hearts has not yet been identified. The aim of this study was to systematically evaluate the cell-type specificity of telomere shortening in patients with heart failure in relation to their cardiac disease, age, and sex. We studied cardiac tissues from patients with heart failure by utilizing telomere quantitative fluorescence in situ hybridization, a highly sensitive method with single-cell resolution. In this study, a total of 63 human left ventricular samples, including 37 diseased and 26 nonfailing donor hearts, were stained for telomeres in combination with cardiomyocyte- or α-smooth muscle cell-specific markers (cardiac troponin T and smooth muscle actin, respectively) and assessed for telomere length. Patients with heart failure demonstrate shorter cardiomyocyte telomeres compared with nonfailing donors; this shortening is specific to cardiomyocytes within diseased human hearts and is associated with cardiomyocyte DNA damage. Our data further reveal that hypertrophic hearts with reduced ejection fraction exhibit the shortest telomeres. In contrast to other reported cell types, no difference in cardiomyocyte telomere length is evident with age. However, under the disease state, telomere attrition manifests in both young and older patients with cardiac hypertrophy. Finally, we demonstrate that cardiomyocyte telomere length is better sustained in women than in men under diseased conditions. This study provides the first evidence of cardiomyocyte-specific telomere shortening in heart failure. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  1. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  2. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging☆

    PubMed Central

    Oishi, Kenichi; Faria, Andreia V.; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-01-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a “growth percentile chart,” which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability in the anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, has been introduced.

  3. A quantitative approach to evaluating caring in nursing simulation.

    PubMed

    Eggenberger, Terry L; Keller, Kathryn B; Chase, Susan K; Payne, Linda

    2012-01-01

    This study was designed to test a quantitative method of measuring caring in the simulated environment. Since competency in caring is central to nursing practice, ways of including caring concepts in designing scenarios and in evaluation of performance need to be developed. Coates' Caring Efficacy scales were adapted for simulation and named the Caring Efficacy Scale-Simulation Student Version (CES-SSV) and Caring Efficacy Scale-Simulation Faculty Version (CES-SFV). A correlational study was designed to compare student self-ratings with faculty ratings on caring efficacy during an adult acute simulation experience with traditional and accelerated baccalaureate students in a nursing program grounded in caring theory. Student self-ratings were significantly correlated with objective ratings (r = 0.345, 0.356). Both the CES-SSV and the CES-SFV were found to have excellent internal consistency and significantly correlated interrater reliability. They were useful in measuring caring in the simulated learning environment.

  4. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Forming of cognitive schemes of plant anatomy concepts is performed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with the quantitative literacy test scored with the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy with a test according to Marzano, and course perceptions with the questionnaire. Quantitative literacy data are categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  5. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    PubMed

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

    The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were separately estimated from the respiratory-gated-only and cardiac-gated-only images. The effects of RM on CM estimation were modeled in Method 3 by applying an image-based RM correction on the cardiac gated images before CM estimation, while the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVFs using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two more noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst for the noise-free case while Method 1 performed the worst for the noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 4 and 3 showed comparable results and achieved RMSE lower by up to 35% than that of Method 1 for noisy cases. In conclusion, we have developed and evaluated four different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging. Comparison of the performance of the four methods on simulated data indicates separate R&C estimation with modeling of RM before CM estimation (Method 3) to be
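    The evaluation metric itself is straightforward to reproduce; a sketch of an RMSE over a dense motion vector field (array shapes and noise level are invented, and the paper's exact RMSE convention, e.g. per-component versus per-vector, may differ):

    ```python
    import numpy as np

    def mvf_rmse(estimated, truth):
        """RMSE between two motion vector fields of shape (nx, ny, nz, 3)."""
        err = estimated - truth
        # Squared vector magnitude per voxel, averaged, then square-rooted
        return np.sqrt(np.mean(np.sum(err ** 2, axis=-1)))

    rng = np.random.default_rng(1)
    truth = rng.normal(size=(32, 32, 32, 3))                    # ground-truth MVF
    estimated = truth + rng.normal(scale=0.3, size=truth.shape) # hypothetical estimate
    print(f"RMSE = {mvf_rmse(estimated, truth):.3f}")           # ~0.52 here
    ```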

  6. Design and Evaluation of a Web-Based Symptom Monitoring Tool for Heart Failure.

    PubMed

    Wakefield, Bonnie J; Alexander, Gregory; Dohrmann, Mary; Richardson, James

    2017-05-01

    Heart failure is a chronic condition where symptom recognition and between-visit communication with providers are critical. Patients are encouraged to track disease-specific data, such as weight and shortness of breath. Use of a Web-based tool that facilitates data display in graph form may help patients recognize exacerbations and more easily communicate out-of-range data to clinicians. The purposes of this study were to (1) design a Web-based tool to facilitate symptom monitoring and symptom recognition in patients with chronic heart failure and (2) conduct a usability evaluation of the Web site. Patient participants generally had a positive view of the Web site and indicated it would support recording their health status and communicating with their doctors. Clinician participants generally had a positive view of the Web site and indicated it would be a potentially useful adjunct to electronic health delivery systems. Participants expressed a need to incorporate decision support within the site and wanted to add other data, for example, blood pressure, and have the ability to adjust font size. A few expressed concerns about data privacy and security. Technologies require careful design and testing to ensure they are useful, usable, and safe for patients and do not add to the burden of busy providers.

  7. Simulation as a preoperative planning approach in advanced heart failure patients. A retrospective clinical analysis.

    PubMed

    Capoccia, Massimo; Marconi, Silvia; Singh, Sanjeet Avtaar; Pisanelli, Domenico M; De Lazzari, Claudio

    2018-05-02

    Modelling and simulation may become clinically applicable tools for detailed evaluation of the cardiovascular system and for clinical decision-making to guide therapeutic intervention. Models based on the pressure-volume relationship and a zero-dimensional representation of the cardiovascular system may be a suitable choice given their simplicity and versatility. This approach has great potential for application in heart failure, where left ventricular assist devices have played a significant role as a bridge to transplant and, more recently, as a long-term solution for non-eligible candidates. We sought to investigate the value of simulation in the context of three heart failure patients with a view to predicting or guiding further management. CARDIOSIM© was the software used for this purpose. The study was based on retrospective analysis of haemodynamic data previously discussed at a multidisciplinary meeting. The outcome of the simulations addressed the value of a more quantitative approach in the clinical decision process. Although previous experience, co-morbidities and the risk of potentially fatal complications play a role in clinical decision-making, patient-specific modelling may become a daily approach for the selection and optimisation of device-based treatment for heart failure patients. Willingness to adopt this integrated approach may be the key to further progress.
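    As an illustration of what a zero-dimensional (lumped-parameter) cardiovascular model reduces to, here is a generic two-element Windkessel, not CARDIOSIM© itself; all parameter values are illustrative:

    ```python
    import numpy as np

    # Two-element Windkessel: dP/dt = (Q_in(t) - P/R) / C
    # R = peripheral resistance, C = arterial compliance
    R, C = 1.0, 1.5          # mmHg*s/mL, mL/mmHg (illustrative)
    dt, t_end = 1e-3, 5.0    # s
    heart_period, t_sys = 0.8, 0.3

    def q_in(t):
        """Pulsatile inflow: half-sine ejection during systole, zero in diastole."""
        tc = t % heart_period
        return 300.0 * np.sin(np.pi * tc / t_sys) if tc < t_sys else 0.0  # mL/s

    p = 80.0  # initial aortic pressure, mmHg
    trace = []
    for t in np.arange(0.0, t_end, dt):
        p += dt * (q_in(t) - p / R) / C   # forward-Euler integration
        trace.append(p)

    # Pressure swing over the last simulated beat (800 steps = 0.8 s)
    print(f"pressure range: {min(trace[-800:]):.0f}-{max(trace[-800:]):.0f} mmHg")
    ```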

  8. Establishing a preoperative evaluation system for lumboperitoneal shunt: Approach to attenuate the risk of shunt failure.

    PubMed

    Sun, Tong; Yuan, Yikai; Zhang, Qiuming; Zhou, Yicheng; Li, Xuepei; Yu, Hang; Tian, Meng; Guan, Junwen

    2018-06-12

    Lumboperitoneal shunt (LPS) has been demonstrated to be an effective method for the treatment of communicating hydrocephalus, although shunt failure remains frequent. The aim was to determine whether establishing a preoperative evaluation system could benefit patients and thus attenuate the risk of LPS failure. In this three-year study of patients treated by LPS, patients undergoing preoperative evaluation were included in the study group and those without preoperative evaluation in the control group. Perioperative conditions, including Keifer's hydrocephalus score (KHS), symptomatic control rate (SCR), Evans index, complications, long-term shunt revision rate, and quality of life (QOL), were synchronously investigated. 93 eligible patients were included in the study (study group: 51, control group: 42). The baseline characteristics of the two groups were broadly similar. The results showed that patients in the study group had better short-term improvement in symptoms and imaging, including a higher SCR (median, 62.5% vs 50%, P=0.001), a greater reduction in Evans index (0.08±0.05 vs 0.05±0.04, P=0.002), and a lower incidence of postoperative complications (35.3% vs 57.1%, P=0.04). Similarly, the incidence of shunt revision in the study group was dramatically lower than in the control group (15.7% vs 40.9%, P=0.006), consistent with the revision-free survival curve (P=0.002), which suggested that most patients who needed revision received it within 3 months. Additionally, patients in the study group had better QOL. In conclusion, patients who underwent the evaluation before LPS had better short-term and long-term outcomes, suggesting this would be a promising strategy to correctly select patients for LPS with prolonged favorable shunt outcomes. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Effects of enhanced external counterpulsation on skeletal muscle gene expression in patients with severe heart failure.

    PubMed

    Melin, Michael; Montelius, Andreas; Rydén, Lars; Gonon, Adrian; Hagerman, Inger; Rullman, Eric

    2018-01-01

    Enhanced external counterpulsation (EECP) is a non-invasive treatment in which leg cuff compressions increase diastolic aortic pressure and coronary perfusion. EECP is offered to patients with refractory angina pectoris and increases physical capacity. Benefits in heart failure patients have been noted, but EECP is still considered to be experimental and its effects must be confirmed. The mechanism of action is still unclear. The aim of this study was to evaluate the effect of EECP on skeletal muscle gene expression and physical performance in patients with severe heart failure. Patients (n = 9) in NYHA III-IV despite pharmacological therapy were subjected to 35 h of EECP during 7 weeks. Before and after, lateral vastus muscle biopsies were obtained, and functional capacity was evaluated with a 6-min walk test. Skeletal muscle gene expression was evaluated using Affymetrix Hugene 1.0 arrays. Maximum walking distance increased by 15%, which is in parity to that achieved after aerobic exercise training in similar patients. Skeletal muscle gene expression analysis using Ingenuity Pathway Analysis showed an increased expression of two networks of genes with FGF-2 and IGF-1 as central regulators. The increase in gene expression was quantitatively small and no overlap with gene expression profiles after exercise training could be detected despite adequate statistical power. EECP treatment leads to a robust improvement in walking distance in patients with severe heart failure and does induce a skeletal muscle transcriptional response, but this response is small and with no significant overlap with the transcriptional signature seen after exercise training. © 2016 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  10. Does early reading failure decrease children's reading motivation?

    PubMed

    Morgan, Paul L; Fuchs, Douglas; Compton, Donald L; Cordray, David S; Fuchs, Lynn S

    2008-01-01

    The authors used a pretest-posttest control group design with random assignment to evaluate whether early reading failure decreases children's motivation to practice reading. First, they investigated whether 60 first-grade children would report substantially different levels of interest in reading as a function of their relative success or failure in learning to read. Second, they evaluated whether increasing the word reading ability of 15 at-risk children would lead to gains in their motivation to read. Multivariate analyses of variance suggest marked differences in both motivation and reading practice between skilled and unskilled readers. However, bolstering at-risk children's word reading ability did not yield evidence of a causal relationship between early reading failure and decreased motivation to engage in reading activities. Instead, hierarchical regression analyses indicate a covarying relationship among early reading failure, poor motivation, and avoidance of reading.

  11. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

    It was the primary purpose of our study to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification methods for assessment of bone tracer uptake (BTU) in those patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert-type scale (0-10) and volumetrically quantified using validated software. Two observers interpreted the SPECT/CT findings in all patients twice, in random order, with a six-week interval between interpretations. Semi-quantitative and quantitative measurements were compared in terms of reliability, and the values were correlated using Pearson's correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions, which should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliability for all femoral and acetabular regions, independent of the measurement method used (semi-quantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between both measurement methods was shown for the distal femur, the proximal femur, and the acetabular cup. The factorial cluster analysis showed that the anatomical regions might be summarized into three distinct anatomical regions: the proximal femur, the distal femur, and the acetabular cup region. The SPECT/CT algorithm for assessment of patients with pain after THA is highly reliable, independent of the measurement method used, and yields three clinically relevant anatomical regions (proximal femur, distal femur, and acetabular cup).
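
    As a rough illustration of the comparison step described above, the sketch below correlates per-region semi-quantitative Likert grades with volumetric BTU values; the arrays are invented stand-ins, not study data.

    ```python
    # Hypothetical sketch: correlating one observer's semi-quantitative Likert
    # grades (0-10) with volumetric BTU values for the same anatomical regions.
    import numpy as np
    from scipy import stats

    likert = np.array([2, 5, 7, 3, 8, 6, 4, 9])                      # made-up grades
    volumetric = np.array([1.1, 2.8, 4.0, 1.6, 4.9, 3.5, 2.2, 5.6])  # made-up BTU values

    r, p = stats.pearsonr(likert, volumetric)
    print(f"Pearson r = {r:.2f}, p = {p:.4f}")
    ```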

  12. Heart failure remote monitoring: evidence from the retrospective evaluation of a real-world remote monitoring program.

    PubMed

    Agboola, Stephen; Jethwani, Kamal; Khateeb, Kholoud; Moore, Stephanie; Kvedar, Joseph

    2015-04-22

    Given the magnitude of increasing heart failure mortality, multidisciplinary approaches, in the form of disease management programs and other integrative models of care, are recommended to optimize treatment outcomes. Remote monitoring, either as structured telephone support or telemonitoring or a combination of both, is fast becoming an integral part of many disease management programs. However, studies reporting on the evaluation of real-world heart failure remote monitoring programs are scarce. This study aims to evaluate the effect of a heart failure telemonitoring program, the Connected Cardiac Care Program (CCCP), on hospitalization and mortality in a retrospective database review of medical records of patients with heart failure receiving care at the Massachusetts General Hospital. Patients enrolled in the CCCP heart failure monitoring program at the Massachusetts General Hospital were matched 1:1 with usual care patients. Control patients received care from similar clinical settings as CCCP patients and were identified from a large clinical data registry. The primary endpoint was all-cause mortality and hospitalizations assessed during the 4-month program duration. Secondary outcomes included hospitalization and mortality rates (obtained by following up on patients over an additional 8 months after program completion for a total duration of 1 year), risk for multiple hospitalizations, and length of stay. The Cox proportional hazard model, stratified on the matched pairs, was used to assess primary outcomes. A total of 348 patients were included in the time-to-event analyses. The baseline rates of hospitalizations prior to program enrollment did not differ significantly by group. Compared with controls, hospitalization rates decreased within the first 30 days of program enrollment (hazard ratio [HR]=0.52, 95% CI 0.31-0.86, P=.01). The differential effect on hospitalization rates remained consistent until the end of the 4-month program (HR=0.74, 95% CI 0
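
    A minimal sketch of the analysis named above (a Cox model stratified on matched pairs), using simulated pairs and the lifelines library; the variable names and the assumed halving of hazard under telemonitoring are illustrative only.

    ```python
    # Simulated matched-pair survival data fit with a stratified Cox model.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n_pairs = 100
    pair_id = np.repeat(np.arange(n_pairs), 2)
    treated = np.tile([1, 0], n_pairs)
    # Assumption for the toy data: telemonitoring halves the hospitalization hazard.
    t = rng.exponential(scale=np.where(treated == 1, 200.0, 100.0))
    event = (t <= 120).astype(int)                 # administrative censoring at 4 months
    df = pd.DataFrame({"pair_id": pair_id, "telemonitored": treated,
                       "duration": np.minimum(t, 120.0), "event": event})

    cph = CoxPHFitter()
    cph.fit(df, duration_col="duration", event_col="event",
            strata=["pair_id"])                    # stratification on the matched pairs
    print(cph.summary[["coef", "exp(coef)", "p"]])
    ```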

  13. Generic Sensor Failure Modeling for Cooperative Systems.

    PubMed

    Jäger, Georg; Zug, Sebastian; Casimiro, António

    2018-03-20

    The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application's fault tolerance and thereby promises maintainability of such a system's safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data from a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques.
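
    A toy sketch of the extraction idea (not the paper's actual processing chain): sensor readings are compared against ground truth and a Gaussian error model is fitted per distance bin. All data here are simulated.

    ```python
    # Fit a distance-dependent Gaussian error model from (simulated) sensor data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    truth = rng.uniform(10, 80, 5000)          # true distances in cm (simulated)
    noise_sd = 0.02 * truth + 0.5              # assumed distance-dependent noise
    readings = truth + rng.normal(0, noise_sd)

    bins = np.arange(10, 90, 10)
    for lo, hi in zip(bins[:-1], bins[1:]):
        err = (readings - truth)[(truth >= lo) & (truth < hi)]
        mu, sd = stats.norm.fit(err)           # per-bin empirical error model
        print(f"{lo:2.0f}-{hi:2.0f} cm: error mean={mu:+.2f}, sd={sd:.2f}")
    ```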

  14. Generic Sensor Failure Modeling for Cooperative Systems

    PubMed Central

    Jäger, Georg; Zug, Sebastian

    2018-01-01

    The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application’s fault tolerance and thereby promises maintainability of such a system’s safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data from a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques. PMID:29558435

  15. Comparative evaluation of 3 microbond strength tests using 4 adhesive systems: Mechanical, finite element, and failure analysis.

    PubMed

    Campos, Roberto E; Santos Filho, Paulo César F; de O Júnior, Osmir Batista; Ambrosano, Gláucia M B; Pereira, Cristina Alves

    2018-01-01

    Bond strength (BS) values from in vitro studies are useful when dentists are selecting an adhesive system, but there is no ideal measuring method. The purpose of this in vitro study was to investigate the influence of the evaluation method on the BS between dentin and composite resin. Molars with exposed superficial dentin (N=240) were divided into 3 groups according to the test: microtensile (μTBS), microshear (μSBS), and micropush-out (μPBS). Each one was subdivided into 4 groups according to the adhesive system (total-etch, 3- and 2-step; self-etch, 2- and 1-step). For the μPBS test, a conical cavity was prepared and restored with composite resin, and an occlusal slice (1.5 mm in thickness) was obtained from each tooth. For the μSBS test, a composite resin cylinder (1 mm in diameter) was built on the dentin surface of each tooth. For the μTBS test, a 2-increment composite resin cylinder was built on the dentin surface, and beams with a sectional area of 0.5 mm² were obtained. Each subgroup was divided into 2 (n=10), with specimens tested after 7 days or 1 year of water storage. The specimens were loaded to failure, and the BS was recorded in megapascals. Original BS values from the μTBS and μSBS tests were normalized for the area of the μPBS specimens. Original and normalized results were submitted to a 3-way ANOVA (α=.05). The correlation among mechanical results, stress distribution, and failure pattern was investigated. Significant differences (P<.05) were found among the adhesive systems and methods within both the original and normalized data but not between the storage times (P>.05). Within the 7 days of storage, the original BS values from μTBS were significantly higher (P<.001) than those from μPBS and μSBS. After 1 year, μSBS presented significantly lower results (P<.001). However, after the normalization for area, the BS values of the μTBS and μPBS tests were similar, and both were higher (P<.001) than that of
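
    A hedged sketch of the statistical step named above: a 3-way ANOVA on test method, adhesive system, and storage time. The bond-strength values below are randomly generated stand-ins, not the study's measurements.

    ```python
    # Three-way ANOVA of bond strength on test, adhesive, and storage factors.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(2)
    tests = ["uTBS", "uSBS", "uPBS"]
    adhesives = ["TE3", "TE2", "SE2", "SE1"]       # hypothetical group labels
    storage = ["7d", "1y"]
    rows = [(t, a, s, rng.normal(30, 5))           # simulated BS in MPa
            for t in tests for a in adhesives for s in storage for _ in range(10)]
    df = pd.DataFrame(rows, columns=["test", "adhesive", "storage", "bs"])

    model = ols("bs ~ C(test) * C(adhesive) * C(storage)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))
    ```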

  16. Mode I Failure of Armor Ceramics: Experiments and Modeling

    NASA Astrophysics Data System (ADS)

    Meredith, Christopher; Leavy, Brian

    2017-06-01

    The pre-notched edge-on impact (EOI) experiment is a technique for benchmarking the damage and fracture of ceramics subjected to projectile impact. A cylindrical projectile impacts the edge of a thin rectangular plate with a pre-notch on the opposite edge. Tension is generated at the notch tip, resulting in the initiation and propagation of a mode I crack back toward the impact edge. The crack can be quantitatively measured using an optical method called Digital Gradient Sensing, which measures the crack-tip deformation by simultaneously quantifying two orthogonal surface slopes via small deflections of light rays from a specularly reflective surface around the crack. The deflections in ceramics are small, so the high-speed camera needs a very high pixel count. This work reports on the results from pre-notched EOI experiments on SiC and B4C plates. The experimental data are quantitatively compared to impact simulations using an advanced continuum damage model: the Kayenta ceramic model in Alegra will be used to compare fracture propagation speeds, bifurcations, and inhomogeneous initiation of failure. This will provide insight into the driving mechanisms required for the macroscale failure modeling of ceramics.

  17. Confessions of a Quantitative Educational Researcher Trying to Teach Qualitative Research.

    ERIC Educational Resources Information Center

    Stallings, William M.

    1995-01-01

    Describes one quantitative educational researcher's experiences teaching qualitative research, the approach used in classes, and the successes and failures. These experiences are examined from the viewpoint of a traditionally trained professor who has now been called upon to master and teach qualitative research. (GR)

  18. Failure Analysis at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Salazar, Victoria L.; Wright, M. Clara

    2010-01-01

    History has shown that failures occur in every engineering endeavor, and what we learn from those failures contributes to the knowledge base to safely complete future missions. The necessity of failure analysis is at its apex at the end of one aged program and at the beginning of a new and untested program. The information that we gain through failure analysis corrects the deficiencies in the current vehicle to make the next generation of vehicles more efficient and safe. The Failure Analysis and Materials Evaluation Branch in the Materials Science Division at the Kennedy Space Center performs metallurgical, mechanical, electrical, and non-metallic materials failure analyses and accident investigations on both flight hardware and ground support equipment for the Space Shuttle, International Space Station, Constellation, and Launch Services Programs. This paper will explore a variety of failure case studies at the Kennedy Space Center and the lessons learned that can be applied in future programs.

  19. A quantitative evaluation of cell migration by the phagokinetic track motility assay.

    PubMed

    Nogalski, Maciej T; Chan, Gary C T; Stevenson, Emily V; Collins-McMillen, Donna K; Yurochko, Andrew D

    2012-12-04

    Cellular motility is an important biological process for both unicellular and multicellular organisms. It is essential for movement of unicellular organisms towards a source of nutrients or away from unsuitable conditions, as well as in multicellular organisms for tissue development, immune surveillance and wound healing, just to mention a few roles(1,2,3). Deregulation of this process can lead to serious neurological, cardiovascular and immunological diseases, as well as exacerbated tumor formation and spread(4,5). Molecularly, actin polymerization and receptor recycling have been shown to play important roles in creating cellular extensions (lamellipodia), that drive the forward movement of the cell(6,7,8). However, many biological questions about cell migration remain unanswered. The central role for cellular motility in human health and disease underlines the importance of understanding the specific mechanisms involved in this process and makes accurate methods for evaluating cell motility particularly important. Microscopes are usually used to visualize the movement of cells. However, cells move rather slowly, making the quantitative measurement of cell migration a resource-consuming process requiring expensive cameras and software to create quantitative time-lapsed movies of motile cells. Therefore, the ability to perform a quantitative measurement of cell migration that is cost-effective, non-laborious, and that utilizes common laboratory equipment is a great need for many researchers. The phagokinetic track motility assay utilizes the ability of a moving cell to clear gold particles from its path to create a measurable track on a colloidal gold-coated glass coverslip(9,10). With the use of freely available software, multiple tracks can be evaluated for each treatment to accomplish statistical requirements. The assay can be utilized to assess motility of many cell types, such as cancer cells(11,12), fibroblasts(9), neutrophils(13), skeletal muscle cells(14
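
    As a rough sketch of the track quantification described above, the code below thresholds a synthetic image, labels the cleared (bright) region, and reports its area; with real data the input would be a micrograph of the colloidal gold field.

    ```python
    # Measure the cleared track area in a (synthetic) phagokinetic track image.
    import numpy as np
    from skimage import filters, measure

    rng = np.random.default_rng(3)
    img = rng.normal(0.2, 0.05, (256, 256))        # gold-covered dark background
    rr, cc = np.ogrid[:256, :256]
    img[(rr - 128) ** 2 + (cc - 128) ** 2 < 40 ** 2] = 0.8   # simulated cleared track

    mask = img > filters.threshold_otsu(img)       # bright pixels = cleared gold
    labels = measure.label(mask)
    areas = [r.area for r in measure.regionprops(labels)]
    print(f"track area: {max(areas)} px")
    ```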

  20. Failure environment analysis tool applications

    NASA Astrophysics Data System (ADS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-02-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  1. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  2. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1994-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  3. Metallization failures

    NASA Technical Reports Server (NTRS)

    Beatty, R.

    1971-01-01

    Metallization-related failure mechanisms were shown to be a major cause of integrated circuit failures under accelerated stress conditions, as well as in actual use under field operation. The integrated circuit industry is aware of the problem and is attempting to solve it in one of two ways: (1) better understanding of the aluminum system, which is the most widely used metallization material for silicon integrated circuits both as a single level and multilevel metallization, or (2) evaluating alternative metal systems. Aluminum metallization offers many advantages, but also has limitations particularly at elevated temperatures and high current densities. As an alternative, multilayer systems of the general form, silicon device-metal-inorganic insulator-metal, are being considered to produce large scale integrated arrays. The merits and restrictions of metallization systems in current usage and systems under development are defined.

  4. Quantitative evaluation of malignant gliomas damage induced by photoactivation of IR700 dye

    NASA Astrophysics Data System (ADS)

    Sakuma, Morito; Kita, Sayaka; Higuchi, Hideo

    2016-01-01

    The processes involved in damage to malignant gliomas were quantitatively evaluated by microscopy. The near-infrared fluorescent dye IR700, conjugated to an anti-CD133 antibody (IR700-CD133), specifically targets malignant gliomas (U87MG) and stem cells (BT142) and is endocytosed into the cells. The gliomas are then photodamaged by the release of reactive oxygen species (ROS) and the heat induced by illumination of IR700 with a red laser, and the motility of the vesicles within these cells is altered as a result of cellular damage. To investigate these changes in motility, we developed a new method that measures fluctuations in the intensity of phase-contrast images obtained from small areas within cells. The intensity fluctuation in U87MG cells gradually decreased as cell damage progressed, whereas the fluctuation in BT142 cells increased. The endocytosed IR700 dye was co-localized in acidic organelles such as endosomes and lysosomes. The pH in U87MG cells, as monitored by a pH indicator, was decreased and then gradually increased by the illumination of IR700, while the pH in BT142 cells increased monotonically. In these experiments, the processes of cell damage were quantitatively evaluated according to the motility of vesicles and changes in pH.
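
    A minimal sketch of the fluctuation metric described above, assuming it is the temporal standard deviation of the mean intensity within a small window; the frame stack here is simulated noise rather than phase-contrast data.

    ```python
    # Temporal intensity fluctuation of a small region of interest (ROI).
    import numpy as np

    rng = np.random.default_rng(4)
    frames = rng.normal(100, 3, (300, 64, 64))     # (time, y, x) stand-in movie

    y, x, w = 20, 30, 8                            # small intracellular ROI
    roi_mean = frames[:, y:y + w, x:x + w].mean(axis=(1, 2))
    print(f"intensity fluctuation: {roi_mean.std():.3f}")
    ```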

  5. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    PubMed

    Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.

  6. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  7. TEAM-HF Cost-Effectiveness Model: A Web-Based Program Designed to Evaluate the Cost-Effectiveness of Disease Management Programs in Heart Failure

    PubMed Central

    Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.

    2015-01-01

    Background Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
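
    A conceptual sketch of the simulation logic described above: paired virtual cohorts draw costs and quality-adjusted survival, and the results are summarized as an incremental cost-effectiveness ratio (ICER). Every distribution and parameter below is invented for illustration and is not from the TEAM-HF model.

    ```python
    # Monte Carlo ICER for a hypothetical disease management program (DMP).
    import numpy as np

    rng = np.random.default_rng(5)
    n = 10_000                                       # paired virtual cohorts
    qaly_usual = rng.normal(2.0, 0.5, n)
    qaly_dmp = qaly_usual + rng.normal(0.15, 0.05, n)   # assumed QALY benefit
    cost_usual = rng.lognormal(10.0, 0.3, n)
    cost_dmp = cost_usual + rng.normal(3000, 500, n)    # assumed program cost

    icer = (cost_dmp - cost_usual).mean() / (qaly_dmp - qaly_usual).mean()
    print(f"ICER ~= ${icer:,.0f} per QALY gained")
    ```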

  8. A pilot rating scale for evaluating failure transients in electronic flight control systems

    NASA Technical Reports Server (NTRS)

    Hindson, William S.; Schroeder, Jeffery A.; Eshow, Michelle M.

    1990-01-01

    A pilot rating scale was developed to describe the effects of transients in helicopter flight-control systems on safety-of-flight and on pilot recovery action. The scale was applied to the evaluation of hardovers that could potentially occur in the digital flight-control system being designed for a variable-stability UH-60A research helicopter. Tests were conducted in a large moving-base simulator and in flight. The results of the investigation were combined with existing airworthiness criteria to determine quantitative reliability design goals for the control system.

  9. Failure mechanisms of fibrin-based surgical tissue adhesives

    NASA Astrophysics Data System (ADS)

    Sierra, David Hugh

    A series of studies was performed to investigate the potential impact of heterogeneity in the matrix of multiple-component fibrin-based tissue adhesives upon their mechanical and biomechanical properties both in vivo and in vitro. Investigations into the failure mechanisms by stereological techniques demonstrated that heterogeneity could be measured quantitatively and that the variation in heterogeneity could be altered both by the means of component mixing and delivery and by the formulation of the sealant. Ex vivo tensile adhesive strength was found to be inversely proportional to the amount of heterogeneity. In contrast, in vivo tensile wound-closure strength was found to be relatively unaffected by the degree of heterogeneity, while in vivo parenchymal organ hemostasis in rabbits was found to be affected: greater heterogeneity appeared to correlate with an increase in hemostasis time and amount of sealant necessary to effect hemostasis. Tensile testing of the bulk sealant showed that mechanical parameters were proportional to fibrin concentration and that the physical characteristics of the failure supported a ductile mechanism. Strain hardening as a function of percentage of strain, and strain rate was observed for both concentrations, and syneresis was observed at low strain rates for the lower fibrin concentration. Blister testing demonstrated that burst pressure and failure energy were proportional to fibrin concentration and decreased with increasing flow rate. Higher fibrin concentration demonstrated predominately compact morphology debonds with cohesive failure loci, demonstrating shear or viscous failure in a viscoelastic rubbery adhesive. The lower fibrin concentration sealant exhibited predominately fractal morphology debonds with cohesive failure loci, supporting an elastoviscous material condition. The failure mechanism for these was hypothesized and shown to be flow-induced ductile fracture. Based on these findings, the failure mechanism was

  10. Application of Organosilane Monolayer Template to Quantitative Evaluation of Cancer Cell Adhesive Ability

    NASA Astrophysics Data System (ADS)

    Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko

    2011-06-01

    The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.

  11. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  12. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  13. The failure to fail underperforming trainees in health professions education: A BEME systematic review: BEME Guide No. 42.

    PubMed

    Yepes-Rios, Monica; Dudek, Nancy; Duboyce, Rita; Curtis, Jerri; Allard, Rhonda J; Varpio, Lara

    2016-11-01

    Many clinical educators feel unprepared and/or unwilling to report unsatisfactory trainee performance. This systematic review consolidates knowledge from medical, nursing, and dental literature on the experiences and perceptions of evaluators or assessors with this failure to fail phenomenon. We searched the English language literature in CINAHL, EMBASE, and MEDLINE from January 2005 to January 2015. Qualitative and quantitative studies were included. Following our review protocol, registered with BEME, reviewers worked in pairs to identify relevant articles. The investigators participated in thematic analysis of the qualitative data reported in these studies. Through several cycles of analysis, discussion and reflection, the team identified the barriers and enablers to failing a trainee. From 5330 articles, we included 28 publications in the review. The barriers identified were (1) assessor's professional considerations, (2) assessor's personal considerations, (3) trainee related considerations, (4) unsatisfactory evaluator development and evaluation tools, (5) institutional culture and (6) consideration of available remediation for the trainee. The enablers identified were: (1) duty to patients, to society, and to the profession, (2) institutional support such as backing a failing evaluation, support from colleagues, evaluator development, and strong assessment systems, and (3) opportunities for students after failing. The inhibiting and enabling factors to failing an underperforming trainee were common across the professions included in this study, across the 10 years of data, and across the educational continuum. We suggest that these results can inform efforts aimed at addressing the failure to fail problem.

  14. The Use of Probabilistic Methods to Evaluate the Systems Impact of Component Design Improvements on Large Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Packard, Michael H.

    2002-01-01

    Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (Quantitative Risk Assessment Software (QRAS) etc.). Hypothetical PSA results for a number of structural components along with mitigation factors that would restrict the failure mode from propagating to a Loss of Mission (LOM) failure were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to the overall Mission Success (MS) is also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, inclusion of new sensor detection of faults and other upgrades were evaluated in determining overall turbine engine reliability.
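
    A toy sketch of the combination step posed above: per-mode failure times are sampled from assumed fitted distributions, mitigation is applied as a probability that a failure does not propagate, and the system loss-of-mission (LOM) time is the earliest unmitigated failure. All distributions and parameters are placeholders.

    ```python
    # Combine component failure-mode distributions into a system LOM estimate.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000
    blade_fatigue = rng.weibull(3.0, n) * 8000       # hours; assumed Weibull fit
    disk_creep = rng.lognormal(9.5, 0.4, n)          # assumed lognormal fit
    sensor_fault = rng.exponential(20_000, n)        # assumed constant-rate mode
    # Assumed per-mode probabilities that a failure is contained (mitigated).
    mitigated = rng.random((3, n)) < np.array([[0.6], [0.2], [0.9]])

    modes = np.vstack([blade_fatigue, disk_creep, sensor_fault])
    modes[mitigated] = np.inf                        # contained failures never cause LOM
    lom_time = modes.min(axis=0)
    print(f"P(LOM before 10,000 h) ~= {(lom_time < 10_000).mean():.3f}")
    ```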

  15. [Study on the quantitative evaluation on the degree of TCM basic syndromes often encountered in patients with primary liver cancer].

    PubMed

    Li, Dong-tao; Ling, Chang-quan; Zhu, De-zeng

    2007-07-01

    To establish a quantitative model for evaluating the degree of the TCM basic syndromes often encountered in patients with primary liver cancer (PLC). Medical literature concerning the clinical investigation and TCM syndromes of PLC was collected and analyzed adopting an expert-composed symposium method, and 100-millimeter scaling was applied, in combination with scoring on the degree of symptoms, to establish a quantitative criterion for symptom and sign degree classification in patients with PLC. Two models, i.e. the additive model and the additive-multiplicative model, were established by using the comprehensive analytic hierarchy process (AHP) as the mathematical tool to estimate the weight of the criterion for evaluating basic syndromes in various layers by specialists. The two models were then verified in clinical practice and the outcomes were compared with those fuzzily evaluated by specialists. Verification on 459 times/case of PLC showed that the coincidence rate between the outcomes derived from specialists and those from the additive model was 84.53%, and with those from the additive-multiplicative model was 62.75%; the difference between the two was statistically significant (P<0.01). It could be concluded that the additive model is the principal model suitable for quantitative evaluation of the degree of TCM basic syndromes in patients with PLC.
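
    A hedged sketch of the additive model: AHP weights are taken as the normalized principal eigenvector of a pairwise-comparison matrix, and the syndrome degree is the weighted sum of 100-mm-scale symptom scores. The matrix and scores below are invented.

    ```python
    # AHP weights from a pairwise-comparison matrix, then an additive score.
    import numpy as np

    pairwise = np.array([[1.0, 3.0, 5.0],
                         [1/3, 1.0, 2.0],
                         [1/5, 1/2, 1.0]])     # hypothetical expert comparisons

    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    w /= w.sum()                               # normalized AHP weights

    scores = np.array([70, 40, 20])            # 100-mm scale symptom scores (made up)
    print(f"weights = {np.round(w, 3)}, syndrome degree = {w @ scores:.1f}")
    ```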

  16. Quantitative measurement of adhesion of ink on plastic films with a Nano Indenter and a Scanning Probe Microscope

    NASA Astrophysics Data System (ADS)

    Shen, Weidian

    2005-03-01

    Plastic film packaging is widely used these days, especially in the convenience food industry, due to its flexibility, boilability, and microwavability. Almost every package is printed with ink. The adhesion of ink on plastic films merits increasing attention to ensure quality packaging. However, inks and plastic films are polymeric materials with complicated molecular structures. The thickness of the jelly-like ink is only 500 nm or less, and the thickness of the soft and flexible film is no more than 50 μm, which makes the quantitative measurement of their adhesion very challenging. Up to now, no scientific quantitative measurement method for the adhesion of ink on plastic films has been documented. We have tried a technique in which a Nano Indenter and a Scanning Probe Microscope were used to evaluate the adhesion strength of ink deposited on plastic films quantitatively, as well as to examine the configurations of adhesion failure. It was helpful in better understanding the adhesion mechanism, thus giving direction as to how to improve the adhesion.

  17. Failure tolerance strategy of space manipulator for large load carrying tasks

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yuan, Bonan; Jia, Qingxuan; Sun, Hanxu; Guo, Wen

    2018-07-01

    During the execution of large load carrying tasks in long-term service, there is a notable risk of a space manipulator suffering locked-joint failure, so the manipulator must have sufficient failure tolerance. This paper investigates the evaluation of failure tolerance performance and the re-planning of feasible task trajectories for a space manipulator performing large load carrying tasks. The effects of locked-joint failure on critical performance (reachability and load carrying capacity) of the space manipulator are analyzed first. According to the requirements of load carrying tasks, we further propose the new concept of failure tolerance workspace with load carrying capacity (FTWLCC) to evaluate failure tolerance performance, and improve the classic A* algorithm to search for feasible task trajectories. Through the normalized FTWLCC and the improved A* algorithm, the reachability and load carrying capacity of the degraded space manipulator are evaluated, and a reachable and capable trajectory can be obtained. The FTWLCC provides a novel idea that combines mathematical statistics with failure tolerance performance to illustrate the distribution of load carrying capacity in three-dimensional space, so multiple performance indices can be analyzed simultaneously and visually. Full consideration of all possible failure situations and motion states makes the FTWLCC and the improved A* algorithm universal and effective enough to accommodate random joint failures and a variety of large load carrying task requirements, so they can be extended to other types of manipulators.

  18. Cascading failure in scale-free networks with tunable clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Xue-Jun; Gu, Bo; Guan, Xiang-Min; Zhu, Yan-Bo; Lv, Ren-Li

    2016-02-01

    Cascading failure is ubiquitous in many networked infrastructure systems, such as power grids, the Internet, and air transportation systems. In this paper, we extend the cascading failure model to a scale-free network with tunable clustering and focus on the effect of the clustering coefficient on system robustness. It is found that network robustness undergoes a nonmonotonic transition as the clustering coefficient increases: both highly and lowly clustered networks are fragile under intentional attack, and networks with a moderate clustering coefficient can better resist the spread of cascading failures. We then provide an extensive explanation for this constructive phenomenon from a microscopic point of view and through quantitative analysis. Our work can be useful to the design and optimization of infrastructure systems.
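
    A simplified sketch in the spirit of the model above (a Motter-Lai-style load/capacity cascade on a Holme-Kim scale-free graph with tunable clustering); the tolerance parameter, network size, and attack rule are placeholders, not the paper's exact settings.

    ```python
    # Cascading failure on a scale-free network with tunable clustering.
    import networkx as nx

    def cascade_survivors(n=300, m=3, p_triad=0.3, alpha=0.4):
        g = nx.powerlaw_cluster_graph(n, m, p_triad, seed=0)
        load = nx.betweenness_centrality(g)
        cap = {v: (1 + alpha) * load[v] for v in g}        # fixed node capacities
        g.remove_node(max(g, key=g.degree))                # intentional attack
        while True:
            load = nx.betweenness_centrality(g)            # redistribute load
            failed = [v for v in g if load[v] > cap[v]]
            if not failed:
                break
            g.remove_nodes_from(failed)
        giant = max((len(c) for c in nx.connected_components(g)), default=0)
        return giant / n

    for p in (0.0, 0.3, 0.6):
        print(f"triad probability {p}: surviving fraction {cascade_survivors(p_triad=p):.2f}")
    ```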

  19. Rationale and design of the Japanese heart failure outpatients disease management and cardiac evaluation (J-HOMECARE).

    PubMed

    Tsuchihashi-Makaya, Miyuki; Matsuo, Hisashi; Kakinoki, Shigeo; Takechi, Shigeru; Tsutsui, Hiroyuki

    2011-09-01

    Although many studies have demonstrated the efficacy of disease management programs on mortality, morbidity, quality of life (QOL), and medical cost in patients with heart failure (HF), no study has focused on psychological status as an outcome of disease management. In addition, very little information is available on the effectiveness of disease management programs in other areas than the USA and Europe. The Japanese Heart Failure Outpatients Disease Management and Cardiac Evaluation (J-HOMECARE) is a randomized controlled trial in which 156 patients hospitalized with HF will be randomized into usual care or a home-based disease management arm receiving comprehensive advice and counseling by visiting nurses during the initial 2 months and telephone follow-up for the following 4 months after discharge. This study evaluates depression and anxiety (Hospital Anxiety and Depression Scale), mortality, readmission due to HF, and QOL (Short Form-8). Data are collected during index hospitalization and then 2, 6, and 12 months after discharge. This study started in December 2007, and the final results are expected in 2011. The J-HOMECARE will provide important information on the efficacy of disease management for psychological status as well as the effective components of disease management for patients with HF. (ClinicalTrials.gov number, NCT01284400). Copyright © 2011 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  20. [Biochemical failure after curative treatment for localized prostate cancer].

    PubMed

    Zouhair, Abderrahim; Jichlinski, Patrice; Mirimanoff, René-Olivier

    2005-12-07

    Biochemical failure after curative treatment for localized prostate cancer is frequent. The diagnosis of biochemical failure is clear when PSA levels rise after radical prostatectomy, but may be more difficult after external beam radiation therapy. The main difficulty once biochemical failure is diagnosed is to distinguish between local and distant failure, given the low sensitivity of standard work-up exams. Metabolic imaging techniques currently under evaluation may in the future help us to localize the site of failures. There are several therapeutic options depending on the initial curative treatment, each with morbidity risks that should be considered in multidisciplinary decision-making.

  1. Nonlinear deformation and localized failure of bacterial streamers in creeping flows

    PubMed Central

    Biswas, Ishita; Ghosh, Ranajay; Sadrzadeh, Mohtada; Kumar, Aloke

    2016-01-01

    We investigate the failure of bacterial floc mediated streamers in a microfluidic device in a creeping flow regime using both experimental observations and analytical modeling. The quantification of streamer deformation and failure behavior is possible due to the use of 200 nm fluorescent polystyrene beads, which firmly embed in the extracellular polymeric substance (EPS) and act as tracers. The streamers, which form soon after the commencement of flow, begin to deviate from an apparently quiescent fully formed state in spite of steady background flow and limited mass accretion, indicating significant mechanical nonlinearity. This nonlinear behavior shows distinct phases of deformation with mutually different characteristic times and comes to an end with a distinct localized failure of the streamer far from the walls. We investigate this deformation and failure behavior for two separate bacterial strains and develop a simplified but nonlinear analytical model describing the experimentally observed instability phenomena, assuming a necking route to instability. Our model leads to a power law relation between the critical strain at failure and the fluid velocity scale, exhibiting excellent qualitative and quantitative agreement with the experimental rupture behavior. PMID:27558511
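
    As a minimal illustration of fitting such a power law, the sketch below regresses log critical strain on log velocity; the data points are fabricated placeholders, not the paper's measurements.

    ```python
    # Power-law fit of critical strain versus velocity in log-log space.
    import numpy as np

    velocity = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # mm/s (made up)
    crit_strain = np.array([3.1, 2.2, 1.55, 1.1, 0.78])   # made up

    slope, intercept = np.polyfit(np.log(velocity), np.log(crit_strain), 1)
    print(f"critical strain ~ velocity^{slope:.2f} (prefactor {np.exp(intercept):.2f})")
    ```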

  2. CT fluoroscopy-guided renal tumour cutting needle biopsy: retrospective evaluation of diagnostic yield, safety, and risk factors for diagnostic failure.

    PubMed

    Iguchi, Toshihiro; Hiraki, Takao; Matsui, Yusuke; Fujiwara, Hiroyasu; Sakurai, Jun; Masaoka, Yoshihisa; Gobara, Hideo; Kanazawa, Susumu

    2018-01-01

    To evaluate retrospectively the diagnostic yield, safety, and risk factors for diagnostic failure of computed tomography (CT) fluoroscopy-guided renal tumour biopsy. Biopsies were performed for 208 tumours (mean diameter 2.3 cm; median diameter 2.1 cm; range 0.9-8.5 cm) in 199 patients. One hundred and ninety-nine tumours were ≤4 cm. All 208 initial procedures were divided into diagnostic success and failure groups. Multiple variables related to the patients, lesions, and procedures were assessed to determine the risk factors for diagnostic failure. After performing 208 initial and nine repeat biopsies, 180 malignancies and 15 benign tumours were pathologically diagnosed, whereas 13 were not diagnosed. In 117 procedures, 118 Grade I and one Grade IIIa adverse events (AEs) occurred. Neither Grade ≥IIIb AEs nor tumour seeding were observed within a median follow-up period of 13.7 months. Logistic regression analysis revealed only small tumour size (≤1.5 cm; odds ratio 3.750; 95% confidence interval 1.362-10.326; P = 0.011) to be a significant risk factor for diagnostic failure. CT fluoroscopy-guided renal tumour biopsy is a safe procedure with a high diagnostic yield. A small tumour size (≤1.5 cm) is a significant risk factor for diagnostic failure. • CT fluoroscopy-guided renal tumour biopsy has a high diagnostic yield. • CT fluoroscopy-guided renal tumour biopsy is safe. • Small tumour size (≤1.5 cm) is a risk factor for diagnostic failure.
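
    A sketch of the risk-factor analysis named above: logistic regression of diagnostic failure on a small-tumour indicator (≤1.5 cm), reporting an odds ratio with a 95% CI. The dataset is simulated, with failure risks chosen arbitrarily.

    ```python
    # Logistic regression odds ratio for diagnostic failure vs small tumour size.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    small = rng.integers(0, 2, 208)                       # 1 = tumour <= 1.5 cm
    p_fail = np.where(small == 1, 0.20, 0.06)             # assumed failure risks
    failure = (rng.random(208) < p_fail).astype(float)

    X = sm.add_constant(small.astype(float))
    fit = sm.Logit(failure, X).fit(disp=0)
    or_small = np.exp(fit.params[1])
    ci = np.exp(fit.conf_int()[1])
    print(f"OR(small tumour) = {or_small:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
    ```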

  3. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best performing pair of quantitative features were chosen based on their ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.

  4. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  5. Evaluation of echocardiography in the management of elderly patients with heart failure.

    PubMed

    Hendry, A; Hacking, L; Langhorne, P; Vallance, R; MacDonald, J

    1999-09-01

    To determine the validity of a clinical diagnosis of systolic dysfunction in elderly patients with heart failure and assess the contribution of echocardiography to their management. 61 elderly patients with a diagnosis of heart failure in a geriatric assessment unit setting. Prospective study determining sensitivity, specificity and predictive values of a clinical and radiological diagnosis compared with echocardiographic standard. Proposed management was compared before and after echocardiography. Clinical assessment was highly sensitive (93%) but lacked specificity (32%). Combining radiological and clinical diagnoses increased specificity to 58%. Echocardiography revised the lead cardiac diagnosis for 28% of patients and influenced patient management plans for 41%. For elderly patients with heart failure, echocardiography improves diagnostic accuracy and identifies those patients with potential to benefit from angiotensin-converting enzyme inhibitors.

  6. Quantitative evaluation of protocorm growth and fungal colonization in Bletilla striata (Orchidaceae) reveals less-productive symbiosis with a non-native symbiotic fungus.

    PubMed

    Yamamoto, Tatsuki; Miura, Chihiro; Fuji, Masako; Nagata, Shotaro; Otani, Yuria; Yagame, Takahiro; Yamato, Masahide; Kaminaka, Hironori

    2017-02-21

    In nature, orchid plants depend completely on symbiotic fungi for their nutrition at the germination and subsequent seedling (protocorm) stages. However, only limited quantitative methods for evaluating the orchid-fungus interactions at the protocorm stage are currently available, which greatly constrains our understanding of the symbiosis. Here, we aimed to improve and integrate quantitative evaluations of the growth and fungal colonization in the protocorms of a terrestrial orchid, Bletilla striata, growing on a plate medium. We achieved both symbiotic and asymbiotic germination for the terrestrial orchid B. striata. The protocorms produced by the two germination methods grew almost synchronously for the first three weeks. At week four, however, the length was significantly lower in the symbiotic protocorms. Interestingly, the dry weight of symbiotic protocorms did not significantly change during the growth period, which implies that there was only limited transfer of carbon compounds from the fungus to the protocorms in this relationship. Next, to evaluate the orchid-fungus interactions, we developed an ink-staining method to observe the hyphal coils in protocorms without preparing thin sections. Crushing the protocorm under the coverglass enables us to observe all hyphal coils in the protocorms with high resolution. For this observation, we established a criterion to categorize the stages of hyphal coils, depending on development and degradation. By counting the symbiotic cells within each stage, it was possible to quantitatively evaluate the orchid-fungus symbiosis. We describe a method for quantitative evaluation of orchid-fungus symbiosis by integrating measurements of plant growth and fungal colonization. The current study revealed that although fungal colonization was observed in the symbiotic protocorms, the weight of the protocorms did not significantly increase, which is probably due to the incompatibility of the fungus in this symbiosis. These

  7. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-01

    The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
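
    The abstract does not reproduce the derivation, but a natural Bayesian form for a flaw-size PDF that incorporates a model-assisted probability of detection (a sketch of the standard construction, not necessarily the authors' exact expressions) is

\[
p(a \mid \hat{a}, \text{indication}) \propto \mathrm{POD}(a)\, p(\hat{a} \mid a)\, p_0(a),
\qquad
p(a \mid \text{no indication}) \propto \bigl[1 - \mathrm{POD}(a)\bigr]\, p_0(a),
\]

    where \(a\) is the true flaw size, \(\hat{a}\) the ultrasonic sizing estimate, and \(p_0(a)\) a prior flaw-size distribution; the two cases mirror the two inspection scenarios described above.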

  8. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging

    PubMed Central

    Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A.; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-01-01

    Background Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. Methods We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow’s disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated with MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. Results On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Conclusions Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment.

  9. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging.

    PubMed

    Sturla, Francesco; Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-04-01

    Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated with MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment.

  10. Closed-Loop Evaluation of an Integrated Failure Identification and Fault Tolerant Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine; Khong, Thuan

    2006-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems developed for failure detection, identification, and reconfiguration, as well as upset recovery, need to be evaluated over broad regions of the flight envelope or under extreme flight conditions, and should include various sources of uncertainty. To apply formal robustness analysis, formulation of linear fractional transformation (LFT) models of complex parameter-dependent systems is required, which represent system uncertainty due to parameter uncertainty and actuator faults. This paper describes a detailed LFT model formulation procedure from the nonlinear model of a transport aircraft by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The closed-loop system is evaluated over the entire flight envelope based on the generated LFT model which can cover nonlinear dynamics. The robustness analysis results of the closed-loop fault tolerant control system of a transport aircraft are presented. A reliable flight envelope (safe flight regime) is also calculated from the robust performance analysis results, over which the closed-loop system can achieve the desired performance of command tracking and failure detection.
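
    For reference, the (upper) linear fractional transformation at the heart of such robustness analyses has the textbook form: with the known dynamics partitioned as \(M = \begin{pmatrix} M_{11} & M_{12} \\ M_{21} & M_{22} \end{pmatrix}\) and the parameter uncertainties and fault effects collected in a structured block \(\Delta\),

\[
\mathcal{F}_u(M, \Delta) = M_{22} + M_{21}\,\Delta\,(I - M_{11}\Delta)^{-1} M_{12},
\]

    so that robustness tools (e.g., \(\mu\)-analysis) can sweep \(\Delta\) over its admissible set. This is the standard definition, not a detail specific to the NASA Langley tool described above.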

  11. Quantitative evaluation of haze formation of koji and progression of internal haze by drying of koji during koji making.

    PubMed

    Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi

    2017-07-01

    The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using NAB and quantitative evaluation of haze formation in a batch by image analysis is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  12. Building a Database for a Quantitative Model

    NASA Technical Reports Server (NTRS)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Not so obviously, entering data this way does not aid in linking the Basic Events to the data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
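
    The linking idea is simple enough to sketch. The snippet below is a minimal illustration in Python/pandas with hypothetical column names and rates (the abstract describes an Excel workbook, but the join is the same in any tabular tool):

```python
import pandas as pd

# Each Basic Event carries a metadata key pointing at its data source.
basic_events = pd.DataFrame({
    "event_id":   ["BE-001", "BE-002", "BE-003"],
    "source_key": ["SRC-17", "SRC-04", "SRC-17"],  # shared metadata field
})

# The data-source table holds the cited rate and any manipulations applied.
data_sources = pd.DataFrame({
    "source_key":    ["SRC-04", "SRC-17"],
    "base_rate":     [2.0e-6, 5.0e-7],   # failures per hour (illustrative)
    "stress_factor": [1.5, 1.0],         # use/maintenance, dormancy, etc.
})

# The join makes every Basic Event traceable to its source and adjustments.
linked = basic_events.merge(data_sources, on="source_key", how="left")
linked["adjusted_rate"] = linked["base_rate"] * linked["stress_factor"]
print(linked)
```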

  13. A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.

    PubMed

    Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui

    2017-10-01

    Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI) contaminated sites; ii) crude oil contaminated seawater collected after the Jiaozhou Bay oil spill which occurred in 2013. The chromium(VI) contaminated soils were pretreated by water extraction, and directly exposed to the bioreporter in two phases: aqueous soil extraction (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil particle fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity surrounding the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as the mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples. Copyright © 2017. Published by Elsevier Ltd.

  14. First-Ply-Failure Performance of Composite Clamped Spherical Shells

    NASA Astrophysics Data System (ADS)

    Ghosh, A.; Chakravorty, D.

    2018-05-01

    Failure studies of composites are available for plates, but a survey of the literature on shells reveals that similar reports on them are very limited in number. The aim of this work was to investigate the first-ply failure of industrially and aesthetically important spherical shells under uniform loadings. Apart from solving benchmark problems, numerical experiments were carried out with different variations of their parameters to obtain the first-ply-failure stresses by using the finite-element method. The load was increased in steps, and the lamina strains and stresses were put into well-established failure criteria to evaluate the first-ply-failure stress, the failed ply, the point of initiation of failure, and failure modes and tendencies. The results obtained are analyzed to extract the points of engineering significance.

  15. Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1989-09-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.

  16. Relief and Recurrence of Congestion During and After Hospitalization for Acute Heart Failure: Insights From Diuretic Optimization Strategy Evaluation in Acute Decompensated Heart Failure (DOSE-AHF) and Cardiorenal Rescue Study in Acute Decompensated Heart Failure (CARRESS-HF).

    PubMed

    Lala, Anuradha; McNulty, Steven E; Mentz, Robert J; Dunlay, Shannon M; Vader, Justin M; AbouEzzeddine, Omar F; DeVore, Adam D; Khazanie, Prateeti; Redfield, Margaret M; Goldsmith, Steven R; Bart, Bradley A; Anstrom, Kevin J; Felker, G Michael; Hernandez, Adrian F; Stevenson, Lynne W

    2015-07-01

    Congestion is the most frequent cause for hospitalization in acute decompensated heart failure. Although decongestion is a major goal of acute therapy, it is unclear how the clinical components of congestion (eg, peripheral edema, orthopnea) contribute to outcomes after discharge or how well decongestion is maintained. A post hoc analysis was performed of 496 patients enrolled in the Diuretic Optimization Strategy Evaluation in Acute Decompensated Heart Failure (DOSE-AHF) and Cardiorenal Rescue Study in Acute Decompensated Heart Failure (CARRESS-HF) trials during hospitalization with acute decompensated heart failure and clinical congestion. A simple orthodema congestion score was generated based on symptoms of orthopnea (≥2 pillows=2 points, <2 pillows=0 points) and peripheral edema (trace=0 points, moderate=1 point, severe=2 points) at baseline, discharge, and 60-day follow-up. Orthodema scores were classified as absent (score of 0), low-grade (score of 1-2), and high-grade (score of 3-4), and the association with death, rehospitalization, or unscheduled medical visits through 60 days was assessed. At baseline, 65% of patients had high-grade orthodema and 35% had low-grade orthodema. At discharge, 52% of patients were free from orthodema (score=0), and these patients had lower 60-day rates of death, rehospitalization, or unscheduled visits (50%) compared with those with low-grade or high-grade orthodema (52% and 68%, respectively; P=0.038). Of the patients without orthodema at discharge, 27% relapsed to low-grade orthodema and 38% to high-grade orthodema at 60-day follow-up. Increased severity of congestion by a simple orthodema assessment is associated with increased morbidity and mortality. Despite intent to relieve congestion, current therapy often fails to relieve orthodema during hospitalization or to prevent recurrence after discharge. URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00608491, NCT00577135. © 2015 American Heart Association, Inc.
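
    The scoring rule is explicit enough to transcribe directly; a minimal sketch, using only the categories the abstract names:

```python
def orthodema_score(pillows: int, edema: str) -> int:
    """Composite congestion score as stated in the abstract: orthopnea
    (>=2 pillows = 2 points, <2 pillows = 0) plus peripheral edema
    (trace = 0, moderate = 1, severe = 2)."""
    orthopnea_pts = 2 if pillows >= 2 else 0
    edema_pts = {"trace": 0, "moderate": 1, "severe": 2}[edema]
    return orthopnea_pts + edema_pts

def classify(score: int) -> str:
    if score == 0:
        return "absent"
    return "low-grade" if score <= 2 else "high-grade"

# A patient sleeping on 3 pillows with moderate edema:
s = orthodema_score(3, "moderate")  # 2 + 1 = 3
print(s, classify(s))               # 3 high-grade
```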

  17. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    ERIC Educational Resources Information Center

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  18. Quantitative troponin and death, cardiogenic shock, cardiac arrest and new heart failure in patients with non-ST-segment elevation acute coronary syndromes (NSTE ACS): insights from the Global Registry of Acute Coronary Events.

    PubMed

    Jolly, Sanjit S; Shenkman, Heather; Brieger, David; Fox, Keith A; Yan, Andrew T; Eagle, Kim A; Steg, P Gabriel; Lim, Ki-Dong; Quill, Ann; Goodman, Shaun G

    2011-02-01

    The objective of this study was to determine if the extent of quantitative troponin elevation predicted mortality as well as in-hospital complications of cardiac arrest, new heart failure and cardiogenic shock. 16,318 patients with non-ST-segment elevation acute coronary syndromes (NSTE ACS) from the Global Registry of Acute Coronary Events (GRACE) were included. The maximum 24 h troponin value as a multiple of the local laboratory upper limit of normal was used. The population was divided into five groups based on the degree of troponin elevation, and outcomes were compared. An adjusted analysis was performed using quantitative troponin as a continuous variable with adjustment for known prognostic variables. For each approximate 10-fold increase in the troponin ratio, there was an associated increase in cardiac arrest, sustained ventricular tachycardia (VT) or ventricular fibrillation (VF) (1.0, 2.4, 3.4, 5.9 and 13.4%; p<0.001 for linear trend), cardiogenic shock (0.5, 1.4, 2.0, 4.4 and 12.7%; p<0.001), new heart failure (2.5, 5.1, 7.4, 11.6 and 15.8%; p<0.001) and mortality (0.8, 2.2, 3.0, 5.3 and 14.0%; p<0.001). These findings were replicated using the troponin ratio as a continuous variable and adjusting for covariates (cardiac arrest, sustained VT or VF, OR 1.56, 95% CI 1.39 to 1.74; cardiogenic shock, OR 1.87, 95% CI 1.61 to 2.18; and new heart failure, OR 1.57, 95% CI 1.45 to 1.71). The degree of troponin elevation was predictive of early mortality (HR 1.61, 95% CI 1.44 to 1.81; p<0.001 for days 0-14) and longer term mortality (HR 1.18, 95% CI 1.07 to 1.30, p=0.001 for days 15-180). The extent of troponin elevation is an independent predictor of morbidity and mortality.
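
    As a worked example of how the reported effect sizes apply, the sketch below scales an odds ratio quoted per approximate 10-fold troponin increase to an arbitrary ratio, assuming log-linearity as in the adjusted analysis (the numbers are illustrative, taken from the abstract):

```python
import math

def troponin_ratio(max_troponin_24h: float, upper_limit_normal: float) -> float:
    """Troponin expressed as a multiple of the local lab's upper limit of normal."""
    return max_troponin_24h / upper_limit_normal

def odds_multiplier(ratio: float, or_per_10fold: float) -> float:
    """Scale an odds ratio reported per ~10-fold troponin increase to an
    arbitrary ratio, assuming log-linearity in log10(ratio)."""
    return or_per_10fold ** math.log10(ratio)

# Cardiogenic shock: OR 1.87 per 10-fold increase; troponin at 50x ULN:
print(odds_multiplier(troponin_ratio(50.0, 1.0), 1.87))  # ~2.9
```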

  19. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.

  20. Acute Respiratory Failure in Cardiac Transplant Recipients.

    PubMed

    Komurcu, Ozgur; Ozdemirkan, Aycan; Camkiran Firat, Aynur; Zeyneloglu, Pinar; Sezgin, Atilla; Pirat, Arash

    2015-11-01

    This study sought to evaluate the incidence, risk factors, and outcomes of acute respiratory failure in cardiac transplant recipients. Cardiac transplant recipients >15 years of age and readmitted to the intensive care unit after cardiac transplant between 2005 and 2015 were included. Thirty-nine patients were included in the final analyses. Patients with acute respiratory failure and without acute respiratory failure were compared. The most frequent causes of readmission were routine intensive care unit follow-up after endomyocardial biopsy, heart failure, sepsis, and pneumonia. Patients who were readmitted to the intensive care unit were further divided into 2 groups based on presence of acute respiratory failure. Patients' ages and body weights did not differ between groups. The groups were not different in terms of comorbidities. The admission sequential organ failure assessment scores were higher in patients with acute respiratory failure. Patients with acute respiratory failure were more likely to use bronchodilators and n-acetylcysteine before readmission. Mean peak inspiratory pressures were higher in patients in acute respiratory failure. Patients with acute respiratory failure developed sepsis more frequently and they were more likely to have hypotension. Patients with acute respiratory failure had higher values of serum creatinine before admission to intensive care unit and in the first day of intensive care unit. Patients with acute respiratory failure had more frequent bilateral opacities on chest radiographs and positive blood and urine cultures. Duration of intensive care unit and hospital stays were not statistically different between groups. Mortality in patients with acute respiratory failure was 76.5% compared with 0% in patients without acute respiratory failure. A significant number of cardiac transplant recipients were readmitted to the intensive care unit. Patients presenting with acute respiratory failure on readmission more frequently

  1. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
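
    A minimal numerical sketch of the DPDs computation, assuming phase maps in radians and the usual QPI dry-mass relation with a textbook specific refraction increment (all constants below are assumptions, not values from this paper):

```python
import numpy as np

WAVELENGTH_UM = 0.65       # assumed illumination wavelength (um)
ALPHA_UM3_PER_PG = 0.18    # typical specific refraction increment (um^3/pg)
PIXEL_AREA_UM2 = 0.1       # assumed effective pixel area (um^2)

def dynamic_phase_difference(phi_prev: np.ndarray, phi_next: np.ndarray) -> np.ndarray:
    """Subtract the antecedent from the subsequent phase image."""
    return phi_next - phi_prev

def mass_change_pg(dpd: np.ndarray) -> float:
    """Net dry-mass change (pg) implied by a phase-difference map,
    via m = (lambda / (2*pi*alpha)) * sum(phi) * pixel_area."""
    coeff = WAVELENGTH_UM / (2 * np.pi * ALPHA_UM3_PER_PG)
    return float(coeff * dpd.sum() * PIXEL_AREA_UM2)

phi0 = np.zeros((4, 4))
phi1 = np.full((4, 4), 0.5)  # uniform 0.5 rad phase increase
print(mass_change_pg(dynamic_phase_difference(phi0, phi1)))  # ~0.46 pg
```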

  2. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.

  3. Tensile failure criteria for fiber composite materials

    NASA Technical Reports Server (NTRS)

    Rosen, B. W.; Zweben, C. H.

    1972-01-01

    The analysis provides insight into the failure mechanics of these materials and defines criteria which serve as tools for preliminary design material selection and for material reliability assessment. The model incorporates both dispersed and propagation type failures and includes the influence of material heterogeneity. The important effects of localized matrix damage and post-failure matrix shear stress transfer are included in the treatment. The model is used to evaluate the influence of key parameters on the failure of several commonly used fiber-matrix systems. Analyses of three possible failure modes were developed. These modes are the fiber break propagation mode, the cumulative group fracture mode, and the weakest link mode. Application of the new model to composite material systems has indicated several results which require attention in the development of reliable structural composites. Prominent among these are the size effect and the influence of fiber strength variability.

  4. [Prediction of mortality in patients with acute hepatic failure].

    PubMed

    Eremeeva, L F; Berdnikov, A P; Musaeva, T S; Zabolotskikh, I B

    2013-01-01

    The article deals with a study of 243 patients (from 18 to 65 years old) with acute hepatic failure. The purpose of the study was to evaluate the predictive capability of the severity scales APACHE III, SOFA, MODS, and Child-Pugh and to identify mortality predictors in patients with acute hepatic failure. Results: The APACHE III and SOFA scales had the best predictive ability in patients with acute hepatic failure and multiple organ failure. The strongest mortality predictors were: serum creatinine > 132 mmol/L, fibrinogen < 1.4 g/L, Na < 129 mmol/L.

  5. Nonlinear viscoelasticity and generalized failure criterion for biopolymer gels

    NASA Astrophysics Data System (ADS)

    Divoux, Thibaut; Keshavarz, Bavand; Manneville, Sébastien; McKinley, Gareth

    2016-11-01

    Biopolymer gels display a multiscale microstructure that is responsible for their solid-like properties. Upon external deformation, these soft viscoelastic solids exhibit a generic nonlinear mechanical response characterized by pronounced stress- or strain-stiffening prior to irreversible damage and failure, most often through macroscopic fractures. Here we show on a model acid-induced protein gel that the nonlinear viscoelastic properties of the gel can be described in terms of a 'damping function' which predicts the gel mechanical response quantitatively up to the onset of macroscopic failure. Using a nonlinear integral constitutive equation built upon the experimentally-measured damping function in conjunction with the power-law linear viscoelastic response, we derive the form of the stress growth in the gel following the start-up of steady shear. We also couple the shear stress response with Bailey's durability criteria for brittle solids in order to predict the critical values of the stress σc and strain γc for failure of the gel, and how they scale with the applied shear rate. This provides a generalized failure criterion for biopolymer gels in a range of different deformation histories. This work was funded by the MIT-France seed fund and by the CNRS PICS-USA scheme (#36939). BK acknowledges financial support from Axalta Coating Systems.
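
    In generic Wagner/K-BKZ form, the kind of nonlinear integral constitutive equation described here reads (our paraphrase of the standard model, with the power-law kernel implied by the text):

\[
\sigma(t) = \int_{-\infty}^{t} m(t-t')\, h\bigl(\gamma(t,t')\bigr)\, \gamma(t,t')\, \mathrm{d}t',
\qquad
m(s) = -\frac{\mathrm{d}G}{\mathrm{d}s}, \quad G(s) = S\, s^{-n},
\]

    where \(h(\gamma)\) is the experimentally measured damping function, \(\gamma(t,t')\) the strain accumulated between \(t'\) and \(t\), and \(G\) a power-law (critical-gel) relaxation modulus with strength \(S\) and exponent \(n\). For start-up of steady shear at rate \(\dot\gamma\), \(\gamma(t,t') = \dot\gamma\,(t-t')\), which yields the stress-growth curve referred to above.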

  6. Evaluation of possible prognostic factors for the success, survival, and failure of dental implants.

    PubMed

    Geckili, Onur; Bilhan, Hakan; Geckili, Esma; Cilingir, Altug; Mumcu, Emre; Bural, Canan

    2014-02-01

    To analyze the prognostic factors that are associated with the success, survival, and failure rates of dental implants. Data including implant size, insertion time, implant location, and prosthetic treatment were collected for 1656 implants, and the association of these factors with the success, survival, and failure of implants was analyzed. The success rate was lower for short and maxillary implants. The failure rate of maxillary implants exceeded that of mandibular implants, and the failure rate of implants placed in the maxillary anterior region was significantly higher than in other regions. The failure rates of implants placed 5 or more years earlier were higher than those of implants placed more recently. The anterior maxilla is more critical for implant loss than other sites. Implants in the anterior mandible show better success compared with other locations, and longer implants show better success rates. The learning curve of the clinician influences the survival and success rates of dental implants.

  7. The Use of Mouse Models of Breast Cancer and Quantitative Image Analysis to Evaluate Hormone Receptor Antigenicity after Microwave-assisted Formalin Fixation

    PubMed Central

    Engelberg, Jesse A.; Giberson, Richard T.; Young, Lawrence J.T.; Hubbard, Neil E.

    2014-01-01

    Microwave methods of fixation can dramatically shorten fixation times while preserving tissue structure; however, it remains unclear if adequate tissue antigenicity is preserved. To assess and validate antigenicity, robust quantitative methods and animal disease models are needed. We used two mouse mammary models of human breast cancer to evaluate microwave-assisted and standard 24-hr formalin fixation. The mouse models expressed four antigens prognostic for breast cancer outcome: estrogen receptor, progesterone receptor, Ki67, and human epidermal growth factor receptor 2. Using pathologist evaluation and novel methods of quantitative image analysis, we measured and compared the quality of antigen preservation, percentage of positive cells, and line plots of cell intensity. Visual evaluations by pathologists established that the amounts and patterns of staining were similar in tissues fixed by the different methods. The results of the quantitative image analysis provided a fine-grained evaluation, demonstrating that tissue antigenicity is preserved in tissues fixed using microwave methods. Evaluation of the results demonstrated that a 1-hr, 150-W fixation is better than a 45-min, 150-W fixation followed by a 15-min, 650-W fixation. The results demonstrated that microwave-assisted formalin fixation can standardize fixation times to 1 hr and produce immunohistochemistry that is in every way commensurate with longer conventional fixation methods. PMID:24682322

  8. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by combined use of multi-component simultaneous quantitative analysis and bioassay. The rhubarb dispensing granules were used as the model drug for demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneously quantitative determination of the 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; purgative biopotency of different batches of rhubarb dispensing granules was determined based on compound diphenoxylate tablets-induced mouse constipation model; blood activating biopotency of different batches of rhubarb dispensing granules was determined based on in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between 10 anthraquinone derivatives and purgative biopotency, blood activating biopotency. The results of multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and more comprehensive reflection on overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  9. Quantitation of aortic and mitral regurgitation in the pediatric population: evaluation by radionuclide angiocardiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurwitz, R.A.; Treves, S.; Freed, M.

    The ability to quantitate aortic (AR) or mitral regurgitation (MR), or both, by radionuclide angiocardiography was evaluated in children and young adults at rest and during isometric exercise. Regurgitation was estimated by determining the ratio of left ventricular stroke volume to right ventricular stroke volume obtained during equilibrium ventriculography. The radionuclide measurement was compared with results of cineangiography, with good correlation between both studies in 47 of 48 patients. Radionuclide stroke volume ratio was used to classify severity: the group with equivocal regurgitation differed from the group with mild regurgitation (p less than 0.02); patients with mild regurgitation differed from those with moderate regurgitation (p less than 0.001); and those with moderate regurgitation differed from those with severe regurgitation (p less than 0.01). The stroke volume ratio was responsive to isometric exercise, remaining constant or increasing in 16 of 18 patients. After surgery to correct regurgitation, the stroke volume ratio significantly decreased from preoperative measurements in all 7 patients evaluated. Results from the present study demonstrate that a stroke volume ratio greater than 2.0 is compatible with moderately severe regurgitation and that a ratio greater than 3.0 suggests the presence of severe regurgitation. Thus, radionuclide angiocardiography should be useful for noninvasive quantitation of AR or MR, or both, helping define the course of young patients with left-side valvular regurgitation.
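
    The decision rule quoted here is easy to make concrete; the only cutoffs the abstract supports are the two below, so the labels for ratios under 2.0 are deliberately coarse:

```python
def stroke_volume_ratio(lv_stroke_volume: float, rv_stroke_volume: float) -> float:
    """LV/RV stroke volume ratio from equilibrium radionuclide ventriculography."""
    return lv_stroke_volume / rv_stroke_volume

def regurgitation_severity(ratio: float) -> str:
    """Cutoffs from the abstract: >2.0 moderately severe, >3.0 severe."""
    if ratio > 3.0:
        return "severe"
    if ratio > 2.0:
        return "moderately severe"
    return "lesser degrees (equivocal to moderate)"

print(regurgitation_severity(stroke_volume_ratio(95.0, 38.0)))  # 2.5 -> moderately severe
```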

  10. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or the risk-of-failure probability. While the consequences of failure are often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.

  11. A Weibull distribution accrual failure detector for cloud computing.

    PubMed

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are used to build high availability distributed systems as the fundamental component. To meet the requirement of a complicated large-scale distributed system, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on Weibull Distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
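
    A minimal sketch of a Weibull-based accrual detector in the general phi-accrual style (our reading of the approach; class and parameter names are illustrative, not the paper's):

```python
import math
from collections import deque
from scipy.stats import weibull_min

class WeibullAccrualDetector:
    """Suspicion grows continuously as a heartbeat becomes overdue,
    with inter-arrival times modeled by a fitted Weibull distribution."""

    def __init__(self, window: int = 100):
        self.intervals = deque(maxlen=window)  # recent inter-arrival times
        self.last_arrival = None

    def heartbeat(self, now: float) -> None:
        if self.last_arrival is not None:
            self.intervals.append(now - self.last_arrival)
        self.last_arrival = now

    def suspicion(self, now: float) -> float:
        """phi = -log10 P(next heartbeat arrives later than the elapsed time)."""
        if self.last_arrival is None or len(self.intervals) < 10:
            return 0.0
        shape, loc, scale = weibull_min.fit(list(self.intervals), floc=0.0)
        survival = weibull_min.sf(now - self.last_arrival, shape, loc=loc, scale=scale)
        return -math.log10(max(survival, 1e-300))
```

    A process is suspected once the suspicion value crosses a chosen threshold; raising the threshold trades detection speed for fewer false positives.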

  12. A Weibull distribution accrual failure detector for cloud computing

    PubMed Central

    Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are used to build high availability distributed systems as the fundamental component. To meet the requirement of a complicated large-scale distributed system, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on Weibull Distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229

  13. Quantitative evaluation method of the threshold adjustment and the flat field correction performances of hybrid photon counting pixel detectors

    NASA Astrophysics Data System (ADS)

    Medjoubi, K.; Dawiec, A.

    2017-12-01

    A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting pixel (HPC) detectors. This approach is based on the Photon Transfer Curve (PTC) corresponding to the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity and the remnant errors in flat-fielding techniques. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The quantitative evaluation of the FPN, described by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique) and is demonstrated to be useful for evaluating the best settings for obtaining the best image quality from a commercial or an R&D detector.
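
    The abstract does not reproduce the fit function, but for a photon-counting detector with negligible read noise the standard photon-transfer model conveys its shape (a generic sketch, not the authors' exact expression):

\[
\sigma^2(S) = S + (\mathrm{PRNU}\cdot S)^2
\quad\Longrightarrow\quad
\mathrm{SNR}(S) = \frac{S}{\sqrt{S + (\mathrm{PRNU}\cdot S)^2}} \;\longrightarrow\; \frac{1}{\mathrm{PRNU}} \quad (S \to \infty),
\]

    so the flat-field SNR saturates at \(1/\mathrm{PRNU}\); the height of that plateau in the PTC is what separates residual threshold dispersion and flat-fielding errors from photon shot noise.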

  14. Quantitative evaluation of 3D dosimetry for stereotactic volumetric‐modulated arc delivery using COMPASS

    PubMed Central

    Manigandan, Durai; Karrthick, Karukkupalayam Palaniappan; Sambasivaselli, Raju; Senniandavar, Vellaingiri; Ramu, Mahendran; Rajesh, Thiyagarajan; Lutz, Muller; Muthukumaran, Manavalan; Karthikeyan, Nithyanantham; Tejinder, Kataria

    2014-01-01

    The purpose of this study was to evaluate quantitatively the patient‐specific 3D dosimetry tool COMPASS with 2D array MatriXX detector for stereotactic volumetric‐modulated arc delivery. Twenty‐five patients' CT images and RT structures from different sites (brain, head & neck, thorax, abdomen, and spine) were taken from the CyberKnife Multiplan planning system for this study. All these patients underwent radical stereotactic treatment in CyberKnife. For each patient, linac-based volumetric‐modulated arc therapy (VMAT) stereotactic plans were generated in Monaco TPS v3.1 using the Elekta Beam Modulator MLC. Dose prescription was in the range of 5–20 Gy per fraction. Target prescriptions and critical organ constraints were matched to the delivered treatment plans as closely as possible. The quality of each plan was analyzed using the conformity index (CI), conformity number (CN), gradient index (GI), target coverage (TC), and dose to 95% of volume (D95). Monaco Monte Carlo (MC)‐calculated treatment plan delivery accuracy was quantitatively evaluated with COMPASS‐calculated (CCA) dose and COMPASS indirectly measured (CME) dose based on dose‐volume histogram metrics. In order to ascertain the potential of COMPASS 3D dosimetry for stereotactic plan delivery, 2D fluence verification was performed with MatriXX using a MultiCube phantom. Routine quality assurance of absolute point dose verification was performed to check the overall delivery accuracy. Quantitative analyses of dose delivery verification were compared with pass and fail criteria of 3 mm and 3% distance to agreement and dose differences. The gamma passing rate was compared with 2D fluence verification from MatriXX with MultiCube. Comparison of COMPASS reconstructed dose from measured fluence and COMPASS computed dose has shown very good agreement with the TPS calculated dose. Each plan was evaluated based on dose volume parameters for target volumes such as dose at 95% of volume (D95) and average dose. For critical organs dose at 20% of

  15. Improving the Estimates of International Space Station (ISS) Induced K-Factor Failure Rates for On-Orbit Replacement Unit (ORU) Supportability Analyses

    NASA Technical Reports Server (NTRS)

    Anderson, Leif F.; Harrington, Sean P.; Omeke, Ojei, II; Schwaab, Douglas G.

    2009-01-01

    This is a case study on revised estimates of induced failure for International Space Station (ISS) on-orbit replacement units (ORUs). We devise a heuristic to leverage operational experience data by aggregating ORU, associated function (vehicle sub-system), and vehicle 'effective' k-factors using actual failure experience. With this input, we determine a significant failure threshold and minimize the difference between the actual and predicted failure rates. We conclude with a discussion on both qualitative and quantitative improvements of the heuristic methods and potential benefits to ISS supportability engineering analysis.

  16. The failure of earthquake failure models

    USGS Publications Warehouse

    Gomberg, J.

    2001-01-01

    In this study I show that simple heuristic models and numerical calculations suggest that an entire class of commonly invoked models of earthquake failure processes cannot explain triggering of seismicity by transient or "dynamic" stress changes, such as stress changes associated with passing seismic waves. The models of this class have the common feature that the physical property characterizing failure increases at an accelerating rate when a fault is loaded (stressed) at a constant rate. Examples include models that invoke rate state friction or subcritical crack growth, in which the properties characterizing failure are slip or crack length, respectively. Failure occurs when the rate at which these grow accelerates to values exceeding some critical threshold. These accelerating failure models do not predict the finite durations of dynamically triggered earthquake sequences (e.g., at aftershock or remote distances). Some of the failure models belonging to this class have been used to explain static stress triggering of aftershocks. This may imply that the physical processes underlying dynamic triggering differs or that currently applied models of static triggering require modification. If the former is the case, we might appeal to physical mechanisms relying on oscillatory deformations such as compaction of saturated fault gouge leading to pore pressure increase, or cyclic fatigue. However, if dynamic and static triggering mechanisms differ, one still needs to ask why static triggering models that neglect these dynamic mechanisms appear to explain many observations. If the static and dynamic triggering mechanisms are the same, perhaps assumptions about accelerating failure and/or that triggering advances the failure times of a population of inevitable earthquakes are incorrect.

  17. Quantifying Pilot Contribution to Flight Safety during Drive Shaft Failure

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Etherington, Tim; Last, Mary Carolyn; Bailey, Randall E.; Kennedy, Kellie D.

    2017-01-01

    Accident statistics cite the flight crew as a causal factor in over 60% of large transport aircraft fatal accidents. Yet, a well-trained and well-qualified pilot is acknowledged as the critical center point of aircraft systems safety and an integral safety component of the entire commercial aviation system. The latter statement, while generally accepted, cannot be verified because little or no quantitative data exists on how and how many accidents/incidents are averted by crew actions. A joint NASA/FAA high-fidelity motion-base simulation experiment specifically addressed this void by collecting data to quantify the human (pilot) contribution to safety-of-flight and the methods they use in today's National Airspace System. A human-in-the-loop test was conducted using the FAA's Oklahoma City Flight Simulation Branch Level D-certified B-737-800 simulator to evaluate the pilot's contribution to safety-of-flight during routine air carrier flight operations and in response to aircraft system failures. These data are fundamental to and critical for the design and development of future increasingly autonomous systems that can better support the human in the cockpit. Eighteen U.S. airline crews flew various normal and non-normal procedures over a two-day period and their actions were recorded in response to failures. To quantify the human's contribution to safety of flight, crew complement was used as the experiment independent variable in a between-subjects design. Pilot actions and performance during single pilot and reduced crew operations were measured for comparison against the normal two-crew complement during normal and non-normal situations. This paper details the crew's actions, including decision-making, and responses while dealing with a drive shaft failure - one of 6 non-normal events that were simulated in this experiment.

  18. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
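
    The %Move statistic and its cutoff are simple to apply once movement extent and limb size have been measured; how the movement extent is extracted from the image is not specified in the abstract, so both inputs in this sketch are placeholders:

```python
def percent_move(movement_extent_mm: float, limb_size_mm: float) -> float:
    """Movement artifact expressed as a percentage of limb size
    (placeholder definitions; the study's exact measurements may differ)."""
    return 100.0 * movement_extent_mm / limb_size_mm

def scan_is_viable(pmove: float, threshold: float = 25.0) -> bool:
    """Apply the 25% cutoff the authors suggest for diaphyseal scans."""
    return pmove < threshold

print(scan_is_viable(percent_move(8.0, 40.0)))  # 20% -> True (viable)
```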

  19. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875

  20. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system has the potential to objectively evaluate the neurodevelopmental delay of children with ADHD.

  1. Liquefaction, flow, and associated ground failure

    USGS Publications Warehouse

    Youd, T. Leslie

    1973-01-01

    Ambiguities in the use of the term liquefaction and in defining the relation between liquefaction and ground failure have led to encumbered communication between workers in various fields and between specialists in the same field, and the possibility that evaluations of liquefaction potential could be misinterpreted or misapplied. Explicit definitions of liquefaction and related concepts are proposed herein. These definitions, based on observed laboratory behavior, are then used to clarify the relation between liquefaction and ground failure. Soil liquefaction is defined as the transformation of a granular material from a solid into a liquefied state as a consequence of increased pore-water pressures. This definition avoids confusion between liquefaction and possible flow-failure conditions after liquefaction. Flow-failure conditions are divided into two types: (1) unlimited flow if pore-pressure reductions caused by dilatancy during flow deformation are not sufficient to solidify the material and thus arrest flow, and (2) limited flow if they are sufficient to solidify the material after a finite deformation. After liquefaction in the field, unlimited flow commonly leads to flow landslides, whereas limited flow leads at most to lateral-spreading landslides. Quick-condition failures such as loss of bearing capacity form a third type of ground failure associated with liquefaction.

  2. Kidney Failure

    MedlinePlus

    Kidney Failure (ESRD): causes, symptoms, and treatments (www.kidneyfund.org). What causes kidney failure? In most cases, kidney failure is caused ...

  3. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    PubMed

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
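
    To make the abstract's complexity claims concrete, here is a plain-Python analogue of a keyed quantitative streaming query with constant per-item work and bounded memory; it is not the StreamQRE language or its Java API, just a sketch of the kind of computation such queries compile to.

    ```python
    from collections import defaultdict, deque

    class KeyedSlidingMean:
        """Per-key mean over the last `window` items, updated incrementally.

        A plain-Python analogue of a quantitative streaming query --
        NOT the StreamQRE language or its Java API.
        """

        def __init__(self, window: int = 100):
            self.window = window
            self.buffers = defaultdict(deque)
            self.sums = defaultdict(float)

        def update(self, key, value) -> float:
            buf = self.buffers[key]
            buf.append(value)
            self.sums[key] += value
            if len(buf) > self.window:           # evict the oldest item
                self.sums[key] -= buf.popleft()
            return self.sums[key] / len(buf)     # O(1) work per item, bounded memory

    q = KeyedSlidingMean(window=3)
    for key, v in [("sensorA", 2.0), ("sensorA", 4.0), ("sensorB", 10.0), ("sensorA", 6.0)]:
        print(key, q.update(key, v))
    ```

    Each update touches only one key's bounded buffer, mirroring the per-item time and total memory bounds the paper's compilation algorithm guarantees.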

  4. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
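
    A minimal sketch of the kind of time-domain and spectral descriptors the report reviews; the particular feature set below (RMS, moment statistics, spectral centroid and spread) is an assumed selection, not the report's exact implementation.

    ```python
    import numpy as np
    from scipy import signal, stats

    def failure_features(x, fs):
        """Simple time- and frequency-domain descriptors of a vibration signal."""
        f, pxx = signal.welch(x, fs=fs)        # power spectral density
        p = pxx / pxx.sum()                    # normalize to a distribution over frequency
        centroid = float((f * p).sum())
        return {
            "rms": float(np.sqrt(np.mean(x ** 2))),
            "kurtosis": float(stats.kurtosis(x)),  # spikiness, a classic incipient-defect cue
            "skewness": float(stats.skew(x)),
            "spectral_centroid": centroid,
            "spectral_spread": float(np.sqrt((((f - centroid) ** 2) * p).sum())),
        }

    rng = np.random.default_rng(0)
    print(failure_features(rng.normal(size=4096), fs=1000.0))
    ```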

  5. The Spectrum of Renal Allograft Failure

    PubMed Central

    Chand, Sourabh; Atkinson, David; Collins, Clare; Briggs, David; Ball, Simon; Sharif, Adnan; Skordilis, Kassiani; Vydianath, Bindu; Neil, Desley; Borrows, Richard

    2016-01-01

    Background Causes of “true” late kidney allograft failure remain unclear, as study selection bias and limited follow-up risk incomplete representation of the spectrum. Methods We evaluated all unselected graft failures from 2008–2014 (n = 171; 0–36 years post-transplantation) by contemporary classification of indication biopsies “proximate” to failure, DSA assessment, clinical and biochemical data. Results The spectrum of graft failure changed markedly depending on the timing of allograft failure. Failures within the first year were most commonly attributed to technical failure or acute rejection (with T-cell-mediated rejection [TCMR] dominating over antibody-mediated rejection [ABMR]). Failures beyond a year were increasingly dominated by ABMR and ‘interstitial fibrosis with tubular atrophy’ without rejection, infection or recurrent disease (“IFTA”). Cases of IFTA associated with inflammation in non-scarred areas (compared with no inflammation or inflammation solely within scarred regions) were more commonly associated with episodes of prior rejection, late rejection and nonadherence, pointing to an alloimmune aetiology. Nonadherence and late rejection were common in ABMR and TCMR, particularly Acute Active ABMR. Acute Active ABMR and nonadherence were associated with younger age, faster functional decline, and less hyalinosis on biopsy. Chronic and Chronic Active ABMR were more commonly associated with Class II DSA. C1q-binding DSA, detected in 33% of ABMR episodes, were associated with shorter time to graft failure. Most non-biopsied patients were DSA-negative (16/21; 76.1%). Finally, twelve losses to recurrent disease were seen (16%). Conclusion These data from an unselected population identify IFTA alongside ABMR as a very important cause of true late graft failure, with nonadherence-associated TCMR as a phenomenon in some patients. It highlights clinical and immunological characteristics of ABMR subgroups, and should inform clinical practice and

  6. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-01-01

    Simulations using IPCC-class climate models are subject to failures or crashes for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
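
    A minimal scikit-learn sketch of the approach described (an SVM classifier predicting crashes from parameter values, scored by ROC AUC); the synthetic data and the failure rule below stand in for the real 18-parameter POP2 ensemble.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Stand-in for the real ensemble: rows are runs, columns are 18 POP2 parameters;
    # y = 1 marks a crashed simulation (here driven by a made-up failure rule).
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(400, 18))
    y = (X[:, 0] + 0.5 * X[:, 3] > 1.1).astype(int)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"cross-validated AUC ~ {auc:.2f}")
    ```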

  7. Quantitative evaluation of toothbrush and arm-joint motion during tooth brushing.

    PubMed

    Inada, Emi; Saitoh, Issei; Yu, Yong; Tomiyama, Daisuke; Murakami, Daisuke; Takemoto, Yoshihiko; Morizono, Ken; Iwasaki, Tomonori; Iwase, Yoko; Yamasaki, Youichi

    2015-07-01

    It is very difficult for dental professionals to objectively assess patients' tooth brushing skill, because no established index exists for assessing patients' brushing motion. The purpose of this study was to quantitatively evaluate toothbrush and arm-joint motion during tooth brushing. Tooth brushing motion, performed by dental hygienists for 15 s, was captured using a motion-capture system that continuously calculates the three-dimensional coordinates of an object's motion relative to the floor. The dental hygienists performed the tooth brushing on the buccal and palatal sides of their right and left upper molars. The frequencies and power spectra of toothbrush motion and joint angles of the shoulder, elbow, and wrist were calculated and analyzed statistically. The frequency of toothbrush motion was higher on the left side (both buccal and palatal areas) than on the right side. There were no significant differences among joint angle frequencies within each brushing area. The inter- and intra-individual variations of the power spectrum of the elbow flexion angle when brushing were smaller than for any of the other angles. This study quantitatively confirmed that dental hygienists have individual distinctive rhythms during tooth brushing. All arm joints moved synchronously during brushing, and tooth brushing motion was controlled by coordinated movement of the joints. The elbow generated an individual's frequency through a stabilizing movement. The shoulder and wrist control the hand motion, and the elbow generates the cyclic rhythm during tooth brushing.

  8. Evaluation Aspects of Building Structures Reconstructed After a Failure or Catastrophe

    NASA Astrophysics Data System (ADS)

    Krentowski, Janusz R.; Knyziak, Piotr

    2017-10-01

    The article presents the characteristics of several steel structures, among others a modernized industrial dye house, a school sports hall, and a truck repair workshop, that have been rebuilt after a failure or catastrophe. The structures were analyzed in detail, and the evaluation and reconstruction processes were described. The emergencies that occurred during the service life of the buildings were the result of multiple mistakes: incorrectly defined intervals between inspections, errors during periodic inspections, and incorrect repair work recommendations. The reinforcement concepts implemented by the authors, which enable long-term failure-free operation of the structures, are presented. Recommendations for monitoring of the facilities, applied after reinforcement or reconstruction, have been formulated. The methodology for the implementation of specialized investigations, such as geodetic, optical, geological, chemical, and strength tests, both destructive and non-destructive, has been defined. The need to determine limit values of deformations, deflections, damage or other faults of structural elements and of entire rebuilt facilities, as well as to define conditions for withdrawing a structure from service in subsequent exceptional situations, was indicated.

  9. Predicting Failure Under Laboratory Conditions: Learning the Physics of Slow Frictional Slip and Dynamic Failure

    NASA Astrophysics Data System (ADS)

    Rouet-Leduc, B.; Hulbert, C.; Riviere, J.; Lubbers, N.; Barros, K.; Marone, C.; Johnson, P. A.

    2016-12-01

    Forecasting failure is a primary goal in diverse domains that include earthquake physics, materials science, nondestructive evaluation of materials and other engineering applications. Due to the highly complex physics of material failure and limitations on gathering data in the failure nucleation zone, this goal has often appeared out of reach; however, recent advances in instrumentation sensitivity, instrument density and data analysis show promise toward forecasting failure times. Here, we show that we can predict frictional failure times of both slow and fast stick slip failure events in the laboratory. This advance is made possible by applying a machine learning approach known as Random Forests (RF) [1] to the continuous acoustic emission (AE) time series recorded by detectors located on the fault blocks. The RF is trained using a large number of statistical features derived from the AE time series signal. The model is then applied to data not previously analyzed. Remarkably, we find that the RF method predicts upcoming failure time far in advance of a stick slip event, based only on a short time window of data. Further, the algorithm accurately predicts the time of the beginning and end of the next slip event. The predicted time improves as failure approaches, as additional data features contribute to the prediction. Our results show robust predictions of slow and dynamic failure based on acoustic emissions from the fault zone throughout the laboratory seismic cycle. The predictions are based on previously unidentified tremor-like acoustic signals that occur during stress build up and the onset of macroscopic frictional weakening. We suggest that the tremor-like signals carry information about fault zone processes and allow precise predictions of failure at any time in the slow slip or stick slip cycle [2]. If the laboratory experiments represent Earth frictional conditions, it could well be that signals containing highly useful predictive information are currently being missed. [1] Breiman
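
    A toy version of the described workflow, assuming statistical features computed over moving windows of the AE signal and a regression target of time remaining before failure; the feature set and synthetic signal are stand-ins, not the authors' pipeline.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def window_features(ae, n_windows=200, width=500):
        """Statistical features of consecutive windows of a continuous AE signal."""
        rows = []
        for i in range(n_windows):
            w = ae[i * width:(i + 1) * width]
            rows.append([w.mean(), w.std(), np.abs(w).max(),
                         np.percentile(np.abs(w), 90)])
        return np.array(rows)

    # Synthetic stand-in: AE variance grows as the fault approaches failure.
    rng = np.random.default_rng(1)
    ae = rng.normal(size=200 * 500) * np.linspace(1.0, 5.0, 200 * 500)
    X = window_features(ae)
    time_to_failure = np.linspace(100.0, 0.0, len(X))   # label for each window

    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, time_to_failure)
    print(rf.predict(X[-3:]))   # predictions approach zero near failure
    ```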

  10. Pulmonary hypertension and isolated right heart failure complicating amiodarone induced hyperthyroidism.

    PubMed

    Wong, Sean-Man; Tse, Hung-Fat; Siu, Chung-Wah

    2012-03-01

    Hyperthyroidism is a common side effect encountered in patients prescribed long-term amiodarone therapy for cardiac arrhythmias. We previously studied 354 patients prescribed amiodarone in whom the occurrence of hyperthyroidism was associated with major adverse cardiovascular events including heart failure, myocardial infarction, ventricular arrhythmias, stroke and even death [1]. We now present a case of amiodarone-induced hyperthyroidism complicated by isolated right heart failure and pulmonary hypertension that resolved with treatment of hyperthyroidism. Detailed quantitative echocardiography enables improved understanding of the haemodynamic mechanisms underlying the condition. Copyright © 2011 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  11. A preliminary evaluation of the generalized likelihood ratio for detecting and identifying control element failures in a transport aircraft

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1985-01-01

    The application of the Generalized Likelihood Ratio technique to the detection and identification of aircraft control element failures has been evaluated in a linear digital simulation of the longitudinal dynamics of a B-737 aircraft. Simulation results show that the technique has potential but that the effects of wind turbulence and Kalman filter model errors are problems which must be overcome.
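
    For concreteness, a textbook scalar form of the GLR test: to detect a step bias appearing in white Gaussian residuals (such as Kalman-filter innovations after a control-element failure), the likelihood ratio is maximized over the unknown onset time. This generic sketch is not the study's B-737 implementation.

    ```python
    import numpy as np

    def glr_mean_jump(residuals, sigma):
        """GLR statistic for a step change in the mean of white Gaussian residuals,
        maximized over the unknown onset time theta (textbook scalar form)."""
        r = np.asarray(residuals, dtype=float)
        best = 0.0
        for theta in range(len(r)):
            seg = r[theta:]
            best = max(best, seg.sum() ** 2 / (2.0 * sigma ** 2 * seg.size))
        return best  # compare against a threshold to declare a failure

    rng = np.random.default_rng(2)
    clean = rng.normal(0.0, 1.0, 200)
    faulty = clean.copy()
    faulty[150:] += 1.5        # a bias appears, e.g. a failed control element
    print(glr_mean_jump(clean, 1.0), glr_mean_jump(faulty, 1.0))
    ```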

  12. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (Cf), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined Cf values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), and the determined bias and RSDR values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.
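
    Methods in this family typically compute GMO content from the ratio of event-specific to endogenous-gene copy numbers divided by the conversion factor; a minimal sketch under that assumption, using the Cf value reported above:

    ```python
    def gmo_amount_percent(event_copies: float, endogenous_copies: float,
                           cf: float = 0.98) -> float:
        """GMO content (%) from event-specific and endogenous-gene copy numbers,
        divided by the conversion factor Cf (0.98 for A2704-12 per the abstract)."""
        return (event_copies / endogenous_copies) / cf * 100.0

    print(gmo_amount_percent(9.8, 1000.0))   # -> 1.0 (% GM soybean)
    ```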

  13. Assessment and Management of Volume Overload and Congestion in Chronic Heart Failure: Can Measuring Blood Volume Provide New Insights?

    PubMed

    Miller, Wayne L

    2017-01-01

    Volume overload and fluid congestion remain primary clinical challenges in the assessment and management of patients with chronic heart failure (HF). The pathophysiology of volume regulation is complex, and the simple concept of passive intravascular fluid accumulation is not adequate. The dynamics of interstitial and intravascular fluid compartment interactions and fluid redistribution from venous splanchnic beds to the central pulmonary circulation need to be taken into account in strategies of volume management. Clinical bedside evaluations and right heart hemodynamic assessments can signal changes in volume status, but only the quantitative measurement of total blood volume can help identify the heterogeneity in plasma volume and red blood cell mass that are features of volume overload in chronic HF. The quantitative assessment of intravascular volume is an effective tool to help guide individualized, appropriate therapy. Not all volume overload is the same, and the measurement of intravascular volume identifies heterogeneity to guide tailored therapy.

  14. Acoustic emission spectral analysis of fiber composite failure mechanisms

    NASA Technical Reports Server (NTRS)

    Egan, D. M.; Williams, J. H., Jr.

    1978-01-01

    The acoustic emission of graphite fiber polyimide composite failure mechanisms was investigated with emphasis on frequency spectrum analysis. Although visual examination of spectral densities could not distinguish among fracture sources, a paired-sample t statistical analysis of mean normalized spectral densities did provide quantitative discrimination among acoustic emissions from 10 deg, 90 deg, and [±45/±45]s specimens. Comparable discrimination was not obtained for 0 deg specimens.
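
    A small sketch of a paired-sample t comparison of mean normalized spectral densities, pairing the frequency bins of two sources; the data and the log-scale pairing are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    freq = np.linspace(0.0, 4.0, 64)
    # Mean normalized spectral densities (unit total power) of two emission sources;
    # source B decays more slowly, i.e. carries relatively more high-frequency power:
    psd_a = np.exp(-freq);        psd_a /= psd_a.sum()
    psd_b = np.exp(-freq / 2.0);  psd_b /= psd_b.sum()
    psd_a *= np.exp(rng.normal(0, 0.05, freq.size))   # small multiplicative noise
    psd_b *= np.exp(rng.normal(0, 0.05, freq.size))

    # Pair bin-by-bin on a log scale (unit normalization forces the *linear* mean
    # difference toward zero, so raw densities would be an uninformative pairing):
    t, p = stats.ttest_rel(np.log(psd_a), np.log(psd_b))
    print(f"t = {t:.2f}, p = {p:.2g}")   # small p: spectra quantitatively distinguishable
    ```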

  15. Evaluating Failures and near Misses in Human Spaceflight History for Lessons for Future Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Barr, Stephanie

    2010-01-01

    Studies done in the past have drawn on lessons learned with regard to human loss-of-life events. However, an examination of near-fatal accidents can be equally useful, not only in detecting causes, both proximate and systemic, but also for determining what factors averted disaster, what design decisions and/or operator actions prevented catastrophe. Binary pass/fail launch history is often used for risk, but this also has limitations. A program with a number of near misses can look more reliable than a consistently healthy program with a single out-of-family failure. Augmenting reliability evaluations with this near miss data can provide insight and expand on the limitations of a strictly pass/fail evaluation. This paper intends to show how near-miss lessons learned can provide crucial data for any new human spaceflight programs that are interested in sending man into space.

  16. Evaluating Failures and Near Misses in Human Spaceflight History for Lessons for Future Human Spaceflight

    NASA Astrophysics Data System (ADS)

    Barr, Stephanie

    2010-09-01

    Studies done in the past have drawn on lessons learned with regard to human loss-of-life events. However, an examination of near-fatal accidents can be equally useful, not only in detecting causes, both proximate and systemic, but also for determining what factors averted disaster, what design decisions and/or operator actions prevented catastrophe. Binary pass/fail launch history is often used for risk, but this also has limitations. A program with a number of near misses can look more reliable than a consistently healthy program with a single out-of-family failure. Augmenting reliability evaluations with this near miss data can provide insight and expand on the limitations of a strictly pass/fail evaluation. This paper intends to show how near-miss lessons learned can provide crucial data for any new human spaceflight programs that are interested in sending man into space.

  17. Failure dynamics of the global risk network.

    PubMed

    Szymanski, Boleslaw K; Lin, Xin; Asztalos, Andrea; Sreenivasan, Sameet

    2015-06-18

    Risks threatening modern societies form an intricately interconnected network that often underlies crisis situations. Yet, little is known about how risk materializations in distinct domains influence each other. Here we present an approach in which expert assessments of likelihoods and influence of risks underlie a quantitative model of the global risk network dynamics. The modeled risks range from environmental to economic and technological, and include difficult-to-quantify risks, such as geopolitical and social ones. Using maximum likelihood estimation, we find the optimal model parameters and demonstrate that the model including network effects significantly outperforms the others, uncovering the full value of the expert-collected data. We analyze the model dynamics and study its resilience and stability. Our findings include such risk properties as contagion potential, persistence, roles in cascades of failures and the identity of risks most detrimental to system stability. The model provides quantitative means for measuring the adverse effects of risk interdependencies and the materialization of risks in the network.

  18. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni²⁺, Zn²⁺, Cu²⁺, PO₄³⁻, bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds, rather than the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step), as well as the introduction of sample liquid wicking areas, make it possible to increase analyte transport efficiency. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu²⁺) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  19. A GATE evaluation of the sources of error in quantitative {sup 90}Y PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydhorst, Jared, E-mail: jared.strydhorst@gmail.

    Purpose: Accurate reconstruction of the dose delivered by {sup 90}Y microspheres using a postembolization PET scan would permit the establishment of more accurate dose–response relationships for treatment of hepatocellular carcinoma with {sup 90}Y. However, the quality of the PET data obtained is compromised by several factors, including poor count statistics and a very high random fraction. This work uses Monte Carlo simulations to investigate what impact factors other than low count statistics have on the quantification of {sup 90}Y PET. Methods: PET acquisitions of two phantoms (a NEMA PET phantom and the NEMA IEC PET body phantom) containing either {sup 90}Y or {sup 18}F were simulated using GATE. Simulated projections were created with subsets of the simulation data allowing the contributions of random, scatter, and LSO background to be independently evaluated. The simulated projections were reconstructed using the commercial software for the simulated scanner, and the quantitative accuracy of the reconstruction and the contrast recovery of the reconstructed images were evaluated. Results: The quantitative accuracy of the {sup 90}Y reconstructions was not strongly influenced by the high random fraction present in the projection data, and the activity concentration was recovered to within 5% of the known value. The contrast recovery measured for simulated {sup 90}Y data was slightly poorer than that for simulated {sup 18}F data with similar count statistics. However, the degradation was not strongly linked to any particular factor. Using a more restricted energy range to reduce the random fraction in the projections had no significant effect. Conclusions: Simulations of {sup 90}Y PET confirm that quantitative {sup 90}Y PET is achievable with the same approach as that used for {sup 18}F, and that there is likely very little margin for improvement by attempting to model aspects unique to {sup 90}Y, such as the much higher random fraction or the presence of
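
    The contrast-recovery figure of merit used in such phantom studies is typically the NEMA-style ratio below; a sketch assuming the standard hot-sphere definition:

    ```python
    def contrast_recovery_hot(mean_sphere: float, mean_background: float,
                              true_activity_ratio: float) -> float:
        """NEMA-style contrast recovery (%) for a hot sphere in a warm background."""
        measured = mean_sphere / mean_background
        return 100.0 * (measured - 1.0) / (true_activity_ratio - 1.0)

    # A sphere filled at 4:1 over background but reconstructed at 3.2:1:
    print(f"{contrast_recovery_hot(3.2, 1.0, 4.0):.1f}%")   # ~73%
    ```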

  20. Quantitative evaluation of the CEEM soil sampling intercomparison.

    PubMed

    Wagner, G; Lischer, P; Theocharopoulos, S; Muntau, H; Desaules, A; Quevauviller, P

    2001-01-08

    The aim of the CEEM soil project was to compare and to test the soil sampling and sample preparation guidelines used in the member states of the European Union and Switzerland for investigations of background and large-scale contamination of soils, soil monitoring and environmental risk assessments. The results of the comparative evaluation of the sampling guidelines demonstrated that, in soil contamination studies carried out with different sampling strategies and methods, comparable results can hardly be expected. Therefore, a reference database (RDB) was established by the organisers, which acted as a basis for the quantitative comparison of the participants' results. The detected deviations were related to the methodological details of the individual strategies. The comparative evaluation concept consisted of three steps: The first step was a comparison of the participants' samples (which were both centrally and individually analysed) between each other, as well as with the reference data base (RDB) and some given soil quality standards on the level of concentrations present. The comparison was made using the example of the metals cadmium, copper, lead and zinc. As a second step, the absolute and relative deviations between the reference database and the participants' results (both centrally analysed under repeatability conditions) were calculated. The comparability of the samples with the RDB was categorised on four levels. Methods of exploratory statistical analysis were applied to estimate the differential method bias among the participants. The levels of error caused by sampling and sample preparation were compared with those caused by the analytical procedures. As a third step, the methodological profiles of the participants were compiled to concisely describe the different procedures used. They were related to the results to find out the main factors leading to their incomparability. The outcome of this evaluation process was a list of strategies and

  1. Investigating failure behavior and origins under supposed "shear bond" loading.

    PubMed

    Sultan, Hassam; Kelly, J Robert; Kazemi, Reza B

    2015-07-01

    This study evaluated failure behavior when resin-composite cylinders bonded to dentin fractured under traditional "shear" testing. Failure was assessed by the scaling of failure loads with changes in cylinder radii and by fracture surface analysis. Three stress models were examined, with failure governed by: bonded area; flat-on-cylinder contact; and a uniformly loaded cantilevered beam. Nine 2-mm occlusal dentin discs for each radius tested were embedded in resin and bonded to resin-composite cylinders; radii (mm) = 0.79375, 1.5875, 2.38125, 3.175. Samples were "shear" tested at 1.0 mm/min. Following testing, discs were finished with silicon carbide paper (240-600 grit) to remove residual composite debris and tested again using different radii. Failure stresses were calculated for: "shear"; flat-on-cylinder contact; and bending of a uniformly loaded cantilevered beam. Stress equations and constants were evaluated for each model. Fracture-surface analysis was performed. Failure stresses calculated from the flat-on-cylinder contact model scaled best with its radius relationship. Stress-equation constants were consistent for failure from the outside surface of the loaded cylinders, not for the bonded surface area or the cantilevered beam. Contact failure stresses were constant over all specimen sizes. Fractography reinforced that failures originated from the loaded cylinder surface and were unrelated to the bonded surface area. "Shear bond" testing does not appear to test the bonded interface. Load/area "stress" calculations have no physical meaning. While failure is related to contact stresses, the mechanism(s) likely involve non-linear damage accumulation, which may only indirectly be influenced by the interface. Copyright © 2015 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
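
    Two of the candidate models reduce to closed forms whose radius scaling differs, which is what makes the load-scaling test discriminating: nominal "shear" stress falls as r^-2 while beam bending stress falls as r^-3. The sketch below uses an end-loaded cantilever as a simplified stand-in for the paper's uniformly loaded beam model.

    ```python
    import math

    def nominal_shear_stress(force_n: float, radius_mm: float) -> float:
        """Traditional 'shear bond strength': load over bonded area (MPa); ~ r^-2."""
        return force_n / (math.pi * radius_mm ** 2)

    def cantilever_bending_stress(force_n: float, radius_mm: float,
                                  load_height_mm: float) -> float:
        """Max bending stress sigma = M*c/I with M = F*h, c = r, I = pi*r^4/4;
        an end-loaded simplification of the beam model; ~ r^-3."""
        moment = force_n * load_height_mm
        inertia = math.pi * radius_mm ** 4 / 4.0
        return moment * radius_mm / inertia

    for r in (0.79375, 1.5875, 2.38125, 3.175):   # the study's cylinder radii (mm)
        print(r, nominal_shear_stress(100.0, r), cantilever_bending_stress(100.0, r, 1.0))
    ```

    Plotting measured failure loads against radius and checking which model's constant stays constant across sizes is the essence of the scaling analysis described above.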

  2. Comparative Evaluation of Periodontal Status of Chronic Renal Failure Patients and Systemically Healthy Individuals.

    PubMed

    Gupta, Radhika; Kumar, Uttam; Mallapragada, Siddharth; Agarwal, Pallavi

    2018-03-01

    Periodontitis, a chronic infectious disease, affects most of the population at one time or another, and its expression is a combination of host, microbial, and environmental factors. Extensive literature exists on the relationship between periodontal disease and diabetes mellitus, cardiovascular diseases, and adverse pregnancy outcomes. Only a few studies, performed in limited numbers of patients, have reported periodontal health status in chronic renal failure patients. Hence, the aim of the present study was to assess and compare the periodontal status of patients with chronic renal failure undergoing dialysis, predialysis patients, and systemically healthy individuals. A total of 90 patients were divided into three groups. Group I: 30 renal dialysis patients. Group II: 30 predialysis patients. The control group comprised 30 systemically healthy individuals, who formed group III. Periodontal examination was carried out using the oral hygiene index-simplified (OHI-S), plaque index (PI), gingival index (GI), probing depth, and clinical attachment loss. The results of the study showed that patients with chronic renal failure undergoing dialysis (dialysis group) and patients with chronic renal failure not undergoing renal dialysis (predialysis), when compared with systemically healthy subjects, had significantly higher mean scores of OHI-S, PI, and clinical attachment loss. Thus, patients with chronic renal failure showed poor oral hygiene and a higher prevalence of periodontal disease. The dental community's awareness of the implications of poor health within chronic renal failure patients should be elevated.

  3. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge, as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectivity characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated in quantitatively assessing solvent system polarity. The relative selectivity of solvent systems was evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies. Copyright © 2014 Elsevier B.V. All rights reserved.
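
    A small sketch of the partition-based quantities involved; the 0.4-2.5 "sweet spot" bounds follow the GUESS convention associated with these authors, while the simple selectivity ratio is a stand-in for the paper's pairwise-resolution measure αip.

    ```python
    def partition_coefficient(c_stationary: float, c_mobile: float) -> float:
        """K for one analyte: concentration ratio between the two phases."""
        return c_stationary / c_mobile

    def pairwise_selectivity(k1: float, k2: float) -> float:
        """Selectivity between two analytes (simplified stand-in for alpha_ip)."""
        return max(k1, k2) / min(k1, k2)

    ks = {"analyte1": 0.2, "analyte2": 0.6, "analyte3": 1.4, "analyte4": 3.5}
    sweet_spot = {n: k for n, k in ks.items() if 0.4 <= k <= 2.5}  # GUESS convention
    print(sweet_spot, pairwise_selectivity(0.6, 1.4))
    ```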

  4. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation includes evaluation of the effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously, aiming to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on using a general rigidity reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation. This allows a more patient-oriented approach. Additionally, usability was improved by processing in situ on a smartphone instead of a computer. The system has proven reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician's decision-making when setting stimulation parameters.

  5. The observation of AE events under uniaxial compression and the quantitative relationship between the anisotropy index and the main failure plane

    NASA Astrophysics Data System (ADS)

    Zhang, Zhibo; Wang, Enyuan; Chen, Dong; Li, Xuelong; Li, Nan

    2016-11-01

    In this paper, the P-wave velocities in different directions of sandstone samples under uniaxial compression are measured. The results indicate that the changes in the P-wave velocity in different directions are almost the same. In the initial stage of loading, the P-wave velocity exhibits a rising trend due to compaction and closure of preexisting fissures. As the stress increases, preexisting fissures are closed but induced fractures are not yet generated. The sandstone samples become denser and more uniform, and the P-wave velocity remains in a steady state at a high level. In the late stage of loading, the P-wave velocity drops significantly due to the expansion and breakthrough of induced fractures. The P-wave velocity anisotropy index ε is analyzed during the process of loading. It can be observed that the change in the degree of wave velocity anisotropy can be divided into three stages: the AB stage, the BC stage and the CD stage, with a trend changing from decline to rise. In the initial stage of loading, the preexisting fissures have a randomized distribution, and the change is large-scale and uniform. The difference in each spatial point decreases gradually, and synchronization increases gradually. Thus, the P-wave velocity anisotropy declines. As the stress increases gradually, with the expansion and breakthrough of induced fractures, the difference in each spatial point increases. Before failure of the rock samples, the region of intense change in the samples' internal structure is concentrated in a narrow two-dimensional zone, and the structural change is distinctly localized. Therefore, the degree of velocity anisotropy rises after declining, and there is good correspondence among the AE count, the location of AE events and the degree of wave velocity anisotropy. The projection plane of the main fracture plane on the axis plane is recorded as the M plane. Based on the AFF equation, for the CD stage, we analyze the quantitative relationship
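
    The abstract does not give the formula for ε; one common definition of a directional-velocity anisotropy index, offered here only as an assumption, is the velocity range normalized by the mean:

    ```python
    import numpy as np

    def anisotropy_index(velocities_m_s) -> float:
        """(v_max - v_min) / v_mean for directional P-wave velocities;
        the paper's exact definition of epsilon may differ."""
        v = np.asarray(velocities_m_s, dtype=float)
        return (v.max() - v.min()) / v.mean()

    print(anisotropy_index([3950.0, 4020.0, 3890.0, 4100.0]))
    ```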

  6. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prosthesis. Other studies have examined the amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prosthesis with respect to its functional walking performance amongst above-knee amputee users, to evaluate the system's efficacy in producing close-to-normal user performance. The performances of the actuator, sensory system, and control technique incorporated in each reported system were evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions have limited generalizability due to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development. PMID:25110727

  7. Technology efficacy in active prosthetic knees for transfemoral amputees: a quantitative evaluation.

    PubMed

    El-Sayed, Amr M; Hamzaid, Nur Azah; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prosthesis. Other studies have examined the amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prosthesis with respect to its functional walking performance amongst above-knee amputee users, to evaluate the system's efficacy in producing close-to-normal user performance. The performances of the actuator, sensory system, and control technique incorporated in each reported system were evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions have limited generalizability due to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development.

  8. Quantitative evaluation of photoplethysmographic artifact reduction for pulse oximetry

    NASA Astrophysics Data System (ADS)

    Hayes, Matthew J.; Smith, Peter R.

    1999-01-01

    Motion artefact corruption of pulse oximeter output, causing both measurement inaccuracies and false alarm conditions, is a primary restriction in the current clinical practice and future applications of this useful technique. Artefact reduction in photoplethysmography (PPG), and therefore by application in pulse oximetry, is demonstrated using a novel non-linear methodology recently proposed by the authors. The significance of these processed PPG signals for pulse oximetry measurement is discussed, with particular attention to the normalization inherent in the artefact reduction process. Quantitative experimental investigation of the performance of PPG artefact reduction is then utilized to evaluate this technology for application to pulse oximetry. While the successfully demonstrated reduction of severe artefacts may widen the applicability of all PPG technologies and decrease the occurrence of pulse oximeter false alarms, the observed reduction of slight artefacts suggests that many such effects may go unnoticed in clinical practice. The signal processing and output averaging used in most commercial oximeters can incorporate these artefact errors into the output, while masking the true PPG signal corruption. It is therefore suggested that PPG artefact reduction should be incorporated into conventional pulse oximetry measurement, even in the absence of end-user artefact problems.

  9. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in the manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and prioritization of research to provide a technical basis both for the standards and for the analysis related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.

  10. Failure mechanisms and lifetime prediction methodology for polybutylene pipe in water distribution system

    NASA Astrophysics Data System (ADS)

    Niu, Xiqun

    Polybutylene (PB) is a semicrystalline thermoplastic. It has been widely used in potable water distribution piping systems. However, field practice shows that failures occur much earlier than the expected service lifetime. The causes of these failures, and how to appropriately evaluate pipe lifetime, motivate this study. In this thesis, three parts of work have been done. The first is an understanding of PB, which includes thermal and mechanical characterization of the material, aging phenomena, and notch sensitivity. The second part analyzes the applicability of the existing lifetime testing method to PB. It is shown that PB is an anomaly in terms of the temperature-lifetime relation because of the fracture mechanism transition across the testing temperature range. The third part is the development of a lifetime prediction methodology for PB pipe. The fracture process of PB pipe consists of three stages: crack initiation, slow crack growth (SCG), and crack instability. The practical lifetime of PB pipe is primarily determined by the duration of the first two stages. The mechanism of crack initiation and the quantitative estimation of the time to crack initiation are studied by employing an environmental stress cracking technique. A fatigue slow-crack-growth testing method has been developed and applied in the study of SCG. Using the Paris-Erdogan equation, a model is constructed to evaluate the time for SCG. As a result, the total lifetime is determined. Through this work, the failure mechanisms of PB pipe have been analyzed and a lifetime prediction methodology has been developed.
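
    The SCG lifetime follows from integrating the Paris-Erdogan law da/dN = C(ΔK)^m from the initiated crack size a0 to the critical size ac, with ΔK = YΔσ√(πa); a numerical sketch with made-up constants (C, m, and the geometry factor Y are material- and geometry-specific):

    ```python
    import math

    def cycles_to_failure(a0_m, ac_m, c, m, delta_sigma_pa, y=1.0, steps=10000):
        """Numerically integrate da/dN = C*(dK)^m with dK = Y*dSigma*sqrt(pi*a).
        Units: metres, Pa, cycles."""
        n, a = 0.0, a0_m
        da = (ac_m - a0_m) / steps
        for _ in range(steps):
            dk = y * delta_sigma_pa * math.sqrt(math.pi * a)
            n += da / (c * dk ** m)
            a += da
        return n

    # Illustrative (made-up) constants for a polymer-like material:
    print(f"{cycles_to_failure(1e-4, 5e-3, c=1e-26, m=4.0, delta_sigma_pa=2e6):.3g} cycles")
    ```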

  11. Establishment of a new method to quantitatively evaluate hyphal fusion ability in Aspergillus oryzae.

    PubMed

    Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko

    2014-01-01

    Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first process in sexual/parasexual reproduction. However, evaluating hyphal fusion efficiency has been difficult due to its low frequency in Aspergillus oryzae, in spite of the species' industrial significance. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae by mixed culture of two different auxotrophic strains, in which the ratio of heterokaryotic conidia that grow without the auxotrophic requirements reflects the hyphal fusion efficiency. By employing this method, it was demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that the hyphal fusion efficiency of A. oryzae was increased by depleting the nitrogen source, including a large amount of carbon source, and adjusting the pH to 7.0.

  12. Procedures to evaluate the efficiency of protective clothing worn by operators applying pesticide.

    PubMed

    Espanhol-Soares, Melina; Nociti, Leticia A S; Machado-Neto, Joaquim Gonçalves

    2013-10-01

    The evaluation of the efficiency of whole-body protective clothing against pesticides has already been carried out through field tests and procedures defined by international standards, but there is a need to determine the useful life of these garments to ensure worker safety. The aim of this article is to compare procedures for evaluating the efficiency of two whole-body protective garments, both new and previously used by herbicide applicators, using a laboratory test with a mannequin and a field test with the operator. The evaluation of the efficiency of protective clothing used both quantitative and qualitative methodologies, leading to a proposed classification according to efficiency and a determination of the useful life of protective clothing used against pesticides, based on a quantitative assessment. The procedures used were in accordance with the standards of the modified American Society for Testing and Materials (ASTM) F 1359:2007 and International Organization for Standardization 17491-4. The protocol used in the field was World Health Organization Vector Biology and Control (VBC)/82.1. The clothing tested was personal water-repellent and pesticide-protective clothing. Two varieties of fabric were tested: beige (100% cotton) and camouflage (31% polyester and 69% cotton). The efficiency of the personal protective clothing in exposure control was measured before use and after 5, 10, 20, and 30 uses and washes under field conditions. The personal protective clothing was worn by workers in the field during application of the herbicide glyphosate to weed species in mature sugar cane plantations using a knapsack sprayer. The modified ASTM F 1359:2007 procedure was chosen as the most appropriate due to its greater repeatability (lower coefficient of variation). This procedure provides the quantitative evaluation needed to determine the efficiency and useful life of individual protective clothing, not just at specific points of failure, but according to dermal

  13. Quantifying Pilot Contribution to Flight Safety During an In-Flight Airspeed Failure

    NASA Technical Reports Server (NTRS)

    Etherington, Timothy J.; Kramer, Lynda J.; Bailey, Randall E.; Kennedey, Kellie D.

    2017-01-01

    Accident statistics cite the flight crew as a causal factor in over 60% of large transport fatal accidents. Yet a well-trained and well-qualified crew is acknowledged as the critical center point of aircraft systems safety and an integral component of the entire commercial aviation system. A human-in-the-loop test was conducted using a Level D certified Boeing 737-800 simulator to evaluate the pilot's contribution to safety-of-flight during routine air carrier flight operations and in response to system failures. To quantify the human's contribution, crew complement was used as an independent variable in a between-subjects design. This paper details the crew's actions and responses while dealing with an in-flight airspeed failure. Accident statistics often cite flight crew error (Baker, 2001) as the primary contributor in accidents and incidents in transport category aircraft. However, the Air Line Pilots Association (2011) suggests that "a well-trained and well-qualified pilot is acknowledged as the critical center point of the aircraft systems safety and an integral safety component of the entire commercial aviation system." This is generally acknowledged but cannot be verified, because little or no quantitative data exist on how, or how many, accidents and incidents are averted by crew actions. Anecdotal evidence suggests crews handle failures on a daily basis, and Aviation Safety Action Program data generally support this assertion, even if the data are not released to the public. However, without hard evidence, the contribution and the means by which pilots achieve safety of flight are difficult to define. Thus, ways to improve the human ability to contribute or to overcome deficiencies are ill-defined.

  14. A simple hemostasis model for the quantitative evaluation of hydrogel-based local hemostatic biomaterials on tissue surface.

    PubMed

    Murakami, Yoshihiko; Yokoyama, Masayuki; Nishida, Hiroshi; Tomizawa, Yasuko; Kurosawa, Hiromi

    2008-09-01

    Several hemostat hydrogels are clinically used, and some other agents are being studied for safer, more facile, and more efficient hemostasis. In the present paper, we propose a novel method to evaluate a local hemostat hydrogel on a tissue surface. The procedure consisted of the following steps: (step 1) a mouse was fixed on a cork board, and its abdomen was incised; (step 2) serous fluid was carefully removed, because it affected the estimation of the weight gained by the filter paper, and parafilm and preweighed filter paper were placed beneath the liver (the parafilm prevented the filter paper from absorbing gradually oozing serous fluid); (step 3) the cork board was tilted and maintained at an angle of about 45 degrees so that blood would flow more easily from the liver toward the filter paper; and (step 4) the bleeding lasted for 3 min. In this step, a hemostat was applied to the liver wound immediately after the liver was pricked with a needle. We found that (1) careful removal of serous fluid prior to bleeding and (2) quantitative determination of the amount of excess aqueous solution that oozed out from the hemostat were important for a rigorous evaluation of hemostat efficacy. We successfully evaluated the efficacy of a fibrin-based hemostat hydrogel using our method. The method proposed in the present study enabled the quantitative, accurate, and easy evaluation of the efficacy of a local hemostatic hydrogel that acts as a tissue-adhesive agent on biointerfaces.

  15. Quantitative and Qualitative Evaluation of Iranian Researchers' Scientific Production in Dentistry Subfields.

    PubMed

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-10-01

    As in other fields of medicine, scientific production in the field of dentistry holds a significant place. This study aimed at quantitatively and qualitatively evaluating Iranian researchers' scientific output in the field of dentistry and determining their contribution in each of the dentistry subfields and branches. This research was a scientometric study that applied quantitative and qualitative indices of Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. Of all indexed items of scientific output in WoS, 777 (83.73%) were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontics subfield. The correlation coefficient test showed a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year span (1993-2003) and that of the second (2004-2013), in favor of the latter (P = 0.001). The distribution frequencies of scientific production in the various subfields of dentistry were very different. The infrastructure needs to be reinforced for more balanced scientific production in the field and its related subfields.

  16. A new approach for the quantitative evaluation of drawings in children with learning disabilities.

    PubMed

    Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio

    2011-01-01

    A new method for a quantitative and objective description of drawing and for the quantification of drawing ability in children with learning disabilities (LD) is hereby presented. Twenty-four normally developing children (N) (age 10.6 ± 0.5) and 18 children with learning disabilities (LD) (age 10.3 ± 2.4) took part in the study. The drawing tasks were chosen from among those already used in daily clinical experience (Denver Developmental Screening Test). Several parameters were defined in order to quantitatively describe the features of the children's drawings, introducing new objective measurements alongside the subjective standard clinical evaluation. The experimental set-up proved valid for clinical application with LD children. The parameters highlighted differences in the drawing features of N and LD children. This paper suggests the applicability of this protocol to other fields of motor and cognitive evaluation, as well as the possibility of studying upper-limb position and muscle activation during drawing. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. [Evaluation of YAG-laser vitreolysis effectiveness based on quantitative characterization of vitreous floaters].

    PubMed

    Shaimova, V A; Shaimov, T B; Shaimov, R B; Galin, A Yu; Goloshchapova, Zh A; Ryzhkov, P K; Fomin, A V

    2018-01-01

    To develop methods for evaluating the effectiveness of YAG-laser vitreolysis of vitreous floaters. The study included 144 patients (173 eyes) who had undergone YAG-laser vitreolysis and were under observation from 01.09.16 to 31.01.18. The patients were 34 to 86 years old (mean age 62.7±10.2 years); 28 (19.4%) patients were male and 116 (80.6%) female. All patients underwent standard and additional examination: ultrasonography (Accutome B-scan plus, U.S.A.), optical biometry (Lenstar 900, Haag-Streit, Switzerland), spectral optical coherence tomography using the RTVue XR Avanti scanner (Optovue, U.S.A.) in the Enhanced HD Line, 3D Retina, 3D Widefield MCT, Cross Line, and Angio Retina modes, and scanning laser ophthalmoscopy (SLO) using the Navilas 577s system. Laser vitreolysis was performed using the Ultra Q Reflex laser (Ellex, Australia). This paper presents methods of objective quantitative and qualitative assessment of artifactual shadows of vitreous floaters with the spectral optical coherence tomography scanner RTVue XR Avanti, employing an algorithm of automatic detection of non-perfusion zones in the Angio Retina and HD Angio Retina modes, as well as foveal avascular zone (FAZ) measurement with Angio Analytics® software. SLO performed with the Navilas 577s was used as a method of visualizing floaters and artifactual shadows in retinal surface layers prior to surgical treatment and after YAG-laser vitreolysis. The suggested methods of quantitative and qualitative assessment of the artifactual shadows of floaters in retinal layers are promising and may prove highly relevant for clinical monitoring of patients, optimization of treatment indications, and evaluation of the effectiveness of YAG-laser vitreolysis. Further research on laser vitreolysis effectiveness in patients with vitreous floaters is necessary.

  18. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data*

    PubMed Central

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G.; Khanna, Sanjeev

    2017-01-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings. PMID:29151821
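
    The StreamQRE system itself is implemented in Java; as a loose conceptual illustration of the kind of query it targets (and not the StreamQRE API — all names below are invented), the following Python sketch partitions a stream by key and folds each substream with a constant-space incremental aggregate, which is the bounded per-item time and memory property the compilation guarantees above concern.

```python
from collections import defaultdict

class RunningMean:
    """Constant-space incremental mean; one instance is kept per key."""
    def __init__(self):
        self.n = 0
        self.total = 0.0

    def update(self, x):
        self.n += 1
        self.total += x
        return self.total / self.n  # current quantitative summary

def process_stream(events):
    """Partition the stream by key and fold each substream incrementally,
    so per-item time and total memory stay bounded by the number of keys."""
    state = defaultdict(RunningMean)
    for key, value in events:
        yield key, state[key].update(value)

if __name__ == "__main__":
    stream = [("patientA", 72), ("patientB", 65), ("patientA", 80)]
    for key, mean in process_stream(stream):
        print(key, round(mean, 2))
```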

  19. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
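
    The coverage effect the abstract reports can be illustrated with a toy calculation. The sketch below (illustrative rates and mission time, not the paper's F18 digraph model) simulates a duplex-redundant component in which an undetected first fault defeats the redundancy, showing how even a small coverage shortfall dominates system unreliability.

```python
import random

def simulate(t_mission, lam, coverage, trials=100_000):
    """Monte Carlo unreliability of a duplex system with imperfect coverage:
    an uncovered first fault (probability 1 - coverage) is an immediate loss;
    a covered fault is survivable unless the spare also fails in time."""
    failures = 0
    for _ in range(trials):
        first, second = sorted((random.expovariate(lam),
                                random.expovariate(lam)))
        if first > t_mission:
            continue                      # no unit failed during the mission
        if random.random() > coverage:
            failures += 1                 # uncovered first fault: system loss
        elif second <= t_mission:
            failures += 1                 # covered, but the spare failed too
    return failures / trials

lam = 1e-3  # illustrative per-hour failure rate, not an F18 figure
for c in (1.0, 0.99, 0.9):
    print(f"coverage={c}: 10 h unreliability ~ {simulate(10, lam, c):.2e}")
```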

  20. Failure criterion for materials with spatially correlated mechanical properties

    NASA Astrophysics Data System (ADS)

    Faillettaz, J.; Or, D.

    2015-03-01

    The role of spatially correlated mechanical elements in the failure behavior of heterogeneous materials represented by fiber bundle models (FBMs) was evaluated systematically for different load redistribution rules. Increasing the range of spatial correlation for FBMs with local load sharing is marked by a transition from ductilelike failure characteristics into brittlelike failure. The study identified a global failure criterion based on macroscopic properties (external load and cumulative damage) that is independent of spatial correlation or load redistribution rules. This general metric could be applied to assess the mechanical stability of complex and heterogeneous systems and thus provide an important component for early warning of a class of geophysical ruptures.
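
    For readers unfamiliar with fiber bundle models, the following minimal equal-load-sharing FBM (an illustrative sketch with uniform strength thresholds, not the authors' code) shows how the external load and the cumulative damage jointly determine whether the bundle stabilizes or fails globally.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000
thresholds = rng.uniform(0.0, 1.0, N)     # random fiber strengths

def damage_at_load(F):
    """Return the cumulative damage (broken fraction) once the bundle
    stabilizes under external load F, or None on global failure."""
    broken = np.zeros(N, dtype=bool)
    while True:
        survivors = N - broken.sum()
        if survivors == 0:
            return None                   # runaway cascade: global failure
        stress = F / survivors            # equal (global) load sharing
        newly = (~broken) & (thresholds < stress)
        if not newly.any():
            return broken.sum() / N       # stable state reached
        broken |= newly

for F in (1000, 2000, 2400, 2600):        # critical load is ~0.25*N here
    d = damage_at_load(F)
    print(F, "global failure" if d is None else f"damage = {d:.3f}")
```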

  1. Quantitative contrast enhanced magnetic resonance imaging for the evaluation of peripheral arterial disease: a comparative study versus standard digital angiography.

    PubMed

    Pavlovic, Chris; Futamatsu, Hideki; Angiolillo, Dominick J; Guzman, Luis A; Wilke, Norbert; Siragusa, Daniel; Wludyka, Peter; Percy, Robert; Northrup, Martin; Bass, Theodore A; Costa, Marco A

    2007-04-01

    The purpose of this study was to evaluate the accuracy of semiautomated analysis of contrast-enhanced magnetic resonance angiography (MRA) in patients who have undergone standard angiographic evaluation for peripheral vascular disease (PVD). Magnetic resonance angiography is an important tool for evaluating PVD. Although this technique is both safe and noninvasive, the accuracy and reproducibility of quantitative measurements of disease severity using MRA in the clinical setting have not been fully investigated. Forty-three lesions in 13 patients who underwent both MRA and digital subtraction angiography (DSA) of the iliac and common femoral arteries within 6 months were analyzed using quantitative magnetic resonance angiography (QMRA) and quantitative vascular analysis (QVA). Analysis was repeated by a second operator, and by the same operator approximately 1 month later. QMRA underestimated percent diameter stenosis (%DS) compared with QVA by 2.47%. Limits of agreement between the two methods were +/- 9.14%. Interobserver variability in measurements of %DS was +/- 12.58% for QMRA and +/- 10.04% for QVA. Intraobserver variability of %DS was +/- 4.6% for QMRA and +/- 8.46% for QVA. QMRA displays a high level of agreement with QVA when used to determine stenosis severity in the iliac and common femoral arteries. Similar levels of interobserver and intraobserver variability are present with each method. Overall, QMRA represents a useful method to quantify the severity of PVD.
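
    The agreement statistics quoted above (mean bias and limits of agreement) are straightforward to compute from paired measurements; a minimal sketch with hypothetical %DS arrays, not the study's data:

```python
import numpy as np

qmra = np.array([45.0, 60.2, 33.1, 71.5, 52.3])   # hypothetical %DS values
qva  = np.array([47.2, 62.0, 36.5, 73.9, 55.0])

diff = qmra - qva
bias = diff.mean()                  # systematic under-/overestimation
loa  = 1.96 * diff.std(ddof=1)      # half-width of the 95% limits of agreement
print(f"bias = {bias:.2f}%, limits of agreement = +/-{loa:.2f}%")
```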

  2. Evaluation of a Postdischarge Call System Using the Logic Model.

    PubMed

    Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary

    2018-02-01

    This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.

  3. Fatigue in older adults with stable heart failure.

    PubMed

    Stephen, Sharon A

    2008-01-01

    The purpose of this study was to describe fatigue and the relationships among fatigue intensity, self-reported functional status, and quality of life in older adults with stable heart failure. A descriptive, correlational design was used to collect quantitative data with reliable and valid instruments. Fifty-three eligible volunteers completed a questionnaire during an interview. Those with recent changes in their medical regimen, other fatigue-inducing illnesses, and isolated diastolic dysfunction were excluded. Fatigue intensity (Profile of Mood States fatigue subscale) was associated with lower quality of life, perceived health, and satisfaction with life. Fatigue was common, and no relationship was found between fatigue intensity and self-reported functional status. Marital status was the only independent predictor of fatigue. In stable heart failure, fatigue is a persistent symptom. Clinicians need to ask patients about fatigue and assess the impact on quality of life. Self-reported functional status cannot serve as a proxy measure for fatigue.

  4. Causes of corneal graft failure in India.

    PubMed

    Dandona, L; Naduvilath, T J; Janarthanan, M; Rao, G N

    1998-09-01

    The success of corneal grafting in visual rehabilitation of the corneal blind in India depends on survival of the grafts. Understanding the causes of graft failure may help reduce the risk of failure. We studied these causes in a series of 638 graft failures at our institution. Multivariate logistic regression analysis was used to evaluate the association of particular causes of graft failure with indications for grafting, socioeconomic status, age, sex, host corneal vascularization, donor corneal quality, and experience of surgeon. The major causes of graft failure were allograft rejection (29.2%), increased intraocular pressure (16.9%), infection excluding endophthalmitis (15.4%), and surface problems (12.7%). The odds of infection causing graft failure were significantly higher in patients of lower socioeconomic status (odds ratio 2.45, 95% CI 1.45-4.15). Surface problems as a cause of graft failure was significantly associated with grafts done for corneal scarring or for regrafts (odds ratio 3.36, 95% CI 1.80-6.30). Increased intraocular pressure as a cause of graft failure had significant association with grafts done for aphakic or pseudophakic bullous keratopathy, congenital conditions or glaucoma, or regrafts (odds ratio 2.19, 95% CI 1.25-3.84). Corneal dystrophy was the indication for grafting in 12 of the 13 cases of graft failure due to recurrence of host disease. Surface problems, increased intraocular pressure, and infection are modifiable risk factors that are more likely to cause graft failure in certain categories of patients in India. Knowledge about these associations can be helpful in looking for and aggressively treating these modifiable risk factors in the at-risk categories of corneal graft patients. This can possibly reduce the chance of graft failure.

  5. [Research progress of polyethylene inserts wear measurement and evaluation in total knee arthroplasty].

    PubMed

    Zhao, Feng; Wang, Chuan; Fan, Yubo

    2015-01-01

    Wear of polyethylene (PE) tibial inserts is a significant cause of implant failure in total knee arthroplasty (TKA), and measuring and evaluating insert wear is key in TKA research. There are many methods to measure insert wear. Qualitative methods such as observation are used to determine the wear and its type. Quantitative methods such as gravimetric analysis, coordinate measuring machines (CMM) and micro-computed tomography (micro-CT) are used to measure the mass, volume and geometry of wear. In this paper, the principles, characteristics and research progress of the main insert wear evaluation methods are introduced, and their problems and disadvantages are analyzed.

  6. Reliable Broadcast under Cascading Failures in Interdependent Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Sisi; Lee, Sangkeun; Chinthavali, Supriya

    Reliable broadcast is an essential tool to disseminate information among a set of nodes in the presence of failures. We present a novel study of reliable broadcast in interdependent networks, in which failures in one network may cascade to another network. In particular, we focus on the interdependency between the communication network and the power grid, where the power grid depends on signals from the communication network for control and the communication network depends on the grid for power. In this paper, we build a resilient solution to handle crash failures in the communication network that may cause cascading failures and may even partition the network. In order to guarantee that all correct nodes deliver the messages, we use soft links: inactive backup links to non-neighboring nodes that become active only when failures occur. At the core of our work is a fully distributed algorithm for the nodes to predict and collect information about cascading failures so that soft links can be maintained to correct nodes prior to the failures. In the presence of failures, soft links are activated to guarantee message delivery, and new soft links are built accordingly for long-term robustness. Our evaluation results show that the algorithm achieves a low packet drop rate and handles cascading failures with little overhead.
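
    As a rough illustration of the soft-link idea (a simplified sketch, not the paper's distributed algorithm), the following flood-style broadcast falls back to precomputed backup links whenever a node observes that one of its direct neighbors has crashed:

```python
def broadcast(origin, links, soft_links, crashed):
    """Flood a message among non-crashed nodes; a node that detects a crashed
    neighbor also forwards over its precomputed soft (backup) links."""
    delivered, frontier = {origin}, [origin]
    while frontier:
        node = frontier.pop()
        neighbors = links.get(node, [])
        alive = [n for n in neighbors if n not in crashed]
        if len(alive) < len(neighbors):   # a neighbor crashed: activate backups
            alive += [n for n in soft_links.get(node, []) if n not in crashed]
        for n in alive:
            if n not in delivered:
                delivered.add(n)
                frontier.append(n)
    return delivered

links = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
soft  = {"b": ["d"]}   # backup link bridging the partition caused by c's crash
print(broadcast("a", links, soft, crashed={"c"}))  # all correct nodes: a, b, d
```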

  7. [Renal failure in patients with liver transplant: incidence and predisposing factors].

    PubMed

    Gerona, S; Laudano, O; Macías, S; San Román, E; Galdame, O; Torres, O; Sorkin, E; Ciardullo, M; de Santibañes, E; Mastai, R

    1997-01-01

    Renal failure is a common finding in patients undergoing orthotopic liver transplantation. The aim of the present study was to evaluate the incidence, the prognostic value of pre-, intra- and postoperative factors, and the severity of renal dysfunction in patients who undergo liver transplantation. The records of 38 consecutive adult patients were therefore reviewed. Renal failure was defined arbitrarily as an increase in creatinine (> 1.5 mg/dl) and/or blood urea (> 80 mg/dl). Three patients were excluded from the final analysis (1 with acute liver failure and 2 with survival under 72 h). Twenty-one of the 35 patients had renal failure after orthotopic liver transplantation. Six of these episodes developed early, within the first 6 days. Late renal impairment occurred in 15 patients during hospitalization (40 +/- 10 days) (mean +/- SD). In the overall series, poorer liver function (evaluated by the Child-Pugh classification), higher blood product requirements and higher cyclosporine levels were observed in those who experienced renal failure than in those who did not (p < 0.05). Early renal failure was related to preoperative (liver function) and intraoperative (blood requirements) factors, and several causes other than cyclosporine (nephrotoxic drugs and graft failure) were present in patients who developed late renal impairment. No mortality was associated with renal failure. We conclude that renal failure a) is a common finding after liver transplantation, b) has a multifactorial pathogenesis and, c) is not related to a poor outcome.

  8. Insulation failure in electrosurgery instrumentation: a prospective evaluation.

    PubMed

    Tixier, Floriane; Garçon, Mélanie; Rochefort, Françoise; Corvaisier, Stéphane

    2016-11-01

    The use of electrosurgery has expanded to a wide variety of surgical specialities, but it has also been accompanied by its share of complications, including thermal injuries to nontargeted tissues caused by a break or defect in the insulation coating of the instrument. The purpose of this study was to determine the prevalence and location of insulation failures (IFs) in electrosurgical instruments, and then to assess the necessity of routine IF testing. Electrosurgical instruments were visually inspected and checked for IF using a high-voltage detector. Two different detectors were used during two testing sessions: the DTU-6 (Petel) and the DIATEG (Morgate). Laparoscopic and non-laparoscopic instruments were determined to have an IF if current crossed the instrument's insulation, signaled by an alarm sound. A total of 489 instruments were tested. The overall prevalence of IFs was 24.1% with visual inspection alone and 37.2% with the IF detector. Among the 489 instruments, 13.1% were visually intact but failed the electrical test. The DTU-6 and DIATEG detectors showed comparable efficiency in detecting IFs overall, and for laparoscopic and non-laparoscopic instruments alike. IFs were most often located medially on laparoscopic instruments (50.4%) and distally on non-laparoscopic instruments (40.4%). Accidental burns are a hidden problem and can lead to patient complications. In the Central Sterilization Service Department, prevention currently includes only visual control of electrosurgical instrumentation, but testing campaigns are now necessary to identify as many instrument defects as possible.

  9. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at particular combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
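
    A minimal sketch of this classification step, assuming scikit-learn and using synthetic stand-ins for the 18-parameter samples and crash labels:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(500, 18))        # stand-in for 18 POP2 parameters
y = (X[:, 0] + X[:, 3] > 1.2).astype(int)    # toy "simulation crashed" rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)
prob = clf.predict_proba(X_te)[:, 1]         # predicted failure probability
print("validation AUC:", round(roc_auc_score(y_te, prob), 3))
```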

  10. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography.

    PubMed

    Montanini, R; Freni, F; Rossi, G L

    2012-09-01

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for the quantitative assessment of buried flaws in complex cast parts. The use of amplitude-modulated ultrasonic heat generation allowed a selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by independently estimating both the area and the depth extension of the buried flaws, while X-ray 3D computed tomography was used as the reference for assessing sizing accuracy. To retrieve the flaw area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification was developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-sectional area of the flaw. Owing to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of cast components.

  11. Failure analysis of pinch-torsion tests as a thermal runaway risk evaluation method of Li-ion cells

    NASA Astrophysics Data System (ADS)

    Xia, Yuzhi; Li, Tianlei; Ren, Fei; Gao, Yanfei; Wang, Hsin

    2014-11-01

    Recently, a pinch-torsion test was developed for safety testing of Li-ion batteries. It has been demonstrated that this test can generate small internal short-circuit spots in the separator in a controllable and repeatable manner. In the current research, the failure mechanism is examined by numerical simulations and comparisons to experimental observations. Finite element models are developed to evaluate the deformation of the separators under both pure pinch and pinch-torsion loading conditions. It is found that the addition of the torsion component significantly increases the maximum first principal strain, which is believed to induce the internal short circuit. In addition, the applied load in the pinch-torsion test is significantly smaller than in the pure pinch test, dramatically improving the applicability of this method to ultra-thick batteries, which would otherwise require loads in excess of machine capability. It is further found that separator failure occurs in the early stage of torsion (within a few degrees of rotation). The effect of the coefficient of friction on the maximum first principal strain is also examined.

  12. How and why of orthodontic bond failures: An in vivo study

    PubMed Central

    Vijayakumar, R. K.; Jagadeep, Raju; Ahamed, Fayyaz; Kanna, Aprose; Suresh, K.

    2014-01-01

    Introduction: The bonding of orthodontic brackets and their failure rates with both direct and indirect procedures are well documented in the orthodontic literature. Over the years, different adhesive materials and various indirect bonding transfer procedures have been compared and evaluated for bond failure rates. The aim of our study is to highlight the use of a simple, inexpensive and easily manipulated single thermoplastic transfer tray, and of a single light-cure adhesive, to evaluate bond failure rates in clinical situations. Materials and Methods: A total of 30 patients were randomly divided into two groups (Group A and Group B). A split-mouth design was used for both groups so that they were distributed equally without bias. After initial prophylaxis, both procedures were carried out per the manufacturer's instructions. All patients were motivated at the outset and reviewed for bond failures for 6 months. Results: Bond failure rates were assessed overall for the direct and indirect procedures, for the anterior and posterior arches, and for individual teeth. The Z-test was used to statistically analyze the normal distribution of the sample in the split-mouth study, as sketched after this record. The results of the two groups were compared and the P value was calculated using the Z-proportion test to assess the significance of bond failure. Conclusion: Overall bond failure was higher for direct bonding. Anterior bracket failure was higher with direct bonding, whereas the indirect procedure showed more posterior bracket failures. For individual teeth, mandibular incisor and premolar brackets showed the most failures, followed by maxillary premolars and canines. PMID:25210392
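
    The group comparison described above amounts to a two-proportion z-test; a minimal sketch assuming statsmodels, with illustrative failure counts rather than the study's data:

```python
from statsmodels.stats.proportion import proportions_ztest

failures = [18, 10]    # illustrative failed-bracket counts: direct, indirect
totals   = [300, 300]  # brackets bonded in each group
z, p = proportions_ztest(failures, totals)
print(f"z = {z:.2f}, p = {p:.3f}")
```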

  13. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are widely used in clinical trials for tracking and evaluation of medical interventions. Previously, we presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. These include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features, such as stroke lesion characteristics, from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was used to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.

  14. Failure Behavior Characterization of Mo-Modified Ti Surface by Impact Test and Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Ma, Yong; Qin, Jianfeng; Zhang, Xiangyu; Lin, Naiming; Huang, Xiaobo; Tang, Bin

    2015-07-01

    Using impact testing and finite element simulation, the failure behavior of a Mo-modified layer on pure Ti was investigated. In the impact test, four loads of 100, 300, 500, and 700 N and 10^4 impacts were adopted. The three-dimensional residual impact dents were examined using an optical microscope (Olympus DSX500i), indicating that the impact resistance of the Ti surface was improved. Two failure modes, cohesive failure and wear, were elucidated by electron backscatter diffraction and energy-dispersive spectrometry performed in a field-emission scanning electron microscope. Through finite element forward analysis performed at a typical impact load of 300 N, the stress-strain distributions in the Mo-modified Ti were quantitatively determined. In addition, the failure behavior of the Mo-modified layer was characterized and an ideal failure model was proposed for high-load impact, based on the experimental and finite element forward analysis results.

  15. Diagnosing and managing acute heart failure in the emergency department

    PubMed Central

    Kuo, Dick C.; Peacock, W. Frank

    2015-01-01

    Heart failure is a clinical syndrome that results from the impairment of ventricular filling or ejection of blood and affects millions of people worldwide. Diagnosis may not be straightforward and at times may be difficult in an undifferentiated patient. However, rapid evaluation and diagnosis is important for the optimal management of acute heart failure. We review the many aspects of diagnosing and treating acute heart failure in the emergency department. PMID:27752588

  16. Micromechanics Based Failure Analysis of Heterogeneous Materials

    NASA Astrophysics Data System (ADS)

    Sertse, Hamsasew M.

    In recent decades, heterogeneous materials have been used extensively in industries such as aerospace, defense and automotive due to their desirable specific properties and excellent capability of accumulating damage. Despite their wide use, there are numerous challenges associated with the application of these materials. One of the main challenges is the lack of accurate tools to predict the initiation, progression and final failure of these materials under various thermomechanical loading conditions. Although failure is usually treated at the macro- and meso-scale level, the initiation and growth of failure is a complex phenomenon across multiple scales. The objective of this work is to enable the mechanics of structure genome (MSG) and its companion code SwiftComp to analyze the initial failure (also called static failure), progressive failure, and fatigue failure of heterogeneous materials using a micromechanics approach. Initial failure is evaluated at each numerical integration point using pointwise and nonlocal approaches for each constituent of the heterogeneous material. The effects of imperfect interfaces among the constituents are also investigated using a linear traction-displacement model. Moreover, the progressive and fatigue damage analyses are conducted using a continuum damage mechanics (CDM) approach. Various failure criteria are applied at a material point to analyze progressive damage in each constituent. The constitutive equation of a damaged material is formulated based on a consistent irreversible thermodynamics approach. The overall tangent modulus of uncoupled elastoplastic damage for negligible back stress effect is derived. The initiation of plasticity and damage in each constituent is evaluated at each numerical integration point using a nonlocal approach. The accumulated plastic strain and anisotropic damage evolution variables are iteratively solved using an incremental algorithm. The damage analyses

  17. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature identification in diagnosis. A quantitative and simple distortion evaluation method is therefore imperative for both the endoscope industry and medical device regulatory agencies; however, no such method is yet available. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, and are difficult to understand. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to error in deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning over the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
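
    Assuming the conventional textbook definitions of these metrics (the abstract itself gives no formulas, so the paper may parameterize them differently), picture height distortion and local magnification can be sketched as:

```python
def picture_height_distortion(h_distorted, h_ideal):
    """DPH in percent: relative change of the picture height (assumed
    textbook definition; the paper may parameterize it differently)."""
    return 100.0 * (h_distorted - h_ideal) / h_ideal

def local_magnification(d_image, d_object, m_paraxial):
    """ML: a small image-side displacement over the corresponding
    object-side displacement, normalized by the central magnification."""
    return (d_image / d_object) / m_paraxial

print(picture_height_distortion(88.0, 100.0))  # -12.0: barrel-type compression
print(local_magnification(0.8, 1.0, 1.0))      # 0.8: local demagnification
```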

  18. Ethical dilemmas in psychiatric evaluations in patients with fulminant liver failure.

    PubMed

    Appel, Jacob; Vaidya, Swapna

    2014-04-01

    Fulminant hepatic failure (FHF) is one of the more dramatic and challenging syndromes in clinical medicine. Time constraints and the scarcity of organs complicate the evaluation process in the case of patients presenting with FHF, raising ethical questions related to fairness and justice. The challenges are compounded by an absence of standardized guidelines. Acetaminophen overdose, often occurring in patients with histories of psychiatric illness and substance dependence, has emerged as the most common cause of FHF. The weak correlations between psychosocial factors and nonadherence, as per some studies, suggest that adherence may be influenced by systematic factors. Most research suggests that applying rigid ethical parameters in these patients, rather than allowing for case-dependent flexibility, can be problematic. The decision to transplant in patients with FHF has to be made in a very narrow window of time. The time-constrained process is fraught with uncertainties and limitations, given the absence of patient interview, fluctuating medical eligibility, and limited data. Although standardized scales exist, their benefit in such settings appears limited. Predicting compliance with posttransplant medical regimens is difficult to assess and raises the question of prospective studies to monitor compliance.

  19. Experimental Investigation on Deformation Failure Characteristics of Crystalline Marble Under Triaxial Cyclic Loading

    NASA Astrophysics Data System (ADS)

    Yang, Sheng-Qi; Tian, Wen-Ling; Ranjith, P. G.

    2017-11-01

    The deformation failure characteristics of marble subjected to triaxial cyclic loading are significant when evaluating the stability and safety of deep excavation damage zones. To date, however, there have been notably few triaxial experimental studies of marble under triaxial cyclic loading. Therefore, in this research, a series of triaxial cyclic tests was conducted to analyze the mechanical damage characteristics of a marble. The post-peak deformation of the marble changed gradually from strain softening to strain hardening as the confining pressure increased from 0 to 10 MPa. Under uniaxial compression, the marble specimens showed brittle failure characteristics with a number of axial splitting tensile cracks; in the range σ3 = 2.5-7.5 MPa, the specimens assumed single shear fracture characteristics with larger fracture angles of about 65°. However, at σ3 = 10 MPa, the specimens showed no obvious shear fracture surfaces. The triaxial cyclic experimental results indicate that, in the range of the tested confining pressures, the triaxial strengths of the marble specimens under cyclic loading were approximately equal to those under monotonic loading. With increasing cycle number, the elastic strains of the specimens first increased and later decreased, achieving maximum values, while the plastic strains increased nonlinearly. To quantitatively evaluate the damage extent of the marble under triaxial cyclic loading, a damage variable is defined according to the irreversible deformation of each cycle. The evolution of the elastic modulus of the marble was characterized by four stages: material strengthening, material degradation, material failure and structure slippage. Based on the experimental results for the marble specimens under complex cyclic loading, the cohesion of the marble decreased linearly, but the internal friction angles did not depend on the damage extent. To describe the peak strength

  20. ACCELERATED FAILURE TIME MODELS PROVIDE A USEFUL STATISTICAL FRAMEWORK FOR AGING RESEARCH

    PubMed Central

    Swindell, William R.

    2009-01-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model “deceleration factor”. AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data. PMID:19007875

  1. Accelerated failure time models provide a useful statistical framework for aging research.

    PubMed

    Swindell, William R

    2009-03-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model "deceleration factor". AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data.
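
    A minimal sketch of such an analysis, assuming the lifelines package and a synthetic stand-in for a survivorship experiment; the "deceleration factor" is read off as the exponentiated treatment coefficient of the fitted Weibull AFT model:

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(1)
n = 200
treated = np.repeat([0, 1], n // 2)
# toy data: treated animals live ~20% longer (true deceleration factor 1.2)
lifespan = rng.weibull(3.0, n) * 800 * np.where(treated == 1, 1.2, 1.0)
df = pd.DataFrame({"lifespan": lifespan, "event": 1, "treated": treated})

aft = WeibullAFTFitter().fit(df, duration_col="lifespan", event_col="event")
coef = aft.params_.loc[("lambda_", "treated")]
print("estimated deceleration factor:", round(np.exp(coef), 3))
```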

  2. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  3. Respiratory Failure

    MedlinePlus

    ... of oxygen in the blood, it's called hypoxemic (HI-pok-SE-mik) respiratory failure. When respiratory failure ... carbon dioxide in the blood, it's called hypercapnic (HI-per-KAP-nik) respiratory failure. Causes Diseases and ...

  4. Bone Marrow Failure Secondary to Cytokinesis Failure

    DTIC Science & Technology

    2015-12-01

    Fanconi anemia (FA) is a human genetic disease characterized by a progressive bone marrow failure and heightened...Fanconi anemia (FA) is the most commonly inherited bone marrow failure syndrome. FA patients develop bone marrow failure during the first decade of...experiments proposed in specific aims 1- 3 (Tasks 1-3). Task 1: To determine whether HSCs from Fanconi anemia mouse models have increased cytokinesis

  5. Quantifying Pilot Contribution to Flight Safety during Hydraulic Systems Failure

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Etherington, Timothy J.; Bailey, Randall E.; Kennedy, Kellie D.

    2017-01-01

    Accident statistics cite the flight crew as a causal factor in over 60% of large transport aircraft fatal accidents. Yet, a well-trained and well-qualified pilot is acknowledged as the critical center point of aircraft systems safety and an integral safety component of the entire commercial aviation system. The latter statement, while generally accepted, cannot be verified because little or no quantitative data exists on how and how many accidents/incidents are averted by crew actions. A joint NASA/FAA high-fidelity motion-base human-in-the-loop test was conducted using a Level D certified Boeing 737-800 simulator to evaluate the pilot's contribution to safety-of-flight during routine air carrier flight operations and in response to aircraft system failures. To quantify the human's contribution, crew complement (two-crew, reduced crew, single pilot) was used as the independent variable in a between-subjects design. This paper details the crew's actions, including decision-making, and responses while dealing with a hydraulic systems leak - one of 6 total non-normal events that were simulated in this experiment.

  6. A standardized model for predicting flap failure using indocyanine green dye

    NASA Astrophysics Data System (ADS)

    Zimmermann, Terence M.; Moore, Lindsay S.; Warram, Jason M.; Greene, Benjamin J.; Nakhmani, Arie; Korb, Melissa L.; Rosenthal, Eben L.

    2016-03-01

    Techniques that provide a non-invasive method for evaluation of intraoperative skin flap perfusion are currently available but underutilized. We hypothesize that intraoperative vascular imaging can be used to reliably assess skin flap perfusion and elucidate areas of future necrosis by means of a standardized critical perfusion threshold. Five animal groups (negative controls, n=4; positive controls, n=5; chemotherapy group, n=5; radiation group, n=5; chemoradiation group, n=5) underwent pre-flap treatments two weeks prior to undergoing random pattern dorsal fasciocutaneous flaps with a length to width ratio of 2:1 (3 x 1.5 cm). Flap perfusion was assessed via laser-assisted indocyanine green dye angiography and compared to standard clinical assessment for predictive accuracy of flap necrosis. For estimating flap-failure, clinical prediction achieved a sensitivity of 79.3% and a specificity of 90.5%. When average flap perfusion was more than three standard deviations below the average flap perfusion for the negative control group at the time of the flap procedure (144.3+/-17.05 absolute perfusion units), laser-assisted indocyanine green dye angiography achieved a sensitivity of 81.1% and a specificity of 97.3%. When absolute perfusion units were seven standard deviations below the average flap perfusion for the negative control group, specificity of necrosis prediction was 100%. Quantitative absolute perfusion units can improve specificity for intraoperative prediction of viable tissue. Using this strategy, a positive predictive threshold of flap failure can be standardized for clinical use.
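
    The decision rule above reduces to a simple threshold test. The sketch below uses the negative-control statistics quoted in the abstract (144.3 ± 17.05 absolute perfusion units) with illustrative flap readings:

```python
control_mean, control_sd = 144.3, 17.05   # negative-control values (abstract)

def at_risk(flap_perfusion, k=3):
    """Flag predicted necrosis when perfusion is more than k control
    standard deviations below the negative-control mean."""
    return flap_perfusion < control_mean - k * control_sd

for reading in (150.0, 100.0, 20.0):      # illustrative flap readings
    verdict3 = "necrosis" if at_risk(reading) else "viable"
    verdict7 = "necrosis" if at_risk(reading, k=7) else "viable"
    print(f"{reading:6.1f} APU -> k=3: {verdict3:8s} k=7: {verdict7}")
```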

  7. Multiorgan failure in the serious trauma patient.

    PubMed

    Llompart-Pou, J A; Talayero, M; Homar, J; Royo, C

    2014-10-01

    Multiorgan failure remains one of the leading causes of late morbidity and mortality after severe trauma. In the early phase, it is related with an uncontrolled hyper-inflammation state, whereas in the late phase (>72 h), septic complications play a major role. We review the underlying pathophysiology, the evaluation with different scales and the clinical factors associated with multiorgan failure, as well as potential treatment options. Copyright © 2014 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.

  8. [Therapy of heart failure with beta-blockers?].

    PubMed

    Osterziel, K J; Dietz, R

    1997-01-01

    In heart failure, chronic sympathetic stimulation alters the cardiac beta-adrenergic pathway. This alteration leads to a diminished contractile response to stimulation of the cardiac beta-1 receptor. Blockade of the beta-1 receptor partly restores the physiologic response to sympathetic stimulation at rest and during exercise. Several mechanisms resulting from the competitive blockade of the beta-1 receptor may be important. The major effect of beta-blockers seems to be triggered by a reduction of the resting heart rate, resulting in an increase of the left ventricular ejection fraction by 7-8% on average. Patients with heart failure who are treated with a beta-blocker initially experience a slight decrease in left ventricular function; beta-blocker therapy should therefore be initiated only in patients with stable heart failure. The starting dose of the beta-blocker has to be very small, e.g., 5 mg metoprolol, 1.25 mg bisoprolol or 3.125 mg carvedilol, and has to be increased stepwise to a full beta-blocking effect over a period of 4-8 weeks. Despite careful dose titration, only 90% of patients tolerate this regimen. Patients with high resting heart rates and/or dilated cardiomyopathy will have the greatest benefit. The two main reasons for withdrawal of the beta-blocker are deterioration of heart failure or symptomatic hypotension. Symptomatic improvement and a significant increase in exercise capacity appear gradually and can be measured only after more than 1 month of therapy. Three multicenter studies (MDC, CIBIS I, Carvedilol) evaluated the influence of beta-blockers on the prognosis of heart failure. The MDC trial demonstrated slower progression of heart failure with metoprolol. The MDC and CIBIS I trials could not show a significant improvement in prognosis. The larger trial with carvedilol was the first study to demonstrate decreased mortality in patients who initially tolerate beta-blocker therapy. One

  9. Diagnosis and management of heart failure in the fetus

    PubMed Central

    DAVEY, B.; SZWAST, A.; RYCHIK, J.

    2015-01-01

    Heart failure can be defined as the inability of the heart to sufficiently support the circulation. In the fetus, heart failure can be caused by a myriad of factors that include fetal shunting abnormalities, genetic cardiomyopathies, extracardiac malformations, arrhythmias and structural congenital heart disease. With advances in ultrasound has come the ability to characterize many complex conditions, previously poorly understood. Fetal echocardiography provides the tools necessary to evaluate and understand the various physiologies that contribute to heart failure in the fetus. In this review, we will explore the different mechanisms of heart failure in this unique patient population and highlight the role of fetal echocardiography in the current management of these conditions PMID:22992530

  10. Quantitative Evaluation of PET Respiratory Motion Correction Using MR Derived Simulated Data

    NASA Astrophysics Data System (ADS)

    Polycarpou, Irene; Tsoumpas, Charalampos; King, Andrew P.; Marsden, Paul K.

    2015-12-01

    The impact of respiratory motion correction on quantitative accuracy in PET imaging is evaluated using simulations for variable patient-specific characteristics such as tumor uptake and respiratory pattern. Respiratory patterns from real patients were acquired, with long quiescent motion periods (type-1) as commonly observed in most patients and with long-term amplitude variability (type-2) as is expected under conditions of difficult breathing. The respiratory patterns were combined with an MR-derived motion model to simulate real-time 4-D PET-MR datasets. Lung and liver tumors were simulated with diameters of 10 and 12 mm and tumor-to-background ratios ranging from 3:1 to 6:1. Projection data for 6- and 3-mm PET resolution were generated for the Philips Gemini scanner and reconstructed without and with motion correction using OSEM (2 iterations, 23 subsets). Motion correction was incorporated into the reconstruction process based on MR-derived motion fields. Tumor peak standardized uptake values (SUVpeak) were calculated from 30 noise realizations. Respiratory motion correction improves the quantitative performance, with the greatest benefit observed for patients of breathing type-2. For breathing type-1, after applying motion correction, the SUVpeak of a 12-mm liver tumor with 6:1 contrast was increased by 46% for a current PET resolution (i.e., 6 mm) and by 47% for a higher PET resolution (i.e., 3 mm). Furthermore, the results of this study indicate that the benefit of higher scanner resolution is small unless motion correction is applied. In particular, for a large liver tumor (12 mm) with low contrast (3:1), after motion correction the SUVpeak was increased by 34% for 6-mm resolution and by 50% for a higher PET resolution (i.e., 3-mm resolution). This investigation indicates that respiratory motion correction has a high impact on tumor quantitative accuracy and that motion correction is important in order to benefit from the increased resolution of future PET

  11. SCADA alarms processing for wind turbine component failure detection

    NASA Astrophysics Data System (ADS)

    Gonzalez, E.; Reder, M.; Melero, J. J.

    2016-09-01

    Wind turbine failure and downtime can often compromise the profitability of a wind farm due to their high impact on the operation and maintenance (O&M) costs. Early detection of failures can facilitate the changeover from corrective maintenance towards a predictive approach. This paper presents a cost-effective methodology to combine various alarm analysis techniques, using data from the Supervisory Control and Data Acquisition (SCADA) system, in order to detect component failures. The approach categorises the alarms according to a reviewed taxonomy, turning overwhelming data into valuable information to assess component status. Then, different alarms analysis techniques are applied for two purposes: the evaluation of the SCADA alarm system capability to detect failures, and the investigation of the relation between components faults being followed by failure occurrences in others. Various case studies are presented and discussed. The study highlights the relationship between faulty behaviour in different components and between failures and adverse environmental conditions.

  12. Forecasting the brittle failure of heterogeneous, porous geomaterials

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian; Heap, Michael; Main, Ian; Lavallée, Yan; Dingwell, Donald

    2017-04-01

    Heterogeneity develops in magmas during ascent and is dominated by the development of crystal and, importantly, bubble populations or pore-network clusters, which grow, interact, localize, coalesce, outgas and resorb. Pore-scale heterogeneity is also ubiquitous in sedimentary basin fill during diagenesis. As a first step, we construct 3D numerical simulations in which randomly generated heterogeneous and polydisperse spheres, designed to represent the random growth and interaction of bubbles in a liquid volume, are placed in volumes and permitted to overlap with one another. We use these simulated geometries to show that statistical predictions of inter-bubble lengthscales and of evolving bubble surface area or cluster densities can be made based on fundamental percolation theory. As a second step, we take a range of well-constrained, randomly heterogeneous rock samples, including sandstones, andesites, synthetic partially sintered glass bead samples, and intact glass samples, and subject them to a variety of stress loading conditions at a range of temperatures until failure. We record in real time the evolution of the number of acoustic events that precede failure and show that in all scenarios the acoustic event rate accelerates toward failure, consistent with previous findings. Applying tools designed to forecast the failure time based on these precursory signals, we constrain the absolute error on the forecast time. We find that for all sample types, the error associated with an accurate forecast of failure scales non-linearly with the lengthscale between the pore clusters in the material. Moreover, using a simple micromechanical model for the deformation of porous elastic bodies, we show that the ratio of the equilibrium sub-critical crack length emanating from the pore clusters to the inter-pore lengthscale provides a scaling for the error on forecast accuracy. Thus for the first time we provide a potential quantitative correction for
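
    A common way such forecasting tools work is to linearize an accelerating precursor rate: if the event rate grows as 1/(tf - t), its inverse decays linearly to zero at the failure time tf. A minimal sketch with synthetic data (an illustration of this classic linearization, not the authors' exact tools):

```python
import numpy as np

tf_true = 100.0
t = np.linspace(0.0, 90.0, 50)            # observation times before failure
rate = 50.0 / (tf_true - t)               # accelerating precursor event rate
inverse_rate = 1.0 / rate

slope, intercept = np.polyfit(t, inverse_rate, 1)  # inverse rate linear in t
tf_forecast = -intercept / slope                   # zero crossing = failure time
print("forecast failure time:", round(tf_forecast, 2))   # -> 100.0
```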

  13. Decreasing handoff-related care failures in children's hospitals.

    PubMed

    Bigham, Michael T; Logsdon, Tina R; Manicone, Paul E; Landrigan, Christopher P; Hayes, Leslie W; Randall, Kelly H; Grover, Purva; Collins, Susan B; Ramirez, Dana E; O'Guin, Crystal D; Williams, Catherine I; Warnick, Robin J; Sharek, Paul J

    2014-08-01

    Patient handoffs in health care require transfer of information, responsibility, and authority between providers. Suboptimal patient handoffs pose a serious safety risk. Studies demonstrating the impact of improved patient handoffs on care failures are lacking. The primary objective of this study was to evaluate the effect of a multihospital collaborative designed to decrease handoff-related care failures. Twenty-three children's hospitals participated in a quality improvement collaborative aimed at reducing handoff-related care failures. The improvement was guided by evidence-based recommendations regarding handoff intent and content, standardized handoff tools/methods, and clear transition of responsibility. Hospitals tailored handoff elements to locally important handoff types. Handoff-related care failures were compared between baseline and 3 intervention periods. Secondary outcomes measured compliance to specific change package elements and balancing measure of staff satisfaction. Twenty-three children's hospitals evaluated 7864 handoffs over the 12-month study period. Handoff-related care failures decreased from baseline (25.8%) to the final intervention period (7.9%) (P < .05). Significant improvement was observed in every handoff type studied. Compliance to change package elements improved (achieving a common understanding about the patient from 86% to 96% [P < .05]; clear transition of responsibility from 92% to 96% [P < .05]; and minimized interruptions and distractions from 84% to 90% [P < .05]) as did overall satisfaction with the handoff (from 55% to 70% [P < .05]). Implementation of a standardized evidence-based handoff process across 23 children's hospitals resulted in a significant decrease in handoff-related care failures, observed over all handoff types. Compliance to critical components of the handoff process improved, as did provider satisfaction. Copyright © 2014 by the American Academy of Pediatrics.

  14. Prognostic value of decreased peripheral congestion detected by Bioelectrical Impedance Vector Analysis (BIVA) in patients hospitalized for acute heart failure: BIVA prognostic value in acute heart failure.

    PubMed

    Santarelli, Simona; Russo, Veronica; Lalle, Irene; De Berardinis, Benedetta; Vetrone, Francesco; Magrini, Laura; Di Stasio, Enrico; Piccoli, Antonio; Codognotto, Marta; Mion, Monica M; Castello, Luigi M; Avanzi, Gian Carlo; Di Somma, Salvatore

    2017-06-01

    The objective of this study was to investigate the prognostic role of quantitative reduction of congestion during hospitalization, assessed by serial Bioelectrical Impedance Vector Analysis (BIVA) evaluations, in patients admitted for acute heart failure (AHF). AHF is a frequent reason for admission. Exacerbation of chronic heart failure is linked with progressive worsening of the disease and an increased incidence of death. Fluid overload is the main mechanism underlying acute decompensation in these patients, and BIVA is a validated technique able to quantify fluid overload. This was a prospective, multicentre, observational study of AHF and non-AHF patients in three emergency departments in Italy. Clinical data and BIVA evaluations were obtained at admission (t0) and discharge (tdis), and a follow-up phone call was carried out at 90 days. Three hundred and thirty-six patients were enrolled (221 AHF and 115 non-AHF patients). Clinical signs showed the most powerful prognostic relevance; in particular, the presence of rales and lower-limb oedema at tdis was linked with event relapse at 90 days. At t0, congestion detected by BIVA was observed only in the AHF group, and it had decreased significantly by tdis. An increase of resistance variation (dR/H) >11 Ω/m during hospitalization was associated with survival. BIVA showed significant results in predicting total events, both at t0 (area under the curve (AUC) 0.56, p<0.04) and at tdis (AUC 0.57, p<0.03). When combined with clinical signs, BIVA showed a very good predictive value for cardiovascular events at 90 days (AUC 0.97, p<0.0001). In AHF patients, an accurate physical examination evaluating the presence of rales and lower-limb oedema remains the cornerstone of management. A congestion reduction, obtained as a consequence of therapy and detected through BIVA analysis, with an increase of dR/H >11 Ω/m during hospitalization, seems to be associated with increased 90 day

  15. Evaluation of changes in periodontal bacteria in healthy dogs over 6 months using quantitative real-time PCR.

    PubMed

    Maruyama, N; Mori, A; Shono, S; Oda, H; Sako, T

    2018-03-01

    Porphyromonas gulae, Tannerella forsythia and Campylobacter rectus are considered dominant periodontal pathogens in dogs. Recently, quantitative real-time PCR (qRT-PCR) methods have been used for absolute quantitative determination of oral bacterial counts. The purpose of the present study was to establish a standardized qRT-PCR procedure to quantify bacterial counts of the three target periodontal bacteria (P. gulae, T. forsythia and C. rectus). Copy numbers of the three target bacteria were evaluated in 26 healthy dogs. Changes in their bacterial counts were then followed for 24 weeks in 7 healthy dogs after periodontal scaling. Analytical evaluation of each self-designed primer indicated acceptable analytical imprecision. All 26 healthy dogs were positive for P. gulae, T. forsythia and C. rectus. Median total bacterial counts (copies/ng) of each target gene were 385.612 for P. gulae, 25.109 for T. forsythia and 5.771 for C. rectus. Significant differences were observed between the copy numbers of the three target bacteria. Periodontal scaling reduced the median copy numbers of the three target bacteria in the 7 healthy dogs. However, after periodontal scaling, copy numbers of all three periodontal bacteria significantly increased over the 24 weeks (p<0.05, Kruskal-Wallis test). In conclusion, our results demonstrated that qRT-PCR can accurately measure periodontal bacteria in dogs. Furthermore, the present study revealed that the qRT-PCR method can be considered a new objective evaluation system for canine periodontal disease. Copyright© by the Polish Academy of Sciences.
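
    The longitudinal comparison above is a Kruskal-Wallis test across time points; a minimal sketch assuming SciPy, with illustrative copy-number arrays rather than the study's data:

```python
from scipy.stats import kruskal

week0  = [380, 410, 295, 520, 360, 405, 390]  # copies/ng per dog (illustrative)
week12 = [120, 150,  90, 200, 110, 140, 130]
week24 = [300, 350, 260, 480, 310, 370, 340]

h, p = kruskal(week0, week12, week24)
print(f"H = {h:.2f}, p = {p:.4f}")
```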

  16. Comparison of two bioelectrical impedance devices and dual-energy X-ray absorptiometry to evaluate body composition in heart failure.

    PubMed

    Alves, F D; Souza, G C; Biolo, A; Clausell, N

    2014-12-01

    The utilisation of bioelectrical impedance analysis (BIA) in heart failure can be affected by many factors and its applicability remains controversial. The present study aimed to verify the adequacy of single-frequency BIA (SF-BIA) and multifrequency BIA (MF-BIA) compared to dual-energy X-ray absorptiometry (DEXA) for evaluating body composition in outpatients with heart failure. In this cross-sectional study, 55 patients with stable heart failure and left ventricle ejection fraction ≤45% were evaluated for fat mass percentage, fat mass and fat-free mass by DEXA, and the results were compared with those obtained by SF-BIA (single frequency of 50 kHz) and MF-BIA (frequencies of 20 and 100 kHz). MF-BIA and DEXA gave similar mean values for fat mass percentage, fat mass and fat-free mass, whereas values from SF-BIA differed significantly from DEXA. Both SF-BIA and MF-BIA measures of body composition correlated strongly with DEXA (r > 0.8; P < 0.001), except for fat mass assessed by SF-BIA, which showed a moderate correlation (r = 0.760; P < 0.001). MF-BIA also showed better agreement with DEXA by Bland-Altman analysis in all measurements. However, both types of equipment showed wide limits of agreement and a significant relationship between variance and bias (Pitman's test P < 0.05), except MF-BIA for fat-free mass. Compared with DEXA, MF-BIA showed better accuracy than SF-BIA, although both types of equipment showed wide limits of agreement. The BIA technique should be used with caution, and regression equations might be useful for correcting the observed variations, mainly at extreme values of body composition. © 2014 The British Dietetic Association Ltd.
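
    The Bland-Altman agreement analysis used above reduces to a bias and 95% limits of agreement. A minimal sketch with made-up values, not study data:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Return bias and 95% limits of agreement between paired methods."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias, sd = diff.mean(), diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    bia  = [28.1, 33.4, 25.0, 40.2, 31.7]  # fat mass % by BIA (illustrative)
    dexa = [27.0, 35.1, 24.2, 41.0, 30.3]  # fat mass % by DEXA (illustrative)
    bias, (lo, hi) = bland_altman(bia, dexa)
    print(f"bias={bias:.2f}%, limits of agreement {lo:.2f}% to {hi:.2f}%")
    ```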

  17. Methods for the field evaluation of quantitative G6PD diagnostics: a review.

    PubMed

    Ley, Benedikt; Bancone, Germana; von Seidlein, Lorenz; Thriemer, Kamala; Richards, Jack S; Domingo, Gonzalo J; Price, Ric N

    2017-09-11

    Individuals with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of severe haemolysis following the administration of 8-aminoquinoline compounds. Primaquine is the only widely available 8-aminoquinoline for the radical cure of Plasmodium vivax. Tafenoquine is under development with the potential to simplify treatment regimens, but point-of-care (PoC) tests will be needed to provide quantitative measurement of G6PD activity prior to its administration. There is currently a lack of appropriate G6PD PoC tests, but a number of new tests are in development and are likely to enter the market in the coming years. As these are implemented, they will need to be validated in field studies. This article outlines the technical details of the field evaluation of novel quantitative G6PD diagnostics, including sample handling, reference testing and statistical analysis. Field evaluation is based on the comparison of paired samples: one sample tested by the new assay at the point of care and one sample tested by the gold-standard reference method, UV spectrophotometry, in an established laboratory. Samples can be collected as capillary or venous blood; the existing literature suggests that differences between capillary and venous blood are unlikely to affect results substantially. The collection and storage of samples is critical to ensure preservation of enzyme activity; it is recommended that samples be stored at 4 °C and tested within 4 days of collection. Test results can be presented visually as a scatter plot, a Bland-Altman plot, and a histogram of the G6PD activity distribution of the study population. Calculating the adjusted male median allows results to be categorized according to G6PD activity, standard performance indicators to be calculated, and receiver operating characteristic (ROC) analysis to be performed.
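
    The adjusted-male-median normalisation mentioned above is easy to sketch. This assumes the commonly cited definition (the median activity of male samples, after excluding values below 10% of the unadjusted male median, defines 100% activity) and illustrative 30%/70% category cut-offs:

    ```python
    import statistics

    def adjusted_male_median(male_activities):
        """100% reference: median of male samples after dropping values <10% of the raw median."""
        m = statistics.median(male_activities)
        return statistics.median([a for a in male_activities if a >= 0.10 * m])

    def categorize(activity, amm):
        pct = 100.0 * activity / amm
        return "deficient" if pct < 30 else "intermediate" if pct < 70 else "normal"

    males = [8.1, 7.5, 0.4, 9.0, 7.9, 8.6, 0.9, 8.3]  # U/g Hb, invented
    amm = adjusted_male_median(males)                  # 8.1 here
    print(categorize(2.0, amm))                        # ~25% of AMM -> "deficient"
    ```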

  18. Evaluating a Dutch cardiology primary care plus intervention on the Triple Aim outcomes: study design of a practice-based quantitative and qualitative research.

    PubMed

    Quanjel, Tessa C C; Spreeuwenberg, Marieke D; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2017-09-06

    In an attempt to deal with the pressures on the health-care system and to guarantee sustainability, changes are needed. This study focuses on a cardiology primary care plus intervention. Primary care plus (PC+) is a new health-care delivery model focused on substitution of specialist care in the hospital setting with specialist care in the primary care setting. The intervention consists of a cardiology PC+ centre in which cardiologists, supported by other health-care professionals, provide consultations in a primary care setting. The PC+ centre aims to improve the health of the population and quality of care as experienced by patients, and reduce the number of referrals to hospital-based outpatient specialist care in order to reduce health-care costs. These aims reflect the Triple Aim principle. Hence, the objectives of the study are to evaluate the cardiology PC+ centre in terms of the Triple Aim outcomes and to evaluate the process of the introduction of PC+. The study is a practice-based, quantitative study with a longitudinal observational design, and an additional qualitative study to supplement, interpret and improve the quantitative study. The study population of the quantitative part will consist of adult patients (≥18 years) with non-acute and low-complexity cardiology-related health complaints, who will be referred to the cardiology PC+ centre (intervention group) or hospital-based outpatient cardiology care (control group). All eligible patients will be asked to complete questionnaires at three different time points consisting of questions about their demographics, health status and experience of care. Additionally, quantitative data will be collected about health-care utilization and related health-care costs at the PC+ centre and the hospital. The qualitative part, consisting of semi-structured interviews, focus groups, and observations, is designed to evaluate the process as well as to amplify, clarify and explain quantitative results. This study

  19. Prognostic value of noninvasive hemodynamic evaluation of the acute effect of levosimendan in advanced heart failure.

    PubMed

    Malfatto, Gabriella; Della Rosa, Francesco; Rella, Valeria; Villani, Alessandra; Branzi, Giovanna; Blengino, Simonetta; Giglio, Alessia; Facchini, Mario; Parati, Gianfranco

    2014-04-01

    Optimization of inotropic treatment in worsening heart failure sometimes requires invasive hemodynamic assessment in selected patients. Impedance cardiography (ICG) may be useful for a noninvasive hemodynamic evaluation. ICG was performed in 40 patients (69 ± 8 years; left ventricular ejection fraction 27.5 ± 5.6%; New York Heart Association class 3.18 ± 0.34; Interagency Registry for Mechanically Assisted Circulatory Support profile 5.48 ± 0.96), before and after infusion of levosimendan (0.1–0.2 µg/kg per min for up to 24 h). Echocardiogram, ICG [measuring cardiac index (CI), total peripheral resistance (TPR) and thoracic fluid content (TFC)] and plasma levels of brain natriuretic peptide (BNP) were obtained; in nine patients, right heart catheterization was also carried out. When right catheterization and ICG were performed simultaneously, a significant relationship was observed between values of CI and TPR, and between TFC and pulmonary wedge pressure. ICG detected the levosimendan-induced recovery of hemodynamic status, associated with improved systolic and diastolic function and a reduction in BNP levels. One-year mortality was 4.4%. At multivariate analysis, independent predictors of mortality were: no improvement in the severity of mitral regurgitation, a persistent restrictive filling pattern (E/E' > 15), a reduction in BNP levels of less than 30%, and a change of less than 10% in CI, TPR and TFC. When combined, absence of hemodynamic improvement at ICG predicted 1-year mortality with better sensitivity (86%) and specificity (85%) than the combination of echocardiographic and BNP criteria alone (sensitivity 80% and specificity 36%). Noninvasive hemodynamic evaluation of heart failure patients during infusion of inodilator drugs is reliable and may help in their prognostic stratification.

  20. PREDICE score as a predictor of 90 days mortality in patients with heart failure

    NASA Astrophysics Data System (ADS)

    Purba, D. P. S.; Hasan, R.

    2018-03-01

    Hospitalization in chronic heart failure patients is associated with high mortality and morbidity rates. The 90 days following hospital discharge of heart failure patients are known as the vulnerable phase and carry a high risk of poor outcomes. Identification of high-risk individuals through prognostic evaluation is intended to allow closer and more intensive follow-up, to decrease the morbidity and mortality of heart failure. To determine whether the PREDICE score could predict mortality within 90 days in patients with heart failure, we conducted an observational cohort study in patients hospitalized for worsening chronic heart failure. Patients were followed up for up to 90 days after the initial evaluation, with death as the primary endpoint. We found a statistically significant difference in PREDICE score between the survival and mortality groups (p=0.001; 84%, 95% CI: 60.9%-97.4%). In conclusion, the PREDICE score has a good ability to predict mortality within 90 days in patients with heart failure.

  1. Quantitative and Qualitative Evaluation of Iranian Researchers’ Scientific Production in Dentistry Subfields

    PubMed Central

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-01-01

    Background: As in other fields of medicine, scientific production in the field of dentistry has a significant place. This study aimed at quantitatively and qualitatively evaluating Iranian researchers' scientific output in the field of dentistry and determining their contribution in each of the dentistry subfields and branches. Methods: This was a scientometric study that applied quantitative and qualitative indices of Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. Results: 777 (83.73%) of the indexed items of all scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontics subfield. The correlation coefficient test showed a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second (2004-2013), in favor of the latter (P = 0.001). Conclusions: The distribution frequencies of scientific production in the various subfields of dentistry were very different. The infrastructure needs to be reinforced for more balanced scientific production in the field and its related subfields. PMID:26635439

  2. Ventilatory support in critically ill hematology patients with respiratory failure

    PubMed Central

    2012-01-01

    Introduction Hematology patients admitted to the ICU frequently experience respiratory failure and require mechanical ventilation. Noninvasive mechanical ventilation (NIMV) may decrease the risk of intubation, but NIMV failure poses its own risks. Methods To establish the impact of ventilatory management and NIMV failure on outcome, data from a prospective, multicenter, observational study were analyzed. All hematology patients admitted to one of the 34 participating ICUs in a 17-month period were followed up. Data on demographics, diagnosis, severity, organ failure, and supportive therapies were recorded. A logistic regression analysis was done to evaluate the risk factors associated with death and NIMV failure. Results Of 450 patients, 300 required ventilatory support. A diagnosis of congestive heart failure and the initial use of NIMV significantly improved survival, whereas APACHE II score, allogeneic transplantation, and NIMV failure increased the risk of death. The risk factors associated with NIMV success were age, congestive heart failure, and bacteremia. Patients with NIMV failure experienced more severe respiratory impairment than did those electively intubated. Conclusions NIMV improves the outcome of hematology patients with respiratory insufficiency, but NIMV failure may have the opposite effect. A careful selection of patients with rapidly reversible causes of respiratory failure may increase NIMV success. PMID:22827955

  3. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  4. Health information systems: failure, success and improvisation.

    PubMed

    Heeks, Richard

    2006-02-01

    The generalised assumption of health information systems (HIS) success is questioned by a few commentators in the medical informatics field. They point to widespread HIS failure. The purpose of this paper was therefore to develop a better conceptual foundation for, and practical guidance on, health information systems failure (and success). Literature and case analysis plus pilot testing of developed model. Defining HIS failure and success is complex, and the current evidence base on HIS success and failure rates was found to be weak. Nonetheless, the best current estimate is that HIS failure is an important problem. The paper therefore derives and explains the "design-reality gap" conceptual model. This is shown to be robust in explaining multiple cases of HIS success and failure, yet provides a contingency that encompasses the differences which exist in different HIS contexts. The design-reality gap model is piloted to demonstrate its value as a tool for risk assessment and mitigation on HIS projects. It also throws into question traditional, structured development methodologies, highlighting the importance of emergent change and improvisation in HIS. The design-reality gap model can be used to address the problem of HIS failure, both as a post hoc evaluative tool and as a pre hoc risk assessment and mitigation tool. It also validates a set of methods, techniques, roles and competencies needed to support the dynamic improvisations that are found to underpin cases of HIS success.

  5. Evaluating wood failure in plywood shear by optical image analysis

    Treesearch

    Charles W. McMillin

    1984-01-01

    This exploratory study evaluates the potential of using an automatic image analysis method to measure percent wood failure in plywood shear specimens. The results suggest that this method may be as accurate as the visual method in tracking long-term gluebond quality. With further refinement, the method could lead to automated equipment replacing the subjective visual...

  6. Framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms in conjunction with 3D landmark localization and registration

    NASA Astrophysics Data System (ADS)

    Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl

    2016-03-01

    We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.

  7. Development of a quantitative model for the mechanism of raveling failure in highway rock slopes using LIDAR.

    DOT National Transportation Integrated Search

    2013-03-01

    Rock falls on highways, while dangerous, are unpredictable. Most rock falls are of the raveling type and are not conducive to stability calculations, and even the failure mechanisms are not well understood. LIDAR (LIght Detection And Ranging) has been sh...

  8. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and the RSDr values were less than 30 and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method is thus applicable to practical analyses for the detection and quantification of MON87701.
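
    How a conversion factor is used downstream can be made concrete. A sketch under the usual convention for event-specific qPCR methods of this family, where GMO content is the measured ratio of event copies to endogenous reference-gene copies divided by Cf; the copy numbers below are invented:

    ```python
    def gmo_percent(event_copies, reference_copies, cf=1.24):
        """GMO amount (%) from measured copy numbers and the conversion factor Cf."""
        return (event_copies / reference_copies) / cf * 100.0

    print(f"{gmo_percent(310.0, 5000.0):.2f} %")  # (310/5000)/1.24*100 = 5.00 %
    ```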

  9. Failure analysis and modeling of a VAXcluster system

    NASA Technical Reports Server (NTRS)

    Tang, Dong; Iyer, Ravishankar K.; Subramani, Sujatha S.

    1990-01-01

    This paper discusses the results of a measurement-based analysis of real error data collected from a DEC VAXcluster multicomputer system. In addition to evaluating basic system dependability characteristics such as error and failure distributions and hazard rates for both individual machines and for the VAXcluster, reward models were developed to analyze the impact of failures on the system as a whole. The results show that more than 46 percent of all failures were due to errors in shared resources, despite the fact that these errors have a recovery probability greater than 0.99. The hazard rate calculations show that not only errors but also failures occur in bursts. Approximately 40 percent of all failures occurred in bursts and involved multiple machines, indicating that correlated failures are significant. Analysis of rewards shows that software errors have the lowest reward (0.05 vs 0.74 for disk errors). The expected reward rate (a reliability measure) of the VAXcluster drops to 0.5 in 18 hours for the 7-out-of-7 model and in 80 days for the 3-out-of-7 model.
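
    The 7-out-of-7 and 3-out-of-7 models above are instances of k-out-of-n reliability. A hedged sketch assuming independent, identical machines (an assumption the paper's correlated-failure finding shows can be optimistic):

    ```python
    from math import comb

    def k_of_n_reliability(p, k, n):
        """P(at least k of n machines up), machines independent, each up with prob p."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    p = 0.95  # illustrative per-machine reliability
    print(f"7-of-7: {k_of_n_reliability(p, 7, 7):.3f}")   # 0.95**7 ≈ 0.698
    print(f"3-of-7: {k_of_n_reliability(p, 3, 7):.6f}")   # ≈ 0.999994
    ```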

  10. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  11. Evaluating the best time to intervene acute liver failure in rat models induced by d-galactosamine.

    PubMed

    Éboli, Lígia Patrícia de Carvalho Batista; Netto, Alcides Augusto Salzedas; Azevedo, Ramiro Antero de; Lanzoni, Valéria Pereira; Paula, Tatiana Sugayama de; Goldenberg, Alberto; Gonzalez, Adriano Miziara

    2016-12-01

    To describe an animal model of acute liver failure induced by intraperitoneal d-galactosamine injections in rats and to define the best time to intervene, evaluated through King's College and Clichy's criteria. Sixty-one Wistar female rats were distributed into three groups: group 1 (11 rats received 1.4 g/kg of d-galactosamine intraperitoneally and were observed until they died); group 2 (44 rats received a dose of 1.4 g/kg of d-galactosamine, and blood and histological samples were collected for analysis at 12, 24, 48, 72 and 120 hours after the injection); and a control group (6 rats). Twelve hours after applying d-galactosamine, AST/ALT, bilirubin, factor V, PT and INR were already altered. The peak was reached at 48 hours. INR > 6.5 was found 12 hours after the injection and factor V < 30% after 24 hours. All the laboratory variables presented statistical differences, except urea (p = 0.758). There were statistical differences among all the histological variables analyzed. King's College and Clichy's criteria were fulfilled 12 hours after the d-galactosamine injection, and this time may represent the best time to intervene in this acute liver failure animal model.

  12. Real-time automated failure analysis for on-orbit operations

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system which is to provide real-time failure analysis support to controllers at the NASA Johnson Space Center Control Center Complex (CCC) for both Space Station and Space Shuttle on-orbit operations is described. The system employs monitored systems' models of failure behavior and model evaluation algorithms which are domain-independent. These failure models are viewed as a stepping stone to more robust algorithms operating over models of intended function. The described system is designed to meet two sets of requirements. It must provide a useful failure analysis capability enhancement to the mission controller. It must satisfy CCC operational environment constraints such as cost, computer resource requirements, verification, and validation. The underlying technology and how it may be used to support operations is also discussed.

  13. Quantitative evaluation of dual-flip-angle T1 mapping on DCE-MRI kinetic parameter estimation in head and neck

    PubMed Central

    Chow, Steven Kwok Keung; Yeung, David Ka Wai; Ahuja, Anil T; King, Ann D

    2012-01-01

    Purpose To quantitatively evaluate kinetic parameter estimation for head and neck (HN) dynamic contrast-enhanced (DCE) MRI with dual-flip-angle (DFA) T1 mapping. Materials and methods Clinical DCE-MRI datasets of 23 patients with HN tumors were included in this study. T1 maps were generated based on the multiple-flip-angle (MFA) method and different DFA combinations. Tofts model parameter maps of kep, Ktrans and vp based on MFA and DFAs were calculated and compared. Fitted parameters by MFA and DFAs were quantitatively evaluated in primary tumor, salivary gland and muscle. Results T1 mapping deviations by DFAs produced marked deviations in kinetic parameter estimation in head and neck tissues. In particular, the DFA of [2°, 7°] significantly overestimated, while [7°, 12°] and [7°, 15°] significantly underestimated, Ktrans and vp (P<0.01). [2°, 15°] achieved the smallest, but still statistically significant, overestimation of Ktrans and vp in primary tumors: 32.1% and 16.2%, respectively. kep fitting results by DFAs were relatively close to the MFA reference compared to Ktrans and vp. Conclusions T1 deviations induced by DFA can result in significant errors in kinetic parameter estimation, particularly of Ktrans and vp, through Tofts model fitting. The MFA method should be more reliable and robust for accurate quantitative pharmacokinetic analysis in head and neck. PMID:23289084
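
    For reference, dual-flip-angle T1 estimation typically linearises the SPGR signal equation: plotting S/sin α against S/tan α gives slope E1 = exp(−TR/T1). A self-contained sketch with synthetic values, not the study's data:

    ```python
    import numpy as np

    def t1_dfa(s1, s2, a1_deg, a2_deg, tr_ms):
        """T1 (ms) from two SPGR signals via the linearised signal equation."""
        a1, a2 = np.deg2rad(a1_deg), np.deg2rad(a2_deg)
        e1 = (s2 / np.sin(a2) - s1 / np.sin(a1)) / (s2 / np.tan(a2) - s1 / np.tan(a1))
        return -tr_ms / np.log(e1)

    # Simulate noise-free signals for T1 = 1000 ms, TR = 5 ms, then recover T1.
    tr, t1_true = 5.0, 1000.0
    e1 = np.exp(-tr / t1_true)
    spgr = lambda a: np.sin(a) * (1 - e1) / (1 - e1 * np.cos(a))
    s1, s2 = spgr(np.deg2rad(2.0)), spgr(np.deg2rad(15.0))
    print(t1_dfa(s1, s2, 2.0, 15.0, tr))  # ≈ 1000.0
    ```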

  14. Heart Failure

    MedlinePlus

    Heart failure is a condition in which the heart can't pump enough blood to meet the body's needs. Heart failure does not mean that your heart has stopped ... and shortness of breath. Common causes of heart failure are coronary artery disease, high blood pressure and ...

  15. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    PubMed

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and the other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 (p<0.05). However, the Ve values decreased significantly only at week 9 (p=0.032), and no difference in Kep was found between the two groups. The BMD values of the OVX group decreased significantly compared with those of the control group from week 3 (p<0.05). Transmission electron microscopy showed tighter gaps between vascular endothelial cells with swollen mitochondria
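
    Both this and the preceding DCE-MRI record fit the Tofts model; for reference, its standard form, with the efflux rate constant tied to the other two parameters, is:

    ```latex
    % Standard Tofts model: tissue concentration C_t from the arterial input C_p
    C_t(t) = K^{\mathrm{trans}} \int_0^t C_p(\tau)\, e^{-k_{ep}(t-\tau)}\, d\tau,
    \qquad k_{ep} = K^{\mathrm{trans}} / v_e
    ```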

  16. [Understanding heart failure].

    PubMed

    Boo, José Fernando Guadalajara

    2006-01-01

    Heart failure is a disease with several definitions, and the term "heart failure" has brought about confusion in the terminology. For this reason, the value of the ejection fraction (< 0.40 or < 0.35) is used in most meta-analyses on the treatment of heart failure, avoiding the term "heart failure", which is a confounding concept. In this paper we carefully analyze the meaning of contractility, ventricular function or performance, preload, afterload, heart failure, compensation mechanisms in heart failure, myocardial oxygen consumption, inadequate, adequate and inappropriate hypertrophy, systole, diastole, compliance, problems of relaxation, and diastolic dysfunction. The definitions are supported by the original scientific descriptions, in an attempt to clarify the concepts about ventricular function and heart failure and, in this way, use the same scientific language about the meaning of ventricular function, heart failure, and diastolic dysfunction.

  17. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly. These sources often dominate component level risk. While consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to influence the determination whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system level test data or operational data. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.

  18. Recognising and referring children exposed to domestic abuse: a multi-professional, proactive systems-based evaluation using a modified Failure Mode and Effects Analysis (FMEA).

    PubMed

    Ashley, Laura; Armitage, Gerry; Taylor, Julie

    2017-03-01

    Failure Modes and Effects Analysis (FMEA) is a prospective quality assurance methodology increasingly used in healthcare, which identifies potential vulnerabilities in complex, high-risk processes and generates remedial actions. We aimed, for the first time, to apply FMEA in a social care context to evaluate the process for recognising and referring children exposed to domestic abuse within one Midlands city safeguarding area in England. A multidisciplinary, multi-agency team of 10 front-line professionals undertook the FMEA, using a modified methodology, over seven group meetings. The FMEA included mapping out the process under evaluation to identify its component steps, identifying failure modes (potential errors) and possible causes for each step and generating corrective actions. In this article, we report the output from the FMEA, including illustrative examples of the failure modes and corrective actions generated. We also present an analysis of feedback from the FMEA team and provide future recommendations for the use of FMEA in appraising social care processes and practice. Although challenging, the FMEA was unequivocally valuable for team members and generated a significant number of corrective actions locally for the safeguarding board to consider in its response to children exposed to domestic abuse. © 2016 John Wiley & Sons Ltd.

  19. Failure analysis of pinch-torsion tests as a thermal runaway risk evaluation method of Li-Ion Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yuzhi; Li, Dr. Tianlei; Ren, Prof. Fei

    2014-01-01

    Recently a pinch-torsion test was developed for safety testing of Li-ion batteries (Ren et al., J. Power Sources, 2013). It has been demonstrated that this test can generate small internal short-circuit spots in the separator in a controllable and repeatable manner. In the current research, the failure mechanism is examined by numerical simulations and comparisons to experimental observations. Finite element models are developed to evaluate the deformation of the separators under both pure pinch and pinch-torsion loading conditions. It is discovered that the addition of the torsion component significantly increases the maximum principal strain, which is believed to induce the internal short circuit. In addition, the applied load in the pinch-torsion test is significantly less than in the pure pinch test, thus dramatically improving the applicability of this method to ultra-thick batteries which otherwise require loads in excess of machine capability. It is further found that separator failure is achieved in the early stage of torsion (within a few degrees of rotation). The effect of the coefficient of friction on the maximum principal strain is also examined.

  20. Heart failure.

    PubMed

    Metra, Marco; Teerlink, John R

    2017-10-28

    Heart failure is common in adults, accounting for substantial morbidity and mortality worldwide. Its prevalence is increasing because of ageing of the population and improved treatment of acute cardiovascular events, despite the efficacy of many therapies for patients with heart failure with reduced ejection fraction, such as angiotensin converting enzyme (ACE) inhibitors, angiotensin receptor blockers (ARBs), β blockers, and mineralocorticoid receptor antagonists, and advanced device therapies. Combined angiotensin receptor blocker neprilysin inhibitors (ARNIs) have been associated with improvements in hospital admissions and mortality from heart failure compared with enalapril, and guidelines now recommend substitution of ACE inhibitors or ARBs with ARNIs in appropriate patients. Improved safety of left ventricular assist devices means that these are becoming more commonly used in patients with severe symptoms. Antidiabetic therapies might further improve outcomes in patients with heart failure. New drugs with novel mechanisms of action, such as cardiac myosin activators, are under investigation for patients with heart failure with reduced left ventricular ejection fraction. Heart failure with preserved ejection fraction is a heterogeneous disorder that remains incompletely understood and will continue to increase in prevalence with the ageing population. Although some data suggest that spironolactone might improve outcomes in these patients, no therapy has conclusively shown a significant effect. Hopefully, future studies will address these unmet needs for patients with heart failure. Admissions for acute heart failure continue to increase but, to date, no new therapies have improved clinical outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Quantitative Appearance Inspection for Film Coated Tablets.

    PubMed

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  2. Right ventricular strain in heart failure: Clinical perspective.

    PubMed

    Tadic, Marijana; Pieske-Kraigher, Elisabeth; Cuspidi, Cesare; Morris, Daniel A; Burkhardt, Franziska; Baudisch, Ana; Haßfeld, Sabine; Tschöpe, Carsten; Pieske, Burket

    2017-10-01

    The number of studies demonstrating the importance of right ventricular remodelling in a wide range of cardiovascular diseases has increased in the past two decades. Speckle-tracking imaging provides new variables that give comprehensive information about right ventricular function and mechanics. In this review, we summarize current knowledge of right ventricular mechanics in heart failure with reduced ejection fraction and preserved ejection fraction. We searched PubMed, MEDLINE, Ovid and Embase databases for studies published from January 2000 to December 2016 in the English language using the following keywords: "right ventricle"; "strain"; "speckle tracking"; "heart failure with reduced ejection fraction"; and "heart failure with preserved ejection fraction". Investigations showed that right ventricular dysfunction is associated with higher cardiovascular and overall mortality in patients with heart failure, irrespective of ejection fraction. The number of studies investigating right ventricular strain in patients with heart failure with reduced ejection fraction is constantly increasing, whereas data on right ventricular mechanics in patients with heart failure with preserved ejection fraction are limited. Given the high feasibility, accuracy and clinical implications of right ventricular strain in the population with heart failure, it is of great importance to try to include the evaluation of right ventricular strain as a regular part of each echocardiographic examination in patients with heart failure. However, further investigations are necessary to establish right ventricular strain as a standard variable for decision-making. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  3. Effect of the infrastructure material on the failure behavior of prosthetic crowns.

    PubMed

    Sonza, Queli Nunes; Della Bona, Alvaro; Borba, Márcia

    2014-05-01

    To evaluate the effect of the infrastructure (IS) material on the fracture behavior of prosthetic crowns. Restorations were fabricated using a metal die simulating a prepared tooth. Four groups were evaluated: YZ-C, Y-TZP (In-Ceram YZ, Vita) IS produced by CAD-CAM; IZ-C, In-Ceram Zirconia (Vita) IS produced by CAD-CAM; IZ-S, In-Ceram Zirconia (Vita) IS produced by slip-casting; and MC, metal IS (control). The IS were veneered with porcelain and resin-cemented to fiber-reinforced composite dies. Specimens were loaded in compression to failure using a universal testing machine. The 30° angled load was applied by a spherical piston in 37°C distilled water. Fractography was performed using a stereomicroscope and SEM. Data were statistically analyzed with ANOVA and Student-Newman-Keuls tests (α=0.05). Significant differences were found between groups (p=0.022). MC showed the highest mean failure load, statistically similar to YZ-C. There was no statistical difference between YZ-C, IZ-C and IZ-S. MC and YZ-C showed no catastrophic failures. IZ-C and IZ-S showed chipping and catastrophic failures. The fracture behavior is similar to reported clinical failures. Considering the ceramic systems evaluated, YZ-C and MC crowns present greater fracture loads and a more favorable failure mode than In-Ceram Zirconia crowns, regardless of the fabrication method (CAD-CAM or slip-cast). Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  4. Quantitative Analysis of Verbal Expressions in Comments from Evaluation Committee Reviewers in AIST between Fiscal Years 2001 and 2008

    ERIC Educational Resources Information Center

    Yamamoto, Tetsuya

    2010-01-01

    This article discusses the quantitative analysis of verbal expressions of comments from the evaluation committee reviewers for 8 years (FY2001-FY2008) at the Japanese Public Research Institute, National Institute of Advanced Industrial Science and Technology (AIST). First, the terms often appearing in the comment sheets were observed. Moreover,…

  5. Tophaceous gout: quantitative evaluation by direct physical measurement.

    PubMed

    Schumacher, H Ralph; Becker, Michael A; Palo, William A; Streit, Janet; MacDonald, Patricia A; Joseph-Ridge, Nancy

    2005-12-01

    The absence of accepted standardized methods for monitoring tophaceous gout limits the ability to track tophus progression or regression. This multicenter study assessed intra- and interrater reproducibility of a simple and direct physical measurement. The quantitative evaluation was the area (mm²) of each measurable tophus and was determined independently by 2 raters on 2 occasions within 10 days. Intra- and interrater reproducibilities were determined by calculating mean differences and average percentage differences (APD) in measurements of areas for the same tophus at each of 2 visits and by each rater, respectively. Fifty-two tophi were measured in 13 subjects: 22 on the hand/wrist, 16 on the elbow, and 14 on the foot/ankle. The mean (± SD) difference in tophus areas between visits was -0.2 ± 835 mm² (95% CI -162 to 162 mm²) and the mean (± SD) APD was 29% ± 33%. The mean (± SD) APD between raters was 32% ± 27%. The largest variations in measurements were noted for elbow tophi, and variations were least for well demarcated tophi on the hands. This simple and reproducible method can be easily utilized in clinical trials and in practice as a measure of the efficacy of urate-lowering treatment in tophaceous gout. Among the factors contributing to variability in these measurements were the anatomic site of tophi and rater experience with the method. Restriction of measurements to well circumscribed hand or foot tophi could improve reliability, but major changes, as expected with effective therapy, can clearly be documented with this simple technique.
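
    The APD statistic above is simple to reproduce. A sketch, assuming APD is the absolute difference of a measurement pair expressed as a percentage of the pair mean (areas invented):

    ```python
    def apd(a1, a2):
        """Average percentage difference between paired tophus areas (mm^2)."""
        return abs(a1 - a2) / ((a1 + a2) / 2) * 100.0

    print(f"{apd(450.0, 392.0):.1f} %")  # ≈ 13.8 %
    ```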

  6. Evaluation of nasal mucociliary activity in patients with chronic renal failure.

    PubMed

    Kucur, Cuneyt; Ozbay, Isa; Gulcan, Erim; Kulekci, Semra; Aksoy, Sinan; Oghan, Fatih

    2016-05-01

    The ability of respiratory mucosal surfaces to eliminate foreign particles and pathogens and to keep mucosal surfaces moist and fresh depends on mucociliary activity. Chronic renal failure (CRF) is an irreversible medical condition that may result in important extrarenal systemic consequences, such as cardiovascular, metabolic, and respiratory system abnormalities. Although there are studies describing nasal manifestations of CRF, data are lacking concerning the effects of the condition on nasal mucosa. The goal of the current study was to evaluate nasal mucociliary clearance (NMC) time in patients with CRF. This prospective cohort study conducted in a tertiary referral center included 32 non-diabetic end-stage CRF patients and 30 control individuals. The control group consisted of voluntary participants who had been referred to our clinic for symptoms other than rhinological diseases. The mean NMC times in CRF patients and control individuals were 12.51 ± 3.74 min (range 7-22 min) and 8.97 ± 1.83 min (range 6-13 min), respectively. The mean NMC time in patients with CRF was significantly longer than that in control individuals (p < 0.001). Clinicians must keep in mind that NMC time in CRF patients is prolonged and must follow-up these patients more closely for sinonasal and middle ear infections.

  7. Quantitative Evaluation of Brain Stem Atrophy Using Magnetic Resonance Imaging in Adult Patients with Alexander Disease.

    PubMed

    Yoshida, Tomokatsu; Yasuda, Rei; Mizuta, Ikuko; Nakagawa, Masanori; Mizuno, Toshiki

    2017-01-01

    Brain MRI in adult patients with Alexander disease (AxD) mainly shows atrophy in the medulla oblongata. However, currently there is no quantitative standard for assessing this atrophy. In this study, we quantitatively evaluated the brain stem of AxD patients with glial fibrillary acidic protein (GFAP) mutation using conventional MRI to evaluate its usefulness as an aid to diagnosing AxD in daily clinical practice. Nineteen AxD patients with GFAP mutation were compared with 14 patients negative for GFAP mutation in whom AxD was suspected due to "atrophy of the medulla oblongata." In the GFAP mutation-positive group, the sagittal diameter of the medulla oblongata, the ratio of the diameter of the medulla oblongata to that of the midbrain (MO/MB), and the ratio of the sagittal diameter of the medulla oblongata to that of the pons (MO/Po) were significantly smaller compared to those of the GFAP mutation-negative group (p < 0.01). The sensitivity and specificity of each parameter were 87.5 and 92.3%, 91.7 and 81.3%, and 88.2 and 100% with a sagittal diameter of the medulla oblongata <9.0 mm, MO/MB <0.60, and sagittal MO/Po <0.46, respectively. These parameters can provide very useful information to differentially diagnose AxD from other disorders associated with brain stem atrophy in adult patients. © 2017 S. Karger AG, Basel.

  8. Highly sensitive and quantitative evaluation of the EGFR T790M mutation by nanofluidic digital PCR.

    PubMed

    Iwama, Eiji; Takayama, Koichi; Harada, Taishi; Okamoto, Isamu; Ookubo, Fumihiko; Kishimoto, Junji; Baba, Eishi; Oda, Yoshinao; Nakanishi, Yoichi

    2015-08-21

    The mutation of T790M in EGFR is a major mechanism of resistance to treatment with EGFR-TKIs. Only qualitative detection (presence or absence) of T790M has been described to date, however. Digital PCR (dPCR) analysis has recently been applied to the quantitative detection of target molecules in cancer with high sensitivity. In the present study, 25 tumor samples (13 obtained before and 12 after EGFR-TKI treatment) from 18 NSCLC patients with activating EGFR mutations were evaluated for T790M with dPCR. The ratio of the number of T790M alleles to that of activating mutation alleles (T/A) was determined. dPCR detected T790M in all 25 samples. Although T790M was present in all pre-TKI samples from 13 patients, 10 of these patients had a low T/A ratio and manifested substantial tumor shrinkage during treatment with EGFR-TKIs. In six of seven patients for whom both pre- and post-TKI samples were available, the T/A ratio increased markedly during EGFR-TKI treatment. Highly sensitive dPCR thus detected T790M in all NSCLC patients harboring activating EGFR mutations whether or not they had received EGFR-TKI treatment. Not only highly sensitive but also quantitative detection of T790M is important for evaluation of the contribution of T790M to EGFR-TKI resistance.
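
    Digital PCR turns partition counts into absolute copy numbers through Poisson statistics, from which the T/A ratio above follows directly. A sketch with invented droplet counts:

    ```python
    import math

    def copies_per_partition(n_positive, n_total):
        """Poisson-corrected mean copies per partition: -ln(fraction negative)."""
        return -math.log((n_total - n_positive) / n_total)

    n = 20000                                    # total droplets (illustrative)
    t790m = copies_per_partition(150, n)         # T790M-positive droplets
    activating = copies_per_partition(6000, n)   # activating-mutation droplets
    print(f"T/A = {t790m / activating:.4f}")     # ≈ 0.0211
    ```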

  9. Non cardiopatic and cardiopatic beta thalassemic patients: quantitative and qualitative cardiac iron deposition evaluation with MRI.

    PubMed

    Macarini, L; Marini, S; Pietrapertosa, A; Scardapane, A; Ettorre, G C

    2005-01-01

    Cardiomyopathy is one of the major complications of β thalassaemia major, as a result of transfusional iron overload. The aim of our study was to evaluate with MR whether there is any difference in iron deposition signal intensity (SI) or distribution between non-cardiopathic and cardiopathic thalassaemic patients, in order to establish whether there is a relationship between cardiopathy and iron deposition. We studied 20 patients affected by β thalassaemia major, of whom 10 were cardiopathic and 10 non-cardiopathic, and 10 healthy volunteers as a control group. Serum ferritin and left ventricular ejection fraction were measured in the thalassaemic patients. All patients were examined using a 1.5 T MR unit with ECG-gated GE cine-MR T2*-weighted, SE T1-weighted and GE T2*-weighted sequences. In all cases, using an adequate ROI, the myocardial and skeletal muscle signal intensity (SI), the myocardial/skeletal muscle signal intensity ratio (SIR) and the SI average of the myocardium and skeletal muscle were calculated for every study group. The qualitative evaluation of iron deposition distribution was independently performed by three radiologists who analyzed the extension, site and morphology of iron deposition on the MR images and reported their observations on the basis of a four-level rating scale: 0 (absent), 1 (limited), 2 (partial), 3 (widespread deposition). The results of the quantitative and qualitative evaluations were analysed with statistical tests. Cardiac iron deposition was found in 8/10 non-cardiopathic thalassaemic patients and in all cardiopathic thalassaemic patients. We noticed a significant SI difference (p<0.05) between the healthy volunteer control group and the thalassaemic patients with iron deposition, but no significant SI difference in iron deposition between non-cardiopathic and cardiopathic thalassaemic patients in the areas evaluated. The qualitative evaluation revealed a different distribution of iron deposition between the two thalassaemic groups, with

  10. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides

    PubMed Central

    Kniss, Andrew R.; Coburn, Carl W.

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman’s rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact. PMID:26121252
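
    The Field Use Rating central to this analysis is a simple product. A sketch under the commonly cited definition (Field Use EIQ = EIQ × fraction of active ingredient × application rate); all numbers below are placeholders, not values from the paper:

    ```python
    from scipy.stats import spearmanr

    def field_use_rating(eiq, ai_fraction, rate_lb_per_acre):
        return eiq * ai_fraction * rate_lb_per_acre

    herbicides = {  # name: (EIQ, a.i. fraction, rate lb/acre) -- illustrative
        "A": (20.0, 0.41, 1.00),
        "B": (15.3, 0.50, 0.25),
        "C": (30.1, 0.75, 2.00),
    }
    rates   = [rate for _, _, rate in herbicides.values()]
    ratings = [field_use_rating(*v) for v in herbicides.values()]
    rho, p = spearmanr(rates, ratings)   # rank correlation of rating with rate
    print(ratings, rho)
    ```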

  12. TU-H-CAMPUS-IeP2-01: Quantitative Evaluation of PROPELLER DWI Using QIBA Diffusion Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yung, J; Ai, H; Liu, H

    Purpose: The purpose of this study is to determine the quantitative variability of apparent diffusion coefficient (ADC) values when varying imaging parameters in a diffusion-weighted (DW) fast spin echo (FSE) sequence with a Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction (PROPELLER) k-space trajectory. Methods: Using a 3T MRI scanner, a NIST-traceable quantitative magnetic resonance imaging (MRI) diffusion phantom (High Precision Devices, Inc, Boulder, Colorado) consisting of 13 vials filled with various concentrations of the polymer polyvinylpyrrolidone (PVP) in aqueous solution was imaged with a standard Quantitative Imaging Biomarkers Alliance (QIBA) DWI spin echo, echo planar imaging (SE EPI) acquisition. The same phantom was then imaged with a DWI PROPELLER sequence at varying echo train lengths (ETL) of 8, 20, and 32, as well as b-values of 400, 900, and 2000. QIBA DWI phantom analysis software was used to generate ADC maps and create regions of interest (ROIs) for quantitative measurements of each vial. Means and standard deviations of the ROIs were compared. Results: The SE EPI sequence generated ADC values that showed very good agreement with the known ADC values of the phantom (r2 = 0.9995, slope = 1.0061). The ADC values measured from the PROPELLER sequences were inflated, but were highly correlated, with an r2 range from 0.8754 to 0.9880. The PROPELLER sequence with an ETL of 20 and b-values of 0 and 2000 showed the closest agreement (r2 = 0.9034, slope = 0.9880). Conclusion: The DW PROPELLER sequence is promising for quantitative evaluation of ADC values. A drawback of the PROPELLER sequence is the longer acquisition time. The 180° refocusing pulses may also cause the observed increase in ADC values compared to the standard SE EPI DW sequence. However, the FSE sequence offers an advantage with respect to in-plane motion and geometric distortion, which will be investigated in future studies.
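
    ADC itself comes from a mono-exponential fit over b-values. A minimal two-point sketch (synthetic signals, not phantom data):

    ```python
    import numpy as np

    def adc_two_point(s0, sb, b):
        """ADC (mm^2/s) from S(b) = S0*exp(-b*ADC) at b = 0 and b (s/mm^2)."""
        return np.log(s0 / sb) / b

    s0, b = 1000.0, 2000.0
    sb = s0 * np.exp(-b * 1.1e-3)      # simulate a voxel with ADC = 1.1e-3
    print(adc_two_point(s0, sb, b))    # ≈ 1.1e-3
    ```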

  13. Evaluation of acute ischemic stroke using quantitative EEG: a comparison with conventional EEG and CT scan.

    PubMed

    Murri, L; Gori, S; Massetani, R; Bonanni, E; Marcella, F; Milani, S

    1998-06-01

    The sensitivity of quantitative electroencephalography (EEG) was compared with that of conventional EEG in patients with acute ischaemic stroke. In addition, quantitative EEG data were correlated with computerized tomography (CT) scan findings for all areas of lesion in order to reassess the actual role of EEG in the evaluation of stroke. Sixty-five patients were tested with conventional and quantitative EEG within 24 h of the onset of neurological symptoms, whereas CT scanning was performed within 4 days of the onset of stroke. EEG was recorded from 19 electrodes placed on the scalp according to the International 10-20 System. Spectral analysis was carried out on 30 artefact-free 4-sec epochs. For each channel, absolute and relative power were calculated for the delta, theta, alpha and beta frequency bands, and these data were then represented in colour-coded maps. Ten patients with extensive lesions documented by CT scan were excluded. The results indicated that conventional EEG revealed abnormalities in 40 of 55 cases, while EEG mapping showed abnormalities in 46 of 55 cases: of the six cases that had appeared normal on visual inspection of the EEG, mapping showed focal abnormalities in five and nonfocal abnormalities in one. In a further 11 cases, where the conventional EEG revealed abnormalities in one hemisphere, the quantitative EEG and maps localized the abnormal activity more precisely. The sensitivity of both methods was higher for frontocentral, temporal and parieto-occipital cortical-subcortical infarctions than for basal ganglia and internal capsule lesions; however, quantitative EEG was more efficient for all areas of lesion in detecting cases that had appeared normal on visual inspection and was clearly superior in revealing focal abnormalities. When we considered the electrode at which the maximum power in the delta frequency band was recorded, a fairly close correlation was found
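
    The spectral quantities described (absolute and relative power per frequency band, per channel) reduce to simple operations on the power spectral density. Below is a minimal sketch on one synthetic 4-sec epoch, assuming a 128 Hz sampling rate and scipy; a full analysis would loop over the 19 channels and 30 epochs.

      # Absolute and relative band power from one EEG epoch via Welch's method.
      import numpy as np
      from scipy.signal import welch

      fs = 128                                  # Hz, assumed sampling rate
      t = np.arange(0, 4.0, 1.0 / fs)           # one 4-sec epoch
      epoch = np.sin(2 * np.pi * 2.5 * t) + 0.3 * np.random.randn(t.size)

      freqs, psd = welch(epoch, fs=fs, nperseg=2 * fs)

      bands = {"delta": (0.5, 4), "theta": (4, 8),
               "alpha": (8, 13), "beta": (13, 30)}
      full = (freqs >= 0.5) & (freqs < 30)
      total_power = np.trapz(psd[full], freqs[full])

      for name, (lo, hi) in bands.items():
          mask = (freqs >= lo) & (freqs < hi)
          absolute = np.trapz(psd[mask], freqs[mask])
          print(f"{name}: abs={absolute:.3f}, rel={absolute / total_power:.2%}")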

  14. Predicting Failure Progression and Failure Loads in Composite Open-Hole Tension Coupons

    NASA Technical Reports Server (NTRS)

    Arunkumar, Satyanarayana; Przekop, Adam

    2010-01-01

    Failure types and failure loads in carbon-epoxy [45n/90n/-45n/0n]ms laminate coupons with central circular holes subjected to tensile load are simulated using a progressive failure analysis (PFA) methodology. The progressive failure methodology is implemented using a VUMAT subroutine within the ABAQUS(TradeMark)/Explicit nonlinear finite element code. The degradation model adopted in the present PFA methodology uses an instantaneous complete stress reduction (COSTR) approach to simulate damage at a material point when failure occurs. In-plane modeling parameters such as element size and shape are held constant in the finite element models, irrespective of laminate thickness and hole size, to predict failure loads and failure progression. Comparison to published test data indicates that this methodology accurately simulates brittle, pull-out and delamination failure types. The sensitivity of the failure progression and the failure load to analytical loading rates and solver precision is demonstrated.
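
    The COSTR degradation idea can be shown schematically: once a failure criterion is met at a material point, its stresses are dropped to near zero in subsequent increments. The sketch below is an illustrative Python analogue, not the study's VUMAT subroutine; the max-stress criterion and the allowables are assumptions.

      # Schematic instantaneous complete stress reduction (COSTR) update for
      # one material point; the criterion and allowables are illustrative only.
      import numpy as np

      def costr_update(stress, strength, failed):
          """Return updated stress and failure flag for one material point."""
          if not failed and np.any(np.abs(stress) > strength):  # max-stress check
              failed = True
          if failed:
              stress = stress * 1e-6    # instantaneous, near-complete reduction
          return stress, failed

      stress = np.array([550.0, 40.0, 30.0])    # MPa: fiber, transverse, shear
      strength = np.array([500.0, 60.0, 80.0])  # MPa allowables (illustrative)
      stress, failed = costr_update(stress, strength, failed=False)
      print(stress, failed)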

  15. Round-robin analysis of the behavior of a 1:6-scale reinforced concrete containment model pressurized to failure: Posttest evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clauss, D.B.

    A 1:6-scale model of a reinforced concrete containment building was pressurized incrementally to failure at a remote site at Sandia National Laboratories. The response of the model was recorded with more than 1000 channels of data (primarily strain and displacement measurements) at 37 discrete pressure levels. The primary objective of this test was to generate data that could be used to validate methods for predicting the performance of containment buildings subject to loads beyond their design basis. Extensive analyses were conducted before the test to predict the behavior of the model. Ten organizations in Europe and the US conducted independent analyses of the model and contributed to a report on the pretest predictions. Predictions included structural response at certain predetermined locations in the model as well as capacity and failure mode. This report discusses comparisons between the pretest predictions and the experimental results. Posttest evaluations that were conducted to provide additional insight into the model behavior are also described. The significance of the analysis and testing of the 1:6-scale model to performance evaluations of actual containments subject to beyond-design-basis loads is also discussed. 70 refs., 428 figs., 24 tabs.

  16. Qualitative and quantitative evaluation of human dental enamel after bracket debonding: a noncontact three-dimensional optical profilometry analysis.

    PubMed

    Ferreira, Fabiano G; Nouer, Darcy F; Silva, Nelson P; Garbui, Ivana U; Correr-Sobrinho, Lourenço; Nouer, Paulo R A

    2014-09-01

    The aim of this study was to undertake a qualitative and quantitative evaluation of changes on enamel surfaces after debonding of brackets followed by finishing procedures, using a high-resolution three-dimensional optical profiler and to investigate the accuracy of the technique. The labial surfaces of 36 extracted upper central incisors were examined. Before bonding, the enamel surfaces were subjected to profilometry, recording four amplitude parameters. Brackets were then bonded using two types of light-cured orthodontic adhesive: composite resin and resin-modified glass ionomer cement. Finishing was performed by three different methods: pumice on a rubber cup, fine and ultrafine aluminum oxide discs, and microfine diamond cups followed by silicon carbide brushes. The samples were subsequently re-analyzed by profilometry. Wilcoxon signed-rank test, Kruskal-Wallis test (p < 0.05) and a posteriori Mann-Whitney U test with Bonferroni correction (p < 0.0167) revealed a significant reduction of enamel roughness when diamond cups followed by silicon carbide brushes were used to finish surfaces that had remnants of resin-modified glass ionomer adhesive and when pumice was used to finish surfaces that had traces of composite resin. Enamel loss was minimal. The 3D optical profilometry technique was able to provide accurate qualitative and quantitative assessment of changes on the enamel surface after debonding. Morphological changes in the topography of dental surfaces, especially if related to enamel loss and roughness, are of considerable clinical importance. The quantitative evaluation method used herein enables a more comprehensive understanding of the effects of orthodontic bonding on teeth.
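
    Amplitude parameters of this kind are simple statistics of the measured height map. The abstract does not name the four parameters recorded, so the sketch below computes four common ones (Sa, Sq, Ssk, Sku) on a synthetic surface, assuming numpy:

      # Common 3D amplitude roughness parameters from a (synthetic) height map.
      import numpy as np

      rng = np.random.default_rng(0)
      heights = rng.normal(0.0, 0.15, size=(256, 256))   # µm, synthetic map

      z = heights - heights.mean()        # deviations from the mean plane
      sa = np.abs(z).mean()               # Sa: arithmetic mean height
      sq = np.sqrt((z ** 2).mean())       # Sq: root-mean-square height
      ssk = (z ** 3).mean() / sq ** 3     # Ssk: skewness
      sku = (z ** 4).mean() / sq ** 4     # Sku: kurtosis
      print(f"Sa={sa:.3f} µm, Sq={sq:.3f} µm, Ssk={ssk:.2f}, Sku={sku:.2f}")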

  17. Failure Assessment of Stainless Steel and Titanium Brazed Joints

    NASA Technical Reports Server (NTRS)

    Flom, Yury A.

    2012-01-01

    Following successful application of Coulomb-Mohr and interaction equations for evaluation of safety margins in Albemet 162 brazed joints, two additional base metal/filler metal systems were investigated. Specimens consisting of stainless steel brazed with a silver-base filler metal and titanium brazed with 1100 Al alloy were tested to failure under the combined action of tensile, shear, bending and torsion loads. Finite Element Analysis (FEA), hand calculations and digital image correlation (DIC) techniques were used to estimate failure stresses and construct Failure Assessment Diagrams (FAD). This study confirms that the interaction equation R(sub sigma) + R(sub tau) = 1, where R(sub sigma) and R(sub tau) are the normal and shear stress ratios, can be used as a conservative lower-bound estimate of the failure criterion in stainless steel and titanium brazed joints.
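
    Applied as a screening check, the interaction criterion requires only the two stress ratios and a test of whether their sum stays below 1. Below is a minimal sketch with illustrative allowables; in the study, the stresses themselves came from FEA and testing.

      # Failure assessment check based on the interaction equation
      # R_sigma + R_tau = 1; the allowables below are illustrative only.
      def inside_fad(sigma, tau, sigma_allow, tau_allow):
          """True if R_sigma + R_tau < 1 (conservative lower-bound criterion)."""
          r_sigma = sigma / sigma_allow   # normal stress ratio
          r_tau = tau / tau_allow         # shear stress ratio
          return r_sigma + r_tau < 1.0

      # A joint at 60% of the normal allowable and 30% of the shear allowable:
      print(inside_fad(sigma=60.0, tau=30.0, sigma_allow=100.0, tau_allow=100.0))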

  18. Pooled nucleic acid testing to identify antiretroviral treatment failure during HIV infection.

    PubMed

    May, Susanne; Gamst, Anthony; Haubrich, Richard; Benson, Constance; Smith, Davey M

    2010-02-01

    Pooling strategies have been used to reduce the costs of polymerase chain reaction-based screening for acute HIV infection in populations in which the prevalence of acute infection is low (less than 1%). Only limited research has been done for conditions in which the prevalence of screening positivity is higher (greater than 1%). We present data on a variety of pooling strategies that incorporate the use of polymerase chain reaction-based quantitative measures to monitor for virologic failure among HIV-infected patients receiving antiretroviral therapy. For a prevalence of virologic failure between 1% and 25%, we demonstrate relative efficiency and accuracy of various strategies. These results could be used to choose the best strategy based on the requirements of individual laboratory and clinical settings such as required turnaround time of results and availability of resources. Virologic monitoring during antiretroviral therapy is not currently being performed in many resource-constrained settings largely because of costs. The presented pooling strategies may be used to significantly reduce the cost compared with individual testing, make such monitoring feasible, and limit the development and transmission of HIV drug resistance in resource-constrained settings. They may also be used to design efficient pooling strategies for other settings with quantitative screening measures.
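
    The efficiency argument can be made concrete with the classical two-stage (Dorfman) scheme: test pools of k specimens and retest individuals only in positive pools. The study evaluates richer strategies built on quantitative measures, but this sketch shows the basic prevalence/pool-size trade-off:

      # Expected tests per specimen under two-stage pooling, as a function of
      # failure prevalence p and pool size k (illustrative only; not the
      # study's exact strategies).
      def expected_tests_per_specimen(p, k):
          """One pooled test per k specimens, plus k retests if pool positive."""
          prob_pool_positive = 1.0 - (1.0 - p) ** k
          return 1.0 / k + prob_pool_positive

      for p in (0.01, 0.05, 0.10, 0.25):
          best_k = min(range(2, 21),
                       key=lambda k: expected_tests_per_specimen(p, k))
          cost = expected_tests_per_specimen(p, best_k)
          print(f"prevalence {p:.0%}: pool size {best_k}, "
                f"{cost:.2f} tests/specimen")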

  19. A motivational counseling approach to improving heart failure self-care: mechanisms of effectiveness.

    PubMed

    Riegel, Barbara; Dickson, Victoria V; Hoke, Linda; McMahon, Janet P; Reis, Brendali F; Sayers, Steven

    2006-01-01

    Self-care is an integral component of successful heart failure (HF) management. Engaging patients in self-care can be challenging. Fifteen patients with HF enrolled during hospitalization received a motivational intervention designed to improve HF self-care. A mixed-method, pretest-posttest design was used to evaluate the proportion of patients in whom the intervention was beneficial and the mechanism of its effectiveness. Participants received, on average, 3.0 +/- 1.5 home visits (median 3, mode 3, range 1-6) over a three-month period from an advanced practice nurse trained in motivational interviewing and family counseling. Quantitative and qualitative data were used to judge the individual patients in whom the intervention produced a clinically significant improvement in HF self-care. Audiotaped intervention sessions were analyzed using qualitative methods to assess the mechanism of intervention effectiveness. Congruence between quantitative and qualitative judgments of improved self-care revealed that 71.4% of participants improved in self-care after receiving the intervention. Analysis of transcribed intervention sessions revealed themes of 1) communication (reflective listening, empathy); 2) making it fit (acknowledging cultural beliefs, overcoming barriers and constraints, negotiating an action plan); and 3) bridging the transition from hospital to home (providing information, building skills, activating support resources). An intervention that incorporates the core elements of motivational interviewing may be effective in improving HF self-care, but further research is needed.

  20. Spatial correlation analysis of cascading failures: Congestions and Blackouts

    PubMed Central

    Daqing, Li; Yinan, Jiang; Rui, Kang; Havlin, Shlomo

    2014-01-01

    Cascading failures have become major threats to network robustness due to their potentially catastrophic consequences, whereby local perturbations can induce global propagation of failures. Unlike failures spreading via direct contacts due to structural interdependencies, overload failures usually propagate through collective interactions among system components. Despite the critical need for protection and mitigation strategies in networks such as power grids and transportation systems, the propagation behavior of cascading failures is essentially unknown. Here we find, by analyzing our collected data, that jams in city traffic and faults in the power grid are spatially long-range correlated, with correlations decaying slowly with distance. Moreover, we find in the daily traffic data that the correlation length increases dramatically and reaches its maximum as the morning or evening rush hour approaches. Our study can impact all efforts toward actively improving system resilience, ranging from the evaluation of design schemes and the development of protection strategies to the implementation of mitigation programs. PMID:24946927
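
    Spatial correlation analyses of this kind typically correlate failure time series at pairs of sites and bin the pair correlations by distance to see how they decay. Below is a sketch on synthetic data, assuming numpy; real inputs would be the jam and fault records described.

      # Pairwise correlation of site time series, binned by distance.
      import numpy as np

      rng = np.random.default_rng(1)
      n_sites, n_steps = 50, 500
      coords = rng.uniform(0, 100, size=(n_sites, 2))    # site locations
      common = rng.standard_normal(n_steps)              # shared driver
      series = 0.5 * common + rng.standard_normal((n_sites, n_steps))

      dists, corrs = [], []
      for i in range(n_sites):
          for j in range(i + 1, n_sites):
              dists.append(np.linalg.norm(coords[i] - coords[j]))
              corrs.append(np.corrcoef(series[i], series[j])[0, 1])

      # Mean correlation per distance bin shows how quickly it decays.
      bins = np.linspace(0, max(dists), 10)
      idx = np.digitize(dists, bins)
      for b in range(1, len(bins)):
          sel = [c for c, k in zip(corrs, idx) if k == b]
          if sel:
              print(f"{bins[b-1]:6.1f}-{bins[b]:6.1f}: "
                    f"mean corr {np.mean(sel):.3f}")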