Science.gov

Sample records for reliability physics-of-failure based

  1. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated in artificial intelligence (AI), where it is a leading computational inference approach for modeling multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within the dynamic engineering system. The autonomy property of intelligent agents is defined as the agents' ability to self-activate, deactivate, or completely redefine their role in the analysis. This property, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.
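
    The mechanism described here, agents that self-activate or deactivate as the simulated system state evolves, can be illustrated with a small sketch. Everything below (the agent class, damage models, rates, and thresholds) is a hypothetical illustration of the idea, not the dissertation's actual implementation.

    ```python
    # Illustrative sketch only: failure-mechanism agents with autonomy.
    import random

    class MechanismAgent:
        def __init__(self, name, rate, activates_when=None):
            self.name, self.rate = name, rate
            self.activates_when = activates_when or (lambda state: True)
            self.damage = 0.0

        def step(self, state, dt):
            # Autonomy: the agent decides from the shared state whether its
            # physics applies right now (self-activation / deactivation).
            if self.activates_when(state):
                self.damage += self.rate(state) * dt
            return self.damage >= 1.0          # PoF failure criterion (assumed)

    def simulate(agents, dt=1.0, horizon=20_000):
        state = {"temp": 300.0, "crack": 0.0}
        for t in range(horizon):
            state["temp"] = 300.0 + 40.0 * random.random()   # fluctuating load
            for agent in agents:
                if agent.step(state, dt):
                    return t, agent.name
            # Crude stand-in for mechanism coupling: worst damage feeds back
            # into the shared state that other agents observe.
            state["crack"] = max(a.damage for a in agents)
        return horizon, None

    random.seed(7)
    agents = [
        MechanismAgent("fatigue", rate=lambda s: 5e-5 * (s["temp"] / 300.0)),
        # Interacting mechanisms: corrosion activates only once a crack exists.
        MechanismAgent("corrosion", rate=lambda s: 2e-4,
                       activates_when=lambda s: s["crack"] > 0.25),
    ]
    print("failure at step %s by agent %s" % simulate(agents))
    ```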

  2. Methodology for Physics and Engineering of Reliable Products

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Gibbel, Mark

    1996-01-01

Physics-of-failure approaches have gained widespread acceptance within the electronics reliability community. These methodologies involve identifying root-cause failure mechanisms, developing associated models, and utilizing these models to improve time to market, lower development and build costs, and achieve higher reliability. The methodology outlined herein sets forth a process, based on integration of both physics and engineering principles, for achieving the same goals.

  3. Reliability-based casing design

    SciTech Connect

    Maes, M.A.; Gulati, K.C.; Johnson, R.C.; McKenna, D.L.; Brand, P.R.; Lewis, D.B.

    1995-06-01

    The present paper describes the development of reliability-based design criteria for oil and/or gas well casing/tubing. The approach is based on the fundamental principles of limit state design. Limit states for tubulars are discussed and specific techniques for the stochastic modeling of loading and resistance variables are described. Zonation methods and calibration techniques are developed which are geared specifically to the characteristic tubular design for both hydrocarbon drilling and production applications. The application of quantitative risk analysis to the development of risk-consistent design criteria is shown to be a major and necessary step forward in achieving more economic tubular design.

  4. A Reliability-Based Track Fusion Algorithm

    PubMed Central

    Xu, Li; Pan, Liqiang; Jin, Shuilin; Liu, Haibo; Yin, Guisheng

    2015-01-01

The common track fusion algorithms in multi-sensor systems have some defects, such as serious imbalances between accuracy and computational cost, identical treatment of all sensor information regardless of its quality, and high fusion errors at inflection points. To address these defects, a track fusion algorithm based on reliability (TFR) is presented for multi-sensor and multi-target environments. To improve the information quality, outliers in the local tracks are first eliminated. Then the reliability of the local tracks is calculated, and the local tracks with high reliability are chosen for the state estimation fusion. In contrast to the existing methods, TFR reduces high fusion errors at the inflection points of system tracks and obtains high accuracy at less computational cost. Simulation results verify the effectiveness and the superiority of the algorithm in dense sensor environments. PMID:25950174
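
    A minimal sketch of the three steps the abstract names (outlier elimination, per-track reliability scoring, fusion of the high-reliability tracks) might look as follows. The MAD-based outlier rule, the reliability score, and the threshold are illustrative assumptions, not the paper's exact formulation.

    ```python
    # Sketch of reliability-weighted track fusion in the spirit of TFR.
    import numpy as np

    def fuse_tracks(local_tracks, reliability_threshold=0.5):
        """local_tracks: (n_sensors, n_steps) array of position estimates."""
        tracks = np.asarray(local_tracks, dtype=float)

        # Step 1: suppress outliers with a median/MAD rule (assumed choice).
        med = np.median(tracks, axis=0)
        mad = np.median(np.abs(tracks - med), axis=0) + 1e-9
        cleaned = np.where(np.abs(tracks - med) > 3.0 * 1.4826 * mad, med, tracks)

        # Step 2: score each track's reliability from its deviation about the
        # cross-sensor median (smaller deviation -> higher reliability).
        rms = np.sqrt(np.mean((cleaned - med) ** 2, axis=1))
        reliability = 1.0 / (1.0 + rms)
        reliability = reliability / reliability.max()

        # Step 3: fuse only tracks whose reliability clears the threshold,
        # weighting their states by reliability.
        keep = reliability >= reliability_threshold
        w = reliability[keep] / reliability[keep].sum()
        return w @ cleaned[keep], reliability

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        truth = np.linspace(0.0, 10.0, 50)
        tracks = truth + rng.normal(0.0, [[0.1], [0.2], [1.5]], (3, 50))
        fused, rel = fuse_tracks(tracks)
        print("sensor reliabilities:", np.round(rel, 2))
        print("fused RMSE:", round(float(np.sqrt(np.mean((fused - truth) ** 2))), 3))
    ```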

  6. Reliability mechanisms in distributed data base systems

    SciTech Connect

    Son, S.H.

    1986-01-01

Distributed data base systems operate in computer networking environments where component failures are inevitable during normal operation. Failures not only threaten normal operation of the system, but may also destroy the correctness of the data base through direct damage to the storage subsystem. In order to cope with these failures, distributed data base systems must provide reliability mechanisms that maintain system consistency. There are two major parts in this dissertation. In the first part, mechanisms are presented for recovery management in distributed data base systems. The recovery management of a distributed data base system consists of two parts: preparation for recovery, by saving necessary information during normal operation of the data base system, and coordination of the actual recovery, in order to avoid possible inconsistency after the recovery. The preparation for recovery is done through checkpointing and logging. A new scheme is proposed for reconstruction of the data base in distributed environments. In the second part, a token-based resiliency control scheme for replicated distributed data base systems is proposed. The proposed control scheme increases the reliability as well as the degree of concurrency while maintaining the consistency of the system.

  7. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
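
    The modeling point here, that non-exponential holding times force a semi-Markov rather than a Markov model, can be illustrated with a small simulation. The states, transition matrix, and Weibull holding-time parameters below are assumptions for illustration, not the measured IBM 3081 data.

    ```python
    # Sketch: semi-Markov occupancy simulation with non-exponential holding times.
    import numpy as np

    rng = np.random.default_rng(1)
    states = ["normal", "error", "recovery"]
    P = np.array([[0.0, 1.0, 0.0],      # normal   -> error
                  [0.0, 0.0, 1.0],      # error    -> recovery
                  [0.9, 0.1, 0.0]])     # recovery -> normal (or back to error)

    def holding_time(state):
        # Weibull shape != 1 makes the process semi-Markov; shape 1.0 would
        # collapse back to the exponential (Markov) case.
        shape = {"normal": 0.7, "error": 1.8, "recovery": 1.0}[state]
        return rng.weibull(shape)

    def simulate(horizon=1000.0):
        t, s, occupancy = 0.0, 0, np.zeros(len(states))
        while t < horizon:
            dt = holding_time(states[s])
            occupancy[s] += dt
            t += dt
            s = rng.choice(len(states), p=P[s])
        return occupancy / occupancy.sum()

    print(dict(zip(states, np.round(simulate(), 3))))
    ```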

  8. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with average diameter and average length of 15 inches. Structural reliability is based on axial buckling strength of the cylinder. Both Monte Carlo simulation and First Order Reliability Method are considered for reliability analysis with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure and optimization results are included in this report.
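
    A sketch of the analysis pattern the report describes, a second-order response surface standing in for the buckling computation and sampled by Monte Carlo, might look like this. The surrogate coefficients, random-variable distributions, and load level are invented for illustration.

    ```python
    # Sketch: Monte Carlo reliability on a quadratic response surface.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    def buckling_surrogate(E1, t):
        # Assumed second-order response surface: strength as a quadratic in
        # the fiber-direction modulus E1 and the wall thickness t.
        return 1200.0 + 80.0 * E1 + 3.0e4 * t - 1.5 * E1**2 - 2.0e5 * t**2

    n = 1_000_000
    E1 = rng.normal(19.0, 1.0, n)            # fiber-direction modulus (assumed)
    t = rng.normal(0.06, 0.002, n)           # wall thickness (assumed)
    load = rng.normal(2400.0, 240.0, n)      # applied axial load, high COV (assumed)

    g = buckling_surrogate(E1, t) - load     # limit state: strength - load
    pf = np.mean(g < 0.0)
    beta = -stats.norm.ppf(pf)               # generalized reliability index
    print(f"Pf ~ {pf:.2e}, beta ~ {beta:.2f}")
    ```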

  9. A reliability measure of protein-protein interactions and a reliability measure-based search engine.

    PubMed

    Park, Byungkyu; Han, Kyungsook

    2010-02-01

Many methods developed for estimating the reliability of protein-protein interactions are based on the topology of protein-protein interaction networks. This paper describes a new reliability measure for protein-protein interactions, which does not rely on the topology of protein interaction networks, but expresses biological information on functional roles, sub-cellular localisations and protein classes as a scoring schema. The new measure is useful for filtering many spurious interactions, as well as for estimating the reliability of protein interaction data. In particular, the reliability measure can be used to search protein-protein interactions with the desired reliability in databases. The reliability-based search engine is available at http://yeast.hpid.org. We believe this is the first publicly available search engine for interacting proteins. The search engine and the reliability measure of protein interactions should provide useful information for determining proteins to focus on.

  10. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  11. System Reliability for LED-Based Products

    SciTech Connect

    Davis, J Lynn; Mills, Karmann; Lamvik, Michael; Yaga, Robert; Shepherd, Sarah D; Bittle, James; Baldasaro, Nick; Solano, Eric; Bobashev, Georgiy; Johnson, Cortina; Evans, Amy

    2014-04-07

Results from accelerated life tests (ALT) on mass-produced commercially available 6” downlights are reported along with results from commercial LEDs. The tested luminaires capture many of the design features found in modern luminaires. In general, a systems perspective is required to understand the reliability of these devices, since LED failure is rare; components such as drivers, lenses, and reflectors are more likely to impact luminaire reliability than the LEDs themselves.

  12. Reliability sensitivity-based correlation coefficient calculation in structural reliability analysis

    NASA Astrophysics Data System (ADS)

    Yang, Zhou; Zhang, Yimin; Zhang, Xufang; Huang, Xianzhen

    2012-05-01

The correlation coefficients of random variables of mechanical structures are generally chosen from experience or even ignored, which cannot actually reflect the effects of parameter uncertainties on reliability. To discuss the selection of correlation coefficients from the reliability-based sensitivity point of view, the theoretical principle of the problem is established based on the results of the reliability sensitivity, and the criterion of correlation among random variables is shown. The values of the correlation coefficients are obtained according to the proposed principle and the reliability sensitivity problem is discussed. Numerical studies have shown the following results: (1) If the sensitivity of the reliability to the correlation coefficient ρ is negligible in magnitude (less than about 0.00001), the correlation can be ignored, which simplifies the procedure without introducing additional error. (2) If the difference between ρ_s, the value to which the reliability is most sensitive, and ρ_R, the value giving the smallest reliability, is less than 0.001, ρ_s is suggested for modeling the dependency of the random variables; this ensures the robustness of the system without loss of the safety requirement. (3) If |E_abs| > 0.001 and |E_rel| > 0.001, ρ_R should be employed to quantify the correlation among random variables in order to ensure the accuracy of the reliability analysis. Application of the proposed approach could provide a practical routine for mechanical design and manufacturing to study the reliability and reliability-based sensitivity of basic design variables in mechanical reliability analysis and design.
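
    Read as a decision rule, the three findings can be sketched as below. The argument names and the packaging into a single function are assumptions; the thresholds follow the abstract.

    ```python
    # Sketch of the reported selection rule for a correlation coefficient.
    # rho_s: value to which the reliability is most sensitive; rho_R: value
    # giving the smallest reliability; sens: d(reliability)/d(rho);
    # E_abs, E_rel: absolute and relative error measures from the study.
    def choose_correlation(rho_s, rho_R, sens, E_abs, E_rel):
        if abs(sens) < 1e-5:
            return 0.0          # (1) correlation negligible: ignore it
        if abs(rho_s - rho_R) < 1e-3:
            return rho_s        # (2) robust choice, no loss of safety
        if abs(E_abs) > 1e-3 and abs(E_rel) > 1e-3:
            return rho_R        # (3) conservative choice preserves accuracy
        return rho_s            # assumed default outside the listed cases

    print(choose_correlation(rho_s=0.30, rho_R=0.35, sens=0.02,
                             E_abs=0.004, E_rel=0.01))   # -> 0.35
    ```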

  13. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.

  14. Reliability modeling of fault-tolerant computer based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1987-01-01

Digital fault-tolerant computer-based systems have become commonplace in military and commercial avionics. These systems hold the promise of increased availability, reliability, and maintainability over conventional analog-based systems through the application of replicated digital computers arranged in fault-tolerant configurations. Three tightly coupled factors of paramount importance, ultimately determining the viability of these systems, are reliability, safety, and profitability. Reliability, the major driver, affects virtually every aspect of design, packaging, and field operations, and eventually produces profit for commercial applications or increased national security. However, the utilization of digital computer systems makes the task of producing credible reliability assessment a formidable one for the reliability engineer. The root of the problem lies in the digital computer's unique adaptability to changing requirements, computational power, and ability to test itself efficiently. Addressed here are the nuances of modeling the reliability of systems with large state sizes, in the Markov sense, which result from replicated redundant hardware, as well as the modeling of factors which can reduce reliability without concomitant depletion of hardware. Advanced fault-handling models are described and methods of acquiring and measuring parameters for these models are delineated.

  15. Reliability-based lifetime maintenance of aging highway bridges

    NASA Astrophysics Data System (ADS)

    Enright, Michael P.; Frangopol, Dan M.

    2000-06-01

As the nation's infrastructure continues to age, the cost of maintaining it at an acceptable safety level continues to increase. In the United States, about one of every three bridges is rated structurally deficient and/or functionally obsolete. It will require about $80 billion to eliminate the current backlog of bridge deficiencies and maintain repair levels. Unfortunately, the financial resources allocated for these activities fall extremely short of the demand. Although several existing and emerging NDT techniques are available to gather inspection data, current maintenance planning decisions for deficient bridges are based on data from subjective condition assessments and do not consider the reliability of bridge components and systems. Recently, reliability-based optimum maintenance planning strategies have been developed. They can be used to predict inspection and repair times to achieve minimum life-cycle cost of deteriorating structural systems. In this study, a reliability-based methodology which takes into account loading randomness and history, and randomness in strength and degradation resulting from aggressive environmental factors, is used to predict the time-dependent reliability of aging highway bridges. A methodology for incorporating inspection data into reliability predictions is also presented. Finally, optimal lifetime maintenance strategies are identified, in which optimal inspection/repair times are found based on minimum expected life-cycle cost under prescribed reliability constraints. The influence of discount rate on optimum solutions is evaluated.

  16. Physics of failure modes in accelerometers utilizing single crystal piezoelectric materials

    NASA Astrophysics Data System (ADS)

    Wlodkowski, Paul Alexander

    1999-11-01

For over forty years, the lead zirconate -- lead titanate system (PZT) has been the industrial standard of sensing materials for piezoelectric accelerometers. This ceramic has established a reliability benchmark given the uniformity of its electromechanical properties, the negligible dependence of these properties on temperature and pre-stress, and the ability to manufacture the sensing element cost-effectively in a myriad of geometries. Today, revolutionary advances in the growth of single crystal piezoelectric materials have spawned the evolution of novel sensor designs. With piezoelectric coefficients exceeding 2000 pC/N, and electromechanical coupling factors above 90%, single crystals of Pb(Mg1/3Nb2/3)O3-PbTiO3 [PMNT] and Pb(Zn1/3Nb2/3)O3-PbTiO3 [PZNT] have the potential of superseding PZT ceramics in certain critical applications. This dissertation reports the first results of the design, development and performance characterization of an accelerometer utilizing bulk, single crystal piezoelectric materials. Numerous prototypes, developed in the compression and flexural-mode design configurations, exhibit charge sensitivities that exceed those of their PZT counterparts by a factor of more than three. The introduction of accelerometer prototypes employing single crystal piezoelectric material is an important advancement for the sensor industry. Root-cause failure processes were identified and subsequently used as a reliability enhancement tool to prevent device failures through robust design and manufacturing practices. Crystal machining techniques were analyzed, and a scanning electron microscope was used to inspect the crystal surface for defects. Inhomogeneity in the piezoelectric properties over the surface of the crystal was quantified and recognized as a major obstacle to commercialization. Measurements were made on the material's fracture toughness and electromechanical properties over a wide temperature range. Effects of aging and

  17. Fatigue reliability based optimal design of planar compliant micropositioning stages.

    PubMed

    Wang, Qiliang; Zhang, Xianmin

    2015-10-01

Conventional compliant micropositioning stages are usually developed based on static strength and deterministic methods, which may lead to either unsafe or excessive designs. This paper presents a fatigue reliability analysis and optimal design of a three-degree-of-freedom (3 DOF) flexure-based micropositioning stage. Kinematic, modal, static, and fatigue stress modelling of the stage was conducted using the finite element method. The maximum equivalent fatigue stress in the hinges was derived using sequential quadratic programming. The fatigue strength of the hinges was obtained by considering various influencing factors. On this basis, the fatigue reliability of the hinges was analysed using the stress-strength interference method. Fatigue-reliability-based optimal design of the stage was then conducted using the genetic algorithm and MATLAB. To make fatigue life testing easier, a 1 DOF stage was then optimized and manufactured. Experimental results demonstrate the validity of the approach.
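
    When both the maximum equivalent fatigue stress and the fatigue strength are modeled as normal random variables, the stress-strength interference step used above has a well-known closed form, sketched below; the numerical values are illustrative assumptions, not the stage's computed values.

    ```python
    # Stress-strength interference for normal stress and strength:
    # R = P(strength > stress) = Phi(beta), with
    # beta = (mu_S - mu_s) / sqrt(sd_S^2 + sd_s^2).
    from math import sqrt
    from scipy.stats import norm

    mu_strength, sd_strength = 320.0, 25.0   # MPa, hinge fatigue strength (assumed)
    mu_stress, sd_stress = 240.0, 20.0       # MPa, max equivalent stress (assumed)

    beta = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    reliability = norm.cdf(beta)
    print(f"beta = {beta:.2f}, fatigue reliability = {reliability:.4f}")
    ```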

  20. Multi-mode reliability-based design of horizontal curves.

    PubMed

    Essa, Mohamed; Sayed, Tarek; Hussein, Mohamed

    2016-08-01

    Recently, reliability analysis has been advocated as an effective approach to account for uncertainty in the geometric design process and to evaluate the risk associated with a particular design. In this approach, a risk measure (e.g. probability of noncompliance) is calculated to represent the probability that a specific design would not meet standard requirements. The majority of previous applications of reliability analysis in geometric design focused on evaluating the probability of noncompliance for only one mode of noncompliance such as insufficient sight distance. However, in many design situations, more than one mode of noncompliance may be present (e.g. insufficient sight distance and vehicle skidding at horizontal curves). In these situations, utilizing a multi-mode reliability approach that considers more than one failure (noncompliance) mode is required. The main objective of this paper is to demonstrate the application of multi-mode (system) reliability analysis to the design of horizontal curves. The process is demonstrated by a case study of Sea-to-Sky Highway located between Vancouver and Whistler, in southern British Columbia, Canada. Two noncompliance modes were considered: insufficient sight distance and vehicle skidding. The results show the importance of accounting for several noncompliance modes in the reliability model. The system reliability concept could be used in future studies to calibrate the design of various design elements in order to achieve consistent safety levels based on all possible modes of noncompliance.
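
    A sketch of the multi-mode (series system) computation follows: the two noncompliance modes are evaluated on shared random inputs, and the system-level probability is taken over their union. The distributions, curve radius, and design values are illustrative assumptions, not the Sea-to-Sky case-study values.

    ```python
    # Sketch: system probability of noncompliance for a horizontal curve.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000

    # Shared uncertainty (speed) correlates the two modes.
    speed = rng.normal(80.0, 8.0, n)            # operating speed, km/h (assumed)
    avail_sd = rng.normal(160.0, 15.0, n)       # available sight distance, m
    friction = rng.normal(0.35, 0.05, n)        # supplied side friction

    # Stopping sight distance: 2.5 s perception-reaction, assumed braking
    # friction 0.35; side-friction demand for radius 250 m, superelevation 0.06.
    required_sd = 0.278 * speed * 2.5 + speed**2 / (254 * 0.35)
    demand_f = speed**2 / (127 * 250.0) - 0.06

    mode1 = avail_sd < required_sd              # insufficient sight distance
    mode2 = friction < demand_f                 # vehicle skidding
    p_system = (mode1 | mode2).mean()           # series-system noncompliance
    print(f"P(mode1)={mode1.mean():.4f}  P(mode2)={mode2.mean():.4f}  "
          f"P(system)={p_system:.4f}")
    ```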

  2. Indel reliability in indel-based phylogenetic inference.

    PubMed

    Ashkenazy, Haim; Cohen, Ofir; Pupko, Tal; Huchon, Dorothée

    2014-12-01

    It is often assumed that it is unlikely that the same insertion or deletion (indel) event occurred at the same position in two independent evolutionary lineages, and thus, indel-based inference of phylogeny should be less subject to homoplasy compared with standard inference which is based on substitution events. Indeed, indels were successfully used to solve debated evolutionary relationships among various taxonomical groups. However, indels are never directly observed but rather inferred from the alignment and thus indel-based inference may be sensitive to alignment errors. It is hypothesized that phylogenetic reconstruction would be more accurate if it relied only on a subset of reliable indels instead of the entire indel data. Here, we developed a method to quantify the reliability of indel characters by measuring how often they appear in a set of alternative multiple sequence alignments. Our approach is based on the assumption that indels that are consistently present in most alternative alignments are more reliable compared with indels that appear only in a small subset of these alignments. Using simulated and empirical data, we studied the impact of filtering and weighting indels by their reliability scores on the accuracy of indel-based phylogenetic reconstruction. The new method is available as a web-server at http://guidance.tau.ac.il/RELINDEL/.
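
    The scoring idea, that an indel is reliable in proportion to how often it recurs across alternative alignments, can be sketched as follows. The toy alignments and the gap-event coding are assumptions; the real method scores indels coded from sets of alternative multiple sequence alignments.

    ```python
    # Sketch: score indels by their frequency across alternative MSAs.
    from collections import Counter

    def indel_set(alignment):
        """Extract (sequence, start, length) gap events from one MSA."""
        events = set()
        for name, row in alignment.items():
            i = 0
            while i < len(row):
                if row[i] == "-":
                    j = i
                    while j < len(row) and row[j] == "-":
                        j += 1
                    events.add((name, i, j - i))
                    i = j
                else:
                    i += 1
        return events

    alternatives = [  # toy alternative alignments of three sequences
        {"A": "ACG--TT", "B": "ACGGGTT", "C": "ACG--TT"},
        {"A": "ACG--TT", "B": "ACGGGTT", "C": "AC-G-TT"},
        {"A": "ACG--TT", "B": "ACGGGTT", "C": "ACG--TT"},
    ]

    counts = Counter()
    for aln in alternatives:
        counts.update(indel_set(aln))

    n_aln = len(alternatives)
    for indel, c in sorted(counts.items()):
        score = c / n_aln
        flag = "reliable" if score >= 2 / 3 else "filtered"  # assumed cutoff
        print(indel, round(score, 2), flag)
    ```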

  3. Reliability, Compliance, and Security in Web-Based Course Assessments

    ERIC Educational Resources Information Center

    Bonham, Scott

    2008-01-01

    Pre- and postcourse assessment has become a very important tool for education research in physics and other areas. The web offers an attractive alternative to in-class paper administration, but concerns about web-based administration include reliability due to changes in medium, student compliance rates, and test security, both question leakage…

  4. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  5. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
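
    The construction described here can be stated compactly. In assumed (standard) notation, with F(t) the fault-detection time distribution with finite mean μ and ω the expected total number of faults, the proposal replaces F with its equilibrium distribution inside the NHPP mean value function:

    ```latex
    % Equilibrium distribution of F (standard definition, notation assumed):
    \[
      F_e(t) = \frac{1}{\mu}\int_0^t \bigl(1 - F(x)\bigr)\,dx,
      \qquad
      \mu = \int_0^\infty \bigl(1 - F(x)\bigr)\,dx,
    \]
    % so the mean value function of the NHPP-based SRM becomes
    \[
      \Lambda_e(t) = \omega\,F_e(t).
    \]
    ```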

  6. Surrogate-based Reliability Analysis Using Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Li, Gang; Liu, Zhiqiang

    2010-05-01

An approach to surrogate-based reliability analysis using a support vector machine (SVM) with Monte Carlo simulation is proposed. Efficient sampling techniques, such as uniform design and Latin hypercube sampling, are used, and the SVM is trained with sample pairs of input and output data obtained by finite element analysis. The trained SVM model, as a solver-surrogate model, is intended to approximate the real performance function. Because the selection of parameters for the SVM strongly affects its learning performance, a genetic algorithm (GA) is integrated into the construction of the SVM to optimize the relevant parameters. The influence of the parameters on the SVM is discussed and a methodology is proposed for selecting the SVM model. Support vector classification (SVC) based and support vector regression (SVR) based reliability analyses are studied. Some numerical examples demonstrate the efficiency and applicability of the proposed method.
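
    A sketch of the workflow (stratified design of experiments, SVR surrogate, Monte Carlo on the surrogate) is shown below. The limit-state function and distributions are stand-ins, and the GA-based parameter tuning the paper uses is replaced here by fixed SVR parameters.

    ```python
    # Sketch: SVR solver-surrogate plus Monte Carlo reliability estimate.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(4)

    def performance(x):
        # Stand-in for an expensive finite element limit-state evaluation:
        # resistance term minus load term (illustrative assumption).
        return x[:, 1] ** 3 - x[:, 0]

    # Stratified (Latin-hypercube-like) design of experiments, 64 points.
    u = (np.arange(64)[:, None] + rng.random((64, 2))) / 64.0
    lhs = np.column_stack([rng.permutation(u[:, 0]), rng.permutation(u[:, 1])])
    X = np.column_stack([4.0 + 10.0 * lhs[:, 0], 1.7 + 1.0 * lhs[:, 1]])
    y = performance(X)

    # Solver-surrogate model (the paper tunes such parameters with a GA).
    svm = make_pipeline(StandardScaler(),
                        SVR(kernel="rbf", C=100.0, gamma="scale")).fit(X, y)

    # Monte Carlo simulation on the surrogate instead of the real solver.
    n = 200_000
    samples = np.column_stack([rng.normal(9.0, 1.5, n), rng.normal(2.2, 0.15, n)])
    pf = np.mean(svm.predict(samples) < 0.0)
    print(f"estimated Pf ~ {pf:.3f}")
    ```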

  7. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
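
    The linear-combination statement in this abstract has a direct mathematical form. In assumed notation, with M mutually exclusive failure modes, p_k the conditional probability that mode k initiates failure, and C_k the loss incurred by a mode-k failure:

    ```latex
    \[
      E[C \mid \text{failure}] \;=\; \sum_{k=1}^{M} p_k\, E[C_k],
      \qquad \sum_{k=1}^{M} p_k = 1 .
    \]
    ```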

  8. Reliability-based optimization under random vibration environment

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1981-01-01

    A methodology of formulating the optimum design problem for structural systems with random parameters and subjected to random vibration as a mathematical programming problem is presented. The proposed method is applied to the optimum design of a cantilever beam with a tip mass and a truss structure supporting a water tank. The excitations are assumed to be Gaussian processes and the geometric and material properties are taken to be normally distributed random variables. The probabilistic constraints are specified for individual failure modes since it is easier to specify the reliability level for each failure mode keeping in view the consequences of failure in that particular mode. The time parameter appearing in the random vibration based constraints is eliminated by replacing the probabilities of failure by suitable upper bounds. The numerical results demonstrate the feasibility and effectiveness of applying the reliability-based design concepts to structures with random parameters and operating in random vibration environment.

  9. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient for combining evidence from different sensors. However, in situations where the evidence highly conflicts, it may produce a counterintuitive result. To address the issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
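
    A sketch of the weighted-average-then-combine pattern is below: the bodies of evidence are averaged with reliability weights, and the average is fused with Dempster's rule n-1 times for n reports (in the style of Murphy's averaging). The frame, masses, and reliability values are illustrative assumptions; the paper's dynamic reliabilities come from an evidence distance function and the belief entropy.

    ```python
    # Sketch: reliability-weighted evidence averaging plus Dempster's rule.
    def dempster(m1, m2):
        """Combine two mass functions over frozenset focal elements."""
        combined, conflict = {}, 0.0
        for a, pa in m1.items():
            for b, pb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + pa * pb
                else:
                    conflict += pa * pb
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    def weighted_average(masses, weights):
        total = sum(weights)
        avg = {}
        for m, w in zip(masses, weights):
            for k, v in m.items():
                avg[k] = avg.get(k, 0.0) + v * w / total
        return avg

    F1, F2 = frozenset("1"), frozenset("2")
    reports = [  # one mass function per sensor report (fault hypotheses F1, F2)
        {F1: 0.6, F2: 0.1, F1 | F2: 0.3},
        {F2: 0.7, F1: 0.1, F1 | F2: 0.2},   # conflicting report
        {F1: 0.65, F2: 0.05, F1 | F2: 0.3},
    ]
    reliability = [0.9, 0.3, 0.8]           # assumed per-report reliabilities

    avg = weighted_average(reports, reliability)
    fused = dempster(dempster(avg, avg), avg)   # n - 1 = 2 combinations
    best = max(fused, key=fused.get)
    print("diagnosis:", set(best), "belief:", round(fused[best], 3))
    ```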

  10. A TCAD-based yield and reliability analysis for VCSELs

    NASA Astrophysics Data System (ADS)

    Odermatt, Stefan; Eitel, Sven; Hövel, Rainer; Letay, Gergoe; Witzigmann, Bernd

    2006-02-01

Yield enhancement and reliability improvement are main requirements in today's industrial VCSEL manufacturing. This requires a thorough understanding of process tolerances and of the effects resulting from design variations. So far, this has been done by statistical analysis of experimental data. In this work, we use a state-of-the-art technology computer aided design (TCAD) tool to analyze device reliability and yield for multiple VCSEL designs. The starting point is a physics-based simulation model that is calibrated to temperature-dependent static and dynamic measurements for a set of single- and multi-mode VCSELs lasing at 850 nm. Applying statistical variations that result from design modifications and process fluctuations, yield and reliability data are extracted by means of simulation. The yield is derived from compliance with selected device specifications (such as available single-mode power), and the device reliability is determined from an analysis of the internal device properties. As an example, the oxide aperture and metal aperture design will be discussed, and a robust design will be presented.

  12. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with redesign rules to simulate alternative future test and redesign outcomes to form a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign provides a company an opportunity to balance development costs versus performance by simultaneously developing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require searching multiple candidate regions of the design space, expending most of the computation needed to define multiple alternate designs. Thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space to identify optima. The efficiency of this method

  13. Reliability improvement through nanoparticle material-based fiber structures

    NASA Astrophysics Data System (ADS)

    Mirer, Tatiana; Ingman, Dov; Suhir, Ephraim

    2007-01-01

Optical fibers require protection against moisture and oxygen, as well as mechanical and thermal protection. Although the reliability of polymer coatings has improved considerably over the last decade, it is still insufficient for particular applications. The authors recommend newly invented nanoparticle material (NPM)-based fiber structures as an effective coating system. The NPM is able to actively replace water molecules at the surface of the underlying material. The NPM fills in existing or incipient flaws (cracks, etc.), thereby "healing" the damaged (defective) material. Nonpolymer coatings make the fiber mechanically reliable and environmentally durable. This is due to the "self-healing" ability of the thixotropic NPM compound, as well as to the NPM's ability to "heal the wounds" on the surface of the silica material under stress. The objective of the two experiments undertaken and addressed in this study is to compare the mechanical and environmental characteristics of NPM-based and "conventional" fibers under different loading and ambient conditions. We show that the NPM effectively protects the silica surface against damage that could be caused by water vapor. The NPM is promising as an effective coating that is able to dramatically improve the optical performance, mechanical reliability, environmental durability, and cost effectiveness of silica-based light-guides.

  14. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-08-01

The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  15. Limit states and reliability-based pipeline design. Final report

    SciTech Connect

    Zimmerman, T.J.E.; Chen, Q.; Pandey, M.D.

    1997-06-01

This report provides the results of a study to develop limit states design (LSD) procedures for pipelines. Limit states design, also known as load and resistance factor design (LRFD), provides a unified approach to dealing with all relevant failure mode combinations of concern. It explicitly accounts for the uncertainties that naturally occur in the determination of the loads which act on a pipeline and in the resistance of the pipe to failure. The load and resistance factors used are based on reliability considerations; however, the designer is not faced with carrying out probabilistic calculations. This work is done during development and periodic updating of the LSD document. This report provides background information concerning limit states and reliability-based design (Section 2), gives the limit states design procedures that were developed (Section 3) and provides results of the reliability analyses that were undertaken in order to partially calibrate the LSD method (Section 4). An appendix contains LSD design examples in order to demonstrate use of the method. Section 3, Limit States Design, has been written in the format of a recommended practice. It has been structured so that, in future, it can easily be converted to a limit states design code format. Throughout the report, figures and tables are given at the end of each section, with the exception of Section 3, where, to facilitate understanding of the LSD method, they have been included with the text.
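
    The LRFD format referred to above has a standard general form (notation assumed here): for every limit state, the factored resistance must envelop the sum of the factored load effects,

    ```latex
    \[
      \phi\, R_n \;\ge\; \sum_i \gamma_i\, L_{n,i},
    \]
    % R_n: nominal resistance;  L_{n,i}: nominal load effects;
    % \phi, \gamma_i: resistance and load factors which, as the report notes,
    % are calibrated from reliability considerations.
    ```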

  16. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.

  17. Study of vertical breakwater reliability based on copulas

    NASA Astrophysics Data System (ADS)

    Dong, Sheng; Li, Jingjing; Li, Xue; Wei, Yong

    2016-04-01

The reliability of a vertical breakwater is calculated using direct integration methods based on joint density functions. The horizontal and uplifting wave forces on the vertical breakwater can be well fitted by the lognormal and the Gumbel distributions, respectively. The joint distribution of the horizontal and uplifting wave forces is analyzed using different probabilistic distributions, including the bivariate logistic Gumbel distribution, the bivariate lognormal distribution, and three bivariate Archimedean copula functions constructed with different marginal distributions. Fully nested copulas are used to construct multivariate distributions that take related variables into account. Different goodness-of-fit tests are carried out to determine the best bivariate copula model for wave forces on a vertical breakwater. We show that a bivariate model constructed with the Frank copula, using Gumbel and lognormal marginal distributions for the uplifting pressure and the horizontal wave force, respectively, gives the best reliability analysis. The results show that the failure probability of the vertical breakwater calculated with the multivariate density function is comparable to that obtained with the Joint Committee on Structural Safety methods. As copulas are suitable for constructing bivariate or multivariate joint distributions, they have great potential in reliability analysis for other coastal structures.
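
    A sketch of how such a copula-based failure probability can be computed follows: sample the Frank copula by conditional inversion, map the uniforms through the lognormal (horizontal force) and Gumbel (uplift) marginals, and evaluate a limit state by Monte Carlo. The copula parameter, marginal parameters, and the sliding limit state are illustrative assumptions, not the study's fitted values.

    ```python
    # Sketch: Monte Carlo reliability with a Frank copula and fitted marginals.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    theta = 5.0                      # Frank copula dependence parameter (assumed)
    n = 1_000_000

    # Conditional-inversion sampling of (u, v) from the Frank copula:
    # v = -(1/theta) * ln(1 + w*(e^{-theta}-1) / (w + e^{-theta*u}*(1-w))).
    u = rng.random(n)
    w = rng.random(n)
    a = np.exp(-theta * u)
    v = -np.log1p(w * np.expm1(-theta) / (w + a * (1.0 - w))) / theta

    # Transform the coupled uniforms through the marginals.
    H = stats.lognorm(s=0.25, scale=900.0).ppf(u)        # horizontal force, kN/m
    U = stats.gumbel_r(loc=350.0, scale=60.0).ppf(v)     # uplift force, kN/m

    # Simple sliding limit state: friction resistance vs. horizontal demand.
    mu, weight = 0.6, 2800.0         # friction coefficient, caisson weight (assumed)
    g = mu * (weight - U) - H
    print(f"sliding failure probability ~ {np.mean(g < 0.0):.2e}")
    ```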

  18. Improved reliability analysis method based on the failure assessment diagram

    NASA Astrophysics Data System (ADS)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.

  19. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system and component level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationship among the design variables based on physics, mechanics, material behavior models, interaction of different components and their respective disciplines such as structures, materials, fluid, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated, and the analysis refined as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on fundamental physical behavior of components and their relationship with other components has demonstrated itself to be a superior technique to conventional reliability approaches based on utilizing failure rates derived from similar equipment or simply expert judgment.

  20. Reliability-based robust design optimization of vehicle components, Part I: Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2015-06-01

    The reliability-based design optimization, the reliability sensitivity analysis and robust design method are employed to present a practical and effective approach for reliability-based robust design optimization of vehicle components. A procedure for reliability-based robust design optimization of vehicle components is proposed. Application of the method is illustrated by reliability-based robust design optimization of axle and spring. Numerical results have shown that the proposed method can be trusted to perform reliability-based robust design optimization of vehicle components.

  1. A Measure for the Reliability of a Rating Scale Based on Longitudinal Clinical Trial Data

    ERIC Educational Resources Information Center

    Laenen, Annouschka; Alonso, Ariel; Molenberghs, Geert

    2007-01-01

    A new measure for reliability of a rating scale is introduced, based on the classical definition of reliability, as the ratio of the true score variance and the total variance. Clinical trial data can be employed to estimate the reliability of the scale in use, whenever repeated measurements are taken. The reliability is estimated from the…
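
    The classical definition the measure builds on is the ratio of true-score variance to total (observed) variance; in standard (assumed) notation:

    ```latex
    \[
      R \;=\; \frac{\sigma_T^2}{\sigma_T^2 + \sigma_\varepsilon^2},
    \]
    % \sigma_T^2: true-score variance; \sigma_\varepsilon^2: error variance,
    % so the denominator is the total variance of the observed ratings.
    ```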

  2. Reliability-based robust design optimization of vehicle components, Part II: Case studies

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2015-06-01

The reliability-based optimization, the reliability-based sensitivity analysis and the robust design method are employed to propose an effective approach for reliability-based robust design optimization of vehicle components in Part I. Applications of the method are further discussed for reliability-based robust optimization of vehicle components in this paper. Examples of axles, torsion bars, and coil and composite springs are illustrated for numerical investigation. Results have shown that the proposed method is efficient for reliability-based robust design optimization of vehicle components.

  3. RELIABILITY BASED DESIGN OF FIXED FOUNDATION WIND TURBINES

    SciTech Connect

    Nichols, R.

    2013-10-14

Recent analysis of offshore wind turbine foundations using both applicable API and IEC standards shows that the total load demand from wind and waves is greatest in wave-driven storms. Further, analysis of overturning moment (OTM) loads reveals that impact forces exerted by breaking waves are the largest contributor to OTM in big storms at wind speeds above the operating range of 25 m/s. Currently, no codes or standards for offshore wind power generators have been adopted by the Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE) for use on the Outer Continental Shelf (OCS). In current design methods based on allowable stress design (ASD), the uncertainty in the variation of loads transferred to the foundation, and in the geotechnical capacity of the soil and rock to support those loads, is incorporated into a factor of safety. Sources of uncertainty include spatial and temporal variation of engineering properties, reliability of property measurements, applicability and sufficiency of sampling and testing methods, modeling errors, and variability of estimated load predictions. In ASD these sources of variability are generally given qualitative rather than quantitative consideration. The IEC 61400‐3 design standard for offshore wind turbines is based on ASD methods. Load and resistance factor design (LRFD) methods are being increasingly used in the design of structures. Uncertainties such as those listed above can be included quantitatively in the LRFD process, in which load and resistance factors are statistically based. This type of analysis recognizes that there is always some probability of failure and enables the probability of failure to be quantified. This paper presents an integrated approach consisting of field observations and numerical simulation to establish the distribution of loads from breaking waves to support the LRFD of fixed offshore foundations.

  4. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    SciTech Connect

    Boring, Ronald Laurids; Shirley, Rachel Elizabeth; Joe, Jeffrey Clark; Mandelli, Diego

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  5. A Research of Weapon System Storage Reliability Simulation Method Based on Fuzzy Theory

    NASA Astrophysics Data System (ADS)

    Shi, Yonggang; Wu, Xuguang; Chen, Haijian; Xu, Tingxue

    To address the problem of storage reliability analysis for new, complicated weapon equipment systems, this paper investigates methods of fuzzy fault tree analysis and fuzzy system storage reliability simulation, discusses the approach of treating the weapon system as a fuzzy system, and studies the storage reliability of weapon systems based on fuzzy theory, providing a storage reliability research method for new, complicated weapon equipment systems. As an example, the fuzzy fault tree of one type of missile control instrument is built up based on function analysis, and the fuzzy system storage reliability simulation method is used to analyze the storage reliability index of the control instrument.

  6. Reliable freestanding position-based routing in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Gabriel A; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering free-space propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159

  7. High reliability outdoor sonar prototype based on efficient signal coding.

    PubMed

    Alvarez, Fernando J; Ureña, Jesús; Mazo, Manuel; Hernández, Alvaro; García, Juan J; de Marziani, Carlos

    2006-10-01

    Many mobile robots and autonomous vehicles designed for outdoor operation have incorporated ultrasonic sensors in their navigation systems, whose function is mainly to avoid possible collisions with very close obstacles. The use of these systems in more precise tasks requires signal encoding and the incorporation of pulse compression techniques that have already been used with success in the design of high-performance indoor sonars. However, the transmission of ultrasonic encoded signals outdoors entails a new challenge because of the effects of atmospheric turbulence. This phenomenon causes random fluctuations in the phase and amplitude of traveling acoustic waves, a fact that can make the encoded signal completely unrecognizable by its matched receiver. Atmospheric turbulence is investigated in this work, with the aim of determining the conditions under which it is possible to assure the reliable outdoor operation of an ultrasonic pulse compression system. As a result of this analysis, a novel sonar prototype based on complementary sequences coding is developed and experimentally tested. This encoding scheme provides the system with very useful additional features, namely, high robustness to noise, multi-mode operation capability (simultaneous emissions with minimum cross talk interference), and the possibility of applying an efficient detection algorithm that notably decreases the hardware resource requirements.
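
    The complementary-sequence coding underlying the prototype can be illustrated with a binary Golay pair, whose aperiodic autocorrelations sum to a perfect delta. The recursive construction below is the standard one and is only a sketch, not the authors' implementation:

        import numpy as np

        def golay_pair(order):
            """Recursively build a binary Golay complementary pair of length 2**order."""
            a, b = np.array([1.0]), np.array([1.0])
            for _ in range(order):
                a, b = np.concatenate([a, b]), np.concatenate([a, -b])
            return a, b

        def autocorr(x):
            """Full aperiodic autocorrelation of a real sequence."""
            return np.correlate(x, x, mode="full")

        a, b = golay_pair(5)                  # length-32 pair
        s = autocorr(a) + autocorr(b)         # off-peak sidelobes cancel exactly
        assert np.allclose(s[:len(a) - 1], 0.0)
        assert np.isclose(s[len(a) - 1], 2 * len(a))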

  8. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need to undertake a reliable mission. These objectives have been structured to lead mission planning through construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it holds the opportunity to enable new directions as NASA moves forward in tackling the challenges of space exploration.

  9. Integrated circuit reliability. Citations from the NTIS data base

    NASA Astrophysics Data System (ADS)

    Reed, W. E.

    1980-06-01

    The bibliography presents research pertinent to design, reliability prediction, failure and malfunction, processing techniques, and radiation damage. This updated bibliography contains 193 abstracts, 17 of which are new entries to the previous edition.

  10. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. Solving the stochastic optimization problem yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to a 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, corresponding to a reliability p of unity (p = 1). Weight can be reduced to a small value for the most failure-prone design, with a reliability that approaches zero (p = 0). Reliability can be changed for different components of an airframe structure. For example, the landing gear can be designed for very high reliability, whereas reliability can be reduced somewhat for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code is the deterministic analysis tool, (2) the fast probabilistic integrator (the FPI module of the NESSUS software) is the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards is the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.
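
    The inverted-S relationship between optimum weight and reliability can be reproduced with a one-variable sizing sketch: choose the cross-sectional area (a weight proxy) so that a normally distributed load is carried with probability p. The numbers below are illustrative, not taken from the airframe model:

        import numpy as np
        from scipy.stats import norm

        # Assumed illustrative values: normal load (kN), deterministic allowable
        # stress (kN/cm^2).
        mu_load, sigma_load = 100.0, 15.0
        allowable = 20.0

        def required_area(p):
            """Smallest cross-section area carrying the random load with probability p."""
            load_p = mu_load + norm.ppf(p) * sigma_load   # p-quantile of the load
            return load_p / allowable

        for p in (0.01, 0.5, 0.99, 0.999999):
            print(f"p = {p:<9} area (weight proxy) = {required_area(p):7.2f} cm^2")
        # Weight grows without bound as p -> 1 and shrinks as p -> 0: the inverted-S curve.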

  11. Reliability-Based Life Assessment of Stirling Convertor Heater Head

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Halford, Gary R.; Korovaichuk, Igor

    2004-01-01

    Onboard radioisotope power systems being developed and planned for NASA's deep-space missions require reliable design lifetimes of up to 14 yr. The structurally critical heater head of the high-efficiency Stirling power convertor has undergone extensive computational analysis of operating temperatures, stresses, and creep resistance of the thin-walled Inconel 718 bill of material. A preliminary assessment of the effect of uncertainties in the material behavior was also performed. Creep failure resistance of the thin-walled heater head could show variation due to small deviations in the manufactured thickness and in uncertainties in operating temperature and pressure. Durability prediction and reliability of the heater head are affected by these deviations from nominal design conditions. Therefore, it is important to include the effects of these uncertainties in predicting the probability of survival of the heater head under mission loads. Furthermore, it may be possible for the heater head to experience rare incidences of small temperature excursions of short duration. These rare incidences would affect the creep strain rate and, therefore, the life. This paper addresses the effects of such rare incidences on the reliability. In addition, the sensitivities of variables affecting the reliability are quantified, and guidelines developed to improve the reliability are outlined. Heater head reliability is being quantified with data from NASA Glenn Research Center's accelerated benchmark testing program.

  12. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    49 CFR Part 238, Appendix E—General Principles of Reliability-Based Maintenance Programs: "…reliability beyond the design reliability. (e) When a maintenance program is developed, it includes tasks…"

  13. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    49 CFR Part 238, Appendix E—General Principles of Reliability-Based Maintenance Programs: "…reliability beyond the design reliability. (e) When a maintenance program is developed, it includes tasks…"

  14. Routing Protocol Based on Link Reliability for WSN

    NASA Astrophysics Data System (ADS)

    Weipeng, Jing; Yaqiu, Liu

    In this paper, a link reliability strategy constructed from the remaining energy and communication cost of nodes is defined as the topology weight, to synthetically reflect the energy efficiency of dominators, and an Energy-radio and communication cost route (ECCR) is proposed to address average energy consumption within clusters and minimum communication cost. Node residual energy and the distance between sink and node are used to compete for cluster head; at the same time, to reduce cluster-head energy costs, link reliability and hop count are used to establish the topology. The experimental results show that the algorithm not only saves energy but also ensures the reliability of topology links and extends the network life cycle efficiently.

  15. Reliability-based condition assessment of steel containment and liners

    SciTech Connect

    Ellingwood, B.; Bhattacharya, B.; Zheng, R.

    1996-11-01

    Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs.

  16. Architecture-Based Reliability Analysis of Web Services

    ERIC Educational Resources Information Center

    Rahmani, Cobra Mariam

    2012-01-01

    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  17. Neural Networks Based Approach to Enhance Space Hardware Reliability

    NASA Technical Reports Server (NTRS)

    Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.

    2011-01-01

    This paper demonstrates the use of Neural Networks as a device modeling tool to increase the reliability analysis accuracy of circuits targeted for space applications. The paper tackles a number of case studies of relevance to the design of Flight hardware. The results show that the proposed technique generates more accurate models than the ones regularly used to model circuits.

  18. Reliability and Validity of Curriculum-Based Informal Reading Inventories.

    ERIC Educational Resources Information Center

    Fuchs, Lynn; And Others

    A study was conducted to explore the reliability and validity of three prominent procedures used in informal reading inventories (IRIs): (1) choosing a 95% word recognition accuracy standard for determining student instructional level, (2) arbitrarily selecting a passage to represent the difficulty level of a basal reader, and (3) employing…

  19. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
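
    A minimal sketch of the MPP search in standard normal space, using a generic optimizer rather than the derivative-based machinery developed in the paper; the limit state function is an assumed example:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        # Assumed limit state in standard normal space: failure when g(u) <= 0,
        # with the origin in the safe domain (g(0) > 0).
        def g(u):
            return 5.0 - u[0] * u[1] - u[0] - 2.0 * u[1]

        # MPP search: minimize ||u||^2 subject to g(u) = 0.
        res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                       constraints={"type": "eq", "fun": g})
        u_star = res.x
        beta = np.linalg.norm(u_star)        # Hasofer-Lind reliability index
        pf_form = norm.cdf(-beta)            # FORM estimate of failure probability
        print(f"MPP = {u_star}, beta = {beta:.3f}, FORM pf = {pf_form:.2e}")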

  20. A simple reliability-based topology optimization approach for continuum structures using a topology description function

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Wen, Guilin; Zhi Zuo, Hao; Qing, Qixiang

    2016-07-01

    The structural configuration obtained by deterministic topology optimization may represent a low reliability level and lead to a high failure rate. Therefore, it is necessary to take reliability into account for topology optimization. By integrating reliability analysis into topology optimization problems, a simple reliability-based topology optimization (RBTO) methodology for continuum structures is investigated in this article. The two-layer nesting involved in RBTO, which is time consuming, is decoupled by the use of a particular optimization procedure. A topology description function approach (TOTDF) and a first order reliability method are employed for topology optimization and reliability calculation, respectively. The problem of the non-smoothness inherent in TOTDF is dealt with using two different smoothed Heaviside functions and the corresponding topologies are compared. Numerical examples demonstrate the validity and efficiency of the proposed improved method. In-depth discussions are also presented on the influence of different structural reliability indices on the final layout.
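
    The article's two smoothed Heaviside functions are not reproduced here; two common choices, a cubic polynomial blend and a hyperbolic tangent, look like the following (eps sets the transition half-width):

        import numpy as np

        def heaviside_poly(x, eps):
            """Polynomial smoothing of the Heaviside step over [-eps, eps]."""
            h = 0.75 * (x / eps - (x / eps) ** 3 / 3.0) + 0.5
            return np.where(x < -eps, 0.0, np.where(x > eps, 1.0, h))

        def heaviside_tanh(x, eps):
            """Hyperbolic-tangent smoothing; smooth everywhere, never exactly 0 or 1."""
            return 0.5 * (1.0 + np.tanh(x / eps))

        x = np.linspace(-1.0, 1.0, 5)
        print(heaviside_poly(x, 0.5))
        print(heaviside_tanh(x, 0.5))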

  1. A damage mechanics based approach to structural deterioration and reliability

    SciTech Connect

    Bhattacharya, B.; Ellingwood, B.

    1998-02-01

    Structural deterioration often occurs without perceptible manifestation. Continuum damage mechanics defines structural damage in terms of the material microstructure and relates the damage variable to the macroscopic strength or stiffness of the structure. This enables one to predict the state of damage prior to the initiation of a macroscopic flaw and to estimate the residual strength/service life of an existing structure. The accumulation of damage is a dissipative process that is governed by the laws of thermodynamics. Partial differential equations for damage growth in terms of the Helmholtz free energy are derived from fundamental thermodynamic conditions. Closed-form solutions to the equations are obtained under uniaxial loading for ductile deformation damage as a function of plastic strain, for creep damage as a function of time, and for fatigue damage as a function of the number of cycles. The proposed damage growth model is extended into the stochastic domain by considering fluctuations in the free energy, and closed-form solutions of the resulting stochastic differential equation are obtained in each of the three cases mentioned above. A reliability analysis of a ring-stiffened cylindrical steel shell subjected to corrosion, accidental pressure, and temperature is performed.

  2. Hard and Soft Constraints in Reliability-Based Design Optimization

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2006-01-01

    This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm-bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, and in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, this methodology allows (i) determining whether a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) generating conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives, and (iii) identifying the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds on the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method that integrates these upper bounds, for which closed-form expressions are derived, with conditional sampling. In addition, an l-infinity formulation for the efficient manipulation of hyper-rectangular sets is proposed.

  3. A MCS Based Neural Network Approach to Extract Network Approximate Reliability Function

    NASA Astrophysics Data System (ADS)

    Yeh, Wei-Chang; Lin, Chien-Hsing; Lin, Yi-Cheng

    Simulation has been applied extensively to solve complex real-world problems, providing reference results and supporting decision candidates in quantitative attributes. This paper combines an artificial neural network (ANN) with Monte Carlo simulation (MCS) to provide a reference model for predicting the reliability of a network. It suggests a reduced BBD design to select the input training data and opens the black box of the neural network by constructing the limited-space reliability function from the ANN parameters. In addition, a practical problem that considers both cost and reliability is used to evaluate the performance of the ANN-based reliability function.
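
    The MCS half of such a scheme, which generates the reliability labels an ANN can be trained on, reduces to sampling component states and checking source-sink connectivity. A small sketch for an assumed five-edge bridge network:

        import numpy as np

        rng = np.random.default_rng(1)

        # Assumed bridge network: nodes 0..3, source 0, sink 3.
        # Edges as (node_a, node_b, reliability).
        edges = [(0, 1, 0.9), (0, 2, 0.9), (1, 2, 0.8), (1, 3, 0.9), (2, 3, 0.9)]

        def connected(up):
            """Source-to-sink reachability given a boolean 'up' state per edge."""
            reach, frontier = {0}, [0]
            while frontier:
                n = frontier.pop()
                for (a, b, _), ok in zip(edges, up):
                    if ok:
                        for u, v in ((a, b), (b, a)):
                            if u == n and v not in reach:
                                reach.add(v)
                                frontier.append(v)
            return 3 in reach

        samples = 20_000
        states = rng.random((samples, len(edges))) < [r for _, _, r in edges]
        rel = np.mean([connected(row) for row in states])
        print(f"MCS estimate of source-sink reliability: {rel:.4f}")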

  4. Reliability and Validity of the Evidence-Based Practice Confidence (EPIC) Scale

    ERIC Educational Resources Information Center

    Salbach, Nancy M.; Jaglal, Susan B.; Williams, Jack I.

    2013-01-01

    Introduction: The reliability, minimal detectable change (MDC), and construct validity of the evidence-based practice confidence (EPIC) scale were evaluated among physical therapists (PTs) in clinical practice. Methods: A longitudinal mail survey was conducted. Internal consistency and test-retest reliability were estimated using Cronbach's alpha…

  5. Is School-Based Height and Weight Screening of Elementary Students Private and Reliable?

    ERIC Educational Resources Information Center

    Stoddard, Sarah A.; Kubik, Martha Y.; Skay, Carol

    2008-01-01

    The Institute of Medicine recommends school-based body mass index (BMI) screening as an obesity prevention strategy. While school nurses have provided height/weight screening for years, little has been published describing measurement reliability or process. This study evaluated the reliability of height/weight measures collected by school nurses…

  6. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    ERIC Educational Resources Information Center

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  7. A Reliable Homemade Electrode Based on Glassy Polymeric Carbon

    ERIC Educational Resources Information Center

    Santos, Andre L.; Takeuchi, Regina M.; Oliviero, Herilton P.; Rodriguez, Marcello G.; Zimmerman, Robert L.

    2004-01-01

    The production of a GPC-based material by submitting a cross-linked resin precursor to controlled thermal conditions is discussed. The precursor material is prepolymerized at 60 °C in a mold and is carbonized in an inert atmosphere by slowly raising the temperature; the rise is performed to avoid change in the shape of the carbonization…

  8. On the Reliability of Vocational Workplace-Based Certifications

    ERIC Educational Resources Information Center

    Harth, H.; Hemker, B.T.

    2013-01-01

    The assessment of vocational workplace-based qualifications in England relies on human assessors (raters). These assessors observe naturally occurring, non-standardised evidence, unique to each learner and evaluate the learner as competent/not yet competent against content standards. Whilst these are considered difficult to measure, this study…

  9. A reliable Doppler-based solution for single sensor geolocation

    NASA Astrophysics Data System (ADS)

    Witzgall, H.

    This paper examines the ability of particle filters to provide accurate Doppler-based frequency of arrival (FOA) geolocation of radio frequency (RF) emitters. Most existing non-differential Doppler geolocation techniques simplify their geolocation solution by assuming that the emitter's carrier frequency is unknown but stable over the course of the triangulation. This assumption is often violated by today's commercial devices whose applications allow for significant carrier frequency drift, with the result of erroneous FOA solutions. The proposed approach uses particles to discretely represent a state's hypothesized emitter location and conditionally updates the particle's associated frequency drift based on that location and the observations. The performance of this approach is examined for the case of a relatively slow-moving unmanned aerial vehicle (UAV). The results show it is significantly more accurate and robust than Newton's iterative gradient descent techniques, and closely approaches the FOA Cramer-Rao lower bound (CRLB) for location estimation.
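
    A toy version of the idea: particles carry a hypothesized emitter position plus a carrier-frequency offset that is conditionally updated as a random walk, so slow drift does not corrupt the position estimate. Geometry, signal parameters, and noise levels below are assumptions for illustration, not the paper's scenario:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        C, F0 = 3.0e8, 915.0e6             # speed of light; assumed nominal carrier (Hz)

        def received_freq(emitter_xy, sensor_xy, sensor_v, carrier):
            """Doppler-shifted frequency seen by a moving sensor from a static emitter."""
            los = emitter_xy - sensor_xy
            v_r = sensor_v @ los / np.linalg.norm(los)  # closing speed along line of sight
            return carrier * (1.0 + v_r / C)

        true_xy = np.array([4000.0, 2500.0])            # truth, meters
        n_steps, sigma_f, drift = 60, 2.0, 0.5          # noise (Hz), drift (Hz/step)

        n_p = 5000                                      # particles: [x, y, offset_Hz]
        parts = np.column_stack([rng.uniform(0, 8000, n_p),
                                 rng.uniform(0, 8000, n_p),
                                 rng.normal(0, 30, n_p)])
        w = np.full(n_p, 1.0 / n_p)

        for k in range(n_steps):
            sensor_xy, sensor_v = np.array([60.0 * k, 0.0]), np.array([60.0, 0.0])
            z = received_freq(true_xy, sensor_xy, sensor_v, F0 + drift * k) \
                + rng.normal(0.0, sigma_f)

            parts[:, 2] += rng.normal(0.0, 1.0, n_p)    # conditional drift random walk
            pred = np.array([received_freq(p[:2], sensor_xy, sensor_v, F0 + p[2])
                             for p in parts])
            w *= norm.pdf(z - pred, scale=5.0 * sigma_f)  # inflated scale for robustness
            w /= w.sum()
            if 1.0 / np.sum(w ** 2) < n_p / 2:          # resample on low effective size
                idx = rng.choice(n_p, n_p, p=w)
                parts = parts[idx] + rng.normal(0.0, [20.0, 20.0, 0.5], (n_p, 3))
                w = np.full(n_p, 1.0 / n_p)

        print("estimate:", w @ parts[:, :2], " truth:", true_xy)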

  10. Reliability Modeling Development and Its Applications for Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.

  11. Reliable binary cell-fate decisions based on oscillations

    NASA Astrophysics Data System (ADS)

    Pfeuty, B.; Kaneko, K.

    2014-02-01

    Biological systems have often to perform binary decisions under highly dynamic and noisy environments, such as during cell-fate determination. These decisions can be implemented by two main bifurcation mechanisms based on the transitions from either monostability or oscillation to bistability. We compare these two mechanisms by using stochastic models with time-varying fields and by establishing asymptotic formulas for the choice probabilities. Different scaling laws for decision sensitivity with respect to noise strength and signal timescale are obtained, supporting a role for oscillatory dynamics in performing noise-robust and temporally tunable binary decision-making. This result provides a rationale for recent experimental evidences showing that oscillatory expression of proteins often precedes binary cell-fate decisions.

  12. A guide to reliability aspects of microprocessor-based instrument development

    NASA Astrophysics Data System (ADS)

    Taunton, J. C.

    1982-03-01

    Techniques for assessing the hardware reliability of microprocessor-based products are reviewed. Models for predicting the failure rates of different categories of microelectronic components, failure mechanisms, and degradation processes are examined. The failure rates of several types of microprocessor, memory, and peripheral components, obtained from accelerated life testing, are given. Software design philosophies, the choice of programming languages, and methods of software testing and reliability assessment are discussed. The life characteristics of microelectronic components follow the same curve as those of discrete digital or analog components, and similar models can be used to describe their failure characteristics. The best estimates of system reliability come from independent assessment of hardware and software reliability. The overall reliability of hardware is expected to be better in LSI systems, although initial failure rates can be higher than for discrete components.

  13. Construction of reliable radiocarbon-based chronologies for speleothems

    NASA Astrophysics Data System (ADS)

    Lechleitner, Franziska; Fohlmeister, Jens; McIntyre, Cameron; Baldini, Lisa M.; Jamieson, Robert A.; Hercman, Helena; Gasiorowski, Michal; Pawlak, Jacek; Stefaniak, Krzysztof; Socha, Pawel; Eglinton, Timothy I.; Baldini, James U. L.

    2016-04-01

    Speleothems have become one of the most widely applied archives for paleoclimate research. One of their key advantages is their amenability to U-series dating, often producing excellent high-precision chronologies. However, stalagmites with high detrital Th or very low U concentrations are problematic to date using U-series and sometimes need to be discarded from further paleoclimate analysis. Radiocarbon chronologies could present an alternative for stalagmites that cannot be dated using U-series, if offsets from the "dead carbon fraction" (DCF) can be resolved. The DCF is a variable reservoir effect introduced by the addition of 14C-dead carbon from host rock dissolution and soil organic matter. We present a novel age modeling technique that provides accurate 14C-based chronologies for stalagmites. As this technique focuses on the long-term decay pattern of 14C, it is only applicable to stalagmites that show no secular variability in their 14C-depth profiles, but it is independent of short-term DCF variations. In order to determine whether a stalagmite is suitable for this method without direct knowledge of long-term trends in the DCF, we highlight how other geochemical proxies (δ13C, Mg/Ca) can provide additional information on changes in karst hydrology, soil conditions, and climate that would affect the DCF. We apply our model to a previously published U-Th dated stalagmite 14C dataset from Heshang Cave, China, with excellent results, followed by a previously 'undateable' stalagmite from southern Poland.

  14. Reliable Location-Based Services from Radio Navigation Systems

    PubMed Central

    Qiu, Di; Boneh, Dan; Lo, Sherman; Enge, Per

    2010-01-01

    Loran is a radio-based navigation system originally designed for naval applications. We show that Loran-C's high power and highly repeatable accuracy make it well suited to security applications. First, we show how to derive a precise location tag—with a sensitivity of about 20 meters—that is difficult to project to an exact location. A device can use our location tag to block or allow certain actions, without knowing its precise location. To ensure that our tag is reproducible we make use of fuzzy extractors, a mechanism originally designed for biometric authentication. We build a fuzzy extractor specifically designed for radio-type errors and give experimental evidence to show its effectiveness. Second, we show that our location tag is difficult to predict from a distance. For example, an observer cannot predict the location tag inside a guarded data center from a few hundred meters away. As an application, consider a location-aware disk drive that will only work inside the data center. An attacker who steals the device and is capable of spoofing Loran-C signals still cannot make the device work, since he does not know what location tag to spoof. We provide experimental data supporting our unpredictability claim. PMID:22163532

  15. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program concluding on the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this…

  16. Composite reliability of a workplace-based assessment toolbox for postgraduate medical education.

    PubMed

    Moonen-van Loon, J M W; Overeem, K; Donkers, H H L M; van der Vleuten, C P M; Driessen, E W

    2013-12-01

    In recent years, postgraduate assessment programmes around the world have embraced workplace-based assessment (WBA) and its related tools. Despite their widespread use, results of studies on the validity and reliability of these tools have been variable. Although in many countries decisions about residents' continuation of training and certification as a specialist are based on the composite results of different WBAs collected in a portfolio, to our knowledge, the reliability of such a WBA toolbox has never been investigated. Using generalisability theory, we analysed the separate and composite reliability of three WBA tools [mini-Clinical Evaluation Exercise (mini-CEX), direct observation of procedural skills (DOPS), and multisource feedback (MSF)] included in a resident portfolio. G-studies and D-studies of 12,779 WBAs from a total of 953 residents showed that a reliability coefficient of 0.80 was obtained for eight mini-CEXs, nine DOPS, and nine MSF rounds, whilst the same reliability was found for seven mini-CEXs, eight DOPS, and one MSF when combined in a portfolio. At the end of the first year of residency a portfolio with five mini-CEXs, six DOPS, and one MSF afforded reliable judgement. The results support the conclusion that several WBA tools combined in a portfolio can be a feasible and reliable method for high-stakes judgements.

  17. The weakest t-norm based intuitionistic fuzzy fault-tree analysis to evaluate system reliability.

    PubMed

    Kumar, Mohit; Yadav, Shiv Prasad

    2012-07-01

    In this paper, a new approach to intuitionistic fuzzy fault-tree analysis is proposed to evaluate system reliability and to find the most critical system component affecting it. The weakest t-norm based intuitionistic fuzzy fault-tree analysis is presented to calculate the fault interval of system components by integrating experts' knowledge and experience in terms of the possibility of failure of bottom events. It applies fault-tree analysis, α-cuts of intuitionistic fuzzy sets, and Tω (the weakest t-norm) based arithmetic operations on triangular intuitionistic fuzzy sets to obtain the fault interval and reliability interval of the system. This paper also modifies Tanaka et al.'s fuzzy fault-tree definition. For numerical verification, a malfunction of the weapon system "automatic gun" is presented as an example, and the result of the proposed method is compared with existing reliability analysis approaches.
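
    A stripped-down sketch of the α-cut mechanics for an ordinary (non-intuitionistic) fuzzy fault tree, using plain interval arithmetic rather than the weakest t-norm operations of the paper, which produce narrower spreads; the bottom-event possibilities are assumed triangular fuzzy numbers:

        import numpy as np

        def alpha_cut(tfn, alpha):
            """Interval of a triangular fuzzy number (a, b, c) at membership level alpha."""
            a, b, c = tfn
            return np.array([a + alpha * (b - a), c - alpha * (c - b)])

        def and_gate(cuts):
            """AND gate on independent events: product of possibility intervals."""
            return np.prod(cuts, axis=0)

        def or_gate(cuts):
            """OR gate: 1 - product of complements (monotone in each input)."""
            return 1.0 - np.prod(1.0 - np.asarray(cuts), axis=0)

        # Assumed bottom-event failure possibilities as triangular fuzzy numbers.
        e1, e2, e3 = (0.01, 0.02, 0.04), (0.02, 0.03, 0.05), (0.01, 0.015, 0.02)

        for alpha in (0.0, 0.5, 1.0):
            cuts = [alpha_cut(e, alpha) for e in (e1, e2, e3)]
            top = or_gate([and_gate(cuts[:2]), cuts[2]])   # top = (e1 AND e2) OR e3
            print(f"alpha = {alpha}: top-event interval = {np.round(top, 6)}")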

  18. Reliable contact fabrication on nanostructured Bi2Te3-based thermoelectric materials.

    PubMed

    Feng, Shien-Ping; Chang, Ya-Huei; Yang, Jian; Poudel, Bed; Yu, Bo; Ren, Zhifeng; Chen, Gang

    2013-05-14

    A cost-effective and reliable Ni-Au contact on nanostructured Bi2Te3-based alloys for a solar thermoelectric generator (STEG) is reported. The use of MPS SAMs creates strong covalent binding and more nucleation sites with an even distribution for electroplating contact electrodes on nanostructured thermoelectric materials. A reliable high-performance flat-panel STEG can be obtained using this new method. PMID:23531997

  19. The B-747 flight control system maintenance and reliability data base for cost effectiveness tradeoff studies

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Primary and automatic flight controls are combined for a total flight control reliability and maintenance cost data base using information from two previous reports and additional cost data gathered from a major airline. A comparison of the current B-747 flight control system effects on reliability and operating cost with that of a B-747 designed for an active control wing load alleviation system is provided.

  1. Reliability of 3D laser-based anthropometry and comparison with classical anthropometry

    PubMed Central

    Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Broda, Anja; Scholz, Markus

    2016-01-01

    Anthropometric quantities are widely used in epidemiologic research as possible confounders, risk factors, or outcomes. 3D laser-based body scans (BS) allow evaluation of dozens of quantities in short time with minimal physical contact between observers and probands. The aim of this study was to compare BS with classical manual anthropometric (CA) assessments with respect to feasibility, reliability, and validity. We performed a study on 108 individuals with multiple measurements of BS and CA to estimate intra- and inter-rater reliabilities for both. We suggested BS equivalents of CA measurements and determined validity of BS considering CA the gold standard. Throughout the study, the overall concordance correlation coefficient (OCCC) was chosen as indicator of agreement. BS was slightly more time consuming but better accepted than CA. For CA, OCCCs for intra- and inter-rater reliability were greater than 0.8 for all nine quantities studied. For BS, 9 of 154 quantities showed reliabilities below 0.7. BS proxies for CA measurements showed good agreement (minimum OCCC > 0.77) after offset correction. Thigh length showed higher reliability in BS while upper arm length showed higher reliability in CA. Except for these issues, reliabilities of CA measurements and their BS equivalents were comparable. PMID:27225483
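
    The OCCC used throughout the study generalizes Lin's concordance correlation coefficient to multiple observers and repeated measurements; the two-rater version it reduces to is simple to compute (data below are illustrative):

        import numpy as np

        def concordance_ccc(x, y):
            """Lin's concordance correlation coefficient between two raters' measurements."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.mean((x - x.mean()) * (y - y.mean()))
            return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        # Illustrative repeated waist-circumference measurements (cm) by two observers.
        obs1 = [80.1, 92.3, 75.0, 101.2, 88.4]
        obs2 = [80.9, 91.8, 75.6, 100.4, 89.0]
        print(f"CCC = {concordance_ccc(obs1, obs2):.3f}")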

  2. Reliability of 3D laser-based anthropometry and comparison with classical anthropometry.

    PubMed

    Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Broda, Anja; Scholz, Markus

    2016-01-01

    Anthropometric quantities are widely used in epidemiologic research as possible confounders, risk factors, or outcomes. 3D laser-based body scans (BS) allow evaluation of dozens of quantities in short time with minimal physical contact between observers and probands. The aim of this study was to compare BS with classical manual anthropometric (CA) assessments with respect to feasibility, reliability, and validity. We performed a study on 108 individuals with multiple measurements of BS and CA to estimate intra- and inter-rater reliabilities for both. We suggested BS equivalents of CA measurements and determined validity of BS considering CA the gold standard. Throughout the study, the overall concordance correlation coefficient (OCCC) was chosen as indicator of agreement. BS was slightly more time consuming but better accepted than CA. For CA, OCCCs for intra- and inter-rater reliability were greater than 0.8 for all nine quantities studied. For BS, 9 of 154 quantities showed reliabilities below 0.7. BS proxies for CA measurements showed good agreement (minimum OCCC > 0.77) after offset correction. Thigh length showed higher reliability in BS while upper arm length showed higher reliability in CA. Except for these issues, reliabilities of CA measurements and their BS equivalents were comparable. PMID:27225483

  3. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    DOE PAGES

    Guthrie, Michael A.

    2013-01-01

    A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
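
    For a linear limit state with independent normal capacity and demand — the simplest case of the index used in the paper — the Hasofer-Lind computation and a Monte Carlo check fit in a few lines; the energy-like numbers are assumed for illustration:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)

        # Assumed illustrative values, not the paper's model.
        mu_c, sd_c = 10.0, 1.5   # strain energy at failure (capacity)
        mu_d, sd_d = 6.0, 1.0    # peak modal strain energy (demand)

        beta = (mu_c - mu_d) / np.hypot(sd_c, sd_d)  # Hasofer-Lind index, linear normal case
        pf_form = norm.cdf(-beta)

        # Monte Carlo check.
        n = 2_000_000
        fails = rng.normal(mu_c, sd_c, n) - rng.normal(mu_d, sd_d, n) < 0.0
        print(f"beta = {beta:.3f}, FORM pf = {pf_form:.3e}, MC pf = {fails.mean():.3e}")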

  4. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrode and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCC is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to the external stresses, and an empirical function that defines contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.

  5. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation for potential space project applications of multilayer ceramic capacitors (MLCCs) with Ni electrode and BaTiO3 dielectric material requires an in-depth understanding of the MLCCs' reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed in this paper. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
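
    The acceleration function in such a model is often taken to be the Prokopowicz-Vaskas relation — a voltage power law times an Arrhenius temperature term. This is a common assumption for BaTiO3 dielectrics rather than the specific function of the records above, and the exponent and activation energy below are assumed values:

        import numpy as np

        K_B = 8.617e-5  # Boltzmann constant, eV/K

        def pv_acceleration(v_test, v_use, t_test, t_use, n=3.0, ea=1.1):
            """Prokopowicz-Vaskas acceleration factor: time-to-failure ratio t_use/t_test.

            n  -- voltage stress exponent (assumed value)
            ea -- activation energy in eV (assumed value)
            """
            voltage_term = (v_test / v_use) ** n
            thermal_term = np.exp(ea / K_B * (1.0 / t_use - 1.0 / t_test))
            return voltage_term * thermal_term

        # Life test at twice rated voltage and 125 °C; use at rated voltage and 45 °C.
        af = pv_acceleration(v_test=100.0, v_use=50.0, t_test=398.15, t_use=318.15)
        print(f"acceleration factor ~ {af:.0f}x")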

  6. The SUPERB Project: Reliability-based design guideline for submarine pipelines

    SciTech Connect

    Sotberg, T.; Bruschi, R.; Moerk, K.

    1996-12-31

    This paper gives an overview of the research program SUPERB, whose main objective is the development of a SUbmarine PipelinE Reliability Based design guideline with a comprehensive set of design recommendations and criteria for pipeline design. The motivation for this program is that project guidelines currently in force account neither for modern fabrication technology nor for the findings of recent research programs and the capabilities of advanced engineering tools. The main structure of the Limit State Based Design (LSBD) guideline is described, followed by an outline of the safety philosophy introduced to fit within this framework. The focus is on the development of a reliability-based design guideline as a rational tool to manage future offshore projects with an optimal balance between project safety and economy. Selection of appropriate limit state functions and the use of reliability tools to calibrate partial safety factors are also discussed.

  7. The validity and reliability of a problem-based learning implementation questionnaire

    PubMed Central

    Patria, Bhina

    2015-01-01

    Purpose: The aim of this paper is to provide evidence for the validity and reliability of a questionnaire for assessing the implementation of problem-based learning (PBL). This questionnaire was developed to assess the quality of PBL implementation from the perspective of medical school graduates. Methods: A confirmatory factor analysis was conducted to assess the validity of the questionnaire. The analysis was based on a survey of 225 graduates of a problem-based medical school in Indonesia. Results: The results showed that the confirmatory factor analysis model had a good fit to the data. Further, the values of the standardized loading estimates, the squared inter-construct correlations, the average variances extracted, and the composite reliabilities all provided evidence of construct validity. Conclusion: The PBL implementation questionnaire was found to be valid and reliable, making it suitable for evaluation purposes. PMID:26072901

  8. Reliability issues of an accelerometer under environmental stresses

    NASA Astrophysics Data System (ADS)

    Schmitt, Petra; Pressecq, Francis; Perez, Guy; Lafontan, Xavier; Nicot, Jean Marc; Esteve, Daniel; Fourniols, Jean Yves; Camon, Henri; Oudea, Coumar

    2003-12-01

    COTS (commercial-off-the-shelf) MEMS components are very interesting for space applications because they are lightweight, small, economic in energy, cheap, and available on short lead times. The reliability of MEMS COTS that are used outside their intended domain of operation (such as a space application) might be assured by a reliability methodology derived from the Physics of Failure approach. In order to use this approach it is necessary to create models of MEMS components that take environmental stresses into consideration and thus can be used for lifetime prediction. Unfortunately, MEMS failure mechanisms are not well understood today, and therefore preliminary work is necessary to determine influential factors and physical phenomena. The model development is based on a good knowledge of the process parameters (Young's modulus, stress…), environmental tests, and appropriate modeling approaches, such as finite element analysis (FEA) and behavioural modeling. In order to perform the environmental tests and to analyse MEMS behaviour, we have developed the Environmental MEMS Analyzer EMA 3D. The described methodology has been applied to a commercial-off-the-shelf (COTS) accelerometer, the ADXL150. A first-level behavioural model was created and then refined in the following steps by enrichment with experimental results and finite element simulations.

  9. Reliability issues of an accelerometer under environmental stresses

    NASA Astrophysics Data System (ADS)

    Schmitt, Petra; Pressecq, Francis; Perez, Guy; Lafontan, Xavier; Nicot, Jean Marc; Esteve, Daniel; Fourniols, Jean Yves; Camon, Henri; Oudea, Coumar

    2004-01-01

    COTS (commercial-off-the-shelf) MEMS components are very interesting for space applications because they are lightweight, small, economic in energy, cheap, and available on short lead times. The reliability of MEMS COTS that are used outside their intended domain of operation (such as a space application) might be assured by a reliability methodology derived from the Physics of Failure approach. In order to use this approach it is necessary to create models of MEMS components that take environmental stresses into consideration and thus can be used for lifetime prediction. Unfortunately, MEMS failure mechanisms are not well understood today, and therefore preliminary work is necessary to determine influential factors and physical phenomena. The model development is based on a good knowledge of the process parameters (Young's modulus, stress...), environmental tests, and appropriate modeling approaches, such as finite element analysis (FEA) and behavioural modeling. In order to perform the environmental tests and to analyse MEMS behaviour, we have developed the Environmental MEMS Analyzer EMA 3D. The described methodology has been applied to a commercial-off-the-shelf (COTS) accelerometer, the ADXL150. A first-level behavioural model was created and then refined in the following steps by enrichment with experimental results and finite element simulations.

  10. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  11. Reliability Based Design for a Raked Wing Tip of an Airframe

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2011-01-01

    A reliability-based optimization methodology has been developed to design the raked wing tip of the Boeing 767-400 extended range airliner made of composite and metallic materials. Design is formulated for an accepted level of risk or reliability. The design variables, weight and the constraints became functions of reliability. Uncertainties in the load, strength and the material properties, as well as the design variables, were modeled as random parameters with specified distributions, like normal, Weibull or Gumbel functions. The objective function and constraint, or a failure mode, became derived functions of the risk-level. Solution to the problem produced the optimum design with weight, variables and constraints as a function of the risk-level. Optimum weight versus reliability traced out an inverted-S shaped graph. The center of the graph corresponded to a 50 percent probability of success, or one failure in two samples. Under some assumptions, this design would be quite close to the deterministic optimum solution. The weight increased when reliability exceeded 50 percent, and decreased when the reliability was compromised. A design could be selected depending on the level of risk acceptable to a situation. The optimization process achieved up to a 20-percent reduction in weight over traditional design.

  12. Composite Stress Rupture: A New Reliability Model Based on Strength Decay

    NASA Technical Reports Server (NTRS)

    Reeder, James R.

    2012-01-01

    A model is proposed to estimate reliability for stress rupture of composite overwrapped pressure vessels (COPVs) and similar composite structures. This new reliability model is generated by assuming a strength degradation (or decay) over time. The model suggests that most of the strength decay occurs late in life. The strength decay model will be shown to predict a response similar to that predicted by a traditional reliability model for stress rupture based on tests at a single stress level. In addition, the model predicts that even though there is strength decay due to proof loading, a significant overall increase in reliability is gained by eliminating any weak vessels, which would fail early. The model predicts that there should be significant periods of safe life following proof loading, because time is required for the strength to decay from the proof stress level to the subsequent loading level. Suggestions for testing the strength decay reliability model have been made. If the strength decay reliability model predictions are shown through testing to be accurate, COPVs may be designed to carry a higher level of stress than is currently allowed, which will enable the production of lighter structures.

  13. Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Normally, the status of land cover is inherently dynamic and changes continuously on the temporal scale. However, disturbances or abnormal changes of land cover — caused by events such as forest fire, flood, deforestation, and plant diseases — occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods only label the detection results with "Change/No change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred near the border of Russia and China. Results demonstrate that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and an overall accuracy of up to 90%.
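
    Step (3) reduces to comparing a new observation with the model forecast and its uncertainty. A minimal sketch, with an assumed forecast mean and standard error standing in for the BFAST-based model:

        import numpy as np
        from scipy.stats import norm

        # Assumed forecast from the history model: predicted NDVI and its standard
        # error for the new observation date (illustrative values).
        predicted, stderr = 0.62, 0.04
        observed = 0.43                        # new satellite observation

        z = abs(observed - predicted) / stderr
        confidence = 2.0 * norm.cdf(z) - 1.0   # two-sided confidence level of a change

        for cl in (0.90, 0.95, 0.99):
            half_width = norm.ppf(0.5 + cl / 2.0) * stderr
            flagged = abs(observed - predicted) > half_width
            print(f"CL {cl:.0%}: CI half-width {half_width:.3f} -> disturbance: {flagged}")
        print(f"estimated reliability of this disturbance: {confidence:.1%}")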

  14. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    SciTech Connect

    Hosseini, Mahmood; Karimiyani, Somayyeh; Ghafooripour, Amin; Jabbarzadeh, Mohammad Javad

    2008-07-08

    Given the damage caused by past earthquakes to offshore structures, which are vital structures in the oil and gas industries, it is important that their seismic design be performed with very high reliability. Accepting Nonlinear Time History Analyses (NLTHA) as the most reliable seismic analysis method, in this paper an offshore platform of jacket type with a height of 304 feet, a deck of 96 feet by 94 feet, and a weight of 290 million pounds has been studied. At first, some Push-Over Analyses (POA) were performed to recognize the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA were performed using the 3-component accelerograms of 100 earthquakes, covering a wide range of frequency content, and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. Using the results of NLTHA, the damage and rupture probabilities of the critical members were studied to assess the reliability of the jacket structure. Since different structural members of the jacket have different effects on the stability of the platform, an 'importance factor' was considered for each critical member based on its location and orientation in the structure, and the reliability of the whole structure was then obtained by combining the reliabilities of the critical members, each with its specific importance factor.

  15. Assessing the Reliability of Curriculum-Based Measurement: An Application of Latent Growth Modeling

    ERIC Educational Resources Information Center

    Yeo, Seungsoo; Kim, Dong-Il; Branum-Martin, Lee; Wayman, Miya Miura; Espin, Christine A.

    2012-01-01

    The purpose of this study was to demonstrate the use of Latent Growth Modeling (LGM) as a method for estimating reliability of Curriculum-Based Measurement (CBM) progress-monitoring data. The LGM approach permits the error associated with each measure to differ at each time point, thus providing an alternative method for examining the…

  16. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
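
For reference, the classical Spearman-Brown prophecy formula that one of the two proposed formulas reduces to is, for a test lengthened by a factor k:

```latex
% Classical Spearman-Brown prophecy formula (background; the article's two
% IRT-based formulas are not reproduced in this abstract).
\rho_{kk'} = \frac{k\,\rho_{xx'}}{1 + (k - 1)\,\rho_{xx'}}
```

where \rho_{xx'} is the reliability of the original test; doubling a test (k = 2) with reliability 0.60, for example, prophesies a reliability of 0.75.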

  17. Evaluating the Reliability of Selected School-Based Indices of Adequate Reading Progress

    ERIC Educational Resources Information Center

    Wheeler, Courtney E.

    2010-01-01

    The present study examined the stability (i.e., 4-month and 12-month test-retest reliability) of six selected school-based indices of adequate reading progress. The total sampling frame included between 3970 and 5655 schools depending on the index and research question. Each school had at least 40 second-grade students that had complete Oral…

  18. Composite Reliability of a Workplace-Based Assessment Toolbox for Postgraduate Medical Education

    ERIC Educational Resources Information Center

    Moonen-van Loon, J. M. W.; Overeem, K.; Donkers, H. H. L. M.; van der Vleuten, C. P. M.; Driessen, E. W.

    2013-01-01

    In recent years, postgraduate assessment programmes around the world have embraced workplace-based assessment (WBA) and its related tools. Despite their widespread use, results of studies on the validity and reliability of these tools have been variable. Although in many countries decisions about residents' continuation of training and…

  19. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST)

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David

    2006-01-01

    Two IRT-based procedures to estimate test reliability for a certification exam that used both adaptive (via a MST model) and non-adaptive design were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…

  20. Genetic algorithm-support vector regression for high reliability SHM system based on FBG sensor network

    NASA Astrophysics Data System (ADS)

    Zhang, XiaoLi; Liang, DaKai; Zeng, Jie; Asundi, Anand

    2012-02-01

    Structural Health Monitoring (SHM) based on fiber Bragg grating (FBG) sensor networks has attracted considerable attention in recent years. However, the FBG sensors in such networks are typically embedded in or glued to the structure in simple series or parallel topologies. In this case, if an optic fiber or a fiber node fails, the sensors behind the failure point can no longer be interrogated. Therefore, to improve the survivability of the FBG-based sensor system in SHM, it is necessary to build a highly reliable FBG sensor network for SHM engineering applications. In this study, a model-reconstruction soft-computing recognition algorithm based on genetic algorithm-support vector regression (GA-SVR) is proposed to improve the reliability of the FBG-based sensor system. Furthermore, an 8-point FBG sensor system was tested in an aircraft wing box. Predicting the position of external loading damage is an important task for an SHM system; as an example, different failure modes were selected to demonstrate the survivability of the FBG-based sensor network. The results are compared with the non-reconstructed GA-SVR model in each failure mode. Results show that the proposed model-reconstruction algorithm based on GA-SVR maintains its prediction precision when some sensors fail in the SHM system; thus a highly reliable sensor network for the SHM system is achieved without introducing extra components or noise.
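
The reconstruction idea can be sketched as follows: when a sensor fails, a regression model trained on healthy-state data predicts its reading from the surviving sensors. Plain SVR with a small grid search stands in here for the paper's GA-tuned SVR, and all data are synthetic.

```python
# Hedged sketch: regression-based reconstruction of a failed FBG sensor's
# signal from the surviving sensors, so position recognition can continue.
# GridSearchCV stands in for the paper's genetic-algorithm hyperparameter tuning.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 7))                   # readings of 7 surviving sensors
y = X @ rng.normal(size=7) + 0.05 * rng.normal(size=200)  # failed sensor signal

search = GridSearchCV(SVR(kernel="rbf"),
                      {"C": [1, 10, 100], "gamma": ["scale", 0.1]})
search.fit(X, y)
print(search.best_params_, round(search.score(X, y), 3))
```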

  1. Measuring Fidelity and Adaptation: Reliability of an Instrument for School-Based Prevention Programs.

    PubMed

    Bishop, Dana C; Pankratz, Melinda M; Hansen, William B; Albritton, Jordan; Albritton, Lauren; Strack, Joann

    2014-06-01

    There is a need to standardize methods for assessing fidelity and adaptation. Such standardization would allow program implementation to be examined in a manner that will be useful for understanding the moderating role of fidelity in dissemination research. This article describes a method for collecting data about fidelity of implementation for school-based prevention programs, including measures of adherence, quality of delivery, dosage, participant engagement, and adaptation. We report on the reliability of these methods when applied by four observers who coded video recordings of teachers delivering All Stars, a middle school drug prevention program. Interrater agreement for scaled items was assessed for an instrument designed to evaluate program fidelity. Results indicated sound interrater reliability for items assessing adherence, dosage, quality of teaching, teacher understanding of concepts, and program adaptations. The interrater reliability for items assessing potential program effectiveness, classroom management, achievement of activity objectives, and adaptation valences was improved by dichotomizing the response options for these items. The item that assessed student engagement demonstrated only modest interrater reliability and was not improved through dichotomization. Several coder pairs were discordant on items that overall demonstrated good interrater reliability. Proposed modifications to the coding manual and protocol are discussed.

  2. A rainwater harvesting system reliability model based on nonparametric stochastic rainfall generator

    NASA Astrophysics Data System (ADS)

    Basinger, Matt; Montalto, Franco; Lall, Upmanu

    2010-10-01

    The reliability with which harvested rainwater can be used as a means of flushing toilets, irrigating gardens, and topping off air-conditioning units serving multifamily residential buildings in New York City is assessed using a new rainwater harvesting (RWH) system reliability model. Although demonstrated with a specific case study, the model is portable because it is based on a nonparametric rainfall generation procedure utilizing a bootstrapped Markov chain. Precipitation occurrence is simulated using transition probabilities derived for each day of the year based on the historical probability of wet and dry day state changes. Precipitation amounts are selected from a matrix of historical values within a moving 15-day window that is centered on the target day. RWH system reliability is determined for user-specified catchment area and tank volume ranges using precipitation ensembles generated with the described stochastic procedure. The reliability with which NYC backyard gardens can be irrigated and air-conditioning units supplied with water harvested from local roofs exceeds 80% and 90%, respectively, for the entire range of catchment areas and tank volumes considered in the analysis. For RWH systems installed on the most commonly occurring rooftop catchment areas found in NYC (51-75 m²), toilet flushing demand can be met with 7-40% reliability, with the lower end of the range representing buildings with high-flow toilets and no storage elements, and the upper end representing buildings that feature low-flow fixtures and storage tanks of up to 5 m³. When the reliability curves developed are used to size RWH systems to flush the low-flow toilets of all multifamily buildings found in a typical residential neighborhood in the Bronx, rooftop runoff inputs to the sewer system are reduced by approximately 28% over an average rainfall year, and potable water demand is reduced by approximately 53%.
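
A minimal sketch of the generator described above, assuming a two-state (wet/dry) chain with day-of-year transition probabilities and wet-day amounts bootstrapped from a 15-day window; the data layout and names are illustrative.

```python
# Hedged sketch: nonparametric rainfall generator with a bootstrapped Markov
# chain for occurrence and resampled historical depths for amounts.
import numpy as np

rng = np.random.default_rng(42)

def simulate_year(p_wet_given_dry, p_wet_given_wet, amounts_by_day):
    """p_* are length-365 arrays of day-of-year transition probabilities;
    amounts_by_day[d] holds historical wet-day depths (mm) pooled over a
    15-day window centred on calendar day d."""
    wet, series = False, []
    for d in range(365):
        p = p_wet_given_wet[d] if wet else p_wet_given_dry[d]
        wet = rng.random() < p
        series.append(rng.choice(amounts_by_day[d]) if wet else 0.0)
    return np.asarray(series)

p_dry = np.full(365, 0.25); p_wet = np.full(365, 0.60)     # toy probabilities
amounts = [np.array([2.0, 5.0, 11.0])] * 365               # toy pooled depths
print(simulate_year(p_dry, p_wet, amounts)[:10])
```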

  3. Degradation mechanisms in high-power multi-mode InGaAs-AlGaAs strained quantum well lasers for high-reliability applications

    NASA Astrophysics Data System (ADS)

    Sin, Yongkun; Presser, Nathan; Brodie, Miles; Lingley, Zachary; Foran, Brendan; Moss, Steven C.

    2015-03-01

    Laser diode manufacturers perform accelerated multi-cell lifetests to estimate the lifetimes of lasers using an empirical model. Since state-of-the-art laser diodes typically require a long period of latency before they degrade, a significant amount of stress is applied to the lasers to generate failures in relatively short test durations. A drawback of this approach is the lack of mean-time-to-failure data under intermediate and low stress conditions, leading to uncertainty in model parameters (especially the optical power and current exponents) and potential overestimation of lifetimes at usage conditions. This approach is a concern especially for satellite communication systems, where high reliability is required of lasers over long durations in the space environment. A number of groups have studied reliability and degradation processes in GaAs-based lasers, but none of these studies has yielded a reliability model based on the physics of failure. The lack of such a model is also a concern for space applications, where a complete understanding of degradation mechanisms is necessary. Our present study addresses the aforementioned issues by performing long-term lifetests under low stress conditions followed by failure mode analysis (FMA) and physics-of-failure investigation. We performed low-stress lifetests on both MBE- and MOCVD-grown broad-area InGaAs-AlGaAs strained QW lasers under ACC (automatic current control) mode to study low-stress degradation mechanisms. Our lifetests have accumulated over 36,000 test hours, and FMA is performed on failures using our angle polishing technique followed by EL. This technique allows us to identify failure types by observing dark line defects through a window introduced in backside metal contacts. We also investigated degradation mechanisms in MOCVD-grown broad-area InGaAs-AlGaAs strained QW lasers using various FMA techniques. Since it is a challenge to control defect densities during the growth of laser structures, we chose to

  4. A Reliability Model for Ni-BaTiO3-Based (BME) Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with base-metal electrodes (BMEs) for potential NASA space project applications requires an in-depth understanding of their reliability. The reliability of an MLCC is defined as the ability of the dielectric material to retain its insulating properties under stated environmental and operational conditions for a specified period of time t. In this presentation, a general mathematical expression of a reliability model for a BME MLCC is developed and discussed. The reliability model consists of three parts: (1) a statistical distribution that describes the individual variation of properties in a test group of samples (Weibull, lognormal, normal, etc.); (2) an acceleration function that describes how a capacitor's reliability responds to external stresses such as applied voltage and temperature (all units in the test group should follow the same acceleration function if they share the same failure mode, independent of individual units); and (3) the effect and contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size S. In general, a two-parameter Weibull statistical distribution model is used in the description of a BME capacitor's reliability as a function of time. The acceleration function that relates a capacitor's reliability to external stresses depends on the failure mode. Two failure modes have been identified in BME MLCCs: catastrophic and slow degradation. A catastrophic failure is characterized by a time-accelerating increase in leakage current that is mainly due to existing processing defects (voids, cracks, delamination, etc.), i.e., extrinsic defects. A slow degradation failure is characterized by a near-linear increase in leakage current with stress time; this is caused by the electromigration of oxygen vacancies (intrinsic defects). The
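
The two named ingredients, a two-parameter Weibull life distribution and a voltage/temperature acceleration function, can be combined as in the sketch below. The Prokopowicz-Vaskas form used for the acceleration factor is a common choice for MLCCs, not necessarily the exact function of this presentation, and all parameter values are illustrative.

```python
# Hedged sketch: Weibull reliability at use conditions, with the test-condition
# scale parameter eta scaled by a voltage/temperature acceleration factor.
import math

def acceleration_factor(v_use, v_test, t_use_K, t_test_K, n=3.0, ea_eV=1.1):
    """Prokopowicz-Vaskas-style AF; n and Ea are made-up illustrative values."""
    k_B = 8.617e-5  # Boltzmann constant, eV/K
    return (v_test / v_use) ** n * math.exp(
        (ea_eV / k_B) * (1.0 / t_use_K - 1.0 / t_test_K))

def weibull_reliability(t_hours, eta_test, beta, af):
    """R(t) at use conditions: two-parameter Weibull with scale eta_test * AF."""
    return math.exp(-((t_hours / (eta_test * af)) ** beta))

af = acceleration_factor(v_use=25.0, v_test=100.0, t_use_K=298.0, t_test_K=398.0)
print(af, weibull_reliability(t_hours=8.76e4, eta_test=500.0, beta=2.5, af=af))
```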

  5. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy, with the sequential local RS technique having considerably better computational efficiency.

  6. Reliability of Two Field-Based Tests for Measuring Cardiorespiratory Fitness in Preschool Children.

    PubMed

    Ayán, Carlos; Cancela, José M; Romero, Sonia; Alonso, Susana

    2015-10-01

    This study is aimed at analyzing the reliability of 2 field-based cardiorespiratory fitness tests when applied to a sample specifically made up of preschool-aged children. A total of 97 preschoolers (mean age: 4.36 ± 0.4 years; 50.5% girls) performed Course-Navette and Mini-Cooper tests 3 times (familiarization test and retest). The scores obtained were compared with the results provided by the 3-minute shuttle run test, which is considered to be a reliable field-based test for preschoolers. The Mini-Cooper test showed a high reliability for children aged 4 (intraclass correlation coefficient [ICC]: 0.942; 95% confidence interval [CI]: 0.903-0.965) and 5 years old (ICC: 0.946; 95% CI: 0.893-0.973). The reliability of Course-Navette was also high for both 4-year-old (ICC: 0.909; 95% CI: 0.849-0.945) and 5-year-old children (ICC: 0.889; 95% CI: 0.780-0.944). The mean scores of the 3-minute shuttle run test did not show a significant correlation with the mean scores obtained in the Mini-Cooper test and in the Course-Navette test in the 4-year-old children. The results of this study suggest that Course-Navette and Mini-Cooper tests are reliable measures of cardiorespiratory fitness that can be used to assess health-related fitness in preschool children. Nevertheless, some considerations must be taken into account before administering them. PMID:26402475

  7. The Bayesian Reliability Assessment and Prediction for Radar System Based on New Dirichlet Prior Distribution

    NASA Astrophysics Data System (ADS)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    This article studies Bayesian reliability growth models of complex systems based on a new Dirichlet prior distribution when the system sample size is small. The model briefly describes expert experience as a uniform distribution; the equivalent general Beta distribution of the uniform distribution can then be solved by an optimization method in which the prior parameters are variables, the mean is the constraint condition, and the variance is the optimization objective. The optimization method solves the problem of how to determine the values of the hyper-parameters of the new Dirichlet distribution when these parameters have no specific physical meaning. Because the multidimensional numerical integration of the posterior distribution is very difficult to calculate, WinBUGS software is employed to establish the Bayesian reliability growth model based on the new Dirichlet prior distribution, and two practical cases are studied under this model in order to prove its validity. The analysis results show that the model can improve the precision of calculation and is easy to use in engineering.

  8. On Improving the Reliability of Distribution Networks Based on Investment Scenarios Using Reference Networks

    NASA Astrophysics Data System (ADS)

    Kawahara, Koji

    Distribution systems are inherently monopolies and have therefore generally been regulated in order to protect customers and to ensure cost-effective operation. In the UK this is one of the functions of OFGEM (Office of Gas and Electricity Markets). Initially the regulation was based on the value of assets, but there is a trend nowadays towards performance-based regulation. In order to achieve this, a methodology is needed that enables the reliability performance associated with alternative investment strategies to be compared with the investment cost of these strategies. At present there is no accepted approach for such assessments. Building on the concept of reference networks proposed in Refs. (1), (2), this paper describes how these networks can be used to assess the impact that performance-driven investment strategies will have on the improvement in reliability indices. The method has been tested using the underground and overhead parts of a real system.

  9. Generalizability theory reliability of written expression curriculum-based measurement in universal screening.

    PubMed

    Keller-Margulis, Milena A; Mercer, Sterett H; Thomas, Erin L

    2016-09-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African American students, 17% Hispanic students, 8% Asian students, and 3% of students identified as 2 or more races. Of the sample, 8% were English Language Learners and 6% were students receiving special education. Three WE-CBM probes were administered for 7 min each at 3 time points across 1 year. Writing samples were scored for commonly used WE-CBM metrics (e.g., correct minus incorrect word sequences; CIWS). Results suggest that nearly half the variance in WE-CBM is related to unsystematic error and that conventional screening procedures (i.e., the use of one 3-min sample) do not yield scores with adequate reliability for relative or absolute decisions about student performance. In most grades, three 3-min writing samples (or 2 longer duration samples) were required for adequate reliability for relative decisions, and three 7-min writing samples would not yield adequate reliability for relative decisions about within-year student growth. Implications and recommendations are discussed. PMID:26322656
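
For context, the one-facet relative generalizability coefficient below (a standard G-theory result, not a formula quoted from the article) shows why averaging over more writing samples raises reliability:

```latex
% One-facet (persons x samples) relative G coefficient; standard G-theory
% background, included for context.
E\hat{\rho}^{2} = \frac{\hat{\sigma}^{2}_{p}}{\hat{\sigma}^{2}_{p} + \hat{\sigma}^{2}_{ps,e}/n_{s}}
```

Here \hat{\sigma}^{2}_{p} is the person (true-score) variance, \hat{\sigma}^{2}_{ps,e} the person-by-sample residual variance, and n_s the number of samples averaged; increasing n_s shrinks the error term, which is consistent with the finding that three samples achieve a reliability that a single 3-min sample does not.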

  10. Reliable and redundant FPGA based read-out design in the ATLAS TileCal Demonstrator

    SciTech Connect

    Akerstedt, Henrik; Muschter, Steffen; Drake, Gary; Anderson, Kelby; Bohm, Christian; Oreglia, Mark; Tang, Fukun

    2015-10-01

    The Tile Calorimeter at ATLAS [1] is a hadron calorimeter based on steel plates and scintillating tiles read out by PMTs. The current read-out system uses standard ADCs and custom ASICs to digitize and temporarily store the data on the detector. However, only a subset of the data is actually read out to the counting room. The on-detector electronics will be replaced around 2023. To achieve the required reliability the upgraded system will be highly redundant. Here the ASICs will be replaced with Kintex-7 FPGAs from Xilinx. This, in addition to the use of multiple 10 Gbps optical read-out links, will allow a full read-out of all detector data. Due to the higher radiation levels expected when the beam luminosity is increased, opportunities for repairs will be less frequent. The circuitry and firmware must therefore be designed for sufficiently high reliability using redundancy and radiation tolerant components. Within a year, a hybrid demonstrator including the new readout system will be installed in one slice of the ATLAS Tile Calorimeter. This will allow the proposed upgrade to be thoroughly evaluated well before the planned 2023 deployment in all slices, especially with regard to long term reliability. Different firmware strategies along with their integration in the demonstrator are presented in the context of high reliability protection against hardware malfunction and radiation induced errors.

  11. Mobile inertial sensor based gait analysis: Validity and reliability of spatiotemporal gait characteristics in healthy seniors.

    PubMed

    Donath, Lars; Faude, Oliver; Lichtenstein, Eric; Pagenstert, Geert; Nüesch, Corina; Mündermann, Annegret

    2016-09-01

    Gait analysis is commonly used to identify gait changes and fall risk in clinical populations and seniors. Body-worn inertial sensor based gait analyses provide a feasible alternative to optoelectronic and pressure-based measurements of spatiotemporal gait characteristics. We assessed validity and relative and absolute reliability of a body-worn inertial sensor system (RehaGait(®)) for measuring spatiotemporal gait characteristics compared to a standard stationary treadmill (Zebris(®)). Spatiotemporal gait parameters (walking speed, stride length, cadence and stride time) were collected for 24 healthy seniors (age: 75.3±6.7 years) tested on 2 days (1 week apart) simultaneously using the sensor based system and instrumented treadmill. Each participant completed walking tests (200 strides) at different walking speeds and slopes. The difference between the RehaGait(®) system and the treadmill was trivial (Cohen's d<0.2) except for speed and stride length at slow speed (Cohen's d, 0.35 and 0.49, respectively). Intraclass correlation coefficients (ICC) were excellent for temporal gait characteristics (cadence and stride time; ICC: 0.99-1.00) and moderate for stride length (ICC: 0.73-0.89). Both devices had excellent day-to-day reliability for all gait parameters (ICC: 0.82-0.99) except for stride length at slow speed (ICC: 0.74). The RehaGait(®) is a valid and reliable tool for assessing spatiotemporal gait parameters for treadmill walking at different speeds and slopes. PMID:27494305

  12. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters of other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.

  13. A genetic algorithm approach for assessing soil liquefaction potential based on reliability method

    NASA Astrophysics Data System (ADS)

    Bagheripour, M. H.; Shooshpasha, I.; Afzalirad, M.

    2012-02-01

    Deterministic approaches are unable to account for variations in soil strength properties and earthquake loads, or for sources of error in evaluations of liquefaction potential in sandy soils, which makes them questionable in comparison with reliability-based concepts. Furthermore, deterministic approaches are incapable of precisely relating the probability of liquefaction to the factor of safety (FS). Therefore, the use of probabilistic approaches, and especially reliability analysis, is considered, since a complementary solution is needed to reach better engineering decisions. In this study, the Advanced First-Order Second-Moment (AFOSM) technique, combined with a genetic algorithm (GA) and its corresponding optimization techniques, has been used to calculate the reliability index and the probability of liquefaction. The use of the GA provides a reliable mechanism suitable for computer programming and fast convergence. A new relation is developed here by which the liquefaction potential can be directly calculated based on the estimated probability of liquefaction (P_L), the cyclic stress ratio (CSR), and normalized standard penetration test (SPT) blow counts, with a mean error of less than 10% relative to the observational data. The validity of the proposed concept is examined through comparison of the results obtained by the new relation with those predicted by other investigators. A further advantage of the proposed relation is that it relates P_L and FS, and hence provides the possibility of decision making based on liquefaction risk together with the use of deterministic approaches. This could be beneficial to geotechnical engineers who use the common FS methods for evaluation of liquefaction. As an application, the city of Babolsar, located on the southern coast of the Caspian Sea, is investigated for liquefaction potential. The investigation is based primarily on in situ tests, in which the results of SPT are analysed.
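
Once AFOSM returns a reliability index, the conversion to a probability of liquefaction is the standard first-order mapping shown below; the GA's role in the paper is the design-point search, which this sketch assumes has already been done.

```python
# Hedged sketch: reliability index -> probability of liquefaction under the
# first-order normal approximation.
from scipy.stats import norm

def probability_of_liquefaction(beta):
    """P_L = Phi(-beta), the standard FORM/AFOSM mapping."""
    return norm.cdf(-beta)

for beta in (0.5, 1.0, 2.0, 3.0):
    print(f"beta = {beta:.1f} -> P_L = {probability_of_liquefaction(beta):.4f}")
```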

  14. Reliability-based structural optimization using response surface approximations and probabilistic sufficiency factor

    NASA Astrophysics Data System (ADS)

    Qu, Xueyong

    Uncertainties exist practically everywhere from structural design to manufacturing, product lifetime service, and maintenance. Uncertainties can be introduced by errors in modeling and simulation; by manufacturing imperfections (such as variability in material properties and structural geometric dimensions); and by variability in loading. Structural design by safety factors using nominal values without considering uncertainties may lead to designs that are either unsafe, or too conservative and thus not efficient. The focus of this dissertation is reliability-based design optimization (RBDO) of composite structures. Uncertainties are modeled by the probabilistic distributions of random variables. Structural reliability is evaluated in terms of the probability of failure. RBDO minimizes cost such as structural weight subject to reliability constraints. Since engineering structures usually have multiple failure modes, Monte Carlo simulation (MCS) was employed to calculate the system probability of failure. Response surface (RS) approximation techniques were used to solve the difficulties associated with MCS. The high computational cost of a large number of MCS samples was alleviated by analysis RS, and numerical noise in the results of MCS was filtered out by design RS. RBDO of composite laminates is investigated for use in hydrogen tanks in cryogenic environments. The major challenge is to reduce the large residual strains developed due to thermal mismatch between matrix and fibers while maintaining the load carrying capacity. RBDO is performed to provide laminate designs, quantify the effects of uncertainties on the optimum weight, and identify those parameters that have the largest influence on the optimum design. Studies of weight and reliability tradeoffs indicate that the most cost-effective measure for reducing weight and increasing reliability is quality control. A probabilistic sufficiency factor (PSF) approach was developed to improve the computational

  15. 75 FR 16098 - Reliable Power LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... Reference Room in Washington, DC. There is an eSubscription link on the Web site that enables subscribers to... Energy Regulatory Commission Reliable Power LLC; Supplemental Notice That Initial Market-Based Rate... notice in the above-referenced proceeding of Reliable Power, LLC's application for market-based...

  16. 76 FR 40722 - Granite Reliable Power, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-11

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Granite Reliable Power, LLC; Supplemental Notice That Initial Market-Based... above-referenced proceeding of Granite Reliable Power, LLC's application for market-based rate...

  17. A Reliability and Validity of an Instrument to Evaluate the School-Based Assessment System: A Pilot Study

    ERIC Educational Resources Information Center

    Ghazali, Nor Hasnida Md

    2016-01-01

    A valid, reliable and practical instrument is needed to evaluate the implementation of the school-based assessment (SBA) system. The aim of this study is to develop and assess the validity and reliability of an instrument to measure the perception of teachers towards the SBA implementation in schools. The instrument is developed based on a…

  18. Web-Based Assessment of Mental Well-Being in Early Adolescence: A Reliability Study

    PubMed Central

    Hamann, Christoph; Schultze-Lutter, Frauke

    2016-01-01

    Background The ever-increasing use of the Internet among adolescents represents an emerging opportunity for researchers to gain access to larger samples, which can be queried over several years longitudinally. Among adolescents, young adolescents (ages 11 to 13 years) are of particular interest to clinicians as this is a transitional stage, during which depressive and anxiety symptoms often emerge. However, it remains unclear whether these youngest adolescents can accurately answer questions about their mental well-being using a Web-based platform. Objective The aim of the study was to examine the accuracy of responses obtained from Web-based questionnaires by comparing Web-based with paper-and-pencil versions of depression and anxiety questionnaires. Methods The primary outcome was the score on the depression and anxiety questionnaires under two conditions: (1) paper-and-pencil and (2) Web-based versions. Twenty-eight adolescents (aged 11-13 years, mean age 12.78 years and SD 0.78; 18 females, 64%) were randomly assigned to complete either the paper-and-pencil or the Web-based questionnaire first. Intraclass correlation coefficients (ICCs) were calculated to measure intrarater reliability. Intraclass correlation coefficients were calculated separately for depression (Children’s Depression Inventory, CDI) and anxiety (Spence Children’s Anxiety Scale, SCAS) questionnaires. Results On average, it took participants 17 minutes (SD 6) to answer 116 questions online. Intraclass correlation coefficient analysis revealed high intrarater reliability when comparing Web-based with paper-and-pencil responses for both CDI (ICC=.88; P<.001) and the SCAS (ICC=.95; P<.001). According to published criteria, both of these values are in the “almost perfect” category indicating the highest degree of reliability. Conclusions The results of the study show an excellent reliability of Web-based assessment in 11- to 13-year-old children as compared with the standard paper

  19. Reliability and validity of the NeuroCognitive Performance Test, a web-based neuropsychological assessment

    PubMed Central

    Morrison, Glenn E.; Simone, Christa M.; Ng, Nicole F.; Hardy, Joseph L.

    2015-01-01

    The NeuroCognitive Performance Test (NCPT) is a brief, repeatable, web-based cognitive assessment platform that measures performance across several cognitive domains. The NCPT platform is modular and includes 18 subtests that can be arranged into customized batteries. Here we present normative data from a sample of 130,140 healthy volunteers for an NCPT battery consisting of 8 subtests. Participants took the NCPT remotely and without supervision. Factor structure and effects of age, education, and gender were evaluated with this normative dataset. Test-retest reliability was evaluated in a subset of participants who took the battery again an average of 78.8 days later. The eight NCPT subtests group into 4 putative cognitive domains, have adequate to good test-retest reliability, and are sensitive to expected age- and education-related cognitive effects. Concurrent validity to standard neuropsychological tests was demonstrated in 73 healthy volunteers. In an exploratory analysis the NCPT battery could differentiate those who self-reported Mild Cognitive Impairment or Alzheimer's disease from matched healthy controls. Overall these results demonstrate the reliability and validity of the NCPT battery as a measure of cognitive performance and support the feasibility of web-based, unsupervised testing, with potential utility in clinical and research settings. PMID:26579035

  20. Summary of Research on Reliability Criteria-Based Flight System Control

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva; Belcastro, Christine (Technical Monitor)

    2002-01-01

    This paper presents research on the reliability assessment of adaptive flight control systems. The topics include: 1) Overview of Project Focuses; 2) Reliability Analysis; and 3) Design for Reliability. This paper is presented in viewgraph form.

  1. Achieving a high-reliability organization through implementation of the ARCC model for systemwide sustainability of evidence-based practice.

    PubMed

    Melnyk, Bernadette Mazurek

    2012-01-01

    High-reliability health care organizations are those that provide care that is safe and that minimizes errors while achieving exceptional performance in quality and safety. This article presents major concepts and characteristics of a patient safety culture and of a high-reliability health care organization, and explains how building a culture of evidence-based practice can assist organizations in achieving high reliability. The ARCC (Advancing Research and Clinical practice through close Collaboration) model for systemwide implementation and sustainability of evidence-based practice is highlighted as a key strategy for achieving high reliability in health care organizations.

  2. Strategies for reliable second harmonic of nonlinear acoustic wave through cement-based materials

    NASA Astrophysics Data System (ADS)

    Xie, Fan; Guo, Zhiwei; Zhang, Jinwei

    2014-07-01

    Strategies for retrieving a reliable nonlinear second harmonic in cement-based materials are proposed in this paper, using a high-performance test system, piezoelectric transducers with central frequencies in the MHz range, monochromatic tone-burst excitation, and a robust data-processing method. The fundamental and second-order harmonics are measured to retrieve reliable acoustic nonlinearity, with the input level increased from ∼50 V to ∼280 V. About 173 repeated measurements were conducted to verify the stability of the experimental system. Specimens with three distinct aggregate sizes were used to measure the acoustic nonlinearity under uniaxial load. The results show a decrease in the measured acoustic nonlinearity at the early damage stage, then a slight increase when large cracks coalesce. The rapid increase in acoustic nonlinearity at the final stage indicates imminent failure. Our results also suggest that the nonlinear ultrasonic method is more sensitive than P-wave velocity for damage evaluation.

  3. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
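
A minimal sketch of the quantification idea, assuming independent link failures and illustrative data: with minimal cut sets in hand, the rare-event (first-order) upper bound on system failure probability is the sum over cut sets of the product of their link failure probabilities.

```python
# Hedged sketch: first-order bound on system failure probability from minimal
# cut sets, assuming independent link failures. Network data are made up.
from math import prod

def system_failure_bound(min_cut_sets, link_fail_prob):
    """Each cut set is a tuple of link names; all links assumed independent."""
    return sum(prod(link_fail_prob[l] for l in cut) for cut in min_cut_sets)

p = {"a": 1e-3, "b": 1e-3, "c": 5e-4, "d": 2e-3}
cuts = [("a", "b"), ("c",), ("b", "d")]
print(system_failure_bound(cuts, p))  # ~5.0e-4, dominated by the single-link cut
```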

  4. Trust sensor interface for improving reliability of EMG-based user intent recognition.

    PubMed

    Liu, Yuhong; Zhang, Fan; Sun, Yan Lindsay; Huang, He

    2011-01-01

    To achieve natural and smooth control of prostheses, electromyographic (EMG) signals have been investigated for decoding user intent. However, EMG signals can easily be contaminated by diverse disturbances, leading to errors in user intent recognition and threatening the safety of prosthesis users. To address this problem, we propose a trust sensor interface (TSI) that contains two modules: (1) an abnormality detector that detects diverse disturbances with high accuracy and low latency, and (2) a trust evaluation module that dynamically evaluates the reliability of EMG sensors. Based on the output of the TSI, the user intent recognition (UIR) algorithm is able to dynamically adjust its operations or decisions. Our experiments on an able-bodied subject demonstrated that the proposed TSI can effectively detect two types of disturbances (i.e., motion artifacts and baseline shifts) and improve the reliability of the UIR.

  5. Reliability and parasitic issues in GaN-based power HEMTs: a review

    NASA Astrophysics Data System (ADS)

    Meneghesso, G.; Meneghini, M.; Rossetto, I.; Bisi, D.; Stoffels, S.; Van Hove, M.; Decoutere, S.; Zanoni, E.

    2016-09-01

    Despite the potential of GaN-based power transistors, these devices still suffer from certain parasitic and reliability issues that limit their static and dynamic performance and the maximum switching frequency. The aim of this paper is to review our most recent results on the parasitic mechanisms that affect the performance of GaN-on-Si HEMTs; more specifically, we describe the following relevant processes: (i) trapping of electrons in the buffer, which is induced by off-state operation; (ii) trapping of hot electrons, which is promoted by semi-on state operation; (iii) trapping of electrons in the gate insulator, which is favored by the exposure to positive gate bias. Moreover, we will describe one of the most critical reliability aspects of Metal-Insulator-Semiconductor HEMTs (MIS-HEMTs), namely time-dependent dielectric breakdown.

  6. Analyzing and designing of reliable multicast based on FEC in distributed switch

    NASA Astrophysics Data System (ADS)

    Luo, Ting; Yu, Shaohua; Wang, Xueshun

    2008-11-01

    As businesses become more dependent on IP networks, many real-time services are adopted, and high availability in networks has become increasingly critical. With the development of carrier-grade Ethernet, the requirements on high-speed metro Ethernet devices are ever more urgent. In order to reach capacities of hundreds of Gbps or Tbps, most core Ethernet switches adopt a distributed control architecture and a large-capacity forwarding fabric. A working distributed switch has one CE and many FEs, so its internal traffic shows the pattern of multicast with one sender and many receivers. How to apply reliable multicast to the internal communication system of a distributed switch therefore deserves research. In this paper, we present the general architecture of a distributed Ethernet switch, focusing on the model of its internal communication subsystem. According to its characteristics, a novel reliable multicast communication mechanism based on an FEC recovery algorithm has been applied and evaluated in experiment.

  7. Reliability Assessment and Robustness Study for Key Navigation Components using Belief Rule Based System

    NASA Astrophysics Data System (ADS)

    You, Yuan; Wang, Liuying; Chang, Leilei; Ling, Xiaodong; Sun, Nan

    2016-02-01

    The gyro device is the key navigation component for maritime tracking and control, and gyro shift is the key factor that influences the performance of the gyro device, which makes conducting reliability analysis on the gyro device very important. For gyro device reliability analysis, residual life probability prediction plays an essential role, although it requires a complex process in existing studies. In this study the Belief Rule Base (BRB) system is applied to model the relationship between time as the input and residual life probability as the output. Two scenarios are designed to study the robustness of the proposed BRB prediction model. The comparative results show that the BRB prediction model performs better in Scenario II, where the new referenced values are predictable.

  8. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.

  9. Differential Evolution Based Intelligent System State Search Method for Composite Power System Reliability Evaluation

    NASA Astrophysics Data System (ADS)

    Bakkiyaraj, Ashok; Kumarappan, N.

    2015-09-01

    This paper presents a new approach for evaluating the reliability indices of a composite power system that adopts a binary differential evolution (BDE) algorithm in the search mechanism to select the system states. These states, also called dominant states, have large state probabilities and the higher loss-of-load curtailments necessary to maintain the real power balance. A chromosome of the BDE algorithm represents a system state. BDE is not applied in its traditional role of optimizing a non-linear objective function, but is used as a tool for exploring a greater number of dominant states by producing new chromosomes, mutant vectors, and trial vectors based on the fitness function. The searched system states are used to evaluate annualized system and load-point reliability indices. The proposed search methodology is applied to the RBTS and IEEE-RTS test systems, and the results are compared with other approaches. This approach evaluates indices similar to those of existing methods while analyzing fewer system states.
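
A sketch of one BDE step of the kind described, with a chromosome encoding a system state (one bit per component, up/down). The XOR-based mutation rule is a common binary-DE variant and stands in for whatever exact operators the paper uses; all data are synthetic.

```python
# Hedged sketch: binary differential evolution producing a trial system state.
import numpy as np

rng = np.random.default_rng(0)

def bde_offspring(pop, i, F=0.8, CR=0.5):
    """Create a trial state for population member i (pop: 2-D bool array)."""
    n_pop, n_bits = pop.shape
    r1, r2, r3 = rng.choice([j for j in range(n_pop) if j != i], 3, replace=False)
    diff = pop[r2] ^ pop[r3]                    # binary "difference" vector
    mutant = pop[r1] ^ (diff & (rng.random(n_bits) < F))
    cross = rng.random(n_bits) < CR
    cross[rng.integers(n_bits)] = True          # keep at least one mutant bit
    return np.where(cross, mutant, pop[i])

pop = rng.random((20, 10)) < 0.5                # 20 random 10-component states
print(bde_offspring(pop, i=0).astype(int))
```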

  10. Estimating Paleointensity Reliability Based on the Physical Mechanism of Natural Remanence

    NASA Astrophysics Data System (ADS)

    Smirnov, A. V.; Tarduno, J. A.

    2007-12-01

    Data on the long-term evolution of Earth's magnetic field intensity are crucial for understanding the geodynamo and planetary evolution. However, paleointensity remains one of the most difficult quantities to determine. The conventional Thellier method is based on the assumption that the paleointensity signal is carried by non-interacting single-domain (SD) magnetic grains that hold a thermal remanent magnetization (TRM). Most bulk rock samples, however, deviate from this ideal case. This departure, coupled with the desire to tap the relatively plentiful potential record held by bulk rocks, has led to the development of reliability criteria that largely rely on the observed NRM/TRM characteristics (Arai plots). While such methods may identify effects such as non-SD behavior and laboratory alteration, they assume that the paleointensity signal is a TRM. However, many paleointensity estimates in the current database are probably held by thermochemical remanent magnetizations (TCRMs) or crystallization remanent magnetizations (CRMs). Common processes that form such magnetizations include subsolidus reactions in magnetic grains during initial lava cooling (e.g., oxyexsolution), subsequent low-temperature oxidation (e.g., maghemitization), and the formation of secondary magnetic phases (e.g., hydrothermal magnetite). If unrecognized, such magnetizations can lead to large paleointensity underestimates or overestimates. In most cases, these processes cannot be identified using the Arai-based reliability controls. We suggest that additional criteria based on the physical mechanisms of recording and preserving the paleointensity signal should be utilized in order to assess the reliability of data. We introduce criteria, drawing on rock magnetic and other analytical techniques, based on whether the magnetization represents a TRM, TCRM, and/or CRM. While such a categorization is needed to make further progress in understanding the nominal paleointensity signal of bulk rocks, we

  11. Reliability Evaluation of Base-Metal-Electrode Multilayer Ceramic Capacitors for Potential Space Applications

    NASA Technical Reports Server (NTRS)

    Liu, David (Donhang); Sampson, Michael J.

    2011-01-01

    Base-metal-electrode (BME) ceramic capacitors are being investigated for possible use in high-reliability space-level applications. This paper focuses on how BME capacitors' construction and microstructure affect their lifetime and reliability. Examination of the construction and microstructure of commercial off-the-shelf (COTS) BME capacitors reveals great variance in dielectric layer thickness, even among BME capacitors with the same rated voltage. Compared to PME (precious-metal-electrode) capacitors, BME capacitors exhibit a denser and more uniform microstructure, with an average grain size between 0.3 and 0.5 µm, which is much less than that of most PME capacitors. BME capacitors can be fabricated with more internal electrode layers and thinner dielectric layers than PME capacitors because they have a fine-grained microstructure and do not shrink much during ceramic sintering. This makes it possible for BME capacitors to achieve a very high capacitance volumetric efficiency. The reliability of BME and PME capacitors was investigated using highly accelerated life testing (HALT). Most BME capacitors were found to fail with an early avalanche breakdown, followed by a regular dielectric wearout failure during the HALT test. When most of the early failures, characterized by avalanche breakdown, were removed, BME capacitors exhibited a minimum mean time-to-failure (MTTF) of more than 10^5 years at room temperature and rated voltage. Dielectric thickness was found to be a critical parameter for the reliability of BME capacitors. The number of stacked grains in a dielectric layer appears to play a significant role in determining BME capacitor reliability. Although dielectric layer thickness varies for a given rated voltage in BME capacitors, the number of stacked grains is relatively consistent, typically around 12 for a number of BME capacitors with a rated voltage of 25 V. This may suggest that the number of grains per dielectric layer is more critical than the

  12. WEAMR — A Weighted Energy Aware Multipath Reliable Routing Mechanism for Hotline-Based WSNs

    PubMed Central

    Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung

    2013-01-01

    Reliable source-to-sink communication is the most important requirement of an efficient routing protocol, especially in the domains of military, healthcare, and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy-aware multipath routing protocol which utilizes hotline-assisted routing to meet such requirements for mission-critical applications. The protocol reduces the average number of hops from source to destination and provides unmatched reliability compared with well-known reactive ad hoc protocols, i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on a weighted cost calculation and intelligently selects the best possible paths for data transmissions. The path cost calculation considers the end-to-end number of hops, the latency, and the minimum-energy node value on the path. In case of path failure, path recalculation is done efficiently, with minimum latency and control-packet overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and a higher packet delivery success ratio compared with AODV and AOMDV. The use of multipath routing also increases the overall lifetime of the WSN by using optimum-energy available paths between sender and receiver. PMID:23669714
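
The weighted cost idea can be sketched as below; the weights and the exact combination rule are illustrative assumptions, since the abstract does not publish them.

```python
# Hedged sketch: WEAMR-style path cost from hop count, latency, and the
# minimum residual-energy node on the path. Weights are made-up values.
def path_cost(hops, latency_ms, min_node_energy,
              w_hops=0.3, w_lat=0.3, w_en=0.4):
    """Lower is better; a path through a nearly drained node is penalised."""
    return (w_hops * hops + w_lat * latency_ms
            + w_en * (1.0 / max(min_node_energy, 1e-9)))

paths = {
    "A": dict(hops=4, latency_ms=30.0, min_node_energy=0.9),
    "B": dict(hops=3, latency_ms=25.0, min_node_energy=0.1),
}
best = min(paths, key=lambda k: path_cost(**paths[k]))
print(best)  # path A wins despite more hops: its weakest node has more energy
```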

  13. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing a Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of the individual service components; fitting these series with an autoregressive moving-average model (ARMA) and predicting the future firing rates of the service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
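
The ARMA step can be sketched with statsmodels as below; the series is synthetic, and the conversion of predicted response times into NMSPN firing rates is shown in its simplest reciprocal form, which is an assumption rather than the paper's exact mapping.

```python
# Hedged sketch: fit ARMA(2,1) to a component's historical response times and
# forecast the next values, then convert them to firing rates (rate = 1/time).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
response_times = 120 + np.cumsum(rng.normal(0, 1.5, size=200)) * 0.1  # ms

fit = ARIMA(response_times, order=(2, 0, 1)).fit()  # ARMA(2,1): d = 0
next_rt = fit.forecast(steps=5)                     # predicted response times
firing_rates = 1.0 / next_rt                        # assumed reciprocal mapping
print(firing_rates)
```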

  14. Reliability of team-based self-monitoring in critical events: a pilot study

    PubMed Central

    2013-01-01

    Background Teamwork is a critical component during critical events. Assessment is mandatory for remediation and to target training programmes for observed performance gaps. Methods The primary purpose was to test the feasibility of team-based self-monitoring of crisis resource management with a validated teamwork assessment tool. A secondary purpose was to assess item-specific reliability and content validity in order to develop a modified context-optimised assessment tool. We conducted a prospective, single-centre study to assess team-based self-monitoring of teamwork after in-situ inter-professional simulated critical events by comparison with an assessment by observers. The Mayo High Performance Teamwork Scale (MHPTS) was used as the assessment tool with evaluation of internal consistency, item-specific consensus estimates for agreement between participating teams and observers, and content validity. Results 105 participants and 58 observers completed the MHPTS after a total of 16 simulated critical events over 8 months. Summative internal consistency of the MHPTS calculated as Cronbach’s alpha was acceptable with 0.712 for observers and 0.710 for participants. Overall consensus estimates for dichotomous data (agreement/non-agreement) was 0.62 (Cohen’s kappa; IQ-range 0.31-0.87). 6/16 items had excellent (kappa > 0.8) and 3/16 good reliability (kappa > 0.6). Short questions concerning easy to observe behaviours were more likely to be reliable. The MHPTS was modified using a threshold for good reliability of kappa > 0.6. The result is a 9 item self-assessment tool (TeamMonitor) with a calculated median kappa of 0.86 (IQ-range: 0.67-1.0) and good content validity. Conclusions Team-based self-monitoring with the MHPTS to assess team performance during simulated critical events is feasible. A context-based modification of the tool is achievable with good internal consistency and content validity. Further studies are needed to investigate if team-based

  16. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced, shifting the methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper, the authors explain a novel methodology for quantifying risk and ranking the critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the risk priority number (RPN) values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient, called the hazardous risk coefficient, covers anticipated hazards which may occur in the future; this risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient, hence `random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items produced by the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.
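
    The statistical-addition and ranking step can be sketched with a generic Monte Carlo illustration (the coefficient ranges, item names and uniform sampling are assumptions; the paper's exact coefficient formulas are not reproduced here):

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical controlling ranges of the three risk coefficients for two
      # critical items, as would be obtained from repeated tests.
      items = {
          "bearing": {"likelihood": (0.30, 0.50), "abstract": (0.20, 0.35),
                      "hazardous": (0.10, 0.25)},
          "gearbox": {"likelihood": (0.25, 0.40), "abstract": (0.30, 0.45),
                      "hazardous": (0.15, 0.30)},
      }

      def final_risk(ranges, n=100_000):
          # Draw each coefficient within its controlling range and statistically
          # add them to one distinctive final risk coefficient per item.
          total = sum(rng.uniform(lo, hi, n) for lo, hi in ranges.values())
          return total.mean()

      ranking = sorted(items, key=lambda name: final_risk(items[name]), reverse=True)
      print(ranking)   # items ordered by final risk coefficient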

  17. [The reliability of reliability].

    PubMed

    Blancas Espejo, A

    1991-01-01

    The author critically analyzes an article by Rodolfo Corona Vazquez that questions the reliability of the preliminary results of the Eleventh Census of Population and Housing, conducted in Mexico in March 1990. The need to define what constitutes "reliability" for preliminary results is stressed. PMID:12317739

  18. Validity and reliability of an IMU-based method to detect APAs prior to gait initiation.

    PubMed

    Mancini, Martina; Chiari, Lorenzo; Holmstrom, Lars; Salarian, Arash; Horak, Fay B

    2016-01-01

    Anticipatory postural adjustments (APAs) prior to gait initiation have been largely studied in traditional, laboratory settings using force plates under the feet to characterize the displacement of the center of pressure. However clinical trials and clinical practice would benefit from a portable, inexpensive method for characterizing APAs. Therefore, the main objectives of this study were (1) to develop a novel, automatic IMU-based method to detect and characterize APAs during gait initiation and (2) to measure its test-retest reliability. Experiment I was carried out in the laboratory to determine the validity of the IMU-based method in 10 subjects with PD (OFF medication) and 12 control subjects. Experiment II was carried out in the clinic, to determine test-retest reliability of the IMU-based method in a different set of 17 early-to-moderate, treated subjects with PD (tested ON medication) and 17 age-matched control subjects. Results showed that gait initiation characteristics (both APAs and 1st step) detected with our novel method were significantly correlated to the characteristics calculated with a force plate and motion analysis system. The size of APAs measured with either inertial sensors or force plate was significantly smaller in subjects with PD than in control subjects (p<0.05). Test-retest reliability for the gait initiation characteristics measured with inertial sensors was moderate-to-excellent (0.56

  19. How to Characterize the Reliability of Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2015-01-01

    The reliability of an MLCC device is the product of a time-dependent part and a time-independent part: 1) the time-dependent part is a statistical distribution; 2) the time-independent part is the reliability at t = 0, the initial reliability. Initial reliability depends only on how a BME MLCC is designed and processed. Similar to the way the minimum dielectric thickness ensured the long-term reliability of a PME MLCC, the initial reliability also ensures the long-term reliability of a BME MLCC. This presentation shows new discoveries regarding commonalities and differences between PME and BME capacitor technologies.

  1. Reliability-based optimization of maintenance scheduling of mechanical components under fatigue

    PubMed Central

    Beaurepaire, P.; Valdebenito, M.A.; Schuëller, G.I.; Jensen, H.A.

    2012-01-01

    This study presents the optimization of the maintenance scheduling of mechanical components under fatigue loading. The cracks of damaged structures may be detected during non-destructive inspection and subsequently repaired. Fatigue crack initiation and growth show inherent variability, as does the outcome of inspection activities. The problem is addressed within the framework of reliability-based optimization. The initiation and propagation of fatigue cracks are efficiently modeled using cohesive zone elements. The applicability of the method is demonstrated by a numerical example, which involves a plate with two holes subject to alternating stress. PMID:23564979

  2. Effect of Clinically Discriminating, Evidence-Based Checklist Items on the Reliability of Scores from an Internal Medicine Residency OSCE

    ERIC Educational Resources Information Center

    Daniels, Vijay J.; Bordage, Georges; Gierl, Mark J.; Yudkowsky, Rachel

    2014-01-01

    Objective structured clinical examinations (OSCEs) are used worldwide for summative examinations but often lack acceptable reliability. Research has shown that reliability of scores increases if OSCE checklists for medical students include only clinically relevant items. Also, checklists are often missing evidence-based items that high-achieving…

  3. Reliability and Validity of Assessing User Satisfaction With Web-Based Health Interventions

    PubMed Central

    Lehr, Dirk; Reis, Dorota; Vis, Christiaan; Riper, Heleen; Berking, Matthias; Ebert, David Daniel

    2016-01-01

    Background The perspective of users should be taken into account in the evaluation of Web-based health interventions. Assessing the users’ satisfaction with the intervention they receive could enhance the evidence for the intervention effects. Thus, there is a need for valid and reliable measures to assess satisfaction with Web-based health interventions. Objective The objective of this study was to analyze the reliability, factorial structure, and construct validity of the Client Satisfaction Questionnaire adapted to Internet-based interventions (CSQ-I). Methods The psychometric quality of the CSQ-I was analyzed in user samples from 2 separate randomized controlled trials evaluating Web-based health interventions, one from a depression prevention intervention (sample 1, N=174) and the other from a stress management intervention (sample 2, N=111). At first, the underlying measurement model of the CSQ-I was analyzed to determine the internal consistency. The factorial structure of the scale and the measurement invariance across groups were tested by multigroup confirmatory factor analyses. Additionally, the construct validity of the scale was examined by comparing satisfaction scores with the primary clinical outcome. Results Multigroup confirmatory analyses on the scale yielded a one-factorial structure with a good fit (root-mean-square error of approximation =.09, comparative fit index =.96, standardized root-mean-square residual =.05) that showed partial strong invariance across the 2 samples. The scale showed very good reliability, indicated by McDonald omegas of .95 in sample 1 and .93 in sample 2. Significant correlations with change in depressive symptoms (r=−.35, P<.001) and perceived stress (r=−.48, P<.001) demonstrated the construct validity of the scale. Conclusions The proven internal consistency, factorial structure, and construct validity of the CSQ-I indicate a good overall psychometric quality of the measure to assess the user’s general

  4. The Accessibility, Usability, and Reliability of Chinese Web-Based Information on HIV/AIDS

    PubMed Central

    Niu, Lu; Luo, Dan; Liu, Ying; Xiao, Shuiyuan

    2016-01-01

    Objective: The present study was designed to assess the quality of Chinese-language Internet-based information on HIV/AIDS. Methods: We entered the following search terms, in Chinese, into Baidu and Sogou: “HIV/AIDS”, “symptoms”, and “treatment”, and evaluated the first 50 hits of each query using the Minervation validation instrument (LIDA tool) and DISCERN instrument. Results: Of the 900 hits identified, 85 websites were included in this study. The overall score of the LIDA tool was 63.7%; the mean score of accessibility, usability, and reliability was 82.2%, 71.5%, and 27.3%, respectively. Of the top 15 sites according to the LIDA score, the mean DISCERN score was calculated at 43.1 (95% confidence intervals (CI) = 37.7–49.5). Noncommercial websites showed higher DISCERN scores than commercial websites; whereas commercial websites were more likely to be found in the first 20 links obtained from each search engine than the noncommercial websites. Conclusions: In general, the HIV/AIDS related Chinese-language websites have poor reliability, although their accessibility and usability are fair. In addition, the treatment information presented on Chinese-language websites is far from sufficient. There is an imperative need for professionals and specialized institutes to improve the comprehensiveness of web-based information related to HIV/AIDS. PMID:27556475

  5. Spectrum survey for reliable communications of cognitive radio based smart grid network

    NASA Astrophysics Data System (ADS)

    Farah Aqilah, Wan; Jayavalan, Shanjeevan; Mohd Aripin, Norazizah; Mohamad, Hafizal; Ismail, Aiman

    2013-06-01

    The smart grid (SG) system is expected to involve a huge amount of data with different levels of priority for different applications or users. The traditional grid, which tends to deploy proprietary networks with limited coverage and bandwidth, is not sufficient to support a large-scale SG network. Cognitive radio (CR) is a promising communication platform for the SG network, as it can utilize potentially all available spectrum resources, subject to interference constraints. In order to develop a reliable communication framework for a CR-based SG network, thorough investigations of the current radio spectrum are required. This paper presents the spectrum utilization in Malaysia, specifically in the UHF/VHF bands, cellular (GSM 900, GSM 1800 and 3G), WiMAX, ISM and LTE bands. The goal is to determine the potential spectrum that can be exploited by CR users in the SG network. Measurements were conducted for 24 hours to quantify the average spectrum usage and the amount of available bandwidth. The findings in this paper are important to provide insight into actual spectrum utilization prior to developing a reliable communication platform for a CR-based SG network.

  6. Reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization

    SciTech Connect

    Shi, Xin; Zhao, Xiangmo; Hui, Fei; Ma, Junyan; Yang, Lan

    2014-10-06

    Clock synchronization in wireless sensor networks (WSNs) has been studied extensively in recent years, and many protocols have been put forward from the standpoint of statistical signal processing, which is an effective way to optimize accuracy. However, the accuracy derived from statistical data can be improved mainly through sufficient packet exchange, which greatly consumes the limited power resources. In this paper, a reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization is proposed to optimize sync accuracy without expending additional sync packets. As a contribution, a linear weighted fusion scheme for multiple clock deviations is constructed with collaborative sensing of clock timestamps, and the fusion weights are defined by the covariance of the sync errors for the different clock deviations. Extensive simulation results show that the proposed approach can achieve better performance in terms of sync overhead and sync accuracy.
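
    When the sync errors of the individual clock-deviation estimates are uncorrelated, covariance-defined fusion weights reduce to inverse-variance weighting, which minimizes the variance of the fused estimate. A minimal sketch under that assumption (the offsets and variances are illustrative):

      import numpy as np

      # Clock-offset estimates (ms) for one node from several pairwise broadcast
      # synchronization exchanges, with the error variance of each estimate.
      offsets  = np.array([1.92, 2.10, 1.85, 2.05])
      err_vars = np.array([0.04, 0.09, 0.03, 0.12])

      # Linear weighted fusion: weights inversely proportional to error variance.
      weights   = (1.0 / err_vars) / np.sum(1.0 / err_vars)
      fused     = np.sum(weights * offsets)
      fused_var = 1.0 / np.sum(1.0 / err_vars)
      print(f"fused offset = {fused:.3f} ms, variance = {fused_var:.4f}")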

  7. Reliability of file-based retrospective ratings of psychopathy with the PCL-R.

    PubMed

    Grann, M; Långström, N; Tengström, A; Stålenheim, E G

    1998-06-01

    A rapidly emerging consensus recognizes Hare's Psychopathy Checklist-Revised (PCL-R; Hare, 1991) as the most valid and useful instrument to assess psychopathy (Fulero, 1995; Stone, 1995). We compared independent clinical PCL-R ratings of 40 forensic adult male criminal offenders to retrospective file-only ratings. File-based PCL-R ratings, in comparison to the clinical ratings, yielded categorical psychopathy diagnoses with a sensitivity of .57 and a specificity of .96. The intraclass correlation (ICC) of the total scores as estimated by ICC(2,1) was .88, and was markedly better on Factor 2, ICC(2,1) = .89, than on Factor 1, ICC(2,1) = .69. The findings support the belief that for research purposes, file-only PCL-R ratings based on Swedish forensic psychiatric investigation records can be made with good alternate-form reliability.

  8. Physics of Failure Analysis of Xilinx Flip Chip CCGA Packages: Effects of Mission Environments on Properties of LP2 Underfill and ATI Lid Adhesive Materials

    NASA Technical Reports Server (NTRS)

    Suh, Jong-ook

    2013-01-01

    The Xilinx Virtex 4QV and 5QV (V4 and V5) are next-generation field-programmable gate arrays (FPGAs) for space applications. However, there have been concerns within the space community regarding the non-hermeticity of V4/V5 packages; polymeric materials such as the underfill and lid adhesive will be directly exposed to the space environment. In this study, reliability concerns associated with the non-hermeticity of V4/V5 packages were investigated by studying the properties and behavior of the underfill and the lid adhesive materials used in V4/V5 packages.

  9. Reliability and validity of an internet-based questionnaire measuring lifetime physical activity.

    PubMed

    De Vera, Mary A; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek

    2010-11-15

    Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005-2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity.

  10. A Reliability Test of a Complex System Based on Empirical Likelihood

    PubMed Central

    Zhang, Jun; Hui, Yongchang

    2016-01-01

    To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic of the complex system and extract the limit distribution of the test statistic. Therefore, we can obtain the confidence interval for reliability and make statistical inferences. The simulation studies also demonstrate the theorem results. PMID:27760130
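
    The minimal-path definition of system reliability that the test targets can be illustrated with a short Monte Carlo sketch (the bridge-type paths and component probabilities are hypothetical, and the empirical likelihood machinery itself is not reproduced):

      import numpy as np

      rng = np.random.default_rng(0)

      # A bridge-type system described by its minimal paths (component indices).
      minimal_paths = [[0, 3], [1, 4], [0, 2, 4], [1, 2, 3]]
      p = np.array([0.95, 0.90, 0.85, 0.92, 0.88])    # component reliabilities

      def system_reliability(n=200_000):
          up = rng.random((n, p.size)) < p             # sampled component states
          works = np.zeros(n, dtype=bool)
          for path in minimal_paths:
              works |= up[:, path].all(axis=1)         # system works if any path works
          return works.mean()

      print(system_reliability())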

  11. Effect of clinically discriminating, evidence-based checklist items on the reliability of scores from an Internal Medicine residency OSCE.

    PubMed

    Daniels, Vijay J; Bordage, Georges; Gierl, Mark J; Yudkowsky, Rachel

    2014-10-01

    Objective structured clinical examinations (OSCEs) are used worldwide for summative examinations but often lack acceptable reliability. Research has shown that reliability of scores increases if OSCE checklists for medical students include only clinically relevant items. Also, checklists are often missing evidence-based items that high-achieving learners are more likely to use. The purpose of this study was to determine if limiting checklist items to clinically discriminating items and/or adding missing evidence-based items improved score reliability in an Internal Medicine residency OSCE. Six internists reviewed the traditional checklists of four OSCE stations classifying items as clinically discriminating or non-discriminating. Two independent reviewers augmented checklists with missing evidence-based items. We used generalizability theory to calculate overall reliability of faculty observer checklist scores from 45 first and second-year residents and predict how many 10-item stations would be required to reach a Phi coefficient of 0.8. Removing clinically non-discriminating items from the traditional checklist did not affect the number of stations (15) required to reach a Phi of 0.8 with 10 items. Focusing the checklist on only evidence-based clinically discriminating items increased test score reliability, needing 11 stations instead of 15 to reach 0.8; adding missing evidence-based clinically discriminating items to the traditional checklist modestly improved reliability (needing 14 instead of 15 stations). Checklists composed of evidence-based clinically discriminating items improved the reliability of checklist scores and reduced the number of stations needed for acceptable reliability. Educators should give preference to evidence-based items over non-evidence-based items when developing OSCE checklists.
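
    The decision-study logic behind "how many stations reach a Phi of 0.8" can be sketched with a simplified projection in which the absolute error variance scales as 1/n over stations (the variance components below are hypothetical; the paper's full generalizability analysis involves more facets):

      import math

      def stations_needed(var_person, var_error, target_phi=0.8):
          # Phi(n) = var_person / (var_person + var_error / n); solve Phi(n) >= target.
          n = (target_phi / (1.0 - target_phi)) * var_error / var_person
          return math.ceil(n)

      # Illustrative variance components for a single 10-item station.
      print(stations_needed(var_person=0.9, var_error=4.0))   # -> 18 stations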

  12. Determine the optimal carrier selection for a logistics network based on multi-commodity reliability criterion

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Yeh, Cheng-Ta

    2013-05-01

    From the perspective of supply chain management, the selected carrier plays an important role in freight delivery. This article proposes a new criterion of multi-commodity reliability and optimises the carrier selection based on such a criterion for logistics networks with routes and nodes, over which multiple commodities are delivered. Carrier selection concerns the selection of exactly one carrier to deliver freight on each route. The capacity of each carrier has several available values associated with a probability distribution, since some of a carrier's capacity may be reserved for various orders. Therefore, the logistics network, given any carrier selection, is a multi-commodity multi-state logistics network. Multi-commodity reliability is defined as a probability that the logistics network can satisfy a customer's demand for various commodities, and is a performance indicator for freight delivery. To solve this problem, this study proposes an optimisation algorithm that integrates genetic algorithm, minimal paths and Recursive Sum of Disjoint Products. A practical example in which multi-sized LCD monitors are delivered from China to Germany is considered to illustrate the solution procedure.

  13. A Compact Forearm Crutch Based on Force Sensors for Aided Gait: Reliability and Validity.

    PubMed

    Chamorro-Moriana, Gema; Sevillano, José Luis; Ridao-Fernández, Carmen

    2016-01-01

    Frequently, patients who suffer injuries to a lower limb require forearm crutches in order to partially unload weight-bearing. These lesions cause pain when the lower limb is loaded, and their progression should be monitored objectively, since significant errors in unloading accuracy can lead to complications and after-effects. A new and feasible tool that allows clinicians to control and improve the accuracy of the loads exerted on crutches during aided gait is therefore necessary, so as to unburden the lower limbs. In this paper, we describe such a system based on a force sensor, which we have named the GCH System 2.0. Furthermore, we determine the validity and reliability of measurements obtained using this tool via a comparison with the validated AMTI (Advanced Mechanical Technology, Inc., Watertown, MA, USA) OR6-7-2000 Platform. An intra-class correlation coefficient demonstrated excellent agreement between the AMTI Platform and the GCH System. A regression line to determine the predictive ability of the GCH System towards the AMTI Platform was found, which obtained a precision of 99.3%. A detailed statistical analysis is presented for all the measurements, segregated also by the requested loads on the crutches (10%, 25% and 50% of body weight). Our results show that our system, designed for assessing the loads exerted by patients on forearm crutches during assisted gait, provides valid and reliable measurements of loads. PMID:27338396

  15. Novel bandgap-based under-voltage-lockout methods with high reliability

    NASA Astrophysics Data System (ADS)

    Yongrui, Zhao; Xinquan, Lai

    2013-10-01

    Highly reliable bandgap-based under-voltage-lockout (UVLO) methods are presented in this paper. The proposed under-voltage-state-to-signal conversion methods take full advantage of the high temperature stability of the bandgap and of enhanced low-voltage protection methods which protect the core circuit from erroneous operation; moreover, a common-source stage amplifier method is introduced to expand the output voltage range. All of these methods are verified in a UVLO circuit fabricated with a 0.5 μm standard BCD process technology. The experimental results show that the proposed bandgap method exhibits a good temperature coefficient of 20 ppm/°C, which ensures that the UVLO keeps a stable output until the under-voltage state changes. Moreover, at room temperature, the high threshold voltage VTH+ generated by the UVLO is 12.3 V with a maximum drift voltage of ±80 mV, and the low threshold voltage VTH- is 9.5 V with a maximum drift voltage of ±70 mV. The low-voltage protection method used in the circuit also brings high reliability when the supply voltage is very low.
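
    The reported thresholds imply a 2.8 V hysteresis window. A behavioral sketch of that lockout logic (not the transistor-level circuit; the supply sweep values are illustrative):

      def uvlo_state(v_supply, enabled, vth_high=12.3, vth_low=9.5):
          """Hysteretic under-voltage lockout using the measured thresholds:
          enable above VTH+, lock out below VTH-, hold state in between."""
          if not enabled and v_supply > vth_high:
              return True           # supply recovered: release the lockout
          if enabled and v_supply < vth_low:
              return False          # under-voltage: lock out the core circuit
          return enabled            # inside the hysteresis window: hold state

      state = False
      for v in [8.0, 10.0, 12.0, 12.5, 11.0, 10.0, 9.4, 8.0]:
          state = uvlo_state(v, state)
          print(f"{v:5.1f} V -> {'ON' if state else 'LOCKOUT'}")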

  16. Reliability analysis of laser ultrasonics for train axle diagnostics based on model assisted POD curves

    NASA Astrophysics Data System (ADS)

    Malik, M. S.; Cavuto, A.; Martarelli, M.; Pandarese, G.; Revel, G. M.

    2014-05-01

    High-speed train axles are integrated for a lifetime, and conducting in-service inspection with high accuracy is time- and resource-consuming. Laser ultrasonics, a subset of non-contact measuring methods, is a proposed solution that is effective also for hard-to-reach areas and has recently proved effective using a Laser Doppler Vibrometer (LDV) or air-coupled probes in reception. A reliability analysis of laser ultrasonics for this specific application is performed here. The research is mainly based on a numerical study of the effect of high-energy laser pulses on the surface of a steel axle and of the behavior of the ultrasonic waves in detecting possible defects. The Probability of Detection (POD) concept is used as an estimate of the reliability of the inspection method. In particular, Model Assisted Probability of Detection (MAPOD), a modified form of POD in which models are used to infer results for a decisive statistical POD curve, is adopted here. This paper implements this approach by taking the inputs from limited experiments conducted on a high-speed train axle using laser ultrasonics (source: pulsed Nd:YAG; reception: high-frequency LDV) to calibrate a multiphysics FE model, and by using the calibrated model to generate data samples statistically representative of damaged train axles. The simulated flaws are in accordance with the real defects present on the axle. A set of flaws of different depths has been modeled in order to assess the laser ultrasonics POD for this specific application.
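
    POD curves in hit/miss studies are commonly fitted with a log-normal (probit) model, POD(a) = Φ((ln a − μ)/σ). A sketch of that fit by maximum likelihood (the flaw depths and hit/miss outcomes below are hypothetical, not the paper's data):

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      depth = np.array([0.2, 0.3, 0.5, 0.8, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])  # mm
      hit   = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

      def negloglik(params):
          mu, log_sigma = params
          pod = norm.cdf((np.log(depth) - mu) / np.exp(log_sigma))
          pod = np.clip(pod, 1e-9, 1 - 1e-9)
          return -np.sum(hit * np.log(pod) + (1 - hit) * np.log(1 - pod))

      res = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
      mu, sigma = res.x[0], np.exp(res.x[1])
      a90 = np.exp(mu + norm.ppf(0.9) * sigma)   # flaw size detected with 90% POD
      print(f"a90 = {a90:.2f} mm")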

  17. Improving the reliability of diagnostic tests in population-based agreement studies

    PubMed Central

    Nelson, Kerrie P.; Edwards, Don

    2016-01-01

    Many large-scale studies have recently been carried out to assess the reliability of diagnostic procedures, such as mammography for the detection of breast cancer. The large numbers of raters and subjects involved raise new challenges in how to measure agreement in these types of studies. An important motivator of these studies is the identification of factors that contribute to the often wide discrepancies observed between raters’ classifications, such as a rater’s experience, in order to improve the reliability of the diagnostic process of interest. Incorporating covariate information into the agreement model is a key component in addressing these questions. Few agreement models are currently available that jointly model larger numbers of raters and subjects and incorporate covariate information. In this paper, we extend a recently developed population-based model and measure of agreement for binary ratings to incorporate covariate information using the class of generalized linear mixed models with a probit link function. Important information on factors related to the subjects and raters can be included as fixed and/or random effects in the model. We demonstrate how agreement can be assessed between subgroups of the raters and/or subjects, for example, comparing agreement between experienced and less experienced raters. Simulation studies are carried out to test the performance of the proposed models and measures of agreement. Application to a large-scale breast cancer study is presented. PMID:20128018
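
    A probit-link GLMM of the kind described can be written as follows (a sketch consistent with the abstract; the covariate vector and the crossed random-effect structure are illustrative):

      \Phi^{-1}\!\bigl(\Pr(Y_{ij}=1 \mid u_i, v_j)\bigr)
        = \beta_0 + \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_i + v_j,
      \qquad u_i \sim N(0,\sigma_s^2), \quad v_j \sim N(0,\sigma_r^2),

    where Y_ij is rater j's binary classification of subject i, x_ij collects subject- and rater-level covariates such as experience, and the variance components drive the model-based agreement measure compared across subgroups.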

  18. On the Reliability of a Solitary Wave Based Transducer to Determine the Characteristics of Some Materials

    PubMed Central

    Deng, Wen; Nasrollahi, Amir; Rizzo, Piervincenzo; Li, Kaiyuan

    2015-01-01

    In the study presented in this article we investigated the feasibility and the reliability of a transducer design for the nondestructive evaluation (NDE) of the stiffness of structural materials. The NDE method is based on the propagation of highly nonlinear solitary waves (HNSWs) along a one-dimensional chain of spherical particles that is in contact with the material to be assessed. The chain is part of a built-in system designed and assembled to excite and detect HNSWs, and to exploit the dynamic interaction between the particles and the material to be inspected. This interaction influences the time-of-flight and the amplitude of the solitary pulses reflected at the transducer/material interface. The results of this study show that certain features of the waves are dependent on the modulus of elasticity of the material and that the built-in system is reliable. In the future the proposed NDE method may provide a cost-effective tool for the rapid assessment of materials’ modulus. PMID:26703617

  19. Reliable pre-eclampsia pathways based on multiple independent microarray data sets.

    PubMed

    Kawasaki, Kaoru; Kondoh, Eiji; Chigusa, Yoshitsugu; Ujita, Mari; Murakami, Ryusuke; Mogami, Haruta; Brown, J B; Okuno, Yasushi; Konishi, Ikuo

    2015-02-01

    Pre-eclampsia is a multifactorial disorder characterized by heterogeneous clinical manifestations. Gene expression profiling of preeclamptic placentas has provided different and even opposite results, partly because the data are compromised by various experimental artefacts. Here we aimed to identify reliable pre-eclampsia-specific pathways using multiple independent microarray data sets. Gene expression data of control and preeclamptic placentas were obtained from Gene Expression Omnibus. Single-sample gene-set enrichment analysis was performed to generate gene-set activation scores of 9707 pathways obtained from the Molecular Signatures Database. Candidate pathways were identified by t-test-based screening using data sets GSE10588, GSE14722 and GSE25906. Additionally, recursive feature elimination was applied to arrive at a further reduced set of pathways. To assess the validity of the pre-eclampsia pathways, a statistically-validated protocol was executed using five data sets, including two independent validation data sets, GSE30186 and GSE44711. Quantitative real-time PCR was performed for genes in a panel of potential pre-eclampsia pathways using placentas of 20 women with normal or severe preeclamptic singleton pregnancies (n = 10, respectively). A panel of ten pathways was found to discriminate women with pre-eclampsia from controls with high accuracy. Among these were pathways not previously associated with pre-eclampsia, such as the GABA receptor pathway, as well as pathways that have already been linked to pre-eclampsia, such as the glutathione and CDKN1C pathways. mRNA expression of GABRA3 (GABA receptor pathway), GCLC and GCLM (glutathione metabolic pathway), and CDKN1C was significantly reduced in the preeclamptic placentas. In conclusion, ten accurate and reliable pre-eclampsia pathways were identified based on multiple independent microarray data sets. A pathway-based classification may be a worthwhile approach to elucidate the pathogenesis of pre-eclampsia.

  20. An Examination of Temporal Trends in Electricity Reliability Based on Reports from U.S. Electric Utilities

    SciTech Connect

    Eto, Joseph H.; LaCommare, Kristina Hamachi; Larsen, Peter; Todd, Annika; Fisher, Emily

    2012-01-06

    Since the 1960s, the U.S. electric power system has experienced a major blackout about once every 10 years. Each has been a vivid reminder of the importance society places on the continuous availability of electricity and has led to calls for changes to enhance reliability. At the root of these calls are judgments about what reliability is worth and how much should be paid to ensure it. In principle, comprehensive information on the actual reliability of the electric power system and on how proposed changes would affect reliability ought to help inform these judgments. Yet, comprehensive, national-scale information on the reliability of the U.S. electric power system is lacking. This report helps to address this information gap by assessing trends in U.S. electricity reliability based on information reported by electric utilities on power interruptions experienced by their customers. Our research augments prior investigations, which focused only on power interruptions originating in the bulk power system, by considering interruptions originating both from the bulk power system and from within local distribution systems. Our research also accounts for differences among utility reliability reporting practices by employing statistical techniques that remove the influence of these differences on the trends that we identify. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. The questions analyzed include: 1. Are there trends in reported electricity reliability over time? 2. How are trends in reported electricity reliability affected by the installation or upgrade of an automated outage management system? 3. How are trends in reported electricity reliability affected by the use of IEEE Standard 1366-2003?

  1. Design and implementation of reliability evaluation of SAS hard disk based on RAID card

    NASA Astrophysics Data System (ADS)

    Ren, Shaohua; Han, Sen

    2015-10-01

    Because of its huge advantages for storage, RAID technology has been widely used. However, a question associated with this technology is that hard disks behind a RAID card cannot be queried directly by the operating system. How to read the self-monitoring information and log data of such a disk has therefore been a problem, although these data are necessary for hard disk reliability testing. Traditionally, this information could be read only for SATA hard disks, not for SAS hard disks. In this paper, we provide a method that uses the LSI RAID card's Application Program Interface to communicate with the RAID card and analyze the feedback data, thereby obtaining the information necessary to assess a SAS hard disk.
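
    The same pass-through idea is exposed by the open-source smartmontools, which can address disks behind LSI/MegaRAID controllers; the sketch below swaps that tool in for the vendor API described in the paper (the controller device path and device ID are placeholders that depend on the installation):

      import subprocess

      def sas_disk_health(controller_dev="/dev/sda", device_id=0):
          """Read SMART/log data of a SAS disk hidden behind an LSI/MegaRAID
          card via smartmontools' megaraid pass-through; device_id is the
          slot number reported by the RAID management tool."""
          cmd = ["smartctl", "-a", "-d", f"megaraid,{device_id}", controller_dev]
          return subprocess.run(cmd, capture_output=True, text=True).stdout

      print(sas_disk_health())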

  2. Comparative analysis of different configurations of PLC-based safety systems from reliability point of view

    NASA Technical Reports Server (NTRS)

    Tapia, Moiez A.

    1993-01-01

    The study presents a comparative analysis, from a reliability point of view, of distinct multiplex and fault-tolerant configurations for a PLC-based safety system. It considers simplex, duplex and fault-tolerant triple-redundancy configurations. In the duplex configuration, the standby unit has a failure rate which is k times the failure rate of the main unit, with the value of k varying from 0 to 1. For distinct values of MTTR and MTTF of the main unit, the MTBF and availability of these configurations are calculated. The effect on the MTBF of duplexing only the PLC module, or only the sensor and actuator module, is also presented. The results are summarized, and the merits and demerits of the various configurations under distinct environments are discussed.
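
    For a non-repairable warm-standby pair, this k enters the mean time to failure in a standard way (a textbook result included here for orientation, not taken from the paper): with main-unit failure rate \lambda and standby failure rate k\lambda while idle,

      \mathrm{MTTF}_{\text{duplex}} = \frac{1}{(1+k)\lambda} + \frac{1}{\lambda},

    so a cold standby (k = 0) doubles the simplex MTTF of 1/\lambda, while a fully energized standby (k = 1) yields 1.5/\lambda.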

  3. Reliability-Based Design of a Safety-Critical Automation System: A Case Study

    NASA Technical Reports Server (NTRS)

    Carroll, Carol W.; Dunn, W.; Doty, L.; Frank, M. V.; Hulet, M.; Alvarez, Teresa (Technical Monitor)

    1994-01-01

    In 1986, NASA funded a project to modernize the NASA Ames Research Center Unitary Plan Wind Tunnels, including the replacement of obsolescent controls with a modern, automated distributed control system (DCS). The project effort on this system included an independent safety analysis (ISA) of the automation system. The purpose of the ISA was to evaluate the completeness of the hazard analyses which had already been performed on the Modernization Project. The ISA approach followed a tailoring of the risk assessment approach widely used on existing nuclear power plants. The tailoring of the nuclear industry oriented risk assessment approach to the automation system and its role in reliability-based design of the automation system is the subject of this paper.

  4. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
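
    As a sketch of the aggregation idea (the component names, reliability values and the assumption of independent series/parallel composition are illustrative, not the patented generator itself):

      from functools import reduce

      # Low-level reliability models, here reduced to a reliability value
      # for each component at the mission time of interest.
      low_level = {"cpu": 0.999, "bus_a": 0.995, "bus_b": 0.995, "sensor": 0.99}

      def series(*r):        # all subsystems must work
          return reduce(lambda a, b: a * b, r)

      def parallel(*r):      # at least one subsystem must work
          return 1 - reduce(lambda a, b: a * b, [1 - x for x in r])

      # Architecture description: CPU and sensor in series with redundant buses.
      system_r = series(low_level["cpu"],
                        parallel(low_level["bus_a"], low_level["bus_b"]),
                        low_level["sensor"])
      print(f"system reliability = {system_r:.6f}")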

  5. Gradient-based reliability maps for ACM-based segmentation of hippocampus.

    PubMed

    Zarpalas, Dimitrios; Gkontra, Polyxeni; Daras, Petros; Maglaveras, Nicos

    2014-04-01

    Automatic segmentation of deep brain structures, such as the hippocampus (HC), in MR images has attracted considerable scientific attention due to the widespread use of MRI and to the principal role of some structures in various mental disorders. In the literature, there exists a substantial amount of work relying on deformable models incorporating prior knowledge about structures' anatomy and shape information. However, shape priors capture global shape characteristics and thus fail to model boundaries of varying properties; HC boundaries present rich, poor, and missing gradient regions. On top of that, shape prior knowledge is blended with image information in the evolution process, through global weighting of the two terms, again neglecting the spatially varying boundary properties, causing segmentation faults. An innovative method is hereby presented that aims to achieve highly accurate HC segmentation in MR images, based on the modeling of boundary properties at each anatomical location and the inclusion of appropriate image information for each of those, within an active contour model framework. Hence, blending of image information and prior knowledge is based on a local weighting map, which mixes gradient information, regional and whole brain statistical information with a multi-atlas-based spatial distribution map of the structure's labels. Experimental results on three different datasets demonstrate the efficacy and accuracy of the proposed method.

  6. The Diagnostic Validity and Reliability of an Internet-Based Clinical Assessment Program for Mental Disorders

    PubMed Central

    Klein, Britt; Meyer, Denny; Austin, David William; Abbott, Jo-Anne M

    2015-01-01

    Background Internet-based assessment has the potential to assist with the diagnosis of mental health disorders and overcome the barriers associated with traditional services (eg, cost, stigma, distance). Further to existing online screening programs available, there is an opportunity to deliver more comprehensive and accurate diagnostic tools to supplement the assessment and treatment of mental health disorders. Objective The aim was to evaluate the diagnostic criterion validity and test-retest reliability of the electronic Psychological Assessment System (e-PASS), an online, self-report, multidisorder, clinical assessment and referral system. Methods Participants were 616 adults residing in Australia, recruited online, and representing prospective e-PASS users. Following e-PASS completion, 158 participants underwent a telephone-administered structured clinical interview and 39 participants repeated the e-PASS within 25 days of initial completion. Results With structured clinical interview results serving as the gold standard, diagnostic agreement with the e-PASS varied considerably from fair (eg, generalized anxiety disorder: κ=.37) to strong (eg, panic disorder: κ=.62). Although the e-PASS’ sensitivity also varied (0.43-0.86) the specificity was generally high (0.68-1.00). The e-PASS sensitivity generally improved when reducing the e-PASS threshold to a subclinical result. Test-retest reliability ranged from moderate (eg, specific phobia: κ=.54) to substantial (eg, bulimia nervosa: κ=.87). Conclusions The e-PASS produces reliable diagnostic results and performs generally well in excluding mental disorders, although at the expense of sensitivity. For screening purposes, the e-PASS subclinical result generally appears better than a clinical result as a diagnostic indicator. Further development and evaluation is needed to support the use of online diagnostic assessment programs for mental disorders. Trial Registration Australian and New Zealand Clinical Trials

  7. Fast two-dimensional phase-unwrapping algorithm based on sorting by reliability following a noncontinuous path.

    PubMed

    Herráez, Miguel Arevallilo; Burton, David R; Lalor, Michael J; Gdeisat, Munther A

    2002-12-10

    We describe what is to our knowledge a novel technique for phase unwrapping. Several algorithms based on unwrapping the most-reliable pixels first have been proposed. These were restricted to continuous paths and were subject to difficulties in defining a starting pixel. The technique described here uses a different type of reliability function and does not follow a continuous path to perform the unwrapping operation. The technique is explained in detail and illustrated with a number of examples. PMID:12502301
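
    A condensed rendition of this reliability-sorted, noncontinuous-path strategy (following the paper's second-difference reliability function; border handling and tie-breaking are simplified):

      import numpy as np

      def wrap(d):
          # Wrap phase differences into [-pi, pi).
          return (d + np.pi) % (2 * np.pi) - np.pi

      def unwrap2d(psi):
          rows, cols = psi.shape
          p = np.pad(psi, 1, mode="edge")
          # Pixel reliability: inverse second-difference magnitude (smaller D = better).
          H  = wrap(p[1:-1, :-2] - p[1:-1, 1:-1]) - wrap(p[1:-1, 1:-1] - p[1:-1, 2:])
          V  = wrap(p[:-2, 1:-1] - p[1:-1, 1:-1]) - wrap(p[1:-1, 1:-1] - p[2:, 1:-1])
          D1 = wrap(p[:-2, :-2] - p[1:-1, 1:-1]) - wrap(p[1:-1, 1:-1] - p[2:, 2:])
          D2 = wrap(p[:-2, 2:] - p[1:-1, 1:-1]) - wrap(p[1:-1, 1:-1] - p[2:, :-2])
          R = 1.0 / (np.sqrt(H**2 + V**2 + D1**2 + D2**2) + 1e-12)

          # Edges between horizontal/vertical neighbours, most reliable first.
          idx = np.arange(rows * cols).reshape(rows, cols)
          a = np.concatenate([idx[:, :-1].ravel(), idx[:-1, :].ravel()])
          b = np.concatenate([idx[:, 1:].ravel(), idx[1:, :].ravel()])
          Rf = R.ravel()
          order = np.argsort(-(Rf[a] + Rf[b]))

          u = psi.ravel().astype(float).copy()      # unwrapped phases
          parent = np.arange(u.size)
          members = [[i] for i in range(u.size)]

          def find(i):                              # union-find with path halving
              while parent[i] != i:
                  parent[i] = parent[parent[i]]
                  i = parent[i]
              return i

          for e in order:                           # noncontinuous path over edges
              pa, pb = int(a[e]), int(b[e])
              ra, rb = find(pa), find(pb)
              if ra == rb:
                  continue
              if len(members[ra]) < len(members[rb]):   # always shift the smaller group
                  ra, rb, pa, pb = rb, ra, pb, pa
              shift = 2 * np.pi * np.round((u[pa] - u[pb]) / (2 * np.pi))
              if shift:
                  u[members[rb]] += shift
              parent[rb] = ra
              members[ra].extend(members[rb])
          return u.reshape(rows, cols)

      # Demo: a smooth wrapped ramp is recovered up to a constant offset.
      x = np.linspace(0, 6 * np.pi, 64)
      truth = x[None, :] + x[:, None] / 3
      rec = unwrap2d(wrap(truth))
      print(np.allclose(rec - rec[0, 0], truth - truth[0, 0], atol=1e-6))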

  9. Validity and reliability of perceptually-based scales during exhausting runs in trained male runners.

    PubMed

    Coquart, J B J; Garcin, M

    2007-02-01

    The purposes of this study were to test the validity of a recent scale based on the estimation of a time of exhaustion (entitled the Estimated Time Limit scale) to predict a time limit (Tlim) and to regulate exercise intensity, and to investigate the reliability of the Estimated Time Limit scale and the Ratings of Perceived Exertion (RPE) scale. Fourteen male runners performed one incremental test, one constant velocity test at 85% of Maximal Aerobic Velocity (MAV), one constant duration test and one retest of 15 min. on an outdoor track. The difference between Estimated Time Limit values obtained during the incremental test at 85% MAV and measured Tlim values during the constant velocity test was examined, the velocities at ETL = 13 (i.e., 15 min.) obtained during the incremental test were compared with measured velocities during the constant duration test or the retest (only the best performance was used), and RPE and Estimated Time Limit values during the constant duration test were compared with those measured during the retest. The results have shown a nonsignificant correlation between Estimated Time Limit values at 85% MAV and measured Tlim values during the constant velocity test. There was a significant correlation (p < .02, r = .64) between velocities at ETL = 13 and measured velocities. However, the slope and y-intercept value of this regression were significantly different from those of the identity line. There was no significant difference between the constant duration test and the retest for the values of RPE and Estimated Time Limit, with high correlations (between r = .77 and .99 for the RPE scale, and r = .74 and .99 for the Estimated Time Limit scale). Moreover, the regression lines were close to the identity line. The RPE and Estimated Time Limit scales are reliable, but the lack of validity for the Estimated Time Limit scale suggests that more studies must be performed before using this scale to predict Tlim and regulate exercise intensity in male runners.

  10. Validity and Reliability of Two Field-Based Leg Stiffness Devices: Implications for Practical Use.

    PubMed

    Ruggiero, Luca; Dewhurst, Susan; Bampouras, Theodoros M

    2016-08-01

    Leg stiffness is an important performance determinant in several sporting activities. This study evaluated the criterion-related validity and reliability of 2 field-based leg stiffness devices, Optojump Next (Optojump) and Myotest Pro (Myotest), in different testing approaches. Thirty-four males performed, on 2 separate sessions, 3 trials of 7 maximal hops, synchronously recorded from a force platform (FP), Optojump and Myotest. Validity (Pearson's correlation coefficient, r; relative mean bias; 95% limits of agreement, 95%LoA) and reliability (coefficient of variation, CV; intraclass correlation coefficient, ICC; standard error of measurement, SEM) were calculated for first attempt, maximal attempt, and average across 3 trials. For all 3 methods, Optojump correlated highly to the FP (range r = .98-.99) with small bias (range 0.91-0.92, 95%LoA 0.86-0.98). Myotest demonstrated high correlation to FP (range r = .81-.86) with larger bias (range 1.92-1.93, 95%LoA 1.63-2.23). Optojump yielded a low CV (range 5.9% to 6.8%), high ICC (range 0.82-0.86), and SEM ranging 1.8-2.1 kN/m. Myotest had a larger CV (range 8.9% to 13.0%), moderate ICC (range 0.64-0.79), and SEM ranging from 6.3 to 8.9 kN/m. The findings present important information for these devices and support the use of a time-efficient single trial to assess leg stiffness in the field. PMID:26959196

  11. Reliability-based aeroelastic optimization of a composite aircraft wing via fluid-structure interaction of high fidelity solvers

    NASA Astrophysics Data System (ADS)

    Nikbay, M.; Fakkusoglu, N.; Kuru, M. N.

    2010-06-01

    We consider reliability-based aeroelastic optimization of an AGARD 445.6 composite aircraft wing with stochastic parameters. Both commercial engineering software and an in-house reliability analysis code are employed in this high-fidelity computational framework. The finite-volume-based flow solver Fluent is used to solve the 3D Euler equations, while Gambit is the fluid-domain mesh generator and Catia-V5-R16 is used as a parametric 3D solid modeler. Abaqus, a structural finite element solver, is used to compute the structural response of the aeroelastic system. The mesh-based parallel code coupling interface MPCCI-3.0.6 is used to exchange the pressure and displacement information between Fluent and Abaqus to perform a loosely coupled fluid-structure interaction using a staggered algorithm. To compute the probability of failure for the probabilistic constraints, one of the well-known MPP (Most Probable Point) based reliability analysis methods, FORM (First Order Reliability Method), is implemented in Matlab. This in-house-developed Matlab code is embedded in the multidisciplinary optimization workflow, which is driven by Modefrontier. Modefrontier 4.1 is used for its gradient-based optimization algorithm NBI-NLPQLP, which is based on the sequential quadratic programming method. A Pareto optimal solution for the stochastic aeroelastic optimization is obtained for a specified reliability index, and the results are compared with the results of deterministic aeroelastic optimization.
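
    The FORM step amounts to locating the most probable point by the Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space. A self-contained sketch (the limit-state function is a hypothetical stand-in for the wing's aeroelastic constraint, not the paper's model):

      import numpy as np
      from scipy.stats import norm

      def form_hlrf(g, grad_g, n_dim, tol=1e-8, max_iter=100):
          """FORM via the HL-RF iteration; failure is the event g(u) <= 0."""
          u = np.zeros(n_dim)
          for _ in range(max_iter):
              gv, gr = g(u), grad_g(u)
              u_new = ((gr @ u - gv) / (gr @ gr)) * gr   # step toward the MPP
              if np.linalg.norm(u_new - u) < tol:
                  u = u_new
                  break
              u = u_new
          beta = np.linalg.norm(u)                       # reliability index
          return beta, norm.cdf(-beta)                   # probability of failure

      # Hypothetical linear limit state in standard normal space.
      g = lambda u: 3.0 - u[0] - 0.5 * u[1]
      grad_g = lambda u: np.array([-1.0, -0.5])
      beta, pf = form_hlrf(g, grad_g, n_dim=2)
      print(f"beta = {beta:.3f}, Pf = {pf:.2e}")   # beta = 3/sqrt(1.25) ~ 2.683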

  12. Post-illumination pupil response after blue light: Reliability of optimized melanopsin-based phototransduction assessment.

    PubMed

    van der Meijden, Wisse P; te Lindert, Bart H W; Bijlenga, Denise; Coppens, Joris E; Gómez-Herrero, Germán; Bruijel, Jessica; Kooij, J J Sandra; Cajochen, Christian; Bourgin, Patrice; Van Someren, Eus J W

    2015-10-01

    ± 3.6 yr) we examined the potential confounding effects of dark adaptation, time of the day (morning vs. afternoon), body posture (upright vs. supine position), and 24-h environmental light history on the PIPR assessment. Mixed effect regression models were used to analyze these possible confounders. A supine position caused larger PIPR-mm (β = 0.29 mm, SE = 0.10, p = 0.01) and PIPR-% (β = 4.34%, SE = 1.69, p = 0.02), which was due to an increase in baseline dark pupil diameter; this finding is of relevance for studies requiring a supine posture, as in functional Magnetic Resonance Imaging, constant routine protocols, and bed-ridden patients. There were no effects of dark adaptation, time of day, and light history. In conclusion, the presented method provides a reliable and robust assessment of the PIPR to allow for studies on individual differences in melanopsin-based phototransduction and effects of interventions. PMID:26209783

  14. The mindfulness-based relapse prevention adherence and competence scale: development, interrater reliability, and validity.

    PubMed

    Chawla, Neharika; Collin, Susan; Bowen, Sarah; Hsu, Sharon; Grow, Joel; Douglass, Anne; Marlatt, G Alan

    2010-07-01

    The present study describes the development of the Mindfulness-Based Relapse Prevention Adherence and Competence Scale (MBRP-AC), a measure of treatment integrity for mindfulness-based relapse prevention (MBRP). MBRP is a newly developed treatment integrating core aspects of relapse prevention with mindfulness practices. The MBRP-AC was developed in the context of a randomized controlled trial (RCT) of MBRP efficacy and consists of two sections: Adherence (adherence to individual components of MBRP and discussion of key concepts) and Competence (ratings of therapist style/approach and performance). Audio recordings from 44 randomly selected group treatment sessions (50%) were rated by independent raters for therapist adherence and competence in the RCT. Findings evinced high interrater reliability for all treatment adherence and competence ratings, and adequate internal consistency for Therapist Style/Approach and Therapist Performance summary scales. Ratings on the MBRP-AC suggested that therapists in the recent RCT adhered to protocol, discussed key concepts in each session, and demonstrated the intended style and competence in treatment delivery. Finally, overall ratings on the Adherence section were positively related to changes in mindfulness over the course of the treatment.

  15. Reliable Dual Tensor Model Estimation in Single and Crossing Fibers Based on Jeffreys Prior

    PubMed Central

    Yang, Jianfei; Poot, Dirk H. J.; Caan, Matthan W. A.; Su, Tanja; Majoie, Charles B. L. M.; van Vliet, Lucas J.; Vos, Frans M.

    2016-01-01

    Purpose This paper presents and studies a framework for reliable modeling of diffusion MRI using a data-acquisition adaptive prior. Methods Automated relevance determination estimates the mean of the posterior distribution of a rank-2 dual tensor model exploiting Jeffreys prior (JARD). This data-acquisition prior is based on the Fisher information matrix and enables the assessment whether two tensors are mandatory to describe the data. The method is compared to Maximum Likelihood Estimation (MLE) of the dual tensor model and to FSL’s ball-and-stick approach. Results Monte Carlo experiments demonstrated that JARD’s volume fractions correlated well with the ground truth for single and crossing fiber configurations. In single fiber configurations JARD automatically reduced the volume fraction of one compartment to (almost) zero. The variance in fractional anisotropy (FA) of the main tensor component was thereby reduced compared to MLE. JARD and MLE gave a comparable outcome in data simulating crossing fibers. On brain data, JARD yielded a smaller spread in FA along the corpus callosum compared to MLE. Tract-based spatial statistics demonstrated a higher sensitivity in detecting age-related white matter atrophy using JARD compared to both MLE and the ball-and-stick approach. Conclusions The proposed framework offers accurate and precise estimation of diffusion properties in single and dual fiber regions. PMID:27760166

  16. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., scheduled maintenance is required for any item whose loss of function or mode of failure could have safety... the design level of safety and reliability of the equipment; (2) To restore safety and reliability...

  17. Instrumentation and Control Needs for Reliable Operation of Lunar Base Surface Nuclear Power Systems

    NASA Technical Reports Server (NTRS)

    Turso, James; Chicatelli, Amy; Bajwa, Anupa

    2005-01-01

    needed to enable this critical functionality of autonomous operation. It will be imperative to consider instrumentation and control requirements in parallel with system configuration development, so as to identify control-related, as well as integrated system-related, problem areas early and avoid potentially expensive work-arounds. This paper presents an overview of the enabling technologies necessary for the development of reliable, autonomous lunar base nuclear power systems, with an emphasis on system architectures and off-the-shelf algorithms rather than hardware. Autonomy needs are presented in the context of a hypothetical lunar base nuclear power system. The scenarios and applications presented are hypothetical in nature, based on information from open-literature sources, and only intended to provoke thought and provide motivation for the use of autonomous, intelligent control and diagnostics.

  18. Security-reliability performance of cognitive AF relay-based wireless communication system with channel estimation error

    NASA Astrophysics Data System (ADS)

    Gu, Qi; Wang, Gongpu; Gao, Li; Peng, Mugen

    2014-12-01

    In this paper, both the security and the reliability performance of the cognitive amplify-and-forward (AF) relay system are analyzed in the presence of channel estimation error. The reliability and the security performance are represented by the outage probability and the intercept probability, respectively. Instead of the perfect channel state information (CSI) predominantly assumed in the literature, a specific channel estimation algorithm and the influence of the corresponding channel estimation error are considered in this study. Specifically, linear minimum mean square error estimation (LMMSE) is utilized by the destination node and the eavesdropper node to obtain the CSI, and closed forms for the outage probability and the intercept probability are derived under the channel estimation error. It is shown that the transmission security (reliability) can be improved by loosening the reliability (security) requirement. Moreover, we compare the security and reliability performance of this relay-based cognitive radio system with those of the direct communication system without a relay. Interestingly, it is found that the AF relay-based system offers poorer reliability performance than the direct cognitive radio system; however, it achieves a lower sum of outage and intercept probabilities than the direct communication system. It is also found that there exists an optimal training number that minimizes the sum of the outage probability and the intercept probability.
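
    To make the outage/intercept trade-off concrete, the sketch below evaluates both probabilities under a deliberately simplified Rayleigh-fading model with perfect CSI; it is not the paper's AF-relay/LMMSE setup, and the average SNRs and target rates are assumed values.

```python
import math

def outage_and_intercept(rate_bps_hz, avg_snr_dest, avg_snr_eave):
    """Illustrative security-reliability trade-off for Rayleigh fading.

    Outage: destination SNR falls below the decoding threshold.
    Intercept: eavesdropper SNR exceeds the same threshold.
    Assumes exponentially distributed instantaneous SNRs; this is a
    simplification, not the paper's AF-relay model with estimation error.
    """
    snr_th = 2.0 ** rate_bps_hz - 1.0                 # decoding threshold
    p_out = 1.0 - math.exp(-snr_th / avg_snr_dest)    # reliability metric
    p_int = math.exp(-snr_th / avg_snr_eave)          # security metric
    return p_out, p_int

# Raising the target rate trades reliability (outage up) for security (intercept down).
for rate in (0.5, 1.0, 2.0, 3.0):
    p_out, p_int = outage_and_intercept(rate, avg_snr_dest=20.0, avg_snr_eave=5.0)
    print(f"R={rate}: outage={p_out:.3f}, intercept={p_int:.3f}, sum={p_out + p_int:.3f}")
```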

  19. Is Learner Self-Assessment Reliable and Valid in a Web-Based Portfolio Environment for High School Students?

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Liang, Chaoyun; Chen, Yi-Hui

    2013-01-01

    This study explored the reliability and validity of Web-based portfolio self-assessment. Participants were 72 senior high school students enrolled in a computer application course. The students created learning portfolios, viewed peers' work, and performed self-assessment on the Web-based portfolio assessment system. The results indicated: 1)…

  20. Is Teacher Assessment Reliable or Valid for High School Students under a Web-Based Portfolio Environment?

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Wu, Bing-Hong

    2012-01-01

    This study explored the reliability and validity of teacher assessment under a Web-based portfolio assessment environment (or Web-based teacher portfolio assessment). Participants were 72 eleventh graders taking the "Computer Application" course. The students perform portfolio creation, inspection, self- and peer-assessment using the Web-based…

  1. Shock reliability analysis and improvement of MEMS electret-based vibration energy harvesters

    NASA Astrophysics Data System (ADS)

    Renaud, M.; Fujita, T.; Goedbloed, M.; de Nooijer, C.; van Schaijk, R.

    2015-10-01

    Vibration energy harvesters can serve as a replacement solution to batteries for powering tire pressure monitoring systems (TPMS). Autonomous wireless TPMS powered by microelectromechanical system (MEMS) electret-based vibration energy harvesters have been demonstrated. The mechanical reliability of the MEMS harvester still has to be assessed in order to bring the harvester to the requirements of the consumer market. It should survive the mechanical shocks occurring in the tire environment. A testing procedure to quantify the shock resilience of harvesters is described in this article. Our first generation of harvesters has a shock resilience of 400 g, which is far from being sufficient for the targeted application. In order to improve this, the first step is to understand the failure mechanism. Failure is found to occur in the form of fracture of the device’s springs. It results from impacts between the anchors of the springs when the harvester undergoes a shock. The shock resilience of the harvesters can be improved by redirecting these impacts to nonvital parts of the device. With this philosophy in mind, we design three types of shock absorbing structures and test their effect on the shock resilience of our MEMS harvesters. The solution leading to the best results consists of rigid silicon stoppers covered by a layer of Parylene. The shock resilience of the harvesters is brought above 2500 g. Results in the same range are also obtained with flexible silicon bumpers, which are simpler to manufacture.

  2. Optimization of Systems with Uncertainty: Initial Developments for Performance, Robustness and Reliability Based Designs

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of the probability density functions and the moments, optimization problems that pursue performance, robustness and reliability based designs are studied. By specifying the desired outputs in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of the design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem and a meaningful probabilistic description of its solution is provided in the examples. In the stability control problem the analysis shows that standard deterministic approaches lead to designs with high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.

  3. Degradation reliability modeling based on an independent increment process with quadratic variance

    NASA Astrophysics Data System (ADS)

    Wang, Zhihua; Zhang, Yongbo; Wu, Qiong; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar

    2016-03-01

    Degradation testing is an important technique for assessing lifetime information of complex systems and highly reliable products. Motivated by fatigue crack growth (FCG) data and our previous study, this paper develops a novel degradation modeling approach, in which degradation is represented by an independent increment process with linear mean and general quadratic variance functions of test time or transformed test time if necessary. Based on the constructed degradation model, closed-form expressions of the failure time distribution (FTD) and its percentiles can be straightforwardly derived and calculated. A one-stage method is developed to estimate model parameters and the FTD. Simulation studies are conducted to validate the proposed approach, and the results illustrate that the approach can provide reasonable estimates even for small sample size situations. Finally, the method is verified using the FCG data set given as the motivating example, and the results show that it can be considered an effective degradation modeling approach compared with the multivariate normal model and graphic approach.
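
    A minimal Monte Carlo sketch of the modeling idea above: an independent-increment process with linear mean and quadratic variance, from which failure-time percentiles are read off empirically. The parameters a, b1, b2 and the failure threshold are assumptions, not values fitted to the FCG data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not fitted FCG values):
a, b1, b2 = 0.8, 0.05, 0.01   # mean m(t) = a*t, variance v(t) = b1*t + b2*t**2
threshold = 10.0              # failure when degradation first exceeds this
t = np.linspace(0.0, 30.0, 601)

def simulate_paths(n_paths):
    """Independent normal increments chosen so that the cumulative process
    has mean a*t and variance b1*t + b2*t**2 (both nondecreasing here)."""
    m, v = a * t, b1 * t + b2 * t**2
    dm, dv = np.diff(m), np.diff(v)
    inc = rng.normal(dm, np.sqrt(dv), size=(n_paths, len(t) - 1))
    return np.concatenate([np.zeros((n_paths, 1)), np.cumsum(inc, axis=1)], axis=1)

paths = simulate_paths(20_000)
crossed = paths >= threshold
first = np.where(crossed.any(axis=1), t[np.argmax(crossed, axis=1)], np.inf)

# Empirical failure-time distribution percentiles:
print("median life:", np.percentile(first[np.isfinite(first)], 50))
print("B10 life   :", np.percentile(first[np.isfinite(first)], 10))
```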

  4. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    SciTech Connect

    Dana L. Kelly; Ronald L. Boring; Ali Mosleh; Carol Smidts

    2011-10-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  5. An Acetylcholinesterase-Based Chronoamperometric Biosensor for Fast and Reliable Assay of Nerve Agents

    PubMed Central

    Pohanka, Miroslav; Adam, Vojtech; Kizek, Rene

    2013-01-01

    The enzyme acetylcholinesterase (AChE) is an important part of the cholinergic nervous system, where it stops neurotransmission by hydrolysis of the neurotransmitter acetylcholine. It is sensitive to inhibition by organophosphate and carbamate insecticides, some Alzheimer disease drugs, secondary metabolites such as aflatoxins, and nerve agents used in chemical warfare. When immobilized on a sensor (physico-chemical transducer), it can be used for assay of these inhibitors. In the experiments described herein, an AChE-based electrochemical biosensor using screen-printed electrode systems was prepared. The biosensor was used for assay of nerve agents such as sarin, soman, tabun and VX. The limits of detection achieved in a measuring protocol lasting ten minutes were 7.41 × 10−12 mol/L for sarin, 6.31 × 10−12 mol/L for soman, 6.17 × 10−11 mol/L for tabun, and 2.19 × 10−11 mol/L for VX. The assay was reliable, with minor interferences caused by the organic solvents ethanol, methanol, isopropanol and acetonitrile. Isopropanol was chosen as a suitable medium for processing lipophilic samples. PMID:23999806

  6. A GC/MS-based metabolomic approach for reliable diagnosis of phenylketonuria.

    PubMed

    Xiong, Xiyue; Sheng, Xiaoqi; Liu, Dan; Zeng, Ting; Peng, Ying; Wang, Yichao

    2015-11-01

    ), which showed that phenylacetic acid may be used as a reliable discriminator for the diagnosis of PKU. The low false positive rate (1-specificity, 0.064) can be eliminated or at least greatly reduced by simultaneously referring to other markers, especially phenylpyruvic acid, a unique marker in PKU. Additionally, this standard was obtained with high sensitivity and specificity in a less invasive manner for diagnosing PKU compared with the Phe/Tyr ratio. Therefore, we conclude that urinary metabolomic information based on the improved oximation-silylation method together with GC/MS may be reliable for the diagnosis and differential diagnosis of PKU.

  7. Autonomous, Decentralized Grid Architecture: Prosumer-Based Distributed Autonomous Cyber-Physical Architecture for Ultra-Reliable Green Electricity Networks

    SciTech Connect

    2012-01-11

    GENI Project: Georgia Tech is developing a decentralized, autonomous, internet-like control architecture and control software system for the electric power grid. Georgia Tech’s new architecture is based on the emerging concept of electricity prosumers—economically motivated actors that can produce, consume, or store electricity. Under Georgia Tech’s architecture, all of the actors in an energy system are empowered to offer associated energy services based on their capabilities. The actors achieve their sustainability, efficiency, reliability, and economic objectives, while contributing to system-wide reliability and efficiency goals. This is in marked contrast to the current one-way, centralized control paradigm.

  8. Stochastic Analysis of Waterhammer and Applications in Reliability-Based Structural Design for Hydro Turbine Penstocks

    SciTech Connect

    Zhang, Qin Fen; Karney, Professor Byran W.; Suo, Prof. Lisheng; Colombo, Dr. Andrew

    2011-01-01

    The randomness of transient events, and the variability in factors which influence the magnitudes of resultant pressure fluctuations, ensures that waterhammer and surges in a pressurized pipe system are inherently stochastic. To bolster and improve reliability-based structural design, a stochastic model of transient pressures is developed for water conveyance systems in hydropower plants. The statistical characteristics and probability distributions of key factors in boundary conditions, initial states and hydraulic system parameters are analyzed based on a large record of observed data from hydro plants in China; and then the statistical characteristics and probability distributions of annual maximum waterhammer pressures are simulated using the Monte Carlo method and verified by an analytical probabilistic model for a simplified pipe system. In addition, the characteristics (annual occurrence, sustaining period and probability distribution) of hydraulic loads for both steady and transient states are discussed. Using an example of penstock structural design, it is shown that the total waterhammer pressure should be split into two individual random variable loads: the steady/static pressure and the waterhammer pressure rise during transients; and that different partial load factors should be applied to each individual load to reflect its unique physical and stochastic features. Particularly, the normative load (usually the unfavorable value at the 95th percentile) for steady/static hydraulic pressure should be taken from the probability distribution of its maximum values during the pipe's design life, while for waterhammer pressure rise, as the second variable load, the probability distribution of its annual maximum values is used to determine its normative load.
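
    The load-splitting recommendation can be illustrated with a small Monte Carlo sketch: the steady/static head and the annual-maximum waterhammer rise are sampled as separate random variables and each normative load is taken at its 95th percentile. The distribution families and parameters below are assumptions for illustration, not the observed plant data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # simulated years

# Assumed distributions for illustration only (not plant-specific data):
static_head = rng.normal(loc=100.0, scale=3.0, size=n)   # m, steady/static pressure head
surge_rise = rng.gumbel(loc=25.0, scale=4.0, size=n)     # m, annual-max waterhammer rise

# Treat the two loads as separate random variables, as recommended above,
# and take the unfavourable 95th percentile of each as its normative load.
print("normative static head:", np.percentile(static_head, 95))
print("normative surge rise :", np.percentile(surge_rise, 95))
# Contrast with lumping them into a single total-pressure variable:
print("95th pct of total    :", np.percentile(static_head + surge_rise, 95))
```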

  9. Reliability of Nationwide Prevalence Estimates of Dementia: A Critical Appraisal Based on Brazilian Surveys

    PubMed Central

    2015-01-01

    Background The nationwide dementia prevalence is usually calculated by applying the results of local surveys to countries’ populations. To evaluate the reliability of such estimations in developing countries, we chose Brazil as an example. We carried out a systematic review of dementia surveys, ascertained their risk of bias, and present the best estimate of occurrence of dementia in Brazil. Methods and Findings We carried out an electronic search of PubMed, Latin-American databases, and a Brazilian thesis database for surveys focusing on dementia prevalence in Brazil. The systematic review was registered at PROSPERO (CRD42014008815). Among the 35 studies found, 15 analyzed population-based random samples. However, most of them utilized inadequate criteria for diagnostics. Six studies without these limitations were further analyzed to assess the risk of selection, attrition, outcome and population bias as well as several statistical issues. All the studies presented moderate or high risk of bias in at least two domains due to the following features: high non-response, inaccurate cut-offs, and doubtful accuracy of the examiners. Two studies had limited external validity due to high rates of illiteracy or low income. The three studies with adequate generalizability and the lowest risk of bias presented a prevalence of dementia between 7.1% and 8.3% among subjects aged 65 years and older. However, after adjustment for accuracy of screening, the best available evidence points towards a figure between 15.2% and 16.3%. Conclusions The risk of bias may strongly limit the generalizability of dementia prevalence estimates in developing countries. Extrapolations that have already been made for Brazil and Latin America were based on a prevalence that should have been adjusted for screening accuracy or not used at all due to severe bias. Similar evaluations regarding other developing countries are needed in order to verify the scope of these limitations. PMID:26131563
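
    One standard way to carry out the screening-accuracy adjustment mentioned in the conclusions is the Rogan-Gladen estimator; the sensitivity and specificity used below are hypothetical, chosen only to show how an apparent prevalence near 7.5% can roughly double.

```python
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Adjust an apparent (screened) prevalence for test accuracy:
    true prevalence = (apparent + Sp - 1) / (Se + Sp - 1)."""
    return (apparent_prev + specificity - 1.0) / (sensitivity + specificity - 1.0)

# Hypothetical accuracy values: with Se = 0.50 and Sp = 1.00, an apparent
# prevalence of 7.5% adjusts to 15%, in line with the correction reported above.
print(round(rogan_gladen(0.075, 0.50, 1.00), 3))
```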

  10. CardioGuard: A Brassiere-Based Reliable ECG Monitoring Sensor System for Supporting Daily Smartphone Healthcare Applications

    PubMed Central

    Kwon, Sungjun; Kim, Jeehoon; Kang, Seungwoo; Lee, Youngki; Baek, Hyunjae

    2014-01-01

    Abstract We propose CardioGuard, a brassiere-based reliable electrocardiogram (ECG) monitoring sensor system, for supporting daily smartphone healthcare applications. It is designed to satisfy two key requirements for user-unobtrusive daily ECG monitoring: reliability of ECG sensing and usability of the sensor. The system is validated through extensive evaluations. The evaluation results showed that the CardioGuard sensor reliably measure the ECG during 12 representative daily activities including diverse movement levels; 89.53% of QRS peaks were detected on average. The questionnaire-based user study with 15 participants showed that the CardioGuard sensor was comfortable and unobtrusive. Additionally, the signal-to-noise ratio test and the washing durability test were conducted to show the high-quality sensing of the proposed sensor and its physical durability in practical use, respectively. PMID:25405527

  11. Delay Analysis of Car-to-Car Reliable Data Delivery Strategies Based on Data Mulling with Network Coding

    NASA Astrophysics Data System (ADS)

    Park, Joon-Sang; Lee, Uichin; Oh, Soon Young; Gerla, Mario; Lun, Desmond Siumen; Ro, Won Woo; Park, Joonseok

    Vehicular ad hoc networks (VANETs) aim to enhance vehicle navigation safety by providing an early warning system: drivers are informed of any risk of accident through wireless communication between vehicles. For the warning system to work, it is crucial that safety messages be reliably delivered to the target vehicles in a timely manner, and thus a reliable and timely data dissemination service is the key building block of VANETs. A data mulling technique combined with three strategies, network coding, erasure coding and repetition coding, is proposed for the reliable and timely data dissemination service. In particular, vehicles in the opposite direction on a highway are exploited as data mules, mobile nodes physically delivering data to destinations, to overcome intermittent network connectivity caused by sparse vehicle traffic. Using analytic models, we show that in such a highway data mulling scenario the network coding based strategy outperforms the erasure coding and repetition based strategies.
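
    A toy comparison of the delivery probabilities behind these strategies, assuming independent per-packet losses and treating ideal erasure/network coding as "any k of n packets suffice"; the message size, redundancy and per-packet success probability are assumed, and the model ignores the mule mobility analyzed in the paper.

```python
from math import comb

def p_repetition(k, copies, p):
    """Message of k packets, each sent `copies` times; every one of the
    k distinct packets must get through at least once."""
    return (1.0 - (1.0 - p) ** copies) ** k

def p_erasure(k, n, p):
    """n coded packets sent; any k of them suffice to decode (ideal
    erasure/network coding in this simplified iid-loss model)."""
    return sum(comb(n, i) * p**i * (1.0 - p) ** (n - i) for i in range(k, n + 1))

k, copies, p = 8, 3, 0.7   # assumed message size, redundancy, per-packet success
print("repetition:", round(p_repetition(k, copies, p), 4))
print("erasure   :", round(p_erasure(k, k * copies, p), 4))  # same total transmissions
```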

  12. Validity and Reliability of the Turkish Version of Needs Based Biopsychosocial Distress Instrument for Cancer Patients (CANDI)

    PubMed Central

    Beyhun, Nazim Ercument; Can, Gamze; Tiryaki, Ahmet; Karakullukcu, Serdar; Bulut, Bekir; Yesilbas, Sehbal; Kavgaci, Halil; Topbas, Murat

    2016-01-01

    Background Needs based biopsychosocial distress instrument for cancer patients (CANDI) is a scale based on needs arising due to the effects of cancer. Objectives The aim of this research was to determine the reliability and validity of the CANDI scale in the Turkish language. Patients and Methods The study was performed with the participation of 172 cancer patients aged 18 and over. Factor analysis (principal components analysis) was used to assess construct validity. Criterion validities were tested by computing Spearman correlation between CANDI and hospital anxiety depression scale (HADS), and brief symptom inventory (BSI) (convergent validity) and quality of life scales (FACT-G) (divergent validity). Test-retest reliabilities and internal consistencies were measured with intraclass correlation (ICC) and Cronbach-α. Results A three-factor solution (emotional, physical and social) was found with factor analysis. Internal reliability (α = 0.94) and test-retest reliability (ICC = 0.87) were significantly high. Correlations between CANDI and HADS (rs = 0.67), and BSI (rs = 0.69) and FACT-G (rs = -0.76) were moderate and significant in the expected direction. Conclusions CANDI is a valid and reliable scale in cancer patients with a three-factor structure (emotional, physical and social) in the Turkish language. PMID:27621931
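
    For reference, the internal-consistency statistic reported above (Cronbach's α) can be computed from an item-score matrix as follows; the toy data are illustrative only.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, n_items) array of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Toy data: 5 respondents x 4 items (illustrative only).
demo = [[3, 4, 3, 4], [2, 2, 3, 2], [4, 5, 4, 5], [1, 2, 1, 2], [3, 3, 4, 3]]
print(round(cronbach_alpha(demo), 2))
```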

  13. [Study on the Reliability Assessment Method of Heavy Vehicle Gearbox Based on Spectrometric Analysis].

    PubMed

    Bao, Ke; Zhang, Zhong; Cao, Yuan-fu; Chen, Yi-jie

    2015-04-01

    Spectrometric oil analysis is of great importance for wear condition monitoring of gearboxes. In this study, the contents of the main elemental compositions in bench tests of a heavy vehicle gearbox are first obtained by atomic emission spectrometric oil analysis. Correlation analysis of the test data and analysis of the wear mechanisms are then carried out to identify the metal element that best describes the wear and failure of the gearbox. The spectrometric data after filling/changing oil are corrected, and the evolution of the contents of the main elements during the tests is expressed as linear functions. The reliability assessment is then carried out considering the degradation law and the scatter of the test data, using the mean and standard deviation of the normal distribution of the spectrometric data at each time point. Finally, the influence of the failure threshold is discussed. It is shown that the content of the metal element Cu, obtained by spectrometric oil analysis of different samples, can be used to assess the reliability of a heavy vehicle gearbox, because Cu is closely related to the general wear state of the gearbox and is easy to measure. When the threshold on Cu content is treated as a constant, a larger threshold implies higher reliability at a given time, and the mean of the threshold strongly affects the assessment results when R > 0.9. When the threshold is treated as a random variable, greater dispersion of the threshold implies a smaller slope of reliability against time, and hence lower gearbox reliability when R > 0.9 at a given time. By combining spectrometric oil analysis with probability statistics for gearbox reliability assessment, this study extends the application range of spectrometric analysis.
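
    A minimal sketch of the constant-threshold case: with linear mean growth of the Cu content and normally distributed scatter, reliability at time t is a normal tail probability. All parameter values are hypothetical, not the bench-test results.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def reliability(t, a, b, sigma, threshold):
    """R(t) = P(Cu(t) < threshold) with Cu(t) ~ Normal(a + b*t, sigma^2).
    Linear mean growth follows the abstract; all numbers here are assumed."""
    return phi((threshold - (a + b * t)) / sigma)

for t in (0, 200, 400, 500):  # test hours (hypothetical)
    print(t, round(reliability(t, a=5.0, b=0.05, sigma=2.0, threshold=25.0), 3))
```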

  14. Nanoparticle-based cancer treatment: can delivered dose and biological dose be reliably modeled and quantified?

    NASA Astrophysics Data System (ADS)

    Hoopes, P. Jack; Petryk, Alicia A.; Giustini, Andrew J.; Stigliano, Robert V.; D'Angelo, Robert N.; Tate, Jennifer A.; Cassim, Shiraz M.; Foreman, Allan; Bischof, John C.; Pearce, John A.; Ryan, Thomas

    2011-03-01

    Essential developments in the reliable and effective use of heat in medicine include: 1) the ability to model energy deposition and the resulting thermal distribution and tissue damage (Arrhenius models) over time in 3D, 2) the development of non-invasive thermometry and imaging for tissue damage monitoring, and 3) the development of clinically relevant algorithms for accurate prediction of the biological effect resulting from a delivered thermal dose in mammalian cells, tissues, and organs. The accuracy and usefulness of this information varies with the type of thermal treatment, sensitivity and accuracy of tissue assessment, and volume, shape, and heterogeneity of the tumor target and normal tissue. That said, without the development of an algorithm that has allowed the comparison and prediction of the effects of hyperthermia in a wide variety of tumor and normal tissues and settings (cumulative equivalent minutes, CEM), hyperthermia would never have achieved clinical relevance. A new hyperthermia technology, magnetic nanoparticle-based hyperthermia (mNPH), has distinct advantages over the previous techniques: the ability to target the heat to individual cancer cells (with a nontoxic nanoparticle), and to excite the nanoparticles noninvasively with a noninjurious magnetic field, thus sparing associated normal cells and greatly improving the therapeutic ratio. As such, this modality has great potential as a primary and adjuvant cancer therapy. Although the targeted and safe nature of the noninvasive external activation (hysteretic heating) is a tremendous asset, the large number of therapy-based variables and the lack of an accurate and useful method for predicting, assessing and quantifying mNP dose and treatment effect is a major obstacle to moving the technology into routine clinical practice. Among other parameters, mNPH will require the accurate determination of specific nanoparticle heating capability, the total nanoparticle content and biodistribution in
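
    For reference, the CEM algorithm mentioned above is conventionally computed as CEM43 = Σ Δt·R^(43−T), with R = 0.5 at or above the 43 °C breakpoint and 0.25 below it (the conventional choice; tissue-specific values vary). A minimal implementation:

```python
def cem43(temps_c, dt_min):
    """Cumulative equivalent minutes at 43 deg C for a sampled temperature
    history; dt_min is the sampling interval in minutes."""
    cem = 0.0
    for temp in temps_c:
        r = 0.5 if temp >= 43.0 else 0.25
        cem += dt_min * r ** (43.0 - temp)
    return cem

# 30 minutes at 45 deg C is equivalent to 120 minutes at 43 deg C:
print(cem43([45.0] * 30, dt_min=1.0))  # -> 120.0
```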

  15. A general cause based methodology for analysis of dependent failures in system risk and reliability assessments

    NASA Astrophysics Data System (ADS)

    O'Connor, Andrew N.

    Traditional parametric Common Cause Failure (CCF) models quantify the soft dependencies between component failures through the use of empirical ratio relationships. Furthermore, CCF modeling has been essentially restricted to identical components in redundant formations. While this has been advantageous in allowing the prediction of system reliability with little or no data, it has been prohibitive in other applications such as modeling the characteristics of a system design or including the characteristics of failure when assessing the risk significance of a failure or degraded performance event (known as an event assessment). This dissertation extends the traditional definition of CCF to model soft dependencies between like and non-like components. It does this through the explicit modeling of soft dependencies between systems (coupling factors) such as sharing a maintenance team or sharing a manufacturer. By modeling the soft dependencies explicitly, these relationships can be individually quantified based on the specific design of the system, which allows for more accurate event assessment given knowledge of the failure cause. Since the most data informed model in use is the Alpha Factor Model (AFM), it has been used as the baseline for the proposed solutions. This dissertation analyzes the US Nuclear Regulatory Commission's Common Cause Failure Database event data to determine the suitability of the data and failure taxonomy for use in the proposed cause-based models. Recognizing that CCF events are characterized by full or partial presence of "root cause" and "coupling factor", a refined failure taxonomy is proposed which provides a direct link between the failure cause category and the coupling factors. This dissertation proposes two CCF models: (a) the Partial Alpha Factor Model (PAFM), which accounts for the relevant coupling factors based on system design and provides event assessment with knowledge of the failure cause, and (b) the General Dependency Model (GDM), which uses

  16. Testing the Reliability and Sensitivity of Foraminiferal Transfer Functions Based on the Modern Analog Technique (MAT)

    NASA Astrophysics Data System (ADS)

    Lac, D.; Cullen, J. L.; Martin, A.

    2004-05-01

    analogs for all the samples within a particular set of duplicates, with no regional pattern in this number observed. The warm and cold SST estimates generated using the SSTs above each of the 5 chosen analogs exhibit a wide range of variation, particularly for the three sample sets from the high latitudes. The three subpolar sample sets exhibit a 3.4, 1.1, and 1.0 degree C range in their cold SST estimates. There is no clear relationship between differences in SST estimates and differences in the average dissimilarities within duplicate sample sets. Using 10 instead of 5 modern analogs to estimate SSTs produces somewhat better results for 4 of the 8 sample sets and similar results for the remaining 4. Our results suggest that foraminiferal samples with dissimilarity values of up to 0.15 are not detectably different from duplicate foraminiferal census counts and should be considered excellent modern faunal analogs for any fossil sample. In addition, high-latitude samples seem to produce somewhat less reliable SST estimates than low-latitude samples. Finally, our results suggest that, when estimating past SSTs, choosing to average the SSTs above the 10 best analogs produces more accurate (and precise) results, particularly in situations where the Global Data Base contains adequate modern analogs.
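
    A sketch of the MAT machinery discussed above: squared-chord dissimilarity between assemblages, averaging the SSTs of the k closest core tops, and flagging estimates whose best analog exceeds the 0.15 cutoff. The synthetic calibration data are placeholders, not a real core-top database.

```python
import numpy as np

def squared_chord(p, q):
    """Dissimilarity between two faunal assemblages given as
    relative-abundance vectors (each summing to 1)."""
    return float(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def mat_sst(sample, core_top_faunas, core_top_ssts, k=10, cutoff=0.15):
    """Average the SSTs of the k most similar core tops; also report
    whether the best analog passes the 0.15 dissimilarity threshold
    suggested by the duplicate-count results above."""
    d = np.array([squared_chord(sample, f) for f in core_top_faunas])
    idx = np.argsort(d)[:k]
    return core_top_ssts[idx].mean(), d[idx[0]] <= cutoff

rng = np.random.default_rng(2)
faunas = rng.dirichlet(np.ones(20), size=200)   # synthetic core-top assemblages
ssts = rng.uniform(2, 28, size=200)             # synthetic calibration SSTs
est, good_analog = mat_sst(faunas[0], faunas, ssts)
print(round(est, 1), good_analog)
```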

  17. A Critique of Raju and Oshima's Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Wang, Wen-Chung

    2008-01-01

    Raju and Oshima (2005) proposed two prophecy formulas based on item response theory in order to predict the reliability of ability estimates for a test after change in its length. The first prophecy formula is equivalent to the classical Spearman-Brown prophecy formula. The second prophecy formula is misleading because of an underlying false…
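
    The classical formula to which the first prophecy formula is said to be equivalent is easy to state: for a test whose length changes by a factor n, the predicted reliability is ρ' = nρ/(1 + (n−1)ρ).

```python
def spearman_brown(rho, length_factor):
    """Predicted reliability when test length changes by `length_factor`
    (e.g., 2.0 doubles the test): rho' = n*rho / (1 + (n-1)*rho)."""
    n = length_factor
    return n * rho / (1.0 + (n - 1.0) * rho)

print(round(spearman_brown(0.70, 2.0), 3))  # doubling a 0.70-reliable test -> ~0.824
```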

  18. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    ERIC Educational Resources Information Center

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  19. Content Validity and Inter-Rater Reliability of the Halliwick-Concept-Based Instrument "Swimming with Independent Measure"

    ERIC Educational Resources Information Center

    Srsen, Katja Groleger; Vidmar, Gaj; Pikl, Masa; Vrecar, Irena; Burja, Cirila; Krusec, Klavdija

    2012-01-01

    The Halliwick concept is widely used in different settings to promote joyful movement in water and swimming. To assess the swimming skills and progression of an individual swimmer, a valid and reliable measure should be used. The Halliwick-concept-based Swimming with Independent Measure (SWIM) was introduced for this purpose. We aimed to determine…

  20. Determining Functional Reliability of Pyrotechnic Mechanical Devices

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Multhaup, Herbert A.

    1997-01-01

    This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The generally accepted go/no-go statistical approach, which requires hundreds or thousands of consecutive successful tests on identical components for reliability predictions, routinely ignores the physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine mechanical functional margin. Finally, the data collected in establishing functional margin is analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost savings and improved understanding compared with go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
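
    A minimal sketch of the margin idea, assuming delivered and required energies can be treated as independent normal variables estimated from small samples; the numbers are hypothetical, not the pin-puller data.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def functional_reliability(mu_req, sd_req, mu_del, sd_del):
    """P(delivered energy > required energy) assuming independent normal
    variables; a simplified stand-in for the paper's small-sample margin
    analysis, with all numbers below hypothetical."""
    margin = mu_del - mu_req
    sd = sqrt(sd_req**2 + sd_del**2)
    return phi(margin / sd)

# E.g., a source delivering 80 +/- 5 J against a 50 +/- 6 J requirement:
print(round(functional_reliability(50.0, 6.0, 80.0, 5.0), 6))
```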

  1. Improved Reliability of InGaN-Based Light-Emitting Diodes by HfO2 Passivation Layer.

    PubMed

    Park, Seung Hyun; Kim, Yoon Seok; Kim, Tae Hoon; Ryu, Sang Wan

    2016-02-01

    We utilized a passivation layer to improve the leakage current and reliability characteristics of GaN-based light-emitting diodes. The electrical and optical characteristics of the fabricated LEDs were characterized by current-voltage and optical power measurements. The HfO2 passivation layer showed no optical power degradation and suppressed leakage current. The low deposition temperature of sputtered HfO2 is responsible for the improved reliability of the LEDs because it suppresses the diffusion of hydrogen plasma into GaN to form harmful Mg-H complexes. PMID:27433667

  2. Content validity and inter-rater reliability of the Halliwick-concept-based instrument 'Swimming with Independent Measure'.

    PubMed

    Sršen, Katja Groleger; Vidmar, Gaj; Pikl, Maša; Vrečar, Irena; Burja, Cirila; Krušec, Klavdija

    2012-06-01

    The Halliwick concept is widely used in different settings to promote joyful movement in water and swimming. To assess the swimming skills and progression of an individual swimmer, a valid and reliable measure should be used. The Halliwick-concept-based Swimming with Independent Measure (SWIM) was introduced for this purpose. We aimed to determine its content validity and inter-rater reliability. Fifty-four healthy children, 3.5-11 years old, from a mainstream swimming program participated in a content validity study. They were evaluated with SWIM and the national evaluation system of swimming abilities (classifying children into seven categories). To study the inter-rater reliability of SWIM, we included 37 children and youth from a Halliwick swimming program, aged 7-22 years, who were evaluated by two Halliwick instructors independently. The average SWIM score differed between national evaluation system categories and followed the expected order (P<0.001), whereby a ceiling effect was observed in the higher categories. High inter-rater reliability was found for all 11 SWIM items. The lowest reliability was observed for item G (sagittal rotation), although the estimates were still above 0.9. As expected, the highest reliability was observed for the total score (intraclass correlation 0.996). The validity of SWIM with respect to the national evaluation system of swimming abilities is high until the point where a swimmer is well adapted to water and already able to learn some swimming techniques. The inter-rater reliability of SWIM is very high; thus, we believe that SWIM can be used in further research and practice to follow the progress of swimmers.

  3. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  4. Reliability-centered maintenance for ground-based large optical telescopes and radio antenna arrays

    NASA Astrophysics Data System (ADS)

    Marchiori, G.; Formentin, F.; Rampini, F.

    2014-07-01

    In recent years, EIE GROUP has been increasingly involved in large optical telescope and radio antenna array projects. In this frame, the paper describes a fundamental aspect of the Logistic Support Analysis (LSA) process: the application of the Reliability-Centered Maintenance (RCM) methodology to the generation of maintenance plans for ground-based large optical telescopes and radio antenna arrays. This helps maintenance engineers make sure that the telescopes continue to work properly, doing what their users require them to do in their present operating conditions. The main objective of the RCM process is to establish the complete maintenance regime, with the safe minimum of required maintenance, carried out without any risk to personnel, telescope or subsystems. A correct application of RCM also increases cost effectiveness, telescope uptime and item availability, and provides greater understanding of the level of risk that the organization is managing. At the same time, engineers must work from the initial phase of the project to obtain a telescope that requires easy maintenance activities and simple replacement of the major assemblies, taking special care over access design and item location, and over the design and implementation of special lifting equipment and handling devices for the heavy items. This maintenance engineering framework is based on seven points, which lead to the main steps of the RCM program. The initial steps of the RCM process consist of: system selection and data collection (MTBF, MTTR, etc.), definition of system boundaries and operating context, telescope description with the use of functional block diagrams, and the running of an FMECA to address the dominant causes of equipment failure and to lay down the Critical Items List. In the second part of the process the RCM logic is applied, which helps to determine the appropriate maintenance tasks for each identified failure mode. Once
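
    As a tiny illustration of the FMECA step feeding the Critical Items List, the sketch below ranks failure modes by risk priority number (severity x occurrence x detection, the common FMEA scoring); the items and scores are invented, not drawn from an actual telescope LSA.

```python
# Minimal FMECA-style ranking sketch: score each failure mode 1-10 for
# severity, occurrence, and detection, then sort by risk priority number.
# Items and scores are hypothetical placeholders.
modes = [
    ("azimuth drive bearing seizure",   9, 3, 4),
    ("mirror-cell actuator drift",      6, 5, 3),
    ("encoder read-head contamination", 4, 6, 7),
    ("HVAC filter clogging",            3, 7, 2),
]
ranked = sorted(modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={s * o * d:3d}  {name}")
```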

  5. Asymmetric programming: a highly reliable metadata allocation strategy for MLC NAND flash memory-based sensor systems.

    PubMed

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-01-01

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme.

  6. Estimating and comparing the reliability of a suite of workplace-based assessments: an obstetrics and gynaecology setting.

    PubMed

    Homer, Matt; Setna, Zeryab; Jha, Vikram; Higham, Jenny; Roberts, Trudie; Boursicot, Katherine

    2013-08-01

    This paper reports on a study that compares estimates of the reliability of a suite of workplace-based assessment forms as employed to formatively assess the progress of trainee obstetricians and gynaecologists. The use of such forms of assessment is growing nationally and internationally in many specialties, but there is little research evidence on comparisons by procedure/competency and form-type across an entire specialty. Generalisability theory combined with a multilevel modelling approach is used to estimate variance components, G-coefficients and standard errors of measurement across 13 procedures and three form-types (mini-CEX, OSATS and CbD). The main finding is that there are wide variations in the estimates of reliability across the forms, and that therefore the guidance on assessment within the specialty does not always allow for enough forms per trainee to ensure that the level of reliability of the process is adequate. There is, however, little evidence that reliability varies systematically by form-type. Methodologically, the problems of accurately estimating reliability in these contexts through the calculation of variance components and, crucially, their associated standard errors are considered. The importance of the use of appropriate methods in such calculations is emphasised, and the unavoidable limitations of research in naturalistic settings are discussed.
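
    A sketch of the generalisability logic at issue: given variance components for trainees and for the form-by-trainee/error term (both hypothetical below), the relative G-coefficient shows how many forms per trainee are needed before reliability becomes adequate.

```python
def g_coefficient(var_person, var_interaction_error, n_forms):
    """Relative G-coefficient for a persons x forms design:
    G = s2_p / (s2_p + s2_pf,e / n'), where n' is the number of
    forms averaged per trainee. Components below are hypothetical."""
    return var_person / (var_person + var_interaction_error / n_forms)

# How many forms per trainee are needed to reach G >= 0.8?
for n in (1, 2, 4, 6, 8, 10):
    print(n, round(g_coefficient(0.30, 0.70, n), 2))
```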

  8. Observer reliability of the Gross Motor Performance Measure and the Quality of Upper Extremity Skills Test, based on video recordings.

    PubMed

    Sorsdahl, Anne Brit; Moe-Nilssen, Rolf; Strand, Liv Inger

    2008-02-01

    The aim of this study was to examine observer reliability of the Gross Motor Performance Measure (GMPM) and the Quality of Upper Extremity Skills Test (QUEST) based on video clips. The tests were administered to 26 children with cerebral palsy (CP; 14 males, 12 females; range 2-13y, mean 7y 6mo), 24 with spastic CP, and two with dyskinesia. Respectively, five, six, five, four, and six children were classified in Gross Motor Function Classification System Levels I to V; and four, nine, five, five, and three children were classified in Manual Ability Classification System levels I to V. The children's performances were recorded and edited. Two experienced paediatric physical therapists assessed the children from watching the video clips. Intraobserver and interobserver reliability values of the total scores were mostly high, with intraclass correlation coefficients (ICC(1,1)) varying from 0.69 to 0.97 and only one coefficient below 0.89. The ICCs of subscores varied from 0.36 to 0.95, finding 'Alignment' and 'Weight shift' in GMPM and 'Protective extension' in QUEST highly reliable. The subscores 'Dissociated movements' in GMPM and QUEST, and 'Grasp' in QUEST were the least reliable, and recommendations are made to increase reliability of these subscores. Video scoring was time-consuming, but was found to offer many advantages: the possibility to review performance, to use specially trained observers for scoring, and less demanding assessment for the children. PMID:18201304

  9. Test-Retest Reliability of an Automated Infrared-Assisted Trunk Accelerometer-Based Gait Analysis System.

    PubMed

    Hsu, Chia-Yu; Tsai, Yuh-Show; Yau, Cheng-Shiang; Shie, Hung-Hai; Wu, Chu-Ming

    2016-01-01

    The aim of this study was to determine the test-retest reliability of an automated infrared-assisted, trunk accelerometer-based gait analysis system for measuring gait parameters of healthy subjects in a hospital. Thirty-five participants (28 females; age range, 23-79 years) performed a 5-m walk twice using an accelerometer-based gait analysis system with infrared assist. Measurements of spatiotemporal gait parameters (walking speed, step length, and cadence) and trunk control (gait symmetry, gait regularity, acceleration root mean square (RMS), and acceleration root mean square ratio (RMSR)) were recorded in two separate walking tests conducted 1 week apart. Relative and absolute test-retest reliability was determined by calculating the intra-class correlation coefficient (ICC(3,1)) and smallest detectable difference (SDD), respectively. The test-retest reliability was excellent for walking speed (ICC = 0.87, 95% confidence interval = 0.74-0.93, SDD = 13.4%), step length (ICC = 0.81, 95% confidence interval = 0.63-0.91, SDD = 12.2%), cadence (ICC = 0.81, 95% confidence interval = 0.63-0.91, SDD = 10.8%), and trunk control (step and stride regularity in anterior-posterior direction, acceleration RMS and acceleration RMSR in medial-lateral direction, and acceleration RMS and stride regularity in vertical direction). An automated infrared-assisted, trunk accelerometer-based gait analysis system is a reliable tool for measuring gait parameters in the hospital environment. PMID:27455281
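
    For reference, the absolute-reliability quantities above follow from the ICC and the between-subject SD via SEM = SD·sqrt(1−ICC) and SDD = 1.96·sqrt(2)·SEM (the paper reports SDD as a percentage of the mean; the SD below is hypothetical):

```python
from math import sqrt

def sem(sd, icc):
    """Standard error of measurement from the between-subject SD and ICC."""
    return sd * sqrt(1.0 - icc)

def sdd(sd, icc):
    """Smallest detectable difference at the 95% level: 1.96 * sqrt(2) * SEM."""
    return 1.96 * sqrt(2.0) * sem(sd, icc)

# E.g., walking speed with ICC = 0.87 and a (hypothetical) SD of 0.15 m/s:
print(round(sdd(0.15, 0.87), 3), "m/s")
```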

  10. Classification of Lower Extremity Movement Patterns Based on Visual Assessment: Reliability and Correlation With 2-Dimensional Video Analysis

    PubMed Central

    Harris-Hayes, Marcie; Steger-May, Karen; Koh, Christine; Royer, Nat K.; Graci, Valentina; Salsich, Gretchen B.

    2014-01-01

    Context: Abnormal movement patterns have been implicated in lower extremity injury. Reliable, valid, and easily implemented assessment methods are needed to examine existing musculoskeletal disorders and investigate predictive factors for lower extremity injury. Objective: To determine the reliability of experienced and novice testers in making visual assessments of lower extremity movement patterns and to characterize the construct validity of the visual assessments. Design: Cross-sectional study. Setting: University athletic department and research laboratory. Patients or Other Participants: Convenience sample of 30 undergraduate and graduate students who regularly participate in athletics (age = 19.3 ± 4.5 years). Testers were 2 experienced physical therapists and 1 novice postdoctoral fellow (nonclinician). Main Outcome Measure(s): We took videos of 30 athletes performing the single-legged squat. Three testers observed the videos on 2 occasions and classified the lower extremity movement as dynamic valgus, no change, or dynamic varus. The classification was based on the estimated change in frontal-plane projection angle (FPPA) of the knee from single-legged stance to maximum single-legged squat depth. The actual FPPA change was measured quantitatively. We used percentage agreement and weighted κ to examine tester reliability and to determine construct validity of the visual assessment. Results: The κ values for intratester and intertester reliability ranged from 0.75 to 0.90, indicating substantial to excellent reliability. Percentage agreement between the visual assessment and the quantitative FPPA change category was 90%, with a κ value of 0.85. Conclusions: Visual assessments were made reliably by experienced and novice testers. Additionally, movement-pattern categories based on visual assessments were in excellent agreement with objective methods to measure FPPA change. Therefore, visual assessments can be used in the clinic to assess movement patterns

  11. Temporal Stability of Strength-Based Assessments: Test-Retest Reliability of Student and Teacher Reports

    ERIC Educational Resources Information Center

    Romer, Natalie; Merrell, Kenneth W.

    2013-01-01

    This study focused on evaluating the temporal stability of self-reported and teacher-reported perceptions of students' social and emotional skills and assets. We used a test-retest reliability procedure over repeated administrations of the child, adolescent, and teacher versions of the "Social-Emotional Assets and Resilience Scales". Middle school…

  12. Reliability-Based Weighting of Visual and Vestibular Cues in Displacement Estimation

    PubMed Central

    ter Horst, Arjan C.; Koppen, Mathieu; Selen, Luc P. J.; Medendorp, W. Pieter

    2015-01-01

    When navigating through the environment, our brain needs to infer how far we move and in which direction we are heading. In this estimation process, the brain may rely on multiple sensory modalities, including the visual and vestibular systems. Previous research has mainly focused on heading estimation, showing that sensory cues are combined by weighting them in proportion to their reliability, consistent with statistically optimal integration. But while heading estimation could improve with the ongoing motion, due to the constant flow of information, the estimate of how far we move requires the integration of sensory information across the whole displacement. In this study, we investigate whether the brain optimally combines visual and vestibular information during a displacement estimation task, even if their reliability varies from trial to trial. Participants were seated on a linear sled, immersed in a stereoscopic virtual reality environment. They were subjected to a passive linear motion involving visual and vestibular cues with different levels of visual coherence to change relative cue reliability and with cue discrepancies to test relative cue weighting. Participants performed a two-interval two-alternative forced-choice task, indicating which of two sequentially perceived displacements was larger. Our results show that humans adapt their weighting of visual and vestibular information from trial to trial in proportion to their reliability. These results provide evidence that humans optimally integrate visual and vestibular information in order to estimate their body displacement. PMID:26658990
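
    The reliability-proportional weighting described here is the standard minimum-variance cue-combination rule; a minimal sketch with assumed cue variances:

```python
def fuse(est_vis, var_vis, est_vest, var_vest):
    """Statistically optimal (minimum-variance) combination of two cues:
    weights are inversely proportional to each cue's variance."""
    w_vis = var_vest / (var_vis + var_vest)
    est = w_vis * est_vis + (1.0 - w_vis) * est_vest
    var = (var_vis * var_vest) / (var_vis + var_vest)
    return est, var

# A low-coherence (high-variance) visual cue is down-weighted relative
# to the vestibular cue; the combined variance is below either alone.
print(fuse(est_vis=1.2, var_vis=0.16, est_vest=1.0, var_vest=0.04))
```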

  13. Moving to a Higher Level for PV Reliability through Comprehensive Standards Based on Solid Science (Presentation)

    SciTech Connect

    Kurtz, S.

    2014-11-01

    PV reliability is a challenging topic because of the desired long life of PV modules, the diversity of use environments and the pressure on companies to rapidly reduce their costs. This presentation describes the challenges, examples of failure mechanisms that we know or don't know how to test for, and how a scientific approach is being used to establish international standards.

  14. Predictions of Crystal Structure Based on Radius Ratio: How Reliable Are They?

    ERIC Educational Resources Information Center

    Nathan, Lawrence C.

    1985-01-01

    Discussion of crystalline solids in undergraduate curricula often includes the use of radius ratio rules as a method for predicting which type of crystal structure is likely to be adopted by a given ionic compound. Examines this topic, establishing more definitive guidelines for the use and reliability of the rules. (JN)

  15. Methodology for reliability based condition assessment. Application to concrete structures in nuclear plants

    SciTech Connect

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period.
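
    A much-simplified Monte Carlo sketch of the time-dependent reliability formulation (plain sampling rather than the adaptive scheme in the report): strength degrades over time while extreme-load events arrive as a Poisson process. All distributions and parameters are illustrative, not plant data.

```python
import numpy as np

rng = np.random.default_rng(3)

def time_dependent_reliability(years=40.0, rate=0.5, n_sim=20_000):
    """Plain Monte Carlo: failure occurs if any load event exceeds the
    (degraded) strength at its occurrence time."""
    failures = 0
    for _ in range(n_sim):
        r0 = rng.normal(1.0, 0.10)                # initial strength (normalized)
        n_events = rng.poisson(rate * years)      # number of load events
        times = rng.uniform(0.0, years, n_events)
        loads = rng.gumbel(0.55, 0.06, n_events)  # extreme-load magnitudes
        strength = r0 * (1.0 - 0.004 * times)     # linear degradation
        failures += bool(np.any(loads > strength))
    return 1.0 - failures / n_sim

print("R(40 yr) ~", round(time_dependent_reliability(), 3))
```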

  17. An Application-Based Discussion of Construct Validity and Internal Consistency Reliability.

    ERIC Educational Resources Information Center

    Taylor, Dianne L.; Campbell, Kathleen T.

    Several techniques for conducting studies of measurement integrity are explained and illustrated using a heuristic data set from a study of teachers' participation in decision making (D. L. Taylor, 1991). The sample consisted of 637 teachers. It is emphasized that validity and reliability are characteristics of data, and do not inure to tests as…

  18. Inter-rater reliability of ohio school-based overweight and obesity surveillance data.

    PubMed

    Oza-Frank, Reena; Hade, Erinn M; Conrey, Elizabeth J

    2012-09-01

    Measurement of height and weight in large studies may force the use of multiple measurers. The purpose of this study was to evaluate the reliability of height, weight, and body mass index (BMI) measures collected by multiple measurers in a large, statewide BMI surveillance program. A random subsample of schools (n=30) was selected from schools that participated in the 2009 to 2010 Ohio third-grade Oral Health/BMI surveillance program. Children (n=1,189) were measured by multiple volunteer health professional measurers and again by a trained researcher, who was standard across all schools. Mean differences for height, weight, and BMI percentiles were calculated for BMI category classifications. Agreement was estimated by the reliability coefficient, McNemar's test, and Kappa statistic. Sensitivity, specificity, and positive and negative predictive values were estimated using the trained researcher measures as the reference. Overall mean differences (95% confidence interval) were 0.45 (0.41-0.48) cm for height, 0.07 (-0.01-0.15) kg for weight, and 1.37 (1.20-1.53) for BMI. The correlation coefficient for all three measures was over 0.9 (P<0.01), indicating a strong positive association between measures. BMI category classifications showed substantial reliability (Kappa range: 0.94-0.96). Percentage agreement ranged from 98% to 99% for all BMI categories, as did sensitivities and specificities. Positive predictive values for all BMI categories were approximately 97%, and close to 100% for negative predictive values. Reliability for height, weight, BMI percentile, and BMI classification was very high, supporting the use of multiple trained measurers in a statewide BMI surveillance program. Similar methods can be applied to other public health and clinical settings to improve anthropometric measurement reliability. PMID:22939442
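
    For reference, the chance-corrected agreement statistic used above (Cohen's κ) can be computed from a rater-vs-reference confusion matrix; the 3-category table below is invented but yields κ in the reported range.

```python
import numpy as np

def cohens_kappa(confusion):
    """Agreement beyond chance between two raters:
    kappa = (p_o - p_e) / (1 - p_e)."""
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    p_o = np.trace(c) / n                                  # observed agreement
    p_e = (c.sum(axis=0) * c.sum(axis=1)).sum() / n**2     # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# Toy 3-category (under/normal/overweight) table, measurer vs reference:
table = [[110,   4,   0],
         [  6, 650,  10],
         [  0,   8, 210]]
print(round(cohens_kappa(table), 2))
```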

  19. A human reliability based usability evaluation method for safety-critical software

    SciTech Connect

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.; Ragsdale, A.

    2006-07-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with that of the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis to heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability. (authors)
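
    A minimal sketch of the flavor of this method, assuming each violated heuristic carries an independent, hypothetical error probability that is folded into a single UEP; Boring and Gertman's actual SPAR-H-derived modifiers and the weighting scheme discussed in the paper are not reproduced here.

      # hypothetical probabilistic modifiers per usability heuristic
      heuristic_probs = {
          "visibility_of_system_status": 0.01,
          "match_system_and_world": 0.003,
          "error_prevention": 0.05,
      }

      def usability_error_probability(violated):
          """UEP = probability that at least one violated heuristic
          contributes to an error (independence assumed)."""
          p_no_error = 1.0
          for h in violated:
              p_no_error *= 1.0 - heuristic_probs[h]
          return 1.0 - p_no_error   # higher UEP -> fix sooner

      print(usability_error_probability(["error_prevention", "visibility_of_system_status"]))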

  20. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
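
    For this model class, the inhomogeneous Poisson process with Weibull intensity u(t) = lam * beta * t**(beta - 1), the maximum-likelihood estimates for a time-terminated test have a simple closed form. The sketch below applies them to hypothetical failure times; the report's tabled confidence bounds and goodness-of-fit critical values are not reproduced.

      import math

      def weibull_process_mle(failure_times, T):
          """Crow-AMSAA MLE for a time-terminated test of length T."""
          n = len(failure_times)
          beta = n / sum(math.log(T / t) for t in failure_times)
          lam = n / T**beta
          mtbf_inst = 1.0 / (lam * beta * T**(beta - 1))  # instantaneous MTBF at T
          return beta, lam, mtbf_inst

      # hypothetical cumulative failure times (hours); beta < 1 indicates reliability growth
      print(weibull_process_mle([35.0, 110.0, 240.0, 490.0, 800.0], T=1000.0))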

  1. RICA: a reliable and image configurable arena for cyborg bumblebee based on CAN bus.

    PubMed

    Gong, Fan; Zheng, Nenggan; Xue, Lei; Xu, Kedi; Zheng, Xiaoxiang

    2014-01-01

    In this paper, we present RICA, a reliable and image-configurable flight arena for developing cyborg bumblebees. To meet the spatial and temporal requirements of bumblebees, the Controller Area Network (CAN) bus is adopted to interconnect the LED display modules, ensuring the reliability and real-time performance of the arena system. Easily configurable interfaces on a desktop computer, implemented as Python scripts, are provided to transmit visual patterns to the LED distributor online and to configure RICA dynamically. The new arena system will be a powerful tool for investigating the quantitative relationship between visual inputs and induced flight behaviors and will also be helpful to visual-motor research in other related fields. PMID:25570095

  3. A Human Reliability Based Usability Evaluation Method for Safety-Critical Software

    SciTech Connect

    Phillippe Palanque; Regina Bernhaupt; Ronald Boring; Chris Johnson

    2006-04-01

    Recent years have seen an increasing use of sophisticated interaction techniques, including in the field of safety-critical interactive software [8]. The use of such techniques has been required in order to increase the bandwidth between users and systems and thus to help users deal efficiently with increasingly complex systems. These techniques come from research and innovation in the field of human-computer interaction (HCI). A significant effort is currently being undertaken by the HCI community to apply and extend current usability evaluation techniques to these new kinds of interaction techniques. However, very little has been done to improve the reliability of software offering such interaction techniques. Even testing basic graphical user interfaces remains a challenge that has rarely been addressed in the field of software engineering [9]. Yet the unreliability of interactive software can jeopardize usability evaluation by producing unexpected or undesired behaviors. The aim of this SIG is to provide a forum for both researchers and practitioners interested in testing interactive software. Our goal is to define a roadmap of activities to cross-fertilize usability and reliability testing of these kinds of systems and to minimize duplicated effort in both communities.

  4. Validity and Reliability of Accelerometer-Based Gait Assessment in Patients with Diabetes on Challenging Surfaces

    PubMed Central

    de Bruin, Eling D.; Hubli, Michèle; Hofer, Pamela; Wolf, Peter; Murer, Kurt; Zijlstra, Wiebren

    2012-01-01

    Walking on irregular terrain influences the gait of diabetic patients. We investigated the test-retest reliability and construct validity of gait measured with the DynaPort MiniMod under single- and dual-task conditions in diabetic patients walking on irregular terrain, to identify the measurement error (precision) and the minimal clinically detectable change. Twenty-nine patients with Type 2 diabetes were measured once, and 25 repeated the measurement within 7 days. Patients walked on a therapy garden walkway. Differences between three groups of diabetics with various levels of lower extremity neuropathy were analyzed with planned contrasts. ICCs were excellent for intervisit measurements, with ICCs >0.824. Bland-Altman plots, SEM, and SDD showed precise values, distributed around zero, for both test conditions. A significant effect of grouping on step length performance hints at possible construct validity of the device. Good reliability of DynaPort MiniMod measurements on a therapy garden walkway and an indication of discriminatory capability suggest that the DynaPort MiniMod could facilitate the study of gait in diabetic patients in conditions close to real-life situations. Good reliability, small measurement error, and the values of minimal clinically detectable change recommend the further use of the DynaPort MiniMod for the evaluation of gait parameters in diabetic patients. PMID:22900182

  5. Inter-observer reliability of forceful exertion analysis based on video-recordings.

    PubMed

    Bao, S; Howard, N; Spielholz, P; Silverstein, B

    2010-09-01

    The objectives were to examine the inter-observer reliability of job-level forceful exertion analyses and the temporal agreement of detailed time study results. Three observers performed the analyses on 12 different jobs. Continuous duration, frequency and % time of lifting, pushing/pulling, power and pinch gripping exertions, and the estimated level of the exertions were obtained. Intraclass correlation coefficients and variance components were computed. Temporal agreement analyses of the raw time study data were performed. The inter-observer reliability was good for most job-level exposure parameters (continuous duration, frequency and % time of forceful exertions), but only fair to moderate for the estimated level of forceful exertions. The finding that the between-observer variability was less than the between-exertion variability confirmed that the forceful exertion analysis method used in the present study can detect job exertion differences. Using three observers to perform detailed time studies on task activities and taking the consensus of the majority can increase the between-observer agreement up to 97%. STATEMENT OF RELEVANCE: The results inform researchers that inter-observer reliability for job-level exposure measurement of forceful exertion analysis obtained from detailed time studies is generally good, but the observers' ability to estimate the level of forceful exertion can be poor. It also provides information on the temporal agreement of detailed forceful exertion analysis and guidelines on achieving better agreement for studies where accurate synchronisation of task activities and direct physiological/biomechanical measurements is crucial. PMID:20737338

  6. Based on Weibull Information Fusion Analysis Semiconductors Quality the Key Technology of Manufacturing Execution Systems Reliability

    NASA Astrophysics Data System (ADS)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    The qualified rate of semiconductor materials and products is directly related to manufacturing costs and to the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system reliability growth and thereby improve product quality. Referring to the classical Duane model assumptions and to the TGP tracking-growth forecasting model, a Weibull distribution model is established from the failure data. Combining the median rank method with the average rank method, Weibull information fusion reliability growth curves are fitted by linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimates, and analysis of the failure data shows that the method is essentially consistent with the test-and-evaluation modeling process. In statistics, the median rank is used to determine the distribution function of a random variable, which makes it well suited to problems such as complex systems with limited sample sizes. The method therefore has considerable engineering application value.
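
    The median-rank regression step described above has a standard recipe: assign each ordered failure a median rank (Benard's approximation), linearize the Weibull CDF, and fit by least squares. A minimal sketch on hypothetical failure times:

      import numpy as np

      def weibull_median_rank_fit(failure_times):
          """Two-parameter Weibull fit via median ranks and least squares."""
          t = np.sort(np.asarray(failure_times, dtype=float))
          n = len(t)
          ranks = np.arange(1, n + 1)
          F = (ranks - 0.3) / (n + 0.4)      # Benard's median-rank approximation
          x = np.log(t)
          y = np.log(-np.log(1.0 - F))       # linearized Weibull CDF
          beta, c = np.polyfit(x, y, 1)      # slope = shape beta
          eta = np.exp(-c / beta)            # intercept = -beta * ln(eta)
          return beta, eta

      print(weibull_median_rank_fit([120, 340, 560, 780, 1100]))  # hypothetical hours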

  7. Reliable change indices and standardized regression-based change score norms for evaluating neuropsychological change in children with epilepsy.

    PubMed

    Busch, Robyn M; Lineweaver, Tara T; Ferguson, Lisa; Haut, Jennifer S

    2015-06-01

    Reliable change indices (RCIs) and standardized regression-based (SRB) change score norms permit evaluation of meaningful changes in test scores following treatment interventions, like epilepsy surgery, while accounting for test-retest reliability, practice effects, score fluctuations due to error, and relevant clinical and demographic factors. Although these methods are frequently used to assess cognitive change after epilepsy surgery in adults, they have not been widely applied to examine cognitive change in children with epilepsy. The goal of the current study was to develop RCIs and SRB change score norms for use in children with epilepsy. Sixty-three children with epilepsy (age range: 6-16; M=10.19, SD=2.58) underwent comprehensive neuropsychological evaluations at two time points an average of 12 months apart. Practice effect-adjusted RCIs and SRB change score norms were calculated for all cognitive measures in the battery. Practice effects were quite variable across the neuropsychological measures, with the greatest differences observed among older children, particularly on the Children's Memory Scale and Wisconsin Card Sorting Test. There was also notable variability in test-retest reliabilities across measures in the battery, with coefficients ranging from 0.14 to 0.92. Reliable change indices and SRB change score norms for use in assessing meaningful cognitive change in children following epilepsy surgery are provided for measures with reliability coefficients above 0.50. This is the first study to provide RCIs and SRB change score norms for a comprehensive neuropsychological battery based on a large sample of children with epilepsy. Tables to aid in evaluating cognitive changes in children who have undergone epilepsy surgery are provided for clinical use. An Excel sheet to perform all relevant calculations is also available to interested clinicians or researchers. PMID:26043163
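
    For readers unfamiliar with these indices, the sketch below shows one common formulation of a practice-adjusted RCI (Chelune-style) and an SRB change z-score; the study's actual regression models and normative tables are not reproduced here, and all numbers are hypothetical.

      import math

      def rci_practice_adjusted(x1, x2, m1, m2, sd1, r12):
          """Practice-adjusted reliable change index."""
          sem = sd1 * math.sqrt(1.0 - r12)    # standard error of measurement
          se_diff = math.sqrt(2.0) * sem      # SE of a test-retest difference
          return ((x2 - x1) - (m2 - m1)) / se_diff

      def srb_z(x2_observed, x1, slope, intercept, se_estimate):
          """SRB change score: observed retest vs. retest predicted from baseline."""
          return (x2_observed - (slope * x1 + intercept)) / se_estimate

      # |z| > 1.645 is a typical 90% threshold for 'reliable' change
      print(rci_practice_adjusted(x1=95, x2=105, m1=100, m2=103, sd1=15, r12=0.80))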

  8. Manufacturability and reliability on 10-Gb/s transponder for ethernet-based applications

    NASA Astrophysics Data System (ADS)

    Kao, Min-Sheng; Tsai, Cheng-Hung; Chiu, Chia-Hung; Cheng, Shou-Chien; Shen, Kun-Yi; Huang, Min-Fa; Shaw, Cheng-Da; Lee, Shin-Ge

    2004-05-01

    In this paper, manufacturing issues, including the Optical Sub-Assembly (OSA) and Electrical Sub-Assembly (ESA), and reliability considerations of a 10 Gb/s Ethernet transponder were studied through experiments and implementation. In the growing optical communication industry, one of the star products is the Z-axis pluggable optical transceiver module. Broad Ethernet usage demands high port density, low cost, high utility, and compact size, while still requiring excellent performance. After the standardization of 10 Gb/s Ethernet (IEEE 802.3ae), many transceiver companies, silicon vendors, and system vendors reached agreement and signed a variety of Multi-Source Agreements (MSAs). These MSAs continue to be modified in response to system demands, customer requirements, and cost and performance issues. This paper presents how to meet the functional descriptions in the MSAs with a highly manufacturable and reliable module design. Following the building blocks of the transponder, we split the discussion into OSA, ESA, mechanical design, and the related reliability experimental results. For the OSA, the traditional TO-CAN package and optical components are introduced; owing to mature manufacturing experience, vendors can readily meet low-cost and manufacturability requirements with only slight modifications. A simple solution is implemented to solve this problem, and the critical points of the design are discussed. Thermal issues in the OSA are also addressed, given the sensitivity of the light source, along with how to calculate their effect and find effective solutions. Some manufacturability criteria are discussed for OSA characteristics in 10 Gb/s applications. For the ESA, PMD (physical media dependent) driving methods, the implementation of the MSA-related digital optical monitoring function, and performance comparisons are presented. We also examine the crosstalk effect between the transmitter and receiver circuits and its impact on the module's optical-to-electrical conversion interface design.

  9. Optimal clustering of MGs based on droop controller for improving reliability using a hybrid of harmony search and genetic algorithms.

    PubMed

    Abedini, Mohammad; Moradi, Mohammad H; Hosseinian, S M

    2016-03-01

    This paper proposes a novel method to address the reliability and technical problems of microgrids (MGs), based on designing a number of self-adequate autonomous sub-MGs via an MG clustering approach. In doing so, a multi-objective optimization problem is developed in which power loss reduction, voltage profile improvement, and reliability enhancement are the objective functions. To solve the optimization problem, a hybrid algorithm named HS-GA is provided, based on genetic and harmony search algorithms, and a load flow method is given to model different types of DGs as droop controllers. The performance of the proposed method is evaluated in two case studies. The results provide support for the performance of the proposed method. PMID:26767800

  10. Reliability of a Novel CBCT-Based 3D Classification System for Maxillary Canine Impactions in Orthodontics: The KPG Index

    PubMed Central

    Visconti, Luca; Martin, Conchita

    2013-01-01

    The aim of this study was to evaluate both intra- and interoperator reliability of a radiological three-dimensional classification system (KPG index) for the assessment of degree of difficulty for orthodontic treatment of maxillary canine impactions. Cone beam computed tomography (CBCT) scans of fifty impacted canines, obtained using three different scanners (NewTom, Kodak, and Planmeca), were classified using the KPG index by three independent orthodontists. Measurements were repeated one month later. Based on these two sessions, several recommendations on KPG Index scoring were elaborated. After a joint calibration session, these recommendations were explained to nine orthodontists and the two measurement sessions were repeated. There was a moderate intrarater agreement in the precalibration measurement sessions. After the calibration session, both intra- and interrater agreement were almost perfect. Indexes assessed with Kodak Dental Imaging 3D module software showed a better reliability in z-axis values, whereas indexes assessed with Planmeca Romexis software showed a better reliability in x- and y-axis values. No differences were found between the CBCT scanners used. Taken together, these findings indicate that the application of the instructions elaborated during this study improved KPG index reliability, which was nevertheless variously influenced by the use of different software for images evaluation. PMID:24235889

  11. On Reliable and Efficient Data Gathering Based Routing in Underwater Wireless Sensor Networks.

    PubMed

    Liaqat, Tayyaba; Akbar, Mariam; Javaid, Nadeem; Qasim, Umar; Khan, Zahoor Ali; Javaid, Qaisar; Alghamdi, Turki Ali; Niaz, Iftikhar Azim

    2016-08-30

    This paper presents a cooperative routing scheme, RE-AEDG, to improve data reliability. The proposed protocol achieves its objective, however, at the cost of surplus energy consumption. Sink mobility is therefore introduced to minimize the energy consumption of nodes, as the mobile sink collects data directly from the network nodes over minimized communication distances. We also present delay-optimized and energy-optimized versions of RE-AEDG to further enhance its performance. Simulation results prove the effectiveness of the proposed RE-AEDG in terms of the selected performance metrics. PMID:27589750

  14. Test-Retest Reliability of an Automated Infrared-Assisted Trunk Accelerometer-Based Gait Analysis System

    PubMed Central

    Hsu, Chia-Yu; Tsai, Yuh-Show; Yau, Cheng-Shiang; Shie, Hung-Hai; Wu, Chu-Ming

    2016-01-01

    The aim of this study was to determine the test-retest reliability of an automated infrared-assisted, trunk accelerometer-based gait analysis system for measuring gait parameters of healthy subjects in a hospital. Thirty-five participants (28 of them females; age range, 23–79 years) performed a 5-m walk twice using an accelerometer-based gait analysis system with infrared assist. Measurements of spatiotemporal gait parameters (walking speed, step length, and cadence) and trunk control (gait symmetry, gait regularity, acceleration root mean square (RMS), and acceleration root mean square ratio (RMSR)) were recorded in two separate walking tests conducted 1 week apart. Relative and absolute test-retest reliability was determined by calculating the intra-class correlation coefficient (ICC3,1) and smallest detectable difference (SDD), respectively. The test-retest reliability was excellent for walking speed (ICC = 0.87, 95% confidence interval = 0.74–0.93, SDD = 13.4%), step length (ICC = 0.81, 95% confidence interval = 0.63–0.91, SDD = 12.2%), cadence (ICC = 0.81, 95% confidence interval = 0.63–0.91, SDD = 10.8%), and trunk control (step and stride regularity in anterior-posterior direction, acceleration RMS and acceleration RMSR in medial-lateral direction, and acceleration RMS and stride regularity in vertical direction). An automated infrared-assisted, trunk accelerometer-based gait analysis system is a reliable tool for measuring gait parameters in the hospital environment. PMID:27455281
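
    The relative and absolute reliability indices used above can be computed from a simple two-way ANOVA; a minimal sketch with hypothetical two-session walking speeds (here SDD = 1.96 * sqrt(2) * SEM, the usual definition):

      import numpy as np

      def icc31_and_sdd(data):
          """ICC(3,1) and smallest detectable difference; data is subjects x sessions."""
          x = np.asarray(data, dtype=float)
          n, k = x.shape
          grand = x.mean()
          ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
          ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
          ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
          ms_rows = ss_rows / (n - 1)
          ms_err = ss_err / ((n - 1) * (k - 1))
          icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
          sem = np.sqrt(ms_err)               # absolute error of a single measure
          return icc, 1.96 * np.sqrt(2.0) * sem

      speeds = [[1.21, 1.25], [0.98, 1.02], [1.40, 1.35], [1.10, 1.12], [1.30, 1.28]]
      print(icc31_and_sdd(speeds))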

  15. Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition

    PubMed Central

    Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality-of-service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite web service, and QoS properties identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which involves the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of the transactional composite web service (TCWS) is formalized by CPN properties. To identify the services with the best QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve the dominant web services. This overcomes the significant information loss incurred when individual scores are reduced to an overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431
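
    The skyline step retains exactly those candidates not dominated in any QoS dimension, instead of ranking by a single weighted score; a minimal sketch with hypothetical QoS triples, all oriented so that smaller is better:

      def dominates(a, b):
          """a dominates b: no worse everywhere, strictly better somewhere."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def skyline(services):
          """Keep only non-dominated candidate services."""
          return [s for s in services
                  if not any(dominates(o, s) for o in services if o is not s)]

      # hypothetical (latency_ms, cost, 1 - availability) per candidate service
      candidates = [(120, 0.5, 0.01), (100, 0.7, 0.02), (150, 0.5, 0.01), (90, 0.9, 0.05)]
      print(skyline(candidates))   # (150, 0.5, 0.01) is dominated and dropped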

  16. Multiplication factor for open ground storey buildings-a reliability based evaluation

    NASA Astrophysics Data System (ADS)

    Haran Pragalath, D. C.; Avadhoot, Bhosale; Robin, Davis P.; Pradip, Sarkar

    2016-06-01

    Open Ground Storey (OGS) framed buildings, in which the ground storey is kept open without infill walls, mainly to facilitate parking, are increasingly common in urban areas. However, the vulnerability of this type of building has been exposed in past earthquakes. OGS buildings are conventionally designed by a bare-frame analysis that ignores the stiffness of the infill walls present in the upper storeys, but doing so underestimates the inter-storey drift (ISD) and thereby the force demand in the ground storey columns. Therefore, a multiplication factor (MF) is introduced in various international codes to estimate the design forces (bending moments and shear forces) in the ground storey columns. This study focuses on the seismic performance of typical OGS buildings designed by means of MFs. Probabilistic seismic demand models, fragility curves, and reliability and cost indices for various frame models, including bare frames and fully infilled frames, are developed. It is found that the MF scheme suggested by the Israeli code performs better than those of other international codes in terms of reliability and cost.

  17. Resistive switching memories based on metal oxides: mechanisms, reliability and scaling

    NASA Astrophysics Data System (ADS)

    Ielmini, Daniele

    2016-06-01

    With the explosive growth of digital data in the era of the Internet of Things (IoT), fast and scalable memory technologies are being researched for data storage and data-driven computation. Among the emerging memories, resistive switching memory (RRAM) raises strong interest due to its high speed, high density as a result of its simple two-terminal structure, and low cost of fabrication. The scaling projection of RRAM, however, requires a detailed understanding of switching mechanisms and there are potential reliability concerns regarding small device sizes. This work provides an overview of the current understanding of bipolar-switching RRAM operation, reliability and scaling. After reviewing the phenomenological and microscopic descriptions of the switching processes, the stability of the low- and high-resistance states will be discussed in terms of conductance fluctuations and evolution in 1D filaments containing only a few atoms. The scaling potential of RRAM will finally be addressed by reviewing the recent breakthroughs in multilevel operation and 3D architecture, making RRAM a strong competitor among future high-density memory solutions.

  18. A reliability-based maintenance technicians' workloads optimisation model with stochastic consideration

    NASA Astrophysics Data System (ADS)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2016-12-01

    The growing interest in research on technicians' workloads is probably associated with the recent surge in competition, prompted by unprecedented technological development that triggers changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this intense worldwide competition has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model that considers technicians' reliability and complements the factory information obtained. The information used emerged from technicians' productivity and earned values within a multi-objective modelling approach. Since technicians are expected to carry out both routine and stochastic maintenance work, these workloads are treated as constraints. The influence of technicians' training, fatigue and experiential knowledge on workload management was considered. These workloads were combined with maintenance policy in optimising reliability, productivity and earned values using the goal programming approach. Practical datasets were utilised in studying the applicability of the proposed model in practice. It was observed that the model generates information that practicing maintenance engineers can apply in making more informed decisions on the management of technicians.

  1. Diskless supercomputers: Scalable, reliable I/O for the Tera-Op technology base

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.; Ousterhout, John K.; Patterson, David A.

    1993-01-01

    Computing is seeing an unprecedented improvement in performance; over the last five years there has been an order-of-magnitude improvement in the speeds of workstation CPUs. At least another order of magnitude seems likely in the next five years, to machines with 500 MIPS or more. The goal of the ARPA Teraop program is to realize even larger, more powerful machines, executing as many as a trillion operations per second. Unfortunately, we have seen no comparable breakthroughs in I/O performance; the speeds of I/O devices and the hardware and software architectures for managing them have not changed substantially in many years. We have completed a program of research to demonstrate hardware and software I/O architectures capable of supporting the kinds of internetworked 'visualization' workstations and supercomputers that will appear in the mid 1990s. The project had three overall goals: high performance, high reliability, and a scalable, multipurpose system.

  2. Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire

    NASA Astrophysics Data System (ADS)

    Gladyshev, Pavel; Almansoori, Afrah

    RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.

  3. A reliability study on tin based lead free micro joint including intermetallic and void evolution

    NASA Astrophysics Data System (ADS)

    Feyissa, Frezer Assefa

    In microelectronics, soldering to a Cu pad leads to the formation of two intermetallic structures at the solder-pad interface. The growth of these layers is accompanied by microscopic voids that usually cause reliability concerns in the industry. It is therefore important to understand the factors that contribute to IMC growth under various combinations of reflow time, Sn thickness, and aging temperature. A systematic study was conducted on the Cu-Sn system to investigate the formation and growth of the intermetallic compound (IMC) as well as void evolution for different solder thicknesses. The Cu6Sn5 IMC layer was found to grow as the Sn thickness increased after reflow, while the Cu3Sn layer decreased under the same conditions. After reflow and aging, more voiding was shown to occur in the thin solder than in the thick one.

  4. Reliability of a Cycle Ergometer Peak Power Test in Running-based Team Sport Athletes: A Technical Report.

    PubMed

    Wehbe, George M; Gabbett, Tim J; Hartwig, Timothy B; Mclellan, Christopher P

    2015-07-01

    Given the importance of ensuring athletes train and compete in a nonfatigued state, reliable tests are required to regularly monitor fatigue. The purpose of this study was to investigate the reliability of a cycle ergometer to measure peak power during short maximal sprint cycle efforts in running-based team sport athletes. Fourteen professional male Australian rules footballers performed a sprint cycle protocol during 3 separate trials, with each trial separated by 7 days. The protocol consisted of a standardized warm-up, a maximal 6-second sprint cycle effort, a 1-minute active recovery, and a second maximal 6-second sprint cycle effort. Peak power was recorded as the highest power output of the 2 sprint cycle efforts. Absolute peak power (mean ± SD) was 1502 ± 202, 1498 ± 191, and 1495 ± 210 W for trials 1, 2, and 3, respectively. The mean coefficient of variation, intraclass correlation coefficient, and SE of measurement for peak power between trials was 3.0% (90% confidence intervals [CIs] = 2.5-3.8%), 0.96 (90% CIs = 0.91-0.98), and 39 W, respectively. The smallest worthwhile change for relative peak power was 6.0%, which equated to 1.03 W·kg⁻¹. The cycle ergometer sprint test protocol described in this study is highly reliable in elite Australian rules footballers and can be used to track meaningful changes in performance over time, making it a potentially useful fatigue-monitoring tool.

  5. Reliable Mixed H∞ and Passivity-Based Control for Fuzzy Markovian Switching Systems With Probabilistic Time Delays and Actuator Failures.

    PubMed

    Sakthivel, Rathinasamy; Selvi, Subramaniam; Mathiyalagan, Kalidass; Shi, Peng

    2015-12-01

    This paper is concerned with the problem of reliable mixed H∞ and passivity-based control for a class of stochastic Takagi-Sugeno (TS) fuzzy systems with Markovian switching and probabilistic time-varying delays. Unlike existing works, the H∞ and passivity control problem with probabilistic occurrence of time-varying delays and actuator failures is considered in a unified framework, which is more general in some practical situations. The main aim of this paper is to design a reliable mixed H∞ and passivity-based controller such that the stochastic TS fuzzy system with Markovian switching is stochastically stable with a prescribed mixed H∞ and passivity performance level γ > 0. Based on a Lyapunov-Krasovskii functional (LKF) involving the lower and upper bounds of the probabilistic time delay and a convex combination technique, a new set of delay-dependent sufficient conditions in terms of linear matrix inequalities (LMIs) is established for obtaining the required result. Finally, a numerical example based on the modified truck-trailer model is given to demonstrate the effectiveness and applicability of the proposed design techniques.
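
    A commonly used unified index behind such mixed H∞ and passivity formulations, with a weight θ ∈ [0,1] that recovers the H∞ criterion at θ = 1 and the passivity criterion at θ = 0 (the paper's exact definition may differ in detail), is

      \mathbb{E}\left\{ \int_0^{t_f} \left[ \theta \gamma^{-1} z^{\top}(t) z(t)
          - 2 (1 - \theta)\, z^{\top}(t) w(t) \right] \mathrm{d}t \right\}
      \;\le\; \gamma\, \mathbb{E}\left\{ \int_0^{t_f} w^{\top}(t) w(t)\, \mathrm{d}t \right\}

    where z is the controlled output and w the disturbance; the paper's LMI conditions are sufficient for such a bound to hold for the closed-loop system.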

  6. Feasibility, Reliability, and Validity of a Smartphone Based Application for the Assessment of Cognitive Function in the Elderly

    PubMed Central

    Brouillette, Robert M.; Foil, Heather; Fontenot, Stephanie; Correro, Anthony; Allen, Ray; Martin, Corby K.; Bruce-Keller, Annadora J.; Keller, Jeffrey N.

    2013-01-01

    While considerable knowledge has been gained through the use of established cognitive and motor assessment tools, there is a considerable interest and need for the development of a battery of reliable and validated assessment tools that provide real-time and remote analysis of cognitive and motor function in the elderly. Smartphones appear to be an obvious choice for the development of these “next-generation” assessment tools for geriatric research, although to date no studies have reported on the use of smartphone-based applications for the study of cognition in the elderly. The primary focus of the current study was to assess the feasibility, reliability, and validity of a smartphone-based application for the assessment of cognitive function in the elderly. A total of 57 non-demented elderly individuals were administered a newly developed smartphone application-based Color-Shape Test (CST) in order to determine its utility in measuring cognitive processing speed in the elderly. Validity of this novel cognitive task was assessed by correlating performance on the CST with scores on widely accepted assessments of cognitive function. Scores on the CST were significantly correlated with global cognition (Mini-Mental State Exam: r = 0.515, p<0.0001) and multiple measures of processing speed and attention (Digit Span: r = 0.427, p<0.0001; Trail Making Test: r = −0.651, p<0.00001; Digit Symbol Test: r = 0.508, p<0.0001). The CST was not correlated with naming and verbal fluency tasks (Boston Naming Test, Vegetable/Animal Naming) or memory tasks (Logical Memory Test). Test re-test reliability was observed to be significant (r = 0.726; p = 0.02). Together, these data are the first to demonstrate the feasibility, reliability, and validity of using a smartphone-based application for the purpose of assessing cognitive function in the elderly. The importance of these findings for the establishment of smartphone-based assessment batteries of

  7. Disposable and reliable electrochemical magnetoimmunosensor for Fumonisins simplified determination in maize-based foodstuffs.

    PubMed

    Jodra, Adrián; López, Miguel Ángel; Escarpa, Alberto

    2015-02-15

    An electrochemical magnetoimmunosensor involving magnetic beads and a disposable carbon screen-printed electrode (CSPE) for Fumonisins (FB1, FB2 and FB3) has been developed and evaluated using a certified reference material (CRM) and beer samples. Once the immunochemical reactions took place in the magnetic bead solution, the beads were confined on the surface of the CSPE, where electrochemical detection is achieved through the addition of a suitable substrate and mediator for the enzymatic tracer (horseradish peroxidase, HRP). A remarkable detection limit of 0.33 μg L(-1), outstanding repeatability and reproducibility (RSD(intraday) of 5.6% and 2.9%; RSD(interday) of 6.9% and 6.0%, both for 0 and 5 μg L(-1) FB1, respectively), and excellent accuracy with a recovery rate of 85-96% showed the suggested approach to be a very suitable screening tool for the analysis of Fumonisin B1 and B2 in food samples. A simultaneous simplified calibration and analysis protocol allows fast and reliable determination of Fumonisins in beer samples with a recovery rate of 87-105%. This strategy enhances the analytical merits of the immunosensor approach towards truly disposable tools for food-safety monitoring. PMID:25441412

  8. Tumor Heterogeneity: Mechanisms and Bases for a Reliable Application of Molecular Marker Design

    PubMed Central

    Diaz-Cano, Salvador J.

    2012-01-01

    Tumor heterogeneity is a confusing finding in the assessment of neoplasms, potentially resulting in inaccurate diagnostic, prognostic, and predictive tests. This heterogeneity is not always a random and unpredictable phenomenon, and knowledge of its mechanisms helps in designing better tests. The biologic reasons for intratumoral heterogeneity are therefore important for understanding both the natural history of neoplasms and the selection of test samples for reliable analysis. The main factors contributing to intratumoral heterogeneity, by inducing gene abnormalities or modifying gene expression, include: the gradient of ischemia within neoplasms, the action of the tumor microenvironment (bidirectional interaction between tumor cells and stroma), mechanisms of intercellular transfer of genetic information (exosomes), and differential mechanisms of sequence-independent modification of genetic material and proteins. Intratumoral heterogeneity is at the origin of tumor progression and is also a byproduct of the selection process during progression. Any analysis of heterogeneity mechanisms must be integrated within the process of segregation of genetic changes in tumor cells during the clonal expansion and progression of neoplasms. The evaluation of these mechanisms must also consider the redundancy and pleiotropism of molecular pathways, for which appropriate surrogate markers would support the presence or absence of heterogeneous genetics and the main mechanisms responsible. This knowledge would constitute a solid scientific background for future therapeutic planning. PMID:22408433

  9. Human reliability-based MC&A models for detecting insider theft.

    SciTech Connect

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-06-01

    Material control and accounting (MC&A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC&A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC&A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC&A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.

  10. Nanowire growth process modeling and reliability models for nanodevices

    NASA Astrophysics Data System (ADS)

    Fathi Aghdam, Faranak

    This work is an early attempt that uses a physical-statistical modeling approach to studying selective nanowire growth for the improvement of process yield. In the second research work, the reliability of nano-dielectrics is investigated. As electronic devices get smaller, reliability issues pose new challenges due to unknown underlying physics of failure (i.e., failure mechanisms and modes). This necessitates new reliability analysis approaches related to nano-scale devices. One of the most important nano-devices is the transistor that is subject to various failure mechanisms. Dielectric breakdown is known to be the most critical one and has become a major barrier for reliable circuit design in nano-scale. Due to the need for aggressive downscaling of transistors, dielectric films are being made extremely thin, and this has led to adopting high permittivity (k) dielectrics as an alternative to widely used SiO2 in recent years. Since most time-dependent dielectric breakdown test data on bilayer stacks show significant deviations from a Weibull trend, we have proposed two new approaches to modeling the time to breakdown of bi-layer high-k dielectrics. In the first approach, we have used a marked space-time self-exciting point process to model the defect generation rate. A simulation algorithm is used to generate defects within the dielectric space, and an optimization algorithm is employed to minimize the Kullback-Leibler divergence between the empirical distribution obtained from the real data and the one based on the simulated data to find the best parameter values and to predict the total time to failure. The novelty of the presented approach lies in using a conditional intensity for trap generation in dielectric that is a function of time, space and size of the previous defects. In addition, in the second approach, a k-out-of-n system framework is proposed to estimate the total failure time after the generation of more than one soft breakdown.
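
    The k-out-of-n framework mentioned in the second approach has a closed-form reliability for independent, identical elements; a minimal sketch (the parameters are hypothetical, not fitted to the dielectric data described above):

      from math import comb

      def k_out_of_n_reliability(n, k, p):
          """P(at least k of n elements survive), p = single-element survival prob."""
          return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

      # e.g. the stack fails once fewer than 2 of 3 conduction paths remain intact
      print(k_out_of_n_reliability(n=3, k=2, p=0.95))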

  11. Reliability and Validity of Web-Based Portfolio Peer Assessment: A Case Study for a Senior High School's Students Taking Computer Course

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Tseng, Kuo-Hung; Chou, Pao-Nan; Chen, Yi-Hui

    2011-01-01

    This study examined the reliability and validity of Web-based portfolio peer assessment. Participants were 72 second-grade students from a senior high school taking a computer course. The results indicated that: 1) there was a lack of consistency across various student raters on a portfolio, or inter-rater reliability; 2) two-thirds of the raters…

  12. Problems with Assessment Validity and Reliability in Web-Based Distance Learning Environments and Solutions

    ERIC Educational Resources Information Center

    Wijekumar, Kay; Ferguson, Lon; Wagoner, Diane

    2006-01-01

    Assessment of learning is critical to the learners, teachers, and designers of learning environments. Current assessment techniques in web-based distance learning apply age-old techniques to a new medium and are not adequate for web-based distance learning environments (WBDLE). The goals of this article are to extend existing critiques of…

  13. Skeletal age estimation based on medial clavicle--a test of the method reliability.

    PubMed

    Milenkovic, Petar; Djukic, Ksenija; Djonic, Danijela; Milovanovic, Petar; Djuric, Marija

    2013-05-01

    In order to establish a reliable age indicator for the period when all other epiphyseal age indicators have already been inactivated, the medial clavicle, the bone with the longest period of growth, became the object of various investigations. However, the lack of population-specific methods has often made it unreliable in some regions. The current study involved a Balkan population and was designed to examine whether morphological, radiological, and histological analyses of the medial clavicle can be applied successfully in the age assessment of individuals beyond their twenties in anthropological and forensic practice. The medial clavicular specimens were collected from a contemporary Serbian population, autopsied in the period from 1998 to 2001, encompassing 67 individuals (42 males and 25 females) with an age range from 20 to 90 years. The analyses of morphological features identified the timing of epiphyseal union, signs of lipping in the region of the notch for the first rib, and exostoses and bony overgrowths of the articular surface margin as age-dependent attributes. Trabecular bone volume fraction and minimum trabecular width were also highlighted as age-distinctive microscopic features. Sex differences were ascertainable in the timing of epiphyseal union, the morphology of the notch for the first rib, the margin of the articular surface, and the basic morphology of the articular surface, as well as in two microscopic characteristics: trabecular bone volume fraction and minimum trabecular width. The study identified several age- and sex-related features that can be applied as additional guidance for age estimation in the Serbian population. PMID:23329360

  14. Poor Reliability of Wrist Blood Pressure Self-Measurement at Home: A Population-Based Study.

    PubMed

    Casiglia, Edoardo; Tikhonoff, Valérie; Albertini, Federica; Palatini, Paolo

    2016-10-01

    The reliability of blood pressure measurement with wrist devices, which has not previously been assessed under real-life circumstances in the general population, is dependent on correct positioning of the wrist device at heart level. We determined whether an error was present when blood pressure was self-measured at the wrist in 721 unselected subjects from the general population. After training, blood pressure was measured in the office and self-measured at home with an upper-arm device (the UA-767 Plus) and a wrist device (the UB-542, not provided with a position sensor). The upper-arm-wrist blood pressure difference detected in the office was used as the reference measurement. The discrepancy between the office and home differences was the home measurement error. In the office, systolic blood pressure was 2.5% lower at the wrist than at the arm (P=0.002), whereas at home, systolic and diastolic blood pressures were higher at the wrist than at the arm (+5.6% and +5.4%, respectively; P<0.0001 for both); 621 subjects had a home measurement error of at least ±5 mm Hg and 455 of at least ±10 mm Hg (bad measurers). In multivariable linear regression, a lower cognitive pattern independently determined both the systolic and diastolic home measurement errors, whereas a longer forearm determined the systolic error only. This was confirmed by logistic regression with bad measurers as the dependent variable. The use of wrist devices for home self-measurement therefore leads to frequent detection of falsely elevated blood pressure values, likely because of poor memory and rendition of the instructions, leading to wrong positioning of the wrist.

  15. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost, reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given, as well as a description of the elements that contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  16. Reliable likelihood ratios for statistical model-based voice activity detector with low false-alarm rate

    NASA Astrophysics Data System (ADS)

    Kim, Younggwan; Suh, Youngjoo; Kim, Hoirin

    2011-12-01

    The role of the statistical model-based voice activity detector (SMVAD) is to detect speech regions in input signals using statistical models of noise and noisy speech. The decision rule of the SMVAD is based on the likelihood ratio test (LRT). The LRT-based decision rule may cause detection errors because of the statistical properties of noise and speech signals. In this article, we first analyze the reasons why these detection errors occur and then propose two modified decision rules using reliable likelihood ratios (LRs). We also propose an effective weighting scheme that considers the spectral characteristics of noise and speech signals. In the experiments, the proposed methods show significant performance improvements in various noise conditions with almost no additional computation. Experimental results also show that the proposed weighting scheme provides additional performance improvement over the two proposed SMVADs.
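
    For context, the classic Gaussian SMVAD statistic that such work builds on is the mean of per-bin log likelihood ratios (Sohn-style); the sketch below computes it on toy spectra, while the article's reliable-LR selection and spectral weighting are not reproduced.

      import numpy as np

      def frame_log_likelihood_ratio(power_spec, noise_psd, xi):
          """Per-frame mean log-LR; declare speech if it exceeds a threshold."""
          gamma = power_spec / noise_psd                    # a posteriori SNR per bin
          log_lr = gamma * xi / (1.0 + xi) - np.log1p(xi)   # per-bin log likelihood ratio
          return log_lr.mean()

      rng = np.random.default_rng(0)
      noise_psd = np.full(128, 1.0)
      frame = noise_psd * (1.0 + rng.exponential(2.0, 128))  # toy speech-present frame
      print(frame_log_likelihood_ratio(frame, noise_psd, xi=np.full(128, 2.0)))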

  17. Reliability of genomic evaluations in Holstein-Friesians using haplotypes based on the BovineHD BeadChip.

    PubMed

    Schopen, G C B; Schrooten, C

    2013-01-01

    The objectives of this study were to make subsets of high-density (HD) loci based on localized haplotype clusters, without loss of genomic information, to reduce computing time compared with the use of all HD loci and to investigate the effect on the reliability of the direct genomic value (DGV) when using this HD subset based on localized haplotype clusters in the genomic evaluation for Holstein-Friesians. The DNA was isolated from semen samples of 548 bulls (key ancestors) of the EuroGenomics Consortium, a collaboration between 4 European dairy cattle breeding organizations and scientific partners. These bulls were genotyped with the BovineHD BeadChip [~777,000 (777K) single nucleotide polymorphisms (SNP); Illumina Inc., San Diego, CA] and used to impute all 30,483 Holstein-Friesians from the BovineSNP50 BeadChip [~50,000 (50K) SNP; Illumina Inc.] to HD, using the BEAGLE software package. The final data set consisted of 30,483 animals and 603,145 SNP. For each locus, localized haplotype clusters (i.e., edges of the fitted graph model) identifications were obtained from BEAGLE. Three subsets [38,000 (38K), 116,000 (116K), and 322,000 (322K) loci] were made based on deleting obsolete loci (i.e., loci that do not give extra information compared with the neighboring loci). A fourth data set was based on 38K SNP, which is currently used for routine genomic evaluation at the Cattle Improvement Cooperative (CRV, Arnhem, the Netherlands). A validation study using the HD loci subsets based on localized haplotype clusters was performed for 9 traits (production, conformation, and functional traits). Error of imputation from 50K to HD averaged 0.78%. Three thresholds (0.17, 0.05, and 0.008%) were used for the identification of obsolete HD loci based on localized haplotype clusters to obtain a desired number of HD loci (38K, 116K, and 322K). On average, 46% (using threshold 0.008%) to 93% (using threshold 0.17%) of HD loci were eliminated. The computing time was about 9 d for

  18. Robot-Assisted End-Effector-Based Stair Climbing for Cardiopulmonary Exercise Testing: Feasibility, Reliability, and Repeatability

    PubMed Central

    Stoller, Oliver; Schindelholz, Matthias; Hunt, Kenneth J.

    2016-01-01

    Background Neurological impairments can limit the implementation of conventional cardiopulmonary exercise testing (CPET) and cardiovascular training strategies. A promising approach to provoke cardiovascular stress while facilitating task-specific exercise in people with disabilities is feedback-controlled robot-assisted end-effector-based stair climbing (RASC). The aim of this study was to evaluate the feasibility, reliability, and repeatability of augmented RASC-based CPET in able-bodied subjects, with a view towards future research and applications in neurologically impaired populations. Methods Twenty able-bodied subjects performed a familiarisation session and 2 consecutive incremental CPETs using augmented RASC. Outcome measures focussed on standard cardiopulmonary performance parameters and on accuracy of work rate tracking (RMSEP−root mean square error). Criteria for feasibility were cardiopulmonary responsiveness and technical implementation. Relative and absolute test-retest reliability were assessed by intraclass correlation coefficients (ICC), standard error of the measurement (SEM), and minimal detectable change (MDC). Mean differences, limits of agreement, and coefficients of variation (CoV) were estimated to assess repeatability. Results All criteria for feasibility were achieved. Mean V′O2peak was 106±9% of predicted V′O2max and mean HRpeak was 99±3% of predicted HRmax. 95% of the subjects achieved at least 1 criterion for V′O2max, and the detection of the sub-maximal ventilatory thresholds was successful (ventilatory anaerobic threshold 100%, respiratory compensation point 90% of the subjects). Excellent reliability was found for peak cardiopulmonary outcome measures (ICC ≥ 0.890, SEM ≤ 0.60%, MDC ≤ 1.67%). Repeatability for the primary outcomes was good (CoV ≤ 0.12). Conclusions RASC-based CPET with feedback-guided exercise intensity demonstrated comparable or higher peak cardiopulmonary performance variables relative to

  19. Hardware based redundant multi-threading inside a GPU for improved reliability

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-05-05

    A system and method for verifying computation output using computer hardware are provided. Instances of computation are generated and processed on hardware-based processors. As instances of computation are processed, each instance of computation receives a load accessible to other instances of computation. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware-based processor to ensure accuracy of the output.
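
    A software analogue of the redundancy idea may help picture the mechanism: run independent instances of the same computation and accept the output only when the instances agree. This is only a sketch of N-modular redundancy in Python threads, not the patented GPU hardware mechanism.

```python
from concurrent.futures import ThreadPoolExecutor

def redundant_execute(fn, args, copies=3):
    """Run `copies` independent instances of one computation and verify
    the outputs against each other before accepting the result."""
    with ThreadPoolExecutor(max_workers=copies) as pool:
        outputs = list(pool.map(lambda _: fn(*args), range(copies)))
    if all(out == outputs[0] for out in outputs[1:]):
        return outputs[0]                       # all instances agree
    raise RuntimeError(f"output mismatch: {outputs}")  # fault detected

print(redundant_execute(lambda n: sum(range(n)), (10,)))
```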

  20. Validity and reliability of a dish-based, semi-quantitative food frequency questionnaire for Korean diet and cancer research.

    PubMed

    Park, Min Kyung; Noh, Hwa Young; Song, Na Yeun; Paik, Hee Young; Park, Sohee; Joung, Hyojee; Song, Won O; Kim, Jeongseon

    2012-01-01

    This study evaluated the validity and reliability of applying a newly developed dish-based, semi-quantitative food frequency questionnaire (FFQ) for Korean diet and cancer research. The subjects in the present study were 288 Korean adults over 30 years of age who had completed two FFQs and four 3-day diet records (DRs) from May 2008 to February 2009. Student's t-tests, Chi-square tests, and Spearman's rank correlation coefficients were used to estimate and compare intakes from different dietary assessment tools. Agreement in quintiles was calculated to validate agreement between the results of the second FFQ (FFQ-2) conducted in February 2009 and the DRs. Median Spearman's correlation coefficients between the intake of nutrients and foods assessed by the FFQ-1 and FFQ-2 were 0.59 and 0.57, respectively, and the coefficients between the intake of nutrients and foods assessed by the FFQ-2 and the DRs were 0.31 and 0.29, respectively. Classification into the same or adjacent quintile occurred for 64% of nutrient intakes and 65% of food intakes, respectively. Misclassification into opposite quintiles occurred in less than 5% for all dietary factors. Thus, this newly developed, Korean dish-based FFQ demonstrated moderate correspondence with the four 3-day DRs. Its reliability and validity are comparable to those reported in other studies. PMID:22524822
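
    The validation statistics reported here are straightforward to reproduce. The sketch below, on synthetic intake data (all numbers invented for illustration, not the study's data), computes a Spearman correlation, same-or-adjacent-quintile agreement, and opposite-quintile misclassification.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
ffq = rng.lognormal(3.0, 0.4, 288)            # FFQ-derived intakes
dr = ffq * rng.lognormal(0.0, 0.45, 288)      # diet-record intakes

rho, _ = spearmanr(ffq, dr)
q_ffq = pd.qcut(ffq, 5, labels=False)         # quintile labels 0..4
q_dr = pd.qcut(dr, 5, labels=False)
agree = np.mean(np.abs(q_ffq - q_dr) <= 1)    # same or adjacent quintile
flip = np.mean(((q_ffq == 0) & (q_dr == 4)) | ((q_ffq == 4) & (q_dr == 0)))
print(f"rho = {rho:.2f}, same/adjacent = {agree:.0%}, opposite = {flip:.1%}")
```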

  1. Real-time reliability evaluation methodology based on dynamic Bayesian networks: A case study of a subsea pipe ram BOP system.

    PubMed

    Cai, Baoping; Liu, Yonghong; Ma, Yunpeng; Liu, Zengkai; Zhou, Yuming; Sun, Junhe

    2015-09-01

    A novel real-time reliability evaluation methodology is proposed by combining root cause diagnosis phase based on Bayesian networks (BNs) and reliability evaluation phase based on dynamic BNs (DBNs). The root cause diagnosis phase exactly locates the root cause of a complex mechatronic system failure in real time to increase diagnostic coverage and is performed through backward analysis of BNs. The reliability evaluation phase calculates the real-time reliability of the entire system by forward inference of DBNs. The application of the proposed methodology is demonstrated using a case of a subsea pipe ram blowout preventer system. The value and the variation trend of real-time system reliability when the faults of components occur are studied; the importance degree sequence of components at different times is also determined using mutual information and belief variance. PMID:26169121
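
    A real DBN would encode dependencies among components and absorb diagnostic evidence; the sketch below strips this down to independent components with assumed per-step failure rates, purely to show the forward-inference pass that updates system reliability slice by slice.

```python
import numpy as np

p_fail = np.array([0.002, 0.001, 0.004])  # per-step failure rates (assumed)
p_ok = np.ones_like(p_fail)               # P(component OK) at t = 0

for t in range(1, 101):
    p_ok *= 1.0 - p_fail                  # forward pass, one time slice
    if t % 25 == 0:                       # series system: all must work
        print(f"t = {t:3d}   R_system = {p_ok.prod():.4f}")
```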

  2. The N of 1 in Arts-Based Research: Reliability and Validity

    ERIC Educational Resources Information Center

    Siegesmund, Richard

    2014-01-01

    N signifies the number of data samples in a study. Traditional research values numerous data samples as this reduces the variability created by extremes. Alternatively, arts-based research privileges the outlier, the N of 1. Oftentimes, what is unique and outside the norm is the focus. There are three approaches to the N of 1 in arts-based…

  3. From Fulcher to PLEVALEX: Issues in Interface Design, Validity and Reliability in Internet Based Language Testing

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus

    2007-01-01

    Interface design and ergonomics, while already studied in much of educational theory, have not until recently been considered in language testing (Fulcher, 2003). In this paper, we revise the design principles of PLEVALEX, a fully operational prototype Internet based language testing platform. Our focus here is to show PLEVALEX's interfaces and…

  4. Reliability and Validity of Authentic Assessment in a Web Based Course

    ERIC Educational Resources Information Center

    Olfos, Raimundo; Zulantay, Hildaura

    2007-01-01

    Web-based courses are promising in that they are effective and have the possibility of their instructional design being improved over time. However, the assessments of said courses are criticized in terms of their validity. This paper is an exploratory case study regarding the validity of the assessment system used in a semi-presential web-based…

  5. Child and Adolescent Behaviorally Based Disorders: A Critical Review of Reliability and Validity

    ERIC Educational Resources Information Center

    Mallett, Christopher A.

    2014-01-01

    Objectives: The purpose of this study was to investigate the historical construction and empirical support of two child and adolescent behaviorally based mental health disorders: oppositional defiant and conduct disorders. Method: The study utilized a historiography methodology to review, from 1880 to 2012, these disorders' inclusion in…

  6. Note: Reliable and non-contact 6D motion tracking system based on 2D laser scanners for cargo transportation

    NASA Astrophysics Data System (ADS)

    Kim, Young-Keun; Kim, Kyung-Soo

    2014-10-01

    Maritime transportation demands an accurate measurement system to track the motion of oscillating container boxes in real time. However, it is a challenge to design a sensor system that can provide both reliable and non-contact methods of 6-DOF motion measurements of a remote object for outdoor applications. In the paper, a sensor system based on two 2D laser scanners is proposed for detecting the relative 6-DOF motion of a crane load in real time. Even without implementing a camera, the proposed system can detect the motion of a remote object using four laser beam points. Because it is a laser-based sensor, the system is expected to be highly robust to sea weather conditions.

  7. Note: Reliable and non-contact 6D motion tracking system based on 2D laser scanners for cargo transportation

    SciTech Connect

    Kim, Young-Keun; Kim, Kyung-Soo

    2014-10-15

    Maritime transportation demands an accurate measurement system to track the motion of oscillating container boxes in real time. However, it is a challenge to design a sensor system that can provide both reliable and non-contact methods of 6-DOF motion measurements of a remote object for outdoor applications. In the paper, a sensor system based on two 2D laser scanners is proposed for detecting the relative 6-DOF motion of a crane load in real time. Even without implementing a camera, the proposed system can detect the motion of a remote object using four laser beam points. Because it is a laser-based sensor, the system is expected to be highly robust to sea weather conditions.

  8. Accuracy and reliability of GPS devices for measurement of movement patterns in confined spaces for court-based sports.

    PubMed

    Duffield, Rob; Reid, Machar; Baker, John; Spratford, Wayne

    2010-09-01

    The aim of this study was to assess the accuracy and reliability of global positioning system (GPS) measures of distance and speed, compared to a high-resolution motion analysis system, for confined movement patterns used in many court-based sports. A single male participant performed 10 repetitions of four respective drills replicating court-based movement patterns and six repetitions of a random movement drill that replicated tennis match-play movement patterns. Two 1 Hz and two 5 Hz GPS devices concurrently measured distance covered and speed of all court-based drills. A 22-camera VICON motion analysis system, operating at 100 Hz, tracked the position of an 18 mm reflective marker affixed to one of the GPS devices to provide the criterion movement data. Results indicated that both the 1 and 5 Hz GPS devices under-reported distance covered, as well as both mean and peak speed, compared to the VICON system (P<0.05). The coefficient of variation for both GPS devices for distance and speed measures ranged between 4 and 25%. Further, the faster the speed and the more repetitive the movement pattern (over a similar location), the greater the measurement error. The inter-unit reliability for distance and speed measures of both the 1 and 5 Hz systems for movements in confined spaces was generally low to moderate (r=0.10-0.70). In conclusion, for court-based sports or movements in confined spaces, GPS technology under-reports distance covered and both mean and peak speed of movement.
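
    The two headline statistics, mean bias against the criterion system and the coefficient of variation, reduce to a few lines; the distances below are invented for illustration and are not the study's data.

```python
import numpy as np

vicon = np.array([142.1, 139.8, 141.5, 140.9, 143.0])  # criterion (m)
gps = np.array([128.4, 131.7, 125.9, 130.2, 127.5])    # GPS distance (m)

bias = np.mean((gps - vicon) / vicon) * 100    # % under-reporting
cv = np.std(gps, ddof=1) / np.mean(gps) * 100  # inter-trial variation
print(f"mean bias = {bias:.1f}%, CV = {cv:.1f}%")
```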

  9. Test-Retest Reliability and Convergent Validity of a Computer Based Hand Function Test Protocol in People with Arthritis

    PubMed Central

    Srikesavan, Cynthia S.; Shay, Barbara; Szturm, Tony

    2015-01-01

    Objectives: A computer based hand function assessment tool has been developed to provide a standardized method for quantifying task performance during manipulations of common objects/tools/utensils with diverse physical properties and grip/grasp requirements for handling. The study objectives were to determine test-retest reliability and convergent validity of the test protocol in people with arthritis. Methods: Three different object manipulation tasks were evaluated twice in forty people with rheumatoid arthritis (RA) or hand osteoarthritis (HOA). Each object was instrumented with a motion sensor and moved in concert with a computer generated visual target. Self-reported joint pain and stiffness levels were recorded before and after each task. Task performance was determined by comparing the object movement with the computer target motion. This was correlated with grip strength, the nine hole peg test, the Disabilities of Arm, Shoulder, and Hand (DASH) questionnaire, and the Health Assessment Questionnaire (HAQ) scores. Results: The test protocol indicated moderate to high test-retest reliability of performance measures for the three manipulation tasks, with intraclass correlation coefficients (ICCs) ranging from 0.5 to 0.84, p<0.05. Strength of association between task performance measures and self-reported activity/participation composite scores was low to moderate (Spearman rho <0.7). Low correlations (Spearman rho < 0.4) were observed between task performance measures and grip strength, and among the three objects’ performance measures. Significant reduction in pain and joint stiffness (p<0.05) was observed after performing each task. Conclusion: The study presents initial evidence on the test-retest reliability and convergent validity of a computer based hand function assessment protocol in people with rheumatoid arthritis or hand osteoarthritis. The novel tool objectively measures overall task performance during a variety of object manipulation tasks done by tracking a

  10. A measurement-based model of software reliability in a production environment

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.

    1987-01-01

    In this paper, a semi-Markov model is built to describe the software error and recovery process in a large mainframe system. The model is based on low-level error data from the MVS operating system running on an IBM 3081 machine. The semi-Markov model developed provides a quantification of system error characteristics and the interaction between different types of errors. As an example, a detailed model is provided, and an analysis is made of multiple errors, which constitute approximately 17 percent of all software errors and result in considerable recovery overhead.
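
    The flavor of such a model can be conveyed with a toy semi-Markov simulation: states with non-exponential sojourn times and a transition matrix. All numbers below are invented for illustration and are not the MVS error data.

```python
import numpy as np

rng = np.random.default_rng(6)
# States: 0 = normal, 1 = error, 2 = recovery
P = np.array([[0.0, 1.0, 0.0],        # normal -> error
              [0.0, 0.0, 1.0],        # error -> recovery
              [0.9, 0.1, 0.0]])       # recovery -> normal or a new error
mean_sojourn = [3600.0, 1.0, 15.0]    # seconds per state (assumed)

s, t, overhead = 0, 0.0, 0.0
while t < 7 * 24 * 3600:              # one simulated week
    dwell = rng.gamma(2.0, mean_sojourn[s] / 2.0)  # non-exponential sojourn
    if s != 0:
        overhead += dwell             # time spent in error/recovery
    t += dwell
    s = rng.choice(3, p=P[s])
print(f"error/recovery overhead = {overhead / t:.3%} of system time")
```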

  11. The Impact of Grain Boundaries on the Reliability and Performance of Polysilicon-Based Devices

    NASA Astrophysics Data System (ADS)

    Suryanarayana, Bhattacharya S.

    Three different polysilicon-based device structures have been studied, each exploiting some property of the grain boundaries in polysilicon: electrical, diffusion or microstructural. In the first sub-task, hot-carrier-induced degradation of polysilicon-on-oxide LDD P-MOSFETs that find application as load elements in 3-D integrated CMOS SRAMs has been studied. It has been shown that in addition to hot-electron trapping that occurs similar to that reported in bulk P-MOSFETs, polysilicon P-MOSFETs show a new hot-carrier-induced degradation mechanism due to the breaking of Si-H bonds by hot carriers, which increases the grain boundary trap density at the H-passivated grain boundaries in polysilicon. A new model based on Thermionic Field Emission (TFE) via grain boundary traps has been developed to explain the temperature and bias dependence of the leakage current, which is not described by the existing model based on field emission. In the second sub-task, ion-implanted polysilicon is used as a solid diffusion source and a self-aligned emitter contact to form polysilicon emitter bipolar transistors. The impact of ex situ chemical cleans such as an RCA clean and an RCA clean followed by an HF dip prior to polysilicon deposition has been studied. It has been shown that the RCA clean by itself is not desirable because of the poor controllability of electrical parameters as well as the high emitter resistance resulting from the thicker interfacial oxide compared to that obtained with the HF clean. The use of an in situ clean using a cluster tool for polysilicon deposition to incorporate controllable amounts of oxide at the polysilicon/silicon interface has been explored to show for the first time that by having increasing thicknesses of interfacial oxide, and subsequently annealing to partially break up the oxide, one can obtain improved current gains without paying a penalty of higher emitter resistance. Finally, a new double polysilicon-gate-based P-I-N MOSFET

  12. SPARK: Sparsity-based analysis of reliable k-hubness and overlapping network structure in brain functional connectivity.

    PubMed

    Lee, Kangjoo; Lina, Jean-Marc; Gotman, Jean; Grova, Christophe

    2016-07-01

    Functional hubs are defined as the specific brain regions with dense connections to other regions in a functional brain network. Among them, connector hubs are of great interest, as they are assumed to promote global and hierarchical communications between functionally specialized networks. Damage to connector hubs may have a more crucial effect on the system than does damage to other hubs. Hubs in graph theory are often identified from a correlation matrix, and classified as connector hubs when the hubs are more connected to regions in other networks than within the networks to which they belong. However, the identification of hubs from functional data is more complex than that from structural data, notably because of the inherent problem of multicollinearity between temporal dynamics within a functional network. In this context, we developed and validated a method to reliably identify connectors and the corresponding overlapping network structure from resting-state fMRI. This new method directly handles the multicollinearity issue, since it does not rely on counting the number of connections from a thresholded correlation matrix. The novelty of the proposed method is that besides counting the number of networks involved in each voxel, it allows us to identify which networks are actually involved in each voxel, using a data-driven sparse general linear model in order to identify brain regions involved in more than one network. Moreover, we added a bootstrap resampling strategy to assess statistically the reproducibility of our results at the single subject level. The unified framework is called SPARK, i.e. SParsity-based Analysis of Reliable k-hubness, where k-hubness denotes the number of networks overlapping in each voxel. The accuracy and robustness of SPARK were evaluated using two-dimensional box simulations and realistic simulations that examined detection of artificial hubs generated on real data. Then, test/retest reliability of the method was assessed

  13. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    PubMed

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted for assuring the good quality of waters, soils or sediments and achieving desirable environmental quality objectives. Therefore, evaluating the reliability of available data is of significant importance when analysing their possible use in the aforementioned processes. The thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws, but at the same time various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but it can be easily tailored to other environmental compartments (soil, air, sediments).

  14. Reliable Alignment in Total Knee Arthroplasty by the Use of an iPod-Based Navigation System

    PubMed Central

    Koenen, Paola; Schneider, Marco M.; Fröhlich, Matthias; Driessen, Arne; Bouillon, Bertil; Bäthis, Holger

    2016-01-01

    Axial alignment is one of the main objectives in total knee arthroplasty (TKA). Computer-assisted surgery (CAS) is more accurate regarding limb alignment reconstruction compared to the conventional technique. The aim of this study was to analyse the precision of the innovative navigation system DASH® by Brainlab and to evaluate the reliability of intraoperatively acquired data. A retrospective analysis was performed of 40 patients who underwent CAS TKA using the iPod-based navigation system DASH. Pre- and postoperative axial alignment were measured on standardized radiographs by two independent observers. These data were compared with the navigation data. Furthermore, interobserver reliability was measured. The duration of surgery was monitored. The mean difference between the preoperative mechanical axis by X-ray and the first intraoperatively measured limb axis by the navigation system was 2.4°. The postoperative X-rays showed a mean difference of 1.3° compared to the final navigation measurement. According to radiographic measurements, 88% of arthroplasties had a postoperative limb axis within ±3°. The mean additional time needed for navigation was 5 minutes. The DASH system showed very good precision, comparable to established navigation devices, with only a negligible expenditure of time compared to conventional TKA. PMID:27313898

  15. Network-based identification of reliable bio-markers for cancers.

    PubMed

    Deng, Shiguo; Qi, Jingchao; Stephen, Mutua; Qiu, Lu; Yang, Huijie

    2015-10-21

    Finding bio-markers for complex diseases from gene expression profiles has attracted extensive attention for its potential use in diagnosis, therapy, and drug design. In this paper we propose a network-based method to seek high-confidence bio-markers from candidate genes collected in the literature. The algorithm includes three consecutive steps. First, bio-markers proposed in the literature are collected as preliminary candidates; second, a spanning-tree-based threshold is used to reconstruct gene networks for normal and cancer samples; third, low-confidence genes are filtered out by jointly using degree changes and the distribution of the candidates in communities. The surviving candidates are high-confidence genes. Specifically, we consider expression profiles for carcinoma of the colon. A total of 34 preliminary bio-markers collected from the literature are evaluated, and a set of 16 genes is proposed as high-confidence bio-markers, which show high performance in distinguishing normal and cancer samples.

  16. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the minimum value of the EDF statistic. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
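
    The scheme described here, minimizing an EDF statistic over the three Weibull parameters with Powell's method, can be sketched as follows; the synthetic failure data and starting values are assumptions, and SciPy's Powell implementation stands in for the original routine.

```python
import numpy as np
from scipy import optimize, stats

def ks_statistic(params, data):
    """Kolmogorov-Smirnov distance between the EDF of `data` and a
    three-parameter Weibull CDF (shape, location, scale)."""
    shape, loc, scale = params
    if shape <= 0 or scale <= 0 or loc >= data.min():
        return 1e9                              # penalty keeps search feasible
    x = np.sort(data)
    cdf = stats.weibull_min.cdf(x, shape, loc=loc, scale=scale)
    n = len(x)
    return max(np.max(np.arange(1, n + 1) / n - cdf),   # D+
               np.max(cdf - np.arange(0, n) / n))       # D-

data = stats.weibull_min.rvs(2.0, loc=5.0, scale=3.0, size=60, random_state=7)
res = optimize.minimize(ks_statistic, x0=[1.5, 4.0, 2.5], args=(data,),
                        method="Powell")
print("shape, loc, scale =", np.round(res.x, 3), " D =", round(res.fun, 4))
```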

  17. Reliability, Compliance and Security of Web-based Pre/Post-testing

    NASA Astrophysics Data System (ADS)

    Bonham, Scott

    2007-01-01

    Pre/post testing is an important tool for improving science education. Standard in-class administration has drawbacks such as 'lost' class time and converting data into electronic format. These are not issues for unproctored web-based administration, but there are concerns about assessment validity, compliance rates, and instrument security. A preliminary investigation compared astronomy students taking pre/post tests on paper to those taking the same tests over the web. The assessments included the Epistemological Beliefs Assessment for Physical Science and a conceptual assessment developed for this study. Preliminary results on validity show no significant difference on scores or on most individual questions. Compliance rates were similar between web and paper on the pretest and much better for web on the posttest. Remote monitoring of student activity during the assessments recorded no clear indication of any copying, printing or saving of questions, and no widespread use of the web to search for answers.

  18. Towards a reliable and high sensitivity O₂-independent glucose sensor based on Ir oxide nanoparticles.

    PubMed

    Campbell, H B; Elzanowska, H; Birss, V I

    2013-04-15

    The primary goal of this work is the development of a rapidly responding, sensitive, and biocompatible Ir oxide (IrOx)-based glucose sensor that regenerates solely via IrOx-mediation in both O₂-free and aerobic environments. An important discovery is that, for films composed of IrOx nanoparticles, Nafion® and glucose oxidase (GOx), a Michaelis-Menten constant (K'(m)) of 20-30 mM is obtained in the case of dual-regeneration (O₂ and IrOx), while K'(m) values are much smaller (3-5 mM) when re-oxidation of GOx occurs only through IrOx-mediation. These smaller K'(m) values indicate that the regeneration of GOx via direct electron transfer to the IrOx nanoparticles is more rapid than to O₂. Small K'(m) values, which are obtained more commonly when Nafion® is not present in the films, are also important for the accurate measurement of low glucose concentrations under hypoglycemic conditions. In this work, the sensing film was also optimized for miniaturization. Depending on the IrOx and GOx surface loadings and the use of sonication before film deposition, the i(max) values ranged from 5 to 225 μA cm⁻², showing very good sensitivity down to 0.4 mM glucose. PMID:23261690
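
    For readers unfamiliar with the Michaelis-Menten analysis used here, estimating K'm and i(max) is a simple nonlinear fit of current against concentration; the calibration points below are invented for illustration and are not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(c, i_max, k_m):
    """Steady-state sensor current versus glucose concentration."""
    return i_max * c / (k_m + c)

c = np.array([0.4, 1.0, 2.0, 5.0, 10.0, 20.0, 30.0])        # glucose, mM
i = np.array([9.0, 21.0, 38.0, 70.0, 100.0, 125.0, 135.0])  # uA cm^-2

(i_max, k_m), _ = curve_fit(michaelis_menten, c, i, p0=[150.0, 10.0])
print(f"i_max = {i_max:.0f} uA cm^-2, K'm = {k_m:.1f} mM")
```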

  19. Improved Membrane-Based Sensor Network for Reliable Gas Monitoring in the Subsurface

    PubMed Central

    Lazik, Detlef; Ebert, Sebastian

    2012-01-01

    A conceptually improved sensor network to monitor the partial pressure of CO2 in different soil horizons was designed. Consisting of five membrane-based linear sensors (line-sensors), each 10 m in length, the set-up enables us to integrate over the locally fluctuating CO2 concentrations (typically below 5%vol) up to the meter scale, gaining valuable concentration means with a repetition time of about 1 min. Preparatory tests in the laboratory resulted in an unexpectedly high accuracy of better than 0.03%vol, compared with the previously published 0.08%vol. Thereby, the statistical uncertainties (standard deviations) of the line-sensors and the reference sensor (nondispersive infrared CO2-sensor) were close to each other. Whereas the uncertainty of the reference increases with the measurement value, the line-sensors show an inverse uncertainty trend, resulting in a comparatively enhanced accuracy for concentrations >1%vol. Furthermore, a method for in situ maintenance was developed, enabling verification of sensor quality and effective calibration without demounting the line-sensors from the soil, which would disturb the established structures and ongoing processes. PMID:23235447

  1. The reliability of differentiating neurogenic claudication from vascular claudication based on symptomatic presentation

    PubMed Central

    Nadeau, Mélissa; Rosas-Arellano, M. Patricia; Gurr, Kevin R.; Bailey, Stewart I.; Taylor, David C.; Grewal, Ruby; Lawlor, D. Kirk; Bailey, Chris S.

    2013-01-01

    Background Intermittent claudication can be neurogenic or vascular. Physicians use a profile based on symptom attributes to differentiate the 2 types of claudication, and this guides their investigations for diagnosis of the underlying pathology. We evaluated the validity of these symptom attributes in differentiating neurogenic from vascular claudication. Methods Patients with a diagnosis of lumbar spinal stenosis (LSS) or peripheral vascular disease (PVD) who reported claudication answered 14 questions characterizing their symptoms. We determined the sensitivity, specificity and positive and negative likelihood ratios (PLR and NLR) for neurogenic and vascular claudication for each symptom attribute. Results We studied 53 patients. The most sensitive symptom attribute to rule out LSS was the absence of “triggering of pain with standing alone” (sensitivity 0.97, NLR 0.050). Pain alleviators and symptom location data showed a weak clinical significance for LSS and PVD. Constellations of symptoms yielded the strongest associations: patients with a positive shopping cart sign whose symptoms were located above the knees, triggered with standing alone and relieved with sitting had a strong likelihood of neurogenic claudication (PLR 13). Patients with symptoms in the calf that were relieved with standing alone had a strong likelihood of vascular claudication (PLR 20.0). Conclusion The classic symptom attributes used to differentiate neurogenic from vascular claudication are at best weakly valid independently. However, certain constellations of symptoms are much more indicative of etiology. These results can guide general practitioners in their evaluation of and investigation for claudication. PMID:24284143
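
    The reported measures derive from a standard 2x2 diagnostic table. A minimal sketch, with invented counts rather than the study's data, is:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    plr = sens / (1.0 - spec)        # positive likelihood ratio
    nlr = (1.0 - sens) / spec        # negative likelihood ratio
    return sens, spec, plr, nlr

sens, spec, plr, nlr = diagnostic_metrics(tp=30, fn=1, fp=12, tn=10)
print(f"sens = {sens:.2f}, spec = {spec:.2f}, PLR = {plr:.1f}, NLR = {nlr:.3f}")
```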

  2. SAR-based sea traffic monitoring: a reliable approach for maritime surveillance

    NASA Astrophysics Data System (ADS)

    Renga, Alfredo; Graziano, Maria D.; D'Errico, M.; Moccia, A.; Cecchini, A.

    2011-11-01

    Maritime surveillance problems are drawing the attention of multiple institutional actors. National and international security agencies are interested in matters like maritime traffic security, maritime pollution control, monitoring migration flows and detection of illegal fishing activities. Satellite imaging is a good way to identify ships, but because the imaged swaths are large, the scenes are likely to contain a large number of ships, the vast majority of which, hopefully, are performing legal activities. Therefore, the imaging system needs a supporting system which identifies legal ships and limits the number of potential alarms to be further monitored by patrol boats or aircraft. In this framework, spaceborne Synthetic Aperture Radar (SAR) sensors, terrestrial AIS and the ongoing satellite AIS systems can represent a great potential synergy for maritime security. Starting from this idea, the paper develops different designs for an AIS constellation able to reduce the time lag between SAR image and AIS data acquisition. An analysis of SAR-based ship detection algorithms is also reported and candidate algorithms identified.

  3. Improving the reliability of road materials based on micronized sulfur composites

    NASA Astrophysics Data System (ADS)

    Abdrakhmanova, K. K.

    2015-01-01

    The work contains the results of a nano-structural modification of sulfur that prevents polymorphic transformations from influencing the properties of sulfur composites; the sulfur is present in a thermodynamically stable condition that precludes destruction in service. It has been established that the properties of sulfur-based composite materials can be significantly improved by modifying the sulfur and structuring the sulfur binder with nano-dispersed fiber particles and an ultra-dispersed filler. The paper shows the possibility of modifying Tengiz sulfur by fragmenting it, which ensures that the structured sulfur is structurally changed and stabilized through reinforcement by ultra-dispersed fiber particles, multiplying the phase contact area. Interaction between the nano-dispersed fibers of chrysotile asbestos and sulfur allows the mechanical properties of the chrysotile asbestos tubes to be realized in the reinforced composite, and its integrity is preserved provided that the surfaces of the chrysotile asbestos tubes are highly wetted by molten sulfur and there is high adhesion between the tubes and the matrix, which, in addition to sulfur, contains limestone microparticles. The ability to serve under severe operating conditions, including exposure to both aggressive media and mechanical loads, makes the produced sulfur composites attractive to the road construction industry.

  4. Reliability of neuronal information conveyed by unreliable neuristor-based leaky integrate-and-fire neurons: a model study

    PubMed Central

    Lim, Hyungkwang; Kornijcuk, Vladimir; Seok, Jun Yeong; Kim, Seong Keun; Kim, Inho; Hwang, Cheol Seong; Jeong, Doo Seok

    2015-01-01

    We conducted simulations on the neuronal behavior of neuristor-based leaky integrate-and-fire (NLIF) neurons. Phase-plane analysis of the NLIF neuron highlights its spiking dynamics, determined by two nullclines conditional on the variables on the plane. Particular emphasis was placed on the operational noise arising from the variability of the threshold switching behavior in the neuron on each switching event. As a consequence, we found that the NLIF neuron exhibits a Poisson-like noise in spiking, delimiting the reliability of the information conveyed by individual NLIF neurons. To highlight neuronal information coding at a higher level, a population of noisy NLIF neurons was analyzed with regard to the probability of successful information decoding given the Poisson-like noise of each neuron. The result demonstrates highly probable success in decoding in spite of the large variability of individual neurons caused by the variability of the threshold switching behavior. PMID:25966658
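
    The operational-noise mechanism can be mimicked with an ordinary leaky integrate-and-fire model whose firing threshold is redrawn at every switching event, producing irregular spike intervals; all parameters below are illustrative assumptions, not the neuristor model itself.

```python
import numpy as np

rng = np.random.default_rng(4)
dt, tau = 1e-4, 20e-3                  # time step (s), membrane time constant
v, drive = 0.0, 100.0                  # membrane state, constant input
theta = rng.normal(1.0, 0.15)          # noisy threshold, redrawn per spike
spikes = []

for step in range(int(1.0 / dt)):      # simulate one second
    v += dt * (-v / tau + drive)       # leaky integration
    if v >= theta:
        spikes.append(step * dt)
        v = 0.0                        # reset after the switching event
        theta = rng.normal(1.0, 0.15)  # threshold variability per event

isi = np.diff(spikes)                  # inter-spike intervals
print(f"{len(spikes)} spikes, CV(ISI) = {isi.std() / isi.mean():.2f}")
```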

  5. Observer-based reliable stabilization of uncertain linear systems subject to actuator faults, saturation, and bounded system disturbances.

    PubMed

    Fan, Jinhua; Zhang, Youmin; Zheng, Zhiqiang

    2013-11-01

    A matrix inequality approach is proposed to reliably stabilize a class of uncertain linear systems subject to actuator faults, saturation, and bounded system disturbances. The system states are assumed immeasurable, and a classical observer is incorporated for observation to enable state-based feedback control. Both the stability and stabilization of the closed-loop system are discussed and the closed-loop domain of attraction is estimated by an ellipsoidal invariant set. The resultant stabilization conditions in the form of matrix inequalities enable simultaneous optimization of both the observer gain and the feedback controller gain, which is realized by converting the non-convex optimization problem to an unconstrained nonlinear programming problem. The effectiveness of proposed design techniques is demonstrated through a linearized model of F-18 HARV around an operating point.

  6. Reliability of information-based integration of EEG and fMRI data: a simulation study.

    PubMed

    Assecondi, Sara; Ostwald, Dirk; Bagshaw, Andrew P

    2015-02-01

    Most studies involving simultaneous electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) data rely on the first-order, affine-linear correlation of EEG and fMRI features within the framework of the general linear model. An alternative is the use of information-based measures such as mutual information and entropy, which can also detect higher-order correlations present in the data. The estimate of information-theoretic quantities might be influenced by several parameters, such as the sample size, the amount of correlation between variables, and the discretization (or binning) strategy of choice. While these issues have been investigated for invasive neurophysiological data and a number of bias-correction estimates have been developed, there has been no attempt to systematically examine the accuracy of information estimates for the multivariate distributions arising in the context of EEG-fMRI recordings. This is especially important given the differences between electrophysiological and EEG-fMRI recordings. In this study, we drew random samples from simulated bivariate and trivariate distributions, mimicking the statistical properties of EEG-fMRI data. We compared the estimated information shared by simulated random variables with its numerical value and found that the interaction between the binning strategy and the estimation method influences the accuracy of the estimate. Conditional on the simulation assumptions, we found that the equipopulated binning strategy yields the best and most consistent results across distributions and bias correction methods. We also found that among bias correction techniques, the asymptotically debiased (TPMC), the jackknife debiased (JD), and the best upper bound (BUB) approaches give similar results, and those are consistent across distributions.
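
    A plug-in mutual information estimate with the equipopulated (quantile) binning favored by this study can be sketched as follows; the bin count and the synthetic correlated features are assumptions, and no bias correction is applied.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in MI estimate (bits) using equipopulated (quantile) bins."""
    ex = np.quantile(x, np.linspace(0, 1, bins + 1))   # quantile bin edges
    ey = np.quantile(y, np.linspace(0, 1, bins + 1))
    ix = np.clip(np.searchsorted(ex, x, side="right") - 1, 0, bins - 1)
    iy = np.clip(np.searchsorted(ey, y, side="right") - 1, 0, bins - 1)
    pxy = np.zeros((bins, bins))
    np.add.at(pxy, (ix, iy), 1.0)                      # joint histogram
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)                # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(3)
x = rng.normal(size=2000)                       # e.g. an EEG feature
y = 0.6 * x + 0.8 * rng.normal(size=2000)       # correlated fMRI feature
print(f"MI ~ {mutual_information(x, y):.3f} bits")  # theory: ~0.32 bits
```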

  7. Observational measures of implementer fidelity for a school-based preventive intervention: development, reliability, and validity.

    PubMed

    Cross, Wendi; West, Jennifer; Wyman, Peter A; Schmeelk-Cone, Karen; Xia, Yinglin; Tu, Xin; Teisl, Michael; Brown, C Hendricks; Forgatch, Marion

    2015-01-01

    Current measures of implementer fidelity often fail to adequately measure core constructs of adherence and competence, and their relationship to outcomes can be mixed. To address these limitations, we used observational methods to assess these constructs and their relationships to proximal outcomes in a randomized trial of a school-based preventive intervention (Rochester Resilience Project) designed to strengthen emotion self-regulation skills in first-third graders with elevated aggressive-disruptive behaviors. Within the intervention group (n = 203), a subsample (n = 76) of students was selected to reflect the overall sample. Implementers were 10 paraprofessionals. Videotaped observations of three lessons from year 1 of the intervention (14 lessons) were coded for each implementer-child dyad on adherence (content) and competence (quality). Using multilevel modeling, we examined how much of the variance in the fidelity measures was attributed to implementer and to the child within implementer. Both measures had large and significant variance accounted for by implementer (competence, 68 %; adherence, 41 %); child within implementer did not account for significant variance indicating that ratings reflected stable qualities of the implementer rather than the child. Raw adherence and competence scores shared 46 % of variance (r = .68). Controlling for baseline differences and age, the amount (adherence) and quality (competence) of program delivered predicted children's enhanced response to the intervention on both child and parent reports after 6 months, but not on teacher report of externalizing behavior. Our findings support the use of multiple observations for measuring fidelity and that adherence and competence are important components of fidelity which could be assessed by many programs using these methods. PMID:24736951

  8. Evolution of conductive filament and its impact on reliability issues in oxide-electrolyte based resistive random access memory

    PubMed Central

    Lv, Hangbing; Xu, Xiaoxin; Liu, Hongtao; Liu, Ruoyu; Liu, Qi; Banerjee, Writam; Sun, Haitao; Long, Shibing; Li, Ling; Liu, Ming

    2015-01-01

    The electrochemical metallization (ECM) cell, also referred to as conductive bridge random access memory, is considered to be a promising candidate for, or complementary component to, traditional charge-based memory. As such, it is receiving additional focus to accelerate the commercialization process. To create a successful mass product, reliability issues must first be rigorously solved. In-depth understanding of the failure behavior of the ECM is essential for performance optimization. Here, we reveal that degradation of the high resistance state accounts for the majority of endurance failures of the HfO2 electrolyte-based ECM cell. High resolution transmission electron microscopy was used to characterize the change in filament nature after repetitive switching cycles. The result showed that Cu accumulation inside the filament played a dominant role in switching failure, which was further supported by measuring the retention of the cycle-dependent high resistance state and low resistance state. The clarified physical picture of filament evolution provides a basic understanding of the mechanisms of endurance and retention failure, and the relationship between them. Based on these results, applicable approaches for performance optimization can be developed, ranging from material tailoring to structure engineering and algorithm design. PMID:25586207

  9. A stochastic simulation-optimization approach for estimating highly reliable soil tension threshold values in sensor-based deficit irrigation

    NASA Astrophysics Data System (ADS)

    Kloss, S.; Schütze, N.; Walser, S.; Grundmann, J.

    2012-04-01

    In arid and semi-arid regions where water is scarce, farmers rely heavily on irrigation in order to grow crops and to produce agricultural commodities. The variable and often severely limited water supply poses a serious challenge for farmers and demands sophisticated irrigation strategies that allow efficient management of the available water resources. The general aim is to increase water productivity (WP), and one strategy to achieve this goal is controlled deficit irrigation (CDI). One way to realize CDI is by defining soil-water-status-specific threshold values (either in soil tension or moisture) at which irrigation cycles are triggered. When utilizing CDI, irrigation control is of utmost importance, and yet thresholds are often chosen by trial and error and are thus unreliable. Hence, for CDI to be effective, systematic investigations for deriving reliable threshold values that account for different CDI strategies are needed. In this contribution, a method is presented that uses a simulation-based stochastic approach for estimating threshold values with a high reliability. The approach consists of a weather generator that lends statistical significance to site-specific climate series, an optimization algorithm that determines optimal threshold values under limited water supply, and a crop model for simulating plant growth and water consumption. The study focuses on threshold values of soil tension for different CDI strategies. The advantage of soil-tension-based threshold values over soil-moisture-based ones lies in their universal, soil-type-independent applicability. The investigated CDI strategies comprised schedules of constant threshold values, crop-development-stage-dependent threshold values, and different minimum irrigation intervals. For practical reasons, fixed irrigation schedules were tested as well. Additionally, a full irrigation schedule served as reference. The obtained threshold values were then tested in field

  10. Feasibility of AmbulanCe-Based Telemedicine (FACT) Study: Safety, Feasibility and Reliability of Third Generation In-Ambulance Telemedicine

    PubMed Central

    Yperzeele, Laetitia; Van Hooff, Robbert-Jan; De Smedt, Ann; Valenzuela Espinoza, Alexis; Van Dyck, Rita; Van de Casseye, Rohny; Convents, Andre; Hubloue, Ives; Lauwaert, Door; De Keyser, Jacques; Brouns, Raf

    2014-01-01

    Background Telemedicine is currently mainly applied as an in-hospital service, but this technology also holds potential to improve emergency care in the prehospital arena. We report on the safety, feasibility and reliability of in-ambulance teleconsultation using a telemedicine system of the third generation. Methods A routine ambulance was equipped with a system for real-time bidirectional audio-video communication, automated transmission of vital parameters, glycemia and electronic patient identification. All patients (≥18 years) transported during emergency missions by a Prehospital Intervention Team of the Universitair Ziekenhuis Brussel were eligible for inclusion. To guarantee mobility and to facilitate 24/7 availability, the teleconsultants used lightweight laptop computers to access a dedicated telemedicine platform, which also provided functionalities for neurological assessment, electronic reporting and prehospital notification of the in-hospital team. Key registrations included any safety issue, mobile connectivity, communication of patient information, audiovisual quality, user-friendliness and accuracy of the prehospital diagnosis. Results Prehospital teleconsultation was obtained in 41 out of 43 cases (95.3%). The success rates for communication of blood pressure, heart rate, blood oxygen saturation, glycemia, and electronic patient identification were 78.7%, 84.8%, 80.6%, 64.0%, and 84.2%. A preliminary prehospital diagnosis was formulated in 90.2%, with satisfactory agreement with final in-hospital diagnoses. Communication of a prehospital report to the in-hospital team was successful in 94.7% and prenotification of the in-hospital team via SMS in 90.2%. Failures resulted mainly from limited mobile connectivity and to a lesser extent from software, hardware or human error. The user acceptance was high. Conclusions Ambulance-based telemedicine of the third generation is safe, feasible and reliable but further research and development, especially

  11. Tutorial on use of intraclass correlation coefficients for assessing intertest reliability and its application in functional near-infrared spectroscopy-based brain imaging

    NASA Astrophysics Data System (ADS)

    Li, Lin; Zeng, Li; Lin, Zi-Jing; Cazzell, Mary; Liu, Hanli

    2015-05-01

    Test-retest reliability of neuroimaging measurements is an important concern in the investigation of cognitive functions in the human brain. To date, intraclass correlation coefficients (ICCs), originally used in inter-rater reliability studies in behavioral sciences, have become commonly used metrics in reliability studies on neuroimaging and functional near-infrared spectroscopy (fNIRS). However, as there are six popular forms of ICC, the adequateness of the comprehensive understanding of ICCs will affect how one may appropriately select, use, and interpret ICCs toward a reliability study. We first offer a brief review and tutorial on the statistical rationale of ICCs, including their underlying analysis of variance models and technical definitions, in the context of assessment on intertest reliability. Second, we provide general guidelines on the selection and interpretation of ICCs. Third, we illustrate the proposed approach by using an actual research study to assess intertest reliability of fNIRS-based, volumetric diffuse optical tomography of brain activities stimulated by a risk decision-making protocol. Last, special issues that may arise in reliability assessment using ICCs are discussed and solutions are suggested.
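
    For concreteness, one of the six forms, ICC(2,1) in the Shrout and Fleiss notation (two-way random effects, absolute agreement, single measurement), can be computed from ANOVA mean squares as sketched below on synthetic test-retest scores; the data and effect sizes are invented for illustration.

```python
import numpy as np

def icc_2_1(Y):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement (Shrout & Fleiss), from ANOVA mean squares."""
    n, k = Y.shape
    grand = Y.mean()
    ms_r = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)  # subjects
    ms_c = n * np.sum((Y.mean(axis=0) - grand) ** 2) / (k - 1)  # sessions
    resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0) + grand
    ms_e = np.sum(resid ** 2) / ((n - 1) * (k - 1))             # error
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(5)
trait = rng.normal(0.0, 1.0, (20, 1))            # stable subject effect
scores = trait + rng.normal(0.0, 0.4, (20, 2))   # test and retest sessions
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```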

  12. Tutorial on use of intraclass correlation coefficients for assessing intertest reliability and its application in functional near-infrared spectroscopy-based brain imaging.

    PubMed

    Li, Lin; Zeng, Li; Lin, Zi-Jing; Cazzell, Mary; Liu, Hanli

    2015-05-01

    Test-retest reliability of neuroimaging measurements is an important concern in the investigation of cognitive functions in the human brain. To date, intraclass correlation coefficients (ICCs), originally used in interrater reliability studies in behavioral sciences, have become commonly used metrics in reliability studies on neuroimaging and functional near-infrared spectroscopy (fNIRS). However, as there are six popular forms of ICC, the adequateness of the comprehensive understanding of ICCs will affect how one may appropriately select, use, and interpret ICCs toward a reliability study. We first offer a brief review and tutorial on the statistical rationale of ICCs, including their underlying analysis of variance models and technical definitions, in the context of assessment on intertest reliability. Second, we provide general guidelines on the selection and interpretation of ICCs. Third, we illustrate the proposed approach by using an actual research study to assess intertest reliability of fNIRS-based, volumetric diffuse optical tomography of brain activities stimulated by a risk decision-making protocol. Last, special issues that may arise in reliability assessment using ICCs are discussed and solutions are suggested. PMID:25992845

  13. Search for a reliable nucleic acid force field using neutron inelastic scattering and quantum mechanical calculations: Bases, nucleosides and nucleotides

    SciTech Connect

    Leulliot, Nicolas; Ghomi, Mahmoud; Jobic, Herve

    1999-06-15

    Neutron inelastic scattering (NIS), IR and Raman spectra of the RNA constituents: bases, nucleosides and nucleotides have been analyzed. The complementary aspects of these different experimental techniques make them especially powerful for assigning the vibrational modes of the molecules of interest. Geometry optimization and harmonic force field calculations of these molecules have been undertaken by quantum mechanical calculations at several theoretical levels: Hartree-Fock (HF), Møller-Plesset second-order perturbation theory (MP2) and Density Functional Theory (DFT). In all cases, it has been shown that HF calculations lead to insufficient results for accurately assigning the intramolecular vibrational modes. In the case of the nucleic bases, these discrepancies could be satisfactorily removed by introducing the correlation effects at the MP2 level. However, the application of the MP2 procedure to large molecules such as nucleosides and nucleotides is impractical, taking into account the prohibitive computational time needed. On the basis of our results, calculations at the DFT level using the B3LYP exchange and correlation functional appear to be a cost-effective alternative for obtaining a reliable force field for the whole set of nucleic acid constituents.

  14. A reliable and inexpensive method of nucleic acid extraction for the PCR-based detection of diverse plant pathogens.

    PubMed

    Li, R; Mock, R; Huang, Q; Abad, J; Hartung, J; Kinard, G

    2008-12-01

    A reliable extraction method is described for the preparation of total nucleic acids from at least ten plant genera for subsequent detection of plant pathogens by PCR-based techniques. The method combines a modified CTAB (cetyltrimethylammonium bromide) extraction protocol with a semi-automatic homogenizer (FastPrep instrument) for rapid sample processing and a low potential for cross-contamination. The method was applied to sample preparation for PCR-based detection of 28 different RNA and DNA viruses, six viroids, two phytoplasmas and two bacterial pathogens from a range of infected host plants including sweet potato, small fruits and fruit trees. The procedure is cost-effective, and the quality of the nucleic acid preparations is comparable to that of preparations made with commonly used commercial kits. The efficiency of the procedure permits processing of numerous samples and the use of a single nucleic acid preparation for testing both RNA and DNA genomes by PCR, making this an appealing method for testing multiple pathogens in certification and quarantine programs.

  15. Reliability Prediction

    NASA Technical Reports Server (NTRS)

    1993-01-01

    RELAV, a NASA-developed computer program, enables Systems Control Technology, Inc. (SCT) to predict the performance of aircraft subsystems. RELAV provides a system-level evaluation of a technology. A system, for example a landing gear mechanism, is first described as a set of components performing a specific function. RELAV analyzes the total system and the individual subsystem probabilities to predict success probability and reliability. This information is then translated into operational support and maintenance requirements. SCT provides research and development services in support of government contracts.
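
    The component-to-system roll-up described here rests on elementary series/parallel probability rules; the sketch below (with invented component reliabilities, and not RELAV's actual algorithm) shows the arithmetic.

```python
def series(*r):
    """All components must work (probabilities multiply)."""
    p = 1.0
    for ri in r:
        p *= ri
    return p

def parallel(*r):
    """At least one redundant component must work."""
    q = 1.0
    for ri in r:
        q *= 1.0 - ri
    return 1.0 - q

# Landing-gear-flavored example: redundant actuators in series with a
# sensor and a controller (all reliabilities invented for illustration)
r_system = series(parallel(0.95, 0.95), 0.999, 0.990)
print(f"system success probability = {r_system:.4f}")
```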

  16. A GIS-based assessment of the suitability of SCIAMACHY satellite sensor measurements for estimating reliable CO concentrations in a low-latitude climate.

    PubMed

    Fagbeja, Mofoluso A; Hill, Jennifer L; Chatterton, Tim J; Longhurst, James W S

    2015-02-01

    An assessment of the reliability of Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY) satellite sensor measurements for interpolating tropospheric concentrations of carbon monoxide, considering the low-latitude climate of the Niger Delta region in Nigeria, was conducted. Monthly SCIAMACHY carbon monoxide (CO) column measurements from January 2003 to December 2005 were interpolated using the ordinary kriging technique. The spatio-temporal variations observed in the reliability were based on proximity to the Atlantic Ocean, seasonal variations in the intensities of rainfall and relative humidity, the presence of dust particles from the Sahara desert, industrialization in Southwest Nigeria and biomass burning during the dry season in Northern Nigeria. Spatial reliabilities of 74 and 42% are observed for the inland and coastal areas, respectively. Temporally, average reliabilities of 61 and 55% occur during the dry and wet seasons, respectively. Reliability in the inland and coastal areas was 72 and 38% during the wet season, and 75 and 46% during the dry season, respectively. Based on the results, the WFM-DOAS SCIAMACHY CO data product used for this study is relevant to the assessment of CO concentrations in developing countries within the low latitudes that cannot afford monitoring infrastructure due to the high costs required. Although the SCIAMACHY sensor is no longer available, it provided cost-effective, reliable and accessible data that could support air quality assessment in developing countries. PMID:25626562
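
    A minimal, self-contained version of the ordinary kriging step (with an assumed exponential variogram and toy CO columns, not the WFM-DOAS data or the study's variogram model) might look like:

```python
import numpy as np

def ordinary_kriging(xy, z, target, sill=1.0, a=2.0, nugget=0.05):
    """Ordinary kriging with an exponential variogram
    gamma(h) = nugget + sill * (1 - exp(-h / a))."""
    def gamma(h):
        return nugget + sill * (1.0 - np.exp(-h / a))
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0                       # unbiasedness constraint block
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)           # n weights + Lagrange multiplier
    return w[:n] @ z

rng = np.random.default_rng(2)
pts = rng.uniform(0.0, 10.0, (50, 2))                    # sample locations
co = 1.8 + 0.1 * pts[:, 0] + rng.normal(0.0, 0.05, 50)   # toy CO columns
print(f"CO at (5, 5) ~ {ordinary_kriging(pts, co, np.array([5.0, 5.0])):.2f}")
```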

  17. Reliability and validity of an ultrasound-based imaging method for measuring interspinous process distance in the lumbar spine using two different index points.

    PubMed

    Tozawa, Ryosuke; Katoh, Munenori; Aramaki, Hidefumi; Kumamoto, Tsuneo; Fujinawa, Osamu

    2015-07-01

    [Purpose] This study assessed the reliability and validity of an ultrasound-based imaging method for measuring the interspinous process distance in the lumbar spine using two different index points. [Subjects and Methods] Ten healthy males were recruited. Five physical therapy students participated in this study as examiners. The L2-L3 interspinous distance was measured from the caudal end of the L2 spinous process to the cranial end of the L3 spinous process (E-E measurement) and from the top of the L2 spinous process to the top of the L3 spinous process (T-T measurement). Intraclass correlation coefficients were calculated to estimate the relative reliability. Validity was assessed using a model resembling the living human body. [Results] The reliability study showed no difference in intra-rater reliability between the two measurements. However, the E-E measurement showed higher inter-rater reliability than the T-T measurement (Intraclass correlation coefficients: 0.914 vs. 0.725). Moreover, the E-E measurement method had good validity (Intraclass correlation coefficients: 0.999 and 95% confidence interval for minimal detectable change: 0.29 mm). [Conclusion] These results demonstrate the high reliability and validity of ultrasound-based imaging in the quantitative assessment of lumbar interspinous process distance. Of the two methods, the E-E measurement method is recommended.

  18. Reliability-based design optimization of reinforced concrete structures including soil-structure interaction using a discrete gravitational search algorithm and a proposed metamodel

    NASA Astrophysics Data System (ADS)

    Khatibinia, M.; Salajegheh, E.; Salajegheh, J.; Fadaee, M. J.

    2013-10-01

    A new discrete gravitational search algorithm (DGSA) and a metamodelling framework are introduced for reliability-based design optimization (RBDO) of reinforced concrete structures. The RBDO of structures with soil-structure interaction (SSI) effects is investigated in accordance with performance-based design. The proposed DGSA is based on the standard gravitational search algorithm (GSA) and optimizes the structural cost under deterministic and probabilistic constraints. The Monte-Carlo simulation (MCS) method is considered the most reliable method for estimating failure probabilities. In order to reduce the computational time of MCS, the proposed metamodelling framework is employed to predict the responses of the SSI system in the RBDO procedure. The metamodel consists of a weighted least squares support vector machine (WLS-SVM) with a wavelet kernel function, and is called WWLS-SVM. Numerical results demonstrate the efficiency and computational advantages of DGSA and the proposed metamodel for RBDO of reinforced concrete structures.
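
    The MCS step the abstract relies on can be illustrated with a crude Monte Carlo estimator of a failure probability. The limit-state function and the lognormal resistance/load model below are invented for illustration; the actual constraints involve the SSI system responses predicted by the WWLS-SVM metamodel.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mc_failure_probability(g, sample, n=100_000):
        """Estimate Pf = P[g(X) <= 0] by crude Monte Carlo simulation."""
        x = sample(n)
        pf = (g(x) <= 0.0).mean()
        se = np.sqrt(pf * (1 - pf) / n)   # sampling standard error
        return pf, se

    # Illustrative limit state: resistance R minus load effect S.
    pf, se = mc_failure_probability(
        g=lambda x: x[:, 0] - x[:, 1],
        sample=lambda n: np.column_stack([rng.lognormal(5.0, 0.1, n),
                                          rng.lognormal(4.5, 0.2, n)]))
    ```

    The slow 1/sqrt(n) convergence of this estimator is exactly why the authors wrap it in a metamodel: each g-evaluation would otherwise be a full SSI analysis.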

  19. Reliability analysis of charge plasma based double material gate oxide (DMGO) SiGe-on-insulator (SGOI) MOSFET

    NASA Astrophysics Data System (ADS)

    Pradhan, K. P.; Sahu, P. K.; Singh, D.; Artola, L.; Mohapatra, S. K.

    2015-09-01

    A novel device, a charge-plasma-based doping-less double material gate oxide (DMGO) silicon-germanium-on-insulator (SGOI) double gate (DG) MOSFET, is proposed for the first time. The fundamental objective of this work is to modify the channel potential, electric field, and electron velocity in order to improve the leakage current, transconductance (gm), and transconductance generation factor (TGF). Using 2-D simulation, we show that the DMGO-SGOI MOSFET exhibits a higher electron velocity at the source side and a lower electric field at the drain side as compared with an ultra-thin body (UTB) DG MOSFET. The DMGO-SGOI MOSFET also demonstrates a significant improvement in gm and TGF in comparison to the UTB-DG MOSFET. This work further establishes the existence of a biasing point, the zero temperature coefficient (ZTC) bias point, at which the device parameters become independent of temperature. The impact of operating temperature (T) on the aforementioned performance metrics is also analyzed extensively. This further validates the reliability of the charge plasma DMGO SGOI MOSFET and its suitability for analog/RF circuits over a wide range of operating temperatures.

  20. Reliability study of Au-in solid-liquid interdiffusion bonding for GaN-based vertical LED packaging

    NASA Astrophysics Data System (ADS)

    Sung, Ho-Kun; Wang, Cong; Kim, Nam-Young

    2015-12-01

    An In-rich Au-In bonding system has been developed to transfer vertical light-emitting diodes (VLEDs) from a sapphire to a graphite substrate and enable them to survive n-ohmic contact treatment at 350 °C. The bonding temperature is 210 °C, and three intermetallic compounds are detected: AuIn, AuIn2, and the γ phase. As a result, the remelting temperature increases beyond the theoretical value of 450 °C given by the Au-In binary phase diagram. Reliability testing showed that joints obtained by rapid thermal annealing at 400 °C for 1 min survived, whereas those obtained at 500 °C for 1 min failed. Finally, a GaN-based blue VLED was transferred to the graphite substrate by means of the proposed bonding method, and its average light output power was measured to be 386.6 mW (at 350 mA) after n-ohmic contact treatment. This wafer-level bonding technique also shows excellent potential for high-temperature packaging applications.

  1. Empirically based comparisons of the reliability and validity of common quantification approaches for eyeblink startle potentiation in humans.

    PubMed

    Bradford, Daniel E; Starr, Mark J; Shackman, Alexander J; Curtin, John J

    2015-12-01

    Startle potentiation is a well-validated translational measure of negative affect. Startle potentiation is widely used in clinical and affective science, and there are multiple approaches for its quantification. The three most commonly used approaches quantify startle potentiation as the increase in startle response from a neutral to threat condition based on (1) raw potentiation, (2) standardized potentiation, or (3) percent-change potentiation. These three quantification approaches may yield qualitatively different conclusions about effects of independent variables (IVs) on affect when within- or between-group differences exist for startle response in the neutral condition. Accordingly, we directly compared these quantification approaches in a shock-threat task using four IVs known to influence startle response in the no-threat condition: probe intensity, time (i.e., habituation), alcohol administration, and individual differences in general startle reactivity measured at baseline. We confirmed the expected effects of time, alcohol, and general startle reactivity on affect using self-reported fear/anxiety as a criterion. The percent-change approach displayed apparent artifact across all four IVs, which raises substantial concerns about its validity. Both raw and standardized potentiation approaches were stable across probe intensity and time, which supports their validity. However, only raw potentiation displayed effects that were consistent with a priori specifications and/or the self-report criterion for the effects of alcohol and general startle reactivity. Supplemental analyses of reliability and validity for each approach provided additional evidence in support of raw potentiation.
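
    A minimal sketch of the three quantifications, assuming per-subject condition means as input; the exact standardization used by the authors (for example, within-subject z-scoring) may differ.

    ```python
    import numpy as np

    def quantify_potentiation(neutral, threat):
        """neutral, threat: (n_subjects,) mean startle magnitudes."""
        raw = threat - neutral
        pooled_sd = np.sqrt((neutral.var(ddof=1) + threat.var(ddof=1)) / 2)
        standardized = raw / pooled_sd
        percent_change = 100.0 * raw / neutral  # unstable when neutral is small
        return raw, standardized, percent_change
    ```

    The comment on the last line is the crux of the finding: dividing by the neutral-condition response builds the IVs' effects on baseline startle into the "potentiation" score itself.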

  2. Reliability-based automatic repeat request for short code modulation visual evoked potentials in brain computer interfaces.

    PubMed

    Sato, Jun-Ichi; Washizawa, Yoshikazu

    2015-08-01

    We propose two methods to improve code modulation visual evoked potential brain-computer interfaces (cVEP BCIs). Most BCIs average brain signals from several trials in order to improve classification performance. The number of averaged trials defines the trade-off between input speed and accuracy, and the optimal number depends on the individual, the signal acquisition system, and so forth. First, we propose a novel dynamic method to estimate the averaging number for cVEP BCIs. The proposed method is based on the automatic repeat request (ARQ) scheme used in communication systems. Existing cVEP BCIs employ rather long codes, such as the 63-bit M-sequence; the code length also defines a trade-off between input speed and accuracy. Since the reliability of the proposed BCI can be controlled by the ARQ method, we introduce shorter codes: a 32-bit M-sequence and the Kasami sequence. By combining the dynamic averaging-number estimation method with the shorter codes, the proposed system exhibited a higher information transfer rate than existing cVEP BCIs.
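
    The ARQ idea can be sketched as a stopping rule on the running trial average; acquire_epoch, classify, the confidence threshold, and max_trials below are assumed interfaces for illustration, not the authors' exact estimator.

    ```python
    def arq_classify(acquire_epoch, classify, threshold=0.9, max_trials=10):
        """Reliability-based ARQ for a cVEP BCI: keep averaging epochs
        until the classifier's confidence exceeds a threshold.
        acquire_epoch() -> one (channels, samples) array;
        classify(avg)   -> (label, confidence in [0, 1])."""
        total = None
        for k in range(1, max_trials + 1):
            epoch = acquire_epoch()
            total = epoch if total is None else total + epoch
            label, conf = classify(total / k)   # average of k trials
            if conf >= threshold:               # reliable enough: stop (ACK)
                return label, k
        return label, max_trials                # give up after max repeats
    ```

    Because the stopping rule, not the code length, now controls reliability, a short 32-bit sequence can be decoded with few repeats for strong responders and more repeats for weak ones.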

  3. Curriculum-based measurement of oral reading: A preliminary investigation of confidence interval overlap to detect reliable growth.

    PubMed

    Van Norman, Ethan R

    2016-09-01

    Curriculum-based measurement of oral reading (CBM-R) progress monitoring data is used to measure student response to instruction. Federal legislation permits educators to use CBM-R progress monitoring data as a basis for determining the presence of specific learning disabilities. However, the decision-making frameworks originally developed for CBM-R progress monitoring data were not intended for such high-stakes assessments. Numerous documented issues with trend line estimation undermine the validity of using slope estimates to infer progress. One proposed recommendation is to use confidence interval overlap as a means of judging reliable growth. This project explored the degree to which confidence interval overlap was related to true growth magnitude using simulation methodology. True and observed CBM-R scores were generated across 7 durations of data collection (range 6-18 weeks), 3 levels of dataset quality or residual variance (5, 10, and 15 words read correct per minute), and 2 types of data collection schedules. Descriptive and inferential analyses were conducted to explore interactions between overlap status, progress monitoring scenarios, and true growth magnitude. A small but statistically significant interaction was observed between overlap status, duration, and dataset quality, b = -0.004, t(20992) = -7.96, p < .001. In general, confidence interval overlap does not appear to meaningfully account for variance in true growth across many progress monitoring conditions. Implications for research and practice are discussed. Limitations and directions for future research are addressed.
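
    The generic overlap check on slope confidence intervals can be sketched as follows; which two intervals the study compared (for example, a student's slope against a criterion growth rate) is not specified here, so the pairing is left to the caller.

    ```python
    import numpy as np
    from scipy import stats

    def slope_ci(weeks, wcpm, level=0.95):
        """OLS slope and confidence interval for weekly CBM-R scores."""
        res = stats.linregress(weeks, wcpm)
        t = stats.t.ppf(0.5 + level / 2, df=len(weeks) - 2)
        return res.slope - t * res.stderr, res.slope + t * res.stderr

    def intervals_overlap(a, b):
        """True when two (low, high) confidence intervals overlap."""
        return a[0] <= b[1] and b[0] <= a[1]
    ```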

  4. Safety reliability evaluation when vehicles turn right from urban major roads onto minor ones based on driver's visual perception.

    PubMed

    Yu, Bo; Chen, Yuren; Wang, Ruiyun; Dong, Yongjie

    2016-10-01

    Turning right has a significant impact on urban road traffic safety. Entering the curve inappropriately or at an improper turning speed often leads to potential accidents and hidden dangers. For a long time, the design speed at intersections has been used to determine the physical radius of curbs and channelization, and drivers are expected to drive in accordance with the design speed. However, a large number of real-vehicle tests show that, for roads without an exclusive right-turn lane, right-turn speeds correlate poorly with the physical radius of curbs. In this paper, shape parameters of the driver's visual lane model are put forward, and these have relatively high correlations with right-turn speeds. Hence, a method is proposed for evaluating the safety reliability of turning right from urban major roads onto minor ones based on the driver's visual perception. For existing roads, the evaluation object can be real driving videos; for roads under construction, the evaluation object can be visual scenes obtained from a driving simulator. The findings of this research contribute to the optimization of right-turn design at intersections and to the development of driver-assistance technology.

  5. Empirically based comparisons of the reliability and validity of common quantification approaches for eyeblink startle potentiation in humans

    PubMed Central

    Bradford, Daniel E.; Starr, Mark J.; Shackman, Alexander J.

    2015-01-01

    Startle potentiation is a well‐validated translational measure of negative affect. Startle potentiation is widely used in clinical and affective science, and there are multiple approaches for its quantification. The three most commonly used approaches quantify startle potentiation as the increase in startle response from a neutral to threat condition based on (1) raw potentiation, (2) standardized potentiation, or (3) percent‐change potentiation. These three quantification approaches may yield qualitatively different conclusions about effects of independent variables (IVs) on affect when within‐ or between‐group differences exist for startle response in the neutral condition. Accordingly, we directly compared these quantification approaches in a shock‐threat task using four IVs known to influence startle response in the no‐threat condition: probe intensity, time (i.e., habituation), alcohol administration, and individual differences in general startle reactivity measured at baseline. We confirmed the expected effects of time, alcohol, and general startle reactivity on affect using self‐reported fear/anxiety as a criterion. The percent‐change approach displayed apparent artifact across all four IVs, which raises substantial concerns about its validity. Both raw and standardized potentiation approaches were stable across probe intensity and time, which supports their validity. However, only raw potentiation displayed effects that were consistent with a priori specifications and/or the self‐report criterion for the effects of alcohol and general startle reactivity. Supplemental analyses of reliability and validity for each approach provided additional evidence in support of raw potentiation. PMID:26372120

  6. Spatiotemporal variation of long-term drought propensity through reliability-resilience-vulnerability based Drought Management Index

    NASA Astrophysics Data System (ADS)

    Chanda, Kironmala; Maity, Rajib; Sharma, Ashish; Mehrotra, Rajeshwar

    2014-10-01

    This paper characterizes the long-term, spatiotemporal variation of drought propensity through a newly proposed Drought Management Index (DMI), and explores its predictability in order to assess future drought propensity and adapt drought management policies for a location. The DMI was developed using the reliability-resilience-vulnerability (RRV) rationale commonly used in water resources systems analysis, under the assumption that depletion of soil moisture across a vertical soil column is equivalent to the operation of a water supply reservoir, and that drought should be managed not simply using a measure of system reliability, but should also take into account the readiness of the system to bounce back from drought to a normal state. Considering India as a test bed, 50-year-long monthly gridded (0.5° Lat × 0.5° Lon) soil moisture data are used to compute the RRV at each grid location falling within the study domain. The Permanent Wilting Point (PWP) is used as the threshold, indicative of transition into water stress. The association between resilience and vulnerability is then characterized through their joint probability distribution, ascertained using Plackett copula models for four broad soil types across India. The joint cumulative distribution functions (CDF) of resilience and vulnerability form the basis for estimating the DMI as a five-yearly time series at each grid location assessed. The status of the DMI over the past 50 years indicates that drought propensity is consistently low toward the northern and northeastern parts of India but higher in the western part of peninsular India. Based on the observed past behavior of the DMI series on a climatological time scale, a DMI prediction model comprising deterministic and stochastic components is developed. The predictability of DMI for a lead time of 5 years is found to vary across India, with a Pearson correlation coefficient between observed and predicted DMI above 0.6 over most of the study area.
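
    The RRV building blocks can be sketched with the standard Hashimoto-style definitions, with failure defined as soil moisture below the PWP; the paper's DMI additionally combines resilience and vulnerability through Plackett copulas, which is not reproduced here.

    ```python
    import numpy as np

    def rrv(soil_moisture, pwp):
        """Reliability, resilience, and vulnerability of a monthly series."""
        sm = np.asarray(soil_moisture, float)
        fail = sm < pwp
        reliability = 1.0 - fail.mean()
        # Resilience: chance that a failed month is followed by a normal one.
        n_fail = fail[:-1].sum()
        resilience = (fail[:-1] & ~fail[1:]).sum() / n_fail if n_fail else 1.0
        # Vulnerability: mean moisture deficit below the PWP during failures.
        vulnerability = (pwp - sm[fail]).mean() if fail.any() else 0.0
        return reliability, resilience, vulnerability
    ```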

  7. Linear Interaction Energy Based Prediction of Cytochrome P450 1A2 Binding Affinities with Reliability Estimation

    PubMed Central

    Capoferri, Luigi; Verkade-Vreeker, Marlies C. A.; Buitenhuis, Danny; Commandeur, Jan N. M.; Pastor, Manuel; Vermeulen, Nico P. E.; Geerke, Daan P.

    2015-01-01

    Prediction of human Cytochrome P450 (CYP) binding affinities of small ligands, i.e., substrates and inhibitors, represents an important task for predicting drug-drug interactions. A quantitative assessment of the ligand binding affinity towards different CYPs can provide an estimate of inhibitory activity or an indication of isoforms prone to interact with the substrate of inhibitors. However, the accuracy of global quantitative models for CYP substrate binding or inhibition based on traditional molecular descriptors can be limited, because of the lack of information on the structure and flexibility of the catalytic site of CYPs. Here we describe the application of a method that combines protein-ligand docking, Molecular Dynamics (MD) simulations and Linear Interaction Energy (LIE) theory, to allow for quantitative CYP affinity prediction. Using this combined approach, a LIE model for human CYP 1A2 was developed and evaluated, based on a structurally diverse dataset for which the estimated experimental uncertainty was 3.3 kJ mol⁻¹. For the computed CYP 1A2 binding affinities, the model showed a root mean square error (RMSE) of 4.1 kJ mol⁻¹ and a standard error in prediction (SDEP) in cross-validation of 4.3 kJ mol⁻¹. A novel approach that includes information on both structural ligand description and protein-ligand interaction was developed for estimating the reliability of predictions, and was able to identify compounds from an external test set with a SDEP for the predicted affinities of 4.6 kJ mol⁻¹ (corresponding to 0.8 pKi units). PMID:26551865
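
    For orientation, the LIE functional form is a two-parameter linear model of the averaged interaction energies from the bound and free MD simulations. The coefficients below are the classic defaults and are assumptions; in the study the model was calibrated against the experimental CYP 1A2 affinities.

    ```python
    def lie_binding_free_energy(dv_vdw, dv_ele, alpha=0.18, beta=0.5, gamma=0.0):
        """LIE estimate: dG = alpha*<dV_vdw> + beta*<dV_ele> + gamma,
        where dV_* are bound-minus-free averages of the ligand-surroundings
        van der Waals and electrostatic interaction energies (kJ/mol)."""
        return alpha * dv_vdw + beta * dv_ele + gamma
    ```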

  8. Physics-Based Stress Corrosion Cracking Component Reliability Model cast in an R7-Compatible Cumulative Damage Framework

    SciTech Connect

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Toloczko, Mychailo B.; Johnson, Kenneth I.; Sanborn, Scott E.

    2011-07-01

    This is a working report drafted under the Risk-Informed Safety Margin Characterization pathway of the Light Water Reactor Sustainability Program, describing statistical models of passive component reliabilities.

  9. Trunk-acceleration based assessment of gait parameters in older persons: a comparison of reliability and validity of four inverted pendulum based estimations.

    PubMed

    Zijlstra, Agnes; Zijlstra, Wiebren

    2013-09-01

    Inverted pendulum (IP) models of human walking allow for wearable motion-sensor based estimations of spatio-temporal gait parameters during unconstrained walking in daily-life conditions. At present it is unclear to what extent different IP based estimations yield different results, and reliability and validity have not been investigated in older persons without a specific medical condition. The aim of this study was to compare reliability and validity of four different IP based estimations of mean step length in independent-living older persons. Participants were assessed twice and walked at different speeds while wearing a tri-axial accelerometer at the lower back. For all step-length estimators, test-retest intra-class correlations approached or were above 0.90. Intra-class correlations with reference step length were above 0.92 with a mean error of 0.0 cm when (1) multiplying the estimated center-of-mass displacement during a step by an individual correction factor in a simple IP model, or (2) adding an individual constant for bipedal stance displacement to the estimated displacement during single stance in a 2-phase IP model. When applying generic corrections or constants in all subjects (i.e. multiplication by 1.25, or adding 75% of foot length), correlations were above 0.75 with a mean error of respectively 2.0 and 1.2 cm. Although the results indicate that an individual adjustment of the IP models provides better estimations of mean step length, the ease of a generic adjustment can be favored when merely evaluating intra-individual differences. Further studies should determine the validity of these IP based estimations for assessing gait in daily life.
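
    A minimal sketch of the simple IP estimator with the generic correction factor mentioned in the abstract; variable names are illustrative, and the 2-phase variant (adding a bipedal-stance constant) is not shown.

    ```python
    import numpy as np

    def step_length_ip(h, leg_length, correction=1.25):
        """Inverted-pendulum step length from the vertical center-of-mass
        excursion h during one step (requires h < 2 * leg_length):
            step = correction * 2 * sqrt(2*l*h - h**2)
        correction=1.25 is the generic factor; an individually calibrated
        factor gave smaller errors in the study."""
        l = leg_length
        return correction * 2.0 * np.sqrt(2.0 * l * h - h * h)
    ```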

  10. Reliability of calculation of the lithosphere deformations in tectonically stable area of Poland based on the GPS measurements

    NASA Astrophysics Data System (ADS)

    Araszkiewicz, Andrzej; Jarosiński, Marek

    2013-04-01

    In this research we aimed to check whether GPS observations can be used to calculate a reliable deformation pattern of the intracontinental lithosphere in seismically inactive areas, such as the territory of Poland. For this purpose we used data mainly from the ASG-EUPOS permanent network and the solutions developed by the MUT CAG team (Military University of Technology, Centre of Applied Geomatics). Of the 128 analyzed stations, almost 100 are mounted on buildings. Daily observations were processed in the Bernese 5.0 software, and the weekly solutions were then used to determine the station velocities expressed in ETRF2000. The strain rates were determined for almost 200 triangles with GPS stations at their corners, constructed using Delaunay triangulation. The scattered directions of deformation and highly variable strain rate values obtained point to antenna stabilization that is insufficient for geodynamic studies. In order to identify poorly stabilized stations, we carried out a benchmark test showing the effect that the drift of a single station can have on the deformations in the adjoining triangles. Based on the benchmark results, we eliminated from our network the stations that showed a deformation pattern characteristic of an unstable station. After several rounds of strain rate calculations and eliminations of dubious points, we reduced the number of stations to 60. The refined network revealed a more consistent deformation pattern across Poland. The deformations, compared with the recent stress field of the study area, show good correlation in some places and significant discrepancies in others, which will be the subject of future research.
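
    The per-triangle strain-rate computation can be sketched as a small linear solve for a uniform velocity gradient inside each Delaunay triangle; this is the textbook approach, not necessarily the authors' exact procedure.

    ```python
    import numpy as np

    def triangle_strain_rate(xy, v):
        """xy: (3, 2) station positions [m]; v: (3, 2) velocities [m/yr].
        Solves v_i = v0 + G @ x_i for the 2x2 velocity gradient G and
        returns the symmetric strain-rate tensor [1/yr]."""
        A = np.zeros((6, 6))
        for i in range(3):
            A[2*i]     = [1, 0, xy[i, 0], xy[i, 1], 0, 0]
            A[2*i + 1] = [0, 1, 0, 0, xy[i, 0], xy[i, 1]]
        p = np.linalg.solve(A, v.ravel())
        G = p[2:].reshape(2, 2)
        return 0.5 * (G + G.T)
    ```

    With millimetre-per-year velocities over tens of kilometres, a single drifting antenna changes G in every triangle it touches, which is why the benchmark-style elimination described above is effective.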

  11. Factor Structure and Reliability of the Revised Conflict Tactics Scales' (CTS2) 10-Factor Model in a Community-Based Female Sample

    ERIC Educational Resources Information Center

    Yun, Sung Hyun

    2011-01-01

    The present study investigated the factor structure and reliability of the revised Conflict Tactics Scales' (CTS2) 10-factor model in a community-based female sample (N = 261). The underlying factor structure of the 10-factor model was tested by the confirmatory multiple group factor analysis, which demonstrated complex factor cross-loadings…

  12. Reliable, Efficient and Cost-Effective Electric Power Converter for Small Wind Turbines Based on AC-link Technology

    SciTech Connect

    Darren Hammell; Mark Holveck; DOE Project Officer - Keith Bennett

    2006-08-01

    Grid-tied inverter power electronics have been an Achilles heel of the small wind industry, creating an opportunity for new technologies to deliver lower cost, greater efficiency, and improved reliability. The small wind turbine market is also moving towards the 50-100 kW size range. The unique AC-link power conversion technology provides efficiency, reliability, and power quality advantages over existing technologies, and Princeton Power will adapt prototype designs used for industrial asynchronous motor control to a 50 kW small wind turbine design.

  13. A Study on Combination of Reliability-based Automatic Repeat reQuest with Error Potential-based Error Correction for Improving P300 Speller Performance

    NASA Astrophysics Data System (ADS)

    Takahashi, Hiromu; Yoshikawa, Tomohiro; Furuhashi, Takeshi

    Brain-computer interfaces (BCIs) are systems that translate one's thoughts into commands to restore control and communication to severely paralyzed people, and they are also appealing to healthy people. The P300 speller, one of the most renowned BCIs for communication, allows users to select letters just by thinking. However, due to the low signal-to-noise ratio of the P300, signal averaging is often performed, which improves the spelling accuracy but degrades the spelling speed. The authors have proposed reliability-based automatic repeat request (RB-ARQ) to ease this problem. RB-ARQ can be enhanced when combined with error correction based on the error-related potentials (ErrPs) that occur on erroneous feedback. Thus, this study aims to reveal the characteristics of ErrPs in the P300 speller paradigm, and to combine RB-ARQ with ErrP-based error correction to further improve performance. The results show that the ErrPs observed in the current study resemble the previously reported ErrPs observed in a cursor control task using a BCI, and that the performance of the P300 speller could be improved by 35 percent on average.

  14. Estimating the Reliability of a Test Battery Composite or a Test Score Based on Weighted Item Scoring

    ERIC Educational Resources Information Center

    Feldt, Leonard S.

    2004-01-01

    In some settings, the validity of a battery composite or a test score is enhanced by weighting some parts or items more heavily than others in the total score. This article describes methods of estimating the total score reliability coefficient when differential weights are used with items or parts.
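
    One standard identity consistent with this approach: for a composite Y = sum_i w_i * X_i with uncorrelated part errors, the error variance of Y is the weighted sum of the parts' error variances, so the composite reliability follows directly. A sketch, with names chosen for illustration:

    ```python
    import numpy as np

    def composite_reliability(w, cov, part_rel):
        """w: (k,) weights; cov: (k, k) covariance matrix of the parts;
        part_rel: (k,) reliability coefficients of the parts."""
        w = np.asarray(w, float)
        var_y = w @ cov @ w                       # total composite variance
        err = (w**2 * np.diag(cov) * (1 - np.asarray(part_rel))).sum()
        return 1.0 - err / var_y
    ```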

  15. Reliability and Validity of Inferences about Teachers Based on Student Scores. William H. Angoff Memorial Lecture Series

    ERIC Educational Resources Information Center

    Haertel, Edward H.

    2013-01-01

    Policymakers and school administrators have embraced value-added models of teacher effectiveness as tools for educational improvement. Teacher value-added estimates may be viewed as complicated scores of a certain kind. This suggests using a test validation model to examine their reliability and validity. Validation begins with an interpretive…

  16. Development of a reliable and highly sensitive, digital PCR-based assay for early detection of HLB

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Huanglongbing (HLB) is caused by a phloem-limited bacterium, Ca. Liberibacter asiaticus (Las) in the United States. The bacterium often is present at a low concentration and unevenly distributed in the early stage of infection, making reliable and early diagnosis a serious challenge. Conventional d...

  17. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  18. Integrating field methodology and web-based data collection to assess the reliability of the Alcohol Use Disorders Identification Test (AUDIT).

    PubMed

    Celio, Mark A; Vetter-O'Hagen, Courtney S; Lisman, Stephen A; Johansen, Gerard E; Spear, Linda P

    2011-12-01

    Field methodologies offer a unique opportunity to collect ecologically valid data on alcohol use and its associated problems within natural drinking environments. However, limitations in follow-up data collection methods have left unanswered questions regarding the psychometric properties of field-based measures. The aim of the current study is to evaluate the reliability of self-report data collected in a naturally occurring environment - as indexed by the Alcohol Use Disorders Identification Test (AUDIT) - compared to self-report data obtained through an innovative web-based follow-up procedure. Individuals recruited outside of bars (N=170; mean age=21; range 18-32) provided a BAC sample and completed a self-administered survey packet that included the AUDIT. BAC feedback was provided anonymously through a dedicated web page. Upon sign in, follow-up participants (n=89; 52%) were again asked to complete the AUDIT before receiving their BAC feedback. Reliability analyses demonstrated that AUDIT scores - both continuous and dichotomized at the standard cut-point - were stable across field- and web-based administrations. These results suggest that self-report data obtained from acutely intoxicated individuals in naturally occurring environments are reliable when compared to web-based data obtained after a brief follow-up interval. Furthermore, the results demonstrate the feasibility, utility, and potential of integrating field methods and web-based data collection procedures.

  19. Intra-observer reliability for measuring first and second toe and metatarsal protrusion distance using palpation-based tests: a test-retest study

    PubMed Central

    2014-01-01

    Background: Measurement of first and second metatarsal and toe protrusion is frequently used to explain foot problems using x-rays, osteological measurements or palpation-based tests. Length differences could be related to the appearance of problems in the foot. A test-retest design was conducted in order to establish the intra-rater reliability of three palpation-based tests. Methods: 202 feet of physical therapy students and teachers of the CEU San Pablo University of Madrid, 39 men and 62 women, were measured using three different tests. Data were analysed using SPSS version 15.0. Mean, SD and 95% CI were calculated for each variable. A normal distribution of quantitative data was assessed using the Kolmogorov-Smirnov test. The test-retest intra-rater reliability was assessed using the Intraclass Correlation Coefficient (ICC). The Standard Error of Measurement (SEM) and the Minimal Detectable Change (MDC) were also obtained. Results: All the ICC values showed a high degree of reliability (Test 1 = 0.97, Test 2 = 0.86 and Test 3 = 0.88), as did the SEM (Test 1 = 0.07, Test 2 = 0.10 and Test 3 = 0.11) and the MDC (Test 1 = 0.21, Test 2 = 0.30 and Test 3 = 0.31). Conclusions: Measurement of first and second metatarsal and toe protrusion using the three palpation-based tests showed a high degree of reliability. PMID:25729437
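
    The SEM and MDC values above follow from the ICC by standard formulas, sketched below; pooling the two sessions' scores for the SD is an assumption about the authors' procedure.

    ```python
    import numpy as np

    def sem_mdc(scores_t1, scores_t2, icc, z=1.96):
        """SEM = SD * sqrt(1 - ICC); MDC = z * sqrt(2) * SEM (95% level)."""
        sd = np.concatenate([scores_t1, scores_t2]).std(ddof=1)
        sem = sd * np.sqrt(1.0 - icc)
        return sem, z * np.sqrt(2.0) * sem
    ```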

  20. Optimizing the preventive maintenance scheduling by genetic algorithm based on cost and reliability in National Iranian Drilling Company

    NASA Astrophysics Data System (ADS)

    Javanmard, Habibollah; Koraeizadeh, Abd al-Wahhab

    2016-06-01

    The present research aims at predicting the activities required for preventive maintenance in terms of optimal equipment cost and reliability. The research sample includes all offshore drilling equipment of the FATH 59 derrick site affiliated with the National Iranian Drilling Company. The research uses a field methodology and, in terms of its objectives, is classified as applied research. Some of the data are extracted from documents available in the equipment and maintenance department of the FATH 59 derrick site, and the other needed data come from experts' estimates processed through a genetic algorithm. The research result is a prediction of downtimes, costs, and reliability over a predetermined time interval. The findings are applicable to all manufacturing and non-manufacturing equipment.

  1. Beyond Reliability

    PubMed Central

    2008-01-01

    The validity of psychiatric diagnosis rests in part on a demonstration that identifiable biomarkers exist for major psychiatric illnesses. Recent evidence supports the existence of several biomarkers or endophenotypes for both schizophrenia and bipolar disorder. As we learn more about how these biomarkers relate to the symptoms, course, and treatment response of major psychiatric disorders, the “objectivity” of psychiatric diagnosis will increase. However, psychiatry is and will remain a clinically based discipline, aimed at comprehensively understanding and relieving human suffering. PMID:19727304

  2. Test-retest reliability of a battery of field-based health-related fitness measures for adolescents.

    PubMed

    Lubans, David R; Morgan, Philip; Callister, Robin; Plotnikoff, Ronald C; Eather, Narelle; Riley, Nicholas; Smith, Chris J

    2011-04-01

    The main aim of this study was to determine the test-retest reliability of existing tests of health-related fitness. Participants (mean age 14.8 years, s = 0.4) were 42 boys and 26 girls who completed the study assessments on two occasions separated by one week. The following tests were conducted: bioelectrical impedance analysis (BIA) to calculate percent body fat, leg dynamometer, 90° push-up, 7-stage sit-up, and wall squat tests. Intra-class correlation (ICC), paired samples t-tests, and typical error expressed as a coefficient of variation were calculated. The mean percent body fat intra-class correlation coefficient was similar for boys (ICC = 0.95) and girls (ICC = 0.93), but the mean coefficient of variation was considerably higher for boys than girls (22.2% vs. 12.2%). The boys' coefficients of variation for the tests of muscular fitness ranged from 9.0% for the leg dynamometer test to 26.5% for the timed wall squat test. The girls' coefficients of variation ranged from 17.1% for the sit-up test to 21.4% for the push-up test. Although the BIA machine produced reliable estimates of percent body fat, the tests of muscular fitness resulted in high systematic error, suggesting that these measures may require an extensive familiarization phase before the results can be considered reliable.
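
    The typical error expressed as a coefficient of variation can be computed from a test-retest pair as below; this is the common difference-score approximation, whereas the study may have used a log-transform variant.

    ```python
    import numpy as np

    def typical_error_cv(trial1, trial2):
        """Typical error TE = SD(retest - test) / sqrt(2), expressed as a
        percentage of the grand mean (coefficient of variation)."""
        t1, t2 = np.asarray(trial1, float), np.asarray(trial2, float)
        te = (t2 - t1).std(ddof=1) / np.sqrt(2.0)
        return 100.0 * te / ((t1.mean() + t2.mean()) / 2.0)
    ```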

  3. How to measure ecosystem stability? An evaluation of the reliability of stability metrics based on remote sensing time series across the major global ecosystems.

    PubMed

    De Keersmaecker, Wanda; Lhermitte, Stef; Honnay, Olivier; Farifteh, Jamshid; Somers, Ben; Coppin, Pol

    2014-07-01

    Increasing frequency of extreme climate events is likely to impose increased stress on ecosystems and to jeopardize the services that ecosystems provide. Therefore, it is of major importance to assess the effects of extreme climate events on the temporal stability (i.e., the resistance, the resilience, and the variance) of ecosystem properties. Most time series of ecosystem properties are, however, affected by varying data characteristics, uncertainties, and noise, which complicate the comparison of ecosystem stability metrics (ESMs) between locations. Therefore, there is a strong need for a more comprehensive understanding of the reliability of stability metrics and how they can be used to compare ecosystem stability globally. The objective of this study was to evaluate the performance of temporal ESMs based on time series of the Moderate Resolution Imaging Spectroradiometer derived Normalized Difference Vegetation Index of 15 global land-cover types. We provide a framework (i) to assess the reliability of ESMs as a function of data characteristics, uncertainties, and noise and (ii) to integrate reliability estimates in future global ecosystem stability studies against climate disturbances. The performance of our framework was tested through (i) a global ecosystem comparison and (ii) a comparison of ecosystem stability in response to the 2003 drought. The results show the influence of data quality on the accuracy of ecosystem stability metrics. White noise, biased noise, and trends have a stronger effect on the accuracy of stability metrics than the length of the time series, temporal resolution, or amount of missing values. Moreover, we demonstrate the importance of integrating reliability estimates to interpret stability metrics within confidence limits. Based on these confidence limits, other studies dealing with specific ecosystem types or locations can be put into context, and a more reliable assessment of ecosystem stability against environmental disturbances
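
    One possible operationalization of the three stability components named above (variance, resistance, resilience) on an anomaly series is sketched below; the precise ESM definitions, the anomaly input, and the post-event slope criterion are assumptions, since the paper's formulas are not given in the abstract.

    ```python
    import numpy as np

    def stability_metrics(anom, event):
        """anom: (t,) standardized NDVI anomalies; event: (t,) boolean mask
        of the disturbance period (assumes observations exist afterwards)."""
        variance = anom.var(ddof=1)                    # temporal variability
        resistance = 1.0 / (1.0 + np.abs(anom[event]).max())
        post = anom[np.flatnonzero(event)[-1] + 1:]    # series after the event
        # A negative slope of |anomaly| after the event indicates recovery;
        # report its magnitude as the resilience rate.
        resilience = -np.polyfit(np.arange(post.size), np.abs(post), 1)[0]
        return variance, resistance, resilience
    ```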

  5. Internal consistency, test-retest reliability, and predictive validity for a Likert-based version of the Sources of occupational stress-14 (SOOS-14) scale.

    PubMed

    Kimbrel, Nathan A; Flynn, Elisa J; Carpenter, Grace Stephanie J; Cammarata, Claire M; Leto, Frank; Ostiguy, William J; Kamholz, Barbara W; Zimering, Rose T; Gulliver, Suzy B

    2015-08-30

    This study examined the psychometric properties of a Likert-based version of the Sources of Occupational Stress-14 (SOOS-14) scale. Internal consistency for the SOOS-14 ranged from 0.78 to 0.84, whereas three-month test-retest reliability was 0.51. In addition, SOOS-14 scores were prospectively associated with symptoms of PTSD and depression at a three-month follow-up assessment. PMID:26073282

  6. Reliability-based econometrics of aerospace structural systems: Design criteria and test options. Ph.D. Thesis - Georgia Inst. of Tech.

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hanagud, S.

    1974-01-01

    The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.
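
    The updating step such an approach builds on can be illustrated with a conjugate normal model; this is generic Bayesian machinery, not the thesis's specific decision procedure.

    ```python
    import numpy as np

    def update_strength(prior_mean, prior_var, tests, test_var):
        """Posterior mean/variance of the mean strength after structural
        tests, for a normal prior and normal test scatter (known variance)."""
        t = np.asarray(tests, float)
        post_var = 1.0 / (1.0 / prior_var + t.size / test_var)
        post_mean = post_var * (prior_mean / prior_var + t.sum() / test_var)
        return post_mean, post_var
    ```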

  7. Ant-based power efficient, adaptive, reliable, and load balanced (A-PEARL) routing for smart metering networks

    NASA Astrophysics Data System (ADS)

    Muraleedharan, Rajani

    2011-06-01

    The future of metering networks requires the adoption of different sensor technologies while reducing energy exploitation. In this paper, a routing protocol with the ability to adapt and communicate reliably over varied IEEE standards is proposed. Due to sensors' resource constraints, such as memory, energy, and processing power, an algorithm that balances resources without compromising performance is preferred. The proposed A-PEARL protocol is tested under harsh simulated scenarios such as sensor failure and fading conditions. The inherent features of the A-PEARL protocol, such as data aggregation, fusion, and channel hopping, enable minimal resource consumption and secure communication.

  8. Issues in Modeling System Reliability

    NASA Astrophysics Data System (ADS)

    Cruse, Thomas A.; Annis, Chuck; Booker, Jane; Robinson, David; Sues, Rob

    2002-10-01

    This paper discusses various issues in modeling system reliability. The topics include: 1) Statistical formalisms versus pragmatic numerics; 2) Language; 3) Statistical methods versus reliability-based design methods; 4) Professional bias; and 5) Real issues that need to be identified and resolved prior to certifying designs. This paper is in viewgraph form.

  9. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. The components that account for most of the mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable, whereas the components that tend to fail earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability and the mass of ORU subcomponents to reliability.

  10. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.

  11. Improving the communication reliability of body sensor networks based on the IEEE 802.15.4 protocol.

    PubMed

    Gomes, Diogo; Afonso, José A

    2014-03-01

    Body sensor networks (BSNs) enable continuous monitoring of patients anywhere, with minimum constraints to daily life activities. Although the IEEE 802.15.4 and ZigBee(®) (ZigBee Alliance, San Ramon, CA) standards were mainly developed for use in wireless sensors network (WSN) applications, they are also widely used in BSN applications because of device characteristics such as low power, low cost, and small form factor. However, compared with WSNs, BSNs present some very distinctive characteristics in terms of traffic and mobility patterns, heterogeneity of the nodes, and quality of service requirements. This article evaluates the suitability of the carrier sense multiple access-collision avoidance protocol, used by the IEEE 802.15.4 and ZigBee standards, for data-intensive BSN applications, through the execution of experimental tests in different evaluation scenarios, in order to take into account the effects of contention, clock drift, and hidden nodes on the communication reliability. Results show that the delivery ratio may decrease substantially during transitory periods, which can last for several minutes, to a minimum of 90% with retransmissions and 13% without retransmissions. This article also proposes and evaluates the performance of the BSN contention avoidance mechanism, which was designed to solve the identified reliability problems. This mechanism was able to restore the delivery ratio to 100% even in the scenario without retransmissions.

  12. Accuracy and reliability of GPS devices for measurement of sports-specific movement patterns related to cricket, tennis, and field-based team sports.

    PubMed

    Vickery, William M; Dascombe, Ben J; Baker, John D; Higham, Dean G; Spratford, Wayne A; Duffield, Rob

    2014-06-01

    The aim of this study was to determine the accuracy and reliability of 5, 10, and 15 Hz global positioning system (GPS) devices. Two male subjects (mean ± SD; age, 25.5 ± 0.7 years; height, 1.75 ± 0.01 m; body mass, 74 ± 5.7 kg) completed 10 repetitions of drills replicating movements typical of tennis, cricket, and field-based (football) sports. All movements were completed wearing two 5 Hz and two 10 Hz MinimaxX devices and two 15 Hz GPS-Sports devices in a specially designed harness. Criterion movement data for distance and speed were provided by a 22-camera VICON system sampling at 100 Hz. Accuracy was determined using 1-way analysis of variance with Tukey's post hoc tests. Interunit reliability was determined using intraclass correlation (ICC), and typical error was estimated as a coefficient of variation (CV). Overall, the majority of distance and speed measures from the 5, 10, and 15 Hz GPS devices were not significantly different (p > 0.05) from the VICON data. Additionally, no improvement in the accuracy or reliability of the GPS devices was observed with an increase in sampling rate. However, the CVs for the 5 and 15 Hz devices for distance and speed measures ranged between 3 and 33%, with increasing variability evident in higher speed zones. The majority of ICC measures possessed a low level of interunit reliability (r = -0.35 to 0.39). Based on these results, practitioners should be aware that measurements of distance and speed may be consistently underestimated, regardless of the movements performed.

  13. Methods for reliability and uncertainty assessment and for applicability evaluations of classification- and regression-based QSARs.

    PubMed Central

    Eriksson, Lennart; Jaworska, Joanna; Worth, Andrew P; Cronin, Mark T D; McDowell, Robert M; Gramatica, Paola

    2003-01-01

    This article provides an overview of methods for reliability assessment of quantitative structure-activity relationship (QSAR) models in the context of regulatory acceptance of human health and environmental QSARs. Useful diagnostic tools and data analytical approaches are highlighted and exemplified. Particular emphasis is given to the question of how to define the applicability borders of a QSAR and how to estimate parameter and prediction uncertainty. The article ends with a discussion regarding QSAR acceptability criteria. This discussion contains a list of recommended acceptability criteria, and we give reference values for important QSAR performance statistics. Finally, we emphasize that rigorous and independent validation of QSARs is an essential step toward their regulatory acceptance and implementation. PMID:12896860

  14. Methods for reliability and uncertainty assessment and for applicability evaluations of classification- and regression-based QSARs.

    PubMed

    Eriksson, Lennart; Jaworska, Joanna; Worth, Andrew P; Cronin, Mark T D; McDowell, Robert M; Gramatica, Paola

    2003-08-01

    This article provides an overview of methods for reliability assessment of quantitative structure-activity relationship (QSAR) models in the context of regulatory acceptance of human health and environmental QSARs. Useful diagnostic tools and data analytical approaches are highlighted and exemplified. Particular emphasis is given to the question of how to define the applicability borders of a QSAR and how to estimate parameter and prediction uncertainty. The article ends with a discussion regarding QSAR acceptability criteria. This discussion contains a list of recommended acceptability criteria, and we give reference values for important QSAR performance statistics. Finally, we emphasize that rigorous and independent validation of QSARs is an essential step toward their regulatory acceptance and implementation.

  15. Objectives, priorities, reliable knowledge, and science-based management of Missouri River interior least terns and piping plovers

    USGS Publications Warehouse

    Sherfy, Mark; Anteau, Michael; Shaffer, Terry; Sovada, Marsha; Stucker, Jennifer

    2011-01-01

    Supporting recovery of federally listed interior least tern (Sternula antillarum athalassos; tern) and piping plover (Charadrius melodus; plover) populations is a desirable goal in management of the Missouri River ecosystem. Many tools are implemented in support of this goal, including habitat management, annual monitoring, directed research, and threat mitigation. Similarly, many types of data can be used to make management decisions, evaluate system responses, and prioritize research and monitoring. The ecological importance of Missouri River recovery and the conservation status of terns and plovers place a premium on efficient and effective resource use. Efficiency is improved when a single data source informs multiple high-priority decisions, whereas effectiveness is improved when decisions are informed by reliable knowledge. Seldom will a single study design be optimal for addressing all data needs, making prioritization of needs essential. Data collection motivated by well-articulated objectives and priorities has many advantages over studies in which questions and priorities are determined retrospectively. Research and monitoring for terns and plovers have generated a wealth of data that can be interpreted in a variety of ways. The validity and strength of conclusions from analyses of these data is dependent on compatibility between the study design and the question being asked. We consider issues related to collection and interpretation of biological data, and discuss their utility for enhancing the role of science in management of Missouri River terns and plovers. A team of USGS scientists at Northern Prairie Wildlife Research Center has been conducting tern and plover research on the Missouri River since 2005. The team has had many discussions about the importance of setting objectives, identifying priorities, and obtaining reliable information to answer pertinent questions about tern and plover management on this river system. The objectives of this

  16. Investigation of displacement property and electric reliability of (Li,Na,K)NbO3-based multilayer piezoceramics

    NASA Astrophysics Data System (ADS)

    Hatano, Keiichi; Yamamoto, Asa; Kishimoto, Sumiaki; Doshida, Yutaka

    2016-10-01

    In this study, lead-free multilayer piezoceramics with Pd inner electrodes were fabricated, and their displacement properties and electric reliabilities were investigated. The Li0.06Na0.52K0.42NbO3 multilayer piezoceramic exhibited a high displacement (Smax/Emax = 350 pm/V at 5 kV/mm) but a low resistivity (1.3 × 10⁸ Ω·cm at 100 °C). On the other hand, the additive-modified Li0.06Na0.52K0.42NbO3 multilayer piezoceramic exhibited both high displacement (Smax/Emax = 330 pm/V at 5 kV/mm) and high resistivity (1.2 × 10¹² Ω·cm at 100 °C); the breakdown voltages of the two piezoceramics were 4 and 16 kV/mm, respectively, at 100 °C. The observed improvement in electric reliability can be attributed to the refinement of the microstructure of Li0.06Na0.52K0.42NbO3 by the additives. Furthermore, the additive-modified Li0.06Na0.52K0.42NbO3 multilayer piezoceramic also showed a markedly higher resistivity than previously reported multilayer piezoceramics with Ag/Pd, Cu, and Ni inner electrodes, since the dispersion of elemental Ag and the generation of oxygen vacancies during the sintering process were prevented in the former case.

  17. Defining Requirements for Improved Photovoltaic System Reliability

    SciTech Connect

    Maish, A.B.

    1998-12-21

    Reliable systems are an essential ingredient of any technology progressing toward commercial maturity and large-scale deployment. This paper defines reliability as meeting system functional requirements, and then develops a framework to understand and quantify photovoltaic system reliability based on initial and ongoing costs and system value. The core elements necessary to achieve reliable PV systems are reviewed. These include appropriate system design, satisfactory component reliability, and proper installation and servicing. Reliability status, key issues, and present needs in system reliability are summarized for four application sectors.

  18. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  19. Reliability of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1986-01-01

    In order to assess the reliability of photovoltaic modules, four categories of known array failure and degradation mechanisms are discussed, and target reliability allocations have been developed within each category based on the available technology and the life-cycle-cost requirements of future large-scale terrestrial applications. Cell-level failure mechanisms associated with open-circuiting or short-circuiting of individual solar cells generally arise from cell cracking or the fatigue of cell-to-cell interconnects. Power degradation mechanisms considered include gradual power loss in cells, light-induced effects, and module optical degradation. Module-level failure mechanisms and life-limiting wear-out mechanisms are also explored.

  1. Highly reliable and stable organic field-effect transistor nonvolatile memory with a poly(4-vinyl phenol) charge trapping layer based on a pn-heterojunction active layer

    NASA Astrophysics Data System (ADS)

    Xiang, Lanyi; Ying, Jun; Han, Jinhua; Zhang, Letian; Wang, Wei

    2016-04-01

    In this letter, we demonstrate a highly reliable and stable organic field-effect transistor (OFET) based nonvolatile memory (NVM) with the polymer poly(4-vinyl phenol) (PVP) as the charge trapping layer. In unipolar OFETs, the irreversible shifts of the turn-on voltage (Von) and the severe degradation of the memory window (ΔVon) at programming (P) and erasing (E) voltages, respectively, block their application in NVMs. This obstacle is overcome by using a pn-heterojunction as the active layer in the OFET memory, which supplies a hole- and an electron-accumulating channel at the applied P and E voltages, respectively. Both holes and electrons transferring from the channels to the PVP layer and overwriting the trapped charges of the opposite polarity result in reliable bidirectional shifts of Von at P and E voltages, respectively. The heterojunction OFET exhibits excellent nonvolatile memory characteristics, with a large ΔVon of 8.5 V, a desired reading (R) voltage at 0 V, reliable P/R/E/R dynamic endurance over 100 cycles, and a retention time of over 10 years.

  2. Effect of an interface Mg insertion layer on the reliability of a magnetic tunnel junction based on a Co2FeAl full-Heusler alloy

    NASA Astrophysics Data System (ADS)

    Lee, Jung-Min; Kil, Gyu Hyun; Lee, Gae Hun; Choi, Chul Min; Song, Yun-Heub; Sukegawa, Hiroaki; Mitani, Seiji

    2014-04-01

    The reliability of a magnetic tunnel junction (MTJ) based on a Co2FeAl (CFA) full-Heusler alloy with a MgO tunnel barrier was evaluated. In particular, the effect of a Mg insertion layer under the MgO was investigated in view of resistance drift by using various voltage stress tests. We compared the resistance change during constant voltage stress (CVS) and confirmed a trap/detrap phenomenon during the interval stress test for samples with and without a Mg insertion layer. The MTJ with a Mg insertion layer showed a relatively small resistance change in the CVS test and a reduced trap/detrap phenomenon in the interval stress test compared to the sample without a Mg insertion layer. This is understood to be caused by the improved crystallinity at the bottom of the CFA/MgO interface due to the Mg insertion layer, which provides a smaller number of trap sites during the stress test. As a result, the interface condition of the MgO layer is very important for the reliability of an MTJ using a full-Heusler alloy, and the insertion of a Mg layer at the MgO interface is expected to be an effective method for enhancing MTJ reliability.

  5. Reliability and validity of expert assessment based on airborne and urinary measures of nickel and chromium exposure in the electroplating industry

    PubMed Central

    Chen, Yu-Cheng; Coble, Joseph B; Deziel, Nicole C.; Ji, Bu-Tian; Xue, Shouzheng; Lu, Wei; Stewart, Patricia A; Friesen, Melissa C

    2014-01-01

    The reliability and validity of six experts' exposure ratings were evaluated for 64 nickel-exposed and 72 chromium-exposed workers from six Shanghai electroplating plants based on airborne and urinary nickel and chromium measurements. Three industrial hygienists and three occupational physicians independently ranked the exposure intensity of each metal on an ordinal scale (1-4) for each worker's job in two rounds: the first round was based on responses to an occupational history questionnaire, and the second round also included responses to an electroplating industry-specific questionnaire. The Spearman correlation (rs) was used to compare each rating's validity to its corresponding subject-specific arithmetic mean of four airborne or four urinary measurements. Reliability was moderately high (weighted kappa range = 0.60-0.64). Validity was poor to moderate (rs = -0.37 to 0.46) for both airborne and urinary concentrations of both metals. For airborne nickel concentrations, validity differed by plant. For dichotomized metrics, sensitivity and specificity were higher based on urinary measurements (47-78%) than airborne measurements (16-50%). Few patterns were observed by metal, assessment round, or expert type. These results suggest that, for electroplating exposures, experts can achieve moderately high agreement and reasonably distinguish between low and high exposures when reviewing responses to in-depth questionnaires used in population-based case-control studies. PMID:24736099
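
    As a pointer to how the agreement statistic above can be computed, the sketch below calculates a linearly weighted kappa for two hypothetical experts' ordinal (1-4) ratings; the ratings are invented, not the study's data.

      # A minimal sketch (Python): weighted kappa between two raters.
      from sklearn.metrics import cohen_kappa_score

      expert_1 = [1, 2, 2, 3, 4, 3, 2, 1, 4, 3]
      expert_2 = [1, 2, 3, 3, 4, 2, 2, 1, 3, 3]

      kappa = cohen_kappa_score(expert_1, expert_2, weights="linear")
      print(f"linearly weighted kappa = {kappa:.2f}")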

  7. Electronic Versus Paper-Based Assessment of Health-Related Quality of Life Specific to HIV Disease: Reliability Study of the PROQOL-HIV Questionnaire

    PubMed Central

    Lalanne, Christophe; Goujard, Cécile; Herrmann, Susan; Cheung-Lung, Christian; Brosseau, Jean-Paul; Schwartz, Yannick; Chassany, Olivier

    2014-01-01

    Background: Electronic patient-reported outcomes (PRO) provide quick and usually reliable assessments of patients' health-related quality of life (HRQL). Objective: An electronic version of the Patient-Reported Outcomes Quality of Life-human immunodeficiency virus (PROQOL-HIV) questionnaire was developed, and its face validity and reliability were assessed using standard psychometric methods. Methods: A sample of 80 French outpatients (66% male, 52/79; mean age 46.7 years, SD 10.9) was recruited. Paper-based and electronic questionnaires were completed in a randomized crossover design (2-7 day interval). Biomedical data were collected. Questionnaire version and order effects were tested on full-scale scores in a 2-way ANOVA with patients as random effects. Test-retest reliability was evaluated using Pearson and intraclass correlation coefficients (ICC, with 95% confidence interval) for each dimension. Usability testing was carried out from patients' survey reports, specifically, general satisfaction, ease of completion, quality and clarity of user interface, and motivation to participate in follow-up PROQOL-HIV electronic assessments. Results: Questionnaire version and administration order effects (N=59 complete cases) were not significant at the 5% level, and no interaction was found between these 2 factors (P=.94). Reliability indexes were acceptable, with Pearson correlations greater than .7 and ICCs ranging from .708 to .939; scores were not statistically different between the two versions. A total of 63 (79%) complete patients' survey reports were available, and 55% of patients (30/55) reported being satisfied and interested in electronic assessment of their HRQL in clinical follow-up. Individual ratings of the PROQOL-HIV user interface (85%-100% of positive responses) confirmed user interface clarity and usability. Conclusions: The electronic PROQOL-HIV introduces minor modifications to the original paper-based version, following International Society for

  8. Reliable multihop broadcast protocol with a low-overhead link quality assessment for ITS based on VANETs in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H

    2014-01-01

    Vehicular ad hoc networks (VANETs) have been identified as a key technology to enable intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, in order to fully exploit the potential of VANETs, several issues must be addressed. Because of the highly dynamic nature of VANETs and the impairments of the wireless channel, one key issue is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between vehicles and processes this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB performance is compared with one of the leading multihop broadcast protocols to date. Performance metrics show that the RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224
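
    The sketch below illustrates, under invented assumptions, the core relay-selection idea: use neighbor information learned from hello messages to pick a small relay set (here, the farthest neighbor with an acceptable link) instead of letting every node rebroadcast. It is not the RLMB protocol itself; the names, the coverage model, and the quality threshold are illustrative.

      # A minimal sketch (Python): greedy relay selection from hello-message data.
      from dataclasses import dataclass

      RADIO_RANGE_M = 300.0  # assumed transmission range

      @dataclass
      class Neighbor:
          node_id: int
          distance_m: float    # distance from sender, learned from hello beacons
          link_quality: float  # 0..1, e.g. a smoothed hello delivery ratio

      def select_relays(neighbors, min_quality=0.6):
          """Pick the farthest neighbor with an acceptable link, so each
          rebroadcast extends coverage as far as possible."""
          candidates = [n for n in neighbors
                        if n.distance_m <= RADIO_RANGE_M and n.link_quality >= min_quality]
          return [max(candidates, key=lambda n: n.distance_m)] if candidates else []

      relays = select_relays([Neighbor(1, 120, 0.9), Neighbor(2, 280, 0.7), Neighbor(3, 290, 0.4)])
      print([r.node_id for r in relays])  # -> [2]: far away, but with a usable link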

  9. Reliable resonance assignments of selected residues of proteins with known structure based on empirical NMR chemical shift prediction

    NASA Astrophysics Data System (ADS)

    Li, Da-Wei; Meng, Dan; Brüschweiler, Rafael

    2015-05-01

    A robust NMR resonance assignment method is introduced for proteins whose 3D structure has previously been determined by X-ray crystallography. The goal of the method is to obtain a subset of correct assignments from a parsimonious set of 3D NMR experiments on 15N, 13C labeled proteins. Chemical shifts of sequential residue pairs are predicted from static protein structures using PPM_One and are then compared with the corresponding experimental shifts. Globally optimized weighted matching identifies the assignments that are robust with respect to small changes in NMR cross-peak positions. The method, termed PASSPORT, is demonstrated for 4 proteins with 100-250 amino acids using 3D HNCA and 3D CBCA(CO)NH experiments as input, producing correct assignments with high reliability for 22% of the residues. The method, which works best for Gly, Ala, Ser, and Thr residues, provides assignments that serve as anchor points for additional assignments by both manual and semi-automated methods, or that can be used directly for further studies, e.g., on ligand binding, protein dynamics, or post-translational modifications such as phosphorylation.
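
    The globally optimized weighted matching step lends itself to a compact illustration: treat predicted-versus-observed shift differences as a cost matrix and solve the assignment problem. The sketch below does this with the Hungarian algorithm on invented shift values; PASSPORT's actual scoring and robustness analysis are richer.

      # A minimal sketch (Python): assignment of residues to peaks by
      # minimizing total chemical-shift mismatch.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      predicted_ca = np.array([45.2, 52.8, 58.1, 62.4])  # predicted CA shifts (ppm)
      observed_ca = np.array([52.6, 45.5, 62.0, 58.3])   # picked peak positions (ppm)

      cost = np.abs(predicted_ca[:, None] - observed_ca[None, :])
      rows, cols = linear_sum_assignment(cost)           # globally optimal matching
      for r, c in zip(rows, cols):
          print(f"residue {r} -> peak {c} (|delta shift| = {cost[r, c]:.2f} ppm)")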

  10. Cost and Reliability Improvement for CIGS-Based PV on Flexible Substrate: May 24, 2006 -- July 31, 2010

    SciTech Connect

    Wiedeman, S.

    2011-05-01

    Global Solar Energy rapidly advances the cost and performance of commercial thin-film CIGS products using roll-to-roll processing on steel foil substrate in compact, low cost deposition equipment with in-situ sensors for real-time intelligent process control. Substantial increases in power module efficiency, which now exceeds 13%, are evident at GSE factories in two countries with a combined capacity greater than 75 MW. During 2009 the average efficiency of cell strings (3780 cm2) was increased from 7% to over 11%, with champion results exceeding 13%. Continued testing of module reliability in rigid product has reaffirmed extended life expectancy for standard glass product, and has qualified additional lower-cost methods and materials. Expected lifetime for PV in flexible packages continues to increase as failure mechanisms are elucidated and resolved through better methods and materials. Cost reduction has been achieved through better materials utilization and enhanced vendor and material qualification and selection. The largest cost gains have come from higher cell conversion efficiency and yields, higher processing rates, greater automation, and improved control in all process steps. These improvements are integral to this thin film PV partnership program, and all were realized within the 'Gen2' manufacturing plants, processes and equipment.

  11. An Internet-based symptom questionnaire that is reliable, valid, and available to psychiatrists, neurologists, and psychologists.

    PubMed

    Gualtieri, C Thomas

    2007-10-03

    The Neuropsych Questionnaire (NPQ) addresses 2 important clinical issues: how to screen patients for a wide range of neuropsychiatric disorders quickly and efficiently, and how to acquire independent verification of a patient's complaints. The NPQ is available over the Internet in adult and pediatric versions. The adult version of the NPQ consists of 207 simple questions about common symptoms of neuropsychiatric disorders. The NPQ scores patient and/or observer responses in terms of 20 symptom clusters: inattention, hyperactivity-impulsivity, learning problems, memory, anxiety, panic, agoraphobia, obsessions and compulsions, social anxiety, depression, mood instability, mania, aggression, psychosis, somatization, fatigue, sleep, suicide, pain, and substance abuse. The NPQ is reliable (patients tested twice, patient-observer pairs, 2 observers) and discriminates patients with different diagnoses. Scores generated by the NPQ correlate reasonably well with commonly used rating scales, and the test is sensitive to the effects of treatment. The NPQ is suitable for initial patient evaluations, and a short form is appropriate for follow-up assessment. The availability of a comprehensive computerized symptom checklist can help to make the day-to-day practice of psychiatry, neurology, and neuropsychology more objective.

  14. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    SciTech Connect

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective of helping sustain the existing commercial nuclear power plants (NPPs). To accomplish this objective, there are multiple LWRS "pathways," or research and development (R&D) focus areas. One LWRS focus area is the Risk-Informed Safety Margin Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included human reliability analysis (HRA), but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  15. Reliable prediction of insulin resistance by a school-based fitness test in middle-school children.

    PubMed

    Varness, Todd; Carrel, Aaron L; Eickhoff, Jens C; Allen, David B

    2009-01-01

    Objectives. (1) Determine the predictive value of a school-based test of cardiovascular fitness (CVF) for insulin resistance (IR); (2) compare a "school-based" prediction of IR to a "laboratory-based" prediction using various measures of fitness and body composition. Methods. Middle school children (n = 82) performed the Progressive Aerobic Cardiovascular Endurance Run (PACER), a school-based CVF test, and underwent maximal oxygen consumption treadmill testing (VO(2) max), body composition assessment (percent body fat and BMI z score), and IR evaluation (derived homeostasis model assessment index [HOMA(IR)]). Results. PACER showed a strong correlation with VO(2) max/kg (r(s) = 0.83, P < .001) and with HOMA(IR) (r(s) = -0.60, P < .001). Multivariate regression analysis revealed that a school-based model (using PACER and BMI z score) predicted IR similarly to a laboratory-based model (using VO(2) max/kg of lean body mass and percent body fat). Conclusions. The PACER is a valid school-based test of CVF, is predictive of IR, and has a similar relationship to IR when compared to complex laboratory-based testing. Simple school-based measures of childhood fitness (PACER) and fatness (BMI z score) could be used to identify childhood risk for IR and to evaluate interventions.
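
    The "school-based model" described above is, in essence, a two-predictor regression. The sketch below fits such a model on synthetic stand-in data (PACER laps, BMI z score, HOMA-IR); the coefficients and noise level are invented and carry no clinical meaning.

      # A minimal sketch (Python): regressing HOMA-IR on PACER laps and BMI z score.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(3)
      n = 82
      pacer = rng.integers(10, 80, n).astype(float)  # laps completed
      bmi_z = rng.normal(0.5, 1.0, n)                # BMI z score
      homa_ir = 3.0 - 0.02 * pacer + 0.8 * bmi_z + rng.normal(0.0, 0.5, n)

      X = np.column_stack([pacer, bmi_z])
      model = LinearRegression().fit(X, homa_ir)
      print(f"coefficients: {model.coef_}, R^2 = {model.score(X, homa_ir):.2f}")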

  16. Using a Web-Based Approach to Assess Test-Retest Reliability of the "Hypertension Self-Care Profile" Tool in an Asian Population: A Validation Study.

    PubMed

    Koh, Yi Ling Eileen; Lua, Yi Hui Adela; Hong, Liyue; Bong, Huey Shin Shirley; Yeo, Ling Sui Jocelyn; Tsang, Li Ping Marianne; Ong, Kai Zhi; Wong, Sook Wai Samantha; Tan, Ngiap Chuan

    2016-03-01

    Essential hypertension often requires affected patients to self-manage their condition most of the time. Besides seeking regular medical review of their life-long condition to detect vascular complications, patients have to maintain healthy lifestyles between physician consultations via diet and physical activity, and take their medications according to their prescriptions. Their self-management ability is influenced by their self-efficacy capacity, which can be assessed using questionnaire-based tools. The "Hypertension Self-Care Profile" (HTN-SCP) is one such questionnaire, assessing self-efficacy in the domains of "behavior," "motivation," and "self-efficacy." This study aims to determine the test-retest reliability of the HTN-SCP in an English-literate Asian population using a web-based approach. Multiethnic Asian patients, aged 40 years and older, with essential hypertension were recruited from a typical public primary care clinic in Singapore. The investigators guided the patients to fill out the web-based 60-item HTN-SCP in English using a tablet or smartphone on the first visit, and the patients completed the instrument again 2 weeks later in the retest. Internal consistency and test-retest reliability were evaluated using Cronbach's Alpha and intraclass correlation coefficients (ICC), respectively. The t test was used to determine the relationship between the overall HTN-SCP scores of the patients and their self-reported self-management activities. A total of 160 patients completed the HTN-SCP during the initial test, from which 71 test-retest responses were completed. No floor or ceiling effect was found for the scores of the 3 subscales. Cronbach's Alpha coefficients were 0.857, 0.948, and 0.931 for the "behavior," "motivation," and "self-efficacy" domains respectively, indicating high internal consistency. The item-total correlation ranges for the 3 scales were 0.105 to 0.656 for Behavior, 0.401 to 0.808 for Motivation, and 0.349 to 0.789 for Self-efficacy. The corresponding

  18. Assessment of performance and reliability of computer-aided detection scheme using content-based image retrieval approach and limited reference database.

    PubMed

    Wang, Xiao Hui; Park, Sang Cheol; Zheng, Bin

    2011-04-01

    A content-based image retrieval (CBIR) approach was used in our computer-aided detection (CAD) schemes for breast cancer detection in mammography. In this study, we assessed CAD performance and reliability using a reference database including 1500 positive (breast mass) regions of interest (ROIs) and 1500 normal ROIs. To test the relationship between CAD performance and the similarity level between the queried ROI and the retrieved ROIs, we applied a set of similarity thresholds to the retrieved similar ROIs selected by the CAD schemes for all queried suspicious regions, and used only the ROIs above the threshold when assessing CAD performance at each threshold level. Using the leave-one-out testing method, we computed areas under receiver operating characteristic (ROC) curves (A(Z)) to assess CAD performance. The experimental results showed that as the threshold increased, (1) fewer true positive ROIs than normal ROIs could be referenced in the database, and (2) the A(Z) value increased monotonically from 0.854 ± 0.004 to 0.932 ± 0.016. This study suggests that (1) in order to more accurately detect and diagnose subtle masses, a large and diverse database is required, and (2) assessing the reliability of the decision scores based on the similarity measurement is important when applying CBIR-based CAD schemes with a limited database.
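
    The thresholding experiment above can be summarized in a few lines: keep only cases whose retrieved ROIs exceed a similarity threshold and recompute the ROC area on what remains. The sketch below reproduces the mechanics on synthetic scores and similarities; the numbers are invented, not the study's results.

      # A minimal sketch (Python): ROC area as a function of a similarity threshold.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      n = 1000
      labels = rng.integers(0, 2, n)         # 1 = mass ROI, 0 = normal ROI
      similarity = rng.uniform(0.5, 1.0, n)  # query-to-retrieval similarity
      # Synthetic CAD scores, more informative when retrievals are more similar:
      scores = labels * similarity + rng.normal(0.0, 0.35, n)

      for thr in (0.5, 0.7, 0.9):
          keep = similarity >= thr
          auc = roc_auc_score(labels[keep], scores[keep])
          print(f"threshold {thr:.1f}: {keep.sum():4d} cases, AUC = {auc:.3f}")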

  19. Dual-Fuel Combustion Turbine Provides Reliable Power to U.S. Navy Submarine Base New London in Groton, Connecticut

    SciTech Connect

    Halverson, Mark A.

    2002-01-01

    In keeping with a long-standing tradition of running Base utilities as a business, the U.S. Navy Submarine Base New London installed a dual-fuel combustion turbine with a heat recovery boiler. The 5-megawatt (MW) gas- and oil-fired combustion turbine sits within the Lower Base area, just off the shores of the Thames River. The U.S. Navy owns, operates, and maintains the combined heat and power (CHP) plant, which provides power to the Navy's nuclear submarines when they are in port and to the Navy's training facilities at the Submarine Base. Heat recovered from the turbine is used to produce steam for use in Base housing, medical facilities, and laundries. In FY00, the Navy estimates that it will save over $500,000 per year as a result of the combined heat and power unit.

  20. A multitracer test proving the reliability of Rayleigh equation-based approach for assessing biodegradation in a BTEX contaminated aquifer.

    PubMed

    Fischer, Anko; Bauer, Jana; Meckenstock, Rainer U; Stichler, Willibald; Griebler, Christian; Maloszewski, Piotr; Kästner, Matthias; Richnow, Hans H

    2006-07-01

    Compound-specific stable isotope analysis (CSIA) is one of the most important methods for assessing biodegradation activities in contaminated aquifers. Although the concept is straightforward, proof that the method can be used not only for qualitative analysis but also to quantify biodegradation in the subsurface was missing. We therefore performed a multitracer test in the field with ring-deuterated (d5) and completely (d8) deuterium-labeled toluene isotopologues (400 g) as reactive tracers, as well as bromide as a conservative tracer. The compounds were injected into the anoxic zone of a BTEX plume located down-gradient of the contaminant source. Over a period of 4.5 months the tracer concentrations were analyzed at two control planes located 24 and 35 m downgradient of the injection well. Deuterium-labeled benzylsuccinate was found in the aquifer, indicating anaerobic biodegradation of the deuterated toluene via the benzylsuccinate synthase pathway. Three independent methods were applied to quantify the biodegradation of deuterated toluene. First, fractionation of toluene-d8 and toluene-d5 using the Rayleigh equation and an appropriate laboratory-derived isotope fractionation factor was used to calculate the microbial decomposition of the deuterated toluene isotopologues (CSIA method). Second, biodegradation was quantified from the changes in the concentrations of deuterated toluene relative to bromide. Both methods gave similar results, implying that the CSIA method is a reliable tool to quantify biodegradation in contaminated aquifers. The results of both methods yielded a biodegradation of the deuterated toluene isotopologues of approximately 23-29% at the first and 44-51% at the second control plane. Third, the mineralization of the deuterated toluene isotopologues was verified by determining the enrichment of deuterium in the groundwater. This method indicated that part of the deuterium was assimilated into the biomass of toluene degrading
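
    The CSIA quantification above rests on the Rayleigh equation; a worked sketch of the arithmetic follows. The delta values and the enrichment factor are illustrative placeholders, not the field data or laboratory factors from this study.

      # A minimal sketch (Python): fraction biodegraded from isotopic enrichment.
      def biodegradation_fraction(delta_source, delta_sample, epsilon):
          """B = 1 - f with f = ((d_sample + 1000) / (d_source + 1000)) ** (1000 / epsilon).

          epsilon is the enrichment factor in per mille (negative for normal
          isotope effects), derived from laboratory degradation experiments.
          """
          f = ((delta_sample + 1000.0) / (delta_source + 1000.0)) ** (1000.0 / epsilon)
          return 1.0 - f

      # Hypothetical signatures (per mille) and enrichment factor:
      B = biodegradation_fraction(delta_source=-50.0, delta_sample=120.0, epsilon=-700.0)
      print(f"estimated biodegradation: {B:.0%}")  # ~21%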

  1. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of the quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.
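
    A common way to propagate such design uncertainties is stress-strength Monte Carlo sampling: reliability is the probability that capacity exceeds demand. The sketch below shows this on invented distributions; the load and capacity parameters are illustrative assumptions, not the paper's models.

      # A minimal sketch (Python): Monte Carlo stress-strength reliability.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000

      capacity = rng.normal(loc=28.0, scale=2.5, size=n)             # joint capacity (kN)
      demand = rng.lognormal(mean=np.log(18.0), sigma=0.12, size=n)  # mission load (kN)

      print(f"estimated reliability: {np.mean(capacity > demand):.6f}")

      # Crude sensitivity check: tighten the capacity scatter by 20%.
      capacity_tight = rng.normal(loc=28.0, scale=2.0, size=n)
      print(f"with reduced scatter:  {np.mean(capacity_tight > demand):.6f}")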

  2. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  3. Assessing communication quality of consultations in primary care: initial reliability of the Global Consultation Rating Scale, based on the Calgary-Cambridge Guide to the Medical Interview

    PubMed Central

    Burt, Jenni; Abel, Gary; Elmore, Natasha; Campbell, John; Roland, Martin; Benson, John; Silverman, Jonathan

    2014-01-01

    Objectives: To investigate the initial reliability of the Global Consultation Rating Scale (GCRS: an instrument to assess the effectiveness of communication across an entire doctor-patient consultation, based on the Calgary-Cambridge guide to the medical interview) in simulated patient consultations. Design: Multiple ratings of simulated general practitioner (GP)-patient consultations by trained GP evaluators. Setting: UK primary care. Participants: 21 GPs and six trained GP evaluators. Outcome measures: GCRS score. Methods: 6 GP raters used GCRS to rate randomly assigned video recordings of GP consultations with simulated patients. Each of the 42 consultations was rated separately by four raters. We considered whether a fixed difference between scores had the same meaning at all levels of performance. We then examined the reliability of GCRS using mixed linear regression models. We augmented our regression model to also examine whether there were systematic biases between the scores given by different raters and to look for possible order effects. Results: Assessing the communication quality of individual consultations, GCRS achieved a reliability of 0.73 (95% CI 0.44 to 0.79) for two raters, 0.80 (0.54 to 0.85) for three and 0.85 (0.61 to 0.88) for four. We found an average difference of 1.65 (on a 0-10 scale) in the scores given by the least and most generous raters: adjusting for this evaluator bias increased reliability to 0.78 (0.53 to 0.83) for two raters, 0.85 (0.63 to 0.88) for three and 0.88 (0.69 to 0.91) for four. There were considerable order effects, with later consultations (after 15-20 ratings) receiving, on average, scores more than one point higher on a 0-10 scale. Conclusions: GCRS shows good reliability with three raters assessing each consultation. We are currently developing the scale further by assessing a large sample of real-world consultations. PMID:24604483
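
    As a consistency check on the figures above, the Spearman-Brown prophecy formula, rel_k = k*r / (1 + (k-1)*r), reproduces the reported multi-rater reliabilities from the two-rater value. This reconstruction assumes classical scaling with rater number; it is not the paper's mixed-model computation.

      # A minimal sketch (Python): Spearman-Brown scaling of inter-rater reliability.
      def spearman_brown(single_rater_rel, k):
          """Reliability of the mean of k raters, given one rater's reliability."""
          r = single_rater_rel
          return k * r / (1.0 + (k - 1.0) * r)

      r1 = 0.73 / (2.0 - 0.73)  # invert the reported two-rater reliability of 0.73
      for k in (2, 3, 4):
          print(f"{k} raters: predicted reliability = {spearman_brown(r1, k):.2f}")
      # -> 0.73, 0.80, 0.84, in line with the reported 0.73, 0.80 and 0.85.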

  4. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as KR-20 coefficients and…

  5. HELIOS Critical Design Review: Reliability

    NASA Technical Reports Server (NTRS)

    Benoehr, H. C.; Herholz, J.; Prem, H.; Mann, D.; Reichert, L.; Rupp, W.; Campbell, D.; Boettger, H.; Zerwes, G.; Kurvin, C.

    1972-01-01

    This paper presents the Helios Critical Design Review on reliability, from October 16-20, 1972. The topics include: 1) Reliability Requirement; 2) Reliability Apportionment; 3) Failure Rates; 4) Reliability Assessment; 5) Reliability Block Diagram; and 6) Reliability Information Sheet.

  6. How to work towards more reliable residual water estimations? State of the art in Switzerland and ideas for using physical based approaches

    NASA Astrophysics Data System (ADS)

    Floriancic, Marius; Margreth, Michael; Naef, Felix

    2016-04-01

    Reliable low flow estimations are important for many ecologic and economic reasons. In Switzerland the basis for defining residual flow is Q347 (Q95), the discharge exceeded on 347 days per year. To improve estimations, we need further knowledge of the dominant processes of storage and drainage during low flow periods. We present new approaches to defining Q347 based on the physical properties of the contributing slopes and catchment parts. We used dominant runoff process maps, representing the storage and drainage capacity of soils, to predict discharge during dry periods. We found that recession depends on these processes, but that during low flow periods and times of water scarcity different mechanisms sustain discharge and streamflow. During an extended field campaign in the dry summer of 2015, we surveyed the drainage behavior of different landscape elements in the Swiss midlands and found major differences in their contribution to discharge. The contributing storages have small volumes but long residence times, mainly influenced by the pore volume distribution and flow paths in fractured rocks and bedrock. We found that steep areas formed of sandstones are more likely to support higher flows than flat alluvial valleys or areas with thick moraine deposits, where infiltration takes place more frequently. The gathered knowledge helps in assessing catchment-scale low flow issues and supports more reliable estimations of water availability during dry periods. Furthermore, the presented approach may help detect areas more or less vulnerable to extended dry periods, an important ecologic and economic issue, especially under changing climatic conditions.
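
    Since Q347 is the discharge exceeded on 347 of 365 days, it corresponds to roughly the 5th percentile of the daily flow distribution, which makes it easy to compute from a discharge record. The sketch below uses a synthetic series purely for illustration.

      # A minimal sketch (Python): Q347 from a daily discharge series.
      import numpy as np

      rng = np.random.default_rng(7)
      daily_q = rng.lognormal(mean=1.0, sigma=0.6, size=365 * 10)  # m^3/s, 10 years

      # Exceeded on 347 of 365 days -> non-exceedance probability 18/365.
      q347 = np.quantile(daily_q, 18.0 / 365.0)
      print(f"Q347 ~ {q347:.2f} m^3/s")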

  7. Inter-operator Reliability of Magnetic Resonance Image-Based Computational Fluid Dynamics Prediction of Cerebrospinal Fluid Motion in the Cervical Spine.

    PubMed

    Martin, Bryn A; Yiallourou, Theresia I; Pahlavian, Soroush Heidari; Thyagaraj, Suraj; Bunck, Alexander C; Loth, Francis; Sheffer, Daniel B; Kröger, Jan Robert; Stergiopulos, Nikolaos

    2016-05-01

    For the first time, the inter-operator dependence of MRI-based computational fluid dynamics (CFD) modeling of cerebrospinal fluid (CSF) in the cervical spinal subarachnoid space (SSS) is evaluated. In vivo MRI flow measurements and anatomic MRI images were obtained at the cervico-medullary junction of a healthy subject and a Chiari I malformation patient. 3D anatomies of the SSS were reconstructed by manual segmentation by four independent operators for both cases. CFD results were compared at nine axial locations along the SSS in terms of hydrodynamic and geometric parameters. The intraclass correlation (ICC) assessed the inter-operator agreement for each parameter over the axial locations, and the coefficient of variation (CV) compared the percentage of variance for each parameter between the operators. Greater operator dependence was found for the patient (0.19 < ICC < 0.99) near the craniovertebral junction than for the healthy subject (ICC > 0.78). For the healthy subject, hydraulic diameter and Womersley number had the least variance (CV = ~2%). For the patient, peak diastolic velocity and Reynolds number had the smallest variance (CV = ~3%). These results show a high degree of inter-operator reliability for MRI-based CFD simulations of CSF flow in the cervical spine for healthy subjects and a lower degree of reliability for patients with Type I Chiari malformation.

  8. Reliability assessment of GaAs- and InP-based diode lasers for high-energy single-pulse operation

    NASA Astrophysics Data System (ADS)

    Maiorov, M.; Damm, D.; Trofimov, I.; Zeidel, V.; Sellers, R.

    2009-08-01

    With the maturing of high-power diode laser technology, studies of laser-assisted ignition of a variety of substances are becoming an increasingly popular research topic. The range of applications is wide - from fusing in the defense, construction and exploration industries to ignition in future combustion engines. Recent advances in InP-based technology have expanded the wavelength range that can be covered by multi-watt GaAs- and InP-based diode lasers to about 0.8 to 2 μm. With such a wide range, the wattage is no longer the sole defining factor for efficient ignition. Ignition-related studies should include the interaction of radiation of various wavelengths with matter and the reliability of devices based on different material systems. In this paper, we focus on the reliability of pulsed laser diodes for use in ignition applications. We discuss the existing data on the catastrophic optical damage (COD) of the mirrors of GaAs-based laser diodes and present a non-destructive test method to predict the COD level of a particular device. This allows pre-characterization of the devices intended for fusing, to eliminate failures during single-pulse operation in the field. We also tested InP-based devices and demonstrated that their maximum power is not limited by COD. Currently, devices with >10 W output power are available in both GaAs- and InP-based material systems, which dramatically expands the potential use of laser diodes in ignition systems.

  9. In-plant reliability data base for nuclear plant components: a feasibility study on human error information

    SciTech Connect

    Borkowski, R.J.; Fragola, J.R.; Schurman, D.L.; Johnson, J.W.

    1984-03-01

    This report documents the procedure and final results of a feasibility study which examined the usefulness of nuclear plant maintenance work requests in the IPRDS as tools for understanding human error and its influence on component failure and repair. Developed in this study were (1) a set of criteria for judging the quality of a plant maintenance record set for studying human error; (2) a scheme for identifying human errors in the maintenance records; and (3) two taxonomies (engineering-based and psychology-based) for categorizing and coding human error-related events.

  10. On the Reliability and Validity of Human and LSA-Based Evaluations of Complex Student-Authored Texts

    ERIC Educational Resources Information Center

    Seifried, Eva; Lenhard, Wolfgang; Baier, Herbert; Spinath, Birgit

    2012-01-01

    This study investigates the potential of a software tool based on Latent Semantic Analysis (LSA; Landauer, McNamara, Dennis, & Kintsch, 2007) to automatically evaluate complex German texts. A sample of N = 94 German university students provided written answers to questions that involved a high amount of analytical reasoning and evaluation.…

  11. Curriculum-Based Measurement of Oral Reading: A Preliminary Investigation of Confidence Interval Overlap to Detect Reliable Growth

    ERIC Educational Resources Information Center

    Van Norman, Ethan R.

    2016-01-01

    Curriculum-based measurement of oral reading (CBM-R) progress monitoring data is used to measure student response to instruction. Federal legislation permits educators to use CBM-R progress monitoring data as a basis for determining the presence of specific learning disabilities. However, decision making frameworks originally developed for CBM-R…

  12. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well known and widely used experimental reliability "passport" of a mass manufactured electronic or photonic product, the bathtub curve, reflects the combined contribution of statistics-related and reliability-physics (physics-of-failure) related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increasing failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution was used for the statistical failure rate, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. Future work should include investigations of how the powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
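
    The deduction described above is simple to state numerically: subtract an assumed statistics-related hazard rate from the measured bathtub ordinates and read off the physics-of-failure remainder. In the sketch below both the bathtub data and the Rayleigh parameter are invented for illustration.

      # A minimal sketch (Python): separating the aging-related failure rate.
      import numpy as np

      t = np.linspace(1.0, 10.0, 10)               # years in service
      bathtub = 0.002 + 0.0004 * (t - 1.0) ** 1.5  # observed failure rate (1/yr)

      sigma = 40.0                                 # assumed Rayleigh parameter (yr)
      lambda_stat = t / sigma**2                   # Rayleigh hazard rate

      lambda_phys = bathtub - lambda_stat          # physics-of-failure remainder
      for ti, lp in zip(t, lambda_phys):
          print(f"t = {ti:4.1f} yr: physics-related rate = {lp:.5f} /yr")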

  13. Rabbit System. Low cost, high reliability front end electronics featuring 16 bit dynamic range. [Redundant Analog Bus Based Information Transfer]

    SciTech Connect

    Drake, G.; Droege, T.F.; Nelson, C.A. Jr.; Turner, K.J.; Ohska, T.K.

    1985-10-01

    A new crate-based front end system has been built which features low cost, compact packaging, command capability, 16 bit dynamic range digitization, and a high degree of redundancy. The crate can contain a variety of instrumentation modules, and is designed to be situated close to the detector. The system is suitable for readout of a large number of channels via parallel multiprocessor data acquisition.

  14. Development of Stronger and More Reliable Cast Austenitic Stainless Steels (H-Series) Based on Scientific Design Methodology

    SciTech Connect

    Muralidharan, G.; Sikka, V.K.; Pankiw, R.I.

    2006-04-15

    The goal of this program was to increase the high-temperature strength of the H-Series of cast austenitic stainless steels by 50% and the upper use temperature by 86 to 140°F (30 to 60°C). Meeting this goal is expected to result in energy savings of 38 trillion Btu/year by 2020 and energy cost savings of $185 million/year. The higher strength H-Series of cast stainless steels (HK and HP type) have applications in the production of ethylene in the chemical industry, in radiant burner tubes and transfer rolls for secondary processing of steel in the steel industry, and in many applications in the heat-treating industry. The project was led by Duraloy Technologies, Inc. with research participation by the Oak Ridge National Laboratory (ORNL) and industrial participation by a diverse group of companies. Energy Industries of Ohio (EIO) was also a partner in this project. Each team partner had well-defined roles. Duraloy Technologies led the team by identifying the base alloys that were to be improved through this research. Duraloy Technologies also provided an extensive creep database on current alloys, provided creep-tested specimens of certain commercial alloys, and carried out centrifugal casting and component fabrication of newly designed alloys. Nucor Steel was the first partner company to install the radiant burner tube assembly in their heat-treating furnace. Other steel companies participated in project review meetings and are currently working with Duraloy Technologies to obtain components of the new alloys. EIO is promoting the enhanced performance of the newly designed alloys to Ohio-based companies. The Timken Company is one of the Ohio companies being promoted by EIO. The project management and coordination plan is shown in Fig. 1.1. A related project at University of Texas-Arlington (UT-A) is described in Development of Semi-Stochastic Algorithm for Optimizing Alloy Composition of High-Temperature Austenitic Stainless Steels (H-Series) for Desired

  15. Modelling a reliable wind/PV/storage power system for remote radio base station sites without utility power

    NASA Astrophysics Data System (ADS)

    Bitterlin, Ian F.

    The development of photovoltaic (PV) cells has made steady progress from the early days, when only the USA space program could afford to deploy them, to now, seeing them applied to roadside applications even in our Northern European climes. The manufacturing cost per watt has fallen and the daylight-to-power conversion efficiency increased. At the same time, the perception that the sun has to be shining directly on a PV array for it to work has faded. On some of those roadside applications, particularly for remote emergency telephones or for temporary roadwork signage where a utility electrical power connection is not practical, the keen observer will spot, usually in addition to a PV array, a small wind turbine and an electrical cabinet quite obviously (by virtue of its volume) containing a storage battery. In the UK, we have the lion's share (>40%) of Europe's entire wind power resource although, despite press coverage of the "anti-wind" lobby to the contrary, we have hardly started to harvest this clean and free energy source. Taking this established and proven roadside solution one step further, we consider higher power applications. A cellular phone system is one where a multitude of remote radio base stations (RBS) are required to provide geographical coverage. With networks developing into the so-called "3G" technologies, the need for base stations has tripled, as each 3G cell covers only 1/3 the geographical area of its "2G" counterpart. To cover >90% of the UK's topology (>97% population coverage) with 3G cellular technology will require in excess of 12,000 radio base stations per operator network. In 2001, there were around 25,000 established sites and, with an anticipated degree of collocation by necessity, that figure is forecast to rise to >47,000. Of course, the vast majority of these sites have a convenient grid connection. However, it is easy to see that the combination of wind and PV power generation and an energy storage system may be an

  16. Population-based study of intra-household gender differences in water insecurity: reliability and validity of a survey instrument for use in rural Uganda.

    PubMed

    Tsai, Alexander C; Kakuhikire, Bernard; Mushavi, Rumbidzai; Vořechovská, Dagmar; Perkins, Jessica M; McDonough, Amy Q; Bangsberg, David R

    2016-04-01

    Hundreds of millions of people worldwide lack adequate access to water. Water insecurity, which is defined as having limited or uncertain availability of safe water or the ability to acquire safe water in socially acceptable ways, is typically overlooked by development organizations focusing on water availability. To address the urgent need in the literature for validated measures of water insecurity, we conducted a population-based study in rural Uganda with 327 reproductive-age women and 204 linked men from the same households. We used a novel method of photo identification so that we could accurately elicit study participants' primary household water sources, thereby enabling us to identify water sources for objective water quality testing and distance/elevation measurement. Our psychometric analyses provided strong evidence of the internal structure, reliability, and validity of a new eight-item Household Water Insecurity Access Scale (HWIAS). Important intra-household gender differences in perceptions of water insecurity were observed, with men generally perceiving household water insecurity as being less severe compared to women. In summary, the HWIAS represents a reliable and valid measure of water insecurity, particularly among women, and may be useful for informing and evaluating interventions to improve water access in resource-limited settings. PMID:27105413

  18. Equilibrating errors: reliable estimation of information transmission rates in biological systems with spectral analysis-based methods.

    PubMed

    Ignatova, Irina; French, Andrew S; Immonen, Esa-Ville; Frolov, Roman; Weckström, Matti

    2014-06-01

    Shannon's seminal approach to estimating information capacity is widely used to quantify information processing by biological systems. However, Shannon information estimates based on power spectrum estimation necessarily contain two sources of error: time delay bias error and random error. These errors are particularly important for systems with relatively large time delay values and for responses of limited duration, as is often the case in experimental work. The window function type and size chosen, as well as the values of inherent delays, cause changes in both the delay bias and random errors, with a possibly strong effect on the estimates of system properties. Here, we investigated the properties of these errors using white-noise simulations and analysis of experimental photoreceptor responses to naturalistic and white-noise light contrasts. Photoreceptors were used from several insect species, each characterized by different visual performance, behavior, and ecology. We show that the effect of random error on the spectral estimates of photoreceptor performance (gain, coherence, signal-to-noise ratio, Shannon information rate) is opposite to that of the time delay bias error: the former overestimates information rate, while the latter underestimates it. We propose a new algorithm for reducing the impact of time delay bias error and random error, based on discovering, and then using, the window size at which the absolute values of these errors are equal and opposite, thus cancelling each other and allowing minimally biased measurement of neural coding.
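
    A minimal sketch of the coherence-based spectral estimate at the heart of this trade-off. The simulated delayed noisy channel and the window sizes scanned are assumptions for illustration, and the paper's actual cancellation algorithm is not reproduced; the scan simply shows how strongly the estimated information rate depends on the window size:

```python
import numpy as np
from scipy.signal import coherence

def info_rate(x, y, fs, nperseg):
    """Shannon information rate (bits/s): R = -integral log2(1 - gamma2) df,
    where gamma2 is the magnitude-squared coherence."""
    f, g2 = coherence(x, y, fs=fs, nperseg=nperseg)
    g2 = np.clip(g2, 0.0, 0.999)          # guard against log2(0)
    return np.trapz(-np.log2(1.0 - g2), f)

# White-noise stimulus passed through a hypothetical delayed, noisy channel
fs, n, delay = 1000.0, 2**15, 20          # 20-sample delay mimics response latency
rng = np.random.default_rng(1)
x = rng.standard_normal(n)
y = np.roll(x, delay) + 0.5 * rng.standard_normal(n)

# Short windows inflate the delay bias error (underestimation); long windows
# inflate the random error (overestimation). The paper seeks the window size
# at which the two cancel.
for nperseg in (128, 256, 512, 1024, 2048):
    print(nperseg, round(info_rate(x, y, fs, nperseg), 1))
```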

  19. Multisite longitudinal reliability of tract-based spatial statistics in diffusion tensor imaging of healthy elderly subjects.

    PubMed

    Jovicich, Jorge; Marizzoni, Moira; Bosch, Beatriz; Bartrés-Faz, David; Arnold, Jennifer; Benninghoff, Jens; Wiltfang, Jens; Roccatagliata, Luca; Picco, Agnese; Nobili, Flavio; Blin, Oliver; Bombois, Stephanie; Lopes, Renaud; Bordet, Régis; Chanoine, Valérie; Ranjeva, Jean-Philippe; Didic, Mira; Gros-Dagnac, Hélène; Payoux, Pierre; Zoccatelli, Giada; Alessandrini, Franco; Beltramello, Alberto; Bargalló, Núria; Ferretti, Antonio; Caulo, Massimo; Aiello, Marco; Ragucci, Monica; Soricelli, Andrea; Salvadori, Nicola; Tarducci, Roberto; Floridi, Piero; Tsolaki, Magda; Constantinidis, Manos; Drevelegas, Antonios; Rossini, Paolo Maria; Marra, Camillo; Otto, Josephin; Reiss-Zimmermann, Martin; Hoffmann, Karl-Titus; Galluzzi, Samantha; Frisoni, Giovanni B

    2014-11-01

    Large-scale longitudinal neuroimaging studies with diffusion imaging techniques are necessary to test and validate models of white matter neurophysiological processes that change over time, both in healthy and diseased brains. The predictive power of such longitudinal models will always be limited by the reproducibility of repeated measures acquired during different sessions. At present, there is limited quantitative knowledge about the across-session reproducibility of standard diffusion metrics in 3T multi-centric studies on subjects in stable conditions, in particular when using tract-based spatial statistics and with elderly people. In this study we implemented a multi-site brain diffusion protocol in 10 clinical 3T MRI sites distributed across 4 countries in Europe (Italy, Germany, France and Greece) using vendor-provided sequences from Siemens (Allegra, Trio Tim, Verio, Skyra, Biograph mMR), Philips (Achieva) and GE (HDxt) scanners. We acquired DTI data (2 × 2 × 2 mm(3), b = 700 s/mm(2), 5 b0 and 30 diffusion-weighted volumes) of a group of healthy stable elderly subjects (5 subjects per site) in two separate sessions at least a week apart. For each subject and session four scalar diffusion metrics were considered: fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD) and axial diffusivity (AD). The diffusion metrics from multiple subjects and sessions at each site were aligned to their common white matter skeleton using tract-based spatial statistics. The reproducibility at each MRI site was examined by looking at group averages of absolute changes relative to the mean (%) on various parameters: i) reproducibility of the signal-to-noise ratio (SNR) of the b0 images in centrum semiovale, ii) full brain test-retest differences of the diffusion metric maps on the white matter skeleton, iii) reproducibility of the diffusion metrics on atlas-based white matter ROIs on the white matter skeleton. Despite the differences of MRI scanner
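
    The reproducibility statistic used above (the group average of the absolute test-retest change relative to the mean, in %) is simple to state explicitly. A minimal sketch on simulated stand-in FA maps, not study data:

```python
import numpy as np

def test_retest_error(scan1, scan2):
    """Mean absolute difference between sessions relative to the mean (%)."""
    mean = 0.5 * (scan1 + scan2)
    ok = mean > 0                          # skip empty skeleton voxels
    return float(np.mean(100.0 * np.abs(scan1[ok] - scan2[ok]) / mean[ok]))

rng = np.random.default_rng(2)
fa1 = np.clip(rng.normal(0.45, 0.10, 10_000), 0.0, 1.0)       # session 1 skeleton FA
fa2 = np.clip(fa1 + rng.normal(0.0, 0.01, 10_000), 0.0, 1.0)  # session 2, small noise
print(f"FA test-retest error: {test_retest_error(fa1, fa2):.2f}%")
```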

  20. Assuring Electronics Reliability: What Could and Should Be Done Differently

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    The following “ten commandments” for the predicted and quantified reliability of aerospace electronic and photonic products are addressed and discussed: 1) The best product is the best compromise between the needs for reliability, cost effectiveness and time-to-market; 2) Reliability cannot be low, need not be higher than necessary, but has to be adequate for a particular product; 3) When reliability is imperative, ability to quantify it is a must, especially if optimization is considered; 4) One cannot design a product with quantified, optimized and assured reliability by limiting the effort to the highly accelerated life testing (HALT) that does not quantify reliability; 5) Reliability is conceived at the design stage and should be taken care of, first of all, at this stage, when a “genetically healthy” product should be created; reliability evaluations and assurances cannot be delayed until the product is fabricated and shipped to the customer, i.e., cannot be left to the prognostics-and-health-monitoring/managing (PHM) stage; it is too late at this stage to change the design or the materials for improved reliability; that is why, when reliability is imperative, users re-qualify parts to assess their lifetime and use redundancy to build a highly reliable system out of insufficiently reliable components; 6) Design, fabrication, qualification and PHM efforts should consider and be specific for particular products and their most likely actual or at least anticipated application(s); 7) Probabilistic design for reliability (PDfR) is an effective means for improving the state-of-the-art in the field: nothing is perfect, and the difference between an unreliable product and a robust one is “merely” the probability of failure (PoF); 8) Highly cost-effective and highly focused failure oriented accelerated testing (FOAT) geared to a particular pre-determined reliability model and aimed at understanding the physics of failure anticipated by this model is an
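
    Item 7's PDfR idea reduces, in its simplest form, to stress-strength interference: the probability of failure is the probability that demand exceeds capacity. A minimal Monte Carlo sketch with entirely hypothetical lognormal distributions, not a model taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
stress = rng.lognormal(mean=3.0, sigma=0.25, size=n)    # hypothetical service load
strength = rng.lognormal(mean=3.6, sigma=0.15, size=n)  # hypothetical capacity

pof = np.mean(stress > strength)    # probability of failure: demand beats capacity
print(f"PoF ~ {pof:.2e} -> reliability ~ {1.0 - pof:.6f}")
```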

  1. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  2. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
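
    A minimal sketch of the fault-tree route from component reliability to system reliability, assuming independent failures. The gate structure and probabilities below describe a hypothetical converter in the spirit of the report's fictitious device, not its actual example:

```python
# OR gate: the output fails if any input fails; AND gate: only if all inputs fail.
def p_or(*probs):
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical top event: (both redundant power modules fail) OR the controller
# fails OR the cooling fan fails. Annual failure probabilities are illustrative.
p_module, p_ctrl, p_fan = 0.02, 0.005, 0.01
p_top = p_or(p_and(p_module, p_module), p_ctrl, p_fan)
print(f"P(system failure in 1 year) = {p_top:.4f}")
```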

  3. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  4. CRISPR-STAT: an easy and reliable PCR-based method to evaluate target-specific sgRNA activity.

    PubMed

    Carrington, Blake; Varshney, Gaurav K; Burgess, Shawn M; Sood, Raman

    2015-12-15

    CRISPR/Cas9 has emerged as a versatile genome-engineering tool that relies on a single guide RNA (sgRNA) and the Cas9 enzyme for genome editing. Simple, fast and economical methods to generate sgRNAs have made targeted mutagenesis routine in cultured cells, mice, zebrafish and other model systems. Pre-screening of sgRNAs for target efficacy is desirable both for successful mutagenesis and minimizing wasted animal husbandry on targets with poor activity. Here, we describe an easy, quick and cost-effective fluorescent polymerase chain reaction (PCR)-based method, CRISPR Somatic Tissue Activity Test (CRISPR-STAT), to determine target-specific efficiency of sgRNA. As a proof of principle, we validated our method using 28 sgRNAs with known and varied levels of germline transmission efficiency in zebrafish by analysis of their somatic activity in injected embryos. Our data revealed a strong positive correlation between the fluorescent PCR profiles of the injected embryos and the germline transmission efficiency. Furthermore, the assay was sensitive enough to evaluate multiplex gene targeting. This method is easy to implement by laboratories with access to a capillary sequencer. Although we validated the method using CRISPR/Cas9 and zebrafish, it can be applied to other model systems and other genome targeting nucleases. PMID:26253739

  5. Reliability of Sleep Measures from Four Personal Health Monitoring Devices Compared to Research-Based Actigraphy and Polysomnography.

    PubMed

    Mantua, Janna; Gravel, Nickolas; Spencer, Rebecca M C

    2016-01-01

    Polysomnography (PSG) is the "gold standard" for monitoring sleep. Alternatives to PSG are of interest for clinical, research, and personal use. Wrist-worn actigraph devices have been utilized in research settings for measures of sleep for over two decades. Whether sleep measures from commercially available devices are similarly valid is unknown. We sought to determine the validity of five wearable devices: Basis Health Tracker, Misfit Shine, Fitbit Flex, Withings Pulse O2, and a research-based actigraph, Actiwatch Spectrum. We used Wilcoxon Signed Rank tests to assess differences between devices relative to PSG and correlational analysis to assess the strength of the relationship. Data loss was greatest for Fitbit and Misfit. For all devices, we found no difference and strong correlation of total sleep time with PSG. Sleep efficiency differed from PSG for Withings, Misfit, Fitbit, and Basis, while Actiwatch mean values did not differ from that of PSG. Only mean values of sleep efficiency (time asleep/time in bed) from Actiwatch correlated with PSG, yet this correlation was weak. Light sleep time differed from PSG (nREM1 + nREM2) for all devices. Measures of Deep sleep time did not differ from PSG (SWS + REM) for Basis. These results reveal the current strengths and limitations in sleep estimates produced by personal health monitoring devices and point to a need for future development. PMID:27164110
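
    The statistical recipe described (a paired Wilcoxon signed-rank test for systematic device-PSG differences, plus a correlation for the strength of the relationship) can be sketched directly; the per-night total sleep times below are simulated, not study data:

```python
import numpy as np
from scipy.stats import wilcoxon, pearsonr

rng = np.random.default_rng(4)
psg = rng.normal(420.0, 40.0, 30)          # total sleep time [min], gold standard
device = psg + rng.normal(5.0, 15.0, 30)   # wearable: small bias plus noise

stat, p_diff = wilcoxon(device, psg)       # paired test for a systematic difference
r, p_corr = pearsonr(device, psg)          # strength of the association
print(f"Wilcoxon p = {p_diff:.3f}, Pearson r = {r:.2f}")
```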

  6. A reliability-based particle filter for humanoid robot self-localization in RoboCup Standard Platform League.

    PubMed

    Munera Sánchez, Eduardo; Muñoz Alcobendas, Manuel; Blanes Noguera, Juan Fco; Benet Gilabert, Ginés; Simó Ten, José E

    2013-01-01

    This paper deals with the problem of humanoid robot localization and proposes a new method for position estimation that has been developed for the RoboCup Standard Platform League environment. Firstly, a complete vision system has been implemented in the Nao robot platform that enables the detection of relevant field markers. The detection of field markers provides some estimation of distances for the current robot position. To reduce errors in these distance measurements, extrinsic and intrinsic camera calibration procedures have been developed and described. To validate the localization algorithm, experiments covering many of the typical situations that arise during RoboCup games have been developed: ranging from degradation in position estimation to total loss of position (due to falls, 'kidnapped robot', or penalization). The self-localization method developed is based on the classical particle filter algorithm. The main contribution of this work is a new particle selection strategy. Our approach reduces the CPU computing time required for each iteration and so eases the limited resource availability problem that is common in robot platforms such as Nao. The experimental results show the quality of the new algorithm in terms of localization and CPU time consumption.
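
    A minimal sketch of a particle filter with a crude reliability-based selection step: particles whose likelihood falls below a fraction of the best particle's are discarded before resampling, shrinking the set the CPU must process each iteration. This only illustrates the general idea; it is not the paper's actual selection strategy, and the 1-D motion and measurement models are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)
n, steps, true_x = 500, 20, 0.0
particles = rng.uniform(-10.0, 10.0, n)          # initial position hypotheses

for _ in range(steps):
    true_x += 1.0                                # robot advances 1 unit per step
    z = true_x + rng.normal(0.0, 0.5)            # noisy landmark-based range
    particles += 1.0 + rng.normal(0.0, 0.2, particles.size)  # motion update
    w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)          # likelihood weights
    keep = w > 0.1 * w.max()                     # reliability-based selection
    particles, w = particles[keep], w[keep]
    w /= w.sum()
    idx = rng.choice(particles.size, size=n, p=w)  # resample back to n particles
    particles = particles[idx]

print(f"estimate = {particles.mean():.2f}, truth = {true_x:.2f}")
```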

  7. A Reliability-Based Particle Filter for Humanoid Robot Self-Localization in RoboCup Standard Platform League

    PubMed Central

    Sánchez, Eduardo Munera; Alcobendas, Manuel Muñoz; Noguera, Juan Fco. Blanes; Gilabert, Ginés Benet; Simó Ten, José E.

    2013-01-01

    This paper deals with the problem of humanoid robot localization and proposes a new method for position estimation that has been developed for the RoboCup Standard Platform League environment. Firstly, a complete vision system has been implemented in the Nao robot platform that enables the detection of relevant field markers. The detection of field markers provides some estimation of distances for the current robot position. To reduce errors in these distance measurements, extrinsic and intrinsic camera calibration procedures have been developed and described. To validate the localization algorithm, experiments covering many of the typical situations that arise during RoboCup games have been developed: ranging from degradation in position estimation to total loss of position (due to falls, ‘kidnapped robot’, or penalization). The self-localization method developed is based on the classical particle filter algorithm. The main contribution of this work is a new particle selection strategy. Our approach reduces the CPU computing time required for each iteration and so eases the limited resource availability problem that is common in robot platforms such as Nao. The experimental results show the quality of the new algorithm in terms of localization and CPU time consumption. PMID:24193098

  8. Reliability of Sleep Measures from Four Personal Health Monitoring Devices Compared to Research-Based Actigraphy and Polysomnography

    PubMed Central

    Mantua, Janna; Gravel, Nickolas; Spencer, Rebecca M. C.

    2016-01-01

    Polysomnography (PSG) is the “gold standard” for monitoring sleep. Alternatives to PSG are of interest for clinical, research, and personal use. Wrist-worn actigraph devices have been utilized in research settings for measures of sleep for over two decades. Whether sleep measures from commercially available devices are similarly valid is unknown. We sought to determine the validity of five wearable devices: Basis Health Tracker, Misfit Shine, Fitbit Flex, Withings Pulse O2, and a research-based actigraph, Actiwatch Spectrum. We used Wilcoxon Signed Rank tests to assess differences between devices relative to PSG and correlational analysis to assess the strength of the relationship. Data loss was greatest for Fitbit and Misfit. For all devices, we found no difference and strong correlation of total sleep time with PSG. Sleep efficiency differed from PSG for Withings, Misfit, Fitbit, and Basis, while Actiwatch mean values did not differ from that of PSG. Only mean values of sleep efficiency (time asleep/time in bed) from Actiwatch correlated with PSG, yet this correlation was weak. Light sleep time differed from PSG (nREM1 + nREM2) for all devices. Measures of Deep sleep time did not differ from PSG (SWS + REM) for Basis. These results reveal the current strengths and limitations in sleep estimates produced by personal health monitoring devices and point to a need for future development. PMID:27164110

  11. Reliability in aposematic signaling

    PubMed Central

    2010-01-01

    In light of recent work, we will expand on the role and variability of aposematic signals. The focus of this review will be the concepts of reliability and honesty in aposematic signaling. We claim that reliable signaling can solve the problem of aposematic evolution, and that variability in reliability can shed light on the complexity of aposematic systems. PMID:20539774

  12. Viking Lander reliability program

    NASA Technical Reports Server (NTRS)

    Pilny, M. J.

    1978-01-01

    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  13. Reliability as Argument

    ERIC Educational Resources Information Center

    Parkes, Jay

    2007-01-01

    Reliability consists of both important social and scientific values and methods for evidencing those values, though in practice methods are often conflated with the values. With the two distinctly understood, a reliability argument can be made that articulates the particular reliability values most relevant to the particular measurement situation…

  14. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG) is described: a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  15. Using a Web-Based Approach to Assess Test–Retest Reliability of the “Hypertension Self-Care Profile” Tool in an Asian Population

    PubMed Central

    Koh, Yi Ling Eileen; Lua, Yi Hui Adela; Hong, Liyue; Bong, Huey Shin Shirley; Yeo, Ling Sui Jocelyn; Tsang, Li Ping Marianne; Ong, Kai Zhi; Wong, Sook Wai Samantha; Tan, Ngiap Chuan

    2016-01-01

    Essential hypertension often requires affected patients to self-manage their condition most of the time. Besides seeking regular medical review of their life-long condition to detect vascular complications, patients have to maintain healthy lifestyles in between physician consultations via diet and physical activity, and to take their medications according to their prescriptions. Their self-management ability is influenced by their self-efficacy capacity, which can be assessed using questionnaire-based tools. The “Hypertension Self-Care Profile” (HTN-SCP) is one such questionnaire, assessing self-efficacy in the domains of “behavior,” “motivation,” and “self-efficacy.” This study aims to determine the test–retest reliability of the HTN-SCP in an English-literate Asian population using a web-based approach. Multiethnic Asian patients, aged 40 years and older, with essential hypertension were recruited from a typical public primary care clinic in Singapore. The investigators guided the patients to fill in the web-based 60-item HTN-SCP in English using a tablet or smartphone on the first visit, and the patients completed the instrument again 2 weeks later for the retest. Internal consistency and test–retest reliability were evaluated using Cronbach's Alpha and intraclass correlation coefficients (ICC), respectively. The t test was used to determine the relationship between the overall HTN-SCP scores of the patients and their self-reported self-management activities. A total of 160 patients completed the HTN-SCP during the initial test, from which 71 test–retest responses were completed. No floor or ceiling effect was found for the scores for the 3 subscales. Cronbach's Alpha coefficients were 0.857, 0.948, and 0.931 for the “behavior,” “motivation,” and “self-efficacy” domains respectively, indicating high internal consistency. The item-total correlation ranges for the 3 scales were from 0.105 to 0.656 for Behavior, 0.401 to 0.808 for Motivation, 0.349 to 0
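
    Test-retest reliability of this kind is conventionally summarized with an intraclass correlation coefficient. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement) for an n-subjects-by-2-sessions matrix; the scores are simulated, not the study's data:

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1) from the two-way ANOVA mean squares (Shrout & Fleiss)."""
    n, k = scores.shape
    grand = scores.mean()
    ms_r = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_c = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # sessions
    resid = (scores - scores.mean(axis=1, keepdims=True)
                    - scores.mean(axis=0, keepdims=True) + grand)
    ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical test and 2-week retest scores for 71 respondents
rng = np.random.default_rng(6)
true = rng.normal(50.0, 10.0, 71)
scores = np.column_stack([true + rng.normal(0, 3, 71),
                          true + rng.normal(0, 3, 71)])
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```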

  16. Validation and test-retest reliability of a health measure, health as ability of acting, based on the welfare theory of health.

    PubMed

    Snellman, Ingrid; Jonsson, Bosse; Wikblad, Karin

    2012-03-01

    The aim of this study was to conduct a validation and assess the test-retest reliability of the health questionnaire based on Nordenfelt's Welfare Theory of Health (WTH). The study used a questionnaire on health together with the Short Form 12-Item Health Survey (SF-12) questionnaire, and 490 pupils at colleges for adult education participated. The results of the study are in accordance with Nordenfelt's WTH. Three hypotheses were stated, and the first was confirmed: People who were satisfied with life rated higher levels than those who were dissatisfied with life concerning both mental and physical health, measured with the SF-12. The second hypothesis was partially confirmed: People with high education were more often satisfied with life than those with low education, but they were not healthier. The third hypothesis, that women are unhealthy more often than men, was not confirmed. The questionnaire on health showed acceptable stability. PMID:21930655

  17. Toward improving the reliability of hydrologic prediction: Model structure uncertainty and its quantification using ensemble-based genetic programming framework

    NASA Astrophysics Data System (ADS)

    Parasuraman, Kamban; Elshorbagy, Amin

    2008-12-01

    Uncertainty analysis is starting to be widely acknowledged as an integral part of hydrological modeling. The conventional treatment of uncertainty analysis in hydrologic modeling is to assume a deterministic model structure, and treat its associated parameters as imperfectly known, thereby neglecting the uncertainty associated with the model structure. In this paper, a modeling framework that can explicitly account for the effect of model structure uncertainty has been proposed. The modeling framework is based on initially generating different realizations of the original data set using a non-parametric bootstrap method, and then exploiting the ability of the self-organizing algorithms, namely genetic programming, to evolve their own model structure for each of the resampled data sets. The resulting ensemble of models is then used to quantify the uncertainty associated with the model structure. The performance of the proposed modeling framework is analyzed with regard to its ability to characterize the evapotranspiration process at the Southwest Sand Storage facility, located near Ft. McMurray, Alberta. Eddy-covariance-measured actual evapotranspiration is modeled as a function of net radiation, air temperature, ground temperature, relative humidity, and wind speed. Investigating the relation between model complexity, prediction accuracy, and uncertainty, two sets of experiments were carried out by varying the level of mathematical operators that can be used to define the predictand-predictor relationship. While the first set uses just the additive operators, the second set uses both the additive and the multiplicative operators to define the predictand-predictor relationship. The results suggest that increasing the model complexity may lead to better prediction accuracy but at the expense of increased uncertainty. Compared to the model parameter uncertainty, the relative contribution of model structure uncertainty to the predictive uncertainty of a model is
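
    A minimal sketch of the framework's two ingredients: nonparametric bootstrap resampling of the data, and an ensemble whose members are free to choose their own model structure. Random-degree polynomial fits stand in for genetic programming here, and the data are synthetic, not the Ft. McMurray evapotranspiration records; the spread of the ensemble band reflects the combined structure and parameter uncertainty:

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, x.size)   # synthetic signal

preds = []
for _ in range(100):
    idx = rng.integers(0, x.size, x.size)   # bootstrap: resample with replacement
    deg = int(rng.integers(2, 7))           # each member picks its own structure
    coef = np.polyfit(x[idx], y[idx], deg)
    preds.append(np.polyval(coef, x))

lo, hi = np.percentile(np.array(preds), [5, 95], axis=0)   # 90% ensemble band
print(f"mean width of the 90% uncertainty band: {(hi - lo).mean():.3f}")
```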

  18. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made require neither the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  19. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies; and, power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and, the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  20. Business of reliability

    NASA Astrophysics Data System (ADS)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease in reception equipment costs allows non-remote-sensing organizations to access a technology until recently reserved for a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment conflicts with the traditionally low-volume data use of most applications. Constant access to data sources presupposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low-cost equipment is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.

  1. Facial Psoriasis Log-based Area and Severity Index: A valid and reliable severity measurement method detecting improvement of facial psoriasis in clinical practice settings.

    PubMed

    Kwon, Hyuck Hoon; Kim, Min-Woo; Park, Gyeong-Hun; Bae, You In; Kuk, Su Kyung; Suh, Dae Hun; Youn, Jai Il; Kwon, In Ho

    2016-08-01

    Facial psoriasis is often observed in moderate to severe degrees of psoriasis. While we previously demonstrated construct validity of the facial Psoriasis Log-based Area and Severity Index (fPLASI) system for the cross-sectional evaluation of facial psoriasis, its reliability and accuracy in detecting clinical improvement have not yet been confirmed. The aim of this study is to analyze whether the fPLASI properly represents the range of improvement for facial psoriasis compared with the existing facial Psoriasis Area and Severity Index (fPASI) after systemic treatment in clinical practice settings. The change in severity of facial psoriasis for 118 patients was calculated on the fPASI and fPLASI scales between two time points after systemic treatments. Then, percentage changes (ΔfPASI and ΔfPLASI) were analyzed from the perspective of both the Physician's Global Assessment of effectiveness (PGA) and the patients' Subjective Global Assessment (SGA). As a result, the distribution of the fPASI was more heavily clustered around the low score range compared with the fPLASI at both the first and second visits. Linear regression analysis between ΔfPASI and ΔfPLASI shows that the correlation coefficient was 0.94, and ΔfPLASI represented greater percentage changes than ΔfPASI. Remarkably, degrees of clinical improvement measured by the PGA matched better with ΔfPLASI, while ΔfPASI underestimated clinical improvements compared with ΔfPLASI in treatment-responding groups by the PGA and SGA. In conclusion, the fPLASI represented clinical improvement of facial psoriasis with more sensitivity and reliability than the fPASI. Therefore, the fPLASI system would be a viable severity measurement method for facial psoriasis in clinical practice.

  2. Integrated electrokinetic magnetic bead-based electrochemical immunoassay on microfluidic chips for reliable control of permitted levels of zearalenone in infant foods.

    PubMed

    Hervás, Mirian; López, Miguel A; Escarpa, Alberto

    2011-05-21

    Microfluidic technology has now become a novel sensing platform in which different analytical steps, biological recognition materials and suitable transducers can be cleverly integrated, yielding a new sensor generation. A novel "lab-on-a-chip" strategy integrating an electrokinetic magnetic bead-based electrochemical immunoassay on a microfluidic chip for reliable control of permitted levels of zearalenone in infant foods is proposed. The strategy involves the creative use of the simple channel layout of the double-T microchip to perform the immunointeraction and the enzymatic reaction sequentially, by applying a program of electric fields, suitably connected to the reservoirs, that drives the fluidics through the different chambers in which the reactions take place. Both zones are used with the aid of a magnetic field to avoid, in a very simple and elegant way, non-specific adsorption. The immunological reaction is performed as a competitive enzyme-linked immunosorbent assay (ELISA) in which the mycotoxin ZEA and an enzyme-labelled derivative compete for the binding sites of the specific monoclonal antibody immobilised onto protein G modified magnetic beads. Horseradish peroxidase (HRP), in the presence of hydrogen peroxide, catalyses the oxidation of hydroquinone (HQ) to benzoquinone (BQN), whose back electrochemical reduction was detected at +0.1 V. Optimized conditions for the controlled electrokinetic fluidic handling are reported for all the analytical steps cited above, allowing the complete immunoassay for the target ZEA analyte to be performed in less than 15 minutes with unique analytical merits: competitive immunoassay currents showed a very well-defined concentration dependence with good precision, as well as a suitable limit of detection of 0.4 µg L(-1), well below the legislative requirements, and an extremely low systematic error of 2% in the analysis of a maize certified reference material, additionally revealing excellent accuracy. Also, the reliability of the

  4. Reliable and fast allele-specific extension of 3'-LNA modified oligonucleotides covalently immobilized on a plastic base, combined with biotin-dUTP mediated optical detection.

    PubMed

    Michikawa, Yuichi; Fujimoto, Kentaro; Kinoshita, Kenji; Kawai, Seiko; Sugahara, Keisuke; Suga, Tomo; Otsuka, Yoshimi; Fujiwara, Kazuhiko; Iwakawa, Mayumi; Imai, Takashi

    2006-12-01

    In the present work, a convenient microarray SNP typing system has been developed using a plastic base that covalently immobilizes amino-modified oligonucleotides. Reliable SNP allele discrimination was achieved by using allelic specificity-enhanced enzymatic extension of immobilized oligonucleotide primer, with a locked nucleic acid (LNA) modification at the SNP-discriminating 3'-end nucleotide. Incorporation of multiple biotin-dUTP molecules during primer extension, followed by binding of alkaline phosphatase-conjugated streptavidin, allowed optical detection of the genotyping results through precipitation of colored alkaline phosphatase substrates onto the surface of the plastic base. Notably, rapid primer extension was demonstrated without a preliminary annealing step of double-stranded template DNA, allowing overall processes to be performed within a couple of hours. Simultaneous evaluation of three SNPs in the genes TGFB1, SOD2 and APEX1, previously investigated for association with radiation sensitivity, in 25 individuals has shown perfect assignment with data obtained by another established technique (MassARRAY system).

  5. Performance of GaN-on-Si-based vertical light-emitting diodes using silicon nitride electrodes with conducting filaments: correlation between filament density and device reliability.

    PubMed

    Kim, Kyeong Heon; Kim, Su Jin; Lee, Tae Ho; Lee, Byeong Ryong; Kim, Tae Geun

    2016-08-01

    Transparent conductive electrodes with good conductivity and optical transmittance are an essential element for highly efficient light-emitting diodes. However, conventional indium tin oxide and its alternative transparent conductive electrodes suffer from a trade-off between electrical conductivity and optical transmittance, thus limiting their practical applications. Here, we present silicon nitride transparent conductive electrodes with conducting filaments embedded using the electrical breakdown process and investigate the dependence of the conducting filament density formed in the transparent conductive electrode on the device performance of gallium nitride-based vertical light-emitting diodes. Three gallium nitride-on-silicon-based vertical light-emitting diodes using silicon nitride transparent conductive electrodes with high, medium, and low conducting filament densities were prepared alongside a reference vertical light-emitting diode using metal electrodes. This was carried out to determine the optimal density of the conducting filaments in the proposed silicon nitride transparent conductive electrodes. In comparison, the vertical light-emitting diodes with a medium conducting filament density exhibited the lowest optical loss, direct ohmic behavior, and the best current injection and distribution over the entire n-type gallium nitride surface, leading to highly reliable light-emitting diode performance. PMID:27505739

  6. Mesostructured HfxAlyO2 Thin Films as Reliable and Robust Gate Dielectrics with Tunable Dielectric Constants for High-Performance Graphene-Based Transistors.

    PubMed

    Lee, Yunseong; Jeon, Woojin; Cho, Yeonchoo; Lee, Min-Hyun; Jeong, Seong-Jun; Park, Jongsun; Park, Seongjun

    2016-07-26

    We introduce a reliable and robust gate dielectric material with tunable dielectric constants based on a mesostructured HfxAlyO2 film. The ultrathin mesostructured HfxAlyO2 film is deposited on graphene via a physisorbed-precursor-assisted atomic layer deposition process and consists of an intermediate state with small crystallized parts in an amorphous matrix. Crystal phase engineering using Al dopant is employed to achieve HfO2 phase transitions, which produce the crystallized part of the mesostructured HfxAlyO2 film. The effects of various Al doping concentrations are examined, and an enhanced dielectric constant of ∼25 is obtained. Further, the leakage current is suppressed (∼10(-8) A/cm(2)) and the dielectric breakdown properties are enhanced (breakdown field: ∼7 MV/cm) by the partially remaining amorphous matrix. We believe that this contribution is theoretically and practically relevant because excellent gate dielectric performance is obtained. In addition, an array of top-gated metal-insulator-graphene field-effect transistors is fabricated on a 6 in. wafer, yielding a capacitance equivalent oxide thickness of less than 1 nm (0.78 nm). This low capacitance equivalent oxide thickness has important implications for the incorporation of graphene into high-performance silicon-based nanoelectronics. PMID:27355098

  7. Human reliability analysis

    SciTech Connect

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis and includes examples of the application of the systems approach.

  8. Web Awards: Are They Reliable?

    ERIC Educational Resources Information Center

    Everhart, Nancy; McKnight, Kathleen

    1997-01-01

    School library media specialists recommend quality Web sites to children based on evaluations and Web awards. This article examines three types of Web awards and who grants them, suggests ways to determine their reliability, and discusses specific award sites. Includes a bibliography of Web sites. (PEN)

  9. The Reliability of College Grades

    ERIC Educational Resources Information Center

    Beatty, Adam S.; Walmsley, Philip T.; Sackett, Paul R.; Kuncel, Nathan R.; Koch, Amanda J.

    2015-01-01

    Little is known about the reliability of college grades relative to how prominently they are used in educational research, and the results to date tend to be based on small sample studies or are decades old. This study uses two large databases (N > 800,000) from over 200 educational institutions spanning 13 years and finds that both first-year…

  10. Reliability of fluid systems

    NASA Astrophysics Data System (ADS)

    Kopáček, Jaroslav; Fojtášek, Kamil; Dvořák, Lukáš

    2016-03-01

    This paper focuses on the importance of determining reliability, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, together with their application in calculations for serial, parallel and backed-up systems. For illustration, calculation examples of the reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
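
    A minimal sketch of those basic indicators for elements with a constant failure rate, where R(t) = exp(-lambda*t), combined for serial, parallel, and backed-up (cold standby with perfect switching) arrangements; the failure rate and mission time are hypothetical:

```python
import numpy as np

lam, t = 1e-4, 2000.0                           # failure rate [1/h], mission time [h]
r = np.exp(-lam * t)                            # single element
r_serial = r ** 3                               # three in series: all must survive
r_parallel = 1.0 - (1.0 - r) ** 2               # two in parallel: at least one survives
r_standby = np.exp(-lam * t) * (1.0 + lam * t)  # one identical cold standby unit
print(f"element {r:.4f}, serial {r_serial:.4f}, "
      f"parallel {r_parallel:.4f}, standby {r_standby:.4f}")
```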

  11. Operational safety reliability research

    SciTech Connect

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant.

  12. Reliable Quantification of the Potential for Equations Based on Spot Urine Samples to Estimate Population Salt Intake: Protocol for a Systematic Review and Meta-Analysis

    PubMed Central

    Huang, Liping; Crino, Michelle; Wu, Jason HY; Woodward, Mark; Land, Mary-Anne; McLean, Rachael; Webster, Jacqui; Enkhtungalag, Batsaikhan; Nowson, Caryl A; Elliott, Paul; Cogswell, Mary; Toft, Ulla; Mill, Jose G; Furlanetto, Tania W; Ilich, Jasminka Z; Hong, Yet Hoi; Cohall, Damian; Luzardo, Leonella; Noboa, Oscar; Holm, Ellen; Gerbes, Alexander L; Senousy, Bahaa; Pinar Kara, Sonat; Brewster, Lizzy M; Ueshima, Hirotsugu; Subramanian, Srinivas; Teo, Boon Wee; Allen, Norrina; Choudhury, Sohel Reza; Polonia, Jorge; Yasuda, Yoshinari; Campbell, Norm RC; Neal, Bruce

    2016-01-01

    Background Methods based on spot urine samples (a single sample at one time-point) have been identified as a possible alternative approach to 24-hour urine samples for determining mean population salt intake. Objective The aim of this study is to identify a reliable method for estimating mean population salt intake from spot urine samples. This will be done by comparing the performance of existing equations against one another and against estimates derived from 24-hour urine samples. The effects of factors such as ethnicity, sex, age, body mass index, antihypertensive drug use, health status, and timing of spot urine collection will be explored. The capacity of spot urine samples to measure change in salt intake over time will also be determined. Finally, we aim to develop a novel equation (or equations) that performs better than existing equations to estimate mean population salt intake. Methods A systematic review and meta-analysis of individual participant data will be conducted. A search has been conducted to identify human studies that report salt (or sodium) excretion based upon 24-hour urine samples and spot urine samples. There were no restrictions on language, study sample size, or characteristics of the study population. MEDLINE via OvidSP (1946-present), Premedline via OvidSP, EMBASE, Global Health via OvidSP (1910-present), and the Cochrane Library were searched, and two reviewers identified eligible studies. The authors of these studies will be invited to contribute data according to a standard format. Individual participant records will be compiled and a series of analyses will be completed to: (1) compare existing equations for estimating 24-hour salt intake from spot urine samples with 24-hour urine samples, and assess the degree of bias according to key demographic and clinical characteristics; (2) assess the reliability of using spot urine samples to measure population changes in salt intake over time; and (3) develop a novel equation that performs

  13. Reliability model for planetary gear

    NASA Technical Reports Server (NTRS)

    Savage, M.; Paridon, C. A.; Coy, J. J.

    1982-01-01

    A reliability model is presented for planetary gear trains in which the ring gear is fixed, the sun gear is the input, and the planet arm is the output. The input and output shafts are coaxial and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. This type of gear train is commonly used in main rotor transmissions for helicopters and in other applications which require high reductions in speed. The reliability model is based on the Weibull distribution of the individual reliabilities of the transmission components. The transmission's basic dynamic capacity is defined as the input torque which may be applied for one million input rotations of the sun gear. Load and life are related by a power law. The load-life exponent and basic dynamic capacity are developed as functions of the component capacities.
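
    A minimal sketch of that structure: each component survives according to a Weibull law, characteristic life scales with load through a power law L = L_ref (C/T)^p, and the system reliability is the product of the component reliabilities. The capacities, shape parameters, and exponent below are illustrative assumptions, not the paper's derived values:

```python
import numpy as np

def weibull_R(life, eta, beta):
    """Probability a component survives `life` (same units as eta)."""
    return np.exp(-((life / eta) ** beta))

def characteristic_life(C, T, p, eta_ref=1e6):
    """Load-life power law: life scales as (capacity / applied torque)^p."""
    return eta_ref * (C / T) ** p

T, p = 500.0, 3.0                             # input torque, load-life exponent
components = {"sun": (900.0, 1.5), "planet": (800.0, 1.5), "ring": (1200.0, 1.5)}

R_sys = 1.0                                   # series system: all components survive
for name, (C, beta) in components.items():
    R_sys *= weibull_R(1e6, characteristic_life(C, T, p), beta)
print(f"system reliability at one million sun-gear rotations: {R_sys:.3f}")
```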

  14. Reliable measurement of 3D foot bone angles based on the frame-of-reference derived from a sole of the foot

    NASA Astrophysics Data System (ADS)

    Kim, Taeho; Lee, Dong Yeon; Park, Jinah

    2016-03-01

    Clinical management of foot pathology requires accurate and robust measurement of the anatomical angles. In order to measure a 3D angle, recent approaches have adopted a landmark-based local coordinate system to establish the bone angles used in orthopedics. These measurement methods mainly assess the relative angle between bones using a representative axis derived from the morphological features of the bone, and the results can therefore be affected by bone deformities. In this study, we propose a method of deriving a global frame-of-reference that acquires a consistent direction of the foot by extracting the undersurface of the foot from the CT image data. The two lowest positions of the foot skin are identified from the surface to define the base plane, and the direction from the hallux to the fourth toe is defined together with it to construct the global coordinate system. We performed the experiment on 10 volumes of foot CT images of healthy subjects to verify that the proposed method provides reliable measurements. We measured 3D angles for the talus-calcaneus and talus-navicular pairs using the facing articular surfaces of the paired bones. Each angle was reported as 3 projection angles (sagittal, frontal, and transverse) in both coordinate systems: the one defined by the proposed global frame-of-reference and the one defined by the CT image planes. The results show that the angle quantified using the proposed method considerably reduced the standard deviation (SD) compared with the angle based on the conventional projection planes, and it was also comparable with the measured angles obtained from the local coordinate systems of the bones. Since our method is independent of the individual local shape of a bone, unlike measurement methods using a local coordinate system, it is suitable for inter-subject comparison studies.
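
    A minimal sketch of constructing such a global frame and reading projection angles from it. The frame is built from the two lowest skin points (base plane) and the hallux-to-fourth-toe direction, as described above; all coordinates and the bone axis are made-up values for illustration:

```python
import numpy as np

def foot_frame(low1, low2, hallux, fourth_toe):
    """Rows of the returned matrix are the global x, y, z axes (unit vectors)."""
    x = fourth_toe - hallux                   # medial-lateral direction
    x /= np.linalg.norm(x)
    along_sole = low2 - low1                  # direction within the base plane
    z = np.cross(x, along_sole)               # base-plane normal = vertical axis
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                        # completes a right-handed frame
    return np.vstack([x, y, z])

R = foot_frame(np.array([0.0, 0.0, 0.0]), np.array([0.0, 18.0, 0.5]),
               np.array([-3.0, 24.0, 1.0]), np.array([4.0, 23.0, 1.0]))

axis = R @ np.array([0.2, 0.9, 0.4])          # a bone axis, now in foot coordinates
sagittal = np.degrees(np.arctan2(axis[2], axis[1]))    # y-z plane projection
transverse = np.degrees(np.arctan2(axis[0], axis[1]))  # x-y plane projection
print(f"sagittal {sagittal:.1f} deg, transverse {transverse:.1f} deg")
```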

  15. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability, but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with a comparison and contrast of the performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  16. Omics for prediction of environmental health effects: Blood leukocyte-based cross-omic profiling reliably predicts diseases associated with tobacco smoking

    PubMed Central

    Georgiadis, Panagiotis; Hebels, Dennie G.; Valavanis, Ioannis; Liampa, Irene; Bergdahl, Ingvar A.; Johansson, Anders; Palli, Domenico; Chadeau-Hyam, Marc; Chatziioannou, Aristotelis; Jennen, Danyel G. J.; Krauskopf, Julian; Jetten, Marlon J.; Kleinjans, Jos C. S.; Vineis, Paolo; Kyrtopoulos, Soterios A.; Gottschalk, Ralph; van Leeuwen, Danitsja; Timmermans, Leen; de Kok, Theo M.C.M.; Botsivali, Maria; Bendinelli, Benedetta; Kelly, Rachel; Vermeulen, Roel; Portengen, Lutzen; Saberi-Hosnijeh, Fatemeh; Melin, Beatrice; Hallmans, Göran; Lenner, Per; Keun, Hector C.; Siskos, Alexandros; Athersuch, Toby J.; Kogevinas, Manolis; Stephanou, Euripides G.; Myridakis, Antonis; Fazzo, Lucia; De Santis, Marco; Comba, Pietro; Kiviranta, Hannu; Rantakokko, Panu; Airaksinen, Riikka; Ruokojärvi, Päivi; Gilthorpe, Mark; Fleming, Sarah; Fleming, Thomas; Tu, Yu-Kang; Jonsson, Bo; Lundh, Thomas; Chen, Wei J.; Lee, Wen-Chung; Kate Hsiao, Chuhsing; Chien, Kuo-Liong; Kuo, Po-Hsiu; Hung, Hung; Liao, Shu-Fen

    2016-01-01

    The utility of blood-based omic profiles for linking environmental exposures to their potential health effects was evaluated in 649 individuals, drawn from the general population, in relation to tobacco smoking, an exposure with well-characterised health effects. Using disease connectivity analysis, we found that the combination of smoking-modified, genome-wide gene (including miRNA) expression and DNA methylation profiles predicts with remarkable reliability most diseases and conditions independently known to be causally associated with smoking (indicative estimates of sensitivity and positive predictive value 94% and 84%, respectively). Bioinformatics analysis reveals the importance of a small number of smoking-modified, master-regulatory genes and suggests a central role for altered ubiquitination. The smoking-induced gene expression profiles overlap significantly with profiles present in blood cells of patients with lung cancer or coronary heart disease, diseases strongly associated with tobacco smoking. These results provide proof-of-principle support to the suggestion that omic profiling in peripheral blood has the potential of identifying early, disease-related perturbations caused by toxic exposures and may be a useful tool in hazard and risk assessment. PMID:26837704
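
    For concreteness, the indicative estimates quoted above follow directly from a confusion table. The counts below are hypothetical, chosen only so the arithmetic reproduces the abstract's 94% sensitivity and 84% positive predictive value:

```python
# Sensitivity = TP / (TP + FN); PPV = TP / (TP + FP). Counts are hypothetical.
tp, fp, fn = 47, 9, 3
sensitivity = tp / (tp + fn)
ppv = tp / (tp + fp)
print(f"sensitivity = {sensitivity:.0%}, PPV = {ppv:.0%}")   # 94%, 84%
```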

  17. Bayesian methods in reliability

    NASA Astrophysics Data System (ADS)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and the nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
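
    The pipeline leak-rate estimation mentioned above is the textbook conjugate-update exercise: a Gamma prior on a Poisson event rate, updated by observed events over an exposure period. A minimal sketch, with hypothetical prior parameters and counts (not figures from the proceedings):

    ```python
    # Minimal sketch of a Bayesian failure-rate estimate: Gamma prior on a
    # Poisson event rate, updated with observed events over an exposure time.
    # Prior parameters and observations are hypothetical.
    from scipy import stats

    alpha0, beta0 = 2.0, 10.0    # Gamma prior: mean rate alpha/beta = 0.2 events/yr
    events, exposure = 3, 25.0   # observed: 3 leaks in 25 pipeline-years

    alpha, beta = alpha0 + events, beta0 + exposure   # conjugate posterior
    posterior = stats.gamma(a=alpha, scale=1.0 / beta)
    print(f"posterior mean rate = {posterior.mean():.3f} events/yr")
    print(f"95% credible interval = {posterior.interval(0.95)}")
    ```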

  18. Assessing the performance and reliability of PERSIANN-CDR satellite-based rainfall estimates over Spain: case study of rainfall Dry Spell Lengths (DSL)

    NASA Astrophysics Data System (ADS)

    Garcia Galiano, S. G.; Giraldo Osorio, J. D.; Nguyen, P.; Hsu, K. L.; Braithwaite, D.; Olmos, P.; Sorooshian, S.

    2015-12-01

    Studying Spain's long-term variability and changing trends in rainfall, given its unique position in the Mediterranean basin (i.e., the latitudinal gradient from North to South and its orographic variation), can provide valuable insight into how the hydroclimatology of the region has changed. A recently released high resolution satellite-based global daily precipitation climate dataset, PERSIANN-CDR (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks - Climate Data Record), provided the opportunity to conduct such a study. It covers the period from 01/01/1983 to date, at 0.25° resolution. In areas without a dense network of rain-gauges, the PERSIANN-CDR dataset could be useful for assessing the reliability of regional climate models (RCMs), in order to build a robust RCM ensemble for reducing the uncertainties in climate and hydrological projections. However, before using this dataset for RCM evaluation, an assessment of the performance of PERSIANN-CDR against in-situ observations is necessary. The high-resolution gridded daily rain-gauge dataset named Spain02 was employed in this study. The variable Dry Spell Length (DSL) was computed considering 1 mm and 10 mm as thresholds of daily rainfall, and the time period 1988-2007 was defined for the study. A procedure for improving the consistency and homogeneity between the two datasets was applied. The assessment is based on distributional similarity, using the well-known two-sample Kolmogorov-Smirnov and Chi-Square statistical tests as fitting criteria. The results demonstrate a good fit of PERSIANN-CDR over the whole of Spain for the 10 mm/day threshold. However, for the 1 mm/day threshold, PERSIANN-CDR compares well with the Spain02 dataset in areas with high rainfall (North of Spain), while in semiarid areas (South East of Spain) it strongly overestimates short DSLs. Overall, PERSIANN-CDR demonstrates its robustness in the simulation of DSLs for the highest thresholds.
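
    The DSL construction and the distributional comparison described above are straightforward to express; a minimal sketch, with synthetic daily-rainfall series standing in for the Spain02 and PERSIANN-CDR grids:

    ```python
    # Minimal sketch of the DSL comparison: compute dry-spell lengths from two
    # daily-rainfall series with a wet-day threshold, then apply the two-sample
    # Kolmogorov-Smirnov test. The random series are synthetic stand-ins.
    import numpy as np
    from scipy.stats import ks_2samp

    def dry_spell_lengths(daily_rain_mm, threshold_mm=1.0):
        """Lengths of consecutive runs of days with rainfall below the threshold."""
        dry = daily_rain_mm < threshold_mm
        lengths, run = [], 0
        for d in dry:
            if d:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
        return np.array(lengths)

    rng = np.random.default_rng(0)
    gauge = rng.gamma(0.4, 8.0, size=7300)      # ~20 years of synthetic daily rain
    satellite = rng.gamma(0.4, 8.0, size=7300)

    stat, p = ks_2samp(dry_spell_lengths(gauge), dry_spell_lengths(satellite))
    print(f"KS statistic = {stat:.3f}, p = {p:.3f}")
    ```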

  19. Statistical Reliability of Phase Fraction Determination Based on Electron Backscatter Diffraction (EBSD) Investigations on the Example of an Al-TRIP Steel

    NASA Astrophysics Data System (ADS)

    Davut, Kemal; Zaefferer, Stefan

    2010-09-01

    The aim of this article is to discuss the representativeness of electron backscatter diffraction (EBSD) mapping data for phase fraction determination in multiphase materials. Particular attention is paid to the effect of step size and scanned area. The experimental investigations were carried out on a low-alloyed steel with transformation induced plasticity (TRIP) that shows a relatively heterogeneous distribution of residual austenite in a ferrite matrix. EBSD scans of various area sizes and step sizes were carried out and analyzed with respect to the determined austenite phase fraction. The step size has only an indirect influence on the results, as it determines the size of the investigated area if the number of measurement points is kept constant. Based on the experimental results, the optimum sampling conditions in terms of analyzed area size and the number of measurement points were determined. These values were compared with values obtained from Cochran’s formula, which allows calculation of sampling sizes for predefined levels of precision and confidence. A significant deviation of experimental from theoretical optimum sample sizes was found. This deviation is, for the most part, a result of the heterogeneous distribution of the austenite phase. Depending on grain size and volume fraction of the second phase, the false assignment of phases at grain boundaries also may introduce a significant error. A general formula is introduced that allows estimation of the error caused by these parameters. Finally, a new measurement scheme is proposed that allows improvement of reliability and representativeness of EBSD-based phase determination without large sacrifices in measurement time or data set sizes.
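
    For reference, Cochran's sample-size formula in its usual form (standard notation, not reproduced from the article itself): the required number of measurement points n_0 to estimate a phase fraction p within a margin of error e, at the confidence level set by the normal quantile z, together with the finite-population correction applied when the sampled area is small:

    ```latex
    n_0 = \frac{z^{2}\, p\,(1 - p)}{e^{2}},
    \qquad
    n = \frac{n_0}{1 + (n_0 - 1)/N} \quad \text{(finite population of size } N\text{)}
    ```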

  20. Ca II H+K fluxes from S-indices of large samples: a reliable and consistent conversion based on PHOENIX model atmospheres

    NASA Astrophysics Data System (ADS)

    Mittag, M.; Schmitt, J. H. M. M.; Schröder, K.-P.

    2013-01-01

    Context. Historic stellar activity data based on chromospheric line emission using O.C. Wilson's S-index reach back to the 1960s and represent a very valuable data resource both in terms of quantity and time-coverage. However, these data are not flux-calibrated and are therefore difficult to compare with modern spectroscopy and to relate to quantitative physics. Aims: In order to make use of the rich archives of Mount Wilson and many other S-index measurements of thousands of main sequence stars, subgiants and giants in terms of physical Ca II H+K line chromospheric surface fluxes and the related R-index, we seek a new, simple but reliable conversion method for the S-indices. A first application aims to obtain the (empirical) basal chromospheric surface flux to better characterise stars with minimal activity levels. Methods: We collect 6024 S-indices from six large catalogues covering a total of 2530 stars with well-defined parallaxes (as given by the Hipparcos catalogue) in order to distinguish between main sequence stars (2133), subgiants (252) and giants (145), based on their positions in the Hertzsprung-Russell diagram. We use the spectra of a grid of PHOENIX model atmospheres to obtain the photospheric contributions to the S-index. To convert the latter into absolute Ca II H+K chromospheric line flux, we first derive new, colour-dependent photospheric flux relations for main sequence, subgiant and giant stars, respectively, and then obtain the chromospheric flux component. In this process, the PHOENIX models also provide a very reliable scale for the physical surface flux. Results: For very large samples of main sequence stars, giants and subgiants, we obtain the chromospheric Ca II H+K line surface fluxes in the colour range of 0.44 < B - V < 1.6 and the related R-indices. We determine and parametrize the lower envelopes, which we find to coincide well with historic work on the basal chromospheric flux. There is good agreement in the apparently simpler cases of

  1. Managing Reliability in the 21st Century

    SciTech Connect

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance, and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of value added to the organization's business objectives.

  2. Test-Retest Reliability and Concurrent Validity of a Single Tri-Axial Accelerometer-Based Gait Analysis in Older Adults with Normal Cognition

    PubMed Central

    Byun, Seonjeong; Han, Ji Won; Kim, Tae Hui; Kim, Ki Woong

    2016-01-01

    Objective We investigated the concurrent validity and test-retest reliability of spatio-temporal gait parameters measured with a single tri-axial accelerometer (TAA), determined the optimal number of steps required for obtaining acceptable levels of reliability, and compared the validity and reliability of the estimated gait parameters across the three reference axes of the TAA. Methods A total of 82 cognitively normal elderly participants walked around a 40-m long round walkway twice wearing a TAA at their center of body mass. Gait parameters such as cadence, gait velocity, step time, step length, step time variability, and step time asymmetry were estimated from the low pass-filtered signal of the TAA. The test-retest reliability and concurrent validity with the GAITRite® system were evaluated for the estimated gait parameters. Results Gait parameters using signals from the vertical axis showed excellent reliability for all gait parameters; the intraclass correlation coefficient (ICC) was 0.79–0.90. A minimum of 26 steps and 14 steps were needed to achieve excellent reliability in step time variability and step time asymmetry, respectively. A strong level of agreement was seen for the basic gait parameters between the TAA and GAITRite® (ICC = 0.91–0.96). Conclusions The measurement of gait parameters of elderly individuals with normal cognition using a TAA placed on the body’s center of mass was reliable and showed superiority over the GAITRite® with regard to gait variability and asymmetry. The TAA system was a valid tool for measuring basic gait parameters. Considering its wearability and low price, the TAA system may be a promising alternative to the pressure sensor walkway system for measuring gait parameters. PMID:27427965

  3. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
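
    The equivalence exploited here is that steady-state problems in several disciplines assemble into the same linear system K x = f, so one "structural" solver covers them all. A minimal illustration of the idea (not NESSUS code): a single fixed-free chain matrix read three ways.

    ```python
    # Sketch of cross-discipline equivalence: a spring chain, a resistor chain,
    # and a 1-D steady conduction problem all assemble into the same K x = f
    # system, so one "structural" solver serves all three disciplines.
    import numpy as np

    def chain_stiffness(k):
        """Assemble the stiffness/conductance matrix of a fixed-free chain."""
        n = len(k)
        K = np.zeros((n, n))
        for i, ki in enumerate(k):
            K[i, i] += ki                 # element i joins node i-1 and node i
            if i > 0:                     # node -1 is the fixed/grounded end
                K[i - 1, i - 1] += ki
                K[i - 1, i] -= ki
                K[i, i - 1] -= ki
        return K

    # Same matrix, three physical readings of k, f, and x:
    k = np.array([2.0, 1.0, 4.0])   # stiffness / conductance / (kA/L) per element
    f = np.array([0.0, 0.0, 10.0])  # force / injected current / heat load at free end
    x = np.linalg.solve(chain_stiffness(k), f)
    print(x)  # displacement / voltage / temperature at the three free nodes
    ```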

  4. Reliability assurance for regulation of advanced reactors

    SciTech Connect

    Fullwood, R.; Lofaro, R.; Samanta, P.

    1991-01-01

    The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics.

  6. Inter-rater reliability and generalizability of patient note scores using a scoring rubric based on the USMLE Step-2 CS format.

    PubMed

    Park, Yoon Soo; Hyderi, Abbas; Bordage, Georges; Xing, Kuan; Yudkowsky, Rachel

    2016-10-01

    Recent changes to the patient note (PN) format of the United States Medical Licensing Examination have challenged medical schools to improve the instruction and assessment of students taking the Step-2 clinical skills examination. The purpose of this study was to gather validity evidence regarding response process and internal structure, focusing on inter-rater reliability and generalizability, to determine whether a locally-developed PN scoring rubric and scoring guidelines could yield reproducible PN scores. A randomly selected subsample of historical data (post-encounter PNs from 55 of 177 medical students) was rescored by six trained faculty raters in November-December 2014. Inter-rater reliability (% exact agreement and kappa) was calculated for five standardized patient cases administered in a local graduation competency examination. Generalizability studies were conducted to examine the overall reliability. Qualitative data were collected through surveys and a rater-debriefing meeting. The overall inter-rater reliability (weighted kappa) was .79 (Documentation = .63, Differential Diagnosis = .90, Justification = .48, and Workup = .54). The majority of score variance was due to case specificity (13 %) and case-task specificity (31 %), indicating differences in student performance by case and by case-task interactions. Variance associated with raters and their interactions was modest (<5 %). Raters felt that justification was the most difficult task to score and that having case- and level-specific scoring guidelines during training was most helpful for calibration. The overall inter-rater reliability indicates a high level of confidence in the consistency of note scores. Designs for scoring notes may optimize reliability by balancing the number of raters and cases.
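
    The agreement statistics reported above can be reproduced with standard tools; a minimal sketch using scikit-learn's weighted kappa on made-up ratings from two raters:

    ```python
    # Minimal sketch of the agreement statistics used above: % exact agreement
    # and weighted kappa for two raters scoring the same notes. Ratings are
    # made up, not the study's data.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [3, 2, 4, 4, 1, 3, 2, 4, 3, 1]
    rater_b = [3, 2, 3, 4, 1, 2, 2, 4, 3, 2]

    exact = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    kappa_w = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
    print(f"% exact agreement = {exact:.0%}, weighted kappa = {kappa_w:.2f}")
    ```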

  7. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  8. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended duration orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.
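
    A reliability block diagram of independent blocks evaluates by multiplying reliabilities in series and combining complements in parallel; a minimal sketch with illustrative component values (not Orbiter data):

    ```python
    # Minimal sketch of a reliability block diagram evaluation: independent
    # blocks combine as products in series and complements in parallel.
    # Component reliabilities and the topology are illustrative assumptions.
    import math

    def series(*r):
        return math.prod(r)

    def parallel(*r):
        return 1.0 - math.prod(1.0 - ri for ri in r)

    # e.g. a redundant computer pair in series with two non-redundant blocks:
    system = series(parallel(0.95, 0.95), 0.99, 0.999)
    print(f"system reliability = {system:.4f}")
    ```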

  9. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986-1990. The reliability of photovoltaic (PV) modules/systems, along with the initial cost and efficiency of modules, is exceedingly important if PV technology is to make a major impact in the power generation market and compete with conventional electricity producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, there is still a need for substantial research and testing to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  10. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CERs and, where possible, CTRs, in devising a suitable cost-effective policy.

  11. A Theory-Based Comparison of the Reliabilities of Fixed-Length and Trials-to-Criterion Scoring of Physical Education Skills Tests.

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Spray, Judith A.

    1983-01-01

    The reliabilities of two types of measurement plans were compared across six hypothetical distributions of true scores or abilities. The measurement plans were: (1) fixed-length, where the number of trials for all examinees is set in advance; and (2) trials-to-criterion, where examinees must keep trying until they complete a given number of trials…

  12. Adaptation of the Boundary Violations Scale Developed Based on Structural Family Therapy to the Turkish Context: A Study of Validity and Reliability

    ERIC Educational Resources Information Center

    Avci, Rasit; Çolakkadioglu, Oguzhan; Öz, Aysegül Sükran; Akbas, Turan

    2015-01-01

    The purpose of this study was to adapt "The Boundary Violations Scale" (Madden et al., 2002), which was created to measure the intergenerational boundary violations in families from the perspective of children, to Turkish and to test the validity and reliability of the Turkish version of this instrument. This instrument was developed…

  13. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering nor fault tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failures can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.

  14. Long-term reliability of Al2O3 and Parylene C bilayer encapsulated Utah electrode array based neural interfaces for chronic implantation

    NASA Astrophysics Data System (ADS)

    Xie, Xianzong; Rieth, Loren; Williams, Layne; Negi, Sandeep; Bhandari, Rajmohan; Caldwell, Ryan; Sharma, Rohit; Tathireddy, Prashant; Solzbacher, Florian

    2014-04-01

    Objective. We focus on improving the long-term stability and functionality of neural interfaces for chronic implantation by using bilayer encapsulation. Approach. We evaluated the long-term reliability of Utah electrode array (UEA) based neural interfaces encapsulated by 52 nm of atomic layer deposited Al2O3 and 6 µm of Parylene C bilayer, and compared these to devices with the baseline Parylene-only encapsulation. Three variants of arrays including wired, wireless, and active UEAs were used to evaluate this bilayer encapsulation scheme, and were immersed in phosphate buffered saline (PBS) at 57 °C for accelerated lifetime testing. Main results. The median tip impedance of the bilayer encapsulated wired UEAs increased from 60 to 160 kΩ during the 960 days of equivalent soak testing at 37 °C, the opposite trend to that typically observed for Parylene encapsulated devices. The loss of the iridium oxide tip metallization and etching of the silicon tip in PBS solution contributed to the increase of impedance. The lifetime of fully integrated wireless UEAs was also tested using accelerated lifetime measurement techniques. The bilayer coated devices had stable power-up frequencies at ~910 MHz and constant radio-frequency signal strength of -50 dBm during up to 1044 days (still under testing) of equivalent soaking time at 37 °C. This is a significant improvement over the lifetime of ~100 days achieved with Parylene-only encapsulation at 37 °C. The preliminary samples of bilayer coated active UEAs with a flip-chip bonded ASIC chip had a steady current draw of ~3 mA during 228 days of soak testing at 37 °C. An increase in the current draw has been consistently correlated to device failures, so is a sensitive metric for their lifetime. Significance. The trends of increasing electrode impedance of wired devices and performance stability of wireless and active devices support the significantly greater encapsulation performance of this bilayer encapsulation compared

  16. Eco-friendly ionic liquid based ultrasonic assisted selective extraction coupled with a simple liquid chromatography for the reliable determination of acrylamide in food samples.

    PubMed

    Albishri, Hassan M; El-Hady, Deia Abd

    2014-01-01

    Acrylamide in food has drawn worldwide attention since 2002 due to its neurotoxic and carcinogenic effects. These influences brought out the dual polar and non-polar characters of acrylamide, as they enable it to dissolve in the aqueous blood medium or penetrate the non-polar plasma membrane. In the current work, a simple HPLC/UV system was used to reveal that the penetration of acrylamide into the non-polar phase was stronger than its dissolution in the polar phase. The presence of phosphate salts in the polar phase reduced the acrylamide interaction with the non-polar phase. Furthermore, an eco-friendly and costless coupling of the HPLC/UV with ionic liquid based ultrasonic assisted extraction (ILUAE) was developed to determine the acrylamide content in food samples. ILUAE was proposed for the efficient extraction of acrylamide from bread and potato chips samples. The extracts were obtained by soaking potato chips and bread samples in 1.5 mol L(-1) 1-butyl-3-methylimidazolium bromide (BMIMBr) for 30.0 and 60.0 min, respectively, and subsequent chromatographic separation within 12.0 min using a Luna C18 column and a 100% water mobile phase at 0.5 mL min(-1), with a column temperature of 25 °C and detection at 250 nm. The extraction and analysis of acrylamide could be achieved within 2 h. The mean extraction efficiency of acrylamide showed adequate repeatability, with a relative standard deviation (RSD) of 4.5%. The limit of detection and limit of quantitation were 25.0 and 80.0 ng mL(-1), respectively. The accuracy of the proposed method was tested by recovery in seven food samples, giving values ranging between 90.6% and 109.8%. Therefore, the methodology was successfully validated by official guidelines, indicating its reliability to be applied to the analysis of real samples and proven to be useful for its intended purpose. Moreover, it serves as a simple, eco-friendly and costless alternative to hitherto reported methods. PMID:24274280

  18. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  19. Materials reliability issues in microelectronics

    SciTech Connect

    Lloyd, J.R.; Yost, F.G.; Ho, P.S.

    1991-01-01

    This book covers the proceedings of an MRS symposium on materials reliability in microelectronics. Topics include: electromigration; stress effects on reliability; stress and packaging; metallization; device, oxide and dielectric reliability; new investigative techniques; and corrosion.

  20. Gearbox Reliability Collaborative Bearing Calibration

    SciTech Connect

    van Dam, J.

    2011-10-01

    NREL has initiated the Gearbox Reliability Collaborative (GRC) to investigate the root causes of low wind turbine gearbox reliability. The GRC follows a multi-pronged approach based on a collaborative of manufacturers, owners, researchers and consultants. The project combines analysis, field testing, dynamometer testing, condition monitoring, and the development and population of a gearbox failure database. At the core of the project are two 750 kW gearboxes that have been redesigned and rebuilt so that they are representative of the multi-megawatt gearbox topology currently used in the industry. These gearboxes are heavily instrumented and are tested in the field and on the dynamometer. This report discusses the bearing calibrations of the gearboxes.

  1. Reliability techniques in the petroleum industry

    NASA Technical Reports Server (NTRS)

    Williams, H. L.

    1971-01-01

    Quantitative reliability evaluation methods used in the Apollo Spacecraft Program are translated into petroleum industry requirements with emphasis on offsetting reliability demonstration costs and limited production runs. Described are the qualitative disciplines applicable, the definitions and criteria that accompany the disciplines, and the generic application of these disciplines to the chemical industry. The disciplines are then translated into proposed definitions and criteria for the industry, into a base-line reliability plan that includes these disciplines, and into application notes to aid in adapting the base-line plan to a specific operation.

  2. A short food-group-based dietary questionnaire is reliable and valid for assessing toddlers' dietary risk in relatively advantaged samples.

    PubMed

    Bell, Lucinda K; Golley, Rebecca K; Magarey, Anthea M

    2014-08-28

    Identifying toddlers at dietary risk is crucial for determining who requires intervention to improve dietary patterns and reduce health consequences. The objectives of the present study were to develop a simple tool that assesses toddlers' dietary risk and investigate its reliability and validity. The nineteen-item Toddler Dietary Questionnaire (TDQ) is informed by dietary patterns observed in Australian children aged 14 (n 552) and 24 (n 493) months and the Australian dietary guidelines. It assesses the intake of 'core' food groups (e.g. fruit, vegetables and dairy products) and 'non-core' food groups (e.g. high-fat, high-sugar and/or high-salt foods and sweetened beverages) over the previous 7 d, which is then scored against a dietary risk criterion (0-100; higher score = higher risk). Parents of toddlers aged 12-36 months (Socio-Economic Index for Areas decile range 5-9) were asked to complete the TDQ for their child (n 111) on two occasions, 3·2 (SD 1·8) weeks apart, to assess test-retest reliability. They were also asked to complete a validated FFQ from which the risk score was calculated and compared with the TDQ-derived risk score (relative validity). Mean scores were highly correlated and not significantly different for reliability (intra-class correlation = 0·90, TDQ1 30·2 (SD 8·6) v. TDQ2 30·9 (SD 8·9); P= 0·14) and validity (r 0·83, average TDQ ((TDQ1+TDQ2)/2) 30·5 (SD 8·4) v. FFQ 31·4 (SD 8·1); P= 0·05). All the participants were classified into the same (reliability 75 %; validity 79 %) or adjacent (reliability 25 %; validity 21 %) risk category (low (0-24), moderate (25-49), high (50-74) and very high (75-100)). Overall, the TDQ is a valid and reliable screening tool for identifying at-risk toddlers in relatively advantaged samples.

  3. Quantifying Human Performance Reliability.

    ERIC Educational Resources Information Center

    Askren, William B.; Regulinski, Thaddeus L.

    Human performance reliability for tasks in the time-space continuous domain is defined and a general mathematical model presented. The human performance measurement terms time-to-error and time-to-error-correction are defined. The model and measurement terms are tested using laboratory vigilance and manual control tasks. Error and error-correction…

  4. Grid reliability management tools

    SciTech Connect

    Eto, J.; Martinez, C.; Dyer, J.; Budhraja, V.

    2000-10-01

    To summarize, the Consortium for Electric Reliability Technology Solutions (CERTS) is engaged in a multi-year program of public interest R&D to develop and prototype software tools that will enhance system reliability during the transition to competitive markets. The core philosophy embedded in the design of these tools is the recognition that in the future reliability will be provided through market operations, not the decisions of central planners. Embracing this philosophy calls for tools that: (1) Recognize that the game has moved from machine modeling and engineering analysis to simulating markets to understand the impacts on reliability (and vice versa); (2) Provide real-time data and support information transparency toward enhancing the ability of operators and market participants to quickly grasp, analyze, and act effectively on information; (3) Allow operators, in particular, to measure, monitor, assess, and predict both system performance as well as the performance of market participants; and (4) Allow rapid incorporation of the latest sensing, data communication, computing, visualization, and algorithmic techniques and technologies.

  5. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  6. The SQM/COSMO filter: reliable native pose identification based on the quantum-mechanical description of protein-ligand interactions and implicit COSMO solvation.

    PubMed

    Pecina, Adam; Meier, René; Fanfrlík, Jindřich; Lepšík, Martin; Řezáč, Jan; Hobza, Pavel; Baldauf, Carsten

    2016-02-25

    Current virtual screening tools are fast, but reliable scoring is elusive. Here, we present the 'SQM/COSMO filter', a novel scoring function featuring a quantitative semiempirical quantum mechanical (SQM) description of all types of noncovalent interactions coupled with implicit COSMO solvation. We show unequivocally that it outperforms eight widely used scoring functions. The accuracy and chemical generality of the SQM/COSMO filter make it a perfect tool for late stages of virtual screening. PMID:26821703

  7. Reliability and Validity of Field-Based Tests to Assess Upper-Body Muscular Strength in Children Aged 6-12 Years.

    PubMed

    Fernandez Santos, Jorge R; Ruiz, Jonatan R; Gonzalez-Montesinos, Jose Luis; Castro-Piñero, Jose

    2016-05-01

    The aim of this study was to analyze the reliability and the validity of the handgrip, basketball throw and push-ups tests in children aged 6-12 years. One hundred and eighty healthy children (82 girls) agreed to participate in this study. All the upper body muscular fitness tests were performed twice (7 days apart), whereas the 1 repetition maximum (1RM) bench press test was performed 2 days after the first session of testing. All the tests showed high reproducibility (ICC > 0.9) except the push-ups test (intertrial difference = 0.77 ± 2.38, p < .001; percentage error = 9%). The handgrip test showed the highest association with the 1RM bench press test (r = .79, p < .01; R2 = .621). In conclusion, the handgrip and basketball throw tests are shown to be reliable and valid tests to assess upper body muscular strength in children. More studies are needed to assess the validity and the reliability of upper body muscular endurance tests in children.

  8. Reliability of genetic networks is evolvable

    NASA Astrophysics Data System (ADS)

    Braunewell, Stefan; Bornholdt, Stefan

    2008-06-01

    Control of the living cell functions with remarkable reliability despite the stochastic nature of the underlying molecular networks—a property presumably optimized by biological evolution. We ask here to what extent the ability of a stochastic dynamical network to produce reliable dynamics is an evolvable trait. Using an evolutionary algorithm based on a deterministic selection criterion for the reliability of dynamical attractors, we evolve networks of noisy discrete threshold nodes. We find that, starting from any random network, reliability of the attractor landscape can often be achieved with only a few small changes to the network structure. Further, the evolvability of networks toward reliable dynamics while retaining their function is investigated and a high success rate is found.

  9. Failure modes effects and criticality analysis (FMECA) approach to the crystalline silicon photovoltaic module reliability assessment

    NASA Astrophysics Data System (ADS)

    Kuitche, Joseph M.; Tamizh-Mani, Govindasamy; Pan, Rong

    2011-09-01

    Traditional degradation or reliability analysis of photovoltaic (PV) modules has historically consisted of some combination of accelerated stress and field testing, including field deployment and monitoring of modules over long time periods, and analyzing commercial warranty returns. This has been effective in identifying failure mechanisms and developing stress tests that accelerate those failures. For example, BP Solar assessed the long-term reliability of modules deployed outdoors and modules returned from the field in 2003, and presented the types of failures observed. Out of about 2 million modules, the total number of returns over a nine-year period was only 0.13%. An analysis of these returns showed that 86% of the field failures were due to corrosion and cell or interconnect breaks. These failures were eliminated through extended thermal cycling and damp heat tests. Considering that these failures are observed even on modules that have successfully gone through conventional qualification tests, it is possible that known failure modes and mechanisms are not well understood. Moreover, when a defect is not easily identifiable, the existing accelerated tests might no longer be sufficient. Thus, a detailed study of all failure modes known from field testing is essential. In this paper, we combine physics-of-failure analysis with an empirical study of the field inspection data of PV modules deployed in Arizona to develop a FMECA model. This technique examines the failure rates of individual components of fielded modules, along with their severities and detectabilities, to determine the overall effect of a defect on the module's quality and reliability.
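
    The criticality-ranking step of a FMECA reduces to scoring each failure mode for occurrence, severity, and detectability and sorting by the resulting risk priority number; a minimal sketch with made-up scores (not the paper's Arizona field data):

    ```python
    # Minimal sketch of the FMECA ranking step: each failure mode gets a risk
    # priority number RPN = occurrence x severity x detectability (1-10 scales).
    # The modes and scores below are illustrative assumptions.
    failure_modes = [
        # (name, occurrence, severity, detectability)
        ("interconnect break",    4, 8, 5),
        ("cell corrosion",        5, 7, 4),
        ("encapsulant browning",  6, 3, 2),
        ("bypass diode failure",  2, 6, 7),
    ]
    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for name, occ, sev, det in ranked:
        print(f"{name:22s} RPN = {occ * sev * det}")
    ```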

  10. Reliability in individual monitoring service.

    PubMed

    Mod Ali, N

    2011-03-01

    As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated an overall comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. By having these mechanisms, reliability of the IMS can be assured in the promotion of safe behaviour at all levels of the workforce utilising ionising radiation facilities. The upgrade of the reporting program to a web-based e-SSDL marks a major improvement in Nuclear Malaysia's IMS reliability as a whole. The system is a vital step in providing a user friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring by the IMS and thus enhances the status of the radiation protection framework of the country.

  12. Reliability and Validity of a New Test of Change-of-Direction Speed for Field-Based Sports: the Change-of-Direction and Acceleration Test (CODAT).

    PubMed

    Lockie, Robert G; Schultz, Adrian B; Callaghan, Samuel J; Jeffriess, Matthew D; Berry, Simon P

    2013-01-01

    Field sport coaches must use reliable and valid tests to assess change-of-direction speed in their athletes. Few tests feature linear sprinting with acute change-of-direction maneuvers. The Change-of-Direction and Acceleration Test (CODAT) was designed to assess field sport change-of-direction speed, and includes a linear 5-meter (m) sprint, 45° and 90° cuts, 3-m sprints to the left and right, and a linear 10-m sprint. This study analyzed the reliability and validity of this test, through comparisons to 20-m sprint (0-5, 0-10, 0-20 m intervals) and Illinois agility run (IAR) performance. Eighteen Australian footballers (age = 23.83 ± 7.04 yrs; height = 1.79 ± 0.06 m; mass = 85.36 ± 13.21 kg) were recruited. Following familiarization, subjects completed the 20-m sprint, CODAT, and IAR in 2 sessions, 48 hours apart. Intra-class correlation coefficients (ICC) assessed relative reliability. Absolute reliability was analyzed through paired samples t-tests (p ≤ 0.05) determining between-session differences. Typical error (TE), coefficient of variation (CV), and differences between the TE and smallest worthwhile change (SWC), also assessed absolute reliability and test usefulness. For the validity analysis, Pearson's correlations (p ≤ 0.05) analyzed between-test relationships. Results showed no between-session differences for any test (p = 0.19-0.86). CODAT time averaged ~6 s, and the ICC and CV equaled 0.84 and 3.0%, respectively. The homogeneous sample of Australian footballers meant that the CODAT's TE (0.19 s) exceeded the usual 0.2 x standard deviation (SD) SWC (0.10 s). However, the CODAT is capable of detecting moderate performance changes (SWC calculated as 0.5 x SD = 0.25 s). There was a near perfect correlation between the CODAT and IAR (r = 0.92), and very large correlations with the 20-m sprint (r = 0.75-0.76), suggesting that the CODAT was a valid change-of-direction speed test. Due to movement specificity, the CODAT has value for field sport
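
    The absolute-reliability quantities used above follow directly from the two testing sessions; a minimal sketch with made-up CODAT times:

    ```python
    # Minimal sketch of absolute-reliability quantities: typical error (TE)
    # from session-to-session differences, CV, and smallest worthwhile change
    # (SWC) as a multiple of the between-subject SD. Times are made up.
    import numpy as np

    session1 = np.array([6.1, 5.8, 6.3, 6.0, 5.9, 6.4, 6.2, 6.0])  # times (s)
    session2 = np.array([6.0, 5.9, 6.2, 6.1, 5.8, 6.3, 6.3, 6.1])

    diff = session2 - session1
    te = diff.std(ddof=1) / np.sqrt(2)               # typical error
    cv = 100 * te / np.concatenate([session1, session2]).mean()
    swc_small = 0.2 * session1.std(ddof=1)           # 0.2 x SD criterion
    swc_moderate = 0.5 * session1.std(ddof=1)        # 0.5 x SD criterion
    print(f"TE = {te:.3f} s, CV = {cv:.1f}%, "
          f"SWC(0.2) = {swc_small:.3f} s, SWC(0.5) = {swc_moderate:.3f} s")
    ```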

  13. Comparison of the Reliability of Anatomic Landmarks based on PA Cephalometric Radiographs and 3D CT Scans in Patients with Facial Asymmetry

    PubMed Central

    Rathee, Pooja; Jain, Pradeep; Panwar, Vasim Raja

    2011-01-01

    Introduction Conventional cephalometry is an inexpensive and well-established method for evaluating patients with dentofacial deformities. However, patients with major deformities, and in particular asymmetric cases, are difficult to evaluate by conventional cephalometry. Reliable and accurate evaluation of the orbital and midfacial region in craniofacial syndrome patients is difficult due to inherent geometric magnification, distortion and the superpositioning of the craniofacial structures on cephalograms. Both two- and three-dimensional computed tomography (CT) have been proposed to alleviate some of these difficulties. Aims and objectives The aim of our study is to compare the reliability of anatomic cephalometric points obtained from two modalities, conventional posteroanterior cephalograms and 3D CT of patients with facial asymmetry, by comparison of intra- and interobserver variation of points recorded from frontal X-rays to those recorded from 3D CT. Materials and methods The sample included nine patients (5 males and 4 females) with an age range of 14 to 21 years and a mean age of 17.11 years, whose treatment plan called for correction of facial asymmetry. All CT scans were measured twice by two investigators, 2 weeks apart, for determination of intraobserver and interobserver variability. Similarly, all measurement points on the frontal cephalograms were traced twice, 2 weeks apart. The tracings were superimposed, and the average distance between replicate point readings was used as a measure of intra- and interobserver reliability. Intra- and interobserver variations were calculated for each method, and the data were imported directly into the statistical program SPSS 10.0.1 for Windows. Results Intraobserver variations of points defined on 3D CT were small compared with frontal cephalograms. The intraobserver variations ranged from 0 (A1, B1) to 0.6 mm, with variations of less than 0.5 mm for most of the points. Interobserver variations

  15. [Reliability of cancer as the underlying cause of death according to the Mortality Information System and Population-Based Cancer Registry in Goiânia, Goiás State, Brazil].

    PubMed

    Oliveira, Patricia Pereira Vasconcelos de; Silva, Gulnar Azevedo e; Curado, Maria Paula; Malta, Deborah Carvalho; Moura, Lenildo de

    2014-02-01

    This study assessed the reliability of cancer as the underlying cause of death using probabilistic linkage between the Mortality Information System and the Population-Based Cancer Registry (PBCR) in Goiânia, Goiás State, Brazil, from 2000 to 2005. RecLink III was used for probabilistic linkage, and reliability was assessed by Cohen's kappa and prevalence-adjusted and bias-adjusted kappa (PABAK). In the probabilistic linkage, 2,874 individuals were identified for the reliability analysis. Cohen's kappa ranged from 0.336 to 0.846 and PABAK from 0.810 to 0.990 for the 14 neoplasm groups defined in the study. For the reliability of the 35 leading cancers, 12 (34.3%) presented kappa values under 0.600 and PABAK over 0.981. Among the neoplasms common to both sexes, crude agreement ranged from 0.672 to 0.790 and adjusted agreement from 0.894 to 0.961. Sixty-seven percent of cases classified by the Mortality Information System as "cancer of ill-defined sites" were reclassified according to the PBCR. This study was useful for the classification of cancer mortality estimates in areas covered by the PBCR.
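
    The gap between kappa and PABAK seen here is a prevalence effect: when category frequencies are strongly skewed, chance-corrected kappa can be modest even though raw agreement is high. A minimal sketch with a hypothetical 2x2 agreement table:

    ```python
    # Minimal sketch of kappa vs. PABAK for a 2x2 agreement table between two
    # classification sources. Counts are hypothetical.
    def kappa_and_pabak(a, b, c, d):
        """2x2 table: a, d = agreement cells; b, c = disagreement cells."""
        n = a + b + c + d
        po = (a + d) / n                                      # observed agreement
        pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance agreement
        kappa = (po - pe) / (1 - pe)
        pabak = 2 * po - 1            # prevalence- and bias-adjusted kappa
        return kappa, pabak

    k, p = kappa_and_pabak(a=40, b=6, c=4, d=950)  # rare cancer site, high agreement
    print(f"kappa = {k:.3f}, PABAK = {p:.3f}")     # kappa well below PABAK
    ```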

  16. Reliability and durability problems

    NASA Astrophysics Data System (ADS)

    Bojtsov, B. V.; Kondrashov, V. Z.

    The papers presented in this volume focus on methods for determining the stress-strain state of structures and machines and evaluating their reliability and service life. Specific topics discussed include a method for estimating the service life of thin-sheet automotive structures, stressed state at the tip of small cracks in anisotropic plates under biaxial tension, evaluation of the elastic-dissipative characteristics of joints by vibrational diagnostics methods, and calculation of the reliability of ceramic structures for arbitrary long-term loading programs. Papers are also presented on the effect of prior plastic deformation on fatigue damage kinetics, axisymmetric and local deformation of cylindrical parts during finishing-hardening treatments, and adhesion of polymers to diffusion coatings on steels.

  17. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat to include HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  18. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  19. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME) Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  20. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  1. ATLAS reliability analysis

    SciTech Connect

    Bartsch, R.R.

    1995-09-01

    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
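
    The per-shot combination of component failure probabilities into a system reliability target of the kind described here can be sketched as a series system under Weibull life models. The scale and shape parameters below are invented, not the ATLAS values:

```python
import math

def weibull_shot_reliability(n_shots, eta, beta):
    """Probability a component survives its first n shots under a Weibull
    life model with scale eta (shots) and shape beta (>1 => wear-out)."""
    return math.exp(-(n_shots / eta) ** beta)

def system_reliability(components, n_shots):
    """Series system: every component must survive the shot campaign.
    `components` is a list of (eta, beta) pairs, illustrative values only."""
    r = 1.0
    for eta, beta in components:
        r *= weibull_shot_reliability(n_shots, eta, beta)
    return r

# Hypothetical bank elements; check against a >95% requirement:
print(system_reliability([(5000, 1.5), (2000, 2.0), (10000, 1.2)], 100))
```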

  2. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  3. Investigation of reliability method formulations in Dakota/UQ.

    SciTech Connect

    Renaud, John E.; Perez, Victor M.; Wojtkiewicz, Steven F., Jr.; Agarwal, H.; Eldred, Michael Scott

    2004-07-01

    Reliability methods are probabilistic algorithms for quantifying the effect of simulation input uncertainties on response metrics of interest. In particular, they compute approximate response function distribution statistics (probability, reliability and response levels) based on specified input random variable probability distributions. In this paper, a number of algorithmic variations are explored for both the forward reliability analysis of computing probabilities for specified response levels (the reliability index approach (RIA)) and the inverse reliability analysis of computing response levels for specified probabilities (the performance measure approach (PMA)). These variations include limit state linearizations, probability integrations, warm starting and optimization algorithm selections. The resulting RIA/PMA reliability algorithms for uncertainty quantification are then employed within bi-level and sequential reliability-based design optimization approaches. Relative performance of these uncertainty quantification and reliability-based design optimization algorithms is presented for a number of computational experiments performed using the DAKOTA/UQ software.
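
    The RIA/PMA duality can be illustrated with the simplest possible limit state linearization. The sketch below is a mean-value, first-order approximation with assumed response statistics, not DAKOTA's actual algorithms: the forward (RIA) step maps a response level to a probability, and the inverse (PMA) step maps a target probability back to a response level.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ria_probability(mu_g, sigma_g, z_bar):
    """Forward (RIA): probability that the response g falls below the
    specified level z_bar, for a linearized limit state g ~ N(mu, sigma)."""
    beta = (mu_g - z_bar) / sigma_g          # reliability index
    return norm_cdf(-beta)

def pma_level(mu_g, sigma_g, p_target):
    """Inverse (PMA): response level whose below-level probability equals
    p_target, found by bisection on the monotone CDF."""
    lo, hi = mu_g - 10 * sigma_g, mu_g + 10 * sigma_g
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if ria_probability(mu_g, sigma_g, mid) < p_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round trip with assumed response statistics mu = 10, sigma = 2:
p = ria_probability(10.0, 2.0, 6.0)          # ~0.0228 (beta = 2)
print(p, pma_level(10.0, 2.0, p))            # recovers ~6.0
```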

  4. The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps.

    PubMed

    Balsalobre-Fernández, Carlos; Tejero-González, Carlos M; del Campo-Vecino, Juan; Bavaresco, Nicolás

    2014-02-01

    Flight time is the most accurate and frequently used variable when assessing the height of vertical jumps. The purpose of this study was to analyze the validity and reliability of an alternative method (i.e., the HSC-Kinovea method) for measuring the flight time and height of vertical jumping using a low-cost high-speed Casio Exilim FH-25 camera (HSC). To this end, 25 subjects performed a total of 125 vertical jumps on an infrared (IR) platform while simultaneously being recorded with a HSC at 240 fps. Subsequently, 2 observers with no experience in video analysis analyzed the 125 videos independently using the open-license Kinovea 0.8.15 software. The flight times obtained were then converted into vertical jump heights, and the intraclass correlation coefficient (ICC), Bland-Altman plot, and Pearson correlation coefficient were calculated for those variables. The results showed a perfect correlation agreement (ICC = 1, p < 0.0001) between both observers' measurements of flight time and jump height and a highly reliable agreement (ICC = 0.997, p < 0.0001) between the observers' measurements of flight time and jump height using the HSC-Kinovea method and those obtained using the IR system, thus explaining 99.5% (p < 0.0001) of the differences (shared variance) obtained using the IR platform. As a result, besides requiring no previous experience in the use of this technology, the HSC-Kinovea method can be considered to provide similarly valid and reliable measurements of flight time and vertical jump height as more expensive equipment (i.e., IR). As such, coaches from many sports could use the HSC-Kinovea method to measure the flight time and height of their athletes' vertical jumps.
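
    The flight-time-to-height conversion underlying both the IR platform and the HSC-Kinovea method follows from projectile motion: the ascent lasts t/2, so h = (1/2) g (t/2)^2 = g t^2 / 8. A minimal sketch, assuming the airborne frame count is read off the 240 fps video:

```python
def jump_height(frames_airborne, fps=240.0, g=9.81):
    """Jump height from flight time via projectile motion: h = g*t^2/8."""
    t = frames_airborne / fps                # flight time (s)
    return g * t ** 2 / 8.0

# 120 airborne frames counted in a 240 fps video -> 0.5 s -> ~0.31 m:
print(round(jump_height(120), 3))
```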

  5. The comparison of wavelet- and Fourier-based electromyographic indices of back muscle fatigue during dynamic contractions: validity and reliability results.

    PubMed

    da Silva, R A; Larivière, C; Arsenault, A B; Nadeau, S; Plamondon, A

    2008-01-01

    The purpose of this study was to compare the electromyographic (EMG) fatigue indices computed from short-time Fourier transform (STFT) and wavelet transform (WAV), by analyzing their criterion validity and test-retest reliability. The effect of averaging spectral estimates within and between repeated contractions (cycles) on EMG fatigue indices was also demonstrated. Thirty-one healthy subjects performed trunk flexion-extension cycles until exhaustion on a Biodex dynamometer. The load was determined theoretically as twice the L5-S1 moment produced by the trunk mass. To assess reliability, 10 subjects performed the same experimental protocol after a two-week interval. EMG signals were recorded bilaterally with 12 pairs of electrodes placed on the back muscles (at L4, L3, L1 and T10 levels), as well as on the gluteus maximus and biceps femoris. The endurance time and perceived muscle fatigue (Borg CR-10 scale) were used as fatigue criteria. EMG signals were processed using STFT and WAV to extract global (e.g., median frequency and instantaneous median frequency, respectively) or local (e.g., intensity contained in 8 frequency bands) information from the power spectrum. The slope values of these variables over time, obtained from regression analyses, were retained as EMG fatigue indices. EMG fatigue indices (STFT vs. WAV) were not significantly different within each muscle, had a variable association (Pearson's r range: 0.06 to 0.68) with our fatigue criteria, and showed comparable reliability (intra-class correlation range: 0.00 to 0.88), although they varied between muscles. The effect of averaging, within and between cycles, contributed to the strong association between EMG fatigue indices computed from STFT and WAV. As for EMG spectral indices of muscle fatigue, the conclusion is that both transforms carry essentially the same information.
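
    The median-frequency slope used here as a fatigue index can be sketched in a few lines. The example below uses a plain FFT power spectrum on synthetic signals; the study's STFT/wavelet processing and the within- and between-cycle averaging are omitted for brevity:

```python
import numpy as np

def median_frequency(epoch, fs):
    """Frequency that splits the epoch's power spectrum into equal halves."""
    power = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    half = np.searchsorted(np.cumsum(power), power.sum() / 2.0)
    return freqs[half]

def fatigue_slope(epochs, epoch_times, fs):
    """Fatigue index: regression slope of median frequency over time.
    A negative slope indicates spectral compression as the muscle fatigues."""
    mdf = [median_frequency(e, fs) for e in epochs]
    slope, intercept = np.polyfit(epoch_times, mdf, 1)
    return slope

# Synthetic demo: epochs whose dominant frequency drifts from 80 Hz to 60 Hz:
fs = 1024.0
t = np.arange(1024) / 1024.0
epochs = [np.sin(2 * np.pi * f * t) for f in (80, 75, 70, 65, 60)]
print(fatigue_slope(epochs, [0, 15, 30, 45, 60], fs))   # ~ -0.33 Hz/s
```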

  7. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  8. Apollo experience report: Reliability and quality assurance

    NASA Technical Reports Server (NTRS)

    Sperber, K. P.

    1973-01-01

    The reliability of the Apollo spacecraft resulted from the application of proven reliability and quality techniques and from sound management, engineering, and manufacturing practices. Continual assessment of these techniques and practices was made during the program, and, when deficiencies were detected, adjustments were made and the deficiencies were effectively corrected. The most significant practices, deficiencies, adjustments, and experiences during the Apollo Program are described in this report. These experiences can be helpful in establishing an effective base on which to structure an efficient reliability and quality assurance effort for future space-flight programs.

  9. Operational reliability of standby safety systems

    SciTech Connect

    Grant, G.M.; Atwood, C.L.; Gentillon, C.D.

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) is evaluating the operational reliability of several risk-significant standby safety systems based on the operating experience at US commercial nuclear power plants from 1987 through 1993. The reliability assessed is the probability that the system will perform its Probabilistic Risk Assessment (PRA) defined safety function. The quantitative estimates of system reliability are expected to be useful in risk-based regulation. This paper is an overview of the analysis methods and the results of the high pressure coolant injection (HPCI) system reliability study. Key characteristics include (1) descriptions of the data collection and analysis methods, (2) the statistical methods employed to estimate operational unreliability, (3) a description of how the operational unreliability estimates were compared with typical PRA results, both overall and for each dominant failure mode, and (4) a summary of results of the study.

  10. MOV reliability evaluation and periodic verification scheduling

    SciTech Connect

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
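
    Under a normality assumption, the margin-versus-uncertainty comparison described above reduces to a one-line calculation: reliability is the probability that the actual margin stays positive. A sketch with hypothetical numbers:

```python
import math

def mov_reliability(nominal_margin, margin_sd):
    """Reliability estimate for a motor-operated valve: probability that
    the actual margin (capability minus requirement) stays positive,
    assuming the margin uncertainty is roughly normal."""
    z = nominal_margin / margin_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical: 25% nominal thrust margin with 10% (1-sigma) uncertainty:
print(round(mov_reliability(25.0, 10.0), 4))   # ~0.9938
```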

  11. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
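
    The redundancy-versus-repairability trade described above can be illustrated with the simplest possible model: identical, independent robots and a mission that needs at least k of them to survive. This is a deliberate simplification of the module-level method in the abstract, with made-up reliability values:

```python
from math import comb

def mission_reliability(n_robots, k_required, r_robot):
    """Probability that at least k of n identical, independent robots
    survive the mission (spare robots = n - k)."""
    return sum(comb(n_robots, j) * r_robot**j * (1 - r_robot)**(n_robots - j)
               for j in range(k_required, n_robots + 1))

# Trade-off sketch: 2 robots required, with and without one spare:
print(mission_reliability(2, 2, 0.9))   # 0.81  -- no spare
print(mission_reliability(3, 2, 0.9))   # ~0.972 -- one spare robot
```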

  12. The relative reliability of actively participating and passively observing raters in a simulation-based assessment for selection to specialty training in anaesthesia.

    PubMed

    Roberts, M J; Gale, T C E; Sice, P J A; Anderson, I R

    2013-06-01

    Selection to specialty training is a high-stakes assessment demanding valuable consultant time. In one initial entry level and two higher level anaesthesia selection centres, we investigated the feasibility of using staff participating in simulation scenarios, rather than observing consultants, to rate candidate performance. We compared participant and observer scores using four different outcomes: inter-rater reliability; score distributions; correlation of candidate rankings; and percentage of candidates whose selection might be affected by substituting participants' for observers' ratings. Inter-rater reliability between observers was good (correlation coefficient 0.73-0.96) but lower between participants (correlation coefficient 0.39-0.92), particularly at higher level where participants also rated candidates more favourably than did observers. Station rank orderings were strongly correlated between the rater groups at entry level (rho 0.81, p < 0.001) but weaker at the two higher level centres (rho 0.52, p = 0.018; rho 0.58, p = 0.001). Substituting participants' for observers' ratings had less effect once scores were combined with those from other selection centre stations. Selection decisions for 0-20% of candidates could have changed, depending on the numbers of training posts available. We conclude that using participating raters is feasible at initial entry level only.

  13. Development of a New Ultrasound-Based System for Tracking Motion of the Human Lumbar Spine: Reliability, Stability and Repeatability during Forward Bending Movement Trials.

    PubMed

    Cuesta-Vargas, Antonio I

    2015-07-01

    The aim of this study was to develop a new method for quantifying intersegmental motion of the spine in an instrumented motion segment L4-L5 model using ultrasound image post-processing combined with an electromagnetic device. A prospective test-retest design was employed, combined with an evaluation of stability and within- and between-day intra-tester reliability during forward bending by 15 healthy male patients. The accuracy of the measurement system using the model was calculated to be ± 0.9° (standard deviation = 0.43) over a 40° range and ± 0.4 cm (standard deviation = 0.28) over 1.5 cm. The mean composite range of forward bending was 15.5 ± 2.04° during a single trial (standard error of the mean = 0.54, coefficient of variation = 4.18). Reliability (intra-class correlation coefficient (2,1)) was found to be excellent for both within-day measures (0.995-0.999) and between-day measures (0.996-0.999). Further work is necessary to explore the use of this approach in the evaluation of biomechanics, clinical assessments and interventions. PMID:25864018

  14. Distribution system reliability assessment using hierarchical Markov modeling

    SciTech Connect

    Brown, R.E.; Gupta, S.; Christie, R.D.; Venkata, S.S.; Fletcher, R.

    1996-10-01

    Distribution system reliability assessment is concerned with power availability and power quality at each customer's service entrance. This paper presents a new method, termed Hierarchical Markov Modeling (HMM), which can perform predictive distribution system reliability assessment. HMM is unique in that it decomposes the reliability model based on system topology, integrated protection systems, and individual protection devices. This structure, which easily accommodates the effects of backup protection, fault isolation, and load restoration, is compared to simpler reliability models. HMM is then used to assess the reliability of an existing utility distribution system and to explore the reliability impact of several design improvement options.

  15. Production Facility System Reliability Analysis Report

    SciTech Connect

    Dale, Crystal Buchanan; Klein, Steven Karl

    2015-10-06

    This document describes the reliability, maintainability, and availability (RMA) modeling of the Los Alamos National Laboratory (LANL) design for the Closed Loop Helium Cooling System (CLHCS) planned for the NorthStar accelerator-based 99Mo production facility. The current analysis incorporates a conceptual helium recovery system, beam diagnostics, and prototype control system into the reliability analysis. The results from the 1000 hr blower test are addressed.

  16. Thin-film reliability and engineering overview

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1984-01-01

    The reliability and engineering technology base required for thin-film solar energy conversion modules is discussed. The emphasis is on the integration of amorphous silicon cells into power modules. The effort is being coordinated with SERI's thin film cell research activities as part of DOE's Amorphous Silicon Program. Program concentration is on temperature-humidity reliability research, glass breaking strength research, point defect system analysis, hot spot heating assessment, and electrical measurements technology.

  17. Gearbox Reliability Collaborative (GRC) Description and Loading

    SciTech Connect

    Oyague, F.

    2011-11-01

    This document describes simulated turbine load cases in accordance to the IEC 61400-1 Ed.3 standard, which is representative of the typical wind turbine design process. The information presented herein is intended to provide a broad understanding of the gearbox reliability collaborative 750kW drivetrain and turbine configuration. In addition, fatigue and ultimate strength drivetrain loads resulting from simulations are presented. This information provides the basis for the analytical work of the gearbox reliability collaborative effort.

  18. Reliability Degradation Due to Stockpile Aging

    SciTech Connect

    Robinson, David G.

    1999-04-01

    The objective of this research is the investigation of alternative methods for characterizing the reliability of systems with time dependent failure modes associated with stockpile aging. Reference to 'reliability degradation' has, unfortunately, come to be associated with all types of aging analyses: both deterministic and stochastic. In this research, in keeping with the true theoretical definition, reliability is defined as a probabilistic description of system performance as a function of time. Traditional reliability methods used to characterize stockpile reliability depend on the collection of a large number of samples or observations. Clearly, after the experiments have been performed and the data has been collected, critical performance problems can be identified. A major goal of this research is to identify existing methods and/or develop new mathematical techniques and computer analysis tools to anticipate stockpile problems before they become critical issues. One of the most popular methods for characterizing the reliability of components, particularly electronic components, assumes that failures occur in a completely random fashion, i.e. uniformly across time. This method is based primarily on the use of constant failure rates for the various elements that constitute the weapon system, i.e. the systems do not degrade while in storage. Experience has shown that predictions based upon this approach should be regarded with great skepticism since the relationship between the life predicted and the observed life has been difficult to validate. In addition to this fundamental problem, the approach does not recognize that there are time dependent material properties and variations associated with the manufacturing process and the operational environment. To appreciate the uncertainties in predicting system reliability a number of alternative methods are explored in this report. All of the methods are very different from those currently used to assess stockpile
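
    The abstract's skepticism about constant failure rates is easy to visualize: an exponential model and a Weibull wear-out model can agree early in life yet diverge sharply later. A short sketch with invented calibration numbers:

```python
import math

def r_exp(t, lam):
    """Constant failure rate: no degradation while in storage."""
    return math.exp(-lam * t)

def r_weibull(t, eta, beta):
    """Weibull with shape beta > 1: failure rate rises as the item ages."""
    return math.exp(-((t / eta) ** beta))

# Calibrate both models to the same 10-year reliability of 0.99,
# then compare their 30-year predictions (all numbers illustrative):
lam = -math.log(0.99) / 10.0
eta = 10.0 / (-math.log(0.99)) ** (1.0 / 3.0)   # beta = 3 wear-out model
for t in (10, 20, 30):
    print(t, round(r_exp(t, lam), 4), round(r_weibull(t, eta, 3.0), 4))
```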

  19. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of Human Spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  20. Blade reliability collaborative :

    SciTech Connect

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.

    2013-04-01

    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  1. A sensitive sequential 'on/off' SERS assay for heparin with wider detection window and higher reliability based on the reversed surface charge changes of functionalized Au@Ag nanoparticles.

    PubMed

    Zeng, Yi; Pei, Jin-Ju; Wang, Li-Hua; Shen, Ai-Guo; Hu, Ji-Ming

    2015-04-15

    A sequential 'on/off' dual mode SERS assay platform for heparin with a wider detection window and higher reliability is constructed based on electrostatic forces, in which the highly protonated chitosan encapsulated p-Mercaptobenzoic acid coated Au@Ag core-shell nanoparticles undergo sequential aggregation/segregation upon the addition of heparin, with a limit of detection of 43.74 ng/mL (5.69 U/mL) and a continuous concentration range of 50-800 ng/mL (6.5-104 U/mL), a lower detection limit and a wider detection window than most reported assays for heparin. Remarkably, the latter declining window over the 350-800 ng/mL range, which has not been reported before, is extremely important in reliable and practical assay of heparin.

  2. Meta-Analysis of Scale Reliability Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2013-01-01

    A latent variable modeling approach is outlined that can be used for meta-analysis of reliability coefficients of multicomponent measuring instruments. Important limitations of efforts to combine composite reliability findings across multiple studies are initially pointed out. A reliability synthesis procedure is discussed that is based on…

  3. Methods to approximate reliabilities in single-step genomic evaluation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Reliability of predictions from single-step genomic BLUP (ssGBLUP) can be calculated by inversion, but that is not feasible for large data sets. Two methods of approximating reliability were developed based on decomposition of a function of reliability into contributions from records, pedigrees, and...

  4. Industrial Power Distribution System Reliability Assessment utilizing Markov Approach

    NASA Astrophysics Data System (ADS)

    Guzman-Rivera, Oscar R.

    A method to perform power system reliability analysis using the Markov approach, reliability block diagrams and fault tree analysis has been presented. The Markov method we use is a state space model and is based on state diagrams generated for a one-line industrial power distribution system. The reliability block diagram (RBD) method is a graphical and calculation tool used to model the distribution power system of an industrial facility. Quantitative reliability estimates in this work are based on CARMS and BlockSim simulations as well as state space, RBD, and failure mode analyses. The power system reliability was assessed and the main contributors to power system reliability have been identified, both qualitatively and quantitatively. Methods to improve reliability are also provided, including redundancies and protection systems that might be added to the system.
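
    A state-space Markov model of the kind used here can be solved for steady-state probabilities directly. A compact sketch for a hypothetical three-state feeder (all transition rates invented, not from the thesis):

```python
import numpy as np

# Hypothetical three-state feeder model: 0 = up, 1 = degraded, 2 = down.
# Q[i][j] is the transition rate (per year) from state i to state j;
# diagonal entries make each row sum to zero.
Q = np.array([[-0.6,  0.5,  0.1],
              [ 4.0, -5.0,  1.0],
              [20.0,  0.0, -20.0]])

# Steady-state probabilities p satisfy p Q = 0 with the probabilities
# summing to 1; append the normalization row and solve by least squares.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state probabilities:", p)
print("availability (up or degraded):", p[0] + p[1])
```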

  5. Fault Tree Reliability Analysis and Design-for-reliability

    1998-05-05

    WinR provides a fault tree analysis capability for performing systems reliability and design-for-reliability analyses. The package includes capabilities for sensitivity and uncertainty analysis, field failure data analysis, and optimization.

  6. Probabilistic fatigue methodology for six nines reliability

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf

    1990-01-01

    Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin which was established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are in the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods in defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
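
    For intuition on what six nines implies, consider the common simplification of log-normal fatigue life scatter: the design life at reliability R is the median life shifted down by |z| log10-standard-deviations, where z is the standard normal quantile of 1 - R (about -4.75 for R = 0.999999). All numbers below are hypothetical, not from the round robin:

```python
import math

def inv_norm_cdf(p):
    """Inverse standard normal CDF by bisection (adequate for a sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def life_at_reliability(median_life, sigma_log10, reliability):
    """Fatigue life at a given reliability for log-normal life scatter:
    shift the median life down by |z| log10-standard-deviations."""
    z = inv_norm_cdf(1.0 - reliability)      # z ~ -4.75 for six nines
    return median_life * 10.0 ** (z * sigma_log10)

# Hypothetical component: median life 1e7 cycles, log10 scatter 0.18:
print(f"{life_at_reliability(1e7, 0.18, 0.999999):.3e} cycles")
```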

  7. Test-Retest Reliability of High Angular Resolution Diffusion Imaging Acquisition within Medial Temporal Lobe Connections Assessed via Tract Based Spatial Statistics, Probabilistic Tractography and a Novel Graph Theory Metric

    PubMed Central

    Kuhn, T.; Gullett, J. M.; Nguyen, P.; Boutzoukas, A. E.; Ford, A.; Colon-Perez, L. M.; Triplett, W.; Carney, P.R.; Mareci, T. H.; Price, C. C.; Bauer, R. M.

    2015-01-01

    Introduction This study examined the reliability of high angular resolution diffusion imaging (HARDI) data collected on a single individual across several sessions using the same scanner. Methods HARDI data was acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. Results TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scanner-to-scanner variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. Conclusions By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained. PMID:26189060

  8. Test-retest reliability of high angular resolution diffusion imaging acquisition within medial temporal lobe connections assessed via tract based spatial statistics, probabilistic tractography and a novel graph theory metric.

    PubMed

    Kuhn, T; Gullett, J M; Nguyen, P; Boutzoukas, A E; Ford, A; Colon-Perez, L M; Triplett, W; Carney, P R; Mareci, T H; Price, C C; Bauer, R M

    2016-06-01

    This study examined the reliability of high angular resolution diffusion imaging (HARDI) data collected on a single individual across several sessions using the same scanner. HARDI data was acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the Entorhinal cortex (ERc) based on connections with Hippocampus (HC), Perirhinal (PRc) and Parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, expecting that scan-to-scan reliability would yield low CoVs. TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically-verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scanner-to-scanner variability. Connections involving HC displayed greater variability than metrics of connection between other investigated regions. By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained.

  9. On Component Reliability and System Reliability for Space Missions

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Gillespie, Amanda M.; Monaghan, Mark W.; Sampson, Michael J.; Hodson, Robert F.

    2012-01-01

    This paper is to address the basics, the limitations and the relationship between component reliability and system reliability through a study of flight computing architectures and related avionics components for NASA future missions. Component reliability analysis and system reliability analysis need to be evaluated at the same time, and the limitations of each analysis and the relationship between the two analyses need to be understood.

  10. Improve relief valve reliability

    SciTech Connect

    Nelson, W.E.

    1993-01-01

    This paper reports on how careful evaluation of safety relief valves and their service conditions can improve reliability and permit more time between tests. Some factors that aid in getting long-run results are: use of valves suitable for the service, attention to design of the relieving system (including use of block valves), and close attention to repair procedures. Use these procedures for each installation, applying good engineering practices. The Clean Air Act of 1990 and other legislation limiting allowable fugitive emissions in a hydrocarbon processing plant will greatly impact safety relief valve installations. Normal leakage rate from a relief valve will require that it be connected to a closed vent system connected to a recovery or control device. Tying the outlet of an existing valve into a header system can cause accelerated corrosion and operating difficulties. Reliability of many existing safety relief valves may be compromised when they are connected to an outlet header without following good engineering practices. The law has been enacted, but not all of the rules have been promulgated.

  11. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
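
    The pass/fail logic of the serpentine assay described in this patent can be expressed compactly: convert the measured resistance into an electrical length and compare it against the design length. All numbers below are hypothetical; the patent ties the allowed excess to the required reliability rather than to a fixed 10%:

```python
def step_coverage_check(r_measured_ohms, r_per_um_ohms, design_length_um,
                        max_excess_fraction=0.10):
    """Screen a die using the serpentine assay conductor: infer an
    'electrical length' from resistance and flag the die if it exceeds
    the design length by more than the allowed fraction."""
    measured_length = r_measured_ohms / r_per_um_ohms
    excess = (measured_length - design_length_um) / design_length_um
    return excess, excess <= max_excess_fraction

# E.g. 5.6 kOhm measured on a 0.5 Ohm/um, 10,000 um serpentine:
print(step_coverage_check(5600.0, 0.5, 10000.0))   # 12% excess -> reject
```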

  12. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  13. Reliability of MR-Based Volumetric 3-D Analysis of Pelvic Muscles among Subjects with Low Back with Leg Pain and Healthy Volunteers

    PubMed Central

    Skorupska, Elżbieta; Keczmer, Przemysław; Łochowski, Rafał M.; Tomal, Paulina; Rychlik, Michał; Samborski, Włodzimierz

    2016-01-01

    Aim Lately, the diagnostic value of magnetic resonance imaging, Lasègue sign and classic neurological signs have been considered not accurate enough to distinguish the radicular from non-radicular low back with leg pain (LBLP) and a calculation of the symptomatic side muscle volume has been indicated as a probable valuable marker. However, only the multifidus muscle volume has been calculated so far. The main objective of the study was to verify whether LBLP subjects presented symptomatic side pelvic muscle atrophy compared to healthy volunteers. The second aim was to assess the inter-rater reliability of a 3-D manual method for segmenting and measuring the volume of the gluteus maximus, gluteus medius, gluteus minimus and piriformis muscles in both LBLP patients and healthy subjects. Method Two independent raters analyzed MR images of LBLP and healthy subjects to measure the volume of four pelvic muscles, i.e. the piriformis, gluteus minimus, gluteus medius and gluteus maximus. For both sides, the MR images of the muscles without adipose tissue infiltration were manually segmented in 3-D medical images. Results Symptomatic muscle atrophy was confirmed in just over 50% of LBLP subjects (gluteus maximus (p<0.001), gluteus minimus (p<0.01) and piriformis (p<0.05)). The ICC values indicated that the inter-rater reproducibility was greater than 0.90 for all measurements (LBLP and healthy subjects), except for the measurement of the right gluteus medius muscle in LBLP patients, which was equal to 0.848. Conclusion More than 50% of LBLP subjects presented symptomatic gluteus maximus, gluteus minimus and piriformis muscle atrophy. 3-D manual segmentation reliably measured muscle volume in all the measured pelvic muscles in both healthy and LBLP subjects. To answer the question of what kind of muscle atrophy is indicative of radicular or non-radicular pain, further studies are required. PMID:27459688

  14. Creating High Reliability in Health Care Organizations

    PubMed Central

    Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth

    2006-01-01

    Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981

  15. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  16. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making process. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of these case studies discussed are reliability-based life limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, impact of ET foam reliability on the Space Shuttle System risk, and reliability-based Space Shuttle upgrade for safety. Special attention is given in this paper to the physics based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware including their potential use in a research and technology development environment.

  17. Apply reliability centered maintenance to sealless pumps

    SciTech Connect

    Pradhan, S. )

    1993-01-01

    This paper reports on reliability centered maintenance (RCM) which is considered a crucial part of future reliability engineering. RCM determines the maintenance requirements of plants and equipment in their operating context. The RCM method has been applied to the management of critical sealless pumps in fire/toxic risk services, typical of the petrochemical industry. The method provides advantages from a detailed study of any critical engineering system. RCM is a team exercise and fosters team spirit in the plant environment. The maintenance strategy that evolves is based on team decisions and relies on maximizing the inherent reliability built into the equipment. RCM recommends design upgrades where this inherent reliability is being questioned. Sealless pumps of canned motor design are used as main reactor charge pumps in PVC plants. These pumps handle fresh vinyl chloride monomer (VCM), which is both carcinogenic and flammable.

  18. Reliability of chemical analyses of water samples

    SciTech Connect

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of on-going monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.

  19. Measurement Practices for Reliability and Power Quality

    SciTech Connect

    Kueck, JD

    2005-05-06

    to support activities to develop and share information among industry and regulatory participants about critical resources and practices. The toolkit has been developed by investigating the status of indices and definitions, surveying utility organizations on information sharing, and preparing summaries of reliability standards and monitoring requirements--the issues, needs, work under way, existing standards, practices and guidelines--for the following three classifications: (1) terms and definitions of reliability; (2) power quality standards, guidelines, and measurements; and (3) activities and organizations developing and sharing information on distribution reliability. As these synopses of reliability measurement practices are provided, it must be noted that an economic penalty may be associated with requiring too high a reliability level from the distribution system for all customers. It may be appropriate for the distribution system to supply only some base, generally accepted level of reliability. This base level would be adequate for the majority of customers. Users who need a higher level may find it economical to supply using distributed energy resources (DER) and other local solutions to reliability and power quality needs. Local solutions implemented by the customer may be the most cost-effective method for addressing the more stringent needs of a digital economy. These local solutions include energy storage, small distributed generators, and microgrids. This report also considers the market's role in addressing reliability issues and requirements. The customer's needs are discussed in view of issues such as power quality requirements of digital electronic equipment, the cost of outages, the cost of storage and new infrastructure, and natural gas prices. The market role in addressing these issues and requirements is explored. The economic considerations associated with the reliability issues are discussed, as well as the levels at which these economic

  20. Reliable vibrational wavenumbers for C=O and N-H stretchings of isolated and hydrogen-bonded nucleic acid bases.

    PubMed

    Fornaro, Teresa; Biczysko, Malgorzata; Bloino, Julien; Barone, Vincenzo

    2016-03-28

    The accurate prediction of vibrational wavenumbers for functional groups involved in hydrogen-bonded bridges remains an important challenge for computational spectroscopy. For the specific case of the C=O and N-H stretching modes of nucleobases and their oligomers, the paucity of experimental reference values needs to be compensated by reliable computational data, which require the use of approaches going beyond the standard harmonic oscillator model. Test computations performed for model systems (formamide, acetamide and their cyclic homodimers) in the framework of the second order vibrational perturbation theory (VPT2) confirmed that anharmonic corrections can be safely computed by global hybrid (GHF) or double hybrid (DHF) functionals, whereas the harmonic part is particularly challenging. As a matter of fact, GHFs perform quite poorly and even DHFs, while fully satisfactory for C=O stretchings, face unexpected difficulties when dealing with N-H stretchings. On these grounds, a linear regression for N-H stretchings has been obtained and validated for the heterodimers formed by 4-aminopyrimidine with 6-methyl-4-pyrimidinone (4APM-M4PMN) and by uracil with water. In view of the good performance of this computational model, we have built a training set of B2PLYP-D3/maug-cc-pVTZ harmonic wavenumbers (including linear regression scaling for N-H) for six-different uracil dimers and a validation set including 4APM-M4PMN, one of the most stable hydrogen-bonded adenine homodimers, as well as the adenine-uracil, adenine-thymine, guanine-cytosine and adenine-4-thiouracil heterodimers. Because of the unfavourable scaling of DHF harmonic wavenumbers with the dimensions of the investigated systems, we have optimized a linear regression of B3LYP-D3/N07D harmonic wavenumbers for the training set, which has been next checked against the validation set. This relatively cheap model, which shows very good agreement with experimental data (average errors of about 10 cm(-1)), paves
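
    The linear-regression scaling described for the N-H stretchings amounts to an ordinary least-squares fit between computed harmonic and reference wavenumbers. A sketch with placeholder values, not the paper's data:

```python
import numpy as np

def fit_scaling(harmonic_cm1, reference_cm1):
    """Least-squares linear regression nu_ref = a * nu_harm + b mapping
    computed harmonic wavenumbers onto reference values."""
    a, b = np.polyfit(harmonic_cm1, reference_cm1, 1)
    return a, b

# Hypothetical N-H stretch training pairs (cm-1), invented for illustration:
harm = np.array([3640.0, 3580.0, 3515.0, 3460.0])   # computed harmonics
ref = np.array([3480.0, 3425.0, 3365.0, 3310.0])    # reference values
a, b = fit_scaling(harm, ref)
print(f"scaled 3600 cm-1 -> {a * 3600 + b:.1f} cm-1")
```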