Science.gov

Sample records for reliability physics-of-failure based

  1. Prediction of reliability on thermoelectric module through accelerated life test and Physics-of-failure

    NASA Astrophysics Data System (ADS)

    Choi, Hyoung-Seuk; Seo, Won-Seon; Choi, Duck-Kyun

    2011-09-01

    A thermoelectric cooling module (TEM) is an electric device that experiences mechanical stress because of the temperature gradient within it. This makes the structure of a TEM vulnerable from a reliability standpoint, yet research on the reliability of TEMs has been scarce. Recently, as the use of thermoelectric cooling devices has grown, so has the need for life prediction and improvement. In this paper, we investigated the life distribution and shape parameter of the TEM through an accelerated life test (ALT), and we discussed how to enhance the life of the TEM through physics-of-failure analysis. The ALT results showed that the thermoelectric cooling module follows a Weibull distribution with a shape parameter of 3.6. The acceleration model is Coffin-Manson with a material constant of 1.8.
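
    As a rough illustration of how the reported Weibull shape parameter (3.6) and Coffin-Manson constant (1.8) could be used together, the sketch below extrapolates an accelerated-test characteristic life to field conditions. The thermal swings and test characteristic life are invented for the example; the abstract does not report them.

```python
import numpy as np

beta = 3.6                    # Weibull shape parameter reported from the ALT
m = 1.8                       # Coffin-Manson material constant
dT_test, dT_use = 80.0, 30.0  # assumed thermal cycling swings (K): test vs. field

# Coffin-Manson acceleration factor for thermal cycling
AF = (dT_test / dT_use) ** m

eta_test = 12_000.0           # assumed characteristic life (cycles) at test stress
eta_use = eta_test * AF       # characteristic life extrapolated to use conditions

def weibull_reliability(t, eta, beta):
    """Two-parameter Weibull reliability R(t) = exp(-(t/eta)^beta)."""
    return np.exp(-(t / eta) ** beta)

cycles = 20_000
print(f"AF = {AF:.2f}, eta_use = {eta_use:.0f} cycles")
print(f"R({cycles}) = {weibull_reliability(cycles, eta_use, beta):.4f}")
```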

  2. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences drawn from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e., PPoF-based) modeling. This concept originated in artificial intelligence (AI) as a leading intelligent computational inference technique for modeling multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate, or completely redefine its role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  3. Methodology for Physics and Engineering of Reliable Products

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Gibbel, Mark

    1996-01-01

    Physics-of-failure approaches have gained widespread acceptance within the electronic reliability community. These methodologies involve identifying root-cause failure mechanisms, developing associated models, and utilizing these models to improve time to market, lower development and build costs, and achieve higher reliability. The methodology outlined herein sets forth a process, based on the integration of both physics and engineering principles, for achieving the same goals.

  4. Reliability-based casing design

    SciTech Connect

    Maes, M.A.; Gulati, K.C.; Johnson, R.C.; McKenna, D.L.; Brand, P.R.; Lewis, D.B.

    1995-06-01

    The present paper describes the development of reliability-based design criteria for oil and/or gas well casing/tubing. The approach is based on the fundamental principles of limit state design. Limit states for tubulars are discussed and specific techniques for the stochastic modeling of loading and resistance variables are described. Zonation methods and calibration techniques are developed which are geared specifically to the characteristic tubular design for both hydrocarbon drilling and production applications. The application of quantitative risk analysis to the development of risk-consistent design criteria is shown to be a major and necessary step forward in achieving more economic tubular design.
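
    A minimal sketch of the limit-state idea behind this kind of design check: the probability of failure for a single burst limit state is estimated by sampling a resistance model and a load model. The distributions and parameters below are placeholders for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumed stochastic models (illustrative only), units in MPa:
R = rng.lognormal(mean=np.log(70.0), sigma=0.08, size=n)  # burst resistance
L = rng.normal(loc=45.0, scale=6.0, size=n)               # internal pressure load

g = R - L                  # limit state function: failure when g <= 0
pf = np.mean(g <= 0)
print(f"Estimated probability of failure: {pf:.2e}")
```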

  5. Reliability based fatigue design and maintenance procedures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1977-01-01

    A stochastic model has been developed to describe the probability of the fatigue process by assuming a varying hazard rate. This stochastic model can be used to obtain the desired probability of a crack of a certain length at a given location after a certain number of cycles or time. Quantitative estimation of the developed model is also discussed. Application of the model to develop a procedure for reliability-based, cost-effective, fail-safe structural design is presented. This design procedure includes the reliability improvement due to inspection and repair. Methods of obtaining optimum inspection and maintenance schemes are treated.

  6. A Reliability-Based Track Fusion Algorithm

    PubMed Central

    Xu, Li; Pan, Liqiang; Jin, Shuilin; Liu, Haibo; Yin, Guisheng

    2015-01-01

    The common track fusion algorithms in multi-sensor systems have some defects, such as serious imbalances between accuracy and computational cost, identical treatment of all sensor information regardless of quality, and high fusion errors at inflection points. To address these defects, a track fusion algorithm based on reliability (TFR) is presented for multi-sensor, multi-target environments. To improve the information quality, outliers in the local tracks are eliminated first. Then the reliability of the local tracks is calculated, and the local tracks with high reliability are chosen for the state estimation fusion. In contrast to existing methods, TFR reduces high fusion errors at the inflection points of system tracks and obtains high accuracy with less computational cost. Simulation results verify the effectiveness and the superiority of the algorithm in dense sensor environments. PMID:25950174
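
    The paper's exact reliability measure is not reproduced here; the sketch below only illustrates the general idea of discarding low-reliability local tracks and weighting the remainder by reliability, with invented numbers and a hypothetical fuse_tracks helper.

```python
import numpy as np

def fuse_tracks(states, reliabilities, threshold=0.5):
    """Fuse local track states: drop tracks below the reliability
    threshold, then average the rest weighted by reliability."""
    states = np.asarray(states, dtype=float)
    r = np.asarray(reliabilities, dtype=float)
    keep = r >= threshold
    w = r[keep] / r[keep].sum()
    return w @ states[keep]

# Three sensors' position estimates for one target at one scan (x, y in m);
# the third local track is an outlier with low estimated reliability.
local = [[102.1, 48.7], [99.4, 50.2], [115.0, 60.0]]
rel = [0.9, 0.8, 0.2]
print(fuse_tracks(local, rel))  # outlier excluded -> approx. [100.8, 49.4]
```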

  7. Reliability mechanisms in distributed data base systems

    SciTech Connect

    Son, S.H.

    1986-01-01

    Distributed database systems operate in computer networking environments where component failures are inevitable during normal operation. Failures not only threaten normal operation of the system, but may also destroy the correctness of the database through direct damage to the storage subsystem. In order to cope with these failures, distributed database systems must provide reliability mechanisms that maintain system consistency. There are two major parts in this dissertation. In the first part, mechanisms are presented for recovery management in distributed database systems. The recovery management of a distributed database system consists of two parts: preparation for recovery, by saving necessary information during normal operation of the database system, and coordination of the actual recovery, in order to avoid possible inconsistency after the recovery. The preparation for recovery is done through checkpointing and logging. A new scheme is proposed for reconstruction of the database in distributed environments. In the second part, a token-based resiliency control scheme for replicated distributed database systems is presented. The proposed control scheme increases the reliability as well as the degree of concurrency while maintaining the consistency of the system.

  8. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error data collected on a multiprocessor system are described. Model development from the raw error data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.

  9. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with average diameter and average length of 15 inches. Structural reliability is based on axial buckling strength of the cylinder. Both Monte Carlo simulation and First Order Reliability Method are considered for reliability analysis with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure and optimization results are included in this report.
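
    As a hedged sketch of the workflow described above, the code below fits a second-order response surface to a stand-in strength function (the report uses a buckling model for the composite cylinder) and then runs a sampling-based reliability analysis on the cheap surrogate. The toy strength function and all random-variable parameters are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def buckling_strength(E, t):
    """Toy stand-in for the real buckling model (illustration only)."""
    return 0.6 * E * t**2

# Fit a second-order (quadratic) response surface on a small grid design.
E_pts = np.linspace(8.0e6, 1.2e7, 5)   # fiber-direction modulus (psi)
t_pts = np.linspace(0.04, 0.06, 5)     # wall thickness (in)
Eg, tg = np.meshgrid(E_pts, t_pts)
X = np.column_stack([np.ones(Eg.size), Eg.ravel(), tg.ravel(),
                     Eg.ravel()**2, tg.ravel()**2, (Eg * tg).ravel()])
coef, *_ = np.linalg.lstsq(X, buckling_strength(Eg, tg).ravel(), rcond=None)

def surrogate(E, t):
    return (coef[0] + coef[1]*E + coef[2]*t +
            coef[3]*E**2 + coef[4]*t**2 + coef[5]*E*t)

# Monte Carlo reliability analysis on the surrogate (assumed distributions).
n = 200_000
E = rng.normal(1.0e7, 5.0e5, n)        # random modulus
t = rng.normal(0.05, 0.002, n)         # random thickness
P = rng.normal(9.0e3, 1.2e3, n)        # random applied axial load (lb)

pf = np.mean(surrogate(E, t) <= P)     # failure: strength <= load
print(f"pf = {pf:.3e}, generalized reliability index = {-norm.ppf(pf):.2f}")
```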

  10. System Reliability for LED-Based Products

    SciTech Connect

    Davis, J Lynn; Mills, Karmann; Lamvik, Michael; Yaga, Robert; Shepherd, Sarah D; Bittle, James; Baldasaro, Nick; Solano, Eric; Bobashev, Georgiy; Johnson, Cortina; Evans, Amy

    2014-04-07

    Results from accelerated life tests (ALT) on mass-produced, commercially available 6” downlights are reported along with results from commercial LEDs. The luminaires capture many of the design features found in modern luminaires. In general, a systems perspective is required to understand the reliability of these devices, since LED failure is rare; components such as drivers, lenses, and reflectors are more likely to impact luminaire reliability than the LEDs.

  11. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.

  12. Complementary Reliability-Based Decodings of Binary Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1997-01-01

    This correspondence presents a hybrid reliability-based decoding algorithm which combines the reprocessing method based on the most reliable basis and a generalized Chase-type algebraic decoder based on the least reliable positions. It is shown that reprocessing with a simple additional algebraic decoding effort achieves significant coding gain. For long codes, the order of reprocessing required to achieve asymptotic optimum error performance is reduced by approximately 1/3. This significantly reduces the computational complexity, especially for long codes. Also, a more efficient criterion for stopping the decoding process is derived based on the knowledge of the algebraic decoding solution.

  13. A highly reliable RAID system based on GPUs.

    SciTech Connect

    Curry, Matthew L.

    2010-06-01

    While RAID is the prevailing method of creating reliable secondary storage infrastructure, many users desire more flexibility than is offered by current implementations. To attain the needed performance, customers have often sought hardware-based RAID solutions. This talk describes a RAID system that offloads erasure correction coding calculations to GPUs, allowing increased reliability by supporting new RAID levels while maintaining high performance.

  14. Reliability Evaluation Based on Different Distributions of Random Load

    PubMed Central

    Gao, Peng; Xie, Liyang

    2013-01-01

    The reliability models of the components under the nonstationary random load are developed in this paper. Through the definition of the distribution of the random load, it can be seen that the conventional load-strength interference model is suitable for the calculation of the static reliability of the components, which does not reflect the dynamic change in the reliability and cannot be used to evaluate the dynamic reliability. Therefore, by developing an approach to converting the nonstationary random load into the random load whose pdf is the same at each moment when the random load applies, the reliability model based on the longitudinal distribution is derived. Moreover, through the definition of the transverse standard load and the transverse standard load coefficient, the reliability model based on the transverse distribution is derived. When the occurrence of the random load follows the Poisson process, the dynamic reliability models considering the strength degradation are derived. These models take the correlation between the random load and the strength into consideration. The result shows that the dispersion of the initial strength and that of the transverse standard load coefficient have great influences on the reliability and the hazard rate of the components. PMID:24223504
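
    The static load-strength interference calculation that the paper takes as its starting point can be sketched briefly; the normal models for load and strength are invented, and the Poisson extension uses the standard independence assumption rather than the paper's degradation and correlation modeling.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Assumed models (MPa): load L ~ Normal(300, 40), strength S ~ Normal(450, 30).
fL = stats.norm(300.0, 40.0)
S = stats.norm(450.0, 30.0)

# Static load-strength interference: R = P(S > L) = integral of f_L(l) * P(S > l) dl
lo, hi = fL.ppf(1e-10), fL.ppf(1.0 - 1e-10)
R, _ = quad(lambda l: fL.pdf(l) * S.sf(l), lo, hi)

# For normal L and S the closed form is Phi((mu_S - mu_L)/sqrt(sig_S^2 + sig_L^2)).
R_exact = stats.norm.cdf((450.0 - 300.0) / np.hypot(30.0, 40.0))
print(f"numerical R = {R:.6f}, closed form = {R_exact:.6f}")

# With loads arriving as a Poisson process (rate lam) and no strength degradation,
# per-application survival R gives the time-dependent R(t) = exp(-lam*t*(1 - R)).
lam, t = 2.0, 10.0   # assumed: 2 load applications per year, 10-year horizon
print(f"R({t} yr) = {np.exp(-lam * t * (1.0 - R)):.4f}")
```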

  15. Reliability modeling of fault-tolerant computer based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1987-01-01

    Digital fault-tolerant computer-based systems have become commonplace in military and commercial avionics. These systems hold the promise of increased availability, reliability, and maintainability over conventional analog-based systems through the application of replicated digital computers arranged in fault-tolerant configurations. Three tightly coupled factors of paramount importance, ultimately determining the viability of these systems, are reliability, safety, and profitability. Reliability, the major driver, affects virtually every aspect of design, packaging, and field operations, and eventually produces profit for commercial applications or increased national security. However, the utilization of digital computer systems makes the task of producing credible reliability assessments a formidable one for the reliability engineer. The root of the problem lies in the digital computer's unique adaptability to changing requirements, its computational power, and its ability to test itself efficiently. Addressed here are the nuances of modeling the reliability of systems with large state sizes, in the Markov sense, which result from systems based on replicated redundant hardware, and the modeling of factors which can reduce reliability without concomitant depletion of hardware. Advanced fault-handling models are described, and methods of acquiring and measuring parameters for these models are delineated.

  16. Reliability-based lifetime maintenance of aging highway bridges

    NASA Astrophysics Data System (ADS)

    Enright, Michael P.; Frangopol, Dan M.

    2000-06-01

    As the nation's infrastructure continues to age, the cost of maintaining it at an acceptable safety level continues to increase. In the United States, about one of every three bridges is rated structurally deficient and/or functionally obsolete. It will require about $80 billion to eliminate the current backlog of bridge deficiencies and maintain repair levels. Unfortunately, the financial resources allocated for these activities fall extremely short of the demand. Although several existing and emerging NDT techniques are available to gather inspection data, current maintenance planning decisions for deficient bridges are based on data from subjective condition assessments and do not consider the reliability of bridge components and systems. Recently, reliability-based optimum maintenance planning strategies have been developed. They can be used to predict inspection and repair times to achieve minimum life-cycle cost of deteriorating structural systems. In this study, a reliability-based methodology which takes into account loading randomness and history, and randomness in strength and degradation resulting from aggressive environmental factors, is used to predict the time-dependent reliability of aging highway bridges. A methodology for incorporating inspection data into reliability predictions is also presented. Finally, optimal lifetime maintenance strategies are identified, in which optimal inspection/repair times are found based on minimum expected life-cycle cost under prescribed reliability constraints. The influence of the discount rate on optimum solutions is evaluated.

  17. Reliability analysis based on a direct ship hull strength assessment

    NASA Astrophysics Data System (ADS)

    Feng, Guoqing; Wang, Dongsheng; Garbatov, Yordan; Guedes Soares, C.

    2015-12-01

    A method of reliability analysis based on a direct strength calculation employing the von Mises stress failure criterion is presented here. The short-term strain distributions of ship hull structural components are identified through statistical analysis of the wave-induced strain history, and the long-term distributions by weighted summation of the short-term strain distributions. The wave-induced long-term strain distribution is combined with the still-water strain. The extreme strain distribution of the response strain is obtained by statistical analysis of the combined strains. The limit state function of the reliability analysis is based on the von Mises stress failure criterion, including the related uncertainties due to the quality of the material and model uncertainty. The reliability index is calculated using the first-order reliability method (FORM), and a sensitivity analysis of each variable that affects the reliability is also presented.

  18. Reliability Performance Optimization of Meshed Electrical Distribution System Considering Customer and Energy based Reliability Indices

    NASA Astrophysics Data System (ADS)

    Arya, L. D.; Kela, K. B.

    2013-12-01

    This paper describes a methodology for determining the optimum failure rate and repair time for each component of a meshed distribution system. In this paper the reliability indices for a sample meshed network are optimized. An objective function incorporating customer- and energy-based reliability indices and their target values is formulated. These indices are functions of the failure rate and repair time of a section of a distribution network. Modifying the failure rate and repair time modifies the cost attached to them. Hence the optimization of the objective function is achieved by modifying the failure rate and repair time of each section of the meshed distribution system while accounting for a constraint on the allocated budget. The problem has been solved using population-based differential evolution and bare-bones particle swarm optimization techniques, and the results have been compared for a sample meshed distribution system.

  19. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    NASA Astrophysics Data System (ADS)

    Kryszczuk, Krzysztof; Richiardi, Jonas; Prodanov, Plamen; Drygajlo, Andrzej

    2007-12-01

    We present a methodology for reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  20. Fatigue reliability based optimal design of planar compliant micropositioning stages

    NASA Astrophysics Data System (ADS)

    Wang, Qiliang; Zhang, Xianmin

    2015-10-01

    Conventional compliant micropositioning stages are usually developed based on static strength and deterministic methods, which may lead to either unsafe or excessive designs. This paper presents a fatigue reliability analysis and optimal design of a three-degree-of-freedom (3 DOF) flexure-based micropositioning stage. Kinematic, modal, static, and fatigue stress modelling of the stage were conducted using the finite element method. The maximum equivalent fatigue stress in the hinges was derived using sequential quadratic programming. The fatigue strength of the hinges was obtained by considering various influencing factors. On this basis, the fatigue reliability of the hinges was analysed using the stress-strength interference method. Fatigue-reliability-based optimal design of the stage was then conducted using the genetic algorithm and MATLAB. To make fatigue life testing easier, a 1 DOF stage was then optimized and manufactured. Experimental results demonstrate the validity of the approach.

  1. Multi-mode reliability-based design of horizontal curves.

    PubMed

    Essa, Mohamed; Sayed, Tarek; Hussein, Mohamed

    2016-08-01

    Recently, reliability analysis has been advocated as an effective approach to account for uncertainty in the geometric design process and to evaluate the risk associated with a particular design. In this approach, a risk measure (e.g. the probability of noncompliance) is calculated to represent the probability that a specific design would not meet standard requirements. The majority of previous applications of reliability analysis in geometric design focused on evaluating the probability of noncompliance for only one mode of noncompliance, such as insufficient sight distance. However, in many design situations, more than one mode of noncompliance may be present (e.g. insufficient sight distance and vehicle skidding at horizontal curves). In these situations, utilizing a multi-mode reliability approach that considers more than one failure (noncompliance) mode is required. The main objective of this paper is to demonstrate the application of multi-mode (system) reliability analysis to the design of horizontal curves. The process is demonstrated by a case study of the Sea-to-Sky Highway, located between Vancouver and Whistler in southern British Columbia, Canada. Two noncompliance modes were considered: insufficient sight distance and vehicle skidding. The results show the importance of accounting for several noncompliance modes in the reliability model. The system reliability concept could be used in future studies to calibrate the design of various design elements in order to achieve consistent safety levels based on all possible modes of noncompliance. PMID:27180287

  2. Reliability, Compliance, and Security in Web-Based Course Assessments

    ERIC Educational Resources Information Center

    Bonham, Scott

    2008-01-01

    Pre- and postcourse assessment has become a very important tool for education research in physics and other areas. The web offers an attractive alternative to in-class paper administration, but concerns about web-based administration include reliability due to changes in medium, student compliance rates, and test security, both question leakage…

  3. Reliability and Efficiency of a DNA-Based Computation

    NASA Astrophysics Data System (ADS)

    Deaton, R.; Garzon, M.; Murphy, R. C.; Rose, J. A.; Franceschetti, D. R.; Stevens, S. E., Jr.

    1998-01-01

    DNA-based computing uses the tendency of nucleotide bases to bind (hybridize) in preferred combinations to do computation. Depending on reaction conditions, oligonucleotides can bind despite noncomplementary base pairs. These mismatched hybridizations are a source of false positives and negatives, which limit the efficiency and scalability of DNA-based computing. The ability of specific base sequences to support error-tolerant Adleman-style computation is analyzed, and criteria are proposed to increase reliability and efficiency. A method is given to calculate reaction conditions from estimates of DNA melting.

  4. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  5. Intelligent computer based reliability assessment of multichip modules

    NASA Astrophysics Data System (ADS)

    Grosse, Ian R.; Katragadda, Prasanna; Bhattacharya, Sandeepan; Kulkarni, Sarang

    1994-04-01

    To deliver reliable multichip modules (MCMs) in the face of rapidly changing technology, computer-based tools are needed for predicting the thermal and mechanical behavior of various MCM package designs and selecting the most promising design in terms of performance, robustness, and reliability. The design tool must be able to address new design technologies, manufacturing processes, novel materials, application criteria, and thermal environmental conditions. Reliability is one of the most important factors for determining design quality and hence must be a central consideration in the design of multichip module packages. Clearly, design engineers need computer-based simulation tools for rapid and efficient electrical, thermal, and mechanical modeling and optimization of advanced devices. For three-dimensional thermal and mechanical simulation of advanced devices, the finite element method (FEM) is increasingly becoming the numerical method of choice. FEM is a versatile and sophisticated numerical technique for solving the partial differential equations that describe the physical behavior of complex designs. AUTOTHERM(TM) is an MCM design tool developed by Mentor Graphics for Motorola, Inc. This tool performs thermal analysis of MCM packages using finite element analysis techniques. The tool uses the philosophy of object-oriented representation of components and simplified specification of boundary conditions for the thermal analysis, so that the user need not be an expert in finite element techniques. Different package types can be assessed and environmental conditions can be modeled. It also includes a detailed reliability module which allows the user to choose a desired failure mechanism (model). All the current tools perform thermal and/or stress analysis and do not address the issues of robustness and optimality of MCM designs, and the reliability prediction techniques are based on closed-form analytical models and can often fail to predict the cycles of failure (N

  6. Surrogate-based Reliability Analysis Using Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Li, Gang; Liu, Zhiqiang

    2010-05-01

    An approach to surrogate-based reliability analysis using a support vector machine (SVM) with Monte Carlo simulation is proposed. Efficient sampling techniques, such as uniform design and Latin hypercube sampling, are used, and the SVM is trained with sample pairs of input and output data obtained by finite element analysis. The trained SVM model, as a solver-surrogate model, is intended to approximate the real performance function. Since the selection of parameters for the SVM strongly affects its learning performance, a genetic algorithm (GA) is integrated into the construction of the SVM to optimize the relevant parameters. The influence of the parameters on the SVM is discussed and a methodology is proposed for selecting the SVM model. Support vector classification (SVC) based and support vector regression (SVR) based reliability analyses are studied. Several numerical examples demonstrate the efficiency and applicability of the proposed method.
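
    A minimal sketch of the SVC-based variant: a classifier is trained on safe/failed samples of a toy performance function (standing in for the finite element solver), and the failure probability is then estimated by Monte Carlo on the surrogate. The performance function, input distributions, and SVM parameters are illustrative; the GA parameter tuning described above is omitted.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def g(x1, x2):
    """Toy performance function: g > 0 safe, g <= 0 failed."""
    return x1**2 * x2 / 20.0 - 1.0

# Train the SVC surrogate on a space-filling random design.
Xtr = rng.uniform(low=[0.0, 0.0], high=[6.0, 6.0], size=(300, 2))
ytr = (g(Xtr[:, 0], Xtr[:, 1]) > 0).astype(int)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(Xtr, ytr)

# Monte Carlo on the surrogate; x1, x2 ~ Normal(3.5, 0.5) is assumed.
n = 100_000
Xmc = rng.normal(3.5, 0.5, size=(n, 2))
pf_surrogate = np.mean(clf.predict(Xmc) == 0)
pf_true = np.mean(g(Xmc[:, 0], Xmc[:, 1]) <= 0)
print(f"pf (SVC surrogate) = {pf_surrogate:.4f}, pf (true g) = {pf_true:.4f}")
```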

  7. Physics of Failure

    NASA Astrophysics Data System (ADS)

    Marder, Michael

    2012-10-01

    One of the questions solid state physics was long supposed to answer was why a glass shatters when you drop it on the floor but a spoon does not. It turned out not to be such an easy problem and was only occasionally addressed until a series of major accidents in the 1940s and 1950s directed scientific attention to it. I will talk about the basic ideas of fracture mechanics that emerged as the answer, and display some recent applications to failure of silicon, rubber, and graphene.

  8. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    PubMed

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTIs programs vary with respect to diagnostic suggestions? It was found that diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PMID:24749753

  9. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
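
    The linear-combination statement above can be made concrete with a short sketch; the failure modes, conditional probabilities, and losses below are invented for illustration.

```python
# Expected loss given failure for mutually exclusive failure modes:
# E[loss | failure] = sum_k P(mode k initiates failure) * E[loss | mode k]
modes = {
    "corrosion":    (0.50, 120_000.0),
    "fatigue":      (0.30, 450_000.0),
    "overpressure": (0.20, 900_000.0),
}

expected_loss = sum(p * c for p, c in modes.values())
print(f"E[loss | failure] = {expected_loss:,.0f}")  # 375,000
```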

  10. Reliability-based optimization under random vibration environment

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1981-01-01

    A methodology of formulating the optimum design problem for structural systems with random parameters and subjected to random vibration as a mathematical programming problem is presented. The proposed method is applied to the optimum design of a cantilever beam with a tip mass and a truss structure supporting a water tank. The excitations are assumed to be Gaussian processes and the geometric and material properties are taken to be normally distributed random variables. The probabilistic constraints are specified for individual failure modes since it is easier to specify the reliability level for each failure mode keeping in view the consequences of failure in that particular mode. The time parameter appearing in the random vibration based constraints is eliminated by replacing the probabilities of failure by suitable upper bounds. The numerical results demonstrate the feasibility and effectiveness of applying the reliability-based design concepts to structures with random parameters and operating in random vibration environment.

  11. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient for combining evidence from different sensors. However, in situations where the evidence is highly conflicting, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods. PMID:26797611

  12. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    PubMed

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient for combining evidence from different sensors. However, in situations where the evidence is highly conflicting, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods. PMID:26797611
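
    For concreteness, a minimal implementation of Dempster's rule of combination for two sensor reports over a frame of fault hypotheses is sketched below, with invented masses; the reliability-weighted modification of the evidence described in the abstracts above is not reproduced, only the classical rule it builds on.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (focal elements as frozensets) by
    Dempster's rule; conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + a * b
            else:
                conflict += a * b
    k = 1.0 - conflict
    return {A: v / k for A, v in combined.items()}, conflict

F1, F2, F3 = frozenset({"F1"}), frozenset({"F2"}), frozenset({"F3"})
# Two sensor reports over fault hypotheses; the second partly conflicts.
m_s1 = {F1: 0.6, F2: 0.3, frozenset({"F1", "F2"}): 0.1}
m_s2 = {F1: 0.4, F2: 0.2, F3: 0.4}

fused, conflict = dempster_combine(m_s1, m_s2)
print(f"conflict K = {conflict:.2f}")              # K = 0.64 here
for A, v in sorted(fused.items(), key=lambda kv: -kv[1]):
    print(set(A), round(v, 3))                     # F1: 0.778, F2: 0.222
```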

  13. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost, and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with redesign rules to simulate alternative future test and redesign outcomes to form a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign provides a company an opportunity to balance development costs versus performance by simultaneously determining the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require search in multiple candidate regions of the design space, expending most of the computation needed to define multiple alternate designs. Thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space to identify optima. The efficiency of this method

  14. Reliability-based analysis and design optimization for durability

    NASA Astrophysics Data System (ADS)

    Choi, Kyung K.; Youn, Byeng D.; Tang, Jun; Hardee, Edward

    2005-05-01

    In the Army, mechanical fatigue subject to external and inertial transient loads over the service life of mechanical systems often leads to structural failure due to accumulated damage. Structural durability analysis, which predicts the fatigue life of mechanical components subject to dynamic stresses and strains, is a compute-intensive multidisciplinary simulation process, since it requires the integration of several computer-aided engineering tools and considerable data communication and computation. Uncertainties in geometric dimensions due to manufacturing tolerances make the fatigue life of a mechanical component nondeterministic. Because uncertainty propagation to structural fatigue under transient dynamic loading is not only numerically complicated but also extremely computationally expensive, it is a challenging task to develop a structural durability-based design optimization process and a reliability analysis to ascertain whether the optimal design is reliable. The objective of this paper is the demonstration of an integrated CAD-based computer-aided engineering process to effectively carry out design optimization for structural durability, yielding a durable and cost-effectively manufacturable product. This paper shows preliminary results of reliability-based durability design optimization for the Army Stryker A-Arm.

  15. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.

  16. Limit states and reliability-based pipeline design. Final report

    SciTech Connect

    Zimmerman, T.J.E.; Chen, Q.; Pandey, M.D.

    1997-06-01

    This report provides the results of a study to develop limit states design (LSD) procedures for pipelines. Limit states design, also known as load and resistance factor design (LRFD), provides a unified approach to dealing with all relevant failure modes and combinations of concern. It explicitly accounts for the uncertainties that naturally occur in the determination of the loads which act on a pipeline and in the resistance of the pipe to failure. The load and resistance factors used are based on reliability considerations; however, the designer is not faced with carrying out probabilistic calculations. This work is done during development and periodic updating of the LSD document. This report provides background information concerning limit states and reliability-based design (Section 2), gives the limit states design procedures that were developed (Section 3), and provides results of the reliability analyses that were undertaken in order to partially calibrate the LSD method (Section 4). An appendix contains LSD design examples in order to demonstrate use of the method. Section 3, Limit States Design, has been written in the format of a recommended practice. It has been structured so that, in the future, it can easily be converted to a limit states design code format. Throughout the report, figures and tables are given at the end of each section, with the exception of Section 3, where, to facilitate understanding of the LSD method, they have been included with the text.

  17. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.

  18. Study of vertical breakwater reliability based on copulas

    NASA Astrophysics Data System (ADS)

    Dong, Sheng; Li, Jingjing; Li, Xue; Wei, Yong

    2016-04-01

    The reliability of a vertical breakwater is calculated using direct integration methods based on joint density functions. The horizontal and uplifting wave forces on the vertical breakwater can be well fitted by the lognormal and the Gumbel distributions, respectively. The joint distribution of the horizontal and uplifting wave forces is analyzed using different probabilistic distributions, including the bivariate logistic Gumbel distribution, the bivariate lognormal distribution, and three bivariate Archimedean copula functions constructed with different marginal distributions simultaneously. We use fully nested copulas to construct multivariate distributions taking into account the related variables. Different goodness-of-fit tests are carried out to determine the best bivariate copula model for wave forces on a vertical breakwater. We show that a bivariate model constructed with the Frank copula gives the best reliability analysis, using Gumbel and lognormal marginal distributions for the uplifting pressure and the horizontal wave force on a vertical breakwater, respectively. The results show that the failure probability of the vertical breakwater calculated by the multivariate density function is comparable to that obtained by the Joint Committee on Structural Safety methods. As copulas are suitable for constructing a bivariate or multivariate joint distribution, they have great potential in reliability analysis for other coastal structures.
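
    A small sketch of the joint exceedance calculation with a Frank copula, using the marginal families the abstract reports (lognormal for the horizontal force, Gumbel for the uplifting force) but with invented parameters, thresholds, and dependence strength.

```python
import numpy as np
from scipy import stats

def frank_copula(u, v, theta):
    """Frank copula C(u, v); theta != 0 sets the dependence strength."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

# Assumed marginals (parameters invented), forces in kN:
H = stats.lognorm(s=0.25, scale=600.0)     # horizontal wave force ~ lognormal
U = stats.gumbel_r(loc=150.0, scale=40.0)  # uplifting wave force ~ Gumbel

h, u = 900.0, 250.0                        # assumed design thresholds
Fh, Fu = H.cdf(h), U.cdf(u)
theta = 4.0                                # assumed Frank parameter

# P(H > h and U > u) via the survival form 1 - Fh - Fu + C(Fh, Fu)
p_joint = 1.0 - Fh - Fu + frank_copula(Fh, Fu, theta)
p_indep = (1.0 - Fh) * (1.0 - Fu)
print(f"joint exceedance: {p_joint:.4e} (independence: {p_indep:.4e})")
```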

  19. Improved reliability analysis method based on the failure assessment diagram

    NASA Astrophysics Data System (ADS)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.

  20. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, the interaction of different components, and their respective disciplines such as structures, materials, fluids, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be a superior technique to conventional reliability approaches based on utilizing failure rates derived from similar equipment or simply expert judgment.

  1. Reliability of metalloceramic and zirconia-based ceramic crowns.

    PubMed

    Silva, N R F A; Bonfante, E A; Zavanelli, R A; Thompson, V P; Ferencz, J L; Coelho, P G

    2010-10-01

    Despite the increasing utilization of all-ceramic crown systems, their mechanical performance relative to that of metal ceramic restorations (MCR) has yet to be determined. This investigation tested the hypothesis that MCR present higher reliability over two Y-TZP all-ceramic crown systems under mouth-motion fatigue conditions. A CAD-based tooth preparation with the average dimensions of a mandibular first molar was used as a master die to fabricate all restorations. One 0.5-mm Pd-Ag and two Y-TZP system cores were veneered with 1.5 mm porcelain. Crowns were cemented onto aged (60 days in water) composite (Z100, 3M/ESPE) reproductions of the die. Mouth-motion fatigue was performed, and use level probability Weibull curves were determined. Failure modes of all systems included chipping or fracture of the porcelain veneer initiating at the indentation site. Fatigue was an acceleration factor for all-ceramic systems, but not for the MCR system. The latter presented significantly higher reliability under mouth-motion cyclic mechanical testing. PMID:20660796

  2. Reliability of Metalloceramic and Zirconia-based Ceramic Crowns

    PubMed Central

    Silva, N.R.F.A.; Bonfante, E.A.; Zavanelli, R.A.; Thompson, V.P.; Ferencz, J.L.; Coelho, P.G.

    2010-01-01

    Despite the increasing utilization of all-ceramic crown systems, their mechanical performance relative to that of metal ceramic restorations (MCR) has yet to be determined. This investigation tested the hypothesis that MCR present higher reliability over two Y-TZP all-ceramic crown systems under mouth-motion fatigue conditions. A CAD-based tooth preparation with the average dimensions of a mandibular first molar was used as a master die to fabricate all restorations. One 0.5-mm Pd-Ag and two Y-TZP system cores were veneered with 1.5 mm porcelain. Crowns were cemented onto aged (60 days in water) composite (Z100, 3M/ESPE) reproductions of the die. Mouth-motion fatigue was performed, and use level probability Weibull curves were determined. Failure modes of all systems included chipping or fracture of the porcelain veneer initiating at the indentation site. Fatigue was an acceleration factor for all-ceramic systems, but not for the MCR system. The latter presented significantly higher reliability under mouth-motion cyclic mechanical testing. PMID:20660796

  3. Quantifying neurotransmission reliability through metrics-based information analysis.

    PubMed

    Brasselet, Romain; Johansson, Roland S; Arleo, Angelo

    2011-04-01

    We set forth an information-theoretical measure to quantify neurotransmission reliability while taking into full account the metrical properties of the spike train space. This parametric information analysis relies on similarity measures induced by the metrical relations between neural responses as spikes flow in. Thus, in order to assess the entropy, the conditional entropy, and the overall information transfer, this method does not require any a priori decoding algorithm to partition the space into equivalence classes. It therefore allows the optimal parameters of a class of distances to be determined with respect to information transmission. To validate the proposed information-theoretical approach, we study precise temporal decoding of human somatosensory signals recorded using microneurography experiments. For this analysis, we employ a similarity measure based on the Victor-Purpura spike train metrics. We show that with appropriate parameters of this distance, the relative spike times of the mechanoreceptors' responses convey enough information to perform optimal discrimination--defined as maximum metrical information and zero conditional entropy--of 81 distinct stimuli within 40 ms of the first afferent spike. The proposed information-theoretical measure proves to be a suitable generalization of Shannon mutual information in order to consider the metrics of temporal codes explicitly. It allows neurotransmission reliability to be assessed in the presence of large spike train spaces (e.g., neural population codes) with high temporal precision. PMID:21222522
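
    The Victor-Purpura metric underlying this analysis can be computed with a short dynamic program; the sketch below uses the standard costs (1 per spike insertion or deletion, q per unit of time shift) and invented spike times.

```python
def victor_purpura(s1, s2, q):
    """Victor-Purpura distance between two spike trains (sorted spike
    times in seconds); q (1/s) prices shifting a spike in time."""
    n, m = len(s1), len(s2)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = float(i)          # delete all i spikes
    for j in range(1, m + 1):
        d[0][j] = float(j)          # insert all j spikes
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + 1.0,     # delete a spike
                          d[i][j - 1] + 1.0,     # insert a spike
                          d[i - 1][j - 1] + q * abs(s1[i - 1] - s2[j - 1]))
    return d[n][m]

a = [0.010, 0.024, 0.031]   # afferent response 1 (s)
b = [0.012, 0.030]          # afferent response 2 (s)
for q in (0.0, 50.0, 1000.0):
    print(f"q = {q:6.1f} 1/s -> D = {victor_purpura(a, b, q):.3f}")
```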

  4. Value-based reliability-constrained power system stability

    NASA Astrophysics Data System (ADS)

    Elfayoumy, Mahmoud Khamil Aly

    For power system operation, the acceptance of reliability is determined on the basis of design criteria expressed in terms of contingencies of pre-defined severity that the system is able to withstand with acceptable transient and steady-state performance. The essential operating objective is to deliver power of acceptable quality to customers at minimum cost. The meanings of both "minimum cost" and "acceptable quality" are the result of a number of technical, economic, and ecological perceptions which evolve with time. In the work presented in this dissertation, an integrated framework for value-based, reliability-constrained power system stability with a benefit allocation scheme for multi-utility power systems is proposed. The scheme consists of two main parts. The first part proposes a scheme that uses multiple-objective optimization, employing a Goal Decision network approach and post-optimization analysis, to produce Pareto-optimal control strategies and recommend a solution to the decision-maker. The second part deals with the economic analysis of the Pareto-optimal control strategy in terms of cost/benefit analysis, using a set of indices to identify the most economical control strategy. It also presents a benefit allocation strategy based on a proposed scheme utilizing game theory to model the cooperation and competition between the different utilities forming the power system. The proposed integrated scheme was tested on IEEE test systems, and the results show that it is very efficient not only in optimizing both adequacy and security but also in assessing different control strategies economically, while providing an efficient strategy for benefit allocation.
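
    For the game-theoretic benefit allocation described above, the Shapley value is the canonical scheme: each utility receives its average marginal contribution over all orderings of the coalition. A small self-contained sketch; the three utilities and their coalition benefits are invented for illustration, and the dissertation's actual scheme may differ:

      from itertools import permutations

      def shapley_values(players, v):
          # v maps a frozenset of players to the benefit that coalition secures
          phi = {p: 0.0 for p in players}
          orders = list(permutations(players))
          for order in orders:
              coalition = set()
              for p in order:
                  before = v[frozenset(coalition)]
                  coalition.add(p)
                  phi[p] += v[frozenset(coalition)] - before  # marginal contribution
          return {p: phi[p] / len(orders) for p in players}

      # Hypothetical benefits (e.g., avoided outage cost) for each coalition of utilities
      v = {frozenset(): 0, frozenset("A"): 2, frozenset("B"): 3, frozenset("C"): 1,
           frozenset("AB"): 7, frozenset("AC"): 4, frozenset("BC"): 5, frozenset("ABC"): 10}
      print(shapley_values("ABC", v))  # allocations sum to v(ABC) = 10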

  5. A Measure for the Reliability of a Rating Scale Based on Longitudinal Clinical Trial Data

    ERIC Educational Resources Information Center

    Laenen, Annouschka; Alonso, Ariel; Molenberghs, Geert

    2007-01-01

    A new measure for reliability of a rating scale is introduced, based on the classical definition of reliability, as the ratio of the true score variance and the total variance. Clinical trial data can be employed to estimate the reliability of the scale in use, whenever repeated measurements are taken. The reliability is estimated from the…

  6. Reliability-based design optimization of multiphysics, aerospace systems

    NASA Astrophysics Data System (ADS)

    Allen, Matthew R.

    Aerospace systems are inherently plagued by uncertainties in their design, fabrication, and operation. Safety factors and expensive testing at the prototype level traditionally account for these uncertainties. Reliability-based design optimization (RBDO) can drastically decrease life-cycle development costs by accounting for the stochastic nature of the system response in the design process. The reduction in cost is amplified for conceptually new designs, for which no accepted safety factors currently exist. Aerospace systems often operate in environments dominated by multiphysics phenomena, such as the fluid-structure interaction of aeroelastic wings or the electrostatic-mechanical interaction of sensors and actuators. The analysis of such phenomena is generally complex and computationally expensive, and therefore is usually simplified or approximated in the design process. However, this leads to significant epistemic uncertainties in modeling, which may dominate the uncertainties for which the reliability analysis was intended. Therefore, the goal of this thesis is to present a RBDO framework that utilizes high-fidelity simulation techniques to minimize the modeling error for multiphysics phenomena. A key component of the framework is an extended reduced order modeling (EROM) technique that can analyze various states in the design or uncertainty parameter space at a reduced computational cost, while retaining characteristics of high-fidelity methods. The computational framework is verified and applied to the RBDO of aeroelastic systems and electrostatically driven sensors and actuators, utilizing steady-state analysis and design criteria. The framework is also applied to the design of electrostatic devices with transient criteria, which requires the use of the EROM technique to overcome the computational burden of multiple transient analyses.

  7. Reliability-based robust design optimization of vehicle components, Part II: Case studies

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2015-06-01

    The reliability-based optimization, reliability-based sensitivity analysis, and robust design methods were combined into an effective approach for reliability-based robust design optimization of vehicle components in Part I. Applications of the method to reliability-based robust optimization of vehicle components are discussed further in this paper. Examples of axles, torsion bars, and coil and composite springs are presented as numerical investigations. The results show that the proposed method is efficient for reliability-based robust design optimization of vehicle components.

  8. RELIABILITY BASED DESIGN OF FIXED FOUNDATION WIND TURBINES

    SciTech Connect

    Nichols, R.

    2013-10-14

    Recent analyses of offshore wind turbine foundations using both applicable API and IEC standards show that the total load demand from wind and waves is greatest in wave-driven storms. Further, analysis of overturning moment (OTM) loads reveals that impact forces exerted by breaking waves are the largest contributor to OTM in large storms at wind speeds above the operating range of 25 m/s. Currently, no codes or standards for offshore wind power generators have been adopted by the Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE) for use on the Outer Continental Shelf (OCS). In current design methods based on allowable stress design (ASD), the uncertainty in the variation of loads transferred to the foundation and in the geotechnical capacity of the soil and rock to support those loads is incorporated into a factor of safety. Sources of uncertainty include spatial and temporal variation of engineering properties, reliability of property measurements, applicability and sufficiency of sampling and testing methods, modeling errors, and variability of estimated load predictions. In ASD these sources of variability are generally given qualitative rather than quantitative consideration. The IEC 61400-3 design standard for offshore wind turbines is based on ASD methods. Load and resistance factor design (LRFD) methods are being increasingly used in the design of structures. Uncertainties such as those listed above can be included quantitatively in the LRFD process, in which load factors and resistance factors are statistically based. This type of analysis recognizes that there is always some probability of failure and enables that probability to be quantified. This paper presents an integrated approach consisting of field observations and numerical simulation to establish the distribution of loads from breaking waves to support the LRFD of fixed offshore foundations.
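
    The LRFD point above, that the probability of failure can be quantified rather than hidden in a single factor of safety, can be illustrated with a direct Monte Carlo estimate. The distributions, units, and factor values below are illustrative assumptions, not the paper's data:

      import numpy as np

      rng = np.random.default_rng(1)
      N = 1_000_000

      # Illustrative lognormal models of breaking-wave OTM demand and
      # foundation overturning capacity (MN*m)
      demand   = rng.lognormal(mean=np.log(40.0), sigma=0.35, size=N)
      capacity = rng.lognormal(mean=np.log(90.0), sigma=0.20, size=N)

      pf = np.mean(capacity < demand)          # quantified probability of failure
      print(f"Pf = {pf:.2e}")

      # Corresponding LRFD-style check with assumed statistically based factors
      gamma_load, phi_res = 1.35, 0.80
      nominal_demand, nominal_capacity = 40.0, 90.0
      print("LRFD check passes:", phi_res * nominal_capacity >= gamma_load * nominal_demand)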

  9. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    SciTech Connect

    Boring, Ronald Laurids; Shirley, Rachel Elizabeth; Joe, Jeffrey Clark; Mandelli, Diego

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  10. High reliability outdoor sonar prototype based on efficient signal coding.

    PubMed

    Alvarez, Fernando J; Ureña, Jesús; Mazo, Manuel; Hernández, Alvaro; García, Juan J; de Marziani, Carlos

    2006-10-01

    Many mobile robots and autonomous vehicles designed for outdoor operation have incorporated ultrasonic sensors in their navigation systems, whose function is mainly to avoid possible collisions with very close obstacles. The use of these systems in more precise tasks requires signal encoding and the incorporation of pulse compression techniques that have already been used with success in the design of high-performance indoor sonars. However, the transmission of ultrasonic encoded signals outdoors entails a new challenge because of the effects of atmospheric turbulence. This phenomenon causes random fluctuations in the phase and amplitude of traveling acoustic waves, a fact that can make the encoded signal completely unrecognizable by its matched receiver. Atmospheric turbulence is investigated in this work, with the aim of determining the conditions under which it is possible to assure the reliable outdoor operation of an ultrasonic pulse compression system. As a result of this analysis, a novel sonar prototype based on complementary sequences coding is developed and experimentally tested. This encoding scheme provides the system with very useful additional features, namely, high robustness to noise, multi-mode operation capability (simultaneous emissions with minimum cross talk interference), and the possibility of applying an efficient detection algorithm that notably decreases the hardware resource requirements. PMID:17036794
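
    The complementary-sequence coding mentioned here has a compact numerical demonstration: a Golay pair has aperiodic autocorrelations whose sidelobes cancel exactly when summed, which is what makes the matched-filter output so robust. A sketch of the standard recursive construction:

      import numpy as np

      def golay_pair(doublings):
          # Start from the trivial pair and double the length each step:
          # (a, b) -> (a||b, a||-b) preserves complementarity
          a, b = np.array([1]), np.array([1])
          for _ in range(doublings):
              a, b = np.concatenate([a, b]), np.concatenate([a, -b])
          return a, b

      a, b = golay_pair(5)                      # length-32 complementary pair
      acf = np.correlate(a, a, "full") + np.correlate(b, b, "full")
      # Sum of autocorrelations: a single peak of 2N at zero lag, zero elsewhere
      peak = len(a) - 1
      assert acf[peak] == 2 * len(a)
      assert np.all(np.delete(acf, peak) == 0)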

  11. Optimum Performance-Based Reliability Design of Structures

    NASA Astrophysics Data System (ADS)

    Papadrakakis, M.; Fragiadakis, M.; Lagaros, N. D.

    The objective of this paper is to present a performance-based design procedure for steel structures in the framework of structural optimization. The structural performance is evaluated by means of the reliability demand and resistance methodology of the FEMA-350 (Federal Emergency Management Agency) guidelines, where the uncertainties and randomness in capacity and seismic demand are taken into account in a consistent manner. The structure has to be able to respond at different hazard levels with a desired confidence. Both nonlinear static and nonlinear dynamic analysis procedures are used in order to obtain the response at two hazard levels. The design procedure is performed in a structural optimization environment, where the Evolution Strategies algorithm is implemented for the solution of the optimization problem. In order to handle the excessive computational cost, the inelastic time-history analyses are performed in a parallel computing environment. The objective of the study is to obtain the design with the least material weight, and thus the lowest cost, that is capable of responding with the desired confidence at each performance level, following the specifications of FEMA-350.

  12. Reliable Freestanding Position-Based Routing in Highway Scenarios

    PubMed Central

    Galaviz-Mosqueda, Gabriel A.; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering free-space propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159

  13. Reliable freestanding position-based routing in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Gabriel A; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering free-space propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159

  14. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need to undertake a reliable mission. These objectives have been structured to lead mission planning through construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it holds the opportunity to enable new directions as NASA moves forward in tackling the challenges of space exploration.

  15. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization (SDO) methodology has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. The solution to the stochastic optimization yields the weight of the structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, which corresponds to a reliability p of unity (p = 1). Weight can be reduced to a small value for the most failure-prone design, with a reliability that approaches zero (p = 0). Reliability can be changed for different components of an airframe structure. For example, the landing gear can be designed for a very high reliability, whereas reliability can be reduced somewhat for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code as the deterministic analysis tool, (2) the fast probabilistic integrator (the FPI module of the NESSUS software) as the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards as the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.

  16. Reliability-Based Life Assessment of Stirling Convertor Heater Head

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Halford, Gary R.; Korovaichuk, Igor

    2004-01-01

    Onboard radioisotope power systems being developed and planned for NASA's deep-space missions require reliable design lifetimes of up to 14 yr. The structurally critical heater head of the high-efficiency Stirling power convertor has undergone extensive computational analysis of operating temperatures, stresses, and creep resistance of the thin-walled Inconel 718 bill of material. A preliminary assessment of the effect of uncertainties in the material behavior was also performed. Creep failure resistance of the thin-walled heater head could show variation due to small deviations in the manufactured thickness and in uncertainties in operating temperature and pressure. Durability prediction and reliability of the heater head are affected by these deviations from nominal design conditions. Therefore, it is important to include the effects of these uncertainties in predicting the probability of survival of the heater head under mission loads. Furthermore, it may be possible for the heater head to experience rare incidences of small temperature excursions of short duration. These rare incidences would affect the creep strain rate and, therefore, the life. This paper addresses the effects of such rare incidences on the reliability. In addition, the sensitivities of variables affecting the reliability are quantified, and guidelines developed to improve the reliability are outlined. Heater head reliability is being quantified with data from NASA Glenn Research Center's accelerated benchmark testing program.

  17. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W.; Gumbert, Clyde R.; Newman, Perry A.

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The optimal solutions associated with the MPP provide measurements related to safety probability. This study focuses on two commonly used approximate probability integration methods; i.e., the Reliability Index Approach (RIA) and the Performance Measurement Approach (PMA). Their reliability sensitivity equations are first derived in this paper, based on the derivatives of their respective optimal solutions. Examples are then provided to demonstrate the use of these derivatives for better reliability analysis and Reliability-Based Design Optimization (RBDO).
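
    A minimal sketch of the MPP search that both RIA and PMA build on: the classic HLRF iteration in standard normal space, which converges to the most probable point and yields the reliability index beta = ||u*|| and the first-order estimate Pf ~ Phi(-beta). The limit state below is a made-up example, not one from the paper:

      import numpy as np
      from scipy.stats import norm

      def gradient(g, u, h=1e-6):
          # Central finite-difference gradient of the limit state
          return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h) for e in np.eye(len(u))])

      def form_hlrf(g, ndim, tol=1e-8, itmax=100):
          u = np.zeros(ndim)                          # start at the mean point
          for _ in range(itmax):
              gval, dg = g(u), gradient(g, u)
              u_new = dg * (dg @ u - gval) / (dg @ dg)  # HLRF update
              if np.linalg.norm(u_new - u) < tol:
                  u = u_new
                  break
              u = u_new
          beta = np.linalg.norm(u)
          return beta, norm.cdf(-beta), u             # index, first-order Pf, MPP

      # Hypothetical limit state (failure when g <= 0), already in standard normal space
      g = lambda u: 3.0 - u[0] - 0.2 * u[1] ** 2
      beta, pf, mpp = form_hlrf(g, 2)
      print(f"beta = {beta:.3f}, Pf ~ {pf:.2e}, MPP = {mpp}")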

  18. The Specification-Based Validation of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  19. Routing Protocol Based on Link Reliability for WSN

    NASA Astrophysics Data System (ADS)

    Weipeng, Jing; Yaqiu, Liu

    In this paper, a link-reliability strategy constructed from the remaining energy and communication cost of nodes is defined as the topology weight, to synthetically reflect the energy efficiency of dominators, and an energy- and communication-cost-based routing protocol (ECCR) is proposed to balance the average energy consumption within clusters and minimize the communication cost. Node residual energy and the distance between the sink and a node are used in competing for cluster head; at the same time, in order to reduce cluster-head energy costs, link reliability and hop count are used to establish the topology. The experimental results show that the algorithm not only saves energy, but also ensures the reliability of topology links and extends the network life cycle efficiently.

  20. Reliability Generalization of Curriculum-Based Measurement Reading Aloud: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Yeo, Seungsoo

    2011-01-01

    The purpose of this study was to employ the meta-analytic method of Reliability Generalization to investigate the magnitude and variability of reliability estimates obtained across studies using Curriculum-Based Measurement reading aloud. Twenty-eight studies that met the inclusion criteria were used to calculate the overall mean reliability of…

  1. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Maintenance Programs E Appendix E to Part 238 Transportation Other Regulations Relating to Transportation... STANDARDS Pt. 238, App. E Appendix E to Part 238—General Principles of Reliability-Based Maintenance... reliability beyond the design reliability. (e) When a maintenance program is developed, it includes tasks...

  2. Reliability-based condition assessment of steel containment and liners

    SciTech Connect

    Ellingwood, B.; Bhattacharya, B.; Zheng, R.

    1996-11-01

    Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs.
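
    The Markov model mentioned at the end of this abstract can be sketched directly: discretize the damage condition into states, propagate the state-probability vector through a one-year transition matrix, and read reliability as the probability of not having reached the absorbing failed state. The transition probabilities below are invented for illustration, not taken from the report:

      import numpy as np

      # States: 0 intact, 1 minor corrosion, 2 severe corrosion, 3 failed (absorbing)
      P = np.array([[0.95, 0.05, 0.00, 0.00],
                    [0.00, 0.90, 0.10, 0.00],
                    [0.00, 0.00, 0.85, 0.15],
                    [0.00, 0.00, 0.00, 1.00]])

      p = np.array([1.0, 0.0, 0.0, 0.0])   # new liner: intact with certainty
      for year in range(40):                # 40-year service period
          p = p @ P
      print(f"State probabilities after 40 years: {np.round(p, 3)}")
      print(f"Reliability (not failed): {1.0 - p[3]:.3f}")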

  3. Fatigue reliability based on residual strength model with hybrid uncertain parameters

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Qiu, Zhi-Ping

    2012-02-01

    The aim of this paper is to evaluate the fatigue reliability with hybrid uncertain parameters based on a residual strength model. By solving the non-probabilistic set-based reliability problem and analyzing the reliability with randomness, the fatigue reliability with hybrid parameters can be obtained. The presented hybrid model can adequately consider all uncertainties affecting the fatigue reliability with hybrid uncertain parameters. A comparison among the presented hybrid model, non-probabilistic set-theoretic model and the conventional random model is made through two typical numerical examples. The results show that the presented hybrid model, which can ensure structural security, is effective and practical.

  4. Neural Networks Based Approach to Enhance Space Hardware Reliability

    NASA Technical Reports Server (NTRS)

    Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.

    2011-01-01

    This paper demonstrates the use of Neural Networks as a device modeling tool to increase the reliability analysis accuracy of circuits targeted for space applications. The paper tackles a number of case studies of relevance to the design of Flight hardware. The results show that the proposed technique generates more accurate models than the ones regularly used to model circuits.

  5. Architecture-Based Reliability Analysis of Web Services

    ERIC Educational Resources Information Center

    Rahmani, Cobra Mariam

    2012-01-01

    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  6. A Latent-Trait Based Reliability Estimate and Upper Bound.

    ERIC Educational Resources Information Center

    Nicewander, W. Alan

    1990-01-01

    An estimate and upper-bound estimate for the reliability of a test composed of binary items is derived from the multidimensional latent trait theory of R. D. Bock and M. Aitken (1981). The practical uses of such estimates are discussed. (SLD)

  7. Reliability and Validity of Curriculum-Based Informal Reading Inventories.

    ERIC Educational Resources Information Center

    Fuchs, Lynn; And Others

    A study was conducted to explore the reliability and validity of three prominent procedures used in informal reading inventories (IRIs): (1) choosing a 95% word recognition accuracy standard for determining student instructional level, (2) arbitrarily selecting a passage to represent the difficulty level of a basal reader, and (3) employing…

  8. A reliability and mass perspective of SP-100 Stirling cycle lunar-base powerplant designs

    NASA Technical Reports Server (NTRS)

    Bloomfield, Harvey S.

    1991-01-01

    The purpose was to obtain reliability and mass perspectives on selection of space power system conceptual designs based on SP-100 reactor and Stirling cycle power-generation subsystems. The approach taken was to: (1) develop a criterion for an acceptable overall reliability risk as a function of the expected range of emerging technology subsystem unit reliabilities; (2) conduct reliability and mass analyses for a diverse matrix of 800-kWe lunar-base design configurations employing single and multiple powerplants with both full and partial subsystem redundancy combinations; and (3) derive reliability and mass perspectives on selection of conceptual design configurations that meet an acceptable reliability criterion with the minimum system mass increase relative to reference powerplant design. The developed perspectives provided valuable insight into the considerations required to identify and characterize high-reliability and low-mass lunar-base powerplant conceptual design.
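
    The redundancy trade explored here rests on standard series/k-out-of-n reliability algebra: a subsystem with n identical units of which at least k must work has binomial survival probability, and independent subsystems in series multiply. A sketch with invented unit reliabilities, not the SP-100 study's values:

      from math import comb

      def k_of_n(k, n, r):
          # Probability that at least k of n independent units (each reliability r) survive
          return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

      # Series system of subsystems, each with its own redundancy level
      reactor  = k_of_n(1, 1, 0.990)    # no redundancy
      stirling = k_of_n(3, 4, 0.950)    # partial redundancy: 3-out-of-4 convertors
      radiator = k_of_n(7, 8, 0.980)    # one spare panel
      plant = reactor * stirling * radiator
      print(f"Single-powerplant reliability: {plant:.4f}")

      # Full-plant redundancy: two plants, at least one must survive
      print(f"Two-powerplant reliability:    {1 - (1 - plant)**2:.4f}")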

  9. Optimization Based Efficiencies in First Order Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Peck, Jeffrey A.; Mahadevan, Sankaran

    2003-01-01

    This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank-one updating technique. In problems that use a commercial code as a black box, the gradient calculations are usually done using a finite-difference approach, which becomes very expensive for large system models. The proposed method replaces the finite-difference gradient calculations in a standard first-order reliability method (FORM) with Broyden's quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results are compared to standard FORM results. It is found that BFORM typically requires fewer function evaluations than FORM to converge to the same answer.
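
    The core of the BFORM idea is the rank-one (Broyden) secant update, which refreshes the limit-state gradient from one FORM iterate to the next without another finite-difference sweep through the black-box code. A sketch of the update itself; names are mine:

      import numpy as np

      def broyden_update(grad_prev, u_prev, g_prev, u_new, g_new):
          """Secant update of the limit-state gradient: enforce grad_new @ du = dg."""
          du = u_new - u_prev
          dg = g_new - g_prev                 # scalar change in the limit state
          return grad_prev + (dg - grad_prev @ du) / (du @ du) * du

      # Usage inside FORM: pay for one finite-difference gradient up front, then
      # update it each iteration from quantities the search has already computed.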

  10. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).

  11. A simple reliability-based topology optimization approach for continuum structures using a topology description function

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Wen, Guilin; Zhi Zuo, Hao; Qing, Qixiang

    2016-07-01

    The structural configuration obtained by deterministic topology optimization may represent a low reliability level and lead to a high failure rate. Therefore, it is necessary to take reliability into account for topology optimization. By integrating reliability analysis into topology optimization problems, a simple reliability-based topology optimization (RBTO) methodology for continuum structures is investigated in this article. The two-layer nesting involved in RBTO, which is time consuming, is decoupled by the use of a particular optimization procedure. A topology description function approach (TOTDF) and a first order reliability method are employed for topology optimization and reliability calculation, respectively. The problem of the non-smoothness inherent in TOTDF is dealt with using two different smoothed Heaviside functions and the corresponding topologies are compared. Numerical examples demonstrate the validity and efficiency of the proposed improved method. In-depth discussions are also presented on the influence of different structural reliability indices on the final layout.
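
    The smoothed Heaviside functions compared in the article are typically a C1 polynomial ramp and a tanh sigmoid; both map the topology description function to a material density while keeping the problem differentiable. A sketch of two common forms (the article's exact functions may differ):

      import numpy as np

      def heaviside_poly(phi, eps):
          # C1 cubic smoothing over the band |phi| <= eps
          s = phi / eps
          h = 0.75 * (s - s**3 / 3.0) + 0.5
          return np.where(phi < -eps, 0.0, np.where(phi > eps, 1.0, h))

      def heaviside_tanh(phi, eps):
          # Smooth everywhere; eps controls the transition width
          return 0.5 * (1.0 + np.tanh(phi / eps))

      phi = np.linspace(-1.0, 1.0, 9)           # sample values of the TDF
      print(np.round(heaviside_poly(phi, 0.3), 3))
      print(np.round(heaviside_tanh(phi, 0.3), 3))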

  12. A perspective on the reliability of MEMS-based components for telecommunications

    NASA Astrophysics Data System (ADS)

    McNulty, John C.

    2008-02-01

    Despite the initial skepticism of OEM companies regarding reliability, MEMS-based devices are increasingly common in optical networking. This presentation will discuss the use and reliability of MEMS in a variety of network applications, from tunable lasers and filters to variable optical attenuators and dynamic channel equalizers. The failure mechanisms of these devices will be addressed in terms of reliability physics, packaging methodologies, and process controls. Typical OEM requirements will also be presented, including testing beyond the scope of Telcordia qualification standards. The key conclusion is that, with sufficiently robust design and manufacturing controls, MEMS-based devices can meet or exceed the demanding reliability requirements for telecommunications components.

  13. Reliability-based failure analysis of brittle materials

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Ghosn, Louis J.

    1989-01-01

    The reliability of brittle materials under a generalized state of stress is analyzed using the Batdorf model. The model is modified to include the reduction in shear due to the effect of the compressive stress on the microscopic crack faces. The combined effect of both surface and volume flaws is included. Due to the nature of fracture of brittle materials under compressive loading, the component is modeled as a series system in order to establish bounds on the probability of failure. A computer program was written to determine the probability of failure employing data from a finite element analysis. The analysis showed that for tensile loading a single crack will be the cause of total failure but under compressive loading a series of microscopic cracks must join together to form a dominant crack.
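
    The weakest-link computation described here has a simple discrete form once a finite element solution is available: under a two-parameter Weibull volume-flaw model, each element contributes a risk-of-rupture term, and the component fails if any flaw does. The sketch below is the plain Weibull special case with invented numbers, not the full Batdorf shear-sensitive treatment:

      import numpy as np

      # Element principal stresses (MPa) and volumes (mm^3) from a hypothetical FE model
      sigma = np.array([220.0, 180.0, 90.0, -50.0, 140.0])
      vol   = np.array([2.0, 3.0, 5.0, 4.0, 2.5])
      m, sigma0 = 10.0, 300.0          # assumed Weibull modulus and scale

      # Only tensile elements contribute; compression is ignored in this simple form
      tens = sigma > 0.0
      risk = np.sum(vol[tens] * (sigma[tens] / sigma0) ** m)
      pf = 1.0 - np.exp(-risk)         # series-system (weakest-link) failure probability
      print(f"Pf = {pf:.3e}")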

  14. A damage mechanics based approach to structural deterioration and reliability

    SciTech Connect

    Bhattacharya, B.; Ellingwood, B.

    1998-02-01

    Structural deterioration often occurs without perceptible manifestation. Continuum damage mechanics defines structural damage in terms of the material microstructure and relates the damage variable to the macroscopic strength or stiffness of the structure. This enables one to predict the state of damage prior to the initiation of a macroscopic flaw and allows one to estimate the residual strength/service life of an existing structure. The accumulation of damage is a dissipative process that is governed by the laws of thermodynamics. Partial differential equations for damage growth in terms of the Helmholtz free energy are derived from fundamental thermodynamic conditions. Closed-form solutions to the equations are obtained under uniaxial loading for ductile deformation damage as a function of plastic strain, for creep damage as a function of time, and for fatigue damage as a function of the number of cycles. The proposed damage growth model is extended into the stochastic domain by considering fluctuations in the free energy, and closed-form solutions of the resulting stochastic differential equation are obtained in each of the three cases mentioned above. A reliability analysis of a ring-stiffened cylindrical steel shell subjected to corrosion, accidental pressure, and temperature is performed.

  15. Hard and Soft Constraints in Reliability-Based Design Optimization

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2006-01-01

    This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm-bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, and in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, this methodology allows one (i) to determine whether a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) to generate conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives, and (iii) to identify the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds on the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method. This method integrates the upper bounds, for which closed-form expressions are derived, with conditional sampling. In addition, an l-infinity formulation for the efficient manipulation of hyper-rectangular sets is also proposed.

  16. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    ERIC Educational Resources Information Center

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  17. A Generalizability Approach To Evaluating the Reliability of Testlet-Based Test Scores.

    ERIC Educational Resources Information Center

    Lee, Guemin; Frisbie, David A.

    Previous studies have indicated that the reliability of test scores composed of testlets might be overestimated by conventional item-based reliability estimation methods (R. Thorndike, 1953; A. Anastasi, 1988; S. Sireci, D. Thissen, and H. Wainer, 1991; H. Wainer and D. Thissen, 1996). This study used generalizability theory to investigate the…

  18. Is School-Based Height and Weight Screening of Elementary Students Private and Reliable?

    ERIC Educational Resources Information Center

    Stoddard, Sarah A.; Kubik, Martha Y.; Skay, Carol

    2008-01-01

    The Institute of Medicine recommends school-based body mass index (BMI) screening as an obesity prevention strategy. While school nurses have provided height/weight screening for years, little has been published describing measurement reliability or process. This study evaluated the reliability of height/weight measures collected by school nurses…

  19. A GIS-based software for lifeline reliability analysis under seismic hazard

    NASA Astrophysics Data System (ADS)

    Sevtap Selcuk-Kestel, A.; Sebnem Duzgun, H.; Oduncuoglu, Lutfi

    2012-05-01

    Lifelines are vital networks, and it is important that those networks remain functional after major natural disasters such as earthquakes. Assessing the reliability of lifelines requires spatial analysis of the lifelines with respect to a given earthquake hazard map. In this paper, a GIS-based software for the spatial assessment of lifeline reliability, developed in the GeoTools environment, is presented. The software imports seismic hazard and lifeline network layers and creates a gridded network structure. It then adopts a network reliability algorithm to calculate upper and lower bounds on the system reliability of the lifeline under seismic hazard. The software enables the user to visualize the reliability values in graphical form, as well as a thematic lifeline reliability map in which colors indicate the reliability level of each link and of the overall network. It also provides functions for saving the analysis results in shapefile format. The software was tested and validated on an application taken from the literature, part of the water distribution system of Bursa, Turkey. The developed GIS-based software module, which creates a GIS-based reliability map of lifelines under seismic hazard, is user-friendly, modifiable, fast in execution, illustrative, and validated against existing studies in the literature.

  20. Reliability and Validity of the Evidence-Based Practice Confidence (EPIC) Scale

    ERIC Educational Resources Information Center

    Salbach, Nancy M.; Jaglal, Susan B.; Williams, Jack I.

    2013-01-01

    Introduction: The reliability, minimal detectable change (MDC), and construct validity of the evidence-based practice confidence (EPIC) scale were evaluated among physical therapists (PTs) in clinical practice. Methods: A longitudinal mail survey was conducted. Internal consistency and test-retest reliability were estimated using Cronbach's alpha…

  1. Expected-Credibility-Based Job Scheduling for Reliable Volunteer Computing

    NASA Astrophysics Data System (ADS)

    Watanabe, Kan; Fukushi, Masaru; Horiguchi, Susumu

    This paper proposes an expected-credibility-based job scheduling method for volunteer computing (VC) systems with malicious participants who return erroneous results. Credibility-based voting is a promising approach to guaranteeing the computational correctness of VC systems. However, it relies on a simple round-robin job scheduling method that does not consider the jobs' order of execution, thereby resulting in numerous unnecessary job allocations and performance degradation of VC systems. To improve the performance of VC systems, the proposed job scheduling method dynamically selects a job to be executed prior to others based on two novel metrics: the expected credibility and the expected number of results for each job. Simulation of VC systems shows that the proposed method can improve system performance by up to 11%; it always outperforms the original round-robin method irrespective of the values of unknown parameters such as the population and behavior of saboteurs.

  2. A Reliable Homemade Electrode Based on Glassy Polymeric Carbon

    ERIC Educational Resources Information Center

    Santos, Andre L.; Takeuchi, Regina M.; Oliviero, Herilton P.; Rodriguez, Marcello G.; Zimmerman, Robert L.

    2004-01-01

    The production of a GPC-based material by submitting a cross-linked resin precursor to controlled thermal conditions is discussed. The precursor material is prepolymerized at 60 degrees Celsius in a mold and is carbonized in an inert atmosphere by slowly raising the temperature; the slow rise is performed to avoid change in the shape of the carbonization…

  3. Measurement-based reliability prediction methodology. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Linn, Linda Shen

    1991-01-01

    In the past, analytical and measurement-based models were developed to characterize computer system behavior. An open issue is how these models can be used, if at all, for system design improvement. That issue is addressed here. A combined statistical/analytical approach that uses measurements from one environment to model the system failure behavior in a new environment is proposed. A comparison of the predicted results with the actual data from the new environment shows a close correspondence.

  4. Reliability Modeling Development and Its Applications for Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.

  5. Reliable location-based services from radio navigation systems.

    PubMed

    Qiu, Di; Boneh, Dan; Lo, Sherman; Enge, Per

    2010-01-01

    Loran is a radio-based navigation system originally designed for naval applications. We show that Loran-C's high-power and high repeatable accuracy are fantastic for security applications. First, we show how to derive a precise location tag--with a sensitivity of about 20 meters--that is difficult to project to an exact location. A device can use our location tag to block or allow certain actions, without knowing its precise location. To ensure that our tag is reproducible we make use of fuzzy extractors, a mechanism originally designed for biometric authentication. We build a fuzzy extractor specifically designed for radio-type errors and give experimental evidence to show its effectiveness. Second, we show that our location tag is difficult to predict from a distance. For example, an observer cannot predict the location tag inside a guarded data center from a few hundreds of meters away. As an application, consider a location-aware disk drive that will only work inside the data center. An attacker who steals the device and is capable of spoofing Loran-C signals, still cannot make the device work since he does not know what location tag to spoof. We provide experimental data supporting our unpredictability claim. PMID:22163532

  6. Construction of reliable radiocarbon-based chronologies for speleothems

    NASA Astrophysics Data System (ADS)

    Lechleitner, Franziska; Fohlmeister, Jens; McIntyre, Cameron; Baldini, Lisa M.; Jamieson, Robert A.; Hercman, Helena; Gasiorowski, Michal; Pawlak, Jacek; Stefaniak, Krzysztof; Socha, Pawel; Eglinton, Timothy I.; Baldini, James U. L.

    2016-04-01

    Speleothems have become one of the most widely applied archives for paleoclimate research. One of their key advantages is their amenability to U-series dating, often producing excellent high-precision chronologies. However, stalagmites with high detrital Th or very low U concentrations are problematic to date using U-series and sometimes need to be discarded from further paleoclimate analysis. Radiocarbon chronologies could present an alternative for stalagmites that cannot be dated using U-series, if offsets from the "dead carbon fraction" (DCF) can be resolved. The DCF is a variable reservoir effect introduced by the addition of 14C-dead carbon from host rock dissolution and soil organic matter. We present a novel age modeling technique that provides accurate 14C-based chronologies for stalagmites. As this technique focuses on the long-term decay pattern of 14C, it is only applicable to stalagmites that show no secular variability in their 14C-depth profiles, but it is independent of short-term DCF variations. In order to determine whether a stalagmite is suitable for this method without direct knowledge of long-term trends in the DCF, we highlight how other geochemical proxies (δ13C, Mg/Ca) can provide additional information on changes in karst hydrology, soil conditions, and climate that would affect the DCF. We apply our model to a previously published U-Th dated stalagmite 14C dataset from Heshang Cave, China, with excellent results, followed by a previously 'undateable' stalagmite from southern Poland.

  7. Reliable Location-Based Services from Radio Navigation Systems

    PubMed Central

    Qiu, Di; Boneh, Dan; Lo, Sherman; Enge, Per

    2010-01-01

    Loran is a radio-based navigation system originally designed for naval applications. We show that Loran-C’s high-power and high repeatable accuracy are fantastic for security applications. First, we show how to derive a precise location tag—with a sensitivity of about 20 meters—that is difficult to project to an exact location. A device can use our location tag to block or allow certain actions, without knowing its precise location. To ensure that our tag is reproducible we make use of fuzzy extractors, a mechanism originally designed for biometric authentication. We build a fuzzy extractor specifically designed for radio-type errors and give experimental evidence to show its effectiveness. Second, we show that our location tag is difficult to predict from a distance. For example, an observer cannot predict the location tag inside a guarded data center from a few hundreds of meters away. As an application, consider a location-aware disk drive that will only work inside the data center. An attacker who steals the device and is capable of spoofing Loran-C signals, still cannot make the device work since he does not know what location tag to spoof. We provide experimental data supporting our unpredictability claim. PMID:22163532

  8. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber-physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  9. Reliability estimation for cutting tools based on logistic regression model using vibration signals

    NASA Astrophysics Data System (ADS)

    Chen, Baojia; Chen, Xuefeng; Li, Bing; He, Zhengjia; Cao, Hongrui; Cai, Gaigai

    2011-10-01

    As an important part of a CNC machine, the reliability of cutting tools influences the overall manufacturing effectiveness and the stability of the equipment. The present study proposes a novel reliability estimation approach for cutting tools based on a logistic regression model using vibration signals. The operating condition information of the CNC machine is incorporated into the reliability analysis to reflect the time-varying characteristics of the product. The proposed approach is superior to other degradation estimation methods in that it does not necessitate any assumption about degradation paths or the probability density functions of condition parameters. The three steps of the new reliability estimation approach for cutting tools are as follows. First, on-line vibration signals of the cutting tools are measured during the manufacturing process. Second, the wavelet packet (WP) transform is employed to decompose the original signals, and correlation analysis is employed to find the feature frequency bands that indicate tool wear. Third, correlation analysis is also used to select the salient feature parameters, which are composed of feature band energy, energy entropy, and time-domain features. Finally, reliability estimation is carried out based on the logistic regression model. The approach has been validated on an NC lathe. Under different failure thresholds, the reliability and failure time of the cutting tools are all estimated accurately. The positive results show the plausibility and effectiveness of the proposed approach, which can facilitate machine performance and reliability estimation.
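
    The last step of the approach, mapping selected condition features to a reliability estimate through a logistic model, can be sketched compactly. The feature values here are synthetic stand-ins for the wavelet-packet band energies, energy entropy, and time-domain features the paper extracts:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 5))              # synthetic condition features
      # Synthetic labels: 1 = tool worn beyond the failure threshold
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

      model = LogisticRegression().fit(X, y)

      # Reliability at the current condition = 1 - estimated failure probability
      x_now = X[-1:]
      reliability = 1.0 - model.predict_proba(x_now)[0, 1]
      print(f"Estimated reliability: {reliability:.3f}")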

  10. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    NASA Astrophysics Data System (ADS)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements that contribute to the stability of the tunnel structure are identified in order to address various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces a smart approach by which decision-makers will be able to find the overall reliability of a tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to this field, and much effort has been made to use it in tunneling by investigating the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating the tunnel support performance is therefore the main idea used in this research. Decomposition approaches are used to produce the system block diagram and determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Considering the idea of linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for the different structural subsystems, and the results of numerical analyses are obtained in

  11. Reliable contact fabrication on nanostructured Bi2Te3-based thermoelectric materials.

    PubMed

    Feng, Shien-Ping; Chang, Ya-Huei; Yang, Jian; Poudel, Bed; Yu, Bo; Ren, Zhifeng; Chen, Gang

    2013-05-14

    A cost-effective and reliable Ni-Au contact on nanostructured Bi2Te3-based alloys for a solar thermoelectric generator (STEG) is reported. The use of MPS SAMs creates strong covalent bonding and more nucleation sites with an even distribution for electroplating contact electrodes on nanostructured thermoelectric materials. A reliable, high-performance flat-panel STEG can be obtained by using this new method. PMID:23531997

  12. The B-747 flight control system maintenance and reliability data base for cost effectiveness tradeoff studies

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Primary and automatic flight controls are combined for a total flight control reliability and maintenance cost data base using information from two previous reports and additional cost data gathered from a major airline. A comparison of the current B-747 flight control system effects on reliability and operating cost with that of a B-747 designed for an active control wing load alleviation system is provided.

  13. Reliability of 3D laser-based anthropometry and comparison with classical anthropometry

    PubMed Central

    Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Broda, Anja; Scholz, Markus

    2016-01-01

    Anthropometric quantities are widely used in epidemiologic research as possible confounders, risk factors, or outcomes. 3D laser-based body scans (BS) allow evaluation of dozens of quantities in short time with minimal physical contact between observers and probands. The aim of this study was to compare BS with classical manual anthropometric (CA) assessments with respect to feasibility, reliability, and validity. We performed a study on 108 individuals with multiple measurements of BS and CA to estimate intra- and inter-rater reliabilities for both. We suggested BS equivalents of CA measurements and determined validity of BS considering CA the gold standard. Throughout the study, the overall concordance correlation coefficient (OCCC) was chosen as indicator of agreement. BS was slightly more time consuming but better accepted than CA. For CA, OCCCs for intra- and inter-rater reliability were greater than 0.8 for all nine quantities studied. For BS, 9 of 154 quantities showed reliabilities below 0.7. BS proxies for CA measurements showed good agreement (minimum OCCC > 0.77) after offset correction. Thigh length showed higher reliability in BS while upper arm length showed higher reliability in CA. Except for these issues, reliabilities of CA measurements and their BS equivalents were comparable. PMID:27225483

  15. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.

  16. Dynamic reliability-based robust design optimization with time-variant probabilistic constraints

    NASA Astrophysics Data System (ADS)

    Wang, Pingfeng; Wang, Zequn; Almaktoom, Abdulaziz T.

    2014-06-01

    With the increasing complexity of engineering systems, ensuring high system reliability and system performance robustness throughout a product life cycle is of vital importance in practical engineering design. Dynamic reliability analysis, which is generally encountered due to time-variant system random inputs, becomes a primary challenge in reliability-based robust design optimization (RBRDO). This article presents a new approach to efficiently carry out dynamic reliability analysis for RBRDO. The key idea of the proposed approach is to convert time-variant probabilistic constraints to time-invariant ones by efficiently constructing a nested extreme response surface (NERS) and then carry out dynamic reliability analysis using NERS in an iterative RBRDO process. The NERS employs an efficient global optimization technique to identify the extreme time responses that correspond to the worst case scenario of system time-variant limit state functions. With these extreme time samples, a kriging-based time prediction model is built and used to estimate extreme responses for any given arbitrary design in the design space. An adaptive response prediction and model maturation mechanism is developed to guarantee the accuracy and efficiency of the proposed NERS approach. The NERS is integrated with RBRDO with time-variant probabilistic constraints to achieve optimum designs of engineered systems with desired reliability and performance robustness. Two case studies are used to demonstrate the efficacy of the proposed approach.
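
    A minimal sketch of the NERS idea (Python, using scikit-learn's Gaussian process as the kriging model; the limit-state function, design range, and time window are invented for illustration):

        import numpy as np
        from scipy.optimize import minimize_scalar
        from sklearn.gaussian_process import GaussianProcessRegressor

        def g(x, t):
            # toy time-variant limit state: the worst case shifts with the design x
            return np.sin(2.0 * t + x) + 0.1 * x

        # Step 1: at a few training designs, locate the extreme-response time.
        designs = np.linspace(0.0, 2.0, 8)
        t_extreme = [minimize_scalar(lambda t, x=x: -g(x, t), bounds=(0.0, 5.0),
                                     method="bounded").x for x in designs]

        # Step 2: kriging model predicting the extreme time for any new design.
        gp = GaussianProcessRegressor().fit(designs.reshape(-1, 1), t_extreme)

        # Step 3: for an arbitrary design, evaluate g only at the predicted
        # extreme time, turning the dynamic problem into a static one.
        x_new = 1.3
        t_hat = gp.predict(np.array([[x_new]]))[0]
        print(f"extreme time {t_hat:.3f}, worst-case response {g(x_new, t_hat):.3f}")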

  17. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    DOE PAGES

    Guthrie, Michael A.

    2013-01-01

    A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
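
    For reference, the Hasofer-Lind index mentioned above has the standard definition (a textbook fact, not anything specific to this paper): in the space of uncorrelated standard normal variables u, with limit state function g (failure when g < 0),

        \beta_{HL} \;=\; \min_{\{u \,:\, g(u) = 0\}} \sqrt{u^{\mathsf{T}} u},
        \qquad
        P_f \;\approx\; \Phi(-\beta_{HL}),

    where \Phi is the standard normal cumulative distribution function.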

  18. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrodes and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, the dielectric thickness d, the average grain size, and the capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
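
    The abstract does not spell out the acceleration function; for orientation only, a commonly used empirical form for BaTiO3-based MLCC life under combined voltage and temperature stress is the Prokopowicz-Vaskas relation (an assumption here, not necessarily the exact function of this model):

        \frac{t_1}{t_2} \;=\; \left(\frac{V_2}{V_1}\right)^{n}
        \exp\!\left[\frac{E_a}{k_B}\left(\frac{1}{T_1} - \frac{1}{T_2}\right)\right],

    where t_i is the median life at voltage V_i and absolute temperature T_i, n is the voltage-stress exponent, and E_a is the activation energy.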

  19. Reliability-based structural optimization: A proposed analytical-experimental study

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Nikolaidis, Efstratios

    1993-01-01

    An analytical and experimental study for assessing the potential of reliability-based structural optimization is proposed and described. In the study, competing designs obtained by deterministic and reliability-based optimization are compared. The experimental portion of the study is practical because the structure selected is a modular, actively and passively controlled truss that consists of many identical members, and because the competing designs are compared in terms of their dynamic performance and are not destroyed if failure occurs. The analytical portion of this study is illustrated on a 10-bar truss example. In the illustrative example, it is shown that reliability-based optimization can yield a design that is superior to an alternative design obtained by deterministic optimization. These analytical results provide motivation for the proposed study, which is underway.

  20. The SUPERB Project: Reliability-based design guideline for submarine pipelines

    SciTech Connect

    Sotberg, T.; Bruschi, R.; Moerk, K.

    1996-12-31

    This paper gives an overview of the research program SUPERB, the main objective of which is the development of a SUbmarine PipelinE Reliability Based Design Guideline with a comprehensive set of design recommendations and criteria for pipeline design. The motivation for this program is that project guidelines currently in force do not account for modern fabrication technology, the findings of recent research programs, or the capabilities of advanced engineering tools. The main structure of the Limit State Based Design (LSBD) Guideline is described, followed by an outline of the safety philosophy introduced to fit within this framework. The focus is on the development of a reliability-based design guideline as a rational tool for managing future offshore projects with an optimal balance between project safety and economy. The selection of appropriate limit state functions and the use of reliability tools to calibrate partial safety factors are also discussed.

  1. Non-probabilistic reliability method and reliability-based optimal LQR design for vibration control of structures with uncertain-but-bounded parameters

    NASA Astrophysics Data System (ADS)

    Guo, Shu-Xiang; Li, Ying

    2013-12-01

    Uncertainty is inherent and unavoidable in almost all engineering systems. It is of essential significance to deal with uncertainties by means of a reliability approach and to achieve a reasonable balance between reliability against uncertainties and system performance in the control design of uncertain systems. Nevertheless, reliability methods that can be used directly for the analysis and synthesis of active control of structures in the presence of uncertainties remain to be developed, especially for non-probabilistic uncertainty. In the present paper, the issue of vibration control of uncertain structures using the linear quadratic regulator (LQR) approach is studied from the viewpoint of reliability. An efficient non-probabilistic robust reliability method for LQR-based static output feedback robust control of uncertain structures is presented by treating bounded uncertain parameters as interval variables. The optimal vibration controller design for uncertain structures is carried out by solving a robust reliability-based optimization problem whose objective is to minimize the quadratic performance index. The resulting controller may possess optimum performance under the condition that the controlled structure is robustly reliable with respect to admissible uncertainties. The proposed method provides an essential basis for achieving a balance between robustness and performance in controller design for uncertain structures. The presented formulations are in the framework of linear matrix inequalities and can be carried out conveniently. Two numerical examples are provided to illustrate the effectiveness and feasibility of the present method.

  2. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  3. Composite Stress Rupture: A New Reliability Model Based on Strength Decay

    NASA Technical Reports Server (NTRS)

    Reeder, James R.

    2012-01-01

    A model is proposed to estimate reliability for stress rupture of composite overwrap pressure vessels (COPVs) and similar composite structures. This new reliability model is generated by assuming a strength degradation (or decay) over time. The model suggests that most of the strength decay occurs late in life. The strength decay model is shown to predict a response similar to that predicted by a traditional reliability model for stress rupture based on tests at a single stress level. In addition, the model predicts that even though there is strength decay due to proof loading, a significant overall increase in reliability is gained by eliminating any weak vessels, which would fail early. The model predicts that there should be significant periods of safe life following proof loading, because time is required for the strength to decay from the proof stress level to the subsequent loading level. Suggestions for testing the strength decay reliability model are made. If the strength decay model's predictions are shown through testing to be accurate, COPVs may be designed to carry a higher level of stress than is currently allowed, which would enable the production of lighter structures.
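
    A Monte Carlo sketch of the idea (Python; the Weibull strength parameters, decay law, and stress levels are invented for illustration, not the model's calibrated values):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        s0 = 1.2 * rng.weibull(20.0, n)      # initial strengths (normalized units)

        proof, operate = 1.05, 0.80          # proof stress and operating stress
        passed = s0 > proof                  # proof test screens out weak vessels

        def strength(s_init, t):
            # decay concentrated late in life: negligible until t approaches 1
            return s_init * (1.0 - 0.25 * t**8)

        t_end = 1.0                          # end of design life (normalized)
        alive = passed & (strength(s0, t_end) > operate)
        print("end-of-life reliability of proof-passed vessels: %.4f"
              % (alive.sum() / passed.sum()))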

  4. Reliability Based Design for a Raked Wing Tip of an Airframe

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2011-01-01

    A reliability-based optimization methodology has been developed to design the raked wing tip of the Boeing 767-400 extended range airliner made of composite and metallic materials. Design is formulated for an accepted level of risk or reliability. The design variables, weight and the constraints became functions of reliability. Uncertainties in the load, strength and the material properties, as well as the design variables, were modeled as random parameters with specified distributions, like normal, Weibull or Gumbel functions. The objective function and constraint, or a failure mode, became derived functions of the risk-level. Solution to the problem produced the optimum design with weight, variables and constraints as a function of the risk-level. Optimum weight versus reliability traced out an inverted-S shaped graph. The center of the graph corresponded to a 50 percent probability of success, or one failure in two samples. Under some assumptions, this design would be quite close to the deterministic optimum solution. The weight increased when reliability exceeded 50 percent, and decreased when the reliability was compromised. A design could be selected depending on the level of risk acceptable to a situation. The optimization process achieved up to a 20-percent reduction in weight over traditional design.

  5. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    The quality of software is vital not only to the successful operation of the space station but also as a factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgment with statistical analysis. Unlike hardware, software does not wear out, and redundancy is costly, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number and types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria for terminating testing based on reliability objectives, and methods for estimating the expected number of fixes required, are also presented.

  6. Reliability issues of an accelerometer under environmental stresses

    NASA Astrophysics Data System (ADS)

    Schmitt, Petra; Pressecq, Francis; Perez, Guy; Lafontan, Xavier; Nicot, Jean Marc; Esteve, Daniel; Fourniols, Jean Yves; Camon, Henri; Oudea, Coumar

    2004-01-01

    COTS (commercial-off-the-shelf) MEMS components are very attractive for space applications because they are lightweight, small, economical in energy, cheap, and available on short lead times. The reliability of MEMS COTS that are used outside their intended domain of operation (such as a space application) might be assured by a reliability methodology derived from the physics-of-failure approach. In order to use this approach it is necessary to create models of MEMS components that take environmental stresses into consideration and can thus be used for lifetime prediction. Unfortunately, MEMS failure mechanisms are not well understood today, and a preliminary effort is therefore necessary to determine the influential factors and physical phenomena. The model development is based on good knowledge of the process parameters (Young's modulus, stress, ...), environmental tests, and appropriate modeling approaches, such as finite element analysis (FEA) and behavioural modeling. In order to perform the environmental tests and analyse MEMS behaviour, we have developed the Environmental MEMS Analyzer EMA 3D. The described methodology has been applied to a COTS accelerometer, the ADXL150. A first-level behavioural model was created and then refined in subsequent steps by enrichment with experimental results and finite element simulations.

  8. Reliability assessment of long span bridges based on structural health monitoring: application to Yonghe Bridge

    NASA Astrophysics Data System (ADS)

    Li, Shunlong; Li, Hui; Ou, Jinping; Li, Hongwei

    2009-07-01

    This paper presents reliability estimation studies based on structural health monitoring data for long-span cable-stayed bridges. The data collected by the structural health monitoring system can be used to update the assumptions or probability models of random load effects, which offers the potential for accurate reliability estimation. The reliability analysis is based on the estimated distributions of the Dead, Live, Wind, and Temperature Load effects. For components with FBG strain sensors, the Dead, Live, and unit Temperature Load effects can be determined from the strain measurements. For components without FBG strain sensors, the Dead Load, unit Temperature Load, and Wind Load effects can be evaluated with a finite element model updated and calibrated with the monitoring data. By applying measured truck loads and axle-spacing data from the weigh-in-motion (WIM) system to the calibrated finite element model, the Live Load effects of components without FBG sensors can be generated. The stochastic process of Live Load effects can be described approximately by a filtered Poisson process, and the extreme value distribution of Live Load effects can be calculated by filtered Poisson process theory. The first-order reliability method (FORM) is then employed to estimate the reliability index of the main components of the bridge (i.e., the stiffening girder).
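
    The FORM step can be illustrated with the standard HL-RF iteration on a toy limit state g = R - S with normal resistance R and load effect S (all parameters invented; the bridge's actual limit states and load models are far richer):

        import numpy as np

        mu = np.array([10.0, 6.0])    # means of R (resistance) and S (load effect)
        sd = np.array([1.0, 1.5])     # standard deviations

        def g(x):                     # limit state in physical space: failure if g < 0
            return x[0] - x[1]

        def grad_g(x):
            return np.array([1.0, -1.0])

        u = np.zeros(2)               # start at the mean point in standard normal space
        for _ in range(20):
            x = mu + sd * u           # map back to physical space
            gu = grad_g(x) * sd       # chain rule: gradient with respect to u
            # HL-RF update: project onto the linearized limit-state surface
            u = (gu @ u - g(x)) / (gu @ gu) * gu

        beta = np.linalg.norm(u)
        print(f"reliability index beta = {beta:.3f}")  # analytic: 4/sqrt(3.25) = 2.219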

  9. Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Land cover is inherently dynamic and changes continuously over time. Disturbances or abnormal changes of land cover, caused by events such as forest fire, flood, deforestation, and plant disease, occur worldwide at unknown times and locations, and their timely detection and characterization is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near-real-time or online disturbance detection using satellite image time series. However, most existing methods label the detection results only as "change/no change", and few focus on estimating the reliability (or confidence level) of the detected disturbances. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series by analyzing the full temporal information contained in the data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; and (3) estimating the reliability of each detected disturbance using statistical analysis based on confidence intervals (CI) and confidence levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood near the border of Russia and China. Results demonstrate that the method can estimate the reliability of disturbances detected in satellite image time series with an estimation error of less than 5% and an overall accuracy of up to 90%.

  10. The Seismic Reliability of Offshore Structures Based on Nonlinear Time History Analyses

    SciTech Connect

    Hosseini, Mahmood; Karimiyani, Somayyeh; Ghafooripour, Amin; Jabbarzadeh, Mohammad Javad

    2008-07-08

    Given the damage that past earthquakes have caused to offshore structures, which are vital structures in the oil and gas industries, it is important that their seismic design achieve very high reliability. Accepting nonlinear time history analysis (NLTHA) as the most reliable seismic analysis method, this paper studies a jacket-type offshore platform with a height of 304 feet, a deck of 96 feet by 94 feet, and a weight of 290 million pounds. First, push-over analyses (POA) were performed to identify the more critical members of the jacket, based on the range of their plastic deformations. Then NLTHA was performed using the three-component accelerograms of 100 earthquakes, covering a wide range of frequency content and normalized to three Peak Ground Acceleration (PGA) levels of 0.3 g, 0.65 g, and 1.0 g. Using the results of NLTHA, the damage and rupture probabilities of the critical members were studied to assess the reliability of the jacket structure. Because different structural members of the jacket affect the stability of the platform differently, an 'importance factor' was assigned to each critical member based on its location and orientation in the structure, and the reliability of the whole structure was then obtained by combining the reliabilities of the critical members, each weighted by its importance factor.

  11. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…

  12. Assessing I-Grid(TM) web-based monitoring for power quality and reliability benchmarking

    SciTech Connect

    Divan, Deepak; Brumsickle, William; Eto, Joseph

    2003-04-30

    This paper presents preliminary findings from DOE's pilot program. The results show how a web-based monitoring system can form the basis for aggregation, correlation, and benchmarking of data across broad geographical lines. A longer report describes additional findings from the pilot, including the impacts of power quality and reliability on customers' operations [Divan, Brumsickle, Eto 2003].

  13. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false General Principles of Reliability-Based Maintenance Programs E Appendix E to Part 238 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PASSENGER EQUIPMENT SAFETY STANDARDS Pt. 238, App. E Appendix E to Part...

  14. Total ionizing dose effects and reliability of graphene-based non-volatile memory devices

    NASA Astrophysics Data System (ADS)

    Zhang, Cher Xuan; Zhang, En Xia; Fleetwood, Daniel M.; Alles, Michael L.; Schrimpf, Ronald D.; Song, Emil B.; Galatsis, Kosmas; Newaz, A. K. M.; Bolotin, K. I.

    We discuss total ionizing dose effects and reliability of graphene-based electronics and non-volatile memory devices. The degradation after radiation exposure of these structures derives primarily from surface oxygen adsorption. Excellent stability and memory retention are observed for ionizing radiation exposure or constant-voltage stress. Cycling of the memory state leads to a significant degradation of the performance.

  15. Composite Reliability of a Workplace-Based Assessment Toolbox for Postgraduate Medical Education

    ERIC Educational Resources Information Center

    Moonen-van Loon, J. M. W.; Overeem, K.; Donkers, H. H. L. M.; van der Vleuten, C. P. M.; Driessen, E. W.

    2013-01-01

    In recent years, postgraduate assessment programmes around the world have embraced workplace-based assessment (WBA) and its related tools. Despite their widespread use, results of studies on the validity and reliability of these tools have been variable. Although in many countries decisions about residents' continuation of training and…

  16. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST)

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David

    2006-01-01

    Two IRT-based procedures to estimate test reliability for a certification exam that used both adaptive (via a MST model) and non-adaptive design were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…

  17. The Reliability and Validity of Curriculum-Based Measurement Readiness Probes for Kindergarten Students.

    ERIC Educational Resources Information Center

    Vanderheyden, Amanda M.; Witt, Joseph C.; Naquin, Gale; Noell, George

    2001-01-01

    A series of group-administered curriculum-based measurement (CBM) probes were developed to assist in the identification of kindergarten students exhibiting deficient readiness skills. Acceptable reliability and validity estimates were obtained for three of the probe measures. Proposes the use of kindergarten CBM probes as a potential screening…

  18. Body-Image Perceptions: Reliability of a BMI-Based Silhouette Matching Test

    ERIC Educational Resources Information Center

    Peterson, Michael; Ellenberg, Deborah; Crossan, Sarah

    2003-01-01

    Objective: To assess the reliability of a BMI-based Silhouette Matching Test (BMI-SMT). Methods: The perceptions of ideal and current body images of 215 ninth through twelfth graders' were assessed at 5 different schools within a mid-Atlantic state public school system. Results: Findings provided quantifiable data and discriminating measurements…

  19. Assessing the Reliability of Curriculum-Based Measurement: An Application of Latent Growth Modeling

    ERIC Educational Resources Information Center

    Yeo, Seungsoo; Kim, Dong-Il; Branum-Martin, Lee; Wayman, Miya Miura; Espin, Christine A.

    2012-01-01

    The purpose of this study was to demonstrate the use of Latent Growth Modeling (LGM) as a method for estimating reliability of Curriculum-Based Measurement (CBM) progress-monitoring data. The LGM approach permits the error associated with each measure to differ at each time point, thus providing an alternative method for examining of the…

  20. Genetic algorithm-support vector regression for high reliability SHM system based on FBG sensor network

    NASA Astrophysics Data System (ADS)

    Zhang, XiaoLi; Liang, DaKai; Zeng, Jie; Asundi, Anand

    2012-02-01

    Structural Health Monitoring (SHM) based on fiber Bragg grating (FBG) sensor networks has attracted considerable attention in recent years. However, the FBG sensor network is typically embedded in or glued to the structure in simple series or parallel configurations; if optical fiber sensors or fiber nodes fail, the sensors behind the failure point can no longer be interrogated. Therefore, to improve the survivability of the FBG-based sensor system in SHM, it is necessary to build a highly reliable FBG sensor network for SHM engineering applications. In this study, a model-reconstruction soft-computing recognition algorithm based on genetic algorithm-support vector regression (GA-SVR) is proposed to achieve this reliability. Furthermore, an 8-point FBG sensor system is tested experimentally in an aircraft wing box. Predicting the position of external loading damage is an important task for an SHM system; as an example, different failure modes are selected to demonstrate the survivability of the FBG-based sensor network. In each failure mode, the results are compared with those of the non-reconstructed GA-SVR model. Results show that the proposed model-reconstruction algorithm based on GA-SVR maintains prediction precision when some sensors in the SHM system fail; a highly reliable sensor network for the SHM system is thus achieved without introducing extra components or noise.

  1. A rainwater harvesting system reliability model based on nonparametric stochastic rainfall generator

    NASA Astrophysics Data System (ADS)

    Basinger, Matt; Montalto, Franco; Lall, Upmanu

    2010-10-01

    The reliability with which harvested rainwater can be used to flush toilets, irrigate gardens, and top off air-conditioning units serving multifamily residential buildings in New York City is assessed using a new rainwater harvesting (RWH) system reliability model. Although demonstrated with a specific case study, the model is portable because it is based on a nonparametric rainfall generation procedure utilizing a bootstrapped Markov chain. Precipitation occurrence is simulated using transition probabilities derived for each day of the year from the historical probability of wet and dry day state changes. Precipitation amounts are selected from a matrix of historical values within a moving 15-day window centered on the target day. RWH system reliability is determined for user-specified ranges of catchment area and tank volume using precipitation ensembles generated by the described stochastic procedure. The reliability with which NYC backyard gardens can be irrigated and air-conditioning units supplied with water harvested from local roofs exceeds 80% and 90%, respectively, for the entire range of catchment areas and tank volumes considered in the analysis. For RWH systems installed on the most commonly occurring rooftop catchment areas found in NYC (51-75 m²), toilet-flushing demand can be met with 7-40% reliability, the lower end of the range representing buildings with high-flow toilets and no storage elements, and the upper end representing buildings with low-flow fixtures and storage tanks of up to 5 m³. When the reliability curves developed here are used to size RWH systems to flush the low-flow toilets of all multifamily buildings in a typical residential neighborhood in the Bronx, rooftop runoff inputs to the sewer system are reduced by approximately 28% over an average rainfall year, and potable water demand is reduced by approximately 53%.
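
    A compact sketch of the generator (Python; the transition probabilities and the "historical" record below are synthetic placeholders, whereas the real model derives day-of-year transition probabilities from observed wet/dry state changes):

        import numpy as np

        rng = np.random.default_rng(7)
        days = 365

        # Hypothetical day-of-year transition probabilities P(wet | yesterday's state)
        p_wet_given_dry = np.full(days, 0.25)
        p_wet_given_wet = np.full(days, 0.55)

        # Synthetic "historical" daily rainfall record (mm), 30 years by 365 days
        hist = rng.gamma(0.7, 8.0, size=(30, days)) * (rng.random((30, days)) < 0.3)

        def simulate_year():
            rain = np.zeros(days)
            wet = False
            for d in range(days):
                p = p_wet_given_wet[d] if wet else p_wet_given_dry[d]
                wet = rng.random() < p
                if wet:
                    # bootstrap an amount from wet days within +/- 7 days of day d
                    window = np.arange(d - 7, d + 8) % days
                    pool = hist[:, window].ravel()
                    pool = pool[pool > 0.0]
                    rain[d] = rng.choice(pool)
            return rain

        annual = simulate_year()
        print(f"simulated year: {annual.sum():.0f} mm over "
              f"{np.count_nonzero(annual)} wet days")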

  2. Reliability assessment of a peer evaluation instrument in a team-based learning course

    PubMed Central

    Wahawisan, Joy; Salazar, Miguel; Walters, Robin; Alkhateeb, Fadi M.; Attarabeen, Omar

    2015-01-01

    Objective: To evaluate the reliability of a peer evaluation instrument in a longitudinal team-based learning setting. Methods: Student pharmacists were instructed to evaluate the contributions of their peers. Evaluations were analyzed for the variance of the scores by identifying low, medium, and high scores. Agreement between performance ratings within each group of students was assessed via intra-class correlation coefficient (ICC). Results: We found little variation in the standard deviation (SD) based on the score means among the high, medium, and low scores within each group. The lack of variation in SD of results between groups suggests that the peer evaluation instrument produces precise results. The ICC showed strong concordance among raters. Conclusions: Findings suggest that our student peer evaluation instrument provides a reliable method for peer assessment in team-based learning settings. PMID:27011776
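
    For orientation, an intraclass correlation of the one-way random-effects type can be computed directly from a subjects-by-raters score table as below (Python; the scores are made up, and since the study does not state which ICC variant was used, ICC(1,1) here is an assumption):

        import numpy as np

        scores = np.array([        # rows = students rated, columns = peer raters
            [8.0, 7.5, 8.5],
            [6.0, 6.5, 6.0],
            [9.0, 8.5, 9.5],
            [7.0, 7.0, 6.5],
        ])
        n, k = scores.shape
        grand = scores.mean()
        row_means = scores.mean(axis=1)

        # One-way ANOVA mean squares
        ms_between = k * ((row_means - grand) ** 2).sum() / (n - 1)
        ms_within = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))

        icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
        print(f"ICC(1,1) = {icc:.3f}")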

  3. Effect of Microstructure on Reliability of Ca(TiZr)O3-Based Multilayer Ceramic Capacitors

    NASA Astrophysics Data System (ADS)

    Motoki, Tomoo; Naito, Masahiro; Sano, Harunobu; Konoike, Takehiro; Tomono, Kunisaburo

    2000-09-01

    We examined the reliability of Ca(TiZr)O3 (CTZ)-based Ni-electrode multilayer ceramic capacitors (MLCs) prepared by two different processes with particular interest in the microstructure. One process was to calcine the mixture of CaCO3 and TiO2 to prepare CaTiO3 (CT) powder and the mixture of CaCO3 and ZrO2 to prepare CaZrO3 (CZ) powder, and then mix these calcined powders and sinter them to synthesize the CTZ-based ceramics. The other was to calcine the mixture of CaCO3, TiO2 and ZrO2 powders together to prepare CTZ powder and then sinter them. These two processes of CTZ ceramic preparation resulted in a different crystallinity and distribution of the elements. We found that these factors influenced the reliability of CTZ-based MLCs.

  4. The construction of a FBG-based hierarchical AOFSN with high reliability and scalability

    NASA Astrophysics Data System (ADS)

    Peng, Li-mei; Yang, Won-Hyuk; Li, Xin-wan; Kim, Young-Chon

    2008-11-01

    To improve the reliability and scalability that are very important for large-scale all-optical fiber sensor networks (AOFSN), a three-level hierarchical sensor network architecture is proposed. The first two levels consist of active interrogation and remote nodes (RNs), respectively. The third level, called the sensor subnet (SSN), consists of passive FBGs and a few switches. Because the AOFSN is multiplexed mainly from wired, passive FBGs, the routing algorithm for scanning the sensors is determined by the virtual topology of the SSN. The research therefore concentrates on the construction of the SSN and aims to propose regular, unicursal virtual topologies that allow reliable and scalable routing schemes. Two regular types of SSN are proposed. Each type consists of several sensor cells (SCs), either square-based SCs (SSCs) or pentagon-based SCs (PSCs), and is scaled up several times from the SCs. The virtual topologies maintain the self-similar square- or pentagon-like architecture so as to keep the routing simple. Finally, the switch architecture of the RN is proposed to secure the reliability of the first two levels, and the reliability and scalability of the SSN are discussed in terms of how many link failures can be tolerated and how each SC is scaled while maintaining self-similarity.

  5. High-reliability nonhermetic 1.3-um InP-based uncooled lasers

    NASA Astrophysics Data System (ADS)

    Chand, Naresh; Osenbach, John W.; Evanosky, T. L.; Comizzoli, Robert B.; Tsang, Won-Tien

    1996-01-01

    We report the first uncooled non-hermetic 1.3-micrometer InP-based communication lasers that have reliability comparable to their hermetically packaged counterparts, for possible applications in fiber-in-the-loop and cable TV. The development of reliable non-hermetic semiconductor lasers would not only eliminate the costs specifically associated with hermetic packaging but also open the way for revolutionary low-cost optoelectronic packaging technologies. We used Fabry-Perot capped mesa buried heterostructure (CMBH) uncooled lasers with both bulk and MQW active regions grown on n-type InP substrates by VPE and MOCVD. We find that proper dielectric facet passivation is the key to obtaining high reliability in a non-hermetic environment. The passivation protects the laser from the ambient and maintains the proper facet reflectivity to achieve the desired laser characteristics. SiO facet passivation formed by molecular beam deposition (MBD) has resulted in lasers with lifetimes well in excess of the reliability goal of 3,000 hours of operation at the 85 degrees Celsius/90% RH/30 mA aging condition. Based on experimentally derived extrapolations, we calculate a 15-year average device hazard rate of less than 300 FITs (against the desired 1,500 FITs) for the combination of thermal and humidity-induced degradation at an ambient condition of 45 degrees Celsius/50% RH. For comparison, the average hazard rate at 45 degrees Celsius and 15 years of service is approximately 250 FITs for hermetic lasers of similar construction. A comparison of thermal-only degradation (hermetic) with thermal-plus-humidity degradation (non-hermetic) indicates that the reliability of these non-hermetic lasers is controlled by thermal degradation only and not by moisture-induced degradation. In addition to passivating the device for a non-hermetic environment, MBD-SiO maintains the optical, electrical and mechanical properties needed for high-performance laser

  6. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.

  7. Reliability of Two Field-Based Tests for Measuring Cardiorespiratory Fitness in Preschool Children.

    PubMed

    Ayán, Carlos; Cancela, José M; Romero, Sonia; Alonso, Susana

    2015-10-01

    This study is aimed at analyzing the reliability of 2 field-based cardiorespiratory fitness tests when applied to a sample specifically made up of preschool-aged children. A total of 97 preschoolers (mean age: 4.36 ± 0.4 years; 50.5% girls) performed the Course-Navette and Mini-Cooper tests 3 times (familiarization, test, and retest). The scores obtained were compared with the results provided by the 3-minute shuttle run test, which is considered to be a reliable field-based test for preschoolers. The Mini-Cooper test showed high reliability for children aged 4 (intraclass correlation coefficient [ICC]: 0.942; 95% confidence interval [CI]: 0.903-0.965) and 5 years old (ICC: 0.946; 95% CI: 0.893-0.973). The reliability of the Course-Navette test was also high for both 4-year-old (ICC: 0.909; 95% CI: 0.849-0.945) and 5-year-old children (ICC: 0.889; 95% CI: 0.780-0.944). The mean scores of the 3-minute shuttle run test did not show a significant correlation with the mean scores obtained in the Mini-Cooper and Course-Navette tests in the 4-year-old children. The results of this study suggest that the Course-Navette and Mini-Cooper tests are reliable measures of cardiorespiratory fitness that can be used to assess health-related fitness in preschool children. Nevertheless, some considerations must be taken into account before administering them. PMID:26402475

  8. The Bayesian Reliability Assessment and Prediction for Radar System Based on New Dirichlet Prior Distribution

    NASA Astrophysics Data System (ADS)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    This article studies Bayesian reliability growth models for complex systems based on a new Dirichlet prior distribution, for use when the system sample size is small. The model describes expert experience as a uniform distribution; the equivalent general Beta distribution is then solved for by an optimization method in which the prior parameters are the variables, the mean is the constraint, and the variance is the optimization objective. The optimization method solves the problem of how to determine the values of the hyper-parameters of the new Dirichlet distribution when these parameters have no specific physical meaning. Because the multidimensional numerical integration of the posterior distribution is very difficult to calculate, the WinBUGS software is employed to establish the Bayesian reliability growth model based on the new Dirichlet prior distribution, and two practical cases are studied under this model in order to demonstrate its validity. The analysis results show that the model improves the precision of the calculation and is easy to use in engineering.
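
    A sketch of the hyper-parameter step (Python with SciPy; the uniform prior bounds and the soft-constraint weighting are illustrative, not the paper's exact formulation):

        from scipy.optimize import minimize

        lo, hi = 0.7, 0.9                 # expert's uniform prior on reliability
        mean_u = (lo + hi) / 2.0
        var_u = (hi - lo) ** 2 / 12.0     # variance of the uniform prior

        def beta_moments(a, b):
            m = a / (a + b)
            v = a * b / ((a + b) ** 2 * (a + b + 1.0))
            return m, v

        def objective(p):
            # variance matching as the objective, mean matching as a soft constraint
            m, v = beta_moments(*p)
            return (v - var_u) ** 2 + 1e3 * (m - mean_u) ** 2

        res = minimize(objective, x0=[2.0, 2.0], bounds=[(0.01, 200.0)] * 2)
        a, b = res.x
        print(f"equivalent Beta(a={a:.1f}, b={b:.1f})")   # analytic: a=37.6, b=9.4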

  9. Generalizability theory reliability of written expression curriculum-based measurement in universal screening.

    PubMed

    Keller-Margulis, Milena A; Mercer, Sterett H; Thomas, Erin L

    2016-09-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African American students, 17% Hispanic students, 8% Asian students, and 3% of students identified as 2 or more races. Of the sample, 8% were English Language Learners and 6% were students receiving special education. Three WE-CBM probes were administered for 7 min each at 3 time points across 1 year. Writing samples were scored for commonly used WE-CBM metrics (e.g., correct minus incorrect word sequences; CIWS). Results suggest that nearly half the variance in WE-CBM is related to unsystematic error and that conventional screening procedures (i.e., the use of one 3-min sample) do not yield scores with adequate reliability for relative or absolute decisions about student performance. In most grades, three 3-min writing samples (or 2 longer duration samples) were required for adequate reliability for relative decisions, and three 7-min writing samples would not yield adequate reliability for relative decisions about within-year student growth. Implications and recommendations are discussed. PMID:26322656

  10. Reliable and redundant FPGA based read-out design in the ATLAS TileCal Demonstrator

    SciTech Connect

    Akerstedt, Henrik; Muschter, Steffen; Drake, Gary; Anderson, Kelby; Bohm, Christian; Oreglia, Mark; Tang, Fukun

    2015-10-01

    The Tile Calorimeter at ATLAS [1] is a hadron calorimeter based on steel plates and scintillating tiles read out by PMTs. The current read-out system uses standard ADCs and custom ASICs to digitize and temporarily store the data on the detector. However, only a subset of the data is actually read out to the counting room. The on-detector electronics will be replaced around 2023. To achieve the required reliability the upgraded system will be highly redundant. Here the ASICs will be replaced with Kintex-7 FPGAs from Xilinx. This, in addition to the use of multiple 10 Gbps optical read-out links, will allow a full read-out of all detector data. Due to the higher radiation levels expected when the beam luminosity is increased, opportunities for repairs will be less frequent. The circuitry and firmware must therefore be designed for sufficiently high reliability using redundancy and radiation tolerant components. Within a year, a hybrid demonstrator including the new readout system will be installed in one slice of the ATLAS Tile Calorimeter. This will allow the proposed upgrade to be thoroughly evaluated well before the planned 2023 deployment in all slices, especially with regard to long term reliability. Different firmware strategies alongside with their integration in the demonstrator are presented in the context of high reliability protection against hardware malfunction and radiation induced errors.

  11. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, including both cause and effect events, are derived in the framework as nodes using a Bayesian network analysis approach, which transforms the risk analysis results of failure mode and effect analysis (FMEA) onto a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes that are most critical to system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce these influences and improve system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform, and quality assurance actions were further defined to reduce the risk and improve product quality. PMID:25509315

  12. Reliability analysis of the solar array based on Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Jianing, Wu; Shaoze, Yan

    2011-07-01

    The solar array is an important device used in spacecraft; it influences the quality of in-orbit operation of the spacecraft and even the success of launches. This paper analyzes the reliability of the mechanical system and identifies the most vital subsystem of the solar array. The fault tree analysis (FTA) model is established according to the operating process of the mechanical system, based on the DFH-3 satellite; the logical expression of the top event is obtained by Boolean algebra, and the reliability of the solar array is calculated. The conclusion is that the hinges are the most vital links in the solar array. By analyzing the structural importance (SI) of the hinge's FTA model, some fatal causes can be identified, including faults of the seal, insufficient torque of the locking spring, temperature in space, and friction force. Damage is the initial stage of a fault, so limiting damage is significant for preventing faults. Furthermore, recommendations for improving reliability through damage limitation are discussed, which can be used in redesigning the solar array and in reliability growth planning.
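
    The Boolean-algebra step has this shape in miniature (Python; the gate structure and basic-event probabilities below are a toy stand-in, not the DFH-3 fault tree):

        def OR(*p):
            # union of independent events
            q = 1.0
            for pi in p:
                q *= 1.0 - pi
            return 1.0 - q

        def AND(*p):
            # intersection of independent events
            out = 1.0
            for pi in p:
                out *= pi
            return out

        # Hypothetical basic-event probabilities for one hinge line
        seal_fault        = 1e-3
        spring_torque_low = 5e-4
        thermal_jam       = 2e-4
        friction_high     = 8e-4

        # A hinge fails if the seal fails, if a weak spring coincides with high
        # friction, or if a thermal jam occurs.
        hinge = OR(seal_fault, AND(spring_torque_low, friction_high), thermal_jam)

        # Deployment (top event) fails if any of three hinges fails (series logic).
        top_event = OR(hinge, hinge, hinge)
        print(f"P(top event) = {top_event:.2e}, "
              f"deployment reliability = {1.0 - top_event:.6f}")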

  13. Degradation mechanisms in high-power multi-mode InGaAs-AlGaAs strained quantum well lasers for high-reliability applications

    NASA Astrophysics Data System (ADS)

    Sin, Yongkun; Presser, Nathan; Brodie, Miles; Lingley, Zachary; Foran, Brendan; Moss, Steven C.

    2015-03-01

    Laser diode manufacturers perform accelerated multi-cell lifetests to estimate laser lifetimes using an empirical model. Since state-of-the-art laser diodes typically require a long period of latency before they degrade, a significant amount of stress is applied to the lasers to generate failures in relatively short test durations. A drawback of this approach is the lack of mean-time-to-failure data under intermediate and low stress conditions, leading to uncertainty in the model parameters (especially the optical power and current exponents) and potential overestimation of lifetimes at usage conditions. This is a particular concern for satellite communication systems, where lasers must be highly reliable over long durations in the space environment. A number of groups have studied reliability and degradation processes in GaAs-based lasers, but none of these studies has yielded a reliability model based on the physics of failure. The lack of such a model is also a concern for space applications, where complete understanding of degradation mechanisms is necessary. Our present study addresses these issues by performing long-term lifetests under low stress conditions, followed by failure mode analysis (FMA) and physics-of-failure investigation. We performed low-stress lifetests on both MBE- and MOCVD-grown broad-area InGaAs-AlGaAs strained QW lasers under ACC (automatic current control) mode to study low-stress degradation mechanisms. Our lifetests have accumulated over 36,000 test hours, and FMA is performed on failures using our angle-polishing technique followed by EL. This technique allows us to identify failure types by observing dark-line defects through a window introduced in the backside metal contacts. We also investigated degradation mechanisms in MOCVD-grown broad-area InGaAs-AlGaAs strained QW lasers using various FMA techniques. Since it is a challenge to control defect densities during the growth of laser structures, we chose to

  14. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge-base or by using sensitivity analyses. In the third and final state, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performances of an FDI algorithm associated with an aircraft longitudinal flight-control system.
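
    A minimal sketch of the first stage (Python with SciPy; the states, rates, and coverage parameter are invented, and real models of this kind track many more states):

        import numpy as np
        from scipy.linalg import expm

        lam = 1e-4        # expert-system fault rate (per hour)
        mu = 5e-5         # failure rate of the other components
        coverage = 0.95   # probability a fault is detected and accommodated

        # Generator matrix over the states [nominal, degraded, failed]
        Q = np.array([
            [-(lam + mu), coverage * lam, (1.0 - coverage) * lam + mu],
            [0.0,         -mu,            mu                         ],
            [0.0,          0.0,           0.0                        ],
        ])

        p0 = np.array([1.0, 0.0, 0.0])   # start in the nominal state
        t = 1000.0                       # mission time in hours
        p = p0 @ expm(Q * t)
        print(f"P(operational at t = {t:.0f} h) = {p[0] + p[1]:.6f}")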

  15. High-Confidence Compositional Reliability Assessment of SOA-Based Systems Using Machine Learning Techniques

    NASA Astrophysics Data System (ADS)

    Challagulla, Venkata U. B.; Bastani, Farokh B.; Yen, I.-Ling

    Service-oriented architecture (SOA) techniques are being increasingly used for developing critical applications, especially network-centric systems. While the SOA paradigm provides flexibility and agility to better respond to changing business requirements, the task of assessing the reliability of SOA-based systems is quite challenging. Deriving high-confidence reliability estimates for mission-critical systems can require huge costs and time. SOA systems/applications are built by using either atomic or composite services as building blocks. These services are generally assumed to be realized through reuse and logical composition of components. One approach to assessing the reliability of SOA-based systems is to apply AI reasoning techniques to dynamically collected failure data for each service and its components, as one form of evidence together with results from random testing. The Memory-Based Reasoning technique and Bayesian Belief Networks are verified as the reasoning tools best suited to guide the prediction analysis. A framework constructed from this approach identifies the least tested and “high usage” input subdomains of the service(s) and performs the necessary remedial actions depending on the predicted results.

  16. A genetic algorithm approach for assessing soil liquefaction potential based on reliability method

    NASA Astrophysics Data System (ADS)

    Bagheripour, M. H.; Shooshpasha, I.; Afzalirad, M.

    2012-02-01

    Deterministic approaches are unable to account for variations in soil strength properties and earthquake loads, or for the sources of error in evaluations of liquefaction potential in sandy soils, which makes them questionable compared with reliability-based concepts. Furthermore, deterministic approaches are incapable of precisely relating the probability of liquefaction to the factor of safety (FS). Therefore, probabilistic approaches, and especially reliability analysis, are considered as a complementary solution for reaching better engineering decisions. In this study, the Advanced First-Order Second-Moment (AFOSM) technique, combined with a genetic algorithm (GA) and its corresponding optimization techniques, is used to calculate the reliability index and the probability of liquefaction. The use of a GA provides a reliable mechanism suitable for computer programming and fast convergence. A new relation is developed here by which the liquefaction potential can be directly calculated from the estimated probability of liquefaction (P_L), the cyclic stress ratio (CSR), and normalized standard penetration test (SPT) blow counts, with a mean error of less than 10% relative to the observational data. The validity of the proposed concept is examined through comparison of the results obtained by the new relation with those predicted by other investigators. A further advantage of the proposed relation is that it relates P_L and FS, and hence provides the possibility of decision making based on liquefaction risk together with the use of deterministic approaches. This could be beneficial to geotechnical engineers who use the common FS methods for evaluation of liquefaction. As an application, the city of Babolsar, located on the southern coast of the Caspian Sea, is investigated for liquefaction potential. The investigation is based primarily on in situ tests in which the results of SPT are analysed.
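
    To illustrate the core AFOSM computation: the reliability index beta is the minimum distance from the origin to the limit-state surface g(u) = 0 in standard normal space, and P_L follows as Phi(-beta). The sketch below (Python, with an invented two-variable limit state rather than the paper's CSR/SPT formulation, and a generic constrained optimizer standing in for the GA) shows the design-point search.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):
            # Invented limit state in standard normal space; g(u) <= 0 means liquefaction.
            return 3.0 - u[0] - 0.5 * u[1] ** 2

        # AFOSM design point: minimize ||u|| subject to g(u) = 0.
        res = minimize(lambda u: np.linalg.norm(u), x0=np.array([1.0, 1.0]),
                       constraints={"type": "eq", "fun": g})
        beta = np.linalg.norm(res.x)
        print(f"beta = {beta:.3f}, P_L = {norm.cdf(-beta):.4f}")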

  17. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    NASA Astrophysics Data System (ADS)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Nowadays, micro-devices are widely applied, especially in the automotive industry, taking advantage of combining the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices which is based on the theory of reliability-based robust design optimization. It takes into consideration the performance of a micro-device and its reliability as assessed by means of uncertainty analysis. The procedure assumes that, for each design configuration checked, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example of optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between the electrodes. The optimization was preceded by a sensitivity analysis to establish the design and uncertainty domains. Genetic algorithms fulfilled the defined optimization task effectively. The best individuals discovered are characterized by a minimized value of the multi-criteria objective function while simultaneously satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced through a conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.

  18. Reliability-based structural optimization using response surface approximations and probabilistic sufficiency factor

    NASA Astrophysics Data System (ADS)

    Qu, Xueyong

    Uncertainties exist practically everywhere, from structural design to manufacturing, product lifetime service, and maintenance. Uncertainties can be introduced by errors in modeling and simulation; by manufacturing imperfections (such as variability in material properties and structural geometric dimensions); and by variability in loading. Structural design by safety factors using nominal values without considering uncertainties may lead to designs that are either unsafe, or too conservative and thus not efficient. The focus of this dissertation is reliability-based design optimization (RBDO) of composite structures. Uncertainties are modeled by the probabilistic distributions of random variables. Structural reliability is evaluated in terms of the probability of failure. RBDO minimizes cost, such as structural weight, subject to reliability constraints. Since engineering structures usually have multiple failure modes, Monte Carlo simulation (MCS) was employed to calculate the system probability of failure. Response surface (RS) approximation techniques were used to overcome the difficulties associated with MCS: the high computational cost of a large number of MCS samples was alleviated by the analysis RS, and numerical noise in the MCS results was filtered out by the design RS. RBDO of composite laminates is investigated for use in hydrogen tanks in cryogenic environments. The major challenge is to reduce the large residual strains developed due to thermal mismatch between matrix and fibers while maintaining the load-carrying capacity. RBDO is performed to provide laminate designs, quantify the effects of uncertainties on the optimum weight, and identify those parameters that have the largest influence on the optimum design. Studies of weight and reliability tradeoffs indicate that the most cost-effective measure for reducing weight and increasing reliability is quality control. A probabilistic sufficiency factor (PSF) approach was developed to improve the computational
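
    A minimal sketch of the MCS step (Python, with an invented stress-strength limit state rather than the dissertation's laminate model) shows both the estimator and why its cost motivates response surfaces: the standard error shrinks only as the square root of the sample count.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000  # number of Monte Carlo samples

        # Invented random variables (MPa).
        stress = rng.normal(300.0, 30.0, n)
        strength = rng.normal(450.0, 45.0, n)

        # Failure whenever the applied stress exceeds the strength.
        p_f = np.mean(strength < stress)
        se = np.sqrt(p_f * (1.0 - p_f) / n)  # sampling noise of the estimate
        print(f"P_f = {p_f:.2e} +/- {se:.1e}")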

  19. 75 FR 16098 - Reliable Power LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Reliable Power LLC; Supplemental Notice That Initial Market-Based Rate... notice in the above-referenced proceeding of Reliable Power, LLC's application for market-based...

  20. 76 FR 40722 - Granite Reliable Power, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-11

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Granite Reliable Power, LLC; Supplemental Notice That Initial Market-Based... above-referenced proceeding of Granite Reliable Power, LLC's application for market-based rate...

  1. Web-Based Assessment of Mental Well-Being in Early Adolescence: A Reliability Study

    PubMed Central

    Hamann, Christoph; Schultze-Lutter, Frauke

    2016-01-01

    Background: The ever-increasing use of the Internet among adolescents represents an emerging opportunity for researchers to gain access to larger samples, which can be queried longitudinally over several years. Among adolescents, young adolescents (ages 11 to 13 years) are of particular interest to clinicians, as this is a transitional stage during which depressive and anxiety symptoms often emerge. However, it remains unclear whether these youngest adolescents can accurately answer questions about their mental well-being using a Web-based platform. Objective: The aim of the study was to examine the accuracy of responses obtained from Web-based questionnaires by comparing Web-based with paper-and-pencil versions of depression and anxiety questionnaires. Methods: The primary outcome was the score on the depression and anxiety questionnaires under two conditions: (1) paper-and-pencil and (2) Web-based versions. Twenty-eight adolescents (aged 11-13 years, mean age 12.78 years, SD 0.78; 18 females, 64%) were randomly assigned to complete either the paper-and-pencil or the Web-based questionnaire first. Intraclass correlation coefficients (ICCs) were calculated to measure intrarater reliability, separately for the depression (Children’s Depression Inventory, CDI) and anxiety (Spence Children’s Anxiety Scale, SCAS) questionnaires. Results: On average, it took participants 17 minutes (SD 6) to answer 116 questions online. Intraclass correlation coefficient analysis revealed high intrarater reliability when comparing Web-based with paper-and-pencil responses for both the CDI (ICC=.88; P<.001) and the SCAS (ICC=.95; P<.001). According to published criteria, both of these values are in the “almost perfect” category, indicating the highest degree of reliability. Conclusions: The results of the study show an excellent reliability of Web-based assessment in 11- to 13-year-old children as compared with the standard paper
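
    For readers unfamiliar with the statistic, here is a minimal sketch of a two-way random-effects, absolute-agreement ICC(2,1), a usual choice for comparing two administrations of the same instrument (Python; the six participants and their scores are invented, not study data).

        import numpy as np

        def icc_2_1(scores):
            # scores: (n_subjects, k_conditions), e.g. paper vs. Web totals.
            n, k = scores.shape
            grand = scores.mean()
            ms_r = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_c = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            resid = (scores - scores.mean(axis=1, keepdims=True)
                     - scores.mean(axis=0, keepdims=True) + grand)
            ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

        scores = np.array([[12.0, 13.0], [8.0, 9.0], [21.0, 20.0],
                           [5.0, 6.0], [15.0, 15.0], [10.0, 12.0]])
        print(f"ICC(2,1) = {icc_2_1(scores):.3f}")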

  2. Reliability and validity of the NeuroCognitive Performance Test, a web-based neuropsychological assessment

    PubMed Central

    Morrison, Glenn E.; Simone, Christa M.; Ng, Nicole F.; Hardy, Joseph L.

    2015-01-01

    The NeuroCognitive Performance Test (NCPT) is a brief, repeatable, web-based cognitive assessment platform that measures performance across several cognitive domains. The NCPT platform is modular and includes 18 subtests that can be arranged into customized batteries. Here we present normative data from a sample of 130,140 healthy volunteers for an NCPT battery consisting of 8 subtests. Participants took the NCPT remotely and without supervision. Factor structure and effects of age, education, and gender were evaluated with this normative dataset. Test-retest reliability was evaluated in a subset of participants who took the battery again an average of 78.8 days later. The eight NCPT subtests group into 4 putative cognitive domains, have adequate to good test-retest reliability, and are sensitive to expected age- and education-related cognitive effects. Concurrent validity to standard neuropsychological tests was demonstrated in 73 healthy volunteers. In an exploratory analysis the NCPT battery could differentiate those who self-reported Mild Cognitive Impairment or Alzheimer's disease from matched healthy controls. Overall these results demonstrate the reliability and validity of the NCPT battery as a measure of cognitive performance and support the feasibility of web-based, unsupervised testing, with potential utility in clinical and research settings. PMID:26579035

  3. Reliability Assessment and Robustness Study for Key Navigation Components using Belief Rule Based System

    NASA Astrophysics Data System (ADS)

    You, Yuan; Wang, Liuying; Chang, Leilei; Ling, Xiaodong; Sun, Nan

    2016-02-01

    The gyro device is the key navigation component for maritime tracking and control, and gyro shift is the key factor that influences the performance of the gyro device, which makes reliability analysis of the gyro device very important. In gyro device reliability analysis, residual life probability prediction plays an essential role, although it requires a complex process in existing studies. In this study, a Belief Rule Base (BRB) system is applied to model the relationship between time as the input and the residual life probability as the output. Two scenarios are designed to study the robustness of the proposed BRB prediction model. The comparative results show that the BRB prediction model performs better in Scenario II, when the new referenced values are predictable.

  4. Differential Evolution Based Intelligent System State Search Method for Composite Power System Reliability Evaluation

    NASA Astrophysics Data System (ADS)

    Bakkiyaraj, Ashok; Kumarappan, N.

    2015-09-01

    This paper presents a new approach for evaluating the reliability indices of a composite power system that adopts a binary differential evolution (BDE) algorithm in the search mechanism to select the system states. These states, also called dominant states, have a large state probability and a high loss-of-load curtailment necessary to maintain the real power balance. A chromosome of the BDE algorithm represents a system state. BDE is not applied in its traditional role of optimizing a non-linear objective function, but is used as a tool for exploring a larger number of dominant states by producing new chromosomes, mutant vectors and trial vectors based on the fitness function. The searched system states are used to evaluate annualized system and load-point reliability indices. The proposed search methodology is applied to the RBTS and IEEE-RTS test systems and the results are compared with other approaches. This approach evaluates indices similar to those of existing methods while analyzing fewer system states.
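
    A minimal sketch of a binary DE state search (Python; the fitness function and parameters are invented stand-ins for the paper's state-probability and load-curtailment evaluation) illustrates how mutant and trial vectors generate new candidate system states.

        import numpy as np

        rng = np.random.default_rng(1)

        def fitness(state):
            # Invented surrogate; in the paper this reflects how "dominant"
            # the sampled system state is (probability and curtailment).
            return -abs(int(state.sum()) - 3)

        n_pop, n_bits, n_gen, cr = 20, 10, 50, 0.5
        pop = rng.integers(0, 2, size=(n_pop, n_bits))

        for _ in range(n_gen):
            for i in range(n_pop):
                idx = [j for j in range(n_pop) if j != i]
                a, b, c = pop[rng.choice(idx, size=3, replace=False)]
                mutant = a ^ b ^ c            # XOR plays the role of DE's difference/addition
                cross = rng.random(n_bits) < cr
                trial = np.where(cross, mutant, pop[i])
                if fitness(trial) >= fitness(pop[i]):
                    pop[i] = trial            # explored state is kept if at least as fit

        best = max(pop, key=fitness)
        print(best, fitness(best))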

  5. Analyzing and designing of reliable multicast based on FEC in distributed switch

    NASA Astrophysics Data System (ADS)

    Luo, Ting; Yu, Shaohua; Wang, Xueshun

    2008-11-01

    As businesses become more dependent on IP networks, many real-time services are adopted, and high availability in networks has become increasingly critical. With the development of carrier-grade Ethernet, the requirements on high-speed metro Ethernet devices are more urgent. In order to reach capacities of hundreds of Gbps or Tbps, most core Ethernet switches adopt a distributed control architecture and a large-capacity forwarding fabric. When a distributed switch works, it always has one CE and many FEs, and its internal traffic therefore shows the features of multicast with one sender and many receivers. How to apply reliable multicast to the internal communication system of a distributed switch is thus worth investigating. In this paper, we present the general architecture of a distributed Ethernet switch, focusing on analyzing the model of the internal communication subsystem. According to its characteristics, a novel reliable multicast communication mechanism based on an FEC recovery algorithm has been applied and evaluated in experiment.
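
    As a minimal sketch of the recovery idea (simple XOR parity, the simplest member of the FEC family and not necessarily the paper's exact code), one parity packet lets any single lost packet be rebuilt at the receiver without a retransmission:

        import numpy as np

        rng = np.random.default_rng(2)

        # Four data packets plus one XOR parity packet (all bytes invented).
        packets = [rng.integers(0, 256, 8, dtype=np.uint8) for _ in range(4)]
        parity = packets[0] ^ packets[1] ^ packets[2] ^ packets[3]

        lost = 2  # pretend packet 2 was dropped inside the switch fabric
        recovered = parity.copy()
        for i, p in enumerate(packets):
            if i != lost:
                recovered ^= p   # XOR of parity and survivors restores the loss

        assert np.array_equal(recovered, packets[lost])
        print("recovered packet:", recovered)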

  6. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.

  7. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its restriction to link failures. The method also includes a novel quantification scheme that likewise reduces the computational effort associated with assessing network reliability based on traditional risk-importance measures. Vast reductions in computational effort are realized, since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations, using a technique of assuming node failures to occur on only one side of a break in the network and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
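
    A minimal sketch of quantification from minimal cut sets (Python; links and probabilities invented, and using the first-order rare-event approximation rather than the patent's segmentation technique) also shows a cut-set-based risk-importance ranking:

        import numpy as np

        link_unavail = {"a": 1e-3, "b": 2e-3, "c": 1e-3, "d": 5e-4}
        minimal_cut_sets = [{"a", "b"}, {"c"}, {"a", "d"}]

        # Rare-event approximation: system unavailability is roughly the sum
        # over minimal cut sets of the product of link unavailabilities.
        q_sys = sum(np.prod([link_unavail[l] for l in cs]) for cs in minimal_cut_sets)
        print(f"approximate network unavailability: {q_sys:.3e}")

        # Fussell-Vesely-style importance: each link's share of unavailability.
        for link in link_unavail:
            contrib = sum(np.prod([link_unavail[l] for l in cs])
                          for cs in minimal_cut_sets if link in cs)
            print(link, f"{contrib / q_sys:.2%}")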

  8. Summary of Research on Reliability Criteria-Based Flight System Control

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva; Belcastro, Christine (Technical Monitor)

    2002-01-01

    This paper presents research on the reliability assessment of adaptive flight control systems. The topics include: 1) Overview of Project Focuses; 2) Reliability Analysis; and 3) Design for Reliability. This paper is presented in viewgraph form.

  9. Estimating Paleointensity Reliability Based on the Physical Mechanism of Natural Remanence

    NASA Astrophysics Data System (ADS)

    Smirnov, A. V.; Tarduno, J. A.

    2007-12-01

    Data on the long-term evolution of Earth's magnetic field intensity are crucial for understanding the geodynamo and planetary evolution. However, paleointensity remains one of the most difficult quantities to determine. The conventional Thellier method is based on the assumption that the paleointensity signal is carried by non-interacting single-domain (SD) magnetic grains that hold a thermal remanent magnetization (TRM). Most bulk rock samples, however, deviate from this ideal case. This departure, coupled with the desire to tap the relatively plentiful potential record held by bulk rocks, has led to the development of reliability criteria that largely rely on the observed NRM/TRM characteristics (Arai plots). While such methods may identify effects such as non-SD behavior and laboratory alteration, they assume that the paleointensity signal is a TRM. However, many paleointensity estimates in the current database are probably held by thermochemical remanent magnetizations (TCRMs) or crystallization remanent magnetizations (CRMs). Common processes that form such magnetizations include subsolidus reactions in magnetic grains during initial lava cooling (e.g., oxyexsolution), subsequent low-temperature oxidation (e.g., maghemitization), and the formation of secondary magnetic phases (e.g., hydrothermal magnetite). If unrecognized, such magnetizations can lead to large paleointensity underestimates or overestimates. In most cases, these processes cannot be identified using the Arai-based reliability controls. We suggest that additional criteria based on the physical mechanisms of recording and preserving the paleointensity signal should be utilized in order to assess the reliability of data. We introduce criteria based on whether the magnetization represents a TRM, TCRM and/or CRM, based on rock magnetic and other analytical techniques. While such a categorization is needed to make further progress in understanding the nominal paleointensity signal of bulk rocks, we

  10. Reliability Evaluation of Base-Metal-Electrode Multilayer Ceramic Capacitors for Potential Space Applications

    NASA Technical Reports Server (NTRS)

    Liu, David (Donhang); Sampson, Michael J.

    2011-01-01

    Base-metal-electrode (BME) ceramic capacitors are being investigated for possible use in high-reliability space-level applications. This paper focuses on how the construction and microstructure of BME capacitors affect their lifetime and reliability. Examination of the construction and microstructure of commercial off-the-shelf (COTS) BME capacitors reveals great variance in dielectric layer thickness, even among BME capacitors with the same rated voltage. Compared to PME (precious-metal-electrode) capacitors, BME capacitors exhibit a denser and more uniform microstructure, with an average grain size between 0.3 and 0.5 μm, which is much less than that of most PME capacitors. BME capacitors can be fabricated with more internal electrode layers and thinner dielectric layers than PME capacitors because they have a fine-grained microstructure and do not shrink much during ceramic sintering. This makes it possible for BME capacitors to achieve a very high capacitance volumetric efficiency. The reliability of BME and PME capacitors was investigated using highly accelerated life testing (HALT). Most BME capacitors were found to fail with an early avalanche breakdown, followed by a regular dielectric wearout failure during the HALT test. When most of the early failures, characterized by avalanche breakdown, were removed, BME capacitors exhibited a minimum mean time-to-failure (MTTF) of more than 10^5 years at room temperature and rated voltage. Dielectric thickness was found to be a critical parameter for the reliability of BME capacitors. The number of stacked grains in a dielectric layer appears to play a significant role in determining BME capacitor reliability. Although dielectric layer thickness varies for a given rated voltage in BME capacitors, the number of stacked grains is relatively consistent, typically around 12, for a number of BME capacitors with a rated voltage of 25V. This may suggest that the number of grains per dielectric layer is more critical than the
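
    HALT extrapolations of this kind are conventionally done with the Prokopowicz-Vaskas relation; a minimal sketch (Python, with assumed voltage exponent, activation energy, and test conditions, not the paper's fitted values) shows how a short test projects to decades at rated conditions:

        import numpy as np

        K_B = 8.617e-5  # Boltzmann constant, eV/K

        def acceleration_factor(v_use, t_use_c, v_test, t_test_c, n=5.0, ea=1.2):
            # Prokopowicz-Vaskas: t_use/t_test = (V_test/V_use)^n
            #                     * exp[(Ea/k) * (1/T_use - 1/T_test)]
            t_use_k, t_test_k = t_use_c + 273.15, t_test_c + 273.15
            return (v_test / v_use) ** n * np.exp(ea / K_B * (1 / t_use_k - 1 / t_test_k))

        # Hypothetical 25 V part tested at 100 V and 140 degC for 100 h.
        af = acceleration_factor(25.0, 25.0, 100.0, 140.0)
        print(f"acceleration factor: {af:.2e}")
        print(f"projected life at rated conditions: {100.0 * af / 8766:.1e} years")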

  11. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of Web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing a Non-Markovian Stochastic Petri Net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
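
    A minimal sketch of the time-series step (Python with statsmodels; the response-time history is invented, and taking the firing rate as the reciprocal of the predicted response time is an assumed convention, not necessarily the paper's exact mapping):

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(3)

        # Invented history of a service's response times (ms).
        history = 120 + 10 * np.sin(np.arange(80) / 6.0) + rng.normal(0, 3, 80)

        # ARMA(2, 1) is ARIMA with d = 0.
        fit = ARIMA(history, order=(2, 0, 1)).fit()
        forecast = fit.forecast(steps=5)
        print("predicted response times (ms):", np.round(forecast, 1))
        print("implied firing rates (1/ms):", np.round(1.0 / forecast, 4))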

  12. The assessment of stability and reliability of a virtual reality-based intravenous injection simulator.

    PubMed

    Tsai, Wen-Wei; Fung, Chin-Ping; Tsai, Sing-Ling; Jeng, Ming-Chang; Doong, Ji-Liang

    2008-01-01

    The aim of this study was to validate the feasibility of a virtual reality-based (without haptic feedback) intravenous injection system as an effective tool for computer-assisted instruction and training. The stability and reliability of the system were assessed. A personal computer, a needle/catheter device, and a data acquisition interface are included in the system. Using the Virtual Reality Modeling Language, an interactive virtual environment was developed. Ten participants, ranging from 20 to 28 years of age, were recruited for this study. The self-learning and training procedures encompassed an intravenous catheterization process. The experimental results showed that after a few trials, task time changed little from trial to trial, and error frequency decreased slightly with more trials. High intraclass correlation coefficients were also obtained for task time and error frequency in a test-retest reliability analysis. These results indicated that the system was stable and that its reliability was acceptable. PMID:18600130

  13. Probabilistic and structural reliability analysis of laminated composite structures based on the IPACS code

    NASA Technical Reports Server (NTRS)

    Sobel, Larry; Buttitta, Claudio; Suarez, James

    1993-01-01

    Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched, IM6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.

  14. WEAMR — A Weighted Energy Aware Multipath Reliable Routing Mechanism for Hotline-Based WSNs

    PubMed Central

    Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung

    2013-01-01

    Reliable source-to-sink communication is the most important factor for an efficient routing protocol, especially in the domains of military, healthcare and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy-aware multipath routing protocol which utilizes hotline-assisted routing to meet such requirements for mission-critical applications. The protocol reduces the average number of hops from source to destination and provides unmatched reliability compared to well-known reactive ad hoc protocols, i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on a weighted cost calculation and intelligently selects the best possible paths for data transmission. The path cost calculation considers the end-to-end number of hops, the latency, and the minimum-energy node value in the path. In case of path failure, path recalculation is done efficiently with minimum latency and control-packet overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and a higher packet delivery success ratio compared to AODV and AOMDV. The use of multipath also increases the overall lifetime of the WSN by using optimum-energy available paths between sender and receiver. PMID:23669714

  15. Reliability of team-based self-monitoring in critical events: a pilot study

    PubMed Central

    2013-01-01

    Background: Teamwork is a critical component during critical events. Assessment is mandatory for remediation and to target training programmes at observed performance gaps. Methods: The primary purpose was to test the feasibility of team-based self-monitoring of crisis resource management with a validated teamwork assessment tool. A secondary purpose was to assess item-specific reliability and content validity in order to develop a modified, context-optimised assessment tool. We conducted a prospective, single-centre study to assess team-based self-monitoring of teamwork after in-situ inter-professional simulated critical events by comparison with an assessment by observers. The Mayo High Performance Teamwork Scale (MHPTS) was used as the assessment tool, with evaluation of internal consistency, item-specific consensus estimates for agreement between participating teams and observers, and content validity. Results: 105 participants and 58 observers completed the MHPTS after a total of 16 simulated critical events over 8 months. Summative internal consistency of the MHPTS, calculated as Cronbach’s alpha, was acceptable, with 0.712 for observers and 0.710 for participants. The overall consensus estimate for dichotomous data (agreement/non-agreement) was 0.62 (Cohen’s kappa; IQ-range 0.31-0.87). 6/16 items had excellent (kappa > 0.8) and 3/16 good (kappa > 0.6) reliability. Short questions concerning easily observable behaviours were more likely to be reliable. The MHPTS was modified using a threshold for good reliability of kappa > 0.6. The result is a 9-item self-assessment tool (TeamMonitor) with a calculated median kappa of 0.86 (IQ-range: 0.67-1.0) and good content validity. Conclusions: Team-based self-monitoring with the MHPTS to assess team performance during simulated critical events is feasible. A context-based modification of the tool is achievable with good internal consistency and content validity. Further studies are needed to investigate if team-based

  16. A reliable transmission protocol for ZigBee-based wireless patient monitoring.

    PubMed

    Chen, Shyr-Kuen; Kao, Tsair; Chan, Chia-Tai; Huang, Chih-Ning; Chiang, Chih-Yen; Lai, Chin-Yu; Tung, Tse-Hua; Wang, Pi-Chung

    2012-01-01

    Patient monitoring systems are gaining importance as the fast-growing global elderly population increases demands for caretaking. These systems use wireless technologies to transmit vital signs for medical evaluation. In a multihop ZigBee network, existing systems usually use broadcast or multicast schemes to increase the reliability of signal transmission; however, both schemes lead to significantly higher network traffic and end-to-end transmission delay. In this paper, we present a reliable transmission protocol based on anycast routing for wireless patient monitoring. Our scheme automatically selects the closest data receiver in an anycast group as the destination to reduce the transmission latency as well as the control overhead. The new protocol also shortens the latency of path recovery by initiating route recovery from the intermediate routers of the original path. On the basis of this reliable transmission scheme, we implement a ZigBee device for fall monitoring, which integrates fall detection, indoor positioning, and ECG monitoring. When the triaxial accelerometer of the device detects a fall, the current position of the patient is transmitted to an emergency center through the ZigBee network. In order to clarify the situation of the fallen patient, 4-s ECG signals are also transmitted. Our transmission scheme ensures the successful transmission of these critical messages. The experimental results show that our scheme is fast and reliable. We also demonstrate that our devices can seamlessly integrate with the next-generation wireless wide area network technology, worldwide interoperability for microwave access (WiMAX), to achieve real-time patient monitoring. PMID:21997287

  17. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient, called the hazardous risk coefficient, covers anticipated hazards which may occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.

  18. Validity and reliability of an IMU-based method to detect APAs prior to gait initiation.

    PubMed

    Mancini, Martina; Chiari, Lorenzo; Holmstrom, Lars; Salarian, Arash; Horak, Fay B

    2016-01-01

    Anticipatory postural adjustments (APAs) prior to gait initiation have been largely studied in traditional laboratory settings, using force plates under the feet to characterize the displacement of the center of pressure. However, clinical trials and clinical practice would benefit from a portable, inexpensive method for characterizing APAs. The main objectives of this study were therefore (1) to develop a novel, automatic IMU-based method to detect and characterize APAs during gait initiation and (2) to measure its test-retest reliability. Experiment I was carried out in the laboratory to determine the validity of the IMU-based method in 10 subjects with PD (OFF medication) and 12 control subjects. Experiment II was carried out in the clinic to determine the test-retest reliability of the IMU-based method in a different set of 17 early-to-moderate, treated subjects with PD (tested ON medication) and 17 age-matched control subjects. Results showed that the gait initiation characteristics (both APAs and first step) detected with our novel method were significantly correlated with the characteristics calculated with a force plate and motion analysis system. The size of APAs measured with either inertial sensors or the force plate was significantly smaller in subjects with PD than in control subjects (p<0.05). Test-retest reliability for the gait initiation characteristics measured with inertial sensors was moderate-to-excellent (0.56

  19. How to Characterize the Reliability of Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, David (Donhang)

    2015-01-01

    The reliability of an MLCC device is the product of a time-dependent part and a time-independent part: 1) the time-dependent part is a statistical distribution; 2) the time-independent part is the reliability at t=0, the initial reliability. Initial reliability depends only on how a BME MLCC is designed and processed. Similar to the way the minimum dielectric thickness ensured the long-term reliability of a PME MLCC, the initial reliability also ensures the long-term reliability of a BME MLCC. This presentation shows new discoveries regarding commonalities and differences between PME and BME capacitor technologies.

  20. Entity Model Based Quality Management: A First Step Towards High Reliability Organization Management

    NASA Astrophysics Data System (ADS)

    Engelbrecht, S.; Radestock, Chr.; Bohle, D. K. H.

    2010-09-01

    A management system built upon a generic entity model is presented as an approach towards management systems for High Reliability Organizations (HRO). The entity model is derived from the Ground Systems and Operations standard of the European Cooperation for Space Standardization (ECSS). DLR has launched a first application of the model in its Applied Remote Sensing Cluster, especially for the Center for Satellite Based Crisis Information. It is proposed that a management system built upon the entity model systematically enhances a significant number of HRO characteristics.

  1. Reliability-based optimization of maintenance scheduling of mechanical components under fatigue

    PubMed Central

    Beaurepaire, P.; Valdebenito, M.A.; Schuëller, G.I.; Jensen, H.A.

    2012-01-01

    This study presents the optimization of the maintenance scheduling of mechanical components under fatigue loading. Cracks in damaged structures may be detected during non-destructive inspection and subsequently repaired. Fatigue crack initiation and growth show inherent variability, as does the outcome of inspection activities. The problem is addressed within the framework of reliability-based optimization. The initiation and propagation of fatigue cracks are efficiently modeled using cohesive zone elements. The applicability of the method is demonstrated by a numerical example, which involves a plate with two holes subject to alternating stress. PMID:23564979
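
    For orientation, crack growth under cyclic loading is classically described by the Paris law, da/dN = C (dK)^m; a minimal sketch (Python, with invented constants and geometry; the paper itself uses cohesive zone elements, not this closed-form model) estimates the cycles to a critical crack length, the quantity an inspection schedule must beat:

        import numpy as np

        C, m = 1e-11, 3.0     # Paris constants (invented; SI-consistent units)
        Y = 1.12              # geometry factor (assumed edge-crack value)
        dsigma = 80.0         # stress range, MPa (invented)
        a, a_crit = 1e-3, 20e-3   # initial and critical crack lengths, m (invented)

        n_cycles, da = 0.0, 1e-5
        while a < a_crit:
            dk = Y * dsigma * np.sqrt(np.pi * a)   # stress intensity range
            n_cycles += da / (C * dk ** m)         # cycles to grow by da
            a += da

        print(f"estimated cycles to critical crack: {n_cycles:.2e}")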

  2. Effect of Clinically Discriminating, Evidence-Based Checklist Items on the Reliability of Scores from an Internal Medicine Residency OSCE

    ERIC Educational Resources Information Center

    Daniels, Vijay J.; Bordage, Georges; Gierl, Mark J.; Yudkowsky, Rachel

    2014-01-01

    Objective structured clinical examinations (OSCEs) are used worldwide for summative examinations but often lack acceptable reliability. Research has shown that reliability of scores increases if OSCE checklists for medical students include only clinically relevant items. Also, checklists are often missing evidence-based items that high-achieving…

  3. The Accessibility, Usability, and Reliability of Chinese Web-Based Information on HIV/AIDS

    PubMed Central

    Niu, Lu; Luo, Dan; Liu, Ying; Xiao, Shuiyuan

    2016-01-01

    Objective: The present study was designed to assess the quality of Chinese-language Internet-based information on HIV/AIDS. Methods: We entered the following search terms, in Chinese, into Baidu and Sogou: “HIV/AIDS”, “symptoms”, and “treatment”, and evaluated the first 50 hits of each query using the Minervation validation instrument (LIDA tool) and the DISCERN instrument. Results: Of the 900 hits identified, 85 websites were included in this study. The overall score on the LIDA tool was 63.7%; the mean scores for accessibility, usability, and reliability were 82.2%, 71.5%, and 27.3%, respectively. For the top 15 sites according to the LIDA score, the mean DISCERN score was 43.1 (95% confidence interval (CI) = 37.7–49.5). Noncommercial websites showed higher DISCERN scores than commercial websites, whereas commercial websites were more likely than noncommercial websites to be found in the first 20 links obtained from each search engine. Conclusions: In general, HIV/AIDS-related Chinese-language websites have poor reliability, although their accessibility and usability are fair. In addition, the treatment information presented on Chinese-language websites is far from sufficient. There is an imperative need for professionals and specialized institutes to improve the comprehensiveness of Web-based information related to HIV/AIDS. PMID:27556475

  4. Spectrum survey for reliable communications of cognitive radio based smart grid network

    NASA Astrophysics Data System (ADS)

    Farah Aqilah, Wan; Jayavalan, Shanjeevan; Mohd Aripin, Norazizah; Mohamad, Hafizal; Ismail, Aiman

    2013-06-01

    The smart grid (SG) system is expected to involve a huge amount of data with different levels of priority for different applications or users. The traditional grid, which tends to deploy proprietary networks with limited coverage and bandwidth, is not sufficient to support a large-scale SG network. Cognitive radio (CR) is a promising communication platform for the SG network, potentially utilizing all available spectrum resources subject to interference constraints. In order to develop a reliable communication framework for a CR-based SG network, thorough investigations of the current radio spectrum are required. This paper presents the spectrum utilization in Malaysia, specifically in the UHF/VHF bands, the cellular bands (GSM 900, GSM 1800 and 3G), and the WiMAX, ISM and LTE bands. The goal is to determine the potential spectrum that can be exploited by CR users in the SG network. Measurements were conducted for 24 hours to quantify the average spectrum usage and the amount of available bandwidth. The findings in this paper are important in providing insight into actual spectrum utilization prior to developing a reliable communication platform for a CR-based SG network.

  5. Physics of Failure Analysis of Xilinx Flip Chip CCGA Packages: Effects of Mission Environments on Properties of LP2 Underfill and ATI Lid Adhesive Materials

    NASA Technical Reports Server (NTRS)

    Suh, Jong-ook

    2013-01-01

    The Xilinx Virtex 4QV and 5QV (V4 and V5) are next-generation field-programmable gate arrays (FPGAs) for space applications. However, there have been concerns within the space community regarding the non-hermeticity of V4/V5 packages: polymeric materials such as the underfill and lid adhesive will be directly exposed to the space environment. In this study, reliability concerns associated with the non-hermeticity of V4/V5 packages were investigated by studying the properties and behavior of the underfill and lid adhesive materials used in V4/V5 packages.

  6. Reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization

    SciTech Connect

    Shi, Xin; Zhao, Xiangmo; Hui, Fei; Ma, Junyan; Yang, Lan

    2014-10-06

    Clock synchronization in wireless sensor networks (WSNs) has been studied extensively in recent years, and many protocols have been put forward from the standpoint of statistical signal processing, which is an effective way to optimize accuracy. However, the accuracy derived from statistical data can be improved mainly through additional packet exchange, which greatly consumes the limited power resources. In this paper, a reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization is proposed to optimize sync accuracy without expending additional sync packets. As a contribution, a linear weighted fusion scheme for multiple clock deviations is constructed with collaborative sensing of clock timestamps, and the fusion weight is defined by the covariance of the sync errors for the different clock deviations. Extensive simulation results show that the proposed approach can achieve better performance in terms of sync overhead and sync accuracy.
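
    Assuming the fusion is of the inverse-variance type (a natural reading of weights defined by the covariance of sync errors, though the paper's exact weights may differ), a minimal sketch:

        import numpy as np

        # Several offset estimates of the same clock and their error variances (invented).
        offsets = np.array([12.1, 11.6, 12.9])     # microseconds
        variances = np.array([0.20, 0.05, 0.60])

        weights = (1.0 / variances) / np.sum(1.0 / variances)
        fused = weights @ offsets
        fused_var = 1.0 / np.sum(1.0 / variances)  # never worse than the best input

        print(f"weights: {np.round(weights, 3)}")
        print(f"fused offset = {fused:.2f} us, variance = {fused_var:.3f}")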

  7. JHUF-5 Steganalyzer: Huffman Based Steganalytic Features for Reliable Detection of YASS in JPEG Images

    NASA Astrophysics Data System (ADS)

    Bhat, Veena H.; Krishna, S.; Shenoy, P. Deepa; Venugopal, K. R.; Patnaik, L. M.

    Yet Another Steganographic Scheme (YASS) is one of the recent steganographic schemes that embed data at randomized locations in a JPEG image to avert blind steganalysis. In this paper we present JHUF-5, a statistical steganalyzer wherein J stands for JPEG, HU represents Huffman-based statistics, F denotes the FR Index (ratio of file size to resolution), and 5 is the number of features used as predictors for classification. The contribution of this paper is twofold: first, the ability of the proposed blind steganalyzer to detect YASS reliably, with consistent performance across several settings; second, the algorithm is based on only five uncalibrated features for efficient prediction, as against other techniques, some of which employ several hundred predictors. The detection accuracy of the proposed method is found to be superior to that of existing blind steganalysis techniques.

  8. Reliability based calibration of partial safety factors for design of free pipeline spans

    SciTech Connect

    Ronold, K.O.; Nielsen, N.J.R.; Tura, F.; Bryndum, M.B.; Smed, P.F.

    1995-12-31

    This paper demonstrates how a structural reliability method can be applied as a rational means to analyze free spans of submarine pipelines with respect to failure in ultimate loading, and to establish partial safety factors for the design of such free spans against this failure mode. It is important to note that the described procedure should be considered an illustration of a structural reliability methodology, and that the results do not represent a set of final design recommendations. A scope of design cases, consisting of a number of available site-specific pipeline spans, is established and assumed representative of the future occurrence of submarine pipeline spans. Probabilistic models for the wave and current loading and its transfer to stresses in the pipe wall of a pipeline span are established, together with a stochastic representation of the material resistance. The event of failure in ultimate loading is based on a limit state which is reached when the maximum stress over the design life of the pipeline exceeds the yield strength of the pipe material. The yielding limit state is considered an ultimate limit state (ULS).

  9. On the Reliability of a Solitary Wave Based Transducer to Determine the Characteristics of Some Materials

    PubMed Central

    Deng, Wen; Nasrollahi, Amir; Rizzo, Piervincenzo; Li, Kaiyuan

    2015-01-01

    In the study presented in this article we investigated the feasibility and the reliability of a transducer design for the nondestructive evaluation (NDE) of the stiffness of structural materials. The NDE method is based on the propagation of highly nonlinear solitary waves (HNSWs) along a one-dimensional chain of spherical particles that is in contact with the material to be assessed. The chain is part of a built-in system designed and assembled to excite and detect HNSWs, and to exploit the dynamic interaction between the particles and the material to be inspected. This interaction influences the time-of-flight and the amplitude of the solitary pulses reflected at the transducer/material interface. The results of this study show that certain features of the waves are dependent on the modulus of elasticity of the material and that the built-in system is reliable. In the future the proposed NDE method may provide a cost-effective tool for the rapid assessment of materials’ modulus. PMID:26703617

  10. A Compact Forearm Crutch Based on Force Sensors for Aided Gait: Reliability and Validity

    PubMed Central

    Chamorro-Moriana, Gema; Sevillano, José Luis; Ridao-Fernández, Carmen

    2016-01-01

    Frequently, patients who suffer injuries to a lower limb require forearm crutches in order to partially unload it during weight-bearing. These lesions cause pain when the lower limb is loaded, and their progression should be monitored objectively to avoid significant errors in accuracy and, consequently, complications and after-effects. A new and feasible tool is therefore needed that allows the loads exerted on crutches during aided gait to be monitored and their accuracy improved, so as to unburden the lower limbs. In this paper, we describe such a system based on a force sensor, which we have named the GCH System 2.0. Furthermore, we determine the validity and reliability of measurements obtained using this tool via a comparison with the validated AMTI (Advanced Mechanical Technology, Inc., Watertown, MA, USA) OR6-7-2000 Platform. An intra-class correlation coefficient demonstrated excellent agreement between the AMTI Platform and the GCH System. A regression line to determine the predictive ability of the GCH System towards the AMTI Platform was found, with a precision of 99.3%. A detailed statistical analysis is presented for all the measurements, also segregated by the requested loads on the crutches (10%, 25% and 50% of body weight). Our results show that our system, designed for assessing the loads exerted by patients on forearm crutches during assisted gait, provides valid and reliable measurements of those loads. PMID:27338396

  11. Novel bandgap-based under-voltage-lockout methods with high reliability

    NASA Astrophysics Data System (ADS)

    Yongrui, Zhao; Xinquan, Lai

    2013-10-01

    Highly reliable bandgap-based under-voltage-lockout (UVLO) methods are presented in this paper. The proposed methods for converting the under-voltage state to a signal take full advantage of the bandgap's high temperature stability, and enhanced low-voltage protection methods protect the core circuit from erroneous operation; moreover, a common-source amplifier stage is introduced to expand the output voltage range. All of these methods are verified in a UVLO circuit fabricated with a 0.5 μm standard BCD process technology. The experimental results show that the proposed bandgap method exhibits a good temperature coefficient of 20 ppm/°C, which ensures that the UVLO keeps a stable output until the under-voltage state changes. Moreover, at room temperature, the high threshold voltage VTH+ generated by the UVLO is 12.3 V with a maximum drift of ±80 mV, and the low threshold voltage VTH- is 9.5 V with a maximum drift of ±70 mV. The low-voltage protection method used in the circuit also brings high reliability when the supply voltage is very low.

  12. Determine the optimal carrier selection for a logistics network based on multi-commodity reliability criterion

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Yeh, Cheng-Ta

    2013-05-01

    From the perspective of supply chain management, the selected carrier plays an important role in freight delivery. This article proposes a new criterion of multi-commodity reliability and optimises carrier selection based on this criterion for logistics networks with routes and nodes over which multiple commodities are delivered. Carrier selection concerns the selection of exactly one carrier to deliver freight on each route. The capacity of each carrier takes several possible values associated with a probability distribution, since some of a carrier's capacity may be reserved for other orders. Therefore, the logistics network, given any carrier selection, is a multi-commodity multi-state logistics network. Multi-commodity reliability is defined as the probability that the logistics network can satisfy a customer's demand for the various commodities, and it serves as a performance indicator for freight delivery. To solve this problem, this study proposes an optimisation algorithm that integrates a genetic algorithm, minimal paths, and the Recursive Sum of Disjoint Products. A practical example, in which multi-sized LCD monitors are delivered from China to Germany, is considered to illustrate the solution procedure.

  13. Reliability analysis of laser ultrasonics for train axle diagnostics based on model assisted POD curves

    NASA Astrophysics Data System (ADS)

    Malik, M. S.; Cavuto, A.; Martarelli, M.; Pandarese, G.; Revel, G. M.

    2014-05-01

    High-speed train axles are integrated for a lifetime, and conducting in-service inspection with high accuracy is time- and resource-consuming. Laser ultrasonics is a proposed solution: a non-contact measuring method that is effective also for hard-to-reach areas and has recently proved effective using a Laser Doppler Vibrometer (LDV) or air-coupled probes in reception. A reliability analysis of laser ultrasonics for this specific application is performed here. The research is mainly based on a numerical study of the effect of high-energy laser pulses on the surface of a steel axle and of the behavior of the ultrasonic waves in detecting possible defects. The Probability of Detection (POD) concept is used as an estimate of the reliability of the inspection method. In particular, Model-Assisted Probability of Detection (MAPOD), a modified form of POD in which models are used to infer results for a decisive statistical estimate of the POD curve, is adopted here. This paper implements the approach by taking the inputs from limited experiments conducted on a high-speed train axle using laser ultrasonics (source: pulsed Nd:YAG; reception: high-frequency LDV) to calibrate a multiphysics FE model, and by using the calibrated model to generate data samples statistically representative of damaged train axles. The simulated flaws are in accordance with the real defects present on the axle. A set of flaws of different depths has been modeled in order to assess the laser-ultrasonics POD for this specific application.
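
    A minimal sketch of the POD-curve step (Python with statsmodels; the hit/miss data are synthesized here, standing in for the model-generated samples described above): a logistic model is fitted to detection outcomes versus flaw depth, and the a90 point often quoted in NDE is read off the fit.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)

        # Synthetic hit/miss data: flaw depth (mm) vs. detected (1) or missed (0).
        depth = rng.uniform(0.1, 3.0, 200)
        p_true = 1.0 / (1.0 + np.exp(-4.0 * (depth - 1.0)))  # hidden "true" POD
        hit = (rng.random(200) < p_true).astype(float)

        fit = sm.Logit(hit, sm.add_constant(depth)).fit(disp=0)
        b0, b1 = fit.params

        # a90: depth detected with 90% probability, where logit(0.9) = ln 9.
        a90 = (np.log(9.0) - b0) / b1
        print(f"estimated a90 = {a90:.2f} mm")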

  14. An Examination of Temporal Trends in Electricity Reliability Based on Reports from U.S. Electric Utilities

    SciTech Connect

    Eto, Joseph H.; LaCommare, Kristina Hamachi; Larsen, Peter; Todd, Annika; Fisher, Emily

    2012-01-06

    Since the 1960s, the U.S. electric power system has experienced a major blackout about once every 10 years. Each has been a vivid reminder of the importance society places on the continuous availability of electricity and has led to calls for changes to enhance reliability. At the root of these calls are judgments about what reliability is worth and how much should be paid to ensure it. In principle, comprehensive information on the actual reliability of the electric power system and on how proposed changes would affect reliability ought to help inform these judgments. Yet, comprehensive, national-scale information on the reliability of the U.S. electric power system is lacking. This report helps to address this information gap by assessing trends in U.S. electricity reliability based on information reported by electric utilities on power interruptions experienced by their customers. Our research augments prior investigations, which focused only on power interruptions originating in the bulk power system, by considering interruptions originating both from the bulk power system and from within local distribution systems. Our research also accounts for differences among utility reliability reporting practices by employing statistical techniques that remove the influence of these differences on the trends that we identify. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. The questions analyzed include: 1. Are there trends in reported electricity reliability over time? 2. How are trends in reported electricity reliability affected by the installation or upgrade of an automated outage management system? 3. How are trends in reported electricity reliability affected by the use of IEEE Standard 1366-2003?
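
    For reference, the utility-reported indices behind such trend analyses are defined in IEEE Standard 1366; a minimal sketch of the two most common ones, computed over an invented outage log:

        # Each outage record: (customers interrupted, outage duration in minutes).
        outages = [(1200, 90), (300, 45), (5000, 150), (80, 30)]
        n_served = 50_000   # total customers served (invented)

        saifi = sum(c for c, _ in outages) / n_served          # interruptions per customer
        saidi = sum(c * m for c, m in outages) / n_served      # outage minutes per customer
        caidi = saidi / saifi                                  # minutes per interruption

        print(f"SAIFI = {saifi:.3f}, SAIDI = {saidi:.1f} min, CAIDI = {caidi:.1f} min")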

  15. A reliable and energy-efficient wave routing algorithm based on cluster for sensor networks

    NASA Astrophysics Data System (ADS)

    Wu, Fei; Tan, Zhihu; Chen, Yu; Xie, Changsheng

    2008-11-01

    In the recent past, wireless sensor networks have found their way into a wide variety of applications and systems with vastly varying requirements and characteristics. The major limitation of sensor nodes is their restricted energy and computing capability. Most of a node's energy is consumed in communication between nodes or between nodes and the base station. Many current protocols pay attention only to reducing energy consumption, rather than to the stability of the routing process. To address this problem, this paper presents a new reliable and energy-efficient wave routing algorithm based on clusters. The experimental results indicate that the new algorithm not only reduces energy consumption but also ensures the stability of the routing.

  16. Reliability-Based Design of a Safety-Critical Automation System: A Case Study

    NASA Technical Reports Server (NTRS)

    Carroll, Carol W.; Dunn, W.; Doty, L.; Frank, M. V.; Hulet, M.; Alvarez, Teresa (Technical Monitor)

    1994-01-01

    In 1986, NASA funded a project to modernize the NASA Ames Research Center Unitary Plan Wind Tunnels, including the replacement of obsolescent controls with a modern, automated distributed control system (DCS). The project effort on this system included an independent safety analysis (ISA) of the automation system. The purpose of the ISA was to evaluate the completeness of the hazard analyses which had already been performed on the Modernization Project. The ISA approach followed a tailoring of the risk assessment approach widely used on existing nuclear power plants. The tailoring of the nuclear industry oriented risk assessment approach to the automation system and its role in reliability-based design of the automation system is the subject of this paper.

  17. Design and implementation of reliability evaluation of SAS hard disk based on RAID card

    NASA Astrophysics Data System (ADS)

    Ren, Shaohua; Han, Sen

    2015-10-01

    Because of its huge advantages in storage, RAID technology has been widely used. A drawback of this technology, however, is that a hard disk behind a RAID card cannot be queried directly by the operating system. Reading the self-reported information and log data of the hard disk has therefore been a problem, although these data are necessary for reliability testing of hard disks. Traditional methods can read this information only from SATA hard disks, not from SAS hard disks. In this paper, we provide a method that uses the LSI RAID card's Application Program Interface to communicate with the RAID card and analyze the returned data, thereby obtaining the information necessary to assess the reliability of SAS hard disks.

  18. Comparative analysis of different configurations of PLC-based safety systems from reliability point of view

    NASA Technical Reports Server (NTRS)

    Tapia, Moiez A.

    1993-01-01

    A comparative analysis, from a reliability point of view, of distinct multiplex and fault-tolerant configurations for a PLC-based safety system is presented. It considers simplex, duplex, and fault-tolerant triple-redundancy configurations. In the duplex configuration, the standby unit has a failure rate that is k times the failure rate of the main unit, with k varying from 0 to 1. For distinct values of the MTTR and MTTF of the main unit, the MTBF and availability of these configurations are calculated. The effect on the configuration MTBF of duplexing only the PLC module, or only the sensor and actuator module, is also presented. The results are summarized, and the merits and demerits of the various configurations under distinct environments are discussed.
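
    For the duplex case, the MTBF can be computed numerically from the absorbing Markov chain; a minimal sketch (Python), assuming exponential rates with λ for the main unit, kλ for the standby, repair rate μ, and the remaining unit failing at rate λ once its partner is down (all numbers illustrative, not from the paper):

        import numpy as np

        lam, k, mu = 1e-4, 0.5, 0.1    # main failure, standby factor, repair rate (1/h); assumed

        # Transient states: 0 = both units up, 1 = one unit down (in repair).
        # The absorbing "both down" state is dropped from the generator matrix.
        Q = np.array([[-(lam + k * lam), lam + k * lam],
                      [mu, -(mu + lam)]])

        # Mean times to system failure from each transient state solve Q @ t = -1.
        t = np.linalg.solve(Q, -np.ones(2))
        print(f"duplex MTBF from the both-up state: {t[0]:.3e} h")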

  19. Structural damage measure index based on non-probabilistic reliability model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojun; Xia, Yong; Zhou, Xiaoqing; Yang, Chen

    2014-02-01

    Uncertainties in the structural model and measurement data affect structural condition assessment in practice. Because probabilistic information on these uncertainties is lacking, a non-probabilistic interval analysis framework is developed to quantify the intervals of the structural element stiffness parameters. From the intersection of the element stiffness intervals in the undamaged and damaged states, the possibility of damage existence is defined based on reliability theory. A damage measure index is then proposed as the product of the nominal stiffness reduction and the defined possibility of damage existence. This new index simultaneously reflects the damage severity and the possibility of damage at each structural component. Numerical and experimental examples are presented to illustrate the validity and applicability of the method. The results show that the proposed method can improve the accuracy of damage diagnosis compared with the deterministic damage identification method.
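
    A toy sketch (Python) of the index construction just described; the Monte Carlo possibility measure under assumed uniform distributions is a simple stand-in for the paper's interval-intersection definition, and the interval endpoints are hypothetical:

        import random

        def damage_possibility(undamaged, damaged, n=100_000):
            """Possibility that the damaged-state stiffness lies below the undamaged one.

            Arguments are (lo, hi) intervals; uniform sampling over the intervals is
            an illustrative stand-in for the interval-reliability definition.
            """
            (dl, dh), (ul, uh) = damaged, undamaged
            if dh <= ul:
                return 1.0                # intervals disjoint: damage certain
            if dl >= uh:
                return 0.0                # damaged interval entirely above: no damage
            hits = sum(random.uniform(dl, dh) < random.uniform(ul, uh)
                       for _ in range(n))
            return hits / n

        undamaged = (0.95, 1.05)          # normalized element stiffness, intact state
        damaged = (0.80, 1.00)            # identified interval, possibly damaged state
        nominal_reduction = 1.0 - 0.90    # reduction of the interval midpoint
        print("damage measure index:",
              nominal_reduction * damage_possibility(undamaged, damaged))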

  20. Web-based phenotyping for Tourette Syndrome: Reliability of common co-morbid diagnoses.

    PubMed

    Darrow, Sabrina M; Illmann, Cornelia; Gauvin, Caitlin; Osiecki, Lisa; Egan, Crystelle A; Greenberg, Erica; Eckfield, Monika; Hirschtritt, Matthew E; Pauls, David L; Batterson, James R; Berlin, Cheston M; Malaty, Irene A; Woods, Douglas W; Scharf, Jeremiah M; Mathews, Carol A

    2015-08-30

    Collecting the phenotypic data necessary for genetic analyses of neuropsychiatric disorders is time consuming and costly. Development of web-based phenotype assessments would greatly improve the efficiency and cost-effectiveness of genetic research. However, evaluating the reliability of this approach against standard, in-depth clinical interviews is essential. The current study replicates and extends a preliminary report on the utility of a web-based screen for Tourette Syndrome (TS) and common comorbid diagnoses (obsessive compulsive disorder (OCD) and attention deficit/hyperactivity disorder (ADHD)). A subset of individuals who completed a web-based phenotyping assessment for a TS genetic study was invited to participate in semi-structured diagnostic clinical interviews. The data from these interviews were used to determine participants' diagnostic status for TS, OCD, and ADHD using best estimate procedures, which then served as the gold standard against which diagnoses assigned from the web-based screen data were compared. The results show high rates of agreement for TS. Kappas for OCD and ADHD diagnoses were also high, and together the results demonstrate the utility of these self-report data in comparison with previous clinician diagnoses and dimensional assessment methods. PMID:26054936
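
    As a worked illustration of the agreement statistic used in such studies, a minimal sketch (Python, with made-up counts, not the study's data) computing Cohen's kappa from a 2x2 screen-versus-interview table:

        def cohens_kappa(table):
            """table[i][j]: count of cases with screen result i and interview result j."""
            n = sum(sum(row) for row in table)
            po = sum(table[i][i] for i in range(len(table))) / n            # observed agreement
            pe = sum((sum(table[i]) / n) * (sum(r[i] for r in table) / n)   # chance agreement
                     for i in range(len(table)))
            return (po - pe) / (1 - pe)

        # Hypothetical OCD counts: rows = web screen (+/-), columns = interview (+/-)
        print(round(cohens_kappa([[40, 5], [8, 150]]), 3))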

  1. Interpretive Reliability of Six Computer-Based Test Interpretation Programs for the Minnesota Multiphasic Personality Inventory-2.

    PubMed

    Deskovitz, Mark A; Weed, Nathan C; McLaughlan, Joseph K; Williams, John E

    2016-04-01

    The reliability of six Minnesota Multiphasic Personality Inventory-Second edition (MMPI-2) computer-based test interpretation (CBTI) programs was evaluated across a set of 20 commonly appearing MMPI-2 profile codetypes in clinical settings. Evaluation of CBTI reliability comprised examination of (a) interrater reliability, the degree to which raters arrive at similar inferences based on the same CBTI profile and (b) interprogram reliability, the level of agreement across different CBTI systems. Profile inferences drawn by four raters were operationalized using q-sort methodology. Results revealed no significant differences overall with regard to interrater and interprogram reliability. Some specific CBTI/profile combinations (e.g., the CBTI by Automated Assessment Associates on a within normal limits profile) and specific profiles (e.g., the 4/9 profile displayed greater interprogram reliability than the 2/4 profile) were interpreted with variable consensus (α range = .21-.95). In practice, users should consider that certain MMPI-2 profiles are interpreted more or less consensually and that some CBTIs show variable reliability depending on the profile. PMID:25944798

  2. The Diagnostic Validity and Reliability of an Internet-Based Clinical Assessment Program for Mental Disorders

    PubMed Central

    Klein, Britt; Meyer, Denny; Austin, David William; Abbott, Jo-Anne M

    2015-01-01

    Background Internet-based assessment has the potential to assist with the diagnosis of mental health disorders and to overcome the barriers associated with traditional services (eg, cost, stigma, distance). Beyond the online screening programs already available, there is an opportunity to deliver more comprehensive and accurate diagnostic tools to supplement the assessment and treatment of mental health disorders. Objective The aim was to evaluate the diagnostic criterion validity and test-retest reliability of the electronic Psychological Assessment System (e-PASS), an online, self-report, multidisorder, clinical assessment and referral system. Methods Participants were 616 adults residing in Australia, recruited online, and representing prospective e-PASS users. Following e-PASS completion, 158 participants underwent a telephone-administered structured clinical interview and 39 participants repeated the e-PASS within 25 days of initial completion. Results With structured clinical interview results serving as the gold standard, diagnostic agreement with the e-PASS varied considerably from fair (eg, generalized anxiety disorder: κ=.37) to strong (eg, panic disorder: κ=.62). Although the e-PASS’ sensitivity also varied (0.43-0.86), its specificity was generally high (0.68-1.00). The e-PASS sensitivity generally improved when the e-PASS threshold was reduced to a subclinical result. Test-retest reliability ranged from moderate (eg, specific phobia: κ=.54) to substantial (eg, bulimia nervosa: κ=.87). Conclusions The e-PASS produces reliable diagnostic results and performs generally well in excluding mental disorders, although at the expense of sensitivity. For screening purposes, a subclinical e-PASS result generally appears better than a clinical result as a diagnostic indicator. Further development and evaluation is needed to support the use of online diagnostic assessment programs for mental disorders. Trial Registration Australian and New Zealand Clinical Trials

  3. Fast two-dimensional phase-unwrapping algorithm based on sorting by reliability following a noncontinuous path.

    PubMed

    Herráez, Miguel Arevallilo; Burton, David R; Lalor, Michael J; Gdeisat, Munther A

    2002-12-10

    We describe what is to our knowledge a novel technique for phase unwrapping. Several algorithms based on unwrapping the most-reliable pixels first have been proposed. These were restricted to continuous paths and were subject to difficulties in defining a starting pixel. The technique described here uses a different type of reliability function and does not follow a continuous path to perform the unwrapping operation. The technique is explained in detail and illustrated with a number of examples. PMID:12502301

  4. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
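
    The aggregation step can be pictured with a small recursive sketch (Python); the series/parallel architecture description and the component values are hypothetical, and real low-level models would carry more than a single reliability number:

        def system_reliability(arch, components):
            """arch: a component name, or ('series' | 'parallel', [sub-blocks])."""
            if isinstance(arch, str):
                return components[arch]
            kind, subs = arch
            rs = [system_reliability(s, components) for s in subs]
            if kind == "series":          # series: every block must survive
                out = 1.0
                for r in rs:
                    out *= r
                return out
            q = 1.0                       # parallel: fails only if all blocks fail
            for r in rs:
                q *= 1.0 - r
            return 1.0 - q

        components = {"cpu": 0.999, "bus": 0.995, "sensor": 0.98}
        arch = ("series", ["cpu", ("parallel", ["sensor", "sensor"]), "bus"])
        print(system_reliability(arch, components))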

  5. Reliability Optimization Design for Contact Springs of AC Contactors Based on Adaptive Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhao, Sheng; Su, Xiuping; Wu, Ziran; Xu, Chengwen

    The paper illustrates the procedure of reliability optimization modeling for the contact springs of AC contactors under nonlinear multi-constraint conditions. The adaptive genetic algorithm (AGA) is utilized to perform reliability optimization on the contact spring parameters of a type of AC contactor. Varying the crossover and mutation rates over the course of the AGA run effectively avoids premature convergence, and experimental tests were performed after optimization. The experimental results show that the mass of each optimized spring is reduced by 16.2%, while the reliability increases from 94.5% to 99.9%. The experimental results verify the correctness and feasibility of this reliability optimization design method.
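
    The abstract does not give the exact adaptation schedule; one common variant raises the crossover and mutation rates as the population loses diversity, sketched below (Python, with illustrative parameter ranges):

        def adaptive_rates(fitnesses, pc_range=(0.5, 0.9), pm_range=(0.01, 0.1)):
            """Return (crossover rate, mutation rate) from the population's fitness spread."""
            f = sorted(fitnesses)
            spread = (f[-1] - f[0]) / (abs(f[-1]) + 1e-12)  # 1 ~ diverse, 0 ~ converged
            d = max(0.0, min(1.0, spread))
            pc = pc_range[1] - (pc_range[1] - pc_range[0]) * d
            pm = pm_range[1] - (pm_range[1] - pm_range[0]) * d
            return pc, pm

        print(adaptive_rates([0.94, 0.95, 0.95, 0.96]))  # near convergence: high rates
        print(adaptive_rates([0.30, 0.60, 0.80, 0.99]))  # diverse: low rates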

  6. New approaches in reliability based optimization of tuned mass damper in presence of uncertain bounded parameters

    NASA Astrophysics Data System (ADS)

    Mrabet, Elyes; Guedri, Mohamed; Ichchou, Mohamed; Ghanmi, Samir

    2015-10-01

    This work deals with the control of vibrating structures using a tuned mass damper (TMD) in the presence of uncertain bounded structural parameters. The adopted optimization strategy for the TMD parameters is reliability-based optimization (RBO), where the failure probability, approximated with the classical Rice formula, is related to the primary structure displacement. In the presence of uncertain bounded structural parameters it is convenient to describe them using intervals; consequently, the optimized failure probability is also defined over an interval. In this paper a continuous-optimization nested loop method (CONLM) is presented to provide the exact range of the optimum TMD parameters and their corresponding failure probabilities. The CONLM is time consuming; in this context an approximation method using the monotonicity-based extension method (MBEM) with box splitting is also proposed, so that the initial non-deterministic optimization problem can be transformed into two independent deterministic sub-problems involving a discrete-optimization nested loop rather than the continuous-optimization nested loop used in the CONLM. The effectiveness and robustness of the presented optimum bounds of the TMD parameters are investigated and a performance index is introduced. The numerical results obtained with one-degree-of-freedom and multi-degree-of-freedom systems subject to different seismic motions show the efficiency of the proposed methods, even with high levels of uncertainty. Moreover, the good robustness of the TMD device when it is exactly tuned to the optimum TMD parameters corresponding to the deterministic structural parameters has been demonstrated.
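
    For reference, the classical Rice approximation invoked above: for a stationary zero-mean Gaussian displacement x(t), with standard deviations of the response and of its time derivative denoted sigma_x and sigma_xdot, and a displacement barrier b, the mean upcrossing rate and the failure probability over a duration T are approximately

        \nu^{+}(b) = \frac{\sigma_{\dot{x}}}{2\pi\,\sigma_{x}}
                     \exp\left(-\frac{b^{2}}{2\sigma_{x}^{2}}\right),
        \qquad
        P_{f}(T) \approx 1 - \exp\left(-\nu^{+}(b)\,T\right).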

  7. The design and characteristics analysis of lunar based Two Axes gimbal with high reliability

    NASA Astrophysics Data System (ADS)

    Huang, Jing; Liu, Zhao-hui; Li, Zhi-guo; Chang, Zhi-yuan; Shi, Xin

    2011-08-01

    In order to meet the requirements of lunar-based astronomical observation, which includes three different observing modes (specific celestial body observations, calibration observations, and sky surveys), a two-axis gimbal is designed to guarantee that the telescope can point steadily at an exact location in the sky. Because of the harsh environment on the moon and the space payload weight limit, lightweight design methods such as structural optimization and rational material selection greatly reduce the weight of the gimbal without any decrease of rigidity or strength. In addition, the use of an external-rotor mechanism for the vertical shaft greatly improves the system's first-order mode along the emission direction. A shaft with one fixed end and one free end is adopted to reduce the deflection between its two widely spanned ends. Furthermore, sticking of the shaft at both ends due to temperature changes on the moon is eliminated by rationally determining the clearance of the deep-groove ball bearings. Experiments show that the system's first-order resonant frequency reaches 81 Hz, and that the mechanism works well from -25 °C to +60 °C without any sticking. Owing to the approaches mentioned above, the mechanism has good mechanical properties, including high reliability and light weight.

  8. Post-illumination pupil response after blue light: Reliability of optimized melanopsin-based phototransduction assessment.

    PubMed

    van der Meijden, Wisse P; te Lindert, Bart H W; Bijlenga, Denise; Coppens, Joris E; Gómez-Herrero, Germán; Bruijel, Jessica; Kooij, J J Sandra; Cajochen, Christian; Bourgin, Patrice; Van Someren, Eus J W

    2015-10-01

    ± 3.6 yr) we examined the potential confounding effects of dark adaptation, time of the day (morning vs. afternoon), body posture (upright vs. supine position), and 24-h environmental light history on the PIPR assessment. Mixed effect regression models were used to analyze these possible confounders. A supine position caused larger PIPR-mm (β = 0.29 mm, SE = 0.10, p = 0.01) and PIPR-% (β = 4.34%, SE = 1.69, p = 0.02), which was due to an increase in baseline dark pupil diameter; this finding is of relevance for studies requiring a supine posture, as in functional Magnetic Resonance Imaging, constant routine protocols, and bed-ridden patients. There were no effects of dark adaptation, time of day, and light history. In conclusion, the presented method provides a reliable and robust assessment of the PIPR to allow for studies on individual differences in melanopsin-based phototransduction and effects of interventions. PMID:26209783

  9. Instrumentation and Control Needs for Reliable Operation of Lunar Base Surface Nuclear Power Systems

    NASA Technical Reports Server (NTRS)

    Turso, James; Chicatelli, Amy; Bajwa, Anupa

    2005-01-01

    needed to enable this critical functionality of autonomous operation. It will be imperative to consider instrumentation and control requirements in parallel with system configuration development, so as to identify control-related as well as integrated-system-related problem areas early and avoid potentially expensive work-arounds. This paper presents an overview of the enabling technologies necessary for the development of reliable, autonomous lunar base nuclear power systems, with an emphasis on system architectures and off-the-shelf algorithms rather than hardware. Autonomy needs are presented in the context of a hypothetical lunar base nuclear power system. The scenarios and applications presented are hypothetical in nature, based on information from open-literature sources, and are only intended to provoke thought and provide motivation for the use of autonomous, intelligent control and diagnostics.

  10. Is Learner Self-Assessment Reliable and Valid in a Web-Based Portfolio Environment for High School Students?

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Liang, Chaoyun; Chen, Yi-Hui

    2013-01-01

    This study explored the reliability and validity of Web-based portfolio self-assessment. Participants were 72 senior high school students enrolled in a computer application course. The students created learning portfolios, viewed peers' work, and performed self-assessment on the Web-based portfolio assessment system. The results indicated: 1)…

  11. Is Teacher Assessment Reliable or Valid for High School Students under a Web-Based Portfolio Environment?

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Wu, Bing-Hong

    2012-01-01

    This study explored the reliability and validity of teacher assessment under a Web-based portfolio assessment environment (or Web-based teacher portfolio assessment). Participants were 72 eleventh graders taking the "Computer Application" course. The students perform portfolio creation, inspection, self- and peer-assessment using the Web-based…

  12. Pareto-based efficient stochastic simulation-optimization for robust and reliable groundwater management

    NASA Astrophysics Data System (ADS)

    Sreekanth, J.; Moore, Catherine; Wolf, Leif

    2016-02-01

    Simulation-optimization methods are used to develop optimal solutions for a variety of groundwater management problems. The true optimality of these solutions often depends on the reliability of the simulation model. Therefore, where model predictions are uncertain due to parameter uncertainty, this should be accounted for within the optimization formulation to ensure that solutions are robust and reliable. In this study, we present a stochastic multi-objective formulation of the otherwise single-objective groundwater optimization problem by considering minimization of prediction uncertainty as an additional objective. The proposed method is illustrated by applying it to an injection bore field design problem. The primary objective of optimization is maximization of the total volume of water injected into a confined aquifer, subject to the constraints that the resulting increases in hydraulic head in a set of control bores remain below specified target levels. Both bore locations and injection rates were considered as optimization variables. Prediction uncertainty is estimated using stacks of uncertain parameters and is explicitly minimized to produce robust and reliable solutions. Post-optimization Monte Carlo reliability analysis showed that while a stochastic single-objective optimization failed to provide reliable solutions with a stack size of 50, the proposed method resulted in many robust solutions with reliability close to 1.0. Results of the comparison indicate potential gains in efficiency of the stochastic multi-objective formulation for identifying robust and reliable groundwater management strategies.
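
    A schematic of the stack evaluation just described (Python); the simulate function, the head-rise limit, and the penalty are invented stand-ins for the authors' groundwater model, not their implementation:

        import numpy as np

        def stack_objectives(design, stack, simulate):
            """Objectives for one design: (mean injected volume, spread across the stack)."""
            volumes = []
            for params in stack:                      # one uncertain parameter set each
                volume, head_rise = simulate(design, params)
                if head_rise > 1.0:                   # assumed target head-rise limit (m)
                    volume -= 1e3                     # penalize the constraint violation
                volumes.append(volume)
            v = np.asarray(volumes)
            return v.mean(), v.std()                  # maximize mean, minimize spread

        def toy_simulate(design, params):             # stand-in for the groundwater model
            volume = design["rate"] * 365.0
            head_rise = design["rate"] * params["response"]   # linear response, assumed
            return volume, head_rise

        stack = [{"response": r} for r in (0.8, 1.0, 1.3, 1.7)]
        print(stack_objectives({"rate": 0.9}, stack, toy_simulate))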

  13. A Bayesian-Based EDA Tool for Nano-circuits Reliability Calculations

    NASA Astrophysics Data System (ADS)

    Ibrahim, Walid; Beiu, Valeriu

    As the sizes of (nano-)devices are aggressively scaled deep into the nanometer range, the design and manufacturing of future (nano-)circuits will become extremely complex, will inevitably introduce more defects, and will leave circuit functioning adversely affected by transient faults. Therefore, accurately calculating the reliability of future designs will become a very important aspect for (nano-)circuit designers as they investigate several design alternatives to optimize the trade-offs between the conflicting metrics of area-power-energy-delay and reliability. This paper introduces a novel generic technique for the accurate calculation of the reliability of future nano-circuits. Our aim is to provide both educational and research institutions (as well as the semiconductor industry at a later stage) with an accurate and easy-to-use tool for closely comparing the reliability of different design alternatives, and for easily selecting the design that best fits a set of given (design) constraints. Moreover, the reliability model generated by the tool should empower designers with the unique opportunity of understanding the influence individual gates have on the design’s overall reliability, and of identifying those (few) gates which impact the design’s reliability most significantly.
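
    The per-gate view described here can be illustrated with a brute-force calculation on a two-gate circuit (Python); exhaustive enumeration stands in for the tool's Bayesian machinery, and the gate error probabilities are arbitrary:

        from itertools import product

        def nand(x, y):
            return 1 - (x & y)

        def circuit_reliability(a, b, c, eps1, eps2):
            """P(correct output) of y = NAND(NAND(a, b), c) with independent gate flips."""
            truth = nand(nand(a, b), c)
            p_ok = 0.0
            for f1, f2 in product((0, 1), repeat=2):   # 1 = that gate's output flipped
                p = (eps1 if f1 else 1 - eps1) * (eps2 if f2 else 1 - eps2)
                y = nand(nand(a, b) ^ f1, c) ^ f2
                if y == truth:
                    p_ok += p                          # some flips are logically masked
            return p_ok

        # With c = 0 the second NAND masks the first gate's errors entirely:
        print(circuit_reliability(1, 1, 0, eps1=0.01, eps2=0.01))   # 0.99
        print(circuit_reliability(1, 1, 1, eps1=0.01, eps2=0.01))   # 0.9802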

  14. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    SciTech Connect

    Dana L. Kelly; Ronald L. Boring; Ali Mosleh; Carol Smidts

    2011-10-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  15. An acetylcholinesterase-based chronoamperometric biosensor for fast and reliable assay of nerve agents.

    PubMed

    Pohanka, Miroslav; Adam, Vojtech; Kizek, Rene

    2013-01-01

    The enzyme acetylcholinesterase (AChE) is an important part of the cholinergic nervous system, where it stops neurotransmission by hydrolysis of the neurotransmitter acetylcholine. It is sensitive to inhibition by organophosphate and carbamate insecticides, some Alzheimer disease drugs, secondary metabolites such as aflatoxins, and nerve agents used in chemical warfare. When immobilized on a sensor (physico-chemical transducer), it can be used for assay of these inhibitors. In the experiments described herein, an AChE-based electrochemical biosensor using screen printed electrode systems was prepared. The biosensor was used for assay of nerve agents such as sarin, soman, tabun and VX. The limits of detection achieved in a measuring protocol lasting ten minutes were 7.41 × 10⁻¹² mol/L for sarin, 6.31 × 10⁻¹² mol/L for soman, 6.17 × 10⁻¹¹ mol/L for tabun, and 2.19 × 10⁻¹¹ mol/L for VX, respectively. The assay was reliable, with minor interferences caused by the organic solvents ethanol, methanol, isopropanol and acetonitrile. Isopropanol was chosen as a suitable medium for processing lipophilic samples. PMID:23999806

  16. A new multilocus approach for a reliable DNA-based identification of Armillaria species.

    PubMed

    Tsykun, Tetyana; Rigling, Daniel; Prospero, Simone

    2013-01-01

    In this paper we highlight and critically discuss limitations to molecular methods for identification of fungi via the example of the basidiomycete genus Armillaria. We analyzed a total of 144 sequences of three DNA regions commonly used for identifying fungi (ribosomal IGS-1 and ITS regions, translation elongation factor-1 alpha gene) from 48 specimens of six Armillaria species occurring in Europe (A. cepistipes, A. ostoyae, A. gallica, A. borealis, A. mellea, A. tabescens). Species were identified by comparing newly obtained sequences with those from the NCBI database, phylogenetic analyses and PCR-RFLP analyses of the three regions considered. When analyzed separately, no single gene region could unambiguously identify all six Armillaria species because of low interspecific and high intrasequence variability. We therefore developed a multilocus approach, which involves the stepwise use of the three regions. Following this scheme, all six species could be clearly discriminated. Our study suggests that, to improve the reliability of DNA-based techniques for species identification, multiple genes or intergenic regions should be analyzed. PMID:23449075

  17. Shock reliability analysis and improvement of MEMS electret-based vibration energy harvesters

    NASA Astrophysics Data System (ADS)

    Renaud, M.; Fujita, T.; Goedbloed, M.; de Nooijer, C.; van Schaijk, R.

    2015-10-01

    Vibration energy harvesters can serve as a replacement solution to batteries for powering tire pressure monitoring systems (TPMS). Autonomous wireless TPMS powered by microelectromechanical system (MEMS) electret-based vibration energy harvester have been demonstrated. The mechanical reliability of the MEMS harvester still has to be assessed in order to bring the harvester to the requirements of the consumer market. It should survive the mechanical shocks occurring in the tire environment. A testing procedure to quantify the shock resilience of harvesters is described in this article. Our first generation of harvesters has a shock resilience of 400 g, which is far from being sufficient for the targeted application. In order to improve this aspect, the first important aspect is to understand the failure mechanism. Failure is found to occur in the form of fracture of the device’s springs. It results from impacts between the anchors of the springs when the harvester undergoes a shock. The shock resilience of the harvesters can be improved by redirecting these impacts to nonvital parts of the device. With this philosophy in mind, we design three types of shock absorbing structures and test their effect on the shock resilience of our MEMS harvesters. The solution leading to the best results consists of rigid silicon stoppers covered by a layer of Parylene. The shock resilience of the harvesters is brought above 2500 g. Results in the same range are also obtained with flexible silicon bumpers, which are simpler to manufacture.

  18. Optimization of Systems with Uncertainty: Initial Developments for Performance, Robustness and Reliability Based Designs

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of probability density functions and moments, optimization problems that pursue performance-, robustness- and reliability-based designs are studied. By specifying the desired outputs first in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of the design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed-form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem, and a meaningful probabilistic description of its solution, is provided in the examples. In the stability control problem the analysis shows that standard deterministic approaches lead to designs with a high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.

  19. Degradation reliability modeling based on an independent increment process with quadratic variance

    NASA Astrophysics Data System (ADS)

    Wang, Zhihua; Zhang, Yongbo; Wu, Qiong; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar

    2016-03-01

    Degradation testing is an important technique for assessing life time information of complex systems and highly reliable products. Motivated by fatigue crack growth (FCG) data and our previous study, this paper develops a novel degradation modeling approach, in which degradation is represented by an independent increment process with linear mean and general quadratic variance functions of test time or transformed test time if necessary. Based on the constructed degradation model, closed-form expressions of failure time distribution (FTD) and its percentiles can be straightforwardly derived and calculated. A one-stage method is developed to estimate model parameters and FTD. Simulation studies are conducted to validate the proposed approach, and the results illustrate that the approach can provide reasonable estimates even for small sample size situations. Finally, the method is verified by the FCG data set given as the motivating example, and the results show that it can be considered as an effective degradation modeling approach compared with the multivariate normal model and graphic approach.
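
    In this model class the failure time distribution indeed has a closed form: if the degradation X(t) is normal with linear mean \mu t and quadratic variance v(t) = a t^{2} + b t + c, then for a failure threshold D (degradation increasing toward D; a sketch of the generic form rather than the paper's exact notation)

        F_T(t) = P\{X(t) \ge D\} = \Phi\left(\frac{\mu t - D}{\sqrt{a t^{2} + b t + c}}\right).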

  1. A GC/MS-based metabolomic approach for reliable diagnosis of phenylketonuria.

    PubMed

    Xiong, Xiyue; Sheng, Xiaoqi; Liu, Dan; Zeng, Ting; Peng, Ying; Wang, Yichao

    2015-11-01

    ), which showed that phenylacetic acid may be used as a reliable discriminator for the diagnosis of PKU. The low false positive rate (1-specificity, 0.064) can be eliminated or at least greatly reduced by simultaneously referring to other markers, especially phenylpyruvic acid, a unique marker in PKU. Additionally, this standard was obtained with high sensitivity and specificity in a less invasive manner for diagnosing PKU compared with the Phe/Tyr ratio. Therefore, we conclude that urinary metabolomic information based on the improved oximation-silylation method together with GC/MS may be reliable for the diagnosis and differential diagnosis of PKU. PMID:26410738

  2. Autonomous, Decentralized Grid Architecture: Prosumer-Based Distributed Autonomous Cyber-Physical Architecture for Ultra-Reliable Green Electricity Networks

    SciTech Connect

    2012-01-11

    GENI Project: Georgia Tech is developing a decentralized, autonomous, internet-like control architecture and control software system for the electric power grid. Georgia Tech’s new architecture is based on the emerging concept of electricity prosumers—economically motivated actors that can produce, consume, or store electricity. Under Georgia Tech’s architecture, all of the actors in an energy system are empowered to offer associated energy services based on their capabilities. The actors achieve their sustainability, efficiency, reliability, and economic objectives, while contributing to system-wide reliability and efficiency goals. This is in marked contrast to the current one-way, centralized control paradigm.

  3. Reliability of Nationwide Prevalence Estimates of Dementia: A Critical Appraisal Based on Brazilian Surveys

    PubMed Central

    2015-01-01

    Background The nationwide dementia prevalence is usually calculated by applying the results of local surveys to countries’ populations. To evaluate the reliability of such estimations in developing countries, we chose Brazil as an example. We carried out a systematic review of dementia surveys, ascertained their risk of bias, and present the best estimate of occurrence of dementia in Brazil. Methods and Findings We carried out an electronic search of PubMed, Latin-American databases, and a Brazilian thesis database for surveys focusing on dementia prevalence in Brazil. The systematic review was registered at PROSPERO (CRD42014008815). Among the 35 studies found, 15 analyzed population-based random samples. However, most of them utilized inadequate criteria for diagnostics. Six studies without these limitations were further analyzed to assess the risk of selection, attrition, outcome and population bias as well as several statistical issues. All the studies presented moderate or high risk of bias in at least two domains due to the following features: high non-response, inaccurate cut-offs, and doubtful accuracy of the examiners. Two studies had limited external validity due to high rates of illiteracy or low income. The three studies with adequate generalizability and the lowest risk of bias presented a prevalence of dementia between 7.1% and 8.3% among subjects aged 65 years and older. However, after adjustment for accuracy of screening, the best available evidence points towards a figure between 15.2% and 16.3%. Conclusions The risk of bias may strongly limit the generalizability of dementia prevalence estimates in developing countries. Extrapolations that have already been made for Brazil and Latin America were based on a prevalence that should have been adjusted for screening accuracy or not used at all due to severe bias. Similar evaluations regarding other developing countries are needed in order to verify the scope of these limitations. PMID:26131563

  4. Stochastic Analysis of Waterhammer and Applications in Reliability-Based Structural Design for Hydro Turbine Penstocks

    SciTech Connect

    Zhang, Qin Fen; Karney, Professor Byran W.; Suo, Prof. Lisheng; Colombo, Dr. Andrew

    2011-01-01

    Abstract: The randomness of transient events, and the variability in factors which influence the magnitudes of the resulting pressure fluctuations, ensure that waterhammer and surges in a pressurized pipe system are inherently stochastic. To bolster and improve reliability-based structural design, a stochastic model of transient pressures is developed for water conveyance systems in hydropower plants. The statistical characteristics and probability distributions of key factors in boundary conditions, initial states, and hydraulic system parameters are analyzed based on a large record of observed data from hydro plants in China; the statistical characteristics and probability distributions of annual maximum waterhammer pressures are then simulated using the Monte Carlo method and verified against an analytical probabilistic model for a simplified pipe system. In addition, the characteristics (annual occurrence, sustained period, and probability distribution) of hydraulic loads for both steady and transient states are discussed. Using an example of penstock structural design, it is shown that the total waterhammer pressure should be split into two individual random variable loads, the steady/static pressure and the waterhammer pressure rise during transients, and that different partial load factors should be applied to each individual load to reflect its unique physical and stochastic features. In particular, the normative load (usually the unfavorable value at the 95th percentile) for the steady/static hydraulic pressure should be taken from the probability distribution of its maximum values during the pipe's design life, while for the waterhammer pressure rise, as the second variable load, the probability distribution of its annual maximum values is used to determine its normative load.
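
    The Monte Carlo step can be miniaturized as follows (Python); the Joukowsky relation Δp = ρ a Δv replaces the full transient solver, and all distributions and parameter values are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(0)

        def annual_max_rise(n_years=10_000, events_per_year=20):
            """Sample annual maxima of the waterhammer pressure rise (Pa)."""
            rho = 1000.0                                              # water density, kg/m^3
            a = rng.normal(1000.0, 50.0, (n_years, events_per_year))  # wave speed, m/s
            dv = rng.lognormal(0.0, 0.4, (n_years, events_per_year))  # velocity change, m/s
            return (rho * a * dv).max(axis=1)                         # annual maximum per year

        annual_max = annual_max_rise()
        print("95th-percentile annual-max pressure rise:",
              np.percentile(annual_max, 95))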

  5. Reliability prediction for evolutionary product in the conceptual design phase using neural network-based fuzzy synthetic assessment

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Huang, Hong-Zhong; Ling, Dan

    2013-03-01

    Reliability prediction plays an important role in product lifecycle management. It has been used to assess various reliability indices (such as reliability, availability and mean time to failure) before a new product is physically built and/or put into use. In this article, a novel approach is proposed to facilitate reliability prediction for evolutionary products during their early design stages. Due to the lack of sufficient data in the conceptual design phase, reliability prediction is not a straightforward task. Taking account of the information from existing similar products and knowledge from domain experts, a neural network-based fuzzy synthetic assessment (FSA) approach is proposed to predict the reliability indices that a new evolutionary product could achieve. The proposed approach takes advantage of the capability of the back-propagation neural network in terms of constructing highly non-linear functional relationship and combines both the data sets from existing similar products and subjective knowledge from domain experts. It is able to reach a more accurate prediction than the conventional FSA method reported in the literature. The effectiveness and advantages of the proposed method are demonstrated via a case study of the fuel injection pump and a comparative study.

  6. Fundamental Study on Rationalization of Maintenance Procedure based upon Reliability Analysis

    NASA Astrophysics Data System (ADS)

    Kameda, Hideyuki; Yamashita, Koji

    We have developed the Reliability Analysis System for Protective Relays (RASPR), which is composed of a database of protective relay equipment and its failures and a reliability analysis module. Data on relay systems and their failures have been collected since fiscal 1997. In this paper, the following conclusions are drawn from the reliability analysis for protective relays: 1) omitting the interruption of automatic checking during a fault has no influence on reliability measures such as availability; 2) the analysis of the effect of shortening the required testing time shows a clear improvement in availability; 3) the periodic testing interval could be extended if 70% of the failures currently discovered during periodic testing were instead detected by continuous monitoring.

  7. CardioGuard: A Brassiere-Based Reliable ECG Monitoring Sensor System for Supporting Daily Smartphone Healthcare Applications

    PubMed Central

    Kwon, Sungjun; Kim, Jeehoon; Kang, Seungwoo; Lee, Youngki; Baek, Hyunjae

    2014-01-01

    Abstract We propose CardioGuard, a brassiere-based reliable electrocardiogram (ECG) monitoring sensor system for supporting daily smartphone healthcare applications. It is designed to satisfy two key requirements for user-unobtrusive daily ECG monitoring: reliability of ECG sensing and usability of the sensor. The system was validated through extensive evaluations. The evaluation results showed that the CardioGuard sensor reliably measures the ECG during 12 representative daily activities covering diverse movement levels; 89.53% of QRS peaks were detected on average. A questionnaire-based user study with 15 participants showed that the CardioGuard sensor was comfortable and unobtrusive. Additionally, a signal-to-noise ratio test and a washing durability test were conducted to show the high-quality sensing of the proposed sensor and its physical durability in practical use, respectively. PMID:25405527

  8. Delay Analysis of Car-to-Car Reliable Data Delivery Strategies Based on Data Mulling with Network Coding

    NASA Astrophysics Data System (ADS)

    Park, Joon-Sang; Lee, Uichin; Oh, Soon Young; Gerla, Mario; Lun, Desmond Siumen; Ro, Won Woo; Park, Joonseok

    Vehicular ad hoc networks (VANETs) aim to enhance vehicle navigation safety by providing an early warning system: any risk of accident is communicated through wireless links between vehicles. For the warning system to work, it is crucial that safety messages be reliably delivered to the target vehicles in a timely manner; reliable and timely data dissemination is thus the key building block of a VANET. A data mulling technique combined with three coding strategies (network coding, erasure coding, and repetition coding) is proposed for this reliable and timely data dissemination service. In particular, vehicles traveling in the opposite direction on a highway are exploited as data mules, mobile nodes that physically carry data to their destinations, to overcome the intermittent network connectivity caused by sparse vehicle traffic. Using analytic models, we show that in such a highway data-mulling scenario the network-coding-based strategy outperforms the erasure-coding and repetition-based strategies.

  9. Validity and Reliability of the Turkish Version of Needs Based Biopsychosocial Distress Instrument for Cancer Patients (CANDI)

    PubMed Central

    Beyhun, Nazim Ercument; Can, Gamze; Tiryaki, Ahmet; Karakullukcu, Serdar; Bulut, Bekir; Yesilbas, Sehbal; Kavgaci, Halil; Topbas, Murat

    2016-01-01

    Background Needs based biopsychosocial distress instrument for cancer patients (CANDI) is a scale based on needs arising due to the effects of cancer. Objectives The aim of this research was to determine the reliability and validity of the CANDI scale in the Turkish language. Patients and Methods The study was performed with the participation of 172 cancer patients aged 18 and over. Factor analysis (principal components analysis) was used to assess construct validity. Criterion validities were tested by computing Spearman correlation between CANDI and hospital anxiety depression scale (HADS), and brief symptom inventory (BSI) (convergent validity) and quality of life scales (FACT-G) (divergent validity). Test-retest reliabilities and internal consistencies were measured with intraclass correlation (ICC) and Cronbach-α. Results A three-factor solution (emotional, physical and social) was found with factor analysis. Internal reliability (α = 0.94) and test-retest reliability (ICC = 0.87) were significantly high. Correlations between CANDI and HADS (rs = 0.67), and BSI (rs = 0.69) and FACT-G (rs = -0.76) were moderate and significant in the expected direction. Conclusions CANDI is a valid and reliable scale in cancer patients with a three-factor structure (emotional, physical and social) in the Turkish language. PMID:27621931
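
    The internal-consistency figure reported above (α = 0.94) is Cronbach's alpha, computable directly from a respondents-by-items score matrix; a minimal sketch with toy data (Python):

        import numpy as np

        def cronbach_alpha(scores):
            """scores: (n_respondents, n_items) array of item scores."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()    # sum of per-item variances
            total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
            return k / (k - 1) * (1 - item_var / total_var)

        # Toy data: 5 respondents x 4 items
        print(round(cronbach_alpha([[3, 4, 3, 4],
                                    [2, 2, 3, 2],
                                    [4, 5, 5, 4],
                                    [1, 2, 1, 2],
                                    [3, 3, 4, 3]]), 3))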

  10. Nanoparticle-based cancer treatment: can delivered dose and biological dose be reliably modeled and quantified?

    NASA Astrophysics Data System (ADS)

    Hoopes, P. Jack; Petryk, Alicia A.; Giustini, Andrew J.; Stigliano, Robert V.; D'Angelo, Robert N.; Tate, Jennifer A.; Cassim, Shiraz M.; Foreman, Allan; Bischof, John C.; Pearce, John A.; Ryan, Thomas

    2011-03-01

    Essential developments in the reliable and effective use of heat in medicine include: 1) the ability to model energy deposition and the resulting thermal distribution and tissue damage (Arrhenius models) over time in 3D, 2) the development of non-invasive thermometry and imaging for tissue damage monitoring, and 3) the development of clinically relevant algorithms for accurate prediction of the biological effect resulting from a delivered thermal dose in mammalian cells, tissues, and organs. The accuracy and usefulness of this information varies with the type of thermal treatment, the sensitivity and accuracy of tissue assessment, and the volume, shape, and heterogeneity of the tumor target and normal tissue. That said, without the development of an algorithm that has allowed the comparison and prediction of the effects of hyperthermia in a wide variety of tumor and normal tissues and settings (cumulative equivalent minutes, CEM), hyperthermia would never have achieved clinical relevance. A new hyperthermia technology, magnetic nanoparticle-based hyperthermia (mNPH), has distinct advantages over previous techniques: the ability to target the heat to individual cancer cells (with a nontoxic nanoparticle), and to excite the nanoparticles noninvasively with a noninjurious magnetic field, thus sparing associated normal cells and greatly improving the therapeutic ratio. As such, this modality has great potential as a primary and adjuvant cancer therapy. Although the targeted and safe nature of the noninvasive external activation (hysteretic heating) is a tremendous asset, the large number of therapy-based variables and the lack of an accurate and useful method for predicting, assessing and quantifying mNP dose and treatment effect are a major obstacle to moving the technology into routine clinical practice. Among other parameters, mNPH will require the accurate determination of specific nanoparticle heating capability, the total nanoparticle content and biodistribution in

  11. [Study on the Reliability Assessment Method of Heavy Vehicle Gearbox Based on Spectrometric Analysis].

    PubMed

    Bao, Ke; Zhang, Zhong; Cao, Yuan-fu; Chen, Yi-jie

    2015-04-01

    Spectrometric oil analysis is of great importance for wear condition monitoring of gearboxes. In this work, the contents of the main elemental compositions during bench tests of a heavy vehicle gearbox were first obtained by atomic emission spectrometric oil analysis. Correlation analysis of the test data and analysis of the wear mechanism were then carried out to identify the metal element that best describes the wear and failure of the gearbox. The spectrometric data after oil filling/changing were corrected, and the evolution of the contents of the main elements during the tests was expressed as linear functions. The reliability assessment was then performed taking into account both the degradation law and the scatter of the test data, using the mean and standard deviation of the normal distribution of the spectrometric data at each time point. Finally, the influence of the threshold is discussed. It is shown that the content of the metal element Cu, obtained by spectrometric oil analysis of different samples, can be used to assess the reliability of a heavy vehicle gearbox, because Cu is closely related to the general wear state of the gearbox and is easy to measure. When the Cu content threshold is treated as a constant, a larger threshold means higher reliability at a given time, and the mean value of the threshold has a significant impact on the reliability assessment results for R > 0.9. When the threshold is treated as a random variable, larger dispersion of the threshold means a smaller slope of reliability against time, and hence lower gearbox reliability at a given time for R > 0.9. In this study, spectrometric oil analysis and probability statistics are used together for the reliability assessment of a gearbox, which extends the application range of spectrometric analysis. PMID:26197588
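
    For a fixed threshold the assessment reduces to a normal-distribution tail calculation; a sketch (Python) under an assumed linear growth of Cu content with test time (all numbers invented for illustration):

        from math import erf, sqrt

        def reliability(t, c, slope, intercept, sigma):
            """R(t) = P(Cu(t) < c), with Cu(t) ~ Normal(intercept + slope*t, sigma^2)."""
            mu = intercept + slope * t            # fitted linear wear trend
            return 0.5 * (1 + erf((c - mu) / (sigma * sqrt(2))))   # standard normal CDF

        for t in (100, 250, 350):                 # test hours
            print(t, round(reliability(t, c=20.0, slope=0.05,
                                       intercept=2.0, sigma=1.5), 4))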

  12. Testing the Reliability and Sensitivity of Foraminiferal Transfer Functions Based on the Modern Analog Technique (MAT)

    NASA Astrophysics Data System (ADS)

    Lac, D.; Cullen, J. L.; Martin, A.

    2004-05-01

    analogs for all the samples within a particular set of duplicates, with no regional pattern in this number observed. The warm and cold SST estimates generated using the SSTs above each of the 5 chosen analogs exhibit a wide range of variation, particularly for the three sample sets from the high latitudes. The three subpolar sample sets exhibit a 3.4, 1.1, and 1.0 degree C range in their cold SST estimates. There is no clear relationship between differences in SST estimates and differences in the average dissimilarities within duplicate sample sets. Using 10 instead of 5 modern analogs to estimate SSTs produces somewhat better results for 4 of the 8 sample sets and similar results for the remaining 4. Our results suggest that foraminiferal samples with dissimilarity values of up to 0.15 are not detectably different from duplicate foraminiferal census counts and should be considered excellent modern faunal analogs for any fossil sample. In addition, high latitude samples seem to produce somewhat less reliable SST estimates than low latitude samples. Finally, our results suggest that, when estimating past SSTs, averaging the SSTs above the 10 best analogs produces more accurate (and precise) results, particularly in situations where the Global Data Base contains adequate modern analogs.

  13. A Critique of Raju and Oshima's Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Wang, Wen-Chung

    2008-01-01

    Raju and Oshima (2005) proposed two prophecy formulas based on item response theory in order to predict the reliability of ability estimates for a test after change in its length. The first prophecy formula is equivalent to the classical Spearman-Brown prophecy formula. The second prophecy formula is misleading because of an underlying false…
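
    For context, the classical Spearman-Brown prophecy formula to which the first formula is said to be equivalent: lengthening a test by a factor k changes a reliability of \rho into

        \rho_{k} = \frac{k\,\rho}{1 + (k - 1)\,\rho}.

    For example, doubling a test (k = 2) with \rho = 0.6 predicts \rho_{2} = 1.2/1.6 = 0.75.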

  14. A Reliable and Inexpensive Method of Nucleic Acid Extraction for the PCR-Based Detection of Diverse Plant Pathogens

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A reliable extraction method is described for the preparation of total nucleic acids from several plant genera for subsequent detection of plant pathogens by PCR-based techniques. By the combined use of a modified CTAB (cetyltrimethylammonium bromide) extraction protocol and a semi-automatic homogen...

  15. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    ERIC Educational Resources Information Center

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  16. Tool for Assessing Responsibility-Based Education (TARE): Instrument Development, Content Validity, and Inter-Rater Reliability

    ERIC Educational Resources Information Center

    Wright, Paul M.; Craig, Mark W.

    2011-01-01

    Numerous scholars have stressed the importance of personal and social responsibility in physical activity settings; however, there is a lack of instrumentation to study the implementation of responsibility-based teaching strategies. The development, content validity, and initial inter-rater reliability testing of the Tool for Assessing…

  17. Content Validity and Inter-Rater Reliability of the Halliwick-Concept-Based Instrument "Swimming with Independent Measure"

    ERIC Educational Resources Information Center

    Srsen, Katja Groleger; Vidmar, Gaj; Pikl, Masa; Vrecar, Irena; Burja, Cirila; Krusec, Klavdija

    2012-01-01

    The Halliwick concept is widely used in different settings to promote joyful movement in water and swimming. To assess the swimming skills and progression of an individual swimmer, a valid and reliable measure should be used. The Halliwick-concept-based Swimming with Independent Measure (SWIM) was introduced for this purpose. We aimed to determine…

  18. Reliability-centered maintenance for ground-based large optical telescopes and radio antenna arrays

    NASA Astrophysics Data System (ADS)

    Marchiori, G.; Formentin, F.; Rampini, F.

    2014-07-01

    In recent years, EIE GROUP has been increasingly involved in large optical telescope and radio antenna array projects. In this context, the paper describes a fundamental aspect of the Logistic Support Analysis (LSA) process: the application of the Reliability-Centered Maintenance (RCM) methodology to the generation of maintenance plans for ground-based large optical telescopes and radio antenna arrays. This helps maintenance engineers make sure that the telescopes continue to work properly, doing what their users require of them in their present operating conditions. The main objective of the RCM process is to establish the complete maintenance regime, with the safe minimum of required maintenance, carried out without any risk to personnel, telescope, or subsystems. A correct application of RCM also increases cost effectiveness, telescope uptime, and item availability, and provides a greater understanding of the level of risk that the organization is managing. At the same time, engineers must make a great effort from the initial phase of the project onward to obtain a telescope requiring easy maintenance activities and simple replacement of the major assemblies, taking special care over access design and item location, and over the implementation and design of special lifting equipment and handling devices for the heavy items. This maintenance engineering framework is based on seven points, which lead to the main steps of the RCM program. The initial steps of the RCM process consist of system selection and data collection (MTBF, MTTR, etc.), definition of system boundaries and operating context, telescope description with the use of functional block diagrams, and the running of a FMECA to address the dominant causes of equipment failure and lay down the Critical Items List. In the second part of the process the RCM logic is applied, which helps to determine the appropriate maintenance tasks for each identified failure mode. Once

  19. Improved Reliability of InGaN-Based Light-Emitting Diodes by HfO2 Passivation Layer.

    PubMed

    Park, Seung Hyun; Kim, Yoon Seok; Kim, Tae Hoon; Ryu, Sang Wan

    2016-02-01

    We utilized a passivation layer to reduce the leakage current and improve the reliability characteristics of GaN-based light-emitting diodes. The electrical and optical characteristics of the fabricated LEDs were characterized by current-voltage and optical power measurements. The HfO2 passivation layer showed no optical power degradation and suppressed leakage current. The low deposition temperature of sputtered HfO2 is responsible for the improved reliability of the LEDs because it suppresses the diffusion of hydrogen plasma into GaN, which would otherwise form harmful Mg-H complexes. PMID:27433667

  20. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helps identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  1. Application of a microcomputer-based database management system to distribution system reliability evaluation

    SciTech Connect

    Hsu, Y.Y.; Chen, L.M.; Chen, J.L.; Hsueh, M.C.; Lin, C.T.; Chen, Y.W.; Chen, J.J.; Liu, T.S.S.; Chen, W.C.

    1990-01-01

    The experience of applying a database management system (DBMS) to handle the large amounts of data involved in distribution system reliability evaluation is reported. To demonstrate the capability of the DBMS in data manipulation, reliability evaluation of a distribution system in Taiwan is performed using a DBMS installed on an IBM PC-AT. It is found that using a DBMS is a very efficient way of organizing the data required by distribution planners. Moreover, the DBMS approach is very cost-effective, since it runs on a personal computer.

  2. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.
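
    The "second-moment statistics" the abstract refers to can be illustrated with a mean-value first-order second-moment (MVFOSM) calculation on a mode-I fracture limit state, g = K_Ic - Y*sigma*sqrt(pi*a). This is a minimal sketch of that style of calculation, not the paper's formulation; all input statistics below are invented for illustration.

```python
import math

# Limit state g = K_Ic - Y * sigma * sqrt(pi * a); failure when g < 0.
# Hypothetical means and standard deviations (independent variables assumed).
K_mean, K_sd = 60.0, 6.0      # fracture toughness, MPa*sqrt(m)
s_mean, s_sd = 150.0, 15.0    # applied stress, MPa
a_mean, a_sd = 0.010, 0.002   # crack length, m
Y = 1.12                      # geometry factor (deterministic here)

g_mean = K_mean - Y * s_mean * math.sqrt(math.pi * a_mean)

# First-order variance: sum of (dg/dx_i * sd_i)^2 evaluated at the means.
dg_dK = 1.0
dg_ds = -Y * math.sqrt(math.pi * a_mean)
dg_da = -Y * s_mean * math.sqrt(math.pi) / (2.0 * math.sqrt(a_mean))
g_sd = math.sqrt((dg_dK * K_sd) ** 2 + (dg_ds * s_sd) ** 2 + (dg_da * a_sd) ** 2)

beta = g_mean / g_sd          # second-moment reliability index
pf = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))  # normal approximation
print(f"beta = {beta:.2f}, Pf ~ {pf:.2e}")
```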

  3. A PRACTICAL GEOTECHNICAL RELIABILITY BASED DESIGN EMPLOYING RESPONSE SURFACE - SEISMIC DESIGN OF IRRIGATION CHANNEL ON LIQUEFIABLE GROUND -

    NASA Astrophysics Data System (ADS)

    Otake, Yu; Honjo, Yusuke

    The paper proposes a simple yet practical reliability based design (RBD) scheme which takes into account the characteristics of geotechnical design. The proposed scheme evaluates reliability by separating the geotechnical analysis from the uncertainty analysis, and reuniting them through a response surface (RS) and Monte Carlo simulation (MCS). As a result, RBD can be carried out easily while making the best use of newly developed geotechnical analysis tools. The philosophy and procedures of the scheme are presented through a real and complex geotechnical design, i.e. the seismic design of a 12 km long irrigation channel against liquefaction. Through this design example, the merits of geotechnical RBD are shown, such as the influence of the amount and quality of investigation data on the reliability evaluation, the major sources of uncertainty in RBD, and suggestions for prioritizing countermeasure locations and additional investigations.
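
    As a concrete illustration of the scheme's two-stage structure, the sketch below fits a quadratic response surface to a handful of hypothetical geotechnical analysis runs and then propagates parameter uncertainty through that surface by Monte Carlo simulation. The limit-state values and parameter distributions are invented for illustration, not taken from the paper.

```python
import numpy as np

# Stage 1: a few expensive geotechnical analyses, run off-line.
# x = soil friction angle (deg), fs = computed safety factor (hypothetical).
x_runs = np.array([28.0, 30.0, 32.0, 34.0, 36.0])
fs_runs = np.array([0.85, 0.95, 1.05, 1.18, 1.30])

# Fit a quadratic response surface FS(x) to the analysis results.
coeffs = np.polyfit(x_runs, fs_runs, deg=2)
fs_surface = np.poly1d(coeffs)

# Stage 2: Monte Carlo simulation on the cheap surface.
rng = np.random.default_rng(0)
phi_samples = rng.normal(loc=32.0, scale=2.0, size=100_000)  # assumed distribution
fs_samples = fs_surface(phi_samples)

# Failure when the safety factor drops below 1.
pf = np.mean(fs_samples < 1.0)
print(f"Estimated failure probability: {pf:.3f}")
```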

  4. Comprehensive reliability allocation method for CNC lathes based on cubic transformed functions of failure mode and effects analysis

    NASA Astrophysics Data System (ADS)

    Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin

    2015-03-01

    Reliability allocation for computer numerically controlled (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. Aiming to solve the problem of reliability allocation for CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponential transformed FMEA method are investigated. Subsequently, a cubic transformed function is established to overcome these limitations. Properties of the new transformed function are discussed by considering the failure severity and the failure occurrence. Designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as an example to verify the new allocation method. Seven criteria are considered to compare the results of the new method with traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.
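
    The paper's exact cubic transform is not reproduced in the abstract, so the sketch below uses a generic cubic weighting of normalized FMEA scores purely to illustrate the allocation mechanics: each subsystem's score is transformed, the weights are normalized, and the system-level failure-rate budget is split in proportion. Function name, scores and the "amplitude" knob are all hypothetical.

```python
import numpy as np

def allocate_failure_rates(scores, lam_system, amplitude=1.0):
    """Split a system failure-rate budget across subsystems.

    scores    : FMEA severity*occurrence scores per subsystem (higher = worse)
    lam_system: target system failure rate (failures/hour)
    amplitude : transform amplitude (hypothetical knob, cf. the paper)
    """
    s = np.asarray(scores, dtype=float)
    s = s / s.max()               # normalize to (0, 1]
    w = amplitude * s ** 3        # generic cubic transform (assumption)
    w = w / w.sum()               # weights sum to 1
    # Subsystems with worse FMEA scores receive a larger share of the
    # allowable failure rate, i.e. a looser reliability requirement.
    return w * lam_system

print(allocate_failure_rates(scores=[72, 45, 30, 18], lam_system=1e-4))
```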

  5. Asymmetric Programming: A Highly Reliable Metadata Allocation Strategy for MLC NAND Flash Memory-Based Sensor Systems

    PubMed Central

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-01-01

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme. PMID:25310473
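
    The abstract's core idea, keeping metadata in the more reliable MSB pages of MLC flash, can be illustrated with a toy allocator. The LSB/MSB page layout below (alternating pairs) and all names are invented for illustration; real MLC parts have vendor-specific shared-page maps, and the paper's address-mapping and garbage-collection strategies are not reproduced here.

```python
# Toy MLC page allocator: route metadata writes to MSB pages,
# ordinary data to LSB pages. Layout and names are illustrative only.

PAGES_PER_BLOCK = 128

def is_msb_page(page_index: int) -> bool:
    # Assumption: even/odd pages within a block pair up as LSB/MSB.
    return page_index % 2 == 1

class ToyAllocator:
    def __init__(self):
        self.next_lsb = 0
        self.next_msb = 1

    def allocate(self, is_metadata: bool) -> int:
        """Return the next free page of the requested reliability class."""
        if is_metadata:
            page, self.next_msb = self.next_msb, self.next_msb + 2
        else:
            page, self.next_lsb = self.next_lsb, self.next_lsb + 2
        return page % PAGES_PER_BLOCK

alloc = ToyAllocator()
print([alloc.allocate(is_metadata=True) for _ in range(4)])   # MSB pages: 1,3,5,7
print([alloc.allocate(is_metadata=False) for _ in range(4)])  # LSB pages: 0,2,4,6
```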

  6. Test-Retest Reliability of an Automated Infrared-Assisted Trunk Accelerometer-Based Gait Analysis System.

    PubMed

    Hsu, Chia-Yu; Tsai, Yuh-Show; Yau, Cheng-Shiang; Shie, Hung-Hai; Wu, Chu-Ming

    2016-01-01

    The aim of this study was to determine the test-retest reliability of an automated infrared-assisted, trunk accelerometer-based gait analysis system for measuring gait parameters of healthy subjects in a hospital. Thirty-five participants (28 females; age range 23-79 years) performed a 5-m walk twice using an accelerometer-based gait analysis system with infrared assist. Measurements of spatiotemporal gait parameters (walking speed, step length, and cadence) and trunk control (gait symmetry, gait regularity, acceleration root mean square (RMS), and acceleration root mean square ratio (RMSR)) were recorded in two separate walking tests conducted 1 week apart. Relative and absolute test-retest reliability was determined by calculating the intra-class correlation coefficient (ICC(3,1)) and smallest detectable difference (SDD), respectively. The test-retest reliability was excellent for walking speed (ICC = 0.87, 95% confidence interval = 0.74-0.93, SDD = 13.4%), step length (ICC = 0.81, 95% confidence interval = 0.63-0.91, SDD = 12.2%), cadence (ICC = 0.81, 95% confidence interval = 0.63-0.91, SDD = 10.8%), and trunk control (step and stride regularity in the anterior-posterior direction, acceleration RMS and acceleration RMSR in the medial-lateral direction, and acceleration RMS and stride regularity in the vertical direction). An automated infrared-assisted, trunk accelerometer-based gait analysis system is a reliable tool for measuring gait parameters in the hospital environment. PMID:27455281
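
    For readers wanting to reproduce the reliability statistics, the sketch below computes ICC(3,1) from a two-way ANOVA decomposition and the smallest detectable difference as SDD = 1.96 x sqrt(2) x SEM. The walking-speed values are made up for illustration, and the SEM here uses the overall sample SD as a simple approximation.

```python
import numpy as np

def icc_3_1(data):
    """ICC(3,1): two-way mixed model, single measurement, consistency.

    data: (n_subjects, k_sessions) array of scores.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
    ss_err = np.sum((data - data.mean(axis=1, keepdims=True)
                     - data.mean(axis=0, keepdims=True) + grand) ** 2)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical walking speeds (m/s) for 5 subjects, 2 sessions.
scores = np.array([[1.10, 1.15], [1.32, 1.28], [0.95, 1.02],
                   [1.25, 1.22], [1.05, 1.08]])
icc = icc_3_1(scores)
sem = scores.std(ddof=1) * np.sqrt(1 - icc)   # standard error of measurement
sdd = 1.96 * np.sqrt(2) * sem                 # smallest detectable difference
print(f"ICC(3,1) = {icc:.2f}, SDD = {sdd:.3f} m/s")
```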

  7. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  8. RELIABILITY-BASED UNCERTAINTY ANALYSIS OF GROUNDWATER CONTAMINANT TRANSPORT AND REMEDIATION

    EPA Science Inventory

    This report presents a discussion of the application of the first- and second-order reliability methods (FORM and SORM, respectively) to ground-water transport and remediation, and to public health risk assessment. Using FORM and SORM allows the formal incorporation of parameter...

  9. Reliable Assessment with CyberTutor, a Web-Based Homework Tutor.

    ERIC Educational Resources Information Center

    Pritchard, David E.; Morote, Elsa-Sofia

    This paper demonstrates that an electronic tutoring program can collect data that enables a far more reliable assessment of students' skills than a standard examination. The Socratic electronic homework tutor CyberTutor can effectively integrate instruction and assessment. CyberTutor assessment has about 62 times less variance due to random test…

  10. Predictions of Crystal Structure Based on Radius Ratio: How Reliable Are They?

    ERIC Educational Resources Information Center

    Nathan, Lawrence C.

    1985-01-01

    Discussion of crystalline solids in undergraduate curricula often includes the use of radius ratio rules as a method for predicting which type of crystal structure is likely to be adopted by a given ionic compound. Examines this topic, establishing more definitive guidelines for the use and reliability of the rules. (JN)

  11. Moving to a Higher Level for PV Reliability through Comprehensive Standards Based on Solid Science (Presentation)

    SciTech Connect

    Kurtz, S.

    2014-11-01

    PV reliability is a challenging topic because of the desired long life of PV modules, the diversity of use environments and the pressure on companies to rapidly reduce their costs. This presentation describes the challenges, examples of failure mechanisms that we know or don't know how to test for, and how a scientific approach is being used to establish international standards.

  12. Temporal Stability of Strength-Based Assessments: Test-Retest Reliability of Student and Teacher Reports

    ERIC Educational Resources Information Center

    Romer, Natalie; Merrell, Kenneth W.

    2013-01-01

    This study focused on evaluating the temporal stability of self-reported and teacher-reported perceptions of students' social and emotional skills and assets. We used a test-retest reliability procedure over repeated administrations of the child, adolescent, and teacher versions of the "Social-Emotional Assets and Resilience Scales". Middle school…

  13. Optimum structural design based on reliability and proof-load testing

    NASA Technical Reports Server (NTRS)

    Shinozuka, M.; Yang, J. N.

    1969-01-01

    The proof-load test eliminates structures with strength less than the proof load and improves the reliability value obtained in analysis. It truncates the distribution function of strength at the proof load, thereby alleviating the need to verify a fitted distribution function at the lower tail portion, where data are usually nonexistent.

  14. Methodology for reliability based condition assessment. Application to concrete structures in nuclear plants

    SciTech Connect

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period.
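
    A stripped-down, non-adaptive version of the time-dependent simulation idea is sketched below: strength degrades linearly at a random rate, extreme loads arrive as a Poisson process, and a realization fails if any load exceeds the current strength. All distributions and rates are invented for illustration; the report's adaptive sampling refinements and degradation models are omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

def failure_probability(t_service=40.0, n_sims=20_000):
    """Crude Monte Carlo estimate of time-dependent failure probability."""
    fails = 0
    for _ in range(n_sims):
        r0 = rng.normal(100.0, 10.0)             # initial strength (assumed units)
        rate = rng.lognormal(-1.0, 0.3)          # degradation rate per year (assumed)
        n_loads = rng.poisson(0.5 * t_service)   # extreme load events (assumed rate)
        times = rng.uniform(0.0, t_service, size=n_loads)
        loads = rng.gumbel(50.0, 8.0, size=n_loads)  # extreme load model (assumed)
        strength = r0 - rate * times             # strength at each load event
        if np.any(loads > strength):
            fails += 1
    return fails / n_sims

print(f"P(failure within service life) ~ {failure_probability():.4f}")
```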

  15. Reliability-Based Weighting of Visual and Vestibular Cues in Displacement Estimation

    PubMed Central

    ter Horst, Arjan C.; Koppen, Mathieu; Selen, Luc P. J.; Medendorp, W. Pieter

    2015-01-01

    When navigating through the environment, our brain needs to infer how far we move and in which direction we are heading. In this estimation process, the brain may rely on multiple sensory modalities, including the visual and vestibular systems. Previous research has mainly focused on heading estimation, showing that sensory cues are combined by weighting them in proportion to their reliability, consistent with statistically optimal integration. But while heading estimation could improve with the ongoing motion, due to the constant flow of information, the estimate of how far we move requires the integration of sensory information across the whole displacement. In this study, we investigate whether the brain optimally combines visual and vestibular information during a displacement estimation task, even if their reliability varies from trial to trial. Participants were seated on a linear sled, immersed in a stereoscopic virtual reality environment. They were subjected to a passive linear motion involving visual and vestibular cues with different levels of visual coherence to change relative cue reliability and with cue discrepancies to test relative cue weighting. Participants performed a two-interval two-alternative forced-choice task, indicating which of two sequentially perceived displacements was larger. Our results show that humans adapt their weighting of visual and vestibular information from trial to trial in proportion to their reliability. These results provide evidence that humans optimally integrate visual and vestibular information in order to estimate their body displacement. PMID:26658990
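
    The statistically optimal integration rule the authors test has a compact closed form: each cue is weighted by its inverse variance. A minimal numerical sketch, with made-up noise levels, follows.

```python
def fuse(est_vis, var_vis, est_vest, var_vest):
    """Reliability-weighted fusion of two displacement estimates.

    Weights are inverse variances, so the noisier cue counts less;
    the fused variance is smaller than either single-cue variance.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_vest)
    fused = w_vis * est_vis + (1.0 - w_vis) * est_vest
    fused_var = 1.0 / (1.0 / var_vis + 1.0 / var_vest)
    return fused, fused_var

# Hypothetical single-trial displacement estimates (cm) and cue variances.
print(fuse(est_vis=48.0, var_vis=4.0, est_vest=52.0, var_vest=9.0))
```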

  16. A Note on the Reliability Coefficients for Item Response Model-Based Ability Estimates

    ERIC Educational Resources Information Center

    Kim, Seonghoon

    2012-01-01

    Assuming item parameters on a test are known constants, the reliability coefficient for item response theory (IRT) ability estimates is defined for a population of examinees in two different ways: as (a) the product-moment correlation between ability estimates on two parallel forms of a test and (b) the squared correlation between the true…

  18. Potential of the Reliability-Resilience-Vulnerability (RRV) Based Drought Management Index (DMI)

    NASA Astrophysics Data System (ADS)

    Maity, R.; Chanda, K.; D, N. K.; Sharma, A.; Mehrotra, R.

    2014-12-01

    This paper highlights the findings from a couple of recent investigations aimed at characterizing and predicting the long-term drought propensity of a region for effective water management. A probabilistic index, named the Drought Management Index (DMI), was proposed for assessing drought propensity on a multi-year scale at the chosen study area. The novelty of this index lies in the fact that it employs the Reliability-Resilience-Vulnerability (RRV) rationale, commonly used in water resources systems analysis, with the assumption that depletion of soil moisture across a vertical soil column is analogous to the operation of a water supply reservoir. This was the very first attempt to incorporate into a drought index the resilience of the soil moisture series, which denotes the readiness of soil moisture to bounce back from drought to the normal state. Further, the predictability of DMI was explored to assess future drought propensity, which is essential for adopting suitable drought management policies at any location. For computing DMI, the intermediate measures, i.e., RRV, were obtained using the Permanent Wilting Point (PWP) as the threshold indicative of the transition into water stress. The joint distribution of resilience and vulnerability of the soil moisture series was subsequently determined using a Plackett copula. The DMI was designed such that it increases with increasing vulnerability as well as with decreasing resilience, and vice versa. Thus, it was expressed as the joint probability of exceedance of resilience and non-exceedance of vulnerability of a soil moisture series. An assessment of the sensitivity of the DMI to the length of the data segments indicated that a 5-year temporal scale is optimum to obtain stable estimates of DMI. The ability of the DMI to reflect the spatio-temporal variation of drought propensity was illustrated using India as a test bed. Based on the observed behaviour of DMI series across India, on a climatological time scale, a DMI

  19. Determining Functional Reliability of Pyrotechnic Mechanical Devices

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Multhaup, Herbert A.

    1997-01-01

    This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The generally accepted go/no-go statistical approach, which requires hundreds or thousands of consecutive successful tests on identical components for a reliability prediction, routinely ignores the physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine the mechanical functional margin. Finally, the data collected in establishing functional margin are analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost savings and far more understanding than go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
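
    The margin-based reliability logic lends itself to a simple stress-strength calculation: if delivered and required energies are treated as independent normal variables estimated from a small sample, the probability that demand exceeds supply follows from the margin's mean and variance. This is a generic sketch of that idea; the numbers below are illustrative, not from the pin-puller data.

```python
import math

# Small-sample estimates (hypothetical, in joules).
e_delivered_mean, e_delivered_sd = 12.0, 0.8   # energy source output
e_required_mean, e_required_sd = 8.5, 0.6      # energy to complete the function

# Functional margin M = delivered - required; normal stress-strength model.
m_mean = e_delivered_mean - e_required_mean
m_sd = math.sqrt(e_delivered_sd ** 2 + e_required_sd ** 2)
z = m_mean / m_sd                              # margin in standard deviations

# P(function succeeds) = P(M > 0) = Phi(z), via the error function.
reliability = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(f"margin = {m_mean:.2f} J ({z:.1f} sigma), reliability ~ {reliability:.6f}")
```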

  20. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
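
    For reference, the Weibull (power-law) process named in the abstract is the nonhomogeneous Poisson process whose intensity and mean number of failures take the standard forms below, with shape parameter beta and scale parameter eta; reliability growth corresponds to beta < 1 in test time t:

```latex
\lambda(t) = \frac{\beta}{\eta}\left(\frac{t}{\eta}\right)^{\beta-1},
\qquad
E[N(t)] = \int_0^t \lambda(s)\,ds = \left(\frac{t}{\eta}\right)^{\beta},
\qquad
\text{instantaneous MTBF}(t) = \frac{1}{\lambda(t)}.
```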

  1. RICA: a reliable and image configurable arena for cyborg bumblebee based on CAN bus.

    PubMed

    Gong, Fan; Zheng, Nenggan; Xue, Lei; Xu, Kedi; Zheng, Xiaoxiang

    2014-01-01

    In this paper, we designed RICA, a reliable and image-configurable flight arena for developing cyborg bumblebees. To meet the spatial and temporal requirements of bumblebees, the Controller Area Network (CAN) bus is adopted to interconnect the LED display modules and ensure the reliability and real-time performance of the arena system. Easily configurable interfaces on a desktop computer, implemented as Python scripts, are provided to transmit the visual patterns to the LED distributor online and configure RICA dynamically. The new arena system will be a powerful tool for investigating the quantitative relationship between visual inputs and induced flight behaviors, and will also be helpful to visual-motor research in other related fields. PMID:25570095

  2. Optimal periodic proof test based on cost-effective and reliability criteria

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1976-01-01

    An exploratory study for the optimization of periodic proof tests for fatigue-critical structures is presented. The optimal proof load level and the optimal number of periodic proof tests are determined by minimizing the total expected (statistical average) cost, while the constraint on the allowable level of structural reliability is satisfied. The total expected cost consists of the expected cost of proof tests, the expected cost of structures destroyed by proof tests, and the expected cost of structural failure in service. It is demonstrated by numerical examples that significant cost saving and reliability improvement for fatigue-critical structures can be achieved by the application of the optimal periodic proof test. The present study is relevant to the establishment of optimal maintenance procedures for fatigue-critical structures.
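
    The cost trade-off in the abstract can be made concrete with a toy model: each proof test costs c_test, destroys the structure with some probability (strength below the proof load), and survivors have a truncated strength distribution that lowers the in-service failure probability. All numbers, distributions and the cost structure below are invented to show the shape of the optimization, not the paper's model.

```python
import math

def expected_cost(n_tests, proof_load, c_test=1.0, c_destroyed=50.0,
                  c_service_failure=5000.0, strength_mean=10.0,
                  strength_sd=1.5, service_load=7.0):
    """Toy expected-cost model for periodic proof testing (illustrative only)."""
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    p_destroy = phi((proof_load - strength_mean) / strength_sd)  # fails proof test
    # Survivors have strength > proof_load (truncated normal); in-service
    # failure requires strength < service_load, impossible once proof >= service.
    p_below_service = phi((service_load - strength_mean) / strength_sd)
    p_fail_service = max(p_below_service - p_destroy, 0.0) / (1.0 - p_destroy)
    return (n_tests * c_test
            + n_tests * p_destroy * c_destroyed
            + p_fail_service * c_service_failure)

# Scan proof-load levels for a fixed number of periodic tests.
for pl in (6.0, 7.0, 8.0, 9.0):
    print(pl, round(expected_cost(n_tests=5, proof_load=pl), 2))
```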

  3. A human reliability based usability evaluation method for safety-critical software

    SciTech Connect

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.; Ragsdale, A.

    2006-07-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation with the human reliability analysis method SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at a usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis for heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardizing the priority weighting of the UEP in an effort to improve the method's reliability. (authors)

  4. Inter-observer reliability of forceful exertion analysis based on video-recordings.

    PubMed

    Bao, S; Howard, N; Spielholz, P; Silverstein, B

    2010-09-01

    The objectives were to examine the inter-observer reliability of job-level forceful exertion analyses and the temporal agreement of detailed time study results. Three observers performed the analyses on 12 different jobs. Continuous duration, frequency and % time of lifting, pushing/pulling, power and pinch gripping exertions, and the estimated level of the exertions were obtained. The intraclass correlation coefficient and variance components were computed. Temporal agreement analyses of raw time study data were performed. The inter-observer reliability was good for most job-level exposure parameters (continuous duration, frequency and % time of forceful exertions), but only fair to moderate for the estimated level of forceful exertions. The finding that the between-observer variability was less than the between-exertion variability confirmed that the forceful exertion analysis method used in the present study can detect job exertion differences. Using three observers to perform detailed time studies on task activities and taking the consensus of the majority can increase the between-observer agreement to up to 97%. STATEMENT OF RELEVANCE: The results inform researchers that inter-observer reliability for job-level exposure measurement of forceful exertion analysis obtained from detailed time studies is generally good, but observers' ability to estimate the level of forceful exertion can be poor. It also provides information on the temporal agreement of detailed forceful exertion analysis, and guidelines on achieving better agreement for studies where accurate synchronisation of task activities and direct physiological/biomechanical measurements is crucial. PMID:20737338

  5. A Human Reliability Based Usability Evaluation Method for Safety-Critical Software

    SciTech Connect

    Phillippe Palanque; Regina Bernhaupt; Ronald Boring; Chris Johnson

    2006-04-01

    Recent years have seen an increasing use of sophisticated interaction techniques, including in the field of safety-critical interactive software [8]. The use of such techniques has been required in order to increase the bandwidth between users and systems and thus to help them deal efficiently with increasingly complex systems. These techniques come from research and innovation done in the field of human-computer interaction (HCI). A significant effort is currently being undertaken by the HCI community to apply and extend current usability evaluation techniques to these new kinds of interaction techniques. However, very little has been done to improve the reliability of software offering these kinds of interaction techniques. Even testing basic graphical user interfaces remains a challenge that has rarely been addressed in the field of software engineering [9]. Yet the non-reliability of interactive software can jeopardize usability evaluation by producing unexpected or undesired behaviors. The aim of this SIG is to provide a forum for both researchers and practitioners interested in testing interactive software. Our goal is to define a roadmap of activities to cross-fertilize usability and reliability testing of these kinds of systems and to minimize duplicated effort in both communities.

  6. Validity and reliability of criterion based clinical audit to assess obstetrical quality of care in West Africa

    PubMed Central

    2012-01-01

    Background In Mali and Senegal, over 1% of women die giving birth in hospital. At some hospitals, over a third of infants are stillborn. Many deaths are due to substandard medical practices. Criterion-based clinical audits (CBCA) are increasingly used to measure and improve obstetrical care in resource-limited settings, but their measurement properties have not been formally evaluated. In 2011, we published a systematic review of obstetrical CBCA highlighting insufficient considerations of validity and reliability. The objective of this study is to develop an obstetrical CBCA adapted to the West African context and assess its reliability and validity. This work was conducted as a sub-study within a cluster randomized trial known as QUARITE. Methods Criteria were selected based on extensive literature review and expert opinion. Early 2010, two auditors applied the CBCA to identical samples at 8 sites in Mali and Senegal (n = 185) to evaluate inter-rater reliability. In 2010–11, we conducted CBCA at 32 hospitals to assess construct validity (n = 633 patients). We correlated hospital characteristics (resource availability, facility perinatal and maternal mortality) with mean hospital CBCA scores. We used generalized estimating equations to assess whether patient CBCA scores were associated with perinatal mortality. Results Results demonstrate substantial (ICC = 0.67, 95% CI 0.54; 0.76) to elevated inter-rater reliability (ICC = 0.84, 95% CI 0.77; 0.89) in Senegal and Mali, respectively. Resource availability positively correlated with mean hospital CBCA scores and maternal and perinatal mortality were inversely correlated with hospital CBCA scores. Poor CBCA scores, adjusted for hospital and patient characteristics, were significantly associated with perinatal mortality (OR 1.84, 95% CI 1.01-3.34). Conclusion Our CBCA has substantial inter-rater reliability and there is compelling evidence of its validity as the tool performs according to theory. Trial registration

  7. Manufacturability and reliability on 10-Gb/s transponder for ethernet-based applications

    NASA Astrophysics Data System (ADS)

    Kao, Min-Sheng; Tsai, Cheng-Hung; Chiu, Chia-Hung; Cheng, Shou-Chien; Shen, Kun-Yi; Huang, Min-Fa; Shaw, Cheng-Da; Lee, Shin-Ge

    2004-05-01

    In this paper, manufacturing issues, including the Optical Sub-Assembly (OSA), the Electrical Sub-Assembly (ESA) and reliability considerations of a 10 Gb/s Ethernet transponder, were studied through experiments and implementation. In the growing optical communication industry, one of the star products is the Z-axis pluggable optical transceiver module. Broad usage of Ethernet demands high port density, low cost, high utility and compact size, while still requiring excellent performance. After the standardization of 10 Gb/s Ethernet (IEEE 802.3ae), many transceiver companies, silicon vendors and system vendors reached agreement and signed a variety of MSAs (Multi-Source Agreements). These MSAs are still being modified in response to system demands, customer requirements, and cost and performance issues. This paper presents how to achieve the functions described in the MSA and obtain a highly manufacturable and reliable module design. Following the building blocks of the transponder, we split it into OSA, ESA and mechanical design, with related reliability experimental results. For the OSA, the traditional TO-CAN package and optical components are introduced. Thanks to mature manufacturing experience, vendors can easily meet low-cost and manufacturability requirements with only slight modifications. A simple solution is implemented to solve this problem, and the critical point of the design is discussed. Thermal issues in the OSA are also addressed, because of the sensitivity of the light source, including how to calculate the effect and find effective solutions. Some manufacturability criteria are discussed for OSA characteristics in 10 Gb/s applications. For the ESA, PMD (physical media dependent) driving methods, implementation of the Multi-Source Agreement digital optical monitoring function and a performance comparison are presented. We also examine the crosstalk effect between transmitter and receiver circuits and its impact on the module's optical-to-electrical converter interface design. We

  8. Optimal clustering of MGs based on droop controller for improving reliability using a hybrid of harmony search and genetic algorithms.

    PubMed

    Abedini, Mohammad; Moradi, Mohammad H; Hosseinian, S M

    2016-03-01

    This paper proposes a novel method to address the reliability and technical problems of microgrids (MGs), based on designing a number of self-adequate autonomous sub-MGs through MG clustering. In doing so, a multi-objective optimization problem is developed in which power loss reduction, voltage profile improvement and reliability enhancement are the objective functions. To solve the optimization problem a hybrid algorithm, named HS-GA, is provided, based on the genetic and harmony search algorithms, and a load flow method is given to model different types of DGs as droop-controlled units. The performance of the proposed method is evaluated in two case studies. The results provide support for the performance of the proposed method. PMID:26767800

  9. Reliability of a novel CBCT-based 3D classification system for maxillary canine impactions in orthodontics: the KPG index.

    PubMed

    Dalessandri, Domenico; Migliorati, Marco; Rubiano, Rachele; Visconti, Luca; Contardo, Luca; Di Lenarda, Roberto; Martin, Conchita

    2013-01-01

    The aim of this study was to evaluate both the intra- and interoperator reliability of a radiological three-dimensional classification system (KPG index) for assessing the degree of difficulty of orthodontic treatment of maxillary canine impactions. Cone beam computed tomography (CBCT) scans of fifty impacted canines, obtained using three different scanners (NewTom, Kodak, and Planmeca), were classified using the KPG index by three independent orthodontists. Measurements were repeated one month later. Based on these two sessions, several recommendations on KPG index scoring were elaborated. After a joint calibration session, these recommendations were explained to nine orthodontists and the two measurement sessions were repeated. There was moderate intrarater agreement in the precalibration measurement sessions. After the calibration session, both intra- and interrater agreement were almost perfect. Indexes assessed with Kodak Dental Imaging 3D module software showed better reliability for z-axis values, whereas indexes assessed with Planmeca Romexis software showed better reliability for x- and y-axis values. No differences were found between the CBCT scanners used. Taken together, these findings indicate that the application of the instructions elaborated during this study improved KPG index reliability, which was nevertheless variously influenced by the use of different software packages for image evaluation. PMID:24235889

  10. MRI-based Kidney Volume Measurements in ADPKD: Reliability and Effect of Gadolinium Enhancement

    PubMed Central

    Bae, Kyongtae T.; Tao, Cheng; Zhu, Fang; Bost, James E.; Chapman, Arlene B.; Grantham, Jared J.; Torres, Vicente E.; Guay-Woodford, Lisa M.; Meyers, Catherine M.; Bennett, William M.

    2009-01-01

    Background and objectives: To evaluate the inter- and intrareader reliability and the effect of gadolinium enhancement on kidney volume measurements obtained from pre- and postgadolinium T1 MR images in patients with autosomal dominant polycystic kidney disease (ADPKD). Design, setting, participants, & measurements: Twenty subjects were randomly selected with approximately equal frequency from three kidney-size groups. Pre- and postgadolinium 3D T1 (pre-T1, post-T1) MR images were obtained. The stereology method was applied to segment and measure kidney volumes. The measurement process was repeated at two-wk intervals by two radiologists. Reliability was assessed with correlation coefficients. Intra- and inter-reader bias and measure differences were assessed with paired T-tests. The size effect on the pre- and post-T1 measurements was evaluated with one-way ANOVA. Results: The intra- and inter-reader reliability was extremely high in all measurements. No systematic intrareader bias but a small inter-reader bias for the post-T1 measurements was observed. All kidney volumes measured on the pre- and post-T1 images were highly correlated with each other for both readers. The post-T1 volumes were significantly higher than pre-T1 volumes. While the post-pre volume differences were relatively constant across the three kidney-size groups, the post-pre percent volume differences were significantly smaller as the size of the kidney increased. Conclusions: Kidney volume measurements can be made with minimum intra- and inter-reader variability on both pre- and post-T1 MR images. Kidney volumes measured on the pre-T1 were smaller than those on post-T1, and percent differences between pre-T1 and post-T1 kidney volumes decreased with increasing kidney size. PMID:19339416

  11. On Reliable and Efficient Data Gathering Based Routing in Underwater Wireless Sensor Networks.

    PubMed

    Liaqat, Tayyaba; Akbar, Mariam; Javaid, Nadeem; Qasim, Umar; Khan, Zahoor Ali; Javaid, Qaisar; Alghamdi, Turki Ali; Niaz, Iftikhar Azim

    2016-01-01

    This paper presents a cooperative routing scheme to improve data reliability. The proposed protocol achieves its objective, however, at the cost of surplus energy consumption. Thus sink mobility is introduced to minimize the energy consumption of nodes, as the mobile sink directly collects data from the network nodes over a minimized communication distance. We also present delay- and energy-optimized versions of our proposed RE-AEDG to further enhance its performance. Simulation results prove the effectiveness of the proposed RE-AEDG in terms of the selected performance metrics. PMID:27589750

  12. Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition

    PubMed Central

    Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial when selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which incorporates the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of the transactional composite web service (TCWS) is formalized by CPN properties. To identify the best services in terms of QoS properties from the candidate service sets formed in the TCWS-CPN, we use skyline computation to retrieve the dominant web services. This overcomes the significant information loss caused by reducing individual scores to an overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431
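
    Skyline computation itself is easy to state: a service dominates another if it is at least as good on every QoS attribute and strictly better on at least one. A minimal block-nested-loop sketch follows; the attribute tuples are hypothetical, with lower values taken as better for both attributes.

```python
def dominates(a, b):
    """True if service a dominates b (lower is better on every attribute)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(services):
    """Block-nested-loop skyline: keep services not dominated by any other."""
    result = []
    for s in services:
        if any(dominates(t, s) for t in services if t is not s):
            continue
        result.append(s)
    return result

# Hypothetical candidates as (response_time_ms, cost) tuples.
candidates = [(120, 0.05), (90, 0.08), (150, 0.03), (95, 0.07), (200, 0.10)]
print(skyline(candidates))  # (200, 0.10) is dominated; the rest survive
```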

  13. Resistive switching memories based on metal oxides: mechanisms, reliability and scaling

    NASA Astrophysics Data System (ADS)

    Ielmini, Daniele

    2016-06-01

    With the explosive growth of digital data in the era of the Internet of Things (IoT), fast and scalable memory technologies are being researched for data storage and data-driven computation. Among the emerging memories, resistive switching memory (RRAM) raises strong interest due to its high speed, high density as a result of its simple two-terminal structure, and low cost of fabrication. The scaling projection of RRAM, however, requires a detailed understanding of switching mechanisms and there are potential reliability concerns regarding small device sizes. This work provides an overview of the current understanding of bipolar-switching RRAM operation, reliability and scaling. After reviewing the phenomenological and microscopic descriptions of the switching processes, the stability of the low- and high-resistance states will be discussed in terms of conductance fluctuations and evolution in 1D filaments containing only a few atoms. The scaling potential of RRAM will finally be addressed by reviewing the recent breakthroughs in multilevel operation and 3D architecture, making RRAM a strong competitor among future high-density memory solutions.

  14. A reliability-based maintenance technicians' workloads optimisation model with stochastic consideration

    NASA Astrophysics Data System (ADS)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2015-12-01

    The growing interest in research on technicians' workloads is probably associated with the recent surge in competition, prompted by unprecedented technological developments that trigger changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this worldwide, intense competition has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model which considers technicians' reliability and complements the factory information obtained. The information used comes from technicians' productivity and earned values, combined in a multi-objective modelling approach. Since technicians are expected to carry out routine and stochastic maintenance work, we treat these workloads as constraints. The influence of training, fatigue and the experiential knowledge of technicians on workload management was considered. These workloads were combined with the maintenance policy in optimising reliability, productivity and earned values using the goal programming approach. Practical datasets were utilised in studying the applicability of the proposed model in practice. It was observed that the model generates information that practicing maintenance engineers can apply in making more informed decisions on the management of technicians.

  15. Multiplication factor for open ground storey buildings-a reliability based evaluation

    NASA Astrophysics Data System (ADS)

    Haran Pragalath, D. C.; Avadhoot, Bhosale; Robin, Davis P.; Pradip, Sarkar

    2016-06-01

    Open Ground Storey (OGS) framed buildings, where the ground storey is kept open without infill walls, mainly to facilitate parking, are increasingly common in urban areas. However, the vulnerability of this type of building has been exposed in past earthquakes. OGS buildings are conventionally designed by a bare frame analysis that ignores the stiffness of the infill walls present in the upper storeys, but doing so underestimates the inter-storey drift (ISD) and thereby the force demand in the ground storey columns. Therefore, a multiplication factor (MF) is introduced in various international codes to estimate the design forces (bending moments and shear forces) in the ground storey columns. This study focuses on the seismic performance of typical OGS buildings designed by means of MFs. Probabilistic seismic demand models, fragility curves, and reliability and cost indices for various frame models, including bare frames and fully infilled frames, are developed. It is found that the MF scheme suggested by the Israeli code is better than those of other international codes in terms of reliability and cost.
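
    Fragility curves of the kind developed in the study are commonly parameterized as lognormal functions of the intensity measure. The sketch below evaluates P(damage state exceeded | Sa) for made-up median and dispersion values; the paper's actual fragility parameters are not given in the abstract.

```python
import math

def fragility(sa, median=0.9, beta=0.5):
    """Lognormal fragility: P(damage state exceeded | spectral accel. sa).

    median: intensity at 50% exceedance probability, in g (hypothetical)
    beta  : logarithmic standard deviation, i.e. dispersion (hypothetical)
    """
    z = math.log(sa / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

for sa in (0.2, 0.5, 0.9, 1.5):
    print(f"Sa = {sa:.1f} g -> P(exceedance) = {fragility(sa):.2f}")
```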

  16. Co-detection: ultra-reliable nanoparticle-based electrical detection of biomolecules in the presence of large background interference.

    PubMed

    Liu, Yang; Gu, Ming; Alocilja, Evangelyn C; Chakrabartty, Shantanu

    2010-11-15

    An ultra-reliable technique for detecting trace quantities of biomolecules is reported. The technique, called "co-detection", exploits the non-linear redundancy among synthetically patterned biomolecular logic circuits to decipher the presence or absence of target biomolecules in a sample. In this paper, we verify the co-detection principle on gold-nanoparticle-based conductimetric soft-logic circuits which use a silver-enhancement technique for signal amplification. Using co-detection, we have been able to demonstrate a great improvement in the reliability of detecting mouse IgG at concentration levels 10^5 times lower than the concentration of rabbit IgG, which serves as background interference. PMID:20864327

  17. A reliable low-cost wireless and wearable gait monitoring system based on a plastic optical fibre sensor

    NASA Astrophysics Data System (ADS)

    Bilro, L.; Oliveira, J. G.; Pinto, J. L.; Nogueira, R. N.

    2011-04-01

    A wearable and wireless system designed to evaluate quantitatively the human gait is presented. It allows knee sagittal motion monitoring over long distances and periods with a portable and low-cost package. It is based on the measurement of transmittance changes when a side-polished plastic optical fibre is bent. Four voluntary healthy subjects, on five different days, were tested in order to assess inter-day and inter-subject reliability. Results have shown that this technique is reliable, allows a one-time calibration and is suitable in the diagnosis and rehabilitation of knee injuries or for monitoring the performance of competitive athletes. Environmental testing was accomplished in order to study the influence of different temperatures and humidity conditions.

  18. Reliable Change Indices and Standardized Regression-Based Change Score Norms for Evaluating Neuropsychological Change in Children with Epilepsy

    PubMed Central

    Busch, Robyn M.; Lineweaver, Tara T.; Ferguson, Lisa; Haut, Jennifer S.

    2015-01-01

    Reliable change index scores (RCIs) and standardized regression-based change score norms (SRBs) permit evaluation of meaningful changes in test scores following treatment interventions, like epilepsy surgery, while accounting for test-retest reliability, practice effects, score fluctuations due to error, and relevant clinical and demographic factors. Although these methods are frequently used to assess cognitive change after epilepsy surgery in adults, they have not been widely applied to examine cognitive change in children with epilepsy. The goal of the current study was to develop RCIs and SRBs for use in children with epilepsy. Sixty-three children with epilepsy (age range 6–16; M=10.19, SD=2.58) underwent comprehensive neuropsychological evaluations at two time points an average of 12 months apart. Practice-adjusted RCIs and SRBs were calculated for all cognitive measures in the battery. Practice effects were quite variable across the neuropsychological measures, with the greatest differences observed among older children, particularly on the Children’s Memory Scale and Wisconsin Card Sorting Test. There was also notable variability in test-retest reliabilities across measures in the battery, with coefficients ranging from 0.14 to 0.92. RCIs and SRBs for use in assessing meaningful cognitive change in children following epilepsy surgery are provided for measures with reliability coefficients above 0.50. This is the first study to provide RCIs and SRBs for a comprehensive neuropsychological battery based on a large sample of children with epilepsy. Tables to aid in evaluating cognitive changes in children who have undergone epilepsy surgery are provided for clinical use. An Excel sheet to perform all relevant calculations is also available to interested clinicians or researchers. PMID:26043163
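
    For clinicians who want the arithmetic, a practice-adjusted RCI is typically computed as shown below: the observed retest change, minus the mean practice effect, divided by the standard error of the difference. The function name and test statistics are invented for illustration; they are not the study's norms.

```python
import math

def practice_adjusted_rci(score_t1, score_t2, mean_practice, sd_t1, r_tt):
    """Practice-adjusted reliable change index.

    sd_t1 : baseline standard deviation of the measure in the norm sample
    r_tt  : test-retest reliability coefficient
    """
    sem = sd_t1 * math.sqrt(1.0 - r_tt)   # standard error of measurement
    sdiff = math.sqrt(2.0) * sem          # standard error of the difference
    return (score_t2 - score_t1 - mean_practice) / sdiff

# Hypothetical memory-index example: norms say +4 points of practice effect.
rci = practice_adjusted_rci(score_t1=95, score_t2=92, mean_practice=4.0,
                            sd_t1=15.0, r_tt=0.80)
print(f"RCI = {rci:.2f}")  # beyond +/-1.645 would flag reliable change at 90% CI
```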

  19. A reliability study on tin based lead free micro joint including intermetallic and void evolution

    NASA Astrophysics Data System (ADS)

    Feyissa, Frezer Assefa

    In microelectronics, soldering to a Cu pad leads to the formation of two intermetallic layers at the solder-pad interface. The growth of these layers is accompanied by microscopic voids that are usually a reliability concern in the industry. It is therefore important to understand the factors that contribute to IMC growth under various combinations of reflow time, Sn thickness and aging temperature. A systematic study was conducted on the Cu-Sn system to investigate the formation and growth of the intermetallic compounds (IMC) as well as void evolution for different solder thicknesses. The Cu6Sn5 IMC layer was found to grow thicker as the Sn thickness increased after reflow, while the Cu3Sn layer thinned under the same conditions. Also, after reflow and aging, more voiding was observed in the thin solder than in the thicker one.

  20. Atomistic force field for pyridinium-based ionic liquids: reliable transport properties.

    PubMed

    Voroshylova, Iuliia V; Chaban, Vitaly V

    2014-09-11

    A reliable force field (FF) is central to the successful prediction of physicochemical properties via computer simulations. This work introduces refined FF parameters for six popular ionic liquids (ILs) of the pyridinium family (butylpyridinium tetrafluoroborate, bis(trifluoromethanesulfonyl)imide, dicyanamide, hexafluorophosphate, triflate, and chloride). We elaborate a systematic procedure which accounts for specific cation-anion interactions in the liquid phase. Once these interactions are described accurately, all experimentally determined transport properties can be reproduced. We prove that three parameters per interaction site (atom diameter, depth of the potential well, and point electrostatic charge) provide a sufficient basis to predict the thermodynamics (heat of vaporization, density), structure (radial distributions) and transport (diffusion, viscosity, conductivity) of ILs at room conditions and elevated temperature. The developed atomistic models provide a systematic refinement of the well-known Canongia Lopes-Pádua (CL&P) FF. Together with the original CL&P parameters, the present models foster the computational investigation of ionic liquids. PMID:25144141

  1. Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire

    NASA Astrophysics Data System (ADS)

    Gladyshev, Pavel; Almansoori, Afrah

    RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.

  2. Reliable Mixed H∞ and Passivity-Based Control for Fuzzy Markovian Switching Systems With Probabilistic Time Delays and Actuator Failures.

    PubMed

    Sakthivel, Rathinasamy; Selvi, Subramaniam; Mathiyalagan, Kalidass; Shi, Peng

    2015-12-01

    This paper is concerned with the problem of reliable mixed H∞ and passivity-based control for a class of stochastic Takagi-Sugeno (TS) fuzzy systems with Markovian switching and probabilistic time-varying delays. Unlike existing works, the H∞ and passivity control problem with probabilistic occurrence of time-varying delays and actuator failures is considered in a unified framework, which is more general for some practical situations. The main aim of this paper is to design a reliable mixed H∞ and passivity-based controller such that the stochastic TS fuzzy system with Markovian switching is stochastically stable with a prescribed mixed H∞ and passivity performance level γ > 0. Based on a Lyapunov-Krasovskii functional (LKF) involving the lower and upper bounds of the probabilistic time delay and a convex combination technique, a new set of delay-dependent sufficient conditions in terms of linear matrix inequalities (LMIs) is established for obtaining the required result. Finally, a numerical example based on the modified truck-trailer model is given to demonstrate the effectiveness and applicability of the proposed design techniques. PMID:25576589
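
    For orientation, one common way such papers quantify a mixed H∞/passivity performance level γ is through a single index with a weighting scalar θ ∈ [0, 1] that recovers pure H∞ performance at θ = 1 and pure passivity at θ = 0. The form below is a standard formulation from the TS-fuzzy literature, stated here as an assumption rather than the exact index used in this paper:

```latex
\mathbb{E}\int_0^{t}\Big[\,\theta\,\gamma^{-1}\, z^{\mathsf T}(s)\,z(s)
 \;-\; 2(1-\theta)\, z^{\mathsf T}(s)\,w(s)
 \;-\; \gamma\, w^{\mathsf T}(s)\,w(s)\Big]\,ds \;\le\; 0,
\qquad \theta \in [0,1],
```

    under zero initial conditions, for all nonzero disturbances w in L2[0, ∞).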

  3. Performance and reliability of multimodel hydrological ensemble simulations based on seventeen lumped models and a thousand catchments

    NASA Astrophysics Data System (ADS)

    Velázquez, J. A.; Anctil, F.; Perrin, C.

    2010-06-01

    This work investigates the added value of ensembles constructed from seventeen lumped hydrological models over their simple-average counterparts. It is thus hypothesized that more information is provided by all the outputs of these models than by their single aggregated predictor. For all 1061 available catchments, results showed that the mean continuous ranked probability score of the ensemble simulations was better than the mean average error of the aggregated simulations, confirming the added value of retaining all the components of the model outputs. Reliability of the simulation ensembles is also achieved for about 30% of the catchments, as assessed by rank histograms and reliability plots. Notwithstanding this imperfection, the ensemble simulations were shown to be better than the deterministic simulations at discriminating between events and non-events, as confirmed by relative operating characteristic scores, especially for larger streamflows. From 7 to 10 models are deemed sufficient to construct ensembles with improved performance, based on a genetic algorithm search optimizing the continuous ranked probability score. In fact, many model subsets were found to improve on the performance of the reference ensemble, so it is not essential to implement as many as seventeen lumped hydrological models. The gain in performance of the optimized subsets is accompanied by some improvement in ensemble reliability in most cases. Nonetheless, a calibration of the predictive distribution is still needed for many catchments.
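
    The score driving both the comparison and the subset search, the continuous ranked probability score, has a convenient closed-form estimator for a finite ensemble. A sketch with made-up streamflow values follows; the forecast numbers are illustrative only.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one forecast: E|X - y| - 0.5 * E|X - X'|.

    members: 1-D array of ensemble member values
    obs    : the verifying observation
    Lower is better; it reduces to absolute error for a single member.
    """
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# Hypothetical streamflow forecasts (m^3/s) from a 7-model ensemble.
members = np.array([12.0, 14.5, 13.2, 15.8, 11.9, 13.7, 14.1])
print(f"CRPS = {crps_ensemble(members, obs=13.0):.3f}")
```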

  4. Feasibility, Reliability, and Validity of a Smartphone Based Application for the Assessment of Cognitive Function in the Elderly

    PubMed Central

    Brouillette, Robert M.; Foil, Heather; Fontenot, Stephanie; Correro, Anthony; Allen, Ray; Martin, Corby K.; Bruce-Keller, Annadora J.; Keller, Jeffrey N.

    2013-01-01

    While considerable knowledge has been gained through the use of established cognitive and motor assessment tools, there is considerable interest in and need for the development of a battery of reliable and validated assessment tools that provide real-time and remote analysis of cognitive and motor function in the elderly. Smartphones appear to be an obvious choice for the development of these “next-generation” assessment tools for geriatric research, although to date no studies have reported on the use of smartphone-based applications for the study of cognition in the elderly. The primary focus of the current study was to assess the feasibility, reliability, and validity of a smartphone-based application for the assessment of cognitive function in the elderly. A total of 57 non-demented elderly individuals were administered a newly developed smartphone application-based Color-Shape Test (CST) in order to determine its utility in measuring cognitive processing speed in the elderly. Validity of this novel cognitive task was assessed by correlating performance on the CST with scores on widely accepted assessments of cognitive function. Scores on the CST were significantly correlated with global cognition (Mini-Mental State Exam: r = 0.515, p<0.0001) and multiple measures of processing speed and attention (Digit Span: r = 0.427, p<0.0001; Trail Making Test: r = −0.651, p<0.00001; Digit Symbol Test: r = 0.508, p<0.0001). The CST was not correlated with naming and verbal fluency tasks (Boston Naming Test, Vegetable/Animal Naming) or memory tasks (Logical Memory Test). Test re-test reliability was observed to be significant (r = 0.726; p = 0.02). Together, these data are the first to demonstrate the feasibility, reliability, and validity of using a smartphone-based application for the purpose of assessing cognitive function in the elderly. The importance of these findings for the establishment of smartphone-based assessment batteries of

  5. Feasibility, reliability, and validity of a smartphone based application for the assessment of cognitive function in the elderly.

    PubMed

    Brouillette, Robert M; Foil, Heather; Fontenot, Stephanie; Correro, Anthony; Allen, Ray; Martin, Corby K; Bruce-Keller, Annadora J; Keller, Jeffrey N

    2013-01-01

    While considerable knowledge has been gained through the use of established cognitive and motor assessment tools, there is considerable interest in and need for the development of a battery of reliable and validated assessment tools that provide real-time and remote analysis of cognitive and motor function in the elderly. Smartphones appear to be an obvious choice for the development of these "next-generation" assessment tools for geriatric research, although to date no studies have reported on the use of smartphone-based applications for the study of cognition in the elderly. The primary focus of the current study was to assess the feasibility, reliability, and validity of a smartphone-based application for the assessment of cognitive function in the elderly. A total of 57 non-demented elderly individuals were administered a newly developed smartphone application-based Color-Shape Test (CST) in order to determine its utility in measuring cognitive processing speed in the elderly. Validity of this novel cognitive task was assessed by correlating performance on the CST with scores on widely accepted assessments of cognitive function. Scores on the CST were significantly correlated with global cognition (Mini-Mental State Exam: r = 0.515, p<0.0001) and multiple measures of processing speed and attention (Digit Span: r = 0.427, p<0.0001; Trail Making Test: r = -0.651, p<0.00001; Digit Symbol Test: r = 0.508, p<0.0001). The CST was not correlated with naming and verbal fluency tasks (Boston Naming Test, Vegetable/Animal Naming) or memory tasks (Logical Memory Test). Test re-test reliability was observed to be significant (r = 0.726; p = 0.02). Together, these data are the first to demonstrate the feasibility, reliability, and validity of using a smartphone-based application for the purpose of assessing cognitive function in the elderly. The importance of these findings for the establishment of smartphone-based assessment batteries of cognitive

  6. Human reliability-based MC&A models for detecting insider theft.

    SciTech Connect

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-06-01

    Material control and accounting (MC&A) safeguards operations, which track and account for critical assets at nuclear facilities, provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC&A activities have many characteristics similar to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC&A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC&A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness against attacks by both outside and inside adversaries.
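
    In probabilistic path analysis, independent detection elements along an adversary path combine multiplicatively. A minimal sketch of that bookkeeping, with invented probabilities (including an HRA-derived MC&A check) rather than values from the study:

```python
# Overall probability that at least one element along an adversary path
# detects the attempt, assuming independent detection elements.
def path_detection_probability(p_detect):
    p_miss = 1.0
    for p in p_detect:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# Hypothetical path: two conventional sensors plus an MC&A inventory
# check whose detection probability comes from a human reliability model.
sensors = [0.80, 0.65]
mca_inventory_check = 0.40   # assumed HRA-derived value, for illustration
print(path_detection_probability(sensors + [mca_inventory_check]))  # ~0.958
```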

  7. Highly reliable 198-nm light source for semiconductor inspection based on dual fiber lasers

    NASA Astrophysics Data System (ADS)

    Imai, Shinichi; Matsuki, Kazuto; Kikuiri, Nobutaka; Takayama, Katsuhiko; Iwase, Osamu; Urata, Yoshiharu; Shinozaki, Tatsuya; Wada, Yoshio; Wada, Satoshi

    2010-02-01

    Highly reliable DUV light sources are required for semiconductor applications such as photomask inspection. Mask inspection for advanced devices requires UV light at wavelengths below 200 nm. By using dual fiber lasers as fundamental light sources and multi-wavelength conversion, we have constructed a 198 nm light source with more than 100 mW of output. The first laser is an Yb-doped fiber laser operating at 1064 nm; the second is an Er-doped fiber laser at 1560 nm. To obtain robustness and to simplify the configuration, the fundamental lights are run in pulsed operation and all wavelength conversions are made in a single-pass scheme. Pulse repetition frequencies (PRFs) of more than 2 MHz are chosen as an alternative to a CW light source; light at such a high PRF is equivalent to CW light for the inspection cameras. The light source is operated as follows: automatic weekly maintenance, taking less than an hour, is done if required; automatic monthly maintenance, taking less than 4 hours, is done on a fixed date each month; and manufacturer's maintenance is done every 6 months. These 198 nm light sources are now installed in leading-edge photomask inspection machines.
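
    The target wavelength follows from photon-energy conservation: harmonics divide the wavelength, and sum-frequency generation (SFG) adds inverse wavelengths, 1/λ_out = 1/λ_1 + 1/λ_2. The abstract does not spell out the product's conversion chain, so the route below (fourth harmonic of 1064 nm mixed with the second harmonic of 1560 nm) is only one plausible assumption that lands at 198 nm:

```python
# Photon-energy bookkeeping for nonlinear wavelength conversion:
# harmonics divide the wavelength; sum-frequency generation (SFG) adds
# inverse wavelengths. The specific chain below is assumed for
# illustration; the abstract does not detail the actual stages.
def harmonic(lam_nm, n):
    return lam_nm / n

def sfg(lam1_nm, lam2_nm):
    return 1.0 / (1.0 / lam1_nm + 1.0 / lam2_nm)

fourth_of_yb = harmonic(1064.0, 4)      # 266 nm
second_of_er = harmonic(1560.0, 2)      # 780 nm
print(sfg(fourth_of_yb, second_of_er))  # ~198.4 nm
```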

  8. A reliable modeless mobile multi-sensor integration technique based on RLS-lattice

    NASA Astrophysics Data System (ADS)

    El-Gizawy, Mahmoud; Noureldin, Aboelmagd; El-Sheimy, Naser

    2006-01-01

    The last two decades have witnessed an increasing trend in integrating different navigation sensors for the overall purpose of overcoming the limitations of stand-alone systems. An example of this integration is the fusion of the global positioning system (GPS) and inertial navigation system (INS) for several navigation and positioning applications. Both systems have their unique features and shortcomings. Therefore, their integration offers a robust navigation solution. This paper introduces a novel multi-sensor system integration using a recursive least-squares lattice (RLSL) filter. The proposed system has a structure similar to the widely used Kalman filter (KF). However, it has the major advantage of working without the need for either dynamic or stochastic models. Furthermore, no prior information about the covariances of the INS and GPS errors is required. The proposed RLSL filter parameters, similar to the Kalman gain, are tuned recursively in the update mode utilizing the GPS velocity components. The RLSL, in turn, can filter out the high-frequency noise associated with the INS. To test the capabilities of the proposed architecture, a field test was conducted in a land vehicle using a tactical-grade INS (the Honeywell HG1700) integrated with differential GPS measurements collected by a NovAtel OEM4 GPS receiver. The proposed system was examined during the availability of the GPS signal and with intentionally introduced GPS signal outages. The results indicate that the proposed RLSL system is robust in providing a reliable modeless INS/GPS integration module.
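
    The lattice filter used in the paper is an order-recursive variant of recursive least squares. A minimal sketch of the plain RLS update (not the lattice implementation), tuning weights from a reference signal the way the RLSL is updated from GPS velocity; all data are synthetic:

```python
import numpy as np

# Plain recursive least squares with forgetting factor lam. The lattice
# (RLSL) form in the paper is order-recursive, but the idea of updating
# filter parameters from a GPS-derived reference is the same.
class RLS:
    def __init__(self, n, lam=0.99, delta=100.0):
        self.w = np.zeros(n)          # weight estimate
        self.P = delta * np.eye(n)    # inverse correlation matrix
        self.lam = lam

    def update(self, x, d):
        # x: regressor (e.g., raw INS velocity terms), d: GPS velocity sample
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)          # gain vector
        e = d - self.w @ x                    # a priori error
        self.w += k * e
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return self.w @ x                     # filtered output

rls = RLS(n=3)
rng = np.random.default_rng(1)
for _ in range(200):
    x = rng.normal(size=3)
    d = np.array([0.5, -1.0, 2.0]) @ x + rng.normal(scale=0.1)
    rls.update(x, d)
print(rls.w)   # converges towards [0.5, -1.0, 2.0]
```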

  9. Tumor Heterogeneity: Mechanisms and Bases for a Reliable Application of Molecular Marker Design

    PubMed Central

    Diaz-Cano, Salvador J.

    2012-01-01

    Tumor heterogeneity is a confusing finding in the assessment of neoplasms, potentially resulting in inaccurate diagnostic, prognostic and predictive tests. This tumor heterogeneity is not always a random and unpredictable phenomenon, and knowledge of its patterns helps in designing better tests. The biologic reasons for this intratumoral heterogeneity are therefore important for understanding both the natural history of neoplasms and the selection of test samples for reliable analysis. The main factors contributing to intratumoral heterogeneity, by inducing gene abnormalities or modifying gene expression, include: the ischemic gradient within neoplasms, the action of the tumor microenvironment (bidirectional interaction between tumor cells and stroma), mechanisms of intercellular transfer of genetic information (exosomes), and differential mechanisms of sequence-independent modifications of genetic material and proteins. Intratumoral heterogeneity is at the origin of tumor progression and is also a byproduct of the selection process during progression. Any analysis of heterogeneity mechanisms must be integrated within the process of segregation of genetic changes in tumor cells during the clonal expansion and progression of neoplasms. The evaluation of these mechanisms must also consider the redundancy and pleiotropism of molecular pathways, for which appropriate surrogate markers would support the presence or absence of heterogeneous genetics and the main mechanisms responsible. This knowledge would constitute a solid scientific background for future therapeutic planning. PMID:22408433

  10. Disposable and reliable electrochemical magnetoimmunosensor for Fumonisins simplified determination in maize-based foodstuffs.

    PubMed

    Jodra, Adrián; López, Miguel Ángel; Escarpa, Alberto

    2015-02-15

    An electrochemical magnetoimmunosensor involving magnetic beads and a disposable carbon screen-printed electrode (CSPE) for Fumonisins (FB1, FB2 and FB3) has been developed and evaluated with a certified reference material (CRM) and beer samples. Once the immunochemical reactions had taken place in the magnetic bead solution, the beads were confined on the surface of the CSPE, where electrochemical detection is achieved through the addition of a suitable substrate and mediator for the enzymatic tracer (horseradish peroxidase, HRP). A remarkable detection limit of 0.33 μg L(-1), outstanding repeatability and reproducibility (RSD(intraday) of 5.6% and 2.9%; RSD(interday) of 6.9% and 6.0%, for 0 and 5 μg L(-1) FB1, respectively), and excellent accuracy with a recovery rate of 85-96% showed the suggested approach to be a very suitable screening tool for the analysis of Fumonisin B1 and B2 in food samples. A simultaneous simplified calibration and analysis protocol allows fast and reliable determination of Fumonisins in beer samples with a recovery rate of 87-105%. This strategy enhanced the analytical merits of the immunosensor approach towards truly disposable tools for food-safety monitoring. PMID:25441412

  11. Reliability and Validity of Web-Based Portfolio Peer Assessment: A Case Study for a Senior High School's Students Taking Computer Course

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Tseng, Kuo-Hung; Chou, Pao-Nan; Chen, Yi-Hui

    2011-01-01

    This study examined the reliability and validity of Web-based portfolio peer assessment. Participants were 72 second-grade students from a senior high school taking a computer course. The results indicated that: 1) there was a lack of consistency across various student raters on a portfolio, or inter-rater reliability; 2) two-thirds of the raters…

  12. Reliable emotion recognition system based on dynamic adaptive fusion of forehead biopotentials and physiological signals.

    PubMed

    Khezri, Mahdi; Firoozabadi, Mohammad; Sharafat, Ahmad Reza

    2015-11-01

    In this study, we proposed a new adaptive method for fusing multiple emotional modalities to improve the performance of the emotion recognition system. Three-channel forehead biosignals along with peripheral physiological measurements (blood volume pressure, skin conductance, and interbeat intervals) were utilized as emotional modalities. Six basic emotions, i.e., anger, sadness, fear, disgust, happiness, and surprise, were elicited by displaying preselected video clips for each of the 25 participants in the experiment; the physiological signals were collected simultaneously. In our multimodal emotion recognition system, the recorded signals formed several classification units, each of which identified the emotions independently. The results were then fused using an adaptive weighted linear model to produce the final result. Each classification unit is assigned a weight that is determined dynamically by considering the performance of the units during the testing phase and the training phase results. This dynamic weighting scheme enables the emotion recognition system to adapt itself to each new user. The results showed that the suggested method outperformed conventional fusion of the features and classification units using the majority voting method. In addition, a considerable improvement compared to systems that used static weighting schemes for fusing classification units was also shown. Using support vector machine (SVM) and k-nearest neighbors (KNN) classifiers, overall classification accuracies of 84.7% and 80% were obtained in identifying the emotions, respectively. In addition, applying the forehead or physiological signals in the proposed scheme indicates that designing a reliable emotion recognition system is feasible without the need for additional emotional modalities. PMID:26253158
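
    The fusion step is a weighted linear combination of per-unit decisions, with weights tied to each unit's observed accuracy. A minimal sketch of such dynamic weighting, with invented scores and accuracies:

```python
import numpy as np

# Adaptive weighted linear fusion sketch: each classification unit outputs
# a score vector over the six basic emotions; units are weighted by their
# (assumed) accuracies and the weighted sum picks the final label.
EMOTIONS = ["anger", "sadness", "fear", "disgust", "happiness", "surprise"]

def fuse(unit_scores, unit_accuracies):
    w = np.asarray(unit_accuracies, dtype=float)
    w /= w.sum()                                      # normalize weights
    combined = np.tensordot(w, np.asarray(unit_scores), axes=1)
    return EMOTIONS[int(np.argmax(combined))], combined

scores = [
    [0.1, 0.1, 0.1, 0.1, 0.5, 0.1],   # forehead-biosignal unit (invented)
    [0.0, 0.2, 0.1, 0.1, 0.4, 0.2],   # skin-conductance unit (invented)
    [0.3, 0.1, 0.1, 0.1, 0.3, 0.1],   # interbeat-interval unit (invented)
]
label, _ = fuse(scores, unit_accuracies=[0.85, 0.70, 0.60])
print(label)   # "happiness"
```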

  13. Probing Reliability of Transport Phenomena Based Heat Transfer and Fluid Flow Analysis in Autogenous Fusion Welding Process

    NASA Astrophysics Data System (ADS)

    Bag, S.; de, A.

    2010-09-01

    The transport phenomena based heat transfer and fluid flow calculations in the weld pool require a number of input parameters. Arc efficiency, effective thermal conductivity, and viscosity in the weld pool are some of these parameters, whose values are rarely known and difficult to assign a priori based on scientific principles alone. The present work reports a bi-directional three-dimensional (3-D) heat transfer and fluid flow model, which is integrated with a real-number-based genetic algorithm. The bi-directional feature of the integrated model allows the identification of the values of a required set of uncertain model input parameters and, next, the design of process parameters to achieve a target weld pool dimension. The computed values are validated with measured results in linear gas-tungsten-arc (GTA) weld samples. Furthermore, a novel methodology to estimate the overall reliability of the computed solutions is also presented.

  14. Robot-Assisted End-Effector-Based Stair Climbing for Cardiopulmonary Exercise Testing: Feasibility, Reliability, and Repeatability

    PubMed Central

    Stoller, Oliver; Schindelholz, Matthias; Hunt, Kenneth J.

    2016-01-01

    Background Neurological impairments can limit the implementation of conventional cardiopulmonary exercise testing (CPET) and cardiovascular training strategies. A promising approach to provoke cardiovascular stress while facilitating task-specific exercise in people with disabilities is feedback-controlled robot-assisted end-effector-based stair climbing (RASC). The aim of this study was to evaluate the feasibility, reliability, and repeatability of augmented RASC-based CPET in able-bodied subjects, with a view towards future research and applications in neurologically impaired populations. Methods Twenty able-bodied subjects performed a familiarisation session and 2 consecutive incremental CPETs using augmented RASC. Outcome measures focussed on standard cardiopulmonary performance parameters and on accuracy of work rate tracking (RMSEP, root mean square error). Criteria for feasibility were cardiopulmonary responsiveness and technical implementation. Relative and absolute test-retest reliability were assessed by intraclass correlation coefficients (ICC), standard error of the measurement (SEM), and minimal detectable change (MDC). Mean differences, limits of agreement, and coefficients of variation (CoV) were estimated to assess repeatability. Results All criteria for feasibility were achieved. Mean V′O2peak was 106±9% of predicted V′O2max and mean HRpeak was 99±3% of predicted HRmax. 95% of the subjects achieved at least 1 criterion for V′O2max, and the detection of the sub-maximal ventilatory thresholds was successful (ventilatory anaerobic threshold 100%, respiratory compensation point 90% of the subjects). Excellent reliability was found for peak cardiopulmonary outcome measures (ICC ≥ 0.890, SEM ≤ 0.60%, MDC ≤ 1.67%). Repeatability for the primary outcomes was good (CoV ≤ 0.12). Conclusions RASC-based CPET with feedback-guided exercise intensity demonstrated comparable or higher peak cardiopulmonary performance variables relative to
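
    The absolute-reliability statistics reported here follow directly from the ICC and the between-subject spread: SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM. A small sketch with invented numbers:

```python
import math

# Absolute test-retest reliability from relative reliability:
# SEM   = SD * sqrt(1 - ICC)       (standard error of measurement)
# MDC95 = 1.96 * sqrt(2) * SEM     (minimal detectable change, 95% level)
def sem(sd, icc):
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd, icc):
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# Hypothetical VO2peak data: between-subject SD 4.0 ml/kg/min, ICC 0.95
print(sem(4.0, 0.95))    # ~0.89 ml/kg/min
print(mdc95(4.0, 0.95))  # ~2.48 ml/kg/min
```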

  15. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    SciTech Connect

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-08-15

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ≈1 mm² areas and ≥2% in ≈20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified.
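
    A pixel-intensity-deviation test of this kind is straightforward to sketch: difference two registered images, suppress noise, and flag pixels whose relative deviation exceeds a threshold. The filter size and threshold below are assumptions, not the paper's tuned values, and the gradient-scaling step is omitted:

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_anomalies(reference, test, rel_threshold=0.02, filt=3):
    """Flag pixels whose intensity deviates from the reference by more than
    rel_threshold (e.g. 2%) after median filtering suppresses background
    noise. Filter size and threshold are illustrative assumptions."""
    ref = median_filter(reference.astype(float), size=filt)
    tst = median_filter(test.astype(float), size=filt)
    pid = (tst - ref) / np.maximum(ref, 1e-6)   # pixel intensity deviation
    return np.abs(pid) > rel_threshold

rng = np.random.default_rng(0)
reference = 1000.0 + rng.normal(0, 5, (256, 256))
test = reference + rng.normal(0, 5, (256, 256))
test[100:120, 100:120] *= 1.05                  # insert a 5% fluence anomaly
mask = detect_anomalies(reference, test)
print(mask.sum(), "pixels flagged")
```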

  16. Hardware based redundant multi-threading inside a GPU for improved reliability

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-05-05

    A system and method for verifying computation output using computer hardware are provided. Instances of computation are generated and processed on hardware-based processors. As instances of computation are processed, each instance of computation receives a load accessible to other instances of computation. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware based processor to ensure accuracy of the output.

  17. Position-Based k-Disjoint Path Routing for Reliable Data Gathering in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Baek, Jang Woon; Nam, Young Jin; Seo, Dae-Wha

    This paper proposes a novel routing algorithm that constructs position-based k-disjoint paths to realize greater resiliency to patterned failure. The proposed algorithm constructs k-disjoint paths that are spatially distributed by using a hop-count-based positioning system. Simulation results reveal that the proposed algorithm is more resilient to patterned failure than other routing algorithms, while it has low power consumption and small delay.
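
    Disjoint-path construction is easy to prototype on a graph model. A minimal sketch using networkx to extract node-disjoint paths between a source sensor and the sink (topology invented; the paper's algorithm additionally spreads the paths spatially using hop-count-based positions):

```python
import networkx as nx

# Toy sensor-network topology (assumed for illustration)
G = nx.Graph()
G.add_edges_from([
    ("src", "a"), ("a", "b"), ("b", "sink"),
    ("src", "c"), ("c", "d"), ("d", "sink"),
    ("src", "e"), ("e", "f"), ("f", "sink"),
    ("a", "d"),
])

# Node-disjoint paths survive any single intermediate-node failure;
# spatially separated disjoint paths also resist patterned failures.
paths = list(nx.node_disjoint_paths(G, "src", "sink"))
print(len(paths), "disjoint paths")
for p in paths:
    print(" -> ".join(p))
```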

  18. A Logarithmic Opinion Pool Based STAPLE Algorithm For The Fusion of Segmentations With Associated Reliability Weights

    PubMed Central

    Akhondi-Asl, Alireza; Hoyte, Lennox; Lockhart, Mark E.; Warfield, Simon K.

    2014-01-01

    Pelvic floor dysfunction is very common in women after childbirth, and precise segmentation of magnetic resonance images (MRI) of the pelvic floor may facilitate diagnosis and treatment of patients. However, because of the complexity of the structures of the pelvic floor, manual segmentation of the pelvic floor is challenging and suffers from high inter- and intra-rater variability of expert raters. Multiple template fusion algorithms are promising techniques for segmentation of MRI in these types of applications, but these algorithms have been limited by imperfections in the alignment of each template to the target, and by template segmentation errors. In this class of segmentation techniques, a collection of templates is aligned to a target, and a new segmentation of the target is inferred. A number of algorithms sought to improve segmentation performance by combining image intensities and template labels as two independent sources of information, carrying out decision fusion through local intensity weighted voting schemes. This class of approach is a form of linear opinion pooling, and achieves unsatisfactory performance for this application. We hypothesized that better decision fusion could be achieved by assessing the contribution of each template in comparison to a reference standard segmentation of the target image, and developed a novel segmentation algorithm to enable automatic segmentation of MRI of the female pelvic floor. The algorithm achieves high performance by estimating and compensating for both imperfect registration of the templates to the target image and template segmentation inaccuracies. The algorithm is a generalization of the STAPLE algorithm in which a reference segmentation is estimated and used to infer an optimal weighting for fusion of templates. A local image similarity measure is used to infer a local reliability weight, which contributes to the fusion through a novel logarithmic opinion pooling. We evaluated our new algorithm in comparison
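
    The logarithmic opinion pool combines template probabilities multiplicatively, weighting each template by its local reliability: p(x) ∝ ∏ p_i(x)^{w_i}. A minimal sketch for a single voxel's foreground probability, with invented probabilities and weights:

```python
import numpy as np

def log_opinion_pool(probs, weights, eps=1e-9):
    """Fuse per-template foreground probabilities p_i with reliability
    weights w_i via p ∝ prod(p_i ** w_i), renormalized against the
    complementary background pool."""
    probs = np.clip(np.asarray(probs, dtype=float), eps, 1.0 - eps)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    fg = np.exp(np.sum(w * np.log(probs)))        # weighted geometric pool
    bg = np.exp(np.sum(w * np.log(1.0 - probs)))
    return fg / (fg + bg)

# Three aligned templates vote on one voxel; the second template is
# locally more reliable (higher image similarity), so it carries more weight.
print(log_opinion_pool([0.6, 0.9, 0.4], weights=[0.2, 0.6, 0.2]))  # ~0.79
```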

  19. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, R. E.

    1971-01-01

    A method and a computer program are presented to calculate probability of system success from an arbitrary reliability block diagram. The class of reliability block diagrams that can be handled include any active/standby combination of redundancy, and the computations include the effects of dormancy and switching in any standby redundancy. The mechanics of the program are based on an extension of the probability tree method of computing system probabilities.
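
    Reliability block diagrams reduce to recursive series/parallel composition: series blocks must all work, R = ∏ R_i, while active-parallel redundancy fails only if every branch fails, R = 1 − ∏ (1 − R_i). A minimal sketch of that evaluation (component values invented; the program's dormancy and switching effects are omitted):

```python
# Recursive evaluation of a series/parallel reliability block diagram.
# Leaves are component reliabilities; ("series", ...) and ("parallel", ...)
# nodes compose them. Dormancy/switching effects from the report are omitted.
def system_reliability(node):
    if isinstance(node, float):
        return node
    kind, *children = node
    r = [system_reliability(c) for c in children]
    if kind == "series":
        out = 1.0
        for ri in r:
            out *= ri
        return out
    if kind == "parallel":            # active redundancy
        out = 1.0
        for ri in r:
            out *= (1.0 - ri)
        return 1.0 - out
    raise ValueError(kind)

# Example: two series components feeding a 2-branch redundant stage
diagram = ("series", 0.99, 0.95, ("parallel", 0.90, 0.90))
print(system_reliability(diagram))   # 0.99 * 0.95 * (1 - 0.1*0.1) ≈ 0.9311
```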

  1. Validity and reliability of a dish-based, semi-quantitative food frequency questionnaire for Korean diet and cancer research.

    PubMed

    Park, Min Kyung; Noh, Hwa Young; Song, Na Yeun; Paik, Hee Young; Park, Sohee; Joung, Hyojee; Song, Won O; Kim, Jeongseon

    2012-01-01

    This study evaluated the validity and reliability of a newly developed dish-based, semi-quantitative food frequency questionnaire (FFQ) for Korean diet and cancer research. The subjects in the present study were 288 Korean adults over 30 years of age who had completed two FFQs and four 3-day diet records (DRs) from May 2008 to February 2009. Student's t-tests, Chi-square tests, and Spearman's rank correlation coefficients were used to estimate and compare intakes from the different dietary assessment tools. Agreement in quintiles was calculated to validate agreement between the results of the second FFQ (FFQ-2), conducted in February 2009, and the DRs. Median Spearman's correlation coefficients between the intakes of nutrients and foods assessed by the FFQ-1 and FFQ-2 were 0.59 and 0.57, respectively, and the coefficients between the intakes of nutrients and foods assessed by the FFQ-2 and the DRs were 0.31 and 0.29, respectively. Classification into the same or an adjacent quintile for intakes of nutrients and foods occurred for 64% and 65% of subjects, respectively. Misclassification into opposite quintiles occurred in less than 5% for all dietary factors. Thus, this newly developed Korean dish-based FFQ demonstrated moderate correspondence with the four 3-day DRs. Its reliability and validity are comparable to those reported in other studies. PMID:22524822

  2. Fabrication of holey-fiber-based optical patch cords with bending insensitivity and their feasible reliability

    NASA Astrophysics Data System (ADS)

    Han, Young-Geon; Shim, Yu Tae; Song, Seok Ho; Jung, Chang-Hyun; Ouh, Chi-Hwan; Ryu, Ki-Sun; Kang, Hee-Jeon; Lee, Sang-Bae

    2008-02-01

    We propose holey-fiber-based optical cord cables with excellent optical characteristics in terms of attenuation loss, insertion loss, and bending loss for application to access networks such as FTTH. The attenuation loss of the holey-fiber-based cord cable with a length of 1 km at 1383 nm is as low as ~0.323 dB/km. The bending loss under a bending diameter of 5.5 mm is measured to be ~0.04 dB. We also test the proposed holey-fiber-based optical cable for E-PON transmission systems by measuring the internet speed at the server under bending diameters of 7, 3, and 1 cm.

  3. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... time or age limit; (3) Discard of an item (or parts of it) at or before some specified life limit; and... overall failure rate of simple components or systems. Here, safe-life limits, fail-safe designs, or damage tolerance-based residual life calculations may be imposed on a single component or system to play a...

  4. Reliability and Validity of Authentic Assessment in a Web Based Course

    ERIC Educational Resources Information Center

    Olfos, Raimundo; Zulantay, Hildaura

    2007-01-01

    Web-based courses are promising in that they are effective and have the possibility of their instructional design being improved over time. However, the assessments of said courses are criticized in terms of their validity. This paper is an exploratory case study regarding the validity of the assessment system used in a semi-presential web-based…

  5. From Fulcher to PLEVALEX: Issues in Interface Design, Validity and Reliability in Internet Based Language Testing

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus

    2007-01-01

    Interface design and ergonomics, while already studied in much of educational theory, have not until recently been considered in language testing (Fulcher, 2003). In this paper, we review the design principles of PLEVALEX, a fully operational prototype Internet-based language testing platform. Our focus here is to show PLEVALEX's interfaces and…

  6. Child and Adolescent Behaviorally Based Disorders: A Critical Review of Reliability and Validity

    ERIC Educational Resources Information Center

    Mallett, Christopher A.

    2014-01-01

    Objectives: The purpose of this study was to investigate the historical construction and empirical support of two child and adolescent behaviorally based mental health disorders: oppositional defiant and conduct disorders. Method: The study utilized a historiography methodology to review, from 1880 to 2012, these disorders' inclusion in…

  7. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  8. Note: Reliable and non-contact 6D motion tracking system based on 2D laser scanners for cargo transportation

    SciTech Connect

    Kim, Young-Keun; Kim, Kyung-Soo

    2014-10-15

    Maritime transportation demands an accurate measurement system to track the motion of oscillating container boxes in real time. However, it is a challenge to design a sensor system that can provide reliable, non-contact 6-DOF motion measurement of a remote object for outdoor applications. In this paper, a sensor system based on two 2D laser scanners is proposed for detecting the relative 6-DOF motion of a crane load in real time. Even without a camera, the proposed system can detect the motion of a remote object using four laser beam points. Because it is a laser-based sensor, the system is expected to be highly robust to sea weather conditions.
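
    Recovering the 6-DOF pose of the load from the four measured beam points is a rigid-transform estimation problem. A minimal sketch using the SVD-based (Kabsch) solution; the abstract does not detail the paper's own estimator, and all coordinates are invented:

```python
import numpy as np

def rigid_transform(P, Q):
    """Best-fit rotation R and translation t with Q ≈ R @ P + t,
    via the SVD-based Kabsch method. P, Q: (N, 3) corresponding points."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # fix an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

# Four reference beam points on the container (object frame, invented)
P = np.array([[0, 0, 0], [2, 0, 0], [0, 1, 0], [0, 0, 1.5]], dtype=float)
# Their measured positions after the load swings (also invented)
angle = np.deg2rad(10)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([0.3, -0.2, 1.0])
R, t = rigid_transform(P, Q)
print(np.round(R, 3), np.round(t, 3))   # recovers the applied pose
```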

  9. Creep-Fatigue Life Prediction and Reliability Analysis of P91 Steel Based on Applied Mechanical Work Density

    NASA Astrophysics Data System (ADS)

    Ji, D. M.; Shen, M.-H. H.; Wang, D. X.; Ren, J. X.

    2015-01-01

    A creep-fatigue (CF) life prediction model and its simplified expression were developed based on the applied mechanical work density (AMWD). The foundation of this model was an integration of the N-S curve. Comparisons of the model-predicted fatigue lifetimes with experimental data from load-controlled CF tests on P91 base metal and welded metal at 848 K, taken from the literature, clearly illustrated that the model predictions were in good agreement with the experimental fatigue lifetimes. In addition, the curve of the number of cycles to failure versus AMWD at the associated probability was deduced. A reliability model was constructed by combining this curve with the simplified life prediction model.

  10. Note: Reliable and non-contact 6D motion tracking system based on 2D laser scanners for cargo transportation

    NASA Astrophysics Data System (ADS)

    Kim, Young-Keun; Kim, Kyung-Soo

    2014-10-01

    Maritime transportation demands an accurate measurement system to track the motion of oscillating container boxes in real time. However, it is a challenge to design a sensor system that can provide reliable, non-contact 6-DOF motion measurement of a remote object for outdoor applications. In this paper, a sensor system based on two 2D laser scanners is proposed for detecting the relative 6-DOF motion of a crane load in real time. Even without a camera, the proposed system can detect the motion of a remote object using four laser beam points. Because it is a laser-based sensor, the system is expected to be highly robust to sea weather conditions.

  11. Test-Retest Reliability and Convergent Validity of a Computer Based Hand Function Test Protocol in People with Arthritis

    PubMed Central

    Srikesavan, Cynthia S.; Shay, Barbara; Szturm, Tony

    2015-01-01

    Objectives: A computer based hand function assessment tool has been developed to provide a standardized method for quantifying task performance during manipulations of common objects/tools/utensils with diverse physical properties and grip/grasp requirements for handling. The study objectives were to determine the test-retest reliability and convergent validity of the test protocol in people with arthritis. Methods: Three different object manipulation tasks were evaluated twice in forty people with rheumatoid arthritis (RA) or hand osteoarthritis (HOA). Each object was instrumented with a motion sensor and moved in concert with a computer generated visual target. Self-reported joint pain and stiffness levels were recorded before and after each task. Task performance was determined by comparing the object movement with the computer target motion. This was correlated with grip strength, the nine hole peg test, the Disabilities of Arm, Shoulder, and Hand (DASH) questionnaire, and the Health Assessment Questionnaire (HAQ) scores. Results: The test protocol indicated moderate to high test-retest reliability of performance measures for the three manipulation tasks, with intraclass correlation coefficients (ICCs) ranging from 0.5 to 0.84, p<0.05. Strength of association between task performance measures and self-reported activity/participation composite scores was low to moderate (Spearman rho <0.7). Low correlations (Spearman rho < 0.4) were observed between task performance measures and grip strength, and among the three objects’ performance measures. Significant reduction in pain and joint stiffness (p<0.05) was observed after performing each task. Conclusion: The study presents initial evidence on the test-retest reliability and convergent validity of a computer based hand function assessment protocol in people with rheumatoid arthritis or hand osteoarthritis. The novel tool objectively measures overall task performance during a variety of object manipulation tasks done by tracking a

  12. A measurement-based model of software reliability in a production environment

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.

    1987-01-01

    In this paper, a semi-Markov model is built to describe the software error and recovery process in a large mainframe system. The model is based on low-level error data from the MVS operating system running on an IBM 3081 machine. The semi-Markov model developed provides a quantification of system error characteristics and of the interaction between different types of errors. As an example, a detailed model is provided, and an analysis is made of multiple errors, which constitute approximately 17 percent of all software errors and result in considerable recovery overhead.

  13. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric method?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  14. SPARK: Sparsity-based analysis of reliable k-hubness and overlapping network structure in brain functional connectivity.

    PubMed

    Lee, Kangjoo; Lina, Jean-Marc; Gotman, Jean; Grova, Christophe

    2016-07-01

    Functional hubs are defined as the specific brain regions with dense connections to other regions in a functional brain network. Among them, connector hubs are of great interest, as they are assumed to promote global and hierarchical communications between functionally specialized networks. Damage to connector hubs may have a more crucial effect on the system than does damage to other hubs. Hubs in graph theory are often identified from a correlation matrix, and classified as connector hubs when the hubs are more connected to regions in other networks than within the networks to which they belong. However, the identification of hubs from functional data is more complex than that from structural data, notably because of the inherent problem of multicollinearity between temporal dynamics within a functional network. In this context, we developed and validated a method to reliably identify connectors and the corresponding overlapping network structure from resting-state fMRI. This new method actually handles the multicollinearity issue, since it does not rely on counting the number of connections from a thresholded correlation matrix. The novelty of the proposed method is that besides counting the number of networks involved in each voxel, it allows us to identify which networks are actually involved in each voxel, using a data-driven sparse general linear model in order to identify brain regions involved in more than one network. Moreover, we added a bootstrap resampling strategy to assess statistically the reproducibility of our results at the single subject level. The unified framework is called SPARK, i.e. SParsity-based Analysis of Reliable k-hubness, where k-hubness denotes the number of networks overlapping in each voxel. The accuracy and robustness of SPARK were evaluated using two-dimensional box simulations and realistic simulations that examined detection of artificial hubs generated on real data. Then, test/retest reliability of the method was assessed

  15. Reliability of signals from a chronically implanted, silicon-based electrode array in non-human primate primary motor cortex.

    PubMed

    Suner, Selim; Fellows, Matthew R; Vargas-Irwin, Carlos; Nakata, Gordon Kenji; Donoghue, John P

    2005-12-01

    Multiple-electrode arrays are valuable both as a research tool and as a sensor for neuromotor prosthetic devices, which could potentially restore voluntary motion and functional independence to paralyzed humans. Long-term array reliability is an important requirement for these applications. Here, we demonstrate the reliability of a regular array of 100 microelectrodes in obtaining neural recordings from primary motor cortex (MI) of monkeys for at least three months and up to 1.5 years. We implanted Bionic (Cyberkinetics, Inc., Foxboro, MA) silicon probe arrays in MI of three Macaque monkeys. Neural signals were recorded during performance of an eight-direction, push-button task. Recording reliability was evaluated for 18, 35, or 51 sessions distributed over 83, 179, and 569 days after implantation, respectively, using qualitative and quantitative measures. A four-point signal quality scale was defined based on the waveform amplitude relative to noise. A single observer applied this scale to score signal quality for each electrode. A mean of 120 (±17.6 SD), 146 (±7.3), and 119 (±16.9) neural-like waveforms were observed from 65-85 electrodes across subjects for all recording sessions, of which over 80% were of high quality. Quantitative measures demonstrated that waveforms had a signal-to-noise ratio (SNR) of up to 20, with maximum peak-to-peak amplitude of over 1200 μV and a mean SNR of 4.8 for signals ranked as high quality. Mean signal quality did not change over the duration of the evaluation period (slopes 0.001, 0.0068, and 0.03; NS). By contrast, neural waveform shape varied between, but not within, days in all animals, suggesting a shifting population of recorded neurons over time. Arm-movement-related modulation was common and 66% of all recorded neurons were tuned to reach direction. The ability of the array to record neural signals from parietal cortex was also established. These results demonstrate that neural recordings that can provide movement

  16. Network-based identification of reliable bio-markers for cancers.

    PubMed

    Deng, Shiguo; Qi, Jingchao; Stephen, Mutua; Qiu, Lu; Yang, Huijie

    2015-10-21

    Finding bio-markers for complex disease from gene expression profiles attracts extensive attention for their potential use in diagnosis, therapy, and drug design. In this paper we propose a network-based method to seek high-confidence bio-markers from candidate genes collected in the literature. The algorithm includes three consecutive steps. First, proposed bio-markers are collected from the literature as preliminary candidates; second, a spanning-tree-based threshold is used to reconstruct gene networks for normal and cancer samples; third, by jointly using degree changes and the distribution of the candidates in communities, low-confidence genes are filtered out. The surviving candidates are high-confidence genes. Specifically, we consider expression profiles for carcinoma of the colon. A total of 34 preliminary bio-markers collected from the literature are evaluated and a set of 16 genes is proposed as high-confidence bio-markers, which show high performance in distinguishing normal and cancer samples. PMID:26247140

  17. Optimization of high-reliability-based hydrological design problems by robust automatic sampling of critical model realizations

    NASA Astrophysics Data System (ADS)

    Bayer, Peter; de Paly, Michael; Bürger, Claudius M.

    2010-05-01

    This study demonstrates the high efficiency of the so-called stack-ordering technique for optimizing a groundwater management problem under uncertain conditions. The uncertainty is expressed by multiple equally probable model representations, such as realizations of hydraulic conductivity. During optimization of a well-layout problem for contaminant control, a ranking mechanism is applied that extracts those realizations that appear most critical for the optimization problem. It is shown that this procedure works well for evolutionary optimization algorithms, which are to some extent robust against noisy objective functions. More precisely, differential evolution (DE) and the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) are applied. Stack ordering is comprehensively investigated for a plume management problem at a hypothetical template site based on parameter values measured at, and a geostatistical model developed for, the Lauswiesen study site near Tübingen, Germany. The straightforward procedure yields computational savings above 90% in comparison to always evaluating the full set of realizations. This is confirmed by cross testing with four additional validation cases. The results show that both evolutionary algorithms obtain highly reliable near-optimal solutions. DE appears to be the better choice for cases with significant noise caused by small stack sizes. On the other hand, there seems to be a problem-specific threshold for the evaluation stack size above which the CMA-ES achieves solutions with both better fitness and higher reliability.

  18. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    PubMed

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted to assure the good quality of waters, soils, or sediments and to achieve desirable environmental quality objectives. Evaluating the reliability of available data is therefore of significant importance when analysing their possible use in the aforementioned processes. A thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws, but at the same time various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria, and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but it can be easily tailored to other environmental compartments (soil, air, sediments). PMID:26298253

  19. Reliable Alignment in Total Knee Arthroplasty by the Use of an iPod-Based Navigation System

    PubMed Central

    Koenen, Paola; Schneider, Marco M.; Fröhlich, Matthias; Driessen, Arne; Bouillon, Bertil; Bäthis, Holger

    2016-01-01

    Axial alignment is one of the main objectives in total knee arthroplasty (TKA). Computer-assisted surgery (CAS) is more accurate regarding limb alignment reconstruction compared to the conventional technique. The aim of this study was to analyse the precision of the innovative navigation system DASH® by Brainlab and to evaluate the reliability of intraoperatively acquired data. A retrospective analysis of 40 patients was performed, who underwent CAS TKA using the iPod-based navigation system DASH. Pre- and postoperative axial alignment were measured on standardized radiographs by two independent observers. These data were compared with the navigation data. Furthermore, interobserver reliability was measured. The duration of surgery was monitored. The mean difference between the preoperative mechanical axis by X-ray and the first intraoperatively measured limb axis by the navigation system was 2.4°. The postoperative X-rays showed a mean difference of 1.3° compared to the final navigation measurement. According to radiographic measurements, 88% of arthroplasties had a postoperative limb axis within ±3°. The mean additional time needed for navigation was 5 minutes. We demonstrated very good precision for the DASH system, comparable to established navigation devices, with only negligible additional time compared to conventional TKA. PMID:27313898

  20. Design and reliability analysis of high-speed and continuous data recording system based on disk array

    NASA Astrophysics Data System (ADS)

    Jiang, Changlong; Ma, Cheng; He, Ning; Zhang, Xugang; Wang, Chongyang; Jia, Huibo

    2002-12-01

    In many real-time fields a sustained high-speed data recording system is required. This paper proposes a high-speed, sustained data recording system based on complex-RAID 3+0. The system consists of an Array Controller Module (ACM), String Controller Modules (SCMs) and a Main Controller Module (MCM). The ACM, implemented in an FPGA chip, splits the high-speed incoming data stream into several lower-speed streams and synchronously generates one parity code stream. It can also recover the original data stream during reads. The SCMs record the lower-speed streams from the ACM onto SCSI disk drives. In the SCM, dual-page buffering is adopted to implement the speed-matching function and satisfy the need for sustained recording. The MCM monitors the whole system and controls the ACM and SCMs to realize the data striping, reconstruction, and recovery functions. A method for determining the system scale is presented. Finally, two new schemes, Floating Parity Group (FPG) and full 2D-Parity Group (full 2D-PG), are proposed to improve system reliability and are compared with the Traditional Parity Group (TPG). This recording system can be used conveniently in many areas of data recording, storage, playback and remote backup thanks to its high reliability.
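
    In a RAID 3-style stripe the parity stream is the bytewise XOR of the split data streams, so any single lost stream can be rebuilt from the survivors. A minimal sketch of the split/parity/recover cycle performed by the ACM, with invented data and stream count:

```python
# RAID 3-style striping sketch: split a stream into k sub-streams,
# generate one XOR parity stream, and reconstruct a lost sub-stream.
def split(data: bytes, k: int):
    return [data[i::k] for i in range(k)]

def parity(streams):
    n = max(len(s) for s in streams)
    out = bytearray(n)
    for s in streams:
        for i, b in enumerate(s):
            out[i] ^= b
    return bytes(out)

data = b"high-speed sustained recording payload"
streams = split(data, 4)
p = parity(streams)

lost = streams[2]                                     # pretend disk 2 failed
rebuilt = parity(streams[:2] + streams[3:] + [p])[:len(lost)]
assert rebuilt == lost
print("stream 2 recovered:", rebuilt)
```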

  1. Reliability of an expert rating procedure for retrospective assessment of occupational exposures in community-based case-control studies.

    PubMed

    Siemiatycki, J; Fritschi, L; Nadon, L; Gérin, M

    1997-03-01

    The most daunting problem in community-based studies of occupational cancer is retrospective exposure assessment. To avoid the error involved in using job title as the exposure variable or relying on self-reported exposure, our team developed an approach based on expert judgment applied to job descriptions obtained by interviewers. A population-based case-control study of cancer and occupation was carried out in Montreal between 1979 and 1986, and over 4,000 job histories were assessed by our team of experts. The job histories of these subjects were evaluated, by consensus, by a team of chemist/hygienists for evidence of exposure to a list of 294 workplace chemicals. In order to evaluate the reliability of this exposure assessment procedure, four years after the rating was completed, we selected 50 job histories at random and had two members of the expert team carry out the same type of coding, blind to the original ratings for these jobs. For 25 job histories, comprising 94 distinct jobs, the pair worked as a consensus panel; for the other 25, comprising 92 distinct jobs, they worked independently. Statistical comparisons were made between the new ratings and the old. Among those rated by consensus, the marginal distribution of exposure prevalence was almost identical between old and new. The weighted kappa for agreement was 0.80. Among items for which both ratings agreed that there had been exposure, there was good agreement on the frequency, concentration, and route of contact. When the two raters worked independently, the levels of agreement between them, and between each of them and the original rating, were good (kappas around 0.70), though not as high as when they worked together. It is concluded that high levels of reliability are attainable for retrospective exposure assessment by experts. PMID:9055950

  2. Establishing reliable miRNA-cancer association network based on text-mining method.

    PubMed

    Li, Lun; Hu, Xingchi; Yang, Zhaowan; Jia, Zhenyu; Fang, Ming; Zhang, Libin; Zhou, Yanhong

    2014-01-01

    Associating microRNAs (miRNAs) with cancers is an important step in understanding the mechanisms of cancer pathogenesis and finding novel biomarkers for cancer therapies. In this study, we constructed a miRNA-cancer association network (miCancerna) based on more than 1,000 miRNA-cancer associations detected from millions of abstracts with a text-mining method, covering 226 miRNA families and 20 common cancers. We further prioritized cancer-related miRNAs at the network level with the random-walk algorithm, achieving relatively higher performance than previous miRNA-disease networks. Finally, we examined the top 5 candidate miRNAs for each kind of cancer and found that 71% of them are experimentally confirmed. miCancerna would be an alternative resource for cancer-related miRNA identification. PMID:24895499
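
    Network-level prioritization of this kind is commonly implemented as a random walk with restart: iterate p ← (1 − α)·W·p + α·p₀, where W is the column-normalized adjacency matrix and p₀ seeds the cancer of interest. A minimal sketch on an invented toy graph (the paper's exact walk variant is not specified here):

```python
import numpy as np

def random_walk_with_restart(A, seeds, alpha=0.3, tol=1e-9):
    """Score nodes by a random walk with restart on adjacency matrix A,
    returning to the seed distribution with probability alpha."""
    W = A / np.maximum(A.sum(axis=0, keepdims=True), 1e-12)  # column-normalize
    p0 = np.asarray(seeds, dtype=float)
    p0 = p0 / p0.sum()
    p = p0.copy()
    while True:
        p_next = (1.0 - alpha) * W @ p + alpha * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy graph: nodes 0-2 are miRNA families, nodes 3-4 are cancers (invented)
A = np.array([[0, 1, 0, 1, 0],
              [1, 0, 1, 1, 1],
              [0, 1, 0, 0, 1],
              [1, 1, 0, 0, 0],
              [0, 1, 1, 0, 0]], dtype=float)
seeds = np.array([0, 0, 0, 1.0, 0])      # restart at cancer node 3
scores = random_walk_with_restart(A, seeds)
print(scores[:3])                        # miRNA scores rank the candidates
```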

  3. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum parameter values. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
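
    The same idea is easy to prototype with scipy: treat the Kolmogorov-Smirnov statistic as an objective function of the Weibull parameters and minimize it with Powell's method, as the text describes. A sketch for the two-parameter case on synthetic failure data (the paper estimates all three parameters, including location):

```python
import numpy as np
from scipy import optimize, stats

# Synthetic failure data from a known Weibull population
failures = stats.weibull_min.rvs(c=2.5, scale=120.0, size=50,
                                 random_state=np.random.default_rng(42))

def ks_statistic(params):
    """Kolmogorov-Smirnov distance between the EDF of the failure data
    and a candidate Weibull CDF; invalid parameters are penalized."""
    shape, scale = params
    if shape <= 0.0 or scale <= 0.0:
        return 1.0
    cdf = stats.weibull_min(c=shape, scale=scale).cdf
    return stats.kstest(failures, cdf).statistic

res = optimize.minimize(ks_statistic, x0=[1.0, np.median(failures)],
                        method="Powell")
print("shape, scale:", res.x, "  min KS distance:", res.fun)
```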

  4. Towards a reliable and high sensitivity O₂-independent glucose sensor based on Ir oxide nanoparticles.

    PubMed

    Campbell, H B; Elzanowska, H; Birss, V I

    2013-04-15

    The primary goal of this work is the development of a rapidly responding, sensitive, and biocompatible Ir oxide (IrOx)-based glucose sensor that regenerates solely via IrOx-mediation in both O₂-free and aerobic environments. An important discovery is that, for films composed of IrOx nanoparticles, Nafion® and glucose oxidase (GOx), a Michaelis-Menten constant (K'(m)) of 20-30 mM is obtained in the case of dual-regeneration (O₂ and IrOx), while K'(m) values are much smaller (3-5 mM) when re-oxidation of GOx occurs only through IrOx-mediation. These smaller K'(m) values indicate that the regeneration of GOx via direct electron transfer to the IrOx nanoparticles is more rapid than to O₂. Small K'(m) values, which are obtained more commonly when Nafion® is not present in the films, are also important for the accurate measurement of low glucose concentrations under hypoglycemic conditions. In this work, the sensing film was also optimized for miniaturization. Depending on the IrOx and GOx surface loadings and the use of sonication before film deposition, the i(max) values ranged from 5 to 225 μA cm⁻², showing very good sensitivity down to 0.4 mM glucose. PMID:23261690
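
    The apparent Michaelis-Menten constant quoted above comes from fitting the calibration curve i = i_max·C/(K'_m + C). A minimal sketch of that fit with scipy, on synthetic calibration points:

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(c, i_max, km):
    """Steady-state sensor current vs. glucose concentration c (mM)."""
    return i_max * c / (km + c)

# Synthetic calibration points (concentration in mM, current in uA cm^-2)
conc = np.array([0.4, 1.0, 2.0, 5.0, 10.0, 20.0, 30.0])
curr = michaelis_menten(conc, 225.0, 4.0) \
       + np.random.default_rng(0).normal(0.0, 2.0, conc.size)

(i_max, km), _ = curve_fit(michaelis_menten, conc, curr, p0=[200.0, 5.0])
print(f"i_max = {i_max:.1f} uA cm^-2, K'm = {km:.2f} mM")
```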

  5. Improved Membrane-Based Sensor Network for Reliable Gas Monitoring in the Subsurface

    PubMed Central

    Lazik, Detlef; Ebert, Sebastian

    2012-01-01

    A conceptually improved sensor network to monitor the partial pressure of CO2 in different soil horizons was designed. Consisting of five membrane-based linear sensors (line-sensors), each 10 m long, the set-up integrates over locally fluctuating CO2 concentrations (typically below 5%vol) up to the meter scale, yielding valuable concentration means with a repetition time of about 1 min. Preparatory laboratory tests showed an unexpectedly large gain in accuracy, to better than 0.03%vol compared with the previously published 0.08%vol. The statistical uncertainties (standard deviations) of the line-sensors and the reference sensor (nondispersive infrared CO2-sensor) were thereby close to each other. Whereas the uncertainty of the reference increases with the measured value, the line-sensors show an inverse uncertainty trend, resulting in comparatively enhanced accuracy for concentrations >1%vol. Furthermore, a method for in situ maintenance was developed, enabling proof of sensor quality and effective calibration without demounting the line-sensors from the soil, which would disturb the established structures and ongoing processes. PMID:23235447

  6. Improving the reliability of road materials based on micronized sulfur composites

    NASA Astrophysics Data System (ADS)

    Abdrakhmanova, K. K.

    2015-01-01

    The work presents the results of a nano-structural modification of sulfur that prevents polymorphic transformations from degrading the properties of sulfur composites: sulfur is kept in a thermodynamically stable state that precludes destruction in service. It has been established that the properties of sulfur-based composite materials can be significantly improved by modifying the sulfur and structuring the sulfur binder with nano-dispersed fiber particles and an ultra-dispersed filler. The paper shows the possibility of modifying Tengiz sulfur by fragmentation, which stabilizes the structured sulfur through reinforcement by ultra-dispersed fiber particles and multiplies the phase contact area. Interaction between nano-dispersed chrysotile asbestos fibers and sulfur allows the mechanical properties of the chrysotile asbestos tubes to be realized in the reinforced composite, provided that the tube surfaces are well wetted by molten sulfur and that adhesion is high between the tubes and the matrix, which, in addition to sulfur, contains limestone microparticles. Their ability to withstand severe operating conditions, including exposure to aggressive media and mechanical loads, makes the produced sulfur composites attractive to the road construction industry.

  7. Observational Measures of Implementer Fidelity for a School-based Preventive Intervention: Development, Reliability and Validity

    PubMed Central

    Cross, Wendi; West, Jennifer; Wyman, Peter A.; Schmeelk-Cone, Karen; Xia, Yinglin; Tu, Xin; Teisl, Michael; Brown, C. Hendricks; Forgatch, Marion

    2014-01-01

    Current measures of implementer fidelity often fail to adequately measure core constructs of adherence and competence, and their relationship to outcomes can be mixed. To address these limitations, we used observational methods to assess these constructs and their relationships to proximal outcomes in a randomized trial of a school-based preventive intervention (Rochester Resilience Project) designed to strengthen emotion self-regulation skills in 1st–3rd graders with elevated aggressive-disruptive behaviors. Within the intervention group (n = 203), a subsample (n = 76) of students was selected to reflect the overall sample. Implementers were 10 paraprofessionals. Videotaped observations of three lessons from Year 1 of the intervention (14 lessons) were coded for each implementer-child dyad on Adherence (content) and Competence (quality). Using multi-level modeling we examined how much of the variance in the fidelity measures was attributed to implementer and to the child within implementer. Both measures had large and significant variance accounted for by implementer (Competence, 68%; Adherence, 41%); child within implementer did not account for significant variance indicating that ratings reflected stable qualities of the implementer rather than the child. Raw Adherence and Competence scores shared 46% of variance (r = .68). Controlling for baseline differences and age, the amount (Adherence) and quality (Competence) of program delivered predicted children’s enhanced response to the intervention on both child and parent reports after six months, but not on teacher report of externalizing behavior. Our findings support the use of multiple observations for measuring fidelity and that adherence and competence are important components of fidelity which could be assessed by many programs using these methods. PMID:24736951

  8. Observational measures of implementer fidelity for a school-based preventive intervention: development, reliability, and validity.

    PubMed

    Cross, Wendi; West, Jennifer; Wyman, Peter A; Schmeelk-Cone, Karen; Xia, Yinglin; Tu, Xin; Teisl, Michael; Brown, C Hendricks; Forgatch, Marion

    2015-01-01

    Current measures of implementer fidelity often fail to adequately measure core constructs of adherence and competence, and their relationship to outcomes can be mixed. To address these limitations, we used observational methods to assess these constructs and their relationships to proximal outcomes in a randomized trial of a school-based preventive intervention (Rochester Resilience Project) designed to strengthen emotion self-regulation skills in first-third graders with elevated aggressive-disruptive behaviors. Within the intervention group (n = 203), a subsample (n = 76) of students was selected to reflect the overall sample. Implementers were 10 paraprofessionals. Videotaped observations of three lessons from year 1 of the intervention (14 lessons) were coded for each implementer-child dyad on adherence (content) and competence (quality). Using multilevel modeling, we examined how much of the variance in the fidelity measures was attributed to implementer and to the child within implementer. Both measures had large and significant variance accounted for by implementer (competence, 68 %; adherence, 41 %); child within implementer did not account for significant variance indicating that ratings reflected stable qualities of the implementer rather than the child. Raw adherence and competence scores shared 46 % of variance (r = .68). Controlling for baseline differences and age, the amount (adherence) and quality (competence) of program delivered predicted children's enhanced response to the intervention on both child and parent reports after 6 months, but not on teacher report of externalizing behavior. Our findings support the use of multiple observations for measuring fidelity and that adherence and competence are important components of fidelity which could be assessed by many programs using these methods. PMID:24736951

  9. On the impact of Ag doping on performance and reliability of GeS2-based conductive bridge memories

    NASA Astrophysics Data System (ADS)

    Longnos, F.; Vianello, E.; Cagli, C.; Molas, G.; Souchier, E.; Blaise, P.; Carabasse, C.; Rodriguez, G.; Jousseaume, V.; De Salvo, B.; Dahmani, F.; Verrier, P.; Bretegnier, D.; Liebault, J.

    2013-06-01

    In this work, we study the impact of Ag doping on GeS2-based CBRAM devices employing Ag as the active electrode. Several devices with Ag doping varying between 10% and 24% are extensively analyzed. First, we assess switching voltages and time-to-set as a function of Ag concentration in the electrolyte layer. Subsequently, we evaluate the two most important reliability aspects of RRAM devices: endurance and data retention at different temperatures. The results show that an increase of Ag doping in the GeS2 layer yields a strong improvement in both endurance and data retention. The extrapolated temperature allowing for 10 years of data retention increases from 75 °C for the 10% Ag-doped sample to 109 °C for the 24% Ag-doped one.
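
    Retention extrapolations of this kind are commonly made with an Arrhenius model for the time to retention failure; a sketch under that assumption (the bake temperatures and failure times below are invented, not the paper's data):

        import numpy as np

        # hypothetical accelerated-retention data: bake temperature (K) vs. failure time (h)
        T = np.array([398.0, 423.0, 448.0])
        t_fail = np.array([5.0e3, 8.0e2, 1.6e2])

        k_B = 8.617e-5  # Boltzmann constant, eV/K
        slope, intercept = np.polyfit(1.0 / T, np.log(t_fail), 1)
        Ea = slope * k_B                 # activation energy, eV
        t_target = 10.0 * 365.0 * 24.0   # 10 years in hours
        T_10yr = slope / (np.log(t_target) - intercept)  # invert ln t = slope/T + intercept
        print(Ea, T_10yr - 273.15)       # max storage temperature (deg C) for 10-year retention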

  10. Reliability of neuronal information conveyed by unreliable neuristor-based leaky integrate-and-fire neurons: a model study

    PubMed Central

    Lim, Hyungkwang; Kornijcuk, Vladimir; Seok, Jun Yeong; Kim, Seong Keun; Kim, Inho; Hwang, Cheol Seong; Jeong, Doo Seok

    2015-01-01

    We conducted simulations on the neuronal behavior of neuristor-based leaky integrate-and-fire (NLIF) neurons. The phase-plane analysis on the NLIF neuron highlights its spiking dynamics – determined by two nullclines conditional on the variables on the plane. Particular emphasis was placed on the operational noise arising from the variability of the threshold switching behavior in the neuron on each switching event. As a consequence, we found that the NLIF neuron exhibits a Poisson-like noise in spiking, delimiting the reliability of the information conveyed by individual NLIF neurons. To highlight neuronal information coding at a higher level, a population of noisy NLIF neurons was analyzed in regard to probability of successful information decoding given the Poisson-like noise of each neuron. The result demonstrates highly probable success in decoding in spite of large variability – due to the variability of the threshold switching behavior – of individual neurons. PMID:25966658

  11. Copper-based micro-channel cooler reliably operated using solutions of distilled-water and ethanol as a coolant

    NASA Astrophysics Data System (ADS)

    Chin, A. K.; Nelson, A.; Chin, R. H.; Bertaska, R.; Jacob, J. H.

    2015-03-01

    Copper-based micro-channel coolers (Cu-MCC) are the lowest thermal-resistance heat-sinks for high-power laser-diode (LD) bars. Presently, the resistivity, pH and oxygen content of the de-ionized water coolant must be actively controlled to minimize cooler failure by corrosion and electro-corrosion. Additionally, the water must be constantly exposed to ultraviolet radiation to limit the growth of micro-organisms that may clog the micro-channels. In this study, we report the reliable, care-free operation of LD-bars attached to Cu-MCCs using a solution of distilled water and ethanol as the coolant. This coolant meets the storage requirements of Mil-Std 810G, e.g., exposure to a storage temperature as low as -51 °C and no growth of micro-organisms during passive storage.

  12. Effect of Inner Electrode on Reliability of (Zn,Mg)TiO3-Based Multilayer Ceramic Capacitor

    NASA Astrophysics Data System (ADS)

    Lee, Wen-His; Su, Chi-Yi; Lee, Ying Chieh; Yang, Jackey; Yang, Tong; Lin, Shih-Pin

    2006-07-01

    In this study, different proportions of silver-palladium alloy acting as the inner electrode were adopted in a (Zn,Mg)TiO3-based multilayer ceramic capacitor (MLCC) sintered at 925 °C for 2 h, to evaluate the effect of the inner electrode on reliability. The main results show that the lifetime is inversely proportional to the Ag content in the Ag/Pd inner electrode. Ag⁺ diffusion into the (Zn,Mg)TiO3-based MLCC during cofiring at 925 °C for 2 h and Ag⁺ migration at 140 °C under 200 V are both responsible for the short lifetime of the (Zn,Mg)TiO3-based MLCC, particularly the latter factor. A (Zn,Mg)TiO3-based MLCC with high Ag content in the inner electrode (Ag/Pd = 99/01) exhibits the shortest lifetime (13 h), and the effect of Ag⁺ migration is markedly enhanced when the activation energy of the (Zn,Mg)TiO3 dielectric is greatly lowered due to the excessive formation of oxygen vacancies and the semiconducting Zn2TiO4 phase when Ag⁺ substitutes for Zn²⁺ during co-firing.

  13. Evolution of conductive filament and its impact on reliability issues in oxide-electrolyte based resistive random access memory

    PubMed Central

    Lv, Hangbing; Xu, Xiaoxin; Liu, Hongtao; Liu, Ruoyu; Liu, Qi; Banerjee, Writam; Sun, Haitao; Long, Shibing; Li, Ling; Liu, Ming

    2015-01-01

    The electrochemical metallization (ECM) cell, also referred to as conductive bridge random access memory, is considered a promising candidate for, or complementary component to, traditional charge-based memory. As such, it is receiving additional focus to accelerate the commercialization process. To create a successful mass product, reliability issues must first be rigorously solved, and an in-depth understanding of the failure behavior of the ECM cell is essential for performance optimization. Here, we reveal that degradation of the high-resistance state accounts for the majority of endurance failures in the HfO2 electrolyte-based ECM cell. High-resolution transmission electron microscopy was used to characterize the change in filament nature after repetitive switching cycles. The results showed that Cu accumulation inside the filament plays a dominant role in switching failure, which was further supported by measuring the retention of the cycle-dependent high-resistance and low-resistance states. The clarified physical picture of filament evolution provides a basic understanding of the mechanisms of endurance and retention failure, and of the relationship between them. Based on these results, applicable approaches for performance optimization can be developed, ranging from material tailoring to structure engineering and algorithm design. PMID:25586207

  14. Evolution of conductive filament and its impact on reliability issues in oxide-electrolyte based resistive random access memory.

    PubMed

    Lv, Hangbing; Xu, Xiaoxin; Liu, Hongtao; Liu, Ruoyu; Liu, Qi; Banerjee, Writam; Sun, Haitao; Long, Shibing; Li, Ling; Liu, Ming

    2015-01-01

    The electrochemical metallization (ECM) cell, also referred to as conductive bridge random access memory, is considered a promising candidate for, or complementary component to, traditional charge-based memory. As such, it is receiving additional focus to accelerate the commercialization process. To create a successful mass product, reliability issues must first be rigorously solved, and an in-depth understanding of the failure behavior of the ECM cell is essential for performance optimization. Here, we reveal that degradation of the high-resistance state accounts for the majority of endurance failures in the HfO2 electrolyte-based ECM cell. High-resolution transmission electron microscopy was used to characterize the change in filament nature after repetitive switching cycles. The results showed that Cu accumulation inside the filament plays a dominant role in switching failure, which was further supported by measuring the retention of the cycle-dependent high-resistance and low-resistance states. The clarified physical picture of filament evolution provides a basic understanding of the mechanisms of endurance and retention failure, and of the relationship between them. Based on these results, applicable approaches for performance optimization can be developed, ranging from material tailoring to structure engineering and algorithm design. PMID:25586207

  15. A stochastic simulation-optimization approach for estimating highly reliable soil tension threshold values in sensor-based deficit irrigation

    NASA Astrophysics Data System (ADS)

    Kloss, S.; Schütze, N.; Walser, S.; Grundmann, J.

    2012-04-01

    In arid and semi-arid regions where water is scarce, farmers rely heavily on irrigation to grow crops and produce agricultural commodities. The variable and often severely limited water supply poses a serious challenge and demands sophisticated irrigation strategies that allow efficient management of the available water resources. The general aim is to increase water productivity (WP), and one strategy to achieve this goal is controlled deficit irrigation (CDI). One way to realize CDI is to define soil-water-status-specific threshold values (either in soil tension or moisture) at which irrigation cycles are triggered. When utilizing CDI, irrigation control is of utmost importance, yet thresholds are often chosen by trial and error and are thus unreliable. Hence, for CDI to be effective, systematic investigations are needed to derive reliable threshold values that account for different CDI strategies. In this contribution, a method is presented that uses a simulation-based stochastic approach to estimate threshold values with high reliability. The approach consists of a weather generator offering statistical significance to site-specific climate series, an optimization algorithm that determines optimal threshold values under limited water supply, and a crop model for simulating plant growth and water consumption. The study focuses on soil tension threshold values for different CDI strategies. The advantage of soil-tension-based threshold values over soil-moisture-based ones lies in their universal, soil-type-independent applicability. The investigated CDI strategies comprised schedules of constant threshold values, crop-development-stage-dependent threshold values, and different minimum irrigation intervals. For practical reasons, fixed irrigation schedules were tested as well. Additionally, a full irrigation schedule served as reference. The obtained threshold values were then tested in field

  16. Reliability physics

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.; Ross, R. G., Jr.

    1984-01-01

    Speakers whose topics relate to the reliability physics of solar arrays are listed and their topics briefly reviewed. Nine reports are reviewed ranging in subjects from studies of photothermal degradation in encapsulants and polymerizable ultraviolet stabilizers to interface bonding stability to electrochemical degradation of photovoltaic modules.

  17. Feasibility of AmbulanCe-Based Telemedicine (FACT) Study: Safety, Feasibility and Reliability of Third Generation In-Ambulance Telemedicine

    PubMed Central

    Yperzeele, Laetitia; Van Hooff, Robbert-Jan; De Smedt, Ann; Valenzuela Espinoza, Alexis; Van Dyck, Rita; Van de Casseye, Rohny; Convents, Andre; Hubloue, Ives; Lauwaert, Door; De Keyser, Jacques; Brouns, Raf

    2014-01-01

    Background Telemedicine is currently mainly applied as an in-hospital service, but this technology also holds potential to improve emergency care in the prehospital arena. We report on the safety, feasibility and reliability of in-ambulance teleconsultation using a telemedicine system of the third generation. Methods A routine ambulance was equipped with a system for real-time bidirectional audio-video communication, automated transmission of vital parameters, glycemia and electronic patient identification. All patients (≥18 years) transported during emergency missions by a Prehospital Intervention Team of the Universitair Ziekenhuis Brussel were eligible for inclusion. To guarantee mobility and to facilitate 24/7 availability, the teleconsultants used lightweight laptop computers to access a dedicated telemedicine platform, which also provided functionalities for neurological assessment, electronic reporting and prehospital notification of the in-hospital team. Key registrations included any safety issue, mobile connectivity, communication of patient information, audiovisual quality, user-friendliness and accuracy of the prehospital diagnosis. Results Prehospital teleconsultation was obtained in 41 out of 43 cases (95.3%). The success rates for communication of blood pressure, heart rate, blood oxygen saturation, glycemia, and electronic patient identification were 78.7%, 84.8%, 80.6%, 64.0%, and 84.2%. A preliminary prehospital diagnosis was formulated in 90.2%, with satisfactory agreement with final in-hospital diagnoses. Communication of a prehospital report to the in-hospital team was successful in 94.7% and prenotification of the in-hospital team via SMS in 90.2%. Failures resulted mainly from limited mobile connectivity and to a lesser extent from software, hardware or human error. The user acceptance was high. Conclusions Ambulance-based telemedicine of the third generation is safe, feasible and reliable but further research and development, especially

  18. Development of a novel location-based assessment of sensory symptoms in cancer patients: preliminary reliability and validity assessment.

    PubMed

    Burkey, Adam R; Kanetsky, Peter A

    2009-05-01

    We report on the development of a novel location-based assessment of sensory symptoms in cancer (L-BASIC) instrument, and its initial estimates of reliability and validity. L-BASIC is structured so that patients provide a numeric score and an adjectival description for any sensory symptom, including both pain and neuropathic sensations, present in each of the 10 predefined body areas. Ninety-seven patients completed the baseline questionnaire; 39 completed the questionnaire on two occasions. A mean of 3.5 body parts was scored per patient. On average, 2.7 (of 11) descriptor categories were used per body part. There was good internal consistency (Cronbach's alpha=0.74) for a four-item scale that combined location-specific metrics. Temporal stability was adequate (kappa>0.50 and r>0.60 for categorical and continuous variables, respectively) among patients without observed or reported subjective change in clinical status between L-BASIC administrations. We compared our four-item scale against scores obtained from validated pain and quality-of-life (QOL) scales, and as expected, correlations were higher for pain-related items than for QOL-related items. We detected differences in L-BASIC responses among patients with cancer-related head or neck pain, chemotherapy-related neuropathy and breast cancer-related lymphedema. We conclude that L-BASIC provides internally consistent and temporally stable responses, while acknowledging that further refinement and testing of this novel instrument are necessary. We anticipate that future versions of L-BASIC will provide reliable and valid syndrome-specific measurement of defined clinical pain and symptom constructs in the cancer population, which may be of particular value in assessing treatment response in patients with such multiple complaints. PMID:19059751

  19. Tutorial on use of intraclass correlation coefficients for assessing intertest reliability and its application in functional near-infrared spectroscopy-based brain imaging

    NASA Astrophysics Data System (ADS)

    Li, Lin; Zeng, Li; Lin, Zi-Jing; Cazzell, Mary; Liu, Hanli

    2015-05-01

    Test-retest reliability of neuroimaging measurements is an important concern in the investigation of cognitive functions in the human brain. To date, intraclass correlation coefficients (ICCs), originally used in inter-rater reliability studies in behavioral sciences, have become commonly used metrics in reliability studies on neuroimaging and functional near-infrared spectroscopy (fNIRS). However, as there are six popular forms of ICC, a thorough understanding of their differences is needed to appropriately select, use, and interpret ICCs in a reliability study. We first offer a brief review and tutorial on the statistical rationale of ICCs, including their underlying analysis of variance models and technical definitions, in the context of assessing intertest reliability. Second, we provide general guidelines on the selection and interpretation of ICCs. Third, we illustrate the proposed approach with an actual research study assessing the intertest reliability of fNIRS-based, volumetric diffuse optical tomography of brain activities stimulated by a risk decision-making protocol. Last, special issues that may arise in reliability assessment using ICCs are discussed and solutions are suggested.
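
    As a concrete example of one of the six forms, ICC(2,1) (two-way random effects, absolute agreement, single measurement) can be computed from the ANOVA mean squares of a subjects-by-sessions matrix; a minimal sketch using the standard Shrout-Fleiss formula (the data below are illustrative):

        import numpy as np

        def icc_2_1(Y):
            # Y: n subjects (rows) x k sessions/raters (columns)
            n, k = Y.shape
            grand = Y.mean()
            msr = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # between subjects
            msc = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # between sessions
            sse = ((Y - Y.mean(axis=1, keepdims=True)
                      - Y.mean(axis=0, keepdims=True) + grand) ** 2).sum()
            mse = sse / ((n - 1) * (k - 1))                            # residual
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        rng = np.random.default_rng(1)
        subject = rng.normal(0.0, 1.0, size=(10, 1))       # stable subject effect
        Y = subject + rng.normal(0.0, 0.5, size=(10, 2))   # two test sessions
        print(icc_2_1(Y))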

  20. Validity, Reliability, and Sensitivity of a 3D Vision Sensor-based Upper Extremity Reachable Workspace Evaluation in Neuromuscular Diseases

    PubMed Central

    Han, Jay J.; Kurillo, Gregorij; Abresch, R. Ted; Nicorici, Alina; Bajcsy, Ruzena

    2013-01-01

    Introduction: One of the major challenges in the neuromuscular field has been the lack of upper extremity outcome measures that are useful for clinical therapeutic efficacy studies. Using a vision-based sensor system and customized software, 3-dimensional (3D) upper extremity motion analysis can reconstruct a reachable workspace as a valid, reliable and sensitive outcome measure in various neuromuscular conditions in which proximal upper extremity range of motion and function are impaired. Methods: Using a stereo-camera sensor system, the 3D reachable workspace envelope surface area, normalized to an individual's arm length (relative surface area: RSA) to allow comparison between subjects, was determined for 20 healthy controls and 9 individuals with varying degrees of upper extremity dysfunction due to neuromuscular conditions. All study subjects were classified based on the Brooke upper extremity function scale. Right and left upper extremity reachable workspaces were determined from three repeated measures. The RSAs for each frontal hemi-sphere quadrant and the total reachable workspace were determined with and without a loading condition (500 gram wrist weight). Data were analyzed to assess the developed system and the validity, reliability, and sensitivity to change of the reachable workspace outcome. Results: The mean total RSAs of the reachable workspace for the healthy controls and individuals with NMDs were significantly different (0.586 ± 0.085 and 0.299 ± 0.198, respectively; p<0.001). All quadrant RSAs were reduced for individuals with NMDs compared to the healthy controls, and these reductions correlated with reduced upper limb function as measured by Brooke grade. The upper quadrants of the reachable workspace (above shoulder level) demonstrated the greatest reductions in RSA among subjects with progressively severe upper extremity impairment. Evaluation of the developed outcomes system with the Bland-Altman method demonstrated narrow 95% limits of agreement (LOA

  1. A GIS-based assessment of the suitability of SCIAMACHY satellite sensor measurements for estimating reliable CO concentrations in a low-latitude climate.

    PubMed

    Fagbeja, Mofoluso A; Hill, Jennifer L; Chatterton, Tim J; Longhurst, James W S

    2015-02-01

    An assessment of the reliability of Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY) satellite sensor measurements for interpolating tropospheric concentrations of carbon monoxide, considering the low-latitude climate of the Niger Delta region in Nigeria, was conducted. Monthly SCIAMACHY carbon monoxide (CO) column measurements from January 2003 to December 2005 were interpolated using the ordinary kriging technique. The spatio-temporal variations observed in the reliability were based on proximity to the Atlantic Ocean, seasonal variations in the intensities of rainfall and relative humidity, the presence of dust particles from the Sahara desert, industrialization in Southwest Nigeria, and biomass burning during the dry season in Northern Nigeria. Spatial reliabilities of 74 and 42 % are observed for the inland and coastal areas, respectively. Temporally, average reliabilities of 61 and 55 % occur during the dry and wet seasons, respectively. Reliability in the inland and coastal areas was 72 and 38 % during the wet season, and 75 and 46 % during the dry season, respectively. Based on these results, the WFM-DOAS SCIAMACHY CO data product used for this study is relevant to the assessment of CO concentrations in developing countries within the low latitudes that cannot afford monitoring infrastructure because of its high cost. Although the SCIAMACHY sensor is no longer available, it provided cost-effective, reliable and accessible data that could support air quality assessment in developing countries. PMID:25626562
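
    A sketch of the interpolation step, using the pykrige package as a stand-in for the authors' GIS workflow (coordinates and CO column values are invented, and the spherical variogram model is an assumption):

        import numpy as np
        from pykrige.ok import OrdinaryKriging

        rng = np.random.default_rng(0)
        lon = rng.uniform(5.0, 9.0, 60)              # scattered retrieval locations
        lat = rng.uniform(4.0, 7.0, 60)
        co = 2.0e18 + 1.0e17 * rng.normal(size=60)   # CO columns, molecules/cm^2

        ok = OrdinaryKriging(lon, lat, co, variogram_model="spherical")
        grid_lon = np.arange(5.0, 9.0, 0.25)
        grid_lat = np.arange(4.0, 7.0, 0.25)
        z, ss = ok.execute("grid", grid_lon, grid_lat)
        # z: interpolated CO field; ss: kriging variance, usable as a reliability map
        print(z.shape, ss.mean())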

  2. Investigating univariate temporal patterns for intrinsic connectivity networks based on complexity and low-frequency oscillation: a test-retest reliability study.

    PubMed

    Wang, X; Jiao, Y; Tang, T; Wang, H; Lu, Z

    2013-12-19

    Intrinsic connectivity networks (ICNs) are composed of spatial components and time courses. The spatial components of ICNs have been found to show moderate-to-high reliability. As far as we know, few studies have focused on the reliability of the temporal patterns of ICNs based on their individual time courses. The goals of this study were twofold: to investigate the test-retest reliability of temporal patterns for ICNs, and to analyze these informative univariate metrics. Additionally, a correlation analysis was performed to enhance interpretability. Our study included three datasets: (a) short- and long-term scans, (b) multi-band echo-planar imaging (mEPI), and (c) eyes open or closed. Using dual regression, we obtained the time courses of ICNs for each subject. To produce temporal patterns for ICNs, we applied two categories of univariate metrics: network-wise complexity and network-wise low-frequency oscillation. Furthermore, we validated the test-retest reliability of each metric. The network-wise temporal patterns for most ICNs (especially the default mode network, DMN) exhibited moderate-to-high reliability and reproducibility under different scan conditions. Network-wise complexity for the DMN exhibited fair reliability (ICC<0.5) based on eyes-closed sessions. In particular, our results support mEPI as a useful method with high reliability and reproducibility. In addition, these temporal patterns carried physiological meaning, and certain temporal patterns were correlated with the node strength of the corresponding ICN. Overall, network-wise temporal patterns of ICNs were reliable and informative and could be complementary to spatial patterns of ICNs in further studies. PMID:24042040

  3. An integrative C. elegans protein-protein interaction network with reliability assessment based on a probabilistic graphical model.

    PubMed

    Huang, Xiao-Tai; Zhu, Yuan; Chan, Leanne Lai Hang; Zhao, Zhongying; Yan, Hong

    2016-01-01

    In Caenorhabditis elegans, a large number of protein-protein interactions (PPIs) have been identified by different experiments. However, a comprehensive weighted PPI network, which is essential for signaling pathway inference, is not yet available in this model organism. Therefore, we first construct an integrative PPI network in C. elegans with 12,951 interactions involving 5039 proteins from seven molecular interaction databases. Then, a reliability score based on a probabilistic graphical model (RSPGM) is proposed to assess PPIs. It assumes that the random number of interactions between two proteins comes from the Bernoulli distribution to avoid multi-links. The main parameter of the RSPGM score contains a few latent variables which can be considered as common properties of the two proteins. Validations on high-confidence yeast datasets show that RSPGM provides more accurate evaluation than other approaches, and the PPIs in the reconstructed PPI network have higher biological relevance than those in the original network in terms of gene ontology, gene expression, essentiality and the prediction of known protein complexes. Furthermore, this weighted integrative PPI network in C. elegans is employed to infer the interaction path of the canonical Wnt/β-catenin pathway as well. Most genes on the inferred interaction path have been validated to be Wnt pathway components. Therefore, RSPGM is essential and effective for evaluating PPIs and inferring interaction paths. Finally, the PPI network with RSPGM scores can be queried and visualized on a user-interactive website, which is freely available at . PMID:26555698

  4. Reliability analysis of charge plasma based double material gate oxide (DMGO) SiGe-on-insulator (SGOI) MOSFET

    NASA Astrophysics Data System (ADS)

    Pradhan, K. P.; Sahu, P. K.; Singh, D.; Artola, L.; Mohapatra, S. K.

    2015-09-01

    A novel device named charge plasma based dopingless double material gate oxide (DMGO) silicon-germanium-on-insulator (SGOI) double gate (DG) MOSFET is proposed for the first time. The fundamental objective of this work is to modify the channel potential, electric field and electron velocity in order to improve the leakage current, transconductance (gm) and transconductance generation factor (TGF). Using 2-D simulation, we show that the DMGO-SGOI MOSFET exhibits higher electron velocity at the source side and lower electric field at the drain side as compared to the ultra-thin body (UTB) DG MOSFET. On the other hand, the DMGO-SGOI MOSFET demonstrates a significant improvement in gm and TGF in comparison to the UTB-DG MOSFET. This work also establishes the existence of a biasing point, the zero temperature coefficient (ZTC) bias point, at which the device parameters become independent of temperature. The impact of operating temperature (T) on the aforementioned performance metrics is also analyzed extensively. This further validates the reliability of the charge plasma DMGO SGOI MOSFET and its application opportunities in analog/RF circuit design over a wide temperature range.

  5. A Microstructure-Based Time-Dependent Crack Growth Model for Life and Reliability Prediction of Turbopropulsion Systems

    NASA Astrophysics Data System (ADS)

    Chan, Kwai S.; Enright, Michael P.; Moody, Jonathan; Fitch, Simeon H. K.

    2014-01-01

    The objective of this investigation was to develop an innovative methodology for life and reliability prediction of hot-section components in advanced turbopropulsion systems. A set of generic microstructure-based time-dependent crack growth (TDCG) models was developed and used to assess the sources of material variability due to microstructure and material parameters such as grain size, activation energy, and the crack growth threshold for TDCG. A comparison of model predictions and experimental data obtained in air and in vacuum suggests that oxidation is responsible for higher crack growth rates at high temperatures, low frequencies, and long dwell times, but oxidation can also induce higher crack growth thresholds (ΔKth or Kth) under certain conditions. Using the enhanced risk analysis tool and material constants calibrated to IN 718 data, the effect of TDCG on the risk of fracture in turboengine components was demonstrated for a generic rotor design and a realistic mission profile using the DARWIN® probabilistic life-prediction code. The results of this investigation confirmed that TDCG and cycle-dependent crack growth in IN 718 can be treated by a simple summation of the crack increments over a mission. For the temperatures considered, TDCG in IN 718 can be considered a K-controlled or diffusion-controlled oxidation-induced degradation process. This methodology provides a pathway for evaluating microstructural effects on multiple damage modes in hot-section components.
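
    The "simple summation of crack increments over a mission" can be sketched as below, combining a Paris-type cyclic term with a K-controlled time-dependent term; all constants, the geometry factor, and the mission segments are placeholders, not the paper's calibrated IN 718 values:

        import numpy as np

        def mission_crack_growth(a0, segments, C, m, A, n, Y=1.0):
            # segments: list of (stress amplitude MPa, cycles, dwell time h)
            a = a0
            for stress, cycles, dwell in segments:
                K = Y * stress * np.sqrt(np.pi * a)  # stress intensity factor
                a += cycles * C * K ** m             # cycle-dependent increment
                a += dwell * A * K ** n              # time-dependent (TDCG) increment
            return a

        mission = [(600.0, 1, 0.5), (450.0, 10, 2.0)]  # illustrative profile
        a_end = mission_crack_growth(a0=0.5e-3, segments=mission,
                                     C=1e-11, m=3.2, A=1e-9, n=2.5)
        print(a_end)  # crack length after one mission; iterate over missions for life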

  6. Voxel-based morphometry (VBM) studies in schizophrenia—can white matter changes be reliably detected with VBM?

    PubMed Central

    Melonakos, Eric; Shenton, Martha; Rathi, Yogesh; Terry, Doug; Bouix, Sylvain; Kubicki, Marek

    2012-01-01

    Voxel-Based Morphometry (VBM) is a hypothesis-free, whole-brain, voxel-by-voxel analytic method that attempts to compare imaging data between populations. Schizophrenia studies have utilized this method to localize differences in Diffusion Tensor Imaging (DTI) derived Fractional Anisotropy (FA), a measure of white matter integrity, between patients and healthy controls. The number of publications has grown, although it is unclear how reliable and reproducible this method is, given the subtle white matter abnormalities expected in schizophrenia. Here we analyze and combine results from 23 studies published to date that use VBM to study schizophrenia in order to evaluate the reproducibility of this method in DTI analysis. Coordinates of each region reported in DTI VBM studies published thus far in schizophrenia were plotted onto a Montreal Neurological Institute atlas, and their anatomical locations were recorded. Results indicated that the reductions of FA in patients with schizophrenia were scattered across the brain. Moreover, even the most consistently reported regions were reported independently in less than 35% of the papers studied. Other instances of reduced FA were replicated at an even lower rate. Our findings demonstrate striking inconsistency, with none of the regions reported in much more than a third of the published papers. Poor replication rate suggests that the application of VBM to DTI data may not be the optimal way for studying the subtle microstructural abnormalities that are being hypothesized in schizophrenia. PMID:21684124

  7. Recent progress in high performance and reliable n-type transition metal oxide-based thin film transistors

    NASA Astrophysics Data System (ADS)

    Kwon, Jang Yeon; Jeong, Jae Kyeong

    2015-02-01

    This review gives an overview of the recent progress in vacuum-based n-type transition metal oxide (TMO) thin film transistors (TFTs). Several excellent review papers regarding metal oxide TFTs in terms of fundamental electronic structure, device processing and reliability have been published. In particular, the required field-effect mobility of TMO TFTs has been increasing rapidly to meet the demands of ultra-high resolution, large panel size and three-dimensional visual effects, the megatrends of flat panel displays such as liquid crystal displays, organic light-emitting diodes and flexible displays. In this regard, the effects of TMO composition on the performance of the resulting oxide TFTs are reviewed and classified into binary, ternary and quaternary composition systems. In addition, new strategic approaches, including zinc oxynitride materials, double channel structures and composite structures, have been proposed recently and were not covered in detail in previous review papers. Special attention is given to the advanced device architectures of TMO TFTs, such as the back-channel-etch and self-aligned coplanar structures, which are key technologies because of their advantages, including low-cost fabrication, high driving speed and high-quality imaging free of unwanted visual artifacts. The integration processes and related issues, such as etching, post treatment, low ohmic contact and Cu interconnection, required to realize these advanced architectures are also discussed.

  8. Reliability study of Au-In solid-liquid interdiffusion bonding for GaN-based vertical LED packaging

    NASA Astrophysics Data System (ADS)

    Sung, Ho-Kun; Wang, Cong; Kim, Nam-Young

    2015-12-01

    An In-rich Au-In bonding system has been developed to transfer vertical light-emitting diodes (VLEDs) from a sapphire to a graphite substrate and enable them to survive n-ohmic contact treatment at 350 °C. The bonding temperature is 210 °C, and three intermetallic compounds are detected: AuIn, AuIn2, and the γ phase. As a result, the remelting temperature increases beyond the theoretical value of 450 °C given by the Au-In binary phase diagram. Reliability testing showed that joints obtained by rapid thermal annealing at 400 °C for 1 min survived, whereas those annealed at 500 °C for 1 min failed. Finally, a GaN-based blue VLED was transferred to the graphite substrate by means of the proposed bonding method, and its average light output power was measured to be 386.6 mW (@350 mA) after n-ohmic contact treatment. This wafer-level bonding technique also shows excellent potential for high-temperature packaging applications.

  9. Spatiotemporal variation of long-term drought propensity through reliability-resilience-vulnerability based Drought Management Index

    NASA Astrophysics Data System (ADS)

    Chanda, Kironmala; Maity, Rajib; Sharma, Ashish; Mehrotra, Rajeshwar

    2014-10-01

    This paper characterizes the long-term, spatiotemporal variation of drought propensity through a newly proposed index, the Drought Management Index (DMI), and explores its predictability in order to assess future drought propensity and adapt drought management policies for a location. The DMI was developed using the reliability-resilience-vulnerability (RRV) rationale commonly used in water resources systems analysis, under the assumption that depletion of soil moisture across a vertical soil column is equivalent to the operation of a water supply reservoir, and that drought should be managed not simply using a measure of system reliability, but should also take into account the readiness of the system to bounce back from drought to a normal state. Considering India as a test bed, 50 years of monthly gridded (0.5° Lat × 0.5° Lon) soil moisture data are used to compute the RRV at each grid location falling within the study domain. The Permanent Wilting Point (PWP) is used as the threshold indicative of transition into water stress. The association between resilience and vulnerability is then characterized through their joint probability distribution, ascertained using Plackett copula models for four broad soil types across India. The joint cumulative distribution functions (CDF) of resilience and vulnerability form the basis for estimating the DMI as a five-yearly time series at each grid location assessed. The status of the DMI over the past 50 years indicates that drought propensity is consistently low toward the northern and north-eastern parts of India but higher in the western part of peninsular India. Based on the observed past behavior of the DMI series on a climatological time scale, a DMI prediction model comprising deterministic and stochastic components is developed. The predictability of the DMI for a lead time of 5 years is found to vary across India, with a Pearson correlation coefficient between observed and predicted DMI above 0.6 over most of the study area
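
    A sketch of the RRV computation underlying the DMI, treating soil moisture below the Permanent Wilting Point as the failure state. The Hashimoto-style definitions below are the standard water-resources forms; the copula-based combination of resilience and vulnerability into the DMI itself is not reproduced here:

        import numpy as np

        def rrv(soil_moisture, pwp):
            ok = soil_moisture >= pwp                  # satisfactory states
            fail = ~ok
            reliability = ok.mean()                    # fraction of time satisfactory
            recoveries = np.sum(fail[:-1] & ok[1:])    # failure -> recovery transitions
            resilience = recoveries / max(fail.sum(), 1)
            deficits = pwp - soil_moisture[fail]       # severity while in failure
            vulnerability = deficits.mean() if fail.any() else 0.0
            return reliability, resilience, vulnerability

        rng = np.random.default_rng(0)
        sm = 0.25 + 0.08 * rng.normal(size=600)        # toy monthly soil moisture series
        print(rrv(sm, pwp=0.15))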

  10. Distance-dependent Schwarz-based integral estimates for two-electron integrals: Reliable tightness vs. rigorous upper bounds

    NASA Astrophysics Data System (ADS)

    Maurer, Simon A.; Lambrecht, Daniel S.; Flaig, Denis; Ochsenfeld, Christian

    2012-04-01

    A new integral estimate for four-center two-electron integrals is introduced that accounts for distance information between the bra- and ket-charge distributions describing the two electrons. The screening is denoted QQR and combines the most important features of the conventional Schwarz screening by Häser and Ahlrichs published in 1989 [J. Comput. Chem. 10, 104 (1989), 10.1002/jcc.540100111] and our multipole-based integral estimates (MBIE) introduced in 2005 [D. S. Lambrecht and C. Ochsenfeld, J. Chem. Phys. 123, 184101 (2005), 10.1063/1.2079967]. At the same time, the estimates are not only tighter but also much easier to implement, so we recommend them over our MBIE bounds, which were introduced first for accounting for charge-distance information. The inclusion of distance dependence between charge distributions is not only useful at the SCF level but is particularly important for describing electron-correlation effects, e.g., within AO-MP2 theory, where the decay behavior is at least 1/R⁴ or even 1/R⁶. In the present work, we focus on studying the efficiency of our QQR estimates within SCF theory and demonstrate the performance for a benchmark set of 44 medium to large molecules, where savings of up to a factor of 2 for exchange integrals are observed for larger systems. Based on the results of the benchmark set, we show that reliable tightness of integral estimates is more important for screening performance than rigorous upper-bound properties.
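
    For reference, the conventional Schwarz bound and the distance-aware QQR estimate can be written schematically as follows (the precise extent definitions and the well-separatedness condition follow the paper):

        |(\mu\nu|\lambda\sigma)| \le Q_{\mu\nu} Q_{\lambda\sigma},
        \qquad Q_{\mu\nu} = (\mu\nu|\mu\nu)^{1/2}

        Q^{\mathrm{QQR}}_{\mu\nu\lambda\sigma} \approx
            \frac{Q_{\mu\nu}\, Q_{\lambda\sigma}}{R - \mathrm{ext}_{\mu\nu} - \mathrm{ext}_{\lambda\sigma}}

    where R is the distance between the bra and ket charge distributions and ext denotes their spatial extents; the 1/R factor supplies the distance decay that the plain Schwarz bound lacks.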

  11. Linear Interaction Energy Based Prediction of Cytochrome P450 1A2 Binding Affinities with Reliability Estimation

    PubMed Central

    Capoferri, Luigi; Verkade-Vreeker, Marlies C. A.; Buitenhuis, Danny; Commandeur, Jan N. M.; Pastor, Manuel; Vermeulen, Nico P. E.; Geerke, Daan P.

    2015-01-01

    Prediction of human Cytochrome P450 (CYP) binding affinities of small ligands, i.e., substrates and inhibitors, represents an important task for predicting drug-drug interactions. A quantitative assessment of the ligand binding affinity towards different CYPs can provide an estimate of inhibitory activity or an indication of isoforms prone to interact with substrates or inhibitors. However, the accuracy of global quantitative models for CYP substrate binding or inhibition based on traditional molecular descriptors can be limited, because of the lack of information on the structure and flexibility of the catalytic site of CYPs. Here we describe the application of a method that combines protein-ligand docking, Molecular Dynamics (MD) simulations and Linear Interaction Energy (LIE) theory, to allow for quantitative CYP affinity prediction. Using this combined approach, a LIE model for human CYP 1A2 was developed and evaluated, based on a structurally diverse dataset for which the estimated experimental uncertainty was 3.3 kJ mol⁻¹. For the computed CYP 1A2 binding affinities, the model showed a root mean square error (RMSE) of 4.1 kJ mol⁻¹ and a standard error in prediction (SDEP) in cross-validation of 4.3 kJ mol⁻¹. A novel approach that includes information on both structural ligand description and protein-ligand interaction was developed for estimating the reliability of predictions, and was able to identify compounds from an external test set with a SDEP for the predicted affinities of 4.6 kJ mol⁻¹ (corresponding to 0.8 pKi units). PMID:26551865
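
    The underlying LIE regression has the standard Åqvist form, with the van der Waals and electrostatic ligand-surrounding interaction energies averaged over the MD ensembles of the bound and free states; α, β, and γ are fitted empirically. This is the generic form, not necessarily the paper's final parameterization:

        \Delta G_{\mathrm{bind}} =
            \alpha \left( \langle V^{\mathrm{vdW}} \rangle_{\mathrm{bound}}
                        - \langle V^{\mathrm{vdW}} \rangle_{\mathrm{free}} \right)
          + \beta  \left( \langle V^{\mathrm{ele}} \rangle_{\mathrm{bound}}
                        - \langle V^{\mathrm{ele}} \rangle_{\mathrm{free}} \right)
          + \gamma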

  12. Network reliability

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1985-01-01

    Network control (or network management) functions are essential for efficient and reliable operation of a network. Some control functions are currently included as part of the Open System Interconnection model. For local area networks, it is widely recognized that there is a need for additional control functions, including fault isolation functions, monitoring functions, and configuration functions. These functions can be implemented in either a central or distributed manner. The Fiber Distributed Data Interface Medium Access Control and Station Management protocols provide an example of distributed implementation. Relevant information is presented here in outline form.

  13. Physics-Based Stress Corrosion Cracking Component Reliability Model cast in an R7-Compatible Cumulative Damage Framework

    SciTech Connect

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Toloczko, Mychailo B.; Johnson, Kenneth I.; Sanborn, Scott E.

    2011-07-01

    This is a working report drafted under the Risk-Informed Safety Margin Characterization pathway of the Light Water Reactor Sustainability Program, describing statistical models of passive component reliability.

  14. The Reliability of Workplace-Based Assessment in Postgraduate Medical Education and Training: A National Evaluation in General Practice in the United Kingdom

    ERIC Educational Resources Information Center

    Murphy, Douglas J.; Bruce, David A.; Mercer, Stewart W.; Eva, Kevin W.

    2009-01-01

    To investigate the reliability and feasibility of six potential workplace-based assessment methods in general practice training: criterion audit, multi-source feedback from clinical and non-clinical colleagues, patient feedback (the CARE Measure), referral letters, significant event analysis, and video analysis of consultations. Performance of GP…

  15. Factor Structure and Reliability of the Revised Conflict Tactics Scales' (CTS2) 10-Factor Model in a Community-Based Female Sample

    ERIC Educational Resources Information Center

    Yun, Sung Hyun

    2011-01-01

    The present study investigated the factor structure and reliability of the revised Conflict Tactics Scales' (CTS2) 10-factor model in a community-based female sample (N = 261). The underlying factor structure of the 10-factor model was tested by the confirmatory multiple group factor analysis, which demonstrated complex factor cross-loadings…

  16. Between-day reliability of three-dimensional motion analysis of the trunk: A comparison of marker based protocols.

    PubMed

    Rast, Fabian Marcel; Graf, Eveline Silvia; Meichtry, André; Kool, Jan; Bauer, Christoph Michael

    2016-03-21

    Motion capture of the trunk using three-dimensional optoelectronic systems and skin markers placed on anatomical landmarks is prone to error due to marker placement, thus decreasing between-day reliability. The influence of these errors on angular output might be reduced by using an overdetermined number of markers and optimization algorithms, or by defining the neutral position using a reference trial. The purpose of this study was to quantify and compare the between-day reliability of trunk kinematics, when using these methods. In each of two sessions, 20 subjects performed four movement tasks. Trunk kinematics were established through the plug-in-gait protocol, the point cloud optimization algorithm, and by defining upright standing as neutral position. Between-day reliability was analyzed using generalizability theory and quantified by indexes of dependability. Across all movement tasks, none of the methods was superior in terms of between-day reliability. The point cloud algorithm did not improve between-day reliability, but did result in 24.3% greater axial rotation angles. The definition of neutral position by means of a reference trial revealed 5.8% higher indexes of dependability for lateral bending and axial rotation angles, but 13.7% smaller indexes of dependability for flexion angles. Further, using a reference trial resulted in 8.3° greater trunk flexion angles. Therefore, the selection of appropriate marker placement and the corresponding calculation of angular output are dependent on the movement task and the underlying research question. PMID:26920506

  17. Reliable, Efficient and Cost-Effective Electric Power Converter for Small Wind Turbines Based on AC-link Technology

    SciTech Connect

    Darren Hammell; Mark Holveck; DOE Project Officer - Keith Bennett

    2006-08-01

    Grid-tied inverter power electronics have been an Achilles heel of the small wind industry, providing opportunity for new technologies to provide lower costs, greater efficiency, and improved reliability. The small wind turbine market is also moving towards the 50-100kW size range. The unique AC-link power conversion technology provides efficiency, reliability, and power quality advantages over existing technologies, and Princeton Power will adapt prototype designs used for industrial asynchronous motor control to a 50kW small wind turbine design.

  18. A Study on Combination of Reliability-based Automatic Repeat reQuest with Error Potential-based Error Correction for Improving P300 Speller Performance

    NASA Astrophysics Data System (ADS)

    Takahashi, Hiromu; Yoshikawa, Tomohiro; Furuhashi, Takeshi

    Brain-computer interfaces (BCIs) are systems that translate one's thoughts into commands to restore control and communication to severely paralyzed people; they are also appealing to healthy people. The P300 speller, one of the most renowned BCIs for communication, allows users to select letters just by thought. However, due to the low signal-to-noise ratio of the P300, signal averaging is often performed, which improves the spelling accuracy but degrades the spelling speed. The authors have proposed reliability-based automatic repeat request (RB-ARQ) to ease this problem. RB-ARQ can be enhanced when combined with error correction based on the error-related potentials (ErrPs) that occur on erroneous feedback. Thus, this study aims to reveal the characteristics of ErrPs in the P300 speller paradigm, and to combine RB-ARQ with ErrP-based error correction to further improve performance. The results show that the ErrPs observed in the current study resemble previously reported ErrPs observed in a BCI cursor-control task, and that the performance of the P300 speller could be improved by 35 percent on average.

  19. Youth health risk behavior assessment in Fiji: The reliability of Global School-based Health Survey content adapted for ethnic Fijian girls

    PubMed Central

    Becker, Anne E.; Roberts, Andrea L.; Perloe, Alexandra; Bainivualiku, Asenaca; Richards, Lauren K.; Gilman, Stephen E.; Striegel-Moore, Ruth H.

    2010-01-01

    Objective The Global School-based Student Health Survey (GSHS) is an assessment for adolescent health risk behaviors and exposures, supported by the World Health Organization. Although already widely implemented—and intended for youth assessment across diverse ethnic and national contexts—no reliability data have yet been reported for GSHS-based assessment in any ethnicity or country-specific population. This study reports test-retest reliability for GSHS content adapted for a female adolescent ethnic Fijian study sample in Fiji. Design We adapted and translated GSHS content to assess health risk behaviors as part of a larger study investigating the impact of social transition on ethnic Fijian secondary schoolgirls in Fiji. In order to evaluate the performance of this measure for our ethnic Fijian study sample (n=523), we examined its test-retest reliability with kappa coefficients, % agreement, and prevalence estimates in a sub-sample (n=81). Reliability among strata defined by topic, age, and language was also examined. Results Average agreement between test and retest was 77%, and average Cohen's kappa was 0.47. Mean kappas for questions from core modules about alcohol use, tobacco use, and sexual behavior were substantial, and higher than those for modules relating to other risk behaviors. Conclusions Although test-retest reliability of responses within this country-specific version of GSHS content was substantial in several topical domains for this ethnic Fijian sample, only fair reliability for the module assessing dietary behaviors and other individual items suggests that population-specific psychometric evaluation is essential to interpreting language and country-specific GSHS data. PMID:20234961
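
    The kappa coefficients reported above follow the usual definition, chance-corrected agreement between test and retest; a minimal sketch (the toy responses below are not study data):

        import numpy as np

        def cohen_kappa(x, y):
            x, y = np.asarray(x), np.asarray(y)
            po = np.mean(x == y)                          # observed agreement
            pe = sum(np.mean(x == c) * np.mean(y == c)    # chance agreement
                     for c in np.union1d(x, y))
            return (po - pe) / (1.0 - pe)

        test   = ["yes", "no", "no", "yes", "no", "yes", "no", "no"]
        retest = ["yes", "no", "yes", "yes", "no", "yes", "no", "no"]
        print(cohen_kappa(test, retest))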

  20. Development of a reliable and highly sensitive, digital PCR-based assay for early detection of HLB

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Huanglongbing (HLB) is caused by a phloem-limited bacterium, Ca. Liberibacter asiaticus (Las) in the United States. The bacterium often is present at a low concentration and unevenly distributed in the early stage of infection, making reliable and early diagnosis a serious challenge. Conventional d...

  1. Further discussion on reliability: the art of reliability estimation.

    PubMed

    Yang, Yanyun; Green, Samuel B

    2015-01-01

    Sijtsma and van der Ark (2015) focused in their lead article on three frameworks for reliability estimation in nursing research: classical test theory (CTT), factor analysis (FA), and generalizability theory. We extend their presentation with particular attention to CTT and FA methods. We first consider the potential of yielding an overly negative or an overly positive assessment of reliability based on coefficient alpha. Next, we discuss other CTT methods for estimating reliability and how the choice of methods affects the interpretation of the reliability coefficient. Finally, we describe FA methods, which not only permit an understanding of a measure's underlying structure but also yield a variety of reliability coefficients with different interpretations. On a more general note, we discourage reporting reliability as a two-choice outcome--unsatisfactory or satisfactory; rather, we recommend that nursing researchers make a conceptual and empirical argument about when a measure might be more or less reliable, depending on its use. PMID:25738627
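
    For readers who want the CTT baseline being discussed, coefficient alpha is computed from the item variances and the total-score variance; a minimal sketch (the item matrix is illustrative):

        import numpy as np

        def cronbach_alpha(items):
            # items: n respondents (rows) x k items (columns)
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        rng = np.random.default_rng(0)
        trait = rng.normal(size=(100, 1))
        items = trait + rng.normal(scale=0.8, size=(100, 5))  # five parallel-ish items
        print(cronbach_alpha(items))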

  2. Intra-observer reliability for measuring first and second toe and metatarsal protrusion distance using palpation-based tests: a test-retest study

    PubMed Central

    2014-01-01

    Background Measurement of first and second metatarsal and toe protrusion is frequently used to explain foot problems, using x-rays, osteological measurements or palpation-based tests. Length differences could be related to the appearance of problems in the foot. A test-retest design was conducted in order to establish the intra-rater reliability of three palpation-based tests. Methods 202 feet of physical therapy students and teachers of the CEU San Pablo University of Madrid, 39 men and 62 women, were measured using three different tests. Data were analysed using SPSS version 15.0. Mean, SD and 95% CI were calculated for each variable. Normality of the quantitative data was assessed using the Kolmogorov-Smirnov test. Test-retest intra-rater reliability was assessed using the Intraclass Correlation Coefficient (ICC). The Standard Error of Measurement (SEM) and the Minimal Detectable Change (MDC) were also obtained. Results All the ICC values showed a high degree of reliability (Test 1 = 0.97, Test 2 = 0.86 and Test 3 = 0.88), as did the SEM (Test 1 = 0.07, Test 2 = 0.10 and Test 3 = 0.11) and the MDC (Test 1 = 0.21, Test 2 = 0.30 and Test 3 = 0.31). Conclusions Measurement of first and second metatarsal and toe protrusion using the three palpation-based tests showed a high degree of reliability. PMID:25729437
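
    The SEM and MDC quoted above are conventionally derived from the ICC and the sample standard deviation, assuming the common definitions SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM (the paper's exact computation may differ):

        import numpy as np

        def sem_mdc(sd, icc):
            sem = sd * np.sqrt(1.0 - icc)       # standard error of measurement
            mdc95 = 1.96 * np.sqrt(2.0) * sem   # minimal detectable change (95%)
            return sem, mdc95

        print(sem_mdc(sd=0.40, icc=0.97))  # illustrative values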

  3. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
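
    An illustrative toy version of importance sampling with an adaptively recentered sampling density is sketched below for a linear limit state with standard normal variables; this is a simplified stand-in for the paper's AIS method, not its actual algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def g(x):
        # Hypothetical linear limit-state function: failure when g(x) < 0.
        return 4.0 - x.sum(axis=1)

    def adaptive_is(n_rounds=4, n=5000, dim=2):
        # Estimate p_f = P(g(X) < 0) for X ~ N(0, I). Each round recenters
        # the Gaussian sampling density on the mean of the failure points
        # found so far, a simplified stand-in for adaptively growing the
        # sampling domain toward the failure domain.
        center = np.zeros(dim)
        p_f = 0.0
        for _ in range(n_rounds):
            x = rng.normal(center, 1.0, size=(n, dim))
            # Importance weight = N(0, I) density / N(center, I) density.
            log_w = -0.5 * (x**2).sum(axis=1) + 0.5 * ((x - center)**2).sum(axis=1)
            fail = g(x) < 0
            p_f = float(np.mean(np.exp(log_w) * fail))
            if fail.any():
                center = x[fail].mean(axis=0)  # adapt for the next round
        return p_f

    print(adaptive_is())  # exact answer: Phi(-4/sqrt(2)) ~ 2.3e-3
    ```

    Recentering the density on observed failure points concentrates samples near the failure domain, which is the essence of minimizing over-sampling in the safe region.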

  4. Optimizing the preventive maintenance scheduling by genetic algorithm based on cost and reliability in National Iranian Drilling Company

    NASA Astrophysics Data System (ADS)

    Javanmard, Habibollah; Koraeizadeh, Abd al-Wahhab

    2016-06-01

    The present research aims at predicting the activities required for preventive maintenance in terms of optimal equipment cost and reliability. The research sample includes all offshore drilling equipment of the FATH 59 Derrick Site affiliated with the National Iranian Drilling Company. The research uses a field methodology and, in terms of its objectives, is classified as applied research. Some of the data are extracted from documents available in the equipment and maintenance department of the FATH 59 Derrick site, and the other needed data are derived from experts' estimates processed through a genetic algorithm. The result is a prediction of downtime, cost, and reliability over a predetermined time interval. The method is applicable to all manufacturing and non-manufacturing equipment.
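
    As an illustration of how a genetic algorithm can trade off maintenance cost against reliability, the sketch below searches for a preventive-maintenance interval that minimizes the expected cost rate under an assumed Weibull failure law with minimal repair; all parameters are hypothetical and unrelated to the FATH 59 data.

    ```python
    import random

    random.seed(1)

    # Hypothetical Weibull parameters and relative costs for one unit.
    BETA, ETA = 2.2, 1200.0   # shape, scale (hours)
    C_PM, C_FAIL = 1.0, 12.0  # preventive vs. corrective cost

    def cost_rate(T):
        # Expected cost per hour when PM is done every T hours and failures
        # in between get minimal repair: E[failures in T] = (T/ETA)**BETA.
        return (C_PM + C_FAIL * (T / ETA) ** BETA) / T

    def ga(pop_size=30, gens=60, lo=50.0, hi=3000.0):
        pop = [random.uniform(lo, hi) for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=cost_rate)                # lower cost rate is fitter
            parents = pop[: pop_size // 2]
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                child = 0.5 * (a + b)              # crossover: blend parents
                child *= random.uniform(0.9, 1.1)  # mutation: random jitter
                children.append(min(max(child, lo), hi))
            pop = parents + children
        return min(pop, key=cost_rate)

    T_opt = ga()
    print(round(T_opt, 1), "h between PM actions,",
          round(cost_rate(T_opt), 5), "cost per hour")
    ```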

  5. Effect of Vanadium Addition on Reliability and Microstructure of BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Astrophysics Data System (ADS)

    Natsui, Hidesada; Shibahara, Takeshi; Yonezawa, Yu; Kido, Osamu

    2012-09-01

    The vanadium distribution in multilayer ceramic capacitors (MLCCs), sintered under a reducing atmosphere, was investigated using scanning transmission electron microscopy-electron energy loss spectroscopy (STEM-EELS), and insulation resistance degradation was analyzed using impedance spectroscopy in highly accelerated lifetime tests to clarify the effects of vanadium on both the electrical properties and microstructure of MLCCs. Vanadium mitigated insulation resistance degradation and increased the reliability of MLCCs. Moreover, as the vanadium content increased, the insulation resistance at the ceramic/electrode interface decreased more slowly; this change in degradation dynamics directly resulted in an improved MLCC lifetime. The results of STEM-EELS analysis showed that vanadium was distributed along the grain boundaries and grain boundary junctions, but substituted into BaTiO3 at the ceramic/electrode interface. It is therefore considered that vanadium substitution at the ceramic/electrode interface improves the reliability of MLCCs.

  6. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  7. How to measure ecosystem stability? An evaluation of the reliability of stability metrics based on remote sensing time series across the major global ecosystems.

    PubMed

    De Keersmaecker, Wanda; Lhermitte, Stef; Honnay, Olivier; Farifteh, Jamshid; Somers, Ben; Coppin, Pol

    2014-07-01

    Increasing frequency of extreme climate events is likely to impose increased stress on ecosystems and to jeopardize the services that ecosystems provide. Therefore, it is of major importance to assess the effects of extreme climate events on the temporal stability (i.e., the resistance, the resilience, and the variance) of ecosystem properties. Most time series of ecosystem properties are, however, affected by varying data characteristics, uncertainties, and noise, which complicate the comparison of ecosystem stability metrics (ESMs) between locations. Therefore, there is a strong need for a more comprehensive understanding regarding the reliability of stability metrics and how they can be used to compare ecosystem stability globally. The objective of this study was to evaluate the performance of temporal ESMs based on time series of the Moderate Resolution Imaging Spectroradiometer derived Normalized Difference Vegetation Index of 15 global land-cover types. We provide a framework (i) to assess the reliability of ESMs as a function of data characteristics, uncertainties, and noise and (ii) to integrate reliability estimates in future global studies of ecosystem stability against climate disturbances. The performance of our framework was tested through (i) a global ecosystem comparison and (ii) a comparison of ecosystem stability in response to the 2003 drought. The results show the influence of data quality on the accuracy of ecosystem stability metrics. White noise, biased noise, and trends have a stronger effect on the accuracy of stability metrics than the length of the time series, temporal resolution, or amount of missing values. Moreover, we demonstrate the importance of integrating reliability estimates to interpret stability metrics within confidence limits. Based on these confidence limits, other studies dealing with specific ecosystem types or locations can be put into context, and a more reliable assessment of ecosystem stability against environmental disturbances

  8. Electrical and reliability characteristics of Mn-doped nano BaTiO3-based ceramics for ultrathin multilayer ceramic capacitor application

    NASA Astrophysics Data System (ADS)

    Gong, Huiling; Wang, Xiaohui; Zhang, Shaopeng; Tian, Zhibin; Li, Longtu

    2012-12-01

    Nano BaTiO3-based dielectric ceramics, promising for ultrathin multilayer ceramic capacitor (MLCC) applications, were prepared by a chemical coating approach. The effects of Mn doping on the microstructures and dielectric properties of the ceramics were investigated. Degradation tests and impedance spectroscopy were employed to study the resistance degradation and the conduction mechanism of Mn-doped nano-BaTiO3 ceramic samples. It was found that the reliability characteristics depended strongly on the Mn content. Moreover, BaTiO3 ceramics with nanoscale grains are more sensitive to the Mn content than those with sub-micron grains. The addition of 0.3 mol. % Mn is beneficial for improving the reliability of the nano BaTiO3-based ceramics, which is an important parameter for MLCC applications. However, further increasing the Mn content deteriorates the performance of the ceramic samples.

  9. Reliability-based econometrics of aerospace structural systems: Design criteria and test options. Ph.D. Thesis - Georgia Inst. of Tech.

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hanagud, S.

    1974-01-01

    The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.

  10. Reliability/redundancy trade-off evaluation for multiplexed architectures used to implement quantum dot based computing

    SciTech Connect

    Bhaduri, D.; Shukla, S. K.; Graham, P. S.; Gokhale, M.

    2004-01-01

    With the advent of nanocomputing, researchers have proposed Quantum Dot Cellular Automata (QCA) as one of the implementation technologies. The majority gate is one of the fundamental gates implementable with QCAs. Moreover, majority gates play an important role in defect-tolerant circuit implementations for nanotechnologies due to their use in redundancy mechanisms such as TMR, CTMR, etc. Therefore, providing reliable implementation of majority logic using some redundancy mechanism is extremely important. This problem was addressed by von Neumann in 1956, in the form of 'majority multiplexing', and since then several analytical probabilistic models have been proposed to analyze majority multiplexing circuits. However, such analytical approaches are combinatorially challenging and error-prone. Also, the previous analyses did not distinguish between permanent faults at the gates and transient faults due to noisy interconnects or noise effects on gates. In this paper, we provide explicit fault models for transient and permanent errors at the gates and noise effects at the interconnects. We model majority multiplexing in a probabilistic system description language, and use probabilistic model checking to analyze the effects of our fault models on the different reliability/redundancy trade-offs for majority multiplexing configurations. We also draw parallels with another fundamental logic gate multiplexing technique, namely NAND multiplexing. Tools and methodologies for analyzing redundant architectures that use majority gates will help logic designers to quickly evaluate the amount of redundancy needed to achieve a given level of reliability. VLSI designs at the nanoscale will utilize implementation fabrics prone to faults of permanent and transient nature, and the interconnects will be extensively affected by noise, hence the need for tools that can capture probabilistically quantified fault models and provide quick evaluation of the trade-offs. A comparative
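
    The baseline benefit of majority-based redundancy is easy to quantify under the idealized assumption of a fault-free gate with independently failing inputs; the paper's explicit fault models for gates and interconnects would lower these figures.

    ```python
    def majority3_reliability(p):
        # Probability a fault-free 3-input majority gate outputs correctly
        # when each input is independently correct with probability p.
        return p**3 + 3 * p**2 * (1 - p)

    for p in (0.90, 0.99, 0.999):
        print(p, majority3_reliability(p))
    # Redundancy helps only while p > 0.5; permanent gate faults and noisy
    # interconnects, as modeled in the paper, reduce the achievable gain.
    ```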

  11. Beyond Reliability

    PubMed Central

    2008-01-01

    The validity of psychiatric diagnosis rests in part on a demonstration that identifiable biomarkers exist for major psychiatric illnesses. Recent evidence supports the existence of several biomarkers or endophenotypes for both schizophrenia and bipolar disorder. As we learn more about how these biomarkers relate to the symptoms, course, and treatment response of major psychiatric disorders, the “objectivity” of psychiatric diagnosis will increase. However, psychiatry is and will remain a clinically based discipline, aimed at comprehensively understanding and relieving human suffering. PMID:19727304

  12. Ant-based power efficient, adaptive, reliable, and load balanced (A-PEARL) routing for smart metering networks

    NASA Astrophysics Data System (ADS)

    Muraleedharan, Rajani

    2011-06-01

    The future of metering networks requires the adoption of different sensor technologies while reducing energy consumption. In this paper, a routing protocol with the ability to adapt and communicate reliably over varied IEEE standards is proposed. Given sensors' resource constraints, such as memory, energy, and processing power, an algorithm that balances resources without compromising performance is preferred. The proposed A-PEARL protocol is tested under harsh simulated scenarios such as sensor failure and fading conditions. The inherent features of the A-PEARL protocol, such as data aggregation, fusion, and channel hopping, enable minimal resource consumption and secure communication.

  13. Reliability and Confidence.

    ERIC Educational Resources Information Center

    Test Service Bulletin, 1952

    1952-01-01

    Some aspects of test reliability are discussed. Topics covered are: (1) how high should a reliability coefficient be?; (2) two factors affecting the interpretation of reliability coefficients--range of talent and interval between testings; (3) some common misconceptions--reliability of speed tests, part vs. total reliability, reliability for what…

  14. A novel decoding algorithm based on the hierarchical reliable strategy for SCG-LDPC codes in optical communications

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Tong, Qing-zhen; Huang, Sheng; Wang, Yong

    2013-11-01

    An effective hierarchical reliable belief propagation (HRBP) decoding algorithm is proposed according to the structural characteristics of systematically constructed Gallager low-density parity-check (SCG-LDPC) codes. The novel decoding algorithm combines layered iteration with a reliability judgment, which greatly reduces the number of variable nodes involved in the subsequent iteration process and accelerates the convergence rate. Simulation results for the SCG-LDPC(3969,3720) code show that the novel HRBP decoding algorithm can greatly reduce the computational load while maintaining performance comparable to the traditional belief propagation (BP) algorithm. The bit error rate (BER) of the HRBP algorithm is comparable at a threshold value of 15, but in the subsequent iteration process, the number of variable nodes for the HRBP algorithm can be reduced by about 70% at high signal-to-noise ratio (SNR) compared with the BP algorithm. When the threshold value is further increased, the HRBP algorithm gradually degenerates into the layered-BP algorithm; however, at a BER of 10(-7) and a maximum of 30 iterations, the net coding gain (NCG) of the HRBP algorithm is 0.2 dB more than that of the BP algorithm, and the average number of iterations can be reduced by about 40% at high SNR. Therefore, the novel HRBP decoding algorithm is well suited for optical communication systems.

  15. Improving the communication reliability of body sensor networks based on the IEEE 802.15.4 protocol.

    PubMed

    Gomes, Diogo; Afonso, José A

    2014-03-01

    Body sensor networks (BSNs) enable continuous monitoring of patients anywhere, with minimum constraints to daily life activities. Although the IEEE 802.15.4 and ZigBee(®) (ZigBee Alliance, San Ramon, CA) standards were mainly developed for use in wireless sensor network (WSN) applications, they are also widely used in BSN applications because of device characteristics such as low power, low cost, and small form factor. However, compared with WSNs, BSNs present some very distinctive characteristics in terms of traffic and mobility patterns, heterogeneity of the nodes, and quality of service requirements. This article evaluates the suitability of the carrier sense multiple access-collision avoidance protocol, used by the IEEE 802.15.4 and ZigBee standards, for data-intensive BSN applications, through the execution of experimental tests in different evaluation scenarios, in order to take into account the effects of contention, clock drift, and hidden nodes on the communication reliability. Results show that the delivery ratio may decrease substantially during transitory periods, which can last for several minutes, to a minimum of 90% with retransmissions and 13% without retransmissions. This article also proposes and evaluates the performance of the BSN contention avoidance mechanism, which was designed to solve the identified reliability problems. This mechanism was able to restore the delivery ratio to 100% even in the scenario without retransmissions. PMID:24350805

  16. Rational Hydrogenation for Enhanced Mobility and High Reliability on ZnO-based Thin Film Transistors: From Simulation to Experiment.

    PubMed

    Xu, Lei; Chen, Qian; Liao, Lei; Liu, Xingqiang; Chang, Ting-Chang; Chang, Kuan-Chang; Tsai, Tsung-Ming; Jiang, Changzhong; Wang, Jinlan; Li, Jinchai

    2016-03-01

    Hydrogenation is one of the effective methods for improving the performance of ZnO thin film transistors (TFTs), owing to the fact that hydrogen (H) acts as a defect passivator and a shallow n-type dopant in ZnO materials. However, passivation accompanied by excessive H doping of the channel region of a ZnO TFT is undesirable because high carrier density leads to negative threshold voltages. Herein, we report that Mg/H codoping can overcome the trade-off between performance and reliability in ZnO TFTs. Theoretical calculations suggest that the incorporation of Mg in hydrogenated ZnO decreases the formation energy of interstitial H and increases the formation energy of O vacancies (VO). The experimental results demonstrate that the presence of dilute Mg in hydrogenated ZnO TFTs is sufficient to boost mobility from 10 to 32.2 cm(2)/(V s) at a low carrier density (∼2.0 × 10(18) cm(-3)), which can be attributed to the decreased electron effective mass caused by surface band bending. All the results verify that Mg/H codoping can significantly passivate the VO to improve device reliability and enhance mobility. This finding thus clearly points the way to realizing high-performance metal oxide TFTs for low-cost, large-volume, flexible electronics. PMID:26856932

  17. Omnidirectional Audio-Visual Talker Localization Based on Dynamic Fusion of Audio-Visual Features Using Validity and Reliability Criteria

    NASA Astrophysics Data System (ADS)

    Denda, Yuki; Nishiura, Takanobu; Yamashita, Yoichi

    This paper proposes a robust omnidirectional audio-visual (AV) talker localizer for AV applications. The proposed localizer consists of two innovations. One is robust omnidirectional audio and visual features: direction-of-arrival (DOA) estimation using an equilateral triangular microphone array and human position estimation using an omnidirectional video camera extract the AV features. The other is a dynamic fusion of the AV features. The validity criterion, called the audio- or visual-localization counter, validates each audio or visual feature. The reliability criterion, called the speech arriving evaluator, acts as a dynamic weight to eliminate any prior statistical properties from the fusion procedure. The proposed localizer can achieve both talker localization during speech activity and user localization during non-speech activity under an identical fusion rule. Talker localization experiments were conducted in an actual room to evaluate the effectiveness of the proposed localizer. The results confirmed that the talker localization performance of the proposed AV localizer using the validity and reliability criteria is superior to that of conventional localizers.

  18. The reliability of a novel magnetic resonance imaging-based tool for the evaluation of forefoot bursae in patients with rheumatoid arthritis: the FFB score

    PubMed Central

    King, Leonard; Thomas, Matthew; Roemer, Frank; Culliford, David; Bowen, Catherine J.; Arden, Nigel K.; Edwards, Christopher J.

    2014-01-01

    Objective. The aim of this study was to determine the reliability of an MRI-based score that evaluates forefoot bursae (FFBs) in patients with RA. Methods. Items for inclusion, grading criteria and MRI sequences were determined iteratively. The score was evaluated in 30 patients with established RA. Reader agreement was evaluated using the percentage of exact/close agreement, Bland–Altman plots, kappa and intraclass correlation coefficient analyses. Results. The FFB score assesses nine forefoot regions and contains four items: presence, shape, enhancement and magnetic resonance characteristics. The FFB score showed moderate to good intra- and interreader agreement (κ range = 0.5–0.9 and 0.47–0.87, respectively). Conclusion. The FFB score is adequately reliable in the evaluation of bursa-like lesions of the forefoot in patients with RA. PMID:24907157

  19. Reliability of wireless sensor networks.

    PubMed

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (a multipath strategy), which is important for reliability but significantly increases WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs that considers the battery level as a key factor. Moreover, this model is based on the routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of power consumption on the reliability of WSNs. PMID:25157553
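
    A toy version of a battery-aware multipath reliability model (an illustration in the spirit of the paper, not its exact formulation) can be sketched as follows.

    ```python
    def hop_reliability(link, battery, floor=0.2):
        # Derate a link's reliability by its sender's battery level; a
        # nearly drained node is treated as an unreliable hop.
        return link * (floor + (1 - floor) * battery)

    def path_reliability(links, batteries):
        # A route delivers the packet only if every hop succeeds.
        r = 1.0
        for link, batt in zip(links, batteries):
            r *= hop_reliability(link, batt)
        return r

    def multipath_reliability(path_rels):
        # Multipath routing succeeds if at least one disjoint path does.
        fail_all = 1.0
        for r in path_rels:
            fail_all *= 1.0 - r
        return 1.0 - fail_all

    p1 = path_reliability([0.95, 0.90, 0.92], [1.0, 0.8, 0.6])
    p2 = path_reliability([0.90, 0.90], [0.9, 0.9])
    print(round(multipath_reliability([p1, p2]), 3))  # ~0.83
    ```

    The model captures the paper's central trade-off: adding a second path raises delivery probability but costs the energy of transmitting the packet twice.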

  20. Methods for reliability and uncertainty assessment and for applicability evaluations of classification- and regression-based QSARs.

    PubMed Central

    Eriksson, Lennart; Jaworska, Joanna; Worth, Andrew P; Cronin, Mark T D; McDowell, Robert M; Gramatica, Paola

    2003-01-01

    This article provides an overview of methods for reliability assessment of quantitative structure-activity relationship (QSAR) models in the context of regulatory acceptance of human health and environmental QSARs. Useful diagnostic tools and data analytical approaches are highlighted and exemplified. Particular emphasis is given to the question of how to define the applicability borders of a QSAR and how to estimate parameter and prediction uncertainty. The article ends with a discussion regarding QSAR acceptability criteria. This discussion contains a list of recommended acceptability criteria, and we give reference values for important QSAR performance statistics. Finally, we emphasize that rigorous and independent validation of QSARs is an essential step toward their regulatory acceptance and implementation. PMID:12896860

  1. Objectives, priorities, reliable knowledge, and science-based management of Missouri River interior least terns and piping plovers

    USGS Publications Warehouse

    Sherfy, Mark; Anteau, Michael; Shaffer, Terry; Sovada, Marsha; Stucker, Jennifer

    2011-01-01

    Supporting recovery of federally listed interior least tern (Sternula antillarum athalassos; tern) and piping plover (Charadrius melodus; plover) populations is a desirable goal in management of the Missouri River ecosystem. Many tools are implemented in support of this goal, including habitat management, annual monitoring, directed research, and threat mitigation. Similarly, many types of data can be used to make management decisions, evaluate system responses, and prioritize research and monitoring. The ecological importance of Missouri River recovery and the conservation status of terns and plovers place a premium on efficient and effective resource use. Efficiency is improved when a single data source informs multiple high-priority decisions, whereas effectiveness is improved when decisions are informed by reliable knowledge. Seldom will a single study design be optimal for addressing all data needs, making prioritization of needs essential. Data collection motivated by well-articulated objectives and priorities has many advantages over studies in which questions and priorities are determined retrospectively. Research and monitoring for terns and plovers have generated a wealth of data that can be interpreted in a variety of ways. The validity and strength of conclusions from analyses of these data is dependent on compatibility between the study design and the question being asked. We consider issues related to collection and interpretation of biological data, and discuss their utility for enhancing the role of science in management of Missouri River terns and plovers. A team of USGS scientists at Northern Prairie Wildlife Research Center has been conducting tern and plover research on the Missouri River since 2005. The team has had many discussions about the importance of setting objectives, identifying priorities, and obtaining reliable information to answer pertinent questions about tern and plover management on this river system. The objectives of this

  2. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verifying the need for replication and on methodologies for generating replicated data in a cost-effective manner. The concept of the debugging graph was pursued through simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model: reasonable parameter values were assigned and used to generate simulated data, which were then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.
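
    The Basic model mentioned above is commonly formulated as a nonhomogeneous Poisson process with mean value function m(t) = a(1 - e^(-bt)). Replicated datasets like those used in the experiments can be simulated from it by thinning, as in this sketch with hypothetical parameters.

    ```python
    import math
    import random

    random.seed(42)

    def simulate_basic_model(a=100.0, b=0.02, horizon=300.0):
        # Simulate one replication of failure times from an NHPP with mean
        # value function m(t) = a * (1 - exp(-b * t)), using thinning:
        # intensity lambda(t) = a * b * exp(-b * t) is maximal at t = 0.
        t, times = 0.0, []
        lam_max = a * b
        while True:
            t += random.expovariate(lam_max)
            if t >= horizon:
                return times
            if random.random() < math.exp(-b * t):  # accept w.p. lambda(t)/lam_max
                times.append(t)

    failures = simulate_basic_model()
    expected = 100.0 * (1 - math.exp(-0.02 * 300.0))
    print(len(failures), "simulated failures vs", round(expected, 1), "expected")
    ```

    Running this generator many times with fixed parameters yields the kind of replicated data the report argues is needed to expose the variance of model predictions.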

  3. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed around redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable. Meanwhile, components that tend to fail earliest, such as seals or gaskets, typically have little mass. To better understand the problem, my project is to create a parametric model that relates the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.

  4. A Study on Enhancing Data Storage Capacity and Mechanical Reliability of Solid Immersion Lens-Based Near-Field Recording System

    NASA Astrophysics Data System (ADS)

    Park, No-Cheol; Yang, Hyun-Seok; Rhim, Yoon-Cheol; Park, Young-Pil

    2008-08-01

    In this study, several technical issues concerning solid immersion lens (SIL)-based near-field recording (NFR) are explored, namely, enhancing storage capacity and guaranteeing the mechanical reliability of the device. For the purpose of enhancing the storage capacity of the NFR system, two optical configurations using radial polarization and dual recording layers are proposed. A feasibility analysis of the proposed optical configuration with radial polarization determined that illumination with radial polarization is not a suitable solution for achieving higher areal density. To apply the highly focusing characteristics of incident radially polarized light to cover-layer-protected data storage, an annular pupil filtering method was introduced. Complete field analysis of the proposed dual-layered NFR optics verified its feasibility, and the assembly of the SIL of the proposed model was successfully achieved. In addition, to improve the mechanical reliability of the SIL-based NFR system, improved near-field (NF) air-gap servo methods and air flow analysis around the lower part of the SIL have been evaluated. With improved NF gap servo methods using an error-based disturbance observer (EDOB) on a base air-gap controller, residual gap errors were markedly reduced, by 26.26%, while controlling the NF air-gap to 30 nm. Air flow near the head-media interface was visualized, and an undesirable effect of backward flow climbing from the bottom surface of the SIL was observed.

  5. Reliability and validity of expert assessment based on airborne and urinary measures of nickel and chromium exposure in the electroplating industry.

    PubMed

    Chen, Yu-Cheng; Coble, Joseph B; Deziel, Nicole C; Ji, Bu-Tian; Xue, Shouzheng; Lu, Wei; Stewart, Patricia A; Friesen, Melissa C

    2014-11-01

    The reliability and validity of six experts' exposure ratings were evaluated for 64 nickel-exposed and 72 chromium-exposed workers from six Shanghai electroplating plants based on airborne and urinary nickel and chromium measurements. Three industrial hygienists and three occupational physicians independently ranked the exposure intensity of each metal on an ordinal scale (1-4) for each worker's job in two rounds: the first round was based on responses to an occupational history questionnaire and the second round also included responses to an electroplating industry-specific questionnaire. The Spearman correlation (rs) was used to compare each rating's validity to its corresponding subject-specific arithmetic mean of four airborne or four urinary measurements. Reliability was moderately high (weighted kappa range = 0.60-0.64). Validity was poor to moderate (rs = -0.37 to 0.46) for both airborne and urinary concentrations of both metals. For airborne nickel concentrations, validity differed by plant. For dichotomized metrics, sensitivity and specificity were higher based on urinary measurements (47-78%) than airborne measurements (16-50%). Few patterns were observed by metal, assessment round, or expert type. These results suggest that, for electroplating exposures, experts can achieve moderately high agreement and reasonably distinguish between low and high exposures when reviewing responses to in-depth questionnaires used in population-based case-control studies. PMID:24736099

  6. Effect of an interface Mg insertion layer on the reliability of a magnetic tunnel junction based on a Co2FeAl full-Heusler alloy

    NASA Astrophysics Data System (ADS)

    Lee, Jung-Min; Kil, Gyu Hyun; Lee, Gae Hun; Choi, Chul Min; Song, Yun-Heub; Sukegawa, Hiroaki; Mitani, Seiji

    2014-04-01

    The reliability of a magnetic tunnel junction (MTJ) based on a Co2FeAl (CFA) full-Heusler alloy with a MgO tunnel barrier was evaluated. In particular, the effect of a Mg insertion layer under the MgO was investigated with respect to resistance drift by using various voltage stress tests. We compared the resistance change during constant voltage stress (CVS) and confirmed a trap/detrap phenomenon during the interval stress test for samples with and without a Mg insertion layer. The MTJ with a Mg insertion layer showed a relatively small resistance change in the CVS test and a reduced trap/detrap phenomenon in the interval stress test compared to the sample without a Mg insertion layer. This is understood to be caused by the improved crystallinity at the bottom of the CFA/MgO interface due to the Mg insertion layer, which provides a smaller number of trap sites during the stress test. As a result, the interface condition of the MgO layer is very important for the reliability of a MTJ using a full-Heusler alloy, and the insertion of a Mg layer at the MgO interface is expected to be an effective method for enhancing the reliability of a MTJ.

  7. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    PubMed Central

    Fan, Wei; Li, Rong; Li, Sifan; Ping, Wenli; Li, Shujun; Naumova, Alexandra; Peelen, Tamara; Yuan, Zheng; Zhang, Dabing

    2016-01-01

    Reliable methods for detecting tobacco components in tobacco products are needed to control smuggling effectively and to classify tariffs and excise duties in the tobacco industry. In this study, two sensitive and specific DNA-based methods, a quantitative real-time PCR (qPCR) assay and a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of the presence of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of the uridine 5′-monophosphate synthase (UMPS) gene, and their specificities and sensitivities were determined with various plant materials. Both the qPCR and LAMP methods were reliable and accurate in the rapid detection of tobacco components in various practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for application in tobacco identification, particularly in special cases where the morphology or chemical composition of the tobacco has been disrupted. Combining both methods would therefore facilitate not only the control of tobacco smuggling but also tariff and excise classification.

  8. Reliability and validity of a semi-structured DSM-based diagnostic interview module for the assessment of Attention Deficit Hyperactivity Disorder in adult psychiatric outpatients.

    PubMed

    Gorlin, Eugenia I; Dalrymple, Kristy; Chelminski, Iwona; Zimmerman, Mark

    2016-08-30

    Despite growing recognition that the symptoms and functional impairments of Attention Deficit/Hyperactivity Disorder (ADHD) persist into adulthood, only a few psychometrically sound diagnostic measures have been developed for the assessment of ADHD in adults, and none have been validated for use in a broad treatment-seeking psychiatric sample. The current study presents the reliability and validity of a semi-structured DSM-based diagnostic interview module for ADHD, which was administered to 1194 adults presenting to an outpatient psychiatric practice. The module showed excellent internal consistency and interrater reliability, good convergent and discriminant validity (as indexed by relatively high correlations with self-report measures of ADHD and ADHD-related constructs and little or no correlation with other, non-ADHD symptom domains), and good construct validity (as indexed by significantly higher rates of psychosocial impairment and self-reported family history of ADHD in individuals who meet criteria for an ADHD diagnosis). This instrument is thus a reliable and valid diagnostic tool for the detection of ADHD in adults presenting for psychiatric evaluation and treatment. PMID:27259136

  9. Highly reliable and stable organic field-effect transistor nonvolatile memory with a poly(4-vinyl phenol) charge trapping layer based on a pn-heterojunction active layer

    NASA Astrophysics Data System (ADS)

    Xiang, Lanyi; Ying, Jun; Han, Jinhua; Zhang, Letian; Wang, Wei

    2016-04-01

    In this letter, we demonstrate a highly reliable and stable organic field-effect transistor (OFET) based nonvolatile memory (NVM) with the polymer poly(4-vinyl phenol) (PVP) as the charge trapping layer. In unipolar OFETs, the irreversible shifts of the turn-on voltage (Von) and severe degradation of the memory window (ΔVon) at programming (P) and erasing (E) voltages, respectively, block their application in NVMs. This obstacle is overcome by using a pn-heterojunction as the active layer in the OFET memory, which supplies a hole-accumulating channel at the applied P voltage and an electron-accumulating channel at the applied E voltage. Holes and electrons transferring from these channels to the PVP layer and overwriting the trapped charges of the opposite polarity result in reliable bidirectional shifts of Von at P and E voltages, respectively. The heterojunction OFET exhibits excellent nonvolatile memory characteristics, with a large ΔVon of 8.5 V, a desirable reading (R) voltage of 0 V, reliable P/R/E/R dynamic endurance over 100 cycles, and a retention time of over 10 years.

  10. Defining Requirements for Improved Photovoltaic System Reliability

    SciTech Connect

    Maish, A.B.

    1998-12-21

    Reliable systems are an essential ingredient of any technology progressing toward commercial maturity and large-scale deployment. This paper defines reliability as meeting system functional requirements, and then develops a framework to understand and quantify photovoltaic system reliability based on initial and ongoing costs and system value. The core elements necessary to achieve reliable PV systems are reviewed. These include appropriate system design, satisfactory component reliability, and proper installation and servicing. Reliability status, key issues, and present needs in system reliability are summarized for four application sectors.

  11. Electronic Versus Paper-Based Assessment of Health-Related Quality of Life Specific to HIV Disease: Reliability Study of the PROQOL-HIV Questionnaire

    PubMed Central

    Lalanne, Christophe; Goujard, Cécile; Herrmann, Susan; Cheung-Lung, Christian; Brosseau, Jean-Paul; Schwartz, Yannick; Chassany, Olivier

    2014-01-01

    Background Electronic patient-reported outcomes (PRO) provide quick and usually reliable assessments of patients’ health-related quality of life (HRQL). Objective An electronic version of the Patient-Reported Outcomes Quality of Life-human immunodeficiency virus (PROQOL-HIV) questionnaire was developed, and its face validity and reliability were assessed using standard psychometric methods. Methods A sample of 80 French outpatients (66% male, 52/79; mean age 46.7 years, SD 10.9) were recruited. Paper-based and electronic questionnaires were completed in a randomized crossover design (2-7 day interval). Biomedical data were collected. Questionnaire version and order effects were tested on full-scale scores in a 2-way ANOVA with patients as random effects. Test-retest reliability was evaluated using Pearson and intraclass correlation coefficients (ICC, with 95% confidence interval) for each dimension. Usability testing was carried out from patients’ survey reports, specifically, general satisfaction, ease of completion, quality and clarity of user interface, and motivation to participate in follow-up PROQOL-HIV electronic assessments. Results Questionnaire version and administration order effects (N=59 complete cases) were not significant at the 5% level, and no interaction was found between these 2 factors (P=.94). Reliability indexes were acceptable, with Pearson correlations greater than .7 and ICCs ranging from .708 to .939; scores were not statistically different between the two versions. A total of 63 (79%) complete patients’ survey reports were available, and 55% of patients (30/55) reported being satisfied and interested in electronic assessment of their HRQL in clinical follow-up. Individual ratings of PROQOL-HIV user interface (85%-100% of positive responses) confirmed user interface clarity and usability. Conclusions The electronic PROQOL-HIV introduces minor modifications to the original paper-based version, following International Society for

  12. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    SciTech Connect

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  13. Reliable Multihop Broadcast Protocol with a Low-Overhead Link Quality Assessment for ITS Based on VANETs in Highway Scenarios

    PubMed Central

    Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H.

    2014-01-01

    Vehicular ad hoc networks (VANETs) have been identified as a key technology for enabling intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, in order to fully exploit the potential of VANETs, several issues must be addressed. Because of the highly dynamic nature of VANETs and the impairments of the wireless channel, one key issue in working with VANETs is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between vehicles, processing this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB's performance is compared with that of one of the leading multihop broadcast protocols existing to date. Performance metrics show that our RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224

  14. A sensitive and reliable dopamine biosensor was developed based on the Au@carbon dots-chitosan composite film.

    PubMed

    Huang, Qitong; Zhang, Hanqiang; Hu, Shirong; Li, Feiming; Weng, Wen; Chen, Jianhua; Wang, Qingxiang; He, Yasan; Zhang, Wuxiang; Bao, Xiuxiu

    2014-02-15

    A novel composite film of Au@carbon dots (Au@CDs)-chitosan (CS) modified glassy carbon electrode (Au@CDs-CS/GCE) was prepared in a simple manner and applied to the sensitive and reliable determination of dopamine (DA). The CDs had negatively charged carboxyl groups, which not only gave the film good stability but also enabled interaction with the amine functional groups of DA through electrostatic attraction, allowing DA to be recognized with high specificity, while the Au nanoparticles made the electrode surface more conductive. Compared with the bare GCE, CS/GCE, and CDs-CS/GCE electrodes, the Au@CDs-CS/GCE had higher catalytic activity toward the oxidation of DA. Furthermore, the Au@CDs-CS/GCE exhibited a good ability to suppress the background current from large excesses of ascorbic acid (AA) and uric acid (UA). Under optimal conditions, selective detection of DA over a linear concentration range of 0.01-100.0 μM was obtained, with a detection limit of 0.001 μM (S/N = 3). The Au@CDs-CS/GCE was also applied to the detection of the DA content in dopamine injections with satisfactory results, and the biosensor retained its activity for at least 2 weeks. PMID:24064477

  15. Cost and Reliability Improvement for CIGS-Based PV on Flexible Substrate: May 24, 2006 -- July 31, 2010

    SciTech Connect

    Wiedeman, S.

    2011-05-01

    Global Solar Energy rapidly advances the cost and performance of commercial thin-film CIGS products using roll-to-roll processing on steel foil substrate in compact, low-cost deposition equipment with in-situ sensors for real-time intelligent process control. Substantial increases in power module efficiency, which now exceeds 13%, are evident at GSE factories in two countries with a combined capacity greater than 75 MW. During 2009 the average efficiency of cell strings (3780 cm2) was increased from 7% to over 11%, with champion results exceeding 13%. Continued testing of module reliability in rigid product has reaffirmed extended life expectancy for standard glass product, and has qualified additional lower-cost methods and materials. Expected lifetime for PV in flexible packages continues to increase as failure mechanisms are elucidated and resolved with better methods and materials. Cost reduction has been achieved through better materials utilization and enhanced vendor and material qualification and selection. The largest cost gains have come from higher cell conversion efficiency and yields, higher processing rates, greater automation, and improved control in all process steps. These improvements are integral to this thin-film PV partnership program, and all were realized within the 'Gen2' manufacturing plants, processes, and equipment.

  16. Reliable multihop broadcast protocol with a low-overhead link quality assessment for ITS based on VANETs in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H

    2014-01-01

    Vehicular ad hoc networks (VANETs) have been identified as a key technology for enabling intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, in order to fully exploit the potential of VANETs, several issues must be addressed. Because of the highly dynamic nature of VANETs and the impairments of the wireless channel, one key issue in working with VANETs is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between vehicles, processing this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB's performance is compared with that of one of the leading multihop broadcast protocols existing to date. Performance metrics show that our RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224

  17. A comprehensive multi-scenario based approach for a reliable flood-hazard assessment: a case-study application

    NASA Astrophysics Data System (ADS)

    Lanni, Cristiano; Mazzorana, Bruno; Volcan, Claudio; Bertagnolli, Rudi

    2015-04-01

    Flood hazard is generally assessed by assuming the return period of the rainfall to be a proxy for the return period of the discharge and the related hydrograph. Frequently this deterministic view is extended to the straightforward application of hydrodynamic models. However, the climate (i.e. precipitation), the catchment (i.e. geology, soil, and antecedent soil-moisture condition) and the anthropogenic (i.e. drainage system and its regulation) systems interact in a complex way, and the occurrence probability of a flood inundation event can differ significantly from the occurrence probability of the triggering event (i.e. rainfall). In order to reliably determine the spatial patterns of flood intensities and probabilities, the rigorous determination of flood event scenarios is beneficial because it provides a clear, rational method to recognize and unveil the inherent stochastic behavior of natural processes. Therefore, a multi-scenario approach for hazard assessment should be applied, considering the possible events taking place in the area potentially subject to flooding (i.e. floodplains). Here, we apply a multi-scenario approach to the assessment of flood hazard around Lake Idro (Italy). We consider and estimate the probability of occurrence of several scenarios related to the initial (i.e. initial water level in the lake) and boundary (i.e. shape of the hydrograph, downslope drainage, spillway opening operations) conditions characterizing the lake. Finally, we discuss the advantages and issues of the presented methodological procedure compared to traditional (and essentially deterministic) approaches.

  18. Reliable resonance assignments of selected residues of proteins with known structure based on empirical NMR chemical shift prediction

    NASA Astrophysics Data System (ADS)

    Li, Da-Wei; Meng, Dan; Brüschweiler, Rafael

    2015-05-01

    A robust NMR resonance assignment method is introduced for proteins whose 3D structure has previously been determined by X-ray crystallography. The goal of the method is to obtain a subset of correct assignments from a parsimonious set of 3D NMR experiments on 15N, 13C-labeled proteins. Chemical shifts of sequential residue pairs are predicted from static protein structures using PPM_One and then compared with the corresponding experimental shifts. Globally optimized weighted matching identifies the assignments that are robust with respect to small changes in NMR cross-peak positions. The method, termed PASSPORT, is demonstrated for 4 proteins with 100-250 amino acids using 3D HNCA and 3D CBCA(CO)NH experiments as input, producing correct assignments with high reliability for 22% of the residues. The method, which works best for Gly, Ala, Ser, and Thr residues, provides assignments that serve as anchor points for additional assignments by manual and semi-automated methods, or they can be used directly for further studies, e.g., of ligand binding, protein dynamics, or post-translational modifications such as phosphorylation.
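
    Globally optimized weighted matching of predicted to experimental shifts can be illustrated with the Hungarian algorithm; the sketch below matches hypothetical one-dimensional CA shifts, whereas the actual method scores sequential residue pairs across multiple shift types.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Hypothetical predicted CA shifts (from the structure) and observed,
    # unassigned cross-peak positions, both in ppm.
    predicted = np.array([52.1, 58.4, 63.0, 45.2])
    observed = np.array([58.6, 52.0, 45.5, 62.7])

    # Cost = absolute disagreement; the matching minimizes the global sum.
    cost = np.abs(predicted[:, None] - observed[None, :])
    rows, cols = linear_sum_assignment(cost)
    for r, c in zip(rows, cols):
        print(f"residue {r} -> peak {c} (delta = {cost[r, c]:.2f} ppm)")
    # Robustness in the spirit of PASSPORT: re-run with jittered peak
    # positions and keep only the assignments that never change.
    ```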

  19. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  20. Reliable and integrated technique for determining resonant frequency in radio frequency resonators. Application to a high-precision resonant cavity-based displacement sensor

    NASA Astrophysics Data System (ADS)

    Jauregui, Rigoberto; Asua, Estibaliz; Portilla, Joaquin; Etxebarria, Victor

    2015-03-01

    This paper presents a reliable and integrated technique for determining the resonant frequency of radio frequency resonators, which can be of interest for a variety of purposes. The approach uses a heterodyne scheme as a phase detector coupled to a voltage-controlled oscillator. The system seeks the oscillator frequency that produces a phase null in the resonator, which corresponds to the resonant frequency. A complete explanation of the technique for determining the resonant frequency is presented and experimentally tested. The method has been applied to a high-precision displacement sensor based on a resonant cavity, achieving theoretical nanometric precision.
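
    The phase-null principle can be emulated in software with a simple resonator model: the phase of the response changes sign at resonance, so a sign-based search converges to the resonant frequency. All values below are hypothetical.

    ```python
    import math

    F0, Q = 2.45e9, 500.0  # hypothetical resonant frequency (Hz) and Q

    def phase(f):
        # Phase of a simple resonator response 1 / (1 + j*x), where the
        # normalized detuning x = Q * (f/F0 - F0/f) vanishes at resonance.
        x = Q * (f / F0 - F0 / f)
        return math.atan2(-x, 1.0)

    def find_resonance(f_lo, f_hi, tol=1.0):
        # Bisect on the sign of the measured phase, mimicking how the
        # VCO-driven loop adjusts frequency until the phase nulls.
        while f_hi - f_lo > tol:
            f_mid = 0.5 * (f_lo + f_hi)
            if phase(f_mid) > 0:   # below resonance: raise the lower bound
                f_lo = f_mid
            else:
                f_hi = f_mid
        return 0.5 * (f_lo + f_hi)

    print(find_resonance(2.40e9, 2.50e9))  # converges to ~2.45e9
    ```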

  1. Using a Web-Based Approach to Assess Test-Retest Reliability of the "Hypertension Self-Care Profile" Tool in an Asian Population: A Validation Study.

    PubMed

    Koh, Yi Ling Eileen; Lua, Yi Hui Adela; Hong, Liyue; Bong, Huey Shin Shirley; Yeo, Ling Sui Jocelyn; Tsang, Li Ping Marianne; Ong, Kai Zhi; Wong, Sook Wai Samantha; Tan, Ngiap Chuan

    2016-03-01

    Essential hypertension often requires affected patients to self-manage their condition most of the time. Besides seeking regular medical review of their life-long condition to detect vascular complications, patients have to maintain healthy lifestyles between physician consultations via diet and physical activity, and to take their medications according to their prescriptions. Their self-management ability is influenced by their self-efficacy capacity, which can be assessed using questionnaire-based tools. The "Hypertension Self-Care Profile" (HTN-SCP) is 1 such questionnaire, assessing self-efficacy in the domains of "behavior," "motivation," and "self-efficacy." This study aims to determine the test-retest reliability of the HTN-SCP in an English-literate Asian population using a web-based approach. Multiethnic Asian patients, aged 40 years and older, with essential hypertension were recruited from a typical public primary care clinic in Singapore. The investigators guided the patients in filling in the web-based 60-item HTN-SCP in English using a tablet or smartphone on the first visit, and the patients completed the instrument again 2 weeks later as the retest. Internal consistency and test-retest reliability were evaluated using Cronbach's Alpha and intraclass correlation coefficients (ICC), respectively. The t test was used to determine the relationship between the overall HTN-SCP scores of the patients and their self-reported self-management activities. A total of 160 patients completed the HTN-SCP during the initial test, from which 71 test-retest responses were completed. No floor or ceiling effect was found for the scores for the 3 subscales. Cronbach's Alpha coefficients were 0.857, 0.948, and 0.931 for the "behavior," "motivation," and "self-efficacy" domains respectively, indicating high internal consistency. The item-total correlation ranges for the 3 scales were 0.105 to 0.656 for Behavior, 0.401 to 0.808 for Motivation, and 0.349 to 0.789 for Self-efficacy. The corresponding

  2. Reliability of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1986-01-01

    In order to assess the reliability of photovoltaic modules, four categories of known array failure and degradation mechanisms are discussed, and target reliability allocations have been developed within each category based on the available technology and the life-cycle-cost requirements of future large-scale terrestrial applications. Cell-level failure mechanisms associated with open-circuiting or short-circuiting of individual solar cells generally arise from cell cracking or the fatigue of cell-to-cell interconnects. Power degradation mechanisms considered include gradual power loss in cells, light-induced effects, and module optical degradation. Module-level failure mechanisms and life-limiting wear-out mechanisms are also explored.

  3. Dual-Fuel Combustion Turbine Provides Reliable Power to U.S. Navy Submarine Base New London in Groton, Connecticut

    SciTech Connect

Halverson, Mark A.

    2002-01-01

    In keeping with a long-standing tradition of running Base utilities as a business, the U.S. Navy Submarine Base New London installed a dual-fuel combustion turbine with a heat recovery boiler. The 5-megawatt (MW) gas- and oil-fired combustion turbine sits within the Lower Base area, just off the shores of the Thames River. The U.S. Navy owns, operates, and maintains the combined heat and power (CHP) plant, which provides power to the Navy's nuclear submarines when they are in port and to the Navy's training facilities at the Submarine Base. Heat recovered from the turbine is used to produce steam for use in Base housing, medical facilities, and laundries. In FY00, the Navy estimates that it will save over $500,000 per year as a result of the combined heat and power unit.

  4. A multitracer test proving the reliability of Rayleigh equation-based approach for assessing biodegradation in a BTEX contaminated aquifer.

    PubMed

    Fischer, Anko; Bauer, Jana; Meckenstock, Rainer U; Stichler, Willibald; Griebler, Christian; Maloszewski, Piotr; Kästner, Matthias; Richnow, Hans H

    2006-07-01

    Compound-specific stable isotope analysis (CSIA) is one of the most important methods for assessing biodegradation activities in contaminated aquifers. Although the concept is straightforward, proof that the method can be used not only for qualitative analysis but also to quantify biodegradation in the subsurface was missing. We therefore performed a multitracer test in the field with ring-deuterated (d5) and completely (d8) deuterium-labeled toluene isotopologues (400 g) as reactive tracers as well as bromide as a conservative tracer. The compounds were injected into the anoxic zone of a BTEX plume located down-gradient of the contaminant source. Over a period of 4.5 months the tracer concentrations were analyzed at two control planes located 24 and 35 m downgradient of the injection well. Deuterium-labeled benzylsuccinate was found in the aquifer, indicating the anaerobic biodegradation of deuterated toluene via the benzylsuccinate synthase pathway. Three independent methods were applied to quantify biodegradation of deuterated toluene. First, fractionation of toluene-d8 and toluene-d5 using the Rayleigh equation and an appropriate laboratory-derived isotope fractionation factor was used for the calculation of the microbial decomposition of deuterated toluene isotopologues (CSIA-method). Second, the biodegradation was quantified by the changes of the concentrations of deuterated toluene relative to bromide. Both methods gave similar results, implying that the CSIA-method is a reliable tool to quantify biodegradation in contaminated aquifers. The results of both methods yielded a biodegradation of deuterated toluene isotopologues of approximately 23-29% for the first and 44-51% for the second control plane. Third, the mineralization of deuterated toluene isotopologues was verified by determination of the enrichment of deuterium in the groundwater. This method indicated that parts of deuterium were assimilated into the biomass of toluene degrading
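
    A minimal sketch of the Rayleigh-based quantification step (the "CSIA-method"): with a laboratory-derived enrichment factor, the isotope-ratio shift gives the remaining substrate fraction f and hence the biodegraded percentage B. This uses the common enrichment-factor form of the Rayleigh equation; the ratio values and epsilon below are illustrative, not the field data from this study.

    ```python
    # Rayleigh model: ln(R_t/R_0) = (eps/1000) * ln f, so f = (R_t/R_0)**(1000/eps).
    # eps (permil) is negative for a normal isotope effect, so enrichment => f < 1.
    def biodegradation_percent(R_t, R_0, eps_permil):
        f = (R_t / R_0) ** (1000.0 / eps_permil)   # remaining fraction of substrate
        return (1.0 - f) * 100.0                   # biodegraded percentage B

    # e.g. a 2-permil enrichment with an enrichment factor of -1.5 permil (assumed)
    print(f"B = {biodegradation_percent(R_t=1.002, R_0=1.000, eps_permil=-1.5):.1f} %")
    ```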

  5. Assessing communication quality of consultations in primary care: initial reliability of the Global Consultation Rating Scale, based on the Calgary-Cambridge Guide to the Medical Interview

    PubMed Central

    Burt, Jenni; Abel, Gary; Elmore, Natasha; Campbell, John; Roland, Martin; Benson, John; Silverman, Jonathan

    2014-01-01

    Objectives To investigate initial reliability of the Global Consultation Rating Scale (GCRS: an instrument to assess the effectiveness of communication across an entire doctor–patient consultation, based on the Calgary-Cambridge guide to the medical interview), in simulated patient consultations. Design Multiple ratings of simulated general practitioner (GP)–patient consultations by trained GP evaluators. Setting UK primary care. Participants 21 GPs and six trained GP evaluators. Outcome measures GCRS score. Methods 6 GP raters used GCRS to rate randomly assigned video recordings of GP consultations with simulated patients. Each of the 42 consultations was rated separately by four raters. We considered whether a fixed difference between scores had the same meaning at all levels of performance. We then examined the reliability of GCRS using mixed linear regression models. We augmented our regression model to also examine whether there were systematic biases between the scores given by different raters and to look for possible order effects. Results Assessing the communication quality of individual consultations, GCRS achieved a reliability of 0.73 (95% CI 0.44 to 0.79) for two raters, 0.80 (0.54 to 0.85) for three and 0.85 (0.61 to 0.88) for four. We found an average difference of 1.65 (on a 0–10 scale) in the scores given by the least and most generous raters: adjusting for this evaluator bias increased reliability to 0.78 (0.53 to 0.83) for two raters; 0.85 (0.63 to 0.88) for three and 0.88 (0.69 to 0.91) for four. There were considerable order effects, with later consultations (after 15–20 ratings) receiving, on average, scores more than one point higher on a 0–10 scale. Conclusions GCRS shows good reliability with three raters assessing each consultation. We are currently developing the scale further by assessing a large sample of real-world consultations. PMID:24604483

  6. How to work towards more reliable residual water estimations? State of the art in Switzerland and ideas for using physical based approaches

    NASA Astrophysics Data System (ADS)

    Floriancic, Marius; Margreth, Michael; Naef, Felix

    2016-04-01

    Reliable low flow estimations are important for many ecological and economic reasons. In Switzerland the basis for defining residual flow is Q347 (Q95), the discharge exceeded 347 days per year. To improve estimations, we need further knowledge of the dominant processes of storage and drainage during low flow periods. We present new approaches to define Q347 based on physical properties of the contributing slopes and catchment parts. We used dominant runoff process maps, representing the storage and drainage capacity of soils, to predict discharge during dry periods. We found that recession depends on these processes, but that during low flow periods and times of water scarcity different mechanisms sustain discharge and streamflow. During an extended field campaign in the dry summer of 2015, we surveyed the drainage behavior of different landscape elements in the Swiss midlands and found major differences in their contribution to discharge. The contributing storages have small volumes but long residence times, mainly influenced by pore volume distribution and flow paths in fractured rocks and bedrock. We found that steep areas formed of sandstones are more likely to support higher flows than flat alluvial valleys or areas with thick moraine deposits, where infiltration takes place more frequently. The gathered knowledge helps in assessing catchment-scale low flow issues and supports more reliable estimations of water availability during dry periods. Furthermore, the presented approach may help detect areas that are more or less vulnerable to extended dry periods, an important ecological and economic concern, especially under changing climatic conditions.
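
    The Q347 definition above translates directly into a flow-duration-curve computation: the discharge exceeded on 347 of 365 days is the 5% point of the daily-flow distribution. A minimal sketch on a synthetic daily series standing in for a gauged record:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    daily_q = rng.lognormal(mean=1.0, sigma=0.8, size=365 * 10)  # 10 years of daily flows, m^3/s

    # Exceeded 347/365 of the time  <=>  roughly the 5th percentile of daily flows.
    q347 = np.percentile(daily_q, 100 * (1 - 347 / 365))
    print(f"Q347 = {q347:.2f} m^3/s (low-flow end of the flow-duration curve)")
    ```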

  7. Inter-operator Reliability of Magnetic Resonance Image-Based Computational Fluid Dynamics Prediction of Cerebrospinal Fluid Motion in the Cervical Spine.

    PubMed

    Martin, Bryn A; Yiallourou, Theresia I; Pahlavian, Soroush Heidari; Thyagaraj, Suraj; Bunck, Alexander C; Loth, Francis; Sheffer, Daniel B; Kröger, Jan Robert; Stergiopulos, Nikolaos

    2016-05-01

    For the first time, inter-operator dependence of MRI based computational fluid dynamics (CFD) modeling of cerebrospinal fluid (CSF) in the cervical spinal subarachnoid space (SSS) is evaluated. In vivo MRI flow measurements and anatomy MRI images were obtained at the cervico-medullary junction of a healthy subject and a Chiari I malformation patient. 3D anatomies of the SSS were reconstructed by manual segmentation by four independent operators for both cases. CFD results were compared at nine axial locations along the SSS in terms of hydrodynamic and geometric parameters. Intraclass correlation (ICC) assessed the inter-operator agreement for each parameter over the axial locations and coefficient of variance (CV) compared the percentage of variance for each parameter between the operators. Greater operator dependence was found for the patient (0.19 < ICC < 0.99) near the craniovertebral junction compared to the healthy subject (ICC > 0.78). For the healthy subject, hydraulic diameter and Womersley number had the least variance (CV = ~2%). For the patient, peak diastolic velocity and Reynolds number had the smallest variance (CV = ~3%). These results show a high degree of inter-operator reliability for MRI-based CFD simulations of CSF flow in the cervical spine for healthy subjects and a lower degree of reliability for patients with Type I Chiari malformation. PMID:26446009

  8. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joining components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.
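
    A minimal sketch of probabilistic fastener assessment in the spirit of the paper: sample the uncertain design variables, evaluate a joint limit state, and estimate reliability plus a crude sensitivity ranking. The distributions and the toy limit-state function are assumptions, not the paper's models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 200_000

    preload  = rng.normal(20.0, 2.0, N)                 # kN, bolt preload
    strength = rng.normal(35.0, 3.0, N)                 # kN, joint capacity
    ext_load = rng.lognormal(np.log(10.0), 0.25, N)     # kN, mission load

    margin = strength - (ext_load + 0.2 * preload)      # toy limit-state function
    print(f"estimated reliability: {np.mean(margin > 0):.5f}")

    # Crude sensitivity: correlation of each input with the limit-state margin.
    for name, x in [("preload", preload), ("strength", strength), ("ext_load", ext_load)]:
        print(f"  sensitivity to {name:8s}: {np.corrcoef(x, margin)[0, 1]:+.2f}")
    ```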

  9. Alternative Methods to Curriculum-Based Measurement for Written Expression: Implications for Reliability and Validity of the Scores

    ERIC Educational Resources Information Center

    Merrigan, Teresa E.

    2012-01-01

    The purpose of the current study was to evaluate the psychometric properties of alternative approaches to administering and scoring curriculum-based measurement for written expression. Specifically, three response durations (3, 5, and 7 minutes) and six score types (total words written, words spelled correctly, percent of words spelled correctly,…

  10. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  11. Reliability of Predicting Early Hospital Readmission After Discharge for an Acute Coronary Syndrome Using Claims-Based Data.

    PubMed

    McManus, David D; Saczynski, Jane S; Lessard, Darleen; Waring, Molly E; Allison, Jeroan; Parish, David C; Goldberg, Robert J; Ash, Arlene; Kiefe, Catarina I

    2016-02-15

    Early rehospitalization after discharge for an acute coronary syndrome, including acute myocardial infarction (AMI), is generally considered undesirable. The Centers for Medicare and Medicaid Services (CMS) base hospital financial incentives on risk-adjusted readmission rates after AMI, using claims data in its adjustment models. Little is known about the contribution to readmission risk of factors not captured by claims. For 804 consecutive patients >65 years discharged in 2011 to 2013 from 6 hospitals in Massachusetts and Georgia after an acute coronary syndrome, we compared a CMS-like readmission prediction model with an enhanced model incorporating additional clinical, psychosocial, and sociodemographic characteristics, after principal components analysis. Mean age was 73 years, 38% were women, 25% college educated, and 32% had a previous AMI; all-cause rehospitalization occurred within 30 days for 13%. In the enhanced model, previous coronary intervention (odds ratio [OR] = 2.05, 95% confidence interval [CI] 1.34 to 3.16; chronic kidney disease OR 1.89, 95% CI 1.15 to 3.10; low health literacy OR 1.75, 95% CI 1.14 to 2.69), lower serum sodium levels, and current nonsmoker status were positively associated with readmission. The discriminative ability of the enhanced versus the claims-based model was higher without evidence of overfitting. For example, for patients in the highest deciles of readmission likelihood, observed readmissions occurred in 24% for the claims-based model and 33% for the enhanced model. In conclusion, readmission may be influenced by measurable factors not in CMS' claims-based models and not controllable by hospitals. Incorporating additional factors into risk-adjusted readmission models may improve their accuracy and validity for use as indicators of hospital quality. PMID:26718235

  12. A reliable user authentication and key agreement scheme for Web-based Hospital-acquired Infection Surveillance Information System.

    PubMed

    Wu, Zhen-Yu; Tseng, Yi-Ju; Chung, Yufang; Chen, Yee-Chun; Lai, Feipei

    2012-08-01

    With the rapid development of the Internet, both digitization and electronic orientation are required of various applications in daily life. For hospital-acquired infection control, a Web-based Hospital-acquired Infection Surveillance System was implemented. Clinical data from different hospitals and systems were collected and analyzed. The hospital-acquired infection screening rules in this system utilized this information to detect different patterns of defined hospital-acquired infection. Moreover, these data were integrated into the user interface of a single entry point to assist physicians and healthcare providers in making decisions. Based on Service-Oriented Architecture, web-service techniques, which are suitable for integrating heterogeneous platforms, protocols, and applications, were used. In summary, this system simplifies the workflow of hospital infection control and improves healthcare quality. However, it is possible for attackers to intercept the process of data transmission or to gain access to the user interface. To tackle illegal access and to prevent the information from being stolen during transmission over the insecure Internet, a password-based user authentication scheme is proposed for information integrity. PMID:21556897
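
    The abstract does not spell out the proposed scheme, so the following is only a generic sketch of the password-verification building block such systems rest on: salted key-derivation on registration and constant-time comparison on login. It is not the authors' key-agreement protocol.

    ```python
    import hashlib, hmac, os

    def register(password: str):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest                    # store per user; never store the password

    def verify(password: str, salt: bytes, stored: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, stored)   # constant-time comparison

    salt, stored = register("correct horse battery staple")
    print(verify("correct horse battery staple", salt, stored))   # True
    print(verify("guess", salt, stored))                          # False
    ```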

  13. A network-based integrative approach to prioritize reliable hits from multiple genome-wide RNAi screens in Drosophila

    PubMed Central

    Wang, Li; Tu, Zhidong; Sun, Fengzhu

    2009-01-01

    Background The recently developed RNA interference (RNAi) technology has created an unprecedented opportunity which allows the function of individual genes in whole organisms or cell lines to be interrogated at genome-wide scale. However, multiple issues, such as off-target effects or low efficacies in knocking down certain genes, have produced RNAi screening results that are often noisy and that potentially yield both high rates of false positives and false negatives. Therefore, integrating RNAi screening results with other information, such as protein-protein interaction (PPI), may help to address these issues. Results By analyzing 24 genome-wide RNAi screens interrogating various biological processes in Drosophila, we found that RNAi positive hits were significantly more connected to each other when analyzed within a protein-protein interaction network, as opposed to random cases, for nearly all screens. Based on this finding, we developed a network-based approach to identify false positives (FPs) and false negatives (FNs) in these screening results. This approach relied on a scoring function, which we termed NePhe, to integrate information obtained from both PPI network and RNAi screening results. Using a novel rank-based test, we compared the performance of different NePhe scoring functions and found that diffusion kernel-based methods generally outperformed others, such as direct neighbor-based methods. Using two genome-wide RNAi screens as examples, we validated our approach extensively from multiple aspects. We prioritized hits in the original screens that were more likely to be reproduced by the validation screen and recovered potential FNs whose involvements in the biological process were suggested by previous knowledge and mutant phenotypes. Finally, we demonstrated that the NePhe scoring system helped to biologically interpret RNAi results at the module level. Conclusion By comprehensively analyzing multiple genome-wide RNAi screens, we conclude that
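
    A minimal sketch of diffusion-kernel scoring in the spirit of the NePhe functions described above: binary hit labels are smoothed over a protein-protein interaction network with the kernel K = expm(-beta * L), where L is the graph Laplacian. The toy adjacency matrix, bandwidth beta, and hit labels are assumptions; the actual NePhe scoring functions are more elaborate.

    ```python
    import numpy as np
    from scipy.linalg import expm

    A = np.array([[0, 1, 1, 0, 0],    # toy PPI adjacency (5 proteins)
                  [1, 0, 1, 0, 0],
                  [1, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A    # graph Laplacian
    K = expm(-0.5 * L)                # diffusion kernel, beta = 0.5 (assumed)

    hits = np.array([1, 1, 0, 0, 0], dtype=float)   # RNAi screen hit labels
    scores = K @ hits                 # network-smoothed evidence per protein
    print(np.round(scores, 3))        # protein 2 gains support from its hit neighbors
    ```

    A non-hit that sits among many hits scores high (a candidate false negative), while an isolated hit scores low (a candidate false positive), which is the prioritization idea the paper validates.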

  14. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as KR-20 coefficients and…

  15. HELIOS Critical Design Review: Reliability

    NASA Technical Reports Server (NTRS)

    Benoehr, H. C.; Herholz, J.; Prem, H.; Mann, D.; Reichert, L.; Rupp, W.; Campbell, D.; Boettger, H.; Zerwes, G.; Kurvin, C.

    1972-01-01

    This paper presents the Helios Critical Design Review on Reliability from October 16-20, 1972. The topics include: 1) Reliability Requirement; 2) Reliability Apportionment; 3) Failure Rates; 4) Reliability Assessment; 5) Reliability Block Diagram; and 6) Reliability Information Sheet.

  16. Development of Stronger and More Reliable Cast Austenitic Stainless Steels (H-Series) Based on Scientific Design Methodology

    SciTech Connect

    Muralidharan, G.; Sikka, V.K.; Pankiw, R.I.

    2006-04-15

    The goal of this program was to increase the high-temperature strength of the H-Series of cast austenitic stainless steels by 50% and upper use temperature by 86 to 140 F (30 to 60 C). Meeting this goal is expected to result in energy savings of 38 trillion Btu/year by 2020 and energy cost savings of $185 million/year. The higher strength H-Series of cast stainless steels (HK and HP type) have applications for the production of ethylene in the chemical industry, for radiant burner tubes and transfer rolls for secondary processing of steel in the steel industry, and for many applications in the heat-treating industry. The project was led by Duraloy Technologies, Inc. with research participation by the Oak Ridge National Laboratory (ORNL) and industrial participation by a diverse group of companies. Energy Industries of Ohio (EIO) was also a partner in this project. Each team partner had well-defined roles. Duraloy Technologies led the team by identifying the base alloys that were to be improved from this research. Duraloy Technologies also provided an extensive creep data base on current alloys, provided creep-tested specimens of certain commercial alloys, and carried out centrifugal casting and component fabrication of newly designed alloys. Nucor Steel was the first partner company that installed the radiant burner tube assembly in their heat-treating furnace. Other steel companies participated in project review meetings and are currently working with Duraloy Technologies to obtain components of the new alloys. EIO is promoting the enhanced performance of the newly designed alloys to Ohio-based companies. The Timken Company is one of the Ohio companies being promoted by EIO. The project management and coordination plan is shown in Fig. 1.1. A related project at University of Texas-Arlington (UT-A) is described in Development of Semi-Stochastic Algorithm for Optimizing Alloy Composition of High-Temperature Austenitic Stainless Steels (H-Series) for Desired

  17. Modelling a reliable wind/PV/storage power system for remote radio base station sites without utility power

    NASA Astrophysics Data System (ADS)

    Bitterlin, Ian F.

    The development of photovoltaic (PV) cells has made steady progress from the early days, when only the USA space program could afford to deploy them, to now, seeing them applied to roadside applications even in our Northern European climes. The manufacturing cost per watt has fallen and the daylight-to-power conversion efficiency increased. At the same time, the perception that the sun has to be directly shining on it for a PV array to work has faded. On some of those roadside applications, particularly for remote emergency telephones or for temporary roadwork signage where a utility electrical power connection is not practical, the keen observer will spot, usually in addition to a PV array, a small wind turbine and an electrical cabinet quite obviously (by virtue of its volume) containing a storage battery. In the UK, we have the lion's share (>40%) of Europe's entire wind power resource although, despite press coverage of the "anti-wind" lobby to the contrary, we have hardly started to harvest this clean and free energy source. Taking this (established and proven) roadside solution one step further, we will consider higher power applications. A cellular phone system is one where a multitude of remote radio base stations (RBS) are required to provide geographical coverage. With networks developing into the so-called "3G" technologies, the need for base stations has tripled, as each 3G cell covers only 1/3 the geographical area of its "2G" counterpart. To cover >90% of the UK's topology (>97% population coverage) with 3G cellular technology will require in excess of 12,000 radio base stations per operator network. In 2001, there were around 25,000 established sites and, with an anticipated degree of collocation by necessity, that figure is forecast to rise to >47,000. Of course, the vast majority of these sites have a convenient grid connection. However, it is easy to see that the combination of wind and PV power generation and an energy storage system may be an
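
    A minimal sketch of the sizing question behind such a system: an hourly energy balance of wind + PV + battery against a constant radio-base-station load. All profiles, capacities, and the 1.5 kW load are assumed toy values, not the paper's site data.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    hours = 24 * 365
    load = np.full(hours, 1.5)                                  # kW, constant RBS demand
    pv = np.sin(np.arange(hours) % 24 / 24 * np.pi) * 2.0       # kW, day-shaped PV output
    wind = rng.weibull(2.0, hours) * 1.2                        # kW, gusty wind resource

    cap, soc, unmet = 30.0, 15.0, 0.0                           # battery kWh, state, deficit
    for h in range(hours):
        soc += pv[h] + wind[h] - load[h]                        # net energy this hour
        soc = min(soc, cap)                                     # battery full: spill excess
        if soc < 0:
            unmet += -soc                                       # demand not covered
            soc = 0.0

    print(f"unmet load: {unmet:.0f} kWh/yr of {load.sum():.0f} kWh/yr demand")
    ```

    Sweeping the battery capacity (and the PV/wind scale factors) in such a loop is the usual way to trade hardware cost against the acceptable hours of outage for an off-grid site.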

  18. Flow cytometric readout based on Mitotracker Red CMXRos staining of live asexual blood stage malarial parasites reliably assesses antibody dependent cellular inhibition

    PubMed Central

    2012-01-01

    with determination of parasite counts through CMXRos staining and flow cytometry. Conclusions A flow cytometry method based on CMXRos staining for detection of live parasite populations has been optimized. This is a rapid and sensitive method with high inter-assay reproducibility which can reliably determine the anti-parasite effect mediated by antibodies in functional in vitro assays such as ADCI assay. PMID:22818754

  19. Population-based study of intra-household gender differences in water insecurity: reliability and validity of a survey instrument for use in rural Uganda.

    PubMed

    Tsai, Alexander C; Kakuhikire, Bernard; Mushavi, Rumbidzai; Vořechovská, Dagmar; Perkins, Jessica M; McDonough, Amy Q; Bangsberg, David R

    2016-04-01

    Hundreds of millions of people worldwide lack adequate access to water. Water insecurity, which is defined as having limited or uncertain availability of safe water or the ability to acquire safe water in socially acceptable ways, is typically overlooked by development organizations focusing on water availability. To address the urgent need in the literature for validated measures of water insecurity, we conducted a population-based study in rural Uganda with 327 reproductive-age women and 204 linked men from the same households. We used a novel method of photo identification so that we could accurately elicit study participants' primary household water sources, thereby enabling us to identify water sources for objective water quality testing and distance/elevation measurement. Our psychometric analyses provided strong evidence of the internal structure, reliability, and validity of a new eight-item Household Water Insecurity Access Scale (HWIAS). Important intra-household gender differences in perceptions of water insecurity were observed, with men generally perceiving household water insecurity as being less severe compared to women. In summary, the HWIAS represents a reliable and valid measure of water insecurity, particularly among women, and may be useful for informing and evaluating interventions to improve water access in resource-limited settings. PMID:27105413

  20. Extending the use of the Web-based HIV Testing Belief Inventory to students attending historically Black colleges and universities: an examination of reliability and validity.

    PubMed

    Hou, Su-I

    2009-02-01

    This study sought to extend the use of a Web-based HIV Testing Belief Inventory (wHITBI), developed and validated in a majority White university in the southeastern United States, to students attending historically Black colleges and universities (HBCUs). The 19-item wHITBI was reviewed by experts to qualitatively assess its construct validity, clarity, relevancy, and comprehensiveness for HBCU students. Participants were recruited from 15 HBCUs (valid N = 372). Mean age was 20.5 years (SD = 2.4), 80% were females, 92% were heterosexual-oriented, and 58% had prior HIV test(s). Reliability coefficients revealed satisfactory internal consistencies (Cronbach's alphas: .58 to .85). Confirmatory factor analysis showed that items loaded consistently with the four constructs: perceived benefits, concerns of HIV risk, stigma, and testing availability/accessibility. Data indicated good model fits (RMSEA = .06; CFI = .93; IFI = .93; RMS = .07), with all items loaded significantly. Findings showed that the psychometrics of the wHITBI appear to maintain their integrity in this sample, with satisfactory reliability coefficients and validities. PMID:19243233

  1. Custom oligonucleotide array-based CGH: a reliable diagnostic tool for detection of exonic copy-number changes in multiple targeted genes

    PubMed Central

    Vasson, Aurélie; Leroux, Céline; Orhant, Lucie; Boimard, Mathieu; Toussaint, Aurélie; Leroy, Chrystel; Commere, Virginie; Ghiotti, Tiffany; Deburgrave, Nathalie; Saillour, Yoann; Atlan, Isabelle; Fouveaut, Corinne; Beldjord, Cherif; Valleix, Sophie; Leturcq, France; Dodé, Catherine; Bienvenu, Thierry; Chelly, Jamel; Cossée, Mireille

    2013-01-01

    The frequency of disease-related large rearrangements (referred to as copy-number mutations, CNMs) varies among genes, and the search for these mutations has an important place in diagnostic strategies. In recent years, the CGH method using custom-designed high-density oligonucleotide-based arrays allowed the development of a powerful tool for detection of alterations at the level of exons and made it possible to provide flexibility through the possibility of modeling chips. The aim of our study was to test a custom-designed oligonucleotide CGH array in a diagnostic laboratory setting that analyses several genes involved in various genetic diseases, and to compare it with conventional strategies. To this end, we designed a 12-plex CGH array (135k; 135 000 probes/subarray) (Roche Nimblegen) with exonic and intronic oligonucleotide probes covering 26 genes routinely analyzed in the laboratory. We tested control samples with known CNMs and patients for whom the genetic causes underlying their disorders were unknown. The contribution of this technique is undeniable. Indeed, it appeared reproducible, reliable and sensitive enough to detect heterozygous single-exon deletions or duplications, complex rearrangements and somatic mosaicism. In addition, it improves the reliability of CNM detection and allows determination of boundaries precisely enough to direct targeted sequencing of breakpoints. All of these points, together with the possibility of simultaneous analysis of several genes and 'homemade' scalability, make it a valuable tool as a new diagnostic approach for CNMs. PMID:23340513

  2. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well known and widely used experimental reliability "passport" of a mass manufactured electronic or a photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. When time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increased failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the carried-out numerical example, the Rayleigh distribution for the statistical failure rate was used, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations, when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. The future work should include investigations on how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
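
    A minimal numerical sketch of the deduction step described above: subtract an assumed Rayleigh statistical failure rate, whose hazard is h(t) = t / sigma^2, from the observed bathtub-curve ordinates to estimate the physics-of-failure (aging) contribution. Both the toy bathtub curve and sigma are illustrative, not the paper's numerical example.

    ```python
    import numpy as np

    t = np.linspace(1.0, 10.0, 10)           # years in service (post burn-in)
    bathtub = 0.02 + 0.004 * t**2            # observed total failure rate (toy data)

    sigma = 20.0                             # Rayleigh mode parameter (assumed)
    statistical = t / sigma**2               # Rayleigh hazard rate h(t) = t / sigma^2

    physics = bathtub - statistical          # inferred aging-related failure rate
    for ti, bi, pi in zip(t, bathtub, physics):
        print(f"t = {ti:5.1f}  total = {bi:.4f}  physics-of-failure = {pi:.4f}")
    ```

    The subtraction is only meaningful under the stated independence assumption; if the statistical and aging processes interact, the residual can no longer be read as a pure physics-of-failure rate.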

  3. RNABindRPlus: a predictor that combines machine learning and sequence homology-based methods to improve the reliability of predicted RNA-binding residues in proteins.

    PubMed

    Walia, Rasna R; Xue, Li C; Wilkins, Katherine; El-Manzalawy, Yasser; Dobbs, Drena; Honavar, Vasant

    2014-01-01

    Protein-RNA interactions are central to essential cellular processes such as protein synthesis and regulation of gene expression and play roles in human infectious and genetic diseases. Reliable identification of protein-RNA interfaces is critical for understanding the structural bases and functional implications of such interactions and for developing effective approaches to rational drug design. Sequence-based computational methods offer a viable, cost-effective way to identify putative RNA-binding residues in RNA-binding proteins. Here we report two novel approaches: (i) HomPRIP, a sequence homology-based method for predicting RNA-binding sites in proteins; (ii) RNABindRPlus, a new method that combines predictions from HomPRIP with those from an optimized Support Vector Machine (SVM) classifier trained on a benchmark dataset of 198 RNA-binding proteins. Although highly reliable, HomPRIP cannot make predictions for the unaligned parts of query proteins and its coverage is limited by the availability of close sequence homologs of the query protein with experimentally determined RNA-binding sites. RNABindRPlus overcomes these limitations. We compared the performance of HomPRIP and RNABindRPlus with that of several state-of-the-art predictors on two test sets, RB44 and RB111. On a subset of proteins for which homologs with experimentally determined interfaces could be reliably identified, HomPRIP outperformed all other methods achieving an MCC of 0.63 on RB44 and 0.83 on RB111. RNABindRPlus was able to predict RNA-binding residues of all proteins in both test sets, achieving an MCC of 0.55 and 0.37, respectively, and outperforming all other methods, including those that make use of structure-derived features of proteins. More importantly, RNABindRPlus outperforms all other methods for any choice of tradeoff between precision and recall. An important advantage of both HomPRIP and RNABindRPlus is that they rely on readily available sequence and sequence

  4. EGNOS-Based Multi-Sensor Accurate and Reliable Navigation in Search-And-Rescue Missions with UAVs

    NASA Astrophysics Data System (ADS)

    Molina, P.; Colomina, I.; Vitoria, T.; Silva, P. F.; Stebler, Y.; Skaloud, J.; Kornus, W.; Prades, R.

    2011-09-01

    This paper will introduce and describe the goals, concept and overall approach of the European 7th Framework Programme's project named CLOSE-SEARCH, which stands for 'Accurate and safe EGNOS-SoL Navigation for UAV-based low-cost SAR operations'. The goal of CLOSE-SEARCH is to integrate, in a helicopter-type unmanned aerial vehicle, a thermal imaging sensor and a multi-sensor navigation system (based on the use of a Barometric Altimeter (BA), a Magnetometer (MAGN), a Redundant Inertial Navigation System (RINS) and an EGNOS-enabled GNSS receiver) with an Autonomous Integrity Monitoring (AIM) capability, to support the search component of Search-And-Rescue operations in remote, difficult-to-access areas and/or in time-critical situations. The proposed integration will result in a hardware and software prototype that will demonstrate an end-to-end functionality, that is, to fly in patterns over a region of interest (possibly inaccessible) during day or night, also under adverse weather conditions, and to locate disaster survivors or lost people there through the detection of body heat. This paper will identify the technical challenges of the proposed approach, from navigating with a BA/MAGN/RINS/GNSS-EGNOS-based integrated system to the interpretation of thermal images for person identification. Moreover, the AIM approach will be described together with the proposed integrity requirements. Finally, this paper will show some results obtained in the project during the first test campaign performed in November 2010. On that day, a prototype was flown in three different missions to assess its high-level performance and to observe some fundamental mission parameters, such as the optimal flying height and flying speed to enable body recognition. The second test campaign is scheduled for the end of 2011.

  5. Sensor based 3D conformal cueing for safe and reliable HC operation specifically for landing in DVE

    NASA Astrophysics Data System (ADS)

    Münsterer, Thomas; Kress, Martin; Klasen, Stephanus

    2013-05-01

    The paper describes the approach of a sensor-based landing aid for helicopters in degraded visual conditions. The system concept presented employs a long-range high-resolution ladar sensor allowing for identifying obstacles in the flight and approach paths as well as measuring landing site conditions such as slope, roughness and precise position relative to the helicopter during the long final approach. All these measurements are visualized to the pilot. Cueing is done by 3D conformal symbology displayed in a head-tracked HMD, enhanced by 2D symbols for data which is perceived more easily through 2D symbols than through 3D cueing. All 3D conformal symbology is placed on the measured landing site surface, which is further visualized by a grid structure for displaying landing site slope, roughness and small obstacles. Due to the limited resolution of the employed HMD, a specific scheme of blending in the information during the approach is employed. The interplay between in-flight and in-approach obstacle warning and CFIT warning symbology and this landing aid symbology is also investigated and exemplarily evaluated for the NH90 helicopter, which already implements a long-range high-resolution ladar-sensor-based obstacle warning and CFIT symbology. The paper further describes the results of simulator and flight tests performed with this system employing a ladar sensor and a head-tracked head-mounted display system. In the simulator trials, a full model of the ladar sensor producing 3D measurement points was used, working with the same algorithms used in the flight tests.

  6. CRISPR-STAT: an easy and reliable PCR-based method to evaluate target-specific sgRNA activity.

    PubMed

    Carrington, Blake; Varshney, Gaurav K; Burgess, Shawn M; Sood, Raman

    2015-12-15

    CRISPR/Cas9 has emerged as a versatile genome-engineering tool that relies on a single guide RNA (sgRNA) and the Cas9 enzyme for genome editing. Simple, fast and economical methods to generate sgRNAs have made targeted mutagenesis routine in cultured cells, mice, zebrafish and other model systems. Pre-screening of sgRNAs for target efficacy is desirable both for successful mutagenesis and minimizing wasted animal husbandry on targets with poor activity. Here, we describe an easy, quick and cost-effective fluorescent polymerase chain reaction (PCR)-based method, CRISPR Somatic Tissue Activity Test (CRISPR-STAT), to determine target-specific efficiency of sgRNA. As a proof of principle, we validated our method using 28 sgRNAs with known and varied levels of germline transmission efficiency in zebrafish by analysis of their somatic activity in injected embryos. Our data revealed a strong positive correlation between the fluorescent PCR profiles of the injected embryos and the germline transmission efficiency. Furthermore, the assay was sensitive enough to evaluate multiplex gene targeting. This method is easy to implement by laboratories with access to a capillary sequencer. Although we validated the method using CRISPR/Cas9 and zebrafish, it can be applied to other model systems and other genome targeting nucleases. PMID:26253739

  7. CRISPR-STAT: an easy and reliable PCR-based method to evaluate target-specific sgRNA activity

    PubMed Central

    Carrington, Blake; Varshney, Gaurav K.; Burgess, Shawn M.; Sood, Raman

    2015-01-01

    CRISPR/Cas9 has emerged as a versatile genome-engineering tool that relies on a single guide RNA (sgRNA) and the Cas9 enzyme for genome editing. Simple, fast and economical methods to generate sgRNAs have made targeted mutagenesis routine in cultured cells, mice, zebrafish and other model systems. Pre-screening of sgRNAs for target efficacy is desirable both for successful mutagenesis and minimizing wasted animal husbandry on targets with poor activity. Here, we describe an easy, quick and cost-effective fluorescent polymerase chain reaction (PCR)-based method, CRISPR Somatic Tissue Activity Test (CRISPR-STAT), to determine target-specific efficiency of sgRNA. As a proof of principle, we validated our method using 28 sgRNAs with known and varied levels of germline transmission efficiency in zebrafish by analysis of their somatic activity in injected embryos. Our data revealed a strong positive correlation between the fluorescent PCR profiles of the injected embryos and the germline transmission efficiency. Furthermore, the assay was sensitive enough to evaluate multiplex gene targeting. This method is easy to implement by laboratories with access to a capillary sequencer. Although we validated the method using CRISPR/Cas9 and zebrafish, it can be applied to other model systems and other genome targeting nucleases. PMID:26253739

  8. Reliability of Sleep Measures from Four Personal Health Monitoring Devices Compared to Research-Based Actigraphy and Polysomnography.

    PubMed

    Mantua, Janna; Gravel, Nickolas; Spencer, Rebecca M C

    2016-01-01

    Polysomnography (PSG) is the "gold standard" for monitoring sleep. Alternatives to PSG are of interest for clinical, research, and personal use. Wrist-worn actigraph devices have been utilized in research settings for measures of sleep for over two decades. Whether sleep measures from commercially available devices are similarly valid is unknown. We sought to determine the validity of five wearable devices: Basis Health Tracker, Misfit Shine, Fitbit Flex, Withings Pulse O2, and a research-based actigraph, Actiwatch Spectrum. We used Wilcoxon Signed Rank tests to assess differences between devices relative to PSG and correlational analysis to assess the strength of the relationship. Data loss was greatest for Fitbit and Misfit. For all devices, we found no difference and strong correlation of total sleep time with PSG. Sleep efficiency differed from PSG for Withings, Misfit, Fitbit, and Basis, while Actiwatch mean values did not differ from that of PSG. Only mean values of sleep efficiency (time asleep/time in bed) from Actiwatch correlated with PSG, yet this correlation was weak. Light sleep time differed from PSG (nREM1 + nREM2) for all devices. Measures of Deep sleep time did not differ from PSG (SWS + REM) for Basis. These results reveal the current strengths and limitations in sleep estimates produced by personal health monitoring devices and point to a need for future development. PMID:27164110

  9. Reliability of Sleep Measures from Four Personal Health Monitoring Devices Compared to Research-Based Actigraphy and Polysomnography

    PubMed Central

    Mantua, Janna; Gravel, Nickolas; Spencer, Rebecca M. C.

    2016-01-01

    Polysomnography (PSG) is the “gold standard” for monitoring sleep. Alternatives to PSG are of interest for clinical, research, and personal use. Wrist-worn actigraph devices have been utilized in research settings for measures of sleep for over two decades. Whether sleep measures from commercially available devices are similarly valid is unknown. We sought to determine the validity of five wearable devices: Basis Health Tracker, Misfit Shine, Fitbit Flex, Withings Pulse O2, and a research-based actigraph, Actiwatch Spectrum. We used Wilcoxon Signed Rank tests to assess differences between devices relative to PSG and correlational analysis to assess the strength of the relationship. Data loss was greatest for Fitbit and Misfit. For all devices, we found no difference and strong correlation of total sleep time with PSG. Sleep efficiency differed from PSG for Withings, Misfit, Fitbit, and Basis, while Actiwatch mean values did not differ from that of PSG. Only mean values of sleep efficiency (time asleep/time in bed) from Actiwatch correlated with PSG, yet this correlation was weak. Light sleep time differed from PSG (nREM1 + nREM2) for all devices. Measures of Deep sleep time did not differ from PSG (SWS + REM) for Basis. These results reveal the current strengths and limitations in sleep estimates produced by personal health monitoring devices and point to a need for future development. PMID:27164110

  10. A Reliability-Based Particle Filter for Humanoid Robot Self-Localization in RoboCup Standard Platform League

    PubMed Central

    Sánchez, Eduardo Munera; Alcobendas, Manuel Muñoz; Noguera, Juan Fco. Blanes; Gilabert, Ginés Benet; Simó Ten, José E.

    2013-01-01

    This paper deals with the problem of humanoid robot localization and proposes a new method for position estimation that has been developed for the RoboCup Standard Platform League environment. Firstly, a complete vision system has been implemented in the Nao robot platform that enables the detection of relevant field markers. The detection of field markers provides some estimation of distances for the current robot position. To reduce errors in these distance measurements, extrinsic and intrinsic camera calibration procedures have been developed and described. To validate the localization algorithm, experiments covering many of the typical situations that arise during RoboCup games have been developed: ranging from degradation in position estimation to total loss of position (due to falls, ‘kidnapped robot’, or penalization). The self-localization method developed is based on the classical particle filter algorithm. The main contribution of this work is a new particle selection strategy. Our approach reduces the CPU computing time required for each iteration and so eases the limited resource availability problem that is common in robot platforms such as Nao. The experimental results show the quality of the new algorithm in terms of localization and CPU time consumption. PMID:24193098

  11. A reliability-based particle filter for humanoid robot self-localization in RoboCup Standard Platform League.

    PubMed

    Munera Sánchez, Eduardo; Muñoz Alcobendas, Manuel; Blanes Noguera, Juan Fco; Benet Gilabert, Ginés; Simó Ten, José E

    2013-01-01

    This paper deals with the problem of humanoid robot localization and proposes a new method for position estimation that has been developed for the RoboCup Standard Platform League environment. Firstly, a complete vision system has been implemented in the Nao robot platform that enables the detection of relevant field markers. The detection of field markers provides some estimation of distances for the current robot position. To reduce errors in these distance measurements, extrinsic and intrinsic camera calibration procedures have been developed and described. To validate the localization algorithm, experiments covering many of the typical situations that arise during RoboCup games have been developed: ranging from degradation in position estimation to total loss of position (due to falls, 'kidnapped robot', or penalization). The self-localization method developed is based on the classical particle filter algorithm. The main contribution of this work is a new particle selection strategy. Our approach reduces the CPU computing time required for each iteration and so eases the limited resource availability problem that is common in robot platforms such as Nao. The experimental results show the quality of the new algorithm in terms of localization and CPU time consumption. PMID:24193098

  12. A systematic reliability investigation of the dielectric charging process in electrostatically actuated MEMS based on Kelvin probe force microscopy

    NASA Astrophysics Data System (ADS)

    Zaghloul, U.; Papaioannou, G. J.; Coccetti, F.; Pons, P.; Plana, R.

    2010-06-01

    This paper presents a comprehensive investigation for the dielectric charging problem in electrostatically actuated microelectromechanical system (MEMS) devices. The approach is based on Kelvin probe force microscopy (KPFM) and targets, in this specific paper, thin PECVD silicon nitride films for electrostatic capacitive RF MEMS switches. KPFM has been employed in order to mimic the potential induced at the dielectric surface due to charge injection through asperities. The effect of dielectric thickness has been investigated through depositing SiNx films with different thicknesses. Then, in order to simulate the different scenarios of dielectric charging in real MEMS switches, SiNx films have been deposited over thermally grown oxide, evaporated gold and electroplated gold layers. Also, the effect of the deposition conditions has been investigated through depositing dielectric films using low and high frequency PECVD methods. The investigation reveals that thin dielectric films have larger relaxation times compared to thick ones when the same injection bias is applied, independently of the substrate nature. For the same SiNx film thickness, the decay time constant is found to be smaller in dielectric films deposited over metallic layers compared to the ones deposited over silicon substrates. Finally, the material stoichiometry is found to affect the surface potential distribution as well as the relaxation time constant.

  13. Reliability-based foundation design for transmission line structures: Volume 3, Uncertainties in soil property measurement: Final report

    SciTech Connect

    Filippas, O.B.; Kulhawy, F.H.; Grigoriu, M.D.

    1988-10-01

    This study presents a probability-based procedure for evaluating the uncertainties in in-situ measurements for transmission line foundation design. The steps in the procedure involve in-situ soil testing, estimating mean values of the in-situ parameters, and evaluating the uncertainties in these estimates. The uncertainties include inherent soil variability and measurement error. The resulting in-situ parameters then are combined with available prior information by using a Bayesian approach. With Bayesian updating, the final estimate of the in-situ parameters incorporates information obtained from all available sources. Models then are used to transform the in-situ parameters into the needed design soil properties. The analyses use data in the vertical direction, so they are useful primarily at a single structure site, where from one test boring or probe it is possible to evaluate the vertical variations in in-situ measurements. A general strategy for evaluating the uncertainties in undrained deep foundation design also is outlined, and the effects of the different uncertainties on the predictions are illustrated. The results show that the uncertainties are of major importance at shallow depths but, as the foundation depth increases, the uncertainties become less important. Measurement uncertainty is particularly important, and the results show that these influences should be minimized where possible. 47 refs., 29 figs., 8 tabs.
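
    A minimal sketch of the Bayesian-updating step described above, assuming a normal prior from regional experience and a normal measurement model with known standard deviation; all numbers are illustrative, not the report's data.

    ```python
    import numpy as np

    prior_mean, prior_sd = 50.0, 10.0           # kPa, prior from regional experience
    meas = np.array([62.0, 58.0, 65.0, 60.0])   # kPa, in-situ test values (toy)
    meas_sd = 8.0                               # kPa, inherent variability + measurement error

    n = len(meas)
    post_prec = 1 / prior_sd**2 + n / meas_sd**2      # precisions add for normal models
    post_mean = (prior_mean / prior_sd**2 + meas.sum() / meas_sd**2) / post_prec
    post_sd = post_prec ** -0.5

    print(f"posterior: {post_mean:.1f} +/- {post_sd:.1f} kPa")   # pulled toward the data
    ```

    The posterior standard deviation shrinks as measurements accumulate, matching the report's finding that uncertainties matter most where data are scarce.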

  14. Toward improving the reliability of hydrologic prediction: Model structure uncertainty and its quantification using ensemble-based genetic programming framework

    NASA Astrophysics Data System (ADS)

    Parasuraman, Kamban; Elshorbagy, Amin

    2008-12-01

    Uncertainty analysis is starting to be widely acknowledged as an integral part of hydrological modeling. The conventional treatment of uncertainty analysis in hydrologic modeling is to assume a deterministic model structure, and treat its associated parameters as imperfectly known, thereby neglecting the uncertainty associated with the model structure. In this paper, a modeling framework that can explicitly account for the effect of model structure uncertainty has been proposed. The modeling framework is based on initially generating different realizations of the original data set using a non-parametric bootstrap method, and then exploiting the ability of the self-organizing algorithms, namely genetic programming, to evolve their own model structure for each of the resampled data sets. The resulting ensemble of models is then used to quantify the uncertainty associated with the model structure. The performance of the proposed modeling framework is analyzed with regards to its ability in characterizing the evapotranspiration process at the Southwest Sand Storage facility, located near Ft. McMurray, Alberta. Eddy-covariance-measured actual evapotranspiration is modeled as a function of net radiation, air temperature, ground temperature, relative humidity, and wind speed. Investigating the relation between model complexity, prediction accuracy, and uncertainty, two sets of experiments were carried out by varying the level of mathematical operators that can be used to define the predictand-predictor relationship. While the first set uses just the additive operators, the second set uses both the additive and the multiplicative operators to define the predictand-predictor relationship. The results suggest that increasing the model complexity may lead to better prediction accuracy but at an expense of increasing uncertainty. Compared to the model parameter uncertainty, the relative contribution of model structure uncertainty to the predictive uncertainty of a model is
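
    A minimal sketch of the ensemble idea described above, with simple polynomial fits standing in for genetic programming: each bootstrap resample gets its own model structure (degree chosen per resample), and the spread of the ensemble predictions expresses model structure uncertainty. The data are synthetic stand-ins for the evapotranspiration series.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    x = np.linspace(0, 1, 80)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.25, x.size)   # toy predictand

    x_new, preds = 0.37, []
    for _ in range(200):                                      # bootstrap realizations
        idx = rng.integers(0, x.size, x.size)                 # resample with replacement
        xb, yb = x[idx], y[idx]
        best = min(range(1, 7), key=lambda d:                 # structure chosen per resample
                   np.sum((np.polyval(np.polyfit(xb, yb, d), xb) - yb) ** 2)
                   + 0.05 * d * x.size)                       # crude complexity penalty
        preds.append(np.polyval(np.polyfit(xb, yb, best), x_new))

    preds = np.array(preds)
    print(f"prediction at x={x_new}: {preds.mean():.3f} +/- {preds.std():.3f} (structure spread)")
    ```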

  15. An artificial neural network-based response surface method for reliability analyses of c-φ slopes with spatially variable soil

    NASA Astrophysics Data System (ADS)

    Shu, Su-xun; Gong, Wen-hui

    2016-03-01

    This paper presents an artificial neural network (ANN)-based response surface method that can be used to predict the failure probability of c-φ slopes with spatially variable soil. In this method, the Latin hypercube sampling technique is adopted to generate input datasets for establishing an ANN model; the random finite element method is then utilized to calculate the corresponding output datasets considering the spatial variability of soil properties; and finally, an ANN model is trained to construct the response surface of failure probability and obtain an approximate function that incorporates the relevant variables. The results of the illustrated example indicate that the proposed method provides credible and accurate estimations of failure probability. As a result, the obtained approximate function can be used as an alternative to the specific analysis process in c-φ slope reliability analyses.
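
    A minimal sketch of the proposed workflow: Latin hypercube samples of the soil-strength parameters feed an expensive analysis, and an ANN is trained as the response surface for failure probability. Here `toy_rfem_pf` is a hypothetical placeholder for the random finite element computation, and the parameter bounds are assumed.

    ```python
    import numpy as np
    from scipy.stats import qmc
    from sklearn.neural_network import MLPRegressor

    def toy_rfem_pf(c, phi):   # placeholder for the random-FEM failure probability
        return 1 / (1 + np.exp(0.8 * c + 0.5 * phi - 20))    # decreases with strength

    sampler = qmc.LatinHypercube(d=2, seed=0)
    X = qmc.scale(sampler.random(200), [5.0, 10.0], [25.0, 35.0])  # c [kPa], phi [deg]
    y = toy_rfem_pf(X[:, 0], X[:, 1])

    surface = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                           random_state=0).fit(X, y)          # ANN response surface
    print("predicted Pf at c=12 kPa, phi=20 deg:",
          float(surface.predict([[12.0, 20.0]])[0]))
    ```

    Once trained, the surrogate replaces the expensive random-FEM runs, so the failure probability can be evaluated cheaply across many parameter combinations in the reliability analysis.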

  16. Using a Web-Based Approach to Assess Test–Retest Reliability of the “Hypertension Self-Care Profile” Tool in an Asian Population

    PubMed Central

    Koh, Yi Ling Eileen; Lua, Yi Hui Adela; Hong, Liyue; Bong, Huey Shin Shirley; Yeo, Ling Sui Jocelyn; Tsang, Li Ping Marianne; Ong, Kai Zhi; Wong, Sook Wai Samantha; Tan, Ngiap Chuan

    2016-01-01

    Essential hypertension often requires affected patients to self-manage their condition most of the time. Besides seeking regular medical review of their life-long condition to detect vascular complications, patients have to maintain healthy lifestyles in between physician consultations via diet and physical activity, and to take their medications according to their prescriptions. Their self-management ability is influenced by their self-efficacy capacity, which can be assessed using questionnaire-based tools. The “Hypertension Self-Care Profile” (HTN-SCP) is 1 such questionnaire assessing self-efficacy in the domains of “behavior,” “motivation,” and “self-efficacy.” This study aims to determine the test–retest reliability of HTN-SCP in an English-literate Asian population using a web-based approach. Multiethnic Asian patients, aged 40 years and older, with essential hypertension were recruited from a typical public primary care clinic in Singapore. The investigators guided the patients to fill in the web-based 60-item HTN-SCP in English using a tablet or smartphone on the first visit, and the patients completed the instrument again 2 weeks later in the retest. Internal consistency and test–retest reliability were evaluated using Cronbach's Alpha and intraclass correlation coefficients (ICC), respectively. The t test was used to determine the relationship between the overall HTN-SCP scores of the patients and their self-reported self-management activities. A total of 160 patients completed the HTN-SCP during the initial test, from which 71 test–retest responses were completed. No floor or ceiling effect was found for the scores for the 3 subscales. Cronbach's Alpha coefficients were 0.857, 0.948, and 0.931 for “behavior,” “motivation,” and “self-efficacy” domains respectively, indicating high internal consistency. The item-total correlation ranges for the 3 scales were from 0.105 to 0.656 for Behavior, 0.401 to 0.808 for Motivation, 0.349 to 0

  17. Validation and test-retest reliability of a health measure, health as ability of acting, based on the welfare theory of health.

    PubMed

    Snellman, Ingrid; Jonsson, Bosse; Wikblad, Karin

    2012-03-01

    The aim of this study was to conduct a validation and assess the test-retest reliability of a health questionnaire based on Nordenfelt's Welfare Theory of Health (WTH). The study used the questionnaire on health together with the Short Form 12-Item Health Survey (SF-12) questionnaire, and 490 pupils at colleges for adult education participated. The results of the study are in accordance with Nordenfelt's WTH. Three hypotheses were stated, and the first was confirmed: people who were satisfied with life reported higher levels of both mental and physical health, measured with the SF-12, than those who were dissatisfied with life. The second hypothesis was partially confirmed: people with high education were more often satisfied with life than those with low education, but they were not healthier. The third hypothesis, that women are unhealthy more often than men, was not confirmed. The questionnaire on health showed acceptable stability. PMID:21930655

  18. Investigation of thermal stability and reliability of HfO2 based resistive random access memory devices with cross-bar structure

    NASA Astrophysics Data System (ADS)

    Chand, Umesh; Huang, Kuan-Chang; Huang, Chun-Yang; Ho, Chia-Hua; Lin, Chen-Hsi; Tseng, Tseung-Yuen

    2015-05-01

    The effect of the annealing treatment of a HfO2 resistive switching layer and the memory performance of a HfO2-based resistive random access memory (cross-bar structure) device were investigated. Oxygen is released from HfO2 resistive switching layers during vacuum annealing, leading to unstable resistive switching properties. This oxygen release problem can be suppressed by inserting an Al2O3 thin film, which has a lower Gibbs free energy, between the HfO2 layer and top electrode to form a Ti/Al2O3/HfO2/TiN structure. This device structure exhibited good reliability after high temperature vacuum annealing and post metal annealing (PMA) treatments. Moreover, the endurance and retention properties of the device were also improved after the PMA treatment.

  19. RFA-based 589-nm guide star lasers for ESO VLT: a paradigm shift in performance, operational simplicity, reliability, and maintenance

    NASA Astrophysics Data System (ADS)

    Friedenauer, Axel; Karpov, Vladimir; Wei, Daoping; Hager, Manfred; Ernstberger, Bernhard; Clements, Wallace R. L.; Kaenders, Wilhelm G.

    2012-07-01

    Large telescopes equipped with adaptive optics require 20-25 W CW 589-nm sources with emission linewidths of ~5 MHz. These Guide Star (GS) lasers should also be highly reliable and simple to operate and maintain for many years at the top of a mountain facility. Under contract from ESO, industrial partners TOPTICA and MPBC are nearing completion of the development of GS lasers for the ESO VLT, with delivery of the first of four units scheduled for December 2012. We report on the design and performance of the fully-engineered Pre-Production Unit (PPU), including system reliability/availability analysis, the successfully-concluded qualification testing, long-term component and system level tests, and long-term maintenance and support planning. The chosen approach is based on ESO's patented narrow-band Raman Fiber Amplifier (RFA) technology. A master oscillator signal from a linearly-polarized TOPTICA 20-mW, 1178-nm CW diode laser, with stabilized emission frequency and controllable linewidth up to a few MHz, is amplified in an MPBC polarization-maintaining (PM) RFA pumped by a high-power 1120-nm PM fiber laser. With efficient stimulated Brillouin scattering suppression, an unprecedented 40 W of narrow-band RFA output has been obtained. This is then mode-matched into a resonant-cavity doubler with a free spectral range matching the sodium D2a to D2b separation, allowing simultaneous generation of an additional frequency component (D2b line) to re-pump the sodium atom electronic population. With this technique, the return flux can be increased without having to resort to electro-optical modulators and without the risk of introducing optical wavefront distortions. The demonstrated output powers, with doubling efficiencies >80% at 589 nm, easily exceed the 20 W design goal and require less than 700 W of electrical power. In summary, the fiber-based guide star lasers provide excellent beam quality and are modular, turn-key, maintenance-free, reliable, efficient, and ruggedized.

  20. Assuring Electronics Reliability: What Could and Should Be Done Differently

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    The following “ten commandments” for the predicted and quantified reliability of aerospace electronic and photonic products are addressed and discussed: 1) The best product is the best compromise between the needs for reliability, cost effectiveness and time-to-market; 2) Reliability cannot be low, need not be higher than necessary, but has to be adequate for a particular product; 3) When reliability is imperative, the ability to quantify it is a must, especially if optimization is considered; 4) One cannot design a product with quantified, optimized and assured reliability by limiting the effort to highly accelerated life testing (HALT), which does not quantify reliability; 5) Reliability is conceived at the design stage and should be taken care of, first of all, at this stage, when a “genetically healthy” product should be created; reliability evaluations and assurances cannot be delayed until the product is fabricated and shipped to the customer, i.e., cannot be left to the prognostics-and-health-monitoring/managing (PHM) stage; it is too late at this stage to change the design or the materials for improved reliability; that is why, when reliability is imperative, users re-qualify parts to assess their lifetime and use redundancy to build a highly reliable system out of insufficiently reliable components; 6) Design, fabrication, qualification and PHM efforts should consider and be specific for particular products and their most likely actual or at least anticipated application(s); 7) Probabilistic design for reliability (PDfR) is an effective means for improving the state-of-the-art in the field: nothing is perfect, and the difference between an unreliable product and a robust one is “merely” the probability of failure (PoF); 8) Highly cost-effective and highly focused failure oriented accelerated testing (FOAT) geared to a particular pre-determined reliability model and aimed at understanding the physics of failure anticipated by this model is an

  1. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
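
    The fault-tree approach mentioned above composes a system-level failure probability from component probabilities through AND/OR gates. A minimal sketch, assuming independent components and a hypothetical two-branch device rather than the report's fictitious example:

      from functools import reduce

      def and_gate(probs):
          # All inputs must fail: P = product of failure probabilities.
          return reduce(lambda a, b: a * b, probs, 1.0)

      def or_gate(probs):
          # Any input failing suffices: P = 1 - product of survival probabilities.
          return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

      # Hypothetical converter: fails if the controller fails OR both
      # redundant power stages fail.
      p_top = or_gate([0.001,                    # controller
                       and_gate([0.01, 0.01])])  # redundant power stages
      print(p_top)  # ~0.0011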

  2. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  3. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  4. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech, the University of Wyoming, Montana State University, NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  5. Predicting software reliability

    NASA Technical Reports Server (NTRS)

    Littlewood, B.

    1989-01-01

    A detailed look is given to software reliability techniques. A conceptual model of the failure process is examined, and some software reliability growth models are discussed. Problems for which no current solutions exist are addressed, emphasizing the very difficult problem of safety-critical systems for which the reliability requirements can be enormously demanding.

  6. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE, is described. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  7. Facial Psoriasis Log-based Area and Severity Index: A valid and reliable severity measurement method detecting improvement of facial psoriasis in clinical practice settings.

    PubMed

    Kwon, Hyuck Hoon; Kim, Min-Woo; Park, Gyeong-Hun; Bae, You In; Kuk, Su Kyung; Suh, Dae Hun; Youn, Jai Il; Kwon, In Ho

    2016-08-01

    Facial psoriasis is often observed in moderate to severe degrees of psoriasis. While we previously demonstrated construct validity of the facial Psoriasis Log-based Area and Severity Index (fPLASI) system for the cross-sectional evaluation of facial psoriasis, its reliability and accuracy to detect clinical improvement has not been confirmed yet. The aim of this study is to analyze whether the fPLASI properly represents the range of improvement for facial psoriasis compared with the existing facial Psoriasis Area and Severity Index (fPASI) after receiving systemic treatments in clinical practice settings. The changing severity of facial psoriasis for 118 patients was calculated by the scales of fPASI and fPLASI between two time points after systemic treatments. Then, percentage changes (ΔfPASI and ΔfPLASI) were analyzed from the perspective of both the Physician's Global Assessment of effectiveness (PGA) and patients' Subjective Global Assessment (SGA). As a result, the distribution of the fPASI was more heavily clustered around the low score range compared with the fPLASI at both first and second visits. Linear regression analysis between ΔfPASI and ΔfPLASI shows that the correlation coefficient was 0.94, and ΔfPLASI represented greater percentage changes than ΔfPASI. Remarkably, degrees of clinical improvement measured by the PGA matched better with ΔfPLASI, while ΔfPASI underestimated clinical improvements compared with ΔfPLASI from treatment-responding groups by the PGA and SGA. In conclusion, the fPLASI represented clinical improvement of facial psoriasis with more sensitivity and reliability compared with the fPASI. Therefore, the PLASI system would be a viable severity measurement method for facial psoriasis in clinical practice. PMID:26992293

  8. Robust fusion with reliabilities weights

    NASA Astrophysics Data System (ADS)

    Grandin, Jean-Francois; Marques, Miguel

    2002-03-01

    Reliability is a measure of the degree of trust in a given measurement. We analyze and compare: ML (classical Maximum Likelihood), MLE (Maximum Likelihood weighted by Entropy), MLR (Maximum Likelihood weighted by Reliability), MLRE (Maximum Likelihood weighted by Reliability and Entropy), DS (Credibility Plausibility), and DSR (DS weighted by reliabilities). The analysis is based on a model of a dynamical fusion process composed of three sensors, each of which has its own discriminatory capacity, reliability rate, unknown bias, and measurement noise. The knowledge of uncertainties is also severely corrupted, in order to analyze the robustness of the different fusion operators. Two sensor models are used: the first type of sensor is able to estimate the probability of each elementary hypothesis (probabilistic masses); the second type delivers masses on unions of elementary hypotheses (DS masses). In the second case, probabilistic reasoning leads to sharing the mass abusively between elementary hypotheses. Compared to the classical ML or DS, which achieve just 50% correct classification in some experiments, DSR, MLE, MLR and MLRE reveal very good performance on all experiments (more than an 80% correct classification rate). The experiments were performed with large variations of the reliability coefficients for each sensor (from 0 to 1), and with large variations in the knowledge of these coefficients (from 0 to 0.8). All four operators reveal good robustness, but MLR proves to be uniformly dominant on all the experiments in the Bayesian case and achieves the best mean performance under incomplete a priori information.
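
    A minimal sketch of the reliability-weighting idea behind MLR: each sensor's likelihood is discounted by raising it to the power of its reliability before fusion. This is a generic discounting scheme written for illustration, not necessarily the authors' exact operator:

      import numpy as np

      def fuse_mlr(sensor_probs, reliabilities):
          """sensor_probs: (n_sensors, n_hypotheses) probability masses.
          reliabilities: (n_sensors,) trust weights in [0, 1].
          Returns the fused distribution; argmax is the MLR decision."""
          logp = np.log(np.asarray(sensor_probs) + 1e-12)
          fused = np.exp((np.asarray(reliabilities)[:, None] * logp).sum(axis=0))
          return fused / fused.sum()

      # Three sensors, two hypotheses; sensor 3 is unreliable (r = 0.1).
      p = fuse_mlr([[0.8, 0.2], [0.7, 0.3], [0.1, 0.9]], [1.0, 0.9, 0.1])
      print(p, p.argmax())  # the unreliable dissenter barely shifts the result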

  9. Reliability of Task-Based fMRI for Preoperative Planning: A Test-Retest Study in Brain Tumor Patients and Healthy Controls

    PubMed Central

    Morrison, Melanie A.; Churchill, Nathan W.; Cusimano, Michael D.; Schweizer, Tom A.; Das, Sunit; Graham, Simon J.

    2016-01-01

    Background Functional magnetic resonance imaging (fMRI) continues to develop as a clinical tool for patients with brain cancer, offering data that may directly influence surgical decisions. Unfortunately, routine integration of preoperative fMRI has been limited by concerns about reliability. Many pertinent studies have been undertaken involving healthy controls, but work involving brain tumor patients has been limited. To develop fMRI fully as a clinical tool, it will be critical to examine these reliability issues among patients with brain tumors. The present work is the first to extensively characterize differences in activation map quality between brain tumor patients and healthy controls, including the effects of tumor grade and the chosen behavioral testing paradigm on reliability outcomes. Method Test-retest data were collected for a group of low-grade (n = 6) and high-grade glioma (n = 6) patients, and for matched healthy controls (n = 12), who performed motor and language tasks during a single fMRI session. Reliability was characterized by the spatial overlap and displacement of brain activity clusters, BOLD signal stability, and the laterality index. Significance testing was performed to assess differences in reliability between the patients and controls, and between low-grade and high-grade patients, as well as between different fMRI testing paradigms. Results There were few significant differences in fMRI reliability measures between patients and controls. Reliability was significantly lower when comparing high-grade tumor patients to controls, or to low-grade tumor patients. The motor task produced more reliable activation patterns than the language tasks, as did the rhyming task in comparison to the phonemic fluency task. Conclusion In low-grade glioma patients, fMRI data are as reliable as in healthy control subjects. For high-grade glioma patients, further investigation is required to determine the underlying causes of reduced reliability. To maximize

  10. Fast and reliable artemisinin determination from different Artemisia annua leaves based alimentary products by high performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Carrà, Andrea; Bagnati, Renzo; Fanelli, Roberto; Bonati, Maurizio

    2014-01-01

    In many tropical countries malaria is endemic, causing acute illness and killing people, especially children. The availability of recommended malaria medicines is scant, even though these medicines are based on artemisinin, a compound extracted from the Artemisia annua plant that grows in many of these countries. New sources of treatment drawn from traditional medicine are therefore used, such as the tea infusion. An analytical method based on high-performance liquid chromatography/tandem mass spectrometry (HPLC-MS/MS) was developed to quantify the artemisinin content of foods prepared with Artemisia annua leaves. A fast and reliable analytical method is described. The technique does not require any derivatisation prior to injection and offers excellent analytical intermediate precision. Robust qualitative and quantitative results were obtained using tea, biscuit or porridge specimens. Although further research is needed to define the potential therapeutic benefits of these alimentary formulations, the analytical method described can be employed in developing more convenient and appropriate foods for administering artemisinin to those infected with malaria. PMID:24001820

  11. Mesostructured HfxAlyO2 Thin Films as Reliable and Robust Gate Dielectrics with Tunable Dielectric Constants for High-Performance Graphene-Based Transistors.

    PubMed

    Lee, Yunseong; Jeon, Woojin; Cho, Yeonchoo; Lee, Min-Hyun; Jeong, Seong-Jun; Park, Jongsun; Park, Seongjun

    2016-07-26

    We introduce a reliable and robust gate dielectric material with tunable dielectric constants based on a mesostructured HfxAlyO2 film. The ultrathin mesostructured HfxAlyO2 film is deposited on graphene via a physisorbed-precursor-assisted atomic layer deposition process and consists of an intermediate state with small crystallized parts in an amorphous matrix. Crystal phase engineering using Al dopant is employed to achieve HfO2 phase transitions, which produce the crystallized part of the mesostructured HfxAlyO2 film. The effects of various Al doping concentrations are examined, and an enhanced dielectric constant of ∼25 is obtained. Further, the leakage current is suppressed (∼10(-8) A/cm(2)) and the dielectric breakdown properties are enhanced (breakdown field: ∼7 MV/cm) by the partially remaining amorphous matrix. We believe that this contribution is theoretically and practically relevant because excellent gate dielectric performance is obtained. In addition, an array of top-gated metal-insulator-graphene field-effect transistors is fabricated on a 6 in. wafer, yielding a capacitance equivalent oxide thickness of less than 1 nm (0.78 nm). This low capacitance equivalent oxide thickness has important implications for the incorporation of graphene into high-performance silicon-based nanoelectronics. PMID:27355098

  12. Performance of GaN-on-Si-based vertical light-emitting diodes using silicon nitride electrodes with conducting filaments: correlation between filament density and device reliability.

    PubMed

    Kim, Kyeong Heon; Kim, Su Jin; Lee, Tae Ho; Lee, Byeong Ryong; Kim, Tae Geun

    2016-08-01

    Transparent conductive electrodes with good conductivity and optical transmittance are an essential element for highly efficient light-emitting diodes. However, conventional indium tin oxide and its alternative transparent conductive electrodes have some trouble with a trade-off between electrical conductivity and optical transmittance, thus limiting their practical applications. Here, we present silicon nitride transparent conductive electrodes with conducting filaments embedded using the electrical breakdown process and investigate the dependence of the device performance of gallium nitride-based vertical light-emitting diodes on the density of the conducting filaments formed in the transparent conductive electrode. Three gallium nitride-on-silicon-based vertical light-emitting diodes using silicon nitride transparent conductive electrodes with high, medium, and low conducting filament densities were prepared, along with a reference vertical light-emitting diode using metal electrodes. This was carried out to determine the optimal density of the conducting filaments in the proposed silicon nitride transparent conductive electrodes. In comparison, the vertical light-emitting diodes with a medium conducting filament density exhibited the lowest optical loss, direct ohmic behavior, and the best current injection and distribution over the entire n-type gallium nitride surface, leading to highly reliable light-emitting diode performance. PMID:27505739

  13. Business of reliability

    NASA Astrophysics Data System (ADS)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease in reception equipment costs allows non-remote-sensing organizations to access a technology until recently reserved for a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of data and value-added products. In particular, the high volume of data sales required for a return on investment conflicts with the traditionally low-volume use of data in most applications. Constant access to data sources presupposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low-cost equipment is only possible when the technology has proven reliable, in terms of application results, financial risks, and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems, and rice acreage monitoring.

  14. Disappointing reliability of pulsatility indices to identify candidates for magnetic resonance imaging screening in population-based studies assessing prevalence of cerebral small vessel disease

    PubMed Central

    Del Brutto, Oscar H.; Mera, Robertino M.; Andrade, María de la Luz; Castillo, Pablo R.; Zambrano, Mauricio; Nader, Juan A.

    2015-01-01

    Background: Diagnosis of cerebral small vessel disease (SVD) is a challenge in remote areas where magnetic resonance imaging (MRI) is not available. Hospital-based studies in high-risk or stroke patients have found an association between the pulsatility index (PI) of intracranial arteries – as derived from transcranial Doppler (TCD) – and white matter hyperintensities (WMH) of presumed vascular origin. We aimed to assess the reliability of cerebral pulsatility indices to identify candidates for MRI screening in population-based studies assessing prevalence of SVD. Methods: A representative sample of stroke-free Atahualpa residents aged ≥65 years investigated with MRI underwent TCD. Using generalized linear models, we evaluated whether the PI of major intracranial arteries correlate with WMH (used as a proxy of diffuse SVD), after adjusting for demographics and cardiovascular risk factors. Results: Out of 70 participants (mean age 70.6 ± 4.6 years, 57% women), 28 (40%) had moderate-to-severe WMH. In multivariate models, there were no differences across categories of WMH in the mean PI of middle cerebral arteries (1.10 ± 0.16 vs. 1.22 ± 0.24, β: 0.065, 95% confidence interval (CI): −0.084–0.177, P = 0.474) or vertebrobasilar arteries (1.11 ± 0.16 vs. 1.29 ± 0.27, β: 0.066, 95% CI: −0.0024–0.156, P = 0.146). Conclusions: Cerebral PI should not be used to identify candidates for MRI screening in population-based studies assessing the burden of SVD. PMID:26167015
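
    For context, the Gosling pulsatility index used above is conventionally computed from TCD velocities as PI = (PSV - EDV) / mean velocity; a one-line helper with the standard formula (not code from the study):

      def pulsatility_index(psv, edv, mean_velocity):
          """Gosling PI from peak-systolic, end-diastolic and mean velocities (cm/s)."""
          return (psv - edv) / mean_velocity

      print(pulsatility_index(psv=90.0, edv=35.0, mean_velocity=55.0))  # = 1.0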

  15. The Reliability of College Grades

    ERIC Educational Resources Information Center

    Beatty, Adam S.; Walmsley, Philip T.; Sackett, Paul R.; Kuncel, Nathan R.; Koch, Amanda J.

    2015-01-01

    Little is known about the reliability of college grades relative to how prominently they are used in educational research, and the results to date tend to be based on small sample studies or are decades old. This study uses two large databases (N > 800,000) from over 200 educational institutions spanning 13 years and finds that both first-year…

  16. Web Awards: Are They Reliable?

    ERIC Educational Resources Information Center

    Everhart, Nancy; McKnight, Kathleen

    1997-01-01

    School library media specialists recommend quality Web sites to children based on evaluations and Web awards. This article examines three types of Web awards and who grants them, suggests ways to determine their reliability, and discusses specific award sites. Includes a bibliography of Web sites. (PEN)

  17. Human reliability analysis

    SciTech Connect

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The work provides a history of human reliability analysis and includes examples of the application of the systems approach.

  18. Omics for prediction of environmental health effects: Blood leukocyte-based cross-omic profiling reliably predicts diseases associated with tobacco smoking

    PubMed Central

    Georgiadis, Panagiotis; Hebels, Dennie G.; Valavanis, Ioannis; Liampa, Irene; Bergdahl, Ingvar A.; Johansson, Anders; Palli, Domenico; Chadeau-Hyam, Marc; Chatziioannou, Aristotelis; Jennen, Danyel G. J.; Krauskopf, Julian; Jetten, Marlon J.; Kleinjans, Jos C. S.; Vineis, Paolo; Kyrtopoulos, Soterios A.; Gottschalk, Ralph; van Leeuwen, Danitsja; Timmermans, Leen; de Kok, Theo M.C.M.; Botsivali, Maria; Bendinelli, Benedetta; Kelly, Rachel; Vermeulen, Roel; Portengen, Lutzen; Saberi-Hosnijeh, Fatemeh; Melin, Beatrice; Hallmans, Göran; Lenner, Per; Keun, Hector C.; Siskos, Alexandros; Athersuch, Toby J.; Kogevinas, Manolis; Stephanou, Euripides G.; Myridakis, Antonis; Fazzo, Lucia; De Santis, Marco; Comba, Pietro; Kiviranta, Hannu; Rantakokko, Panu; Airaksinen, Riikka; Ruokojärvi, Päivi; Gilthorpe, Mark; Fleming, Sarah; Fleming, Thomas; Tu, Yu-Kang; Jonsson, Bo; Lundh, Thomas; Chen, Wei J.; Lee, Wen-Chung; Kate Hsiao, Chuhsing; Chien, Kuo-Liong; Kuo, Po-Hsiu; Hung, Hung; Liao, Shu-Fen

    2016-01-01

    The utility of blood-based omic profiles for linking environmental exposures to their potential health effects was evaluated in 649 individuals, drawn from the general population, in relation to tobacco smoking, an exposure with well-characterised health effects. Using disease connectivity analysis, we found that the combination of smoking-modified, genome-wide gene (including miRNA) expression and DNA methylation profiles predicts with remarkable reliability most diseases and conditions independently known to be causally associated with smoking (indicative estimates of sensitivity and positive predictive value 94% and 84%, respectively). Bioinformatics analysis reveals the importance of a small number of smoking-modified, master-regulatory genes and suggest a central role for altered ubiquitination. The smoking-induced gene expression profiles overlap significantly with profiles present in blood cells of patients with lung cancer or coronary heart disease, diseases strongly associated with tobacco smoking. These results provide proof-of-principle support to the suggestion that omic profiling in peripheral blood has the potential of identifying early, disease-related perturbations caused by toxic exposures and may be a useful tool in hazard and risk assessment. PMID:26837704

  19. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1989-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
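
    A minimal sketch of the u-plot idea: each one-step-ahead predictive distribution F_i is evaluated at the observed inter-failure time t_i, and the empirical distribution of the resulting u_i = F_i(t_i) drives the recalibration. The exponential predictive distribution and the numbers are assumptions for illustration:

      import numpy as np

      # Hypothetical one-step-ahead predictions: predicted mean time to next
      # failure (exponential model) and the inter-failure time actually observed.
      predicted_mean = np.array([10.0, 12.0, 9.0, 15.0, 11.0])
      observed_time = np.array([4.0, 6.0, 3.0, 7.0, 5.0])

      # u_i = F_i(t_i) under the exponential predictive distribution.
      u = 1.0 - np.exp(-observed_time / predicted_mean)

      # The u-plot compares sorted u_i against the uniform CDF; the empirical
      # CDF G estimated from past u's recalibrates a new prediction F*(t) = G(F(t)).
      u_sorted = np.sort(u)
      def recalibrate(F_of_t):
          # Fraction of past u's at or below F(t): a step-function estimate of G.
          return np.searchsorted(u_sorted, F_of_t, side="right") / len(u_sorted)

      print(u_sorted)
      print(recalibrate(0.5))  # recalibrated probability for a raw prediction of 0.5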

  20. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1990-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.

  1. Reliability of fluid systems

    NASA Astrophysics Data System (ADS)

    Kopáček, Jaroslav; Fojtášek, Kamil; Dvořák, Lukáš

    2016-03-01

    This paper focuses on the importance of assessing reliability, especially in complex fluid systems for demanding production technology. The basic criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, together with their application in calculations for series, parallel, and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
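
    For independent elements, the series and parallel (backed-up) calculations referred to above reduce to products of reliabilities; a minimal sketch of the generic formulas, not the paper's pneumatic example:

      import math

      def series_reliability(r):
          # A chain works only if every element works.
          return math.prod(r)

      def parallel_reliability(r):
          # A redundant group fails only if every element fails.
          return 1.0 - math.prod(1.0 - ri for ri in r)

      r = [0.95, 0.98, 0.99]
      print(series_reliability(r))             # ~0.9217
      print(parallel_reliability([0.9, 0.9]))  # 0.99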

  2. Reliability reporting practices in rape myth research.

    PubMed

    Buhi, Eric R

    2005-02-01

    A number of school-based programs address sexual violence by focusing on adolescents' attitudes about rape or acceptance of rape myths. However, many problems exist in the literature regarding measurement of rape myth acceptance, including issues of reliability and validity. This paper addresses measurement reliability issues and reviews reliability reporting practices of studies using the Burt Rape Myth Acceptance Scale. Less than one-half of the 68 articles examined reported reliability coefficients for the data collected. Almost one-third of the studies did not mention reliability. Examples of acceptable reliability reporting are provided. It is argued that reliability coefficients for the data actually analyzed should always be assessed and reported when interpreting program results. Implications for school health research and practice are discussed. PMID:15929595

  3. Development of CPV solar receiver based on insulated metal substrate (IMS): Comparison with receiver based on the direct bonded copper substrate (DBC) - A reliability study

    NASA Astrophysics Data System (ADS)

    Mabille, Loïc; Mangeant, Christophe; Baudrit, Mathieu

    2012-10-01

    Most CPV solar receivers are based on III-V multijunction cells die-attached to a direct bonded copper (DBC) substrate. An alternative to DBC resides in the insulated metal substrate (IMS). This paper presents the behavior of IMS and DBC receivers when tested under accelerated aging conditions such as those described in IEC 62108. Characterization tools involved in the monitoring of potential degradation are electroluminescence (EL), dark I(V) (DIV), spectral response (EQE), diode factor measurements (VIM), RX tomography (RX-T), and, of course, illuminated I(V) under various concentration factors. Based on first EL, DIV, and EQE results, IMS and DBC age in a similar way. The study is ongoing.

  4. Assessing the performance and reliability of PERSIANN-CDR satellite-based rainfall estimates over Spain: case study of rainfall Dry Spell Lengths (DSL)

    NASA Astrophysics Data System (ADS)

    Garcia Galiano, S. G.; Giraldo Osorio, J. D.; Nguyen, P.; Hsu, K. L.; Braithwaite, D.; Olmos, P.; Sorooshian, S.

    2015-12-01

    Studying Spain's long-term variability and changing trends in rainfall, due to its unique position in the Mediterranean basin (i.e., the latitudinal gradient from North to South and its orographic variation), can provide valuable insight into how the hydroclimatology of the region has changed. A recently released high-resolution satellite-based global daily precipitation climate dataset, PERSIANN-CDR (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network - Climate Data Record), provided the opportunity to conduct such a study. It covers the period 01/01/1983 to date, at 0.25° resolution. In areas without a dense network of rain gauges, the PERSIANN-CDR dataset could be useful for assessing the reliability of regional climate models (RCMs), in order to build a robust RCM ensemble for reducing the uncertainties in climate and hydrological projections. However, before using this dataset for RCM evaluation, an assessment of the performance of PERSIANN-CDR against in-situ observations is necessary. The high-resolution gridded daily rain-gauge dataset named Spain02 was employed in this study. The variable Dry Spell Length (DSL) was analyzed using 1 mm and 10 mm thresholds of daily rainfall, and the time period 1988-2007 was defined for the study. A procedure for improving the consistency and homogeneity between the two datasets was applied. The assessment is based on distributional similarity, and well-known statistical tests (the two-sample Kolmogorov-Smirnov and Chi-Square tests) are used as fitting criteria. The results demonstrate a good fit of PERSIANN-CDR over the whole of Spain for the 10 mm/day threshold. However, for the 1 mm/day threshold, PERSIANN-CDR compares well with the Spain02 dataset in areas with high rainfall (the North of Spain), while in semiarid areas (the South East of Spain) there is strong overestimation of short DSLs. Overall, PERSIANN-CDR demonstrates its robustness in the simulation of DSLs for the highest thresholds.

  5. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  6. Test-Retest Reliability and Concurrent Validity of a Single Tri-Axial Accelerometer-Based Gait Analysis in Older Adults with Normal Cognition

    PubMed Central

    Byun, Seonjeong; Han, Ji Won; Kim, Tae Hui; Kim, Ki Woong

    2016-01-01

    Objective We investigated the concurrent validity and test-retest reliability of spatio-temporal gait parameters measured with a single tri-axial accelerometer (TAA), determined the optimal number of steps required for obtaining acceptable levels of reliability, and compared the validity and reliability of the estimated gait parameters across the three reference axes of the TAA. Methods A total of 82 cognitively normal elderly participants walked around a 40-m long round walkway twice wearing a TAA at their center of body mass. Gait parameters such as cadence, gait velocity, step time, step length, step time variability, and step time asymmetry were estimated from the low pass-filtered signal of the TAA. The test-retest reliability and concurrent validity with the GAITRite® system were evaluated for the estimated gait parameters. Results Gait parameters using signals from the vertical axis showed excellent reliability for all gait parameters; the intraclass correlation coefficient (ICC) was 0.79–0.90. A minimum of 26 steps and 14 steps were needed to achieve excellent reliability in step time variability and step time asymmetry, respectively. A strong level of agreement was seen for the basic gait parameters between the TAA and GAITRite® (ICC = 0.91–0.96). Conclusions The measurement of gait parameters of elderly individuals with normal cognition using a TAA placed on the body’s center of mass was reliable and showed superiority over the GAITRite® with regard to gait variability and asymmetry. The TAA system was a valid tool for measuring basic gait parameters. Considering its wearability and low price, the TAA system may be a promising alternative to the pressure sensor walkway system for measuring gait parameters. PMID:27427965

  7. Managing Reliability in the 21st Century

    SciTech Connect

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance, and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of value added to the organization's business objectives.

  8. Reliability of health-related quality-of-life indicators in cancer survivors from a population-based sample, 2005, BRFSS

    PubMed Central

    Kapp, J.M.; Jackson-Thompson, J.; Petroski, G.F.; Schootman, M.

    2009-01-01

    Summary Objective The current emphasis in cancer survivorship research, which includes health-related quality of life (HRQoL), drives the need to monitor the nation’s cancer burden. Routine, ongoing public health surveillance tools, such as the Behavioral Risk Factor Surveillance System (BRFSS), may be relevant for this purpose. Study design A subsample of the 2005 Missouri BRFSS was used to estimate test–retest reliability of HRQoL questions among persons who did and did not report a personal cancer history. Methods Retest interviews were conducted by telephone 14–21 days after the initial data collection (n=540, 67% response rate). Reliability was estimated overall and by cancer history using intraclass correlation coefficients (ICCs) and kappa statistics. Results The majority of retest respondents were White, female and married, with 13% reporting a history of cancer. Overall, point estimates of the reliability coefficients ranged from moderate to excellent (κ=0.57–0.75). There were no statistically significant differences in test–retest reliability between persons with and without a history of cancer, except for self-reported pain (ICC=0.59 and ICC=0.78, respectively). Conclusions In general, BRFSS questions appear to have adequate reliability for monitoring HRQoL in this community-dwelling population, regardless of cancer history. PMID:19081117
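
    A minimal sketch of a two-way random-effects, single-measure intraclass correlation, ICC(2,1), of the kind used for the test-retest estimates above; this is the textbook Shrout-Fleiss formula with invented toy data:

      import numpy as np

      def icc_2_1(Y):
          """Y: (n_subjects, k_occasions) score matrix; returns ICC(2,1)."""
          Y = np.asarray(Y, dtype=float)
          n, k = Y.shape
          grand = Y.mean()
          ms_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
          ms_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # occasions
          resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0) + grand
          ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                       + k * (ms_cols - ms_err) / n)

      # Toy test-retest data: 5 respondents scored on two occasions.
      print(icc_2_1([[4, 4], [3, 2], [5, 5], [2, 3], [4, 5]]))  # ~0.79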

  9. Photovoltaic system reliability

    SciTech Connect

    Maish, A.B.; Atcitty, C.; Greenberg, D.

    1997-10-01

    This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.

  10. Novel method for constructing a large-scale design space in lubrication process by using Bayesian estimation based on the reliability of a scale-up rule.

    PubMed

    Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo

    2012-01-01

    A reliable large-scale design space was constructed by integrating the reliability of a scale-up rule into the Bayesian estimation without enforcing a large-scale design of experiments (DoE). A small-scale DoE was conducted using various Froude numbers (X(1)) and blending times (X(2)) in the lubricant blending process for theophylline tablets. The response surfaces, design space, and their reliability of the compression rate of the powder mixture (Y(1)), tablet hardness (Y(2)), and dissolution rate (Y(3)) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. A constant Froude number was applied as a scale-up rule. Experiments were conducted at four different small scales with the same Froude number and blending time in order to determine the discrepancies in the response variables between the scales so as to indicate the reliability of the scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The response surfaces on the small scale were corrected to those on the large scale by Bayesian estimation using the large-scale results and the reliability of the scale-up rule. Large-scale experiments performed under three additional sets of conditions showed that the corrected design space was more reliable than the small-scale design space even when there was some discrepancy in the pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a commercial large manufacturing scale. PMID:22976324
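
    A heavily simplified sketch of the idea above: treat the small-scale prediction as a prior whose variance reflects the reliability of the scale-up rule, then update it with the few large-scale runs via a conjugate normal model. The one-dimensional reduction and all numbers are assumptions; the paper used multivariate spline surfaces and bootstrap reliability estimates:

      import numpy as np

      def bayes_update_normal(prior_mean, prior_var, data, data_var):
          """Conjugate normal update: prior N(m0, v0), likelihood N(mean | v) per point."""
          n = len(data)
          post_var = 1.0 / (1.0 / prior_var + n / data_var)
          post_mean = post_var * (prior_mean / prior_var + np.sum(data) / data_var)
          return post_mean, post_var

      # The small-scale surface predicts an 82% dissolution rate; the scale-up
      # rule is unreliable, so the prior is wide. Three large-scale runs correct it.
      m, v = bayes_update_normal(prior_mean=82.0, prior_var=16.0,
                                 data=[78.5, 79.2, 77.9], data_var=4.0)
      print(m, v)  # posterior pulled toward the large-scale observations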

  11. Reliability of delirium rating scale (DRS) and delirium rating scale-revised-98 (DRS-R98) using variance-based multivariate modelling.

    PubMed

    Adamis, Dimitrios; Slor, Chantal J; Leonard, Maeve; Witlox, Joost; de Jonghe, Jos F M; Macdonald, Alastair J D; Trzepacz, Paula; Meagher, David

    2013-07-01

    Delirium's characteristic fluctuation in symptom severity complicates the assessment of the test-retest reliability of scales using classical analyses, but the application of modelling to longitudinal data offers a new approach. We evaluated the test-retest reliability of the delirium rating scale (DRS) and the delirium rating scale-revised-98 (DRS-R98), two widely used instruments with high validity and inter-rater reliability. Two existing longitudinal datasets for each scale included DSM-IV criteria for delirium diagnosis and repeated measurements using the DRS or DRS-R98. To estimate the reliability coefficients RT and RΛ for each scale, we used macros provided by Dr. Laenen at http://www.ibiostat.be/software/measurement.asp. For each dataset, a linear mixed-effects model was fitted to estimate the variance-covariance parameters. The datasets comprised a total of 531 cases with between 4 and 9 measurement points across studies, including both delirious and non-delirious patients. Comorbid dementia in the datasets varied from 27% to 55%. Overall RT values for the DRS were 0.71 and 0.50, and for the DRS-R98, 0.75 and 0.84. RΛ values for the DRS were 0.99 and 0.98, and for the DRS-R98, 0.92 and 0.96. Individual RT measures for the DRS-R98 and DRS across visits within studies showed more range than the overall values. Our models found high overall reliability for both scales. Multiple factors impact a scale's reliability values, including sample size, repeated measurements, and patient population, in addition to rater variability. PMID:23522935

  12. Reliable measures of behaviorally-evoked cardiovascular reactivity from a PC-based test battery: results from student and community samples.

    PubMed

    Kamarck, T W; Jennings, J R; Debski, T T; Glickman-Weiss, E; Johnson, P S; Eddy, M J; Manuck, S B

    1992-01-01

    This paper describes efforts to reduce measurement error in the assessment of cardiovascular reactivity by standardizing task requirements and by aggregating data across tasks and testing sessions. Using these methods, reliable measures of reactivity (.80 or greater) were obtained on five different measures of cardiovascular function (heart rate, systolic blood pressure, diastolic blood pressure, stroke volume, pre-ejection period) in samples of college students and community volunteers. Methodological limitations may have hampered previous efforts in this area. Current findings are consistent with a dispositional model of cardiovascular reactivity, and they suggest productive future strategies for obtaining reliable assessments. PMID:1609024
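
    The gain from aggregating across tasks and sessions reported above follows the classical Spearman-Brown prophecy formula; a one-line sketch with toy numbers:

      def spearman_brown(r_single, k):
          """Reliability of the mean of k parallel measurements, each with r_single."""
          return k * r_single / (1.0 + (k - 1) * r_single)

      # A single-task reactivity score with r = 0.5 reaches 0.80 when
      # aggregated over four tasks/sessions.
      print(spearman_brown(0.5, 4))  # 0.8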

  13. Reliability Reporting Practices in Rape Myth Research.

    ERIC Educational Resources Information Center

    Buhi, Eric R.

    2005-01-01

    A number of school-based programs address sexual violence by focusing on adolescents' attitudes about rape or acceptance of rape myths. However, many problems exist in the literature regarding measurement of rape myth acceptance, including issues of reliability and validity. This paper addresses measurement reliability issues and reviews…

  14. Reliability of telescopes for the lunar surface

    NASA Astrophysics Data System (ADS)

    Benaroya, Haym

    1995-02-01

    The subject of risk and reliability for lunar structures, in particular lunar-based telescopes, is introduced and critical issues are deliberated. General discussions are made more specific regarding the lunar telescope, but this paper provides a framework for further quantitative reliability studies.

  15. Reliability assurance for regulation of advanced reactors

    SciTech Connect

    Fullwood, R.; Lofaro, R.; Samanta, P.

    1991-01-01

    The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics.

  16. Reliability assurance for regulation of advanced reactors

    SciTech Connect

    Fullwood, R.; Lofaro, R.; Samanta, P.

    1991-12-31

    The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics.

  17. An asymptotic approach for assessing fatigue reliability

    SciTech Connect

    Tang, J.

    1996-12-01

    By applying cumulative fatigue damage theory to the random process reliability problem and introducing a new concept of unified equivalent stress level in fatigue life prediction, a technical reliability model for the random process reliability problem under fatigue failure is proposed. The technical model emphasizes efficiency in the design choice and also focuses on the accuracy of the results. Based on this model, an asymptotic method for fatigue reliability under stochastic process loadings is developed. The proposed method uses a recursive iteration algorithm to obtain results that include reliability and the corresponding life. The method reconciles the requirements of accuracy and efficiency for random process reliability problems under fatigue failure. The accuracy and the analytical and numerical efforts required are compared. Through a numerical example, the advantage of the proposed method is demonstrated.
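
    Cumulative fatigue damage theory, on which the model above builds, is most commonly instantiated by the Palmgren-Miner linear rule, sketched below with invented cycle counts (a standard textbook rule, not the paper's asymptotic method):

      def miner_damage(cycle_counts, cycles_to_failure):
          """Palmgren-Miner damage D = sum(n_i / N_i); failure predicted at D >= 1."""
          return sum(n / N for n, N in zip(cycle_counts, cycles_to_failure))

      # Two stress levels: n_i cycles applied vs. N_i cycles to failure at that level.
      D = miner_damage(cycle_counts=[2e5, 5e4],
                       cycles_to_failure=[1e6, 2e5])
      print(D, "-> fails" if D >= 1 else "-> survives")  # 0.45 -> survives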

  18. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The longterm goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they have correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data and replication of data was recommended.

  19. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  20. A Theory-Based Comparison of the Reliabilities of Fixed-Length and Trials-to-Criterion Scoring of Physical Education Skills Tests.

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Spray, Judith A.

    1983-01-01

    The reliabilities of two types of measurement plans were compared across six hypothetical distributions of true scores or abilities. The measurement plans were: (1) fixed-length, where the number of trials for all examinees is set in advance; and (2) trials-to-criterion, where examinees must keep trying until they complete a given number of trials…

  1. Adaptation of the Boundary Violations Scale Developed Based on Structural Family Therapy to the Turkish Context: A Study of Validity and Reliability

    ERIC Educational Resources Information Center

    Avci, Rasit; Çolakkadioglu, Oguzhan; Öz, Aysegül Sükran; Akbas, Turan

    2015-01-01

    The purpose of this study was to adapt "The Boundary Violations Scale" (Madden et al., 2002), which was created to measure the intergenerational boundary violations in families from the perspective of children, to Turkish and to test the validity and reliability of the Turkish version of this instrument. This instrument was developed…

  2. Eco-friendly ionic liquid based ultrasonic assisted selective extraction coupled with a simple liquid chromatography for the reliable determination of acrylamide in food samples.

    PubMed

    Albishri, Hassan M; El-Hady, Deia Abd

    2014-01-01

    Acrylamide in food has drawn worldwide attention since 2002 due to its neurotoxic and carcinogenic effects. These influences brought out the dual polar and non-polar characters of acrylamide, as they enabled it to dissolve in the aqueous blood medium or penetrate the non-polar plasma membrane. In the current work, a simple HPLC/UV system was used to reveal that the penetration of acrylamide into the non-polar phase was stronger than its dissolution in the polar phase. The presence of phosphate salts in the polar phase reduced the acrylamide interaction with the non-polar phase. Furthermore, an eco-friendly and costless coupling of the HPLC/UV with ionic liquid based ultrasonic assisted extraction (ILUAE) was developed to determine the acrylamide content in food samples. ILUAE was proposed for the efficient extraction of acrylamide from bread and potato chips samples. The extracts were obtained by soaking potato chips and bread samples in 1.5 mol L⁻¹ 1-butyl-3-methylimidazolium bromide (BMIMBr) for 30.0 and 60.0 min, respectively, with subsequent chromatographic separation within 12.0 min using a Luna C18 column and a 100% water mobile phase at 0.5 mL min⁻¹ under a 25 °C column temperature at 250 nm. The extraction and analysis of acrylamide could be achieved within 2 h. The mean extraction efficiency of acrylamide showed adequate repeatability, with a relative standard deviation (RSD) of 4.5%. The limit of detection and limit of quantitation were 25.0 and 80.0 ng mL⁻¹, respectively. The accuracy of the proposed method was tested by recovery in seven food samples, giving values ranging between 90.6% and 109.8%. The methodology was thus successfully validated by official guidelines, indicating its reliability for the analysis of real samples and proving it useful for its intended purpose. Moreover, it serves as a simple, eco-friendly and costless alternative to hitherto reported methods. PMID:24274280

  3. LONG-TERM RELIABILITY OF AL2O3 AND PARYLENE C BILAYER ENCAPSULATED UTAH ELECTRODE ARRAY BASED NEURAL INTERFACES FOR CHRONIC IMPLANTATION

    PubMed Central

    Xie, Xianzong; Rieth, Loren; Williams, Layne; Negi, Sandeep; Bhandari, Rajmohan; Caldwell, Ryan; Sharma, Rohit; Tathireddy, Prashant; Solzbacher, Florian

    2014-01-01

    Objective We focus on improving the long-term stability and functionality of neural interfaces for chronic implantation by using bilayer encapsulation. Approach We evaluated the long-term reliability of Utah electrode array (UEA) based neural interfaces encapsulated by 52 nm of atomic layer deposited (ALD) Al2O3 and 6 μm of Parylene C bilayer, and compared these to devices with the baseline Parylene-only encapsulation. Three variants of arrays including wired, wireless, and active UEAs were used to evaluate this bilayer encapsulation scheme, and were immersed in phosphate buffered saline (PBS) at 57 °C for accelerated lifetime testing. Main results The median tip impedance of the bilayer encapsulated wired UEAs increased from 60 kΩ to 160 kΩ during the 960 days of equivalent soak testing at 37 °C, the opposite trend as typically observed for Parylene encapsulated devices. The loss of the iridium oxide tip metallization and etching of the silicon tip in PBS solution contributed to the increase of impedance. The lifetime of fully integrated wireless UEAs was also tested using accelerated lifetime measurement techniques. The bilayer coated devices had stable power-up frequencies at ~910 MHz and constant RF signal strength of -50 dBm during up to 1044 days (still under testing) of equivalent soaking time at 37 °C. This is a significant improvement over the lifetime of ~ 100 days achieved with Parylene-only encapsulation at 37 °C. The preliminary samples of bilayer coated active UEAs with a flip-chip bonded ASIC chip had a steady current draw of ~ 3 mA during 228 days of soak testing at 37 °C. An increase in current draw has been consistently correlated to device failures, so is a sensitive metric for their lifetime. Significance The trends of increasing electrode impedance of wired devices and performance stability of wireless and active devices support the significantly greater encapsulation performance of this bilayer encapsulation compared with Parylene

  4. Long-term reliability of Al2O3 and Parylene C bilayer encapsulated Utah electrode array based neural interfaces for chronic implantation

    NASA Astrophysics Data System (ADS)

    Xie, Xianzong; Rieth, Loren; Williams, Layne; Negi, Sandeep; Bhandari, Rajmohan; Caldwell, Ryan; Sharma, Rohit; Tathireddy, Prashant; Solzbacher, Florian

    2014-04-01

    Objective. We focus on improving the long-term stability and functionality of neural interfaces for chronic implantation by using bilayer encapsulation. Approach. We evaluated the long-term reliability of Utah electrode array (UEA) based neural interfaces encapsulated by 52 nm of atomic layer deposited Al2O3 and 6 µm of Parylene C bilayer, and compared these to devices with the baseline Parylene-only encapsulation. Three variants of arrays including wired, wireless, and active UEAs were used to evaluate this bilayer encapsulation scheme, and were immersed in phosphate buffered saline (PBS) at 57 °C for accelerated lifetime testing. Main results. The median tip impedance of the bilayer encapsulated wired UEAs increased from 60 to 160 kΩ during the 960 days of equivalent soak testing at 37 °C, the opposite trend to that typically observed for Parylene encapsulated devices. The loss of the iridium oxide tip metallization and etching of the silicon tip in PBS solution contributed to the increase of impedance. The lifetime of fully integrated wireless UEAs was also tested using accelerated lifetime measurement techniques. The bilayer coated devices had stable power-up frequencies at ˜910 MHz and constant radio-frequency signal strength of -50 dBm during up to 1044 days (still under testing) of equivalent soaking time at 37 °C. This is a significant improvement over the lifetime of ˜100 days achieved with Parylene-only encapsulation at 37 °C. The preliminary samples of bilayer coated active UEAs with a flip-chip bonded ASIC chip had a steady current draw of ˜3 mA during 228 days of soak testing at 37 °C. An increase in the current draw has been consistently correlated to device failures, so is a sensitive metric for their lifetime. Significance. The trends of increasing electrode impedance of wired devices and performance stability of wireless and active devices support the significantly greater encapsulation performance of this bilayer encapsulation compared
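
    The "equivalent soaking time at 37 °C" reported in both versions of this record comes from accelerated aging at 57 °C. The abstracts do not state the acceleration model; a common choice is an Arrhenius factor with an assumed activation energy, sketched here (Ea = 0.8 eV is an illustrative assumption, not the authors' value):

        import math

        k_B = 8.617e-5   # Boltzmann constant, eV/K

        def acceleration_factor(T_stress_C, T_use_C, Ea_eV=0.8):
            # Arrhenius acceleration between a stress soak temperature and the
            # in-vivo reference temperature.
            Ts, Tu = T_stress_C + 273.15, T_use_C + 273.15
            return math.exp((Ea_eV / k_B) * (1.0 / Tu - 1.0 / Ts))

        AF = acceleration_factor(57.0, 37.0)
        print(f"AF = {AF:.1f}: 100 days soaking at 57 degC is roughly "
              f"{100 * AF:.0f} equivalent days at 37 degC")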

  5. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  6. Photovoltaic module reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still needed to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  7. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements demand understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided, as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends on the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  8. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended-duration orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland, because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability of the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.

  9. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still needed to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  10. A short food-group-based dietary questionnaire is reliable and valid for assessing toddlers' dietary risk in relatively advantaged samples.

    PubMed

    Bell, Lucinda K; Golley, Rebecca K; Magarey, Anthea M

    2014-08-28

    Identifying toddlers at dietary risk is crucial for determining who requires intervention to improve dietary patterns and reduce health consequences. The objectives of the present study were to develop a simple tool that assesses toddlers' dietary risk and investigate its reliability and validity. The nineteen-item Toddler Dietary Questionnaire (TDQ) is informed by dietary patterns observed in Australian children aged 14 (n 552) and 24 (n 493) months and the Australian dietary guidelines. It assesses the intake of 'core' food groups (e.g. fruit, vegetables and dairy products) and 'non-core' food groups (e.g. high-fat, high-sugar and/or high-salt foods and sweetened beverages) over the previous 7 d, which is then scored against a dietary risk criterion (0-100; higher score = higher risk). Parents of toddlers aged 12-36 months (Socio-Economic Index for Areas decile range 5-9) were asked to complete the TDQ for their child (n 111) on two occasions, 3·2 (SD 1·8) weeks apart, to assess test-retest reliability. They were also asked to complete a validated FFQ from which the risk score was calculated and compared with the TDQ-derived risk score (relative validity). Mean scores were highly correlated and not significantly different for reliability (intra-class correlation = 0·90, TDQ1 30·2 (SD 8·6) v. TDQ2 30·9 (SD 8·9); P= 0·14) and validity (r 0·83, average TDQ ((TDQ1+TDQ2)/2) 30·5 (SD 8·4) v. FFQ 31·4 (SD 8·1); P= 0·05). All the participants were classified into the same (reliability 75 %; validity 79 %) or adjacent (reliability 25 %; validity 21 %) risk category (low (0-24), moderate (25-49), high (50-74) and very high (75-100)). Overall, the TDQ is a valid and reliable screening tool for identifying at-risk toddlers in relatively advantaged samples. PMID:24886781
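
    The test-retest statistic quoted above is an intra-class correlation. A minimal sketch of the two-way random-effects, absolute-agreement form ICC(2,1) on toy TDQ-style risk scores (the data below are invented for illustration, not the study's):

        import numpy as np

        def icc_2_1(scores):
            """ICC(2,1): two-way random effects, absolute agreement, single
            measurement. scores has shape (subjects, occasions)."""
            n, k = scores.shape
            grand = scores.mean()
            ms_r = k * np.sum((scores.mean(axis=1) - grand) ** 2) / (n - 1)
            ms_c = n * np.sum((scores.mean(axis=0) - grand) ** 2) / (k - 1)
            resid = (scores - scores.mean(axis=1, keepdims=True)
                     - scores.mean(axis=0, keepdims=True) + grand)
            ms_e = np.sum(resid ** 2) / ((n - 1) * (k - 1))
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

        # Toy test-retest data: risk scores for 8 toddlers on two occasions.
        scores = np.array([[28, 30], [35, 33], [22, 24], [41, 40],
                           [30, 31], [26, 25], [38, 39], [19, 21]], float)
        print(f"ICC(2,1) = {icc_2_1(scores):.2f}")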

  11. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering, nor fault-tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes that part of these models' failures can be attributed to the random nature of the debugging data given to them as input, and it poses the correction of this defect as an area of future research.

  12. Effects of Grain Boundary and Segregated-Phases on Reliability of Ba(Ti,Zr)O3-Based Ni Electrode MLCs

    NASA Astrophysics Data System (ADS)

    Okamatsu, Toshihiro; Sano, Harunobu; Takagi, Hiroshi

    2005-09-01

    The reliability in the highly accelerated life test (HALT) of multilayer ceramic capacitors (MLCs) composed of the Ba(Ti,Zr)O3-Gd-Mg-Mn-Si system was markedly improved by controlling the amount of SiO2 and the ratio of Ba/(Ti+Zr). When the reliability was improved, we observed the following characteristics in the microstructures: (1) the segregated phases, which consisted of Mg oxides and Si oxides, were well dispersed in the dielectric layers and considerably small in size; (2) the corrosion resistance of the grain boundaries was improved; (3) the electric potential, observed as contrast in focused ion beam scanning ion microscopy, was homogeneous in both the grains and the grain boundaries. These microstructures can be designed by controlling the additives to generate BaMg2Si2O7 and BaSi2O5, which have low melting points.

  13. Gearbox Reliability Collaborative Bearing Calibration

    SciTech Connect

    van Dam, J.

    2011-10-01

    NREL has initiated the Gearbox Reliability Collaborative (GRC) to investigate the root causes of low wind turbine gearbox reliability. The GRC follows a multi-pronged approach based on a collaborative of manufacturers, owners, researchers, and consultants. The project combines analysis, field testing, dynamometer testing, condition monitoring, and the development and population of a gearbox failure database. At the core of the project are two 750 kW gearboxes that have been redesigned and rebuilt so that they are representative of the multi-megawatt gearbox topology currently used in the industry. These gearboxes are heavily instrumented and are tested in the field and on the dynamometer. This report discusses the bearing calibrations of the gearboxes.

  14. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  15. Materials reliability issues in microelectronics

    SciTech Connect

    Lloyd, J.R.; Yost, F.G.; Ho, P.S.

    1991-01-01

    This book covers the proceedings of a MRS symposium on materials reliability in microelectronics. Topics include: electromigration; stress effects on reliability; stress and packaging; metallization; device, oxide and dielectric reliability; new investigative techniques; and corrosion.

  16. Reliability techniques in the petroleum industry

    NASA Technical Reports Server (NTRS)

    Williams, H. L.

    1971-01-01

    Quantitative reliability evaluation methods used in the Apollo Spacecraft Program are translated into petroleum industry requirements with emphasis on offsetting reliability demonstration costs and limited production runs. Described are the qualitative disciplines applicable, the definitions and criteria that accompany the disciplines, and the generic application of these disciplines to the chemical industry. The disciplines are then translated into proposed definitions and criteria for the industry, into a base-line reliability plan that includes these disciplines, and into application notes to aid in adapting the base-line plan to a specific operation.

  17. Test–retest reliability of self-reported diabetes diagnosis in the Norwegian Women and Cancer Study: A population-based longitudinal study (n = 33,919)

    PubMed Central

    Sheikh, Mashhood Ahmed; Lund, Eiliv; Braaten, Tonje

    2016-01-01

    Objective: Self-reported information from questionnaires is frequently used in epidemiological studies, but few of these studies provide information on the reproducibility of individual items contained in the questionnaire. We studied the test–retest reliability of self-reported diabetes among 33,919 participants in Norwegian Women and Cancer Study. Methods: The test–retest reliability of self-reported type 1 and type 2 diabetes diagnoses was evaluated between three self-administered questionnaires (completed in 1991, 1998, and 2005 by Norwegian Women and Cancer participants) by kappa agreement. The time interval between the test–retest studies was ~7 and ~14 years. Sensitivity of the kappa agreement for type 1 and type 2 diabetes diagnoses was assessed. Subgroup analysis was performed to assess whether test–retest reliability varies with age, body mass index, physical activity, education, and smoking status. Results: The kappa agreement for both types of self-reported diabetes diagnoses combined was good (⩾0.65) for all three test–retest studies (1991–1998, 1991–2005, and 1998–2005). The kappa agreement for type 1 diabetes was good (⩾0.73) in the 1991–2005 and the 1998–2005 test–retest studies, and very good (0.83) in the 1991–1998 test–retest study. The kappa agreement for type 2 diabetes was moderate (0.57) in the 1991–2005 test–retest study and good (⩾0.66) in the 1991–1998 and 1998–2005 test–retest studies. The overall kappa agreement in the 1991–1998 test–retest study was stronger than in the 1991–2005 test–retest study and the 1998–2005 test–retest study. There was no clear pattern of inconsistency in the kappa agreements within different strata of age, BMI, physical activity, and smoking. The kappa agreement was strongest among the respondents with 17 or more years of education, while generally it was weaker among the least educated group. Conclusion: The test–retest reliability of the diabetes was
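
    The kappa statistic used throughout this study corrects observed agreement for agreement expected by chance. A minimal sketch for a binary self-reported diagnosis (the toy responses below are invented, not the study's data):

        import numpy as np

        def cohens_kappa(x, y):
            """Test-retest kappa for two binary reports (1 = diabetes, 0 = none)."""
            x, y = np.asarray(x), np.asarray(y)
            p_obs = np.mean(x == y)                       # observed agreement
            p_exp = (np.mean(x) * np.mean(y)              # chance agreement on "yes"
                     + (1 - np.mean(x)) * (1 - np.mean(y)))  # ... and on "no"
            return (p_obs - p_exp) / (1 - p_exp)

        t1991 = [0]*14 + [1]*6                     # first questionnaire
        t1998 = [0]*13 + [1, 1, 1, 1, 1, 0, 1]     # same women, ~7 years later
        print(f"kappa = {cohens_kappa(t1991, t1998):.2f}")   # ~0.76, "good"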

  18. Reliability and Validity of a New Test of Change-of-Direction Speed for Field-Based Sports: the Change-of-Direction and Acceleration Test (CODAT)

    PubMed Central

    Lockie, Robert G.; Schultz, Adrian B.; Callaghan, Samuel J.; Jeffriess, Matthew D.; Berry, Simon P.

    2013-01-01

    Field sport coaches must use reliable and valid tests to assess change-of-direction speed in their athletes. Few tests feature linear sprinting with acute change-of-direction maneuvers. The Change-of-Direction and Acceleration Test (CODAT) was designed to assess field sport change-of-direction speed, and includes a linear 5-meter (m) sprint, 45° and 90° cuts, 3-m sprints to the left and right, and a linear 10-m sprint. This study analyzed the reliability and validity of this test, through comparisons to 20-m sprint (0-5, 0-10, 0-20 m intervals) and Illinois agility run (IAR) performance. Eighteen Australian footballers (age = 23.83 ± 7.04 yrs; height = 1.79 ± 0.06 m; mass = 85.36 ± 13.21 kg) were recruited. Following familiarization, subjects completed the 20-m sprint, CODAT, and IAR in 2 sessions, 48 hours apart. Intra-class correlation coefficients (ICC) assessed relative reliability. Absolute reliability was analyzed through paired samples t-tests (p ≤ 0.05) determining between-session differences. Typical error (TE), coefficient of variation (CV), and differences between the TE and smallest worthwhile change (SWC), also assessed absolute reliability and test usefulness. For the validity analysis, Pearson’s correlations (p ≤ 0.05) analyzed between-test relationships. Results showed no between-session differences for any test (p = 0.19-0.86). CODAT time averaged ~6 s, and the ICC and CV equaled 0.84 and 3.0%, respectively. The homogeneous sample of Australian footballers meant that the CODAT’s TE (0.19 s) exceeded the usual 0.2 x standard deviation (SD) SWC (0.10 s). However, the CODAT is capable of detecting moderate performance changes (SWC calculated as 0.5 x SD = 0.25 s). There was a near perfect correlation between the CODAT and IAR (r = 0.92), and very large correlations with the 20-m sprint (r = 0.75-0.76), suggesting that the CODAT was a valid change-of-direction speed test. Due to movement specificity, the CODAT has value for field sport
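
    The absolute-reliability quantities in this abstract — typical error (TE), coefficient of variation (CV), and smallest worthwhile change (SWC) — follow standard definitions and are easy to compute from paired sessions. A sketch with invented times (not the study's raw data):

        import numpy as np

        s1 = np.array([5.8, 6.1, 5.9, 6.3, 6.0, 5.7, 6.2, 6.4])  # session 1 times (s)
        s2 = np.array([5.9, 6.0, 6.0, 6.2, 6.1, 5.8, 6.1, 6.3])  # session 2 times (s)

        te = np.std(s2 - s1, ddof=1) / np.sqrt(2)   # typical error of measurement
        cv = 100 * te / np.mean((s1 + s2) / 2)      # CV as a percentage
        sd = np.std(s1, ddof=1)
        swc_small, swc_moderate = 0.2 * sd, 0.5 * sd
        print(f"TE = {te:.3f} s, CV = {cv:.1f}%, "
              f"SWC = {swc_small:.3f} s (0.2xSD) / {swc_moderate:.3f} s (0.5xSD)")

    As in the study, a test is judged able to detect a change of a given magnitude when its TE is smaller than the corresponding SWC; with these toy numbers the TE falls between the 0.2xSD and 0.5xSD thresholds, mirroring the abstract's conclusion.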

  19. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis.

  1. Software Reliability Measurement Experience

    NASA Technical Reports Server (NTRS)

    Nikora, A. P.

    1993-01-01

    In this chapter, we describe a recent study of software reliability measurement methods that was conducted at the Jet Propulsion Laboratory. The first section of the chapter, section 8.1, summarizes the study, characterizes the participating projects, describes the available data, and summarizes the study's results.

  2. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  3. Nonparametric Methods in Reliability

    PubMed Central

    Hollander, Myles; Peña, Edsel A.

    2005-01-01

    Probabilistic and statistical models for the occurrence of a recurrent event over time are described. These models have applicability in the reliability, engineering, biomedical and other areas where a series of events occurs for an experimental unit as time progresses. Nonparametric inference methods, in particular, the estimation of a relevant distribution function, are described. PMID:16710444

  4. The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps.

    PubMed

    Balsalobre-Fernández, Carlos; Tejero-González, Carlos M; del Campo-Vecino, Juan; Bavaresco, Nicolás

    2014-02-01

    Flight time is the most accurate and frequently used variable when assessing the height of vertical jumps. The purpose of this study was to analyze the validity and reliability of an alternative method (i.e., the HSC-Kinovea method) for measuring the flight time and height of vertical jumping using a low-cost high-speed Casio Exilim FH-25 camera (HSC). To this end, 25 subjects performed a total of 125 vertical jumps on an infrared (IR) platform while simultaneously being recorded with a HSC at 240 fps. Subsequently, 2 observers with no experience in video analysis analyzed the 125 videos independently using the open-license Kinovea 0.8.15 software. The flight times obtained were then converted into vertical jump heights, and the intraclass correlation coefficient (ICC), Bland-Altman plot, and Pearson correlation coefficient were calculated for those variables. The results showed a perfect correlation agreement (ICC = 1, p < 0.0001) between both observers' measurements of flight time and jump height and a highly reliable agreement (ICC = 0.997, p < 0.0001) between the observers' measurements of flight time and jump height using the HSC-Kinovea method and those obtained using the IR system, thus explaining 99.5% (p < 0.0001) of the differences (shared variance) obtained using the IR platform. As a result, besides requiring no previous experience in the use of this technology, the HSC-Kinovea method can be considered to provide similarly valid and reliable measurements of flight time and vertical jump height as more expensive equipment (i.e., IR). As such, coaches from many sports could use the HSC-Kinovea method to measure the flight time and height of their athlete's vertical jumps. PMID:23689339
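
    The conversion from flight time to jump height used by both the IR platform and the HSC-Kinovea method follows from projectile motion: the jumper rises for half the flight time, so h = g * t^2 / 8. A one-function sketch:

        G = 9.81  # gravitational acceleration, m/s^2

        def jump_height_cm(t_flight_s):
            """Vertical jump height from flight time: h = g * t^2 / 8."""
            return 100.0 * G * t_flight_s ** 2 / 8.0

        # Example: a 0.55 s flight time (132 frames at 240 fps) gives ~37.1 cm.
        print(f"{jump_height_cm(0.55):.1f} cm")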

  5. Reliability in individual monitoring service.

    PubMed

    Mod Ali, N

    2011-03-01

    As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated an overall comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. By having these mechanisms, reliability of the IMS can be assured in the promotion of safe behaviour at all levels of the workforce utilising ionising radiation facilities. Upgrading of the reporting program through the web-based e-SSDL marks a major improvement in the reliability of Nuclear Malaysia's IMS as a whole. The system is a vital step in providing a user-friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring of the IMS and thus enhances the status of the radiation protection framework of the country. PMID:21147789

  6. Development of a New Ultrasound-Based System for Tracking Motion of the Human Lumbar Spine: Reliability, Stability and Repeatability during Forward Bending Movement Trials.

    PubMed

    Cuesta-Vargas, Antonio I

    2015-07-01

    The aim of this study was to develop a new method for quantifying intersegmental motion of the spine in an instrumented motion segment L4-L5 model using ultrasound image post-processing combined with an electromagnetic device. A prospective test-retest design was employed, combined with an evaluation of stability and within- and between-day intra-tester reliability during forward bending by 15 healthy male patients. The accuracy of the measurement system using the model was calculated to be ± 0.9° (standard deviation = 0.43) over a 40° range and ± 0.4 cm (standard deviation = 0.28) over 1.5 cm. The mean composite range of forward bending was 15.5 ± 2.04° during a single trial (standard error of the mean = 0.54, coefficient of variation = 4.18). Reliability (intra-class correlation coefficient (2,1)) was found to be excellent for both within-day measures (0.995-0.999) and between-day measures (0.996-0.999). Further work is necessary to explore the use of this approach in the evaluation of biomechanics, clinical assessments and interventions. PMID:25864018

  7. Investigation of reliability method formulations in Dakota/UQ.

    SciTech Connect

    Renaud, John E.; Perez, Victor M.; Wojtkiewicz, Steven F., Jr.; Agarwal, H.; Eldred, Michael Scott

    2004-07-01

    Reliability methods are probabilistic algorithms for quantifying the effect of simulation input uncertainties on response metrics of interest. In particular, they compute approximate response function distribution statistics (probability, reliability and response levels) based on specified input random variable probability distributions. In this paper, a number of algorithmic variations are explored for both the forward reliability analysis of computing probabilities for specified response levels (the reliability index approach (RIA)) and the inverse reliability analysis of computing response levels for specified probabilities (the performance measure approach (PMA)). These variations include limit state linearizations, probability integrations, warm starting and optimization algorithm selections. The resulting RIA/PMA reliability algorithms for uncertainty quantification are then employed within bi-level and sequential reliability-based design optimization approaches. Relative performance of these uncertainty quantification and reliability-based design optimization algorithms are presented for a number of computational experiments performed using the DAKOTA/UQ software.
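
    For the simplest special case — a linear limit state g = R - S with independent normal capacity and demand — the forward (RIA) computation reduces to a closed form for the reliability index. The sketch below illustrates that case only, with assumed numbers; DAKOTA's algorithms generalize it to nonlinear limit states via the linearizations and probability integrations discussed in the paper.

        from math import sqrt
        from statistics import NormalDist

        mu_R, sd_R = 12.0, 1.5   # capacity mean and std dev (assumed)
        mu_S, sd_S = 8.0, 2.0    # demand mean and std dev (assumed)

        beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)  # reliability index
        p_f = NormalDist().cdf(-beta)                   # failure probability
        print(f"beta = {beta:.2f}, P(failure) = {p_f:.2e}")

    The inverse (PMA) analysis runs the same relationship the other way: given a target probability, solve for the response level that achieves it.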

  8. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons; (2) Space Shuttle Main Engine (SSME) Reliability Validated by a Million Seconds of Testing; (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control; and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  9. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  10. Data networks reliability

    NASA Astrophysics Data System (ADS)

    Gallager, Robert G.

    1988-10-01

    The research from 1984 to 1986 on Data Network Reliability had the objective of developing general principles governing the reliable and efficient control of data networks. The research was centered around three major areas: congestion control, multiaccess networks, and distributed asynchronous algorithms. The major topics within congestion control were the use of flow control to reduce congestion and the use of routing to reduce congestion. The major topics within multiaccess networks were the communication properties of multiaccess channels, collision resolution, and packet radio networks. The major topics within asynchronous distributed algorithms were failure recovery, time vs. communication tradeoffs, and the general theory of distributed algorithms.

  11. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  12. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  13. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  14. Reliability and testing

    NASA Technical Reports Server (NTRS)

    Auer, Werner

    1996-01-01

    Reliability and its interdependence with testing are important topics for the development and manufacturing of successful products. This generally accepted fact is not only a technical statement but must also be seen in the light of 'Human Factors.' While the background for this paper is the experience gained with electromechanical/electronic space products, including control and system considerations, it is believed that the content could also be of interest for other fields.

  15. Apollo experience report: Reliability and quality assurance

    NASA Technical Reports Server (NTRS)

    Sperber, K. P.

    1973-01-01

    The reliability of the Apollo spacecraft resulted from the application of proven reliability and quality techniques and from sound management, engineering, and manufacturing practices. Continual assessment of these techniques and practices was made during the program, and, when deficiencies were detected, adjustments were made and the deficiencies were effectively corrected. The most significant practices, deficiencies, adjustments, and experiences during the Apollo Program are described in this report. These experiences can be helpful in establishing an effective base on which to structure an efficient reliability and quality assurance effort for future space-flight programs.

  16. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  17. MOV reliability evaluation and periodic verification scheduling

    SciTech Connect

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature that affects safety-related MOVs.
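
    The margin-versus-uncertainty comparison described here has a simple probabilistic reading: if the thrust margin M is treated as normally distributed with standard deviation sigma, the MOV's reliability is Phi(M/sigma). A sketch with invented numbers, not values from any GL 89-10 evaluation:

        from statistics import NormalDist

        M = 250.0        # nominal (best-estimate) thrust margin, lbf (assumed)
        sigma_M = 120.0  # 1-sigma uncertainty on the margin, lbf (assumed)

        # Reliability = probability the realized margin stays positive.
        reliability = NormalDist().cdf(M / sigma_M)
        print(f"Estimated MOV reliability = {reliability:.3f}")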

  18. Operational reliability of standby safety systems

    SciTech Connect

    Grant, G.M.; Atwood, C.L.; Gentillon, C.D.

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) is evaluating the operational reliability of several risk-significant standby safety systems based on the operating experience at US commercial nuclear power plants from 1987 through 1993. The reliability assessed is the probability that the system will perform its Probabilistic Risk Assessment (PRA) defined safety function. The quantitative estimates of system reliability are expected to be useful in risk-based regulation. This paper is an overview of the analysis methods and the results of the high pressure coolant injection (HPCI) system reliability study. Key characteristics include (1) descriptions of the data collection and analysis methods, (2) the statistical methods employed to estimate operational unreliability, (3) a description of how the operational unreliability estimates were compared with typical PRA results, both overall and for each dominant failure mode, and (4) a summary of results of the study.
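
    Demand-based unreliability estimates of this kind are typically binomial: f failures in n unplanned demands. One common statistical treatment (not necessarily the INEL study's exact method) is a Jeffreys Beta(0.5, 0.5) prior, which gives a Beta posterior for the failure-on-demand probability:

        from scipy import stats

        f, n = 3, 64   # failures and demands (illustrative counts, not HPCI data)
        posterior = stats.beta(f + 0.5, n - f + 0.5)   # Jeffreys posterior
        lo, hi = posterior.ppf([0.05, 0.95])
        print(f"Unreliability: mean = {posterior.mean():.3f}, "
              f"90% interval = ({lo:.3f}, {hi:.3f})")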

  19. Reliability and failure mode investigation of high-power multimode InGaAs strained quantum well single emitters

    NASA Astrophysics Data System (ADS)

    Sin, Yongkun; Foran, Brendan; Presser, Nathan; Mason, Maribeth; Moss, Steven C.

    2007-02-01

    In recent years, record performance characteristics from multimode InGaAs strained quantum well single emitters at 920-980 nm have been reported, including a maximum CW optical output power of ~20 W and a power conversion efficiency of ~75%. These excellent performance characteristics are only possible through the combined optimization of laser structure design, chip fabrication processes, and packaging. Whereas broad-area multimode single emitters likely have sufficient reliability for industrial uses, the reliability of these lasers remains a concern for communications applications, including deployment in potential space satellite systems where high reliability is required. Most previous reports on these lasers have focused on their performance characteristics, with very limited reporting on failure mode analysis, although understanding the physics of failure is crucial to developing a proper lifetime model for these lasers. We thus report on the reliability and failure mode analysis of high-power multimode single emitters. The lasers studied were broad-area strained InGaAs single QW lasers at 940-980 nm with typical aperture widths of around 100 μm. At an injection current of 7 A, typical CW output powers were over 6 W at 25 °C with a wall-plug efficiency of ~60%. First, various lasing characteristics were measured, including spatial and thermal characteristics that are critical to understanding the performance and reliability of these devices. ACC burn-in tests with different stress conditions were performed on these devices until failure. We report accelerated lifetest results with over 5000 accumulated test hours. Finally, we report failure mode investigation results for the degraded lasers.

  20. Software reliability modeling and analysis

    NASA Technical Reports Server (NTRS)

    Scholz, F.-W.

    1986-01-01

    A discrete and, as an approximation to it, a continuous model for the software reliability growth process are examined. The discrete model is based on independent multinomial trials and concerns itself with the joint distribution of the first occurrence times of its underlying events (bugs). The continuous model is based on the order statistics of N independent, nonidentically distributed exponential random variables. It is shown that the spacings between bugs are not necessarily independent or exponentially (geometrically) distributed. However, there is a statistical rationale for viewing them so conditionally. Some identifiability problems are pointed out and resolved. In particular, it appears that the number of bugs in a program is not identifiable. Estimated upper bounds and confidence bounds for the residual program error content are given, based on the spacings of the first k bugs removed.
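
    The paper's claim that spacings need not be exponential can be checked by simulation: with unequal detection rates, the spacing between the first and second bug discoveries is a mixture of exponentials, so its coefficient of variation exceeds 1 (an exponential's CV is exactly 1). A sketch with assumed rates:

        import numpy as np

        rng = np.random.default_rng(2)
        rates = np.array([5.0, 1.0, 0.5, 0.2, 0.1])   # assumed per-bug detection rates
        runs = 200_000

        T = rng.exponential(1.0 / rates, size=(runs, rates.size))  # detection times
        T.sort(axis=1)                     # order statistics: bug discovery times
        d = np.diff(T, axis=1)[:, 0]       # spacing between 1st and 2nd discoveries

        cv = d.std(ddof=1) / d.mean()
        print(f"CV of the spacing = {cv:.3f}  (an exponential would give 1.0)")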