Sample records for sensitive component reliability

  1. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite-element-based engineering practice.

  2. Test-retest reliability of infant event related potentials evoked by faces.

    PubMed

    Munsters, N M; van Ravenswaaij, H; van den Boomen, C; Kemner, C

    2017-04-05

Reliable measures are required to draw meaningful conclusions regarding developmental changes in longitudinal studies. Little is known, however, about the test-retest reliability of face-sensitive event related potentials (ERPs), a frequently used neural measure in infants. The aim of the current study is to investigate the test-retest reliability of ERPs typically evoked by faces in 9-10 month-old infants. The infants (N=31) were presented with neutral, fearful and happy faces that contained only the lower or higher spatial frequency information. They were tested twice within two weeks. The present results show that the test-retest reliability of the face-sensitive ERP components is moderate (P400 and Nc) to substantial (N290). However, there is low test-retest reliability for the effects of the specific experimental manipulations (i.e. emotion and spatial frequency) on the face-sensitive ERPs. To conclude, in infants the face-sensitive ERP components (i.e. N290, P400 and Nc) show adequate test-retest reliability, but the effects of emotion and spatial frequency on these components do not. We propose that future research focus on identifying factors that might increase test-retest reliability, as adequate reliability is necessary to draw meaningful conclusions about individual developmental trajectories of the face-sensitive ERPs in infants. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. The work and social adjustment scale: reliability, sensitivity and value.

    PubMed

    Zahra, Daniel; Qureshi, Adam; Henley, William; Taylor, Rod; Quinn, Cath; Pooler, Jill; Hardy, Gillian; Newbold, Alexandra; Byng, Richard

    2014-06-01

To investigate the psychometric properties of the Work and Social Adjustment Scale (WSAS) as an outcome measure for the Improving Access to Psychological Therapies programme, assessing its value as an addition to the Patient Health Questionnaire (PHQ-9) and the Generalised Anxiety Disorder questionnaire (GAD-7). Little research has investigated these properties to date. Reliability and responsiveness to change were assessed using data from 4,835 patients. Principal components analysis was used to determine whether the WSAS measures a factor distinct from the PHQ-9 and GAD-7. The WSAS measures a distinct social functioning factor, has high internal reliability, and is sensitive to treatment effects. The WSAS, PHQ-9 and GAD-7 perform comparably on measures of reliability and sensitivity. The WSAS also measures a distinct social functioning component, suggesting it has potential as an additional outcome measure.
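
    The internal reliability reported for scales like the WSAS is typically quantified with Cronbach's alpha. A minimal sketch of that computation (the helper name and item scores below are illustrative, not WSAS data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of per-item score lists, one inner list per scale item,
    all covering the same respondents.
    """
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents
    item_vars = []
    for scores in items:
        mean = sum(scores) / n
        item_vars.append(sum((x - mean) ** 2 for x in scores) / (n - 1))
    totals = [sum(col) for col in zip(*items)]   # total score per respondent
    tmean = sum(totals) / n
    total_var = sum((t - tmean) ** 2 for t in totals) / (n - 1)
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

    Alpha approaches 1.0 as the items become perfectly consistent with one another.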

  4. Sensitivity analysis by approximation formulas - Illustrative examples. [reliability analysis of six-component architectures]

    NASA Technical Reports Server (NTRS)

    White, A. L.

    1983-01-01

    This paper examines the reliability of three architectures for six components. For each architecture, the probabilities of the failure states are given by algebraic formulas involving the component fault rate, the system recovery rate, and the operating time. The dominant failure modes are identified, and the change in reliability is considered with respect to changes in fault rate, recovery rate, and operating time. The major conclusions concern the influence of system architecture on failure modes and parameter requirements. Without this knowledge, a system designer may pick an inappropriate structure.
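
    Algebraic reliability formulas of the kind described above can be sketched for a simplex versus a 2-out-of-3 majority-voted arrangement (a hypothetical illustration with an exponential component fault model and perfect recovery assumed; these are not the paper's six-component architectures):

```python
import math

def r_simplex(lam, t):
    # single component with constant fault rate lam over operating time t
    return math.exp(-lam * t)

def r_triplex(lam, t):
    # 2-out-of-3 majority voting: system survives if at least 2 of 3
    # identical components survive (perfect voting/recovery assumed)
    p = math.exp(-lam * t)
    return p ** 3 + 3 * p ** 2 * (1 - p)

def sensitivity(f, lam, t, h=1e-7):
    # central-difference sensitivity of reliability to the fault rate
    return (f(lam + h, t) - f(lam - h, t)) / (2 * h)
```

    The crossover at p = 0.5 (triplex is *less* reliable than simplex once component reliability drops below one half) illustrates the paper's conclusion that a designer without this knowledge may pick an inappropriate structure.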

  5. A longitudinal examination of event-related potentials sensitive to monetary reward and loss feedback from late childhood to middle adolescence.

    PubMed

    Kujawa, Autumn; Carroll, Ashley; Mumper, Emma; Mukherjee, Dahlia; Kessel, Ellen M; Olino, Thomas; Hajcak, Greg; Klein, Daniel N

    2017-11-04

Brain regions involved in reward processing undergo developmental changes from childhood to adolescence, and alterations in reward-related brain function are thought to contribute to the development of psychopathology. Event-related potentials (ERPs), such as the reward positivity (RewP) component, are valid measures of reward responsiveness that are easily assessed across development and provide insight into temporal dynamics of reward processing. Little work has systematically examined developmental changes in ERPs sensitive to reward. In this longitudinal study of 75 youth assessed 3 times across 6 years, we used principal components analyses (PCA) to differentiate ERPs sensitive to monetary reward and loss feedback in late childhood, early adolescence, and middle adolescence. We then tested reliability of, and developmental changes in, ERPs. A greater number of ERP components differentiated reward and loss feedback in late childhood compared to adolescence, but components in childhood accounted for only a small proportion of variance. A component consistent with RewP was the only one to consistently emerge at each of the 3 assessments. RewP demonstrated acceptable reliability, particularly from early to middle adolescence, though reliability estimates varied depending on scoring approach and developmental period. The magnitude of the RewP component did not significantly change across time. Results provide insight into developmental changes in the structure of ERPs sensitive to reward, and indicate that RewP is a consistently observed and relatively stable measure of reward responsiveness, particularly across adolescence. Copyright © 2017. Published by Elsevier B.V.

  6. A Statistical Simulation Approach to Safe Life Fatigue Analysis of Redundant Metallic Components

    NASA Technical Reports Server (NTRS)

    Matthews, William T.; Neal, Donald M.

    1997-01-01

This paper introduces a dual active load path fail-safe fatigue design concept analyzed by Monte Carlo simulation. The concept utilizes the inherent fatigue life differences between selected pairs of components in an active dual path system, enhanced by a stress level bias in one component. The concept is applied to a baseline design: a safe-life fatigue problem studied in an American Helicopter Society (AHS) round robin. The dual active path design is compared with a two-element standby fail-safe system and the baseline design in terms of life at specified reliability levels and weight. The sensitivity of life estimates for both the baseline and fail-safe designs was examined by considering normal and Weibull distribution laws and several coefficient of variation levels. Results showed that the biased dual path system lifetimes, for both the first element failure and the residual life, were much greater than for standby systems. The sensitivity of the residual life-weight relationship was not excessive at reliability levels up to R = 0.9999, and the weight penalty was small. The sensitivity of life estimates increases dramatically at higher reliability levels.
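
    A Monte Carlo sketch of the dual-active-path idea (Weibull lifetimes with a hypothetical scale bias on one path; the shape and scale parameters are invented, not the AHS round-robin values):

```python
import random

def simulate_dual_path(n, beta=3.0, eta_a=1.0, eta_b=1.3):
    # Weibull fatigue lives for the two active paths; path B carries a
    # lower stress (longer characteristic life eta_b) - the "bias"
    random.seed(0)  # reproducible sketch
    first, residual = [], []
    for _ in range(n):
        la = random.weibullvariate(eta_a, beta)
        lb = random.weibullvariate(eta_b, beta)
        first.append(min(la, lb))               # first-element failure life
        residual.append(abs(la - lb))           # remaining life on the survivor
    return first, residual

def life_at_reliability(lives, r):
    # life exceeded with probability r: an empirical lower quantile
    s = sorted(lives)
    return s[int((1 - r) * len(s))]
```

    Life at a specified reliability level is read off as an empirical quantile of the simulated lifetimes; higher reliability targets pick a lower quantile, which is why sensitivity grows at high R (few samples remain in the tail).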

  7. Reliability analysis of component-level redundant topologies for solid-state fault current limiter

    NASA Astrophysics Data System (ADS)

    Farhadi, Masoud; Abapour, Mehdi; Mohammadi-Ivatloo, Behnam

    2018-04-01

Experience shows that semiconductor switches are the most vulnerable components in power electronics systems. One of the most common ways to address this reliability challenge is component-level redundant design, for which four configurations are possible. This article presents a comparative reliability analysis of the different component-level redundant designs for a solid-state fault current limiter, with the aim of determining the more reliable configuration. The mean time to failure (MTTF) is used as the reliability parameter. Considering both fault types (open circuit and short circuit), the MTTFs of the different configurations are calculated. It is demonstrated that the more reliable configuration depends on the steady-state junction temperature of the semiconductor switches, which is a function of (i) the ambient temperature, (ii) the power loss of the semiconductor switch and (iii) the thermal resistance of the heat sink. The sensitivity of the results to each parameter is also investigated, and the results show that different configurations have the higher reliability under different conditions. Experimental results are presented to clarify the theory and feasibility of the proposed approaches. Finally, the levelised costs of the different configurations are analysed for a fair comparison.
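
    For two identical switches with constant failure rate λ, the textbook MTTFs of the two component-level redundant arrangements can be sketched as follows (the Arrhenius-style temperature dependence and its constants are illustrative assumptions, not the article's device data):

```python
import math

def failure_rate(t_junction_c, lam_ref=1e-6, t_ref_c=25.0, ea_over_k=4000.0):
    # Arrhenius-style acceleration of the switch failure rate with
    # junction temperature; lam_ref, t_ref_c, ea_over_k are made-up values
    t = t_junction_c + 273.15
    t_ref = t_ref_c + 273.15
    return lam_ref * math.exp(ea_over_k * (1.0 / t_ref - 1.0 / t))

def mttf_parallel(lam):
    # two switches in parallel: system fails only when both fail,
    # so it tolerates one open-circuit fault; MTTF = 1/lam + 1/(2*lam)
    return 1.5 / lam

def mttf_series(lam):
    # two switches in series: system fails when either fails,
    # tolerating one short-circuit fault; MTTF = 1/(2*lam)
    return 0.5 / lam
```

    Which arrangement is preferable depends on the mix of open- and short-circuit faults; the temperature model shows how λ, and hence both MTTFs, degrade as junction temperature rises.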

  8. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    NASA Astrophysics Data System (ADS)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

A reliability mathematical model for a high temperature and high pressure multi-stage decompression control valve (HMDCV) is established based on stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of the key dangerous components and the fatigue sensitivity curve of each component are calculated and analyzed by combining the fatigue life analysis of the control valve with reliability theory, yielding the proportional contribution of each component to system fatigue failure. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life of the main pressure-bearing parts meets the technical requirements, and that the valve body and the sleeve have an obvious influence on system reliability; stress concentration in the key parts of the control valve can be reduced during the design process by improving the structure.
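
    With normally distributed stress and strength, stress-strength interference reduces to a closed form, R = Φ((μ_strength − μ_stress) / √(σ_strength² + σ_stress²)). A minimal sketch, with the temperature correction applied as a simple derating factor (an assumption for illustration; the paper's coefficient may act differently):

```python
import math

def reliability_ssi(mu_strength, sd_strength, mu_stress, sd_stress):
    # stress-strength interference, independent normal variables:
    # R = P(strength > stress) = Phi(beta)
    beta = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
    return 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))

def corrected_strength(mu_strength, k_t):
    # hypothetical temperature correction: derate the fatigue limit by k_t < 1
    return k_t * mu_strength
```

    Derating the strength for temperature lowers β and hence the computed reliability, which is the qualitative effect the correction coefficient captures.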

  9. Strategies and Approaches to TPS Design

    NASA Technical Reports Server (NTRS)

    Kolodziej, Paul

    2005-01-01

Thermal protection systems (TPS) insulate planetary probes and Earth re-entry vehicles from the aerothermal heating experienced during hypersonic deceleration to the planet's surface. The systems are typically designed with some additional capability to compensate both for variations in the TPS material and for uncertainties in the heating environment. This additional capability, or robustness, also provides a surge capability for operating under abnormally severe conditions for a short period of time, and for unexpected events, such as meteoroid impact damage, that would detract from the nominal performance. Strategies and approaches to developing robust designs must also minimize mass, because an extra kilogram of TPS displaces one kilogram of payload. Because aircraft structures must be optimized for minimum mass, reliability-based design approaches that minimize the mass of mechanical components already exist. Adapting these existing approaches to TPS component design takes advantage of the extensive work, knowledge, and experience from nearly fifty years of reliability-based design of mechanical components. A Non-Dimensional Load Interference (NDLI) method for calculating the thermal reliability of TPS components is presented in this lecture and applied to several examples. A sensitivity analysis from an existing numerical simulation of a carbon phenolic TPS provides insight into the effects of the various design parameters, and is used to demonstrate how sensitivity analysis may be used with NDLI to develop reliability-based designs of TPS components.

  10. A GA based penalty function technique for solving constrained redundancy allocation problem of series system with interval valued reliability of components

    NASA Astrophysics Data System (ADS)

    Gupta, R. K.; Bhunia, A. K.; Roy, D.

    2009-10-01

In this paper, we consider the problem of constrained redundancy allocation for a series system with interval-valued component reliabilities. To maximize the overall system reliability under limited resource constraints, the problem is formulated as an unconstrained integer programming problem with interval coefficients via a penalty function technique and solved by an advanced GA for integer variables with an interval fitness function, tournament selection, uniform crossover, uniform mutation and elitism. As a special case, the corresponding problem in which the lower and upper bounds of the interval-valued component reliabilities coincide has also been solved. The model is illustrated with some numerical examples, and the results for the series redundancy allocation problem with fixed component reliabilities are compared with existing results available in the literature. Finally, sensitivity analyses are shown graphically to study the stability of the developed GA with respect to the different GA parameters.
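
    The penalty-function transform at the core of this approach can be sketched for the special case of point-valued (rather than interval-valued) component reliabilities; the cost model and penalty weight below are illustrative assumptions:

```python
def system_reliability(r, x):
    # series system of subsystems; subsystem i holds x[i] parallel copies
    # of a component with reliability r[i]
    prod = 1.0
    for ri, xi in zip(r, x):
        prod *= 1.0 - (1.0 - ri) ** xi
    return prod

def penalized_fitness(r, x, cost, budget, penalty=1e3):
    # unconstrained fitness: maximize reliability subject to a cost budget,
    # with the constraint folded in as a penalty term
    total = sum(c * xi for c, xi in zip(cost, x))
    violation = max(0.0, total - budget)
    return system_reliability(r, x) - penalty * violation
```

    A GA then simply maximizes `penalized_fitness` over integer allocations x; infeasible allocations are driven out of the population by the penalty term.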

  11. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with average diameter and average length of 15 inches. Structural reliability is based on axial buckling strength of the cylinder. Both Monte Carlo simulation and First Order Reliability Method are considered for reliability analysis with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure and optimization results are included in this report.
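
    For a linear limit state g = R − S with independent normal variables, the reliability index and its sensitivities to each mean and standard deviation have closed forms. A hedged sketch of the kind of sensitivity comparison the report describes (the symbols and numbers are illustrative, not the cylinder model):

```python
import math

def beta_index(mu_r, sd_r, mu_s, sd_s):
    # reliability index for limit state g = R - S, independent normals
    return (mu_r - mu_s) / math.hypot(sd_r, sd_s)

def beta_sensitivities(mu_r, sd_r, mu_s, sd_s):
    # analytic d(beta)/d(mean) and d(beta)/d(std dev) for each variable
    denom = math.hypot(sd_r, sd_s)
    b = (mu_r - mu_s) / denom
    return {
        "d_mu_r": 1.0 / denom,            # raising resistance mean helps
        "d_mu_s": -1.0 / denom,           # raising load mean hurts
        "d_sd_r": -b * sd_r / denom ** 2,  # more scatter always hurts
        "d_sd_s": -b * sd_s / denom ** 2,
    }
```

    Ranking the magnitudes of these derivatives is the simplest version of the comparison reported above (applied load and fiber-direction modulus dominating, diameter third).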

  12. Laser beam soldering of micro-optical components

    NASA Astrophysics Data System (ADS)

    Eberhardt, R.

    2003-05-01

MOTIVATION: Ongoing miniaturisation, rising requirements within optical assemblies and the processing of temperature-sensitive components demand innovative selective joining techniques. So far, adhesive bonding has primarily been used to assemble and adjust hybrid micro-optical systems. However, the properties of the organic polymers used in the adhesives limit the application of these systems. In the fields of telecommunication and lithography, existing joining techniques must be enhanced to improve properties such as humidity resistance, laser stability, UV stability, thermal cycle reliability and lifetime reliability. Against this background, laser beam soldering of optical components is a reasonable alternative joining technology. Its properties give good conditions for meeting the requirements of a joining technology for sensitive optical components: time- and area-restricted energy input; energy input controllable via the process temperature; direct and indirect heating of the components; and no mechanical contact between the joining tool and the components. In addition to the laser soldering head, the assembly of optical components requires positioning units to adjust the position of the components with high accuracy before joining. Furthermore, suitable measurement methods to characterise the soldered assemblies (for instance, in terms of position tolerances) need to be developed.

  13. Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers

    NASA Technical Reports Server (NTRS)

    Kenny, Sean (Technical Monitor); Wertz, Julie

    2002-01-01

As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can be used to aid the designer in making high level architecture decisions. Once these key components have been identified, two main approaches to improving a system through them exist: add redundancy or improve the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two approaches, in varying orders of magnitude for each component. Therefore, this research tries to answer the question of how to divide funds between adding redundancy and improving the reliability of components so as to most cost effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics together into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user defined parameters. Finally, several possibilities for future work in this area of research are presented.

  14. Component Reliability Testing of Long-Life Sorption Cryocoolers

    NASA Technical Reports Server (NTRS)

    Bard, S.; Wu, J.; Karlmann, P.; Mirate, C.; Wade, L.

    1994-01-01

    This paper summarizes ongoing experiments characterizing the ability of critical sorption cryocooler components to achieve highly reliable operation for long-life space missions. Test data obtained over the past several years at JPL are entirely consistent with achieving ten year life for sorption compressors, electrical heaters, container materials, valves, and various sorbent materials suitable for driving 8 to 180 K refrigeration stages. Test results for various compressor systems are reported. Planned future tests necessary to gain a detailed understanding of the sensitivity of cooler performance and component life to operating constraints, design configurations, and fabrication, assembly and handling techniques, are also discussed.

  15. LOX/GOX sensitivity of fluoroelastomers. [effect of formulation components and addition of fire retardants

    NASA Technical Reports Server (NTRS)

    Kirshen, N.; Mill, T.

    1973-01-01

    The effect of formulation components and the addition of fire retardants on the impact sensitivity of Viton B fluoroelastomer in liquid oxygen was studied with the objective of developing a procedure for reliably reducing this sensitivity. Component evaluation, carried out on more than 40 combinations of components and cure cycles, showed that almost all the standard formulation agents, including carbon, MgO, Diak-3, and PbO2, will sensitize the Viton stock either singly or in combinations, some combinations being much more sensitive than others. Cure and postcure treatments usually reduced the sensitivity of a given formulation, often dramatically, but no formulated Viton was as insensitive as the pure Viton B stock. Coating formulated Viton with a thin layer of pure Viton gave some indication of reduced sensitivity, but additional tests are needed. It is concluded that sensitivity in formulated Viton arises from a variety of sources, some physical and some chemical in origin. Elemental analyses for all the formulated Vitons are reported as are the results of a literature search on the subject of LOX impact sensitivity.

  16. Test-retest reliability of cognitive EEG

    NASA Technical Reports Server (NTRS)

    McEvoy, L. K.; Smith, M. E.; Gevins, A.

    2000-01-01

OBJECTIVE: Task-related EEG is sensitive to changes in cognitive state produced by increased task difficulty and by transient impairment. If task-related EEG has high test-retest reliability, it could be used as part of a clinical test to assess changes in cognitive function. The aim of this study was to determine the reliability of the EEG recorded during the performance of a working memory (WM) task and a psychomotor vigilance task (PVT). METHODS: EEG was recorded while subjects rested quietly and while they performed the tasks. Within session (test-retest interval of approximately 1 h) and between session (test-retest interval of approximately 7 days) reliability was calculated for four EEG components: frontal midline theta at Fz, posterior theta at Pz, and slow and fast alpha at Pz. RESULTS: Task-related EEG was highly reliable within and between sessions (r > 0.9 for all components in the WM task, and r > 0.8 for all components in the PVT). Resting EEG also showed high reliability, although the magnitude of the correlation was somewhat smaller than that of the task-related EEG (r > 0.7 for all 4 components). CONCLUSIONS: These results suggest that under appropriate conditions, task-related EEG has sufficient retest reliability for use in assessing clinical changes in cognitive status.
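
    Test-retest reliability of this kind is simply the correlation between session-1 and session-2 measurements of the same component across subjects; a minimal Pearson-r sketch (the data below are invented):

```python
def pearson_r(x, y):
    # Pearson correlation between paired measurements, e.g. session-1
    # and session-2 values of the same EEG component per subject
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```

    Values near 1 indicate that subjects keep their rank ordering across sessions, which is the property a clinical retest measure needs.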

  17. Technique for Early Reliability Prediction of Software Components Using Behaviour Models

    PubMed Central

    Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad

    2016-01-01

    Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748

  18. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  20. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    PubMed Central

    Fan, Wei; Li, Rong; Li, Sifan; Ping, Wenli; Li, Shujun; Naumova, Alexandra; Peelen, Tamara; Yuan, Zheng; Zhang, Dabing

    2016-01-01

Reliable methods to detect the presence of tobacco components in tobacco products are needed to control smuggling and illegal tobacco trade and to classify tariffs and excise in the tobacco industry. In this study, two sensitive and specific DNA-based methods, a quantitative real-time PCR (qPCR) assay and a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of the uridine 5′-monophosphate synthase (UMPS) gene, and their specificities and sensitivities were determined with various plant materials. Both the qPCR and LAMP methods were reliable and accurate in the rapid detection of tobacco components in various practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for application in tobacco identification, particularly in the special cases where the morphology or chemical composition of the tobacco has been disrupted. Combining both methods would therefore facilitate not only the control of tobacco smuggling but also tariff and excise classification. PMID:27635142

  1. Key Reliability Drivers of Liquid Propulsion Engines and A Reliability Model for Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Zhao-Feng; Fint, Jeffry A.; Kuck, Frederick M.

    2005-01-01

This paper addresses the in-flight reliability of a liquid propulsion engine system for a launch vehicle. We first establish a comprehensive list of system and sub-system reliability drivers for any liquid propulsion engine system, and then build a reliability model to parametrically analyze the impact of some reliability parameters. Reliability drivers identified include: number of engines for the liquid propulsion stage, single engine total reliability, engine operation duration, engine thrust size, reusability, engine de-rating or up-rating, engine-out design (including engine-out switching reliability, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction), propellant specific hazards, engine start and cutoff transient hazards, engine combustion cycles, vehicle and engine interface and interaction hazards, engine health management system, engine modification, engine ground start hold down with launch commit criteria, engine altitude start (1 in. start), multiple altitude restart (less than 1 restart), component, subsystem and system design, manufacturing/ground operation support/pre- and post-flight checkouts and inspection, and extensiveness of the development program. Using the model, we present sensitivity analysis results for the following subset of the drivers: number of engines for the propulsion stage, single engine total reliability, engine operation duration, engine de-rating or up-rating requirements, engine-out design, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction, and engine health management system implementation (basic redlines and more advanced health management systems).
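
    Several of the listed drivers (number of engines, single-engine reliability, engine-out design, catastrophic fraction) interact in a simple binomial stage-reliability model; a hedged sketch of such a model, not the authors' actual formulation:

```python
from math import comb

def stage_reliability(n_engines, r_engine, engine_out=0, p_catastrophic=0.0):
    # Probability the stage succeeds when up to `engine_out` benign engine
    # failures can be tolerated; each engine failure is catastrophic for
    # the stage with probability p_catastrophic (illustrative assumption)
    total = 0.0
    for k in range(engine_out + 1):
        benign = (1.0 - p_catastrophic) ** k   # all k failures must be benign
        total += (comb(n_engines, k)
                  * (1.0 - r_engine) ** k
                  * r_engine ** (n_engines - k)
                  * benign)
    return total
```

    Even this toy model reproduces a key trade: engine-out capability raises stage reliability only to the extent that failures are non-catastrophic.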

  2. SCARE: A post-processor program to MSC/NASTRAN for the reliability analysis of structural ceramic components

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, J. P.

    1985-01-01

    A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
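
    The two-parameter Weibull fast-fracture model for a uniformly stressed volume, and a least-squares Weibull parameter fit from rupture data of the kind the program performs, can be sketched as follows (median-rank plotting positions are an assumption for illustration; SCARE's exact estimator may differ):

```python
import math

def failure_probability(sigma, sigma_0, m, volume=1.0):
    # two-parameter Weibull fast-fracture model, uniformly stressed volume:
    # Pf = 1 - exp(-V * (sigma / sigma_0)^m)
    return 1.0 - math.exp(-volume * (sigma / sigma_0) ** m)

def fit_weibull(strengths):
    # least-squares fit on the Weibull plot ln(-ln(1 - F)) vs ln(sigma),
    # using median-rank plotting positions F_i = (i - 0.3) / (n + 0.4)
    n = len(strengths)
    xs, ys = [], []
    for i, s in enumerate(sorted(strengths), start=1):
        f = (i - 0.3) / (n + 0.4)
        xs.append(math.log(s))
        ys.append(math.log(-math.log(1.0 - f)))
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))       # Weibull modulus (slope)
    sigma_0 = math.exp(mx - my / m)              # characteristic strength
    return m, sigma_0
```

    On the Weibull plot the slope is the modulus m and the intercept fixes the characteristic strength, so modulus-of-rupture bar data reduce to a straight-line fit.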

  3. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the efficiency of converting heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensuring reliable performance over the entire mission. The design of fasteners involves variables related to fabrication and manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of the fasteners, mission loads, and boundary conditions, among others. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.
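A minimal sketch of the kind of probabilistic assessment described here: Monte Carlo sampling of a strength-minus-load limit state with assumed normal distributions. The fastener example, the distributions, and all numbers are hypothetical; the actual NASA methodology is not reproduced.

```python
import random

def monte_carlo_reliability(limit_state, sample, n=100_000, seed=1):
    """Estimate reliability P(g(X) > 0) by direct Monte Carlo sampling."""
    rng = random.Random(seed)
    survived = sum(1 for _ in range(n) if limit_state(sample(rng)) > 0)
    return survived / n

# Hypothetical fastener joint: preload capacity vs. applied load,
# both modeled as normal random variables (illustrative values only).
def sample(rng):
    return {"strength": rng.gauss(10.0, 1.0), "load": rng.gauss(7.0, 1.0)}

def limit_state(x):
    # Margin g > 0 means the joint survives.
    return x["strength"] - x["load"]
```

For these assumed normals the margin is N(3, sqrt(2)), so the exact reliability is Phi(3/sqrt(2)) ≈ 0.983, and the Monte Carlo estimate converges to it; sensitivities can then be explored by perturbing the distribution parameters.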

  4. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System Reliability, Maintainability, and Cost Model (RMCM)--Description. Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; And Others

    The Reliability, Maintainability, and Cost Model (RMCM) described in this report is an interactive mathematical model with a built-in sensitivity analysis capability. It is a major component of the Life Cycle Cost Impact Model (LCCIM), which was developed as part of the DAIS advanced development program to be used to assess the potential impacts…

  5. The diagnostic accuracy of the rapid dipstick test to predict asymptomatic urinary tract infection of pregnancy.

    PubMed

    Eigbefoh, J O; Isabu, P; Okpere, E; Abebe, J

    2008-07-01

    Untreated urinary tract infection can have devastating maternal and neonatal effects; thus, routine screening for bacteriuria is advocated. This study was designed to evaluate the diagnostic accuracy of the rapid dipstick test in predicting urinary tract infection in pregnancy, with the gold standard of urine microscopy, culture, and sensitivity serving as the control. The urine dipstick test uses the leucocyte esterase, nitrite, and protein tests singly and in combination. The dipstick results were compared with the gold standard using confidence intervals for proportions, and the reliability and validity of the urine dipstick were also evaluated. Overall, the urine dipstick test correlated poorly with urine culture (p = 0.125, 95% CI). The same holds true for the individual components of the dipstick test. The overall sensitivity of the urine dipstick test was poor at 2.3%. Sensitivity of the individual components varied from 9.1%, for the leucocyte esterase and nitrite tests combined, to 56.8% for the leucocyte esterase test alone. The other components of the dipstick test (the nitrite test, the protein test, and the combination of leucocyte esterase, nitrite, and proteinuria) appear to decrease the sensitivity of the leucocyte esterase test alone. The ability of the urine dipstick test to correctly rule out urinary tract infection (specificity) was high. The positive predictive value of the dipstick test was also high, with the leucocyte esterase test having the highest positive predictive value among the components. The negative predictive value (NPV) was, as expected, highest for the leucocyte esterase test alone, with values higher than those of the other components singly and in combination. Compared with the other parameters of the urine dipstick test, singly and in combination, leucocyte esterase appears to be the most accurate (90.25%). 
The dipstick test has limited use in screening for asymptomatic bacteriuria. The leucocyte esterase component appears to have the highest reliability and validity; the other parameters decrease the reliability and validity of the leucocyte esterase test. A positive test merits empirical antibiotics, while a negative test is an indication for urine culture. A positive urine dipstick test is also useful in the follow-up of patients after treatment of urinary tract infection. This is valuable in poor-resource settings, especially in the third world, where there is a dearth of trained personnel and equipment for urine culture.
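All of the test characteristics reported above derive from a 2x2 table of dipstick result against the culture gold standard. A minimal sketch follows; the counts in the test are made up for illustration, not the study's data.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard diagnostic metrics from a 2x2 table of index test
    result (rows) vs. gold standard (columns).

    tp/fp/fn/tn: true positives, false positives, false negatives,
    true negatives.
    """
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```

Note that, unlike sensitivity and specificity, the predictive values depend on prevalence, which is why a test with poor sensitivity can still show a high PPV in a screening population.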

  6. Reliability of Radioisotope Stirling Convertor Linear Alternator

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin; Korovaichuk, Igor; Geng, Steven M.; Schreiber, Jeffrey G.

    2006-01-01

    Onboard radioisotope power systems being developed and planned for NASA's deep-space missions would require reliable design lifetimes of up to 14 years. Critical components and materials of Stirling convertors have been undergoing extensive testing and evaluation in support of reliable performance over the specified life span. Of significant importance to the successful development of the Stirling convertor is the design of a lightweight and highly efficient linear alternator. Alternator performance can vary due to small deviations in the permanent magnet properties, operating temperature, and component geometries, and the durability and reliability of the alternator may be affected by these deviations from nominal design conditions. It is therefore important to evaluate the effect of these uncertainties when predicting the reliability of linear alternator performance. This paper presents a study in which a reliability-based methodology is used to assess alternator performance. The response surface characterizing the induced open-circuit voltage performance is constructed using 3-D finite element magnetic analysis, and the fast probability integration method is used to determine the probability of the desired performance and its sensitivity to the alternator design parameters.

  7. Effects of electrons and protons on science instruments

    NASA Technical Reports Server (NTRS)

    Parker, R. H.

    1972-01-01

    The radiation effects on typical science instruments according to the Jupiter trapped radiation design restraint model are described, and specific aspects of the model where an improved understanding would be beneficial are suggested. The spacecraft design used is the TOPS 12L configuration. Ionization and displacement damage are considered, and damage criteria are placed on the most sensitive components. Possible protective measures are mentioned: selecting components as radiation resistant as possible, using a difference in desired and undesired signal shapes for electronic shielding, orienting and locating the component on the spacecraft for better shielding, and adding passive shields to protect specific components. Available options are listed in decreasing order of attractiveness: attempt to lower the design restraints without compromising the success of the missions, trade off experiment objectives for increased reliability, alter the trajectory, and remove sensitive instruments from the payload.

  8. Surface flaw reliability analysis of ceramic components with the SCARE finite element postprocessor program

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Nemeth, Noel N.

    1987-01-01

    The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.

  9. Psychometric properties of the Late-Life Function and Disability Instrument: a systematic review

    PubMed Central

    2014-01-01

    Background The choice of measure for use as a primary outcome in geriatric research is contingent upon the construct of interest and evidence for its psychometric properties. The Late-Life Function and Disability Instrument (LLFDI) has been widely used to assess functional limitations and disability in studies with older adults. The primary aim of this systematic review was to evaluate the current available evidence for the psychometric properties of the LLFDI. Methods Published studies of any design reporting results based on administration of the original version of the LLFDI in community-dwelling older adults were identified after searches of 9 electronic databases. Data related to construct validity (convergent/divergent and known-groups validity), test-retest reliability and sensitivity to change were extracted. Effect sizes were calculated for within-group changes and summarized graphically. Results Seventy-one studies including 17,301 older adults met inclusion criteria. Data supporting the convergent/divergent and known-groups validity for both the Function and Disability components were extracted from 30 and 18 studies, respectively. High test-retest reliability was found for the Function component, while results for the Disability component were more variable. Sensitivity to change of the LLFDI was confirmed based on findings from 25 studies. The basic lower extremity subscale and overall summary score of the Function component and limitation dimension of the Disability component were associated with the strongest relative effect sizes. Conclusions There is extensive evidence to support the construct validity and sensitivity to change of the LLFDI among various clinical populations of community-dwelling older adults. Further work is needed on predictive validity and values for clinically important change. 
Findings from this review can be used to guide the selection of the most appropriate LLFDI subscale for use as an outcome measure in geriatric research and practice. PMID:24476510

  10. Reliability and validity of the upper-body dressing scale in Japanese patients with vascular dementia with hemiparesis.

    PubMed

    Endo, Arisa; Suzuki, Makoto; Akagi, Atsumi; Chiba, Naoyuki; Ishizaka, Ikuyo; Matsunaga, Atsuhiko; Fukuda, Michinari

    2015-03-01

    The purpose of this study was to examine the reliability and validity of the Upper-body Dressing Scale (UBDS) for buttoned-shirt dressing, which evaluates the learning of new component actions of upper-body dressing in patients diagnosed with dementia and hemiparesis. This was a preliminary correlational study of concurrent validity and reliability in which 10 vascular dementia patients with hemiparesis were enrolled and assessed repeatedly by six occupational therapists using the UBDS and the dressing item of the Functional Independence Measure (FIM). The intraclass correlation coefficient was 0.97 for intra-rater reliability and 0.99 for inter-rater reliability, and the correlation between UBDS scores and FIM dressing item scores was -0.93. UBDS scores for "paralytic hand passed into the sleeve" and "sleeve pulled up beyond the shoulder joint" were worse than those for the other components of the task. The UBDS has good reliability and validity for vascular dementia patients with hemiparesis. Further research is needed to investigate the relation between UBDS scores and the effect of intervention, and to clarify the sensitivity or responsiveness of the scale to clinical change. Copyright © 2014 John Wiley & Sons, Ltd.

  11. Strategies for Increasing the Market Share of Recycled Products—A Games Theory Approach

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Pollalis, Yannis A.

    2009-08-01

    A methodological framework (including 28 activity stages and 10 decision nodes) has been designed in the form of an algorithmic procedure for developing strategies to increase the market share of recycled products within a games theory context. A case example is presented for a paper market in which a recycling company (RC) competes with a virgin-raw-material-using company (VC). The strategies of the VC for increasing its market share are: strengthening of, and advertisement based on, high quality (VC1); high reliability (VC2); the combination of quality and reliability, with emphasis on the first component (VC3); and the same combination, with emphasis on the second component (VC4). The strategies of the RC for increasing its market share are: advertisement based on the low price of recycled paper satisfying minimum quality requirements (RC1); the combination of low price with sensitization of the public to environmental and materials-saving issues, with emphasis on the first component (RC2); and the same combination, with emphasis on the second component (RC3). An analysis of all possible situations for the case example under examination is also presented.

  12. Effect of Sport Related Concussion on Clinically Measured Simple Reaction Time

    PubMed Central

    Eckner, James T.; Kutcher, Jeffrey S.; Broglio, Steven P.; Richardson, James K.

    2013-01-01

    Background Reaction time (RT) is a valuable component of the sport concussion assessment battery. RT is typically measured using computers running specialized software, which limits its applicability in some athletic settings and populations. To address this, we developed a simple clinical test of RT (RTclin) that involves grasping a falling measuring stick. Purpose To determine the effect of concussion on RTclin and its sensitivity and specificity for concussion. Materials and methods Concussed athletes (n=28) and non-concussed control teammates (n=28) completed RTclin assessments at baseline and within 48 hours of injury. Repeated measures ANOVA compared mean baseline and follow-up RTclin values between groups. Sensitivity and specificity were calculated over a range of reliable change confidence levels. Results RTclin differed significantly between groups (p < .001): there was significant prolongation from baseline to post-injury in the concussed group (p= .003), with a trend toward improvement in the control group (p = .058). Sensitivity and specificity were maximized when a critical change value of 0 ms was applied (i.e., any increase in RTclin from baseline was interpreted as abnormal), which corresponded to a sensitivity of 75%, specificity of 68%, and a 65% reliable change confidence level. Conclusions RTclin appears sensitive to the effects of concussion and distinguished concussed and non-concussed athletes with similar sensitivity and specificity to other commonly used concussion assessment tools. Given its simplicity, low cost, and minimal time requirement, RTclin should be considered a viable component of the sports medicine provider’s multifaceted concussion assessment battery. PMID:23314889
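The reliable change confidence levels mentioned above come from comparing a change score against its standard error of measurement. A sketch of the Jacobson-Truax reliable change index follows; the numbers in the test are made up, not the study's RTclin data.

```python
import math

def reliable_change_index(baseline, followup, sd_baseline, test_retest_r):
    """Jacobson-Truax reliable change index (RCI).

    The change score is divided by the standard error of the
    difference, S_diff = sqrt(2) * SEM, where SEM = SD * sqrt(1 - r)
    and r is the test-retest reliability.  |RCI| > 1.96 indicates
    change beyond measurement error at the 95% confidence level.
    """
    sem = sd_baseline * math.sqrt(1.0 - test_retest_r)
    s_diff = math.sqrt(2.0) * sem
    return (followup - baseline) / s_diff
```

Lowering the critical value below 1.96 (in the extreme, treating any worsening as abnormal, as in the 0 ms cutoff above) trades specificity for sensitivity along exactly this scale.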

  13. Performance Assessment of Internal Quality Control (IQC) Products in Blood Transfusion Compatibility Testing in China

    PubMed Central

    Li, Jing-Jing; Gao, Qi; Liu, Zhi-Dong; Kang, Qiong-Hua; Hou, Yi-Jun; Zhang, Luo-Chuan; Hu, Xiao-Mei; Li, Jie; Zhang, Juan

    2015-01-01

    Internal quality control (IQC) is a critical component of laboratory quality management, and IQC products can determine the reliability of testing results. In China, given the fact that most blood transfusion compatibility laboratories do not employ IQC products or do so minimally, there is a lack of uniform and standardized IQC methods. To explore the reliability of IQC products and methods, we studied 697 results from IQC samples in our laboratory from 2012 to 2014. The results showed that the sensitivity and specificity of the IQCs in anti-B testing were 100% and 99.7%, respectively. The sensitivity and specificity of the IQCs in forward blood typing, anti-A testing, irregular antibody screening, and cross-matching were all 100%. The reliability analysis indicated that 97% of anti-B testing results were at a 99% confidence level, and 99.9% of forward blood typing, anti-A testing, irregular antibody screening, and cross-matching results were at a 99% confidence level. Therefore, our IQC products and methods are highly sensitive, specific, and reliable. Our study paves the way for the establishment of a uniform and standardized IQC method for pre-transfusion compatibility testing in China and other parts of the world. PMID:26488582

  14. Methods to Improve Reliability of Video Recorded Behavioral Data

    PubMed Central

    Haidet, Kim Kopenhaver; Tate, Judith; Divirgilio-Thomas, Dana; Kolanowski, Ann; Happ, Mary Beth

    2009-01-01

    Behavioral observation is a fundamental component of nursing practice and a primary source of clinical research data. The use of video technology in behavioral research offers important advantages to nurse scientists in assessing complex behaviors and relationships between behaviors. The appeal of using this method should be balanced, however, by an informed approach to reliability issues. In this paper, we focus on factors that influence reliability, such as the use of sensitizing sessions to minimize participant reactivity and the importance of training protocols for video coders. In addition, we discuss data quality, the selection and use of observational tools, calculating reliability coefficients, and coding considerations for special populations based on our collective experiences across three different populations and settings. PMID:19434651

  15. The Child Adolescent Bullying Scale (CABS): Psychometric evaluation of a new measure.

    PubMed

    Strout, Tania D; Vessey, Judith A; DiFazio, Rachel L; Ludlow, Larry H

    2018-06-01

    While youth bullying is a significant public health problem, healthcare providers have been limited in their ability to identify bullied youths due to the lack of a reliable and valid instrument appropriate for use in clinical settings. We conducted a multisite study to evaluate the psychometric properties of a new 22-item instrument for assessing youths' experiences of being bullied, the Child Adolescent Bullying Scale (CABS). The 20 items summed to produce the measure's score were evaluated here. Diagnostic performance was assessed through evaluation of sensitivity, specificity, predictive values, and the area under the receiver operating characteristic (AUROC) curve. A sample of 352 youths from diverse racial, ethnic, and geographic backgrounds (188 female, 159 male, 5 transgender; mean age 13.5 years) was recruited from two clinical sites. Participants completed the CABS and existing youth bullying measures. Analyses grounded in classical test theory, including assessments of reliability and validity, item analyses, and principal components analysis, were conducted, and the diagnostic performance and test characteristics of the CABS were evaluated. The CABS comprises one component, accounting for 67% of the observed variance. Analyses established evidence of internal consistency reliability (Cronbach's α = 0.97) and of construct and convergent validity. Sensitivity was 84%, specificity was 65%, and the AUROC was 0.74 (95% CI: 0.69-0.80). Findings suggest that the CABS holds promise as a reliable, valid tool for healthcare providers screening for bullying exposure in the clinical setting. © 2018 Wiley Periodicals, Inc.
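The internal-consistency figure reported here (Cronbach's α = 0.97) is computed from the item variances and the variance of the total score. A minimal, dependency-free sketch with toy data (not the CABS item set):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from a respondents-by-items score matrix.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(totals)),
    where k is the number of items; sample variances use n - 1.
    """
    k = len(item_scores[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in item_scores]) for j in range(k)]
    total_var = var([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)
```

Perfectly covarying items give alpha = 1, and items that covary not at all give alpha = 0, bracketing the reported 0.97 at the high-consistency end.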

  16. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the central problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described simply as functioning or failed, whereas in many real situations failures may arise from many causes, depending on the age and environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected from censored or uncensored life tests. In many reliability problems the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems, and Bayesian analysis is more beneficial than the classical approach in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. With perfect information on the prior distribution, the Bayesian estimation methods outperform maximum likelihood. The sensitivity analyses show some sensitivity to shifts in the prior locations; 
they also show the robustness of the Bayesian analysis within the range between the true-value and maximum-likelihood-estimate lines.
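A minimal simulation of the competing-risk setup described above: each unit fails at the minimum of independent Weibull cause lifetimes, and the failing cause is recorded. Parameter values are illustrative, and the Bayesian estimation step itself is not reproduced.

```python
import math
import random

def simulate_competing_risks(n, params, seed=0):
    """Simulate n units under independent Weibull competing risks.

    params: list of (shape, scale) per cause.  A Weibull lifetime is
    drawn per cause via T = scale * E**(1/shape) with E ~ Exp(1);
    the unit fails at the minimum, and that cause is recorded.
    Returns a list of (failure_time, failing_cause_index) pairs.
    """
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        times = [scale * rng.expovariate(1.0) ** (1.0 / shape)
                 for shape, scale in params]
        t = min(times)
        data.append((t, times.index(t)))
    return data
```

For two exponential causes (shape 1, scale 2, i.e. rate 0.5 each), the minimum is exponential with rate 1, so the mean failure time is about 1 and each cause accounts for roughly half the failures, which gives a quick sanity check on the simulator before fitting estimators to its output.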

  17. Validation of the Regicor Short Physical Activity Questionnaire for the Adult Population

    PubMed Central

    Molina, Luis; Sarmiento, Manuel; Peñafiel, Judith; Donaire, David; Garcia-Aymerich, Judith; Gomez, Miquel; Ble, Mireia; Ruiz, Sonia; Frances, Albert; Schröder, Helmut; Marrugat, Jaume; Elosua, Roberto

    2017-01-01

    Objective To develop and validate a short questionnaire estimating physical activity (PA) practice and sedentary behavior in the adult population. Methods The short questionnaire was developed using data from a cross-sectional population-based survey (n = 6352) that included the Minnesota leisure-time PA questionnaire. Activities that explained a significant proportion of the variability in population PA practice were identified. Validation of the short questionnaire included a cross-sectional component to assess validity against data collected by accelerometers and a longitudinal component to assess reliability and sensitivity to change (n = 114, aged 35 to 74 years). Results Six types of activities that accounted for 87% of the population variability in PA estimated with the Minnesota questionnaire were selected. The short questionnaire estimates energy expenditure in total PA and by intensity (light, moderate, vigorous), and includes two questions about sedentary behavior and one about occupational PA. It showed high reliability, with intraclass correlation coefficients ranging from 0.79 to 0.95. The Spearman correlation coefficients between the energy expenditure estimated by the questionnaire and the number of steps detected by the accelerometer were 0.36 for total PA, 0.40 for moderate intensity, and 0.26 for vigorous intensity. The questionnaire was sensitive in detecting changes in moderate and vigorous PA (correlation coefficients ranging from 0.26 to 0.34). Conclusion The REGICOR short questionnaire is reliable, valid, and sensitive to change in moderate and vigorous PA. This questionnaire could be used in daily clinical practice and in epidemiological studies. PMID:28085886

  18. Mid-frequency Band Dynamics of Large Space Structures

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.; Adams, Douglas S.

    2004-01-01

    High and low intensity dynamic environments experienced by a spacecraft during launch and on-orbit operations, respectively, induce structural loads and motions, which are difficult to reliably predict. Structural dynamics in low- and mid-frequency bands are sensitive to component interface uncertainty and non-linearity as evidenced in laboratory testing and flight operations. Analytical tools for prediction of linear system response are not necessarily adequate for reliable prediction of mid-frequency band dynamics and analysis of measured laboratory and flight data. A new MATLAB toolbox, designed to address the key challenges of mid-frequency band dynamics, is introduced in this paper. Finite-element models of major subassemblies are defined following rational frequency-wavelength guidelines. For computational efficiency, these subassemblies are described as linear, component mode models. The complete structural system model is composed of component mode subassemblies and linear or non-linear joint descriptions. Computation and display of structural dynamic responses are accomplished employing well-established, stable numerical methods, modern signal processing procedures and descriptive graphical tools. Parametric sensitivity and Monte-Carlo based system identification tools are used to reconcile models with experimental data and investigate the effects of uncertainties. Models and dynamic responses are exported for employment in applications, such as detailed structural integrity and mechanical-optical-control performance analyses.

  19. Integrated Droplet-Based Microextraction with ESI-MS for Removal of Matrix Interference in Single-Cell Analysis.

    PubMed

    Zhang, Xiao-Chao; Wei, Zhen-Wei; Gong, Xiao-Yun; Si, Xing-Yu; Zhao, Yao-Yao; Yang, Cheng-Dui; Zhang, Si-Chun; Zhang, Xin-Rong

    2016-04-29

    Integrating droplet-based microfluidics with mass spectrometry is essential to high-throughput and multiple analysis of single cells. Nevertheless, matrix effects such as the interference of culture medium and intracellular components influence the sensitivity and the accuracy of results in single-cell analysis. To resolve this problem, we developed a method that integrated droplet-based microextraction with single-cell mass spectrometry. Specific extraction solvent was used to selectively obtain intracellular components of interest and remove interference of other components. Using this method, UDP-Glc-NAc, GSH, GSSG, AMP, ADP and ATP were successfully detected in single MCF-7 cells. We also applied the method to study the change of unicellular metabolites in the biological process of dysfunctional oxidative phosphorylation. The method could not only realize matrix-free, selective and sensitive detection of metabolites in single cells, but also have the capability for reliable and high-throughput single-cell analysis.

  20. Finger tapping impairments are highly sensitive for evaluating upper motor neuron lesions.

    PubMed

    Shirani, Afsaneh; Newton, Braeden D; Okuda, Darin T

    2017-03-21

    Identifying highly sensitive and reliable neurological exam components is crucial in recognizing clinical deficits. This study aimed to investigate differences in finger tapping performance between patients with CNS demyelinating lesions and healthy control subjects. Twenty-three patients with multiple sclerosis or clinically isolated syndrome with infratentorial and/or cervical cord lesions on MRI, and 12 healthy controls, were videotaped while tapping the tip of the index finger against the tip and distal crease of the thumb using both the dominant and non-dominant hand. Videos were assessed independently by 10 evaluators (three MS neurologists, four neurology residents, and three advanced practice providers), and the sensitivity and inter-evaluator reliability of the finger tapping interpretations were calculated. A total of 1400 evaluations (four videos for each of the 35 subjects, evaluated by 10 independent providers) were obtained. Impairments in finger tapping against the distal thumb crease of the non-dominant hand, identified by neurologists, had the greatest sensitivity (84%, p < 0.001) for detecting impairment, and tapping against the thumb crease was more sensitive than against the thumb tip across all categories of providers. The best inter-evaluator reliability was associated with the neurologists' evaluations of the thumb crease of the non-dominant hand (kappa = 0.83, p < 0.001). Impaired finger tapping against the distal thumb crease of the non-dominant hand was a more sensitive technique for detecting impairments related to CNS demyelinating lesions. Our findings highlight the importance of precise examination of the non-dominant side, where impaired fine motor control secondary to an upper motor neuron injury might be detectable earlier than on the dominant side.

  1. Biomarkers for diet and cancer prevention research: potentials and challenges.

    PubMed

    Davis, Cindy D; Milner, John A

    2007-09-01

    As cancer incidence is projected to increase for decades there is a need for effective preventive strategies. Fortunately, evidence continues to mount that altering dietary habits is an effective and cost-efficient approach for reducing cancer risk and for modifying the biological behavior of tumors. Predictive, validated and sensitive biomarkers, including those that reliably evaluate "intake" or exposure to a specific food or bioactive component, that assess one or more specific biological "effects" that are linked to cancer, and that effectively predict individual "susceptibility" as a function of nutrient-nutrient interactions and genetics, are fundamental to evaluating who will benefit most from dietary interventions. These biomarkers must be readily accessible, easily and reliably assayed, and predictive of a key process(es) involved in cancer. The response to a food is determined not only by the effective concentration of the bioactive food component(s) reaching the target tissue, but also by the amount of the target requiring modification. Thus, this threshold response to foods and their components will vary from individual to individual. The key to understanding a personalized response is a greater knowledge of nutrigenomics, proteomics and metabolomics.

  2. Reliability analysis of a phasor measurement unit using a generalized fuzzy lambda-tau (GFLT) technique.

    PubMed

    Komal

    2018-05-01

    Power consumption is increasing day by day. Meeting the requirement for failure-free power calls for the planning and implementation of an effective and reliable power management system. The phasor measurement unit (PMU) is one of the key devices in wide area measurement and control systems, and its reliable performance helps assure failure-free power supply. The purpose of the present study is therefore to analyse the reliability of a PMU used for controllability and observability of power systems, utilizing the available uncertain data. In this paper, a generalized fuzzy lambda-tau (GFLT) technique has been proposed for this purpose. In the GFLT, the system components' uncertain failure and repair rates are fuzzified using fuzzy numbers of different shapes, such as triangular, normal, Cauchy, sharp gamma and trapezoidal; system experts' opinions were considered in selecting a suitable fuzzy number for quantifying data uncertainty. The GFLT technique applies fault trees, the lambda-tau method, data fuzzified using different membership functions, and alpha-cut based fuzzy arithmetic operations to compute several important reliability indices. Furthermore, ranking of the system's critical components using the RAM-Index and a sensitivity analysis have also been performed. The developed technique may help improve system performance significantly and can be applied to analyse the fuzzy reliability of other engineering systems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
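
The lambda-tau part of the GFLT technique propagates fuzzified failure rates through the fault tree using alpha-cut interval arithmetic. A minimal sketch for triangular fuzzy numbers and a series (OR-gate) subsystem, assuming the standard lambda-tau rule that series failure rates add; all numbers are illustrative:

```python
def tri_alpha_cut(tfn, alpha):
    """Alpha-cut [lower, upper] of a triangular fuzzy number (a, m, b)."""
    a, m, b = tfn
    return (a + alpha * (m - a), b - alpha * (b - m))

def series_failure_rate_cut(tfns, alpha):
    """Alpha-cut of the system failure rate for components in series
    (OR gate): lambda_sys = sum(lambda_i), so interval bounds add."""
    cuts = [tri_alpha_cut(t, alpha) for t in tfns]
    return (sum(lo for lo, _ in cuts), sum(hi for _, hi in cuts))
```

Sweeping alpha from 0 to 1 reconstructs the fuzzy membership function of the system failure rate; at alpha = 1 the interval collapses to the crisp (modal) value.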

  3. Ceramics for engines

    NASA Technical Reports Server (NTRS)

    Kiser, James D.; Levine, Stanley R.; Dicarlo, James A.

    1987-01-01

    Structural ceramics have been under nearly continuous development for various heat engine applications since the early 1970s. These efforts were sustained by the properties that ceramics offer in the areas of high-temperature strength, environmental resistance, and low density, and by the large benefits in system efficiency and performance that can result. The promise of ceramics has not been realized because their brittle nature results in high sensitivity to microscopic flaws and catastrophic fracture behavior. This translates into low reliability for ceramic components and has thus limited their application in engines. For structural ceramics to successfully make inroads into the terrestrial heat engine market requires further advances in low-cost, net-shape fabrication of high-reliability components and improvements in properties such as toughness and strength. Even with these advances, initial use of ceramics in aerospace engines will be limited to noncritical applications. For critical aerospace applications, an additional requirement is that the components display markedly improved toughness and noncatastrophic or graceful fracture. Thus the major emphasis is on fiber-reinforced ceramics.

  4. Chemical profiling of Qixue Shuangbu Tincture by ultra-performance liquid chromatography with electrospray ionization quadrupole-time-of-flight high-definition mass spectrometry (UPLC-QTOF/MS).

    PubMed

    Chen, Lin-Wei; Wang, Qin; Qin, Kun-Ming; Wang, Xiao-Li; Wang, Bin; Chen, Dan-Ni; Cai, Bao-Chang; Cai, Ting

    2016-02-01

    The present study was designed to develop and validate a sensitive and reliable ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF/MS) method to separate and identify the chemical constituents of Qixue Shuangbu Tincture (QXSBT), a classic traditional Chinese medicine (TCM) prescription. Under the optimized UPLC and QTOF/MS conditions, 56 components of QXSBT, including chalcones, triterpenoids, protopanaxatriol, flavones and flavanones, were identified and tentatively characterized within a running time of 42 min. The components were identified by comparing retention times, accurate masses, and characteristic mass spectrometric fragment ions, and by matching empirical molecular formulae with those of published compounds. In conclusion, the established UPLC-QTOF/MS method is reliable for rapid identification of complicated components in TCM prescriptions. Copyright © 2016 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.

  5. Fast gas spectroscopy using pulsed quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Beyer, T.; Braun, M.; Lambrecht, A.

    2003-03-01

    Laser spectroscopy has found many industrial applications, e.g., control of automotive exhaust and process monitoring. The mid-infrared region is of special interest because its absorption lines are stronger than those in the near infrared (NIR). In the NIR, however, high-quality, reliable laser sources, detectors, and passive optical components are readily available. Quantum cascade lasers could change this situation if their fundamental advantages can be exploited in compact and reliable systems. It will be shown that, using pulsed lasers and available fast detectors, lower residual sensitivity levels than in corresponding NIR systems can be achieved. The stability is sufficient for industrial applications.

  6. Fundamentals of endoscopic surgery: creation and validation of the hands-on test.

    PubMed

    Vassiliou, Melina C; Dunkin, Brian J; Fried, Gerald M; Mellinger, John D; Trus, Thadeus; Kaneva, Pepa; Lyons, Calvin; Korndorffer, James R; Ujiki, Michael; Velanovich, Vic; Kochman, Michael L; Tsuda, Shawn; Martinez, Jose; Scott, Daniel J; Korus, Gary; Park, Adrian; Marks, Jeffrey M

    2014-03-01

    The Fundamentals of Endoscopic Surgery™ (FES) program consists of online materials and didactic and skills-based tests. All components were designed to measure the skills and knowledge required to perform safe flexible endoscopy. The purpose of this multicenter study was to evaluate the reliability and validity of the hands-on component of the FES examination, and to establish the pass score. Expert endoscopists identified the critical skill set required for flexible endoscopy. These skills were then modeled in a virtual reality simulator (GI Mentor™ II, Simbionix™ Ltd., Airport City, Israel) to create five tasks with associated metrics. Scores were designed to measure both speed and precision. Validity evidence was assessed by correlating performance with self-reported endoscopic experience (surgeons and gastroenterologists [GIs]). Internal consistency of each test task was assessed using Cronbach's alpha. Test-retest reliability was determined by having the same participant perform the test a second time and comparing their scores. Passing scores were determined by a contrasting groups methodology and use of receiver operating characteristic curves. A total of 160 participants (17 % GIs) performed the simulator test. Scores on the five tasks showed good internal consistency reliability and all had significant correlations with endoscopic experience. Total FES scores correlated 0.73 with participants' level of endoscopic experience, providing evidence of their validity, and their internal consistency reliability (Cronbach's alpha) was 0.82. Test-retest reliability was assessed in 11 participants, and the intraclass correlation was 0.85. The passing score was determined and is estimated to have a sensitivity (true positive rate) of 0.81 and a false positive rate (1 − specificity) of 0.21. The FES hands-on skills test examines the basic procedural components required to perform safe flexible endoscopy. 
It meets rigorous standards of reliability and validity required for high-stakes examinations, and, together with the knowledge component, may help contribute to the definition and determination of competence in endoscopy.
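
The contrasting-groups procedure for setting a pass score can be sketched as a scan over candidate cutoffs, picking the one that best separates the two groups. The sketch below uses Youden's J as the selection criterion; the study's exact criterion may differ, and the scores are invented:

```python
import numpy as np

def pick_pass_score(competent, noncompetent):
    """Scan candidate cutoffs; at each, sensitivity is the fraction of
    competent examinees scoring at/above the cutoff and FPR the fraction
    of non-competent ones doing so. Return (cutoff, sensitivity, fpr)
    for the cutoff maximizing Youden's J = sensitivity - FPR."""
    scores = np.sort(np.unique(np.concatenate([competent, noncompetent])))
    best = None
    for c in scores:
        sens = np.mean(np.asarray(competent) >= c)
        fpr = np.mean(np.asarray(noncompetent) >= c)
        j = sens - fpr
        if best is None or j > best[0]:
            best = (j, c, sens, fpr)
    return best[1:]
```

Plotting sensitivity against FPR over all cutoffs gives the receiver operating characteristic curve referenced in the abstract.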

  7. Changes in the electromechanical delay components during a fatiguing stimulation in human skeletal muscle: an EMG, MMG and force combined approach.

    PubMed

    Cè, Emiliano; Rampichini, Susanna; Monti, Elena; Venturelli, Massimo; Limonta, Eloisa; Esposito, Fabio

    2017-01-01

    Peripheral fatigue involves electrochemical and mechanical mechanisms. A combined electromyographic, mechanomyographic and force approach may permit a kinetic evaluation of the changes at the synaptic, skeletal muscle fiber, and muscle-tendon unit level during a fatiguing stimulation. Surface electromyogram, mechanomyogram, force and stimulation current were detected from the gastrocnemius medialis muscle in twenty male participants during a fatiguing stimulation (twelve blocks of 35 Hz stimulation, duty cycle 9 s on/1 s off, total duration 120 s). The total electromechanical delay and its three components (between stimulation current and electromyogram onset, the synaptic component; between electromyogram and mechanomyogram onset, the muscle fiber electrochemical component; and between mechanomyogram and force onset, the mechanical component) were calculated. Interday reliability and sensitivity were determined. After fatigue, peak force decreased by 48% (P < 0.05) and the total electromechanical delay and its synaptic, electrochemical and mechanical components lengthened from 25.8 ± 0.9, 1.47 ± 0.04, 11.2 ± 0.6, and 13.1 ± 1.3 ms to 29.0 ± 1.6, 1.56 ± 0.05, 12.4 ± 0.9, and 17.2 ± 0.6 ms, respectively (P < 0.05). During fatigue, the total electromechanical delay and the mechanical component increased significantly after the 40th second and then remained stable. The synaptic and electrochemical components lengthened significantly after the 20th and 30th second, respectively. Interday reliability was high to very high, with an adequate level of sensitivity. The kinetic evaluation of the delays during the fatiguing stimulation highlighted different onsets and kinetics: the events at the synaptic level were the first to show significant lengthening, followed by those at the intra-fiber level. The mechanical events, which were the most affected by fatigue, were the last to lengthen.
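
Each delay component is the time difference between two signal onsets. A minimal onset-detection sketch using a simple amplitude-threshold rule (the authors' actual onset criterion is not specified here, and the threshold parameters are illustrative):

```python
import numpy as np

def onset_index(signal, baseline_n=100, k=3.0):
    """Return the first sample whose absolute value exceeds
    baseline mean + k * baseline SD, or None if never exceeded."""
    s = np.abs(np.asarray(signal, float))
    base = s[:baseline_n]
    thr = base.mean() + k * base.std()
    above = np.nonzero(s > thr)[0]
    return int(above[0]) if above.size else None

def delay_ms(onset_a, onset_b, fs):
    """Latency between two onsets (in samples) in milliseconds."""
    return (onset_b - onset_a) / fs * 1000.0
```

Applying `onset_index` to the EMG, MMG and force traces and differencing the resulting indices yields the electrochemical and mechanical delay components.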

  8. Soil life in reconstructed ecosystems: initial soil food web responses after rebuilding a forest soil profile for a climate change experiment

    Treesearch

    Paul T. Rygiewicz; Vicente J. Monleon; Elaine R. Ingham; Kendall J. Martin; Mark G. Johnson

    2010-01-01

    Disrupting ecosystem components, while transferring and reconstructing them for experiments can produce myriad responses. Establishing the extent of these biological responses as the system approaches a new equilibrium allows us more reliably to emulate comparable native systems. That is, the sensitivity of analyzing ecosystem processes in a reconstructed system is...

  9. Measuring verbal and non-verbal communication in aphasia: reliability, validity, and sensitivity to change of the Scenario Test.

    PubMed

    van der Meulen, Ineke; van de Sandt-Koenderman, W Mieke E; Duivenvoorden, Hugo J; Ribbers, Gerard M

    2010-01-01

    This study explores the psychometric qualities of the Scenario Test, a new test to assess daily-life communication in severe aphasia. The test is innovative in that it: (1) examines the effectiveness of verbal and non-verbal communication; and (2) assesses patients' communication in an interactive setting, with a supportive communication partner. To determine the reliability, validity, and sensitivity to change of the Scenario Test and discuss its clinical value. The Scenario Test was administered to 122 persons with aphasia after stroke and to 25 non-aphasic controls. Analyses were performed for the entire group of persons with aphasia, as well as for a subgroup of persons unable to communicate verbally (n = 43). Reliability (internal consistency, test-retest reliability, inter-judge, and intra-judge reliability) and validity (internal validity, convergent validity, known-groups validity) and sensitivity to change were examined using standard psychometric methods. The Scenario Test showed high levels of reliability. Internal consistency (Cronbach's alpha = 0.96; item-rest correlations = 0.58-0.82) and test-retest reliability (ICC = 0.98) were high. Agreement between judges in total scores was good, as indicated by the high inter- and intra-judge reliability (ICC = 0.86-1.00). Agreement in scores on the individual items was also good (square-weighted kappa values 0.61-0.92). The test demonstrated good levels of validity. A principal component analysis for categorical data identified two dimensions, interpreted as general communication and communicative creativity. Correlations with three other instruments measuring communication in aphasia, that is, Spontaneous Speech interview from the Aachen Aphasia Test (AAT), Amsterdam-Nijmegen Everyday Language Test (ANELT), and Communicative Effectiveness Index (CETI), were moderate to strong (0.50-0.85) suggesting good convergent validity. 
Group differences were observed between persons with aphasia and non-aphasic controls, as well as between persons with aphasia unable to use speech to convey information and those able to communicate verbally; this indicates good known-groups validity. The test was sensitive to changes in performance, measured over a period of 6 months. The data support the reliability and validity of the Scenario Test as an instrument for examining daily-life communication in aphasia. The test focuses on multimodal communication; its psychometric qualities enable future studies on the effect of Alternative and Augmentative Communication (AAC) training in aphasia.
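
Cronbach's alpha, reported above as 0.96, is computed from the item variances and the variance of total scores. A minimal sketch, assuming complete data with respondents in rows and items in columns:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score
    variance). items is a 2-D array, rows = respondents, columns = items."""
    X = np.asarray(items, float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)
```

Perfectly parallel items (identical columns) give alpha = 1; uncorrelated items push alpha toward 0.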

  10. Reliability and validity of the work and social adjustment scale in phobic disorders.

    PubMed

    Mataix-Cols, David; Cowley, Amy J; Hankins, Matthew; Schneider, Andreas; Bachofen, Martin; Kenwright, Mark; Gega, Lina; Cameron, Rachel; Marks, Isaac M

    2005-01-01

    The Work and Social Adjustment Scale (WSAS) is a simple widely used 5-item measure of disability whose psychometric properties need more analysis in phobic disorders. The reliability, factor structure, validity, and sensitivity to change of the WSAS were studied in 205 phobic patients (73 agoraphobia, 62 social phobia, and 70 specific phobia) who participated in various open and randomized trials of self-exposure therapy. Internal consistency of the WSAS was excellent in all phobics pooled and in agoraphobics and social phobics separately. Principal components analysis extracted a single general factor of disability. Specific phobics gave less consistent ratings across WSAS items, suggesting that some items were less relevant to their problem. Internal consistency was marginally higher for self-ratings than clinician ratings of the WSAS. Self-ratings and clinician ratings correlated highly though patients tended to rate themselves as more disabled than clinicians did. WSAS total scores reflected differences in phobic severity and improvement with treatment. The WSAS is a valid, reliable, and change-sensitive measure of work/social and other adjustment in phobic disorders, especially in agoraphobia and social phobia.

  11. Psychometrics of chronic liver disease questionnaire in Chinese chronic hepatitis B patients.

    PubMed

    Zhou, Kai-Na; Zhang, Min; Wu, Qian; Ji, Zhen-Hao; Zhang, Xiao-Mei; Zhuang, Gui-Hua

    2013-06-14

    To evaluate psychometrics of the Chinese (mainland) chronic liver disease questionnaire (CLDQ) in patients with chronic hepatitis B (CHB). A cross-sectional sample of 460 Chinese patients with CHB was selected from the Outpatient Department of the Eighth Hospital of Xi'an, including CHB (CHB without cirrhosis) (n = 323) and CHB-related cirrhosis (n = 137). The psychometrics includes reliability, validity and sensitivity. Internal consistency reliability was measured using Cronbach's α. Convergent and discriminant validity was evaluated by item-scale correlation. Factorial validity was explored by principal component analysis with varimax rotation. Sensitivity was assessed using Cohen's effect size (ES), and independent sample t test between CHB and CHB-related cirrhosis groups and between alanine aminotransferase (ALT) normal and abnormal groups after stratifying the disease (CHB and CHB-related cirrhosis). Internal consistency reliability of the CLDQ was 0.83 (range: 0.65-0.90). Most of the hypothesized item-scale correlations were 0.40 or over, and all of such hypothesized correlations were higher than the alternative ones, indicating satisfactory convergent and discriminant validity. Six factors were extracted after varimax rotation from the 29 items of CLDQ. The eligible Cohen's ES with statistically significant independent sample t test was found in the overall CLDQ and abdominal, systematic, activity scales (CHB vs CHB-related cirrhosis), and in the overall CLDQ and abdominal scale in the stratification of patients with CHB (ALT normal vs abnormal). The CLDQ has acceptable reliability, validity and sensitivity in Chinese (mainland) patients with CHB.
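
Cohen's effect size for a two-group comparison such as CHB vs CHB-related cirrhosis is the mean difference scaled by the pooled standard deviation. A minimal sketch (the data in the test are invented):

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d: standardized mean difference with pooled SD."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                     / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled
```

By the usual convention, |d| around 0.2 is a small effect, 0.5 medium, and 0.8 large.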

  12. The development of an instrument to measure quality of vision: the Quality of Vision (QoV) questionnaire.

    PubMed

    McAlinden, Colm; Pesudovs, Konrad; Moore, Jonathan E

    2010-11-01

    To develop an instrument to measure subjective quality of vision: the Quality of Vision (QoV) questionnaire. A 30-item instrument was designed with 10 symptoms rated in each of three scales (frequency, severity, and bothersome). The QoV was completed by 900 subjects in groups of spectacle wearers, contact lens wearers, and those having had laser refractive surgery, intraocular refractive surgery, or eye disease and investigated with Rasch analysis and traditional statistics. Validity and reliability were assessed by Rasch fit statistics, principal components analysis (PCA), person separation, differential item functioning (DIF), item targeting, construct validity (correlation with visual acuity, contrast sensitivity, total root mean square [RMS] higher order aberrations [HOA]), and test-retest reliability (two-way random intraclass correlation coefficients [ICC] and 95% repeatability coefficients [R(c)]). Rasch analysis demonstrated good precision, reliability, and internal consistency for all three scales (mean square infit and outfit within 0.81-1.27; PCA >60% variance explained by the principal component; person separation 2.08, 2.10, and 2.01 respectively; and minimal DIF). Construct validity was indicated by strong correlations with visual acuity, contrast sensitivity and RMS HOA. Test-retest reliability was evidenced by a minimum ICC of 0.867 and a minimum 95% R(c) of 1.55 units. The QoV Questionnaire consists of a Rasch-tested, linear-scaled, 30-item instrument on three scales providing a QoV score in terms of symptom frequency, severity, and bothersome. It is suitable for measuring QoV in patients with all types of refractive correction, eye surgery, and eye disease that cause QoV problems.
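
The 95% repeatability coefficient R(c) used above for test-retest reliability is commonly computed, Bland-Altman style, as 1.96 times the SD of the paired test-retest differences. A sketch under that assumption:

```python
import numpy as np

def repeatability_coefficient(test, retest):
    """95% repeatability coefficient: 1.96 * SD of paired differences.
    Roughly 95% of repeat measurements are expected to differ by less
    than this amount."""
    d = np.asarray(test, float) - np.asarray(retest, float)
    return 1.96 * d.std(ddof=1)
```

A minimum R(c) of 1.55 units, as reported, means repeat administrations of a scale rarely differ by more than about 1.55 scale units.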

  13. Simultaneous determination of 15 phenolic constituents of Chinese black rice wine by HPLC-MS/MS with SPE.

    PubMed

    Wang, Yutang; Liu, Yuanyuan; Xiao, Chunxia; Liu, Laping; Hao, Miao; Wang, Jianguo; Liu, Xuebo

    2014-06-01

    This study established a new method for quantitative and qualitative determination of certain components in black rice wine, a traditional Chinese brewed wine. Specifically, we combined solid-phase extraction and high-performance liquid chromatography (HPLC) with triple quadrupole mass spectrometry (MS/MS) to determine 8 phenolic acids, 3 flavonols, and 4 anthocyanins in black rice wine. First, we cleaned samples with OASIS HLB cartridges and optimized the extraction parameters. Next, we performed separation on a SHIM-PACK XR-ODS column (I.D. 3.0 mm × 75 mm, 2.2 μm particle size) with a gradient elution of 50% aqueous acetonitrile (V/V) and water, both containing 0.2% formic acid. We used multiple-reaction monitoring scanning for quantification, switching the electrospray ion source polarity between positive and negative modes in a single chromatographic run. We detected all 15 phenolic compounds within 38 min under the optimized conditions. Limits of detection ranged from 0.008 to 0.030 mg/L, and average recoveries ranged from 60.8 to 103.1% with relative standard deviation ≤8.6%. We validated the method and found it to be sensitive and reliable for quantifying phenolic compounds in rice wine matrices. © 2014 Institute of Food Technologists®
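
Spike recovery and relative standard deviation, the validation figures quoted above, are simple ratios. A minimal sketch (the values in the test are illustrative, not the study's data):

```python
import numpy as np

def recovery_percent(measured, spiked):
    """Spike recovery: measured amount as a percentage of the amount added."""
    return 100.0 * measured / spiked

def rsd_percent(replicates):
    """Relative standard deviation (%) of replicate measurements."""
    x = np.asarray(replicates, float)
    return 100.0 * x.std(ddof=1) / x.mean()
```

Method-validation guidelines typically require recoveries near 100% and RSDs below a stated ceiling (here, ≤8.6%).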

  14. Self-Noise of the STS-2 and sensitivity of its computation to errors in alignment of sensors

    NASA Astrophysics Data System (ADS)

    Gerner, Andreas; Sleeman, Reinoud; Grasemann, Bernhard; Lenhardt, Wolfgang

    2016-04-01

    The assessment of a seismometer's self-noise is an important part of establishing its health, quality, and suitability. A spectral coherence technique proposed by Sleeman et al. (2006), using synchronously recorded data from triples of collocated and co-aligned seismometers, has been shown to be a very robust and reliable way to estimate the self-noise of modern broadband seismic sensors. Previous work has demonstrated that the resulting self-noise spectra, primarily in the frequency range of Earth's microseisms, are considerably affected by small errors in the alignment of the sensors. Because of this sensitivity of the 3-channel correlation technique to misalignment, the recorded traces can be numerically rotated prior to self-noise computation to find the best possible alignment by searching for minimum self-noise values. In this study we focus on the sensitivity of the 3-channel correlation technique to misalignment, and investigate the possibility of complete removal of the microseism signal from self-noise estimates for each of the sensors' three components separately. Data from a long-term installation of four STS-2 sensors at the Conrad Observatory (Austria), specifically intended for self-noise studies and run as a collaboration between the KNMI (Netherlands) and the ZAMG (Austria), provide a reliable basis for an accurate sensitivity analysis and self-noise assessment. Our work has produced undisturbed self-noise estimates for the vertical components; our current focus is on improving the alignment of the horizontal axes and on verifying the manufacturer's specification regarding orthogonality of all three components. The tools and methods developed within this research can help to quickly establish consistent self-noise models, including estimates of orthogonality and alignment, which facilitates comparison of different models and provides a means to test the quality and accuracy of a seismic sensor over its life span.
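
The Sleeman et al. (2006) three-channel technique estimates a sensor's incoherent self-noise from auto- and cross-spectra of three collocated recordings of the same ground motion. A minimal sketch using Welch cross-spectral densities on synthetic data; the real processing chain additionally involves instrument response correction and careful windowing:

```python
import numpy as np
from scipy.signal import csd

def three_channel_self_noise(x1, x2, x3, fs, nperseg=1024):
    """Sleeman-style estimate of sensor 1's self-noise PSD,
    N11 = P11 - P13 * P21 / P23, assuming all three sensors
    record the same coherent ground signal plus independent noise."""
    f, p11 = csd(x1, x1, fs=fs, nperseg=nperseg)
    _, p13 = csd(x1, x3, fs=fs, nperseg=nperseg)
    _, p21 = csd(x2, x1, fs=fs, nperseg=nperseg)
    _, p23 = csd(x2, x3, fs=fs, nperseg=nperseg)
    return f, np.real(p11 - p13 * p21 / p23)
```

Because the coherent signal cancels in the cross-spectral ratio, the estimate is dominated by sensor 1's own noise, which is exactly why small misalignments (which leak coherent microseism energy into the residual) bias the result.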

  15. Development of a Pressure Sensitive Paint System with Correction for Temperature Variation

    NASA Technical Reports Server (NTRS)

    Simmons, Kantis A.

    1995-01-01

    Pressure Sensitive Paint (PSP) is known to provide a global image of pressure over a model surface. However, improvements in its accuracy and reliability are needed. Several factors contribute to the inaccuracy of PSP. One major factor is that luminescence is temperature dependent. To correct the luminescence of the pressure sensing component for changes in temperature, a temperature sensitive luminophore incorporated in the paint allows the user to measure both pressure and temperature simultaneously on the surface of a model. Magnesium Octaethylporphine (MgOEP) was used as a temperature sensing luminophore, with the pressure sensing luminophore, Platinum Octaethylporphine (PtOEP), to correct for temperature variations in model surface pressure measurements.
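
Dual-luminophore PSP data are typically reduced through a Stern-Volmer-type calibration, with the temperature channel used to correct the pressure channel's coefficients. A sketch with invented linear-in-temperature coefficients (not a real MgOEP/PtOEP calibration):

```python
def psp_pressure(i_ref_over_i, temp_c,
                 a0=0.15, a1=0.001, b0=0.85, b1=-0.002, p_ref=101.3):
    """Invert a Stern-Volmer calibration I_ref/I = A(T) + B(T) * P/P_ref.
    The linear-in-temperature A(T), B(T) coefficients here are purely
    illustrative placeholders, not from any published paint calibration."""
    A = a0 + a1 * temp_c
    B = b0 + b1 * temp_c
    return p_ref * (i_ref_over_i - A) / B
```

The temperature map recovered from the MgOEP channel feeds `temp_c` pixel by pixel, removing the temperature-induced luminescence error from the PtOEP pressure map.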

  16. FY12 End of Year Report for NEPP DDR2 Reliability

    NASA Technical Reports Server (NTRS)

    Guertin, Steven M.

    2013-01-01

    This document reports the status of the NASA Electronic Parts and Packaging (NEPP) Double Data Rate 2 (DDR2) Reliability effort for FY2012. The task expanded its focus regarding the reliability effects targeted for device examination: FY11 work had highlighted the need to test many more parts and to examine more operating conditions in order to provide useful recommendations for NASA users of these devices. This year's efforts focused on development of test capabilities, particularly those that can be used to determine overall lot quality and identify outlier devices, and on test methods that can be employed on components for flight use. Flight acceptance of components potentially includes considerable time for up-screening, though this time may not currently be used for much reliability testing. Manufacturers are much more knowledgeable about the relevant reliability mechanisms for each of their devices; because we are not in a position to know the appropriate reliability tests for any given device, we must perform a large campaign of reliability tests to identify devices with degraded reliability. With the available up-screening time for NASA parts, it is possible to run many device performance studies, including verification of basic datasheet characteristics and significant pattern sensitivity studies. By doing these studies we can establish higher reliability of flight components. Developing these approaches requires test capability that can identify reliability outliers: we must test many devices to ensure outliers are in the sample, and we must develop characterization capability to measure many different parameters. For FY12 we increased both the capability for reliability characterization and the sample size. 
We increased sample size this year by moving from loose devices to dual inline memory modules (DIMMs), reducing the per-device-under-test (DUT) cost by a factor of roughly 20 to 50. The larger sample size improves our ability to characterize devices that may be reliability outliers. This report provides an update on the effort to improve DDR2 testing capability. Although focused on DDR2, the methods being used can be extended to DDR and DDR3 with relative ease.
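
One simple way to flag reliability outliers in a large sample of parts is a robust z-score on a measured parameter (median and MAD rather than mean and SD, so the outliers themselves do not inflate the estimated spread). This is a generic screening sketch, not the NEPP task's actual method; the values are invented:

```python
import numpy as np

def flag_outliers(values, thresh=3.5):
    """Flag devices whose measured parameter sits more than `thresh`
    scaled median-absolute-deviations from the sample median."""
    x = np.asarray(values, float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    rz = 0.6745 * (x - med) / mad   # 0.6745 makes MAD comparable to SD
    return np.abs(rz) > thresh
```

Running the screen per datasheet parameter (timing margins, refresh behavior, pattern-sensitivity counts) isolates candidate outlier devices for focused follow-up testing.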

  17. Usefulness of component resolved analysis of cat allergy in routine clinical practice.

    PubMed

    Eder, Katharina; Becker, Sven; San Nicoló, Marion; Berghaus, Alexander; Gröger, Moritz

    2016-01-01

    Cat allergy is of great importance, and its prevalence is increasing worldwide. Cat allergens and house dust mite allergens are the major indoor allergens, and they are ubiquitous. Cat sensitization and allergy are known risk factors for rhinitis, bronchial hyperreactivity and asthma; the diagnosis of sensitization to cats is therefore important for any allergist. Seventy patients with positive skin prick tests for cats were retrospectively compared with regard to their skin prick test results and their specific immunoglobulin E antibody profiles against the native cat extract, rFel d 1, nFel d 2 and rFel d 4. Thirty-five patients were allergic to cats, as determined by a positive anamnesis and/or nasal provocation with cat allergens, and 35 patients exhibited clinically non-relevant sensitization, as indicated by a negative anamnesis and/or a negative nasal allergen challenge. Serology testing with the native cat extract detected 100% of the patients who were allergic to cats but missed eight patients who showed sensitization in the skin prick test without allergic symptoms. The median values of the skin prick test, as well as those of the specific immunoglobulin E antibodies against the native cat extract, were significantly higher for allergic patients than for patients with clinically non-relevant sensitization. Component-based diagnostic testing with rFel d 1 was less reliable, and sensitization to nFel d 2 and rFel d 4 was seen only in individual patients. Extract-based diagnostic methods for identifying cat allergy and sensitization, such as the skin prick test and native cat extract serology, remain crucial in routine clinical practice. In our study, component-based diagnostic testing could not replace these methods for detecting sensitization to cats and differentiating between allergy and sensitization without clinical relevance. 
However, component-resolved allergy diagnostic tools have individual implications, and future studies may facilitate a better understanding of their use and subsequently improve the clinical management of allergic patients.

  18. Ribo-attenuators: novel elements for reliable and modular riboswitch engineering.

    PubMed

    Folliard, Thomas; Mertins, Barbara; Steel, Harrison; Prescott, Thomas P; Newport, Thomas; Jones, Christopher W; Wadhams, George; Bayer, Travis; Armitage, Judith P; Papachristodoulou, Antonis; Rothschild, Lynn J

    2017-07-04

    Riboswitches are structural genetic regulatory elements that directly couple the sensing of small molecules to gene expression. They have considerable potential for applications throughout synthetic biology and bio-manufacturing, as they are able to sense a wide range of small molecules and regulate gene expression in response. Despite over a decade of research, they have yet to reach this considerable potential because they cannot yet be treated as modular components, owing to several limitations including sensitivity to changes in genetic context, low tunability, and variability in performance. To overcome these difficulties, we have designed and introduced in bacteria a novel genetic element called a ribo-attenuator. This genetic element allows for predictable tuning, insulation from contextual changes, and a reduction in expression variation. Ribo-attenuators thus allow riboswitches to be treated as truly modular and tunable components, increasing their reliability for a wide range of applications.

  19. Evaluating the reliability of the stream tracer approach to characterize stream-subsurface water exchange

    USGS Publications Warehouse

    Harvey, Judson W.; Wagner, Brian J.; Bencala, Kenneth E.

    1996-01-01

    Stream water was locally recharged into shallow groundwater flow paths that returned to the stream (hyporheic exchange) in St. Kevin Gulch, a Rocky Mountain stream in Colorado contaminated by acid mine drainage. Two approaches were used to characterize hyporheic exchange: sub-reach-scale measurement of hydraulic heads and hydraulic conductivity to compute streambed fluxes (hydrometric approach), and reach-scale modeling of in-stream solute tracer injections to determine characteristic length and timescales of exchange with storage zones (stream tracer approach). Subsurface data were the standard of comparison used to evaluate the reliability of the stream tracer approach to characterize hyporheic exchange. The reach-averaged hyporheic exchange flux (1.5 mL s−1 m−1), determined by hydrometric methods, was largest when stream base flow was low (10 L s−1); hyporheic exchange persisted when base flow was 10-fold higher, decreasing by approximately 30%. Reliability of the stream tracer approach to detect hyporheic exchange was assessed using first-order uncertainty analysis that considered model parameter sensitivity. The stream tracer approach did not reliably characterize hyporheic exchange at high base flow: the model was apparently more sensitive to exchange with surface water storage zones than with the hyporheic zone. At low base flow the stream tracer approach reliably characterized exchange between the stream and gravel streambed (timescale of hours) but was relatively insensitive to slower exchange with deeper alluvium (timescale of tens of hours) that was detected by subsurface measurements. The stream tracer approach was therefore not equally sensitive to all timescales of hyporheic exchange. 
We conclude that while the stream tracer approach is an efficient means to characterize surface-subsurface exchange, future studies will need to more routinely consider decreasing sensitivities of tracer methods at higher base flow and a potential bias toward characterizing only a fast component of hyporheic exchange. Stream tracer models with multiple rate constants to consider both fast exchange with streambed gravel and slower exchange with deeper alluvium appear to be warranted.
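
Transient-storage stream tracer models of the kind discussed above couple the channel to a storage zone through a first-order exchange coefficient. A minimal sketch of just the exchange terms (an OTIS-type formulation with advection and dispersion omitted; parameters are invented):

```python
def storage_exchange(c0, cs0, alpha, area_ratio, dt, steps):
    """Euler integration of the storage-exchange terms of a
    transient-storage model:
        dC/dt  = alpha * (Cs - C)            # channel concentration
        dCs/dt = alpha * (A/As) * (C - Cs)   # storage-zone concentration
    alpha is the exchange coefficient, area_ratio = A/As the ratio of
    channel to storage cross-sectional areas."""
    c, cs = float(c0), float(cs0)
    out = []
    for _ in range(steps):
        dc = alpha * (cs - c)
        dcs = alpha * area_ratio * (c - cs)
        c, cs = c + dt * dc, cs + dt * dcs
        out.append((c, cs))
    return out
```

The exchange timescale is set by alpha: a single rate constant forces one timescale, which is why the article argues for models with multiple rate constants to capture both fast gravel-bed exchange and slow deep-alluvium exchange.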

  20. Origins of chemoreceptor curvature sorting in Escherichia coli

    PubMed Central

    Draper, Will; Liphardt, Jan

    2017-01-01

    Bacterial chemoreceptors organize into large clusters at the cell poles. Despite a wealth of structural and biochemical information on the system's components, it is not clear how chemoreceptor clusters are reliably targeted to the cell pole. Here, we quantify the curvature-dependent localization of chemoreceptors in live cells by artificially deforming growing cells of Escherichia coli in curved agar microchambers, and find that chemoreceptor cluster localization is highly sensitive to membrane curvature. Through analysis of multiple mutants, we conclude that curvature sensitivity is intrinsic to chemoreceptor trimers-of-dimers, and results from conformational entropy within the trimer-of-dimers geometry. We use the principles of the conformational entropy model to engineer curvature sensitivity into a series of multi-component synthetic protein complexes. When expressed in E. coli, the synthetic complexes form large polar clusters, and a complex with inverted geometry avoids the cell poles. This demonstrates the successful rational design of both polar and anti-polar clustering, and provides a synthetic platform on which to build new systems. PMID:28322223

  1. Psychometrics of chronic liver disease questionnaire in Chinese chronic hepatitis B patients

    PubMed Central

    Zhou, Kai-Na; Zhang, Min; Wu, Qian; Ji, Zhen-Hao; Zhang, Xiao-Mei; Zhuang, Gui-Hua

    2013-01-01

    AIM: To evaluate the psychometrics of the Chinese (mainland) chronic liver disease questionnaire (CLDQ) in patients with chronic hepatitis B (CHB). METHODS: A cross-sectional sample of 460 Chinese patients with CHB was selected from the Outpatient Department of the Eighth Hospital of Xi’an, including CHB (CHB without cirrhosis) (n = 323) and CHB-related cirrhosis (n = 137). The psychometric properties evaluated include reliability, validity and sensitivity. Internal consistency reliability was measured using Cronbach’s α. Convergent and discriminant validity was evaluated by item-scale correlation. Factorial validity was explored by principal component analysis with varimax rotation. Sensitivity was assessed using Cohen’s effect size (ES) and independent-sample t tests between the CHB and CHB-related cirrhosis groups, and between alanine aminotransferase (ALT) normal and abnormal groups after stratifying by disease (CHB and CHB-related cirrhosis). RESULTS: Internal consistency reliability of the CLDQ was 0.83 (range: 0.65-0.90). Most of the hypothesized item-scale correlations were 0.40 or over, and all such hypothesized correlations were higher than the alternative ones, indicating satisfactory convergent and discriminant validity. Six factors were extracted after varimax rotation from the 29 items of the CLDQ. Eligible Cohen’s ES with statistically significant independent-sample t tests was found in the overall CLDQ and the abdominal, systemic and activity scales (CHB vs CHB-related cirrhosis), and in the overall CLDQ and abdominal scale in the stratification of patients with CHB (ALT normal vs abnormal). CONCLUSION: The CLDQ has acceptable reliability, validity and sensitivity in Chinese (mainland) patients with CHB. PMID:23801844
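The internal-consistency figure reported above (Cronbach's α = 0.83) follows a standard formula over item variances and total-score variance. As a minimal sketch (the item data here are invented for illustration, not taken from the CLDQ):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items: list of k lists, each holding one item's scores across n respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    k = len(items)
    item_var_sum = sum(variance(col) for col in items)
    totals = [sum(row) for row in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - item_var_sum / variance(totals))
```

When all items move together perfectly, the item variances are small relative to the total-score variance and alpha approaches 1; uncorrelated items drive it toward 0.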

  2. Modeling Imperfect Generator Behavior in Power System Operation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim

    A key component in power system operations is the use of computer models to quickly study and analyze different operating conditions and futures in an efficient manner. The output of these models is sensitive to the data used in them as well as the assumptions made during their execution. One typical assumption is that generators and load assets perfectly follow operator control signals. While this is a valid simulation assumption, generators may not always accurately follow control signals. This imperfect response of generators could impact cost and reliability metrics. This paper proposes a generator model that captures this imperfect behavior and examines its impact on production costs and reliability metrics using a steady-state power system operations model. Preliminary analysis shows that while costs remain relatively unchanged, there could be significant impacts on reliability metrics.
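The imperfect-following behavior described above can be caricatured with a first-order lag, in which a unit closes only a fraction of the gap to the operator's setpoint each dispatch interval. This is a hypothetical toy model for intuition, not the generator model proposed in the paper:

```python
def imperfect_response(setpoints, alpha=0.8, initial=0.0):
    """Hypothetical first-order lag: each step the unit closes only a fraction
    alpha of the gap between its current output and the control setpoint.
    alpha = 1.0 recovers the perfect-following assumption."""
    out, p = [], initial
    for sp in setpoints:
        p += alpha * (sp - p)  # partial movement toward the setpoint
        out.append(p)
    return out
```

Holding the setpoint at 100 MW from a cold start, the unit reaches 80, then 96, then 99.2 MW; the persistent shortfall is the kind of deviation whose cost and reliability consequences the paper quantifies.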

  3. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness.
The reliability did not change significantly for a change in distribution type except for a change in distribution from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate research and education components of this project resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.

  4. Uncertainties in obtaining high reliability from stress-strength models

    NASA Technical Reports Server (NTRS)

    Neal, Donald M.; Matthews, William T.; Vangel, Mark G.

    1992-01-01

    There has been a recent interest in determining high statistical reliability in risk assessment of aircraft components. The potential consequences of incorrectly assuming a particular statistical distribution for the stress or strength data used in obtaining high reliability values are identified. The reliability is computed as the probability of the strength being greater than the stress over the range of stress values. This method is often referred to as the stress-strength model. A sensitivity analysis was performed involving a comparison of reliability results in order to evaluate the effects of assuming specific statistical distributions. Both known population distributions, and those that differed slightly from the known, were considered. Results showed substantial differences in reliability estimates even for almost nondetectable differences in the assumed distributions. These differences represent a potential problem in using the stress-strength model for high reliability computations, since in practice it is impossible to ever know the exact (population) distribution. An alternative reliability computation procedure is examined involving determination of a lower bound on the reliability values using extreme value distributions. This procedure reduces the possibility of obtaining nonconservative reliability estimates. Results indicated the method can provide conservative bounds when computing high reliability.
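The stress-strength reliability defined above, R = P(strength > stress), has a closed form when both quantities are assumed independent and normal; the abstract's point is precisely that this distributional assumption is consequential in the high-reliability tail. A minimal sketch under the normal-normal assumption (parameter values are illustrative only):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def stress_strength_reliability(mu_str, sd_str, mu_load, sd_load):
    """R = P(strength > stress) for independent normal strength and stress:
    R = Phi((mu_str - mu_load) / sqrt(sd_str**2 + sd_load**2))."""
    z = (mu_str - mu_load) / math.hypot(sd_str, sd_load)
    return normal_cdf(z)
```

With strength ~ N(100, 5) and stress ~ N(70, 5), R sits above 0.9999; replacing either normal with a slightly heavier-tailed distribution of the same mean and variance changes this tail probability markedly, which is the sensitivity the study demonstrates.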

  5. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, John R.; Stolz, Christopher J.

    1993-08-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High-average-power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long term performance and reliability of the laser system.

  6. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, J. R.; Stolz, C. J.

    1992-12-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High-average-power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long term performance and reliability of the laser system.

  7. Development of Camera Electronics for the Advanced Gamma-ray Imaging System (AGIS)

    NASA Astrophysics Data System (ADS)

    Tajima, Hiroyasu

    2009-05-01

    AGIS, a next-generation atmospheric Cherenkov telescope array, aims to achieve a sensitivity level of a milliCrab for gamma-ray observations in the energy band of 40 GeV to 100 TeV. Such an improvement requires reducing the cost of individual components while maintaining high reliability, in order to equip the on the order of 100 telescopes necessary to achieve the sensitivity goal. We are exploring several design concepts to reduce the cost of the camera electronics while improving their performance. We have developed test systems for some of these concepts and are evaluating their performance; the test results are presented here.

  8. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge-base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
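The first-stage Markov reliability model can be illustrated with a small discrete-time chain that is propagated forward to obtain the probability the system is still operating after a given number of steps. The three states and transition probabilities below are hypothetical, not taken from the paper:

```python
def step(dist, P):
    """One discrete-time Markov step: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical 3-state model: 0 = operational, 1 = degraded, 2 = failed (absorbing).
P = [
    [0.95, 0.04, 0.01],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],
]

dist = [1.0, 0.0, 0.0]          # start fully operational
for _ in range(10):
    dist = step(dist, P)
reliability_10 = dist[0] + dist[1]  # P(not yet failed after 10 steps)
```

In the paper's method, entries like the 0.04 degradation rate would be tied to the expert system's performance parameters, so the chain links knowledge-base uncertainty to system-level reliability.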

  9. Nonlinear processing of a multicomponent communication signal by combination-sensitive neurons in the anuran inferior colliculus.

    PubMed

    Lee, Norman; Schrode, Katrina M; Bee, Mark A

    2017-09-01

    Diverse animals communicate using multicomponent signals. How a receiver's central nervous system integrates multiple signal components remains largely unknown. We investigated how female green treefrogs (Hyla cinerea) integrate the multiple spectral components present in male advertisement calls. Typical calls have a bimodal spectrum consisting of formant-like low-frequency (~0.9 kHz) and high-frequency (~2.7 kHz) components that are transduced by different sensory organs in the inner ear. In behavioral experiments, only bimodal calls reliably elicited phonotaxis in no-choice tests, and they were selectively chosen over unimodal calls in two-alternative choice tests. Single neurons in the inferior colliculus of awake, passively listening subjects were classified as combination-insensitive units (27.9%) or combination-sensitive units (72.1%) based on patterns of relative responses to the same bimodal and unimodal calls. Combination-insensitive units responded similarly to the bimodal call and one or both unimodal calls. In contrast, combination-sensitive units exhibited both linear responses (i.e., linear summation) and, more commonly, nonlinear responses (e.g., facilitation, compressive summation, or suppression) to the spectral combination in the bimodal call. These results are consistent with the hypothesis that nonlinearities play potentially critical roles in spectral integration and in the neural processing of multicomponent communication signals.

  10. Assessment of Power Quality based on Fuzzy Logic and Discrete Wavelet Transform for Nonstationary Disturbances

    NASA Astrophysics Data System (ADS)

    Sinha, Pampa; Nath, Sudipta

    2010-10-01

    The main aspects of power system delivery are reliability and quality. If all the customers of a power system receive uninterrupted power through the year, then the system is considered reliable. The term power quality refers to maintaining near-sinusoidal voltage at rated frequency at the consumer's end. The power component definitions of IEEE Standard 1459-2000, for both single-phase and unbalanced three-phase systems, are based on the Fast Fourier Transform (FFT). In the presence of nonstationary power quality (PQ) disturbances, these FFT-based definitions yield inaccurate values due to their sensitivity to spectral leakage. To overcome these limitations, the power quality components are calculated using the Discrete Wavelet Transform (DWT). To handle the uncertainties associated with electric power system operations, fuzzy logic has been incorporated in this paper. A new power quality index is introduced that can assess power quality under nonstationary disturbances.
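The DWT-based components replace the FFT's global spectrum with per-band quantities that remain meaningful for nonstationary signals. As a rough sketch (a hand-rolled Haar transform on an even-length signal, not the wavelet family or the IEEE 1459 power definitions used in the paper), band RMS values can be computed so that their squares sum to the signal's mean-square value (Parseval's relation):

```python
import math

def haar_step(x):
    """One Haar DWT level (assumes even length): (approximation, detail)."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x) - 1, 2)]
    return approx, detail

def band_rms(signal, levels):
    """RMS contribution of each detail band, plus the final approximation band,
    normalized so the squared entries sum to mean(signal**2)."""
    n = len(signal)
    out, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_step(a)
        out.append(math.sqrt(sum(c * c for c in d) / n))
    out.append(math.sqrt(sum(c * c for c in a) / n))
    return out
```

Because each Haar step is orthonormal, no energy is lost or double-counted across bands, which is the property that makes per-band power quantities self-consistent.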

  11. Tool development to assess the work related neck and upper limb musculoskeletal disorders among female garment workers in Sri-Lanka.

    PubMed

    Amarasinghe, Nirmalie Champika; De AlwisSenevirathne, Rohini

    2016-10-17

    Musculoskeletal disorders (MSDs) have been identified as a predisposing factor for lower productivity, but no validated tool has been developed to assess them in the Sri Lankan context. The aim was to develop a validated tool to assess neck and upper limb MSDs. Tool development comprised three components: item selection, item reduction using principal component analysis, and validation. A tentative self-administered questionnaire was developed, translated, and pre-tested. Four important domains - neck, shoulder, elbow and wrist - were identified through principal component analysis. Prevalence of any MSD was 38.1%, and prevalence of neck, shoulder, elbow and wrist MSDs was 12.85%, 13.71%, 12% and 13.71% respectively. Content and criterion validity of the tool were assessed. Separate ROC curves were produced, and the sensitivity and specificity of the neck (83.1%, 71.7%), shoulder (97.6%, 91.9%), elbow (98.2%, 87.2%), and wrist (97.6%, 94.9%) scales were determined. Cronbach's alpha and the correlation coefficient were above 0.7. The tool has high sensitivity, specificity, internal consistency, and test-retest reliability.
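The sensitivity/specificity pairs reported above come from dichotomizing a questionnaire score at a ROC-derived cutoff and comparing against the criterion diagnosis. A minimal illustration with invented scores and labels (not the study's data):

```python
def sensitivity_specificity(scores, labels, cutoff):
    """labels: 1 = has the disorder, 0 = does not; score >= cutoff flags positive.
    Returns (sensitivity, specificity)."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= cutoff)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < cutoff)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < cutoff)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping the cutoff and plotting sensitivity against 1 − specificity traces the ROC curve from which the study's operating points were chosen.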

  12. Repetition-related reductions in neural activity reveal component processes of mental simulation.

    PubMed

    Szpunar, Karl K; St Jacques, Peggy L; Robbins, Clifford A; Wig, Gagan S; Schacter, Daniel L

    2014-05-01

    In everyday life, people adaptively prepare for the future by simulating dynamic events about impending interactions with people, objects and locations. Previous research has consistently demonstrated that a distributed network of frontal-parietal-temporal brain regions supports this ubiquitous mental activity. Nonetheless, little is known about the manner in which specific regions of this network contribute to component features of future simulation. In two experiments, we used a functional magnetic resonance imaging (fMRI) repetition-suppression paradigm to demonstrate that distinct frontal-parietal-temporal regions are sensitive to processing the scenarios or what participants imagined was happening in an event (e.g., medial prefrontal, posterior cingulate, temporal-parietal and middle temporal cortices are sensitive to the scenarios associated with future social events), people (medial prefrontal cortex), objects (inferior frontal and premotor cortices) and locations (posterior cingulate/retrosplenial, parahippocampal and posterior parietal cortices) that typically constitute simulations of personal future events. This pattern of results demonstrates that the neural substrates of these component features of event simulations can be reliably identified in the context of a task that requires participants to simulate complex, everyday future experiences.

  13. Sensitive and reliable multianalyte quantitation of herbal medicine in rat plasma using dynamic triggered multiple reaction monitoring.

    PubMed

    Yan, Zhixiang; Li, Tianxue; Lv, Pin; Li, Xiang; Zhou, Chen; Yang, Xinghao

    2013-06-01

    There is a growing need both clinically and experimentally to improve the determination of the blood levels of multiple chemical constituents in herbal medicines. The conventional multiple reaction monitoring (cMRM), however, is not well suited for multi-component determination and could not provide qualitative information for identity confirmation. Here we apply a dynamic triggered MRM (DtMRM) algorithm for the quantification of 20 constituents in an herbal prescription Bu-Zhong-Yi-Qi-Tang (BZYQT) in rat plasma. Dynamic MRM (DMRM) dramatically reduced the number of concurrent MRM transitions that are monitored during each MS scan. This advantage has been enhanced with the addition of triggered MRM (tMRM) for simultaneous confirmation, which maximizes the dwell time in the primary MRM quantitation phase, and also acquires sufficient MRM data to create a composite product ion spectrum. By allowing optimized collision energy for each product ion and maximizing dwell times, tMRM is significantly more sensitive and reliable than conventional product ion scanning. The DtMRM approach provides much higher sensitivity and reproducibility than cMRM. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    In order to rectify the problems that the component reliability model exhibits deviation, and that the evaluation result is low because failure propagation is overlooked in traditional reliability evaluation of machine center components, a new reliability evaluation method based on cascading failure analysis and failure-influenced-degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influenced degrees of the system components are assessed by the adjacency matrix and its transpose, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center; the results show the following: 1) The reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) The difference between the comprehensive and inherent reliability of a system component presents a positive correlation with that component's failure influenced degree, which provides a theoretical basis for reliability allocation of the machine center system.
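The failure-influence degrees above are obtained by running PageRank over the cascading-failure digraph's adjacency matrix. A minimal power-iteration sketch (the damping factor and the toy three-component cycle are illustrative, not values from the paper):

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank on a directed adjacency matrix, where
    adj[i][j] = 1 means a failure of component i propagates to component j."""
    n = len(adj)
    rank = [1.0 / n] * n
    out_deg = [sum(row) for row in adj]
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n
        for i in range(n):
            if out_deg[i] == 0:              # dangling node: spread rank evenly
                for j in range(n):
                    new[j] += damping * rank[i] / n
            else:
                for j in range(n):
                    if adj[i][j]:
                        new[j] += damping * rank[i] / out_deg[i]
        rank = new
    return rank
```

Running this on the adjacency matrix scores components that receive many propagated failures; running it on the transpose scores components whose failures propagate widely, which is why the paper uses both.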

  15. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
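The adaptive scheme itself is beyond a short sketch, but the core idea it builds on, concentrating samples near the failure domain and reweighting by the density ratio, can be shown with plain (non-adaptive) importance sampling for a scalar failure event X > t with X ~ N(0, 1). All names and values here are illustrative:

```python
import math
import random

def failure_prob_importance(threshold, n=20000, seed=1):
    """Estimate P(X > threshold) for X ~ N(0, 1) by drawing from the shifted
    proposal N(threshold, 1) and reweighting by the density ratio, instead of
    waiting for rare raw samples from N(0, 1)."""
    rng = random.Random(seed)

    def log_unnorm(x, mu):
        # Log of the normal density up to a constant; the constant cancels in
        # the ratio p_target / p_proposal because both variances equal 1.
        return -0.5 * (x - mu) ** 2

    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)
        if x > threshold:  # indicator that the sample lies in the failure domain
            total += math.exp(log_unnorm(x, 0.0) - log_unnorm(x, threshold))
    return total / n
```

For threshold 4 the true probability is about 3.2e-5; roughly half the proposal's samples land in the failure domain, so the estimator converges with orders of magnitude fewer samples than crude Monte Carlo. The AIS method in the paper goes further by adapting the sampling domain toward the failure region incrementally.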

  16. Difficult Decisions Made Easier

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA missions are extremely complex and prone to sudden, catastrophic failure if equipment falters or if an unforeseen event occurs. For these reasons, NASA trains to expect the unexpected. It tests its equipment and systems in extreme conditions, and it develops risk-analysis tests to foresee any possible problems. The Space Agency recently worked with an industry partner to develop reliability analysis software capable of modeling complex, highly dynamic systems, taking into account variations in input parameters and the evolution of the system over the course of a mission. The goal of this research was multifold. It included performance and risk analyses of complex, multiphase missions, like the insertion of the Mars Reconnaissance Orbiter; reliability analyses of systems with redundant and/or repairable components; optimization analyses of system configurations with respect to cost and reliability; and sensitivity analyses to identify optimal areas for uncertainty reduction or performance enhancement.

  17. Constitutive Theory Developed for Monolithic Ceramic Materials

    NASA Technical Reports Server (NTRS)

    Janosik, Lesley A.

    1998-01-01

    With the increasing use of advanced ceramic materials in high-temperature structural applications such as advanced heat engine components, the need arises to accurately predict thermomechanical behavior that is inherently time-dependent and that is hereditary in the sense that the current behavior depends not only on current conditions but also on the material's thermomechanical history. Most current analytical life prediction methods for both subcritical crack growth and creep models use elastic stress fields to predict the time-dependent reliability response of components subjected to elevated service temperatures. Inelastic response at high temperatures has been well documented in the materials science literature for these material systems, but this issue has been ignored by the engineering design community. From a design engineer's perspective, it is imperative to emphasize that accurate predictions of time-dependent reliability demand accurate stress field information. Ceramic materials exhibit different time-dependent behavior in tension and compression. Thus, inelastic deformation models for ceramics must be constructed in a fashion that admits both sensitivity to hydrostatic stress and differing behavior in tension and compression. A number of constitutive theories for materials that exhibit sensitivity to the hydrostatic component of stress have been proposed that characterize deformation using time-independent classical plasticity as a foundation. However, none of these theories allow different behavior in tension and compression. In addition, these theories are somewhat lacking in that they are unable to capture the creep, relaxation, and rate-sensitive phenomena exhibited by ceramic materials at high temperatures. The objective of this effort at the NASA Lewis Research Center has been to formulate a macroscopic continuum theory that captures these time-dependent phenomena. 
Specifically, the effort has focused on inelastic deformation behavior associated with these service conditions by developing a multiaxial viscoplastic constitutive model that accounts for time-dependent hereditary material deformation (such as creep and stress relaxation) in monolithic structural ceramics. Using continuum principles of engineering mechanics, we derived the complete viscoplastic theory from a scalar dissipative potential function.

  18. Validity, discriminative ability, and reliability of the hearing-related quality of life questionnaire for adolescents.

    PubMed

    Rachakonda, Tara; Jeffe, Donna B; Shin, Jennifer J; Mankarious, Leila; Fanning, Robert J; Lesperance, Marci M; Lieu, Judith E C

    2014-02-01

    The prevalence of hearing loss (HL) in adolescents has grown over the past decade, but hearing-related quality of life (QOL) has not been well-measured. We sought to develop a reliable, valid measure of hearing-related QOL for adolescents, the Hearing Environments And Reflection on Quality of Life (HEAR-QL). Multisite observational study. Adolescents with HL and siblings without HL were recruited from five centers. Participants completed the HEAR-QL and validated questionnaires measuring generic pediatric QOL (PedsQL), depression and anxiety (RCADS-25), and hearing-related QOL for adults (HHIA) to determine construct and discriminant validity. Participants completed the HEAR-QL 2 weeks later for test-retest reliability. We used exploratory principal components analysis to determine the HEAR-QL factor structure and measured reliability. Sensitivity and specificity of the HEAR-QL, PedsQL, HHIA, and RCADS-25 were assessed. We compared scores on all surveys between those with normal hearing, unilateral, and bilateral HL. A total of 233 adolescents (13-18 years old) participated: 179 with HL, 54 without HL. The original 45-item HEAR-QL was shortened to 28 items after determining factor structure. The resulting HEAR-QL-28 demonstrated excellent reliability (Cronbach's alpha = 0.95) and construct validity (HHIA: r = .845, PedsQL: r = .587; RCADS-25: r = .433). The HEAR-QL-28 displayed excellent discriminant validity, with higher area under the curve (0.932) than the PedsQL (0.597) or RCADS-25 (0.529). Teens with bilateral HL using hearing devices reported worse QOL on the HEAR-QL and HHIA than peers with HL not using devices. The HEAR-QL is a sensitive, reliable, and valid measure of hearing-related QOL for adolescents. Level of evidence: 2b. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  19. Validity, Discriminative Ability and Reliability of the Hearing-Related Quality of Life (HEAR-QL) Questionnaire for Adolescents

    PubMed Central

    Rachakonda, Tara; Jeffe, Donna B.; Shin, Jennifer J.; Mankarious, Leila; Fanning, Robert J.; Lesperance, Marci M.; Lieu, Judith E.C.

    2014-01-01

    Objectives The prevalence of hearing loss (HL) in adolescents has grown over the past decade, but hearing-related quality of life (QOL) has not been well-measured. We sought to develop a reliable, valid measure of hearing-related QOL for adolescents, the Hearing Environments And Reflection on Quality of Life (HEAR-QL). Study Design Multi-site observational study. Methods Adolescents with HL and siblings without HL were recruited from five centers. Participants completed the HEAR-QL and validated questionnaires measuring generic pediatric QOL (PedsQL), depression and anxiety (RCADS-25), and hearing-related QOL for adults (HHIA) to determine construct and discriminant validity. Participants completed the HEAR-QL two weeks later for test-retest reliability. We used exploratory principal components analysis to determine the HEAR-QL factor structure and measured reliability. Sensitivity and specificity of the HEAR-QL, PedsQL, HHIA and RCADS-25 were assessed. We compared scores on all surveys between those with normal hearing, unilateral and bilateral HL. Results 233 adolescents (13–18 years old) participated—179 with HL, 54 without HL. The original 45-item HEAR-QL was shortened to 28 items after determining factor structure. The resulting HEAR-QL-28 demonstrated excellent reliability (Cronbach’s alpha= 0.95) and construct validity (HHIA: r =.845, PedsQL: r =.587; RCADS-25: r =.433). The HEAR-QL-28 displayed excellent discriminant validity, with higher area under the curve (0.932) than the PedsQL (0.597) or RCADS-25 (0.529). Teens with bilateral HL using hearing devices reported worse QOL on the HEAR-QL and HHIA than peers with HL not using devices. Conclusions The HEAR-QL is a sensitive, reliable and valid measure of hearing-related QOL for adolescents. PMID:23900836

  20. Posterior stabilized versus cruciate retaining total knee arthroplasty designs: conformity affects the performance reliability of the design over the patient population.

    PubMed

    Ardestani, Marzieh M; Moazen, Mehran; Maniei, Ehsan; Jin, Zhongmin

    2015-04-01

    Commercially available fixed bearing knee prostheses are mainly divided into two groups: posterior stabilized (PS) versus cruciate retaining (CR). Despite the widespread comparative studies, the debate continues regarding the superiority of one type over the other. This study used a combined finite element (FE) simulation and principal component analysis (PCA) to evaluate "reliability" and "sensitivity" of two PS designs versus two CR designs over a patient population. Four fixed bearing implants were chosen: PFC (DePuy), PFC Sigma (DePuy), NexGen (Zimmer) and Genesis II (Smith & Nephew). Using PCA, a large probabilistic knee joint motion and loading database was generated based on the available experimental data from literature. The probabilistic knee joint data were applied to each implant in a FE simulation to calculate the potential envelopes of kinematics (i.e. anterior-posterior [AP] displacement and internal-external [IE] rotation) and contact mechanics. The performance envelopes were considered as an indicator of performance reliability. For each implant, PCA was used to highlight how much the implant performance was influenced by changes in each input parameter (sensitivity). Results showed that (1) conformity directly affected the reliability of the knee implant over a patient population such that lesser conformity designs (PS or CR), had higher kinematic variability and were more influenced by AP force and IE torque, (2) contact reliability did not differ noticeably among different designs and (3) CR or PS designs affected the relative rank of critical factors that influenced the reliability of each design. Such investigations enlighten the underlying biomechanics of various implant designs and can be utilized to estimate the potential performance of an implant design over a patient population. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
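The PCA step above generates a probabilistic family of knee-joint inputs by perturbing along principal directions of the experimental data. As a simplified two-dimensional sketch (closed-form eigendecomposition of a 2x2 covariance matrix; the motion/loading data themselves are invented, not the study's):

```python
import math
import random

def pca_2d(data):
    """Closed-form PCA for 2-D points: (mean, eigenvalues, eigenvectors)
    of the sample covariance matrix, largest eigenvalue first."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    sxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    gap = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + gap, tr / 2 - gap          # eigenvalues, l1 >= l2
    if abs(sxy) > 1e-12:
        v1 = (l1 - syy, sxy)                     # eigenvector for l1
    else:
        v1 = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(*v1)
    v1 = (v1[0] / norm, v1[1] / norm)
    v2 = (-v1[1], v1[0])                         # orthogonal second direction
    return (mx, my), (l1, l2), (v1, v2)

def sample(mean, eigvals, eigvecs, rng):
    """Draw one synthetic input: mean + sum_k z_k * sqrt(lambda_k) * v_k."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    return (mean[0] + z1 * math.sqrt(eigvals[0]) * eigvecs[0][0]
                    + z2 * math.sqrt(eigvals[1]) * eigvecs[1][0],
            mean[1] + z1 * math.sqrt(eigvals[0]) * eigvecs[0][1]
                    + z2 * math.sqrt(eigvals[1]) * eigvecs[1][1])
```

Feeding many such synthetic samples through the FE model yields the performance envelopes the study uses as its reliability indicator, and the eigenvalue magnitudes indicate which input directions dominate (sensitivity).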

  1. Six components observations of local earthquakes during the 2016 Central Italy seismic sequence

    NASA Astrophysics Data System (ADS)

    Simonelli, A.; Bernauer, F.; Chow, B.; Braun, T.; Wassermann, J. M.; Igel, H.

    2017-12-01

    For many years the seismological community has looked for a reliable, sensitive, broadband three-component portable rotational sensor. In this preliminary study, we show the possibility of measuring and extracting relevant seismological information from local earthquakes. We employ portable three-component rotational sensors, insensitive to translations, which operate on optical interferometry principles (Sagnac effect). Multiple sensors recording redundantly add significance to the measurements. During the Central Italy seismic sequence in November 2016, we deployed two portable fiber-optic gyroscopes (BlueSeis3A from iXBlue and LCG demonstrator from LITEF) and a broadband seismometer in Colfiorito, Italy. We present here the six-component observations, with analysis of rotational (three redundant components) and translational (three components) ground motions, generated by earthquakes at local distances. For each seismic event, we compare coherence between rotational sensors and estimate a back azimuth consistent with theoretical values. We also estimate Love and Rayleigh wave phase velocities in the 5 to 10 Hz frequency range.

  2. Space Station Freedom electric power system availability study

    NASA Technical Reports Server (NTRS)

    Turnquist, Scott R.

    1990-01-01

    The results of follow-on availability analyses performed on the Space Station Freedom electric power system (EPS) are detailed. The scope includes analyses of several EPS design variations: the 4-photovoltaic (PV) module baseline EPS design, a 6-PV module EPS design, and a 3-solar dynamic module EPS design that included a 10 kW PV module. The analyses performed included: determining the discrete power levels at which the EPS will operate upon various component failures and the availability of each of these operating states; ranking EPS components by the relative contribution each component type makes to the power availability of the EPS; determining the availability impacts of including structural and long-life EPS components in the availability models used in the analyses; determining optimum sparing strategies, for storing spare EPS components on orbit, to maintain high average power capability with low lift-mass requirements; and analyses to determine the sensitivity of EPS availability to uncertainties in the component reliability and maintainability data used.

  3. Similarities and differences between on-scalp and conventional in-helmet magnetoencephalography recordings

    PubMed Central

    Pfeiffer, Christoph; Ruffieux, Silvia; Jousmäki, Veikko; Hämäläinen, Matti; Schneiderman, Justin F.; Lundqvist, Daniel

    2017-01-01

    The development of new magnetic sensor technologies that promise sensitivities approaching that of conventional MEG technology while operating at far lower operating temperatures has catalysed the growing field of on-scalp MEG. The feasibility of on-scalp MEG has been demonstrated via benchmarking of new sensor technologies performing neuromagnetic recordings in close proximity to the head surface against state-of-the-art in-helmet MEG sensor technology. However, earlier work has provided little information about how these two approaches compare, or about the reliability of observed differences. Herein, we present such a comparison, based on recordings of the N20m component of the somatosensory evoked field as elicited by electric median nerve stimulation. As expected from the proximity differences between the on-scalp and in-helmet sensors, the magnitude of the N20m activation as recorded with the on-scalp sensor was higher than that of the in-helmet sensors. The dipole pattern of the on-scalp recordings was also more spatially confined than that of the conventional recordings. Our results furthermore revealed unexpected temporal differences in the peak of the N20m component. An analysis protocol was therefore developed for assessing the reliability of this observed difference. We used this protocol to examine our findings in terms of differences in sensor sensitivity between the two types of MEG recordings. The measurements and subsequent analysis raised attention to the fact that great care has to be taken in measuring the field close to the zero-line crossing of the dipolar field, since it is heavily dependent on the orientation of sensors. Taken together, our findings provide reliable evidence that on-scalp and in-helmet sensors measure neural sources in mostly similar ways. PMID:28742118

  4. Validity and Reliability of New Agility Test among Elite and Subelite under 14-Soccer Players

    PubMed Central

    Hachana, Younés; Chaabène, Helmi; Ben Rajeb, Ghada; Khlifa, Riadh; Aouadi, Ridha; Chamari, Karim; Gabbett, Tim J.

    2014-01-01

    Background Agility is a key component of soccer performance. This study aimed to evaluate the reliability and sensitivity of a “Modified Illinois change of direction test” (MICODT) in U-14 soccer players. Methods A total of 95 U-14 soccer players (mean ± SD: age: 13.61±1.04 years; body mass: 30.52±4.54 kg; height: 1.57±0.1 m) from a professional and a semi-professional soccer academy participated in this study. Sixty of them took part in the reliability analysis and thirty-two in the sensitivity analysis. Results The intraclass correlation coefficient (ICC) assessing the relative reliability of the MICODT was 0.99, and its standard error of measurement (SEM), for absolute reliability, was <5% (1.24%). The MICODT’s capacity to detect change is “good”: its SEM (0.10 s) was ≤ the smallest worthwhile change (SWC; 0.33 s). The MICODT is significantly correlated with the Illinois change of direction speed test (ICODT) (r = 0.77; p<0.0001). The ICODT’s MDC95 (0.64 s) was about twice the MICODT’s MDC95 (0.28 s), indicating that the MICODT is better able to detect true changes than the ICODT. The MICODT also provided good sensitivity, since elite U-14 soccer players performed better than non-elite ones on the MICODT (p = 0.005; dz = 1.01 [large]). This was supported by an area under the ROC curve of 0.77 (95% CI, 0.59 to 0.89, p<0.0008). The difference between the two groups on the ICODT was not statistically significant (p = 0.14; dz = 0.51 [small]), showing poor discriminant ability. Conclusion The MICODT can be considered a more suitable protocol than the ICODT for assessing agility performance level in U-14 soccer players. PMID:24752193
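The absolute-reliability indices quoted in the abstract are linked by standard formulas. A minimal sketch, with an illustrative between-subject SD of 1.0 s (not taken from the paper, though it is consistent with the reported ICC of 0.99 and SEM of 0.10 s):

```python
import math

def sem(sd: float, icc: float) -> float:
    """Standard error of measurement from between-subject SD and ICC."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem_value: float) -> float:
    """Minimal detectable change at 95% confidence."""
    return sem_value * 1.96 * math.sqrt(2.0)

s = sem(sd=1.0, icc=0.99)            # hypothetical SD of 1.0 s
print(round(s, 3), round(mdc95(s), 3))   # 0.1 0.277
```

With these inputs the SEM is 0.10 s and the MDC95 is about 0.28 s, matching the magnitudes reported for the MICODT.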

  5. Parts and Components Reliability Assessment: A Cost Effective Approach

    NASA Technical Reports Server (NTRS)

    Lee, Lydia

    2009-01-01

    System reliability assessment is a methodology that incorporates reliability analyses performed at the parts and components level, such as reliability prediction, Failure Modes and Effects Analysis (FMEA) and Fault Tree Analysis (FTA), to assess risks and perform design tradeoffs, and thereby to ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development and hard failure data are not yet available, or when manufacturers are not contractually obliged by their customers to publish reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success efficiently, at low cost, and on a tight schedule.

  6. Test-retest and between-site reliability in a multicenter fMRI study.

    PubMed

    Friedman, Lee; Stern, Hal; Brown, Gregory G; Mathalon, Daniel H; Turner, Jessica; Glover, Gary H; Gollub, Randy L; Lauriello, John; Lim, Kelvin O; Cannon, Tyrone; Greve, Douglas N; Bockholt, Henry Jeremy; Belger, Aysenil; Mueller, Bryon; Doty, Michael J; He, Jianchun; Wells, William; Smyth, Padhraic; Pieper, Steve; Kim, Seyoung; Kubicki, Marek; Vangel, Mark; Potkin, Steven G

    2008-08-01

    In the present report, estimates of test-retest and between-site reliability of fMRI assessments were produced in the context of a multicenter fMRI reliability study (FBIRN Phase 1, www.nbirn.net). Five subjects were scanned on 10 MRI scanners on two occasions. The fMRI task was a simple block design sensorimotor task. The impulse response functions to the stimulation block were derived using an FIR-deconvolution analysis with FMRISTAT. Six functionally-derived ROIs covering the visual, auditory and motor cortices, created from a prior analysis, were used. Two dependent variables were compared: percent signal change and contrast-to-noise ratio. Reliability was assessed with intraclass correlation coefficients derived from a variance components analysis. Test-retest reliability was high, but initially, between-site reliability was low, indicating a strong contribution from site and site-by-subject variance. However, a number of factors that can markedly improve between-site reliability were uncovered, including increasing the size of the ROIs, adjusting for smoothness differences, and inclusion of additional runs. By employing multiple steps, between-site reliability for 3T scanners was increased by 123%. Dropping one site at a time and assessing reliability can be a useful method of assessing the sensitivity of the results to particular sites. These findings should provide guidance to others on best practices for future multicenter studies.
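The intraclass correlations reported here come from a variance-components analysis. A minimal sketch for the simpler one-way random-effects case (the study's model also included site terms); the simulated subjects-by-sessions data are illustrative, not FBIRN measurements:

```python
import numpy as np

def icc_oneway(x):
    """One-way random-effects ICC(1) for a subjects x sessions matrix."""
    n, k = x.shape
    grand = x.mean()
    # Between-subject and within-subject mean squares.
    msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(1)
subject = rng.normal(0.0, 2.0, (20, 1))   # stable subject effect
noise = rng.normal(0.0, 0.5, (20, 2))     # session-to-session error
icc = icc_oneway(subject + noise)
print(round(icc, 2))   # true value is 4 / (4 + 0.25) ~ 0.94
```

High ICC means most variance is between subjects rather than between sessions, which is what "high test-retest reliability" quantifies.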

  7. Validity and reliability of the Spanish version of the DN4 (Douleur Neuropathique 4 questions) questionnaire for differential diagnosis of pain syndromes associated to a neuropathic or somatic component

    PubMed Central

    Perez, Concepcion; Galvez, Rafael; Huelbes, Silvia; Insausti, Joaquin; Bouhassira, Didier; Diaz, Silvia; Rejas, Javier

    2007-01-01

    Background This study assesses the validity and reliability of the Spanish version of the DN4 questionnaire as a tool for differential diagnosis of pain syndromes associated with a neuropathic (NP) or somatic component (non-neuropathic pain, NNP). Methods A study was conducted consisting of two phases: cultural adaptation into the Spanish language by means of conceptual equivalence, including forward and backward translations in duplicate and cognitive debriefing, and testing of psychometric properties in patients with NP (peripheral, central and mixed) and NNP. The analysis of psychometric properties included reliability (internal consistency, inter-rater agreement and test-retest reliability) and validity (ROC curve analysis, agreement with the reference diagnosis and determination of sensitivity, specificity, and positive and negative predictive values in different subsamples according to type of NP). Results A sample of 164 subjects (99 women, 60.4%; age: 60.4 ± 16.0 years), 94 (57.3%) with NP (36 with peripheral, 32 with central, and 26 with mixed pain) and 70 with NNP, was enrolled. The questionnaire was reliable [Cronbach's alpha coefficient: 0.71, inter-rater agreement coefficient: 0.80 (0.71–0.89), and test-retest intra-class correlation coefficient: 0.95 (0.92–0.97)] and valid for a cut-off value ≥ 4 points, which was the best value to discriminate between NP and NNP subjects. Discussion This study, representing the first validation of the DN4 questionnaire in a language other than the original, not only supported its high discriminatory value for identification of neuropathic pain, but also provided supplemental psychometric validation (i.e. test-retest reliability, influence of educational level and pain intensity) and showed its validity in mixed pain syndromes. PMID:18053212

  8. Integrated Computational Solution for Predicting Skin Sensitization Potential of Molecules

    PubMed Central

    Desai, Aarti; Singh, Vivek K.; Jere, Abhay

    2016-01-01

    Introduction Skin sensitization forms a major toxicological endpoint for dermatology and cosmetic products. Recent ban on animal testing for cosmetics demands for alternative methods. We developed an integrated computational solution (SkinSense) that offers a robust solution and addresses the limitations of existing computational tools i.e. high false positive rate and/or limited coverage. Results The key components of our solution include: QSAR models selected from a combinatorial set, similarity information and literature-derived sub-structure patterns of known skin protein reactive groups. Its prediction performance on a challenge set of molecules showed accuracy = 75.32%, CCR = 74.36%, sensitivity = 70.00% and specificity = 78.72%, which is better than several existing tools including VEGA (accuracy = 45.00% and CCR = 54.17% with ‘High’ reliability scoring), DEREK (accuracy = 72.73% and CCR = 71.44%) and TOPKAT (accuracy = 60.00% and CCR = 61.67%). Although, TIMES-SS showed higher predictive power (accuracy = 90.00% and CCR = 92.86%), the coverage was very low (only 10 out of 77 molecules were predicted reliably). Conclusions Owing to improved prediction performance and coverage, our solution can serve as a useful expert system towards Integrated Approaches to Testing and Assessment for skin sensitization. It would be invaluable to cosmetic/ dermatology industry for pre-screening their molecules, and reducing time, cost and animal testing. PMID:27271321
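The performance figures quoted throughout this abstract follow from a confusion matrix. A short sketch; the counts below are hypothetical values chosen to be consistent with the reported SkinSense percentages on the 77-molecule challenge set (assuming 30 sensitizers):

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, specificity and CCR (balanced accuracy)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + tn + fn)
    ccr = (sens + spec) / 2.0   # mean of per-class recall
    return acc, sens, spec, ccr

# Hypothetical counts: 21/30 sensitizers and 37/47 non-sensitizers correct.
acc, sens, spec, ccr = classification_metrics(tp=21, fp=10, tn=37, fn=9)
print(acc, sens, spec, ccr)
```

These counts reproduce accuracy 75.32%, sensitivity 70.00%, specificity 78.72% and CCR 74.36%; CCR matters here because the challenge set is class-imbalanced.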

  9. Shuttle cryogenic supply system optimization study

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Technical information on different cryogenic supply systems is presented for selecting representative designs. Parametric data, sensitivity studies, and an evaluation of related technology status are included. An integrated mathematical model for hardware program support was developed. The life support system, power generation, and propellant supply are considered. The major study conclusions are the following: Optimum integrated systems tend toward maximizing liquid storage. Vacuum jacketing of tanks has a major effect on integrated systems. Subcritical storage advantages over supercritical storage decrease as the quantity of propellant or reactant decreases. Shuttle duty cycles are not severe. The operational mode has a significant effect on reliability. Components are available for most subsystem applications. Subsystems and components require a minimum amount of technology development.

  10. Mechanical performance and parameter sensitivity analysis of 3D braided composites joints.

    PubMed

    Wu, Yue; Nan, Bo; Chen, Liang

    2014-01-01

    3D braided composite joints are important components in CFRP trusses, with significant influence on the reliability and light weight of structures. To investigate the mechanical performance of 3D braided composite joints, a numerical method based on microscopic mechanics is put forward; the modeling technologies, including material constants selection, element type, grid size, and boundary conditions, are discussed in detail. Second, a method for determining the ultimate bearing capacity is established, which accounts for strength failure. Finally, the effects of load parameters, geometric parameters, and process parameters on the ultimate bearing capacity of the joints are analyzed by a global sensitivity analysis method. The results show that the main pipe diameter-to-thickness ratio γ, the main pipe diameter D, and the braid angle α are the parameters to which the ultimate bearing capacity N is most sensitive.

  11. Quantitative understanding of explosive stimulus transfer

    NASA Technical Reports Server (NTRS)

    Schimmel, M. L.

    1973-01-01

    The mechanisms of detonation transfer across hermetically sealed interfaces created by necessary interruptions in high explosive trains, such as at detonators to explosive columns, field joints in explosive columns, and components of munitions fuse trains, are demonstrated. Reliability of detonation transfer is limited by minimized explosive quantities, the use of insensitive explosives for safety, and requirements to propagate across gaps and angles dictated by installation and production restraints. The major detonation transfer variables studied were: explosive quantity, sensitivity, and thickness, and the separation distance between donor and acceptor explosives.

  12. Sensitivity Study for Long Term Reliability

    NASA Technical Reports Server (NTRS)

    White, Allan L.

    2008-01-01

    This paper illustrates using Markov models to establish system and maintenance requirements for small electronic controllers where the goal is a high probability of continuous service for a long period of time. The system and maintenance items considered are quality of components, various degrees of simple redundancy, redundancy with reconfiguration, diagnostic levels, periodic maintenance, and preventive maintenance. Markov models permit a quantitative investigation with comparison and contrast. An element of special interest is the use of conditional probability to study the combination of imperfect diagnostics and periodic maintenance.
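A hedged sketch of the kind of Markov calculation the abstract describes, for one of the simplest cases it mentions (simple redundancy, no repair); the failure rate and mission time are invented, not values from the paper:

```python
import numpy as np

# Illustrative duplex controller: states (2 units good, 1 good, failed),
# assumed per-hour failure rate lam, failed state absorbing.
lam = 1e-4
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0, -lam, lam],
              [0.0, 0.0, 0.0]])   # continuous-time generator matrix

# Transient solution p(t) = p(0) exp(Q t), computed without SciPy by
# repeated squaring of a short-step transition matrix.
t, steps = 10_000.0, 1 << 20
P = np.eye(3) + Q * (t / steps)
for _ in range(20):          # P -> P^(2^20) = (I + Q t/steps)^steps
    P = P @ P
p = np.array([1.0, 0.0, 0.0]) @ P
reliability = 1.0 - p[2]     # probability of continuous service over t
print(round(reliability, 4))
```

For this no-repair case the analytic answer is 2e^(-λt) - e^(-2λt) ≈ 0.6004 at λt = 1; the Markov formulation pays off once reconfiguration, diagnostics and periodic maintenance states are added, where no closed form exists.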

  13. Investigation of Axial Electric Field Measurements with Grounded-Wire TEM Surveys

    NASA Astrophysics Data System (ADS)

    Zhou, Nan-nan; Xue, Guo-qiang; Li, Hai; Hou, Dong-yang

    2018-01-01

    Grounded-wire transient electromagnetic (TEM) surveying is often performed along the equatorial direction, with the observation lines parallel to the transmitting wire at a certain transmitter-receiver distance. However, this method takes into account only the equatorial component of the electromagnetic field, and little effort has been made to incorporate the other major component, along the transmitting wire, here denoted the axial field. To obtain a comprehensive understanding of its fundamental characteristics and to guide the design of a corresponding observation system for reliable anomaly detection, this study for the first time investigates the axial electric field from three crucial aspects (its decay curve, plane distribution, and anomaly sensitivity) through both synthetic modeling and a real application to a major coal field in China. The results demonstrate a higher sensitivity of the axial electric field to both high- and low-resistivity anomalies and confirm its great potential for robust anomaly detection in the subsurface.

  14. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  15. Functional connectivity analysis of the neural bases of emotion regulation: A comparison of independent component method with density-based k-means clustering method.

    PubMed

    Zou, Ling; Guo, Qian; Xu, Yi; Yang, Biao; Jiao, Zhuqing; Xiang, Jianbo

    2016-04-29

    Functional magnetic resonance imaging (fMRI) is an important tool in neuroscience for assessing connectivity and interactions between distant areas of the brain. To find and characterize the coherent patterns of brain activity underlying the cognitive reappraisal of emotion task, both density-based k-means clustering and independent component analysis (ICA) can be applied to characterize the interactions between the brain regions involved. Our results reveal that, compared with the ICA method, the density-based k-means clustering method provides higher clustering sensitivity. In addition, it is more sensitive to relatively weak functional connections. The study thus concludes that, during the reception of emotional stimuli, the most clearly activated areas are mainly distributed in the frontal lobe, cingulum and near the hypothalamus. Furthermore, the density-based k-means clustering method provides a more reliable basis for follow-up studies of brain functional connectivity.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornemann, Andrea, E-mail: andrea.hornemann@ptb.de; Hoehl, Arne, E-mail: arne.hoehl@ptb.de; Ulm, Gerhard, E-mail: gerhard.ulm@ptb.de

    Bio-diagnostic assays of high complexity rely on nanoscale assay recognition elements that can provide unique selectivity and design-enhanced sensitivity features. High-throughput performance requires the simultaneous detection of various analytes combined with appropriate bioassay components. Nanoparticle-induced sensitivity enhancement, and subsequently multiplex-capable Surface-Enhanced InfraRed Absorption (SEIRA) assay formats, fit these purposes well. SEIRA constitutes an ideal platform to isolate the vibrational signatures of targeted bioassay and active molecules. The potential of several targeted biolabels, here fluorophore-labeled antibody conjugates, chemisorbed onto low-cost biocompatible gold nano-aggregate substrates, has been explored for use in assay platforms. Dried films were analyzed by synchrotron-radiation-based FTIR/SEIRA spectro-microscopy, and the resulting complex hyperspectral datasets were submitted to automated statistical analysis, namely Principal Components Analysis (PCA). The relationships between molecular fingerprints were brought out to highlight their spectral discrimination capabilities. We demonstrate that robust spectral encoding via SEIRA fingerprints opens up new opportunities for fast, reliable and multiplexed high-end screening, not only in biodiagnostics but also in vitro biochemical imaging.

  17. Sensitivity-Informed De Novo Programming for Many-Objective Water Portfolio Planning Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Kasprzyk, J. R.; Reed, P. M.; Kirsch, B. R.; Characklis, G. W.

    2009-12-01

    Risk-based water supply management presents severe cognitive, computational, and social challenges to planning in a changing world. Decision aiding frameworks must confront the cognitive biases implicit to risk, the severe uncertainties associated with long term planning horizons, and the consequent ambiguities that shape how we define and solve water resources planning and management problems. This paper proposes and demonstrates a new interactive framework for sensitivity-informed de novo programming. The theoretical focus of our many-objective de novo programming is to promote learning and evolving problem formulations to enhance risk-based decision making. We have demonstrated our proposed de novo programming framework using a case study for a single city’s water supply in the Lower Rio Grande Valley (LRGV) in Texas. Key decisions in this case study include the purchase of permanent rights to reservoir inflows and anticipatory thresholds for acquiring transfers of water through optioning and spot leases. A 10-year Monte Carlo simulation driven by historical data is used to provide performance metrics for the supply portfolios. The three major components of our methodology are Sobol global sensitivity analysis, many-objective evolutionary optimization and interactive tradeoff visualization. The interplay between these components allows us to evaluate alternative design metrics, their decision variable controls and the consequent system vulnerabilities. Our LRGV case study measures water supply portfolios’ efficiency, reliability, and utilization of transfers in the water supply market. The sensitivity analysis is used interactively over interannual, annual, and monthly time scales to indicate how the problem controls change as a function of the timescale of interest. These results have then been used to improve our exploration and understanding of LRGV costs, vulnerabilities, and the water portfolios’ critical reliability constraints.
These results demonstrate how we can adaptively improve the value and robustness of our problem formulations by evolving our definition of optimality to discover key tradeoffs.
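A hedged sketch of first-order Sobol global sensitivity analysis on a toy model; the model below is an invented stand-in for the LRGV portfolio simulator, and the pick-freeze estimator shown is one of several standard estimators, not necessarily the one the authors used:

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x):
    # Hypothetical stand-in for the water-portfolio simulator.
    return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(-1.0, 1.0, (n, d))
B = rng.uniform(-1.0, 1.0, (n, d))
yA = model(A)
var = yA.var()

# Pick-freeze estimate of first-order Sobol indices: freeze input i at the
# values from sample A, redraw all other inputs from sample B.
S = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    S.append(np.cov(yA, model(ABi))[0, 1] / var)
print([round(s, 2) for s in S])   # input 0 dominates in this toy model
```

Each index S_i estimates the fraction of output variance attributable to input i alone, which is what lets the framework rank problem controls by time scale.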

  18. Reliability Quantification of the Flexure: A Critical Stirling Convertor Component

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward J.

    2004-01-01

    Uncertainties in manufacturing, the fabrication process, material behavior, loads, and boundary conditions result in variation of the stresses and strains induced in the flexures and of their fatigue life. Past experience and test data at the material coupon level revealed a significant amount of scatter in fatigue life. Owing to these facts, designing the flexure using conventional approaches, based on safety factors or traditional reliability considerations for similar equipment, does not provide a direct measure of reliability. Additionally, it may not be feasible to run actual long-term fatigue tests due to cost and time constraints, so it is difficult to ascertain the material fatigue strength limit. The objective of this paper is to present a methodology, and quantified results of numerical simulation, for the reliability of the flexures used in the Stirling convertor with respect to their structural performance. The proposed approach is based on the finite element analysis method in combination with a random fatigue limit model, which includes uncertainties in material fatigue life. Additionally, the sensitivity of fatigue life reliability to the design variables is quantified, and its use in developing guidelines to improve the design, manufacturing, quality control and inspection processes is described.

  19. Assessment of the reliability of protein-protein interactions and protein function prediction.

    PubMed

    Deng, Minghua; Sun, Fengzhu; Chen, Ting

    2003-01-01

    As more and more high-throughput protein-protein interaction data are collected, the task of estimating the reliability of different data sets becomes increasingly important. In this paper, we present our study of two groups of protein-protein interaction data, the physical interaction data and the protein complex data, and estimate the reliability of these data sets using three different measurements: (1) the distribution of gene expression correlation coefficients, (2) the reliability based on gene expression correlation coefficients, and (3) the accuracy of protein function predictions. We develop a maximum likelihood method to estimate the reliability of protein interaction data sets according to the distribution of correlation coefficients of gene expression profiles of putative interacting protein pairs. The results of the three measurements are consistent with each other. The MIPS protein complex data have the highest mean gene expression correlation coefficients (0.256) and the highest accuracy in predicting protein functions (70% sensitivity and specificity), while Ito's Yeast two-hybrid data have the lowest mean (0.041) and the lowest accuracy (15% sensitivity and specificity). Uetz's data are more reliable than Ito's data in all three measurements, and the TAP protein complex data are more reliable than the HMS-PCI data in all three measurements as well. The complex data sets generally perform better in function predictions than do the physical interaction data sets. Proteins in complexes are shown to be more highly correlated in gene expression. The results confirm that the components of a protein complex can be assigned to functions that the complex carries out within a cell. There are three interaction data sets different from the above two groups: the genetic interaction data, the in-silico data and the syn-express data. 
Their capability of predicting protein functions generally falls between that of the Y2H data and that of the MIPS protein complex data. The supplementary information is available at the following Web site: http://www-hto.usc.edu/-msms/AssessInteraction/.
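A hedged, simplified sketch of the maximum-likelihood idea described above: treat the observed expression correlations of putative interacting pairs as a two-component mixture of "true" and "random" pairs, and estimate the true fraction (the data set's reliability) by EM. The component parameters and data are invented, and fixing them is a simplification of the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: correlations for a putative interaction set that is a mix of
# 40% true pairs (higher expression correlation) and 60% random pairs.
r = np.concatenate([rng.normal(0.4, 0.15, 400), rng.normal(0.0, 0.15, 600)])

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# EM for the mixture weight alpha, with component means/SDs assumed known.
alpha = 0.5
for _ in range(200):
    p_true = alpha * norm_pdf(r, 0.4, 0.15)
    p_rand = (1 - alpha) * norm_pdf(r, 0.0, 0.15)
    w = p_true / (p_true + p_rand)   # E-step: posterior of being a true pair
    alpha = w.mean()                 # M-step: update the mixture weight
print(round(alpha, 2))               # should recover roughly 0.4
```

The recovered alpha plays the role of the reliability estimate: a high-quality data set like the MIPS complexes would yield a large alpha, a noisy one like Ito's Y2H data a small alpha.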

  20. The Irvine, Beatties, and Bresnahan (IBB) Forelimb Recovery Scale: An Assessment of Reliability and Validity

    PubMed Central

    Irvine, Karen-Amanda; Ferguson, Adam R.; Mitchell, Kathleen D.; Beattie, Stephanie B.; Lin, Amity; Stuck, Ellen D.; Huie, J. Russell; Nielson, Jessica L.; Talbott, Jason F.; Inoue, Tomoo; Beattie, Michael S.; Bresnahan, Jacqueline C.

    2014-01-01

    The IBB scale is a recently developed forelimb scale for the assessment of fine control of the forelimb and digits after cervical spinal cord injury [SCI; (1)]. The present paper describes the assessment of inter-rater reliability and face, concurrent and construct validity of this scale following SCI. It demonstrates that the IBB is a reliable and valid scale that is sensitive to severity of SCI and to recovery over time. In addition, the IBB correlates with other outcome measures and is highly predictive of biological measures of tissue pathology. Multivariate analysis using principal component analysis (PCA) demonstrates that the IBB is highly predictive of the syndromic outcome after SCI (2), and is among the best predictors of bio-behavioral function, based on strong construct validity. Altogether, the data suggest that the IBB, especially in concert with other measures, is a reliable and valid tool for assessing neurological deficits in fine motor control of the distal forelimb, and represents a powerful addition to multivariate outcome batteries aimed at documenting recovery of function after cervical SCI in rats. PMID:25071704

  1. A diameter-sensitive flow entropy method for reliability consideration in water distribution system design

    NASA Astrophysics Data System (ADS)

    Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin

    2014-07-01

    Flow entropy is a measure of the uniformity of pipe flows in water distribution systems (WDSs). By maximizing flow entropy one can identify reliable layouts or connectivity in networks. To overcome the disadvantage that the common definition of flow entropy does not consider the impact of pipe diameter on reliability, an extended definition, termed diameter-sensitive flow entropy, is proposed. The new methodology is then assessed against other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (resilience index) integrated with water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of two benchmark networks. Correlation analysis is used to evaluate quantitatively the relationship between entropy and reliability, and a comparative analysis between the simple flow entropy and the new method is conducted. The results demonstrate that the diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than simple flow entropy. Therefore, the new flow entropy method can be taken as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem for WDSs. Sensitivity analysis results show that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
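The basic (diameter-insensitive) flow entropy the abstract builds on is a Shannon-type entropy of flow fractions; a minimal sketch on invented flows (the paper's diameter-sensitive variant additionally weights terms by pipe diameter, which is not reproduced here):

```python
import math

def flow_entropy(flows):
    """Shannon-type entropy of pipe flows: higher means more uniform."""
    total = sum(flows)
    return -sum((q / total) * math.log(q / total) for q in flows if q > 0)

uniform = flow_entropy([10, 10, 10, 10])   # equal flows: entropy = ln(4)
skewed = flow_entropy([37, 1, 1, 1])       # one dominant pipe: low entropy
print(round(uniform, 3), round(skewed, 3))
```

The uniform network scores ln(4) ≈ 1.386 and the skewed one about 0.349; higher entropy flags layouts whose demand can reroute more evenly after a pipe failure, which is why entropy serves as a reliability surrogate.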

  2. Feasibility of determining flat roof heat losses using aerial thermography

    NASA Technical Reports Server (NTRS)

    Bowman, R. L.; Jack, J. R.

    1979-01-01

    The utility of aerial thermography for determining rooftop heat losses was investigated experimentally using several completely instrumented test roofs with known thermal resistances. Actual rooftop heat losses were obtained both from in-situ instrumentation and from aerial thermography acquired during overflights at an altitude of 305 m. In general, the remotely determined roof surface temperatures agreed very well with those obtained from ground measurements. The roof heat losses calculated using the remotely determined roof temperature agreed to within 17% of those calculated from (1/R)ΔT using ground measurements. However, this agreement may be fortuitous, since the convective component of the heat loss is sensitive to small changes in roof temperature and to the average heat transfer coefficient used, whereas the radiative component is less sensitive. Thus, at this time, it is felt that an acceptable quantitative determination of roof heat losses using aerial thermography is only feasible when the convective term is accurately known or minimized. The sensitivity of the heat loss determination to environmental conditions was also evaluated. The analysis showed that the most reliable quantitative heat loss determinations can probably be obtained from aerial thermography taken under conditions of total cloud cover with low wind speeds and at low ambient temperatures.
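
The sensitivity argument above can be illustrated with a generic textbook surface energy balance, splitting the per-unit-area loss into convective and radiative parts. The temperatures and heat transfer coefficient below are illustrative assumptions, not values from the study:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def roof_heat_loss(t_roof, t_air, t_sky, h_conv, emissivity=0.9):
    # Convective part: proportional to the (often small) roof-air
    # temperature difference, hence very sensitive to errors in t_roof
    # and to the assumed heat transfer coefficient h_conv.
    q_conv = h_conv * (t_roof - t_air)
    # Radiative part: driven by the much larger roof-sky T^4 difference,
    # hence relatively insensitive to a small error in t_roof.
    q_rad = emissivity * SIGMA * (t_roof**4 - t_sky**4)
    return q_conv, q_rad

# Illustrative clear-night values (K): a 1 K error in t_roof shifts
# q_conv by 50% here, but q_rad by only a few percent.
q_conv, q_rad = roof_heat_loss(285.0, 283.0, 260.0, h_conv=10.0)
```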

  3. A new voice rating tool for clinical practice.

    PubMed

    Gould, James; Waugh, Jessica; Carding, Paul; Drinnan, Michael

    2012-07-01

    Perceptual rating of voice quality is a key component in the comprehensive assessment of voice, but there are practical difficulties in making reliable measurements. We have developed the Newcastle Audio Ranking (NeAR) test, a new referential system for the rating of voice parameters. In this article, we present our first results using NeAR. We asked five expert and 11 naive raters to assess 15 male and 15 female voices using the NeAR test. We assessed: validity with respect to the GRBAS scale; interrater reliability; sensitivity to subtle voice differences; and the performance of expert versus naive raters. There was uniformly excellent agreement with GRBAS (r=0.87) and excellent interrater agreement (intraclass correlation coefficient=0.86). Considering each GRBAS grade of voice separately, there was still good interrater agreement in NeAR, implying good sensitivity to subtle changes. All these results held equally for expert and naive raters. The NeAR test is a promising new tool in the assessment of voice disorders. Copyright © 2012 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  4. On the Nature of Clitics and Their Sensitivity to Number Attraction Effects

    PubMed Central

    Santesteban, Mikel; Zawiszewski, Adam; Erdocia, Kepa; Laka, Itziar

    2017-01-01

    Pronominal dependencies have been shown to be more resilient to attraction effects than subject-verb agreement. We use this phenomenon to investigate whether antecedent-clitic dependencies in Spanish are computed like agreement or like pronominal dependencies. In Experiment 1, an acceptability judgment self-paced reading task was used. Accuracy data yielded reliable attraction effects in both grammatical and ungrammatical sentences, only in singular (but not plural) clitics. Reading times did not show reliable attraction effects. In Experiment 2, we measured electrophysiological responses to violations, which elicited a biphasic frontal negativity-P600 pattern. Number attraction modulated the frontal negativity but not the amplitude of the P600 component. This differs from ERP findings on subject-verb agreement, since when the baseline matching condition obtained a biphasic pattern, attraction effects only modulated the P600, not the preceding negativity. We argue that these findings support cue-retrieval accounts of dependency resolution and further suggest that the sensitivity to attraction effects shown by clitics resembles more the computation of pronominal dependencies than that of agreement. PMID:28928686

  5. Low cost split stirling cryogenic cooler for aerospace applications

    NASA Astrophysics Data System (ADS)

    Veprik, Alexander; Zechtzer, Semeon; Pundak, Nachman; Riabzev, Sergey; Kirckconnel, C.; Freeman, Jeremy

    2012-06-01

    Cryogenic coolers are used in association with sensitive electronics and sensors for military, commercial or scientific space payloads. The general requirements are high reliability and power efficiency, low vibration export, and the ability to survive launch vibration extremes and long-term exposure to space radiation. The long-standing paradigm of relying exclusively on space-heritage derivatives of the legendary "Oxford" cryocoolers, featuring linear actuators, flexural bearings, contactless seals and active vibration cancellation, remains the best-known practice for delivering high-reliability components for critical and usually expensive space missions. The recent tendency to develop mini and micro satellites for budget-constrained missions has spurred attempts to adapt leading-edge tactical cryogenic coolers to meet space requirements. The authors disclose theoretical and practical aspects of a collaborative effort to develop a space-qualified cryogenic refrigerator based on the Ricor model K527 tactical cooler and Iris Technology radiation-hardened, low-cost cryocooler electronics. The initially targeted applications are cost-sensitive flight experiments, but should the results show promise, some long-life "traditional" cryocooler missions may well be satisfied by this approach.

  7. The pointillism method for creating stimuli suitable for use in computer-based visual contrast sensitivity testing.

    PubMed

    Turner, Travis H

    2005-03-30

    An increasingly large corpus of clinical and experimental neuropsychological research has demonstrated the utility of measuring visual contrast sensitivity. Unfortunately, existing means of measuring contrast sensitivity can be prohibitively expensive, difficult to standardize, or lack reliability. Additionally, most existing tests do not allow full control over important characteristics, such as off-angle rotations, waveform, contrast, and spatial frequency. Ideally, researchers could manipulate characteristics and display stimuli in a computerized task designed to meet experimental needs. Thus far, the 256-level (8-bit) color limitation of standard cathode ray tube (CRT) monitors has been preclusive. To this end, the pointillism method (PM) was developed. Using MATLAB software, stimuli are created based on both mathematical and stochastic components, such that differences in regional luminance values of the gradient field closely approximate the desired contrast. This paper describes the method and examines its performance in sine and square-wave image sets across a range of contrast values. Results suggest the utility of the method for most experimental applications. Weaknesses in the current version, the need for validation and reliability studies, and considerations regarding applications are discussed. Syntax for the program is provided in an appendix, and a version of the program independent of MATLAB is available from the author.
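
The obstacle the method works around, rendering a low-contrast sine-wave grating through an 8-bit (256-level) display, can be sketched with stochastic rounding: each pixel is dithered so that its average quantized value matches the target luminance. This is a simplified stand-in for the published pointillism algorithm, not the algorithm itself:

```python
import math
import random

def grating_row(width, cycles, contrast, mean_lum=0.5):
    # One row of a sine-wave grating with the given Michelson contrast,
    # quantized to 8 bits with probabilistic (dither) rounding so that
    # sub-level luminance differences survive on average.
    row = []
    for x in range(width):
        lum = mean_lum * (1 + contrast * math.sin(2 * math.pi * cycles * x / width))
        level = lum * 255
        base = int(level)
        # Round up with probability equal to the fractional part.
        row.append(base + (1 if random.random() < level - base else 0))
    return row
```

Averaging over many dithered pixels recovers contrast steps finer than 1/255, which is the effect the stochastic component of the method exploits.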

  8. Improved estimation of subject-level functional connectivity using full and partial correlation with empirical Bayes shrinkage.

    PubMed

    Mejia, Amanda F; Nebel, Mary Beth; Barber, Anita D; Choe, Ann S; Pekar, James J; Caffo, Brian S; Lindquist, Martin A

    2018-05-15

    Reliability of subject-level resting-state functional connectivity (FC) is determined in part by the statistical techniques employed in its estimation. Methods that pool information across subjects to inform estimation of subject-level effects (e.g., Bayesian approaches) have been shown to enhance reliability of subject-level FC. However, fully Bayesian approaches are computationally demanding, while empirical Bayesian approaches typically rely on using repeated measures to estimate the variance components in the model. Here, we avoid the need for repeated measures by proposing a novel measurement error model for FC describing the different sources of variance and error, which we use to perform empirical Bayes shrinkage of subject-level FC towards the group average. In addition, since the traditional intra-class correlation coefficient (ICC) is inappropriate for biased estimates, we propose a new reliability measure denoted the mean squared error intra-class correlation coefficient (ICC_MSE) to properly assess the reliability of the resulting (biased) estimates. We apply the proposed techniques to test-retest resting-state fMRI data on 461 subjects from the Human Connectome Project to estimate connectivity between 100 regions identified through independent components analysis (ICA). We consider both correlation and partial correlation as the measure of FC and assess the benefit of shrinkage for each measure, as well as the effects of scan duration. We find that shrinkage estimates of subject-level FC exhibit substantially greater reliability than traditional estimates across various scan durations, even for the most reliable connections and regardless of connectivity measure. Additionally, we find partial correlation reliability to be highly sensitive to the choice of penalty term, and to be generally worse than that of full correlations except for certain connections and a narrow range of penalty values.
This suggests that the penalty needs to be chosen carefully when using partial correlations. Copyright © 2018. Published by Elsevier Inc.
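
The core shrinkage step, pulling each noisy subject-level estimate toward the group mean by an amount governed by the error-to-total variance ratio, can be sketched as a one-parameter empirical Bayes estimator. This is a didactic simplification assuming a single known error variance, not the paper's measurement error model or its ICC_MSE:

```python
def shrink_toward_group(estimates, error_var):
    # Empirical-Bayes-style linear shrinkage: lam -> 1 when noise
    # dominates (estimates collapse to the group mean), lam -> 0 when
    # true between-subject variance dominates (estimates kept as-is).
    n = len(estimates)
    group_mean = sum(estimates) / n
    total_var = sum((x - group_mean) ** 2 for x in estimates) / (n - 1)
    between_var = max(total_var - error_var, 0.0)
    lam = error_var / (error_var + between_var) if error_var + between_var > 0 else 0.0
    return [lam * group_mean + (1 - lam) * x for x in estimates]
```

Shrunk values are biased toward the group mean but have lower variance, which is why a reliability measure that tolerates bias is needed.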

  9. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
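
The stochastic-strength part of the methodology rests on the weakest-link two-parameter Weibull model. For a component discretized into volume elements with roughly uniform stress in each, the fast-fracture survival probability takes the form below; the discretization and the unit-volume normalization of sigma0 are assumptions for illustration, not the CARES/Life implementation:

```python
import math

def weibull_survival(element_stresses, element_volumes, sigma0, m):
    # Weakest-link fast-fracture model for volume-distributed flaws:
    # R = exp(-sum_i V_i * (sigma_i / sigma0)^m), with Weibull modulus m
    # and unit-volume characteristic strength sigma0.
    risk = sum(v * (s / sigma0) ** m
               for s, v in zip(element_stresses, element_volumes))
    return math.exp(-risk)
```

Higher stresses or a larger stressed volume raise the risk sum and lower the survival probability, reproducing the size effect characteristic of ceramics.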

  10. A particle swarm model for estimating reliability and scheduling system maintenance

    NASA Astrophysics Data System (ADS)

    Puzis, Rami; Shirtz, Dov; Elovici, Yuval

    2016-05-01

    Modifying data and information system components may introduce new errors and deteriorate the reliability of the system. Reliability can be efficiently regained with reliability centred maintenance, which requires reliability estimation for maintenance scheduling. A variant of the particle swarm model is used to estimate the reliability of systems implemented according to the model-view-controller paradigm. Simulations based on data collected from an online system of a large financial institute are used to compare three component-level maintenance policies. Results show that appropriately scheduled component-level maintenance greatly reduces the cost of upholding an acceptable level of reliability by reducing the need for system-wide maintenance.

  11. Is computed tomography an accurate and reliable method for measuring total knee arthroplasty component rotation?

    PubMed

    Figueroa, José; Guarachi, Juan Pablo; Matas, José; Arnander, Magnus; Orrego, Mario

    2016-04-01

    Computed tomography (CT) is widely used to assess component rotation in patients with poor results after total knee arthroplasty (TKA). The purpose of this study was to simultaneously determine the accuracy and reliability of CT in measuring TKA component rotation. TKA components were implanted in dry-bone models and assigned to two groups. The first group (n = 7) had variable femoral component rotations, and the second group (n = 6) had variable tibial tray rotations. CT images were then used to assess component rotation. Accuracy of CT rotational assessment was determined by mean difference, in degrees, between implanted component rotation and CT-measured rotation. Intraclass correlation coefficient (ICC) was applied to determine intra-observer and inter-observer reliability. Femoral component accuracy showed a mean difference of 2.5° and the tibial tray a mean difference of 3.2°. There was good intra- and inter-observer reliability for both components, with a femoral ICC of 0.8 and 0.76, and tibial ICC of 0.68 and 0.65, respectively. CT rotational assessment accuracy can differ from true component rotation by approximately 3° for each component. It does, however, have good inter- and intra-observer reliability.
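
The inter-observer figures quoted above are intraclass correlation coefficients. As an illustration of how such a statistic is computed, here is the one-way random-effects ICC(1,1); the study's exact ICC variant is not specified in the abstract, so treat this as a generic example:

```python
def icc_oneway(ratings):
    # One-way random-effects ICC(1,1) for a subjects-by-raters table:
    # (MSB - MSW) / (MSB + (k - 1) * MSW).
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    subject_means = [sum(row) / k for row in ratings]
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, subject_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfect rater agreement drives the within-subject mean square to zero and the ICC to 1; rater disagreement inflates MSW and pulls the ICC down.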

  12. Feasibility Study on Cutting HTPB Propellants with Abrasive Water Jet

    NASA Astrophysics Data System (ADS)

    Jiang, Dayong; Bai, Yun

    2018-01-01

    Abrasive water jet cutting was used in experimental research on three-component HTPB propellants, providing technical support for the engineering treatment of waste rocket motors. Based on reliability theory and related scientific research results, the safety and efficiency of cutting sensitive HTPB propellants with an abrasive water jet were experimentally studied. The results show that the safety reliability is not less than 99.52% at the 90% confidence level, so safety is adequately ensured. The cooling and anti-friction effect of the high-speed water jet is the decisive factor in suppressing detonation of the HTPB propellant. Compared with a pure water jet, cutting efficiency was increased by 5% - 87%. The study shows that abrasive water jets are practical for cutting HTPB propellants.
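
A claim of the form "reliability not less than 99.52% at 90% confidence" typically comes from the zero-failure (success-run) binomial bound R ≥ (1 − C)^(1/n) after n failure-free trials. A sketch; the trial count below is back-calculated for illustration and is not stated in the abstract:

```python
def lower_reliability_bound(n_successes, confidence):
    # Success-run theorem: after n failure-free trials, the reliability
    # exceeds (1 - C)^(1/n) with confidence C.
    return (1.0 - confidence) ** (1.0 / n_successes)

# About 478 failure-free trials would support ~99.52% at 90% confidence.
bound = lower_reliability_bound(478, 0.90)
```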

  13. Developing Reliable Life Support for Mars

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2017-01-01

    A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, directly supply water or oxygen, or, if necessary, bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human-tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty of achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher-reliability systems. 
    The limitations of budget, schedule, and technology may mean accepting lower and less certain expected reliability. A plan to develop reliable life support is needed to achieve the best possible reliability.
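
The spares argument can be made concrete with the standard Poisson sparing model: stock the smallest number of spares s whose cumulative probability of covering all replacements over the mission meets the goal. A sketch, not the author's model; the failure rates below are illustrative:

```python
import math

def spares_needed(failure_rate, mission_time, goal):
    # Smallest s with P(Poisson(failure_rate * mission_time) <= s) >= goal,
    # computed with the stable recurrence p_k = p_{k-1} * lt / k.
    lt = failure_rate * mission_time
    s = 0
    prob = math.exp(-lt)
    cum = prob
    while cum < goal:
        s += 1
        prob *= lt / s
        cum += prob
    return s

# Underestimating the failure rate (1 vs. 2 expected failures per mission)
# understocks spares, which is the risk noted above.
spares_low = spares_needed(1.0, 1.0, 0.99)
spares_high = spares_needed(2.0, 1.0, 0.99)
```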

  14. Sensitivity to Mental Effort and Test-Retest Reliability of Heart Rate Variability Measures in Healthy Seniors

    PubMed Central

    Mukherjee, Shalini; Yadav, Rajeev; Yung, Iris; Zajdel, Daniel P.; Oken, Barry S.

    2011-01-01

    Objectives To determine 1) whether heart rate variability (HRV) is a sensitive and reliable measure in mental effort tasks carried out by healthy seniors and 2) whether non-linear approaches to HRV analysis, in addition to traditional time and frequency domain approaches, are useful for studying such effects. Methods Forty healthy seniors performed two visual working memory tasks requiring different levels of mental effort while ECG was recorded. They underwent the same tasks and recordings two weeks later. Traditional indices and 13 non-linear indices of HRV, including Poincaré, entropy and detrended fluctuation analysis (DFA), were determined. Results Time domain (especially mean R-R interval/RRI), frequency domain and, among the nonlinear parameters, Poincaré and DFA were the most reliable indices. Mean RRI, time domain and Poincaré were also the most sensitive to the different mental effort task loads and had the largest effect size. Conclusions Overall, linear measures were the most sensitive and reliable indices of mental effort. Among non-linear measures, Poincaré was the most reliable and sensitive, suggesting possible usefulness as an independent marker in cognitive function tasks in healthy seniors. Significance A large number of HRV parameters were both reliable and sensitive indices of mental effort, although the simple linear methods were the most sensitive. PMID:21459665
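
Among the nonlinear indices, the Poincaré descriptors SD1 and SD2 reduce to simple variance identities on the R-R series, which helps explain their reliability: they are close cousins of the linear time-domain measures. A minimal sketch with made-up R-R intervals (ms), not study data:

```python
import math

def poincare_sd1_sd2(rr):
    # SD1 (short-term variability) and SD2 (long-term variability) of the
    # Poincare plot, via SD1^2 = var(successive differences) / 2 and
    # SD2^2 = 2 * var(RR) - SD1^2.
    n = len(rr)
    mean = sum(rr) / n
    var = sum((x - mean) ** 2 for x in rr) / (n - 1)
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    dmean = sum(diffs) / len(diffs)
    dvar = sum((d - dmean) ** 2 for d in diffs) / (len(diffs) - 1)
    sd1 = math.sqrt(dvar / 2)
    sd2 = math.sqrt(max(2 * var - dvar / 2, 0.0))
    return sd1, sd2
```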

  15. A sensitive UPLC-MS/MS method for simultaneous determination of eleven bioactive components of Tong-Xie-Yao-Fang decoction in rat biological matrices.

    PubMed

    Li, Tian-xue; Hu, Lang; Zhang, Meng-meng; Sun, Jian; Qiu, Yue; Rui, Jun-qian; Yang, Xing-hao

    2014-01-01

    There is growing interest in the sensitive quantification of multiple components in herbal medicines (HMs) using advanced data acquisition methods. An improved and rugged UPLC-MS/MS method has been developed and validated for sensitive and rapid determination of multiple analytes from Tong-Xie-Yao-Fang (TXYF) decoction in three biological matrices (plasma/brain tissue/urine), using geniposide and formononetin as internal standards. After solid-phase extraction, chromatographic separation was performed on a C18 column using gradient elution. Quantifier and qualifier transitions were monitored using novel Triggered Dynamic multiple reaction monitoring (TdMRM) in the positive ionization mode. A significant improvement in peak symmetry and sensitivity was achieved in the TdMRM mode as compared to conventional MRM. The reproducibility (RSD%) was ≤7.9% applying TdMRM transitions, while the values were 6.8-20.6% for MRM. Excellent linear calibration curves were obtained under TdMRM transitions over the tested concentration ranges. Intra- and inter-day precisions (RSD%) were ≤14.2% and accuracies (RE%) ranged from -9.6% to 10.6%. The validation data for specificity, carryover, recovery, matrix effect and stability were within the required limits. The method was effectively applied to simultaneously detect and quantify 1 lactone, 2 monoterpene glucosides, 1 alkaloid, 5 flavonoids and 2 chromones in plasma, brain tissue and urine after oral administration of TXYF decoction. In conclusion, this new and reliable method is beneficial for quantification and confirmation assays of multiple components in complex biological samples. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Goal setting as an outcome measure: A systematic review.

    PubMed

    Hurn, Jane; Kneebone, Ian; Cropley, Mark

    2006-09-01

    Goal achievement has been considered an important measure of outcome by clinicians working with patients in physical and neurological rehabilitation settings. This systematic review was undertaken to examine the reliability, validity and sensitivity of goal setting and goal attainment scaling approaches when used as outcome measures with working age and older people in physical and neurological rehabilitation environments, by examining the research literature covering the 36 years since goal-setting theory was proposed. Data sources included a computer-aided literature search of published studies examining the reliability, validity and sensitivity of goal setting/goal attainment scaling, with further references sourced from articles obtained through this process. There is strong evidence for the reliability, validity and sensitivity of goal attainment scaling. Empirical support was found for the validity of goal setting, but research demonstrating its reliability and sensitivity is limited. Goal attainment scaling appears to be a sound measure for use in physical rehabilitation settings with working age and older people. Further work needs to be carried out with goal setting to establish its reliability and sensitivity as a measurement tool.
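
Goal attainment scaling is conventionally summarized with the Kiresuk-Sherman T-score, which standardizes the weighted goal attainments (each scored −2 to +2) to a reference mean of 50. A sketch using the customary default inter-goal correlation rho = 0.3:

```python
import math

def gas_t_score(attainment, weights, rho=0.3):
    # Kiresuk-Sherman aggregation:
    # T = 50 + 10 * sum(w_i x_i) / sqrt((1 - rho) * sum(w_i^2)
    #                                   + rho * (sum w_i)^2)
    num = 10 * sum(w * x for w, x in zip(weights, attainment))
    den = math.sqrt((1 - rho) * sum(w * w for w in weights)
                    + rho * sum(weights) ** 2)
    return 50 + num / den

# All goals attained exactly as expected (x = 0) scores the reference 50.
baseline = gas_t_score([0, 0, 0], [1, 1, 1])
```

Scores above 50 indicate attainment better than expected, below 50 worse; this standardization is part of what makes the approach usable as an outcome measure across patients with different goals.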

  17. Prediction during language comprehension: benefits, costs, and ERP components.

    PubMed

    Van Petten, Cyma; Luka, Barbara J

    2012-02-01

    Because context has a robust influence on the processing of subsequent words, the idea that readers and listeners predict upcoming words has attracted research attention, but prediction has fallen in and out of favor as a likely factor in normal comprehension. We note that the common sense of this word includes both benefits for confirmed predictions and costs for disconfirmed predictions. The N400 component of the event-related potential (ERP) reliably indexes the benefits of semantic context. Evidence that the N400 is sensitive to the other half of prediction--a cost for failure--is largely absent from the literature. This raises the possibility that "prediction" is not a good description of what comprehenders do. However, it need not be the case that the benefits and costs of prediction are evident in a single ERP component. Research outside of language processing indicates that late positive components of the ERP are very sensitive to disconfirmed predictions. We review late positive components elicited by words that are potentially more or less predictable from preceding sentence context. This survey suggests that late positive responses to unexpected words are fairly common, but that these consist of two distinct components with different scalp topographies, one associated with semantically incongruent words and one associated with congruent words. We conclude with a discussion of the possible cognitive correlates of these distinct late positivities and their relationships with more thoroughly characterized ERP components, namely the P300, P600 response to syntactic errors, and the "old/new effect" in studies of recognition memory. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    The Integrated Design Methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but worldwide. In the past, catastrophic failures caused losses of mission and vehicle (LOM/LOV) at the operational phase and severely affected schedules and costs at the later development phase. A design methodology for highly reliable liquid rocket engines is preliminarily established and investigated in this study. Sensitivity analysis is systematically performed to demonstrate the effectiveness of the methodology and to clarify, and especially to focus on, the correlations between the combustion chamber, turbopump and main valve as main components. This study describes the essential issues for understanding the stated correlations, the need to apply this methodology to the remaining critical failure modes in the whole engine system, and the perspective on engine development in the future.

  19. A Robust Damage-Reporting Strategy for Polymeric Materials Enabled by Aggregation-Induced Emission.

    PubMed

    Robb, Maxwell J; Li, Wenle; Gergely, Ryan C R; Matthews, Christopher C; White, Scott R; Sottos, Nancy R; Moore, Jeffrey S

    2016-09-28

    Microscopic damage inevitably leads to failure in polymers and composite materials, but it is difficult to detect without the aid of specialized equipment. The ability to enhance the detection of small-scale damage prior to catastrophic material failure is important for improving the safety and reliability of critical engineering components, while simultaneously reducing life cycle costs associated with regular maintenance and inspection. Here, we demonstrate a simple, robust, and sensitive fluorescence-based approach for autonomous detection of damage in polymeric materials and composites enabled by aggregation-induced emission (AIE). This simple, yet powerful system relies on a single active component, and the general mechanism delivers outstanding performance in a wide variety of materials with diverse chemical and mechanical properties.

  20. Exploration of Korean Students' Scientific Imagination Using the Scientific Imagination Inventory

    NASA Astrophysics Data System (ADS)

    Mun, Jiyeong; Mun, Kongju; Kim, Sung-Won

    2015-09-01

    This article reports on a study of the components of scientific imagination and describes the scales used to measure scientific imagination in Korean elementary and secondary students. In this study, we developed an inventory, which we call the Scientific Imagination Inventory (SII), in order to examine aspects of scientific imagination. We identified three conceptual components of scientific imagination: (1) scientific sensitivity, (2) scientific creativity, and (3) scientific productivity. We administered the SII to 662 students (4th-8th grades) and confirmed validity and reliability using exploratory factor analysis and Cronbach's α coefficient. The characteristics of Korean elementary and secondary students' overall scientific imagination and differences across gender and grade level are discussed in the results section.

  1. Light scattering techniques for the characterization of optical components

    NASA Astrophysics Data System (ADS)

    Hauptvogel, M.; Schröder, S.; Herffurth, T.; Trost, M.; von Finck, A.; Duparré, A.; Weigel, T.

    2017-11-01

    The rapid developments in optical technologies generate increasingly higher and sometimes completely new demands on the quality of materials, surfaces, components, and systems. Examples for such driving applications are the steadily shrinking feature sizes in semiconductor lithography, nanostructured functional surfaces for consumer optics, and advanced optical systems for astronomy and space applications. The reduction of surface defects as well as the minimization of roughness and other scatter-relevant irregularities are essential factors in all these areas of application. Quality-monitoring for analysing and improving those properties must ensure that even minimal defects and roughness values can be detected reliably. Light scattering methods have a high potential for a non-contact, rapid, efficient, and sensitive determination of roughness, surface structures, and defects.

  2. The Psychometric Properties of the Center for Epidemiologic Studies Depression Scale in Chinese Primary Care Patients: Factor Structure, Construct Validity, Reliability, Sensitivity and Responsiveness.

    PubMed

    Chin, Weng Yee; Choi, Edmond P H; Chan, Kit T Y; Wong, Carlos K H

    2015-01-01

    The Center for Epidemiologic Studies Depression Scale (CES-D) is a commonly used instrument to measure depressive symptomatology. Despite this, the evidence for its psychometric properties remains poorly established in Chinese populations. The aim of this study was to validate the use of the CES-D in Chinese primary care patients by examining factor structure, construct validity, reliability, sensitivity and responsiveness. The psychometric properties were assessed amongst a sample of 3686 Chinese adult primary care patients in Hong Kong. Three competing factor structure models were examined using confirmatory factor analysis. The original CES-D four-structure model had adequate fit; however, the data were better fit by a bi-factor model. For internal construct validity, the corrected item-total correlations were above 0.4 for most items. Convergent validity was assessed by examining the correlations between the CES-D, the Patient Health Questionnaire 9 (PHQ-9) and the Short Form-12 Health Survey (version 2) Mental Component Summary (SF-12 v2 MCS). The CES-D had a strong correlation with the PHQ-9 (coefficient: 0.78) and the SF-12 v2 MCS (coefficient: -0.75). Internal consistency was assessed by McDonald's omega hierarchical (ωH). The ωH value for the general depression factor was 0.855. The ωH values for "somatic", "depressed affect", "positive affect" and "interpersonal problems" were 0.434, 0.038, 0.738 and 0.730, respectively. For the two-week test-retest reliability, the intraclass correlation coefficient was 0.91. The CES-D was sensitive in detecting differences between known groups, with the AUC >0.7. Internal responsiveness of the CES-D to detect positive and negative changes was satisfactory (with p value <0.01 and all effect size statistics >0.2). The CES-D was externally responsive, with the AUC >0.7. 
The CES-D appears to be a valid, reliable, sensitive and responsive instrument for screening and monitoring depressive symptoms in adult Chinese primary care patients. In its original four-factor and bi-factor structure, the CES-D is supported for cross-cultural comparisons of depression in multi-center studies.

  3. Decreasing inventory of a cement factory roller mill parts using reliability centered maintenance method

    NASA Astrophysics Data System (ADS)

    Witantyo; Rindiyah, Anita

    2018-03-01

    According to data from maintenance planning and control, the highest inventory value lies in non-routine components. Maintenance components are components procured for maintenance activities. The problem arises because maintenance activities are not synchronized with the components they require. The Reliability Centered Maintenance method is used to overcome this problem by re-evaluating the components required by each maintenance activity. The roller mill system was chosen as the case because it has the highest record of unscheduled downtime. The components required for each maintenance activity are determined from each component's failure distribution, so the number of components needed can be predicted. Moreover, these components can be reclassified so that their procurement is carried out regularly. Based on the analysis conducted, the failures underlying almost every maintenance task are addressed by scheduled on-condition tasks, scheduled discard tasks, scheduled restoration tasks, or no scheduled maintenance. Of the 87 components used in maintenance activities that were evaluated, 19 were reclassified from non-routine to routine components. The reliability and the required number of these components were then calculated for a one-year operation period. Based on this analysis, it is suggested to replace all of the components in the overhaul activity to increase the reliability of the roller mill system. In addition, the inventory system should follow the maintenance schedule and the number of components required by each maintenance activity, so that procurement cost decreases and system reliability increases.
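    The spares-prediction step described above can be sketched numerically. Assuming, as a simplification not taken from the paper, that a component's failures arrive as a Poisson process with a constant rate, the stock needed to cover a planning horizon at a given service level is:

```python
import math

def spares_needed(failure_rate_per_hr, horizon_hr, service_level=0.95):
    """Smallest stock s such that P(demand <= s) >= service_level,
    with demand ~ Poisson(failure_rate * horizon)."""
    lam = failure_rate_per_hr * horizon_hr
    s = 0
    term = math.exp(-lam)   # Poisson pmf at s = 0
    cdf = term
    while cdf < service_level:
        s += 1
        term *= lam / s     # recurrence: pmf(s) = pmf(s-1) * lam / s
        cdf += term
    return s

# e.g. a hypothetical part failing ~2 times per year on average
print(spares_needed(2 / 8760, 8760))
```

    A Weibull time-to-failure model, as fitted in such studies, would call for a renewal-process calculation instead; the Poisson case is the common constant-failure-rate shortcut.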

  4. Computing Reliabilities Of Ceramic Components Subject To Fracture

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.

    1992-01-01

    CARES calculates fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. Program uses results from commercial structural-analysis programs (MSC/NASTRAN or ANSYS) to evaluate reliability of component in presence of inherent surface- and/or volume-type flaws. Computes measure of reliability by use of finite-element mathematical model applicable to multiple materials, in the sense that the model is made a function of statistical characterizations of many ceramic materials. Reliability analysis uses element stress, temperature, area, and volume outputs obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
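    The volume-flaw part of such an analysis reduces to a weighted Weibull summation over finite elements. A minimal sketch, with an illustrative Weibull modulus m and characteristic strength sigma0, and hypothetical element data rather than actual CARES output:

```python
import math

def failure_probability(elements, m, sigma0):
    """Weibull volume-flaw fast-fracture estimate:
    Pf = 1 - exp(-sum_i V_i * (sigma_i / sigma0)**m),
    where elements is a list of (volume, max_principal_stress) pairs
    and only tensile (positive) stresses contribute."""
    risk = sum(v * (max(s, 0.0) / sigma0) ** m for v, s in elements)
    return 1.0 - math.exp(-risk)

# hypothetical finite-element output: (volume in mm^3, stress in MPa)
elements = [(2.0, 180.0), (1.5, 220.0), (0.8, 260.0)]
print(round(failure_probability(elements, m=10, sigma0=400.0), 4))
```

    The highest-stressed elements dominate the sum because of the large Weibull exponent, which is why local stress concentrations drive ceramic failure probability.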

  5. Reliability analysis of component of affination centrifugal 1 machine by using reliability engineering

    NASA Astrophysics Data System (ADS)

    Sembiring, N.; Ginting, E.; Darnello, T.

    2017-12-01

    A problem appears in a company that produces refined sugar: the production floor has not reached the target level of critical-machine availability because the machines often suffer breakdowns. This results in sudden losses of production time and production opportunities. The problem can be solved with the Reliability Engineering method, in which a statistical approach to historical failure data is used to identify the pattern of the failure distribution. The method provides the reliability, failure rate, and availability of a machine over the scheduled maintenance interval. Distribution tests on the time-between-failures (MTTF) data show that the flexible hose component follows a lognormal distribution while the teflon cone lifting component follows a Weibull distribution; for the time-to-repair (MTTR) data, the flexible hose component follows an exponential distribution while the teflon cone lifting component follows a Weibull distribution. On its actual replacement schedule of every 720 hours, the flexible hose component has a reliability of 0.2451 and an availability of 0.9960, while on its actual replacement schedule of every 1944 hours, the critical teflon cone lifting component has a reliability of 0.4083 and an availability of 0.9927.
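    The reliability and availability figures quoted in such analyses come from standard formulas. A sketch with hypothetical Weibull parameters, not the paper's fitted values:

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)**beta): probability of surviving to time t
    for a Weibull time-to-failure model (beta = shape, eta = scale)."""
    return math.exp(-((t / eta) ** beta))

def availability(mttf, mttr):
    """Steady-state availability from mean time to failure and mean
    time to repair: A = MTTF / (MTTF + MTTR)."""
    return mttf / (mttf + mttr)

# hypothetical fitted parameters, evaluated at a 720-hour replacement interval
print(round(weibull_reliability(720.0, beta=1.8, eta=1500.0), 4))
print(round(availability(mttf=1400.0, mttr=6.0), 4))
```

    A shape parameter above 1 models wear-out (an increasing failure rate), which is why shortening the replacement interval raises the reliability at replacement time.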

  6. Design of fuel cell powered data centers for sufficient reliability and availability

    NASA Astrophysics Data System (ADS)

    Ritchie, Alexa J.; Brouwer, Jacob

    2018-04-01

    It is challenging to design a sufficiently reliable fuel cell electrical system for use in data centers, which require 99.9999% uptime. Such a system could lower emissions and increase data center efficiency, but its reliability and availability must be analyzed and understood. Currently, extensive backup equipment is used to ensure electricity availability. The proposed design alternative uses multiple fuel cell systems, each supporting a small number of servers, to eliminate backup power equipment, provided the fuel cell design has sufficient reliability and availability. Potential system designs are explored for the entire data center and for individual fuel cells. Reliability block diagram analysis of the fuel cell systems was performed to understand the reliability of the systems without repair or redundant technologies. From this analysis, it was apparent that redundant components would be necessary. A program was written in MATLAB to show that the desired system reliability could be achieved by some combination of parallel components, regardless of the number of additional components needed. Having shown that the desired reliability was achievable through some combination of components, a dynamic programming analysis was undertaken to determine the ideal allocation of parallel components.
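    The combinatorial core of such an analysis, how many parallel units of a given reliability are needed to reach a target, can be sketched as follows. This is a simplification of the MATLAB program described, assuming independent, identical units where at least one must survive:

```python
import math

def parallel_units_needed(unit_reliability, target):
    """Smallest n with 1 - (1 - r)**n >= target: n independent units
    in parallel, the system surviving if at least one unit survives."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - unit_reliability))

# six-nines reliability target starting from a 0.9-reliable unit
print(parallel_units_needed(0.9, 0.999999))
```

    Because the unreliability (1 - r) multiplies across parallel units, each added unit cuts the failure probability by a constant factor, so even modest unit reliability reaches very high targets with a handful of units.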

  7. Sensitivity of physical examination versus arthroscopy in diagnosing subscapularis tendon injury.

    PubMed

    Faruqui, Sami; Wijdicks, Coen; Foad, Abdullah

    2014-01-01

    The purpose of this study was to examine the accuracy of physical examination in the detection of subscapularis tendon tears and compare it with the gold standard of arthroscopy to determine whether clinical examination can reliably predict the presence of subscapularis tendon tears. This was a retrospective analysis of 52 patients (52 shoulders) who underwent arthroscopic subscapularis tendon repairs between September 2008 and April 2012. Positive findings on any combination of the belly press, lift-off, and bear hug tests constituted a positive physical examination result. There was a positive finding on physical examination in 42 of 52 patients. The sensitivity of the physical examination as a whole was 81%. The literature has shown that the belly press, bear hug, and lift-off tests are specific to the subscapularis tendon. To the authors’ knowledge, this is the first study to evaluate the sensitivity of these 3 separate clinical tests as a composite. Knowledge regarding the sensitivity of the subscapularis-specific physical examination as a composite can lead practitioners to implement all 3 components, even when 1 test has a negative finding, thus promoting a more thorough physical examination. Because unrepaired subscapularis tendon tears can result in poor outcomes in the repair of other rotator cuff tendons, a complete physical examination would be beneficial to patients with shoulder pathology. The authors conclude that physical examination, when performed consistently by an experienced practitioner, can reliably predict the presence of subscapularis tendon tears.

  8. Cervical auscultation as an adjunct to the clinical swallow examination: a comparison with fibre-optic endoscopic evaluation of swallowing.

    PubMed

    Bergström, Liza; Svensson, Per; Hartelius, Lena

    2014-10-01

    This prospective, single-blinded study investigated the validity and reliability of cervical auscultation (CA) under two conditions: (1) CA-only, using isolated swallow-sound clips, and (2) CSE + CA, using additional clinical swallow examination (CSE) information, such as patient case history and oromotor assessment, together with the same swallow-sound clips as condition one. The two CA conditions were compared against a fibre-optic endoscopic evaluation of swallowing (FEES) reference test. Each CA condition consisted of 18 swallow samples compiled from 12 adult patients consecutively referred to the FEES clinic. Patients' swallow sounds were simultaneously recorded during FEES via a Littmann E3200 electronic stethoscope. These 18 swallow samples were sent to 13 experienced dysphagia clinicians recruited from the UK and Australia who were blinded to the FEES results. Samples were rated in terms of (1) whether the swallow was dysphagic, (2) whether the patient was safe on the consistency trialled, and (3) dysphagia severity. Sensitivity measures ranged from 83-95% and specificity measures from 50-92% across the conditions. Intra-rater agreement ranged from 69-97% total agreement. Inter-rater reliability for dysphagia severity showed substantial agreement (rs = 0.68 and 0.74). Results show good rater reliability for CA-trained speech-language pathologists. Sensitivity and specificity for both CA conditions in this study are comparable to, and often better than, other well-established CSE components.

  9. Signal Detection Theory Applied to Helicopter Transmission Diagnostic Thresholds

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Keller, Jonathan A.; Wade, Daniel R.

    2008-01-01

    Helicopter Health Usage Monitoring Systems (HUMS) have potential for providing data to support increasing the service life of a dynamic mechanical component in the transmission of a helicopter. Data collected can demonstrate the HUMS condition indicator responds to a specific component fault with appropriate alert limits and minimal false alarms. Defining thresholds for specific faults requires a tradeoff between the sensitivity of the condition indicator (CI) limit to indicate damage and the number of false alarms. A method using Receiver Operating Characteristic (ROC) curves to assess CI performance was demonstrated using CI data collected from accelerometers installed on several UH60 Black Hawk and AH64 Apache helicopters and an AH64 helicopter component test stand. Results of the analysis indicate ROC curves can be used to reliably assess the performance of commercial HUMS condition indicators to detect damaged gears and bearings in a helicopter transmission.
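    ROC analysis of a condition indicator reduces to ranking damaged against healthy CI samples: the area under the ROC curve equals the probability that a randomly chosen damaged sample scores higher than a randomly chosen healthy one. A minimal AUC sketch with hypothetical CI values, not actual HUMS data:

```python
def roc_auc(damaged_scores, healthy_scores):
    """Probability that a damaged component's condition indicator
    exceeds a healthy one's (equal to the area under the ROC curve),
    counting ties as one half."""
    wins = 0.0
    for d in damaged_scores:
        for h in healthy_scores:
            if d > h:
                wins += 1.0
            elif d == h:
                wins += 0.5
    return wins / (len(damaged_scores) * len(healthy_scores))

# hypothetical CI values from healthy runs and seeded-fault runs
healthy = [0.8, 1.1, 0.9, 1.0, 1.2]
damaged = [1.5, 1.1, 2.0, 1.8]
print(roc_auc(damaged, healthy))
```

    An AUC of 0.5 means the indicator cannot separate the classes at any threshold; values near 1 mean a threshold exists with high detection and few false alarms, which is the tradeoff the paper quantifies.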

  10. Signal Detection Theory Applied to Helicopter Transmission Diagnostic Thresholds

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Keller, Jonathan A.; Wade, Daniel R.

    2009-01-01

    Helicopter Health Usage Monitoring Systems (HUMS) have potential for providing data to support increasing the service life of a dynamic mechanical component in the transmission of a helicopter. Data collected can demonstrate the HUMS condition indicator responds to a specific component fault with appropriate alert limits and minimal false alarms. Defining thresholds for specific faults requires a tradeoff between the sensitivity of the condition indicator (CI) limit to indicate damage and the number of false alarms. A method using Receiver Operating Characteristic (ROC) curves to assess CI performance was demonstrated using CI data collected from accelerometers installed on several UH60 Black Hawk and AH64 Apache helicopters and an AH64 helicopter component test stand. Results of the analysis indicate ROC curves can be used to reliably assess the performance of commercial HUMS condition indicators to detect damaged gears and bearings in a helicopter transmission.

  11. Sensitivity to mental effort and test-retest reliability of heart rate variability measures in healthy seniors.

    PubMed

    Mukherjee, Shalini; Yadav, Rajeev; Yung, Iris; Zajdel, Daniel P; Oken, Barry S

    2011-10-01

    To determine (1) whether heart rate variability (HRV) was a sensitive and reliable measure in mental effort tasks carried out by healthy seniors and (2) whether non-linear approaches to HRV analysis, in addition to traditional time and frequency domain approaches, were useful to study such effects. Forty healthy seniors performed two visual working memory tasks requiring different levels of mental effort, while ECG was recorded. They underwent the same tasks and recordings 2 weeks later. Traditional indices and 13 non-linear indices of HRV, including Poincaré, entropy and detrended fluctuation analysis (DFA), were determined. Time-domain indices, especially the mean R-R interval (RRI), frequency-domain indices and, among the non-linear parameters, Poincaré and DFA were the most reliable indices. Mean RRI, the time-domain indices and Poincaré were also the most sensitive to the different mental effort task loads and had the largest effect sizes. Overall, linear measures were the most sensitive and reliable indices of mental effort. Among the non-linear measures, Poincaré was the most reliable and sensitive, suggesting possible usefulness as an independent marker in cognitive function tasks in healthy seniors. A large number of HRV parameters were both reliable and sensitive indices of mental effort, although the simple linear methods were the most sensitive. Copyright © 2011 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
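    The Poincaré indices mentioned above (SD1/SD2) follow directly from successive R-R intervals via the standard ellipse-fitting identities. A sketch using an illustrative, hypothetical R-R series:

```python
import math

def poincare_sd1_sd2(rr_ms):
    """SD1/SD2 of the Poincaré plot from successive R-R intervals (ms).
    Identities: SD1^2 = var(successive differences)/2 (short-term
    variability); SD2^2 = 2*SDRR^2 - SD1^2 (long-term variability)."""
    n = len(rr_ms)
    mean = sum(rr_ms) / n
    sdrr2 = sum((x - mean) ** 2 for x in rr_ms) / (n - 1)
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    dmean = sum(diffs) / len(diffs)
    sdsd2 = sum((d - dmean) ** 2 for d in diffs) / (len(diffs) - 1)
    sd1 = math.sqrt(sdsd2 / 2.0)
    sd2 = math.sqrt(max(2.0 * sdrr2 - sdsd2 / 2.0, 0.0))
    return sd1, sd2

rr = [812, 790, 805, 830, 815, 800, 820, 795]  # illustrative R-R series
sd1, sd2 = poincare_sd1_sd2(rr)
print(round(sd1, 2), round(sd2, 2))
```

    In practice SD1/SD2 are computed over several minutes of beats; the short series here only illustrates the arithmetic.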

  12. The effectiveness of using the calculated braking current for longitudinal differential protection of 110 - 750 kV shunt reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vdovin, S. A.; Shalimov, A. S.

    2013-05-15

    The use of calculated-current braking in the longitudinal differential protection of shunt reactors to offset current surges is considered; it enables the sensitivity of the differential protection to be increased for short circuits with low fault currents. It is shown that the use of the calculated braking characteristic increases the security of the protection against transients when the reactor is connected, a process accompanied by the flow of asymmetric currents containing an aperiodic component.

  13. Hybrid rockets - Combining the best of liquids and solids

    NASA Technical Reports Server (NTRS)

    Cook, Jerry R.; Goldberg, Ben E.; Estey, Paul N.; Wiley, Dan R.

    1992-01-01

    Hybrid rockets, employing a liquid oxidizer and a solid fuel grain, answer the cost, safety, reliability, and environmental-impact concerns that have become as prominent as performance in recent years. The oxidizer-free grain has limited sensitivity to grain anomalies, such as bond-line separations, which can cause catastrophic failures in solid rocket motors. An account is presently given of the development effort associated with the AMROC commercial hybrid booster and component testing efforts at NASA-Marshall. These hybrid rockets can be fired, terminated, inspected, evaluated, and restarted for additional testing.

  14. Superconductor Semiconductor Research for NASA's Submillimeter Wavelength Missions

    NASA Technical Reports Server (NTRS)

    Crowe, Thomas W.

    1997-01-01

    Wideband, coherent submillimeter wavelength detectors of the highest sensitivity are essential for the success of NASA's future radio astronomical and atmospheric space missions. The critical receiver components which need to be developed are ultra- wideband mixers and suitable local oscillator sources. This research is focused on two topics, (1) the development of reliable varactor diodes that will generate the required output power for NASA missions in the frequency range from 300 GHZ through 2.5 THz, and (2) the development of wideband superconductive mixer elements for the same frequency range.

  15. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system is analyzed using the available data, which contain vagueness and uncertainty. The involved uncertainties are quantified through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions; the decision-maker therefore cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently improve complex-system performance. To overcome this problem, the present study utilizes a hybridized technique in which fuzzy set theory quantifies the uncertainties, a fault tree models the system, the lambda-tau method formulates mathematical expressions for the failure/repair rates of the system, and a genetic algorithm solves the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow exponential distributions, i.e., constant failure rates. Sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) of varying other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve the system performance.
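    The fuzzy ingredient of such a hybridized analysis can be illustrated with alpha-cut interval arithmetic on triangular failure rates. A minimal series-system sketch, with hypothetical rates rather than the paper's robot data:

```python
def alpha_cut(tri, alpha):
    """[lower, upper] interval of a triangular fuzzy number (a, m, b)
    at membership level alpha in [0, 1]."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def series_mtbf_cut(fuzzy_rates, alpha):
    """Series system: lambda_sys = sum(lambda_i); MTBF = 1/lambda_sys.
    Interval arithmetic on the alpha-cuts of each fuzzy rate; taking
    the reciprocal flips the interval bounds."""
    lo = sum(alpha_cut(t, alpha)[0] for t in fuzzy_rates)
    hi = sum(alpha_cut(t, alpha)[1] for t in fuzzy_rates)
    return (1.0 / hi, 1.0 / lo)

# hypothetical triangular failure rates (per hour) for three components
rates = [(1e-4, 2e-4, 3e-4), (0.5e-4, 1e-4, 2e-4), (2e-4, 3e-4, 4e-4)]
print(series_mtbf_cut(rates, alpha=1.0))  # crisp case: bounds coincide
print(series_mtbf_cut(rates, alpha=0.0))  # widest case: full support
```

    Sweeping alpha from 1 down to 0 traces out the fuzzy MTBF; the wide alpha = 0 interval is exactly the "wide range of predictions" the paper's hybridized technique aims to narrow.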

  16. Discrete component bonding and thick film materials study. [of capacitor chips bonded with solders and conductive epoxies

    NASA Technical Reports Server (NTRS)

    Kinser, D. L.

    1976-01-01

    The bonding reliability of discrete capacitor chips bonded with solders and conductive epoxies was examined along with the thick film resistor materials consisting of iron oxide phosphate and vanadium oxide phosphates. It was concluded from the bonding reliability studies that none of the wide range of types of solders examined is capable of resisting failure during thermal cycling while the conductive epoxy gives substantially lower failure rates. The thick film resistor studies proved the feasibility of iron oxide phosphate resistor systems although some environmental sensitivity problems remain. One of these resistor compositions has inadvertently proven to be a candidate for thermistor applications because of the excellent control achieved upon the temperature coefficient of resistance. One new and potentially damaging phenomenon observed was the degradation of thick film conductors during the course of thermal cycling.

  17. Socially supportive activity inventory: reliability and validity of a social activity instrument for long-term care institutions.

    PubMed

    Hsu, Ya-Chuan

    2011-09-01

    Diverse social and recreational activities in elder care institutions have been provided to enrich a person's mental well-being amidst what is a relatively monotonous life. However, few instruments that measure the social activities of long-term care residents are available. This study was designed to develop a culturally sensitive instrument (Socially Supportive Activity Inventory, SSAI) to assess the quantity and quality of social activities in long-term care institutions and to validate the instrument's psychometric properties. The SSAI was developed on the basis of social support theory, a synthesis of the literature, and Taiwanese cultural mores. The instrument was rigorously subjected to a two-stage process to evaluate its reliability and validity. In Stage 1, six experts from diverse backgrounds were recruited to evaluate instrument items and estimate the content validity of the instrument using a content validity questionnaire. Items were modified and refined on the basis of the responses of the expert panel and a set of criteria. After obtaining approval from a university institutional review board, in the second stage of evaluating test-retest reliability, a convenience sample of 10 Taiwanese institutionalized elders in a pilot study, recruited from a nursing home, completed the revised instrument at two separate times over 2 weeks. Results showed a content validity of .96. Test-retest reliability from a sample of 10 participants yielded stability coefficients of .76-1.00. The stability coefficient was 1.00 for the component of frequency, .76-1.00 for the component of meaningfulness, and .78-1.00 for the component of enjoyment. The SSAI is a highly relevant and reliable culturally based instrument that can measure social activity in long-term care facilities. Because of the pilot nature of this study, future directions include further exploration of the SSAI instrument's psychometric properties.
This should be done by enlarging the sample size to include more long-term care facilities and individual participants. Future studies can utilize diverse measures of social activity for comparison and validation of the SSAI.

  18. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    PubMed

    Kwasa, Judith; Cettomai, Deanna; Lwanya, Edwin; Osiemo, Dennis; Oyaro, Patrick; Birbeck, Gretchen L; Price, Richard W; Bukusi, Elizabeth A; Cohen, Craig R; Meyer, Ana-Claire L

    2012-01-01

    To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (K = .03-.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.
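    The agreement measure reported here, the Kappa statistic, can be computed from paired ratings. A sketch of Cohen's kappa with hypothetical pass/fail item ratings, not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same
    items: kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # expected agreement if raters assigned labels independently
    pe = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (po - pe) / (1.0 - pe)

# hypothetical pass/fail ratings on 10 screening items
hcw    = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
expert = [1, 0, 0, 1, 0, 1, 1, 1, 0, 0]
print(round(cohens_kappa(hcw, expert), 2))
```

    Values below roughly 0.4, as seen for many items in this study, indicate poor-to-fair agreement beyond chance, which is why the authors call for more rigorous training before deployment.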

  19. Patient health questionnaire for school-based depression screening among Chinese adolescents.

    PubMed

    Tsai, Fang-Ju; Huang, Yu-Hsin; Liu, Hui-Ching; Huang, Kuo-Yang; Huang, Yen-Hsun; Liu, Shen-Ing

    2014-02-01

    The aim of this study was to determine the reliability and validity of a Chinese version of the Patient Health Questionnaire-9 item (PHQ-9) and its 2 subscales (1 item and 2 items) for the screening of major depressive disorder (MDD) among adolescents in Taiwan. A total of 2257 adolescents were recruited from high schools in Taipei. The participants completed assessments including demographic information, the Chinese version of the PHQ-9, and the Rosenberg Self-Esteem Scale, and data on the number of physical illnesses and mental health service utilizations were recorded. Among them, 430 were retested using the PHQ-9 within 2 weeks. Child psychiatrists interviewed a subsample of the adolescents (n = 165) using the Kiddie-Schedule for Affective Disorder and Schizophrenia Epidemiological Version as the criterion standard. The PHQ-9 had good internal consistency (α = 0.84) and acceptable test-retest reliability (0.80). The participants with higher PHQ-9 scores were more likely to have MDD. Principal component factor analysis of the PHQ-9 yielded a 1-factor structure, which accounted for 45.3% of the variance. A PHQ-9 score ≥15 had a sensitivity of 0.72 and a specificity of 0.95 for recognizing MDD. The area under the receiver operating characteristic curve was 0.90. The screening accuracy of the 2 subscales was also satisfactory, with a Patient Health Questionnaire-2 item cutoff of ≥3 being 94.4% sensitive and 82.5% specific and a Patient Health Questionnaire-1 item cutoff of ≥2 being 61.1% sensitive and 87.7% specific. The PHQ-9 and its 2 subscales appear to be reliable and valid for detecting MDD among ethnic Chinese adolescents in Taiwan.
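    Sensitivity and specificity at a cutoff such as PHQ-9 >= 15 come from a simple confusion-matrix count against the criterion-standard diagnosis. A sketch with hypothetical scores and diagnoses, not the study's data:

```python
def screen_metrics(scores, has_disorder, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff'
    against a criterion-standard diagnosis."""
    tp = sum(s >= cutoff and d for s, d in zip(scores, has_disorder))
    fn = sum(s < cutoff and d for s, d in zip(scores, has_disorder))
    tn = sum(s < cutoff and not d for s, d in zip(scores, has_disorder))
    fp = sum(s >= cutoff and not d for s, d in zip(scores, has_disorder))
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical PHQ-9 totals with interview-based diagnoses
scores    = [20, 16, 9, 14, 22, 5, 17, 8, 15, 3]
diagnosed = [True, True, False, False, True, False, True, False, False, False]
sens, spec = screen_metrics(scores, diagnosed, cutoff=15)
print(sens, spec)
```

    Sweeping the cutoff over all observed scores and plotting sensitivity against (1 - specificity) yields the receiver operating characteristic curve whose area the study reports.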

  20. Reliability and sensitivity analysis of a system with multiple unreliable service stations and standby switching failures

    NASA Astrophysics Data System (ADS)

    Ke, Jyh-Bin; Lee, Wen-Chiung; Wang, Kuo-Hsiung

    2007-07-01

    This paper presents the reliability and sensitivity analysis of a system with M primary units, W warm standby units, and R unreliable service stations, in which warm standby units switching to the primary state might fail. Failure times of primary and warm standby units are assumed to have exponential distributions, and service times of the failed units are exponentially distributed. In addition, breakdown times and repair times of the service stations also follow exponential distributions. Expressions for the system reliability, R_Y(t), and the mean time to system failure, MTTF, are derived. Sensitivity and relative-sensitivity analyses of the system reliability and the mean time to failure with respect to the system parameters are also investigated.
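    A warm-standby system with switching failures of this kind can also be estimated by simulation. A minimal Monte Carlo sketch for one primary unit and one warm standby, with illustrative rates; the paper's model with M primaries, W standbys, and R repair stations is more general and is solved analytically:

```python
import random

def simulate_reliability(t_mission, lam_primary, lam_warm,
                         p_switch_fail, trials=20000, seed=1):
    """Monte Carlo mission reliability: one primary unit backed by one
    warm standby whose switch-over to the primary state may fail."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        t_fail_primary = rng.expovariate(lam_primary)
        if t_fail_primary >= t_mission:
            survived += 1
            continue
        # the standby ages (and may fail) at the warm rate before switch-over
        standby_alive = rng.expovariate(lam_warm) > t_fail_primary
        if standby_alive and rng.random() > p_switch_fail:
            # after switch-over the standby runs at the primary rate
            if t_fail_primary + rng.expovariate(lam_primary) >= t_mission:
                survived += 1
    return survived / trials

print(simulate_reliability(1000.0, 1e-3, 2e-4, 0.05))
```

    Raising the switching-failure probability directly lowers mission reliability, which is the sensitivity the paper derives in closed form.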

  1. Reliability prediction of large fuel cell stack based on structure stress analysis

    NASA Astrophysics Data System (ADS)

    Liu, L. F.; Liu, B.; Wu, C. W.

    2017-09-01

    The aim of this paper is to improve the reliability of a Proton Exchange Membrane Fuel Cell (PEMFC) stack by designing the clamping force and the thickness difference between the membrane electrode assembly (MEA) and the gasket. The stack reliability is directly determined by the component reliability, which is affected by the material properties and the contact stress. The component contact stress is a random variable because it is usually affected by many uncertain factors in the production and clamping processes. We have investigated the influence of the parameter variation coefficients on the probability distribution of the contact stress using an equivalent stiffness model and the first-order second-moment method. The optimal contact stress, at which the component reliability is highest, is obtained from the stress-strength interference model. To achieve this contact stress between the contacting components, the component thicknesses and the stack clamping force are optimally designed. Finally, a detailed description is given of how to design the MEA and gasket dimensions to obtain the highest stack reliability. This work can provide valuable guidance in the design of the stack structure for a highly reliable fuel cell stack.
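    For normally distributed stress and strength, the stress-strength interference step reduces to a closed form. A sketch with hypothetical moments, not the paper's PEMFC values:

```python
import math

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Stress-strength interference with normal variables:
    R = P(strength > stress) = Phi(beta),
    beta = (mu_S - mu_s) / sqrt(sd_S**2 + sd_s**2)."""
    beta = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))

# hypothetical contact-stress design point for an MEA/gasket pair (MPa-scale units)
print(round(interference_reliability(mu_strength=500, sd_strength=40,
                                     mu_stress=400, sd_stress=30), 5))
```

    The reliability index beta makes the design tradeoff explicit: increasing the mean margin or reducing either scatter (e.g. by tightening clamping-process tolerances) raises the component reliability.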

  2. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
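    The underlying computation, the probability that a discrete-time Markov chain over components ends in success rather than failure, can be sketched directly. The states and transition probabilities below are hypothetical, not taken from the Forensic Toolkit Imager study:

```python
def success_probability(P, start, success, fail, tol=1e-12):
    """Probability that a discrete-time Markov chain started in `start`
    is absorbed in `success` rather than `fail`. P is a dict of dicts:
    P[state][next_state] = transition probability; `success` and `fail`
    are absorbing and therefore have no outgoing entries in P."""
    dist = {start: 1.0}
    absorbed = {success: 0.0, fail: 0.0}
    while dist:
        nxt = {}
        for s, p in dist.items():
            for t, q in P[s].items():
                if t in absorbed:
                    absorbed[t] += p * q
                else:
                    nxt[t] = nxt.get(t, 0.0) + p * q
        # drop negligible mass so chains with cycles still terminate
        dist = {s: p for s, p in nxt.items() if p > tol}
    return absorbed[success]

# toy component chain: each processing step succeeds with probability 0.95
P = {
    "acquire": {"parse": 0.95, "error": 0.05},
    "parse":   {"report": 0.95, "error": 0.05},
    "report":  {"done": 0.95, "error": 0.05},
}
print(round(success_probability(P, "acquire", "done", "error"), 4))
```

    In the architecture-based approach, the per-transition probabilities would come from per-component reliability estimates weighted by COSMIC-FFP functional size, so the chain reflects how often each component is exercised.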

  3. SAGE II Measurements of Stratospheric Aerosol Properties at Non-Volcanic Levels

    NASA Technical Reports Server (NTRS)

    Thomason, Larry W.; Burton, Sharon P.; Luo, Bei-Ping; Peter, Thomas

    2008-01-01

    Since 2000, stratospheric aerosol levels have been relatively stable and at the lowest levels observed in the historical record. Given the challenges of making satellite measurements of aerosol properties at these levels, we have performed a study of the sensitivity of the product to the major components of the processing algorithm used in the production of SAGE II aerosol extinction measurements and the retrieval process that produces the operational surface area density (SAD) product. We find that the aerosol extinction measurements, particularly at 1020 nm, remain robust and reliable at the observed aerosol levels. On the other hand, during background periods, the SAD operational product has an uncertainty of at least a factor of 2 due to the lack of sensitivity to particles with radii less than 100 nm.

  4. Tax Subsidies for Employer-Sponsored Health Insurance: Updated Microsimulation Estimates and Sensitivity to Alternative Incidence Assumptions

    PubMed Central

    Miller, G Edward; Selden, Thomas M

    2013-01-01

    Objective To estimate 2012 tax expenditures for employer-sponsored insurance (ESI) in the United States and to explore the sensitivity of estimates to assumptions regarding the incidence of employer premium contributions. Data Sources Nationally representative Medical Expenditure Panel Survey data from the 2005–2007 Household Component (MEPS-HC) and the 2009–2010 Insurance Component (MEPS IC). Study Design We use MEPS HC workers to construct synthetic workforces for MEPS IC establishments, applying the workers' marginal tax rates to the establishments' insurance premiums to compute the tax subsidy, in aggregate and by establishment characteristics. Simulation enables us to examine the sensitivity of ESI tax subsidy estimates to a range of scenarios for the within-firm incidence of employer premium contributions when workers have heterogeneous health risks and make heterogeneous plan choices. Principal Findings We simulate the total ESI tax subsidy for all active, civilian U.S. workers to be $257.4 billion in 2012. In the private sector, the subsidy disproportionately flows to workers in large establishments and establishments with predominantly high wage or full-time workforces. The estimates are remarkably robust to alternative incidence assumptions. Conclusions The aggregate value of the ESI tax subsidy and its distribution across firms can be reliably estimated using simplified incidence assumptions. PMID:23398400

  5. Reliability of Health-Related Physical Fitness Tests among Colombian Children and Adolescents: The FUPRECOL Study

    PubMed Central

    Ramírez-Vélez, Robinson; Rodrigues-Bezerra, Diogo; Correa-Bautista, Jorge Enrique; Izquierdo, Mikel; Lobelo, Felipe

    2015-01-01

    Substantial evidence indicates that youth physical fitness levels are an important marker of lifestyle and cardio-metabolic health profiles and predict future risk of chronic diseases. The reliability of physical fitness tests has not been explored in the Latin American youth population. This study’s aim was to examine the reliability of health-related physical fitness tests that were used in the Colombian health promotion “Fuprecol study”. Participants were 229 Colombian youth (boys n = 124 and girls n = 105) aged 9 to 17.9 years old. Five components of health-related physical fitness were measured: 1) morphological component: height, weight, body mass index (BMI), waist circumference, triceps skinfold, subscapular skinfold, and body fat (%) via impedance; 2) musculoskeletal component: handgrip and standing long jump test; 3) motor component: speed/agility test (4x10 m shuttle run); 4) flexibility component (hamstring and lumbar extensibility, sit-and-reach test); 5) cardiorespiratory component: 20-meter shuttle-run test (SRT) to estimate maximal oxygen consumption. The tests were performed two times, 1 week apart on the same day of the week, except for the SRT, which was performed only once. Intra-observer technical errors of measurement (TEMs) and inter-rater reliability were assessed in the morphological component. Reliability for the musculoskeletal, motor and cardiorespiratory fitness components was examined using Bland–Altman tests. For the morphological component, TEMs were small and reliability was greater than 95% in all cases. For the musculoskeletal, motor, flexibility and cardiorespiratory components, we found adequate reliability patterns in terms of systematic errors (bias) and random error (95% limits of agreement). When the fitness assessments were performed twice, the systematic error was nearly 0 for all tests, except for the sit and reach (mean difference: -1.03% [95% CI = -4.35% to -2.28%]). 
The results from this study indicate that the “Fuprecol study” health-related physical fitness battery, administered by physical education teachers, was reliable for measuring health-related components of fitness in children and adolescents aged 9–17.9 years old in a school setting in Colombia. PMID:26474474
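
    The Bland–Altman quantities reported above (systematic error, i.e. bias, and 95% limits of agreement) have a simple closed form; below is a minimal sketch. The handgrip scores are hypothetical illustration data, not the study's measurements.

```python
import numpy as np

def bland_altman(test1, test2):
    """Bland-Altman test-retest agreement: systematic error (bias)
    and 95% limits of agreement (bias +/- 1.96 * SD of differences)."""
    t1, t2 = np.asarray(test1, float), np.asarray(test2, float)
    diff = t2 - t1
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical handgrip scores (kg) for the same children, one week apart.
week1 = [18.0, 22.5, 15.0, 30.0, 25.5, 20.0]
week2 = [18.5, 22.0, 15.5, 29.5, 26.0, 20.5]
bias, (lo, hi) = bland_altman(week1, week2)
print(f"bias = {bias:.2f} kg, 95% LoA = [{lo:.2f}, {hi:.2f}] kg")
```

    A bias near zero with narrow limits of agreement is the "adequate reliability pattern" the abstract refers to.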

  6. Compound estimation procedures in reliability

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1990-01-01

    At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability, even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating the reliability, which consider the entire failure record for all design stages, has great intuitive appeal. A typical subsystem consists of a number of different components and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages have lower risk than the corresponding estimators conditioned only on the most recent design failure data. 
Several models were explored, and preliminary models involving the bivariate Poisson distribution and the Consael process (a bivariate Poisson process) were developed. Possible shortcomings of the models are noted. An example is given to illustrate the procedures. These investigations are ongoing, with the aim of developing estimators that extend to components (and subsystems) with three or more design stages.
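
    The bivariate Poisson and Consael-process models in this record are beyond a short sketch, but the core idea, letting an earlier design stage's failure record inform the estimate for the current stage so that a point estimate exists even with zero recorded failures, can be illustrated with a conjugate gamma–Poisson update. The prior parameters and failure counts below are hypothetical.

```python
def posterior_failure_rate(prior_a, prior_b, failures, exposure):
    """Gamma(a, b) prior on the failure rate lambda; with `failures`
    observed over `exposure` operating hours, the posterior is
    Gamma(a + failures, b + exposure). Return the posterior mean."""
    a = prior_a + failures
    b = prior_b + exposure
    return a / b

# Stage 1: 3 failures in 5000 h, starting from a vague Gamma(0.5, 100) prior.
rate1 = posterior_failure_rate(0.5, 100.0, 3, 5000.0)
# Stage 2 (redesign): 0 failures in 2000 h, using stage 1's posterior as the
# prior -- a point estimate exists even though no failures were recorded.
rate2 = posterior_failure_rate(0.5 + 3, 100.0 + 5000.0, 0, 2000.0)
print(rate1, rate2)
```

    Carrying the stage-1 posterior forward as the stage-2 prior is the simplest version of "conditioning on the complete failure history"; the record's compound models do this jointly rather than sequentially.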

  7. Reliability of Health-Related Physical Fitness Tests among Colombian Children and Adolescents: The FUPRECOL Study.

    PubMed

    Ramírez-Vélez, Robinson; Rodrigues-Bezerra, Diogo; Correa-Bautista, Jorge Enrique; Izquierdo, Mikel; Lobelo, Felipe

    2015-01-01

    Substantial evidence indicates that youth physical fitness levels are an important marker of lifestyle and cardio-metabolic health profiles and predict future risk of chronic diseases. The reliability of physical fitness tests has not been explored in the Latin American youth population. This study's aim was to examine the reliability of health-related physical fitness tests that were used in the Colombian health promotion "Fuprecol study". Participants were 229 Colombian youth (boys n = 124 and girls n = 105) aged 9 to 17.9 years old. Five components of health-related physical fitness were measured: 1) morphological component: height, weight, body mass index (BMI), waist circumference, triceps skinfold, subscapular skinfold, and body fat (%) via impedance; 2) musculoskeletal component: handgrip and standing long jump test; 3) motor component: speed/agility test (4x10 m shuttle run); 4) flexibility component (hamstring and lumbar extensibility, sit-and-reach test); 5) cardiorespiratory component: 20-meter shuttle-run test (SRT) to estimate maximal oxygen consumption. The tests were performed two times, 1 week apart on the same day of the week, except for the SRT, which was performed only once. Intra-observer technical errors of measurement (TEMs) and inter-rater reliability were assessed in the morphological component. Reliability for the musculoskeletal, motor and cardiorespiratory fitness components was examined using Bland-Altman tests. For the morphological component, TEMs were small and reliability was greater than 95% in all cases. For the musculoskeletal, motor, flexibility and cardiorespiratory components, we found adequate reliability patterns in terms of systematic errors (bias) and random error (95% limits of agreement). When the fitness assessments were performed twice, the systematic error was nearly 0 for all tests, except for the sit and reach (mean difference: -1.03% [95% CI = -4.35% to -2.28%]). 
The results from this study indicate that the "Fuprecol study" health-related physical fitness battery, administered by physical education teachers, was reliable for measuring health-related components of fitness in children and adolescents aged 9-17.9 years old in a school setting in Colombia.

  8. Reliability and validity of a Chinese version of the Diagnostic Interview for Borderlines-Revised.

    PubMed

    Wang, Lanlan; Yuan, Chenmei; Qiu, Jianying; Gunderson, John; Zhang, Min; Jiang, Kaida; Leung, Freedom; Zhong, Jie; Xiao, Zeping

    2014-09-01

    Borderline personality disorder (BPD) is the most studied of the axis II disorders. One of the most widely used diagnostic instruments is the Diagnostic Interview for Borderline Patients-Revised (DIB-R). The aim of this study was to test the reliability and validity of the DIB-R for use in the Chinese culture. The reliability and validity of the DIB-R Chinese version were assessed in a sample of 236 outpatients with a probable BPD diagnosis. The Structured Clinical Interview for DSM-IV Personality Disorders (SCID-II) was used as a standard. Test-retest reliability was tested six months later with 20 patients, and inter-rater reliability was tested on 32 patients. The Chinese version of the DIB-R showed good internal global consistency (Cronbach's α of 0.916), good test-retest reliability (Pearson correlation of 0.704), and good inter-rater reliability (intra-class correlation coefficient of 0.892 and kappa of 0.861). When compared with the DSM-IV diagnosis as measured by the SCID-II, the DIB-R showed relatively good sensitivity (0.768) and specificity (0.891) at the cutoff of 7, moderate diagnostic convergence (kappa of 0.631), as well as good discriminating validity. The Chinese version of the DIB-R has good psychometric properties, which renders it a valuable method for examining the presence, the severity, and component phenotypes of BPD in Chinese samples. © 2013 Wiley Publishing Asia Pty Ltd.
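
    The internal-consistency statistic reported above (Cronbach's alpha) can be sketched in a few lines. The score matrix below is hypothetical illustration data, not DIB-R responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    X = np.asarray(items, float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical ratings of 5 subjects on 4 interview items.
scores = [[2, 2, 3, 2],
          [0, 1, 1, 0],
          [4, 4, 3, 4],
          [1, 2, 2, 1],
          [3, 3, 4, 3]]
print(round(cronbach_alpha(scores), 3))
```

    Values above roughly 0.9, as with the DIB-R's 0.916, indicate that the items rise and fall together across subjects.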

  9. Design of preventive maintenance system using the reliability engineering and maintenance value stream mapping methods in PT. XYZ

    NASA Astrophysics Data System (ADS)

    Sembiring, N.; Panjaitan, N.; Angelita, S.

    2018-02-01

    PT. XYZ is a non-governmental company engaged in processing rubber into crumb rubber. Production is supported by a number of interacting machines and pieces of equipment working together to achieve optimal productivity. The machine types used in the production process are Conveyor Breaker, Breaker, Rolling Pin, Hammer Mill, Mill Roll, Conveyor, Shredder Crumb, and Dryer. The maintenance system at PT. XYZ is corrective maintenance, i.e. repairing or replacing engine components only after a breakdown occurs. Replacing engine components under corrective maintenance causes the machine to stop operating while the production process is in progress. The result is lost production time, because the operator must replace the damaged engine components. This lost production time means that production targets are missed, leading to high loss costs. The cost for all components is Rp. 4.088.514.505, which is very high for maintaining a single Mill Roll machine. PT. XYZ therefore needs preventive maintenance, i.e. scheduling the replacement of engine components and improving maintenance efficiency. The methods used are Reliability Engineering and Maintenance Value Stream Mapping (MVSM). The data required for this research are the time intervals between damage to engine components, opportunity cost, labor cost, component cost, corrective repair time, preventive repair time, Mean Time To Opportunity (MTTO), Mean Time To Repair (MTTR), and Mean Time To Yield (MTTY). In this research, the critical components of the Mill Roll machine are the Spier, Bushing, Bearing, Coupling and Roll. Determination of the damage distribution, reliability, MTTF, cost of failure, cost of preventive maintenance, current state map, and future state map is carried out so that the replacement time for each critical component with the lowest maintenance cost can be found and a Standard Operating Procedure (SOP) developed. 
For the critical components that were determined, the Spier component replacement time interval is 228 days with a reliability value of 0.503171, the Bushing component 240 days with a reliability of 0.36861, the Bearing component 202 days with a reliability of 0.503058, the Coupling component 247 days with a reliability of 0.50108, and the Roll component 301 days with a reliability of 0.373525. The results show that the cost decreases from Rp 300,688,114 to Rp 244,384,371 in moving from corrective to preventive maintenance, while maintenance efficiency increases with the application of preventive maintenance: for the Spier component from 54.0540541% to 74.07407%, the Bushing component from 52.3809524% to 68.75%, the Bearing component from 40% to 52.63158%, the Coupling component from 60.9756098% to 71.42857%, and the Roll component from 64.516129% to 74.7663551%.
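
    The replacement-interval calculations above rest on a reliability model fitted to the failure-interval data; a common choice is the two-parameter Weibull. The sketch below shows the reliability function and MTTF; the shape and scale values are hypothetical, not the paper's fitted parameters.

```python
from math import exp, gamma

def weibull_reliability(t, beta, eta):
    """Two-parameter Weibull reliability: R(t) = exp(-(t/eta)^beta)."""
    return exp(-((t / eta) ** beta))

def weibull_mttf(beta, eta):
    """Mean time to failure: MTTF = eta * Gamma(1 + 1/beta)."""
    return eta * gamma(1 + 1 / beta)

# Hypothetical shape/scale for one critical component (days).
beta, eta = 2.0, 275.0
print(weibull_reliability(228, beta, eta))  # reliability at a 228-day interval
print(weibull_mttf(beta, eta))
```

    Given a fitted (beta, eta), the preventive replacement interval is then chosen to balance the reliability at replacement against maintenance cost.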

  10. Validation of a Brazilian version of the moral sensitivity questionnaire.

    PubMed

    Dalla Nora, Carlise R; Zoboli, Elma Lcp; Vieira, Margarida M

    2017-01-01

    Moral sensitivity has been identified as a foundational component of ethical action. Diminished or absent moral sensitivity can result in deficient care. In this context, assessing moral sensitivity is imperative for designing interventions to facilitate ethical practice and ensure that nurses make appropriate decisions. The main purpose of this study was to validate a scale for examining the moral sensitivity of Brazilian nurses. A pre-existing scale, the Moral Sensitivity Questionnaire, which was developed by Lützén, was used after the deletion of three items. The reliability and validity of the scale were examined using Cronbach's alpha and factor analysis, respectively. Participants and research context: Overall, 316 nurses from Rio Grande do Sul, Brazil, participated in the study. Ethical considerations: This study was approved by the Ethics Committee of Research of the Nursing School of the University of São Paulo. The Moral Sensitivity Questionnaire contained 27 items that were distributed across four dimensions: interpersonal orientation, professional knowledge, moral conflict and moral meaning. The questionnaire accounted for 55.8% of the total variance, with Cronbach's alpha of 0.82. The mean score for moral sensitivity was 4.45 (out of 7). The results of this study were compared with studies from other countries to examine the structure and implications of the moral sensitivity of nurses in Brazil. The Moral Sensitivity Questionnaire is an appropriate tool for examining the moral sensitivity of Brazilian nurses.

  11. Trimming of silicon ring resonator by electron beam induced compaction and strain.

    PubMed

    Schrauwen, J; Van Thourhout, D; Baets, R

    2008-03-17

    Silicon is becoming the preferred platform for future integrated components, mostly due to the mature and reliable fabrication capabilities of the electronics industry. Nevertheless, even the most advanced fabrication technologies suffer from non-uniformity on wafer scale and on chip scale, causing variations in the critical dimensions of fabricated components. This is an important issue since photonic circuits, and especially cavities such as ring resonators, are extremely sensitive to these variations. In this paper we present a way to circumvent these problems by trimming, using electron beam induced compaction of oxide in silicon-on-insulator. Volume compaction of the oxide cladding both changes the refractive index and creates strain in the silicon lattice. We demonstrate a resonance wavelength red shift of 4.91 nm in a silicon ring resonator.

  12. Rapid screening for perceived cognitive impairment in major depressive disorder.

    PubMed

    Iverson, Grant L; Lam, Raymond W

    2013-05-01

    Subjectively experienced cognitive impairment is common in patients with mood disorders. The British Columbia Cognitive Complaints Inventory (BC-CCI) is a 6-item scale that measures perceived cognitive problems. The purpose of this study is to examine the reliability of the scale in healthy volunteers and depressed patients and to evaluate the sensitivity of the measure to perceived cognitive problems in depression. Participants were 62 physician-diagnosed inpatients or outpatients with depression, who had independently confirmed diagnoses on the Structured Clinical Interview for DSM-IV, and a large sample of healthy community volunteers (n=112). The internal consistency reliability of the BC-CCI was α=.86 for patients with depression and α=.82 for healthy controls. Principal components analyses revealed a one-factor solution accounting for 54% of the total variability in the control sample and a 2-factor solution (cognitive impairment and difficulty with expressive language) accounting for 76% of the variance in the depression sample. The total score difference between the groups was very large (Cohen's d=2.2). The BC-CCI has high internal consistency in both depressed patients and community controls, despite its small number of items. The test is sensitive to cognitive complaints in patients with depression.
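
    The "very large" group difference above is reported as Cohen's d, the difference in group means divided by the pooled standard deviation. A minimal sketch with hypothetical BC-CCI total scores (not the study's data):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical BC-CCI total scores: depressed patients vs. healthy controls.
patients = [12, 14, 10, 15, 13, 11]
controls = [4, 5, 3, 6, 4, 5]
print(round(cohens_d(patients, controls), 2))
```

    By convention, d around 0.8 is already "large", so the reported d = 2.2 indicates almost no overlap between the two groups' score distributions.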

  13. Wear-Out Sensitivity Analysis Project Abstract

    NASA Technical Reports Server (NTRS)

    Harris, Adam

    2015-01-01

    During the course of the Summer 2015 internship session, I worked in the Reliability and Maintainability group of the ISS Safety and Mission Assurance department. My project was a statistical analysis of how sensitive ORUs (Orbital Replacement Units) are to a reliability parameter called the wear-out characteristic. The intended goal was to determine a worst-case scenario for how many spares would be needed if multiple systems started exhibiting wear-out characteristics simultaneously, and to determine which parts would be most likely to do so. To do this, my duties were to take historical data on operational times and failure times of these ORUs and use them to build predictive models of failure using probability distribution functions, mainly the Weibull distribution. Then, I ran Monte Carlo simulations to see how an entire population of these components would perform. From here, my final duty was to vary the wear-out characteristic from its intrinsic value up to extremely high wear-out values and determine how much the probability of sufficiency of the population would shift. This was done for around 30 different ORU populations on board the ISS.
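
    The workflow described above (model failures with a Weibull distribution, vary the wear-out shape parameter, and estimate the probability that a spares pool suffices by Monte Carlo) can be sketched as follows. All parameter values are hypothetical, not ISS data.

```python
import random
from math import log

def prob_of_sufficiency(beta, eta, n_units, n_spares, horizon, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that `n_spares` covers all
    failures of `n_units` Weibull(beta, eta) components within `horizon` hours.
    beta > 1 models wear-out; beta = 1 is the random-failure (exponential) case."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        # Inverse-CDF sampling of Weibull lifetimes: t = eta * (-ln U)^(1/beta)
        failures = sum(
            1 for _ in range(n_units)
            if eta * (-log(rng.random())) ** (1 / beta) < horizon
        )
        ok += failures <= n_spares
    return ok / trials

# Same hypothetical fleet, intrinsic vs. strong wear-out shape parameter.
print(prob_of_sufficiency(beta=1.0, eta=6000, n_units=10, n_spares=7, horizon=8760))
print(prob_of_sufficiency(beta=3.0, eta=6000, n_units=10, n_spares=7, horizon=8760))
```

    When the planning horizon extends past the characteristic life, raising the shape parameter concentrates failures inside the horizon and the probability of sufficiency collapses, which is the worst-case effect the project set out to quantify.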

  14. The factorial reliability of the Middlesex Hospital Questionnaire in normal subjects.

    PubMed

    Bagley, C

    1980-03-01

    The internal reliability of the Middlesex Hospital Questionnaire and its component subscales has been checked by means of principal components analyses of data on 256 normal subjects. The subscales (with the possible exception of Hysteria) were found to contribute to the general underlying factor of psychoneurosis. In general, the principal components analysis points to the reliability of the subscales, despite some item overlap.

  15. The Behavioural Assessment of Self-Structuring (BASS): psychometric properties in a post-acute brain injury rehabilitation programme.

    PubMed

    Jackson, Howard F; Tunstall, Victoria; Hague, Gemma; Daniels, Leanne; Crompton, Stacey; Taplin, Kimberly

    2014-01-01

    Jackson et al. (this edition) argue that structure is an important component in reducing the handicaps caused by cognitive impairments following acquired brain injury, and that post-acute neuropsychological brain injury rehabilitation programmes should not only endeavour to provide structure but also aim to develop self-structuring. However, at present there is no standardized device for assessing self-structuring. The aim here is to provide a preliminary analysis of the psychometric properties of the Behavioural Assessment of Self-Structuring (BASS) staff rating scale, a 26-item informant scale with five-point ratings based on the degree of support the client requires to achieve each self-structuring item. BASS data was utilised for clients attending residential rehabilitation. Reliability (inter-rater and intra-rater), validity (construct, concurrent and discriminant) and sensitivity to change were investigated. Initial results indicate that the BASS has reasonably good reliability, good construct validity (via principal components analysis), good discriminant validity, and good concurrent validity, correlating well with a number of other outcome measures (HoNOS, NPDS, Supervision Rating Scale, MPAI, FIM and FAM). The BASS did not correlate well with the NPCNA. Finally, the BASS was shown to demonstrate sensitivity to change. Although some caution is required in drawing firm conclusions at the present time and further exploration of the psychometric properties of the BASS is required, initial results are encouraging for the use of the BASS in assessing rehabilitation progress. These findings are discussed in terms of the value of the concept of self-structuring to the rehabilitation process for individuals with neuropsychological impairments consequent on acquired brain injury.

  16. Reliability analysis of laminated CMC components through shell subelement techniques

    NASA Technical Reports Server (NTRS)

    Starlinger, A.; Duffy, S. F.; Gyekenyesi, J. P.

    1992-01-01

    An updated version of the integrated design program C/CARES (composite ceramic analysis and reliability evaluation of structures) was developed for the reliability evaluation of CMC laminated shell components. The algorithm is now split in two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The new interface program from the finite-element code MARC also includes the option of using hybrid laminates and allows for variations in temperature fields throughout the component.

  17. Recent Advances in Active Infrared Thermography for Non-Destructive Testing of Aerospace Components.

    PubMed

    Ciampa, Francesco; Mahmoodi, Pooya; Pinto, Fulvio; Meo, Michele

    2018-02-16

    Active infrared thermography is a fast and accurate non-destructive evaluation technique that is of particular relevance to the aerospace industry for the inspection of aircraft and helicopters' primary and secondary structures, aero-engine parts, spacecraft components and their subsystems. This review provides an exhaustive summary of the most recent active thermographic methods used for aerospace applications according to their physical principle and thermal excitation sources. Besides traditional optically stimulated thermography, which uses external optical radiation such as flashes, heaters and laser systems, novel hybrid thermographic techniques are also investigated. These include ultrasonic stimulated thermography, which uses ultrasonic waves and the local damage resonance effect to enhance the reliability and sensitivity to micro-cracks, eddy current stimulated thermography, which uses cost-effective eddy current excitation to generate induction heating, and microwave thermography, which uses electromagnetic radiation at the microwave frequency bands to provide rapid detection of cracks and delamination. All these techniques are here analysed and numerous examples are provided for different damage scenarios and aerospace components in order to identify the strengths and limitations of each thermographic technique. Moreover, alternative strategies to current external thermal excitation sources, here termed material-based thermography methods, are examined in this paper. These novel thermographic techniques rely on thermoresistive internal heating and offer a fast, low power, accurate and reliable assessment of damage in aerospace composites.

  18. PCA leverage: outlier detection for high-dimensional functional magnetic resonance imaging data.

    PubMed

    Mejia, Amanda F; Nebel, Mary Beth; Eloyan, Ani; Caffo, Brian; Lindquist, Martin A

    2017-07-01

    Outlier detection for high-dimensional (HD) data is a popular topic in modern statistical research. However, one source of HD data that has received relatively little attention is functional magnetic resonance imaging (fMRI), which consists of hundreds of thousands of measurements sampled at hundreds of time points. At a time when the availability of fMRI data is rapidly growing, primarily through large, publicly available grassroots datasets, automated quality control and outlier detection methods are greatly needed. We propose principal components analysis (PCA) leverage and demonstrate how it can be used to identify outlying time points in an fMRI run. Furthermore, PCA leverage is a measure of the influence of each observation on the estimation of principal components, which are often of interest in fMRI data. We also propose an alternative measure, PCA robust distance, which is less sensitive to outliers and has controllable statistical properties. The proposed methods are validated through simulation studies and are shown to be highly accurate. We also conduct a reliability study using resting-state fMRI data from the Autism Brain Imaging Data Exchange and find that removal of outliers using the proposed methods results in more reliable estimation of subject-level resting-state networks using independent components analysis. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
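
    PCA leverage as described has a direct closed form: it is the diagonal of the hat-like matrix built from the left singular vectors of the centered data. A minimal sketch on simulated data with one injected artifact (the data, dimensions and threshold are illustrative, not the paper's):

```python
import numpy as np

def pca_leverage(X, k):
    """Leverage of each observation (row) on the top-k principal components:
    the diagonal of U_k U_k^T from the SVD of the column-centered data."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Uk = U[:, :k]
    return (Uk ** 2).sum(axis=1)

# Toy "fMRI run": 100 time points x 500 voxels, with a global spike at t = 42.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500))
X[42] += 5.0  # simulated motion-like artifact
lev = pca_leverage(X, k=3)
# Flag time points whose leverage far exceeds the mean leverage (k/n).
outliers = np.where(lev > 3 * lev.mean())[0]
print(outliers)
```

    Because the leverages over k components sum to k, the mean leverage is k/n, which gives a natural reference level for flagging influential time points.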

  19. Recent Advances in Active Infrared Thermography for Non-Destructive Testing of Aerospace Components

    PubMed Central

    Mahmoodi, Pooya; Pinto, Fulvio; Meo, Michele

    2018-01-01

    Active infrared thermography is a fast and accurate non-destructive evaluation technique that is of particular relevance to the aerospace industry for the inspection of aircraft and helicopters’ primary and secondary structures, aero-engine parts, spacecraft components and their subsystems. This review provides an exhaustive summary of the most recent active thermographic methods used for aerospace applications according to their physical principle and thermal excitation sources. Besides traditional optically stimulated thermography, which uses external optical radiation such as flashes, heaters and laser systems, novel hybrid thermographic techniques are also investigated. These include ultrasonic stimulated thermography, which uses ultrasonic waves and the local damage resonance effect to enhance the reliability and sensitivity to micro-cracks, eddy current stimulated thermography, which uses cost-effective eddy current excitation to generate induction heating, and microwave thermography, which uses electromagnetic radiation at the microwave frequency bands to provide rapid detection of cracks and delamination. All these techniques are here analysed and numerous examples are provided for different damage scenarios and aerospace components in order to identify the strengths and limitations of each thermographic technique. Moreover, alternative strategies to current external thermal excitation sources, here termed material-based thermography methods, are examined in this paper. These novel thermographic techniques rely on thermoresistive internal heating and offer a fast, low power, accurate and reliable assessment of damage in aerospace composites. PMID:29462953

  20. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
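
    RML itself is not shown here, but the modularization idea described above (compute module reliabilities separately, then combine them into a system figure) reduces, in the simplest case, to the standard series/parallel composition rules. The system layout and reliability values below are hypothetical.

```python
def series(*rs):
    """All modules must work: R = product of module reliabilities."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """Redundant modules; the system fails only if all fail:
    R = 1 - product of (1 - r) over the modules."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# Hypothetical system: a redundant sensor pair feeding two processors in series.
sensors = parallel(0.95, 0.95)
system = series(sensors, 0.99, 0.99)
print(round(system, 6))
```

    Each module's reliability can be estimated and tested independently, then combined, which is exactly the one-to-one module/component mapping that makes a graphical model description practical.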

  1. The replacement of dry heat in generic reliability assurance requirements for passive optical components

    NASA Astrophysics Data System (ADS)

    Ren, Xusheng; Qian, Longsheng; Zhang, Guiyan

    2005-12-01

    According to Generic Reliability Assurance Requirements for Passive Optical Components GR-1221-CORE (Issue 2, January 1999), reliability determination testing is required for various kinds of passive optical components used in uncontrolled environments. The conditions of the High Temperature Storage (dry heat) test and the Damp Heat test are identical except for humidity. To save test time and cost, after a series of comparison tests, the replacement of the dry heat test is discussed. Considering the failure mechanisms of passive optical components under dry heat and damp heat, comparison tests of dry heat and damp heat were carried out on passive optical components (including DWDM, CWDM, couplers, isolators and mini isolators); test results for the isolator are presented. Telcordia testing tests not only the reliability of the passive optical components but also the patience of the experimenter: its cost in money, manpower and material resources, and especially in time, is a heavy burden for the company. After a series of tests, we find that damp heat can effectively test the reliability of passive optical components, and that equipment manufacturers, in agreement with component manufacturers, could omit the dry heat test if a damp heat test is performed first and passed.

  2. Principle of maximum entropy for reliability analysis in the design of machine components

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
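
    The paper's PME/Edgeworth machinery is more involved than a short example allows, but the first step it describes, obtaining the first four moments of the state function, can be sketched by Monte Carlo, together with a two-moment (normal) failure-probability approximation for comparison. The strength/load state function and its parameter distributions below are hypothetical.

```python
import random
from math import erf, sqrt

def state_moments(g, sample, n=100000, seed=7):
    """First four moments (mean, std, skewness, kurtosis) of a state
    function g(params) under randomly sampled parameter values."""
    rng = random.Random(seed)
    xs = [g(sample(rng)) for _ in range(n)]
    m = sum(xs) / n
    cs = [sum((x - m) ** k for x in xs) / n for k in (2, 3, 4)]
    sd = sqrt(cs[0])
    return m, sd, cs[1] / sd ** 3, cs[2] / sd ** 4

# Hypothetical state function g = strength - load; failure when g < 0.
def g(p):
    strength, load = p
    return strength - load

def sample(rng):
    return rng.gauss(500.0, 30.0), rng.gauss(350.0, 40.0)

mean, sd, skew, kurt = state_moments(g, sample)
# Two-moment (normal) failure probability via the reliability index beta = mean/sd.
beta_r = mean / sd
pf = 0.5 * (1 - erf(beta_r / sqrt(2)))
print(mean, sd, skew, kurt, pf)
```

    For non-normal parameter distributions, the third and fourth moments deviate from 0 and 3, and it is exactly that deviation which the PME-based density in the paper uses to correct the failure-probability estimate.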

  3. Preliminary construction of integral analysis for characteristic components in complex matrices by in-house fabricated solid-phase microextraction fibers combined with gas chromatography-mass spectrometry.

    PubMed

    Tang, Zhentao; Hou, Wenqian; Liu, Xiuming; Wang, Mingfeng; Duan, Yixiang

    2016-08-26

    Integral analysis plays an important role in the study and quality control of substances with complex matrices in our daily life. As a preliminary step toward the integral analysis of substances with complex matrices, developing a relatively comprehensive and sensitive methodology might offer more informative and reliable characteristic components. Flavoring mixtures, representative of substances with complex matrices, are now widely used in various fields. To better study and control the quality of flavoring mixtures as additives in the food industry, an in-house fabricated solid-phase microextraction (SPME) fiber was prepared based on sol-gel technology in this work. The active organic component of the fiber coating was multi-walled carbon nanotubes (MWCNTs) functionalized with hydroxyl-terminated polydimethyldiphenylsiloxane, which integrate the non-polar and polar chains of both materials. In this way, more sensitive extraction capability for a wider range of compounds can be obtained in comparison with commercial SPME fibers. Preliminary integral analysis of three similar types of samples was realized by the optimized SPME-GC-MS method. With the obtained GC-MS data, a valid and well-fit model was established by partial least squares discriminant analysis (PLS-DA) for classification of these samples (R2X=0.661, R2Y=0.996, Q2=0.986). Permutation validation of the model (R2=0.266, Q2=-0.465) also demonstrated its potential to predict the "belongingness" of new samples. With the PLS-DA and SPSS methods, further screening of the markers among three similar batches of samples may be helpful for monitoring and controlling the quality of flavoring mixtures as additives in the food industry. Conversely, the reliability and effectiveness of the GC-MS data verified the comprehensive and efficient extraction performance of the in-house fabricated fiber. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Developing Cost-Effective Field Assessments of Carbon Stocks in Human-Modified Tropical Forests.

    PubMed

    Berenguer, Erika; Gardner, Toby A; Ferreira, Joice; Aragão, Luiz E O C; Camargo, Plínio B; Cerri, Carlos E; Durigan, Mariana; Oliveira Junior, Raimundo C; Vieira, Ima C G; Barlow, Jos

    2015-01-01

    Across the tropics, there is a growing financial investment in activities that aim to reduce emissions from deforestation and forest degradation, such as REDD+. However, most tropical countries lack on-the-ground capacity to conduct reliable and replicable assessments of forest carbon stocks, undermining their ability to secure long-term carbon finance for forest conservation programs. Clear guidance on how to reduce the monetary and time costs of field assessments of forest carbon can help tropical countries to overcome this capacity gap. Here we provide such guidance for cost-effective one-off field assessments of forest carbon stocks. We sampled a total of eight components from four different carbon pools (i.e. aboveground, dead wood, litter and soil) in 224 study plots distributed across two regions of the eastern Amazon. For each component we estimated survey costs, contribution to total forest carbon stocks and sensitivity to disturbance. Sampling costs varied thirty-one-fold between the most expensive component, soil, and the least expensive, leaf litter. Large live stems (≥10 cm DBH), which represented only 15% of the overall sampling costs, were by far the most important component to assess, as they store the largest amount of carbon and are highly sensitive to disturbance. If large stems are not taxonomically identified, costs can be reduced by a further 51%, while incurring an error in aboveground carbon estimates of only 5% in primary forests, but 31% in secondary forests. For rapid assessments, necessary to help prioritize locations for carbon-conservation activities, sampling of stems ≥20 cm DBH without taxonomic identification can predict with confidence (R2 = 0.85) whether an area is relatively carbon-rich or carbon-poor, an approach that is 74% cheaper than sampling and identifying all the stems ≥10 cm DBH.
We use these results to evaluate the reliability of forest carbon stock estimates provided by the IPCC and FAO when applied to human-modified forests, and to highlight areas where cost savings in carbon stock assessments could be most easily made.
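    The rapid-assessment result above (predicting total carbon from large stems alone, R2 = 0.85) can be illustrated with a toy regression; all numbers below are synthetic, not the study's field measurements:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_plots = 100
    # Hypothetical carbon in large stems (Mg C/ha, stems >= 20 cm DBH)
    large_stem_c = rng.gamma(shape=4.0, scale=20.0, size=n_plots)
    # Total aboveground carbon: dominated by large stems, smaller stems add noise
    total_c = 1.3 * large_stem_c + rng.normal(0.0, 10.0, n_plots)

    # Ordinary least squares of total carbon on the cheap large-stem proxy
    A = np.column_stack([np.ones(n_plots), large_stem_c])
    beta, *_ = np.linalg.lstsq(A, total_c, rcond=None)
    resid = total_c - A @ beta
    r2 = 1.0 - resid.var() / total_c.var()   # goodness of the cheap proxy
    ```

    A high r2 here is what justifies ranking plots as carbon-rich or carbon-poor without the expensive full census.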

  5. Developing Cost-Effective Field Assessments of Carbon Stocks in Human-Modified Tropical Forests

    PubMed Central

    Berenguer, Erika; Gardner, Toby A.; Ferreira, Joice; Aragão, Luiz E. O. C.; Camargo, Plínio B.; Cerri, Carlos E.; Durigan, Mariana; Oliveira Junior, Raimundo C.; Vieira, Ima C. G.; Barlow, Jos

    2015-01-01

    Across the tropics, there is a growing financial investment in activities that aim to reduce emissions from deforestation and forest degradation, such as REDD+. However, most tropical countries lack on-the-ground capacity to conduct reliable and replicable assessments of forest carbon stocks, undermining their ability to secure long-term carbon finance for forest conservation programs. Clear guidance on how to reduce the monetary and time costs of field assessments of forest carbon can help tropical countries to overcome this capacity gap. Here we provide such guidance for cost-effective one-off field assessments of forest carbon stocks. We sampled a total of eight components from four different carbon pools (i.e. aboveground, dead wood, litter and soil) in 224 study plots distributed across two regions of the eastern Amazon. For each component we estimated survey costs, contribution to total forest carbon stocks and sensitivity to disturbance. Sampling costs varied thirty-one-fold between the most expensive component, soil, and the least expensive, leaf litter. Large live stems (≥10 cm DBH), which represented only 15% of the overall sampling costs, were by far the most important component to assess, as they store the largest amount of carbon and are highly sensitive to disturbance. If large stems are not taxonomically identified, costs can be reduced by a further 51%, while incurring an error in aboveground carbon estimates of only 5% in primary forests, but 31% in secondary forests. For rapid assessments, necessary to help prioritize locations for carbon-conservation activities, sampling of stems ≥20 cm DBH without taxonomic identification can predict with confidence (R2 = 0.85) whether an area is relatively carbon-rich or carbon-poor, an approach that is 74% cheaper than sampling and identifying all the stems ≥10 cm DBH.
We use these results to evaluate the reliability of forest carbon stock estimates provided by the IPCC and FAO when applied to human-modified forests, and to highlight areas where cost savings in carbon stock assessments could be most easily made. PMID:26308074

  6. Temporal Lobe and “Default” Hemodynamic Brain Modes Discriminate Between Schizophrenia and Bipolar Disorder

    PubMed Central

    Calhoun, Vince D.; Maciejewski, Paul K.; Pearlson, Godfrey D.; Kiehl, Kent A.

    2009-01-01

    Schizophrenia and bipolar disorder are currently diagnosed on the basis of psychiatric symptoms and longitudinal course. The determination of a reliable, biologically-based diagnostic indicator of these diseases (a biomarker) could provide the groundwork for developing more rigorous tools for differential diagnosis and treatment assignment. Recently, methods have been used to identify distinct sets of brain regions or “spatial modes” exhibiting temporally coherent brain activity. Using functional magnetic resonance imaging (fMRI) data and a multivariate analysis method, independent component analysis, we combined the temporal lobe and the default modes to discriminate subjects with bipolar disorder, chronic schizophrenia, and healthy controls. Temporal lobe and default mode networks were reliably identified in all participants. Classification results on an independent set of individuals revealed an average sensitivity and specificity of 90 and 95%, respectively. The use of coherent brain networks such as the temporal lobe and default mode networks may provide a more reliable measure of disease state than task-correlated fMRI activity. A combination of two such hemodynamic brain networks shows promise as a biomarker for schizophrenia and bipolar disorder. PMID:17894392

  7. Temporal lobe and "default" hemodynamic brain modes discriminate between schizophrenia and bipolar disorder.

    PubMed

    Calhoun, Vince D; Maciejewski, Paul K; Pearlson, Godfrey D; Kiehl, Kent A

    2008-11-01

    Schizophrenia and bipolar disorder are currently diagnosed on the basis of psychiatric symptoms and longitudinal course. The determination of a reliable, biologically-based diagnostic indicator of these diseases (a biomarker) could provide the groundwork for developing more rigorous tools for differential diagnosis and treatment assignment. Recently, methods have been used to identify distinct sets of brain regions or "spatial modes" exhibiting temporally coherent brain activity. Using functional magnetic resonance imaging (fMRI) data and a multivariate analysis method, independent component analysis, we combined the temporal lobe and the default modes to discriminate subjects with bipolar disorder, chronic schizophrenia, and healthy controls. Temporal lobe and default mode networks were reliably identified in all participants. Classification results on an independent set of individuals revealed an average sensitivity and specificity of 90 and 95%, respectively. The use of coherent brain networks such as the temporal lobe and default mode networks may provide a more reliable measure of disease state than task-correlated fMRI activity. A combination of two such hemodynamic brain networks shows promise as a biomarker for schizophrenia and bipolar disorder.

  8. Excitatory Local Interneurons Enhance Tuning of Sensory Information

    PubMed Central

    Assisi, Collins; Stopfer, Mark; Bazhenov, Maxim

    2012-01-01

    Neurons in the insect antennal lobe represent odors as spatiotemporal patterns of activity that unfold over multiple time scales. As these patterns unspool, they decrease the overlap between odor representations and thereby increase the ability of the olfactory system to discriminate odors. Using a realistic model of the insect antennal lobe, we examined two competing components of this process: lateral excitation from local excitatory interneurons, and slow inhibition from local inhibitory interneurons. We found that lateral excitation amplified differences between representations of similar odors by recruiting projection neurons that did not receive direct input from olfactory receptors. However, this increased sensitivity also amplified noisy variations in input and compromised the ability of the system to respond reliably to multiple presentations of the same odor. Slow inhibition curtailed the spread of projection neuron activity and increased response reliability. These competing influences must be finely balanced in order to decorrelate odor representations. PMID:22807661

  9. Analysis on Sealing Reliability of Bolted Joint Ball Head Component of Satellite Propulsion System

    NASA Astrophysics Data System (ADS)

    Guo, Tao; Fan, Yougao; Gao, Feng; Gu, Shixin; Wang, Wei

    2018-01-01

    The propulsion system is one of the important subsystems of a satellite, and its performance directly affects the satellite's service life, attitude control and reliability. This paper analyzes the sealing principle of the bolted joint ball head component of the satellite propulsion system and discusses the compatibility of anhydrous hydrazine with the component, the influence of the ground environment on the sealing performance of bolted joint ball heads, and material failure caused by the environment. The analysis shows that the sealing reliability of the bolted joint ball head component is good and that the influence of the above three aspects on its sealing can be ignored.

  10. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test-data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated, and the analysis refined as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has proven to be a superior technique to conventional reliability approaches that utilize failure rates derived from similar equipment or simply expert judgment.

  11. Calculating system reliability with SRFYDO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
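    For the series systems SRFYDO targets, the basic identity is that the system works only if every component works, so (assuming independent components) system reliability is the product of component reliabilities. A minimal sketch with hypothetical component values:

    ```python
    # Hypothetical component reliabilities for a three-component series system
    components = {"igniter": 0.995, "valve": 0.990, "pump": 0.985}

    r_system = 1.0
    for name, r in components.items():
        r_system *= r            # series system: all components must work

    # Sanity check: a series system is never more reliable than its weakest part
    assert r_system <= min(components.values())
    ```

    The product structure is why series-system reliability is dominated by the weakest component, and why SRFYDO-style rollups report uncertainty at the component, sub-system, and system levels.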

  12. The Psychometric Properties of the Center for Epidemiologic Studies Depression Scale in Chinese Primary Care Patients: Factor Structure, Construct Validity, Reliability, Sensitivity and Responsiveness

    PubMed Central

    2015-01-01

    Background The Center for Epidemiologic Studies Depression Scale (CES-D) is a commonly used instrument to measure depressive symptomatology. Despite this, the evidence for its psychometric properties remains poorly established in Chinese populations. The aim of this study was to validate the use of the CES-D in Chinese primary care patients by examining factor structure, construct validity, reliability, sensitivity and responsiveness. Methods and Results The psychometric properties were assessed amongst a sample of 3686 Chinese adult primary care patients in Hong Kong. Three competing factor structure models were examined using confirmatory factor analysis. The original CES-D four-structure model had adequate fit; however, the data were better fitted by a bi-factor model. For the internal construct validity, corrected item-total correlations were 0.4 for most items. The convergent validity was assessed by examining the correlations between the CES-D, the Patient Health Questionnaire 9 (PHQ-9) and the Short Form-12 Health Survey (version 2) Mental Component Summary (SF-12 v2 MCS). The CES-D had a strong correlation with the PHQ-9 (coefficient: 0.78) and SF-12 v2 MCS (coefficient: -0.75). Internal consistency was assessed by McDonald’s omega hierarchical (ωH). The ωH value for the general depression factor was 0.855. The ωH values for “somatic”, “depressed affect”, “positive affect” and “interpersonal problems” were 0.434, 0.038, 0.738 and 0.730, respectively. For the two-week test-retest reliability, the intraclass correlation coefficient was 0.91. The CES-D was sensitive in detecting differences between known groups, with the AUC >0.7. Internal responsiveness of the CES-D to detect positive and negative changes was satisfactory (with p value <0.01 and all effect size statistics >0.2). The CES-D was externally responsive, with the AUC >0.7. 
Conclusions The CES-D appears to be a valid, reliable, sensitive and responsive instrument for screening and monitoring depressive symptoms in adult Chinese primary care patients. In its original four-factor and bi-factor structure, the CES-D is supported for cross-cultural comparisons of depression in multi-center studies. PMID:26252739

  13. The Advanced Gamma-ray Imaging System (AGIS): Camera Electronics Designs

    NASA Astrophysics Data System (ADS)

    Tajima, H.; Buckley, J.; Byrum, K.; Drake, G.; Falcone, A.; Funk, S.; Holder, J.; Horan, D.; Krawczynski, H.; Ong, R.; Swordy, S.; Wagner, R.; Williams, D.

    2008-04-01

    AGIS, a next generation of atmospheric Cherenkov telescope arrays, aims to achieve a sensitivity level of a milliCrab for gamma-ray observations in the energy band of 40 GeV to 100 TeV. Such improvement requires cost reduction of individual components with high reliability in order to equip the order of 100 telescopes necessary to achieve the sensitivity goal. We are exploring several design concepts to reduce the cost of camera electronics while improving their performance. These design concepts include systems based on multi-channel waveform sampling ASIC optimized for AGIS, a system based on IIT (image intensifier tube) for large channel (order of 1 million channels) readout as well as a multiplexed FADC system based on the current VERITAS readout design. Here we present trade-off studies of these design concepts.

  14. The Advanced Gamma-ray Imaging System (AGIS): Camera Electronics Designs

    NASA Astrophysics Data System (ADS)

    Tajima, Hiroyasu; Buckley, J.; Byrum, K.; Drake, G.; Falcone, A.; Funk, S.; Holder, J.; Horan, D.; Krawczynski, H.; Ong, R.; Swordy, S.; Wagner, R.; Wakely, S.; Williams, D.; Camera Electronics Working Group; AGIS Collaboration

    2008-03-01

    AGIS, a next generation of atmospheric Cherenkov telescope arrays, aims to achieve a sensitivity level of a milliCrab for gamma-ray observations in the energy band of 40 GeV to 100 TeV. Such improvement requires cost reduction of individual components with high reliability in order to equip the order of 100 telescopes necessary to achieve the sensitivity goal. We are exploring several design concepts to reduce the cost of camera electronics while improving their performance. These design concepts include systems based on multi-channel waveform sampling ASIC optimized for AGIS, a system based on IIT (image intensifier tube) for large channel (order of 1 million channels) readout as well as a multiplexed FADC system based on the current VERITAS readout design. Here we present trade-off studies of these design concepts.

  15. Performance of salmon fishery portfolios across western North America.

    PubMed

    Griffiths, Jennifer R; Schindler, Daniel E; Armstrong, Jonathan B; Scheuerell, Mark D; Whited, Diane C; Clark, Robert A; Hilborn, Ray; Holt, Carrie A; Lindley, Steven T; Stanford, Jack A; Volk, Eric C

    2014-12-01

    Quantifying the variability in the delivery of ecosystem services across the landscape can be used to set appropriate management targets, evaluate resilience and target conservation efforts. Ecosystem functions and services may exhibit portfolio-type dynamics, whereby diversity within lower levels promotes stability at more aggregated levels. Portfolio theory provides a framework to characterize the relative performance among ecosystems and the processes that drive differences in performance. We assessed Pacific salmon Oncorhynchus spp. portfolio performance across their native latitudinal range, focusing on the reliability of salmon returns as a metric with which to assess the function of salmon ecosystems and their services to humans. We used the Sharpe ratio (i.e. the size of the total salmon return to the portfolio relative to its variability (risk)) to evaluate the performance of Chinook and sockeye salmon portfolios across the west coast of North America. We evaluated the effects on portfolio performance from the variance of and covariance among salmon returns within each portfolio, and the association between portfolio performance and watershed attributes. We found a positive latitudinal trend in the risk-adjusted performance of Chinook and sockeye salmon portfolios that also correlated negatively with anthropogenic impact on watersheds (e.g. dams and land-use change). High-latitude Chinook salmon portfolios were on average 2.5 times more reliable, and their portfolio risk was mainly due to low variance in the individual assets. Sockeye salmon portfolios were also more reliable at higher latitudes, but sources of risk varied among the highest-performing portfolios. Synthesis and applications. Portfolio theory provides a straightforward method for characterizing the resilience of salmon ecosystems and their services. Natural variability in portfolio performance among undeveloped watersheds provides a benchmark for restoration efforts. 
Locally and regionally, assessing the sources of portfolio risk can guide actions to maintain existing resilience (protect habitat and disturbance regimes that maintain response diversity; employ harvest strategies sensitive to different portfolio components) or improve restoration activities. Improving our understanding of portfolio reliability may allow for management of natural resources that is robust to ongoing environmental change.
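    The Sharpe ratio used above reduces to a one-line computation: mean total return divided by its standard deviation (with no risk-free rate, matching the abstract's "size of the return relative to its variability"). The annual returns below are hypothetical:

    ```python
    import statistics

    # Hypothetical annual salmon returns (fish per year) for one portfolio
    annual_returns = [1.2e6, 0.9e6, 1.5e6, 1.1e6, 1.3e6]

    mean_r = statistics.fmean(annual_returns)   # average total return
    sd_r = statistics.stdev(annual_returns)     # variability (risk)
    sharpe = mean_r / sd_r                      # risk-adjusted performance
    ```

    A higher Sharpe ratio means the portfolio delivers its returns more reliably relative to their year-to-year swings, which is how the study compared watersheds across latitudes.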

  16. Performance of salmon fishery portfolios across western North America

    PubMed Central

    Griffiths, Jennifer R; Schindler, Daniel E; Armstrong, Jonathan B; Scheuerell, Mark D; Whited, Diane C; Clark, Robert A; Hilborn, Ray; Holt, Carrie A; Lindley, Steven T; Stanford, Jack A; Volk, Eric C

    2014-01-01

    Quantifying the variability in the delivery of ecosystem services across the landscape can be used to set appropriate management targets, evaluate resilience and target conservation efforts. Ecosystem functions and services may exhibit portfolio-type dynamics, whereby diversity within lower levels promotes stability at more aggregated levels. Portfolio theory provides a framework to characterize the relative performance among ecosystems and the processes that drive differences in performance. We assessed Pacific salmon Oncorhynchus spp. portfolio performance across their native latitudinal range, focusing on the reliability of salmon returns as a metric with which to assess the function of salmon ecosystems and their services to humans. We used the Sharpe ratio (i.e. the size of the total salmon return to the portfolio relative to its variability (risk)) to evaluate the performance of Chinook and sockeye salmon portfolios across the west coast of North America. We evaluated the effects on portfolio performance from the variance of and covariance among salmon returns within each portfolio, and the association between portfolio performance and watershed attributes. We found a positive latitudinal trend in the risk-adjusted performance of Chinook and sockeye salmon portfolios that also correlated negatively with anthropogenic impact on watersheds (e.g. dams and land-use change). High-latitude Chinook salmon portfolios were on average 2.5 times more reliable, and their portfolio risk was mainly due to low variance in the individual assets. Sockeye salmon portfolios were also more reliable at higher latitudes, but sources of risk varied among the highest-performing portfolios. Synthesis and applications. Portfolio theory provides a straightforward method for characterizing the resilience of salmon ecosystems and their services. Natural variability in portfolio performance among undeveloped watersheds provides a benchmark for restoration efforts. 
Locally and regionally, assessing the sources of portfolio risk can guide actions to maintain existing resilience (protect habitat and disturbance regimes that maintain response diversity; employ harvest strategies sensitive to different portfolio components) or improve restoration activities. Improving our understanding of portfolio reliability may allow for management of natural resources that is robust to ongoing environmental change. PMID:25552746

  17. Packaging-induced failure of semiconductor lasers and optical telecommunications components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharps, J.A.

    1996-12-31

    Telecommunications equipment for field deployment generally has specified lifetimes of >100,000 hr. To achieve this high reliability, it is common practice to package sensitive components in hermetic, inert-gas environments. The intent is to protect components from particulate and organic contamination, oxidation, and moisture. However, for the high-power-density 980 nm diode lasers used in optical amplifiers, the authors found that hermetic, inert-gas packaging induced a failure mode not observed in similar, unpackaged lasers. They refer to this failure mode as packaging-induced failure, or PIF. PIF is caused by nanomole amounts of organic contamination which interact with high-intensity 980 nm light to form solid deposits over the emitting regions of the lasers. These deposits absorb 980 nm light, causing heating of the laser, narrowing of the band gap, and eventual thermal runaway. The authors have found that PIF is averted by packaging with free O₂ and/or a getter material that sequesters organics.

  18. A framework for conducting mechanistic based reliability assessments of components operating in complex systems

    NASA Astrophysics Data System (ADS)

    Wallace, Jon Michael

    2003-10-01

    Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. 
The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
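    The large-sample Monte Carlo validation mentioned above can be sketched with a generic strength-minus-load limit state; the distributions below are illustrative, not the airfoil model from the study:

    ```python
    import random

    random.seed(0)
    n, failures = 100_000, 0
    for _ in range(n):
        # Hypothetical limit state g = strength - load; failure when g <= 0
        strength = random.gauss(100.0, 10.0)
        load = random.gauss(60.0, 12.0)
        if strength - load <= 0.0:
            failures += 1

    p_f = failures / n   # Monte Carlo estimate of the failure probability
    ```

    Large-sample runs like this are expensive but make a useful benchmark against which faster approximate methods (response surfaces, FORM/SORM) can be validated.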

  19. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
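    Once the MPP is found, FORM approximates the failure probability from the reliability index beta (the minimum distance from the origin to the limit state in standard normal space) as Pf = Phi(-beta). A minimal sketch with an illustrative beta:

    ```python
    from math import erf, sqrt

    def std_normal_cdf(x: float) -> float:
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    beta = 3.0                        # illustrative reliability index (distance to MPP)
    p_failure = std_normal_cdf(-beta) # FORM estimate: Phi(-beta)
    ```

    The reliability sensitivities derived in the paper are, in this picture, derivatives of beta with respect to the distribution parameters of the design variables.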

  20. DATMAN: A reliability data analysis program using Bayesian updating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, M.; Feltus, M.A.

    1996-12-31

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
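    The kind of Bayesian update DATMAN performs has a closed form in the simplest pass/fail case: a Beta prior on component reliability combined with binomial test data yields a Beta posterior. The prior and counts below are hypothetical:

    ```python
    # Beta(a, b) prior on component reliability; Beta(9, 1) has prior mean 0.9
    prior_a, prior_b = 9.0, 1.0

    # New test evidence: 48 successful demands, 2 failures (hypothetical)
    successes, failures = 48, 2

    # Conjugate update: posterior is Beta(a + successes, b + failures)
    post_a = prior_a + successes
    post_b = prior_b + failures
    posterior_mean = post_a / (post_a + post_b)   # updated reliability estimate
    ```

    As more failure and operating data accumulate, repeating this update keeps the reliability estimate current, which is exactly the RCM requirement the abstract describes.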

  1. The psychometric properties, sensitivity and specificity of the geriatric anxiety inventory, hospital anxiety and depression scale, and rating anxiety in dementia scale in aged care residents.

    PubMed

    Creighton, Alexandra S; Davison, Tanya E; Kissane, David W

    2018-02-22

    Limited research has been conducted into the identification of a valid and reliable screening measure for anxiety in aged care settings, despite it being one of the most common psychological conditions. This study aimed to determine an appropriate anxiety screening tool for aged care by comparing the reliability and validity of three commonly used measures and identifying specific cut-offs for the identification of generalized anxiety disorder (GAD). One hundred and eighty nursing home residents (M age = 85.39 years) completed the GAI, HADS-A, and RAID, along with a structured diagnostic interview. Twenty participants (11.1%) met DSM-5 criteria for GAD. All measures had good psychometric properties, although reliability estimates for the HADS-A were sub-optimal. Privileging sensitivity, the GAI cut-off score of 9 gave sensitivity of 90.0% and specificity of 86.3%; HADS-A cut-off of 6 gave sensitivity of 90.0% and specificity of 80.6%; and RAID cut-off of 11 gave sensitivity of 85.0% and specificity of 72.5%. While all three measures had adequate reliability, validity, and cut-scores with high levels of sensitivity and specificity to detect anxiety within aged care, the GAI was the most consistently reliable and valid measure for screening for GAD.
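A minimal sketch of how cut-off sensitivity and specificity are computed against a diagnostic criterion (the scores and diagnoses below are toy values, not the study's data):

```python
def sens_spec(scores, has_gad, cutoff):
    """Sensitivity and specificity of the rule `score >= cutoff`
    against a gold-standard diagnosis."""
    tp = sum(1 for s, d in zip(scores, has_gad) if d and s >= cutoff)
    fn = sum(1 for s, d in zip(scores, has_gad) if d and s < cutoff)
    tn = sum(1 for s, d in zip(scores, has_gad) if not d and s < cutoff)
    fp = sum(1 for s, d in zip(scores, has_gad) if not d and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: 4 diagnosed cases, 6 non-cases, screening cut-off of 9
scores  = [12, 10, 9, 5, 3, 8, 11, 2, 6, 7]
has_gad = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
sens, spec = sens_spec(scores, has_gad, cutoff=9)
```

Sweeping the cutoff over the score range and picking the point that privileges sensitivity is exactly the trade-off the abstract describes for the GAI, HADS-A, and RAID.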

  2. [Study of functional rating scale for amyotrophic lateral sclerosis: revised ALSFRS(ALSFRS-R) Japanese version].

    PubMed

    Ohashi, Y; Tashiro, K; Itoyama, Y; Nakano, I; Sobue, G; Nakamura, S; Sumino, S; Yanagisawa, N

    2001-04-01

    Amyotrophic lateral sclerosis (ALS) is a progressive, degenerative, fatal disease of the motor neuron. No efficacious therapy is available to slow the progressive loss of function, but several new approaches, including neurotrophic factors, antioxidants and glutamate antagonists, are currently being evaluated as potential therapies. Mortality and/or time to tracheostomy, muscle strength and pulmonary function are used as primary endpoints in clinical trials for the treatment of ALS. The effect of new therapies on the quality of patients' lives is also important, so we sought to develop a rating scale to measure it. The revised ALS Functional Rating Scale (ALSFRS-R), which adds items to the ALSFRS to enhance the ability to assess respiratory symptoms, is an assessment determining the degree of impairment in ALS patients' abilities to function independently in activities of daily living. It consists of 12 items evaluating bulbar, motor and respiratory function, and each item is scored from 0 (unable) to 4 (normal). We translated the English scale into Japanese with minor modifications considering intercultural differences, and examined the reliability of the translated scale. As a measure of reliability, the intraclass correlation coefficient (ICC) was evaluated for the total score, and the Kappa coefficient proposed by Cohen and Kraemer was calculated for each item. Moreover, we examined sensitivity to clinical change over time and carried out a factor analysis to analyze the factorial structure. The subjects were 27 ALS patients, each scored twice for reliability or three times for sensitivity by 2 to 5 neurologists and, if possible, nurses. The ICC for the total score was 0.97 (95% C.I.: 0.94-0.98). The extended Kappa coefficients were 0.48 to 1.00 for inter-rater reliability, and the averaged Kappa coefficients were 0.63 to 1.00 for intra-rater reliability.
Concerning the factorial structure, the contribution of the first factor (the first principal component) was 53.5% in the principal factor solution. The factor loadings of the items were 0.52-0.91 except for "salivation", and this factor, almost equal to the simple sum of all items, was interpreted as the general degree of deterioration. The promax rotation revealed the originally supposed factor structure with 3 factors (groups of items): neuromuscular function, respiratory function and bulbar function. The rating scale correlated with the Global Clinical Impression of Change (GCIC) scored by neurologists and declined with time, indicating its sensitivity to change. On the basis of these results, the ALSFRS-R (Japanese version) is considered sufficiently reliable for clinical use.
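The item-level agreement statistic used above, unweighted Cohen's kappa, can be computed as follows (the ratings are toy values from two hypothetical raters, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters scoring the same items:
    observed agreement corrected for chance agreement."""
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (po - pe) / (1.0 - pe)

# Toy item scores from two hypothetical raters
a = [0, 0, 1, 1]
b = [0, 1, 1, 1]
kappa = cohens_kappa(a, b)  # 0.5
```

Kappa of 1.0 means perfect agreement; values in the 0.48-1.00 range reported above indicate moderate-to-perfect item-level agreement between raters.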

  3. Advanced Stirling Convertor Heater Head Durability and Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Shah, Ashwin R.; Korovaichuk, Igor; Kalluri, Sreeramesh

    2008-01-01

    The National Aeronautics and Space Administration (NASA) has identified the high efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for long duration Science missions, such as lunar applications, Mars rovers, and deep space missions, that require reliable design lifetimes of up to 17 years. Resistance to creep deformation of the MarM-247 heater head (HH), a structurally critical component of the ASRG Advanced Stirling Convertor (ASC), under high temperatures (up to 850 C) is a key design driver for durability. Inherent uncertainties in the creep behavior of the thin-walled HH and the variations in the wall thickness, control temperature, and working gas pressure need to be accounted for in the life and reliability prediction. Due to the availability of very limited test data, assuring life and reliability of the HH is a challenging task. The NASA Glenn Research Center (GRC) has adopted an integrated approach combining available uniaxial MarM-247 material behavior testing, HH benchmark testing and advanced analysis in order to demonstrate the integrity, life and reliability of the HH under expected mission conditions. The proposed paper describes analytical aspects of the deterministic and probabilistic approaches and results. The deterministic approach involves development of the creep constitutive model for the MarM-247 (akin to the Oak Ridge National Laboratory master curve model used previously for Inconel 718 (Special Metals Corporation)) and nonlinear finite element analysis to predict the mean life. The probabilistic approach includes evaluation of the effect of design variable uncertainties in material creep behavior, geometry and operating conditions on life and reliability for the expected life. The sensitivity of the uncertainties in the design variables on the HH reliability is also quantified, and guidelines to improve reliability are discussed.
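The probabilistic side of such a life assessment can be sketched as a Monte Carlo estimate of P(creep life > mission duration) under an assumed lognormal life distribution (the median life and scatter below are illustrative values, not MarM-247 data):

```python
import math
import random

def mc_reliability(median_life, sigma_ln, mission_years, n=100_000, seed=42):
    """Monte Carlo estimate of the probability that a lognormally
    distributed creep life exceeds the mission duration."""
    rng = random.Random(seed)
    mu = math.log(median_life)  # log-median parameterization
    survived = sum(rng.lognormvariate(mu, sigma_ln) > mission_years
                   for _ in range(n))
    return survived / n

# Hypothetical: 25-year median life, 0.3 log-scatter, 17-year mission
r_est = mc_reliability(median_life=25.0, sigma_ln=0.3, mission_years=17.0)
```

In a full analysis the sampled variables would also include wall thickness, control temperature, and working gas pressure, and their individual sensitivities would be ranked as the abstract describes.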

  4. Determining sensitivity and specificity of the Sport Concussion Assessment Tool 3 (SCAT3) components in university athletes.

    PubMed

    Downey, Rachel I; Hutchison, Michael G; Comper, Paul

    2018-06-14

    To examine the clinical utility of the Sport Concussion Assessment Tool-3 (SCAT3) in university athletes with concussion in the absence and presence of baseline data over time. Athletes with concussion (n = 23) and uninjured controls (n = 22) were prospectively evaluated at three time-points (baseline, 3-5 days, 3 weeks post-injury) with the SCAT3 components: (1) Post-Concussion Symptom Scale (PCSS); (2) Standardized Assessment of Concussion (SAC); and (3) modified Balance Error Scoring System (m-BESS). Sensitivity and specificity were calculated using reliable change indices and normative data from 458 athletes who completed baseline testing. The PCSS total symptom score yielded highest sensitivity (47.4-72.2%) and specificity (78.6-91.7%) 3-5 days post-injury, with the SAC and m-BESS demonstrating little discriminative ability when used more than 3 days post-concussion. The utility of the SCAT3 was comparable when baseline or normative data was used for predicting concussion. The SCAT is a clinically useful tool for assessing concussion in the absence or presence of baseline data within the first 3-5 days post-injury. Clinical utility of the SCAT3 was driven by symptoms, which remains consistent in the SCAT5. Future research should explore whether additional cognitive elements in the SCAT5 improve utility beyond this timeframe.
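The reliable change index underlying those sensitivity estimates can be sketched with the Jacobson-Truax formulation (the scores, baseline SD, and test-retest reliability below are illustrative, not the study's normative data):

```python
import math

def reliable_change_index(baseline, followup, sd_baseline, retest_r):
    """Jacobson-Truax RCI: score change scaled by the standard error of
    the difference implied by the measure's test-retest reliability."""
    se_measure = sd_baseline * math.sqrt(1.0 - retest_r)
    se_diff = math.sqrt(2.0) * se_measure
    return (followup - baseline) / se_diff

# Hypothetical symptom-score change against a normative baseline spread
rci = reliable_change_index(baseline=10.0, followup=22.0,
                            sd_baseline=6.0, retest_r=0.80)
significant = abs(rci) > 1.96  # conventional 95% reliable-change criterion
```

Flagging athletes whose post-injury change exceeds the RCI criterion is what lets a baseline-free normative comparison stand in for individual baseline testing.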

  5. Component Analysis of Remanent Magnetization Curves: A Revisit with a New Model Distribution

    NASA Astrophysics Data System (ADS)

    Zhao, X.; Suganuma, Y.; Fujii, M.

    2017-12-01

    Geological samples often consist of several magnetic components that have distinct origins. As the magnetic components are often indicative of underlying geological and environmental processes, it is desirable to identify individual components to extract the associated information. This component analysis can be achieved with the so-called unmixing method, which fits a mixture model of a chosen end-member model distribution to the measured remanent magnetization curve. In earlier studies, the lognormal, skew generalized Gaussian and skewed Gaussian distributions have been used as end-member model distributions, with the fit performed on the gradient curve of the remanent magnetization curve. However, gradient curves are sensitive to measurement noise, as differentiation of the measured curve amplifies noise, which can deteriorate the component analysis. Although smoothing or filtering can be applied to reduce the noise before differentiation, their potential to bias the component analysis is rarely addressed. In this study, we investigated a new model function that can be applied directly to remanent magnetization curves, thereby avoiding differentiation. The new model function provides more flexible shapes than the lognormal distribution, which is a merit for modeling the coercivity distributions of complex magnetic components. We applied the unmixing method to both model and measured data, and compared the results with those obtained using other model distributions to better understand their interchangeability, applicability and limitations. The analyses on model data suggest that unmixing methods are inherently sensitive to noise, especially when the number of components exceeds two. It is therefore recommended to verify the reliability of a component analysis by running multiple analyses with synthetic noise. Marine sediments and seafloor rocks are analyzed with the new model distribution.
Given the same number of components, the new model distribution provides closer fits than the lognormal distribution, as evidenced by reduced residuals. Moreover, the new unmixing protocol is automated, so users are freed from the labor of providing initial parameter guesses, which also helps reduce the subjectivity of the component analysis.
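A minimal sketch of the residual that such unmixing methods minimize, using lognormal end-members and hypothetical component parameters (a real fit would optimize the parameters; here a synthetic two-component spectrum is simply evaluated against candidate models):

```python
import math

def lognormal_pdf(x, mu, sigma):
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (
        x * sigma * math.sqrt(2 * math.pi))

def coercivity_model(x, components):
    """Mixture model: each component is (amplitude, mu, sigma)."""
    return sum(a * lognormal_pdf(x, mu, s) for a, mu, s in components)

def rss(fields, observed, components):
    """Residual sum of squares between the data and a candidate mixture."""
    return sum((obs - coercivity_model(h, components)) ** 2
               for h, obs in zip(fields, observed))

# Synthetic two-component coercivity spectrum (hypothetical parameters)
true = [(1.0, math.log(30.0), 0.4), (0.5, math.log(120.0), 0.3)]
fields = [5.0 * (i + 1) for i in range(60)]  # applied fields, 5..300 mT
observed = [coercivity_model(h, true) for h in fields]
```

Dropping the high-coercivity component leaves a clearly larger residual, which is the "reduced residuals" criterion used above to compare model distributions at a fixed component count.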

  6. Simultaneous Quantification of Seven Bioactive Flavonoids in Citri Reticulatae Pericarpium by Ultra-Fast Liquid Chromatography Coupled with Tandem Mass Spectrometry.

    PubMed

    Zhao, Lian-Hua; Zhao, Hong-Zheng; Zhao, Xue; Kong, Wei-Jun; Hu, Yi-Chen; Yang, Shi-Hai; Yang, Mei-Hua

    2016-05-01

    Citri Reticulatae Pericarpium (CRP) is a commonly used traditional Chinese medicine with flavonoids as the major bioactive components. Nevertheless, the contents of the flavonoids in CRP from different sources may vary significantly, affecting their therapeutic effects. Thus, establishing a reliable and comprehensive quality assessment method for flavonoids in CRP is necessary. To set up a rapid and sensitive ultra-fast liquid chromatography coupled with tandem mass spectrometry (UFLC-MS/MS) method for simultaneous quantification of seven bioactive flavonoids in CRP. A UFLC-MS/MS method coupled with ultrasound-assisted extraction was developed for simultaneous separation and quantification of seven flavonoids, including hesperidin, neohesperidin, naringin, narirutin, tangeretin, nobiletin and sinensetin, in 16 batches of CRP samples from different sources in China. The established method showed good linearity for all analytes with correlation coefficients (R) over 0.9980, together with satisfactory accuracy, precision and reproducibility. Furthermore, the recoveries at the three spiked levels were higher than 89.71%, with relative standard deviations (RSDs) lower than 5.19%. The results indicated that the contents of the seven bioactive flavonoids in CRP varied significantly among different sources. Among the samples under study, hesperidin showed the highest content, ranging from 27.50 mg/g (CRP-15) to 86.30 mg/g (CRP-9) across the 16 samples, while the amount of narirutin was too low to be measured in some samples. This study revealed that the developed UFLC-MS/MS method was simple, sensitive and reliable for simultaneous quantification of multiple components in CRP, with potential for the quality control of complex matrices. Copyright © 2016 John Wiley & Sons, Ltd.

  7. Simultaneous quantitative determination of multiple bioactive markers in Ocimum sanctum obtained from different locations and its marketed herbal formulations using UPLC-ESI-MS/MS combined with principal component analysis.

    PubMed

    Pandey, Renu; Chandra, Preeti; Srivastava, Mukesh; Mishra, D K; Kumar, Brijesh

    2015-01-01

    Ocimum sanctum L., with phenolic acids, flavonoids, propenyl phenols and terpenoids as active pharmacological constituents, is a popular medicinal herb and is present as an ingredient in many herbal formulations. Therefore, development of a reliable analytical method for simultaneous determination of the pharmacologically active constituents of O. sanctum is of high importance. To develop and validate a new, rapid, sensitive and selective UPLC-ESI/MS/MS method for simultaneous determination of 23 bioactive markers including phenolic acids, flavonoids, propenyl phenol and terpenoid in the leaf extract and marketed herbal formulations of O. sanctum. A UPLC-ESI/MS/MS method using negative electrospray ionisation (ESI) in multiple-reaction-monitoring (MRM) mode was used for simultaneous determination. Chromatographic separation was achieved on an Acquity UPLC BEH C18 column using a gradient elution with 0.1% formic acid in water and 0.1% formic acid in acetonitrile. Principal component analysis (PCA) was applied to correlate and discriminate eight geographical collections of O. sanctum based on quantitative data of the analytes. The developed method was validated as per International Conference on Harmonization guidelines and found to be accurate, with overall recovery in the range 95.09-104.84% (RSD ≤ 1.85%), precise (RSD ≤ 1.98%) and linear (r² ≥ 0.9971) over the concentration range of 0.5-1000 ng/mL. Ursolic acid was found to be the most abundant marker in all the samples investigated, except for the marketed tablet. The method established is simple, rapid and sensitive, hence it can be reliably utilised for the quality control of O. sanctum and derived herbal formulations. Copyright © 2015 John Wiley & Sons, Ltd.

  8. Characteristics of the Spanish- and English-Language Self-Efficacy to Manage Diabetes Scales.

    PubMed

    Ritter, Philip L; Lorig, Kate; Laurent, Diana D

    2016-04-01

    The purpose of this study was to examine the characteristics of the Spanish-language diabetes self-efficacy scale (DSES-S) and the English-language version (DSES). This study consists of secondary data from 3 randomized studies that administered the DSES-S and DSES at 2 time points. The scales consist of 8 Likert-type 10-point items. Principal component analysis was applied to determine if the scales were unitary or consisted of subscales. Univariate statistics were used to describe the scales. Sensitivity to change was measured by comparing randomized treatment with control groups, where the treatment included methods designed to enhance self-efficacy. General linear models were used to examine the association between the scales and the 8 medical outcomes after controlling for demographic variables. Principal component analysis indicated that there were 2 subscales for both versions: self-efficacy for behaviors and self-efficacy to manage blood levels and medical condition. The measures had similar means across the 3 studies, high internal consistency reliability, and values distributed across the entire range, and they showed no evidence of floor effects and little evidence of ceiling effects. The measures were sensitive to change. They were associated with several health indicators and behaviors at baseline, and changes were associated with changes in health measures. The self-efficacy measures behaved consistently across the 3 studies and were highly reliable. Associations with medical indicators and behaviors suggested validity, although further study would be desirable to compare other measures of self-efficacy for people with type 2 diabetes. These brief scales are appropriate for measuring self-efficacy to manage diabetes. © 2016 The Author(s).

  9. Understanding software faults and their role in software reliability modeling

    NASA Technical Reports Server (NTRS)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability becomes better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. 
The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the regression equation. Since most of the existing metrics have common elements and are linear combinations of these common elements, it seems reasonable to investigate the structure of the underlying common factors or components that make up the raw metrics. The technique we have chosen to use to explore this structure is a procedure called principal components analysis. Principal components analysis is a decomposition technique that may be used to detect and analyze collinearity in software metrics. When confronted with a large number of metrics measuring a single construct, it may be desirable to represent the set by some smaller number of variables that convey all, or most, of the information in the original set. Principal components are linear transformations of a set of random variables that summarize the information contained in the variables. The transformations are chosen so that the first component accounts for the maximal amount of variation of the measures of any possible linear transform; the second component accounts for the maximal amount of residual variation; and so on. The principal components are constructed so that they represent transformed scores on dimensions that are orthogonal. Through the use of principal components analysis, it is possible to have a set of highly related software attributes mapped into a small number of uncorrelated attribute domains. This definitively solves the problem of multi-collinearity in subsequent regression analysis. There are many software metrics in the literature, but principal component analysis reveals that there are few distinct sources of variation, i.e. dimensions, in this set of metrics. 
It would appear perfectly reasonable to characterize the measurable attributes of a program with a simple function of a small number of orthogonal metrics each of which represents a distinct software attribute domain.
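The dimensionality reduction described above can be sketched for two correlated metrics; for a 2x2 covariance matrix the principal variances have a closed form, so no linear-algebra library is needed (the LOC and statement counts below are made-up module measurements):

```python
def covariance(xs, ys):
    """Sample covariance (variance when xs is ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

def principal_variances(xs, ys):
    """Eigenvalues of the 2x2 covariance matrix, largest first:
    lambda = mean of diagonal +/- sqrt(((a-c)/2)^2 + b^2)."""
    a, c = covariance(xs, xs), covariance(ys, ys)
    b = covariance(xs, ys)
    mean, half = (a + c) / 2.0, (((a - c) / 2.0) ** 2 + b * b) ** 0.5
    return mean + half, mean - half

# Hypothetical module measurements: LOC and statement counts move together
loc   = [120, 340, 95, 510, 260, 180, 430, 75]
stmts = [100, 300, 80, 450, 230, 150, 380, 60]
lam1, lam2 = principal_variances(loc, stmts)
explained = lam1 / (lam1 + lam2)  # share of variance on the first component
```

With strongly collinear metrics nearly all the variance loads on the first component, which is precisely why a small number of orthogonal components can replace a large correlated metric set in regression.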

  10. Ultra Reliable Closed Loop Life Support for Long Space Missions

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Ewert, Michael K.

    2010-01-01

    Spacecraft human life support systems can achieve ultra reliability by providing sufficient spares to replace all failed components. The additional mass of spares for ultra reliability is approximately equal to the original system mass, provided that the original system reliability is not too low. Acceptable reliability can be achieved for the Space Shuttle and Space Station by preventive maintenance and by replacing failed units. However, on-demand maintenance and repair requires a logistics supply chain in place to provide the needed spares. In contrast, a Mars or other long space mission must take along all the needed spares, since resupply is not possible. Long missions must achieve ultra reliability, a very low failure rate per hour, since they will take years rather than weeks and cannot be cut short if a failure occurs. Also, distant missions have a much higher mass launch cost per kilogram than near-Earth missions. Achieving ultra reliable spacecraft life support systems with acceptable mass will require a well-planned and extensive development effort. Analysis must determine the reliability requirement and allocate it to subsystems and components. Ultra reliability requires reducing the intrinsic failure causes, providing spares to replace failed components and having "graceful" failure modes. Technologies, components, and materials must be selected and designed for high reliability. Long duration testing is needed to confirm very low failure rates. Systems design should segregate the failure causes in the smallest, most easily replaceable parts. The system must be designed, developed, integrated, and tested with system reliability in mind. Maintenance and reparability of failed units must not add to the probability of failure. The overall system must be tested sufficiently to identify any design errors. A program to develop ultra reliable space life support systems with acceptable mass should start soon since it must be a long term effort.
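The spares-sizing logic behind "sufficient spares to replace all failed components" can be sketched with a Poisson model of component failures (the failure rate and mission duration below are hypothetical, not from the paper):

```python
import math

def spares_needed(expected_failures, target_reliability, max_spares=1000):
    """Smallest spare count s with P(failures <= s) >= target, assuming
    failures follow a Poisson process with the given mission expectation."""
    term, cdf = math.exp(-expected_failures), 0.0
    for s in range(max_spares + 1):
        cdf += term  # term is the Poisson pmf P(N = s)
        if cdf >= target_reliability:
            return s
        term *= expected_failures / (s + 1)
    raise ValueError("target not reachable within max_spares")

# Hypothetical component: 1e-5 failures/hour over a 3-year mission
lam_t = 1e-5 * 3 * 365 * 24  # expected failures, about 0.26
s = spares_needed(lam_t, 0.999)
```

Summing the spare masses over every component at a mission-level reliability target is what drives the paper's observation that the spares can weigh roughly as much as the original system.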

  11. Issues and Methods for Assessing COTS Reliability, Maintainability, and Availability

    NASA Technical Reports Server (NTRS)

    Schneidewind, Norman F.; Nikora, Allen P.

    1998-01-01

    Many vendors produce products that are not domain specific (e.g., network server) and have limited functionality (e.g., mobile phone). In contrast, many customers of COTS develop systems that are domain specific (e.g., target tracking system) and have great variability in functionality (e.g., corporate information system). This discussion takes the viewpoint of how the customer can ensure the quality of COTS components. In evaluating the benefits and costs of using COTS, we must consider the environment in which COTS will operate. Thus we must distinguish between using a non-mission critical application like a spreadsheet program to produce a budget and a mission critical application like military strategic and tactical operations. Whereas customers will tolerate an occasional bug in the former, zero tolerance is the rule in the latter. We emphasize the latter because this is the arena where there are major unresolved problems in the application of COTS. Furthermore, COTS components may be embedded in the larger customer system. We refer to these as embedded systems. These components must be reliable, maintainable, and available, and must be integrated with the larger system in order for the customer to benefit from the advertised advantages of lower development and maintenance costs. Interestingly, when the claims of COTS advantages are closely examined, one finds that to a great extent these COTS components consist of hardware and office products, not mission critical software [1]. Obviously, COTS components are different from custom components with respect to one or more of the following attributes: source, development paradigm, safety, reliability, maintainability, availability, security, and other attributes. However, the important question is whether they should be treated differently when deciding to deploy them for operational use; we suggest the answer is no. We use reliability as an example to justify our answer. 
In order to demonstrate its reliability, a COTS component must pass the same reliability evaluations as the custom components, otherwise the COTS components will be the weakest link in the chain of components and will be the determinant of software system reliability. The challenge is that there will be less information available for evaluating COTS components than for custom components but this does not mean we should despair and do nothing. Actually, there is a lot we can do even in the absence of documentation on COTS components because the customer will have information about how COTS components are to be used in the larger system. To illustrate our approach, we will consider the reliability, maintainability, and availability (RMA) of COTS components as used in larger systems. Finally, COTS suppliers might consider increasing visibility into their products to assist customers in determining the components' fitness for use in a particular application. We offer ideas of information that would be useful to customers, and what vendors might do to provide it.

  12. Reliability approach to rotating-component design. [fatigue life and stress concentration

    NASA Technical Reports Server (NTRS)

    Kececioglu, D. B.; Lalli, V. R.

    1975-01-01

    A probabilistic methodology for designing rotating mechanical components using reliability to relate stress to strength is explained. The experimental test machines and data obtained for steel to verify this methodology are described. A sample mechanical rotating component design problem is solved by comparing a deterministic design method with the new design-by-reliability approach. The new method shows that a smaller size and weight can be obtained for a specified rotating shaft life and reliability, and uses the statistical distortion-energy theory with statistical fatigue diagrams for optimum shaft design. Statistical methods are presented for (1) experimentally determining strength distributions for steel, (2) determining a failure theory for stress variations in a rotating shaft subjected to reversed bending and steady torque, and (3) relating strength to stress by reliability.

  13. Validation of a patient-centered culturally sensitive health care office staff inventory.

    PubMed

    Tucker, Carolyn M; Wall, Whitney; Marsiske, Michael; Nghiem, Khanh; Roncoroni, Julia

    2015-09-01

    Research suggests that patient-perceived culturally sensitive health care encompasses multiple components of the health care delivery system, including the cultural sensitivity of front desk office staff. Despite this, research on culturally sensitive health care focuses almost exclusively on provider behaviors, attitudes, and knowledge. This is due in part to the paucity of instruments available to assess the cultural sensitivity of front desk office staff. Thus, the objective of the present study is to determine the psychometric properties of the pilot Tucker-Culturally Sensitive Health Care Office Staff Inventory-Patient Form (T-CSHCOSI-PF), which is an instrument designed to enable patients to evaluate the patient-defined cultural sensitivity of their front desk office staff. A sample of 1648 adult patients was recruited by staff at 67 health care sites across the United States. These patients anonymously completed the T-CSHCOSI-PF, a demographic data questionnaire, and a patient satisfaction questionnaire. Confirmatory factor analyses of the T-CSHCOSI-PF revealed that this inventory has two factors with high internal consistency reliability and validity (Cronbach's αs = 0.97 and 0.95). It is concluded that the T-CSHCOSI-PF is a psychometrically strong and useful inventory for assessing the cultural sensitivity of front desk office staff. This inventory can be used to support culturally sensitive health care research, evaluate the job performance of front desk office staff, and aid in the development of trainings designed to improve the cultural sensitivity of these office staff.

  14. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.

  15. The Advanced Gamma-Ray Imaging System (AGIS)

    NASA Astrophysics Data System (ADS)

    Otte, Nepomuk

    The Advanced Gamma-ray Imaging System (AGIS) is a concept for the next generation of imaging atmospheric Cherenkov telescope arrays. It has the goal of providing an order of magnitude increase in sensitivity for Very High Energy Gamma-ray (100 GeV to 100 TeV) astronomy compared to currently operating arrays such as CANGAROO, HESS, MAGIC, and VERITAS. After an overview of the science such an array would enable, we discuss the development of the components of the telescope system that are required to achieve the sensitivity goal. AGIS stresses improvements in several areas of IACT technology, including component reliability, as well as exploring cost reduction possibilities in order to achieve its goal. We discuss alternatives for the telescopes and positioners: a novel Schwarzschild-Couder telescope offering a wide field of view with a relatively smaller plate scale, and possibilities for rapid slewing in order to address the search for and/or study of Gamma-ray Bursts in the VHE gamma-ray regime. We also discuss options for a high pixel count camera system providing the necessary finer solid angle per pixel and possibilities for a fast topological trigger that would offer improved realtime background rejection and lower energy thresholds.

  16. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system so as to attain the specified system reliability. For large systems, the allocation is often performed at different stages of system design, typically beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding these limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each.
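    As a concrete illustration of the weighting-factor category, the sketch below allocates component reliability targets for a series system with exponential failure times. The weights and numbers are invented for illustration and are not from the report.

```python
import math

def allocate_reliability(weights, system_target, mission_time):
    # ARINC-style weighting-factor allocation for a series system with
    # exponential failure times: each component receives a share of the
    # allowed system failure rate proportional to its weight.
    lam_sys = -math.log(system_target) / mission_time
    total = sum(weights)
    return [math.exp(-(w / total) * lam_sys * mission_time) for w in weights]

# three components with relative failure-rate weights 5:3:2 (illustrative)
targets = allocate_reliability([0.5, 0.3, 0.2], system_target=0.95,
                               mission_time=1000.0)
```

    By construction, the product of the allocated component reliabilities equals the system target for a series system, which is exactly the implied assumption a system engineer must verify before applying the method.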

  17. Reliability of Laterality Effects in a Dichotic Listening Task with Words and Syllables

    ERIC Educational Resources Information Center

    Russell, Nancy L.; Voyer, Daniel

    2004-01-01

    Large and reliable laterality effects have been found using a dichotic target detection task in a recent experiment using word stimuli pronounced with an emotional component. The present study tested the hypothesis that the magnitude and reliability of the laterality effects would increase with the removal of the emotional component and variations…

  18. Ultra-sensitive EUV resists based on acid-catalyzed polymer backbone breaking

    NASA Astrophysics Data System (ADS)

    Manouras, Theodoros; Kazazis, Dimitrios; Koufakis, Eleftherios; Ekinci, Yasin; Vamvakaki, Maria; Argitis, Panagiotis

    2018-03-01

    The main goal of the current work was to develop new sensitive polymeric materials for lithographic applications, focusing in particular on EUV lithography, in which the polymer main chain is cleaved under the influence of photogenerated acid. Resist materials based on cleavage of the polymer main chain are in principle capable of creating very small structures, down to the dimensions of their constituent monomers. Nevertheless, in the commonly used non-chemically amplified materials of this type, issues such as low sensitivity and poor etch resistance limit their areas of application, whereas inadequate etch resistance and unsatisfactory process reliability are the usual problems encountered in acid-catalyzed materials based on main-chain scission. In our material design, the acid-catalyzed chain-cleavable polymers contain very acid-sensitive moieties in their backbone that remain intact in alkaline environments. These newly synthesized polymers additionally bear suitable functional groups to achieve the desired lithographic characteristics (thermal stability, acceptable glass transition temperature, etch resistance, proper dissolution behavior, adhesion to the substrate). Our approach to achieving acceptable etch resistance, a main drawback of other main-chain-cleavable resists, is based on the introduction of polyaromatic hydrocarbons into the polymeric backbone, while the incorporation of an inorganic component further enhances etch resistance. Single-component systems can also be designed following the proposed approach by incorporating suitable PAGs and base quencher molecules into the main chain. Resist formulations based on a random copolymer designed according to these rules and evaluated under EUV exposure exhibit ultrahigh sensitivity, capability for high-resolution patterning, and overall processing characteristics that make them strong candidates for industrial use upon further optimization.

  19. Sensitivity Analysis of ProSEDS (Propulsive Small Expendable Deployer System) Data Communication System

    NASA Technical Reports Server (NTRS)

    Park, Nohpill; Reagan, Shawn; Franks, Greg; Jones, William G.

    1999-01-01

    This paper discusses analytical approaches to evaluating the performance of spacecraft on-board computing systems, with the ultimate aim of achieving a reliable spacecraft data communication system. A sensitivity analysis of the memory system on ProSEDS (Propulsive Small Expendable Deployer System), as part of its data communication system, is investigated. General issues and possible approaches to a reliable spacecraft on-board interconnection network and processor array are also presented. Performance issues of spacecraft on-board computing systems, such as sensitivity, throughput, delay, and reliability, are introduced and discussed.

  20. A Roadmap for the Development and Validation of ERP Biomarkers in Schizophrenia Research

    PubMed Central

    Luck, Steven J.; Mathalon, Daniel H.; O'Donnell, Brian F.; Hämäläinen, Matti S.; Spencer, Kevin M.; Javitt, Daniel C.; Uhlhaas, Peter J.

    2010-01-01

    New efforts to develop treatments for cognitive dysfunction in mental illnesses would benefit enormously from biomarkers that provide sensitive and reliable measures of the neural events underlying cognition. Here we evaluate the promise of event-related potentials (ERPs) as biomarkers of cognitive dysfunction in schizophrenia. We conclude that ERPs have several desirable properties: (a) they provide a direct measure of electrical activity during neurotransmission; (b) their high temporal resolution makes it possible to measure neural synchrony and oscillations; (c) they are relatively inexpensive and convenient to record; (d) animal models are readily available for several ERP components; (e) decades of research have established the sensitivity and reliability of ERP measures in psychiatric illnesses; and (f) feasibility of large N (>500) multi-site studies has been demonstrated for key measures. Consequently, ERPs may be useful for identifying endophenotypes and defining treatment targets, for evaluating new compounds in animals and in humans, and for identifying individuals who are good candidates for early interventions or for specific treatments. However, several challenges must be overcome before ERPs gain widespread use as biomarkers in schizophrenia research, and we make several recommendations for the research that is necessary to develop and validate ERP-based biomarkers that can have a real impact on treatment development. PMID:21111401

  1. Reliability Assessment for COTS Components in Space Flight Applications

    NASA Technical Reports Server (NTRS)

    Krishnan, G. S.; Mazzuchi, Thomas A.

    2001-01-01

    Systems built for space flight applications usually demand a very high degree of performance and a very high level of accuracy. Hence, design engineers are often prone to selecting state-of-the-art technologies for inclusion in their system designs. Shrinking budgets also necessitate the use of COTS (Commercial Off-The-Shelf) components, which are construed as being less expensive. The performance and accuracy requirements for space flight applications are much more stringent than those for commercial applications, and the quantity of systems designed and developed for space applications is much lower than that produced for commercial applications. With a given set of requirements, are these COTS components reliable? This paper presents a model for assessing the reliability of COTS components in space applications and the associated effect on system reliability. We illustrate the method with a real application.

  2. Design for Verification: Using Design Patterns to Build Reliable Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Koga, Dennis (Technical Monitor)

    2003-01-01

    So far, components have mainly been used in commercial software development to reduce time to market. While some effort has been spent on formal aspects of components, most of this was done in the context of programming language or operating system framework integration. As a consequence, increased reliability of composed systems is mainly regarded as a side effect of more rigorous testing of pre-fabricated components. In contrast to this, Design for Verification (D4V) puts the focus on component-specific property guarantees, which are used to design systems with high reliability requirements. D4V components are domain-specific design pattern instances with well-defined property guarantees and usage rules, which are suitable for automatic verification. The guaranteed properties are explicitly used to select components according to key system requirements. The D4V hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility, and complexity/functionality ratio can be adapted to overcome some of the limitations of conventional reliability assurance measures, such as too large a state space or too many execution paths.

  3. Developing a Self-Scoring Comprehensive Instrument to Measure Rest's Four-Component Model of Moral Behavior: The Moral Skills Inventory.

    PubMed

    Chambers, David W

    2011-01-01

    One of the most extensively studied constructs in dental education is the four-component model of moral behavior proposed by James Rest and the set of instruments for measuring it developed by Rest, Muriel Bebeau, and others. Although significant associations have been identified between the four components Rest proposed (called here Moral Sensitivity, Moral Reasoning, Moral Integrity, and Moral Courage) and both dental ethics courses and practitioners whose licenses have been disciplined, there is no single instrument that measures all four components, and existing single-component instruments require professional scoring. This article describes the development and validation of a short, self-scoring instrument, the Moral Skills Inventory, that measures all four components. Evidence of face validity, test/retest reliability, and concurrent convergent and divergent predictive validity is demonstrated in three populations: dental students, clinical dental faculty members, and regents and officers of the American College of Dentists. Significant issues remain in developing the Rest four-component model for use in dental education and practice. Specifically, further construct validation research is needed to understand the nature of the components. In particular, it remains undetermined whether moral constructs are characteristics of individuals that drive behavior in specific situations or whether particular patterns of moral behavior, learned and used in response to individual circumstances, are summarized by researchers and then imputed to practitioners.

  4. A Methodology for the Development of a Reliability Database for an Advanced Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
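    The three-property relevancy test could be sketched as a simple scoring function over the properties named above. The property encoding, scoring scheme, and example data below are hypothetical illustrations, not the actual RDB algorithm.

```python
def relevancy_score(candidate, reference):
    # Compare the three properties named in the RDB relevancy test:
    # function, failure modes, and environment/boundary conditions.
    score = 0.0
    if candidate["function"] == reference["function"]:
        score += 1.0
    shared = set(candidate["failure_modes"]) & set(reference["failure_modes"])
    score += len(shared) / max(len(reference["failure_modes"]), 1)
    if candidate["environment"] == reference["environment"]:
        score += 1.0
    return score / 3.0  # 1.0 = fully relevant

# illustrative use: data from a water-cooled heat exchanger assessed against
# a sodium fast reactor's intermediate heat exchanger
ihx = {"function": "heat exchange",
       "failure_modes": ["tube leak", "fouling"],
       "environment": "sodium"}
candidate = {"function": "heat exchange",
             "failure_modes": ["tube leak"],
             "environment": "water"}
score = relevancy_score(candidate, ihx)
```

    A threshold on such a score would then decide whether a data source enters the reliability database or is down-weighted in favor of expert judgment.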

  5. Refractory metals for ARPS AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svedberg, R.C.; Sievers, R.C.

    1998-07-01

    Alkali Metal Thermal-to-Electric Converter (AMTEC) cells for the Advanced Radioisotope Power Systems (ARPS) program are being developed with refractory metals and alloys as the basic structural materials. AMTEC cell efficiency increases with cell operating temperature. For space applications, long term reliability and high efficiency are essential and refractory metals were selected because of their high temperature strength, low vapor pressure, and compatibility with sodium. However, refractory metals are sensitive to oxygen, nitrogen and hydrogen contamination and refractory metal cells cannot be processed in air. Because of this sensitivity, new manufacturing and processing techniques are being developed. In addition to structural elements, development of other refractory metal components for the AMTEC cells, such as the artery and evaporator wicks, pinchoff tubes and feedthroughs are required. Changes in cell fabrication techniques and processing procedures being implemented to manufacture refractory metal cells are discussed.

  6. Brief Report: The Preliminary Psychometric Properties of the Social Communication Checklist.

    PubMed

    Wainer, Allison L; Berger, Natalie I; Ingersoll, Brooke R

    2017-04-01

    Despite the expansion of early intervention approaches for young children with ASD, investigators have struggled to identify measures capable of assessing social communication change in response to these interventions. Addressing recent calls for efficient, sensitive, and reliable social communication measures, the current paper outlines the refinement and validation of the Social Communication Checklist (SCC). We discuss two small studies exploring the psychometric properties of the SCC and the SCC-R (revised Social Communication Checklist), including sensitivity to change, inter-rater reliability, and test-retest reliability, in two samples of children with ASD and one sample of typically-developing children. Results indicate this measure is reliable, sensitive to change after a brief social communication intervention, and strongly related to well-established measures of social communicative functioning.

  7. Spatial learning and psychomotor performance of C57BL/6 mice: age sensitivity and reliability of individual differences.

    PubMed

    de Fiebre, Nancyellen C; Sumien, Nathalie; Forster, Michael J; de Fiebre, Christopher M

    2006-09-01

    Two tests often used in aging research, the elevated path test and the Morris water maze test, were examined for their application to the study of brain aging in a large sample of C57BL/6JNia mice. Specifically, these studies assessed: (1) sensitivity to age and the degree of interrelatedness among different behavioral measures derived from these tests, (2) the effect of age on variation in the measurements, and (3) the reliability of individual differences in performance on the tests. Both tests detected age-related deficits in group performance that occurred independently of each other. However, analysis of data obtained on the Morris water maze test revealed three relatively independent components of cognitive performance. Performance in initial acquisition of spatial learning in the Morris maze was not highly correlated with performance during reversal learning (when mice were required to learn a new spatial location), whereas performance in both of those phases was independent of spatial performance assessed during a single probe trial administered at the end of acquisition training. Moreover, impaired performance during initial acquisition could be detected at an earlier age than impairments in reversal learning. There were modest but significant age-related increases in the variance of both elevated path test scores and in several measures of learning in the Morris maze test. Analysis of test scores of mice across repeated testing sessions confirmed reliability of the measurements obtained for cognitive and psychomotor function. Power calculations confirmed that there are sufficiently large age-related differences in elevated path test performance, relative to within age variability, to render this test useful for studies into the ability of an intervention to prevent or reverse age-related deficits in psychomotor performance. 
Power calculations indicated a need for larger sample sizes for detection of intervention effects on cognitive components of the Morris water maze test, at least when implemented at the ages tested in this study. Variability among old mice in both tests, including each of the various independent measures in the Morris maze, may be useful for elucidating the biological bases of different aspects of dysfunctional brain aging.
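    The kind of power calculation described can be approximated with a standard normal-approximation sample-size formula for a two-group comparison. The effect sizes below are invented for illustration and are not taken from the study.

```python
import math

def n_per_group(effect_size, z_alpha=1.959964, z_power=0.841621):
    # Normal-approximation sample size per group for a two-sample comparison
    # (two-sided alpha = 0.05, power = 0.80); effect_size is Cohen's d.
    return math.ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# a large psychomotor age effect (d ~ 1.2) needs far fewer mice per group
# than a modest cognitive effect (d ~ 0.5); both d values are hypothetical
n_psychomotor = n_per_group(1.2)  # -> 11
n_cognitive = n_per_group(0.5)    # -> 63
```

    This mirrors the study's conclusion: the smaller the standardized age effect relative to within-age variability, the larger the group sizes an intervention study must plan for.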

  8. Psychometric properties of the Persian version of Social Adaptation Self-evaluation Scale in community-dwelling older adults.

    PubMed

    Farokhnezhad Afshar, Pouya; Foroughan, Mahshid; Vedadhir, AbouAli; Ghazi Tabatabaie, Mahmood

    2017-01-01

    The Social Adaptation Self-evaluation Scale (SASS) is used to measure social function and social motivation in depressed patients. Little attention is paid to social function in the treatment of depression. The aim of this study was to assess the validity and reliability of the Persian version of the SASS (P-SASS) for older adults. This is a cross-sectional, methodological study. The participants were 550 community-dwelling older adults living in Tehran who were selected randomly from primary health care centers. To assess the psychometric properties of the SASS, we first performed translation and cross-cultural adjustment, and then used the P-SASS and the Geriatric Depression Scale (GDS) to gather data. A number of analyses, including Pearson's correlation, exploratory factor analysis, Cronbach's α, and receiver operating characteristic (ROC) curve analysis, were used to manage the data with IBM SPSS Statistics V.22. The mean age of the participants was 66.09±6.67 years, and 58.9% of them were male. Cronbach's α was 0.97. The test-retest reliability correlation coefficient was 0.78. Principal component analysis showed that the P-SASS consists of two components. The P-SASS score showed a significant negative correlation with the GDS (r = -0.91, P < 0.01), which suggests good convergent validity. The P-SASS cutoff point was 28 (sensitivity: 0.97; specificity: 0.94). The P-SASS has good reliability and validity for older adults, so it can be considered an appropriate tool to evaluate the social function and social motivation of older persons with and without depression.
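    Cronbach's α, the internal-consistency statistic reported above, is straightforward to compute directly; a minimal sketch with invented data:

```python
def cronbach_alpha(items):
    # items: one list of scores per scale item, aligned by respondent.
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# illustrative two-item, four-respondent example (perfectly consistent items)
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 4, 6, 8]])
```

    Values approaching 1, like the 0.97 reported for the P-SASS, indicate that the items vary together rather than independently.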

  9. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  10. Regenerable Photovoltaic Devices with a Hydrogel-Embedded Microvascular Network

    PubMed Central

    Koo, Hyung-Jun; Velev, Orlin D.

    2013-01-01

    Light-driven degradation of photoactive molecules could be one of the major obstacles to stable long term operation of organic dye-based solar light harvesting devices. One solution to this problem may be mimicking the regeneration functionality of a plant leaf. We report an organic dye photovoltaic system that has been endowed with such microfluidic regeneration functionality. A hydrogel medium with embedded channels allows rapid and uniform supply of photoactive reagents by a convection-diffusion mechanism. A washing-activation cycle enables reliable replacement of the organic component in a dye-sensitized photovoltaic system. Repetitive restoration of photovoltaic performance after intensive device degradation is demonstrated. PMID:23912814

  11. Constraining the interaction between dark sectors with future HI intensity mapping observations

    NASA Astrophysics Data System (ADS)

    Xu, Xiaodong; Ma, Yin-Zhe; Weltman, Amanda

    2018-04-01

    We study a model of interacting dark matter and dark energy, in which the two components are coupled. We calculate the predictions for the 21-cm intensity mapping power spectra, and forecast the detectability with future single-dish intensity mapping surveys (BINGO, FAST and SKA-I). Since dark energy is turned on at z ~ 1, which falls into the sensitivity range of these radio surveys, the HI intensity mapping technique is an efficient tool to constrain the interaction. By comparing with current constraints on dark sector interactions, we find that future radio surveys will produce tight and reliable constraints on the coupling parameters.

  12. Precision Laser Development for Gravitational Wave Space Mission

    NASA Technical Reports Server (NTRS)

    Numata, Kenji; Camp, Jordan

    2011-01-01

    Optical fiber and semiconductor laser technologies have evolved dramatically over the last decade due to increased demands from optical communications. We are developing a laser (master oscillator) and optical amplifier based on those technologies for interferometric space missions, such as the gravitational-wave mission LISA and the GRACE follow-on, by fully utilizing mature wave-guided optics technologies. In space, where a simple and reliable system is preferred, wave-guided components are advantageous over bulk-crystal, free-space lasers, such as the NPRO (non-planar ring oscillator) and bulk-crystal amplifiers, which are widely used for sensitive laser applications on the ground.

  13. A new approach for computing a flood vulnerability index using cluster analysis

    NASA Astrophysics Data System (ADS)

    Fernandez, Paulo; Mourato, Sandra; Moreira, Madalena; Pereira, Luísa

    2016-08-01

    A Flood Vulnerability Index (FloodVI) was developed using Principal Component Analysis (PCA) and a new aggregation method based on Cluster Analysis (CA). PCA simplifies a large number of variables into a few uncorrelated factors representing the social, economic, physical and environmental dimensions of vulnerability. CA groups areas that have the same characteristics in terms of vulnerability into vulnerability classes, so the grouping of the areas determines their classification, contrary to other aggregation methods in which the areas' classification determines their grouping. Other aggregation methods distribute the areas into classes artificially, by imposing a certain probability that an area belongs to a given class under the assumption that the aggregation measure is normally distributed; CA does not constrain how the areas are distributed among the classes. FloodVI was designed at the neighbourhood level and was applied to the Portuguese municipality of Vila Nova de Gaia, where several flood events have taken place in the recent past. The FloodVI sensitivity was assessed using three different aggregation methods: the sum of component scores, the first component score, and the weighted sum of component scores. The results highlight the sensitivity of the FloodVI to the aggregation method. The sum of component scores and the weighted sum of component scores show similar results; the first component score classifies almost all areas as having medium vulnerability; and the results obtained using CA show a distinct differentiation of vulnerability in which hot spots can be clearly identified. The information provided by records of previous flood events corroborates the results obtained with CA, because the inundated areas with greater damage are those identified as high and very high vulnerability areas by CA. This supports the conclusion that CA provides a reliable FloodVI.
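    The cluster-based aggregation step can be illustrated with a one-dimensional k-means grouping of composite vulnerability scores, so that class boundaries follow the data rather than fixed quantiles. The scores below are synthetic, and this is a sketch of the idea rather than the paper's exact procedure.

```python
def kmeans_1d(scores, k, iters=50):
    # Quantile-initialized 1-D k-means: group scores into k vulnerability
    # classes whose boundaries adapt to the data.
    s = sorted(scores)
    centers = [s[i * (len(s) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in scores:
            groups[min(range(k), key=lambda j: abs(x - centers[j]))].append(x)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return [min(range(k), key=lambda j: abs(x - centers[j])) for x in scores]

# six synthetic neighbourhood scores with three clear vulnerability levels
classes = kmeans_1d([0.10, 0.12, 0.50, 0.52, 0.90, 0.92], k=3)
```

    Unlike quantile binning, the class boundaries here land in the natural gaps of the score distribution, which is what lets hot spots stand out.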

  14. Construction and Characterization of External Cavity Diode Lasers for Atomic Physics

    PubMed Central

    Hardman, Kyle S.; Bennetts, Shayne; Debs, John E.; Kuhn, Carlos C. N.; McDonald, Gordon D.; Robins, Nick

    2014-01-01

    Since their development in the late 1980s, cheap, reliable external cavity diode lasers (ECDLs) have replaced complex and expensive traditional dye and Titanium Sapphire lasers as the workhorse laser of atomic physics labs [1,2]. Their versatility and prolific use throughout atomic physics in applications such as absorption spectroscopy and laser cooling [1,2] makes it imperative for incoming students to gain a firm practical understanding of these lasers. This publication builds upon the seminal work by Wieman [3], updating components, and providing a video tutorial. The setup, frequency locking and performance characterization of an ECDL will be described. Discussion of component selection and proper mounting of both diodes and gratings, the factors affecting mode selection within the cavity, proper alignment for optimal external feedback, optics setup for coarse and fine frequency sensitive measurements, a brief overview of laser locking techniques, and laser linewidth measurements are included. PMID:24796259

  15. Construction and characterization of external cavity diode lasers for atomic physics.

    PubMed

    Hardman, Kyle S; Bennetts, Shayne; Debs, John E; Kuhn, Carlos C N; McDonald, Gordon D; Robins, Nick

    2014-04-24

    Since their development in the late 1980s, cheap, reliable external cavity diode lasers (ECDLs) have replaced complex and expensive traditional dye and Titanium Sapphire lasers as the workhorse laser of atomic physics labs. Their versatility and prolific use throughout atomic physics in applications such as absorption spectroscopy and laser cooling makes it imperative for incoming students to gain a firm practical understanding of these lasers. This publication builds upon the seminal work by Wieman, updating components, and providing a video tutorial. The setup, frequency locking and performance characterization of an ECDL will be described. Discussion of component selection and proper mounting of both diodes and gratings, the factors affecting mode selection within the cavity, proper alignment for optimal external feedback, optics setup for coarse and fine frequency sensitive measurements, a brief overview of laser locking techniques, and laser linewidth measurements are included.

  16. First impressions: gait cues drive reliable trait judgements.

    PubMed

    Thoresen, John C; Vuong, Quoc C; Atkinson, Anthony P

    2012-09-01

    Personality trait attribution can underpin important social decisions and yet requires little effort; even a brief exposure to a photograph can generate lasting impressions. Body movement is a channel readily available to observers and allows judgements to be made when facial and body appearances are less visible; e.g., from great distances. Across three studies, we assessed the reliability of trait judgements of point-light walkers and identified motion-related visual cues driving observers' judgements. The findings confirm that observers make reliable, albeit inaccurate, trait judgements, and these were linked to a small number of motion components derived from a Principal Component Analysis of the motion data. Parametric manipulation of the motion components linearly affected trait ratings, providing strong evidence that the visual cues captured by these components drive observers' trait judgements. Subsequent analyses suggest that reliability of trait ratings was driven by impressions of emotion, attractiveness and masculinity. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Perform qualify reliability-power tests by shooting common mistakes: practical problems and standard answers per Telcordia/Bellcore requests

    NASA Astrophysics Data System (ADS)

    Yu, Zheng

    2002-08-01

    Facing the new demands of the optical fiber communications market, the performance and reliability of an optical network system depend almost entirely on the qualification of its fiber optic components. Accordingly, complying with system requirements through Telcordia/Bellcore reliability and high-power testing has become the key issue for fiber optic component manufacturers: qualification determines who stands out in an intensely competitive market, and the tests themselves need maintenance and optimization. What is needed is a way to reach the 'triple-win' goal expected by component makers, reliability testers, and system users. For those encountering practical problems in this testing, the following seven topics address how to avoid common mistakes when performing qualified reliability and high-power testing:
    - Qualification maintenance requirements for reliability testing
    - Lot control in preparing for reliability testing
    - Sample selection for reliability testing
    - Interim measurements during reliability testing
    - Basic reference factors relating to high-power testing
    - The necessity of re-qualification testing when production changes
    - Understanding similarity for a product family by the definitions

  18. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
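    The core idea — that screening every component with a proof test truncates the low end of the strength distribution and so raises in-service reliability — can be illustrated with a small Monte Carlo sketch. The normal strength and load models and all the numbers below are assumptions for illustration, not the paper's analytical examples:

```python
import random

random.seed(0)

def failure_prob(mu_s, sd_s, mu_l, sd_l, proof_load, n=200_000):
    """Monte Carlo sketch: in-service failure probability of components
    that survive a proof test (strength > proof_load). Returns the
    conditional failure probability and the proof-test pass rate."""
    passed = failed = 0
    for _ in range(n):
        strength = random.gauss(mu_s, sd_s)
        if strength <= proof_load:          # screened out by the proof test
            continue
        passed += 1
        if random.gauss(mu_l, sd_l) > strength:   # in-service overload
            failed += 1
    return failed / passed, passed / n

# Assumed strength ~ N(10, 1.5) and service load ~ N(6, 1), for illustration.
pf_no_proof, _ = failure_prob(10.0, 1.5, 6.0, 1.0, proof_load=float("-inf"))
pf_proof, pass_rate = failure_prob(10.0, 1.5, 6.0, 1.0, proof_load=9.0)
```

    With these assumed numbers the proof test rejects roughly a quarter of the components but cuts the conditional in-service failure probability well below the unscreened value — the trade-off (weight versus proof-load severity and scrap rate) that the paper's optimization exploits.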

  19. Development of the upper-body dressing scale for a buttoned shirt: a preliminary correlational study.

    PubMed

    Suzuki, Makoto; Yamada, Sumio; Omori, Mikayo; Hatakeyama, Mayumi; Sugimura, Yuko; Matsushita, Kazuhiko; Tagawa, Yoshikatsu

    2008-09-01

    A patient with poststroke hemiparesis learns to use the nonparetic arm to compensate for the weakness of the paretic arm to achieve independence in dressing. This is the process of learning new component actions of dressing. The purpose of this study was to develop the Upper-Body Dressing Scale (UBDS) for buttoned-shirt dressing, which evaluates the component actions of upper-body dressing, and to provide preliminary data on the internal consistency of the UBDS, as well as its reproducibility, validity, and sensitivity to clinical change. In a correlational study of concurrent validity and reliability, 63 consecutive stroke patients were enrolled and assessed repeatedly with the UBDS and the dressing item of the Functional Independence Measure (FIM). Fifty-one patients completed the 3-wk study. Cronbach's coefficient alpha for the UBDS was 0.88. Principal component analysis extracted two components, which explained 62.3% of total variance. All items of the scale had high loadings on the first component (0.65-0.83). Actions on the paralytic side loaded positively and actions on the healthy side loaded negatively on the second component. The intraclass correlation coefficient was 0.87. The correlation between UBDS scores and FIM dressing item scores was -0.72. Logistic regression analysis showed that only the UBDS score on the first day of evaluation was a significant independent predictor of dressing ability (odds ratio, 0.82; 95% confidence interval, 0.71-0.95). The UBDS scores for passing the paralytic hand into the sleeve, pulling the sleeve up beyond the elbow joint, and pulling the sleeve up beyond the shoulder joint were worse than the scores for the other components of the task. These component actions loaded positively on the second component identified by the principal component analysis. The UBDS has good internal consistency, reproducibility, validity, and sensitivity to clinical change in patients with poststroke hemiparesis. This detailed UBDS assessment enables us to document the most difficult stages of dressing and to assess motor and process skills for dressing independence.

  20. Machine Maintenance Scheduling with Reliability Engineering Method and Maintenance Value Stream Mapping

    NASA Astrophysics Data System (ADS)

    Sembiring, N.; Nasution, A. H.

    2018-02-01

    Corrective maintenance, i.e., replacing or repairing a machine component after breakdown, is common practice in manufacturing companies. It forces the production process to stop: production time decreases while the maintenance team replaces or repairs the damaged component. This paper proposes a preventive maintenance schedule for a critical component of a critical machine in a crude palm oil and kernel company in order to increase maintenance efficiency. Reliability Engineering and Maintenance Value Stream Mapping are used as the method and tool to analyze the reliability of the component and to reduce waste in the process by segregating value-added and non-value-added activities.

  1. On modeling human reliability in space flights - Redundancy and recovery operations

    NASA Astrophysics Data System (ADS)

    Aarset, M.; Wright, J. F.

    The reliability of humans is of paramount importance to the safety of space flight systems. This paper describes why 'back-up' operators might not be the best solution, and in some cases, might even degrade system reliability. The problem associated with human redundancy calls for special treatment in reliability analyses. The concept of Standby Redundancy is adopted, and psychological and mathematical models are introduced to improve the way such problems can be estimated and handled. In the past, human reliability has practically been neglected in most reliability analyses, and, when included, the humans have been modeled as a component and treated numerically the way technical components are. This approach is not wrong in itself, but it may lead to systematic errors if too simple analogies from the technical domain are used in the modeling of human behavior. In this paper redundancy in a man-machine system will be addressed. It will be shown how simplification from the technical domain, when applied to human components of a system, may give non-conservative estimates of system reliability.
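    The standby-redundancy concept the paper adopts can be sketched numerically. The model below — exponential operator "lifetimes" and an imperfect hand-over probability — is an illustrative assumption, not the authors' psychological model; it shows how the value of a back-up depends on the chance that the switchover to the standby succeeds:

```python
import math
import random

random.seed(1)

def standby_reliability(lam, mission, switch_ok, n=200_000):
    """Cold-standby pair with exponential lifetimes (rate lam) and a
    probability switch_ok that the hand-over to the backup succeeds."""
    ok = 0
    for _ in range(n):
        t1 = random.expovariate(lam)
        if t1 >= mission:                       # primary survives alone
            ok += 1
        elif random.random() < switch_ok:       # hand-over succeeds
            if t1 + random.expovariate(lam) >= mission:
                ok += 1
    return ok / n

single      = math.exp(-0.5 * 2.0)              # one operator, lam=0.5, T=2
perfect     = standby_reliability(0.5, 2.0, switch_ok=1.0)
poor_switch = standby_reliability(0.5, 2.0, switch_ok=0.2)
```

    The analytic value for this model is R = e^(-λT)(1 + c·λT) with switch probability c, so perfect switching gives 2e⁻¹ ≈ 0.74 here. Note that in this simple independent model a standby can only help; the degradation the paper warns about arises from human-factors dependencies (e.g., complacency of the primary operator) that this sketch deliberately omits.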

  2. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    PubMed

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology evaluation to binary rating items. Reliability was assessed comparing the ratings of 2 observers (1 in real time and 1 after a video-recorded review). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.
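    Cohen's kappa, the inter-rater statistic reported above, corrects raw agreement for the agreement expected by chance. A minimal two-rater implementation, applied to hypothetical checklist ratings (not RESCAPE data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same nominal items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in ca) / n**2               # chance agreement
    return (po - pe) / (1 - pe)

# Two observers scoring 10 binary checklist items (1 = performed correctly).
a = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
b = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
kappa = cohens_kappa(a, b)
```

    For these example ratings, observed agreement is 0.90 and chance agreement 0.62, giving κ ≈ 0.74 — which illustrates why high raw agreement can still fall short of the >0.8 "near-perfect" band reported for the RESCAPE items.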

  3. Using meta-quality to assess the utility of volunteered geographic information for science.

    PubMed

    Langley, Shaun A; Messina, Joseph P; Moore, Nathan

    2017-11-06

    Volunteered geographic information (VGI) has strong potential to be increasingly valuable to scientists in collaboration with non-scientists. The abundance of mobile phones and other wireless forms of communication open up significant opportunities for the public to get involved in scientific research. As these devices and activities become more abundant, questions of uncertainty and error in volunteer data are emerging as critical components for using volunteer-sourced spatial data. Here we present a methodology for using VGI and assessing its sensitivity to three types of error. More specifically, this study evaluates the reliability of data from volunteers based on their historical patterns. The specific context is a case study in surveillance of tsetse flies, a health concern as the primary vector of African trypanosomiasis. Reliability, as measured by a reputation score, determines the threshold for accepting the volunteered data for inclusion in a tsetse presence/absence model. Higher reputation scores are successful in identifying areas of higher modeled tsetse prevalence. A dynamic threshold is needed, but the quality of VGI will improve as more data are collected and the errors in identifying reliable participants will decrease. This system allows for two-way communication between researchers and the public, and a way to evaluate the reliability of VGI. Boosting the public's ability to participate in such work can improve disease surveillance and promote citizen science. In the absence of active surveillance, VGI can provide valuable spatial information given that the data are reliable.

  4. Validation and Test-Retest Reliability of New Thermographic Technique Called Thermovision Technique of Dry Needling for Gluteus Minimus Trigger Points in Sciatica Subjects and TrPs-Negative Healthy Volunteers

    PubMed Central

    Rychlik, Michał; Samborski, Włodzimierz

    2015-01-01

    The aim of this study was to assess the validity and test-retest reliability of the Thermovision Technique of Dry Needling (TTDN) for the gluteus minimus muscle. TTDN is a new thermography approach used to support trigger point (TrP) diagnostic criteria through the presence of short-term vasomotor reactions occurring in the area where TrPs refer pain. Method. Thirty chronic sciatica patients (n=15 TrPs-positive and n=15 TrPs-negative) and 15 healthy volunteers were evaluated by TTDN three times during two consecutive days based on TrPs of the gluteus minimus muscle confirmed additionally by the presence of referred pain. TTDN employs average temperature (Tavr), maximum temperature (Tmax), low/high isothermal area, and the autonomic referred pain phenomenon (AURP), which reflects vasodilatation/vasoconstriction. Validity and test-retest reliability were assessed concurrently. Results. Two components of TTDN validity and reliability, Tavr and AURP, had almost perfect agreement according to κ (e.g., thigh: 0.880 and 0.938; calf: 0.902 and 0.956, resp.). Sensitivity for Tavr, Tmax, AURP, and high isothermal area was 100% in every case, but 100% specificity was found for Tavr and AURP only. Conclusion. TTDN is a valid and reliable method for Tavr and AURP measurement to support TrP diagnostic criteria for the gluteus minimus muscle when a digitally evoked referred pain pattern is present. PMID:26137486

  5. [A modified method for recording kinesthetic evoked potentials and its application to the study of proprioceptive sensitivity disorders in spondylogenic cervical myelopathy].

    PubMed

    Gordeev, S A; Voronin, S G

    2016-01-01

    To analyze the efficacy of modified (passive radiocarpal joint flexion/extension) and "standard" (passive radiocarpal joint flexion) methods of kinesthetic evoked potentials for proprioceptive sensitivity assessment in healthy subjects and patients with spondylotic cervical myelopathy. The study included 14 healthy subjects (4 women and 10 men, mean age 54.1±10.5 years) and 8 patients (2 women and 6 men, mean age 55.8±10.9 years) with spondylotic cervical myelopathy. Muscle-joint sense was examined in the clinical study. A modified method of kinesthetic evoked potentials was developed. This method differed from the "standard" one in the organization of a cycle of several passive movements, where each new movement differed from the preceding one in direction. The modified method ensures more reliable kinesthetic sensitivity assessment due to movement variability. A significant increase in the latencies of the early components of the response was found in patients compared to healthy subjects. The modified method of kinesthetic evoked potentials can be used for objective diagnosis of proprioceptive sensitivity disorders in patients with spondylotic cervical myelopathy.

  6. Reliability, Validity, and Sensitivity to Change Over Time of the Modified Melasma Area and Severity Index Score.

    PubMed

    Abou-Taleb, Doaa A E; Ibrahim, Ahmed K; Youssef, Eman M K; Moubasher, Alaa E A

    2017-02-01

    The new modified Melasma Area and Severity Index (mMASI) score, the recently adopted outcome measure for melasma, has not been tested to determine its sensitivity to change in melasma. To determine the reliability, validity, and sensitivity to change over time of the mMASI score in assessing the severity of melasma. Pearson correlation, Cronbach alpha, and intraclass correlation coefficient were calculated to assess the reliability of the mMASI score. Validity of the mMASI scale was assessed using Spearman correlation between the mMASI total score (before and after treatment), clinical data, and patients' responses. The mMASI score showed excellent reliability and good validity for assessing the severity of melasma. The authors also determined that the mMASI score demonstrated sensitivity to change over time. An excellent degree of agreement between the mMASI and MASI scores was revealed. The mMASI score is reliable, valid, and responsive to change in the assessment of melasma severity. Moreover, the mMASI score was found to be easier to learn and perform and simpler to calculate than the MASI score. Overall, the mMASI score can effectively replace the MASI score.

  7. Quantitative Evaluation of Aged AISI 316L Stainless Steel Sensitization to Intergranular Corrosion: Comparison Between Microstructural Electrochemical and Analytical Methods

    NASA Astrophysics Data System (ADS)

    Sidhom, H.; Amadou, T.; Sahlaoui, H.; Braham, C.

    2007-06-01

    The evaluation of the degree of sensitization (DOS) to intergranular corrosion (IGC) of a commercial AISI 316L austenitic stainless steel aged at temperatures ranging from 550 °C to 800 °C for 100 to 80,000 hours was carried out using three different assessment methods. (1) The microstructural method coupled with the Strauss standard test (ASTM A262). This method establishes the kinetics of the precipitation phenomenon under different aging conditions, by transmission electron microscope (TEM) examination of thin foils and electron diffraction. The resulting chromium-depleted zones are characterized by X-ray microanalysis using a scanning transmission electron microscope (STEM). The superimposition of microstructural time-temperature-precipitation (TTP) and ASTM A262 time-temperature-sensitization (TTS) diagrams provides the relationship between aged microstructure and IGC. Moreover, by considering the chromium-depleted zone characteristics, sensitization and desensitization criteria could be established. (2) The electrochemical method involving the double loop-electrochemical potentiokinetic reactivation (DL-EPR) test. The operating conditions of this test were initially optimized using the experimental design method on the basis of the reliability, the selectivity, and the reproducibility of test responses for both annealed and sensitized steels. The TTS diagram of the AISI 316L stainless steel was established using this method. This diagram offers a quantitative assessment of the DOS and the possibility of appreciating the time-temperature equivalence of IGC sensitization and desensitization. (3) The analytical method based on chromium diffusion models. Using the IGC sensitization and desensitization criteria established by the microstructural method, numerical solving of the chromium diffusion equations leads to a calculated AISI 316L TTS diagram. Comparison of these three methods gives a clear advantage to the nondestructive DL-EPR test when it is used with its optimized operating conditions. This quantitative method is simple to perform; it is fast, reliable, economical, and offers the best ability to detect the lowest DOS to IGC. For these reasons, this method can be considered a serious candidate for IGC checking of stainless steel components of industrial plants.

  8. Interrater reliability assessment using the Test of Gross Motor Development-2.

    PubMed

    Barnett, Lisa M; Minto, Christine; Lander, Natalie; Hardy, Louise L

    2014-11-01

    The aim was to examine the interrater reliability of the object control subtest from the Test of Gross Motor Development-2 by live observation in a school field setting. Reliability study, cross-sectional design. Raters were assessed on their ability to agree on (1) the raw total for the six object control skills; (2) each skill performance; and (3) the skill components. Agreement for the object control subtest and the individual skills was assessed by an intraclass correlation coefficient (ICC), and a kappa statistic assessed skill component agreement. A total of 37 children (65% girls) aged 4-8 years (M = 6.2, SD = 0.8) were assessed in six skills by two raters, equating to 222 skill tests. Interrater reliability was excellent for the object control subtest (ICC = 0.93) and, for individual skills, highest for the dribble (ICC = 0.94), followed by the strike (ICC = 0.85), overhand throw (ICC = 0.84), underhand roll (ICC = 0.82), kick (ICC = 0.80) and catch (ICC = 0.71). The strike and the throw had more components with less agreement. Even though overall subtest and individual skill agreement was good, some skill components had lower agreement, suggesting these may be more problematic to assess. Some skill components may therefore need to be specified differently in order to improve component reliability. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  9. Mechanical System Reliability and Cost Integration Using a Sequential Linear Approximation Method

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1997-01-01

    The development of new products is dependent on product designs that incorporate high levels of reliability along with a design that meets predetermined levels of system cost. Additional constraints on the product include explicit and implicit performance requirements. Existing reliability and cost prediction methods result in no direct linkage between the variables affecting these two dominant product attributes. A methodology to integrate reliability and cost estimates using a sequential linear approximation method is proposed. The sequential linear approximation method utilizes probability-of-failure sensitivities determined from probabilistic reliability methods, as well as manufacturing cost sensitivities. The application of the sequential linear approximation method to a mechanical system is demonstrated.
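    A sequential linear approximation of the kind described can be sketched in one variable: linearize the failure probability about the current design using its sensitivity, then step to the point where the linearized reliability constraint is active. The exponential failure-probability model and linear cost model below are assumptions for illustration, not the paper's models:

```python
import math

# Illustrative one-variable models (assumed, not the paper's):
# design variable t (e.g., a thickness) drives failure probability and cost.
def pf(t):       return math.exp(-2.0 * t)           # failure probability
def pf_sens(t):  return -2.0 * math.exp(-2.0 * t)    # sensitivity dPf/dt
def cost(t):     return 3.0 * t                      # cost grows with t

def sequential_linear_design(t0, pf_target, iters=20):
    """Repeatedly linearize Pf about the current design and move to the
    point where the linearized constraint Pf = pf_target is active."""
    t = t0
    for _ in range(iters):
        t += (pf_target - pf(t)) / pf_sens(t)        # linearized step
    return t

t_star = sequential_linear_design(0.5, pf_target=1e-3)
min_cost = cost(t_star)
```

    Because cost increases monotonically with t, the minimum-cost feasible design sits exactly on the reliability constraint, so driving the linearized constraint to equality is also the cost-optimal move here; the iteration converges to t* = -ln(10⁻³)/2 ≈ 3.454.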

  10. Development and validation of the Learning Disabilities Needs Assessment Tool (LDNAT), a HoNOS-based needs assessment tool for use with people with intellectual disability.

    PubMed

    Painter, J; Trevithick, L; Hastings, R P; Ingham, B; Roy, A

    2016-12-01

    In meeting the needs of individuals with intellectual disabilities (ID) who access health services, a brief, holistic assessment of need is useful. This study outlines the development and testing of the Learning Disabilities Needs Assessment Tool (LDNAT), a tool intended for this purpose. An existing mental health (MH) tool was extended by a multidisciplinary group of ID practitioners. Additional scales were drafted to capture needs across six ID treatment domains that the group identified. LDNAT ratings were analysed for the following: item redundancy, relevance, construct validity and internal consistency (n = 1692); test-retest reliability (n = 27); and concurrent validity (n = 160). All LDNAT scales were deemed clinically relevant with little redundancy apparent. Principal component analysis indicated three components (developmental needs, challenging behaviour, MH and well-being). Internal consistency was good (Cronbach alpha 0.80). Individual item test-retest reliability was substantial to near-perfect for 20 scales and slight to fair for three scales. Overall reliability was near perfect (intra-class correlation = 0.91). There were significant associations with five of six condition-specific measures, i.e. the Waisman Activities of Daily Living Scale (general ability/disability), Threshold Assessment Grid (risk), Behaviour Problems Inventory for Individuals with Intellectual Disabilities-Short Form (challenging behaviour), Social Communication Questionnaire (autism) and a bespoke physical health questionnaire. Additionally, the statistically significant correlations between these tools and the LDNAT components made sense clinically. There were no statistically significant correlations with the Psychiatric Assessment Schedules for Adults with Developmental Disabilities (a measure of MH symptoms in people with ID). The LDNAT has clinical utility when rating the needs of people with ID prior to condition-specific assessment(s). 
Analyses of internal and external validity were promising. Further evaluation of its sensitivity to changes in needs is now required. © 2016 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.

  11. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models was implemented in the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) code, including fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
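    Combining a structural response with a structural resistance to obtain component reliability reduces, in its simplest form, to stress-strength interference: P(failure) = P(resistance < response). A Monte Carlo sketch with assumed normal models (NESSUS itself uses far more efficient probabilistic methods than brute-force sampling):

```python
import random

random.seed(2)

def interference_pf(draw_resistance, draw_response, n=200_000):
    """P(failure) = P(resistance < response), estimated by sampling
    the two user-selected models (here: plain sampler callables)."""
    return sum(draw_resistance() < draw_response() for _ in range(n)) / n

# Assumed normal resistance and response models, purely illustrative.
pf = interference_pf(lambda: random.gauss(20.0, 2.0),
                     lambda: random.gauss(12.0, 2.0))
```

    For these two normal models the exact answer is Φ(-(20-12)/√(2²+2²)) = Φ(-2.83) ≈ 0.0023, which gives a closed-form check on the estimate; swapping in a fatigue or creep resistance model only changes the sampler passed in.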

  12. Chemically Designed Metallic/Insulating Hybrid Nanostructures with Silver Nanocrystals for Highly Sensitive Wearable Pressure Sensors.

    PubMed

    Kim, Haneun; Lee, Seung-Wook; Joh, Hyungmok; Seong, Mingi; Lee, Woo Seok; Kang, Min Su; Pyo, Jun Beom; Oh, Soong Ju

    2018-01-10

    With the increase in interest in wearable tactile pressure sensors for e-skin, research on nanostructures to achieve high sensitivity has been actively conducted. However, limitations such as complex fabrication processes using expensive equipment still exist. Herein, simple lithography-free techniques to develop pyramid-like metal/insulator hybrid nanostructures utilizing nanocrystals (NCs) are demonstrated. Ligand-exchanged and unexchanged silver NC thin films are used as metallic and insulating components, respectively. The interfaces of each NC layer are chemically engineered to create discontinuous insulating layers, i.e., spacers for improved sensitivity, and eventually to realize fully solution-processed pressure sensors. Device performance analysis with structural, chemical, and electronic characterization and a conductive atomic force microscopy study reveals that the hybrid nanostructure-based pressure sensor shows an enhanced sensitivity of higher than 500 kPa⁻¹, reliability, and low power consumption over a wide pressure-sensing range. Nano-/micro-hierarchical structures are also designed by combining hybrid nanostructures with conventional microstructures, exhibiting a further enhanced sensing range and achieving a record sensitivity of 2.72 × 10⁴ kPa⁻¹. Finally, all-solution-processed pressure sensor arrays with high pixel density, capable of detecting delicate signals with high spatial selectivity much better than the human tactile threshold, are introduced.

  13. A novel standardized algorithm using SPECT/CT evaluating unhappy patients after unicondylar knee arthroplasty--a combined analysis of tracer uptake distribution and component position.

    PubMed

    Suter, Basil; Testa, Enrique; Stämpfli, Patrick; Konala, Praveen; Rasch, Helmut; Friederich, Niklaus F; Hirschmann, Michael T

    2015-03-20

    The introduction of a standardized SPECT/CT algorithm including a localization scheme, which allows accurate identification of specific patterns and thresholds of SPECT/CT tracer uptake, could lead to a better understanding of the bone remodeling and specific failure modes of unicondylar knee arthroplasty (UKA). The purpose of the present study was to introduce a novel standardized SPECT/CT algorithm for patients after UKA and evaluate its clinical applicability, usefulness and inter- and intra-observer reliability. Tc-HDP-SPECT/CT images of consecutive patients (median age 65, range 48-84 years) with 21 knees after UKA were prospectively evaluated. The tracer activity on SPECT/CT was localized using a specific standardized UKA localization scheme. For tracer uptake analysis (intensity and anatomical distribution pattern) a 3D volumetric quantification method was used. The maximum intensity values were recorded for each anatomical area. In addition, ratios between the respective value in the measured area and the background tracer activity were calculated. The femoral and tibial component position (varus-valgus, flexion-extension, internal and external rotation) was determined in 3D-CT. The inter- and intra-observer reliability of the localization scheme, grading of the tracer activity and component measurements was determined by calculating intraclass correlation coefficients (ICC). The localization scheme, grading of the tracer activity and component measurements showed high inter- and intra-observer reliability for all regions (tibia, femur and patella). For measurement of component position there was strong agreement between the readings of the two observers; the ICC for the orientation of the femoral component was 0.73-1.00 (intra-observer reliability) and 0.91-1.00 (inter-observer reliability). The ICC for the orientation of the tibial component was 0.75-1.00 (intra-observer reliability) and 0.77-1.00 (inter-observer reliability). 
The SPECT/CT algorithm presented combining the mechanical information on UKA component position, alignment and metabolic data is highly reliable and proved to be a valuable, consistent and useful tool for analysing postoperative knees after UKA. Using this standardized approach in clinical studies might be helpful in establishing the diagnosis in patients with pain after UKA.

  14. Reliability models applicable to space telescope solar array assembly system

    NASA Technical Reports Server (NTRS)

    Patil, S. A.

    1986-01-01

    A complex system may consist of a number of subsystems with several components in series, in parallel, or in a combination of both. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to parallel and series models; hence, the models can be specialized to parallel, series, and combination systems. The models are developed by assuming the failure rates of the components to be functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) System. The STSA consists of 20 identical solar panel assemblies (SPAs). The reliabilities of the SPAs are determined by the reliabilities of solar cell strings, interconnects, and diodes. Estimates of the reliability of the system for one to five years are calculated by using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
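    The k-out-of-n model described above — the subsystem survives if at least k of its n identical components work, so k = 1 gives a parallel system and k = n a series system — follows directly from the binomial expansion. A minimal sketch; the constant failure rate and mission time are illustrative assumptions, not the STSA values:

```python
from math import comb, exp

def k_of_n_reliability(k, n, p):
    """Probability that at least k of n i.i.d. components work
    (k = 1 reduces to a parallel system, k = n to a series system)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Component reliability at t = 2 under an assumed constant failure rate 0.1;
# a time-varying rate would only change how p(t) is computed.
p = exp(-0.1 * 2.0)
series   = k_of_n_reliability(4, 4, p)   # all four must work
two_of4  = k_of_n_reliability(2, 4, p)   # tolerates two failures
parallel = k_of_n_reliability(1, 4, p)   # any one suffices
```

    The two limiting cases recover the familiar closed forms p⁴ (series) and 1 - (1 - p)⁴ (parallel), with every intermediate k falling between them.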

  15. Validity, Reliability, and Sensitivity of a Volleyball Intermittent Endurance Test.

    PubMed

    Rodríguez-Marroyo, Jose A; Medina-Carrillo, Javier; García-López, Juan; Morante, Juan C; Villa, José G; Foster, Carl

    2017-03-01

    To analyze the concurrent and construct validity of a volleyball intermittent endurance test (VIET), as well as the VIET's test-retest reliability and sensitivity to seasonal changes. During the preseason, 71 volleyball players of different competitive levels took part in this study. All performed the VIET and a graded treadmill test with gas-exchange measurement (GXT). Thirty-one of the players performed an additional VIET to analyze test-retest reliability. To test the VIET's sensitivity, 28 players repeated the VIET and GXT at the end of their season. Significant (P < .001) relationships between VIET distance and maximal oxygen uptake (r = .74) and GXT maximal speed (r = .78) were observed. There were no significant differences between the VIET performance test and retest (1542.1 ± 338.1 vs 1567.1 ± 358.2 m), and a significant (P < .001) correlation and intraclass correlation coefficient (ICC) were found (r = .95, ICC = .96) for VIET performance. VIET performance increased significantly (P < .001) with player performance level and was sensitive to fitness changes across the season (1458.8 ± 343.5 vs 1581.1 ± 334.0 m, P < .01). The VIET may be considered a valid, reliable, and sensitive test to assess aerobic endurance in volleyball players.

  16. Repeatability of measurements of removal of mite-infested brood to assess Varroa Sensitive Hygiene

    USDA-ARS?s Scientific Manuscript database

    Varroa Sensitive Hygiene is a useful resistance trait that bee breeders could increase in different populations with cost-effective and reliable tests. We investigated the reliability of a one-week test estimating the changes in infestation of brood introduced into highly selected and unselected co...

  17. Effect of Individual Component Life Distribution on Engine Life Prediction

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

    2003-01-01

    The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each component's cumulative life distribution as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the life of the lowest-lived component in the system at the same reliability (probability of survival). Where the Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L0.1 life prediction. However, at a probability of survival of 95 percent (L5 life), life decreased with increasing Weibull slope.
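    The system-life property stated above — that a system's life at a given reliability is below that of its lowest-lived component at the same reliability — can be checked with a small Weibull sketch. The component (eta, beta) parameters below are assumed for illustration, not the NASA Energy Efficient Engine data:

```python
from math import exp, log

def system_life(components, R_target, t_hi=1e6):
    """Bisect for the time at which system survival drops to R_target.
    For independent components, system survival is the product of the
    individual two-parameter Weibull survivals exp(-(t/eta)**beta)."""
    def R_sys(t):
        return exp(-sum((t / eta) ** beta for eta, beta in components))
    lo, hi = 0.0, t_hi
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if R_sys(mid) > R_target:
            lo = mid
        else:
            hi = mid
    return lo

# Assumed (eta, beta) pairs for illustration, not the engine data.
parts = [(9000.0, 1.5), (12000.0, 2.0), (15000.0, 3.0)]
L5_system = system_life(parts, 0.95)                       # system L5 life
L5_lowest = min(system_life([c], 0.95) for c in parts)     # lowest-lived part
analytic_one = 9000.0 * (-log(0.95)) ** (1 / 1.5)          # closed-form check
```

    A single component's life at reliability R has the closed form eta * (-ln R)^(1/beta), which the bisection reproduces; the system L5 life necessarily falls below the lowest component L5 life because survival probabilities multiply.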

  18. Turbulent stresses in the surf-zone: Which way is up?

    USGS Publications Warehouse

    Haines, John W.; Gelfenbaum, Guy; Edge, B.L.

    1997-01-01

    Velocity observations from a vertical stack of three-component Acoustic Doppler Velocimeters (ADVs) within the energetic surf-zone are presented. Rapid temporal sampling and small sampling volume provide observations suitable for investigation of the role of turbulent fluctuations in surf-zone dynamics. While sensor performance was good, failure to recover reliable measures of tilt from the vertical compromises the value of the data. We present some cursory observations supporting the ADV performance, and examine the sensitivity of stress estimates to uncertainty in the sensor orientation. It is well known that turbulent stress estimates are highly sensitive to orientation relative to vertical when wave motions are dominant. Analyses presented examine the potential to use observed flow-field characteristics to constrain sensor orientation. Results show that such an approach may provide a consistent orientation to a fraction of a degree, but the inherent sensitivity of stress estimates requires a still more restrictive constraint. Regardless, the observations indicate the degree to which stress estimates are dependent on orientation, and provide some indication of the temporal variability in time-averaged stress estimates.

  19. A low-complexity add-on score for protein remote homology search with COMER.

    PubMed

    Margelevicius, Mindaugas

    2018-06-15

    Protein sequence alignment forms the basis for comparative modeling, the most reliable approach to protein structure prediction, among many other applications. Alignment between sequence families, or profile-profile alignment, represents one of the most, if not the most, sensitive means for homology detection but still necessitates improvement. We aim at improving the quality of profile-profile alignments and the sensitivity induced by them by refining profile-profile substitution scores. We have developed a new score that represents an additional component of profile-profile substitution scores. A comprehensive evaluation shows that the new add-on score statistically significantly improves both the sensitivity and the alignment quality of the COMER method. We discuss why the score leads to the improvement and its almost optimal computational complexity that makes it easily implementable in any profile-profile alignment method. An implementation of the add-on score in the open-source COMER software and data are available at https://sourceforge.net/projects/comer. The COMER software is also available on GitHub at https://github.com/minmarg/comer and as a Docker image (minmar/comer). Supplementary data are available at Bioinformatics online.

  20. Model-based POD study of manual ultrasound inspection and sensitivity analysis using metamodel

    NASA Astrophysics Data System (ADS)

    Ribay, Guillemette; Artusi, Xavier; Jenson, Frédéric; Reece, Christopher; Lhuillier, Pierre-Emile

    2016-02-01

    The reliability of NDE can be quantified by using the Probability of Detection (POD) approach. Former studies have shown the potential of the model-assisted POD (MAPOD) approach to replace expensive experimental determination of POD curves. In this paper, we make use of CIVA software to determine POD curves for a manual ultrasonic inspection of a heavy component, for which a whole experimental POD campaign was not available. The influential parameters were determined by expert analysis. The semi-analytical models used in CIVA for wave propagation and beam-defect interaction have been validated in the range of variation of the influential parameters by comparison with finite element modelling (Athena). The POD curves are computed for "hit/miss" and "â versus a" analysis. The validity of the Berens hypothesis is evaluated by statistical tools. A sensitivity study is performed to measure the relative influence of parameters on the defect response amplitude variance, using the Sobol sensitivity index. A metamodel is also built to reduce computing cost and enhance the precision of the estimated index.
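    The first-order Sobol index mentioned in this record can be estimated by Monte Carlo. A minimal pick-freeze sketch on an illustrative two-input model (the model, inputs, and sample size are assumptions, not the CIVA setup):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

def model(x1, x2):
    # Illustrative response: the output depends more on x1 than on x2.
    return 2.0 * x1 + x2

# Pick-freeze estimator of the first-order Sobol index of x1: correlate two
# model outputs that share the same x1 but use independent draws of x2.
x1 = rng.uniform(0.0, 1.0, n)
ya = model(x1, rng.uniform(0.0, 1.0, n))
yb = model(x1, rng.uniform(0.0, 1.0, n))
s1 = np.cov(ya, yb)[0, 1] / np.var(ya, ddof=1)

# Analytically, S1 = Var(2*X1) / Var(Y) = (4/12) / (5/12) = 0.8 here.
print(round(s1, 2))
```

    For this additive model the estimate should land near the analytic value 0.8; in a MAPOD study the model call would be replaced by the (meta)model of the defect response amplitude.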

  1. NASA Case Sensitive Review and Audit Approach

    NASA Astrophysics Data System (ADS)

    Lee, Arthur R.; Bacus, Thomas H.; Bowersox, Alexandra M.; Newman, J. Steven

    2005-12-01

    As an Agency involved in high-risk endeavors, NASA continually reassesses its commitment to engineering excellence and compliance to requirements. As a component of NASA's continual process improvement, the Office of Safety and Mission Assurance (OSMA) established the Review and Assessment Division (RAD) [1] to conduct independent audits to verify compliance with Agency requirements that impact safe and reliable operations. In implementing its responsibilities, RAD benchmarked various approaches for conducting audits, focusing on organizations that, like NASA, operate in high-risk environments - where seemingly inconsequential departures from safety, reliability, and quality requirements can have catastrophic impact on the public, NASA personnel, high-value equipment, and the environment. The approach used by the U.S. Navy Submarine Program [2] was considered the most fruitful framework for the invigorated OSMA audit processes. Additionally, the results of the benchmarking activity revealed that not all audits are conducted using just one approach or even with the same objectives. This led to the concept of discrete, unique "audit cases."

  2. Development and validation of a questionnaire to evaluate patient satisfaction with diabetes disease management.

    PubMed

    Paddock, L E; Veloski, J; Chatterton, M L; Gevirtz, F O; Nash, D B

    2000-07-01

    To develop a reliable and valid questionnaire to measure patient satisfaction with diabetes disease management programs. Questions related to structure, process, and outcomes were categorized into 14 domains defining the essential elements of diabetes disease management. Health professionals confirmed the content validity. Face validity was established by a patient focus group. The questionnaire was mailed to 711 patients with diabetes who participated in a disease management program. To reduce the number of questionnaire items, a principal components analysis was performed using a varimax rotation. The Scree test was used to select significant components. To further assess reliability and validity, Cronbach's alpha and product-moment correlations were calculated for components having three or more items with loadings greater than 0.50. The validated 73-item mailed satisfaction survey had a 34.1% response rate. Principal components analysis yielded 13 components with eigenvalues > 1.0. The Scree test proposed a 6-component solution (39 items), which explained 59% of the total variation. Internal consistency reliabilities computed for the first 6 components (alpha = 0.79-0.95) were acceptable. The final questionnaire, the Diabetes Management Evaluation Tool (DMET), was designed to assess patient satisfaction with diabetes disease management programs. Although more extensive testing of the questionnaire is appropriate, preliminary reliability and validity of the DMET have been demonstrated.
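    The item-reduction workflow in this record (principal components with the eigenvalue-greater-than-one rule, then Cronbach's alpha per component) can be sketched on synthetic data; the item structure, sample size, and thresholds below are illustrative, not the DMET values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic survey: 200 respondents, 6 items driven by two latent factors.
n = 200
f1, f2 = rng.normal(size=(2, n))
items = np.column_stack([
    f1 + 0.5 * rng.normal(size=n),   # items 0-2 load on factor 1
    f1 + 0.5 * rng.normal(size=n),
    f1 + 0.5 * rng.normal(size=n),
    f2 + 0.5 * rng.normal(size=n),   # items 3-5 load on factor 2
    f2 + 0.5 * rng.normal(size=n),
    f2 + 0.5 * rng.normal(size=n),
])

# Principal components of the correlation matrix; the Kaiser criterion keeps
# components with eigenvalue > 1 (the abstract additionally uses a Scree test).
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]   # sorted descending
n_components = int(np.sum(eigvals > 1.0))

def cronbach_alpha(x):
    """Internal consistency of a set of items (columns of x)."""
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

alpha_1 = cronbach_alpha(items[:, :3])  # items of the first component
print(n_components, round(alpha_1, 2))
```

    With two strong latent factors the Kaiser rule retains two components, and the three items of a retained component show high internal consistency.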

  3. The reliability of the pass/fail decision for assessments comprised of multiple components.

    PubMed

    Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana

    2015-01-01

    The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When "conjunctively" combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg's Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. 
For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts is relatively low with κ=0.49 or κ=0.47, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, it is not necessarily so that a reliable pass/fail decision has been reached - for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees that do not fulfill the minimum requirements.

  4. The reliability of the pass/fail decision for assessments comprised of multiple components

    PubMed Central

    Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana

    2015-01-01

    Objective: The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When “conjunctively” combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. Method: The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg’s Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Results: Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. 
For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts is relatively low with κ=0.49 or κ=0.47, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. Conclusion: The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, it is not necessarily so that a reliable pass/fail decision has been reached – for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees that do not fulfill the minimum requirements. PMID:26483855
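    The central finding of the two records above, that a conjunctive pass rule can yield low decision accuracy even when each component is measured reliably, can be illustrated with a small Monte Carlo sketch (the score distributions, error SD, and cut score are invented, not the Heidelberg data):

```python
import random

random.seed(0)

# Monte Carlo sketch: three components, each passed when the observed score
# (true score plus measurement error) reaches the cut score; the overall
# decision is conjunctive (all three must be passed). All values invented.
n, cut, error_sd = 50_000, 50.0, 6.0
agree = 0
for _ in range(n):
    true_scores = [random.gauss(60, 10) for _ in range(3)]
    observed = [t + random.gauss(0, error_sd) for t in true_scores]
    true_pass = all(t >= cut for t in true_scores)
    observed_pass = all(o >= cut for o in observed)
    agree += (true_pass == observed_pass)

# Classification accuracy: how often the observed conjunctive decision
# matches the decision that the (unobservable) true scores would give.
accuracy = agree / n
print(round(accuracy, 3))
```

    Misclassifications near any one of the three cut scores propagate into the overall decision, which is why the conjunctive accuracy sits well below what each component's score reliability alone would suggest.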

  5. Analysis of whisker-toughened CMC structural components using an interactive reliability model

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Palko, Joseph L.

    1992-01-01

    Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic William-Warnke failure criterion serves as theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.

  6. Evaluating the effect of database inflation in proteogenomic search on sensitive and reliable peptide identification.

    PubMed

    Li, Honglan; Joh, Yoon Sung; Kim, Hyunwoo; Paek, Eunok; Lee, Sang-Won; Hwang, Kyu-Baek

    2016-12-22

    Proteogenomics is a promising approach for various tasks ranging from gene annotation to cancer research. Databases for proteogenomic searches are often constructed by adding peptide sequences inferred from genomic or transcriptomic evidence to reference protein sequences. Such inflation of databases has the potential to identify novel peptides. However, it also raises concerns about sensitive and reliable peptide identification. Spurious peptides included in target databases may result in underestimated false discovery rate (FDR). On the other hand, inflation of decoy databases could decrease the sensitivity of peptide identification due to the increased number of high-scoring random hits. Although several studies have addressed these issues, widely applicable guidelines for sensitive and reliable proteogenomic search have hardly been available. To systematically evaluate the effect of database inflation in proteogenomic searches, we constructed a variety of real and simulated proteogenomic databases for yeast and human tandem mass spectrometry (MS/MS) data, respectively. Against these databases, we tested two popular database search tools with various approaches to search result validation: the target-decoy search strategy (with and without a refined scoring-metric) and a mixture model-based method. The effect of separate filtering of known and novel peptides was also examined. The results from real and simulated proteogenomic searches confirmed that separate filtering increases the sensitivity and reliability in proteogenomic search. However, no single method consistently identified the largest (or the smallest) number of novel peptides from real proteogenomic searches. We propose to use a set of search result validation methods with separate filtering, for sensitive and reliable identification of peptides in proteogenomic search.
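    The separate-filtering strategy recommended in this record can be sketched with a simple target-decoy filter; the score distributions and class sizes below are simulated, not the study data:

```python
import random

random.seed(2)

# Simulated peptide-spectrum matches: (score, is_decoy, is_novel).
psms = []
psms += [(random.gauss(12, 3), False, False) for _ in range(2000)]  # known targets
psms += [(random.gauss(9, 3), False, True) for _ in range(100)]     # novel targets
psms += [(random.gauss(5, 3), True, random.random() < 0.5)          # decoy matches
         for _ in range(2100)]

def accept_at_fdr(matches, fdr=0.01):
    """Greedy target-decoy filtering: walk down by score and keep the largest
    target set whose decoy-estimated FDR (decoys/targets) stays within fdr."""
    ranked = sorted(matches, key=lambda p: -p[0])
    decoys = 0
    targets = []
    best = []
    for psm in ranked:
        if psm[1]:
            decoys += 1
        else:
            targets.append(psm)
            if decoys / len(targets) <= fdr:
                best = list(targets)
    return best

# Separate filtering: known and novel peptides get independent FDR estimates,
# so the large known class cannot mask the error rate of the novel class.
known_hits = accept_at_fdr([p for p in psms if not p[2]])
novel_hits = accept_at_fdr([p for p in psms if p[2]])
```

    Filtering the two classes together would let the many high-scoring known peptides absorb the decoy count, silently inflating the effective FDR among the (lower-scoring) novel peptides.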

  7. Psychometric properties of the Spanish version of the Cocaine Selective Severity Assessment to evaluate cocaine withdrawal in treatment-seeking individuals.

    PubMed

    Pérez de los Cobos, José; Trujols, Joan; Siñol, Núria; Vasconcelos e Rego, Lisiane; Iraurgi, Ioseba; Batlle, Francesca

    2014-09-01

    Reliable and valid assessment of cocaine withdrawal is relevant for treating cocaine-dependent patients. This study examined the psychometric properties of the Spanish version of the Cocaine Selective Severity Assessment (CSSA), an instrument that measures cocaine withdrawal. Participants were 170 cocaine-dependent inpatients receiving detoxification treatment. Principal component analysis revealed a 4-factor structure for the CSSA that included the following components: 'Cocaine Craving and Psychological Distress', 'Lethargy', 'Carbohydrate Craving and Irritability', and 'Somatic Depressive Symptoms'. These 4 components accounted for 56.0% of total variance. Internal reliability for these components ranged from good to unacceptable (Cronbach's alpha: 0.87, 0.65, 0.55, and 0.22, respectively). All components except Somatic Depressive Symptoms presented concurrent validity with cocaine use. In summary, while some properties of the Spanish version of the CSSA are satisfactory, such as interpretability of factor structure and test-retest reliability, other properties, such as internal reliability and concurrent validity of some factors, are inadequate. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Precision, Reliability, and Effect Size of Slope Variance in Latent Growth Curve Models: Implications for Statistical Power Analysis

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; Ghisletta, Paolo; Lindenberger, Ulman; Hertzog, Christopher

    2018-01-01

    Latent Growth Curve Models (LGCM) have become a standard technique to model change over time. Prediction and explanation of inter-individual differences in change are major goals in lifespan research. The major determinants of statistical power to detect individual differences in change are the magnitude of true inter-individual differences in linear change (LGCM slope variance), design precision, alpha level, and sample size. Here, we show that design precision can be expressed as the inverse of effective error. Effective error is determined by instrument reliability and the temporal arrangement of measurement occasions. However, it also depends on another central LGCM component, the variance of the latent intercept and its covariance with the latent slope. We derive a new reliability index for LGCM slope variance—effective curve reliability (ECR)—by scaling slope variance against effective error. ECR is interpretable as a standardized effect size index. We demonstrate how effective error, ECR, and statistical power for a likelihood ratio test of zero slope variance formally relate to each other and how they function as indices of statistical power. We also provide a computational approach to derive ECR for arbitrary intercept-slope covariance. With practical use cases, we argue for the complementary utility of the proposed indices of a study's sensitivity to detect slope variance when making a priori longitudinal design decisions or communicating study designs. PMID:29755377

  9. Functional gait assessment and balance evaluation system test: reliability, validity, sensitivity, and specificity for identifying individuals with Parkinson disease who fall.

    PubMed

    Leddy, Abigail L; Crowner, Beth E; Earhart, Gammon M

    2011-01-01

    Gait impairments, balance impairments, and falls are prevalent in individuals with Parkinson disease (PD). Although the Berg Balance Scale (BBS) can be considered the reference standard for the determination of fall risk, it has a noted ceiling effect. Development of ceiling-free measures that can assess balance and are good at discriminating "fallers" from "nonfallers" is needed. The purpose of this study was to compare the Functional Gait Assessment (FGA) and the Balance Evaluation Systems Test (BESTest) with the BBS among individuals with PD and evaluate the tests' reliability, validity, and discriminatory sensitivity and specificity for fallers versus nonfallers. This was an observational study of community-dwelling individuals with idiopathic PD. The BBS, FGA, and BESTest were administered to 80 individuals with PD. Interrater reliability (n=15) was assessed by 3 raters. Test-retest reliability was based on 2 tests of participants (n=24), 2 weeks apart. Intraclass correlation coefficients (2,1) were used to calculate reliability, and Spearman correlation coefficients were used to assess validity. Cutoff points, sensitivity, and specificity were based on receiver operating characteristic plots. Test-retest reliability was .80 for the BBS, .91 for the FGA, and .88 for the BESTest. Interrater reliability was greater than .93 for all 3 tests. The FGA and BESTest were correlated with the BBS (r=.78 and r=.87, respectively). Cutoff scores to identify fallers were 47/56 for the BBS, 15/30 for the FGA, and 69% for the BESTest. The overall accuracy (area under the curve) for the BBS, FGA, and BESTest was .79, .80, and .85, respectively. Fall reports were retrospective. Both the FGA and the BESTest have reliability and validity for assessing balance in individuals with PD. The BESTest is most sensitive for identifying fallers.
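    The cutoff-score selection used in this record (receiver operating characteristic analysis) is commonly done by maximizing Youden's J over candidate thresholds. A sketch on simulated faller/nonfaller scores (the distributions and group sizes are invented, not the study data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated balance scores: fallers tend to score lower than nonfallers.
fallers = rng.normal(40.0, 5.0, size=30)
nonfallers = rng.normal(50.0, 5.0, size=50)
scores = np.concatenate([fallers, nonfallers])
is_faller = np.concatenate([np.ones(30, dtype=bool), np.zeros(50, dtype=bool)])

def sens_spec(cutoff):
    """Classify scores <= cutoff as 'faller'; return (sensitivity, specificity)."""
    predicted = scores <= cutoff
    sens = np.mean(predicted[is_faller])
    spec = np.mean(~predicted[~is_faller])
    return sens, spec

# Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1,
# scanning every observed score as a candidate threshold (one ROC point each).
candidates = np.unique(scores)
best = max(candidates, key=lambda c: sum(sens_spec(c)) - 1.0)
sens, spec = sens_spec(best)
```

    Each candidate threshold corresponds to one point on the ROC plot; the Youden-optimal point is the one farthest above the chance diagonal.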

  10. FOR Allocation to Distribution Systems based on Credible Improvement Potential (CIP)

    NASA Astrophysics Data System (ADS)

    Tiwary, Aditya; Arya, L. D.; Arya, Rajesh; Choube, S. C.

    2017-02-01

    This paper describes an algorithm for forced outage rate (FOR) allocation to each section of an electrical distribution system subject to satisfaction of reliability constraints at each load point. These constraints include threshold values of basic reliability indices, for example, failure rate, interruption duration and interruption duration per year at load points. Component improvement potential measure has been used for FOR allocation. Component with greatest magnitude of credible improvement potential (CIP) measure is selected for improving reliability performance. The approach adopted is a monovariable method where one component is selected for FOR allocation and in the next iteration another component is selected for FOR allocation based on the magnitude of CIP. The developed algorithm is implemented on sample radial distribution system.
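    The monovariable allocation loop described above can be sketched generically. In this sketch the sections, improvement step, floor, and load-point target are all illustrative assumptions, and a simple one-step failure-rate reduction stands in for the paper's credible improvement potential (CIP) measure:

```python
# A minimal sketch of the monovariable selection loop, assuming a radial
# feeder where the load-point failure rate is the sum of the failure rates
# of the series sections feeding it. All data and targets are illustrative.

sections = {"S1": 0.40, "S2": 0.35, "S3": 0.25}  # failure rates (per year)
target = 0.70          # allowed load-point failure rate (per year)
step = 0.9             # each allocation improves the chosen section by 10%
floor = 0.05           # best achievable failure rate for any section

def load_point_failure_rate():
    return sum(sections.values())

while load_point_failure_rate() > target:
    # Improvement potential of a section: the reduction in the load-point
    # failure rate if that section were improved by one step (a stand-in
    # for the paper's CIP measure).
    potential = {
        name: rate - max(rate * step, floor) for name, rate in sections.items()
    }
    best = max(potential, key=potential.get)  # greatest improvement potential
    sections[best] = max(sections[best] * step, floor)

print(round(load_point_failure_rate(), 3), sections)
```

    One section is improved per iteration, and the section chosen can change between iterations as the ranking of improvement potentials evolves, mirroring the monovariable approach of the record.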

  11. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis process that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method, at the component level, and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.

  12. A mechanism of extreme growth and reliable signaling in sexually selected ornaments and weapons.

    PubMed

    Emlen, Douglas J; Warren, Ian A; Johns, Annika; Dworkin, Ian; Lavine, Laura Corley

    2012-08-17

    Many male animals wield ornaments or weapons of exaggerated proportions. We propose that increased cellular sensitivity to signaling through the insulin/insulin-like growth factor (IGF) pathway may be responsible for the extreme growth of these structures. We document how rhinoceros beetle horns, a sexually selected weapon, are more sensitive to nutrition and more responsive to perturbation of the insulin/IGF pathway than other body structures. We then illustrate how enhanced sensitivity to insulin/IGF signaling in a growing ornament or weapon would cause heightened condition sensitivity and increased variability in expression among individuals: critical properties of reliable signals of male quality. The possibility that reliable signaling arises as a by-product of the growth mechanism may explain why trait exaggeration has evolved so many different times in the context of sexual selection.

  13. Simultaneous determination of nitroimidazoles, benzimidazoles, and chloramphenicol components in bovine milk by ultra-high performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Wang, Yuanyuan; Li, Xiaowei; Zhang, Zhiwen; Ding, Shuangyang; Jiang, Haiyang; Li, Jiancheng; Shen, Jianzhong; Xia, Xi

    2016-02-01

    A sensitive, confirmatory ultra-high performance liquid chromatography-tandem mass spectrometric method was developed and validated to detect 23 veterinary drugs and metabolites (nitroimidazoles, benzimidazoles, and chloramphenicol components) in bovine milk. Compounds of interest were sequentially extracted from milk with acetonitrile and basified acetonitrile using sodium chloride to induce liquid-liquid partition. The extract was purified on a mixed mode solid-phase extraction cartridge. Using rapid polarity switching in electrospray ionization, a single injection was capable of detecting both positively and negatively charged analytes in a 9 min chromatography run time. Recoveries based on matrix-matched calibrations and isotope labeled internal standards for milk ranged from 51.7% to 101.8%. The detection limits and quantitation limits of the analytical method were found to be within the range of 2-20 ng/kg and 5-50 ng/kg, respectively. The recommended method is simple, specific, and reliable for the routine monitoring of nitroimidazoles, benzimidazoles, and chloramphenicol components in bovine milk samples. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Method to Eliminate Flux Linkage DC Component in Load Transformer for Static Transfer Switch

    PubMed Central

    2014-01-01

    Many industrial and commercial sensitive loads are subject to voltage sags and interruptions. The static transfer switch (STS), based on thyristors, is applied to improve power quality and reliability. However, the transfer will result in severe inrush current in the load transformer, because of the DC component in the magnetic flux generated in the transfer process. The inrush current, which can reach 2 to 30 p.u., can cause the maloperation of protective relays and bring potential damage to the transformer. The way to eliminate the DC component is to transfer the related phases when the residual flux linkage of the load transformer and the prospective flux linkage of the alternate source are equal. This paper analyzes how the flux linkage of each winding in the load transformer changes in the transfer process. Based on the residual flux linkage when the preferred source is completely disconnected, the method to calculate the proper time point to close each phase of the alternate source is developed. Simulation and laboratory experiments results are presented to show the effectiveness of the transfer method. PMID:25133255

  15. Method to eliminate flux linkage DC component in load transformer for static transfer switch.

    PubMed

    He, Yu; Mao, Chengxiong; Lu, Jiming; Wang, Dan; Tian, Bing

    2014-01-01

    Many industrial and commercial sensitive loads are subject to voltage sags and interruptions. The static transfer switch (STS), based on thyristors, is applied to improve power quality and reliability. However, the transfer will result in severe inrush current in the load transformer, because of the DC component in the magnetic flux generated in the transfer process. The inrush current, which can reach 2 to 30 p.u., can cause the maloperation of protective relays and bring potential damage to the transformer. The way to eliminate the DC component is to transfer the related phases when the residual flux linkage of the load transformer and the prospective flux linkage of the alternate source are equal. This paper analyzes how the flux linkage of each winding in the load transformer changes in the transfer process. Based on the residual flux linkage when the preferred source is completely disconnected, the method to calculate the proper time point to close each phase of the alternate source is developed. Simulation and laboratory experiments results are presented to show the effectiveness of the transfer method.
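    The zero-DC-flux closing condition described in this record can be sketched for a single phase; the frequency, per-unit waveforms, and residual flux value below are illustrative assumptions:

```python
import math

# Sketch of the closing-instant calculation described above: close one phase
# of the alternate source when its prospective flux linkage equals the
# transformer's residual flux linkage, so no DC flux component (and hence no
# inrush) is produced. All values are illustrative per-unit data.

f = 50.0                      # supply frequency, Hz
omega = 2.0 * math.pi * f
residual = 0.4                # residual flux linkage after disconnection (p.u.)

# For a source voltage v(t) = cos(omega * t) (p.u.), the prospective
# steady-state flux linkage lags the voltage by 90 degrees: sin(omega * t).
def prospective(t):
    return math.sin(omega * t)

# Scan the first cycle for the earliest instant where the fluxes match.
dt = 1e-7
t = 0.0
while abs(prospective(t) - residual) > 1e-4 and t < 1.0 / f:
    t += dt
closing_time_ms = 1000.0 * t
```

    Analytically the closing instant here is asin(0.4)/omega, about 1.31 ms after the voltage peak; a real STS controller must additionally estimate the residual flux from the measured winding voltages before disconnection.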

  16. The Pittsburgh Sleep Diary

    NASA Technical Reports Server (NTRS)

    Monk, T. H.; Reynolds, C. F., III; Kupfer, D. J.; Buysse, D. J.; Coble, P. A.; Hayes, A. J.; Machen, M. A.; Petrie, S. R.; Ritenour, A. M.

    1994-01-01

    Increasingly, there is a need in both research and clinical practice to document and quantify sleep and waking behaviors in a comprehensive manner. The Pittsburgh Sleep Diary (PghSD) is an instrument with separate components to be completed at bedtime and waketime. Bedtime components relate to the events of the day preceding the sleep, waketime components to the sleep period just completed. Two weeks of PghSD data are presented from 234 different subjects, comprising 96 healthy young middle-aged controls, 37 older men, 44 older women, 29 young adult controls and 28 sleep disorders patients in order to demonstrate the usefulness, validity and reliability of various measures from the instrument. Comparisons are made with polysomnographic and actigraphic sleep measures, as well as personality and circadian type questionnaires. The instrument was shown to have sensitivity in detecting differences due to weekends, age, gender, personality and circadian type, and validity in agreeing with actigraphic estimates of sleep timing and quality. Over a 12-31 month delay, PghSD measures of both sleep timing and sleep quality showed correlations between 0.56 and 0.81 (n = 39, P < 0.001).

  17. Reliability Assessment of a Robust Design Under Uncertainty for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J. -W.; Newman, Perry A.

    2003-01-01

    The paper presents reliability assessment results for the robust designs under uncertainty of a 3-D flexible wing previously reported by the authors. Reliability assessments (additional optimization problems) of the active constraints at the various probabilistic robust design points are obtained and compared with the constraint values or target constraint probabilities specified in the robust design. In addition, reliability-based sensitivity derivatives with respect to design variable mean values are also obtained and shown to agree with finite difference values. These derivatives allow one to perform reliability based design without having to obtain second-order sensitivity derivatives. However, an inner-loop optimization problem must be solved for each active constraint to find the most probable point on that constraint failure surface.

  18. Reliability and Creep/Fatigue Analysis of a CMC Component

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; Gyekenyesi, John Z.; Gyekenyesi, John P.

    2007-01-01

    High temperature ceramic matrix composites (CMC) are being explored as viable candidate materials for hot section gas turbine components. These advanced composites can potentially lead to reduced weight and enable higher operating temperatures requiring less cooling; thus leading to increased engine efficiencies. There is a need for convenient design tools that can accommodate various loading conditions and material data with their associated uncertainties to estimate the minimum predicted life as well as the failure probabilities of a structural component. This paper presents a review of the life prediction and probabilistic analyses performed for a CMC turbine stator vane. A computer code, NASALife, is used to predict the life of a 2-D woven silicon carbide fiber reinforced silicon carbide matrix (SiC/SiC) turbine stator vane due to a mission cycle which induces low cycle fatigue and creep. The output from this program includes damage from creep loading, damage due to cyclic loading and the combined damage due to the given loading cycle. Results indicate that the trends predicted by NASALife are as expected for the loading conditions used for this study. In addition, a combination of woven composite micromechanics, finite element structural analysis and Fast Probability Integration (FPI) techniques has been used to evaluate the maximum stress and its probabilistic distribution in a CMC turbine stator vane. Input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Results indicate that reducing the scatter in proportional limit strength of the vane material has the greatest effect in improving the overall reliability of the CMC vane.

  19. Development and validation of the Polish version of the Female Sexual Function Index in the Polish population of females.

    PubMed

    Nowosielski, Krzysztof; Wróbel, Beata; Sioma-Markowska, Urszula; Poręba, Ryszard

    2013-02-01

    Unlike male sexual function, which is relatively easy to assess, female sexual function is still a diagnostic challenge. Although numerous new measurements for female sexual dysfunction (FSD) have recently been developed, the Female Sexual Function Index (FSFI) remains the gold standard for screening. It has been validated in more than 30 countries. The FSFI has been used in several studies conducted in Poland, but it has never been standardized for Polish women. The aim of this study was to develop a Polish version of the FSFI (PL-FSFI). In total, 189 women aged 18-55 years were included in the study. Eighty-five were diagnosed with FSD as per the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM IV-TR) criteria; 104 women did not have FSD. All subjects completed the PL-FSFI at baseline (day 0), day 7, and day 28. Test-retest reliability was determined by Pearson's product-moment correlations, and internal consistency was tested using Cronbach's α coefficient. Construct validity was evaluated by principal component analysis using varimax rotation and factor analysis. Discriminant validity was assessed with between-groups analysis of variance. All domains of the PL-FSFI demonstrated satisfactory internal consistency, with Cronbach's α values >0.70 for the entire sample. The test-retest reliability demonstrated good-to-excellent agreement between the assessment points. Based on principal component analysis, a 5-factor model was established that explained 83.62% of the total variance. Domain intercorrelations of the PL-FSFI ranged from 0.37 to 0.77. The optimal PL-FSFI cutoff score was 27.50, with 87.1% sensitivity and 83.1% specificity. The PL-FSFI is a reliable questionnaire with good psychometric and discriminative validity. Therefore, it can be used as a tool for preliminary screening for FSD among Polish women. © 2012 International Society for Sexual Medicine.
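The internal-consistency statistic reported here (Cronbach's α > 0.70 per domain) can be computed directly from item and total-score variances; the sketch below uses made-up item scores, not the PL-FSFI data.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from per-item score lists (one inner list per item,
    aligned across subjects). Uses population variances throughout:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var_sum = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var_sum / pvariance(totals))

# Three parallel items (identical up to a constant shift) -> alpha at its
# maximum of 1.0, since all total-score variance is shared between items.
base = [1, 2, 3, 4, 5]
items = [base, [x + 1 for x in base], [x - 1 for x in base]]
alpha = cronbach_alpha(items)
```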

  20. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. 
This suggests that the current design paradigm of building a minimal number of highly robust robots may not be the best way to design robots for extended missions.
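The redundancy-versus-robustness trade-off discussed in this record can be sketched with an elementary k-out-of-n model (illustrative probabilities only, not the article's cost data): if a mission needs one working unit, spares turn success into a 1-out-of-n calculation.

```python
from math import comb

def k_of_n_reliability(k, n, p):
    """Probability that at least k of n identical units, each surviving
    with probability p, are working: a standard redundancy model."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# One highly robust unit vs. three cheaper, less reliable units
single_robust = k_of_n_reliability(1, 1, 0.99)       # 0.99
cheap_with_spares = k_of_n_reliability(1, 3, 0.90)   # 1 - 0.1**3 = 0.999
```

Here three cheaper units at 0.90 each out-survive a single 0.99 unit (0.999 vs. 0.99), echoing the record's conclusion that lower-reliability components plus spares can be the better design.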

  1. ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES

    PubMed Central

    LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.

    2008-01-01

    Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
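The combination schemes compared in the paper are not spelled out in this abstract; a common baseline for merging per-source reliabilities, shown here only as a hypothetical sketch, is the probabilistic OR, which treats the sources as erring independently.

```python
def combine_reliabilities(reliabilities):
    """Probabilistic-OR combination: if each source independently supports
    an interaction with reliability r_i, the combined confidence is
    1 - prod(1 - r_i). A common baseline, not necessarily the paper's method."""
    miss_all = 1.0
    for r in reliabilities:
        miss_all *= (1.0 - r)
    return 1.0 - miss_all

# Three sources of modest individual reliability reinforce each other:
combined = combine_reliabilities([0.5, 0.5, 0.2])
```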

  2. Reliability analysis of laminated CMC components through shell subelement techniques

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.

    1992-01-01

    An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.

  3. Screening of Cognitive Impairment in Schizophrenia: Reliability, Sensitivity, and Specificity of the Repeatable Battery for the Assessment of Neuropsychological Status in a Spanish Sample.

    PubMed

    De la Torre, Gabriel G; Perez, Maria J; Ramallo, Miguel A; Randolph, Christopher; González-Villegas, Macarena Bernal

    2016-04-01

    In recent years, studies evaluating neuropsychological deficits in individuals with schizophrenia have shown impairments across several cognitive functions, including attention, memory, and executive function, together with sustained attention problems, working memory deficiencies, and problem-solving difficulties, among many others. Currently, the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) is gaining special importance in the evaluation of the cognitive deficits associated with schizophrenia. In this article, we describe an RBANS screening in a sample of 88 Spanish patients diagnosed with schizophrenia and assess the battery's reliability, sensitivity, and specificity in this sample. We performed a comparative study with 88 healthy participants. The results showed a reliability index value of α = .795 and an item value of α = .762; for total test reliability, we obtained an index value of α = .761 and an item value of α = .762. Sensitivity was 87.5% and specificity was 86.4%. The RBANS obtained good reliability, sensitivity, and specificity scores and represents a good screening tool for detecting cognitive deficits associated with schizophrenia. © The Author(s) 2015.
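Sensitivity and specificity are simple ratios over the confusion matrix; the counts below are hypothetical, chosen only to be consistent with the reported 87.5% and 86.4% in two groups of 88.

```python
def sensitivity(tp, fn):
    """True-positive rate: proportion of patients correctly flagged."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: proportion of healthy controls correctly cleared."""
    return tn / (tn + fp)

# Hypothetical counts consistent with 87.5% / 86.4% in groups of 88
# (77 of 88 patients detected, 76 of 88 controls cleared):
sens = sensitivity(tp=77, fn=11)
spec = specificity(tn=76, fp=12)
```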

  4. Quantitative Determination of Bioactive Constituents in Noni Juice by High-performance Liquid Chromatography with Electrospray Ionization Triple Quadrupole Mass Spectrometry.

    PubMed

    Yan, Yongqiu; Lu, Yu; Jiang, Shiping; Jiang, Yu; Tong, Yingpeng; Zuo, Limin; Yang, Jun; Gong, Feng; Zhang, Ling; Wang, Ping

    2018-01-01

    Noni juice has been extensively used by Polynesians for many years as a folk medicine for the treatment of arthritis, infections, colds, cancers, and diabetes, and as an analgesic. Due to the lack of standard scientific evaluation methods, various kinds of commercial Noni juice of differing quality and price are available on the market. To establish a sensitive, reliable, and accurate high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry (HPLC-ESI-MS/MS) method for separation, identification, and simultaneous quantitative analysis of bioactive constituents in Noni juice. The analytes and eight batches of commercially available samples from different origins were separated and analyzed by the HPLC-ESI-MS/MS method on an Agilent ZORBAX SB-C18 (150 mm × 4.6 mm i.d., 5 μm) column using a gradient elution of acetonitrile-methanol-0.05% glacial acetic acid in water (v/v) at a constant flow rate of 0.5 mL/min. Seven components were identified, and all of the assay parameters were within the required limits. All components showed good linearity (correlation coefficients R2 ≥ 0.9993) over the concentration ranges tested. The precision of the assay method was <0.91% and the repeatability ranged between 1.36% and 3.31%. The accuracy varied from 96.40% to 103.02% and the relative standard deviations of stability were <3.91%. Samples from the same origin showed similar contents, while samples from different origins showed significantly different results. The developed method provides a reliable basis for, and should be useful in, the establishment of a rational quality control standard for Noni juice.
    A method for the separation, identification, and simultaneous quantitative analysis of seven bioactive constituents in Noni juice was originally developed using high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry. The presented method was successfully applied to the quality control of eight batches of commercially available samples of Noni juice. This simple, sensitive, reliable, accurate, and efficient method offers strong specificity, good precision, and a high recovery rate, and provides a reliable basis for the quality control of Noni juice. Abbreviations used: HPLC-ESI-MS/MS: High-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry, LOD: Limit of detection, LOQ: Limit of quantitation, S/N: Signal-to-noise ratio, RSD: Relative standard deviations, DP: Declustering potential, CE: Collision energy, MRM: Multiple reaction monitoring, RT: Retention time.
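The linearity criterion used in such validations (a correlation coefficient threshold for each calibration curve) can be checked with an ordinary least-squares fit; the concentration/response points below are made up, not the paper's calibration data.

```python
def linear_fit_r2(x, y):
    """Ordinary least-squares line y = a + b*x and its R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Made-up calibration points: peak area vs. concentration (ug/mL)
conc = [0.5, 1.0, 2.0, 5.0, 10.0]
area = [1.02, 2.01, 3.98, 10.05, 19.97]
intercept, slope, r2 = linear_fit_r2(conc, area)
```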

  5. Quantitative Determination of Bioactive Constituents in Noni Juice by High-performance Liquid Chromatography with Electrospray Ionization Triple Quadrupole Mass Spectrometry

    PubMed Central

    Yan, Yongqiu; Lu, Yu; Jiang, Shiping; Jiang, Yu; Tong, Yingpeng; Zuo, Limin; Yang, Jun; Gong, Feng; Zhang, Ling; Wang, Ping

    2018-01-01

    Background: Noni juice has been extensively used by Polynesians for many years as a folk medicine for the treatment of arthritis, infections, colds, cancers, and diabetes, and as an analgesic. Due to the lack of standard scientific evaluation methods, various kinds of commercial Noni juice of differing quality and price are available on the market. Objective: To establish a sensitive, reliable, and accurate high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry (HPLC-ESI-MS/MS) method for separation, identification, and simultaneous quantitative analysis of bioactive constituents in Noni juice. Materials and Methods: The analytes and eight batches of commercially available samples from different origins were separated and analyzed by the HPLC-ESI-MS/MS method on an Agilent ZORBAX SB-C18 (150 mm × 4.6 mm i.d., 5 μm) column using a gradient elution of acetonitrile-methanol-0.05% glacial acetic acid in water (v/v) at a constant flow rate of 0.5 mL/min. Results: Seven components were identified, and all of the assay parameters were within the required limits. All components showed good linearity (correlation coefficients R2 ≥ 0.9993) over the concentration ranges tested. The precision of the assay method was <0.91% and the repeatability ranged between 1.36% and 3.31%. The accuracy varied from 96.40% to 103.02% and the relative standard deviations of stability were <3.91%. Samples from the same origin showed similar contents, while samples from different origins showed significantly different results. Conclusions: The developed method provides a reliable basis for, and should be useful in, the establishment of a rational quality control standard for Noni juice.
    SUMMARY A method for the separation, identification, and simultaneous quantitative analysis of seven bioactive constituents in Noni juice was originally developed using high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry. The presented method was successfully applied to the quality control of eight batches of commercially available samples of Noni juice. This simple, sensitive, reliable, accurate, and efficient method offers strong specificity, good precision, and a high recovery rate, and provides a reliable basis for the quality control of Noni juice. Abbreviations used: HPLC-ESI-MS/MS: High-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry, LOD: Limit of detection, LOQ: Limit of quantitation, S/N: Signal-to-noise ratio, RSD: Relative standard deviations, DP: Declustering potential, CE: Collision energy, MRM: Multiple reaction monitoring, RT: Retention time. PMID:29576704

  6. A novel approach for analyzing fuzzy system reliability using different types of intuitionistic fuzzy failure rates of components.

    PubMed

    Kumar, Mohit; Yadav, Shiv Prasad

    2012-03-01

    This paper addresses fuzzy system reliability analysis using different types of intuitionistic fuzzy numbers. To date, fuzzy system reliability analyses in the literature have assumed that the failure rates of all components of a system follow the same type of fuzzy set or intuitionistic fuzzy set. In practical problems, however, such a situation rarely occurs. Therefore, in the present paper, a new algorithm is introduced to construct the membership function and non-membership function of the fuzzy reliability of a system whose components follow different types of intuitionistic fuzzy failure rates. Functions of intuitionistic fuzzy numbers are calculated to construct the membership and non-membership functions of fuzzy reliability via non-linear programming techniques. Using the proposed algorithm, membership and non-membership functions of fuzzy reliability are constructed for a series system and a parallel system. Our study generalizes various works in the literature. Numerical examples are given to illustrate the proposed algorithm. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
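One building block of such analyses can be sketched with interval arithmetic at a single α-cut (a simplification: the paper itself uses non-linear programming, but for an expression that is monotone in each failure rate, such as an exponential series system, the cut endpoints map directly).

```python
from math import exp

def series_reliability_interval(rate_intervals, t):
    """Given [lo, hi] intervals for each component's failure rate at one
    alpha-cut, return the [lo, hi] interval of series-system reliability
    R(t) = prod(exp(-lambda_i * t)). R is decreasing in each lambda_i,
    so the interval endpoints map directly (no optimization needed here)."""
    r_lo = 1.0
    r_hi = 1.0
    for lam_lo, lam_hi in rate_intervals:
        r_lo *= exp(-lam_hi * t)  # worst case: largest failure rates
        r_hi *= exp(-lam_lo * t)  # best case: smallest failure rates
    return r_lo, r_hi

# Two components, alpha-cut intervals of their (fuzzy) failure rates:
cuts = [(0.001, 0.002), (0.003, 0.004)]
lo, hi = series_reliability_interval(cuts, t=100.0)
```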

  7. The reliability, validity, sensitivity, specificity and predictive values of the Chinese version of the Rowland Universal Dementia Assessment Scale.

    PubMed

    Chen, Chia-Wei; Chu, Hsin; Tsai, Chia-Fen; Yang, Hui-Ling; Tsai, Jui-Chen; Chung, Min-Huey; Liao, Yuan-Mei; Chi, Mei-Ju; Chou, Kuei-Ru

    2015-11-01

    The purpose of this study was to translate the Rowland Universal Dementia Assessment Scale into Chinese and to evaluate the psychometric properties (reliability and validity) and the diagnostic properties (sensitivity, specificity and predictive values) of the Chinese version of the Rowland Universal Dementia Assessment Scale. The accurate detection of early dementia requires screening tools with favourable cross-cultural linguistic properties and appropriate sensitivity, specificity, and predictive values, particularly for Chinese-speaking populations. This was a cross-sectional, descriptive study. Overall, 130 participants suspected of having cognitive impairment were enrolled in the study. A retest for determining test-retest reliability was scheduled four weeks after the initial test. Content validity was determined by five experts, whereas construct validity was established using the contrasted-groups technique. The participants' clinical diagnoses were used as the standard in calculating the sensitivity, specificity, positive predictive value and negative predictive value. The study revealed that the Chinese version of the Rowland Universal Dementia Assessment Scale exhibited a test-retest reliability of 0.90, an internal consistency reliability of 0.71, an inter-rater reliability (kappa value) of 0.88 and a content validity index of 0.97. The patient and healthy contrast groups differed significantly in cognitive ability. The optimal cut-off points for the Chinese version of the Rowland Universal Dementia Assessment Scale in the test for mild cognitive impairment and dementia were 24 and 22, respectively; moreover, for these two conditions, the sensitivities of the scale were 0.79 and 0.76, the specificities were 0.91 and 0.81, the areas under the curve were 0.85 and 0.78, the positive predictive values were 0.99 and 0.83 and the negative predictive values were 0.96 and 0.91, respectively.
The Chinese version of the Rowland Universal Dementia Assessment Scale exhibited sound reliability, validity, sensitivity, specificity and predictive values. This scale can help clinical staff members to quickly and accurately diagnose cognitive impairment and provide appropriate treatment as early as possible. © 2015 John Wiley & Sons Ltd.

  8. Reliability Assessment Approach for Stirling Convertors and Generators

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Schreiber, Jeffrey G.; Zampino, Edward; Best, Timothy

    2004-01-01

    Stirling power conversion is being considered for use in a Radioisotope Power System for deep-space science missions because it offers a multifold increase in the conversion efficiency of heat to electric power. Quantifying the reliability of a Radioisotope Power System that utilizes Stirling power conversion technology is important in developing and demonstrating the capability for long-term success. A description of the Stirling power convertor is provided, along with a discussion of some of the key components. Ongoing efforts to understand component life, design variables at the component and system levels, related sources, and the nature of uncertainties are discussed. The requirement for reliability is also discussed, and some of the critical areas of concern are identified. A section on the objectives of the performance model development and a computation of reliability is included to highlight the goals of this effort. Also, a viable physics-based reliability plan to model the design-level variable uncertainties at the component and system levels is outlined, and potential benefits are elucidated. The plan involves the interaction of different disciplines, maintaining the physical and probabilistic correlations at all levels, and a verification process based on rational short-term tests. In addition, both top-down and bottom-up coherency are maintained to follow the physics-based design process and mission requirements. The outlined reliability assessment approach provides guidelines to improve the design and identifies governing variables to achieve high reliability in the Stirling Radioisotope Generator design.

  9. Reliability, validity, and sensitivity to change of the lower extremity functional scale in individuals affected by stroke.

    PubMed

    Verheijde, Joseph L; White, Fred; Tompkins, James; Dahl, Peder; Hentz, Joseph G; Lebec, Michael T; Cornwall, Mark

    2013-12-01

    To investigate the reliability, validity, and sensitivity to change of the Lower Extremity Functional Scale (LEFS) in individuals affected by stroke; a secondary objective was to test the validity and sensitivity of a single-item linear analog scale (LAS) of function. Prospective cohort reliability and validation study in a single rehabilitation department of an academic medical center. Forty-three individuals receiving neurorehabilitation for lower extremity dysfunction after stroke were studied; their ages ranged from 32 to 95 years (mean, 70 years), and 77% were men. Test-retest reliability was assessed by calculating the classical intraclass correlation coefficient and the Bland-Altman limits of agreement. Validity was assessed by calculating the Pearson correlation coefficient between the instruments. Sensitivity to change was assessed by comparing baseline scores with end-of-treatment scores. Measurements were taken at baseline, after 1-3 days, and at 4 and 8 weeks. The LEFS, Short Form-36 Physical Function Scale, Berg Balance Scale, Six-Minute Walk Test, Five-Meter Walk Test, Timed Up-and-Go test, and the LAS of function were used. The test-retest reliability of the LEFS was found to be excellent (ICC = 0.96). When correlated with the six other measures of function studied, the validity of the LEFS was found to be moderate to high (r = 0.40-0.71). Regarding sensitivity to change, the mean LEFS score increased by 1.2 SD from baseline to study end, and the mean LAS score by 1.1 SD. The LEFS exhibits good reliability, validity, and sensitivity to change in patients with lower extremity impairments secondary to stroke, and can therefore be a clinically efficient outcome measure in the rehabilitation of patients with subacute stroke. The LAS is shown to be a time-saving and reasonable option for tracking changes in a patient's functional status. Copyright © 2013 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  10. Method for estimating off-axis pulse tube losses

    NASA Astrophysics Data System (ADS)

    Fang, T.; Mulcahey, T. I.; Taylor, R. P.; Spoor, P. S.; Conrad, T. J.; Ghiaasiaan, S. M.

    2017-12-01

    Some Stirling-type pulse tube cryocoolers (PTCs) exhibit sensitivity to gravitational orientation and often exhibit significant cooling performance losses unless situated with the cold end pointing downward. Prior investigations have indicated that some coolers exhibit sensitivity while others do not; however, a reliable method of predicting the level of sensitivity during the design process has not been developed. In this study, we present a relationship that estimates an upper limit to gravitationally induced losses as a function of the dimensionless pulse tube convection number (NPTC) that can be used to ensure that a PTC would remain functional at adverse static tilt conditions. The empirical relationship is based on experimental data as well as experimentally validated 3-D computational fluid dynamics simulations that examine the effects of frequency, mass flow rate, pressure ratio, mass-pressure phase difference, hot and cold end temperatures, and static tilt angle. The validation of the computational model is based on experimental data collected from six commercial pulse tube cryocoolers. The simulation results are obtained from component-level models of the pulse tube and heat exchangers. Parameter ranges covered in component level simulations are 0-180° for tilt angle, 4-8 for length to diameter ratios, 4-80 K cold tip temperatures, -30° to +30° for mass flow to pressure phase angles, and 25-60 Hz operating frequencies. Simulation results and experimental data are aggregated to yield the relationship between inclined PTC performance and pulse tube convection numbers. The results indicate that the pulse tube convection number can be used as an order of magnitude indicator of the orientation sensitivity, but CFD simulations should be used to calculate the change in energy flow more accurately.

  11. Surface-Enhanced Raman Scattering-Based Immunoassay Technologies for Detection of Disease Biomarkers

    PubMed Central

    Smolsky, Joseph; Kaur, Sukhwinder; Hayashi, Chihiro; Batra, Surinder K.; Krasnoslobodtsev, Alexey V.

    2017-01-01

    Detection of biomarkers is of vital importance in disease detection, management, and monitoring of therapeutic efficacy. Extensive efforts have been devoted to the development of novel diagnostic methods that detect and quantify biomarkers with higher sensitivity and reliability, contributing to better disease diagnosis and prognosis. When it comes to such devastating diseases as cancer, these novel powerful methods allow for disease staging as well as detection of cancer at very early stages. Over the past decade, there have been some advances in the development of platforms for biomarker detection of diseases. The main focus has recently shifted to the development of simple and reliable diagnostic tests that are inexpensive, accurate, and can follow a patient’s disease progression and therapy response. The individualized approach in biomarker detection has been also emphasized with detection of multiple biomarkers in body fluids such as blood and urine. This review article covers the developments in Surface-Enhanced Raman Scattering (SERS) and related technologies with the primary focus on immunoassays. Limitations and advantages of the SERS-based immunoassay platform are discussed. The article thoroughly describes all components of the SERS immunoassay and highlights the superior capabilities of SERS readout strategy such as high sensitivity and simultaneous detection of a multitude of biomarkers. Finally, it introduces recently developed strategies for in vivo biomarker detection using SERS. PMID:28085088

  12. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and slow crack growth (SCG, or fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
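The probabilistic design idea can be sketched with the two-parameter Weibull model that underlies this class of tools (toy stresses and parameters, not CARES/Life itself): each evaluation point contributes a Weibull failure probability, and a weakest-link product gives the component-level probability of failure.

```python
from math import exp

def element_failure_probability(stress, sigma0, m):
    """Two-parameter Weibull form: P_f = 1 - exp(-(stress / sigma0)**m),
    with characteristic strength sigma0 and Weibull modulus m."""
    return 1.0 - exp(-((stress / sigma0) ** m))

def component_failure_probability(stresses, sigma0, m):
    """Weakest-link assumption: the component fails if any element fails,
    with element survivals treated as independent."""
    survival = 1.0
    for s in stresses:
        survival *= 1.0 - element_failure_probability(s, sigma0, m)
    return 1.0 - survival

# Stresses (MPa) at a few evaluation points of a hypothetical part:
pf = component_failure_probability([120.0, 150.0, 180.0], sigma0=400.0, m=10.0)
```

Raising the stress at any point raises the component failure probability, which is why refining the stress evaluation (e.g., per Gaussian point, as in the C/CARES record above) matters for accuracy.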

  13. Designing polymers with sugar-based advantages for bioactive delivery applications.

    PubMed

    Zhang, Yingyue; Chan, Jennifer W; Moretti, Alysha; Uhrich, Kathryn E

    2015-12-10

    Sugar-based polymers have been extensively explored as a means to increase drug delivery systems' biocompatibility and biodegradation. Here, we review the use of sugar-based polymers for drug delivery applications, with a particular focus on the utility of the sugar component(s) to provide benefits for drug targeting and stimuli-responsive systems. Specifically, numerous synthetic methods have been developed to reliably modify naturally occurring polysaccharides, conjugate sugar moieties to synthetic polymer scaffolds to generate glycopolymers, and utilize sugars as multifunctional building blocks to develop sugar-linked polymers. The design of sugar-based polymer systems has tremendous implications for both the physiological and biological properties imparted by the saccharide units, which distinguish these systems from purely synthetic polymers. These features include the ability of glycopolymers to preferentially target various cell types and tissues through receptor interactions, exhibit bioadhesion for prolonged residence time, and be rapidly recognized and internalized by cancer cells. Also discussed are the distinct stimuli-sensitive properties of saccharide-modified polymers to mediate drug release under desired conditions. Saccharide-based systems with inherent pH- and temperature-sensitive properties, as well as enzyme-cleavable polysaccharides for targeted bioactive delivery, are covered. Overall, this work emphasizes the inherent benefits of sugar-containing polymer systems for bioactive delivery.

  14. Comparison between Two Linear Supervised Learning Machines' Methods with Principle Component Based Methods for the Spectrofluorimetric Determination of Agomelatine and Its Degradants.

    PubMed

    Elkhoudary, Mahmoud M; Naguib, Ibrahim A; Abdel Salam, Randa A; Hadad, Ghada M

    2017-05-01

    Four accurate, sensitive and reliable stability indicating chemometric methods were developed for the quantitative determination of Agomelatine (AGM) whether in pure form or in pharmaceutical formulations. Two supervised learning machines' methods; linear artificial neural networks (PC-linANN) preceded by principle component analysis and linear support vector regression (linSVR), were compared with two principle component based methods; principle component regression (PCR) as well as partial least squares (PLS) for the spectrofluorimetric determination of AGM and its degradants. The results showed the benefits behind using linear learning machines' methods and the inherent merits of their algorithms in handling overlapped noisy spectral data especially during the challenging determination of AGM alkaline and acidic degradants (DG1 and DG2). Relative mean squared error of prediction (RMSEP) for the proposed models in the determination of AGM were 1.68, 1.72, 0.68 and 0.22 for PCR, PLS, SVR and PC-linANN; respectively. The results showed the superiority of supervised learning machines' methods over principle component based methods. Besides, the results suggested that linANN is the method of choice for determination of components in low amounts with similar overlapped spectra and narrow linearity range. Comparison between the proposed chemometric models and a reported HPLC method revealed the comparable performance and quantification power of the proposed models.
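
    The RMSEP figure of merit used above to compare the chemometric models can be sketched as follows. The "relative" normalization here (against the mean reference value, in percent) is one common convention; the abstract does not state the exact definition used, so treat it as an assumption.

```python
import math

def rmsep(y_true, y_pred):
    """Root mean squared error of prediction."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def relative_rmsep_percent(y_true, y_pred):
    """RMSEP relative to the mean reference concentration, in percent
    (a common convention; not necessarily the paper's exact formula)."""
    mean_true = sum(y_true) / len(y_true)
    return 100.0 * rmsep(y_true, y_pred) / mean_true
```

    Lower values indicate better predictive performance, which is the basis for ranking PCR, PLS, SVR, and PC-linANN above.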

  15. Scale for positive aspects of caregiving experience: development, reliability, and factor structure.

    PubMed

    Kate, N; Grover, S; Kulhara, P; Nehra, R

    2012-06-01

    OBJECTIVE. To develop an instrument (Scale for Positive Aspects of Caregiving Experience [SPACE]) that evaluates positive caregiving experience and assess its psychometric properties. METHODS. Available scales which assess some aspects of positive caregiving experience were reviewed and a 50-item questionnaire with a 5-point rating was constructed. In all, 203 primary caregivers of patients with severe mental disorders were asked to complete the questionnaire. Internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity were evaluated. Principal component factor analysis was run to assess the factorial validity of the scale. RESULTS. The scale developed as part of the study was found to have good internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity. Principal component factor analysis yielded a 4-factor structure, which also had good test-retest reliability and cross-language reliability. There was a strong correlation between the 4 factors obtained. CONCLUSION. The SPACE developed as part of this study has good psychometric properties.
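
    The internal consistency reported above for a multi-item questionnaire like the SPACE is conventionally Cronbach's alpha. A minimal sketch follows; the item scores below are made-up illustrations, not data from the study.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.
    item_scores: one list per item, each holding all respondents' scores."""
    k = len(item_scores)
    n_resp = len(item_scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all items
    totals = [sum(item[i] for item in item_scores) for i in range(n_resp)]
    item_var_sum = sum(variance(item) for item in item_scores)
    return (k / (k - 1)) * (1.0 - item_var_sum / variance(totals))

# Hypothetical 3-item, 5-respondent example on a 5-point rating
items = [[4, 3, 5, 2, 4], [5, 3, 4, 2, 4], [4, 2, 5, 3, 4]]
alpha = cronbach_alpha(items)
```

    Values near 1 indicate that the items covary strongly, i.e., good internal consistency.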

  16. Environmental risk assessment of biocidal products: identification of relevant components and reliability of a component-based mixture assessment.

    PubMed

    Coors, Anja; Vollmar, Pia; Heim, Jennifer; Sacher, Frank; Kehrer, Anja

    2018-01-01

    Biocidal products are mixtures of one or more active substances (a.s.) and a broad range of formulation additives. There is regulatory guidance currently under development that will specify how the combined effects of the a.s. and any relevant formulation additives shall be considered in the environmental risk assessment of biocidal products. The default option is a component-based approach (CBA) by which the toxicity of the product is predicted from the toxicity of 'relevant' components using concentration addition. Hence, unequivocal and practicable criteria are required for identifying the 'relevant' components to ensure protectiveness of the CBA, while avoiding unnecessary workload resulting from including by default components that do not significantly contribute to the product toxicity. The present study evaluated a set of different criteria for identifying 'relevant' components using confidential information on the composition of 21 wood preservative products. Theoretical approaches were complemented by experimentally testing the aquatic toxicity of seven selected products. For three of the seven tested products, the toxicity was underestimated for the most sensitive endpoint (green algae) by more than a factor of 2 if only the a.s. were considered in the CBA. This illustrated the necessity of including at least some additives along with the a.s. Considering additives that were deemed 'relevant' by the tentatively established criteria reduced the underestimation of toxicity for two of the three products. A lack of data for one specific additive was identified as the most likely reason for the remaining toxicity underestimation of the third product. In three other products, toxicity was overestimated by more than a factor of 2, while prediction and observation fitted well for the seventh product. Considering all additives in the prediction only increased the degree of overestimation. 
Supported by theoretical calculations and experimental verification, the present study developed criteria for the identification of CBA-relevant components in a biocidal product. These criteria are based on existing criteria stated in the regulation for classification, labelling and packaging of substances. The CBA was found sufficiently protective and reliable for the tested products when the criteria recommended here were applied. The lack of available aquatic toxicity data for some of the identified relevant components was the main reason for underestimation of product toxicity.
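
    The concentration-addition prediction underlying the CBA can be sketched as follows: the mixture EC50 is the harmonic combination of the components' EC50s weighted by their mass fractions. The fractions and EC50 values below are hypothetical, not the study's wood-preservative data.

```python
def mixture_ec50_concentration_addition(fractions, ec50s):
    """Predicted EC50 of a mixture under concentration addition:
    1 / EC50_mix = sum(fraction_i / EC50_i). Fractions must sum to 1."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return 1.0 / sum(f / e for f, e in zip(fractions, ec50s))

# Hypothetical 3-component product: two active substances and one additive,
# with individual EC50s of 2, 10, and 50 mg/L
ec50 = mixture_ec50_concentration_addition([0.5, 0.3, 0.2], [2.0, 10.0, 50.0])
```

    Omitting a toxic component from the sum inflates the predicted EC50, i.e., underestimates the mixture toxicity, which is exactly the failure mode the study observed when only the active substances were considered.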

  17. Transmission overhaul estimates for partial and full replacement at repair

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lewicki, D. G.

    1991-01-01

    Timely transmission overhauls increase in-flight service reliability beyond the calculated design reliabilities of the individual aircraft transmission components. Although necessary for aircraft safety, transmission overhauls contribute significantly to aircraft expense. Predicting a transmission's maintenance needs at the design stage should enable the development of more cost-effective and reliable transmissions in the future. The frequency of overhaul is estimated along with the number of transmissions or components needed to support the overhaul schedule. Two methods based on the two-parameter Weibull statistical distribution for component life are used to estimate the time between transmission overhauls. These methods predict transmission lives for maintenance schedules which repair the transmission with a complete system replacement or repair only the failed components of the transmission. An example illustrates the methods.
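
    The two-parameter Weibull framework described above can be sketched as follows: treat the transmission as a series system of Weibull-distributed component lives and solve for the time at which system reliability falls to a target value. The component parameters below are hypothetical, not values from the report.

```python
import math

def system_reliability(t, components):
    """Series-system reliability at time t; each component is (beta, eta),
    the Weibull slope and characteristic life of its two-parameter
    Weibull life distribution."""
    return math.exp(-sum((t / eta) ** beta for beta, eta in components))

def overhaul_interval(components, target=0.90, t_hi=1e6):
    """Bisect for the time at which system reliability drops to `target`
    (a minimal sketch of estimating the time between overhauls)."""
    lo, hi = 0.0, t_hi
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if system_reliability(mid, components) > target:
            lo = mid
        else:
            hi = mid
    return lo

# Hypothetical gears/bearings: (Weibull slope beta, characteristic life eta, h)
parts = [(2.5, 8000.0), (1.5, 12000.0), (3.0, 15000.0)]
t_overhaul = overhaul_interval(parts, target=0.90)
```

    Full-replacement versus failed-component-only repair differ in how the component ages reset after each overhaul; the sketch above covers only the first interval, where the two policies coincide.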

  18. Psychometric evaluation of the Persian version of the Templer's Death Anxiety Scale in cancer patients.

    PubMed

    Soleimani, Mohammad Ali; Yaghoobzadeh, Ameneh; Bahrami, Nasim; Sharif, Saeed Pahlevan; Sharif Nia, Hamid

    2016-10-01

    In this study, 398 Iranian cancer patients completed the 15-item Templer's Death Anxiety Scale (TDAS). Tests of internal consistency, principal components analysis, and confirmatory factor analysis were conducted to assess the internal consistency and factorial validity of the Persian TDAS. The construct reliability statistic and average variance extracted were also calculated to measure construct reliability, convergent validity, and discriminant validity. Principal components analysis indicated a 3-component solution, which was generally supported in the confirmatory analysis. However, acceptable cutoffs for construct reliability, convergent validity, and discriminant validity were not fulfilled for the three subscales that were derived from the principal component analysis. This study demonstrated both the advantages and potential limitations of using the TDAS with Persian-speaking cancer patients.

  19. A nuclear method to measure spallation by thermal cycling of protective surface layers

    NASA Astrophysics Data System (ADS)

    Stroosnijder, M. F.; Macchi, G.

    1995-05-01

    After a general introduction on spallation by thermal cycling, the principle of Thin Layer Activation (TLA) is outlined. A practical setup to measure spallation of protective surface layers by thermal cycling using TLA is discussed. Its use is illustrated with the study of the spallation behaviour of an advanced thermal barrier coating. It is shown that among the various benefits, TLA has a direct relation to material loss and shows a significant increase in sensitivity over other test methods. Due to its intrinsic properties, TLA can contribute to a greater scientific understanding of material degradation by thermal cycling and it can provide a more reliable assessment of the service lives of technical components.

  20. A Case Study for Probabilistic Methods Validation (MSFC Center Director's Discretionary Fund, Project No. 94-26)

    NASA Technical Reports Server (NTRS)

    Price J. M.; Ortega, R.

    1998-01-01

    Probabilistic methods are not a universally accepted approach for the design and analysis of aerospace structures. The validity of this approach must be demonstrated to encourage its acceptance as a viable design and analysis tool for estimating structural reliability. The objective of this study is to develop a well-characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure mode level.

  1. Identification of Reliable Components in Multivariate Curve Resolution-Alternating Least Squares (MCR-ALS): a Data-Driven Approach across Metabolic Processes.

    PubMed

    Motegi, Hiromi; Tsuboi, Yuuri; Saga, Ayako; Kagami, Tomoko; Inoue, Maki; Toki, Hideaki; Minowa, Osamu; Noda, Tetsuo; Kikuchi, Jun

    2015-11-04

    There is an increasing need to use multivariate statistical methods for understanding biological functions, identifying the mechanisms of diseases, and exploring biomarkers. In addition to classical analyses such as hierarchical cluster analysis, principal component analysis, and partial least squares discriminant analysis, various multivariate strategies, including independent component analysis, non-negative matrix factorization, and multivariate curve resolution, have recently been proposed. However, determining the number of components is problematic. Despite the proposal of several different methods, no satisfactory approach has yet been reported. To resolve this problem, we implemented a new idea: classifying a component as "reliable" or "unreliable" based on the reproducibility of its appearance, regardless of the number of components in the calculation. Using the clustering method for classification, we applied this idea to multivariate curve resolution-alternating least squares (MCR-ALS). Comparisons between conventional and modified methods applied to proton nuclear magnetic resonance (¹H-NMR) spectral datasets derived from known standard mixtures and biological mixtures (urine and feces of mice) revealed that more plausible results are obtained by the modified method. In particular, clusters containing little information were detected with reliability. This strategy, named "cluster-aided MCR-ALS," will facilitate the attainment of more reliable results in metabolomics datasets.

  2. Scaling Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin

    2016-01-01

    For long-duration space missions outside of Earth orbit, reliability considerations will drive higher levels of redundancy and/or on-board spares for life support equipment. Component scaling will be a critical element in minimizing overall launch mass while maintaining an acceptable level of system reliability. Building on an earlier reliability study (AIAA 2012-3491), this paper considers the impact of alternative scaling approaches, including the design of technology assemblies and their individual components to maximum, nominal, survival, or other fractional requirements. The optimal level of life support system closure is evaluated for deep-space missions of varying duration using equivalent system mass (ESM) as the comparative basis. Reliability impacts are included in ESM by estimating the number of component spares required to meet a target system reliability. Common cause failures are included in the analysis. ISS and ISS-derived life support technologies are considered along with selected alternatives. This study focuses on minimizing launch mass, which may be enabling for deep-space missions.
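
    Estimating the number of spares needed to meet a target reliability, as described above, is commonly done with a Poisson sparing model: find the smallest spare count whose cumulative probability of covering all mission failures meets the target. The failure rate and mission duration below are hypothetical, not figures from the paper.

```python
import math

def spares_needed(failure_rate_per_hour, mission_hours, target=0.99, n_units=1):
    """Smallest number of spares k such that P(no more than k failures over
    the mission) >= target, assuming failures follow a Poisson process.
    A common sparing model, not necessarily the paper's exact method."""
    lam = failure_rate_per_hour * mission_hours * n_units
    cumulative = 0.0
    k = 0
    while True:
        cumulative += math.exp(-lam) * lam ** k / math.factorial(k)
        if cumulative >= target:
            return k
        k += 1

# Hypothetical CO2-removal blower: 1 failure per 50,000 h, 3-year mission
k = spares_needed(1 / 50000, 26280, target=0.99)
```

    Because the spare count is integer-valued, small changes in the target reliability can add a whole unit of launch mass, which is why scaling and sparing interact strongly in ESM trades.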

  3. A new method for computing the reliability of consecutive k-out-of-n:F systems

    NASA Astrophysics Data System (ADS)

    Gökdere, Gökhan; Gürcan, Mehmet; Kılıç, Muhammet Burak

    2016-01-01

    Consecutive k-out-of-n system models have been applied to reliability evaluation in many physical systems, such as telecommunications, the design of integrated circuits, microwave relay stations, oil pipeline systems, vacuum systems in accelerators, computer ring networks, and spacecraft relay stations. These systems are characterized as logical connections among the components of the systems placed in lines or circles. In the literature, a great deal of attention has been paid to the reliability evaluation of consecutive k-out-of-n systems. In this paper, we propose a new method to compute the reliability of consecutive k-out-of-n:F systems with n linearly and circularly arranged components. The proposed method provides a simple way of determining the system failure probability. We also provide R codes based on the proposed method to compute the reliability of linear and circular systems with a great number of components.
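
    The paper's own method and R code are not reproduced here, but the classical recursion for the linear case can be sketched in Python: a linear consecutive-k-out-of-n:F system fails iff at least k consecutive components fail, and conditioning on the first working component gives a simple recurrence. The circular case is omitted.

```python
def consec_kofn_f_linear(n, k, p):
    """Reliability of a linear consecutive-k-out-of-n:F system with i.i.d.
    components of reliability p. R[i] is the reliability of an i-component
    line; fewer than k components cannot produce k consecutive failures."""
    q = 1.0 - p
    R = [1.0] * max(n + 1, k)
    for i in range(k, n + 1):
        # Condition on j leading failures (j < k) followed by a working unit
        R[i] = p * sum(q ** j * R[i - 1 - j] for j in range(k))
    return R[n]
```

    For k=1 this reduces to a pure series system (R = p^n), and for n < k the system cannot fail, so both serve as sanity checks on the recursion.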

  4. Assessing inspection sensitivity as it relates to damage tolerance in composite rotor hubs

    NASA Astrophysics Data System (ADS)

    Roach, Dennis P.; Rackow, Kirk

    2001-08-01

    Increasing niche applications, growing international markets, and the emergence of advanced rotorcraft technology are expected to greatly increase the population of helicopters over the next decade. In terms of fuselage fatigue, helicopters show similar trends as fixed-wing aircraft. The highly unsteady loads experienced by rotating wings not only directly affect components in the dynamic systems but are also transferred to the fixed airframe structure. Expanded use of rotorcraft has focused attention on the use of new materials and the optimization of maintenance practices. The FAA's Airworthiness Assurance Center (AANC) at Sandia National Labs has joined with Bell Helicopter and other agencies in the rotorcraft industry to evaluate nondestructive inspection (NDI) capabilities in light of the damage tolerance of assorted rotorcraft structure components. Currently, the program's emphasis is on composite rotor hubs. The rotorcraft industry is constantly evaluating new types of lightweight composite materials that not only enhance the safety and reliability of rotor components but also improve performance and extend operating life. Composite rotor hubs have led to the use of bearingless rotor systems that are less complex and require less maintenance than their predecessors. The test facility described in this paper allows the structural stability and damage tolerance of composite hubs to be evaluated using realistic flight load spectrums of centrifugal force and bending loads. NDI was integrated into the life-cycle fatigue tests in order to evaluate flaw detection sensitivity simultaneously with residual strength and general rotor hub performance. This paper will describe the evolving use of damage tolerance analysis (DTA) to direct and improve rotorcraft maintenance along with the related use of nondestructive inspections to manage helicopter safety. Overall, the data from this project will provide information to improve the producibility, inspectability, serviceability, and cost effectiveness of rotorcraft components.

  5. A Sensitive, Reliable Inexpensive Touch Detector

    ERIC Educational Resources Information Center

    Anger, Douglas; Schachtman, Todd R.

    2007-01-01

    Research in a laboratory required a sensitive, reliable, inexpensive touch detector for use with rats to test the reinforcement of inhibition. A small touch detector was also desirable so that the detector could be mounted on the rat's cage close to the object being touched by the rat, whose touches in turn were being detected by current passing…

  6. Reliability and validity of the delta finger-to-palm (FTP), a new measure of finger range of motion in systemic sclerosis.

    PubMed

    Torok, Kathryn S; Baker, Nancy A; Lucas, Mary; Domsic, Robyn T; Boudreau, Robert; Medsger, Thomas A

    2010-01-01

    To determine the reliability and validity of a new measure of finger motion in patients with systemic sclerosis (SSc), the 'delta finger-to-palm' (delta FTP), and compare its psychometric properties to the traditional measure of finger motion, the finger-to-palm (FTP). Phase 1: The reliability of the delta FTP and FTP were examined in 39 patients with SSc. Phase 2: Criterion and convergent construct validity of both measures were examined in 17 patients with SSc by comparing them to other clinical measures: Total Active Range of Motion (TAROM), Hand Mobility in Scleroderma (HAMIS), the Duruoz Hand Index (DHI), Health Assessment Questionnaire (HAQ), and modified Rodnan skin score (mRSS). Phase 3: Sensitivity to change of the delta FTP was investigated in 24 patients with early diffuse cutaneous SSc. Both measures had excellent intra-rater and inter-rater reliability (ICC 0.92 to 0.99). Fair to strong correlations (rs=0.49-0.94) were observed between the delta FTP and TAROM, HAMIS, and DHI. Fair to moderate correlations were observed between delta FTP and HAQ components related to hand function and upper extremity mRSS. Correlations of the traditional FTP with these measures were fair to strong, but most often the delta FTP outperformed the FTP. The effect size and standardised response mean for the mean delta FTP were 0.50 and 1.10 respectively, over a 2-8 month period. The delta FTP is a valid and reliable measure of finger motion in patients with SSc which outperforms the FTP.

  7. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Presented are the numerous factors that potentially degrade system reliability and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  8. Applying the High Reliability Health Care Maturity Model to Assess Hospital Performance: A VA Case Study.

    PubMed

    Sullivan, Jennifer L; Rivard, Peter E; Shin, Marlena H; Rosen, Amy K

    2016-09-01

    The lack of a tool for categorizing and differentiating hospitals according to their high reliability organization (HRO)-related characteristics has hindered progress toward implementing and sustaining evidence-based HRO practices. Hospitals would benefit both from an understanding of the organizational characteristics that support HRO practices and from knowledge about the steps necessary to achieve HRO status to reduce the risk of harm and improve outcomes. The High Reliability Health Care Maturity (HRHCM) model, a model for health care organizations' achievement of high reliability with zero patient harm, incorporates three major domains critical for promoting HROs-Leadership, Safety Culture, and Robust Process Improvement ®. A study was conducted to examine the content validity of the HRHCM model and evaluate whether it can differentiate hospitals' maturity levels for each of the model's components. Staff perceptions of patient safety at six US Department of Veterans Affairs (VA) hospitals were examined to determine whether all 14 HRHCM components were present and to characterize each hospital's level of organizational maturity. Twelve of the 14 components from the HRHCM model were detected; two additional characteristics emerged that are present in the HRO literature but not represented in the model-teamwork culture and system-focused tools for learning and improvement. Each hospital's level of organizational maturity could be characterized for 9 of the 14 components. The findings suggest the HRHCM model has good content validity and that there is differentiation between hospitals on model components. Additional research is needed to understand how these components can be used to build the infrastructure necessary for reaching high reliability.

  9. Mass and Reliability Source (MaRS) Database

    NASA Technical Reports Server (NTRS)

    Valdenegro, Wladimir

    2017-01-01

    The Mass and Reliability Source (MaRS) Database consolidates component mass and reliability data for all Orbital Replacement Units (ORUs) on the International Space Station (ISS) into a single database. It was created to help engineers develop a parametric model that relates hardware mass and reliability. MaRS supplies relevant failure data at the lowest possible component level while providing support for risk, reliability, and logistics analysis. Random-failure data is usually linked to the ORU assembly. MaRS uses this data to identify and display the lowest possible component failure level. As seen in Figure 1, the failure point is identified to the lowest level: Component 2.1. This is useful for efficient planning of spare supplies, supporting long-duration crewed missions, allowing quicker trade studies, and streamlining diagnostic processes. MaRS is composed of information from various databases: MADS (operating hours), VMDB (indentured part lists), and ISS PART (failure data). This information is organized in Microsoft Excel and accessed through a program made in Microsoft Access (Figure 2). The focus of the Fall 2017 internship tour was to identify the components that were the root cause of failure from the given random-failure data, develop a taxonomy for the database, and attach material headings to the component list. Secondary objectives included verifying the integrity of the data in MaRS, eliminating any part discrepancies, and generating documentation for future reference. Due to the nature of the random-failure data, data mining had to be done manually without the assistance of an automated program to ensure positive identification.

  10. A complex network-based importance measure for mechatronics systems

    NASA Astrophysics Data System (ADS)

    Wang, Yanhui; Bi, Lifeng; Lin, Shuai; Li, Man; Shi, Hao

    2017-01-01

    In view of the negative impact of functional dependency, this paper attempts to provide an alternative importance measure called Improved-PageRank (IPR) for measuring the importance of components in mechatronics systems. IPR is a meaningful extension of the centrality measures in complex networks, which considers the usage reliability of components and the functional dependency between components to increase the usefulness of importance measures. Our work makes two important contributions. First, this paper integrates the literature of mechatronic architecture and complex networks theory to define the component network. Second, based on the notion of the component network, a meaningful IPR is brought into the identification of important components. In addition, the IPR component importance measure, and an algorithm to perform stochastic ordering of components due to the time-varying nature of usage reliability and functional dependency, are illustrated with a component network of a bogie system consisting of 27 components.
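
    The paper's exact IPR formula is not reproduced in the abstract, but the underlying idea, PageRank-style centrality over a component dependency network, with component weights (such as usage reliabilities) biasing the ranking, can be sketched generically as follows. The graph, weights, and parameter choices below are illustrative assumptions.

```python
def pagerank(adj, damping=0.85, weights=None, iters=100):
    """Basic PageRank by power iteration. adj maps each component to the
    list of components that functionally depend on it. Optional `weights`
    (e.g. usage reliabilities, in the spirit of IPR) bias the teleport
    distribution. A generic sketch, not the paper's exact IPR algorithm."""
    nodes = list(adj)
    if weights is None:
        weights = {v: 1.0 for v in nodes}
    wsum = sum(weights.values())
    base = {v: weights[v] / wsum for v in nodes}
    rank = dict(base)
    for _ in range(iters):
        new = {v: (1.0 - damping) * base[v] for v in nodes}
        for v in nodes:
            out = adj[v]
            if out:
                share = damping * rank[v] / len(out)
                for u in out:
                    new[u] += share
            else:  # dangling node: redistribute its mass by the base weights
                for u in nodes:
                    new[u] += damping * rank[v] * base[u]
        rank = new
    return rank
```

    A component on which many others depend accumulates rank, matching the intuition that its failure propagates widely through the system.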

  11. An overview of fatigue failures at the Rocky Flats Wind System Test Center

    NASA Technical Reports Server (NTRS)

    Waldon, C. A.

    1981-01-01

    Potential small wind energy conversion (SWECS) design problems were identified to improve product quality and reliability. Mass produced components such as gearboxes, generators, bearings, etc., are generally reliable due to their widespread uniform use in other industries. The likelihood of failure increases, though, in the interfacing of these components and in SWECS components designed for a specific system use. Problems relating to the structural integrity of such components are discussed and analyzed with techniques currently used in quality assurance programs in other manufacturing industries.

  12. Psychometric Properties of Performance-based Measurements of Functional Capacity: Test-Retest Reliability, Practice Effects, and Potential Sensitivity to Change

    PubMed Central

    Leifker, Feea R.; Patterson, Thomas L.; Bowie, Christopher R.; Mausbach, Brent T.; Harvey, Philip D.

    2010-01-01

    Performance-based measures of the ability to perform social and everyday living skills are being more widely used to assess functional capacity in people with serious mental illnesses such as schizophrenia and bipolar disorder. Since they are also being used as outcome measures in pharmacological and cognitive remediation studies aimed at cognitive impairments in schizophrenia, understanding their measurement properties and potential sensitivity to change is important. In this study, the test-retest reliability, practice effects, and reliable change indices of two different performance-based functional capacity measures, the UCSD Performance-Based Skills Assessment (UPSA) and the Social Skills Performance Assessment (SSPA), were examined over several different retest intervals in two different samples of people with schizophrenia (n’s=238 and 116) and a healthy comparison sample (n=109). These psychometric properties were compared to those of a neuropsychological (NP) assessment battery. Test-retest reliabilities of the long form of the UPSA ranged from r=.63 to r=.80 over follow-up periods up to 36 months in people with schizophrenia, while brief UPSA reliabilities ranged from r=.66 to r=.81. Test-retest reliability of the NP performance scores ranged from r=.77 to r=.79. Test-retest reliabilities of the UPSA were lower in healthy controls, while NP performance was slightly more reliable. SSPA test-retest reliability was lower. Practice effect sizes ranged from .05 to .16 for the UPSA and .07 to .19 for the NP assessment in patients, with healthy controls showing larger practice effects. Reliable change intervals were consistent across the NP battery and both functional capacity measures, indicating equal potential for detection of change. These performance-based measures of functional capacity appear to have similar potential to be sensitive to change compared to NP performance in people with schizophrenia. PMID:20399613
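
    The reliable change indices mentioned above are conventionally computed with the Jacobson-Truax formula: the observed change divided by the standard error of the difference, which is derived from the baseline standard deviation and the test-retest reliability. The scores and parameters below are hypothetical, not the study's data.

```python
import math

def reliable_change_index(x1, x2, sd_baseline, retest_r):
    """Jacobson-Truax reliable change index. |RCI| > 1.96 indicates a
    change larger than expected from measurement error at the 5% level."""
    se_measure = sd_baseline * math.sqrt(1.0 - retest_r)
    se_diff = math.sqrt(2.0) * se_measure
    return (x2 - x1) / se_diff

# Hypothetical UPSA-like scores: baseline 75, retest 85, SD 12, r = .78
rci = reliable_change_index(75, 85, 12, 0.78)
```

    Higher test-retest reliability shrinks the error band, so the same raw change is more likely to count as reliable change, which is why reliability and sensitivity to change are assessed together above.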

  13. Ceramic component reliability with the restructured NASA/CARES computer program

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Starlinger, Alois; Gyekenyesi, John P.

    1992-01-01

    The Ceramics Analysis and Reliability Evaluation of Structures (CARES) integrated design program on statistical fast fracture reliability of monolithic ceramic components is enhanced to include the use of a neutral data base, two-dimensional modeling, and variable problem size. The data base allows for the efficient transfer of element stresses, temperatures, and volumes/areas from the finite element output to the reliability analysis program. Elements are divided to ensure a direct correspondence between the subelements and the Gaussian integration points. Two-dimensional modeling is accomplished by assessing the volume flaw reliability with shell elements. To demonstrate the improvements in the algorithm, example problems are selected from a round-robin conducted by WELFEP (WEakest Link failure probability prediction by Finite Element Postprocessors).

  14. Transit ridership, reliability, and retention.

    DOT National Transportation Integrated Search

    2008-10-01

    This project explores two major components that affect transit ridership: travel time reliability and rider : retention. It has been recognized that transit travel time reliability may have a significant impact on : attractiveness of transit to many ...

  15. Adaptation of the low-cost and low-power tactical split Stirling cryogenic cooler for aerospace applications

    NASA Astrophysics Data System (ADS)

    Veprik, A.; Zechtzer, S.; Pundak, N.; Kirkconnell, C.; Freeman, J.; Riabzev, S.

    2011-06-01

    Cryogenic coolers are often used in modern spacecraft in conjunction with sensitive electronics and sensors of military, commercial and scientific instrumentation. The typical space requirements are: power efficiency, low vibration export, proven reliability, ability to survive launch vibration/shock and long-term exposure to space radiation. A long-standing paradigm of exclusively using "space heritage" equipment has become the standard practice for delivering high reliability components. Unfortunately, this conservative "space heritage" practice can result in using outdated, oversized, overweight and overpriced cryogenic coolers and is becoming increasingly unacceptable for space agencies now operating within tough monetary and time constraints. The recent trend in developing mini and micro satellites for relatively inexpensive missions has prompted attempts to adapt leading-edge tactical cryogenic coolers for suitability in the space environment. The primary emphasis has been on reducing cost, weight and size. The authors are disclosing theoretical and practical aspects of a collaborative effort to develop a space qualified cryogenic refrigerator system based on the tactical cooler model Ricor K527 and the Iris Technology radiation hardened Low Cost Cryocooler Electronics (LCCE). The K527/LCCE solution is ideal for applications where cost, size, weight, power consumption, vibration export, reliability and time to spacecraft integration are of concern.

  16. The Stanford Leisure-Time Activity Categorical Item (L-Cat): A single categorical item sensitive to physical activity changes in overweight/obese women

    PubMed Central

    Kiernan, Michaela; Schoffman, Danielle E.; Lee, Katherine; Brown, Susan D.; Fair, Joan M.; Perri, Michael G.; Haskell, William L.

    2015-01-01

    Background Physical activity is essential for chronic disease prevention, yet <40% of overweight/obese adults meet national activity recommendations. For time-efficient counseling, clinicians need a brief easy-to-use tool that reliably and validly assesses a full range of activity levels, and most importantly, is sensitive to clinically meaningful changes in activity. The Stanford Leisure-Time Activity Categorical Item (L-Cat) is a single item comprised of six descriptive categories ranging from inactive to very active. This novel methodological approach assesses national activity recommendations as well as multiple clinically relevant categories below and above recommendations, and incorporates critical methodological principles that enhance psychometrics (reliability, validity, sensitivity to change). Methods We evaluated the L-Cat’s psychometrics among 267 overweight/obese women asked to meet national activity recommendations in a randomized behavioral weight-loss trial. Results The L-Cat had excellent test-retest reliability (κ=0.64, P<.001) and adequate concurrent criterion validity; each L-Cat category at 6 months was associated with 1059 more daily pedometer steps (95% CI 712–1407, β=0.38, P<.001) and 1.9% greater initial weight loss at 6 months (95% CI −2.4 to −1.3, β=−0.38, P<.001). Of interest, L-Cat categories differentiated from each other in a dose-response gradient for steps and weight loss (Ps<.05) with excellent face validity. The L-Cat was sensitive to change in response to the trial’s activity component. Women increased one L-Cat category at 6 months (M=1.0±1.4, P<.001); 55.8% met recommendations at 6 months whereas 20.6% did at baseline (P<.001). Even among women not meeting recommendations at both baseline and 6 months (n=106), women who moved ≥1 L-Cat categories at 6 months lost more weight than those who did not (M=−4.6%, 95% CI −6.7 to −2.5, P<.001). 
Conclusions Given strong psychometrics, the L-Cat has timely potential for clinical use such as tracking activity changes via electronic medical records especially among overweight/obese populations unable or unlikely to reach national recommendations. PMID:23588625

  17. The Stanford Leisure-Time Activity Categorical Item (L-Cat): a single categorical item sensitive to physical activity changes in overweight/obese women.

    PubMed

    Kiernan, M; Schoffman, D E; Lee, K; Brown, S D; Fair, J M; Perri, M G; Haskell, W L

    2013-12-01

    Physical activity is essential for chronic disease prevention, yet <40% of overweight/obese adults meet the national activity recommendations. For time-efficient counseling, clinicians need a brief, easy-to-use tool that reliably and validly assesses a full range of activity levels, and, most importantly, is sensitive to clinically meaningful changes in activity. The Stanford Leisure-Time Activity Categorical Item (L-Cat) is a single item comprising six descriptive categories ranging from inactive to very active. This novel methodological approach assesses national activity recommendations as well as multiple clinically relevant categories below and above the recommendations, and incorporates critical methodological principles that enhance psychometrics (reliability, validity and sensitivity to change). We evaluated the L-Cat's psychometrics among 267 overweight/obese women who were asked to meet the national activity recommendations in a randomized behavioral weight-loss trial. The L-Cat had excellent test-retest reliability (κ=0.64, P<0.001) and adequate concurrent criterion validity; each L-Cat category at 6 months was associated with 1059 more daily pedometer steps (95% CI 712-1407, β=0.38, P<0.001) and 1.9% greater initial weight loss at 6 months (95% CI -2.4 to -1.3, β=-0.38, P<0.001). Of interest, L-Cat categories differentiated from each other in a dose-response gradient for steps and weight loss (Ps<0.05) with excellent face validity. The L-Cat was sensitive to change in response to the trial's activity component. Women increased one L-Cat category at 6 months (M=1.0±1.4, P<0.001); 55.8% met the recommendations at 6 months whereas 20.6% did at baseline (P<0.001). Even among women not meeting the recommendations at both baseline and 6 months (n=106), women who moved ≥1 L-Cat categories at 6 months lost more weight than those who did not (M=-4.6%, 95% CI -6.7 to -2.5, P<0.001).
Given strong psychometrics, the L-Cat has timely potential for clinical use such as tracking activity changes via electronic medical records, especially among overweight/obese populations who are unable or unlikely to reach national recommendations.

  18. Assessing the validity and reliability of three indicators self-reported on the pregnancy risk assessment monitoring system survey.

    PubMed

    Ahluwalia, Indu B; Helms, Kristen; Morrow, Brian

    2013-01-01

    We investigated the reliability and validity of three self-reported indicators from the Pregnancy Risk Assessment Monitoring System (PRAMS) survey. We used 2008 PRAMS (n=15,646) data from 12 states that had implemented the 2003 revised U.S. Certificate of Live Birth. We estimated reliability by kappa coefficient and validity by sensitivity and specificity using the birth certificate data as the reference for the following: prenatal participation in the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC); Medicaid payment for delivery; and breastfeeding initiation. These indicators were examined across several demographic subgroups. The reliability was high for all three measures: 0.81 for WIC participation, 0.67 for Medicaid payment of delivery, and 0.72 for breastfeeding initiation. The validity of PRAMS indicators was also high: WIC participation (sensitivity = 90.8%, specificity = 90.6%), Medicaid payment for delivery (sensitivity = 82.4%, specificity = 85.6%), and breastfeeding initiation (sensitivity = 94.3%, specificity = 76.0%). The prevalence estimates were higher on PRAMS than the birth certificate for each of the indicators except Medicaid-paid delivery among non-Hispanic black women. Kappa values within most subgroups remained in the moderate range (0.40-0.80). Sensitivity and specificity values were lower for Hispanic women who responded to the PRAMS survey in Spanish and for breastfeeding initiation among women who delivered very low birthweight and very preterm infants. The validity and reliability of the PRAMS data for measures assessed were high. Our findings support the use of PRAMS data for epidemiological surveillance, research, and planning.
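
    As an illustration of the statistics reported above, kappa, sensitivity and specificity can all be computed from a 2×2 agreement table comparing a self-reported indicator against the reference record. The counts below are hypothetical, not the PRAMS data:

```python
# Illustrative only: agreement statistics (Cohen's kappa, sensitivity,
# specificity) from a hypothetical 2x2 table comparing a survey response
# against a birth-certificate reference.

def agreement_stats(tp, fp, fn, tn):
    """Return (kappa, sensitivity, specificity) for a 2x2 table."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                          # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)   # chance agreement on "yes"
    p_no = ((fn + tn) / n) * ((fp + tn) / n)    # chance agreement on "no"
    pe = p_yes + p_no
    kappa = (po - pe) / (1 - pe)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return kappa, sensitivity, specificity

k, se, sp = agreement_stats(tp=450, fp=40, fn=30, tn=480)
```

    The same function reproduces the pattern in the abstract: kappa can sit in the "moderate" band even when sensitivity and specificity both look high, because kappa discounts agreement expected by chance.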

  19. Performances and reliability predictions of optical data transmission links using a system simulator for aerospace applications

    NASA Astrophysics Data System (ADS)

    Bechou, L.; Deshayes, Y.; Aupetit-Berthelemot, C.; Guerin, A.; Tronche, C.

    Space missions for Earth Observation are called upon to carry a growing number of instruments in their payload, with ever-increasing performance. Future space systems are therefore expected to generate huge amounts of data, and a key challenge in the coming years will lie in the ability to transmit that significant quantity of data to the ground. Very high data rate Payload Telemetry (PLTM) systems will thus be required to meet the demands of future Earth Exploration Satellite Systems, and reliability is one of the major concerns for such systems. An attractive approach associated with the concept of predictive modeling consists of analyzing the impact of component malfunctions on optical link performance, taking into account the network requirements and experimental degradation laws. Reliability estimation is traditionally based on life-testing, and a basic approach is to use the Telcordia requirements (GR-468) for optical telecommunication applications. However, due to the various interactions between components, the operating lifetime of a system cannot be taken as the lifetime of its least reliable component. In this paper, an original methodology is proposed to estimate the reliability of an optical communication system by using a dedicated system simulator for predictive modeling and design for reliability. First, we present frameworks for point-to-point optical communication systems for space applications where high data rates (or frequency bandwidth), lower cost or mass savings are needed. The optoelectronic devices used in these systems can be similar to those found in terrestrial optical networks. In particular, we report simulation results of transmission performance after introducing DFB laser diode parameter variations versus time, extrapolated from accelerated tests based on terrestrial or submarine telecommunications qualification standards. 
Simulations are performed to investigate and predict the consequences of degradations of the laser diode (acting as a frequency carrier) on system performance (eye diagram, quality factor and BER). The studied link consists of 4 × 2.5 Gbit/s WDM channels with direct modulation, equally spaced (0.8 nm) around the 1550 nm central wavelength. Results clearly show that variation of fundamental parameters such as bias current or central wavelength penalizes the dynamic performance of the complete WDM link. In addition, different degradation kinetics of aged laser diodes from the same batch have been implemented to build the final distribution of Q-factor and BER values after 25 years. When considering long optical distances, fiber attenuation, EDFA noise, dispersion, PMD, etc. penalize network performance, which can be compensated using Forward Error Correction (FEC) coding. Three methods have been investigated in the case of On-Off Keying (OOK) transmission over a unipolar optical channel corrupted by Gaussian noise. Such system simulations highlight the impact of component parameter degradations on overall network performance, making it possible to optimize various time- and cost-consuming sensitivity analyses at the early stage of system development. Thus the validity of failure criteria in relation to mission profiles can be evaluated, representing a significant part of the general PDfR effort, in particular for aerospace applications.

  20. Discrete component bonding and thick film materials study

    NASA Technical Reports Server (NTRS)

    Kinser, D. L.

    1975-01-01

    The results of an investigation of discrete component bonding reliability and of a fundamental study of new thick film resistor materials are summarized. The component bonding study examined several types of solder bonded components, with some processing-variable studies to determine their influence upon bonding reliability. Bonding reliability was assessed using the thermal cycle: 15 minutes at room temperature, 15 minutes at +125 C, 15 minutes at room temperature, and 15 minutes at -55 C. The thick film resistor materials examined were of the transition metal oxide-phosphate glass family, with several elemental metal additions of the same transition metal. These studies were conducted by preparing a paste of the subject composition, then printing, drying, and firing using both air and reducing atmospheres. The resulting resistors were examined for adherence, resistance, thermal coefficient of resistance, and voltage coefficient of resistance.

  1. Probabilistic durability assessment of concrete structures in marine environments: Reliability and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Yu, Bo; Ning, Chao-lie; Li, Bing

    2017-03-01

    A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
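
    For intuition, the first-order reliability method has a closed form for a linear limit state with independent normal variables. A minimal sketch with hypothetical resistance/load values follows; the paper's chloride-ingress limit state is nonlinear and additionally requires the Nataf transformation to handle non-normal variables:

```python
# Minimal first-order reliability sketch for a linear limit state
# g = R - S (resistance minus load) with independent normal variables.
# A toy stand-in for the paper's chloride-ingress model; all parameter
# values below are hypothetical.
import math

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """Reliability index beta and failure probability Pf for g = R - S."""
    beta = (mu_r - mu_s) / math.sqrt(sig_r**2 + sig_s**2)
    # Standard normal CDF of -beta, via the complementary error function
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))
    return beta, pf

beta, pf = form_linear(mu_r=120.0, sig_r=12.0, mu_s=70.0, sig_s=16.0)
```

    For the general nonlinear, non-normal case described in the abstract, beta is instead found by iterating on the design point in standard normal space, which is what the gradient vector of the limit state function is needed for.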

  2. Reliability, Validity, and Sensitivity to Change of Turkish Activities-Specific Balance Confidence Scale in Patients with Unilateral Peripheral Vestibular Disease

    ERIC Educational Resources Information Center

    Karapolat, Hale; Eyigor, Sibel; Kirazli, Yesim; Celebisoy, Nese; Bilgen, Cem; Kirazli, Tayfun

    2010-01-01

    The aim of this study is to evaluate the internal consistency, test-retest reliability, construct validity, and sensitivity to change of the Activities-specific Balance Confidence Scale (ABC) in people with peripheral vestibular disorder. Thirty-three patients with unilateral peripheral vestibular disease were included in the study. Patients were…

  3. The Achievement of Therapeutic Objectives Scale: Interrater Reliability and Sensitivity to Change in Short-Term Dynamic Psychotherapy and Cognitive Therapy

    ERIC Educational Resources Information Center

    Valen, Jakob; Ryum, Truls; Svartberg, Martin; Stiles, Tore C.; McCullough, Leigh

    2011-01-01

    This study examined interrater reliability and sensitivity to change of the Achievement of Therapeutic Objectives Scale (ATOS; McCullough, Larsen, et al., 2003) in short-term dynamic psychotherapy (STDP) and cognitive therapy (CT). The ATOS is a process scale originally developed to assess patients' achievements of treatment objectives in STDP,…

  4. Capsule Typing of Haemophilus influenzae by Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry.

    PubMed

    Månsson, Viktor; Gilsdorf, Janet R; Kahlmeter, Gunnar; Kilian, Mogens; Kroll, J Simon; Riesbeck, Kristian; Resman, Fredrik

    2018-03-01

    Encapsulated Haemophilus influenzae strains belong to type-specific genetic lineages. Reliable capsule typing requires PCR, but a more efficient method would be useful. We evaluated capsule typing by using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry. Isolates of all capsule types (a-f and nontypeable; n = 258) and isogenic capsule transformants (types a-d) were investigated. Principal component and biomarker analyses of mass spectra showed clustering, and mass peaks correlated with capsule type-specific genetic lineages. We used 31 selected isolates to construct a capsule typing database. Validation with the remaining isolates (n = 227) showed 100% sensitivity and 92.2% specificity for encapsulated strains (a-f; n = 61). Blinded validation of a supplemented database (n = 50) using clinical isolates (n = 126) showed 100% sensitivity and 100% specificity for encapsulated strains (b, e, and f; n = 28). MALDI-TOF mass spectrometry is thus an accurate method for capsule typing of H. influenzae.

  5. Ag2S Quantum Dot-Sensitized Solar Cells by First Principles: The Effect of Capping Ligands and Linkers.

    PubMed

    Amaya Suárez, Javier; Plata, Jose J; Márquez, Antonio M; Fernández Sanz, Javier

    2017-09-28

    Quantum dot solar cells (QDSCs) are one of the candidates for being a reliable alternative to fossil fuels. However, the well-studied CdSe- and CdTe-based QDSCs present a variety of issues for their use in consumer-goods applications. Silver sulfide, Ag2S, is a promising material, but poor efficiency has been reported for QDSCs based on this compound. The potential influence of each component of QDSCs is critical and key for the development of more efficient devices based on Ag2S. In this work, density functional theory calculations were performed to study the nature of the optoelectronic properties of an anatase-TiO2(101) surface sensitized with different silver sulfide nanoclusters. We demonstrated how it is possible to deeply tune its electronic properties by modifying the capping ligands and linkers to the surface. Finally, an analysis of the electron injection mechanism for this system is presented.

  6. Fiber Optic Experience with the Smart Actuation System on the F-18 Systems Research Aircraft

    NASA Technical Reports Server (NTRS)

    Zavala, Eddie

    1997-01-01

    High bandwidth, immunity to electromagnetic interference, and potential weight savings have led to the development of fiber optic technology for future aerospace vehicle systems. This technology has been incorporated in a new smart actuator as the primary communication interface. The use of fiber optics simplified system integration and significantly reduced wire count. Flight test results showed that fiber optics could be used in aircraft systems and identified critical areas of development of fly-by-light technology. This paper documents the fiber optic experience gained as a result of this program, and identifies general design considerations that could be used in a variety of specific applications of fiber optic technology. Environmental sensitivities of fiber optic system components that significantly contribute to optical power variation are discussed. Although a calibration procedure successfully minimized the effect of fiber optic sensitivities, more standardized calibration methods are needed to ensure system operation and reliability in future aerospace vehicle systems.

  7. Life and reliability models for helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Knorr, R. J.; Coy, J. J.

    1982-01-01

    Computer models of life and reliability are presented for planetary gear trains with a fixed ring gear, input applied to the sun gear, and output taken from the planet arm. For this transmission the input and output shafts are coaxial, and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. The reliability model is based on the Weibull distributions of the individual reliabilities of the transmission components. The system model is also a Weibull distribution. The load-versus-life model for the system is a power relationship, as are the models for the individual components. The load-life exponent and basic dynamic capacity are developed as functions of the component capacities. The models are used to compare three- and four-planet, 150 kW (200 hp), 5:1 reduction transmissions with 1500 rpm input speed to illustrate their use.
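
    A series-system model built from component Weibull reliabilities, like the one described, can be sketched as the product of independent component survival probabilities. The component parameters below are hypothetical, not the paper's gear data:

```python
# Series-system reliability from two-parameter Weibull components.
# Hypothetical scale (eta) and shape (beta) parameters, for illustration.
import math

def weibull_rel(life, eta, beta):
    """Two-parameter Weibull reliability (survival) at a given life."""
    return math.exp(-((life / eta) ** beta))

def system_rel(life, components):
    """Series system: product of independent component reliabilities."""
    r = 1.0
    for eta, beta in components:
        r *= weibull_rel(life, eta, beta)
    return r

# Hypothetical sun gear, three identical planet gears, and a bearing set
comps = [(4000.0, 1.2), (3500.0, 1.2), (3500.0, 1.2),
         (3500.0, 1.2), (6000.0, 1.5)]
r_sys = system_rel(1000.0, comps)   # system reliability at 1000 h
```

    Because every component must survive, the series-system reliability is always below that of the weakest component, which is why adding a fourth planet changes the system Weibull parameters rather than simply averaging component lives.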

  8. Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.

  9. Reliability and availability analysis of a 10 kW@20 K helium refrigerator

    NASA Astrophysics Data System (ADS)

    Li, J.; Xiong, L. Y.; Liu, L. Q.; Wang, H. R.; Wang, B. M.

    2017-02-01

    A 10 kW@20 K helium refrigerator has been established in the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. To evaluate and improve this refrigerator’s reliability and availability, a reliability and availability analysis is performed. According to the mission profile of this refrigerator, a functional analysis is performed. The failure data of the refrigerator components are collected and failure rate distributions are fitted by software Weibull++ V10.0. A Failure Modes, Effects & Criticality Analysis (FMECA) is performed and the critical components with higher risks are pointed out. Software BlockSim V9.0 is used to calculate the reliability and the availability of this refrigerator. The result indicates that compressors, turbine and vacuum pump are the critical components and the key units of this refrigerator. The mitigation actions with respect to design, testing, maintenance and operation are proposed to decrease those major and medium risks.
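
    As a toy illustration of the availability side of such an analysis, steady-state availability of repairable units can be estimated from MTBF and MTTR, with the critical units treated in series. The figures below are hypothetical, not the refrigerator's actual failure data:

```python
# Steady-state availability sketch: A = MTBF / (MTBF + MTTR), with the
# critical units in series (all must be up). MTBF/MTTR values are
# hypothetical, for illustration only.

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of a single repairable unit."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical figures for the critical units named above
units = {
    "compressor":  (8000.0, 48.0),   # (MTBF h, MTTR h)
    "turbine":     (12000.0, 72.0),
    "vacuum_pump": (15000.0, 24.0),
}

a_sys = 1.0
for mtbf, mttr in units.values():
    a_sys *= availability(mtbf, mttr)   # series combination
```

    Even when each unit is individually above 99% available, the series product drops noticeably, which is why mitigation effort concentrates on the few critical units identified by the FMECA.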

  10. Developing Ultra Reliable Life Support for the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2009-01-01

    Recycling life support systems can achieve ultra reliability by using spares to replace failed components. The added mass for spares is approximately equal to the original system mass, provided the original system reliability is not very low. Acceptable reliability can be achieved for the space shuttle and space station by preventive maintenance and by replacing failed units. However, this maintenance and repair depends on a logistics supply chain that provides the needed spares. The Mars mission must take all the needed spares at launch. The Mars mission also must achieve ultra reliability, a very low failure rate per hour, since it requires years rather than weeks and cannot be cut short if a failure occurs. Also, the Mars mission has a much higher mass launch cost per kilogram than shuttle or station. Achieving ultra reliable space life support with acceptable mass will require a well-planned and extensive development effort. Analysis must define the reliability requirement and allocate it to subsystems and components. Technologies, components, and materials must be designed and selected for high reliability. Extensive testing is needed to ascertain very low failure rates. Systems design should segregate the failure causes in the smallest, most easily replaceable parts. The systems must be designed, produced, integrated, and tested without impairing system reliability. Maintenance and failed unit replacement should not introduce any additional probability of failure. The overall system must be tested sufficiently to identify any design errors. A program to develop ultra reliable space life support systems with acceptable mass must start soon if it is to produce timely results for the moon and Mars.
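
    The trade between spares mass and failure rate can be sketched with a Poisson failure model: the mission succeeds if the number of failures of a component does not exceed the spares carried. All numbers below are hypothetical, for illustration only:

```python
# Poisson spares sketch: P(mission success) = P(failures <= spares)
# for a component with a constant failure rate. Rate and mission
# duration below are hypothetical.
import math

def mission_success_prob(failure_rate_per_hour, mission_hours, spares):
    """P(no more than `spares` failures) under a Poisson failure process."""
    lam = failure_rate_per_hour * mission_hours   # expected failure count
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(spares + 1))

# Hypothetical: 1 failure per 10,000 h over a ~21,900 h (30-month) mission
p = mission_success_prob(1e-4, 21900.0, spares=4)
```

    The sketch shows why ultra reliability matters for Mars: with no resupply, driving the per-hour failure rate down buys far more success probability per kilogram than stacking additional spares once the expected failure count is already small.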

  11. Reliability Modeling of Microelectromechanical Systems Using Neural Networks

    NASA Technical Reports Server (NTRS)

    Perera, J. Sebastian

    2000-01-01

    Microelectromechanical systems (MEMS) are a broad and rapidly expanding field that is currently receiving a great deal of attention because of the potential to significantly improve the ability to sense, analyze, and control a variety of processes, such as heating and ventilation systems, automobiles, medicine, aeronautical flight, military surveillance, weather forecasting, and space exploration. MEMS are very small and are a blend of electrical and mechanical components, with electrical and mechanical systems on one chip. This research establishes reliability estimation and prediction for MEMS devices at the conceptual design phase using neural networks. At the conceptual design phase, before devices are built and tested, traditional methods of quantifying reliability are inadequate because the device is not in existence and cannot be tested to establish the reliability distributions. A novel approach using neural networks is created to predict the overall reliability of a MEMS device based on its components and each component's attributes. The methodology begins with collecting attribute data (fabrication process, physical specifications, operating environment, property characteristics, packaging, etc.) and reliability data for many types of microengines. The data are partitioned into training data (the majority) and validation data (the remainder). A neural network is applied to the training data (both attribute and reliability); the attributes become the system inputs and the reliability data (cycles to failure), the system output. After the neural network is trained with sufficient data, the validation data are used to verify that the neural network provides accurate reliability estimates. The reliability of a new proposed MEMS device can then be estimated by using the appropriate trained neural networks developed in this work.

  12. Using remote sensing and GIS techniques to estimate discharge and recharge fluxes for the Death Valley regional groundwater flow system, USA

    USGS Publications Warehouse

    D'Agnese, F. A.; Faunt, C. C.; Turner, A. K.

    1996-01-01

    The recharge and discharge components of the Death Valley regional groundwater flow system were defined by remote sensing and GIS techniques that integrated disparate data types to develop a spatially complex representation of near-surface hydrological processes. Image classification methods were applied to multispectral satellite data to produce a vegetation map. This map provided a basis for subsequent evapotranspiration and infiltration estimations. The vegetation map was combined with ancillary data in a GIS to delineate different types of wetlands, phreatophytes and wet playa areas. Existing evapotranspiration-rate estimates were then used to calculate discharge volumes for these areas. A previously used empirical method of groundwater recharge estimation was modified by GIS methods to incorporate data describing soil-moisture conditions, and a recharge potential map was produced. These discharge and recharge maps were readily converted to data arrays for numerical modelling codes. Inverse parameter estimation techniques also used these data to evaluate the reliability and sensitivity of the estimated values.

  13. Reliability and Maintainability Data for Lead Lithium Cooling Systems

    DOE PAGES

    Cadwallader, Lee

    2016-11-16

    This article presents component failure rate data for use in assessment of lead lithium cooling systems. Best estimate data applicable to this liquid metal coolant is presented. Repair times for similar components are also referenced in this work. These data support probabilistic safety assessment and reliability, availability, maintainability and inspectability analyses.

  14. Component Structure, Reliability, and Stability of Lawrence's Self-Esteem Questionnaire (LAWSEQ)

    ERIC Educational Resources Information Center

    Rae, Gordon; Dalto, Georgia; Loughrey, Dolores; Woods, Caroline

    2011-01-01

    Lawrence's Self-Esteem Questionnaire (LAWSEQ) was administered to 120 Year 1 pupils in six schools in Belfast, Northern Ireland. A principal components analysis indicated that the scale items were unidimensional and that the reliability of the scores, as estimated by Cronbach's alpha, was satisfactory ([alpha] = 0.73). There were no differences…

  15. Analysis of feline and canine allergen components in patients sensitized to pets.

    PubMed

    Ukleja-Sokołowska, Natalia; Gawrońska-Ukleja, Ewa; Żbikowska-Gotz, Magdalena; Socha, Ewa; Lis, Kinga; Sokołowski, Łukasz; Kuźmiński, Andrzej; Bartuzi, Zbigniew

    2016-01-01

    Component-resolved allergen diagnosis allows for a precise evaluation of the sensitization profiles of patients sensitized to felines and canines. An accurate interpretation of these results allows better insight into the evolution of a given patient's sensitizations, and allows for a more precise evaluation of their prognoses. 70 patients (42 women and 28 men, aged 18-65, with an average age of 35.5) with a positive feline or canine allergy diagnosis were included in the research group. 30 patients with a negative allergy diagnosis were included in the control group. The total IgE levels of all patients with allergies, as well as their allergen-specific IgE to feline and canine allergens, were measured. Specific IgE levels to canine (Can f 1, Can f 2, Can f 3, Can f 5) and feline (Fel d 1, Fel d 2, Fel d 4) allergen components were also measured with the use of the ImmunoCAP method. Monosensitization to only one canine or feline component was found in 30% of patients. As predicted, the main feline allergen was Fel d 1, which sensitized as many as 93.9% of patients sensitized to felines. Among the 65 patients sensitized to at least one feline component, for 30 patients (46.2%) the only sensitizing feline component was Fel d 1. Only 19 patients in that group (63.3%) were not simultaneously sensitized to dogs; the remaining 11 (36.7%), the isolated sensitization to feline Fel d 1 notwithstanding, displayed concurrent sensitization to one of the canine allergen components. Fel d 4 sensitized 49.2% of the research group. 64.3% of patients sensitized to canine components had heightened levels of specific IgE to Can f 1. Monosensitization in that group occurred for 32.1% of the patients. Sensitization to Can f 5 was observed among 52.4% of the patients. Concurrent sensitizations to a few allergen components, not only cross-reactive but also originating in different protein families, are a significant problem for patients sensitized to animals.

  16. MaRS Project

    NASA Technical Reports Server (NTRS)

    Aruljothi, Arunvenkatesh

    2016-01-01

    The Space Exploration Division of the Safety and Mission Assurance Directorate is responsible for reducing the risk to Human Space Flight Programs by providing system safety, reliability, and risk analysis. The Risk & Reliability Analysis branch plays a part in this by utilizing Probabilistic Risk Assessment (PRA) and Reliability and Maintainability (R&M) tools to identify possible types of failure and effective solutions. A continuous effort of this branch is MaRS, or Mass and Reliability System, a tool that was the focus of this internship. Future long-duration space missions will have to find a balance between the mass and reliability of their spare parts. They will be unable to take spares of everything and will have to determine what is most likely to require maintenance and spares. Currently there is no database that combines mass and reliability data for low-level space-grade components. MaRS aims to be the first database to do this. The data in MaRS will be based on the hardware flown on the International Space Station (ISS). The components on the ISS have a long history and are well documented, making them the perfect source. Currently, MaRS is a functioning Excel workbook database; the backend is complete and only requires optimization. MaRS has been populated with all the assemblies and their components that are used on the ISS; the failures of these components are updated regularly. This project was a continuation of the efforts of previous intern groups. Once complete, R&M engineers working on future space flight missions will be able to quickly access failure and mass data on assemblies and components, allowing them to make important decisions and tradeoffs.

  17. Analytical models for coupling reliability in identical two-magnet systems during slow reversals

    NASA Astrophysics Data System (ADS)

    Kani, Nickvash; Naeemi, Azad

    2017-12-01

    This paper follows previous works which investigated the strength of dipolar coupling in two-magnet systems. While those works focused on qualitative analyses, this manuscript elucidates reversal through dipolar coupling, culminating in analytical expressions for reversal reliability in identical two-magnet systems. The dipolar field generated by a mono-domain magnetic body can be represented by a tensor containing both longitudinal and perpendicular field components; this field changes orientation and magnitude based on the magnetization of neighboring nanomagnets. While the dipolar field does reduce to its longitudinal component at short time-scales, for slow magnetization reversals the simple longitudinal field representation greatly underestimates the scope of parameters that ensure reliable coupling. For the first time, analytical models that map the geometric and material parameters required for reliable coupling in two-magnet systems are developed. It is shown that in biaxial nanomagnets, the x̂ and ŷ components of the dipolar field contribute to the coupling, while all three dimensions contribute to the coupling between a pair of uniaxial magnets. Additionally, the ratio of the longitudinal and perpendicular components of the dipolar field is also very important. If the perpendicular components in the dipolar tensor are too large, the nanomagnet pair may come to rest in an undesirable meta-stable state away from the free axis. The analytical models formulated in this manuscript map the minimum and maximum parameters for reliable coupling. Using these models, it is shown that there is a very small range of material parameters which can facilitate reliable coupling between perpendicular-magnetic-anisotropy nanomagnets; hence, in-plane nanomagnets are more suitable for coupled systems.

  18. A practical approach to spectral calibration of short wavelength infrared hyper-spectral imaging systems

    NASA Astrophysics Data System (ADS)

    Bürmen, Miran; Pernuš, Franjo; Likar, Boštjan

    2010-02-01

    Near-infrared spectroscopy is a promising, rapidly developing, reliable and noninvasive technique, used extensively in biomedicine and in the pharmaceutical industry. With the introduction of acousto-optic tunable filters (AOTF) and highly sensitive InGaAs focal plane sensor arrays, real-time high resolution hyper-spectral imaging has become feasible for a number of new biomedical in vivo applications. However, due to the specificity of the AOTF technology and the lack of spectral calibration standardization, maintaining long-term stability and compatibility of the acquired hyper-spectral images across different systems is still a challenging problem. Efficiently solving both is essential, as the majority of methods for analysis of hyper-spectral images rely on a priori knowledge extracted from large spectral databases, serving as the basis for reliable qualitative or quantitative analysis of various biological samples. In this study, we propose and evaluate fast and reliable spectral calibration of hyper-spectral imaging systems in the short wavelength infrared spectral region. The proposed spectral calibration method is based on light sources or materials exhibiting distinct spectral features, which enable robust non-rigid registration of the acquired spectra. The calibration accounts for all of the components of a typical hyper-spectral imaging system, such as the AOTF, light source, lens and optical fibers. The obtained results indicated that practical, fast and reliable spectral calibration of hyper-spectral imaging systems is possible, thereby assuring long-term stability and inter-system compatibility of the acquired hyper-spectral images.

  19. Hyperventilation in asthma: a validation study of the Nijmegen Questionnaire--NQ.

    PubMed

    Grammatopoulou, Eirini P; Skordilis, Emmanouil K; Georgoudis, Georgios; Haniotou, Aikaterini; Evangelodimou, Afroditi; Fildissis, George; Katsoulas, Theodoros; Kalagiakos, Panagiotis

    2014-10-01

    The Nijmegen questionnaire (NQ) has previously been used for screening for hyperventilation syndrome (HVS) in asthmatics. However, no validity study has been reported so far. To examine the validity and reliability of the NQ in asthma patients and identify the prevalence of HVS. The NQ (n = 162) was examined for translation, construct, cross-sectional and discriminant validity as well as for internal consistency and test-retest reliability. Principal component analysis and exploratory factor analysis revealed a single-factor solution with 11 items and 58.6% of explained variability. These 11 NQ items showed high internal consistency (Cronbach's alpha = 0.92) and test-retest reliability (IR = 0.98). Higher NQ scores were found in the following subgroups: women versus men (p < 0.01); participants with moderate versus mild asthma (p < 0.001) or uncontrolled versus controlled asthma (p < 0.001), and participants with breath-hold time (BHT) < 30 versus ≥ 30 s (p < 0.01) or end-tidal CO2 (ETCO2) ≤ 35 versus >35 mmHg (p < 0.001). A cut-off score of >17 discriminated the participants with regard to the presence of HVS. The NQ showed 92.73% sensitivity and 91.59% specificity. The total NQ score was significantly correlated with ETCO2 (r = -0.68), RR (r = 0.66) and BHT (r = -0.65). The prevalence of HVS was 34%. The NQ is a valid and reliable questionnaire for screening for HVS in patients with stable mild-to-moderate asthma.
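
    As an illustration of the internal-consistency statistic reported above, the following sketch computes Cronbach's alpha from per-item scores. The data values here are invented for demonstration, not taken from the study.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` holds one list of scores per item, with respondents in the
    same order in every list.
    """
    k = len(items)
    # Sum of individual item variances.
    item_var_sum = sum(statistics.pvariance(scores) for scores in items)
    # Variance of each respondent's total score across all items.
    totals = [sum(resp) for resp in zip(*items)]
    total_var = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Two toy items that track each other closely yield a high alpha.
alpha = cronbach_alpha([[1, 2, 3], [2, 4, 6]])
```

    Values near or above 0.9, as in the 11-item NQ, indicate that the items measure a single underlying construct consistently.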

  20. The Spaeth/Richman contrast sensitivity test (SPARCS): design, reproducibility and ability to identify patients with glaucoma.

    PubMed

    Richman, Jesse; Zangalli, Camila; Lu, Lan; Wizov, Sheryl S; Spaeth, Eric; Spaeth, George L

    2015-01-01

    (1) To determine the ability of a novel, internet-based contrast sensitivity test titled the Spaeth/Richman Contrast Sensitivity Test (SPARCS) to identify patients with glaucoma. (2) To determine the test-retest reliability of SPARCS. A prospective, cross-sectional study of patients with glaucoma and controls was performed. Subjects were assessed by SPARCS and the Pelli-Robson chart. Reliability of each test was assessed by the intraclass correlation coefficient and the coefficient of repeatability. Sensitivity and specificity for identifying glaucoma were also evaluated. The intraclass correlation coefficient for SPARCS was 0.97 and 0.98 for Pelli-Robson. The coefficient of repeatability for SPARCS was ±6.7% and ±6.4% for Pelli-Robson. SPARCS identified patients with glaucoma with 79% sensitivity and 93% specificity. SPARCS has high test-retest reliability. It is easily accessible via the internet and identifies patients with glaucoma well. NCT01300949. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
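
    A hedged sketch of the two test-retest statistics used above, computed on invented scores: the coefficient of repeatability (approximately 1.96 × SD of the paired differences) and a two-way random-effects intraclass correlation, ICC(2,1).

```python
import statistics

def coefficient_of_repeatability(test, retest):
    """~1.96 x SD of test-retest differences (95% repeatability limit)."""
    diffs = [a - b for a, b in zip(test, retest)]
    return 1.96 * statistics.stdev(diffs)

def icc_2_1(test, retest):
    """Two-way random-effects, single-measure ICC(2,1) for two sessions."""
    n, k = len(test), 2
    data = list(zip(test, retest))
    grand = statistics.mean(test + retest)
    row_means = [statistics.mean(row) for row in data]          # per subject
    col_means = [statistics.mean(test), statistics.mean(retest)]  # per session
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

    With subjects who differ widely but repeat their own scores closely, the ICC approaches 1 while the coefficient of repeatability stays small, mirroring the pattern reported for SPARCS and Pelli-Robson.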

  1. Characterization of TiN coating layers using ultrasonic backward radiation.

    PubMed

    Song, Sung-Jin; Yang, Dong-Joo; Kim, Hak-Joon; Kwon, Sung D; Lee, Young-Ze; Kim, Ji-Yoon; Choi, Song-Chun

    2006-12-22

    Since ceramic layers coated on machinery components inevitably experience changes in their properties, it is necessary to evaluate the characteristics of ceramic coating layers nondestructively for the reliable use of coated components and for remaining-life prediction. To address such a need, in the present study, the ultrasonic backward radiation technique is applied to examine very thin TiN ceramic layers coated on AISI 1045 steel or austenitic 304 steel substrates. Specifically, the ultrasonic backward radiation profiles have been measured with variations in specimen preparation conditions such as coating layer thickness and sliding loading. In the experiments performed in the current study, the peak angle and the peak amplitude of the ultrasonic backward radiation profile varied sensitively according to the two specimen preparation conditions. This result demonstrates the strong potential of ultrasonic backward radiation as an effective tool for the nondestructive characterization of TiN ceramic coating layers even in such a thin regime.

  2. Interrater reliability of identifying indicators of posterior ligamentous complex disruption when plain films are indeterminate in thoracolumbar injuries.

    PubMed

    Schweitzer, Karl M; Vaccaro, Alexander R; Harrop, James S; Hurlbert, John; Carrino, John A; Rechtine, Glenn R; Schwartz, David G; Alanay, Ahmet; Sharma, Dinesh K; Anderson, D Greg; Lee, Joon Y; Arnold, Paul M

    2007-09-01

    The Spine Trauma Study Group (STSG) has proposed a novel thoracolumbar injury classification system and score (TLICS) in an attempt to define traumatic spinal injuries and direct appropriate management schemes objectively. The TLICS assigns specific point values based on three variables to generate a final severity score that guides potential treatment options. Within this algorithm, significant emphasis has been placed on posterior ligamentous complex (PLC) integrity. The purpose of this study was to determine the interrater reliability of indicators surgeons use when assessing PLC disruption on imaging studies, including computed tomography (CT) and magnetic resonance imaging (MRI). Orthopedic surgeons and neurosurgeons retrospectively reviewed a series of thoracolumbar injury case studies. Thirteen case studies, including images, were distributed to STSG members for individual, independent evaluation of the following three criteria: (1) diastasis of the facet joints on CT; (2) posterior edema-like signal in the region of PLC components on sagittal T2-weighted fat saturation (FAT SAT) MRI; and (3) disrupted PLC components on sagittal T1-weighted MRI. Interrater agreement on the presence or absence of each of the three criteria in each of the 13 cases was assessed. Absolute interrater percent agreement on diastasis of the facet joints on CT and posterior edema-like signal in the region of PLC components on sagittal T2-weighted FAT SAT MRI was similar (agreement 70.5%). Interrater agreement on disrupted PLC components on sagittal T1-weighted MRI was 48.9%. Facet joint diastasis on CT was the most reliable indicator of PLC disruption as assessed by both Cohen's kappa (kappa = 0.395) and intraclass correlation coefficient (ICC 0.430). The interrater reliability of assessing diastasis of the facet joints on CT had fair to moderate agreement. 
The reliability of assessing the posterior edema-like signal in the region of PLC components was lower but also fair, whereas the reliability of identifying disrupted PLC components was poor.

  3. Fuel Cell Balance-of-Plant Reliability Testbed Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sproat, Vern; LaHurd, Debbie

    Reliability of the fuel cell system balance-of-plant (BoP) components is a critical factor that needs to be addressed prior to fuel cells becoming fully commercialized. Failure or performance degradation of BoP components has been identified as a life-limiting factor in fuel cell systems. The goal of this project is to develop a series of test beds that will test system components such as pumps, valves, sensors, fittings, etc., under operating conditions anticipated in real Polymer Electrolyte Membrane (PEM) fuel cell systems. Results will be made generally available to begin removing reliability as a roadblock to the growth of the PEM fuel cell industry. Stark State College students participating in the project, in conjunction with their coursework, have been exposed to technical knowledge and training in the handling and maintenance of hydrogen, fuel cells and system components as well as component failure modes and mechanisms. Three test beds were constructed. Testing was completed on gas flow pumps, tubing, and pressure and temperature sensors and valves.

  4. Reliable change of the sensory organization test.

    PubMed

    Broglio, Steven P; Ferrara, Michael S; Sopiarz, Kay; Kelly, Michael S

    2008-03-01

    To establish the sensitivity and specificity of the NeuroCom Sensory Organization Test (SOT) and provide practitioners with cut-scores for clinical decision making using estimates of reliable change. Retrospective cohort study. Research laboratory. Healthy (n = 66) and concussed (n = 63) young adult participants. Postural control assessments on the NeuroCom SOT were completed twice (baseline and follow-up) for both groups. Postconcussion assessments were administered within 24 hours of injury diagnosis. The reliable change technique was used to calculate cut-scores for each SOT variable (composite balance; somatosensory, visual, and vestibular ratios) at the 95%, 90%, 85%, 80%, 75%, and 70% confidence interval levels. When cut-scores were applied to the post-concussion evaluations, sensitivity and specificity varied with SOT variable and confidence interval. An evaluation for change on one or more SOT variables resulted in the highest combined sensitivity (57%) and specificity (80%) at the 75% confidence interval. Use of reliable change scores to detect significant changes in performance on the SOT resulted in decreased sensitivity and improved specificity compared to a previous report. These findings indicate that some concussed athletes may not show large changes in postconcussion postural control, and this postural control evaluation should not be used to the exclusion of other assessment techniques. The postural control assessment should be combined with other evaluative measures to gain the highest sensitivity to concussive injuries.
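
    A minimal sketch of the reliable-change cut-score idea described above, in the common Jacobson and Truax form; the baseline SD and reliability values here are invented, not the SOT's.

```python
import math

def reliable_change_cutoff(sd_baseline, test_retest_r, z=1.96):
    """Smallest score change that exceeds measurement noise at level z."""
    sem = sd_baseline * math.sqrt(1 - test_retest_r)  # standard error of measurement
    s_diff = math.sqrt(2) * sem                       # SE of the difference score
    return z * s_diff

# 95% cutoff for an invented measure with SD = 10 and r = 0.75.
cutoff_95 = reliable_change_cutoff(10.0, 0.75)
```

    Lowering z (e.g., for a 75% rather than 95% confidence interval) shrinks the cutoff, flagging smaller changes as "reliable" and trading specificity for sensitivity, which is the trade-off the study above explores across interval levels.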

  5. Reliability of meat, fish, dairy, and egg intake over a 33-year interval in Adventist Health Study 2.

    PubMed

    Singh, Pramil N; Batech, Michael; Faed, Pegah; Jaceldo-Siegl, Karen; Martins, Marcia; Fraser, Gary E

    2014-01-01

    We studied Adventist Health Study 2 (AHS-2) cohort members to determine the reliability of long-term recall of adult dietary intake that occurred 33 years ago. Establishing the reliability of these measures supports studies of how dietary exposure across the life course affects risk of cancer and other noncommunicable disease outcomes. Among 1816 AHS-2 cohort members, we conducted a statistical comparison of long-term recall of meat, fish, dairy, and eggs at AHS-2 baseline with their report of current diet 33 years before AHS-2 baseline at an age of 30-60 years. Major findings are as follows: 1) a high correlation for frequency of red meat (R = 0.71), poultry (R = 0.67), and fish (R = 0.60); lower correlations for dairy (R = 0.19) and eggs (R = 0.28); 2) good concordance for dichotomous measures of red meat [sensitivity: 0.70; specificity: 0.92; positive predictive value (PPV): 0.91], poultry (sensitivity: 0.76; specificity: 0.87; PPV: 0.83), fish (sensitivity: 0.61; specificity: 0.93; PPV: 0.89), dairy (sensitivity: 0.95; specificity: 0.57; PPV: 0.99), and eggs (sensitivity: 0.95; specificity: 0.41; PPV: 0.96); negative predictive value for dairy and eggs was poor. Among older AHS-2 cohort members, we found good reliability of recall of red meat, poultry, and fish intake that occurred 33 years earlier.
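
    The concordance measures quoted above all derive from a 2x2 table of recalled versus originally reported intake. The following sketch computes them from made-up counts, purely for illustration.

```python
def concordance(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # recalled eaters among true eaters
        "specificity": tn / (tn + fp),  # recalled non-eaters among true non-eaters
        "ppv": tp / (tp + fp),          # true eaters among recalled eaters
        "npv": tn / (tn + fn),          # true non-eaters among recalled non-eaters
    }

# Invented counts: 60 true positives, 10 false positives,
# 20 false negatives, 90 true negatives.
metrics = concordance(60, 10, 20, 90)
```

    High sensitivity with low specificity, as the study reports for dairy and eggs, means nearly everyone is recalled as a consumer, which inflates PPV while leaving NPV poor.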

  6. Screening for frailty in older adults using a self-reported instrument.

    PubMed

    Nunes, Daniella Pires; Duarte, Yeda Aparecida de Oliveira; Santos, Jair Lício Ferreira; Lebrão, Maria Lúcia

    2015-01-01

    OBJECTIVE To validate a screening instrument using self-reported assessment of frailty syndrome in older adults. METHODS This cross-sectional study used data from the Saúde, Bem-estar e Envelhecimento study conducted in Sao Paulo, SP, Southeastern Brazil. The sample consisted of 433 older adult individuals (≥ 75 years) assessed in 2009. The self-reported instrument can be applied to older adults or their proxy respondents and consists of dichotomous questions directly related to each component of the frailty phenotype, which is considered the gold standard model: unintentional weight loss, fatigue, low physical activity, decreased physical strength, and decreased walking speed. The same classification proposed in the phenotype was utilized: not frail (no component identified); pre-frail (presence of one or two components), and frail (presence of three or more components). Because this is a screening instrument, "process of frailty" was included as a category (pre-frail and frail). Cronbach's α was used in psychometric analysis to evaluate the reliability and validity of the criterion, the sensitivity, the specificity, as well as positive and negative predictive values. Factor analysis was used to assess the suitability of the proposed number of components. RESULTS Decreased walking speed and decreased physical strength showed good internal consistency (α = 0.77 and 0.72, respectively); however, low physical activity was less satisfactory (α = 0.63). The sensitivity and specificity for identifying pre-frail individuals were 89.7% and 24.3%, respectively, while those for identifying frail individuals were 63.2% and 71.6%, respectively. In addition, 89.7% of the individuals from both the evaluations were identified in the "process of frailty" category. CONCLUSIONS The self-reported assessment of frailty can identify the syndrome among older adults and can be used as a screening tool. 
Its advantages include simplicity, rapidity, low cost, and ability to be used by different professionals.

  7. Screening for frailty in older adults using a self-reported instrument

    PubMed Central

    Nunes, Daniella Pires; Duarte, Yeda Aparecida de Oliveira; Santos, Jair Lício Ferreira; Lebrão, Maria Lúcia

    2015-01-01

    OBJECTIVE To validate a screening instrument using self-reported assessment of frailty syndrome in older adults. METHODS This cross-sectional study used data from the Saúde, Bem-estar e Envelhecimento study conducted in Sao Paulo, SP, Southeastern Brazil. The sample consisted of 433 older adult individuals (≥ 75 years) assessed in 2009. The self-reported instrument can be applied to older adults or their proxy respondents and consists of dichotomous questions directly related to each component of the frailty phenotype, which is considered the gold standard model: unintentional weight loss, fatigue, low physical activity, decreased physical strength, and decreased walking speed. The same classification proposed in the phenotype was utilized: not frail (no component identified); pre-frail (presence of one or two components), and frail (presence of three or more components). Because this is a screening instrument, “process of frailty” was included as a category (pre-frail and frail). Cronbach’s α was used in psychometric analysis to evaluate the reliability and validity of the criterion, the sensitivity, the specificity, as well as positive and negative predictive values. Factor analysis was used to assess the suitability of the proposed number of components. RESULTS Decreased walking speed and decreased physical strength showed good internal consistency (α = 0.77 and 0.72, respectively); however, low physical activity was less satisfactory (α = 0.63). The sensitivity and specificity for identifying pre-frail individuals were 89.7% and 24.3%, respectively, while those for identifying frail individuals were 63.2% and 71.6%, respectively. In addition, 89.7% of the individuals from both the evaluations were identified in the “process of frailty” category. CONCLUSIONS The self-reported assessment of frailty can identify the syndrome among older adults and can be used as a screening tool. 
Its advantages include simplicity, rapidity, low cost, and ability to be used by different professionals. PMID:25741658

  8. Reliability, Convergent Validity and Time Invariance of Default Mode Network Deviations in Early Adult Major Depressive Disorder.

    PubMed

    Bessette, Katie L; Jenkins, Lisanne M; Skerrett, Kristy A; Gowins, Jennifer R; DelDonno, Sophie R; Zubieta, Jon-Kar; McInnis, Melvin G; Jacobs, Rachel H; Ajilore, Olusola; Langenecker, Scott A

    2018-01-01

    There is substantial variability across studies of default mode network (DMN) connectivity in major depressive disorder, and reliability and time-invariance are not reported. This study evaluates whether DMN dysconnectivity in remitted depression (rMDD) is reliable over time and symptom-independent, and explores convergent relationships with cognitive features of depression. A longitudinal study was conducted with 82 young adults free of psychotropic medications (47 rMDD, 35 healthy controls) who completed clinical structured interviews, neuropsychological assessments, and 2 resting-state fMRI scans across 2 study sites. Functional connectivity analyses from bilateral posterior cingulate and anterior hippocampal formation seeds in DMN were conducted at both time points within a repeated-measures analysis of variance to compare groups and evaluate reliability of group-level connectivity findings. Eleven hyper- (from posterior cingulate) and 6 hypo- (from hippocampal formation) connectivity clusters in rMDD were obtained with moderate to adequate reliability in all but one cluster (ICC range = 0.50 to 0.76 for 16 of 17). The significant clusters were reduced with a principal component analysis (5 components obtained) to explore these connectivity components, and were then correlated with cognitive features (rumination, cognitive control, learning and memory, and explicit emotion identification). At the exploratory level, for convergent validity, components consisting of posterior cingulate with cognitive control network hyperconnectivity in rMDD were related to cognitive control (inverse) and rumination (positive). Components consisting of anterior hippocampal formation with social emotional network and DMN hypoconnectivity were related to memory (inverse) and happy emotion identification (positive). Thus, time-invariant DMN connectivity differences exist early in the lifespan course of depression and are reliable. 
The nuanced results suggest a ventral within-network hypoconnectivity associated with poor memory and a dorsal cross-network hyperconnectivity linked to poorer cognitive control and elevated rumination. Study of early course remitted depression with attention to reliability and symptom independence could lead to more readily translatable clinical assessment tools for biomarkers.

  9. “Seeing” electroencephalogram through the skull: imaging prefrontal cortex with fast optical signal

    NASA Astrophysics Data System (ADS)

    Medvedev, Andrei V.; Kainerstorfer, Jana M.; Borisov, Sergey V.; Gandjbakhche, Amir H.; Vanmeter, John

    2010-11-01

    Near-infrared spectroscopy is a novel imaging technique potentially sensitive to both brain hemodynamics (slow signal) and neuronal activity (fast optical signal, FOS). The big challenge of measuring FOS noninvasively lies in the presumably low signal-to-noise ratio. Thus, detectability of the FOS has been controversially discussed. We present reliable detection of FOS from 11 individuals concurrently with electroencephalogram (EEG) during a Go-NoGo task. Probes were placed bilaterally over prefrontal cortex. Independent component analysis (ICA) was used for artifact removal. Correlation coefficient in the best correlated FOS-EEG ICA pairs was highly significant (p < 10-8), and event-related optical signal (EROS) was found in all subjects. Several EROS components were similar to the event-related potential (ERP) components. The most robust “optical N200” at t = 225 ms coincided with the N200 ERP; both signals showed significant difference between targets and nontargets, and their timing correlated with subject's reaction time. Correlation between FOS and EEG even in single trials provides further evidence that at least some FOS components “reflect” electrical brain processes directly. The data provide evidence for the early involvement of prefrontal cortex in rapid object recognition. EROS is highly localized and can provide cost-effective imaging tools for cortical mapping of cognitive processes.

  10. “Seeing” electroencephalogram through the skull: imaging prefrontal cortex with fast optical signal

    PubMed Central

    Medvedev, Andrei V.; Kainerstorfer, Jana M.; Borisov, Sergey V.; Gandjbakhche, Amir H.; VanMeter, John

    2010-01-01

    Near-infrared spectroscopy is a novel imaging technique potentially sensitive to both brain hemodynamics (slow signal) and neuronal activity (fast optical signal, FOS). The big challenge of measuring FOS noninvasively lies in the presumably low signal-to-noise ratio. Thus, detectability of the FOS has been controversially discussed. We present reliable detection of FOS from 11 individuals concurrently with electroencephalogram (EEG) during a Go-NoGo task. Probes were placed bilaterally over prefrontal cortex. Independent component analysis (ICA) was used for artifact removal. Correlation coefficient in the best correlated FOS–EEG ICA pairs was highly significant (p < 10−8), and event-related optical signal (EROS) was found in all subjects. Several EROS components were similar to the event-related potential (ERP) components. The most robust “optical N200” at t = 225 ms coincided with the N200 ERP; both signals showed significant difference between targets and nontargets, and their timing correlated with subject’s reaction time. Correlation between FOS and EEG even in single trials provides further evidence that at least some FOS components “reflect” electrical brain processes directly. The data provide evidence for the early involvement of prefrontal cortex in rapid object recognition. EROS is highly localized and can provide cost-effective imaging tools for cortical mapping of cognitive processes. PMID:21198150

  11. Temperature responses of individual soil organic matter components

    NASA Astrophysics Data System (ADS)

    Feng, Xiaojuan; Simpson, Myrna J.

    2008-09-01

    Temperature responses of soil organic matter (SOM) remain unclear partly due to its chemical and compositional heterogeneity. In this study, the decomposition of SOM from two grassland soils was investigated in a 1-year laboratory incubation at six different temperatures. SOM was separated into solvent extractable compounds, suberin- and cutin-derived compounds, and lignin-derived monomers by solvent extraction, base hydrolysis, and CuO oxidation, respectively. These SOM components have distinct chemical structures and stabilities and their decomposition patterns over the course of the experiment were fitted with a two-pool exponential decay model. The stability of SOM components was also assessed using geochemical parameters and kinetic parameters derived from model fitting. Compared with the solvent extractable compounds, a low percentage of lignin monomers partitioned into the labile SOM pool. Suberin- and cutin-derived compounds were poorly fitted by the decay model, and their recalcitrance was shown by the geochemical degradation parameter (ω-C16/ΣC16), which was observed to stabilize during the incubation. The temperature sensitivity of decomposition, expressed as Q10, was derived from the relationship between temperature and SOM decay rates. SOM components exhibited varying temperature responses and the decomposition of lignin monomers exhibited higher Q10 values than the decomposition of solvent extractable compounds. Our study shows that Q10 values derived from soil respiration measurements may not be reliable indicators of temperature responses of individual SOM components.
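
    A hedged sketch of the two quantities the abstract relies on, with invented parameter values: the two-pool exponential decay model and the Q10 coefficient derived from decay rates at two temperatures.

```python
import math

def two_pool_remaining(t, labile_frac, k_labile, k_stable):
    """Fraction of initial C remaining at time t under a two-pool model."""
    return (labile_frac * math.exp(-k_labile * t)
            + (1 - labile_frac) * math.exp(-k_stable * t))

def q10(rate_low, t_low, rate_high, t_high):
    """Q10: factor by which the decay rate rises per 10 degC of warming."""
    return (rate_high / rate_low) ** (10.0 / (t_high - t_low))

# A rate that doubles between 15 and 25 degC corresponds to Q10 = 2.
example_q10 = q10(1.0, 15.0, 2.0, 25.0)
```

    Because each SOM component has its own pool sizes and rate constants, a single Q10 fitted to bulk soil respiration can diverge from the component-level values, which is the caution the abstract closes on.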

  12. Calibration Technique for Polarization-Sensitive Lidars

    NASA Technical Reports Server (NTRS)

    Alvarez, J. M.; Vaughan, M. A.; Hostetler, C. A.; Hung, W. H.; Winker, D. M.

    2006-01-01

    Polarization-sensitive lidars have proven to be highly effective in discriminating between spherical and non-spherical particles in the atmosphere. These lidars use a linearly polarized laser and are equipped with a receiver that can separately measure the components of the return signal polarized parallel and perpendicular to the outgoing beam. In this work we describe a technique for calibrating polarization-sensitive lidars that was originally developed at NASA's Langley Research Center (LaRC) and has been used continually over the past fifteen years. The procedure uses a rotatable half-wave plate inserted into the optical path of the lidar receiver to introduce controlled amounts of polarization cross-talk into a sequence of atmospheric backscatter measurements. Solving the resulting system of nonlinear equations generates the system calibration constants (gain ratio, G, and offset angle, theta) required for deriving calibrated measurements of depolarization ratio from the lidar signals. In addition, this procedure also determines the mean depolarization ratio within the region of the atmosphere that is analyzed. Simulations and error propagation studies show the method to be both reliable and well behaved. Operational details of the technique are illustrated using measurements obtained as part of Langley Research Center's participation in the First ISCCP Regional Experiment (FIRE).

  13. Measuring personal recovery - psychometric properties of the Swedish Questionnaire about the Process of Recovery (QPR-Swe).

    PubMed

    Argentzell, Elisabeth; Hultqvist, Jenny; Neil, Sandra; Eklund, Mona

    2017-10-01

    Personal recovery, defined as an individual process towards meaning, is an important target within mental health services. Measuring recovery hence requires reliable and valid measures. The Process of Recovery Questionnaire (QPR) was developed for that purpose. The aim was to develop a Swedish version of the QPR (QPR-Swe) and explore its psychometric properties in terms of factor structure, internal consistency, construct validity and sensitivity to change. A total of 226 participants entered the study. The factor structure was investigated by Principal Component Analysis and Scree plot. Construct validity was addressed in terms of convergent validity against indicators of self-mastery, self-esteem, quality of life and self-rated health. A one-factor solution of QPR-Swe received better support than a two-factor solution. Good internal consistency was indicated, α = 0.92, and construct validity was satisfactory. The QPR-Swe showed preliminary sensitivity to change. The QPR-Swe showed promising initial psychometric properties in terms of internal consistency, convergent validity and sensitivity to change. The QPR-Swe is recommended for use in research and clinical contexts to assess personal recovery among people with mental illness.

  14. Verification and classification bias interactions in diagnostic test accuracy studies for fine-needle aspiration biopsy.

    PubMed

    Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B

    2015-03-01

    Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios were generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.

  15. New directions in diagnostic evaluation of insect allergy.

    PubMed

    Golden, David B K

    2014-08-01

    Diagnosis of insect sting allergy and prediction of risk of sting anaphylaxis are often difficult because tests for venom-specific IgE antibodies have a limited positive predictive value and do not reliably predict the severity of sting reactions. Component-resolved diagnosis using recombinant venom allergens has shown promise in improving the specificity of diagnostic testing for insect sting allergy. Basophil activation tests have been explored as more sensitive assays for identification of patients with insect allergy and for prediction of clinical outcomes. Measurement of mast cell mediators reflects the underlying risk for more severe reactions and limited clinical response to treatment. Measurement of IgE to recombinant venom allergens can distinguish cross-sensitization from dual sensitization to honeybee and vespid venoms, thus helping to limit venom immunotherapy to a single venom instead of multiple venoms in many patients. Basophil activation tests can detect venom allergy in patients who show no detectable venom-specific IgE in standard diagnostic tests and can predict increased risk of systemic reactions to venom immunotherapy, and to stings during and after stopping venom immunotherapy. The risk of severe or fatal anaphylaxis to stings can also be predicted by measurement of baseline serum tryptase or other mast cell mediators.

  16. Changes in the electric dipole vector of human serum albumin due to complexing with fatty acids.

    PubMed Central

    Scheider, W; Dintzis, H M; Oncley, J L

    1976-01-01

    The magnitude of the electric dipole vector of human serum albumin, as measured by the dielectric increment of the isoionic solution, is found to be a sensitive, monotonic indicator of the number of moles (up to at least 5) of long chain fatty acid complexed. The sensitivity is about three times as great as it is in bovine albumin. New methods of analysis of the frequency dispersion of the dielectric constant were developed to ascertain if molecular shape changes also accompany the complexing with fatty acid. Direct two-component rotary diffusion constant analysis is found to be too strongly affected by cross modulation between small systematic errors and physically significant data components to be a reliable measure of structural modification. Multicomponent relaxation profiles are more useful as recognition patterns for structural comparisons, but the equations involved are ill-conditioned and solutions based on standard least-squares regression contain mathematical artifacts which mask the physically significant spectrum. By constraining the solution to non-negative coefficients, the magnitude of the artifacts is reduced to well below the magnitudes of the spectral components. Profiles calculated in this way show no evidence of significant dipole direction or molecular shape change as the albumin is complexed with 1 mol of fatty acid. In these experiments albumin was defatted by incubation with adipose tissue at physiological pH, which avoids passing the protein through the pH of the N-F transition usually required in defatting. Addition of fatty acid from solution in small amounts of ethanol appears to form a complex indistinguishable from the "native" complex. PMID:6087
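
    The non-negativity constraint that suppresses the least-squares artifacts can be sketched with standard non-negative least squares (NNLS). The relaxation times and noise level below are made up for illustration, not taken from the albumin data:

```python
import numpy as np
from scipy.optimize import nnls

# Fit a relaxation profile as a sum of basis decays over a grid of
# candidate relaxation times.
t = np.linspace(0, 5, 200)
taus = np.logspace(-1, 1, 15)                 # candidate relaxation times
A = np.exp(-t[:, None] / taus[None, :])       # design matrix (ill-conditioned)
y = 1.0 * np.exp(-t / 0.5) + 0.5 * np.exp(-t / 3.0)
y_noisy = y + 0.01 * np.random.default_rng(3).standard_normal(t.size)

coef_ls, *_ = np.linalg.lstsq(A, y_noisy, rcond=None)   # unconstrained
coef_nn, _ = nnls(A, y_noisy)                           # non-negative

# The unconstrained fit is free to cancel large positive and negative
# coefficients against each other; NNLS keeps the spectrum physical.
print(np.abs(coef_ls).max(), coef_nn.max())
```

    The non-negative solution still fits the data to within the noise while forbidding the oscillating positive/negative coefficient pairs that an unconstrained ill-conditioned regression admits.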

  17. Reliability Issues in Stirling Radioisotope Power Systems

    NASA Technical Reports Server (NTRS)

    Schreiber, Jeffrey; Shah, Ashwin

    2005-01-01

    Stirling power conversion is a potential candidate for use in a Radioisotope Power System (RPS) for space science missions because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced requirement of radioactive material. Reliability of an RPS that utilizes Stirling power conversion technology is important in order to ascertain successful long-term performance. Owing to the long lifetime requirement (14 years), it is difficult to perform long-term tests that encompass all the uncertainties involved in the design variables of components and subsystems comprising the RPS. The requirement for uninterrupted performance reliability and related issues are discussed, and some of the critical areas of concern are identified. An overview of the current ongoing efforts to understand component life, design variables at the component and system levels, and related sources and nature of uncertainties is also discussed. Current status of the 110 watt Stirling Radioisotope Generator (SRG110) reliability efforts is described. Additionally, an approach showing the use of past experience on other successfully used power systems to develop a reliability plan for the SRG110 design is outlined.

  18. Reliability Issues in Stirling Radioisotope Power Systems

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Schreiber, Jeffrey G.

    2004-01-01

    Stirling power conversion is a potential candidate for use in a Radioisotope Power System (RPS) for space science missions because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced requirement of radioactive material. Reliability of an RPS that utilizes Stirling power conversion technology is important in order to ascertain successful long-term performance. Owing to the long lifetime requirement (14 years), it is difficult to perform long-term tests that encompass all the uncertainties involved in the design variables of components and subsystems comprising the RPS. The requirement for uninterrupted performance reliability and related issues are discussed, and some of the critical areas of concern are identified. An overview of the current ongoing efforts to understand component life, design variables at the component and system levels, and related sources and nature of uncertainties is also discussed. Current status of the 110 watt Stirling Radioisotope Generator (SRG110) reliability efforts is described. Additionally, an approach showing the use of past experience on other successfully used power systems to develop a reliability plan for the SRG110 design is outlined.

  19. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.

  20. Scaled CMOS Reliability and Considerations for Spacecraft Systems: Bottom-Up and Top-Down Perspective

    NASA Technical Reports Server (NTRS)

    White, Mark

    2012-01-01

    New space missions will increasingly rely on more advanced technologies because of system requirements for higher performance, particularly in instruments and high-speed processing. Component-level reliability challenges with scaled CMOS in spacecraft systems from a bottom-up perspective have been presented. Fundamental front-end and back-end processing reliability issues with more aggressively scaled parts have been discussed. Effective thermal management from the system level to the component level (top-down) is a key element in the overall design of reliable systems. Thermal management in space systems must consider a wide range of issues, including thermal loading of many different components, and frequent temperature cycling of some systems. Both perspectives (top-down and bottom-up) play a large role in robust, reliable spacecraft system design.

  1. New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks

    NASA Astrophysics Data System (ADS)

    Kurtz, Nolan Scot

    The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazard, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate on a network level using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is most important to the people that depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Since these are existing networks, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been created from analytics. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, chloride-induced deterioration network effects were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Additionally, such a method handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to the network size.
Special network topologies may be more or less computationally difficult, while the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase the computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may indicate the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is the most important objective of such network analysis; however, this may only provide the information of which bridges to inspect/repair earliest and little else. High correlations influence such component importance measures in a negative manner. Additionally, a regional approach is not appropriately modelled. To investigate a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component and regional-based optimal decisions using information from both network function and further effects of infrastructure deterioration.
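
    Crude Monte Carlo simulation (crude-MCS), one of the network analyses mentioned above, can be sketched for a toy two-terminal problem. The network, failure probabilities, and node names below are invented for illustration:

```python
import random

def network_reliability(nodes, edges, source, sink, p_fail,
                        n_sim=20_000, seed=0):
    """Crude Monte Carlo estimate of two-terminal reliability: the
    probability that `sink` stays connected to `source` when each edge
    (e.g. a bridge link) fails independently with probability p_fail[edge]."""
    rng = random.Random(seed)
    connected = 0
    for _ in range(n_sim):
        alive = [e for e in edges if rng.random() >= p_fail[e]]
        # Depth-first search over the surviving edges.
        adj = {n: [] for n in nodes}
        for u, v in alive:
            adj[u].append(v)
            adj[v].append(u)
        seen, stack = {source}, [source]
        while stack:
            for nxt in adj[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        connected += sink in seen
    return connected / n_sim

# Two parallel paths from A to B: a direct link, and a detour via C.
# Each link fails with probability 0.1.
edges = [("A", "B"), ("A", "C"), ("C", "B")]
p_fail = {e: 0.1 for e in edges}
r = network_reliability("ABC", edges, "A", "B", p_fail)
```

    For this toy network the exact answer is 1 - 0.1 x (1 - 0.9 x 0.9) = 0.981, so the estimate lands near 0.98; the cited work's contribution is making such analyses scale via hierarchical super-link structures and adaptive importance sampling rather than crude sampling.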

  2. Maximally reliable spatial filtering of steady state visual evoked potentials.

    PubMed

    Dmochowski, Jacek P; Greaves, Alex S; Norcia, Anthony M

    2015-04-01

    Due to their high signal-to-noise ratio (SNR) and robustness to artifacts, steady state visual evoked potentials (SSVEPs) are a popular technique for studying neural processing in the human visual system. SSVEPs are conventionally analyzed at individual electrodes or linear combinations of electrodes which maximize some variant of the SNR. Here we exploit the fundamental assumption of evoked responses, reproducibility across trials, to develop a technique that extracts a small number of high-SNR, maximally reliable SSVEP components. This novel spatial filtering method operates on an array of Fourier coefficients and projects the data into a low-dimensional space in which the trial-to-trial spectral covariance is maximized. When applied to two sample data sets, the resulting technique recovers physiologically plausible components (i.e., the recovered topographies match the lead fields of the underlying sources) while drastically reducing the dimensionality of the data (i.e., more than 90% of the trial-to-trial reliability is captured in the first four components). Moreover, the proposed technique achieves a higher SNR than that of the single best electrode or the principal components. We provide a freely available MATLAB implementation of the proposed technique, herein termed "Reliable Components Analysis". Copyright © 2015 Elsevier Inc. All rights reserved.
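
    The idea of maximizing trial-to-trial covariance can be sketched as a generalized eigenvalue problem. This is a simplified illustration on synthetic time-domain data rather than the Fourier-coefficient arrays the published method uses; all sizes and noise levels are made up:

```python
import numpy as np
from scipy.linalg import eigh

def reliable_components(data, n_comp=4):
    """Spatial filters that maximize trial-to-trial (reproducible)
    covariance relative to total covariance.

    data: array of shape (trials, channels, samples).
    """
    trials, chans, _ = data.shape
    data = data - data.mean(axis=2, keepdims=True)
    mean_trial = data.mean(axis=0)
    # Covariance of the trial-averaged (reproducible) signal vs. the
    # total covariance across trials.
    r_between = mean_trial @ mean_trial.T
    r_total = sum(x @ x.T for x in data) / trials
    r_total += 1e-9 * np.trace(r_total) / chans * np.eye(chans)  # regularize
    evals, evecs = eigh(r_between, r_total)       # generalized eigenproblem
    order = np.argsort(evals)[::-1]
    return evecs[:, order[:n_comp]]               # columns: spatial filters

# Toy usage: one reproducible source mixed into 8 channels plus noise.
rng = np.random.default_rng(1)
src = np.sin(np.linspace(0, 8 * np.pi, 500))
mix = rng.standard_normal(8)
X = np.stack([np.outer(mix, src) + 0.5 * rng.standard_normal((8, 500))
              for _ in range(20)])
W = reliable_components(X, n_comp=2)
comp = W[:, 0] @ X.mean(axis=0)   # time course of the first component
```

    The first filter recovers the embedded source (up to sign and scale), illustrating how the most reliable directions can be found from reproducibility alone, without labels.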

  3. Flexible organic TFT bio-signal amplifier using reliable chip component assembly process with conductive adhesive.

    PubMed

    Yoshimoto, Shusuke; Uemura, Takafumi; Akiyama, Mihoko; Ihara, Yoshihiro; Otake, Satoshi; Fujii, Tomoharu; Araki, Teppei; Sekitani, Tsuyoshi

    2017-07-01

    This paper presents a flexible organic thin-film transistor (OTFT) amplifier for bio-signal monitoring and describes the chip component assembly process. Using a conductive adhesive and a chip mounter, the chip components are mounted on a flexible film substrate, which has OTFT circuits. This study first investigates the reliability of the chip-component assembly technique on the flexible substrate. The study also specifically examines heart pulse wave monitoring conducted using the proposed flexible amplifier circuit and a flexible piezoelectric film. We connected the amplifier to a Bluetooth device for a wearable device demonstration.

  4. SLOWLY REPEATED EVOKED PAIN (SREP) AS A MARKER OF CENTRAL SENSITIZATION IN FIBROMYALGIA: DIAGNOSTIC ACCURACY AND RELIABILITY IN COMPARISON WITH TEMPORAL SUMMATION OF PAIN.

    PubMed

    de la Coba, Pablo; Bruehl, Stephen; Gálvez-Sánchez, Carmen María; Reyes Del Paso, Gustavo A

    2018-05-01

    This study examined the diagnostic accuracy and test-retest reliability of a novel dynamic evoked pain protocol (slowly repeated evoked pain; SREP) compared to temporal summation of pain (TSP), a standard index of central sensitization. Thirty-five fibromyalgia (FM) and 30 rheumatoid arthritis (RA) patients completed, in pseudorandomized order, a standard mechanical TSP protocol (10 stimuli of 1s duration at the thenar eminence using a 300g monofilament with 1s interstimulus interval) and the SREP protocol (9 suprathreshold pressure stimuli of 5s duration applied to the fingernail with a 30s interstimulus interval). In order to evaluate reliability for both protocols, they were repeated in a second session 4-7 days later. Evidence for significant pain sensitization over trials (increasing pain intensity ratings) was observed for SREP in FM (p<.001) but not in RA (p=.35), whereas significant sensitization was observed in both diagnostic groups for the TSP protocol (p's<.008). Compared to TSP, SREP demonstrated higher overall diagnostic accuracy (87.7% vs. 64.6%), greater sensitivity (0.89 vs. 0.57), and greater specificity (0.87 vs. 0.73) in discriminating between FM and RA patients. Test-retest reliability of SREP sensitization was good in FM (ICC: 0.80) and moderate in RA (ICC: 0.68). SREP seems to be a dynamic evoked pain index tapping into pain sensitization that allows for greater diagnostic accuracy in identifying FM patients compared to a standard TSP protocol. Further research is needed to study mechanisms underlying SREP and the potential utility of adding SREP to standard pain evaluation protocols.
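
    The reported accuracy figures are standard 2x2-table quantities. The counts below are illustrative values chosen to reproduce the reported SREP figures for 35 FM and 30 RA patients; they are not taken from the study:

```python
def diagnostic_accuracy(tp, fn, tn, fp):
    """Sensitivity, specificity, and overall accuracy from a 2x2 table.
    Here 'positive' means classified as FM by the evoked pain index."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    return sens, spec, acc

# Hypothetical counts: 31 of 35 FM patients correctly flagged,
# 26 of 30 RA patients correctly cleared.
sens, spec, acc = diagnostic_accuracy(tp=31, fn=4, tn=26, fp=4)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, accuracy {acc:.1%}")
```

    With these counts the formulas give sensitivity 0.89, specificity 0.87, and accuracy 87.7%, matching the values quoted for SREP.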

  5. Detecting long-term growth trends using tree rings: a critical evaluation of methods.

    PubMed

    Peters, Richard L; Groenendijk, Peter; Vlam, Mart; Zuidema, Pieter A

    2015-05-01

    Tree-ring analysis is often used to assess long-term trends in tree growth. A variety of growth-trend detection methods (GDMs) exist to disentangle age/size trends in growth from long-term growth changes. However, these detrending methods strongly differ in approach, with possible implications for their output. Here, we critically evaluate the consistency, sensitivity, reliability and accuracy of the four most widely used GDMs: conservative detrending (CD) applies mathematical functions to correct for decreasing ring widths with age; basal area correction (BAC) transforms diameter into basal area growth; regional curve standardization (RCS) detrends individual tree-ring series using average age/size trends; and size class isolation (SCI) calculates growth trends within separate size classes. First, we evaluated whether these GDMs produce consistent results applied to an empirical tree-ring data set of Melia azedarach, a tropical tree species from Thailand. Three GDMs yielded similar results (a growth decline over time), but the widely used CD method did not detect any change. Second, we assessed the sensitivity (probability of correct growth-trend detection), reliability (100% minus probability of detecting false trends) and accuracy (whether the strength of imposed trends is correctly detected) of these GDMs, by applying them to simulated growth trajectories with different imposed trends: no trend, strong trends (-6% and +6% change per decade) and weak trends (-2%, +2%). All methods except CD showed high sensitivity, reliability and accuracy to detect strong imposed trends. However, these were considerably lower in the weak or no-trend scenarios. BAC showed good sensitivity and accuracy, but low reliability, indicating uncertainty of trend detection using this method. Our study reveals that the choice of GDM influences results of growth-trend studies.
We recommend applying multiple methods when analysing trends and encourage performing sensitivity and reliability analysis. Finally, we recommend SCI and RCS, as these methods showed highest reliability to detect long-term growth trends. © 2014 John Wiley & Sons Ltd.
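
    Of the four GDMs, basal area correction is the easiest to make concrete. A minimal sketch, assuming concentric rings measured from the pith and ignoring bark:

```python
import math

def basal_area_increments(ring_widths_mm):
    """Basal area correction (BAC): convert a ring-width series into
    annual basal-area increments. The stem radius is reconstructed by
    summing ring widths outward from the pith."""
    radius = 0.0
    increments = []
    for w in ring_widths_mm:
        new_radius = radius + w
        # Basal area is pi * r^2, so the annual increment is
        # pi * (r_new^2 - r_old^2).
        increments.append(math.pi * (new_radius**2 - radius**2))
        radius = new_radius
    return increments  # mm^2 per year

# A constant ring width means flat diameter growth, yet the basal-area
# increments rise year over year -- the geometric effect BAC exposes.
print(basal_area_increments([2.0] * 5))
```

    This geometric amplification is also why BAC results must be interpreted carefully: an apparent basal-area growth increase can arise even when radial growth is unchanged.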

  6. Correlation study between vibrational environmental and failure rates of civil helicopter components

    NASA Technical Reports Server (NTRS)

    Alaniz, O.

    1979-01-01

    An investigation of two selected helicopter types, namely, the Models 206A/B and 212, is reported. An analysis of the available vibration and reliability data for these two helicopter types resulted in the selection of ten components located in five different areas of the helicopter and consisting primarily of instruments, electrical components, and other noncritical flight hardware. The potential for advanced technology in suppressing vibration in helicopters was assessed. There are still several unknowns concerning both the vibration environment and the reliability of helicopter noncritical flight components. Vibration data for the selected components were either insufficient or inappropriate. The maintenance data examined for the selected components were inappropriate due to variations in failure mode identification, inconsistent reporting, or inaccurate information.

  7. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  8. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  9. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  10. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  11. System reliability analysis through corona testing

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Mueller, L. A.; Koutnik, E. A.

    1975-01-01

    A corona vacuum test facility for nondestructive testing of power system components was built in the Reliability and Quality Engineering Test Laboratories at the NASA Lewis Research Center. The facility was developed to simulate operating temperature and vacuum while monitoring corona discharges with residual gases. The facility is being used to test various high-voltage power system components.

  12. Microstructure-Evolution and Reliability Assessment Tool for Lead-Free Component Insertion in Army Electronics

    DTIC Science & Technology

    2008-10-01

    provide adequate means for thermal heat dissipation and cooling. Thus electronic packaging has four main functions [1]: • Signal distribution which... dissipation, involving structural and materials consideration. • Mechanical, chemical and electromagnetic protection of components and... nature when compared to phenomenological models. Microelectronic packaging industry spends typically several months building and reliability

  13. General Practitioners' Understanding Pertaining to Reliability, Interactive and Usability Components Associated with Health Websites

    ERIC Educational Resources Information Center

    Usher, Wayne

    2009-01-01

    This study was undertaken to determine the level of understanding of Gold Coast general practitioners (GPs) pertaining to such criteria as reliability, interactive and usability components associated with health websites. These are important considerations due to the increased levels of computer and World Wide Web (WWW)/Internet use and health…

  14. Increasing the Reliability of Ability-Achievement Difference Scores: An Example Using the Kaufman Assessment Battery for Children.

    ERIC Educational Resources Information Center

    Caruso, John C.; Witkiewitz, Katie

    2002-01-01

    As an alternative to equally weighted difference scores, examined an orthogonal reliable component analysis (RCA) solution and an oblique principal components analysis (PCA) solution for the standardization sample of the Kaufman Assessment Battery for Children (KABC; A. Kaufman and N. Kaufman, 1983). Discusses the practical implications of the…

  15. The influence of various test plans on mission reliability. [for Shuttle Spacelab payloads

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloff, H. R.; Young, J. P.; Keegan, W. B.

    1977-01-01

    Methods have been developed for the evaluation of cost effective vibroacoustic test plans for Shuttle Spacelab payloads. The shock and vibration environments of components have been statistically represented, and statistical decision theory has been used to evaluate the cost effectiveness of five basic test plans with structural test options for two of the plans. Component, subassembly, and payload testing have been performed for each plan along with calculations of optimum test levels and expected costs. The tests have been ranked according to both minimizing expected project costs and vibroacoustic reliability. It was found that optimum costs may vary up to $6 million with the lowest plan eliminating component testing and maintaining flight vibration reliability via subassembly tests at high acoustic levels.

  16. On-orbit spacecraft reliability

    NASA Technical Reports Server (NTRS)

    Bloomquist, C.; Demars, D.; Graham, W.; Henmi, P.

    1978-01-01

    Operational and historic data for 350 spacecraft from 52 U.S. space programs were analyzed for on-orbit reliability. Failure rate estimates are made for on-orbit operation of spacecraft subsystems, components, and piece parts, as well as estimates of failure probability for the same elements during launch. Confidence intervals for both parameters are also given. The results indicate that: (1) the success of spacecraft operation is only slightly affected by most reported incidents of anomalous behavior; (2) the occurrence of the majority of anomalous incidents could have been prevented prior to launch; (3) no detrimental effect of spacecraft dormancy is evident; (4) cycled components in general are not demonstrably less reliable than uncycled components; and (5) application of product assurance elements is conducive to spacecraft success.

  17. Reliability apportionment approach for spacecraft solar array using fuzzy reasoning Petri net and fuzzy comprehensive evaluation

    NASA Astrophysics Data System (ADS)

    Wu, Jianing; Yan, Shaoze; Xie, Liyang; Gao, Peng

    2012-07-01

    The reliability apportionment of a spacecraft solar array is of significant importance for spacecraft designers in the early stage of design. However, it is difficult to use existing methods to resolve the reliability apportionment problem because of data insufficiency and the uncertainty of the relations among the components in the mechanical system. This paper proposes a new method which combines fuzzy comprehensive evaluation with a fuzzy reasoning Petri net (FRPN) to accomplish the reliability apportionment of the solar array. The proposed method extends previous fuzzy methods and focuses on the characteristics of the subsystems and the intrinsic associations among the components. The analysis results show that, prior to design and manufacturing, the synchronization mechanism is apportioned the highest reliability while the solar panels and hinges are apportioned the lowest. The developed method is of practical significance for the reliability apportionment of solar arrays where the design information has not been clearly identified, particularly in the early stage of design.

  18. Accurate reliability analysis method for quantum-dot cellular automata circuits

    NASA Astrophysics Data System (ADS)

    Cui, Huanqing; Cai, Li; Wang, Sen; Liu, Xiaoqiang; Yang, Xiaokuo

    2015-10-01

    The probabilistic transfer matrix (PTM) is a widely used model in the reliability research of circuits. However, the PTM model cannot reflect the impact of input signals on reliability, so it does not completely conform to the mechanism of the novel field-coupled nanoelectronic device called quantum-dot cellular automata (QCA). It is difficult to get accurate results when the PTM model is used to analyze the reliability of QCA circuits. To solve this problem, we present fault tree models of QCA fundamental devices according to different input signals. After that, the binary decision diagram (BDD) is used to quantitatively investigate the reliability of two QCA XOR gates depending on the presented models. By employing the fault tree models, the impact of input signals on reliability can be identified clearly and the crucial components of a circuit can be located precisely based on the importance values (IVs) of components. This method thus contributes to the construction of reliable QCA circuits.
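
    Component importance values of the kind used to locate crucial components can be illustrated with Birnbaum importance on a toy series circuit. This is a generic reliability measure sketched under made-up gate reliabilities, not the paper's BDD-based computation:

```python
def circuit_reliability(p):
    """Reliability of a toy two-gate circuit whose output is correct
    only if both gates work (a series structure).
    p: dict mapping gate name -> reliability."""
    return p["G1"] * p["G2"]

def birnbaum_importance(p, comp):
    """Birnbaum importance of `comp`: the difference in system
    reliability between that component working perfectly and failing."""
    hi = dict(p, **{comp: 1.0})
    lo = dict(p, **{comp: 0.0})
    return circuit_reliability(hi) - circuit_reliability(lo)

p = {"G1": 0.99, "G2": 0.95}
# In a series structure a gate's importance equals the reliability of
# the rest of the circuit, so the *more* reliable gate G2 can matter more.
print(birnbaum_importance(p, "G1"), birnbaum_importance(p, "G2"))
```

    Here IV(G1) = 0.95 and IV(G2) = 0.99: the least reliable component is not automatically the most important one, which is why explicit importance values are computed rather than inferred from component reliability alone.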

  19. Evaluation of continuous air monitor placement in a plutonium facility.

    PubMed

    Whicker, J J; Rodgers, J C; Fairchild, C I; Scripsick, R C; Lopez, R C

    1997-05-01

    Department of Energy appraisers found continuous air monitors at Department of Energy plutonium facilities alarmed less than 30% of the time when integrated room plutonium air concentrations exceeded 500 DAC-hours. Without other interventions, this alarm percentage suggests the possibility that workers could be exposed to high airborne concentrations without continuous air monitor alarms. Past research has shown that placement of continuous air monitors is a critical component in rapid and reliable detection of airborne releases. At Los Alamos National Laboratory and many other Department of Energy plutonium facilities, continuous air monitors have been primarily placed at ventilation exhaust points. The purpose of this study was to evaluate and compare the effectiveness of exhaust register placement of workplace continuous air monitors with other sampling locations. Polydisperse oil aerosols were released from multiple locations in two plutonium laboratories at Los Alamos National Laboratory. An array of laser particle counters positioned in the rooms measured time-resolved aerosol dispersion. Results showed alternative placement of air samplers generally resulted in aerosol detection that was faster, often more sensitive, and equally reliable compared with samplers at exhaust registers.

  20. Signs and symptoms that precede wheezing in children with a pattern of moderate-to-severe intermittent wheezing.

    PubMed

    Rivera-Spoljaric, Katherine; Chinchilli, Vernon M; Camera, Lindsay J; Zeiger, Robert S; Paul, Ian M; Phillips, Brenda R; Taussig, Lynn M; Strunk, Robert C; Bacharier, Leonard B

    2009-06-01

    To examine parent-reported signs and symptoms as antecedents of wheezing in preschool children with previous moderate to severe wheezing episodes, and to determine the predictive capacity of these symptom patterns for wheezing events. Parents (n = 238) of children age 12 to 59 months with moderate-to-severe intermittent wheezing enrolled in a year-long clinical trial completed surveys that captured signs and symptoms at the start of a respiratory tract illness (RTI). Sensitivity, specificity, negative predictive value, and positive predictive value (PPV) for each symptom leading to wheezing during that RTI were calculated. The most commonly reported first symptom categories during the first RTI were "nose symptoms" (41%), "significant cough" (29%), and "insignificant cough" (13%). The most reliable predictor of subsequent wheezing was significant cough, which had a specificity of 78% and a PPV of 74% for predicting wheezing. Significant cough is the most reliable antecedent of wheezing during an RTI. It may be useful to consider individualized symptom patterns as a component of management plans intended to minimize wheezing episodes.

  1. An ultra-high pressure liquid chromatography-tandem mass spectrometry method for the quantification of teicoplanin in plasma of neonates.

    PubMed

    Begou, O; Kontou, A; Raikos, N; Sarafidis, K; Roilides, E; Papadoyannis, I N; Gika, H G

    2017-03-15

    The development and validation of an ultra-high pressure liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was performed with the aim of quantifying plasma teicoplanin concentrations in neonates. Pharmacokinetic data on teicoplanin in the neonatal population are very limited; therefore, a sensitive and reliable method for determining all isoforms of teicoplanin in a low sample volume is of real importance. Teicoplanin main components were extracted by a simple acetonitrile precipitation step and analysed on a C18 chromatographic column by a triple quadrupole MS with electrospray ionization. The method provides quantitative data over a linear range of 25-6400 ng/mL, with an LOD of 8.5 ng/mL and an LOQ of 25 ng/mL for total teicoplanin. The method was applied to plasma samples from neonates to support pharmacokinetic data and proved to be a reliable and fast method for quantifying teicoplanin concentration levels in the plasma of infants during therapy in the intensive care unit. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Identifying reliable independent components via split-half comparisons

    PubMed Central

    Groppe, David M.; Makeig, Scott; Kutas, Marta

    2011-01-01

    Independent component analysis (ICA) is a family of unsupervised learning algorithms that have proven useful for the analysis of the electroencephalogram (EEG) and magnetoencephalogram (MEG). ICA decomposes an EEG/MEG data set into a basis of maximally temporally independent components (ICs) that are learned from the data. As with any statistic, a concern with using ICA is the degree to which the estimated ICs are reliable. An IC may not be reliable if ICA was trained on insufficient data, if ICA training was stopped prematurely or at a local minimum (for some algorithms), or if multiple global minima were present. Consequently, evidence of ICA reliability is critical for the credibility of ICA results. In this paper, we present a new algorithm for assessing the reliability of ICs based on applying ICA separately to split-halves of a data set. This algorithm improves upon existing methods in that it considers both IC scalp topographies and activations, uses a probabilistically interpretable threshold for accepting ICs as reliable, and requires applying ICA only three times per data set. As evidence of the method’s validity, we show that the method can perform comparably to more time intensive bootstrap resampling and depends in a reasonable manner on the amount of training data. Finally, using the method we illustrate the importance of checking the reliability of ICs by demonstrating that IC reliability is dramatically increased by removing the mean EEG at each channel for each epoch of data rather than the mean EEG in a prestimulus baseline. PMID:19162199
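
    The core of a split-half comparison, matching components from two independently decomposed halves, might look like the sketch below. It matches only scalp topographies with a fixed threshold; the published algorithm also compares activations and uses a probabilistically interpretable threshold:

```python
import numpy as np

def match_components(topo_a, topo_b, thresh=0.9):
    """Greedy matching of IC scalp maps from two split-halves of a
    data set. topo_a, topo_b: (channels, n_ics) mixing-matrix columns
    from ICA run separately on each half. Pairs whose maps correlate
    above `thresh` (sign-invariant, since IC polarity is arbitrary)
    are flagged as reliable."""
    n_a = topo_a.shape[1]
    corr = np.abs(np.corrcoef(topo_a.T, topo_b.T)[:n_a, n_a:])
    reliable = []
    while corr.max() > thresh:
        i, j = np.unravel_index(np.argmax(corr), corr.shape)
        reliable.append((i, j, corr[i, j]))
        corr[i, :] = -1  # remove the matched pair from
        corr[:, j] = -1  # further consideration
    return reliable

# Toy usage: 3 genuine sources shared by both halves (with sign flips
# and noise) plus one spurious component per half.
rng = np.random.default_rng(2)
shared = rng.standard_normal((32, 3))
half1 = np.column_stack([shared + 0.05 * rng.standard_normal((32, 3)),
                         rng.standard_normal((32, 1))])
half2 = np.column_stack([-shared + 0.05 * rng.standard_normal((32, 3)),
                         rng.standard_normal((32, 1))])
print(match_components(half1, half2))  # only the 3 shared maps match
```

    The spurious fourth component fails to find a partner in the other half, which is exactly the behavior that distinguishes reliable ICs from decomposition artifacts.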

  3. International classification of reliability for implanted cochlear implant receiver stimulators.

    PubMed

    Battmer, Rolf-Dieter; Backous, Douglas D; Balkany, Thomas J; Briggs, Robert J S; Gantz, Bruce J; van Hasselt, Andrew; Kim, Chong Sun; Kubo, Takeshi; Lenarz, Thomas; Pillsbury, Harold C; O'Donoghue, Gerard M

    2010-10-01

    To design an international standard to be used when reporting reliability of the implanted components of cochlear implant systems to appropriate governmental authorities and cochlear implant (CI) centers, and for journal editors evaluating manuscripts involving cochlear implant reliability. The International Consensus Group for Cochlear Implant Reliability Reporting was assembled to unify ongoing efforts in the United States, Europe, Asia, and Australia to create a consistent and comprehensive classification system for the implanted components of CI systems across manufacturers. All members of the consensus group are from tertiary referral cochlear implant centers. None. A clinically relevant classification scheme adapted from principles of ISO standard 5841-2:2000, originally designed for reporting reliability of cardiac pacemakers, pulse generators, or leads. Standard definitions for device failure, survival time, clinical benefit, reduced clinical benefit, and specification were generated. Time intervals for reporting back to implant centers for devices tested to be "out of specification," categorization of explanted devices, the method of cumulative survival reporting, and the content of reliability reports to be issued by manufacturers were agreed upon by all members. The methodology for calculating cumulative survival was adapted from ISO standard 5841-2:2000. The International Consensus Group on Cochlear Implant Device Reliability Reporting recommends compliance with this new standard in reporting reliability of implanted CI components by all manufacturers of CIs, and the adoption of this standard as a minimal reporting guideline for editors of journals publishing cochlear implant research results.

  4. Training less-experienced faculty improves reliability of skills assessment in cardiac surgery.

    PubMed

    Lou, Xiaoying; Lee, Richard; Feins, Richard H; Enter, Daniel; Hicks, George L; Verrier, Edward D; Fann, James I

    2014-12-01

    Previous work has demonstrated high inter-rater reliability in the objective assessment of simulated anastomoses among experienced educators. We evaluated the inter-rater reliability of less-experienced educators and the impact of focused training with a video-embedded coronary anastomosis assessment tool. Nine less-experienced cardiothoracic surgery faculty members from different institutions evaluated 2 videos of simulated coronary anastomoses (1 by a medical student and 1 by a resident) at the Thoracic Surgery Directors Association Boot Camp. They then underwent a 30-minute training session using an assessment tool with embedded videos to anchor rating scores for 10 components of coronary artery anastomosis. Afterward, they evaluated 2 videos of a different student and resident performing the task. Components were scored on a 1 to 5 Likert scale, yielding an average composite score. Inter-rater reliabilities of component and composite scores were assessed using intraclass correlation coefficients (ICCs) and overall pass/fail ratings with kappa. All components of the assessment tool exhibited improvement in reliability, with 4 (bite, needle holder use, needle angles, and hand mechanics) improving the most from poor (ICC range, 0.09-0.48) to strong (ICC range, 0.80-0.90) agreement. After training, inter-rater reliabilities for composite scores improved from moderate (ICC, 0.76) to strong (ICC, 0.90) agreement, and for overall pass/fail ratings, from poor (kappa = 0.20) to moderate (kappa = 0.78) agreement. Focused, video-based anchor training facilitates greater inter-rater reliability in the objective assessment of simulated coronary anastomoses. Among raters with less teaching experience, such training may be needed before objective evaluation of technical skills. Published by Elsevier Inc.
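    The inter-rater reliability statistic used above can be illustrated with a minimal ICC(2,1) calculation (two-way random effects, absolute agreement, single rater) on a subjects-by-raters score matrix; the score values below are invented, not Boot Camp data:

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_subjects, k_raters) array of scores."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Three raters in perfect agreement on four performances -> ICC = 1.
scores = np.array([[3, 3, 3], [5, 5, 5], [2, 2, 2], [4, 4, 4]])
print(round(icc2_1(scores), 3))  # → 1.0
```

    Disagreement between raters lowers the residual and rater mean squares relative to the subject mean square, pulling the coefficient below 1.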

  5. Contamination-Free Manufacturing: Tool Component Qualification, Verification and Correlation with Wafers

    NASA Astrophysics Data System (ADS)

    Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei

    2003-09-01

    As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.

  6. 2nd Generation Reusable Launch Vehicle (2G RLV). Revised

    NASA Technical Reports Server (NTRS)

    Matlock, Steve; Sides, Steve; Kmiec, Tom; Arbogast, Tim; Mayers, Tom; Doehnert, Bill

    2001-01-01

    This is a revised final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, a reliability baseline (Space Shuttle Main Engine QRAS), component-level reliability/performance/cost for the six baseline cycles, and selection of three cycles for further study. This report further addresses technology improvement selection and component-level reliability/performance/cost for the three cycles selected for further study, as well as risk reduction plans and recommendations for future studies.

  7. Texture and haptic cues in slant discrimination: reliability-based cue weighting without statistically optimal cue combination

    NASA Astrophysics Data System (ADS)

    Rosas, Pedro; Wagemans, Johan; Ernst, Marc O.; Wichmann, Felix A.

    2005-05-01

    A number of models of depth-cue combination suggest that the final depth percept results from a weighted average of independent depth estimates based on the different cues available. The weight of each cue in such an average is thought to depend on the reliability of each cue. In principle, such a depth estimation could be statistically optimal in the sense of producing the minimum-variance unbiased estimator that can be constructed from the available information. Here we test such models by using visual and haptic depth information. Different texture types produce differences in slant-discrimination performance, thus providing a means for testing a reliability-sensitive cue-combination model with texture as one of the cues to slant. Our results show that the weights for the cues were generally sensitive to their reliability but fell short of statistically optimal combination - we find reliability-based reweighting but not statistically optimal cue combination.
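    The statistically optimal benchmark that the observers fell short of is the minimum-variance unbiased combination, which weights each cue by its inverse variance. The slant values and cue variances below are illustrative, not the study's data:

```python
import numpy as np

def combine_cues(estimates, variances):
    """Minimum-variance unbiased combination of independent cue estimates:
    each weight is proportional to the cue's reliability (inverse variance)."""
    inv_var = 1.0 / np.asarray(variances, dtype=float)
    w = inv_var / inv_var.sum()
    combined = float(np.dot(w, estimates))
    combined_var = 1.0 / inv_var.sum()   # always <= the best single cue
    return combined, combined_var, w

# Texture cue suggests 30 deg of slant (noisy); haptics suggests 34 deg
# (four times more reliable), so the estimate lands much closer to 34.
slant, var, weights = combine_cues([30.0, 34.0], [4.0, 1.0])
print(slant, var, weights)
```

    Reliability-based reweighting without full optimality, as reported in the abstract, would correspond to weights that move in the right direction but do not equal the inverse-variance ratios.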

  8. Data Applicability of Heritage and New Hardware For Launch Vehicle Reliability Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Novack, Steven

    2015-01-01

    Bayesian reliability requires the development of a prior distribution to represent degree of belief about the value of a parameter (such as a component's failure rate) before system specific data become available from testing or operations. Generic failure data are often provided in reliability databases as point estimates (mean or median). A component's failure rate is considered a random variable where all possible values are represented by a probability distribution. The applicability of the generic data source is a significant source of uncertainty that affects the spread of the distribution. This presentation discusses heuristic guidelines for quantifying uncertainty due to generic data applicability when developing prior distributions mainly from reliability predictions.
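    One common way to encode such applicability uncertainty, sketched here under the assumption of a conjugate gamma prior on a Poisson failure rate (the presentation itself does not prescribe a particular distribution family or inflation rule):

```python
# Sketch of a Bayesian failure-rate update seeded from a generic point
# estimate. Applicability uncertainty is expressed by inflating the prior's
# spread: the mean is kept, the shape parameter is shrunk.

def gamma_prior_from_generic(mean_rate, uncertainty_factor):
    """Return gamma (shape, rate) parameters with the generic mean but a
    wider spread for larger `uncertainty_factor` (an assumed heuristic)."""
    shape = 1.0 / uncertainty_factor     # smaller shape -> more diffuse prior
    rate = shape / mean_rate             # keeps the prior mean at mean_rate
    return shape, rate

def posterior(shape, rate, failures, exposure_hours):
    """Conjugate gamma update for Poisson-distributed failure counts."""
    return shape + failures, rate + exposure_hours

# Generic source quotes 1e-5 failures/hour; applicability is judged weak.
a0, b0 = gamma_prior_from_generic(mean_rate=1e-5, uncertainty_factor=4.0)
a1, b1 = posterior(a0, b0, failures=2, exposure_hours=3.0e5)
print("posterior mean rate:", a1 / b1)
```

    A diffuse prior lets the system-specific test data dominate the posterior; a tight prior (high confidence in applicability) resists it.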

  9. Teamwork as an Essential Component of High-Reliability Organizations

    PubMed Central

    Baker, David P; Day, Rachel; Salas, Eduardo

    2006-01-01

    Organizations are increasingly becoming dynamic and unstable. This evolution has given rise to greater reliance on teams and increased complexity in terms of team composition, skills required, and degree of risk involved. High-reliability organizations (HROs) are those that exist in such hazardous environments where the consequences of errors are high, but the occurrence of error is extremely low. In this article, we argue that teamwork is an essential component of achieving high reliability particularly in health care organizations. We describe the fundamental characteristics of teams, review strategies in team training, demonstrate the criticality of teamwork in HROs and finally, identify specific challenges the health care community must address to improve teamwork and enhance reliability. PMID:16898980

  10. Reliability systems for implantable cardiac defibrillator batteries

    NASA Astrophysics Data System (ADS)

    Takeuchi, Esther S.

    The reliability of the power sources used in implantable cardiac defibrillators is critical due to the life-saving nature of the device. Achieving a high reliability power source depends on several systems functioning together. Appropriate cell design is the first step in assuring a reliable product. Qualification of critical components and of the cells using those components is done prior to their designation as implantable grade. Product consistency is assured by control of manufacturing practices and verified by sampling plans using both accelerated and real-time testing. Results to date show that lithium/silver vanadium oxide cells used for implantable cardiac defibrillators have a calculated maximum random failure rate of 0.005% per test month.
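    Under a constant-hazard assumption, the quoted random failure rate translates directly into a cumulative survival curve. This is a generic exponential model using the abstract's 0.005%-per-test-month figure, not the manufacturer's qualification calculation:

```python
import math

# 0.005% per test month, expressed as a monthly hazard rate.
monthly_failure_rate = 0.00005

def survival(months, rate=monthly_failure_rate):
    """Exponential (constant-hazard) cumulative survival: S(t) = exp(-rate*t)."""
    return math.exp(-rate * months)

# Survival probability over a ten-year implant lifetime.
print(f"10-year survival: {survival(120):.5f}")
```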

  11. The development of hydrogen sensor technology at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.; Neudeck, Philip G.; Jefferson, G. D.; Madzsar, G. C.; Liu, C. C.; Wu, Q. H.

    1993-01-01

    The detection of hydrogen leaks in aerospace applications, especially those involving hydrogen fuel propulsion systems, is of extreme importance for reasons of reliability, safety, and economy. Motivated by leaks occurring in liquid hydrogen lines supplying the main engine of the Space Shuttle, NASA Lewis has initiated a program to develop point-contact hydrogen sensors which address the needs of aerospace applications. Several different approaches are being explored. They include the fabrication of PdAg Schottky diode structures, the characterization of PdCr as a hydrogen sensitive alloy, and the use of SiC as a semiconductor for hydrogen sensors. This paper discusses the motivation behind and present status of each of the major components of the NASA LeRC hydrogen sensor program.

  12. Industrial inspection of specular surfaces using a new calibration procedure

    NASA Astrophysics Data System (ADS)

    Aswendt, Petra; Hofling, Roland; Gartner, Soren

    2005-06-01

    The methodology of phase encoded reflection measurements has become a valuable tool for the industrial inspection of components with glossy surfaces. The measuring principle provides outstanding sensitivity for tiny variations of surface curvature, so that sub-micron waviness and flaws are reliably detected. Quantitative curvature measurements can be obtained from a simple approach if the object is almost flat. 3D-objects with a high aspect ratio require more effort to determine both the coordinates and the normal direction of a surface point unambiguously. Stereoscopic solutions have been reported using more than one camera for a certain surface area. This paper describes the combined double camera steady surface approach (DCSS) that is well suited for implementation in industrial testing stations.

  13. New acoustic techniques for leak detection in fossil fuel plant components

    NASA Astrophysics Data System (ADS)

    Parini, G.; Possa, G.

    Two on-line acoustic monitoring techniques for leak detection in feedwater preheaters and boilers of fossil fuel power plants are presented. The leak detection is based on the acoustic noise produced by the turbulent leak outflow. The primary sensors are piezoelectric pressure transducers, installed near the feedwater preheater inlets in direct contact with the water, or mounted on boiler observation windows. The frequency band of the auscultation ranges from a few kHz up to 10-15 kHz. The signals are characterized by their rms value, continuously recorded by means of potentiometric strip chart recorders. A leak is signalled when the signal rms exceeds predetermined threshold levels. Sensitivity, reliability, acceptance in plant control practice, and the cost-benefit balance are satisfactory.
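    The rms-threshold detection logic can be sketched as follows; the window length, threshold, and synthetic signal below are invented for illustration and do not reflect the plant instrumentation:

```python
import math

def window_rms(signal, window):
    """RMS of consecutive non-overlapping windows of the signal."""
    out = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        out.append(math.sqrt(sum(x * x for x in chunk) / window))
    return out

def detect_leak(signal, window, threshold):
    """Return indices of windows whose RMS exceeds the threshold level."""
    return [i for i, r in enumerate(window_rms(signal, window)) if r > threshold]

# Quiet alternating background with a simulated leak burst in the middle.
background = [0.01 * ((-1) ** n) for n in range(300)]
burst = [0.5 * ((-1) ** n) for n in range(100)]
signal = background[:150] + burst + background[150:]
print(detect_leak(signal, window=50, threshold=0.1))  # → [3, 4]
```

    In practice the threshold would be set from the background rms recorded during normal operation, trading false alarms against sensitivity.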

  14. Psychometric testing of an instrument to measure the experience of home.

    PubMed

    Molony, Sheila L; McDonald, Deborah Dillon; Palmisano-Mills, Christine

    2007-10-01

    Research related to quality of life in long-term care has been hampered by a paucity of measurement tools sensitive to environmental interventions. The primary aim of this study was to test the psychometric properties of a new instrument, the Experience of Home (EOH) Scale, designed to measure the strength of the experience of meaningful person-environment transaction. The instrument was administered to 200 older adults in diverse dwelling types. Principal components analysis provided support for construct validity, eliciting a three-factor solution accounting for 63.18% of variance in scores. Internal consistency reliability was supported with Cronbach's alpha of .96 for the entire scale. The EOH Scale is a unique research tool to evaluate interventions to improve quality of living in residential environments.
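    The internal-consistency statistic reported above can be reproduced on toy data with the standard Cronbach's alpha formula; the scores below are invented, not EOH Scale responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix,
    using sample variances (ddof=1) throughout."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Three highly consistent items across five respondents.
scores = np.array([
    [4, 4, 5],
    [2, 3, 2],
    [5, 5, 5],
    [3, 3, 4],
    [1, 2, 1],
])
print(round(cronbach_alpha(scores), 3))  # → 0.959
```

    Alpha approaches 1 when item variances are small relative to the variance of the summed scale score, i.e. when the items move together.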

  15. Rapid analysis of controlled substances using desorption electrospray ionization mass spectrometry.

    PubMed

    Rodriguez-Cruz, Sandra E

    2006-01-01

    The recently developed technique of desorption electrospray ionization (DESI) has been applied to the rapid analysis of controlled substances. Experiments have been performed using a commercial ThermoFinnigan LCQ Advantage MAX ion-trap mass spectrometer with limited modifications. Results from the ambient sampling of licit and illicit tablets demonstrate the ability of the DESI technique to detect the main active ingredient(s) or controlled substance(s), even in the presence of other higher-concentration components. Full-scan mass spectrometry data provide preliminary identification by molecular weight determination, while rapid analysis using the tandem mass spectrometry (MS/MS) mode provides fragmentation data which, when compared to the laboratory-generated ESI-MS/MS spectral library, provide structural information and final identification of the active ingredient(s). The consecutive analysis of tablets containing different active components indicates there is no cross-contamination or interference from tablet to tablet, demonstrating the reliability of the DESI technique for rapid sampling (one tablet/min or better). Active ingredients have been detected for tablets in which the active component represents less than 1% of the total tablet weight, demonstrating the sensitivity of the technique. The real-time sampling of cannabis plant material is also presented.

  16. Utilization of Pb-free solders in MEMS packaging

    NASA Astrophysics Data System (ADS)

    Selvaduray, Guna S.

    2003-01-01

    Soldering of components within a package plays an important role in providing electrical interconnection, mechanical integrity and thermal dissipation. MEMS packages present challenges that are more complex than microelectronic packages because they are far more sensitive to shock and vibration and also require precision alignment. Soldering is used at two major levels within a MEMS package: at the die attach level and at the component attach level. Emerging environmental regulations worldwide, notably in Europe and Japan, have targeted the elimination of Pb usage in electronic assemblies, due to the inherent toxicity of Pb. This has provided the driving force for development and deployment of Pb-free solder alloys. A relatively large number of Pb-free solder alloys have been proposed by various researchers and companies. Some of these alloys have also been patented. After several years of research, the solder alloy system that has emerged is based on Sn as a major component. The electronics industry has identified different compositions for different specific uses, such as wave soldering, surface mount reflow, etc. The factors that affect choice of an appropriate Pb-free solder can be divided into two major categories, those related to manufacturing, and those related to long term reliability and performance.

  17. Toward a Fault Tolerant Architecture for Vital Medical-Based Wearable Computing.

    PubMed

    Abdali-Mohammadi, Fardin; Bajalan, Vahid; Fathi, Abdolhossein

    2015-12-01

    Advancements in computers and electronic technologies have led to the emergence of a new generation of efficient small intelligent systems. The products of such technologies include smartphones and wearable devices, which have attracted the attention of medical applications. These products are used less in critical medical applications because of their resource constraints and failure sensitivity: without safety considerations, small integrated hardware will endanger patients' lives. Therefore, some principles are required for constructing wearable systems in healthcare so that the existing concerns are dealt with. Accordingly, this paper proposes an architecture for constructing wearable systems in critical medical applications. The proposed architecture is a three-tier one, supporting data flow from body sensors to the cloud. The tiers of this architecture are wearable computers, mobile computing, and mobile cloud computing. One of the features of this architecture is its high possible fault tolerance due to the nature of its components. Moreover, the required protocols are presented to coordinate the components of this architecture. Finally, the reliability of this architecture is assessed by simulating the architecture and its components, and other aspects of the proposed architecture are discussed.
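    A first-order way to see why redundancy across tiers raises fault tolerance is a series/parallel reliability calculation; the per-tier reliability figures below are assumed for illustration, not taken from the paper's simulation:

```python
from functools import reduce

def series(reliabilities):
    """All components must work for the path to work: R = product of R_i."""
    return reduce(lambda a, b: a * b, reliabilities)

def parallel(reliabilities):
    """At least one redundant component must work: R = 1 - product(1 - R_i)."""
    return 1.0 - reduce(lambda a, b: a * b, [1.0 - r for r in reliabilities])

# Hypothetical per-tier reliabilities: duplicated body sensors feed a
# mobile tier and a cloud tier that sit in series on the data path.
sensors = parallel([0.95, 0.95])        # redundant wearable sensors: 0.9975
system = series([sensors, 0.99, 0.999])  # sensors -> mobile -> cloud
print(round(system, 4))  # → 0.9865
```

    Duplicating the weakest tier (here the sensors) raises its effective reliability from 0.95 to 0.9975, which is why redundancy is concentrated where failure sensitivity is highest.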

  18. Emergency Dosimetry Using Ceramic Components in Personal Electronic Devices

    NASA Astrophysics Data System (ADS)

    Kouroukla, E. C.; Bailiff, I. K.; Terry, I.

    2014-02-01

    The rapid assessment of radiation dose to members of the public exposed to significant levels of ionizing radiation during a radiological incident presents a significant difficulty in the absence of planned radiation monitoring. However, within most personal electronic devices components such as resistors with alumina substrates can be found that have potentially suitable properties as solid state dosimeters using luminescence measurement techniques. The suitability of several types of ceramic-based components (e.g., resonators, inductors and resistors) has been previously examined using optically stimulated luminescence (OSL) and thermoluminescence (TL) techniques to establish their basic characteristics for the retrospective determination of absorbed dose. In this paper, we present results obtained with aluminum oxide surface mount resistors extracted from mobile phones that further extend this work. Very encouraging results have been obtained related to the measurement of luminescence sensitivity, dose response, reusability, limit of detection, signal reproducibility and known-dose recovery. However, the alumina exhibits a rapid loss of the latent luminescence signal with time following irradiation attributed to athermal (or anomalous) fading. The issues related to obtaining a reliable correction protocol for this loss and the detailed examinations required of the fading behavior are discussed.

  19. Neutron dose estimation in a zero power nuclear reactor

    NASA Astrophysics Data System (ADS)

    Triviño, S.; Vedelago, J.; Cantargi, F.; Keil, W.; Figueroa, R.; Mattea, F.; Chautemps, A.; Santibañez, M.; Valente, M.

    2016-10-01

    This work presents the characterization and contribution of neutron and gamma components to the absorbed dose in a zero power nuclear reactor. A dosimetric method based on Fricke gel was implemented to evaluate the separation between dose components in the mixed field. The validation of this proposed method was performed by means of direct measurements of neutron flux at different positions using Au and Mg-Ni activation foils. Monte Carlo simulations were also performed using the MCNP main code with a dedicated subroutine to incorporate the complete geometry of the nuclear reactor facility. Once the nuclear fuel elements were defined, the simulations computed the different contributions to the absorbed dose at specific positions inside the core. Thermal/epithermal contributions to the absorbed dose were assessed by means of Fricke gel dosimetry using different isotopic compositions aimed at modifying the sensitivity of the dosimeter to specific dose components. Clear distinctions between gamma and neutron capture dose were obtained. Both Monte Carlo simulations and experimental results provided reliable estimates of the neutron flux rate as well as the dose rate during reactor operation. Simulations and experimental results are in good agreement at every position measured and simulated in the core.

  20. Analysis of confidence in continental-scale groundwater recharge estimates for Africa using a distributed water balance model

    NASA Astrophysics Data System (ADS)

    Mackay, Jonathan; Mansour, Majdi; Bonsor, Helen; Pachocka, Magdalena; Wang, Lei; MacDonald, Alan; Macdonald, David; Bloomfield, John

    2014-05-01

    There is a growing need for improved access to reliable water in Africa as population and food production increase. Currently, approximately 300 million people do not have access to a secure source of safe drinking water. To meet these current and future demands, groundwater will need to be increasingly abstracted; groundwater is more reliable than surface water sources due to its relatively long response time to meteorological stresses, and is therefore likely to be a more secure water resource in a more variable climate. Recent studies have also quantified the volumes of groundwater potentially available, which suggest that, if exploited, groundwater could help to meet the demand for fresh water. However, there is still considerable uncertainty as to how these resources may respond in the future due to changes in groundwater recharge and abstraction. Understanding and quantifying groundwater recharge is vital, as it forms a primary indicator of the sustainability of underlying groundwater resources. Computational hydrological models provide a means to do this, but the complexity of recharge processes in Africa means that these simulations are often highly uncertain. This study aims to evaluate our confidence in simulating groundwater recharge over Africa based on a sensitivity analysis using a distributed hydrological model developed by the British Geological Survey, ZOODRM. The model includes land surface, canopy, river, soil and groundwater components. Each component is able to exchange water and as such forms a distributed water balance of Africa. The components have been parameterised using available spatial datasets of African vegetation, land-use, soil and hydrogeology, while the remaining parameters have been estimated by calibrating the model to available river flow data. Continental-scale gridded precipitation and potential evapotranspiration datasets, based on remotely sensed and ground observations, have been used to force the model.
Following calibration, the sensitivity analysis has been undertaken in two stages. For the first stage, individual parameters are perturbed from each component of the model. For the second stage, different methods for calculating groundwater recharge are introduced. Both stages aim to investigate which aspects of the model most impact on groundwater recharge and consequently how confidently we can simulate the complex recharge processes that occur in Africa using large scale hydrological models. Preliminary results from the analysis indicate the parameters that control runoff generation from the land surface and the choice of groundwater recharge calculation method both have a significant impact on groundwater recharge simulations.
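    The first-stage, one-at-a-time perturbation idea can be sketched on a toy water balance; the recharge function and parameter values below are invented and stand in for the ZOODRM components:

```python
# One-at-a-time (OAT) sensitivity sketch on a toy recharge function.
# The function and its parameters are illustrative, not the ZOODRM model.

def recharge(rainfall, runoff_coeff, soil_store, et_fraction):
    """Toy water balance: rain minus runoff, evapotranspiration, soil refill."""
    effective = rainfall * (1.0 - runoff_coeff)
    after_et = effective * (1.0 - et_fraction)
    return max(after_et - soil_store, 0.0)

base = dict(rainfall=800.0, runoff_coeff=0.3, soil_store=150.0, et_fraction=0.5)

def oat_sensitivity(params, perturbation=0.1):
    """Relative change in recharge for a +10% perturbation of each parameter,
    one at a time, with all others held at their base values."""
    r0 = recharge(**params)
    out = {}
    for name in params:
        p = dict(params)
        p[name] *= 1.0 + perturbation
        out[name] = (recharge(**p) - r0) / r0
    return out

for name, s in sorted(oat_sensitivity(base).items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:12s} {s:+.2f}")
```

    Ranking the relative changes identifies which model components most constrain confidence in the recharge estimate, mirroring the study's finding that runoff-generation parameters dominate.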

  1. Sensitivity, specificity, and predictive values of pediatric metabolic syndrome components in relation to adult metabolic syndrome: the Princeton LRC follow-up study.

    PubMed

    Huang, Terry T-K; Nansel, Tonja R; Belsheim, Allen R; Morrison, John A

    2008-02-01

    To estimate the sensitivity, specificity, and predictive values of pediatric metabolic syndrome (MetS) components (obesity, fasting glucose, triglycerides, high-density lipoprotein, and blood pressure) at various cutoff points in relation to adult MetS. Data from the National Heart, Lung, and Blood Institute Lipid Research Clinics Princeton Prevalence Study (1973-1976) and the Princeton Follow-up Study (2000-2004) were used to calculate sensitivity, specificity, and positive and negative predictive values for each component at a given cutoff point and for aggregates of components. Individual pediatric components alone showed low to moderate sensitivity, high specificity, and moderate predictive values in relation to adult MetS. When all 5 pediatric MetS components were considered, the presence of at least 1 abnormality had higher sensitivity for adult MetS than individual components alone. When multiple abnormalities were mandatory for MetS, positive predictive value was high and sensitivity was low. Childhood body mass alone showed neither high sensitivity nor high positive predictive value for adult MetS. Considering multiple metabolic variables in childhood can improve the predictive usefulness for adult MetS, compared with each component or body mass alone. MetS variables may be useful for identifying some children who are at risk for prevention interventions.
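    All four statistics are simple functions of a 2x2 table of childhood test results against the adult MetS outcome; the counts below are invented and merely reproduce the qualitative pattern reported (low sensitivity, high specificity, moderate predictive values):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and predictive values from a 2x2 table of
    true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),   # P(test+ | adult MetS)
        "specificity": tn / (tn + fp),   # P(test- | no adult MetS)
        "ppv": tp / (tp + fp),           # P(adult MetS | test+)
        "npv": tn / (tn + fn),           # P(no adult MetS | test-)
    }

# Illustrative counts for one pediatric component at one cutoff point.
m = diagnostic_metrics(tp=30, fp=20, fn=45, tn=405)
for k, v in m.items():
    print(f"{k}: {v:.2f}")
```

    Requiring multiple abnormal components trades fn for fp in the opposite direction, which is why the abstract reports high PPV but low sensitivity for that rule.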

  2. Evaluation of the branched-chain DNA assay for measurement of RNA in formalin-fixed tissues.

    PubMed

    Knudsen, Beatrice S; Allen, April N; McLerran, Dale F; Vessella, Robert L; Karademos, Jonathan; Davies, Joan E; Maqsodi, Botoul; McMaster, Gary K; Kristal, Alan R

    2008-03-01

    We evaluated the branched-chain DNA (bDNA) assay QuantiGene Reagent System to measure RNA in formalin-fixed, paraffin-embedded (FFPE) tissues. The QuantiGene Reagent System does not require RNA isolation, avoids enzymatic preamplification, and has a simple workflow. Five selected genes were measured by bDNA assay; quantitative polymerase chain reaction (qPCR) was used as a reference method. Mixed-effect statistical models were used to partition the overall variance into components attributable to xenograft, sample, and assay. For FFPE tissues, the coefficients of reliability were significantly higher for the bDNA assay (93-100%) than for qPCR (82.4-95%). Correlations between qPCR(FROZEN), the gold standard, and bDNA(FFPE) ranged from 0.60 to 0.94, similar to those from qPCR(FROZEN) and qPCR(FFPE). Additionally, the sensitivity of the bDNA assay in tissue homogenates was 10-fold higher than in purified RNA. In 9- to 13-year-old blocks with poor RNA quality, the bDNA assay allowed the correct identification of the overexpression of known cancer genes. In conclusion, the QuantiGene Reagent System is considerably more reliable, reproducible, and sensitive than qPCR, providing an alternative method for the measurement of gene expression in FFPE tissues. It also appears to be well suited for the clinical analysis of FFPE tissues with diagnostic or prognostic gene expression biomarker panels for use in patient treatment and management.
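    A coefficient of reliability from a variance-component partition is the share of total variance attributable to true between-xenograft differences rather than sampling or assay noise; the variance values below are illustrative, not the study's estimates:

```python
def coefficient_of_reliability(var_xenograft, var_sample, var_assay):
    """Fraction of total variance due to true (between-xenograft) differences,
    given variance components from a mixed-effect partition."""
    total = var_xenograft + var_sample + var_assay
    return var_xenograft / total

# Illustrative components: most variance comes from real expression
# differences, little from the sampling or assay steps.
print(round(coefficient_of_reliability(9.3, 0.4, 0.3), 3))  # → 0.93
```

    An assay with smaller sample and assay components, like the bDNA assay here, yields a coefficient closer to 1 than one with a noisier measurement step.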

  4. Reliability and validity of the delta finger-to-palm (FTP), a new measure of finger range of motion in systemic sclerosis

    PubMed Central

    Torok, Kathryn S.; Baker, Nancy A.; Lucas, Mary; Domsic, Robyn T.; Boudreau, Robert; Medsger, Thomas A.

    2010-01-01

    Objectives To determine the reliability and validity of a new measure of finger motion in patients with systemic sclerosis (SSc), the ‘delta finger-to-palm’ (delta FTP) and compare its psychometric properties to the traditional measure of finger motion, the finger-to-palm (FTP). Methods Phase 1: The reliability of the delta FTP and FTP were examined in 39 patients with SSc. Phase 2: Criterion and convergent construct validity of both measures were examined in 17 patients with SSc by comparing them to other clinical measures: Total Active Range of Motion (TAROM), Hand Mobility in Scleroderma (HAMIS), the Duruoz Hand Index (DHI), Health Assessment Questionnaire (HAQ), and modified Rodnan skin score (mRSS). Phase 3: Sensitivity to change of the delta FTP was investigated in 24 patients with early diffuse cutaneous SSc. Results Both measures had excellent intra-rater and inter-rater reliability (ICC 0.92 to 0.99). Fair to strong correlations (rs=0.49–0.94) were observed between the delta FTP and TAROM, HAMIS, and DHI. Fair to moderate correlations were observed between delta FTP and HAQ components related to hand function and upper extremity mRSS. Correlations of the traditional FTP with these measures were fair to strong, but most often the delta FTP outperformed the FTP. The effect size and standardised response mean for the mean delta FTP were 0.50 and 1.10 respectively, over a 2–8 month period. Conclusion The delta FTP is a valid and reliable measure of finger motion in patients with SSc which outperforms the FTP. PMID:20576211
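    The two responsiveness statistics reported for the delta FTP (effect size and standardised response mean) differ only in which standard deviation normalizes the mean change; the paired scores below are invented, not study data:

```python
import statistics

def effect_size(baseline, follow_up):
    """Effect size: mean change divided by the SD of the baseline scores."""
    change = [f - b for b, f in zip(baseline, follow_up)]
    return statistics.mean(change) / statistics.stdev(baseline)

def standardized_response_mean(baseline, follow_up):
    """SRM: mean change divided by the SD of the change scores."""
    change = [f - b for b, f in zip(baseline, follow_up)]
    return statistics.mean(change) / statistics.stdev(change)

# Hypothetical finger-motion scores (cm) before and after treatment.
before = [2.0, 3.5, 1.0, 4.0, 2.5, 3.0]
after = [3.0, 4.0, 2.5, 4.5, 3.5, 4.0]
print(round(effect_size(before, after), 2),
      round(standardized_response_mean(before, after), 2))
```

    When change is consistent across patients, the SD of the change scores is small, so the SRM exceeds the effect size, as in the abstract's 0.50 vs. 1.10.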

  5. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    NASA Astrophysics Data System (ADS)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Micro-devices are now in widespread use, especially in the automotive industry, since they combine the mechanical structure and the electronic control circuit on a single board. Their widespread use motivates the development of virtual prototyping tools that can be applied in design optimization while accounting for technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices based on the theory of reliability-based robust design optimization, which takes into consideration the performance of a micro-device and its reliability as assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by a sensitivity analysis to establish the design and uncertain domains. Genetic algorithms fulfilled the defined optimization task effectively. The best individuals found are characterized by a minimized value of the multi-criteria objective function while simultaneously satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced through a conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.

  6. Effect of Cyclic Thermo-Mechanical Loads on Fatigue Reliability in Polymer Matrix Composites

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Murthy, P. L. N.; Chamis, C. C.

    1996-01-01

    A methodology to compute the probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multi-factor interaction relationship developed at NASA Lewis Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/±45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  7. Reliability of provocative tests of motion sickness susceptibility

    NASA Technical Reports Server (NTRS)

    Calkins, D. S.; Reschke, M. F.; Kennedy, R. S.; Dunlop, W. P.

    1987-01-01

    Test-retest reliability values were derived from motion sickness susceptibility scores obtained from two successive exposures to each of three tests: (1) Coriolis sickness sensitivity test; (2) staircase velocity movement test; and (3) parabolic flight static chair test. The reliability of the three tests ranged from 0.70 to 0.88. Normalizing values from predictors with skewed distributions improved the reliability.
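    Test-retest reliability of this kind is conventionally the correlation between scores from the two exposures, and the abstract notes that normalizing skewed predictors improved it. A hedged sketch (the susceptibility scores below are made up, and a log transform stands in for whatever normalization the study used):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical susceptibility scores from two successive exposures.
first  = [12, 30, 7, 55, 21, 90, 15, 40]
second = [10, 34, 9, 48, 25, 99, 13, 37]

r_raw = pearson_r(first, second)
# Log-normalizing skewed scores can raise the test-retest correlation.
r_log = pearson_r([math.log(v) for v in first],
                  [math.log(v) for v in second])
print(round(r_raw, 2), round(r_log, 2))
```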

  8. High power diode lasers emitting from 639 nm to 690 nm

    NASA Astrophysics Data System (ADS)

    Bao, L.; Grimshaw, M.; DeVito, M.; Kanskar, M.; Dong, W.; Guan, X.; Zhang, S.; Patterson, J.; Dickerson, P.; Kennedy, K.; Li, S.; Haden, J.; Martinsen, R.

    2014-03-01

    There is increasing market demand for high-power, reliable red lasers for display and cinema applications. Due to the fundamental material-system limit in this wavelength range, red diode lasers have lower efficiency and are more temperature-sensitive than 790-980 nm diode lasers. In terms of reliability, red lasers are also more sensitive to catastrophic optical mirror damage (COMD) due to the higher photon energy. Thus, developing high-power, reliable red lasers is very challenging. This paper presents nLIGHT's released red products from 639 nm to 690 nm, with established high performance and long-term reliability. These single-emitter diode lasers can work as stand-alone single-emitter units or integrate efficiently into our compact, passively-cooled Pearl™ fiber-coupled module architectures for higher output power and improved reliability. To further improve power and reliability, new chip optimizations have focused on improving epitaxial design/growth, chip configuration/processing, and optical facet passivation. Initial optimization has demonstrated promising results for 639 nm diode lasers to be reliably rated at 1.5 W and 690 nm diode lasers to be reliably rated at 4.0 W. Accelerated life-testing has started, and further design optimizations are underway.

  9. Apples to Apples: Equivalent-Reliability Power Systems Across Diverse Resource Mix Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephen, Gordon W; Frew, Bethany A; Sigler, Devon

    Electricity market research is highly price-sensitive, and prices are strongly influenced by the balance of supply and demand. This work looks at how to combine capacity expansion models and reliability assessment tools to assess equivalent-reliability power systems across diverse resource mix scenarios.

  10. A Cross-Layer Optimized Opportunistic Routing Scheme for Loss-and-Delay Sensitive WSNs

    PubMed Central

    Xu, Xin; Yuan, Minjiao; Liu, Xiao; Cai, Zhiping; Wang, Tian

    2018-01-01

    In wireless sensor networks (WSNs), communication links are typically error-prone and unreliable, so providing reliable and timely data routing for loss- and delay-sensitive applications in WSNs is a challenging issue. Additionally, with specific thresholds in practical applications, loss and delay sensitivity implies requirements for high reliability and low delay. Opportunistic Routing (OR) has been well studied in WSNs as a way to improve reliability over error-prone and unreliable wireless communication links, where the transmission power is assumed to be identical across the whole network. In this paper, a Cross-layer Optimized Opportunistic Routing (COOR) scheme is proposed to improve communication link reliability and reduce delay for loss-and-delay sensitive WSNs. The main contribution of the COOR scheme is making full use of the remaining energy in the network to increase the transmission power of most nodes, which provides higher communication reliability or a longer transmission distance. Two optimization strategies of the COOR scheme, referred to as COOR(R) and COOR(P), are proposed to improve network performance. When increasing the transmission power, the COOR(R) strategy chooses a next-hop candidate node that has higher communication reliability at the same distance than in traditional opportunistic routing. Since the reliability of data transmission is improved, the delay of the data reaching the sink is reduced by shortening the time of communication between candidate nodes. On the other hand, the COOR(P) strategy prefers a node that has the same communication reliability at a longer distance.
As a result, network performance can be improved for the following reasons: (a) the delay is reduced because fewer hops are needed for the packet to reach the sink under longer transmission distances; (b) the reliability can be improved because end-to-end reliability is the product of the per-hop reliabilities along the routing path, and the hop count is reduced while the reliability of each hop matches that of the traditional method. After analyzing the energy consumption of the network in detail, the value of the optimized transmission power in different areas is given. On the basis of extensive experimental and theoretical analyses, the results show that the COOR scheme increases communication reliability by 36.62–87.77%, decreases delay by 21.09–52.48%, and balances the energy consumption of 86.97% of the nodes in the WSNs. PMID:29751589
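    The hop-count argument in (b) rests on end-to-end reliability being the product of the per-hop reliabilities, so a path with fewer hops at the same per-hop reliability is more reliable. A minimal illustration (the numbers are hypothetical, not from the paper):

```python
def path_reliability(per_hop, hops):
    """End-to-end delivery probability over a multi-hop path,
    assuming independent per-hop successes."""
    return per_hop ** hops

# Same per-hop reliability; a longer transmission distance means fewer hops.
print(round(path_reliability(0.9, 6), 4))  # → 0.5314 (shorter-range hops)
print(round(path_reliability(0.9, 4), 4))  # → 0.6561 (COOR(P): fewer hops)
```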

  11. A Cross-Layer Optimized Opportunistic Routing Scheme for Loss-and-Delay Sensitive WSNs.

    PubMed

    Xu, Xin; Yuan, Minjiao; Liu, Xiao; Liu, Anfeng; Xiong, Neal N; Cai, Zhiping; Wang, Tian

    2018-05-03

    In wireless sensor networks (WSNs), communication links are typically error-prone and unreliable, so providing reliable and timely data routing for loss- and delay-sensitive applications in WSNs is a challenging issue. Additionally, with specific thresholds in practical applications, loss and delay sensitivity implies requirements for high reliability and low delay. Opportunistic Routing (OR) has been well studied in WSNs as a way to improve reliability over error-prone and unreliable wireless communication links, where the transmission power is assumed to be identical across the whole network. In this paper, a Cross-layer Optimized Opportunistic Routing (COOR) scheme is proposed to improve communication link reliability and reduce delay for loss-and-delay sensitive WSNs. The main contribution of the COOR scheme is making full use of the remaining energy in the network to increase the transmission power of most nodes, which provides higher communication reliability or a longer transmission distance. Two optimization strategies of the COOR scheme, referred to as COOR(R) and COOR(P), are proposed to improve network performance. When increasing the transmission power, the COOR(R) strategy chooses a next-hop candidate node that has higher communication reliability at the same distance than in traditional opportunistic routing. Since the reliability of data transmission is improved, the delay of the data reaching the sink is reduced by shortening the time of communication between candidate nodes. On the other hand, the COOR(P) strategy prefers a node that has the same communication reliability at a longer distance.
As a result, network performance can be improved for the following reasons: (a) the delay is reduced because fewer hops are needed for the packet to reach the sink under longer transmission distances; (b) the reliability can be improved because end-to-end reliability is the product of the per-hop reliabilities along the routing path, and the hop count is reduced while the reliability of each hop matches that of the traditional method. After analyzing the energy consumption of the network in detail, the value of the optimized transmission power in different areas is given. On the basis of extensive experimental and theoretical analyses, the results show that the COOR scheme increases communication reliability by 36.62–87.77%, decreases delay by 21.09–52.48%, and balances the energy consumption of 86.97% of the nodes in the WSNs.

  12. A Report on Simulation-Driven Reliability and Failure Analysis of Large-Scale Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Lipeng; Wang, Feiyi; Oral, H. Sarp

    High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as an exponential failure rate) to achieve tractable, closed-form solutions. However, such models have been shown to be insufficient for assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storage systems at scale and investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, and failure patterns and propagation, and performs dependency analysis to capture a wide range of failure cases. We evaluate the framework against a large-scale storage system that is in production and analyze its failure projections toward and beyond the end of its lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and present preliminary results.

  13. Diverse Redundant Systems for Reliable Space Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Reliable life support systems are required for deep space missions. The probability of a fatal life support failure should be less than one in a thousand over a multi-year mission. It is far too expensive to develop a single system with such high reliability. Using three redundant units would require only that each have a failure probability of one in ten over the mission. Since system development cost scales inversely with the failure probability, this would cut cost by a factor of one hundred. Using replaceable subsystems instead of full systems would further cut cost. Using full sets of replaceable components improves reliability more than using complete systems as spares, since a set of components can repair many different failures instead of just one. Replaceable components would require more tools, space, and planning than full systems or replaceable subsystems. However, identical system redundancy cannot be relied on in practice. Common cause failures can disable all the identical redundant systems, and typical levels of common cause failure will defeat redundancy greater than two. Diverse redundant systems are therefore required for reliable space life support. Three, four, or five diverse redundant systems could be needed for sufficient reliability. One system with lower level repair could be substituted for two diverse systems to save cost.
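    The abstract's redundancy arithmetic, plus its common-cause caveat, can be sketched directly. The beta-factor term below is a standard way to model common-cause failure; the beta value here is hypothetical:

```python
def independent_all_fail(p, n):
    """Probability that all n independent redundant units fail."""
    return p ** n

def beta_factor_all_fail(p, n, beta):
    """Beta-factor common-cause sketch: a fraction beta of unit failures
    strikes every unit at once, defeating the redundancy."""
    return beta * p + ((1 - beta) * p) ** n

# Three independent units, each with a 10% mission failure probability,
# meet the one-in-a-thousand target; a modest common-cause fraction
# (beta = 0.05, hypothetical) dominates and defeats redundancy beyond two.
print(round(independent_all_fail(0.1, 3), 6))        # → 0.001
print(round(beta_factor_all_fail(0.1, 3, 0.05), 6))  # → 0.005857
```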

  14. Increasing the reliability of the fluid/crystallized difference score from the Kaufman Adolescent and Adult Intelligence Test with reliable component analysis.

    PubMed

    Caruso, J C

    2001-06-01

    The unreliability of difference scores is a well-documented phenomenon in the social sciences and has led researchers and practitioners to interpret differences cautiously, if at all. In the case of the Kaufman Adult and Adolescent Intelligence Test (KAIT), the unreliability of the difference between the Fluid IQ and the Crystallized IQ is due to the high correlation between the two scales. The consequences of the lack of precision with which differences are identified are wide confidence intervals and low-power significance tests (i.e., large differences are required to be declared statistically significant). Reliable component analysis (RCA) was performed on the subtests of the KAIT in order to address these problems. RCA is a new data reduction technique that results in uncorrelated component scores with maximum proportions of reliable variance. Results indicate that the scores defined by RCA have discriminant and convergent validity (with respect to the equally weighted scores) and that differences between the scores, derived from a single testing session, were more reliable than differences derived from equal weighting for each age group (11-14 years, 15-34 years, 35-85+ years). This reliability advantage results in narrower confidence intervals around difference scores and smaller differences required for statistical significance.
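    The classical psychometric formula for the reliability of a difference score (assuming equal scale variances) makes the problem concrete: two highly reliable but highly intercorrelated scales still yield an unreliable difference. The coefficients below are illustrative, not the KAIT's published values:

```python
def difference_score_reliability(r_xx, r_yy, r_xy):
    """Reliability of the difference X - Y under the classical formula,
    assuming the two scales have equal variances."""
    return ((r_xx + r_yy) / 2 - r_xy) / (1 - r_xy)

# Two scales each with reliability .95, correlated .85: the difference
# score's reliability drops to about .67.
print(round(difference_score_reliability(0.95, 0.95, 0.85), 3))  # → 0.667
```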

  15. Dietary biomarkers: advances, limitations and future directions

    PubMed Central

    2012-01-01

    The subjective nature of self-reported dietary intake assessment methods presents numerous challenges to obtaining accurate dietary intake and nutritional status. This limitation can be overcome by the use of dietary biomarkers, which are able to objectively assess dietary consumption (or exposure) without the bias of self-reported dietary intake errors. The need for dietary biomarkers was addressed by the Institute of Medicine, who recognized the lack of nutritional biomarkers as a knowledge gap requiring future research. The purpose of this article is to review existing literature on currently available dietary biomarkers, including novel biomarkers of specific foods and dietary components, and assess the validity, reliability and sensitivity of the markers. This review revealed several biomarkers in need of additional validation research; research is also needed to produce sensitive, specific, cost-effective and noninvasive dietary biomarkers. The emerging field of metabolomics may help to advance the development of food/nutrient biomarkers, yet advances in food metabolome databases are needed. The availability of biomarkers that estimate intake of specific foods and dietary components could greatly enhance nutritional research targeting compliance to national recommendations as well as direct associations with disease outcomes. More research is necessary to refine existing biomarkers by accounting for confounding factors, to establish new indicators of specific food intake, and to develop techniques that are cost-effective, noninvasive, rapid and accurate measures of nutritional status. PMID:23237668

  16. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  17. System principles, mathematical models and methods to ensure high reliability of safety systems

    NASA Astrophysics Data System (ADS)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of components designed for detection, localization, tracking, collection, and processing of information from monitoring, telemetry, and control systems. They are required to be highly reliable in order to correctly perform data aggregation, processing, and analysis for subsequent decision-making support. During the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure high reliability of signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Different component types perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators, such as cost or power consumption. The systematic use of different component types increases the probability of task completion and eliminates common-cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used to solve optimal redundancy problems on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.

  18. Reliable change, sensitivity, and specificity of a multidimensional concussion assessment battery: implications for caution in clinical practice.

    PubMed

    Register-Mihalik, Johna K; Guskiewicz, Kevin M; Mihalik, Jason P; Schmidt, Julianne D; Kerr, Zachary Y; McCrea, Michael A

    2013-01-01

    To provide reliable change confidence intervals for common clinical concussion measures using a healthy sample of collegiate athletes and to apply these reliable change parameters to a sample of concussed collegiate athletes. Two independent samples were included in the study and evaluated on common clinical measures of concussion. The healthy sample included male, collegiate football student-athletes (n = 38) assessed at 2 time points. The concussed sample included college-aged student-athletes (n = 132) evaluated before and after a concussion. Outcome measures included symptom severity scores, Automated Neuropsychological Assessment Metrics throughput scores, and Sensory Organization Test composite scores. Application of the reliable change parameters suggests that a small percentage of concussed participants were impaired on each measure. We identified a low sensitivity of the entire battery (all measures combined) of 50% but high specificity of 96%. Clinicians should be trained in understanding clinical concussion measures and should be aware of evidence suggesting the multifaceted battery is more sensitive than any single measure. Clinicians should be cautioned that sensitivity to balance and neurocognitive impairments was low for each individual measure. Applying the confidence intervals to our injured sample suggests that these measures do not adequately identify postconcussion impairments when used in isolation.
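    Reliable change confidence intervals of the kind described are commonly built from the Jacobson-Truax reliable change index. A hedged sketch (the formula is the standard one, but the scores, baseline SD, and test-retest reliability below are hypothetical, not this study's values):

```python
import math

def reliable_change_index(x1, x2, sd_baseline, r_testretest):
    """Jacobson-Truax reliable change index: |RCI| > 1.96 suggests change
    beyond measurement error at the 95% confidence level."""
    sem = sd_baseline * math.sqrt(1 - r_testretest)   # standard error of measurement
    se_diff = math.sqrt(2) * sem                      # SE of the difference score
    return (x2 - x1) / se_diff

# Hypothetical symptom-severity scores before and after a concussion.
rci = reliable_change_index(x1=5, x2=21, sd_baseline=8.0, r_testretest=0.70)
print(round(rci, 2))  # exceeds 1.96, so the change outstrips measurement error
```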

  19. Status of the Flooding Fragility Testing Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, C. L.; Savage, B.; Bhandari, B.

    2016-06-01

    This report provides an update on research addressing nuclear power plant component reliability under flooding conditions. The research includes use of the Component Flooding Evaluation Laboratory (CFEL), where individual components and component subassemblies will be tested to failure under various flooding conditions. The resulting component reliability data can then be incorporated with risk simulation strategies to provide a more thorough representation of overall plant risk. The CFEL development strategy consists of four interleaved phases. Phase 1 addresses the design and application of CFEL with water rise and water spray capabilities, allowing testing of passive and active components, including fully electrified components. Phase 2 addresses research into wave generation techniques, followed by the design and addition of wave generation capability to CFEL. Phase 3 addresses methodology development activities, including small-scale component testing, development of full-scale component testing protocols, and simulation techniques including Smoothed Particle Hydrodynamics (SPH) based computer codes. Phase 4 involves full-scale component testing, including work in a surrogate CFEL testing apparatus.

  20. System reliability analysis through corona testing

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Mueller, L. A.; Koutnik, E. A.

    1975-01-01

    In the Reliability and Quality Engineering Test Laboratory at the NASA Lewis Research Center a nondestructive, corona-vacuum test facility for testing power system components was developed using commercially available hardware. The test facility was developed to simulate operating temperature and vacuum while monitoring corona discharges with residual gases. This facility is being used to test various high voltage power system components.

  1. Missile Systems Maintenance, AFSC 411XOB/C.

    DTIC Science & Technology

    1988-04-01

    A statistical measurement of the agreement between each technician's ratings and the senior technician's ratings, known as the interrater reliability, was assessed through components of variance. Keywords: fabrication, transistors, input/output (peripheral) devices, solid-state special-purpose devices, computer microprocessors and programs, power supplies.

  2. A PC program to optimize system configuration for desired reliability at minimum cost

    NASA Technical Reports Server (NTRS)

    Hills, Steven W.; Siahpush, Ali S.

    1994-01-01

    High reliability is desired in all engineered systems. One way to improve system reliability is to use redundant components. When redundant components are used, the problem becomes one of allocating them to achieve the best reliability without exceeding other design constraints such as cost, weight, or volume. Systems with few components can be optimized by simply examining every possible combination, but the number of combinations for most systems is prohibitive. A computerized iteration of the process is possible, but anything short of a supercomputer requires too much time to be practical. Many researchers have derived mathematical formulations for calculating the optimum configuration directly. However, most of the derivations are based on continuous functions, whereas the real system is composed of discrete entities. Therefore, these techniques are approximations of the true optimum solution. This paper describes a computer program that determines the optimum configuration of a system with multiple redundancy of both standard and optional components. The algorithm is a pair-wise comparative progression technique that can derive the true optimum by calculating only a small fraction of the total number of combinations. A designer can quickly analyze a system with this program on a personal computer.
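    The allocation problem the program solves can be illustrated by brute force on a toy system, the very approach the abstract notes is only feasible for few components (the paper's pair-wise comparative algorithm is not reproduced here; the stage reliabilities, costs, and budget are hypothetical):

```python
from itertools import product

def system_reliability(rel, counts):
    """Series system of parallel-redundant stages: each stage works if
    at least one of its redundant copies works."""
    r = 1.0
    for p, n in zip(rel, counts):
        r *= 1 - (1 - p) ** n
    return r

# Hypothetical stages: per-unit reliability and per-unit cost.
rel  = [0.90, 0.95, 0.80]
cost = [4, 7, 3]
budget = 30

# Exhaustive search over 1-3 copies per stage, feasible only because
# the system is tiny (3**3 = 27 combinations).
best = max(
    (c for c in product(range(1, 4), repeat=3)
     if sum(n * k for n, k in zip(c, cost)) <= budget),
    key=lambda c: system_reliability(rel, c),
)
print(best, round(system_reliability(rel, best), 4))  # → (2, 2, 2) 0.948
```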

  3. Reliability and Validity of the Sensory Component of the International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI): A Systematic Review.

    PubMed

    Hales, M; Biros, E; Reznik, J E

    2015-01-01

    Since 1982, the International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI) has been used to classify sensation of spinal cord injury (SCI) through pinprick and light touch scores. The absence of proprioception, pain, and temperature within this scale creates questions about its validity and accuracy. To assess whether the sensory component of the ISNCSCI represents a reliable and valid measure of classification of SCI. A systematic review of studies examining the reliability and validity of the sensory component of the ISNCSCI published between 1982 and February 2013 was conducted. The electronic databases MEDLINE via Ovid, CINAHL, PEDro, and Scopus were searched for relevant articles. A secondary search of reference lists was also completed. Chosen articles were assessed according to the Oxford Centre for Evidence-Based Medicine hierarchy of evidence and critically appraised using the McMasters Critical Review Form. A statistical analysis was conducted to investigate the variability of the results given by reliability studies. Twelve studies were identified: 9 reviewed reliability and 3 reviewed validity. All studies demonstrated low levels of evidence and moderate critical appraisal scores. The majority of the articles (~67%; 6/9) assessing the reliability suggested that training was positively associated with better posttest results. The results of the 3 studies that assessed the validity of the ISNCSCI scale were confounding. Due to the low to moderate quality of the current literature, the sensory component of the ISNCSCI requires further revision and investigation if it is to be a useful tool in clinical trials.

  4. Reliability and Validity of the Sensory Component of the International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI): A Systematic Review

    PubMed Central

    Hales, M.; Biros, E.

    2015-01-01

    Background: Since 1982, the International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI) has been used to classify sensation of spinal cord injury (SCI) through pinprick and light touch scores. The absence of proprioception, pain, and temperature within this scale creates questions about its validity and accuracy. Objectives: To assess whether the sensory component of the ISNCSCI represents a reliable and valid measure of classification of SCI. Methods: A systematic review of studies examining the reliability and validity of the sensory component of the ISNCSCI published between 1982 and February 2013 was conducted. The electronic databases MEDLINE via Ovid, CINAHL, PEDro, and Scopus were searched for relevant articles. A secondary search of reference lists was also completed. Chosen articles were assessed according to the Oxford Centre for Evidence-Based Medicine hierarchy of evidence and critically appraised using the McMasters Critical Review Form. A statistical analysis was conducted to investigate the variability of the results given by reliability studies. Results: Twelve studies were identified: 9 reviewed reliability and 3 reviewed validity. All studies demonstrated low levels of evidence and moderate critical appraisal scores. The majority of the articles (~67%; 6/9) assessing the reliability suggested that training was positively associated with better posttest results. The results of the 3 studies that assessed the validity of the ISNCSCI scale were confounding. Conclusions: Due to the low to moderate quality of the current literature, the sensory component of the ISNCSCI requires further revision and investigation if it is to be a useful tool in clinical trials. PMID:26363591

  5. Reliability and validity of a Swedish language version of the Resilience Scale.

    PubMed

    Nygren, Björn; Randström, Kerstin Björkman; Lejonklou, Anna K; Lundman, Beril

    2004-01-01

    The purpose of this study was to test the reliability and validity of the Swedish language version of the Resilience Scale (RS). Participants were 142 adults between 19 and 85 years of age. Internal consistency reliability, stability over time, and construct validity were evaluated using Cronbach's alpha, principal components analysis with varimax rotation, and correlations with scores on the Sense of Coherence Scale (SOC) and the Rosenberg Self-Esteem Scale (RSE). The mean score on the RS was 142 (SD = 15). The possible scores on the RS range from 25 to 175, and scores higher than 146 are considered high. The test-retest correlation was .78. Correlations with the SOC and the RSE were .41 (p < 0.01) and .37 (p < 0.01), respectively. Personal Assurance and Acceptance of Self and Life emerged as components from the principal components analysis. These findings provide evidence for the reliability and validity of the Swedish language version of the RS.

  6. Improving the Reliability of Technological Subsystems Equipment for Steam Turbine Unit in Operation

    NASA Astrophysics Data System (ADS)

    Brodov, Yu. M.; Murmansky, B. E.; Aronson, R. T.

    2017-11-01

    The authors present an integrated approach to improving the reliability of steam turbine units (STU), along with examples of its implementation for various STU technological subsystems. Based on statistical analysis of damage to individual turbine parts and components, on the development and application of modern repair methods and technologies, and on operational monitoring techniques, the critical components and elements of the equipment are identified and priorities are proposed for improving the reliability of STU equipment in operation. Results are presented from an analysis of malfunctions in the equipment of various STU technological subsystems, operating both as part of power units and at cross-linked thermal power plants, that resulted in turbine unit shutdown (failure). Proposals are formulated and justified for adjusting the maintenance and repair of turbine components and parts, condenser unit equipment, the regeneration subsystem, and the oil supply system; these measures increase operational reliability, reduce the cost of STU maintenance and repair, and optimize the timing and scope of repairs.

  7. Enhanced Component Performance Study: Air-Operated Valves 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-11-01

    This report presents a performance evaluation of air-operated valves (AOVs) at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience failure reports from fiscal year 1998 through 2014 for the component reliability as reported in the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The AOV failure modes considered are failure-to-open/close, failure to operate or control, and spurious operation. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates for reliability are provided for the entire active period. One statistically significant trend was observed in the AOV data: the frequency of demands per reactor year for valves recording the fail-to-open or fail-to-close failure modes, for high-demand valves (those with greater than twenty demands per year), was found to be decreasing. The decrease was about three percent over the 10-year period trended.

  8. PV inverter performance and reliability: What is the role of the bus capacitor?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flicker, Jack; Kaplar, Robert; Marinella, Matthew

    In order to elucidate how the degradation of individual components affects the state of the photovoltaic inverter as a whole, we have carried out SPICE simulations to investigate the voltage and current ripple on the DC bus. The bus capacitor is generally considered to be among the least reliable components of the system, so we have simulated how the degradation of bus capacitors affects the AC ripple at the terminals of the PV module. Degradation-induced ripple leads to an increased degradation rate in a positive feedback cycle. Additionally, laboratory experiments are being carried out to ascertain the reliability of metallized thin-film capacitors. By understanding the degradation mechanisms and their effects on the inverter as a system, steps can be made to more effectively replace marginal components with more reliable ones, increasing the lifetime and efficiency of the inverter and decreasing its cost per watt towards the US Department of Energy goals.
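    The positive feedback cycle described above can be illustrated with the standard double-line-frequency ripple relation for a single-phase inverter DC bus; this is a back-of-the-envelope sketch, not the abstract's SPICE model, and the power, voltage, and capacitance figures are invented for illustration.

```python
import math

def bus_ripple_pp(power_w, v_bus, cap_f, f_ripple=120.0):
    """Approximate peak-to-peak DC-bus voltage ripple (V) using the
    common sizing relation dV ~ P / (2*pi*f_ripple*C*V_bus), where
    f_ripple is twice the 60 Hz line frequency."""
    return power_w / (2 * math.pi * f_ripple * cap_f * v_bus)

nominal = bus_ripple_pp(5000.0, 400.0, 2e-3)   # healthy 2 mF capacitor
degraded = bus_ripple_pp(5000.0, 400.0, 1e-3)  # half the capacitance lost
# Since ripple scales as 1/C, halving the capacitance doubles the ripple,
# which in turn accelerates degradation -- the feedback cycle in the text.
```

    The model ignores equivalent series resistance and switching-frequency components, both of which a full SPICE simulation would capture.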

  9. Reliability and Confidence Interval Analysis of a CMC Turbine Stator Vane

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Gyekenyesi, John P.; Mital, Subodh K.

    2008-01-01

    High temperature ceramic matrix composites (CMC) are being explored as viable candidate materials for hot section gas turbine components. These advanced composites can potentially lead to reduced weight, enable higher operating temperatures requiring less cooling and thus leading to increased engine efficiencies. However, these materials are brittle and show degradation with time at high operating temperatures due to creep as well as cyclic mechanical and thermal loads. In addition, these materials are heterogeneous in their make-up and various factors affect their properties in a specific design environment. Most of these advanced composites involve two- and three-dimensional fiber architectures and require a complex multi-step high-temperature processing. Since there are uncertainties associated with each of these in addition to the variability in the constituent material properties, the observed behavior of composite materials exhibits scatter. Traditional material failure analyses employing a deterministic approach, where failure is assumed to occur when some allowable stress level or equivalent stress is exceeded, are not adequate for brittle material component design. Such phenomenological failure theories are reasonably successful when applied to ductile materials such as metals. Analysis of failure in structural components is governed by the observed scatter in strength, stiffness and loading conditions. In such situations, statistical design approaches must be used. Accounting for these phenomena requires a change in philosophy on the design engineer's part that leads to a reduced focus on the use of safety factors in favor of reliability analyses. The reliability approach demands that the design engineer must tolerate a finite risk of unacceptable performance. This risk of unacceptable performance is identified as a component's probability of failure (or alternatively, component reliability).
The primary concern of the engineer is minimizing this risk in an economical manner. The methods to accurately determine the service life of an engine component with associated variability have become increasingly difficult. This results, in part, from the complex missions which are now routinely considered during the design process. These missions include large variations of multi-axial stresses and temperatures experienced by critical engine parts. There is a need for a convenient design tool that can accommodate various loading conditions induced by engine operating environments, and material data with their associated uncertainties to estimate the minimum predicted life of a structural component. A probabilistic composite micromechanics technique in combination with woven composite micromechanics, structural analysis and Fast Probability Integration (FPI) techniques has been used to evaluate the maximum stress and its probabilistic distribution in a CMC turbine stator vane. Furthermore, input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Since the measured data for the ceramic matrix composite properties is very limited, obtaining a probabilistic distribution with their corresponding parameters is difficult. In case of limited data, confidence bounds are essential to quantify the uncertainty associated with the distribution. Usually 90 and 95% confidence intervals are computed for material properties. Failure properties are then computed with the confidence bounds. Best estimates and the confidence bounds on the best estimate of the cumulative probability function for R-S (strength - stress) are plotted. The methodologies and the results from these analyses will be discussed in the presentation.

  10. Reliability of Meat, Fish, Dairy, and Egg Intake Over a 33-Year Interval in Adventist Health Study 2

    PubMed Central

    Singh, Pramil N.; Batech, Michael; Faed, Pegah; Jaceldo-Siegl, Karen; Martins, Marcia; Fraser, Gary E.

    2015-01-01

    We studied Adventist Health Study 2 (AHS-2) cohort members to determine the reliability of long-term recall of adult dietary intake that occurred 33 years ago. Establishing the reliability of these measures supports studies of how dietary exposure across the life course affects risk of cancer and other noncommunicable disease outcomes. Among 1816 AHS-2 cohort members, we conducted a statistical comparison of long-term recall of meat, fish, dairy, and eggs at AHS-2 baseline with their report of current diet 33 years before AHS-2 baseline at an age of 30–60 years. Major findings are as follows: 1) a high correlation for frequency of red meat (R = 0.71), poultry (R = 0.67), and fish (R = 0.60); lower correlations for dairy (R = 0.19) and eggs (R = 0.28); 2) good concordance for dichotomous measures of red meat [sensitivity: 0.70; specificity: 0.92; positive predictive value (PPV): 0.91], poultry (sensitivity: 0.76; specificity: 0.87; PPV: 0.83), fish (sensitivity: 0.61; specificity: 0.93; PPV: 0.89), dairy (sensitivity: 0.95; specificity: 0.57; PPV: 0.99), and eggs (sensitivity: 0.95; specificity: 0.41; PPV: 0.96); negative predictive value for dairy and eggs was poor. Among older AHS-2 cohort members, we found good reliability of recall of red meat, poultry, and fish intake that occurred 33 years earlier. PMID:25298211
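    The concordance measures reported above come from a standard 2x2 comparison of recalled versus originally reported intake. As a hypothetical illustration (the counts below are made up, chosen only so the results echo the red-meat figures), the measures can be computed as:

```python
def concordance(tp, fp, fn, tn):
    """Concordance measures from a 2x2 table of recalled vs. original
    dichotomous intake: tp/fp/fn/tn are true/false positives/negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # recalled eaters among true eaters
        "specificity": tn / (tn + fp),  # recalled non-eaters among true non-eaters
        "ppv": tp / (tp + fp),          # true eaters among recalled eaters
        "npv": tn / (tn + fn),          # true non-eaters among recalled non-eaters
    }

m = concordance(tp=700, fp=70, fn=300, tn=800)
# m["sensitivity"] == 0.70 and m["ppv"] ~ 0.91 with these invented counts.
```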

  11. A Comparison of Various Stress Rupture Life Models for Orbiter Composite Pressure Vessels and Confidence Intervals

    NASA Technical Reports Server (NTRS)

    Grimes-Ledesma, Lorie; Murthy, Pappu L. N.; Phoenix, S. Leigh; Glaser, Ronald

    2007-01-01

    In conjunction with a recent NASA Engineering and Safety Center (NESC) investigation of flight worthiness of Kevlar Overwrapped Composite Pressure Vessels (COPVs) on board the Orbiter, two stress rupture life prediction models were proposed independently by Phoenix and by Glaser. In this paper, the use of these models to determine the system reliability of 24 COPVs currently in service on board the Orbiter is discussed. The models are briefly described, compared to each other, and model parameters and parameter uncertainties are also reviewed to understand confidence in reliability estimation as well as the sensitivities of these parameters in influencing overall predicted reliability levels. Differences and similarities in the various models will be compared via stress rupture reliability curves (stress ratio vs. lifetime plots). Also outlined will be the differences in the underlying model premises, and predictive outcomes. Sources of error and sensitivities in the models will be examined and discussed based on sensitivity analysis and confidence interval determination. Confidence interval results and their implications will be discussed for the models by Phoenix and Glaser.

  12. Effective Dynamic Range and Retest Reliability of Dark-Adapted Two-Color Fundus-Controlled Perimetry in Patients With Macular Diseases.

    PubMed

    Pfau, Maximilian; Lindner, Moritz; Müller, Philipp L; Birtel, Johannes; Finger, Robert P; Harmening, Wolf M; Fleckenstein, Monika; Holz, Frank G; Schmitz-Valckenberg, Steffen

    2017-05-01

    To determine the effective dynamic range (EDR), retest reliability, and number of discriminable steps (DS) for mesopic and dark-adapted two-color fundus-controlled perimetry (FCP) using the S-MAIA (Scotopic-Macular Integrity Assessment) "micro-perimeter." In this prospective cross-sectional study, each of the 52 eyes of 52 subjects with various macular diseases (mean age 62.0 ± 16.9 years; range, 19.1-90.1 years) underwent duplicate mesopic (achromatic stimuli, 400-800 nm), dark-adapted cyan (505 nm), and dark-adapted red (627 nm) FCP using a grid of 61 stimuli covering 18° of the central retina. The EDR, the number of DS, and the retest reliability for point-wise sensitivity (PWS) were analyzed. The effects of fixation stability, sensitivity, and age on retest reliability were examined using mixed-effects models. The EDR was 10 to 30 dB with five DS for mesopic and 4 to 17 dB with four DS for dark-adapted cyan and red testing. PWS retest reliability was good among all three types of retinal sensitivity assessments (coefficient of repeatability ±5.79, ±4.72, and ±4.77 dB, respectively) and did not depend on fixation stability or age. PWS had no effect on retest variability in dark-adapted cyan and dark-adapted red testing but had a minor effect in mesopic testing. Combined mesopic and dark-adapted two-color FCP allows for reliable topographic testing of cone and rod function in patients with various macular diseases with and without foveal fixation. Retest reliability is homogeneous across eccentricities and various degrees of scotoma depth, including zones at risk for disease progression. These reliability estimates can serve for the design of future clinical trials.
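    A common way to obtain the coefficient of repeatability cited above is the Bland-Altman form, 1.96 times the standard deviation of the test-retest differences; the sketch below assumes that definition, and the sensitivity values are invented, not the study's measurements.

```python
import statistics

def coefficient_of_repeatability(test, retest):
    """Bland-Altman coefficient of repeatability:
    1.96 * SD of paired test-retest differences."""
    diffs = [a - b for a, b in zip(test, retest)]
    return 1.96 * statistics.stdev(diffs)

# Invented point-wise sensitivities (dB) for one eye, visits 1 and 2.
visit1 = [20, 24, 18, 26, 22, 25, 19, 23]
visit2 = [22, 23, 17, 28, 21, 24, 21, 22]
cor = coefficient_of_repeatability(visit1, visit2)  # in dB, same unit as inputs
```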

  13. Time-dependent reliability analysis of ceramic engine components

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
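    As a minimal sketch of the two-parameter Weibull strength model mentioned above, the failure probability of a uniformly stressed uniaxial element follows the closed form below; the real CARES/LIFE code integrates multiaxial stress states over a finite element model, and the parameter values here are illustrative only.

```python
import math

def failure_probability(stress, m, sigma0):
    """Two-parameter Weibull: P_f = 1 - exp(-(stress/sigma0)**m),
    with Weibull modulus m and characteristic strength sigma0."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# Illustrative values: 300 MPa applied, modulus 10, 400 MPa characteristic strength.
pf = failure_probability(stress=300.0, m=10.0, sigma0=400.0)
reliability = 1.0 - pf
```

    A higher Weibull modulus means less strength scatter, so the failure probability drops off more sharply below the characteristic strength.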

  14. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two-parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.

  15. NASA-DoD Lead-Free Electronics Project

    NASA Technical Reports Server (NTRS)

    Kessel, Kurt R.

    2009-01-01

    In response to concerns about risks from lead-free induced faults to high reliability products, NASA has initiated a multi-year project to provide manufacturers and users with data to clarify the risks of lead-free materials in their products. The project will also be of interest to component manufacturers supplying to high reliability markets. The project was launched in November 2006. The primary technical objective of the project is to undertake comprehensive testing to generate information on failure modes/criteria to better understand the reliability of: - Packages (e.g., TSOP, BGA, PDIP) assembled and reworked with solder interconnects consisting of lead-free alloys - Packages (e.g., TSOP, BGA, PDIP) assembled and reworked with solder interconnects consisting of mixed alloys, lead component finish/lead-free solder and lead-free component finish/SnPb solder.

  17. The WOMB (Women's views of birth) antenatal satisfaction questionnaire: development, dimensions, internal reliability, and validity.

    PubMed Central

    Smith, L F

    1999-01-01

    BACKGROUND: Antenatal services continue to change, stimulated by the Changing Childbirth report. Women's views should be an important component of assessing the quality of such services. To date, no published quantitative multidimensional assessment instrument has been available to measure their satisfaction with care. AIM: To develop a valid, reliable, multidimensional questionnaire to assess quality of antenatal care. METHOD: A multidimensional satisfaction questionnaire was developed using psychometric methods. Following fieldwork to pilot a questionnaire, three successive versions of it were given by midwives to pregnant women in their final trimester in nine trusts in the old South Western region of England. Their replies were analysed by principal components analysis (PCA) with varimax rotation; internal reliability was assessed by Cronbach's alpha. Face, content, and construct validity were all assessed during development. RESULTS: Out of 196 women, 134 (68.4%) returned the pilot questionnaires. One hundred and seventy-two (57.3%) out of 300 women returned version 1 of the WOMB (WOMen's views of Birth) antenatal satisfaction questionnaire proper, 283 (56.6%) out of 500 returned version 2, and 328 (65.6%) out of 500 returned the final development version. This final version consisted of 11 dimensions in addition to a general satisfaction one. These were [Cronbach's alpha]: five related to antenatal clinic characteristics (travelling to clinic [0.75], waiting at clinic [0.90], clinic environment [0.69], timing of appointment [0.78], car parking [0.85]), three 'professional' characteristics (professional competence [0.80], knowing carers [0.79], information provided [0.81]), antenatal classes [0.76], social support from other pregnant women [0.83], checking for the baby's heart beat [0.63]. There were significant moderate correlations (range = 0.24 to 0.77) between individual dimensions and the general satisfaction dimension. 
Women's dimension scores were significantly related to age, parity, social class, and best educational achievement. CONCLUSION: This multidimensional satisfaction instrument has good face, content, and construct validity, and excellent internal reliability. It could be used to assess antenatal services generally or to screen them to detect areas where further in-depth qualitative enquiry is merited. Its sensitivity to change over time, external reliability, and transferability to non-Caucasian groups need to be assessed. PMID:10824341

  18. Can Reliability of Multiple Component Measuring Instruments Depend on Response Option Presentation Mode?

    ERIC Educational Resources Information Center

    Menold, Natalja; Raykov, Tenko

    2016-01-01

    This article examines the possible dependency of composite reliability on presentation format of the elements of a multi-item measuring instrument. Using empirical data and a recent method for interval estimation of group differences in reliability, we demonstrate that the reliability of an instrument need not be the same when polarity of the…

  19. A New Tool for Nutrition App Quality Evaluation (AQEL): Development, Validation, and Reliability Testing

    PubMed Central

    Huang, Wenhao; Chapman-Novakofski, Karen M

    2017-01-01

    Background The extensive availability and increasing use of mobile apps for nutrition-based health interventions makes evaluation of the quality of these apps crucial for integration of apps into nutritional counseling. Objective The goal of this research was the development, validation, and reliability testing of the app quality evaluation (AQEL) tool, an instrument for evaluating apps’ educational quality and technical functionality. Methods Items for evaluating app quality were adapted from website evaluations, with additional items added to evaluate the specific characteristics of apps, resulting in 79 initial items. Expert panels of nutrition and technology professionals and app users reviewed items for face and content validation. After recommended revisions, nutrition experts completed a second AQEL review to ensure clarity. On the basis of 150 sets of responses using the revised AQEL, principal component analysis was completed, reducing AQEL into 5 factors that underwent reliability testing, including internal consistency, split-half reliability, test-retest reliability, and interrater reliability (IRR). Two additional modifiable constructs for evaluating apps based on the age and needs of the target audience as selected by the evaluator were also tested for construct reliability. IRR testing using intraclass correlations (ICC) with all 7 constructs was conducted, with 15 dietitians evaluating one app. Results Development and validation resulted in the 51-item AQEL. These were reduced to 25 items in 5 factors after principal component analysis, plus 9 modifiable items in two constructs that were not included in principal component analysis. Internal consistency and split-half reliability of the following constructs derived from principal components analysis was good (Cronbach alpha >.80, Spearman-Brown coefficient >.80): behavior change potential, support of knowledge acquisition, app function, and skill development. 
App purpose split-half reliability was .65. Test-retest reliability showed no significant change over time (P>.05) for all but skill development (P=.001). Construct reliability was good for items assessing age appropriateness of apps for children, teens, and a general audience. In addition, construct reliability was acceptable for assessing app appropriateness for various target audiences (Cronbach alpha >.70). For the 5 main factors, ICC (1,k) was >.80, with a P value of <.05. When 15 nutrition professionals evaluated one app, ICC (2,15) was .98, with a P value of <.001 for all 7 constructs when the modifiable items were specified for adults seeking weight loss support. Conclusions Our preliminary effort shows that AQEL is a valid, reliable instrument for evaluating nutrition apps' qualities for clinical interventions by nutrition clinicians, educators, and researchers. Further efforts in validating AQEL in various contexts are needed. PMID:29079554
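    The internal-consistency statistic cited throughout this record can be sketched directly from its definition; this is a generic Cronbach's alpha implementation under the usual formula, and the item scores below are invented for illustration, not AQEL data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of per-item score lists,
    one inner list per item, aligned across the same respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# Invented example: 3 items rated by 5 respondents on a 1-5 scale.
scores = [[4, 5, 3, 4, 5], [4, 4, 3, 5, 5], [5, 5, 3, 4, 4]]
alpha = cronbach_alpha(scores)
```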

  20. Development of an Ultrasonication-Assisted Extraction Based HPLC With a Fluorescence Method for Sensitive Determination of Aflatoxins in Highly Acidic Hibiscus sabdariffa

    PubMed Central

    Liu, Xiaofei; Ying, Guangyao; Sun, Chaonan; Yang, Meihua; Zhang, Lei; Zhang, Shanshan; Xing, Xiaoyan; Li, Qian; Kong, Weijun

    2018-01-01

    The high acidity and complex components of Hibiscus sabdariffa have provided major challenges for sensitive determination of trace aflatoxins. In this study, sample pretreatment of H. sabdariffa was systematically developed for sensitive high performance liquid chromatography-fluorescence detection (HPLC-FLD) after ultrasonication-assisted extraction, immunoaffinity column (IAC) clean-up and on-line post-column photochemical derivatization (PCD). Aflatoxins B1, B2, G1, G2 were extracted from samples by using methanol/water (70:30, v/v) with the addition of NaCl. The solutions were diluted 1:8 with 0.1 M phosphate buffer (pH 8.0) to negate the issues of high acidity and matrix interferences. The established method was validated with satisfactory linearity (R > 0.999), sensitivity (limits of detection (LODs) and limits of quantitation (LOQs) of 0.15–0.65 and 0.53–2.18 μg/kg, respectively), precision (RSD <11%), stability (RSD of 0.2–3.6%), and accuracy (recovery rates of 86.0–102.3%), which all met the stipulated analytical requirements. Analysis of 28 H. sabdariffa samples indicated that one sample incubated with Aspergillus flavus was positive with aflatoxin B1 (AFB1) at 3.11 μg/kg. The strategy developed in this study also has the potential to reliably extract and sensitively detect more mycotoxins in other complex acidic matrices, such as traditional Chinese medicines, foodstuffs, etc. PMID:29681848

  1. Development of an Ultrasonication-Assisted Extraction Based HPLC With a Fluorescence Method for Sensitive Determination of Aflatoxins in Highly Acidic Hibiscus sabdariffa.

    PubMed

    Liu, Xiaofei; Ying, Guangyao; Sun, Chaonan; Yang, Meihua; Zhang, Lei; Zhang, Shanshan; Xing, Xiaoyan; Li, Qian; Kong, Weijun

    2018-01-01

    The high acidity and complex components of Hibiscus sabdariffa have provided major challenges for sensitive determination of trace aflatoxins. In this study, sample pretreatment of H. sabdariffa was systematically developed for sensitive high performance liquid chromatography-fluorescence detection (HPLC-FLD) after ultrasonication-assisted extraction, immunoaffinity column (IAC) clean-up and on-line post-column photochemical derivatization (PCD). Aflatoxins B1, B2, G1, G2 were extracted from samples by using methanol/water (70:30, v/v) with the addition of NaCl. The solutions were diluted 1:8 with 0.1 M phosphate buffer (pH 8.0) to negate the issues of high acidity and matrix interferences. The established method was validated with satisfactory linearity (R > 0.999), sensitivity (limits of detection (LODs) and limits of quantitation (LOQs) of 0.15-0.65 and 0.53-2.18 μg/kg, respectively), precision (RSD <11%), stability (RSD of 0.2-3.6%), and accuracy (recovery rates of 86.0-102.3%), which all met the stipulated analytical requirements. Analysis of 28 H. sabdariffa samples indicated that one sample incubated with Aspergillus flavus was positive with aflatoxin B1 (AFB1) at 3.11 μg/kg. The strategy developed in this study also has the potential to reliably extract and sensitively detect more mycotoxins in other complex acidic matrices, such as traditional Chinese medicines, foodstuffs, etc.

  2. Translation, cross-cultural adaptation and psychometric evaluation of yoruba version of the short-form 36 health survey.

    PubMed

    Mbada, Chidozie Emmanuel; Adeogun, Gafar Atanda; Ogunlana, Michael Opeoluwa; Adedoyin, Rufus Adesoji; Akinsulore, Adesanmi; Awotidebe, Taofeek Oluwole; Idowu, Opeyemi Ayodiipo; Olaoye, Olumide Ayoola

    2015-09-14

    The Short-Form Health Survey (SF-36) is a valid quality of life tool often employed to determine the impact of medical intervention and the outcome of health care services. However, the SF-36 is culturally sensitive, which necessitates its adaptation and translation into different languages. This study was conducted to cross-culturally adapt the SF-36 into the Yoruba language and determine its reliability and validity. Based on the International Quality of Life Assessment project guidelines, a sequence of translation, test of item-scale correlation, and validation was implemented for the translation of the Yoruba version of the SF-36. Following pilot testing, the English and the Yoruba versions of the SF-36 were administered to a random sample of 1087 apparently healthy individuals to test validity, and 249 respondents completed the Yoruba SF-36 again after two weeks to test reliability. Data were analyzed using Pearson's product moment correlation analysis, independent t-test, one-way analysis of variance, multitrait scaling analysis and Intra-Class Correlation (ICC) at p < 0.05. The concurrent validity scores for scales and domains range between 0.749 and 0.902, with the highest and lowest scores in the General Health (0.902) and Bodily Pain (0.749) scales. Scale-level descriptive results showed that all scale and domain scores had negative skewness ranging from -2.08 to -0.98. The mean scores for the scales range between 83.2 and 88.8. The domain scores for the Physical Health Component and Mental Health Component were 85.6 ± 13.7 and 85.9 ± 15.4, respectively. The convergent validity was satisfactory, ranging from 0.421 to 0.907. Discriminant validity was also satisfactory except for item '1'. The ICC for the test-retest reliability of the Yoruba SF-36 ranges between 0.636 and 0.843 for scales, and between 0.783 and 0.851 for domains.
The data quality, concurrent and discriminant validity, reliability and internal consistency of the Yoruba version of the SF-36 are adequate and it is recommended for measuring health-related quality of life among Yoruba population.

  3. Transient Reliability Analysis Capability Developed for CARES/Life

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2001-01-01

    The CARES/Life software developed at the NASA Glenn Research Center provides a general-purpose design tool that predicts the probability of the failure of a ceramic component as a function of its time in service. This award-winning software has been widely used by U.S. industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, and graphite) structures in a wide variety of 21st century applications. Present capabilities of the NASA CARES/Life code include probabilistic life prediction of ceramic components subjected to fast fracture, slow crack growth (stress corrosion), and cyclic fatigue failure modes. Currently, this code can compute the time-dependent reliability of ceramic structures subjected to simple time-dependent loading. For example, in slow crack growth failure conditions CARES/Life can handle sustained and linearly increasing time-dependent loads, whereas in cyclic fatigue applications various types of repetitive constant-amplitude loads can be accounted for. However, in real applications applied loads are rarely that simple but vary with time in more complex ways, such as engine startup, shutdown, and dynamic and vibrational loads. In addition, when a given component is subjected to transient environmental and/or thermal conditions, the material properties also vary with time. A methodology has now been developed to allow the CARES/Life computer code to perform reliability analysis of ceramic components undergoing transient thermal and mechanical loading. This means that CARES/Life will be able to analyze finite element models of ceramic components that simulate dynamic engine operating conditions. The methodology developed is generalized to account for material property variation (on strength distribution and fatigue) as a function of temperature. This allows CARES/Life to analyze components undergoing rapid temperature change; in other words, components undergoing thermal shock.
In addition, the capability has been developed to perform reliability analysis for components that undergo proof testing involving transient loads. This methodology was developed for environmentally assisted crack growth (crack growth as a function of time and loading), but it will be extended to account for cyclic fatigue (crack growth as a function of load cycles) as well.
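Under the power-law slow-crack-growth model referenced above, a transient stress history is commonly reduced to an equivalent static stress before the reliability evaluation. A minimal numerical sketch of that reduction (not the CARES/Life implementation; the load history and fatigue exponent N are made up for illustration):

```python
# Sketch: for power-law slow crack growth with fatigue exponent N, a
# transient stress history sigma(t) over [0, T] can be reduced to an
# equivalent static stress
#   sigma_eq = [ (1/T) * integral_0^T sigma(tau)^N dtau ]^(1/N)
# Discretized here with the trapezoidal rule.

def equivalent_static_stress(times, stresses, fatigue_exponent):
    """Equivalent static stress over a sampled load history (trapezoidal rule)."""
    total = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        total += 0.5 * (stresses[i - 1] ** fatigue_exponent
                        + stresses[i] ** fatigue_exponent) * dt
    duration = times[-1] - times[0]
    return (total / duration) ** (1.0 / fatigue_exponent)

# Hypothetical ramp load: 0 to 200 MPa over 10 s, N = 12.
times = [0.0, 2.5, 5.0, 7.5, 10.0]
stresses = [0.0, 50.0, 100.0, 150.0, 200.0]
sigma_eq = equivalent_static_stress(times, stresses, 12)
```

Because the integrand is weighted by the N-th power of stress, the equivalent static stress sits close to, but below, the peak of the ramp.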

  4. Design component method for sensitivity analysis of built-up structures

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Seong, Hwai G.

    1986-01-01

    A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.

  5. Reliability model generator

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
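The aggregation step described above can be sketched as a recursive walk over a series/parallel architecture description. This is a minimal illustration of the idea only, not the patented generator; the architecture and reliability values are hypothetical:

```python
# Minimal sketch: combine low-level component reliabilities according to a
# nested architecture description with "series" and "parallel" groupings.

def system_reliability(node, component_reliability):
    """Recursively evaluate a nested architecture description.

    node is either a component name (str) or a tuple
    ("series" | "parallel", [children]).
    """
    if isinstance(node, str):
        return component_reliability[node]
    kind, children = node
    child_r = [system_reliability(c, component_reliability) for c in children]
    if kind == "series":                  # all children must survive
        r = 1.0
        for x in child_r:
            r *= x
        return r
    if kind == "parallel":                # at least one child survives
        q = 1.0
        for x in child_r:
            q *= (1.0 - x)
        return 1.0 - q
    raise ValueError(kind)

# Hypothetical system: two redundant sensors in series with one processor.
arch = ("series", [("parallel", ["sensorA", "sensorB"]), "cpu"])
rels = {"sensorA": 0.9, "sensorB": 0.9, "cpu": 0.99}
r_sys = system_reliability(arch, rels)    # 0.99 * (1 - 0.1 * 0.1) = 0.9801
```

A real generator would emit a model for a separate evaluation tool rather than a single number, but the aggregation-by-architecture principle is the same.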

  6. Using the virtual reality device Oculus Rift for neuropsychological assessment of visual processing capabilities

    PubMed Central

    Foerster, Rebecca M.; Poth, Christian H.; Behler, Christian; Botsch, Mario; Schneider, Werner X.

    2016-01-01

    Neuropsychological assessment of human visual processing capabilities strongly depends on visual testing conditions including room lighting, stimuli, and viewing-distance. This limits standardization, threatens reliability, and prevents the assessment of core visual functions such as visual processing speed. Increasingly available virtual reality devices make it possible to address these problems. One such device is the portable, light-weight, and easy-to-use Oculus Rift. It is head-mounted and covers the entire visual field, thereby shielding and standardizing the visual stimulation. A fundamental prerequisite for using the Oculus Rift for neuropsychological assessment is sufficient test-retest reliability. Here, we compare the test-retest reliabilities of Bundesen’s visual processing components (visual processing speed, threshold of conscious perception, capacity of visual working memory) as measured with Oculus Rift and a standard CRT computer screen. Our results show that the Oculus Rift measures the processing components as reliably as the standard CRT. This means that Oculus Rift is applicable for standardized and reliable assessment and diagnosis of elementary cognitive functions in laboratory and clinical settings. Oculus Rift thus provides the opportunity to compare visual processing components between individuals and institutions and to establish statistical norm distributions. PMID:27869220
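Test-retest reliability of a continuous measure is often summarized by correlating session-1 and session-2 scores across participants. A small Pearson-r sketch with illustrative numbers (the study's own statistics and data are not reproduced here):

```python
# Sketch: Pearson correlation between two measurement sessions as a simple
# test-retest reliability summary. Scores below are hypothetical.

def pearson_r(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

session1 = [21.0, 25.0, 30.0, 28.0, 24.0]   # e.g. processing-speed scores
session2 = [22.0, 24.0, 31.0, 27.0, 25.0]
r = pearson_r(session1, session2)
```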

  7. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    2003-01-01

    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
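The two-parameter Weibull distribution mentioned above gives the fast-fracture failure probability directly. A minimal sketch (the stress, modulus, and characteristic strength values are illustrative, not from the manual):

```python
import math

# Two-parameter Weibull CDF used to characterize strength scatter:
#   P_f = 1 - exp(-(sigma / sigma_0)^m)
# where m is the Weibull modulus and sigma_0 the characteristic strength.

def weibull_failure_probability(stress, modulus, char_strength):
    return 1.0 - math.exp(-((stress / char_strength) ** modulus))

# Hypothetical component: 300 MPa applied, m = 10, sigma_0 = 400 MPa.
pf = weibull_failure_probability(stress=300.0, modulus=10.0, char_strength=400.0)
```

By construction, the failure probability at the characteristic strength is 1 - 1/e, regardless of the modulus.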

  8. Using the virtual reality device Oculus Rift for neuropsychological assessment of visual processing capabilities.

    PubMed

    Foerster, Rebecca M; Poth, Christian H; Behler, Christian; Botsch, Mario; Schneider, Werner X

    2016-11-21

    Neuropsychological assessment of human visual processing capabilities strongly depends on visual testing conditions including room lighting, stimuli, and viewing-distance. This limits standardization, threatens reliability, and prevents the assessment of core visual functions such as visual processing speed. Increasingly available virtual reality devices allow to address these problems. One such device is the portable, light-weight, and easy-to-use Oculus Rift. It is head-mounted and covers the entire visual field, thereby shielding and standardizing the visual stimulation. A fundamental prerequisite to use Oculus Rift for neuropsychological assessment is sufficient test-retest reliability. Here, we compare the test-retest reliabilities of Bundesen's visual processing components (visual processing speed, threshold of conscious perception, capacity of visual working memory) as measured with Oculus Rift and a standard CRT computer screen. Our results show that Oculus Rift allows to measure the processing components as reliably as the standard CRT. This means that Oculus Rift is applicable for standardized and reliable assessment and diagnosis of elementary cognitive functions in laboratory and clinical settings. Oculus Rift thus provides the opportunity to compare visual processing components between individuals and institutions and to establish statistical norm distributions.

  9. Sensitivity, Specificity, and Predictive Values of Pediatric Metabolic Syndrome Components in Relation to Adult Metabolic Syndrome: The Princeton LRC Follow-up Study

    PubMed Central

    Huang, Terry T-K; Nansel, Tonja R.; Belsheim, Allen R.; Morrison, John A.

    2008-01-01

    Objective: To estimate the sensitivity, specificity, and predictive values of pediatric metabolic syndrome (MetS) components (obesity, fasting glucose, triglycerides, high-density lipoprotein, and blood pressure) at various cutoffs in relation to adult MetS. Study design: Data from the NHLBI Lipid Research Clinics (LRC) Princeton Prevalence Study (1973–76) and the Princeton Follow-up Study (PFS, 2000-4) were used to calculate sensitivity, specificity, and positive and negative predictive values for each component at a given cutoff, as well as for aggregates of components. Results: Individual pediatric components alone showed low to moderate sensitivity, high specificity, and moderate predictive values in relation to adult MetS. When all five pediatric MetS components were considered, the presence of at least one abnormality had higher sensitivity for adult MetS than individual components alone. When multiple abnormalities were mandatory for MetS, positive predictive value was high and sensitivity was low. Childhood body mass alone showed neither high sensitivity nor high positive predictive value for adult MetS. Conclusions: Considering multiple metabolic variables in childhood can improve the predictive utility for adult MetS, compared to each component or body mass alone. MetS variables may be useful for identifying some at-risk children for prevention interventions. PMID:18206687
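The four screening metrics in this abstract all derive from one 2×2 table of childhood test result versus adult MetS status. A short sketch with made-up counts (not the study's data):

```python
# Sketch: sensitivity, specificity, PPV, and NPV from a 2x2 confusion table.
# tp/fp/fn/tn counts below are hypothetical.

def screening_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # P(test+ | disease+)
    specificity = tn / (tn + fp)   # P(test- | disease-)
    ppv = tp / (tp + fp)           # P(disease+ | test+)
    npv = tn / (tn + fn)           # P(disease- | test-)
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = screening_metrics(tp=30, fp=20, fn=70, tn=380)
# Pattern the abstract describes: low sensitivity (0.30), high specificity (0.95).
```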

  10. Microwave annealing effect for highly reliable biosensor: dual-gate ion-sensitive field-effect transistor using amorphous InGaZnO thin-film transistor.

    PubMed

    Lee, In-Kyu; Lee, Kwan Hyi; Lee, Seok; Cho, Won-Ju

    2014-12-24

    We used a microwave annealing process to fabricate a highly reliable biosensor using amorphous-InGaZnO (a-IGZO) thin-film transistors (TFTs), which usually experience threshold voltage instability. Compared with furnace-annealed a-IGZO TFTs, the microwave-annealed devices showed superior threshold voltage stability and performance, including a high field-effect mobility of 9.51 cm²/V·s, a low threshold voltage of 0.99 V, a good subthreshold slope of 135 mV/dec, and an outstanding on/off current ratio of 1.18 × 10⁸. In conclusion, by using the microwave-annealed a-IGZO TFT as the transducer in an extended-gate ion-sensitive field-effect transistor biosensor, we developed a high-performance biosensor with excellent sensing properties in terms of pH sensitivity, reliability, and chemical stability.

  11. CARES - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.

    1994-01-01

    The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings. The program uses results from a commercial structural analysis program (MSC/NASTRAN or ANSYS) to evaluate component reliability due to inherent surface and/or volume type flaws. A multiple material capability allows the finite element model reliability to be a function of many different ceramic material statistical characterizations. The reliability analysis uses element stress, temperature, area, and volume output, which are obtained from two dimensional shell and three dimensional solid isoparametric or axisymmetric finite elements. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multi-axial stress states on material strength. The shear-sensitive Batdorf model requires a user-selected flaw geometry and a mixed-mode fracture criterion. Flaws intersecting the surface and imperfections embedded in the volume can be modeled. The total strain energy release rate theory is used as a mixed mode fracture criterion for co-planar crack extension. 
Out-of-plane crack extension criteria are approximated by a simple equation with a semi-empirical constant that can model the maximum tangential stress theory, the minimum strain energy density criterion, the maximum strain energy release rate theory, or experimental results. For comparison, Griffith's maximum tensile stress theory, the principle of independent action, and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. A more limited program, CARES/PC (COSMIC number LEW-15248) runs on a personal computer and estimates ceramic material properties from three-point bend bar data. CARES/PC does not perform fast fracture reliability estimation. CARES is written in FORTRAN 77 and has been implemented on DEC VAX series computers under VMS and on IBM 370 series computers under VM/CMS. On a VAX, CARES requires 10Mb of main memory. Five MSC/NASTRAN example problems and two ANSYS example problems are provided. There are two versions of CARES supplied on the distribution tape, CARES1 and CARES2. CARES2 contains sub-elements and CARES1 does not. CARES is available on a 9-track 1600 BPI VAX FILES-11 format magnetic tape (standard media) or in VAX BACKUP format on a TK50 tape cartridge. The program requires a FORTRAN 77 compiler and about 12Mb memory. CARES was developed in 1990. DEC, VAX and VMS are trademarks of Digital Equipment Corporation. IBM 370 is a trademark of International Business Machines. MSC/NASTRAN is a trademark of MacNeal-Schwendler Corporation. ANSYS is a trademark of Swanson Analysis Systems, Inc.

  12. Handbook of experiences in the design and installation of solar heating and cooling systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, D.S.; Oberoi, H.S.

    1980-07-01

    A large array of problems encountered is detailed, including design errors, installation mistakes, cases of inadequate durability of materials and unacceptable reliability of components, and wide variations in the performance and operation of different solar systems. Durability, reliability, and design problems are reviewed for solar collector subsystems, heat transfer fluids, thermal storage, passive solar components, piping/ducting, and reliability/operational problems. The following performance topics are covered: criteria for design and performance analysis, domestic hot water systems, passive space heating systems, active space heating systems, space cooling systems, analysis of systems performance, and performance evaluations. (MHR)

  13. Scaled CMOS Reliability and Considerations for Spacecraft Systems : Bottom-Up and Top-Down Perspectives

    NASA Technical Reports Server (NTRS)

    White, Mark

    2012-01-01

    The recently launched Mars Science Laboratory (MSL) flagship mission, named Curiosity, is the most complex rover ever built by NASA and is scheduled to touch down on the red planet in August 2012 in Gale Crater. The rover and its instruments will have to endure the harsh environments of the surface of Mars to fulfill the mission's main science objectives. Such complex systems require reliable microelectronic components coupled with adequate component and system-level design margins. Reliability aspects of these elements of the spacecraft system are presented from bottom-up and top-down perspectives.

  14. 2nd Generation RLV Risk Reduction Definition Program: Pratt & Whitney Propulsion Risk Reduction Requirements Program (TA-3 & TA-4)

    NASA Technical Reports Server (NTRS)

    Matlock, Steve

    2001-01-01

    This is the final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, reliability baseline (space shuttle main engine QRAS), and component level reliability/performance/cost for the six baseline cycles, and selection of 3 cycles for further study. This report further addresses technology improvement selection and component level reliability/performance/cost for the three cycles selected for further study, as well as risk reduction plans, and recommendation for future studies.

  15. Transient Reliability of Ceramic Structures For Heat Engine Applications

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama M.

    2002-01-01

    The objective of this report was to develop a methodology to predict the time-dependent reliability (probability of failure) of brittle material components subjected to transient thermomechanical loading, taking into account the change in material response with time. This methodology for computing the transient reliability of ceramic components subjected to fluctuating thermomechanical loading was developed assuming SCG (slow crack growth) as the delayed mode of failure. It takes into account the effect of Weibull modulus and material parameters varying with time. It was also coded into a beta version of NASA's CARES/Life code, and an example demonstrating its viability was presented.

  16. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  17. Reliability and Validity of the Dyadic Observed Communication Scale (DOCS).

    PubMed

    Hadley, Wendy; Stewart, Angela; Hunter, Heather L; Affleck, Katelyn; Donenberg, Geri; Diclemente, Ralph; Brown, Larry K

    2013-02-01

    We evaluated the reliability and validity of the Dyadic Observed Communication Scale (DOCS) coding scheme, which was developed to capture a range of communication components between parents and adolescents. Adolescents and their caregivers were recruited from mental health facilities for participation in a large, multi-site family-based HIV prevention intervention study. Seventy-one dyads were randomly selected from the larger study sample and coded using the DOCS at baseline. Preliminary validity and reliability of the DOCS was examined using various methods, such as comparing results to self-report measures and examining interrater reliability. Results suggest that the DOCS is a reliable and valid measure of observed communication among parent-adolescent dyads that captures both verbal and nonverbal communication behaviors that are typical intervention targets. The DOCS is a viable coding scheme for use by researchers and clinicians examining parent-adolescent communication. Coders can be trained to reliably capture individual and dyadic components of communication for parents and adolescents and this complex information can be obtained relatively quickly.
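Interrater reliability for categorical coding schemes such as the DOCS is commonly summarized with Cohen's kappa, which corrects raw agreement for chance. A sketch with hypothetical coder labels (the study's own reliability statistics are not reproduced here):

```python
# Sketch: Cohen's kappa for two coders assigning categorical labels.
# The label sequences below are made up for illustration.

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    expected = sum((rater1.count(l) / n) * (rater2.count(l) / n)
                   for l in labels)
    return (observed - expected) / (1.0 - expected)

r1 = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg"]
r2 = ["pos", "pos", "neg", "pos", "pos", "neg", "pos", "neg"]
kappa = cohens_kappa(r1, r2)   # 7/8 raw agreement, chance-corrected to 0.75
```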

  18. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  19. User-perceived reliability of unrepairable shared protection systems with functionally identical units

    NASA Astrophysics Data System (ADS)

    Ozaki, Hirokazu; Kara, Atsushi; Cheng, Zixue

    2012-05-01

    In this article, we investigate the reliability of M-for-N (M:N) shared protection systems. We focus on the reliability that is perceived by an end user of one of N units. We assume that any failed unit is instantly replaced by one of the M units (if available). We describe the effectiveness of such a protection system in a quantitative manner under the condition that the failed units are not repairable. Mathematical analysis gives the closed-form solution of the reliability and mean time to failure (MTTF). We also analyse several numerical examples of the reliability and MTTF. This result can be applied, for example, to the analysis and design of an integrated circuit consisting of redundant backup components. In such a device, repairing a failed component is unrealistic. The analysis provides useful information for the design for general shared protection systems in which the failed units are not repaired.
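A simplified system-level version of this setup admits a compact closed form. Assuming N identical active units, each failing at constant rate lam, instant replacement from a pool of M non-repairable spares, and negligible spare failure while idle, the pool is exhausted at the (M+1)-th failure of a Poisson process with rate N·lam, giving an Erlang survival function and MTTF = (M+1)/(N·lam). This is a sketch of that simplified model, not the article's user-perceived formula:

```python
import math

# Simplified M:N shared-protection sketch (pool-exhaustion view):
#   R(t)  = sum_{k=0}^{M} (N*lam*t)^k * exp(-N*lam*t) / k!
#   MTTF  = (M + 1) / (N * lam)

def shared_protection_reliability(t, n_units, m_spares, lam):
    rate = n_units * lam
    return sum((rate * t) ** k * math.exp(-rate * t) / math.factorial(k)
               for k in range(m_spares + 1))

def shared_protection_mttf(n_units, m_spares, lam):
    return (m_spares + 1) / (n_units * lam)

# Hypothetical numbers: 8 active units, 2 spares, lam = 1e-4 per hour.
r = shared_protection_reliability(t=1000.0, n_units=8, m_spares=2, lam=1e-4)
mttf = shared_protection_mttf(n_units=8, m_spares=2, lam=1e-4)   # 3750 h
```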

  20. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared: a traditional method and a Bayesian mixed-model approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed-model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed-model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
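The "traditional" calibration the abstract contrasts with the Bayesian approach amounts to fitting a straight line of assay signal on log10 standard density, then inverting the line for an unknown sample. A sketch under that assumption (the standards and signal values are illustrative, not from the study):

```python
# Sketch of traditional linear calibration with inverse prediction:
# fit signal = slope * log10(density) + intercept on standards, then invert.

def fit_line(x, y):
    """Ordinary least squares for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

log10_density = [1.0, 2.0, 3.0, 4.0, 5.0]    # standard dilutions
signal = [33.1, 29.8, 26.5, 23.2, 19.9]      # e.g. Ct-like readings

slope, intercept = fit_line(log10_density, signal)

def estimate_density(observed_signal):
    """Invert the fitted line to recover a density estimate."""
    return 10 ** ((observed_signal - intercept) / slope)

density = estimate_density(28.15)            # falls between standards 2 and 3
```

The Bayesian mixed-model alternative would additionally give each assay its own intercept/slope perturbation, which this single-line fit cannot express.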

  1. Time-Varying, Multi-Scale Adaptive System Reliability Analysis of Lifeline Infrastructure Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Kurtz, Nolan Scot

    2014-09-01

    The majority of current societal and economic needs world-wide are met by the existing networked, civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns has been developed. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.
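The importance-sampling idea behind the methodology can be shown in miniature: to estimate a small failure probability, sample from a density centered near the failure region and reweight each sample by the likelihood ratio. This toy example (standard normal load, scalar threshold) is illustrative only and is not the report's adaptive algorithm:

```python
import math
import random

# Sketch: importance sampling for a small failure probability P(X > b),
# X ~ N(0, 1), using a proposal N(shift, 1) centered near the threshold.

def normal_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

def importance_sampling_pf(b, shift, n, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)                 # biased sampling density
        if x > b:                                 # failure indicator
            total += normal_pdf(x, 0.0) / normal_pdf(x, shift)
    return total / n

pf = importance_sampling_pf(b=3.0, shift=3.0, n=20000)
# Exact value for comparison: 1 - Phi(3) ≈ 1.35e-3; crude Monte Carlo with
# the same n would see only ~27 failures, while the shifted sampler sees ~half.
```

An adaptive scheme would additionally learn the shift (the design point) from earlier samples instead of fixing it a priori.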

  2. The Component Timed-Up-and-Go test: the utility and psychometric properties of using a mobile application to determine prosthetic mobility in people with lower limb amputations.

    PubMed

    Clemens, Sheila M; Gailey, Robert S; Bennett, Christopher L; Pasquina, Paul F; Kirk-Sanchez, Neva J; Gaunaurd, Ignacio A

    2018-03-01

    Using a custom mobile application to evaluate the reliability and validity of the Component Timed-Up-and-Go test to assess prosthetic mobility in people with lower limb amputation. Cross-sectional design. National conference for people with limb loss. A total of 118 people with non-vascular cause of lower limb amputation participated. Subjects had a mean age of 48 (±13.7) years and were an average of 10 years post amputation. Of them, 54% (n = 64) of subjects were male. None. The Component Timed-Up-and-Go was administered using a mobile iPad application, generating a total time to complete the test and five component times capturing each subtask (sit to stand transitions, linear gait, turning) of the standard timed-up-and-go test. The outcome underwent test-retest reliability analysis using intraclass correlation coefficients (ICCs) and convergent validity analyses through correlation with self-report measures of balance and mobility. The Component Timed-Up-and-Go exhibited excellent test-retest reliability with ICCs ranging from .86 to .98 for total and component times. Evidence of discriminative validity resulted from significant differences in mean total times between people with transtibial (10.1 (SD: ±2.3)) and transfemoral (12.76 (SD: ±5.1)) amputation, as well as significant differences in all five component times (P < .05). Convergent validity of the Component Timed-Up-and-Go was demonstrated through moderate correlations with the PLUS-M (rs = -.56). The Component Timed-Up-and-Go is a reliable and valid clinical tool for detailed assessment of prosthetic mobility in people with non-vascular lower limb amputation. The iPad application provided a means to easily record data, contributing to clinical utility.
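The ICCs reported above come from a subjects-by-sessions table of times. A sketch of a two-way agreement ICC(2,1) computed from the ANOVA mean squares, with hypothetical times in seconds (not the study's data, and the study may have used a different ICC form):

```python
# Sketch: ICC(2,1) from mean squares of an n-subjects x k-sessions table.

def icc_2_1(data):
    n = len(data)                 # subjects (rows)
    k = len(data[0])              # sessions/raters (columns)
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((data[i][j] - grand) ** 2
                   for i in range(n) for j in range(k))
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)       # between-subjects mean square
    msc = ss_cols / (k - 1)       # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical test-retest times (s) for five subjects, two sessions.
times = [[9.8, 10.1], [12.5, 12.2], [11.0, 11.3], [14.2, 14.0], [10.5, 10.6]]
icc = icc_2_1(times)
```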

  3. The Ostomy Adjustment Scale: translation into Norwegian language with validation and reliability testing.

    PubMed

    Indrebø, Kirsten Lerum; Andersen, John Roger; Natvig, Gerd Karin

    2014-01-01

    The purpose of this study was to adapt the Ostomy Adjustment Scale to a Norwegian version and to assess its construct validity and 2 components of its reliability (internal consistency and test-retest reliability). One hundred fifty-eight of 217 patients (73%) with a colostomy, ileostomy, or urostomy participated in the study. Slightly more than half (56%) were men. Their mean age was 64 years (range, 26-91 years). All respondents had undergone ostomy surgery at least 3 months before participation in the study. The Ostomy Adjustment Scale was translated into Norwegian according to standard procedures for forward and backward translation. The questionnaire was sent to the participants via regular post. The Cronbach alpha and test-retest coefficients were computed to assess reliability. Construct validity was evaluated via correlations between each item and score sums; correlations were used to analyze relationships between the Ostomy Adjustment Scale and the 36-item Short Form Health Survey, the Quality of Life Scale, the Hospital Anxiety & Depression Scale, and the General Self-Efficacy Scale. The Cronbach alpha was 0.93, and test-retest reliability r was 0.69. The average correlation quotient item to sum score was 0.49 (range, 0.31-0.73). Results showed moderate negative correlations between the Ostomy Adjustment Scale and the Hospital Anxiety and Depression Scale (-0.37 and -0.40), and moderate positive correlations between the Ostomy Adjustment Scale and the 36-item Short Form Health Survey, the Quality of Life Scale, and the General Self-Efficacy Scale (0.30-0.45), with the exception of the pain domain in the Short Form 36 (0.28). Regression analysis showed linear associations between the Ostomy Adjustment Scale and sociodemographic and clinical variables, with the exception of education. The Norwegian language version of the Ostomy Adjustment Scale was found to possess construct validity, along with internal consistency and test-retest reliability. The instrument is sensitive to sociodemographic and clinical variables pertinent to persons with urostomies, colostomies, and ileostomies.
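The internal-consistency figure above (Cronbach alpha) is a simple function of item and total-score variances. A sketch with a hypothetical 4-item, 5-respondent score matrix (not the study's data):

```python
# Sketch: Cronbach's alpha = (k / (k - 1)) * (1 - sum(item variances) /
# variance(total scores)), with sample (n - 1) variances.

def cronbach_alpha(items):
    """items: one score list per item, all over the same respondents."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    item_var = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1.0 - item_var / variance(totals))

scores = [
    [3, 4, 3, 5, 4],   # item 1 across five respondents (hypothetical)
    [2, 4, 3, 5, 4],   # item 2
    [3, 5, 3, 4, 4],   # item 3
    [3, 4, 2, 5, 5],   # item 4
]
alpha = cronbach_alpha(scores)
```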

  4. Multi-component identification and target cell-based screening of potential bioactive compounds in toad venom by UPLC coupled with high-resolution LTQ-Orbitrap MS and high-sensitivity Qtrap MS.

    PubMed

    Ren, Wei; Han, Lingyu; Luo, Mengyi; Bian, Baolin; Guan, Ming; Yang, Hui; Han, Chao; Li, Na; Li, Tuo; Li, Shilei; Zhang, Yangyang; Zhao, Zhenwen; Zhao, Haiyu

    2018-04-28

    Traditional Chinese medicines (TCMs) are undoubtedly treasured natural resources for discovering effective medicines in treating and preventing various diseases. However, it is still extremely difficult to screen bioactive compounds because of the tremendous number of constituents in TCMs. In this work, the chemical composition of toad venom was comprehensively analyzed using ultra-high performance liquid chromatography (UPLC) coupled with high-resolution LTQ-Orbitrap mass spectrometry, and 93 compounds were detected. Among them, 17 constituents were confirmed by standard substances and 8 constituents were detected in toad venom for the first time. Further, the most comprehensive compound database of toad venom to date was constructed using UPLC coupled with high-sensitivity Qtrap MS. Then a target cell-based approach for screening potential bioactive compounds from toad venom was developed by analyzing the target cell extracts. The reliability of this method was validated by negative and positive controls. In total, 17 components in toad venom were discovered to interact with the target cancer cells. Further, in vitro pharmacological trials were performed to confirm the anti-cancer activity of four of them. The results showed that the six bufogenins and seven bufotoxins detected in our research represented a promising resource to explore bufogenin/bufotoxin-based anticancer agents with low cardiotoxic effect. The target cell-based screening method coupled with the compound database of toad venom constructed by UPLC-Qtrap-MS with high sensitivity provides a new strategy to rapidly screen and identify potential bioactive constituents with low content in natural products, which is beneficial for drug discovery from other TCMs.

  5. Sensitization to the mammalian oligosaccharide galactose-alpha-1,3-galactose (alpha-gal): experience in a Flemish case series.

    PubMed

    Ebo, D G; Faber, M; Sabato, V; Leysen, J; Gadisseur, A; Bridts, C H; De Clerck, L S

    2013-01-01

Recent observations have disclosed that the galactose-alpha(1,3)-galactose (alpha-gal) moiety of non-primate glycoproteins can constitute a target for meat allergy. Our aims were to describe adults with allergic reactions to mammalian meat, dairy products and gelatin, and to investigate whether patients demonstrate sensitization to eptacog alfa (activated recombinant human coagulation factor VII), which is produced in baby hamster kidney cells. Ten adults with mammalian meat, dairy product and gelatin allergies were examined using quantification of specific IgE and/or skin prick tests for red meat, milk, milk components, gelatin, cetuximab and eptacog alfa. Most patients demonstrated quite typical clinical histories and serological profiles, with anti-alpha-gal titers varying from less than 1% to over 25% of total serum IgE. All patients demonstrated negative sIgE for gelatin, except the patient with a genuine gelatin allergy. All patients also demonstrated negative sIgE to the recombinant milk components casein, lactalbumin and lactoglobulin. Specific IgE to eptacog was positive in 5 of the 9 patients sensitized to alpha-gal and in none of the 10 control individuals. This series confirms the importance of the alpha-gal carbohydrate moiety as a potential target for allergy to mammalian meat, dairy products and gelatin (oral, topical or parenteral) in a Flemish population of meat-allergic adults. It also confirms that in vitro tests for mammalian meat are generally more reliable than mammalian meat skin tests, but that diagnosis can benefit from skin testing with cetuximab. Specific IgE to gelatin is far too insensitive to diagnose alpha-gal-related gelatin allergy. IgE binding studies indicate a potential risk from alpha-gal-containing human recombinant proteins produced in mammalian cells.

  6. Colocalization recognition-activated cascade signal amplification strategy for ultrasensitive detection of transcription factors.

    PubMed

    Zhu, Desong; Wang, Lei; Xu, Xiaowen; Jiang, Wei

    2017-03-15

Transcription factors (TFs) bind to specific double-stranded DNA (dsDNA) sequences in the regulatory regions of genes to regulate gene transcription. Their expression levels sensitively reflect cell developmental state and disease state, and TFs have become potential diagnostic markers and therapeutic targets for cancers and other diseases. Hence, highly sensitive detection of TFs is of vital importance for early disease diagnosis and drug development. Traditional exonuclease-assisted signal amplification methods suffer from false positives caused by incomplete digestion of excess recognition probes. Herein, based on a new recognition mode, colocalization recognition (CR)-activated dual signal amplification, an ultrasensitive fluorescent detection strategy for TFs was developed. TF-induced colocalization of three split recognition components results in a marked increase of their local effective concentrations and in hybridization of the three split components, which activates a subsequent cascade of signal amplification comprising strand displacement amplification (SDA) and exponential rolling circle amplification (ERCA). This strategy eliminated the false-positive influence and achieved ultra-high sensitivity towards purified NF-κB p50, with a detection limit of 2.0×10⁻¹³ M. Moreover, NF-κB p50 could be detected in as little as 0.21 ng μL⁻¹ of HeLa cell nuclear extract. In addition, the proposed strategy could be used for screening NF-κB p50 activity inhibitors and potential anti-NF-κB p50 drugs. Finally, our proposed strategy offers a potential method for reliable detection of TFs in medical diagnosis and in treatment research for cancers and other related diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  8. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  9. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate that aircraft structures meet damage tolerance requirements throughout their service life. These requirements imply that a damaged aircraft structure must maintain adequate residual strength to sustain its integrity, which is ensured by a continuous inspection program. The multifold objective of this research is to develop a methodology based on direct Monte Carlo simulation and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy, comprising elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed-form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a three-element parallel system to parallel systems of up to six elements. These newly developed expressions are used to check the accuracy of a Monte Carlo simulation algorithm that determines the probability of failure of a parallel system comprising an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme, by utilizing the residual strength of the fasteners subjected to various initial load distributions and then to new, unequal load distributions resulting from subsequent sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. 
The uncertainties associated with the time to crack initiation, the probability of crack detection, the exponent in the crack propagation rate (Paris equation) and the yield strength of the elements are considered in the analytical model. The structural component is assumed to consist of a prescribed number of elements. This Monte Carlo simulation methodology is used to determine the required non-periodic inspections so that the reliability of the structural component will not fall below a prescribed minimum level. A sensitivity analysis is conducted to determine the effect of three key parameters on the specification of the non-periodic inspection intervals: namely a parameter associated with the time to crack initiation, the applied nominal stress fluctuation and the minimum acceptable reliability level.
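    The equal-load-sharing parallel system described above lends itself to a compact Monte Carlo sketch. This is an illustration only, not the thesis code: the normal strength distribution and its parameters are assumptions chosen for the example, and the capacity rule (the system survives a load L if, for some number k of failed weakest elements, each of the n − k survivors can carry an equal share L/(n − k)) is the standard fiber-bundle formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def parallel_system_fails(strengths, total_load):
        """Equal load sharing: as elements fail, the load is redistributed
        uniformly among the survivors (fiber-bundle capacity rule)."""
        s = np.sort(np.asarray(strengths, dtype=float))
        n = len(s)
        for k in range(n):
            # k weakest elements have failed; n - k survivors share the load
            if s[k] >= total_load / (n - k):
                return False  # weakest survivor holds its share -> system holds
        return True

    def failure_probability(n_elements, total_load, n_trials=10_000):
        """Direct Monte Carlo estimate of the system failure probability for
        statistically independent element strengths (assumed here N(1.0, 0.1))."""
        fails = 0
        for _ in range(n_trials):
            strengths = rng.normal(loc=1.0, scale=0.1, size=n_elements)
            fails += parallel_system_fails(strengths, total_load)
        return fails / n_trials
    ```

    For a three-element system with unit strengths the capacity is 3, so a load of 2 is survived and a load of 4 is not; the closed-form CDF expressions mentioned in the abstract serve exactly this kind of cross-check on the simulation.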

  10. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?

  11. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models mean that calibrating all model parameters, or estimating all of their uncertainties, is often computationally infeasible. Hence, techniques that determine the sensitivity of model parameters are used to identify the most important ones. Subsequent model calibrations or uncertainty estimation procedures then focus only on these parameter subsets and are therefore less computationally demanding. While examining the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of sensitivity methods is usually not checked. If it is checked at all, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indices. Bootstrapping, however, can itself become computationally expensive for large model outputs and a high number of bootstrap samples. We therefore present a Model Variable Augmentation (MVA) approach that checks the convergence of sensitivity indices without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards; the latter enables checking of already processed sensitivity indices. To demonstrate that the convergence test is independent of the SA method, we applied it to two widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence test is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) for which the true indices of both methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. 
The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this technique is that it requires no further model evaluations, which enables checking of already processed sensitivity results. This is one step towards reliable, transferable, published sensitivity results.
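    The Morris screening referenced above can be sketched as a one-at-a-time perturbation scheme. The test function below is hypothetical, and for brevity the sketch uses independent random base points rather than the economical trajectory designs of the full method; μ* (the mean absolute elementary effect) is the usual ranking statistic.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def model(x):
        # hypothetical test function: x[0] and x[1] matter, x[2] is inert
        return 2.0 * x[0] + x[1] ** 2 + 0.0 * x[2]

    def morris_mu_star(model, n_params, n_base_points=50, delta=0.1):
        """Mean absolute elementary effect per parameter (Morris 1991)."""
        effects = [[] for _ in range(n_params)]
        for _ in range(n_base_points):
            x = rng.uniform(0.0, 1.0, n_params)
            y0 = model(x)
            for i in range(n_params):
                xp = x.copy()
                xp[i] += delta  # perturb one parameter at a time
                effects[i].append(abs(model(xp) - y0) / delta)
        return [float(np.mean(e)) for e in effects]
    ```

    Ranking parameters by μ* identifies the subset worth calibrating; the convergence question the abstract addresses is whether this ranking is stable under the chosen sampling budget.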

  12. Shank3 Is Part of a Zinc-Sensitive Signaling System That Regulates Excitatory Synaptic Strength.

    PubMed

    Arons, Magali H; Lee, Kevin; Thynne, Charlotte J; Kim, Sally A; Schob, Claudia; Kindler, Stefan; Montgomery, Johanna M; Garner, Craig C

    2016-08-31

    Shank3 is a multidomain scaffold protein localized to the postsynaptic density of excitatory synapses. Functional studies in vivo and in vitro support the concept that Shank3 is critical for synaptic plasticity and the trans-synaptic coupling between the reliability of presynaptic neurotransmitter release and postsynaptic responsiveness. However, how Shank3 regulates synaptic strength remains unclear. The C terminus of Shank3 contains a sterile alpha motif (SAM) domain that is essential for its postsynaptic localization and also binds zinc, thus raising the possibility that changing zinc levels modulate Shank3 function in dendritic spines. In support of this hypothesis, we find that zinc is a potent regulator of Shank3 activation and dynamics in rat hippocampal neurons. Moreover, we show that zinc modulation of synaptic transmission is Shank3 dependent. Interestingly, an autism spectrum disorder (ASD)-associated variant of Shank3 (Shank3(R87C)) retains its zinc sensitivity and supports zinc-dependent activation of AMPAR-mediated synaptic transmission. However, elevated zinc was unable to rescue defects in trans-synaptic signaling caused by the R87C mutation, implying that trans-synaptic increases in neurotransmitter release are not necessary for the postsynaptic effects of zinc. Together, these data suggest that Shank3 is a key component of a zinc-sensitive signaling system, regulating synaptic strength that may be impaired in ASD. Shank3 is a postsynaptic protein associated with neurodevelopmental disorders such as autism and schizophrenia. In this study, we show that Shank3 is a key component of a zinc-sensitive signaling system that regulates excitatory synaptic transmission. Intriguingly, an autism-associated mutation in Shank3 partially impairs this signaling system. 
Therefore, perturbation of zinc homeostasis may impair, not only synaptic functionality and plasticity, but also may lead to cognitive and behavioral abnormalities seen in patients with psychiatric disorders. Copyright © 2016 the authors 0270-6474/16/369124-11$15.00/0.

  13. Shank3 Is Part of a Zinc-Sensitive Signaling System That Regulates Excitatory Synaptic Strength

    PubMed Central

    Arons, Magali H.; Lee, Kevin; Thynne, Charlotte J.; Kim, Sally A.; Schob, Claudia; Kindler, Stefan

    2016-01-01

    Shank3 is a multidomain scaffold protein localized to the postsynaptic density of excitatory synapses. Functional studies in vivo and in vitro support the concept that Shank3 is critical for synaptic plasticity and the trans-synaptic coupling between the reliability of presynaptic neurotransmitter release and postsynaptic responsiveness. However, how Shank3 regulates synaptic strength remains unclear. The C terminus of Shank3 contains a sterile alpha motif (SAM) domain that is essential for its postsynaptic localization and also binds zinc, thus raising the possibility that changing zinc levels modulate Shank3 function in dendritic spines. In support of this hypothesis, we find that zinc is a potent regulator of Shank3 activation and dynamics in rat hippocampal neurons. Moreover, we show that zinc modulation of synaptic transmission is Shank3 dependent. Interestingly, an autism spectrum disorder (ASD)-associated variant of Shank3 (Shank3R87C) retains its zinc sensitivity and supports zinc-dependent activation of AMPAR-mediated synaptic transmission. However, elevated zinc was unable to rescue defects in trans-synaptic signaling caused by the R87C mutation, implying that trans-synaptic increases in neurotransmitter release are not necessary for the postsynaptic effects of zinc. Together, these data suggest that Shank3 is a key component of a zinc-sensitive signaling system, regulating synaptic strength that may be impaired in ASD. SIGNIFICANCE STATEMENT Shank3 is a postsynaptic protein associated with neurodevelopmental disorders such as autism and schizophrenia. In this study, we show that Shank3 is a key component of a zinc-sensitive signaling system that regulates excitatory synaptic transmission. Intriguingly, an autism-associated mutation in Shank3 partially impairs this signaling system. 
Therefore, perturbation of zinc homeostasis may impair, not only synaptic functionality and plasticity, but also may lead to cognitive and behavioral abnormalities seen in patients with psychiatric disorders. PMID:27581454

  14. Human immunodeficiency virus type 1 RNA in breast-milk components.

    PubMed

    Hoffman, Irving F; Martinson, Francis E A; Stewart, Paul W; Chilongozi, David A; Leu, Szu-Yun; Kazembe, Peter N; Banda, Topia; Dzinyemba, Willard; Joshi, Priya; Cohen, Myron S; Fiscus, Susan A

    2003-10-15

    We conducted the present study to determine which of the 4 components of breast milk (whole milk, skim milk, the lipid layer, and breast-milk cells) had the highest sensitivity for detection and the highest concentration of human immunodeficiency virus (HIV) type 1 RNA, and to identify biological correlates of these factors. The probability of detecting HIV (sensitivity) and the concentration of HIV-1 RNA were both associated with the choice of milk component, CD4(+) cell count, concentration of blood serum HIV-1 RNA, and the presence of breast inflammation. Whole milk demonstrated higher sensitivity and a higher mean concentration than any other single component. Sensitivity was enhanced by analyzing all 4 components of breast milk.

  15. A high-dispersion molecular gas component in nearby galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caldú-Primo, Anahi; Walter, Fabian; Sandstrom, Karin

    2013-12-01

    We present a comprehensive study of the velocity dispersion of the atomic (H I) and molecular (H{sub 2}) gas components in the disks (R ≲ R {sub 25}) of a sample of 12 nearby spiral galaxies with moderate inclinations. Our analysis is based on sensitive high-resolution data from the THINGS (atomic gas) and HERACLES (molecular gas) surveys. To obtain reliable measurements of the velocity dispersion, we stack regions several kiloparsecs in size, after accounting for intrinsic velocity shifts due to galactic rotation and large-scale motions. We stack using various parameters: the galactocentric distance, star formation rate surface density, H I surface density, H{sub 2} surface density, and total gas surface density. We fit single Gaussian components to the stacked spectra and measure median velocity dispersions for H I of 11.9 ± 3.1 km s{sup –1} and for CO of 12.0 ± 3.9 km s{sup –1}. The CO velocity dispersions are thus, surprisingly, very similar to the corresponding ones of H I, with an average ratio of σ{sub HI}/σ{sub CO}= 1.0 ± 0.2 irrespective of the stacking parameter. The measured CO velocity dispersions are significantly higher (factor of ∼2) than the traditional picture of a cold molecular gas disk associated with star formation. The high dispersion implies an additional thick molecular gas disk (possibly as thick as the H I disk). Our finding is in agreement with recent sensitive measurements in individual edge-on and face-on galaxies and points toward the general existence of a thick disk of molecular gas, in addition to the well-known thin disk in nearby spiral galaxies.
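    As a sketch of the dispersion measurement, the width of a stacked line profile can be estimated from its intensity-weighted second moment (the survey fits single Gaussians instead, but for a Gaussian profile the two coincide). The synthetic profile below, centered at 5 km/s with σ = 12 km/s, is an assumption for illustration.

    ```python
    import numpy as np

    def velocity_dispersion(velocity, spectrum):
        """Intensity-weighted second moment of a stacked line profile (km/s)."""
        w = np.clip(spectrum, 0.0, None)             # ignore negative noise
        v0 = np.sum(w * velocity) / np.sum(w)        # line centroid
        return np.sqrt(np.sum(w * (velocity - v0) ** 2) / np.sum(w))

    # synthetic stacked spectrum: single Gaussian line, sigma = 12 km/s
    v = np.linspace(-100.0, 100.0, 2001)
    spec = np.exp(-0.5 * ((v - 5.0) / 12.0) ** 2)
    sigma = velocity_dispersion(v, spec)
    ```

    In practice each sightline's spectrum is first shifted by the local rotation velocity before summing, so that galactic rotation does not inflate the measured width.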

  16. Psychometric properties of the Interpersonal Relationship Inventory-Short Form for active duty female service members.

    PubMed

    Nayback-Beebe, Ann M; Yoder, Linda H

    2011-06-01

    The Interpersonal Relationship Inventory-Short Form (IPRI-SF) has demonstrated psychometric consistency across several demographic and clinical populations; however, it has not been psychometrically tested in a military population. The purpose of this study was to psychometrically evaluate the reliability and component structure of the IPRI-SF in active duty United States Army female service members (FSMs). The reliability estimates were .93 for the social support subscale and .91 for the conflict subscale. Principal component analysis demonstrated an obliquely rotated three-component solution that accounted for 58.9% of the variance. The results of this study support the reliability and validity of the IPRI-SF for use in FSMs; however, a three-factor structure emerged in this sample of FSMs post-deployment that represents "cultural context." Copyright © 2011 Wiley Periodicals, Inc.
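    The "variance accounted for" figure above comes from a principal component analysis; the fraction explained by the first m components can be computed from the singular values of the centered data matrix. This minimal sketch covers only the unrotated solution (an oblique rotation, as used in the study, redistributes variance and requires a dedicated rotation routine), and the data are hypothetical.

    ```python
    import numpy as np

    def explained_variance_ratio(data, n_components):
        """Fraction of total variance captured by the first n principal components."""
        X = np.asarray(data, dtype=float)
        X = X - X.mean(axis=0)                      # center each item/column
        s = np.linalg.svd(X, compute_uv=False)      # singular values, descending
        var = s ** 2                                # proportional to component variances
        return float(var[:n_components].sum() / var.sum())
    ```

    A hypothetical rank-1 data set (every column a linear function of the same score) should put all of its variance on the first component.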

  17. Validation of the Hospital Ethical Climate Survey for older people care.

    PubMed

    Suhonen, Riitta; Stolt, Minna; Katajisto, Jouko; Charalambous, Andreas; Olson, Linda L

    2015-08-01

    The exploration of the ethical climate in the care settings for older people is highlighted in the literature, and it has been associated with various aspects of clinical practice and nurses' jobs. However, ethical climate is seldom studied in the older people care context, and valid, reliable, feasible measures are needed for its measurement. This study aimed to test the reliability, validity, and sensitivity of the Hospital Ethical Climate Survey in healthcare settings for older people. A non-experimental cross-sectional study design was employed, and a survey using questionnaires, including the Hospital Ethical Climate Survey, was used for data collection. Data were analyzed using descriptive statistics, inferential statistics, and multivariable methods. Survey data were collected from a sample of nurses working in care settings for older people in Finland (N = 1513, n = 874, response rate = 58%) in 2011. This study was conducted according to good scientific inquiry guidelines, and ethical approval was obtained from the university ethics committee. The mean score for the Hospital Ethical Climate Survey total was 3.85 (standard deviation = 0.56). Cronbach's alpha was 0.92. Principal component analysis provided evidence for factorial validity, and LISREL provided evidence for construct validity based on goodness-of-fit statistics. Pearson's correlations of 0.68-0.90 were found between the sub-scales and the Hospital Ethical Climate Survey. The Hospital Ethical Climate Survey proved to be a valid and reliable tool for measuring ethical climate in care settings for older people, and sensitive enough to reveal variations across clinical settings. The Finnish version of the Hospital Ethical Climate Survey, previously used mainly in hospital settings, proved to be a valid instrument for use in care settings for older people. 
Further studies are needed to analyze the factor structure and some items of the Hospital Ethical Climate Survey. © The Author(s) 2014.
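    Cronbach's alpha, reported above as 0.92, compares the sum of the item variances with the variance of the total score. A minimal sketch with a hypothetical score matrix (respondents × items):

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """scores: (n_respondents, n_items) matrix of item responses."""
        X = np.asarray(scores, dtype=float)
        k = X.shape[1]
        item_vars = X.var(axis=0, ddof=1)          # variance of each item
        total_var = X.sum(axis=1).var(ddof=1)      # variance of the sum score
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
    ```

    Perfectly parallel items give alpha = 1, while uncorrelated items drive it toward 0.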

  18. The Structured Interview & Scoring Tool-Massachusetts Alzheimer's Disease Research Center (SIST-M): development, reliability, and cross-sectional validation of a brief structured clinical dementia rating interview.

    PubMed

    Okereke, Olivia I; Copeland, Maura; Hyman, Bradley T; Wanggaard, Taylor; Albert, Marilyn S; Blacker, Deborah

    2011-03-01

    The Clinical Dementia Rating (CDR) and CDR Sum-of-Boxes can be used to grade mild but clinically important cognitive symptoms of Alzheimer disease. However, sensitive clinical interview formats are lengthy. To develop a brief instrument for obtaining CDR scores and to assess its reliability and cross-sectional validity. Using legacy data from expanded interviews conducted among 347 community-dwelling older adults in a longitudinal study, we identified 60 questions (from a possible 131) about cognitive functioning in daily life using clinical judgment, inter-item correlations, and principal components analysis. Items were selected in 1 cohort (n=147), and a computer algorithm for generating CDR scores was developed in this same cohort and re-run in a replication cohort (n=200) to evaluate how well the 60 items retained information from the original 131 items. Short interviews based on the 60 items were then administered to 50 consecutively recruited older individuals, with no symptoms or mild cognitive symptoms, at an Alzheimer's Disease Research Center. Clinical Dementia Rating scores based on short interviews were compared with those from independent long interviews. In the replication cohort, agreement between short and long CDR interviews ranged from κ=0.65 to 0.79, with κ=0.76 for Memory, κ=0.77 for global CDR, and intraclass correlation coefficient for CDR Sum-of-Boxes=0.89. In the cross-sectional validation, short interview scores were slightly lower than those from long interviews, but good agreement was observed for global CDR and Memory (κ≥0.70) as well as for CDR Sum-of-Boxes (intraclass correlation coefficient=0.73). The Structured Interview & Scoring Tool-Massachusetts Alzheimer's Disease Research Center is a brief, reliable, and sensitive instrument for obtaining CDR scores in persons with symptoms along the spectrum of mild cognitive change.
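    The agreement values above are Cohen's κ: observed agreement corrected for the agreement expected by chance from each rater's marginal frequencies. A minimal sketch (the ratings are toy data, not the study's):

    ```python
    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        """Chance-corrected agreement between two raters over the same cases."""
        n = len(ratings_a)
        observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
        ca, cb = Counter(ratings_a), Counter(ratings_b)
        # chance agreement from the product of marginal frequencies
        expected = sum(ca[c] * cb.get(c, 0) for c in ca) / (n * n)
        return (observed - expected) / (1.0 - expected)
    ```

    Perfect agreement yields κ = 1, and systematic disagreement drives κ negative; values of 0.65-0.79, as reported above, indicate substantial agreement beyond chance.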

  19. Sustained and Transient Contributions to the Rat Dark-Adapted Electroretinogram b-Wave

    PubMed Central

    Dang, Trung M.; Vingrys, Algis J.; Bui, Bang V.

    2013-01-01

    The most dominant feature of the electroretinogram, the b-wave, is thought to reflect ON-bipolar cell responses. However, a number of studies suggest that the b-wave is made up of several components. We consider the composition of the rat b-wave by subtracting corneal negative components obtained using intravitreal application of pharmacological agents to remove postreceptoral responses. By analyzing the intensity-response characteristic of the PII across a range of fixed times during and after a light step, we find that the rat isolated PII has 2 components. The first has fast rise and decay characteristics with a low sensitivity to light. GABAc-mediated inhibitory pathways enhance this transient-ON component to manifest increased and decreased sensitivity to light at shorter (<160 ms) and longer times, respectively. The second component has slower temporal characteristics but is more sensitive to light. GABAc-mediated inhibition enhances this sustained-ON component but has little effect on its sensitivity to light. After stimulus offset, both transient and sustained components return to baseline, and a long-latency sustained positive component becomes apparent. The light sensitivities of the transient-ON and sustained-OFF components are consistent with activity arising from cone ON- and OFF-bipolar cells, whereas the sustained-ON component is likely to arise from rod bipolar cells. PMID:23533706

  20. The effect on reliability and sensitivity to level of training of combining analytic and holistic rating scales for assessing communication skills in an internal medicine resident OSCE.

    PubMed

    Daniels, Vijay John; Harley, Dwight

    2017-07-01

    Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs OSCEs for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. The authors analyzed the reliability of the individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For the analytic, holistic, and combined scales, 12, 12, and 11 stations respectively yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given this added validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.
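    The Phi coefficients above come from a generalizability analysis, which estimates variance components directly. How test length drives reliability can nonetheless be illustrated with the classical-test-theory analogue, the Spearman-Brown prophecy formula; the single-station reliability of 0.25 used below is purely hypothetical.

    ```python
    def spearman_brown(r_single, n_stations):
        """Reliability of n parallel stations, given single-station reliability."""
        return n_stations * r_single / (1.0 + (n_stations - 1) * r_single)

    def stations_needed(r_single, target):
        """Invert the prophecy formula: length needed to reach a target reliability."""
        return target * (1.0 - r_single) / (r_single * (1.0 - target))
    ```

    Under this classical approximation, a single-station reliability of 0.25 would require 12 stations to reach 0.8, which is why adding stations, or improving per-station reliability with a combined scale, shortens the OSCE needed for a dependable score.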

  1. Industrial applications of shearography for inspections of aircraft components

    NASA Astrophysics Data System (ADS)

    Krupka, Rene; Waltz, T.; Ettemeyer, Andreas

    2003-05-01

    Shearography has been validated as a fast and reliable inspection technique for aerospace components. Following several years of evaluation, the technique has entered industrial production inspection. Applications range from serial inspection in the production line, to field inspection during assembly, to maintenance and repair. In all of these applications, the main advantages of shearography, namely very fast, full-field inspection and high sensitivity even on very complex composite materials, have led to the decision for laser shearography as the inspection tool. In this paper, we present examples of recent industrial shearography inspection systems in the aerospace field. One of the first industrial installations of laser shearography in Europe was a fully automatic inspection system for helicopter rotor blades: complete rotor blades are inspected within 10 minutes for delaminations and debondings in the composite structure. For more complex components, robotic manipulation of the shearography camera has proven to be the optimum solution; an industrial 6-axis robot gives the utmost flexibility to position the camera at any angle and distance. Automatic defect-marking systems have also been introduced to indicate the exact position of a defect directly on the inspected component. Other applications cover the inspection of abradable seals in jet engines and portable shearography systems for maintenance and repair inspection in the field.

  2. Industrial applications of shearography for inspection of aircraft components

    NASA Astrophysics Data System (ADS)

    Krupka, Rene; Walz, Thomas; Ettemeyer, Andreas

    2005-04-01

    Shearography has been validated as a fast and reliable inspection technique for aerospace components. Following several years of evaluation, the technique has entered industrial production inspection. Applications range from serial inspection in the production line, to field inspection during assembly, to maintenance and repair. In all of these applications, the main advantages of shearography, namely very fast, full-field inspection and high sensitivity even on very complex composite materials, have led to the decision for laser shearography as the inspection tool. In this paper, we present some highlights of industrial shearography inspection. One of the first industrial installations of laser shearography in Europe was a fully automatic inspection system for helicopter rotor blades: complete rotor blades are inspected within 10 minutes for delaminations and debondings in the composite structure. For more complex components, robotic manipulation of the shearography camera has proven to be the optimal solution; an industrial 6-axis robot gives the utmost flexibility to position the camera at any angle and distance. Automatic defect-marking systems have also been introduced to indicate the exact position of a defect directly on the inspected component. Other applications are shearography inspection systems for abradable seals in jet engines and portable shearography systems for maintenance and repair inspection in the field. Recent installations of automatic inspection systems in the aerospace industry are presented.

  3. HTGR plant availability and reliability evaluations. Volume I. Summary of evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, G.J.; Hannaman, G.W.; Jacobsen, F.K.

    1976-12-01

    The report (1) describes a reliability assessment methodology for systematically locating and correcting areas which may contribute to unavailability of new and uniquely designed components and systems, (2) illustrates the methodology by applying it to such components in a high-temperature gas-cooled reactor (Public Service Company of Colorado's Fort St. Vrain 330-MW(e) HTGR), and (3) compares the results of the assessment with actual experience. The methodology can be applied to any component or system; however, it is particularly valuable for assessments of components or systems which provide essential functions, or the failure or mishandling of which could result in relatively large economic losses.

  4. Reliability, validity, sensitivity and specificity of Gujarati version of the Roland-Morris Disability Questionnaire.

    PubMed

    Nambi, S Gopal

    2013-01-01

    The most common instrument developed to assess the functional status of patients with non-specific low back pain is the Roland-Morris Disability Questionnaire (RMDQ). Clinical and epidemiological research related to low back pain in the Gujarati population would be facilitated by the availability of well-established outcome measures. The aim was to determine the reliability, validity, sensitivity and specificity of the Gujarati version of the RMDQ for use in non-specific chronic low back pain. Thirty outpatients with non-specific chronic low back pain were assessed with the RMDQ. Reliability was assessed using internal consistency and the intra-class correlation coefficient (ICC). Internal construct validity was assessed by Rasch analysis, and external construct validity by association with pain and spinal movement. A clinical calculator was used to determine sensitivity and specificity. Internal consistency of the RMDQ was adequate (>0.65) at both time points, with high ICCs at both time points as well. Internal construct validity of the scale was good, indicating a single underlying construct. Expected associations with pain and spinal movement confirmed external construct validity. At a cut-off point of 0.5, sensitivity and specificity were 80% and 84%, respectively, with a positive predictive value (PPV) of 83.33% and a negative predictive value (NPV) of 80.76%. The RMDQ is a one-dimensional, ordinal measure which works well in the Gujarati population.
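    The test-retest reliability reported above rests on the intraclass correlation coefficient. As a concrete illustration, the sketch below computes ICC(2,1) (two-way random effects, absolute agreement, a common choice for test-retest designs; the abstract does not state which ICC form was used) on synthetic two-session scores for 30 patients:

```python
import numpy as np

# ICC(2,1): two-way random effects, absolute agreement, single rating
# (Shrout-Fleiss convention). The two-session scores are synthetic,
# not the study's data.
def icc_2_1(Y):
    """Y: (n_subjects, k_sessions) matrix of scores."""
    n, k = Y.shape
    grand = Y.mean()
    ms_r = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # between subjects
    ms_c = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # between sessions
    resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0, keepdims=True) + grand
    ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))             # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(5)
truth = rng.uniform(0, 24, 30)                   # 30 patients, RMDQ-like 0-24 range
scores = np.column_stack([truth + rng.normal(0, 1.0, 30),
                          truth + rng.normal(0, 1.0, 30)])
icc = icc_2_1(scores)
```

    With a retest error that is small relative to the between-patient spread, the ICC comes out close to 1, matching the "high ICC" pattern reported.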

  5. Diagnostic accuracy of three monoclonal stool tests in a large series of untreated Helicobacter pylori infected patients.

    PubMed

    Lario, Sergio; Ramírez-Lázaro, María José; Montserrat, Antònia; Quílez, María Elisa; Junquera, Félix; Martínez-Bauer, Eva; Sanfeliu, Isabel; Brullet, Enric; Campo, Rafael; Segura, Ferran; Calvet, Xavier

    2016-06-01

    Immunochromatographic tests need to be improved in order to enhance their reliability. Recently, several new kits have appeared on the market. The objective was to evaluate the diagnostic accuracy of three monoclonal rapid stool tests - the new Uni-Gold™ H.pylori Antigen (Trinity Biotech, Ireland), the RAPID Hp StAR (Oxoid Ltd., UK) and the ImmunoCard STAT! HpSA (Meridian Diagnostics, USA) - for detecting H. pylori infection prior to eradication treatment. Diagnostic accuracy (sensitivity and specificity) and reliability (concordance between observers) were evaluated in 250 untreated consecutive dyspeptic patients. The gold standard for diagnosing H. pylori infection was defined as the concordance of two or more of rapid urease test (RUT), histopathology and urea breath test (UBT), or a positive culture in isolation. Readings of the immunochromatographic tests were performed by two different observers. Sensitivity, specificity, positive and negative predictive values and 95% confidence intervals were calculated. Sensitivity and specificity were compared using the McNemar test. The three tests showed good inter-observer agreement, with kappa values above 0.9. RAPID Hp StAR had a sensitivity of 91%-92% and a specificity ranging from 77% to 85%. Its sensitivity was higher than that of Uni-Gold™ H.pylori Antigen and ImmunoCard STAT! HpSA (p<0.01). The Uni-Gold™ H.pylori Antigen kit showed a sensitivity of 83%, similar to ImmunoCard STAT! HpSA. Specificity of Uni-Gold™ H.pylori Antigen approached 90% (87-89%) and was superior to that of RAPID Hp StAR (p<0.01). Uni-Gold™ H.pylori Antigen and ImmunoCard STAT! HpSA present similar levels of diagnostic accuracy. RAPID Hp StAR was the most sensitive but least reliable of the three immunochromatographic stool tests. None is as accurate and reliable as UBT, RUT or histology. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
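    The accuracy figures above all derive from a 2x2 confusion table against the gold standard. A minimal computation of the four metrics (the counts below are hypothetical, not the study's data):

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion table,
# as reported in stool-test evaluations. Counts are illustrative only.
def diagnostic_accuracy(tp, fp, fn, tn):
    """Return (sensitivity, specificity, ppv, npv) as fractions."""
    sens = tp / (tp + fn)   # positives detected among the infected
    spec = tn / (tn + fp)   # negatives among the uninfected
    ppv = tp / (tp + fp)    # probability infected, given a positive test
    npv = tn / (tn + fn)    # probability uninfected, given a negative test
    return sens, spec, ppv, npv

sens, spec, ppv, npv = diagnostic_accuracy(tp=90, fp=15, fn=10, tn=85)
```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on the infection prevalence in the studied cohort.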

  6. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected losses given failure are a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed.
For a specified time interval, the expected losses from early-life failures are the sum of the products of the expected number of failures in the specified time intervals covering the early-life failure region and the expected losses given failure characterizing the corresponding time intervals. For complex systems whose components are not logically arranged in series, discrete simulation algorithms and software have been created for determining the losses from failures in terms of expected lost production time, cost of intervention, and cost of replacement. Different system topologies are assessed to determine the effect of modifications of the system topology on the expected losses from failures. It is argued that the reliability allocation in a production system should be done to maximize the profit/value associated with the system. Consequently, a method for setting reliability requirements and reliability allocation that maximizes the profit by minimizing the total cost has been developed. Reliability allocation that maximizes the profit in the case of a system consisting of blocks arranged in series is achieved by determining, for each block individually, the reliabilities of the components in the block that minimize the sum of the capital costs, operation costs, and the expected losses from failures. A Monte Carlo simulation based net present value (NPV) cash-flow model has also been proposed, which has significant advantages over cash-flow models based on the expected value of the losses from failures per time interval. Unlike these models, the proposed model has the capability to reveal the variation of the NPV due to the different numbers of failures occurring during a specified time interval (e.g., during one year). The model also permits tracking the impact of the distribution pattern of failure occurrences and the time dependence of the losses from failures.
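    The central relation above, expected losses given failure as a probability-weighted combination over mutually exclusive failure modes, can be sketched directly (the mode probabilities and per-mode losses below are illustrative, not the paper's data):

```python
# Expected losses given failure for a component with mutually exclusive
# failure modes: a linear combination of per-mode expected losses,
# weighted by the conditional probability that each mode initiates
# failure. Numbers are made up for illustration.
def expected_loss_given_failure(mode_probs, mode_losses):
    assert abs(sum(mode_probs) - 1.0) < 1e-9, "mode probabilities must sum to 1"
    return sum(p * c for p, c in zip(mode_probs, mode_losses))

# Three mutually exclusive failure modes: minor, major, catastrophic.
loss = expected_loss_given_failure([0.5, 0.3, 0.2], [10e3, 50e3, 200e3])
```

    Two systems with equal reliability can thus carry very different expected losses if their dominant failure modes differ in cost, which is the paper's core argument against reasoning from reliability alone.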

  7. Behavioral Scale Reliability and Measurement Invariance Evaluation Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2004-01-01

    A latent variable modeling approach to reliability and measurement invariance evaluation for multiple-component measuring instruments is outlined. An initial discussion deals with the limitations of coefficient alpha, a frequently used index of composite reliability. A widely and readily applicable structural modeling framework is next described…
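    Raykov's point of departure is coefficient alpha as an index of composite reliability. For concreteness, a minimal computation of alpha on synthetic item scores (the data and item count are made up; the abstract's latent-variable estimator is a different, more general method):

```python
import numpy as np

# Coefficient alpha (Cronbach's alpha) for a multiple-component
# instrument: k/(k-1) * (1 - sum of item variances / variance of the
# total score). Item scores below are synthetic.
def cronbach_alpha(scores):
    """scores: (n_respondents, k_items) array."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 4))   # 4 roughly parallel items
alpha = cronbach_alpha(items)
```

    Alpha equals the composite reliability only under restrictive assumptions (essential tau-equivalence); otherwise it is generally a biased estimate, which is the limitation Raykov's latent variable approach addresses.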

  8. National Biological Service Research Supports Watershed Planning

    USGS Publications Warehouse

    Snyder, Craig D.

    1996-01-01

    The National Biological Service's Leetown Science Center is investigating how human impacts on watershed, riparian, and in-stream habitats affect fish communities. The research will provide the basis for a Ridge and Valley model that will allow resource managers to accurately predict and effectively mitigate human impacts on water quality. The study takes place in the Opequon Creek drainage basin of West Virginia. A fourth-order tributary of the Potomac, the basin falls within the Ridge and Valley. The study will identify biological components sensitive to land use patterns and the condition of the riparian zone; the effect of stream size, location, and other characteristics on fish communities; the extent to which remote sensing can reliably measure the riparian zone; and the relationship between the rate of landscape change and the structure of fish communities.

  9. Self-defense and martial arts evaluation for college women: preliminary validation of perceptions of dangerous situations scale.

    PubMed

    Hughes, Patricia Paulsen; Sherrill, Claudine; Myers, Bettye; Rowe, Nancy; Marshall, David

    2003-06-01

    Martial arts and self-defense programs train fearful people, especially women, to be more competent and confident to defend themselves in dangerous situations. However, there are no validated instruments to evaluate the effectiveness of programs purporting to teach self-protection. The Perceptions of Dangerous Situations Scale (PDSS), composed of fear, likelihood and confidence subscales, was developed and validated for university women. Participants were 368 university women, ages 17 to 45 years (M age = 20.7 years). Content validity of the PDSS was established through an expert panel, and construct validity was established through principal components analysis and determination of instructional sensitivity. Reliability was established through alpha coefficients. The PDSS, when used with university women, offers promising measurement opportunities in self-defense and martial arts settings.

  10. Optical and electrical nano eco-sensors using alternative deposition of charged layer

    NASA Astrophysics Data System (ADS)

    Ahmed, Syed Rahin; Hong, Seong Cheol; Lee, Jaebeom

    2011-03-01

    This review focuses on layer-by-layer (LBL) assembly, one of the most versatile fabrication methods, as the basis for nano ecological sensors (hereafter, eco-sensors) for pesticide detection. The effects of pesticides on human health and on the environment (air, water, soil, plants, and animals) are of great concern due to their increasing use. We highlight two of the most popular detection methods, i.e., fluorescence and electrochemical detection of pesticides on an LBL assembly. Fluorescent materials are of great interest among researchers for their sensitive and reliable detection, and electrochemical processes allow us to investigate synergistic interactions among film components through charge transfer mechanisms in LBL films at the molecular level. Finally, we note some prospective directions for the development of different types of sensing systems.

  11. Portable high precision pressure transducer system

    DOEpatents

    Piper, Thomas C.; Morgan, John P.; Marchant, Norman J.; Bolton, Steven M.

    1994-01-01

    A high precision pressure transducer system for checking the reliability of a second pressure transducer system used to monitor the level of a fluid confined in a holding tank. Since the response of the pressure transducer is temperature sensitive, it is continually housed in a battery-powered oven configured to provide a temperature-stable environment at a specified temperature for an extended period of time. Further, a high precision temperature-stabilized oscillator and counter are coupled to a single-board computer to accurately determine the pressure transducer oscillation frequency and convert it to an applied pressure. All of the components are powered by the batteries, which are charged by an on-board battery charger during periods when line power is available. The pressure readings are transmitted to a line printer and a vacuum fluorescent display.
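    The frequency-to-pressure conversion described above can be sketched as a gated cycle count followed by a calibration polynomial. The quadratic coefficients and counts below are hypothetical, purely to show the shape of the computation:

```python
# Frequency measurement (cycles counted over a gate time) and a
# hypothetical quadratic calibration converting oscillation frequency
# to applied pressure, as the single-board computer would do.
def measured_frequency(cycle_count, gate_time_s):
    return cycle_count / gate_time_s

def pressure_from_frequency(freq_hz, coeffs=(2.5e-7, 1.2e-2, -150.0)):
    a, b, c = coeffs                      # assumed calibration constants
    return a * freq_hz ** 2 + b * freq_hz + c

freq = measured_frequency(cycle_count=400_000, gate_time_s=10.0)
pressure = pressure_from_frequency(freq)
```

    A longer gate time improves frequency resolution at the cost of slower readings, which is why a temperature-stabilized oscillator matters for the counter's accuracy.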

  12. Effects of mobile phone radiation on reproduction and development in Drosophila melanogaster.

    PubMed

    Weisbrot, David; Lin, Hana; Ye, Lin; Blank, Martin; Goodman, Reba

    2003-05-01

    In this report we examined the effects of a discontinuous radio frequency (RF) signal produced by a GSM multiband mobile phone (900/1,900 MHz; SAR approximately 1.4 W/kg) on Drosophila melanogaster, during the 10-day developmental period from egg laying through pupation. As found earlier with low frequency exposures, the non-thermal radiation from the GSM mobile phone increased numbers of offspring, elevated hsp70 levels, increased serum response element (SRE) DNA-binding and induced the phosphorylation of the nuclear transcription factor, ELK-1. The rapid induction of hsp70 within minutes, by a non-thermal stress, together with identified components of signal transduction pathways, provide sensitive and reliable biomarkers that could serve as the basis for realistic mobile phone safety guidelines. Copyright 2003 Wiley-Liss, Inc.

  13. Construct validity of the abbreviated mental test in older medical inpatients.

    PubMed

    Antonelli Incalzi, R; Cesari, M; Pedone, C; Carosella, L; Carbonin, P U

    2003-01-01

    To evaluate validity and internal structure of the Abbreviated Mental Test (AMT), and to assess the dependence of the internal structure upon the characteristics of the patients examined. Cross-sectional examination using data from the Italian Group of Pharmacoepidemiology in the Elderly (GIFA) database. Twenty-four acute care wards of Geriatrics or General Medicine. Two thousand eight hundred and eight patients consecutively admitted over a 4-month period. Demographic characteristics, functional status, medical conditions and performance on AMT were collected at discharge. Sensitivity, specificity and predictive values of the AMT <7 versus a diagnosis of dementia made according to DSM-III-R criteria were computed. The internal structure of AMT was assessed by principal component analysis. The analysis was performed on the whole population and stratified for age (<65, 65-80 and >80 years), gender, education (<6 or >5 years) and presence of congestive heart failure (CHF). AMT achieved high sensitivity (81%), specificity (84%) and negative predictive value (99%), but a low positive predictive value of 25%. The principal component analysis isolated two components: the former component represents the orientation to time and space and explains 45% of AMT variance; the latter is linked to memory and attention and explains 13% of variance. Comparable results were obtained after stratification by age, gender or education. In patients with CHF, only 48.3% of the cumulative variance was explained; the factor accounting for most (34.6%) of the variance explained was mainly related to the three items assessing memory. AMT >6 rules out dementia very reliably, whereas AMT <7 requires a second level cognitive assessment to confirm dementia. AMT is bidimensional and maintains the same internal structure across classes defined by selected social and demographic characteristics, but not in CHF patients. It is likely that its internal structure depends on the type of patients. 
The use of a sum-score could conceal some part of the information provided by the AMT. Copyright 2003 S. Karger AG, Basel
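    The internal-structure analysis above rests on principal component analysis of the item correlation matrix. A minimal sketch on synthetic two-factor data (standing in for the orientation and memory/attention components; the GIFA data are not reproduced here):

```python
import numpy as np

# PCA via eigendecomposition of the item correlation matrix. Two latent
# factors generate 5 + 3 items, mimicking a bidimensional instrument.
rng = np.random.default_rng(42)
n = 500
f1 = rng.normal(size=(n, 1))                      # "orientation" factor
f2 = rng.normal(size=(n, 1))                      # "memory/attention" factor
items = np.hstack([f1 + 0.6 * rng.normal(size=(n, 5)),
                   f2 + 0.6 * rng.normal(size=(n, 3))])

R = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals = eigvals[order]
explained = eigvals / eigvals.sum()               # variance explained per component
```

    As in the AMT analysis, the first two eigenvalues dominate and together account for most of the common variance, while the remaining components mop up item-specific noise.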

  14. New machine-learning algorithms for prediction of Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Mandal, Indrajit; Sairam, N.

    2014-03-01

    This article presents enhanced prediction accuracy for the diagnosis of Parkinson's disease (PD), aimed at preventing delay and misdiagnosis of patients, using the proposed robust inference system. New machine-learning methods are proposed, and performance comparisons are based on specificity, sensitivity, accuracy and other measurable parameters. The robust methods applied to the PD diagnosis problem include sparse multinomial logistic regression, a rotation forest ensemble with support vector machines and principal components analysis, artificial neural networks, and boosting methods. A new ensemble method, comprising a Bayesian network optimised by a Tabu search algorithm as classifier and Haar wavelets as projection filter, is used for relevant feature selection and ranking. The highest accuracy, obtained by linear logistic regression and sparse multinomial logistic regression, is 100%, with sensitivity and specificity of 0.983 and 0.996, respectively. All experiments are conducted at the 95% and 99% confidence levels, and the results are established with corrected t-tests. This work shows a high degree of advancement in software reliability and quality of the computer-aided diagnosis system, and experimentally shows best results with supportive statistical inference.

  15. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
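    The three damage-sensitive features named above can be computed directly from a baseline and a current (damaged-state) waveform. The definitions below are common choices and an assumption on my part, not necessarily the paper's exact formulas, and the waveforms are synthetic:

```python
import numpy as np

# Normalized amplitude, arrival-time/phase shift (via cross-correlation
# lag), and correlation coefficient between baseline and current
# Lamb-wave signals. Waveforms are a toy model, not experiment data.
def damage_features(baseline, current):
    na = np.max(np.abs(current)) / np.max(np.abs(baseline))   # normalized amplitude
    # lag (in samples) at which the cross-correlation peaks
    lag = np.argmax(np.correlate(current, baseline, mode="full")) - (len(baseline) - 1)
    rho = np.corrcoef(baseline, current)[0, 1]                # correlation coefficient
    return na, lag, rho

t = np.linspace(0, 1e-4, 2000)
base = np.sin(2 * np.pi * 200e3 * t) * np.exp(-((t - 5e-5) / 1e-5) ** 2)  # tone burst
curr = 0.8 * np.roll(base, 7)   # attenuated and delayed, mimicking a crack
na, lag, rho = damage_features(base, curr)
```

    A regression model such as the paper's GA-tuned LS-SVM would then map these three scalars (per sensing path) to a crack size.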

  17. Detection of coupling delay: A problem not yet solved

    NASA Astrophysics Data System (ADS)

    Coufal, David; Jakubík, Jozef; Jajcay, Nikola; Hlinka, Jaroslav; Krakovská, Anna; Paluš, Milan

    2017-08-01

    Nonparametric detection of coupling delay in unidirectionally and bidirectionally coupled nonlinear dynamical systems is examined. Both continuous and discrete-time systems are considered. Two methods of detection are assessed: the method based on conditional mutual information (the CMI method, also known as the transfer entropy method) and the method of convergent cross mapping (the CCM method). Computer simulations show that neither method is generally reliable in the detection of coupling delays. For continuous-time chaotic systems, the CMI method appears to be more sensitive and applicable in a broader range of coupling parameters than the CCM method. In the case of the tested discrete-time dynamical systems, the CCM method has been found to be more sensitive, while the CMI method required much stronger coupling in order to yield correct results. However, when the studied systems contain a strong oscillatory component in their dynamics, the results of both methods become ambiguous. The presented study suggests that results of the tested algorithms should be interpreted with utmost care and that the nonparametric detection of coupling delay, in general, is a problem not yet solved.
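    To make the lag-scanning idea concrete, here is a stripped-down sketch: mutual information between the driver x(t - lag) and the response y(t), estimated with a binned histogram, maximized over candidate lags. This omits the conditioning on the past of y that distinguishes true CMI/transfer entropy, and the stochastic linear system is far easier than the chaotic systems tested in the paper:

```python
import numpy as np

# Plug-in (histogram) mutual information between two scalar series.
def mutual_information(a, b, bins=16):
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
n, true_delay = 20000, 5
x = rng.normal(size=n)                      # driver: white noise
y = np.empty(n)
y[:true_delay] = rng.normal(size=true_delay)
y[true_delay:] = 0.9 * x[:-true_delay] + 0.3 * rng.normal(size=n - true_delay)

lags = list(range(1, 11))
mi = [mutual_information(x[:-lag], y[lag:]) for lag in lags]   # pair x(t-lag), y(t)
estimated_delay = lags[int(np.argmax(mi))]
```

    For such a strongly, unidirectionally coupled system the MI peak sits squarely at the true delay; the paper's point is precisely that this clean behavior does not carry over to weakly coupled or oscillatory nonlinear systems.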

  18. Waveform Tomography of Two-Dimensional Three-Component Seismic Data for HTI Anisotropic Media

    NASA Astrophysics Data System (ADS)

    Gao, Fengxia; Wang, Yanghua; Wang, Yun

    2018-06-01

    Reservoirs with vertically aligned fractures can be represented equivalently by horizontal transverse isotropy (HTI) media. Inverting for the anisotropic parameters of HTI media is a challenging inverse problem, however, because of difficulties inherent in a multiple-parameter inversion. In this paper, when we invert for the anisotropic parameters, we consider for the first time the azimuthal rotation of a two-dimensional seismic survey line from the symmetry of HTI. The established wave equations for HTI media with azimuthal rotation consist of nine elastic coefficients, expressed in terms of five modified Thomsen parameters. The latter parallel the Thomsen parameters used to describe the velocity characteristics of weak vertical transverse isotropy media. We analyze the sensitivity differences of the five modified Thomsen parameters from their radiation patterns, and attempt to balance the magnitude and sensitivity differences between the parameters through normalization and tuning factors, which help to update the model parameters properly. We demonstrate an effective inversion strategy that inverts the velocity parameters in a first stage and updates the five modified Thomsen parameters simultaneously in a second stage, generating reliably reconstructed models.

  19. Characterizing the uncertainty in holddown post load measurements

    NASA Technical Reports Server (NTRS)

    Richardson, J. A.; Townsend, J. S.

    1993-01-01

    In order to understand unexpectedly erratic load measurements in the launch-pad supports for the space shuttle, the sensitivities of the load cells in the supports were analyzed using simple probabilistic techniques. NASA engineers use the loads in the shuttle's supports to calculate critical stresses in the shuttle vehicle just before lift-off. The support loads are measured with 'load cells', which are actually structural components of the mobile launch platform that have been instrumented with strain gauges. Although these load cells adequately measure vertical loads, the horizontal load measurements have been erratic. The load measurements were simulated in this study using Monte Carlo simulation procedures. The simulation studies showed that the support loads are sensitive to small deviations in strain and calibration. In their current configuration, the load cells will not measure loads with sufficient accuracy to reliably calculate stresses in the shuttle vehicle. A simplified model of the holddown post (HDP) load measurement system was used to study the effect on load measurement accuracy of several factors, including load point deviations, gauge heights, and HDP geometry.
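    The Monte Carlo approach above amounts to propagating small random deviations in strain and calibration through the load equation and looking at the spread of the inferred load. A toy version (the calibration factor, load level, and scatter magnitudes are illustrative, not the holddown-post model's parameters):

```python
import numpy as np

# Monte Carlo sketch of load-cell sensitivity: load inferred from a
# strain reading through a calibration factor, with random deviations
# in both. Values are made up for illustration.
rng = np.random.default_rng(7)
n_trials = 100_000
true_load_kN = 500.0
cal = 2.0e3                          # kN of load per unit strain (assumed)
true_strain = true_load_kN / cal

strain = true_strain * (1 + rng.normal(0, 0.01, n_trials))   # 1% strain scatter
cal_meas = cal * (1 + rng.normal(0, 0.02, n_trials))         # 2% calibration scatter
load = strain * cal_meas

bias = load.mean() - true_load_kN
spread = load.std()    # ~ sqrt(0.01^2 + 0.02^2) * 500 kN for small deviations
```

    Even with percent-level input scatter, the one-sigma load error is on the order of 2% of the applied load, which illustrates how modest strain and calibration deviations can dominate a small horizontal load component.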

  20. Detecting fixation on a target using time-frequency distributions of a retinal birefringence scanning signal

    PubMed Central

    2013-01-01

    Background The fovea, which is the most sensitive part of the retina, is known to have birefringent properties, i.e. it changes the polarization state of light upon reflection. Existing devices use this property to obtain information on the orientation of the fovea and the direction of gaze. Such devices employ specific frequency components that appear during moments of fixation on a target. To detect them, previous methods have used solely the power spectrum of the Fast Fourier Transform (FFT), which, unfortunately, is an integral method, and does not give information as to where exactly the events of interest occur. With very young patients who are not cooperative enough, this presents a problem, because central fixation may be present only during very short-lasting episodes, and can easily be missed by the FFT. Method This paper presents a method for detecting short-lasting moments of central fixation in existing devices for retinal birefringence scanning, with the goal of a reliable detection of eye alignment. Signal analysis is based on the Continuous Wavelet Transform (CWT), which reliably localizes such events in the time-frequency plane. Even though the characteristic frequencies are not always strongly expressed due to possible artifacts, simple topological analysis of the time-frequency distribution can detect fixation reliably. Results In all six subjects tested, the CWT allowed precise identification of both frequency components. Moreover, in four of these subjects, episodes of intermittent but definitely present central fixation were detectable, similar to those in Figure 4. A simple FFT is likely to treat them as borderline cases, or entirely miss them, depending on the thresholds used. Conclusion Joint time-frequency analysis is a powerful tool in the detection of eye alignment, even in a noisy environment. 
The method is applicable to similar situations, where short-lasting diagnostic events need to be detected in time series acquired by means of scanning some substrate along a specific path. PMID:23668264
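    The CWT-based localization described above can be sketched with a complex Morlet wavelet implemented by direct convolution. The test signal, a short 40 Hz burst in noise standing in for a fixation episode, and all parameters are illustrative, not the device's actual frequencies:

```python
import numpy as np

# Minimal continuous wavelet transform with a complex Morlet wavelet.
# Scale s maps to frequency f ~ w0 * fs / (2 * pi * s).
def morlet_cwt(signal, scales, w0=6.0):
    out = np.empty((len(scales), len(signal)), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        out[i] = np.convolve(signal, np.conj(wavelet)[::-1], mode="same")
    return out

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
sig = 0.1 * np.random.default_rng(3).normal(size=t.size)     # background noise
burst = (t > 0.8) & (t < 1.2)
sig[burst] += np.sin(2 * np.pi * 40.0 * t[burst])            # short 40 Hz episode

freqs = np.array([20.0, 40.0, 80.0])
scales = 6.0 * fs / (2 * np.pi * freqs)
power = np.abs(morlet_cwt(sig, scales)) ** 2
best = int(np.argmax(power[:, burst].mean(axis=1)))          # row with most burst power
```

    Unlike an FFT power spectrum of the whole record, the scalogram shows not only that the 40 Hz component is present but when, which is exactly what makes short-lasting fixation episodes detectable.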

  1. Hollow-Structured Graphene-Silicone-Composite-Based Piezoresistive Sensors: Decoupled Property Tuning and Bending Reliability.

    PubMed

    Luo, Ningqi; Huang, Yan; Liu, Jing; Chen, Shih-Chi; Wong, Ching Ping; Zhao, Ni

    2017-10-01

    A versatile flexible piezoresistive sensor should maintain high sensitivity in a wide linear range, and provide a stable and repeatable pressure reading under bending. These properties are often difficult to achieve simultaneously with conventional filler-matrix composite active materials, as tuning of one material component often results in change of multiple sensor properties. Here, a material strategy is developed to realize a 3D graphene-poly(dimethylsiloxane) hollow structure, where the electrical conductivity and mechanical elasticity of the composite can be tuned separately by varying the graphene layer number and the poly(dimethylsiloxane) composition ratio, respectively. As a result, the sensor sensitivity and linear range can be easily improved through a decoupled tuning process, reaching a sensitivity of 15.9 kPa⁻¹ in a 60 kPa linear region, and the sensor also exhibits fast response (1.2 ms rising time) and high stability. Furthermore, by optimizing the density of the graphene percolation network and thickness of the composite, the stability and repeatability of the sensor output under bending are improved, achieving a measurement error below 6% under bending radius variations from -25 to +25 mm. Finally, the potential applications of these sensors in wearable medical devices and robotic vision are explored. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A Novel Health Evaluation Strategy for Multifunctional Self-Validating Sensors

    PubMed Central

    Shen, Zhengguang; Wang, Qi

    2013-01-01

    The performance evaluation of sensors is very important in actual applications. In this paper, a theory based on multi-variable information fusion is studied to evaluate the health level of multifunctional sensors. A novel concept of health reliability degree (HRD) is defined to indicate a quantitative health level, in contrast to traditional, qualitative fault diagnosis. To evaluate the health condition from both local and global perspectives, the HRD of a single sensitive component at multiple time points and that of the overall multifunctional sensor at a single time point are defined, respectively. The HRD methodology combines multi-variable data fusion technology with a grey comprehensive evaluation method. In this method, the distinct importance of each sensitive unit and the sensitivity of different time points are acquired using the information entropy and analytic hierarchy process methods, respectively. In order to verify the feasibility of the proposed strategy, a health evaluation experimental system for multifunctional self-validating sensors was designed, and five different health-level situations are discussed. Successful results show that the proposed method is feasible, that the HRD can be used to quantitatively indicate the health level, and that it does have a fast response to performance changes of multifunctional sensors. PMID:23291576
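    The information-entropy weighting step mentioned above can be sketched as follows: indicators whose normalized readings vary more across samples carry more information and receive larger weights. This is the standard entropy-weight construction and an assumption about the paper's exact variant; the readings are made up:

```python
import numpy as np

# Entropy-based objective weights for a multifunctional sensor's
# indicators: lower (column-wise) entropy means more diversification
# across samples, hence a larger weight.
def entropy_weights(X):
    """X: (n_samples, n_indicators), nonnegative readings."""
    P = X / X.sum(axis=0)                       # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)     # normalized entropy, in [0, 1]
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()

X = np.array([[0.20, 0.90],     # indicator 0 nearly constant,
              [0.21, 0.10],     # indicator 1 highly variable
              [0.19, 0.50],
              [0.20, 0.30]])
w = entropy_weights(X)
```

    In an HRD-style scheme these objective weights would be combined with the subjective, expert-driven weights from the analytic hierarchy process before the grey evaluation step.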

  3. From coeliac disease to noncoeliac gluten sensitivity; should everyone be gluten free?

    PubMed

    Aziz, Imran; Dwivedi, Krit; Sanders, David S

    2016-03-01

    Gluten-free diets (GFDs) have seen a disproportional rise in use and popularity relative to the prevalence of established gluten-related disorders such as coeliac disease or immunoglobulin E wheat allergy. Gluten sensitivity in the absence of these disorders has been termed noncoeliac gluten sensitivity (NCGS). This review aims to provide a current perspective on the emerging evidence for and against NCGS, along with the associated need for a GFD. NCGS and the benefits of a GFD are reported amongst patients with irritable bowel syndrome, inflammatory bowel disease, and nonintestinal disorders such as neuropsychiatric diseases and fibromyalgia. However, no reliable biomarkers currently exist to diagnose NCGS and hence confirmatory testing can only be performed using double-blind placebo-controlled gluten-based challenges. Unfortunately, such tests are not available in routine clinical practice. Furthermore, recent novel studies have highlighted the role of other wheat-based components in contributing to the symptoms of self-reported NCGS. These include fermentable oligo-, di- and monosaccharides and polyols (FODMAPs), amylase trypsin inhibitors, and wheat germ agglutinins. Therefore, NCGS is now seen as a spectrum encompassing several biological responses, and terms such as 'noncoeliac wheat sensitivity' have been suggested as a wider label to define the condition. Despite the rising use of a GFD, further studies are required to clearly establish the extent and exclusivity of gluten in NCGS.

  4. Enhanced Component Performance Study: Turbine-Driven Pumps 1998–2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-11-01

    This report presents an enhanced performance evaluation of turbine-driven pumps (TDPs) at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience failure reports from fiscal year 1998 through 2014 for component reliability as reported in the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The TDP failure modes considered are failure to start (FTS), failure to run for less than or equal to one hour (FTR≤1H), failure to run for more than one hour (FTR>1H), and, for normally running systems, FTS and failure to run (FTR). The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates of reliability are provided for the entire active period. Statistically significant increasing trends were identified for TDP unavailability, for the frequency of start demands for standby TDPs, and for run hours in the first hour after start. Statistically significant decreasing trends were identified for start demands for normally running TDPs and for run hours per reactor critical year for normally running TDPs.
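    Demand-based failure probabilities in studies of this kind are commonly estimated with a Jeffreys prior, giving a Beta posterior and the point estimate (f + 0.5)/(n + 1). A minimal sketch (whether this report uses exactly that update is an assumption; the demand and failure counts are made up):

```python
# Failure-to-start probability with a Jeffreys prior, Beta(0.5, 0.5):
# posterior is Beta(f + 0.5, n - f + 0.5), posterior mean (f+0.5)/(n+1).
# Counts below are illustrative, not the study's TDP data.
def jeffreys_ftp(failures, demands):
    """Posterior mean and Beta posterior parameters for p(failure per demand)."""
    a, b = failures + 0.5, demands - failures + 0.5
    return a / (a + b), (a, b)

p, (a, b) = jeffreys_ftp(failures=4, demands=2000)
```

    The Beta posterior also yields credible intervals directly, which is what allows yearly estimates to be trended even in years with zero observed failures.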

  5. Efficient stochastic approaches for sensitivity studies of an Eulerian large-scale air pollution model

    NASA Astrophysics Data System (ADS)

    Dimov, I.; Georgieva, R.; Todorov, V.; Ostromsky, Tz.

    2017-10-01

    Reliability of large-scale mathematical models is an important issue when such models are used to support decision makers. Sensitivity analysis of model outputs to variation or natural uncertainties of model inputs is crucial for improving the reliability of mathematical models. A comprehensive experimental study of Monte Carlo algorithms based on Sobol sequences for multidimensional numerical integration has been done. A comparison with Latin hypercube sampling and a particular quasi-Monte Carlo lattice rule based on generalized Fibonacci numbers has been presented. The algorithms have been successfully applied to compute global Sobol sensitivity measures corresponding to the influence of several input parameters (six chemical reaction rates and four different groups of pollutants) on the concentrations of important air pollutants. The concentration values have been generated by the Unified Danish Eulerian Model. The sensitivity study has been done for the areas of several European cities with different geographical locations. The numerical tests show that the stochastic algorithms under consideration are efficient for multidimensional integration, and especially for computing sensitivity indices that are small in value. This is crucial, since even small indices may need to be estimated accurately in order to achieve a more accurate apportionment of the inputs' influence and a more reliable interpretation of the mathematical model results.
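    A first-order Sobol index can be estimated with a simple pick-freeze Monte Carlo scheme. The sketch below uses plain pseudorandom sampling and a hypothetical two-input linear model (not the Danish Eulerian Model) whose analytic indices are 0.8 and 0.2:

```python
import random

def model(x1, x2):
    # Hypothetical linear response; for independent uniform(0, 1) inputs
    # the analytic first-order Sobol indices are S1 = 0.8 and S2 = 0.2.
    return 2.0 * x1 + x2

def sobol_first_order(f, n=100000, seed=1):
    """Pick-freeze estimates of S1 and S2 for a two-input model f."""
    rng = random.Random(seed)
    ya, yb1, yb2 = [], [], []
    for _ in range(n):
        x1, x2 = rng.random(), rng.random()
        x1p, x2p = rng.random(), rng.random()
        ya.append(f(x1, x2))
        yb1.append(f(x1, x2p))   # x1 frozen, x2 resampled -> estimates S1
        yb2.append(f(x1p, x2))   # x2 frozen, x1 resampled -> estimates S2
    mean = sum(ya) / n
    var = sum((y - mean) ** 2 for y in ya) / n

    def index(yb):
        cov = sum(a * b for a, b in zip(ya, yb)) / n - mean * (sum(yb) / n)
        return cov / var

    return index(yb1), index(yb2)
```

The quasi-Monte Carlo variants studied in the record replace the pseudorandom draws with Sobol or lattice points, which is precisely what sharpens the estimates of small indices.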

  6. PSHFT - COMPUTERIZED LIFE AND RELIABILITY MODELLING FOR TURBOPROP TRANSMISSIONS

    NASA Technical Reports Server (NTRS)

    Savage, M.

    1994-01-01

    The computer program PSHFT calculates the life of a variety of aircraft transmissions. A generalized life and reliability model is presented for turboprop and parallel shaft geared prop-fan aircraft transmissions. The transmission life and reliability model is a combination of the individual reliability models for all the bearings and gears in the main load paths. The bearing and gear reliability models are based on the statistical two-parameter Weibull failure distribution method and classical fatigue theories. The computer program developed to calculate the transmission model is modular. In its present form, the program can analyze five different transmission arrangements. Moreover, the program can be easily modified to include additional transmission arrangements. PSHFT uses the properties of a common block two-dimensional array to separate the component and transmission property values from the analysis subroutines. The rows correspond to specific components with the first row containing the values for the entire transmission. Columns contain the values for specific properties. Since the subroutines (which determine the transmission life and dynamic capacity) interface solely with this property array, they are separated from any specific transmission configuration. The system analysis subroutines work in an identical manner for all transmission configurations considered. Thus, other configurations can be added to the program by simply adding component property determination subroutines. PSHFT consists of a main program, a series of configuration specific subroutines, generic component property analysis subroutines, systems analysis subroutines, and a common block. The main program selects the routines to be used in the analysis and sequences their operation.
The series of configuration specific subroutines input the configuration data, perform the component force and life analyses (with the help of the generic component property analysis subroutines), fill the property array, call up the system analysis routines, and finally print out the analysis results for the system and components. PSHFT is written in FORTRAN 77 and compiled on a Microsoft FORTRAN compiler. The program will run on an IBM PC AT compatible with at least 104k bytes of memory. The program was developed in 1988.
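    The two-parameter Weibull, series-system combination described above can be sketched as follows. The component parameters here are invented for illustration; PSHFT's actual load-path force and dynamic-capacity analysis is far more involved:

```python
import math

def weibull_reliability(t, eta, beta):
    """Two-parameter Weibull survival probability at life t
    (eta = characteristic life, beta = Weibull slope)."""
    return math.exp(-((t / eta) ** beta))

def system_reliability(t, components):
    """Series system: every bearing/gear in the load path must survive."""
    r = 1.0
    for eta, beta in components:
        r *= weibull_reliability(t, eta, beta)
    return r

def life_at_reliability(components, target, lo=1e-6, hi=1e9):
    """Bisect for the life at which system reliability drops to target."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if system_reliability(mid, components) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Invented parameters: two bearings and one gear mesh (eta in hours).
parts = [(4000.0, 1.5), (5000.0, 1.5), (9000.0, 2.0)]
l90 = life_at_reliability(parts, 0.90)  # life at 90% system reliability
```

Because the system reliability is the product of the component reliabilities, adding a configuration only means supplying its component (eta, beta) pairs, which mirrors the modular property-array design described in the abstract.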

  7. Performance and reliability of the NASA biomass production chamber

    NASA Technical Reports Server (NTRS)

    Fortson, R. E.; Sager, J. C.; Chetirkin, P. V.

    1994-01-01

    The Biomass Production Chamber (BPC) at the Kennedy Space Center is part of the Controlled Ecological Life Support System (CELSS) Breadboard Project. Plants are grown in a closed environment in an effort to quantify their contributions to the requirements for life support. Performance of this system is described. Also, in building this system, data from component and subsystem failures are being recorded. These data are used to identify problem areas in the design and implementation. The techniques used to measure the reliability will be useful in the design and construction of future CELSS. Possible methods for determining the reliability of a green plant, the primary component of CELSS, are discussed.

  8. Enhancing malaria diagnosis through microfluidic cell enrichment and magnetic resonance relaxometry detection

    NASA Astrophysics Data System (ADS)

    Fook Kong, Tian; Ye, Weijian; Peng, Weng Kung; Wei Hou, Han; Marcos; Preiser, Peter Rainer; Nguyen, Nam-Trung; Han, Jongyoon

    2015-06-01

    Despite significant advancements over the years, there remains an urgent need for low cost diagnostic approaches that allow for rapid, reliable and sensitive detection of malaria parasites in clinical samples. Our previous work has shown that magnetic resonance relaxometry (MRR) is a potentially highly sensitive tool for malaria diagnosis. A key challenge for making MRR based malaria diagnostics suitable for clinical testing is the fact that MRR baseline fluctuation exists between individuals, making it difficult to detect low level parasitemia. To overcome this problem, it is important to establish the MRR baseline of each individual while having the ability to reliably determine any changes that are caused by the infection of malaria parasite. Here we show that an approach that combines the use of microfluidic cell enrichment with a saponin lysis before MRR detection can overcome these challenges and provide the basis for a highly sensitive and reliable diagnostic approach of malaria parasites. Importantly, as little as 0.0005% of ring stage parasites can be detected reliably, making this ideally suited for the detection of malaria parasites in peripheral blood obtained from patients. The approaches used here are envisaged to provide a new malaria diagnosis solution in the near future.

  9. A New Tool for Nutrition App Quality Evaluation (AQEL): Development, Validation, and Reliability Testing.

    PubMed

    DiFilippo, Kristen Nicole; Huang, Wenhao; Chapman-Novakofski, Karen M

    2017-10-27

    The extensive availability and increasing use of mobile apps for nutrition-based health interventions makes evaluation of the quality of these apps crucial for integration of apps into nutritional counseling. The goal of this research was the development, validation, and reliability testing of the app quality evaluation (AQEL) tool, an instrument for evaluating apps' educational quality and technical functionality. Items for evaluating app quality were adapted from website evaluations, with additional items added to evaluate the specific characteristics of apps, resulting in 79 initial items. Expert panels of nutrition and technology professionals and app users reviewed items for face and content validation. After recommended revisions, nutrition experts completed a second AQEL review to ensure clarity. On the basis of 150 sets of responses using the revised AQEL, principal component analysis was completed, reducing AQEL into 5 factors that underwent reliability testing, including internal consistency, split-half reliability, test-retest reliability, and interrater reliability (IRR). Two additional modifiable constructs for evaluating apps based on the age and needs of the target audience as selected by the evaluator were also tested for construct reliability. IRR testing using intraclass correlations (ICC) with all 7 constructs was conducted, with 15 dietitians evaluating one app. Development and validation resulted in the 51-item AQEL. These were reduced to 25 items in 5 factors after principal component analysis, plus 9 modifiable items in two constructs that were not included in principal component analysis. Internal consistency and split-half reliability of the following constructs derived from principal component analysis were good (Cronbach alpha >.80, Spearman-Brown coefficient >.80): behavior change potential, support of knowledge acquisition, app function, and skill development. App purpose split-half reliability was .65.
Test-retest reliability showed no significant change over time (P>.05) for all but skill development (P=.001). Construct reliability was good for items assessing age appropriateness of apps for children, teens, and a general audience. In addition, construct reliability was acceptable for assessing app appropriateness for various target audiences (Cronbach alpha >.70). For the 5 main factors, ICC (1,k) was >.80, with a P value of <.05. When 15 nutrition professionals evaluated one app, ICC (2,15) was .98, with a P value of <.001 for all 7 constructs when the modifiable items were specified for adults seeking weight loss support. Our preliminary effort shows that AQEL is a valid, reliable instrument for evaluating nutrition apps' qualities for clinical interventions by nutrition clinicians, educators, and researchers. Further efforts in validating AQEL in various contexts are needed. ©Kristen Nicole DiFilippo, Wenhao Huang, Karen M. Chapman-Novakofski. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 27.10.2017.
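    The internal-consistency statistics reported here are standard and compact to compute. A minimal sketch of Cronbach's alpha and the Spearman-Brown split-half projection (toy scores, not AQEL data):

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Alpha from per-item score lists (one inner list per item,
    each of length n_respondents)."""
    k = len(item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]
    item_var = sum(variance(scores) for scores in item_scores)
    return (k / (k - 1)) * (1.0 - item_var / variance(totals))

def spearman_brown(half_correlation):
    """Full-test reliability predicted from a split-half correlation."""
    return 2.0 * half_correlation / (1.0 + half_correlation)
```

For example, a split-half correlation of .65 projects to roughly .79 for the full scale; whether the .65 figure in the abstract is already so corrected is not stated, so this mapping is illustrative only.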

  10. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2011-01-01

    A methodology to compute probabilistically-combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
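    Fast probability integration is in essence a first-order reliability (FORM) computation. Under the simplifying assumption of a linear limit state g = R - S with independent normal strength R and stress S (a textbook stand-in, not the NASA multifactor model), the failure probability and a finite-difference parameter sensitivity can be sketched as:

```python
import math

def failure_probability(mu_r, sd_r, mu_s, sd_s):
    """First-order failure probability for g = R - S with independent
    normal strength R and stress S: Pf = Phi(-beta)."""
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def central_diff(f, args, index, h=1e-5):
    """Central-difference sensitivity of f to one distribution parameter."""
    up = list(args)
    up[index] += h
    dn = list(args)
    dn[index] -= h
    return (f(*up) - f(*dn)) / (2.0 * h)

# Sensitivity of Pf to mean strength: negative, since stronger material
# means fewer failures.
s_mu_r = central_diff(failure_probability, [3.0, 0.2, 0.0, 1.0], 0)
```

Ranking such derivatives across the primitive random variables is the mechanism behind the "most sensitive to" statements in the abstract.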

  11. Probabilistic Simulation for Combined Cycle Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multifactor interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/- 45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  13. Semiautomatic segmentation and follow-up of multicomponent low-grade tumors in longitudinal brain MRI studies

    PubMed Central

    Weizman, Lior; Sira, Liat Ben; Joskowicz, Leo; Rubin, Daniel L.; Yeom, Kristen W.; Constantini, Shlomi; Shofty, Ben; Bashat, Dafna Ben

    2014-01-01

    Purpose: Tracking the progression of low grade tumors (LGTs) is a challenging task, due to their slow growth rate and associated complex internal tumor components, such as heterogeneous enhancement, hemorrhage, and cysts. In this paper, the authors show a semiautomatic method to reliably track the volume of LGTs and the evolution of their internal components in longitudinal MRI scans. Methods: The authors' method utilizes a spatiotemporal evolution modeling of the tumor and its internal components. Tumor components gray level parameters are estimated from the follow-up scan itself, obviating temporal normalization of gray levels. The tumor delineation procedure effectively incorporates internal classification of the baseline scan in the time-series as prior data to segment and classify a series of follow-up scans. The authors applied their method to 40 MRI scans of ten patients, acquired at two different institutions. Two types of LGTs were included: Optic pathway gliomas and thalamic astrocytomas. For each scan, a “gold standard” was obtained manually by experienced radiologists. The method is evaluated versus the gold standard with three measures: gross total volume error, total surface distance, and reliability of tracking tumor components evolution. Results: Compared to the gold standard the authors' method exhibits a mean Dice similarity volumetric measure of 86.58% and a mean surface distance error of 0.25 mm. In terms of its reliability in tracking the evolution of the internal components, the method exhibits strong positive correlation with the gold standard. Conclusions: The authors' method provides accurate and repeatable delineation of the tumor and its internal components, which is essential for therapy assessment of LGTs. Reliable tracking of internal tumor components over time is novel and potentially will be useful to streamline and improve follow-up of brain tumors, with indolent growth and behavior. PMID:24784396
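    The Dice similarity measure used for the volumetric comparison is straightforward to compute from two binary segmentations. A sketch, with segmentations represented as sets of voxel indices (the voxel sets are illustrative):

```python
def dice(a, b):
    """Dice similarity of two segmentations given as sets of voxel
    indices: 2|A ∩ B| / (|A| + |B|), ranging from 0 to 1."""
    if not a and not b:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * len(a & b) / (len(a) + len(b))
```

A Dice score of 86.58%, as reported, means the overlap is nearly twice the average mask size's worth of shared voxels; a score of 1.0 would indicate identical delineations.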

  14. Geodynamics and temporal variations in the gravity field

    NASA Technical Reports Server (NTRS)

    Mcadoo, D. C.; Wagner, C. A.

    1989-01-01

    Just as the Earth's surface deforms tectonically, so too does the gravity field evolve with time. Now that precise geodesy is yielding observations of these deformations it is important that concomitant, temporal changes in the gravity field be monitored. Although these temporal changes are minute they are observable: changes in the J2 component of the gravity field were inferred from satellite (LAGEOS) tracking data; changes in other components of the gravity field would likely be detected by Geopotential Research Mission (GRM), a proposed but unapproved NASA gravity field mission. Satellite gradiometers were also proposed for high-precision gravity field mapping. Using simple models of geodynamic processes such as viscous postglacial rebound of the solid Earth, great subduction zone earthquakes and seasonal glacial mass fluctuations, we predict temporal changes in gravity gradients at spacecraft altitudes. It was found that these proposed gravity gradient satellite missions should have sensitivities equal to or better than 10(exp -4) E in order to reliably detect these changes. It was also found that satellite altimetry yields little promise of useful detection of time variations in gravity.

  15. Comparative artificial neural network and partial least squares models for analysis of Metronidazole, Diloxanide, Spiramycin and Cliquinol in pharmaceutical preparations.

    PubMed

    Elkhoudary, Mahmoud M; Abdel Salam, Randa A; Hadad, Ghada M

    2014-09-15

    Metronidazole (MNZ) is a widely used antibacterial and amoebicide drug. Therefore, it is important to develop a rapid and specific analytical method for the determination of MNZ in mixture with Spiramycin (SPY), Diloxanide (DIX) and Cliquinol (CLQ) in pharmaceutical preparations. This work describes six simple, sensitive and reliable multivariate calibration methods, namely linear and nonlinear artificial neural networks preceded by genetic algorithm (GA-ANN) and principal component analysis (PCA-ANN) as well as partial least squares (PLS) either alone or preceded by genetic algorithm (GA-PLS) for UV spectrophotometric determination of MNZ, SPY, DIX and CLQ in pharmaceutical preparations with no interference of pharmaceutical additives. The results manifest the problem of nonlinearity and how models like ANN can handle it. Analytical performance of these methods was statistically validated with respect to linearity, accuracy, precision and specificity. The developed methods indicate the ability of the previously mentioned multivariate calibration models to handle and solve UV spectra of the four-component mixtures using an easy and widely used UV spectrophotometer. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Rapid Viral Diagnosis of Orthopoxviruses by Electron Microscopy: Optional or a Must?

    PubMed Central

    Gelderblom, Hans R.; Madeley, Dick

    2018-01-01

    Diagnostic electron microscopy (DEM) was an essential component of viral diagnosis until the development of highly sensitive nucleic acid amplification techniques (NAT). The simple negative staining technique of DEM was applied widely to smallpox diagnosis until the world-wide eradication of the human-specific pathogen in 1980. Since then, the threat of smallpox re-emerging through laboratory escape, molecular manipulation, synthetic biology or bioterrorism has not totally disappeared and would be a major problem in an unvaccinated population. Other animal poxviruses may also emerge as human pathogens. With its rapid results (only a few minutes after arrival of the specimen), no requirement for specific reagents and its “open view”, DEM remains an important component of virus diagnosis, particularly because it can easily and reliably distinguish smallpox virus or any other member of the orthopoxvirus (OPV) genus from parapoxviruses (PPV) and the far more common and less serious herpesviruses (herpes simplex and varicella zoster). Preparation, enrichment, examination, internal standards and suitable organisations are discussed to make clear its continuing value as a diagnostic technique. PMID:29565285

  17. Development of a digital method for neutron/gamma-ray discrimination based on matched filtering

    NASA Astrophysics Data System (ADS)

    Korolczuk, S.; Linczuk, M.; Romaniuk, R.; Zychor, I.

    2016-09-01

    Neutron/gamma-ray discrimination is crucial for measurements with detectors sensitive to both neutron and gamma-ray radiation. Different techniques to discriminate between neutrons and gamma-rays based on pulse shape analysis are widely used in many applications, e.g., homeland security, radiation dosimetry, environmental monitoring, fusion experiments, nuclear spectroscopy. A common requirement is to improve a radiation detection level with a high detection reliability. Modern electronic components, such as high speed analog to digital converters and powerful programmable digital circuits for signal processing, allow us to develop a fully digital measurement system. With this solution it is possible to optimize digital signal processing algorithms without changing any electronic components in an acquisition signal path. We report on results obtained with a digital acquisition system DNG@NCBJ designed at the National Centre for Nuclear Research. A 2'' × 2'' EJ309 liquid scintillator was used to register mixed neutron and gamma-ray radiation from PuBe sources. A dedicated algorithm for pulse shape discrimination, based on real-time filtering, was developed and implemented in hardware.
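    A matched-filter discriminator of the kind described can be sketched by correlating each digitized pulse against unit-energy neutron and gamma templates and keeping the larger response. The template shapes below are invented stand-ins for measured EJ309 responses, exploiting only the qualitative fact that neutron pulses in organic scintillators carry a longer slow tail:

```python
def matched_score(pulse, template):
    """Correlation of a digitized pulse with a unit-energy template."""
    energy = sum(t * t for t in template) ** 0.5
    return sum(p * t for p, t in zip(pulse, template)) / energy

def classify(pulse, neutron_tpl, gamma_tpl):
    """Assign the pulse to the template giving the larger response."""
    if matched_score(pulse, neutron_tpl) > matched_score(pulse, gamma_tpl):
        return "neutron"
    return "gamma"

# Invented template shapes: the neutron pulse decays more slowly.
NEUTRON_TPL = [1.0, 0.6, 0.4, 0.3, 0.2]
GAMMA_TPL = [1.0, 0.5, 0.2, 0.1, 0.05]
```

Because the templates are energy-normalized, the comparison reduces to which template direction the pulse is more parallel to, which is what makes matched filtering robust at low signal-to-noise and well suited to a real-time FPGA implementation like the one reported.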

  18. Simultaneous determination of five components in rat plasma by UPLC-MS/MS and its application to a comparative pharmacokinetic study in Baihe Zhimu Tang and Zhimu extract.

    PubMed

    Li, Guolong; Tang, Zhishu; Yang, Jie; Duan, Jinao; Qian, Dawei; Guo, Jianming; Zhu, Zhenhua; Liu, Hongbo

    2015-04-15

    Baihe Zhimu Tang (BZT) is a famous traditional Chinese medicine recipe to treat dry coughing due to yin deficiency and for moisturizing the lungs. Zhimu is an essential ingredient in BZT used to treat inflammation, fever and diabetes. The most important active components in Zhimu are flavonoids such as neomangiferin, mangiferin, and steroid saponins (e.g., timosaponin BII, anemarsaponin BIII, timosaponin AIII). The aim of this study was to compare the pharmacokinetics of mangiferin, neomangiferin, timosaponin BII, anemarsaponin BIII and timosaponin AIII in rat plasma after oral administration of BZT and Zhimu extract (ZME). A sensitive, reliable and robust LC-MS/MS method to simultaneously determine steroid saponins and flavonoids in rat plasma was successfully validated. Significant differences (p < 0.05) were found in the pharmacokinetic parameters of timosaponin BII, anemarsaponin BIII and timosaponin AIII between BZT and ZME. It was surmised that formula compatibility could significantly influence the pharmacokinetics of BZT, and ours is the first study to examine BZT administration from a pharmacokinetic perspective.

  19. DAKOTA Design Analysis Kit for Optimization and Terascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.

    2010-02-24

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers.A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-basedmore » methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.« less

  20. Personality in sanctuary-housed chimpanzees: A comparative approach of psychobiological and penta-factorial human models.

    PubMed

    Úbeda, Yulán; Llorente, Miquel

    2015-02-18

    We evaluate a sanctuary chimpanzee sample (N = 11) using two adapted human assessment instruments: the Five-Factor Model (FFM) and Eysenck's Psychoticism-Extraversion-Neuroticism (PEN) model. The former has been widely used in studies of animal personality, whereas the latter has never been used to assess chimpanzees. We asked familiar keepers and scientists (N = 28) to rate 38 (FFM) and 12 (PEN) personality items. The personality surveys showed reliability in all of the items for both instruments. These were then analyzed in a principal component analysis and a regularized exploratory factor analysis, which revealed four and three components, respectively. The results indicate that both questionnaires show a clear factor structure, with characteristic factors not just for the species, but also for the sample type. However, due to its brevity, the PEN may be more suitable for assessing personality in a sanctuary, where employees do not have much time to devote to the evaluation process. In summary, both models are sensitive enough to evaluate the personality of a group of chimpanzees housed in a sanctuary.
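    The leading direction of a principal component analysis like the one applied to these ratings can be extracted by power iteration on the sample covariance matrix. A self-contained sketch (toy two-trait data, not the chimpanzee ratings):

```python
import random

def first_pc(rows, iters=200, seed=0):
    """Leading principal component via power iteration on the
    sample covariance matrix of row-wise observations."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    # Sample covariance matrix of the centered data.
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    rng = random.Random(seed)
    v = [rng.random() + 0.1 for _ in range(d)]  # positive random start
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v
```

Repeating this after deflating each found component yields the multi-component solutions reported above; in practice a library routine would be used, but the mechanism is the same.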
