Chapter 15: Reliability of Wind Turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheng, Shuangwen; O'Connor, Ryan
The global wind industry has witnessed exciting developments in recent years. The future will be even brighter with further reductions in capital and operation and maintenance costs, which can be accomplished with improved turbine reliability, especially when turbines are installed offshore. One opportunity for the industry to improve wind turbine reliability is through the exploration of reliability engineering life data analysis based on readily available data or maintenance records collected at typical wind plants. If adopted and conducted appropriately, these analyses can quickly save operation and maintenance costs in a potentially impactful manner. This chapter discusses wind turbine reliability bymore » highlighting the methodology of reliability engineering life data analysis. It first briefly discusses fundamentals for wind turbine reliability and the current industry status. Then, the reliability engineering method for life analysis, including data collection, model development, and forecasting, is presented in detail and illustrated through two case studies. The chapter concludes with some remarks on potential opportunities to improve wind turbine reliability. An owner and operator's perspective is taken and mechanical components are used to exemplify the potential benefits of reliability engineering analysis to improve wind turbine reliability and availability.« less
Towards cost-effective reliability through visualization of the reliability option space
NASA Technical Reports Server (NTRS)
Feather, Martin S.
2004-01-01
In planning a complex system's development there can be many options to improve its reliability. Typically their sum total cost exceeds the budget available, so it is necessary to select judiciously from among them. Reliability models can be employed to calculate the cost and reliability implications of a candidate selection.
Reliability assessment and improvement for a fast corrector power supply in TPS
NASA Astrophysics Data System (ADS)
Liu, Kuo-Bin; Liu, Chen-Yao; Wang, Bao-Sheng; Wong, Yong Seng
2018-07-01
Fast Orbit Feedback System (FOFB) can be installed in a synchrotron light source to eliminate undesired disturbances and to improve the stability of beam orbit. The design and implementation of an accurate and reliable Fast Corrector Power Supply (FCPS) is essential to realize the effectiveness and availability of the FOFB. A reliability assessment for the FCPSs in the FOFB of Taiwan Photon Source (TPS) considering MOSFETs' temperatures is represented in this paper. The FCPS is composed of a full-bridge topology and a low-pass filter. A Hybrid Pulse Width Modulation (HPWM) requiring two MOSFETs in the full-bridge circuit to be operated at high frequency and the other two be operated at the output frequency is adopted to control the implemented FCPS. Due the characteristic of HPWM, the conduction loss and switching loss of each MOSFET in the FCPS is not same. Two of the MOSFETs in the full-bridge circuit will suffer higher temperatures and therefore the circuit reliability of FCPS is reduced. A Modified PWM Scheme (MPWMS) designed to average MOSFETs' temperatures and to improve circuit reliability is proposed in this paper. Experimental results measure the MOSFETs' temperatures of FCPS controlled by the HPWM and the proposed MPWMS. The reliability indices under different PWM controls are then assessed. From the experimental results, it can be observed that the reliability of FCPS using the proposed MPWMS can be improved because the MOSFETs' temperatures are closer. Since the reliability of FCPS can be enhanced, the availability of FOFB can also be improved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Geoffrey T.; Hill, Roger; Walker, Andy
The use of the term 'availability' to describe a photovoltaic (PV) system and power plant has been fraught with confusion for many years. A term that is meant to describe equipment operational status is often omitted, misapplied or inaccurately combined with PV performance metrics due to attempts to measure performance and reliability through the lens of traditional power plant language. This paper discusses three areas where current research in standards, contract language and performance modeling is improving the way availability is used with regards to photovoltaic systems and power plants.
Reliability culture at La Silla Paranal Observatory
NASA Astrophysics Data System (ADS)
Gonzalez, Sergio
2010-07-01
The Maintenance Department at the La Silla - Paranal Observatory has been an important base to keep the operations of the observatory at a good level of reliability and availability. Several strategies have been implemented and improved in order to cover these requirements and keep the system and equipment working properly when it is required. For that reason, one of the latest improvements has been the introduction of the concept of reliability, which implies that we don't simply speak about reliability concepts. It involves much more than that. It involves the use of technologies, data collecting, data analysis, decision making, committees concentrated in analysis of failure modes and how they can be eliminated, aligning the results with the requirements of our internal partners and establishing steps to achieve success. Some of these steps have already been implemented: data collection, use of technologies, analysis of data, development of priority tools, committees dedicated to analyze data and people dedicated to reliability analysis. This has permitted us to optimize our process, analyze where we can improve, avoid functional failures, reduce the failures range in several systems and subsystems; all this has had a positive impact in terms of results for our Observatory. All these tools are part of the reliability culture that allows our system to operate with a high level of reliability and availability.
Using Facility Condition Assessments to Identify Actions Related to Infrastructure
NASA Technical Reports Server (NTRS)
Rubert, Kennedy F.
2010-01-01
To support cost effective, quality research it is essential that laboratory and testing facilities are maintained in a continuous and reliable state of availability at all times. NASA Langley Research Center (LaRC) and its maintenance contractor, Jacobs Technology, Inc. Research Operations, Maintenance, and Engineering (ROME) group, are in the process of implementing a combined Facility Condition Assessment (FCA) and Reliability Centered Maintenance (RCM) program to improve asset management and overall reliability of testing equipment in facilities such as wind tunnels. Specific areas are being identified for improvement, the deferred maintenance cost is being estimated, and priority is being assigned against facilities where conditions have been allowed to deteriorate. This assessment serves to assist in determining where to commit available funds on the Center. RCM methodologies are being reviewed and enhanced to assure that appropriate preventive, predictive, and facilities/equipment acceptance techniques are incorporated to prolong lifecycle availability and assure reliability at minimum cost. The results from the program have been favorable, better enabling LaRC to manage assets prudently.
A study on reliability of power customer in distribution network
NASA Astrophysics Data System (ADS)
Liu, Liyuan; Ouyang, Sen; Chen, Danling; Ma, Shaohua; Wang, Xin
2017-05-01
The existing power supply reliability index system is oriented to power system without considering actual electricity availability in customer side. In addition, it is unable to reflect outage or customer’s equipment shutdown caused by instantaneous interruption and power quality problem. This paper thus makes a systematic study on reliability of power customer. By comparing with power supply reliability, reliability of power customer is defined and extracted its evaluation requirements. An indexes system, consisting of seven customer indexes and two contrast indexes, are designed to describe reliability of power customer from continuity and availability. In order to comprehensively and quantitatively evaluate reliability of power customer in distribution networks, reliability evaluation method is proposed based on improved entropy method and the punishment weighting principle. Practical application has proved that reliability index system and evaluation method for power customer is reasonable and effective.
Review and critical analysis: Rolling-element bearings for system life and reliability
NASA Technical Reports Server (NTRS)
Irwin, A. S.; Anderson, W. J.; Derner, W. J.
1985-01-01
A ball and cylindrical roller bearing technical specification which incorporates the latest state-of-the-art advancements was prepared for the purpose of improving bearing reliability in U.S. Army aircraft. The current U.S. Army aviation bearing designs and applications, including life analyses, were analyzed. A bearing restoration and refurbishment specification was prepared to improve bearing availability.
NASA Astrophysics Data System (ADS)
Launch vehicle propulsion system reliability considerations during the design and verification processes are discussed. The tools available for predicting and minimizing anomalies or failure modes are described and objectives for validating advanced launch system propulsion reliability are listed. Methods for ensuring vehicle/propulsion system interface reliability are examined and improvements in the propulsion system development process are suggested to improve reliability in launch operations. Also, possible approaches to streamline the specification and procurement process are given. It is suggested that government and industry should define reliability program requirements and manage production and operations activities in a manner that provides control over reliability drivers. Also, it is recommended that sufficient funds should be invested in design, development, test, and evaluation processes to ensure that reliability is not inappropriately subordinated to other management considerations.
Optimizing preventive maintenance policy: A data-driven application for a light rail braking system.
Corman, Francesco; Kraijema, Sander; Godjevac, Milinko; Lodewijks, Gabriel
2017-10-01
This article presents a case study determining the optimal preventive maintenance policy for a light rail rolling stock system in terms of reliability, availability, and maintenance costs. The maintenance policy defines one of the three predefined preventive maintenance actions at fixed time-based intervals for each of the subsystems of the braking system. Based on work, maintenance, and failure data, we model the reliability degradation of the system and its subsystems under the current maintenance policy by a Weibull distribution. We then analytically determine the relation between reliability, availability, and maintenance costs. We validate the model against recorded reliability and availability and get further insights by a dedicated sensitivity analysis. The model is then used in a sequential optimization framework determining preventive maintenance intervals to improve on the key performance indicators. We show the potential of data-driven modelling to determine optimal maintenance policy: same system availability and reliability can be achieved with 30% maintenance cost reduction, by prolonging the intervals and re-grouping maintenance actions.
Optimizing preventive maintenance policy: A data-driven application for a light rail braking system
Corman, Francesco; Kraijema, Sander; Godjevac, Milinko; Lodewijks, Gabriel
2017-01-01
This article presents a case study determining the optimal preventive maintenance policy for a light rail rolling stock system in terms of reliability, availability, and maintenance costs. The maintenance policy defines one of the three predefined preventive maintenance actions at fixed time-based intervals for each of the subsystems of the braking system. Based on work, maintenance, and failure data, we model the reliability degradation of the system and its subsystems under the current maintenance policy by a Weibull distribution. We then analytically determine the relation between reliability, availability, and maintenance costs. We validate the model against recorded reliability and availability and get further insights by a dedicated sensitivity analysis. The model is then used in a sequential optimization framework determining preventive maintenance intervals to improve on the key performance indicators. We show the potential of data-driven modelling to determine optimal maintenance policy: same system availability and reliability can be achieved with 30% maintenance cost reduction, by prolonging the intervals and re-grouping maintenance actions. PMID:29278245
Reliability and availability analysis of a 10 kW@20 K helium refrigerator
NASA Astrophysics Data System (ADS)
Li, J.; Xiong, L. Y.; Liu, L. Q.; Wang, H. R.; Wang, B. M.
2017-02-01
A 10 kW@20 K helium refrigerator has been established in the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. To evaluate and improve this refrigerator’s reliability and availability, a reliability and availability analysis is performed. According to the mission profile of this refrigerator, a functional analysis is performed. The failure data of the refrigerator components are collected and failure rate distributions are fitted by software Weibull++ V10.0. A Failure Modes, Effects & Criticality Analysis (FMECA) is performed and the critical components with higher risks are pointed out. Software BlockSim V9.0 is used to calculate the reliability and the availability of this refrigerator. The result indicates that compressors, turbine and vacuum pump are the critical components and the key units of this refrigerator. The mitigation actions with respect to design, testing, maintenance and operation are proposed to decrease those major and medium risks.
Izumi, Betty T; Findholt, Nancy E; Pickus, Hayley A; Nguyen, Thuan; Cuneo, Monica K
2014-06-01
Food stores have gained attention as potential intervention targets for improving children's eating habits. There is a need for valid and reliable instruments to evaluate changes in food store snack and beverage availability secondary to intervention. The aim of this study was to develop a valid, reliable, and resource-efficient instrument to evaluate the healthfulness of food store environments faced by children. The SNACZ food store checklist was developed to assess availability of healthier alternatives to the energy-dense snacks and beverages commonly consumed by children. After pretesting, two trained observers independently assessed the availability of 48 snack and beverage items in 50 food stores located near elementary and middle schools in Portland, Oregon, over a 2-week period in summer 2012. Inter-rater reliability was calculated using the kappa statistic. Overall, the instrument had mostly high inter-rater reliability. Seventy-three percent of items assessed had almost perfect or substantial reliability. Two items had moderate reliability (0.41-0.60), and no items had a reliability score less than 0.41. Eleven items occurred too infrequently to generate a kappa score. The SNACZ food store checklist is a first-step toward developing a valid and reliable tool to evaluate the healthfulness of food store environments faced by children. The tool can be used to compare availability of healthier snack and beverage alternatives across communities and measure change secondary to intervention. As a wider variety of healthier snack and beverage alternatives become available in food stores, the checklist should be updated.
Constraining uncertainties in water supply reliability in a tropical data scarce basin
NASA Astrophysics Data System (ADS)
Kaune, Alexander; Werner, Micha; Rodriguez, Erasmo; de Fraiture, Charlotte
2015-04-01
Assessing the water supply reliability in river basins is essential for adequate planning and development of irrigated agriculture and urban water systems. In many cases hydrological models are applied to determine the surface water availability in river basins. However, surface water availability and variability is often not appropriately quantified due to epistemic uncertainties, leading to water supply insecurity. The objective of this research is to determine the water supply reliability in order to support planning and development of irrigated agriculture in a tropical, data scarce environment. The approach proposed uses a simple hydrological model, but explicitly includes model parameter uncertainty. A transboundary river basin in the tropical region of Colombia and Venezuela with an approximately area of 2100 km² was selected as a case study. The Budyko hydrological framework was extended to consider climatological input variability and model parameter uncertainty, and through this the surface water reliability to satisfy the irrigation and urban demand was estimated. This provides a spatial estimate of the water supply reliability across the basin. For the middle basin the reliability was found to be less than 30% for most of the months when the water is extracted from an upstream source. Conversely, the monthly water supply reliability was high (r>98%) in the lower basin irrigation areas when water was withdrawn from a source located further downstream. Including model parameter uncertainty provides a complete estimate of the water supply reliability, but that estimate is influenced by the uncertainty in the model. Reducing the uncertainty in the model through improved data and perhaps improved model structure will improve the estimate of the water supply reliability allowing better planning of irrigated agriculture and dependable water allocation decisions.
How to: identify non-tuberculous Mycobacterium species using MALDI-TOF mass spectrometry.
Alcaide, F; Amlerová, J; Bou, G; Ceyssens, P J; Coll, P; Corcoran, D; Fangous, M-S; González-Álvarez, I; Gorton, R; Greub, G; Hery-Arnaud, G; Hrábak, J; Ingebretsen, A; Lucey, B; Marekoviċ, I; Mediavilla-Gradolph, C; Monté, M R; O'Connor, J; O'Mahony, J; Opota, O; O'Reilly, B; Orth-Höller, D; Oviaño, M; Palacios, J J; Palop, B; Pranada, A B; Quiroga, L; Rodríguez-Temporal, D; Ruiz-Serrano, M J; Tudó, G; Van den Bossche, A; van Ingen, J; Rodriguez-Sanchez, B
2018-06-01
The implementation of MALDI-TOF MS for microorganism identification has changed the routine of the microbiology laboratories as we knew it. Most microorganisms can now be reliably identified within minutes using this inexpensive, user-friendly methodology. However, its application in the identification of mycobacteria isolates has been hampered by the structure of their cell wall. Improvements in the sample processing method and in the available database have proved key factors for the rapid and reliable identification of non-tuberculous mycobacteria isolates using MALDI-TOF MS. The main objective is to provide information about the proceedings for the identification of non-tuberculous isolates using MALDI-TOF MS and to review different sample processing methods, available databases, and the interpretation of the results. Results from relevant studies on the use of the available MALDI-TOF MS instruments, the implementation of innovative sample processing methods, or the implementation of improved databases are discussed. Insight about the methodology required for reliable identification of non-tuberculous mycobacteria and its implementation in the microbiology laboratory routine is provided. Microbiology laboratories where MALDI-TOF MS is available can benefit from its capacity to identify most clinically interesting non-tuberculous mycobacteria in a rapid, reliable, and inexpensive manner. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Dhital, Anup; Bancroft, Jared B; Lachapelle, Gérard
2013-11-07
In natural and urban canyon environments, Global Navigation Satellite System (GNSS) signals suffer from various challenges such as signal multipath, limited or lack of signal availability and poor geometry. Inertial sensors are often employed to improve the solution continuity under poor GNSS signal quality and availability conditions. Various fault detection schemes have been proposed in the literature to detect and remove biased GNSS measurements to obtain a more reliable navigation solution. However, many of these methods are found to be sub-optimal and often lead to unavailability of reliability measures, mostly because of the improper characterization of the measurement errors. A robust filtering architecture is thus proposed which assumes a heavy-tailed distribution for the measurement errors. Moreover, the proposed filter is capable of adapting to the changing GNSS signal conditions such as when moving from open sky conditions to deep canyons. Results obtained by processing data collected in various GNSS challenged environments show that the proposed scheme provides a robust navigation solution without having to excessively reject usable measurements. The tests reported herein show improvements of nearly 15% and 80% for position accuracy and reliability, respectively, when applying the above approach.
Dhital, Anup; Bancroft, Jared B.; Lachapelle, Gérard
2013-01-01
In natural and urban canyon environments, Global Navigation Satellite System (GNSS) signals suffer from various challenges such as signal multipath, limited or lack of signal availability and poor geometry. Inertial sensors are often employed to improve the solution continuity under poor GNSS signal quality and availability conditions. Various fault detection schemes have been proposed in the literature to detect and remove biased GNSS measurements to obtain a more reliable navigation solution. However, many of these methods are found to be sub-optimal and often lead to unavailability of reliability measures, mostly because of the improper characterization of the measurement errors. A robust filtering architecture is thus proposed which assumes a heavy-tailed distribution for the measurement errors. Moreover, the proposed filter is capable of adapting to the changing GNSS signal conditions such as when moving from open sky conditions to deep canyons. Results obtained by processing data collected in various GNSS challenged environments show that the proposed scheme provides a robust navigation solution without having to excessively reject usable measurements. The tests reported herein show improvements of nearly 15% and 80% for position accuracy and reliability, respectively, when applying the above approach. PMID:24212120
NASA Astrophysics Data System (ADS)
Sembiring, N.; Ginting, E.; Darnello, T.
2017-12-01
Problems that appear in a company that produces refined sugar, the production floor has not reached the level of critical machine availability because it often suffered damage (breakdown). This results in a sudden loss of production time and production opportunities. This problem can be solved by Reliability Engineering method where the statistical approach to historical damage data is performed to see the pattern of the distribution. The method can provide a value of reliability, rate of damage, and availability level, of an machine during the maintenance time interval schedule. The result of distribution test to time inter-damage data (MTTF) flexible hose component is lognormal distribution while component of teflon cone lifthing is weibull distribution. While from distribution test to mean time of improvement (MTTR) flexible hose component is exponential distribution while component of teflon cone lifthing is weibull distribution. The actual results of the flexible hose component on the replacement schedule per 720 hours obtained reliability of 0.2451 and availability 0.9960. While on the critical components of teflon cone lifthing actual on the replacement schedule per 1944 hours obtained reliability of 0.4083 and availability 0.9927.
High-reliability gas-turbine combined-cycle development program: Phase II. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hecht, K.G.; Sanderson, R.A.; Smith, M.J.
This three-volume report presents the results of Phase II of the multiphase EPRI-sponsored High-Reliability Gas Turbine Combined-Cycle Development Program whose goal is to achieve a highly reliable gas turbine combined-cycle power plant, available by the mid-1980s, which would be an economically attractive baseload generation alternative for the electric utility industry. The Phase II program objective was to prepare the preliminary design of this power plant. This volume presents information of the reliability, availability, and maintainability (RAM) analysis of a representative plant and the preliminary design of the gas turbine, the gas turbine ancillaries, and the balance of plant including themore » steam turbine generator. To achieve the program goals, a gas turbine was incorporated which combined proven reliability characteristics with improved performance features. This gas turbine, designated the V84.3, is the result of a cooperative effort between Kraftwerk Union AG and United Technologies Corporation. Gas turbines of similar design operating in Europe under baseload conditions have demonstrated mean time between failures in excess of 40,000 hours. The reliability characteristics of the gas turbine ancillaries and balance-of-plant equipment were improved through system simplification and component redundancy and by selection of component with inherent high reliability. A digital control system was included with logic, communications, sensor redundancy, and mandual backup. An independent condition monitoring and diagnostic system was also included. Program results provide the preliminary design of a gas turbine combined-cycle baseload power plant. This power plant has a predicted mean time between failure of nearly twice the 3000-hour EPRI goal. The cost of added reliability features is offset by improved performance, which results in a comparable specific cost and an 8% lower cost of electricity compared to present market offerings.« less
Reliability and Maintainability Analysis of a High Air Pressure Compressor Facility
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Ring, Robert W.; Cole, Stuart K.
2013-01-01
This paper discusses a Reliability, Availability, and Maintainability (RAM) independent assessment conducted to support the refurbishment of the Compressor Station at the NASA Langley Research Center (LaRC). The paper discusses the methodologies used by the assessment team to derive the repair by replacement (RR) strategies to improve the reliability and availability of the Compressor Station (Ref.1). This includes a RAPTOR simulation model that was used to generate the statistical data analysis needed to derive a 15-year investment plan to support the refurbishment of the facility. To summarize, study results clearly indicate that the air compressors are well past their design life. The major failures of Compressors indicate that significant latent failure causes are present. Given the occurrence of these high-cost failures following compressor overhauls, future major failures should be anticipated if compressors are not replaced. Given the results from the RR analysis, the study team recommended a compressor replacement strategy. Based on the data analysis, the RR strategy will lead to sustainable operations through significant improvements in reliability, availability, and the probability of meeting the air demand with acceptable investment cost that should translate, in the long run, into major cost savings. For example, the probability of meeting air demand improved from 79.7 percent for the Base Case to 97.3 percent. Expressed in terms of a reduction in the probability of failing to meet demand (1 in 5 days to 1 in 37 days), the improvement is about 700 percent. Similarly, compressor replacement improved the operational availability of the facility from 97.5 percent to 99.8 percent. Expressed in terms of a reduction in system unavailability (1 in 40 to 1 in 500), the improvement is better than 1000 percent (an order of magnitude improvement). It is worthy to note that the methodologies, tools, and techniques used in the LaRC study can be used to evaluate similar high value equipment components and facilities. Also, lessons learned in data collection and maintenance practices derived from the observations, findings, and recommendations of the study are extremely important in the evaluation and sustainment of new compressor facilities.
Module Degradation Mechanisms Studied by a Multi-Scale Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, Steve; Al-Jassim, Mowafak; Hacke, Peter
2016-11-21
A key pathway to meeting the Department of Energy SunShot 2020 goals is to reduce financing costs by improving investor confidence through improved photovoltaic (PV) module reliability. A comprehensive approach to further understand and improve PV reliability includes characterization techniques and modeling from module to atomic scale. Imaging techniques, which include photoluminescence, electroluminescence, and lock-in thermography, are used to locate localized defects responsible for module degradation. Small area samples containing such defects are prepared using coring techniques and are then suitable and available for microscopic study and specific defect modeling and analysis.
Comparing the reliability of related populations with the probability of agreement
Stevens, Nathaniel T.; Anderson-Cook, Christine M.
2016-07-26
Combining information from different populations to improve precision, simplify future predictions, or improve underlying understanding of relationships can be advantageous when considering the reliability of several related sets of systems. Using the probability of agreement to help quantify the similarities of populations can help to give a realistic assessment of whether the systems have reliability that are sufficiently similar for practical purposes to be treated as a homogeneous population. In addition, the new method is described and illustrated with an example involving two generations of a complex system where the reliability is modeled using either a logistic or probit regressionmore » model. Note that supplementary materials including code, datasets, and added discussion are available online.« less
Comparing the reliability of related populations with the probability of agreement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Nathaniel T.; Anderson-Cook, Christine M.
Combining information from different populations to improve precision, simplify future predictions, or improve underlying understanding of relationships can be advantageous when considering the reliability of several related sets of systems. Using the probability of agreement to help quantify the similarities of populations can help to give a realistic assessment of whether the systems have reliability that are sufficiently similar for practical purposes to be treated as a homogeneous population. In addition, the new method is described and illustrated with an example involving two generations of a complex system where the reliability is modeled using either a logistic or probit regressionmore » model. Note that supplementary materials including code, datasets, and added discussion are available online.« less
Cooley, Richard L.
1982-01-01
Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.
UV-fibers: two decades of improvements for new applications
NASA Astrophysics Data System (ADS)
Klein, Karl-Friedrich; Khalilov, Valery K.
2015-03-01
Multimode UV-fibers with high-OH synthetic silica core and F-doped silica cladding have been available for over 40 years. At the beginning, the spectral UV-range above 250 nm wavelength was commonly used, because the generation of UV-absorbing defect centers prevented reliable light transfer below 250 nm; even light from a low-power broadband deuterium-lamp was sufficient to damage these UV-fibers of the 1st generation. However, even then, applications in the field of spectroscopy, laser light delivery, sensors and process control were discussed and improvements of fiber quality in this very interesting UVC range required by researchers and industrial end-users. Starting in 1993 with hydrogen-loaded fibers, further modification in preform and fiber manufacturing including additional fiber treatments lead to currently available hydrogen-free UV-fiber (4th generation) with significantly improved stability in the UVC, enabling routine use of optical fibers in this field. In addition to the UV-fiber improvements, some selected UV fiber-optic applications using broadband deuterium-lamps will be discussed. Finally, there is still room for further improvements, especially in combination with newly available pulsed UV light sources, which are low-cost, small sized and highly reliable.
One-year test-retest reliability of intrinsic connectivity network fMRI in older adults
Guo, Cong C.; Kurth, Florian; Zhou, Juan; Mayer, Emeran A.; Eickhoff, Simon B; Kramer, Joel H.; Seeley, William W.
2014-01-01
“Resting-state” or task-free fMRI can assess intrinsic connectivity network (ICN) integrity in health and disease, suggesting a potential for use of these methods as disease-monitoring biomarkers. Numerous analytical options are available, including model-driven ROI-based correlation analysis and model-free, independent component analysis (ICA). High test-retest reliability will be a necessary feature of a successful ICN biomarker, yet available reliability data remains limited. Here, we examined ICN fMRI test-retest reliability in 24 healthy older subjects scanned roughly one year apart. We focused on the salience network, a disease-relevant ICN not previously subjected to reliability analysis. Most ICN analytical methods proved reliable (intraclass coefficients > 0.4) and could be further improved by wavelet analysis. Seed-based ROI correlation analysis showed high map-wise reliability, whereas graph theoretical measures and temporal concatenation group ICA produced the most reliable individual unit-wise outcomes. Including global signal regression in ROI-based correlation analyses reduced reliability. Our study provides a direct comparison between the most commonly used ICN fMRI methods and potential guidelines for measuring intrinsic connectivity in aging control and patient populations over time. PMID:22446491
The Reliability and Validity of Self- and Investigator Ratings of ADHD in Adults
ERIC Educational Resources Information Center
Adler, Lenard A.; Faraone, Stephen V.; Spencer, Thomas J.; Michelson, David; Reimherr, Frederick W.; Glatt, Stephen J.; Marchant, Barrie K.; Biederman, Joseph
2008-01-01
Objective: Little information is available comparing self- versus investigator ratings of symptoms in adult ADHD. The authors compared the reliability, validity, and utility in a sample of adults with ADHD and also as an index of clinical improvement during treatment of self- and investigator ratings of ADHD symptoms via the Conners Adult ADHD…
Improving the reliability of inverter-based welding machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiedermayer, M.
1997-02-01
Although inverter-based welding power sources have been available since the late 1980s, many people hesitated to purchase them because of reliability issues. Unfortunately, their hesitancy had a basis, until now. Recent improvements give some inverters a reliability level that approaches that of traditional, transformer-based industrial welding machines, which have a failure rate of about 1%. Acceptance of inverter-based welding machines is important because, for many welding applications, they provide capabilities that solid-state, transformer-based machines cannot deliver. These advantages include enhanced pulsed gas metal arc welding (GMAW-P), lightweight portability, an ultrastable arc, and energy efficiency--all while producing highly aesthetic weld beadsmore » and delivering multiprocess capabilities.« less
Technology Overview for Advanced Aircraft Armament System Program.
1981-05-01
availability of methods or systems for improving stores and armament safety. Of particular importance are aspects of safety involving hazards analysis ...flutter virtually insensitive to inertia and center-of- gravity location of store - Simplifies and reduces analysis and testing required to flutter- clear...status. Nearly every existing reliability analysis and discipline that prom- ised a positive return on reliability performance was drawn out, dusted
Redundancy management of inertial systems.
NASA Technical Reports Server (NTRS)
Mckern, R. A.; Musoff, H.
1973-01-01
The paper reviews developments in failure detection and isolation techniques applicable to gimballed and strapdown systems. It examines basic redundancy management goals of improved reliability, performance and logistic costs, and explores mechanizations available for both input and output data handling. The meaning of redundant system reliability in terms of available coverage, system MTBF, and mission time is presented and the practical hardware performance limitations of failure detection and isolation techniques are explored. Simulation results are presented illustrating implementation coverages attainable considering IMU performance models and mission detection threshold requirements. The implications of a complete GN&C redundancy management method on inertial techniques are also explored.
Field reliability of competency and sanity opinions: A systematic review and meta-analysis.
Guarnera, Lucy A; Murrie, Daniel C
2017-06-01
We know surprisingly little about the interrater reliability of forensic psychological opinions, even though courts and other authorities have long called for known error rates for scientific procedures admitted as courtroom testimony. This is particularly true for opinions produced during routine practice in the field, even for some of the most common types of forensic evaluations-evaluations of adjudicative competency and legal sanity. To address this gap, we used meta-analytic procedures and study space methodology to systematically review studies that examined the interrater reliability-particularly the field reliability-of competency and sanity opinions. Of 59 identified studies, 9 addressed the field reliability of competency opinions and 8 addressed the field reliability of sanity opinions. These studies presented a wide range of reliability estimates; pairwise percentage agreements ranged from 57% to 100% and kappas ranged from .28 to 1.0. Meta-analytic combinations of reliability estimates obtained by independent evaluators returned estimates of κ = .49 (95% CI: .40-.58) for competency opinions and κ = .41 (95% CI: .29-.53) for sanity opinions. This wide range of reliability estimates underscores the extent to which different evaluation contexts tend to produce different reliability rates. Unfortunately, our study space analysis illustrates that available field reliability studies typically provide little information about contextual variables crucial to understanding their findings. Given these concerns, we offer suggestions for improving research on the field reliability of competency and sanity opinions, as well as suggestions for improving reliability rates themselves. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Divan, Deepak; Brumsickle, William; Eto, Joseph
2003-04-01
This report describes a new approach for collecting information on power quality and reliability and making it available in the public domain. Making this information readily available in a form that is meaningful to electricity consumers is necessary for enabling more informed private and public decisions regarding electricity reliability. The system dramatically reduces the cost (and expertise) needed for customers to obtain information on the most significant power quality events, called voltage sags and interruptions. The system also offers widespread access to information on power quality collected from multiple sites and the potential for capturing information on the impacts ofmore » power quality problems, together enabling a wide variety of analysis and benchmarking to improve system reliability. Six case studies demonstrate selected functionality and capabilities of the system, including: Linking measured power quality events to process interruption and downtime; Demonstrating the ability to correlate events recorded by multiple monitors to narrow and confirm the causes of power quality events; and Benchmarking power quality and reliability on a firm and regional basis.« less
MEASUREMENT: ACCOUNTING FOR RELIABILITY IN PERFORMANCE ESTIMATES.
Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne
2014-01-01
When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. A physician leader who is interested in catalyzing performance improvement through focused, effective physician performance improvement is well advised to consider the value of incorporating reliability adjustments into their performance measurement system.
A Best Practice for Developing Availability Guarantee Language in Photovoltaic (PV) O&M Agreements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Geoffrey Taylor; Balfour, John
This document outlines the foundation for developing language that can be utilized in an Equipment Availability Guarantee, typically included in an O&M services agreement between a PV system or plant owner and an O&M services provider, or operator. Many of the current PV O&M service agreement Availability Guarantees are based on contracts used for traditional power generation, which create challenges for owners and operators due to the variable nature of grid-tied photovoltaic generating technologies. This report documents language used in early PV availability guarantees and presents best practices and equations that can be used to more openly communicate how themore » reliability of the PV system and plant equipment can be expressed in an availability guarantee. This work will improve the bankability of PV systems by providing greater transparency into the equipment reliability state to all parties involved in an O&M services contract.« less
Remote Energy Monitoring System via Cellular Network
NASA Astrophysics Data System (ADS)
Yunoki, Shoji; Tamaki, Satoshi; Takada, May; Iwaki, Takashi
Recently, improvement on power saving and cost efficiency by monitoring the operation status of various facilities over the network has gained attention. Wireless network, especially cellular network, has advantage in mobility, coverage, and scalability. On the other hand, it has disadvantage of low reliability, due to rapid changes in the available bandwidth. We propose a transmission control scheme based on data priority and instantaneous available bandwidth to realize a highly reliable remote monitoring system via cellular network. We have developed our proposed monitoring system and evaluated the effectiveness of our scheme, and proved it reduces the maximum transmission delay of sensor status to 1/10 compared to best effort transmission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, Roger R.; Klise, Geoffrey Taylor; Balfour, John R.
Characterizing the factors that affect reliability of a photovoltaic (PV) power plant is an important aspect of optimal asset management. This document describes the many factors that affect operation and maintenance (O&M) of a PV plant, identifies the data necessary to quantify those factors, and describes how data might be used by O&M service providers and others in the PV industry. This document lays out data needs from perspectives of reliability, availability, and key performance indicators and is intended to be a precursor for standardizing terminology and data reporting, which will improve data sharing, analysis, and ultimately PV plant performance.
Availability Improvement of Layer 2 Seamless Networks Using OpenFlow
Molina, Elias; Jacob, Eduardo; Matias, Jon; Moreira, Naiara; Astarloa, Armando
2015-01-01
The network robustness and reliability are strongly influenced by the implementation of redundancy and its ability of reacting to changes. In situations where packet loss or maximum latency requirements are critical, replication of resources and information may become the optimal technique. To this end, the IEC 62439-3 Parallel Redundancy Protocol (PRP) provides seamless recovery in layer 2 networks by delegating the redundancy management to the end-nodes. In this paper, we present a combination of the Software-Defined Networking (SDN) approach and PRP topologies to establish a higher level of redundancy and thereby, through several active paths provisioned via the OpenFlow protocol, the global reliability is increased, as well as data flows are managed efficiently. Hence, the experiments with multiple failure scenarios, which have been run over the Mininet network emulator, show the improvement in the availability and responsiveness over other traditional technologies based on a single active path. PMID:25759861
Availability improvement of layer 2 seamless networks using OpenFlow.
Molina, Elias; Jacob, Eduardo; Matias, Jon; Moreira, Naiara; Astarloa, Armando
2015-01-01
The network robustness and reliability are strongly influenced by the implementation of redundancy and its ability of reacting to changes. In situations where packet loss or maximum latency requirements are critical, replication of resources and information may become the optimal technique. To this end, the IEC 62439-3 Parallel Redundancy Protocol (PRP) provides seamless recovery in layer 2 networks by delegating the redundancy management to the end-nodes. In this paper, we present a combination of the Software-Defined Networking (SDN) approach and PRP topologies to establish a higher level of redundancy and thereby, through several active paths provisioned via the OpenFlow protocol, the global reliability is increased, as well as data flows are managed efficiently. Hence, the experiments with multiple failure scenarios, which have been run over the Mininet network emulator, show the improvement in the availability and responsiveness over other traditional technologies based on a single active path.
ERIC Educational Resources Information Center
Coalition for Student Loan Reform, Washington, DC.
This publication presents a set of eight recommended reforms and improvements for delivering financial aid to postsecondary students especially the Federal Family Education Loan Program (FFELP). The recommendations are: (1) make applying for student aid simpler for students; (2) assure the continued availability of a dependable, reliable source of…
Methodology and estimation of the welfare impact of energy reforms on households in Azerbaijan
NASA Astrophysics Data System (ADS)
Klytchnikova, Irina
This dissertation develops a new approach that enables policy-makers to analyze welfare gains from improvements in the quality of infrastructure services in developing countries where data are limited and supply is subject to interruptions. An application of the proposed model in the former Soviet Republic of Azerbaijan demonstrates how this approach can be used in welfare assessment of energy sector reforms. The planned reforms in Azerbaijan include a set of measures that will result in a significant improvement in supply reliability, accompanied by a significant increase in the prices of energy services so that they reach the cost recovery level. Currently, households in rural areas receive electricity and gas for only a few hours a day because of a severe deterioration of the energy infrastructure following the collapse of the Soviet Union. The reforms that have recently been initiated will have far-reaching poverty and distributional consequences for the country as they result in an improvement in supply reliability and an increase in energy prices. The new model of intermittent supply developed in this dissertation is based on the household production function approach and draws on previous research in the energy reliability literature. Since modern energy sources (network gas and electricity) in Azerbaijan are cleaner and cheaper than the traditional fuels (fuel wood, etc.), households choose modern fuels whenever they are available. During outages, they rely on traditional fuels. Theoretical welfare measures are derived from a system of fuel demands that takes into account the intermittent availability of energy sources. The model is estimated with the data from the Azerbaijan Household Energy Survey, implemented by the World Bank in December 2003/January 2004. This survey includes an innovative contingent behavior module in which the respondents were asked about their energy consumption patterns in specified reform scenarios. Estimation results strongly indicate that households in the areas with poor supply quality have a high willingness to pay for reliability improvements. However, a relatively small group of households may incur substantial welfare losses from an electricity price increase even when it is combined with a partial reliability improvement. Unlike an earlier assessment of the same reforms in Azerbaijan, analysis in this dissertation clearly shows that targeted investments in improving service reliability may be the best way to mitigate adverse welfare consequences of electricity price increases. Hence, policymakers should focus their attention on ensuring that quality improvements are a central component of power sector reforms. Survey evidence also shows that, although households may incur sizable welfare losses from indoor air pollution when they rely on traditional fuels, they do not recognize indoor air pollution as a factor contributing to the high incidence of respiratory illness among fuel wood users. Therefore, benefits may be greater if policy interventions that improve the reliability of modern energy sources are combined with an information campaign about the adverse health effects of fuel wood use. (Abstract shortened by UMI.)
Electricity and generator availability in LMIC hospitals: improving access to safe surgery.
Chawla, Sagar; Kurani, Shaheen; Wren, Sherry M; Stewart, Barclay; Burnham, Gilbert; Kushner, Adam; McIntyre, Thomas
2018-03-01
Access to reliable energy has been identified as a global priority and codified within United Nations Sustainable Goal 7 and the Electrify Africa Act of 2015. Reliable hospital access to electricity is necessary to provide safe surgical care. The current state of electrical availability in hospitals in low- and middle-income countries (LMICs) throughout the world is not well known. This study aimed to review the surgical capacity literature and document the availability of electricity and generators. Using Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic search for surgical capacity assessments in LMICs in MEDLINE, PubMed, and World Health Organization Global Health Library was performed. Data regarding electricity and generator availability were extracted. Estimated percentages for individual countries were calculated. Of 76 articles identified, 21 reported electricity availability, totaling 528 hospitals. Continuous electricity availability at hospitals providing surgical care was 312/528 (59.1%). Generator availability was 309/427 (72.4%). Estimated continuous electricity availability ranged from 0% (Sierra Leone and Malawi) to 100% (Iran); estimated generator availability was 14% (Somalia) to 97.6% (Iran). Less than two-thirds of hospitals providing surgical care in 21 LMICs have a continuous electricity source or have an available generator. Efforts are needed to improve electricity infrastructure at hospitals to assure safe surgical care. Future research should look at the effect of energy availability on surgical care and patient outcomes and novel methods of powering surgical equipment. Copyright © 2017 Elsevier Inc. All rights reserved.
Brady, Karen; Cracknell, Nina; Zulch, Helen; Mills, Daniel Simon
2018-01-01
Working dogs are selected based on predictions from tests that they will be able to perform specific tasks in often challenging environments. However, withdrawal from service in working dogs is still a big problem, bringing into question the reliability of the selection tests used to make these predictions. A systematic review was undertaken aimed at bringing together available information on the reliability and predictive validity of the assessment of behavioural characteristics used with working dogs to establish the quality of selection tests currently available for use to predict success in working dogs. The search procedures resulted in 16 papers meeting the criteria for inclusion. A large range of behaviour tests and parameters were used in the identified papers, and so behaviour tests and their underpinning constructs were grouped on the basis of their relationship with positive core affect (willingness to work, human-directed social behaviour, object-directed play tendencies) and negative core affect (human-directed aggression, approach withdrawal tendencies, sensitivity to aversives). We then examined the papers for reports of inter-rater reliability, within-session intra-rater reliability, test-retest validity and predictive validity. The review revealed a widespread lack of information relating to the reliability and validity of measures to assess behaviour and inconsistencies in terminologies, study parameters and indices of success. There is a need to standardise the reporting of these aspects of behavioural tests in order to improve the knowledge base of what characteristics are predictive of optimal performance in working dog roles, improving selection processes and reducing working dog redundancy. We suggest the use of a framework based on explaining the direct or indirect relationship of the test with core affect.
The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks
NASA Technical Reports Server (NTRS)
Hamlin, Teri L.
2010-01-01
HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failure in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedural, equipment design and need for automation.
Factors that Affect Operational Reliability of Turbojet Engines
NASA Technical Reports Server (NTRS)
1956-01-01
The problem of improving operational reliability of turbojet engines is studied in a series of papers. Failure statistics for this engine are presented, the theory and experimental evidence on how engine failures occur are described, and the methods available for avoiding failure in operation are discussed. The individual papers of the series are Objectives, Failure Statistics, Foreign-Object Damage, Compressor Blades, Combustor Assembly, Nozzle Diaphrams, Turbine Buckets, Turbine Disks, Rolling Contact Bearings, Engine Fuel Controls, and Summary Discussion.
Lee, James
2009-01-01
The Long-Term Mechanical Circulatory Support (MCS) System Reliability Recommendation was published in the American Society for Artificial Internal Organs (ASAIO) Journal and the Annals of Thoracic Surgery in 1998. At that time, it was stated that the document would be periodically reviewed to assess its timeliness and appropriateness within 5 years. Given the wealth of clinical experience in MCS systems, a new recommendation has been drafted by consensus of a group of representatives from the medical community, academia, industry, and government. The new recommendation describes a reliability test methodology and provides detailed reliability recommendations. In addition, the new recommendation provides additional information and clinical data in appendices that are intended to assist the reliability test engineer in the development of a reliability test that is expected to give improved predictions of clinical reliability compared with past test methods. The appendices are available for download at the ASAIO journal web site at www.asaiojournal.com.
Mussman, Grant M; Vossmeyer, Michael T; Brady, Patrick W; Warrick, Denise M; Simmons, Jeffrey M; White, Christine M
2015-09-01
Timely and reliable verbal communication between hospitalists and primary care physicians (PCPs) is critical for prevention of medical adverse events but difficult in practice. Our aim was to increase the proportion of completed verbal handoffs from on-call residents or attendings to PCPs within 24 hours of patient discharge from a hospital medicine service to ≥90% within 18 months. A multidisciplinary team collaborated to redesign the process by which PCPs were contacted following patient discharge. Interventions focused on the key drivers of obtaining stakeholder buy-in, standardization of the communication process, including assigning primary responsibility for discharge communication to a single resident on each team and batching calls during times of maximum resident availability, reliable automated process initiation through leveraging the electronic health record (EHR), and transparency of data. A run chart assessed the impact of interventions over time. The percentage of calls initiated within 24 hours of discharge improved from 52% to 97%, and the percentage of calls completed improved to 93%. Results were sustained for 18 months. Standardization of the communication process through hospital telephone operators, use of the discharge order to ensure initiation of discharge communication, and batching of phone calls were associated with improvements in our measures. Reliable verbal discharge communication can be achieved through the use of a standardized discharge communication process coupled with the EHR. © 2015 Society of Hospital Medicine.
Quantifying the Economic and Grid Reliability Impacts of Improved Wind Power Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Martinez-Anido, Carlo Brancucci; Wu, Hongyu
Wind power forecasting is an important tool in power system operations to address variability and uncertainty. Accurately doing so is important to reducing the occurrence and length of curtailment, enhancing market efficiency, and improving the operational reliability of the bulk power system. This research quantifies the value of wind power forecasting improvements in the IEEE 118-bus test system as modified to emulate the generation mixes of Midcontinent, California, and New England independent system operator balancing authority areas. To measure the economic value, a commercially available production cost modeling tool was used to simulate the multi-timescale unit commitment (UC) and economicmore » dispatch process for calculating the cost savings and curtailment reductions. To measure the reliability improvements, an in-house tool, FESTIV, was used to calculate the system's area control error and the North American Electric Reliability Corporation Control Performance Standard 2. The approach allowed scientific reproducibility of results and cross-validation of the tools. A total of 270 scenarios were evaluated to accommodate the variation of three factors: generation mix, wind penetration level, and wind fore-casting improvements. The modified IEEE 118-bus systems utilized 1 year of data at multiple timescales, including the day-ahead UC, 4-hour-ahead UC, and 5-min real-time dispatch. The value of improved wind power forecasting was found to be strongly tied to the conventional generation mix, existence of energy storage devices, and the penetration level of wind energy. The simulation results demonstrate that wind power forecasting brings clear benefits to power system operations.« less
High-reliability gas-turbine combined-cycle development program: Phase II, Volume 3. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hecht, K.G.; Sanderson, R.A.; Smith, M.J.
This three-volume report presents the results of Phase II of the multiphase EPRI-sponsored High-Reliability Gas Turbine Combined-Cycle Development Program whose goal is to achieve a highly reliable gas turbine combined-cycle power plant, available by the mid-1980s, which would be an economically attractive baseload generation alternative for the electric utility industry. The Phase II program objective was to prepare the preliminary design of this power plant. The power plant was addressed in three areas: (1) the gas turbine, (2) the gas turbine ancillaries, and (3) the balance of plant including the steam turbine generator. To achieve the program goals, a gasmore » turbine was incorporated which combined proven reliability characteristics with improved performance features. This gas turbine, designated the V84.3, is the result of a cooperative effort between Kraftwerk Union AG and United Technologies Corporation. Gas turbines of similar design operating in Europe under baseload conditions have demonstrated mean time between failures in excess of 40,000. The reliability characteristics of the gas turbine ancillaries and balance-of-plant equipment were improved through system simplification and component redundancy and by selection of component with inherent high reliability. A digital control system was included with logic, communications, sensor redundancy, and manual backup. An independent condition monitoring and diagnostic system was also included. Program results provide the preliminary design of a gas turbine combined-cycle baseload power plant. This power plant has a predicted mean time between failure of nearly twice the 3000-h EPRI goal. The cost of added reliability features is offset by improved performance, which results in a comparable specific cost and an 8% lower cost of electricty compared to present market offerings.« less
Effect of reflective p-type ohmic contact on thermal reliability of vertical InGaN/GaN LEDs
NASA Astrophysics Data System (ADS)
Son, Jun Ho; Song, Yang Hee; Kim, Buem Joon; Lee, Jong-Lam
2014-11-01
We report on the enhanced thermal reliability of vertical-LEDs (VLEDs) using novel reflective p-type ohmic contacts with good thermal stability. The reflective p-type ohmic contacts with Ni/Ag-Cu alloy multi-layer structure shows low contact resistivity, as low as 9.3 × 10-6 Ωcm2, and high reflectance of 86% after annealing at 450°C. The V-LEDs with Ni/Ag-Cu alloy multi-layer structure show good thermal reliability with stress time at 300°C in air ambient. The improved thermal stability of the reflective ohmic contacts to p-type GaN is believed to play a critical role in the thermal reliability of V-LEDs. [Figure not available: see fulltext.
Koo, Henry; Leveridge, Mike; Thompson, Charles; Zdero, Rad; Bhandari, Mohit; Kreder, Hans J; Stephen, David; McKee, Michael D; Schemitsch, Emil H
2008-07-01
The purpose of this study was to measure interobserver reliability of 2 classification systems of pelvic ring fractures and to determine whether computed tomography (CT) improves reliability. The reliability of several radiographic findings was also tested. Thirty patients taken from a database at a Level I trauma facility were reviewed. For each patient, 3 radiographs (AP pelvis, inlet, and outlet) and CT scans were available. Six different reviewers (pelvic and acetabular specialist, orthopaedic traumatologist, or orthopaedic trainee) classified the injury according to Young-Burgess and Tile classification systems after reviewing plain radiographs and then after CT scans. The Kappa coefficient was used to determine interobserver reliability of these classification systems before and after CT scan. For plain radiographs, overall Kappa values for the Young-Burgess and Tile classification systems were 0.72 and 0.30, respectively. For CT scan and plain radiographs, the overall Kappa values for the Young-Burgess and Tile classification systems were 0.63 and 0.33, respectively. The pelvis/acetabular surgeons demonstrated the highest level of agreement using both classification systems. For individual questions, the addition of CT did significantly improve reviewer interpretation of fracture stability. The pre-CT and post-CT Kappa values for fracture stability were 0.59 and 0.93, respectively. The CT scan can improve the reliability of assessment of pelvic stability because of its ability to identify anatomical features of injury. The Young-Burgess system may be optimal for the learning surgeon. The Tile classification system is more beneficial for specialists in pelvic and acetabular surgery.
Abusive behavior is barrier to high-reliability health care systems, culture of patient safety.
Cassirer, C; Anderson, D; Hanson, S; Fraser, H
2000-11-01
Addressing abusive behavior in the medical workplace presents an important opportunity to deliver on the national commitment to improve patient safety. Fundamentally, the issue of patient safety and the issue of abusive behavior in the workplace are both about harm. Undiagnosed and untreated, abusive behavior is a barrier to creating high reliability service delivery systems that ensure patient safety. Health care managers and clinicians need to improve their awareness, knowledge, and understanding of the issue of workplace abuse. The available research suggests there is a high prevalence of workplace abuse in medicine. Both administrators at the blunt end and clinicians at the sharp end should consider learning new approaches to defining and treating the problem of workplace abuse. Eliminating abusive behavior has positive implications for preventing and controlling medical injury and improving organizational performance.
Potential for natural evaporation as a reliable renewable energy resource
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cavusoglu, Ahmet-Hamdi; Chen, Xi; Gentine, Pierre
About 50% of the solar energy absorbed at the Earth’s surface drives evaporation, fueling the water cycle that affects various renewable energy resources, such as wind and hydropower. Recent advances demonstrate our nascent ability to convert evaporation energy into work, yet there is little understanding about the potential of this resource. Here in this paper we study the energy available from natural evaporation to predict the potential of this ubiquitous resource. We find that natural evaporation from open water surfaces could provide power densities comparable to current wind and solar technologies while cutting evaporative water losses by nearly half. Wemore » estimate up to 325 GW of power is potentially available in the United States. Strikingly, water’s large heat capacity is sufficient to control power output by storing excess energy when demand is low, thus reducing intermittency and improving reliability. Our findings motivate the improvement of materials and devices that convert energy from evaporation.« less
Physical Activity Monitoring in Patients with Chronic Obstructive Pulmonary Disease
Liao, Shu-Yi; Benzo, Roberto; Ries, Andrew L.; Soler, Xavier
2014-01-01
Reduced physical activity (PA) in patients with chronic obstructive pulmonary disease (COPD) is associated with increased morbidity and mortality (e.g. exacerbations) and eventually leads to disability, depression, and social and physical isolation. Measuring PA in this population is important to accurately characterize COPD and to help clinicians during a baseline evaluation and patient follow-up. Also, it may help increase adherence to PA programs. There are reliable objective and subjective methods available to measure PA. Recently, several new monitors have been developed that have improved accuracy of such measurements. Because these devices provide real-time feedback, they may help to improve participant self-motivation strategies and reinforce daily lifestyle modifications, one of the main goals in COPD management. This review focuses on describing available instruments to measure PA, specifically in patients with COPD. The reliability, validity, advantages, limitations, and clinical applications of questionnaires, pedometers, and accelerometers are discussed. Finally, based on current published literature, we propose recommendations about which methods may be most useful in different research or clinical settings. PMID:28848818
Potential for natural evaporation as a reliable renewable energy resource
Cavusoglu, Ahmet-Hamdi; Chen, Xi; Gentine, Pierre; ...
2017-09-26
About 50% of the solar energy absorbed at the Earth’s surface drives evaporation, fueling the water cycle that affects various renewable energy resources, such as wind and hydropower. Recent advances demonstrate our nascent ability to convert evaporation energy into work, yet there is little understanding about the potential of this resource. Here in this paper we study the energy available from natural evaporation to predict the potential of this ubiquitous resource. We find that natural evaporation from open water surfaces could provide power densities comparable to current wind and solar technologies while cutting evaporative water losses by nearly half. Wemore » estimate up to 325 GW of power is potentially available in the United States. Strikingly, water’s large heat capacity is sufficient to control power output by storing excess energy when demand is low, thus reducing intermittency and improving reliability. Our findings motivate the improvement of materials and devices that convert energy from evaporation.« less
Designing for Reliability and Robustness
NASA Technical Reports Server (NTRS)
Svetlik, Randall G.; Moore, Cherice; Williams, Antony
2017-01-01
Long duration spaceflight has a negative effect on the human body, and exercise countermeasures are used on-board the International Space Station (ISS) to minimize bone and muscle loss, combatting these effects. Given the importance of these hardware systems to the health of the crew, this equipment must continue to be readily available. Designing spaceflight exercise hardware to meet high reliability and availability standards has proven to be challenging throughout the time the crewmembers have been living on ISS beginning in 2000. Furthermore, restoring operational capability after a failure is clearly time-critical, but can be problematic given the challenges of troubleshooting the problem from 220 miles away. Several best-practices have been leveraged in seeking to maximize availability of these exercise systems, including designing for robustness, implementing diagnostic instrumentation, relying on user feedback, and providing ample maintenance and sparing. These factors have enhanced the reliability of hardware systems, and therefore have contributed to keeping the crewmembers healthy upon return to Earth. This paper will review the failure history for three spaceflight exercise countermeasure systems identifying lessons learned that can help improve future systems. Specifically, the Treadmill with Vibration Isolation and Stabilization System (TVIS), Cycle Ergometer with Vibration Isolation and Stabilization System (CEVIS), and the Advanced Resistive Exercise Device (ARED) will be reviewed, analyzed, and conclusions identified so as to provide guidance for improving future exercise hardware designs. These lessons learned, paired with thorough testing, offer a path towards reduced system down-time.
Reliability Quantification of Advanced Stirling Convertor (ASC) Components
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward
2010-01-01
The Advanced Stirling Convertor, is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17 year mission, the ASC must provide power with desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system and component level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationship among the design variables based on physics, mechanics, material behavior models, interaction of different components and their respective disciplines such as structures, materials, fluid, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated, and analysis refined as more data and information becomes available. The failure mechanisms and causes of failure are included in the analysis, especially in light of the new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on fundamental physical behavior of components and their relationship with other components has demonstrated itself to be a superior technique to conventional reliability approaches based on utilizing failure rates derived from similar equipment or simply expert judgment.
MOLAR: Modular Linux and Adaptive Runtime Support for HEC OS/R Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank Mueller
2009-02-05
MOLAR is a multi-institution research effort that concentrates on adaptive, reliable,and efficient operating and runtime system solutions for ultra-scale high-end scientific computing on the next generation of supercomputers. This research addresses the challenges outlined by the FAST-OS - forum to address scalable technology for runtime and operating systems --- and HECRTF --- high-end computing revitalization task force --- activities by providing a modular Linux and adaptable runtime support for high-end computing operating and runtime systems. The MOLAR research has the following goals to address these issues. (1) Create a modular and configurable Linux system that allows customized changes based onmore » the requirements of the applications, runtime systems, and cluster management software. (2) Build runtime systems that leverage the OS modularity and configurability to improve efficiency, reliability, scalability, ease-of-use, and provide support to legacy and promising programming models. (3) Advance computer reliability, availability and serviceability (RAS) management systems to work cooperatively with the OS/R to identify and preemptively resolve system issues. (4) Explore the use of advanced monitoring and adaptation to improve application performance and predictability of system interruptions. The overall goal of the research conducted at NCSU is to develop scalable algorithms for high-availability without single points of failure and without single points of control.« less
Reeves, Mathew J; Mullard, Andrew J; Wehner, Susan
2008-01-01
Background The Paul Coverdell National Acute Stroke Registry (PCNASR) is a U.S. based national registry designed to monitor and improve the quality of acute stroke care delivered by hospitals. The registry monitors care through specific performance measures, the accuracy of which depends in part on the reliability of the individual data elements used to construct them. This study describes the inter-rater reliability of data elements collected in Michigan's state-based prototype of the PCNASR. Methods Over a 6-month period, 15 hospitals participating in the Michigan PCNASR prototype submitted data on 2566 acute stroke admissions. Trained hospital staff prospectively identified acute stroke admissions, abstracted chart information, and submitted data to the registry. At each hospital 8 randomly selected cases were re-abstracted by an experienced research nurse. Inter-rater reliability was estimated by the kappa statistic for nominal variables, and intraclass correlation coefficient (ICC) for ordinal and continuous variables. Factors that can negatively impact the kappa statistic (i.e., trait prevalence and rater bias) were also evaluated. Results A total of 104 charts were available for re-abstraction. Excellent reliability (kappa or ICC > 0.75) was observed for many registry variables including age, gender, black race, hemorrhagic stroke, discharge medications, and modified Rankin Score. Agreement was at least moderate (i.e., 0.75 > kappa ≥; 0.40) for ischemic stroke, TIA, white race, non-ambulance arrival, hospital transfer and direct admit. However, several variables had poor reliability (kappa < 0.40) including stroke onset time, stroke team consultation, time of initial brain imaging, and discharge destination. There were marked systematic differences between hospital abstractors and the audit abstractor (i.e., rater bias) for many of the data elements recorded in the emergency department. Conclusion The excellent reliability of many of the data elements supports the use of the PCNASR to monitor and improve care. However, the poor reliability for several variables, particularly time-related events in the emergency department, indicates the need for concerted efforts to improve the quality of data collection. Specific recommendations include improvements to data definitions, abstractor training, and the development of ED-based real-time data collection systems. PMID:18547421
Implementation of Integrated System Fault Management Capability
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Schmalzel, John; Morris, Jon; Smith, Harvey; Turowski, Mark
2008-01-01
Fault Management to support rocket engine test mission with highly reliable and accurate measurements; while improving availability and lifecycle costs. CORE ELEMENTS: Architecture, taxonomy, and ontology (ATO) for DIaK management. Intelligent Sensor Processes; Intelligent Element Processes; Intelligent Controllers; Intelligent Subsystem Processes; Intelligent System Processes; Intelligent Component Processes.
A review of health resource tracking in developing countries.
Powell-Jackson, Timothy; Mills, Anne
2007-11-01
Timely, reliable and complete information on financial resources in the health sector is critical for sound policy making and planning, particularly in developing countries where resources are both scarce and unpredictable. Health resource tracking has a long history and has seen renewed interest more recently as pressure has mounted to improve accountability for the attainment of the health Millennium Development Goals. We review the methods used to track health resources and recent experiences of their application, with a view to identifying the major challenges that must be overcome if data availability and reliability are to improve. At the country level, there have been important advances in the refinement of the National Health Accounts (NHA) methodology, which is now regarded as the international standard. Significant efforts have also been put into the development of methods to track disease-specific expenditures. However, NHA as a framework can do little to address the underlying problem of weak government public expenditure management and information systems that provide much of the raw data. The experience of institutionalizing NHA suggests progress has been uneven and there is a potential for stand-alone disease accounts to make the situation worse by undermining capacity and confusing technicians. Global level tracking of donor assistance to health relies to a large extent on the OECD's Creditor Reporting System. Despite improvements in its coverage and reliability, the demand for estimates of aid to control of specific diseases is resulting in multiple, uncoordinated data requests to donor agencies, placing additional workload on the providers of information. The emergence of budget support aid modalities poses a methodological challenge to health resource tracking, as such support is difficult to attribute to any particular sector or health programme. Attention should focus on improving underlying financial and information systems at the country level, which will facilitate more reliable and timely reporting of NHA estimates. Effective implementation of a framework to make donors more accountable to recipient countries and the international community will improve the availability of financial data on their activities.
Keppler, Hannah; Dhooge, Ingeborg; Maes, Leen; D'haenens, Wendy; Bockstael, Annelies; Philips, Birgit; Swinnen, Freya; Vinck, Bart
2010-02-01
Knowledge regarding the variability of transient-evoked otoacoustic emissions (TEOAEs) and distortion product otoacoustic emissions (DPOAEs) is essential in clinical settings and improves their utility in monitoring hearing status over time. In the current study, TEOAEs and DPOAEs were measured with commercially available OAE-equipment in 56 normally-hearing ears during three sessions. Reliability was analysed for the retest measurement without probe-refitting, the immediate retest measurement with probe-refitting, and retest measurements after one hour and one week. The highest reliability was obtained in the retest measurement without probe-refitting, and decreased with increasing time-interval between measurements. For TEOAEs, the lowest reliability was seen at half-octave frequency bands 1.0 and 1.4 kHz; whereas for DPOAEs half-octave frequency band 8.0 kHz had also poor reliability. Higher primary tone level combination for DPOAEs yielded to a better reliability of DPOAE amplitudes. External environmental noise seemed to be the dominating noise source in normal-hearing subjects, decreasing the reliability of emission amplitudes especially in the low-frequency region.
Gebreyesus, Grum; Lund, Mogens S; Buitenhuis, Bart; Bovenhuis, Henk; Poulsen, Nina A; Janss, Luc G
2017-12-05
Accurate genomic prediction requires a large reference population, which is problematic for traits that are expensive to measure. Traits related to milk protein composition are not routinely recorded due to costly procedures and are considered to be controlled by a few quantitative trait loci of large effect. The amount of variation explained may vary between regions leading to heterogeneous (co)variance patterns across the genome. Genomic prediction models that can efficiently take such heterogeneity of (co)variances into account can result in improved prediction reliability. In this study, we developed and implemented novel univariate and bivariate Bayesian prediction models, based on estimates of heterogeneous (co)variances for genome segments (BayesAS). Available data consisted of milk protein composition traits measured on cows and de-regressed proofs of total protein yield derived for bulls. Single-nucleotide polymorphisms (SNPs), from 50K SNP arrays, were grouped into non-overlapping genome segments. A segment was defined as one SNP, or a group of 50, 100, or 200 adjacent SNPs, or one chromosome, or the whole genome. Traditional univariate and bivariate genomic best linear unbiased prediction (GBLUP) models were also run for comparison. Reliabilities were calculated through a resampling strategy and using deterministic formula. BayesAS models improved prediction reliability for most of the traits compared to GBLUP models and this gain depended on segment size and genetic architecture of the traits. The gain in prediction reliability was especially marked for the protein composition traits β-CN, κ-CN and β-LG, for which prediction reliabilities were improved by 49 percentage points on average using the MT-BayesAS model with a 100-SNP segment size compared to the bivariate GBLUP. Prediction reliabilities were highest with the BayesAS model that uses a 100-SNP segment size. The bivariate versions of our BayesAS models resulted in extra gains of up to 6% in prediction reliability compared to the univariate versions. Substantial improvement in prediction reliability was possible for most of the traits related to milk protein composition using our novel BayesAS models. Grouping adjacent SNPs into segments provided enhanced information to estimate parameters and allowing the segments to have different (co)variances helped disentangle heterogeneous (co)variances across the genome.
Reliability analysis of a robotic system using hybridized technique
NASA Astrophysics Data System (ADS)
Kumar, Naveen; Komal; Lather, J. S.
2017-09-01
In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc). Quantification of involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, then the computed reliability parameters have wide range of predictions. Therefore, decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. With this technique, fuzzy set theory is utilized to quantify uncertainties, fault tree is utilized for the system modeling, lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and genetic algorithm is utilized to solve established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow exponential distribution, i.e., constant. Sensitivity analysis is also performed and impact on system mean time between failures (MTBF) is addressed by varying other reliability parameters. Based on analysis some influential suggestions are given to improve the system performance.
Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Scholz, Markus
2017-02-01
Body surface area is a physiological quantity relevant for many medical applications. In clinical practice, it is determined by empirical formulae. 3D laser-based anthropometry provides an easy and effective way to measure body surface area but is not ubiquitously available. We used data from laser-based anthropometry from a population-based study to assess validity of published and commonly used empirical formulae. We performed a large population-based study on adults collecting classical anthropometric measurements and 3D body surface assessments (N = 1435). We determined reliability of the 3D body surface assessment and validity of 18 different empirical formulae proposed in the literature. The performance of these formulae is studied in subsets of sex and BMI. Finally, improvements of parameter settings of formulae and adjustments for sex and BMI were considered. 3D body surface measurements show excellent intra- and inter-rater reliability of 0.998 (overall concordance correlation coefficient, OCCC was used as measure of agreement). Empirical formulae of Fujimoto and Watanabe, Shuter and Aslani and Sendroy and Cecchini performed best with excellent concordance with OCCC > 0.949 even in subgroups of sex and BMI. Re-parametrization of formulae and adjustment for sex and BMI slightly improved results. In adults, 3D laser-based body surface assessment is a reliable alternative to estimation by empirical formulae. However, there are empirical formulae showing excellent results even in subgroups of sex and BMI with only little room for improvement.
Material Selection for Cable Gland to Improved Reliability of the High-hazard Industries
NASA Astrophysics Data System (ADS)
Vashchuk, S. P.; Slobodyan, S. M.; Deeva, V. S.; Vashchuk, D. S.
2018-01-01
The sealed cable glands (SCG) are available to ensure safest connection sheathed single wire for the hazard production facility (nuclear power plant and others) the same as pilot cable, control cables, radio-frequency cables et al. In this paper, we investigate the specifics of the material selection of SCG with the express aim of hazardous man-made facility. We discuss the safe working conditions for cable glands. The research indicates the sintering powdered metals cables provide the reliability growth due to their properties. A number of studies have demonstrated the verification of material selection. On the face of it, we make findings indicating that double glazed sealed units could enhance reliability. We had evaluated sample reliability under fire conditions, seismic load, and pressure containment failure. We used the samples mineral insulated thermocouple cable.
Potential for natural evaporation as a reliable renewable energy resource.
Cavusoglu, Ahmet-Hamdi; Chen, Xi; Gentine, Pierre; Sahin, Ozgur
2017-09-26
About 50% of the solar energy absorbed at the Earth's surface drives evaporation, fueling the water cycle that affects various renewable energy resources, such as wind and hydropower. Recent advances demonstrate our nascent ability to convert evaporation energy into work, yet there is little understanding about the potential of this resource. Here we study the energy available from natural evaporation to predict the potential of this ubiquitous resource. We find that natural evaporation from open water surfaces could provide power densities comparable to current wind and solar technologies while cutting evaporative water losses by nearly half. We estimate up to 325 GW of power is potentially available in the United States. Strikingly, water's large heat capacity is sufficient to control power output by storing excess energy when demand is low, thus reducing intermittency and improving reliability. Our findings motivate the improvement of materials and devices that convert energy from evaporation.The evaporation of water represents an alternative source of renewable energy. Building on previous models of evaporation, Cavusoglu et al. show that the power available from this natural resource is comparable to wind and solar power, yet it does not suffer as much from varying weather conditions.
Krienert, Jessie L; Walsh, Jeffrey A; Matthews, Kevin; McConkey, Kelly
2012-01-01
Companion animals play a complex role in families impacted by violence. An outlet of emotional support for victims, the family pet often becomes a target for physical abuse. Results from a comprehensive e-survey of domestic violence shelters nationwide (N = 767) highlight both improvements and existing gaps in service provision for domestic violence victims and their pets. Quantitative and qualitative data noted frequently encountered obstacles to successful shelter seeking by abuse victims with companion animals including a lack of availability, funding, space, and reliable programming. Although results indicate an overall improvement in organizational awareness, fewer than half of surveyed shelters include intake questions about animals. Continued awareness and an expansion of services is needed to create viable safety planning strategies and reliable alternatives for women with companion animals in order to improve the likelihood that abuse victims will seek escape and refuge for themselves, their children, and their pets.
Regulation Scheme for Improved Innovation and Efficiency in Wireless Communications
2009-03-01
that is reliable and available during all crises and emergencies. The last goal is the overall modernization of the FCC. “The FCC shall strive to...change. Qualcomm attempted to create a new video service to mobile subscribers. The service might have interfered with adjacent services. Qualcomm
Ecological forecasts: An emerging imperative
James S. Clark; Steven R. Carpenter; Mary Barber; Scott Collins; Andy Dobson; Jonathan A. Foley; David M. Lodge; Mercedes Pascual; Roger Pielke; William Pizer; Cathy Pringle; Walter V. Reid; Kenneth A. Rose; Osvaldo Sala; William H. Schlesinger; Diana H. Wall; David Wear
2001-01-01
Planning and decision-making can be improved by access to reliable forecasts of ecosystem state, ecosystem services, and natural capital. Availability of new data sets, together with progress in computation and statistics, will increase our ability to forecast ecosystem change. An agenda that would lead toward a capacity to produce, evaluate, and communicate forecasts...
USDA-ARS?s Scientific Manuscript database
Instruments have been available for many years to detect insects using sound, vibration, or LED sensors separately. Most of these instruments are relatively expensive. An instrument was evaluated that incorporates all three types of sensors to improve the reliability of distinguishing different spec...
USDA-ARS?s Scientific Manuscript database
Instruments have been available for many years to detect insects using sound, vibration, or LED sensors separately. Most of these instruments are relatively expensive. An instrument was evaluated that incorporates all three types of sensors to improve the reliability of distinguishing different sp...
Object-oriented fault tree evaluation program for quantitative analyses
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1988-01-01
Object-oriented programming can be combined with fault free techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instrument (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
Reliability, Availability and Maintainability Design Practices Guide. Volume 1,
1981-03-01
Experience 7-3-3 Air Force RIV - Avionics 7-3-4 RIW-S Army 7-3-5a The Application of Availability to Linear 7-3-6 Indifference Contracting Improvement...acceptance of the maintain- ability of Air Force ground electronic systems and equipments. Although the notebook is directed at ground electronic systems...conformal coating standardization, a lack of written instructions, and no standardization between fleet activities. The Naval Air Development Center
Alanis-Lobato, Gregorio
2015-01-01
High-throughput detection of protein interactions has had a major impact in our understanding of the intricate molecular machinery underlying the living cell, and has permitted the construction of very large protein interactomes. The protein networks that are currently available are incomplete and a significant percentage of their interactions are false positives. Fortunately, the structural properties observed in good quality social or technological networks are also present in biological systems. This has encouraged the development of tools, to improve the reliability of protein networks and predict new interactions based merely on the topological characteristics of their components. Since diseases are rarely caused by the malfunction of a single protein, having a more complete and reliable interactome is crucial in order to identify groups of inter-related proteins involved in disease etiology. These system components can then be targeted with minimal collateral damage. In this article, an important number of network mining tools is reviewed, together with resources from which reliable protein interactomes can be constructed. In addition to the review, a few representative examples of how molecular and clinical data can be integrated to deepen our understanding of pathogenesis are discussed.
Assessing Performance of Multipurpose Reservoir System Using Two-Point Linear Hedging Rule
NASA Astrophysics Data System (ADS)
Sasireka, K.; Neelakantan, T. R.
2017-07-01
Reservoir operation is the one of the important filed of water resource management. Innovative techniques in water resource management are focussed at optimizing the available water and in decreasing the environmental impact of water utilization on the natural environment. In the operation of multi reservoir system, efficient regulation of the release to satisfy the demand for various purpose like domestic, irrigation and hydropower can lead to increase the benefit from the reservoir as well as significantly reduces the damage due to floods. Hedging rule is one of the emerging techniques in reservoir operation, which reduce the severity of drought by accepting number of smaller shortages. The key objective of this paper is to maximize the minimum power production and improve the reliability of water supply for municipal and irrigation purpose by using hedging rule. In this paper, Type II two-point linear hedging rule is attempted to improve the operation of Bargi reservoir in the Narmada basin in India. The results obtained from simulation of hedging rule is compared with results from Standard Operating Policy, the result shows that the application of hedging rule significantly improved the reliability of water supply and reliability of irrigation release and firm power production.
Identification of reliable gridded reference data for statistical downscaling methods in Alberta
NASA Astrophysics Data System (ADS)
Eum, H. I.; Gupta, A.
2017-12-01
Climate models provide essential information to assess impacts of climate change at regional and global scales. However, statistical downscaling methods have been applied to prepare climate model data for various applications such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data mainly depend on a reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables which are main input data to regional modelling systems. However, inconsistencies in these climate products, for example, different combinations of climate variables, varying data domains and data lengths and data accuracy varying with physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), inverse distance method (Alberta Townships), and numerical climate model (North American Regional Reanalysis) and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation, minimum and maximum temperature over the province of Alberta. Based on the performance of climate products at AHCCD stations, we ranked the reliability of these publically available climate products corresponding to the elevations of stations discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system was developed to allow users to easily select the most reliable reference climate data at each target point based on the elevation of grid cell. By constructing the best combination of reference data for the study domain, the accurate and reliable statistically downscaled climate projections could be significantly improved.
Advanced Stirling Convertor Heater Head Durability and Reliability Quantification
NASA Technical Reports Server (NTRS)
Krause, David L.; Shah, Ashwin R.; Korovaichuk, Igor; Kalluri, Sreeramesh
2008-01-01
The National Aeronautics and Space Administration (NASA) has identified the high efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for long duration Science missions, such as lunar applications, Mars rovers, and deep space missions, that require reliable design lifetimes of up to 17 years. Resistance to creep deformation of the MarM-247 heater head (HH), a structurally critical component of the ASRG Advanced Stirling Convertor (ASC), under high temperatures (up to 850 C) is a key design driver for durability. Inherent uncertainties in the creep behavior of the thin-walled HH and the variations in the wall thickness, control temperature, and working gas pressure need to be accounted for in the life and reliability prediction. Due to the availability of very limited test data, assuring life and reliability of the HH is a challenging task. The NASA Glenn Research Center (GRC) has adopted an integrated approach combining available uniaxial MarM-247 material behavior testing, HH benchmark testing and advanced analysis in order to demonstrate the integrity, life and reliability of the HH under expected mission conditions. The proposed paper describes analytical aspects of the deterministic and probabilistic approaches and results. The deterministic approach involves development of the creep constitutive model for the MarM-247 (akin to the Oak Ridge National Laboratory master curve model used previously for Inconel 718 (Special Metals Corporation)) and nonlinear finite element analysis to predict the mean life. The probabilistic approach includes evaluation of the effect of design variable uncertainties in material creep behavior, geometry and operating conditions on life and reliability for the expected life. The sensitivity of the uncertainties in the design variables on the HH reliability is also quantified, and guidelines to improve reliability are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-12
... maps? What are the public safety and homeland security implications of public disclosure of key network... 13-33] Improving 9-1-1 Reliability; Reliability and Continuity of Communications Networks, Including... improve the reliability and resiliency of the Nation's 9-1-1 networks. The Notice of Proposed Rulemaking...
18 CFR 40.3 - Availability of Reliability Standards.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Availability of Reliability Standards. 40.3 Section 40.3 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... THE BULK-POWER SYSTEM § 40.3 Availability of Reliability Standards. The Electric Reliability...
Availability Estimation for Facilities in Extreme Geographical Locations
NASA Technical Reports Server (NTRS)
Fischer, Gerd M.; Omotoso, Oluseun; Chen, Guangming; Evans, John W.
2012-01-01
A value added analysis for the Reliability. Availability and Maintainability of McMurdo Ground Station was developed, which will be a useful tool for system managers in sparing, maintenance planning and determining vital performance metrics needed for readiness assessment of the upgrades to the McMurdo System. Output of this study can also be used as inputs and recommendations for the application of Reliability Centered Maintenance (RCM) for the system. ReliaSoft's BlockSim. a commercial Reliability Analysis software package, has been used to model the availability of the system upgrade to the National Aeronautics and Space Administration (NASA) Near Earth Network (NEN) Ground Station at McMurdo Station in the Antarctica. The logistics challenges due to the closure of access to McMurdo Station during the Antarctic winter was modeled using a weighted composite of four Weibull distributions. one of the possible choices for statistical distributions throughout the software program and usually used to account for failure rates of components supplied by different manufacturers. The inaccessibility of the antenna site on a hill outside McMurdo Station throughout one year due to severe weather was modeled with a Weibull distribution for the repair crew availability. The Weibull distribution is based on an analysis of the available weather data for the antenna site for 2007 in combination with the rules for travel restrictions due to severe weather imposed by the administrating agency, the National Science Foundation (NSF). The simulations resulted in an upper bound for the system availability and allowed for identification of components that would improve availability based on a higher on-site spare count than initially planned.
Application of the Systematic Sensor Selection Strategy for Turbofan Engine Diagnostics
NASA Technical Reports Server (NTRS)
Sowers, T. Shane; Kopasakis, George; Simon, Donald L.
2008-01-01
The data acquired from available system sensors forms the foundation upon which any health management system is based, and the available sensor suite directly impacts the overall diagnostic performance that can be achieved. While additional sensors may provide improved fault diagnostic performance, there are other factors that also need to be considered such as instrumentation cost, weight, and reliability. A systematic sensor selection approach is desired to perform sensor selection from a holistic system-level perspective as opposed to performing decisions in an ad hoc or heuristic fashion. The Systematic Sensor Selection Strategy is a methodology that optimally selects a sensor suite from a pool of sensors based on the system fault diagnostic approach, with the ability of taking cost, weight, and reliability into consideration. This procedure was applied to a large commercial turbofan engine simulation. In this initial study, sensor suites tailored for improved diagnostic performance are constructed from a prescribed collection of candidate sensors. The diagnostic performance of the best performing sensor suites in terms of fault detection and identification are demonstrated, with a discussion of the results and implications for future research.
Application of the Systematic Sensor Selection Strategy for Turbofan Engine Diagnostics
NASA Technical Reports Server (NTRS)
Sowers, T. Shane; Kopasakis, George; Simon, Donald L.
2008-01-01
The data acquired from available system sensors forms the foundation upon which any health management system is based, and the available sensor suite directly impacts the overall diagnostic performance that can be achieved. While additional sensors may provide improved fault diagnostic performance there are other factors that also need to be considered such as instrumentation cost, weight, and reliability. A systematic sensor selection approach is desired to perform sensor selection from a holistic system-level perspective as opposed to performing decisions in an ad hoc or heuristic fashion. The Systematic Sensor Selection Strategy is a methodology that optimally selects a sensor suite from a pool of sensors based on the system fault diagnostic approach, with the ability of taking cost, weight and reliability into consideration. This procedure was applied to a large commercial turbofan engine simulation. In this initial study, sensor suites tailored for improved diagnostic performance are constructed from a prescribed collection of candidate sensors. The diagnostic performance of the best performing sensor suites in terms of fault detection and identification are demonstrated, with a discussion of the results and implications for future research.
Survey points to practices that reduce refinery maintenance spending
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricketts, R.
During the past decade, Solomon Associates Inc., Dallas, has conducted several comparative analyses of maintenance costs in the refining industry. These investigations have brought to light maintenance practices and reliability improvement activities that are responsible for the wide range of maintenance costs recorded by refineries. Some of the practices are of an organizational nature and thus are of interest to managers reviewing their operations. The paper discusses maintenance costs; profitability; cost trends; equipment availability; funds application; two basic organizational approached to maintenance (repair-focused organization and reliability-focused organization); low-cost practices; and organizational style.
2015-09-30
acoustics and fine scale motion. The success of the Dtag has resulted in an increased demand for the instrument from researchers both within the...sensor blocks sound when the animal is close to the surface. The polyethylene shell was eliminated in the Dtag-3 design to improve acoustic ...into 3 main sub-assemblies (Figure 5): 1) foam sub-assembly, 2) sensor sub-assembly, and 3) Electronics sub-assembly. This separation enables rapid
Loss of Load Probability Calculation for West Java Power System with Nuclear Power Plant Scenario
NASA Astrophysics Data System (ADS)
Azizah, I. D.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.; Shafii, M. A.
2017-03-01
Loss of Load Probability (LOLP) index showing the quality and performance of an electrical system. LOLP value is affected by load growth, the load duration curve, forced outage rate of the plant, number and capacity of generating units. This reliability index calculation begins with load forecasting to 2018 using multiple regression method. Scenario 1 with compositions of conventional plants produce the largest LOLP in 2017 amounted to 71.609 days / year. While the best reliability index generated in scenario 2 with the NPP amounted to 6.941 days / year in 2015. Improved reliability of systems using nuclear power more efficiently when compared to conventional plants because it also has advantages such as emission-free, inexpensive fuel costs, as well as high level of plant availability.
Structural Probability Concepts Adapted to Electrical Engineering
NASA Technical Reports Server (NTRS)
Steinberg, Eric P.; Chamis, Christos C.
1994-01-01
Through the use of equivalent variable analogies, the authors demonstrate how an electrical subsystem can be modeled by an equivalent structural subsystem. This allows the electrical subsystem to be probabilistically analyzed by using available structural reliability computer codes such as NESSUS. With the ability to analyze the electrical subsystem probabilistically, we can evaluate the reliability of systems that include both structural and electrical subsystems. Common examples of such systems are a structural subsystem integrated with a health-monitoring subsystem, and smart structures. Since these systems have electrical subsystems that directly affect the operation of the overall system, probabilistically analyzing them could lead to improved reliability and reduced costs. The direct effect of the electrical subsystem on the structural subsystem is of secondary order and is not considered in the scope of this work.
Simulation of Swap-Out Reliability For The Advance Photon Source Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borland, M.
2017-06-01
The proposed upgrade of the Advanced Photon Source (APS) to a multibend-achromat lattice relies on the use of swap-out injection to accommodate the small dynamic acceptance, allow use of unusual insertion devices, and minimize collective effects at high single-bunch charge. This, combined with the short beam lifetime, will make injector reliability even more important than it is for top-up operation. We used historical data for the APS injector complex to obtain probability distributions for injector up-time and down-time durations. Using these distributions, we simulated several years of swap-out operation for the upgraded lattice for several operatingmodes. The results indicate thatmore » obtaining very high availability of beam in the storage ring will require improvements to injector reliability.« less
Innovative safety valve selection techniques and data.
Miller, Curt; Bredemyer, Lindsey
2007-04-11
The new valve data resources and modeling tools that are available today are instrumental in verifying that that safety levels are being met in both current installations and project designs. If the new ISA 84 functional safety practices are followed closely, good industry validated data used, and a user's maintenance integrity program strictly enforced, plants should feel confident that their design has been quantitatively reinforced. After 2 years of exhaustive reliability studies, there are now techniques and data available to support this safety system component deficiency. Everyone who has gone through the process of safety integrity level (SIL) verification (i.e. reliability math) will appreciate the progress made in this area. The benefits of these advancements are improved safety with lower lifecycle costs such as lower capital investment and/or longer testing intervals. This discussion will start with a review of the different valve, actuator, and solenoid/positioner combinations that can be used and their associated application restraints. Failure rate reliability studies (i.e. FMEDA) and data associated with the final combinations will then discussed. Finally, the impact of the selections on each safety system's SIL verification will be reviewed.
Projecting technology change to improve space technology planning and systems management
NASA Astrophysics Data System (ADS)
Walk, Steven Robert
2011-04-01
Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
The reliability of in-training assessment when performance improvement is taken into account.
van Lohuizen, Mirjam T; Kuks, Jan B M; van Hell, Elisabeth A; Raat, A N; Stewart, Roy E; Cohen-Schotanus, Janke
2010-12-01
During in-training assessment students are frequently assessed over a longer period of time and therefore it can be expected that their performance will improve. We studied whether there really is a measurable performance improvement when students are assessed over an extended period of time and how this improvement affects the reliability of the overall judgement. In-training assessment results were obtained from 104 students on rotation at our university hospital or at one of the six affiliated hospitals. Generalisability theory was used in combination with multilevel analysis to obtain reliability coefficients and to estimate the number of assessments needed for reliable overall judgement, both including and excluding performance improvement. Students' clinical performance ratings improved significantly from a mean of 7.6 at the start to a mean of 7.8 at the end of their clerkship. When taking performance improvement into account, reliability coefficients were higher. The number of assessments needed to achieve a reliability of 0.80 or higher decreased from 17 to 11. Therefore, when studying reliability of in-training assessment, performance improvement should be considered.
2016-10-14
ABSTRACT DTAGs have proven to be a valuable tool for the study of marine mammal acoustics and fine scale motion. The success of the DTAG has resulted in...Underwater Acoustics , Digital Communications 16. SECURITY CLASSIFICATION OF: 17. LIMITATION OF a. REPORT b. ABSTRACT c. THIS PAGE ABSTRACT u u u...continuously monitor and improve the tag design.\\ OBJECTIVES Dtags have proven to be a valuable tool for the study of marine mammal acoustics and fine
Upgrading the Extender: Which Options Are Cost-Effective for Modernizing the KC-10
2011-01-01
possessed rate is the percent- age of the total fleet that is in depot for heavy maintenance or modifications and not available for missions. The NMC rate...maintainability improvement initiatives. There were none. Table 4.2 shows, for 2004–2008, how much of the total NMC rate was associated with the top five, top...of the total , and the top 100 were responsible for 47 percent. Therefore, reliability and maintainability improvements to many systems would be
Stress Responses and Decision Making in Child Protection Workers Faced with High Conflict Situations
ERIC Educational Resources Information Center
LeBlanc, Vicki R.; Regehr, Cheryl; Shlonsky, Aron; Bogo, Marion
2012-01-01
Introduction: The assessment of children at risk of abuse and neglect is a critical societal function performed by child protection workers in situations of acute stress and conflict. Despite efforts to improve the reliability of risk assessments through standardized measures, available tools continue to rely on subjective judgment. The goal of…
Working with Teachers to Develop Fair and Reliable Measures of Effective Teaching. MET Project
ERIC Educational Resources Information Center
Bill & Melinda Gates Foundation, 2010
2010-01-01
In fall 2009, the Bill & Melinda Gates Foundation launched the Measures of Effective Teaching (MET) project to develop and test multiple measures of teacher effectiveness. The goal of the MET project is to improve the quality of information about teaching effectiveness available to education professionals within states and districts--information…
Yang, Qi; Franco, Christopher M M; Sorokin, Shirley J; Zhang, Wei
2017-02-02
For sponges (phylum Porifera), there is no reliable molecular protocol available for species identification. To address this gap, we developed a multilocus-based Sponge Identification Protocol (SIP) validated by a sample of 37 sponge species belonging to 10 orders from South Australia. The universal barcode COI mtDNA, 28S rRNA gene (D3-D5), and the nuclear ITS1-5.8S-ITS2 region were evaluated for their suitability and capacity for sponge identification. The highest Bit Score was applied to infer the identity. The reliability of SIP was validated by phylogenetic analysis. The 28S rRNA gene and COI mtDNA performed better than the ITS region in classifying sponges at various taxonomic levels. A major limitation is that the databases are not well populated and possess low diversity, making it difficult to conduct the molecular identification protocol. The identification is also impacted by the accuracy of the morphological classification of the sponges whose sequences have been submitted to the database. Re-examination of the morphological identification further demonstrated and improved the reliability of sponge identification by SIP. Integrated with morphological identification, the multilocus-based SIP offers an improved protocol for more reliable and effective sponge identification, by coupling the accuracy of different DNA markers.
Yang, Qi; Franco, Christopher M. M.; Sorokin, Shirley J.; Zhang, Wei
2017-01-01
For sponges (phylum Porifera), there is no reliable molecular protocol available for species identification. To address this gap, we developed a multilocus-based Sponge Identification Protocol (SIP) validated by a sample of 37 sponge species belonging to 10 orders from South Australia. The universal barcode COI mtDNA, 28S rRNA gene (D3–D5), and the nuclear ITS1-5.8S-ITS2 region were evaluated for their suitability and capacity for sponge identification. The highest Bit Score was applied to infer the identity. The reliability of SIP was validated by phylogenetic analysis. The 28S rRNA gene and COI mtDNA performed better than the ITS region in classifying sponges at various taxonomic levels. A major limitation is that the databases are not well populated and possess low diversity, making it difficult to conduct the molecular identification protocol. The identification is also impacted by the accuracy of the morphological classification of the sponges whose sequences have been submitted to the database. Re-examination of the morphological identification further demonstrated and improved the reliability of sponge identification by SIP. Integrated with morphological identification, the multilocus-based SIP offers an improved protocol for more reliable and effective sponge identification, by coupling the accuracy of different DNA markers. PMID:28150727
Enhancing recovery rates: lessons from year one of IAPT.
Gyani, Alex; Shafran, Roz; Layard, Richard; Clark, David M
2013-09-01
The English Improving Access to Psychological Therapies (IAPT) initiative aims to make evidence-based psychological therapies for depression and anxiety disorder more widely available in the National Health Service (NHS). 32 IAPT services based on a stepped care model were established in the first year of the programme. We report on the reliable recovery rates achieved by patients treated in the services and identify predictors of recovery at patient level, service level, and as a function of compliance with National Institute of Health and Care Excellence (NICE) Treatment Guidelines. Data from 19,395 patients who were clinical cases at intake, attended at least two sessions, had at least two outcomes scores and had completed their treatment during the period were analysed. Outcome was assessed with the patient health questionnaire depression scale (PHQ-9) and the anxiety scale (GAD-7). Data completeness was high for a routine cohort study. Over 91% of treated patients had paired (pre-post) outcome scores. Overall, 40.3% of patients were reliably recovered at post-treatment, 63.7% showed reliable improvement and 6.6% showed reliable deterioration. Most patients received treatments that were recommended by NICE. When a treatment not recommended by NICE was provided, recovery rates were reduced. Service characteristics that predicted higher reliable recovery rates were: high average number of therapy sessions; higher step-up rates among individuals who started with low intensity treatment; larger services; and a larger proportion of experienced staff. Compliance with the IAPT clinical model is associated with enhanced rates of reliable recovery. Copyright © 2013 The Authors. Published by Elsevier Ltd.. All rights reserved.
Majuru, Batsirai; Jagals, Paul; Hunter, Paul R
2012-10-01
Although a number of studies have reported on water supply improvements, few have simultaneously taken into account the reliability of the water services. The study aimed to assess whether upgrading water supply systems in small rural communities improved access, availability and potability of water by assessing the water services against selected benchmarks from the World Health Organisation and South African Department of Water Affairs, and to determine the impact of unreliability on the services. These benchmarks were applied in three rural communities in Limpopo, South Africa where rudimentary water supply services were being upgraded to basic services. Data were collected through structured interviews, observations and measurement, and multi-level linear regression models were used to assess the impact of water service upgrades on key outcome measures of distance to source, daily per capita water quantity and Escherichia coli count. When the basic system was operational, 72% of households met the minimum benchmarks for distance and water quantity, but only 8% met both enhanced benchmarks. During non-operational periods of the basic service, daily per capita water consumption decreased by 5.19l (p<0.001, 95% CI 4.06-6.31) and distances to water sources were 639 m further (p ≤ 0.001, 95% CI 560-718). Although both rudimentary and basic systems delivered water that met potability criteria at the sources, the quality of stored water sampled in the home was still unacceptable throughout the various service levels. These results show that basic water services can make substantial improvements to water access, availability, potability, but only if such services are reliable. Copyright © 2012 Elsevier B.V. All rights reserved.
Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.
2015-08-19
Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves first identifying potential candidatemore » inputs, then second organizing data for the analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. As a result, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.« less
2018-01-01
Although it is becoming increasingly popular to monitor parameters related to training, recovery, and health with wearable sensor technology (wearables), scientific evaluation of the reliability, sensitivity, and validity of such data is limited and, where available, has involved a wide variety of approaches. To improve the trustworthiness of data collected by wearables and facilitate comparisons, we have outlined recommendations for standardized evaluation. We discuss the wearable devices themselves, as well as experimental and statistical considerations. Adherence to these recommendations should be beneficial not only for the individual, but also for regulatory organizations and insurance companies. PMID:29712629
Energy recovery with turboexpander processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holm, J.
1985-07-01
Although the primary function of turboexpanders has been to provide efficient, low-temperature refrigeration, the energy thus extracted has also been an important additional feature. Today, turboexpanders are proven reliable and used widely in the following applications discussed in this article: industrial gases; natural gas (NG) processing; production of liquefied natural gas (LNG); flashing hydrocarbon liquids; NG pressure letdown energy recovery; oilfield cogeneration; and recovery of energy from waste heat. Turboexpander applications for energy conservation resulted because available turboexpanders have the required high-performance capabilities and reliability. At the same time, the development of these energy conservation practices and processes helped furthermore » improve turboexpanders.« less
Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective
Shterenshis, Michael
2017-01-01
Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes. PMID:29138741
Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective.
Shterenshis, Michael
2017-01-01
Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes.
The effect of Web-based Braden Scale training on the reliability of Braden subscale ratings.
Magnan, Morris A; Maklebust, JoAnn
2009-01-01
The primary purpose of this study was to evaluate the effect of Web-based Braden Scale training on the reliability of Braden Scale subscale ratings made by nurses working in acute care hospitals. A secondary purpose was to describe the distribution of reliable Braden subscale ratings before and after Web-based Braden Scale training. Secondary analysis of data from a recently completed quasi-experimental, pretest-posttest, interrater reliability study. A convenience sample of RNs working at 3 Michigan medical centers voluntarily participated in the study. RN participants included nurses who used the Braden Scale regularly at their place of employment ("regular users") as well as nurses who did not use the Braden Scale at their place of employment ("new users"). Using a pretest-posttest, quasi-experimental design, pretest interrater reliability data were collected to identify the percentage of nurses making reliable Braden subscale assessments. Nurses then completed a Web-based Braden Scale training module after which posttest interrater reliability data were collected. The reliability of nurses' Braden subscale ratings was determined by examining the level of agreement/disagreement between ratings made by an RN and an "expert" rating the same patient. In total, 381 RN-to-expert dyads were available for analysis. During both the pretest and posttest periods, the percentage of reliable subscale ratings was highest for the activity subscale, lowest for the moisture subscale, and second lowest for the nutrition subscale. With Web-based Braden Scale training, the percentage of reliable Braden subscale ratings made by new users increased for all 6 subscales with statistically significant improvements in the percentage of reliable assessments made on 3 subscales: sensory-perception, moisture, and mobility. Training had virtually no effect on the percentage of reliable subscale ratings made by regular users of the Braden Scale. With Web-based Braden Scale training the percentage of nurses making reliable ratings increased for all 6 subscales, but this was true for new users only. Additional research is needed to identify educational approaches that effectively improve and sustain the reliability of subscale ratings among regular users of the Braden Scale. Moreover, special attention needs to be given to ensuring that all nurses working with the Braden Scale have a clear understanding of the intended meanings and correct approaches to rating moisture and nutrition subscales.
Reliability of SNOMED-CT Coding by Three Physicians using Two Terminology Browsers
Chiang, Michael F.; Hwang, John C.; Yu, Alexander C.; Casper, Daniel S.; Cimino, James J.; Starren, Justin
2006-01-01
SNOMED-CT has been promoted as a reference terminology for electronic health record (EHR) systems. Many important EHR functions are based on the assumption that medical concepts will be coded consistently by different users. This study is designed to measure agreement among three physicians using two SNOMED-CT terminology browsers to encode 242 concepts from five ophthalmology case presentations in a publicly-available clinical journal. Inter-coder reliability, based on exact coding match by each physician, was 44% using one browser and 53% using the other. Intra-coder reliability testing revealed that a different SNOMED-CT code was obtained up to 55% of the time when the two browsers were used by one user to encode the same concept. These results suggest that the reliability of SNOMED-CT coding is imperfect, and may be a function of browsing methodology. A combination of physician training, terminology refinement, and browser improvement may help increase the reproducibility of SNOMED-CT coding. PMID:17238317
An improved mounting device for attaching intracranial probes in large animal models.
Dunster, Kimble R
2015-12-01
The rigid support of intracranial probes can be difficult when using animal models, as mounting devices suitable for the probes are either not available, or designed for human use and not suitable in animal skulls. A cheap and reliable mounting device for securing intracranial probes in large animal models is described. Using commonly available clinical consumables, a universal mounting device for securing intracranial probes to the skull of large animals was developed and tested. A simply made mounting device to hold a variety of probes from 500 μm to 1.3 mm in diameter to the skull was developed. The device was used to hold probes to the skulls of sheep for up to 18 h. No adhesives or cements were used. The described device provides a reliable method of securing probes to the skull of animals.
Woodham, W.M.
1982-01-01
This report provides results of reliability and cost-effective studies of the goes satellite data-collection system used to operate a small hydrologic data network in west-central Florida. The GOES system, in its present state of development, was found to be about as reliable as conventional methods of data collection. Benefits of using the GOES system include some cost and manpower reduction, improved data accuracy, near real-time data availability, and direct computer storage and analysis of data. The GOES system could allow annual manpower reductions of 19 to 23 percent with reduction in cost for some and increase in cost for other single-parameter sites, such as streamflow, rainfall, and ground-water monitoring stations. Manpower reductions of 46 percent or more appear possible for multiple-parameter sites. Implementation of expected improvements in instrumentation and data handling procedures should further reduce costs. (USGS)
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.
Theresa B. Jain
1994-01-01
Fluctuations in atmospheric carbon dioxide is influenced by carbon storage and cycling in terrestrial forest ecosystems. Currently, only gross estimates are available for carbon content of these ecosystems and reliable estimates are lacking for Rocky Mountain forests. To improve carbon storage estimates more information is needed on the relationship between carbon and...
Digital Avionics Information System (DAIS): Development and Demonstration.
1981-09-01
advances in technology. The DAIS architecture results in improved reliability and availability of avionics systems while at the same time reducing life ...DAIS) represents a significant advance in the technology of avionics system architecture. DAIS is a total systems concept, exploiting standardization...configurations and fully capable of accommodating new advances in technology. These fundamental system charac- teristics are described in this report; the
USDA Forest Service mobile satellite communications applications
NASA Technical Reports Server (NTRS)
Warren, John R.
1990-01-01
The airborne IR signal processing system being developed will require the use of mobile satellite communications to achieve its full capability and improvement in delivery timeliness of processed IR data to the Fire Staff. There are numerous other beneficial uses, both during wildland fire management operations or in daily routine tasks, which will also benefit from the availability of reliable communications from remote areas.
Ho, Chester H; Cheung, Amanda; Southern, Danielle; Ocampo, Wrechelle; Kaufman, Jaime; Hogan, David B; Baylis, Barry; Conly, John M; Stelfox, Henry T; Ghali, William A
2016-12-01
Research regarding the reliability of the Braden Scale and nurses' perspectives on the instrument for predicting pressure ulcer (PU) risk in acute care settings is limited. A mixed-methods study was conducted in a tertiary acute care facility to examine interrater reliability (IRR) of the Braden Scale and its subscales, and a qualitative survey using semi-structured interviews was conducted among nurses caring for patients in acute care units to gain nurse perspective regarding scale usability. Data were extracted from a previous retrospective, randomized, controlled trial involving adult patients with compromised mobility receiving care in a tertiary acute care hospital in Canada. One-way, intraclass correlation coefficients (ICCs) were calculated on item and total scores, and kappa statistics were used to determine reliability of categorizing patients on their risk. Interview results were categorized by common themes. Reliability was assessed on 64 patients, where nurses and research staff independently assessed enrolled participants at baseline and after 72 hours using the Braden Scale as it appeared on an electronic medical record. IRR for the total score was high (ICC = 0.807). The friction and shear item had the lowest reliability (ICC = 0.266). Reliability of categorizing patients' level of risk had moderate agreement (κ = 0.408). Three (3) major and 12 subthemes emerged from the 14 nurse interviews; nurses were aware of the scale's purpose but were uncertain of its effectiveness, some items were difficult to rate, and questions were raised as to whether using the scale enhanced patient care. Aspects identified by nurses to enhance usability included: 1) changes to the electronic version (incorporating the scale into daily assessment documents with readily available item descriptions), 2) additional training, and 3) easily available resource material to improve reliability and usability of scale. These findings need to be considered when using the Braden Scale in clinical practice. Further study of the value of the total Braden Scale and its subscales is warranted.
Improving the estimated cost of sustained power interruptions to electricity customers
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaCommare, Kristina Hamachi; Eto, Joseph H.; Dunn, Laurel N.
Electricity reliability and resiliency have become a topic of heightened interest in recent years in the United States. As utilities, regulators, and policymakers determine how to achieve optimal levels of electricity reliability while considering how best to prepare for future disruptions in power, the related issue of how much it costs when customers lose power remains a largely unanswered question. In 2006, Lawrence Berkeley National Laboratory developed an end-use based framework that estimates the cost of power interruptions in the U.S that has served as a foundational paper using the best available, yet far from perfect, information at that time.more » Since then, an abundance of work has been done to improve the quality and availability of information that now allow us to make a much more robust assessment of the cost of power interruptions to U.S. customers. In this paper, we find that the total U.S. cost of sustained power interruptions is 44 billion dollars per year (2015-) -25% more than the 26 billion dollars per year in 2002- (or 35 billion dollars per year in 2015-) estimated in our 2006 study.« less
Improving the estimated cost of sustained power interruptions to electricity customers
LaCommare, Kristina Hamachi; Eto, Joseph H.; Dunn, Laurel N.; ...
2018-04-18
Electricity reliability and resiliency have become a topic of heightened interest in recent years in the United States. As utilities, regulators, and policymakers determine how to achieve optimal levels of electricity reliability while considering how best to prepare for future disruptions in power, the related issue of how much it costs when customers lose power remains a largely unanswered question. In 2006, Lawrence Berkeley National Laboratory developed an end-use based framework that estimates the cost of power interruptions in the U.S that has served as a foundational paper using the best available, yet far from perfect, information at that time.more » Since then, an abundance of work has been done to improve the quality and availability of information that now allow us to make a much more robust assessment of the cost of power interruptions to U.S. customers. In this paper, we find that the total U.S. cost of sustained power interruptions is 44 billion dollars per year (2015-) -25% more than the 26 billion dollars per year in 2002- (or 35 billion dollars per year in 2015-) estimated in our 2006 study.« less
Balshaw, T G; Fry, A; Maden-Wilkinson, T M; Kong, P W; Folland, J P
2017-06-01
The reliability of surface electromyography (sEMG) is typically modest even with rigorous methods, and therefore further improvements in sEMG reliability are desirable. This study compared the between-session reliability (both within participant absolute reliability and between-participant relative reliability) of sEMG amplitude from single vs. average of two distinct recording sites, for individual muscle (IM) and whole quadriceps (WQ) measures during voluntary and evoked contractions. Healthy males (n = 20) performed unilateral isometric knee extension contractions: voluntary maximum and submaximum (60%), as well as evoked twitch contractions on two separate days. sEMG was recorded from two distinct sites on each superficial quadriceps muscle. Averaging two recording sites vs. using single site measures improved reliability for IM and WQ measurements during voluntary (16-26% reduction in within-participant coefficient of variation, CV W ) and evoked contractions (40-56% reduction in CV W ). For sEMG measurements from large muscles, averaging the recording of two distinct sites is recommended as it improves within-participant reliability. This improved sensitivity has application to clinical and research measurement of sEMG amplitude.
Ankomah, James; Stewart, Barclay T; Oppong-Nketia, Victor; Koranteng, Adofo; Gyedu, Adam; Quansah, Robert; Donkor, Peter; Abantanga, Francis; Mock, Charles
2015-11-01
This study aimed to assess the availability of pediatric trauma care items (i.e. equipment, supplies, technology) and factors contributing to deficiencies in Ghana. Ten universal and 9 pediatric-sized items were selected from the World Health Organization's Guidelines for Essential Trauma Care. Direct inspection and structured interviews with administrative, clinical and biomedical engineering staff were used to assess item availability at 40 purposively sampled district, regional and tertiary hospitals in Ghana. Hospital assessments demonstrated marked deficiencies for a number of essential items (e.g. basic airway supplies, chest tubes, blood pressure cuffs, electrolyte determination, portable X-ray). Lack of pediatric-sized items resulting from equipment absence, lack of training, frequent stock-outs and technology breakage were common. Pediatric items were consistently less available than adult-sized items at each hospital level. This study identified several successes and problems with pediatric trauma care item availability in Ghana. Item availability could be improved, both affordably and reliably, by better organization and planning (e.g. regular assessment of demand and inventory, reliable financing for essential trauma care items). In addition, technology items were often broken. Developing local service and biomedical engineering capability was highlighted as a priority to avoid long periods of equipment breakage. Copyright © 2015 Elsevier Inc. All rights reserved.
Ankomah, James; Stewart, Barclay T; Oppong-Nketia, Victor; Koranteng, Adofo; Gyedu, Adam; Quansah, Robert; Donkor, Peter; Abantanga, Francis; Mock, Charles
2015-01-01
Background This study aimed to assess the availability of pediatric trauma care items (i.e. equipment, supplies, technology) and factors contributing to deficiencies in Ghana. Methods Ten universal and 9 pediatric-sized items were selected from the World Health Organization’s Guidelines for Essential Trauma Care. Direct inspection and structured interviews with administrative, clinical and biomedical engineering staff were used to assess item availability at 40 purposively sampled district, regional and tertiary hospitals in Ghana. Results Hospital assessments demonstrated marked deficiencies for a number of essential items (e.g. basic airway supplies, chest tubes, blood pressure cuffs, electrolyte determination, portable Xray). Lack of pediatric-sized items resulting from equipment absence, lack of training, frequent stock-outs and technology breakage were common. Pediatric items were consistently less available than adult-sized items at each hospital level. Conclusion This study identified several successes and problems with pediatric trauma care item availability in Ghana. Item availability could be improved, both affordably and reliably, by better organization and planning (e.g. regular assessment of demand and inventory, reliable financing for essential trauma care items). In addition, technology items were often broken. Developing local service and biomedical engineering capability was highlighted as a priority to avoid long periods of equipment breakage. PMID:25841284
A low-complexity add-on score for protein remote homology search with COMER.
Margelevicius, Mindaugas
2018-06-15
Protein sequence alignment forms the basis for comparative modeling, the most reliable approach to protein structure prediction, among many other applications. Alignment between sequence families, or profile-profile alignment, represents one of the most, if not the most, sensitive means for homology detection but still necessitates improvement. We aim at improving the quality of profile-profile alignments and the sensitivity induced by them by refining profile-profile substitution scores. We have developed a new score that represents an additional component of profile-profile substitution scores. A comprehensive evaluation shows that the new add-on score statistically significantly improves both the sensitivity and the alignment quality of the COMER method. We discuss why the score leads to the improvement and its almost optimal computational complexity that makes it easily implementable in any profile-profile alignment method. An implementation of the add-on score in the open-source COMER software and data are available at https://sourceforge.net/projects/comer. The COMER software is also available on Github at https://github.com/minmarg/comer and as a Docker image (minmar/comer). Supplementary data are available at Bioinformatics online.
FY12 End of Year Report for NEPP DDR2 Reliability
NASA Technical Reports Server (NTRS)
Guertin, Steven M.
2013-01-01
This document reports the status of the NASA Electronic Parts and Packaging (NEPP) Double Data Rate 2 (DDR2) Reliability effort for FY2012. The task expanded the focus of evaluating reliability effects targeted for device examination. FY11 work highlighted the need to test many more parts and to examine more operating conditions, in order to provide useful recommendations for NASA users of these devices. This year's efforts focused on development of test capabilities, particularly focusing on those that can be used to determine overall lot quality and identify outlier devices, and test methods that can be employed on components for flight use. Flight acceptance of components potentially includes considerable time for up-screening (though this time may not currently be used for much reliability testing). Manufacturers are much more knowledgeable about the relevant reliability mechanisms for each of their devices. We are not in a position to know what the appropriate reliability tests are for any given device, so although reliability testing could be focused for a given device, we are forced to perform a large campaign of reliability tests to identify devices with degraded reliability. With the available up-screening time for NASA parts, it is possible to run many device performance studies. This includes verification of basic datasheet characteristics. Furthermore, it is possible to perform significant pattern sensitivity studies. By doing these studies we can establish higher reliability of flight components. In order to develop these approaches, it is necessary to develop test capability that can identify reliability outliers. To do this we must test many devices to ensure outliers are in the sample, and we must develop characterization capability to measure many different parameters. For FY12 we increased capability for reliability characterization and sample size. We increased sample size this year by moving from loose devices to dual inline memory modules (DIMMs) with an approximate reduction of 20 to 50 times in terms of per device under test (DUT) cost. By increasing sample size we have improved our ability to characterize devices that may be considered reliability outliers. This report provides an update on the effort to improve DDR2 testing capability. Although focused on DDR2, the methods being used can be extended to DDR and DDR3 with relative ease.
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested, within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.
Reimer, André; Schmitt, Andreas; Ehrmann, Dominic; Kulzer, Bernhard; Hermanns, Norbert
2017-01-01
Depressive symptoms in people with diabetes are associated with increased risk of adverse outcomes. Although successful psychosocial treatment options are available, little is known about factors that facilitate treatment response for depression in diabetes. This prospective study aims to examine the impact of known risk factors on improvement of depressive symptoms with a special interest in the role of diabetes-related distress. 181 people with diabetes participated in a randomized controlled trial. Diabetes-related distress was assessed using the Problem Areas In Diabetes (PAID) scale; depressive symptoms were assessed using the Center for Epidemiologic Studies Depression (CES-D) scale. Multiple logistic and linear regression analyses were used to assess associations between risk factors for depression (independent variables) and improvement of depressive symptoms (dependent variable). Reliable change indices were established as criteria of meaningful reductions in diabetes distress and depressive symptoms. A reliable reduction of diabetes-related distress (15.43 points in the PAID) was significantly associated with fourfold increased odds for reliable improvement of depressive symptoms (OR = 4.25, 95% CI: 2.05-8.79; P<0.001). This result was corroborated using continuous measures of diabetes distress and depressive symptoms, showing that greater reduction of diabetes-related distress independently predicted greater improvement in depressive symptoms (ß = -0.40; P<0.001). Higher age had a positive (Odds Ratio = 2.04, 95% CI: 1.21-3.43; P<0.01) and type 2 diabetes had a negative effect on the meaningful reduction of depressive symptoms (Odds Ratio = 0.12, 95% CI: 0.04-0.35; P<0.001). The reduction of diabetes distress is a statistical predictor of improvement of depressive symptoms. Diabetes patients with comorbid depressive symptomatology might benefit from treatments to reduce diabetes-related distress.
Bräutigam, Klaus-Rainer; Jörissen, Juliane; Priefer, Carmen
2014-08-01
The reduction of food waste is seen as an important societal issue with considerable ethical, ecological and economic implications. The European Commission aims at cutting down food waste to one-half by 2020. However, implementing effective prevention measures requires knowledge of the reasons and the scale of food waste generation along the food supply chain. The available data basis for Europe is very heterogeneous and doubts about its reliability are legitimate. This mini-review gives an overview of available data on food waste generation in EU-27 and discusses their reliability against the results of own model calculations. These calculations are based on a methodology developed on behalf of the Food and Agriculture Organization of the United Nations and provide data on food waste generation for each of the EU-27 member states, broken down to the individual stages of the food chain and differentiated by product groups. The analysis shows that the results differ significantly, depending on the data sources chosen and the assumptions made. Further research is much needed in order to improve the data stock, which builds the basis for the monitoring and management of food waste. © The Author(s) 2014.
2017 NREL Photovoltaic Reliability Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah
NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.
Chawla, Sagar S; Gupta, Shailvi; Onchiri, Frankline M; Habermann, Elizabeth B; Kushner, Adam L; Stewart, Barclay T
2016-09-01
Although two billion people now have access to clean water, many hospitals in low- and middle-income countries (LMICs) do not. Lack of water availability at hospitals hinders safe surgical care. We aimed to review the surgical capacity literature and document the availability of water at health facilities and develop a predictive model of water availability at health facilities globally to inform targeted capacity improvements. Using Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic search for surgical capacity assessments in LMICs in MEDLINE, PubMed, and World Health Organization Global Health Library was performed. Data regarding water availability were extracted. Data from these assessments and national indicator data from the World Bank (e.g., gross domestic product, total health expenditure, and percent of population with improved access to water) were used to create a predictive model for water availability in LMICs globally. Of the 72 records identified, 19 reported water availability representing 430 hospitals. A total of 66% of hospitals assessed had water availability (283 of 430 hospitals). Using these data, estimated percent of water availability in LMICs more broadly ranged from under 20% (Liberia) to over 90% (Bangladesh, Ghana). Less than two-thirds of hospitals providing surgical care in 19 LMICs had a reliable water source. Governments and nongovernmental organizations should increase efforts to improve water infrastructure at hospitals, which might aid in the provision of safe essential surgical care. Future research is needed to measure the effect of water availability on surgical care and patient outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.
TFTR neutral beam control and monitoring for DT operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
O`Connor, T.; Kamperschroer, J.; Chu, J.
1995-12-31
Record fusion power output has recently been obtained in TFTR with the injection of deuterium and tritium neutral beams. This significant achievement was due in part to the controls, software, and data processing capabilities added to the neutral beam system for DT operations. Chief among these improvements was the addition of SUN workstations and large dynamic data storage to the existing Central Instrumentation Control and Data Acquisition (CICADA) system. Essentially instantaneous look back over the recent shot history has been provided for most beam waveforms and analysis results. Gas regulation controls allowing remote switchover between deuterium and tritium were alsomore » added. With these tools, comparison of the waveforms and data of deuterium and tritium for four test conditioning pulses quickly produced reliable tritium setpoints. Thereafter, all beam conditioning was performed with deuterium, thus saving the tritium supply for the important DT injection shots. The lookback capability also led to modifications of the gas system to improve reliability and to control ceramic valve leakage by backbiasing. Other features added to improve the reliability and availability of DT neutral beam operations included master beamline controls and displays, a beamline thermocouple interlock system, a peak thermocouple display, automatic gas inventory and cryo panel gas loading monitoring, beam notching controls, a display of beam/plasma interlocks, and a feedback system to control beam power based on plasma conditions.« less
Photovoltaic Module Reliability Workshop 2011: February 16-17, 2011
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, S.
2013-11-01
NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.
Photovoltaic Module Reliability Workshop 2014: February 25-26, 2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, S.
2014-02-01
NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.
Photovoltaic Module Reliability Workshop 2013: February 26-27, 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, S.
2013-10-01
NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.
Photovoltaic Module Reliability Workshop 2010: February 18-19, 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, J.
2013-11-01
NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.
2016 NREL Photovoltaic Module Reliability Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah
NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology - both critical goals for moving PV technologies deeper into the electricity marketplace.
2015 NREL Photovoltaic Module Reliability Workshops
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah
NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.
NASA Technical Reports Server (NTRS)
Alexander, R. H. (Principal Investigator); Mcginty, H. K., III
1975-01-01
The author has identified the following significant results. Recommendations resulting from the CARETS evaluation reflect the need to establish a flexible and reliable system for providing more detailed raw and processed land resource information as well as the need to improve the methods of making information available to users.
The impact of rare earth cobalt permanent magnets on electromechanical device design
NASA Technical Reports Server (NTRS)
Fisher, R. L.; Studer, P. A.
1979-01-01
Specific motor designs which employ rare earth cobalt magnets are discussed with special emphasis on their unique properties and magnetic field geometry. In addition to performance improvements and power savings, high reliability devices are attainable. Both the mechanism and systems engineering should be aware of the new performance levels which are currently becoming available as a result of the rare earth cobalt magnets.
Propagation Impact on Modern HF (High Frequency) Communications System Design
1986-03-01
received SNR is maximised and interference avoided. As a general principle, system availability and reliability should be improved by the use of...LECTURE SERIES No. 145 propagation Impact on Modern HF Communications System Design. NORTH ATLANTIC TREATY ORGANIZATION gS ^, DISTRIBUTION ...civil and military communities for high frequency communications. It will discuss concepts of real time channel evaluation , system design, as well as
Navy Applications Experience with Small Wind Power Systems
1985-05-01
present state-of-the-art in small WECS technology, including environmental concerns, is reviewed. Also presented is how the technology is advancing to...environmental concerns, is reviewed. Also presented is how the technology is advancing to improve reliability and avail- ability for effectively using...VAWT technology is still in its early stages of development. The horizontal-axis wind turbine (HAWT) technology has advanced to third and fourth
APPLICATION OF TRAVEL TIME RELIABILITY FOR PERFORMANCE ORIENTED OPERATIONAL PLANNING OF EXPRESSWAYS
NASA Astrophysics Data System (ADS)
Mehran, Babak; Nakamura, Hideki
Evaluation of impacts of congestion improvement scheme s on travel time reliability is very significant for road authorities since travel time reliability repr esents operational performance of expressway segments. In this paper, a methodology is presented to estimate travel tim e reliability prior to implementation of congestion relief schemes based on travel time variation modeling as a function of demand, capacity, weather conditions and road accident s. For subject expressway segmen ts, traffic conditions are modeled over a whole year considering demand and capacity as random variables. Patterns of demand and capacity are generated for each five minute interval by appl ying Monte-Carlo simulation technique, and accidents are randomly generated based on a model that links acci dent rate to traffic conditions. A whole year analysis is performed by comparing de mand and available capacity for each scenario and queue length is estimated through shockwave analysis for each time in terval. Travel times are estimated from refined speed-flow relationships developed for intercity expressways and buffer time index is estimated consequently as a measure of travel time reliability. For validation, estimated reliability indices are compared with measured values from empirical data, and it is shown that the proposed method is suitable for operational evaluation and planning purposes.
Cross-organism learning method to discover new gene functionalities.
Domeniconi, Giacomo; Masseroli, Marco; Moro, Gianluca; Pinoli, Pietro
2016-04-01
Knowledge of gene and protein functions is paramount for the understanding of physiological and pathological biological processes, as well as in the development of new drugs and therapies. Analyses for biomedical knowledge discovery greatly benefit from the availability of gene and protein functional feature descriptions expressed through controlled terminologies and ontologies, i.e., of gene and protein biomedical controlled annotations. In the last years, several databases of such annotations have become available; yet, these valuable annotations are incomplete, include errors and only some of them represent highly reliable human curated information. Computational techniques able to reliably predict new gene or protein annotations with an associated likelihood value are thus paramount. Here, we propose a novel cross-organisms learning approach to reliably predict new functionalities for the genes of an organism based on the known controlled annotations of the genes of another, evolutionarily related and better studied, organism. We leverage a new representation of the annotation discovery problem and a random perturbation of the available controlled annotations to allow the application of supervised algorithms to predict with good accuracy unknown gene annotations. Taking advantage of the numerous gene annotations available for a well-studied organism, our cross-organisms learning method creates and trains better prediction models, which can then be applied to predict new gene annotations of a target organism. We tested and compared our method with the equivalent single organism approach on different gene annotation datasets of five evolutionarily related organisms (Homo sapiens, Mus musculus, Bos taurus, Gallus gallus and Dictyostelium discoideum). Results show both the usefulness of the perturbation method of available annotations for better prediction model training and a great improvement of the cross-organism models with respect to the single-organism ones, without influence of the evolutionary distance between the considered organisms. The generated ranked lists of reliably predicted annotations, which describe novel gene functionalities and have an associated likelihood value, are very valuable both to complement available annotations, for better coverage in biomedical knowledge discovery analyses, and to quicken the annotation curation process, by focusing it on the prioritized novel annotations predicted. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José
2013-11-01
To investigate whether introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20≥ 0.90) to be kept as examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011:0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the only crucial parameter to be evaluated allowing comparison. While individual item performance analysis is worthwhile to undertake as secondary analysis, drawing final conclusions seems to be more difficult. Performance parameters need to be related, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. Introduction of negative marking has led to a significant increase in the reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
ETARA PC version 3.3 user's guide: Reliability, availability, maintainability simulation model
NASA Technical Reports Server (NTRS)
Hoffman, David J.; Viterna, Larry A.
1991-01-01
A user's manual describing an interactive, menu-driven, personal computer based Monte Carlo reliability, availability, and maintainability simulation program called event time availability reliability (ETARA) is discussed. Given a reliability block diagram representation of a system, ETARA simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair intervals as a function of exponential and/or Weibull distributions. Availability parameters such as equivalent availability, state availability (percentage of time as a particular output state capability), continuous state duration and number of state occurrences can be calculated. Initial spares allotment and spares replenishment on a resupply cycle can be simulated. The number of block failures are tabulated both individually and by block type, as well as total downtime, repair time, and time waiting for spares. Also, maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can be calculated over a cumulative period of time or at specific points in time.
Validation of A Global Hydrological Model
NASA Astrophysics Data System (ADS)
Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.
Freshwater availability has been recognized as a global issue, and its consistent quan- tification not only in individual river basins but also at the global scale is required to support the sustainable use of water. The Global Hydrology Model WGHM, which is a submodel of the global water use and availability model WaterGAP 2, computes sur- face runoff, groundwater recharge and river discharge at a spatial resolution of 0.5. WGHM is based on the best global data sets currently available, including a newly developed drainage direction map and a data set of wetlands, lakes and reservoirs. It calculates both natural and actual discharge by simulating the reduction of river discharge by human water consumption (as computed by the water use submodel of WaterGAP 2). WGHM is calibrated against observed discharge at 724 gauging sta- tions (representing about 50% of the global land area) by adjusting a parameter of the soil water balance. It not only computes the long-term average water resources but also water availability indicators that take into account the interannual and seasonal variability of runoff and discharge. The reliability of the model results is assessed by comparing observed and simulated discharges at the calibration stations and at se- lected other stations. We conclude that reliable results can be obtained for basins of more than 20,000 km2. In particular, the 90% reliable monthly discharge is simu- lated well. However, there is the tendency that semi-arid and arid basins are modeled less satisfactorily than humid ones, which is partially due to neglecting river channel losses and evaporation of runoff from small ephemeral ponds in the model. Also, the hydrology of highly developed basins with large artificial storages, basin transfers and irrigation schemes cannot be simulated well. The seasonality of discharge in snow- dominated basins is overestimated by WGHM, and if the snow-dominated basin is uncalibrated, discharge is likely to be underestimated due to the precipitation mea- surement errors. Even though the explicit modeling of wetlands and lakes leads to a much improved modeling of both the vertical water balance and the lateral transport of water, not enough information is included in WGHM to accurately capture the hy- drology of these water bodies. Certainly, the reliability of model results is highest at the locations at which WGHM was calibrated. The validation indicates that reliability for cells inside calibrated basins is satisfactory if the basin is relatively homogeneous. Analyses of the few available stations outside of calibrated basins indicate a reason- ably high model reliability, particularly in humid regions.
Photovoltaic Module Reliability Workshop 2012: February 28 - March 1, 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, S.
2013-11-01
NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.
Toro A, Richard; Campos, Claudia; Molina, Carolina; Morales S, Raul G E; Leiva-Guzmán, Manuel A
2015-09-01
A critical analysis of Chile's National Air Quality Information System (NAQIS) is presented, focusing on particulate matter (PM) measurement. This paper examines the complexity, availability and reliability of monitoring station information, the implementation of control systems, the quality assurance protocols of the monitoring station data and the reliability of the measurement systems in areas highly polluted by particulate matter. From information available on the NAQIS website, it is possible to confirm that the PM2.5 (PM10) data available on the site correspond to 30.8% (69.2%) of the total information available from the monitoring stations. There is a lack of information regarding the measurement systems used to quantify air pollutants, most of the available data registers contain gaps, almost all of the information is categorized as "preliminary information" and neither standard operating procedures (operational and validation) nor assurance audits or quality control of the measurements are reported. In contrast, events that cause saturation of the monitoring detectors located in northern and southern Chile have been observed using beta attenuation monitoring. In these cases, it can only be concluded that the PM content is equal to or greater than the saturation concentration registered by the monitors and that the air quality indexes obtained from these measurements are underestimated. This occurrence has been observed in 12 (20) public and private stations where PM2.5 (PM10) is measured. The shortcomings of the NAQIS data have important repercussions for the conclusions obtained from the data and for how the data are used. However, these issues represent opportunities for improving the system to widen its use, incorporate comparison protocols between equipment, install new stations and standardize the control system and quality assurance. Copyright © 2015 Elsevier Ltd. All rights reserved.
Strategy for continuous improvement in IC manufacturability, yield, and reliability
NASA Astrophysics Data System (ADS)
Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary
1993-01-01
Continual improvements in yield, reliability and manufacturability measure a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability, failed bit map analysis, analytical tools, inline monitoring, cross functional teams and a defect engineering group. The strategy requires the fastest detection, identification and implementation of possible corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. Payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.
Reliability and Availability Evaluation Program Manual.
1982-11-01
research and development. The manual’s purpose was to provide a practical method for making reliability measurements, measurements directly related to... Research , Development, Test and Evaluation. RMA Reliability, Maintainability and Availability. R&R Repair and Refurbishment, Repair and Replacement, etc...length. phenomena such as mechanical wear and A number of researchers in the reliability chemical deterioration. Maintenance should field 14-pages 402
Design of fuel cell powered data centers for sufficient reliability and availability
NASA Astrophysics Data System (ADS)
Ritchie, Alexa J.; Brouwer, Jacob
2018-04-01
It is challenging to design a sufficiently reliable fuel cell electrical system for use in data centers, which require 99.9999% uptime. Such a system could lower emissions and increase data center efficiency, but the reliability and availability of such a system must be analyzed and understood. Currently, extensive backup equipment is used to ensure electricity availability. The proposed design alternative uses multiple fuel cell systems each supporting a small number of servers to eliminate backup power equipment provided the fuel cell design has sufficient reliability and availability. Potential system designs are explored for the entire data center and for individual fuel cells. Reliability block diagram analysis of the fuel cell systems was accomplished to understand the reliability of the systems without repair or redundant technologies. From this analysis, it was apparent that redundant components would be necessary. A program was written in MATLAB to show that the desired system reliability could be achieved by a combination of parallel components, regardless of the number of additional components needed. Having shown that the desired reliability was achievable through some combination of components, a dynamic programming analysis was undertaken to assess the ideal allocation of parallel components.
Cooper, Darren; Bevins, Joe; Corbett, Mark
2018-01-13
This technical note details the stages taken to create an instrumented hydraulic treatment plinth for the measurement of applied forces in the vertical axis. The modification used a widely available low-cost peripheral gaming device and required only basic construction and computer skills. The instrumented treatment plinth was validated against a laboratory grade force platform across a range of applied masses from 0.5-15 kg, mock Gr I-IV vertebral mobilisations and a dynamic response test. Intraclass correlation coefficients demonstrated poor reliability (0.46) for low masses of 0.5 kg improving to excellent for larger masses up to15 kg respectively; excellent to good reliability (0.97-0.86) for the mock mobilisations and moderate reliability (0.51) for the dynamic response test. The study demonstrates how a cheap peripheral gaming device can be repurposed so that forces applied to a hydraulic treatment plinth can be collected reliably when applied in a clinically reasoned manner. Copyright © 2018 Elsevier Ltd. All rights reserved.
Assuring reliability program effectiveness.
NASA Technical Reports Server (NTRS)
Ball, L. W.
1973-01-01
An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.
Quality of Information Approach to Improving Source Selection in Tactical Networks
2017-02-01
consider the performance of this process based on metrics relating to quality of information: accuracy, timeliness, completeness and reliability. These...that are indicators of that the network is meeting these quality requirements. We study effective data rate, social distance, link integrity and the...utility of information as metrics within a multi-genre network to determine the quality of information of its available sources. This paper proposes a
Multiple hot-deck imputation for network inference from RNA sequencing data.
Imbert, Alyssa; Valsesia, Armand; Le Gall, Caroline; Armenise, Claudia; Lefebvre, Gregory; Gourraud, Pierre-Antoine; Viguerie, Nathalie; Villa-Vialaneix, Nathalie
2018-05-15
Network inference provides a global view of the relations existing between gene expression in a given transcriptomic experiment (often only for a restricted list of chosen genes). However, it is still a challenging problem: even if the cost of sequencing techniques has decreased over the last years, the number of samples in a given experiment is still (very) small compared to the number of genes. We propose a method to increase the reliability of the inference when RNA-seq expression data have been measured together with an auxiliary dataset that can provide external information on gene expression similarity between samples. Our statistical approach, hd-MI, is based on imputation for samples without available RNA-seq data that are considered as missing data but are observed on the secondary dataset. hd-MI can improve the reliability of the inference for missing rates up to 30% and provides more stable networks with a smaller number of false positive edges. On a biological point of view, hd-MI was also found relevant to infer networks from RNA-seq data acquired in adipose tissue during a nutritional intervention in obese individuals. In these networks, novel links between genes were highlighted, as well as an improved comparability between the two steps of the nutritional intervention. Software and sample data are available as an R package, RNAseqNet, that can be downloaded from the Comprehensive R Archive Network (CRAN). alyssa.imbert@inra.fr or nathalie.villa-vialaneix@inra.fr. Supplementary data are available at Bioinformatics online.
NASA Astrophysics Data System (ADS)
Delaney, C.; Hartman, R. K.; Mendoza, J.; Whitin, B.
2017-12-01
Forecast informed reservoir operations (FIRO) is a methodology that incorporates short to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach of FIRO that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate the viability of the EFO alternative to improve water supply reliability but not increase downstream flood risk. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC. The ESP hindcast was developed using Global Ensemble Forecast System version 10 precipitation reforecasts processed with the Hydrologic Ensemble Forecast System to generate daily reforecasts of 61 flow ensemble members for a 15-day forecast horizon. Model simulation results demonstrate that the EFO alternative may improve water supply reliability for Lake Mendocino yet not increase flood risk for downstream areas. The developed operations framework can directly leverage improved skill in the second week of the forecast and is extendable into the S2S time domain given the demonstration of improved skill through a reliable reforecast of adequate historical duration and consistent with operationally available numerical weather predictions.
PDT and emerging therapies for Actinic Keratosis-A resource letter.
Filho, José D Vollet; Andrade, Cintia T; Buzza, Hilde H; Blanco, Kate; Carbinatto, Fernanda; Bagnato, Vanderlei S; Allison, Ron R
2017-03-01
Aktinic Keratosis is common and if left untreated may develop into life threatening squamous cell carcinoma. Therefore early intervention is the standard of care. While many treatments are available PDT continues to move to the for - front for this indication (Brito et al., 2016 [31]). Topical PS is commercially available that are able to reliably ablate these lesions. Innovative protocols including sunlight, large volume LED arrays and maneuvers to improve treatment parameters and cosmesis continue to make this a worldwide treatment of choice for AK. Copyright © 2016 Elsevier B.V. All rights reserved.
The Final Kepler Planet Candidate Catalog (DR25)
NASA Astrophysics Data System (ADS)
Coughlin, Jeffrey; Thompson, Susan E.; Kepler Team
2017-06-01
We present Kepler's final planet candidate catalog, which is based on the Q1--Q17 DR25 data release and was created to allow for accurate calculations of planetary occurrence rates. We discuss improvements made to our fully automated candidate vetting procedure, which yields specific categories of false positives and a disposition score value to indicate decision confidence. We present the use of light curve inversion and scrambling, in addition to our continued use of pixel-level transit injection, to produce artificial planet candidates and false positives. Since these simulated data sets were subjected to the same automated vetting procedure as the real data set, we are able to measure both the completeness and reliability of the catalog. The DR25 catalog, source code, and a multitude of completeness and reliability data products are available at the Exoplanet Archive (http://exoplanetarchive.ipac.caltech.edu). The DR25 light curves and pixel-level data are available at MAST (http://archive.stsci.edu/kepler).
Product component genealogy modeling and field-failure prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, Caleb; Hong, Yili; Meeker, William Q.
Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can bemore » achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.« less
Product component genealogy modeling and field-failure prediction
King, Caleb; Hong, Yili; Meeker, William Q.
2016-04-13
Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can bemore » achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.« less
EUV Irradiance Inputs to Thermospheric Density Models: Open Issues and Path Forward
NASA Astrophysics Data System (ADS)
Vourlidas, A.; Bruinsma, S.
2018-01-01
One of the objectives of the NASA Living With a Star Institute on "Nowcasting of Atmospheric Drag for low Earth orbit (LEO) Spacecraft" was to investigate whether and how to increase the accuracy of atmospheric drag models by improving the quality of the solar forcing inputs, namely, extreme ultraviolet (EUV) irradiance information. In this focused review, we examine the status of and issues with EUV measurements and proxies, discuss recent promising developments, and suggest a number of ways to improve the reliability, availability, and forecast accuracy of EUV measurements in the next solar cycle.
Mobile Apps for the Dietary Approaches to Stop Hypertension (DASH): App Quality Evaluation.
DiFilippo, Kristen Nicole; Huang, Wen-Hao David; Chapman-Novakofski, Karen M
2018-03-08
To identify the availability and quality of apps supporting Dietary Approaches to Stop Hypertension (DASH) education. The researchers identified DASH apps over 1 month in the Apple App Store. Five registered dietitians used the App Quality Evaluation (AQEL) to evaluate app quality on 7 domains. Interrater reliability was tested using intraclass correlations. One paid and 3 free DASH apps were evaluated. Interrater reliability (n = 5) was good for 3 apps and fair for 1 app. Only the paid app scored high (>8 of 10) on most AQEL quality domains. Based on lower quality found among the included free apps, further development of free apps is warranted. Whereas the paid app may be useful in supporting DASH education, future research should determine whether improvements in clinical outcomes are found and whether this app should be improved to address AQEL domains better. Copyright © 2018 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Integrating Machine Learning into a Crowdsourced Model for Earthquake-Induced Damage Assessment
NASA Technical Reports Server (NTRS)
Rebbapragada, Umaa; Oommen, Thomas
2011-01-01
On January 12th, 2010, a catastrophic 7.0M earthquake devastated the country of Haiti. In the aftermath of an earthquake, it is important to rapidly assess damaged areas in order to mobilize the appropriate resources. The Haiti damage assessment effort introduced a promising model that uses crowdsourcing to map damaged areas in freely available remotely-sensed data. This paper proposes the application of machine learning methods to improve this model. Specifically, we apply work on learning from multiple, imperfect experts to the assessment of volunteer reliability, and propose the use of image segmentation to automate the detection of damaged areas. We wrap both tasks in an active learning framework in order to shift volunteer effort from mapping a full catalog of images to the generation of high-quality training data. We hypothesize that the integration of machine learning into this model improves its reliability, maintains the speed of damage assessment, and allows the model to scale to higher data volumes.
Integrated orbital servicing and payloads study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1975-01-01
A study is summarized in which a comparison was made of the following modes of maintaining a satellite system: (1) expendable mode in which failed satellites are replaced, (2) on-orbit servicing where a satellite can be fixed by unmanned module exchange in space, and (3) ground refurbishment in which the satellite is brought back to ground for repairs. It was concluded that on-orbit maintenance is the most cost-effective mode and that it is technically feasible. It can be used to repair failed satellites, to improve reliability of operating satellites, and to update equipment. On-orbit servicing can increase program flexibility and satellite reliability, lifetime, and availability. The significant conclusions and results of two studies are summarized.
Reliability of the quench protection system for the LHC superconducting elements
NASA Astrophysics Data System (ADS)
Vergara Fernández, A.; Rodríguez-Mateos, F.
2004-06-01
The Quench Protection System (QPS) is the sole system in the Large Hadron Collider machine monitoring the signals from the superconducting elements (bus bars, current leads, magnets) which form the cold part of the electrical circuits. The basic functions to be accomplished by the QPS during the machine operation will be briefly presented. With more than 4000 internal trigger channels (quench detectors and others), the final QPS design is the result of an optimised balance between on-demand availability and false quench reliability. The built-in redundancy for the different equipment will be presented, focusing on the calculated, expected number of missed quenches and false quenches. Maintenance strategies in order to improve the performance over the years of operation will be addressed.
Capacity and reliability analyses with applications to power quality
NASA Astrophysics Data System (ADS)
Azam, Mohammad; Tu, Fang; Shlapak, Yuri; Kirubarajan, Thiagalingam; Pattipati, Krishna R.; Karanam, Rajaiah
2001-07-01
The deregulation of energy markets, the ongoing advances in communication networks, the proliferation of intelligent metering and protective power devices, and the standardization of software/hardware interfaces are creating a dramatic shift in the way facilities acquire and utilize information about their power usage. The currently available power management systems gather a vast amount of information in the form of power usage, voltages, currents, and their time-dependent waveforms from a variety of devices (for example, circuit breakers, transformers, energy and power quality meters, protective relays, programmable logic controllers, motor control centers). What is lacking is an information processing and decision support infrastructure to harness this voluminous information into usable operational and management knowledge to handle the health of their equipment and power quality, minimize downtime and outages, and to optimize operations to improve productivity. This paper considers the problem of evaluating the capacity and reliability analyses of power systems with very high availability requirements (e.g., systems providing energy to data centers and communication networks with desired availability of up to 0.9999999). The real-time capacity and margin analysis helps operators to plan for additional loads and to schedule repair/replacement activities. The reliability analysis, based on computationally efficient sum of disjoint products, enables analysts to decide the optimum levels of redundancy, aids operators in prioritizing the maintenance options for a given budget and monitoring the system for capacity margin. The resulting analytical and software tool is demonstrated on a sample data center.
HiRel - Reliability/availability integrated workstation tool
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Dugan, Joanne B.
1992-01-01
The HiRel software tool is described and demonstrated by application to the mission avionics subsystem of the Advanced System Integration Demonstrations (ASID) system that utilizes the PAVE PILLAR approach. HiRel marks another accomplishment toward the goal of producing a totally integrated computer-aided design (CAD) workstation design capability. Since a reliability engineer generally represents a reliability model graphically before it can be solved, the use of a graphical input description language increases productivity and decreases the incidence of error. The graphical postprocessor module HARPO makes it possible for reliability engineers to quickly analyze huge amounts of reliability/availability data to observe trends due to exploratory design changes. The addition of several powerful HARP modeling engines provides the user with a reliability/availability modeling capability for a wide range of system applications all integrated under a common interactive graphical input-output capability.
Improving reliability of a residency interview process.
Peeters, Michael J; Serres, Michelle L; Gundrum, Todd E
2013-10-14
To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. In phase 1 of the study, authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually-evaluated each candidate in one-on-one interviews. In phase 1, the evaluation form had a reliability of 0.98 with person separation of 6.56; reproducibly, the form separated applicants into 6 distinct groups. Using that form in phase 2 and 3, our largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while confirmatory phase 3 was 0.922. Process reliability improved with more stations despite fewer interviewers per station-impact of content specificity was greatly reduced with more interview stations. A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact from content specificity.
Improving the Test-Retest Reliability of Resting State fMRI by Removing the Impact of Sleep.
Wang, Jiahui; Han, Junwei; Nguyen, Vinh T; Guo, Lei; Guo, Christine C
2017-01-01
Resting state functional magnetic resonance imaging (rs-fMRI) provides a powerful tool to examine large-scale neural networks in the human brain and their disturbances in neuropsychiatric disorders. Thanks to its low demand and high tolerance, resting state paradigms can be easily acquired from clinical population. However, due to the unconstrained nature, resting state paradigm is associated with excessive head movement and proneness to sleep. Consequently, the test-retest reliability of rs-fMRI measures is moderate at best, falling short of widespread use in the clinic. Here, we characterized the effect of sleep on the test-retest reliability of rs-fMRI. Using measures of heart rate variability (HRV) derived from simultaneous electrocardiogram (ECG) recording, we identified portions of fMRI data when subjects were more alert or sleepy, and examined their effects on the test-retest reliability of functional connectivity measures. When volumes of sleep were excluded, the reliability of rs-fMRI is significantly improved, and the improvement appears to be general across brain networks. The amount of improvement is robust with the removal of as much as 60% volumes of sleepiness. Therefore, test-retest reliability of rs-fMRI is affected by sleep and could be improved by excluding volumes of sleepiness as indexed by HRV. Our results suggest a novel and practical method to improve test-retest reliability of rs-fMRI measures.
Tsai, Alexander C; Kakuhikire, Bernard; Mushavi, Rumbidzai; Vořechovská, Dagmar; Perkins, Jessica M; McDonough, Amy Q; Bangsberg, David R
2016-04-01
Hundreds of millions of people worldwide lack adequate access to water. Water insecurity, which is defined as having limited or uncertain availability of safe water or the ability to acquire safe water in socially acceptable ways, is typically overlooked by development organizations focusing on water availability. To address the urgent need in the literature for validated measures of water insecurity, we conducted a population-based study in rural Uganda with 327 reproductive-age women and 204 linked men from the same households. We used a novel method of photo identification so that we could accurately elicit study participants' primary household water sources, thereby enabling us to identify water sources for objective water quality testing and distance/elevation measurement. Our psychometric analyses provided strong evidence of the internal structure, reliability, and validity of a new eight-item Household Water Insecurity Access Scale (HWIAS). Important intra-household gender differences in perceptions of water insecurity were observed, with men generally perceiving household water insecurity as being less severe compared to women. In summary, the HWIAS represents a reliable and valid measure of water insecurity, particularly among women, and may be useful for informing and evaluating interventions to improve water access in resource-limited settings.
Tsai, Alexander C.; Kakuhikire, Bernard; Mushavi, Rumbidzai; Vořechovská, Dagmar; Perkins, Jessica M.; McDonough, Amy Q.; Bangsberg, David R.
2015-01-01
Hundreds of millions of persons worldwide lack adequate access to water. Water insecurity, which is defined as having limited or uncertain availability of safe water or the ability to acquire safe water in socially acceptable ways, is typically overlooked by development organizations focusing on water availability. To address the urgent need in the literature for validated measures of water insecurity, we conducted a population-based study in rural Uganda with 327 reproductive-age women and 204 linked men from the same households. We used a novel method of photo identification so that we could accurately elicit study participants’ primary household water sources, thereby enabling us to identify water sources for objective water quality testing and distance/elevation measurement. Our psychometric analyses provided strong evidence of the internal structure, reliability, and validity of a new 8-item Household Water Insecurity Access Scale. Important intra-household gender differences in perceptions of water insecurity were observed, with men generally perceiving household water insecurity as being less severe compared to women. In summary, the Household Water Insecurity Access Scale represents a reliable and valid measure of water insecurity, particularly among women, and may be useful for informing and evaluating interventions to improve water access in resource limited settings. PMID:27105413
Legal issues concerning electronic health information: privacy, quality, and liability.
Hodge, J G; Gostin, L O; Jacobson, P D
1999-10-20
Personally identifiable health information about individuals and general medical information is increasingly available in electronic form in health databases and through online networks. The proliferation of electronic data within the modern health information infrastructure presents significant benefits for medical providers and patients, including enhanced patient autonomy, improved clinical treatment, advances in health research and public health surveillance, and modern security techniques. However, it also presents new legal challenges in 3 interconnected areas: privacy of identifiable health information, reliability and quality of health data, and tortbased liability. Protecting health information privacy (by giving individuals control over health data without severely restricting warranted communal uses) directly improves the quality and reliability of health data (by encouraging individual uses of health services and communal uses of data), which diminishes tort-based liabilities (by reducing instances of medical malpractice or privacy invasions through improvements in the delivery of health care services resulting in part from better quality and reliability of clinical and research data). Following an analysis of the interconnectivity of these 3 areas and discussing existing and proposed health information privacy laws, recommendations for legal reform concerning health information privacy are presented. These include (1) recognizing identifiable health information as highly sensitive, (2) providing privacy safeguards based on fair information practices, (3) empowering patients with information and rights to consent to disclosure (4) limiting disclosures of health data absent consent, (5) incorporating industry-wide security protections, (6) establishing a national data protection authority, and (7) providing a national minimal level of privacy protections.
Komal
2018-05-01
Nowadays power consumption is increasing day-by-day. To fulfill failure free power requirement, planning and implementation of an effective and reliable power management system is essential. Phasor measurement unit(PMU) is one of the key device in wide area measurement and control systems. The reliable performance of PMU assures failure free power supply for any power system. So, the purpose of the present study is to analyse the reliability of a PMU used for controllability and observability of power systems utilizing available uncertain data. In this paper, a generalized fuzzy lambda-tau (GFLT) technique has been proposed for this purpose. In GFLT, system components' uncertain failure and repair rates are fuzzified using fuzzy numbers having different shapes such as triangular, normal, cauchy, sharp gamma and trapezoidal. To select a suitable fuzzy number for quantifying data uncertainty, system experts' opinion have been considered. The GFLT technique applies fault tree, lambda-tau method, fuzzified data using different membership functions, alpha-cut based fuzzy arithmetic operations to compute some important reliability indices. Furthermore, in this study ranking of critical components of the system using RAM-Index and sensitivity analysis have also been performed. The developed technique may be helpful to improve system performance significantly and can be applied to analyse fuzzy reliability of other engineering systems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Improved distances and ages for stars common to TGAS and RAVE
NASA Astrophysics Data System (ADS)
McMillan, Paul J.; Kordopatis, Georges; Kunder, Andrea; Binney, James; Wojno, Jennifer; Zwitter, Tomaž; Steinmetz, Matthias; Bland-Hawthorn, Joss; Gibson, Brad K.; Gilmore, Gerard; Grebel, Eva K.; Helmi, Amina; Munari, Ulisse; Navarro, Julio F.; Parker, Quentin A.; Seabroke, George; Watson, Fred; Wyse, Rosemary F. G.
2018-04-01
We combine parallaxes from the first Gaia data release with the spectrophotometric distance estimation framework for stars in the fifth RAVE survey data release. The combined distance estimates are more accurate than either determination in isolation - uncertainties are on average two times smaller than for RAVE-only distances (three times smaller for dwarfs), and 1.4 times smaller than TGAS parallax uncertainties (two times smaller for giants). We are also able to compare the estimates from spectrophotometry to those from Gaia, and use this to assess the reliability of both catalogues and improve our distance estimates. We find that the distances to the lowest log g stars are, on average, overestimated and caution that they may not be reliable. We also find that it is likely that the Gaia random uncertainties are smaller than the reported values. As a byproduct we derive ages for the RAVE stars, many with relative uncertainties less than 20 percent. These results for 219 566 RAVE sources have been made publicly available, and we encourage their use for studies that combine the radial velocities provided by RAVE with the proper motions provided by Gaia. A sample that we believe to be reliable can be found by taking only the stars with the flag notification `flag_any=0'.
Improved distances and ages for stars common to TGAS and RAVE
NASA Astrophysics Data System (ADS)
McMillan, Paul J.; Kordopatis, Georges; Kunder, Andrea; Binney, James; Wojno, Jennifer; Zwitter, Tomaž; Steinmetz, Matthias; Bland-Hawthorn, Joss; Gibson, Brad K.; Gilmore, Gerard; Grebel, Eva K.; Helmi, Amina; Munari, Ulisse; Navarro, Julio F.; Parker, Quentin A.; Seabroke, George; Watson, Fred; Wyse, Rosemary F. G.
2018-07-01
We combine parallaxes from the first Gaia data release with the spectrophotometric distance estimation framework for stars in the fifth RAVE survey data release. The combined distance estimates are more accurate than either determination in isolation - uncertainties are on average two times smaller than for RAVE-only distances (three times smaller for dwarfs), and 1.4 times smaller than TGAS parallax uncertainties (two times smaller for giants). We are also able to compare the estimates from spectrophotometry to those from Gaia, and use this to assess the reliability of both catalogues and improve our distance estimates. We find that the distances to the lowest log g stars are, on average, overestimated and caution that they may not be reliable. We also find that it is likely that the Gaia random uncertainties are smaller than the reported values. As a by-product we derive ages for the RAVE stars, many with relative uncertainties less than 20 per cent. These results for 219 566 RAVE sources have been made publicly available, and we encourage their use for studies that combine the radial velocities provided by RAVE with the proper motions provided by Gaia. A sample that we believe to be reliable can be found by taking only the stars with the flag notification `flag_any=0'.
NASA Astrophysics Data System (ADS)
Kurnosov, R. Yu; Chernyshova, T. I.; Chernyshov, V. N.
2018-05-01
The algorithms for improving the metrological reliability of analogue blocks of measuring channels and information-measuring systems are developed. The proposed algorithms ensure the optimum values of their metrological reliability indices for a given analogue circuit block solution.
NASA Astrophysics Data System (ADS)
Watkins, M.; Busby, R.; Rico, H.; Johnson, M.; Hauksson, E.
2003-12-01
We provide enhanced network robustness by apportioning redundant data communications paths for seismic stations in the field. By providing for more than one telemetry route, either physical or logical, network operators can improve availability of seismic data while experiencing occasional network outages, and also during the loss of key gateway interfaces such as a router or central processor. This is especially important for seismic stations in sparsely populated regions where a loss of a single site may result in a significant gap in the network's monitoring capability. A number of challenges arise in the application of a circuit-detour mechanism. One requirement is that it fits well within the existing framework of our real-time system processing. It is also necessary to craft a system that is not needlessly complex to maintain or implement, particularly during a crisis. The method that we use for circuit-detours does not require the reconfiguration of dataloggers or communications equipment in the field. Remote network configurations remain static, changes are only required at the central site. We have implemented standardized procedures to detour circuits on similar transport mediums, such as virtual circuits on the same leased line; as well as physically different communications pathways, such as a microwave link backed up by a leased line. The lessons learned from these improvements in reliability, and optimization efforts could be applied to other real-time seismic networks. A fundamental tenant of most seismic networks is that they are reliable and have a high percentage of real-time data availability. A reasonable way to achieve these expectations is to provide alternate means of delivering data to the central processing sites, with a simple method for utilizing these alternate paths.
A meta-data based method for DNA microarray imputation.
Jörnsten, Rebecka; Ouyang, Ming; Wang, Hui-Yu
2007-03-29
DNA microarray experiments are conducted in logical sets, such as time course profiling after a treatment is applied to the samples, or comparisons of the samples under two or more conditions. Due to cost and design constraints of spotted cDNA microarray experiments, each logical set commonly includes only a small number of replicates per condition. Despite the vast improvement of the microarray technology in recent years, missing values are prevalent. Intuitively, imputation of missing values is best done using many replicates within the same logical set. In practice, there are few replicates and thus reliable imputation within logical sets is difficult. However, it is in the case of few replicates that the presence of missing values, and how they are imputed, can have the most profound impact on the outcome of downstream analyses (e.g. significance analysis and clustering). This study explores the feasibility of imputation across logical sets, using the vast amount of publicly available microarray data to improve imputation reliability in the small sample size setting. We download all cDNA microarray data of Saccharomyces cerevisiae, Arabidopsis thaliana, and Caenorhabditis elegans from the Stanford Microarray Database. Through cross-validation and simulation, we find that, for all three species, our proposed imputation using data from public databases is far superior to imputation within a logical set, sometimes to an astonishing degree. Furthermore, the imputation root mean square error for significant genes is generally a lot less than that of non-significant ones. Since downstream analysis of significant genes, such as clustering and network analysis, can be very sensitive to small perturbations of estimated gene effects, it is highly recommended that researchers apply reliable data imputation prior to further analysis. Our method can also be applied to cDNA microarray experiments from other species, provided good reference data are available.
NASA Technical Reports Server (NTRS)
Sproles, Darrell W.; Bavuso, Salvatore J.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.
ERIC Educational Resources Information Center
Amrein-Beardsley, Audrey; Collins, Clarin
2012-01-01
The SAS Educational Value-Added Assessment System (SAS[R] EVAAS[R]) is the most widely used value-added system in the country. It is also self-proclaimed as "the most robust and reliable" system available, with its greatest benefit to help educators improve their teaching practices. This study critically examined the effects of SAS[R] EVAAS[R] as…
Defense AT&L (Volume 36, Number 4, July-August 2007)
2007-08-01
clear which organizational unit actually had attained the maturity rating in a large corporation . The ADS will now list specific projects, organizational...founder and CEO of BNet Corporation , which provides wireless solutions for real-time asset visibility. Steffen is a retired Navy Supply Corps captain and...Falcon Flex: Turning Maintenance Information into Air Power Kevin J. Berk “Can we improve the reliability and availability of F-16 avionics while
Fiber Access Networks: Reliability Analysis and Swedish Broadband Market
NASA Astrophysics Data System (ADS)
Wosinska, Lena; Chen, Jiajia; Larsen, Claus Popp
Fiber access network architectures such as active optical networks (AONs) and passive optical networks (PONs) have been developed to support the growing bandwidth demand. Whereas particularly Swedish operators prefer AON, this may not be the case for operators in other countries. The choice depends on a combination of technical requirements, practical constraints, business models, and cost. Due to the increasing importance of reliable access to the network services, connection availability is becoming one of the most crucial issues for access networks, which should be reflected in the network owner's architecture decision. In many cases protection against failures is realized by adding backup resources. However, there is a trade off between the cost of protection and the level of service reliability since improving reliability performance by duplication of network resources (and capital expenditures CAPEX) may be too expensive. In this paper we present the evolution of fiber access networks and compare reliability performance in relation to investment and management cost for some representative cases. We consider both standard and novel architectures for deployment in both sparsely and densely populated areas. While some recent works focused on PON protection schemes with reduced CAPEX the current and future effort should be put on minimizing the operational expenditures (OPEX) during the access network lifetime.
Pirkle, Catherine M; Dumont, Alexandre; Traore, Mamadou; Zunzunegui, Maria-Victoria
2012-10-29
In Mali and Senegal, over 1% of women die giving birth in hospital. At some hospitals, over a third of infants are stillborn. Many deaths are due to substandard medical practices. Criterion-based clinical audits (CBCA) are increasingly used to measure and improve obstetrical care in resource-limited settings, but their measurement properties have not been formally evaluated. In 2011, we published a systematic review of obstetrical CBCA highlighting insufficient considerations of validity and reliability. The objective of this study is to develop an obstetrical CBCA adapted to the West African context and assess its reliability and validity. This work was conducted as a sub-study within a cluster randomized trial known as QUARITE. Criteria were selected based on extensive literature review and expert opinion. Early 2010, two auditors applied the CBCA to identical samples at 8 sites in Mali and Senegal (n = 185) to evaluate inter-rater reliability. In 2010-11, we conducted CBCA at 32 hospitals to assess construct validity (n = 633 patients). We correlated hospital characteristics (resource availability, facility perinatal and maternal mortality) with mean hospital CBCA scores. We used generalized estimating equations to assess whether patient CBCA scores were associated with perinatal mortality. Results demonstrate substantial (ICC = 0.67, 95% CI 0.54; 0.76) to elevated inter-rater reliability (ICC = 0.84, 95% CI 0.77; 0.89) in Senegal and Mali, respectively. Resource availability positively correlated with mean hospital CBCA scores and maternal and perinatal mortality were inversely correlated with hospital CBCA scores. Poor CBCA scores, adjusted for hospital and patient characteristics, were significantly associated with perinatal mortality (OR 1.84, 95% CI 1.01-3.34). Our CBCA has substantial inter-rater reliability and there is compelling evidence of its validity as the tool performs according to theory. Current Controlled Trials ISRCTN46950658.
NASA Astrophysics Data System (ADS)
Ralph, F. M.; Jasperse, J.
2017-12-01
Forecast Informed Reservoir Operations (FIRO) is a proposed strategy that is exploring inorporation of improved hydrometeorological forecasts of land-falling atmospheric rivers on the U.S. West Coast into reservoir operations. The first testbed for this strategy is Lake Mendocino, which is located in the East Fork of the 1485 mi2 Russian River Watershed in northern California. This project is guided by the Lake Mendocino FIRO Steering Committee (SC). The SC is an ad hoc committee that consists of water managers and scientists from several federal, state, and local agencies, and universities who have teamed to evaluate whether current or improved technology and scientific understanding can be utilized to improve water supply reliability, enhance flood mitigation and support recovery of listed salmon for the Russian River of northern California. In 2015, the SC created a detailed work plan, which included a Preliminary Viability Assessment, which has now been completed. The SC developed a vision that operational efficiency would be improved by using forecasts to inform decisions about releasing or storing water. FIRO would use available reservoir storage in an efficient manner by (1) better forecasting inflow (or lack of inflow) with enhanced technology, and (2) adapting operation in real time to meet the need for storage, rather than making storage available just in case it is needed. The envisioned FIRO strategy has the potential to simultaneously improve water supply reliability, flood protection, and ecosystem outcomes through a more efficient use of existing infrastructure while requiring minimal capital improvements in the physical structure of the dam. This presentation will provide an overview of the creation of the FIRO SC and how it operates, and describes the lessons learned through this partnership. Results in the FIRO Preliminary Viability Assessment will be summarized and next steps described.
van Kerkhof, Linda Wilhelmina Maria; van der Laar, Catharina Walthera Egbertha; de Jong, Charlie; Weda, Marjolein; Hegger, Ingrid
2016-04-06
In the past years, an enormous increase in the number of available health-related applications (apps) has occurred, from approximately 5800 in 2011 to over 23,000 in 2013, in the iTunes store. However, little is still known regarding the use, possible effectiveness, and risks of these applications. In this study, we focused on apps and other e-tools related to medicine use. A large subset of the general population uses medicines and might benefit from tools that aid in the use of medicine. The aim of the present study was to gain more insight into the characteristics, possible risks, and possible benefits of health apps and e-tools related to medication use. We first made an inventory of apps and other e-tools for medication use (n=116). Tools were coded by two independent researchers, based on the information available in the app stores and websites. Subsequently, for one type of often downloaded apps (aimed at people with diabetes), we investigated users' experiences using an online questionnaire. Results of the inventory show that many apps for medication use are available and that they mainly offer simple functionalities. In line with this, the most experienced benefit by users of apps for regulating blood glucose levels in the online questionnaire was "information quick and conveniently available". Other often experienced benefits were improving health and self-reliance. Results of the inventory show that a minority of the apps for medication use has potentially high risks and for many of the apps it is unclear whether and how personal data are stored. In contrast, online questionnaire among users of apps for blood glucose regulation indicates that they hardly ever experience problems or doubts considering reliability and/or privacy. Although, respondents do mention to experience disadvantages of use due to incomplete apps and apps with poor ease of use. Respondents not using app(s) indicate that they might use them in the future if reliability of the apps and instructions on how to use them are more clear. This study shows that for apps and e-tools related to medicine use a small subset of tools might involve relatively high risks. For the large group of nonmedical devices apps, risks are lower, but risks lie in the enormous availability and low levels of regulation. In addition, both users and nonusers indicated that overall quality of apps (ease of use, completeness, good functionalities) is an issue. Considering that important benefits (eg, improving health and self-reliance) are experienced by many of the respondents using apps for regulating blood glucose levels, improving reliability and quality of apps is likely to have many profits. In addition, creating better awareness regarding the existence and how to use apps will likely improve proper use by more people, enhancing the profits of these tools.
Navy applications experience with small wind power systems
NASA Astrophysics Data System (ADS)
Pal, D.
1985-05-01
This report describes the experience gained and lesson learned from the ongoing field evaluations of seven small, 2-to 20-kW wind energy conversion systems (WECS) at Navy installations located in the Southern California desert, on San Nicolas Island, in California, and in Kaneohe Bay, Hawaii. The field tests show that the WECS's bearings and yaw slip-rings are prone to failure. The failures were attributed to the corrosive environment and poor design practices. Based upon the field tests, it is concluded that a reliable WECS must use a permanent magnet alternator without a gearbox and yaw slip-rings that are driven by a fixed pitch wind turbine rotor. The present state-of-the-art in small WECS technology, including environmental concerns, is reviewed. Also presented is how the technology is advancing to improve reliability and availability for effectively using wind power at Navy bases. The field evaluations are continuing on the small WECS in order to develop operation, maintenance, and reliability data.
Creating High Reliability in Health Care Organizations
Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth
2006-01-01
Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981
The development of the Nucleus Freedom Cochlear implant system.
Patrick, James F; Busby, Peter A; Gibson, Peter J
2006-12-01
Cochlear Limited (Cochlear) released the fourth-generation cochlear implant system, Nucleus Freedom, in 2005. Freedom is based on 25 years of experience in cochlear implant research and development and incorporates advances in medicine, implantable materials, electronic technology, and sound coding. This article presents the development of Cochlear's implant systems, with an overview of the first 3 generations, and details of the Freedom system: the CI24RE receiver-stimulator, the Contour Advance electrode, the modular Freedom processor, the available speech coding strategies, the input processing options of Smart Sound to improve the signal before coding as electrical signals, and the programming software. Preliminary results from multicenter studies with the Freedom system are reported, demonstrating better levels of performance compared with the previous systems. The final section presents the most recent implant reliability data, with the early findings at 18 months showing improved reliability of the Freedom implant compared with the earlier Nucleus 3 System. Also reported are some of the findings of Cochlear's collaborative research programs to improve recipient outcomes. Included are studies showing the benefits from bilateral implants, electroacoustic stimulation using an ipsilateral and/or contralateral hearing aid, advanced speech coding, and streamlined speech processor programming.
Schmitt, Andreas; Ehrmann, Dominic; Kulzer, Bernhard; Hermanns, Norbert
2017-01-01
Objective Depressive symptoms in people with diabetes are associated with increased risk of adverse outcomes. Although successful psychosocial treatment options are available, little is known about factors that facilitate treatment response for depression in diabetes. This prospective study aims to examine the impact of known risk factors on improvement of depressive symptoms with a special interest in the role of diabetes-related distress. Methods 181 people with diabetes participated in a randomized controlled trial. Diabetes-related distress was assessed using the Problem Areas In Diabetes (PAID) scale; depressive symptoms were assessed using the Center for Epidemiologic Studies Depression (CES-D) scale. Multiple logistic and linear regression analyses were used to assess associations between risk factors for depression (independent variables) and improvement of depressive symptoms (dependent variable). Reliable change indices were established as criteria of meaningful reductions in diabetes distress and depressive symptoms. Results A reliable reduction of diabetes-related distress (15.43 points in the PAID) was significantly associated with fourfold increased odds for reliable improvement of depressive symptoms (OR = 4.25, 95% CI: 2.05–8.79; P<0.001). This result was corroborated using continuous measures of diabetes distress and depressive symptoms, showing that greater reduction of diabetes-related distress independently predicted greater improvement in depressive symptoms (ß = -0.40; P<0.001). Higher age had a positive (Odds Ratio = 2.04, 95% CI: 1.21–3.43; P<0.01) and type 2 diabetes had a negative effect on the meaningful reduction of depressive symptoms (Odds Ratio = 0.12, 95% CI: 0.04–0.35; P<0.001). Conclusions The reduction of diabetes distress is a statistical predictor of improvement of depressive symptoms. Diabetes patients with comorbid depressive symptomatology might benefit from treatments to reduce diabetes-related distress. PMID:28700718
Partido, Brian B; Jones, Archie A; English, Dana L; Nguyen, Carol A; Jacks, Mary E
2015-02-01
Dental and dental hygiene faculty members often do not provide consistent instruction in the clinical environment, especially in tasks requiring clinical judgment. From previous efforts to calibrate faculty members in calculus detection using typodonts, researchers have suggested using human subjects and emerging technology to improve consistency in clinical instruction. The purpose of this pilot study was to determine if a dental endoscopy-assisted training program would improve intra- and interrater reliability of dental hygiene faculty members in calculus detection. Training included an ODU 11/12 explorer, typodonts, and dental endoscopy. A convenience sample of six participants was recruited from the dental hygiene faculty at a California community college, and a two-group randomized experimental design was utilized. Intra- and interrater reliability was measured before and after calibration training. Pretest and posttest Kappa averages of all participants were compared using repeated measures (split-plot) ANOVA to determine the effectiveness of the calibration training on intra- and interrater reliability. The results showed that both kinds of reliability significantly improved for all participants and the training group improved significantly in interrater reliability from pretest to posttest. Calibration training was beneficial to these dental hygiene faculty members, especially those beginning with less than full agreement. This study suggests that calculus detection calibration training utilizing dental endoscopy can effectively improve interrater reliability of dental and dental hygiene clinical educators. Future studies should include human subjects, involve more participants at multiple locations, and determine whether improved rater reliability can be sustained over time.
Improving Reliability of a Residency Interview Process
Serres, Michelle L.; Gundrum, Todd E.
2013-01-01
Objective. To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. Methods. In phase 1 of the study, authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually-evaluated each candidate in one-on-one interviews. Results. In phase 1, the evaluation form had a reliability of 0.98 with person separation of 6.56; reproducibly, the form separated applicants into 6 distinct groups. Using that form in phase 2 and 3, our largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while confirmatory phase 3 was 0.922. Process reliability improved with more stations despite fewer interviewers per station—impact of content specificity was greatly reduced with more interview stations. Conclusion. A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact from content specificity. PMID:24159209
NASA Astrophysics Data System (ADS)
Besnard, Laurent; Blain, Peter; Mancini, Sebastien; Proctor, Roger
2017-04-01
The Integrated Marine Observing System (IMOS) is a national project funded by the Australian government established to deliver ocean observations to the marine and climate science community. Now in its 10th year its mission is to undertake systematic and sustained observations and to turn them into data, products and analyses that can be freely used and reused for broad societal benefits. As IMOS has matured as an observing system expectation on the system's availability and reliability has also increased and IMOS is now seen as delivering 'operational' information. In responding to this expectation, IMOS has relocated its services to the commercial cloud service Amazon Web Services. This has enabled IMOS to improve the system architecture, utilizing more advanced features like object storage (S3 - Simple Storage Service) and autoscaling features, and introducing new checking procedures in a pipeline approach. This has improved data availability and resilience while protecting against human errors in data handling and providing a more efficient ingestion process.
A Closed Network Queue Model of Underground Coal Mining Production, Failure, and Repair
NASA Technical Reports Server (NTRS)
Lohman, G. M.
1978-01-01
Underground coal mining system production, failures, and repair cycles were mathematically modeled as a closed network of two queues in series. The model was designed to better understand the technological constraints on availability of current underground mining systems, and to develop guidelines for estimating the availability of advanced mining systems and their associated needs for spares as well as production and maintenance personnel. It was found that: mine performance is theoretically limited by the maintainability ratio, significant gains in availability appear possible by means of small improvements in the time between failures the number of crews and sections should be properly balanced for any given maintainability ratio, and main haulage systems closest to the mine mouth require the most attention to reliability.
A Report on Simulation-Driven Reliability and Failure Analysis of Large-Scale Storage Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan, Lipeng; Wang, Feiyi; Oral, H. Sarp
High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as, mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as exponential failure rate) to achieve tractable and close-formed solutions. However, such models have been shown to be insufficient in assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storagemore » systems at scale, and investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, failure patterns and propagation, and performs dependency analysis to capture a wide-range of failure cases. We evaluate the framework against a large-scale storage system that is in production and analyze its failure projections toward and beyond the end of lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and present the preliminary results« less
NASA Astrophysics Data System (ADS)
Park, Hokyung; Choi, Rino; Lee, Byoung Hun; Hwang, Hyunsang
2007-09-01
High pressure deuterium annealing on the hot carrier reliability characteristics of HfSiO metal oxide semiconductor field effect transistor (MOSFET) was investigated. Comparing with the conventional forming gas (H2/Ar=10%/96%, 480 °C, 30 min) annealed sample, MOSFET annealed in 5 atm pure deuterium ambient at 400 °C showed the improvement of linear drain current, reduction of interface trap density, and improvement of the hot carrier reliability characteristics. These improvements can be attributed to the effective passivation of the interface trap site after high pressure annealing and heavy mass effect of deuterium. These results indicate that high pressure pure deuterium annealing can be a promising process for improving device performance as well as hot carrier reliability, together.
How to survive the medical misinformation mess.
Ioannidis, John P A; Stuart, Michael E; Brownlee, Shannon; Strite, Sheri A
2017-11-01
Most physicians and other healthcare professionals are unaware of the pervasiveness of poor quality clinical evidence that contributes considerably to overuse, underuse, avoidable adverse events, missed opportunities for right care and wasted healthcare resources. The Medical Misinformation Mess comprises four key problems. First, much published medical research is not reliable or is of uncertain reliability, offers no benefit to patients, or is not useful to decision makers. Second, most healthcare professionals are not aware of this problem. Third, they also lack the skills necessary to evaluate the reliability and usefulness of medical evidence. Finally, patients and families frequently lack relevant, accurate medical evidence and skilled guidance at the time of medical decision-making. Increasing the reliability of available, published evidence may not be an imminently reachable goal. Therefore, efforts should focus on making healthcare professionals, more sensitive to the limitations of the evidence, training them to do critical appraisal, and enhancing their communication skills so that they can effectively summarize and discuss medical evidence with patients to improve decision-making. Similar efforts may need to target also patients, journalists, policy makers, the lay public and other healthcare stakeholders. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
Power transfer systems for future navy helicopters. Final report 25 Jun 70--28 Jun 72
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bossler, R.B. Jr.
1972-11-01
The purpose of this program was to conduct an analysis of helicopter power transfer systems (pts), both conventional and advanced concept type, with the objective of reducing specific weights and improving reliability beyond present values. The analysis satisfied requirements specified for a 200,000 pound cargo transport helicopter (CTH), a 70,000 pound heavy assault helicopter, and a 15,000 pound non-combat search and rescue helicopter. Four selected gearing systems (out of seven studied), optimized for lightest weight and equal reliability for the CTH, using component proportioning via stress and stiffness equations, had no significant difference between their aircraft payloads. All optimized ptsmore » were approximately 70% of statistically predicted weight. Reliability increase is predicted via gearbox derating using Weibull relationships. Among advanced concepts, the Turbine Integrated Geared Rotor was competitive for weight, technology availability and reliability increase but handicapped by a special engine requirement. The warm cycle system was found not competitive. Helicopter parametric weight analysis is shown. Advanced development Plans are presented for the pts for the CTH, including total pts system, selected pts components, and scale model flight testing in a Kaman HH2 helicopter.« less
Beyond reliability to profitability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond, T.H.; Mitchell, J.S.
1996-07-01
Reliability concerns have controlled much of power generation design and operations. Emerging from a strictly regulated environment, profitability is becoming a much more important concept for today`s power generation executives. This paper discusses the conceptual advance-view power plant maintenance as a profit center, go beyond reliability, and embrace profitability. Profit Centered Maintenance begins with the premise that financial considerations, namely profitability, drive most aspects of modern process and manufacturing operations. Profit Centered Maintenance is a continuous process of reliability and administrative improvement and optimization. For the power generation executives with troublesome maintenance programs, Profit Centered Maintenance can be the blueprintmore » to increased profitability. It requires the culture change to make decisions based on value, to reengineer the administration of maintenance, and to enable the people performing and administering maintenance to make the most of available maintenance information technology. The key steps are to optimize the physical function of maintenance and to resolve recurring maintenance problems so that the need for maintenance can be reduced. Profit Centered Maintenance is more than just an attitude it is a path to profitability, be it resulting in increased profits or increased market share.« less
Guo, Wenzhong; Hong, Wei; Zhang, Bin; Chen, Yuzhong; Xiong, Naixue
2014-01-01
Mobile security is one of the most fundamental problems in Wireless Sensor Networks (WSNs). The data transmission path will be compromised for some disabled nodes. To construct a secure and reliable network, designing an adaptive route strategy which optimizes energy consumption and network lifetime of the aggregation cost is of great importance. In this paper, we address the reliable data aggregation route problem for WSNs. Firstly, to ensure nodes work properly, we propose a data aggregation route algorithm which improves the energy efficiency in the WSN. The construction process achieved through discrete particle swarm optimization (DPSO) saves node energy costs. Then, to balance the network load and establish a reliable network, an adaptive route algorithm with the minimal energy and the maximum lifetime is proposed. Since it is a non-linear constrained multi-objective optimization problem, in this paper we propose a DPSO with the multi-objective fitness function combined with the phenotype sharing function and penalty function to find available routes. Experimental results show that compared with other tree routing algorithms our algorithm can effectively reduce energy consumption and trade off energy consumption and network lifetime. PMID:25215944
Design of testbed and emulation tools
NASA Technical Reports Server (NTRS)
Lundstrom, S. F.; Flynn, M. J.
1986-01-01
The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software based set of hierarchical tools was chosen to provide maximum flexibility, to ease in moving to new computers as technology improves and to take advantage of the inherent reliability and availability of commercially available computing systems.
Chapman, Ann LN; Darton, Thomas C; Foster, Rachel A
2013-01-01
Tuberculosis (TB) remains a global health emergency. Ongoing challenges include the coordination of national and international control programs, high levels of drug resistance in many parts of the world, and availability of accurate and rapid diagnostic tests. The increasing availability and reliability of Internet access throughout both affluent and resource-limited countries brings new opportunities to improve TB management and control through the integration of web-based technologies with traditional approaches. In this review, we explore current and potential future use of web-based tools in the areas of TB diagnosis, treatment, epidemiology, service monitoring, and teaching and training. PMID:24294008
System reliability approaches for advanced propulsion system structures
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Mahadevan, S.
1991-01-01
This paper identifies significant issues that pertain to the estimation and use of system reliability in the design of advanced propulsion system structures. Linkages between the reliabilities of individual components and their effect on system design issues such as performance, cost, availability, and certification are examined. The need for system reliability computation to address the continuum nature of propulsion system structures and synergistic progressive damage modes has been highlighted. Available system reliability models are observed to apply only to discrete systems. Therefore a sequential structural reanalysis procedure is formulated to rigorously compute the conditional dependencies between various failure modes. The method is developed in a manner that supports both top-down and bottom-up analyses in system reliability.
Custom LSI plus hybrid equals cost effectiveness
NASA Astrophysics Data System (ADS)
Friedman, S. N.
The possibility to combine various technologies, such as Bi-Polar linear and CMOS/Digital makes it feasible to create systems with a tailored performance not available on a single monolithic circuit. The custom LSI 'BLOCK', especially if it is universal in nature, is proving to be a cost effective way for the developer to improve his product. The custom LSI represents a low price part in contrast to the discrete components it will replace. In addition, the hybrid assembly can realize a savings in labor as a result of the reduced parts handling and associated wire bonds. The possibility of the use of automated system manufacturing techniques leads to greater reliability as the human factor is partly eliminated. Attention is given to reliability predictions, cost considerations, and a product comparison study.
Düking, Peter; Fuss, Franz Konstantin; Holmberg, Hans-Christer; Sperlich, Billy
2018-04-30
Although it is becoming increasingly popular to monitor parameters related to training, recovery, and health with wearable sensor technology (wearables), scientific evaluation of the reliability, sensitivity, and validity of such data is limited and, where available, has involved a wide variety of approaches. To improve the trustworthiness of data collected by wearables and facilitate comparisons, we have outlined recommendations for standardized evaluation. We discuss the wearable devices themselves, as well as experimental and statistical considerations. Adherence to these recommendations should be beneficial not only for the individual, but also for regulatory organizations and insurance companies. ©Peter Düking, Franz Konstantin Fuss, Hans-Christer Holmberg, Billy Sperlich. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 30.04.2018.
Theory and practice in the electrometric determination of pH in precipitation
NASA Astrophysics Data System (ADS)
Brennan, Carla Jo; Peden, Mark E.
Basic theory and laboratory investigations have been applied to the electrometric determination of pH in precipitation samples in an effort to improve the reliability of the results obtained from these low ionic strength samples. The theoretical problems inherent in the measurement of pH in rain have been examined using natural precipitation samples with varying ionic strengths and pH values. The importance of electrode design and construction has been stressed. The proper choice of electrode can minimize or eliminate problems arising from residual liquid junction potentials, streaming potentials and temperature differences. Reliable pH measurements can be made in precipitation samples using commercially available calibration buffers providing low ionic strength quality control solutions are routinely used to verify electrode and meter performance.
Quality improvement of forensic mental health evaluations and reports of youth in the Netherlands.
Duits, Nils; van der Hoorn, Steven; Wiznitzer, Martin; Wettstein, Robert M; de Beurs, Edwin
2012-01-01
Quality improvement of forensic mental health evaluations and reports is needed, but little information is available on how this can be attained, and relatively little conceptual analysis has been undertaken. The STAR, a standardized evaluation instrument of the quality of forensic mental health reports of youth, is developed on the basis of concept mapping to clarify the different perspectives on usability of these reports. Psychometric data are provided, demonstrating the reliability and supporting the validity of the STAR. The Dutch forensic context is described to better understand the development and psychometric properties of this standardized instrument. Quality improvement possibilities of forensic mental health evaluations and reports are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Klesius, Janell P.; Homan, Susan P.
1985-01-01
The article reviews validity and reliability studies on the informal reading inventory, a diagnostic instrument to identify reading grade-level placement and strengths and weaknesses in work recognition and comprehension. Gives suggestions to improve the validity and reliability of existing inventories and to evaluate them in newly published…
Hydrological modelling improvements required in basins in the Hindukush-Karakoram-Himalayas region
NASA Astrophysics Data System (ADS)
Khan, Asif; Richards, Keith S.; McRobie, Allan; Booij, Martijn
2016-04-01
Millions of people rely on river water originating from basins in the Hindukush-Karakoram-Himalayas (HKH), where snow- and ice-melt are significant flow components. One such basin is the Upper Indus Basin (UIB), where snow- and ice-melt can contribute more than 80% of total flow. Containing some of the world's largest alpine glaciers, this basin may be highly susceptible to global warming and climate change, and reliable predictions of future water availability are vital for resource planning for downstream food and energy needs in a changing climate, but depend on significantly improved hydrological modelling. However, a critical assessment of available hydro-climatic data and hydrological modelling in the HKH region has identified five major failings in many published hydro-climatic studies, even those appearing in reputable international journals. The main weaknesses of these studies are: i) incorrect basin areas; ii) under-estimated precipitation; iii) incorrectly-defined glacier boundaries; iv) under-estimated snow-cover data; and v) use of biased melt factors for snow and ice during the summer months. This paper illustrates these limitations, which have either resulted in modelled flows being under-estimates of measured flows, leading to an implied severe water scarcity; or have led to the use of unrealistically high degree-day factors and over-estimates of glacier melt contributions, implying unrealistic melt rates. These effects vary amongst sub-basins. Forecasts obtained from these models cannot be used reliably in policy making or water resource development, and need revision. Detailed critical analysis and improvement of existing hydrological modelling may be equally necessary in other mountain regions across the world.
Systems Issues In Terrestrial Fiber Optic Link Reliability
NASA Astrophysics Data System (ADS)
Spencer, James L.; Lewin, Barry R.; Lee, T. Frank S.
1990-01-01
This paper reviews fiber optic system reliability issues from three different viewpoints - availability, operating environment, and evolving technologies. Present availability objectives for interoffice links and for the distribution loop must be re-examined for applications such as the Synchronous Optical Network (SONET), Fiber-to-the-Home (FTTH), and analog services. The hostile operating environments of emerging applications (such as FTTH) must be carefully considered in system design as well as reliability assessments. Finally, evolving technologies might require the development of new reliability testing strategies.
NASA Technical Reports Server (NTRS)
Turnquist, S. R.; Twombly, M.; Hoffman, D.
1989-01-01
A preliminary reliability, availability, and maintainability (RAM) analysis of the proposed Space Station Freedom electric power system (EPS) was performed using the unit reliability, availability, and maintainability (UNIRAM) analysis methodology. Orbital replacement units (ORUs) having the most significant impact on EPS availability measures were identified. Also, the sensitivity of the EPS to variations in ORU RAM data was evaluated for each ORU. Estimates were made of average EPS power output levels and availability of power to the core area of the space station. The results of assessments of the availability of EPS power and power to load distribution points in the space stations are given. Some highlights of continuing studies being performed to understand EPS availability considerations are presented.
Versatile composite resins simplifying the practice of restorative dentistry.
Margeas, Robert
2014-01-01
After decades of technical development and refinement, composite resins continue to simplify the practice of restorative dentistry, offering clinicians versatility, predictability, and enhanced physical properties. With a wide range of products available today, composite resins are a reliable, conservative, multi-functional restorative material option. As manufacturers strive to improve such properties as compression strength, flexural strength, elastic modulus, coefficient of thermal expansion, water sorption, and wear resistance, several classification systems of composite resins have been developed.
NASA Astrophysics Data System (ADS)
Lassen, J.; Li, R.; Raeder, S.; Zhao, X.; Dekker, T.; Heggen, H.; Kunz, P.; P. Levy, C. D.; Mostanmand, M.; Teigelhöfer, A.; Ames, F.
2017-11-01
Developments at TRIUMF's isotope separator and accelerator (ISAC) resonance ionization laser ion source (RILIS) in the past years have concentrated on increased reliability for on-line beam delivery of radioactive isotopes to experiments, as well as increasing the number of elements available through resonance ionization and searching for ionization schemes with improved efficiency. The current status of these developments is given with a list of two step laser ionization schemes implemented recently.
Foppen, Wouter; van der Schaaf, Irene C; Beek, Frederik J A; Verkooijen, Helena M; Fischer, Kathelijn
2016-06-01
The radiological Pettersson score (PS) is widely applied for classification of arthropathy to evaluate costly haemophilia treatment. This study aims to assess and improve inter- and intra-observer reliability and agreement of the PS. Two series of X-rays (bilateral elbows, knees, and ankles) of 10 haemophilia patients (120 joints) with haemophilic arthropathy were scored by three observers according to the PS (maximum score 13/joint). Subsequently, (dis-)agreement in scoring was discussed until consensus. Example images were collected in an atlas. Thereafter, second series of 120 joints were scored using the atlas. One observer rescored the second series after three months. Reliability was assessed by intraclass correlation coefficients (ICC), agreement by limits of agreement (LoA). Median Pettersson score at joint level (PSjoint) of affected joints was 6 (interquartile range 3-9). Using the consensus atlas, inter-observer reliability of the PSjoint improved significantly from 0.94 (95 % confidence interval (CI) 0.91-0.96) to 0.97 (CI 0.96-0.98). LoA improved from ±1.7 to ±1.1 for the PSjoint. Therefore, true differences in arthropathy were differences in the PSjoint of >2 points. Intra-observer reliability of the PSjoint was 0.98 (CI 0.97-0.98), intra-observer LoA were ±0.9 points. Reliability and agreement of the PS improved by using a consensus atlas. • Reliability of the Pettersson score significantly improved using the consensus atlas. • The presented consensus atlas improved the agreement among observers. • The consensus atlas could be recommended to obtain a reproducible Pettersson score.
Space flight risk data collection and analysis project: Risk and reliability database
NASA Technical Reports Server (NTRS)
1994-01-01
The focus of the NASA 'Space Flight Risk Data Collection and Analysis' project was to acquire and evaluate space flight data with the express purpose of establishing a database containing measurements of specific risk assessment - reliability - availability - maintainability - supportability (RRAMS) parameters. The developed comprehensive RRAMS database will support the performance of future NASA and aerospace industry risk and reliability studies. One of the primary goals has been to acquire unprocessed information relating to the reliability and availability of launch vehicles and the subsystems and components thereof from the 45th Space Wing (formerly Eastern Space and Missile Command -ESMC) at Patrick Air Force Base. After evaluating and analyzing this information, it was encoded in terms of parameters pertinent to ascertaining reliability and availability statistics, and then assembled into an appropriate database structure.
Emission of pesticides into the air
Van Den, Berg; Kubiak, R.; Benjey, W.G.; Majewski, M.S.; Yates, S.R.; Reeves, G.L.; Smelt, J.H.; Van Der Linden, A. M. A.
1999-01-01
During and after the application of a pesticide in agriculture, a substantial fraction of the dosage may enter the atmosphere and be transported over varying distances downwind of the target. The rate and extent of the emission during application, predominantly as spray particle drift, depends primarily on the application method (equipment and technique), the formulation and environmental conditions, whereas the emission after application depends primarily on the properties of the pesticide, soils, crops and environmental conditions. The fraction of the dosage that misses the target area may be high in some cases and more experimental data on this loss term are needed for various application types and weather conditions. Such data are necessary to test spray drift models, and for further model development and verification as well. Following application, the emission of soil fumigants and soil incorporated pesticides into the air can be measured and computed with reasonable accuracy, but further model development is needed to improve the reliability of the model predictions. For soil surface applied pesticides reliable measurement methods are available, but there is not yet a reliable model. Further model development is required which must be verified by field experiments. Few data are available on pesticide volatilization from plants and more field experiments are also needed to study the fate processes on the plants. Once this information is available, a model needs to be developed to predict the volatilization of pesticides from plants, which, again, should be verified with field measurements. For regional emission estimates, a link between data on the temporal and spatial pesticide use and a geographical information system for crops and soils with their characteristics is needed.
Troester, Jordan C; Jasmin, Jason G; Duffield, Rob
2018-06-01
The present study examined the inter-trial (within test) and inter-test (between test) reliability of single-leg balance and single-leg landing measures performed on a force plate in professional rugby union players using commercially available software (SpartaMARS, Menlo Park, USA). Twenty-four players undertook test - re-test measures on two occasions (7 days apart) on the first training day of two respective pre-season weeks following 48h rest and similar weekly training loads. Two 20s single-leg balance trials were performed on a force plate with eyes closed. Three single-leg landing trials were performed by jumping off two feet and landing on one foot in the middle of a force plate 1m from the starting position. Single-leg balance results demonstrated acceptable inter-trial reliability (ICC = 0.60-0.81, CV = 11-13%) for sway velocity, anterior-posterior sway velocity, and mediolateral sway velocity variables. Acceptable inter-test reliability (ICC = 0.61-0.89, CV = 7-13%) was evident for all variables except mediolateral sway velocity on the dominant leg (ICC = 0.41, CV = 15%). Single-leg landing results only demonstrated acceptable inter-trial reliability for force based measures of relative peak landing force and impulse (ICC = 0.54-0.72, CV = 9-15%). Inter-test results indicate improved reliability through the averaging of three trials with force based measures again demonstrating acceptable reliability (ICC = 0.58-0.71, CV = 7-14%). Of the variables investigated here, total sway velocity and relative landing impulse are the most reliable measures of single-leg balance and landing performance, respectively. These measures should be considered for monitoring potential changes in postural control in professional rugby union.
Empirical Recommendations for Improving the Stability of the Dot-Probe Task in Clinical Research
Price, Rebecca B.; Kuckertz, Jennie M.; Siegle, Greg J.; Ladouceur, Cecile D.; Silk, Jennifer S.; Ryan, Neal D.; Dahl, Ronald E.; Amir, Nader
2014-01-01
The dot-probe task has been widely used in research to produce an index of biased attention based on reaction times (RTs). Despite its popularity, very few published studies have examined psychometric properties of the task, including test-retest reliability, and no previous study has examined reliability in clinically anxious samples or systematically explored the effects of task design and analysis decisions on reliability. In the current analysis, we utilized dot-probe data from three studies where attention bias towards threat-related faces was assessed at multiple (≥5) timepoints. Two of the studies were similar (adults with Social Anxiety Disorder, similar design features) while one was much more disparate (pediatric healthy volunteers, distinct task design). We explored the effects of analysis choices (e.g., bias score calculation formula, methods for outlier handling) on reliability and searched for convergence of findings across the three studies. We found that, when considering the three studies concurrently, the most reliable RT bias index utilized data from dot-bottom trials, comparing congruent to incongruent trials, with rescaled outliers, particularly after averaging across more than one assessment point. Although reliability of RT bias indices was moderate to low under most circumstances, within-session variability in bias (attention bias variability; ABV), a recently proposed RT index, was more reliable across sessions. Several eyetracking-based indices of attention bias (available in the pediatric healthy sample only) showed reliability that matched the optimal RT index (ABV). On the basis of these findings, we make specific recommendations to researchers using the dot probe, particularly those wishing to investigate individual differences and/or single-patient applications. PMID:25419646
The Americleft Speech Project: A Training and Reliability Study.
Chapman, Kathy L; Baylis, Adriane; Trost-Cardamone, Judith; Cordero, Kelly Nett; Dixon, Angela; Dobbelsteyn, Cindy; Thurmes, Anna; Wilson, Kristina; Harding-Bell, Anne; Sweeney, Triona; Stoddard, Gregory; Sell, Debbie
2016-01-01
To describe the results of two reliability studies and to assess the effect of training on interrater reliability scores. The first study (1) examined interrater and intrarater reliability scores (weighted and unweighted kappas) and (2) compared interrater reliability scores before and after training on the use of the Cleft Audit Protocol for Speech-Augmented (CAPS-A) with British English-speaking children. The second study examined interrater and intrarater reliability on a modified version of the CAPS-A (CAPS-A Americleft Modification) with American and Canadian English-speaking children. Finally, comparisons were made between the interrater and intrarater reliability scores obtained for Study 1 and Study 2. The participants were speech-language pathologists from the Americleft Speech Project. In Study 1, interrater reliability scores improved for 6 of the 13 parameters following training on the CAPS-A protocol. Comparison of the reliability results for the two studies indicated lower scores for Study 2 compared with Study 1. However, this appeared to be an artifact of the kappa statistic that occurred due to insufficient variability in the reliability samples for Study 2. When percent agreement scores were also calculated, the ratings appeared similar across Study 1 and Study 2. The findings of this study suggested that improvements in interrater reliability could be obtained following a program of systematic training. However, improvements were not uniform across all parameters. Acceptable levels of reliability were achieved for those parameters most important for evaluation of velopharyngeal function.
The Americleft Speech Project: A Training and Reliability Study
Chapman, Kathy L.; Baylis, Adriane; Trost-Cardamone, Judith; Cordero, Kelly Nett; Dixon, Angela; Dobbelsteyn, Cindy; Thurmes, Anna; Wilson, Kristina; Harding-Bell, Anne; Sweeney, Triona; Stoddard, Gregory; Sell, Debbie
2017-01-01
Objective To describe the results of two reliability studies and to assess the effect of training on interrater reliability scores. Design The first study (1) examined interrater and intrarater reliability scores (weighted and unweighted kappas) and (2) compared interrater reliability scores before and after training on the use of the Cleft Audit Protocol for Speech–Augmented (CAPS-A) with British English-speaking children. The second study examined interrater and intrarater reliability on a modified version of the CAPS-A (CAPS-A Americleft Modification) with American and Canadian English-speaking children. Finally, comparisons were made between the interrater and intrarater reliability scores obtained for Study 1 and Study 2. Participants The participants were speech-language pathologists from the Americleft Speech Project. Results In Study 1, interrater reliability scores improved for 6 of the 13 parameters following training on the CAPS-A protocol. Comparison of the reliability results for the two studies indicated lower scores for Study 2 compared with Study 1. However, this appeared to be an artifact of the kappa statistic that occurred due to insufficient variability in the reliability samples for Study 2. When percent agreement scores were also calculated, the ratings appeared similar across Study 1 and Study 2. Conclusion The findings of this study suggested that improvements in interrater reliability could be obtained following a program of systematic training. However, improvements were not uniform across all parameters. Acceptable levels of reliability were achieved for those parameters most important for evaluation of velopharyngeal function. PMID:25531738
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodhouse, Michael; Jones-Albertus, Rebecca; Feldman, David
2016-05-01
This report examines the remaining challenges to achieving the competitive photovoltaic (PV) costs and large-scale deployment envisioned under the U.S. Department of Energy's SunShot Initiative. Solar-energy cost reductions can be realized through lower PV module and balance-of-system (BOS) costs as well as improved system efficiency and reliability. Numerous combinations of PV improvements could help achieve the levelized cost of electricity (LCOE) goals because of the tradeoffs among key metrics like module price, efficiency, and degradation rate as well as system price and lifetime. Using LCOE modeling based on bottom-up cost analysis, two specific pathways are mapped to exemplify the manymore » possible approaches to module cost reductions of 29%-38% between 2015 and 2020. BOS hardware and soft cost reductions, ranging from 54%-77% of total cost reductions, are also modeled. The residential sector's high supply-chain costs, labor requirements, and customer-acquisition costs give it the greatest BOS cost-reduction opportunities, followed by the commercial sector, although opportunities are available to the utility-scale sector as well. Finally, a future scenario is considered in which very high PV penetration requires additional costs to facilitate grid integration and increased power-system flexibility--which might necessitate even lower solar LCOEs. The analysis of a pathway to 3-5 cents/kWh PV systems underscores the importance of combining robust improvements in PV module and BOS costs as well as PV system efficiency and reliability if such aggressive long-term targets are to be achieved.« less
Next-generation healthcare: a strategic appraisal.
Montague, Terrence
2009-01-01
Successful next-generation healthcare must deliver timely access and quality for an aging population, while simultaneously promoting disease prevention and managing costs. The key factors for sustained success are a culture with aligned goals and values; coordinated team care that especially engages with physicians and patients; practical information that is collected and communicated reliably; and education in the theory and methods of collaboration, measurement and leadership. Currently, optimal population health is challenged by a high prevalence of chronic disease, with large gaps between best and usual care, a scarcity of health human resources - particularly with the skills, attitudes and training for coordinated team care - and the absence of flexible, reliable clinical measurement systems. However, to make things better, institutional models and supporting technologies are available. In the short term, a first step is to enhance the awareness of the practical opportunities to improve, including the expansion of proven community-based disease management programs that communicate knowledge, competencies and clinical measurements among professional and patient partners, leading to reduced care gaps and improved clinical and economic outcomes. Longer-term success requires two additional steps. One is formal inter-professional training to provide, on an ongoing basis, the polyvalent human resource skills and foster the culture of working with others to improve the care of whole populations. The other is the adoption of reliable information systems, including electronic health records, to allow useful and timely measurement and effective communication of clinical information in real-world settings. A better health future can commence immediately, within existing resources, and be sustained with feasible innovations in provider and patient education and information systems. The future is now.
The STAR score: a method for auditing clinical records
Tuffaha, H
2012-01-01
INTRODUCTION Adequate medical note keeping is critical in delivering high quality healthcare. However, there are few robust tools available for the auditing of notes. The aim of this paper was to describe the design, validation and implementation of a novel scoring tool to objectively assess surgical notes. METHODS An initial ‘path finding’ study was performed to evaluate the quality of note keeping using the CRABEL scoring tool. The findings prompted the development of the Surgical Tool for Auditing Records (STAR) as an alternative. STAR was validated using inter-rater reliability analysis. An audit cycle of surgical notes using STAR was performed. The results were analysed and a structured form for the completion of surgical notes was introduced to see if the quality improved in the next audit cycle using STAR. An education exercise was conducted and all participants said the exercise would change their practice, with 25% implementing major changes. RESULTS Statistical analysis of STAR showed that it is reliable (Cronbach’s a = 0.959). On completing the audit cycle, there was an overall increase in the STAR score from 83.344% to 97.675% (p<0.001) with significant improvements in the documentation of the initial clerking from 59.0% to 96.5% (p<0.001) and subsequent entries from 78.4% to 96.1% (p<0.001). CONCLUSIONS The authors believe in the value of STAR as an effective, reliable and reproducible tool. Coupled with the application of structured forms to note keeping, it can significantly improve the quality of surgical documentation and can be implemented universally. PMID:22613300
Bringing the CMS distributed computing system into scalable operations
NASA Astrophysics Data System (ADS)
Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.
2010-04-01
Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis including the sites and the workload and data management tools, validating the distributed production system by performing functionality, reliability and scale tests, helping sites to commission, configure and optimize the networking and storage through scale testing data transfers and data processing, and improving the efficiency of accessing data across the CMS computing system from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing as well as the improvements accomplished towards efficient, reliable and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers with the aim of stressing the experiment and Grid data management and workload management systems, site commissioning procedures and tools to monitor and improve site availability and reliability, as well as activities targeted to the commissioning of the distributed production, user analysis and monitoring systems.
NASA Astrophysics Data System (ADS)
Zammouri, Mounira; Ribeiro, Luis
2017-05-01
Groundwater flow model of the transboundary Saharan aquifer system is developed in 2003 and used for management and decision-making by Algeria, Tunisia and Libya. In decision-making processes, reliability plays a decisive role. This paper looks into the reliability assessment of the Saharan aquifers model. It aims to detect the shortcomings of the model considered properly calibrated. After presenting the calibration results of the effort modelling in 2003, the uncertainty in the model which arising from the lack of the groundwater level and the transmissivity data is analyzed using kriging technique and stochastic approach. The structural analysis of piezometry in steady state and logarithms of transmissivity were carried out for the Continental Intercalaire (CI) and the Complexe Terminal (CT) aquifers. The available data (piezometry and transmissivity) were compared to the calculated values, using geostatistics approach. Using a stochastic approach, 2500 realizations of a log-normal random transmissivity field of the CI aquifer has been performed to assess the errors of the model output, due to the uncertainty in transmissivity. Two types of bad calibration are shown. In some regions, calibration should be improved using the available data. In others areas, undertaking the model refinement requires gathering new data to enhance the aquifer system knowledge. Stochastic simulations' results showed that the calculated drawdowns in 2050 could be higher than the values predicted by the calibrated model.
2015 NREL Photovoltaic Reliability Workshops | Photovoltaic Research | NREL
5 NREL Photovoltaic Reliability Workshops 2015 NREL Photovoltaic Reliability Workshops The 2015 NREL Photovoltaic Reliability Workshop was held February 24-27, 2015, in Golden, Colorado. This event be available for download as soon as possible. The Photovoltaic Module Reliability Workshop is
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reason, J.
Transmission terminations available today are very reliable, but they need to be. In the field, they are continually exposed to pollution and extremes of ambient temperature. In many cases, they are in the rifle sights of vandals. In contrast, cable joints - often cited as the weakest links from an electrical viewpoint - are generally protected from physical damage underground and many of the short cable systems being installed in the US today can be built without joints. All cable systems need terminations - mostly to air-insulated equipment. At 69 through 138 kV, there is intense competition among manufacturers tomore » supply terminations for solid-dielectric cable that are low in cost, reliable, and require a minimum of skill to install. Some utilities are looking also for terminations that fit a range of cable sizes; terminations that do not contain liquid that can leak out; and terminations that are shatter-proof. All of these improvements are available in the US up to 69 kV. For higher voltages, they are on the horizon, if not already in use, overseas. 16 figs.« less
A Fresh Start for Flood Estimation in Ungauged Basins
NASA Astrophysics Data System (ADS)
Woods, R. A.
2017-12-01
The two standard methods for flood estimation in ungauged basins, regression-based statistical models and rainfall-runoff models using a design rainfall event, have survived relatively unchanged as the methods of choice for more than 40 years. Their technical implementation has developed greatly, but the models' representation of hydrological processes has not, despite a large volume of hydrological research. I suggest it is time to introduce more hydrology into flood estimation. The reliability of the current methods can be unsatisfactory. For example, despite the UK's relatively straightforward hydrology, regression estimates of the index flood are uncertain by +/- a factor of two (for a 95% confidence interval), an impractically large uncertainty for design. The standard error of rainfall-runoff model estimates is not usually known, but available assessments indicate poorer reliability than statistical methods. There is a practical need for improved reliability in flood estimation. Two promising candidates to supersede the existing methods are (i) continuous simulation by rainfall-runoff modelling and (ii) event-based derived distribution methods. The main challenge with continuous simulation methods in ungauged basins is to specify the model structure and parameter values, when calibration data are not available. This has been an active area of research for more than a decade, and this activity is likely to continue. The major challenges for the derived distribution method in ungauged catchments include not only the correct specification of model structure and parameter values, but also antecedent conditions (e.g. seasonal soil water balance). However, a much smaller community of researchers are active in developing or applying the derived distribution approach, and as a result slower progress is being made. A change in needed: surely we have learned enough about hydrology in the last 40 years that we can make a practical hydrological advance on our methods for flood estimation! A shift to new methods for flood estimation will not be taken lightly by practitioners. However, the standard for change is clear - can we develop new methods which give significant improvements in reliability over those existing methods which are demonstrably unsatisfactory?
2012-06-01
in an effort to be more reliable and efficient. However, with the benefits of this new technology comes added risk . This research utilizes a con ...AN APPLICATION OF CON -RESISTANT TRUST TO IMPROVE THE RELIABILITY OF SPECIAL PROTECTION SYSTEMS WITHIN THE SMART GRID THESIS Crystal M. Shipman...Government and is not subject to copyright protection in the United States AFIT/GCO/ENG/12-22 AN APPLICATION OF CON -RESISTANT TRUST TO IMPROVE THE
Improved turbine disk design to increase reliability of aircraft jet engines
NASA Technical Reports Server (NTRS)
Alver, A. S.; Wong, J. K.
1975-01-01
An analytical study was conducted on a bore entry cooled turbine disk for the first stage of the JT8D-17 high pressure turbine which had the potential to improve disk life over existing design. The disk analysis included the consideration of transient and steady state temperature, blade loading, creep, low cycle fatigue, fracture mechanics and manufacturing flaws. The improvement in life of the bore entry cooled turbine disk was determined by comparing it with the existing disk made of both conventional and advanced (Astroloy) disk materials. The improvement in crack initiation life of the Astroloy bore entry cooled disk is 87% and 67% over the existing disk made of Waspaloy and Astroloy, respectively. Improvement in crack propagation life is 124% over the Waspaloy and 465% over the Astroloy disks. The available kinetic energies of disk fragments calculated for the three disks indicate a lower fragment energy level for the bore entry cooled turbine disk.
DOT National Transportation Integrated Search
2012-11-30
The objective of this project was to develop technical relationships between reliability improvement strategies and reliability performance metrics. This project defined reliability, explained the importance of travel time distributions for measuring...
High power diode lasers emitting from 639 nm to 690 nm
NASA Astrophysics Data System (ADS)
Bao, L.; Grimshaw, M.; DeVito, M.; Kanskar, M.; Dong, W.; Guan, X.; Zhang, S.; Patterson, J.; Dickerson, P.; Kennedy, K.; Li, S.; Haden, J.; Martinsen, R.
2014-03-01
There is increasing market demand for high power reliable red lasers for display and cinema applications. Due to the fundamental material system limit at this wavelength range, red diode lasers have lower efficiency and are more temperature sensitive, compared to 790-980 nm diode lasers. In terms of reliability, red lasers are also more sensitive to catastrophic optical mirror damage (COMD) due to the higher photon energy. Thus developing higher power-reliable red lasers is very challenging. This paper will present nLIGHT's released red products from 639 nm to 690nm, with established high performance and long-term reliability. These single emitter diode lasers can work as stand-alone singleemitter units or efficiently integrate into our compact, passively-cooled Pearl™ fiber-coupled module architectures for higher output power and improved reliability. In order to further improve power and reliability, new chip optimizations have been focused on improving epitaxial design/growth, chip configuration/processing and optical facet passivation. Initial optimization has demonstrated promising results for 639 nm diode lasers to be reliably rated at 1.5 W and 690nm diode lasers to be reliably rated at 4.0 W. Accelerated life-test has started and further design optimization are underway.
NASA Technical Reports Server (NTRS)
Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Bavuso, Salvatore J.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. The Hybrid Automated Reliability Predictor (HARP) tutorial provides insight into HARP modeling techniques and the interactive textual prompting input language via a step-by-step explanation and demonstration of HARP's fault occurrence/repair model and the fault/error handling models. Example applications are worked in their entirety and the HARP tabular output data are presented for each. Simple models are presented at first with each succeeding example demonstrating greater modeling power and complexity. This document is not intended to present the theoretical and mathematical basis for HARP.
Examining the reliability of ADAS-Cog change scores.
Grochowalski, Joseph H; Liu, Ying; Siedlecki, Karen L
2016-09-01
The purpose of this study was to estimate and examine ways to improve the reliability of change scores on the Alzheimer's Disease Assessment Scale, Cognitive Subtest (ADAS-Cog). The sample, provided by the Alzheimer's Disease Neuroimaging Initiative, included individuals with Alzheimer's disease (AD) (n = 153) and individuals with mild cognitive impairment (MCI) (n = 352). All participants were administered the ADAS-Cog at baseline and 1 year, and change scores were calculated as the difference in scores over the 1-year period. Three types of change score reliabilities were estimated using multivariate generalizability. Two methods to increase change score reliability were evaluated: reweighting the subtests of the scale and adding more subtests. Reliability of ADAS-Cog change scores over 1 year was low for both the AD sample (ranging from .53 to .64) and the MCI sample (.39 to .61). Reweighting the change scores from the AD sample improved reliability (.68 to .76), but lengthening provided no useful improvement for either sample. The MCI change scores had low reliability, even with reweighting and adding additional subtests. The ADAS-Cog scores had low reliability for measuring change. Researchers using the ADAS-Cog should estimate and report reliability for their use of the change scores. The ADAS-Cog change scores are not recommended for assessment of meaningful clinical change.
Asymptotically reliable transport of multimedia/graphics over wireless channels
NASA Astrophysics Data System (ADS)
Han, Richard Y.; Messerschmitt, David G.
1996-03-01
We propose a multiple-delivery transport service tailored for graphics and video transported over connections with wireless access. This service operates at the interface between the transport and application layers, balancing the subjective delay and image quality objectives of the application with the low reliability and limited bandwidth of the wireless link. While techniques like forward-error correction, interleaving and retransmission improve reliability over wireless links, they also increase latency substantially when bandwidth is limited. Certain forms of interactive multimedia datatypes can benefit from an initial delivery of a corrupt packet to lower the perceptual latency, as long as reliable delivery occurs eventually. Multiple delivery of successively refined versions of the received packet, terminating when a sufficiently reliable version arrives, exploits the redundancy inherently required to improve reliability without a traffic penalty. Modifications to acknowledgment-repeat-request (ARQ) methods to implement this transport service are proposed, which we term `leaky ARQ'. For the specific case of pixel-coded window-based text/graphics, we describe additional functions needed to more effectively support urgent delivery and asymptotic reliability. X server emulation suggests that users will accept a multi-second delay between a (possibly corrupt) packet and the ultimate reliably-delivered version. The relaxed delay for reliable delivery can be exploited for traffic capacity improvement using scheduling of retransmissions.
NASA Astrophysics Data System (ADS)
Taylor, Faith E.; Malamud, Bruce D.; Millington, James D. A.
2016-04-01
Access to reliable spatial and quantitative datasets (e.g., infrastructure maps, historical observations, environmental variables) at regional and site specific scales can be a limiting factor for understanding hazards and risks in developing country settings. Here we present a 'living database' of >75 freely available data sources relevant to hazard and risk in Africa (and more globally). Data sources include national scientific foundations, non-governmental bodies, crowd-sourced efforts, academic projects, special interest groups and others. The database is available at http://tinyurl.com/africa-datasets and is continually being updated, particularly in the context of broader natural hazards research we are doing in the context of Malawi and Kenya. For each data source, we review the spatiotemporal resolution and extent and make our own assessments of reliability and usability of datasets. Although such freely available datasets are sometimes presented as a panacea to improving our understanding of hazards and risk in developing countries, there are both pitfalls and opportunities unique to using this type of freely available data. These include factors such as resolution, homogeneity, uncertainty, access to metadata and training for usage. Based on our experience, use in the field and grey/peer-review literature, we present a suggested set of guidelines for using these free and open source data in developing country contexts.
Opportunities of probabilistic flood loss models
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno
2016-04-01
Oftentimes, traditional uni-variate damage models as for instance depth-damage curves fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks and traditional stage damage functions. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were respectively compiled after the floods in 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further we stratify the sample according to catchments which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of sharpness of the predictions the reliability which is represented by the proportion of the number of observations that fall within the 95-quantile and 5-quantile predictive interval. The comparison of the uni-variable Stage damage function and the multivariable model approach emphasises the importance to quantify predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty. However, the predictive performance in terms of precision (mbe), accuracy (mae) and reliability (HR) is clearly improved in comparison to uni-variable Stage damage function. Overall, Probabilistic models provide quantitative information about prediction uncertainty which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Strategies to improve electrode positioning and safety in cochlear implants.
Rebscher, S J; Heilmann, M; Bruszewski, W; Talbot, N H; Snyder, R L; Merzenich, M M
1999-03-01
An injection-molded internal supporting rib has been produced to control the flexibility of silicone rubber encapsulated electrodes designed to electrically stimulate the auditory nerve in human subjects with severe to profound hearing loss. The rib molding dies, and molds for silicone rubber encapsulation of the electrode, were designed and machined using AutoCad and MasterCam software packages in a PC environment. After molding, the prototype plastic ribs were iteratively modified based on observations of the performance of the rib/silicone composite insert in a clear plastic model of the human scala tympani cavity. The rib-based electrodes were reliably inserted farther into these models, required less insertion force and were positioned closer to the target auditory neural elements than currently available cochlear implant electrodes. With further design improvements the injection-molded rib may also function to accurately support metal stimulating contacts and wire leads during assembly to significantly increase the manufacturing efficiency of these devices. This method to reliably control the mechanical properties of miniature implantable devices with multiple electrical leads may be valuable in other areas of biomedical device design.
NASA Technical Reports Server (NTRS)
Tedesco, Edward F.; Veeder, Glenn J.; Fowler, John W.; Chillemi, Joseph R.
1992-01-01
This report documents the program and data used to identify known asteroids observed by the Infrared Astronomical Satellite (IRAS) and to compute albedos and diameters from their IRAS fluxes. It also presents listings of the results obtained. These results supplant those in the IRAS Asteroid and Comet Survey, 1986. The present version used new and improved asteroid orbital elements for 4679 numbered asteroids and 2632 additional asteroids for which at least two-opposition elements were available as of mid-1991. It employed asteroid absolute magnitudes on the International Astronomical Union system adopted in 1991. In addition, the code was modified to increase the reliability of associating asteroids with IRAS sources and rectify several shortcomings in the final data products released in 1986. Association reliability was improved by decreasing the position difference between an IRAS source and a predicted asteroid position required for an association. The shortcomings addressed included the problem of flux overestimation for low SNR sources and the systematic difference in albedos and diameters among the three wavelength bands (12, 25, and 60 micrometers). Several minor bugs in the original code were also corrected.
RELIABILITY, AVAILABILITY, AND SERVICEABILITY FOR PETASCALE HIGH-END COMPUTING AND BEYOND
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chokchai "Box" Leangsuksun
2011-05-31
Our project is a multi-institutional research effort that adopts interplay of RELIABILITY, AVAILABILITY, and SERVICEABILITY (RAS) aspects for solving resilience issues in highend scientific computing in the next generation of supercomputers. results lie in the following tracks: Failure prediction in a large scale HPC; Investigate reliability issues and mitigation techniques including in GPGPU-based HPC system; HPC resilience runtime & tools.
The Reliability of In-Training Assessment when Performance Improvement Is Taken into Account
ERIC Educational Resources Information Center
van Lohuizen, Mirjam T.; Kuks, Jan B. M.; van Hell, Elisabeth A.; Raat, A. N.; Stewart, Roy E.; Cohen-Schotanus, Janke
2010-01-01
During in-training assessment students are frequently assessed over a longer period of time and therefore it can be expected that their performance will improve. We studied whether there really is a measurable performance improvement when students are assessed over an extended period of time and how this improvement affects the reliability of the…
Point of care investigations in pediatric care to improve health care in rural areas.
Walia, Kamini
2013-07-01
The good quality laboratory services in developing countries are often limited to major urban centers. As a result, many commercially available high-quality diagnostic tests for infectious diseases are neither accessible nor affordable to patients in the rural areas. Health facilities in rural areas are compromised and this limits the usability and performance of the best medical diagnostic technologies in rural areas as they are designed for air-conditioned laboratories, refrigerated storage of chemicals, a constant supply of calibrators and reagents, stable electrical power, highly trained personnel and rapid transportation of samples. The advent of new technologies have allowed miniaturization and integration of complex functions, which has made it possible for sophisticated diagnostic tools to move out of the developed-world laboratory in the form of a "point of care"(POC) tests. Many diagnostic tests are being developed using these platforms. However, the challenge is to develop diagnostics which are inexpensive, rugged and well suited to the medical and social contexts of the developing world and do not compromise on accuracy and reliability. The already available POC tests which are reliable and affordable, like for HIV infection, malaria, syphilis, and some neglected tropical diseases, and POC tests being developed for other diseases if correctly used and effectively regulated after rigorous evaluation, have the potential to make a difference in clinical management and improve surveillance. In order to use these tests effectively they would need to be supported by technically competent manpower, availability of good-quality reagents, and healthcare providers who value and are able to interpret laboratory results to guide treatment; and a system for timely communication between the laboratory and the healthcare provider. Strengthening the laboratories at the rural level can enable utilization of these diagnostics for improving the diagnosis and management of infectious diseases among children which require prompt treatment and thus, considerably reduce morbidity and mortality among the pediatric age group.
NASA Astrophysics Data System (ADS)
Lam, C. Y.; Ip, W. H.
2012-11-01
A higher degree of reliability in the collaborative network can increase the competitiveness and performance of an entire supply chain. As supply chain networks grow more complex, the consequences of unreliable behaviour become increasingly severe in terms of cost, effort and time. Moreover, it is computationally difficult to calculate the network reliability of a Non-deterministic Polynomial-time hard (NP-hard) all-terminal network using state enumeration, as this may require a huge number of iterations for topology optimisation. Therefore, this paper proposes an alternative approach of an improved spanning tree for reliability analysis to help effectively evaluate and analyse the reliability of collaborative networks in supply chains and reduce the comparative computational complexity of algorithms. Set theory is employed to evaluate and model the all-terminal reliability of the improved spanning tree algorithm and present a case study of a supply chain used in lamp production to illustrate the application of the proposed approach.
2013-01-01
Background Yearly formative knowledge testing (also known as progress testing) was shown to have a limited construct-validity and reliability in postgraduate medical education. One way to improve construct-validity and reliability is to improve the authenticity of a test. As easily accessible internet has become inseparably linked to daily clinical practice, we hypothesized that allowing internet access for a limited amount of time during the progress test would improve the perception of authenticity (face-validity) of the test, which would in turn improve the construct-validity and reliability of postgraduate progress testing. Methods Postgraduate trainees taking the yearly knowledge progress test were asked to participate in a study where they could access the internet for 30 minutes at the end of a traditional pen and paper test. Before and after the test they were asked to complete a short questionnaire regarding the face-validity of the test. Results Mean test scores increased significantly for all training years. Trainees indicated that the face-validity of the test improved with internet access and that they would like to continue to have internet access during future testing. Internet access did not improve the construct-validity or reliability of the test. Conclusion Improving the face-validity of postgraduate progress testing, by adding the possibility to search the internet for a limited amount of time, positively influences test performance and face-validity. However, it did not change the reliability or the construct-validity of the test. PMID:24195696
18 CFR 39.3 - Electric Reliability Organization certification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... operators of the Bulk-Power System, and other interested parties for improvement of the Electric Reliability... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Electric Reliability..., Reliability Standards that provide for an adequate level of reliability of the Bulk-Power System, and (2) Has...
Pemmaraju, Naveen; Gupta, Vikas; Mesa, Ruben; Thompson, Michael A
2015-12-01
The advent of social media has led to the ability for individuals all over the world to communicate with each other, in real time, about mutual topics of interest in an unprecedented manner. Recently, the use of social media has increased among people interested in healthcare and medical research, particularly in the field of hematology and oncology, a field which frequently experiences rapid shifts of information and novel, practice-changing discoveries. Among the many social media platforms available to cancer patients and providers, one platform in particular, Twitter, has become the focus for the creation of disease-specific communities, especially for those interested in, affected by, or those who perform research in the fields of rare cancers, which historically have had a dearth of reliable information available. This article will focus on the initiation and progress of one such Twitter hematology/oncology community, #mpnsm, which was originally created for the purpose of serving as a venue for improving the interaction among patients, providers, researchers, and organizations with interest in the myeloproliferative neoplasms (MPNs) and to further the availability of reliable up-to-date analysis; relevant expert commentary; and readily usable information for patients, providers, and other groups interested in this field.
A Cross-Layer Optimized Opportunistic Routing Scheme for Loss-and-Delay Sensitive WSNs
Xu, Xin; Yuan, Minjiao; Liu, Xiao; Cai, Zhiping; Wang, Tian
2018-01-01
In wireless sensor networks (WSNs), communication links are typically error-prone and unreliable, so providing reliable and timely data routing for loss- and delay-sensitive applications in WSNs it is a challenge issue. Additionally, with specific thresholds in practical applications, the loss and delay sensitivity implies requirements for high reliability and low delay. Opportunistic Routing (OR) has been well studied in WSNs to improve reliability for error-prone and unreliable wireless communication links where the transmission power is assumed to be identical in the whole network. In this paper, a Cross-layer Optimized Opportunistic Routing (COOR) scheme is proposed to improve the communication link reliability and reduce delay for loss-and-delay sensitive WSNs. The main contribution of the COOR scheme is making full use of the remaining energy in networks to increase the transmission power of most nodes, which will provide a higher communication reliability or further transmission distance. Two optimization strategies referred to as COOR(R) and COOR(P) of the COOR scheme are proposed to improve network performance. In the case of increasing the transmission power, the COOR(R) strategy chooses a node that has a higher communication reliability with same distance in comparison to the traditional opportunistic routing when selecting the next hop candidate node. Since the reliability of data transmission is improved, the delay of the data reaching the sink is reduced by shortening the time of communication between candidate nodes. On the other hand, the COOR(P) strategy prefers a node that has the same communication reliability with longer distance. As a result, network performance can be improved for the following reasons: (a) the delay is reduced as fewer hops are needed while the packet reaches the sink in longer transmission distance circumstances; (b) the reliability can be improved since it is the product of the reliability of every hop of the routing path, and the count is reduced while the reliability of each hop is the same as the traditional method. After analyzing the energy consumption of the network in detail, the value of optimized transmission power in different areas is given. On the basis of a large number of experimental and theoretical analyses, the results show that the COOR scheme will increase communication reliability by 36.62–87.77%, decrease delay by 21.09–52.48%, and balance the energy consumption of 86.97% of the nodes in the WSNs. PMID:29751589
A Cross-Layer Optimized Opportunistic Routing Scheme for Loss-and-Delay Sensitive WSNs.
Xu, Xin; Yuan, Minjiao; Liu, Xiao; Liu, Anfeng; Xiong, Neal N; Cai, Zhiping; Wang, Tian
2018-05-03
In wireless sensor networks (WSNs), communication links are typically error-prone and unreliable, so providing reliable and timely data routing for loss- and delay-sensitive applications in WSNs it is a challenge issue. Additionally, with specific thresholds in practical applications, the loss and delay sensitivity implies requirements for high reliability and low delay. Opportunistic Routing (OR) has been well studied in WSNs to improve reliability for error-prone and unreliable wireless communication links where the transmission power is assumed to be identical in the whole network. In this paper, a Cross-layer Optimized Opportunistic Routing (COOR) scheme is proposed to improve the communication link reliability and reduce delay for loss-and-delay sensitive WSNs. The main contribution of the COOR scheme is making full use of the remaining energy in networks to increase the transmission power of most nodes, which will provide a higher communication reliability or further transmission distance. Two optimization strategies referred to as COOR(R) and COOR(P) of the COOR scheme are proposed to improve network performance. In the case of increasing the transmission power, the COOR(R) strategy chooses a node that has a higher communication reliability with same distance in comparison to the traditional opportunistic routing when selecting the next hop candidate node. Since the reliability of data transmission is improved, the delay of the data reaching the sink is reduced by shortening the time of communication between candidate nodes. On the other hand, the COOR(P) strategy prefers a node that has the same communication reliability with longer distance. As a result, network performance can be improved for the following reasons: (a) the delay is reduced as fewer hops are needed while the packet reaches the sink in longer transmission distance circumstances; (b) the reliability can be improved since it is the product of the reliability of every hop of the routing path, and the count is reduced while the reliability of each hop is the same as the traditional method. After analyzing the energy consumption of the network in detail, the value of optimized transmission power in different areas is given. On the basis of a large number of experimental and theoretical analyses, the results show that the COOR scheme will increase communication reliability by 36.62⁻87.77%, decrease delay by 21.09⁻52.48%, and balance the energy consumption of 86.97% of the nodes in the WSNs.
Is Coefficient Alpha Robust to Non-Normal Data?
Sheng, Yanyan; Sheng, Zhaohui
2011-01-01
Coefficient alpha has been a widely used measure by which internal consistency reliability is assessed. In addition to essential tau-equivalence and uncorrelated errors, normality has been noted as another important assumption for alpha. Earlier work on evaluating this assumption considered either exclusively non-normal error score distributions, or limited conditions. In view of this and the availability of advanced methods for generating univariate non-normal data, Monte Carlo simulations were conducted to show that non-normal distributions for true or error scores do create problems for using alpha to estimate the internal consistency reliability. The sample coefficient alpha is affected by leptokurtic true score distributions, or skewed and/or kurtotic error score distributions. Increased sample sizes, not test lengths, help improve the accuracy, bias, or precision of using it with non-normal data. PMID:22363306
van der Laar, Catharina Walthera Egbertha; de Jong, Charlie; Weda, Marjolein; Hegger, Ingrid
2016-01-01
Background In the past years, an enormous increase in the number of available health-related applications (apps) has occurred, from approximately 5800 in 2011 to over 23,000 in 2013, in the iTunes store. However, little is still known regarding the use, possible effectiveness, and risks of these applications. In this study, we focused on apps and other e-tools related to medicine use. A large subset of the general population uses medicines and might benefit from tools that aid in the use of medicine. Objective The aim of the present study was to gain more insight into the characteristics, possible risks, and possible benefits of health apps and e-tools related to medication use. Methods We first made an inventory of apps and other e-tools for medication use (n=116). Tools were coded by two independent researchers, based on the information available in the app stores and websites. Subsequently, for one type of often downloaded apps (aimed at people with diabetes), we investigated users’ experiences using an online questionnaire. Results Results of the inventory show that many apps for medication use are available and that they mainly offer simple functionalities. In line with this, the most experienced benefit by users of apps for regulating blood glucose levels in the online questionnaire was “information quick and conveniently available”. Other often experienced benefits were improving health and self-reliance. Results of the inventory show that a minority of the apps for medication use has potentially high risks and for many of the apps it is unclear whether and how personal data are stored. In contrast, online questionnaire among users of apps for blood glucose regulation indicates that they hardly ever experience problems or doubts considering reliability and/or privacy. Although, respondents do mention to experience disadvantages of use due to incomplete apps and apps with poor ease of use. Respondents not using app(s) indicate that they might use them in the future if reliability of the apps and instructions on how to use them are more clear. Conclusions This study shows that for apps and e-tools related to medicine use a small subset of tools might involve relatively high risks. For the large group of nonmedical devices apps, risks are lower, but risks lie in the enormous availability and low levels of regulation. In addition, both users and nonusers indicated that overall quality of apps (ease of use, completeness, good functionalities) is an issue. Considering that important benefits (eg, improving health and self-reliance) are experienced by many of the respondents using apps for regulating blood glucose levels, improving reliability and quality of apps is likely to have many profits. In addition, creating better awareness regarding the existence and how to use apps will likely improve proper use by more people, enhancing the profits of these tools. PMID:27052946
Thomson, Hilary
2013-08-01
Systematic reviews have the potential to promote knowledge exchange between researchers and decision-makers. Review planning requires engagement with evidence users to ensure preparation of relevant reviews, and well-conducted reviews should provide accessible and reliable synthesis to support decision-making. Yet, systematic reviews are not routinely referred to by decision-makers, and innovative approaches to improve the utility of reviews is needed. Evidence synthesis for healthy public policy is typically complex and methodologically challenging. Although not lessening the value of reviews, these challenges can be overwhelming and threaten their utility. Using the interrelated principles of relevance, rigor, and readability, and in light of available resources, this article considers how utility of evidence synthesis for healthy public policy might be improved.
2013-01-01
Systematic reviews have the potential to promote knowledge exchange between researchers and decision-makers. Review planning requires engagement with evidence users to ensure preparation of relevant reviews, and well-conducted reviews should provide accessible and reliable synthesis to support decision-making. Yet, systematic reviews are not routinely referred to by decision-makers, and innovative approaches to improve the utility of reviews is needed. Evidence synthesis for healthy public policy is typically complex and methodologically challenging. Although not lessening the value of reviews, these challenges can be overwhelming and threaten their utility. Using the interrelated principles of relevance, rigor, and readability, and in light of available resources, this article considers how utility of evidence synthesis for healthy public policy might be improved. PMID:23763400
Report on Wind Turbine Subsystem Reliability - A Survey of Various Databases (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheng, S.
2013-07-01
Wind industry has been challenged by premature subsystem/component failures. Various reliability data collection efforts have demonstrated their values in supporting wind turbine reliability and availability research & development and industrial activities. However, most information on these data collection efforts are scattered and not in a centralized place. With the objective of getting updated reliability statistics of wind turbines and/or subsystems so as to benefit future wind reliability and availability activities, this report is put together based on a survey of various reliability databases that are accessible directly or indirectly by NREL. For each database, whenever feasible, a brief description summarizingmore » database population, life span, and data collected is given along with its features & status. Then selective results deemed beneficial to the industry and generated based on the database are highlighted. This report concludes with several observations obtained throughout the survey and several reliability data collection opportunities in the future.« less
Gao, Zhouzheng; Zhang, Hongping; Ge, Maorong; Niu, Xiaoji; Shen, Wenbin; Wickert, Jens; Schuh, Harald
2015-03-10
The continuity and reliability of precise GNSS positioning can be seriously limited by severe user observation environments. The Inertial Navigation System (INS) can overcome such drawbacks, but its performance is clearly restricted by INS sensor errors over time. Accordingly, the tightly coupled integration of GPS and INS can overcome the disadvantages of each individual system and together form a new navigation system with a higher accuracy, reliability and availability. Recently, ionosphere-constrained (IC) precise point positioning (PPP) utilizing raw GPS observations was proven able to improve both the convergence and positioning accuracy of the conventional PPP using ionosphere-free combined observations (LC-PPP). In this paper, a new mode of tightly coupled integration, in which the IC-PPP instead of LC-PPP is employed, is implemented to further improve the performance of the coupled system. We present the detailed mathematical model and the related algorithm of the new integration of IC-PPP and INS. To evaluate the performance of the new tightly coupled integration, data of both airborne and vehicle experiments with a geodetic GPS receiver and tactical grade inertial measurement unit are processed and the results are analyzed. The statistics show that the new approach can further improve the positioning accuracy compared with both IC-PPP and the tightly coupled integration of the conventional PPP and INS.
VizieR Online Data Catalog: Second ROSAT all-sky survey (2RXS) source catalog (Boller+, 2016)
NASA Astrophysics Data System (ADS)
Boller, T.; Freyberg, M. J.; Truemper, J.; Haberl, F.; Voges, W.; Nandra, K.
2016-03-01
We have re-analysed the photon event files from the ROSAT all-sky survey. The main goal was to create a catalogue of point-like sources, which is referred to as the 2RXS source catalogue. We improved the reliability of detections by an advanced detection algorithm and a complete screening process. New data products were created to allow timing and spectral analysis. Photon event files with corrected astrometry and Moon rejection (RASS-3.1 processing) were made available in FITS format. The 2RXS catalogue will serve as the basic X-ray all-sky survey catalogue until eROSITA data become available. (2 data files).
Assessment of the quality of patient-oriented information over internet on testicular cancer.
Prasanth, Anton S; Jayarajah, Umesh; Mohanappirian, Ranganathan; Seneviratne, Sanjeewa A
2018-05-02
This study aimed to assess the quality and readability of patient education information available on the internet on testicular cancer. Internet searches were performed using the keywords 'testicular cancer', 'testicular tumour', 'testicular tumor', 'testicular malignancy', 'germ cell tumour' and 'germ cell tumor' using Google, Yahoo! And Bing search engines with default settings. The first 50 web links appeared in each search engine were evaluated for their readability by using the validated Flesch Reading Ease Score (FRES) while accessibility, usability and reliability were assessed using the LIDA tool. The quality was assessed using DISCERN instrument. Non-parametric tests were used for statistical analysis. Overall, 900 websites were assessed and 62 websites were included in the analysis. Twenty two (22) websites (35.5%) were certified by Health on the Net Foundation code of conduct (HON code). The majority (n = 57, 91.9%) were non-governmental websites. The median FRES score was 51.6 (range: 28.1-74.1), the overall median LIDA score was 115 (range: 81-147); accessibility 55 (range: 46-61), reliability 22 (range: 8-45) and usability 38.5 (range: 21-50), while the median DISCERN score was 43.5 (range: 16-69). The DISCERN score was significantly associated with the overall LIDA score and usability and reliability components of the LIDA score (p < 0.001). However, no significant associations were observed between readability and accessibility. A significant correlation was noted between usability and reliability components of the LIDA score (Spearman's rho: 0.789, p < 0.001). In this study, the readability, reliability and quality scores of most websites were found to be suboptimal and hence, there is potential for improvement. As the internet is expanding rapidly as a readily available source of information to the public, it is essential to implement steps to ensure that highest quality information is provided without any commercial motivation or bias.
NASA Astrophysics Data System (ADS)
Riggs, William R.
1994-05-01
SHARP is a Navy wide logistics technology development effort aimed at reducing the acquisition costs, support costs, and risks of military electronic weapon systems while increasing the performance capability, reliability, maintainability, and readiness of these systems. Lower life cycle costs for electronic hardware are achieved through technology transition, standardization, and reliability enhancement to improve system affordability and availability as well as enhancing fleet modernization. Advanced technology is transferred into the fleet through hardware specifications for weapon system building blocks of standard electronic modules, standard power systems, and standard electronic systems. The product lines are all defined with respect to their size, weight, I/O, environmental performance, and operational performance. This method of defining the standard is very conducive to inserting new technologies into systems using the standard hardware. This is the approach taken thus far in inserting photonic technologies into SHARP hardware. All of the efforts have been related to module packaging; i.e. interconnects, component packaging, and module developments. Fiber optic interconnects are discussed in this paper.
Contraceptive knowledge, attitudes, and practice in Russia during the 1980s.
Popov, A A; Visser, A P; Ketting, E
1993-01-01
In the former Soviet Union, there was a lack of valid and reliable social research on knowledge, attitudes, and practice of contraception. The few available studies have not been published outside the Soviet Union. This article reviews five surveys that were conducted in Moscow and two other cities (Saratov and Tartu) during the period 1976-84. In addition, some data from a large-scale survey conducted in 1990 and covering the entire former Soviet Union are presented. The surveys indicate that the rhythm method, condoms, vaginal douches, and withdrawal were the main contraceptive methods used; only 1 to 3 percent of the women interviewed were using oral contraceptives, and about 10 percent used intrauterine devices. The low prevalence of use of reliable modern methods may explain the high incidence of induced abortion in Russia. The chronic unavailability of reliable contraceptives is one of the main factors of poor family planning. Lack of knowledge and negative opinions about modern contraception also play an important role. Some possibilities for improving the family planning situation in Russia are discussed.
Validation of the Fatigue Impact Scale in Hungarian patients with multiple sclerosis.
Losonczi, Erika; Bencsik, Krisztina; Rajda, Cecília; Lencsés, Gyula; Török, Margit; Vécsei, László
2011-03-01
Fatigue is one of the most frequent complaints of patients with multiple sclerosis (MS). The Fatigue Impact Scale (FIS), one of the 30 available fatigue questionnaires, is commonly applied because it evaluates multidimensional aspects of fatigue. The main purposes of this study were to test the validity, test-retest reliability, and internal consistency of the Hungarian version of the FIS. One hundred and eleven MS patients and 85 healthy control (HC) subjects completed the FIS and the Beck Depression Inventory, a large majority of them on two occasions, 3 months apart. The total FIS score and subscale scores differed statistically between the MS patients and the HC subjects in both FIS sessions. In the test-retest reliability assessment, statistically, the intraclass correlation coefficients were high in both the MS and HC groups. Cronbach's alpha values were also notably high. The results of this study indicate that the FIS can be regarded as a valid and reliable scale with which to improve our understanding of the impact of fatigue on the health-related quality of life in MS patients without severe disability.
Promoting Robust Design of Diode Lasers for Space: A National Initiative
NASA Technical Reports Server (NTRS)
Tratt, David M.; Amzajerdian, Farzin; Kashem, Nasir B.; Shapiro, Andrew A.; Mense, Allan T.
2007-01-01
The Diode-laser Array Working Group (DAWG) is a national-level consumer/provider forum for discussion of engineering and manufacturing issues which influence the reliability and survivability of high-power broad-area laser diode devices in space, with an emphasis on laser diode arrays (LDAs) for optical pumping of solid-state laser media. The goals of the group are to formulate and validate standardized test and qualification protocols, operational control recommendations, and consensus manufacturing and certification standards. The group is using reliability and lifetime data collected by laser diode manufacturers and the user community to develop a set of standardized guidelines for specifying and qualifying laser diodes for long-duration operation in space, the ultimate goal being to promote an informed U.S. Government investment and procurement strategy for assuring the availability and durability of space-qualified LDAs. The group is also working to establish effective implementation of statistical design techniques at the supplier design, development, and manufacturing levels to help reduce product performance variability and improve product reliability for diodes employed in space applications
NASA Astrophysics Data System (ADS)
Bennett, J.; Gehly, S.
2016-09-01
This paper presents results from a preliminary method for extracting more orbital information from low rate passive optical tracking data. An improvement in the accuracy of the observation data yields more accurate and reliable orbital elements. A comparison between the orbit propagations from the orbital element generated using the new data processing method is compared with the one generated from the raw observation data for several objects. Optical tracking data collected by EOS Space Systems, located on Mount Stromlo, Australia, is fitted to provide a new orbital element. The element accuracy is determined from a comparison between the predicted orbit and subsequent tracking data or reference orbit if available. The new method is shown to result in a better orbit prediction which has important implications in conjunction assessments and the Space Environment Research Centre space object catalogue. The focus is on obtaining reliable orbital solutions from sparse data. This work forms part of the collaborative effort of the Space Environment Management Cooperative Research Centre which is developing new technologies and strategies to preserve the space environment (www.serc.org.au).
Hardware Evaluation of the Horizontal Exercise Fixture with Weight Stack
NASA Technical Reports Server (NTRS)
Newby, Nate; Leach, Mark; Fincke, Renita; Sharp, Carwyn
2009-01-01
HEF with weight stack seems to be a very sturdy and reliable exercise device that should function well in a bed rest training setting. A few improvements should be made to both the hardware and software to improve usage efficiency, but largely, this evaluation has demonstrated HEF's robustness. The hardware offers loading to muscles, bones, and joints, potentially sufficient to mitigate the loss of muscle mass and bone mineral density during long-duration bed rest campaigns. With some minor modifications, the HEF with weight stack equipment provides the best currently available means of performing squat, heel raise, prone row, bench press, and hip flexion/extension exercise in a supine orientation.
Telemedicine in Anesthesiology and Reanimatology
Tafro, Lejla; Masic, Izet
2010-01-01
Review SUMMARY In recent years impressive progress is happening in information and telecommunication technologies. The application of computers in medicine allows permanent data storage, data transfer from one place to another, retrieving and data processing, data availability at all times, monitoring of patients over time, etc. This can significantly improve the medical profession. Medicine is one of the most intensive users of all types of information and telecommunication technology. Quickly and reliably store and transfer data (text, images, sounds, etc.) provides significant assistance and improvement in almost all medical procedures. In addition, data in locations far from medical centers can be of invaluable benefit, especially in emergency cases in which the decisive role has anesthesiologists. PMID:24222933
Useful Life Prediction for Payload Carrier Hardware
NASA Technical Reports Server (NTRS)
Ben-Arieh, David
2002-01-01
The Space Shuttle has been identified for use through 2020. Payload carrier systems will be needed to support missions through the same time frame. To support the future decision making process with reliable systems, it is necessary to analyze design integrity, identify possible sources of undesirable risk and recognize required upgrades for carrier systems. This project analyzed the information available regarding the carriers and developed the probability of becoming obsolete under different scenarios. In addition, this project resulted in a plan for an improved information system that will improve monitoring and control of the various carriers. The information collected throughout this project is presented in this report as process flow, historical records, and statistical analysis.
A Novel and Intelligent Home Monitoring System for Care Support of Elders with Cognitive Impairment.
Lazarou, Ioulietta; Karakostas, Anastasios; Stavropoulos, Thanos G; Tsompanidis, Theodoros; Meditskos, Georgios; Kompatsiaris, Ioannis; Tsolaki, Magda
2016-10-18
Assistive technology, in the form of a smart home environment, is employed to support people with dementia. To propose a system for continuous and objective remote monitoring of problematic daily living activity areas and design personalized interventions based on system feedback and clinical observations for improving cognitive function and health-related quality of life. The assistive technology of the proposed system, including wearable, sleep, object motion, presence, and utility usage sensors, was methodically deployed at four different home installations of people with cognitive impairment. Detection of sleep patterns, physical activity, and activities of daily living, based on the collected sensor data and analytics, was available at all times through comprehensive data visualization solutions. Combined with clinical observation, targeted psychosocial interventions were introduced to enhance the participants' quality of life and improve their cognitive functions and daily functionality. Meanwhile, participants and their caregivers were able to visualize a reduced set of information tailored to their needs. Overall, paired-sample t-test analysis of monitored qualities revealed improvement for all participants in neuropsychological assessment. Moreover, improvement was detected from the beginning to the end of the trial, in physical condition and in the domains of sleep. Detecting abnormalities via the system, for example in sleep quality, such as REM sleep, has proved to be critical to assess current status, drive interventions, and evaluate improvements in a reliable manner. It has been proved that the proposed system is suitable to support clinicians to reliably drive and evaluate clinical interventions toward quality of life improvement of people with cognitive impairment.
ERIC Educational Resources Information Center
Park, Bitnara Jasmine; Irvin, P. Shawn; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald
2012-01-01
In this technical report, we present the results of a reliability study of the fifth-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…
ERIC Educational Resources Information Center
Lai, Cheng-Fei; Irvin, P. Shawn; Alonzo, Julie; Park, Bitnara Jasmine; Tindal, Gerald
2012-01-01
In this technical report, we present the results of a reliability study of the second-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…
ERIC Educational Resources Information Center
Park, Bitnara Jasmine; Irvin, P. Shawn; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald
2012-01-01
In this technical report, we present the results of a reliability study of the fourth-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…
ERIC Educational Resources Information Center
Irvin, P. Shawn; Alonzo, Julie; Park, Bitnara Jasmine; Lai, Cheng-Fei; Tindal, Gerald
2012-01-01
In this technical report, we present the results of a reliability study of the sixth-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…
ERIC Educational Resources Information Center
Irvin, P. Shawn; Alonzo, Julie; Lai, Cheng-Fei; Park, Bitnara Jasmine; Tindal, Gerald
2012-01-01
In this technical report, we present the results of a reliability study of the seventh-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…
ERIC Educational Resources Information Center
Lai, Cheng-Fei; Irvin, P. Shawn; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald
2012-01-01
In this technical report, we present the results of a reliability study of the third-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…
Transfer learning for biomedical named entity recognition with neural networks.
Giorgi, John M; Bader, Gary D
2018-06-01
The explosive increase of biomedical literature has made information extraction an increasingly important tool for biomedical research. A fundamental task is the recognition of biomedical named entities in text (BNER) such as genes/proteins, diseases, and species. Recently, a domain-independent method based on deep learning and statistical word embeddings, called long short-term memory network-conditional random field (LSTM-CRF), has been shown to outperform state-of-the-art entity-specific BNER tools. However, this method is dependent on gold-standard corpora (GSCs) consisting of hand-labeled entities, which tend to be small but highly reliable. An alternative to GSCs are silver-standard corpora (SSCs), which are generated by harmonizing the annotations made by several automatic annotation systems. SSCs typically contain more noise than GSCs but have the advantage of containing many more training examples. Ideally, these corpora could be combined to achieve the benefits of both, which is an opportunity for transfer learning. In this work, we analyze to what extent transfer learning improves upon state-of-the-art results for BNER. We demonstrate that transferring a deep neural network (DNN) trained on a large, noisy SSC to a smaller, but more reliable GSC significantly improves upon state-of-the-art results for BNER. Compared to a state-of-the-art baseline evaluated on 23 GSCs covering four different entity classes, transfer learning results in an average reduction in error of approximately 11%. We found transfer learning to be especially beneficial for target data sets with a small number of labels (approximately 6000 or less). Source code for the LSTM-CRF is available athttps://github.com/Franck-Dernoncourt/NeuroNER/ and links to the corpora are available athttps://github.com/BaderLab/Transfer-Learning-BNER-Bioinformatics-2018/. john.giorgi@utoronto.ca. Supplementary data are available at Bioinformatics online.
Savage, Trevor Nicholas; McIntosh, Andrew Stuart
2017-03-01
It is important to understand factors contributing to and directly causing sports injuries to improve the effectiveness and safety of sports skills. The characteristics of injury events must be evaluated and described meaningfully and reliably. However, many complex skills cannot be effectively investigated quantitatively because of ethical, technological and validity considerations. Increasingly, qualitative methods are being used to investigate human movement for research purposes, but there are concerns about reliability and measurement bias of such methods. Using the tackle in Rugby union as an example, we outline a systematic approach for developing a skill analysis protocol with a focus on improving objectivity, validity and reliability. Characteristics for analysis were selected using qualitative analysis and biomechanical theoretical models and epidemiological and coaching literature. An expert panel comprising subject matter experts provided feedback and the inter-rater reliability of the protocol was assessed using ten trained raters. The inter-rater reliability results were reviewed by the expert panel and the protocol was revised and assessed in a second inter-rater reliability study. Mean agreement in the second study improved and was comparable (52-90% agreement and ICC between 0.6 and 0.9) with other studies that have reported inter-rater reliability of qualitative analysis of human movement.
Development and Reliability Testing of a Fast-Food Restaurant Observation Form.
Rimkus, Leah; Ohri-Vachaspati, Punam; Powell, Lisa M; Zenk, Shannon N; Quinn, Christopher M; Barker, Dianne C; Pugach, Oksana; Resnick, Elissa A; Chaloupka, Frank J
2015-01-01
To develop a reliable observational data collection instrument to measure characteristics of the fast-food restaurant environment likely to influence consumer behaviors, including product availability, pricing, and promotion. The study used observational data collection. Restaurants were in the Chicago Metropolitan Statistical Area. A total of 131 chain fast-food restaurant outlets were included. Interrater reliability was measured for product availability, pricing, and promotion measures on a fast-food restaurant observational data collection instrument. Analysis was done with Cohen's κ coefficient and proportion of overall agreement for categorical variables and intraclass correlation coefficient (ICC) for continuous variables. Interrater reliability, as measured by average κ coefficient, was .79 for menu characteristics, .84 for kids' menu characteristics, .92 for food availability and sizes, .85 for beverage availability and sizes, .78 for measures on the availability of nutrition information,.75 for characteristics of exterior advertisements, and .62 and .90 for exterior and interior characteristics measures, respectively. For continuous measures, average ICC was .88 for food pricing measures, .83 for beverage prices, and .65 for counts of exterior advertisements. Over 85% of measures demonstrated substantial or almost perfect agreement. Although some measures required revision or protocol clarification, results from this study suggest that the instrument may be used to reliably measure the fast-food restaurant environment.
A reliability analysis tool for SpaceWire network
NASA Astrophysics Data System (ADS)
Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou
2017-04-01
A SpaceWire is a standard for on-board satellite networks as the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection, etc. High reliability is the vital issue for spacecraft. Therefore, it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis with SpaceWire network. According to the function division of distributed network, a reliability analysis method based on a task is proposed, the reliability analysis of every task can lead to the system reliability matrix, the reliability result of the network system can be deduced by integrating these entire reliability indexes in the matrix. With the method, we develop a reliability analysis tool for SpaceWire Network based on VC, where the computation schemes for reliability matrix and the multi-path-task reliability are also implemented. By using this tool, we analyze several cases on typical architectures. And the analytic results indicate that redundancy architecture has better reliability performance than basic one. In practical, the dual redundancy scheme has been adopted for some key unit, to improve the reliability index of the system or task. Finally, this reliability analysis tool will has a directive influence on both task division and topology selection in the phase of SpaceWire network system design.
Chansirinukor, Wunpen; Maher, Christopher G; Latimer, Jane; Hush, Julia
2005-01-01
Retrospective design. To compare the responsiveness and test-retest reliability of the Functional Rating Index and the 18-item version of the Roland-Morris Disability Questionnaire in detecting change in disability in patients with work-related low back pain. Many low back pain-specific disability questionnaires are available, including the Functional Rating Index and the 18-item version of the Roland-Morris Disability Questionnaire. No previous study has compared the responsiveness and reliability of these questionnaires. Files of patients who had been treated for work-related low back pain at a physical therapy clinic were reviewed, and those containing initial and follow-up Functional Rating Index and 18-item Roland-Morris Disability Questionnaires were selected. The responsiveness of both questionnaires was compared using two different methods. First, using the assumption that patients receiving treatment improve over time, various responsiveness coefficients were calculated. Second, using change in work status as an external criterion to identify improved and nonimproved patients, Spearman's rho and receiver operating characteristic curves were calculated. Reliability was estimated from the subset of patients who reported no change in their condition over this period and expressed with the intraclass correlation coefficient and the minimal detectable change. One hundred and forty-three patient files were retrieved. The responsiveness coefficients for the Functional Rating Index were greater than for the 18-item Roland-Morris Disability Questionnaire. The intraclass correlation coefficient values for both questionnaires calculated from 96 patient files were similar, but the minimal detectable change for the Functional Rating Index was less than for the 18-item Roland-Morris Disability Questionnaire. The Functional Rating Index seems preferable to the 18-item Roland-Morris Disability Questionnaire for use in clinical trials and clinical practice.
What do we gain with Probabilistic Flood Loss Models?
NASA Astrophysics Data System (ADS)
Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.
2015-12-01
The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models as for instance depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks and traditional stage damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were respectively compiled after the floods in 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further we stratify the sample according to catchments which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of reliability which is represented by the proportion of the number of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Confronting uncertainty in flood damage predictions
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2015-04-01
Reliable flood damage models are a prerequisite for the practical usefulness of the model results. Oftentimes, traditional uni-variate damage models as for instance depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were respectively compiled after the floods in 2002, 2005 and 2006, in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further we stratify the sample according to catchments which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of reliability which is represented by the proportion of the number of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Weech-Maldonado, Robert; Dreachslin, Janice L; Brown, Julie; Pradhan, Rohit; Rubin, Kelly L; Schiller, Cameron; Hays, Ron D
2012-01-01
The U.S. national standards for culturally and linguistically appropriate services (CLAS) in health care provide guidelines on policies and practices aimed at developing culturally competent systems of care. The Cultural Competency Assessment Tool for Hospitals (CCATH) was developed as an organizational tool to assess adherence to the CLAS standards. First, we describe the development of the CCATH and estimate the reliability and validity of the CCATH measures. Second, we discuss the managerial implications of the CCATH as an organizational tool to assess cultural competency. We pilot tested an initial draft of the CCATH, revised it based on a focus group and cognitive interviews, and then administered it in a field test with a sample of California hospitals. The reliability and validity of the CCATH were evaluated using factor analysis, analysis of variance, and Cronbach's alphas. Exploratory and confirmatory factor analyses identified 12 CCATH composites: leadership and strategic planning, data collection on inpatient population, data collection on service area, performance management systems and quality improvement, human resources practices, diversity training, community representation, availability of interpreter services, interpreter services policies, quality of interpreter services, translation of written materials, and clinical cultural competency practices. All the CCATH scales had internal consistency reliability of .65 or above, and the reliability was .70 or above for 9 of the 12 scales. Analysis of variance results showed that not-for-profit hospitals have higher CCATH scores than for-profit hospitals in five CCATH scales and higher CCATH scores than government hospitals in two CCATH scales. The CCATH showed adequate psychometric properties. Managers and policy makers can use the CCATH as a tool to evaluate hospital performance in cultural competency and identify and target improvements in hospital policies and practices that undergird the provision of CLAS.
Sjoding, Michael W; Hofer, Timothy P; Co, Ivan; Courey, Anthony; Cooke, Colin R; Iwashyna, Theodore J
2018-02-01
Failure to reliably diagnose ARDS may be a major driver of negative clinical trials and underrecognition and treatment in clinical practice. We sought to examine the interobserver reliability of the Berlin ARDS definition and examine strategies for improving the reliability of ARDS diagnosis. Two hundred five patients with hypoxic respiratory failure from four ICUs were reviewed independently by three clinicians, who evaluated whether patients had ARDS, the diagnostic confidence of the reviewers, whether patients met individual ARDS criteria, and the time when criteria were met. Interobserver reliability of an ARDS diagnosis was "moderate" (kappa = 0.50; 95% CI, 0.40-0.59). Sixty-seven percent of diagnostic disagreements between clinicians reviewing the same patient was explained by differences in how chest imaging studies were interpreted, with other ARDS criteria contributing less (identification of ARDS risk factor, 15%; cardiac edema/volume overload exclusion, 7%). Combining the independent reviews of three clinicians can increase reliability to "substantial" (kappa = 0.75; 95% CI, 0.68-0.80). When a clinician diagnosed ARDS with "high confidence," all other clinicians agreed with the diagnosis in 72% of reviews. There was close agreement between clinicians about the time when a patient met all ARDS criteria if ARDS developed within the first 48 hours of hospitalization (median difference, 5 hours). The reliability of the Berlin ARDS definition is moderate, driven primarily by differences in chest imaging interpretation. Combining independent reviews by multiple clinicians or improving methods to identify bilateral infiltrates on chest imaging are important strategies for improving the reliability of ARDS diagnosis. Copyright © 2017 American College of Chest Physicians. All rights reserved.
Wireless and Powerless Sensing Node System Developed for Monitoring Motors.
Lee, Dasheng
2008-08-27
Reliability and maintainability of tooling systems can be improved through condition monitoring of motors. However, it is difficult to deploy sensor nodes due to the harsh environment of industrial plants. Sensor cables are easily damaged, which renders the monitoring system deployed to assure the machine's reliability itself unreliable. A wireless and powerless sensing node integrated with a MEMS (Micro Electro-Mechanical System) sensor, a signal processor, a communication module, and a self-powered generator was developed in this study for implementation of an easily mounted network sensor for monitoring motors. A specially designed communication module transmits a sequence of electromagnetic (EM) pulses in response to the sensor signals. The EM pulses can penetrate through the machine's metal case and delivers signals from the sensor inside the motor to the external data acquisition center. By using induction power, which is generated by the motor's shaft rotation, the sensor node is self-sustaining; therefore, no power line is required. A monitoring system, equipped with novel sensing nodes, was constructed to test its performance. The test results illustrate that, the novel sensing node developed in this study can effectively enhance the reliability of the motor monitoring system and it is expected to be a valuable technology, which will be available to the plant for implementation in a reliable motor management program.
Wireless and Powerless Sensing Node System Developed for Monitoring Motors
Lee, Dasheng
2008-01-01
Reliability and maintainability of tooling systems can be improved through condition monitoring of motors. However, it is difficult to deploy sensor nodes due to the harsh environment of industrial plants. Sensor cables are easily damaged, which renders the monitoring system deployed to assure the machine's reliability itself unreliable. A wireless and powerless sensing node integrated with a MEMS (Micro Electro-Mechanical System) sensor, a signal processor, a communication module, and a self-powered generator was developed in this study for implementation of an easily mounted network sensor for monitoring motors. A specially designed communication module transmits a sequence of electromagnetic (EM) pulses in response to the sensor signals. The EM pulses can penetrate through the machine's metal case and delivers signals from the sensor inside the motor to the external data acquisition center. By using induction power, which is generated by the motor's shaft rotation, the sensor node is self-sustaining; therefore, no power line is required. A monitoring system, equipped with novel sensing nodes, was constructed to test its performance. The test results illustrate that, the novel sensing node developed in this study can effectively enhance the reliability of the motor monitoring system and it is expected to be a valuable technology, which will be available to the plant for implementation in a reliable motor management program. PMID:27873798
Mobile Applications for Women's Health and Midwifery Care: A Pocket Reference for the 21st Century.
Arbour, Megan W; Stec, Melissa A
2018-05-01
Midwives and other women's health care providers are charged with providing high-quality care to women based on the most current available evidence. Quick, reliable, and accurate access to evidence-based information is essential. Numerous smartphone and mobile device applications (apps) are available to assist clinicians in providing care for women. This article discusses clinical reference apps, including those for evidence-based care guidelines, women's health care, pharmacologic reference, laboratory and diagnostic guides, as well as apps for information storage and management, electronic health records, and client education. Midwives and other clinicians are encouraged to thoughtfully integrate mobile apps into their clinical practices to improve client outcomes and clinician and client satisfaction. Although the thousands of health care apps that are available may seem daunting, this article highlights key apps that may help clinicians improve their care of women. By adding one app at a time, midwives and other women's health care providers can successfully integrate mobile apps into clinical practice. © 2018 by the American College of Nurse-Midwives.
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K. T.
2012-12-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing value of water resources and influences operation of hydropower reservoirs significantly. Improving hourly reservoir inflow forecasts over a 24 hours lead-time is considered with the day-ahead (Elspot) market of the Nordic exchange market in perspectives. The procedure presented comprises of an error model added on top of an un-alterable constant parameter conceptual model, and a sequential data assimilation routine. The structure of the error model was investigated using freely available software for detecting mathematical relationships in a given dataset (EUREQA) and adopted to contain minimum complexity for computational reasons. As new streamflow data become available the extra information manifested in the discrepancies between measurements and conceptual model outputs are extracted and assimilated into the forecasting system recursively using Sequential Monte Carlo technique. Besides improving forecast skills significantly, the probabilistic inflow forecasts provided by the present approach entrains suitable information for reducing uncertainty in decision making processes related to hydropower systems operation. The potential of the current procedure for improving accuracy of inflow forecasts at lead-times unto 24 hours and its reliability in different seasons of the year will be illustrated and discussed thoroughly.
Seeking high reliability in primary care: Leadership, tools, and organization.
Weaver, Robert R
2015-01-01
Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls that arose and which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliable-seeking culture across an organization. Progress toward a reliability-seeking, system-oriented approach to care remains ongoing, and movement in that direction requires deliberate and sustained effort by committed leaders in health care.
Watts, Bradley V; Williams, Linda; Mills, Peter D; Paull, Douglas E; Cully, Jeffrey A; Gilman, Stuart C; Hemphill, Robin R
2018-06-15
Developing a workforce skilled in improving the safety of medical care has often been cited as an important means to achieve safer care. Although some educational programs geared toward patient safety have been developed, few advanced training programs have been described in the literature. We describe the development of a patient safety fellowship program. We describe the development and curriculum of an Interprofessional Fellowship in Patient Safety. The 1-year in residence fellowship focuses on domains such as leadership, spreading innovations, medical improvement, patient safety culture, reliability science, and understanding errors. Specific training in patient safety is available and has been delivered to 48 fellows from a wide range of backgrounds. Fellows have accomplished much in terms of improvement projects, educational innovations, and publications. After completing the fellowship program, fellows are obtaining positions within health-care quality and safety and are likely to make long-term contributions. We offer a curriculum and fellowship design for the topic of patient safety. Available evidence suggests that the fellowship results in the development of patient safety professionals.
The Reliability of Psychiatric Diagnosis Revisited
Rankin, Eric; France, Cheryl; El-Missiry, Ahmed; John, Collin
2006-01-01
Background: The authors reviewed the topic of reliability of psychiatric diagnosis from the turn of the 20th century to present. The objectives of this paper are to explore the reasons of unreliability of psychiatric diagnosis and propose ways to improve the reliability of psychiatric diagnosis. Method: The authors reviewed the literature on the concept of reliability of psychiatric diagnosis with emphasis on the impact of interviewing skills, use of diagnostic criteria, and structured interviews on the reliability of psychiatric diagnosis. Results: Causes of diagnostic unreliability are attributed to the patient, the clinician and psychiatric nomenclature. The reliability of psychiatric diagnosis can be enhanced by using diagnostic criteria, defining psychiatric symptoms and structuring the interviews. Conclusions: The authors propose the acronym ‘DR.SED,' which stands for diagnostic criteria, reference definitions, structuring the interview, clinical experience, and data. The authors recommend that clinicians use the DR.SED paradigm to improve the reliability of psychiatric diagnoses. PMID:21103149
NASA Astrophysics Data System (ADS)
Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo
2018-05-01
The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application for RBDO is quite challenge owing to the expensive computational cost incurred by the repeated reliability evaluation and Hessian calculation of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
Creating Highly Reliable Accountable Care Organizations.
Vogus, Timothy J; Singer, Sara J
2016-12-01
Accountable Care Organizations' (ACOs) pursuit of the triple aim of higher quality, lower cost, and improved population health has met with mixed results. To improve the design and implementation of ACOs we look to organizations that manage similarly complex, dynamic, and tightly coupled conditions while sustaining exceptional performance known as high-reliability organizations. We describe the key processes through which organizations achieve reliability, the leadership and organizational practices that enable it, and the role that professionals can play when charged with enacting it. Specifically, we present concrete practices and processes from health care organizations pursuing high-reliability and from early ACOs to illustrate how the triple aim may be met by cultivating mindful organizing, practicing reliability-enhancing leadership, and identifying and supporting reliability professionals. We conclude by proposing a set of research questions to advance the study of ACOs and high-reliability research. © The Author(s) 2016.
Palmer, Clare E; Langbehn, Douglas; Tabrizi, Sarah J; Papoutsi, Marina
2017-01-01
Cognitive impairment is common amongst many neurodegenerative movement disorders such as Huntington's disease (HD) and Parkinson's disease (PD) across multiple domains. There are many tasks available to assess different aspects of this dysfunction, however, it is imperative that these show high test-retest reliability if they are to be used to track disease progression or response to treatment in patient populations. Moreover, in order to ensure effects of practice across testing sessions are not misconstrued as clinical improvement in clinical trials, tasks which are particularly vulnerable to practice effects need to be highlighted. In this study we evaluated test-retest reliability in mean performance across three testing sessions of four tasks that are commonly used to measure cognitive dysfunction associated with striatal impairment: a combined Simon Stop-Signal Task; a modified emotion recognition task; a circle tracing task; and the trail making task. Practice effects were seen between sessions 1 and 2 across all tasks for the majority of dependent variables, particularly reaction time variables; some, but not all, diminished in the third session. Good test-retest reliability across all sessions was seen for the emotion recognition, circle tracing, and trail making test. The Simon interference effect and stop-signal reaction time (SSRT) from the combined-Simon-Stop-Signal task showed moderate test-retest reliability, however, the combined SSRT interference effect showed poor test-retest reliability. Our results emphasize the need to use control groups when tracking clinical progression or use pre-baseline training on tasks susceptible to practice effects.
Implementing eco friendly highly reliable upload feature using multi 3G service
NASA Astrophysics Data System (ADS)
Tanutama, Lukas; Wijaya, Rico
2017-12-01
The current trend of eco friendly Internet access is preferred. In this research the understanding of eco friendly is minimum power consumption. The devices that are selected have operationally low power consumption and normally have no power consumption as they are hibernating during idle state. To have the reliability a router of a router that has internal load balancing feature will provide the improvement of previous research on multi 3G services for broadband lines. Previous studies emphasized on accessing and downloading information files from Public Cloud residing Web Servers. The demand is not only for speed but high reliability of access as well. High reliability will mean mitigating both direct and indirect high cost due to repeated attempts of uploading and downloading the large files. Nomadic and mobile computer users need viable solution. Following solution for downloading information has been proposed and tested. The solution is promising. The result is now extended to providing reliable access line by means of redundancy and automatic reconfiguration for uploading and downloading large information files to a Web Server in the Cloud. The technique is taking advantage of internal load balancing feature to provision a redundant line acting as a backup line. A router that has the ability to provide load balancing to several WAN lines is chosen. The WAN lines are constructed using multiple 3G lines. The router supports the accessing Internet with more than one 3G access line which increases the reliability and availability of the Internet access as the second line immediately takes over if the first line is disturbed.
Postmortem time estimation using body temperature and a finite-element computer model.
den Hartog, Emiel A; Lotens, Wouter A
2004-09-01
In the Netherlands most murder victims are found 2-24 h after the crime. During this period, body temperature decrease is the most reliable method to estimate the postmortem time (PMT). Recently, two murder cases were analysed in which currently available methods did not provide a sufficiently reliable estimate of the PMT. In both cases a study was performed to verify the statements of suspects. For this purpose a finite-element computer model was developed that simulates a human torso and its clothing. With this model, changes to the body and the environment can also be modelled; this was very relevant in one of the cases, as the body had been in the presence of a small fire. In both cases it was possible to falsify the statements of the suspects by improving the accuracy of the PMT estimate. The estimated PMT in both cases was within the range of Henssge's model. The standard deviation of the PMT estimate was 35 min in the first case and 45 min in the second case, compared to 168 min (2.8 h) in Henssge's model. In conclusion, the model as presented here can have additional value for improving the accuracy of the PMT estimate. In contrast to the simple model of Henssge, the current model allows for increased accuracy when more detailed information is available. Moreover, the sensitivity of the predicted PMT for uncertainty in the circumstances can be studied, which is crucial to the confidence of the judge in the results.
NASA Astrophysics Data System (ADS)
Miao, Yongchun; Kang, Rongxue; Chen, Xuefeng
2017-12-01
In recent years, with the gradual extension of reliability research, the study of production system reliability has become the hot topic in various industries. Man-machine-environment system is a complex system composed of human factors, machinery equipment and environment. The reliability of individual factor must be analyzed in order to gradually transit to the research of three-factor reliability. Meanwhile, the dynamic relationship among man-machine-environment should be considered to establish an effective blurry evaluation mechanism to truly and effectively analyze the reliability of such systems. In this paper, based on the system engineering, fuzzy theory, reliability theory, human error, environmental impact and machinery equipment failure theory, the reliabilities of human factor, machinery equipment and environment of some chemical production system were studied by the method of fuzzy evaluation. At last, the reliability of man-machine-environment system was calculated to obtain the weighted result, which indicated that the reliability value of this chemical production system was 86.29. Through the given evaluation domain it can be seen that the reliability of man-machine-environment integrated system is in a good status, and the effective measures for further improvement were proposed according to the fuzzy calculation results.
Durability reliability analysis for corroding concrete structures under uncertainty
NASA Astrophysics Data System (ADS)
Zhang, Hao
2018-02-01
This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
Improving Distribution Resiliency with Microgrids and State and Parameter Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuffner, Francis K.; Williams, Tess L.; Schneider, Kevin P.
Modern society relies on low-cost reliable electrical power, both to maintain industry, as well as provide basic social services to the populace. When major disturbances occur, such as Hurricane Katrina or Hurricane Sandy, the nation’s electrical infrastructure can experience significant outages. To help prevent the spread of these outages, as well as facilitating faster restoration after an outage, various aspects of improving the resiliency of the power system are needed. Two such approaches are breaking the system into smaller microgrid sections, and to have improved insight into the operations to detect failures or mis-operations before they become critical. Breaking themore » system into smaller sections of microgrid islands, power can be maintained in smaller areas where distribution generation and energy storage resources are still available, but bulk power generation is no longer connected. Additionally, microgrid systems can maintain service to local pockets of customers when there has been extensive damage to the local distribution system. However, microgrids are grid connected a majority of the time and implementing and operating a microgrid is much different than when islanded. This report discusses work conducted by the Pacific Northwest National Laboratory that developed improvements for simulation tools to capture the characteristics of microgrids and how they can be used to develop new operational strategies. These operational strategies reduce the cost of microgrid operation and increase the reliability and resilience of the nation’s electricity infrastructure. In addition to the ability to break the system into microgrids, improved observability into the state of the distribution grid can make the power system more resilient. State estimation on the transmission system already provides great insight into grid operations and detecting abnormal conditions by leveraging existing measurements. These transmission-level approaches are expanded to using advanced metering infrastructure and other distribution-level measurements to create a three-phase, unbalanced distribution state estimation approach. With distribution-level state estimation, the grid can be operated more efficiently, and outages or equipment failures can be caught faster, improving the overall resilience and reliability of the grid.« less
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-03
..., EPRI/NRC- RES Fire Human Reliability Analysis Guidelines, Draft Report for Comment AGENCY: Nuclear... Human Reliability Analysis Guidelines, Draft Report for Comment'' (December 11, 2009; 74 FR 65810). This... Human Reliability Analysis Guidelines'' is available electronically under ADAMS Accession Number...
Troester, Jordan C.; Jasmin, Jason G.; Duffield, Rob
2018-01-01
The present study examined the inter-trial (within test) and inter-test (between test) reliability of single-leg balance and single-leg landing measures performed on a force plate in professional rugby union players using commercially available software (SpartaMARS, Menlo Park, USA). Twenty-four players undertook test – re-test measures on two occasions (7 days apart) on the first training day of two respective pre-season weeks following 48h rest and similar weekly training loads. Two 20s single-leg balance trials were performed on a force plate with eyes closed. Three single-leg landing trials were performed by jumping off two feet and landing on one foot in the middle of a force plate 1m from the starting position. Single-leg balance results demonstrated acceptable inter-trial reliability (ICC = 0.60-0.81, CV = 11-13%) for sway velocity, anterior-posterior sway velocity, and mediolateral sway velocity variables. Acceptable inter-test reliability (ICC = 0.61-0.89, CV = 7-13%) was evident for all variables except mediolateral sway velocity on the dominant leg (ICC = 0.41, CV = 15%). Single-leg landing results only demonstrated acceptable inter-trial reliability for force based measures of relative peak landing force and impulse (ICC = 0.54-0.72, CV = 9-15%). Inter-test results indicate improved reliability through the averaging of three trials with force based measures again demonstrating acceptable reliability (ICC = 0.58-0.71, CV = 7-14%). Of the variables investigated here, total sway velocity and relative landing impulse are the most reliable measures of single-leg balance and landing performance, respectively. These measures should be considered for monitoring potential changes in postural control in professional rugby union. Key points Single-leg balance demonstrated acceptable inter-trial and inter-test reliability. Single-leg landing demonstrated good inter-trial and inter-test reliability for measures of relative peak landing force and relative impulse, but not time to stabilization. Of the variables investigated, sway velocity and relative landing impulse are the most reliable measures of single-leg balance and landing respectively, and should considered for monitoring changes in postural control. PMID:29769817
Kragic, Rastislav; Kostic, Mirjana
2018-01-01
In this paper, we present the construction of a reliable and inexpensive pH stat device, by using open-source “OpenPhControl” software, inexpensive hardware (a peristaltic and a syringe pump, Arduino, a step motor…), readily available laboratory devices: a pH meter, a computer, a webcam, and some 3D printed parts. We provide a methodology for the design, development and test results of each part of the device, as well as of the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents by an open-source syringe pump. The upgrading of the basic open-source syringe pump is given in terms of pump control and application of a larger syringe. In addition to the basic functions of pH stat, i.e. pH value measurement and maintenance, an improvement allowing the device to be used for potentiometric titration has been made as well. We have demonstrated the device’s utility when applied for cellulose fibers oxidation with 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. for TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that the open-source scientific tools are available to everyone, and that researchers can construct and adjust the device according to their needs, as well as, that the total cost of the open-source pH stat device, excluding the existing laboratory equipment (pH meter, computer and glossary) was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offers, our open-source pH stat can significantly improve experimental work where the use of pH stat is necessary. PMID:29509793
Milanovic, Jovana Z; Milanovic, Predrag; Kragic, Rastislav; Kostic, Mirjana
2018-01-01
In this paper, we present the construction of a reliable and inexpensive pH stat device, by using open-source "OpenPhControl" software, inexpensive hardware (a peristaltic and a syringe pump, Arduino, a step motor…), readily available laboratory devices: a pH meter, a computer, a webcam, and some 3D printed parts. We provide a methodology for the design, development and test results of each part of the device, as well as of the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents by an open-source syringe pump. The upgrading of the basic open-source syringe pump is given in terms of pump control and application of a larger syringe. In addition to the basic functions of pH stat, i.e. pH value measurement and maintenance, an improvement allowing the device to be used for potentiometric titration has been made as well. We have demonstrated the device's utility when applied for cellulose fibers oxidation with 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. for TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that the open-source scientific tools are available to everyone, and that researchers can construct and adjust the device according to their needs, as well as, that the total cost of the open-source pH stat device, excluding the existing laboratory equipment (pH meter, computer and glossary) was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offers, our open-source pH stat can significantly improve experimental work where the use of pH stat is necessary.
Reliability and Maintainability Engineering - A Major Driver for Safety and Affordability
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.
2011-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of an effort to design and build a safe and affordable heavy lift vehicle to go to the moon and beyond. To achieve that, NASA is seeking more innovative and efficient approaches to reduce cost while maintaining an acceptable level of safety and mission success. One area that has the potential to contribute significantly to achieving NASA safety and affordability goals is Reliability and Maintainability (R&M) engineering. Inadequate reliability or failure of critical safety items may directly jeopardize the safety of the user(s) and result in a loss of life. Inadequate reliability of equipment may directly jeopardize mission success. Systems designed to be more reliable (fewer failures) and maintainable (fewer resources needed) can lower the total life cycle cost. The Department of Defense (DOD) and industry experience has shown that optimized and adequate levels of R&M are critical for achieving a high level of safety and mission success, and low sustainment cost. Also, lessons learned from the Space Shuttle program clearly demonstrated the importance of R&M engineering in designing and operating safe and affordable launch systems. The Challenger and Columbia accidents are examples of the severe impact of design unreliability and process induced failures on system safety and mission success. These accidents demonstrated the criticality of reliability engineering in understanding component failure mechanisms and integrated system failures across the system elements interfaces. Experience from the shuttle program also shows that insufficient Reliability, Maintainability, and Supportability (RMS) engineering analyses upfront in the design phase can significantly increase the sustainment cost and, thereby, the total life cycle cost. Emphasis on RMS during the design phase is critical for identifying the design features and characteristics needed for time efficient processing, improved operational availability, and optimized maintenance and logistic support infrastructure. This paper discusses the role of R&M in a program acquisition phase and the potential impact of R&M on safety, mission success, operational availability, and affordability. This includes discussion of the R&M elements that need to be addressed and the R&M analyses that need to be performed in order to support a safe and affordable system design. The paper also provides some lessons learned from the Space Shuttle program on the impact of R&M on safety and affordability.
Universal first-order reliability concept applied to semistatic structures
NASA Technical Reports Server (NTRS)
Verderaime, V.
1994-01-01
A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and reliability selection criterion. The method provides a reliability design factor derived from the reliability criterion which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable for air and surface vehicles semistatic structures.
Universal first-order reliability concept applied to semistatic structures
NASA Astrophysics Data System (ADS)
Verderaime, V.
1994-07-01
A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and reliability selection criterion. The method provides a reliability design factor derived from the reliability criterion which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable for air and surface vehicles semistatic structures.
Heroic Reliability Improvement in Manned Space Systems
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2017-01-01
System reliability can be significantly improved by a strong continued effort to identify and remove all the causes of actual failures. Newly designed systems often have unexpected high failure rates which can be reduced by successive design improvements until the final operational system has an acceptable failure rate. There are many causes of failures and many ways to remove them. New systems may have poor specifications, design errors, or mistaken operations concepts. Correcting unexpected problems as they occur can produce large early gains in reliability. Improved technology in materials, components, and design approaches can increase reliability. The reliability growth is achieved by repeatedly operating the system until it fails, identifying the failure cause, and fixing the problem. The failure rate reduction that can be obtained depends on the number and the failure rates of the correctable failures. Under the strong assumption that the failure causes can be removed, the decline in overall failure rate can be predicted. If a failure occurs at the rate of lambda per unit time, the expected time before the failure occurs and can be corrected is 1/lambda, the Mean Time Before Failure (MTBF). Finding and fixing a less frequent failure with the rate of lambda/2 per unit time requires twice as long, time of 1/(2 lambda). Cutting the failure rate in half requires doubling the test and redesign time and finding and eliminating the failure causes.Reducing the failure rate significantly requires a heroic reliability improvement effort.
Barbui, C; Dua, T; Kolappa, K; Saraceno, B; Saxena, S
2017-10-01
In recent years a number of intergovernmental initiatives have been activated in order to enhance the capacity of countries to improve access to essential medicines, particularly for mental disorders. In May 2013 the 66th World Health Assembly adopted the World Health Organization (WHO) Comprehensive Mental Health Action Plan 2013-2020, which builds upon the work of WHO's Mental Health Gap Action Programme. Within this programme, evidence-based guidelines for mental disorders were developed, including recommendations on appropriate use of medicines. Subsequently, the 67th World Health Assembly adopted a resolution on access to essential medicines, which urged Member States to improve national policies for the selection of essential medicines and to promote their availability, affordability and appropriate use. Following the precedent set by these important initiatives, this article presents eleven actions for improving access and appropriate use of psychotropic medicines. A 4 × 4 framework mapping actions as a function of the four components of access - selection, availability, affordability and appropriate use - and across four different health care levels, three of which belong to the supply side and one to the demand side, was developed. The actions are: developing a medicine selection process; promoting information and education activities for staff and end-users; developing a medicine regulation process; implementing a reliable supply system; implementing a reliable quality-control system; developing a community-based system of mental health care and promoting help-seeking behaviours; developing international agreements on medicine affordability; developing pricing policies and a sustainable financing system; developing or adopting evidence-based guidelines; monitoring the use of psychotropic medicines; promoting training initiatives for staff and end-users on critical appraisal of scientific evidence and appropriate use of psychotropic medicines. Activating these actions offers an unique opportunity to address the broader issue of increasing access to treatments and care for mental disorders, as current lack of attention to mental disorders is a central barrier across all domains of the 4 × 4 access framework.
Medicine is not science: guessing the future, predicting the past.
Miller, Clifford
2014-12-01
Irregularity limits human ability to know, understand and predict. A better understanding of irregularity may improve the reliability of knowledge. Irregularity and its consequences for knowledge are considered. Reliable predictive empirical knowledge of the physical world has always been obtained by observation of regularities, without needing science or theory. Prediction from observational knowledge can remain reliable despite some theories based on it proving false. A naïve theory of irregularity is outlined. Reducing irregularity and/or increasing regularity can increase the reliability of knowledge. Beyond long experience and specialization, improvements include implementing supporting knowledge systems of libraries of appropriately classified prior cases and clinical histories and education about expertise, intuition and professional judgement. A consequence of irregularity and complexity is that classical reductionist science cannot provide reliable predictions of the behaviour of complex systems found in nature, including of the human body. Expertise, expert judgement and their exercise appear overarching. Diagnosis involves predicting the past will recur in the current patient applying expertise and intuition from knowledge and experience of previous cases and probabilistic medical theory. Treatment decisions are an educated guess about the future (prognosis). Benefits of the improvements suggested here are likely in fields where paucity of feedback for practitioners limits development of reliable expert diagnostic intuition. Further analysis, definition and classification of irregularity is appropriate. Observing and recording irregularities are initial steps in developing irregularity theory to improve the reliability and extent of knowledge, albeit some forms of irregularity present inherent difficulties. © 2014 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogue, F.; Binnall, E.P.
1982-10-01
Reliable instrumentation will be needed to monitor the performance of future high-level waste repository sites. A study has been made to assess instrument reliability at Department of Energy (DOE) waste repository related experiments. Though the study covers a wide variety of instrumentation, this paper concentrates on experiences with geotechnical instrumentation in hostile repository-type environments. Manufacturers have made some changes to improve the reliability of instruments for repositories. This paper reviews the failure modes, rates, and mechanisms, along with manufacturer modifications and recommendations for additional improvements to enhance instrument performance. 4 tables.
Leszczynski, Dariusz; Xu, Zhengping
2010-01-27
There is ongoing discussion whether the mobile phone radiation causes any health effects. The International Commission on Non-Ionizing Radiation Protection, the International Committee on Electromagnetic Safety and the World Health Organization are assuring that there is no proven health risk and that the present safety limits protect all mobile phone users. However, based on the available scientific evidence, the situation is not as clear. The majority of the evidence comes from in vitro laboratory studies and is of very limited use for determining health risk. Animal toxicology studies are inadequate because it is not possible to "overdose" microwave radiation, as it is done with chemical agents, due to simultaneous induction of heating side-effects. There is a lack of human volunteer studies that would, in unbiased way, demonstrate whether human body responds at all to mobile phone radiation. Finally, the epidemiological evidence is insufficient due to, among others, selection and misclassification bias and the low sensitivity of this approach in detection of health risk within the population. This indicates that the presently available scientific evidence is insufficient to prove reliability of the current safety standards. Therefore, we recommend to use precaution when dealing with mobile phones and, whenever possible and feasible, to limit body exposure to this radiation. Continuation of the research on mobile phone radiation effects is needed in order to improve the basis and the reliability of the safety standards.
2010-01-01
There is ongoing discussion whether the mobile phone radiation causes any health effects. The International Commission on Non-Ionizing Radiation Protection, the International Committee on Electromagnetic Safety and the World Health Organization are assuring that there is no proven health risk and that the present safety limits protect all mobile phone users. However, based on the available scientific evidence, the situation is not as clear. The majority of the evidence comes from in vitro laboratory studies and is of very limited use for determining health risk. Animal toxicology studies are inadequate because it is not possible to "overdose" microwave radiation, as it is done with chemical agents, due to simultaneous induction of heating side-effects. There is a lack of human volunteer studies that would, in unbiased way, demonstrate whether human body responds at all to mobile phone radiation. Finally, the epidemiological evidence is insufficient due to, among others, selection and misclassification bias and the low sensitivity of this approach in detection of health risk within the population. This indicates that the presently available scientific evidence is insufficient to prove reliability of the current safety standards. Therefore, we recommend to use precaution when dealing with mobile phones and, whenever possible and feasible, to limit body exposure to this radiation. Continuation of the research on mobile phone radiation effects is needed in order to improve the basis and the reliability of the safety standards. PMID:20205835
Can we save large carnivores without losing large carnivore science?
Allen, Benjamin L.; Allen, Lee R.; Andrén, Henrik; Ballard, Guy; Boitani, Luigi; Engeman, Richard M.; Fleming, Peter J.S.; Haswell, Peter M.; Ford, Adam T.; Kowalczyk, Rafał; Mech, L. David; Linnell, John D.C.; Parker, Daniel M.
2017-01-01
Large carnivores are depicted to shape entire ecosystems through top-down processes. Studies describing these processes are often used to support interventionist wildlife management practices, including carnivore reintroduction or lethal control programs. Unfortunately, there is an increasing tendency to ignore, disregard or devalue fundamental principles of the scientific method when communicating the reliability of current evidence for the ecological roles that large carnivores may play, eroding public confidence in large carnivore science and scientists. Here, we discuss six interrelated issues that currently undermine the reliability of the available literature on the ecological roles of large carnivores: (1) the overall paucity of available data, (2) reliability of carnivore population sampling techniques, (3) general disregard for alternative hypotheses to top-down forcing, (4) lack of applied science studies, (5) frequent use of logical fallacies, and (6) generalisation of results from relatively pristine systems to those substantially altered by humans. We first describe how widespread these issues are, and given this, show, for example, that evidence for the roles of wolves (Canis lupus) and dingoes (Canis lupus dingo) in initiating trophic cascades is not as strong as is often claimed. Managers and policy makers should exercise caution when relying on this literature to inform wildlife management decisions. We emphasise the value of manipulative experiments and discuss the role of scientific knowledge in the decision-making process. We hope that the issues we raise here prompt deeper consideration of actual evidence, leading towards an improvement in both the rigour and communication of large carnivore science.
Modular photovoltaic stand-alone systems: Phase 1
NASA Technical Reports Server (NTRS)
Naff, G. J.; Marshall, N. A.
1983-01-01
A family of modular stand-alone power systems that covered the range in power level from 1 kw to 14 kw was developed. Products within this family were required to be easily adaptable to different environments and applications, and were to be both reliable and cost effective. Additionally, true commonality in hardware was to be exploited, and unnecessary recurrence of design and development costs were to be minimized; thus improving hardware availability. Assurance of compatibility with large production runs, was also an underlying program goal. A secondary objective was to compile, evaluate, and determine the economic and technical status of available, and potentially available, technology options associated with the balance of systems (BOS) for stand-along photovoltaic (PV) power systems. The secondary objective not only directly supported the primary but additionally contributed to the definition and implementation of the BOS cost reduction plan.
Modeling integrated water user decisions in intermittent supply systems
NASA Astrophysics Data System (ADS)
Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.
2007-07-01
We apply systems analysis to estimate household water use in an intermittent supply system considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing. It also suggests potential market penetration for conservation actions, associated water savings, and subsidies to entice further adoption. We discuss new insights to size, target, and finance conservation.
MacLean, Sharon; Geddes, Fiona; Kelly, Michelle; Della, Phillip
2018-03-01
Simulated patients (SPs) are frequently used for training nursing students in communication skills. An acknowledged benefit of using SPs is the opportunity to provide a standardized approach by which participants can demonstrate and develop communication skills. However, relatively little evidence is available on how to best facilitate and evaluate the reliability and accuracy of SPs' performances. The aim of this study is to investigate the effectiveness of an evidenced based SP training framework to ensure standardization of SPs. The training framework was employed to improve inter-rater reliability of SPs. A quasi-experimental study was employed to assess SP post-training understanding of simulation scenario parameters using inter-rater reliability agreement indices. Two phases of data collection took place. Initially a trial phase including audio-visual (AV) recordings of two undergraduate nursing students completing a simulation scenario is rated by eight SPs using the Interpersonal Communication Assessments Scale (ICAS) and Quality of Discharge Teaching Scale (QDTS). In phase 2, eight SP raters and four nursing faculty raters independently evaluated students' (N=42) communication practices using the QDTS. Intraclass correlation coefficients (ICC) were >0.80 for both stages of the study in clinical communication skills. The results support the premise that if trained appropriately, SPs have a high degree of reliability and validity to both facilitate and evaluate student performance in nurse education. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
Reliability Analysis of a Green Roof Under Different Storm Scenarios
NASA Astrophysics Data System (ADS)
William, R. K.; Stillwell, A. S.
2015-12-01
Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the efficiency in runoff reduction provided by a green roof compared to a conventional roof under different storm scenarios. We then used the 2D distributed surface water-groundwater coupled model MIKE SHE to model the impact that a real green roof might have on runoff in different storm events. We then employed a multiple regression analysis to generate an algebraic demand model that was input into the Matlab-based reliability analysis model FERUM, which was then used to calculate the probability of failure. The use of reliability analysis as a part of green infrastructure design code can provide insights into green roof weaknesses and areas for improvement. It also supports the design of code that is more resilient than current standards and is easily testable for failure. Finally, the understanding of reliability of a single green roof module under different scenarios can support holistic testing of system reliability.
Wennberg, Richard; Cheyne, Douglas
2014-05-01
To assess the reliability of MEG source imaging (MSI) of anterior temporal spikes through detailed analysis of the localization and orientation of source solutions obtained for a large number of spikes that were separately confirmed by intracranial EEG to be focally generated within a single, well-characterized spike focus. MSI was performed on 64 identical right anterior temporal spikes from an anterolateral temporal neocortical spike focus. The effects of different volume conductors (sphere and realistic head model), removal of noise with low frequency filters (LFFs) and averaging multiple spikes were assessed in terms of the reliability of the source solutions. MSI of single spikes resulted in scattered dipole source solutions that showed reasonable reliability for localization at the lobar level, but only for solutions with a goodness-of-fit exceeding 80% using a LFF of 3 Hz. Reliability at a finer level of intralobar localization was limited. Spike averaging significantly improved the reliability of source solutions and averaging 8 or more spikes reduced dependency on goodness-of-fit and data filtering. MSI performed on topographically identical individual spikes from an intracranially defined classical anterior temporal lobe spike focus was limited by low reliability (i.e., scattered source solutions) in terms of fine, sublobar localization within the ipsilateral temporal lobe. Spike averaging significantly improved reliability. MSI performed on individual anterior temporal spikes is limited by low reliability. Reduction of background noise through spike averaging significantly improves the reliability of MSI solutions. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Touati, Said; Chennai, Salim; Souli, Aissa
The increased requirements on supervision, control, and performance in modern power systems make power quality monitoring a common practise for utilities. Large databases are created and automatic processing of the data is required for fast and effective use of the available information. Aim of the work presented in this paper is the development of tools for analysis of monitoring power quality data and in particular measurements of voltage and currents in various level of electrical power distribution. The study is extended to evaluate the reliability of the electrical system in nuclear plant. Power Quality is a measure of how wellmore » a system supports reliable operation of its loads. A power disturbance or event can involve voltage, current, or frequency. Power disturbances can originate in consumer power systems, consumer loads, or the utility. The effect of power quality problems is the loss power supply leading to severe damage to equipments. So, we try to track and improve system reliability. The assessment can be focused on the study of impact of short circuits on the system, harmonics distortion, power factor improvement and effects of transient disturbances on the Electrical System during motor starting and power system fault conditions. We focus also on the review of the Electrical System design against the Nuclear Directorate Safety Assessment principles, including those extended during the last Fukushima nuclear accident. The simplified configuration of the required system can be extended from this simple scheme. To achieve these studies, we have used a demo ETAP power station software for several simulations. (authors)« less
Teamwork Assessment Tools in Obstetric Emergencies: A Systematic Review.
Onwochei, Desire N; Halpern, Stephen; Balki, Mrinalini
2017-06-01
Team-based training and simulation can improve patient safety, by improving communication, decision making, and performance of team members. Currently, there is no general consensus on whether or not a specific assessment tool is better adapted to evaluate teamwork in obstetric emergencies. The purpose of this qualitative systematic review was to find the tools available to assess team effectiveness in obstetric emergencies. We searched Embase, Medline, PubMed, Web of Science, PsycINFO, CINAHL, and Google Scholar for prospective studies that evaluated nontechnical skills in multidisciplinary teams involving obstetric emergencies. The search included studies from 1944 until January 11, 2016. Data on reliability and validity measures were collected and used for interpretation. A descriptive analysis was performed on the data. Thirteen studies were included in the final qualitative synthesis. All the studies assessed teams in the context of obstetric simulation scenarios, but only six included anesthetists in the simulations. One study evaluated their teamwork tool using just validity measures, five using just reliability measures, and one used both. The most reliable tools identified were the Clinical Teamwork Scale, the Global Assessment of Obstetric Team Performance, and the Global Rating Scale of performance. However, they were still lacking in terms of quality and validity. More work needs to be conducted to establish the validity of teamwork tools for nontechnical skills, and the development of an ideal tool is warranted. Further studies are required to assess how outcomes, such as performance and patient safety, are influenced when using these tools.
Reusable Solid Rocket Motor - Accomplishment, Lessons, and a Culture of Success
NASA Technical Reports Server (NTRS)
Moore, D. R.; Phelps, W. J.
2011-01-01
The Reusable Solid Rocket Motor (RSRM) represents the largest solid rocket motor (SRM) ever flown and the only human-rated solid motor. High reliability of the RSRM has been the result of challenges addressed and lessons learned. Advancements have resulted by applying attention to process control, testing, and postflight through timely and thorough communication in dealing with all issues. A structured and disciplined approach was taken to identify and disposition all concerns. Careful consideration and application of alternate opinions was embraced. Focus was placed on process control, ground test programs, and postflight assessment. Process control is mandatory for an SRM, because an acceptance test of the delivered product is not feasible. The RSRM maintained both full-scale and subscale test articles, which enabled continuous improvement of design and evaluation of process control and material behavior. Additionally RSRM reliability was achieved through attention to detail in post flight assessment to observe any shift in performance. The postflight analysis and inspections provided invaluable reliability data as it enables observation of actual flight performance, most of which would not be available if the motors were not recovered. RSRM reusability offered unique opportunities to learn about the hardware. NASA is moving forward with the Space Launch System that incorporates propulsion systems that takes advantage of the heritage Shuttle and Ares solid motor programs. These unique challenges, features of the RSRM, materials and manufacturing issues, and design improvements will be discussed in the paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report documents the results of the Defense Programs (DP) Augmented Evaluation Team (AET) review of emergency and backup power supplies (i.e., generator, uninterruptible power supply, and battery systems) at DP facilities. The review was conducted in response to concerns expressed by former Secretary of Energy James D. Watkins over the number of incidents where backup power sources failed to provide electrical power during tests or actual demands. The AET conducted a series of on-site reviews for the purpose of understanding the design, operation, maintenance, and safety significance of emergency and backup power (E&BP) supplies. The AET found that themore » quality of programs related to maintenance of backup power systems varies greatly among the sites visited, and often among facilities at the same site. No major safety issues were identified. However, there are areas where the AET believes the reliability of emergency and backup power systems can and should be improved. Recommendations for improving the performance of E&BP systems are provided in this report. The report also discusses progress made by Management and Operating (M&O) contractors to improve the reliability of backup sources used in safety significant applications. One area that requires further attention is the analysis and understanding of the safety implications of backup power equipment. This understanding is needed for proper graded-approach implementation of Department of Energy (DOE) Orders, and to help ensure that equipment important to the safety of DOE workers, the public, and the environment is identified, classified, recognized, and treated as such by designers, users, and maintainers. Another area considered important for improving E&BP system performance is the assignment of overall ownership responsibility and authority for ensuring that E&BP equipment performs adequately and that reliability and availability are maintained at acceptable levels.« less
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1993-01-01
Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-concious systems. Theoretical investigations of Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.
Yun, Young Ho; Sim, Jin Ah; Lim, Ye Jin; Lim, Cheol Il; Kang, Sung-Choon; Kang, Joon-Ho; Park, Jun Dong; Noh, Dong Young
2016-06-01
The objective of this study was to develop the Worksite Health Index (WHI) and validate its psychometric properties. The development of the WHI questionnaire included item generation, item construction, and field testing. To assess the instrument's reliability and validity, we recruited 30 different Korean worksites. We developed the WHI questionnaire of 136 items categorized into five domains, namely Governance and Infrastructure, Need Assessment and Planning, Health Prevention and Promotion Program, Occupational Safety, and Monitoring and Feedback. All WHI domains demonstrated a high reliability with good internal consistency. The total WHI scores differentiated worksite groups effectively according to firm size. Each domain was associated significantly with employees' health status, absence, and financial outcome. The WHI can assess comprehensive worksite health programs. This tool is publicly available for addressing the growing need for worksite health programs.
The Internet for neurosurgeons: current resources and future challenges.
Hughes, Mark A; Brennan, Paul M
2011-06-01
Our professional and personal lives depend increasingly on access to information via the Internet. As an open access resource, the Internet is on the whole unbridled by censorship and can facilitate the rapid propagation of ideas and discoveries. At the same time, this liberty in sharing information, being unregulated and often free from external validation, can be oppressive; overloading the user and hindering effective decision-making. It can be difficult, if not impossible, to reliably ascertain the provenance of data and opinion. We must, therefore, discern what is useful, relevant, and above all reliable if we are to harness the Internet's potential to improve training, delivery of care, research, and provision of patient information. This article profiles the resources currently available to neurosurgeons, asks how we can sort the informational wheat from the chaff, and explores where future developments might further influence neurosurgical practice.
An Innovative Excel Application to Improve Exam Reliability in Marketing Courses
ERIC Educational Resources Information Center
Keller, Christopher M.; Kros, John F.
2011-01-01
Measures of survey reliability are commonly addressed in marketing courses. One statistic of reliability is "Cronbach's alpha." This paper presents an application of survey reliability as a reflexive application of multiple-choice exam validation. The application provides an interactive decision support system that incorporates survey item…
NASA Technical Reports Server (NTRS)
Siclari, Michael J.
1988-01-01
A computer code called NCOREL (for Nonconical Relaxation) has been developed to solve for supersonic full potential flows over complex geometries. The method first solves for the conical at the apex and then marches downstream in a spherical coordinate system. Implicit relaxation techniques are used to numerically solve the full potential equation at each subsequent crossflow plane. Many improvements have been made to the original code including more reliable numerics for computing wing-body flows with multiple embedded shocks, inlet flow through simulation, wake model and entropy corrections. Line relaxation or approximate factorization schemes are optionally available. Improved internal grid generation using analytic conformal mappings, supported by a simple geometric Harris wave drag input that was originally developed for panel methods and internal geometry package are some of the new features.
Achieving High Reliability with People, Processes, and Technology.
Saunders, Candice L; Brennan, John A
2017-01-01
High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.
Xiao, Y; Bochner, A F; Makunike, B; Holec, M; Xaba, S; Tshimanga, M; Chitimbire, V; Barnhart, S; Feldacker, C
2017-01-01
Objectives To assess availability and completeness of data collected before and after a data quality audit (DQA) in voluntary medical male circumcision (VMMC) sites in Zimbabwe to determine the effect of this process on data quality. Setting 4 of 10 VMMC sites in Zimbabwe that received a DQA in February, 2015 selected by convenience sampling. Participants Retrospective reviews of all client intake forms (CIFs) from November, 2014 and May, 2015. A total of 1400 CIFs were included from those 2 months across four sites. Primary and secondary outcomes Data availability was measured as the percentage of VMMC clients whose CIF was on file at each site. A data evaluation tool measured the completeness of 34 key CIF variables. A comparison of pre-DQA and post-DQA results was conducted using χ2 and t-tests. Results After the DQA, high record availability of over 98% was maintained by sites 3 and 4. For sites 1 and 2, record availability increased by 8.0% (p=0.001) and 9.7% (p=0.02), respectively. After the DQA, sites 1, 2 and 3 improved significantly in data completeness across 34 key indicators, increasing by 8.6% (p<0.001), 2.7% (p=0.003) and 3.8% (p<0.001), respectively. For site 4, CIF data completeness decreased by 1.7% (p<0.01) after the DQA. Conclusions Our findings suggest that CIF data availability and completeness generally improved after the DQA. However, gaps in documentation of vital signs and adverse events signal areas for improvement. Additional emphasis on data completeness would help support high-quality programme implementation and availability of reliable data for decision-making. PMID:28132009
NASA Astrophysics Data System (ADS)
Suvarna, Puneet Harischandra
Solar-blind ultraviolet avalanche photodiodes are an enabling technology for applications in the fields of astronomy, communication, missile warning systems, biological agent detection and particle physics research. Avalanche photodiodes (APDs) are capable of detecting low-intensity light with high quantum efficiency and signal-to-noise ratio without the need for external amplification. The properties of III-N materials (GaN and AlGaN) are promising for UV photodetectors that are highly efficient, radiation-hard and capable of visible-blind or solar-blind operation without the need for external filters. However, the realization of reliable and high performance III-N APDs and imaging arrays has several technological challenges. The high price and lack of availability of bulk III-N substrates necessitates the growth of III-Ns on lattice mismatched substrates leading to a high density of dislocations in the material that can cause high leakage currents, noise and premature breakdown in APDs. The etched sidewalls of III-N APDs and high electric fields at contact edges are also detrimental to APD performance and reliability. In this work, novel technologies have been developed and implemented that address the issues of performance and reliability in III-Nitride based APDs. To address the issue of extended defects in the bulk of the material, a novel pulsed MOCVD process was developed for the growth of AlGaN. This process enables growth of high crystal quality AlxGa1-xN with excellent control over composition, doping and thickness. The process has also been adapted for the growth of high quality III-N materials on silicon substrate for devices such as high electron mobility transistors (HEMTs). A novel post-growth defect isolation technique is also discussed that can isolate the impact of conductive defects from devices. A new sidewall passivation technique using atomic layer deposition (ALD) of dielectric materials was developed for III-N APDs that is effective in reducing the dark-current and trap states at sidewalls by close to an order of magnitude, leading to improved APD performance. Development and implementation of an ion implantation based contact edge termination technique for III-N APDs that helps prevent premature breakdown from the contact edge of the devices, has further lead to improved reliability. Finally novel improved III-N APD device designs are proposed using preliminary experiments and numerical simulations for future implementations.
NASA Astrophysics Data System (ADS)
Aggarwal, Anil Kr.; Kumar, Sanjeev; Singh, Vikram
2017-03-01
The binary states, i.e., success or failed state assumptions used in conventional reliability are inappropriate for reliability analysis of complex industrial systems due to lack of sufficient probabilistic information. For large complex systems, the uncertainty of each individual parameter enhances the uncertainty of the system reliability. In this paper, the concept of fuzzy reliability has been used for reliability analysis of the system, and the effect of coverage factor, failure and repair rates of subsystems on fuzzy availability for fault-tolerant crystallization system of sugar plant is analyzed. Mathematical modeling of the system is carried out using the mnemonic rule to derive Chapman-Kolmogorov differential equations. These governing differential equations are solved with Runge-Kutta fourth-order method.
Reliable in vitro studies require appropriate ovarian cancer cell lines
2014-01-01
Ovarian cancer is the fifth most common cause of cancer death in women and the leading cause of death from gynaecological malignancies. Of the 75% women diagnosed with locally advanced or disseminated disease, only 30% will survive five years following treatment. This poor prognosis is due to the following reasons: limited understanding of the tumor origin, unclear initiating events and early developmental stages of ovarian cancer, lack of reliable ovarian cancer-specific biomarkers, and drug resistance in advanced cases. In the past, in vitro studies using cell line models have been an invaluable tool for basic, discovery-driven cancer research. However, numerous issues including misidentification and cross-contamination of cell lines have hindered research efforts. In this study we examined all ovarian cancer cell lines available from cell banks. Hereby, we identified inconsistencies in the reporting, difficulties in the identification of cell origin or clinical data of the donor patients, restricted ethnic and histological type representation, and a lack of tubal and peritoneal cancer cell lines. We recommend that all cell lines should be distributed via official cell banks only with strict guidelines regarding the minimal available information required to improve the quality of ovarian cancer research in future. PMID:24936210
Sideband Algorithm for Automatic Wind Turbine Gearbox Fault Detection and Diagnosis: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zappala, D.; Tavner, P.; Crabtree, C.
2013-01-01
Improving the availability of wind turbines (WT) is critical to minimize the cost of wind energy, especially for offshore installations. As gearbox downtime has a significant impact on WT availabilities, the development of reliable and cost-effective gearbox condition monitoring systems (CMS) is of great concern to the wind industry. Timely detection and diagnosis of developing gear defects within a gearbox is an essential part of minimizing unplanned downtime of wind turbines. Monitoring signals from WT gearboxes are highly non-stationary as turbine load and speed vary continuously with time. Time-consuming and costly manual handling of large amounts of monitoring data representmore » one of the main limitations of most current CMSs, so automated algorithms are required. This paper presents a fault detection algorithm for incorporation into a commercial CMS for automatic gear fault detection and diagnosis. The algorithm allowed the assessment of gear fault severity by tracking progressive tooth gear damage during variable speed and load operating conditions of the test rig. Results show that the proposed technique proves efficient and reliable for detecting gear damage. Once implemented into WT CMSs, this algorithm can automate data interpretation reducing the quantity of information that WT operators must handle.« less
The First MS-Cleavable, Photo-Thiol-Reactive Cross-Linker for Protein Structural Studies
NASA Astrophysics Data System (ADS)
Iacobucci, Claudio; Piotrowski, Christine; Rehkamp, Anne; Ihling, Christian H.; Sinz, Andrea
2018-04-01
Cleavable cross-linkers are gaining increasing importance for chemical cross-linking/mass spectrometry (MS) as they permit a reliable and automated data analysis in structural studies of proteins and protein assemblies. Here, we introduce 1,3-diallylurea (DAU) as the first CID-MS/MS-cleavable, photo-thiol-reactive cross-linker. DAU is a commercially available, inexpensive reagent that efficiently undergoes an anti-Markovnikov hydrothiolation with cysteine residues in the presence of a radical initiator upon UV-A irradiation. Radical cysteine cross-linking proceeds via an orthogonal "click reaction" and yields stable alkyl sulfide products. DAU reacts at physiological pH and cross-linking reactions with peptides, and proteins can be performed at temperatures as low as 4 °C. The central urea bond is efficiently cleaved upon collisional activation during tandem MS experiments generating characteristic product ions. This improves the reliability of automated cross-link identification. Different radical initiators have been screened for the cross-linking reaction of DAU using the thiol-containing compounds cysteine and glutathione. Our concept has also been exemplified for the biologically relevant proteins bMunc13-2 and retinal guanylyl cyclase-activating protein-2. [Figure not available: see fulltext.
Reusable Solid Rocket Motor - Accomplishments, Lessons, and a Culture of Success
NASA Technical Reports Server (NTRS)
Moore, Dennis R.; Phelps, Willie J.
2011-01-01
The Reusable Solid Rocket Motor represents the largest solid rocket motor ever flown and the only human rated solid motor. Each Reusable Solid Rocket Motor (RSRM) provides approximately 3-million lb of thrust to lift the integrated Space Shuttle vehicle from the launch pad. The motors burn out approximately 2 minutes later, separate from the vehicle and are recovered and refurbished. The size of the motor and the need for high reliability were challenges. Thrust shaping, via shaping of the propellant grain, was needed to limit structural loads during ascent. The motor design evolved through several block upgrades to increase performance and to increase safety and reliability. A major redesign occurred after STS-51L with the Redesigned Solid Rocket Motor. Significant improvements in the joint sealing systems were added. Design improvements continued throughout the Program via block changes with a number of innovations including development of low temperature o-ring materials and incorporation of a unique carbon fiber rope thermal barrier material. Recovery of the motors and post flight inspection improved understanding of hardware performance, and led to key design improvements. Because of the multidecade program duration material obsolescence was addressed, and requalification of materials and vendors was sometimes needed. Thermal protection systems and ablatives were used to protect the motor cases and nozzle structures. Significant understanding of design and manufacturing features of the ablatives was developed during the program resulting in optimization of design features and processing parameters. The project advanced technology in eliminating ozone-depleting materials in manufacturing processes and the development of an asbestos-free case insulation. Manufacturing processes for the large motor components were unique and safety in the manufacturing environment was a special concern. Transportation and handling approaches were also needed for the large hardware segments. The reusable solid rocket motor achieved significant reliability via process control, ground test programs, and postflight assessment. Process control is mandatory for a solid rocket motor as an acceptance test of the delivered product is not feasible. Process control included process failure modes and effects analysis, statistical process control, witness panels, and process product integrity audits. Material controls and inspections were maintained throughout the sub tier vendors. Material fingerprinting was employed to assess any drift in delivered material properties. The RSRM maintained both full scale and sub-scale test articles. These enabled continuous improvement of design and evaluation of process control and material behavior. Additionally RSRM reliability was achieved through attention to detail in post flight assessment to observe any shift in performance. The postflight analysis and inspections provided invaluable reliability data as it enables observation of actual flight performance, most of which would not be available if the motors were not recovered. These unique challenges, features of the reusable solid rocket motor, materials and manufacturing issues, and design improvements will be discussed in the paper.
Equal Access Initiative HIV/AIDS Information Resources from NLM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Templin-Branner W. and N. Dancy
The Equal Access Initiative: HIV/AIDS Information Resources from the National Library of Medicine training is designed specifically for the National Minority AIDS Council 2010 Equal Access Initiative (EAI) Computer Grants Program awardees to provide valuable health information resources from the National Library of Medicine and other reliable sources to increase awareness of the wealth of treatment information and educational materials that are available on the Internet and to improve prevention and treatment education for their clients. These resources will also meet the needs of community-based
Advanced Self-Calibrating, Self-Repairing Data Acquisition System
NASA Technical Reports Server (NTRS)
Medelius, Pedro J. (Inventor); Eckhoff, Anthony J. (Inventor); Angel, Lucena R. (Inventor); Perotti, Jose M. (Inventor)
2002-01-01
An improved self-calibrating and self-repairing Data Acquisition System (DAS) for use in inaccessible areas, such as onboard spacecraft, and capable of autonomously performing required system health checks, failure detection. When required, self-repair is implemented utilizing a "spare parts/tool box" system. The available number of spare components primarily depends upon each component's predicted reliability which may be determined using Mean Time Between Failures (MTBF) analysis. Failing or degrading components are electronically removed and disabled to reduce power consumption, before being electronically replaced with spare components.
Structure of turbulence in three-dimensional boundary layers
NASA Technical Reports Server (NTRS)
Subramanian, Chelakara S.
1993-01-01
This report provides an overview of the three dimensional turbulent boundary layer concepts and of the currently available experimental information for their turbulence modeling. It is found that more reliable turbulence data, especially of the Reynolds stress transport terms, is needed to improve the existing modeling capabilities. An experiment is proposed to study the three dimensional boundary layer formed by a 'sink flow' in a fully developed two dimensional turbulent boundary layer. Also, the mean and turbulence field measurement procedure using a three component laser Doppler velocimeter is described.
How to Monitor the Breathing of Laboratory Rodents: A Review of the Current Methods.
Grimaud, Julien; Murthy, Venkatesh N
2018-05-23
Accurately measuring respiration in laboratory rodents is essential for many fields of research, including olfactory neuroscience, social behavior, learning and memory, and respiratory physiology. However, choosing the right technique to monitor respiration can be tricky, given the many criteria to take into account: reliability, precision, and invasiveness, to name a few. This review aims to assist experimenters in choosing the technique that will best fit their needs, by surveying the available tools, discussing their strengths and weaknesses, and offering suggestions for future improvements.
HIV/AIDS Information Resources from the National Library of Medicine-STOP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Templin-Branner, W. and N. Dancy
2010-06-15
The HIV/AIDS Information Resources from the National Library of Medicine training is designed specifically for the UNCFSP HBCU Screening, Testing, Outreach, and Prevention (STOP) HIV/AIDS Program project members to provide valuable health information resources from the National Library of Medicine and other reliable sources to increase awareness of the wealth of treatment information and educational materials that are available on the Internet and to improve prevention and treatment education for their clients. These resources will also meet the needs of community-based organizations
Advanced cloud fault tolerance system
NASA Astrophysics Data System (ADS)
Sumangali, K.; Benny, Niketa
2017-11-01
Cloud computing has become a prevalent on-demand service on the internet to store, manage and process data. A pitfall that accompanies cloud computing is the failures that can be encountered in the cloud. To overcome these failures, we require a fault tolerance mechanism to abstract faults from users. We have proposed a fault tolerant architecture, which is a combination of proactive and reactive fault tolerance. This architecture essentially increases the reliability and the availability of the cloud. In the future, we would like to compare evaluations of our proposed architecture with existing architectures and further improve it.
Errors of logic and scholarship concerning dissociative identity disorder.
Ross, Colin A
2009-01-01
The author reviewed a two-part critique of dissociative identity disorder published in the Canadian Journal of Psychiatry. The two papers contain errors of logic and scholarship. Contrary to the conclusions in the critique, dissociative identity disorder has established diagnostic reliability and concurrent validity, the trauma histories of affected individuals can be corroborated, and the existing prospective treatment outcome literature demonstrates improvement in individuals receiving psychotherapy for the disorder. The available evidence supports the inclusion of dissociative identity disorder in future editions of the Diagnostic and Statistical Manual of Mental Disorders.
Three-dimensional x-ray inspection of food products
NASA Astrophysics Data System (ADS)
Graves, Mark; Batchelor, Bruce G.; Palmer, Stephen C.
1994-09-01
Modern food production techniques operate at high speed and sometimes fill several containers simultaneously; individual containers never become available for inspection by conventional x- ray systems. There is a constant demand for improved methods for detecting foreign bodies, such as glass, plastic, wood, stone, animal remains, etc. These requirements lead to significant problems with existing inspection techniques, which are susceptible to noise and are unable to detect long thin contaminants reliably. Experimental results demonstrate these points. The paper proposes the use of two x-ray inspection systems, with orthogonal beams to overcome these difficulties.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-10
... standards that require the use of the best available technology for ensuring the full reliability and... available technology for ensuring the full reliability and accuracy of urine drug tests, while reflecting..., cutoffs, specimen validity, collection, collection devices, and testing. II. Solicitation of Comments: As...
Study on vacuum packaging reliability of micromachined quartz tuning fork gyroscopes
NASA Astrophysics Data System (ADS)
Fan, Maoyan; Zhang, Lifang
2017-09-01
Packaging technology of the micromachined quartz tuning fork gyroscopes by vacuum welding has been experimentally studied. The performance of quartz tuning fork is influenced by the encapsulation shell, encapsulation method and fixation of forks. Alloy solder thick film is widely used in the package to avoid the damage of the chip structure by the heat resistance and hot temperature, and this can improve the device performance and welding reliability. The results show that the bases and the lids plated with gold and nickel can significantly improve the airtightness and reliability of the vacuum package. Vacuum packaging is an effective method to reduce the vibration damping, improve the quality factor and further enhance the performance. The threshold can be improved nearly by 10 times.
FOR Allocation to Distribution Systems based on Credible Improvement Potential (CIP)
NASA Astrophysics Data System (ADS)
Tiwary, Aditya; Arya, L. D.; Arya, Rajesh; Choube, S. C.
2017-02-01
This paper describes an algorithm for forced outage rate (FOR) allocation to each section of an electrical distribution system subject to satisfaction of reliability constraints at each load point. These constraints include threshold values of basic reliability indices, for example, failure rate, interruption duration and interruption duration per year at load points. Component improvement potential measure has been used for FOR allocation. Component with greatest magnitude of credible improvement potential (CIP) measure is selected for improving reliability performance. The approach adopted is a monovariable method where one component is selected for FOR allocation and in the next iteration another component is selected for FOR allocation based on the magnitude of CIP. The developed algorithm is implemented on sample radial distribution system.
Inventing an Energy Internet: Concepts, Architectures and Protocols for Smart Energy Utilization
Tsoukalas, Lefteri
2018-01-24
In recent years, the Internet is revolutionizing information availability much like the Power Grid revolutionized energy availability a century earlier. We will explore the differences and similarities of these two critical infrastructures and identify ways for convergence which may lead to an energy internet. Pricing signals, nodal forecasting, and short-term elasticities are key concepts in smart energy flows respecting the delicate equilibrium involved in generation-demand and aiming at higher efficiencies. We will discuss how intelligent forecasting approaches operating at multiple levels (including device or nodal levels) can ameliorate the challenges of power storage. In addition to higher efficiencies, an energy internet may achieve significant reliability and security improvements and offer greater flexibility and transparency in the overall energy-environmental relation.
NASA Technical Reports Server (NTRS)
Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)
1991-01-01
An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
Improving patient safety: patient-focused, high-reliability team training.
McKeon, Leslie M; Cunningham, Patricia D; Oswaks, Jill S Detty
2009-01-01
Healthcare systems are recognizing "human factor" flaws that result in adverse outcomes. Nurses work around system failures, although increasing healthcare complexity makes this harder to do without risk of error. Aviation and military organizations achieve ultrasafe outcomes through high-reliability practice. We describe how reliability principles were used to teach nurses to improve patient safety at the front line of care. Outcomes include safety-oriented, teamwork communication competency; reflections on safety culture and clinical leadership are discussed.
Improved Protocols for Illumina Sequencing
Bronner, Iraad F.; Quail, Michael A.; Turner, Daniel J.; Swerdlow, Harold
2013-01-01
In this unit, we describe a set of improvements we have made to the standard Illumina protocols to make the sequencing process more reliable in a high-throughput environment, reduce amplification bias, narrow the distribution of insert sizes, and reliably obtain high yields of data. PMID:19582764
Reliable steam generators: how KWU solved beginning problems for its customers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eggers, B.; Engl, G.; Froehlich, K.
This paper describes improvements in inspection and maintenance techniques, the adaptation of a secondary-side concept, and the optimization of water chemistry to achieve the highest possible operational reliability of steam generator performance. In the late 1970s and the early 1980s steam generators of several pressurized water reactors delivered by Kraftwerk Union (KWU) experienced corrosion-induced tube-wall degradation. As a result of these findings and the similar experience in US plants, KWU initiated a systematic program to retain the operational history of the plants at their historically outstanding level. By a combination of improvement in the balance of plant, reduction of themore » phosphate conditioning, and even a change to an all-volatile treatment as well as by the performance of tubesheet lancing, the tube degradation in KWU steam generators is nearly halted and no other known corrosion mechanisms exist that could impair the life expectancy of the steam generators. Nevertheless, repair and cleaning techniques have been developed and are available for application, if necessary, such as tube plugging, tube sleeving, or even partial tube replacement as well as chemical cleaning of the steam generator's secondary side.« less
Pellegrin, Karen L; Miyamura, Jill B; Ma, Carolyn; Taniguchi, Ronald
2016-01-01
Current race/ethnicity categories established by the U.S. Office of Management and Budget are neither reliable nor valid for understanding health disparities or for tracking improvements in this area. In Hawaii, statewide hospitals have collaborated to collect race/ethnicity data using a standardized method consistent with recommended practices that overcome the problems with the federal categories. The purpose of this observational study was to determine the impact of this collaboration on key measures of race/ethnicity documentation. After this collaborative effort, the number of standardized categories available across hospitals increased from 6 to 34, and the percent of inpatients with documented race/ethnicity increased from 88 to 96%. This improved standardized methodology is now the foundation for tracking population health indicators statewide and focusing quality improvement efforts. The approach used in Hawaii can serve as a model for other states and regions. Ultimately, the ability to standardize data collection methodology across states and regions will be needed to track improvements nationally.
NASA Astrophysics Data System (ADS)
Olsson, Lars; Cremer, Dieter
1996-11-01
Sum-over-states density functional perturbation theory (SOS-DFPT) has been used to calculate 13C, 15N, and 17O NMR chemical shifts of 20 molecules, for which accurate experimental gas-phase values are available. Compared to Hartree-Fock (HF), SOS-DFPT leads to improved chemical shift values and approaches the degree of accuracy obtained with second order Møller-Plesset perturbation theory (MP2). This is particularly true in the case of 15N chemical shifts where SOS-DFPT performs even better than MP2. Additional improvements of SOS-DFPT chemical shifts can be obtained by empirically correcting diamagnetic and paramagnetic contributions to compensate for deficiencies which are typical of DFT.
A new fault diagnosis algorithm for AUV cooperative localization system
NASA Astrophysics Data System (ADS)
Shi, Hongyang; Miao, Zhiyong; Zhang, Yi
2017-10-01
Multiple AUVs cooperative localization as a new kind of underwater positioning technology, not only can improve the positioning accuracy, but also has many advantages the single AUV does not have. It is necessary to detect and isolate the fault to increase the reliability and availability of the AUVs cooperative localization system. In this paper, the Extended Multiple Model Adaptive Cubature Kalmam Filter (EMMACKF) method is presented to detect the fault. The sensor failures are simulated based on the off-line experimental data. Experimental results have shown that the faulty apparatus can be diagnosed effectively using the proposed method. Compared with Multiple Model Adaptive Extended Kalman Filter and Multi-Model Adaptive Unscented Kalman Filter, both accuracy and timelines have been improved to some extent.
NASA Technical Reports Server (NTRS)
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
2010-01-01
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation; testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability and maintainability analysis, and present findings and observation based on analysis leading to the Ground Systems Preliminary Design Review milestone.
Donahoe, Laura; McDonald, Ellen; Kho, Michelle E; Maclennan, Margaret; Stratford, Paul W; Cook, Deborah J
2009-01-01
Given their clinical, research, and administrative purposes, scores on the Acute Physiology and Chronic Health Evaluation (APACHE) II should be reliable, whether calculated by health care personnel or a clinical information system. To determine reliability of APACHE II scores calculated by a clinical information system and by health care personnel before and after a multifaceted quality improvement intervention. APACHE II scores of 37 consecutive patients admitted to a closed, 15-bed, university-affiliated intensive care unit were collected by a research coordinator, a database clerk, and a clinical information system. After a quality improvement intervention focused on health care personnel and the clinical information system, the same methods were used to collect data on 32 consecutive patients. The research coordinator and the clerk did not know each other's scores or the information system's score. The data analyst did not know the source of the scores until analysis was complete. APACHE II scores obtained by the clerk and the research coordinator were highly reliable (intraclass correlation coefficient, 0.88 before vs 0.80 after intervention; P = .25). No significant changes were detected after the intervention; however, compared with scores of the research coordinator, the overall reliability of APACHE II scores calculated by the clinical information system improved (intraclass correlation coefficient, 0.24 before intervention vs 0.91 after intervention, P < .001). After completion of a quality improvement intervention, health care personnel and a computerized clinical information system calculated sufficiently reliable APACHE II scores for clinical, research, and administrative purposes.
Patient safety in anesthesia: learning from the culture of high-reliability organizations.
Wright, Suzanne M
2015-03-01
There has been an increased awareness of and interest in patient safety and improved outcomes, as well as a growing body of evidence substantiating medical error as a leading cause of death and injury in the United States. According to The Joint Commission, US hospitals demonstrate improvements in health care quality and patient safety. Although this progress is encouraging, much room for improvement remains. High-reliability organizations, industries that deliver reliable performances in the face of complex working environments, can serve as models of safety for our health care system until plausible explanations for patient harm are better understood. Copyright © 2015 Elsevier Inc. All rights reserved.
7 CFR 760.405 - Application process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... death documentation is not available, the participant may provide reliable records, in conjunction with verifiable beginning and ending inventory records, as proof of death. Reliable records may include..., pictures, and other similar reliable documents as determined by FSA. (f) Certification of livestock deaths...
Aerospace Safety Advisory Panel
NASA Technical Reports Server (NTRS)
1992-01-01
The results of the Panel's activities are presented in a set of findings and recommendations. Highlighted here are both improvements in NASA's safety and reliability activities and specific areas where additional gains might be realized. One area of particular concern involves the curtailment or elimination of Space Shuttle safety and reliability enhancements. Several findings and recommendations address this area of concern, reflecting the opinion that safety and reliability enhancements are essential to the continued successful operation of the Space Shuttle. It is recommended that a comprehensive and continuing program of safety and reliability improvements in all areas of Space Shuttle hardware/software be considered an inherent component of ongoing Space Shuttle operations.
Research requirements to improve reliability of civil helicopters
NASA Technical Reports Server (NTRS)
Dougherty, J. J., III; Barrett, L. D.
1978-01-01
The major reliability problems of the civil helicopter fleet as reported by helicopter operational and maintenance personnel are documented. An assessment of each problem is made to determine if the reliability can be improved by application of present technology or whether additional research and development are required. The reliability impact is measured in three ways: (1) The relative frequency of each problem in the fleet. (2) The relative on-aircraft manhours to repair, associated with each fleet problem. (3) The relative cost of repair materials or replacement parts associated with each fleet problem. The data reviewed covered the period of 1971 through 1976 and covered only turbine engine aircraft.
Walia, Rasna R; Xue, Li C; Wilkins, Katherine; El-Manzalawy, Yasser; Dobbs, Drena; Honavar, Vasant
2014-01-01
Protein-RNA interactions are central to essential cellular processes such as protein synthesis and regulation of gene expression and play roles in human infectious and genetic diseases. Reliable identification of protein-RNA interfaces is critical for understanding the structural bases and functional implications of such interactions and for developing effective approaches to rational drug design. Sequence-based computational methods offer a viable, cost-effective way to identify putative RNA-binding residues in RNA-binding proteins. Here we report two novel approaches: (i) HomPRIP, a sequence homology-based method for predicting RNA-binding sites in proteins; (ii) RNABindRPlus, a new method that combines predictions from HomPRIP with those from an optimized Support Vector Machine (SVM) classifier trained on a benchmark dataset of 198 RNA-binding proteins. Although highly reliable, HomPRIP cannot make predictions for the unaligned parts of query proteins and its coverage is limited by the availability of close sequence homologs of the query protein with experimentally determined RNA-binding sites. RNABindRPlus overcomes these limitations. We compared the performance of HomPRIP and RNABindRPlus with that of several state-of-the-art predictors on two test sets, RB44 and RB111. On a subset of proteins for which homologs with experimentally determined interfaces could be reliably identified, HomPRIP outperformed all other methods achieving an MCC of 0.63 on RB44 and 0.83 on RB111. RNABindRPlus was able to predict RNA-binding residues of all proteins in both test sets, achieving an MCC of 0.55 and 0.37, respectively, and outperforming all other methods, including those that make use of structure-derived features of proteins. More importantly, RNABindRPlus outperforms all other methods for any choice of tradeoff between precision and recall. An important advantage of both HomPRIP and RNABindRPlus is that they rely on readily available sequence and sequence-derived features of RNA-binding proteins. A webserver implementation of both methods is freely available at http://einstein.cs.iastate.edu/RNABindRPlus/.
Reliability of a store observation tool in measuring availability of alcohol and selected foods.
Cohen, Deborah A; Schoeff, Diane; Farley, Thomas A; Bluthenthal, Ricky; Scribner, Richard; Overton, Adrian
2007-11-01
Alcohol and food items can compromise or contribute to health, depending on the quantity and frequency with which they are consumed. How much people consume may be influenced by product availability and promotion in local retail stores. We developed and tested an observational tool to objectively measure in-store availability and promotion of alcoholic beverages and selected food items that have an impact on health. Trained observers visited 51 alcohol outlets in Los Angeles and southeastern Louisiana. Using a standardized instrument, two independent observations were conducted documenting the type of outlet, the availability and shelf space for alcoholic beverages and selected food items, the purchase price of standard brands, the placement of beer and malt liquor, and the amount of in-store alcohol advertising. Reliability of the instrument was excellent for measures of item availability, shelf space, and placement of malt liquor. Reliability was lower for alcohol advertising, beer placement, and items that measured the "least price" of apples and oranges. The average kappa was 0.87 for categorical items and the average intraclass correlation coefficient was 0.83 for continuous items. Overall, systematic observation of the availability and promotion of alcoholic beverages and food items was feasible, acceptable, and reliable. Measurement tools such as the one we evaluated should be useful in studies of the impact of availability of food and beverages on consumption and on health outcomes.
NASA Astrophysics Data System (ADS)
Yuchi, Weiran; Yao, Jiayun; McLean, Kathleen E.; Stull, Roland; Pavlovic, Radenko; Davignon, Didier; Moran, Michael D.; Henderson, Sarah B.
2016-11-01
Fine particulate matter (PM2.5) generated by forest fires has been associated with a wide range of adverse health outcomes, including exacerbation of respiratory diseases and increased risk of mortality. Due to the unpredictable nature of forest fires, it is challenging for public health authorities to reliably evaluate the magnitude and duration of potential exposures before they occur. Smoke forecasting tools are a promising development from the public health perspective, but their widespread adoption is limited by their inherent uncertainties. Observed measurements from air quality monitoring networks and remote sensing platforms are more reliable, but they are inherently retrospective. It would be ideal to reduce the uncertainty in smoke forecasts by integrating any available observations. This study takes spatially resolved PM2.5 estimates from an empirical model that integrates air quality measurements with satellite data, and averages them with PM2.5 predictions from two smoke forecasting systems. Two different indicators of population respiratory health are then used to evaluate whether the blending improved the utility of the smoke forecasts. Among a total of six models, including two single forecasts and four blended forecasts, the blended estimates always performed better than the forecast values alone. Integrating measured observations into smoke forecasts could improve public health preparedness for smoke events, which are becoming more frequent and intense as the climate changes.
Gao, Zhouzheng; Zhang, Hongping; Ge, Maorong; Niu, Xiaoji; Shen, Wenbin; Wickert, Jens; Schuh, Harald
2015-01-01
The continuity and reliability of precise GNSS positioning can be seriously limited by severe user observation environments. The Inertial Navigation System (INS) can overcome such drawbacks, but its performance is clearly restricted by INS sensor errors over time. Accordingly, the tightly coupled integration of GPS and INS can overcome the disadvantages of each individual system and together form a new navigation system with a higher accuracy, reliability and availability. Recently, ionosphere-constrained (IC) precise point positioning (PPP) utilizing raw GPS observations was proven able to improve both the convergence and positioning accuracy of the conventional PPP using ionosphere-free combined observations (LC-PPP). In this paper, a new mode of tightly coupled integration, in which the IC-PPP instead of LC-PPP is employed, is implemented to further improve the performance of the coupled system. We present the detailed mathematical model and the related algorithm of the new integration of IC-PPP and INS. To evaluate the performance of the new tightly coupled integration, data of both airborne and vehicle experiments with a geodetic GPS receiver and tactical grade inertial measurement unit are processed and the results are analyzed. The statistics show that the new approach can further improve the positioning accuracy compared with both IC-PPP and the tightly coupled integration of the conventional PPP and INS. PMID:25763647
Neumann, Steffen; Schmitt-Kopplin, Philippe
2017-01-01
Lipid identification is a major bottleneck in high-throughput lipidomics studies. However, tools for the analysis of lipid tandem MS spectra are rather limited. While the comparison against spectra in reference libraries is one of the preferred methods, these libraries are far from being complete. In order to improve identification rates, the in silico fragmentation tool MetFrag was combined with Lipid Maps and lipid-class specific classifiers which calculate probabilities for lipid class assignments. The resulting LipidFrag workflow was trained and evaluated on different commercially available lipid standard materials, measured with data dependent UPLC-Q-ToF-MS/MS acquisition. The automatic analysis was compared against manual MS/MS spectra interpretation. With the lipid class specific models, identification of the true positives was improved especially for cases where candidate lipids from different lipid classes had similar MetFrag scores by removing up to 56% of false positive results. This LipidFrag approach was then applied to MS/MS spectra of lipid extracts of the nematode Caenorhabditis elegans. Fragments explained by LipidFrag match known fragmentation pathways, e.g., neutral losses of lipid headgroups and fatty acid side chain fragments. Based on prediction models trained on standard lipid materials, high probabilities for correct annotations were achieved, which makes LipidFrag a good choice for automated lipid data analysis and reliability testing of lipid identifications. PMID:28278196
Validation of a method for assessing resident physicians' quality improvement proposals.
Leenstra, James L; Beckman, Thomas J; Reed, Darcy A; Mundell, William C; Thomas, Kris G; Krajicek, Bryan J; Cha, Stephen S; Kolars, Joseph C; McDonald, Furman S
2007-09-01
Residency programs involve trainees in quality improvement (QI) projects to evaluate competency in systems-based practice and practice-based learning and improvement. Valid approaches to assess QI proposals are lacking. We developed an instrument for assessing resident QI proposals--the Quality Improvement Proposal Assessment Tool (QIPAT-7)-and determined its validity and reliability. QIPAT-7 content was initially obtained from a national panel of QI experts. Through an iterative process, the instrument was refined, pilot-tested, and revised. Seven raters used the instrument to assess 45 resident QI proposals. Principal factor analysis was used to explore the dimensionality of instrument scores. Cronbach's alpha and intraclass correlations were calculated to determine internal consistency and interrater reliability, respectively. QIPAT-7 items comprised a single factor (eigenvalue = 3.4) suggesting a single assessment dimension. Interrater reliability for each item (range 0.79 to 0.93) and internal consistency reliability among the items (Cronbach's alpha = 0.87) were high. This method for assessing resident physician QI proposals is supported by content and internal structure validity evidence. QIPAT-7 is a useful tool for assessing resident QI proposals. Future research should determine the reliability of QIPAT-7 scores in other residency and fellowship training programs. Correlations should also be made between assessment scores and criteria for QI proposal success such as implementation of QI proposals, resident scholarly productivity, and improved patient outcomes.
López, Yosvany; Nakai, Kenta; Patil, Ashwini
2015-01-01
HitPredict is a consolidated resource of experimentally identified, physical protein-protein interactions with confidence scores to indicate their reliability. The study of genes and their inter-relationships using methods such as network and pathway analysis requires high quality protein-protein interaction information. Extracting reliable interactions from most of the existing databases is challenging because they either contain only a subset of the available interactions, or a mixture of physical, genetic and predicted interactions. Automated integration of interactions is further complicated by varying levels of accuracy of database content and lack of adherence to standard formats. To address these issues, the latest version of HitPredict provides a manually curated dataset of 398 696 physical associations between 70 808 proteins from 105 species. Manual confirmation was used to resolve all issues encountered during data integration. For improved reliability assessment, this version combines a new score derived from the experimental information of the interactions with the original score based on the features of the interacting proteins. The combined interaction score performs better than either of the individual scores in HitPredict as well as the reliability score of another similar database. HitPredict provides a web interface to search proteins and visualize their interactions, and the data can be downloaded for offline analysis. Data usability has been enhanced by mapping protein identifiers across multiple reference databases. Thus, the latest version of HitPredict provides a significantly larger, more reliable and usable dataset of protein-protein interactions from several species for the study of gene groups. Database URL: http://hintdb.hgc.jp/htp. © The Author(s) 2015. Published by Oxford University Press.
Tolu, Sena; Yurdakul, Ozan Volkan; Basaran, Betul; Rezvani, Aylin
2018-05-14
The aim of this study was to evaluate the reliability, content, and quality of videos for patients available on YouTube for learning how to self-administer subcutaneous anti-tumour necrosis factor (TNF) injections. We searched for the terms Humira injection, Enbrel injection, Simponi injection, and Cimzia injection. Videos were categorised as useful information, misleading information, useful patient opinion, and misleading patient opinion by two physicians. Videos were rated for quality on a 5-point global quality scale (GQS; 1 = poor quality, 5 = excellent quality) and reliability and content using the 5-point DISCERN scale (higher scores represent greater reliability and more comprehensive videos). Of the 142 English videos, 24 (16.9%) videos were classified as useful information, 6 (4.2%) as misleading information, 47 (33.1%) as useful patient opinion, and 65 (45.8%) as misleading patient opinion. Useful videos were the most comprehensive and had the highest reliability and quality scores. The useful information and useful patient opinion videos had the highest numbers of views per day (median 8.32, IQR: 3.40-14.28 and 5.46, IQR: 3.06-14.44), as compared with 2.32, IQR: 1.63-6.26 for misleading information videos and 2.15, IQR: 1.17-7.43 for misleading patient opinion videos (p = 0.001). Almost all (91.5%) misleading videos were uploaded by individual users. There are a substantial number of English-language YouTube videos, with high quality, and rich content and reliability that can be sources of information on proper technique of anti-TNF self-injections. Physicians should direct patients to the reliable resources of information and educate them in online resource assessment, thereby improving treatment outcomes.
Rabin, Elaine; Patrick, Lisa
2016-04-01
Nationwide, hospitals struggle to maintain specialist on-call coverage for emergencies. We seek to further understand the issue by examining reliability of scheduled coverage and the role of ad hoc coverage when none is scheduled. An anonymous electronic survey of all emergency department (ED) directors of a large state. Overall and for 10 specialties, respondents were asked to estimate on-call coverage extent and "reliability" (frequency of emergency response in a clinically useful time frame: 2 hours), and use and effect of ad hoc emergency coverage to fill gaps. Descriptive statistics were performed using Fisher exact and Wilcoxon sign rank tests for significance. Contact information was obtained for 125 of 167 ED directors. Sixty responded (48%), representing 36% of EDs. Forty-six percent reported full on-call coverage scheduled for all specialties. Forty-six percent reported consistent reliability. Coverage and reliability were strongly related (P<.01; 33% reported both), and larger ED volume correlated with both (P<.01). Ninety percent of hospitals that had gaps in either employed ad hoc coverage, significantly improving coverage for 8 of 10 specialties. For all but 1 specialty, more than 20% of hospitals reported that specialists are "Never", "Rarely" or "Sometimes" reliable (more than 50% for cardiovascular surgery, hand surgery and ophthalmology). Significant holes in scheduled on-call specialist coverage are compounded by frequent unreliability of on-call specialists, but partially ameliorated by ad hoc specialist coverage. Regionalization may help because a 2-tiered system may exist: larger hospitals have more complete, reliable coverage. Better understanding of specialists' willingness to treat emergencies ad hoc without taking formal call will suggest additional remedies. Copyright © 2015 Elsevier Inc. All rights reserved.
Customer Dissatisfaction Index and its Improvement Costs
NASA Astrophysics Data System (ADS)
Lvovs, Aleksandrs; Mutule, Anna
2010-01-01
The paper gives description of customer dissatisfaction index (CDI) that can be used as reliability level characterizing factor. The factor is directly joined with customer satisfaction of power supply and can be used for control of reliability level of power supply for residential customers. CDI relations with other reliability indices are shown. Paper also gives a brief overview of legislation of Latvia in power industry that is the base for CDI introduction. Calculations of CDI improvement costs are performed in the paper too.
Pilot testing of SHRP 2 reliability data and analytical products: Washington.
DOT National Transportation Integrated Search
2014-07-30
The second Strategic Highway Research Program (SHRP 2) addresses the challenges of moving people and goods efficiently and safely on the nations highways. In its Reliability focus area, the research emphasizes improving the reliability of highway ...
Measuring Professionalism in Medicine and Nursing: Results of a European Survey
Lombarts, Kiki M. J. M. H.; Plochg, Thomas; Thompson, Caroline A.; Arah, Onyebuchi A.
2014-01-01
Background Leveraging professionalism has been put forward as a strategy to drive improvement of patient care. We investigate professionalism as a factor influencing the uptake of quality improvement activities by physicians and nurses working in European hospitals. Objective To (i) investigate the reliability and validity of data yielded by using the self-developed professionalism measurement tool for physicians and nurses, (ii) describe their levels of professionalism displayed, and (iii) quantify the extent to which professional attitudes would predict professional behaviors. Methods and Materials We designed and deployed survey instruments amongst 5920 physicians and nurses working in European hospitals. This was conducted under the cross-sectional multilevel study “Deepening Our Understanding of Quality Improvement in Europe” (DUQuE). We used psychometric and generalized linear mixed modelling techniques to address the aforementioned objectives. Results In all, 2067 (response rate 69.8%) physicians and 2805 nurses (94.8%) representing 74 hospitals in 7 European countries participated. The professionalism instrument revealed five subscales of professional attitude and one scale for professional behaviour with moderate to high internal consistency and reliability. Physicians and nurses display equally high professional attitude sum scores (11.8 and 11.9 respectively out of 16) but seem to have different perceptions towards separate professionalism aspects. Lastly, professionals displaying higher levels of professional attitudes were more involved in quality improvement actions (physicians: b = 0.019, P<0.0001; nurses: b = 0.016, P<0.0001) and more inclined to report colleagues’ underperformance (physicians – odds ratio (OR) 1.12, 95% CI 1.01–1.24; nurses – OR 1.11, 95% CI 1.01–1.23) or medical errors (physicians – OR 1.14, 95% CI 1.01–1.23; nurses – OR 1.43, 95% CI 1.22–1.67). Involvement in QI actions was found to increase the odds of reporting incompetence or medical errors. Conclusion A tool that reliably and validly measures European physicians’ and nurses’ commitment to professionalism is now available. Collectively leveraging professionalism as a quality improvement strategy may be beneficial to patient care quality. PMID:24849320
Lawrason Hughes, Amy; Murray, Nicole; Valdez, Tulio A; Kelly, Raeanne; Kavanagh, Katherine
2014-01-01
National attention has focused on the importance of handoffs in medicine. Our practice during airway patient handoffs is to communicate a patient-specific emergency plan for airway reestablishment; patients who are not intubatable by standard means are at higher risk for failure. There is currently no standard classification system describing airway risk in tracheotomized patients. To introduce and assess the interrater reliability of a simple airway risk classification system, the Connecticut Airway Risk Evaluation (CARE) system. We created a novel classification system, the CARE system, based on ease of intubation and the need for ventilation: group 1, easily intubatable; group 2, intubatable with special equipment and/or maneuvers; group 3, not intubatable. A "v" was appended to any group number to indicate the need for mechanical ventilation. We performed a retrospective medical chart review of patients aged 0 to 18 years who were undergoing tracheotomy at our tertiary care pediatric hospital between January 2000 and April 2011. INTERVENTIONS Each patient's medical history, including airway disease and means of intubation, was reviewed by 4 raters. Patient airways were separately rated as CARE groups 1, 2, or 3, each group with or without a v appended, as appropriate, based on the available information. After the patients were assigned to an airway group by each of the 4 raters, the interrater reliability was calculated to determine the ease of use of the rating system. We identified complete data for 155 of 169 patients (92%), resulting in a total of 620 ratings. Based on the patient's ease of intubation, raters categorized tracheotomized patients into group 1 (70%, 432 of 620); group 2 (25%, 157 of 620); or group 3 (5%, 29 of 620), each with a v appended if appropriate. The interrater reliability was κ = 0.95. We propose an airway risk classification system for tracheotomized patients, CARE, that has high interrater reliability and is easy to use and interpret. As medical providers and national organizations place more focus on improvements in interprovider communication, the creation of an airway handoff tool is integral to improving patient safety and airway management strategies following tracheotomy complications.
Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology
NASA Astrophysics Data System (ADS)
Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.
2016-05-01
The cooled IR detectors are used in a wide range of applications. Most of the time, the cryocoolers are one of the components dimensioning the lifetime of the system. The cooler reliability is thus one of its most important parameters. This parameter has to increase to answer market needs. To do this, the data for identifying the weakest element determining cooler reliability has to be collected. Yet, data collection based on field are hardly usable due to lack of informations. A method for identifying the improvement in reliability has then to be set up which can be used even without field return. This paper will describe the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expertizes of RM2 failures occurring in accelerate ageing. Failure modes have then been identified and corrective actions achieved. Besides this, a hierarchical organization of the functions of the cooler has been done with regard to the potential increase of its efficiency. Specific changes have been introduced on the functions most likely to impact efficiency. The link between efficiency and reliability will be described in this paper. The work on the two axes - weak spots for cooler reliability and efficiency - permitted us to increase in a drastic way the MTTF of the RM2 cooler. Huge improvements in RM2 reliability are actually proven by both field return and reliability monitoring. These figures will be discussed in the paper.
76 FR 16240 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... Standards. The Reliability Standards were designed to prevent instability, uncontrolled separation, or... designed to prevent instability, uncontrolled separation, or cascading outages that adversely impact the... instability, uncontrolled separation, or cascading outages. See NERC Glossary, available at http://www.nerc...
How reliable are clinical systems in the UK NHS? A study of seven NHS organisations
Franklin, Bryony Dean; Moorthy, Krishna; Cooke, Matthew W; Vincent, Charles
2012-01-01
Background It is well known that many healthcare systems have poor reliability; however, the size and pervasiveness of this problem and its impact has not been systematically established in the UK. The authors studied four clinical systems: clinical information in surgical outpatient clinics, prescribing for hospital inpatients, equipment in theatres, and insertion of peripheral intravenous lines. The aim was to describe the nature, extent and variation in reliability of these four systems in a sample of UK hospitals, and to explore the reasons for poor reliability. Methods Seven UK hospital organisations were involved; each system was studied in three of these. The authors took delivery of the systems' intended outputs to be a proxy for the reliability of the system as a whole. For example, for clinical information, 100% reliability was defined as all patients having an agreed list of clinical information available when needed during their appointment. Systems factors were explored using semi-structured interviews with key informants. Common themes across the systems were identified. Results Overall reliability was found to be between 81% and 87% for the systems studied, with significant variation between organisations for some systems: clinical information in outpatient clinics ranged from 73% to 96%; prescribing for hospital inpatients 82–88%; equipment availability in theatres 63–88%; and availability of equipment for insertion of peripheral intravenous lines 80–88%. One in five reliability failures were associated with perceived threats to patient safety. Common factors causing poor reliability included lack of feedback, lack of standardisation, and issues such as access to information out of working hours. Conclusions Reported reliability was low for the four systems studied, with some common factors behind each. However, this hides significant variation between organisations for some processes, suggesting that some organisations have managed to create more reliable systems. Standardisation of processes would be expected to have significant benefit. PMID:22495099
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
... Requirement R3.1 of MOD-001-1. C. Benchmarking 14. In the Final Rule, the Commission directed the ERO to develop benchmarking and updating requirements for the MOD Reliability Standards to measure modeled... requirements should specify the frequency for benchmarking and updating the available transfer and flowgate...
Looking forward and back to relapse: implications for research and practice.
Connors, G J; Longabaugh, R; Miller, W R
1996-12-01
In this commentary, the three principal investigators of the Relapse Replication and Extension Project (RREP) reflect on clinical and research implications of study findings from the three collaborating sites. A primary purpose of RREP was to study the reliability and validity of a taxonomy of relapse antecedents originally proposed by Marlatt two decades ago. Under the best of research conditions, with extensive training and practice, it was difficult to achieve reliability of coding with the original three-level system, although with only two levels of classification more reasonable albeit variable reliability was found. Modifications may improve the taxonomy's reliability, but RREP data indicate that a more appropriate strategy is to measure possible antecedents of relapse by continuous scales such as those provided by Annis, Heather and Litman. There is reasonably consistent evidence for two common antecedents of relapse: negative emotional states, and positive emotional states in a social context. Antecedents of relapse show only modest consistency within individuals from one occasion to the next. The causes to which clients attribute relapses may exert a significant effect on future drinking episodes. Stable and internal attributions, such as are commonly associated with a dispositional disease model, may serve to perpetuate relapse. From the RREP studies, the availability of coping skills appears to be a potent protective factor, and ineffective coping a consistent predictor of relapse. Implications for clinical research and practice are considered.
High Reliability Prototype Quadrupole for the Next Linear Collider
NASA Astrophysics Data System (ADS)
Spencer, C. M.
2001-01-01
The Next Linear Collider (NLC) will require over 5600 magnets, each of which must be highly reliable and/or quickly repairable in order that the NLC reach its 85/ overall availability goal. A multidiscipline engineering team was assembled at SLAC to develop a more reliable electromagnet design than historically had been achieved at SLAC. This team carried out a Failure Mode and Effects Analysis (FMEA) on a standard SLAC quadrupole magnet system. They overcame a number of longstanding design prejudices, producing 10 major design changes. This paper describes how a prototype magnet was constructed and the extensive testing carried out on it to prove full functionality with an improvement in reliability. The magnet's fabrication cost will be compared to the cost of a magnet with the same requirements made in the historic SLAC way. The NLC will use over 1600 of these 12.7 mm bore quadrupoles with a range of integrated strengths from 0.6 to 132 Tesla, a maximum gradient of 135 Tesla per meter, an adjustment range of 0 to -20/ and core lengths from 324 mm to 972 mm. The magnetic center must remain stable to within 1 micron during the 20/ adjustment. A magnetic measurement set-up has been developed that can measure sub-micron shifts of a magnetic center. The prototype satisfied the center shift requirement over the full range of integrated strengths.
NASA Astrophysics Data System (ADS)
Kaune, Alexander; López, Patricia; Werner, Micha; de Fraiture, Charlotte
2017-04-01
Hydrological information on water availability and demand is vital for sound water allocation decisions in irrigation districts, particularly in times of water scarcity. However, sub-optimal water allocation decisions are often taken with incomplete hydrological information, which may lead to agricultural production loss. In this study we evaluate the benefit of additional hydrological information from earth observations and reanalysis data in supporting decisions in irrigation districts. Current water allocation decisions were emulated through heuristic operational rules for water scarce and water abundant conditions in the selected irrigation districts. The Dynamic Water Balance Model based on the Budyko framework was forced with precipitation datasets from interpolated ground measurements, remote sensing and reanalysis data, to determine the water availability for irrigation. Irrigation demands were estimated based on estimates of potential evapotranspiration and coefficient for crops grown, adjusted with the interpolated precipitation data. Decisions made using both current and additional hydrological information were evaluated through the rate at which sub-optimal decisions were made. The decisions made using an amended set of decision rules that benefit from additional information on demand in the districts were also evaluated. Results show that sub-optimal decisions can be reduced in the planning phase through improved estimates of water availability. Where there are reliable observations of water availability through gauging stations, the benefit of the improved precipitation data is found in the improved estimates of demand, equally leading to a reduction of sub-optimal decisions.
Leveraging Code Comments to Improve Software Reliability
ERIC Educational Resources Information Center
Tan, Lin
2009-01-01
Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…
Reliability Practice at NASA Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Pruessner, Paula S.; Li, Ming
2008-01-01
This paper describes in brief the Reliability and Maintainability (R&M) Programs performed directly by the reliability branch at Goddard Space Flight Center (GSFC). The mission assurance requirements flow down is explained. GSFC practices for PRA, reliability prediction/fault tree analysis/reliability block diagram, FMEA, part stress and derating analysis, worst case analysis, trend analysis, limit life items are presented. Lessons learned are summarized and recommendations on improvement are identified.
Charlton, Paula C; Mentiplay, Benjamin F; Pua, Yong-Hao; Clark, Ross A
2015-05-01
Traditional methods of assessing joint range of motion (ROM) involve specialized tools that may not be widely available to clinicians. This study assesses the reliability and validity of a custom Smartphone application for assessing hip joint range of motion. Intra-tester reliability with concurrent validity. Passive hip joint range of motion was recorded for seven different movements in 20 males on two separate occasions. Data from a Smartphone, bubble inclinometer and a three dimensional motion analysis (3DMA) system were collected simultaneously. Intraclass correlation coefficients (ICCs), coefficients of variation (CV) and standard error of measurement (SEM) were used to assess reliability. To assess validity of the Smartphone application and the bubble inclinometer against the three dimensional motion analysis system, intraclass correlation coefficients and fixed and proportional biases were used. The Smartphone demonstrated good to excellent reliability (ICCs>0.75) for four out of the seven movements, and moderate to good reliability for the remaining three movements (ICC=0.63-0.68). Additionally, the Smartphone application displayed comparable reliability to the bubble inclinometer. The Smartphone application displayed excellent validity when compared to the three dimensional motion analysis system for all movements (ICCs>0.88) except one, which displayed moderate to good validity (ICC=0.71). Smartphones are portable and widely available tools that are mostly reliable and valid for assessing passive hip range of motion, with potential for large-scale use when a bubble inclinometer is not available. However, caution must be taken in its implementation as some movement axes demonstrated only moderate reliability. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Nutrition Environment Measures Survey in stores (NEMS-S): development and evaluation.
Glanz, Karen; Sallis, James F; Saelens, Brian E; Frank, Lawrence D
2007-04-01
Eating, or nutrition, environments are believed to contribute to obesity and chronic diseases. There is a need for valid, reliable measures of nutrition environments. This article reports on the development and evaluation of measures of nutrition environments in retail food stores. The Nutrition Environment Measures Study developed observational measures of the nutrition environment within retail food stores (NEMS-S) to assess availability of healthy options, price, and quality. After pretesting, measures were completed by independent raters to evaluate inter-rater reliability and across two occasions to assess test-retest reliability in grocery and convenience stores in four neighborhoods differing on income and community design in the Atlanta metropolitan area. Data were collected and analyzed in 2004 and 2005. Ten food categories (e.g., fruits) or indicator food items (e.g., ground beef) were evaluated in 85 stores. Inter-rater reliability and test-retest reliability of availability were high: inter-rater reliability kappas were 0.84 to 1.00, and test-retest reliabilities were .73 to 1.00. Inter-rater reliability for quality across fresh produce was moderate (kappas, 0.44 to 1.00). Healthier options were higher priced for hot dogs, lean ground beef, and baked chips. More healthful options were available in grocery than convenience stores and in stores in higher income neighborhoods. The NEMS-S tool was found to have a high degree of inter-rater and test-retest reliability, and to reveal significant differences across store types and neighborhoods of high and low socioeconomic status. These observational measures of nutrition environments can be applied in multilevel studies of community nutrition, and can inform new approaches to conducting and evaluating nutrition interventions.
RELAV - RELIABILITY/AVAILABILITY ANALYSIS PROGRAM
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
RELAV (Reliability/Availability Analysis Program) is a comprehensive analytical tool to determine the reliability or availability of any general system which can be modeled as embedded k-out-of-n groups of items (components) and/or subgroups. Both ground and flight systems at NASA's Jet Propulsion Laboratory have utilized this program. RELAV can assess current system performance during the later testing phases of a system design, as well as model candidate designs/architectures or validate and form predictions during the early phases of a design. Systems are commonly modeled as System Block Diagrams (SBDs). RELAV calculates the success probability of each group of items and/or subgroups within the system assuming k-out-of-n operating rules apply for each group. The program operates on a folding basis; i.e. it works its way towards the system level from the most embedded level by folding related groups into single components. The entire folding process involves probabilities; therefore, availability problems are performed in terms of the probability of success, and reliability problems are performed for specific mission lengths. An enhanced cumulative binomial algorithm is used for groups where all probabilities are equal, while a fast algorithm based upon "Computing k-out-of-n System Reliability", Barlow & Heidtmann, IEEE TRANSACTIONS ON RELIABILITY, October 1984, is used for groups with unequal probabilities. Inputs to the program include a description of the system and any one of the following: 1) availabilities of the items, 2) mean time between failures and mean time to repairs for the items from which availabilities are calculated, 3) mean time between failures and mission length(s) from which reliabilities are calculated, or 4) failure rates and mission length(s) from which reliabilities are calculated. The results are probabilities of success of each group and the system in the given configuration. RELAV assumes exponential failure distributions for reliability calculations and infinite repair resources for availability calculations. No more than 967 items or groups can be modeled by RELAV. If larger problems can be broken into subsystems of 967 items or less, the subsystem results can be used as item inputs to a system problem. The calculated availabilities are steady-state values. Group results are presented in the order in which they were calculated (from the most embedded level out to the system level). This provides a good mechanism to perform trade studies. Starting from the system result and working backwards, the granularity gets finer; therefore, system elements that contribute most to system degradation are detected quickly. RELAV is a C-language program originally developed under the UNIX operating system on a MASSCOMP MC500 computer. It has been modified, as necessary, and ported to an IBM PC compatible with a math coprocessor. The current version of the program runs in the DOS environment and requires a Turbo C vers. 2.0 compiler. RELAV has a memory requirement of 103 KB and was developed in 1989. RELAV is a copyrighted work with all copyright vested in NASA.
Okundamiya, Michael S; Emagbetere, Joy O; Ogujor, Emmanuel A
2014-01-01
The rapid growth of the mobile telecommunication sectors of many emerging countries creates a number of problems such as network congestion and poor service delivery for network operators. This results primarily from the lack of a reliable and cost-effective power solution within such regions. This study presents a comprehensive review of the underlying principles of the renewable energy technology (RET) with the objective of ensuring a reliable and cost-effective energy solution for a sustainable development in the emerging world. The grid-connected hybrid renewable energy system incorporating a power conversion and battery storage unit has been proposed based on the availability, dynamism, and technoeconomic viability of energy resources within the region. The proposed system's performance validation applied a simulation model developed in MATLAB, using a practical load data for different locations with varying climatic conditions in Nigeria. Results indicate that, apart from being environmentally friendly, the increase in the overall energy throughput of about 4 kWh/$ of the proposed system would not only improve the quality of mobile services, by making the operations of GSM base stations more reliable and cost effective, but also better the living standards of the host communities.
Okundamiya, Michael S.; Emagbetere, Joy O.; Ogujor, Emmanuel A.
2014-01-01
The rapid growth of the mobile telecommunication sectors of many emerging countries creates a number of problems such as network congestion and poor service delivery for network operators. This results primarily from the lack of a reliable and cost-effective power solution within such regions. This study presents a comprehensive review of the underlying principles of the renewable energy technology (RET) with the objective of ensuring a reliable and cost-effective energy solution for a sustainable development in the emerging world. The grid-connected hybrid renewable energy system incorporating a power conversion and battery storage unit has been proposed based on the availability, dynamism, and technoeconomic viability of energy resources within the region. The proposed system's performance validation applied a simulation model developed in MATLAB, using a practical load data for different locations with varying climatic conditions in Nigeria. Results indicate that, apart from being environmentally friendly, the increase in the overall energy throughput of about 4 kWh/$ of the proposed system would not only improve the quality of mobile services, by making the operations of GSM base stations more reliable and cost effective, but also better the living standards of the host communities. PMID:24578673
On the reliable use of satellite-derived surface water products for global flood monitoring
NASA Astrophysics Data System (ADS)
Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.
2015-12-01
Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications have not been assessed in large parts of the globe. In this study, we performed comparative evaluations of the commonly used satellite-based global flood detections and operational flood forecasting system using 10 major flood cases reported over three years (2012-2014). Specially, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS), and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS) for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the exiting limitations of each system. Finally, we propose possible ways forward to improve the reliability of large scale flood monitoring tools.
Increased Reliability of Gas Turbine Components by Robust Coatings Manufacturing
NASA Astrophysics Data System (ADS)
Sharma, A.; Dudykevych, T.; Sansom, D.; Subramanian, R.
2017-08-01
The expanding operational windows of the advanced gas turbine components demand increasing performance capability from protective coating systems. This demand has led to the development of novel multi-functional, multi-materials coating system architectures over the last years. In addition, the increasing dependency of components exposed to extreme environment on protective coatings results in more severe penalties, in case of a coating system failure. This emphasizes that reliability and consistency of protective coating systems are equally important to their superior performance. By means of examples, this paper describes the effects of scatter in the material properties resulting from manufacturing variations on coating life predictions. A strong foundation in process-property-performance correlations as well as regular monitoring and control of the coating process is essential for robust and well-controlled coating process. Proprietary and/or commercially available diagnostic tools can help in achieving these goals, but their usage in industrial setting is still limited. Various key contributors to process variability are briefly discussed along with the limitations of existing process and product control methods. Other aspects that are important for product reliability and consistency in serial manufacturing as well as advanced testing methodologies to simplify and enhance product inspection and improve objectivity are briefly described.
Hackenberg, Michael; Rodríguez-Ezpeleta, Naiara; Aransay, Ana M.
2011-01-01
We present a new version of miRanalyzer, a web server and stand-alone tool for the detection of known and prediction of new microRNAs in high-throughput sequencing experiments. The new version has been notably improved regarding speed, scope and available features. Alignments are now based on the ultrafast short-read aligner Bowtie (granting also colour space support, allowing mismatches and improving speed) and 31 genomes, including 6 plant genomes, can now be analysed (previous version contained only 7). Differences between plant and animal microRNAs have been taken into account for the prediction models and differential expression of both, known and predicted microRNAs, between two conditions can be calculated. Additionally, consensus sequences of predicted mature and precursor microRNAs can be obtained from multiple samples, which increases the reliability of the predicted microRNAs. Finally, a stand-alone version of the miRanalyzer that is based on a local and easily customized database is also available; this allows the user to have more control on certain parameters as well as to use specific data such as unpublished assemblies or other libraries that are not available in the web server. miRanalyzer is available at http://bioinfo2.ugr.es/miRanalyzer/miRanalyzer.php. PMID:21515631
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Daniel, Charles; Kalia, Prince; Smith, Charles A. (Technical Monitor)
2002-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of a 10-year Second Generation Reusable Launch Vehicle (RLV) program to improve its space transportation capabilities for both cargo and crewed missions. The objectives of the program are to: significantly increase safety and reliability, reduce the cost of accessing low-earth orbit, attempt to leverage commercial launch capabilities, and provide a growth path for manned space exploration. The safety, reliability and life cycle cost of the next generation vehicles are major concerns, and NASA aims to achieve orders of magnitude improvement in these areas. To get these significant improvements, requires a rigorous process that addresses Reliability, Maintainability and Supportability (RMS) and safety through all the phases of the life cycle of the program. This paper discusses the RMS process being implemented for the Second Generation RLV program.
A Survey of Techniques for Modeling and Improving Reliability of Computing Systems
Mittal, Sparsh; Vetter, Jeffrey S.
2015-04-24
Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based onmore » their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.« less
A Survey of Techniques for Modeling and Improving Reliability of Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh; Vetter, Jeffrey S.
Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based onmore » their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.« less
General Aviation Aircraft Reliability Study
NASA Technical Reports Server (NTRS)
Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)
2001-01-01
This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.
DOT National Transportation Integrated Search
2014-01-01
The second Strategic Highway Research Program (SHRP 2) Reliability program aims to improve trip time reliability by reducing the frequency and effects of events that cause travel times to fluctuate unpredictably. Congestion caused by unreliable, or n...
Monitoring outcomes with relational databases: does it improve quality of care?
Clemmer, Terry P
2004-12-01
There are 3 key ingredients in improving quality of medial care: 1) using a scientific process of improvement, 2) executing the process at the lowest possible level in the organization, and 3) measuring the results of any change reliably. Relational databases when used within these guidelines are of great value in these efforts if they contain reliable information that is pertinent to the project and used in a scientific process of quality improvement by a front line team. Unfortunately, the data are frequently unreliable and/or not pertinent to the local process and is used by persons at very high levels in the organization without a scientific process and without reliable measurement of the outcome. Under these circumstances the effectiveness of relational databases in improving care is marginal at best, frequently wasteful and has the potential to be harmful. This article explores examples of these concepts.
Marusich, Laura R; Bakdash, Jonathan Z; Onal, Emrah; Yu, Michael S; Schaffer, James; O'Donovan, John; Höllerer, Tobias; Buchler, Norbou; Gonzalez, Cleotilde
2016-03-01
We investigated how increases in task-relevant information affect human decision-making performance, situation awareness (SA), and trust in a simulated command-and-control (C2) environment. Increased information is often associated with an improvement of SA and decision-making performance in networked organizations. However, previous research suggests that increasing information without considering the task relevance and the presentation can impair performance. We used a simulated C2 task across two experiments. Experiment 1 varied the information volume provided to individual participants and measured the speed and accuracy of decision making for task performance. Experiment 2 varied information volume and information reliability provided to two participants acting in different roles and assessed decision-making performance, SA, and trust between the paired participants. In both experiments, increased task-relevant information volume did not improve task performance. In Experiment 2, increased task-relevant information volume reduced self-reported SA and trust, and incorrect source reliability information led to poorer task performance and SA. These results indicate that increasing the volume of information, even when it is accurate and task relevant, is not necessarily beneficial to decision-making performance. Moreover, it may even be detrimental to SA and trust among team members. Given the high volume of available and shared information and the safety-critical and time-sensitive nature of many decisions, these results have implications for training and system design in C2 domains. To avoid decrements to SA, interpersonal trust, and decision-making performance, information presentation within C2 systems must reflect human cognitive processing limits and capabilities. © 2016, Human Factors and Ergonomics Society.
Accurate paleointensities - the multi-method approach
NASA Astrophysics Data System (ADS)
de Groot, Lennart
2016-04-01
The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic times) have seen significant improvements and various alternative techniques were proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated and the importance of additional tests and criteria to assess Multispecimen results must be emphasized. Recently, a non-heating, relative paleointensity technique was proposed -the pseudo-Thellier protocol- which shows great potential in both accuracy and efficiency, but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old, the actual field strength at the time of cooling is therefore reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method, but fail in other techniques. Scrutinizing and combing all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units - an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.
On the next generation of reliability analysis tools
NASA Technical Reports Server (NTRS)
Babcock, Philip S., IV; Leong, Frank; Gai, Eli
1987-01-01
The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
Fan, Ching-Lin; Tseng, Fan-Ping; Tseng, Chiao-Yuan
2018-05-17
In this work, amorphous indium-gallium-zinc oxide thin-film transistors (a-IGZO TFTs) with a HfO₂ gate insulator and CF₄ plasma treatment was demonstrated for the first time. Through the plasma treatment, both the electrical performance and reliability of the a-IGZO TFT with HfO₂ gate dielectric were improved. The carrier mobility significantly increased by 80.8%, from 30.2 cm²/V∙s (without treatment) to 54.6 cm²/V∙s (with CF₄ plasma treatment), which is due to the incorporated fluorine not only providing an extra electron to the IGZO, but also passivating the interface trap density. In addition, the reliability of the a-IGZO TFT with HfO₂ gate dielectric has also been improved by the CF₄ plasma treatment. By applying the CF₄ plasma treatment to the a-IGZO TFT, the hysteresis effect of the device has been improved and the device's immunity against moisture from the ambient atmosphere has been enhanced. It is believed that the CF₄ plasma treatment not only significantly improves the electrical performance of a-IGZO TFT with HfO₂ gate dielectric, but also enhances the device's reliability.
A Method of Porcine Pancreatic Islet Isolation for Microencapsulation.
Kendall, William F; Opara, Emmanuel C
2017-01-01
Since the discovery of insulin by Banting and Best in 1921, the prognosis and treatment options for individuals with diabetes have improved. The development of various insulin types, various oral agents, and insulin pumps have improved the available medical options for individuals afflicted with diabetes. The current need for frequent blood glucose monitoring imposed by multiple daily insulin injections, result in significant life-style challenges for in individuals afflicted with Type 1 diabetes (T1D). In contrast the use of surgical interventions, such as whole organ pancreas transplantation (PT) requires less-intensive glucose monitoring while the organ is viable. Also, isolated human pancreatic islet transplantation (IT) holds similar promise as PT; however, the limited availability of human pancreata exacerbated by, the need for multiple pancreata per individual IT recipient, and issues with prolonged viability, still hamper widespread successful, and routine use of IT. The use of porcine pancreata holds promise as a viable alternative to human pancreas to significantly increase the volume of islets available to meet the needs of millions of patients afflicted with T1D. This chapter outlines our protocol utilized to reliably isolate and microencapsulate porcine islets.
Bryce, S D; Lee, S J; Ponsford, J L; Lawrence, R J; Tan, E J; Rossell, S L
2018-06-20
Cognitive remediation (CR) is considered a potentially effective method of improving cognitive function in people with schizophrenia. Few studies, however, have explored the role of intrinsic motivation on treatment utilization or training outcomes in CR in this population. This study explored the impact of task-specific intrinsic motivation on attendance and reliable cognitive improvement in a controlled trial comparing CR with a computer game (CG) playing control. Forty-nine participants with schizophrenia or schizoaffective disorder, allocated to 10 weeks of group-based CR (n = 25) or CG control (n = 24), provided complete outcome data at baseline. Forty-three participants completed their assigned intervention. Cognition, psychopathology and intrinsic motivation were measured at baseline and end-treatment. Regression analyses explored the relative contribution of baseline motivation and other clinical factors to session attendance as well as the association of baseline and change in intrinsic motivation with the odds of reliable cognitive improvement (calculated using reliable change indices). Baseline reports of perceived program value were the only significant multivariable predictor of session attendance when including global cognition and psychiatric symptomatology. The odds of reliable cognitive improvement significantly increased with greater improvements in program interest and value from baseline to end-treatment. Motivational changes over time were highly variable between participants. Task-specific intrinsic motivation in schizophrenia may represent an important patient-related factor that contributes to session attendance and cognitive improvements in CR. Regular evaluation and enhancement of intrinsic motivation in cognitively enhancing interventions may optimize treatment engagement and the likelihood of meaningful training outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.
Burns, Ted M.; Conaway, Mark; Sanders, Donald B.
2010-01-01
Objective: To study the concurrent and construct validity and test-retest reliability in the practice setting of an outcome measure for myasthenia gravis (MG). Methods: Eleven centers participated in the validation study of the Myasthenia Gravis Composite (MGC) scale. Patients with MG were evaluated at 2 consecutive visits. Concurrent and construct validities of the MGC were assessed by evaluating MGC scores in the context of other MG-specific outcome measures. We used numerous potential indicators of clinical improvement to assess the sensitivity and specificity of the MGC for detecting clinical improvement. Test-retest reliability was performed on patients at the University of Virginia. Results: A total of 175 patients with MG were enrolled at 11 sites from July 1, 2008, to January 31, 2009. A total of 151 patients were seen in follow-up. Total MGC scores showed excellent concurrent validity with other MG-specific scales. Analyses of sensitivities and specificities of the MGC revealed that a 3-point improvement in total MGC score was optimal for signifying clinical improvement. A 3-point improvement in the MGC also appears to represent a meaningful improvement to most patients, as indicated by improved 15-item myasthenia gravis quality of life scale (MG-QOL15) scores. The psychometric properties were no better for an individualized subscore made up of the 2 functional domains that the patient identified as most important to treat. The test-retest reliability coefficient of the MGC was 98%, with a lower 95% confidence interval of 97%, indicating excellent test-retest reliability. Conclusions: The Myasthenia Gravis Composite is a reliable and valid instrument for measuring clinical status of patients with myasthenia gravis in the practice setting and in clinical trials. PMID:20439845
Liu, Chao; Liu, Jinhong; Zhang, Junxiang; Zhu, Shiyao
2018-02-05
The direct counterfactual quantum communication (DCQC) is a surprising phenomenon that quantum information can be transmitted without using any carriers of physical particles. The nested interferometers are promising devices for realizing DCQC as long as the number of interferometers goes to be infinity. Considering the inevitable loss or dissipation in practical experimental interferometers, we analyze the dependence of reliability on the number of interferometers, and show that the reliability of direct communication is being rapidly degraded with the large number of interferometers. Furthermore, we simulate and test this counterfactual deterministic communication protocol with a finite number of interferometers, and demonstrate the improvement of the reliability using dissipation compensation in interferometers.
Gearbox Reliability Collaborative Phase 3 Gearbox 3 Test
Keller, Jonathan (ORCID:0000000177243885)
2016-12-28
The GRC uses a combined gearbox testing, modeling, and analysis approach disseminating data and results to the industry and facilitating improvement of gearbox reliability. This test data describes the tests of GRC gearbox 3 in the National Wind Technology Center dynamometer and documents any modifications to the original test plan. It serves as a guide to interpret the publicly released data sets with brief analyses to illustrate the data. TDMS viewer and Solidworks software required to view data files. The National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) was established by the U.S. Department of Energy in 2006; its key goal is to understand the root causes of premature gearbox failures and improve their reliability.
Gearbox Reliability Collaborative Phase 3 Gearbox 2 Test
Keller, Jonathan; Robb, Wallen
2016-05-12
The National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) was established by the U.S. Department of Energy in 2006; its key goal is to understand the root causes of premature gearbox failures and improve their reliability. The GRC uses a combined gearbox testing, modeling, and analysis approach disseminating data and results to the industry and facilitating improvement of gearbox reliability. This test data describes the tests of GRC gearbox 2 in the National Wind Technology Center dynamometer and documents any modifications to the original test plan. It serves as a guide to interpret the publicly released data sets with brief analyses to illustrate the data. TDMS viewer and Solidworks software required to view data files.
Improving Video Based Heart Rate Monitoring.
Lin, Jian; Rozado, David; Duenser, Andreas
2015-01-01
Non-contact measurements of cardiac pulse can provide robust measurement of heart rate (HR) without the annoyance of attaching electrodes to the body. In this paper we explore a novel and reliable method to carry out video-based HR estimation and propose various performance improvement over existing approaches. The investigated method uses Independent Component Analysis (ICA) to detect the underlying HR signal from a mixed source signal present in the RGB channels of the image. The original ICA algorithm was implemented and several modifications were explored in order to determine which one could be optimal for accurate HR estimation. Using statistical analysis, we compared the cardiac pulse rate estimation from the different methods under comparison on the extracted videos to a commercially available oximeter. We found that some of these methods are quite effective and efficient in terms of improving accuracy and latency of the system. We have made the code of our algorithms openly available to the scientific community so that other researchers can explore how to integrate video-based HR monitoring in novel health technology applications. We conclude by noting that recent advances in video-based HR monitoring permit computers to be aware of a user's psychophysiological status in real time.
Measuring cognitive change with ImPACT: the aggregate baseline approach.
Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul
2017-11-01
The Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) is commonly used to assess baseline and post-injury cognition among athletes in North America. Despite this, several studies have questioned the reliability of ImPACT when given at intervals employed in clinical practice. Poor test-retest reliability reduces test sensitivity to cognitive decline, increasing the likelihood that concussed athletes will be returned to play prematurely. We recently showed that the reliability of ImPACT can be increased when using a new composite structure and the aggregate of two baselines to predict subsequent performance. The purpose of the present study was to confirm our previous findings and determine whether the addition of a third baseline would further increase the test-retest reliability of ImPACT. Data from 97 English speaking professional hockey players who had received at least 4 ImPACT baseline evaluations were extracted from a National Hockey League Concussion Program database. Linear regression was used to determine whether each of the first three testing sessions accounted for unique variance in the fourth testing session. Results confirmed that the aggregate baseline approach improves the psychometric properties of ImPACT, with most indices demonstrating adequate or better test-retest reliability for clinical use. The aggregate baseline approach provides a modest clinical benefit when recent baselines are available - and a more substantial benefit when compared to approaches that obtain baseline measures only once during the course of a multi-year playing career. Pending confirmation in diverse samples, neuropsychologists are encouraged to use the aggregate baseline approach to best quantify cognitive change following sports concussion.
A sensitive and reliable test instrument to assess swimming in rats with spinal cord injury.
Xu, Ning; Åkesson, Elisabet; Holmberg, Lena; Sundström, Erik
2015-09-15
For clinical translation of experimental spinal cord injury (SCI) research, evaluation of animal SCI models should include several sensorimotor functions. Validated and reliable assessment tools should be applicable to a wide range of injury severity. The BBB scale is the most widely used test instrument, but similar to most others it is used to assess open field ambulation. We have developed an assessment tool for swimming in rats with SCI, with high discriminative power and sensitivity to functional recovery after mild and severe injuries, without need for advanced test equipment. We studied various parameters of swimming in four groups of rats with thoracic SCI of different severity and a control group, for 8 weeks after surgery. Six parameters were combined in a multiple item scale, the Karolinska Institutet Swim Assessment Tool (KSAT). KSAT scores for all SCI groups showed consistent functional improvement after injury, and significant differences between the five experimental groups. The internal consistency, the inter-rater and the test-retest reliability were very high. The KSAT score was highly correlated to the cross-section area of white matter spared at the injury epicenter. Importantly, even after 8 weeks of recovery the KSAT score reliably discriminated normal animals from those inflicted by the mildest injury, and also displayed the recovery of the most severely injured rats. We conclude that this swim scale is an efficient and reliable tool to assess motor activity during swimming, and an important addition to the methods available for evaluating rat models of SCI. Copyright © 2015 Elsevier B.V. All rights reserved.
Improving fMRI reliability in presurgical mapping for brain tumours.
Stevens, M Tynan R; Clarke, David B; Stroink, Gerhard; Beyea, Steven D; D'Arcy, Ryan Cn
2016-03-01
Functional MRI (fMRI) is becoming increasingly integrated into clinical practice for presurgical mapping. Current efforts are focused on validating data quality, with reliability being a major factor. In this paper, we demonstrate the utility of a recently developed approach that uses receiver operating characteristic-reliability (ROC-r) to: (1) identify reliable versus unreliable data sets; (2) automatically select processing options to enhance data quality; and (3) automatically select individualised thresholds for activation maps. Presurgical fMRI was conducted in 16 patients undergoing surgical treatment for brain tumours. Within-session test-retest fMRI was conducted, and ROC-reliability of the patient group was compared to a previous healthy control cohort. Individually optimised preprocessing pipelines were determined to improve reliability. Spatial correspondence was assessed by comparing the fMRI results to intraoperative cortical stimulation mapping, in terms of the distance to the nearest active fMRI voxel. The average ROC-r reliability for the patients was 0.58±0.03, as compared to 0.72±0.02 in healthy controls. For the patient group, this increased significantly to 0.65±0.02 by adopting optimised preprocessing pipelines. Co-localisation of the fMRI maps with cortical stimulation was significantly better for more reliable versus less reliable data sets (8.3±0.9 vs 29±3 mm, respectively). We demonstrated ROC-r analysis for identifying reliable fMRI data sets, choosing optimal postprocessing pipelines, and selecting patient-specific thresholds. Data sets with higher reliability also showed closer spatial correspondence to cortical stimulation. ROC-r can thus identify poor fMRI data at time of scanning, allowing for repeat scans when necessary. ROC-r analysis provides optimised and automated fMRI processing for improved presurgical mapping. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Training less-experienced faculty improves reliability of skills assessment in cardiac surgery.
Lou, Xiaoying; Lee, Richard; Feins, Richard H; Enter, Daniel; Hicks, George L; Verrier, Edward D; Fann, James I
2014-12-01
Previous work has demonstrated high inter-rater reliability in the objective assessment of simulated anastomoses among experienced educators. We evaluated the inter-rater reliability of less-experienced educators and the impact of focused training with a video-embedded coronary anastomosis assessment tool. Nine less-experienced cardiothoracic surgery faculty members from different institutions evaluated 2 videos of simulated coronary anastomoses (1 by a medical student and 1 by a resident) at the Thoracic Surgery Directors Association Boot Camp. They then underwent a 30-minute training session using an assessment tool with embedded videos to anchor rating scores for 10 components of coronary artery anastomosis. Afterward, they evaluated 2 videos of a different student and resident performing the task. Components were scored on a 1 to 5 Likert scale, yielding an average composite score. Inter-rater reliabilities of component and composite scores were assessed using intraclass correlation coefficients (ICCs) and overall pass/fail ratings with kappa. All components of the assessment tool exhibited improvement in reliability, with 4 (bite, needle holder use, needle angles, and hand mechanics) improving the most from poor (ICC range, 0.09-0.48) to strong (ICC range, 0.80-0.90) agreement. After training, inter-rater reliabilities for composite scores improved from moderate (ICC, 0.76) to strong (ICC, 0.90) agreement, and for overall pass/fail ratings, from poor (kappa = 0.20) to moderate (kappa = 0.78) agreement. Focused, video-based anchor training facilitates greater inter-rater reliability in the objective assessment of simulated coronary anastomoses. Among raters with less teaching experience, such training may be needed before objective evaluation of technical skills. Published by Elsevier Inc.
Reliable data storage system design and implementation for acoustic logging while drilling
NASA Astrophysics Data System (ADS)
Hao, Xiaolong; Ju, Xiaodong; Wu, Xiling; Lu, Junqiang; Men, Baiyong; Yao, Yongchao; Liu, Dong
2016-12-01
Owing to the limitations of real-time transmission, reliable downhole data storage and fast ground reading have become key technologies in developing tools for acoustic logging while drilling (LWD). In order to improve the reliability of the downhole storage system in conditions of high temperature, intensive shake and periodic power supply, improvements were made in terms of hardware and software. In hardware, we integrated the storage system and data acquisition control module into one circuit board, to reduce the complexity of the storage process, by adopting the controller combination of digital signal processor and field programmable gate array. In software, we developed a systematic management strategy for reliable storage. Multiple-backup independent storage was employed to increase the data redundancy. A traditional error checking and correction (ECC) algorithm was improved and we embedded the calculated ECC code into all management data and waveform data. A real-time storage algorithm for arbitrary length data was designed to actively preserve the storage scene and ensure the independence of the stored data. The recovery procedure of management data was optimized to realize reliable self-recovery. A new bad block management idea of static block replacement and dynamic page mark was proposed to make the period of data acquisition and storage more balanced. In addition, we developed a portable ground data reading module based on a new reliable high speed bus to Ethernet interface to achieve fast reading of the logging data. Experiments have shown that this system can work stably below 155 °C with a periodic power supply. The effective ground data reading rate reaches 1.375 Mbps with 99.7% one-time success rate at room temperature. This work has high practical application significance in improving the reliability and field efficiency of acoustic LWD tools.
NASA Astrophysics Data System (ADS)
Rosas, Pedro; Wagemans, Johan; Ernst, Marc O.; Wichmann, Felix A.
2005-05-01
A number of models of depth-cue combination suggest that the final depth percept results from a weighted average of independent depth estimates based on the different cues available. The weight of each cue in such an average is thought to depend on the reliability of each cue. In principle, such a depth estimation could be statistically optimal in the sense of producing the minimum-variance unbiased estimator that can be constructed from the available information. Here we test such models by using visual and haptic depth information. Different texture types produce differences in slant-discrimination performance, thus providing a means for testing a reliability-sensitive cue-combination model with texture as one of the cues to slant. Our results show that the weights for the cues were generally sensitive to their reliability but fell short of statistically optimal combination - we find reliability-based reweighting but not statistically optimal cue combination.
Magnan, Morris A; Maklebust, Joann
2008-01-01
To evaluate the effect of Web-based Braden Scale training on the reliability and precision of pressure ulcer risk assessments made by registered nurses (RN) working in acute care settings. Pretest-posttest, 2-group, quasi-experimental design. Five hundred Braden Scale risk assessments were made on 102 acute care patients deemed to be at various levels of risk for pressure ulceration. Assessments were made by RNs working in acute care hospitals at 3 different medical centers where the Braden Scale was in regular daily use (2 medical centers) or new to the setting (1 medical center). The Braden Scale for Predicting Pressure Sore Risk was used to guide pressure ulcer risk assessments. A Web-based version of the Detroit Medical Center Braden Scale Computerized Training Module was used to teach nurses correct use of the Braden Scale and selection of risk-based pressure ulcer prevention interventions. In the aggregate, RN generated reliable Braden Scale pressure ulcer risk assessments 65% of the time after training. The effect of Web-based Braden Scale training on reliability and precision of assessments varied according to familiarity with the scale. With training, new users of the scale made reliable assessments 84% of the time and significantly improved precision of their assessments. The reliability and precision of Braden Scale risk assessments made by its regular users was unaffected by training. Technology-assisted Braden Scale training improved both reliability and precision of risk assessments made by new users of the scale, but had virtually no effect on the reliability or precision of risk assessments made by regular users of the instrument. Further research is needed to determine best approaches for improving reliability and precision of Braden Scale assessments made by its regular users.
Daniels, Vijay J; Bordage, Georges; Gierl, Mark J; Yudkowsky, Rachel
2014-10-01
Objective structured clinical examinations (OSCEs) are used worldwide for summative examinations but often lack acceptable reliability. Research has shown that reliability of scores increases if OSCE checklists for medical students include only clinically relevant items. Also, checklists are often missing evidence-based items that high-achieving learners are more likely to use. The purpose of this study was to determine if limiting checklist items to clinically discriminating items and/or adding missing evidence-based items improved score reliability in an Internal Medicine residency OSCE. Six internists reviewed the traditional checklists of four OSCE stations classifying items as clinically discriminating or non-discriminating. Two independent reviewers augmented checklists with missing evidence-based items. We used generalizability theory to calculate overall reliability of faculty observer checklist scores from 45 first and second-year residents and predict how many 10-item stations would be required to reach a Phi coefficient of 0.8. Removing clinically non-discriminating items from the traditional checklist did not affect the number of stations (15) required to reach a Phi of 0.8 with 10 items. Focusing the checklist on only evidence-based clinically discriminating items increased test score reliability, needing 11 stations instead of 15 to reach 0.8; adding missing evidence-based clinically discriminating items to the traditional checklist modestly improved reliability (needing 14 instead of 15 stations). Checklists composed of evidence-based clinically discriminating items improved the reliability of checklist scores and reduced the number of stations needed for acceptable reliability. Educators should give preference to evidence-based items over non-evidence-based items when developing OSCE checklists.
Banks, Merrilyn; Hannan-Jones, Mary; Ross, Lynda; Buckley, Ann; Ellick, Jennifer; Young, Adrienne
2017-04-01
To develop and test the reliability of a Meal Quality Audit Tool (MQAT) to audit the quality of hospital meals to assist food service managers and dietitians in identifying areas for improvement. The MQAT was developed using expert opinion and was modified over time with extensive use and feedback. A phased approach was used to assess content validity and test reliability: (i) trial with 60 dietetic students, (ii) trial with 12 food service dietitians in practice and (iii) interrater reliability study. Phases 1 and 2 confirmed content validity and informed minor revision of scoring, language and formatting of the MQAT. To assess reliability of the final MQAT, eight separate meal quality audits of five identical meals were conducted over several weeks in the hospital setting. Each audit comprised an 'expert' team and four 'test' teams (dietitians, food services and ward staff). Interrater reliability was determined using intra-class correlation analysis. There was statistically significant interrater reliability for dimensions of Temperature and Accuracy (P < 0.001) but not for Appearance or Sensory. Composition of the 'test' team appeared to influence results for Appearance and Sensory, with food service-led teams scoring higher on these dimensions. 'Test' teams reported that MQAT was clear and easy to use. MQAT was found to be reliable for Temperature and Accuracy domains, with further work required to improve the reliability of the Appearance and Sensory dimensions. The systematic use of the tool, used in conjunction with patient satisfaction, could provide pertinent and useful information regarding the quality of food services and areas for improvement. © 2017 Dietitians Association of Australia.
Distribution System Reliability Analysis for Smart Grid Applications
NASA Astrophysics Data System (ADS)
Aljohani, Tawfiq Masad
Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address its reliability concerns, the power utilities and interested parties have spent extensive amount of time and effort to analyze and study the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to be focused on improving the reliability of the distribution network, the connection joint between the power providers and the consumers where most of the electricity problems occur. In this work, we will examine the effect of the smart grid applications in improving the reliability of the power distribution networks. The test system used in conducting this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of the automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures will be the changes in the reliability system indices including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of the Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, which is intelligent power distribution software that is developed by General Reliability Co.
Improved Reading Gate For Vertical-Bloch-Line Memory
NASA Technical Reports Server (NTRS)
Wu, Jiin-Chuan; Stadler, Henry L.; Katti, Romney R.
1994-01-01
Improved design for reading gate of vertical-Bloch-line magnetic-bubble memory increases reliability of discrimination between binary ones and zeros. Magnetic bubbles that signify binary "1" and "0" produced by applying sufficiently large chopping currents to memory stripes. Bubbles then propagated differentially in bubble sorter. Method of discriminating between ones and zeros more reliable.
NASA Technical Reports Server (NTRS)
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
2010-01-01
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observation based on analysis leading to the Ground Operations Project Preliminary Design Review milestone.
Stieger, Greta; Scheringer, Martin; Ng, Carla A; Hungerbühler, Konrad
2014-12-01
Polybrominated diphenylethers (PBDEs) and hexabromocyclododecane (HBCDD) are major brominated flame retardants (BFRs) that are now banned or under restrictions in many countries because of their persistence, bioaccumulation potential and toxicity (PBT properties). However, there is a wide range of alternative BFRs, such as decabromodiphenyl ethane and tribromophenol, that are increasingly used as replacements, but which may possess similar hazardous properties. This necessitates hazard and risk assessments of these compounds. For a set of 36 alternative BFRs, we searched 25 databases for chemical property data that are needed as input for a PBT assessment. These properties are degradation half-life, bioconcentration factor (BCF), octanol-water partition coefficient (Kow), and toxic effect concentrations in aquatic organisms. For 17 of the 36 substances, no data at all were found for these properties. Too few persistence data were available to even assess the quality of these data in a systematic way. The available data for Kow and toxicity show surprisingly high variability, which makes it difficult to identify the most reliable values. We propose methods for systematic evaluations of PBT-related chemical property data that should be performed before data are included in publicly available databases. Using these methods, we evaluated the data for Kow and toxicity in more detail and identified several inaccurate values. For most of the 36 alternative BFRs, the amount and the quality of the PBT-related property data need to be improved before reliable hazard and risk assessments of these substances can be performed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Reliability of provocative tests of motion sickness susceptibility
NASA Technical Reports Server (NTRS)
Calkins, D. S.; Reschke, M. F.; Kennedy, R. S.; Dunlop, W. P.
1987-01-01
Test-retest reliability values were derived from motion sickness susceptibility scores obtained from two successive exposures to each of three tests: (1) Coriolis sickness sensitivity test; (2) staircase velocity movement test; and (3) parabolic flight static chair test. The reliability of the three tests ranged from 0.70 to 0.88. Normalizing values from predictors with skewed distributions improved the reliability.
NASA Astrophysics Data System (ADS)
Zhuang, Y.; Tian, F.; Yigzaw, W.; Hejazi, M. I.; Li, H. Y.; Turner, S. W. D.; Vernon, C. R.
2017-12-01
More and more reservoirs are being build or planned in order to help meet the increasing water demand all over the world. However, is building new reservoirs always helpful to water supply? To address this question, the river routing module of Global Change Assessment Model (GCAM) has been extended with a simple yet physical-based reservoir scheme accounting for irrigation, flood control and hydropower operations at each individual reservoir. The new GCAM river routing model has been applied over the global domain with the runoff inputs from the Variable Infiltration Capacity Model. The simulated streamflow is validated at 150 global river basins where the observed streamflow data are available. The model performance has been significantly improved at 77 basins and worsened at 35 basins. To facilitate the analysis of additional reservoir storage impacts at the basin level, a lumped version of GCAM reservoir model has been developed, representing a single lumped reservoir at each river basin which has the regulation capacity of all reservoir combined. A Sequent Peak Analysis is used to estimate how much additional reservoir storage is required to satisfy the current water demand. For basins with water deficit, the water supply reliability can be improved with additional storage. However, there is a threshold storage value at each basin beyond which the reliability stops increasing, suggesting that building new reservoirs will not help better relieve the water stress. Findings in the research can be helpful to the future planning and management of new reservoirs.
Reliability, Validity and Treatment Sensitivity of the Schizophrenia Cognition Rating Scale
Keefe, Richard S.E.; Davis, Vicki G.; Spagnola, Nathan B.; Hilt, Dana; Dgetluck, Nancy; Ruse, Stacy; Patterson, Thomas L.; Narasimhan, Meera; Harvey, Philip D.
2014-01-01
Cognitive functioning can be assessed with performance-based assessments such as neuropsychological tests and with interview-based assessments. Both assessment methods have the potential to assess whether treatments for schizophrenia improve clinically relevant aspects of cognitive impairment. However, little is known about the reliability, validity and treatment responsiveness of interview-based measures, especially in the context of clinical trials. Data from two studies were utilized to assess these features of the Schizophrenia Cognition Rating Scale (SCoRS). One of the studies was a validation study involving 79 patients with schizophrenia assessed at 3 academic research centers in the US. The other study was a 32-site clinical trial conducted in the US and Europe comparing the effects of encenicline, an alpha-7 nicotine agonist, to placebo in 319 patients with schizophrenia. The SCoRS interviewer ratings demonstrated excellent test-retest reliability in several different circumstances, including those that did not involve treatment (ICC> 0.90), and during treatment (ICC>0.80). SCoRS interviewer ratings were related to cognitive performance as measured by the MCCB (r= −0.35), and demonstrated significant sensitivity to treatment with encenicline compared to placebo (P<.001). These data suggest that the SCoRS has potential as a clinically relevant measure in clinical trials aiming to improve cognition in schizophrenia, and may be useful for clinical practice. The weaknesses of the SCoRS include its reliance on informant information, which is not available for some patients, and reduced validity when patient self-report is the sole information source. PMID:25028065
Reliability, validity and treatment sensitivity of the Schizophrenia Cognition Rating Scale.
Keefe, Richard S E; Davis, Vicki G; Spagnola, Nathan B; Hilt, Dana; Dgetluck, Nancy; Ruse, Stacy; Patterson, Thomas D; Narasimhan, Meera; Harvey, Philip D
2015-02-01
Cognitive functioning can be assessed with performance-based assessments such as neuropsychological tests and with interview-based assessments. Both assessment methods have the potential to assess whether treatments for schizophrenia improve clinically relevant aspects of cognitive impairment. However, little is known about the reliability, validity and treatment responsiveness of interview-based measures, especially in the context of clinical trials. Data from two studies were utilized to assess these features of the Schizophrenia Cognition Rating Scale (SCoRS). One of the studies was a validation study involving 79 patients with schizophrenia assessed at 3 academic research centers in the US. The other study was a 32-site clinical trial conducted in the US and Europe comparing the effects of encenicline, an alpha-7 nicotine agonist, to placebo in 319 patients with schizophrenia. The SCoRS interviewer ratings demonstrated excellent test-retest reliability in several different circumstances, including those that did not involve treatment (ICC> 0.90), and during treatment (ICC>0.80). SCoRS interviewer ratings were related to cognitive performance as measured by the MCCB (r=-0.35), and demonstrated significant sensitivity to treatment with encenicline compared to placebo (P<.001). These data suggest that the SCoRS has potential as a clinically relevant measure in clinical trials aiming to improve cognition in schizophrenia, and may be useful for clinical practice. The weaknesses of the SCoRS include its reliance on informant information, which is not available for some patients, and reduced validity when patient's self-report is the sole information source. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.
Flood loss model transfer: on the value of additional data
NASA Astrophysics Data System (ADS)
Schröter, Kai; Lüdtke, Stefan; Vogel, Kristin; Kreibich, Heidi; Thieken, Annegret; Merz, Bruno
2017-04-01
The transfer of models across geographical regions and flood events is a key challenge in flood loss estimation. Variations in local characteristics and continuous system changes require regional adjustments and continuous updating with current evidence. However, acquiring data on damage influencing factors is expensive and therefore assessing the value of additional data in terms of model reliability and performance improvement is of high relevance. The present study utilizes empirical flood loss data on direct damage to residential buildings available from computer aided telephone interviews that were carried out after the floods in 2002, 2005, 2006, 2010, 2011 and 2013 mainly in the Elbe and Danube catchments in Germany. Flood loss model performance is assessed for incrementally increased numbers of loss data which are differentiated according to region and flood event. Two flood loss modeling approaches are considered: (i) a multi-variable flood loss model approach using Random Forests and (ii) a uni-variable stage damage function. Both model approaches are embedded in a bootstrapping process which allows evaluating the uncertainty of model predictions. Predictive performance of both models is evaluated with regard to mean bias, mean absolute and mean squared errors, as well as hit rate and sharpness. Mean bias and mean absolute error give information about the accuracy of model predictions; mean squared error and sharpness about precision and hit rate is an indicator for model reliability. The results of incremental, regional and temporal updating demonstrate the usefulness of additional data to improve model predictive performance and increase model reliability, particularly in a spatial-temporal transfer setting.
Coal-Powered Electric Generating Unit Efficiency and Reliability Dialogue: Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Emmanuel
Coal continues to play a critical role in powering the Nation’s electricity generation, especially for baseload power plants. With aging coal generation assets facing decreased performance due to the state of the equipment, and with challenges exacerbated by the current market pressures on the coal sector, there are opportunities to advance early-stage technologies that can retrofit or replace equipment components. These changes will eventually result in significant improvements in plant performance once further developed and deployed by industry. Research and development in areas such as materials, fluid dynamics, fuel properties and preparation characteristics, and a new generation of plant controlsmore » can lead to new components and systems that can help improve the efficiency and reliability of coal-fired power plants significantly, allowing these assets to continue to provide baseload power. Coal stockpiles at electricity generation plants are typically large enough to provide 30 to 60 days of power prior to resupply—significantly enhancing the stability and reliability of the U.S. electricity sector. Falling prices for non-dispatchable renewable energy and mounting environmental regulations, among other factors, have stimulated efforts to improve the efficiency of these coal-fired electric generating units (EGUs). In addition, increased reliance on natural gas and non-dispatchable energy sources has spurred efforts to further increase the reliability of coal EGUs. The Coal Powered EGU Efficiency and Reliability Dialogue brought together stakeholders from across the coal EGU industry to discuss methods for improvement. Participants at the event reviewed performance-enhancing innovations in coal EGUs, discussed the potential for data-driven management practices to increase efficiency and reliability, investigated the impacts of regulatory compliance on coal EGU performance, and discussed upcoming challenges for the coal industry. This report documents the key findings and research suggestions discussed at the event. Discussions at the workshop will aid DOE in developing a set of distinct initiatives that can be pursued by government and industry to realize promising technological pursuits. DOE plans to use the results of the Dialogue coupled with ongoing technical analysis of efficiency opportunities within the coal-fired fleet, and additional studies to develop a comprehensive strategy for capitalizing on thermal efficiency improvements. Expected Power Plant Efficiency Improvements include developing cost-effective, efficient, and reliable technologies for boilers, turbines, and sensors and controls to improve the reliability and efficiency of existing coal-based power plants. The Office of Fossil Energy at DOE plans to work with industry to develop knowledge pertaining to advanced technologies and systems that industry can subsequently develop. These technologies and systems will increase reliability, add operational flexibility and improve efficiency, thereby providing more robust power generation infrastructure. The following table lists the research suggestions and questions for further investigation that were identified by participants in each session of the dialogue.« less
Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine
2013-08-06
We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks using design of experiment (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, which is based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default setting, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized by using CCD for further improvement. The approach combining optimal parameter setting and the threshold method improved the reliability index about 9.5 times for a standards mixture and 14.5 times for human urine data, which required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
Sassoon, Adam; Nam, Denis; Nunley, Ryan; Barrack, Robert
2015-01-01
Patient-specific cutting blocks have been touted as a more efficient and reliable means of achieving neutral mechanical alignment during TKA with the proposed downstream effect of improved clinical outcomes. However, it is not clear to what degree published studies support these assumptions. We asked: (1) Do patient-specific cutting blocks achieve neutral mechanical alignment more reliably during TKA when compared with conventional methods? (2) Does patient-specific instrumentation (PSI) provide financial benefit through improved surgical efficiency? (3) Does the use of patient-specific cutting blocks translate to improved clinical results after TKA when compared with conventional instrumentation? We performed a systematic review in accordance with Cochrane guidelines of controlled studies (prospective and retrospective) in MEDLINE® and EMBASE® with respect to patient-specific cutting blocks and their effect on alignment, cost, operative time, clinical outcome scores, complications, and survivorship. Sixteen studies (Level I-III on the levels of evidence rubric) were identified and used in addressing the first question, 13 (Level I-III) for the second question, and two (Level III) for the third question. Qualitative assessment of the selected Level I studies was performed using the modified Jadad score; Level II and III studies were rated based on the Newcastle-Ottawa scoring system. The majority of studies did not show an improvement in overall limb alignment when PSI was compared with standard instrumentation. Mixed results were seen across studies with regard to the prevalence of alignment outliers when PSI was compared with conventional cutting blocks with some studies demonstrating no difference, some showing an improvement with PSI, and a single study showing worse results with PSI. The studies demonstrated mixed results regarding the influence of PSI on operative times. Decreased operative times were not uniformly observed, and when noted, they were found to be of minimal clinical or financial significance. PSI did reliably reduce the number of instrument trays required for processing perioperatively. The accuracy of the preoperative plan, generated by the PSI manufacturers, was found lacking, often leading to multiple intraoperative changes, thereby disrupting the flow of the operation and negatively impacting efficiency. Limited data exist with regard to the effect of PSI on postoperative function, improvement in pain, and patient satisfaction. Neither of the two studies we identified provided strong evidence to support an advantage favoring the use of PSI. No identified studies addressed survivorship of components placed with PSI compared with those placed with standard instrumentation. PSI for TKA has not reliably demonstrated improvement of postoperative limb or component alignment when compared with standard instrumentation. Although decisive evidence exists to support that PSI requires fewer surgical trays, PSI has not clearly been shown to improve overall surgical efficiency or the cost-effectiveness of TKA. Mid- and long-term data regarding PSI's effect on functional outcomes and component survivorship do not exist and short-term data are scarce. Limited available literature does not clearly support any improvement of postoperative pain, activity, function, or ROM when PSI is compared with traditional instrumentation.
Decision - making of Direct Customers Based on Available Transfer Capability
NASA Astrophysics Data System (ADS)
Quan, Tang; Zhaohang, Lin; Huaqiang, Li
2017-05-01
Large customer direct-power-purchasing is a hot spot in the electricity market reform. In this paper, the author established an Available Transfer Capability (ATC) model which takes uncertain factors into account, applied the model into large customer direct-power-purchasing transactions and improved the reliability of power supply during direct-power-purchasing by introducing insurance theory. The author also considered the customers loss suffered from power interruption when building ATC model, established large customer decision model, took purchasing quantity of power from different power plants and reserved capacity insurance as variables, targeted minimum power interruption loss as optimization goal and best solution by means of particle swarm algorithm to produce optimal power purchasing decision of large consumers. Simulation was made through IEEE57 system finally and proved that such method is effective.
NASA Astrophysics Data System (ADS)
Schultz, David S.; Ghosh, Shondip; Grimmer, Christopher S.; Mack, Hunter
2011-10-01
The viability of a concentrator technology is determined by five interrelated factors: economic benefit, cell performance under concentration, thermal management, optical performance and manufacturability. Considering these factors, the 5- 10x concentration range is ideal for silicon-based receivers because this level of concentration captures the bulk of available economic gains while mitigating technical risk. Significant gains in capital efficiency are forsaken below the 5x concentration level. Above the 10x level of concentration, marginal improvements to economic benefit are achieved, but threats to reliability emerge and tend to erode the available economic benefit. Furthermore, optic solutions that provide for concentration above 10x tend to force a departure from low-profile flat-plate designs that are most adoptable. For silicon based receivers, a 5-10x level of concentration within a traditional module form factor is optimal.
Li, Xingxing; Zhang, Xiaohong; Ren, Xiaodong; Fritsche, Mathias; Wickert, Jens; Schuh, Harald
2015-02-09
The world of satellite navigation is undergoing dramatic changes with the rapid development of multi-constellation Global Navigation Satellite Systems (GNSSs). At the moment more than 70 satellites are already in view, and about 120 satellites will be available once all four systems (BeiDou + Galileo + GLONASS + GPS) are fully deployed in the next few years. This will bring great opportunities and challenges for both scientific and engineering applications. In this paper we develop a four-system positioning model to make full use of all available observations from different GNSSs. The significant improvement of satellite visibility, spatial geometry, dilution of precision, convergence, accuracy, continuity and reliability that a combining utilization of multi-GNSS brings to precise positioning are carefully analyzed and evaluated, especially in constrained environments.
Inventing an Energy Internet: Concepts, Architectures and Protocols for Smart Energy Utilization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsoukalas, Lefteri
2009-04-29
In recent years, the Internet is revolutionizing information availability much like the Power Grid revolutionized energy availability a century earlier. We will explore the differences and similarities of these two critical infrastructures and identify ways for convergence which may lead to an energy internet. Pricing signals, nodal forecasting, and short-term elasticities are key concepts in smart energy flows respecting the delicate equilibrium involved in generation-demand and aiming at higher efficiencies. We will discuss how intelligent forecasting approaches operating at multiple levels (including device or nodal levels) can ameliorate the challenges of power storage. In addition to higher efficiencies, an energymore » internet may achieve significant reliability and security improvements and offer greater flexibility and transparency in the overall energy-environmental relation.« less
FEMA and RAM Analysis for the Multi Canister Overpack (MCO) Handling Machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
SWENSON, C.E.
2000-06-01
The Failure Modes and Effects Analysis and the Reliability, Availability, and Maintainability Analysis performed for the Multi-Canister Overpack Handling Machine (MHM) has shown that the current design provides for a safe system, but the reliability of the system (primarily due to the complexity of the interlocks and permissive controls) is relatively low. No specific failure modes were identified where significant consequences to the public occurred, or where significant impact to nearby workers should be expected. The overall reliability calculation for the MHM shows a 98.1 percent probability of operating for eight hours without failure, and an availability of the MHMmore » of 90 percent. The majority of the reliability issues are found in the interlocks and controls. The availability of appropriate spare parts and maintenance personnel, coupled with well written operating procedures, will play a more important role in successful mission completion for the MHM than other less complicated systems.« less
Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks.
Suhonen, Jukka; Hämäläinen, Timo D; Hännikäinen, Marko
2009-01-01
A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation, however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling for reaching the availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.
USDA-ARS?s Scientific Manuscript database
Armillaria mellea is a serious pathogen of horticultural and agricultural systems in Europe and North America. The lack of a reliable in vitro fruiting system has hindered research, and necessitated dependence on intermittently available wild-collected basidiospores. Here we describe a reliable, rep...
75 FR 4375 - Transmission Loading Relief Reliability Standard and Curtailment Priorities
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-27
... Site: http://www.ferc.gov . Documents created electronically using word processing software should be... ensure operation within acceptable reliability criteria. NERC Glossary of Terms Used in Reliability Standards at 19, available at http://www.nerc.com/files/Glossary_12Feb08.pdf (NERC Glossary). An...
The reliability of the Glasgow Coma Scale: a systematic review.
Reith, Florence C M; Van den Brande, Ruben; Synnot, Anneliese; Gruen, Russell; Maas, Andrew I R
2016-01-01
The Glasgow Coma Scale (GCS) provides a structured method for assessment of the level of consciousness. Its derived sum score is applied in research and adopted in intensive care unit scoring systems. Controversy exists on the reliability of the GCS. The aim of this systematic review was to summarize evidence on the reliability of the GCS. A literature search was undertaken in MEDLINE, EMBASE and CINAHL. Observational studies that assessed the reliability of the GCS, expressed by a statistical measure, were included. Methodological quality was evaluated with the consensus-based standards for the selection of health measurement instruments checklist and its influence on results considered. Reliability estimates were synthesized narratively. We identified 52 relevant studies that showed significant heterogeneity in the type of reliability estimates used, patients studied, setting and characteristics of observers. Methodological quality was good (n = 7), fair (n = 18) or poor (n = 27). In good quality studies, kappa values were ≥0.6 in 85%, and all intraclass correlation coefficients indicated excellent reliability. Poor quality studies showed lower reliability estimates. Reliability for the GCS components was higher than for the sum score. Factors that may influence reliability include education and training, the level of consciousness and type of stimuli used. Only 13% of studies were of good quality and inconsistency in reported reliability estimates was found. Although the reliability was adequate in good quality studies, further improvement is desirable. From a methodological perspective, the quality of reliability studies needs to be improved. From a clinical perspective, a renewed focus on training/education and standardization of assessment is required.
Identifying key components for an effective case report poster: an observational study.
Willett, Lisa L; Paranjape, Anuradha; Estrada, Carlos
2009-03-01
Residents demonstrate scholarly activity by presenting posters at academic meetings. Although recommendations from national organizations are available, evidence identifying which components are most important is not. To develop and test an evaluation tool to measure the quality of case report posters and identify the specific components most in need of improvement. Faculty evaluators reviewed case report posters and provided on-site feedback to presenters at poster sessions of four annual academic general internal medicine meetings. A newly developed ten-item evaluation form measured poster quality for specific components of content, discussion, and format (5-point Likert scale, 1 = lowest, 5 = highest). Evaluation tool performance, including Cronbach alpha and inter-rater reliability, overall poster scores, differences across meetings and evaluators and specific components of the posters most in need of improvement. Forty-five evaluators from 20 medical institutions reviewed 347 posters. Cronbach's alpha of the evaluation form was 0.84 and inter-rater reliability, Spearman's rho 0.49 (p < 0.001). The median score was 4.1 (Q1 -Q3, 3.7-4.6)(Q1 = 25th, Q3 = 75th percentile). The national meeting median score was higher than the regional meetings (4.4 vs, 4.0, P < 0.001). We found no difference in faculty scores. The following areas were identified as most needing improvement: clearly state learning objectives, tie conclusions to learning objectives, and use appropriate amount of words. Our evaluation tool provides empirical data to guide trainees as they prepare posters for presentation which may improve poster quality and enhance their scholarly productivity.
Tutorial: Performance and reliability in redundant disk arrays
NASA Technical Reports Server (NTRS)
Gibson, Garth A.
1993-01-01
A disk array is a collection of physically small magnetic disks that is packaged as a single unit but operates in parallel. Disk arrays capitalize on the availability of small-diameter disks from a price-competitive market to provide the cost, volume, and capacity of current disk systems but many times their performance. Unfortunately, relative to current disk systems, the larger number of components in disk arrays leads to higher rates of failure. To tolerate failures, redundant disk arrays devote a fraction of their capacity to an encoding of their information. This redundant information enables the contents of a failed disk to be recovered from the contents of non-failed disks. The simplest and least expensive encoding for this redundancy, known as N+1 parity is highlighted. In addition to compensating for the higher failure rates of disk arrays, redundancy allows highly reliable secondary storage systems to be built much more cost-effectively than is now achieved in conventional duplicated disks. Disk arrays that combine redundancy with the parallelism of many small-diameter disks are often called Redundant Arrays of Inexpensive Disks (RAID). This combination promises improvements to both the performance and the reliability of secondary storage. For example, IBM's premier disk product, the IBM 3390, is compared to a redundant disk array constructed of 84 IBM 0661 3 1/2-inch disks. The redundant disk array has comparable or superior values for each of the metrics given and appears likely to cost less. In the first section of this tutorial, I explain how disk arrays exploit the emergence of high performance, small magnetic disks to provide cost-effective disk parallelism that combats the access and transfer gap problems. The flexibility of disk-array configurations benefits manufacturer and consumer alike. In contrast, I describe in this tutorial's second half how parallelism, achieved through increasing numbers of components, causes overall failure rates to rise. Redundant disk arrays overcome this threat to data reliability by ensuring that data remains available during and after component failures.
Strategy for Developing Expert-System-Based Internet Protocols (TCP/IP)
NASA Technical Reports Server (NTRS)
Ivancic, William D.
1997-01-01
The Satellite Networks and Architectures Branch of NASA's Lewis Research is addressing the issue of seamless interoperability of satellite networks with terrestrial networks. One of the major issues is improving reliable transmission protocols such as TCP over long latency and error-prone links. Many tuning parameters are available to enhance the performance of TCP including segment size, timers and window sizes. There are also numerous congestion avoidance algorithms such as slow start, selective retransmission and selective acknowledgment that are utilized to improve performance. This paper provides a strategy to characterize the performance of TCP relative to various parameter settings in a variety of network environments (i.e. LAN, WAN, wireless, satellite, and IP over ATM). This information can then be utilized to develop expert-system-based Internet protocols.
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, R.
1993-01-01
The key elements in the 1992-93 period of the project are the following: (1) extensive use of the simulator to implement and test - concurrency control algorithms, interactive user interface, and replica control algorithms; and (2) investigations into the applicability of data and process replication in real-time systems. In the 1993-94 period of the project, we intend to accomplish the following: (1) concentrate on efforts to investigate the effects of data and process replication on hard and soft real-time systems - especially we will concentrate on the impact of semantic-based consistency control schemes on a distributed real-time system in terms of improved reliability, improved availability, better resource utilization, and reduced missed task deadlines; and (2) use the prototype to verify the theoretically predicted performance of locking protocols, etc.
Value Streams in Microgrids: A literature Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stadler, Michael; Center for Energy and Innovative Technologies; Cardoso, Gonçalo
2015-10-01
Microgrids are an increasingly common component of the evolving electricity grids with the potential to improve local reliability, reduce costs, and increase penetration rates for distributed renewable generation. The additional complexity of microgrids often leads to increased investment costs, creating a barrier for widespread adoption. These costs may result directly from specific needs for islanding detection, protection systems and power quality assurance that would otherwise be avoided in simpler system configurations. However, microgrids also facilitate additional value streams that may make up for their increased costs and improve the economic viability of microgrid deployment. This paper analyses the literature currentlymore » available on research relevant to value streams occurring in microgrids that may contribute to offset the increased investment costs. A review on research related to specific microgrid requirements is also presented.« less
Interdisciplinary Team Huddles for Fetal Heart Rate Tracing Review.
Thompson, Lisa; Krening, Cynthia; Parrett, Dolores
2018-06-01
To address an increase in unexpected poor outcomes in term neonates, our team developed a goal of high reliability and improved fetal safety in the culture of the Labor and Delivery nursing department. We implemented interdisciplinary reviews of fetal heart rate, along with a Category II fetal heart rate management algorithm and a fetal heart rate assessment rapid response alert to call for unscheduled reviews when needed. Enhanced communication between nurses and other clinicians supported an interdisciplinary approach to fetal safety, and we observed an improvement in health outcomes for term neonates. We share our experience with the intention of making our methods available to any labor and delivery unit team committed to safe, high-quality care and service excellence. Copyright © 2018 AWHONN. Published by Elsevier Inc. All rights reserved.
Breast cancer and protein biomarkers
Gam, Lay-Harn
2012-01-01
Breast cancer is a healthcare concern of women worldwide. Despite procedures being available for diagnosis, prognosis and treatment of breast cancer, researchers are working intensively on the disease in order to improve the life quality of breast cancer patients. At present, there is no single treatment known to bring a definite cure for breast cancer. One of the possible solutions for combating breast cancer is through identification of reliable protein biomarkers that can be effectively used for early detection, prognosis and treatments of the cancer. Therefore, the task of identification of biomarkers for breast cancer has become the focus of many researchers worldwide. PMID:24520539
Intelligent Chemistry Management System (ICMS)--A new approach to steam generator chemistry control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barto, R.J.; Farrell, D.M.; Noto, F.A.
1986-04-01
The Intelligent Chemistry Management System (ICMS) is a new tool which assists in steam generator chemistry control. Utilizing diagnostic capabilities, the ICMS will provide utility and industrial boiler operators, system chemists, and plant engineers with a tool for monitoring, diagnosing, and controlling steam generator system chemistry. By reducing the number of forced outages through early identification of potentially detrimental conditions, suggestion of possible causes, and execution of corrective actions, improvements in unit availability and reliability will result. The system monitors water and steam quality at a number of critical locations in the plant.
Towards Certification of a Space System Application of Fault Detection and Isolation
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Markosian, Lawrence Z.
2008-01-01
Advanced fault detection, isolation and recovery (FDIR) software is being investigated at NASA as a means to the improve reliability and availability of its space systems. Certification is a critical step in the acceptance of such software. Its attainment hinges on performing the necessary verification and validation to show that the software will fulfill its requirements in the intended setting. Presented herein is our ongoing work to plan for the certification of a pilot application of advanced FDIR software in a NASA setting. We describe the application, and the key challenges and opportunities it offers for certification.
Lagarde, Nathalie; Zagury, Jean-François; Montes, Matthieu
2015-07-27
Virtual screening methods are commonly used nowadays in drug discovery processes. However, to ensure their reliability, they have to be carefully evaluated. The evaluation of these methods is often realized in a retrospective way, notably by studying the enrichment of benchmarking data sets. To this purpose, numerous benchmarking data sets were developed over the years, and the resulting improvements led to the availability of high quality benchmarking data sets. However, some points still have to be considered in the selection of the active compounds, decoys, and protein structures to obtain optimal benchmarking data sets.
Tarrant, Carolyn; O'Donnell, Barbara; Martin, Graham; Bion, Julian; Hunter, Alison; Rooney, Kevin D
2016-11-16
Implementation of the 'Sepsis Six' clinical care bundle within an hour of recognition of sepsis is recommended as an approach to reduce mortality in patients with sepsis, but achieving reliable delivery of the bundle has proved challenging. There remains little understanding of the barriers to reliable implementation of bundle components. We examined frontline clinical practice in implementing the Sepsis Six. We conducted an ethnographic study in six hospitals participating in the Scottish Patient Safety Programme Sepsis collaborative. We conducted around 300 h of non-participant observation in emergency departments, acute medical receiving units and medical and surgical wards. We interviewed a purposive sample of 43 members of hospital staff. Data were analysed using a constant comparative approach. Implementation strategies to promote reliable use of the Sepsis Six primarily focused on education, engaging and motivating staff, and providing prompts for behaviour, along with efforts to ensure that equipment required was readily available. Although these strategies were successful in raising staff awareness of sepsis and engagement with implementation, our study identified that completing the bundle within an hour was not straightforward. Our emergent theory suggested that rather than being an apparently simple sequence of six steps, the Sepsis Six actually involved a complex trajectory comprising multiple interdependent tasks that required prioritisation and scheduling, and which was prone to problems of coordination and operational failures. Interventions that involved allocating specific roles and responsibilities for completing the Sepsis Six in ways that reduced the need for coordination and task switching, and the use of process mapping to identify system failures along the trajectory, could help mitigate against some of these problems. Implementation efforts that focus on individual behaviour change to improve uptake of the Sepsis Six should be supplemented by an understanding of the bundle as a complex trajectory of work in which improving reliability requires attention to coordination of workflow, as well as addressing the mundane problems of interruptions and operational failures that obstruct task completion.
Weech-Maldonado, Robert; Dreachslin, Janice L.; Brown, Julie; Pradhan, Rohit; Rubin, Kelly L.; Schiller, Cameron; Hays, Ron D.
2016-01-01
Background The U.S. national standards for culturally and linguistically appropriate services (CLAS) in health care provide guidelines on policies and practices aimed at developing culturally competent systems of care. The Cultural Competency Assessment Tool for Hospitals (CCATH) was developed as an organizational tool to assess adherence to the CLAS standards. Purposes First, we describe the development of the CCATH and estimate the reliability and validity of the CCATH measures. Second, we discuss the managerial implications of the CCATH as an organizational tool to assess cultural competency. Methodology/Approach We pilot tested an initial draft of the CCATH, revised it based on a focus group and cognitive interviews, and then administered it in a field test with a sample of California hospitals. The reliability and validity of the CCATH were evaluated using factor analysis, analysis of variance, and Cronbach’s alphas. Findings Exploratory and confirmatory factor analyses identified 12 CCATH composites: leadership and strategic planning, data collection on inpatient population, data collection on service area, performance management systems and quality improvement, human resources practices, diversity training, community representation, availability of interpreter services, interpreter services policies, quality of interpreter services, translation of written materials, and clinical cultural competency practices. All the CCATH scales had internal consistency reliability of .65 or above, and the reliability was .70 or above for 9 of the 12 scales. Analysis of variance results showed that not-for-profit hospitals have higher CCATH scores than for-profit hospitals in five CCATH scales and higher CCATH scores than government hospitals in two CCATH scales. Practice Implications The CCATH showed adequate psychometric properties. Managers and policy makers can use the CCATH as a tool to evaluate hospital performance in cultural competency and identify and target improvements in hospital policies and practices that undergird the provision of CLAS. PMID:21934511
Some Reliability Issues in Very Large Databases.
ERIC Educational Resources Information Center
Lynch, Clifford A.
1988-01-01
Describes the unique reliability problems of very large databases that necessitate specialized techniques for hardware problem management. The discussion covers the use of controlled partial redundancy to improve reliability, issues in operating systems and database management systems design, and the impact of disk technology on very large…
Keith B. Aubry; Catherine M. Raley; Kevin S. McKelvey
2017-01-01
The availability of spatially referenced environmental data and species occurrence records in online databases enable practitioners to easily generate species distribution models (SDMs) for a broad array of taxa. Such databases often include occurrence records of unknown reliability, yet little information is available on the influence of data quality on SDMs generated...
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARLSON, A.B.
The document presents updated results of the preliminary reliability, availability, maintainability analysis performed for delivery of waste feed from tanks 241-AZ-101 and 241-AN-105 to British Nuclear Fuels Limited, inc. under the Tank Waste Remediation System Privatization Contract. The operational schedule delay risk is estimated and contributing factors are discussed.
Reliability and availability evaluation of Wireless Sensor Networks for industrial applications.
Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco
2012-01-01
Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damages. Among the different type of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach enabling the evaluation of permanent faults, prevents system designers to optimize decisions that minimize these occurrences. In this work we propose a methodology based on an automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks, when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize its reliability and availability requirements.
Reliability and Availability Evaluation of Wireless Sensor Networks for Industrial Applications
Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco
2012-01-01
Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damages. Among the different type of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach enabling the evaluation of permanent faults, prevents system designers to optimize decisions that minimize these occurrences. In this work we propose a methodology based on an automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks, when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize its reliability and availability requirements. PMID:22368497
Addressing Uniqueness and Unison of Reliability and Safety for a Better Integration
NASA Technical Reports Server (NTRS)
Huang, Zhaofeng; Safie, Fayssal
2016-01-01
Over time, it has been observed that Safety and Reliability have not been clearly differentiated, which leads to confusion, inefficiency, and, sometimes, counter-productive practices in executing each of these two disciplines. It is imperative to address this situation to help Reliability and Safety disciplines improve their effectiveness and efficiency. The paper poses an important question to address, "Safety and Reliability - Are they unique or unisonous?" To answer the question, the paper reviewed several most commonly used analyses from each of the disciplines, namely, FMEA, reliability allocation and prediction, reliability design involvement, system safety hazard analysis, Fault Tree Analysis, and Probabilistic Risk Assessment. The paper pointed out uniqueness and unison of Safety and Reliability in their respective roles, requirements, approaches, and tools, and presented some suggestions for enhancing and improving the individual disciplines, as well as promoting the integration of the two. The paper concludes that Safety and Reliability are unique, but compensating each other in many aspects, and need to be integrated. Particularly, the individual roles of Safety and Reliability need to be differentiated, that is, Safety is to ensure and assure the product meets safety requirements, goals, or desires, and Reliability is to ensure and assure maximum achievability of intended design functions. With the integration of Safety and Reliability, personnel can be shared, tools and analyses have to be integrated, and skill sets can be possessed by the same person with the purpose of providing the best value to a product development.
High-Reliability Health Care: Getting There from Here
Chassin, Mark R; Loeb, Jerod M
2013-01-01
Context Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. Further research and practical experience will be necessary to determine the validity and effectiveness of this framework for high-reliability health care. PMID:24028696
High-reliability health care: getting there from here.
Chassin, Mark R; Loeb, Jerod M
2013-09-01
Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer "project fatigue" because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals' readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. Further research and practical experience will be necessary to determine the validity and effectiveness of this framework for high-reliability health care. © 2013 The Authors. The Milbank Quarterly published by Wiley Periodicals Inc. on behalf of Milbank Memorial Fund.
Beck, Alison; Burdett, Mark; Lewis, Helen
2015-06-01
To investigate the impact of waiting for psychological therapy on client well-being as measured by the Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM) global distress (GD) score. Global distress scores were retrieved for all clients referred for psychological therapy in a secondary care mental health service between November 2006 and May 2013 and who had completed a CORE-OM at assessment and first session. GD scores for a subgroup of 103 clients who had completed a CORE-OM during the last therapy session were also reviewed. The study sample experienced a median wait of 41.14 weeks between assessment and first session. The relationship between wait time from referral acceptance to assessment, and assessment GD score was not significant. During the period between assessment and first session no significant difference in GD score was observed. Nevertheless 29.1% of the sample experienced reliable change; 16.0% of clients reliably improved and 13.1% reliably deteriorated whilst waiting for therapy. Demographic factors were not found to have a significant effect on the change in GD score between assessment and first session. Waiting time was associated with post-therapy outcomes but not to a degree which was meaningful. The majority of individuals (54.4%), regardless of whether they improved or deteriorated whilst waiting for therapy, showed reliable improvement at end of therapy as measured by the CORE-OM. The majority of GD scores remained stable while waiting for therapy; however, 29.1% of secondary care clients experienced either reliable improvement or deterioration. Irrespective of whether they improved, deteriorated or remained unchanged whilst waiting for therapy, most individuals who had a complete end of therapy assessment showed reliable improvements following therapy. There was no significant difference in GD score between assessment and first session recordings. A proportion of clients (29.1%) showed reliable change, either improvement or deterioration, as measured by the GD score while waiting for therapy. Of the individuals with last session CORE-OMs (54.4%) showed significant improvement following therapy regardless of whether or not they experienced change while waiting for therapy. Limitations include: Problems of data quality, the data were from a routine data set and data were lost at each stage of the analysis. A focus on the CORE-OM limits exploration of the subjective experience of waiting for psychotherapy and the impact this has on psychological well-being. © 2014 The British Psychological Society.
Begeman, Ian J; Lykins, Joseph; Zhou, Ying; Lai, Bo Shiun; Levigne, Pauline; El Bissati, Kamal; Boyer, Kenneth; Withers, Shawn; Clouser, Fatima; Noble, A Gwendolyn; Rabiah, Peter; Swisher, Charles N; Heydemann, Peter T; Contopoulos-Ioannidis, Despina G; Montoya, Jose G; Maldonado, Yvonne; Ramirez, Raymund; Press, Cindy; Stillwaggon, Eileen; Peyron, François; McLeod, Rima
2017-06-01
Congenital toxoplasmosis is a serious but preventable and treatable disease. Gestational screening facilitates early detection and treatment of primary acquisition. Thus, fetal infection can be promptly diagnosed and treated and outcomes can be improved. We tested 180 sera with the Toxoplasma ICT IgG-IgM point-of-care (POC) test. Sera were from 116 chronically infected persons (48 serotype II; 14 serotype I-III; 25 serotype I-IIIa; 28 serotype Atypical, haplogroup 12; 1 not typed). These represent strains of parasites infecting mothers of congenitally infected children in the U.S. 51 seronegative samples and 13 samples from recently infected persons known to be IgG/IgM positive within the prior 2.7 months also were tested. Interpretation was confirmed by two blinded observers. A comparison of costs for POC vs. commercial laboratory testing methods was performed. We found that this new Toxoplasma ICT IgG-IgM POC test was highly sensitive (100%) and specific (100%) for distinguishing IgG/IgM-positive from negative sera. Use of such reliable POC tests can be cost-saving and benefit patients. Our work demonstrates that the Toxoplasma ICT IgG-IgM test can function reliably as a point-of-care test to diagnose Toxoplasma gondii infection in the U.S. This provides an opportunity to improve maternal-fetal care by using approaches, diagnostic tools, and medicines already available. This infection has serious, lifelong consequences for infected persons and their families. From the present study, it appears a simple, low-cost POC test is now available to help prevent morbidity/disability, decrease cost, and make gestational screening feasible. It also offers new options for improved prenatal care in low- and middle-income countries.
Zhou, Ying; Lai, Bo Shiun; Levigne, Pauline; El Bissati, Kamal; Boyer, Kenneth; Withers, Shawn; Clouser, Fatima; Noble, A. Gwendolyn; Rabiah, Peter; Swisher, Charles N.; Heydemann, Peter T.; Contopoulos-Ioannidis, Despina G.; Montoya, Jose G.; Maldonado, Yvonne; Ramirez, Raymund; Press, Cindy; Stillwaggon, Eileen; Peyron, François
2017-01-01
Background Congenital toxoplasmosis is a serious but preventable and treatable disease. Gestational screening facilitates early detection and treatment of primary acquisition. Thus, fetal infection can be promptly diagnosed and treated and outcomes can be improved. Methods We tested 180 sera with the Toxoplasma ICT IgG-IgM point-of-care (POC) test. Sera were from 116 chronically infected persons (48 serotype II; 14 serotype I-III; 25 serotype I-IIIa; 28 serotype Atypical, haplogroup 12; 1 not typed). These represent strains of parasites infecting mothers of congenitally infected children in the U.S. 51 seronegative samples and 13 samples from recently infected persons known to be IgG/IgM positive within the prior 2.7 months also were tested. Interpretation was confirmed by two blinded observers. A comparison of costs for POC vs. commercial laboratory testing methods was performed. Results We found that this new Toxoplasma ICT IgG-IgM POC test was highly sensitive (100%) and specific (100%) for distinguishing IgG/IgM-positive from negative sera. Use of such reliable POC tests can be cost-saving and benefit patients. Conclusions Our work demonstrates that the Toxoplasma ICT IgG-IgM test can function reliably as a point-of-care test to diagnose Toxoplasma gondii infection in the U.S. This provides an opportunity to improve maternal-fetal care by using approaches, diagnostic tools, and medicines already available. This infection has serious, lifelong consequences for infected persons and their families. From the present study, it appears a simple, low-cost POC test is now available to help prevent morbidity/disability, decrease cost, and make gestational screening feasible. It also offers new options for improved prenatal care in low- and middle-income countries. PMID:28650970
Toverud, Else-Lydia; Hartmann, Katrin; Håkonsen, Helle
2015-08-01
Generic substitution has been introduced in most countries in order to reduce costs and improve access to drugs. However, regulations and the generic drugs available vary between countries. It is the prescriber or dispenser of the drug who is the final decision maker. Nevertheless, physicians' and pharmacists' perceptions of generic drug use are not well documented to date. This study presents a systematic review of physicians' and pharmacists' perspectives on generic drug use worldwide. A systematic literature search was performed to retrieve all articles published between 2002 and 2012 regarding physicians' and/or pharmacists' experiences with generic drugs and generic substitution. Of 1322 publications initially identified, 24 were eligible for inclusion. Overall, the studies revealed that physicians and pharmacists were aware of the cost-saving function of generic drugs and their role in improving global access to drugs. Nevertheless, marked differences were observed between countries when studying physicians' and pharmacists' perceptions of the available generic drugs. In less mature healthcare systems, large variations regarding, for example, control routines, bioequivalence requirements, and manufacturer standards were reported. A lack of reliable information and mistrust in the efficacy and quality were also mentioned by these participants. In the most developed healthcare systems, the participants trusted the quality of the generic drugs and did not hesitate to offer them to all patients regardless of socioeconomic status. In general, pharmacists seemed to have better knowledge of the concept of bioequivalence and generic drug aspects than physicians. The present study indicates that physicians and pharmacists are aware of the role of generic drugs in the improvement of global access to drugs. However, there are marked differences regarding how these health professionals view the quality of generic drugs depending on the maturity of their country's healthcare system. This can be attributed to the fact that developed healthcare systems have more reliable public control routines for drugs in general as well as better bioequivalence requirements concerning generics in particular.
External quality assessment programs in the context of ISO 15189 accreditation.
Sciacovelli, Laura; Secchiero, Sandra; Padoan, Andrea; Plebani, Mario
2018-05-23
Effective management of clinical laboratories participating in external quality assessment schemes (EQAS) is of fundamental importance in ensuring reliable analytical results. The International Standard ISO 15189:2012 requires participation in interlaboratory comparison [e.g. external quality assessment (EQA)] for all tests provided by an individual laboratory. If EQAS is not commercially available, alternative approaches should be identified, although clinical laboratories may find it challenging to choose the EQAS that comply with the international standards and approved guidelines. Great competence is therefore required, as well as knowledge of the characteristics and key elements affecting the reliability of an EQAS, and the analytical quality specifications stated in approved documents. Another skill of fundamental importance is the ability to identify an alternative approach when the available EQAS are inadequate or missing. Yet the choice of the right EQA program alone does not guarantee its effectiveness. In fact, the fundamental steps of analysis of the information provided in EQA reports and the ability to identify improvement actions to be undertaken call for the involvement of all laboratory staff playing a role in the specific activity. The aim of this paper was to describe the critical aspects that EQA providers and laboratory professionals should control in order to guarantee effective EQAS management and compliance with ISO 15189 accreditation requirements.
Digital PCR as a tool to measure HIV persistence.
Rutsaert, Sofie; Bosman, Kobus; Trypsteen, Wim; Nijhuis, Monique; Vandekerckhove, Linos
2018-01-30
Although antiretroviral therapy is able to suppress HIV replication in infected patients, the virus persists and rebounds when treatment is stopped. In order to find a cure that can eradicate the latent reservoir, one must be able to quantify the persisting virus. Traditionally, HIV persistence studies have used real-time PCR (qPCR) to measure the viral reservoir represented by HIV DNA and RNA. Most recently, digital PCR is gaining popularity as a novel approach to nucleic acid quantification as it allows for absolute target quantification. Various commercial digital PCR platforms are nowadays available that implement the principle of digital PCR, of which Bio-Rad's QX200 ddPCR is currently the most used platform in HIV research. Quantification of HIV by digital PCR is proving to be a valuable improvement over qPCR as it is argued to have a higher robustness to mismatches between the primers-probe set and heterogeneous HIV, and forfeits the need for a standard curve, both of which are known to complicate reliable quantification. However, currently available digital PCR platforms occasionally struggle with unexplained false-positive partitions, and reliable segregation between positive and negative droplets remains disputed. Future developments and advancements of the digital PCR technology are promising to aid in the accurate quantification and characterization of the persistent HIV reservoir.
Yang, Yingbao; Li, Xiaolong; Pan, Xin; Zhang, Yong; Cao, Chen
2017-01-01
Many downscaling algorithms have been proposed to address the issue of coarse-resolution land surface temperature (LST) derived from available satellite-borne sensors. However, few studies have focused on improving LST downscaling in urban areas with several mixed surface types. In this study, LST was downscaled by a multiple linear regression model between LST and multiple scale factors in mixed areas with three or four surface types. The correlation coefficients (CCs) between LST and the scale factors were used to assess the importance of the scale factors within a moving window. CC thresholds determined which factors participated in the fitting of the regression equation. The proposed downscaling approach, which involves an adaptive selection of the scale factors, was evaluated using the LST derived from four Landsat 8 thermal imageries of Nanjing City in different seasons. Results of the visual and quantitative analyses show that the proposed approach achieves relatively satisfactory downscaling results on 11 August, with coefficient of determination and root-mean-square error of 0.87 and 1.13 °C, respectively. Relative to other approaches, our approach shows the similar accuracy and the availability in all seasons. The best (worst) availability occurred in the region of vegetation (water). Thus, the approach is an efficient and reliable LST downscaling method. Future tasks include reliable LST downscaling in challenging regions and the application of our model in middle and low spatial resolutions. PMID:28368301
Memorial Hermann: high reliability from board to bedside.
Shabot, M Michael; Monroe, Douglas; Inurria, Juan; Garbade, Debbi; France, Anne-Claire
2013-06-01
In 2006 the Memorial Hermann Health System (MHHS), which includes 12 hospitals, began applying principles embraced by high reliability organizations (HROs). Three factors support its HRO journey: (1) aligned organizational structure with transparent management systems and compressed reporting processes; (2) Robust Process Improvement (RPI) with high-reliability interventions; and (3) cultural establishment, sustainment, and evolution. The Quality and Safety strategic plan contains three domains, each with a specific set of measures that provide goals for performance: (1) "Clinical Excellence;" (2) "Do No Harm;" and (3) "Saving Lives," as measured by the Serious Safety Event rate. MHHS uses a uniform approach to performance improvement--RPI, which includes Six Sigma, Lean, and change management, to solve difficult safety and quality problems. The 9 acute care hospitals provide multiple opportunities to integrate high-reliability interventions and best practices across MHHS. For example, MHHS partnered with the Joint Commission Center for Transforming Healthcare in its inaugural project to establish reliable hand hygiene behaviors, which improved MHHS's average hand hygiene compliance rate from 44% to 92% currently. Soon after compliance exceeded 85% at all 12 hospitals, the average rate of central line-associated bloodstream and ventilator-associated pneumonias decreased to essentially zero. MHHS's size and diversity require a disciplined approach to performance improvement and systemwide achievement of measurable success. The most significant cultural change at MHHS has been the expectation for 100% compliance with evidence-based quality measures and 0% incidence of patient harm.
Fan, Ching-Lin; Tseng, Fan-Ping; Tseng, Chiao-Yuan
2018-01-01
In this work, amorphous indium-gallium-zinc oxide thin-film transistors (a-IGZO TFTs) with a HfO2 gate insulator and CF4 plasma treatment was demonstrated for the first time. Through the plasma treatment, both the electrical performance and reliability of the a-IGZO TFT with HfO2 gate dielectric were improved. The carrier mobility significantly increased by 80.8%, from 30.2 cm2/V∙s (without treatment) to 54.6 cm2/V∙s (with CF4 plasma treatment), which is due to the incorporated fluorine not only providing an extra electron to the IGZO, but also passivating the interface trap density. In addition, the reliability of the a-IGZO TFT with HfO2 gate dielectric has also been improved by the CF4 plasma treatment. By applying the CF4 plasma treatment to the a-IGZO TFT, the hysteresis effect of the device has been improved and the device’s immunity against moisture from the ambient atmosphere has been enhanced. It is believed that the CF4 plasma treatment not only significantly improves the electrical performance of a-IGZO TFT with HfO2 gate dielectric, but also enhances the device’s reliability. PMID:29772767
Reliability of the Test of Integrated Language and Literacy Skills (TILLS)
ERIC Educational Resources Information Center
Mailend, Marja-Liisa; Plante, Elena; Anderson, Michele A.; Applegate, E. Brooks; Nelson, Nickola W.
2016-01-01
Background: As new standardized tests become commercially available, it is critical that clinicians have access to the information about a test's psychometric properties, including aspects of reliability. Aims: The purpose of the three studies reported in this article was to investigate the reliability of a new test, the Test of Integrated…
Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients
ERIC Educational Resources Information Center
Andersson, Björn; Xin, Tao
2018-01-01
In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…
ERIC Educational Resources Information Center
Arce-Ferrer, Alvaro J.; Castillo, Irene Borges
2007-01-01
The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…
Renewal of the Control System and Reliable Long Term Operation of the LHD Cryogenic System
NASA Astrophysics Data System (ADS)
Mito, T.; Iwamoto, A.; Oba, K.; Takami, S.; Moriuchi, S.; Imagawa, S.; Takahata, K.; Yamada, S.; Yanagi, N.; Hamaguchi, S.; Kishida, F.; Nakashima, T.
The Large Helical Device (LHD) is a heliotron-type fusion plasma experimental machine which consists of a fully superconducting magnet system cooled by a helium refrigerator having a total equivalent cooling capacity of 9.2 kW@4.4 K. Seventeenplasma experimental campaigns have been performed successfully since1997 with high reliability of 99%. However, sixteen years have passed from the beginning of the system operation. Improvements are being implementedto prevent serious failures and to pursue further reliability.The LHD cryogenic control system was designed and developed as an open system utilizing latest control equipment of VME controllers and UNIX workstations at the construction time. Howeverthe generation change of control equipment has been advanced. Down-sizing of control deviceshas beenplanned from VME controllers to compact PCI controllers in order to simplify the system configuration and to improve the system reliability. The new system is composed of compact PCI controller and remote I/O connected with EtherNet/IP. Making the system redundant becomes possible by doubling CPU, LAN, and remote I/O respectively. The smooth renewal of the LHD cryogenic controlsystem and the further improvement of the cryogenic system reliability are reported.
Improving the Reliability of Technological Subsystems Equipment for Steam Turbine Unit in Operation
NASA Astrophysics Data System (ADS)
Brodov, Yu. M.; Murmansky, B. E.; Aronson, R. T.
2017-11-01
The authors’ conception is presented of an integrated approach to reliability improving of the steam turbine unit (STU) state along with its implementation examples for the various STU technological subsystems. Basing on the statistical analysis of damage to turbine individual parts and components, on the development and application of modern methods and technologies of repair and on operational monitoring techniques, the critical components and elements of equipment are identified and priorities are proposed for improving the reliability of STU equipment in operation. The research results are presented of the analysis of malfunctions for various STU technological subsystems equipment operating as part of power units and at cross-linked thermal power plants and resulting in turbine unit shutdown (failure). Proposals are formulated and justified for adjustment of maintenance and repair for turbine components and parts, for condenser unit equipment, for regeneration subsystem and oil supply system that permit to increase the operational reliability, to reduce the cost of STU maintenance and repair and to optimize the timing and amount of repairs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agalgaonkar, Yashodhan P.; Hammerstrom, Donald J.
The Pacific Northwest Smart Grid Demonstration (PNWSGD) was a smart grid technology performance evaluation project that included multiple U.S. states and cooperation from multiple electric utilities in the northwest region. One of the local objectives for the project was to achieve improved distribution system reliability. Toward this end, some PNWSGD utilities automated their distribution systems, including the application of fault detection, isolation, and restoration and advanced metering infrastructure. In light of this investment, a major challenge was to establish a correlation between implementation of these smart grid technologies and actual improvements of distribution system reliability. This paper proposes using Welch’smore » t-test to objectively determine and quantify whether distribution system reliability is improving over time. The proposed methodology is generic, and it can be implemented by any utility after calculation of the standard reliability indices. The effectiveness of the proposed hypothesis testing approach is demonstrated through comprehensive practical results. It is believed that wider adoption of the proposed approach can help utilities to evaluate a realistic long-term performance of smart grid technologies.« less
High Temperature Irradiation-Resistant Thermocouple Performance Improvements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joshua Daw; Joy Rempe; Darrell Knudson
2009-04-01
Traditional methods for measuring temperature in-pile degrade at temperatures above 1100 ºC. To address this instrumentation need, the Idaho National Laboratory (INL) developed and evaluated the performance of a high temperature irradiation-resistant thermocouple (HTIR-TC) that contains alloys of molybdenum and niobium. Data from high temperature (up to 1500 ºC) long duration (up to 4000 hours) tests and on-going irradiations at INL’s Advanced Test Reactor demonstrate the superiority of these sensors to commercially-available thermocouples. However, several options have been identified that could further enhance their reliability, reduce their production costs, and allow their use in a wider range of operating conditions.more » This paper presents results from on-going Idaho National Laboratory (INL)/University of Idaho (UI) efforts to investigate options to improve HTIR-TC ductility, reliability, and resolution by investigating specially-formulated alloys of molybdenum and niobium and alternate diameter thermoelements (wires). In addition, on-going efforts to evaluate alternate fabrication approaches, such as drawn and loose assembly techniques will be discussed. Efforts to reduce HTIR-TC fabrication costs, such as the use of less expensive extension cable will also be presented. Finally, customized HTIR-TC designs developed for specific customer needs will be summarized to emphasize the varied conditions under which these sensors may be used.« less
National Space Transportation System (NSTS) technology needs
NASA Technical Reports Server (NTRS)
Winterhalter, David L.; Ulrich, Kimberly K.
1990-01-01
The National Space Transportation System (NSTS) is one of the Nation's most valuable resources, providing manned transportation to and from space in support of payloads and scientific research. The NSTS program is currently faced with the problem of hardware obsolescence, which could result in unacceptable schedule and cost impacts to the flight program. Obsolescence problems occur because certain components are no longer being manufactured or repair turnaround time is excessive. In order to achieve a long-term, reliable transportation system that can support manned access to space through 2010 and beyond, NASA must develop a strategic plan for a phased implementation of enhancements which will satisfy this long-term goal. The NSTS program has initiated the Assured Shuttle Availability (ASA) project with the following objectives: eliminate hardware obsolescence in critical areas, increase reliability and safety of the vehicle, decrease operational costs and turnaround time, and improve operational capability. The strategy for ASA will be to first meet the mandatory needs - keep the Shuttle flying. Non-mandatory changes that will improve operational capability and enhance performance will then be considered if funding is adequate. Upgrade packages should be developed to install within designated inspection periods, grouped in a systematic approach to reduce cost and schedule impacts, and allow the capability to provide a Block 2 Shuttle (Phase 3).
NASA Astrophysics Data System (ADS)
Ottoni, F.; Freddi, F.; Zerbi, A.
2017-05-01
It's well known that more and more accurate methodologies and automatic tools are now available in the field of geometric survey and image processing and they constitute a fundamental instrument for cultural heritage knowledge and preservation; on the other side, very smart and precise numerical models are continuously improved and used in order to simulate the mechanical behaviour of masonry structures: both instruments and technologies are important part of a global process of knowledge which is at the base of any conservation project of cultural heritage. Despite the high accuracy and automation level reached by both technologies and programs, the transfer of data between them is not an easy task and defining the most reliable way to translate and exchange information without data loosing is still an open issue. The goal of the present paper is to analyse the complex process of translation from the very precise (and sometimes redundant) information obtainable by the modern survey methodologies for historic buildings (as laser scanner), into the very simplified (may be too much) schemes used to understand their real structural behaviour, with the final aim to contribute to the discussion on reliable methods for cultural heritage knowledge improvement, through empiricism.
A Comparison of Vibration and Oil Debris Gear Damage Detection Methods Applied to Pitting Damage
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.
2000-01-01
Helicopter Health Usage Monitoring Systems (HUMS) must provide reliable, real-time performance monitoring of helicopter operating parameters to prevent damage of flight critical components. Helicopter transmission diagnostics are an important part of a helicopter HUMS. In order to improve the reliability of transmission diagnostics, many researchers propose combining two technologies, vibration and oil monitoring, using data fusion and intelligent systems. Some benefits of combining multiple sensors to make decisions include improved detection capabilities and increased probability the event is detected. However, if the sensors are inaccurate, or the features extracted from the sensors are poor predictors of transmission health, integration of these sensors will decrease the accuracy of damage prediction. For this reason, one must verify the individual integrity of vibration and oil analysis methods prior to integrating the two technologies. This research focuses on comparing the capability of two vibration algorithms, FM4 and NA4, and a commercially available on-line oil debris monitor to detect pitting damage on spur gears in the NASA Glenn Research Center Spur Gear Fatigue Test Rig. Results from this research indicate that the rate of change of debris mass measured by the oil debris monitor is comparable to the vibration algorithms in detecting gear pitting damage.
The Drought Task Force and Research on Understanding, Predicting, and Monitoring Drought
NASA Astrophysics Data System (ADS)
Barrie, D.; Mariotti, A.; Archambault, H. M.; Hoerling, M. P.; Wood, E. F.; Koster, R. D.; Svoboda, M.
2016-12-01
Drought has caused serious social and economic impacts throughout the history of the United States. All Americans are susceptible to the direct and indirect threats drought poses to the Nation. Drought challenges agricultural productivity and reduces the quantity and quality of drinking water supplies upon which communities and industries depend. Drought jeopardizes the integrity of critical infrastructure, causes extensive economic and health impacts, harms ecosystems, and increases energy costs. Ensuring the availability of clean, sufficient, and reliable water resources is a top national and NOAA priority. The Climate Program Office's Modeling, Analysis, Predictions, and Projections (MAPP) program, in partnership with the NOAA-led National Integrated Drought Information System (NIDIS), is focused on improving our understanding of drought causes, evolution, amelioration, and impacts as well as improving our capability to monitor and predict drought. These capabilities and knowledge are critical to providing communities with actionable, reliable information to increase drought preparedness and resilience. This poster will present information on the MAPP-organized Drought Task Force, a consortium of investigators funded by the MAPP program in partnership with NIDIS to advance drought understanding, monitoring, and prediction. Information on Task Force activities, products, and MAPP drought initiatives will be described in the poster, including the Task Force's ongoing focus on the California drought, its predictability, and its causes.
A road map for integrating eco-evolutionary processes into biodiversity models.
Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique
2013-05-01
The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.
Nagel, Corey; Beach, Jack; Iribagiza, Chantal; Thomas, Evan A
2015-12-15
In rural sub-Saharan Africa, where handpumps are common, 10-67% are nonfunctional at any one time, and many never get repaired. Increased reliability requires improved monitoring and responsiveness of maintenance providers. In 2014, 181 cellular enabled water pump use sensors were installed in three provinces of Rwanda. In three arms, the nominal maintenance model was compared against a "best practice" circuit rider model, and an "ambulance" service model. In only the ambulance model was the sensor data available to the implementer, and used to dispatch technicians. The study ran for seven months in 2014-2015. In the study period, the nominal maintenance group had a median time to successful repair of approximately 152 days, with a mean per-pump functionality of about 68%. In the circuit rider group, the median time to successful repair was nearly 57 days, with a per-pump functionality mean of nearly 73%. In the ambulance service group, the successful repair interval was nearly 21 days with a functionality mean of nearly 91%. An indicative cost analysis suggests that the cost per functional pump per year is approximately similar between the three models. However, the benefits of reliable water service may justify greater focus on servicing models over installation models.
NASA Astrophysics Data System (ADS)
Gómez-Paccard, M.; Chauvin, A.; Lanos, P.; Dufresne, P.; Kovacheva, M.; Hill, M. J.; Beamud, E.; Gutiérrez-Lloret, S.; Cañavate, V.; Blain, S.; Bouvier, A.; Oberlin, C.; Guibert, P.; Sapin, C.; Pringent, D.
2011-12-01
Available European data indicate that during the past 2500 years there have been periods of rapid intensity geomagnetic fluctuations interspersed with periods of little change. The challenge now is to precisely describe these rapid changes. The aim of this study is to obtain an improved description of the sharp geomagnetic intensity change that took place in Western Europe around 800 yrs AD as well as to investigate if this peak is observed at a continental scale. For this purpose 13 precisely dated early medieval Spanish pottery fragments, 4 archeological French kilns and a 3 collections of bricks used for the construction of different historical buildings from France and with ages ranging between 330 to 1290 AD have been studied. The material collected has been dated by archeological/historical constraints together with radiocarbon,thermoluminiscence (TL) and archeomagentic analysis. From classical Thellier experiments including TRM anisotropy and cooling rate corrections upon archeointensity estimates and conducted on 164 specimens (119 of them giving reliable results) ten new high-quality mean intensities have been obtained. The new intensity data together with a selection of the most reliable data from Western Europe have been relocated to the latitude of Paris and confirm the existence of an intensity maxima of ~85 μT centred at ~850 AD and related to intensity changes up to 20 μT per century. The results also indicate that a previous abrupt intensity change (reaching a maximum value of ~ 90 μT) took place in Western Europe around 650 AD. A selection of high-quality intensity data from Bulgaria, Italy and Greece indicate a very similar intensity trend for Eastern Europe. Although available data indicate that the duration of such periods of high intensities may be of less than one century more data are needed to infer the exact duration of these maximums. A comparison between the selected data and regional and global geomagnetic field models indicates that such models fail to reproduce the detailed evolution of geomagnetic intensity changes. These results highlight the need of new reliable and precisely dated archeointensity data if a refined description of geomagnetic field changes wants to be obtained.
Frakking, Thuy T; Chang, Anne B; O'Grady, Kerry-Ann F; Walker-Smith, Katie; Weir, Kelly A
2013-11-07
Oropharyngeal aspiration (OPA) can lead to recurrent respiratory illnesses and chronic lung disease in children. Current clinical feeding evaluations performed by speech pathologists have poor reliability in detecting OPA when compared to radiological procedures such as the modified barium swallow (MBS). Improved ability to diagnose OPA accurately via clinical evaluation potentially reduces reliance on expensive, less readily available radiological procedures. Our study investigates the utility of adding cervical auscultation (CA), a technique of listening to swallowing sounds, in improving the diagnostic accuracy of a clinical evaluation for the detection of OPA. We plan an open, unblinded, randomised controlled trial at a paediatric tertiary teaching hospital. Two hundred and sixteen children fulfilling the inclusion criteria will be randomised to one of the two clinical assessment techniques for the clinical detection of OPA: (1) clinical feeding evaluation only (CFE) group or (2) clinical feeding evaluation with cervical auscultation (CFE + CA) group. All children will then undergo an MBS to determine radiologically assessed OPA. The primary outcome is the presence or absence of OPA, as determined on MBS using the Penetration-Aspiration Scale. Our main objective is to determine the sensitivity, specificity, negative and positive predictive values of 'CFE + CA' versus 'CFE' only compared to MBS-identified OPA. Early detection and appropriate management of OPA is important to prevent chronic pulmonary disease and poor growth in children. As the reliability of CFE to detect OPA is low, a technique that can improve the diagnostic accuracy of the CFE will help minimise consequences to the paediatric respiratory system. Cervical auscultation is a technique that has previously been documented as a clinical adjunct to the CFE; however, no published RCTs addressing the reliability of this technique in children exist. Our study will be the first to establish the utility of CA in assessing and diagnosing OPA risk in young children. Australia and New Zealand Clinical Trials Register (ANZCTR) number ACTRN12613000589785.
2013-01-01
Background Oropharyngeal aspiration (OPA) can lead to recurrent respiratory illnesses and chronic lung disease in children. Current clinical feeding evaluations performed by speech pathologists have poor reliability in detecting OPA when compared to radiological procedures such as the modified barium swallow (MBS). Improved ability to diagnose OPA accurately via clinical evaluation potentially reduces reliance on expensive, less readily available radiological procedures. Our study investigates the utility of adding cervical auscultation (CA), a technique of listening to swallowing sounds, in improving the diagnostic accuracy of a clinical evaluation for the detection of OPA. Methods We plan an open, unblinded, randomised controlled trial at a paediatric tertiary teaching hospital. Two hundred and sixteen children fulfilling the inclusion criteria will be randomised to one of the two clinical assessment techniques for the clinical detection of OPA: (1) clinical feeding evaluation only (CFE) group or (2) clinical feeding evaluation with cervical auscultation (CFE + CA) group. All children will then undergo an MBS to determine radiologically assessed OPA. The primary outcome is the presence or absence of OPA, as determined on MBS using the Penetration-Aspiration Scale. Our main objective is to determine the sensitivity, specificity, negative and positive predictive values of ‘CFE + CA’ versus ‘CFE’ only compared to MBS-identified OPA. Discussion Early detection and appropriate management of OPA is important to prevent chronic pulmonary disease and poor growth in children. As the reliability of CFE to detect OPA is low, a technique that can improve the diagnostic accuracy of the CFE will help minimise consequences to the paediatric respiratory system. Cervical auscultation is a technique that has previously been documented as a clinical adjunct to the CFE; however, no published RCTs addressing the reliability of this technique in children exist. Our study will be the first to establish the utility of CA in assessing and diagnosing OPA risk in young children. Trial registration Australia and New Zealand Clinical Trials Register (ANZCTR) number ACTRN12613000589785. PMID:24199872
Rosa, Isabel M D; Ahmed, Sadia E; Ewers, Robert M
2014-06-01
Land-use and land-cover (LULC) change is one of the largest drivers of biodiversity loss and carbon emissions globally. We use the tropical rainforests of the Amazon, the Congo basin and South-East Asia as a case study to investigate spatial predictive models of LULC change. Current predictions differ in their modelling approaches, are highly variable and often poorly validated. We carried out a quantitative review of 48 modelling methodologies, considering model spatio-temporal scales, inputs, calibration and validation methods. In addition, we requested model outputs from each of the models reviewed and carried out a quantitative assessment of model performance for tropical LULC predictions in the Brazilian Amazon. We highlight existing shortfalls in the discipline and uncover three key points that need addressing to improve the transparency, reliability and utility of tropical LULC change models: (1) a lack of openness with regard to describing and making available the model inputs and model code; (2) the difficulties of conducting appropriate model validations; and (3) the difficulty that users of tropical LULC models face in obtaining the model predictions to help inform their own analyses and policy decisions. We further draw comparisons between tropical LULC change models in the tropics and the modelling approaches and paradigms in other disciplines, and suggest that recent changes in the climate change and species distribution modelling communities may provide a pathway that tropical LULC change modellers may emulate to further improve the discipline. Climate change models have exerted considerable influence over public perceptions of climate change and now impact policy decisions at all political levels. We suggest that tropical LULC change models have an equally high potential to influence public opinion and impact the development of land-use policies based on plausible future scenarios, but, to do that reliably may require further improvements in the discipline. © 2014 John Wiley & Sons Ltd.
High Power Laser Diode Arrays for 2-Micron Solid State Coherent Lidars Applications
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Meadows, Byron; Kavaya, Michael J.; Singh, Upendra; Sudesh, Vikas; Baker, Nathaniel
2003-01-01
Laser diode arrays are critical components of any diode-pumped solid state laser systems, constraining their performance and reliability. Laser diode arrays (LDAs) are used as the pump source for energizing the solid state lasing media to generate an intense coherent laser beam with a high spatial and spectral quality. The solid state laser design and the characteristics of its lasing materials define the operating wavelength, pulse duration, and power of the laser diodes. The pump requirements for high pulse energy 2-micron solid state lasers are substantially different from those of more widely used 1-micron lasers and in many aspects more challenging [1]. Furthermore, the reliability and lifetime demanded by many coherent lidar applications, such as global wind profiling from space and long-range clear air turbulence detection from aircraft, are beyond the capability of currently available LDAs. In addition to the need for more reliable LDAs with longer lifetime, further improvement in the operational parameters of high power quasi-cw LDAs, such as electrical efficiency, brightness, and duty cycle, are also necessary for developing cost-effective 2-micron coherent lidar systems for applications that impose stringent size, heat dissipation, and power constraints. Global wind sounding from space is one of such applications, which is the main driver for this work as part of NASA s Laser Risk Reduction Program. This paper discusses the current state of the 792 nm LDA technology and the technology areas being pursued toward improving their performance. The design and development of a unique characterization facility for addressing the specific issues associated with the LDAs for pumping 2-micron coherent lidar transmitters and identifying areas of technological improvement will be described. Finally, the results of measurements to date on various standard laser diode packages, as well as custom-designed packages with potentially longer lifetime, will be reported.
Accessing evidence to inform public health policy: a study to enhance advocacy.
Tabak, R G; Eyler, A A; Dodson, E A; Brownson, R C
2015-06-01
Improving population health often involves policy changes that are the result of complex advocacy efforts. Information exchanges among researchers, advocates, and policymakers is paramount to policy interventions to improve health outcomes. This information may include evidence on what works well for whom and cost-effective strategies to improve outcomes of interest. However, this information is not always readily available or easily communicated. The purposes of this paper are to describe ways advocates seek information for health policy advocacy and to compare advocate demographics. Cross-sectional telephone survey. Seventy-seven state-level advocates were asked about the desirable characteristics of policy-relevant information including methods of obtaining information, what makes it useful, and what sources make evidence most reliable/trustworthy. Responses were explored for the full sample and variety of subsamples (i.e. gender, age, and position on social and fiscal issues). Differences between groups were tested using t-tests and one-way analysis of variance. On average, advocates rated frequency of seeking research information as 4.3 out of five. Overall, advocates rated the Internet as the top source, rated unbiased research and research with relevancy to their organization as the most important characteristics, and considered information from their organization as most reliable/believable. When ratings were examined by subgroup, the two characteristics most important for each question in the total sample (listed above) emerged as most important for nearly all subgroups. Advocates are a resource to policymakers on health topics in the policy process. This study, among the first of its kind, found that advocates seek research information, but have a need for evidence that is unbiased and relevant to their organizations and report that university-based information is reliable. Researchers and advocates should partner so research is useful in advocating for evidence-based policy change. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Using wind plant data to increase reliability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, Valerie A.; Ogilvie, Alistair B.; McKenney, Bridget L.
2011-01-01
Operators interested in improving reliability should begin with a focus on the performance of the wind plant as a whole. To then understand the factors which drive individual turbine performance, which together comprise the plant performance, it is necessary to track a number of key indicators. Analysis of these key indicators can reveal the type, frequency, and cause of failures and will also identify their contributions to overall plant performance. The ideal approach to using data to drive good decisions includes first determining which critical decisions can be based on data. When those required decisions are understood, then the analysismore » required to inform those decisions can be identified, and finally the data to be collected in support of those analyses can be determined. Once equipped with high-quality data and analysis capabilities, the key steps to data-based decision making for reliability improvements are to isolate possible improvements, select the improvements with largest return on investment (ROI), implement the selected improvements, and finally to track their impact.« less
7 CFR 760.812 - Production losses; participant responsibility.
Code of Federal Regulations, 2011 CFR
2011-01-01
... required, the best verifiable or reliable production records available for the crop; (2) Summarizing all.... (c) In determining production under this section, the participant must supply verifiable or reliable...
Magpies can use local cues to retrieve their food caches.
Feenders, Gesa; Smulders, Tom V
2011-03-01
Much importance has been placed on the use of spatial cues by food-hoarding birds in the retrieval of their caches. In this study, we investigate whether food-hoarding birds can be trained to use local cues ("beacons") in their cache retrieval. We test magpies (Pica pica) in an active hoarding-retrieval paradigm, where local cues are always reliable, while spatial cues are not. Our results show that the birds use the local cues to retrieve their caches, even when occasionally contradicting spatial information is available. The design of our study does not allow us to test rigorously whether the birds prefer using local over spatial cues, nor to investigate the process through which they learn to use local cues. We furthermore provide evidence that magpies develop landmark preferences, which improve their retrieval accuracy. Our findings support the hypothesis that birds are flexible in their use of memory information, using a combination of the most reliable or salient information to retrieve their caches. © Springer-Verlag 2010
Göhner, Claudia; Weber, Maja; Tannetta, Dionne S; Groten, Tanja; Plösch, Torsten; Faas, Marijke M; Scherjon, Sicco A; Schleußner, Ekkehard; Markert, Udo R; Fitzgerald, Justine S
2015-06-01
The pregnancy-associated disease preeclampsia is related to the release of syncytiotrophoblast extracellular vesicles (STBEV) by the placenta. To improve functional research on STBEV, reliable and specific methods are needed to quantify them. However, only a few quantification methods are available and accepted, though imperfect. For this purpose, we aimed to provide an enzyme-linked sorbent assay (ELSA) to quantify STBEV in fluid samples based on their microvesicle characteristics and placental origin. Ex vivo placenta perfusion provided standards and samples for the STBEV quantification. STBEV were captured by binding of extracellular phosphatidylserine to immobilized annexin V. The membranous human placental alkaline phosphatase on the STBEV surface catalyzed a colorimetric detection reaction. The described ELSA is a rapid and simple method to quantify STBEV in diverse liquid samples, such as blood or perfusion suspension. The reliability of the ELSA was proven by comparison with nanoparticle tracking analysis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
McKenna, J.E.
2003-01-01
The biosphere is filled with complex living patterns and important questions about biodiversity and community and ecosystem ecology are concerned with structure and function of multispecies systems that are responsible for those patterns. Cluster analysis identifies discrete groups within multivariate data and is an effective method of coping with these complexities, but often suffers from subjective identification of groups. The bootstrap testing method greatly improves objective significance determination for cluster analysis. The BOOTCLUS program makes cluster analysis that reliably identifies real patterns within a data set more accessible and easier to use than previously available programs. A variety of analysis options and rapid re-analysis provide a means to quickly evaluate several aspects of a data set. Interpretation is influenced by sampling design and a priori designation of samples into replicate groups, and ultimately relies on the researcher's knowledge of the organisms and their environment. However, the BOOTCLUS program provides reliable, objectively determined groupings of multivariate data.
Supramolecular Affinity Chromatography for Methylation-Targeted Proteomics.
Garnett, Graham A E; Starke, Melissa J; Shaurya, Alok; Li, Janessa; Hof, Fraser
2016-04-05
Proteome-wide studies of post-translationally methylated species using mass spectrometry are complicated by high sample diversity, competition for ionization among peptides, and mass redundancies. Antibody-based enrichment has powered methylation proteomics until now, but the reliability, pan-specificity, polyclonal nature, and stability of the available pan-specific antibodies are problematic and do not provide a standard, reliable platform for investigators. We have invented an anionic supramolecular host that can form host-guest complexes selectively with methyllysine-containing peptides and used it to create a methylysine-affinity column. The column resolves peptides on the basis of methylation-a feat impossible with a comparable commercial cation-exchange column. A proteolyzed nuclear extract was separated on the methyl-affinity column prior to standard proteomics analysis. This experiment demonstrates that such chemical methyl-affinity columns are capable of enriching and improving the analysis of methyllysine residues from complex protein mixtures. We discuss the importance of this advance in the context of biomolecule-driven enrichment methods.
Access to health information on the internet: a public health issue?
Moretti, Felipe Azevedo; Oliveira, Vanessa Elias de; Silva, Edina Mariko Koga da
2012-01-01
To progress in the understanding of the user profile and of search trends for health information on the internet. Analyses were performed based on 1,828 individuals who completed an electronic questionnaire available on a very popular health website. At the same time, through the "elite survey" method, 20 specialists were interviewed, aiming at assessing quality control strategies regarding health information disseminated online. A predominance of female users who research information for themselves (= 90%), who consider the internet one of their main sources of health information (86%), and who spend from 5 to 35 hours online every week (62%) was verified. High reliability is assigned to information from specialists (76%), and low reliability to television, radio, or blogs (14%). It can be concluded that the internet is proving to be a major source of health information for the population, and that website certification is a strategy to be contemplated to improve the quality of information and to promote public health.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Dr. Li; Cui, Xiaohui; Cemerlic, Alma
Ad hoc networks are very helpful in situations when no fixed network infrastructure is available, such as natural disasters and military conflicts. In such a network, all wireless nodes are equal peers simultaneously serving as both senders and routers for other nodes. Therefore, how to route packets through reliable paths becomes a fundamental problems when behaviors of certain nodes deviate from wireless ad hoc routing protocols. We proposed a novel Dirichlet reputation model based on Bayesian inference theory which evaluates reliability of each node in terms of packet delivery. Our system offers a way to predict and select a reliablemore » path through combination of first-hand observation and second-hand reputation reports. We also proposed moving window mechanism which helps to adjust ours responsiveness of our system to changes of node behaviors. We integrated the Dirichlet reputation into routing protocol of wireless ad hoc networks. Our extensive simulation indicates that our proposed reputation system can improve good throughput of the network and reduce negative impacts caused by misbehaving nodes.« less
Hsin, Kun-Yi; Ghosh, Samik; Kitano, Hiroaki
2013-01-01
Increased availability of bioinformatics resources is creating opportunities for the application of network pharmacology to predict drug effects and toxicity resulting from multi-target interactions. Here we present a high-precision computational prediction approach that combines two elaborately built machine learning systems and multiple molecular docking tools to assess binding potentials of a test compound against proteins involved in a complex molecular network. One of the two machine learning systems is a re-scoring function to evaluate binding modes generated by docking tools. The second is a binding mode selection function to identify the most predictive binding mode. Results from a series of benchmark validations and a case study show that this approach surpasses the prediction reliability of other techniques and that it also identifies either primary or off-targets of kinase inhibitors. Integrating this approach with molecular network maps makes it possible to address drug safety issues by comprehensively investigating network-dependent effects of a drug or drug candidate. PMID:24391846
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, D. G.; Arent, D. J.; Johnson, L.
2006-06-01
This paper documents a probabilistic risk assessment of existing and alternative power supply systems at a large telecommunications office. The analysis characterizes the increase in the reliability of power supply through the use of two alternative power configurations. Failures in the power systems supporting major telecommunications service nodes are a main contributor to significant telecommunications outages. A logical approach to improving the robustness of telecommunication facilities is to increase the depth and breadth of technologies available to restore power during power outages. Distributed energy resources such as fuel cells and gas turbines could provide additional on-site electric power sources tomore » provide backup power, if batteries and diesel generators fail. The analysis is based on a hierarchical Bayesian approach and focuses on the failure probability associated with each of three possible facility configurations, along with assessment of the uncertainty or confidence level in the probability of failure. A risk-based characterization of final best configuration is presented.« less
Effective interventions on service quality improvement in a physiotherapy clinic.
Gharibi, Farid; Tabrizi, JafarSadegh; Eteraf Oskouei, MirAli; AsghariJafarabadi, Mohammad
2014-01-01
Service quality is considered as a main domain of quality associ-ated with non-clinical aspect of healthcare. This study aimed to survey and im-proves service quality of delivered care in the Physiotherapy Clinic affiliated with the Tabriz University of Medical Sciences, Tabriz, Iran. A quasi experimental interventional study was conducted in the Physiotherapy Clinic, 2010-2011. Data were collected using a validated and reli-able researcher made questionnaire with participation of 324 patients and their coadjutors. The study questionnaire consisted of 7 questions about demographic factors and 38 questions for eleven aspects of service quality. Data were then analyzed using paired samples t-test by SPSS16. In the pre intervention phase, six aspects of service quality including choice of provider, safety, prevention and early detection, dignity, autonomy and availability achieved non-acceptable scores. Following interventions, all aspects of the service quality improved and also total service quality score improved from 8.58 to 9.83 (P<0.001). Service quality can be improved by problem implementation of appropriate interventions. The acquired results can be used in health system fields to create respectful environments for healthcare customers.
Applicability and Limitations of Reliability Allocation Methods
NASA Technical Reports Server (NTRS)
Cruz, Jose A.
2016-01-01
Reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation process is often performed at different stages of system design. The allocation process often begins at the conceptual stage. As the system design develops, more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers. Applying reliability allocation techniques without understanding their limitations and assumptions can produce unrealistic results. This report addresses weighting factors, optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
Quality of nursing doctoral education in Korea: towards policy development.
Ja Kim, Mi; Gi Park, Chang; Kim, Minju; Lee, Hyeonkyeong; Ahn, Yang-Heui; Kim, Euisook; Yun, Soon-Nyoung; Lee, Kwang-Ja
2012-07-01
This article is a report on an international study of the quality of nursing doctoral education; herein, we report findings for Korea. Specific aims were to: examine the validity and reliability of the quality of nursing doctoral education questionnaire; and identify contributing factors and domain(s) for improvement. The quality of nursing doctoral education has been a worldwide concern with the recent rapid increase in number of nursing doctoral programmes around the world, and comprehensive evaluation is needed for policy recommendations. A cross-sectional descriptive study, conducted from October 2006 to January 2007, used an online questionnaire evaluating four domains: programme, faculty, resources and evaluation. Seven deans, 48 faculty, 52 graduates and 87 students from 14 nursing schools participated. Content and construct validity, and construct reliability of the questionnaire were established. Overall, participants reported that the perceived quality of private universities/schools was significantly higher than that of public/national universities. A higher ratio of doctoral to non-doctoral students was significantly associated with higher quality. The domains of programme, faculty and resources were highly correlated. The programme was the most important domain; availability of sufficient materials and information for students most needed improvement. Overall, faculty perceived the quality of the programme, faculty and resources as more positively than did the graduates and students. This study provides useful policy guidance for nurse educators worldwide for improving doctoral programmes and faculty's role in educating students. Further study is recommended that examines contributing factors to quality doctoral education. © 2011 Blackwell Publishing Ltd.
Physical Oceanographic Real-Time System (PORTS) (Invited)
NASA Astrophysics Data System (ADS)
Wright, D.
2013-12-01
The 1999 Assessment of U.S. Marine Transportation System report to Congress noted that the greatest safety concern voiced by the maritime community was the availability of timely, accurate, and reliable navigation information, including real time environment data. Real time oceanographic and meteorological data, along with other navigation tools, gives the mariner a good situational understanding of their often challenging operational environment, to make the best safety of life and property decisions. The National Oceanic and Atmospheric Administration's (NOAA) Physical Oceanographic Real Time System (PORTS) was developed in response to accidents like the Sunshine Skyway Bridge collision in Tampa, FL in 1980, where the lack of accurate, reliable and timely environmental conditions directly contributed to an accident that resulted in a high loss of life and property. Since that time, PORTS has expanded to over 20 locations around the country, and its capabilities have been continually expanded and improved as well. PORTS primary mission is to prevent maritime accidents. Preventing an accident from occurring is the most cost effective approach and the best way to avoid damage to the environment. When accidents do occur, PORTS data is used to improve the effectiveness of response efforts by providing input for trajectory models and real time conditions for response efforts. However, benefits derived from PORTS go well beyond navigation safety. Another large benefit to the local maritime community is potential efficiencies in optimizing use of the existing water column. PORTS provides information that can be used to make economic decisions to add or offload cargo to a vessel and/or to maintain or adjust transit schedules based upon availability of water depth, strength/timing of tidal currents, and other conditions. PORTS data also helps improve and validate local National Weather Service marine weather forecasts. There are many benefits beyond the local maritime community. PORTS data often proves critical when hurricanes or other severe weather events impact an area with the data helping inform the local emergency response infrastructure. PORTS data can also help support local habitat restoration efforts through improved tidal datums, frequency of inundation projections, and sea level trends.
Strayhorn, J; McDermott, J F; Tanguay, P
1993-06-01
The effects of methods used to improve the interrater reliability of reviewers' ratings of manuscripts submitted to the Journal of the American Academy of Child and Adolescent Psychiatry were studied. Reviewers' ratings of consecutive manuscripts submitted over approximately 1 year were first analyzed; 296 pairs of ratings were studied. Intraclass correlations and confidence intervals for the correlations were computed for the two main ratings by which reviewers quantified the quality of the article: a 1-10 overall quality rating and a recommendation for acceptance or rejection with four possibilities along that continuum. Modifications were then introduced, including a multi-item rating scale and two training manuals to accompany it. Over the next year, 272 more articles were rated, and reliabilities were computed for the new scale and for the scales previously used. The intraclass correlation of the most reliable rating before the intervention was 0.27; the reliability of the new rating procedure was 0.43. The difference between these two was significant. The reliability for the new rating scale was in the fair to good range, and it became even better when the ratings of the two reviewers were averaged and the reliability stepped up by the Spearman-Brown formula. The new rating scale had excellent internal consistency and correlated highly with other quality ratings. The data confirm that the reliability of ratings of scientific articles may be improved by increasing the number of rating scale points, eliciting ratings of separate, concrete items rather than a global judgment, using training manuals, and averaging the scores of multiple reviewers.
Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth
2016-01-01
If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
NASA Astrophysics Data System (ADS)
Filis, Avishai; Pundak, Nachman; Barak, Moshe; Porat, Ze'ev; Jaeger, Mordechai
2011-06-01
The growing demand for EO applications that work around the clock 24hr/7days a week, such as in border surveillance systems, emphasizes the need for a highly reliable cryocooler having increased operational availability and decreased integrated system Life Cycle (ILS) cost. In order to meet this need RICOR has developed a new rotary Stirling cryocooler, model K508N, intended to double the K508's operating MTTF achieving 20,000 operating MTTF hours. The K508N employs RICOR's latest mechanical design technologies such as optimized bearings and greases, bearings preloading, advanced seals, laser welded cold finger and robust design structure with increased natural frequency compared to the K508 model. The cooler enhanced MTTF was demonstrated by a Validation and Verification (V&V) plan comprising analytical means and a comparative accelerated life test between the standard K508 and the K508N models. Particularly, point estimate and confidence interval for the MTTF improvement factor where calculated periodically during and after the test. The (V&V) effort revealed that the K508N meets its MTTF design goal. The paper will focus on the technical and engineering aspects of the new design. In addition it will discuss the market needs and expectations, investigate the reliability data of the present reference K508 model; and report the accelerate life test data and the statistical analysis methodology as well as its underlying assumptions and results.
Excellent reliability of the Hamilton Depression Rating Scale (HDRS-21) in Indonesia after training.
Istriana, Erita; Kurnia, Ade; Weijers, Annelies; Hidayat, Teddy; Pinxten, Lucas; de Jong, Cor; Schellekens, Arnt
2013-09-01
The Hamilton Depression Rating Scale (HDRS) is the most widely used depression rating scale worldwide. Reliability of HDRS has been reported mainly from Western countries. The current study tested the reliability of HDRS ratings among psychiatric residents in Indonesia, before and after HDRS training. The hypotheses were that: (i) prior to the training reliability of HDRS ratings is poor; and (ii) HDRS training can improve reliability of HDRS ratings to excellent levels. Furthermore, we explored cultural validity at item level. Videotaped HDRS interviews were rated by 30 psychiatric residents before and after 1 day of HDRS training. Based on a gold standard rating, percentage correct ratings and deviation from the standard were calculated. Correct ratings increased from 83% to 99% at item level and from 70% to 100% for the total rating. The average deviation from the gold standard rating improved from 0.07 to 0.02 at item level and from 2.97 to 0.46 for the total rating. HDRS assessment by psychiatric trainees in Indonesia without prior training is unreliable. A short, evidence-based HDRS training improves reliability to near perfect levels. The outlined training program could serve as a template for HDRS trainings. HDRS items that may be less valid for assessment of depression severity in Indonesia are discussed. Copyright © 2013 Wiley Publishing Asia Pty Ltd.
Han, Houzeng; Wang, Jian; Wang, Jinling; Tan, Xinglong
2015-01-01
The integration of Global Navigation Satellite Systems (GNSS) carrier phases with Inertial Navigation System (INS) measurements is essential to provide accurate and continuous position, velocity and attitude information, however it is necessary to fix ambiguities rapidly and reliably to obtain high accuracy navigation solutions. In this paper, we present the notion of combining the Global Positioning System (GPS), the BeiDou Navigation Satellite System (BDS) and low-cost micro-electro-mechanical sensors (MEMS) inertial systems for reliable navigation. An adaptive multipath factor-based tightly-coupled (TC) GPS/BDS/INS integration algorithm is presented and the overall performance of the integrated system is illustrated. A twenty seven states TC GPS/BDS/INS model is adopted with an extended Kalman filter (EKF), which is carried out by directly fusing ambiguity fixed double-difference (DD) carrier phase measurements with the INS predicted pseudoranges to estimate the error states. The INS-aided integer ambiguity resolution (AR) strategy is developed by using a dynamic model, a two-step estimation procedure is applied with adaptively estimated covariance matrix to further improve the AR performance. A field vehicular test was carried out to demonstrate the positioning performance of the combined system. The results show the TC GPS/BDS/INS system significantly improves the single-epoch AR reliability as compared to that of GPS/BDS-only or single satellite navigation system integrated strategy, especially for high cut-off elevations. The AR performance is also significantly improved for the combined system with adaptive covariance matrix in the presence of low elevation multipath related to the GNSS-only case. A total of fifteen simulated outage tests also show that the time to relock of the GPS/BDS signals is shortened, which improves the system availability. The results also indicate that TC integration system achieves a few centimeters accuracy in positioning based on the comparison analysis and covariance analysis, even in harsh environments (e.g., in urban canyons), thus we can see the advantage of positioning at high cut-off elevations that the combined GPS/BDS brings. PMID:25875191
Han, Houzeng; Wang, Jian; Wang, Jinling; Tan, Xinglong
2015-04-14
The integration of Global Navigation Satellite Systems (GNSS) carrier phases with Inertial Navigation System (INS) measurements is essential to provide accurate and continuous position, velocity and attitude information, however it is necessary to fix ambiguities rapidly and reliably to obtain high accuracy navigation solutions. In this paper, we present the notion of combining the Global Positioning System (GPS), the BeiDou Navigation Satellite System (BDS) and low-cost micro-electro-mechanical sensors (MEMS) inertial systems for reliable navigation. An adaptive multipath factor-based tightly-coupled (TC) GPS/BDS/INS integration algorithm is presented and the overall performance of the integrated system is illustrated. A twenty seven states TC GPS/BDS/INS model is adopted with an extended Kalman filter (EKF), which is carried out by directly fusing ambiguity fixed double-difference (DD) carrier phase measurements with the INS predicted pseudoranges to estimate the error states. The INS-aided integer ambiguity resolution (AR) strategy is developed by using a dynamic model, a two-step estimation procedure is applied with adaptively estimated covariance matrix to further improve the AR performance. A field vehicular test was carried out to demonstrate the positioning performance of the combined system. The results show the TC GPS/BDS/INS system significantly improves the single-epoch AR reliability as compared to that of GPS/BDS-only or single satellite navigation system integrated strategy, especially for high cut-off elevations. The AR performance is also significantly improved for the combined system with adaptive covariance matrix in the presence of low elevation multipath related to the GNSS-only case. A total of fifteen simulated outage tests also show that the time to relock of the GPS/BDS signals is shortened, which improves the system availability. The results also indicate that TC integration system achieves a few centimeters accuracy in positioning based on the comparison analysis and covariance analysis, even in harsh environments (e.g., in urban canyons), thus we can see the advantage of positioning at high cut-off elevations that the combined GPS/BDS brings.
Varikuti, Deepthi P; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T; Eickhoff, Simon B
2017-04-01
Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that gray matter masking improved the reliability of connectivity estimates, whereas denoising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources.
Citronberg, Jessica S; Wilkens, Lynne R; Lim, Unhee; Hullar, Meredith A J; White, Emily; Newcomb, Polly A; Le Marchand, Loïc; Lampe, Johanna W
2016-09-01
Plasma lipopolysaccharide-binding protein (LBP), a measure of internal exposure to bacterial lipopolysaccharide, has been associated with several chronic conditions and may be a marker of chronic inflammation; however, no studies have examined the reliability of this biomarker in a healthy population. We examined the temporal reliability of LBP measured in archived samples from participants in two studies. In Study one, 60 healthy participants had blood drawn at two time points: baseline and follow-up (either three, six, or nine months). In Study two, 24 individuals had blood drawn three to four times over a seven-month period. We measured LBP in archived plasma by ELISA. Test-retest reliability was estimated by calculating the intraclass correlation coefficient (ICC). Plasma LBP concentrations showed moderate reliability in Study one (ICC 0.60, 95 % CI 0.43-0.75) and Study two (ICC 0.46, 95 % CI 0.26-0.69). Restricting the follow-up period improved reliability. In Study one, the reliability of LBP over a three-month period was 0.68 (95 % CI: 0.41-0.87). In Study two, the ICC of samples taken ≤seven days apart was 0.61 (95 % CI 0.29-0.86). Plasma LBP concentrations demonstrated moderate test-retest reliability in healthy individuals with reliability improving over a shorter follow-up period.
Varikuti, Deepthi P.; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T.; Eickhoff, Simon B.
2016-01-01
Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that grey matter masking improved the reliability of connectivity estimates, whereas de-noising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources. PMID:27550015
Developing Reliable Life Support for Mars
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2017-01-01
A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development t. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveal unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is under estimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and less certain expected reliability. A plan to develop reliable life support is needed to achieve the best possible reliability.
Motz, Benjamin A; de Leeuw, Joshua R; Carvalho, Paulo F; Liang, Kaley L; Goldstone, Robert L
2017-01-01
Despite widespread assertions that enthusiasm is an important quality of effective teaching, empirical research on the effect of enthusiasm on learning and memory is mixed and largely inconclusive. To help resolve these inconsistencies, we conducted a carefully-controlled laboratory experiment, investigating whether enthusiastic instructions for a memory task would improve recall accuracy. Scripted videos, either enthusiastic or neutral, were used to manipulate the delivery of task instructions. We also manipulated the sequence of learning items, replicating the spacing effect, a known cognitive technique for memory improvement. Although spaced study reliably improved test performance, we found no reliable effect of enthusiasm on memory performance across two experiments. We did, however, find that enthusiastic instructions caused participants to respond to more item prompts, leaving fewer test questions blank, an outcome typically associated with increased task motivation. We find no support for the popular claim that enthusiastic instruction will improve learning, although it may still improve engagement. This dissociation between motivation and learning is discussed, as well as its implications for education and future research on student learning.
de Leeuw, Joshua R.; Carvalho, Paulo F.; Liang, Kaley L.; Goldstone, Robert L.
2017-01-01
Despite widespread assertions that enthusiasm is an important quality of effective teaching, empirical research on the effect of enthusiasm on learning and memory is mixed and largely inconclusive. To help resolve these inconsistencies, we conducted a carefully-controlled laboratory experiment, investigating whether enthusiastic instructions for a memory task would improve recall accuracy. Scripted videos, either enthusiastic or neutral, were used to manipulate the delivery of task instructions. We also manipulated the sequence of learning items, replicating the spacing effect, a known cognitive technique for memory improvement. Although spaced study reliably improved test performance, we found no reliable effect of enthusiasm on memory performance across two experiments. We did, however, find that enthusiastic instructions caused participants to respond to more item prompts, leaving fewer test questions blank, an outcome typically associated with increased task motivation. We find no support for the popular claim that enthusiastic instruction will improve learning, although it may still improve engagement. This dissociation between motivation and learning is discussed, as well as its implications for education and future research on student learning. PMID:28732087