Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
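The contrast drawn in this abstract can be made concrete with a toy fault tree. Below is a minimal sketch, not the IRRAS or FUZZYFTA implementations: the same two-gate tree is quantified once with point probabilities and once with triangular fuzzy numbers evaluated by alpha-cut interval arithmetic. All event probabilities are invented for illustration.

```python
import math

# Sketch: the same two-gate fault tree, top = OR(AND(A, B), C), quantified
# probabilistically and with triangular fuzzy numbers. Illustrative only.

def and_p(ps):                      # AND gate: all basic events occur
    return math.prod(ps)

def or_p(ps):                       # OR gate: at least one event occurs
    return 1.0 - math.prod(1.0 - p for p in ps)

# Probabilistic point estimate (invented event probabilities).
pA, pB, pC = 1e-3, 2e-3, 5e-4
print(f"point estimate: {or_p([and_p([pA, pB]), pC]):.3e}")

# Fuzzy estimate: triangular numbers (low, mode, high) via alpha-cuts.
def alpha_cut(tri, a):
    lo, m, hi = tri
    return lo + a * (m - lo), hi - a * (hi - m)

fA, fB, fC = (5e-4, 1e-3, 2e-3), (1e-3, 2e-3, 4e-3), (2e-4, 5e-4, 1e-3)
for a in (0.0, 0.5, 1.0):
    cuts = [alpha_cut(f, a) for f in (fA, fB, fC)]
    los, his = [c[0] for c in cuts], [c[1] for c in cuts]
    # Both gates are monotone in the event probabilities, so interval
    # endpoints map directly to endpoints of the top-event interval.
    top_lo = or_p([and_p(los[:2]), los[2]])
    top_hi = or_p([and_p(his[:2]), his[2]])
    print(f"alpha={a:.1f}: top event in [{top_lo:.3e}, {top_hi:.3e}]")
```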
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.
A Passive System Reliability Analysis for a Station Blackout
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia; Bucknor, Matthew; Grabaskas, David
2015-05-03
The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.
Sidaway, Ben; Euloth, Tracey; Caron, Heather; Piskura, Matthew; Clancy, Jessica; Aide, Alyson
2012-07-01
The purpose of this study was to compare the reliability of three previously used techniques for the measurement of ankle dorsiflexion ROM (open-chained goniometry, closed-chained goniometry, and inclinometry) to a novel trigonometric technique. Twenty-one physiotherapy students used the four techniques (open-chained goniometry, closed-chained goniometry, inclinometry, and trigonometry) to assess dorsiflexion range of motion in 24 healthy volunteers. All student raters underwent training to establish competence in the four techniques. Raters then measured dorsiflexion with a randomly assigned measuring technique four times over two sessions, one week apart. Data were analyzed using a technique-by-session analysis of variance, technique measurement variability being the primary index of reliability. Comparisons were also made between the measurements derived from the four techniques and those obtained from a computerized video analysis system. Analysis of the rater measurement variability around the technique means revealed significant differences between techniques, with the least variation being found in the trigonometric technique. Significant differences were also found between the technique means, but no differences between sessions were evident. The trigonometric technique produced mean ROMs closest in value to those derived from computer analysis. Application of the trigonometric technique resulted in the least variability in measurement across raters and consequently should be considered for use when changes in dorsiflexion ROM need to be reliably assessed. Copyright © 2012 Elsevier B.V. All rights reserved.
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Phased-mission system analysis using Boolean algebraic methods
NASA Technical Reports Server (NTRS)
Somani, Arun K.; Trivedi, Kishor S.
1993-01-01
Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state space explosion that commonly plagues Markov chain-based analysis. A phase algebra to account for the effects of variable configurations and success criteria from phase to phase was developed. Our technique yields exact (as opposed to approximate) results. The use of the technique is demonstrated by means of an example, and numerical results are presented to show the effects of mission phases on the system reliability.
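The phase-algebra idea is easy to illustrate on a toy mission. The sketch below, with invented rates and durations, uses the identity that surviving a later phase implies surviving earlier ones (A1 ∧ A2 = A2 for a non-repairable component) to reduce a two-phase success expression to an exact probability; it illustrates the approach, not the paper's algorithm.

```python
# Sketch of the phase-algebra idea for a 2-component, 2-phase mission with no
# repair: phase 1 needs A AND B, phase 2 needs A OR B. Absorbing A1 ∧ A2 = A2
# gives the exact mission success event (A2 ∧ B1) ∨ (A1 ∧ B2).
import math

lam = {"A": [1e-3, 2e-3], "B": [5e-4, 1e-3]}  # per-hour rates, phases 1 and 2
t = [10.0, 5.0]                               # phase durations, hours

def surv(c, upto):  # P(component c survives phases 1..upto)
    return math.exp(-sum(lam[c][j] * t[j] for j in range(upto)))

pA1, pA2 = surv("A", 1), surv("A", 2)
pB1, pB2 = surv("B", 1), surv("B", 2)
# Inclusion-exclusion on (A2∧B1) ∨ (A1∧B2); the overlap event is A2∧B2.
mission = pA2 * pB1 + pA1 * pB2 - pA2 * pB2
print(f"mission reliability = {mission:.6f}")
```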
Retest Reliability of the Rosenzweig Picture-Frustration Study and Similar Semiprojective Techniques
ERIC Educational Resources Information Center
Rosenzweig, Saul; And Others
1975-01-01
The research dealing with the reliability of the Rosenzweig Picture-Frustration Study is surveyed. Analyses of various split-half and retest procedures are reviewed and their relative effectiveness evaluated. Reliability measures as applied to projective techniques in general are discussed. (Author/DEP)
Reliability/safety analysis of a fly-by-wire system
NASA Technical Reports Server (NTRS)
Brock, L. D.; Goddman, H. A.
1980-01-01
An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.
Structural reliability assessment of the Oman India Pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Sharif, A.M.; Preston, R.
1996-12-31
Reliability techniques are increasingly finding application in design. The special design conditions for the deep water sections of the Oman India Pipeline dictate their use since the experience basis for application of standard deterministic techniques is inadequate. The paper discusses the reliability analysis as applied to the Oman India Pipeline, including selection of a collapse model, characterization of the variability in the parameters that affect pipe resistance to collapse, and implementation of first and second order reliability analyses to assess the probability of pipe failure. The reliability analysis results are used as the basis for establishing the pipe wall thickness requirements for the pipeline.
Reliability techniques for computer executive programs
NASA Technical Reports Server (NTRS)
1972-01-01
Computer techniques for increasing the stability and reliability of executive and supervisory systems were studied. Program segmentation characteristics are discussed along with a validation system which is designed to retain the natural top down outlook in coding. An analysis of redundancy techniques and roll back procedures is included.
Reliability analysis of a robotic system using hybridized technique
NASA Astrophysics Data System (ADS)
Kumar, Naveen; Komal; Lather, J. S.
2017-09-01
In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, then the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. With this technique, fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for the system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with those of the existing technique. The components of the robotic system follow the exponential distribution, i.e., constant failure rates. Sensitivity analysis is also performed, and the impact on system mean time between failures (MTBF) is addressed by varying other reliability parameters. Based on the analysis, some influential suggestions are given to improve the system performance.
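For readers unfamiliar with the lambda-tau method mentioned here, the sketch below shows the standard crisp two-input gate expressions that the fuzzy and hybridized variants operate on; the numbers are invented, and a fuzzy version would evaluate the same expressions over alpha-cut intervals.

```python
# Sketch of the crisp lambda-tau gate expressions that FLT-style methods
# fuzzify (lambda = failure rate, tau = repair time). Standard two-input forms:
#   AND: lambda = l1*l2*(t1+t2),  tau = t1*t2/(t1+t2)
#   OR:  lambda = l1+l2,          tau = (l1*t1+l2*t2)/(l1+l2)

def and_gate(l1, t1, l2, t2):
    return l1 * l2 * (t1 + t2), (t1 * t2) / (t1 + t2)

def or_gate(l1, t1, l2, t2):
    return l1 + l2, (l1 * t1 + l2 * t2) / (l1 + l2)

# Invented rates/repair times: OR of (AND of two events) with a third event.
lam, tau = or_gate(*and_gate(2e-3, 4.0, 3e-3, 6.0), 1e-4, 8.0)
mtbf = 1.0 / lam + tau   # MTBF = MTTF + MTTR
print(f"system lambda={lam:.3e}/h, tau={tau:.2f} h, MTBF={mtbf:.1f} h")
```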
The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles
NASA Technical Reports Server (NTRS)
Latimer, John A.
2009-01-01
This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodology of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
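The abstract does not spell out RRET's formula, so the following is a hypothetical reading only: it assumes residual risks enter as independent multiplicative reliability factors and that the impact indicator is the fractional reduction from baseline. Treat every number and the formula itself as placeholders.

```python
# Hypothetical sketch only: the presentation does not give RRET's formula
# here, so we assume (our assumption, not the source's) that residual risks
# multiply into the baseline as independent contributors and that the
# "impact indicator" is the fractional reduction in baseline reliability.
r_baseline = 0.995
residual_risk_rel = [0.9995, 0.9990]     # per-risk reliabilities (invented)
r_with_risks = r_baseline
for r in residual_risk_rel:
    r_with_risks *= r
impact = (r_baseline - r_with_risks) / r_baseline
print(f"system reliability with residual risks: {r_with_risks:.6f}")
print(f"reliability impact indicator: {impact:.6f}")
```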
Software reliability models for fault-tolerant avionics computers and related topics
NASA Technical Reports Server (NTRS)
Miller, Douglas R.
1987-01-01
Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.
Automated Sneak Circuit Analysis Technique
1990-06-01
…the OrCAD/SDT module Port facility. 2. The terminals of all in-circuit voltage sources (e.g., batteries) must be labeled using the OrCAD/SDT module port facility. Automated Sneak Circuit Analysis Technique, RADC, June 1990, Systems Reliability & Engineering Division, Rome Air Development Center.
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
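A CPDG-style computation can be sketched with a classic path-based reliability model (in the spirit of Cheung's architecture-based model; the paper's exact algorithm may differ). Nodes carry component reliabilities, edges carry transition probabilities, and an explicit stack replaces recursion, echoing the stack-based algorithm mentioned above. The graph and its values are invented and assumed acyclic.

```python
# Sketch of path-based reliability over a component dependency graph.
# Node values are component reliabilities; edge weights are transition
# probabilities out of each component. Illustrative numbers only.
rel = {"S": 0.999, "A": 0.99, "B": 0.98, "E": 0.995}
edges = {"S": [("A", 0.6), ("B", 0.4)], "A": [("E", 1.0)],
         "B": [("E", 1.0)], "E": []}

def system_reliability(start="S", end="E"):
    total = 0.0
    stack = [(start, rel[start])]        # (node, probability mass so far)
    while stack:
        node, p = stack.pop()
        if node == end:
            total += p                   # a successful execution path
            continue
        for succ, q in edges[node]:
            stack.append((succ, p * q * rel[succ]))
    return total

print(f"estimated system reliability: {system_reliability():.6f}")
```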
Preliminary Analysis of LORAN-C System Reliability for Civil Aviation.
1981-09-01
…overview of the analysis technique. Section 3 describes the computerized LORAN-C coverage model, which is used extensively in the reliability analysis. References: Xth Plenary Assembly, Geneva, 1963, published by the International Telecommunications Union; Braff, R., Computer program to calculate a Markov Chain Reliability Model, unpublished work, MITRE Corporation. Preliminary Analysis of LORAN-C System Reliability, Program Engineering & Maintenance Service, Washington, D.C.
Photovoltaic power system reliability considerations
NASA Technical Reports Server (NTRS)
Lalli, V. R.
1980-01-01
An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.
Photovoltaic power system reliability considerations
NASA Technical Reports Server (NTRS)
Lalli, V. R.
1980-01-01
This paper describes an example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems. This particular application was for a solar cell power system demonstration project in Tangaye, Upper Volta, Africa. The techniques involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of a fail-safe and planned spare parts engineering philosophy.
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.
1992-01-01
An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.
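The noninteractive reliability evaluation described here can be illustrated with the standard two-parameter Weibull volume-flaw form; this is a sketch of the general idea, not C/CARES itself, and all stresses, volumes, and Weibull parameters are invented.

```python
# Sketch of a Weibull-style ceramic reliability integration:
# R = exp(-sum_i V_i * (sigma_i / sigma_0)^m), with the sum taken over
# Gaussian integration points rather than element-averaged stresses.
import math

m, sigma0 = 10.0, 400.0                  # Weibull modulus, scale (MPa)
gauss_points = [(120.0, 0.2), (180.0, 0.2), (250.0, 0.1)]  # (stress MPa, volume)
risk = sum(v * (s / sigma0) ** m for s, v in gauss_points)
print(f"component reliability R = {math.exp(-risk):.6f}")
```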
Multidisciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
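The discipline equivalence exploited by this methodology can be seen in a two-element example: a 1-D heat-conduction problem assembles into the same K*T = Q form as an axial bar's K*u = F, so a structural solver can be reused by mapping conductance to stiffness. The sketch below uses invented properties and is not the NESSUS implementation.

```python
import numpy as np

kA_over_L = [5.0, 5.0]              # element conductances (analogous to EA/L)
K = np.zeros((3, 3))
for e, k in enumerate(kA_over_L):   # standard 2-node element assembly
    K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

T0, T2, q1 = 100.0, 20.0, 0.0       # fixed end temperatures, interior source
# Solve row 1 of K*T = Q for the interior node, exactly as a structural
# solver finds an interior displacement with prescribed end displacements.
T1 = (q1 - K[1, 0] * T0 - K[1, 2] * T2) / K[1, 1]
print(f"interior temperature T1 = {T1:.1f}")   # 60.0 for these values
```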
Multi-Disciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song
1997-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
LSI/VLSI design for testability analysis and general approach
NASA Technical Reports Server (NTRS)
Lam, A. Y.
1982-01-01
The incorporation of testability characteristics into large scale digital design is not only necessary for, but also pertinent to, effective device testing and enhancement of device reliability. There are at least three major DFT techniques, namely, the self-checking, the LSSD, and the partitioning techniques, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. A detailed analysis of the design theory, implementation, fault coverage, hardware requirements, application limitations, etc., of each of these techniques is also presented.
Simplified Phased-Mission System Analysis for Systems with Independent Component Repairs
NASA Technical Reports Server (NTRS)
Somani, Arun K.
1996-01-01
Accurate analysis of the reliability of a system requires accounting for all major variations in the system's operation. Most reliability analyses assume that the system configuration, success criteria, and component behavior remain the same. However, multiple phases are natural. We present a new computationally efficient technique for the analysis of phased-mission systems in which the operational states of a system can be described by combinations of component states (such as fault trees or assertions). Moreover, individual components may be repaired, if failed, as part of system operation, but repairs are independent of the system state. For repairable systems, Markov analysis techniques are typically used, but they suffer from state space explosion, which limits the size of system that can be analyzed and makes the computation expensive. We avoid the state space explosion. The phase algebra is used to account for the effects of variable configurations, repairs, and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate our technique by means of several examples and present numerical results to show the effects of phases and repairs on the system reliability/availability.
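The key building block that keeps repairs independent of system state is the two-state Markov component, whose point availability has a closed form. The sketch below, with invented rates, combines such components into a series system; it illustrates the modeling idea rather than the paper's full phase algebra.

```python
# Each component is a 2-state Markov process (up/down) whose point
# availability has the closed form
#   A(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu)*t).
# For independent components, series-system point availability is the product.
import math

def availability(lam, mu, t):
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

components = [(1e-3, 0.1), (2e-3, 0.05)]  # (failure rate, repair rate) per hour
for t in (10.0, 100.0, 1000.0):
    a_sys = 1.0
    for lam, mu in components:
        a_sys *= availability(lam, mu, t)
    print(f"t={t:7.1f} h  series point availability = {a_sys:.6f}")
```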
Fundamentals of Digital Engineering: Designing for Reliability
NASA Technical Reports Server (NTRS)
Katz, R.; Day, John H. (Technical Monitor)
2001-01-01
The concept of designing for reliability is introduced, along with a brief overview of reliability, redundancy, and traditional methods of fault tolerance as applied to current logic devices. The fundamentals of advanced circuit design and analysis techniques are the primary focus. The introduction covers the definitions of key device parameters and how analysis is used to prove circuit correctness. Basic design techniques such as synchronous vs. asynchronous design, metastable state resolution time/arbiter design, and finite state machine structure/implementation are reviewed. Advanced topics are explored, such as skew-tolerant circuit design, the use of triple-modular redundancy, circuit hazards, device transients and preventative circuit design, lock-up states in finite state machines generated by logic synthesizers, device transient characteristics, radiation mitigation techniques, worst-case analysis, and the use of timing analyzers and simulators. Case studies and lessons learned from spaceflight designs are given as examples.
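As one concrete example of the kind of worst-case analysis listed above, the textbook metastability MTBF calculation for a synchronizer is sketched below; the device constants are invented, and the formula is the standard form MTBF = exp(t_res/tau) / (T0 * f_clk * f_data), not taken from this presentation.

```python
# Textbook metastability budget for a single synchronizer flip-flop, where
# t_res is the time allowed for the flop to resolve before the next stage
# samples it. Device constants tau and T0 are invented.
import math

tau, T0 = 0.3e-9, 0.5e-9        # metastability constants (invented)
f_clk, f_data = 50e6, 1e6       # clock rate and asynchronous-event rate
for t_res in (5e-9, 10e-9, 15e-9):
    mtbf = math.exp(t_res / tau) / (T0 * f_clk * f_data)
    print(f"t_res={t_res * 1e9:4.1f} ns  MTBF ~ {mtbf:.2e} s")
```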
DATMAN: A reliability data analysis program using Bayesian updating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, M.; Feltus, M.A.
1996-12-31
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, of system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing the tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
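The gamma-Poisson conjugate pair is the classic closed-form update behind this kind of tool. The sketch below shows the arithmetic only, with an invented prior and invented failure data; it is not DATMAN's code.

```python
# Conjugate Bayesian update for a constant failure rate: Gamma(alpha, beta)
# prior, n failures observed over cumulative exposure T, posterior
# Gamma(alpha + n, beta + T). Prior and data are invented.
import math

alpha0, beta0 = 2.0, 1000.0     # prior mean rate alpha/beta = 2e-3 per hour
n_fail, T = 3, 2500.0           # new evidence: 3 failures in 2500 hours

alpha1, beta1 = alpha0 + n_fail, beta0 + T
print(f"prior mean rate    : {alpha0 / beta0:.2e}/h")
print(f"posterior mean rate: {alpha1 / beta1:.2e}/h")
# Component reliability over a 100-hour mission at the posterior mean rate:
print(f"R(100 h) ~ {math.exp(-(alpha1 / beta1) * 100.0):.4f}")
```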
On the next generation of reliability analysis tools
NASA Technical Reports Server (NTRS)
Babcock, Philip S., IV; Leong, Frank; Gai, Eli
1987-01-01
The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
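Automatic model construction of the kind proposed here can be sketched in miniature: enumerate component-failure states from a design description, generate failure transitions, and integrate the Markov equations. The example below (invented rates, two redundant channels, simple Euler integration) is purely illustrative.

```python
# States are bit vectors of failed components; the system is up while at
# least one of the two redundant channels works. The Markov model is built
# mechanically from the component list, then integrated with Euler steps.
from itertools import product
import math

lams = [1e-3, 1e-3]                       # channel failure rates (invented)
up = lambda s: not all(s)                 # success criterion: any channel up
states = list(product((0, 1), repeat=len(lams)))
p = {s: (1.0 if s == (0, 0) else 0.0) for s in states}

dt, t_end = 0.1, 1000.0
for _ in range(int(t_end / dt)):
    dp = {s: 0.0 for s in states}
    for s in states:
        for i, lam in enumerate(lams):
            if s[i] == 0:                 # component i can still fail
                s2 = tuple(1 if j == i else s[j] for j in range(len(s)))
                flow = lam * p[s] * dt
                dp[s] -= flow
                dp[s2] += flow
    for s in states:
        p[s] += dp[s]

print(f"R({t_end:.0f} h) ~ {sum(p[s] for s in states if up(s)):.4f}")
print(f"exact: {1 - (1 - math.exp(-1e-3 * 1000.0)) ** 2:.4f}")  # 1-(1-e^-lt)^2
```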
NASA Technical Reports Server (NTRS)
LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.
2011-01-01
This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
Hierarchical modeling for reliability analysis using Markov models. B.S./M.S. Thesis - MIT
NASA Technical Reports Server (NTRS)
Fagundo, Arturo
1994-01-01
Markov models represent an extremely attractive tool for the reliability analysis of many systems. However, Markov model state space grows exponentially with the number of components in a given system. Thus, for very large systems Markov modeling techniques alone become intractable in both memory and CPU time. Often a particular subsystem can be found within some larger system where the dependence of the larger system on the subsystem is of a particularly simple form. This simple dependence can be used to decompose such a system into one or more subsystems. A hierarchical technique is presented which can be used to evaluate these subsystems in such a way that their reliabilities can be combined to obtain the reliability for the full system. This hierarchical approach is unique in that it allows the subsystem model to pass multiple aggregate state information to the higher level model, allowing more general systems to be evaluated. Guidelines are developed to assist in the system decomposition. An appropriate method for determining subsystem reliability is also developed. This method gives rise to some interesting numerical issues. Numerical error due to roundoff and integration are discussed at length. Once a decomposition is chosen, the remaining analysis is straightforward but tedious. However, an approach is developed for simplifying the recombination of subsystem reliabilities. Finally, a real world system is used to illustrate the use of this technique in a more practical context.
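A minimal sketch of the hierarchical idea, not the thesis' algorithm: solve a subsystem model in isolation and pass its aggregate reliability up to a higher-level series combination. Rates are invented.

```python
# The subsystem (a 1-of-2 parallel pair) is evaluated on its own; only its
# aggregate reliability curve is passed upward to the system-level model.
import math

def r_sub(t, lam=1e-3):                  # aggregate result passed upward
    return 1.0 - (1.0 - math.exp(-lam * t)) ** 2

def r_system(t, lam_c=5e-4):             # higher-level series combination
    return r_sub(t) * math.exp(-lam_c * t)

for t in (100.0, 500.0, 1000.0):
    print(f"t={t:6.0f} h  R_sub={r_sub(t):.4f}  R_sys={r_system(t):.4f}")
```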
Komal
2018-05-01
Power consumption is increasing day by day. To fulfill the requirement for failure-free power, planning and implementation of an effective and reliable power management system is essential. The phasor measurement unit (PMU) is one of the key devices in wide-area measurement and control systems. The reliable performance of the PMU assures a failure-free power supply for any power system. So, the purpose of the present study is to analyse the reliability of a PMU used for controllability and observability of power systems utilizing available uncertain data. In this paper, a generalized fuzzy lambda-tau (GFLT) technique has been proposed for this purpose. In GFLT, system components' uncertain failure and repair rates are fuzzified using fuzzy numbers having different shapes, such as triangular, normal, Cauchy, sharp gamma and trapezoidal. To select a suitable fuzzy number for quantifying data uncertainty, system experts' opinions have been considered. The GFLT technique applies a fault tree, the lambda-tau method, data fuzzified using different membership functions, and alpha-cut based fuzzy arithmetic operations to compute some important reliability indices. Furthermore, in this study, ranking of critical components of the system using the RAM-Index and sensitivity analysis have also been performed. The developed technique may help improve system performance significantly and can be applied to analyse the fuzzy reliability of other engineering systems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
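To make the alpha-cut machinery mentioned here concrete, the sketch below cuts a trapezoidal fuzzy failure rate (a triangular number is the degenerate case) and propagates the resulting intervals through an OR gate, as in the lambda-tau sketch earlier in this listing; shapes and numbers are invented.

```python
# A trapezoidal fuzzy number (a, b, c, d) has the alpha-cut interval
# [a + alpha*(b - a), d - alpha*(d - c)]; gate expressions are then
# evaluated on the interval endpoints.
def trap_cut(a, b, c, d, alpha):
    return (a + alpha * (b - a), d - alpha * (d - c))

lam1 = (1e-3, 2e-3, 3e-3, 4e-3)     # trapezoidal failure rate (per hour)
lam2 = (2e-3, 3e-3, 3e-3, 5e-3)     # degenerate c == b gives a triangular shape
for alpha in (0.0, 0.5, 1.0):
    l1 = trap_cut(*lam1, alpha)
    l2 = trap_cut(*lam2, alpha)
    # OR gate: lambdas add, so interval endpoints add.
    print(f"alpha={alpha}: OR-gate lambda in "
          f"[{l1[0] + l2[0]:.3e}, {l1[1] + l2[1]:.3e}]")
```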
Fabrication Techniques and Principles for Flat Plate Antennas
DOT National Transportation Integrated Search
1973-09-01
The report documents the fabrication techniques and principles selected to produce one and ten million flat plate antennas per year. An engineering analysis of the reliability, electrical integrity, and repeatability is made, and a cost analysis summ...
System reliability and recovery.
DOT National Transportation Integrated Search
1971-06-01
The paper exhibits a variety of reliability techniques applicable to future ATC data processing systems. Presently envisioned schemes for error detection, error interrupt and error analysis are considered, along with methods of retry, reconfiguration...
Reliability and coverage analysis of non-repairable fault-tolerant memory systems
NASA Technical Reports Server (NTRS)
Cox, G. W.; Carroll, B. D.
1976-01-01
A method was developed for the construction of probabilistic state-space models for nonrepairable systems. Models were developed for several systems which achieved reliability improvement by means of error-coding, modularized sparing, massive replication and other fault-tolerant techniques. From the models developed, sets of reliability and coverage equations for the systems were developed. Comparative analyses of the systems were performed using these equation sets. In addition, the effects of varying subunit reliabilities on system reliability and coverage were described. The results of these analyses indicated that a significant gain in system reliability may be achieved by use of combinations of modularized sparing, error coding, and software error control. For sufficiently reliable system subunits, this gain may far exceed the reliability gain achieved by use of massive replication techniques, yet result in a considerable saving in system cost.
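Two of the fault-tolerance structures analyzed in this kind of study have compact textbook reliability expressions, shown below with invented parameters; the paper's state-space models are more detailed, so this gives only a flavor of the comparison.

```python
# Textbook expressions (not the paper's models):
#   TMR (2-of-3 voting):     R = 3*Rm^2 - 2*Rm^3
#   duplex with coverage c:  R = Rm^2 + 2*c*Rm*(1 - Rm)
import math

lam, t, c = 1e-4, 5000.0, 0.95       # invented rate, mission time, coverage
rm = math.exp(-lam * t)              # module reliability
r_tmr = 3 * rm**2 - 2 * rm**3
r_dup = rm**2 + 2 * c * rm * (1 - rm)
print(f"simplex={rm:.4f}  TMR={r_tmr:.4f}  duplex(c={c})={r_dup:.4f}")
```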
Optimization Based Efficiencies in First Order Reliability Analysis
NASA Technical Reports Server (NTRS)
Peck, Jeffrey A.; Mahadevan, Sankaran
2003-01-01
This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank one updating technique. In problems that use commercial code as a black box, the gradient calculations are usually done using a finite difference approach, which becomes very expensive for large system models. The proposed method replaces the finite difference gradient calculations in a standard first order reliability method (FORM) with Broyden's quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results are compared to standard FORM results. It is found that BFORM typically requires fewer function evaluations than FORM to converge to the same answer.
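A minimal sketch of the BFORM idea follows: a standard HL-RF (FORM) iteration in standard-normal space, with the gradient obtained by finite differences only once and thereafter maintained by Broyden's rank-one secant update. The limit state function is invented; this illustrates the scheme, not the paper's code.

```python
import numpy as np

def g(u):                                # invented limit state, g < 0 = failure
    return 6.0 - 2.0 * u[0] - u[1] - 0.2 * u[0] ** 2

def fd_grad(f, u, h=1e-6):               # finite differences, used only once
    g0 = f(u)
    return np.array([(f(u + h * e) - g0) / h for e in np.eye(len(u))])

u = np.zeros(2)
grad = fd_grad(g, u)                     # full gradient only at the start
for _ in range(100):
    gu = g(u)
    u_new = (grad @ u - gu) / (grad @ grad) * grad   # HL-RF step
    s = u_new - u
    if np.linalg.norm(s) < 1e-10:
        break
    y = g(u_new) - gu
    grad = grad + ((y - grad @ s) / (s @ s)) * s     # Broyden rank-one update
    u = u_new

print(f"design point {u}, reliability index beta = {np.linalg.norm(u):.4f}")
```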
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
2003-01-01
This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.
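The global response-surface strategy can be sketched as follows: fit a second-order polynomial to a handful of exact limit-state evaluations, then perform the cheap reliability estimate on the surrogate. The limit state and design of experiments below are invented, and the report's buckling problem is replaced by a toy function.

```python
import numpy as np

def g_true(x1, x2):                       # expensive-model stand-in (invented)
    return 3.0 - 0.5 * x1**2 - x2 + 0.1 * x1 * x2

# Factorial samples in standard-normal space.
pts = np.array([(a, b) for a in (-2, 0, 2) for b in (-2, 0, 2)], float)
y = np.array([g_true(*p) for p in pts])
# Basis: 1, x1, x2, x1^2, x2^2, x1*x2 -> least-squares coefficients.
X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0]**2, pts[:, 1]**2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Cheap reliability estimate by Monte Carlo on the surrogate.
rng = np.random.default_rng(1)
u = rng.standard_normal((200_000, 2))
g_rs = (coef[0] + coef[1] * u[:, 0] + coef[2] * u[:, 1] + coef[3] * u[:, 0]**2
        + coef[4] * u[:, 1]**2 + coef[5] * u[:, 0] * u[:, 1])
print(f"Pf on response surface ~ {np.mean(g_rs < 0):.2e}")
```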
Foster, J D; Miskovic, D; Allison, A S; Conti, J A; Ockrim, J; Cooper, E J; Hanna, G B; Francis, N K
2016-06-01
Laparoscopic rectal resection is technically challenging, with outcomes dependent upon technical performance. No robust objective assessment tool exists for laparoscopic rectal resection surgery. This study aimed to investigate the application of the objective clinical human reliability analysis (OCHRA) technique for assessing technical performance of laparoscopic rectal surgery and to explore the validity and reliability of this technique. Laparoscopic rectal cancer resection operations were described in the format of a hierarchical task analysis. Potential technical errors were defined. The OCHRA technique was used to identify technical errors enacted in videos of twenty consecutive laparoscopic rectal cancer resection operations from a single site. The procedural task, spatial location, and circumstances of all identified errors were logged. Clinical validity was assessed through correlation with clinical outcomes; reliability was assessed by test-retest. A total of 335 execution errors were identified, with a median of 15 per operation. More errors were observed during pelvic tasks compared with abdominal tasks (p < 0.001). Within the pelvis, more errors were observed during dissection on the right side than the left (p = 0.03). Test-retest confirmed reliability (r = 0.97, p < 0.001). A significant correlation was observed between error frequency and mesorectal specimen quality (r_s = 0.52, p = 0.02) and with blood loss (r_s = 0.609, p = 0.004). OCHRA offers a valid and reliable method for evaluating technical performance of laparoscopic rectal surgery.
Mallineni, S K; Anthonappa, R P; King, N M
2016-12-01
To assess the reliability of the vertical tube shift technique (VTST) and horizontal tube shift technique (HTST) for the localisation of unerupted supernumerary teeth (ST) in the anterior region of the maxilla. A convenience sample of 83 patients who attended a major teaching hospital because of unerupted ST was selected. Only non-syndromic patients with ST and who had complete clinical and radiographic and surgical records were included in the study. Ten examiners independently rated the paired set of radiographs for each technique. Chi-square test, paired t test and kappa statistics were employed to assess the intra- and inter-examiner reliability. Paired sets of 1660 radiographs (830 pairs for each technique) were available for the analysis. The overall sensitivity for VTST and HTST was 80.6 and 72.1% respectively, with slight inter-examiner and good intra-examiner reliability. Statistically significant differences were evident between the two localisation techniques (p < 0.05). Localisation of unerupted ST using VTST was more successful than HTST in the anterior region of the maxilla.
Reliability studies of Integrated Modular Engine system designs
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1993-01-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
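One plausible form of the binomial approximation mentioned here (with invented module counts and reliabilities) compares a discrete cluster that needs every engine with a networked arrangement that tolerates module failures; the study's actual models are more detailed.

```python
# k-of-n reliability for n identical, independent modules:
#   R = sum_{i=k..n} C(n, i) * Rm^i * (1 - Rm)^(n - i)
from math import comb

def k_of_n(k, n, rm):
    return sum(comb(n, i) * rm**i * (1 - rm)**(n - i) for i in range(k, n + 1))

rm = 0.98   # per-module reliability (invented)
print(f"discrete, all 4 of 4 engines needed : {k_of_n(4, 4, rm):.4f}")
print(f"networked, any 6 of 8 modules needed: {k_of_n(6, 8, rm):.4f}")
```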
Reliability studies of integrated modular engine system designs
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1993-01-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, A.; Duffy, S. F.; Gyekenyesi, J. P.
1992-01-01
An updated version of the integrated design program C/CARES (composite ceramic analysis and reliability evaluation of structures) was developed for the reliability evaluation of CMC laminated shell components. The algorithm is now split in two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The new interface program from the finite-element code MARC also includes the option of using hybrid laminates and allows for variations in temperature fields throughout the component.
Reliability studies of integrated modular engine system designs
NASA Astrophysics Data System (ADS)
Hardy, Terry L.; Rapp, Douglas C.
1993-06-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
Reliability studies of Integrated Modular Engine system designs
NASA Astrophysics Data System (ADS)
Hardy, Terry L.; Rapp, Douglas C.
1993-06-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
New Results in Software Model Checking and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2010-01-01
This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.
Marcos-Garcés, V; Harvat, M; Molina Aguilar, P; Ferrández Izquierdo, A; Ruiz-Saurí, A
2017-08-01
Measurement of collagen bundle orientation in histopathological samples is a widely used and useful technique in many research and clinical scenarios. Fourier analysis is the preferred method for performing this measurement, but the most appropriate staining and microscopy technique remains unclear. Some authors advocate the use of Haematoxylin-Eosin (H&E) and confocal microscopy, but there are no studies comparing this technique with other classical collagen stainings. In our study, 46 human skin samples were collected, processed for histological analysis and stained with Masson's trichrome, Picrosirius red and H&E. Five microphotographs of the reticular dermis were taken with a 200× magnification with light microscopy, polarized microscopy and confocal microscopy, respectively. Two independent observers measured collagen bundle orientation with semiautomated Fourier analysis with the Image-Pro Plus 7.0 software and three independent observers performed a semiquantitative evaluation of the same parameter. The average orientation for each case was calculated with the values of the five pictures. We analyzed the interrater reliability, the consistency between Fourier analysis and average semiquantitative evaluation and the consistency between measurements in Masson's trichrome, Picrosirius red and H&E-confocal. Statistical analysis for reliability and agreement was performed with the SPSS 22.0 software and consisted of intraclass correlation coefficient (ICC), Bland-Altman plots and limits of agreement and coefficient of variation. Interrater reliability was almost perfect (ICC > 0.8) with all three histological and microscopy techniques and always superior in Fourier analysis than in average semiquantitative evaluation. Measurements were consistent between Fourier analysis by one observer and average semiquantitative evaluation by three observers, with an almost perfect agreement with Masson's trichrome and Picrosirius red techniques (ICC > 0.8) and a strong agreement with H&E-confocal (0.7 < ICC < 0.8). Comparison of measurements between the three techniques for the same observer showed an almost perfect agreement (ICC > 0.8), better with Fourier analysis than with semiquantitative evaluation (single and average). These results in nonpathological skin samples were also confirmed in a preliminary analysis in eight scleroderma skin samples. Our results show that Masson's trichrome and Picrosirius red are consistent with H&E-confocal for measuring collagen bundle orientation in histological samples and could thus be used indistinctly for this purpose. Fourier analysis is superior to average semiquantitative evaluation and should keep being used as the preferred method. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
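The Fourier measurement referred to throughout this abstract rests on a simple fact: the 2-D power spectrum of an image of roughly parallel fibres concentrates energy along the fibre-normal direction. The sketch below recovers the orientation of a synthetic grating from its spectrum, using the doubled-angle average to handle the 180-degree ambiguity; it is illustrative and is not the Image-Pro Plus pipeline.

```python
import numpy as np

# Synthetic "fibres": a sinusoidal grating whose wave vector points at 30
# degrees (so the fibres themselves run along 120 degrees).
theta = np.deg2rad(30.0)
y, x = np.mgrid[0:256, 0:256]
img = np.sin(2 * np.pi * (x * np.cos(theta) + y * np.sin(theta)) / 16.0)

spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
fy, fx = np.mgrid[-128:128, -128:128]
mask = (fx**2 + fy**2) > 4               # ignore the DC neighbourhood
ang = np.arctan2(fy[mask], fx[mask])     # angle of each frequency bin
w = spec[mask]
# Energy-weighted mean axis; doubling the angle removes the 180° ambiguity.
axis = 0.5 * np.arctan2((w * np.sin(2 * ang)).sum(),
                        (w * np.cos(2 * ang)).sum())
print(f"estimated fibre-normal axis: {np.rad2deg(axis):.1f} degrees")
```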
Reliability of diagnosis and clinical efficacy of visceral osteopathy: a systematic review.
Guillaud, Albin; Darbois, Nelly; Monvoisin, Richard; Pinsault, Nicolas
2018-02-17
In 2010, the World Health Organization published benchmarks for training in osteopathy in which osteopathic visceral techniques are included. The purpose of this study was to identify and critically appraise the scientific literature concerning the reliability of diagnosis and the clinical efficacy of techniques used in visceral osteopathy. The databases MEDLINE, OSTMED.DR, the Cochrane Library, Osteopathic Research Web, Google Scholar, the Journal of the American Osteopathic Association (JAOA) website, the International Journal of Osteopathic Medicine (IJOM) website, and the catalog of the Académie d'ostéopathie de France website were searched through December 2017. Only inter-rater reliability studies including at least two raters, or intra-rater reliability studies including at least two assessments by the same rater, were included. For efficacy studies, only randomized controlled trials (RCT) or crossover studies on unhealthy subjects (any condition, duration and outcome) were included. Risk of bias was determined using a modified version of the quality appraisal tool for studies of diagnostic reliability (QAREL) in reliability studies. For the efficacy studies, the Cochrane risk of bias tool was used to assess their methodological design. Two authors performed data extraction and analysis. Eight reliability studies and six efficacy studies were included. The analysis of reliability studies shows that the diagnostic techniques used in visceral osteopathy are unreliable. Regarding efficacy studies, the least biased study shows no significant difference for the main outcome. The main risks of bias found in the included studies were due to the absence of blinding of the examiners, an unsuitable statistical method, or an absence of a primary study outcome. The results of the systematic review lead us to conclude that well-conducted and sound evidence on the reliability and the efficacy of techniques in visceral osteopathy is absent. The review was registered with PROSPERO on 12 December 2016; the registration number is CRD4201605286.
Probability techniques for reliability analysis of composite materials
NASA Technical Reports Server (NTRS)
Wetherhold, Robert C.; Ucci, Anthony M.
1994-01-01
Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservatism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
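The two families of methods compared in this report can be contrasted on a toy linear-normal limit state, where the exact failure probability is known. The sketch below (invented strength/stress parameters) runs crude Monte Carlo and a shifted-density importance-sampling estimate; FPI methods would instead work from the reliability index beta shown at the end.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
muR, sdR, muS, sdS = 600.0, 40.0, 450.0, 30.0   # strength R, stress S
n = 100_000

def npdf(x, mu, sd):                     # normal density, vectorized
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Crude Monte Carlo: failure when g = R - S < 0.
g = rng.normal(muR, sdR, n) - rng.normal(muS, sdS, n)
print(f"crude MC  : Pf ~ {np.mean(g < 0):.2e}")

# Importance sampling: draw from densities shifted so failures are common,
# then reweight each sample by its likelihood ratio.
R = rng.normal(525.0, sdR, n)
S = rng.normal(525.0, sdS, n)
w = (npdf(R, muR, sdR) / npdf(R, 525.0, sdR)
     * npdf(S, muS, sdS) / npdf(S, 525.0, sdS))
print(f"importance: Pf ~ {np.mean((R - S < 0) * w):.2e}")

# Exact result for this linear-normal case: Pf = Phi(-beta).
beta = (muR - muS) / sqrt(sdR**2 + sdS**2)
print(f"exact     : Pf = {0.5 * erfc(beta / sqrt(2)):.2e}")
```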
Non-contact ulcer area calculation system for neuropathic foot ulcer.
Shah, Parth; Mahajan, Siddaram; Nageswaran, Sharmila; Paul, Sathish Kumar; Ebenzer, Mannam
2017-08-11
Around 125,785 new cases of leprosy were detected in India in 2013-14, according to the WHO report on leprosy of September 2015, accounting for approximately 62% of the total new cases. Anaesthetic foot caused by leprosy leads to uneven loading of the foot, leading to ulceration in approximately 20% of cases. Much effort has gone into identifying newer techniques to efficiently monitor the progress of ulcer healing. Current techniques for measuring the size of ulcers have not been found to be very accurate but are still followed by clinicians across the globe. Quantification of the prognosis of the condition is required to understand the efficacy of current treatment methods and to plan further treatment. This study aims at developing a non-contact technique to precisely measure the size of ulcers in patients affected by leprosy. Using MATLAB software, a GUI was designed to process the acquired ulcer image by segmenting it and calculating the pixel area of the image. The image was further converted to a standard measurement using a reference object. The developed technique was tested on 16 ulcer images acquired from 10 leprosy patients with plantar ulcers. Statistical analysis was done using MedCalc analysis software to find the reliability of the system. The analysis showed a very high correlation coefficient (r=0.9882) between ulcer area measurements done using the traditional technique and the newly developed technique. The reliability of the newly developed technique was significant at a significance level of 99.9%. The designed non-contact ulcer area calculation system using MATLAB is found to be a reliable system for calculating the size of ulcers. The technique would help clinicians have a reliable tool to monitor the progress of ulcer healing and help modify the treatment protocol if needed. Copyright © 2017 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.
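The pixel-counting conversion described here is straightforward to sketch. The example below is in Python rather than the authors' MATLAB GUI, with a toy image standing in for the segmentation step and a reference object of known size providing the pixel-to-area scale; all sizes and thresholds are invented.

```python
import numpy as np

def area_cm2(mask_pixels, ref_pixels, ref_area_cm2):
    # Each pixel covers ref_area_cm2 / ref_pixels square centimetres.
    return mask_pixels * (ref_area_cm2 / ref_pixels)

# Toy stand-ins for segmentation output: a fake grayscale image thresholded
# into an ulcer mask, plus a 1 cm x 1 cm reference sticker mask.
img = np.zeros((100, 100))
img[40:60, 30:55] = 0.9
ulcer_mask = img > 0.5
ref_mask = np.zeros((100, 100), bool)
ref_mask[5:15, 5:15] = True

print(f"ulcer area ~ {area_cm2(ulcer_mask.sum(), ref_mask.sum(), 1.0):.2f} cm^2")
```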
A Test Reliability Analysis of an Abbreviated Version of the Pupil Control Ideology Form.
ERIC Educational Resources Information Center
Gaffney, Patrick V.
A reliability analysis was conducted of an abbreviated, 10-item version of the Pupil Control Ideology Form (PCI), using Cronbach's alpha technique (L. J. Cronbach, 1951) and the computation of the standard error of measurement. The PCI measures a teacher's orientation toward pupil control. Subjects were 168 preservice teachers from one private…
A Comparison of Laser and Video Techniques for Determining Displacement and Velocity during Running
ERIC Educational Resources Information Center
Harrison, Andrew J.; Jensen, Randall L.; Donoghue, Orna
2005-01-01
The reliability of a laser system was compared with the reliability of a video-based kinematic analysis in measuring displacement and velocity during running. Validity and reliability of the laser on static measures was also assessed at distances between 10 m and 70 m by evaluating the coefficient of variation and intraclass correlation…
Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M
1999-07-30
The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high risk of developing schizophrenia in the next 5-10 years for genetic reasons. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and thresholding criteria used in separating brain from cerebro-spinal fluid (CSF). Investigation with a phantom test object (of similar imaging characteristics to the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good in whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.
User's guide to the Reliability Estimation System Testbed (REST)
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam
1992-01-01
The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
Module Degradation Mechanisms Studied by a Multi-Scale Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, Steve; Al-Jassim, Mowafak; Hacke, Peter
2016-11-21
A key pathway to meeting the Department of Energy SunShot 2020 goals is to reduce financing costs by improving investor confidence through improved photovoltaic (PV) module reliability. A comprehensive approach to further understand and improve PV reliability includes characterization techniques and modeling from module to atomic scale. Imaging techniques, which include photoluminescence, electroluminescence, and lock-in thermography, are used to locate localized defects responsible for module degradation. Small area samples containing such defects are prepared using coring techniques and are then suitable and available for microscopic study and specific defect modeling and analysis.
Pulse oximeter sensor application during neonatal resuscitation: a randomized controlled trial.
Louis, Deepak; Sundaram, Venkataseshan; Kumar, Praveen
2014-03-01
This study was done to compare 2 techniques of pulse oximeter sensor application during neonatal resuscitation for faster signal detection. Applying the sensor to the infant first (STIF) and then to the oximeter was compared with applying the sensor to the oximeter first (STOF) and then to the infant, in neonates of ≥28 weeks' gestation. The primary outcome was time from completion of sensor application to a reliable signal, defined as stable display of heart rate and saturation. Time from birth to sensor application, time taken for sensor application, time from birth to reliable signal, and need to reapply the sensor were secondary outcomes. An intention-to-treat analysis was done, and subgroup analyses were done for gestation and need for resuscitation. One hundred fifty neonates were randomized, with 75 to each technique. The median (IQR) time from sensor application to detection of a reliable signal was longer in the STIF group compared with the STOF group (16 [15-17] vs. 10 [6-18] seconds; P < 0.001). Time taken for application of the sensor was longer with the STIF technique than with the STOF technique (12 [10-16] vs. 11 [9-15] seconds; P = 0.04). Time from birth to reliable signal did not differ between the 2 methods (STIF: 61 [52-76] seconds; STOF: 58 [47-73] seconds; P = 0.09). Time taken for signal acquisition was longer with STIF than with STOF in both subgroups. In the delivery room setting, the STOF method recognized saturation and heart rate faster than the STIF method. The time from birth to reliable signal was similar with the 2 methods.
48 CFR 215.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2010 CFR
2010-10-01
…reliability of its estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494…] Title 48, Federal Acquisition Regulations System, Section 215.404-1 (Defense Acquisition): Proposal analysis techniques.
Development of a versatile user-friendly IBA experimental chamber
NASA Astrophysics Data System (ADS)
Kakuee, Omidreza; Fathollahi, Vahid; Lamehi-Rachti, Mohammad
2016-03-01
Reliable performance of Ion Beam Analysis (IBA) techniques depends on an accurate geometry of the experimental setup, the use of reliable nuclear data, and the implementation of dedicated analysis software for each of the IBA techniques. It has already been shown that geometrical imperfections lead to significant uncertainties in the quantification of IBA measurements. To minimize these uncertainties, a user-friendly experimental chamber with a heuristic sample positioning system for IBA was recently developed in the Van de Graaff laboratory in Tehran. This system enhances IBA capabilities, in particular the Nuclear Reaction Analysis (NRA) and Elastic Recoil Detection Analysis (ERDA) techniques. The newly developed sample manipulator provides the possibility of both controlling the tilt angle of the sample and analyzing samples of different thicknesses. Moreover, a reasonable number of samples can be loaded onto the sample wheel. A comparison of the measured cross-section data of the 16O(d,p1)17O reaction with data reported in the literature confirms the performance and capability of the newly developed experimental chamber.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast the short time series data received from sensors and control systems. Reliability of the emergency forecasting results is ensured by filtering invalid sensed data using correlation analysis methods.
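The Improved Brown's method referenced above builds on Brown's classic linear (double) exponential smoothing; the sketch below shows only the classic method, since the fractal-dimension adaptation of the smoothing constant is specific to the paper and not reproduced here. The series and alpha are hypothetical.

```python
import numpy as np

def brown_forecast(y, alpha: float, horizon: int = 1) -> float:
    """Brown's linear (double) exponential smoothing h-step-ahead forecast."""
    s1 = s2 = y[0]
    for x in y[1:]:
        s1 = alpha * x + (1 - alpha) * s1   # first smoothing pass
        s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing pass
    level = 2 * s1 - s2
    trend = (alpha / (1 - alpha)) * (s1 - s2)
    return level + horizon * trend

series = np.array([3.1, 3.4, 3.9, 4.2, 4.8, 5.1])  # short sensor time series
print(brown_forecast(series, alpha=0.4, horizon=1))
```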
1975-04-01
Contents include: Category 3, Fabrication Methods and Techniques (3-01 to 3-21); Category 4, Reliability Studies (4-01 to 4-15); Category 5, Computerized Analysis; and the RAC Microcircuit Thesaurus, in which terms are arranged in alphabetical order with sub-term descriptions following each main term, with cross-referencing. Category 3 covers reliability aspects of microcircuit manufacture; Category 4 comprises technical reports relating to formal reliability studies and investigations.
Remote Sensing Applications with High Reliability in Changjiang Water Resource Management
NASA Astrophysics Data System (ADS)
Ma, L.; Gao, S.; Yang, A.
2018-04-01
Remote sensing technology has been widely used in many fields. However, most applications cannot obtain information with high reliability and high accuracy at large scale, especially applications using automatic interpretation methods. We have designed an application-oriented technology system (PIR) composed of a series of accurate interpretation techniques, which achieve over 85% correctness in Water Resource Management from the viewpoints of photogrammetry and expert knowledge. The techniques comprise spatial positioning techniques from the viewpoint of photogrammetry, feature interpretation techniques from the viewpoint of expert knowledge, and rationality analysis techniques from the viewpoint of data mining. Each interpreted polygon is accurate enough to be applied to accuracy-sensitive projects, such as the Three Gorges Project and the South-to-North Water Diversion Project. In this paper, we present several remote sensing applications with high reliability in Changjiang Water Resource Management, including water pollution investigation, illegal construction inspection, and water conservation monitoring.
NASA Astrophysics Data System (ADS)
Daly, Aoife; Streeton, Noëlle L. W.
2017-06-01
A technique for non-invasive dendrochronological analysis of oak was developed for archaeological material, using an industrial CT scanner. Since 2013, this experience has been extended within the scope of the research project 'After the Black Death: Painting and Polychrome Sculpture in Norway'. The source material for the project is a collection of late-medieval winged altarpieces, shrines, polychrome sculpture, and fragments from Norwegian churches, which are owned by the Museum of Cultural History, University of Oslo. The majority cannot be sampled, and many are too large to fit into the CT scanner. For these reasons, a combined approach was adopted, utilizing CT scanning where possible, but preceded by an 'exposed-wood' imaging technique. Both non-invasive techniques have yielded reliable results, and CT scanning has confirmed the reliability of the imaging technique alone. This paper presents the analytical methods, along with results from two of the 13 objects under investigation. Results for reliable dates and provenances provide new foundations for historical interpretations.
NASA Technical Reports Server (NTRS)
Yunis, Isam S.; Carney, Kelly S.
1993-01-01
A new aerospace application of structural reliability techniques is presented, where the applied forces depend on many probabilistic variables. This application is the plume impingement loading of the Space Station Freedom Photovoltaic Arrays. When the space shuttle berths with Space Station Freedom it must brake and maneuver towards the berthing point using its primary jets. The jet exhaust, or plume, may cause high loads on the photovoltaic arrays. The many parameters governing this problem are highly uncertain and random. An approach, using techniques from structural reliability, as opposed to the accepted deterministic methods, is presented which assesses the probability of failure of the array mast due to plume impingement loading. A Monte Carlo simulation of the berthing approach is used to determine the probability distribution of the loading. A probability distribution is also determined for the strength of the array. Structural reliability techniques are then used to assess the array mast design. These techniques are found to be superior to the standard deterministic dynamic transient analysis, for this class of problem. The results show that the probability of failure of the current array mast design, during its 15 year life, is minute.
A guide to onboard checkout. Volume 4: Propulsion
NASA Technical Reports Server (NTRS)
1971-01-01
The propulsion system for a space station is considered with respect to onboard checkout requirements. Failure analysis, reliability, and maintenance features are presented. Computer analysis techniques are also discussed.
ERIC Educational Resources Information Center
Coordination in Development, New York, NY.
This booklet was produced in response to the growing need for reliable environmental assessment techniques that can be applied to small-scale development projects. The suggested techniques emphasize low-technology environmental analysis. Although these techniques may lack precision, they can be extremely valuable in helping to assure the success…
Leveraging Code Comments to Improve Software Reliability
ERIC Educational Resources Information Center
Tan, Lin
2009-01-01
Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
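The numerical-integration step described above can be illustrated with a stress-strength interference sketch: reliability is the probability that strength exceeds load, computed by quadrature. The distributions below are hypothetical stand-ins, not the report's data.

```python
import numpy as np
from scipy import stats, integrate

# Stress-strength interference: R = P(strength > load)
#   R = integral of f_strength(s) * F_load(s) ds
strength = stats.lognorm(s=0.08, scale=50.0)  # hypothetical strength scatter
load = stats.norm(loc=35.0, scale=4.0)        # hypothetical applied load

R, _ = integrate.quad(lambda s: strength.pdf(s) * load.cdf(s), 0.0, 300.0)
print(f"reliability R = {R:.6f}, probability of failure = {1.0 - R:.2e}")
```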
Cut set-based risk and reliability analysis for arbitrarily interconnected networks
Wyss, Gregory D.
2000-01-01
Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
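A rough sketch of the quantification idea, evaluating network failure probability from minimal cut sets of independent link failures, is given below; the cut sets and failure rates are invented for illustration, and the patented one-sided node-failure search is not reproduced.

```python
from itertools import combinations

# Minimal cut sets expressed as sets of links; each link fails independently.
cut_sets = [{"a", "b"}, {"b", "c", "d"}, {"a", "d"}]
p_fail = {"a": 1e-3, "b": 2e-3, "c": 5e-4, "d": 1e-3}

def cut_prob(cut):
    """Probability that every link in the cut set fails."""
    prob = 1.0
    for link in cut:
        prob *= p_fail[link]
    return prob

# First-order bound: P(network fails) <= sum of cut-set probabilities.
upper = sum(cut_prob(c) for c in cut_sets)

# Second-order (inclusion-exclusion) correction over pairwise intersections;
# both cuts fail together only if every link in their union fails.
pairwise = sum(cut_prob(c1 | c2) for c1, c2 in combinations(cut_sets, 2))
print(f"{upper:.3e} (upper bound), {upper - pairwise:.3e} (2nd-order estimate)")
```

Working directly with cut-set probabilities like this is what lets the method skip the combinatorial expansion and Boolean reduction steps the abstract mentions.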
Intraosseous repair of the inferior alveolar nerve in rats: an experimental model.
Curtis, N J; Trickett, R I; Owen, E; Lanzetta, M
1998-08-01
A reliable method of exposure of the inferior alveolar nerve in Wistar rats has been developed, to allow intraosseous repair with two microsurgical techniques under halothane inhalational anaesthesia. The microsuturing technique involves anastomosis with 10-0 nylon sutures; a laser-weld technique uses an albumin-based solder containing indocyanine green, plus an infrared (810 nm wavelength) diode laser. Seven animals had left inferior alveolar nerve repairs performed with the microsuture and laser-weld techniques. Controls were provided by the unoperated nerves in the repaired cases. Histochemical analysis was performed utilizing neuron counts and horseradish peroxidase (HRP) tracer uptake in the mandibular division of the trigeminal ganglion, following sacrifice and staining of frozen sections with cresyl violet and diaminobenzidine. The results of this analysis showed similar mean neuron counts and mean HRP uptake by neurons for the unoperated controls and both the microsuture and laser-weld groups. This new technique of intraosseous exposure of the inferior alveolar nerve in rats is described. It allows reliable and reproducible microsurgical repairs using both microsuture and laser-weld techniques.
An image analysis system for near-infrared (NIR) fluorescence lymph imaging
NASA Astrophysics Data System (ADS)
Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.
2011-03-01
Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of a NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all of these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
Implementing a Reliability Centered Maintenance Program at NASA's Kennedy Space Center
NASA Technical Reports Server (NTRS)
Tuttle, Raymond E.; Pete, Robert R.
1998-01-01
Maintenance practices have long focused on time-based "preventive maintenance" techniques. Components were changed out and parts replaced based on how long they had been in place instead of what condition they were in. A reliability centered maintenance (RCM) program seeks to offer equal or greater reliability at decreased cost by ensuring that only applicable, effective maintenance is performed and, in large part, by replacing time-based maintenance with condition-based maintenance. A significant portion of this program involved introducing non-intrusive technologies, such as vibration analysis, oil analysis, and IR cameras, to an existing labor force and management team.
Efficient and robust analysis of complex scattering data under noise in microwave resonators.
Probst, S; Song, F B; Bushev, P A; Ustinov, A V; Weides, M
2015-02-01
Superconducting microwave resonators are reliable circuits widely used for detection and as test devices for material research. A reliable determination of their external and internal quality factors is crucial for many modern applications, which either require fast measurements or operate in the single-photon regime with small signal-to-noise ratios. Here, we use the circle fit technique with diameter correction and provide a step-by-step guide for implementing an algorithm for robust fitting and calibration of complex resonator scattering data in the presence of noise. The speedup and robustness of the analysis are achieved by employing an algebraic rather than an iterative fit technique for the resonance circle.
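The algebraic (non-iterative) fit mentioned in this abstract can be illustrated with a Kasa-style least-squares circle fit to complex scattering data; this is a generic sketch, not the authors' calibrated circle-fit-with-diameter-correction routine, and the data are synthetic.

```python
import numpy as np

def kasa_circle_fit(s21: np.ndarray):
    """Algebraic least-squares circle fit to complex scattering data.
    Linearizes (x-xc)^2 + (y-yc)^2 = r^2 into 2*xc*x + 2*yc*y + c = x^2 + y^2
    with c = r^2 - xc^2 - yc^2, then solves by linear least squares."""
    x, y = s21.real, s21.imag
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (xc, yc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + xc**2 + yc**2)
    return xc, yc, r

# Noisy synthetic resonance circle in the complex plane.
rng = np.random.default_rng(2)
theta = np.linspace(0, 2 * np.pi, 400)
data = (0.5 + 0.3j) + 0.2 * np.exp(1j * theta) + 0.005 * (
    rng.standard_normal(400) + 1j * rng.standard_normal(400))
print(kasa_circle_fit(data))  # ~ (0.5, 0.3, 0.2)
```

Because the fit reduces to one linear solve, there is no iteration to diverge under noise, which is the robustness and speed argument the abstract makes.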
Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K
2011-04-01
Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or 10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than for the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87% to 97% for QMA versus 60% to 73% for digitized manual measurements, while interobserver APAs ranged from 91% to 96% for QMA versus 57% to 63% for digitized manual measurements. The use of QMA software substantially improved the reliability of lumbar intervertebral measurements and the classification of instability based on flexion-extension radiographs.
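As an illustration of the Shrout-Fleiss statistic used in this study, here is a minimal ICC(2,1) implementation (two-way random effects, absolute agreement, single measurement) applied to synthetic rating data; the exact ICC form used in the paper may differ.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """Shrout-Fleiss ICC(2,1) from two-way ANOVA mean squares.
    ratings: (n_subjects, k_raters)."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(3)
truth = rng.normal(10, 3, size=(30, 1))          # 30 radiograph sets
obs = truth + rng.normal(0, 0.5, size=(30, 3))   # 3 observers, small error
print(icc_2_1(obs))  # approaches 1 as observer error shrinks
```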
USDA-ARS?s Scientific Manuscript database
Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...
Maximally reliable spatial filtering of steady state visual evoked potentials.
Dmochowski, Jacek P; Greaves, Alex S; Norcia, Anthony M
2015-04-01
Due to their high signal-to-noise ratio (SNR) and robustness to artifacts, steady state visual evoked potentials (SSVEPs) are a popular technique for studying neural processing in the human visual system. SSVEPs are conventionally analyzed at individual electrodes or linear combinations of electrodes which maximize some variant of the SNR. Here we exploit the fundamental assumption of evoked responses--reproducibility across trials--to develop a technique that extracts a small number of high SNR, maximally reliable SSVEP components. This novel spatial filtering method operates on an array of Fourier coefficients and projects the data into a low-dimensional space in which the trial-to-trial spectral covariance is maximized. When applied to two sample data sets, the resulting technique recovers physiologically plausible components (i.e., the recovered topographies match the lead fields of the underlying sources) while drastically reducing the dimensionality of the data (i.e., more than 90% of the trial-to-trial reliability is captured in the first four components). Moreover, the proposed technique achieves a higher SNR than that of the single-best electrode or the Principal Components. We provide a freely-available MATLAB implementation of the proposed technique, herein termed "Reliable Components Analysis". Copyright © 2015 Elsevier Inc. All rights reserved.
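A hedged sketch of the underlying computation follows: maximizing trial-to-trial covariance reduces to a generalized eigenvalue problem between the mean cross-trial covariance and the within-trial covariance. Note that the published method operates on arrays of Fourier coefficients; for brevity this sketch works on time-domain epochs, and all shapes and signals are synthetic.

```python
import numpy as np
from scipy.linalg import eigh

def reliable_components(trials: np.ndarray, n_comp: int = 4):
    """trials: (n_trials, n_samples, n_channels) time-locked epochs.
    Returns spatial filters maximizing trial-to-trial covariance."""
    centered = trials - trials.mean(axis=1, keepdims=True)
    t, n, _ = centered.shape
    within = sum(x.T @ x for x in centered) / (t * (n - 1))  # mean per-trial cov
    s = centered.sum(axis=0)
    total = s.T @ s / (n - 1)                                # cov of trial sum
    across = (total - within * t) / (t * (t - 1))            # mean cross-trial cov
    evals, evecs = eigh(across, within)                      # generalized eigenproblem
    order = np.argsort(evals)[::-1]
    return evecs[:, order[:n_comp]], evals[order[:n_comp]]

# Synthetic SSVEP-like data: one reliable source mixed into 16 channels.
rng = np.random.default_rng(4)
src = np.sin(2 * np.pi * 7.5 * np.arange(500) / 250.0)       # 7.5 Hz at 250 Hz fs
mix = rng.standard_normal(16)
epochs = src[None, :, None] * mix[None, None, :] + 0.8 * rng.standard_normal((20, 500, 16))
w, ev = reliable_components(epochs)
print(ev)  # first eigenvalue dominates when one reliable component is present
```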
Ku-band signal design study. [space shuttle orbiter data processing network
NASA Technical Reports Server (NTRS)
Rubin, I.
1978-01-01
Analytical tools, methods and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior synchronization and network reliability. The structure of the data processing network is described as well as the system operation principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied. System and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter continues a series of notes concentrating on analysis techniques, with this issue's section discussing digital timing analysis tools and techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SX/SX-A/SX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard Reconfigurable Field Programmable Gate Array Program; Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.
Rozen, Warren Matthew; Spychal, Robert T.; Hunter-Smith, David J.
2016-01-01
Background: Accurate volumetric analysis is an essential component of preoperative planning in both reconstructive and aesthetic breast procedures towards achieving symmetrization and patient-satisfactory outcomes. Numerous comparative studies and reviews of individual techniques have been reported. However, a unifying review of all techniques comparing their accuracy, reliability, and practicality has been lacking. Methods: A review of the published English literature dating from 1950 to 2015 using databases, such as PubMed, Medline, Web of Science, and EMBASE, was undertaken. Results: Since Bouman's first description of the water displacement method, a range of volumetric assessment techniques have been described: thermoplastic casting, direct anthropomorphic measurement, two-dimensional (2D) imaging, and computed tomography (CT)/magnetic resonance imaging (MRI) scans. However, most have been unreliable, difficult to execute, and of limited practicability. The introduction of 3D surface imaging has revolutionized the field due to its ease of use, fast speed, accuracy, and reliability. However, its widespread use has been limited by its high cost and a lack of high-level evidence. Recent developments have unveiled the first web-based 3D surface imaging program, 4D imaging, and 3D printing. Conclusions: Despite its importance, an accurate, reliable, and simple breast volumetric analysis tool had been elusive until the introduction of 3D surface imaging technology. However, its high cost has limited its wide usage. Novel adjunct technologies, such as the web-based 3D surface imaging program, 4D imaging, and 3D printing, appear promising. PMID:27047788
Tutorial: Advanced fault tree applications using HARP
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta; Bavuso, Salvatore J.; Boyd, Mark A.
1993-01-01
Reliability analysis of fault tolerant computer systems for critical applications is complicated by several factors. These modeling difficulties are discussed and dynamic fault tree modeling techniques for handling them are described and demonstrated. Several advanced fault tolerant computer systems are described, and fault tree models for their analysis are presented. HARP (Hybrid Automated Reliability Predictor) is a software package developed at Duke University and NASA Langley Research Center that is capable of solving the fault tree models presented.
Investigating effects of communications modulation technique on targeting performance
NASA Astrophysics Data System (ADS)
Blasch, Erik; Eusebio, Gerald; Huling, Edward
2006-05-01
One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they need reliable data. In order to facilitate reliable computational intelligence, we explore the communication modulation trade-offs affecting information distribution and accumulation. In this analysis, we explore the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. In the analysis, we simulate a Link 16 with a simple bandpass phase-shift keying (PSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy trade-offs are assessed as to the effects incurred on targeting performance.
Kim, Paul Jeong; Peace, Ruth; Mieras, Jamie; Thoms, Tanya; Freeman, Denise; Page, Jeffrey
2011-01-01
Goniometric measurement is currently being used as a diagnostic and outcomes assessment tool for ankle joint dorsiflexion. Despite its common use, its interrater and intrarater reliability have been questioned. This is a prospective study examining whether the experience of the examiner or the technique used affects the interrater and intrarater reliability of measuring ankle joint dorsiflexion. Fourteen asymptomatic individuals (8 male and 6 female) with a mean age of 28.2 years (range, 23-52) were enrolled in this study. The years of clinical experience of the five examiners averaged 10.4 years (range, 0-26). Four examiners used a modified Root, Weed and Orien method of measuring ankle joint dorsiflexion. The fifth examiner utilized a nonstandardized technique. A standard goniometer was used for bilateral measurements of ankle joint dorsiflexion with the knee extended and flexed. All five examiners repeated each measurement three times during each of the three sessions, with each session spaced at least 1 week apart. The intraclass correlation coefficient reveals moderate intrarater and poor interrater reliability in ankle joint dorsiflexion measurements using a standard goniometer. More importantly, further analysis indicates that the use of a standardized technique for measurement of ankle joint dorsiflexion, or years of clinical experience, does not increase the intrarater or interrater reliability. The utility of goniometric measurement of ankle joint dorsiflexion may be limited.
Estimation of the KR20 Reliability Coefficient When Data Are Incomplete.
ERIC Educational Resources Information Center
Huynh, Huynh
Three techniques for estimating Kuder-Richardson reliability (KR20) coefficients for incomplete data are contrasted. The methods are: (1) Henderson's Method 1 (analysis of variance, or ANOVA); (2) Henderson's Method 3 (FITCO); and (3) Koch's method of symmetric sums (SYSUM). A Monte Carlo simulation was used to assess the precision of the three…
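For reference, the complete-data KR20 baseline that the three estimators above generalize to incomplete data looks like this; the item matrix is synthetic, and the ANOVA/FITCO/SYSUM missing-data machinery is not reproduced.

```python
import numpy as np

def kr20(items: np.ndarray) -> float:
    """Kuder-Richardson 20 for dichotomous (0/1) item scores.
    items: (n_examinees, k_items), complete data."""
    k = items.shape[1]
    p = items.mean(axis=0)                  # item difficulties
    q = 1.0 - p
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

rng = np.random.default_rng(5)
ability = rng.normal(size=200)
items = (ability[:, None] + rng.normal(0, 1, size=(200, 12)) > 0).astype(float)
print(kr20(items))
```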
RADC thermal guide for reliability engineers
NASA Astrophysics Data System (ADS)
Morrison, G. N.; Kallis, J. M.; Strattan, L. A.; Jones, I. R.; Lena, A. L.
1982-06-01
This guide was developed to provide a reliability engineer, who is not proficient in thermal design and analysis techniques, with the tools for managing and evaluating the thermal design and production of electronic equipment. It defines the requirements and tasks that should be addressed in system equipment specifications and statements of work, and describes how to evaluate performance.
Application of a digital technique in evaluating the reliability of shade guides.
Cal, E; Sonugelen, M; Guneri, P; Kesercioglu, A; Kose, T
2004-05-01
There appears to be a need for a reliable method for quantification of tooth colour and analysis of shade. Therefore, the primary objective of this study was to show the applicability of graphic software in colour analysis, and the secondary objective was to investigate the reliability of commercial shade guides produced by the same manufacturer, using this digital technique. After confirming the reliability and reproducibility of the digital method by using self-assessed coloured images, three shade guides from the same manufacturer were photographed in daylight and in studio environments with a digital camera and saved in tagged image file format (TIFF). Colour analysis of each photograph was performed using the Adobe Photoshop 4.0 graphic program. Luminosity and red, green, blue (L and RGB) values of each shade tab of each shade guide were measured, and the data were subjected to statistical analysis using a repeated-measures ANOVA. The L and RGB values of the images taken in daylight differed significantly from those of the images taken in the studio environment (P < 0.05). In both environments, the luminosity and red values of the shade tabs were significantly different from each other (P < 0.05). It was concluded that, when the environmental conditions are kept constant, the Adobe Photoshop 4.0 colour analysis program can be used to analyse the colour of images. On the other hand, the results revealed that the accuracy of shade tabs widely used in colour matching should be readdressed.
Reliability Assessment for Low-cost Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Freeman, Paul Michael
Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.
NASA Astrophysics Data System (ADS)
Sayar, M.; Ogawa, K.; Shoji, T.
2008-02-01
Thermal barrier coatings (TBCs) have been widely used in gas turbine engines in order to protect the substrate metal alloy against high temperature and to enhance turbine efficiency. Currently, there are no reliable nondestructive techniques available to monitor TBC integrity over the lifetime of the coating. Hence, to detect top coat (TC) and thermally grown oxide (TGO) thicknesses, a microwave nondestructive technique that utilizes a rectangular waveguide was developed. The phase of the reflection coefficient at the interface of the TC and the waveguide varies for different TGO and TC thicknesses. Therefore, measuring the phase of the reflection coefficient enables us to accurately calculate these thicknesses. Finally, a theoretical analysis was used to evaluate the reliability of the experimental results.
ERIC Educational Resources Information Center
Games, Paul A.
1975-01-01
A brief introduction is presented on how multiple regression and linear model techniques can handle data analysis situations that most educators and psychologists think of as appropriate for analysis of variance. (Author/BJG)
A fuzzy set approach for reliability calculation of valve controlling electric actuators
NASA Astrophysics Data System (ADS)
Karmachev, D. P.; Yefremov, A. A.; Luneva, E. E.
2017-02-01
Oil and gas equipment, and electric actuators in particular, frequently perform in various operational modes and under dynamic environmental conditions. These factors affect equipment reliability measures in a vague, uncertain way. To eliminate the ambiguity, reliability model parameters can be defined as fuzzy numbers. We suggest a technique that allows constructing fundamental fuzzy-valued performance reliability measures based on an analysis of electric actuator failure data in accordance with the amount of work completed before the failure, instead of the failure time. This paper also provides a computation example of fuzzy-valued reliability and hazard rate functions, assuming a Kumaraswamy complementary Weibull geometric distribution as the lifetime (reliability) model for electric actuators.
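One common way to realize fuzzy-valued reliability measures is alpha-cut interval arithmetic over the lifetime model's parameters. The sketch below uses a plain Weibull reliability function with a triangular fuzzy scale parameter, a deliberate simplification of the paper's Kumaraswamy complementary Weibull geometric model; all numbers are hypothetical.

```python
import numpy as np

def weibull_reliability(w, eta, beta):
    """R(w) = exp(-(w/eta)^beta), with 'work completed' w as the life variable."""
    return np.exp(-((w / eta) ** beta))

def fuzzy_reliability(w, eta_tri, beta, alphas=np.linspace(0, 1, 6)):
    """Alpha-cut intervals of R(w) for a triangular fuzzy scale parameter
    eta = (low, mode, high). R is increasing in eta, so the cut endpoints
    map directly onto reliability interval endpoints."""
    lo, mode, hi = eta_tri
    cuts = []
    for a in alphas:
        eta_lo = lo + a * (mode - lo)
        eta_hi = hi - a * (hi - mode)
        cuts.append((a, weibull_reliability(w, eta_lo, beta),
                        weibull_reliability(w, eta_hi, beta)))
    return cuts

for a, r_lo, r_hi in fuzzy_reliability(w=800.0, eta_tri=(900, 1000, 1200), beta=1.8):
    print(f"alpha={a:.1f}: R in [{r_lo:.3f}, {r_hi:.3f}]")
```

At alpha = 1 the interval collapses to the crisp estimate; lower alpha levels widen it, expressing the vagueness in the operating-condition data.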
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages
Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440
Statistical methodology: II. Reliability and validity assessment in study design, Part B.
Karras, D J
1997-02-01
Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but they do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
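A minimal sketch of the hierarchical Bayesian idea, not the paper's model, is a Poisson-Gamma hierarchy over event rates from several data sources, sampled with Metropolis-within-Gibbs; the counts and exposures below are invented.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(6)

# Event counts x_i over exposures t_i from several data sources whose
# underlying rates differ (source-to-source variability).
x = np.array([2.0, 0.0, 5.0, 1.0, 3.0])
t = np.array([1000.0, 800.0, 1500.0, 600.0, 1200.0])

def log_hyper_post(a, b, lam):
    """log p(a, b | lambda): Gamma(a, rate=b) likelihood of the rates
    plus a vague prior that is flat on (log a, log b)."""
    if a <= 0 or b <= 0:
        return -np.inf
    loglik = np.sum(a * np.log(b) - gammaln(a) + (a - 1) * np.log(lam) - b * lam)
    return loglik - np.log(a) - np.log(b)

a, b = 1.0, 1000.0
keep = []
for it in range(20000):
    lam = rng.gamma(a + x, 1.0 / (b + t))          # conjugate Gibbs step for rates
    a_p = a * np.exp(0.1 * rng.standard_normal())  # log-space random walk on
    b_p = b * np.exp(0.1 * rng.standard_normal())  # the hyperparameters
    log_r = (log_hyper_post(a_p, b_p, lam) + np.log(a_p) + np.log(b_p)
             - log_hyper_post(a, b, lam) - np.log(a) - np.log(b))  # incl. Jacobians
    if np.log(rng.uniform()) < log_r:
        a, b = a_p, b_p
    if it > 5000:
        keep.append(lam)

post = np.array(keep)
print("posterior mean rates per unit exposure:", post.mean(axis=0))
```

The shared hyperparameters pool information across sources, which is how the hierarchy expresses source-to-source variability while still borrowing strength for sparse sources.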
Combined Bisulfite Restriction Analysis for brain tissue identification.
Samsuwan, Jarunya; Muangsub, Tachapol; Yanatatsaneejit, Pattamawadee; Mutirangura, Apiwat; Kitkumthorn, Nakarin
2018-05-01
According to the tissue-specific methylation database (doi: 10.1016/j.gene.2014.09.060), methylation at CpG locus cg03096975 in EML2 has been preliminarily proven to be specific to brain tissue. In this study, we enlarged the sample size and developed a technique for identifying brain tissue in aged samples. The Combined Bisulfite Restriction Analysis for EML2 (COBRA-EML2) technique was established and validated in various organ samples obtained from 108 autopsies. In addition, this technique was also tested for its reliability, the minimal DNA concentration detected, and its use in aged samples and in samples obtained from specific brain compartments and the spinal cord. COBRA-EML2 displayed 100% sensitivity and specificity for distinguishing brain tissue from other tissues, showed high reliability, was capable of detecting a minimal DNA concentration (0.015 ng/μl), and could be used for identifying brain tissue in aged samples. In summary, COBRA-EML2 is a technique to identify brain tissue. This analysis is useful in criminal cases since it can identify vital organ tissues from small samples acquired from crime scenes. The results from this analysis can be counted as a medical and forensic marker supporting criminal investigations, and as one of the pieces of evidence in court rulings. Copyright © 2018 Elsevier B.V. All rights reserved.
Estimation and enhancement of real-time software reliability through mutation analysis
NASA Technical Reports Server (NTRS)
Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.
1992-01-01
A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.
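The effect that correlated failures have on N-version reliability, the central concern above, can be sketched with a Gaussian-copula Monte Carlo; this illustrates the dependence effect generically and is not the paper's Petri-net model. All parameters are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

def nvp_unreliability(p_fail=0.01, rho=0.3, n_versions=3, n_sim=200_000):
    """Monte Carlo estimate of 2-out-of-3 N-version failure probability with
    correlated version failures induced by a Gaussian copula."""
    cov = np.full((n_versions, n_versions), rho) + (1 - rho) * np.eye(n_versions)
    z = rng.multivariate_normal(np.zeros(n_versions), cov, size=n_sim)
    fails = norm.cdf(z) < p_fail            # marginal failure prob preserved
    system_fails = fails.sum(axis=1) >= 2   # majority voter outvoted
    return system_fails.mean()

for rho in (0.0, 0.3, 0.8):
    print(f"rho={rho}: P(system failure) ~ {nvp_unreliability(rho=rho):.2e}")
```

Even modest correlation erodes most of the benefit of voting, which is why the abstract treats independence among N versions as "desirable but perhaps unachievable."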
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
ERIC Educational Resources Information Center
Koga, Nobuyoshi; Goshi, Yuri; Yoshikawa, Masahiro; Tatsuoka, Tomoyuki
2014-01-01
An undergraduate kinetic experiment of the thermal decomposition of solids by microscopic observation and thermal analysis was developed by investigating a suitable reaction, applicable techniques of thermal analysis and microscopic observation, and a reliable kinetic calculation method. The thermal decomposition of sodium hydrogen carbonate is…
NASA Astrophysics Data System (ADS)
McCrea, Terry
The Shuttle Processing Contract (SPC) workforce consists of Lockheed Space Operations Co. as prime contractor, with Grumman, Thiokol Corporation, and Johnson Controls World Services as subcontractors. During the design phase, reliability engineering is instrumental in influencing the development of systems that meet the Shuttle fail-safe program requirements. Reliability engineers accomplish this objective by performing FMEA (failure modes and effects analysis) to identify potential single failure points. When technology, time, or resources do not permit a redesign to eliminate a single failure point, the single failure point information is formatted into a change request and presented to senior management of SPC and NASA for risk acceptance. In parallel with the FMEA, safety engineering conducts a hazard analysis to assure that potential hazards to personnel are assessed. The combined effort (FMEA and hazard analysis) is published as a system assurance analysis. Special ground rules and techniques are developed to perform and present the analysis. The reliability program at KSC is vigorously pursued, and has been extremely successful. The ground support equipment and facilities used to launch and land the Space Shuttle maintain an excellent reliability record.
Highly reliable oxide VCSELs for datacom applications
NASA Astrophysics Data System (ADS)
Aeby, Ian; Collins, Doug; Gibson, Brian; Helms, Christopher J.; Hou, Hong Q.; Lou, Wenlin; Bossert, David J.; Wang, Charlie X.
2003-06-01
In this paper we describe the processes and procedures that have been developed to ensure high reliability for Emcore's 850 nm oxide-confined GaAs VCSELs. Evidence from ongoing accelerated life testing and other reliability studies confirming that this process yields reliable products is discussed. We present the data and analysis techniques used to determine the activation energy and acceleration factors for the dominant wear-out failure mechanisms of our devices, as well as our estimated MTTF of greater than 2 million use hours. We conclude with a summary of internal verification and field return rate validation data.
Ellis, Richard; Hing, Wayne; Dilley, Andrew; McNair, Peter
2008-08-01
Diagnostic ultrasound provides a technique whereby real-time, in vivo analysis of peripheral nerve movement is possible. This study measured sciatic nerve movement during a "slider" neural mobilisation technique (ankle dorsiflexion/plantar flexion and cervical extension/flexion). Transverse and longitudinal movement was assessed from still ultrasound images and video sequences by using frame-by-frame cross-correlation software. Sciatic nerve movement was recorded in the transverse and longitudinal planes. For transverse movement, at the posterior midthigh (PMT) the mean value of lateral sciatic nerve movement was 3.54 mm (standard error of measurement [SEM] +/- 1.18 mm) compared with anterior-posterior/vertical (AP) movement of 1.61 mm (SEM +/- 0.78 mm). At the popliteal crease (PC) scanning location, lateral movement was 6.62 mm (SEM +/- 1.10 mm) compared with AP movement of 3.26 mm (SEM +/- 0.99 mm). Mean longitudinal sciatic nerve movement at the PMT was 3.47 mm (SEM +/- 0.79 mm; n = 27) compared with the PC of 5.22 mm (SEM +/- 0.05 mm; n = 3). The reliability of ultrasound measurement of transverse sciatic nerve movement was fair to excellent (Intraclass correlation coefficient [ICC] = 0.39-0.76) compared with excellent (ICC = 0.75) for analysis of longitudinal movement. Diagnostic ultrasound presents a reliable, noninvasive, real-time, in vivo method for analysis of sciatic nerve movement.
Differential reliability : probabilistic engineering applied to wood members in bending-tension
Stanley K. Suddarth; Frank E. Woeste; William L. Galligan
1978-01-01
Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...
Neural Networks Based Approach to Enhance Space Hardware Reliability
NASA Technical Reports Server (NTRS)
Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.
2011-01-01
This paper demonstrates the use of Neural Networks as a device modeling tool to increase the reliability analysis accuracy of circuits targeted for space applications. The paper tackles a number of case studies of relevance to the design of Flight hardware. The results show that the proposed technique generates more accurate models than the ones regularly used to model circuits.
Jackson, Brian A; Faith, Kay Sullivan
2013-02-01
Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to using data from existing SNS assessment tools to estimate the likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
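The roll-up from component failure-mode estimates to a system-level performance figure might look like the following sketch; the functions, modes, and probabilities are entirely hypothetical placeholders for data that would come from SNS assessments.

```python
# Hedged sketch: rolling function-level failure-mode estimates up to a
# system-level reliability figure for a series of response functions.
failure_modes = {
    # function: [(mode, probability the mode occurs during a response)]
    "request/approval": [("paperwork delay", 0.05)],
    "transport":        [("routing failure", 0.02), ("vehicle loss", 0.005)],
    "warehouse ops":    [("staffing shortfall", 0.04)],
    "dispensing":       [("site not activated", 0.03), ("throughput gap", 0.06)],
}

def function_success(modes):
    p_ok = 1.0
    for _, p in modes:
        p_ok *= (1.0 - p)      # modes assumed independent within a function
    return p_ok

p_system = 1.0
for fn, modes in failure_modes.items():
    p_fn = function_success(modes)
    print(f"{fn:>18s}: P(success) = {p_fn:.3f}")
    p_system *= p_fn           # series system: every function must succeed

print(f"overall response reliability ~ {p_system:.3f}")
```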
A quantitative analysis of the F18 flight control system
NASA Technical Reports Server (NTRS)
Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann
1993-01-01
This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
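The effect of imperfect coverage is visible even in a toy duplex calculation. The sketch below is not the F18 digraph model; it is the standard two-channel case in which a single fault is survived only when it is detected and reconfigured around (probability c), with illustrative failure probabilities.

```python
# Illustrative duplex (two-channel) example of why coverage matters;
# this is not the F18 model, just the standard textbook calculation.
def duplex_failure_prob(q: float, c: float) -> float:
    """Mission failure probability for two redundant channels.

    q: probability a single channel fails during the mission
    c: coverage -- probability a single-channel fault is detected
       and successfully reconfigured around
    """
    # System fails if both channels fail, or if the first fault
    # is not covered (1 - c) and takes the system down.
    return q * q + 2.0 * q * (1.0 - q) * (1.0 - c)

q = 1e-3
for c in (1.0, 0.999, 0.99, 0.9):
    print(f"coverage {c:5.3f}: P(failure) = {duplex_failure_prob(q, c):.3e}")
# With perfect coverage P(failure) ~ 1e-6; at c = 0.99 the uncovered
# single-fault term dominates and P(failure) rises to ~2e-5.
```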
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, in a deterministic optimization problem, even though the structures are cost effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, including first-order and second-order reliability methods, followed by simulation techniques that are performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, implementations of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem are presented and discussed.
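A minimal sketch of the Monte Carlo step described above: sample a strength variable and a load variable and count exceedances. The lognormal and normal parameters are placeholders, not Kevlar 49 data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative stress-strength model: failure when load exceeds strength.
# The normal/lognormal parameters are placeholders, not Kevlar 49 data.
strength = rng.lognormal(mean=np.log(100.0), sigma=0.08, size=n)
load = rng.normal(loc=70.0, scale=8.0, size=n)

failures = np.count_nonzero(load >= strength)
pf = failures / n
se = np.sqrt(pf * (1.0 - pf) / n)  # binomial standard error of the estimate
print(f"P_f ~ {pf:.2e} +/- {se:.1e}")
```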
Software Fault Tolerance: A Tutorial
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2000-01-01
Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
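The recovery-block structure mentioned among the single-version techniques can be sketched in a few lines: run the primary version, check its output with an acceptance test, and fall back to an independently written alternate if the test fails. The square-root example below is a toy illustration, not from the tutorial.

```python
# Minimal recovery-block sketch: run the primary, check its result with
# an acceptance test, and fall back to an alternate on failure.
import math

def acceptance_test(x: float, result: float) -> bool:
    # The acceptance test should be cheap and independent of the versions:
    # here, squaring the returned root should recover the input.
    return abs(result * result - x) < 1e-6

def primary(x: float) -> float:
    return math.sqrt(x)          # preferred version

def alternate(x: float) -> float:
    # Simpler, independently written version (Newton iterations).
    r = x if x > 1.0 else 1.0
    for _ in range(40):
        r = 0.5 * (r + x / r)
    return r

def recovery_block(x: float) -> float:
    for version in (primary, alternate):
        result = version(x)
        if acceptance_test(x, result):
            return result        # first acceptable result wins
    raise RuntimeError("all versions failed the acceptance test")

print(recovery_block(2.0))
```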
TDRSS telecommunications system, PN code analysis
NASA Technical Reports Server (NTRS)
Dixon, R.; Gold, R.; Kaiser, F.
1976-01-01
The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
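For background, a direct-sequence PN chip stream of the kind analyzed here can be produced by a maximal-length linear feedback shift register. The sketch uses a 5-stage register (period 31) for brevity; actual TDRSS codes are far longer and are not reproduced here.

```python
# PN (m-sequence) sketch: a 5-stage Fibonacci LFSR with the primitive
# recurrence a_t = a_{t-3} XOR a_{t-5} (period 2^5 - 1 = 31). Real
# direct-sequence codes use much longer registers; this shows the shape.
def lfsr_bits(seed=(1, 0, 0, 0, 0)):
    reg = list(seed)                        # reg[0] is the oldest bit a_{t-5}
    while True:
        yield reg[0]
        reg = reg[1:] + [reg[0] ^ reg[2]]   # append a_t = a_{t-5} ^ a_{t-3}

gen = lfsr_bits()
chips = [next(gen) for _ in range(62)]
print("".join(map(str, chips[:31])))
assert chips[:31] == chips[31:62]           # sequence repeats with period 31
```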
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will start a series of notes concentrating on analysis techniques, with this issue's section discussing worst-case analysis requirements.
Critical evaluation of sample pretreatment techniques.
Hyötyläinen, Tuulia
2009-06-01
Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.
Design Strategy for a Formally Verified Reliable Computing Platform
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.
1991-01-01
This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis as well as the "correctness" models.
Validity and reliability of acoustic analysis of respiratory sounds in infants
Elphick, H; Lancaster, G; Solis, A; Majumdar, A; Gupta, R; Smyth, R
2004-01-01
Objective: To investigate the validity and reliability of computerised acoustic analysis in the detection of abnormal respiratory noises in infants. Methods: Blinded, prospective comparison of acoustic analysis with stethoscope examination. Validity and reliability of acoustic analysis were assessed by calculating the degree of observer agreement using the κ statistic with 95% confidence intervals (CI). Results: 102 infants under 18 months were recruited. Convergent validity for agreement between stethoscope examination and acoustic analysis was poor for wheeze (κ = 0.07 (95% CI, –0.13 to 0.26)) and rattles (κ = 0.11 (–0.05 to 0.27)) and fair for crackles (κ = 0.36 (0.18 to 0.54)). Both the stethoscope and acoustic analysis distinguished well between sounds (discriminant validity). Agreement between observers for the presence of wheeze was poor for both stethoscope examination and acoustic analysis. Agreement for rattles was moderate for the stethoscope but poor for acoustic analysis. Agreement for crackles was moderate using both techniques. Within-observer reliability for all sounds using acoustic analysis was moderate to good. Conclusions: The stethoscope is unreliable for assessing respiratory sounds in infants. This has important implications for its use as a diagnostic tool for lung disorders in infants, and confirms that it cannot be used as a gold standard. Because of the unreliability of the stethoscope, the validity of acoustic analysis could not be demonstrated, although it could discriminate between sounds well and showed good within-observer reliability. For acoustic analysis, targeted training and the development of computerised pattern recognition systems may improve reliability so that it can be used in clinical practice. PMID:15499065
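For readers unfamiliar with the κ statistic used throughout this abstract, the sketch below computes Cohen's kappa from a 2x2 agreement table; the counts are illustrative, not the study's data.

```python
import numpy as np

# Cohen's kappa from a 2x2 agreement table (counts are illustrative):
# rows = stethoscope, columns = acoustic analysis.
table = np.array([[40, 12],
                  [10, 40]], dtype=float)

n = table.sum()
po = np.trace(table) / n                      # observed agreement
pe = (table.sum(1) @ table.sum(0)) / n ** 2   # agreement expected by chance
kappa = (po - pe) / (1.0 - pe)
print(f"observed {po:.2f}, chance {pe:.2f}, kappa {kappa:.2f}")
```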
Fascione, Jeanna M; Crews, Ryan T; Wrobel, James S
2012-01-01
Identifying the variability of footprint measurement collection techniques and the reliability of footprint measurements would assist with appropriate clinical foot posture appraisal. We sought to identify relationships between these measures in a healthy population. On 30 healthy participants, midgait dynamic footprint measurements were collected using an ink mat, paper pedography, and electronic pedography. The footprints were then digitized, and the following footprint indices were calculated with photo digital planimetry software: footprint index, arch index, truncated arch index, Chippaux-Smirak Index, and Staheli Index. Differences between techniques were identified with repeated-measures analysis of variance with post hoc test of Scheffe. In addition, to assess practical similarities between the different methods, intraclass correlation coefficients (ICCs) were calculated. To assess intrarater reliability, footprint indices were calculated twice on 10 randomly selected ink mat footprint measurements, and the ICC was calculated. Dynamic footprint measurements collected with an ink mat significantly differed from those collected with paper pedography (ICC, 0.85-0.96) and electronic pedography (ICC, 0.29-0.79), regardless of the practical similarities noted with ICC values (P = .00). Intrarater reliability for dynamic ink mat footprint measurements was high for the footprint index, arch index, truncated arch index, Chippaux-Smirak Index, and Staheli Index (ICC, 0.74-0.99). Footprint measurements collected with various techniques demonstrate differences. Interchangeable use of exact values without adjustment is not advised. Intrarater reliability of a single method (ink mat) was found to be high.
Evaluation methodologies for an advanced information processing system
NASA Technical Reports Server (NTRS)
Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.
1984-01-01
The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
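A minimal example of the Markov modeling approach used for the hardware reliability evaluations: a duplex with repair, solved transiently with a matrix exponential. The failure and repair rates are illustrative, not AIPS values.

```python
import numpy as np
from scipy.linalg import expm

# Continuous-time Markov reliability sketch for a duplex with repair:
# states 0 = both units up, 1 = one up / one in repair,
# 2 = system failed (absorbing). Rates are illustrative per-hour values.
lam, mu = 1e-4, 1e-1   # failure rate per unit, repair rate

# Generator matrix Q: Q[i, j] is the transition rate from state i to j.
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [mu, -(mu + lam), lam],
              [0.0, 0.0, 0.0]])

p0 = np.array([1.0, 0.0, 0.0])  # start with both units working
for t in (10.0, 100.0, 1000.0):
    p = p0 @ expm(Q * t)
    print(f"t = {t:6.0f} h: P(system failed) = {p[2]:.3e}")
```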
Capacitor Technologies, Applications and Reliability
NASA Technical Reports Server (NTRS)
1981-01-01
Various aspects of capacitor technologies and applications are discussed. Major emphasis is placed on: the causes of failures; accelerated testing; screening tests; destructive physical analysis; applications techniques; and improvements in capacitor capabilities.
Statistical summaries of fatigue data for design purposes
NASA Technical Reports Server (NTRS)
Wirsching, P. H.
1983-01-01
Two methods are discussed for constructing a design curve on the safe side of fatigue data. Both the tolerance interval and equivalent prediction interval (EPI) concepts provide such a curve while accounting for both the distribution of the estimators in small samples and the data scatter. The EPI is also useful as a mechanism for providing necessary statistics on S-N data for a full reliability analysis which includes uncertainty in all fatigue design factors. Examples of statistical analyses of the general strain life relationship are presented. The tolerance limit and EPI techniques for defining a design curve are demonstrated. Examples using WASPALOY B and RQC-100 data demonstrate that a reliability model could be constructed by considering the fatigue strength and fatigue ductility coefficients as two independent random variables. A technique given for establishing the fatigue strength for high cycle lives relies on an extrapolation technique and also accounts for "runners." A reliability model or design value can be specified.
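A one-sided lower tolerance bound of the kind used for such design curves can be computed exactly from the noncentral t distribution. The sketch below uses synthetic log-life data rather than the WASPALOY B or RQC-100 samples.

```python
import numpy as np
from scipy.stats import norm, nct

# One-sided lower tolerance bound sketch: a "design curve" value below
# which a proportion p of the population lies with confidence gamma.
# The fatigue-life sample below is synthetic, not WASPALOY B or RQC-100.
rng = np.random.default_rng(1)
log_life = rng.normal(loc=5.0, scale=0.15, size=15)   # log10 cycles

n, p, gamma = log_life.size, 0.99, 0.95
delta = norm.ppf(p) * np.sqrt(n)                      # noncentrality parameter
k = nct.ppf(gamma, df=n - 1, nc=delta) / np.sqrt(n)   # tolerance factor
bound = log_life.mean() - k * log_life.std(ddof=1)
print(f"k = {k:.3f}, design value = 10^{bound:.3f} cycles")
```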
NASA Astrophysics Data System (ADS)
Doležel, Jiří; Novák, Drahomír; Petrů, Jan
2017-09-01
Transportation routes for oversize and excessive loads are currently planned to ensure the transit of a vehicle through critical points on the road. Critical points are level intersections of roads, bridges, etc. This article presents a comprehensive procedure to determine the reliability and load-bearing capacity level of existing bridges on highways and roads using advanced methods of reliability analysis based on Monte Carlo-type simulation techniques in combination with nonlinear finite element method analysis. The safety index is considered the main criterion of the reliability level of existing structures and is described in current structural design standards, e.g., ISO and the Eurocodes. As an example, the load-bearing capacity of a 60-year-old single-span slab bridge made of precast prestressed concrete girders is determined for the ultimate limit state and the serviceability limit state. The structure's design load capacity was estimated by fully probabilistic nonlinear FEM analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with the load-bearing capacity levels estimated by deterministic methods for a critical section of the most heavily loaded girders.
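Latin Hypercube Sampling itself is compact enough to sketch: stratify the unit interval for each variable, shuffle the strata independently, and map through each variable's inverse CDF. The distributions below are illustrative stand-ins for the bridge's random variables.

```python
import numpy as np
from scipy.stats import norm, lognorm

# Minimal Latin Hypercube Sampling sketch: one sample per stratum of
# [0, 1] per variable, strata shuffled independently, then mapped through
# inverse CDFs. Distributions are illustrative, not the paper's inputs.
rng = np.random.default_rng(42)
n = 100

def lhs_uniform(n: int, dims: int) -> np.ndarray:
    u = (np.arange(n)[:, None] + rng.random((n, dims))) / n
    for d in range(dims):
        rng.shuffle(u[:, d])          # decouple the variables
    return u

u = lhs_uniform(n, 2)
concrete_strength = lognorm.ppf(u[:, 0], s=0.12, scale=38.0)  # MPa
load = norm.ppf(u[:, 1], loc=220.0, scale=25.0)               # kN
print(concrete_strength[:5], load[:5])
```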
Applicability and Limitations of Reliability Allocation Methods
NASA Technical Reports Server (NTRS)
Cruz, Jose A.
2016-01-01
The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation process is often performed at different stages of system design. The allocation process often begins at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers. Applying reliability allocation techniques without understanding their limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
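The simplest weighting-factor method apportions a system failure-rate requirement among series components in proportion to their predicted failure rates (the ARINC approach). A sketch with made-up numbers:

```python
# Weighting-factor (ARINC-style) allocation sketch: apportion a system
# failure-rate requirement among series components in proportion to
# their predicted failure rates. Numbers are illustrative.
predicted = {"sensor": 40e-6, "processor": 25e-6, "actuator": 60e-6}

lambda_req = 50e-6                      # required system failure rate (per hour)
total = sum(predicted.values())

for name, lam in predicted.items():
    w = lam / total                     # weighting factor
    print(f"{name:10s} weight {w:.3f} -> allocated {w * lambda_req:.2e}/h")
# The allocations sum to the system requirement by construction.
```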
Developments in Cylindrical Shell Stability Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Starnes, James H., Jr.
1998-01-01
Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.
Total systems design analysis of high performance structures
NASA Technical Reports Server (NTRS)
Verderaime, V.
1993-01-01
Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, benefits, and limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.
Advanced techniques in reliability model representation and solution
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Nicol, David M.
1992-01-01
The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
Preliminary design of the redundant software experiment
NASA Technical Reports Server (NTRS)
Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John
1985-01-01
The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments which are similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized and the cost of the fault tolerant configurations can be used to design a companion experiment to determine the cost effectiveness of the fault tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.
Study of complete interconnect reliability for a GaAs MMIC power amplifier
NASA Astrophysics Data System (ADS)
Lin, Qian; Wu, Haifeng; Chen, Shan-ji; Jia, Guoqing; Jiang, Wei; Chen, Chao
2018-05-01
By combining finite element analysis (FEA) and artificial neural network (ANN) techniques, a complete prediction of interconnect reliability for a monolithic microwave integrated circuit (MMIC) power amplifier (PA) under both direct current (DC) and alternating current (AC) operating conditions is achieved effectively in this article. As an example, an MMIC PA is modelled to study the electromigration failure of interconnects. This is the first study of interconnect reliability for an MMIC PA under DC and AC operating conditions simultaneously. By training on the data from FEA, a high-accuracy ANN model for PA reliability is constructed. The reliability database obtained from the ANN model then provides important guidance for improving the reliability design of the IC.
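Electromigration life is commonly summarized with Black's equation, MTTF = A J^(-n) exp(Ea/kT). The sketch below evaluates it with illustrative constants (the prefactor, current-density exponent, and activation energy are not values from this paper) to show the temperature sensitivity that makes AC operating conditions matter.

```python
import numpy as np

# Electromigration mean time to failure via Black's equation,
# MTTF = A * J**-n * exp(Ea / (k * T)). The prefactor A, exponent n,
# and activation energy Ea are illustrative, not fitted to this PA.
K_BOLTZ = 8.617e-5            # Boltzmann constant, eV/K

def black_mttf(j_amps_per_cm2, temp_k, a=1e3, n=2.0, ea_ev=0.7):
    return a * j_amps_per_cm2 ** (-n) * np.exp(ea_ev / (K_BOLTZ * temp_k))

for t in (300.0, 350.0, 400.0):
    print(f"T = {t:.0f} K: MTTF ~ {black_mttf(5e5, t):.2e} h")
# Higher interconnect temperature under RF (AC) drive sharply reduces
# MTTF, which is why DC-only analysis can be optimistic.
```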
Liquid-propellant rocket engines health-monitoring—a survey
NASA Astrophysics Data System (ADS)
Wu, Jianjun
2005-02-01
This paper gives a summary of health-monitoring technology, one of the key technologies both for improving and enhancing the reliability and safety of current rocket engines and for developing new-generation, highly reliable, reusable rocket engines. The implications of health-monitoring and the fundamental principles underlying fault detection and diagnostics are elucidated. The main aspects of health-monitoring, such as system frameworks, failure modes analysis, fault detection and diagnosis algorithms, control means, and advanced sensor techniques, are illustrated in some detail. Finally, the evolution trends of health-monitoring techniques for liquid-propellant rocket engines are set out.
Data mining-based coefficient of influence factors optimization of test paper reliability
NASA Astrophysics Data System (ADS)
Xu, Peiyao; Jiang, Huiping; Wei, Jieyao
2018-05-01
Testing is a significant part of the teaching process; it demonstrates the final outcome of school teaching through teachers' teaching level and students' scores. The analysis of a test paper is a complex operation with non-linear relations among the length of the paper, the time duration, and the degree of difficulty. It is therefore difficult, with general methods, to optimize the coefficients of the influence factors under different conditions so as to obtain test papers with clearly higher reliability [1]. With data mining techniques such as Support Vector Regression (SVR) and Genetic Algorithms (GA), we can model test paper analysis and optimize the coefficients of the impact factors for higher reliability. The test results show that the combination of SVR and GA achieves an effective advance in reliability. The optimized coefficients of the influence factors are practical in actual applications, and the whole optimizing operation can offer a model basis for test paper analysis.
Interrater reliability of videotaped observational gait-analysis assessments.
Eastlack, M E; Arvidson, J; Snyder-Mackler, L; Danoff, J V; McGarvey, C L
1991-06-01
The purpose of this study was to determine the interrater reliability of videotaped observational gait-analysis (VOGA) assessments. Fifty-four licensed physical therapists with varying amounts of clinical experience served as raters. Three patients with rheumatoid arthritis who demonstrated an abnormal gait pattern served as subjects for the videotape. The raters analyzed each patient's most severely involved knee during the four subphases of stance for the kinematic variables of knee flexion and genu valgum. Raters were asked to determine whether these variables were inadequate, normal, or excessive. The temporospatial variables analyzed throughout the entire gait cycle were cadence, step length, stride length, stance time, and step width. Generalized kappa coefficients ranged from .11 to .52. Intraclass correlation coefficients (2,1) and (3,1) were slightly higher. Our results indicate that physical therapists' VOGA assessments are only slightly to moderately reliable and that improved interrater reliability of the assessments of physical therapists utilizing this technique is needed. Our data suggest that there is a need for greater standardization of gait-analysis training.
Application of near-infrared spectroscopy to preservative-treated wood
Chi-Leung So; Stan T. Lebow; Thomas L. Eberhardt; Leslie H. Groom; Todd F. Shupe
2009-01-01
Near infrared (NIR) spectroscopy is now a widely-used technique in the field of forest products, especially for physical and mechanical property determinations. This technique is also ideal for the chemical analysis of wood. There has been a growing need to find a rapid, inexpensive and reliable method to distinguish between preservative-treated and untreated waste...
ERIC Educational Resources Information Center
Blanchard, Alexia; Kraif, Olivier; Ponton, Claude
2009-01-01
This paper presents a "didactic triangulation" strategy to cope with the problem of reliability of NLP applications for computer-assisted language learning (CALL) systems. It is based on the implementation of basic but well mastered NLP techniques and puts the emphasis on an adapted gearing between computable linguistic clues and didactic features…
A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.
Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin
2018-05-01
Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using three methods, simultaneously and independently by two raters. The analysis of interrater reliability and viability was performed using the intraclass correlation coefficient and Bland Altman Plot Analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis were 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland Altman Plot Analysis showed agreement between the comparisons of the methods and the presence of systematic bias was not observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently from the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flaps' viability.
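A Bland-Altman comparison like the one reported reduces to a few lines: compute the pairwise differences, their mean (bias), and the 95% limits of agreement. The data below are synthetic viability percentages, not the study's measurements.

```python
import numpy as np

# Bland-Altman sketch: bias and 95% limits of agreement between two
# methods measuring the same quantity. Data are synthetic percentages
# of viable tissue, not the study's measurements.
rng = np.random.default_rng(7)
truth = rng.uniform(40, 90, size=16)
method_a = truth + rng.normal(0.0, 1.5, size=16)   # e.g., template area
method_b = truth + rng.normal(0.5, 1.5, size=16)   # e.g., photo analysis

diff = method_a - method_b
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias {bias:+.2f}; 95% limits of agreement "
      f"{bias - loa:+.2f} to {bias + loa:+.2f}")
# Systematic bias would show up as a mean difference bounded away from
# zero relative to the limits of agreement.
```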
NASA Astrophysics Data System (ADS)
Seyfpour, M.; Ghanei, S.; Mazinani, M.; Kashefi, M.; Davis, C.
2018-04-01
The recovery process in steel is usually investigated by conventional destructive tests that are expensive, time-consuming and also cumbersome. In this study, an alternative non-destructive test technique (based on eddy current testing) is used to characterise the recovery process during annealing of cold-rolled low-carbon steels. For assessing the reliability of eddy current results corresponding to different levels of recovery, X-ray line broadening analysis is also employed. It is shown that there is a strong relationship between eddy current outputs and the extent to which recovery occurs at different annealing temperatures. Accordingly, the non-destructive eddy current test technique represents the potential to be used as a reliable process for detection of the occurrence of recovery in the steel microstructure.
ERIC Educational Resources Information Center
Osborne, Jason W.; Fitzpatrick, David C.
2012-01-01
Exploratory Factor Analysis (EFA) is a powerful and commonly-used tool for investigating the underlying variable structure of a psychometric instrument. However, there is much controversy in the social sciences with regard to the techniques used in EFA (Ford, MacCallum, & Tait, 1986; Henson & Roberts, 2006) and the reliability of the outcome.…
Interactive Image Analysis System Design
1982-12-01
This report describes a design for an interactive image analysis system (IIAS), which implements terrain data extraction techniques. The design employs commercially available, state-of-the-art minicomputers and image display devices with proven software to achieve a cost-effective, reliable image analysis system. Additionally, the system is fully capable of supporting many generic types of image analysis and data processing, and is modularly...
Consistent detection and identification of individuals in a large camera network
NASA Astrophysics Data System (ADS)
Colombo, Alberto; Leung, Valerie; Orwell, James; Velastin, Sergio A.
2007-10-01
In the wake of an increasing number of terrorist attacks, counter-terrorism measures are now a main focus of many research programmes. An important issue for the police is the ability to track individuals and groups reliably through underground stations, and in the case of post-event analysis, to be able to ascertain whether specific individuals have been at the station previously. While there exist many motion detection and tracking algorithms, their reliable deployment in a large network is still an area of ongoing research. Specifically, to track individuals through multiple views, on multiple levels and between levels, consistent detection and labelling of individuals is crucial. In view of these issues, we have developed a change detection algorithm to work reliably in the presence of periodic movements, e.g. escalators and scrolling advertisements, as well as a content-based retrieval technique for identification. The change detection technique automatically extracts periodically varying elements in the scene using Fourier analysis, and constructs a Markov model for the process. Training is performed online, and no manual intervention is required, making this system suitable for deployment in large networks. Experiments on real data show significant improvement over existing techniques. The content-based retrieval technique uses MPEG-7 descriptors to identify individuals. Given the environment under which the system operates, i.e. at relatively low resolution, this approach is suitable for short timescales. For longer timescales, other forms of identification such as gait, or if the resolution allows, face recognition, will be required.
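The periodicity-extraction idea can be illustrated on a single pixel's intensity time series: if one non-DC Fourier component dominates the AC power, the pixel is likely periodic background. This toy score is an assumption-laden stand-in for the paper's method, not a reproduction of it.

```python
import numpy as np

# Toy version of the periodicity-extraction idea: for each pixel's
# intensity time series, compare the strongest non-DC Fourier component
# with the total AC energy; pixels dominated by one frequency (e.g. an
# escalator step cycle) are flagged as periodic background.
rng = np.random.default_rng(0)
T = 256                                   # frames
t = np.arange(T)

periodic_pixel = 100 + 20 * np.sin(2 * np.pi * t / 16) + rng.normal(0, 2, T)
noisy_pixel = 100 + rng.normal(0, 8, T)

def periodicity_score(series: np.ndarray) -> float:
    spec = np.abs(np.fft.rfft(series - series.mean())) ** 2
    return spec[1:].max() / spec[1:].sum()   # fraction of AC power in one bin

for name, px in (("periodic", periodic_pixel), ("noisy", noisy_pixel)):
    print(f"{name:8s} pixel score: {periodicity_score(px):.2f}")
# A threshold on this score would mark the pixel as periodically varying,
# which could then seed a Markov model of its states, as in the paper.
```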
Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
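For context on why correct TMR insertion matters, the standard majority-voting reliability formula is easy to evaluate: with a perfect voter, a triplet survives when at least two modules work, so R_TMR = 3R^2 - 2R^3.

```python
# Why TMR helps (and when it stops helping): with a perfect majority
# voter, a TMR triplet survives as long as at least two of the three
# modules work, so R_tmr = 3R^2 - 2R^3.
def r_tmr(r_module: float) -> float:
    return 3.0 * r_module ** 2 - 2.0 * r_module ** 3

for r in (0.999, 0.99, 0.9, 0.5):
    print(f"R_module = {r:5.3f}: simplex {r:.6f} vs TMR {r_tmr(r):.6f}")
# TMR beats a single module only while R_module > 0.5 -- one reason
# verifying that the TMR topology was inserted correctly matters:
# a corrupted voter can make the "protected" design worse than simplex.
```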
Mahato, Niladri K; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian
2016-05-18
Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three-dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with T1 and fast contrast-enhanced pulse sequences. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences. Specifically, the average CVs for translation were 4.31% and 5.26% for the two pulse sequences, respectively, while the ICCs were 0.99 for both. For rotation measures, the CVs were 3.19% and 2.44% for the two pulse sequences, with the ICCs being 0.98 and 0.97, respectively. A novel biplanar imaging approach also yielded high reliability, with mean CVs of 2.66% and 3.39% for translation in the x- and z-planes, respectively, and ICCs of 0.97 in both planes. This work provides basic proof-of-concept for a reliable marker-less non-ionizing-radiation-based quasi-dynamic motion quantification technique that can potentially be developed into a tool for real-time joint kinematics analysis.
NASA Astrophysics Data System (ADS)
Zhou, Xiang
Using an innovative portable holographic inspection and testing system (PHITS) developed at the Australian Defence Force Academy, fatigue cracks in riveted lap joints can be detected by visually inspecting the abnormal fringe changes recorded on holographic interferograms. In this thesis, for automatic crack detection, some modern digital image processing techniques are investigated and applied to holographic interferogram evaluation. Fringe analysis algorithms are developed for identification of the crack-induced fringe changes. Theoretical analysis of PHITS and riveted lap joints and two typical experiments demonstrate that fatigue cracks in lightly-clamped joints induce two characteristic fringe changes: local fringe discontinuities at the cracking sites, and a global crescent fringe distribution near the edge of the rivet hole. Both of the fringe features are used for crack detection in this thesis. As a basis of the fringe feature extraction, an algorithm for local fringe orientation calculation is proposed. For high orientation accuracy and computational efficiency, Gaussian gradient filtering and neighboring direction averaging are used to minimize the effects of image background variations and random noise. The neighboring direction averaging is also used to approximate the fringe directions in the centerlines of bright and dark fringes. Experimental results indicate that for high orientation accuracy the scales of the Gaussian filter and neighboring direction averaging should be chosen according to the local fringe spacings. The orientation histogram technique is applied to detect the local fringe discontinuities due to the fatigue cracks. The Fourier descriptor technique is used to characterize the global fringe distribution change from a circular to a crescent distribution with fatigue crack growth. Experiments and computer simulations are conducted to analyze the detectability and reliability of crack detection using the two techniques. Results demonstrate that the Fourier descriptor technique is more promising for the detection of short cracks near the edge of the rivet head. However, it is not as reliable as the fringe orientation technique for detection of long through cracks. For reliability, both techniques should be used in practical crack detection. Neither the Fourier descriptor technique nor the orientation histogram technique has been previously applied to holographic interferometry. While this work related primarily to interferograms of cracked rivets, the techniques could be readily applied to other areas of fringe pattern analysis.
Tokuda, Junichi; Mamata, Hatsuho; Gill, Ritu R; Hata, Nobuhiko; Kikinis, Ron; Padera, Robert F; Lenkinski, Robert E; Sugarbaker, David J; Hatabu, Hiroto
2011-04-01
To investigate the impact of nonrigid motion correction on pixel-wise pharmacokinetic analysis of free-breathing DCE-MRI in patients with solitary pulmonary nodules (SPNs). Misalignment of focal lesions due to respiratory motion in free-breathing dynamic contrast-enhanced MRI (DCE-MRI) precludes obtaining reliable time-intensity curves, which are crucial for pharmacokinetic analysis for tissue characterization. Single-slice 2D DCE-MRI was obtained in 15 patients. Misalignments of SPNs were corrected using nonrigid B-spline image registration. Pixel-wise pharmacokinetic parameters K(trans), v(e), and k(ep) were estimated from both the original and motion-corrected DCE-MRI by fitting the two-compartment pharmacokinetic model to the time-intensity curve obtained in each pixel. The "goodness-of-fit" was tested with a χ²-test on a pixel-by-pixel basis to evaluate the reliability of the parameters. The percentages of reliable pixels within the SPNs were compared between the original and motion-corrected DCE-MRI. In addition, the parameters obtained from benign and malignant SPNs were compared. The percentage of reliable pixels in the motion-corrected DCE-MRI was significantly larger than in the original DCE-MRI (P = 4 × 10⁻⁷). Both K(trans) and k(ep) derived from the motion-corrected DCE-MRI showed significant differences between benign and malignant SPNs (P = 0.024, 0.015). The study demonstrated the impact of the nonrigid motion correction technique on pixel-wise pharmacokinetic analysis of free-breathing DCE-MRI in SPNs.
Goldfarb, Charles A; Strauss, Nicole L; Wall, Lindley B; Calfee, Ryan P
2011-02-01
The measurement technique for ulnar variance in the adolescent population has not been well established. The purpose of this study was to assess the reliability of a standard ulnar variance assessment in the adolescent population. Four orthopedic surgeons measured 138 adolescent wrist radiographs for ulnar variance using a standard technique. There were 62 male and 76 female radiographs obtained in a standardized fashion for subjects aged 12 to 18 years. Skeletal age was used for analysis. We determined mean variance and assessed for differences related to age and gender. We also determined the interrater reliability. The mean variance was -0.7 mm for boys and -0.4 mm for girls; there was no significant difference between the 2 groups overall. When subdivided by age and gender, variance in the younger group (≤ 15 years of age) was significantly less negative for girls (boys, -0.8 mm; girls, -0.3 mm; p < .05). There was no significant difference between boys and girls in the older group. The greatest difference between any 2 raters was 1 mm; exact agreement was obtained in 72 subjects. Correlations between raters were high (r(p) 0.87-0.97 for boys and 0.82-0.96 for girls). Interrater reliability was excellent (Cronbach's alpha, 0.97-0.98). Standard assessment techniques for ulnar variance are reliable in the adolescent population. Open growth plates did not interfere with this assessment. Young adolescent boys demonstrated a greater degree of negative ulnar variance compared with young adolescent girls.
Site-specific landslide assessment in Alpine area using a reliable integrated monitoring system
NASA Astrophysics Data System (ADS)
Romeo, Saverio; Di Matteo, Lucio; Kieffer, Daniel Scott
2016-04-01
Rockfalls are one of the major causes of landslide fatalities around the world. The present work discusses the reliability of integrated monitoring of displacements in a rockfall within the Alpine region (Salzburg Land, Austria), taking into account also the effect of ongoing climate change. Because the frequency and magnitude of events that threaten human lives and infrastructure are unpredictable, it is frequently necessary to implement an efficient monitoring system. For this reason, during the last decades, integrated monitoring systems for unstable slopes have been widely developed and used (e.g., extensometers, cameras, remote sensing, etc.). In this framework, remote sensing techniques such as GBInSAR (Ground-Based Interferometric Synthetic Aperture Radar) have emerged as efficient and powerful tools for deformation monitoring. GBInSAR measurements can be used to build an early warning system from surface deformation parameters such as ground displacement or inverse velocity (for semi-empirical forecasting methods). In order to check the reliability of GBInSAR and to monitor the evolution of the landslide, it is very important to integrate different techniques. Indeed, a multi-instrumental approach is essential to investigate movements both at the surface and at depth, and the use of different monitoring techniques allows a cross analysis of the data, minimizing errors, checking data quality, and improving the monitoring system. During 2013, an intense and complete monitoring campaign was conducted on the Ingelsberg landslide. Analysis of both historical temperature series (HISTALP) recorded during the last century and those from local weather stations shows that autumn-winter, winter, and spring temperatures have clearly increased in the Bad Hofgastein area as well as in the Alpine region. As a consequence, in the last decades rockfall events have shifted from spring to summer due to warmer winters. It is interesting to point out that temperature values recorded in the valley and on the slope show a good relationship, indicating that the climatic monitoring is reliable. In addition, the landslide displacement monitoring is reliable as well: the comparison between displacements at depth from extensometers and at the surface from GBInSAR, referred to March-December 2013, shows a high reliability, as confirmed by the inter-rater reliability analysis (Pearson correlation coefficient higher than 0.9). In conclusion, the reliability of the monitoring system confirms that the data can be useful to improve the knowledge of rockfall kinematics and to develop an accurate early warning system useful for civil protection purposes.
NASA Astrophysics Data System (ADS)
Sait, Abdulrahman S.
This dissertation presents a reliable technique for monitoring the condition of rotating machinery by applying instantaneous angular speed (IAS) analysis. A new analysis of the effects of changes in the orientation of the line of action and the pressure angle of the resultant force acting on the gear tooth profile of a spur gear under different levels of tooth damage is utilized. The analysis and experimental work discussed in this dissertation provide a clear understanding of the effects of damage on the IAS by analyzing the digital signal output of a rotary incremental optical encoder. A comprehensive literature review of the state of knowledge in condition monitoring and fault diagnostics of rotating machinery, including gearbox systems, is presented. Progress and new developments over the past 30 years in failure detection techniques for rotating machinery, including engines, bearings, and gearboxes, are thoroughly reviewed. This work is limited to the analysis of a gear train system with gear tooth surface faults utilizing an angular motion analysis technique. Angular motion data were acquired using an incremental optical encoder. Results are compared to a vibration-based technique. The vibration data were acquired using an accelerometer. The signals were obtained and analyzed in the phase domain using signal averaging to determine the existence and position of faults in the gear train system. Forces between the mating teeth surfaces are analyzed and simulated to validate the influence of the presence of damage on the pressure angle and the IAS. National Instruments hardware is used and NI LabVIEW software code is developed for real-time, online condition monitoring systems and fault detection techniques. The sensitivity of optical encoders for gear fault detection is experimentally investigated by applying IAS analysis under different gear damage levels and different operating conditions. A reliable methodology is developed for selecting appropriate testing/operating conditions of a rotating system to generate an alarm system for damage detection.
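A minimal sketch of IAS extraction from an incremental encoder: each pulse marks a fixed angular increment, so speed is that increment divided by the measured inter-pulse time. The pulse timestamps below are simulated with a small once-per-revolution speed dip, loosely imitating the signature a localized tooth fault could produce; none of the numbers come from the dissertation.

```python
import numpy as np

# Instantaneous angular speed (IAS) sketch: with an incremental encoder,
# each pulse marks a fixed angular increment, so speed is that increment
# divided by the measured pulse interval. Timestamps here are simulated
# for a shaft at ~25 Hz with a small once-per-rev modulation.
PPR = 1024                                   # encoder pulses per revolution
d_theta = 2.0 * np.pi / PPR                  # angular step per pulse (rad)

k = np.arange(4 * PPR)                       # four revolutions of pulses
base_speed = 2.0 * np.pi * 25.0              # rad/s
mod = 1.0 + 0.02 * np.sin(2.0 * np.pi * k / PPR)   # 2% once-per-rev variation
t = np.cumsum(d_theta / (base_speed * mod))  # pulse arrival times (s)

ias = d_theta / np.diff(t)                   # rad/s between pulse pairs
print(f"mean speed {ias.mean() / (2 * np.pi):.2f} Hz, "
      f"peak-to-peak ripple {(ias.max() - ias.min()) / ias.mean():.3%}")
```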
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
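As a minimal illustration of the reliability analysis embedded in RBDO, the sketch below computes the reliability index and first-order failure probability for a linear limit state g = R - S with independent normal capacity and demand (illustrative moments), then cross-checks the result by Monte Carlo.

```python
import numpy as np
from scipy.stats import norm

# First-order reliability sketch for a linear limit state g = R - S with
# independent normal R (capacity) and S (demand); parameters illustrative.
mu_r, sig_r = 120.0, 10.0
mu_s, sig_s = 80.0, 15.0

beta = (mu_r - mu_s) / np.hypot(sig_r, sig_s)   # reliability index
pf = norm.cdf(-beta)                            # first-order failure probability
print(f"beta = {beta:.3f}, Pf = {pf:.3e}")

# Monte Carlo check (exact for this linear-normal case up to sampling noise).
rng = np.random.default_rng(3)
r = rng.normal(mu_r, sig_r, 2_000_000)
s = rng.normal(mu_s, sig_s, 2_000_000)
print(f"MC  Pf = {np.mean(r - s < 0):.3e}")
```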
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Consequently, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of this working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some different techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
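Kendall's rank correlation, one of the recommended trend techniques, is a one-liner with scipy; the monthly problem-report counts below are synthetic, chosen to include the zero-heavy pattern the handbook warns about.

```python
import numpy as np
from scipy.stats import kendalltau

# Kendall rank-correlation trend check on monthly problem-report counts
# (synthetic counts, including zeros, to mimic sparse failure data).
counts = np.array([5, 3, 4, 2, 0, 3, 1, 0, 2, 1, 0, 0])
months = np.arange(len(counts))

tau, p_value = kendalltau(months, counts)
print(f"tau = {tau:+.2f}, p = {p_value:.3f}")
# A significantly negative tau suggests a decreasing problem-report
# trend; ties from repeated zeros are handled by the tau-b statistic.
```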
Subject-level reliability analysis of fast fMRI with application to epilepsy.
Hao, Yongfu; Khoo, Hui Ming; von Ellenrieder, Nicolas; Gotman, Jean
2017-07-01
Recent studies have applied the new magnetic resonance encephalography (MREG) sequence to the study of interictal epileptic discharges (IEDs) in the electroencephalogram (EEG) of epileptic patients. However, there are no criteria to quantitatively evaluate different processing methods so that the new sequence can be used properly. We evaluated different processing steps of this new sequence under the common generalized linear model (GLM) framework by assessing the reliability of results. A bootstrap sampling technique was first used to generate multiple replicated data sets; a GLM with different processing steps was then applied to obtain activation maps, and the reliability of these maps was assessed. We applied our analysis to an event-related GLM related to IEDs. Higher reliability was achieved by using a GLM with a head-motion confound regressor with 24 components rather than the usual 6, with an autoregressive model of order 5, and with a canonical hemodynamic response function (HRF) rather than variable-latency or patient-specific HRFs. Comparison of activation with the IED field also favored the canonical HRF, consistent with the reliability analysis. The reliability analysis helps to optimize the processing methods for this fast fMRI sequence, in a context in which the ground truth of activation areas is unknown. Magn Reson Med 78:370-382, 2017.
Reliability analysis of the F-8 digital fly-by-wire system
NASA Technical Reports Server (NTRS)
Brock, L. D.; Goodman, H. A.
1981-01-01
The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems giving aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
ERIC Educational Resources Information Center
Silverman, Mitchell
Reported are the first phase activities of a longitudinal project designed to evaluate the effectiveness of Guided Group Interaction (GGI) technique as a meaningful approach in the field of corrections. The main findings relate to the establishment of reliability for the main components of the Revised Behavior Scores System developed to assess the…
Cohen, Wayne R; Hayes-Gill, Barrie
2014-06-01
To evaluate the performance of external electronic fetal heart rate and uterine contraction monitoring according to maternal body mass index. Secondary analysis of a prospective equivalence study. Three US urban teaching hospitals. Seventy-four parturients with a normal term pregnancy. The parent study assessed the performance of two methods of external fetal heart rate monitoring (abdominal fetal electrocardiogram and Doppler ultrasound) and of uterine contraction monitoring (electrohysterography and tocodynamometry) compared with internal monitoring with a fetal scalp electrode and an intrauterine pressure transducer. Reliability of the external techniques was assessed by the success rate and positive percent agreement with internal methods. Bland-Altman analysis determined accuracy. We analyzed data from that study according to maternal body mass index. We assessed the relationship between body mass index and monitor performance with linear regression, using body mass index as the independent variable and measures of reliability and accuracy as dependent variables. There was no significant association between maternal body mass index and any measure of reliability or accuracy for the abdominal fetal electrocardiogram. By contrast, the overall positive percent agreement for Doppler ultrasound declined (p = 0.042), and the root mean square error from the Bland-Altman analysis increased in the first stage (p = 0.029) with increasing body mass index. Uterine contraction recordings from electrohysterography and tocodynamometry showed no significant deterioration related to maternal body mass index. Accuracy and reliability of fetal heart rate monitoring using the abdominal fetal electrocardiogram were unaffected by maternal obesity, whereas the performance of ultrasound degraded directly with maternal size. Both electrohysterography and tocodynamometry were unperturbed by obesity.
A Reliability Estimation in Modeling Watershed Runoff With Uncertainties
NASA Astrophysics Data System (ADS)
Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.
1990-10-01
The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
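As a concrete illustration of the Monte Carlo option in such a framework, the sketch below propagates parameter uncertainty through a deliberately simple surrogate for a watershed model (the rational formula, standing in for HEC-1) and estimates an exceedance probability for peak discharge. All distributions, parameter values, and the capacity threshold are illustrative assumptions, not values from the study.

```python
# A minimal sketch of Monte Carlo reliability analysis of a hydrologic
# prediction: sample uncertain inputs, run the (surrogate) model, and
# express reliability as an exceedance probability.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000

# Uncertain inputs: runoff coefficient C, rainfall intensity i (in/hr).
# The distributions below are illustrative assumptions.
C = rng.normal(loc=0.45, scale=0.05, size=n).clip(0.1, 0.9)
i = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)
A = 640.0  # drainage area in acres, assumed known

Q = C * i * A  # peak discharge (cfs) from the rational formula

capacity = 800.0  # assumed channel capacity, cfs
p_exceed = np.mean(Q > capacity)
print(f"P(peak discharge > {capacity} cfs) ~ {p_exceed:.4f}")
```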
RELIABILITY AND VALIDITY OF SUBJECTIVE ASSESSMENT OF LUMBAR LORDOSIS IN CONVENTIONAL RADIOGRAPHY.
Ruhinda, E; Byanyima, R K; Mugerwa, H
2014-10-01
Reliability and validity studies of different lumbar curvature analysis and measurement techniques have been documented; however, there is limited literature on the reliability and validity of subjective visual analysis. Radiological assessment of the lumbar lordotic curve aids in the early diagnosis of conditions even before neurologic changes set in. To ascertain the level of reliability and validity of subjective assessment of lumbar lordosis in conventional radiography, a blinded, repeated-measures diagnostic test was carried out on lumbar spine x-ray radiographs at the Radiology Department of the Joint Clinical Research Centre (JCRC), Mengo, Kampala, Uganda. Seventy (70) lateral lumbar x-ray films were used for this study, obtained from the archive of the JCRC radiology department at Butikiro House, Mengo, Kampala. Poor observer agreement, both inter- and intra-observer, with kappa values of 0.16, was found. Inter-observer agreement was poorer than intra-observer agreement. Kappa values rose significantly when lumbar lordosis was clustered into four categories without grading each abnormality. The results confirm that subjective assessment of lumbar lordosis has low reliability and validity. Film quality has limited influence on observer reliability. This study further shows that fewer scale categories of lordosis abnormalities produce better observer reliability.
A Delay-Aware and Reliable Data Aggregation for Cyber-Physical Sensing
Zhang, Jinhuan; Long, Jun; Zhang, Chengyuan; Zhao, Guihu
2017-01-01
Physical information sensed by various sensors in a cyber-physical system should be collected for further operation. In many applications, data aggregation should take reliability and delay into consideration. To address these problems, a novel Tiered Structure Routing-based Delay-Aware and Reliable Data Aggregation scheme named TSR-DARDA for spherical physical objects is proposed. By dividing the spherical network constructed by dispersed sensor nodes into circular tiers with specifically designed widths and cells, TSR-DARDA tries to enable as many nodes as possible to transmit data simultaneously. In order to ensure transmission reliability, lost packets are retransmitted. Moreover, to minimize the latency while maintaining reliability for data collection, in-network aggregation and broadcast techniques are adopted to deal with the transmission between data collecting nodes in the outer layer and their parent data collecting nodes in the inner layer. Thus, the optimization problem is transformed to minimize the delay under reliability constraints by controlling the system parameters. To demonstrate the effectiveness of the proposed scheme, we have conducted extensive theoretical analysis and comparisons to evaluate the performance of TSR-DARDA. The analysis and simulations show that TSR-DARDA leads to lower delay with reliability satisfaction. PMID:28218668
Joint mobilization forces and therapist reliability in subjects with knee osteoarthritis
Tragord, Bradley S; Gill, Norman W; Silvernail, Jason L; Teyhen, Deydre S; Allison, Stephen C
2013-01-01
Objectives: This study determined biomechanical force parameters and reliability among clinicians performing knee joint mobilizations. Methods: Sixteen subjects with knee osteoarthritis and six therapists participated in the study. Forces were recorded using a capacitive-based pressure mat for three techniques at two grades of mobilization, each with two trials of 15 seconds. Dosage (force–time integral), amplitude, and frequency were also calculated. Analysis of variance was used to analyze grade differences, intraclass correlation coefficients determined reliability, and correlations assessed force associations with subject and rater variables. Results: Grade IV mobilizations produced higher mean forces (P<0.001) and higher dosage (P<0.001), while grade III produced higher maximum forces (P = 0.001). Grade III forces (Newtons) by technique (mean, maximum) were: extension 48, 81; flexion 41, 68; and medial glide 21, 34. Grade IV forces (Newtons) by technique (mean, maximum) were: extension 58, 78; flexion 44, 60; and medial glide 22, 30. Frequency (Hertz) ranged between 0.9–1.1 (grade III) and 1.4–1.6 (grade IV). Intra-clinician reliability was excellent (>0.90). Inter-clinician reliability was moderate for force and dosage, and poor for amplitude and frequency. Discussion: Force measurements were consistent with previously reported ranges and clinical constructs. Grade III and grade IV mobilizations can be distinguished from each other with differences for force and frequency being small, and dosage and amplitude being large. Intra-clinician reliability was excellent for all biomechanical parameters and inter-clinician reliability for dosage, the main variable of clinical interest, was moderate. This study quantified the applied forces among multiple clinicians, which may help determine optimal dosage and standardize care. PMID:24421632
NASA Technical Reports Server (NTRS)
Wild, Christian; Eckhardt, Dave
1987-01-01
The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding has caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics in order to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
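A minimal sketch of the functional principal component step described above, under the common simplification that each yearly hydrograph is discretized on a shared daily grid, so that functional PCA reduces to an eigen-decomposition of the sample covariance. The synthetic hydrographs and their variability structure are assumptions for illustration only.

```python
# Functional PCA on discretized yearly streamflow curves via SVD of the
# centered data matrix (rows = years, columns = days of year).
import numpy as np

rng = np.random.default_rng(2)
days = np.arange(365)
base = 50 + 30 * np.exp(-0.5 * ((days - 100) / 25.0) ** 2)  # spring peak
years = base + rng.normal(0, 5, size=(40, 365)) \
        + rng.normal(0, 15, size=(40, 1)) * np.exp(-0.5 * ((days - 100) / 25.0) ** 2)

centered = years - years.mean(axis=0)
# Right singular vectors are the functional principal components; squared
# singular values give the variance each component explains.
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by first 3 FPCs:", np.round(explained[:3], 3))
```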
Developing safety performance functions incorporating reliability-based risk measures.
Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek
2011-11-01
Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
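The sight-distance limit state lends itself to a compact first-order illustration. The sketch below computes a reliability index and probability of non-compliance P(nc) for g = supply − demand under the simplifying assumption of independent normal sight distances (for which the first-order result is exact); all means and standard deviations are invented for illustration and are not from the paper's database.

```python
# First-order reliability for the limit state g = S_supply - S_demand.
from scipy.stats import norm

mu_supply, sd_supply = 180.0, 20.0   # available sight distance, m (assumed)
mu_demand, sd_demand = 140.0, 25.0   # stopping sight distance, m (assumed)

mu_g = mu_supply - mu_demand
sd_g = (sd_supply**2 + sd_demand**2) ** 0.5

beta = mu_g / sd_g            # Hasofer-Lind reliability index
p_nc = norm.cdf(-beta)        # probability of non-compliance, P(nc)
print(f"beta = {beta:.2f}, P(nc) = {p_nc:.4f}")
```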
Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L
2009-05-01
To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG) derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle from 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.
NASA Technical Reports Server (NTRS)
Leveson, Nancy
1987-01-01
Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: look at what you do NOT want the software to do along with what you want it to do, and assume things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.
Marquez, M-E; Deglesne, P-A; Suarez, G; Romano, E
2011-04-01
The IgV(H) mutational status of B-cell chronic lymphocytic leukemia (B-CLL) is of prognostic value. Expression of ZAP-70 in B-CLL is a surrogate marker for IgV(H) unmutated (UM) status. As determination of IgV(H) mutational status involves a methodology currently unavailable to most clinical laboratories, it is important to have a reliable technique for estimating ZAP-70 in B-CLL. Flow cytometry (FC) is a convenient technique for this purpose. However, there is still no adequate method of data analysis that prevents the assignment of false positive or negative expression. We have modified the currently most accepted technique, which uses the ratio of the mean fluorescent index (MFI) of B-CLL to T cells. The MFI for parallel antibody isotype staining is subtracted from the ZAP-70 MFI of both B-CLL and T cells. We validated this technique by comparing the results obtained for ZAP-70 expression by FC with those obtained with quantitative PCR for the same patients. We applied the technique in a series of 53 patients. With this modification, a better correlation between ZAP-70 expression and IgV(H) UM status was obtained. Thus, the MFI ratio of B-CLL to T cells corrected by isotype is a reliable analysis technique for estimating ZAP-70 expression in B-CLL. © 2010 Blackwell Publishing Ltd.
Towards automatic Markov reliability modeling of computer architectures
NASA Technical Reports Server (NTRS)
Liceaga, C. A.; Siewiorek, D. P.
1986-01-01
The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes such as standby redundancy and repair, or renewal processes such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Therefore, model formulation is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
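For readers unfamiliar with the models ARM is intended to generate, the sketch below builds a three-state continuous-time Markov model of a duplex processor with repair and integrates the Kolmogorov forward equations to obtain mission unreliability. The architecture and rates are illustrative assumptions, not ARM output.

```python
# Three-state Markov reliability model: duplex processor with repair.
import numpy as np
from scipy.integrate import solve_ivp

lam = 1e-4   # per-processor failure rate (1/h), assumed
mu = 1e-1    # repair rate (1/h), assumed

# States: 0 = both processors up, 1 = one up / one in repair, 2 = failed.
Q = np.array([[-2*lam,      2*lam,  0.0],
              [    mu, -(mu+lam),   lam],
              [   0.0,       0.0,   0.0]])  # state 2 absorbing

def kolmogorov(t, p):
    return p @ Q  # forward equations dp/dt = p Q (row-vector convention)

t_mission = 1e4  # mission time, hours
sol = solve_ivp(kolmogorov, (0.0, t_mission), [1.0, 0.0, 0.0],
                t_eval=[t_mission], method="Radau", rtol=1e-10, atol=1e-12)
print(f"Unreliability at {t_mission:.0f} h: {sol.y[2, -1]:.3e}")
```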
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
Development and Validation of a Behavioral Screener for Preschool-Age Children
ERIC Educational Resources Information Center
DiStefano, Christine A.; Kamphaus, Randy W.
2007-01-01
The purpose of this study was to document the development of a short behavioral scale that could be used to assess preschoolers' behavior while still retaining adequate scale coverage, reliability, and validity. Factor analysis and item analysis techniques were applied to data from a nationally representative, normative database to create a…
Intra- and inter-rater reliability of digital image analysis for skin color measurement
Sommers, Marilyn; Beacham, Barbara; Baker, Rachel; Fargo, Jamison
2013-01-01
Background We determined the intra- and inter-rater reliability of data from digital image color analysis between an expert and novice analyst. Methods Following training, the expert and novice independently analyzed 210 randomly ordered images. Both analysts used Adobe® Photoshop lasso or color sampler tools based on the type of image file. After color correction with Pictocolor® in camera software, they recorded L*a*b* (L*=light/dark; a*=red/green; b*=yellow/blue) color values for all skin sites. We computed intra-rater and inter-rater agreement within anatomical region, color value (L*, a*, b*), and technique (lasso, color sampler) using a series of one-way intra-class correlation coefficients (ICCs). Results Results of ICCs for intra-rater agreement showed high levels of internal consistency reliability within each rater for the lasso technique (ICC ≥ 0.99) and somewhat lower, yet acceptable, level of agreement for the color sampler technique (ICC = 0.91 for expert, ICC = 0.81 for novice). Skin L*, skin b*, and labia L* values reached the highest level of agreement (ICC ≥ 0.92) and skin a*, labia b*, and vaginal wall b* were the lowest (ICC ≥ 0.64). Conclusion Data from novice analysts can achieve high levels of agreement with data from expert analysts with training and the use of a detailed, standard protocol. PMID:23551208
Intra- and inter-rater reliability of digital image analysis for skin color measurement.
Sommers, Marilyn; Beacham, Barbara; Baker, Rachel; Fargo, Jamison
2013-11-01
We determined the intra- and inter-rater reliability of data from digital image color analysis between an expert and novice analyst. Following training, the expert and novice independently analyzed 210 randomly ordered images. Both analysts used Adobe(®) Photoshop lasso or color sampler tools based on the type of image file. After color correction with Pictocolor(®) in camera software, they recorded L*a*b* (L*=light/dark; a*=red/green; b*=yellow/blue) color values for all skin sites. We computed intra-rater and inter-rater agreement within anatomical region, color value (L*, a*, b*), and technique (lasso, color sampler) using a series of one-way intra-class correlation coefficients (ICCs). Results of ICCs for intra-rater agreement showed high levels of internal consistency reliability within each rater for the lasso technique (ICC ≥ 0.99) and somewhat lower, yet acceptable, level of agreement for the color sampler technique (ICC = 0.91 for expert, ICC = 0.81 for novice). Skin L*, skin b*, and labia L* values reached the highest level of agreement (ICC ≥ 0.92) and skin a*, labia b*, and vaginal wall b* were the lowest (ICC ≥ 0.64). Data from novice analysts can achieve high levels of agreement with data from expert analysts with training and the use of a detailed, standard protocol. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
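A minimal sketch of the one-way intraclass correlation used in both reports above, computed from its ANOVA definition rather than a statistics package. The two-column ratings matrix (expert and novice analysts rating the same images) is synthetic.

```python
# One-way ICC(1,1) from between- and within-subject mean squares.
import numpy as np

rng = np.random.default_rng(0)
true_l = rng.uniform(30, 70, size=50)                          # "true" L* per image
ratings = np.column_stack([true_l + rng.normal(0, 1.5, 50),    # expert
                           true_l + rng.normal(0, 2.5, 50)])   # novice

n, k = ratings.shape
grand = ratings.mean()
ms_between = k * np.sum((ratings.mean(axis=1) - grand) ** 2) / (n - 1)
ms_within = np.sum((ratings - ratings.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))

icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"one-way ICC(1,1) = {icc1:.3f}")
```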
Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System
NASA Astrophysics Data System (ADS)
He, Qing; Li, Hong
Belt conveyors are among the most important devices for transporting bulk solid material over long distances. Dynamic analysis is key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. Studying dynamic properties is very important for improving efficiency and productivity and for guaranteeing safe, reliable, and stable conveyor operation. The dynamic research on large scale belt conveyors and its applications are discussed. The main research topics and the state of the art of dynamic research on belt conveyors are analyzed. Future work will focus on dynamic analysis, modeling, and simulation of the main components and the whole system, as well as nonlinear modeling, simulation, and vibration analysis of large scale conveyor systems.
Modeling 3-D objects with planar surfaces for prediction of electromagnetic scattering
NASA Technical Reports Server (NTRS)
Koch, M. B.; Beck, F. B.; Cockrell, C. R.
1992-01-01
Electromagnetic scattering analysis of objects at resonance is difficult because low frequency techniques are slow and computer intensive, and high frequency techniques may not be reliable. A new technique for predicting the electromagnetic backscatter from electrically conducting objects at resonance is studied. This technique is based on modeling three dimensional objects as a combination of flat plates where some of the plates are blocking the scattering from others. A cube is analyzed as a simple example. The preliminary results compare well with the Geometrical Theory of Diffraction and with measured data.
Computed tomography for non-destructive evaluation of composites: Applications and correlations
NASA Technical Reports Server (NTRS)
Goldberg, B.; Hediger, L.; Noel, E.
1985-01-01
The state-of-the-art fabrication techniques for composite materials are such that stringent, species-specific acceptance criteria must be generated to ensure product reliability. Non-destructive evaluation techniques including computed tomography (CT), X-ray radiography (RT), and ultrasonic scanning (UT) are investigated and compared to determine their applicability and limitations for graphite epoxy, carbon-carbon, and carbon-phenolic materials. While the techniques appear complementary, CT is shown to provide significant, heretofore unattainable data. Finally, a correlation of NDE techniques to destructive analysis is presented.
Selectivity/Specificity Improvement Strategies in Surface-Enhanced Raman Spectroscopy Analysis
Wang, Feng; Cao, Shiyu; Yan, Ruxia; Wang, Zewei; Wang, Dan; Yang, Haifeng
2017-01-01
Surface-enhanced Raman spectroscopy (SERS) is a powerful technique for the discrimination, identification, and potential quantification of certain compounds/organisms. However, its real application is challenging due to the multiple interference from the complicated detection matrix. Therefore, selective/specific detection is crucial for the real application of SERS technique. We summarize in this review five selective/specific detection techniques (chemical reaction, antibody, aptamer, molecularly imprinted polymers and microfluidics), which can be applied for the rapid and reliable selective/specific detection when coupled with SERS technique. PMID:29160798
Multimodal biophotonic workstation for live cell analysis.
Esseling, Michael; Kemper, Björn; Antkowiak, Maciej; Stevenson, David J; Chaudet, Lionel; Neil, Mark A A; French, Paul W; von Bally, Gert; Dholakia, Kishan; Denz, Cornelia
2012-01-01
A reliable description and quantification of the complex physiology and reactions of living cells requires a multimodal analysis with various measurement techniques. We have investigated the integration of different techniques into a biophotonic workstation that can provide biological researchers with these capabilities. The combination of a micromanipulation tool with three different imaging principles is accomplished in a single inverted microscope which makes the results from all the techniques directly comparable. Chinese Hamster Ovary (CHO) cells were manipulated by optical tweezers while the feedback was directly analyzed by fluorescence lifetime imaging, digital holographic microscopy and dynamic phase-contrast microscopy. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A probability-based approach for assessment of roadway safety hardware.
DOT National Transportation Integrated Search
2017-03-14
This report presents a general probability-based approach for assessment of roadway safety hardware (RSH). It was achieved using a reliability analysis method and computational techniques. With the development of high-fidelity finite element (FE) m...
Probabilistic analysis algorithm for UA slope software program.
DOT National Transportation Integrated Search
2013-12-01
A reliability-based computational algorithm for using a single row of equally spaced drilled shafts to stabilize an unstable slope has been developed in this research. The Monte-Carlo simulation (MCS) technique was used in the previously develop...
A Radial Basis Function Approach to Financial Time Series Analysis
1993-12-01
…including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the "data… …collection of practical techniques to address these issues for a modeling methodology: Radial Basis Function networks. These techniques include efficient… …methodology often then amounts to a careful consideration of the interplay between model complexity and reliability. These will be recurrent themes…
Composing, Analyzing and Validating Software Models
NASA Astrophysics Data System (ADS)
Sheldon, Frederick T.
1998-10-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Composing, Analyzing and Validating Software Models
NASA Technical Reports Server (NTRS)
Sheldon, Frederick T.
1998-01-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Practical issues of hyperspectral imaging analysis of solid dosage forms.
Amigo, José Manuel
2010-09-01
Hyperspectral imaging techniques have widely demonstrated their usefulness in different areas of interest in pharmaceutical research during the last decade. In particular, middle infrared, near infrared, and Raman methods have gained special relevance. This rapid increase has been promoted by the capability of hyperspectral techniques to provide robust and reliable chemical and spatial information on the distribution of components in pharmaceutical solid dosage forms. Furthermore, the valuable combination of hyperspectral imaging devices with adequate data processing techniques offers the perfect landscape for developing new methods for scanning and analyzing surfaces. Nevertheless, the instrumentation and subsequent data analysis are not exempt from issues that must be thoughtfully considered. This paper describes and discusses the main advantages and drawbacks of the measurements and data analysis of hyperspectral imaging techniques in the development of solid dosage forms.
A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability
Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.
2012-01-01
Recent research has seen intraindividual variability (IIV) become a useful technique to incorporate trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique prediction to several types of positive and negative outcomes (Ram, Rabbit, Stollery, & Nesselroade, 2005). One unanswered question regarding measuring intraindividual variability is its reliability and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that use of individual standard deviations in unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to underlying statistical assumptions allows their use as a basic research tool. PMID:22268793
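The simulation logic behind such a study can be illustrated compactly: give each simulated person a true ISD, generate two independent sets of trials, compute the observed ISD on each, and correlate. Population values, trial counts, and distributions below are illustrative assumptions.

```python
# Monte Carlo check of ISD reliability: correlate observed ISDs from two
# parallel "occasions" drawn from the same true per-person variability.
import numpy as np

rng = np.random.default_rng(5)
n_people, n_trials = 300, 20
true_isd = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n_people)

def observed_isd():
    trials = rng.normal(scale=true_isd[:, None], size=(n_people, n_trials))
    return trials.std(axis=1, ddof=1)

r = np.corrcoef(observed_isd(), observed_isd())[0, 1]
print(f"test-retest reliability of the ISD with {n_trials} trials: r = {r:.2f}")
```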
Zou, Ye; Ma, Gang
2014-06-04
Second derivative and Fourier self-deconvolution (FSD) are two commonly used techniques to resolve the overlapped component peaks from the often featureless amide I band in the Fourier transform infrared (FTIR) curve-fitting approach for protein secondary structural analysis. Yet, the reliability of these two techniques is greatly affected by the omnipresent water vapor in the atmosphere. Several criteria are currently in use as quality controls to ensure that the protein absorption spectrum is negligibly affected by water vapor interference. In this study, through a second derivative study of liquid water, we first argue that the previously established criteria cannot guarantee a reliable evaluation of water vapor interference, due to a phenomenon that we refer to as the sample's absorbance-dependent water vapor interference. Then, through a comparative study of protein and liquid water, we show that a protein absorption spectrum can still be significantly affected by water vapor interference even though it satisfies the established criteria. Finally, we propose using the comparison between the second derivative spectra of protein and liquid water as a new criterion to better evaluate water vapor interference for more reliable second derivative and FSD treatments of the protein amide I band.
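Second-derivative band narrowing of the kind discussed above is usually computed with Savitzky-Golay filtering. The sketch below applies it to a synthetic pair of overlapped Gaussian bands in the amide I region; the bands, window length, and polynomial order are illustrative choices, and component peaks appear as minima in the second derivative.

```python
# Savitzky-Golay second derivative of a synthetic amide I absorbance band.
import numpy as np
from scipy.signal import savgol_filter

wavenumber = np.linspace(1600, 1700, 501)  # cm^-1, amide I region
band = lambda c, w: np.exp(-0.5 * ((wavenumber - c) / w) ** 2)
absorbance = 0.8 * band(1635, 8) + 0.6 * band(1655, 8)  # overlapped components

# Window length and polynomial order trade noise suppression against band
# distortion; deriv=2 with delta gives d2A/d(wavenumber)2.
d2 = savgol_filter(absorbance, window_length=21, polyorder=3, deriv=2,
                   delta=wavenumber[1] - wavenumber[0])

# Component peaks appear as local minima of the second derivative.
minima = wavenumber[np.r_[False, (d2[1:-1] < d2[:-2]) & (d2[1:-1] < d2[2:]), False]]
print("candidate component positions (cm^-1):", np.round(minima, 1))
```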
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
Gligorijevic, Jovan; Gajic, Dragoljub; Brkovic, Aleksandar; Savic-Gajic, Ivana; Georgieva, Olga; Di Gennaro, Stefano
2016-03-01
The packaging materials industry has already recognized the importance of Total Productive Maintenance as a system of proactive techniques for improving equipment reliability. Bearing faults, which often occur gradually, represent one of the foremost causes of failures in the industry. Therefore, detection of their faults in an early stage is quite important to assure reliable and efficient operation. We present a new automated technique for early fault detection and diagnosis in rolling-element bearings based on vibration signal analysis. Following the wavelet decomposition of vibration signals into a few sub-bands of interest, the standard deviation of obtained wavelet coefficients is extracted as a representative feature. Then, the feature space dimension is optimally reduced to two using scatter matrices. In the reduced two-dimensional feature space the fault detection and diagnosis is carried out by quadratic classifiers. Accuracy of the technique has been tested on four classes of the recorded vibrations signals, i.e., normal, with the fault of inner race, outer race, and ball operation. The overall accuracy of 98.9% has been achieved. The new technique can be used to support maintenance decision-making processes and, thus, to increase reliability and efficiency in the industry by preventing unexpected faulty operation of bearings.
Gligorijevic, Jovan; Gajic, Dragoljub; Brkovic, Aleksandar; Savic-Gajic, Ivana; Georgieva, Olga; Di Gennaro, Stefano
2016-01-01
The packaging materials industry has already recognized the importance of Total Productive Maintenance as a system of proactive techniques for improving equipment reliability. Bearing faults, which often occur gradually, represent one of the foremost causes of failures in the industry. Therefore, detection of their faults in an early stage is quite important to assure reliable and efficient operation. We present a new automated technique for early fault detection and diagnosis in rolling-element bearings based on vibration signal analysis. Following the wavelet decomposition of vibration signals into a few sub-bands of interest, the standard deviation of obtained wavelet coefficients is extracted as a representative feature. Then, the feature space dimension is optimally reduced to two using scatter matrices. In the reduced two-dimensional feature space the fault detection and diagnosis is carried out by quadratic classifiers. Accuracy of the technique has been tested on four classes of the recorded vibrations signals, i.e., normal, with the fault of inner race, outer race, and ball operation. The overall accuracy of 98.9% has been achieved. The new technique can be used to support maintenance decision-making processes and, thus, to increase reliability and efficiency in the industry by preventing unexpected faulty operation of bearings. PMID:26938541
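A condensed sketch of the described pipeline, omitting the scatter-matrix dimension-reduction step: wavelet decomposition with PyWavelets, standard deviations of sub-band coefficients as features, and a quadratic discriminant classifier. The vibration signals, wavelet choice, and fault frequencies are synthetic stand-ins for the recorded data, and only three of the four classes are simulated for brevity.

```python
# Wavelet-feature bearing fault classification with a quadratic classifier.
import numpy as np
import pywt
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(3)

def vibration(fault_freq, n=2048, fs=12_000):
    t = np.arange(n) / fs
    x = rng.normal(0, 0.2, n)                      # broadband noise floor
    if fault_freq:                                  # modulated tone mimics fault impacts
        x += 0.8 * np.sin(2 * np.pi * fault_freq * t) * rng.random(n)
    return x

def features(x):
    coeffs = pywt.wavedec(x, "db4", level=4)       # 5 sub-bands
    return [np.std(c) for c in coeffs]             # std of coefficients per band

X, y = [], []
for label, f in enumerate([0, 105, 160]):          # normal, inner race, outer race
    for _ in range(40):
        X.append(features(vibration(f)))
        y.append(label)

clf = QuadraticDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```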
NASA Technical Reports Server (NTRS)
1973-01-01
The heat transfer characteristics of various materials used for the thermal insulation of spacecraft are discussed. Techniques for conducting thermal performance analysis, structural performance analysis, and dynamic analysis are described. Processes for producing and finishing the materials are explained. The methods for determining reliability, system safety, materials tests, and design effectiveness are explained.
Fast Computation and Assessment Methods in Power System Analysis
NASA Astrophysics Data System (ADS)
Nagata, Masaki
Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become important as more efficient use of power networks is increasingly required. In this article, fast power system analysis techniques such as contingency screening, parallel processing, and intelligent systems application are briefly surveyed from the viewpoint of their application to online dynamic security assessment.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1995-01-01
This paper presents a step-by-step tutorial of the methods and the tools that were used for the reliability analysis of fault-tolerant systems. The approach used in this paper is the Markov (or semi-Markov) state-space method. The paper is intended for design engineers with a basic understanding of computer architecture and fault tolerance, but little knowledge of reliability modeling. The representation of architectural features in mathematical models is emphasized. This paper does not present details of the mathematical solution of complex reliability models. Instead, it describes the use of several recently developed computer programs SURE, ASSIST, STEM, and PAWS that automate the generation and the solution of these models.
NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1
NASA Technical Reports Server (NTRS)
Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)
1990-01-01
The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects are presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.
Monolithic ceramic analysis using the SCARE program
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.
1988-01-01
The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
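The core volume-flaw calculation in such codes can be shown in a few lines: per-element Weibull risk-of-rupture terms are summed and exponentiated under the weakest-link assumption. Stresses, volumes, and Weibull parameters below are illustrative assumptions, and the sketch ignores the multiaxial fracture criteria and surface flaws that SCARE itself handles.

```python
# Two-parameter Weibull, weakest-link fast-fracture reliability from
# hypothetical finite element output.
import numpy as np

m = 10.0         # Weibull modulus (assumed)
sigma_0 = 300.0  # characteristic strength of a unit volume, MPa (assumed)

# Per-element maximum principal stress (MPa) and volume (relative to the
# unit volume used to define sigma_0), both invented for illustration.
stress = np.array([180.0, 240.0, 120.0, 200.0])
volume = np.array([2.0, 1.0, 4.0, 1.5])

# Weakest link: element survival probabilities multiply, so the risks of
# rupture add before exponentiation.
risk = np.sum(volume * (stress / sigma_0) ** m)
reliability = np.exp(-risk)
print(f"fast-fracture reliability = {reliability:.4f}")
```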
Reliability analysis and initial requirements for FC systems and stacks
NASA Astrophysics Data System (ADS)
Åström, K.; Fontell, E.; Virtanen, S.
In the year 2000 Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of the SOFC technology. In Wärtsilä, methods have been implemented for analysing the system in respect to reliability and safety as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method for assessing reliability and critical failure predictability requirements for fuel cell stacks in a system consisting of several stacks has been developed. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on the system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical failure predictability and operating strategy on the system reliability and availability. An example configuration, consisting of 5 × 5 stacks (series of 5 sets of 5 parallel stacks) is analysed in respect to stack reliability requirements as a function of predictability of critical failures and Weibull shape factor of failure rate distributions.
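As a static, simplified counterpart to the dynamic fault tree treatment described above, the sketch below evaluates the 5 × 5 example under the assumption that each parallel group survives if at least k of its five stacks remain functional and that stacks fail independently. The single-stack reliability and k are illustrative.

```python
# Series-parallel reliability for 5 groups in series, 5 stacks per group.
from math import comb

def group_reliability(r, n=5, k=4):
    # P(at least k of n independent stacks functional)
    return sum(comb(n, j) * r**j * (1 - r)**(n - j) for j in range(k, n + 1))

r_stack = 0.98        # single-stack mission reliability (assumed)
system = group_reliability(r_stack) ** 5   # five groups in series
print(f"system reliability ~ {system:.4f}")
```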
Survey of aircraft electrical power systems
NASA Technical Reports Server (NTRS)
Lee, C. H.; Brandner, J. J.
1972-01-01
Areas investigated include: (1) load analysis; (2) power distribution, conversion techniques and generation; (3) design criteria and performance capabilities of hydraulic and pneumatic systems; (4) system control and protection methods; (5) component and heat transfer systems cooling; and (6) electrical system reliability.
Stolinski, L; Kozinoga, M; Czaprowski, D; Tyrakowski, M; Cerny, P; Suzuki, N; Kotwicki, T
2017-01-01
Digital photogrammetry provides measurements of body angles or distances which allow for quantitative posture assessment with or without the use of external markers. It is becoming an increasingly popular tool for the assessment of the musculoskeletal system. The aim of this paper is to present a structured method for the analysis of posture and its changes using a standardized digital photography technique. The purpose of the study was twofold. The first part comprised 91 children (44 girls and 47 boys) aged 7-10 (8.2 ± 1.0), i.e., students of a primary school; its aim was to develop the photographic method, choose the quantitative parameters, and determine the intraobserver reliability (repeatability) along with the interobserver reliability (reproducibility) of measurements in the sagittal plane using digital photography, as well as to compare the Rippstein plurimeter and digital photography measurements. The second part involved 7782 children (3804 girls, 3978 boys) aged 7-10 (8.4 ± 0.5), who underwent digital photography postural screening. The methods consisted in measuring and calculating selected parameters, establishing the normal ranges of photographic parameters, presenting percentile charts, and noting common pitfalls and possible sources of error in digital photography. A standardized procedure for the photographic evaluation of child body posture was presented. The photographic measurements revealed very good intra- and inter-rater reliability for the five sagittal parameters and good reliability against Rippstein plurimeter measurements. The parameters displayed insignificant variability over time. Normative data were calculated based on photographic assessment, and percentile charts were provided to serve as reference values. The technical errors observed during photogrammetry are carefully discussed in this article. Technical developments have allowed for the regular use of digital photogrammetry in body posture assessment. The specific child positioning described above enables incidentally modified posture to be avoided. Image registration is simple, quick, harmless, and cost-effective. The semi-automatic image analysis, together with the normal values and percentile charts, makes the technique reliable for documenting a child's posture and monitoring the effects of corrective therapy.
NASA Astrophysics Data System (ADS)
Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho
2013-10-01
Cost-effective and time-efficient analytical techniques are required to screen large food lots according to their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red pepper. Black pepper had intermediate results (700-5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods can be used, considering their potential applications, for the screening analysis of irradiated foods.
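The PSL screening thresholds quoted above translate directly into a decision rule; the sketch below simply encodes them as stated (photon counts above 5000 positive, below 700 negative, in between intermediate).

```python
# PSL screening decision rule using the published photon-count thresholds.
def psl_screen(photon_counts: float) -> str:
    if photon_counts > 5000:
        return "positive (irradiated)"
    if photon_counts < 700:
        return "negative (non-irradiated)"
    return "intermediate (confirmatory test needed)"

print(psl_screen(6200), "|", psl_screen(450), "|", psl_screen(2100))
```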
Diagnostic emulation: Implementation and user's guide
NASA Technical Reports Server (NTRS)
Becher, Bernice
1987-01-01
The Diagnostic Emulation Technique was developed within the System Validation Methods Branch as a part of the development of methods for the analysis of the reliability of highly reliable, fault tolerant digital avionics systems. This is a general technique which allows for the emulation of a digital hardware system. The technique is general in the sense that it is completely independent of the particular target hardware which is being emulated. Parts of the system are described and emulated at the logic or gate level, while other parts of the system are described and emulated at the functional level. This algorithm allows for the insertion of faults into the system, and for the observation of the response of the system to these faults. This allows for controlled and accelerated testing of system reaction to hardware failures in the target machine. This document describes in detail how the algorithm was implemented at NASA Langley Research Center and gives instructions for using the system.
Xiao, Yuan-mei; Wang, Zhi-ming; Wang, Mian-zhen; Lan, Ya-jia
2005-06-01
To test the reliability and validity of two mental workload assessment scales, i.e., the subjective workload assessment technique (SWAT) and the NASA task load index (NASA-TLX). One thousand two hundred and sixty-eight mental workers were sampled from various occupations, such as scientific research, education, administration, and medicine, with randomized cluster sampling. The re-test reliability, split-half reliability, Cronbach's alpha coefficient, and correlation coefficients between item score and total score were adopted to test reliability. Validity testing included structural validity. The re-test reliability coefficients of the two scales and their items ranged from 0.516 to 0.753 (P < 0.01), indicating that the two scales had good re-test reliability; the split-half reliability of SWAT was 0.645, its Cronbach's alpha coefficient was more than 0.80, and all the correlation coefficients between its item scores and total score were more than 0.70; as for NASA-TLX, both the split-half reliability and Cronbach's alpha coefficient were more than 0.80, and the correlation coefficients between its item scores and total score were all more than 0.60 (P < 0.01) except for the performance item. Both scales had good internal consistency. The Pearson correlation coefficient between the two scales was 0.492 (P < 0.01), implying that the results of the two scales had good consistency. Factor analysis showed that the two scales had good structural validity. Both SWAT and NASA-TLX have good reliability and validity and may be used as valid tools to assess mental workload in China after being properly revised.
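For reference, Cronbach's alpha as used above follows directly from item and total-score variances. The sketch below computes it on a synthetic item-score matrix; the number of items and the noise level are illustrative.

```python
# Cronbach's alpha from its definition: alpha = k/(k-1) * (1 - sum(var_i)/var_total).
import numpy as np

rng = np.random.default_rng(7)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=0.8, size=(200, 6))  # 6 correlated items

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```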
Reliability Analysis of Systems Subject to First-Passage Failure
NASA Technical Reports Server (NTRS)
Lutes, Loren D.; Sarkani, Shahram
2009-01-01
An obvious goal of reliability analysis is the avoidance of system failure. However, it is generally recognized that it is often not feasible to design a practical or useful system for which failure is impossible. Thus it is necessary to use techniques that estimate the likelihood of failure based on modeling the uncertainty about such items as the demands on and capacities of various elements in the system. This usually involves the use of probability theory, and a design is considered acceptable if it has a sufficiently small probability of failure. This report contains findings of analyses of systems subject to first-passage failure.
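One standard way to estimate a first-passage failure probability when closed forms are unavailable is direct simulation. The sketch below discretizes an Ornstein-Uhlenbeck process as a stand-in for a randomly excited response and counts paths that cross an assumed capacity threshold within the service period; all parameters are illustrative, and the model is not the one analyzed in the report.

```python
# Monte Carlo estimate of a first-passage probability for a discretized
# Ornstein-Uhlenbeck response crossing a capacity threshold.
import numpy as np

rng = np.random.default_rng(11)
n_paths, n_steps, dt = 20_000, 2_000, 0.01
theta, sigma = 1.0, 0.5          # mean-reversion rate and noise intensity (assumed)
barrier = 1.5                     # capacity threshold (assumed)

x = np.zeros(n_paths)
crossed = np.zeros(n_paths, dtype=bool)
for _ in range(n_steps):
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
    crossed |= x > barrier        # record any path that has ever exceeded the barrier

print(f"estimated first-passage probability: {crossed.mean():.4f}")
```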
[The choice of color in fixed prosthetics: what steps should be followed for a reliable outcome?].
Vanheusden, Alain; Mainjot, Amélie
2004-01-01
The creation of a perfectly-matched esthetic fixed restoration is undeniably one of the most difficult challenges in modern dentistry. The final outcome depends on several essential steps: the use of an appropriate light source, the accurate analysis and correct evaluation of patient's teeth parameters (morphology, colour, surface texture,...), the clear and precise transmission of this data to the laboratory and the sound interpretation of it by a dental technician who masters esthetic prosthetic techniques perfectly. The purpose of this paper was to give a reproducible clinical method to the practitioner in order to achieve a reliable dental colorimetric analysis.
High reliability and high performance of 9xx-nm single emitter laser diodes
NASA Astrophysics Data System (ADS)
Bao, L.; Leisher, P.; Wang, J.; Devito, M.; Xu, D.; Grimshaw, M.; Dong, W.; Guan, X.; Zhang, S.; Bai, C.; Bai, J. G.; Wise, D.; Martinsen, R.
2011-03-01
Improved performance and reliability of 9xx nm single emitter laser diodes are presented. To date, over 15,000 hours of accelerated multi-cell lifetest reliability data has been collected, with drive currents from 14A to 18A and junction temperatures ranging from 60°C to 110°C. Out of 208 devices, 14 failures have been observed so far. Using established accelerated lifetest analysis techniques, the effects of temperature and power acceleration are assessed. The Mean Time to Failure (MTTF) is determined to be >30 years, for use condition 10W and junction temperature 353K (80°C), with 90% statistical confidence.
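Projections like the quoted MTTF typically rest on an Arrhenius temperature-acceleration model. The sketch below shows the arithmetic for translating a stress-condition MTTF to the 80°C use condition; the activation energy and the demonstrated stress-condition MTTF are hypothetical, not the paper's fitted values.

```python
# Arrhenius acceleration factor and MTTF projection to use conditions.
import math

k_B = 8.617e-5          # Boltzmann constant, eV/K
E_a = 0.5               # activation energy, eV (assumed)
T_stress = 383.0        # junction temperature on test, K (110 C)
T_use = 353.0           # use-condition junction temperature, K (80 C)

AF = math.exp(E_a / k_B * (1.0 / T_use - 1.0 / T_stress))
mttf_stress_h = 4.0e4   # hypothetical demonstrated MTTF at stress, hours
mttf_use_years = mttf_stress_h * AF / 8766.0
print(f"acceleration factor = {AF:.1f}, projected MTTF ~ {mttf_use_years:.0f} years")
```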
Improving the Reliability of Technological Subsystems Equipment for Steam Turbine Unit in Operation
NASA Astrophysics Data System (ADS)
Brodov, Yu. M.; Murmansky, B. E.; Aronson, R. T.
2017-11-01
The authors present an integrated approach to improving the reliability of steam turbine units (STUs), along with examples of its implementation for various STU technological subsystems. Based on statistical analysis of damage to individual turbine parts and components, on the development and application of modern repair methods and technologies, and on operational monitoring techniques, the critical components and elements of the equipment are identified and priorities are proposed for improving the reliability of STU equipment in operation. Results are presented from an analysis of malfunctions in the equipment of various STU technological subsystems operating as part of power units and at cross-linked thermal power plants that result in turbine unit shutdown (failure). Proposals are formulated and justified for adjusting the maintenance and repair of turbine components and parts, condenser unit equipment, the regeneration subsystem, and the oil supply system, which make it possible to increase operational reliability, reduce the cost of STU maintenance and repair, and optimize the timing and scope of repairs.
Uniform Foam Crush Testing for Multi-Mission Earth Entry Vehicle Impact Attenuation
NASA Technical Reports Server (NTRS)
Patterson, Byron W.; Glaab, Louis J.
2012-01-01
Multi-Mission Earth Entry Vehicles (MMEEVs) are blunt-body vehicles designed to transport payloads from outer space to the surface of the Earth. To achieve high reliability and minimum weight, MMEEVs avoid the use of limited-reliability systems, such as parachutes and retro-rockets, instead using built-in impact attenuators to absorb the energy remaining at impact and meet landing-loads requirements. The Multi-Mission Systems Analysis for Planetary Entry (M-SAPE) parametric design tool is used to facilitate the design of MMEEVs and develop the trade space. Testing was conducted to characterize the material properties of several candidate impact foam attenuators to enhance M-SAPE analysis. In the current effort, four different Rohacell foams are tested at three different uniform strain rates (approximately 0.17, 100, and 13,600%/s). The primary data analysis method uses a global data-smoothing technique in the frequency domain to remove noise and system natural frequencies. The results indicate that the filter and smoothing technique are successful in identifying the foam crush event and removing aberrations. The effect of strain rate increases with increasing foam density. The 71-WF-HT foam may support Mars Sample Return requirements. Several recommendations to improve the drop tower test technique are identified.
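Frequency-domain smoothing of the sort used in the data analysis can be illustrated with a basic FFT low-pass filter: transform the load trace, zero the bins above a cutoff, and invert. The signal, sampling rate, and 1 kHz cutoff below are illustrative assumptions rather than the test's actual parameters.

```python
# Frequency-domain smoothing of a noisy drop-test load trace.
import numpy as np

fs = 50_000.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 0.02, 1 / fs)
crush = 4000 * np.clip(np.sin(np.pi * t / 0.02), 0, None)   # idealized crush pulse, N
noisy = crush + 300 * np.sin(2 * np.pi * 6000 * t) \
        + np.random.default_rng(4).normal(0, 100, t.size)   # ringing plus noise

spectrum = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spectrum[freqs > 1000.0] = 0.0      # zero content above an assumed 1 kHz cutoff
smoothed = np.fft.irfft(spectrum, n=t.size)
print(f"peak load raw: {noisy.max():.0f} N, smoothed: {smoothed.max():.0f} N")
```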
Reliability Quantification of Advanced Stirling Convertor (ASC) Components
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward
2010-01-01
The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test-data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, and electrical. In addition, these models are based on the available test data, which can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be a superior technique to conventional reliability approaches based on failure rates derived from similar equipment or simply expert judgment.
A comparison of computer-assisted and manual wound size measurement.
Thawer, Habiba A; Houghton, Pamela E; Woodbury, M Gail; Keast, David; Campbell, Karen
2002-10-01
Accurate and precise wound measurements are a critical component of every wound assessment. To examine the reliability and validity of a new computerized technique for measuring human and animal wounds, chronic human wounds (N = 45) and surgical animal wounds (N = 38) were assessed using manual and computerized techniques. Using intraclass correlation coefficients, intrarater and interrater reliability of surface area measurements obtained using the computerized technique were compared to those obtained using acetate tracings and planimetry. A single measurement of surface area using either technique produced excellent intrarater and interrater reliability for both human and animal wounds, but the computerized technique was more precise than the manual technique for measuring the surface area of animal wounds. For both types of wounds and measurement techniques, intrarater and interrater reliability improved when the average of three repeated measurements was obtained. The precision of each technique with human wounds and the precision of the manual technique with animal wounds also improved when three repeated measurement results were averaged. Concurrent validity between the two techniques was excellent for human wounds but poor for the smaller animal wounds, regardless of whether single or the average of three repeated surface area measurements was used. The computerized technique permits reliable and valid assessment of the surface area of both human and animal wounds.
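The study's finding that averaging three repeated tracings improves reliability is an instance of the Spearman-Brown relation for the mean of k measurements. A minimal sketch (Python); the single-tracing ICC of 0.85 is an assumed value, not one reported above.

```python
def spearman_brown(single_measure_reliability: float, k: int) -> float:
    """Predicted reliability of the mean of k repeated measurements."""
    r = single_measure_reliability
    return k * r / (1 + (k - 1) * r)

# Assumed single-tracing ICC of 0.85; averaging three tracings:
print(spearman_brown(0.85, 3))  # ~0.944
```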
NASA Astrophysics Data System (ADS)
Hanish Nithin, Anu; Omenzetter, Piotr
2017-04-01
Optimization of the life-cycle costs and reliability of offshore wind turbines (OWTs) is an area of immense interest due to the widespread increase in wind power generation across the world. Most existing studies have used structural reliability and Bayesian pre-posterior analysis for optimization. This paper proposes an extension to the previous approaches: a framework for probabilistic optimization of the total life-cycle costs and reliability of OWTs that combines elements of structural reliability/risk analysis (SRA) and Bayesian pre-posterior analysis with optimization through a genetic algorithm (GA). The SRA techniques are adopted to compute the probabilities of damage occurrence and failure associated with the deterioration model. The probabilities are used in the decision tree and are updated using Bayesian analysis. The output of this framework determines the optimal structural health monitoring and maintenance schedules to be implemented during the life span of OWTs while maintaining a trade-off between life-cycle costs and the risk of structural failure. Numerical illustrations with a generic deterioration model for one monitoring exercise in the life cycle of a system are demonstrated. Two case scenarios, namely building an initially expensive but robust structure versus a cheaper but more rapidly deteriorating one, and adopting an expensive monitoring system, are presented to aid the decision-making process.
Proximal tibial osteotomy. A survivorship analysis.
Ritter, M A; Fechtman, R A
1988-01-01
Proximal tibial osteotomy is generally accepted as a treatment for the patient with unicompartmental arthritis. However, only a few reports of the long-term results of this procedure are available in the literature, and none has used the technique known as survivorship analysis. This technique has an advantage over conventional analysis because it does not exclude patients for inadequate follow-up, loss to follow-up, or patient death. In this study, survivorship analysis was applied to 78 proximal tibial osteotomies, performed exclusively by the senior author for the correction of a preoperative varus deformity, and a survival curve was constructed. It was concluded that the reliable longevity of the proximal tibial osteotomy is approximately 6 years.
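Survivorship analysis of this kind is typically built on the Kaplan-Meier product-limit estimator, which keeps censored observations (lost to follow-up, death) in the risk set until they drop out rather than excluding them. A minimal sketch (Python), using hypothetical follow-up data rather than the study's 78 osteotomies:

```python
import numpy as np

def kaplan_meier(times, failed):
    """Product-limit survival estimate.

    times  -- follow-up in years for each osteotomy
    failed -- 1 if the endpoint (e.g., revision) occurred, 0 if censored
              (lost to follow-up or death)
    """
    times, failed = np.asarray(times), np.asarray(failed)
    order = np.argsort(times)
    times, failed = times[order], failed[order]
    at_risk = len(times)
    survival, curve = 1.0, []
    for t, d in zip(times, failed):
        if d:  # a failure shrinks the survival estimate
            survival *= (at_risk - 1) / at_risk
        curve.append((float(t), survival))
        at_risk -= 1  # censored cases leave the risk set silently
    return curve

# Hypothetical cohort: (years of follow-up, failure flag)
print(kaplan_meier([2, 3, 4, 5, 6, 7, 8], [0, 1, 0, 1, 1, 0, 1]))
```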
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, requiring immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced Nonlinear Signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation for an automated engine diagnostic system have been achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and in the performance evaluation of large volumes of dynamic test data.
Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina
2016-06-01
We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
NASA software specification and evaluation system design, part 2
NASA Technical Reports Server (NTRS)
1976-01-01
A survey and analysis of the existing methods, tools, and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and a database verifier are presented.
Sampling and analysis techniques for monitoring serum for trace elements.
Ericson, S P; McHalsky, M L; Rabinow, B E; Kronholm, K G; Arceo, C S; Weltzer, J A; Ayd, S W
1986-07-01
We describe techniques for controlling contamination in the sampling and analysis of human serum for trace metals. The relatively simple procedures do not require clean-room conditions. The atomic absorption and atomic emission methods used have been applied in studying zinc, copper, chromium, manganese, molybdenum, selenium, and aluminum concentrations. Values obtained for a group of 16 normal subjects agree with the most reliable values reported in the literature, obtained by much more elaborate techniques. All of these metals can be measured in 3 to 4 mL of serum. The methods may prove especially useful in monitoring concentrations of essential trace elements in blood of patients being maintained on total parenteral nutrition.
MIBPB: a software package for electrostatic analysis.
Chen, Duan; Chen, Zhan; Chen, Changjun; Geng, Weihua; Wei, Guo-Wei
2011-03-01
The Poisson-Boltzmann equation (PBE) is an established model for the electrostatic analysis of biomolecules. The development of advanced computational techniques for the solution of the PBE has been an important topic in the past two decades. This article presents a matched interface and boundary (MIB)-based PBE software package, the MIBPB solver, for electrostatic analysis. The MIBPB has a unique feature: it is the first interface technique-based PBE solver that rigorously enforces the solution and flux continuity conditions at the dielectric interface between the biomolecule and the solvent. For protein molecular surfaces, which may possess troublesome geometrical singularities, the MIB scheme makes the MIBPB by far the only existing PBE solver able to deliver second-order convergence, that is, the accuracy increases four times when the mesh size is halved. The MIBPB method is also equipped with a Dirichlet-to-Neumann mapping technique that builds a Green's function approach to analytically resolve the singular charge distribution in biomolecules in order to obtain reliable solutions at meshes as coarse as 1 Å, whereas it usually takes other traditional PB solvers 0.25 Å to reach a similar level of reliability. This work further accelerates the rate of convergence of the linear equation systems resulting from the MIBPB by using Krylov subspace (KS) techniques. Condition numbers of the MIBPB matrices are significantly reduced by using appropriate KS solver and preconditioner combinations. Both linear and nonlinear PBE solvers in the MIBPB package are tested by protein-solvent solvation energy calculations and analysis of salt effects on protein-protein binding energies, respectively. Copyright © 2010 Wiley Periodicals, Inc.
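The second-order convergence claim (error falling fourfold when the mesh is halved) can be checked mechanically from two successive refinements. A minimal sketch (Python) with hypothetical error values, not results from the MIBPB test suite:

```python
import math

def observed_order(error_coarse, error_fine, refinement=2.0):
    """Observed convergence order from errors at mesh sizes h and h/refinement."""
    return math.log(error_coarse / error_fine, refinement)

# Hypothetical solvation-energy errors at h = 0.5 A and h = 0.25 A:
print(observed_order(4.0e-2, 1.0e-2))  # 2.0 -> second-order convergence
```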
NASA Astrophysics Data System (ADS)
Wada, Daichi; Sugimoto, Yohei
2017-04-01
Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data of the aerodynamic loads would be used onboard to control the aircraft and accumulated data would be used for the condition-based maintenance and the feedback for the fatigue and critical load modeling. The effective sensing techniques such as fiber optic distributed sensing have been developed and demonstrated promising capability of monitoring structural responses, i.e., strains on the surface of the aircraft wings. By using the developed techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in a discrete form of concentrated forces, however, the distributed form of the loads is essential for the accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify the distributed loads from measured strain information. The introduced inverse analysis technique calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads on a flat panel model, and conduct the inverse identification of the load distributions. We take two approaches to build the inverse system between loads and strains. The first one uses structural models and the second one uses neural networks. We compare the performance of the two approaches, and discuss the effect of the amount of the strain sensing information.
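The strain-to-load inversion described above is, in its simplest form, a regularized least-squares problem. The sketch below (Python) uses a random placeholder sensitivity matrix where the paper assembles one from a finite element model; all names, sizes, and values are illustrative.

```python
import numpy as np

# Hypothetical strain-to-load sensitivity matrix S (strains = S @ loads),
# assembled column-by-column in practice by applying unit distributed-load
# shape functions to a finite element model of the wing panel.
rng = np.random.default_rng(0)
n_sensors, n_load_dofs = 40, 10
S = rng.normal(size=(n_sensors, n_load_dofs))
true_loads = rng.normal(size=n_load_dofs)
strains = S @ true_loads + 0.01 * rng.normal(size=n_sensors)  # noisy readings

# Tikhonov-regularized least squares stabilizes the distributed-load estimate.
lam = 1e-3
loads_est = np.linalg.solve(S.T @ S + lam * np.eye(n_load_dofs), S.T @ strains)
print(np.allclose(loads_est, true_loads, atol=0.05))
```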
MIBPB: A software package for electrostatic analysis
Chen, Duan; Chen, Zhan; Chen, Changjun; Geng, Weihua; Wei, Guo-Wei
2010-01-01
The Poisson-Boltzmann equation (PBE) is an established model for the electrostatic analysis of biomolecules. The development of advanced computational techniques for the solution of the PBE has been an important topic in the past two decades. This paper presents a matched interface and boundary (MIB) based PBE software package, the MIBPB solver, for electrostatic analysis. The MIBPB has a unique feature that it is the first interface technique based PBE solver that rigorously enforces the solution and flux continuity conditions at the dielectric interface between the biomolecule and the solvent. For protein molecular surfaces which may possess troublesome geometrical singularities, the MIB scheme makes the MIBPB by far the only existing PBE solver that is able to deliver second-order convergence, i.e., the accuracy increases four times when the mesh size is halved. The MIBPB method is also equipped with a Dirichlet-to-Neumann mapping (DNM) technique, which builds a Green's function approach to analytically resolve the singular charge distribution in biomolecules in order to obtain reliable solutions at meshes as coarse as 1 Å, while it usually takes other traditional PB solvers 0.25 Å to reach a similar level of reliability. The present work further accelerates the rate of convergence of linear equation systems resulting from the MIBPB by utilizing the Krylov subspace (KS) techniques. Condition numbers of the MIBPB matrices are significantly reduced by using appropriate Krylov subspace solver and preconditioner combinations. Both linear and nonlinear PBE solvers in the MIBPB package are tested by protein-solvent solvation energy calculations and analysis of salt effects on protein-protein binding energies, respectively. PMID:20845420
LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation
NASA Technical Reports Server (NTRS)
Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.
1997-01-01
This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.
MSIX - A general and user-friendly platform for RAM analysis
NASA Astrophysics Data System (ADS)
Pan, Z. J.; Blemel, Peter
The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.
Runtime Speculative Software-Only Fault Tolerance
2012-06-01
reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance... propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of... affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time.
Reliability of a rapid hematology stain for sputum cytology*
Gonçalves, Jéssica; Pizzichini, Emilio; Pizzichini, Marcia Margaret Menezes; Steidle, Leila John Marques; Rocha, Cristiane Cinara; Ferreira, Samira Cardoso; Zimmermann, Célia Tânia
2014-01-01
Objective: To determine the reliability of a rapid hematology stain for the cytological analysis of induced sputum samples. Methods: This was a cross-sectional study comparing the standard technique (May-Grünwald-Giemsa stain) with a rapid hematology stain (Diff-Quik). Of the 50 subjects included in the study, 21 had asthma, 19 had COPD, and 10 were healthy (controls). From the induced sputum samples collected, we prepared four slides: two were stained with May-Grünwald-Giemsa, and two were stained with Diff-Quik. The slides were read independently by two trained researchers blinded to the identification of the slides. The reliability for cell counting using the two techniques was evaluated by determining the intraclass correlation coefficients (ICCs) for intraobserver and interobserver agreement. Agreement in the identification of neutrophilic and eosinophilic sputum between the observers and between the stains was evaluated with kappa statistics. Results: In our comparison of the two staining techniques, the ICCs indicated almost perfect interobserver agreement for neutrophil, eosinophil, and macrophage counts (ICC: 0.98-1.00), as well as substantial agreement for lymphocyte counts (ICC: 0.76-0.83). Intraobserver agreement was almost perfect for neutrophil, eosinophil, and macrophage counts (ICC: 0.96-0.99), whereas it was moderate to substantial for lymphocyte counts (ICC = 0.65 and 0.75 for the two observers, respectively). Interobserver agreement for the identification of eosinophilic and neutrophilic sputum using the two techniques ranged from substantial to almost perfect (kappa range: 0.91-1.00). Conclusions: The use of Diff-Quik can be considered a reliable alternative for the processing of sputum samples. PMID:25029648
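Interobserver agreement of the kind reported above (kappa for eosinophilic vs. neutrophilic sputum) can be computed directly. A minimal sketch (Python, scikit-learn) with hypothetical slide classifications, not the study's data:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-slide phenotypes assigned by two observers reading
# Diff-Quik slides (labels and counts are invented for illustration).
observer_1 = ["eos", "neu", "eos", "pauci", "neu", "eos", "neu", "neu"]
observer_2 = ["eos", "neu", "eos", "neu",   "neu", "eos", "neu", "pauci"]

# Cohen's kappa corrects raw agreement for chance agreement.
print(cohen_kappa_score(observer_1, observer_2))  # ~0.58, moderate agreement
```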
An overview of reliability assessment and control for design of civil engineering structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, R.V. Jr.; Grigoriadis, K.M.; Bergman, L.A.
1998-06-01
Random variations, whether they occur in the input signal or the system parameters, are phenomena that occur in nearly all engineering systems of interest. As a result, nondeterministic modeling techniques must somehow account for these variations to ensure validity of the solution. As might be expected, this is a difficult proposition and the focus of many current research efforts. Controlling seismically excited structures is one pertinent application of nondeterministic analysis and is the subject of the work presented herein. This overview paper is organized into two sections. First, techniques to assess system reliability, in a context familiar to civil engineers, are discussed. Second, and as a consequence of the first, active control methods that ensure good performance in this random environment are presented. It is the hope of the authors that these discussions will ignite further interest in the area of reliability assessment and design of controlled civil engineering structures.
A Survey of Techniques for Modeling and Improving Reliability of Computing Systems
Mittal, Sparsh; Vetter, Jeffrey S.
2015-04-24
Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.
ERIC Educational Resources Information Center
Crocker, Linda M.; Mehrens, William A.
Four new methods of item analysis were used to select subsets of items which would yield measures of attitude change. The sample consisted of 263 students at Michigan State University who were tested on the Inventory of Beliefs as freshmen and retested on the same instrument as juniors. Item change scores and total change scores were computed for…
Toward automatic finite element analysis
NASA Technical Reports Server (NTRS)
Kela, Ajay; Perucchio, Renato; Voelcker, Herbert
1987-01-01
Two problems must be solved if the finite element method is to become a reliable and affordable blackbox engineering tool. Finite element meshes must be generated automatically from computer aided design databases and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.
Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
2018-04-10
Many models have been developed for predicting software reliability. These models are restricted to particular methodologies and limited numbers of parameters, although numerous techniques and methodologies may be used for reliability prediction. Careful parameter selection is needed when estimating reliability, since the reliability of a system may increase or decrease depending on the parameters chosen; it is therefore necessary to identify the factors that most heavily affect system reliability. In recent years, reusability has been widely used across research areas. Reusability is the basis of Component-Based Systems (CBS); cost, time, and human effort can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small- and large-scale problems where uncertainty or randomness makes accurate results difficult to obtain. Several possibilities exist for applying soft computing techniques to medical problems: clinical medicine makes significant use of fuzzy logic and neural network methodologies, while basic medical science most frequently uses neural network-genetic algorithm hybrids. Medical scientists have shown considerable interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology, and neurology. CBSE encourages users to reuse past and existing software when building new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques: the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It presents the working of these soft computing techniques and assesses their use in predicting reliability, and it discusses the parameters considered in reliability estimation and prediction. This study can be applied to the estimation and prediction of the reliability of instruments used in medical systems, as well as in software engineering, computer engineering, and mechanical engineering; the concepts can be applied to both software and hardware to predict reliability using CBSE.
Adaptations of advanced safety and reliability techniques to petroleum and other industries
NASA Technical Reports Server (NTRS)
Purser, P. E.
1974-01-01
The underlying philosophy of the general approach to failure reduction and control is presented. Safety and reliability management techniques developed in the industries which have participated in the U.S. space and defense programs are described along with adaptations to nonaerospace activities. The examples given illustrate the scope of applicability of these techniques. It is indicated that any activity treated as a 'system' is a potential user of aerospace safety and reliability management techniques.
Computer-aided design of polymers and composites
NASA Technical Reports Server (NTRS)
Kaelble, D. H.
1985-01-01
This book on computer-aided design of polymers and composites introduces and discusses the subject from the viewpoint of atomic and molecular models. Thus, the origins of stiffness, strength, extensibility, and fracture toughness in composite materials can be analyzed directly in terms of chemical composition and molecular structure. Aspects of polymer composite reliability are considered along with characterization techniques for composite reliability, relations between atomic and molecular properties, computer aided design and manufacture, polymer CAD/CAM models, and composite CAD/CAM models. Attention is given to multiphase structural adhesives, fibrous composite reliability, metal joint reliability, polymer physical states and transitions, chemical quality assurance, processability testing, cure monitoring and management, nondestructive evaluation (NDE), surface NDE, elementary properties, ionic-covalent bonding, molecular analysis, acid-base interactions, the manufacturing science, and peel mechanics.
Reliability Analysis and Modeling of ZigBee Networks
NASA Astrophysics Data System (ADS)
Lin, Cheng-Min
The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low-rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adopted the ZigBee open standard to develop various services to promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important, because these services will stop if the system and network reliability is unstable. The ZigBee standard has three kinds of networks: star, tree, and mesh. This paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes the reliability and mean time to failure (MTTF) of each layer. Channel resource usage, device role, network topology, and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. In star or tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve the reliability problem. For mesh networks, a division technique is applied to overcome their higher complexity: a mesh network is decomposed into several non-reducible series systems and edge-parallel systems, so its reliability is easily solved using series-parallel systems through the proposed scheme. The numerical results demonstrate that reliability increases for mesh networks as the number of edges in the parallel systems increases, while reliability drops quickly as the numbers of edges and nodes increase for all three networks. Greater resource usage is another factor that reduces reliability; lower network reliability results from network complexity, greater resource usage, and complex object relationships.
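The series and parallel reductions at the heart of the RBD approach are simple products. A minimal sketch (Python) of both reductions, with hypothetical per-layer reliabilities rather than values from the paper:

```python
from math import prod

def series(reliabilities):
    """A series system works only if every block works."""
    return prod(reliabilities)

def parallel(reliabilities):
    """A parallel system fails only if every block fails."""
    return 1 - prod(1 - r for r in reliabilities)

# Star/tree networks reduce to a series stack of layers; a mesh network
# decomposed by the division approach becomes series chains of parallel
# edge groups. All reliability values below are illustrative.
print(series([0.99, 0.98, 0.97, 0.995]))           # star/tree layer stack
print(series([0.99, parallel([0.9, 0.9, 0.9])]))   # redundant mesh edges
```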
Application of a novel new multispectral nanoparticle tracking technique
NASA Astrophysics Data System (ADS)
McElfresh, Cameron; Harrington, Tyler; Vecchio, Kenneth S.
2018-06-01
Fast, reliable, and accurate particle size analysis techniques must meet the demands of evolving industrial and academic research in areas of functionalized nanoparticle synthesis, advanced materials development, and other nanoscale-enabled technologies. In this study, a new multispectral particle tracking analysis (m-PTA) technique enabled by the ViewSizer™ 3000 (MANTA Instruments, USA) was evaluated using solutions of monomodal and multimodal gold and polystyrene latex nanoparticles, as well as a spark-eroded polydisperse 316L stainless steel nanopowder and large (non-Brownian) borosilicate particles. It was found that m-PTA performed comparably to dynamic light scattering (DLS) in the evaluation of monomodal particle size distributions. When measuring bimodal, trimodal, and polydisperse solutions, the m-PTA technique overwhelmingly outperformed traditional DLS in both peak detection and relative particle concentration analysis. It was also observed that the m-PTA technique is less susceptible to large-particle overexpression errors. The ViewSizer™ 3000 was also found to accurately evaluate the sizes and concentrations of monomodal and bimodal sinking borosilicate particles.
Borotikar, Bhushan; Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain
2017-01-01
To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. The search was conducted on articles published in Web of Science, PubMed, Scopus, Academic Search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Twenty articles fulfilled the inclusion criteria, with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven of the eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques for evaluating joint motion, and spin tag for muscle motion. Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however, results should be evaluated with caution, since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions.
Evaluation of localized muscle fatigue using power spectral density analysis of the electromyogram
NASA Technical Reports Server (NTRS)
Lafevers, E. V.
1974-01-01
Surface electromyograms (EMGs) taken from three upper-torso muscles during a push-pull task were analyzed by a power spectral density technique to determine the operational feasibility of the technique for identifying changes in the EMGs resulting from muscular fatigue. The EMGs were taken from four subjects under two conditions: (1) in shirtsleeves and (2) in a pressurized space suit. This study confirmed that frequency analysis of dynamic muscle activity is capable of providing reliable data for many industrial applications where fatigue may be of practical interest. The results showed significant effects of the pressurized space suit on the pattern of shirtsleeve fatigue responses of the muscles. The data also revealed (1) reliable differences between muscles in fatigue-induced responses at the various locations in the reach envelope at which the subjects were required to perform the push-pull exercise and (2) the differential sensitivity of muscles to the various reach positions in terms of fatigue-related shifts in EMG power.
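A standard way to quantify such fatigue-related spectral shifts is the median power frequency of successive EMG epochs, which migrates downward as a muscle fatigues. A minimal sketch (Python, SciPy) with synthetic data in place of the study's recordings:

```python
import numpy as np
from scipy.signal import welch

def median_frequency(emg, fs):
    """Median power frequency of a surface EMG epoch.

    A downward drift across successive epochs is the classic
    spectral signature of localized muscle fatigue.
    """
    freqs, psd = welch(emg, fs=fs, nperseg=256)
    cumulative = np.cumsum(psd)
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]

# Synthetic 1-s EMG epoch sampled at 1 kHz (illustrative only):
rng = np.random.default_rng(1)
emg = rng.normal(size=1000)
print(median_frequency(emg, fs=1000.0))
```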
Formal Techniques for Organization Analysis: Task and Resource Management
1984-06-01
typical approach has been to base new entities on stereotypical structures and make changes as problems are recognized. Clearly, this is not an... human resources; and provide the means to change and track all these parameters as they interact with each other and respond to... functioning under internal and external change. 3. Data gathering techniques to allow one to efficiently collect reliable modeling parameters from
1982-05-01
Contents include: monitoring and management; nondestructive evaluation (NDE); surface NDE; performance and proof testing; summary. Test categories: chemical quality assurance testing, processability testing, cure monitoring and management, nondestructive evaluation (NDE), performance and... the management concept for implementing the specific tests. Chemical analysis, nondestructive evaluation (NDE), and environmental fatigue testing of
Navarro-Ramirez, Rodrigo; Berlin, Connor; Lang, Gernot; Hussain, Ibrahim; Janssen, Insa; Sloan, Stephen; Askin, Gulce; Avila, Mauricio J; Zubkov, Micaella; Härtl, Roger
2018-01-01
Two-dimensional radiographic methods have been proposed to evaluate the radiographic outcome after indirect decompression through extreme lateral interbody fusion (XLIF). However, the assessment of neural decompression in a single plane may underestimate the effect of indirect decompression on central canal and foraminal volumes. The present study aimed to assess the reliability and consistency of a novel 3-dimensional radiographic method that assesses neural decompression by volumetric analysis, using a new generation of intraoperative fan-beam computed tomography scanner, in patients undergoing XLIF. Prospectively collected data from 7 patients (9 levels) undergoing XLIF were retrospectively analyzed. Three independent, blinded raters using imaging analysis software performed volumetric measurements pre- and postoperatively to determine central canal and foraminal volumes. Intrarater and interrater reliability tests were performed to assess the reliability of this novel volumetric method. The interrater reliability between the three raters ranged from 0.800 to 0.952, P < 0.0001. The test-retest analysis on a randomly selected subset of three patients showed good to excellent internal reliability (range 0.78-1.00) for all three raters. There was a significant postoperative increase in mean volume of approximately 20% for the right foramen, left foramen, and central canal (P = 0.0472, P = 0.0066, and P = 0.0003, respectively). Here we demonstrate a new volumetric analysis technique that is feasible, reliable, and reproducible among independent raters for central canal and foraminal volumes in the lumbar spine using an intraoperative computed tomography scanner. Copyright © 2017. Published by Elsevier Inc.
Kashkouli, Mohsen Bahmani; Karimi, Nasser; Aghamirsalim, Mohamadreza; Abtahi, Mohammad Bagher; Nojomi, Marzieh; Shahrad-Bejestani, Hadi; Salehi, Masoud
2017-02-01
To determine the measurement properties of the Persian language version of the Graves orbitopathy quality of life questionnaire (GO-QOL). Following a systematic translation and cultural adaptation process, 141 consecutive unselected thyroid eye disease (TED) patients answered the Persian GO-QOL and underwent complete ophthalmic examination. The questionnaire was again completed by 60 patients on the second visit, 2-4 weeks later. Construct validity (cross-cultural validity, structural validity and hypotheses testing), reliability (internal consistency and test-retest reliability), and floor and ceiling effects of the Persian version of the GO-QOL were evaluated. Furthermore, Rasch analysis was used to assess its psychometric properties. Cross-cultural validity was established by back-translation techniques, committee review and pretesting techniques. Bi-dimensionality of the questionnaire was confirmed by factor analysis. Construct validity was also supported through confirmation of 6 out of 8 predefined hypotheses. Cronbach's α and intraclass correlation coefficient (ICC) were 0.650 and 0.859 for visual functioning and 0.875 and 0.896 for appearance subscale, respectively. Mean quality of life (QOL) scores for visual functioning and appearance were 78.18 (standard deviation, SD, 21.57) and 56.25 (SD 26.87), respectively. Person reliabilities from the Rasch rating scale model for both visual functioning and appearance revealed an acceptable internal consistency for the Persian GO-QOL. The Persian GO-QOL questionnaire is a valid and reliable tool with good psychometric properties in evaluation of Persian-speaking patients with TED. Applying Rasch analysis to future versions of the GO-QOL is recommended in order to perform tests for linearity between the estimated item measures in different versions.
CARES/Life Software for Designing More Reliable Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.
1997-01-01
Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
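Probabilistic design of this kind typically rests on weakest-link Weibull statistics: each finite element contributes a risk of rupture that grows with its stress and volume. A minimal sketch (Python) under a two-parameter Weibull assumption; the stresses, volumes, Weibull modulus, and scale parameter are hypothetical, and CARES/Life's actual models are considerably richer:

```python
import numpy as np

def weibull_failure_probability(stresses, volumes, m, sigma_0):
    """Two-parameter Weibull weakest-link failure probability.

    stresses -- maximum principal stress per finite element (MPa)
    volumes  -- element volumes (mm^3)
    m        -- Weibull modulus (scatter of specimen strengths)
    sigma_0  -- scale parameter (MPa * mm^(3/m))
    """
    risk = np.sum(volumes * (np.maximum(stresses, 0.0) / sigma_0) ** m)
    return 1.0 - np.exp(-risk)

# Hypothetical element results from a thermomechanical analysis:
stresses = np.array([120.0, 250.0, 310.0, 90.0])
volumes = np.array([2.0, 1.5, 0.8, 3.0])
print(weibull_failure_probability(stresses, volumes, m=10.0, sigma_0=500.0))
```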
Use of eddy current mixes to solve a weld examination application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, R.C.; LaBoissonniere, A.
1995-12-31
The augmentation of typical nondestructive (i.e., ultrasound) weld inspection techniques by the use of eddy current tools may significantly enhance the quality and reliability of weld inspections. One recent example is the development of an eddy current technique for use in the examination of BWR core shroud welds, where multi-frequency mixes are used to eliminate signals coming from the weld material so that the examination of the heat affected zone is enhanced. An analysis tool most commonly associated with ultrasound examinations, the C-Scan based on gated information, may be implemented with eddy current data to enhance analysis.
Accuracy of remotely sensed data: Sampling and analysis procedures
NASA Technical Reports Server (NTRS)
Congalton, R. G.; Oderwald, R. G.; Mead, R. A.
1982-01-01
A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given, along with a listing of the computer program written to implement them. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is presented, together with the resulting matrices from the mapping effort of the San Juan National Forest. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is described, as is a proposed method for determining the reliability of change detection between two maps of the same area produced at different times.
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
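For contrast with the dynamic sensitivity-analysis metric, cyclomatic complexity is a purely structural count over the control-flow graph, V(G) = E - N + 2P. A minimal sketch (Python) with an illustrative graph size, not one drawn from the B-737 study:

```python
def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
    """McCabe's V(G) = E - N + 2P for a control-flow graph."""
    return edges - nodes + 2 * components

# A function whose control-flow graph has 9 edges and 7 nodes:
print(cyclomatic_complexity(edges=9, nodes=7))  # V(G) = 4
```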
Integrative sparse principal component analysis of gene expression data.
Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge
2017-12-01
In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. Given the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. In contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis, other multidataset techniques, and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
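iSPCA itself is not in standard libraries, but the single-dataset sparse PCA it builds on is. The sketch below (Python, scikit-learn, synthetic data) shows how an L1 penalty zeroes out most loadings, which is what makes the components interpretable; the group and contrasted penalties of iSPCA are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# Synthetic "small n, large p" expression matrix (50 samples x 500 genes).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 500))

# The L1 penalty (alpha) drives most gene weights to exactly zero,
# yielding components supported on small, interpretable gene sets.
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
scores = spca.fit_transform(X)
print((spca.components_ != 0).sum(axis=1))  # nonzero genes per component
```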
Breath analysis using external cavity diode lasers: a review
NASA Astrophysics Data System (ADS)
Bayrakli, Ismail
2017-04-01
Most techniques that are used for diagnosis and therapy of diseases are invasive. Reliable noninvasive methods are always needed for the comfort of patients. Owing to its noninvasiveness, ease of use, and easy repeatability, exhaled breath analysis is a very good candidate for this purpose. Breath analysis can be performed using different techniques, such as gas chromatography mass spectrometry (MS), proton transfer reaction-MS, and selected ion flow tube-MS. However, these devices are bulky and require complicated procedures for sample collection and preconcentration. Therefore, these are not practical for routine applications in hospitals. Laser-based techniques with small size, robustness, low cost, low response time, accuracy, precision, high sensitivity, selectivity, low detection limit, real-time, and point-of-care detection have a great potential for routine use in hospitals. In this review paper, the recent advances in the fields of external cavity lasers and breath analysis for detection of diseases are presented.
NASA reliability preferred practices for design and test
NASA Technical Reports Server (NTRS)
1991-01-01
Given here is a manual that was produced to communicate within the aerospace community design practices that have contributed to NASA mission success. The information represents the best technical advice that NASA has to offer on reliability design and test practices. Topics covered include reliability practices, including design criteria, test procedures, and analytical techniques that have been applied to previous space flight programs; and reliability guidelines, including techniques currently applied to space flight projects, where sufficient information exists to certify that the technique will contribute to mission success.
High throughput-screening of animal urine samples: It is fast but is it also reliable?
Kaufmann, Anton
2016-05-01
Advanced analytical technologies like ultra-high-performance liquid chromatography coupled to high resolution mass spectrometry can be used for veterinary drug screening of animal urine. The technique is sufficiently robust and reliable to detect veterinary drugs in urine samples of animals where the maximum residue limit of these compounds in organs like muscle, kidney, or liver has been exceeded. The limitations and possibilities of the technique are discussed. The most critical point is the variability of the drug concentration ratio between the tissue and urine. Ways to manage the false positives and false negatives are discussed. The capability to confirm findings and the possibility of semi-targeted analysis are also addressed. Copyright © 2016 John Wiley & Sons, Ltd.
Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire
NASA Astrophysics Data System (ADS)
Gladyshev, Pavel; Almansoori, Afrah
RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.
NASA Astrophysics Data System (ADS)
Martowicz, Adam; Uhl, Tadeusz
2012-10-01
The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Micro-devices are now commonly applied, especially in the automotive industry, taking advantage of combining the mechanical structure and electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices based on the theory of reliability-based robust design optimization, which takes into consideration the performance of a micro-device and its reliability as assessed by means of uncertainty analysis. The procedure assumes that, for each design configuration checked, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example optimization carried out on a finite element model of a micro-mirror. The multi-physics approach allowed several physical phenomena to be introduced to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by sensitivity analysis to establish the design and uncertainty domains. Genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function while satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced through a conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.
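The paper's exact objective function is not reproduced here, so the sketch below (Python) only illustrates the general shape of a conditionally formulated, penalty-augmented multi-criteria objective of the kind described; the weights and the quadratic penalty form are assumptions:

```python
def penalized_objective(performance_cost, reliability, max_stress,
                        stress_limit, w_perf=0.5, w_rel=0.5,
                        penalty_weight=1e3):
    """Conditionally penalized multi-criteria objective (to be minimized).

    performance_cost -- normalized performance shortfall of the design
    reliability      -- probability the device meets its requirements
    The quadratic penalty activates only when the equivalent stress
    exceeds the material limit (form and weights are assumptions).
    """
    cost = w_perf * performance_cost + w_rel * (1.0 - reliability)
    violation = max(0.0, max_stress - stress_limit)
    return cost + penalty_weight * violation ** 2

# A feasible design (no stress violation) vs. an infeasible one:
print(penalized_objective(0.2, 0.97, 180.0, 200.0))  # 0.115
print(penalized_objective(0.1, 0.99, 210.0, 200.0))  # 100000.055
```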
HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.
Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua
2014-03-01
Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), has been used worldwide; however, for a long time there has been no internationally accepted standard for its quality control. The aim was to establish an effective combined method and pattern-recognition technique for the quality control of ginger. A simple, accurate, and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed for establishing the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility, and stability; the relative standard deviations were all less than 1.57%. On the basis of this method, the fingerprints of the 10 batches of ginger samples were obtained, which showed 16 common peaks. Coupled with similarity evaluation software, the similarities between each sample fingerprint and the simulative mean chromatogram were in the range of 0.998-1.000. Then, chemometric techniques, including similarity analysis, hierarchical clustering analysis, and principal component analysis, were applied to classify the ginger samples. Consistent results showed that the ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method is simple, sensitive, and reliable for fingerprint analysis and, moreover, for pattern recognition and quality control of ginger.
APPLICATION OF TRAVEL TIME RELIABILITY FOR PERFORMANCE ORIENTED OPERATIONAL PLANNING OF EXPRESSWAYS
NASA Astrophysics Data System (ADS)
Mehran, Babak; Nakamura, Hideki
Evaluation of the impacts of congestion improvement schemes on travel time reliability is very significant for road authorities, since travel time reliability represents the operational performance of expressway segments. In this paper, a methodology is presented to estimate travel time reliability prior to implementation of congestion relief schemes, based on travel time variation modeling as a function of demand, capacity, weather conditions, and road accidents. For the subject expressway segments, traffic conditions are modeled over a whole year considering demand and capacity as random variables. Patterns of demand and capacity are generated for each five-minute interval by applying the Monte Carlo simulation technique, and accidents are randomly generated based on a model that links accident rate to traffic conditions. A whole-year analysis is performed by comparing demand and available capacity for each scenario, and queue length is estimated through shockwave analysis for each time interval. Travel times are estimated from refined speed-flow relationships developed for intercity expressways, and the buffer time index is consequently estimated as a measure of travel time reliability. For validation, estimated reliability indices are compared with measured values from empirical data, and it is shown that the proposed method is suitable for operational evaluation and planning purposes.
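The Monte Carlo core of this methodology (sample demand and capacity per interval, convert congestion into travel time, then extract a reliability index) can be sketched compactly. In the Python sketch below, the demand and capacity distributions and the BPR-style delay curve are placeholders for the paper's weather- and accident-conditioned models and shockwave-based queuing analysis:

```python
import numpy as np

rng = np.random.default_rng(42)
n_intervals = 105_120  # one year of 5-minute intervals

# Hypothetical demand and capacity distributions (veh/5 min); real use
# would condition capacity on weather and randomly seeded accidents.
demand = rng.normal(480, 60, n_intervals)
capacity = rng.normal(550, 30, n_intervals)

free_flow_tt = 10.0  # minutes over the segment
# Crude congestion model: travel time inflates with the demand/capacity
# ratio (a BPR-style curve standing in for shockwave queue estimation).
vc = np.clip(demand / capacity, 0, None)
travel_time = free_flow_tt * (1 + 0.15 * vc ** 4)

mean_tt = travel_time.mean()
tt95 = np.percentile(travel_time, 95)
buffer_time_index = (tt95 - mean_tt) / mean_tt
print(f"Buffer time index: {buffer_time_index:.3f}")
```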
A Report on Simulation-Driven Reliability and Failure Analysis of Large-Scale Storage Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan, Lipeng; Wang, Feiyi; Oral, H. Sarp
High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as exponential failure rates) to achieve tractable and closed-form solutions. However, such models have been shown to be insufficient in assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storage systems at scale, and investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, failure patterns and propagation, and performs dependency analysis to capture a wide range of failure cases. We evaluate the framework against a large-scale storage system that is in production and analyze its failure projections toward and beyond the end of its lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and present the preliminary results.
Caruso, J C
2001-06-01
The unreliability of difference scores is a well documented phenomenon in the social sciences and has led researchers and practitioners to interpret differences cautiously, if at all. In the case of the Kaufman Adult and Adolescent Intelligence Test (KAIT), the unreliability of the difference between the Fluid IQ and the Crystallized IQ is due to the high correlation between the two scales. The consequences of the lack of precision with which differences are identified are wide confidence intervals and unpowerful significance tests (i.e., large differences are required to be declared statistically significant). Reliable component analysis (RCA) was performed on the subtests of the KAIT in order to address these problems. RCA is a new data reduction technique that results in uncorrelated component scores with maximum proportions of reliable variance. Results indicate that the scores defined by RCA have discriminant and convergent validity (with respect to the equally weighted scores) and that differences between the scores, derived from a single testing session, were more reliable than differences derived from equal weighting for each age group (11-14 years, 15-34 years, 35-85+ years). This reliability advantage results in narrower confidence intervals around difference scores and smaller differences required for statistical significance.
Taheriyoun, Masoud; Moradinejad, Saber
2015-01-01
The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of a wastewater treatment plant are variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment failures, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among the many techniques developed in system reliability analysis, fault tree analysis (FTA) is one of the most popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, reliability was studied for the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of allowable effluent BOD as the top event, and the deficiencies of the system were identified based on the developed model. Some basic events are operator mistakes, physical damage, and design problems. The analytical methods are minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated from available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence; mechanical, climate, and sewer system factors were in the subsequent tier. The literature shows that FTA has seldom been applied in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves insight into the causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
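Once minimal cut sets and basic event probabilities are in hand, the top event probability follows from a standard rare-event bound. A minimal sketch (Python); the event names, probabilities, and cut sets are invented for illustration, not taken from the Tehran plant model:

```python
from math import prod

def top_event_probability(cut_sets, basic_event_p):
    """Rare-event upper bound on the top event from minimal cut sets.

    P(top) <= 1 - prod over cut sets of (1 - prod of p_i in the cut set),
    treating cut sets as independent.
    """
    return 1 - prod(
        1 - prod(basic_event_p[e] for e in cs) for cs in cut_sets
    )

# Hypothetical basic events for an effluent BOD violation:
p = {"operator_error": 0.05, "aeration_failure": 0.01,
     "clarifier_damage": 0.008, "design_flaw": 0.002}
cut_sets = [{"operator_error", "aeration_failure"},
            {"clarifier_damage"},
            {"operator_error", "design_flaw"}]
print(top_event_probability(cut_sets, p))  # ~0.0086
```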
NASA Technical Reports Server (NTRS)
1994-01-01
This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.
Thermoreflectance spectroscopy—Analysis of thermal processes in semiconductor lasers
NASA Astrophysics Data System (ADS)
Pierścińska, D.
2018-01-01
This review focuses on the theoretical foundations, experimental implementation, and an overview of experimental results of thermoreflectance spectroscopy as a powerful technique for temperature monitoring and analysis of thermal processes in semiconductor lasers. This is an optical, non-contact, high spatial resolution technique providing high temperature resolution and mapping capabilities. Thermoreflectance is a thermometric technique based on measuring the relative change of reflectivity of the laser facet surface, which provides thermal images useful in hot spot detection and reliability studies. In this paper, the principles and experimental implementation of the technique as a thermography tool are discussed. Some exemplary applications of TR to various types of lasers are presented, proving that the thermoreflectance technique provides new insight into heat management problems in semiconductor lasers and, in particular, that it allows studying thermal degradation processes occurring at laser facets. Additionally, thermal processes and basic mechanisms of degradation of the semiconductor laser are discussed.
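The underlying measurement relation is linear to first order: the relative reflectivity change is proportional to the temperature rise. A minimal sketch (the calibration coefficient is an assumed order of magnitude; in practice it is material- and wavelength-dependent and must be calibrated):

    def temperature_rise(delta_r_over_r, c_tr=2.0e-4):
        """Temperature rise from thermoreflectance: dR/R = C_TR * dT,
        so dT = (dR/R) / C_TR. C_TR is typically ~1e-5..1e-4 1/K."""
        return delta_r_over_r / c_tr

    print(temperature_rise(6.0e-3))  # -> 30 K for the assumed C_TR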
An Introduction to Markov Modeling: Concepts and Uses
NASA Technical Reports Server (NTRS)
Boyd, Mark A.; Lau, Sonie (Technical Monitor)
1998-01-01
Markov modeling is a modeling technique that is widely useful for dependability analysis of complex fault tolerant systems. It is very flexible in the type of systems and system behavior it can model. It is not, however, the most appropriate modeling technique for every modeling situation. The first task in obtaining a reliability or availability estimate for a system is selecting which modeling technique is most appropriate to the situation at hand. A person performing a dependability analysis must confront the question: is Markov modeling most appropriate to the system under consideration, or should another technique be used instead? The need to answer this gives rise to other more basic questions regarding Markov modeling: what are the capabilities and limitations of Markov modeling as a modeling technique? How does it relate to other modeling techniques? What kind of system behavior can it model? What kinds of software tools are available for performing dependability analyses with Markov modeling techniques? These questions and others will be addressed in this tutorial.
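As a concrete illustration of the kind of model the tutorial discusses, consider a two-unit redundant system with repair, solved as a continuous-time Markov chain (the rates are arbitrary placeholders):

    import numpy as np
    from scipy.linalg import expm

    lam, mu = 1e-3, 1e-1  # per-hour failure and repair rates (assumed)

    # States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
    Q = np.array([[-2*lam,  2*lam,     0.0],
                  [    mu, -(mu+lam),  lam],
                  [   0.0,     0.0,    0.0]])

    t = 1000.0                    # mission time, hours
    P = expm(Q * t)               # state-transition probabilities over [0, t]
    reliability = 1.0 - P[0, 2]   # start in state 0; avoid the absorbing state
    print(f"R({t:.0f} h) = {reliability:.6f}")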
A study of data analysis techniques for the multi-needle Langmuir probe
NASA Astrophysics Data System (ADS)
Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.
2018-06-01
In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving the ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique seems to agree better with measurements from incoherent scatter radar and other in situ instruments, the probes could be made longer and could be cleaned during operation to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.
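A sketch of the linear-fit step: in the orbital-motion-limited regime the squared probe current grows linearly with bias voltage, and the electron density is proportional to the square root of the slope. The geometry-dependent proportionality constant is omitted here, and the bias and current values are invented for illustration:

    import numpy as np

    def mnlp_density_proxy(bias_volts, currents):
        """Fit I^2 vs. bias voltage; return sqrt(slope), to which the
        electron density is proportional (constant omitted)."""
        slope, _ = np.polyfit(np.asarray(bias_volts),
                              np.asarray(currents) ** 2, 1)
        return np.sqrt(slope)

    bias = [2.5, 4.0, 5.5, 10.0]                 # probe biases, V (assumed)
    i_probe = [1.1e-6, 1.4e-6, 1.65e-6, 2.2e-6]  # collected currents, A
    print(mnlp_density_proxy(bias, i_probe))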
Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles
NASA Technical Reports Server (NTRS)
Gamble, Ed
2012-01-01
Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built and some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.
Implications of scaling on static RAM bit cell stability and reliability
NASA Astrophysics Data System (ADS)
Coones, Mary Ann; Herr, Norm; Bormann, Al; Erington, Kent; Soorholtz, Vince; Sweeney, John; Phillips, Michael
1993-01-01
In order to lower manufacturing costs and increase performance, static random access memory (SRAM) bit cells are scaled progressively toward submicron geometries. The reliability of an SRAM is highly dependent on the bit cell stability. Smaller memory cells with less capacitance and lower restoring current make the array more susceptible to failures from defectivity, alpha hits, and other instabilities and leakage mechanisms. Improving long term reliability while migrating to higher density devices makes the task of building in and improving reliability increasingly difficult. Reliability requirements for high density SRAMs are very demanding, with failure rates of less than 100 failures per billion device hours (100 FITs) being a common criterion. Design techniques for increasing bit cell stability and manufacturability must be implemented in order to build in this level of reliability. Several types of analyses are performed to benchmark the performance of the SRAM device. Examples of these analysis techniques which are presented here include DC parametric measurements of test structures, functional bit mapping of the circuit used to characterize the entire distribution of bits, electrical microprobing of weak and/or failing bits, and system and accelerated soft error rate measurements. These tests allow process and design improvements to be evaluated prior to implementation on the final product. The results are used to provide comprehensive bit cell characterization which can then be compared to device models and adjusted accordingly to provide optimized cell stability versus cell size for a particular technology. The result is designed-in reliability which can be accomplished during the early stages of product development.
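For reference, the FIT arithmetic behind the 100-FIT criterion is straightforward: one FIT is one failure per billion device-hours. A toy calculation (the counts are invented):

    def fit_rate(failures, devices, hours):
        """Point-estimate failure rate in FITs (failures per 1e9 device-hours)."""
        return failures / (devices * hours) * 1e9

    # e.g. 2 failures observed across 5000 devices stressed for 2000 h:
    print(fit_rate(2, 5000, 2000))  # -> 200 FITs, failing a 100-FIT target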
Denaturing high-performance liquid chromatography for mutation detection and genotyping.
Fackenthal, Donna Lee; Chen, Pei Xian; Howe, Ted; Das, Soma
2013-01-01
Denaturing high-performance liquid chromatography (DHPLC) is an accurate and efficient screening technique used for detecting DNA sequence changes by heteroduplex analysis. It can also be used for genotyping of single nucleotide polymorphisms (SNPs). The high sensitivity of DHPLC has made this technique one of the most reliable approaches to mutation analysis and, therefore, used in various areas of genetics, both in the research and clinical arena. This chapter describes the methods used for mutation detection analysis and the genotyping of SNPs by DHPLC on the WAVE™ system from Transgenomic Inc. ("WAVE" and "DNASep" are registered trademarks, and "Navigator" is a trademark, of Transgenomic, used with permission. All other trademarks are property of the respective owners).
Fault trees for decision making in systems analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, Howard E.
1975-10-09
The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
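One widely used probabilistic importance measure of this kind is Fussell-Vesely importance, computable directly from the minimal cut sets. A minimal sketch using the rare-event approximation, with invented events and probabilities (a generic illustration, not the IMPORTANCE code's algorithm):

    def fussell_vesely(cut_sets, p):
        """Fraction of top-event probability involving each basic event,
        using the rare-event approximation P(top) ~ sum of cut-set probs."""
        def prob(cs):
            out = 1.0
            for e in cs:
                out *= p[e]
            return out
        p_top = sum(prob(cs) for cs in cut_sets)
        return {e: sum(prob(cs) for cs in cut_sets if e in cs) / p_top
                for e in p}

    p = {"pump_fails": 0.01, "valve_stuck": 0.02, "operator_error": 0.05}
    cut_sets = [{"pump_fails", "valve_stuck"}, {"operator_error"}]
    print(fussell_vesely(cut_sets, p))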
Relative and absolute reliability of measures of linoleic acid-derived oxylipins in human plasma.
Gouveia-Figueira, Sandra; Bosson, Jenny A; Unosson, Jon; Behndig, Annelie F; Nording, Malin L; Fowler, Christopher J
2015-09-01
Modern analytical techniques allow for the measurement of oxylipins derived from linoleic acid in biological samples. Most validatory work has concerned extraction techniques, repeated analysis of aliquots from the same biological sample, and the influence of external factors such as diet and heparin treatment upon their levels, whereas less is known about the relative and absolute reliability of measurements undertaken on different days. A cohort of nineteen healthy males was used, where samples were taken at the same time of day on two occasions, at least 7 days apart. Relative reliability was assessed using Lin's concordance correlation coefficients (CCC) and intraclass correlation coefficients (ICC). Absolute reliability was assessed by Bland-Altman analyses. Nine linoleic acid oxylipins were investigated. ICC and CCC values ranged from acceptable (0.56 [13-HODE]) to poor (near zero [9(10)- and 12(13)-EpOME]). Bland-Altman limits of agreement were in general quite wide, ranging from ±0.5 (12,13-DiHOME) to ±2 (9(10)-EpOME; log10 scale). It is concluded that relative reliability of linoleic acid-derived oxylipins varies between lipids, with compounds such as the HODEs showing better relative reliability than compounds such as the EpOMEs. These differences should be kept in mind when designing and interpreting experiments correlating plasma levels of these lipids with factors such as age, body mass index, rating scales etc.
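A minimal sketch of the CCC computation used for the relative-reliability assessment (the paired values below are invented stand-ins for log-transformed oxylipin levels):

    import numpy as np

    def concordance_ccc(x, y):
        """Lin's concordance correlation coefficient between paired
        measurements (1 = perfect agreement with the identity line)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxy = np.cov(x, y, bias=True)[0, 1]
        return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

    day1 = [1.2, 0.8, 1.5, 1.1, 0.9]  # hypothetical log10 levels, visit 1
    day2 = [1.1, 0.9, 1.4, 1.3, 0.8]  # same subjects, visit 2
    print(f"CCC = {concordance_ccc(day1, day2):.3f}")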
Human Reliability and the Cost of Doing Business
NASA Technical Reports Server (NTRS)
DeMott, Diana
2014-01-01
Most businesses recognize that people will make mistakes and assume errors are just part of the cost of doing business, but does it need to be? Companies with high risk, or major consequences, should consider the effect of human error. In a variety of industries, human errors have caused costly failures and workplace injuries: airline mishaps, medical malpractice, errors in the administration of medication, and major oil spills have all been blamed on human error. A technique to mitigate or even eliminate some of these costly human errors is the use of Human Reliability Analysis (HRA). Various methodologies are available to perform Human Reliability Assessments, ranging from identifying the most likely areas for concern to detailed assessments with calculated human error failure probabilities. Which methodology to use would be based on a variety of factors that include: 1) how people react and act in different industries, and differing expectations based on industry standards; 2) factors that influence how the human errors could occur, such as tasks, tools, environment, workplace, support, training and procedure; 3) type and availability of data; and 4) how the industry views risk and reliability influences (types of emergencies, contingencies and routine tasks versus cost-based concerns). The Human Reliability Assessments should be the first step to reduce, mitigate or eliminate the costly mistakes or catastrophic failures. Using Human Reliability techniques to identify and classify human error risks allows a company more opportunities to mitigate or eliminate these risks and prevent costly failures.
Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain
2017-01-01
Purpose To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. Materials and methods The search was conducted on articles published in Web of science, PubMed, Scopus, Academic search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Results Twenty articles fulfilled the inclusion criteria with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven out of eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques to evaluate joint motion, and spin tag for muscle motion. Conclusion Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however results should be evaluated with caution since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions. PMID:29232401
Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.
NASA Astrophysics Data System (ADS)
Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan
2017-09-01
Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated with these interventions into the LCC models. Reliability, Maintainability, Availability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS in the LCC analysis allows obtaining reliable predictions of system maintenance costs and the dependencies of these costs on specific cost drivers through sensitivity analyses. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition- and risk-based planning of maintenance activities as well as for decision support in long term strategic investment planning.
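A toy sketch of how failure-rate information enters an LCC: expected corrective-maintenance cost per year is the failure rate times the intervention cost, discounted over the asset life (all numbers are invented, and this is a simplification of a full RAMS & LCC model):

    def corrective_maintenance_npv(failure_rate, cost_per_intervention,
                                   years, discount=0.04):
        """Present value of expected corrective maintenance over the life."""
        return sum(failure_rate * cost_per_intervention / (1 + discount) ** y
                   for y in range(1, years + 1))

    # e.g. 0.8 failures per year on a track section, 12 kEUR per intervention
    print(f"{corrective_maintenance_npv(0.8, 12_000, years=30):,.0f} EUR")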
NASA Astrophysics Data System (ADS)
Rimov, A. A.; Chukanova, T. I.; Trofimov, Yu. V.
2016-12-01
Variants of comparative quality analysis (benchmarking) of power installations applied in the power industry are systematized. It is shown that the most efficient implementation of the benchmarking technique is the analysis of statistical distributions of the indicators within a composed homogeneous group of uniform power installations. Following this approach, a benchmarking technique is developed that is aimed at revealing the available reserves for improving the reliability and heat efficiency indicators of the power installations of thermal power plants. The technique provides a reliable comparison of the quality of the power installations within their homogeneous group, limited by their number, and supports adequate decisions on improving particular technical characteristics of a given power installation. The technique structures the list of comparison indicators and the internal factors affecting them, represented according to the requirements of the sectoral standards and taking into account the price formation characteristics in the Russian power industry. This structuring ensures traceability of the reasons for deviation of the internal influencing factors from the specified values. The starting point for further detailed analysis of the lag of a certain power installation's indicators behind best practice, expressed in a specific money equivalent, is the positioning of this power installation on the distribution of the key indicator, which is a convolution of the comparison indicators. The distribution of the key indicator is simulated by the Monte Carlo method after obtaining the actual distributions of the comparison indicators: specific lost profit due to the short supply of electric energy and short delivery of power, specific cost of losses due to nonoptimal expenditures for repairs, and specific cost of excess fuel equivalent consumption. Quality loss indicators are developed to facilitate the analysis of the benchmarking results, representing the quality loss of a power installation as the difference between the actual value of the key indicator or a comparison indicator and the best quartile of the existing distribution. The uncertainty of the obtained values of the quality loss indicators was evaluated by transforming the standard uncertainties of the input values into the expanded uncertainties of the output values with a confidence level of 95%. The efficiency of the technique is demonstrated by benchmarking the main thermal and mechanical equipment of the T-250 extraction power-generating units and power installations of thermal power plants with a main steam pressure of 130 atm.
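A minimal sketch of the Monte Carlo step: sample the comparison indicators, convolve them into the key indicator, and measure a unit's quality loss against the best quartile (the distributions and values are invented):

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed samples of the three cost-type comparison indicators
    lost_profit  = rng.lognormal(0.0, 0.4, 10_000)
    repair_waste = rng.lognormal(0.2, 0.3, 10_000)
    excess_fuel  = rng.lognormal(0.1, 0.5, 10_000)

    key = lost_profit + repair_waste + excess_fuel  # Monte Carlo convolution
    best_quartile = np.percentile(key, 25)          # "best practice" level

    unit_key = 4.1  # this unit's observed key indicator (assumed)
    quality_loss = max(0.0, unit_key - best_quartile)
    print(f"quality loss: {quality_loss:.2f} (specific money units)")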
NASA Technical Reports Server (NTRS)
Bien, D. D.
1973-01-01
This analysis considers the optimum allocation of redundancy in a system of serially connected subsystems in which each subsystem is of the k-out-of-n type. Redundancy is optimally allocated when: (1) reliability is maximized for given costs; or (2) costs are minimized for given reliability. Several techniques are presented for achieving optimum allocation and their relative merits are discussed. Approximate solutions in closed form were attainable only for the special case of series-parallel systems and the efficacy of these approximations is discussed.
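A common heuristic for this kind of allocation adds spares where the reliability gain per unit cost is largest. The sketch below is a greedy illustration under assumed component data, not the optimization techniques evaluated in the report:

    from math import comb

    def k_out_of_n(k, n, r):
        """Reliability of a k-out-of-n subsystem of identical components."""
        return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

    def greedy_allocation(subsystems, budget):
        """Repeatedly add the spare with the best reliability gain per cost."""
        n = [s["k"] for s in subsystems]  # start with no redundancy (n = k)
        spend = 0.0
        while True:
            best, best_gain = None, 0.0
            for i, s in enumerate(subsystems):
                if spend + s["cost"] > budget:
                    continue
                gain = (k_out_of_n(s["k"], n[i] + 1, s["r"]) -
                        k_out_of_n(s["k"], n[i], s["r"])) / s["cost"]
                if gain > best_gain:
                    best, best_gain = i, gain
            if best is None:
                return n, spend
            n[best] += 1
            spend += subsystems[best]["cost"]

    subsystems = [{"k": 2, "r": 0.90, "cost": 1.0},
                  {"k": 1, "r": 0.95, "cost": 2.0}]
    print(greedy_allocation(subsystems, budget=6.0))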
NASA Astrophysics Data System (ADS)
Kim, Dong Hyeok; Lee, Ouk Sub; Kim, Hong Min; Choi, Hye Bin
2008-11-01
A modified Split Hopkinson Pressure Bar (SHPB) technique with aluminum pressure bars and a pulse shaper technique was used to achieve a closer impedance match between the pressure bars and specimen materials such as high-temperature-degraded POM (polyoxymethylene) and PP (polypropylene). More distinguishable experimental signals were obtained to evaluate more accurately the dynamic deformation behavior of materials under a high strain rate loading condition. A pulse shaping technique is introduced to reduce the non-equilibrium in the dynamic material response by modulating the incident wave during the short test period; this increases the rise time of the incident pulse in the SHPB experiment. For the dynamic stress-strain curve obtained from the SHPB experiment, the Johnson-Cook model is applied as a constitutive equation. The applicability of this constitutive equation is verified by using the probabilistic reliability estimation method. Two reliability methodologies, the FORM and the SORM, have been proposed. The limit state function (LSF) includes the Johnson-Cook model and the applied stresses. The LSF in this study allows more statistical flexibility on the yield stress than a previously published paper. It is found that the failure probability estimated by using the SORM is more reliable than that of the FORM. It is also noted that the failure probability increases with increasing applied stress. Moreover, the parameters of the Johnson-Cook model, such as A and n, and the applied stress are found to affect the failure probability more severely than the other random variables according to the sensitivity analysis.
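For reference, the Johnson-Cook flow stress that enters the limit state function has the standard multiplicative form. A sketch with placeholder constants (these are not the fitted POM/PP parameters from the study):

    from math import log

    def johnson_cook(eps, eps_rate, T, A, B, n, C, m,
                     eps_rate0=1.0, T_ref=293.0, T_melt=450.0):
        """Flow stress = (A + B*eps^n) * (1 + C*ln(rate ratio)) * (1 - T*^m),
        with homologous temperature T* = (T - T_ref) / (T_melt - T_ref)."""
        t_star = (T - T_ref) / (T_melt - T_ref)
        return ((A + B * eps**n)
                * (1 + C * log(eps_rate / eps_rate0))
                * (1 - t_star**m))

    print(johnson_cook(eps=0.1, eps_rate=1500.0, T=330.0,
                       A=90.0, B=250.0, n=0.4, C=0.05, m=1.0))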
NASA Technical Reports Server (NTRS)
Towner, Robert L.; Band, Jonathan L.
2012-01-01
An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
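The Modal Assurance Criterion mentioned above is simple to compute and is a natural basis for automated mode pairing. A minimal sketch with stand-in mode shapes (real use would take eigenvector matrices from the two Finite Element Models):

    import numpy as np

    def mac_matrix(phi_a, phi_b):
        """MAC between two mode-shape sets (columns are modes):
        MAC(i, j) = |phi_a_i . phi_b_j|^2 / (|phi_a_i|^2 * |phi_b_j|^2)."""
        num = np.abs(phi_a.T @ phi_b) ** 2
        den = np.outer((phi_a**2).sum(axis=0), (phi_b**2).sum(axis=0))
        return num / den

    def track_modes(phi_a, phi_b):
        """Pair each mode of model A with its best-matching mode of model B."""
        mac = mac_matrix(phi_a, phi_b)
        return {i: int(np.argmax(mac[i])) for i in range(mac.shape[0])}

    rng = np.random.default_rng(1)
    phi_a = rng.random((100, 5))                 # 100 DOFs, 5 modes (stand-ins)
    phi_b = phi_a + 0.01 * rng.random((100, 5))  # slightly perturbed model
    print(track_modes(phi_a, phi_b))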
Mai, Hang-Nga; Lee, Kyeong Eun; Lee, Kyu-Bok; Jeong, Seung-Mi; Lee, Seok-Jae; Lee, Cheong-Hee; An, Seo-Young; Lee, Du-Hyeong
2017-10-01
The purpose of this study was to evaluate the reliability of the computer-aided replica technique (CART) by calculating its agreement with the replica technique (RT), using statistical agreement analysis. A prepared metal die and a metal crown were fabricated. The gap between the restoration and abutment was replicated using silicone indicator paste (n = 25). Gap measurement procedures differed between the control (RT) and experimental (CART) groups. In the RT group, the silicone replica was manually sectioned, and the marginal and occlusal gaps were measured using a microscope. In the CART group, the gap was digitized using optical scanning and image superimposition, and the gaps were measured using a software program. The agreement between the measurement techniques was evaluated by using the 95% Bland-Altman limits of agreement and concordance correlation coefficients (CCC). The least acceptable CCC was 0.90. The RT and CART groups showed a linear association, with a strong positive correlation in gap measurements, but without significant differences. The 95% limits of agreement between the paired gap measurements were 3.84% and 7.08% of the mean. The lower 95% confidence limits of CCC were 0.9676 and 0.9188 for the marginal and occlusal gap measurements, respectively, and the values were greater than the allowed limit. The CART is a reliable digital approach for evaluating the fit accuracy of fixed dental prostheses.
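A sketch of the Bland-Altman side of such an agreement analysis (the paired gap values are invented, not the study's measurements):

    import numpy as np

    def bland_altman_limits(a, b):
        """Bias and 95% limits of agreement between two techniques."""
        d = np.asarray(a, float) - np.asarray(b, float)
        bias, sd = d.mean(), d.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    rt   = [52.0, 48.5, 55.1, 50.2, 47.8]  # hypothetical gaps, micrometers
    cart = [51.2, 49.0, 54.3, 51.0, 47.1]
    bias, (lo, hi) = bland_altman_limits(rt, cart)
    print(f"bias {bias:.2f} um, limits of agreement [{lo:.2f}, {hi:.2f}] um")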
The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1995-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Effect of Entropy Generation on Wear Mechanics and System Reliability
NASA Astrophysics Data System (ADS)
Gidwani, Akshay; James, Siddanth; Jagtap, Sagar; Karthikeyan, Ram; Vincent, S.
2018-04-01
Wear is an irreversible phenomenon. Processes such as mutual sliding and rolling between materials involve entropy generation, and these processes are monotonic with respect to time. The concept of entropy generation is quantified using the Degradation Entropy Generation theorem formulated by Michael D. Bryant. The sliding-wear model can be extrapolated to other settings in order to provide a potential analysis of machine prognostics as well as system and process reliability, even for processes beyond the purely mechanical. In other words, using the concepts of entropy generation and wear, one can quantify the reliability of a system with respect to time using a thermodynamic variable, which is the basis of this paper. Thus, in the present investigation, a unique attempt has been made to establish a correlation between entropy, wear, and reliability, which can be a useful technique in preventive maintenance.
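A back-of-the-envelope sketch of the idea: frictional entropy is generated at rate mu*N*v/T, and the Degradation Entropy Generation view makes wear proportional to the accumulated entropy through an empirically fitted coefficient. All values below are invented:

    def wear_from_entropy(mu, normal_force, sliding_speed, temperature,
                          B, duration):
        """Wear estimate w = B * S, with frictional entropy rate
        S' = mu * N * v / T (B is an empirical degradation coefficient)."""
        entropy_rate = mu * normal_force * sliding_speed / temperature
        return B * entropy_rate * duration

    # hypothetical dry sliding contact over one hour
    print(wear_from_entropy(mu=0.3, normal_force=50.0, sliding_speed=0.1,
                            temperature=300.0, B=2.0e-9, duration=3600.0))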
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, J.; Tolson, B.
2017-12-01
The increasing complexity and runtime of environmental models lead to the current situation in which the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state of the art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, may itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method's independence of the convergence testing method, we applied it to two widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that no further model evaluation is necessary, which enables checking of already processed sensitivity results. This is one step towards reliable and transferable published sensitivity results.
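For orientation, the Morris screening mentioned above is built from elementary effects: finite-difference responses collected along one-at-a-time trajectories. A minimal single-trajectory sketch with a toy model (the real method averages |EE| over many trajectories):

    import numpy as np

    def elementary_effects(model, x0, delta=0.05):
        """One Morris trajectory: perturb one parameter at a time by
        `delta` and record the change in model output."""
        y_prev, effects = model(x0), []
        x = x0.copy()
        for i in range(len(x0)):
            x[i] += delta
            y = model(x)
            effects.append((y - y_prev) / delta)
            y_prev = y
        return effects

    model = lambda x: x[0]**2 + 2*x[1] + 0.1*x[2]  # toy model
    print(elementary_effects(model, np.array([0.5, 0.5, 0.5])))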
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard; Day, John H. (Technical Monitor)
2001-01-01
This report will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques with this issue's section discussing the use of Root-Sum-Square calculations for digital delays.
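As a flavor of what such a Root-Sum-Square note covers: the RSS estimate for a chain of digital delays assumes the stage tolerances are independent, and it sits between the nominal and worst-case totals. A toy calculation (the delay values are invented):

    from math import sqrt

    def rss_delay(nominal_delays, tolerances):
        """Worst-case vs. root-sum-square totals for a chain of delays."""
        nominal = sum(nominal_delays)
        worst_case = nominal + sum(tolerances)
        rss = nominal + sqrt(sum(t * t for t in tolerances))
        return worst_case, rss

    # hypothetical stage delays and tolerances, in ns
    print(rss_delay([5.0, 3.2, 4.1], [0.5, 0.3, 0.4]))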
Break-even Analysis: Tool for Budget Planning
ERIC Educational Resources Information Center
Lohmann, Roger A.
1976-01-01
Multiple funding creates special management problems for the administrator of a human service agency. This article presents a useful analytic technique adapted from business practice that can help the administrator draw up and balance a unified budget. Such a budget also affords a reliable overview of the agency's financial status. (Author)
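The technique itself reduces to one line of arithmetic: the break-even volume is fixed costs divided by the contribution margin per unit of service. A toy example with invented figures:

    def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
        """Service volume at which revenue exactly covers total cost."""
        return fixed_costs / (price_per_unit - variable_cost_per_unit)

    # hypothetical program: $120,000 fixed, $85 fee, $40 variable cost per unit
    print(break_even_units(120_000, 85.0, 40.0))  # -> ~2667 service units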
Glavas, Panagiotis; Mac-Thiong, Jean-Marc; Parent, Stefan; de Guise, Jacques A.
2008-01-01
Although recognized as an important aspect in the management of spondylolisthesis, there is no consensus on the most reliable and optimal measure of lumbosacral kyphosis (LSK). Using custom computer software, four raters evaluated 60 standing lateral radiographs of the lumbosacral spine during two sessions at a 1-week interval. The sample consisted of 20 normal, 20 low-grade and 20 high-grade spondylolisthetic subjects. Six parameters were included for analysis: Boxall’s slip angle, Dubousset’s lumbosacral angle (LSA), the Spinal Deformity Study Group’s (SDSG) LSA, dysplastic SDSG LSA, sagittal rotation (SR), and the kyphotic Cobb angle (k-Cobb). Intra- and inter-rater reliability for all parameters was assessed using intra-class correlation coefficients (ICC). Correlations between parameters and slip percentage were evaluated with Pearson coefficients. The intra-rater ICCs for all the parameters ranged between 0.81 and 0.97, and the inter-rater ICCs were between 0.74 and 0.98. All parameters except sagittal rotation showed a medium to large correlation with slip percentage. Dubousset’s LSA and the k-Cobb showed the largest correlations (r = −0.78 and r = −0.50, respectively). SR was associated with the weakest correlation (r = −0.10). All other parameters had medium correlations with percent slip (r = 0.31–0.43). All measurement techniques provided excellent inter- and intra-rater reliability. Dubousset’s LSA showed the strongest correlation with slip grade. This parameter can be used in the clinical setting with PACS software capabilities to assess LSK. A computer-assisted technique is recommended in order to increase the reliability of the measurement of LSK in spondylolisthesis. PMID:19015898
Sub-Nanosecond Lifetime Measurement Using the Recoil-Distance Method
Wu, Ching-Yen
2000-01-01
The electromagnetic properties of low-lying nuclear states are a sensitive probe of both collective and single-particle degrees of freedom in nuclear structure. The recoil-distance technique provides a very reliable, direct and precise method for measuring lifetimes of nuclear states in the range from less than one to several hundred picoseconds. This method complements the powerful, but complicated, heavy-ion induced Coulomb excitation technique for measuring electromagnetic properties. The recoil-distance technique has been combined with heavy-ion induced Coulomb excitation to study a variety of problems. Examples discussed are: study of the two-phonon triplet in 110Pd, coupling of the β and γ degrees of freedom in 182,184W, highly deformed γ bands in 165Ho, octupole collectivity in 96Zr, and opposite parity states in 153Eu. Consistency between the Coulomb excitation results and the lifetime measurements confirms the reliability of the complex analysis often encountered in heavy-ion induced Coulomb excitation work. PMID:27551588
Analytical Model of Large Data Transactions in CoAP Networks
Ludovici, Alessandro; Di Marco, Piergiuseppe; Calveras, Anna; Johansson, Karl H.
2014-01-01
We propose a novel analytical model to study fragmentation methods in wireless sensor networks adopting the Constrained Application Protocol (CoAP) and the IEEE 802.15.4 standard for medium access control (MAC). The blockwise transfer technique proposed in CoAP and the 6LoWPAN fragmentation are included in the analysis. The two techniques are compared in terms of reliability and delay, depending on the traffic, the number of nodes and the parameters of the IEEE 802.15.4 MAC. The results are validated through Monte Carlo simulations. To the best of our knowledge this is the first study that evaluates and compares analytically the performance of CoAP blockwise transfer and 6LoWPAN fragmentation. A major contribution is the possibility to understand the behavior of both techniques under different network conditions. Our results show that 6LoWPAN fragmentation is preferable for delay-constrained applications. For highly congested networks, the blockwise transfer slightly outperforms 6LoWPAN fragmentation in terms of reliability. PMID:25153143
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Rice, Mark J.
Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of the contingency analysis are used to ensure grid reliability and, in power market operation, for the feasibility test of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a set of pre-selected contingencies, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3964 with 4096 cores and 7877 with 10,240 cores are obtained. This paper reports the performance of the load balancing scheme with a single counter and with two counters, describes disk I/O issues, and discusses other potential techniques for further improving the performance.
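The counter-based scheme amounts to workers drawing the next case number from a shared counter whenever they go idle. A single-node Python analogue of that pattern (a toy stand-in; the paper's implementation runs on 10,000+ HPC cores, and the power-flow solve here is dummy work):

    from multiprocessing import Pool

    def run_contingency(case_id):
        """Stand-in for solving one N-2 contingency case."""
        return case_id, sum(i * i for i in range(10_000))  # dummy work

    if __name__ == "__main__":
        # chunksize=1 makes the task queue behave like one shared counter:
        # each worker fetches the next case as soon as it finishes one.
        with Pool(processes=8) as pool:
            results = dict(pool.imap_unordered(run_contingency,
                                               range(1000), chunksize=1))
        print(len(results), "cases solved")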
Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.
Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko
2017-11-01
A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and the sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within the granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates of mean granule sizes. Overall, SA was a time-consuming but accurate technique that provided reliable information for the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution but nevertheless yielded acceptable estimates of mean particle size. In general, FS was two to three orders of magnitude faster than SA.
Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Tutorial
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. L. Smith; S. T. Beck; S. T. Wood
2008-08-01
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). This volume is the tutorial manual for the SAPHIRE system. In this document, a series of lessons is provided that guides the user through the basic steps common to most analyses performed with SAPHIRE. The tutorial is divided into two major sections covering both basic and advanced features. The section covering basic topics contains lessons that lead the reader through the development of a probabilistic hypothetical problem involving a vehicle accident, highlighting the program's most fundamental features. The advanced features section contains additional lessons that expand on the fundamental analysis features of SAPHIRE and provide insights into more complex analysis techniques. Together, these two elements provide an overview of the operation and capabilities of the SAPHIRE software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman; Jeffrey C. Joe
2005-09-01
An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.
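The core SPAR-H calculation is a nominal human error probability scaled by performance shaping factor (PSF) multipliers. A simplified sketch (the multiplier values are illustrative; the full method uses eight PSFs and an extra adjustment when several PSFs are negative):

    def spar_h_hep(nominal_hep, psf_multipliers):
        """Human error probability = nominal HEP x product of PSF
        multipliers, capped at 1.0 (simplified SPAR-H-style estimate)."""
        hep = nominal_hep
        for m in psf_multipliers:
            hep *= m
        return min(hep, 1.0)

    # e.g. an action task with poor ergonomics (x10) and high stress (x2)
    print(spar_h_hep(0.001, [10, 2]))  # -> 0.02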
Development and validation of technique for in-vivo 3D analysis of cranial bone graft survival
NASA Astrophysics Data System (ADS)
Bernstein, Mark P.; Caldwell, Curtis B.; Antonyshyn, Oleh M.; Ma, Karen; Cooper, Perry W.; Ehrlich, Lisa E.
1997-05-01
Bone autografts are routinely employed in the reconstruction of facial deformities resulting from trauma, tumor ablation or congenital malformations. The combined use of post- operative 3D CT and SPECT imaging provides a means for quantitative in vivo evaluation of bone graft volume and osteoblastic activity. The specific objectives of this study were: (1) Determine the reliability and accuracy of interactive computer-assisted analysis of bone graft volumes based on 3D CT scans; (2) Determine the error in CT/SPECT multimodality image registration; (3) Determine the error in SPECT/SPECT image registration; and (4) Determine the reliability and accuracy of CT-guided SPECT uptake measurements in cranial bone grafts. Five human cadaver heads served as anthropomorphic models for all experiments. Four cranial defects were created in each specimen with inlay and onlay split skull bone grafts and reconstructed to skull and malar recipient sites. To acquire all images, each specimen was CT scanned and coated with Technetium doped paint. For purposes of validation, skulls were landmarked with 1/16-inch ball-bearings and Indium. This study provides a new technique relating anatomy and physiology for the analysis of cranial bone graft survival.
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
The following study proposes and tests an integrated methodology involving Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific aspects of robotic surgery involving safety, process and technology. The integrated methodology consists of the application of specific techniques from HTA joined with the most typical models from reliability engineering, such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 for urology and 16 for general surgery. The main outcomes refer to the comparative evaluation of robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions come from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm the better trend for robotic surgical times in comparison with the open technique, as well as confirming the clinical benefits of robotics in urology. A more complex situation is observed for general surgery, where the lowest blood transfusion rate is the only directly measured clinical benefit of robotics.
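In the FMECA part of such a methodology, failure modes are typically ranked by a risk priority number. A minimal sketch with invented failure modes and 1-10 scale scores:

    def rpn(severity, occurrence, detection):
        """FMEA/FMECA risk priority number on the usual 1-10 scales."""
        return severity * occurrence * detection

    failure_modes = {
        "instrument arm collision": (7, 3, 4),  # hypothetical scores
        "console video loss":       (8, 2, 2),
    }
    for name, scores in sorted(failure_modes.items(),
                               key=lambda kv: -rpn(*kv[1])):
        print(name, rpn(*scores))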
NASA Astrophysics Data System (ADS)
Akhmetova, I. G.; Chichirova, N. D.
2016-12-01
Heat supply is the most energy-consuming sector of the economy: approximately 30% of all primary fuel-and-energy resources are spent on municipal heat-supply needs. One of the key indicators of the activity of heat-supply organizations is the reliability of an energy facility, and the reliability index of a heat-supply organization is of interest to potential investors for assessing risks when investing in projects. The reliability indices established by federal legislation actually reduce to a single numerical factor, which depends on the number of heat-supply outages connected with disturbances in the operation of heat networks and the volume of their resource recovery in the calculation year. This factor is rather subjective and may change over a wide range within several years. A technique is proposed for evaluating the reliability of heat-supply organizations using the simple additive weighting (SAW) method. The technique for determining the integrated index satisfies the following conditions: the reliability level of the evaluated heat-supply system is represented as fully and objectively as possible, and the information used for the evaluation is easily available (it is published on the Internet in accordance with the demands of data-disclosure standards). For the reliability estimation of heat-supply organizations, the following indicators were selected: the wear of equipment of thermal energy sources; the wear of heat networks; the number of outages of thermal energy (heat carrier) supply due to technological disturbances in heat networks, per 1 km of heat networks; the number of outages of thermal energy (heat carrier) supply due to technological disturbances at thermal energy sources, per 1 Gcal/h of installed power; the share of expenditures in the cost of thermal energy aimed at recovery of the resource (renewal of fixed assets); the coefficient of renewal of fixed assets; and the coefficient of fixed asset retirement. A versatile program was developed, and the analysis of heat-supply organizations was performed using the example of the Republic of Tatarstan. The assessment system is based on the construction of comparative ratings of heat-supply organizations. A rating assesses the reliability of an organization, is characterized by a numerical value, and makes it possible to compare organizations engaged in the same kind of activity.
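A minimal sketch of the SAW scoring step, with invented indicator values: each criterion column is normalized (here all criteria are cost-type, so lower is better) and combined as a weighted sum:

    import numpy as np

    def saw_scores(matrix, weights, benefit):
        """Simple additive weighting: normalize each criterion column,
        then take the weighted sum (higher score = more reliable)."""
        m = np.asarray(matrix, float)
        norm = np.empty_like(m)
        for j in range(m.shape[1]):
            col = m[:, j]
            norm[:, j] = col / col.max() if benefit[j] else col.min() / col
        return norm @ np.asarray(weights, float)

    # rows: organizations; columns: network wear %, outages per km (assumed)
    matrix  = [[60.0, 0.8], [45.0, 1.2], [70.0, 0.5]]
    weights = [0.6, 0.4]
    print(saw_scores(matrix, weights, benefit=[False, False]))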
Safety considerations in the design and operation of large wind turbines
NASA Technical Reports Server (NTRS)
Reilly, D. H.
1979-01-01
The engineering and safety techniques used to assure the reliable and safe operation of large wind turbine generators are described, using the Mod 2 Wind Turbine System Program as an example. The techniques involve a careful definition of the wind turbine's natural and operating environments, the use of proven structural design criteria and analysis techniques, an evaluation of potential failure modes and hazards, and the use of a fail-safe and redundant-component engineering philosophy. The role of an effective quality assurance program, tailored to specific hardware criticality, and the checkout and validation program developed to assure system integrity are described.
NASA Astrophysics Data System (ADS)
Malinowski, Arkadiusz; Takeuchi, Takuya; Chen, Shang; Suzuki, Toshiya; Ishikawa, Kenji; Sekine, Makoto; Hori, Masaru; Lukasiak, Lidia; Jakubowski, Andrzej
2013-07-01
This paper describes a new, fast, and case-independent technique for sticking coefficient (SC) estimation based on the pallet for plasma evaluation (PAPE) structure and numerical analysis. Our approach does not require a complicated structure, apparatus, or time-consuming measurements, but offers high reliability of data and high flexibility. Thermal analysis is also possible. This technique has been successfully applied to the estimation of the very low SC of hydrogen radicals on chemically amplified ArF 193 nm photoresist (the main goal of this study). The upper bound of our technique has been determined by investigating the SC of fluorine radicals on polysilicon (at elevated temperature). Sources of estimation error and ways to reduce it are also discussed. The results of this study give insight into the process kinetics; not only are they helpful for better process understanding, but they may also serve as parameters in the development of a phenomenological model for predictive modelling of etching for ultimate CMOS topography simulation.
NASA Technical Reports Server (NTRS)
Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.
1981-01-01
The current status of the Active Controls Technology (ACT) for the advanced subsonic transport project is investigated through analysis of the systems technical data. Control systems technologies under examination include computerized reliability analysis, pitch axis fly by wire actuator, flaperon actuation system design trade study, control law synthesis and analysis, flutter mode control and gust load alleviation analysis, and implementation of alternative ACT systems. Extensive analysis of the computer techniques involved in each system is included.
The impact of Lean bundles on hospital performance: does size matter?
Al-Hyari, Khalil; Abu Hammour, Sewar; Abu Zaid, Mohammad Khair Saleem; Haffar, Mohamed
2016-10-10
Purpose The purpose of this paper is to study the effect of the implementation of Lean bundles on hospital performance in private hospitals in Jordan and to evaluate how much the size of the organization affects the relationship between Lean bundles implementation and hospital performance. Design/methodology/approach The research uses quantitative methods (descriptive and hypothesis testing). Three statistical techniques were adopted to analyse the data. Structural equation modeling techniques and multi-group analysis were used to examine the research's hypotheses and to perform the required statistical analysis of the data from the survey. Reliability analysis and confirmatory factor analysis were used to test construct validity, reliability, and measurement loadings. Findings Lean bundles have been identified as an effective approach that can dramatically improve the organizational performance of private hospitals in Jordan. The main Lean bundles - just in time, human resource management, and total quality management - are applicable to large, small and medium hospitals without significant differences in advantages that depend on size. Originality/value According to the researchers' best knowledge, this is the first research that studies the impact of Lean bundles implementation in the healthcare sector in Jordan. This research also makes a significant contribution for decision makers in healthcare to increase their awareness of Lean bundles.
Development of an ultrasonic weld inspection system based on image processing and neural networks
NASA Astrophysics Data System (ADS)
Roca Barceló, Fernando; Jaén del Hierro, Pedro; Ribes Llario, Fran; Real Herráiz, Julia
2018-04-01
Several types of discontinuities and defects may be present in a weld, leading to a considerable reduction of its strength. Therefore, ensuring high welding quality and reliability has become a matter of key importance for many construction and industrial activities. Among the non-destructive weld testing and inspection techniques, time-of-flight diffraction (TOFD) stands out as a very safe (no ionising radiation), precise, reliable and versatile practice. However, this technique presents a relevant drawback associated with the appearance of speckle noise, which must be addressed. In this regard, this paper presents a new, intelligent and automatic method for weld inspection and analysis, based on TOFD, image processing and neural networks. The developed system is capable of detecting weld defects and imperfections with accuracy and classifying them into different categories.
The PAWS and STEM reliability analysis programs
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Stevenson, Philip H.
1988-01-01
The PAWS and STEM programs are new design/validation tools. These programs provide a flexible, user-friendly, language-based interface for the input of Markov models describing the behavior of fault-tolerant computer systems. These programs produce exact solutions of the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. PAWS uses a Pade approximation as a solution technique; STEM uses a Taylor series as a solution technique. Both programs have the capability to solve numerically stiff models. PAWS and STEM possess complementary properties with regard to their input space, and an additional strength of these programs is that they accept input compatible with the SURE program. If used in conjunction with SURE, PAWS and STEM provide a powerful suite of programs to analyze the reliability of fault-tolerant computer systems.
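To make the Taylor-series idea concrete, here is a plain truncated-series evaluation of the Markov transition matrix exp(Qt) (a sketch only, not STEM's implementation: naive truncation loses accuracy on exactly the stiff models these tools are designed to handle):

    import numpy as np

    def expm_taylor(Q, t, terms=60):
        """Truncated Taylor series for exp(Q*t) of a Markov generator."""
        A = Q * t
        P = np.eye(A.shape[0])
        term = np.eye(A.shape[0])
        for k in range(1, terms):
            term = term @ A / k
            P = P + term
        return P

    lam = 1e-4  # per-hour failure rate (assumed)
    Q = np.array([[-lam, lam],
                  [ 0.0, 0.0]])          # single unit, absorbing failure state
    print(expm_taylor(Q, 10.0)[0, 1])    # P(failed by t = 10 h) ~ 1e-3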
Spatially Regularized Machine Learning for Task and Resting-state fMRI
Song, Xiaomu; Panych, Lawrence P.; Chen, Nan-kuei
2015-01-01
Background Reliable mapping of brain function across sessions and/or subjects in task- and resting-state has been a critical challenge for quantitative fMRI studies although it has been intensively addressed in the past decades. New Method A spatially regularized support vector machine (SVM) technique was developed for the reliable brain mapping in task- and resting-state. Unlike most existing SVM-based brain mapping techniques, which implement supervised classifications of specific brain functional states or disorders, the proposed method performs a semi-supervised classification for the general brain function mapping where spatial correlation of fMRI is integrated into the SVM learning. The method can adapt to intra- and inter-subject variations induced by fMRI nonstationarity, and identify a true boundary between active and inactive voxels, or between functionally connected and unconnected voxels in a feature space. Results The method was evaluated using synthetic and experimental data at the individual and group level. Multiple features were evaluated in terms of their contributions to the spatially regularized SVM learning. Reliable mapping results in both task- and resting-state were obtained from individual subjects and at the group level. Comparison with Existing Methods A comparison study was performed with independent component analysis, general linear model, and correlation analysis methods. Experimental results indicate that the proposed method can provide a better or comparable mapping performance at the individual and group level. Conclusions The proposed method can provide accurate and reliable mapping of brain function in task- and resting-state, and is applicable to a variety of quantitative fMRI studies. PMID:26470627
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia
2015-04-26
Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), amore » systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.« less
Predicting neuropathic ulceration: analysis of static temperature distributions in thermal images
NASA Astrophysics Data System (ADS)
Kaabouch, Naima; Hu, Wen-Chen; Chen, Yi; Anderson, Julie W.; Ames, Forrest; Paulson, Rolf
2010-11-01
Foot ulcers affect millions of Americans annually. Conventional methods used to assess skin integrity, including inspection and palpation, may be valuable approaches, but they usually do not detect changes in skin integrity until an ulcer has already developed. We analyze the feasibility of thermal imaging as a technique to assess the integrity of the skin and its many layers. Thermal images are analyzed using an asymmetry analysis, combined with a genetic algorithm, to examine the infrared images for early detection of foot ulcers. Preliminary results show that the proposed technique can reliably and efficiently detect inflammation and hence effectively predict potential ulceration.
System Analysis by Mapping a Fault-tree into a Bayesian-network
NASA Astrophysics Data System (ADS)
Sheng, B.; Deng, C.; Wang, Y. H.; Tang, L. H.
2018-05-01
In view of the limitations of fault tree analysis in reliability assessment, the Bayesian Network (BN) has been studied as an alternative technology. After a brief introduction to the method for mapping a Fault Tree (FT) into an equivalent BN, equations used to calculate the structure importance degree, the probability importance degree, and the critical importance degree are presented. Furthermore, the correctness of these equations is proved mathematically. For an aircraft landing gear's FT, an equivalent BN is developed and analysed. The results show that richer and more accurate information has been obtained through the BN method than through the FT, which demonstrates that the BN is a superior technique for both reliability assessment and fault diagnosis.
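As a rough illustration of the importance measures mentioned above (not code from the paper), the sketch below computes the probability importance (Birnbaum) and the critical importance for a hypothetical three-component fault tree by exhaustive state enumeration; the gate structure and failure probabilities are invented for illustration.

```python
from itertools import product

# Hypothetical 3-component system: TOP = A OR (B AND C),
# standing in for a small fault tree mapped to its equivalent BN.
def top_event(a, b, c):
    return a or (b and c)

p = {"A": 0.01, "B": 0.05, "C": 0.02}  # assumed component failure probabilities

def system_failure_prob(probs):
    """Exact P(TOP) by enumerating all component states."""
    total = 0.0
    for a, b, c in product([0, 1], repeat=3):
        if top_event(a, b, c):
            pr = 1.0
            for name, state in zip("ABC", (a, b, c)):
                pr *= probs[name] if state else 1 - probs[name]
            total += pr
    return total

P_sys = system_failure_prob(p)
for comp in "ABC":
    hi = dict(p, **{comp: 1.0})  # component failed with certainty
    lo = dict(p, **{comp: 0.0})  # component working with certainty
    birnbaum = system_failure_prob(hi) - system_failure_prob(lo)  # probability importance
    critical = birnbaum * p[comp] / P_sys                          # critical importance
    print(f"{comp}: Birnbaum={birnbaum:.4f}, critical={critical:.4f}")
```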
ERIC Educational Resources Information Center
Foster, Tanina S.
2014-01-01
Introduction: Observational research using the thin slice technique has been routinely incorporated in observational research methods; however, there is limited evidence supporting use of this technique compared to full interaction coding. The purpose of this study was to determine if this technique could be reliably coded, if ratings are…
NASA Astrophysics Data System (ADS)
Vatanparast, Maryam; Vullum, Per Erik; Nord, Magnus; Zuo, Jian-Min; Reenaas, Turid W.; Holmestad, Randi
2017-09-01
Geometric phase analysis (GPA), a fast and simple Fourier-space method for strain analysis, can give useful information on accumulated strain and defect propagation in multiple layers of semiconductors, including quantum dot materials. In this work, GPA has been applied to high-resolution Z-contrast scanning transmission electron microscopy (STEM) images. Strain maps determined from different g vectors of these images are compared to each other in order to assess the accuracy of the GPA technique. The SmartAlign tool has been used to improve the STEM image quality, yielding more reliable results. Strain maps from template matching, a real-space approach, are compared with strain maps from GPA, and we argue that real-space analysis is a better approach than GPA for aberration-corrected STEM images.
[Examination of safety improvement by failure record analysis that uses reliability engineering].
Kato, Kyoichi; Sato, Hisaya; Abe, Yoshihisa; Ishimori, Yoshiyuki; Hirano, Hiroshi; Higashimura, Kyoji; Amauchi, Hiroshi; Yanakita, Takashi; Kikuchi, Kei; Nakazawa, Yasuo
2010-08-20
We verified how maintenance checks of medical systems, including start-of-work and end-of-work checks, contribute to preventive maintenance and safety improvement. In this research, data on device failures in multiple facilities were collected, and the trouble-repair records were analyzed using techniques of reliability engineering. Data were analyzed for the systems used in eight hospitals (8 general systems, 6 angiography systems, 11 CT systems, 8 MRI systems, 8 RI systems, and 9 radiation therapy systems). The data collection period was the nine months from April to December 2008. The analyzed items included: (1) mean time between failures (MTBF); (2) mean time to repair (MTTR); (3) mean down time (MDT); (4) the number of failures found by the morning check; and (5) failure occurrence time by modality. By introducing reliability engineering, the classification of breakdowns per device, their incidence, and their trends could be understood. Analysis, evaluation, and feedback on the failure history are useful to keep downtime to a minimum and to ensure safety.
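For readers unfamiliar with the reliability-engineering quantities listed above, the following minimal sketch computes MTBF, MTTR, and availability from a hypothetical repair log; the figures, and the simplifying assumption that downtime equals repair time (so MDT is approximated by MTTR here), are ours, not the study's.

```python
# Hypothetical repair log for one modality: (failure_time_h, restore_time_h)
# pairs within the observation window, plus total scheduled operating hours.
repairs = [(120.0, 124.5), (560.0, 561.0), (1430.0, 1438.0)]  # assumed data
observed_hours = 6570.0  # roughly nine months of scheduled operation

n_failures = len(repairs)
downtime = sum(end - start for start, end in repairs)

mtbf = (observed_hours - downtime) / n_failures  # mean time between failures
mttr = downtime / n_failures                     # mean time to repair (= MDT here)
availability = mtbf / (mtbf + mttr)

print(f"MTBF = {mtbf:.1f} h, MTTR = {mttr:.2f} h, availability = {availability:.4f}")
```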
Green analytical chemistry--theory and practice.
Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek
2010-08-01
This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.
Mullin, Anne E; Soukatcheva, Galina; Verchere, C Bruce; Chantler, Janet K
2006-05-01
A technique to isolate high-quality intact RNA from murine pancreas is described. This technique involves in situ ductal perfusion of the pancreas with an RNase inhibitor prior to removal of the organ for RNA extraction. In this way, the pancreatic RNases are inhibited in situ, reliably allowing good yields of intact RNA suitable for studies on pancreatic gene transcription by real-time PCR or microarray analysis.
Study of advanced techniques for determining the long-term performance of components
NASA Technical Reports Server (NTRS)
1972-01-01
A study was conducted of techniques having the capability of determining the performance and reliability of components for spacecraft liquid propulsion applications on long-term missions. The study utilized two major approaches: improvement of existing technology and the evolution of new technology. The criteria established and methods evolved are applicable to valve components. Primary emphasis was placed on the oxygen difluoride/diborane propellant combination. The investigation included analysis, fabrication, and tests of experimental equipment to provide data and performance criteria.
PARENT Quick Blind Round-Robin Test Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.
The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques in finding flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments, one called Quick-Blind and the other called Blind. The Quick-Blind testing and destructive analysis of the test blocks has been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.
Determining Tooth Occlusal Surface Relief Indicator by Means of Automated 3D Shape Analysis
NASA Astrophysics Data System (ADS)
Gaboutchian, A. V.; Knyaz, V. A.
2017-05-01
Determining an occlusal surface relief indicator plays an important role in odontometric tooth shape analysis. Analysis of the parameters of surface relief indicators provides valuable information about the closure of dental arches (occlusion) and lifetime changes in tooth structure. Such data are relevant for dentistry and anthropology applications. Descriptive techniques commonly used for surface relief evaluation have limited precision, which undermines the reliability of conclusions about the structure and functioning of teeth. Parametric techniques developed for such applications require special facilities and are time-consuming, which limits their adoption and ease of access. Nevertheless, the use of 3D models obtained by photogrammetric techniques allows the required measurement accuracy to be attained and has potential for process automation. We introduce new approaches for determining a tooth occlusal surface relief indicator and provide data on the efficiency of different indicators in evaluating natural attrition.
NASA Astrophysics Data System (ADS)
Lo, Men-Tzung; Hu, Kun; Liu, Yanhui; Peng, C.-K.; Novak, Vera
2008-12-01
Quantification of nonlinear interactions between two nonstationary signals presents a computational challenge in different research fields, especially for assessments of physiological systems. Traditional approaches that are based on theories of stationary signals cannot resolve nonstationarity-related issues and, thus, cannot reliably assess nonlinear interactions in physiological systems. In this review we discuss a new technique, the multimodal pressure flow (MMPF) method, that utilizes Hilbert-Huang transformation to quantify the interaction between nonstationary cerebral blood flow velocity (BFV) and blood pressure (BP) for the assessment of dynamic cerebral autoregulation (CA). CA is an important mechanism responsible for controlling cerebral blood flow in response to fluctuations in systemic BP within a few heart-beats. The MMPF analysis decomposes BP and BFV signals into multiple empirical modes adaptively so that the fluctuations caused by a specific physiologic process can be represented in a corresponding empirical mode. Using this technique, we showed that dynamic CA can be characterized by specific phase delays between the decomposed BP and BFV oscillations, and that the phase shifts are significantly reduced in hypertensive, diabetic, and stroke subjects with impaired CA. Additionally, the new technique can reliably assess CA using both induced BP/BFV oscillations during clinical tests and spontaneous BP/BFV fluctuations during resting conditions.
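The following is a highly simplified stand-in for the phase-delay step of MMPF (it omits the empirical mode decomposition entirely): given two already-decomposed oscillatory modes, here replaced by synthetic sinusoids, it estimates the BP-to-BFV phase shift from the analytic signal via scipy.signal.hilbert. All signal parameters are assumed.

```python
import numpy as np
from scipy.signal import hilbert

t = np.arange(0, 60, 0.1)                      # 60 s record at 10 Hz (assumed)
bp_mode = np.sin(2 * np.pi * 0.1 * t)          # ~0.1 Hz BP oscillation (assumed)
bfv_mode = np.sin(2 * np.pi * 0.1 * t - 0.6) + 0.05 * np.random.randn(t.size)

# Instantaneous phases from the analytic signals
phase_bp = np.angle(hilbert(bp_mode))
phase_bfv = np.angle(hilbert(bfv_mode))

# Wrap the pointwise phase difference to (-pi, pi] before averaging
dphi = np.angle(np.exp(1j * (phase_bp - phase_bfv)))
print(f"mean BP-BFV phase shift: {np.degrees(dphi).mean():.1f} degrees")
```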
2011-01-01
Background: A clinical study was conducted to determine the intra- and inter-rater reliability of digital scanning and the neutral suspension casting technique for measuring six foot parameters. The neutral suspension casting technique is a commonly utilised method for obtaining a negative impression of the foot prior to orthotic fabrication. Digital scanning offers an alternative to the traditional plaster of Paris techniques. Methods: Twenty-one healthy participants volunteered to take part in the study. Six casts and six digital scans were obtained from each participant by two raters of differing clinical experience. The foot parameters chosen for investigation were cast length (mm), forefoot width (mm), rearfoot width (mm), medial arch height (mm), lateral arch height (mm), and forefoot to rearfoot alignment (degrees). Intraclass correlation coefficients (ICC) with 95% confidence intervals (CI) were calculated to determine the intra- and inter-rater reliability. Measurement error was assessed through the calculation of the standard error of measurement (SEM) and the smallest real difference (SRD). Results: ICC values for all foot parameters using digital scanning ranged from 0.81 to 0.99 for both intra- and inter-rater reliability. For the neutral suspension casting technique, inter-rater reliability values ranged from 0.57 to 0.99, and intra-rater reliability values ranged from 0.36 to 0.99 for rater 1 and 0.49 to 0.99 for rater 2. Conclusions: The findings of this study indicate that digital scanning is a reliable technique, irrespective of clinical experience, with reduced measurement variability in all foot parameters investigated when compared to neutral suspension casting. PMID:21375757
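As a sketch of the statistics reported above, the code below computes ICC(2,1) (two-way random effects, absolute agreement, single measurement), the SEM, and the SRD for a toy ratings matrix; the data and the SEM convention (sample SD times sqrt(1 - ICC)) are assumptions, not values from the study.

```python
import numpy as np

# Hypothetical ratings: rows = feet (subjects), columns = raters/sessions (mm),
# standing in for the cast-length data described in the abstract.
X = np.array([[251.0, 252.5], [238.0, 237.0], [265.5, 266.0],
              [243.0, 245.5], [259.0, 258.0]])
n, k = X.shape

grand = X.mean()
MSR = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # between subjects
MSC = n * ((X.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # between raters
SSE = ((X - grand) ** 2).sum() - (n - 1) * MSR - (k - 1) * MSC
MSE = SSE / ((n - 1) * (k - 1))

# ICC(2,1): two-way random effects, absolute agreement, single measurement
icc = (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

sem = X.std(ddof=1) * np.sqrt(1 - icc)  # one common SEM convention
srd = 1.96 * np.sqrt(2) * sem           # smallest real difference
print(f"ICC(2,1)={icc:.3f}, SEM={sem:.2f} mm, SRD={srd:.2f} mm")
```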
Failure Mode and Effects Analysis: views of hospital staff in the UK.
Shebl, Nada; Franklin, Bryony; Barber, Nick; Burnett, Susan; Parand, Anam
2012-01-01
To explore health care professionals' experiences and perceptions of Failure Mode and Effects Analysis (FMEA), a team-based, prospective risk analysis technique. Semi-structured interviews were conducted with 21 operational leads (20 pharmacists, one nurse) in medicines management teams of hospitals participating in a national quality improvement programme. Interviews were transcribed, coded, and emergent themes identified using framework analysis. Themes identified included participants' perceptions and experiences of FMEA, validity and reliability issues, and FMEA's use in practice. FMEA was considered to be a structured but subjective process that helps health care professionals get together to identify high-risk areas of care. Both positive and negative opinions were expressed, with the majority of interviewees expressing positive views towards FMEA in relation to its structured nature and the use of a multidisciplinary team. Other participants criticised FMEA for being subjective and lacking validity. The factors most likely to restrict its widespread use were its time-consuming nature and its perceived lack of validity and reliability. FMEA is a subjective but systematic tool that helps identify high-risk areas, but its time-consuming nature, difficulty with the scores, and perceived lack of validity and reliability may limit its widespread use.
Analysis of intracranial pressure: past, present, and future.
Di Ieva, Antonio; Schmitz, Erika M; Cusimano, Michael D
2013-12-01
The monitoring of intracranial pressure (ICP) is an important tool in medicine for its ability to portray the brain's compliance status. The bedside monitor displays the ICP waveform and intermittent mean values to guide physicians in the management of patients, particularly those having sustained a traumatic brain injury. Researchers in the fields of engineering and physics have investigated various mathematical analysis techniques applicable to the waveform in order to extract additional diagnostic and prognostic information, although these largely remain limited to research applications. The purpose of this review is to present the current techniques used to monitor and interpret ICP and to explore the potential of advanced mathematical techniques to provide information about system perturbations from states of homeostasis. We discuss the limits of each proposed technique and propose that nonlinear analysis could be a reliable approach to describing ICP signals over time, with the fractal dimension as a potentially predictive, clinically meaningful biomarker. Our goal is to stimulate translational research that can move modern analysis of ICP using these techniques into widespread practical use, and to investigate the clinical utility of a tool capable of simplifying multiple variables obtained from various sensors.
Selfe, James; Hardaker, Natalie; Thewlis, Dominic; Karki, Anna
2006-12-01
To develop an anatomic marker system (AMS) as an accurate, reliable method of thermal imaging data analysis, for use in cryotherapy research. Investigation of the accuracy of a new thermal imaging technique. Hospital orthopedic outpatient department in England. Consecutive sample of 9 patients referred to an anterior knee pain clinic. Not applicable. Thermally inert markers were placed at specific anatomic locations, defining an area over the anterior knee of patients with anterior knee pain. A baseline thermal image was taken. Patients underwent a 3-minute thermal washout of the affected knee. Thermal images were collected at a rate of 1 image per minute for a 20-minute rewarming period. A Matlab (version 7.0) program was written to digitize the marker positions and subsequently calculate the mean of the area over the anterior knee. Virtual markers were then defined as 15% distal from the proximal marker, 30% proximal from the distal markers, 15% lateral from the medial marker, and 15% medial from the lateral marker. The virtual markers formed an ellipse, which defined an area representative of the patella shape. Within the ellipse, the mean value of the full pixels determined the mean temperature of this region. Ten raters were recruited to use the program, and interrater reliability was investigated. The intraclass correlation coefficient produced coefficients within acceptable bounds, ranging from .82 to .97, indicating adequate interrater reliability. The AMS provides an accurate, reliable method for thermal imaging data analysis and is a reliable tool with which to advance cryotherapy research.
An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis
NASA Technical Reports Server (NTRS)
Min, J. B.; Androlake, S. G.
1993-01-01
The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling of solid/shell connections (joints), which is significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their applicability. Techniques currently used in practical applications were tested, especially to determine which technique is best suited to the computer-aided design (CAD) environment. Some basic considerations regarding each technique are also discussed. Based on the results, suggestions are given for obtaining reliable results in geometrically complex joints where the deformation and stress behavior are complicated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, J.J. Jr.; Hyder, Z.
The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.
Optimization of the tungsten oxide technique for measurement of atmospheric ammonia
NASA Technical Reports Server (NTRS)
Brown, Kenneth G.
1987-01-01
Hollow tubes coated with tungstic acid have been shown to be of value in the determination of ammonia and nitric acid in ambient air. Practical application of this technique was demonstrated utilizing an automated sampling system for in-flight collection and analysis of atmospheric samples. Due to time constraints, these previous measurements were performed on tubes that had not been well characterized in the laboratory. As a result, the experimental precision could not be accurately estimated. Since the technique was being compared with other techniques for measuring these compounds, it became necessary to perform laboratory tests that would establish the reliability of the technique. This report is a summary of these laboratory experiments as they apply to the determination of ambient ammonia concentration.
NASA Astrophysics Data System (ADS)
Park, Joon-Sang; Lee, Uichin; Oh, Soon Young; Gerla, Mario; Lun, Desmond Siumen; Ro, Won Woo; Park, Joonseok
Vehicular ad hoc networks (VANETs) aim to enhance vehicle navigation safety by providing an early warning system: any chance of an accident is communicated through wireless communication between vehicles. For the warning system to work, it is crucial that safety messages be reliably delivered to the target vehicles in a timely manner; reliable and timely data dissemination is thus the key building block of a VANET. A data mulling technique combined with three strategies (network coding, erasure coding, and repetition coding) is proposed for this reliable and timely data dissemination service. In particular, vehicles traveling in the opposite direction on a highway are exploited as data mules, mobile nodes that physically deliver data to destinations, to overcome the intermittent network connectivity caused by sparse vehicle traffic. Using analytic models, we show that in such a highway data mulling scenario the network coding based strategy outperforms the erasure coding and repetition based strategies.
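A back-of-the-envelope comparison of the repetition and coded strategies can be written directly from binomial arguments. The sketch below (our simplification, not the paper's analytic model) compares the delivery probability of repeating each of k packets n/k times against an idealized MDS/network-coded scheme in which any k of n packets suffice; all parameters are assumed.

```python
from math import comb

def p_repetition(k, n, p):
    """All k distinct packets received when each is repeated n/k times."""
    r = n // k
    return (1 - (1 - p) ** r) ** k

def p_coded(k, n, p):
    """Any k of n coded packets suffice (idealized MDS/network coding)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

k, n, p = 4, 12, 0.5  # assumed generation size, transmission slots, delivery prob
print(f"repetition: {p_repetition(k, n, p):.4f}")  # ~0.586
print(f"coded:      {p_coded(k, n, p):.4f}")       # ~0.927
```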
Bae, Sungwoo; Kim, Myungchin
2016-01-01
In order to realize a true WoT environment, a reliable power circuit is required to ensure interconnections among a range of WoT devices. This paper presents research on sensors and their effects on the reliability and response characteristics of power circuits in WoT devices. The presented research can be used in various power circuit applications, such as energy harvesting interfaces, photovoltaic systems, and battery management systems for the WoT devices. As power circuits rely on the feedback from voltage/current sensors, the system performance is likely to be affected by the sensor failure rates, sensor dynamic characteristics, and their interface circuits. This study investigated how the operational availability of the power circuits is affected by the sensor failure rates by performing a quantitative reliability analysis. In the analysis process, this paper also includes the effects of various reconstruction and estimation techniques used in power processing circuits (e.g., energy harvesting circuits and photovoltaic systems). This paper also reports how the transient control performance of power circuits is affected by sensor interface circuits. With the frequency domain stability analysis and circuit simulation, it was verified that the interface circuit dynamics may affect the transient response characteristics of power circuits. The verification results in this paper showed that the reliability and control performance of the power circuits can be affected by the sensor types, fault tolerant approaches against sensor failures, and the response characteristics of the sensor interfaces. The analysis results were also verified by experiments using a power circuit prototype. PMID:27608020
[Surgical technique in patients with chronic pancreatitis].
Pronin, N A; Natalskiy, A A; Tarasenko, S V; Pavlov, A V; Fedoseev, V A
To justify ligation of the vascular branches of the anterior pancreatoduodenal arterial arch, or of the gastroduodenal artery prior to its bifurcation. This method was tested on substantial clinical material: 147 patients with recurrent chronic pancreatitis. The interventions comprised the Frey and Beger procedures and the Berne variant of the latter. Comparative analysis showed significant advantages of vascular ligation during pancreatectomy.
Survey of Software Assurance Techniques for Highly Reliable Systems
NASA Technical Reports Server (NTRS)
Nelson, Stacy
2004-01-01
This document provides a survey of software assurance techniques for highly reliable systems including a discussion of relevant safety standards for various industries in the United States and Europe, as well as examples of methods used during software development projects. It contains one section for each industry surveyed: Aerospace, Defense, Nuclear Power, Medical Devices and Transportation. Each section provides an overview of applicable standards and examples of a mission or software development project, software assurance techniques used and reliability achieved.
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-09-01
Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
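A minimal sketch of the alpha-cut/random-interval idea in the approach above, not the authors' implementation: a normal limit-state function whose stress mean is a triangular fuzzy number, with reliability-index bounds obtained at the interval endpoints of each alpha-cut (valid here because the index is monotone in the fuzzy parameter). All numbers are invented.

```python
import numpy as np

# Limit state g = R - S with R ~ Normal(mu_R, sig_R) and S ~ Normal(mu_S, sig_S),
# where mu_S is a triangular fuzzy number (a, m, b). Values are assumed.
mu_R, sig_R = 120.0, 10.0
a, m, b = 70.0, 80.0, 90.0  # triangular fuzzy mu_S
sig_S = 12.0

def beta(mu_S):
    """Reliability index E[g]/Std[g] for the normal stress-strength pair."""
    return (mu_R - mu_S) / np.hypot(sig_R, sig_S)

for alpha in (0.0, 0.5, 1.0):
    # Alpha-cut of the triangular fuzzy mean: an interval [lo, hi]
    lo, hi = a + alpha * (m - a), b - alpha * (b - m)
    # beta is decreasing in mu_S, so endpoints give the bounds
    print(f"alpha={alpha:.1f}: beta in [{beta(hi):.3f}, {beta(lo):.3f}]")
```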
Kumar, Mohit; Yadav, Shiv Prasad
2012-03-01
This paper addresses fuzzy system reliability analysis using different types of intuitionistic fuzzy numbers. Until now, the literature on fuzzy system reliability has assumed that the failure rates of all components of a system follow the same type of fuzzy set or intuitionistic fuzzy set. However, in practical problems, such a situation rarely occurs. Therefore, in the present paper, a new algorithm is introduced to construct the membership function and non-membership function of the fuzzy reliability of a system whose components follow different types of intuitionistic fuzzy failure rates. Functions of intuitionistic fuzzy numbers are calculated to construct the membership function and non-membership function of fuzzy reliability via non-linear programming techniques. Using the proposed algorithm, membership functions and non-membership functions of the fuzzy reliability of a series system and a parallel system are constructed. Our study generalizes various works in the literature. Numerical examples are given to illustrate the proposed algorithm.
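To make the alpha-cut construction concrete, here is a minimal sketch using ordinary triangular fuzzy numbers only (so it ignores the intuitionistic non-membership side of the algorithm): series and parallel system reliability intervals at several alpha levels, exploiting monotonicity so that endpoint evaluation suffices. Component values are assumed.

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at a given alpha level."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

# Hypothetical triangular fuzzy component reliabilities (assumed values).
components = [(0.90, 0.95, 0.98), (0.85, 0.92, 0.96)]

for alpha in (0.0, 0.5, 1.0):
    cuts = [alpha_cut(c, alpha) for c in components]
    # Series: R = prod(Ri) is monotone increasing in each reliability,
    # so interval endpoints map to endpoint products.
    series = (np.prod([lo for lo, hi in cuts]), np.prod([hi for lo, hi in cuts]))
    # Parallel: R = 1 - prod(1 - Ri), also monotone increasing.
    parallel = (1 - np.prod([1 - lo for lo, hi in cuts]),
                1 - np.prod([1 - hi for lo, hi in cuts]))
    print(f"alpha={alpha:.1f}: series={series[0]:.4f}..{series[1]:.4f}, "
          f"parallel={parallel[0]:.4f}..{parallel[1]:.4f}")
```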
NASA Astrophysics Data System (ADS)
Lee, Sangkyu
Illicit trafficking and smuggling of radioactive materials and special nuclear materials (SNM) are considered among the most important recent global nuclear threats. Monitoring the transport and safety of radioisotopes and SNM is challenging due to their weak signals and easy shielding. Great efforts worldwide are focused on developing and improving detection technologies and algorithms for accurate and reliable detection of radioisotopes of interest, thus better securing the borders against nuclear threats. In general, radiation portal monitors enable detection of gamma- and neutron-emitting radioisotopes. Passive and active interrogation techniques, both existing and under development, aim to increase accuracy and reliability while shortening interrogation time and reducing equipment cost. Equally important efforts are aimed at advancing algorithms to process imaging data efficiently, providing reliable "readings" of the interiors of examined volumes of various sizes, ranging from cargos to suitcases. The main objective of this thesis is to develop two synergistic algorithms with the goal of providing highly reliable, low-noise identification of radioisotope signatures. These algorithms combine analysis of a passive radioactive detection technique with active interrogation imaging techniques such as gamma radiography or muon tomography. One algorithm consists of gamma spectroscopy and cosmic muon tomography, and the other is based on gamma spectroscopy and gamma radiography. The purpose of fusing two detection methodologies per algorithm is to find both heavy-Z radioisotopes and shielding materials, since radionuclides can be identified with gamma spectroscopy, and shielding materials can be detected using muon tomography or gamma radiography. These combined algorithms are created and analyzed based on numerically generated images of various cargo sizes and materials. In summary, the three detection methodologies are fused into two algorithms with mathematical functions providing: reliable identification of radioisotopes in gamma spectroscopy; noise reduction and precision enhancement in muon tomography; and atomic number and density estimation in gamma radiography. It is expected that these new algorithms may be implemented at portal scanning systems with the goal of enhancing the accuracy and reliability of detecting nuclear materials inside cargo containers.
Evaluation Criteria for Micro-CAI: A Psychometric Approach
Wallace, Douglas; Slichter, Mark; Bolwell, Christine
1985-01-01
The increased use of microcomputer-based instructional programs has resulted in a greater need for third-party evaluation of the software. This in turn has prompted the development of micro-CAI evaluation tools. The present project sought to develop a prototype instrument to assess the impact of CAI program presentation characteristics on students. Data analysis and scale construction were conducted using standard item reliability analyses and factor analytic techniques. Adequate subscale reliabilities and factor structures were found, suggesting that a psychometric approach to CAI evaluation may possess some merit. Efforts to assess the utility of the resultant instrument are currently underway.
Davis, Matthew A Cody; Spriggs, Amy; Rodgers, Alexis; Campbell, Jonathan
2018-06-01
Deficits in social skills are often exhibited in individuals with comorbid Down syndrome (DS) and autism spectrum disorder (ASD), and there is a paucity of research to help guide intervention for this population. In the present study, a multiple probe study across behaviors, replicated across participants, assessed the effectiveness of peer-delivered simultaneous prompting in teaching social skills to adults with DS-ASD, using visual analysis techniques and Tau-U statistics to measure effect. Peer mediators with DS and intellectual disability (ID) delivered simultaneous prompting sessions reliably (i.e., > 80% reliability) to teach social skills to adults with ID and a dual diagnosis of DS-ASD, with small (Tau Weighted = .55, 90% CI [.29, .82]) to medium effects (Tau Weighted = .75, 90% CI [.44, 1]). Statistical and visual analysis findings suggest a promising social skills intervention for individuals with DS-ASD as well as reliable delivery of simultaneous prompting procedures by individuals with DS.
Design study for a high reliability five-year spacecraft tape transport
NASA Technical Reports Server (NTRS)
Benn, G. S. L.; Eshleman, R. L.
1971-01-01
Following the establishment of the overall transport concept, all of the life-limiting constraints associated with the transport were analyzed using modeling techniques. These design techniques included: (1) a response analysis from which the performance of the transport could be determined under operating conditions for a variety of conceptual variations, both in a new and an aged condition; (2) an analysis of a double-cone guidance technique which yielded an optimum design for maximum guidance with minimum tape degradation; (3) an analysis of the tape pack design to eliminate spoking caused by negative tangential stress within the pack; (4) an evaluation of the stress levels experienced by the magnetic tape throughout the system; (5) a general review of bearing and lubrication technology as applied to satellite recorders, and hence the recommendation to use standard load-carrying antifriction ball bearings; and (6) a kinetic analysis to determine the change in kinetic properties of the transport during operation.
Lo, Men-Tzung; Hu, Kun; Liu, Yanhui; Peng, C.-K.; Novak, Vera
2008-01-01
Quantification of nonlinear interactions between two nonstationary signals presents a computational challenge in different research fields, especially for assessments of physiological systems. Traditional approaches that are based on theories of stationary signals cannot resolve nonstationarity-related issues and, thus, cannot reliably assess nonlinear interactions in physiological systems. In this review we discuss a new technique, the Multi-Modal Pressure Flow (MMPF) method, that utilizes Hilbert-Huang transformation to quantify dynamic cerebral autoregulation (CA) by studying the interaction between nonstationary cerebral blood flow velocity (BFV) and blood pressure (BP). CA is an important mechanism responsible for controlling cerebral blood flow in response to fluctuations in systemic BP within a few heart-beats. The influence of CA is traditionally assessed from the relationship between the well-pronounced systemic BP and BFV oscillations induced by clinical tests. Reliable noninvasive assessment of dynamic CA, however, remains a challenge in clinical and diagnostic medicine. In this brief review we: 1) present an overview of transfer function analysis (TFA) that is traditionally used to quantify CA; 2) describe the MMPF method and its modifications; 3) introduce a newly developed automatic algorithm and engineering aspects of the improved MMPF method; and 4) review clinical applications of MMPF and its sensitivity for detection of CA abnormalities in clinical studies. The MMPF analysis decomposes complex nonstationary BP and BFV signals into multiple empirical modes adaptively so that the fluctuations caused by a specific physiologic process can be represented in a corresponding empirical mode. Using this technique, we recently showed that dynamic CA can be characterized by specific phase delays between the decomposed BP and BFV oscillations, and that the phase shifts are significantly reduced in hypertensive, diabetic, and stroke subjects with impaired CA. In addition, the new technique enables reliable assessment of CA using both data collected during clinical tests and spontaneous BP/BFV fluctuations during baseline resting conditions. PMID:18725996
Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José
2013-11-01
To investigate whether the introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for evaluating the performance of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) for the format to be kept in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics, and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and of its individual test items in particular. IRT analysis demonstrated that item performance parameters should not be evaluated individually but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good, though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the crucial parameter for allowing comparison. While individual item performance analysis is worthwhile as a secondary analysis, drawing final conclusions from it alone is more difficult; performance parameters need to be related to one another, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. The introduction of negative marking has led to a significant increase in reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations.
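For reference, KR-20 is straightforward to compute from a dichotomous score matrix; the sketch below uses an invented toy matrix, not EBOD data.

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomously scored items.
    responses: candidates x items matrix of 0/1 scores."""
    X = np.asarray(responses, dtype=float)
    k = X.shape[1]
    p = X.mean(axis=0)                      # proportion correct per item
    q = 1 - p
    var_total = X.sum(axis=1).var(ddof=1)   # sample variance of total scores
    return (k / (k - 1)) * (1 - (p * q).sum() / var_total)

# Toy score matrix (assumed data): 6 candidates x 5 items
scores = [[1, 1, 0, 1, 1],
          [1, 0, 0, 1, 0],
          [1, 1, 1, 1, 1],
          [0, 0, 0, 1, 0],
          [1, 1, 1, 0, 1],
          [0, 1, 0, 0, 0]]
print(f"KR-20 = {kr20(scores):.3f}")
```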
Illustrated structural application of universal first-order reliability method
NASA Technical Reports Server (NTRS)
Verderaime, V.
1994-01-01
The general application of the proposed first-order reliability method was achieved through the universal normalization of engineering probability distribution data. The method superimposes prevailing deterministic techniques and practices on the first-order reliability method to surmount deficiencies of the deterministic method and provide benefits of reliability techniques and predictions. A reliability design factor is derived from the reliability criterion to satisfy a specified reliability and is analogous to the deterministic safety factor. Its application is numerically illustrated on several practical structural design and verification cases with interesting results and insights. Two concepts of reliability selection criteria are suggested. Though the method was developed to support affordable structures for access to space, the method should also be applicable for most high-performance air and surface transportation systems.
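A minimal sketch of the first-order normal stress-strength calculation underlying such methods, with illustrative values rather than the report's cases: the reliability index, the corresponding reliability, and a design-factor-style margin for a target reliability.

```python
from math import sqrt
from scipy.stats import norm

# First-order reliability for a normal strength (R) / stress (S) pair.
# All values are assumed for illustration only.
mu_R, sigma_R = 100.0, 8.0   # strength mean and standard deviation
mu_S, sigma_S = 70.0, 10.0   # stress mean and standard deviation

beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)  # reliability index
reliability = norm.cdf(beta)                          # P(R > S)

# A reliability "design factor" analogous to the deterministic safety factor:
# the mean-strength margin required for a target reliability (assumed form).
beta_target = norm.ppf(0.9999)
design_factor = (mu_S + beta_target * sqrt(sigma_R**2 + sigma_S**2)) / mu_S

print(f"beta={beta:.2f}, reliability={reliability:.5f}, "
      f"design factor={design_factor:.2f}")
```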
Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine
2015-10-27
Capillaroscopy is a non-invasive, efficient, relatively inexpensive, and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic, and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data (1). This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
Daskalakis, Constantine
2015-01-01
Capillaroscopy is a non-invasive, efficient, relatively inexpensive, and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic, and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data (1). This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744
Laser-induced breakdown spectroscopy is a reliable method for urinary stone analysis
Mutlu, Nazım; Çiftçi, Seyfettin; Gülecen, Turgay; Öztoprak, Belgin Genç; Demir, Arif
2016-01-01
Objective: We compared laser-induced breakdown spectroscopy (LIBS) with the traditionally used and recommended X-ray diffraction (XRD) technique for urinary stone analysis. Material and methods: In total, 65 patients with urinary calculi were enrolled in this prospective study. Stones were obtained after surgical or extracorporeal shockwave lithotripsy procedures. All stones were divided into two equal pieces. One sample was analyzed by XRD and the other by LIBS. The results were compared by the kappa (κ) and Spearman's correlation coefficient (rho) tests. Results: Using LIBS, 95 components were identified from 65 stones, while XRD identified 88 components. LIBS identified 40 stones with a single pure component, 20 stones with two different components, and 5 stones with three components. XRD demonstrated 42 stones with a single component, 22 stones with two different components, and only 1 stone with three different components. There was a strong relationship between LIBS and XRD in the detection of stone components (Spearman rho, 0.866; p<0.001). There was excellent agreement between the two techniques among the 38 patients with pure stones (κ index, 0.910; Spearman rho, 0.916; p<0.001). Conclusion: Our study indicates that LIBS is a valid and reliable technique for determining urinary stone composition. Moreover, it is a simple, low-cost, and nondestructive technique. LIBS can be safely used in routine daily practice if our results are supported by studies with larger numbers of patients. PMID:27011877
Dinkel, Philipp Johannes; Willmes, Klaus; Krinzinger, Helga; Konrad, Kerstin; Koten Jr, Jan Willem
2013-01-01
fMRI studies are mostly based on a group study approach, either analyzing one group or comparing multiple groups, or on approaches that correlate brain activation with clinically relevant criteria or behavioral measures. In this study we investigate the potential of fMRI techniques focusing on individual differences in brain activation within a test-retest reliability context. We employ a single-case analysis approach, which contrasts dyscalculic children with a control group of typically developing children. In a second step, support-vector machine analysis and cluster analysis techniques served to investigate similarities in multivariate brain activation patterns. Children were confronted with a non-symbolic number comparison and a non-symbolic exact calculation task during fMRI acquisition. Conventional second-level group comparison analysis only showed small differences around the angular gyrus bilaterally and the left parieto-occipital sulcus. Analyses based on single-case statistical procedures revealed that developmental dyscalculia is characterized by individual differences predominantly in visual processing areas. Dyscalculic children seemed to compensate for relative under-activation in the primary visual cortex through an upregulation in higher visual areas. However, overlap in deviant activation was low for the dyscalculic children, indicating that developmental dyscalculia is a disorder characterized by heterogeneous brain activation differences. Using support vector machine analysis and cluster analysis, we tried to group dyscalculic and typically developing children according to brain activation. Fronto-parietal systems seem to qualify for a distinction between the two groups. However, this was only effective when reliable brain activations of both tasks were employed simultaneously. Results suggest that deficits in number representation in the visual-parietal cortex are compensated for through finger-related aspects of number representation in fronto-parietal cortex. We conclude that dyscalculic children show large individual differences in brain activation patterns. Nonetheless, the majority of dyscalculic children can be differentiated from controls employing brain activation patterns when appropriate methods are used. PMID:24349547
Dinkel, Philipp Johannes; Willmes, Klaus; Krinzinger, Helga; Konrad, Kerstin; Koten, Jan Willem
2013-01-01
fMRI studies are mostly based on a group study approach, either analyzing one group or comparing multiple groups, or on approaches that correlate brain activation with clinically relevant criteria or behavioral measures. In this study we investigate the potential of fMRI techniques focusing on individual differences in brain activation within a test-retest reliability context. We employ a single-case analysis approach, which contrasts dyscalculic children with a control group of typically developing children. In a second step, support-vector machine analysis and cluster analysis techniques served to investigate similarities in multivariate brain activation patterns. Children were confronted with a non-symbolic number comparison and a non-symbolic exact calculation task during fMRI acquisition. Conventional second-level group comparison analysis only showed small differences around the angular gyrus bilaterally and the left parieto-occipital sulcus. Analyses based on single-case statistical procedures revealed that developmental dyscalculia is characterized by individual differences predominantly in visual processing areas. Dyscalculic children seemed to compensate for relative under-activation in the primary visual cortex through an upregulation in higher visual areas. However, overlap in deviant activation was low for the dyscalculic children, indicating that developmental dyscalculia is a disorder characterized by heterogeneous brain activation differences. Using support vector machine analysis and cluster analysis, we tried to group dyscalculic and typically developing children according to brain activation. Fronto-parietal systems seem to qualify for a distinction between the two groups. However, this was only effective when reliable brain activations of both tasks were employed simultaneously. Results suggest that deficits in number representation in the visual-parietal cortex are compensated for through finger-related aspects of number representation in fronto-parietal cortex. We conclude that dyscalculic children show large individual differences in brain activation patterns. Nonetheless, the majority of dyscalculic children can be differentiated from controls employing brain activation patterns when appropriate methods are used.
A new technique in the global reliability of cyclic communications network
NASA Technical Reports Server (NTRS)
Sjogren, Jon A.
1989-01-01
The global reliability of a communications network is the probability that given any pair of nodes, there exists a viable path between them. A characterization of connectivity, for a given class of networks, can enable one to find this reliability. Such a characterization is described for a useful class of undirected networks called daisy-chained or braided networks. This leads to a new method of quickly computing the global reliability of these networks. Asymptotic behavior in terms of component reliability is related to geometric properties of the given graph. Generalization of the technique is discussed.
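For intuition, global (all-terminal) reliability can be computed by brute force on small graphs; the sketch below enumerates the 2^|E| edge states of an invented 5-edge network (it does not implement the paper's fast characterization for braided networks).

```python
from itertools import product

# Toy 4-node ring with one chord (assumed topology and link reliability).
nodes = range(4)
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
p_edge = 0.9  # per-link reliability (assumed)

def connected(up_edges):
    """Union-find connectivity check over the surviving edges."""
    parent = list(nodes)
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in up_edges:
        parent[find(u)] = find(v)
    return len({find(n) for n in nodes}) == 1

reliability = 0.0
for states in product([0, 1], repeat=len(edges)):
    up = [e for e, s in zip(edges, states) if s]
    if connected(up):
        k = sum(states)
        reliability += p_edge**k * (1 - p_edge) ** (len(edges) - k)
print(f"global reliability = {reliability:.6f}")
```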
Flat-plate solar array project. Volume 6: Engineering sciences and reliability
NASA Technical Reports Server (NTRS)
Ross, R. G., Jr.; Smokler, M. I.
1986-01-01
The Flat-Plate Solar Array (FSA) Project activities directed at developing the engineering technology base required to achieve modules that meet the functional, safety, and reliability requirements of large-scale terrestrial photovoltaic systems applications are reported. These activities included: (1) development of functional, safety, and reliability requirements for such applications; (2) development of the engineering analytical approaches, test techniques, and design solutions required to meet the requirements; (3) synthesis and procurement of candidate designs for test and evaluation; and (4) performance of extensive testing, evaluation, and failure analysis to define design shortfalls and, thus, areas requiring additional research and development. A summary of the approach and technical outcome of these activities is provided, along with a complete bibliography of the published documentation covering the detailed accomplishments and technologies developed.
Gearbox Reliability Collaborative Phase 3 Gearbox 2 Test Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Link, H.; Keller, J.; Guo, Y.
2013-04-01
Gearboxes in wind turbines have not been achieving their expected design life even though they commonly meet or exceed the design criteria specified in current design standards. One of the basic premises of the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) is that the low gearbox reliability results from the absence of critical elements in the design process or insufficient design tools. Key goals of the GRC are to improve design approaches and analysis tools and to recommend practices and test methods resulting in improved design standards for wind turbine gearboxes that lower the cost of energy (COE) through improved reliability. The GRC uses a combined gearbox testing, modeling, and analysis approach, along with a database of information from gearbox failures collected from overhauls and investigation of gearbox condition monitoring techniques, to improve wind turbine operations and maintenance practices. This test plan covers testing of Gearbox 2 (GB2) using the two-speed turbine controller that was used in prior testing. The test series will investigate non-torque loads, high-speed shaft misalignment, and reproduction of field conditions in the dynamometer. It will also include vibration testing using an eddy-current brake on the gearbox's high-speed shaft.
Gajewski, Byron J.; Lee, Robert; Dunton, Nancy
2012-01-01
Data Envelopment Analysis (DEA) is the most commonly used approach for evaluating healthcare efficiency (Hollingsworth, 2008), but a long-standing concern is that DEA assumes that data are measured without error. This is quite unlikely, and DEA and other efficiency analysis techniques may yield biased efficiency estimates if measurement error is not accounted for (Gajewski, Lee, Bott, Piamjariyakul and Taunton, 2009; Ruggiero, 2004). We propose to address measurement error systematically using a Bayesian method (Bayesian DEA). We will apply Bayesian DEA to data from the National Database of Nursing Quality Indicators® (NDNQI®) to estimate nursing units' efficiency. Several external reliability studies inform the posterior distribution of the measurement error on the DEA variables. We will discuss the case of generalizing the approach to situations where an external reliability study is not feasible. PMID:23328796
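For context, the classical (non-Bayesian) input-oriented CCR DEA efficiency of each unit can be obtained from a small linear program; the sketch below uses scipy.optimize.linprog on invented nursing-unit data and does not model the measurement-error extension proposed here.

```python
import numpy as np
from scipy.optimize import linprog

# X: inputs (m x n), Y: outputs (s x n); toy nursing-unit data (assumed).
X = np.array([[5.0, 8.0, 6.0, 9.0],      # e.g., nursing hours
              [3.0, 4.0, 2.0, 6.0]])     # e.g., supply cost
Y = np.array([[10.0, 12.0, 9.0, 13.0]])  # e.g., patient days

def ccr_efficiency(j0):
    """Input-oriented CCR envelopment LP: min theta s.t.
    X @ lam <= theta * x0 and Y @ lam >= y0, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]  # decision vector z = [theta, lambda_1..n]
    A_ub = np.vstack([np.hstack([-X[:, [j0]], X]),          # X lam - theta x0 <= 0
                      np.hstack([np.zeros((s, 1)), -Y])])   # -Y lam <= -y0
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for j in range(X.shape[1]):
    print(f"DMU {j}: efficiency = {ccr_efficiency(j):.3f}")
```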
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
NASA Astrophysics Data System (ADS)
Rahman, P. A.; D'K Novikova Freyre Shavier, G.
2018-03-01
This paper analyzes the mean time to data loss of redundant RAID-6 disk arrays with data alternation, considering different disk failure rates in the normal, degraded, and rebuild states of the array, as well as nonzero disk replacement time. The reliability model developed by the authors on the basis of a Markov chain, and the resulting formula for estimating the mean time to data loss (MTTDL) of RAID-6 disk arrays, are presented. Finally, a technique for estimating the initial reliability parameters and examples of MTTDL calculations for RAID-6 arrays with different numbers of disks are given.
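A minimal sketch of this kind of Markov reliability model (our simplification with assumed rates, not the authors' exact formula): transient states with 0, 1, or 2 failed disks, distinct failure rates in the normal versus degraded/rebuild states, and MTTDL obtained as the expected absorption time by solving Q t = -1 over the transient states.

```python
import numpy as np

# CTMC sketch for RAID-6 MTTDL: transient states = 0, 1, or 2 failed disks;
# absorption = a third concurrent failure (data loss). All rates are assumed.
n = 8                 # disks in the array
lam0 = 1 / 100_000.0  # per-disk failure rate, normal state (1/h)
lam1 = 1 / 80_000.0   # per-disk failure rate, degraded/rebuild states (1/h)
mu = 1 / 24.0         # rebuild/replacement completion rate (1/h)

# Generator matrix restricted to the transient states {0, 1, 2}; the
# "missing" outflow from state 2 is the transition into data loss.
Q = np.array([
    [-n * lam0,  n * lam0,                0.0],
    [mu,        -(mu + (n - 1) * lam1),   (n - 1) * lam1],
    [0.0,        mu,                     -(mu + (n - 2) * lam1)],
])

# Expected times to absorption t satisfy Q t = -1 (componentwise).
t = np.linalg.solve(Q, -np.ones(3))
print(f"MTTDL from a healthy array: {t[0]:.3e} hours")
```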
Develop advanced nonlinear signal analysis topographical mapping system
NASA Technical Reports Server (NTRS)
1994-01-01
The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.
Develop advanced nonlinear signal analysis topographical mapping system
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1993-01-01
The SSME has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, thereby reducing launch turn-around time. The basic objectives of this contract are threefold: (1) Develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data, and these techniques will be incorporated into a fully automated system. (2) Develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS will convert tremendous amounts of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow minimal storage requirements while providing fast signature retrieval, pattern comparison, and identification capabilities. (3) Integrate the nonlinear correlation techniques into the CSTDB with a compatible TOPO input data format. Such an integrated ATMS will provide the large test archives necessary for quick signature comparison. This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will be an ATMS nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbopump families.
Test-retest reliability of the multifocal photopic negative response.
Van Alstine, Anthony W; Viswanathan, Suresh
2017-02-01
To assess the test-retest reliability of the multifocal photopic negative response (mfPhNR) of normal human subjects. Multifocal electroretinograms were recorded from one eye of 61 healthy adult subjects on two separate days using the Visual Evoked Response Imaging System software version 4.3 (EDI, San Mateo, California). The visual stimulus, delivered on a 75-Hz monitor, consisted of seven equal-sized hexagons, each subtending 12° of visual angle. The m-step exponent was 9, and the m-sequence was slowed to include at least 30 blank frames after each flash. Only the first slice of the first-order kernel was analyzed. The mfPhNR amplitude was measured at a fixed time in the trough from baseline (BT) as well as at the same fixed time in the trough from the preceding b-wave peak (PT). Additionally, we analyzed BT normalized either to PT (BT/PT) or to the b-wave amplitude (BT/b-wave). The relative reliability of test-retest differences for each test location was estimated by the Wilcoxon matched-pair signed-rank test and intraclass correlation coefficients (ICC). Absolute test-retest reliability was estimated by Bland-Altman analysis. The test-retest amplitude differences were not statistically significant for either of the two measurement techniques, as determined by the Wilcoxon matched-pair signed-rank test. PT measurements showed greater ICC values than BT amplitude measurements for all test locations. For each measurement technique, the ICC value of the macular response was greater than that of the surrounding locations. The mean test-retest difference was close to zero for both techniques at each of the test locations, and while the coefficient of reliability (COR; 1.96 times the standard deviation of the test-retest difference) was comparable for the two techniques at each test location when expressed in nanovolts, the %COR (COR normalized to the mean test and retest amplitudes) was superior for PT than for BT measurements. The ICC and COR for the BT/PT and BT/b-wave ratios were comparable with each other and were better than those for BT but worse than those for PT. mfPhNR amplitude measured at a fixed time in the trough from the preceding b-wave peak (PT) shows greater test-retest reliability than amplitude measured from baseline (BT) or BT amplitude normalized to either the PT or b-wave amplitudes.
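The Bland-Altman quantities used above (mean test-retest difference, COR, and %COR) are straightforward to compute. A minimal sketch with hypothetical amplitude values:

```python
import numpy as np

def bland_altman_cor(test, retest):
    """Mean test-retest difference, coefficient of reliability
    (COR = 1.96 * SD of the differences), and %COR normalized to the
    mean of the test and retest amplitudes."""
    test, retest = np.asarray(test, float), np.asarray(retest, float)
    diff = test - retest
    cor = 1.96 * diff.std(ddof=1)
    pct_cor = 100.0 * cor / np.mean((test + retest) / 2.0)
    return diff.mean(), cor, pct_cor

# Hypothetical amplitudes in nanovolts for one test location:
bias, cor, pct = bland_altman_cor([102, 95, 110, 98], [100, 97, 108, 101])
print(bias, cor, pct)
```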
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-06-04
Real-time PCR techniques are widely used for nucleic acid analysis, but one limitation of current, frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) lies in close proximity to a complementary quenching probe (QP), so that fluorescence can be quenched. The PCR primer pair carries an attached universal template (UT) sequence identical to that of the FP. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays, with sensitivity, reproducibility, and repeatability comparable to those of other real-time PCR methods. The results from gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification using AUDP real-time PCR assays indicate that the technique can be successfully applied to nucleic acid analysis, offering an alternative approach with high efficiency, reliability, and flexibility at low cost.
Quantitative assessment of human motion using video motion analysis
NASA Technical Reports Server (NTRS)
Probe, John D.
1990-01-01
In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.
Information fusion based techniques for HEVC
NASA Astrophysics Data System (ADS)
Fernández, D. G.; Del Barrio, A. A.; Botella, Guillermo; Meyer-Baese, Uwe; Meyer-Baese, Anke; Grecos, Christos
2017-05-01
To address the conflicting constraints of a multi-parameter H.265/HEVC encoder system, this paper analyzes a set of optimizations for improving the trade-off between quality, performance, and power consumption in applications that demand reliability and accuracy. The method is based on Pareto optimization and has been tested at different resolutions on real-time encoders.
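As a sketch of the underlying idea, Pareto optimization keeps only encoder configurations that are not dominated on every objective at once. The generic filter below, with hypothetical (distortion, time, power) costs, is not the paper's optimizer:

```python
def pareto_front(configs):
    """Return the non-dominated encoder configurations, where each entry
    is (name, (distortion, encode_time, power)) and lower is better on
    every axis -- a generic sketch, not the paper's exact method."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [
        (name, cost) for name, cost in configs
        if not any(dominates(other, cost) for _, other in configs if other != cost)
    ]

# Hypothetical configurations: (distortion, encode time in ms, power in W).
candidates = [("fast", (0.9, 10, 4.0)), ("balanced", (0.6, 25, 5.5)),
              ("slow", (0.5, 80, 9.0)), ("bad", (0.95, 90, 9.5))]
print(pareto_front(candidates))   # "bad" is dominated and dropped
```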
USDA-ARS?s Scientific Manuscript database
The ability to reliably analyze cellular and molecular profiles of normal or diseased tissues is frequently obfuscated by the inherent heterogeneous nature of tissues. Laser Capture Microdissection (LCM) is an innovative technique that allows the isolation and enrichment of pure subpopulations of c...
Methods for Identifying Object Class, Type, and Orientation in the Presence of Uncertainty
1990-08-01
USDA-ARS?s Scientific Manuscript database
Quantitative real-time PCR (qRT-PCR) is a reliable and reproducible technique for measuring and evaluating changes in gene expression. To facilitate gene expression studies and obtain more accurate qRT-PCR data, normalization relative to stable housekeeping genes is required. In this study, expres...
Operational numerical weather prediction on the CYBER 205 at the National Meteorological Center
NASA Technical Reports Server (NTRS)
Deaven, D.
1984-01-01
The Development Division of the National Meteorological Center (NMC), which has the responsibility of maintaining and developing the center's numerical weather forecasting systems, is discussed. Because of the mission of NMC, data products must be produced reliably and on time twice daily, free of surprises for forecasters. Development Division personnel are in a rather unique situation: they must develop new, advanced techniques for numerical analysis and prediction using current state-of-the-art methods, and implement them operationally without damaging the operations of the center. With the computational speed and resources now available from the CYBER 205, Development Division personnel will be able to introduce advanced analysis and prediction techniques into the operational job suite without disrupting the daily schedule. The capabilities of the CYBER 205 are discussed.
A wireless fatigue monitoring system utilizing a bio-inspired tree ring data tracking technique.
Bai, Shi; Li, Xuan; Xie, Zhaohui; Zhou, Zhi; Ou, Jinping
2014-03-05
Fatigue, a scientific research topic for centuries, can trigger the sudden failure of critical structures such as aircraft and railway systems, resulting in enormous casualties as well as economic losses. The fatigue life of certain structures is intrinsically random, and few monitoring techniques are capable of tracking full life-cycle fatigue damage. In this paper, a novel in-situ wireless real-time fatigue monitoring system using a bio-inspired tree ring data tracking technique is proposed. The general framework, methodology, and verification of this intelligent system are discussed in detail. The rain-flow counting (RFC) method is adopted as the core algorithm for quantifying fatigue damage, and Digital Signal Processing (DSP) hardware is introduced as the core module for data collection and analysis. Laboratory test results based on strain gauges and polyvinylidene fluoride (PVDF) sensors have shown that the developed intelligent system can provide reliable, quick feedback and early warning of fatigue failure. With the merits of low cost, high accuracy, and great reliability, the developed wireless fatigue sensing system can be further applied to mechanical engineering, civil infrastructure, transportation systems, aerospace engineering, etc.
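To illustrate the core algorithm, here is a compact four-point rainflow counting sketch; it conveys the RFC idea used for fatigue damage tracking but is not a full ASTM E1049 implementation and not the authors' DSP code:

```python
def rainflow(series):
    """Four-point rainflow cycle counting sketch.

    Returns (full_cycle_ranges, residual_half_cycle_ranges)."""
    # Reduce the load history to turning points (local extrema).
    pts = [series[0]]
    for a, b, c in zip(series, series[1:], series[2:]):
        if (b - a) * (c - b) < 0:
            pts.append(b)
    pts.append(series[-1])

    stack, full = [], []
    for p in pts:
        stack.append(p)
        while len(stack) >= 4:
            r1 = abs(stack[-3] - stack[-4])
            r2 = abs(stack[-2] - stack[-3])
            r3 = abs(stack[-1] - stack[-2])
            if r2 <= r1 and r2 <= r3:
                full.append(r2)          # inner cycle closed
                del stack[-3:-1]         # drop its two turning points
            else:
                break
    # Remaining ranges in the residue count as half cycles.
    half = [abs(a - b) for a, b in zip(stack, stack[1:])]
    return full, half

cycles, residue = rainflow([0, 5, -3, 4, -2, 6, -4, 2, 0])
print(cycles, residue)
```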
Applying phylogenetic analysis to viral livestock diseases: moving beyond molecular typing.
Olvera, Alex; Busquets, Núria; Cortey, Marti; de Deus, Nilsa; Ganges, Llilianne; Núñez, José Ignacio; Peralta, Bibiana; Toskano, Jennifer; Dolz, Roser
2010-05-01
Changes in livestock production systems in recent years have altered the presentation of many diseases resulting in the need for more sophisticated control measures. At the same time, new molecular assays have been developed to support the diagnosis of animal viral disease. Nucleotide sequences generated by these diagnostic techniques can be used in phylogenetic analysis to infer phenotypes by sequence homology and to perform molecular epidemiology studies. In this review, some key elements of phylogenetic analysis are highlighted, such as the selection of the appropriate neutral phylogenetic marker, the proper phylogenetic method and different techniques to test the reliability of the resulting tree. Examples are given of current and future applications of phylogenetic reconstructions in viral livestock diseases. Copyright 2009 Elsevier Ltd. All rights reserved.
Forensic analysis of dyed textile fibers.
Goodpaster, John V; Liszewski, Elisa A
2009-08-01
Textile fibers are a key form of trace evidence, and the ability to reliably associate or discriminate them is crucial for forensic scientists worldwide. While microscopic and instrumental analysis can be used to determine the composition of the fiber itself, additional specificity is gained by examining fiber color. This is particularly important when the bulk composition of the fiber is relatively uninformative, as it is with cotton, wool, or other natural fibers. Such analyses pose several problems, including extremely small sample sizes, the desire for nondestructive techniques, and the vast complexity of modern dye compositions. This review will focus on more recent methods for comparing fiber color by using chromatography, spectroscopy, and mass spectrometry. The increasing use of multivariate statistics and other data analysis techniques for the differentiation of spectra from dyed fibers will also be discussed.
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.
2016-01-01
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...
2017-01-24
We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Estuarial fingerprinting through multidimensional fluorescence and multivariate analysis.
Hall, Gregory J; Clow, Kerin E; Kenny, Jonathan E
2005-10-01
As part of a strategy for preventing the introduction of aquatic nuisance species (ANS) to U.S. estuaries, ballast water exchange (BWE) regulations have been imposed. Enforcing these regulations requires a reliable method for determining the port of origin of water in the ballast tanks of ships entering U.S. waters. This study shows that a three-dimensional fluorescence fingerprinting technique, excitation emission matrix (EEM) spectroscopy, holds great promise as a ballast water analysis tool. In our technique, EEMs are analyzed by multivariate classification and curve resolution methods, such as N-way partial least squares regression-discriminant analysis (NPLS-DA) and parallel factor analysis (PARAFAC). We demonstrate that classification techniques can be used to discriminate among sampling sites less than 10 miles apart, encompassing Boston Harbor and two tributaries in the Mystic River Watershed. To our knowledge, this work is the first to use multivariate analysis to classify water as to location of origin. Furthermore, it is shown that curve resolution can reveal seasonal features within the multidimensional fluorescence data sets, which correlate with difficulty in classification.
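As an illustration of the curve-resolution step, a PARAFAC decomposition of an EEM stack can be computed with the tensorly package (an assumed dependency; the original work did not necessarily use it):

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Hypothetical EEM stack: samples x excitation wavelengths x emission wavelengths.
eems = tl.tensor(np.random.rand(20, 40, 60))

# Trilinear PARAFAC decomposition resolves the stack into components;
# rank is the assumed number of fluorophores.
weights, (scores, excitation, emission) = parafac(eems, rank=3, init="random")

# 'scores' (20 x 3) gives per-sample component loadings that can feed a
# downstream classifier such as NPLS-DA.
print(scores.shape, excitation.shape, emission.shape)
```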
NASA Astrophysics Data System (ADS)
Vanvyve, E.; Magontier, P.; Vandenberghe, F. C.; Delle Monache, L.; Dickinson, K.
2012-12-01
Wind energy is among the fastest growing sources of renewable energy in the U.S. and could supply up to 20% of U.S. power production by 2030. An accurate and reliable wind resource assessment for prospective wind farm sites is a challenging task, yet it is crucial for evaluating the long-term profitability and feasibility of a potential development. We have developed an accurate and computationally efficient wind resource assessment technique for prospective wind farm sites, which incorporates innovative statistical techniques and the new NASA Earth science dataset MERRA. This technique produces a wind resource estimate that is more accurate than that obtained by the wind energy industry's standard technique, while providing a reliable quantification of its uncertainty. The focus now is on evaluating the socio-economic value of this new technique relative to the industry's standard technique. Would it yield lower financing costs? Could it result in lower electricity prices? Are there further down-the-line positive consequences, e.g., job creation, time saved, reduced greenhouse gas emissions? Ultimately, we expect our results will inform efforts to refine and disseminate the new technique to support the development of the U.S. renewable energy infrastructure. In order to address the above questions, we are carrying out a cost-benefit analysis based on the net present worth of the technique. We will describe this approach, including the cash-flow process of wind farm financing and how the wind resource assessment factors in, and will present current results for various hypothetical candidate wind farm sites.
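A net-present-worth comparison of the kind described reduces to discounting project cash flows under different assumptions. A minimal sketch with hypothetical figures, where reduced resource uncertainty is represented as a lower financing (discount) rate:

```python
def net_present_value(cash_flows, rate):
    """Discount a list of annual cash flows (year 0 first) at the given rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical 20-year wind farm: capital cost up front, then annual
# revenue net of operations.
flows = [-50e6] + [6e6] * 20
print(net_present_value(flows, 0.08))   # standard assessment
print(net_present_value(flows, 0.07))   # reduced-uncertainty assessment
```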
Johnson Space Center's Risk and Reliability Analysis Group 2008 Annual Report
NASA Technical Reports Server (NTRS)
Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn,Dick
2009-01-01
The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decision makers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes the nuclear power (both commercial and Navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience, specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information to the group when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. Therefore, the Analysis Group is in high demand to help the Space Shuttle Program (SSP) continue to fly safely, assist in designing the next generation spacecraft for the Constellation Program (CxP), and promote advanced analytical techniques. The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts, as well as performing major probabilistic assessments used to support flight rationale and help establish program requirements. During 2008, the Analysis Group performed more than 70 assessments. Although all these assessments were important, some were instrumental in the decision-making processes for the Shuttle and Constellation Programs. Two of the more significant tasks were the Space Transportation System (STS)-122 Low Level Cutoff PRA for the SSP and the Orion Pad Abort One (PA-1) PRA for the CxP. These two activities, along with the numerous other tasks the Analysis Group performed in 2008, are summarized in this report. This report also highlights several ongoing and upcoming efforts to provide crucial statistical and probabilistic assessments, such as the Extravehicular Activity (EVA) PRA for the Hubble Space Telescope service mission and the first fully integrated PRAs for the CxP's Lunar Sortie and ISS missions.
Interim reliability evaluation program, Browns Ferry 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1981-01-01
Probabilistic risk analysis techniques, i.e., event tree and fault tree analysis, were utilized to provide a risk assessment of the Browns Ferry Nuclear Plant Unit 1. Browns Ferry 1 is a General Electric boiling water reactor of the BWR 4 product line with a Mark 1 (drywell and torus) containment. Within the guidelines of the IREP Procedure and Schedule Guide, dominant accident sequences that contribute to public health and safety risks were identified and grouped according to release categories.
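At its core, fault tree quantification combines basic-event probabilities through AND/OR gates. The sketch below shows the generic arithmetic with assumed probabilities; it is not the IREP Browns Ferry 1 model:

```python
# Minimal fault-tree quantification sketch (generic gate arithmetic,
# assuming independent basic events with illustrative probabilities).
def and_gate(probs):
    """All inputs must fail for the gate to fail."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Any input failing fails the gate."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

pump_a, pump_b, valve = 1e-3, 1e-3, 5e-4            # assumed basic events
top = or_gate([and_gate([pump_a, pump_b]), valve])  # redundant pumps OR a shared valve
print(f"top-event probability = {top:.2e}")
```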
[Enzymatic analysis of the quality of foodstuffs].
Kolesnov, A Iu
1997-01-01
Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.
Validation of a novel smartphone accelerometer-based knee goniometer.
Ockendon, Matthew; Gilbert, Robin E
2012-09-01
Loss of full knee extension following anterior cruciate ligament surgery has been shown to impair knee function. However, there can be significant difficulties in accurately and reproducibly measuring a fixed flexion of the knee. We studied the interobserver and intraobserver reliabilities of a novel, smartphone accelerometer-based knee goniometer and compared it with a long-armed conventional goniometer for the assessment of fixed flexion knee deformity. Five healthy male volunteers (age range 30 to 40 years) were studied. Measurements of knee flexion angle were made with a telescopic-armed goniometer (Lafayette Instrument, Lafayette, IN) and compared with measurements using the smartphone (iPhone 3GS, Apple Inc., Cupertino, CA) knee goniometer, which uses a novel trigonometric technique based on tibial inclination. Bland-Altman analysis of validity and reliability, including statistical analysis of correlation by Pearson's method, was undertaken. The iPhone goniometer had an interobserver correlation (r) of 0.994 compared with 0.952 for the Lafayette. The intraobserver correlation was r = 0.982 for the iPhone (compared with 0.927). The datasets from the two instruments correlate closely (r = 0.947), are proportional, and have a mean difference of only -0.4 degrees (SD 3.86 degrees). The Lafayette goniometer had an intraobserver reliability of +/- 9.6 degrees and an interobserver reliability of +/- 8.4 degrees. By comparison, the iPhone had an interobserver reliability of +/- 2.7 degrees and an intraobserver reliability of +/- 4.6 degrees. We found the iPhone goniometer to be a reliable tool for the measurement of subtle knee flexion in the clinic setting.
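The trigonometric idea, inferring segment inclination from a static accelerometer's gravity vector, can be sketched as follows; the axis conventions and sample readings are assumptions, not the published app's code:

```python
import math

def tibial_inclination(ax, ay, az):
    """Inclination of the device's long axis (assumed y) from vertical,
    in degrees, from a static accelerometer reading (gravity only)."""
    return math.degrees(math.atan2(math.hypot(ax, az), ay))

# Knee flexion can then be taken as the change in tibial inclination
# between two reference positions (hypothetical readings, m/s^2):
flexion = tibial_inclination(0.05, 9.63, 1.70) - tibial_inclination(0.0, 9.81, 0.0)
print(f"{flexion:.1f} degrees")
```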
2013-01-01
Background Developing a quick and reliable technique to estimate floral cover in deserts will assist in monitoring and management. The present study attempted to estimate plant cover in the UAE desert using both digital photography and field sampling. Digital photographs were correlated with field data to estimate floral cover in moderately (Al-Maha) and heavily (DDCR) grazed areas. The Kruskal-Wallis test was also used to assess compatibility between the two techniques within and across grazing intensities and soil substrates. Results Results showed that photographs could be a reliable technique within the sand dune substrate under moderate grazing (r = 0.69). The results were very poorly correlated (r = -0.24) or even inversely proportional (r = -0.48) when performed within DDCR. Overall, Chi-square values for Al-Maha and DDCR were not significant at P > 0.05, indicating similarities between the two methods. At the soil type level, the Kruskal-Wallis analysis was not significant (P > 0.05), except for gravel plains (P < 0.05). Across grazing intensities and soil substrates, the two techniques were in agreement in ranking most plant species, except for Lycium shawii. Conclusions Consequently, the present study showed that digital photography could not be used reliably to assess floral cover, although further testing is required to support this claim. An image-based sampling approach of plant cover at the species level, across different grazing and substrate variations in desert ecosystems, has its uses, but results are to be cautiously interpreted. PMID:23758667
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, Juliane; Tolson, Bryan
2017-04-01
The increasing complexity and runtime of environmental models lead to the current situation in which the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state of the art, the convergence of the sensitivity methods is usually not checked. At best, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indices. Bootstrapping, however, may itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indices without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards; the latter case enables the checking of already processed sensitivity indices. To demonstrate the method independency of the convergence testing method, we applied it to three widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al. 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010), and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indices of the aforementioned three methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on the model independency by testing the frugal method using the hydrologic model mHM (www.ufz.de/mhm) with about 50 model parameters. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that it requires no further model evaluations and therefore enables the checking of already processed (and published) sensitivity results. This is one step towards reliable and transferable published sensitivity results.
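For context, the Morris Elementary Effects screening that the MVA approach is tested against computes finite-difference effects of one-at-a-time parameter perturbations. A simplified sketch (single-step perturbations on the unit hypercube rather than full Morris trajectories):

```python
import numpy as np

def elementary_effects(f, n_params, n_trajectories=50, delta=0.1, seed=0):
    """Simplified Morris screening: mean absolute elementary effect
    (mu*) per parameter, a common screening importance measure."""
    rng = np.random.default_rng(seed)
    effects = np.zeros((n_trajectories, n_params))
    for t in range(n_trajectories):
        x = rng.uniform(0, 1 - delta, size=n_params)
        base = f(x)
        for i in range(n_params):
            xp = x.copy()
            xp[i] += delta                       # perturb one parameter
            effects[t, i] = (f(xp) - base) / delta
    return np.abs(effects).mean(axis=0)          # mu*

# Toy model: the second parameter should screen as more important.
mu_star = elementary_effects(lambda x: x[0] + 10 * x[1] ** 2, n_params=2)
print(mu_star)
```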
NASA Technical Reports Server (NTRS)
Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.
1999-01-01
Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. As such, there is great interest in having all implementations of TCP interoperate efficiently. This is particularly important for links exhibiting long bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate and long-delay links, such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.
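The difficulty of such links comes from the bandwidth-delay product: the amount of data that must be in flight to keep the pipe full. A back-of-the-envelope calculation, assuming a 500 ms round-trip time for the geosynchronous path:

```python
# Bandwidth-delay product for a 622 Mbps geosynchronous satellite link.
rate_bps = 622e6     # link rate in bits per second
rtt_s = 0.5          # assumed round-trip time in seconds
bdp_bytes = rate_bps * rtt_s / 8
print(f"BDP ≈ {bdp_bytes / 1e6:.1f} MB of unacknowledged data in flight")
# A TCP sender therefore needs window scaling (RFC 1323) well beyond
# the classic 64 KB window to fill such a pipe.
```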
Perceptual-cognitive skills and performance in orienteering.
Guzmán, José F; Pablos, Ana M; Pablos, Carlos
2008-08-01
The goal was analysis of the perceptual-cognitive skills associated with sport performance in orienteering in a sample of 22 elite and 17 nonelite runners. Variables considered were memory, basic orienteering techniques, map reading, symbol knowledge, map-terrain-map identification, and spatial organisation. A computerised questionnaire was developed to measure the variables. The reliability of the test (agreement between experts) was 90%. Findings suggested that competence in performing basic orienteering techniques efficiently was a key variable differentiating between the elite and the nonelite athletes. The results are discussed in comparison with previous studies.
NASA Technical Reports Server (NTRS)
1982-01-01
Effective screening techniques are evaluated for detecting insulation resistance degradation and failure in hermetically sealed metallized film capacitors used in applications where low capacitor voltage and energy levels are common to the circuitry. A special test and monitoring system capable of rapidly scanning all test capacitors and recording faults and/or failures is examined. Tests include temperature cycling and storage as well as low, medium, and high voltage life tests. Polysulfone film capacitors are more heat stable and reliable than polycarbonate film units.
Correlative light-electron fractography for fatigue striations characterization in metallic alloys.
Hein, Luis Rogerio de Oliveira; de Oliveira, José Alberto; de Campos, Kamila Amato
2013-09-01
The correlative light-electron fractography technique combines correlative microscopy concepts to the extended depth-from-focus reconstruction method, associating the reliable topographic information of 3-D maps from light microscopy ordered Z-stacks to the finest lateral resolution and large focus depth from scanning electron microscopy. Fatigue striations spacing analysis can be precisely measured, by correcting the mean surface tilting with the knowledge of local elevation data from elevation maps. This new technique aims to improve the accuracy of quantitative fractography in fatigue fracture investigations. Copyright © 2013 Wiley Periodicals, Inc.
Camp, Christopher L; Heidenreich, Mark J; Dahm, Diane L; Bond, Jeffrey R; Collins, Mark S; Krych, Aaron J
2016-03-01
Tibial tubercle-trochlear groove (TT-TG) distance is a variable that helps guide surgical decision-making in patients with patellar instability. The purpose of this study was to compare the accuracy and reliability of an MRI TT-TG measuring technique using a simple external alignment method to a previously validated gold standard technique that requires advanced software read by radiologists. TT-TG was calculated by MRI on 59 knees with a clinical diagnosis of patellar instability in a blinded and randomized fashion by two musculoskeletal radiologists using advanced software and by two orthopaedists using the study technique, which utilizes measurements taken on a simple electronic imaging platform. Interrater reliability between the two radiologists and the two orthopaedists, and intermethod reliability between the two techniques, were calculated using intraclass correlation coefficients (ICC) and concordance correlation coefficients (CCC). ICC and CCC values greater than 0.75 were considered to represent excellent agreement. The mean TT-TG distance was 14.7 mm (standard deviation (SD) 4.87 mm) and 15.4 mm (SD 5.41 mm) as measured by the radiologists and orthopaedists, respectively. Excellent interobserver agreement was noted between the radiologists (ICC 0.941; CCC 0.941), the orthopaedists (ICC 0.978; CCC 0.976), and the two techniques (ICC 0.941; CCC 0.933). The simple TT-TG distance measurement technique analysed in this study resulted in excellent agreement and reliability as compared to the gold standard technique. This method can predictably be performed by orthopaedic surgeons without advanced radiologic software. Level of evidence: II.
Establishing the reliability of rhesus macaque social network assessment from video observations
Feczko, Eric; Mitchell, Thomas A. J.; Walum, Hasse; Brooks, Jenna M.; Heitz, Thomas R.; Young, Larry J.; Parr, Lisa A.
2015-01-01
Understanding the properties of a social environment is important for understanding the dynamics of social relationships. Understanding such dynamics is relevant for multiple fields, ranging from animal behaviour to social and cognitive neuroscience. To quantify social environment properties, recent studies have incorporated social network analysis. Social network analysis quantifies both the global and local properties of a social environment, such as social network efficiency and the roles played by specific individuals, respectively. Despite the plethora of studies incorporating social network analysis, methods to determine the amount of data necessary to derive reliable social networks are still being developed. Determining the amount of data necessary for a reliable network is critical for measuring changes in the social environment, for example following an experimental manipulation, and therefore may be critical for using social network analysis to statistically assess social behaviour. In this paper, we extend methods for measuring error in acquired data and for determining the amount of data necessary to generate reliable social networks. We derived social networks from a group of 10 male rhesus macaques, Macaca mulatta, for three behaviours: spatial proximity, grooming and mounting. Behaviours were coded using a video observation technique, where video cameras recorded the compound where the 10 macaques resided. We collected, coded and used 10 h of video data to construct these networks. Using the methods described here, we found in our data that 1 h of spatial proximity observations produced reliable social networks. However, this may not be true for other studies due to differences in data acquisition. Our results have broad implications for measuring and predicting the amount of error in any social network, regardless of species. PMID:26392632
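A sketch of the network-construction step, using the networkx package with hypothetical dyadic observation counts for the 10 animals:

```python
import networkx as nx

# Build a weighted proximity network from coded dyadic observation
# counts (hypothetical values), then compute global and local
# properties of the kind used in social network analysis.
G = nx.Graph()
G.add_nodes_from(range(10))
observations = {(0, 1): 14, (0, 2): 3, (1, 2): 7, (3, 4): 9, (2, 5): 2}
for (a, b), count in observations.items():
    G.add_edge(a, b, weight=count)

print(nx.global_efficiency(G))    # global property (unweighted efficiency)
print(nx.degree_centrality(G))    # local role of each individual
```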
Norte, Grant E; Frye, Jamie L; Hart, Joseph M
2015-11-01
The superimposed-burst (SIB) technique is commonly used to quantify central activation failure after knee-joint injury, but its reliability has not been established in pathologic cohorts. To assess within-session and between-sessions reliability of the SIB technique in patients with patellofemoral pain. Descriptive laboratory study. University laboratory. A total of 10 patients with self-reported patellofemoral pain (1 man, 9 women; age = 24.1 ± 3.8 years, height = 167.8 ± 15.2 cm, mass = 71.6 ± 17.5 kg) and 10 healthy control participants (3 men, 7 women; age = 27.4 ± 5.0 years, height = 173.5 ± 9.9 cm, mass = 78.2 ± 16.5 kg) volunteered. Participants were assessed at 6 intervals spanning 21 days. Intraclass correlation coefficients (ICCs [3,3]) were used to assess reliability. Quadriceps central activation ratio, knee-extension maximal voluntary isometric contraction force, and SIB force. The quadriceps central activation ratio was highly reliable within session (ICC [3,3] = 0.97) and between sessions through day 21 (ICC [3,3] = 0.90-0.95). Acceptable reliability of knee extension (ICC [3,3] = 0.75-0.91) and SIB force (ICC [3,3] = 0.77-0.89) was observed through day 21. The SIB technique was reliable for clinical research up to 21 days in patients with patellofemoral pain.
Does gymnastics practice improve vertical jump reliability from the age of 8 to 10 years?
Marina, Michel; Torrado, Priscila
2013-01-01
The objective of this study was to confirm whether gymnastics practice from a young age can induce greater vertical jump reliability. Fifty young female gymnasts (8.84 ± 0.62 years) and 42 females in the control group (8.58 ± 0.92 years) performed the following jump tests on a contact mat: squat jump, countermovement jump, countermovement jump with arm swing and drop jump from heights of 40 and 60 cm. The two testing sessions had three trials each and were separated by one week. A 2 (groups) × 2 (sessions) × 3 (trials) repeated measures analysis of variance (ANOVA) and a test-retest correlation analysis were used to study the reliability. There was no systematic source of error in either group for non-plyometric jumps such as squat jump, countermovement jump, and countermovement jump with arm swing. A significant group per trial interaction revealed a learning effect in gymnasts' drop jumps from 40 cm height. Additionally, the test-retest correlation analysis and the higher minimum detectable error suggest that the quick drop jump technique was not fully consolidated in either group. At an introductory level of gymnastics and between the ages of 8-10 years, the condition of being a gymnast did not lead to conclusively higher reliability, aside from better overall vertical jump performance.
Magenes, G; Bellazzi, R; Malovini, A; Signorini, M G
2016-08-01
The onset of fetal pathologies can be screened during pregnancy by means of Fetal Heart Rate (FHR) monitoring and analysis. Noticeable advances in understanding FHR variations were obtained in the last twenty years, thanks to the introduction of quantitative indices extracted from the FHR signal. This study aims to discriminate normal and Intra Uterine Growth Restricted (IUGR) fetuses by applying data mining techniques to FHR parameters obtained from recordings in a population of 122 fetuses (61 healthy and 61 IUGR) through a standard CTG non-stress test. We computed N=12 indices (N=4 related to time domain FHR analysis, N=4 to frequency domain, and N=4 to non-linear analysis) and normalized them with respect to the gestational week. We compared, through a 10-fold cross-validation procedure, 15 data mining techniques in order to select the most reliable approach for identifying IUGR fetuses. The results of this comparison highlight that two techniques (Random Forest and Logistic Regression) show the best classification accuracy and that both outperform the best single parameter in terms of mean AUROC on the test sets.
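The comparison protocol, 10-fold cross-validation of classifiers scored by AUROC, can be sketched with scikit-learn; the feature matrix below is simulated, standing in for the 12 normalized FHR indices:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Simulated stand-in: 122 fetuses x 12 indices; y marks IUGR (1) vs normal (0).
rng = np.random.default_rng(0)
X, y = rng.normal(size=(122, 12)), np.repeat([0, 1], 61)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for model in (RandomForestClassifier(random_state=0),
              LogisticRegression(max_iter=1000)):
    aucs = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(type(model).__name__, aucs.mean())   # mean AUROC over the 10 folds
```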
Reliability of Two Smartphone Applications for Radiographic Measurements of Hallux Valgus Angles.
Mattos E Dinato, Mauro Cesar; Freitas, Marcio de Faria; Milano, Cristiano; Valloto, Elcio; Ninomiya, André Felipe; Pagnano, Rodrigo Gonçalves
The objective of the present study was to assess the reliability of 2 smartphone applications compared with the traditional goniometer technique for measurement of radiographic angles in hallux valgus and the time required for analysis with the different methods. The radiographs of 31 patients (52 feet) with a diagnosis of hallux valgus were analyzed. Four observers, 2 with >10 years' experience in foot and ankle surgery and 2 in-training surgeons, measured the hallux valgus angle and intermetatarsal angle using a manual goniometer technique and 2 smartphone applications (Hallux Angles and iPinPoint). The interobserver and intermethod reliability were estimated using intraclass correlation coefficients (ICCs), and the time required for measurement of the angles among the 3 methods was compared using the Friedman test. A very good or good interobserver reliability was found among the 4 observers measuring the hallux valgus angle and intermetatarsal angle using the goniometer (ICC 0.913 and 0.821, respectively) and iPinPoint (ICC 0.866 and 0.638, respectively). Using the Hallux Angles application, a very good interobserver reliability was found for measurements of the hallux valgus angle (ICC 0.962) and intermetatarsal angle (ICC 0.935) only among the more experienced observers. The time required for the measurements was significantly shorter for the measurements using both smartphone applications compared with the goniometer method. One smartphone application (iPinPoint) was reliable for measurements of the hallux valgus angles by either experienced or nonexperienced observers. The use of these tools might save time in the evaluation of radiographic angles in the hallux valgus. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
Life Predicted in a Probabilistic Design Space for Brittle Materials With Transient Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Palfi, Tamas; Reh, Stefan
2005-01-01
Analytical techniques have progressively become more sophisticated, and now we can consider the probabilistic nature of the entire space of random input variables on the lifetime reliability of brittle structures. This was demonstrated with NASA's CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code combined with the commercially available ANSYS/Probabilistic Design System (ANSYS/PDS), a probabilistic analysis tool that is an integral part of the ANSYS finite-element analysis program. ANSYS/PDS allows probabilistic loads, component geometry, and material properties to be considered in the finite-element analysis. CARES/Life predicts the time-dependent probability of failure of brittle material structures under generalized thermomechanical loading, such as that found in a turbine engine hot section. Glenn researchers coupled ANSYS/PDS with CARES/Life to assess the effects of the stochastic variables of component geometry, loading, and material properties on the predicted life of the component for fully transient thermomechanical and cyclic loading.
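The coupling amounts to propagating stochastic inputs through a structural model and scoring failure with a Weibull law. A Monte Carlo sketch with an assumed stress model and Weibull parameters, illustrative only and not the CARES/Life algorithm:

```python
import numpy as np

# Sample stochastic inputs, map them to a maximum stress via a crude
# stand-in model, and score failure with a two-parameter Weibull law.
rng = np.random.default_rng(1)
n = 100_000
load = rng.normal(100.0, 10.0, n)         # MPa, random applied load (assumed)
thickness = rng.normal(2.0, 0.05, n)      # mm, random geometry (assumed)

stress = load / thickness                 # stand-in stress model
sigma0, m = 80.0, 10.0                    # Weibull scale and modulus (assumed)
p_fail = 1.0 - np.exp(-(stress / sigma0) ** m)
print(f"mean failure probability ≈ {p_fail.mean():.3f}")
```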
NASA Astrophysics Data System (ADS)
Suparta, Wayan; Rahman, Rosnani
2016-02-01
Global Positioning System (GPS) receivers are widely installed throughout Peninsular Malaysia, but their use in weather hazard monitoring systems, such as for flash floods, is still not optimal. To increase the benefit for meteorological applications, GPS receivers should be collocated with meteorological sensors so that the precipitable water vapor (PWV) can be measured. The distribution of PWV is a key element of the Earth's climate for quantitative precipitation improvement as well as flash flood forecasting. The accuracy of this parameter depends to a large extent on the number of GPS receiver installations and meteorological sensors in the targeted area. Due to cost constraints, a spatial interpolation method is proposed to address these issues. In this paper, we investigated the spatial distribution of GPS PWV and meteorological variables (surface temperature, relative humidity, and rainfall) using thin plate spline (tps) and ordinary kriging (Krig) interpolation techniques over the Klang Valley in Peninsular Malaysia (longitude: 99.5°-102.5°E and latitude: 2.0°-6.5°N). Three flash flood cases in September, October, and December 2013 were studied. The analysis was performed using the mean absolute error (MAE), root mean square error (RMSE), and coefficient of determination (R2) to determine the accuracy and reliability of the interpolation techniques. Results evaluated at different phases (pre-flood, onset, and post-flood) showed that the tps interpolation technique is more accurate, reliable, and highly correlated in estimating GPS PWV and relative humidity, whereas Krig is more reliable for predicting temperature and rainfall during pre-flash-flood events. During the onset of flash flood events, both methods showed good interpolation in estimating all meteorological parameters with high accuracy and reliability. These findings suggest that the proposed spatial interpolation techniques are capable of handling limited data sources with high accuracy, which in turn can be used to predict future floods.
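The accuracy metrics used in the comparison can be computed directly from held-out station values. A minimal sketch with hypothetical PWV observations:

```python
import numpy as np

def interpolation_scores(observed, predicted):
    """MAE, RMSE, and R^2 between station observations and values
    cross-validated from an interpolated field (tps or kriging)."""
    o, p = np.asarray(observed, float), np.asarray(predicted, float)
    mae = np.abs(o - p).mean()
    rmse = np.sqrt(((o - p) ** 2).mean())
    r2 = 1.0 - ((o - p) ** 2).sum() / ((o - o.mean()) ** 2).sum()
    return mae, rmse, r2

# Hypothetical PWV values (mm) at held-out stations:
print(interpolation_scores([42.1, 44.8, 39.5, 41.0], [41.5, 45.2, 40.3, 41.2]))
```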
Proust, Gwénaëlle; Trimby, Patrick; Piazolo, Sandra; Retraint, Delphine
2017-01-01
One of the challenges in microstructure analysis nowadays resides in the reliable and accurate characterization of ultra-fine grained (UFG) and nanocrystalline materials. The traditional techniques associated with scanning electron microscopy (SEM), such as electron backscatter diffraction (EBSD), do not possess the required spatial resolution due to the large interaction volume between the electrons from the beam and the atoms of the material. Transmission electron microscopy (TEM) has the required spatial resolution; however, due to a lack of automation in the analysis system, the rate of data acquisition is slow, which limits the area of the specimen that can be characterized. This paper presents a new characterization technique, Transmission Kikuchi Diffraction (TKD), which enables the analysis of the microstructure of UFG and nanocrystalline materials using an SEM equipped with a standard EBSD system. The spatial resolution of this technique can reach 2 nm, and it can be applied to a large range of materials that would be difficult to analyze using traditional EBSD. After presenting the experimental setup and describing the different steps necessary to perform a TKD analysis, examples of its use on metal alloys and minerals are shown to illustrate the resolution of the technique and its flexibility in terms of the materials that can be characterized. PMID:28447998
Proust, Gwénaëlle; Trimby, Patrick; Piazolo, Sandra; Retraint, Delphine
2017-04-01
One of the challenges in microstructure analysis nowadays resides in the reliable and accurate characterization of ultra-fine grained (UFG) and nanocrystalline materials. The traditional techniques associated with scanning electron microscopy (SEM), such as electron backscatter diffraction (EBSD), do not possess the required spatial resolution due to the large interaction volume between the electrons from the beam and the atoms of the material. Transmission electron microscopy (TEM) has the required spatial resolution; however, due to a lack of automation in the analysis system, the rate of data acquisition is slow, which limits the area of the specimen that can be characterized. This paper presents a new characterization technique, Transmission Kikuchi Diffraction (TKD), which enables the analysis of the microstructure of UFG and nanocrystalline materials using an SEM equipped with a standard EBSD system. The spatial resolution of this technique can reach 2 nm, and it can be applied to a large range of materials that would be difficult to analyze using traditional EBSD. After presenting the experimental setup and describing the different steps necessary to perform a TKD analysis, examples of its use on metal alloys and minerals are shown to illustrate the resolution of the technique and its flexibility in terms of the materials that can be characterized.
Classifiers utilized to enhance acoustic based sensors to identify round types of artillery/mortar
NASA Astrophysics Data System (ADS)
Grasing, David; Desai, Sachi; Morcos, Amir
2008-04-01
Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period, together with the level of ambient pressure excitation, facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via the acoustic signals produced during launch events. Acoustic sensors exploit the sound waveform generated by the blast to identify mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise among the different mortar/artillery variants because varying HE mortar payloads and related charges produce events of varying size at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques helps develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination is achieved with a feedforward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, the frequency spectrum, and higher-frequency details found within different energy bands. The processes described herein extend current technologies that emphasize acoustic sensor systems to provide such situational awareness.
Artillery/mortar type classification based on detected acoustic transients
NASA Astrophysics Data System (ADS)
Morcos, Amir; Grasing, David; Desai, Sachi
2008-04-01
Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period, together with the level of ambient pressure excitation, facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via the acoustic signals produced during launch events. Acoustic sensors exploit the sound waveform generated by the blast to identify mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise among the different mortar/artillery variants because varying HE mortar payloads and related charges produce events of varying size at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques helps develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination is achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, the frequency spectrum, and higher-frequency details found within different energy bands. The processes described herein extend current technologies that emphasize acoustic sensor systems to provide such situational awareness.
Artillery/mortar round type classification to increase system situational awareness
NASA Astrophysics Data System (ADS)
Desai, Sachi; Grasing, David; Morcos, Amir; Hohil, Myron
2008-04-01
Feature extraction methods based on the statistical analysis of the change in event pressure levels over a period, together with the level of ambient pressure excitation, facilitate the development of a robust classification algorithm. The features reliably discriminate mortar and artillery variants via the acoustic signals produced during launch events. Acoustic sensors exploit the sound waveform generated by the blast to identify mortar and artillery variants (as type A, etc.) through analysis of the waveform. Distinct characteristics arise among the different mortar/artillery variants because varying HE mortar payloads and related charges produce events of varying size at launch. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing and data mining techniques, can be employed to classify a given type. Skewness and other statistical processing techniques are used to extract the predominant components from the acoustic signatures at ranges exceeding 3000 m. Exploiting these techniques helps develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination is achieved with a feedforward neural network classifier trained on a feature space derived from the distribution of statistical coefficients, the frequency spectrum, and higher-frequency details found within different energy bands. The processes described herein extend current technologies that emphasize acoustic sensor systems to provide such situational awareness.
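The classification stage common to these three studies can be sketched with a feedforward network in scikit-learn; the feature matrix below is simulated, standing in for the extracted acoustic feature space of statistical coefficients and band energies:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Simulated placeholder features: 400 launch transients x 16 acoustic
# features (e.g., skewness, band energies); four hypothetical round types.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 16))
y = rng.integers(0, 4, size=400)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)
clf.fit(X, y)
print(clf.score(X, y))   # training accuracy on the simulated data
```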
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hovel, Harold; Prettyman, Kevin
A side-by-side analysis was done on then-currently-available technology, along with roadmaps to push each particular option forward. Variations in turnkey line processes can and do result in variations in finished solar device performance. Together with variations in starting material quality, the result is a distribution of efficiencies. Forensic analysis and characterization of each crystalline-Si-based technology will determine the most promising approach with respect to cost, efficiency, and reliability. Forensic analysis will also shed light on the causes of binning variations. Si solar cells from each turnkey supplier were forensically analyzed using a host of techniques.
Apollo experience report: Reliability and quality assurance
NASA Technical Reports Server (NTRS)
Sperber, K. P.
1973-01-01
The reliability of the Apollo spacecraft resulted from the application of proven reliability and quality techniques and from sound management, engineering, and manufacturing practices. Continual assessment of these techniques and practices was made during the program, and, when deficiencies were detected, adjustments were made and the deficiencies were effectively corrected. The most significant practices, deficiencies, adjustments, and experiences during the Apollo Program are described in this report. These experiences can be helpful in establishing an effective base on which to structure an efficient reliability and quality assurance effort for future space-flight programs.
Rapid analysis of controlled substances using desorption electrospray ionization mass spectrometry.
Rodriguez-Cruz, Sandra E
2006-01-01
The recently developed technique of desorption electrospray ionization (DESI) has been applied to the rapid analysis of controlled substances. Experiments have been performed using a commercial ThermoFinnigan LCQ Advantage MAX ion-trap mass spectrometer with limited modifications. Results from the ambient sampling of licit and illicit tablets demonstrate the ability of the DESI technique to detect the main active ingredient(s) or controlled substance(s), even in the presence of other higher-concentration components. Full-scan mass spectrometry data provide preliminary identification by molecular weight determination, while rapid analysis using the tandem mass spectrometry (MS/MS) mode provides fragmentation data which, when compared to the laboratory-generated ESI-MS/MS spectral library, provide structural information and final identification of the active ingredient(s). The consecutive analysis of tablets containing different active components indicates there is no cross-contamination or interference from tablet to tablet, demonstrating the reliability of the DESI technique for rapid sampling (one tablet/min or better). Active ingredients have been detected for tablets in which the active component represents less than 1% of the total tablet weight, demonstrating the sensitivity of the technique. The real-time sampling of cannabis plant material is also presented.
Hua, Yujuan; Hawryluk, Myron; Gras, Ronda; Shearer, Randall; Luong, Jim
2018-01-01
A fast and reliable analytical technique for the determination of total sulfur levels in complex hydrocarbon matrices is introduced. The method employed flow injection technique using a gas chromatograph as a sample introduction device and a gas phase dual-plasma sulfur chemiluminescence detector for sulfur quantification. Using the technique described, total sulfur measurement in challenging hydrocarbon matrices can be achieved in less than 10 s with sample-to-sample time <2 min. The high degree of selectivity and sensitivity toward sulfur compounds of the detector offers the ability to measure low sulfur levels with a detection limit in the range of 20 ppb w/w S. The equimolar response characteristic of the detector allows the quantitation of unknown sulfur compounds and simplifies the calibration process. Response is linear over a concentration range of five orders of magnitude, with a high degree of repeatability. The detector's lack of response to hydrocarbons enables direct analysis without the need for time-consuming sample preparation and chromatographic separation processes. This flow injection-based sulfur chemiluminescence detection technique is ideal for fast analysis or trace sulfur analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Paige, Samantha R; Krieger, Janice L; Stellefson, Michael; Alber, Julia M
2017-02-01
Chronic disease patients are affected by low computer and health literacy, which negatively affects their ability to benefit from access to online health information. To estimate reliability and confirm model specifications for eHealth Literacy Scale (eHEALS) scores among chronic disease patients using Classical Test Theory (CTT) and Item Response Theory techniques. A stratified sample of Black/African American (N=341) and Caucasian (N=343) adults with chronic disease completed an online survey including the eHEALS. Item discrimination was explored using bivariate correlations and Cronbach's alpha for internal consistency. A categorical confirmatory factor analysis tested a one-factor structure of eHEALS scores. Item characteristic curves, in-fit/outfit statistics, omega coefficient, and item reliability and separation estimates were computed. A one-factor structure of eHEALS was confirmed by statistically significant standardized item loadings, acceptable model fit indices (CFI/TLI>0.90), and 70% variance explained by the model. Item response categories increased with higher theta levels, and there was evidence of acceptable reliability (ω=0.94; item reliability=.89; item separation=8.54). eHEALS scores are a valid and reliable measure of self-reported eHealth literacy among Internet-using chronic disease patients. Providers can use eHEALS to help identify patients' eHealth literacy skills. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
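The internal-consistency step reported above relies on Cronbach's alpha, which is short to compute; a minimal Python sketch follows, with invented Likert-scale data standing in for the survey responses.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Example: 8 respondents answering a hypothetical 4-item scale (1-5 Likert).
scores = np.array([[4, 5, 4, 4], [2, 2, 3, 2], [5, 5, 4, 5], [3, 3, 3, 4],
                   [1, 2, 2, 1], [4, 4, 5, 4], [2, 3, 2, 2], [5, 4, 5, 5]])
print(f"alpha = {cronbach_alpha(scores):.2f}")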
A Human Reliability Based Usability Evaluation Method for Safety-Critical Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillippe Palanque; Regina Bernhaupt; Ronald Boring
2006-04-01
Recent years have seen an increasing use of sophisticated interaction techniques, including in the field of safety-critical interactive software [8]. The use of such techniques has been required in order to increase the bandwidth between users and systems and thus to help them deal efficiently with increasingly complex systems. These techniques come from research and innovation done in the field of human-computer interaction (HCI). A significant effort is currently being undertaken by the HCI community in order to apply and extend current usability evaluation techniques to these new kinds of interaction techniques. However, very little has been done to improve the reliability of software offering these kinds of interaction techniques. Even testing basic graphical user interfaces remains a challenge that has rarely been addressed in the field of software engineering [9]. However, the non-reliability of interactive software can jeopardize usability evaluation by showing unexpected or undesired behaviors. The aim of this SIG is to provide a forum for both researchers and practitioners interested in testing interactive software. Our goal is to define a roadmap of activities to cross-fertilize usability and reliability testing of these kinds of systems to minimize duplicate efforts in both communities.
Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.
2015-08-19
Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves first identifying potential candidate inputs, then second organizing data for the analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. As a result, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.
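Steps three and four of the process above (fitting models over combinations of candidate inputs and comparing them with a metric) can be sketched as follows; the inputs, response, and use of AIC as the comparison metric are illustrative assumptions, not the authors' choices.

import itertools
import numpy as np
from sklearn.linear_model import LinearRegression

def aic(y, yhat, n_params):
    """Gaussian AIC computed from the residual sum of squares."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * n_params

rng = np.random.default_rng(1)
n = 120
inputs = {"age": rng.uniform(0, 20, n),
          "humidity": rng.uniform(10, 90, n),
          "cycles": rng.integers(0, 500, n).astype(float)}
# Hypothetical reliability summary actually driven by age and cycles only.
y = 1.0 - 0.01 * inputs["age"] - 0.0005 * inputs["cycles"] + 0.02 * rng.standard_normal(n)

results = []
names = list(inputs)
for r in range(1, len(names) + 1):
    for combo in itertools.combinations(names, r):
        X = np.column_stack([inputs[c] for c in combo])
        model = LinearRegression().fit(X, y)
        results.append((aic(y, model.predict(X), X.shape[1] + 1), combo))
for score, combo in sorted(results)[:3]:   # prioritized list for the experts
    print(f"AIC={score:7.1f}  inputs={combo}")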
Automated Spatiotemporal Analysis of Fibrils and Coronal Rain Using the Rolling Hough Transform
NASA Astrophysics Data System (ADS)
Schad, Thomas
2017-09-01
A technique is presented that automates the direction characterization of curvilinear features in multidimensional solar imaging datasets. It is an extension of the Rolling Hough Transform (RHT) technique presented by Clark, Peek, and Putman ( Astrophys. J. 789, 82, 2014), and it excels at rapid quantification of spatial and spatiotemporal feature orientation even for applications with a low signal-to-noise ratio. It operates on a pixel-by-pixel basis within a dataset and reliably quantifies orientation even for locations not centered on a feature ridge, which is used here to derive a quasi-continuous map of the chromospheric fine-structure projection angle. For time-series analysis, a procedure is developed that uses a hierarchical application of the RHT to automatically derive the apparent motion of coronal rain observed off-limb. Essential to the success of this technique is the formulation presented in this article for the RHT error analysis as it provides a means to properly filter results.
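As a loose, simplified sketch of the idea behind RHT-style direction characterization (not the published implementation, and omitting the unsharp-mask preprocessing, circular windowing, and error analysis of the actual method), the following Python snippet scores line support along trial orientations through a window centre.

import numpy as np

def local_orientation(patch, n_theta=36):
    """Very simplified RHT-style direction estimate at the patch centre.

    Sums binarized intensity along diameters at n_theta trial angles and
    returns the angle with maximal support (and the support as confidence).
    """
    r = patch.shape[0] // 2
    binary = patch > patch.mean()                 # crude 'ridge' mask
    best_theta, best_score = 0.0, -1.0
    ts = np.linspace(-r, r, 2 * r + 1)
    for theta in np.linspace(0, np.pi, n_theta, endpoint=False):
        ys = np.clip(np.round(r + ts * np.sin(theta)).astype(int), 0, 2 * r)
        xs = np.clip(np.round(r + ts * np.cos(theta)).astype(int), 0, 2 * r)
        score = binary[ys, xs].mean()
        if score > best_score:
            best_theta, best_score = theta, score
    return best_theta, best_score

# A synthetic fibril: a bright diagonal ridge in a 21x21 window.
img = np.zeros((21, 21))
for i in range(21):
    img[i, i] = 1.0
theta, conf = local_orientation(img)
print(f"estimated angle = {np.degrees(theta):.0f} deg, support = {conf:.2f}")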
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
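The three-property relevancy test lends itself to a simple illustrative encoding. The equal-weight scoring below is a hypothetical sketch, not the methodology's actual scale or weighting.

from dataclasses import dataclass

@dataclass
class Component:
    function: str
    failure_modes: frozenset
    environment: str

def relevancy(candidate: Component, target: Component) -> float:
    """Score 0-1: similarity on function, failure modes, environment/boundary."""
    func = 1.0 if candidate.function == target.function else 0.0
    modes = (len(candidate.failure_modes & target.failure_modes)
             / max(len(target.failure_modes), 1))
    env = 1.0 if candidate.environment == target.environment else 0.0
    return (func + modes + env) / 3.0

target = Component("heat exchange", frozenset({"tube rupture", "fouling"}), "sodium")
candidate = Component("heat exchange", frozenset({"tube rupture", "leak"}), "water")
print(f"relevancy = {relevancy(candidate, target):.2f}")  # partial match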
Reliable enumeration of malaria parasites in thick blood films using digital image analysis.
Frean, John A
2009-09-23
Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements of a digital microscope camera, personal computer and good quality staining of slides are potentially reasonably easy to meet.
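Particle counting with size filtering of the kind adapted above can be sketched with scipy.ndimage; the threshold and area window below are invented, and real use would tune them to the parasite size distributions as the paper describes.

import numpy as np
from scipy import ndimage

def count_parasites(image, threshold, min_area, max_area):
    """Count stained objects whose pixel area falls in a plausible size window."""
    mask = image > threshold                     # foreground = stained objects
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum((areas >= min_area) & (areas <= max_area)))

# Synthetic demo image: dim background with a dozen bright 5x5 "parasites"
# (overlapping spots may merge, so the count is approximate).
rng = np.random.default_rng(2)
img = rng.normal(0.1, 0.05, (256, 256))
for y, x in rng.integers(20, 236, size=(12, 2)):
    img[y - 2:y + 3, x - 2:x + 3] += 1.0
print("count:", count_parasites(img, threshold=0.5, min_area=9, max_area=100))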
Estimation of the oxalate content of foods and daily oxalate intake
NASA Technical Reports Server (NTRS)
Holmes, R. P.; Kennedy, M.
2000-01-01
BACKGROUND: The amount of oxalate ingested may be an important risk factor in the development of idiopathic calcium oxalate nephrolithiasis. Reliable food tables listing the oxalate content of foods are currently not available. The aim of this research was to develop an accurate and reliable method to measure the food content of oxalate. METHODS: Capillary electrophoresis (CE) and ion chromatography (IC) were compared as direct techniques for the estimation of the oxalate content of foods. Foods were thoroughly homogenized in acid, heat extracted, and clarified by centrifugation and filtration before dilution in water for analysis. Five individuals consuming self-selected diets maintained food records for three days to determine their mean daily oxalate intakes. RESULTS: Both techniques were capable of adequately measuring the oxalate in foods with a significant oxalate content. With foods of very low oxalate content (<1.8 mg/100 g), IC was more reliable than CE. The mean daily intake of oxalate by the five individuals tested was 152 +/- 83 mg, ranging from 44 to 352 mg/day. CONCLUSIONS: CE appears to be the method of choice over IC for estimating the oxalate content of foods with a medium (>10 mg/100 g) to high oxalate content due to a faster analysis time and lower running costs, whereas IC may be better suited for the analysis of foods with a low oxalate content. Accurate estimates of the oxalate content of foods should permit the role of dietary oxalate in urinary oxalate excretion and stone formation to be clarified. Other factors, apart from the amount of oxalate ingested, appear to exert a major influence over the amount of oxalate excreted in the urine.
Software Reliability Issues Concerning Large and Safety Critical Software Systems
NASA Technical Reports Server (NTRS)
Kamel, Khaled; Brown, Barbara
1996-01-01
This research was undertaken to provide NASA with a survey of state-of-the-art techniques used in industry and academia to provide safe, reliable, and maintainable software to drive large systems. Such systems must match the complexity and strict safety requirements of NASA's shuttle system. In particular, the Launch Processing System (LPS) is being considered for replacement. The LPS is responsible for monitoring and commanding the shuttle during test, repair, and launch phases. NASA built this system in the 1970's using mostly hardware techniques to provide for increased reliability, but it did so often using custom-built equipment, which has not been able to keep up with current technologies. This report surveys the major techniques used in industry and academia to ensure reliability in large and critical computer systems.
NASA Astrophysics Data System (ADS)
García Plaza, E.; Núñez López, P. J.
2018-01-01
On-line monitoring of surface finish in machining processes has proven to be a substantial advancement over traditional post-process quality control techniques by reducing inspection times and costs and by avoiding the manufacture of defective products. This study applied techniques for processing cutting force signals based on the wavelet packet transform (WPT) method for the monitoring of surface finish in computer numerical control (CNC) turning operations. The behaviour of 40 mother wavelets was analysed using three techniques: global packet analysis (G-WPT), and the application of two packet reduction criteria: maximum energy (E-WPT) and maximum entropy (SE-WPT). The optimum signal decomposition level (Lj) was determined to eliminate noise and to obtain information correlated to surface finish. The results obtained with the G-WPT method provided an in-depth analysis of cutting force signals, and frequency ranges and signal characteristics were correlated to surface finish with excellent results in the accuracy and reliability of the predictive models. The radial and tangential cutting force components at low frequency provided most of the information for the monitoring of surface finish. The E-WPT and SE-WPT packet reduction criteria substantially reduced signal processing time, but at the expense of discarding packets with relevant information, which impoverished the results. The G-WPT method was observed to be an ideal procedure for processing cutting force signals applied to the real-time monitoring of surface finish, and was estimated to be highly accurate and reliable at a low analytical-computational cost.
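Band-energy feature extraction in the spirit of the G-WPT analysis above can be sketched with PyWavelets; the wavelet choice, decomposition level, and synthetic "force" signal below are illustrative assumptions rather than the study's configuration.

import numpy as np
import pywt

def wpt_band_energies(signal, wavelet="db4", level=3):
    """Energy in each terminal packet of a WPT, ordered by frequency band."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric",
                            maxlevel=level)
    nodes = wp.get_level(level, order="freq")    # low- to high-frequency bands
    return np.array([np.sum(np.square(node.data)) for node in nodes])

# Example: a synthetic "cutting force" trace dominated by a low-frequency component.
fs = 1000
t = np.arange(0, 1, 1 / fs)
force = np.sin(2 * np.pi * 20 * t) + 0.2 * np.random.default_rng(3).standard_normal(t.size)
energies = wpt_band_energies(force)
print("dominant band:", int(np.argmax(energies)), "of", len(energies))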
Some computational techniques for estimating human operator describing functions
NASA Technical Reports Server (NTRS)
Levison, W. H.
1986-01-01
Computational procedures for improving the reliability of human operator describing functions are described. Special attention is given to the estimation of standard errors associated with mean operator gain and phase shift as computed from an ensemble of experimental trials. This analysis pertains to experiments using sum-of-sines forcing functions. Both open-loop and closed-loop measurement environments are considered.
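Describing-function estimates and their ensemble standard errors can be sketched as follows, assuming sum-of-sines forcing at known frequencies; the "operator response" here is a trivial synthetic stand-in, not a model of real tracking behavior.

import numpy as np

def describing_function(u, y, fs, forcing_freqs):
    """Gain (dB) and phase (deg) of y/u at the sum-of-sines forcing frequencies."""
    n = len(u)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    U, Y = np.fft.rfft(u), np.fft.rfft(y)
    idx = [np.argmin(np.abs(freqs - f)) for f in forcing_freqs]
    H = Y[idx] / U[idx]
    return 20 * np.log10(np.abs(H)), np.degrees(np.angle(H))

# Ensemble statistics across trials: mean gain and standard error per frequency.
rng = np.random.default_rng(4)
fs, n, ffreqs = 100, 2000, [0.5, 1.0, 2.0]
gains = []
for _ in range(10):                                   # 10 experimental trials
    t = np.arange(n) / fs
    u = sum(np.sin(2 * np.pi * f * t) for f in ffreqs)
    y = 0.8 * u + 0.05 * rng.standard_normal(n)       # stand-in operator response
    g, _ = describing_function(u, y, fs, ffreqs)
    gains.append(g)
gains = np.array(gains)
se = gains.std(axis=0, ddof=1) / np.sqrt(gains.shape[0])
print("mean gain (dB):", gains.mean(axis=0).round(2), "SE:", se.round(3))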
Dunphy, C H; Polski, J M; Evans, H L; Gardner, L J
2001-08-01
Immunophenotyping of bone marrow (BM) specimens with acute myelogenous leukemia (AML) may be performed by flow cytometric (FC) or immunohistochemical (IH) techniques. Some markers (CD34, CD15, and CD117) are available for both techniques. Myeloperoxidase (MPO) analysis may be performed by enzyme cytochemical (EC) or IH techniques. To determine the reliability of these markers and MPO by these techniques, we designed a study to compare the results of analyses of these markers and MPO by FC (CD34, CD15, and CD117), EC (MPO), and IH (CD34, CD15, CD117, and MPO) techniques. Twenty-nine AMLs formed the basis of the study. These AMLs all had been immunophenotyped previously by FC analysis; 27 also had had EC analysis performed. Of the AMLs, 29 had BM core biopsies and 26 had BM clots that could be evaluated. The paraffin blocks of the 29 BM core biopsies and 26 BM clots were stained for CD34, CD117, MPO, and CD15. These results were compared with results by FC analysis (CD34, CD15, and CD117) and EC analysis (MPO). Immunodetection of CD34 expression in AML had a similar sensitivity by FC and IH techniques. Immunodetection of CD15 and CD117 had a higher sensitivity by FC analysis than by IH analysis. Detection of MPO by IH analysis was more sensitive than by EC analysis. There was no correlation of French-American-British (FAB) subtype of AML with CD34 or CD117 expression. Expression of CD15 was associated with AMLs with a monocytic component. Myeloperoxidase reactivity by IH analysis was observed in AMLs originally FAB subtyped as M0. CD34 can be equally detected by FC and IH techniques. CD15 and CD117 are better detected by FC analysis and MPO is better detected by IH analysis.
Application of the variational-asymptotical method to composite plates
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.; Lee, Bok W.; Atilgan, Ali R.
1992-01-01
A method is developed for the 3D analysis of laminated plate deformation which is an extension of a variational-asymptotical method by Atilgan and Hodges (1991). Both methods are based on the treatment of plate deformation by splitting the 3D analysis into linear through-the-thickness analysis and 2D plate analysis. Whereas the first technique tackles transverse shear deformation in the second asymptotical approximation, the present method simplifies its treatment and restricts it to the first approximation. Both analytical techniques are applied to the linear cylindrical bending problem, and the strain and stress distributions are derived and compared with those of the exact solution. The present theory provides more accurate results than those of the classical laminated-plate theory for the transverse displacement of 2-, 3-, and 4-layer cross-ply laminated plates. The method can give reliable estimates of the in-plane strain and displacement distributions.
NASA Astrophysics Data System (ADS)
Eason, Thomas J.; Bond, Leonard J.; Lozev, Mark G.
2016-02-01
The accuracy, precision, and reliability of ultrasonic thickness structural health monitoring systems are discussed, including the influence of systematic and environmental factors. To quantify some of these factors, a compression wave ultrasonic thickness structural health monitoring experiment is conducted on a flat calibration block at ambient temperature with forty four thin-film sol-gel transducers and various time-of-flight thickness calculation methods. As an initial calibration, the voltage response signals from each sensor are used to determine the common material velocity as well as the signal offset unique to each calculation method. Next, the measurement precision of the thickness error of each method is determined with a proposed weighted censored relative maximum likelihood analysis technique incorporating the propagation of asymmetric measurement uncertainty. The results are presented as upper and lower confidence limits analogous to the a90/95 terminology used in industry recognized Probability-of-Detection assessments. Future work is proposed to apply the statistical analysis technique to quantify measurement precision of various thickness calculation methods under different environmental conditions such as high temperature, rough back-wall surface, and system degradation with an intended application to monitor naphthenic acid corrosion in oil refineries.
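A much-simplified sketch of censored maximum-likelihood estimation of measurement error follows (symmetric normal errors and a single left-censoring limit; the paper's weighted censored relative ML analysis with asymmetric uncertainty is considerably more involved).

import numpy as np
from scipy import optimize, stats

def censored_normal_mle(observed, censor_limit):
    """MLE of (mu, sigma) when values at or below censor_limit are left-censored."""
    x = np.asarray(observed, dtype=float)
    cens = x <= censor_limit
    def nll(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                 # keep sigma positive
        ll = stats.norm.logpdf(x[~cens], mu, sigma).sum()
        ll += cens.sum() * stats.norm.logcdf(censor_limit, mu, sigma)
        return -ll
    res = optimize.minimize(nll, x0=[x.mean(), np.log(x.std() + 1e-9)])
    return res.x[0], np.exp(res.x[1])

rng = np.random.default_rng(5)
errors = rng.normal(0.02, 0.05, 500)              # thickness errors, mm (invented)
limit = -0.05
clipped = np.where(errors < limit, limit, errors) # instrument floor
print(censored_normal_mle(clipped, limit))        # recovers roughly (0.02, 0.05)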
Principles, Techniques, and Applications of Tissue Microfluidics
NASA Technical Reports Server (NTRS)
Wade, Lawrence A.; Kartalov, Emil P.; Shibata, Darryl; Taylor, Clive
2011-01-01
The principle of tissue microfluidics and its resultant techniques has been applied to cell analysis. Building microfluidics to suit a particular tissue sample would allow the rapid, reliable, inexpensive, highly parallelized, selective extraction of chosen regions of tissue for purposes of further biochemical analysis. Furthermore, the applicability of the techniques ranges beyond the described pathology application. For example, they would also allow the posing and successful answering of new sets of questions in many areas of fundamental research. The proposed integration of microfluidic techniques and tissue slice samples is called "tissue microfluidics" because it molds the microfluidic architectures in accordance with each particular structure of each specific tissue sample. Thus, microfluidics can be built around the tissues, following the tissue structure, or alternatively, the microfluidics can be adapted to the specific geometry of particular tissues. By contrast, the traditional approach is that microfluidic devices are structured in accordance with engineering considerations, while the biological components in applied devices are forced to comply with these engineering presets.
NASA Technical Reports Server (NTRS)
Motyka, P.
1983-01-01
A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27 state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate, both the gyro and accelerometer failure rates together, false alarms, probability of failure detection, probability of failure isolation, and probability of damage effects and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
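A Markov reliability evaluation of this kind can be sketched with a small continuous-time model. The three-state generator below, with invented rates and far fewer states than the 27-state RSDIMU model, illustrates how detection/isolation coverage enters the transition rates and how reliability varies with mission time.

import numpy as np
from scipy.linalg import expm

# States: 0 = fully operational, 1 = one failure detected/isolated (fail-op),
# 2 = system failed. Rates per hour and coverage are purely illustrative.
lam, cov = 1e-4, 0.95                  # sensor failure rate, P(detect & isolate)
Q = np.array([
    [-2 * lam,  2 * lam * cov,  2 * lam * (1 - cov)],
    [0.0,       -lam,           lam],
    [0.0,       0.0,            0.0],  # absorbing failed state
])

p0 = np.array([1.0, 0.0, 0.0])
for hours in (10, 100, 1000):
    p = p0 @ expm(Q * hours)           # transient state probabilities
    print(f"t={hours:5d} h   reliability={1 - p[2]:.6f}")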
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, E.; Engebrecht-Metzger, C.; Horowitz, S.
As the Building America (BA) program has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, E.; Engebrecht, C. Metzger; Horowitz, S.
As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.
Ghanbari, Behzad
2014-01-01
We aim to study the convergence of the homotopy analysis method (HAM in short) for solving special nonlinear Volterra-Fredholm integrodifferential equations. The sufficient condition for the convergence of the method is briefly addressed. Some illustrative examples are also presented to demonstrate the validity and applicability of the technique. Comparison of the results obtained by HAM with the exact solution shows that the method is reliable and capable of providing analytic treatment for solving such equations.
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, the quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least cost can be realized through competent concurrent engineering teams and the brilliance of their technical leadership.
NASA Technical Reports Server (NTRS)
Anderson, B. H.
1983-01-01
A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.
Determination of Phenols and Trimethylamine in Industrial Effluents
NASA Technical Reports Server (NTRS)
Levaggi, D. A.; Feldstein, M.
1971-01-01
For regulatory purposes, to control certain odorous compounds, the analysis of phenols and trimethylamines in industrial effluents is necessary. The Bay Area Air Pollution Control District laboratory has been determining these gases by gas chromatographic techniques. The procedures for sample collection, preparation for analysis, and determination are described in detail. Typical data from various sources, showing the effect of proposed regulations, are presented. Extensive sampling and usage of these procedures has shown them to be accurate, reliable, and suitable for all types of source effluents.
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Sharma, Nayan
2017-05-01
This research paper focuses on the need for turbulence studies, instruments that can reliably capture turbulence, different turbulence parameters, and some advanced methodology that can decompose various turbulence structures at different levels near hydraulic structures. Small-scale turbulence research has valid prospects in open channel flow. The relevance of the study is amplified when a hydraulic structure introduced in the channel disturbs the natural flow and creates discontinuity. To recover this discontinuity, the piano key weir (PKW) might be used with sloped keys. Constraints of empirical results in the vicinity of the PKW necessitate extensive laboratory experiments with fair and reliable instrumentation techniques. The acoustic Doppler velocimeter was established, using principal component analysis, to be best suited within a range of limitations. Wavelet analysis is proposed to decompose the underlying turbulence structure in a better way.
Temporal discrimination threshold with healthy aging.
Ramos, Vesper Fe Marie Llaneza; Esquenazi, Alina; Villegas, Monica Anne Faye; Wu, Tianxia; Hallett, Mark
2016-07-01
The temporal discrimination threshold (TDT) is the shortest interstimulus interval at which a subject can perceive successive stimuli as separate. To investigate the effects of aging on TDT, we studied tactile TDT using the method of limits with 120% of sensory threshold in each hand for each of 100 healthy volunteers, equally divided among men and women, across 10 age groups, from 18 to 79 years. Linear regression analysis showed that age was significantly related to left-hand mean, right-hand mean, and mean of 2 hands with R-square equal to 0.08, 0.164, and 0.132, respectively. Reliability analysis indicated that the 3 measures had fair-to-good reliability (intraclass correlation coefficient: 0.4-0.8). We conclude that TDT is affected by age and has fair-to-good reproducibility using our technique. Published by Elsevier Inc.
Pneumothorax size measurements on digital chest radiographs: Intra- and inter- rater reliability.
Thelle, Andreas; Gjerdevik, Miriam; Grydeland, Thomas; Skorge, Trude D; Wentzel-Larsen, Tore; Bakke, Per S
2015-10-01
Detailed and reliable methods may be important for discussions on the importance of pneumothorax size in clinical decision-making. Rhea's method is widely used to estimate pneumothorax size in percent based on chest X-rays (CXRs) from three measurement points. Choi's addendum is used for anteroposterior projections. The aim of this study was to examine the intrarater and interrater reliability of the Rhea and Choi method using digital CXR in ward-based PACS monitors. Three physicians examined a retrospective series of 80 digital CXRs showing pneumothorax, using Rhea and Choi's method, then repeated the examination in a random order two weeks later. We used the analysis of variance technique by Eliasziw et al. to assess the intrarater and interrater reliability in altogether 480 estimations of pneumothorax size. Estimated pneumothorax sizes ranged between 5% and 100%. The intrarater reliability coefficient was 0.98 (95% one-sided lower-limit confidence interval 0.96), and the interrater reliability coefficient was 0.95 (95% one-sided lower-limit confidence interval 0.93). This study has shown that the Rhea and Choi method for calculating pneumothorax size has high intrarater and interrater reliability. These results are valid across gender, side of pneumothorax and whether the patient is diagnosed with primary or secondary pneumothorax. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
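The intrarater/interrater coefficients reported above are intraclass correlations derived from a two-way ANOVA decomposition. A minimal Python sketch of ICC(2,1) follows, with illustrative data rather than the study's; the exact variance decomposition of Eliasziw et al. differs in detail.

import numpy as np

def icc_2_1(ratings):
    """Two-way random, single-measure ICC(2,1) for a (subjects x raters) matrix."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    mean_r = Y.mean(axis=0)            # rater means
    mean_s = Y.mean(axis=1)            # subject means
    grand = Y.mean()
    ss_r = n * np.sum((mean_r - grand) ** 2)
    ss_s = k * np.sum((mean_s - grand) ** 2)
    ss_e = np.sum((Y - mean_s[:, None] - mean_r[None, :] + grand) ** 2)
    ms_r, ms_s, ms_e = ss_r / (k - 1), ss_s / (n - 1), ss_e / ((n - 1) * (k - 1))
    return (ms_s - ms_e) / (ms_s + (k - 1) * ms_e + k * (ms_r - ms_e) / n)

# Example: three raters sizing five pneumothoraces (percent), purely illustrative.
data = [[20, 22, 19], [45, 44, 47], [80, 78, 81], [10, 12, 11], [60, 61, 58]]
print(f"ICC(2,1) = {icc_2_1(data):.3f}")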
Measuring human remains in the field: Grid technique, total station, or MicroScribe?
Sládek, Vladimír; Galeta, Patrik; Sosna, Daniel
2012-09-10
Although three-dimensional (3D) coordinates for human intra-skeletal landmarks are among the most important data that anthropologists have to record in the field, little is known about the reliability of various measuring techniques. We compared the reliability of three techniques used for 3D measurement of human remains in the field: grid technique (GT), total station (TS), and MicroScribe (MS). We measured 365 field osteometric points on 12 skeletal sequences excavated at the Late Medieval/Early Modern churchyard in Všeruby, Czech Republic. We compared intra-observer, inter-observer, and inter-technique variation using mean difference (MD), mean absolute difference (MAD), standard deviation of difference (SDD), and limits of agreement (LA). All three measuring techniques can be used when accepted error ranges can be measured in centimeters. When a range of accepted error measurable in millimeters is needed, MS offers the best solution. TS can achieve the same reliability as does MS, but only when the laser beam is accurately pointed into the center of the prism. When the prism is not accurately oriented, TS produces unreliable data. TS is more sensitive to initialization than is MS. GT measures the human skeleton with acceptable reliability for general purposes but insufficiently when highly accurate skeletal data are needed. We observed high inter-technique variation, indicating that just one technique should be used when spatial data from one individual are recorded. Subadults are measured with slightly lower error than are adults. The effect of maximum excavated skeletal length has little practical significance in field recording. When MS is not available, we offer practical suggestions that can help to increase reliability when measuring the human skeleton in the field. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Volatile organic compounds: sampling methods and their worldwide profile in ambient air.
Kumar, Anuj; Víden, Ivan
2007-08-01
The atmosphere is a particularly difficult analytical system because of the very low levels of substances to be analysed, sharp variations in pollutant levels with time and location, differences in wind, temperature and humidity. This makes the selection of an efficient sampling technique for air analysis a key step to reliable results. Generally, methods for volatile organic compounds sampling include collection of the whole air or preconcentration of samples on adsorbents. All the methods vary from each other according to the sampling technique, type of sorbent, method of extraction and identification technique. In this review paper we discuss various important aspects for sampling of volatile organic compounds by the widely used and advanced sampling methods. Characteristics of various adsorbents used for VOCs sampling are also described. Furthermore, this paper makes an effort to comprehensively review the concentration levels of volatile organic compounds along with the methodology used for analysis, in major cities of the world.
Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime
2010-01-01
This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including the background analysis using sensors, limitations and advances. The performance and reliability of BLDC motor drivers have been improved because the conventional control and sensing techniques have been improved through sensorless technology. Then, in this paper sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including the analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, which includes Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. Also, the most relevant techniques based on estimation and models are briefly analysed, such as Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks.
Fuller, Joel T; Archer, Jane; Buckley, Jonathan D; Tsiros, Margarita D; Thewlis, Dominic
2016-01-01
To investigate the reliability of a simple, efficient technique for measuring bone mineral density (BMD) in the metatarsals using dual-energy X-ray absorptiometry (DXA). BMD of the right foot of 32 trained male distance runners was measured using a DXA scanner with the foot in the plantar position. Separate regions of interest (ROI) were used to assess the BMD of each metatarsal shaft (1st-5th) for each participant. ROI analysis was repeated by the same investigator to determine within-scan intra-rater reliability and by a different investigator to determine within-scan inter-rater reliability. Repeat DXA scans were undertaken for ten participants to assess between-scan intra-rater reliability. Assessment of BMD was consistently most reliable for the first metatarsal across all domains of reliability assessed (intra-class correlation coefficient [ICC] ≥0.97; coefficient of variation [CV] ≤1.5%; limits of agreement [LOA] ≤4.2%). Reasonable levels of intra-rater reliability were also achieved for the second and fifth metatarsals (ICC ≥0.90; CV ≤4.2%; LOA ≤11.9%). Poorer levels of reliability were demonstrated for the third (ICC ≥0.64; CV ≤8.2%; LOA ≤23.6%) and fourth metatarsals (ICC ≥0.67; CV ≤9.6%; LOA ≤27.5%). BMD was greatest in the first and second metatarsals (P < 0.01). Reliable measurements of BMD were achieved for the first, second and fifth metatarsals.
NASA Astrophysics Data System (ADS)
Jia, Heping; Jin, Wende; Ding, Yi; Song, Yonghua; Yu, Dezhao
2017-01-01
With the expanding proportion of renewable energy generation and development of smart grid technologies, flexible demand resources (FDRs) have been utilized as an approach to accommodating renewable energies. However, multiple uncertainties of FDRs may influence reliable and secure operation of smart grid. Multi-state reliability models for a single FDR and aggregating FDRs have been proposed in this paper with regard to responsive abilities for FDRs and random failures for both FDR devices and information system. The proposed reliability evaluation technique is based on Lz transform method which can formulate time-varying reliability indices. A modified IEEE-RTS has been utilized as an illustration of the proposed technique.
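The Lz transform pairs each component state's performance level with its time-varying probability, and independent components are aggregated by multiplying their transforms. A minimal Python sketch under an assumed two-state FDR model follows; all rates, performance levels, and the demand threshold are invented for illustration.

import numpy as np
from scipy.linalg import expm
from collections import defaultdict

# One FDR: state 0 = responsive (curtailment capacity g = 1.0 MW),
# state 1 = device/communication failure (g = 0.0). Rates per hour, illustrative.
fail, repair = 0.01, 0.2
Q = np.array([[-fail, fail], [repair, -repair]])
levels = np.array([1.0, 0.0])

def unit_lz(t, p0=np.array([1.0, 0.0])):
    """(performance, probability) pairs of one unit at time t."""
    p = p0 @ expm(Q * t)
    return list(zip(levels, p))

def combine(lz_a, lz_b):
    """Aggregate two independent units: performances add, probabilities multiply."""
    acc = defaultdict(float)
    for ga, pa in lz_a:
        for gb, pb in lz_b:
            acc[ga + gb] += pa * pb
    return sorted(acc.items())

t = 24.0
agg = combine(unit_lz(t), unit_lz(t))
demand = 1.0   # MW of curtailment required
availability = sum(p for g, p in agg if g >= demand)
print(f"P(aggregate FDR capacity >= {demand} MW at t={t} h) = {availability:.4f}")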
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, D.W.; Selvin, S.; Close, E.R.
In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g. census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease-rates are constant. On the transformed map, the statistical analysis of the observed distribution ismore » greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable ``real-world`` data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.« less
Meteor tracking via local pattern clustering in spatio-temporal domain
NASA Astrophysics Data System (ADS)
Kukal, Jaromír; Klimt, Martin; Švihlík, Jan; Fliegel, Karel
2016-09-01
Reliable meteor detection is one of the crucial disciplines in astronomy. A variety of imaging systems is used for meteor path reconstruction. The traditional approach is based on analysis of 2D image sequences obtained from a double station video observation system. Precise localization of meteor path is difficult due to atmospheric turbulence and other factors causing spatio-temporal fluctuations of the image background. The proposed technique performs non-linear preprocessing of image intensity using Box-Cox transform as recommended in our previous work. Both symmetric and asymmetric spatio-temporal differences are designed to be robust in the statistical sense. Resulting local patterns are processed by data whitening technique and obtained vectors are classified via cluster analysis and Self-Organized Map (SOM).
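A minimal sketch of the Box-Cox intensity preprocessing and symmetric temporal differencing described above follows, on synthetic frames; the actual pipeline adds robust local patterns, data whitening, and SOM clustering, none of which are reproduced here.

import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
frames = rng.gamma(shape=2.0, scale=50.0, size=(8, 64, 64))   # stand-in sky stack
frames[4, 30:34, 20:40] += 400.0                              # brief streak-like event

# Box-Cox transform stabilizes the skewed intensity distribution (input must be > 0).
flat, lam = stats.boxcox(frames.ravel() + 1.0)
tf = flat.reshape(frames.shape)

# A symmetric temporal difference highlights transients against slow background.
diff = tf[1:-1] - 0.5 * (tf[:-2] + tf[2:])
z = (diff - diff.mean()) / diff.std()
frame_idx, y, x = np.unravel_index(np.argmax(z), z.shape)
print("strongest transient at frame", frame_idx + 1, "pixel", (y, x))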
Rohman, A; Man, Yb Che; Sismindari
2009-10-01
Today, virgin coconut oil (VCO) is becoming a valuable oil and is receiving increasing attention from researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material which functions as a skin moisturizer and softener. Therefore, it is important to develop a quantitative analytical method offering a fast and reliable technique. Fourier transform infrared (FTIR) spectroscopy with the sample handling technique of attenuated total reflectance (ATR) can be successfully used to analyze VCO quantitatively in cream cosmetic preparations. A multivariate analysis using calibration of a partial least square (PLS) model revealed a good relationship between the actual value and the FTIR-predicted value of VCO, with a coefficient of determination (R2) of 0.998.
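A PLS calibration of this kind can be sketched with scikit-learn; the synthetic "spectra" below merely stand in for ATR-FTIR measurements of creams, and all dimensions and concentrations are invented.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(7)
n_samples, n_wavenumbers = 40, 200
vco = rng.uniform(0, 30, n_samples)                 # % VCO in cream, hypothetical
band = np.exp(-((np.arange(n_wavenumbers) - 80) / 15.0) ** 2)
spectra = (np.outer(vco, band)                      # VCO absorption band
           + rng.normal(0, 0.3, (n_samples, n_wavenumbers)))  # matrix + noise

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, vco, cv=5).ravel()
ss_res = np.sum((vco - pred) ** 2)
ss_tot = np.sum((vco - vco.mean()) ** 2)
print(f"cross-validated R^2 = {1 - ss_res / ss_tot:.3f}")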
Aghayev, Kamran; Vrionis, Frank D
2013-09-01
The main aim of this paper was to report a reproducible method of lumbar spine access via a lateral retroperitoneal route. The authors conducted a retrospective analysis of the technical aspects and clinical outcomes of six patients who underwent lateral multilevel retroperitoneal interbody fusion with the psoas muscle retraction technique. The main goal was to develop a simple and reproducible technique to avoid injury to the lumbar plexus. Six patients were operated at 15 levels using the psoas muscle retraction technique. All patients reported improvement in back pain and radiculopathy after the surgery. The only procedure-related transient complication was weakness and pain on hip flexion that resolved by the first follow-up visit. Psoas retraction is a reliable technique for lateral access to the lumbar spine and may avoid some of the complications related to the traditional minimally invasive transpsoas approach.
Characterizing reliability in a product/process design-assurance program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerscher, W.J. III; Booker, J.M.; Bement, T.R.
1997-10-01
Over the years many advancing techniques in the area of reliability engineering have surfaced in the military sphere of influence, and one of these techniques is Reliability Growth Testing (RGT). Private industry has reviewed RGT as part of the solution to their reliability concerns, but many practical considerations have slowed its implementation. Its objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework.
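A minimal sketch of the Bayesian updating idea described above, using a conjugate Beta-Binomial model; the stage sizes, failure counts, and the choice of a flat prior are invented for illustration and are not the paper's characterization.

import numpy as np
from scipy import stats

# Prior belief about success probability per demand (reliability R).
a, b = 1.0, 1.0                       # Beta(1, 1): large initial uncertainty

stages = [(20, 1), (50, 0), (100, 1)] # (tests, failures) folded in stage by stage
for n, f in stages:
    a += n - f                        # successes update alpha
    b += f                            # failures update beta
    lo, hi = stats.beta.ppf([0.05, 0.95], a, b)
    print(f"after {n:3d} tests: R mean={a / (a + b):.4f}, "
          f"90% interval=({lo:.4f}, {hi:.4f})")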
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Mejia, Amanda F; Nebel, Mary Beth; Barber, Anita D; Choe, Ann S; Pekar, James J; Caffo, Brian S; Lindquist, Martin A
2018-05-15
Reliability of subject-level resting-state functional connectivity (FC) is determined in part by the statistical techniques employed in its estimation. Methods that pool information across subjects to inform estimation of subject-level effects (e.g., Bayesian approaches) have been shown to enhance reliability of subject-level FC. However, fully Bayesian approaches are computationally demanding, while empirical Bayesian approaches typically rely on using repeated measures to estimate the variance components in the model. Here, we avoid the need for repeated measures by proposing a novel measurement error model for FC describing the different sources of variance and error, which we use to perform empirical Bayes shrinkage of subject-level FC towards the group average. In addition, since the traditional intra-class correlation coefficient (ICC) is inappropriate for biased estimates, we propose a new reliability measure denoted the mean squared error intra-class correlation coefficient (ICC_MSE) to properly assess the reliability of the resulting (biased) estimates. We apply the proposed techniques to test-retest resting-state fMRI data on 461 subjects from the Human Connectome Project to estimate connectivity between 100 regions identified through independent components analysis (ICA). We consider both correlation and partial correlation as the measure of FC and assess the benefit of shrinkage for each measure, as well as the effects of scan duration. We find that shrinkage estimates of subject-level FC exhibit substantially greater reliability than traditional estimates across various scan durations, even for the most reliable connections and regardless of connectivity measure. Additionally, we find partial correlation reliability to be highly sensitive to the choice of penalty term, and to be generally worse than that of full correlations except for certain connections and a narrow range of penalty values. This suggests that the penalty needs to be chosen carefully when using partial correlations. Copyright © 2018. Published by Elsevier Inc.
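A minimal sketch of empirical Bayes shrinkage toward a group mean under a simple two-variance measurement error model (a James-Stein-style weight); the paper's model is richer, and all numbers here are synthetic.

import numpy as np

def eb_shrink(subject_fc, within_var):
    """Shrink each subject's FC estimate for one edge toward the group mean.

    subject_fc : (n_subjects,) raw connectivity estimates
    within_var : assumed scan-level sampling variance of a single estimate
    """
    x = np.asarray(subject_fc, dtype=float)
    group_mean = x.mean()
    total_var = x.var(ddof=1)
    between_var = max(total_var - within_var, 0.0)   # method-of-moments split
    w = between_var / (between_var + within_var)     # 0 = all noise, 1 = all signal
    return group_mean + w * (x - group_mean)

rng = np.random.default_rng(8)
true_fc = rng.normal(0.3, 0.10, 100)                 # subject-level truth
noisy = true_fc + rng.normal(0, 0.15, 100)           # single-scan estimates
shrunk = eb_shrink(noisy, within_var=0.15 ** 2)
print("RMSE raw   :", np.sqrt(np.mean((noisy - true_fc) ** 2)).round(4))
print("RMSE shrunk:", np.sqrt(np.mean((shrunk - true_fc) ** 2)).round(4))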
Reliability of drivers in urban intersections.
Gstalter, Herbert; Fastenmeier, Wolfgang
2010-01-01
The concept of human reliability has been widely used in industrial settings by human factors experts to optimise the person-task fit. Reliability is estimated by the probability that a task will successfully be completed by personnel in a given stage of system operation. Human Reliability Analysis (HRA) is a technique used to calculate human error probabilities as the ratio of errors committed to the number of opportunities for that error. To transfer this notion to the measurement of car driver reliability the following components are necessary: a taxonomy of driving tasks, a definition of correct behaviour in each of these tasks, a list of errors as deviations from the correct actions and an adequate observation method to register errors and opportunities for these errors. Use of the SAFE-task analysis procedure recently made it possible to derive driver errors directly from the normative analysis of behavioural requirements. Driver reliability estimates could be used to compare groups of tasks (e.g. different types of intersections with their respective regulations) as well as groups of drivers' or individual drivers' aptitudes. This approach was tested in a field study with 62 drivers of different age groups. The subjects drove an instrumented car and had to complete an urban test route, the main features of which were 18 intersections representing six different driving tasks. The subjects were accompanied by two trained observers who recorded driver errors using standardized observation sheets. Results indicate that error indices often vary between both the age group of drivers and the type of driving task. The highest error indices occurred in the non-signalised intersection tasks and the roundabout, which exactly equals the corresponding ratings of task complexity from the SAFE analysis. A comparison of age groups clearly shows the disadvantage of older drivers, whose error indices in nearly all tasks are significantly higher than those of the other groups. The vast majority of these errors could be explained by high task load in the intersections, as they represent difficult tasks. The discussion shows how reliability estimates can be used in a constructive way to propose changes in car design, intersection layout and regulation as well as driver training.
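The error-probability ratio at the core of the HRA approach above is straightforward to compute per task type; the counts below are invented for illustration, not the study's data.

from collections import Counter

# Observed driver errors and opportunities per intersection task (illustrative).
errors = Counter({"signalised": 11, "non_signalised": 38, "roundabout": 24})
opportunities = Counter({"signalised": 620, "non_signalised": 372, "roundabout": 124})

for task in opportunities:
    hep = errors[task] / opportunities[task]    # human error probability
    print(f"{task:15s} HEP = {hep:.3f}   reliability = {1 - hep:.3f}")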
NASA Astrophysics Data System (ADS)
Kuriakose, Tintu; Baudet, Emeline; Halenkovič, Tomáš; Elsawy, Mahmoud M. R.; Němec, Petr; Nazabal, Virginie; Renversez, Gilles; Chauvet, Mathieu
2017-11-01
We present a reliable and original experimental technique based on the analysis of beam self-trapping to measure ultrafast optical nonlinearities in planar waveguides. The technique is applied to the characterization of Ge-Sb-Se chalcogenide films that allow Kerr-induced self-focusing and soliton formation. Linear and nonlinear optical constants of three different chalcogenide waveguides are studied at 1200 and 1550 nm in the femtosecond regime. Waveguide propagation loss and two-photon absorption coefficients are determined by transmission analysis. Beam broadening and narrowing results are compared with simulations of the nonlinear Schrödinger equation solved by the BPM method to deduce the Kerr n2 coefficients. Kerr optical nonlinearities obtained by our original technique compare favorably with the values obtained by the Z-scan technique. A nonlinear refractive index as high as (69 ± 11) × 10^-18 m^2/W is measured in Ge12.5Sb25Se62.5 at 1200 nm, with low nonlinear absorption and low propagation losses, which reveals the promise of our waveguides for ultrafast all-optical switching and integrated photonic devices.
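Beam-propagation simulations of the kind mentioned (nonlinear Schrödinger equation solved by a BPM) can be sketched with a standard 1D split-step Fourier scheme; the normalized equation and soliton launch below are generic illustrations, not the authors' parameters.

import numpy as np

# Normalized NLSE: i A_z + 0.5 A_xx + |A|^2 A = 0, solved by split-step Fourier.
nx, nz = 1024, 400
x = np.linspace(-20, 20, nx)
dz = 0.01
kx = 2 * np.pi * np.fft.fftfreq(nx, d=x[1] - x[0])
half_step = np.exp(-0.5j * kx ** 2 * dz / 2)      # linear diffraction, half step

A = 1.0 / np.cosh(x)                              # fundamental soliton profile
for _ in range(nz):
    A = np.fft.ifft(half_step * np.fft.fft(A))    # half linear step
    A *= np.exp(1j * np.abs(A) ** 2 * dz)         # full nonlinear (Kerr) step
    A = np.fft.ifft(half_step * np.fft.fft(A))    # half linear step
print("peak preserved:", np.abs(A).max())         # stays near 1.0 for a soliton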
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-01-01
Background Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to those of other real-time PCR methods. Conclusion The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP real-time PCR technique has been successfully applied in nucleic acids analysis, and the developed AUDP real-time PCR technique will offer an alternative way for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756
Kim, Hae Won; Jung, Yeon Yi; Park, Seungmi
2012-12-01
This study was conducted to evaluate the validity and reliability of the Korean version of the Sexuality Attitudes and Beliefs Survey (SABS) and to assess SABS for Korean nurses. The Korean version of SABS was developed through forward-backward translation techniques. Internal consistency reliability and construct validity using confirmatory factor analysis were conducted using PASW+ PC Win (18.0) and AMOS (18.0). Data were collected from 567 nurses who worked in one of six general hospitals across the country. The Korean version of SABS showed a reliable internal consistency with Cronbach's α of subscales ranging from .59 to .73. Factor loadings of the 10 items of three subscales ranged from .38 to .83. The three-subscale model was validated by confirmatory factor analysis (GFI>.97, RMSEA<.05). Sexuality attitudes and beliefs of Korean nurses were more negative than those of European or American nurses. The SABS scores for Korean nurses were significantly different according to age, marriage, education, clinical experiences, and feeling about sexuality. The Korean version of SABS has satisfactory construct validity and reliability to measure Korean nurses' attitudes and beliefs toward sexuality. Education is essential to enhance importance and self-efficacy and to relieve barriers to addressing patients' sexuality.
NASA Astrophysics Data System (ADS)
Stamenkovic, Dragan D.; Popovic, Vladimir M.
2015-02-01
Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master the techniques for warranty cost prediction according to the reliability characteristics of the product. In this paper a combined free-replacement and pro-rata warranty policy is analysed as a warranty model for one type of light bulb. Since operating conditions have a great impact on product reliability, they need to be considered in such analysis. A neural network model is used to predict light bulb reliability characteristics based on the data from tests of light bulbs in various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are later used in a Monte Carlo simulation for the prediction of times to failure needed for warranty cost calculation. The results of the analysis make it possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In such a way, the manufacturer can lower the costs and increase the profit.
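A minimal sketch of Monte Carlo warranty cost estimation under a combined free-replacement/pro-rata policy, with Weibull times to failure standing in for the neural-network-predicted reliability characteristics; all parameters are invented.

import numpy as np

rng = np.random.default_rng(9)
n_sims = 100_000
shape, scale = 1.8, 1500.0            # Weibull life (hours), hypothetical
W1, W2 = 500.0, 1000.0                # free-replacement period, pro-rata period end
price = 4.0                           # unit price, hypothetical

t = scale * rng.weibull(shape, n_sims)                    # simulated failure times
cost = np.where(t < W1, price,                            # full free replacement
       np.where(t < W2, price * (W2 - t) / (W2 - W1),     # linear pro-rata rebate
                0.0))                                     # out of warranty
print(f"expected warranty cost per unit: {cost.mean():.3f}")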
A Comparison of Metamodeling Techniques via Numerical Experiments
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2016-01-01
This paper presents a comparative analysis of a few metamodeling techniques using numerical experiments for the single input-single output case. These experiments enable comparing the models' predictions with the phenomenon they are aiming to describe as more data is made available. These techniques include (i) prediction intervals associated with a least squares parameter estimate, (ii) Bayesian credible intervals, (iii) Gaussian process models, and (iv) interval predictor models. Aspects being compared are computational complexity, accuracy (i.e., the degree to which the resulting prediction conforms to the actual Data Generating Mechanism), reliability (i.e., the probability that new observations will fall inside the predicted interval), sensitivity to outliers, extrapolation properties, ease of use, and asymptotic behavior. The numerical experiments describe typical application scenarios that challenge the underlying assumptions supporting most metamodeling techniques.
NASA Astrophysics Data System (ADS)
Wang, Xiaohua
The coupling resulting from the mutual influence of material thermal and mechanical parameters is examined in the thermal stress analysis of a multilayered isotropic composite cylinder subjected to sudden axisymmetric external and internal temperature changes. The method of complex frequency response functions together with the Fourier transform technique is utilized. Because the coupling parameters for some composite materials, such as carbon-carbon, are very small, the effect of coupling is neglected in the orthotropic thermal stress analysis. The stress distributions in multilayered orthotropic cylinders subjected to sudden axisymmetric temperature loading combined with dynamic pressure, as well as to asymmetric temperature loading, are also obtained. The method of Fourier series together with the Laplace transform is utilized in solving the heat conduction equation and in the thermal stress analysis. For brittle materials like carbon-carbon composites, the strength variability is represented by two- or three-parameter Weibull distributions, and the 'weakest link' principle is applied in the reliability analysis of the carbon-carbon composite cylinders. The complex frequency response analysis is performed on a multilayered orthotropic cylinder under asymmetrical thermal load. Both deterministic and random thermal stress and reliability analyses can be based on the results of this frequency response analysis. The stress and displacement distributions and reliability of rocket motors under static or dynamic line loads are analyzed by an elasticity approach. Rocket motors are modeled as long hollow multilayered cylinders with an air core, a thick isotropic propellant inner layer, and a thin orthotropic kevlar-epoxy case. The case is treated as a single orthotropic layer or as a ten-layered orthotropic structure. Five material properties and the load are treated as random variables with normal distributions when the reliability of the rocket motor is analyzed by the first-order, second-moment method (FOSM).
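For a normal capacity R and load effect S, the FOSM method mentioned above reduces to a closed-form reliability index; the sketch below shows that computation with illustrative numbers that are not taken from the rocket motor study.

    import numpy as np
    from scipy.stats import norm

    # Limit state g = R - S, with independent normal R (capacity) and S (load).
    mu_R, sd_R = 120.0, 12.0   # e.g. case strength (MPa), assumed
    mu_S, sd_S = 80.0, 15.0    # e.g. applied stress (MPa), assumed

    beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)   # FOSM reliability index
    pf = norm.cdf(-beta)                          # corresponding failure probability
    print(f"beta = {beta:.2f}, Pf = {pf:.2e}")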
Multimodality approach to classifying hand utilization for the clinical breast examination.
Laufer, Shlomi; Cohen, Elaine R; Maag, Anne-Lise D; Kwan, Calvin; Vanveen, Barry; Pugh, Carla M
2014-01-01
The clinical breast examination (CBE) is performed to detect breast pathology. However, little is known regarding clinical technique and how it relates to diagnostic accuracy. We sought to quantify breast examination search patterns and hand utilization with a new data collection and analysis system. Participants performed the CBE while the sensor mapping and video camera system collected performance data. From these data, algorithms were developed that measured the number of hands used during the exam and the active examination time. This system is a feasible and reliable method to collect new information on CBE techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harold S. Blackman; Ronald Boring; Julie L. Marble
This panel will discuss what new directions are necessary to maximize the usefulness of HRA techniques across different areas of application. HRA has long been a part of probabilistic risk assessment in the nuclear industry, where it offers a rigorous standard for risk-based decision-making. These techniques are now being adopted by other industries, including oil and gas, cybersecurity, and aviation. Each participant will present his or her ideas concerning industry needs, followed by a discussion of what research is needed and of the necessity of achieving cross-industry collaboration.
NASA Astrophysics Data System (ADS)
Murrad, Muhamad; Leong, M. Salman
Based on the experience of the Malaysian Armed Forces (MAF), failure of the main rotor gearbox (MRGB) was one of the major contributing factors to helicopter breakdowns. Even though vibration and oil analysis are effective techniques for monitoring the health of helicopter components, these two techniques were rarely combined to form an effective assessment tool in the MAF. Results of the oil analysis were often used only for the oil changing schedule, while assessments of MRGB condition were based mainly on overall vibration readings. A study group was formed and given a mandate to improve the maintenance strategy of the S61-A4 helicopter fleet in the MAF. The improvement consisted of a structured approach to the reassessment/redefinition of the maintenance actions that should be taken for the MRGB. Basic and enhanced tools for condition monitoring (CM) are investigated to address the predominant failures of the MRGB. Quantitative accelerated life testing (QALT) was considered in this work with the intent of obtaining the required reliability information in a shorter time than testing under normal stress conditions would allow. These tests, when performed correctly, can provide valuable information about MRGB performance under normal operating conditions, enabling maintenance personnel to make decisions more quickly, accurately, and economically. The time-to-failure and probability-of-failure information for the MRGB was generated by applying QALT analysis principles. This study is anticipated to bring a dramatic change in the MAF's approach to CM, yielding significant savings and various other benefits.
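The core of QALT extrapolation is a life-stress model; the sketch below uses an inverse power law to map failure times observed at an elevated stress back to use conditions. The exponent, stress levels, and failure times are invented for illustration and are not from the MRGB study.

    import numpy as np

    n = 2.5                    # assumed inverse-power-law exponent
    s_test, s_use = 1.5, 1.0   # normalized accelerated and use stress levels
    af = (s_test / s_use) ** n                     # acceleration factor
    t_test = np.array([220., 340., 410., 520.])    # failure times at s_test (h)
    t_use = t_test * af                            # equivalent use-stress lives
    print(f"acceleration factor {af:.2f}; median use-stress life {np.median(t_use):.0f} h")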
NASA Astrophysics Data System (ADS)
Boluwade, Alaba; Zhao, K.-Y.; Stadnyk, T. A.; Rasmussen, P.
2018-01-01
This study presents a three-step validation technique to compare the performance of the Canadian Precipitation Analysis (CaPA) product relative to actual observations as a hydrologic forcing in regional watershed simulation. CaPA is an interpolated (6 h or 24 h accumulation) reanalysis precipitation product in near real time covering all of North America. The analysis procedure involves point-to-point (P2P) and map-to-map (M2M) comparisons, followed by proxy validation using an operational version of the WATFLOOD™ hydrologic model from 2002 to 2005 in the Lake Winnipeg Basin (LWB), Canada. The P2P technique, using a Bayesian change point analysis, shows that CaPA corresponds with actual observations (Canadian daily climate data, CDCD) on both an annual and a seasonal basis. CaPA has the same spatial pattern, dependency, and autocorrelation properties as CDCD pixel by pixel (M2M). When used as hydrologic forcing in WATFLOOD™, results indicate that CaPA is a reliable product for water resource modeling and prediction, but that the quality of CaPA data varies annually and seasonally, as does the quality of observations. CaPA proved most beneficial as a hydrologic forcing during winter seasons, when observation quality is lowest. Reanalysis products such as CaPA can be a reliable option in sparse network areas and are beneficial for regional governments when the cost of new weather stations is prohibitive.
[A short form of the positions on nursing diagnosis scale: development and psychometric testing].
Romero-Sánchez, José Manuel; Paloma-Castro, Olga; Paramio-Cuevas, Juan Carlos; Pastor-Montero, Sonia María; O'Ferrall-González, Cristina; Gabaldón-Bravo, Eva Maria; González-Domínguez, Maria Eugenia; Castro-Yuste, Cristina; Frandsen, Anna J; Martínez-Sabater, Antonio
2013-06-01
The Positions on Nursing Diagnosis (PND) is a scale that uses the semantic differential technique to measure nurses' attitudes towards the nursing diagnosis concept. The aim of this study was to develop a shortened form of the Spanish version of this scale and evaluate its psychometric properties and efficiency. A double theoretical-empirical approach was used to obtain a short form of the PND, the PND-7-SV, which would be equivalent to the original. Using a cross-sectional survey design, the reliability (internal consistency and test-retest reliability), construct (exploratory factor analysis, known-groups technique and discriminant validity) and criterion-related validity (concurrent validity), sensitivity to change and efficiency of the PND-7-SV were assessed in a sample of 476 Spanish nursing students. The results endorsed the utility of the PND-7-SV to measure attitudes toward nursing diagnosis in an equivalent manner to the complete form of the scale and in a shorter time.
Garrel, R; Poissonnet, G; Temam, S; Dolivet, G; Fakhry, N; de Raucourt, D
2017-04-01
The reliability of the sentinel lymph node (SN) technique has been established for more than ten years in T1-T2 oral cavity and oropharynx squamous cell carcinoma. Although most authors stress the necessity of rigorous implementation, there are no agreed guidelines. Moreover, other indications have been described, in other anatomical areas of the upper aerodigestive tract and in case of previous surgery or radiotherapy. SN expert teams, under the GETTEC head and neck tumor study group, conducted a review of the key points for implementation in head and neck cancers through guidelines and a review of classical and extended indications. Reliability depends on respecting key points of preoperative landmarking by lymphoscintigraphy, and intraoperative SN sampling and histological analysis. The SN technique is the best means of diagnosing occult lymph node involvement, whatever the primary tumor location, T stage or patient history. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Ivanov, P L; Leonov, S N; Zemskova, E Iu; Kobylianskiĭ, A G; Dziubenko, E V
2013-01-01
This study was designed to estimate the effectiveness of special technical procedures for enhancing the sensitivity of multiplex DNA analysis, such as the use of low-plexity PCR systems and whole genome preamplification technology, and the possibility of their application for forensic medical genotyping of polymorphous STR loci of chromosomal DNA in individual cells. To obtain maximally informative data, the authors declined to use an imitation model (equivalent DNA dilutions) and chose instead to work with real preparations of solitary buccal epithelial cells isolated by the laser microdissection technique. It was shown that neither the use of low-plexity multilocus PCR systems nor whole genome preamplification technology makes reliable genotyping of STR loci of chromosomal DNA in individual cells possible. The proposed techniques allow DNA genotyping in preparations consisting of 10 diploid cells, whereas methods for reliable genotyping of STR loci of chromosomal DNA in individual cells remain to be developed.
[Clinical evaluation and psychological aspects of temporomandibular joint disorders].
Coessens, P; De Boever, J A
1997-01-01
Establishing the patient's clinical diagnosis depends on gathering as much information as possible about the patient and his or her signs and symptoms. This information can be gathered from the history, the physical and psychological examination, and diagnostic analysis. It is also important to look upon pain as a disorder and to consider the relationship between pain and psychological factors. The differential diagnosis is constructed through a biopsychosocial model of illness rather than through a more traditional biomedical model of disease. To arrive at a consistently accurate clinical diagnosis in patients with TMJ and craniofacial pain, the technique of clinical diagnosis must be well defined and reliable, and must include examination of the head and neck, the cranial nerves, and the stomatognathic system. The craniomandibular index provides a standardized examination of the stomatognathic system that has been tested for validity and reliability. This chapter focuses on the techniques of history taking, clinical and psychological examination, and the diagnostic criteria for temporomandibular joint disorders and muscle pain.
Evaluating reliability of WSN with sleep/wake-up interfering nodes
NASA Astrophysics Data System (ADS)
Distefano, Salvatore
2013-10-01
A wireless sensor network (WSN) (singular and plural of the acronym are spelled the same) is a distributed system composed of autonomous sensor nodes, wirelessly connected and randomly scattered over a geographical area, that cooperatively monitor physical or environmental conditions. Adequate techniques and strategies are required to manage a WSN so that it works properly, observing specific quantities and metrics to evaluate the WSN operational conditions. Among these, one of the most important is reliability. Considering a WSN as a system composed of sensor nodes, the system reliability approach can be applied, expressing the WSN reliability in terms of its nodes' reliability. More specifically, since standby power management policies are often applied at node level and interferences among nodes may arise, a WSN can be considered a dynamic system. In this article we therefore consider the WSN reliability evaluation problem from the dynamic system reliability perspective. Static-structural interactions are specified by the WSN topology. Sleep/wake-up standby policies and interferences due to wireless communications can instead be considered dynamic aspects. Thus, in order to represent and evaluate the WSN reliability, we use dynamic reliability block diagrams and Petri nets. The proposed technique makes it possible to overcome the limits of Markov models when considering non-linear discharge processes, since Markov models cannot adequately represent such aging processes. In order to demonstrate the effectiveness of the technique, we investigate some specific WSN topologies, providing guidelines for their representation and evaluation.
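As a rough, hedged companion to the dynamic reliability models above, the sketch below Monte Carlo-estimates the mission reliability of a small WSN in which a sleep/wake duty cycle slows node aging and the network is approximated as operational when at least k of n nodes survive. All rates, counts, and the k-out-of-n success criterion are illustrative simplifications, not the article's Petri net model.

    import numpy as np

    rng = np.random.default_rng(2)

    n, k = 10, 7               # nodes; network 'up' if at least k survive (assumed)
    lam = 1.0 / 8000.0         # failure rate per active hour (assumed)
    duty = 0.3                 # fraction of time awake under the sleep/wake policy
    t = 5000.0                 # mission time (h)

    trials = 100_000
    lives = rng.exponential(1.0 / (lam * duty), size=(trials, n))  # calendar-time lives
    reliability = np.mean((lives > t).sum(axis=1) >= k)
    print(f"estimated R(t = {t:.0f} h) = {reliability:.3f}")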
NASA Technical Reports Server (NTRS)
Parrish, A.; Dezafra, R. L.; Solomon, P. M.; Barrett, J. W.
1988-01-01
Recent concern over possible long term stratospheric changes caused by the introduction of man-made compounds has increased the need for instrumentation that can accurately measure stratospheric minor constituents. The technique of radio spectroscopy at millimeter wavelengths was first used to observe rotational transitions of stratospheric ozone nearly two decades ago, but has not been highly developed until recently. A ground-based observing technique is reported which employs a millimeter-wave superheterodyne receiver and multichannel filter spectrometer for measurements of stratospheric constituents that have peak volume mixing ratios of less than 10^-9, more than 3 orders of magnitude less than that for ozone. The technique is used for an extensive program of observations of stratospheric chlorine monoxide and also for observations of other stratospheric trace gases such as (O-16)3, vibrationally excited (O-16)3, (O-18)2(O-16), N2O, HO2, and HCN. In the present paper, analysis of the observing technique is given, including the method of calibration and analysis of sources of error. The technique is found to be a reliable means of observing and monitoring important stratospheric trace constituents.
A guide to onboard checkout. Volume 5: Data management
NASA Technical Reports Server (NTRS)
1971-01-01
The baseline data management subsystem for a space station is discussed. The subsystem consists of equipment necessary to transfer, store, and process data to and from users and subsystems. It acquires and conditions a wide variety of input data from experiments, vehicle subsystems sensors, uplinked ground communications, and astronaut-activated controls. Computer techniques for failure analysis, reliability, and maintenance checkout onboard the space station are considered.
Different techniques of multispectral data analysis for vegetation fraction retrieval
NASA Astrophysics Data System (ADS)
Kancheva, Rumiana; Georgiev, Georgi
2012-07-01
Vegetation monitoring is one of the most important applications of remote sensing technologies. For farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the soil-vegetation system spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from soil-crop pattern reflectance. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates, and dominant wavelength). The objective is to reveal their potential, accuracy, and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
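The first technique listed, linear spectral unmixing, has a compact numerical form: a pixel spectrum is modeled as a nonnegative mixture of endmember spectra. The sketch below solves for a two-endmember (soil/vegetation) fraction with nonnegative least squares; the reflectance values are invented for illustration.

    import numpy as np
    from scipy.optimize import nnls

    soil = np.array([0.20, 0.25, 0.30, 0.35])   # assumed soil endmember (4 bands)
    veg  = np.array([0.05, 0.08, 0.45, 0.50])   # assumed vegetation endmember
    E = np.column_stack([soil, veg])            # endmember matrix

    rng = np.random.default_rng(3)
    pixel = 0.4 * veg + 0.6 * soil + rng.normal(0, 0.005, 4)  # 40% cover + noise

    frac, _ = nnls(E, pixel)        # nonnegative abundances
    frac /= frac.sum()              # impose the sum-to-one constraint
    print(f"estimated vegetation fraction: {frac[1]:.2f}")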
Analysis of an experiment aimed at improving the reliability of transmission centre shafts.
Davis, T P
1995-01-01
Smith (1991) presents a paper proposing the use of Weibull regression models to establish the dependence of failure data (usually times) on covariates related to the design of the test specimens and test procedures. In his article Smith made the point that good experimental design is as important in reliability applications as elsewhere, and in view of the current interest in design inspired by Taguchi and others, we pay some attention in this article to that topic. A real case study from the Ford Motor Company is presented. Our main approach is to utilize suggestions in the literature for applying standard least squares techniques of experimental analysis even when nonnormal errors and censoring are likely. This approach lacks theoretical justification, but its appeal is its simplicity and flexibility. For completeness we also include some analysis based on the proportional hazards model and, in an attempt to link back to Smith (1991), look at a Weibull regression model.
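A hedged sketch of the Weibull regression alternative is given below: the scale parameter is log-linear in one design factor, and right-censored units contribute through the survival function. The data are synthetic, not the Ford case study.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(4)
    x = rng.integers(0, 2, 60).astype(float)                # one two-level design factor
    t_true = np.exp(5.0 + 0.6 * x) * rng.weibull(1.5, 60)   # true shape k = 1.5
    c = np.minimum(t_true, 400.0)                           # censor observations at 400 h
    d = (t_true <= 400.0).astype(float)                     # 1 = failure, 0 = censored

    def nll(p):                                  # negative Weibull log-likelihood
        b0, b1, logk = p
        k, sc = np.exp(logk), np.exp(b0 + b1 * x)
        z = (c / sc) ** k
        logf = np.log(k) - np.log(sc) + (k - 1) * (np.log(c) - np.log(sc)) - z
        return -(d * logf - (1 - d) * z).sum()   # censored units contribute log S = -z

    fit = minimize(nll, x0=[4.0, 0.0, 0.0], method="Nelder-Mead")
    print("intercept, factor effect, shape:", fit.x[0], fit.x[1], np.exp(fit.x[2]))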
A Wireless Fatigue Monitoring System Utilizing a Bio-Inspired Tree Ring Data Tracking Technique
Bai, Shi; Li, Xuan; Xie, Zhaohui; Zhou, Zhi; Ou, Jinping
2014-01-01
Fatigue, a hot scientific research topic for centuries, can trigger sudden failure of critical structures such as aircraft and railway systems, resulting in enormous casualties as well as economic losses. The fatigue life of certain structures is intrinsically random, and few monitoring techniques are capable of tracking full life-cycle fatigue damage. In this paper, a novel in-situ wireless real-time fatigue monitoring system using a bio-inspired tree ring data tracking technique is proposed. The general framework, methodology, and verification of this intelligent system are discussed in detail. The rain-flow counting (RFC) method is adopted as the core algorithm that quantifies fatigue damage, and Digital Signal Processing (DSP) is introduced as the core module for data collection and analysis. Laboratory test results based on strain gauges and polyvinylidene fluoride (PVDF) sensors have shown that the developed intelligent system can provide reliable, rapid feedback and early warning of fatigue failure. With the merits of low cost, high accuracy, and great reliability, the developed wireless fatigue sensing system can be further applied to mechanical engineering, civil infrastructure, transportation systems, aerospace engineering, etc. PMID:24603635
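As a hedged illustration of the damage-quantification step that follows rain-flow counting, the sketch below scores counted stress ranges against an assumed S-N curve (N = C*S^-m) and accumulates damage with Miner's rule; the S-N constants and cycle counts are invented, and the RFC extraction itself is omitted.

    import numpy as np

    C, m = 2e9, 3.0                              # assumed S-N curve constants
    ranges = np.array([80., 120., 60., 150.])    # counted stress ranges (MPa)
    counts = np.array([800,  90,  700,  25])     # rain-flow cycle counts

    N_allow = C * ranges ** (-m)                 # allowable cycles at each range
    damage = np.sum(counts / N_allow)            # Miner's linear damage rule
    print(f"accumulated damage D = {damage:.2f} (failure expected near D = 1)")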
Analysis of user characteristics related to drop-off detection with long cane
Kim, Dae Shik; Emerson, Robert Wall; Curtis, Amy
2010-01-01
This study examined how user characteristics affect drop-off detection with the long cane. A mixed-measures design with block randomization was used for the study, in which 32 visually impaired adults attempted to detect the drop-offs using different cane techniques. Younger cane users detected drop-offs significantly more reliably (mean ± standard deviation = 74.2% ± 11.2% of the time) than older cane users (60.9% ± 10.8%), p = 0.009. The drop-off detection threshold of the younger participants (5.2 ± 2.1 cm) was also statistically significantly smaller than that of the older participants (7.9 ± 2.2 cm), p = 0.007. Those with early-onset visual impairment (78.0% ± 9.0%) also detected drop-offs significantly more reliably than those with later-onset visual impairment (67.3% ± 12.4%), p = 0.01. No interaction occurred between examined user characteristics (age and age at onset of visual impairment) and the type of cane technique used in drop-off detection. The findings of the study may help orientation and mobility specialists select appropriate cane techniques in accordance with the cane user's age and onset of visual impairment. PMID:20665349
Ultrasound measurement of transcranial distance during head-down tilt
NASA Technical Reports Server (NTRS)
Torikoshi, S.; Wilson, M. H.; Ballard, R. E.; Watenpaugh, D. E.; Murthy, G.; Yost, W. T.; Cantrell, J. H.; Chang, D. S.; Hargens, A. R.
1995-01-01
Exposure to microgravity elevates blood pressure and flow in the head, which may increase intracranial volume (ICV) and intracranial pressure (ICP). Rhesus monkeys exposed to simulated microgravity in the form of 6 degree head-down tilt (HDT) experience elevated ICP. In humans, twenty-four hours of 6 degree HDT bed rest increases cerebral blood flow velocity relative to pre-HDT upright posture, and acute 6 degree HDT experiments have shown increased ICP, measured with the tympanic membrane displacement (TMD) technique. Other studies suggest that increased ICP in humans and cats causes measurable cranial bone movement across the sagittal suture. Because of the slightly compliant nature of the cranium, elevation of ICP will increase ICV and transcranial distance. Currently, several non-invasive approaches to monitoring ICP are being investigated, including TMD and modal analysis of the skull. TMD may not be reliable over a large range of ICP, and neither method is capable of measuring small changes in pressure. Ultrasound, however, may reliably measure the small distance changes that accompany ICP fluctuations. The purpose of our study was to develop and evaluate an ultrasound technique to measure transcranial distance changes during HDT.
Description of MSFC engineering photographic analysis
NASA Technical Reports Server (NTRS)
Earle, Jim; Williams, Frank
1988-01-01
Drawing on a background that includes the development of basic launch and test photographic coverage and analysis procedures, the MSFC Photographic Evaluation Group has built a body of experience that enables it to satisfy MSFC's engineering photographic analysis needs effectively. By combining the basic soundness of reliable, proven techniques of the past with newer technical advances in computers and computer-related devices, the MSFC Photo Evaluation Group is in a position to continue to provide photo and video analysis services center-wide and NASA-wide, to supply an improving photo analysis product that meets the photo evaluation needs of the future, and to set new standards in the state of the art of photo analysis of dynamic events.
Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.
2007-01-01
Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information maximization, maximization of non-gaussianity, joint diagonalization of cross-cumulant matrices, and second-order correlation based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study the variability among different ICA algorithms and propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA, and JADE all yield reliable results, each having their strengths in specific areas. EVD, an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for the iterative ICA algorithms, it is important to investigate the variability of the estimates from different runs. We test the consistency of the iterative algorithms, Infomax and FastICA, by running the algorithm a number of times with different initializations and note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis. PMID:17540281
Performance of blind source separation algorithms for fMRI analysis using a group ICA method.
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D
2007-06-01
Independent component analysis (ICA) is a popular blind source separation technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely, information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study variability among different ICA algorithms, and we propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA and joint approximate diagonalization of eigenmatrices (JADE) all yield reliable results, with each having its strengths in specific areas. Eigenvalue decomposition (EVD), an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for iterative ICA algorithms, it is important to investigate the variability of estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running the algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis.
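A hedged, toy-scale sketch of the spatial ICA setup studied above follows: two synthetic spatial sources are linearly mixed into 30 "time points" and recovered with scikit-learn's FastICA (one of the algorithms compared). The sources, mixing, and dimensions are invented stand-ins for fMRI data.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(5)
    n_vox = 500
    s1 = np.sin(np.linspace(0, 8 * np.pi, n_vox))            # spatial source 1
    s2 = np.sign(np.sin(np.linspace(0, 5 * np.pi, n_vox)))   # spatial source 2
    S = np.vstack([s1, s2])                                  # (2, n_vox)
    A = rng.normal(size=(30, 2))                             # mixing over 30 time points
    X = A @ S + rng.normal(0, 0.1, (30, n_vox))              # observed data

    ica = FastICA(n_components=2, random_state=0)
    est = ica.fit_transform(X.T).T                           # estimated spatial maps

    corr = np.abs(np.corrcoef(np.vstack([est, S]))[:2, 2:])  # match estimates to sources
    print("best-match correlations:", corr.max(axis=1).round(2))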
Design and testing of the Space Station Freedom Propellant Tank Assembly
NASA Technical Reports Server (NTRS)
Dudley, D. D.; Thonet, T. A.; Goforth, A. M.
1992-01-01
Propellant storage and management functions for the Propulsion Module of the U.S. Space Station Freedom are provided by the Propellant Tank Assembly (PTA). The PTA consists of a surface-tension type propellant acquisition device contained within a welded titanium pressure vessel. The PTA design concept was selected with high reliability and low program risk as primary goals in order to meet stringent NASA structural, expulsion, fracture control and reliability requirements. The PTA design makes use of Shuttle Orbital Maneuvering System and Peacekeeper Propellant Storage Assembly design and analysis techniques. This paper summarizes the PTA design solution and discusses the underlying detailed analyses. In addition, design verification and qualification test activities are discussed.
Software Development Processes Applied to Computational Icing Simulation
NASA Technical Reports Server (NTRS)
Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.
1999-01-01
The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.
Aslam, Tariq Mehmood; Shakir, Savana; Wong, James; Au, Leon; Ashworth, Jane
2012-12-01
Mucopolysaccharidoses (MPS) can cause corneal opacification that is currently difficult to objectively quantify. With newer treatments for MPS comes an increased need for a more objective, valid and reliable index of disease severity for clinical and research use. Clinical evaluation by slit lamp is very subjective and techniques based on colour photography are difficult to standardise. In this article the authors present evidence for the utility of dedicated image analysis algorithms applied to images obtained by a highly sophisticated iris recognition camera that is small, manoeuvrable and adapted to achieve rapid, reliable and standardised objective imaging in a wide variety of patients while minimising artefactual interference in image quality.
NASA Technical Reports Server (NTRS)
Hunt, W. D.; Brennan, K. F.; Summers, C. J.; Yun, Ilgu
1994-01-01
Reliability modeling and parametric yield prediction of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiodes (APDs), which are of interest as an ultra-low-noise image capture mechanism for high definition systems, have been investigated. First, the effect of various doping methods on the reliability of GaAs/AlGaAs MQW APD structures fabricated by molecular beam epitaxy is investigated. Reliability is examined by accelerated life tests, monitoring dark current and breakdown voltage. Median device lifetime and the activation energy of the degradation mechanism are computed for undoped, doped-barrier, and doped-well APD structures. Lifetimes for each device structure are examined via a statistically designed experiment. Analysis of variance shows that dark current is affected primarily by device diameter, temperature, and stressing time, and that breakdown voltage depends on diameter, stressing time, and APD type. It is concluded that the undoped APD has the highest reliability, followed by the doped-well and doped-barrier devices, respectively. To determine the source of the degradation mechanism for each device structure, failure analysis using the electron-beam induced current method is performed. This analysis reveals some degree of device degradation caused by ionic impurities in the passivation layer, and energy-dispersive spectrometry subsequently verified the presence of ionic sodium as the primary contaminant. However, since all device structures are similarly passivated, sodium contamination alone does not account for the observed variation between the differently doped APDs. This effect is explained by dopant migration during stressing, which is verified by free carrier concentration measurements using the capacitance-voltage technique.
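The activation energy reported from such accelerated life tests is typically extracted from an Arrhenius fit; a minimal sketch with two stress temperatures follows, using invented lifetimes rather than the paper's measurements.

    import numpy as np

    kB = 8.617e-5                 # Boltzmann constant (eV/K)
    T1, T2 = 398.0, 448.0         # stress temperatures (K), assumed
    t1, t2 = 5000.0, 800.0        # median lifetimes at T1, T2 (h), assumed

    # t = A * exp(Ea / (kB * T))  =>  Ea = kB * ln(t1/t2) / (1/T1 - 1/T2)
    Ea = kB * np.log(t1 / t2) / (1.0 / T1 - 1.0 / T2)
    print(f"activation energy = {Ea:.2f} eV")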
Validation of Medical Tourism Service Quality Questionnaire (MTSQQ) for Iranian Hospitals.
Qolipour, Mohammad; Torabipour, Amin; Khiavi, Farzad Faraji; Malehi, Amal Saki
2017-03-01
Assessing service quality is one of the basic requirements for developing the medical tourism industry, yet there has been no valid and reliable tool to measure the service quality of medical tourism. This study aimed to determine the reliability and validity of a Persian version of the medical tourism service quality questionnaire for Iranian hospitals. To validate the medical tourism service quality questionnaire (MTSQQ), a cross-sectional study was conducted on 250 Iraqi patients referred to hospitals in Ahvaz (Iran) in 2015. To design the questionnaire and determine its content validity, the Delphi technique (3 rounds) was used with the participation of 20 medical tourism experts. Construct validity of the questionnaire was assessed through exploratory and confirmatory factor analysis. Reliability was assessed using Cronbach's alpha coefficient. Data were analyzed with Excel 2007, SPSS version 18, and Lisrel 8.0 software. The content validity of the questionnaire (CVI = 0.775) was confirmed. According to the exploratory factor analysis, the MTSQQ included 31 items and 8 dimensions (tangibility, reliability, responsiveness, assurance, empathy, exchange and travel facilities, technical and infrastructure facilities, and safety and security). Construct validity of the questionnaire was confirmed based on the goodness-of-fit quantities of the model (RMSEA = 0.032, CFI = 0.98, GFI = 0.88). Cronbach's alpha coefficient was 0.837 for the expectation questionnaire and 0.919 for the perception questionnaire. The results of the study showed that the medical tourism SERVQUAL questionnaire, with 31 items and 8 dimensions, is a valid and reliable tool to measure the service quality of medical tourism in Iranian hospitals.
Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment
Imai, Kazuhiro
2015-01-01
Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. Over the past 40 years, FEA has been applied to investigating the structural behavior of human bones. As faster computers became available, improved FEA using 3-dimensional computed tomography (CT) data was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA evaluating its accuracy and reliability in human bones, and clinical application studies assessing fracture risk and the effects of osteoporosis medication are surveyed. PMID:26309819
Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.
2009-01-01
Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575
Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D
2008-01-01
Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings.
NASA Astrophysics Data System (ADS)
Cherry, M.; Dierken, J.; Boehnlein, T.; Pilchak, A.; Sathish, S.; Grandhi, R.
2018-01-01
A new technique for performing quantitative scanning acoustic microscopy imaging of Rayleigh surface wave (RSW) velocity was developed based on b-scan processing. In this technique, the focused acoustic beam is moved through many defocus distances over the sample and excited with an impulse excitation, and advanced algorithms based on frequency filtering and the Hilbert transform are used to post-process the b-scans to estimate the Rayleigh surface wave velocity. The new method was used to estimate the RSW velocity on an optically flat E6 glass sample; the velocity was measured to within ±2 m/s and the scanning time per point was on the order of 1.0 s, both improvements over the previous two-point defocus method. The new method was also applied to the analysis of two titanium samples, and the velocity was estimated with very low standard deviation in certain large grains on the sample. A new behavior was observed with the b-scan analysis technique, in which the amplitude of the surface wave decayed dramatically for certain crystallographic orientations. The new technique was also compared with previous results and has been found to be much more reliable and to have higher contrast than previously possible with impulse excitation.
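One building block of such b-scan post-processing is envelope extraction via the Hilbert transform, used to pick echo arrival times from which a wave speed can be estimated. The sketch below applies scipy's analytic-signal routine to a synthetic two-echo A-scan; the waveform parameters are illustrative assumptions.

    import numpy as np
    from scipy.signal import hilbert

    fs = 100e6                                   # sample rate (Hz)
    t = np.arange(0, 4e-6, 1 / fs)

    def echo(t0):                                # Gaussian-windowed 20 MHz tone burst
        return np.exp(-((t - t0) / 50e-9) ** 2) * np.sin(2 * np.pi * 20e6 * t)

    a_scan = echo(1.0e-6) + 0.6 * echo(2.2e-6)   # two synthetic echoes

    env = np.abs(hilbert(a_scan))                # analytic-signal envelope
    print(f"strongest echo arrives at {t[np.argmax(env)] * 1e6:.2f} microseconds")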
NASA Astrophysics Data System (ADS)
Ravishankar, Bharani
Conventional space vehicles have thermal protection systems (TPS) that provide protection to an underlying structure that carries the flight loads. In an attempt to save weight, there is interest in an integrated TPS (ITPS) that combines the structural function and the TPS function. This has weight-saving potential, but it complicates the design of the ITPS, which now has both thermal and structural failure modes. The main objective of this dissertation was to optimally design the ITPS subjected to thermal and mechanical loads through deterministic and reliability-based optimization. The optimization of the ITPS structure requires computationally expensive finite element analyses of a 3D ITPS (solid) model. To reduce the computational expense involved in the structural analysis, a finite element based homogenization method was employed, homogenizing the 3D ITPS model to a 2D orthotropic plate. However, it was found that homogenization is applicable only to panels that are much larger than the characteristic dimensions of the repeating unit cell in the ITPS panel, so a single unit cell was used in the optimization process to reduce the computational cost. Deterministic and probabilistic optimization of the ITPS panel required evaluation of failure constraints at various design points. This in turn demands computationally expensive finite element analyses, which were replaced by efficient, low-fidelity surrogate models. In an optimization process, it is important to represent the constraints accurately in order to find the optimum design. Instead of building global surrogate models from a large number of designs, the computational resources were directed toward target regions near the constraint boundaries for accurate representation of the constraints using adaptive sampling strategies. Efficient Global Reliability Analysis (EGRA) facilitates sequential sampling of design points around the region of interest in the design space, and it was applied to the response surface construction of the failure constraints in the deterministic and reliability-based optimization of the ITPS panel. It was shown that with adaptive sampling the number of designs required to find the optimum was reduced drastically while the accuracy was improved. The system reliability of the ITPS was estimated using a Monte Carlo simulation (MCS) based method; a separable Monte Carlo method was employed, which allows separable sampling of the random variables to predict the probability of failure accurately. The reliability analysis considered uncertainties in the geometry, material properties, and loading conditions of the panel, as well as error in the finite element modeling. These uncertainties further increased the computational cost of the MCS techniques, which was also reduced by employing surrogate models. To estimate the error in the probability-of-failure estimate, a bootstrapping method was applied. This research work thus demonstrates optimization of the ITPS composite panel with multiple failure modes and a large number of uncertainties using adaptive sampling techniques.
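A much-simplified, hedged sketch of the Monte Carlo failure-probability step with a bootstrap error estimate follows. The limit state and distributions are generic stand-ins, and the separable-sampling refinement used in the dissertation is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(6)

    N = 50_000
    R = rng.normal(1200.0, 90.0, N)     # capacity sample (assumed distribution)
    S = rng.lognormal(6.8, 0.25, N)     # demand sample (assumed distribution)
    fail = (R < S).astype(float)        # failure when limit state g = R - S < 0
    pf = fail.mean()

    # bootstrap standard error of the Pf estimate
    boots = np.array([rng.choice(fail, N, replace=True).mean() for _ in range(100)])
    print(f"Pf = {pf:.3f} +/- {boots.std():.4f} (bootstrap s.e.)")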
Modeling and simulation of reliability of unmanned intelligent vehicles
NASA Astrophysics Data System (ADS)
Singh, Harpreet; Dixit, Arati M.; Mustapha, Adam; Singh, Kuldip; Aggarwal, K. K.; Gerhart, Grant R.
2008-04-01
Unmanned ground vehicles have a large number of scientific, military, and commercial applications. A convoy of such vehicles can exhibit collaboration and coordination. For the movement of such a convoy, it is important to predict the reliability of the system. A number of approaches are available in the literature that describe techniques for determining system reliability. Graph theoretic approaches are popular for determining terminal reliability and system reliability. In this paper we propose to exploit fuzzy and neuro-fuzzy approaches for predicting the node and branch reliabilities of the system, while Boolean algebra approaches are used to determine terminal reliability and system reliability. Hence a combination of intelligent approaches (fuzzy, neuro-fuzzy, and Boolean) is used to predict the overall system reliability of a convoy of vehicles. The node reliabilities may correspond to the collaboration of vehicles, while the branch reliabilities determine the terminal reliabilities between different nodes. An algorithm is proposed for determining the system reliabilities of a convoy of vehicles, and simulation of the overall system is proposed. Such simulation should help the commander take appropriate action depending on the predicted reliability in different terrain and environmental conditions. It is hoped that the results of this paper will lead to further techniques for maintaining a reliable convoy of vehicles in a battlefield.
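The Boolean-algebra step for terminal reliability can be illustrated by exhaustive enumeration of link states on a toy two-path convoy network; the link reliabilities below are assumed numbers standing in for the fuzzy/neuro-fuzzy predictions.

    import itertools
    import numpy as np

    links = {("s", "a"): 0.95, ("a", "t"): 0.90,   # path 1: s-a-t
             ("s", "b"): 0.92, ("b", "t"): 0.93}   # path 2: s-b-t
    names = list(links)

    def connected(state):            # is t reachable from s over working links?
        adj = {}
        for (u, v), ok in zip(names, state):
            if ok:
                adj.setdefault(u, []).append(v)
                adj.setdefault(v, []).append(u)
        seen, stack = {"s"}, ["s"]
        while stack:
            for w in adj.get(stack.pop(), []):
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return "t" in seen

    R = 0.0
    for state in itertools.product([0, 1], repeat=len(names)):
        p = np.prod([links[n] if s else 1 - links[n] for n, s in zip(names, state)])
        if connected(state):
            R += p                   # sum probability over all 'up' states
    print(f"terminal reliability s -> t = {R:.4f}")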
Probabilistic finite elements for fatigue and fracture analysis
NASA Astrophysics Data System (ADS)
Belytschko, Ted; Liu, Wing Kam
1993-04-01
An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.
King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick
2007-11-01
The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (≥.94) existed between the experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time to analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and that it significantly reduced the time cost associated with these analyses.
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1993-01-01
An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.
Kopecká, J; Němec, M; Matoulková, D
2016-06-01
Brewing yeasts are classified into two species: Saccharomyces pastorianus and Saccharomyces cerevisiae. Most brewing yeast strains are natural interspecies hybrids, typically polyploid, and their identification is thus often difficult, giving heterogeneous results depending on the method used. We performed genetic characterization of a set of brewing yeast strains from several yeast culture collections by a combination of various DNA-based techniques. The aim of this study was to select a method for species-specific identification of yeast and for discrimination of yeast strains according to their technological classification. A group of 40 yeast strains was characterized using PCR-RFLP analysis of the ITS-5·8S, NTS, HIS4 and COX2 genes, multiplex PCR, RAPD-PCR of genomic DNA, mtDNA-RFLP, and electrophoretic karyotyping. Reliable differentiation of yeast to the species level was achieved by PCR-RFLP of the HIS4 gene. Numerical analysis of the obtained RAPD fingerprints and karyotypes revealed species-specific clustering corresponding with the technological classification of the strains. The taxonomic position and partial hybrid nature of the strains were verified by multiplex PCR. Differentiation among species using PCR-RFLP of the ITS-5·8S and NTS regions was shown to be unreliable, and karyotyping and RFLP of mitochondrial DNA evinced small inaccuracies in strain categorization. PCR-RFLP of the HIS4 gene and RAPD-PCR of genomic DNA are reliable and suitable methods for fast identification of yeast strains. RAPD-PCR with primer 21 is a fast and reliable method applicable to the differentiation of brewing yeasts, with only 35% similarity of fingerprint profile between the two main technological groups (ale and lager) of brewing strains. It was proved that the PCR-RFLP method of the HIS4 gene enables precise discrimination among three technologically important Saccharomyces species, and differentiation of brewing yeast to the strain level can be achieved using the RAPD-PCR technique. © 2016 The Society for Applied Microbiology.
Designing a reliable leak bio-detection system for natural gas pipelines.
Batzias, F A; Siontorou, C G; Spanidis, P-M P
2011-02-15
Monitoring of natural gas (NG) pipelines is an important task for economical and safe operation, loss prevention, and environmental protection. Timely and reliable leak detection for gas pipelines therefore plays a key role in the overall integrity management of the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, research on new detector systems is still thriving. Biosensors are widely considered a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed/developed and rationally placed/networked/maintained with the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge-based approach and relies on fuzzy multicriteria analysis (FMCA) for selecting the biosensor design that best suits both the target analyte and the operational micro-environment. This approach is illustrated in the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.
Hafer, Jocelyn F; Boyer, Katherine A
2017-01-01
Coordination variability (CV) quantifies the variety of movement patterns an individual uses during a task and may provide a measure of the flexibility of that individual's motor system. While segment CV is growing in popularity as a marker of motor system health or adaptability, it is not known how many strides of data are needed to calculate CV reliably. This study aimed to determine the number of strides needed to reliably calculate CV in treadmill walking and running, and to compare CV between walking and running in a healthy population. Ten healthy young adults walked and ran at preferred speeds on a treadmill, and a modified vector coding technique was used to calculate CV for the following segment couples: pelvis frontal plane vs. thigh frontal plane, thigh sagittal plane vs. shank sagittal plane, thigh sagittal plane vs. shank transverse plane, and shank transverse plane vs. rearfoot frontal plane. CV for each coupling of interest was calculated for 2-15 strides for each participant and gait type. Mean CV was calculated across the entire gait cycle and, separately, for 4 phases of the gait cycle. For running and walking, 8 and 10 strides, respectively, were sufficient to obtain a reliable CV estimate. CV was significantly different between walking and running for the thigh vs. shank couple comparisons. These results suggest that 10 strides of treadmill data are needed to reliably calculate CV for walking and running. Additionally, the differences in CV between walking and running suggest that the role of knee (i.e., inter-thigh-shank) control may differ between these forms of locomotion. Copyright © 2016 Elsevier B.V. All rights reserved.
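A hedged sketch of the vector coding computation follows: the coupling angle is the direction of the frame-to-frame change vector between two segment angle series, and CV is its circular standard deviation across strides. The synthetic thigh/shank angles and stride counts below are illustrative, not the study's data.

    import numpy as np

    rng = np.random.default_rng(7)
    n_strides, n_pts = 10, 101
    base = np.linspace(0, 2 * np.pi, n_pts)
    thigh = np.sin(base) + rng.normal(0, 0.02, (n_strides, n_pts))   # segment angles
    shank = np.cos(base) + rng.normal(0, 0.02, (n_strides, n_pts))

    # coupling angle: direction of the (thigh, shank) change vector, per frame
    gamma = np.arctan2(np.diff(shank, axis=1), np.diff(thigh, axis=1))

    # circular SD of the coupling angle across strides, at each point of the gait cycle
    r = np.hypot(np.cos(gamma).mean(axis=0), np.sin(gamma).mean(axis=0))
    cv = np.degrees(np.sqrt(2 * (1 - r)))
    print(f"mean coordination variability: {cv.mean():.1f} deg")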
Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki
2014-01-01
Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially in deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability in different feeding arteries. Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. © 2014 The Authors. Journal of Neuroimaging published by the American Society of Neuroimaging.
Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki
2014-01-01
BACKGROUND AND PURPOSE: Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. METHODS: Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. RESULTS: The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially in deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability in different feeding arteries. CONCLUSIONS: QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. PMID:25370338
Electrical termination techniques
NASA Technical Reports Server (NTRS)
Oakey, W. E.; Schleicher, R. R.
1976-01-01
A technical review of high reliability electrical terminations for electronic equipment was made. Seven techniques were selected from this review for further investigation, experimental work, and preliminary testing. From the preliminary test results, four techniques were selected for final testing and evaluation. These four were: (1) induction soldering, (2) wire wrap, (3) percussive arc welding, and (4) resistance welding. Of these four, induction soldering was selected as the best technique in terms of minimizing operator errors, controlling temperature and time, minimizing joint contamination, and ultimately producing a reliable, uniform, and reusable electrical termination.
González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio
2015-03-01
A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
FAST: a framework for simulation and analysis of large-scale protein-silicon biosensor circuits.
Gu, Ming; Chakrabartty, Shantanu
2013-08-01
This paper presents a computer aided design (CAD) framework for verification and reliability analysis of protein-silicon hybrid circuits used in biosensors. It is envisioned that similar to integrated circuit (IC) CAD design tools, the proposed framework will be useful for system level optimization of biosensors and for discovery of new sensing modalities without resorting to laborious fabrication and experimental procedures. The framework referred to as FAST analyzes protein-based circuits by solving inverse problems involving stochastic functional elements that admit non-linear relationships between different circuit variables. In this regard, FAST uses a factor-graph netlist as a user interface and solving the inverse problem entails passing messages/signals between the internal nodes of the netlist. Stochastic analysis techniques like density evolution are used to understand the dynamics of the circuit and estimate the reliability of the solution. As an example, we present a complete design flow using FAST for synthesis, analysis and verification of our previously reported conductometric immunoassay that uses antibody-based circuits to implement forward error-correction (FEC).
Investigation of spectral analysis techniques for randomly sampled velocimetry data
NASA Technical Reports Server (NTRS)
Sree, Dave
1993-01-01
It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, on how high in frequency an accurate estimate can be made for a given mean sampling rate. These high frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low frequency estimates; the prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable prefiltering technique. But this increased bandwidth comes at the cost of the lower frequency estimates. The studies further showed that large data sets of the order of 100,000 points or more, high data rates, and Poisson sampling are crucial for obtaining reliable spectral estimates from randomly sampled data, such as LV data. Some of the results of the current study are presented.
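To make the slotting idea concrete, here is a minimal sketch assuming a Poisson-sampled toy signal: products of sample pairs are accumulated into lag bins ("slots") to estimate the autocorrelation, which is then Fourier transformed into a crude spectrum. The function name, slot width, and test signal are illustrative assumptions, not details from the report.

```python
import numpy as np

def slotted_autocorrelation(t, u, dt, max_lag):
    """Slotted autocorrelation estimate for an unevenly sampled signal u(t)."""
    u = u - u.mean()                       # work with fluctuations
    n_slots = int(max_lag / dt)
    acc = np.zeros(n_slots)                # sum of pair products per lag slot
    cnt = np.zeros(n_slots)                # number of pairs per lag slot
    for i in range(len(t) - 1):
        lags = t[i + 1:] - t[i]            # positive lags to all later samples
        k = (lags / dt).astype(int)        # slot index for each pair
        ok = k < n_slots
        np.add.at(acc, k[ok], u[i] * u[i + 1:][ok])
        np.add.at(cnt, k[ok], 1)
    cnt[cnt == 0] = 1                      # avoid dividing by empty slots
    return acc / cnt

# Poisson-sampled toy signal: a 100 Hz sine observed at ~1 kHz mean rate
rng = np.random.default_rng(0)
t = np.cumsum(rng.exponential(1e-3, 5000))
u = np.sin(2 * np.pi * 100.0 * t) + 0.1 * rng.standard_normal(t.size)
R = slotted_autocorrelation(t, u, dt=2e-4, max_lag=0.05)
spectrum = np.abs(np.fft.rfft(R))          # crude spectral estimate from R(tau)
print(int(np.argmax(spectrum)))            # peak bin corresponds to ~100 Hz
```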
Acar, Nihat; Karakasli, Ahmet; Karaarslan, Ahmet; Mas, Nermin Ng; Hapa, Onur
2017-01-01
Volumetric measurements of benign tumors enable surgeons to trace volume changes during follow-up periods. For a volumetric measurement technique to be applicable, it should be easy, rapid, and inexpensive and should carry a high interobserver reliability. We aimed to assess the interobserver reliability of a volumetric measurement technique using Cavalieri's principle of stereological methods. The computerized tomography (CT) scans of 15 patients with a histopathologically confirmed diagnosis of enchondroma, with varying tumor sizes and localizations, were retrospectively reviewed for interobserver reliability evaluation of the volumetric stereological measurement with Cavalieri's principle, V = t × [(SU × d)/SL]² × ΣP. The volumes of the 15 tumors collected by the observers are demonstrated in Table 1. There was no statistically significant difference between the first and second observers (p = 0.000 and intraclass correlation coefficient = 0.970) or between the first and third observers (p = 0.000 and intraclass correlation coefficient = 0.981). No statistically significant difference was detected between the second and third observers (p = 0.000 and intraclass correlation coefficient = 0.976). Cavalieri's principle with the stereological technique using CT scans is an easy, rapid, and inexpensive technique for the volumetric evaluation of enchondromas, with trustworthy interobserver reliability.
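A hedged worked example of the quoted estimator follows. The symbol readings in the comments (slice thickness, scale bar, grid spacing, point counts) follow common stereology practice and are our assumptions, since the abstract does not define them; all numbers are invented.

```python
# Worked example of the Cavalieri volume estimator
# V = t * ((SU * d) / SL)**2 * sum(P), with assumed symbol meanings:
# t  - CT slice thickness (cm)
# SU - physical length represented by the on-image scale bar (cm)
# SL - measured length of that scale bar in the image (grid units)
# d  - spacing of the counting grid (grid units)
# P  - grid points falling on the tumor in each slice
t = 0.3                                # cm between CT slices (illustrative)
SU, SL, d = 5.0, 50.0, 2.0             # 5 cm scale bar drawn over 50 units; grid step 2
points_per_slice = [4, 9, 14, 12, 6]   # hypothetical counts over five slices

area_per_point = ((SU * d) / SL) ** 2  # cm^2 represented by one grid point
volume = t * area_per_point * sum(points_per_slice)
print(f"estimated volume: {volume:.2f} cm^3")  # 0.3 * 0.04 * 45 = 0.54 cm^3
```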
Erickson, Heidi S
2012-09-28
The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.
Analytical Characterization of Erythritol Tetranitrate, an Improvised Explosive.
Matyáš, Robert; Lyčka, Antonín; Jirásko, Robert; Jakový, Zdeněk; Maixner, Jaroslav; Mišková, Linda; Künzel, Martin
2016-05-01
Erythritol tetranitrate (ETN), an ester of nitric acid and erythritol, is a solid crystalline explosive with high explosive performance. Although it has never been used in any industrial or military application, it has become one of the most frequently prepared and misused improvised explosives. In this study, several analytical techniques were explored to facilitate analysis in forensic laboratories. FTIR and Raman spectrometry measurements expand existing data and bring a more detailed assignment of bands through the parallel study of erythritol [¹⁵N₄]tetranitrate. In the case of powder diffraction, recently published data were verified, and ¹H, ¹³C, and ¹⁵N NMR spectra are discussed in detail. The technique of electrospray ionization tandem mass spectrometry was successfully used for the analysis of ETN. The described methods allow fast, versatile, and reliable detection or analysis of samples containing erythritol tetranitrate in forensic laboratories. © 2016 American Academy of Forensic Sciences.
Gilabert-Perramon, Antoni; Torrent-Farnell, Josep; Catalan, Arancha; Prat, Alba; Fontanet, Manel; Puig-Peiró, Ruth; Merino-Montero, Sandra; Khoury, Hanane; Goetghebeur, Mireille M; Badia, Xavier
2017-01-01
The aim of this study was to adapt and assess the value of a Multi-Criteria Decision Analysis (MCDA) framework (EVIDEM) for the evaluation of Orphan drugs in Catalonia (Catalan Health Service). The standard evaluation and decision-making procedures of CatSalut were compared with the EVIDEM methodology and contents. The EVIDEM framework was adapted to the Catalan context, focusing on the evaluation of Orphan drugs (PASFTAC program), during a Workshop with sixteen PASFTAC members. The criteria weighting was done using two different techniques (nonhierarchical and hierarchical). Reliability was assessed by re-test. The EVIDEM framework and methodology were found to be useful and feasible for Orphan drug evaluation and decision making in Catalonia. All the criteria considered for the development of the CatSalut Technical Reports and decision making were considered in the framework. Nevertheless, the framework could improve the reporting of some of these criteria (e.g., "unmet needs" or "nonmedical costs"). Some contextual criteria were removed (e.g., "mandate and scope of healthcare system", "environmental impact") or adapted ("population priorities and access") for CatSalut purposes. Independently of the weighting technique considered, the most important evaluation criteria identified for orphan drugs were "disease severity", "unmet needs" and "comparative effectiveness", while the "size of the population" had the lowest relevance for decision making. Test-retest analysis showed weight consistency among techniques, supporting reliability over time. MCDA (the EVIDEM framework) could be a useful tool to complement the current evaluation methods of CatSalut, contributing to standardization and pragmatism, providing a method to tackle ethical dilemmas and facilitating discussions related to decision making.
de Peinder, P; Vredenbregt, M J; Visser, T; de Kaste, D
2008-08-05
Research has been carried out on the feasibility of near infrared (NIR) and Raman spectroscopy as rapid screening methods to discriminate between genuine and counterfeit versions of the cholesterol-lowering medicine Lipitor. Classification, based on partial least squares discriminant analysis (PLS-DA) models, appears to be successful for both spectroscopic techniques, irrespective of whether atorvastatin or lovastatin has been used as the active pharmaceutical ingredient (API). The discriminative power of the NIR model, in particular, largely relies on the spectral differences of the tablet matrix. This is due to the relatively large sample volume that is probed with NIR and the strong spectroscopic activity of the excipients. PLS-DA models based on NIR or Raman spectra can also be applied to distinguish between atorvastatin and lovastatin as the API used in the counterfeits tested in this study. A disadvantage of Raman microscopy for this type of analysis is that it is primarily a surface technique. As a consequence, spectra of the coating and the tablet core might differ. Besides, spectra may change with the position of the laser if the sample is inhomogeneous. However, the robustness of the PLS-DA models turned out to be sufficiently large to allow a reliable discrimination. Principal component analysis (PCA) of the spectra revealed that the conditions at which the tablets had been stored affect the NIR data. This effect is attributed to the adsorption of water from the atmosphere after unpacking from the blister. It implies that storage conditions should be taken into account when the NIR technique is used for discriminative purposes. However, in this study both the models based on NIR spectra and those based on Raman data enabled reliable discrimination between genuine and counterfeit Lipitor tablets, regardless of their storage conditions.
Rapid determination of sugar level in snack products using infrared spectroscopy.
Wang, Ting; Rodriguez-Saona, Luis E
2012-08-01
Real-time spectroscopic methods can provide a valuable window into food manufacturing to permit optimization of production rate, quality and safety. There is a need for cutting-edge sensor technology directed at improving the efficiency, throughput and reliability of critical processes. The aim of the research was to evaluate the feasibility of infrared systems combined with chemometric analysis to develop rapid methods for the determination of sugars in cereal products. Samples were ground and spectra were collected using a mid-infrared (MIR) spectrometer equipped with a triple-bounce ZnSe MIRacle attenuated total reflectance accessory or a Fourier transform near infrared (NIR) system equipped with a diffuse reflection-integrating sphere. Sugar contents were determined using a reference HPLC method. Partial least squares regression (PLSR) was used to create cross-validated calibration models. The predictability of the models was evaluated on an independent set of samples and compared with reference techniques. MIR and NIR spectra showed characteristic absorption bands for sugars, and generated excellent PLSR models (sucrose: SEP < 1.7% and r > 0.96). Multivariate models accurately and precisely predicted sugar levels in snacks, allowing for rapid analysis. This simple technique allows for reliable prediction of quality parameters, and automation enables food manufacturers to take early corrective actions that will ultimately save time and money while establishing uniform quality. The U.S. snack food industry generates billions of dollars in revenue each year, and vibrational spectroscopic methods combined with pattern recognition analysis could permit optimization of the production rate, quality, and safety of many food products. This research showed that infrared spectroscopy is a powerful technique for near real-time (approximately 1 min) assessment of sugar content in various cereal products. © 2012 Institute of Food Technologists®
Qualitative computer aided evaluation of dental impressions in vivo.
Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H
2006-01-01
Clinical investigations dealing with the precision of different impression techniques are rare. The objective of the present study was to develop and evaluate a procedure for the qualitative analysis of three-dimensional impression precision, based on an established in-vitro procedure. The null hypothesis to be tested was that the precision of impressions does not differ depending on the impression technique used (single-step, monophase and two-step techniques) and on clinical variables. Digital surface data of patients' teeth prepared for crowns were gathered from standardized manufactured master casts after impressions with the three different techniques were taken in a randomized order. Data sets were analyzed for each patient in comparison with the single-step impression chosen as the reference. The qualitative analysis was limited to data points within the 99.5% range. Based on the color-coded representation, areas with maximum deviations were determined (preparation margin, mantle surface and occlusal surface). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated-measures factors (p < 0.05). For the positive 99.5% deviations, no variables with significant influence were identified in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameters on the deviations between impression techniques can be determined reliably using the 99.5th percentile of the deviations. An analysis of the areas with maximum deviations showed high clinical relevance. The preparation margin was identified as the weak spot of impression taking.
Statistical Tests of Reliability of NDE
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.
1987-01-01
Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.
Kohler, G.; Ruitenberg, E. J.
1974-01-01
Three methods employed in the diagnosis of trichinosis (trichinoscopy, the digestion method, and the immunofluorescence technique) were compared by laboratories in 5 countries of the European Economic Community. For this purpose, material from 32 pigs infected with 50, 150, 500, and 1,500 T. spiralis larvae was examined. With none of the three methods was it possible to detect with sufficient reliability a T. spiralis infection in pigs infected with 50 larvae. The digestion method and the immunofluorescence technique yielded more reliable results when the infection dose was 150 larvae or more. With trichinoscopy, reliable results were obtained in pigs infected with 500 and 1,500 larvae. With the digestion method and trichinoscopy, the onset of infection was detectable from 3 weeks post infection, the digestion method being more reliable; the immunofluorescence technique yielded positive results from approximately 4-6 weeks post infection. The immunofluorescence technique is applicable for epidemiological surveys. As a routine diagnostic procedure in the slaughterhouse, trichinoscopy and the digestion method are possible alternatives, the latter being more sensitive. PMID:4616776
Composite power system well-being analysis
NASA Astrophysics Data System (ADS)
Aboreshaid, Saleh Abdulrahman Saleh
The evaluation of composite system reliability is extremely complex, as it is necessary to include detailed modeling of both generation and transmission facilities and their auxiliary elements. The most significant quantitative indices in composite power system adequacy evaluation are those which relate to load curtailment. Many utilities have difficulty in interpreting the expected load curtailment indices, as the existing models are based on adequacy analysis and in many cases do not consider realistic operating conditions in the system under study. This thesis presents a security-based approach which alleviates this difficulty and provides the ability to evaluate the well-being of customer load points and the overall composite generation and transmission power system. Acceptable deterministic criteria are included in the probabilistic evaluation of the composite system reliability indices to monitor load point well-being. The degree of load point well-being is quantified in terms of the healthy and marginal state indices in addition to the traditional risk indices. The individual well-being indices of the different system load points are aggregated to produce system indices. This thesis presents new models and techniques to quantify the well-being of composite generation and of direct and alternating current transmission systems. Security constraints are basically the operating limits which must be satisfied for normal system operation. These constraints depend mainly on the purpose behind the study. The constraints which govern the practical operation of a power system are divided, in this thesis, into three sets, namely steady-state, voltage stability and transient stability constraints. The inclusion of an appropriate transient stability constraint will lead to a more accurate appraisal of the overall power system well-being. This thesis illustrates the utilization of a bisection method in the analytical evaluation of the critical clearing time, which forms the basis of most existing stability assessments. The effect of employing high-speed simultaneous or adaptive reclosing schemes is presented in this thesis. An effective and fast technique to incorporate voltage stability considerations in composite generation and transmission system reliability evaluation is also presented. The proposed technique can be easily incorporated in an existing composite power system reliability program using voltage stability constraints that are constructed for individual load points based on a relatively simple risk index. It is believed that the concepts, procedures and indices presented in this thesis will provide useful tools for power system designers, planners and operators and assist them in performing composite system well-being analysis in addition to traditional risk assessment.
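As a small illustration of the bisection idea mentioned above, the sketch below searches for the critical clearing time between a stable lower bound and an unstable upper bound. The stability predicate here is a stand-in with a hidden "true" answer; in a real study it would be a time-domain transient stability simulation.

```python
def is_stable(clearing_time: float) -> bool:
    """Placeholder stability check; a real one would run a transient simulation."""
    CCT_TRUE = 0.217           # hidden "true" critical clearing time (s), illustrative
    return clearing_time <= CCT_TRUE

def critical_clearing_time(lo: float = 0.0, hi: float = 1.0, tol: float = 1e-4) -> float:
    """Bisection: lo is always stable, hi always unstable; narrow until tol."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if is_stable(mid):
            lo = mid           # still stable: the CCT lies above mid
        else:
            hi = mid           # unstable: the CCT lies below mid
    return 0.5 * (lo + hi)

print(f"CCT ~ {critical_clearing_time():.4f} s")
```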
The Myotonometer: Not a Valid Measurement Tool for Active Hamstring Musculotendinous Stiffness.
Pamukoff, Derek N; Bell, Sarah E; Ryan, Eric D; Blackburn, J Troy
2016-05-01
Hamstring musculotendinous stiffness (MTS) is associated with lower-extremity injury risk (ie, hamstring strain, anterior cruciate ligament injury) and is commonly assessed using the damped oscillatory technique. However, despite a preponderance of studies that measure MTS reliably in laboratory settings, there are no valid clinical measurement tools. A valid clinical measurement technique is needed to assess MTS and permit identification of individuals at heightened risk of injury and track rehabilitation progress. To determine the validity and reliability of the Myotonometer for measuring active hamstring MTS. Descriptive laboratory study. Laboratory. 33 healthy participants (15 men, age 21.33 ± 2.94 y, height 172.03 ± 16.36 cm, mass 74.21 ± 16.36 kg). Hamstring MTS was assessed using the damped oscillatory technique and the Myotonometer. Intraclass correlations were used to determine the intrasession, intersession, and interrater reliability of the Myotonometer. Criterion validity was assessed via Pearson product-moment correlation between MTS measures obtained from the Myotonometer and from the damped oscillatory technique. The Myotonometer demonstrated good intrasession (ICC3,1 = .807) and interrater reliability (ICC2,k = .830) and moderate intersession reliability (ICC2,k = .693). However, it did not provide a valid measurement of MTS compared with the damped oscillatory technique (r = .346, P = .061). The Myotonometer does not provide a valid measure of active hamstring MTS. Although the Myotonometer does not measure active MTS, it possesses good reliability and portability and could be used clinically to measure tissue compliance, muscle tone, or spasticity associated with multiple musculoskeletal disorders. Future research should focus on portable and clinically applicable tools to measure active hamstring MTS in efforts to prevent and monitor injuries.
Tailoring a Human Reliability Analysis to Your Industry Needs
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2016-01-01
Industries at risk of accidents caused by human error with catastrophic consequences include: airlines, medicine (malpractice and medication mistakes), aerospace, oil production (major spills), transportation, power production and manufacturing. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies are used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training and procedures, 4) type and availability of data, 5) how the industry views risk and reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed, versus a requirement to provide a numerical value as part of a probabilistic risk assessment. Industries involving humans operating large equipment or transport systems (e.g., railroads or airlines) have more need to address the man-machine interface than medical workers administering medications. Human error occurs in every industry; in most cases the consequences are relatively benign and occasionally beneficial. In cases where the results can have disastrous consequences, the use of Human Reliability techniques to identify and classify the risk of human errors allows a company more opportunities to mitigate or eliminate these types of risks and prevent costly tragedies.
Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly
Maier, Andrea B.; Aarts, Ronald G. K. M.; van Gerven, Joop M. A.; Arendzen, J. Hans; Schouten, Alfred C.; Meskers, Carel G. M.; van der Kooij, Herman
2016-01-01
Objectives System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed-loop system identification techniques. Methods In twelve healthy elderly subjects, balance tests were performed twice a day on three days. Body sway was measured during two minutes of standing with eyes closed, and the Balance test Room (BalRoom) was used to apply four disturbances simultaneously: two sensory disturbances, to the proprioceptive and the visual system, and two mechanical disturbances applied at the leg and trunk segment. Using system identification techniques, sensitivity functions of the sensory disturbances and the neuromuscular controller were estimated. Based on generalizability theory (G theory), systematic errors and sources of variability were assessed using linear mixed models, and reliability was assessed by computing indexes of dependability (ID), standard error of measurement (SEM) and minimal detectable change (MDC). Results A systematic error was found between the first and second trials in the sensitivity functions. No systematic error was found in the neuromuscular controller or body sway. The reliability of 15 of 25 parameters and body sway was moderate to excellent when the results of two trials on three days were averaged. To reach excellent reliability on one day in 7 out of 25 parameters, it was predicted that at least seven trials must be averaged. Conclusion This study shows that system identification techniques are a promising method to assess the underlying systems involved in standing balance in elderly people. However, most of the parameters do not appear to be reliable unless a large number of trials are collected across multiple days. To reach excellent reliability in one third of the parameters, a training session for participants is needed and at least seven trials of two minutes must be performed on one day. PMID:26953694
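For readers unfamiliar with the reliability indices named here, the sketch below computes SEM and MDC from an ICC using the standard shortcut formulas SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM. The study's own computation is model-based (G theory); this is the common simplified form, and the numbers are illustrative.

```python
import math

def sem(sd: float, icc: float) -> float:
    """Standard error of measurement from between-subject SD and reliability."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd: float, icc: float) -> float:
    """Minimal detectable change at the 95% confidence level."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

sd_between_subjects, icc = 4.2, 0.80     # hypothetical balance parameter values
print(f"SEM   = {sem(sd_between_subjects, icc):.2f}")
print(f"MDC95 = {mdc95(sd_between_subjects, icc):.2f}")
```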
1988 IEEE Aerospace Applications Conference, Park City, UT, Feb. 7-12, 1988, Digest
NASA Astrophysics Data System (ADS)
The conference presents papers on microwave applications, data and signal processing applications, related aerospace applications, and advanced microelectronic products for the aerospace industry. Topics include a high-performance antenna measurement system, microwave power beaming from earth to space, the digital enhancement of microwave component performance, and a GaAs vector processor based on parallel RISC microprocessors. Consideration is also given to unique techniques for reliable SBNR architectures, a linear analysis subsystem for CSSL-IV, and a structured singular value approach to missile autopilot analysis.
2016-09-01
an instituted safety program that utilizes a generic risk assessment method involving the 5-M (Mission, Man, Machine, Medium and Management) factors... the Safety core value is hinged upon three key principles: (1) each soldier has a crucial part to play, by adopting safety as a core value and making it a way of life in his unit; (2) safety is an integral part of training, operations and mission success; and (3) safety is an individual, team and...
Bladed disc crack diagnostics using blade passage signals
NASA Astrophysics Data System (ADS)
Hanachi, Houman; Liu, Jie; Banerjee, Avisekh; Koul, Ashok; Liang, Ming; Alavi, Elham
2012-12-01
One of the major potential faults in a turbofan engine is crack initiation and propagation in bladed discs under cyclic loads, which could result in the breakdown of the engine if not detected at an early stage. Reliable fault detection techniques are therefore in demand to reduce maintenance costs and prevent catastrophic failures. Although a number of approaches have been reported in the literature, it remains very challenging to develop a reliable technique to accurately estimate the health condition of a rotating bladed disc. Correspondingly, this paper presents a novel technique for bladed disc crack detection through two sequential signal processing stages: (1) signal preprocessing, which aims to eliminate the noises in the blade passage signals; (2) signal postprocessing, which intends to identify the crack location. In the first stage, physics-based modeling and interpretation are established to help characterize the noises. Crack initiation can be determined based on a health monitoring index derived from the sinusoidal effects. In the second stage, the crack is located through advanced detrended fluctuation analysis of the preprocessed data. The proposed technique is validated using a set of spin rig test data (i.e., tip clearance and time of arrival) that was acquired during a test conducted on a bladed military engine fan disc. The test results have demonstrated that the developed technique is an effective approach for identifying and locating an incipient crack at the root of a bladed disc.
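A compact, generic sketch of the detrended fluctuation analysis (DFA) used in the second stage is given below; it is textbook first-order DFA, not the authors' implementation, and the white-noise test signal (for which the scaling exponent is about 0.5) is illustrative.

```python
import numpy as np

def dfa(x, scales):
    """First-order DFA: returns fluctuation F(n) per scale and the scaling exponent."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segments = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        rms = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)       # local linear trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        F.append(np.mean(rms))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]  # slope in log-log plane
    return np.asarray(F), alpha

rng = np.random.default_rng(1)
signal = rng.standard_normal(4096)             # white noise -> alpha ~ 0.5
_, alpha = dfa(signal, scales=[16, 32, 64, 128, 256])
print(f"DFA scaling exponent: {alpha:.2f}")
```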
DNA-based techniques for authentication of processed food and food supplements.
Lo, Yat-Tung; Shaw, Pang-Chui
2018-02-01
Authentication of food or food supplements with medicinal values is important to avoid adverse toxic effects, provide consumer rights, as well as for certification purpose. Compared to morphological and spectrometric techniques, molecular authentication is found to be accurate, sensitive and reliable. However, DNA degradation and inclusion of inhibitors may lead to failure in PCR amplification. This paper reviews on the existing DNA extraction and PCR protocols, and the use of small size DNA markers with sufficient discriminative power for molecular authentication. Various emerging new molecular techniques such as isothermal amplification for on-site diagnosis, next-generation sequencing for high-throughput species identification, high resolution melting analysis for quick species differentiation, DNA array techniques for rapid detection and quantitative determination in food products are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Synthesis of multifilament silicon carbide fibers by chemical vapor deposition
NASA Technical Reports Server (NTRS)
Revankar, Vithal; Hlavacek, Vladimir
1991-01-01
A process for the development of clean silicon carbide fiber with a small diameter and high reliability is presented. An experimental evaluation of operating conditions for SiC fibers of good mechanical properties, and the devising of an efficient technique to prevent the welding together of individual filaments, are discussed. The thermodynamics of different precursor systems was analyzed rigorously. Thermodynamically optimum conditions for a stoichiometric SiC deposit were obtained.
A study of multiplex data bus techniques for the space shuttle
NASA Technical Reports Server (NTRS)
Kearney, R. J.; Kalange, M. A.
1972-01-01
A comprehensive technology base for the design of a multiplexed data bus subsystem is provided. Extensive analyses, both analytical and empirical, were performed. Subjects covered are classified under the following headings: requirements identification and analysis; transmission media studies; signal design and detection studies; synchronization, timing, and control studies; user-subsystem interface studies; operational reliability analyses; design of candidate data bus configurations; and evaluation of candidate data bus designs.
Semi-Markov Models for Degradation-Based Reliability
2010-01-01
standard analysis techniques for Markov processes can be employed (cf. Whitt (1984), Altiok (1985), Perros (1994), and Osogami and Harchol-Balter...). We want to approximate X by a PH random variable, say Y, with c.d.f. Ĥ. Marie (1980), Altiok (1985), Johnson (1993), Perros (1994), and Osogami and... provides a minimal representation when matching only two moments. By considering the guidance provided by Marie (1980), Whitt (1984), Altiok (1985), Perros...
Wang, Kai; Li, Yao; Teng, Jing-Fei; Zhou, Hai-Yong; Xu, Dan-Feng; Fan, Yi
2015-01-01
To evaluate the efficacy and safety of plasmakinetic resection of the prostate (PKRP) versus transurethral resection of the prostate (TURP) for the treatment of patients with benign prostate hyperplasia (BPH), a meta-analysis of randomized controlled trials was carried out. We searched PubMed, Embase, Web of Science and the Cochrane Library. The pooled estimates of maximum flow rate (Qmax), International Prostate Symptom Score, operation time, catheterization time, irrigated volume, hospital stay, transurethral resection syndrome, transfusion, clot retention, urinary retention and urinary stricture were assessed. There was no notable difference in International Prostate Symptom Score between the TURP and PKRP groups during the 1-month, 3-month, 6-month and 12-month follow-up periods, while the pooled Qmax at 1 month favored the PKRP group. The PKRP group was associated with a lower rate of transurethral resection syndrome, transfusion and clot retention, and the catheterization time and operation time were also shorter than those of TURP. The irrigated volume, length of hospital stay, urinary retention rate and urinary stricture rate were similar between groups. In conclusion, our study suggests that PKRP is a reliable, minimally invasive technique and may prove to be an alternative electrosurgical procedure for the treatment of BPH.
Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad
2015-05-15
Because of their widespread application, phthalates or phthalic acid esters (PAEs) are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques have been found suitable for reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment, and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Therefore, accurate PAE analysis in environmental matrices is a challenging task. This paper reviews the extensive literature data on the techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) have been reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and the procedures of quality control and quality assurance is presented to overcome the problems of sample contamination and those encountered due to matrix effects, in order to avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.
Application of dual-energy x-ray techniques for automated food container inspection
NASA Astrophysics Data System (ADS)
Shashishekhar, N.; Veselitza, D.
2016-02-01
Manufacturing of plastic food containers often results in small metal particles getting into the containers during the production process. Metal detectors are usually not sensitive enough to detect these metal particles (0.5 mm or less), especially when the containers are stacked in large sealed shipping packages; X-ray inspection of these packages provides a viable alternative. This paper presents the results of an investigation into dual-energy X-ray techniques for automated detection of small metal particles in plastic food container packages. The sample packages consist of sealed cardboard boxes containing stacks of food containers: plastic cups for food, and Styrofoam cups for noodles. The primary goal of the investigation was to automatically identify small metal particles down to 0.5 mm in diameter or less, randomly located within the containers. The multiple container stacks in each box make it virtually impossible to reliably detect the particles with single-energy X-ray techniques, either visually or with image processing. The stacks get overlaid in the X-ray image and create many indications almost identical in contrast and size to real metal particles. Dual-energy X-ray techniques were investigated and found to result in a clear separation of the metal particles from the food container stack-ups. Automated image analysis of the resulting images provides reliable detection of the small metal particles.
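One plausible reading of the dual-energy approach is a weighted log subtraction that cancels the container background so that denser metal particles stand out; the sketch below implements that generic idea, not the paper's algorithm, and the weight choice, toy images, and contrast check are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dual_energy_subtract(img_low, img_high, background_mask):
    """Weighted log subtraction: pick w so the background cancels on average."""
    log_lo, log_hi = np.log(img_low), np.log(img_high)
    w = log_lo[background_mask].mean() / log_hi[background_mask].mean()
    return log_lo - w * log_hi

# Toy transmission images: noisy container background plus one metal pixel.
lo = 0.60 + 0.01 * rng.standard_normal((64, 64))   # low-energy image
hi = 0.80 + 0.01 * rng.standard_normal((64, 64))   # high-energy image
lo[32, 32], hi[32, 32] = 0.15, 0.55                # metal attenuates low energy more
mask = np.ones((64, 64), bool)
mask[32, 32] = False                               # background pixels only

diff = dual_energy_subtract(lo, hi, mask)
print("metal pixel contrast (in background-noise sigmas):",
      abs(diff[32, 32] - diff[mask].mean()) / diff[mask].std())
```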
Reliability and Validity of 3 Methods of Assessing Orthopedic Resident Skill in Shoulder Surgery.
Bernard, Johnathan A; Dattilo, Jonathan R; Srikumaran, Uma; Zikria, Bashir A; Jain, Amit; LaPorte, Dawn M
Traditional measures for evaluating resident surgical technical skills (e.g., case logs) assess operative volume but not level of surgical proficiency. Our goal was to compare the reliability and validity of 3 tools for measuring surgical skill among orthopedic residents when performing 3 open surgical approaches to the shoulder. A total of 23 residents at different stages of their surgical training were tested for technical skill pertaining to 3 shoulder surgical approaches using the following measures: Objective Structured Assessment of Technical Skills (OSATS) checklists, the Global Rating Scale (GRS), and a final pass/fail assessment determined by 3 upper extremity surgeons. Adverse events were recorded. The Cronbach α coefficient was used to assess reliability of the OSATS checklists and GRS scores. Interrater reliability was calculated with intraclass correlation coefficients. Correlations among OSATS checklist scores, GRS scores, and pass/fail assessment were calculated with Spearman ρ. Validity of OSATS checklists was determined using analysis of variance with postgraduate year (PGY) as a between-subjects factor. Significance was set at p < 0.05 for all tests. Criterion validity was shown between the OSATS checklists and GRS for the 3 open shoulder approaches. Checklist scores showed superior interrater reliability compared with GRS and subjective pass/fail measurements. GRS scores were positively correlated across training years. The incidence of adverse events was significantly higher among PGY-1 and PGY-2 residents compared with more experienced residents. OSATS checklists are a valid and reliable assessment of technical skills across 3 surgical shoulder approaches. However, checklist scores do not measure quality of technique. Documenting adverse events is necessary to assess quality of technique and ultimate pass/fail status. Multiple methods of assessing surgical skill should be considered when evaluating orthopedic resident surgical performance. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Automatic specification of reliability models for fault-tolerant computers
NASA Technical Reports Server (NTRS)
Liceaga, Carlos A.; Siewiorek, Daniel P.
1993-01-01
The calculation of reliability measures using Markov models is required for life-critical processor-memory-switch structures that have standby redundancy or that are subject to transient or intermittent faults or repair. The task of specifying these models is tedious and prone to human error because of the large number of states and transitions required in any reasonable system. Therefore, model specification is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model specification. Automation requires a general system description language (SDL). For practicality, this SDL should also provide a high level of abstraction and be easy to learn and use. The first attempt to define and implement an SDL with those characteristics is presented. A program named Automated Reliability Modeling (ARM) was constructed as a research vehicle. The ARM program uses a graphical interface as its SDL, and it outputs a Markov reliability model specification formulated for direct use by programs that generate and evaluate the model.
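To indicate the kind of model ARM is meant to specify automatically, here is a hand-built sketch of a two-unit standby system with imperfect switchover coverage, solved with the matrix exponential of the Markov generator; the states, rates, and coverage value are illustrative, not from the paper.

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = both units good, 1 = one unit good, 2 = system failed (absorbing).
lam, c = 1e-3, 0.95          # unit failure rate (per hour), switchover coverage
Q = np.array([
    [-lam,  c * lam,  (1 - c) * lam],   # primary fails; switchover succeeds w.p. c
    [0.0,  -lam,      lam          ],   # remaining unit fails
    [0.0,   0.0,      0.0          ],   # failed state is absorbing
])

def reliability(t_hours: float) -> float:
    """R(t) = probability the absorbing failed state has not been reached."""
    p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t_hours)  # state probabilities at t
    return 1.0 - p[2]

print(f"R(1000 h) = {reliability(1000.0):.4f}")
```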
Fifty Years of THERP and Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring
2012-06-01
In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of the NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the most well known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament of its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain’s pioneering work.
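As a taste of THERP's machinery, the sketch below applies the dependence equations from NUREG/CR-1278, which give the conditional probability that a second task fails given that the first one did, for the five standard dependence levels; the basic human error probability (HEP) value used is illustrative.

```python
def conditional_hep(hep: float, level: str) -> float:
    """THERP (NUREG/CR-1278) conditional HEP given failure of the preceding task."""
    formulas = {
        "zero":     lambda p: p,                  # no dependence
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,                # second task always fails too
    }
    return formulas[level](hep)

basic_hep = 0.003   # a typical nominal HEP magnitude, for illustration only
for lvl in ("zero", "low", "moderate", "high", "complete"):
    print(f"{lvl:9s} dependence -> {conditional_hep(basic_hep, lvl):.4f}")
```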
Accurate secondary structure prediction and fold recognition for circular dichroism spectroscopy
Micsonai, András; Wien, Frank; Kernya, Linda; Lee, Young-Ho; Goto, Yuji; Réfrégiers, Matthieu; Kardos, József
2015-01-01
Circular dichroism (CD) spectroscopy is a widely used technique for the study of protein structure. Numerous algorithms have been developed for the estimation of the secondary structure composition from the CD spectra. These methods often fail to provide acceptable results on α/β-mixed or β-structure–rich proteins. The problem arises from the spectral diversity of β-structures, which has hitherto been considered as an intrinsic limitation of the technique. The predictions are less reliable for proteins of unusual β-structures such as membrane proteins, protein aggregates, and amyloid fibrils. Here, we show that the parallel/antiparallel orientation and the twisting of the β-sheets account for the observed spectral diversity. We have developed a method called β-structure selection (BeStSel) for the secondary structure estimation that takes into account the twist of β-structures. This method can reliably distinguish parallel and antiparallel β-sheets and accurately estimates the secondary structure for a broad range of proteins. Moreover, the secondary structure components applied by the method are characteristic to the protein fold, and thus the fold can be predicted to the level of topology in the CATH classification from a single CD spectrum. By constructing a web server, we offer a general tool for a quick and reliable structure analysis using conventional CD or synchrotron radiation CD (SRCD) spectroscopy for the protein science research community. The method is especially useful when X-ray or NMR techniques fail. Using BeStSel on data collected by SRCD spectroscopy, we investigated the structure of amyloid fibrils of various disease-related proteins and peptides. PMID:26038575
Gelfusa, M; Gaudio, P; Malizia, A; Murari, A; Vega, J; Richetta, M; Gonzalez, S
2014-06-01
Recently, surveying large areas in an automatic way, for early detection of both harmful chemical agents and forest fires, has become a strategic objective of defence and public health organisations. The lidar and DIAL techniques are widely recognized as a cost-effective alternative for monitoring large portions of the atmosphere. To maximize the effectiveness of the measurements and to guarantee reliable monitoring of large areas, new data analysis techniques are required. In this paper, an original tool, the Universal Multi Event Locator, is applied to the problem of automatically identifying the time location of peaks in lidar and DIAL measurements for environmental physics applications. This analysis technique improves various aspects of the measurements, ranging from resilience to drift in the laser sources to an increase in system sensitivity. The method is also fully general and purely software-based, and can therefore be applied to a large variety of problems without any additional cost. The potential of the proposed technique is exemplified with the help of data from various instruments acquired during several experimental campaigns in the field.
Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura
2018-06-01
There are several techniques used to analyze microplastics, often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques for increasing the reliability of microplastic analysis. Particle size proved to have a particular influence on the qualitative and quantitative performance of the DSC signals. Both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were shown to be affected by particle size. As a result, a proper sample treatment that includes sieving of suspended particles is particularly required for this analytical approach.
Liu, Yang; Gu, Ming; Alocilja, Evangelyn C; Chakrabartty, Shantanu
2010-11-15
An ultra-reliable technique for detecting trace quantities of biomolecules is reported. The technique, called "co-detection", exploits the non-linear redundancy among synthetically patterned biomolecular logic circuits for deciphering the presence or absence of target biomolecules in a sample. In this paper, we verify the "co-detection" principle on gold-nanoparticle-based conductimetric soft-logic circuits which use a silver-enhancement technique for signal amplification. Using co-detection, we have been able to demonstrate a great improvement in the reliability of detecting mouse IgG at concentration levels 10⁵ times lower than the concentration of rabbit IgG, which serves as background interference. Copyright © 2010 Elsevier B.V. All rights reserved.
Approaches to Improve the Performances of the Sea Launch System
NASA Astrophysics Data System (ADS)
Tatarevs'kyy, K.
2002-01-01
The paper outlines techniques for on-line pre-launch analysis of the possibility of a safe and reliable LV launch from a floating launch system when the actual launch conditions (weather, launcher motion parameters) are beyond design limitations. The technique guarantees that the take-off LV trajectory limitations are respected (the shock-free launch) and allows the improvement of the operating characteristics of floating launch systems by making it possible to authorize the launch even if a number of restrictions on weather and launcher motion parameters are exceeded. The ideas of this paper are applied to launches of Zenit-type LVs from the tilting launch platform operated within Sea Launch. The importance, novelty and urgency of the approach under consideration are explained by the fact that its application during floating launch system operation brings down the number of weather-conditioned launch abort cases. This, in turn, increases the trustworthiness of mission fulfillment on specific spacecraft injection, since, in the long run, a launch abort may cause the crossing of the allowable wait threshold and accordingly the abort of the mission. Previous launch modes for these LVs did not require the development of a special technique of pre-launch analysis of launch possibility, since weather limitations for stationary launcher conditions basically reduce to wind velocity limitations. This parameter is reliably monitored and is sure to influence the launch dynamics, so the measured wind velocity gives a thorough picture of the possibility of a launch from a ground-based launcher. Since floating launch systems undergo complex and continuous motions under the exposure of wind and waves, the number of parameters is increased, and, combined differently, they do not always make the issue of a shockless launch critical. The proposed technique of pre-launch analysis of the forthcoming launch dynamics, with consideration of the launch conditions (weather, launcher motion parameters, actual LV and carried SC performance), allows the evaluation of the influence of the actual combination of launch conditions on the possibility of a shockless launch. On the basis of this analysis the launch permissibility decision is taken, even if some separate parameters are beyond the design range.
Reliable bonding using indium-based solders
NASA Astrophysics Data System (ADS)
Cheong, Jongpil; Goyal, Abhijat; Tadigadapa, Srinivas; Rahn, Christopher
2004-01-01
Low temperature bonding techniques with high bond strengths and reliability are required for the fabrication and packaging of MEMS devices. Indium and indium-tin based bonding processes are explored for the fabrication of a flextensional MEMS actuator, which requires the integration of lead zirconate titanate (PZT) substrate with a silicon micromachined structure at low temperatures. The developed technique can be used either for wafer or chip level bonding. The lithographic steps used for the patterning and delineation of the seed layer limit the resolution of this technique. Using this technique, reliable bonds were achieved at a temperature of 200°C. The bonds yielded an average tensile strength of 5.41 MPa and 7.38 MPa for samples using indium and indium-tin alloy solders as the intermediate bonding layers respectively. The bonds (with line width of 100 microns) showed hermetic sealing capability of better than 10⁻¹¹ mbar·l/s when tested using a commercial helium leak tester.
Multielemental speciation analysis by advanced hyphenated technique - HPLC/ICP-MS: A review.
Marcinkowska, Monika; Barałkiewicz, Danuta
2016-12-01
Speciation analysis has become an invaluable tool in human health risk assessment, environmental monitoring and food quality control. The next step is to develop reliable multielemental speciation methodologies, to reduce the costs, waste and time needed for analysis. Separation and detection of species of several elements in a single analytical run can be accomplished by high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry (HPLC/ICP-MS). Our review assembles articles concerning multielemental speciation determination of As, Se, Cr, Sb, I, Br, Pb, Hg, V, Mo, Te, Tl, Cd and W in environmental, biological, food and clinical samples analyzed with HPLC/ICP-MS. It addresses the procedures in terms of the following issues: sample collection and pretreatment, selection of optimal conditions for element species separation by HPLC and determination using ICP-MS, as well as the metrological approach. The presented work is the first review article concerning multielemental speciation analysis by the advanced hyphenated technique HPLC/ICP-MS. Copyright © 2016 Elsevier B.V. All rights reserved.
Zimmerman, Heather A; Meizel-Lambert, Cayli J; Schultz, John J; Sigman, Michael E
2015-03-01
Forensic anthropologists are generally able to identify skeletal materials (bone and tooth) using gross anatomical features; however, highly fragmented or taphonomically altered materials may be problematic to identify. Several chemical analysis techniques have been shown to be reliable laboratory methods that can be used to determine if questionable fragments are osseous, dental, or non-skeletal in nature. The purpose of this review is to provide a detailed background of chemical analysis techniques focusing on elemental compositions that have been assessed for use in differentiating osseous, dental, and non-skeletal materials. More recently, chemical analysis studies have also focused on using the elemental composition of osseous/dental materials to evaluate species and provide individual discrimination, but have generally been successful only in small, closed groups, limiting their use forensically. Despite significant advances incorporating a variety of instruments, including handheld devices, further research is necessary to address issues in standardization, error rates, and sample size/diversity. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
Measures of shoulder protraction and thoracolumbar rotation.
Schenkman, M; Laub, K C; Kuchibhatla, M; Ray, L; Shinberg, M
1997-05-01
Physical therapists need objective measures that can be used reliably with a variety of subject groups to document upper quadrant function. Two aspects of upper quadrant motion, shoulder protraction and thoracolumbar rotation, are assessed routinely in clinical practice, but no standard measurement techniques have been reported. We hypothesized that there would be significant differences, by age and state of health, for both shoulder protraction and thoracolumbar rotation. The purposes of this study were: 1) to develop measurement approaches for shoulder protraction and thoracolumbar rotation; 2) to determine if there are significant differences in these motions for four subject groups: healthy young, healthy elders, functionally limited elders, and people with Parkinson's disease; and 3) to describe between-rater and within-rater reliability for these measures. Fifty-five subjects participated in this investigation. All subjects were rated by a physical therapist and two research assistants. Using an analysis of variance followed by Scheffe's post hoc analysis, significant differences were demonstrated between the groups. Between-rater and within-rater reliability ranged from ICCs of 0.54 to 0.95. Clinicians can use these measures to quantify aspects of upper quadrant function treated routinely in physical therapy practice. These measures also have applicability for researchers.
NASA Astrophysics Data System (ADS)
Suhir, E.
2014-05-01
The well-known and widely used experimental reliability "passport" of a mass-manufactured electronic or photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increasing failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and that the obtained bathtub curve therefore does not contain the infant-mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by subtracting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution was used for the statistical failure rate, for the sake of a relatively simple illustration. The developed methodology can be used in reliability-physics evaluations when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes. Future work should include investigations of how powerful and flexible methods of statistical mechanics can be effectively employed, in addition to reliability-physics techniques, to model the operational reliability of electronic and photonic products.
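A minimal sketch of the subtraction step just described: given tabulated post-burn-in bathtub ordinates and an assumed Rayleigh hazard rate for the statistics-related component, the physics-of-failure (aging) rate is estimated pointwise. The curve shape and the Rayleigh parameter below are hypothetical; the paper's exact parameterization is not reproduced here.

```python
import numpy as np

def rayleigh_hazard(t, sigma):
    # Hazard (instantaneous failure) rate of a Rayleigh distribution:
    # h(t) = f(t) / S(t) = t / sigma**2
    return t / sigma**2

def physics_failure_rate(t, lam_total, lam_stat):
    # Assumes the statistical and physical processes are independent, so the
    # bathtub ordinate is the sum of the two component rates.
    return np.clip(lam_total - lam_stat, 0.0, None)  # rates are non-negative

t = np.linspace(0.1, 10.0, 100)          # time, arbitrary units
lam_total = 0.05 + 0.002 * t**2          # hypothetical post-burn-in bathtub ordinates
lam_phys = physics_failure_rate(t, lam_total, rayleigh_hazard(t, sigma=5.0))
```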
Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with patient outcome as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and for a multitude of feasible parameter sets of the selected correlation analysis. An optimized set of parameters improved the sensitivity of the method by a factor greater than four in comparison with our first analyses.
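As an illustration of the windowing-plus-coherence idea, the sketch below computes an SCP-like index: a sliding window is "selected" when its slow-wave coherence exceeds a threshold, and the sign of the windowed correlation classifies it. Band limits, thresholds, and window length are assumptions; the authors' exact SCP definition is not reproduced here.

```python
import numpy as np
from scipy.signal import coherence

def selected_correlation_positive(abp, icp, fs, win_s=300.0, coh_thresh=0.5):
    """Fraction of windows with a coherence-gated positive ABP/ICP correlation."""
    n = int(win_s * fs)
    positives, windows = 0, 0
    for start in range(0, len(abp) - n + 1, n):
        a, i = abp[start:start + n], icp[start:start + n]
        f, cxy = coherence(a, i, fs=fs, nperseg=n // 4)
        band = (f > 0) & (f <= 0.05)  # assumed slow-wave band, Hz
        windows += 1
        if cxy[band].mean() > coh_thresh and np.corrcoef(a, i)[0, 1] > 0:
            positives += 1
    return positives / max(windows, 1)

# Synthetic monitoring data, assumed downsampled to 1 sample/s
fs = 1.0
t = np.arange(0, 6 * 3600.0, 1 / fs)
rng = np.random.default_rng(1)
abp = 80 + np.sin(2 * np.pi * 0.02 * t) + rng.standard_normal(t.size)
icp = 10 + 0.5 * np.sin(2 * np.pi * 0.02 * t) + rng.standard_normal(t.size)
print(f"SCP-like index: {selected_correlation_positive(abp, icp, fs):.2f}")
```

In a parameter optimization of the kind described, win_s and coh_thresh would be swept over feasible values and the resulting index correlated with GOS.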
The new ATLAS Fast Calorimeter Simulation
NASA Astrophysics Data System (ADS)
Schaarschmidt, J.; ATLAS Collaboration
2017-10-01
Current and future needs for large-scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and reduce the memory footprint compared with the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development, and its integration into the ATLAS simulation infrastructure is ongoing.
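The abstract names principal component analysis as one ingredient; a minimal sketch of that step is shown below, compressing correlated per-layer energy fractions of simulated showers into a few components from which an approximate response can be reconstructed quickly. The shapes, binning, and number of components are hypothetical, and the actual FastCaloSim parameterization is far more detailed.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_showers, n_layers = 5000, 24
# Stand-in for per-layer energy fractions of Geant4-simulated showers
fractions = rng.dirichlet(alpha=np.linspace(5.0, 1.0, n_layers), size=n_showers)

pca = PCA(n_components=5).fit(fractions)
coeffs = pca.transform(fractions)          # compact parameterization
approx = pca.inverse_transform(coeffs)     # fast approximate response
print("explained variance:", pca.explained_variance_ratio_.round(3))
```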
Real-time flutter identification
NASA Technical Reports Server (NTRS)
Roy, R.; Walker, R.
1985-01-01
The techniques and a FORTRAN 77 MOdal Parameter IDentification (MOPID) computer program developed for identification of the frequencies and damping ratios of multiple flutter modes in real time are documented. Physically meaningful model parameterization was combined with state-of-the-art recursive identification techniques and applied to the problem of real-time flutter mode monitoring. The performance of the algorithm, in terms of convergence speed and parameter estimation error, is demonstrated for several simulated data cases, and the results of actual flight data analysis from two different vehicles are presented. The results indicate that the algorithm is capable of real-time monitoring of aircraft flutter characteristics with a high degree of reliability.
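In the same spirit, the sketch below applies recursive least squares to a simulated decaying oscillation: an AR(2) model is fitted on-line and the mode's frequency and damping ratio are recovered from the estimated pole pair. The signal, sample rate, and forgetting factor are hypothetical, and MOPID's physically parameterized formulation is not reproduced here.

```python
import numpy as np

# Simulated single flutter mode: 5 Hz, 3% damping, light measurement noise
dt, f_true, zeta_true = 0.01, 5.0, 0.03
t = np.arange(0.0, 10.0, dt)
wn = 2 * np.pi * f_true
wd = wn * np.sqrt(1 - zeta_true**2)
rng = np.random.default_rng(3)
y = np.exp(-zeta_true * wn * t) * np.cos(wd * t) + 0.01 * rng.standard_normal(t.size)

theta = np.zeros(2)   # AR(2) coefficients [a1, a2]
P = 1e3 * np.eye(2)   # parameter covariance
lam = 0.995           # forgetting factor, allows slowly varying dynamics
for k in range(2, len(y)):
    phi = np.array([y[k - 1], y[k - 2]])
    K = P @ phi / (lam + phi @ P @ phi)      # gain
    theta += K * (y[k] - phi @ theta)        # update on prediction error
    P = (P - np.outer(K, phi @ P)) / lam

poles = np.roots([1.0, -theta[0], -theta[1]])  # z-plane poles of the AR model
s = np.log(poles[0]) / dt                      # map to the s-plane
print(f"f = {abs(s) / (2 * np.pi):.2f} Hz, zeta = {-s.real / abs(s):.3f}")
```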
Study of optical techniques for the Ames unitary wind tunnels. Part 4: Model deformation
NASA Technical Reports Server (NTRS)
Lee, George
1992-01-01
A survey of systems capable of model deformation measurements was conducted. The survey included stereo cameras, scanners, and digitizers. Moiré, holographic, and heterodyne interferometry techniques were also examined. Stereo cameras with passive or active targets are currently being deployed for model deformation measurements at NASA Ames and LaRC, Boeing, and ONERA. Scanners and digitizers are widely used in robotics, motion analysis, medicine, and other fields, and some of the scanners and digitizers can meet the model deformation requirements. Commercial stereo cameras, scanners, and digitizers are being improved in accuracy, reliability, and ease of operation, and a number of new systems are coming onto the market.
Quantitative assessment of human motion using video motion analysis
NASA Technical Reports Server (NTRS)
Probe, John D.
1993-01-01
In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion and, to a lesser degree, three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center uses the video-based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics, the analysis of human performance, and generalized motion measurement. Major components of the complete system include the video system, an AT-compatible computer, and the proprietary software.
Analysis of enamel rod end patterns on tooth surface for personal identification--ameloglyphics.
Manjunath, Krishnappa; Sivapathasundharam, Balasundharam; Saraswathi, Thillai R
2012-05-01
Ameloglyphics is the study of enamel rod end patterns on the tooth surface. Our aim was to analyze enamel rod end patterns on tooth surfaces in vivo for personal identification. In this study, the maxillary left canine and first premolar of 30 men and 30 women were included. The cellulose acetate peel technique was used to record enamel rod endings on tooth surfaces. Photomicrographs of the acetate peel imprints were processed with VeriFinger Standard SDK v5.0 software to obtain enamel rod end patterns. All 120 enamel rod end patterns were subjected to visual and biometric analysis. Biometric analysis revealed that the enamel rod end pattern is unique to each tooth in an individual, showing both intra- and interindividual variation. Enamel rod end patterns were also distinct between the male and female subjects. Visual analysis showed that the wavy-branched subpattern was the predominant subpattern among the examined teeth. Hence, ameloglyphics is a reliable technique for personal identification.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points (HACCP) tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies require the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can support a scientific and practical approach to decision making and how method performance risk can be assessed and managed. This paper discusses, in light of the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps of a method with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis.
Automated diagnosis of fetal alcohol syndrome using 3D facial image analysis
Fang, Shiaofen; McLaughlin, Jason; Fang, Jiandong; Huang, Jeffrey; Autti-Rämö, Ilona; Fagerlund, Åse; Jacobson, Sandra W.; Robinson, Luther K.; Hoyme, H. Eugene; Mattson, Sarah N.; Riley, Edward; Zhou, Feng; Ward, Richard; Moore, Elizabeth S.; Foroud, Tatiana
2012-01-01
Objectives: To use three-dimensional (3D) laser-scanned facial images from children with fetal alcohol syndrome (FAS) and controls to develop an automated diagnosis technique that can reliably and accurately identify individuals prenatally exposed to alcohol. Methods: A detailed dysmorphology evaluation, history of prenatal alcohol exposure, and 3D facial laser scans were obtained from 149 individuals (86 FAS; 63 controls) recruited from two study sites (Cape Town, South Africa, and Helsinki, Finland). Computer graphics, machine learning, and pattern recognition techniques were used to automatically identify the set of facial features that best discriminated individuals with FAS from controls in each sample. Results: An automated feature detection and analysis technique was developed and applied to the two study populations. A unique set of facial regions and features was identified for each population that accurately discriminated FAS and control faces without any human intervention. Conclusion: Our results demonstrate that computer algorithms can be used to automatically detect facial features that discriminate FAS and control faces.
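As a generic illustration of the pattern-recognition step (feature selection followed by classification), the sketch below uses synthetic stand-ins for facial measurements; the study's actual 3D-scan features and algorithms are not reproduced, and only the group sizes follow the abstract.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(149, 40))          # 40 hypothetical facial measurements
y = np.r_[np.ones(86), np.zeros(63)]    # 86 FAS, 63 controls (per the abstract)
X[y == 1, :5] += 0.8                    # pretend five features carry group signal

clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=5),  # keep the discriminative features
                    SVC(kernel="linear"))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))
```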
Diabetic peripheral neuropathy assessment through texture based analysis of corneal nerve images
NASA Astrophysics Data System (ADS)
Silva, Susana F.; Gouveia, Sofia; Gomes, Leonor; Negrão, Luís; João Quadrado, Maria; Domingues, José Paulo; Morgado, António Miguel
2015-05-01
Diabetic peripheral neuropathy (DPN) is a common complication of diabetes. Early diagnosis of DPN often fails due to the lack of a simple, reliable, non-invasive method. Several published studies show that corneal confocal microscopy (CCM) can identify small nerve fibre damage and quantify the severity of DPN using nerve morphometric parameters. Here, we used image texture features extracted from corneal sub-basal nerve plexus images, obtained in vivo by CCM, to identify DPN patients using classification techniques. An SVM classifier based on image texture features was used to classify patients (DPN vs. no DPN). The accuracies were 80.6% when diabetic patients without neuropathy were excluded and 73.5% when they were included together with the healthy controls. The results suggest that texture analysis might be used as a complementary technique for DPN diagnosis, without requiring nerve segmentation in CCM images. The results also suggest that this technique has enough sensitivity to detect early disorders in the corneal nerves of diabetic patients.
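A minimal sketch of texture-based classification as described: grey-level co-occurrence (GLCM) features are extracted from each image and fed to an SVM. The images below are random stand-ins, and the study's actual feature set, preprocessing, and classifier settings are not reproduced.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(img):
    # Co-occurrence statistics at two offsets and two orientations
    glcm = graycomatrix(img, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(5)
images = rng.integers(0, 256, size=(20, 64, 64), dtype=np.uint8)  # stand-ins
labels = rng.integers(0, 2, size=20)                              # DPN vs. no DPN
X = np.array([glcm_features(im) for im in images])
clf = SVC(kernel="rbf").fit(X, labels)
```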
NASA Astrophysics Data System (ADS)
Zubaidi, Salah L.; Dooley, Jayne; Alkhaddar, Rafid M.; Abdellatif, Mawada; Al-Bugharbee, Hussein; Ortega-Martorell, Sandra
2018-06-01
Valid and dependable water demand prediction is a major element of the effective and sustainable expansion of municipal water infrastructures. This study provides a novel approach to quantifying water demand through the assessment of climatic factors, using a combination of a signal pretreatment technique, a hybrid particle swarm optimisation algorithm, and an artificial neural network (PSO-ANN). The Singular Spectrum Analysis (SSA) technique was adopted to decompose and reconstruct water consumption in relation to six weather variables, creating seasonal and stochastic time series. The results revealed that SSA is a powerful technique, capable of decomposing the original time series into many independent components, including trend, oscillatory behaviours, and noise. In addition, the PSO-ANN algorithm was shown to be a reliable prediction model, outperforming the hybrid backtracking search algorithm (BSA-ANN) in terms of the fitness function (RMSE). The findings of this study also support the view that water demand is driven by climatological variables.
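A minimal SSA sketch under simple assumptions (a synthetic monthly-like series and an arbitrary window length): embed the series in a trajectory matrix, take its SVD, and reconstruct additive components by diagonal averaging. The study's actual series, window, and component grouping are not reproduced.

```python
import numpy as np

def ssa_components(x, window):
    """Return the elementary SSA components of x; they sum back to x."""
    n, k = len(x), len(x) - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])
        # Diagonal (Hankel) averaging back to a length-n series
        comps.append(np.array([Xi[::-1].diagonal(j - window + 1).mean()
                               for j in range(n)]))
    return np.array(comps)

t = np.arange(120)
rng = np.random.default_rng(6)
series = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.2 * rng.standard_normal(t.size)
comps = ssa_components(series, window=24)
trend_and_season = comps[:3].sum(axis=0)  # leading components: trend + oscillation
```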
Position and Speed Control of Brushless DC Motors Using Sensorless Techniques and Application Trends
Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime
2010-01-01
This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background on sensor-based control, its limitations, and recent advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been enhanced through sensorless technology. Sensorless advances are reviewed and recent developments in this area are introduced, with their inherent advantages and drawbacks, including analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, including Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration, and PWM strategies. The most relevant techniques based on estimation and models are also briefly analysed, such as the Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (full-order and pseudo-reduced-order), and Artificial Neural Networks.
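As a minimal illustration of the simplest back-EMF method named above, the sketch below detects zero crossings of an idealized floating-phase terminal voltage referenced to half the DC bus and schedules commutation 30 electrical degrees later. The signals and timing are idealized assumptions; a real drive must also handle PWM ripple, filtering delays, and startup, as the review discusses.

```python
import numpy as np

def zero_crossings(v_float, v_bus):
    # Indices where the floating-phase voltage crosses the virtual neutral
    s = np.sign(v_float - v_bus / 2.0)
    return np.where(np.diff(s) != 0)[0]

fs, f_e = 20_000, 50.0                   # sample rate (Hz), electrical frequency (Hz)
t = np.arange(0.0, 0.1, 1 / fs)
v_bus = 24.0
v_float = v_bus / 2 + 6.0 * np.sin(2 * np.pi * f_e * t)  # idealized terminal voltage

zc = zero_crossings(v_float, v_bus)
delay = int(fs / (12 * f_e))             # 30 electrical degrees, in samples
commutation_idx = zc + delay             # commutate half a sector after each crossing
```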